Deploying a learning analytics tool in your organization brings many benefits, including improving your students’ engagement and their success rates.
However, as you may know, there are a number of technological and human challenges that you must overcome to be successful in your endeavor.
In this post, you will find 15 strategies you can follow today to make your learning analytics implementation a success. With each strategy, you will discover what early adopters have learnt and how you can benefit from their experience.
1. Benefit from capturing student interactions happening out of the digital environment
When collecting data for your learning analytics tool, be aware that not all relevant student interactions happen in a digital environment (LMS, online library catalog, etc.). Some factors influencing the learning process could go unnoticed by your analytics engine, simply because they are not captured.
You can collect these missing pieces of information by:
- building a simple web application in which teachers can record interaction elements such as attendance at tutoring sessions, extra academic work, or questions asked in class;
- making sure teachers using the application get recognition and are thanked in front of the faculty for their active participation in the analytics improvement initiative; and
- using the collected information only once most of the teachers involved have used the application; otherwise, your data will be incomplete.
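As a sketch of what such a web application could store, here is a minimal record type a teacher-facing form could write and your analytics pipeline could later ingest. All names and fields here are hypothetical, not part of any specific tool:

```python
from dataclasses import dataclass, asdict
from datetime import date

# Hypothetical record for interactions that happen outside the digital environment.
@dataclass
class OfflineInteraction:
    student_id: str
    teacher_id: str
    kind: str          # e.g. "tutoring", "extra_work", "question_in_class"
    occurred_on: date
    notes: str = ""

def to_analytics_row(event: OfflineInteraction) -> dict:
    """Flatten an event into a plain row an analytics pipeline could ingest."""
    row = asdict(event)
    row["occurred_on"] = event.occurred_on.isoformat()  # keep dates serializable
    return row

event = OfflineInteraction("s-001", "t-042", "tutoring", date(2024, 3, 5))
row = to_analytics_row(event)
```

Keeping the record this small lowers the barrier for teachers to actually use the form, which matters more than capturing every detail.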
2. Make students’ dashboards rich enough for them to identify immediate actions.
Students usually receive feedback about their progress through small dashboards, sometimes present in the LMS. These dashboards should be informative enough to raise some thoughts in the students’ minds and, when possible, to point out some immediate improvement actions. For example, a red or green traffic light could show the students’ progress status in a course.
Student dashboards should display:
- the overall status of the student taking part in a course or semester (this is where the traffic light makes sense);
- the student’s standing within the group, so they have a clear reference for where they are compared to their peers;
- hints on why other students perform better, such as: “Students at the top of the class … spend more time in the LMS, take optional exercises, and go to tutoring sessions”; and
- a call to action for students to ask for help in case they feel it is time to raise their hand.
All of this should be displayed in a summarized and easy-to-understand format.
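As an illustration of the traffic-light idea, a minimal status function could combine the student’s score with the class median, so the peer comparison is built in. The thresholds below are illustrative, not prescriptive:

```python
# Illustrative thresholds; tune them for your own context.
def traffic_light(score: float, class_median: float) -> str:
    """Map a student's course score to a red/amber/green status,
    relative to the class median so the peer comparison is built in."""
    if score >= class_median:
        return "green"
    if score >= 0.8 * class_median:
        return "amber"
    return "red"

status = traffic_light(score=60, class_median=70)  # below median but within 80% of it
```

A relative cutoff like this adapts automatically from course to course, whereas fixed grade thresholds would need retuning per subject.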
3. Lead the project from the learning innovation side, not from IT.
IT is a critical stakeholder in any learning analytics project, but it should not take the leading role. Learning analytics is not only about data, systems, and dashboards; it is also about finding factors that contribute to students’ failures and successes and designing intervention strategies that work in your learning context.
To get this right, you should:
- list all the relevant stakeholders of your project (leadership team members, teachers, students, IT, learning innovation, implementation teams, legal, etc.);
- include them in the project core team from the very beginning to make things easier in the long run; and
- make sure you have periodic follow-up meetings which include the stakeholders beyond the pure implementation team.
4. Clear the ethical concerns from day one.
When your learning analytics tool is deployed, people will raise ethical concerns. This is completely normal: they know you will be scrutinizing data about the way they learn and behave, and, at the end of the day, information is power. Transparency is your best strategy to reinforce the final goal of your learning analytics project: improving students’ success.
To clear the ethical concerns, you should:
- create a project ethical committee involving all the stakeholders, especially teachers and students;
- jointly create an ethical project charter that includes:
  - the goals of the project,
  - the information that will be collected,
  - the information that will be presented to teachers and students,
  - the project code of conduct, or set of principles governing the collection and usage of data, and
  - mechanisms for students to opt out in case they don’t want to participate; and
- make sure the ethical project charter is accessible to everybody.
5. Benefit from the use of multiple data sources.
The more data sources you use, the better your learning analytics models will be. However, it is not only about quantity but also about the quality of your data. Moreover, studies show that there are some well-defined predictive indicators that need to be part of your collected data for your learning analytics tool to be accurate.
The recommendations are to:
- collect, as a minimum, students’ demographic data, LMS activity records, and historical academic data;
- gather students’ current performance information (as well as past data) to supplement what you already know about your student base; and
- use other types of data, such as financial information, if you have a strong belief that it will make a difference to the accuracy of your models.
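A sketch of how those sources come together: each one is keyed by a shared student ID and merged into a single feature row for the models. The source and field names below are illustrative:

```python
# Toy stand-ins for the recommended minimum sources, keyed by student ID.
demographics = {"s-001": {"age": 21, "first_generation": True}}
lms_activity = {"s-001": {"logins_last_30d": 14, "forum_posts": 3}}
past_grades  = {"s-001": {"gpa_last_term": 3.1}}

def build_feature_row(student_id: str, *sources: dict) -> dict:
    """Merge one student's records from every source into a single feature row."""
    row = {"student_id": student_id}
    for source in sources:
        row.update(source.get(student_id, {}))  # tolerate missing records
    return row

row = build_feature_row("s-001", demographics, lms_activity, past_grades)
```

In a real deployment each dict would be a database table or an export from the LMS, but the joining logic stays the same: one row per student, one column set per source.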
6. Improve your predictions by using multiple mathematical models.
There are multiple mathematical models to determine the same set of learning analytics. Some of them will be more accurate than others within your context and your data. Don’t come up with insights based on a single model. Explore multiple ways of processing your data to reach meaningful analytics.
To do this, you should:
- define multiple mathematical models to calculate your learning analytics;
- run the defined models through your existing data and evaluate them in terms of precision (low volume of false positives) and recall (low volume of false negatives); and
- choose the models that rank best on precision and recall as the baseline for your learning analytics tool (you will probably have to find the balance between both elements as no model is 100% accurate).
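The evaluation step above can be sketched in a few lines. The toy labels below stand in for a real validation set, and the F1 score is one common way to balance precision against recall when ranking candidate models:

```python
def precision_recall(predicted: list[bool], actual: list[bool]) -> tuple[float, float]:
    """Precision penalizes false positives; recall penalizes false negatives."""
    tp = sum(p and a for p, a in zip(predicted, actual))        # true positives
    fp = sum(p and not a for p, a in zip(predicted, actual))    # false positives
    fn = sum(a and not p for p, a in zip(predicted, actual))    # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def f1(precision: float, recall: float) -> float:
    """Harmonic mean: one way to balance the two when ranking models."""
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

# Toy validation data: did each student actually end up at risk?
actual    = [True, True, False, False, True]
predicted = [True, False, True, False, True]  # one model's predictions
p, r = precision_recall(predicted, actual)
```

Running every candidate model through the same function and comparing their F1 scores gives you the baseline ranking the bullet list describes.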
7. Minimize your errors by permanently monitoring the validity of your analytical models.
Your learning analytics tool will be based on a set of mathematical models. At the time those models were defined, they were designated as the best ones based on their precision and recall. But education is a living entity: the environment changes, students evolve, and teachers improve their content. All of this has a clear impact on the validity of your analytical models.
To be on the right track, you should:
- make sure there is a feedback component in every model you deploy. In other words, the model should be able to self-evaluate and report how well it performs;
- periodically review the models in use, even if they don’t report relevant deviations; and
- re-evaluate models you tested in the past if you need to replace your current one. They may work now, given the changed dynamics of the educational environment.
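The feedback component in the first bullet can be as simple as a rolling accuracy tracker that flags the model for review when it drifts below a threshold. This is a minimal sketch; the window size and alert level are illustrative:

```python
from collections import deque

# Sketch of a feedback component: a deployed model logs whether each of its
# predictions turned out to be correct and flags itself when accuracy drifts.
class ModelMonitor:
    def __init__(self, window: int = 100, threshold: float = 0.7):
        self.outcomes = deque(maxlen=window)  # 1 = correct, 0 = wrong
        self.threshold = threshold            # illustrative alert level

    def record(self, correct: bool) -> None:
        """Call once the real outcome of a prediction is known."""
        self.outcomes.append(1 if correct else 0)

    def needs_review(self) -> bool:
        if len(self.outcomes) < self.outcomes.maxlen:
            return False  # not enough evidence yet
        return sum(self.outcomes) / len(self.outcomes) < self.threshold

monitor = ModelMonitor(window=10, threshold=0.7)
for correct in [True] * 6 + [False] * 4:   # 60% rolling accuracy
    monitor.record(correct)
```

The point is not this particular metric but that every deployed model reports on itself, so the periodic review in the second bullet has data to work with.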
8. Present analytic information in a format that can be understood by all the involved parties.
Learning analytics is about mathematical models applied to demographic, academic, and student-activity data, but the complexity of these models must be hidden from the final consumers of the analytical information, unless they have a very strong technical background.
To achieve this, you should:
- present your analytics in a very simple way, using colors, symbols, or other visual mechanisms to summarize information;
- use simple terms when referring to analytics rather than mathematical concepts; and
- create a test team drawn from the final consumers of your analytics information to review your dashboards and make sure they are easy to understand and actionable.
9. Improve your students’ satisfaction by carefully planning communication.
One of the main goals of your analytics tool is being able to lead students to success. This involves periodic communication with them, especially with those requiring additional help according to your analysis. Communication needs to keep the right balance between letting them know they are behind and not discouraging them.
You should have a communications plan in place that establishes:
- how frequently you should communicate with students (depending on the course duration);
- baseline content for each type of communication. You should have scripts, but if the number of students allows for it, be personal. Don’t make your emails look like copy-and-paste jobs;
- communication with those requiring help, but also with those who have medium and good performance to keep their engagement levels up; and
- room for your instructors’ instinct and advice. Learning analytics tools may help you decide whom to contact, but at the end of the day, your instructors know how to address students best.
10. Properly manage the “requiring vs demanding” support needs of your students.
Some learning analytics projects include mechanisms for at-risk students to request on-demand extra support from the institution. These mechanisms are only effective for as long as they are being used. Research shows that a relevant percentage of students requiring support don’t demand it.
To manage this, you should:
- identify students requiring extra help (your learning analytics tool should do this for you); and
- offer unsolicited support to all of your at-risk students even if they have not requested it.
11. Improve the effectiveness of your interventions by creating an intervention framework.
The ultimate purpose of learning analytics tools is to trigger improvement actions, and those actions must be coherent across the entire institution. This is achieved by defining an “intervention framework.” The framework must be discussed in pedagogical terms and should define:
- conditions that will trigger each type of intervention;
- types of interventions to be performed;
- communication mechanisms involved in each intervention;
- how interventions will be documented and fed back into your analytics model; and
- how to measure and report the effectiveness of interventions.
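One way to make such a framework concrete is to express it as declarative rules, each pairing a triggering condition with an intervention type and a communication channel. Every rule name, threshold, and field below is hypothetical:

```python
# A declarative sketch of an intervention framework: each rule pairs a
# triggering condition with an intervention type and a channel.
RULES = [
    {"name": "at_risk_outreach",
     "condition": lambda s: s["risk_score"] > 0.8,
     "intervention": "tutor_meeting", "channel": "email"},
    {"name": "low_activity_nudge",
     "condition": lambda s: s["logins_last_30d"] < 2,
     "intervention": "engagement_nudge", "channel": "lms_message"},
]

def plan_interventions(student: dict) -> list[dict]:
    """Return every intervention the student matches, in a documented form
    that can later be fed back into the analytics model."""
    return [
        {"student_id": student["id"], "rule": rule["name"],
         "intervention": rule["intervention"], "channel": rule["channel"]}
        for rule in RULES if rule["condition"](student)
    ]

plan = plan_interventions({"id": "s-001", "risk_score": 0.9, "logins_last_30d": 1})
```

Keeping the rules in data rather than scattered through code means the pedagogical discussion the framework requires can happen over a readable table of conditions and actions.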
12. Raise the level of comfort of your learning analytics users by training and supporting them.
In order for your interventions to be effective, users of your learning analytics tool should feel comfortable interpreting and acting on your dashboards. This requires the development of a training and support plan. The plan should define:
- who should be trained (everyone actively using the dashboards);
- how to read and interpret the analytics contained in every dashboard;
- the types of interventions and triggering conditions according to the defined intervention framework; and
- communications plans.
13. Start a seed project within your institution.
Some organizations are not very mature in terms of learning analytics. Showing them the power of this technology may be an invaluable catalyst to push the institution forward on its adoption.
This may be accomplished by means of a seed project. Seed projects are limited in scope and are designed to explore the benefits of the technology. Seed projects bypass strict readiness criteria and follow the principle of “leading by example.” Your seed project should:
- be limited in scope;
- be limited in cost;
- target quick wins;
- involve from the beginning all the key stakeholders you will need in the long run; and
- effectively communicate successes.
It is important to remember that bypassing readiness criteria doesn’t mean forgetting about two important aspects:
- Your ethical project charter
- The use of multiple mathematical models and the ongoing discipline of validating their outcomes
14. Benefit from your teachers’ instincts and promote their freedom to act.
No matter how complex your learning analytics models are, they will never be able to capture all the dimensions of your students. Students are much more than a set of data. Learning analytics is a powerful tool to predict engagement and performance, but it must not override your teachers’ instincts.
- Promote the use of your learning analytics tool as the main mechanism for student follow-up.
- Promote the 100% intervention culture. All at-risk students should be taken care of.
- Encourage your teachers to act according to your defined intervention framework, but leave them room to follow their own instincts.
15. Search not only for failure factors, but also for success triggers.
Learning analytics is often used to perform educational risk assessments leading to early interventions that mitigate or clear the identified risks. In this regard, deployments tend to focus on factors that negatively impact students’ engagement and performance. However, learning analytics may also be used to explore the “light side of the force.”
For this to happen:
- make sure your learning analytics models also predict student success triggers; and
- have mechanisms in place to promote the identified success factors.
Now, it is time for you to benefit from all the lessons learnt by those who already went through the process of deploying their learning analytics tool.
Pick the list items that apply to your context, and make your learning analytics deployment easier and more straightforward.
It is up to you and your institution.
If you found this list valuable, please share it on social networks.