If there is one facet that has come to define the learning management system (LMS), it’s reporting. Such is the high demand for data that it’s probably fair to say that a lot of the time that’s been saved delivering learning in a more cost-efficient manner has now been reapportioned to the work needed to collect and report on learning activity and the remedial actions needed to ensure the quality of this data. Reporting issues top the list of things LMS administrators want to talk about and it’s often a high-level driver when choosing or replacing an LMS.
In my opinion, until we appreciate the scale of learning reporting and start thinking about it up-front – rather than as an afterthought once the content is created or the LMS procured – we will always be chasing our tails. In this first of two posts, I will look at the need to treat reporting as part of our strategy and planning. In the second part, I will look at how we deliver our reporting activities.
With so much to think about when it comes to reporting, it's essential that you develop a reporting strategy. And this needs to happen up-front and at a high level, rather than as an afterthought at an individual course level – although later you will need to look at what is required for different learning activities.
The content of these two posts will serve as a good guide to what to include in the strategy, but at its heart it needs to have clear objectives. For example, are you:
- Just looking to provide data for auditing purposes?
- Needing to provide data to support the chasing-up of incomplete learning assignments?
- Seeking to better understand the usability and performance of individual training activities?
- Looking to demonstrate the impact of your learning on the organisation?
In all likelihood you'll be interested in the first two of these, but nowadays it's important to focus also on the qualitative reporting of the third and the impact reporting of the fourth, as these support the case for the work that you're doing. Just including the last two in your strategic thinking is a good start.
With strategic reporting objectives and goals come targets, or the KPIs you want to achieve. These may be pre-existing KPIs, especially if you are delivering programmes to support other initiatives, or they may be newly defined for a specific purpose or those that L&D has signed up to as a way to measure its own performance.
Typically they may revolve around completion rates. Here it’s important that realistic completion rates are set. Is it realistic to strive for 100 percent completion? People are always coming and going, moving into and out of roles, so there is rarely a fixed start and end-point.
This is particularly relevant for compliance training programmes, where you are in effect targeting two different audiences: first, all those who were employed on the day the compliance programme was launched, and second, all the new starters who join throughout the rest of the year. If full compliance remains the goal, then splitting the reporting between the two groups should help, i.e. all existing staff have to complete it by a certain date; all new starters have to complete it within a set number of weeks.
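As a minimal sketch of this cohort split, assuming a simple record-per-learner export (the field names and dates here are hypothetical, not from any particular LMS):

```python
from datetime import date

# Hypothetical learner records: hire date and completion status.
learners = [
    {"name": "A", "hired": date(2022, 3, 1), "completed": True},
    {"name": "B", "hired": date(2023, 6, 15), "completed": False},
    {"name": "C", "hired": date(2023, 9, 1), "completed": True},
]

LAUNCH = date(2023, 5, 1)  # day the compliance programme went live

def completion_rate(group):
    """Percentage of a cohort that has completed the training."""
    if not group:
        return 0.0
    return 100 * sum(1 for p in group if p["completed"]) / len(group)

# Cohort 1: staff already employed at launch; cohort 2: later starters.
existing = [p for p in learners if p["hired"] <= LAUNCH]
starters = [p for p in learners if p["hired"] > LAUNCH]

print(f"Existing staff: {completion_rate(existing):.0f}% complete")
print(f"New starters:   {completion_rate(starters):.0f}% complete")
```

Reporting the two rates separately keeps the "complete by the deadline" target meaningful for existing staff while new starters are measured against their own onboarding window.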
If you’re looking to move reporting towards measuring the impact your learning is having, then it’s also important to define what your reporting strategy will seek to analyse.
Will you be using data to monitor user engagement, often measured by the number and duration of accesses? Or will you be looking to use, for example, your LMS's surveying capabilities to start to report on personal improvement? Maybe you'll be looking to pull in data from other sources to report on the business improvement experienced after the training has been completed, along with being able to provide stakeholders with return on investment calculations.
It’s important too to consider the stakeholders who will be the recipients of your reports.
- Who are they and why do they want these reports? What goals are they hoping to meet for which they need your supporting data?
- What data do they need? Can you even collect this? [see “reporting capabilities” in Part 2].
- How often do they need to receive these reports [see “frequency” below]?
- How do they want it presented [see “format” in Part 2]?
A failure to understand and then provide what the business needs is one of the most cited issues reported by L&D. As I will no doubt repeat throughout these posts, once the learning has been deployed, it’s often too late or at least very difficult and time-consuming to revisit and deliver on these requirements.
As you increase the amount of learning activity that you can track and the number of interested parties increases, the frequency of your reporting schedules becomes even more important, as the process for collating and sharing data often generates questions and calls for remedial action. This all takes time and it’s very easy to find yourself chasing your tail.
A good model is to consider the stages of each learning campaign. Immediately after launch, there is generally an interest in examining the initial uptake. But thereafter, the frequency of reporting could be slowed until nearer the target completion date. During the “final push” reporting can be more frequent, though I always point out that you should still allow enough time between reports for the business to contact slow-completers and for them to actually go and take the training, otherwise you might see very little change on a day-by-day basis. Finally, after the due date has passed, you will most likely keep some reporting going while you “mop up” the real laggards.
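The staged cadence above can be sketched as a simple rule: frequent reports just after launch, a slower rhythm mid-campaign, weekly during the final push (still leaving time for chasers to act and learners to complete), and an occasional mop-up report after the due date. The specific intervals below are illustrative assumptions, not recommendations:

```python
from datetime import date, timedelta

def reporting_interval(today, launch, due):
    """Suggested days between reports for each stage of a campaign.

    Thresholds here are illustrative only -- tune them to your own campaigns.
    """
    if today < launch + timedelta(days=14):
        return 2    # early interest in the initial uptake
    if today > due:
        return 14   # mopping up the real laggards
    if due - today <= timedelta(days=21):
        return 7    # the "final push", but still long enough between
                    # reports for chasing and completion to happen
    return 30       # mid-campaign: slow the cadence right down

launch, due = date(2024, 1, 8), date(2024, 3, 29)
print(reporting_interval(date(2024, 1, 15), launch, due))  # early stage
print(reporting_interval(date(2024, 2, 15), launch, due))  # mid-campaign
print(reporting_interval(date(2024, 3, 20), launch, due))  # final push
```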
I am also a fan of agreeing fixed schedules for reporting and avoiding accepting requests for ad hoc reports. The more you can work to a pattern, the easier it is to collate and report more accurately.
If your reporting goals also seek to measure the impact of your learning, then you’ll also need to think about when follow-up reporting should be undertaken, having given the learning the chance to be applied and embedded, as well as the appropriate intervals between follow-ups.
The pragmatist inside me knows you are reporting on learning completions in order to chase people to finish off their learning. So your reporting strategy and planning should address this area, so that you can develop a smart, efficient and repeatable approach.
When and how often will you chase? Who will do the chasing? I'm all in favour of the stakeholder doing the chasing, as this places the ownership on them to support you in your efforts to deliver the learning. It also gives much more authority to the process. This then poses the question: how will they chase? Will they be cascading the chasing messages down their chain of command, for example? And what report will they need? The main stakeholder won't likely need to see the detail, just high-level statistics, whereas their direct reports will need to know who to chase. And – particularly if you're a large organisation – can you report easily at the appropriate organisational level? I've been in a situation before where I could only report on teams down to a certain point, below which everyone was lumped together in one big group.
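The two views described above – a high-level summary for the senior stakeholder, and a named chase list for each line manager – can be sketched from the same export. The record layout and team names are hypothetical:

```python
# Hypothetical export: each learner with their team and completion flag.
records = [
    {"name": "Ann",  "team": "Sales/North", "completed": True},
    {"name": "Ben",  "team": "Sales/North", "completed": False},
    {"name": "Cary", "team": "Sales/South", "completed": False},
]

def summary_by_team(rows):
    """High-level view for the senior stakeholder: one figure per team."""
    teams = {}
    for r in rows:
        done, total = teams.get(r["team"], (0, 0))
        teams[r["team"]] = (done + (1 if r["completed"] else 0), total + 1)
    return {t: f"{done}/{total}" for t, (done, total) in teams.items()}

def chase_list(rows, team):
    """Detail view for a line manager: who in their team still to chase."""
    return [r["name"] for r in rows if r["team"] == team and not r["completed"]]

print(summary_by_team(records))
print(chase_list(records, "Sales/North"))
```

Note that both views depend on the export carrying a usable team field at the right organisational depth – exactly the granularity problem described above.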
Are you able to automate the communications within your systems, accompanied by the relevant data? Or do you need to plan for a more personalised approach?
There are also two related operational factors that you need to consider within your reporting strategy. Any chasing exercise will surface issues with people claiming they have completed the learning but that "the system isn't working". So you will need to decide how you will manage appeals: will you insist on examining each case in turn, or just approve the manual updating of the data in good faith? And if your LMS technical support team will be the ones fielding these queries, you need to ensure your reporting and launch plans factor in your capacity to resource a spike in such support calls.
Support for competitions and incentives
Until recently, LMS reporting has largely been used within the "stick approach" to encouraging learning engagement. But increasingly, through the use of gamification and gameful design, we are seeing a more positive approach to generating interest and interaction with learning.
So now is the time to look into the role of reporting to share good news and to reward appropriate learning activity. The reporting strategy and plan needs to look at things such as the use and generation of leaderboards and measuring the attainment of certain award thresholds.
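A leaderboard and award-threshold report can be a very small piece of logic once the activity data exists. The points values and tier names below are invented for illustration:

```python
# Hypothetical activity points per learner, e.g. from gamified completions.
points = {"dana": 340, "eli": 120, "fay": 560, "gus": 290}

AWARD_THRESHOLDS = {"bronze": 100, "silver": 300, "gold": 500}  # illustrative

def leaderboard(scores, top_n=3):
    """Top performers, highest score first."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

def awards(score):
    """All award tiers a learner has reached."""
    return [tier for tier, bar in AWARD_THRESHOLDS.items() if score >= bar]

print(leaderboard(points))
print(awards(points["dana"]))
```

The strategic questions are less about the calculation and more about the inputs: which activities earn points, and whether the thresholds reward the behaviour you actually want.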
So as you’ve seen, there is a lot to think about when developing your high-level reporting strategy and plan.
There are still lots more things to consider though and I’ll cover those in the second post, looking at how you deliver your reporting approaches.