Delivering a successful learning report

The reporting of learning can easily take over a lot of our time. Over the last 25 years, I must have encountered every issue and gremlin that can cause reporting difficulties. I've learned to plan reporting up-front rather than leave it as an afterthought, and in the first of two posts I outlined some things to consider when developing your own reporting strategy and plan.

In this second post I will focus on the delivery of your reporting approach. I'll be sharing the lessons I've learned – usually after the event – so that you can include them within your own plans. Forewarned is forearmed, as they say.

Course tracking capabilities

If you can’t track it, you can’t report it!

In Part 1 I talked about the need to understand what it is you want to report on. So it's imperative that you are fully conversant with what's possible. You might have courses you've built from scratch, or developed using a popular authoring tool, or you might have licensed a library of content from a third-party supplier. But whereas AICC and SCORM promised us the earth, the reality is that each content piece might well offer different tracking capabilities, especially when coupled with the options available in your LMS.

  • Do you want to report on test scores? Does your content export this data, and what happens if there is more than one test within the content? Does it report them all or just one? Does it report “pass/fail” or some other labels, such as “incomplete/complete”? Can you see the actual percentage achieved? Do you want to see the answers to individual questions? (A sketch of how SCORM 1.2 content typically sends this data follows this list.)
  • How are you looking to track completions? Will success mean 100 percent of the course has been taken, or a lesser percentage? Does your content report this way, or are you limited to a simpler status indicator or even just a pages-viewed count?
  • If your content is designed to be dipped in and out of, what happens when someone goes back in a second time after having first completed it and been awarded a “completed” status? Does the course status reset to “not started” – making them non-compliant – or does it retain the initial score? Often known as the “best versus latest score/status” conundrum, this is one to clarify early on.
  • Are you able to report on how a user has taken a course? Some authoring tools now allow you to see how long people are spending on certain screens, for example. Are you able to drill into your content this way to enable you to perform some qualitative reporting on how well a course is performing?
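
To make the variability above concrete, here is a minimal sketch of the calls a SCORM 1.2 course makes to report a score and status. The cmi.* element names and LMS* methods are part of the SCORM 1.2 run-time specification; the reportResult() wrapper and pass-mark logic are illustrative choices, not a standard. TypeScript is used simply because SCORM's run-time API is JavaScript-based.

```typescript
// Shape of the SCORM 1.2 run-time API object that the LMS exposes to content
// (usually on a parent window or frame). Signatures per the SCORM 1.2 spec.
interface Scorm12API {
  LMSInitialize(arg: ""): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(arg: ""): string;
  LMSFinish(arg: ""): string;
}

declare const API: Scorm12API; // provided by the LMS at run time

// Illustrative wrapper: how a course might hand back one test result.
function reportResult(scorePercent: number, passMark: number): void {
  API.LMSInitialize("");

  // SCORM 1.2 carries only ONE score slot per SCO. If the course contains
  // several tests, the author decides what goes here -- hence the
  // "does it report them all or just one?" question above.
  API.LMSSetValue("cmi.core.score.raw", String(scorePercent));
  API.LMSSetValue("cmi.core.score.min", "0");
  API.LMSSetValue("cmi.core.score.max", "100");

  // The status vocabulary is fixed by the spec: passed, completed, failed,
  // incomplete, browsed, not attempted. Which value is sent, and when, is
  // an authoring decision -- the source of many reporting surprises.
  API.LMSSetValue(
    "cmi.core.lesson_status",
    scorePercent >= passMark ? "passed" : "failed"
  );

  API.LMSCommit("");
  API.LMSFinish("");
}
```
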
LMS functionality

There are two parties to the reporting relationship in your LMS.  As we’ve just seen, the course is one.  The LMS is the second.  Both have to work well together if you’re to deliver your reporting goals.

  • Does your LMS correctly interpret the data that is being sent to it from your content? As mentioned above, you cannot guarantee that all content will behave in the same way, despite what the so-called interoperability standards tell us should happen. (A sketch of the normalisation this can involve follows this list.)
  • How does your LMS handle deadlines? In particular, how easy is it for the learners themselves to see what they have left to do and by when?  Learners claiming not to know what was expected of them is one of the more popular reasons given to explain poor course completion statistics.
  • Auditing requirements are often based around the concepts of accreditations (the need to have completed a suite of content and assessments in order to be internally qualified in the role), or refreshers (the requirement to repeat content at set intervals to ensure ongoing compliance). How does your LMS handle these and, specifically, how does it report on such activity? It's fair to say that this is probably the one area where we most often come unstuck, usually as a result of not truly appreciating the monster we sometimes create and not thinking everything through down to the matter of reporting. My advice is to seek the advice of your LMS provider and to speak to your peers about how they do things. Simplicity is key here.
  • If you have a multi-lingual portfolio, how does your LMS manage the different language versions? Ideally you want the LMS to regard the English, French, Spanish and Chinese versions as one and the same course when it comes to reporting.
  • On similar lines, does your LMS offer something along the lines of “equivalent” courses, where different courses are regarded as being of the same standard and coverage for the purposes of reporting? This removes the need to manually stitch the reports for such courses together.
  • Does your LMS allow you to report on course usability and performance (see earlier), if your content allows you to assess how well it’s performing as a piece of learning in its own right?
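
As a concrete example of the interpretation work the first bullet alludes to, here is a minimal sketch of how an LMS-side process might normalise the differing status strings content sends, and resolve the “best versus latest” conundrum from the previous section. The status map and Attempt shape are hypothetical, not any particular LMS's implementation.

```typescript
// Hypothetical shape of one recorded attempt at a course.
type Attempt = { status: string; score: number | null; when: Date };

// Different content sends different status strings; map them to one
// internal vocabulary before reporting on them.
const STATUS_MAP: Record<string, "complete" | "incomplete"> = {
  passed: "complete",
  completed: "complete",
  failed: "incomplete",
  incomplete: "incomplete",
  "not attempted": "incomplete",
};

// A "best" policy: once a learner has ever completed, a later re-visit that
// resets the course status cannot make them non-compliant, and we keep the
// highest score rather than the latest one.
function bestResult(attempts: Attempt[]): Attempt | undefined {
  const completed = attempts.filter(
    (a) => STATUS_MAP[a.status.toLowerCase()] === "complete"
  );
  const pool = completed.length > 0 ? completed : attempts;
  return [...pool].sort((a, b) => (b.score ?? -1) - (a.score ?? -1))[0];
}
```

A “latest” policy would instead sort by the when timestamp. The point is to decide which policy you want, and to confirm that your LMS actually applies it, before launch.
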
Reporting capabilities

Here we need to look at how well we are equipped to undertake the required reporting activity.

  • Again, are we certain about what it is we want to report on? Can we source all that data?
  • Will our data need to come from different sources, not just that held by the LMS? How easy will it be to obtain the data that doesn’t exist in the LMS?  Can this new data be added to the LMS database, or will we need to use another tool altogether, such as SAP Business Objects?
  • Who is going to manage the reporting? Can the LMS do it all automatically, or will some manual intervention be required?  And if the manual intervention is too non-standard or different stakeholders demand something different for the same report, might it be better for the stakeholder to take over the final processing of the data?
  • How long is it going to take to compile each report? Even if it's all contained within the LMS, if you are a large organisation, you may still experience a time interval while your data is gathered and processed. If you need to use an external tool or have to perform a number of manual interventions, then you will need to know how much time this will take, particularly if this is a report that has been requested with a high frequency (see “frequency” in Part 1).
  • Are our systems able to handle the quantity of data that the reports require? I have seen instances where so much data was required that the underlying database would time-out when trying to process the requests and the resulting spreadsheets were so large as to be quite simply unusable.
  • Do you have to be prepared to run this report off on an ad hoc basis, or can it be automatically scheduled and distributed? As I wrote in Part 1, I always try to steer stakeholders to accept a regularly scheduled report.  If they want an irregular pattern instead, I just tell them to delete the ones they get that they don’t want to read at that time.
  • Are our systems and reporting tools able to report as per the organisation's hierarchical structure? It's very frustrating if you want to issue a compliance report for Team Z lower down the organisational chart, only to find that your LMS stops categorising teams a few levels up. It's a painful process to have to then manually slice and dice data in order to give managers just what they need (and to not breach any internal confidentiality considerations). It's even worse when your stakeholders work to a different structure than that held in the HR database. (A sketch of rolling data up a hierarchy follows this list.)
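
On the hierarchy point, here is a minimal sketch of why you want the reporting tool, not a person, to do the slicing: with the structure available, compliance for any team at any depth is a simple roll-up. The Team shape and names are hypothetical; real LMS and HR exports will differ.

```typescript
// Hypothetical organisational tree: each team has direct members and child teams.
type Team = { name: string; members: string[]; children: Team[] };

// Roll completion counts up the tree so any node can be reported on directly.
function complianceFor(
  team: Team,
  completedUsers: Set<string>
): { team: string; completed: number; total: number } {
  let completed = team.members.filter((m) => completedUsers.has(m)).length;
  let total = team.members.length;
  for (const child of team.children) {
    const sub = complianceFor(child, completedUsers);
    completed += sub.completed;
    total += sub.total;
  }
  return { team: team.name, completed, total };
}

// Example: "Team Z" sits below "Division A", yet can be reported on directly
// instead of manually slicing a flat export.
const org: Team = {
  name: "Division A",
  members: ["ana"],
  children: [{ name: "Team Z", members: ["ben", "cho"], children: [] }],
};
console.log(complianceFor(org.children[0], new Set(["ana", "ben"])));
// -> { team: "Team Z", completed: 1, total: 2 }
```
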
Format

In Part 1 I talked about agreeing the report formats with stakeholders.

  • In terms of delivering the reports, are they required to be nicely formatted, or does the stakeholder just want the raw data that they can manipulate as they wish? How far can you go with your reporting tools to be able to present the data in the required format?
  • Would your stakeholders prefer to see just the headline statistics in the form of dashboards? There is definitely a trend towards dashboards rather than running off reports. Even at the line manager level, many would rather be presented with a simple dashboard than receive a report via e-mail. Can your LMS surface these dashboards to users, or is there some way you can publish them on your intranet? (A sketch of reducing raw data to headline figures follows this list.)
  • Have you considered the use of infographics? You can’t have failed to notice how a good infographic seems to convey a lot of different pieces of information on just the one page.
  • Is it possible to agree an organisation-wide standardised format, or – worst case – a small set of standard formats? The greater the number of formats and templates you employ, the greater the risk of inaccuracies creeping in.  You ideally need to keep the sources of truth to just one or a handful of core reports.
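
To illustrate the dashboard bullet above, here is a minimal sketch of reducing raw completion records to the headline figures a line manager's dashboard would show. The record shape is hypothetical; the aggregation is the point.

```typescript
// Hypothetical raw export: one row per learner per course.
type CompletionRow = { course: string; user: string; complete: boolean };

// Collapse the rows into one headline figure per course.
function headlines(rows: CompletionRow[]): string[] {
  const byCourse = new Map<string, { done: number; total: number }>();
  for (const row of rows) {
    const tally = byCourse.get(row.course) ?? { done: 0, total: 0 };
    tally.total += 1;
    if (row.complete) tally.done += 1;
    byCourse.set(row.course, tally);
  }
  return [...byCourse].map(
    ([course, t]) =>
      `${course}: ${Math.round((100 * t.done) / t.total)}% (${t.done}/${t.total})`
  );
}

// e.g. headlines(rows) -> ["Fire Safety: 87% (261/300)", ...]
```
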
Pilot

If there is one thing I've learned over the years when it comes to reporting, it's to determine your reporting requirements up-front and to pilot them alongside the piloting of the content itself. You quite simply don't know whether you'll have reporting issues until you have gathered the usage data from the group that has piloted the content. Your objectives for piloting your reporting should include:

  • Making sure the course itself performs as intended. If completion means viewing 100 percent of the content, and it's possible for someone to miss a section or even a single screen, then your reports will show incomplete while the learner is adamant they viewed the entire course.
  • Checking that the course is sending back the correct status. If someone does complete the course as intended, what does the LMS say? Sometimes there are settings within the course itself which need to be tweaked to allow your LMS to interpret the status correctly.
  • Evaluating the reporting approach itself. It's often only when you have real data to play with that you realise whether your reporting is meeting its intended purpose. Does the stakeholder feel they have the information they need? If not, what's missing, or do they in fact have too much? How easy will it be for them to use the data – if needed – to chase people, or to present it at a board meeting?
  • Assessing the accuracy of the data. Quite simply, is what you're seeing correct? Are users being correctly assigned to the right business areas? Are there obvious errors that need to be investigated? (A sketch of this kind of cross-check follows this list.)
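
For that accuracy check, a simple automated cross-check of the pilot data pays for itself. Here is a minimal sketch: compare the people you know finished the pilot (say, from a facilitator's sign-off sheet) against what the LMS recorded, and flag every mismatch for investigation. The shapes and status strings are hypothetical.

```typescript
// Hypothetical shape of one row exported from the LMS after the pilot.
type LmsRow = { user: string; status: string };

function pilotMismatches(
  observedCompleters: Set<string>, // who we know finished the pilot
  lmsRows: LmsRow[]
): string[] {
  // Who the LMS believes has completed, using the normalised statuses.
  const recordedComplete = new Set(
    lmsRows
      .filter((r) => r.status === "completed" || r.status === "passed")
      .map((r) => r.user)
  );
  const issues: string[] = [];
  for (const user of observedCompleters) {
    if (!recordedComplete.has(user))
      issues.push(`${user}: finished the pilot but the LMS shows incomplete`);
  }
  for (const user of recordedComplete) {
    if (!observedCompleters.has(user))
      issues.push(`${user}: LMS shows complete but was not seen to finish`);
  }
  return issues;
}
```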

Finding out that your reporting is not working as intended after you’ve launched your learning is not good.  It’s often impossible to backtrack quickly enough, or to correct omissions that weren’t thought through beforehand.  Major initiatives can suddenly take on an air of amateurism.  Remember that the content, your LMS and your reporting capabilities are all inextricably linked, so it’s essential that you put them all to the test in a pilot.

*****

I hope that these thoughts, combined with my thinking on how to set out a sound reporting strategy and plan, will help you to deliver clear, timely and accurate reports that support the objectives of your stakeholders and content developers.

There's a lot to think about, as the mind map I generated before writing these posts indicates.

[Mind map: Reporting Delivery]

Pausing for a moment to consider all of these factors will pay dividends.

 
