xAPI doesn't solve all your learning measurement problems

From making our podcasts about xAPI and learning data, I’ve realised that one of the great myths about xAPI is that it solves all your learning measurement problems. What xAPI actually does is provide a flexible way of recording activity that doesn’t have to happen in a learning management system. But measuring learning activity is not the same as measuring the impact of your learning programs or business outcomes.

Learning is an input, not an outcome.

xAPI doesn’t magically solve the major L&D problems of how to measure the impact of learning on an individual’s performance or business performance.

xAPI is great for measuring learning activity (there’s a sketch of a typical statement just after this list), such as:

  • what learning resources people are accessing
  • who is accessing the resources most often
  • what answers people are getting wrong
  • what time of day people are accessing learning.
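
To make that concrete, here’s a minimal sketch of what one of these activity events looks like as an xAPI statement: a learner answering a quiz question incorrectly, posted to an LRS statements endpoint. The LRS address, credentials, learner and activity IDs are placeholders, and I’m using Python with the requests library purely for illustration.

```python
# A sketch only: the LRS endpoint, credentials, learner and activity IDs below
# are placeholders, not a real system.
import requests

LRS_STATEMENTS = "https://lrs.example.com/xapi/statements"   # placeholder LRS
AUTH = ("lrs_key", "lrs_secret")                             # placeholder credentials

statement = {
    "actor": {"name": "Jane Example", "mbox": "mailto:jane@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/answered",
        "display": {"en-US": "answered"},
    },
    "object": {
        "id": "https://example.com/courses/onboarding/quiz-1/question-3",
        "definition": {"name": {"en-US": "Onboarding quiz, question 3"}},
    },
    # The result is what makes "what answers people are getting wrong"
    # reportable later.
    "result": {"success": False, "response": "Option B"},
    "timestamp": "2017-06-01T09:30:00Z",
}

response = requests.post(
    LRS_STATEMENTS,
    json=statement,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.3"},
)
response.raise_for_status()  # the LRS responds with the stored statement's ID
```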

xAPI provides a huge level of detail about learning activity. Lori Niles-Hofmann, in her Learning While Working podcast episode on data-driven learning design, speaks about how this type of data can be used to drive learning design decisions.

Measuring the impact of learning on individual performance or business outcomes can be done with xAPI, but it needs to be designed and integrated into the approach. Part of the power of xAPI is that it’s flexible: performance data can be stored in your learning record store and then correlated with your learning data.
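
As a rough sketch of what that correlation might look like, the snippet below pulls learning-activity statements and performance statements out of the same LRS and runs a naive correlation between resources accessed and a KPI. The verb and extension used for the performance statements are my own assumptions about how that data might be modelled, not a fixed xAPI convention, and paging is ignored for brevity.

```python
# A rough sketch: the "reported" verb and the monthly-sales extension are
# assumptions about how performance data might be modelled in the LRS.
import requests
import pandas as pd

LRS_STATEMENTS = "https://lrs.example.com/xapi/statements"   # placeholder LRS
AUTH = ("lrs_key", "lrs_secret")
HEADERS = {"X-Experience-API-Version": "1.0.3"}
KPI_EXTENSION = "https://example.com/xapi/extensions/monthly-sales"  # hypothetical

def fetch_statements(verb_iri):
    """Fetch statements for one verb (ignores the 'more' paging link)."""
    r = requests.get(LRS_STATEMENTS, params={"verb": verb_iri},
                     auth=AUTH, headers=HEADERS)
    r.raise_for_status()
    return r.json()["statements"]

# Learning activity: count of resources each person has experienced.
learning = pd.DataFrame(
    [{"actor": s["actor"]["mbox"], "resources": 1}
     for s in fetch_statements("http://adlnet.gov/expapi/verbs/experienced")]
).groupby("actor", as_index=False).sum()

# Performance data: a KPI carried in a result extension on each statement.
performance = pd.DataFrame(
    [{"actor": s["actor"]["mbox"],
      "kpi": s["result"]["extensions"][KPI_EXTENSION]}
     for s in fetch_statements("https://example.com/xapi/verbs/reported")]
)

combined = learning.merge(performance, on="actor")
print(combined[["resources", "kpi"]].corr())  # correlation, not causation
```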

If you’re interested in individual performance data, self-assessments and peer or manager assessments can be used. Asking individuals to collect data might at first sound time-consuming, but it can be an opportunity to build reflective activities. One of Peter Drucker’s most famous quotes is "If you can't measure it, you can't improve it." Asking employees each week to enter a particular metric and make notes on their performance is a great learning opportunity. At Sprout Labs we are working on an xAPI-based form builder for Glasshouse that makes these types of interactions easy.
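
For example, a weekly check-in captured by a form could be expressed as an xAPI statement along these lines, with the reflection note in result.response and the metric in a result extension. This is an illustrative sketch only: the activity ID and extension IRI are made up, and it isn’t necessarily how the Glasshouse form builder models these interactions.

```python
# Illustrative only: the form's activity ID and the deals-closed extension are
# made up, and this isn't necessarily how Glasshouse structures its statements.
weekly_check_in = {
    "actor": {"mbox": "mailto:jane@example.com"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/responded",
        "display": {"en-US": "responded"},
    },
    "object": {
        "id": "https://example.com/forms/weekly-performance-check-in",
        "definition": {"name": {"en-US": "Weekly performance check-in"}},
    },
    "result": {
        # The reflection note is the learning moment; the metric makes it measurable.
        "response": "Closed two deals; follow-up emails are still taking too long.",
        "extensions": {
            "https://example.com/xapi/extensions/deals-closed": 2
        },
    },
    "timestamp": "2017-06-02T17:00:00Z",
}
# Posting it to the LRS works exactly as in the earlier quiz example.
```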

If you’re interested in business performance data, it’s often hard to isolate the effect of a learning program. We find that asking questions early in the design conversations about how a learning program is expected to influence KPIs tends to lead to design changes. It also often means that different data needs to be collected, or that L&D needs access to data that may already exist elsewhere in the organisation.

Some learning analytics platforms have tools for bringing this data into your learning record store; Watershed, for example, can bring in data from platforms such as Salesforce. The other approach is to bring your learning data into your standard business intelligence tools and use them to explore the relationship between your learning data and business data. This second approach is a trend we’re seeing more and more.
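
A minimal sketch of that second approach: pull statements out of the LRS with the standard statements API and flatten them into a CSV that a business intelligence tool can sit alongside your business data. Again, the LRS details are placeholders and paging via the "more" link is skipped.

```python
# A sketch of exporting learning data for a BI tool: the LRS details are
# placeholders and paging via the "more" link is omitted.
import csv
import requests

LRS_STATEMENTS = "https://lrs.example.com/xapi/statements"   # placeholder LRS
AUTH = ("lrs_key", "lrs_secret")
HEADERS = {"X-Experience-API-Version": "1.0.3"}

r = requests.get(LRS_STATEMENTS, params={"limit": 500}, auth=AUTH, headers=HEADERS)
r.raise_for_status()

with open("learning_activity.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["actor", "verb", "activity", "timestamp"])
    for s in r.json()["statements"]:
        writer.writerow([
            s["actor"].get("mbox", ""),
            s["verb"]["id"],
            s["object"].get("id", ""),
            s.get("timestamp", ""),
        ])
```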

The correlation of learning data and performance data is a design problem, not a technical problem. By design, I mean deciding on:

  • what data relates to the learning program
  • how you will collect that data
  • how you intend to analyse the data and what questions you will ask of it.

Using xAPI for measuring business impact is possible – it just doesn’t magically solve the problem of learning impact measurement.