One of the key ways to determine the value of e-Learning to an organization is through measurement and reporting. As with any sufficiently mature technology, there are clear, well-defined ways to measure e-Learning effectiveness.
The learning management system (LMS) provides many metrics that learning professionals rely on every day to see how their learning content is being used and what effect it has on their organization. The advent of mLearning, however, has upset the proverbial apple cart. For better or worse, many mobile learning applications simply don’t run in an LMS. Furthermore, many traditional e-Learning measures, such as mastery, completion, and course duration, simply don’t apply to mobile learning. These are all vestiges of “ahead-of-time” learning, not “just-in-time” information delivery.
This key difference in the reason for putting learning content online in a mobile-accessible format should be a hint that the previous measurement methods may need re-evaluation. People often use mobile learning applications for quick reference, performance support, job aids, and the like. These use cases are usually much more “application-like” than immersive, simulation-based learning. So, in terms of measurement in the world outside of e-Learning, how are developers making decisions based on the real-world use of their creations?
The answer may lie in event measurement through analytics-gathering software. Just as logging software records a visit or page view when someone visits a Web site, application developers can and often do embed logging into the programs they write. It is very easy to embed measurement in rich media frameworks, AJAX, and Flash platform applications via Google Analytics, Omniture, or any number of other services. Applied to mobile learning, these events could log almost anything: the user’s geolocation data, the OS or platform they are on, their corporate ID, or whether they searched or browsed to reach the data they needed. Session length, bounce rate, and the top content accessed can then tell you whether your content is well designed and providing assistance where it needs to be.
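To make this concrete, here is a minimal sketch of how session length, bounce rate, and top content could be derived from raw analytics events. The event format (user ID, page, timestamp) is a hypothetical stand-in for whatever your analytics service exports; the single-page-session definition of a bounce is one common convention, not the only one.

```python
from collections import Counter
from datetime import datetime

# Hypothetical raw analytics events: (user_id, page, ISO timestamp).
events = [
    ("u1", "/jobaid/wiring", "2011-05-02T09:00:00"),
    ("u1", "/jobaid/safety", "2011-05-02T09:04:30"),
    ("u2", "/jobaid/wiring", "2011-05-02T10:15:00"),
    ("u3", "/search",        "2011-05-02T11:00:00"),
    ("u3", "/jobaid/wiring", "2011-05-02T11:01:10"),
]

def summarize(events):
    """Derive average session length, bounce rate, and top content."""
    sessions = {}
    for user, page, ts in events:
        sessions.setdefault(user, []).append((datetime.fromisoformat(ts), page))
    lengths, bounces, hits = [], 0, Counter()
    for visits in sessions.values():
        visits.sort()
        # Session length: time between first and last recorded event.
        lengths.append((visits[-1][0] - visits[0][0]).total_seconds())
        if len(visits) == 1:  # single-page session counted as a bounce
            bounces += 1
        hits.update(page for _, page in visits)
    return {
        "avg_session_secs": sum(lengths) / len(lengths),
        "bounce_rate": bounces / len(sessions),
        "top_content": hits.most_common(1)[0][0],
    }
```

In practice the same roll-up is usually done for you by the analytics dashboard; the value of computing it yourself is that you can segment by corporate ID, platform, or search-versus-browse, which off-the-shelf reports may not expose.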
Another aspect to examine is post-support or post-sales incident logging. If your salespeople or technicians need to update a CRM or other ticketing application after a meeting or house call, perhaps there is an opportunity to add fields to the ticketing screen that ask whether they used mobile help before or during the call. A follow-up field could ask what content they accessed and whether it was helpful. Make these fields required before the event can be logged, and you instantly have data points for seeing whether your mobile learning is helping the people you designed it for. Use this information to inform your revisions and additions, and continue to check in to see whether the overall evaluation of your software improves.
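A short sketch of how those ticket fields could be tallied once collected. The field names (`used_mobile_help`, `content_accessed`, `helpful`) are hypothetical examples, not fields of any particular CRM product.

```python
# Hypothetical ticket records with the required mobile-help fields filled in.
tickets = [
    {"used_mobile_help": True,  "content_accessed": "wiring job aid",   "helpful": True},
    {"used_mobile_help": True,  "content_accessed": "safety checklist", "helpful": False},
    {"used_mobile_help": False, "content_accessed": None,               "helpful": None},
]

def mobile_help_report(tickets):
    """Share of calls that used mobile help, and how often it was rated helpful."""
    used = [t for t in tickets if t["used_mobile_help"]]
    return {
        "usage_rate": len(used) / len(tickets),
        "helpful_rate": sum(t["helpful"] for t in used) / len(used) if used else 0.0,
    }
```

Because the fields are required, the denominators here are complete: every logged call contributes a data point, which is what makes trend comparisons between revisions meaningful.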
One way to measure the success of your performance support is to consider that, as time passes, you would like your employees to rely on it less and less. Cross-tabulating usage records with employee IDs and feedback can show you whether your cleverly crafted mLearning is resulting in more retention, and hopefully more confident and capable employees. Support center calls are another area to examine. Are they decreasing after you put your knowledge base online in mobile format? Are the calls that would be considered “Level 1” or “easy” dropping off? These performance support measurements don’t require an LMS at all, but they do an excellent job of letting you see how effective your deployment is.
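The cross-tabulation described above can be sketched very simply: group lookup counts by employee over time and check whether reliance is trending down. The data and the "second half versus first half" comparison are illustrative assumptions; a real analysis might use a regression or a longer window.

```python
# Hypothetical lookup counts per employee per month, oldest month first.
usage_by_month = {
    "emp-101": [14, 9, 5, 2],
    "emp-102": [8, 8, 7, 3],
    "emp-103": [3, 5, 2, 6],
}

def reliance_declining(counts):
    """True if the later half of the period shows fewer lookups than the earlier half."""
    half = len(counts) // 2
    return sum(counts[half:]) < sum(counts[:half])

# Employees whose dependence on the performance support is tapering off.
declining = {emp: reliance_declining(c) for emp, c in usage_by_month.items()}
```

Pairing this with feedback scores and support-call volume for the same employees is what turns a raw usage log into evidence of retention.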
In closing: as a learning professional accustomed to using metrics to inform your design decisions, you may feel that moving learning content outside your LMS ties your hands. In reality, though, when you decouple the content from the existing forms of measurement and reconsider what it is you truly want to measure, you may find that mobile learning is actually an empowering way to see just how effective your instructional design is, and how it really affects your learners’ day-to-day operations.