Traci Pate

Top 3 eLearning Testing Gaps

I think we can all agree that the magic of eLearning is delivering the right content, to the right audience, in the right context. It’s the big picture!


But the devil is in the details, and the little things can make for a terrible learning experience. Thoroughly testing your eLearning is a good way to find and fix issues before they become a source of frustration for learners, or tickets for you to fix.


Here are a few common testing gaps we observe:



1. Incorrectly set or unclear passing requirements.

This is the issue where a learner is able to reach the “congrats/thank you” end screen without actually having satisfied all the attainment requirements. This causes frustration and tickets.


This can happen in many situations. For example, the developer may set a course to require that all screens or interactions are viewed, when some are optional or skippable. Another example is setting the course to require a specific assessment score but allowing learners to continue after failing. In short, the course’s attainment settings should match its required contents. And beyond making attainment intuitive, the passing requirements should be spelled out for learners.


If content is required, don’t put it on optional slides or branches. If an assessment score is required, tell learners up front what the passing score and retake options are. Test this by skipping everything you can and seeing what happens. Did the course register as complete in your LMS? Should it have?


Guiding principle: The course should never indicate that a learner has passed or completed if they have not satisfied all the requirements to register their completion in the LMS.
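
Most authoring tools handle this through completion settings rather than code, but if your tool exposes JavaScript triggers, or you’re building a custom player, the gating logic looks something like the sketch below. It’s a minimal TypeScript sketch against the standard SCORM 1.2 runtime API; names like `REQUIRED_SLIDES`, `PASSING_SCORE`, and `CourseState` are hypothetical stand-ins for your course’s actual requirements.

```typescript
// Minimal sketch: only report completion to the LMS once every
// requirement is actually met. The SCORM 1.2 calls (LMSSetValue /
// LMSCommit on the API object) are standard; the requirement
// tracking below is hypothetical.

declare const API: {
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
};

interface CourseState {
  requiredSlidesViewed: Set<string>; // IDs of required slides seen so far
  assessmentScore: number | null;    // null until the quiz is submitted
}

const REQUIRED_SLIDES = ["intro", "policy", "scenarios", "summary"];
const PASSING_SCORE = 80;

function allRequirementsMet(state: CourseState): boolean {
  const slidesDone = REQUIRED_SLIDES.every((id) =>
    state.requiredSlidesViewed.has(id)
  );
  const scoreOk =
    state.assessmentScore !== null && state.assessmentScore >= PASSING_SCORE;
  return slidesDone && scoreOk;
}

// Call this from the final screen. The "congrats" screen should only
// appear when this returns true, so the screen and the LMS agree.
function reportCompletion(state: CourseState): boolean {
  if (!allRequirementsMet(state)) {
    return false; // stay incomplete; route the learner to what's missing
  }
  API.LMSSetValue("cmi.core.lesson_status", "passed");
  API.LMSCommit("");
  return true;
}
```

The point is the single gate: the congrats screen and the LMS completion call run off the same check, so they can never disagree.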



2. Course doesn’t resume correctly.

This is an item many developers don’t test. Put simply, if a learner doesn’t complete the course in one sitting, they should be able to return to where they left off. If a learner gets interrupted mid-course, or is kicked out unexpectedly by an internet connection issue, a course that doesn’t resume properly becomes a huge source of frustration, especially for longer compliance courses.


In an ideal world, the course wouldn’t be locked, and it would be broken up into shorter lessons. But in the real world of compliance training, that’s not always possible. So having the course resume correctly matters.


Be sure to test this by exiting your course partway through and confirming that when you jump back in, you aren’t starting over from the beginning or landing on a random screen.


Guiding principle: Courses should allow learners to start back where they left off.
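
Again, most tools handle resume through a setting, but if you’re wiring it up yourself, the SCORM 1.2 handshake looks roughly like the sketch below. The `cmi.core.*` elements are part of the standard runtime; `gotoSlide` is a hypothetical call into your course player.

```typescript
// Minimal resume sketch for SCORM 1.2: save the learner's location on
// every slide change (and on exit), restore it on the next launch.

declare const API: {
  LMSGetValue(element: string): string;
  LMSSetValue(element: string, value: string): string;
  LMSCommit(param: string): string;
};

declare function gotoSlide(slideId: string): void; // hypothetical player call

// Save the bookmark whenever the learner changes slides.
function saveBookmark(currentSlideId: string): void {
  API.LMSSetValue("cmi.core.lesson_location", currentSlideId);
  API.LMSSetValue("cmi.core.exit", "suspend"); // ask the LMS to resume next time
  API.LMSCommit("");
}

// On launch, resume only if the LMS reports a returning session.
function resumeIfNeeded(): void {
  const entry = API.LMSGetValue("cmi.core.entry"); // "resume" on a return visit
  const bookmark = API.LMSGetValue("cmi.core.lesson_location");
  if (entry === "resume" && bookmark) {
    gotoSlide(bookmark);
  }
}
```

When you test, check both halves: that the bookmark is saved when you leave, and that it’s actually read when you come back.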



3. Breakable interactions.

If you’re creating any type of custom interaction, be sure to test it thoroughly on the same devices learners will be using. (Tip: Test thoroughly on a variety of devices even when working in plug-and-play tools like Evolve or Rise. Bugs happen!)


Tabs or other click-to-reveal interactions with audio? What happens if you click all of them at the same time? Are you trapped listening to multiple overlapping audio tracks? (This is terrifyingly common.)
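
If you’re building this kind of interaction by hand, one simple guard is to pause every other track before starting a new one. Here’s a minimal sketch using the browser’s standard `HTMLAudioElement`; the tab IDs and file names are made up.

```typescript
// Minimal sketch: allow only one tab's audio to play at a time by
// pausing and rewinding every other track first.

const tracks = new Map<string, HTMLAudioElement>([
  ["tab-1", new Audio("tab1.mp3")],
  ["tab-2", new Audio("tab2.mp3")],
  ["tab-3", new Audio("tab3.mp3")],
]);

function playTab(tabId: string): void {
  for (const [id, audio] of tracks) {
    if (id !== tabId) {
      audio.pause();
      audio.currentTime = 0; // rewind so a revisit starts clean
    }
  }
  void tracks.get(tabId)?.play(); // play() returns a promise we can ignore here
}
```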


Drag and drops? What happens if you drop them imperfectly into the drop zones? Do they overlap the other drop zones? Can you drag them off screen and lose them entirely? Can you click submit before completing the interaction and miss valuable feedback?
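
For that last one, a simple guard in a custom drag and drop is to keep Submit disabled until every drop zone is filled. A small sketch; the zone IDs and the `submit` element are hypothetical.

```typescript
// Minimal sketch: enable Submit only once every drop zone holds an
// item, so learners can't skip past the feedback.

const dropZones = ["zone-a", "zone-b", "zone-c"];
const placements = new Map<string, string>(); // zoneId -> dropped item ID

const submitButton = document.getElementById("submit") as HTMLButtonElement;

// Call this from your drop handler after a successful drop.
function recordDrop(zoneId: string, itemId: string): void {
  placements.set(zoneId, itemId);
  submitButton.disabled = !dropZones.every((z) => placements.has(z));
}
```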


Assessments? Is the assessment properly locked down in the menu to prevent jumping around? If learners can jump in and out of the assessment, what happens to their score?


What about return visits? Can learners navigate back to content when they return to the interaction, or are they locked out from re-engaging with it? Can they see all of the questions and feedback?


Guiding principle: Don't assume learners will complete the course as intended. Try to break the course in testing to discover potential issues and fix them before launch.



 

These are the top three things we notice in testing and updating courses created by others. Does your QA process check for these?




