Occasionally it pays to take a step backward. That’s the mantra that our three organizations — Type A Learning Agency, Humana Health Insurance, and Century 21 — have been playing with for years as we applied the concept of the scavenger hunt to e-Learning assignments. In this article, we’ll share that experience with you. We’ll give you an in-depth look at one of our learning programs that makes use of scavenger hunts as part of its design, and the kind of results that we have achieved with these activities. We hope you’ll be inspired to try this for yourself!
Because there are three of us authors, and because we have a lot of experience with scavenger hunts as part of e-Learning, this article will look a little different from what you may think of as a “normal” format. We’ll present our collective thoughts about scavenger hunt design as straight text. Then, when we are presenting our individual experiences, we’ll give you the information in the form of an extended quote, as if this were an interview article.
The e-Learning scavenger hunt model
So what is an “e-Learning scavenger hunt”?
The “typical” e-Learning blend assumes that asynchronous learning is, by definition, non-social. The scavenger hunt model addresses those situations in which your organization is facing challenges that traditional approaches to e-Learning cannot solve satisfactorily, as well as those situations in which you simply do not have the time or resources to build a full-blown simulated environment.
The e-Learning scavenger hunt is a balanced blend of a “synthetic” learning management system (LMS) controlled environment and a controlled experience in the “real” world. The learners work inside a computer-based training application (CBT), while gleaning experience and answers within the actual application they are being asked to learn.
The scavenger hunt model brings many advantages to the table:
• The need for time-consuming simulations is greatly reduced, making the scavenger hunt FAST and INEXPENSIVE to build.
• The scavenger hunt model is a fun and EFFECTIVE way to train.
• The scavenger hunt allows for the collection of metrics, but does so in a way that replaces “test anxiety” with a spirit of competition.
Anna Belyaev (Founder, Type A Learning Agency): “The scavenger hunt has proven to be universally popular through the ages for a reason. It’s intuitive, interesting, inherently customizable, and can take place anywhere.”
Bob Dick (Instructional Technology Specialist, National Education and Policy Development team, Humana Inc.): “I agree. One of the neatest things about the scavenger hunt model is that it opens up the opportunity for learners to get real work done. At the same time, the scavenger hunt is a practical exercise that feels as much like a game as it feels like learning, and we can deliver it at just a fraction of the cost of other methods.”
Ernie Brescia (Chief Learning Officer, CENTURY 21): “We use scavenger hunts that incorporate online and offline activities. For example, since we’re training real estate salespeople from across the country, we’re unable to drill down to local market data. Therefore, we send our students on a scavenger hunt to the census website, have them collect data about demographics in their market, and report back on their findings in a subsequent class.”
Two brief examples of scavenger hunt use in blended e-Learning
A scavenger hunt opportunity presented itself last year when Type A helped a one-hour photofinishing service replace a four-day classroom-training program with e-Learning.
Anna Belyaev: “It was one of those assignments that everyone suspected could never be satisfactorily accomplished via e-Learning. Being safe and effective in a photofinishing lab requires becoming intimately familiar and comfortable with a whole slew of physical equipment, chemicals, software, machines, and processes. No amount of simulation or rich media is going to produce sufficiently skilled and confident photo technicians.”
This challenge ignited the formulation of Type A’s Hit the Floor™ activity, which is based upon the scavenger hunt concept. In place of costly simulation, the course that Type A designed periodically sends learners out into the lab, armed with printed instructions for conducting a scavenger hunt. These scavenger hunt activities proved to be far more popular and effective than anyone anticipated.
Anna credits their effectiveness in part to the preferred learning styles of this particular audience.
Anna Belyaev: “These activities not only give learners a welcome break from sitting in front of a computer, but they respect the fact that photofinishing technicians, by and large, enjoy working with their hands and being of service to others. A lot of instructional designers make the faulty assumption that just because an audience is primarily ‘young,’ they’re going to find being on the computer more fun than doing the real work they have chosen.”
Store managers love the fact that Hit the Floor™ activities include exercises that get real work done, and that contribute in other ways to reducing the “seat” time required to get new staff up to speed in their roles.
Using these types of activities is great for data mining, but how about exercises where the students are in the field actually “hunting” for business? In the fast-paced world of residential real estate sales, keeping learners engaged is critical — and making it “real-world” is just as important. CENTURY 21’s “CREATE 2®” new agent program requires sales associates to hunt for homeowners who are attempting to sell on their own, contact them, and try to arrange for an appointment.
Ernie Brescia: “Getting the appointment is the goal of the exercise. And if the sales associate actually gets the FSBO (For Sale By Owner) to list with her, that’s a victory that gets the whole class excited!”
Humana tries online scavenger hunts
At Humana, similar success came from developing entirely online game concepts around what is essentially a scavenger hunt exercise framework. The National Education and Policy Development (NEPD) organization develops innovative and effective training for Humana’s Operations area, a group of about 8,000 learners. Bob and Anna met at an e-Learning Guild conference.
In the first effort of this methodology at Humana, learners were sent off to glean specific coordination of benefits (COB) data from the company’s actual central processing system, just as they would be expected to do on the job after their training was complete. Bob’s team made this scavenger hunt experience both fun and real by framing it in a playful context involving baseball, the company mascot HOWIE, a little healthy competition among learners, and a race against the clock. Bob maintains that the exciting atmosphere of the game added value by replacing the element of artificial “test stress” with a more positive “competitive” environment. This allows Humana’s training team to better identify and fill any gaps that exist between performance and expectations.
Bob Dick: “Learners demand that training be engaging and have a balance of meaning and value to what it is they need to be doing. Most simulations and games are no longer as ‘fun,’ or ‘new,’ or ‘engaging,’ as they may have once been. I think that’s in part because they’re generally so obviously fictional or grounded in the past that they hold little interest or meaning for the adult learner, who doesn’t have much time or tolerance in the work day for that particular kind of play. In this context, the scavenger hunt exercise is a breath of fresh air, allowing for the introduction of ‘hands-on’ fun and games that do not preclude getting real work done, in real time. That’s an aspect of the scavenger hunt that makes it as effective at senior levels of the organization as it is among the core workforce. In addition, because we are using an existing environment, we don’t have the high development costs that we do with our learning modules that are heavy with traditional, media-rich simulations.”
All three learning teams agree that the scavenger hunt should not be limited to the end of training.
Bob Dick: “At Humana, we very often administer the scavenger hunt somewhere in the middle of a training course — and again at the end. It is just as valuable a tool for identifying areas that need reinforcement as it is for assessing overall achievement — perhaps more so. Because it doesn’t ‘feel’ like a test, we are very often able to identify issues with a specific individual, a specific trainer, or with the training itself, and make corrections before we turn a group loose to do a job. Our goal is to have everyone come out of training prepared to perform — this model helps us to better achieve that goal.”
Humana expands the use of scavenger hunts
The numbers gleaned by Bob’s team from their initial launch and later projects seemed to back up his beliefs about the use and effectiveness of scavenger hunts online. Buoyed by the success of the earlier efforts, the Claims Processing management team at Humana approached the NEPD team to create and implement a cost-effective solution for paying interest to providers and members accurately and efficiently. Unlike earlier efforts targeted at new-hire instruction, the interest-payment training was aimed at a specific and measured performance gap. The challenge offered an opportunity not often available in e-Learning: to obtain specific, verifiable, and quantifiable metrics.
Humana is required to pay claims to providers within a reasonable timeline. In those rare instances when a claim (for whatever reason) has not been processed before the deadline, interest is generally owed to the provider or member who has filed the claim. The following issues complicate this process:
• The timeframe within which Humana must pay varies from state to state.
• The interest rate itself varies from state to state.
• In some states, interest is paid for ALL days after the due date, while in others interest is paid only for missed BUSINESS days.
• Interest due claims from providers and from members are handled differently between states.
• Some states charge interest on the amount allowed, others charge interest on the claim amount itself.
• Paper claims may be calculated differently than electronic claims.
• The date we first received the claim, whether we had to wait for additional information, and whether the claim is contested ALL affect the interest due — and the rules vary from state to state.
• Potential liability varies state-to-state, up to ~$100,000.
Because the process is extremely complicated, compliance with this requirement has been low (interest was only paid on about 35% of required claims during the month of November 2004, according to a status report dated December 2004). In those instances where interest was paid, statistics indicate that accuracy has been questionable (pre-launch statistics indicated about 24% — a full report is available on request). While the amount of money per claim paid is not overly significant, the potential liability to Humana for failure to pay the required and correct amount of interest could range from $100 to $1,000,000 per case depending upon individual state regulations. In addition, it was, and remains, Humana’s goal to provide top-notch service to its members and providers, and such low quality numbers were deemed inconsistent with those goals, and therefore unacceptable.
To try to simplify the process, the Process Organization inside Humana created a simple application that would standardize the calculation process. Their solution was to create a simple calculator that a claims processor could use to quickly and correctly compute the interest due for any given claim, in any given state. The NEPD team, under Bob’s leadership, was asked to produce a training solution, and to assist in the launch, use, and monitoring of this tool to affected Associates.
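The article does not publish the calculator itself or Humana’s state tables, but the per-state rules described above lend themselves to a simple data-driven sketch. The following Python is purely illustrative: the state names, deadlines, and rates are invented placeholders, not Humana’s actual figures.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical per-state rule set. The real tool and its state tables are
# not public, so every value below is an invented placeholder.
@dataclass
class StateRule:
    deadline_days: int        # days allowed to pay before interest accrues
    annual_rate: float        # statutory annual interest rate
    business_days_only: bool  # some states count only missed business days
    on_allowed_amount: bool   # interest base: allowed amount vs. claim amount

RULES = {
    "WI": StateRule(30, 0.12, business_days_only=False, on_allowed_amount=True),
    "KY": StateRule(45, 0.10, business_days_only=True, on_allowed_amount=False),
}

def days_late(due: date, paid: date, business_only: bool) -> int:
    """Count late days, optionally skipping weekends."""
    if paid <= due:
        return 0
    if not business_only:
        return (paid - due).days
    # Count only weekdays between the due date and the payment date.
    return sum(1 for i in range(1, (paid - due).days + 1)
               if (due + timedelta(days=i)).weekday() < 5)

def interest_due(state: str, received: date, paid: date,
                 claim_amount: float, allowed_amount: float) -> float:
    rule = RULES[state]
    due = received + timedelta(days=rule.deadline_days)
    late = days_late(due, paid, rule.business_days_only)
    base = allowed_amount if rule.on_allowed_amount else claim_amount
    return round(base * rule.annual_rate * late / 365, 2)
```

Under these made-up rules, a Wisconsin claim received November 1 and paid December 15 would be 14 calendar days past its 30-day deadline, accruing interest on the allowed amount; a claim paid on time returns zero. Encoding the variations as data rather than branching logic is one plausible way a “simple calculator” could keep fifty states’ rules maintainable.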
Two major concerns with training were lost work time and overall effectiveness. The belief was that conventional, instructor-led, “team huddle” training would take up to an hour per Associate. A further concern was that a tool demonstration by an instructor would not be an effective way to teach Associates to use the tool correctly. Operations and NEPD were in absolute agreement that the only way to get Associates to use the tool would be to have them try it themselves.
With these concerns raised, the decision was made to deliver training to Associates at their desktops through an interactive, computer-based training module that integrated standard simulation training with a controlled foray into the real application itself: in other words, a “scavenger hunt.” The computer-based training first covered the theory behind the training, in order to communicate to the adult learner what was “in it for them.” A short interactive section followed, where learners practiced the basic functionality of the tool itself. This was capped by a “contest” in which the learners used the tool to compete with one another in the real environment. Humana’s Integrated Learning Group tracked the key metrics of accuracy and speed for NEPD through the Humana LMS.
The “rules” or instructions for the scavenger hunt are generally quite skeletal. We assume that the learner now knows the basics of the application and, beyond explaining the concept, will need little guidance. (See Figure 1.)
Figure 1 The instructions for the CIS scavenger hunt (Humana Inc.) are simple and primarily focused on the concept.
The instructions develop a spirit of friendly competition, drawing the learner’s attention away from the fact that this is an exam (it is being graded and scores are being recorded). We give the learner feedback whenever the learner provides a wrong answer. (Just because it is an exam does not mean we cannot teach.)
It is important to note that internal research indicated that while this methodology would be (and has been) quite successful for the overwhelming majority of the target audience of Humana associates, approximately 10% of users could be more effectively trained by simply giving them the information and asking them to read it. In support of this need, alternate training was developed and placed on Mentor, Humana’s document database.
The developers believed ahead of launch that computer-based training (CBT) would cut lost time at least in half (from 60 minutes × ~350 associates to 30 minutes × ~350 associates), and that the training itself would be more hands-on and effective. Post-training metrics showed that reality outstripped even the most optimistic estimates.
Measured time showed even greater savings: the mean time spent on the CBT was 13 minutes, 38 seconds, and the median was 11 minutes, 18 seconds, far below the 30 minutes estimated. Accuracy and speed also rose well above their targets. (We’ll discuss specific numbers later in this article.) As astounding as the results was the fact that, because of the blended approach to the training, NEPD was able to develop the entire training piece (support documents included) in less than three days.
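The time savings can be checked with quick arithmetic. The associate count of 354 below is inferred from the stated totals (~350 associates, 354 assumed hours); the other figures come straight from the paragraphs above.

```python
# Sanity-check the reported training-time savings.
# The count of 354 associates is inferred from the article's stated totals;
# it is not given explicitly in the text.
associates = 354
assumed_minutes_each = 60          # estimated huddle-training time per Associate
mean_cbt_seconds = 13 * 60 + 38    # measured mean CBT time: 13 min 38 s

assumed_hours = associates * assumed_minutes_each / 60
actual_hours = associates * mean_cbt_seconds / 3600
savings = assumed_hours - actual_hours

print(f"assumed: {assumed_hours:.0f} h, actual: {actual_hours:.1f} h, "
      f"saved: {savings:.1f} h")
# → assumed: 354 h, actual: 80.4 h, saved: 273.6 h
```

The result lines up with the roughly 80.5 hours of actual lost time and 273-plus hours of savings reported later in the article.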
A challenge Humana faces is keeping up with the regulatory changes each state makes to how interest is paid. It is critical that all relevant associates use the same version of the tool every time. NEPD organized the update process and worked actively with the IT team to decide where the tool would be stored, to allow restricted access for associates, and to give operations and the IT team absolute control of the application itself. NEPD arranged for the creation of the network drive, and security for that drive, for all relevant operations people, and ensured that all relevant documentation and training was made available to a global partner’s personnel. (The global partner has implemented this tool, but its numbers are not reflected in any outcomes in this report.)
Impact and metric results
As stated earlier, this project resulted in an increase in first-attempt accuracy of over 73 percentage points and a reduction of over a minute in the time to calculate interest on the first attempt. We used a pre- and post-test model that focused on accuracy and time, using a random sample (n=40, 11.3% of the target population) of affected Associates located in the Green Bay, WI and Louisville, KY call centers.
• Speed per claim (first try only) was reduced by over 60 seconds per relevant claim, from 96 seconds to under 36 seconds.
• Accuracy per claim was raised from 24% to 97.5%.
• Lost time in training was reduced by over 273 hours, from an estimated 354 hours to 80 hours, 26 minutes, and 30 seconds.
On the pre-test, the Associates had an opportunity to correct their response. 25% (n=10) arrived at the correct answer on the first attempt. An additional 40% (n=16) arrived at the correct answer on their second attempt, 5% (n=2) more arrived at the answer in three or more attempts, and 30% (n=12) never arrived at a correct answer (given up to minutes to try). After receiving the new tool, one Associate (2.5%) made an error during the timed first attempt posttest.
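The pre-test counts above can be checked against the sample size and the quoted percentages; all figures come from the paragraph itself.

```python
# Pre-test attempt distribution from the article's random sample (n=40).
n = 40
attempts = {"first try": 10, "second try": 16, "three or more": 2, "never": 12}

# The four groups partition the whole sample.
assert sum(attempts.values()) == n

for label, count in attempts.items():
    print(f"{label}: {100 * count / n:.0f}%")
# → first try: 25%, second try: 40%, three or more: 5%, never: 30%
```

The percentages match those quoted, and the single post-test error (1 of 40) likewise works out to the reported 2.5%.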
After realizing, without prompting, that the initial result was clearly out of line, the associate self-corrected that error in an additional 27 seconds.
See Table 1, below, for a complete summary of the results.
Table 1 Summary of project results.

| Metric | Value |
| --- | --- |
| CBT Development (total) | 22 hours |
| Lost Time Assumed (1 hour per associate for huddle training) | 354 hours |
| Actual Lost Time (all associates, as of 2/1/2005) | 80.5 hours |
| Lost Time Savings | 251.5 hours |
| Average time to calculate interest per claim (before) | 96 seconds |
| Average time to calculate interest per claim (after) | 35.8 seconds |
| Time improvement per claim | -60.2 seconds |
| Accuracy (before) | 24% correct (1st try) |
| Accuracy (after) | 97.5% correct (1st try) |
| Accuracy improvement | +73.5% |
Usage of Humana’s document database (Mentor) through the launch period was tracked in terms of Mentor Alert Viewings and Mentor Guideline Launches. Note: ~50 associates used the Mentor documents exclusively.
Information ecosystem health is as much about people’s behaviors and information practices as it is about technology. This particular scavenger hunt demonstrated that integrating training into a workflow-learning environment can be successful and can positively impact performance.
The measurable success of this and earlier projects has prompted Humana to integrate the idea (whenever the methodology is deemed appropriate) into an aggressive strategy to close performance gaps in lieu of the time-honored tradition of “team huddles.” While these meetings did (and continue to) foster a sense of unity, the time spent demonstrating new procedures simply took longer and was not as effective as allowing the associate to practice them in a controlled manner.
Learner reaction to scavenger hunts
The Humana team has noticed an interesting trend whenever it builds on the scavenger hunt model.
Bob Dick: “We don’t usually hear directly from our learners. If we get feedback at all, it is generally from the facilitators in those cases where we are working with a blended approach. However, in the case of our scavenger hunts, our phones literally ring off the hook and our email boxes load up the day we launch them. In some cases people are arguing that THEIR answer was correct (and the assessment incorrect — the caller is generally wrong!), but they are usually commenting about how the training broke their routine, and provided real information and real skills. This unstructured and unsolicited feedback provides the Humana team a great deal of guidance. Regardless of whether it is a compliment or a complaint, the calls and emails tell us that the learners are engaged. There is no such thing as bad unsolicited feedback — we learn from all of it.”
Anna Belyaev: “It’s when your learners are silent that you worry. We don’t have silent learners when we use the scavenger hunt model.”
The only downside to this methodology the Humana team has been able to identify is that maintenance of scavenger hunt modules is generally more labor-intensive, and needs to be done more often. Maintenance of the modules themselves (and the assessments in particular) is consistently higher than it is with some of the more static, simulation-driven offerings. Over the course of a year, the average scavenger hunt must be updated five to six times in order to keep content in line with the dynamic “real” environments with which it is keyed to interact. The more static courses, by contrast, require updates only once or twice a year. This updating is accomplished through regularly scheduled meetings with content owners and SMEs, and through a feedback system that allows corporate facilitators to inform the development team as changes are being integrated into the given systems.
In spite of slightly higher upkeep costs, Humana remains committed to scavenger-hunt learning. Even with the upgrades, the cost of development remains considerably less than simulation or typical game driven modules. Additionally, by keeping the material and approach fresh, Humana associates are being trained on the exact (or very nearly so) environment and situations they will be seeing when they hit the floor and are expected to perform.
Of course, any activity or methodology is going to get boring after a certain amount of time or repetition. So, if you’re finding it harder and harder to satisfy learner demands for something “new and cool” under today’s corporate budgetary conditions, or you’re looking for better ways to get real learning and work accomplished, we certainly recommend giving the old scavenger hunt a new spin.