
Nuts and Bolts: The 10-Minute Instructional Design Degree

by Jane Bozarth

September 6, 2011



“But here’s the thing: regardless of what side of the fence you’re on, whether all designers should have formal training is not the same as whether they will.”

Many people working in training and instructional design came to it through side doors, such as topic expertise or Web design. Is this a problem – or just a fact of life? Either way, what can you do about it?

There are heated debates about whether everyone working in the field should have formal training, as well as discussions of the pros and cons of academic instructional design programs. I’ve seen great designers who had no background at all in the field; I’ve seen terrible designers with every certificate under the sun. But here’s the thing: regardless of what side of the fence you’re on, whether all designers should have formal training is not the same as whether they will. The HR office with a resident training specialist, charged with “putting New Hire Orientation online,” isn’t going to wait for the specialist to finish a two-year Master’s program. I get that. So I recommend:

  1. Design assessments first. Too often we create assessments and tests as an afterthought, in a scramble after the training program is otherwise essentially complete. The result? Usually, it’s a pile of badly written multiple-choice questions. When approaching a project, ask: “What is it you want people to do back on the job?” Then, “What does successful performance look like?” “How will you measure that?” Design that assessment first. Then design the instruction that leads to that goal.
  2. Don’t let the “objectives” obscure the point. I watched the launch of an expansive new employee timekeeping system. The “eLearning team” developed a vast array of lovely modules. They based these on “learning objectives” that had been wordsmithed and vetted and approved six levels up. Those objectives? “Employee will: assign charge object numbers to hours; distinguish types of overtime, holiday, and callback pay rates; generate leave requests; calculate vacation quota.” Sounds great, right? Except that the day the system went live, no one could actually complete a time sheet. Turns out the objectives didn’t really address that. Nor did the module content. Nor did the assessments (see item 1 above). Oops.
  3. No tool will take your content-heavy slides and turn them into an engaging, interactive eLearning experience. You cannot put a Volkswagen into a carwash and have it come out a Lexus. Good eLearning is about design, not software. So invest time and money in talking with performers and learning about the content, and in good books and other training (see “Want More?” below), not in buying and learning to use more products that will not live up to promises. Also – instructional design and visual design are different things. Visual design is just as important (and it isn’t about making things “pretty”), and it needs to be done before the development phase begins.
  4. Design is finished when there’s nothing left to take out. It’s a problem that has bedeviled learning practitioners since the first trainer put stick to dirt: The HR department wants you to include the whole policy. The legal department wants a disclaimer. The owners want to include the company history. The subject matter experts want to include every possible extenuating circumstance. Along the way others say, “Be sure to include…” or, “Don’t you think we should mention…?” And the program grows … and grows … and grows. Revisit your assessment (remember that? You designed it before anything else). Will this information in any way affect successful performance? If not, can you add a link to the policy and an e-mail introduction with a disclaimer? Can you cut the verbiage? (Here’s a trick: what if you had to pay $5 per word? I bet you could cut it then.) Or, if there’s really that much content, should you break it into pieces?
  5. There is no such thing as fidelity. This has always been a problem for those in the classroom training business, and it didn’t disappear when eLearning came along. Years ago I designed a set of stand-alone tutorials for new supervisors, kind of an “emergency kit” to help them during their first 90 days on the job, often before they went to formal training. Modules covered things like quick overviews of the company hiring process, basics of the performance review system, supervising former peers, that kind of thing. I designed it to be exactly what it said it was – a set of stand-alone modules, more performance support than “training,” easily available independent of each other, to provide just-in-time, just-for-me help as an incumbent encountered new situations. Within weeks I found out that classroom instructors were assigning these modules as pre-work to other training, and workers were e-mailing me, panicked about needing a completion form for a “course” that didn’t even exist. The once-useful program quickly stopped being a “toolkit” and started being one more mandatory training requirement, assigned for completion during the first week of work, apart from any real need for use. Sigh. Another time, I was being held prisoner in a compliance course when the instructor announced she’d found this “great eLearning program” and proceeded to project a stand-alone tutorial on the wall, played the audio through speakers, and did all the clicking and interactions and simulations herself while everyone watched. Sigh again. Here’s the deal: just because it’s online doesn’t mean you can control it. Build it with this in mind.
  6. Beware of Clicky-Clicky-Bling-Bling (CCBB) design. I mentioned this in an earlier column – and thanks to Cammy Bean for coining the term. Zooming and spinning words, irrelevant animations, and neon “next” buttons do not heighten engagement. They confuse and distract learners. An example? Kevin Thorn’s fabulously CCBB eLearning Christmas card.
  7. Learn to say “No.” Or at least learn to say, “You know, I don’t think that will solve the performance problem. Can we talk about some ways to get you a better solution?” Designing a solution that doesn’t solve the problem, or that makes it worse, doesn’t do anything to help the organization, the field, or your department’s reputation – or your own. “Instructional designer” is a job title. “Performance consultant” is a mindset.
  8. Instruction does not cause learning. Etienne Wenger: “Instruction does not cause learning; it creates a context in which learning takes place, as do other contexts. Learning and teaching are not inherently linked. Much learning takes place without teaching, and indeed much teaching takes place without learning.” In other words, knowledge acquisition doesn’t cause behavior change. People learn through experience, through making mistakes, through trying things out, through talking things through with others. Don’t just deliver facts and “content,” but provide meaningful exercises and activities that can help to “cause” learning. Provide performance support tools. Insinuate the learning into the social spaces in which the workers operate. Help the instruction become part of that context in which the learner can learn.

What would you add to the “10 Minute ID Degree” program?

Want more? Here are some great resources for basics of instructional design:

Cathy Moore’s blog

The Rapid eLearning Blog

Clark, R., & Mayer, R. (2011). e-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning (3rd ed.). San Francisco: Pfeiffer.

Schank, R. (2005). Lessons in Learning, eLearning, and Training. San Francisco: Pfeiffer.


Ok, catchy but maybe misleading title. Like a "crash course in driving."
Just because you've got the checklist doesn't mean you'll solve the problem. I believe that your 8 items for consideration are so powerful that each one deserves 10 minutes of reflection. And once that has taken place, maybe you'll feel the pain of missing the mark the next time you're on site.
My opinion is that there is no way that anybody without prior knowledge, experience, some serious mistakes, and intuition behind them could comprehend the titles of each of your points - never mind the substance behind them.
I'll definitely think about and reflect on all of your ideas, but can't go with the 10 minutes thing.
I would add that you should anticipate that your work may not be understood or appreciated as you think it should be. My masters program prepared me for the nuts and bolts of ISD, but my head was spinning when my first design doc came back from the client multi-colored with tracked changes. "What?! Don't they know I poured blood, sweat, and tears into that?!? I thought they would LOVE it!" What we do is creative; sometimes emotions get involved. And most people don't really know what it is we really do anyway...or what value we add. It's the nature of the business. Manage client expectations, be clear about your role and the role of the SMEs, and don't ever call anything "final."
That's a good idea, sleblanc. I do a workshop called "Instructional Design for the Real World" that has a small piece on feedback and how to take it. Essentially the message is, "all feedback has value," and try to step away for 3 days before having a reaction to it. But it's hard. (Also: Ask to have it back on Monday or Tuesday, not on Friday afternoon ;-) )
Solid tips, Jane... and might I add, more helpful than much of the advice one might get in an actual ID degree program.

I have one complaint, though... While your column is called "Nuts and Bolts", there was nothing actually about nuts in this article. Bolts either, in fact. And now that I think about it, there never is!

You should give some serious thought to your misleading naming strategies.
LOL Judy I'll keep that in mind. The banner image has pictures of nuts and bolts; does that count? Thanks for commenting-- I agree, this is the kind of thing ID degree programs should include but often don't.
Yes, as with many professions, there are differences between theory and practice in ISD. You certainly spend a couple years getting your ISD "licks." Now, I have ways of doing things...similar to what you suggest with deadlines and client expectations, etc.
My favorite is the tool- so many learning professionals are 'what tool do you use?' 'have you seen this tool?' 'omg this tool is going to change elearning'. No tool can take the place of a good designer, and a good designer can use any tool.
I would add - keep assessment authentic. Life doesn't test us in 10 question multiple choice quizzes (and I'm pretty sure clients don't either), so why do we test learners that way? Cheap, perhaps. Easy, yes.
I agree that no tool will fix a poor design or solve all e-learning problems. However, I think some tools are more flexible than others, and sometimes "anyone can do it" means "I made a PowerPoint. With a quiz. It's eLearning!" I think some tool biases come from seeing that happen too many times. But you are absolutely right - a good ID can figure out how to make it work no matter what the tool, and the tool doesn't make eLearning work. I enjoyed this post!
Lots of comments on this one... let's see, #7 anon. "No tool can take the place of a good designer, and a good designer can use any tool." Well said.

And #9 Anon (another or the same Anon?) You know I wrote a book about using PowerPoint to create good eLearning, right? :-) I think in your example the person is confusing a bad online presentation with 'eLearning'. Nah-- it's eReading. But you can do a lot with PowerPoint if you're willing to learn (see Tom Kuhlmann's Rapid eLearning blog for plenty more on that). And there's certainly plenty of bad stuff built with high-end products! I'm glad you liked the post and thank you for commenting.

And Chamblan: Amen. Life is not a multiple-choice quiz. As often as not it's an open-book test. So why don't we invest more time in performance support and less in 'courses', I wonder?

Thanks to everyone for the feedback.

Good, concise info! I agree with your points 2-8. However, I would differ with you that assessments be written first. Objectives should be written first, but the challenge, as you implied, is that content experts often don't know how to write objectives, nor do they even know the goal of the module. Objectives need to be written based upon the learner's gap between what is known/unknown or practiced/not practiced. Assessments should mirror the objectives. As a "Masters Degree" to your 10-minute instruction, may I suggest reading Clark and Mayer's e-Learning and the Science of Instruction!
I would add task analysis. It is an essential ingredient to success for any type of training that involves the performance of a task. And, I mean a task analysis that is done while doing or observing the task to be trained on. Also, the analysis needs to incorporate the different participants in the process. For your example, you would need analyses for at least both the person who fills out the timecard and the person who approves it. Frequently, this is viewed as taking too much time, but it is an investment that pays off throughout the process. And forgoing it, more than anything else on your list practically, leads to ineffective and wasted training. When I teach train-the-trainer courses, I can tell many of the people who did the task analysis from memory, because they start to divert from their lesson plan when they demonstrate the task. Maybe you believe this is covered in your paragraph on assessment, but I would separate it and make it more explicit. Leaving out this analysis is truly a false economy.

An addition of lesser importance but one that can save considerable money and time is to pilot the training at least initially in the classroom as soon as initial visuals, verbiage and evaluation are available. There is almost always an initial group that could form the class for pilot training. Their responses, questions and discussion will prevent a great deal of confusion and errors that will occur during e-learning. You remarked during your presentation on "The Truth about Social Learning" on the paucity of post training evaluations. Frequently there is a great time pressure on getting the e-learning out the door and little follow-up on its effectiveness. Pilot classroom training may be the best shot at heading off problems before the e-learning is far down the road.