I’ve seen lots of great courses, but this time I want to tell you a tale of a course that, despite the best intentions of the training organization, was just plain awful. I not only observed this course, I took it, so I was motivated to learn from the start. But, try as I might, my 40 years in the L&D business got in the way, as I found myself critiquing the course even as I struggled to learn from it. What made this course such a bad training experience, and what can we learn from it?

The environment

Before getting started with the course itself, you need to know that each of the seven sessions ran multiple hours, and we sat on old bridge chairs, mostly in the dark (to accommodate the slides). “Uncomfortable” is an understatement.

The “textbook”

The 400+ page book is marred by authors who didn’t edit their content down to what was actually needed, technical writers who never seemed to master common sentence structure, graphic designers who couldn’t annotate their work clearly without drowning it in unnecessary detail, and editors who didn’t seem to understand the term “readability.” In other words, a mess. Although it claims to have been updated, it contains countless out-of-date references, wrong answers to the chapter quizzes, and entire sections that the instructors freely admit are “unnecessary.”

The PowerPoint slides

Don’t get me started. Okay, I’ll get started. There are hundreds and hundreds of them: exact copies of images from the book, too small and too dense to see, coupled with massive bullet-point lists. The use of nearly every animation option available borders on the hallucinogenic (oh Microsoft, what have you wrought?). I’m kidding only a little, but you get the idea. This wasn’t just “death by PowerPoint,” it was “suicide by PowerPoint”; the misuse of the slides drove many of the other problems in the course.

The instructors

The instructors (there were several), all SMEs, have been “teaching” this course for many, many years; they are enthusiastic about their work, have lots of experience, and sincerely want to help us learn. But they clearly have little or no instructor training. They read the slides verbatim, had trouble explaining content in lay terms, wandered into meaningless “war stories” that were not relevant to the purposes of the course, used enough acronyms to create an endless alphabet soup of unexplained gibberish, and, unfortunately, occasionally used male-dominated, sexist language (they’re all men), not in a mean or deliberate way, but in a way that was clearly uncomfortable.

The instructors constantly told us, “What I am about to teach you (or, ‘what I just taught you’), you don’t need to know.” It’s hard to say whether they were simply compensating for bad, outdated, or unnecessary content, an over-abundance of required slides to cover, a lack of teaching skills, the absence of a sound instructional design, or all of these, but it’s clear that the participants were totally confused about what was important and what wasn’t.

The instructional design

There isn’t any. This is a skills course in which no skill is ever actually practiced. In fact, the entire course never uses any realistic representation of the objects being trained on. It’s like learning to start a fire without ever actually doing it, or learning to drive a car without ever getting into one (or even seeing one!). There is also no logic to the sequence of the content, unless following the textbook is someone’s idea of design, but as I mentioned, the textbook is a disaster. If you think about Bloom’s Taxonomy, this course starts at the bottom level (remember) and stays there.

The final exam

At the center of it all is “the exam,” a 100-item multiple-choice test given at the end of the course. It’s important to know this about the exam: all the questions are taken from the chapter quizzes in the textbook. About 25% of the questions, I would guess, could be answered without ever taking the course, either because the answers were so obvious or because so many of the distractors (choices) were completely ridiculous. If that weren’t enough, the instructors constantly told us which of the chapter quiz questions would be on the final exam. Conclusion: you don’t actually have to go to class, and you don’t actually have to read the book; all you have to do is get comfortable with the chapter quizzes (and answers), and, whether you have actually learned anything or not, you will pass.

And then there’s this: the answers to some of the chapter quiz questions were incorrect. The instructors suggested that if we saw those questions on the final exam, we should answer according to the existing (wrong) answer key so that our answers would match it. Wow!

Participant reaction

Despite all of this, as I talked to many people in the class, their reaction to the course was far more positive than mine. “I learned a lot,” was one response. “The instructors were great,” was another. These weren’t outliers; they were typical, and they extended to comments on the course website. I was dumbfounded. Were their expectations lower than mine, or were they just being nice? Perhaps they are so used to classes like this that they believe this is what adult training is, and always will be. It makes me wonder how big a rock we, as a profession, have to push uphill to help learners, their managers, and especially ourselves become effective and critical consumers of training program quality.

The moral of this story

Why tell this story of such a deeply flawed course? Because by looking at agonizing experiences like this and noticing the factors that contribute to the problem, we all have an opportunity to avoid them in our own work. My guess is that you have seen all of these flaws before, perhaps not in a single class, but they’re out there. Some may be more obvious than others, and some may be so critical that addressing them should be a top priority.

You can eliminate, or at least substantially reduce, these training problems through better instructor selection; a train-the-trainer program focused more on good teaching skills, interactivity, and engagement (with appropriate teaching observations and feedback); and higher-quality learning and presentation materials.

But most importantly, you can address bad courses, and bad teaching, in two essential ways. First, with a sound instructional design plan, coupled with a delivery strategy that makes sense to the learners and gets them involved, especially through practice and feedback. And second, with an appropriate evaluation strategy that measures not just memory or recall, but real, practical, applied, and valued learning, so that participants don’t just walk away with the right knowledge; they also walk away with the confidence that they can do what they were taught.

And converting this classroom course to eLearning would probably not have helped; in fact, it could have made things worse. If you have bad training, moving it to eLearning without first improving it will likely just result in more efficiently delivered bad training.

Dumping a pile of content on someone’s desk and simply saying, “go teach this,” no matter how well intentioned, is a recipe for disaster, and often a disaster you don’t see because you are so used to the way things are. As someone once said (I’m not sure who), “If you always do what you’ve always done, you’ll always get what you’ve always gotten.” Don’t let this happen to you.

Yes, the fix would take more time and cost more money, but if you have lousy courses like this one, you really have no choice. You can’t leave it all as is; the downstream consequences of poor training are just too high. If you have to, take money out of new development and upgrade what you’ve already produced. You can’t wait for everyone to see what you see. Even if your learners are not demanding it (yet), do it now anyway.

By the way, I passed the course (I mean, the exam) with flying colors, so I must have learned something, right? But I have no real idea what that “something” was, whether it was important or trivial, or whether I was simply a good (or lucky) test-taker. More critically, I have no confidence in how much of it, if anything, I can actually apply.

Your learners may feel as I did, or they may not. Their reactions (Kirkpatrick Level 1) may be a good barometer of course quality, or they may be ambivalent. It doesn’t matter; whether or not others had a bad training experience, if you have one, do something about it.