Is Instructional System Design Dead? Why there are better questions to ask

Written By

Marc Rosenberg

May 10, 2004

Every few years, the debate rises anew: is instructional systems design (ISD) dead? In articles and conferences, as well as on the job, we debate the merits and the challenges facing the field. The arguments against ISD usually center on its perceived inflexibility and the excessive time it takes to go through the process. The arguments for ISD cite its systematic approach and evidence that, if followed, you’re likely to produce more effective training.

These debates are certainly good theatre, but they really don’t help us much. We usually end up with answers like, “well, yes and no,” or “it depends.” In our zeal to determine why projects take too long or cost too much, or why they don’t work as expected, many assume the fault lies with the ISD process itself — an assumption that, all too often, results in the abandonment of instructional design altogether. To evaluate ISD properly, here are four better questions.

First, do we rely too much on simple, linear ISD models, such as ADDIE, and forget the complex nature of the process? Mention instructional design and most people think of the traditional ADDIE model — analysis, design, development, implementation and evaluation. We criticize the model’s linearity and its implied “required” steps, and suggest that this takes too long. This can lead to the conclusion that ISD is dead. Yet actual practice often looks a lot different, more iterative and heuristic. More experienced practitioners will say that the model has very little to do with how they actually design instruction, and they don’t make the mistake of confusing models with practice.

Second, do we put process over results, and project management over learning? We may look too much at means over ends — the quality of the ISD process at the expense of the quality of its outcomes (how the instruction — its logic, interactivity, feedback, reinforcement, etc. — is actually designed, and what the learner actually knows and can do). Often, the preparation and subsequent management review of documents such as needs assessments, task analyses, and design documents take more time and energy than the design of the actual learning product. So when people ask if ISD is dead, perhaps they are focusing less on how courses are designed, and how effective they are, than on how the design process is managed, documented and audited.

Third, do we agree on just what instructional designers are, what their role should be, and whether or not tools can substitute for skills? Can anyone do instructional design, given new tools that appear to make it easy (for example, the proliferation of tools that purport to turn slide shows into effective instruction), or is it a set of complex activities that requires more extensive professional skills? If you believe that the process can be automated so that instructional designers are only marginally needed, and that courses can be developed by SMEs alone, you likely see its purpose as making it easy to plug as much content as possible into predetermined templates. If you see instructional design as a unique, creative process requiring a specific skill set, and the instructional designer as much more critical to overall success, you’re more likely to reject the “anybody can do this” attitude. Like models, tools can be used to good effect, or they can be misused. Processes need to adapt to automation, but without an underlying approach and a skilled user, the automation may be meaningless.

Finally, is ISD still the right approach (or the only approach) for the job? Whereas we were once called on solely to develop quality training, we are now asked to “blend in” a variety of knowledge management, collaboration and performance support interventions, as well as incentive, environmental and selection systems, all focused on individual and organizational performance improvement, rather than just learning. The debate about ISD may have relevance for instructional solutions, but it becomes less useful as we expand the boundaries of what we do. Moreover, the very nature of traditional ISD practice often puts “needs assessment,” for example, in the context of an instructional solution which, when we consider the overall performance picture, may be a premature and inappropriate assumption. This could send us rushing to a training solution that is also inappropriate. ISD may not be dead for true instructional solutions, but as new, non-instructional interventions become more important, new, more expansive approaches seem needed.

The debate about the efficacy of ISD is extremely contextual. It works, or it doesn’t, depending on how it’s used in a particular case. The ultimate question we must answer is whether or not ISD is appropriate for the task at hand, and how much we embrace its underlying principles. If we do, it matters not what model we use, how it’s managed, or what people or tools we apply to the effort. If we buy into an instructional design philosophy, and have sound expectations for what it can and cannot do, it is very much alive.

(My thanks to Dr. John Larson for editorial support and insight on this topic.)
