
Why I LOVE Instructional Objectives

by Allison Rossett

July 12, 2012



Hate, Marc? You hate objectives?

You write that it really is not hate that you feel. It’s concern. You say you are concerned about the overuse of instructional objectives.

That isn’t the problem I run into. I see more underuse than overuse.

Objectives work for me

When I look at objectives, I gain insight into what the program is all about. I notice that the program is making idiotic claims, or I am delighted to see realistic ones. Just by glancing at a few objectives, I can discern the instructional designer’s grasp of audience, needs, culture, and concreteness. Is she tuned in—or is she promising perfunctory, murky, or overly ambitious outcomes? Am I looking at marketing pap masquerading as an instructional objective? When I am called upon to teach, coach, or judge, objectives help me get a handle on the designer and the work product.

Objectives work for the organization

A wise organization is concerned about the programs it places before its people. How better to render smart selection or revision decisions than to scrutinize objectives? How better to judge suitability than to consider those objectives given organizational priorities? (Are we crossing “t’s” and dotting “i’s,” or are we committing to influencing minds, hearts, and bellies?) How better to judge quality than to consider how well the objectives match the strategies, practices, and tests? How better to judge value than to examine the cost of the program in light of the worth of these promises as articulated in the objectives?

Savvy executives are not likely to be schmoozing about the relative value of objective formats a la Mager vs. Gagne. What leaders would appreciate instead are tangible, pithy statements about how their people will be different and better as a result of investment in this program. Give them objectives.

Objectives work for instructional designers

I’ve taught ABCD objectives (audience, behavior, condition, and degree) on many continents, and in universities, companies, and government agencies. I promise you that students of instructional design appreciate ABCD. They rely on the mnemonic to help them produce and screen their efforts. Admittedly trickier is demonstrating where objectives come from, establishing that valuable link between the tasking, analysis, goals, and objectives. That is a story for another time.

Here are reasons for embracing the ABCD parts of objectives, presented in Table 1.

Table 1. The ABCD Parts of Objectives

What is it?
Audience: For whom is the program intended? Who will be changed as a result of the experience?
Behavior: What behavior is intended? What will they be able to do?
Condition: Under what circumstances will performance occur? Aided or unaided? If aided, how?
Degree: What is good enough? How fast? How well? With what results? To what standard?

Why bother with it?
Audience: Reminds IDs and instructors to attend to the learner, not to what they themselves will do.
Behavior: Are we talking about defining, identifying, listing, creating, constructing, repairing…?
Condition: Will we expect the learner to perform by heart OR to rely on a job aid or mobile device?
Degree: This is an era of accountability. What will satisfy?

Examples
A. The medic (Audience) will be able to identify the kind and severity of the burn (Behavior), given pictures of burns and a device with names and pictures of burns (Condition), with 90% accuracy, to match a veteran medic (Degree).
B. The medic (Audience) will be able to apply the proper bandage (Behavior), under fire from the enemy, on the battlefield (Condition), so that the right soldiers return to the field and do not get infections (Degree).

When instructional designers work with attention to ABCD objectives, they grapple with weighty matters like enablers and aided performance. Enablers ask this question: In order to do that, what must the audience know and do first? In the example above, A (and other objectives too) enables B. The medic cannot treat until he knows the severity of the burn. He cannot decide whether to send the soldier to the field or to the field hospital without that diagnosis.
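For instructional designers who keep their objectives in a tool or spreadsheet rather than in prose, the ABCD parts and the enabling links lend themselves to a simple structured record. Here is a minimal sketch in Python; the Objective record, its field names, and the missing_parts screen are hypothetical conveniences for illustration, not any standard, but they show how the medic objectives from Table 1, and the fact that A enables B, might be captured and screened for blank parts.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical record for one ABCD objective; names are illustrative only.
@dataclass
class Objective:
    name: str
    audience: str = ""
    behavior: str = ""
    condition: str = ""
    degree: str = ""
    enables: List[str] = field(default_factory=list)  # objectives this one enables

def missing_parts(obj: Objective) -> List[str]:
    """Screen an objective for ABCD parts that were left blank."""
    parts = {"audience": obj.audience, "behavior": obj.behavior,
             "condition": obj.condition, "degree": obj.degree}
    return [label for label, text in parts.items() if not text.strip()]

# The two medic objectives from Table 1, plus a deliberately unfinished draft.
obj_a = Objective(
    name="A",
    audience="the medic",
    behavior="identify the kind and severity of the burn",
    condition="given pictures of burns and a device with names and pictures of burns",
    degree="with 90% accuracy, to match a veteran medic",
    enables=["B"],  # A is an enabler: the medic cannot treat without the diagnosis
)
obj_b = Objective(
    name="B",
    audience="the medic",
    behavior="apply the proper bandage",
    condition="under fire from the enemy, on the battlefield",
    degree="so that the right soldiers return to the field and do not get infections",
)
draft = Objective(
    name="Draft",
    audience="the medic",
    behavior="triage multiple casualties",
)  # condition and degree left blank on purpose

for obj in (obj_a, obj_b, draft):
    gaps = missing_parts(obj)
    print(obj.name, "is complete" if not gaps else "is missing: " + ", ".join(gaps))
```

Run as written, the screen reports that A and B are complete and flags the unfinished draft, which is exactly the kind of quick check the ABCD mnemonic supports.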

Aided performance is a particularly rich aspect of objectives. It’s why the “C” in ABCD is my favorite component. Are we expecting our medics (or middle schoolers, or salespeople) to perform without assistance and in the midst of the action (battlefield or high-stakes test or sales presentation), or may they perform at their leisure, with help from documentation, job aids, or performance support? In medic example A above, the audience members are beginners who are expected to turn to support (their devices) for burn concept identification. On the battlefield, in B above, no such aid is feasible. Attention to condition helps the ID be more mindful about the context for performance.

Objectives work for learners

Marc, you urge us to add expectations to ABCD, expectations that assure links to work and results. I say that good ABCD objectives are themselves that statement of expectations. For example, my medic-in-training will look at those objectives and perceive the benefits to come from participation. In this little example, he sees that he can expect to become savvy about kinds of burns and treatments, with a little help from his mobile device; and then he will advance to serving without aids, in difficult and dangerous conditions for diagnosing and treating, where his performance must be fluent.

Perhaps a medic already possesses textbook knowledge about burns, but has never treated them under battlefield conditions. Substantive objectives, including matched tests and self-assessments, then help him navigate his online resources. In this case, the medic examines the outcomes associated with the burn program and then reaches for what he needs, in particular.

What was my objective?

Marc, I wrote this because I value objectives and I value you. When somebody as wise as you gives permission to skip them, I urge reconsideration of the matter.

I know we agree that crummy objectives are useless, even harmful. Some are too big and some are just silly. Long lists cause eyes to glaze over, defying credulity. You respond and say that we should de-emphasize objectives. I say flush the wicked ones down the toilet.

But you and I must not abandon the mantra that WE BEGIN WITH THE END IN MIND. Today, more technology, mobility, and independence increase our need for ends, for objectives, for the right objectives.

Now, as I think about it, I bet you proclaimed that you hate objectives to get us to remember how much we value them and how important it is for us to get them right.


Comments
I didn't get that sense at all from Marc's column. I think what Marc is railing against is the propensity of our discipline to present a clinically perfect three-part objective to the student right up front. I agree with Marc.

Want to set the stage for a boring and irrelevant experience? Frame it with an artifact of your design process.

I think you two are in agreement, really: expectations of who it's for (audience), what the participant should expect the program to help improve (behavior), the specific context of application (condition), and criteria that indicate success (degree / standard). I think the confusion and disagreement is more in HOW (form) than WHAT (application).

I think we'd agree that we design for our audience, right? You state above that you appreciate a clear three- or four-part objective up front to clarify application and relevance. But you've been conditioned to appreciate this form.

I don't advocate that IDs transfer the artifacts of their architecture directly to the learner. This can cause translation and motivation problems, real ones, when we try to shoehorn an instructional objective into something relevant.

Instructional objectives are CRITICAL to the architecture of a learning experience. But I assert that it's both lazy and detrimental to use the same form of clinical expression in an advance organizer targeted at participants.

Transformation of an artifact of architecture to a more relevant form (story, challenge, narrative) while still retaining clarity and brevity in the articulation of the essence of the objective should be the goal.

The presentation of three part perfect clinical objectives isn't the right way to set the stage for a relevant experience. Well... it might be if Instructional Design folk are your audience:)
I see xpconcept's point. The form of objectives should vary in friendliness. Yes, indeed, use anecdotes or stories to tell learners what a program will do for them, the kinds of problems it will help them solve. You bet. For instructional designers, the ABCDs, with emphasis on the Cs, will be particularly helpful.

For all objectives, the point is to keep ENDS in mind and to make sure those ends are attentive to what matters in the real world. Review of objectives should proclaim relevance to potential learners and organizational representatives. If they don't do that, then I'd pass on the programs and thank the objectives.
Ah, the debate begins! I don't want to de-emphasize objectives; they are great tools. I want to augment them with statements of real value and worth beyond specifically designed instructional performance.
Thank you, Allison. I too am forever on my soapbox about the importance of observable, measurable, performance objectives. This article is fantastic.
Karen Mahon
www.karenmahon.com
Paraphrased from someone I heard somewhere a long time ago, but it speaks strongly in support of objectives:

"If you don't know where you're going, you probably won't know when or if you get there."
Nothing better than a good debate about learning objectives, their quality, and how we use them with learners!

It sounds like there is agreement that objectives are critical to the development of good learning. It also sounds like there is agreement that we should not use them to describe to learners why they are participating in the learning. I would argue that one of the significant challenges we have is ensuring the learning objectives and performance expectations that we as learning professionals build for are what actually happens in the real world. Experience has shown there can be dissonance between the way the business describes the needs/requirements and what really happens in the work. We should not overlook the fact that sometimes our objectives are bad because we took a faulty work order, not because we wrote them poorly, and we should blame ourselves for that. We have a responsibility during the requirements phase of our work to ensure the performance outcomes we warrant delivery on will be used in the workplace. This is a piece of what Allison states as her objectives for responding to Marc, and our challenge is embedding a step in our process to test the objectives' validity in the work we have been asked to support.

Marc has proposed we amend the four-part objective paradigm with a fifth part that essentially deals with learner expectations. Maybe we should focus less on the “what’s in it for me” (WIIFM) from a learner perspective and more on our organizational culture, so that the value to self and the value to the organization are obvious to the learner. Remember the story of the NASA cleaning employee (who was reported to be mopping the floor) who answered JFK’s question about what he did at NASA by saying, “I’m helping to put a man on the moon.” Wouldn’t most CEOs love to hear a learner connect why they are doing the learning to the company’s vision/mission? Maybe that’s how we should introduce our course objectives to learners, and maybe that’s the end we keep in mind when we are writing our objectives.
Allison hit the mark earlier with the objective of the medic coming under fire. It's not just about the goal or ideal performance; it's also about what does or might hinder the objective. If you don't do your homework and understand the 'dark side' of the objective, it becomes irrelevant because it's either too easy or won't help in the real world.

If, however, you write an objective that, for example, states how to troubleshoot a computer problem with an angry, inexperienced computer user, now you've got something that's relevant to the learner and guides your instructional strategies.

So you need both optimal performance defined AND the drivers that are preventing it. Wait...isn't that a gap analysis? :-)
My suggestion to add an "E" (expectations) component to the ABCD model was designed to stimulate a discussion of how objectives should and should not be used, and how we really explain job and personal relevance to learners. The model itself, while useful to designers, is far less important than our overall intent. This should not be a debate on the ABCD model, much like a debate on the future of instructional design should not center on the ADDIE model.
Objectives should be transparent to the learner and completely visible to the instructional designer and our SMEs. If I were king I would never allow an instructional product to say, "At the end of this topic you will be able to: (fill in the blank)" but I must be clear about such things. But we must be flexible. As I see it, A and B are mandatory. For learners in office settings, I often skip "C." And "D" is optional unless you're dealing with compliance, safety, or related requirements.
I've come to the conclusion that lay people (non-instructional designers) have reactions to learning objectives that span from genuine puzzlement (as to why they are important) to actual aversion (because they feel like they're being locked into some accountability for the behavior stated in the learning objective). No matter how long I've been doing this, there are always those stakeholders who don't get it. For these folks/clients, I've learned to write the objectives carefully so that they specify the behavior/skill to be learned and how they can be observed and measured if possible. I design the activities around these objectives. But later, when I'm presenting before stakeholders & staff, I put the objectives in 'layman's terms.'
I posted this comment at Allison's blog where she references Marc's column and her response article above. I thought I'd duplicate the comment here:

My two cents… I always preferred to think of — and use — formal instructional or performance objectives primarily as a tool that gave the ID a “track to run on” in developing the content. If the content was for ILT (classroom or virtual), then it served a similar purpose for the instructor (no such person in the case of self-paced e-Learning). Such a “track” helps the ID to stay focused, to determine what the main content is, and what the supporting material is.

It is particularly helpful in this way in dealing with SMEs — and who hasn’t worked with Subject Matter Experts who a.) think they are great teachers when they really are not, and/or b.) think *everything* must be taught, including every tangent, every nook and cranny, every side path from the main “track we are running on”, as determined by the formal objectives?

As a proponent of focusing just as much if not more on performance support content and tools and less on formal training, I think formal objectives can aid the ID and instructor on this key decision point as well: What should we include in the training content, what do we really need to tell/show/etc., what do we want them to remember, to store in the precious real estate that is their brain… versus what is better provided in a job aid, checklist, EPSS, on-demand small module, or other performance support resource?

This last point is critical — we’ve all seen the hockey-stick graphs, the increasing-increases as to just how much info/data/etc. is in the world these days. This trend isn’t going away — and as a proponent of David Allen’s GTD method and principles and his mantra to not abuse your brain by using it in ways it wasn’t intended, I think it is vital that we really focus more and more on the “training” vs. “performance support” decision points. Properly using formal objectives can help in this regard — as an ID, keep asking yourself if the next fact, concept, process, procedure, or principle is really necessary to include in the formal training content in order to optimize the chances of the performance objective being achieved for the most learners… or should that bit be saved for a performance support resource instead? In short, as an ID or instructor, never think you can win against the Forgetting Curve — you will lose. Use your objectives to help you stay focused on these critical decision points as you design and instruct.

I of course agree with — apparently both — Marc and Allison that we shouldn’t disparage X (objectives, in this case) by noting the worst examples of X. That is like attacking PowerPoint because some people create bad presentations (you could still attack PowerPoint for other reasons, just not for this reason.)

That all said, I don’t think such formal objectives are very helpful for students/learners to read or have read to them in class. They are formal. They are boring — even the ones that are perfectly written qua objectives for the ID or instructor’s needs for a “track to run on.”

Instead, I think the relevant chunks of content should provide learners/students with both context statements and WIIFMs. The context statement is a brief statement that connects the upcoming topic/lesson/course with what they already know (or what they just learned in the case of a mid-course lesson). I like Marc’s list of “expectations” — but I think most of those are covered in good WIIFMs, which I think are vital for learning motivation, yadda yadda. For these reasons I was always pleased that for a decade at Element K we always made sure our content (whether e-Learning or print courseware) had both formal and informal objectives for each content chunk — geared toward the ID/instructor — and included context statements and WIIFMs — geared toward the learner/student.

So I guess I don’t like adding an “E” to the ABCD as Marc suggests, because that seems to imply that the learner/student will still be subjected to the whole thing. The boring ABCD part, and the more relevant E part. Just give them context statements and WIIFMs at each appropriate point, and save the formal objective for behind the scenes, for the ID during design, and for the Instructor during prep and instruction.

In Allison’s response to Marc at Learning Solutions, I like that she notes a learner can quickly use a list of objectives to determine relevance of a course/etc. for them. That seems fine to me — with the caveat that they aren’t the full formal objectives. I think shorter, informally written objectives will suffice for this benefit for the students who are interested. Again, we provided these for learners in the materials we created at Element K over the years — but we downplayed even informal objectives in favor of the context statements and WIIFMs for each section.

I guess that was more than two cents… hopefully it was worth reading.

Best,

Tom Stone
This debate is like trying to define how to use a hammer.

Objectives are an instructional designer’s tool to focus on what needs to be learned and how. For the student, it should be a means to establish expectations and arouse interest.

Suggested reading: “Good Beginnings: Leveraging the Strengths and Avoiding the Weaknesses of the e-Learning Medium,” Learning Solutions, September 24, 2007,

and Cj’s Instructional System Design Blog: “Learning Objectives – Rosetta Stone of ISD” (http://cjsisdblog.blogspot.com/)