

Learning Solutions Conference & Expo 2012 - Measurement Track
105 Measuring Strategic, Visible, and Costly Learning Programs
402 Why Training (Too Often) Doesn't Work and What You Can Do About It
702 Augmented Reality Learning and Assessments with the Microsoft Kinect
812 Three Critical Concepts for Assessing Asynchronous Discussion Questions
902 How to Design Scenario-based Assessments
913 The ANSWER to Rapid Analysis: A Successful Implementation

Measuring Strategic, Visible, and Costly Learning Programs

Wednesday, March 21, 2012 10:45 AM - 11:45 AM

The numbers run business – literally: they're the bottom line. Yet many learning professionals do not “speak business numbers” and therefore cannot justify their initiatives to those who approve funding. The stakes are even higher if a program is strategically important, highly visible, and/or costly. This session will share why measuring critical programs like leadership development, sales training, and onboarding is vital to program managers.

Session participants will learn the major elements of a measurement plan to successfully and comprehensively generate quantitative and qualitative metrics in a timely and practical manner using limited resources. You’ll get examples of tools such as a key performance indicator matrix, a communication plan, and smart-sheet evaluations, and you’ll see sample reports, including dashboards.

In this session, you will learn:

  • The importance of measuring strategic, visible, and costly programs
  • How to design a measurement plan that gathers timely, credible data
  • How to showcase sample reports, statements, dashboards, and scorecards
  • How to apply this type of data to your unique needs

Audience: Participants who need to understand qualitative and quantitative metrics in order to justify their learning initiatives.

Jeffrey Berk
Chief Operating Officer
Jeffrey Berk works closely with clients to optimize their talent development investments through measurement and analytics tools. Jeffrey, a CPA, is also an adjunct professor of management at Loyola University and is the author of the book Champions of Change: The Manager's Guide to Sustainable Process Improvement and co-author of the book Human Capital Analytics: Measuring and Improving Learning and Talent Impact.



Why Training (Too Often) Doesn't Work and What You Can Do About It

Wednesday, March 21, 2012 4:00 PM - 5:00 PM

There are a number of reasons why training doesn’t work. A variety of factors affect job performance, and when you don’t take all of them into consideration, any solution you create (such as training) is likely to have limited impact – at best. Too many eLearning professionals are unaware of the systems nature of training and of the factors that determine whether training works. They’re frustrated that it doesn’t work and that they’re held responsible when it doesn’t.

In this session, you will explore the “performance system,” the constellation of factors that impact job performance, including how to analyze a given performance problem (such as, “Customer service reps too often give out the wrong information”). This systems view of job performance goes beyond a one-size-fits-all approach (typically throw-training-at-it-and-see-if-it-sticks) to a more systematic process for assessing what you really need to achieve the desired results.

In this session, you will learn:

  • How to describe performance in terms of specific business needs
  • How to analyze the factors that affect individual performance
  • How to select data collection methods to understand job performance
  • How to map problems with specific factors to needed interventions

Audience: Intermediate to advanced.

Patti Shank
Learning Peaks
Patti Shank, the president of Learning Peaks, is an internationally known learning expert, researcher, author, and writer who has been named one of the 10 most influential people in eLearning internationally. She is the author, co-author, or editor of numerous books. Patti was the research director for The eLearning Guild and an award-winning contributing editor for Online Learning Magazine, and her articles are found in the ATD Science of Learning and Senior Leaders Blogs and elsewhere.



Augmented Reality Learning and Assessments with the Microsoft Kinect

Thursday, March 22, 2012 2:30 PM - 3:30 PM

Certain fields require access to physical hardware or products to successfully train or assess qualified users. This type of training and assessment is expensive and does not scale well, which limits the learners’ access and scope of learning.

This session will introduce an augmented reality tool that allows users to configure a learning or assessment environment and then interact with it using the Microsoft Kinect. You will also see how adding gamification techniques helps increase engagement, reinforces behavior, allows active participation, increases the emotional appeal of the learning, provides feedback, enables skills practice, and adds fun to learning. With technology similar to that demonstrated, participants will be able to create their own augmented reality tools to save money and scale their training or assessments for efficiency.

In this session, you will learn:

  • How you can use augmented reality to reduce costs of training and assessing learners
  • How to use augmented reality to mirror real-world product training
  • The game mechanics that augmented-reality training brings to education
  • What worked and what didn't work using the Microsoft Kinect in education

Audience: Those who want to apply augmented reality tools and approaches to their learning courses.

Curtis Burchett
Technical Learning Architect
Curtis Burchett is a technical learning architect manager at NetApp. Prior to moving into adult education, Curtis practiced law in Colorado and Missouri. Since 1998, he has lectured on technical material for Microsoft, Siebel, Documentum, and NetApp. Today Curtis manages end-to-end technical web-based and instructor-led materials for NetApp and develops custom cutting-edge training tools for both internal and external use.



Three Critical Concepts for Assessing Asynchronous Discussion Questions

Friday, March 23, 2012 8:30 AM - 9:30 AM

One of the most widely used instructional activities in online and blended learning environments is the asynchronous online discussion; however, it is also one of the least understood. People use online discussions more for lower-level thinking skills, such as remembering and understanding, than for higher-order skills, such as analyzing, evaluating, and creating. To benefit from this course discussion component, we must address a three-fold issue: how do we develop discussion questions, and how do we design and deliver assessments of students’ responses?

This session will provide solutions for each of these issues. Participants will get a practical guide to developing questions that encourage higher-order thinking skills, along with real-world examples. You’ll learn a framework for developing sound assessments of discussion questions, and you’ll see demonstrations of technological tools that support these processes and their applications.

In this session, you will learn:

  • How to design discussion questions that encourage higher-order thinking skills
  • The technologies that support the question-design process
  • The technologies that support the assessment design process
  • When you should assess discussion questions in terms of both summative and formative measures
  • How to design simple and effective assessment tools

Audience: Intermediate-level participants should have general knowledge regarding Bloom’s Taxonomy, the instructional design process, online course delivery and pedagogy, and LMS navigation.

Katharine Hixson
Assessment Specialist- Instructional Design
Pearson Learning Solutions - Custom Curriculum
As the Assessment Specialist for Pearson Learning Solutions – Custom Curriculum, Katharine Hixson identifies and compiles appropriate assessments based on all the relevant details of a project and makes recommendations regarding evaluation strategies. Katharine is a pre-doctoral Research Fellow with NASA, working at the Johnson Space Center each summer, where she quantifies, identifies, and ultimately mitigates issues related to human behavioral health and performance on long-duration missions. Katharine holds a B.S. degree in Journalism from Ohio University and an M.F.A. degree in Creative Writing, and she is currently pursuing a Ph.D. in Computing Technology in Education.



How to Design Scenario-based Assessments

Friday, March 23, 2012 9:45 AM - 10:45 AM

Scenario-based assessments can be an effective form of evaluation since they represent job-related application of knowledge and skills that can span time, space, people, tools, and various job features. When carefully designed, they can also provide instructors, instructional designers, and other stakeholders with valid units of performance analysis.

This case-study session will present suggested methods and demonstrate their application in a Web-based training program developed for the Florida Department of Transportation that utilizes micro-scenarios, interactive simulations of roadways, task-centered quizzes, and 3-D characters as virtual mentors to guide learners.

In this session, you will learn:

  • How to create assessment scenarios, beginning with a task analysis
  • How to identify the performance claims to be made based on target knowledge and skills
  • How to create the story and characters
  • How to identify the sequence of events that provides sufficient guidance to users while increasing validity
  • How to use basic tools to manage the process

Audience: Novice-to-intermediate participants.

Iskandaria Masduki
Instructional Design Coordinator
Florida State University
Iskandaria Masduki is a Research Associate and Instructional Design Coordinator at the Center for Information Management and Educational Services, Florida State University. She’s the lead instructional designer on learning projects involving Florida state agencies and has taught Flash animation, interactive media, and instructional design. Her diverse work experience includes Web design, scriptwriting, broadcast journalism, and marketing communications. Iskandaria is a doctoral candidate in Instructional Systems at FSU and is a big fan of “Angry Birds” and zombies.



The ANSWER to Rapid Analysis: A Successful Implementation

Friday, March 23, 2012 9:45 AM - 10:45 AM

The rapid pace of business often makes the traditional needs assessment impractical. Companies are sometimes unwilling, or unable, to spend excessive hours and resources on comprehensive needs studies as part of their change-management initiatives. Additionally, the size and scope of many training projects don’t necessitate a traditional, drawn-out needs analysis. Unfortunately, this dilemma has left the corporate learning landscape littered with ineffective, directionless training programs.

Session participants will learn about the ANSWER Analysis, a needs-assessment model that can keep pace with both business demand and rapid development technologies by avoiding the time and cost hassles associated with traditional needs assessments. You’ll learn how, using ANSWER, learning professionals can expeditiously analyze complex employee and business development situations and create targeted solutions.

In this session, you will learn:

  • How to facilitate a rapid analysis of organizational needs
  • How to customize an analysis methodology to suit your work context
  • How to produce a summary report of the analysis
  • How to introduce an analysis methodology to your organization (one that can easily go viral)
  • How to leverage a mobile tool to accomplish a needs analysis and produce a blueprint for moving forward

Audience: Intermediate participants should be familiar with the ADDIE model for instructional design and understand the basic concepts of traditional needs analysis, including job/task analysis and performance analysis.

Jo Anna Hatcher
Sales Force Effectiveness Manager
Rain for Rent
Jo Anna Hatcher is the sales force effectiveness manager at Rain for Rent and she also volunteers time as an instructional designer for the American Foundation for the Blind. She has 11 years of experience in workplace learning and performance and is passionate about introducing innovative structures and strategy to the learning design and development process. She has experience executing international learning programs and has successfully implemented employee performance-improvement programs in several different industries. Additionally, Jo Anna is an experienced course facilitator. Jo Anna holds an MS degree in applied technology and performance improvement and is a certified professional in learning and performance.
Barbara Matthews
Director of Learning Solutions
Allen Communication Learning Services Inc.
Barbara Matthews has always been passionate about learning. Originally an elementary school teacher, Barbara has spent the last 15 years in the field of learning and development. While at Allen Communication Learning Services, she has had the opportunity to consult on learning solutions with many of the world’s best L&D organizations. Barbara has held such roles as Senior Design Consultant, Director of Project Management, and Director of Learning Solutions. Barbara holds a B.S. degree in Elementary Education and an M.S. degree in Instructional Technology.