
Be an Advocate for Dumping the Information Dump

by Ann Iverson, Instructional Designer

Angel Green, senior instructional strategist at Allen Interactions, hosted a webinar on Design Thinking for the Instructional Designer that was both informative and inspirational. In it, she stressed the importance of moving beyond a formulaic approach into designing instructional products through creative and empathetic endeavors. An essential success factor for these instructional events is to focus on performance, minimizing content that learners can access easily outside of the learning experience.

For most of my career as an instructional designer, I’ve been an advocate for putting an end to the information dump that many clients believe to be effective. I’ve put myself in the learner’s shoes, dreading the idea of trudging through screens overloaded with information. Over the years, I’ve tried to help decision makers and Subject Matter Experts (SMEs) understand the importance of minimizing content they consider to be “need-to-know.” While their motivation for holding on tight varies, our goal as instructional designers is to try to move the needle toward the design principles that make for great e-learning.

There are a few common questions that raise a red flag for me, highlighting some of the best needle-moving opportunities with stakeholders. When they ask these questions, I realize there’s a chance to advocate for dumping the information dump. Maybe you recognize these questions as opportunities too:

1. Where are the learning objectives?

Starting a course with bulleted learning objectives was once the standard. When learners see those lists, they get an immediate impression that the course is heavy on content and light on interactivity. Try starting the course with objectives that challenge learners right away. For example, for a fire safety course…

Instead of this:

Upon completion of this course, learners should be able to:

  • Understand how grease fires ignite
  • Recognize a grease fire
  • Identify the steps for putting out a grease fire
  • Know the consequences of using a variety of materials for putting out a grease fire

Try this:

Quick! There’s a grease fire in your kitchen! Grab the right items to put the fire out now.

Defining a “mission objective” for learners upfront gives them an engaging and compelling reason to find the information they need to make the best decisions.

2. Where are the page numbers?

Work with stakeholders to clarify the difference between e-learning and e-reading. Page numbers are for textbooks, not virtual learning activities. Think about it: you never see page numbers in online games. The path is often nonlinear, so it can’t be measured in screens. The page number is a classic example of setting learners up to believe they’re making progress by clicking through screens of content. But when you immerse learners in a rich, engaging environment, page numbers become irrelevant. Learners are too focused on the activity to care what page they’re on.

3. Where are all the documents we sent?

When designing performance-based courses, much of the same content found in an information dump course is still there; it’s just stored in a different place, such as a toolkit or coaching resource, so learners can access it as needed. Rather than forcing learners to read the information, the design pulls them toward the content.

For example, two employees are putting out grease fires. Which one is in compliance?

Learners also view content in feedback after making decisions. For example, if I choose water to put out the grease fire, in addition to seeing the graphic of the fire spread, I might see feedback like this:

Oh no! Pouring water on the fire can cause the oil to splash and spread the fire. The vaporizing water can also carry grease particles in it, spreading the fire. Quick! Make another choice!

Contextual feedback makes a lasting impression. Learners want to know whether they’re right or wrong, and if they’re wrong, they’re curious to find out why, which focuses their attention on the feedback.

4. Where are the assessments?

Traditional e-learning assessments ask learners to read some content and then choose one correct answer from three or four options to show knowledge mastery. Typical assessment questions are multiple choice, true/false, matching, or fill-in-the-blank items. In performance-based design, the assessments are built into the action. Learners are given a challenge that requires them to apply new skills to solve a problem. For example, when learners put out a grease fire correctly, they view feedback that reinforces their actions and lets them know they made the right decisions. They may get credit, points, or even certification for demonstrating the steps correctly. Completing the challenge correctly then makes a more formal assessment unnecessary.

Help your stakeholders and SMEs avoid the pitfalls of the information dump. Guide them to create learning experiences that engage learners’ innate curiosity so that they seek out the content they need to be successful. If you do, you’ll be helping to move the needle toward great e-learning.

 
About Author

Ann Iverson

Ann is an instructional designer for Allen Interactions who has consulted for many years with a wide variety of clients and industries on all kinds of projects. She learns best by making mistakes!
