by Richard Sites, vice president - client services | @rhillsites
Two weeks ago, I had the opportunity to give a presentation on Creating Value-Driven e-Learning at the Learning Solutions conference in Orlando. As a strategic partner to my clients, I feel very strongly about this topic.
For every project you manage or work on, the mission is likely the same: accomplish the goals in the time allotted while achieving a standard measure of quality. For each of these efforts, there are plenty of documented strategies for success. There are theories and tools for setting measurable goals, especially in the world of instructional design. You can read books, and even become certified, in project and time management. Quality is the ability to meet accepted industry standards; it may be measured by passing inspection or, in our case, by producing instructionally sound learning.
But what about the value of the work? Value is the benefit our clients (whether internal or external) perceive from our efforts. Value is intangible, but not unmanageable.
Have you ever worked on a project that seemed to be progressing as planned and running flawlessly, but still didn’t feel as if it was hitting the mark? Likely you experienced this feeling because you were missing the value target for the project.
As designers, we need to ask questions and seek answers on how we can meet this value target. Only by acknowledging and addressing the personal expectations and organizational obligations of our clients can we create value-driven projects.
In my experience, four main strategies have been particularly helpful in creating value-driven e-learning projects. They are:
- Define Review Expectations
- Maintain Control of Content
- Limit Revisions, Not Iterations
- Reduce the Risk of Inaction
In today’s post, I will explain the first of these strategies and will discuss the others in subsequent posts.
NOTE: My strategies for creating value are based on the use of an iterative process. However, you can take advantage of them with any type of instructional or training development process.
VALUE-DRIVEN STRATEGY 1: Define Review Expectations
As course designers and developers, we often assume that by simply asking a person to review an instructional product, (1) they will provide us with meaningful and useful input and (2) they understand the benefit of conducting the review. Unfortunately, nothing could be further from the truth.
When it comes time for review – whether of a prototype, an alpha version of an e-learning course, or a facilitator guide – we generally ask our review team to complete a task within a certain timeframe in order to achieve a particular objective. However, without knowing those expectations, our review team cannot provide an effective evaluation of the training.
Setting clear and concise expectations for a review is not rocket science, but it is often overlooked in the fast-paced e-learning development projects so many of us are familiar with. When reviews are not managed properly, project leads can get barraged with frustrated comments from reviewers or even calls from the boss. These simple missteps can go a long way toward making your project feel like it was not a success.
To avoid this situation:
- Provide a review strategy to the project team and reviewers – Let everyone know the intent of this review, what you hope to accomplish, and what specifically you are asking of them during this review.
- Create a strategy for collecting comments from reviewers – If the effort to provide input during the review is too difficult or time-consuming, reviewers will likely only provide general, broad-sweeping comments that may not be beneficial. Provide tools, like spreadsheets or databases, and strategies to make commenting easier for the review team.
- Discuss the plan for aggregating and prioritizing the collected comments – If reviewers take the time to carefully evaluate the e-learning course and provide comments, they will certainly expect their comments to be recognized. Explain to all reviewers the process you will use to combine their comments into a single iteration document and to prioritize the ones that will form the basis for the next iteration; otherwise, some reviewers may wonder why you didn’t make the change(s) they requested. (A simple sketch of this kind of aggregation follows this list.)
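If your review team works in spreadsheets, even the mechanical part of this step can be made painless. The following is a minimal sketch, not a prescribed tool: it assumes each reviewer returns a CSV file matching a hypothetical naming pattern (review_*.csv) with hypothetical columns reviewer, screen, comment, and priority, and it merges them into a single prioritized iteration document.

```python
import csv
import glob

# Hypothetical priority scale assumed for this sketch:
# "1" = must fix, "2" = should fix, "3" = nice to have
PRIORITY_ORDER = {"1": 0, "2": 1, "3": 2}

def aggregate_comments(pattern="review_*.csv", output="iteration_document.csv"):
    """Merge every reviewer's comment file into one prioritized iteration document."""
    rows = []
    for path in glob.glob(pattern):
        with open(path, newline="", encoding="utf-8") as f:
            rows.extend(csv.DictReader(f))

    # Must-fix items first, then grouped by screen, so the next iteration
    # starts from the highest-priority changes.
    rows.sort(key=lambda r: (PRIORITY_ORDER.get(r.get("priority", "3"), 3),
                             r.get("screen", "")))

    with open(output, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["priority", "screen", "reviewer", "comment"])
        writer.writeheader()
        for row in rows:
            writer.writerow({key: row.get(key, "") for key in writer.fieldnames})

if __name__ == "__main__":
    aggregate_comments()
```

The script itself is beside the point; what creates value is the shared agreement it encodes – every reviewer knows which fields to fill in and how their input will be ranked before the next iteration begins.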
Managing these reviews to avoid missteps can go far in creating a sense of value – a sense that this project and the training built from it are a huge success. Good luck, and we’ll pick this up again in the next post!