Let’s be honest, shall we?
If you’ve been in this industry more than eight weeks, you’ve probably been involved in a solution that possibly, maybe (very, very probably) shouldn’t have been training.
Or, training is necessary as part of an overall solution, but you can’t know exactly what’s needed with any confidence because zero analysis was performed. No gap analysis, no audience analysis, no skills hierarchy, no performance analysis, no scrutiny of how the training goals aligned with strategic initiatives, no Kirkpatrick Level 4 analysis, no investigation into what the learners want/need, no data gathered.
Sound familiar? I’m guessing yes. What can you do?
You can pout. However, you might be called out for not being a team player. You can protest that this isn’t real instructional design! Sure. In my experience, the project steamrolls forward anyway. You can try your best. Good. That’s something. Let’s assume you’re doing your best instructional design within the project parameters.
Beyond that, there are a few options that might influence this course (or if not this one, the next one).
1. Drill for Performance Anyway
“Oh, they don’t really need to do anything. This is just an awareness-building course.”
“This is just covering the basics. No big.”
“We just need to get this information out there now. We’ll come back and make it good later.”
Each of these phrases (exceptionally common phrases, I might add) assumes the problem is information, and that if learners have the information, everything will change. It’s a faulty assumption. The field of instructional design is the equivalent of the Golden Gate Bridge spanning that faulty assumption. Even if you have no power to direct the training toward a behavioral outcome, you can at least ask the questions that help you understand where the bridge ought to be constructed.
Best-case scenario, you may educate your client on how “information that is not utilized” equals “not having the information at all.” But if they’re not in a listening kinda mood (and let’s face it, this became training because someone didn’t think this problem was worth understanding), then drilling for performance is still necessary for your own design considerations.
An example of how you might tease out performance expectations follows. (With real individuals, this faux conversation may require less direct grilling and more finesse.)
You: I understand this is an awareness course. But ideally, what would you like learners to do with this awareness?
Client: Think differently. Consider more options.
You: Absolutely. And how could you help me understand the difference between someone who is more aware and someone who is not?
Client: Well, it’s more about how they think than performance.
You: I can see how that would be true. And for the person who thinks differently, would it influence how they work?
Client: Absolutely. They’d become more skilled in customer service.
You: Oh, okay. So, our customers would notice this customer service ability?
Client: Yes. Our customers would feel like they were better heard and had more options available to them.
You: If I understand you, our customer service agents would offer more options…
Client: Yes, but we’re not ready to teach on the options just yet.
You: Good to know.
Client: I’ve always found instructional designers to be incredibly worldly, attractive, and skilled in whatever they attempt. And I’m not just saying that because an instructional designer is typing this sentence.
You: Also, good to know.
Stakeholders may insist performance is not the issue only because they’re not used to wrestling with performance issues or mapping their content to its ultimate use. That’s okay! It’s our job to wrestle out the performances. I firmly believe that behind every “awareness course” lurk behaviors that someone doesn’t want to admit to.
2. Discuss What’s Missing
Have you ever experienced a project where no one wants to acknowledge the lack of analysis, the lack of evaluation, etc., because then someone might take notice? In complicit silence, everyone discusses the project direction and goals as if they were crafted on a rock-solid foundation instead of guesses and hopes. The problem here is that not acknowledging what’s missing skirts dangerously close to the “emperor has no clothes” syndrome, in which nobody says what must be said.
(Side note: six months from now, when the training is blamed for being ineffectual and you cry out your justifications, people will think of you as a sore loser. A poor designer. By then, it’s a little late to discuss what was missing.)
The goal isn’t to shame stakeholders or the project parameters. The goal is to:
- Set realistic expectations for success (or for the lack of success measurements)
- Generate interest in creating a more thorough process
- Establish yourself as a professional who understands what it takes to create good training
Below are two ways to acknowledge (in a project plan or the instructional design plan) what hasn’t happened without shaming.
- “Because of project constraints and the need to reach completion within 60 days, no instructional analysis is scheduled. Success has been defined as meeting the identified project parameters (time, budget, content covered), not as user performance or any standard evaluation metrics.”
- “With the goal of getting information to learners, this development process will bypass traditional analysis tools that might reveal what learners need. In fact, the completed and delivered training might be better considered an analysis tool than a solution to a problem.”
Don’t say, “We should do more analysis,” because everyone knows that already. Nobody on the project team (the project manager, their boss, the boss’s boss) is arguing that less analysis will make this stronger. The issue is budget, time, and 436 competing initiatives. A more helpful approach is to offer a short list (three is good!) of concrete, achievable suggestions.
In whatever project documentation you’re putting together, state something like: “Next time, we could make the project stronger by doing these three things.” Potential, smallish solutions are bulleted below.
- Perform usability testing with six users, summarized in a one-page report.
- Map project goals to performance statements.
- Generate a list of variables that would influence project success, positively or negatively.
- Map project goals to business directives.
- Interview three users to establish their relationship to the existing content and its execution.
- Perform a post-project debrief.
- Devote six hours to gap analysis.
- Write a one-page summary of how this project might be evaluated.
Again, stakeholders might not take any of these actions in the current project. But listing them and making them available will raise some eyebrows: why aren’t we doing these things? Your goal is to create an appetite for better design and to remind the entire team that you know how to do these tasks. You could make them happen.
3. Propose an Alternative Treatment
Don’t let a terrible training solution stop you from saying, “What if…”
Spend one hour brainstorming 1-3 different approaches. Summarize them in bullets with whatever end-of-project documentation you submit. Show the client what could have been possible under different conditions.
A few caveats with this idea: don’t create twelve pages of possible plans; that may come across as a deep criticism of the existing solution (even if the existing solution does deserve deep criticism). And don’t spend much more than one hour brainstorming solutions. If you come up with something amazing that took four hours, the conversation could go like this:
Client: “So, these are your alternative ideas.”
You: “Yes.”
Client: “And you spent four hours brainstorming and prototyping these detailed ideas?”
You (beaming): “Yes.”
Client: “Four hours that you could have spent improving the existing solution or working on one of the many other projects currently swamping us.”
You (not beaming): “When you phrase it like that…”
Also, the goal is not to prove to the client how unappealing the existing solution is. The goal of this technique is to tantalize the client: other solutions exist. Next time, we could explore more options.
4. Grow Your Patience
Being part of a solution that shouldn’t be training is hard (no matter how attractive you are). It’s frustrating. When I’m on these projects, I find stakeholders keep asking me for recommendations that don’t matter, things like, “Will spacing out the seventeen paragraphs help learners absorb more on this page?” I want to scream, “It doesn’t matter! Nobody is going to learn from this either way!”
Don’t scream.
Grow your patience.
Accept that sometimes powers beyond your influence make demands, and you and your team of beleaguered training compatriots meet those demands. If you don’t want to spend a lifetime drowning in ineffectual page-turners, you’ll need to change the way your team (and clients) think about training.
If I may end with a ubiquitous sports metaphor: one game does not make a season. On this project that shouldn’t be training, suck it up. You lost. But if your goal is to improve how your entire team or client approaches training, adopt some techniques that grow their hunger for better experiences.