Train the Trainer Program at a Tech Company

This project focused on upskilling the training team at a global technology company. This example covers the design, development, and implementation phases of ADDIE. The end product was a series of weekly training webinars in which the team learned and applied principles of training design so that they could become more effective at their jobs. This was a large project that ran for over half a year, so the information below highlights only a few weeks of the program for clarity and readability. Read on for more details.

  • NovaByte (pseudonym) is a SaaS company founded in the mid-2000s. It creates B2B software and has a global training team dedicated to increasing the productivity of existing employees. This team had evolved organically over time as NovaByte underwent rapid growth in the previous decade and transitioned from a private company to a publicly traded one.

    One of the pains created by this rapid organic growth was that the training team was made up almost entirely of people who had been excellent at their jobs but were not trainers by practice. The team was full of Subject Matter Experts (SMEs), not trainers. At the direction of the team’s vice president, it was decided that the training team needed a consultant who could upskill the team by teaching them principles and best practices of adult learning. This led to a week-long workshop that was very well received by the team. However, almost a year later, the desired change had not materialized. That led to this train-the-trainer program.

  • In the previous intervention, the week-long workshop, not much thought was given to the need for ongoing reinforcement. This is the “magic bullet” problem you often see in training interventions, where the magical thinking goes: “if we just take them off the job for a training session and tell them the right thing, they’ll do that right thing going forward”.

    Unfortunately, in this case, as often happens, the employees returned to their regular jobs and the same environment that didn’t incentivize or support a change in behavior. Analysis of the situation revealed that the desired change didn’t happen because of two problems:

    1. Employees did not get to practice what they learned after the workshop.

    2. Employees received no feedback, whether positive or negative, after implementing or practicing the new skills they learned.

    These are the problems any new intervention would have to solve.

    In her book Design for How People Learn, Instructional Designer Julie Dirksen argues that to determine if something is a skill, you only have to ask “do you reasonably expect that someone needs practice to become proficient in this?” If the answer is yes, then it is a skill (if no, it’s knowledge, and telling them will solve it). If you are facing a skill problem, practice should be built into the intervention; otherwise, you won’t solve the problem you are seeing. You can’t tell someone how to be a good trainer in a day, just like you can’t tell someone how to be a good computer programmer in a day. Both of these things take practice.

    In NovaByte’s case, the training team’s problem was a skills gap rather than a knowledge gap, because training takes practice. The previous workshop intervention was designed to solve a knowledge gap, i.e., “we will just tell them what good training looks like and they will do it”. Analysis confirmed this: most employees on the team knew the right thing to do; they just didn’t have the breathing room to practice and try it out.

    The lesson learned from the issues with the previous workshop was that any new intervention would need to include practice to support the change, and feedback to help along the way as the team developed this training skill.

  • Learning from the analysis and the failures of the previous workshop intervention, a solution was created that taught employees training best practices while giving them opportunities to practice and regular feedback.

    This section will take you through the overall design of the program and the theories that influenced it. The next section will take you through an example two-week stretch of the program so you can see it come to life, along with a discussion of how it was developed.

    We agreed with the manager of the team that the intervention would best be delivered as an ongoing weekly workshop for the team. This 45-minute workshop would include:

    • 30 minutes of engaging and interactive instructor-led training (ILT) from an expert on rotating topics in training best practices.

    • 15 minutes for questions and a review of how applying the principles from the previous week’s learning had gone.

    This design was chosen to allow time for practice and regular feedback. By giving employees a week to apply what they learned in the course of their regular work, they could apply the principles in a context relevant to them and their problems. This aligns with Knowles’s assumption of adult learning that adults learn best when they can apply new material to problems relevant to them, in a way that lets them be self-directed in their learning. Additionally, by setting aside 15 minutes a week to review and recap what worked well, we provided an ongoing incentive, in the form of positive or negative feedback, for applying what they learned.

    Since this was a long-running program, the subjects varied from week to week (there is a lot of potential ground to cover in the training world). Bloom’s Taxonomy was used to create learning goals for each week. The overall goal of the program was to increase application of what was learned, so that it helped with learners’ real problems and use cases. Typically, this meant the learning objectives fell on levels 2 and 3 of the Taxonomy, so learning goals often started with “Describe”, “Explain”, “Apply”, or “Implement”. You’ll see a specific example of this in the next section.

    The overall design was created using Gagne’s 9 Events of Instruction. When I highlight an example in the next section, I will break down how this was used event by event.

  • Pretend you are in a Quentin Tarantino or Christopher Nolan film: I will start in medias res (in the middle of the action) and highlight a specific week in the middle of the program.

    Last week, our workshop covered the topic of cognitive load, introducing and practicing the science of how to avoid overwhelming the learner’s innate capacity when designing presentations. That concluded the unit on presentation design; this new week focuses on the all-important task of getting the learner’s attention. This was the deck used in the presentation.

    The learning goal for this session was “by the end of this presentation, you will understand a key lesson of behavioral economics and be able to apply it to secure the attention of your learners in your presentations”. Gagne’s 9 Events of Instruction were used to design this lesson. Here is an event-by-event breakdown:

    1. Gain Attention: A brief animated video, created with Vyond, was used to catch the audience’s attention. The animation was novel for the audience and created a strong initial reaction. Additionally, the video posed a series of questions around a puzzle about human decision making and involved the learner, further securing their attention.

    2. Inform Learners of the Objectives: The audience was told after the video and puzzle that they would learn the answer to the question posed in the video and how this lesson could help them get the attention of their learners.

    3. Stimulate Recall of Prior Learning: In this lesson, there was a pause to discuss with the audience how they had captured attention before and how well that had gone.

    4. Present Content: Information was shared by breaking down the video and solving the puzzle: in this case, why the same amount of money doesn’t always cause us to act the same way.

    5. Provide Learning Guidance: Visual images, metaphors, and analogies were used to make it easier for the learner to follow along.

    6. Practice: Learners were given an actionable key takeaway and recommendations for how to talk to the elephant part of the human brain. They were encouraged to apply this over the next week.

    7. Provide Feedback: In the next week’s session, 15 minutes were used to discuss how applying the concept was going and what was working well. Peer evaluation was used to give feedback on what the sharer had done.

    8. Assess Performance: In a future presentation by the learner, months down the line, we used a standard rubric to evaluate the presenter on specific attributes of their presentation, including how well they secured the attention of the audience using what we had learned.

    9. Enhance Retention and Transfer: Concepts shared in the past pop up in future presentations. You can see an example of this at the beginning of this presentation, where a reference to Cognitive Load and how it relates to toilet paper appears early on. Overall, throughout this program, you get the feeling you’re in the Marvel Cinematic Universe (MCU): everyone knows everyone, and everything is related in some way.

    For development, a number of tools were used. The video was developed using Vyond and Camtasia. Images were created using SnagIt.

  • To evaluate this intervention, we used a rubric to measure change over time after the learning intervention.

    Employees regularly gave presentations as part of their jobs, so, to track changes over time, a standard rubric and peer-evaluation survey were used. The survey was specific and included questions like “did the speaker secure your attention?” or “were there no more than 6 items on the screen at one time?”. We were able to see employees improve over time and provided them with specific feedback, for example: “60% of the audience said that you grabbed their attention; what could you do to increase that number next time?”
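    As an illustration, the kind of aggregation behind feedback like “60% of the audience said that you grabbed their attention” can be sketched in a few lines of Python. This is a minimal sketch; the question names and response data below are hypothetical, not taken from the actual survey.

    ```python
    # Hypothetical sketch: turn yes/no peer-evaluation responses into
    # per-question percentages for feedback to the presenter.
    def rubric_scores(responses):
        """Given a list of dicts mapping rubric question -> bool,
        return the percentage of 'yes' answers per question."""
        totals = {}
        for response in responses:
            for question, answer in response.items():
                yes, count = totals.get(question, (0, 0))
                totals[question] = (yes + (1 if answer else 0), count + 1)
        return {q: round(100 * yes / count) for q, (yes, count) in totals.items()}

    # Example: five peer evaluations of one presentation (illustrative data)
    responses = [
        {"secured attention": True,  "max 6 items per slide": True},
        {"secured attention": True,  "max 6 items per slide": False},
        {"secured attention": False, "max 6 items per slide": True},
        {"secured attention": True,  "max 6 items per slide": True},
        {"secured attention": False, "max 6 items per slide": True},
    ]
    print(rubric_scores(responses))
    # {'secured attention': 60, 'max 6 items per slide': 80}
    ```

    Tracking these percentages presentation by presentation is what makes the change over time visible, even without a formal pre-test.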

    Unfortunately, we did not run a pre-test to establish a baseline. If I were to do this project again, that is the most obvious thing I would change. However, the combination of a standard evaluation rubric and a series of post-evaluations over time, with feedback given to the employees, was effective in tracking and demonstrating the desired change in the audience.

    Reactions to the program were positive overall, and learners surveyed said they enjoyed it.

  • Gagne’s 9 Events of Instruction, Kirkpatrick’s 4 Level Model of Evaluation, Cognitive Load Theory, Knowles’s Core Principles of Adult Learning, Mayer’s Principles of Multimedia Learning, Bloom’s Taxonomy, Vyond, Camtasia, SnagIt.

Gagne’s 9 Events of Instruction were used extensively in the design phase of this project. See the Example of Design and Development section for details.

Image Credit: MissyKrupp at SlideShare.net

Learners’ attention can be secured through puzzles or thought-provoking questions. In one example from this program, the presentation opened with a video and a behavioral economics puzzle to grab the audience’s attention.

You can see the video here (only the first 3 minutes were used for this workshop)

“This is like that other thing you know” is a powerful way to quickly teach challenging new topics to an audience. This aligns with Gagne’s fifth event: provide learning guidance. In one example from this program, a slogan from Charmin toilet paper, which is much better known to the target audience, was compared to Cognitive Load Theory to make the concept easier to understand and apply.