Business and training needs change quickly. If you’re thinking about how to create a training program, incorporating a pilot can be a strategic way to fine-tune your program so it addresses your current business and skill development needs.
Conducting a pilot gives you opportunities to gather valuable feedback, identify areas for improvement, and enhance the impact of your learning design.
First, what is a pilot program in business?
A pilot of a learning program is a small-scale trial run of that program involving a smaller group of participants. Its purpose is to evaluate the program’s design, gather feedback on learner engagement, identify areas for improvement, and make necessary adjustments before the program is available to its full audience.
The pilot phase allows learning designers and business leaders to evaluate the program’s feasibility and impact, address any issues or challenges, and make informed decisions about scaling up the program for a broader audience.
Why should we pilot when developing a training program for employees?
Here are a few things a pilot might allow you to do:
- Understand which delivery methods meet the learners’ needs and expectations when developing a training program for employees
- Find out which learner engagement strategies had the greatest impact… and which didn’t
- Test new tools and platforms
- Gain a better understanding of the administrative resources you’ll need
- Gain insight into unexpected challenges and obstacles learners encounter
Ok, let’s say you’ve decided to run a pilot when creating a training program for employees. How do you run a pilot, who do you run it with, and what exactly will you test? Keep reading as we explore what factors to consider.
Clarify outcomes for your pilot
Clarifying the outcomes of a pilot gives you a clear approach for assessing its successes and challenges. It also helps you identify areas for improvement and refine the learning materials and delivery methods based on feedback and data-driven insights gathered during the pilot.
Clarifying outcomes also enhances communication and alignment among stakeholders, ensuring that everyone involved understands the purpose and expectations of the pilot phase.
For example, a retail training program was originally run as a four-week self-paced, asynchronous program. Learners were handed a guidebook with directions to online lessons they needed to take in an LMS, as well as on-the-job trainings where they needed to get signed off by a manager or other observer. Completion rates were low: learners would start strong with the first few online lessons, then slowly drop off as they lost momentum over the four weeks. The team wanted to try a different rollout strategy to see if it would improve completion rates and keep learners motivated.
Consequently, the outcomes of the pilot were to:
- Increase the completion rates of learners (i.e., would a new rollout strategy affect completion rates?)
- Keep learners more engaged via discussions, short assignments, on-the-job trainings, and virtual live sessions
- Identify any challenges and issues that might emerge when coordinating on-the-job training and virtual live sessions (i.e., how would this new approach need to be resourced?)
The pilot was designed as a shorter two-week experience, with learners in different locations placed into online cohort-based learning, where all online lessons they needed were lined up on a daily schedule. The team also provided checklists for learners to prep for their on-the-job trainings, communicated a calendar of virtual live sessions, and added discussions and “scavenger hunt” assignments so that learners could get more familiar with the layout of their retail store as well as see examples from other stores.
Here are a few questions to consider when thinking about the outcomes of your pilot:
- What are the overall goals of your learning program?
  - Your pilot could help you test whether learners are meeting those goals at a smaller scale, helping you determine if your approach is working.
- Could you test different rollout strategies or tools for delivering the learning experience to see what would help boost completion rates for a program?
  - For example, if you have both synchronous and asynchronous elements in a learning program, your pilot can help you find the balance of these elements that best meets your learners’ needs and contributes to a successful outcome.
- Are you helping the admin team run through the program to spot any issues before it rolls out to a larger audience?
- Could you try a new activity type that learners in your organization haven’t encountered before, to see how it helps them apply the learning to their everyday work?
Include the right audience that represents the actual learner population
That is, your audience should not be the learning team or subject matter experts (SMEs).
You can, of course, ask your learning team stakeholders and SMEs to review your program and give you feedback, but it’s vital that your pilot audience mirrors the actual learner population to ensure the validity of the pilot.
By including representative participants, you can gather more accurate feedback and insights into how the learning program will perform in real-world scenarios. This helps you identify potential issues early on, tailor the program to meet diverse learner needs, and ultimately improve the overall effectiveness of the experience for the target audience.
Here are a few questions to consider when it comes to how to create a training program for the right audience:
- How big will your pilot be? How many participants?
- Is there a variety of participants (different regions, different business units, etc.) that are representative of your learner audience?
- Are there content and activities that are relevant to the variety of learners who will be participating?
Consider the context of your learners
If your learners are in a particularly busy season, or are already partway through a learning program that a pilot would interrupt, they might find it jarring or confusing to suddenly be asked to participate in an experience they don’t have the mental bandwidth for.
For example, an onboarding program was originally run as an eight-week instructor-led training (ILT) program, with learners attending either virtually or in person depending on location. The team wanted to experiment with making a portion of the program self-paced. Since a group of new hires was already going through the ILT program, the final week was run as a partially asynchronous pilot, with morning and afternoon huddles with instructors.
However, the new hires found it difficult to adjust from the mostly instructor-led experience they had been in to moving through the bulk of the week’s content at their own pace, meeting with their instructors only at the beginning and middle of the day.
The shift away from the daily schedule they had established over the previous seven weeks was too abrupt, and their feedback focused more on how jarring that shift was than on the pilot experience itself.
Here are a few things to consider when developing a training program for employees in the context of your learners:
- Time Commitment: Consider the time constraints and availability of participants when designing the pilot program. Plan the duration of the pilot phase, the scheduling of sessions or modules, and flexibility options to accommodate your learners’ schedules and allow for the most participation possible.
- Learning Environment: Consider the physical and digital learning environments where participants will engage with the program. Think about factors such as accessibility, technology infrastructure, available resources, and potential distractions learners will encounter.
- Support and Resources: Provide adequate support, guidance, and resources to learners throughout the pilot program. Offer access to instructors, mentors, helpdesk services, and online discussion forums to facilitate learning, address questions, and resolve issues promptly.
- Support for Instructors: Give instructors, facilitators, and moderators an orientation to the pilot program before launching it to learners. This helps them understand what to expect during this phase of the program and allows them to tell you what worked and what didn’t during the pilot.
Communicate the pilot experience and get feedback
Communicating to participants that they are in a pilot of a learning program will help you manage expectations about the completeness of your program. It also helps participants understand that the program is in a testing phase, where their feedback and insights are crucial for refining and improving the learning experience.
Consider how you’ll get feedback from learners.
For example, if your program has learners participating in a digital environment, here are a few ways to get feedback:
- Surveys and Questionnaires: Create online surveys or questionnaires and include questions about the user’s experience, satisfaction levels, suggestions for improvement, and specific feedback on content relevance, features, and day-to-day work application.
- User Testing and Interviews: Conduct user testing sessions where participants interact with your digital learning experience while providing real-time feedback. Use screen recording software to observe user behavior and reactions. Follow up with interviews to delve deeper into their experiences and gather qualitative insights.
- Analytics and User Behavior Analysis: Use the analytics tools within your learning platform. Analyze metrics such as user flow, learner engagement, and completion of specific activities to gain insights into what resonated with learners and what didn’t.
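If your learning platform lets you export raw completion data, even a small script can surface the completion metrics described above. Here’s a minimal sketch in Python; the CSV column names (`learner_id`, `lessons_completed`, `lessons_total`) are hypothetical, not a real LMS schema, so you’d adapt them to whatever your platform actually exports:

```python
import csv
import io

# Hypothetical LMS export: one row per learner in the pilot cohort.
# Column names are illustrative only -- match them to your platform's export.
SAMPLE_EXPORT = """learner_id,lessons_completed,lessons_total
a1,10,10
a2,7,10
a3,4,10
"""

def completion_rates(csv_text):
    """Return (overall_rate, per_learner_rates) from a pilot data export."""
    per_learner = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        rate = int(row["lessons_completed"]) / int(row["lessons_total"])
        per_learner[row["learner_id"]] = rate
    overall = sum(per_learner.values()) / len(per_learner)
    return overall, per_learner

overall, per_learner = completion_rates(SAMPLE_EXPORT)
print(f"Overall completion: {overall:.0%}")  # prints "Overall completion: 70%"
```

Per-learner rates like these also make it easy to spot the drop-off pattern from the retail example above, such as learners who finish the first few lessons and then stall.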
If learners will be participating in a synchronous physical environment as well, here are a few other ways to consider getting feedback:
On-Site Feedback Stations or Exit Surveys:
- Set up feedback stations within the physical environment, such as kiosks or tablets with digital forms. Encourage participants to provide feedback by asking them to rate their experience, share suggestions, or highlight areas of improvement.
- Or use mobile-friendly survey tools or QR codes that participants can scan to access a survey online. Ask about their impressions, memorable moments, and feedback on facilities or services.
Observational Feedback:
- Train staff or observers to gather observational feedback by observing participant behavior, interactions, and reactions within the physical environment.
- Document observations about navigation, engagement with exhibits or displays, and overall comfort levels.
Focus Groups and Interviews:
- Organize focus groups, run voice-of-the-learner sessions, or conduct interviews with a sample of participants to delve deeper into their experiences.
- Use open-ended questions to explore their perceptions, preferences, and suggestions for enhancing the physical environment.
Use the data from your pilot to inform next steps when designing a training program for employees
It seems obvious, but there’s no point in running your pilot if you don’t use the data you collect to inform your learning program design for employees.
Here are a few questions to consider as you’re assessing your pilot’s impact:
- How did the pilot outcomes align with the overall learning goals and expectations?
- Did the delivery methods (e.g., online modules, instructor-led training) meet the learners’ needs and preferences?
- How did the pilot participants engage with the content and activities provided?
- What feedback did the pilot participants provide regarding the clarity and relevance of the content?
- Were there any unexpected challenges or obstacles encountered during the pilot, and how were they addressed?
- Were there any technical issues or barriers that hindered the learning experience during the pilot?
And most importantly: what adjustments or improvements are needed, based on the pilot results, to enhance the effectiveness of the learning program?
As you assess your pilot’s impact, remember that the real value lies in using the data and insights gathered to refine and enhance your learning program.
Summary: When designing a training program, start with a pilot
Analyzing pilot outcomes against your initial goals, evaluating delivery methods, gauging learner engagement, and addressing feedback are all important steps when creating training and development programs for employees. By identifying areas for improvement and evolving your program based on pilot feedback, you can make informed adjustments that lead to a more effective learning experience for your audience.
Frequently asked questions about how to create a training program
What does pilot mean in business?
A pilot of a learning program is a small-scale trial run of that program involving a smaller group of participants. Its purpose is to evaluate the program’s design, gather feedback on learner engagement, identify areas for improvement, and make necessary adjustments before the program is available to its full audience.
How to pilot a training program?
Five factors to consider when you pilot employee development programs:
- Clarify outcomes for your pilot
- Include the right audience that represents the actual learner population
- Consider the context of your learners
- Communicate the pilot experience and get feedback
- Use the data from your pilot to inform next steps