Six Steps to Add Surveys to Learning
Six steps to integrate surveys into learning experiences and gain insights that assessments alone cannot provide.
Continuous improvement in learning requires a consistent flow of new information. To create great, sustainable courses that provide ongoing value to the learner, the flow of knowledge must be bidirectional: new knowledge flows to the learner, and feedback on the experience flows back to the course stakeholders.
Why Assessments Do Not Create Great Courses
If you are using assessments to evaluate the effectiveness of the learning, why not ask the learner directly? When designing learning, the assessment often becomes a punitive exercise in memory retention or trivia: questions boiled down to true/false or multiple choice, where the learner cannot progress until they select the correct answer. In a recent course, a stakeholder noted that he could pass the assessment by simply selecting the longest answer from the responses, as incorrect answers were vague and notably shorter. Most LMS platforms now provide advanced reporting to show how the learner advanced through the course and whether they spent any meaningful time on the subject and content.
Assessments reveal what learners don't know ...
Surveys reveal what stakeholders don't know.
Learning as a Customer Success Model
If you consider your learners your customers, evaluating and assessing their knowledge seems out of place. Following the customer model, provide support and customer success programs that follow up and seek to improve the learning. Once the learning is created, the learner can offer innovative insight that drives future enhancements and improvements to the learning experience.
Connecting the learning experience to frequent feedback and conversations with learners, participants, and stakeholders provides relevant inputs that allow the learning and the learners to evolve continuously.
Surveys can drive future enhancement, expansion, or contraction of the learning experience and program. In a real sense, surveys shape the stakeholder outcome for the learning experience. Surveys can often assess the success of a program better than the evaluation scores from a test: evaluation scores tell you whether the learner knows the material; surveys tell you whether the learner thinks the material is helpful.
1. Designing a Learning Survey
Many times, survey questions are an afterthought to the learning experience and program launch. The survey should feel simple, intuitive, and short to the learner, but that does not mean the design work behind it is simple. Ideally, the survey should be crafted from what you already know about the learner from assessments and other sources.
Don't ask questions whose answers you can get by other means, or whose results you are not planning to analyze. What makes a survey effective?
- Specificity - reference or link to the content if possible
- Immediate follow-up
- Mine for feedback
2. Rating Questions and Scales
One of the fastest ways to let learners provide quantitative feedback is to use a rating scale or Likert question. A Likert question is a form of rating scale question where the learner uses a scale to register their degree of agreement or disagreement with a statement. Using a Likert rating scale for statements connected to the learning experience lets the learner give quantifiable ratings of their perception, knowledge, behavior change, and the impact of the learning experience. Likert scales typically use five, seven, or nine points ranging from strongly negative to strongly positive, with the midpoint being neutral.
To increase the specificity of the feedback, combine Likert items for quantitative analysis with open feedback opportunities. The best survey question structure is crafted in three parts and may include several questions that appear depending on the learner's input, using simple branching logic with Likert items and requests for direct feedback from the learner.
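As an illustration only, here is a minimal sketch of how a Likert item paired with an open-ended follow-up might be represented in a survey tool; the field names and the five-point wording are assumptions, not a prescribed schema.

```python
# Hypothetical representation of a Likert item paired with an open-ended follow-up.
# Field names and the five-point scale are illustrative assumptions.

LIKERT_5 = {
    1: "Strongly disagree",
    2: "Disagree",
    3: "Neither agree nor disagree",  # neutral midpoint
    4: "Agree",
    5: "Strongly agree",
}

question = {
    "statement": "The examples in Module 2 helped me apply the concept in my role.",
    "scale": LIKERT_5,                                         # quantitative rating
    "follow_up": "What would make this module more useful?",   # open feedback
}

# A response pairs the numeric rating (for analysis) with free text (for specificity).
response = {"rating": 4, "comment": "More scenarios for new managers, please."}
```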
3. Three-Part Question Strategy
An effective learning survey has three components. It does not need to be extensive to gain valuable insight into the effectiveness and impact of the learning.
- An initial question that filters or segments the learner's experience. This question can segment the learner's experience with the content into several categories, such as novice to expert, or whether the content is a core competency for their role or purely informational.
- A second question that rates the learner's perception of KPIs, learning outcomes, or other items of strategic value to the stakeholder. List three or four elements of the course and ask about the impact of the experience on the learner.
- A third open-ended question that requests the learner's feedback.
By first asking a yes/no question or a select-from-a-list option, you can then follow up with Likert rating items and a free-text request whose responses can be segmented.
Keep the question logic and complexity on the design side so you derive valuable information quickly from the learner: while the design of the survey may be complex, from the learner's perspective it is easy.
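To make the three-part flow concrete, here is a small sketch of how the segment, rate, and open-feedback steps might be wired together; the categories, statements, and prompts are hypothetical examples, not part of any particular survey platform.

```python
# Hypothetical three-part survey flow: segment, rate, then ask for open feedback.
# The categories, statements, and branching rule are illustrative assumptions.

def run_survey(ask):
    """`ask` is any callable that shows a prompt (and optional options) and returns the answer."""
    # Part 1: segment the learner's experience with the content.
    segment = ask("How would you describe yourself on this topic?",
                  options=["Novice", "Intermediate", "Expert"])

    # Part 2: rate perception of outcomes that matter to stakeholders (1-5 Likert).
    statements = [
        "This course covers a core competency for my role.",
        "I can apply what I learned immediately.",
        "The examples reflected real situations I face.",
    ]
    ratings = {s: ask(s, options=[1, 2, 3, 4, 5]) for s in statements}

    # Part 3: open-ended feedback, with wording that branches on the segment.
    if segment == "Expert":
        comment = ask("What advanced material should we add?")
    else:
        comment = ask("What would have made this course easier to apply?")

    return {"segment": segment, "ratings": ratings, "comment": comment}

# Example: drive the survey from the console.
# answers = run_survey(lambda prompt, options=None: input(f"{prompt} {options or ''} "))
```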
4. Survey Learning Program
Learning surveys can be incorporated into a learning journey, and questions can serve as check-ins as the learner progresses along the path. A three-phase learning survey program would include:
Pre-Survey (5 Questions)
As part of onboarding, participants complete a benchmark survey to assess their initial knowledge, skills, and abilities related to the learning program.
Pulse Survey (3-5 Questions)
Pulse surveys are short and easy to complete, so participants can take them repeatedly and report their perceived progress throughout the program.
Post Survey (5 Questions)
The post-survey repeats key measurement questions from the pre-survey. Comparing these responses with the pre-survey benchmark provides a measure of progress for and among the participants.
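One way to picture the three-phase program is as a simple schedule definition; the timing and the pulse question count below are assumptions layered on the outline above.

```python
# Hypothetical three-phase survey schedule for a learning journey.
# Phase names and question counts mirror the outline above; timing is an assumption.

SURVEY_PROGRAM = [
    {"phase": "pre",   "questions": 5, "when": "during onboarding",
     "purpose": "benchmark initial knowledge, skills, and abilities"},
    {"phase": "pulse", "questions": 4, "when": "after each module",
     "purpose": "capture perceived progress while the program is running"},
    {"phase": "post",  "questions": 5, "when": "at program completion",
     "purpose": "repeat key pre-survey questions to measure change"},
]

for phase in SURVEY_PROGRAM:
    print(f'{phase["phase"]:>5}: {phase["questions"]} questions, '
          f'{phase["when"]} - {phase["purpose"]}')
```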
5. Survey Reporting and Outcomes
We define survey reporting as the data measurements used to monitor the health of, usage of, and engagement with the learning experience. Reporting differs from outcome measures: the data provided by the LMS and assessment results typically do not translate into measures of outcomes or the successful impact of the learning experience.
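As a rough sketch of that distinction, reporting metrics such as response rate and average rating can be computed directly from survey data, while outcome claims still require interpreting those ratings against the stakeholder's success criteria; the data and field names below are hypothetical.

```python
# Hypothetical reporting metrics computed from survey responses.
# Health, usage, and engagement can be reported directly from the data;
# outcome measures require interpreting ratings against stakeholder goals.

responses = [
    {"completed": True,  "rating": 4},
    {"completed": True,  "rating": 5},
    {"completed": False, "rating": None},
]

invited = 10  # learners who received the survey (assumed figure)
completed = [r for r in responses if r["completed"]]

response_rate = len(completed) / invited                               # engagement
average_rating = sum(r["rating"] for r in completed) / len(completed)  # perceived health

print(f"Response rate: {response_rate:.0%}, average rating: {average_rating:.1f}")
```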
6. Survey Follow-up
When learners invest the time to provide positive or negative input for improving the learning experience, the feedback loop must reach the learning developers and stakeholders, and ultimately close with a follow-up to the learner.
Following up with a learner provides an opportunity to gain new insight or requirements for the learning experience. A quick follow-up might read: "Your recent feedback about [XXXX] in the learning experience prompted these changes; I invite you to take a look here [link to the course]."