
Why do some knowledge checks work better than others? Part two

  • Writer: Zoe Goldthorpe
  • Mar 4
  • 8 min read



An illustrated earth character smiles while holding a lightbulb above their head.

Improving knowledge checks


In part one of this guide, we explored several design choices that determine whether knowledge checks genuinely support learning or serve only as markers of progress. We looked at how aligning questions to learning outcomes, introducing an appropriate level of challenge, and grounding questions in realistic situations can make assessments feel purposeful and engaging for learners.


If you have not read this first part yet, it is worth starting there, as those foundations shape everything that follows.


In this second part, we turn to what happens after the question itself. In asynchronous digital learning, the moment when a learner submits an answer is often the only opportunity the course has to explain, clarify, or reinforce understanding. How feedback is written, how reflection is supported, and how the limits of automated assessment are handled can make a significant difference to the learning experience.


The sections below focus on practical ways to design this next stage effectively, helping knowledge checks do more than simply mark answers and instead contribute meaningfully to learning. 


  1. Include meaningful feedback


Effective classrooms run on feedback. Learners are constantly receiving small signals about how they are doing through explanation, correction, clarification, and conversation. It would be unusual for a teacher to simply say ‘That’s wrong’ and move on. More often, they would explore the misunderstanding and guide the learner towards clearer thinking. 

 

In digital learning, that exchange does not happen automatically. Once a learner selects an answer, they’ll typically receive immediate confirmation of whether they’re correct or incorrect. But what appears next can either close the conversation or continue it. Well-written feedback can be a great opportunity to explain the reasoning and reinforce key messages. 

 

If a learner selects an incorrect answer, there is usually a reason for it. We cannot know exactly what that reason is, and we cannot address every possible misconception in an asynchronous course. However, feedback gives us a chance to respond to the most likely misunderstandings and guide learners towards the correct principle. 

 

Say we are assessing understanding of data protection responsibilities and a learner selects an incorrect option. Consider the difference between these two types of feedback: 


Basic feedback


That's not correct. The correct answer is B.

More meaningful feedback


That’s not the correct answer. Before sharing personal data externally, you must ensure it has been anonymised where required to comply with data protection standards. Formatting and branding may matter, but they do not override the responsibility to protect confidentiality. Anonymisation should always be checked first.


This second response does more than identify the right answer. It clarifies the reasoning and redirects attention to the key principle. In an asynchronous course, this moment often carries the explanation that would otherwise happen through discussion. 

 

Both Articulate Rise and Storyline allow you to build feedback directly into the interaction. The tools offer different levels of flexibility, but the principle is the same: feedback should leave the learner clearer than they were before. 

 

Put it into practice

Treat feedback as part of the question design, not something to add later. As you write each question, pause and ask what the learner needs to understand after answering. Use the feedback to reinforce the principle for correct responses and to clarify likely misconceptions for incorrect ones.

Checklist for success

  • Does the feedback add insight beyond simply revealing the answer? 

  • Does the feedback explain why the correct answer is correct? 

  • Would someone who guessed correctly still learn something from the feedback? 

  • If a learner chose an incorrect option, would the feedback help redirect their thinking?

  2. Be realistic about what asynchronous e-learning can assess


Asynchronous digital learning has clear strengths, but it also has limits. Automatically marked knowledge checks, particularly multiple-choice questions that generate immediate correct or incorrect feedback, work best when they focus on content that can be assessed objectively. 

 

This is especially important in tools such as Articulate Rise, where knowledge checks are designed around clear right-or-wrong responses. While Storyline offers more flexibility, the same principle applies: if a system is marking the answer, there needs to be a defensible basis for doing so. 

 

For example, the following types of question work well in an automated format: 

  • According to NHS guidance, how many times a day should children brush their teeth? 

  • What does the acronym GDPR stand for? 

  • What is the first step in your organisation’s incident reporting process? 

  • Which document must be completed before beginning a research study involving human participants? 

Each of these has one clearly defensible answer that can be marked reliably by a system. 

 

By contrast, multiple-choice questions that attempt to assess more open or subjective thinking can be problematic. For example: 


  • What is the best way to build trust with a marginalised community? 

  • Which leadership style do you believe is most effective? 

  • How should schools balance academic performance with pupil wellbeing? 

  • What is the most effective way to lead organisational change? 


These invite reflection and discussion, but they are difficult to mark meaningfully as correct or incorrect without facilitation. In an asynchronous environment, learners may be unsure why one answer has been judged ‘right’ if the issue itself is nuanced. This is especially true of courses that want to teach analysis, judgement, or complex decision-making: knowledge checks can help assess the foundational understanding that supports these skills, but on their own they are limited to measuring knowledge of the skill, rather than the full skill in action. 

 

Being clear about these constraints early on leads to better design decisions. It allows you to use automated knowledge checks where they build confidence and clarity, and to use reflection, discussion, or facilitated activities where deeper thinking is required. 


Put it into practice

Ask a colleague or subject matter expert to review your questions before finalising them. A fresh set of eyes can quickly highlight where wording is unclear or where more than one option could reasonably be defended. If reviewers consistently identify the same correct answer, that is a good sign the question is clear. If they do not, refine it until agreement is easier to reach.

Checklist for success

  • Is there one clearly defensible correct answer? 

  • Would subject matter experts agree on which option is correct? 

  • Is the answer grounded in policy, procedure, or agreed best practice? 


  3. Use suggested responses to support reflection


Not every good question fits neatly into a multiple-choice format; this becomes particularly clear when converting workshop or classroom-based learning into self-paced digital courses. Discussion prompts that once led to thoughtful conversation can lose depth if forced into a single correct answer. 

 

Some learning goals are better supported through reflection and exploration. In asynchronous learning, the challenge is to preserve that thinking time while still offering structure. Learners need space to generate their own response first, but they also need reassurance that they are broadly on the right track. 

 

A practical solution is to build in deliberate reflection. Pose the open question. Prompt learners to write down their response or keep a note for themselves. Then invite them to reveal a short list of suggested responses. 

 

The purpose of suggested responses is not to dictate exactly what learners should think, nor to replace the richness of live discussion. Instead, they provide a reference point: learners can compare their ideas with informed examples, notice gaps, pick up additional perspectives, and refine their thinking. For those who feel stuck, the responses offer direction without closing down reflection. 

 

Both Articulate Rise and Storyline make this easy to implement through simple click-and-reveal interactions. Used sparingly, it is a straightforward and effective compromise. 


Here’s how it might look in practice 


Reflection question 


You are planning a community health session for parents. What factors should you consider to make sure the session is inclusive and accessible? 

 

Take a moment to note down your ideas before you continue. 

 

Suggested responses (click to reveal) 


You might have considered factors such as: 

 

  • The time and location of the session, and whether these are convenient for working parents 

  • Language needs and whether translation or interpretation support is required 

  • Cultural norms that may influence who feels comfortable attending 

  • Childcare arrangements that could remove barriers to participation 

  • Physical accessibility for parents or carers with disabilities 

  • Gender dynamics within the community and how these may affect engagement 

  • How the session is advertised, and whether communication channels reach all groups  

 

These suggestions illustrate the kind of practical thinking that supports inclusive planning. They are not exhaustive, and your context may lead you to prioritise different factors. 


Put it into practice

Reflection questions often benefit from more than one perspective, so consider inviting input from colleagues or subject matter experts when drafting your suggested responses. Remember that you’re not aiming to list every possible answer, but to offer a representative selection that models thoughtful practice and reinforces the key principle of the lesson. 

Checklist for success

  • Does the question genuinely invite reflection rather than a single correct answer? 

  • Do the suggested responses reflect a range of relevant perspectives? 

  • Do the suggestions reinforce the core principle or learning goal? 

  • Is it clear that the responses are illustrative rather than exhaustive? 

  4. Write in clear, accessible language


Language plays a significant role in how knowledge checks are experienced. When wording is unnecessarily complex, learners may struggle with the phrasing rather than the concept being assessed. 

 

Clear language is not about lowering expectations or talking down to learners. Good learning requires effort, but the effort should come from engaging with the content, not decoding the sentence. If the challenge lies in untangling double negatives, dense clauses, abstract phrasing, or informal expressions and idioms that may not translate clearly, the assessment is no longer measuring what you intend it to measure. 

 

Clarity also supports accessibility. In professional settings, learners may speak English as an additional language, have dyslexia, have different levels of reading fluency, or be completing the course in a busy office or on the move. Removing avoidable barriers helps ensure that cognitive effort is directed towards the knowledge or judgement you want them to apply. 


Here’s how it might look in practice 


Less clear wording: Which of the following is not unlikely to occur if guidance is not incorrectly applied? 

Clearer alternative: What is likely to happen if the guidance is applied incorrectly? 

Less clear wording: In circumstances where employees may be experiencing difficulties in relation to compliance expectations, which of the following would be deemed most appropriate? 

Clearer alternative: What should an employee do if they are unsure about compliance requirements? 

Less clear wording: If things go sideways during the session, what’s your best bet? 

Clearer alternative: If the session does not go as planned, what should you do first? 

Less clear wording: In the event that a participant demonstrates behaviours which may or may not be indicative of distress, what action might reasonably be considered? 

Clearer alternative: If a participant shows signs of distress, what should you do? 

Less clear wording: Which option is not inappropriate in situations where policy has not been disregarded? 

Clearer alternative: Which option is appropriate when policy has been followed? 

Less clear wording: What is the recommended frequency with which training should ideally be completed? 

Clearer alternative: How often should training be completed? 

Put it into practice

Read each question aloud. If it sounds complicated, simplify it. Remove double negatives, cut long sentences into shorter ones, and focus on one clear idea at a time. If possible, ask someone unfamiliar with the course to read the question and explain what it is asking. 

Checklist for success

  • Is the vocabulary appropriate for your intended audience? 

  • Could the question be shortened without losing meaning? 

  • Is the language free from idioms, slang, and culturally specific expressions that could confuse some learners? 

  • Could a learner understand what they are being asked to do without having to reread the question several times? 


Turning guidance into practice


Designing effective knowledge checks is more detailed work than it may initially seem, particularly in asynchronous digital learning. That’s because they sit at the intersection of subject expertise, learning design, and the practical limits of the tools being used. 

 

It’s also why this part of the process can feel difficult to tackle alone. Writing clear questions, judging the right level of challenge, and deciding how much feedback is enough all take careful thought, and often benefit from collaboration. 

 

We regularly work with organisations that bring strong content knowledge and early drafts, then collaborate to refine questions, clarify learning outcomes, and develop knowledge checks that feel meaningful and engaging for learners. 

 

The difference often lies in small, deliberate design decisions, and that level of detail is precisely where we do our best work. If you would like to strengthen your content and feel confident in your assessment design, get in touch and we can take a closer look together. 
