
Question Type & Cognitive Demand – Choosing the Right Tool

Phase 3 – Implementation: Using Assessment Evidence with Impact

From Results → Action

In this final phase of the EBTD Guide to Better Assessment, we move from collecting marks to using evidence with professional precision. Teachers turn raw results into insights — spotting patterns, diagnosing misconceptions, and deciding what to reteach or reinforce.

Big question: “I’ve marked the scripts — what now?” By the end of this phase, you’ll be able to:

  • Interpret assessment evidence with clarity and confidence.
  • Identify misconceptions that block progress.
  • Act quickly and purposefully — adjusting teaching, feedback, or curriculum design.

The first step in that journey: choosing the right question type for the right level of thinking.


What Cognitive Demand Really Means

Every question sits somewhere on a thinking ladder. This ladder shows how understanding deepens — from recalling facts to applying and transferring ideas. At EBTD we use a simple five-step progression model that stays consistent across subjects.

The Ladder of Thinking and Cognitive Demand

Each step below is described as it appears in the classroom, with an example from a science topic (evaporation):

  1. Recall – Remember key facts, terms, or definitions. Example: “What is evaporation?”
  2. Understand – Explain ideas in your own words; show comprehension. Example: “Why does heat make water evaporate faster?”
  3. Apply – Use knowledge in a familiar situation or similar example. Example: “Why do wet clothes dry faster on a sunny day?”
  4. Reason – Link causes and effects; justify, compare, or analyse ideas. Example: “Why might a shaded plant dry more slowly?”
  5. Transfer – Use learning in a new or real-world context. Example: “How does evaporation help animals keep cool?”

Bangladesh Tip: In large or exam-focused classes, you don’t need to reach Step 5 every time. A smart balance works best: many quick Recall / Understand checks for breadth, and a few Apply / Reason / Transfer tasks for depth.
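
If you plan a quiz in a spreadsheet or a short script, a quick tally can show at a glance whether it leans toward breadth or depth. The sketch below is a minimal illustration of that check, assuming a simple list of (question, level) pairs; the questions and the quiz_plan structure are hypothetical, not EBTD materials.

```python
# A minimal, illustrative sketch (not part of the EBTD materials): tag each
# planned question with its step on the thinking ladder, then check whether
# the quiz balances breadth (Recall/Understand) with depth (Apply and above).
from collections import Counter

LADDER = ["Recall", "Understand", "Apply", "Reason", "Transfer"]

quiz_plan = [
    ("What is evaporation?", "Recall"),
    ("Why does heat make water evaporate faster?", "Understand"),
    ("Why do wet clothes dry faster on a sunny day?", "Apply"),
    ("Why might a shaded plant dry more slowly?", "Reason"),
    ("How does evaporation help animals keep cool?", "Transfer"),
]

counts = Counter(level for _, level in quiz_plan)
breadth = counts["Recall"] + counts["Understand"]
depth = counts["Apply"] + counts["Reason"] + counts["Transfer"]

for level in LADDER:
    print(f"{level:<10} {counts[level]} item(s)")
print(f"Breadth (Recall/Understand): {breadth} | Depth (Apply and above): {depth}")
```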


Quick Match: Question Type → Thinking Level (at a glance)

Each question type below is listed with its best-fit thinking level, what it’s good for (teacher), its learning value (student), watch-outs / non-negotiables, and a one-line example.

  • True/False – Best fit: Recall (sometimes Understand). Teacher: very fast whole-class check; surfaces misconceptions. Student: low learning value on its own, so add quick discussion. Watch-outs: one clear idea; avoid tricky wording/negatives; balance true and false items. Example: “All metals expand when heated.” (T/F)
  • Multiple Choice – Best fit: Recall → Apply → light Reasoning (with scenarios). Teacher: broad coverage; easy marking; diagnostic distractors reveal gaps. Student: useful if feedback explains why options are right or wrong. Watch-outs: one best answer; plausible distractors; avoid ‘NOT’ unless bolded. Example: “Which method best separates salt from water?” A–D
  • Matching – Best fit: Recall / Understand. Teacher: fast vocabulary or term checks; pairs/definitions. Student: builds associations; quick win. Watch-outs: homogeneous sets; 5–8 pairs; say if options can repeat. Example: match terms to definitions (photosynthesis ⇄ process…)
  • Fill-in-the-Blank – Best fit: Recall. Teacher: tests exact knowledge (formulas/terms); reduces guessing. Student: strong retrieval practice. Watch-outs: one unambiguous answer; blank near the end; specify units/spelling. Example: “Distance = ______ × Time”
  • Short Answer – Best fit: Understand → Apply (some Reason). Teacher: see why students think that; allows partial credit. Student: retrieval plus articulation builds memory. Watch-outs: specific prompt; expected length; simple rubric. Example: “Why add a control in this experiment?” (2–3 lines)
  • Extended/Open (structured or essay) – Best fit: Reason / Transfer. Teacher: evaluate deeper understanding, argument, and multi-step methods. Student: highest learning value if feedback is given. Watch-outs: clear criteria; manageable length; avoid overuse in large cohorts. Example: “Compare two methods for water purification; justify your choice.”

Question Types with Implementation Plans

1️⃣ True / False – Quick Whole-Class Check

Thinking: Recall → Understand
Purpose: Rapidly check factual knowledge and surface misconceptions.

Active Ingredients

  • One clear idea per item; avoid compound statements.
  • Balanced truth ratio (half true, half false).
  • False items reflect real student errors.
  • Short, readable wording (no double negatives).
  • Discuss false answers to surface misunderstandings.

Avoid

  • Two ideas in one sentence.
  • Trivial facts.
  • Subjective statements.

Example

  • “All metals expand when cooled.” → False — shows confusion between expansion and contraction.
  • “Evaporation happens only in sunlight.” → False — invites reasoning about temperature.

Implementation Plan

  1. Write 5–6 clear statements per topic.
  2. Collect responses (whiteboards, cards, polls).
  3. Ask “Why might someone think that’s true?” and discuss.
  4. Record high-error items → reteach next lesson (a tally sketch follows this list).
  5. Keep a diagnostic bank of True/False items.
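
If you transcribe the whiteboard, card, or poll responses from step 2 into a simple table, the high-error items from step 4 can be flagged automatically. Below is a minimal sketch under assumed data; the items, responses, and the 40% reteach threshold are illustrative only, not EBTD prescriptions.

```python
# Minimal, illustrative sketch: tally whole-class True/False responses to
# flag high-error items worth reteaching. Items, responses, and the 40%
# threshold are assumptions.
items = {
    "Q1": {"statement": "All metals expand when cooled.", "answer": False},
    "Q2": {"statement": "Evaporation happens only in sunlight.", "answer": False},
}

# One row per student: their True/False response to each item.
responses = [
    {"Q1": True, "Q2": False},
    {"Q1": True, "Q2": True},
    {"Q1": False, "Q2": False},
]

for qid, item in items.items():
    wrong = sum(1 for r in responses if r.get(qid) != item["answer"])
    error_rate = wrong / len(responses)
    flag = "RETEACH" if error_rate >= 0.4 else "ok"
    print(f"{qid}: {error_rate:.0%} wrong [{flag}] {item['statement']}")
```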

2️⃣ Multiple Choice – Breadth + Diagnosis

Thinking: Recall → Apply → Reason
Purpose: Test broad coverage and diagnose misconceptions through distractors.

Active Ingredients

  • Clear stem; 3–4 options.
  • Each distractor represents a real misconception.
  • Parallel phrasing and length.
  • Short scenario for context where helpful.

How to Design a Diagnostic MCQ

  1. List common errors.
  2. Write correct reasoning.
  3. Turn each error into a distractor.
  4. Test with colleagues.
  5. Analyse responses by option (letter = misconception).

Avoid

  • “Which is NOT…” unless clearly highlighted.
  • “All/None of the above.”
  • Long stems.

Example

Which statement about evaporation is true?
A It only happens in sunlight.
B It happens at any temperature.
C It requires boiling.
D It warms the remaining liquid.

A shows the common misconception “evaporation = sunlight”.

Implementation Plan

  1. 4–6 items per topic.
  2. Mark quickly by tally or form.
  3. Review patterns and reteach by error type (see the distractor-tally sketch after this list).
  4. Reuse for starters with explanations.
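
Where responses come back digitally (for example from a form export), the option-by-option review in step 3 amounts to counting how many students chose each letter. The sketch below is a minimal illustration, assuming each distractor has been written against a named misconception; the labels and student choices are assumed data.

```python
# Minimal, illustrative sketch: count choices per option and map each
# distractor to the misconception it was written to catch.
# Misconception labels and student choices are assumed data.
from collections import Counter

misconceptions = {
    "A": "thinks evaporation needs sunlight",
    "B": None,  # correct answer: happens at any temperature
    "C": "confuses evaporation with boiling",
    "D": "reverses the cooling effect",
}

student_choices = ["A", "B", "A", "C", "B", "A", "D", "B"]  # one letter per student

tally = Counter(student_choices)
for option in sorted(misconceptions):
    note = misconceptions[option] or "correct answer"
    print(f"Option {option}: {tally.get(option, 0):>2} students ({note})")
```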

3️⃣ Matching – Connect Language and Concepts

Thinking: Recall / Understand
Purpose: Reinforce vocabulary and concept links.

Active Ingredients

  • One relationship type (term–definition).
  • 5–8 pairs + 1–2 extra options.
  • Clear layout (one page).
  • Short definitions.

Avoid

  • Mixed categories.
  • Wordy phrasing.

Example

1 Photosynthesis   2 Respiration   3 Transpiration
A Releases energy from food   B Removes excess water   C Makes food using sunlight

Implementation Plan

  1. Use for plenary or review.
  2. Ask students to explain one pair.
  3. Note common errors → reteach that link.
  4. Convert missed pairs to fill-in items.

4️⃣ Fill-in-the-Blank – Precision Recall

Thinking: Recall
Purpose: Strengthen accuracy and fluency.

Active Ingredients

  • One blank per sentence (near end).
  • Context without clues.
  • Optional format hint ([unit]).
  • Consistent style.

Avoid

  • Multiple blanks.
  • Ambiguous grammar clues.

Example

“The formula for density is ______ ÷ volume.” → mass

Implementation Plan

  1. Prepare 8–10 core facts per topic.
  2. Let students self-check.
  3. Note top three errors → reteach.
  4. Use orally in warm-ups.

5️⃣ Short Answer – Explain and Apply

Thinking: Understand → Apply
Purpose: Show reasoning and clarify thinking.

Active Ingredients

  • Clear verb and focus.
  • Length guide (2–3 sentences).
  • 3-point rubric (idea + reason + example).
  • Model answers shown.

Avoid

  • Vague prompts.
  • Over-marking.
  • Penalising language over content.

Example

Why is a control important in a fair test?
✅ Keeps variables constant → shows real effect → valid result.

Implementation Plan

  1. Write 3–4 per unit.
  2. Sort answers into secure/partial/unclear (see the sorting sketch after this list).
  3. Use partials for next-lesson improvement.
  4. Track patterns → reteach.
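
If the 3-point rubric marks are recorded against names, the secure/partial/unclear sort in step 2 takes only a few lines. This is a minimal sketch; the names, scores, and cut-offs (3 = secure, 1–2 = partial, 0 = unclear) are illustrative assumptions.

```python
# Minimal, illustrative sketch: sort short-answer rubric scores (0-3) into
# secure / partial / unclear piles for next-lesson planning.
# Names, scores, and cut-offs are assumptions.
marks = {"Ayesha": 3, "Rafi": 2, "Nusrat": 1, "Imran": 0, "Tania": 2}

piles = {"secure": [], "partial": [], "unclear": []}
for student, score in marks.items():
    if score == 3:
        piles["secure"].append(student)
    elif score >= 1:
        piles["partial"].append(student)
    else:
        piles["unclear"].append(student)

for pile, names in piles.items():
    print(f"{pile:<8}: {', '.join(names) if names else '-'}")
```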

6️⃣ Extended / Open Response – Reason and Judge

Thinking: Reason → Transfer
Purpose: Assess depth and evidence-based reasoning.

Active Ingredients

  • Claim → Evidence → Because structure.
  • Clear criteria (content, reasoning, clarity).
  • Scaffolds for EAL learners.
  • Model examples shared.

Avoid

  • “Write everything you know.”
  • Over-long tasks.
  • No feedback.

Example

Compare two methods for water purification and justify which is better for rural use.
Strong answer: compares methods, resources, and concludes with reasoned judgement.

Implementation Plan

  1. Give criteria up front.
  2. Mark analytically (3×3 grid: content, reasoning, clarity; see the sketch after this list).
  3. Use peer review samples.
  4. Short 10-minute weekly reasoning tasks.
  5. Track improvement over time.
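
For the weekly reasoning tasks in step 4, a small record of analytic marks makes the improvement trend in step 5 visible. The sketch below assumes each of the three criteria (content, reasoning, clarity) is marked 0–3; the data and the scale are illustrative, not EBTD requirements.

```python
# Minimal, illustrative sketch: record analytic marks (content, reasoning,
# clarity; each assumed to be marked 0-3) for weekly reasoning tasks and
# track the class average over time. All numbers are made up.
weekly_marks = {
    "Week 1": [(2, 1, 2), (1, 1, 1), (3, 2, 2)],  # one (content, reasoning, clarity) tuple per student
    "Week 2": [(2, 2, 2), (2, 1, 2), (3, 3, 2)],
}

for week, rows in weekly_marks.items():
    totals = [sum(row) for row in rows]            # each out of 9
    average = sum(totals) / len(totals)
    print(f"{week}: class average {average:.1f} / 9")
```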

✅ Quality Checklist for Every Question

Choosing the question type is only half the work. The real skill lies in making sure every question still asks the right question — the one that matches what you actually taught and the thinking you wanted to see.

  • Alignment – Does the question truly measure what was taught? Review the principles from Alignment – Test Exactly What You Taught to ensure every item fits your learning intent.
  • Demand – Does the question match the intended thinking level? Use the Curriculum Progression ladder to check whether it sits at Recall, Understand, Apply, Reason, or Transfer.
  • Clarity – Is the wording simple and direct? Avoid Construct-Irrelevant Difficulty: complexity that makes the question harder for the wrong reasons.
  • Fairness – Can every student show what they know? Check vocabulary, layout, and readability so EAL or struggling learners aren’t disadvantaged.
  • Markability – Is there a clear correct answer or marking guide? This keeps marking consistent and fair.
  • Actionable – Do results tell you what to reteach next? Revisit Calibrating Difficulty: a good question gives insight, not just a score.

EBTD Reminder: The first part of this series gave you the design tools — Alignment, Progression, Fairness, and Calibrated Difficulty — to make every question valid and purposeful. Phase 3 builds on that foundation, showing how each question type works in practice. Together they form a complete cycle: Design → Analyse → Use → Improve.