Alignment – Testing What You Actually Taught

Part of EBTD’s Guide to Better Assessment Series: From Principles to Practice — professional-development resources for teachers in Bangladesh.

Each article takes one principle from our Guide to Better Assessment Series and turns it into clear, classroom-ready practice.

1. Introduction – Why Alignment Matters

Have you ever marked a test and thought, “But they knew this in class!”?

You taught carefully. Students practised. Yet the results don’t show what they actually learned. This is rarely a teaching problem — it’s usually a problem of alignment.

Alignment means the question tests the exact knowledge or skill that was taught. When a question measures something else — complex language, tricky formatting, or unrelated ideas — the result is misleading.

In Bangladesh, this happens often: teachers reuse textbook or internet questions without checking whether they fit the lesson aim. Students “fail” topics they understood.

The good news: once you can check for alignment, assessment instantly becomes more accurate — and teaching more targeted.

2. What Alignment Means

Definition: Alignment means that assessment questions directly match what students were taught and practised. Each question should measure one clear learning objective — nothing more, nothing less.

3. Why Alignment Matters

  • Results make sense. They show what students truly understand.
  • Feedback becomes useful. You know what to reteach and what to extend.
  • Students trust your tests. They see fairness and consistency.

If alignment is missing, you can’t tell whether low scores reflect weak learning or a confusing question.

4. The Five Active Ingredients of Aligned Assessment

These five non-negotiables define what good alignment looks like. If one is missing, your assessment data loses accuracy.

Active Ingredient | Definition | What Good Looks Like
Clear Learning Objective | You know exactly what skill or knowledge the question measures. | You can finish the sentence “I’m checking whether students can …” in one line.
Right Cognitive Demand | The thinking level matches what was taught — recall, apply, or reason. | Tasks aren’t easier or harder than practice in class.
Direct Focus | The question tests the concept itself, not reading ability or background facts. | Only one idea per question; no hidden skills.
Familiar Response Format | The format mirrors how students practised. | If students solved orally or with diagrams, the question uses the same mode.
Fair and Clear Language | Wording is concise, culturally familiar, and free of traps. | Uses everyday Bangladeshi examples, simple numbers, clear layout.

From Principle to Practice

When you understand these five Active Ingredients, you can start to see alignment in every question you write. The next section shows what this looks like in practice — how a few small wording choices can satisfy or break these principles and completely change what you end up assessing.

5. Writing an Aligned Question – What Difference Wording Makes

Alignment Matrix: Rows = Active Ingredients. Columns = Questions. Each cell shows a tick/cross and why the question meets (or doesn’t meet) the ingredient.

Active Ingredient | Maths: Misaligned | Maths: Aligned | English: Misaligned | English: Aligned | Science: Misaligned | Science: Aligned
Question | “Train leaves Dhaka… what time will it arrive?” | “A car travels 90 km in 2 hours. What is its average speed?” | “Discuss how Shakespeare uses dramatic irony in his plays.” | “Choose one scene from Romeo and Juliet… explain dramatic irony…” | “Why do plants grow faster in summer than in winter?” | “Explain how temperature affects the rate of photosynthesis.”
1. Clear Learning Objective | ❌ Objective muddied by time calculation + journey context. | ✅ Single skill: average speed = distance ÷ time. | ❌ Too broad (“his plays”); aim not bounded. | ✅ One scene; one technique; tight aim. | ❌ Unclear if measuring growth or photosynthesis. | ✅ Explicit: temperature → rate of photosynthesis.
2. Right Cognitive Demand | ⚠️ Higher than taught (multi-step reasoning + time). | ✅ Matches taught application level. | ⚠️ Leaps to evaluation across works (too demanding). | ✅ Targets analysis only, as practised. | ⚠️ Invites general explanation with multiple factors. | ✅ Applies taught concept to a single relationship.
3. Direct Focus (one concept only) | ❌ Adds reading, time arithmetic, place names. | ✅ Focus stays on the speed calculation. | ❌ Mixes literary history, essay craft, and evaluation. | ✅ One concept: dramatic irony in one scene. | ❌ Multiple variables (light, soil, rainfall) implied. | ✅ Isolates temperature as the only variable.
4. Familiar Response Format | ⚠️ Wordy, multi-step word problem unlike class practice. | ✅ Short item mirrors practice problems. | ⚠️ Full essay structure not practised. | ✅ Short analytical paragraph, as practised. | ⚠️ Vague open response unlike data-based tasks. | ✅ Short explanation/data interpretation as in class.
5. Fair & Clear Language | ❌ Dense context; extraneous details distract weaker readers. | ✅ Plain wording; familiar numbers. | ❌ Broad prompt; ambiguous scope and phrasing. | ✅ Precise terms (scene, dialogue, stage direction). | ⚠️ Informal phrasing; assumes seasonal background knowledge. | ✅ Clear scientific wording used in lessons.

Key: ✅ = clearly meets the ingredient; ⚠️ = only partially meets it; ❌ = does not meet it.
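For reference, the aligned maths item above is a single-step application of the taught formula — nothing else stands between the student and the answer:

```latex
\[
\text{average speed} = \frac{\text{distance}}{\text{time}}
 = \frac{90\ \text{km}}{2\ \text{h}}
 = 45\ \text{km/h}
\]
```

Any extra step a question demands beyond this (converting units, reading a timetable, decoding a story) is assessing something other than the stated objective.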

Why This Matters

Each Active Ingredient acts like a filter: if one is missing, the question starts measuring something else.

Active Ingredient | If Missing, You Risk …
Clear Learning Objective | Data that can’t guide teaching decisions.
Right Cognitive Demand | Tasks that are too easy or too hard.
Direct Focus | Students failing for irrelevant reasons.
Familiar Format | Confusion or anxiety instead of evidence.
Fair Language | Inequity between strong and weak readers.

When all five are present, questions reveal true learning — and that’s the foundation for the Assessment MAP.

6. The EBTD Assessment MAP (Monitoring Alignment and Progress)

M – Map Out the Goal (Precision)

Clarify what to assess → Precision.

  • What exactly do I want to find out if students know or can do?
  • Which part of the lesson or scheme of work does this question check?
  • What kind of thinking am I testing — recall, apply, or reason?

EBTD Tip: If you can’t describe your goal in one sentence, your question will be too broad or unclear.

A – Align the Design (Validity)

Choose the type and wording that match teaching → Validity.

  • Use your existing resources (textbooks, past papers, class tasks) as a starting point; adapt to meet the five Active Ingredients.
  • Keep the response style familiar to students.
  • Match wording to what you taught — no hidden reading or vocabulary barriers.
  • If using AI, brief it using your five Active Ingredients.

Example AI prompt: “Write one question to check if students can calculate average speed using speed = distance ÷ time. Make it an application question, using clear, familiar Bangladeshi examples and simple language.”

P – Probe for Fairness (Reliability)

Check the five Active Ingredients → Reliability.

Ingredient | Self-Check Prompt
Clear Objective | Does it measure exactly what I taught?
Right Cognitive Demand | Is the level of challenge appropriate?
Direct Focus | Is it testing one concept only?
Familiar Format | Will students recognise the response style?
Fair & Clear Language | Could weaker readers or EAL students still show what they know?

Letter | Meaning | Purpose
M – Map Out the Goal | Clarify what to assess | Precision
A – Align the Design | Choose type and wording that match teaching | Validity
P – Probe for Fairness | Check the five Active Ingredients | Reliability

Worked Example – Applying the MAP (Teacher’s Thought Process)

Context: Secondary Science — students have been learning how temperature affects the rate of photosynthesis. The teacher wants to check whether students can interpret experimental data and apply their understanding of variables.

M – Map Out the Goal (Precision)

Teacher’s goal (in one sentence): “I’m checking whether students can identify and explain how temperature affects the rate of photosynthesis, using data from an experiment.”

  • Curriculum link: Topic: Photosynthesis → Working scientifically → Interpreting data and drawing conclusions.
  • Thinking level: Apply and reason — students use knowledge, not recall facts.
  • What I’m not assessing:
    • Memorisation of the photosynthesis equation
    • Recall of plant structures
    • Ability to read long passages or interpret unrelated data (like soil or light intensity)

Decision: Keep the focus on one variable (temperature). Avoid overlapping factors like light or water. Provide clear data and short, accessible text.

A – Align the Design (Validity)

Initial draft (too broad):

“Why do plants grow faster in summer than in winter?”

  • Too general — mixes multiple variables (temperature, sunlight, rainfall).
  • Doesn’t isolate what was taught (temperature’s effect on photosynthesis).
  • Encourages vague, memorised answers rather than scientific reasoning.

Aligned redesign (matches practice):

“The graph shows how the number of oxygen bubbles released by a pondweed changes as temperature increases. Explain what the results show about the effect of temperature on the rate of photosynthesis.”

  • Students interpret data — a skill explicitly practised in lessons.
  • Focuses on one relationship: temperature ↔ rate of photosynthesis.
  • Question wording mirrors practical work completed in class.
  • Supports fairness — short sentences, clear structure, familiar context.

Response format: Short explanation (2–3 sentences) or data interpretation — mirrors classroom scientific writing practice.

If using AI to draft: “Write one question to check if students can explain how temperature affects the rate of photosynthesis using simple data. Make it a short, applied question using clear, familiar Bangladeshi examples.”

P – Probe for Fairness (Reliability)

Check the five Active Ingredients. Use this quick self-check, then review your decisions.

Ingredient | Self-Check Prompt | Teacher’s Decision
Clear Objective | Does it measure exactly what I taught? | Yes — isolates temperature as the single variable.
Right Cognitive Demand | Is the challenge appropriate? | Yes — requires interpretation, not recall or advanced analysis.
Direct Focus | Is it testing one concept only? | Yes — no confusion with light or soil factors.
Familiar Format | Will students recognise the response style? | Yes — identical to class experiment questions.
Fair & Clear Language | Could weaker readers or EAL students still show what they know? | Yes — short sentences, familiar terms (“temperature,” “rate,” “oxygen bubbles”).

Teacher’s reflection: The improved version directly tests scientific reasoning rather than general knowledge. It removes distractions, keeps wording clear, and focuses precisely on the intended learning goal.

EBTD Principle – What This Shows: The MAP process works across all subjects, not just Science. It encourages teachers to ask: What am I really measuring? Have I kept the focus on the concept, not the context? Can every student access the question fairly? The MAP helps refine textbook and internet questions into valid, reliable assessments that reflect real understanding — not just memorisation.

Summary – Key Takeaways

  • Alignment means every question tests exactly what was taught.
  • Five Active Ingredients keep assessments precise, valid, and fair.
  • The Assessment MAP helps teachers — and AI tools — stay guided by evidence.
  • When all elements align, assessment results truly reflect learning.

If you found this useful, join the EBTD newsletter for monthly, research-backed tips, free classroom tools, and updates on our training in Bangladesh—no spam, just what helps. Sign up to the newsletter and please share this blog with colleagues or on your social channels so more teachers can benefit. Together we can improve outcomes and change lives.

Back to Introduction: Guide to Better Assessment  |  Next: Curriculum Progression