Listen on YouTube

Episode Summary

Welcome to the deep dive — where EBTD turns research and classroom guides into practical, immediate knowledge for teachers across Bangladesh.

In this episode, we explore how assessment can move beyond just collecting marks to truly making meaning from them. Using examples from the EBTD Guide to Better Assessment, we unpack the three phases of assessment expertise: designing valid questions, ensuring fairness and accessibility, and turning evidence into action.

Discover how frameworks like RUART and MAP 2.0 help teachers align, simplify, and interpret assessment data to improve learning outcomes — not just grades. Featuring real classroom stories from Dhaka and Rajshahi, this conversation shows how smarter assessment builds confidence, clarity, and real progress.

🎓 Learn more and download the free guide: https://www.ebtd.education/research-hub-free-teacher-resources/guide-to-better-assessment/
📩 Join the EBTD newsletter for new courses, research-backed tools, and monthly teaching insights: https://www.ebtd.education/newsletter/

Key Takeaways

  • Assessment isn’t about collecting marks — it’s about generating insight.
    The real value of data comes from interpreting what students’ answers reveal about their understanding, not just their scores.

  • Design questions that are valid, aligned, and fair.
    Every question should measure exactly what was taught, use clear language, and be free from unnecessary difficulty or cultural bias.

  • Use the RUART framework to build progression.
    Sequence questions through the five stages — Recall, Understand, Apply, Reason, and Transfer — so assessments mirror how learning deepens over time.

  • Remove the fog, don’t lower the bar.
    Simplifying language or context makes assessments more accessible without reducing rigour — clarity is equity.

  • Turn evidence into action.
    Analyse patterns in student responses to identify misconceptions, target reteaching, and give feedback that tells students what to do next.

  • Use the formative power of summative tests.
    Even term exams can inform teaching if teachers pause to interpret trends before moving on.

  • Technology and AI can support fairness.
    Smart prompting can help reword complex questions for local contexts while preserving challenge and accuracy.

  • Professional judgement grows with deliberate practice.
    By following the three phases — designing, interpreting, and acting — teachers strengthen their expertise and make assessment a driver of learning, not paperwork.

Transcript

Welcome to the deep dive. This is where we take research, guides, practical tools, and turn them into, well, immediate knowledge for your classroom. And today, we’re talking directly to you, the dedicated teachers across Bangladesh. We know you’re collecting marks almost constantly: quizzes, homework checks, term exams. It’s a lot. It really is. And you end up with stacks of data, right? The challenge we find isn’t getting the marks. It’s turning those raw scores into something genuinely meaningful: insights that actually tell you, okay, what do I need to teach next?

We often say assessment is the bridge, the essential link between teaching something and knowing, really knowing, if learning happened. Exactly. That distinction, collecting data versus generating actual knowledge, that’s what our deep dive is all about today. We’re unpacking the EBTD Guide to Better Assessment. It’s a fantastic free resource, and it’s designed specifically for teachers right here in Bangladesh.

You know, Daisy Christodoulou had this great insight: assessment, done well, turns teaching into knowledge. It’s not about testing more. It’s about testing well. Smarter. Smarter. Precisely. And to help you do that, the guide walks you through three really key phases. Think of it as a sequence. First, writing valid questions. Then, how to interpret the evidence you get from those questions. And finally, the crucial part, taking action based on what you learn.

This whole structure, it’s about strengthening your professional judgement, making assessment less about paperwork and more about impact. Closing that gap, you know, between just marking and really improving learning. Okay, let’s dive into phase one then: designing valid questions. This seems like the foundation. We hear this so often from teachers: that frustration when you’ve taught something, you know they understood it in class, but then, poof, half the class gets the test question wrong. Mhm. That classic feeling of, but they knew this. That’s almost always an alignment problem. Alignment. Okay. So, what does that mean in practice? It’s fundamental. Alignment means the question must test only the specific knowledge or skill you explicitly taught. Nothing extra snuck in there.

The guide introduces these five active ingredients. Think of them like filters. If even one is missing, your assessment data is probably compromised. It’s not giving you a true picture. Five filters. Okay. What are maybe one or two critical ones a teacher should always check, like right before they hit print on that quiz? Good question. Number one, definitely a clear learning objective. Can you state in just one sentence exactly what concept or skill this specific question is checking? If not, it’s too fuzzy. And the second one I’d highlight is direct focus. This means the question tests the concept itself, not something else by accident. Right. What else could it be testing? Well, things like reading speed, maybe complex vocabulary that isn’t part of the science lesson, or recalling some background fact you didn’t actually cover. You want to isolate the skill, right? Separating the skill from just the general difficulty of the task. Got it. Exactly.

And a third one, so important for classrooms here: fair and clear language. Simple wording, avoiding jargon, if possible using examples that make sense in a Bangladeshi context, you know, everyday things. A clear layout too. If it’s accurate, clear, and fair, then you can start to trust the answers you get. Okay, so that’s aligning the question to the lesson, but then there’s the bigger picture, right? How it fits into the whole curriculum. Yes, that’s curriculum progression. If alignment is about hitting the target for today’s lesson, progression is about mapping the whole journey. Assessment should feel like climbing a staircase, step by step, not suddenly hitting a brick wall. A staircase. I like that metaphor. So, how do we build that staircase into our questions? The guide uses a really helpful five-step thinking model. It’s called the RUART framework. R-U-A-R-T. It stands for Recall, Understand, Apply, Reason, and Transfer. RUART. Yeah, most of us are good at recall questions. What is this? Define that. But often we unintentionally jump straight from recall to maybe reason, skipping crucial steps in between.

And this is where that teacher example comes in, right? The science teacher in Dhaka. Exactly. It shows how powerful this framework is in practice. She used RUART to look at her Year 8 quiz on evaporation. And what did she find? Well, she found a pretty big gap. She had some basic recall questions, you know, what is evaporation? And she had a couple of high-level reason questions, but she’d completely missed the apply step. Ah, so asking kids to explain complex stuff before they’d even practised using the idea in a simple situation. Precisely. She was asking them to justify before they could apply. So she restructured it. How did she fix it? She reordered her questions to follow the RUART steps logically. Started with recall: what is evaporation? Moved to understand: why does sunlight make water evaporate faster? Then she added an apply question, a simple calculation about a drying puddle. And then she went to transfer: how might a farmer, say in Rajshahi, use evaporation to cool milk on a hot day? That makes so much more sense. It maps the thinking. Her reflection really struck me: it feels like I’m mapping their thinking, not just marking their answers. And that map tells you exactly where understanding broke down, right? Do I reteach recall or focus on apply? Exactly. It gives you precise diagnostic information.
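
The audit the Dhaka teacher did by hand can be sketched in a few lines of code. This is a minimal illustration of the idea, not anything from the EBTD guide itself; the quiz data and function name are invented for the example.

```python
# The five RUART stages, in the order learning deepens.
RUART_LEVELS = ["recall", "understand", "apply", "reason", "transfer"]

# A hypothetical Year 8 quiz, each question tagged with its RUART level.
quiz = [
    {"q": "What is evaporation?", "level": "recall"},
    {"q": "Why does sunlight make water evaporate faster?", "level": "understand"},
    {"q": "Justify why the puddle dries unevenly on a windy day.", "level": "reason"},
]

def audit(questions):
    """Return the RUART levels a quiz covers and the ones it skips."""
    covered = {item["level"] for item in questions}
    missing = [lvl for lvl in RUART_LEVELS if lvl not in covered]
    return covered, missing

covered, missing = audit(quiz)
print("Missing levels:", missing)  # flags the gap: no apply (or transfer) question
```

Running the audit on this sample quiz flags exactly the gap from the story: recall and reason are present, but the apply step in between is missing.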

Okay. So, if phase one is getting the question design right, aligned and progressed, what’s next? Seems like even a perfect question isn’t useful if some students can’t fairly access it. You’ve nailed the transition. That brings us straight into phase two: fairness and accessibility. This is non-negotiable, especially in Bangladesh with such diverse classrooms. You know, the sources point out something quite stark: sometimes assessments measure advantage long before a student writes a single word. Advantage? How so? It happens because of something called construct-irrelevant difficulty. Let’s call it CID. CID. Okay. Sounds technical. What is it, simply? It’s basically anything that makes a question hard for reasons that have nothing to do with the subject you’re actually trying to test.

So, irrelevant difficulty, like what? It could be, uh, unfamiliar cultural contexts. Imagine a maths problem about, I don’t know, winter heating costs in a Scandinavian climate. That’s irrelevant for most students here. Or overly dense, complex sentences, asking students to wade through paragraphs of text just to find the simple maths problem hidden inside. That tests reading stamina, not maths skills. Right. If a student fails because they couldn’t decode the question, the result is meaningless for judging their maths ability. Exactly. The test becomes invalid for that student. We need to remove that fog, especially in subjects like maths where language can be such a barrier.

Is there an example of clearing that fog? Yes. Mr. Raman’s story, a maths teacher in Rajshahi. It’s perfect. His students struggled with a ratio problem. The original question was something like: a factory produces 450 bolts every 6 hours. If the same rate continues, calculate the total number of bolts produced over a period of one and a half days. Okay, I can see the issue. You’ve got the ratio, the time conversion, plus all that narrative context, right? It’s a heavy cognitive load before you even get to the maths. He realised the reading difficulty was the main barrier, not the ratio concept itself. So, what did he change? He simplified it dramatically. Cut right to the chase: the factory makes 450 bolts in 6 hours. How many bolts in one and a half days? Same numbers, same calculation, same cognitive demand for the maths, but he stripped away all the extra wording. Access became much more equitable. And his reflection: I didn’t lower the bar, I removed the fog around it. I love that. Remove the fog. Yeah, that’s a fantastic way to think about making assessments fair.

Now, the guide offers a tool to help teachers do this systematically: the Assessment MAP 2.0 model. Yes, MAP 2.0. It’s more than just a checklist. It’s a proactive planning tool. MAP: Measure validity, Assure accessibility, and Plan for action. That assure accessibility step is where we focus on spotting and removing those hidden barriers, that fog we talked about. And it even suggests using tools like AI to help with this. How does that work? Yeah, it’s about using technology smartly, as a thinking partner. You could, for instance, take a question you’ve written, especially if it feels a bit complex or uses maybe UK-centric examples, and you prompt an AI tool: rewrite this question using simple, clear English. Use examples familiar to students in Bangladesh. Keep the core concept and difficulty level exactly the same. Ah, okay. So, the AI generates options and the teacher chooses the clearest, fairest version. Exactly. It helps ensure the challenge stays in the thinking required, that RUART level, not in decoding the language or context.
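
The prompt described above can be turned into a reusable template so the same rewrite instructions go with every question. This is just one way to package the idea; the function name and exact wording are my own, not the guide’s, and the output would be sent to whatever AI tool the teacher uses.

```python
def build_rewrite_prompt(question: str) -> str:
    """Wrap a teacher-written question in the rewrite instructions
    from the episode: simplify language, localise context, keep
    the concept and difficulty unchanged."""
    return (
        "Rewrite this question using simple, clear English. "
        "Use examples familiar to students in Bangladesh. "
        "Keep the core concept and difficulty level exactly the same.\n\n"
        f"Question: {question}"
    )

# Example: the bolts question from Mr. Raman's story, before simplification.
prompt = build_rewrite_prompt(
    "A factory produces 450 bolts every 6 hours. If the same rate continues, "
    "calculate the total number of bolts produced over a period of 1.5 days."
)
print(prompt)
```

The teacher still makes the final call: the tool proposes versions, and the clearest, fairest one is chosen by a human who knows the class.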

Which brings us neatly to phase three. We’ve designed a valid, fair question. We’ve collected the answers. Now what? Implementation and action: using that evidence effectively. Because let’s be honest, often those marks just end up in a spreadsheet or a grade book, right? How do we make sure they actually change our teaching? That’s the core of it. We need to be clear about the purpose. Is this assessment for learning, formative, guiding our next steps? Or is it assessment of learning, summative, giving a final grade? But here’s a really powerful idea: the formative use of summative tests. Using a big exam formatively. How does that work? Doesn’t it just mean more marking? Not necessarily more marking, but smarter analysis. You give the term exam, you mark it. But before you just assign the final grade, you pause. You look for patterns. Which questions did most students struggle with? Was it an apply question, a reason question? Were there common errors on question four? That analysis tells you exactly what misconceptions are still lingering. It diagnoses the learning gaps across the class.

Okay, I see. So, you’re not regrading, you’re diagnosing, and that diagnosis saves time later because you’re not just vaguely reteaching everything. Precisely. You can target your next lesson. Okay, 70% struggled with converting units on question four. Let’s spend 15 minutes specifically on that tomorrow. It’s hyper-focused intervention. So, that feedback loop in practice is key. It’s not just mark. It has to be mark, pause, interpret, act. That interpret step is vital, and the act step needs to lead to action-focused feedback. We know good feedback needs to be timely, specific, but crucially action-focused, meaning it tells the student exactly what to do next. Not just good effort or 7/10. Maybe using next-step boxes on their work: next step, review the definition of evaporation; or next step, practise applying the ratio formula to two more problems. Linked back to that RUART ladder. And this all loops back to how we designed the question in the first place, doesn’t it? Choosing the right question type and cognitive demand helps generate the right kind of data for taking action. Absolutely. It’s all connected.
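
The pause-and-look-for-patterns step is easy to automate once marks are in a spreadsheet. Here is a hedged sketch, with invented results for ten students, showing how to flag the questions where most of the class went wrong; the data, names, and the 50% threshold are illustrative assumptions, not from the guide.

```python
# Invented marked-exam data: question -> 1 (correct) / 0 (incorrect) per student.
results = {
    "Q1": [1, 1, 1, 0, 1, 1, 1, 1, 1, 1],
    "Q4": [0, 0, 1, 0, 0, 1, 0, 0, 1, 0],  # the unit-conversion question
}

def struggling_questions(results, threshold=0.5):
    """Return the questions where more than `threshold` of the class erred,
    with their error rates, so reteaching can be targeted."""
    flagged = {}
    for question, marks in results.items():
        error_rate = 1 - sum(marks) / len(marks)
        if error_rate > threshold:
            flagged[question] = error_rate
    return flagged

print(struggling_questions(results))  # only Q4 is flagged, at a 70% error rate
```

That single number, 70% wrong on Q4, is what turns a summative exam into tomorrow’s 15-minute targeted intervention.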

If you use multiple-choice questions, or MCQs, don’t just look at the right answer. Analyse the distractors, the wrong options. Why did students choose B instead of C? That tells you about their specific misunderstanding, right? And if you use a short-answer question, then you’re probably looking for their reasoning. So that question should be deliberately designed to tap into the reason or transfer level of RUART. The whole point is to design questions that give you the precise information you need to decide: what do I teach or reteach next week? This feels like such a practical, powerful way to think about assessment. It really shifts it from judgement to information. When teachers adopt these principles using the EBTD guide, the benefit seems clear. You teach with more clarity. Students build confidence because the tests feel fair. And the whole school can focus on real, measurable progress. Absolutely.
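
The distractor analysis just described can be done with a simple tally. This sketch uses invented responses to one MCQ whose correct answer is C; the point is that the distribution of wrong answers, not the score, is what reveals the misconception.

```python
from collections import Counter

# Hypothetical class responses to a single MCQ. Correct answer: "C".
responses = ["B", "C", "B", "B", "A", "C", "B", "C", "B", "B"]

def distractor_analysis(responses, correct):
    """Tally which wrong options students chose, most popular first."""
    wrong = [choice for choice in responses if choice != correct]
    return Counter(wrong).most_common()

print(distractor_analysis(responses, "C"))
# Nearly every student who erred picked B, which points to one specific
# misunderstanding to address, rather than reteaching the whole topic.
```

If the wrong answers had been spread evenly across A, B, and D, that would suggest guessing instead, a different teaching problem entirely.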

And we want to stress again: this resource, the EBTD Guide to Better Assessment, is completely free. It’s tailored for Bangladeshi classrooms, packed with these practical strategies. You can download it and start using these ideas today. And for educators who feel ready to go deeper, to really build that expertise? Well, for them, there’s the more intensive blended EBTD professional course, Developing Assessment Expertise. This involves six online modules you do at your own pace, plus, and this is crucial, two full days of in-person training right here in Dhaka. It’s designed for the realities of teaching here, you know, large classes, diverse learners. It helps you build that practical skill alongside colleagues.

How can people find out more? The best way is to join the EBTD newsletter. You’ll get updates on the course registration, plus ongoing research-backed tips and free tools you can use straight away in your classroom. Excellent. Okay. So, if assessment is really about understanding knowledge, not just assigning marks, here’s a final thought for you listening right now, to take away this week. Look at one assessment you currently use. Just one. First, try to identify its main RUART level. Is it recall, understand, apply? Then ask yourself: what’s one small change I could make to this question? Maybe simplify the language, change the context, add a step, to make absolutely sure it’s testing their understanding of the concept and not, say, just their reading speed or background knowledge. What’s one tweak for truer insight?
