Assessment Built on Research - not Assumptions
By Dr. Claudia Rademaker, Co-founder of Dugga Assessment
We digitize our exams. We move paper to screens. We automate grading and streamline scheduling, and then we call it progress. But here is the question I keep coming back to: do we actually improve how students learn?
Across education systems worldwide, schools have invested heavily in devices, connectivity, and platforms. The first wave of digital assessment solved an obvious problem: getting rid of paper. But the harder question, whether digital tools actually improve the way students learn and the quality of feedback they receive, has largely gone unasked. The technology changed. The pedagogy often did not. And I believe the reason lies not in whether we assess digitally, but in how we assess digitally.
The Logistics Trap
When I look at the assessment solutions offered today, I see a pattern. Most of them share the same DNA: they were built by engineers to solve operational problems. Scheduling. Distribution. Grading at scale. These are real problems and solving them matters. But they are the school's problems, not the student's problems.
Where is the pedagogical research behind the question design? Where is the evidence that the assessment format actually measures what it claims to measure? Where is the inclusivity, not as a compliance checkbox but as a design principle woven into the architecture from day one?
When we choose platforms designed primarily around logistics, we unconsciously signal something to our entire education system: that the purpose of assessment is to process students efficiently. Not to understand how they learn. Not to give them meaningful feedback. Not to ensure that every student, regardless of ability or background, gets a fair chance to show what they know.
That signal has consequences, and I think we can do better.
Why we built Dugga differently
Dugga did not start in a startup garage. It started in classrooms, lecture halls and research labs at Stockholm University, Stockholm School of Economics, Umeå University, HAN University of Applied Sciences and Stics Research.
We did not begin by asking "How do we build an exam platform?" We began by asking "What does the science say about how assessment can actually drive learning?" Then we built backwards from the evidence.
Every feature in Dugga was co-created with teachers, students and administrators. Every design choice was tested in real classrooms before it shipped. This is not a product built by engineers and validated by a sales team. It is a product built on the conviction that assessment, when done right, is one of the most powerful tools we have to improve education.
Our mission is straightforward: we believe we can dramatically improve quality and equality in education by transforming learning assessment. That is not a tagline. It is the research hypothesis we set out to prove, and it guides every decision we make.
Three things we believe set Dugga apart
Pedagogy first, not Features first
There is a difference between asking "What question types can we add?" and asking "What question types does the research show improve retention, understanding, and fair evaluation?"
Dugga is built around the principle of constructive alignment, matching what you assess to what you actually taught and what students should be able to demonstrate. That means rich, subject-specific assessment formats: mathematics with formula editors, languages with audio recording, sciences with interactive diagrams. A one-size-fits-all text box is not a neutral choice; it is a pedagogical compromise that disadvantages students whose strengths lie outside written text.
Inclusion by Architecture - not by Add-on
Most platforms treat accessibility as something you bolt on after the fact: a plugin here, a license there, an accommodation workflow that requires three extra steps from an already overworked teacher.
We designed Dugga the other way around. WCAG-compliant high-contrast mode, Immersive Reader, ReadSpeaker, Eye Able, spell-checking, speech-to-text: these are not premium features. They are built into the platform, available to any student, configurable per individual from the admin panel. No extra licenses.
Even the way we handle extra time reflects this philosophy. Teachers can assign individual time accommodations per student with clear, mutually exclusive options that prevent configuration errors. It is a small design detail, but for a student who depends on that extra time, it is everything.
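For readers who like to see a principle rather than read about it, here is a minimal sketch in TypeScript of how mutually exclusive accommodation options can be modelled so that contradictory configurations cannot even be expressed. The types and names are hypothetical illustrations under my own assumptions, not Dugga's actual data model.

```typescript
// Hypothetical sketch: one student, exactly one time accommodation.
// A discriminated union means contradictory combinations (e.g. both
// fixed extra minutes and "untimed") are unrepresentable by design.
type TimeAccommodation =
  | { kind: "none" }                          // standard exam duration
  | { kind: "extraMinutes"; minutes: number } // fixed additional time
  | { kind: "extraPercent"; percent: number } // proportional extension
  | { kind: "untimed" };                      // no time limit at all

interface StudentExamSettings {
  studentId: string;
  accommodation: TimeAccommodation; // one field, one choice
}

// Computing the effective duration is then a single exhaustive switch.
function effectiveMinutes(base: number, a: TimeAccommodation): number | null {
  switch (a.kind) {
    case "none":         return base;
    case "extraMinutes": return base + a.minutes;
    case "extraPercent": return Math.round(base * (1 + a.percent / 100));
    case "untimed":      return null; // null means no limit
  }
}
```

The point of the pattern is that correctness lives in the data model itself, not in a warning dialog after a teacher has already made a mistake.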
Inclusion is not just an ethical imperative. Across the Nordics, it is a legal requirement. Dugga makes compliance effortless so that school leaders can focus on what actually matters: the student in front of them.
AI that serves Teachers - not replaces them
I understand why AI in assessment makes educators nervous. Will it grade unfairly? Will it flatten nuance? Will it quietly replace the professional judgment that teachers have spent years developing?
At Dugga, we designed our AI features as teacher tools, not teacher replacements. AI can suggest, draft, and flag, but the teacher always decides. AI-assisted grading suggestions come with explanations. Teachers can accept, modify, or reject them. The audit trail is complete and transparent.
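In the same spirit, here is a minimal hypothetical sketch of what "the teacher always decides" can look like structurally: the AI suggestion and the teacher's decision are separate records, and the recorded grade is always derived from the latter. Again, these types are illustrative assumptions, not Dugga's API.

```typescript
// Hypothetical sketch: an AI suggestion is never a grade. It becomes
// one only through an explicit teacher decision, and both records are
// kept together for the audit trail.
interface AiSuggestion {
  answerId: string;
  suggestedScore: number;
  explanation: string; // why the model proposed this score
}

type TeacherDecision =
  | { action: "accept" }
  | { action: "modify"; finalScore: number; note: string }
  | { action: "reject"; note: string };

interface AuditEntry {
  suggestion: AiSuggestion;
  decision: TeacherDecision;
  teacherId: string;
  decidedAt: string; // ISO timestamp
}

// The recorded grade always reflects the teacher's choice.
function finalScore(entry: AuditEntry): number | null {
  switch (entry.decision.action) {
    case "accept": return entry.suggestion.suggestedScore;
    case "modify": return entry.decision.finalScore;
    case "reject": return null; // the teacher grades manually instead
  }
}
```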
With the Nordics Directorate for Education emphasizing responsible AI adoption, this approach is not just philosophically sound; it is aligned with where national policy is heading. We would rather build AI that teachers trust than AI that impresses a demo audience.
The Choice ahead
Every school leader reading this will make an assessment platform decision in the next few years. On a feature comparison spreadsheet, the options may look similar. But the foundation beneath those features is not the same.
There is a fundamental difference between a platform built to run exams and a platform built to improve learning through assessment. That difference starts in the research. It shows up in the design. And it matters in the outcomes your students experience.
We built Dugga because we believe assessment deserves better than the status quo. We built it with researchers, with teachers, and with the students whose futures depend on getting this right. If that resonates with what you are building in your school, I would love to hear from you.
Because in the end, you are not choosing software. You are choosing what you believe assessment is for.
/CR