
How to Evidence Curriculum Impact in Primary Schools

David Hunter

The problem with curriculum evidence

Every primary school leader has been asked some version of this question: "How do you know your curriculum is having an impact?"

It comes from Ofsted inspectors during deep dives. From governors in committee meetings. From trust boards reviewing school improvement plans. And the honest answer - "I can see it in the classrooms" - doesn't survive a scrutiny panel.

The trouble is that most schools respond by creating paper. Assessment spreadsheets. Monitoring folders. Learning walk pro formas. Evidence files that take hours to compile and minutes to forget about. The result is a bureaucratic shadow curriculum that exists to prove things are happening rather than to make things happen.

This is the wrong approach. Not because evidence doesn't matter, but because the evidence should be a byproduct of teaching, not a product in itself.

What inspectors actually look for

The Ofsted Education Inspection Framework (2019, updated 2024) is clear about what "curriculum impact" means. It isn't test scores alone. Quality of Education - the single most heavily weighted judgement - looks at three things:

  1. Intent - what you're teaching and why
  2. Implementation - how it's being taught in classrooms
  3. Impact - what pupils know, remember, and can do

The crucial shift that many schools miss: impact is assessed primarily through lesson visits and pupil conversations, not through data spreadsheets. An inspector will listen to a Year 4 child read, ask them about their favourite book, test whether they can explain an inference strategy, and check whether they can use vocabulary from last term's science topic. That conversation tells them more about curriculum impact than any tracking grid.

Amanda Spielman, then Chief Inspector, made this explicit in her 2018 commentary:

"I want to see a curriculum that is broad, knowledge-rich, and carefully sequenced - and I want to see evidence of that in pupils, not just in paperwork."

The implication for schools: the best evidence of curriculum impact is visible in the children, not in the filing cabinet.

A practical evidence framework

Here's a framework that produces genuine evidence without generating busywork. It works because it's built on things schools are already doing - just captured more deliberately.

1. Curriculum mapping: show the sequence

Every school has a curriculum map. Most sit in a shared folder and get updated once a year. The evidence opportunity is making the sequence visible and explicit:

  • Long-term plan showing how knowledge builds across year groups (not just topic titles - the actual knowledge and skills that accumulate)
  • Subject progression documents showing how concepts deepen from Year 1 to Year 6
  • Cross-curricular connections mapped deliberately, not by accident

This is intent evidence. It answers: "Do you know what you're teaching and why?" The important thing is that it's genuine - it should reflect what's actually happening, not an aspirational version.

What inspectors notice: A curriculum map that clearly shows how Year 3 science builds on Year 2 science, and how the vocabulary introduced in geography appears again in writing. That's evidence of a sequenced curriculum. A topic web with disconnected boxes is not.

2. Domain coverage: track breadth, not depth

In reading, maths, and writing, schools need to demonstrate that their curriculum covers the full breadth of the national curriculum - not just the bits that are easy to teach or test.

For reading comprehension, this means tracking which domains (retrieval, inference, vocabulary, summarising, etc.) children encounter across the year. Not scoring them - tracking exposure.

The EEF's Improving Literacy in KS2 guidance (2021) recommends:

"Schools should ensure that comprehension instruction covers all key reading skills systematically, rather than focusing disproportionately on retrieval and inference."

A simple coverage tracker - which domains have been taught, which texts were used, which year groups have gaps - is more powerful evidence than any set of test scores. It proves breadth. It proves deliberate planning. And it takes almost no additional teacher time if it's built into the planning cycle rather than bolted on afterwards.

3. Pupil voice: capture the informal

The most compelling evidence of curriculum impact is what children say. Not in formal interviews - in everyday classroom moments:

  • A Year 5 child using the word "photosynthesis" unprompted in a writing lesson because they learned it in science last half-term
  • A Year 3 child explaining that they used retrieval to find the answer because the information was stated directly in the text
  • A Year 6 child connecting a theme in their class novel to something they studied in history

Hattie's visible learning research consistently ranks "teacher clarity" well above the average effect size for classroom influences - and one of the strongest indicators of clarity is when pupils can articulate what they're learning and why.

Capturing this doesn't require a pro forma. A half-termly note from each year group - three examples of pupils demonstrating curriculum knowledge in unexpected contexts - builds a portfolio of genuine impact evidence over the year.

4. Work scrutiny: look at the books

Book scrutiny is already standard practice. The evidence opportunity is looking for the right things:

  • Progression within a unit - can you see knowledge building across a sequence of lessons?
  • Vocabulary in use - are the key terms from the curriculum appearing in children's writing?
  • Retrieval practice - is there evidence of children recalling and applying prior learning?
  • Adaptive teaching - can you see scaffolding for pupils who need it, and challenge for those who don't?

The EEF's guidance on structured approaches to literacy (2021) found that structured teaching approaches improve outcomes by an average of 5 months' additional progress compared to unstructured teaching. Book scrutiny should look for evidence of structure - not just neat presentation.

5. Assessment that serves teaching

Assessment should generate evidence of impact as a side effect of informing teaching. The problem with many school assessment systems is that they produce data for data's sake - league tables, colour-coded spreadsheets, termly grids - without changing what happens in classrooms.

Effective assessment evidence looks like:

  • Formative assessment in lessons - teachers adjusting based on what they see (not waiting for a summative test to tell them)
  • Low-stakes retrieval - regular recall practice that shows whether knowledge is sticking (Rosenshine's Principles of Instruction, 2012)
  • Diagnostic assessment - identifying specific gaps and responding to them, not just recording a score

Rosenshine (2012) identified daily review and weekly/monthly review as two of his ten principles of effective instruction, grounded in cognitive science and studies of the most effective teachers. Schools that build retrieval into their daily routine have the strongest evidence of knowledge retention - because the evidence is the practice.

What to stop doing

Evidence of curriculum impact is as much about what you remove as what you add. Schools that produce the strongest evidence tend to have fewer monitoring activities, not more - but the ones they keep are more focused.

Consider stopping:

  • Termly data drops that nobody acts on. If the data doesn't change teaching within a week, it's not formative assessment - it's administration.
  • Learning walks with tick-box pro formas. A focused 10-minute visit looking for one specific thing (e.g., "Are pupils using the vocabulary from this unit?") tells you more than a 20-box checklist.
  • Evidence files compiled for inspectors. Ofsted explicitly says they don't want schools to produce documentation specifically for inspection. If you wouldn't use it yourself, don't create it.

The one-page test

Here's a useful discipline: can you explain your curriculum's impact on one page?

Not a data dump. Not a monitoring calendar. One page that shows:

  1. What you teach and why (intent - the curriculum map summary)
  2. How you know it's being taught well (implementation - your monitoring focus for this term)
  3. What difference it's making to pupils (impact - three or four concrete examples from this half-term)

If you can do that clearly and concisely, you have strong evidence of curriculum impact. If you can't, adding more spreadsheets won't help.

Making it sustainable

The hardest part of curriculum evidence isn't producing it - it's sustaining it. The framework above works because it's built on what schools already do:

  • Curriculum maps already exist - make the sequence explicit
  • Teaching already happens - track domain coverage as part of planning
  • Children already talk - capture what they say
  • Books already get looked at - look for the right things
  • Assessment already runs - make it serve teaching first

The evidence emerges from the curriculum, not from monitoring the curriculum. That's the difference between a school that can confidently answer "How do you know?" and one that reaches for a folder.


David Hunter is the founder of Kickstone Learning, building research-informed tools for UK primary schools. Before founding Kickstone, he spent over a decade as a primary school teacher and subject leader in Greater Manchester.

Kickstone's tools - including KS Fluency for daily maths practice and DISCOVER for structured KS2 reading - are designed to build evidence of curriculum impact into everyday teaching, not as an additional workload.