Twelve words per minute in four months: what measuring fluency changed at our school
Eighteen months ago we were celebrating a phonics result. Ninety-four per cent of our Year 1 children had passed the phonics screening check. Our systematic synthetic phonics teaching was working. The data proved it.
But we noticed something. Children who had passed phonics brilliantly in Year 1 were stalling on harder texts in Year 2. Not because they could not decode the words. Because they could not read them quickly enough to make sense of a sentence.
Our children could decode. They were not yet reading.
So we started assessing fluency. Every child in Year 2. Every term. Words correct per minute, accuracy, prosody. Gareth McGovern, our Year 2 lead and assistant headteacher, built the programme around Rasinski's approach to classroom fluency practice: choral reading as the daily routine, poetry and rhyme as the vehicle, and quick termly assessments to catch children before they stalled. Nothing we did was expensive. Nothing we did required a new scheme.
This week I pulled the data out to see what a term and a half of systematic fluency teaching had actually produced.
The headline
Thirty-two Year 2 children have two or more fluency assessments on record since December. Their average words correct per minute has moved from sixty-six to seventy-eight. That is a gain of twelve WCPM in four months.
For context, FFT Education Datalab's analysis of over 340,000 assessments across nearly 700 English schools puts the end-of-Year-2 median at seventy-three WCPM. Our current Year 2 mean is above that, with a term still to run. A mid-year mean and an end-of-year median are not strictly comparable, so I will avoid dressing this up as outperformance. The honest framing is that our children are tracking ahead of where we would have expected them to be if nothing had changed.
The quieter story
Accuracy across the same thirty-two children has moved from 96.9 per cent to 98.5 per cent.
If you have spent time inside fluency research, you will know why that number matters. Rasinski describes the real goal of phonics instruction as getting children to the point where they no longer have to use phonics much when they read. At 98.5 per cent accuracy, our children have largely crossed that threshold. Their working memory is no longer consumed by decoding. It is free for comprehension.
This is what the FFT report was pointing at when it found a correlation of 0.68 between oral reading fluency and end-of-Key-Stage-2 reading comprehension. Accurate, automatic word recognition is the bridge. For the thirty-two children in this sample, the bridge is now carrying their weight.
The children we worried about most
The children who started lowest have made the biggest gains.
Five children in the group started below thirty WCPM in December. By April they had gained an average of twenty-two WCPM each. Three children on SEN Support started at an average of twenty-eight WCPM. They have gained twenty-nine each.
These are small numbers and I am not going to dress them up as a statistical finding. Sample sizes of five and three do not prove a pattern. But the direction is the one Gareth and the Year 2 team hoped for when they started this work, and it is the direction Rasinski's research predicts. Repeated reading of appropriately levelled text, modelled first by a fluent reader and then practised in the safety of a group, closes the gap. It does not widen it.
Our highest-starting children, the nine who were already reading above ninety WCPM in December, have gained an average of six WCPM. That is slower growth, and that is exactly right. They are approaching the natural ceiling of oral reading speed for age-appropriate text. The point at that level is no longer speed. It is expression, comprehension, and a reading diet that keeps them growing as thinkers.
What changed at our school
Three things, and none of them are dramatic.
We stopped treating fluency as something that would happen naturally once phonics was secure. It does not, for many children. It has to be taught.
We made the texts easier, not harder. The temptation with a Year 2 child reading below benchmark is to push them into age-appropriate books and hope the stretch pulls them through. It does not. It pushes them back into effortful decoding. We used short, rhythmic, predictable texts, mostly poetry, and practised them until fluency emerged. Then we moved on.
We measured. Two minutes per child, once a term. No stopwatch held at arm's length. No child reading under pressure. Just a passage, a timer on the desk, and a note of the words read correctly. The measurement did not drive the teaching. The teaching drove itself. The measurement just told us whether the teaching was working.
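For colleagues who keep these records in a spreadsheet or a short script, the arithmetic behind each check is simple. A minimal sketch, with illustrative figures rather than our actual records:

```python
def fluency_scores(words_attempted: int, errors: int, seconds: int):
    """Return (words correct per minute, accuracy %) for one timed read.

    Illustrative arithmetic only: words read minus errors gives words
    correct; divide by minutes elapsed for WCPM.
    """
    words_correct = words_attempted - errors
    wcpm = words_correct / (seconds / 60)
    accuracy = 100 * words_correct / words_attempted
    return round(wcpm, 1), round(accuracy, 1)

# A child reads 130 words in two minutes with 2 errors:
wcpm, accuracy = fluency_scores(130, 2, 120)
print(wcpm, accuracy)  # 64.0 WCPM at 98.5 per cent accuracy
```

The same two numbers, logged once a term per child, are all the data behind every figure in this post.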
What we have not yet worked out
Plenty.
We do not yet know how the gains will hold over the summer holidays, when the daily practice stops. We do not yet know how to measure prosody reliably across a team of teachers without it becoming subjective. And we are still refining the poetry selections, because some poems work in the classroom and others, on paper just as suitable, do not.
This post is a four-month snapshot from a single infant school. It is not a research finding. It is a note from a team that is still learning.
For colleagues reading this
If you are a headteacher, an English lead, a SENCO, or a Year 2 teacher reading this because you recognise the pattern, two things.
Fluency assessment and fluency teaching are genuinely simple. They do not require a new scheme, a new budget line, or a new member of staff. They require a decision to take fluency as seriously as you already take phonics, and a team willing to measure what they are doing.
And the gains show up quickly. We started this in earnest at the start of the spring term. Four months later, the youngest children in our school are reading with fluency that would have seemed ambitious when we began. If a small infant school in Surrey can do this, so can yours.
Fluency is not a mystery. It is the bridge between decoding and comprehension. Children cross it with practice, with good texts, with a fluent model to follow, and with someone paying attention to how they are getting on.
Ours are crossing it. Gareth and the team will keep you posted.
References
FFT Education Datalab (2024). Measuring reading fluency during primary education. ffteducationdatalab.org.uk
FFT Education Datalab (2024). Low oral reading fluency at the end of Key Stage 1. ffteducationdatalab.org.uk
Rasinski, T.V. (2010). The Fluent Reader. Scholastic.
Rasinski, T.V. & Paige, D.D. (2014). Reading fluency: What it is, how to assess it, and how to teach it. Reading Research Quarterly.