Schools Week has just published an article from Ayesha Baloch, a senior policy adviser at Impetus, urging us to do much more about oracy. Impetus are a funding body, set up by venture capitalists to close the disadvantage gap.
So far, so Bill Gates.
They believe in making research-backed bets. This is what I do too - I train school leaders in how to apply the findings of research at scale, and then how to measure the results to check whether it is working.
You might know by now that Research Schools add almost nothing extra to their students’ progress, year on year. This is because we are very, very bad at scaling up ideas.
The Research Schools Network is so bad at scaling up research ideas that the word “progress” is not even in their mission statement. There are no progress measures on their website either.
They have chosen not to list the research schools on their website. To find out the P8 of their schools, you have to look them up individually, and then track how much they improve year on year on FFT Schools Like Yours. Trust me, it’s a pain.
My Life as a Tax Inspector
My formative years, when I was young and green and full of vim, were spent investigating businesses. An investigation started with data - a set of accounts. These are the statistics which contain lies and damned lies. I was brilliant at spotting them - it was very easy to keep score in thousands or millions of pounds, so I am not being rose-tinted about this. Either you made an impact, or you didn’t. It was there in black and white, and in the coffers of HMRC.
The way to spot the lies is to try a series of thought experiments. If X is true in the accounts, what would that look like in the business? Who would do what, and when? What would have to happen on a daily basis for X to be the size it is? Is that likely?
These are not great leaps of the imagination, but in schools we often fall at the first hurdle. We just don’t seem to be wired that way as teachers.
I am like this with meditation - I can’t even count my breath for beats of 4, hold for 4, release for 4 - it is all too much for my little brain - but I seem to be made for thought experiments.
My experience of clawing back millions in hidden profits is relevant to teaching in this way:
All research is a thought experiment. What gets written up is just an account of what we think is true (or what we want you to accept is true), like a balance sheet.
So, is X true?
Impetus and the EEF
Impetus say yes, the success of Oracy is true. ‘Check out the EEF,’ they say.
The EEF has found that Oral Language Interventions work. In secondary schools, if carried out over a year, they would provide an extra 5 months’ progress in reading, or 1 month in science and maths.
Let’s unpack what this means.
Evidence suggests that Oral language interventions that explicitly aim to develop spoken vocabulary work best when they are related to current content being studied in school, and when they involve active and meaningful use of any new vocabulary.
Some examples of approaches that have been shown to be effective include:
encouraging pupils to read aloud and then have conversations about book content with teachers and peers
modelling inference through the use of structured questioning
group or paired work that allow pupils to share thought processes
implicit and explicit activities that extend pupils
With any of these activities, it is crucial to ensure that oral language activities are linked to the wider curriculum (e.g., using oral language activities to model technical language in science).
Oral language interventions can be delivered intensively over the course of a few weeks, but may also be developed over the course of an academic year.
Frequent sessions (3 times a week or more) over a sustained period (half a term to a term) appear to be most successful.
Thought experiment. Why did it work with reading, but not so much with maths and science? Those subjects are full of ideas, and ideas can be talked about. If talking about books works, then talking about maths and science should work too, shouldn’t it?
But it didn’t. So, X is not the whole truth. The whole truth might include this - the students who made progress in reading were reading first. Then they were talking.
They were doing much more reading than in their previous curriculum, because they had at least 3 (probably 30-minute-long) sessions per week. So, it isn’t the magic of oracy that led to this; it is the mundane reveal that they spent an extra 90 minutes a week doing more reading - minus the time spent talking about it.
How would we test this theory?
Well, what do we know about more reading? You might have read a study on this, or perhaps David Didau’s summary of that study, in which teachers simply read challenging novels to their classes, with much less interruption for, er, oracy.
This is what the study found:
20 English teachers in the South of England changed their current practice to read two whole challenging novels at a faster pace than usual in 12 weeks with their average and poorer readers aged 12–13.
Ten teachers received additional training in teaching comprehension.
Students in both groups made 8.5 months’ mean progress on standardised tests of reading comprehension, and the poorer readers made a surprising 16 months’ progress, with no difference made by the training programme.
They didn’t make 5 months’ extra progress per year, but 8.5 months’ progress in just 3 months!
Those teachers who had been trained to do a lot of oracy style comprehension made no extra progress with their classes. The benefit seems to have come entirely from reading the texts quickly and being exposed to complex vocabulary and sentences again and again.
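A rough back-of-the-envelope sketch makes the gap vivid. The figures come from the EEF toolkit and the study quoted above; the baseline of one month’s progress per calendar month of schooling is my simplifying assumption:

```python
# Back-of-envelope comparison of progress rates.
# Figures: EEF oral language toolkit (+5 months over a 12-month year)
# and the fast-reading study (8.5 months' progress, 16 for poorer
# readers, in 12 weeks ~ 3 months).
# Assumption: 'expected' progress is 1 month per calendar month.

def extra_months_per_month(total_progress, duration_months):
    """Progress gained beyond the expected baseline, per month of teaching."""
    expected = duration_months  # 1 month of progress per month
    return (total_progress - expected) / duration_months

eef_oral_language = extra_months_per_month(12 + 5, 12)  # ~0.42
just_reading_avg  = extra_months_per_month(8.5, 3)      # ~1.83
just_reading_poor = extra_months_per_month(16, 3)       # ~4.33

print(f"EEF oral language interventions: +{eef_oral_language:.2f} months/month")
print(f"Fast reading (average readers):  +{just_reading_avg:.2f} months/month")
print(f"Fast reading (poorer readers):   +{just_reading_poor:.2f} months/month")
```

On that crude measure, reading the novels quickly delivered roughly four times the rate of gain of the toolkit’s oral language figure - and ten times for the poorer readers.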
Impetus and Voice 21
Impetus are delighted at the progress made by Voice 21. But their new data is sketchy. Some students made a lot of progress. But there is no data on the cohort as a whole. They simply tell us that 29% of their cohort have scores above average, where nationally this figure is 25%.
Why only publish the figures for the above-average students? Could it be that the remaining 71% of students made so little progress that the overall finding was 0 months of progress, or worse, negative progress?
We can’t possibly know, as none of the data is published.
Not to worry though, EEF have analysed not one, but two pilots of Voice 21.
This is what they found:
Although teachers were not confident that the observed improvements to oracy skills would have an immediate impact on attainment, some felt that there could be longer-term academic benefits.
This pilot did not collect any quantitative data on academic outcomes.
Thought experiment: if the EEF wanted to prove that Voice 21 worked, they would have made sure that they could gather enough data to measure it.
They chose not to measure this. Instead, they trusted Voice 21 to set up their own measure of impact:
The Voice 21 oracy assessment measure used in the pilot did not provide sufficiently reliable data. A revised or alternative impact measure would be needed for a trial.
Did Voice 21 have time to get assessment right? Yes, 2 years and 8 months.
Maybe Voice 21 Have Cracked it Now?
Thought experiment: is it likely that Voice 21 have got assessment right now? On the plus side, the last pilot ended in 2018, so they have had plenty of time. On the minus side, EEF has funded no further trial. It is highly likely that invalid assessment is the problem.
Maybe the data is there, they just haven’t presented it in their report on Voicing Vocabulary.
Thought experiment: how likely is it that planning which vocabulary to introduce, and how to introduce it, across multiple teachers in multiple subjects, will lead to huge strides in reading, compared to something else the schools are doing?
How likely is it that talking about and with that vocabulary would get students through enough of the vocabulary needed to improve reading scores so much?
If a student needs to acquire 2000 words per year, how likely is it that Voicing Vocabulary is a fast way to do this, compared to, say, reading?
It just isn’t.
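A quick plausibility check shows why. The 2000-words-a-year target is from the paragraph above; the per-session word count and session schedule below are my illustrative assumptions, not Voice 21 figures:

```python
# Plausibility check: can explicit vocabulary sessions cover
# ~2000 new words a year? The per-session figures are illustrative
# assumptions, not Voice 21 data.

words_needed_per_year = 2000
sessions_per_week = 3       # the EEF's 'frequent sessions' pattern
school_weeks = 39           # a typical English school year
words_per_session = 8       # a generous assumption for explicit teaching

words_taught = sessions_per_week * school_weeks * words_per_session
shortfall = words_needed_per_year - words_taught

print(f"Words explicitly taught: {words_taught}")  # 936
print(f"Shortfall vs 2000 target: {shortfall}")    # 1064
```

Even on generous assumptions, explicit teaching covers under half the target; wide reading is the only plausible route to the rest.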
Who is most likely to read more and so get much better at reading? Good readers. The top 25%, in other words. We would not be surprised if lots of sessions talking about books made good readers want to read more. And so these are the students who boosted their scores.
Impetus - if you want to invest in closing the disadvantage gap, fund ways to get students to read. Voice 21 is not the answer. Oracy is not the answer. Oracy is to reading as walking is to running.
‘Just do it’, Impetus.
If you are in a school - read!
It would seem that oracy is the new ‘in’ thing in education. We have oracy planned for our whole-school CPD in the coming months.
It is hard not to agree that increasing reading ability should be the priority, and a better way to improve outcomes for students.
Very interesting, Dominic. I definitely agree that reading is best. However, I remember the halcyon days before the “new” GCSEs, when we would often do lots of speaking and listening based activities in lessons, including discussion and debate - when speaking and listening actually counted towards the GCSE grade instead of one activity being shoehorned in!