Termly Exclusion Data

December 3, 2022

Some new data about exclusions has been published, and it reveals how misleading a previous data release was.

On November 24th, the DfE released an avalanche of updates. I counted over 40 items on the DfE news feed. Among them was a release of new information about exclusions and suspensions. This was data for the autumn term of the 2021-22 academic year. Previously, exclusion and suspension data had only been released annually.

Before the pandemic, exclusion and suspension figures were not broken down by terms at all. It was when the data for 2019-20 were released that, suddenly, termly statistics were included. The lockdown that affected part of the spring term, and all of the summer term, had seen exclusions fall dramatically. A warning was given that:

While permanent exclusions and suspensions were still possible throughout the academic year, school closures have had a substantial effect on the number of permanent exclusions and suspension [sic] and therefore caution should be taken when comparing figures across years.

Then the termly figures were given, and comparisons were made for the first time between autumn term 2018 and autumn term 2019.

Looking across terms, there were 3,200 permanent exclusions in Autumn term 2019/20. This is a 5% increase on the same period in 2018/19 (from 3,000). Across school types, permanent exclusions

  • increased by 20% in primary schools (77 permanent exclusions)
  • increased by 3% in secondary schools (77 permanent exclusions)
  • were stable in special schools

One of the most enduring features of reporting and commentary on exclusions is the focus on rises, which are presented as remorseless: ongoing, and affecting every subgroup of the population. The story of 2019-20 should have been that permanent exclusions, which had already stopped rising the previous year, were incredibly low because of lockdown.

Instead, the publication of the autumn term figures enabled claims that exclusions were on an upward trend, temporarily interrupted by lockdown.

But were those reported rises in the autumn actually good evidence of an underlying upward trend in exclusions? The termly data released with the 2019-20 figures covered the two-year period from Autumn 2018 to Summer 2020, i.e. a period of six terms. Because lockdown affected the last two of these terms, the only term-on-term comparison not affected by lockdown was between Autumn 2018 and Autumn 2019. And so, for many, the story became what was quoted above: permanent exclusions were up 5% and exclusions from primary schools were up 20%. The narrative of rising exclusions continued, because for many these felt like dramatic increases.

A more statistically literate and less sensationalist press would have realised that you cannot identify a trend from two data points. The wonderful BBC programme More or Less frequently focuses its statistical reporting on the question “Is that a big number?”. With only two data points, it was impossible to tell whether 5% and 20% were big numbers showing an important new development, or whether they were the kind of variation that will occur most years.

This is where the new release comes in. It included termly figures going back to the 2016-17 academic year and forward to Autumn 2021. Did a 5% increase from Autumn 2018 to Autumn 2019 actually show a worrying upward trend? Not really, as can be seen from looking at the full data set. The highly publicised 5% increase is in red, and was actually the smallest autumn-to-autumn change to be found in the figures.

How about that change in primary school exclusions? Surely a 20% increase must be a big concern that should be investigated? Not when we have the context. Again, it was actually the smallest autumn-to-autumn change to be found in the data.
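The “is that a big number?” question is easy to make concrete. As a minimal sketch (using entirely hypothetical autumn-term counts, not the DfE’s actual figures), computing every autumn-to-autumn percentage change shows why a single comparison cannot tell you whether a 5% rise is unusual:

```python
# Illustrative sketch only: these autumn-term exclusion counts are
# HYPOTHETICAL, not the DfE's published figures. The point is that one
# year-on-year change means little without the surrounding variation.
autumn_exclusions = {
    2016: 2900,
    2017: 3100,
    2018: 3000,
    2019: 3200,
}

years = sorted(autumn_exclusions)
for prev, curr in zip(years, years[1:]):
    change = (autumn_exclusions[curr] - autumn_exclusions[prev]) / autumn_exclusions[prev]
    print(f"Autumn {prev} -> Autumn {curr}: {change:+.1%}")
```

With a made-up series like this, year-to-year swings of 3-7% in either direction are routine, so a 5% rise would tell you nothing by itself; only the full run of termly figures can show whether it stands out.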

It’s quite an impressive feat for a government department to ensure a fall is reported as a rise. The DfE was reporting statistics that showed a massive fall in exclusions. However, by slicing up the data, highlighting unnecessary comparisons, and holding on to the data needed to interpret it for another 16 months, they managed to create a narrative about increases in exclusions. The genuinely dramatic fall in exclusions was reported with comments like this in Schools Week:

Kiran Gill, founder of The Difference, said: “The figures for the autumn term reveal the real story on exclusions in this country – this is the data that we can rely on before pandemic lockdown measures hit. This is a social justice issue.

“Yet again the most vulnerable children are the most likely to fall out of education, such as those with special educational needs, social service interaction and living in increasing child poverty. 

“No one should rest assured that exclusions are declining, quite the opposite. Instead, children are being permanently excluded in greater numbers at younger ages. This should sound alarm bells.”

And in less honest publications the commentary became the headlines:

I expect anti-exclusion propagandists like Kiran Gill and the Guardian to come out with this rubbish. Is it too much to ask that the DfE doesn’t help them by cherry-picking misleading statistics for them?

More generally, the addition of termly data that hadn’t existed before ignores the principle that government statistics should primarily be about releasing accurate factual information on what has happened. If lockdown reduced exclusions, then the most important thing for the figures to show is that lockdown reduced exclusions. Analysis trying to establish that there is an underlying trend that lockdown interrupted should be secondary to that.

There’s enough consistency in the number of exclusions from year to year to suggest that the level of exclusions is not randomly distributed (although some of the figures reported for subgroups of exclusions are somewhat “noisy” and may show random variation). However, exclusions do seem to have changed enough over the years for us to be sceptical of the idea that changes are always driven by long-term trends rather than specific events. Perhaps the rise after 2012-13 was caused by changes to the appeal system that signalled the government was prepared to accept a rise in exclusion rates, or by OFSTED’s subsequent crackdown on “off-rolling” (removing pupils from a school’s roll without a permanent exclusion). Perhaps the stability in exclusions from 2016 to 2019 was a result of widely reported concern and campaigning about rising exclusions. If so (and I’m not claiming to know for sure about the effects of these factors), then the distinction between changes in exclusions driven by underlying trends and changes driven by specific events is less clear than the reaction to the lockdown figures suggests.

Lockdown was not an interruption to a trend. It was a policy decision that, like other policy decisions, affected exclusions. It might have been more dramatic and more easily identified, but that’s a difference in degree, not in kind. Researchers are, of course, free to look for longer-term trends and patterns, but the belief that these, rather than how the figures respond to specific events, are what matters is only an opinion. By slicing up the data in a new way for the convenience of those hoping to find longer-term trends, the DfE statisticians were privileging the opinion that there was an underlying trend, rather than reporting the facts as they had in previous years. That’s not to say the released information should have been hidden; I would assume researchers could have obtained it by making a Freedom of Information request. But by releasing it in the way they did, the DfE encouraged reports of a problem of rising exclusions at a time when exclusions were low and falling. This may also have coloured the perspective of many commentators who incorrectly predicted a dramatic rise in exclusions in 2020-21 (as discussed here).

With regard to the most recent changes in how the exclusion data are released, I’m sceptical about releasing termly statistics. Exclusion statistics are already cherry-picked to death, and this will create a few more opportunities to do so each year. But at least it has revealed just how badly we were misled by the DfE’s own statisticians the last time there was a change in how the figures were reported.
