My Meeting With Sean Harford, OFSTED’s National Director for Schools Policy

July 30, 2014
You may recall (see here) that in February a group of bloggers met Mike Cladingbowl, one of OFSTED’s biggest cheeses, as part of an exercise in bridge-building with the online teacher world. It was a bit noticeable at the time that I wasn’t invited, despite the effort I had been putting into blogging about OFSTED. It almost seemed as if they were willing to reach out to teachers on social media, but not if it meant having to answer some of the questions I was asking. However, I was pleased to get an invitation for a chat from @HarfordSean, who is now (I think this is a recent appointment) their national director for schools policy. I went to meet him in OFSTED’s secret base (sort of) in the West Midlands last Friday, with my associates Gwen (@) and David (@).
Before I go through the content of the discussion, it is probably worth mentioning the general tone of the meeting. Sean was not defensive. When told OFSTED horror stories, whether recent or from the bad old days of last year, he did not make excuses but asked what could be done. At times he even seemed to pre-empt possible criticisms of how OFSTED operates, facing them head on rather than skirting round them. He had a message to put across about what was being put in place, but was keen to discuss ideas. This was the voice of a reformed, or at least reforming, OFSTED. When I left, I remembered that some cynics, back in February, had made very pessimistic remarks on social media about the bloggers’ meeting being a PR exercise designed to lull the critical faculties of those invited by flattering their egos. I could imagine the same criticism being made of my meeting, in that I did leave feeling far less concerned or angry about OFSTED than when I arrived. I do hope, however, that this is because the substance of the discussion gave some genuine grounds for hope.
It is probably worth noting what seemed to be Sean’s main message, or at least his main message for those of us there. The guidance for inspectors was being cut back and simplified to prevent too much scope for misinterpretation. The subject guidance used for survey visits (as criticised here) has now been removed from the website. The guidance on quality of teaching (out soon) will be crystal clear that it is not about doing particular things. This will be backed up by the training, the briefing of HMI and the changes in who will be inspecting. Current practitioners should be more involved in inspecting, and can even be part of some training activities. Inspectors will be looking for a broad and balanced curriculum, which is intended to prevent OFSTED being used as an excuse for a narrow curriculum.
Gwen, who had recently taught at a “Category 4” school, described what it was like working at a school under the scrutiny of OFSTED. Managers constantly second-guess OFSTED, and teachers are under relentless pressure to conform to whatever ideas, no matter how counter-productive, the managers come up with. I suggested that the worst effect OFSTED has is this indirect one: any recommendation in an OFSTED report becomes the school’s priority and is open to bizarre interpretations that actually make teaching less effective. This raised the issue of why schools don’t just concentrate on achievement in order to show improvement. Sean seemed confident that this would be the best strategy in most cases, particularly as teaching grades tend to match achievement grades and grading of lesson observations should end completely from September. He described schools that get rid of teachers whose classes get good results as behaving in a way that is “ridiculous” and “bizarre”. I pointed out that this relies on schools feeling they can influence the achievement grade by improving results, rather than the achievement grade being an unpredictable result of particular inspectors’ concerns.
This moved us on to the topic of reliability. Sean feels that schools should be able to predict their achievement grade in advance of the inspection from their results, something which, in my experience, hasn’t been the case. He said that he had in the past felt there was an argument for an algorithm that analysed every school’s published data and produced some kind of grade on that basis, although he would not consider that enough on its own to judge achievement. I felt that this would, at least, give schools some better guidance as to what needed to improve. We discussed some recent blogs (more recent than my examples) that described disagreements with inspectors over data. Sean accepted that sometimes inspectors have got things wrong and that his priority was improving on that. I asked what was done to ensure reliability and validity and, in particular, whether there was any attempt to ensure that different teams would reach the same judgement about a school. There is apparently some scrutiny of “outliers” among judgements but no actual direct checks for reliability, i.e. consistency, between inspectors. There was a bit of discussion about whether two inspectors, or two inspection teams, could inspect the same school to see if they come up with the same judgement as part of quality assurance. With hindsight, it is staggering that this hasn’t been routine practice after every major change in inspection methods. Reliability should be a fundamental consideration of any method of measuring anything.
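To make the reliability point concrete: if two inspection teams did independently grade the same schools, their agreement could be measured with a standard inter-rater statistic such as Cohen’s kappa, which corrects raw agreement for the agreement you would expect by chance. The sketch below is purely illustrative; the grades and the idea of two parallel teams are invented for the example, not anything OFSTED actually does.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: proportion of schools given the same grade.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement if the two teams graded independently at random,
    # each keeping its own overall distribution of grades.
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    p_expected = sum(count_a[g] * count_b[g] for g in count_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical grades (1 = outstanding .. 4 = inadequate) awarded to the
# same ten schools by two independent inspection teams (invented data).
team_1 = [2, 3, 1, 4, 2, 2, 3, 1, 3, 2]
team_2 = [2, 3, 2, 4, 2, 3, 3, 1, 3, 2]
print(round(cohens_kappa(team_1, team_2), 2))  # prints 0.71
```

A kappa near 1 would mean the teams almost always agree; a kappa near 0 would mean their judgements agree no more often than chance, which is exactly the question the double-inspection idea would answer.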
David raised questions about how inspectors will be hired in future. Unfortunately it looks impractical to stop serving inspectors acting as consultants, and there will be limits to how much inspection work can be done by serving school leaders. He also discussed the use of the behaviour and safety part of the report to make “stealth” judgements about teaching practice. Sean was well aware of the issue. Hopefully David will be blogging about some of this discussion before too long; he even raised the Trojan Horse affair.
From my point of view, the meeting served to reassure me about some of the points on which I have been most critical of OFSTED, and to raise some ongoing issues about the impact the organisation is having. I don’t want to imply that this means anything is resolved, but we have moved on considerably from a situation where communication about what OFSTED wants seemed to go via consultants and CPD providers, through SMT, and only reached the classroom once it had been turned into a compulsory style of teaching that the frontline had to comply with. I think we still have a long way to go before schools shake off the “OFSTED culture” in which schools are motivated largely by fear of inspectors, but I do hope OFSTED seem a little less scary than they were this time last year. I will be a lot happier, though, when there is a solid body of evidence that OFSTED judgements are both valid and reliable.