My Meeting With Sean Harford, OFSTED’s National Director for Schools Policy

July 30, 2014

You may recall (see here) that in February a group of bloggers met Mike Cladingbowl, one of OFSTED’s biggest cheeses, as part of an exercise in bridge-building with the online teacher world. It was a bit noticeable at the time that I wasn’t invited, despite the effort I had been putting into blogging about OFSTED. It almost seemed as if they were willing to reach out to teachers on social media, but not if it meant having to answer some of the questions I was asking. However, I was pleased to get an invitation for a chat from @HarfordSean, who is now (I think this is a recent appointment) their national director for schools policy. I went to meet him in OFSTED’s secret base (sort of) in the West Midlands last Friday, with my associates Gwen (@Gwenelope) and David (@LearningSpy).

Before I go through the content of the discussion, it is probably worth mentioning the general tone of the meeting. Sean was not defensive. When told OFSTED horror stories, whether recent or from the bad old days of last year, he did not make excuses; he asked what could be done. At times he even seemed to pre-empt the possible criticisms of how OFSTED operates, facing them head on rather than skirting round them. He had a message to put across about what was being put in place, but was keen to discuss ideas. This was the voice of a reformed, or at least reforming, OFSTED. When I left, I remembered that back in February some cynics had made very pessimistic remarks on social media about the bloggers’ meeting being a PR exercise designed to lull the critical faculties of those invited by flattering their egos. I could imagine the same criticism being made of my meeting, in that I did leave feeling far less concerned and angry about OFSTED than when I arrived. I do hope, however, that this is because the substance of the discussion gave some genuine grounds for hope.

It is probably worth noting what seemed to be Sean’s main message, or at least his main message for those of us there. The guidance for inspectors was being cut back and simplified to reduce the scope for misinterpretation. The subject guidance used for survey visits (as criticised here) has now been removed from the website. The guidance on the quality of teaching (out soon) will be crystal clear that it is not about doing particular things. This will be backed up by the training and briefing of HMI and by changes in who will be inspecting. Current practitioners should be more involved in inspecting, and can even be part of some training activities. Inspectors will be looking for a broad and balanced curriculum, which is intended to prevent OFSTED being used as an excuse for a narrow curriculum.

Gwen, who had recently taught at a “Category 4” school, described what it was like working in a school under the scrutiny of OFSTED. Managers constantly second-guess OFSTED, and there is relentless pressure on teachers to conform to whatever ideas, no matter how counter-productive, the managers come up with. I suggested that this indirect effect is OFSTED’s worst: any recommendation in an OFSTED report becomes the school’s priority, open to bizarre interpretations that actually make teaching less effective. This raised the issue of why schools don’t simply concentrate on achievement in order to show improvement. Sean seemed confident that this would be the best strategy in most cases, particularly as teaching grades tend to match achievement grades and the grading of lesson observations should end completely from September. He described schools that get rid of teachers whose classes get good results as behaving in a way that is “ridiculous” and “bizarre”. I pointed out that this relies on schools feeling they can influence the achievement grade by improving results, rather than the achievement grade being an unpredictable product of particular inspectors’ concerns.

This moved us on to the topic of reliability. Sean feels that schools should be able to predict their achievement grade from their results in advance of an inspection, something which, in my experience, hasn’t been the case. He said that he had in the past seen an argument for an algorithm that analysed every school’s published data and generated some kind of grade from it, although he would not consider such a grade sufficient on its own to judge achievement. I felt that this would, at least, give schools better guidance as to what needed to improve. We discussed some recent blogs (more recent than my examples) that described disagreements with inspectors over data. Sean accepted that inspectors have sometimes got things wrong and said that his priority was improving on that. I asked what was done to ensure reliability and validity and, in particular, whether there was any attempt to ensure that different teams would reach the same judgement about a school. There is apparently some scrutiny of “outliers” among judgements, but no direct checks for reliability, i.e. consistency, between inspectors. There was some discussion about whether, as part of quality assurance, two inspectors, or two inspection teams, could inspect the same school to see if they reach the same judgement. With hindsight, it is staggering that this hasn’t been routine practice after every major change in inspection methods. Reliability should be a fundamental consideration of any method of measuring anything.
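
To make that concrete: checking inter-rater reliability is not complicated. Below is a minimal sketch in Python (entirely hypothetical; nothing like this is part of OFSTED’s actual process, and the grades are invented) of how agreement between two inspection teams grading the same schools could be measured using Cohen’s kappa, which corrects raw agreement for the agreement you would expect by chance:

    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        """Cohen's kappa: agreement between two raters, corrected for chance."""
        n = len(ratings_a)
        # Observed agreement: fraction of schools given the same grade by both teams.
        observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
        # Expected agreement: probability both teams pick the same grade by chance,
        # given each team's overall distribution of grades.
        freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
        expected = sum(freq_a[g] * freq_b[g] for g in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

    # Invented grades (1 = Outstanding ... 4 = Inadequate) for ten schools,
    # each inspected independently by two hypothetical teams.
    team_one = [2, 3, 1, 2, 4, 3, 2, 1, 3, 2]
    team_two = [2, 3, 2, 2, 4, 3, 3, 1, 3, 2]
    print(f"Cohen's kappa: {cohens_kappa(team_one, team_two):.2f}")  # ~0.71

A kappa of 1 would mean perfect agreement and 0 no better than chance; published figures of this sort, from paired inspections, are exactly the kind of evidence of reliability I have in mind.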

David raised questions about how inspectors will be hired in future. Unfortunately it looks impractical to stop serving inspectors from also working as consultants, and there will be limits to how much inspection work can be done by serving school leaders. He also discussed the use of the behaviour and safety part of the report to make “stealth” judgements about teaching practice, an issue Sean was well aware of. David even raised Trojan Horse; hopefully he will be blogging about some of this discussion before too long.

From my point of view, the meeting served to reassure me about some of the points on which I have been most critical of OFSTED, and to raise some ongoing issues about the impact the organisation is having. I don’t want to imply that anything is resolved or solved, but we have moved on considerably from a situation where communication about what OFSTED wants seemed to travel via consultants and CPD providers, through SMT, and only reached the classroom once it had been turned into a compulsory style of teaching that the frontline had to comply with. I think we still have a long way to go before schools shake off the “OFSTED-culture” in which they are motivated largely by fear of inspectors, but I do hope OFSTED seem a little less scary than they did this time last year. I will be a lot happier, though, when there is a solid body of evidence that OFSTED judgements are both valid and reliable.

18 comments

  1. Thank you for taking the time to go and talk with them.


  2. Good update. Keep pushing.


  3. We owe you a huge debt of gratitude. Thank you, Andrew.


  4. […] detailed by Old Andrew here, I attended a meeting with the new National Director for Schools Policy, Sean Harford in Birmingham […]


  5. Reblogged this on Apprenticeship, Skills & Employability.


  6. Great blog. Nice to see teachers being consulted.


  7. Thank you for the update, posted here on this ongoing thread about lesson observations and Ofsted:
    http://phonicsinternational.com/forum/viewtopic.php?p=1835#1835


  8. Reblogged this on The Echo Chamber.


  9. Reliability, not validity? Algorithms and practising school leaders as HMIs.

    Reliability is akin to professional confidence. Would different inspection teams draw the same conclusions? Would it be difficult for Ofsted to show their ‘reliability’? (Could a school be subject to two inspection teams?)

    Also of interest is how accurate Ofsted judgements are. Are schools confident that Ofsted actually measures what it sets out to measure – school effectiveness?

    Algorithms for achievement – perhaps more important is how accurately schools know themselves. Should schools submit forecast outcomes to Ofsted? Do forecast outcomes ever get reviewed post-inspection?

    HMIs – should schools build x days into one of their senior roles to work in and visit other schools as part of a local or national partnership? Developing, sharing and communicating school leadership and practice?

    Informative read – appreciated.


    • I mentioned validity, but the reason for focusing on reliability is that it is both easier to test and, for some reason, less of a focus for debate.


      • Completely agree on the importance of reliability, which is an objective, statistical issue, rather than validity, which is really a quality of the criteria against which the assessment is being made.

        The ignoring of reliability is a massive black hole, in my view, in the whole of our assessment system and not just in relation to OFSTED. Why does no-one monitor the reliability of exam results and teacher predictions and assessments? This will be the huge contribution that learning analytics is able to make, in my view.


  10. PS please excuse the typos!!


  11. Yes, you did mention validity. I’m not sure why reliability has not been questioned more rigorously and more often by the profession. Is it possible we have been too focused on the grades and guilty of overlooking the process?
    One would most probably avoid test-retest investigation, as it would place undue pressure on a single school community, but inter-rater reliability or parallel-test reliability would be feasible? Should HMI teams be issued an internal-consistency reliability value?
    Curious – what measure would you favour? Where would it add confidence to the inspection process?


  12. […] Teaching in British schools « My Meeting With Sean Harford, OFSTED’s National Director for Schools Policy […]


  13. […] Smith gets to meet and talk with Ofsted’s Sean Hartford and has his say in the latest guidance published. Well done OA and all those who attended. What […]


  14. […] Sherrington David Didao Ross McGill Tom Bennet Shena Lewington Andrew Old Naureen Khalid Cherryl Kd Stephen Tierney Chris McDonald Old Primary Head Part 1 and Part 2 Miss […]


  15. […] in what would be termed ‘blind second-marking’ in a university context. Interestingly he said something similar when he met Andrew Smith. It’s not an area I can claim any expertise in but I am pretty sure that there are various ways […]


  16. […] My post about meeting Sean Harford in July, as I think a lot of people missed it because it was in the […]



Comments are closed.