You may recall (see here) that in February a group of bloggers met Mike Cladingbowl, one of OFSTED’s biggest cheeses, as part of an exercise in bridge-building with the online teacher world. It was a bit noticeable at the time that I wasn’t invited, despite the effort I had been putting into blogging about OFSTED. It almost seemed as if they were willing to reach out to teachers on social media, but not if it meant having to answer some of the questions I was asking. However, I was pleased to get an invitation for a chat from @HarfordSean, who is now (I think this is a recent appointment) their national director for schools policy. I went to meet him in OFSTED’s secret base (sort of) in the West Midlands last Friday, with my associates Gwen (@Gwenelope) and David (@LearningSpy).
Before I go through the content of the discussion, it is probably worth mentioning the general tone of the meeting. Sean was not defensive. When told OFSTED horror stories, either recent or from the bad old days of last year, he did not make excuses and would ask about what could be done. At times he even seemed to pre-empt the possible criticisms of how OFSTED operates, facing them head on rather than skirting round them. He had a message to put across about what was being put into place, but was keen to discuss ideas. This was the voice of a reformed, or at least reforming, OFSTED. When I left, I remembered that some cynics, back in February, had made very pessimistic remarks on social media about the bloggers’ meeting being a PR exercise designed to lull the critical faculties of those invited by flattering their egos. I could imagine the same criticism being made of my meeting, in that I did leave feeling far less concerned or angry about OFSTED than when I arrived. I do hope, however, that this is because the substance of the discussion gave some genuine grounds for hope.
It is probably worth noting what seemed to be Sean’s main message, or at least his main message for those of us there. The guidance for inspectors was being cut back and simplified to prevent too much scope for misinterpretation. The subject guidance used for survey visits (as criticised here) has now been removed from the website. The guidance on quality of teaching (out soon) will be crystal clear that it is not about doing particular things. This will be backed up by the training, the briefing of HMI and the changes in who will be inspecting. Current practitioners should be more involved in inspecting, and can even be part of some training activities. Inspectors will be looking for a broad and balanced curriculum, which is intended to prevent OFSTED being used as an excuse for a narrow curriculum.
Gwen, who had recently taught at a “Category 4” school, described what it was like working at a school under the scrutiny of OFSTED. Managers constantly second-guess OFSTED and there is constant pressure on teachers to conform to whatever ideas, no matter how counter-productive, the managers come up with. I suggested that the worst effect OFSTED has is this indirect effect; that any recommendation in an OFSTED report becomes the school’s priority and open to bizarre interpretations that actually make teaching less effective. This raised the issue of why schools don’t just concentrate on achievement in order to show improvement. Sean seemed confident that this would be the best strategy in most cases, particularly as teaching grades tend to match achievement grades and grading of lesson observations should end completely from September. He described schools that get rid of teachers whose classes get good results as behaving in a way that is “ridiculous” and “bizarre”. I pointed out that this relies on schools feeling they can influence the achievement grade by improving results, rather than the achievement grade being an unpredictable result of particular inspectors’ concerns.
This moved us on to the topic of reliability. Sean feels that schools should be able to predict their achievement grade in advance of the inspection from their results, something which, in my experience, hasn’t been the case. He said that he had in the past felt there was an argument for having an algorithm that analysed all schools on published data and gave some kind of grade for that, although he would not consider that as enough on its own to analyse achievement. I felt that this would, at least, give schools some better guidance as to what needed to improve. We discussed some recent blogs (more recent than my examples) that described disagreements with inspectors over data. Sean accepted that sometimes inspectors have got things wrong and that his priority was improving on that. I asked what was done to ensure reliability and validity and, in particular, whether there was any attempt to ensure that different teams would reach the same judgement about a school. There is apparently some scrutiny of “outliers” among judgements but no actual direct checks for reliability, i.e. consistency, between inspectors. There was a bit of discussion about whether two inspectors, or two inspection teams, could inspect the same school to see if they come up with the same judgement as part of quality assurance. With hindsight, it is staggering that this hasn’t been routine practice after every major change in inspection methods. Reliability should be a fundamental consideration of any method of measuring anything.
David raised questions about how inspectors will be hired in future. Unfortunately it looks impractical to stop serving inspectors acting as consultants and there will be limits to how much inspection work can be done by serving school leaders. He also discussed the use of the behaviour and safety part of the report to make “stealth” judgements about teaching practice. Sean was well aware of the issue. Hopefully David will be blogging about some of this discussion before too long; he even raised Trojan Horse.
From my point of view, the meeting served to reassure me about some of the points about which I’ve been most critical of OFSTED and to raise some ongoing issues about the impact the organisation is having. I don’t want to imply that this means anything is resolved or solved, but we have moved on considerably from a situation where communication about what OFSTED wants seemed to go via consultants & CPD providers, through SMT, and only reached the classroom once it had been turned into a compulsory style of teaching that the frontline had to comply with. I think we still have a long way to go before schools shake off the “OFSTED-culture” where schools are motivated largely by fear of inspectors, but I do hope OFSTED seem a little less scary than they were this time last year. But I will be a lot happier when there exists a collection of good evidence that OFSTED judgements are both valid and reliable.
First Impressions of the New OFSTED Handbook
July 31, 2014

The new OFSTED handbook is out and can be found here. Although it was meant to be simplified, it replaces not just the old handbook but the old subsidiary guidance and, therefore, is actually quite lengthy. I am too busy to be able to read it from cover to cover, but I have had time to look into a few of the key issues that I’ve been blogging about.
The new handbook really spells out what I would want it to on observations; stating that there is no grading and no required style of teaching.
From the description of what should happen during an inspection:
The guidance on how to grade teaching and learning in a school makes the same point and spells out what inspectors should not be looking out for or taking objection to:
It also states clearly that the information that inspectors will want to see includes “records of the evaluation of the quality of teaching, but inspectors should not expect to see records of graded lesson observations” [their underlining]. This really gives managers little excuse for grading lessons. This needs to be widely publicised, and I would hope that trade unions would start making sure their representatives and members are fully aware that any attempt to grade teachers in observations is neither required by OFSTED, nor in line with OFSTED’s practices, but entirely down to the willingness of managers to grasp at excuses to label their teachers.
I’m hoping that the guidance on marking is vague enough that it might help break the delusion that marking must be acted on in writing to count. As before inspectors are to look for “[c]onsistently high quality marking and constructive feedback” as part of outstanding teaching but elsewhere they are simply looking for “whether marking, assessment and testing are carried out in line with the school’s policy and whether they are used effectively to help teachers improve pupils’ learning”. I hope this causes some schools to reflect on whether their marking policy actually helps teachers and students, or is there only to appease OFSTED.
You may also recall that here I described a school which had been marked down, despite good results, apparently for an achievement gap:
This now seems to have been addressed. Guidance on achievement says:
Several footnotes might also help make judgements based on the achievement gap less unfair. It is stated that inspectors should be “considering in-school gaps in the context of national gaps”. Outstanding achievement now has an exception to the rule that the results of the disadvantaged must be rapidly approaching other groups: “[w]here the attainment of disadvantaged pupils is high, any in-school attainment gaps need not be closing rapidly”. Good achievement has a similar exception: “[w]here the attainment of disadvantaged pupils is high, in-school attainment gaps may exist”.
I can see why OFSTED were confident about meeting me last week. The new handbook does seem to have addressed most of the points I’ve raised. However, I may well return to it if I uncover anything that seems less positive. Let me know if you find anything. Also, when term starts, let me know if inspectors are doing what they are supposed to. Just today I got an email from somebody, who went through an OFSTED during last half-term, telling me: