Archive for February, 2013


Does Sir Michael Wilshaw Know What OFSTED Good Practice Looks Like?

February 26, 2013

While I have recently discussed the continuing domination of trendy teaching ideas over OFSTED, there is one subject where, if one has listened to Sir Michael Wilshaw, one would assume traditional methods still hold sway. When signalling his departure from enforcing progressive teaching, he has not hesitated to cite secondary mathematics in his examples of the acceptability of chalk and talk.

In his RSA speech he described an outstanding maths teacher he knew in this way:

[The teacher] was somebody in his late fifties. He was the head of maths. He was a very traditional teacher. He taught in a pretty didactic way, but the kids loved him across the ability range. He knew how to teach maths. You know what a great maths teacher does?  Builds block by block to ensure that youngsters don’t move on until they understand the ground rules. He would spend many, many hours in the evening every night preparing powerpoints for himself and for the staff in his department and he would disseminate good practice, in terms of how to use powerpoints, to other people in his department and beyond his department to other schools in Hackney and beyond. And he produced absolutely fantastic results although some people would say he was a very didactic teacher.

He also mentioned that “the structured reinforcement of mathematical formula” was an acceptable use of time and in his London Festival of Education speech he even said:

If a teacher on a wet Friday afternoon is doing a fairly boring lesson on quadratic equations but the children are learning, that’s fine as well.

So nobody can doubt that OFSTED have nothing against fairly traditional maths lessons, where things are explained rather than discovered and no gimmicks are used to make them entertaining. Well, nobody could doubt it except those who work for OFSTED. Here’s what they actually like. This is the only maths good practice video added since Sir Michael made those speeches. Update 30/3/2013: The OFSTED good practice videos were all removed immediately after I blogged about them. If you’d like me to send you a transcript, email me.

Update 5/5/2013: Current version of video below:

Highlights include the teacher, Katharine, saying:

I’m not going to tell them how to do anything. So the challenge for me is the questions that I’m going to be using to make sure that they get out everything that they want to from the lesson. And that they discover things for themselves rather than me just telling them, oh this is how you do it – in algebra we write 3N instead of 3 times N. So I want them to find things out for themselves which sometimes you need to really think about the questions that are the probing questions that I’m going to be going round and asking them.

We then see the students try to work out how algebra works from discussing the 12 Days of Christmas.

The HMI who, as I understand it, is one of the top “specialists” in maths, explains:

Katharine teaches for understanding throughout her lessons. She does that in various ways… she doesn’t just leap in with the right answer at any stage…If we contrast that with what we often see in other lessons we tend to see a compartmentalised approach to algebra. Where students are just taught a basic skill about you replace a letter by a number… And no really understanding about the role of the letters involved. So this school, this early work on algebra contrasts very sharply with what we see. And I would encourage all schools to think very carefully about laying the foundations for later learning of algebra in ways, similar ways to this.

Clearly this HMI was unaware that, far from condemning a “compartmentalised approach”, her chief inspector actually favours building “block by block” and “structured reinforcement”. Or perhaps she doesn’t care, because it is she, and not Sir Michael, who will actually be judging maths teachers in lessons. Still, I’m sure the three recent studies of best practice in secondary mathematics have more traditional contents.

How about the one for Loreto High School in Chorlton? What maths teaching methods does that describe? Well there’s no reason to think they don’t use plenty of traditional methods, but this is what is singled out for praise:

Students enjoy a range of sorting and matching activities, often in pairs or groups that promote discussion and help to develop their understanding. Students of all abilities, but particularly those who are less able, respond well to opportunities to explore mathematics through, for example, recording their ideas informally on mini-whiteboards.

What about Archbishop Temple School?

The outstanding teaching in the department makes use of published resources, such as text-books and worksheets, but only selectively. There is no set text-book. Teachers aim for active learning, using a range of sorting and matching activities that engage students and encourage discussion.

So more discussion and card sorts. How about Allenbourn Middle School? Not strictly a secondary school, but there should be some overlap.

Working from that secure starting point four years ago, and with pupil behaviour that was reliably good, the first step was to begin to encourage teachers to try new approaches. The only constraint placed upon staff was that outstanding quality of learning in mathematics in the school was to involve pupils using and applying their learning for the majority of their lessons and, as a part of this, always wanting to ‘get right into the corners’ of their understanding. They made it clear that this approach wasn’t to be reserved for the occasional lesson – it had to become the way that mathematics is learned and taught throughout the school.

This often meant turning traditional lessons on their head. For example, in a Year 8 lesson on percentage increase and decrease, pupils don’t spend any time listening to reminders of the basic ideas about how calculations are done; they move straight into a buying and selling simulation with laminated ‘money’ and cards representing merchandise in the six ‘shops’ around the room. Traders are told the cost price of items and set prices; and buyers are encouraged to haggle by requiring percentage reductions. Reductions in prices in multiples of 10% are used initially and it is evident that some pupils can handle the idea quicker than others; it is also clear that those that can’t appreciate that understanding how someone can rapidly calculate a 30% reduction on a price of £5 is an important skill! Pupils learn quickly from each other and the lesson gets pacier and more demanding as some begin to demand 35% reductions and more complex discounts. For more complex calculations calculators are allowed, but they have to be used intelligently and this brings in the need to turn percentages into decimals quickly and fluently in order to keep up with the pace of haggling. Throughout, the teacher is closely monitoring pupils’ rates of understanding and skill acquisition. The plenaries are short and sharp, focused on specific skills, and are continually ratcheting up expectations. By the end of the lesson, all pupils have developed a ‘feel’ for the topic and have the capacity to deal with the mathematical concepts confidently and with a fluent recall of knowledge. Just as significantly, they show a rare level of confidence in problem solving.

Similarly, in a Year 5 lesson reinforcing techniques for addition and multiplication, pupils do not spend time responding to a long list of questions from a text book. Instead, it’s a game of Cluedo. The various clues around the room require a range of calculations to be made (and checked using an alternative method) and the answers provide pieces of evidence for these young sleuths to identify the killer. (It was Professor Plum!)

Oh. Well that’s all I can find about good practice in secondary maths that’s been published recently. Can anybody else find anything OFSTED have published about secondary mathematics which in any way suggests anybody other than Sir Michael will celebrate, or even tolerate, traditional teaching in the subject?

If you are wondering why I have singled out maths here it’s because the other good practice videos don’t need this much explanation. I’ll come back to them later in the week. The English one is just great.

Update (2/3/2013): There has been some discussion (not so much in the comments but elsewhere) as to the effectiveness of the lessons shown here, so I thought I’d add a few comments.

The point of this blogpost was to show how much OFSTED’s example of good teaching in secondary maths differed from that of its chief inspector, rather than to say that these were particularly bad lessons. There is much that is praiseworthy within the lesson; one could almost believe it is the work of a highly effective teacher who was just putting on a show. However, if the case were to be made as to why the lesson shouldn’t be an example to all, I would suggest that some of the methods fly in the face of the evidence. Firstly, children “discovering” the ideas for themselves is not as effective a teaching method as telling them. The key paper to read about this is here. Secondly, the use of a sorting exercise is not necessarily particularly effective. I haven’t looked into the primary evidence, but according to this blog the effect size for manipulatives in general is 0.37 and for manipulatives in algebra is 0.21. If you are familiar with the use of effect sizes, you are probably aware that John Hattie suggested that anything below 0.4 was of below average effectiveness. Finally, with regard to the content, if the whole “12 Days of Christmas” idea is meant to make substitution and formulas more accessible, one has to wonder what the use of negative numbers in the second class was meant to mean (“On Christmas minus five days, I debited my true love’s account with 10 presents”?)

Now these considerations do not mean that children will never learn well from these methods; even I am not that prescriptive about methods. Whether the teacher shown is effective or not would have to be judged by looking at the success (or otherwise) of her students. However, there is no good case for OFSTED repeatedly presenting this sort of lesson as good practice, and criticising the alternative, when even their own chief inspector is aware that traditional teaching in secondary maths can be highly effective. There is definitely no excuse for OFSTED’s chief inspector going around telling teachers that traditional teaching is fine while his organisation is using its unaccountable power to enforce the progressive consensus quite brutally at the school level.


Who will watch the OFSTED watchmen?

February 23, 2013

Best bets

Big Brother

Quis custodiet ipsos custodes?

Who will watch the watchmen?

Decimus Junius Juvenalis (Juvenal), Satires, 2nd century AD.


The rhetoric is that all OFSTED want is effective teaching and leadership;

The reality is that OFSTED still prize entertainment over effective instruction.

In 2000, the British reality TV show Big Brother jumpstarted Channel 4’s drive into the brave new millennium. Over a decade later, with thirty series so far and two annual celebrity programs, it has become increasingly detached from reality. No one actually remembers the original message of the writer who coined the phrase ‘Big Brother’. Ask almost any schoolkid in England whether they know what it was originally based on, and you’ll be met by blank stares. Big Brother and the entertainment culture it spawned have spun loose from their moorings in reality.

In 1949, George Orwell wrote his book 1984. In it, he envisioned mass surveillance by the…

View original post 1,346 more words


Some Final Words on the English GCSE Farrago

February 21, 2013

It’s probably worth mentioning how the regrading lobby have reacted to the high court judgement that OFQUAL acted fairly. The main response I have encountered has been along the lines of “I/We know what a C grade looks like and our students should have got C grades”. This is the same argument that I have highlighted previously (at least once; maybe twice) when it appeared on Geoff Barton’s blog:

I’m merely a humble English teacher, and it took me five attempts to get my O-level Maths, so I can’t do the fancy statistical pyrotechnics that others can.

But I know what a C in English looks like; I know what you have to do to achieve it; and I cannot accept that because some kind of quota system has been created, through the incompetence of distant bureaucrats, our students – my students – should have a D on their certificates rather than the C they deserve. I taught a group of Year 11 students this year and they – like so many across the country – have been let down. Obvious C grade students have been given a D.

After all, to get a C you essentially only need to be able to do three things: write using paragraphs; write using mostly accurate sentences and spelling; and be boring. If you stop being boring you move to a B or higher.

A grade C therefore demonstrates a general level of technical accuracy in the construction of writing and an ability to read that goes some way beneath the surface level of a text. It’s what we ought to be able to expect of more of our students.

And this, over many years, is what I’ve trained students to be able to do, in my own school, on courses and at conferences, and as a guest speaker in many other schools across England.

I know what a C grade is, what it looks like and involves, and I cannot accept that a cohort of up to 60,000 young people (according to ASCL’s calculations) should be denied the grade because of an error that’s not of their own or their teachers’ making.

As this appears to be the last remaining excuse for regrading, I think it’s time to assess it for credibility. Do English teachers have a well-developed sense of what C grade work is like and what a C student looks like? One assumes that this stems from marking all the coursework in the old English GCSE and identifying accurately whether it is the work of a C student. I am prepared to believe that English teachers were good at getting coursework (particularly tasks that had been done before) up to a C grade standard (that’s historically been part of the problem). However, this does not mean that there is a strong understanding of what a C should look like in any given exam or assessment. For this to be credible, we would expect that the work of a C student could be easily identified regardless of whether it is teacher assessed or examiner assessed. We would expect a C grade in coursework to mean something roughly similar to a C grade in written exams. We would expect a consistent picture in all assessments and exams. Is this credible? Not according to the statistics collected by OFQUAL (see page 51, although the table appears to have the wrong date on it). According to these, in the old GCSE (the one where English teachers could easily identify the work of a C grade student) there was a huge variation in the C pass rate between modules. For the foundation tier, the percentage of students getting a grade C or better in the speaking coursework was 70.9%. The percentage getting a grade C or better in the written coursework was 56.2%. That’s a bit of a discrepancy just between the two parts of the exam marked by teachers. However, for the two written exams, the bits marked by external examiners rather than by the students’ own teachers, the percentages of C grades or greater were 4.4% and 4.5%. This is not a misprint. The supposedly easily identifiable C grade was identified more than a dozen times more frequently by teacher assessment than by external examiners.

Apart from being evidence that the old, coursework-based system was heavily based on getting higher marks in the teacher assessed parts of the course, this makes a complete mockery of the claim that the work of a C student is easily identifiable. Far from knowing how to identify the work of a C student, teachers were seeing C grades everywhere. This is, to my mind, the fault of a system which forced teachers to manipulate grades, rather than the fault of teachers themselves, but any teacher who claims that they have learnt to identify C grade work from their part in this disreputable exercise strains credibility. Similarly, anyone who claims that there are objective criteria which would identify C grade work regardless of the test or assessment in which it appears cannot be taken seriously. The system was rigged. The new GCSE rigged it some more. Things fell apart when the introduction of comparative outcomes stopped anyone getting away with it.

As a final note, in a blogpost Geoff Barton accepted the fight for a regrade was over. However, he remained determined to ignore the facts to the very end, identifying a series of “unanswered questions” which, to the best of my knowledge, have been clearly answered. Just in case anyone is unaware of those answers, I thought I’d answer them here.

Why did the exams boards get it so wrong?

It is now established that they got it wrong in January because of a lack of information about the comparative performance of the relatively small number of students who took their GCSEs in January.

What has happened as a result?

The GCSE is being reformed to get rid of modules that can be done early and Controlled Assessments that allowed manipulation of grades.

Who has been fired?

The main people responsible were those in charge of exams when the new GCSE was introduced. They had, on the whole, already gone either due to losing power in the general election or in the abolition of the QCDA. If anyone responsible is still in place, feel free to identify them.

Why was Ofqual so slow to respond to concerns they had raised long before the exams took place?

The framework for the exam was already in place. It could not be conveniently changed just because people in OFQUAL realised it was ill-judged.

Why did their subsequent report start by blaming the exam boards and then switch to blaming the teachers?

This is fantasy. The structure of the exams has been blamed. The earlier report suggested some problems with what the exam boards and teachers had done; the later report revealed how much of that was a result of an inappropriate structure and perverse incentives. The idea that teachers were blamed by OFQUAL was an invention of the media.

Why was Ofqual allowed to investigate itself?

It wasn’t. It investigated what had happened. While conspiracy theorists may have blamed either Michael Gove or OFQUAL for everything that happened, there was no actual evidence for any of it, and it was up to OFQUAL to find out who had done what and why, not to “investigate” an accusation. Any claim that they failed to get to the bottom of this has now been discredited by a high court judgement confirming all the substantial points OFQUAL had made.

How are English teachers supposed to prepare this year’s cohort of GCSE students?

Teach them. Get them to be better at English. You know, what schools should have been doing in the first place instead of gaming the system.

Is the message that however you do in the examination hall, some faceless bureaucrat will decide your grade according to the superstitious mantra of ‘comparable outcomes’?

It has always been the case that, in the final analysis, grades were decided by the faceless. Now they are being decided in a way likely to make them consistent from year to year, instead of going up every year. By all means make the case for grade inflation if you can, but don’t dismiss a refusal to inflate grades as “superstition”.


Last Week’s Verdict on the English GCSE Farrago

February 20, 2013

After a conversation with some English teachers, I pointed out in July 2012 that the English GCSE appeared to contain some ridiculous, dumbed-down content.

English Language GCSE – Narrowing the Horizons of the Next Generation

The week before the 2012 GCSE results came out I tweeted to say that schools seemed unaware that many of them could expect to see results fall.

On the day schools received the results (but before they were made public), I saw lots of claims that results in English must be down 10% across the board, and so I explained that some schools having a fall in scores was inevitable under “comparative outcomes”. In particular:

If too many schools target what they think is a “C”, then they won’t get it. It is no good looking at January mark schemes, or previous years’ mark schemes, and trying to replicate what was a C grade then. Everyone else will be doing the same thing and they can’t all get Cs. Boundaries will shift upwards.

Furthermore the effects of these things I have described will be disproportionately felt by schools which have focused on improving their number of C grades. If you aimed for lots of low Cs then you are likely to be in trouble. If you relied on controlled assessments and coursework to get grade Cs (i.e. cheating), then it is almost certain the goalposts will have moved. The effects will also be felt more in subjects where marking is imprecise and arbitrary.

None of the fuss so far has indicated yet that there has been a real problem with English beyond the failure of schools to realise the above. The culture of continual “improvement” (that actually just meant gaming the system) is quite heavily ingrained. An end to grade inflation will be a shock to the system with a lot of consequences for schools.

From A Note About The GCSEs

As the results were published the following day it became clear that there was no general collapse in results. I observed that the exam’s bizarre structure had resulted in many schools attempting to manipulate their results, but this had been foiled by the “comparative outcomes” approach used by the examining boards.

Actually, It Was About Cheating

A day or two later I responded to some of the arguments being put forward to establish unfairness in the exams. In particular, I pointed out that there was no reason to maintain the January grade boundaries in June.

The Exam Hysteria Continues…

I then followed up on any remaining arguments a few days later and argued that what the regrading lobby were pushing for (a massive increase in C grades) was not acceptable.

More About Exams

The following month I looked at the case for comparative outcomes in more detail, and considered the arguments that had emerged since results came out.

A Note on Exams

The GCSE English Farrago

Finally, OFQUAL’s report came out in November, and I observed that its key claims (that the exam was flawed and open to manipulation, and that results could not have been allowed to shoot up), for which it had compiled a large amount of evidence, were exactly what I had claimed all along.

I Told You So

Now, this entire line of argument made me staggeringly unpopular with people who claimed only to care for the best interests of the students. These arguments were repeatedly dismissed as excuses and it was far more common to hear it claimed (without evidence) that Michael Gove had personally caused the results to be pushed down for political reasons, or that OFQUAL had made a mistake and by pointing out the flaws in the exams was “attacking teachers”.

For this reason I cannot resist pointing out that the claims went to the high court, and last week a judgement was released with the following conclusion (I have highlighted some key points):

149. The claimants brought this case because they considered that students had been treated unfairly. There are two principal grievances: first, the actual performance of these students had not been fairly reflected in their grade because the results had been unjustly moulded to reflect predicted performance. The statistics had dominated the assessment process in a wholly unacceptable way. I have rejected that submission, essentially on the ground that it was legitimate for Ofqual to pursue a policy of comparable outcomes, ensuring a consistent standard year on  year, and assessing marks against predicted outcomes was a rational way of achieving that objective. Moreover, the Awarding Committee in each of these AOs believed that the June grades fairly reflected the quality of the candidates.

150. The second grievance is a wholly understandable one, and relates to the inconsistent treatment meted out to the students taking assessments in January and June respectively. There is no doubt with hindsight that the former were treated more generously than the latter. Some teachers, again understandably, took the January grade boundaries as a strong guide to future assessment. They did not anticipate the boundary shifting as much as it did in certain units. The reason for the change was in part that some teachers had marked papers more leniently in June specifically in order to bring them just above the C grade; but that was far from the whole story.  More significantly, there was fuller information available in June than in January and it became clear with hindsight that the January cohort had been treated too leniently.

151. Ofqual was in a difficult position. It considered and rejected the possibility of reassessing the January grade assessments.  Nobody seriously suggests that it should have retrospectively reduced a candidate’s grade in that way when the result had been made public. Yet if it were to have applied the grade boundaries in June, it would have led to a significant dilution of standards, with an unrealistically high proportion of students obtaining a C grade. That would have created an injustice as between those qualifying in June 2012 when compared with students in earlier and subsequent years. Indeed, the problem is compounded when it is appreciated that some candidates for particular units in June 2012 were qualifying in June 2013. If they were to be assessed according to the January 2012 boundary marks, that would be unfair to candidates taking the same unit in January and June 2013. It would manifest precisely the same unfairness that the claimants now allege, but shifted to different victims.

152. The problem lies in the modular nature of the examination, coupled with the fact that grade boundaries were assessed and made  public at each stage of the process. Mr Sheldon [the QC acting on behalf of the regrading lobby] was highly critical of this structure. He rightly points out that a number of experts had predicted precisely the kind of difficulties which have, in fact, arisen. He says that the problem is of Ofqual’s own making (or at least, Ofqual’s predecessor). That may be so, but the judicial review challenge is not to the modular nature of the assessment process, or to the practice of assessments being made at different points in the two year qualification period. It is a challenge to the way in which Ofqual and the AOs sought to deal with the problems once they had materialised.

153. Initially it was assumed that since the same procedures were being adopted in January as in June, there should be no change in standards.  In fact, this was not so and the January cohort were assessed more leniently.  Once that became clear, Ofqual was engaged in an exercise of damage limitation. Whichever way it chose to resolve the problem, there was going to be an element  of unfairness. If it imposed the same standard in June as it had in January, this would be unjust to subsequent cohorts of students taking the units in subsequent years. If it did not, that would favour the January cohort over the June cohort in 2012. Unless standards were to be lowered into the future and the currency of GCSE English debased, at some stage a decision would have had to be taken to depart from the less rigorous January grade boundaries and at that point, whenever it was, there would be winners and losers.  

154. The claimants submit that even if the January cohort was treated unduly favourably, it was wrong to draw a distinction between groups of candidates qualifying in the same year. This was more important than equality as between years.

155. However, there is no obvious or right answer to the question where the balance of unfairness should lie. Ofqual’s solution was in my judgment plainly open to them. Their priority was to protect the comparable outcomes objective, although it meant that January candidates were treated more generously.  However, the adverse consequences were relatively contained by acting at that point since far fewer students took the relevant units in January than in June.  

156. For these reasons, which briefly recapitulate those spelt out in some detail in this judgment, I do not think it can be said that Ofqual or the AOs erred in law.

157. I therefore dismiss these applications. As I have said, however, this is a rolled up hearing, and although nothing turns on the point, I would grant permission for the applicants to bring these proceedings. This was a matter of widespread and genuine concern; there was on the face of it an unfairness which needed to be explained. There is no question, in my view, that the matter was properly brought to court. Indeed, following the outcry when the results were published in August, Ofqual itself carried out an investigation into the concerns which were being expressed and produced two reports, an interim report and a final one produced after consulting widely with interested parties. Ofqual was not persuaded that it should require the grade boundaries to be changed, but it appreciated that there were features of the process which had operated unfairly and it proposed numerous changes for the future which are designed to ensure that the problems which arose in this case will not be repeated. It also took the unusual step of allowing students to take resits in November instead of having to wait until the following January. We are not directly concerned with those reports, which simply reflect Ofqual’s own views. However, having now reviewed the evidence in detail, I am satisfied that it was indeed the structure of the qualification itself which is the source of such unfairness as has been demonstrated in this case, and not any unlawful action by either Ofqual or the AOs.

Does anybody who said I was wrong before care to reconsider their position?


We all know David Starkey was a terrible teacher… don’t we?

February 19, 2013

In the midst of the completely unedifying ideological battle over the teaching of history, a letter appeared in the Guardian from one of the “experts” who we are all meant to listen to. Most of it is just insult, responding to previous insults, responding to yet more insults from people whose lack of professional courtesy shocks me, and I’m an anonymous blogger. If historians wanted to give politicians every reason to ignore their views on the curriculum, this is the right way to go about it.

But what actually annoyed me was this bit:

Ferguson boasts that he’s “written and presented popular history”, but being a telly don doesn’t equip you for the realities of the classroom, as David Starkey found to his cost in Jamie’s Dream School.

This is a reference to David Starkey’s bust-up with a bunch of difficult teenagers while trying to teach on the TV programme “Jamie’s Dream School” – reality TV, not classroom reality – which I’ve seen trotted out again and again as an example of bad teaching, or of how being an expert won’t help you teach.

Of course, this whole incident was based on a television programme trying to show the most sensational footage. What if it had been cut to show the quality of the teaching rather than the difficulty of the students’ behaviour?

Well we have an answer to that question. Here is just the teaching from that infamously terrible lesson.

OFSTED would no doubt disagree, but I think that’s pretty brilliant.

Worth bearing in mind if you are ever told that the best route to behaviour management is through good teaching.


What OFSTED Actually Want

February 16, 2013

My most popular blogpost ever (in terms of hits) wasn’t really written by me. Entitled “What OFSTED say they want” it was a transcript of a speech made by the chief inspector Sir Michael Wilshaw.

It became widely distributed because it seemed to contradict the widespread impression that OFSTED wanted “progressive” style teaching, with lots of groupwork, entertainment, discovery learning and little teacher talk. Sir Michael rejected many of the common ideas about OFSTED, even saying he was wary of “an insistence that there should be a balance between teacher led activities and independent learning” and describing a “very traditional teacher” who “taught in a pretty didactic way” as outstanding.

In a later speech – one that I actually saw him deliver – he made similar comments and in some ways went further, suggesting that even a “fairly boring lesson” could be acceptable if there was learning.

Let me emphasise again to anyone who hasn’t heard this from me or from anyone else in OFSTED.  OFSTED does not have a preferred style of teaching, does not have a preferred style of teaching.  Inspectors will simply judge teaching on whether children are engaged, focused, learning, and making progress, and in the best and most outstanding lessons, being inspired by the person in front of them.

We don’t want to see lessons that are too crowded, too frenetic, and with too many activities designed simply to impress the inspector.  And if that’s happened in the past, it’s wrong.  We simply want to see teaching that embeds learning.  Ultimately that is what matters.

Indeed, our recent Improving English forum report found a disturbing lack of extended reading and writing in English lessons, because too many teachers thought that they had to plan lessons that focused on activity rather than learning, so if teachers are going through with the class a Shakespeare text, that’s absolutely fine, and do nothing else, that’s fine.  If a teacher on a wet Friday afternoon is doing a fairly boring lesson on quadratic equations but the children are learning, that’s fine as well.

This did not appear to be a case of the chief inspector going “off-message” in that changes in the OFSTED handbook were also made to reflect this approach.

Lesson observations

25. The key objective of lesson observations is to evaluate the quality of teaching and its contribution to learning, particularly in the core subjects. Inspectors will not look for a preferred methodology but must identify ways in which teaching and learning can be improved…

…Quality of teaching in the school…

…111. Inspectors must not expect teaching staff to teach in any specific way or follow a prescribed methodology…

Some key “progressive” jargon from the description of outstanding teaching in the previous handbook – “Teaching promotes pupils’ high levels of resilience, confidence and independence” – was removed. While it may not sound controversial, this was usually understood to refer to staples of progressive pedagogy such as groupwork, discovery learning and project-based learning. Inspectors had been advised to ask:

Are pupils working independently? Are they self-reliant – do they make the most of the choices they are given or do they find it difficult to make choices? To what extent do pupils take responsibility for their own learning? How well do pupils collaborate with others? Do they ask questions, of each other, of the teacher or other adults, about what they are learning? Are pupils creative, do they show initiative?

One popular (but unofficial) guide to inspecting lessons, used in differing versions in a lot of schools and often believed to be popular with inspectors, claimed (among other atrocities, such as learning styles) that a lesson would be inadequate if:

The children are not used to collaborative talk / working with a talk partner…

Classroom practices discourage independence…

To actually remove the requirement that inspectors look for independence and resilience in their observations was a huge shift. That the removal of this criterion was not a mistake was made clear by the comment that:

“…Not all aspects of learning, for example pupils’ engagement, interest, concentration, determination, resilience and independence, will be seen in a single observation.”

The message was quite explicit, and I had helped spread it. Didactic teaching, unexciting content and plenty of teacher-led activities were perfectly acceptable if they resulted in learning. “Independence” was no longer a requirement for lesson observations. Unfortunately, it was all bollocks.

“How I feel when I read that Ofsted don’t require any specific teaching style” – Tom Bennett

So here’s the truth. Here’s what they actually want. Here are quotations about teaching and learning from OFSTED reports carried out since the new handbook was introduced in September 2012. (Dates given are the dates for the inspection). Here’s everything I could find about direct instruction, groupwork, whole class teaching and discovery learning in OFSTED reports. This is what they have been saying up and down the country at the chalkface, in a variety of schools with a variety of overall gradings.

In good lessons, teachers plan activities and use a wide variety of resources that enthuse and cater for the full range of needs and abilities. Teachers monitor learning throughout and pupils are encouraged to work in groups and engage in independent learning activities.

For example, in an English lesson, pupils worked enthusiastically in groups, rotating to different tables every 10 minutes to work together in solving challenges on the various forms and uses of verbs.

More typical, however, was a mathematics lesson seen on mirror images where progress was slow as some pupils had already mastered the topic in the previous year. The teacher’s extended presentation and setting of the same work for the entire class failed to cater for this group’s needs. The majority of lessons require improvement to enable pupils to consistently learn well and progress at a faster rate because typically: …

− teachers talk for too long and dominate the learning to the point that pupils begin to get restless

Brimsdown Primary School, Enfield, 11–12 September 2012

Teachers spend too long talking to the whole class, which restricts the time available for pupils to get on with their work.

Aylesford Primary School, Aylesford, 12–13 September 2012

In most lessons, pupils are keen and enthusiastic learners who relish the opportunity to work together. However, they are not always given enough opportunities to develop their independent learning skills.

St Ann’s CofE Primary School, South Tottenham,  20-21 September 2012

The most effective teachers provide significant opportunities for students to work independently. However, this was not a feature of the majority of lessons. Students’ views confirmed inspectors’ observations of too much passive learning where teachers dominated the discussion.

Shevington High School, Wigan, 26-27 September 2012

This is a school that requires improvement. It is not good because…Lessons are sometimes dominated by the teacher…

Opportunities are missed to engage everyone actively in their learning. Pupils sometimes sit and listen for too long while one pupil answers. In the best lessons, pupils are given opportunities to play a full part by briefly discussing the question in pairs. Teachers’ questioning skills are inconsistent. Not all teachers ask questions that develop pupils’ thinking skills sufficiently.

High Halstow Primary School, Rochester, 3–4 October 2012

Learners are frequently encouraged to develop their independent learning skills…. Learners value the opportunity to engage in debate in lessons, skilfully managed by teachers. A small minority of lessons are too teacher dominated and learners remain largely passive.

Brighton, Hove and Sussex Sixth Form College  (BHASVIC) 9-12 October 2012

Pupils are treated as individuals. Teachers and support staff motivate the pupils to do their  best and positive approaches help the pupils to build their confidence and self-esteem. Discussion led by the teacher or in small groups or pairs is well established. There is a buzz of activity and engagement when pupils explore their different ideas together.

St Katharine’s CE (VC) Primary School  Marlborough,  11–12 October 2012

Where teaching is outstanding, teachers encourage pupils to work together by telling them that ‘scientists work as a team’. Interesting problems posed by the teacher lead to pupils eagerly investigating such questions as whether sound travels round corners.

Duke Street Primary, Chorley, 23–24 October 2012

When teaching is less effective in Key Stages 1 and 2, the pace of learning slows when teachers spend too long talking to the whole class

Medlock Primary School, Manchester, 24–25 October 2012

In the best lessons, teachers provide opportunities for students to work independently and think for themselves. This is not always the case, and some lessons are too dominated by the teacher, which does not always help students to practise what they have learned.

Holy Family Catholic High School, Liverpool, 31 October–1 November 2012

In some lessons teachers spend too long talking to pupils and it takes too long for the pupils to engage with the activities set.

Pheasey Park Farm Primary School, Birmingham, 31 October–1 November 2012

Occasionally, however, teachers talk for too long and pupils are not given enough opportunities to offer their own ideas about how to tackle a given task.

The Crescent Primary School, Croydon, 7−8 November 2012

Some lessons include activities that help promote pupils’ personal development very effectively. For example, pupils in Year 1 developed a good understanding of their Cornish heritage when the teacher transformed the classroom so that they could pretend they were tin miners imagining the difficulties of working in the dark. Pupils worked well together in groups helping each other with their learning.

Wadebridge Primary Academy,  Wadebridge,  7–8 November 2012

In an outstanding English lesson, the energy and enthusiasm in the room were infectious. Animated Year 7 students worked in groups debating the vocabulary used by Dickens in an excerpt from A Christmas Carol….

Occasionally, tasks in lessons are not tailored closely enough to students’ needs, with all expected to complete more or less similar work. These lessons do not provide enough challenge. They offer too few opportunities for students to work independently or in groups and too much of the ‘talk’ comes from the teacher, so that students are not able to contribute enough. Where this happens students do not meet their potential in the lesson.

The Forest School, Horsham, 14–15 November 2012

As a result of the teacher’s advice, Year 6 pupils have made good progress in science when encouraged to work together more to investigate soluble substances….

In some lessons observed, for example, when counting coins or practising times tables, all pupils were expected to recall the same skills at the same time…

In an outstanding Year 5 mathematics lesson, teachers and teaching support staff were deployed very well to work with different ability groups when investigating how to plot coordinates…

In a good Year 6 science lesson, pupils were encouraged to talk about each other’s views and offer critical opinions about whether sugar, flour and sand would dissolve in water.

St John the Evangelist RC Primary School, Bolton, 14–15 November 2012

In a few lessons, pupils were less enthusiastic because the teachers’ explanations took too long and were too complicated, and there was too much recapping of work already familiar to pupils. This left little time for pupils to work on individual tasks, and occasionally, these were class tasks, rather than adapted to meet individual needs.

College House Junior School, Chilwell, Nottinghamshire, 21–22 November 2012

A minority of teaching still requires improvement. In these cases, the teacher dominates the lesson by talking too much and/or over-directing learning. This cuts down students’ opportunities to develop their independent learning skills by working on their own or in groups.

Hellesdon High School, Norwich, 21–22 November 2012

In the best lessons questions are clearly focused and teachers provide opportunities and time for pupils to think for themselves and work in pairs, for example to solve problems.

Children in the Nursery class regularly enjoy a variety of interesting and stimulating activities. Adults are knowledgeable about the needs of very young children and plan their play and learning so it is imaginative and challenging.

Most pupils, especially those in Key Stage 2, know how well they are doing and can say what they need to do in order to develop their learning further. Pupils say that they find teachers’ comments and marking helpful but not all benefit from the opportunities to read and think about what their teacher has to say.

Teachers strive to make lessons as interesting as possible and encourage pupils to work together and listen to what others have to say. Pupils in one class were able to show how well they work together when they were asked to put the Big Bad Wolf on trial. They took delight in working in small groups to form a prosecution and defence and to bring forward fairy-tale character witnesses to support them in assessing the wolf’s guilt or innocence.

Many lessons are well planned with a variety of activities for pupils of different abilities. However, in a small minority there is not enough evidence in the planning of how teachers intend to meet individual pupil’s needs and all activities are the same….

In an outstanding lesson, pupils showed  their ability to work well together as they used their creativity and imaginations to compose mobile phone ring tones and produce short animations to a very high standard….

In some lessons, opportunities are missed to harness the enthusiasm and interest of more-able pupils who have the ability and drive to work on their own, in pairs or in groups to find things out for themselves, explore and discuss their ideas.

Calverley Parkside Primary School, 22–23 November 2012

In the less effective lessons there are common weaknesses. In some cases, teachers spend too long talking to the whole class… These lessons offer too few opportunities for students to work independently or in groups

Chichester High School for Boys, Chichester, 27–28 November 2012

In the most effective lessons, teachers plan a variety of creative learning activities which stimulate and challenge pupils to explore their learning for themselves and to use their good social and interaction skills. This was particularly found in the foundation unit and special resource centre. The very colourful integrated provision provides pupils with large free-flowing spaces and stimulating resources which help them to play and learn together and to develop their knowledge and understanding about the world in which they live.

… Occasionally, teachers’ questioning is not demanding enough, and teachers do not encourage pupils to share their thoughts, feelings and opinions with each other. An example of this was seen in a Year 3 lesson where pupils were considering why it is important for Hindus to make a pilgrimage to Varanasi in order to ‘wash away sins’. Pupils spent too much time being told what they needed to do, and not enough time thinking for themselves.

Mayflower Primary School, Plymouth, 28–29 November 2012

What does the school need to do to improve further? Increase the proportion of outstanding teaching by… making sure that teachers do not dominate lessons by talking for too long or telling pupils things they could read or work out for themselves…

…Occasionally, teachers are too quick to explain things or to ‘tell’ pupils something that they could have challenged pupils to explain or find out about for themselves.

Penn Fields School, Wolverhampton, 4–5 December 2012

In English, marking motivates pupils to be imaginative as well as improving the technical aspects of their writing. Occasionally, teachers miss opportunities to encourage pupils to explore mathematical ideas in a similarly creative way.

Bayton CofE Primary School, Near Kidderminster, 5–6 December 2012

In less effective lessons, there is an over reliance on whole-class activities that are sometimes dominated by the teacher. This prevents students from taking their own initiative or developing the ability to work more independently.

Chilton Trinity School,Bridgwater, 6–7 December 2012

In the best lessons enthusiastic teachers use a wide range of resources to capture pupils’ interest. In some, however, teachers spend too long talking to the whole class when introducing the lesson and pupils lose valuable time when they could be working by themselves. This limits the progress pupils can make and hinders their ability to practise and improve their work.

…Teachers encourage pupils to work together and this allows them to learn from each other. In a Year 5 religious education lesson pupils worked in pairs in a role-playing exercise, followed by small group discussions to describe a situation and share ideas successfully.

St Vincent’s RC Primary School, Newcastle-upon-Tyne, 18–19 December 2012

In the best lessons, teachers provide opportunities for students to work independently or collaboratively. This is not always the case and a few lessons are too dominated by the teacher, preventing students thinking for themselves and taking the initiative.

Preston Muslim Girls High School, Preston, 15–16 January 2013

Pupils’ ability to work exceptionally well in groups and independently for extended periods makes a significant contribution to their outstanding learning.

Tonacliffe Primary School, Whitworth, 16–17 January 2013

In the few lessons where teaching requires improvement, lessons are too teacher-dominated and students are asked to complete repetitive tasks. Students have excellent attitudes to learning. They say that teachers ‘make lessons fun and interesting’ and are strongly motivated to achieve. When working in groups, they value each other’s views and enjoy collaborating to solve problems set by their teachers. Students develop confidence and resilience as learners, and this supports their good progress.

Madeley High School, Crewe, 17–18 January 2013

Lessons are planned in a way that challenges all students to make rapid progress. They are typically characterised by… opportunities for students to work together in groups and to help each other…

St Albans Girls’ School, St Albans, 22–23 January 2013

Some features of effective teaching are seen across all subjects. Teachers give students opportunities to develop the ability to work on their own and to collaborate and discuss their work in groups.

Bilton School, Rugby, 23–24 January 2013

I’m sure I can be accused of being selective, but I doubt that an opposing case could be made from the reports I have seen. Whatever Michael Wilshaw says, teacher talk is still out and groupwork and discovery learning are still in.  I deeply regret that I was ever stupid enough to believe that the views of the chief inspector and a revised handbook would be enough to cure the Child-Centred Inquisition of their mission to enforce trendy teaching methods on us all. OFSTED remains the steadfast enforcer of the orthodoxies of progressive education, and it is OFSTED, not league tables or government policies, which most shapes our classroom practices.

We should also remember that these quotations are not simply dry statements in reports which will be read by very few; they reflect judgements that will have consequences for schools and for individuals. Teachers, many of whom will have considered themselves highly effective, will have been sat down and told they are inadequate because they talk too much or because their classes were not working in groups. Headteachers in schools with good results and happy kids will be told they need to improve because of their reliance on traditional teaching methods. Leaders of appalling, under-performing schools who fail their students by chasing after all the latest guff will have been told that they are “good” because they have introduced groupwork into every lesson. Careers will have been made or ruined on the back of the unofficial ideology enforced, through fear, by OFSTED. My view is that until OFSTED are abolished, or reformed beyond recognition, our system will remain imprisoned by the progressive orthodoxy no matter what the politicians, or the chief inspector, happen to say.


Policy Based Evidence Making

February 8, 2013

Apologies for being a week late (and, therefore, not terribly topical) with this one. There were technical difficulties.

There seems to have been a craze recently for people engaged in partisan arguments about education to claim that those they disagree with have ignored the evidence. It seems that the more ideological one’s own position is, the more likely it is that one will declare other people’s views to be without evidence. So, for instance, SWP activist Michael Rosen, a man so divorced from the evidence on how children learn to read that he thinks phonics is “barking at print”, declared in one of his tedious rants directed at Michael Gove that

For you to be able to push through what is fast becoming an exam that will be a major impediment for most young people to develop as learners, you must… ignore all evidence on adolescents and learning.

Similarly, in another Guardian rant, this one so short of actual evidence or argument that Tom Bennett described it as “a joyless donkey ride across the greatest hits of armchair fantasy edu- football”, Suzanne Moore argued that:

Gove, charming as he is, is one of the most profoundly ideological of the lot. One would have thought that a man of his intelligence might push through policies based on evidence. Evidence-based policy-making is all the rage you know. Scientists even do it! But no: the entire education system is now one vast experiment without any aim except the reach of Gove’s ambition.

A third example can be found from this blogger. I was saddened to read:

Like many teachers at a senior level, I have an MA. Three years of hard work in my own time, travelling up to 80 miles on a round trip once a week or so, I wasn’t going to waste my time. The time I used was spent on gathering evidence, from the research of others and from my own work in the classroom. No evidence, no MA basically. There is little evidence in Gove’s ideology.

The one source he has quoted, Daniel Willingham, is a cognitive scientist, not a primary or secondary school teacher, in the USA, a country with perhaps more rigid curriculum rules than our own. He has researched brain mechanisms and memory, and has dismissed the usefulness of learning styles. He appears to be Gove’s guru, and the source of his obsession with rote learning and the rigour of exams.

Phonics: I am not opposed to phonics as such, it is a way of teaching reading, but not the only one. The evidence base was very narrow. A study in Clackmannanshire, the smallest authority in Scotland, is the basis for the introduction of synthetic phonics in England. Too narrow a base, in a part of our nation with a different educational system. The testing too is a political tool. The use of nonsense words, whilst enabling new language learners to show their phonic skills, actually penalised good readers who for example might read the nonsense word ‘dess’ as ‘dress’ because they want to start reading real words to make sense of the nonsense. This appears to have penalised more able readers in their scores, and impacted on schools in the ‘leafier’ suburbs.

So to sum up, the author claims:

  • A professor of psychology, who has published two books on education, knows nothing about how learning works.
  • The evidence on phonics (Hattie suggested in 2009 that there were 425 available studies of phonics instruction) is reduced to one study that apparently can be ignored due to the size of the local government boundaries.
  • The alternative to both the evidence-based discipline of cognitive psychology and the empirical evidence is: the opinion of people who have done MAs in education.
  • Probably lots of other things, I just couldn’t bring myself to read any further.

However, if these contributions were not enough to make me wonder if “evidence” is a synonym for “my opinion” and “lack of evidence” is another way of saying “your opinion” there was one blog, widely celebrated on Twitter, that really got my goat. Not because it could compete with the “evidence-based ranting” approach of the above, but because it seemed remarkably plausible until you actually analysed the sources and saw how they had been cherry-picked. This is “The research v the government” from Ian Gilbert. This draws on the Hattie style analysis of education research published by Education Endowment Foundation here.

I have issues with much of the EEF analysis for a few reasons.

1) It looks at effect sizes but seems to ignore Hattie’s claim that when you use this for analysis:

Almost everything works. Ninety per cent of all effect sizes in education are positive. Of the ten per cent that are negative about half are expected (e.g. effects of disruptive students); thus about 95 per cent of all things we do have a positive influence on achievement. When teachers claim that they are having a positive effect on achievement or when a policy improves achievement this is almost a trivial claim: virtually everything works. One only needs a pulse and we can improve achievement.

Famously, Hattie’s answer is to compare effect sizes with the average effect size of 0.4. I am a little sceptical about such a cut-off point, but I would suggest that we have every reason to consider effects that are of the order of this “hingepoint” or less to be unproven even when statistically significant.

2) In the absence of decent empirical evidence, the next best thing is the evidence from experimental psychology. To ignore this on the basis of education research, which is usually of a much lower standard, strikes me as a mistake and undermines any claim to be “evidence-based”. There is an exploration of this argument here.

3) The EEF report includes both evidence from studies and opinions which do not clearly draw upon, and sometimes contradict, the studies.
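Since much of what follows turns on comparing effect sizes against that 0.4 “hingepoint”, here is a minimal sketch of how a standardised effect size (Cohen’s d, the usual measure behind these meta-analyses) is calculated. This is my own illustration, not Hattie’s or the EEF’s method, and the test scores are entirely invented:

```python
import statistics

def cohens_d(group_a, group_b):
    """Standardised mean difference between two groups (Cohen's d).

    The raw difference in means is divided by the pooled standard
    deviation, so results are comparable across different tests.
    """
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    # Pooled (weighted) variance across both groups
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Invented test scores for an "intervention" class and a control class
intervention = [62, 58, 71, 65, 60, 68, 73, 64]
control = [57, 55, 66, 61, 54, 63, 67, 59]
print(round(cohens_d(intervention, control), 2))
```

The point of Hattie’s observation is that almost any intervention produces a positive number here; only values comfortably above about 0.4 tell you the intervention beat the background rate at which “everything works”.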

Gilbert’s blog ignores these problems, but exploits the third so as to sometimes quote from the research conclusions and sometimes from the opinions accompanying them, according to whichever contradicts the government. For these reasons much of his “evidence” seems much less convincing under scrutiny, and I will address each claim in turn.

Claim 1: Ability grouping harms middle and low attainers.

This is a classic among educational ideologues. I have lost count of the number of times I’ve heard it claimed that this is what all the evidence shows by someone who promptly discovers that they cannot find the evidence in question. While I am yet to do a full review of the evidence myself, I can point out that this particular claim is made by the EEF authors on the basis of four meta-analyses. Two of them found a small positive effect for ability grouping. The one that found the largest negative effect (-0.12) for low attainers, according to their own description, actually found a positive effect for homogeneous grouping of +0.12. Given Hattie’s observations about how education research usually finds a positive effect (and a bigger one than this), inverting the result seems unfair. Also, all these meta-analyses are from the 80s and 90s, meaning one of the most rigorous bits of education research on ability grouping isn’t included.

Claim 2: There is no evidence for the benefits of school uniform.

This may be an accurate description of the research, but given that elsewhere the opinions of the EEF authors are quoted as evidence, it seems a little odd to ignore that they go on to say:

When combined with the development of a school ethos and the improvement of behaviour and discipline, the introduction or enforcement of a school uniform can be successfully included as part of this process.

Claim 3: Performance pay doesn’t work.

I’m not going to argue with that.

Claim 4: Evidence does not support longer school days.

The EEF authors actually concede there is evidence of effectiveness but doubt whether it is cost effective. It would have been equally possible to quote the following section:

Overall approaches to increasing the length of the school day or the school year add on average two months additional progress to pupils’ attainment over the course of a year. Additionally, research based on international comparisons, looking at average times for schooling in different countries is consistent with this conclusion.  However, it should also be noted that pupils from disadvantaged backgrounds benefit by, on average, an additional half a month’s progress relative to their peers suggesting that extending school time can be an effective means to improve learning for pupils who are most at risk of failure.

Here, we have an exact reversal of the way claim 2 was treated. For claim 2, the summary of the evidence was reported but not the opinion of the EEF authors. Here, the opinion of the EEF authors (that it is not cost-effective) is reported but the summary of the evidence (that it works) is not. Nothing could show more clearly how selective Gilbert is being.

Claim 5: SEAL works.

This is one where the EEF authors are partly to blame. They appear to have quoted a wide variety of studies related to the social and emotional aspects of learning as supporting the effectiveness of SEAL. However, they (unlike Gilbert) admit that when SEAL itself was studied the evidence was not good: “A quasi-experimental evaluation of the impact of the secondary programme did not find a significant impact on attainment in the SEAL schools.”

Claim 6: Nick Gibb was wrong to recognise the success of phonics.

This is again an outrageous selection of opinion over evidence.

The evidence, as the EEF authors admit, indicates “Phonics approaches have been consistently found to be effective in supporting younger readers to master the basics of reading. The approach tends to be more effective than other approaches to early reading (such as whole language or alphabetic approaches)…” Unfortunately the rest of the passage is marred by the usual phonics denialist rhetoric, used to obscure the clear message of the evidence with qualifications which can’t actually be deduced from it. Gilbert has quoted only from this obfuscation and opinion.

Claim 7: Despite Gove’s support for sitting in rows, collaborative learning works really well.

This is really one with a lot of background and I intend to blog about it in more detail at a later date. However, it is worth mentioning that the actual research on sitting in rows is ignored here. It is also worth mentioning that the effect size the EEF authors find for collaborative learning is 0.42. Hattie found 0.41. Neither is really distinguishable from Hattie’s “hingepoint” of 0.4, making “collaborative learning” less than clearly effective. This is a case where I would suggest we look at the evidence from psychology. We have 100 years of psychology research supporting “the Ringelmann effect”: a general tendency for people to become less motivated when made to work in groups.

Claim 8: In contrast to the government’s policy of ending ringfencing for one-to-one tuition, such tuition does work.

This is another one where relevant opinions of the EEF authors are ignored. They state that one-to-one tuition is expensive and other alternatives should be considered. Ending ringfencing (as opposed to stopping all one-to-one tuition) actually seems to be in line with this opinion.

Claim 9) Early Years Intervention works, despite a government minister saying Sure Start isn’t a candidate for more money.

Like claim 8, this seems to miss the difference between something being a good use of money and it having an effect. More importantly though, it ignores that a general level of evidence for this form of intervention isn’t necessarily evidence for Sure Start, as the EEF authors acknowledge: “Evaluations of Sure Start in the UK do not show consistent positive effects and indicate that some caution is needed when generalising from exceptionally successful examples”.

Claim 10) Peer tutoring works and this disproves Gove’s point of view about collaborative learning.

The immediate problem with this is that Gove’s opposition to collaborative learning was actually an interpretation of a comment about sitting kids in rows (the research on which is again ignored here), so it is far from clear that the evidence on peer tutoring actually has any relevance to what he said. However, even if he does have a general dislike of groupwork, then this cannot be said to be disproved by picking the one type of groupwork with a strong positive effect. Why not? I think the following is a really good explanation of why this is not a fair way to do research:

Claim 11) Something about meta-cognition

I don’t even get what is being claimed here.


What You Sow…

February 1, 2013

I’m sure this is just a coincidence, and in no way a matter of cause and effect.

Here we have a description of behaviour from a Northampton School for Boys student which got a lot of media coverage:

And here we have what I wrote about the headteacher not so long ago:

A prominent headteacher … disapproves of using detentions as punishment rather than relationship building.

In front of the Education Select Committee (minutes here; video here) Mike Griffiths, Head of Northampton School for Boys and witness for the Association of School and College Leaders said:

Detentions are not terribly useful. People tend to try and find a more creative way of dealing with issues, because to get good discipline you need to work with youngsters and get their co-operation. Simply penalising and depriving them of time and so on isn’t always helpful. The only time when I think it can be useful is when that time is used by the teacher to constructively work with that individual child, in a way that they don’t normally have time to, to actually rebuild the relationship. Personally, I am completely against the notion of what I think in some schools is called faculty detention, where somebody else runs it. As far as I can see, the only reason for keeping a youngster behind is to enable me, as the teacher, to improve relationships with that youngster, but that’s unlikely to occur if the youngster perceives the detention as being a period of almost imprisonment.
