Archive for April, 2015


Yet Another Andrew Old Round Up

April 24, 2015

A couple of things I really should promote before it’s too late:

Firstly, I am speaking on a panel discussion at an event tomorrow based around “Character vs Knowledge? What is the purpose of education?” This is organised by the East London Science School and The Education Foundation. Details (and still the chance to buy a very cheap ticket) can be found here.

Secondly, assuming my contribution survives the editing process, I should have a chapter in Changing Schools: Dispatches from the Front Line of England’s Rapidly Changing Educational Landscape. This book, which is now available for pre-order, is edited by Robert Peal and should also have contributions from, among others, Doug Lemov, Daisy Christodoulou and Tom Bennett.


Nobody Believes in Learning Styles Any More, Do They?

April 22, 2015

If you are a connected teacher, reading blogs and following Twitter, you could be forgiven for thinking that nobody believes in learning styles any more. It’s been discredited again and again. I was dismissing them 5 years ago. Just a few minutes searching online (and knowing that adding the word “debunked” to a search for an educational idea is always worth a try) shows that while there might still be material out there promoting them, learning styles are no longer the mainstream idea they were a few years ago. They are often used, perhaps along with brain gym, as an example of nonsense that we don’t believe any more.

To think they have gone, though, is a mistake. Whenever this comes up, teachers can give recent anecdotal examples of observation forms or teaching and learning policies that still push them. I can find recent blogs that promote them (e.g. here or here) or that mention them in passing as if they were still credible (e.g. here). They are the walking dead of pedagogy, still shuffling about long after they should have been buried.

However, we shouldn’t be surprised at finding a few mentions in blogs (although we perhaps should be surprised I found several just from this month). Blogs are just one person’s opinion and are not always up to date. Last year I seem to recall discovering that at least a couple of the blogs I read about learning styles were by people who had gone overseas for a few years and missed the fact that learning styles were no longer in fashion. Nor should we be surprised that some schools take longer to get over old fads; they may also have been the last to adopt them. What is of more concern is where learning styles are still being taken seriously by those whose influence is felt more widely.

For starters, a source sent me a copy of material used for a course at the University of Warwick for undergraduates wanting to become teachers. Here are the details of a session held in January of this year:

[Screenshot: University of Warwick course session details, January 2015]

However, the most incredible example of the continuing existence of learning styles is in the one area of education most conspicuously left alone by Gove, Early Years. In the statutory framework for the Early Years, in effect from September 2014, the section on assessment requires the following (by law):

Assessment plays an important part in helping parents, carers and practitioners to recognise children’s progress, understand their needs, and to plan activities and support. Ongoing assessment (also known as formative assessment) is an integral part of the learning and development process. It involves practitioners observing children to understand their level of achievement, interests and learning styles, and to then shape learning experiences for each child reflecting those observations. In their interactions with children, practitioners should respond to their own day-to-day observations about children’s progress and observations that parents and carers share. [my emphasis]

So here we are, 5 years after the blogosphere cottoned on to learning styles being nonsense, and they are still being taught by educationalists in a top university, and it is required by law they be assessed by EYFS practitioners.


The arguments against the phonics screening check have been discredited

April 2, 2015

I had the inevitable holiday run-in with phonics denialists on Twitter. Not really worth rehashing any of it here; none of the arguments are new. However, I hadn’t realised that a lot of them, including primary teachers (and presumably this also applies to a lot of primary teachers who are not denying the evidence for phonics on Twitter), are not actually aware that the main arguments used to deny the usefulness of the phonics screening check have now been discredited.

We now have the results from the students who took the phonics check in 2013 and did their key stage 1 reading assessment in 2014. And (from page 12 here) we learn that:

Pupils who do well in the phonics screening check do well in reading at the end of key stage 1. 99% of pupils who met the expected standard of phonic decoding in year 1 went on to achieve level 2 or above in reading at the end of key stage 1. 43% of these pupils achieved level 3 or above in reading. 88% of pupils who met the expected standard of phonic decoding at the end of year 2 achieved level 2 or above in reading. Only 34% of pupils who didn’t meet the expected standard of phonic decoding by the end of year 2 achieved level 2 or above in reading.

Looking at the more detailed results from here (Table 14) we can break down performance in the KS1 assessment by the results of the phonics screening check. The differences between those who passed 1st time (blue), those who passed 2nd time (red) and those who didn’t pass (orange) are striking.

[Chart: KS1 reading performance broken down by phonics screening check result — passed first time (blue), passed second time (red), did not pass (orange)]

If you were around for the debates over the introduction of the check, you’d know that the following claims were made at the time:

  • Good readers would do badly in the phonics check.
  • The check would not tell us anything useful about their ability to read.
  • Teaching students to pass the phonics check would harm students’ ability to read later.
  • It would tell us nothing that teachers did not already know.

If you know anything about testing, you’d know that a test which identifies loads of pupils (in fact a big majority of the cohort) who will have a 99% chance of succeeding at the next level is incredibly useful. And even the 66% figure for indicating those who will do poorly in the reading assessment is remarkable for a 5 minute check. Which teacher would not want to know if students were in the blue, red or orange distributions above? This is remarkably extensive information about probable future performance, gained in really very little time. It also tells us that the first 3 claims above made by opponents of the phonics check do not match up with what generally happens. Those who do badly in the phonics check (particularly twice) are rarely good readers. Check performance tells us a lot about subsequent reading scores. And those students who have been most effectively prepared for the check also appear to be better prepared for the reading test.
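For anyone wanting to check the arithmetic, the 66% figure is simply the complement of the 34% quoted from the DfE figures above (the variable names below are my own, chosen for readability):

```python
# Figures quoted above from the DfE analysis of the 2013 phonics check
# and the 2014 key stage 1 reading assessment.
pct_passers_reaching_level_2 = 99      # year 1 check passers reaching level 2+ in KS1 reading
pct_non_passers_reaching_level_2 = 34  # pupils never meeting the standard who reached level 2+

# The 66% figure is just the complement of the 34%:
pct_non_passers_below_level_2 = 100 - pct_non_passers_reaching_level_2
print(pct_non_passers_below_level_2)  # prints 66

# The gap between the two groups, in percentage points:
print(pct_passers_reaching_level_2 - pct_non_passers_reaching_level_2)  # prints 65
```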

Of course, the last claim of the opponents, that teachers already knew all the stuff the check told them, could be true. But given the impressive figures for the predictive ability of the phonics check, I think the burden of proof now lies squarely on those who claim that teacher assessment would be more accurate.

Update 2/4/2015:

I was perhaps a bit naive with this post. I didn’t guess that the general response of phonics denialists would be to claim that everybody already knew performance in the phonics screening check would be closely correlated with reading ability, and effectively to deny that any of the claims above (except perhaps the claim that teacher assessment would be more accurate) had ever been an issue. So, just in case there is any doubt that people claimed the phonics check would cause problems for those who could read and would tell us nothing about reading ability, here’s a link to a letter opposing the phonics check from June 2012.

Please note it contains the following claims:

we [don’t] believe that this will help parents know how well their children are learning to read…

They will not show whether a child can understand the words they are reading, nor provide teachers with any information about children’s reading ability they did not already know…

The use of made-up words …. risks … frustrate [sic] those who can already read

…using unrealistic, arbitrary benchmarks in the checks plucked out of the air is of benefit to no one.

The signatories included:

  • Mary Bousted (General secretary, Association of Teachers and Lecturers)
  • Russell Hobby (General secretary, National Association of Head Teachers)
  • Christine Blower (General secretary, National Union of Teachers)
  • David Reedy (United Kingdom Literacy Association)

It also included Stephen Twigg and Lisa Nandy who were both Labour frontbench education spokespeople and the prominent anti-phonics activist Michael Rosen.

This was not some fringe group. These were the loudest enemies of the phonics screening check. And they were all utterly wrong.

Anybody know if any of them have acknowledged this?

 

 
