Archive for July, 2012

The Two Types of Guardian Journalism About Where to Send your Kids to School

July 31, 2012

[Dilbert comic]

Type 1: Unbelievable Hypocrisy

If you had told me twenty years ago that I’d send my children to a private school then I wouldn’t have believed you. At the time, having gone to a top public school myself, and as an activist in the Socialist Workers Party (Oxford University branch), I thought that private schools were the cause of all Britain’s social problems and that they represented everything I hated about this country and my life. I also believed that England’s comprehensives were the finest, most noble institutions to have ever been created and that anyone who did not agree must have been influenced by the Daily Mail, the Tory Party and a virulent hatred of the poor. However, since Caitlyn and Jeremy were born I have had time to reflect. I now realise that some of my local schools aren’t as good as they should be. Class sizes in them are much too big. Some of the other children in them are funny-looking. Also, having visited my GP’s surgery 337 times this year, they have agreed that Jeremy has Special Needs and I don’t believe that the local state school can meet those needs as well as the small class sizes and dedicated teaching staff at Eton. Some may accuse me of hypocrisy but actually I just care about my children. Besides, there’s no difference between what I am doing and moving into the catchment area of a good comprehensive, converting to Anglicanism, and spending £30,000 on suing the local authority, which is what most of my friends have done. I am still really left wing and radical. Just look at what I wrote last week about how I hate the royal family. I’m really radical.

Type 2: Patronising Self-Righteousness

Nobody is more evil than somebody who sends their child to a private school. I went to a top public school myself and it never did me any good, except for getting me into Oxbridge and a career in the media where I earn a six-figure salary. I have lost count of all the people at my dinner parties who have said to me:

“You aren’t going to send Caitlyn and Jeremy to a state school, are you? They’ll mix with the wrong sort. And would you mind passing me some more hummus?”

However, after visiting the brand new building of the local academy, and checking my bank balance, I decided that it would be in society’s best interest for Caitlyn and Jeremy to go to their local state school. No, no, don’t thank me. It’s not the truly selfless, altruistic example of personal heroism it looks like. Actually it’s in Caitlyn and Jeremy’s best interest. After all, they are so gifted they don’t actually need all those small classes and extra tuition we could have paid for. What going to a state school will give them is the opportunity to make friends from a wide variety of backgrounds, including poor people. Poor people are wonderful and I believe that to the bottom of my heart even though I have never actually met a poor person. Also the teachers are wonderfully committed in my local state school. If you don’t send your child to the local comprehensive then you must hate poor people and teachers. And you’re probably a racist too. Not like me. If everyone did what I have done all the social divisions in this country would simply melt away. In fact we should make everyone do this. Otherwise it’s not fair.

What you won’t see in the Guardian is this:

I didn’t go to a private school and I can’t afford to send my kids to a private school. I hope the local state school is good enough. I know that most state comprehensives aren’t, and, except for a few ideologues, most people who can afford to avoid them, or can work the system to avoid them, do so. A genuinely radical left-wing policy would be to work on improving state schools so that they are good enough for even the most anxious middle class parent to use without worrying. But that is a difficult policy to argue for in the pages of the Guardian, not because of the cost, but because it would involve challenging some deeply held views of the middle class left. It would challenge the belief that children are natural saints whose bad behaviour only results from false consciousness created by capitalism, social problems and insufficiently compassionate teachers. It would challenge the belief that children learn best through play, having fun or being preached at about the importance of tolerance. It would challenge the belief that all we need to do is claim to care a lot, and have the most politically acceptable structures, and everything will sort itself out without a lot of effort or any change in attitude on the part of everybody with power and influence over education. Additionally, it would involve admitting that the question of where the upper middle class choose to send their children is an irrelevant distraction from the actual issue of what happens to the majority of our children in the majority of our schools.

Dylan Wiliam’s Lecture and “Sharing Good Practice”

July 30, 2012

The following talk from Dylan Wiliam is rather interesting. He is the man behind the AfL craze and tends to have a lot of interesting ideas, although I’ve been sceptical about the implementation ever since I watched “The Classroom Experiment” some time ago.

Of particular interest is the section on “sharing good practice” where he makes a good case against sharing good practice.

I think sharing good practice is actually a very dangerous idea. I think it’s completely overblown. I think if you want to learn how to write a timetable going to see how another school does it is quite useful. If you want to find out about how to organise a 3 year key stage 4 then again visiting another school and sharing good practice is a good idea. But the danger is that those things have small effects. The thing that has big effects is changing teaching and, in changing teaching, sharing good practice is a fundamental distraction because teachers are like magpies. They love picking up shiny little ideas from one classroom; taking it back to their classroom;  trying it once, and then moving on to the next shiny idea. We don’t need to share good practice. Most teachers have enough good ideas to last a lifetime.

This is the exact opposite of what so much CPD in schools suggests and yet I think most experienced teachers would immediately see the truth in this. I spend a lot of time complaining that we get presented with (supposedly) new ideas that simply aren’t any good, but we should actually question whether we should even have been looking for new ideas in the first place, rather than evaluating, prioritising and perfecting our existing methods.

Speak For England

July 29, 2012

One of my heroes, the brilliant US education historian (and now infamous pro-teacher troublemaker) Diane Ravitch, recently blogged about an article from the New Statesman about education in England. Being the New Statesman, the article didn’t really give a lot of perspective on the education debate here. I commented, and that comment now forms the main part of her blogpost, in which she suggests: “I hope that other readers in the U.K. weigh in.”

Having realised that the English education system is ridiculously complex I no longer feel my contribution is particularly adequate. It would be great if other people could, indeed, weigh in.

Also, if you haven’t read any of her books, do so. Reading “Left Back” followed by “The Death and Life of the Great American School System” is a great way to catch up on over 100 years of educational history, most of which is entirely relevant to education in this country too.

Dumbing Down: The Tory Way

July 27, 2012

For a short time it seemed the tide might actually be turning. The government might actually be against dumbing-down. They might actually want teachers to know what they are doing. They might have an educational agenda beyond privatisation and union-bashing. They might actually care about what happens in the nation’s classrooms.

My optimism just ended. Tonight the government announced (apparently on Twitter) that academies would be able to employ unqualified teachers (i.e. without QTS). Now I don’t want to overdo the value of QTS. Some PGCE courses are dire. The training signified by QTS is not always worth a lot. However, what QTS does represent is a commitment to join the profession. If you wanted to dedicate your life to teaching then you needed to, at the very least, work towards QTS status. Teaching was not seen as something you do for a few months when there are no better jobs available. It was a career and a profession, not something to be done in a gap year before starting a real career.

[Dilbert comic]

Now, in the government’s fantasy, the absence of QTS will lead to schools employing highly qualified experts. Former academics would, perhaps, just wander into schools and begin a teaching career, no questions asked. Part of the inspiration is the extent to which private schools employ unqualified teachers if they have the right academic qualifications. However, aside from the question of whether the teaching in private schools often suffers as a result of this (and there is considerable anecdotal evidence suggesting it does), this completely misunderstands the mindset of state school SMT. Whereas the head of a private school will usually be highly academically qualified themselves and be looking for somebody with a similar background, our state schools have not valued academic achievement in a long time. Headteachers do not go out of their way to get the best qualified staff as it is. There is simply no reason why lowering the bar in one way (removing the QTS requirement) will lead to it being raised in any other way. All that has happened is that teachers have just become cheaper. You no longer have to pay even enough to attract somebody who shows signs of having wanted to become a teacher.

The image of teacher recruitment I now have is one where, in the event of a vacancy, SMT calls upon anyone they know (a family member, a former pupil) who has just finished a degree in a vaguely suitable discipline and is now unemployed. Sure, they might not be any good at teaching but they are cheap and easily replaced. The dumbed-down ethos of so many schools, which says teachers need to be only one step ahead of the pupils, will now come with significant financial rewards.

Deprofessionalisation can never improve teaching. It will, however, make privatisation easier (by removing the need for private education providers to recruit qualified staff) and reduce the bargaining power of unions over pay and conditions. Despite all the rhetoric of wanting a rigorous curriculum, this policy reveals an agenda that puts saving money and attacking the teaching profession above attracting anybody with the ability to teach a demanding curriculum.

[Dilbert comic]

Responses to Sir Ken Robinson’s Education Paradigms Video

July 27, 2012

Too often in online education debate somebody links to this particular atrocity against history and reason:

Very often it is done in such a way as to suggest that nobody could fail to be impressed by the tired old arguments contained within. I usually respond by linking to several different blogs replying to it. In order to save time in future I will put all the links here on one page.

And now I have a nagging suspicion I have missed one out. If I have, somebody please remind me.

Update: 20/3/2013: The Factory Model of Schooling from the “Webs of Substance” blog is a more recent contribution.

Update: 12/10/2013: What Sir Ken Got Wrong from the irrepressible Joe Kirby.

The Future Part 6: Does New Technology Mean We Don’t Need to Know Anything?

July 25, 2012

[Dilbert comic]

If there are no clear grounds for saying that technological progress in general is transforming society in ways which require low-content education, then the alternative is to suggest that there are particular changes happening which require it. The usual suggestion is that developments in information technology, particularly the availability of information on the internet, mean we no longer need to hold onto knowledge. In effect the “knowledge economy” is one in which nobody needs any personal knowledge.

Examples of this line of argument include:

Why do some teachers still provide children with answers when all the answers are out there on the Web? Contemporary pedagogy is only effective if there is a clear understanding of the power and efficacy of the tools that are available. Shakespeare may well have died in 1616, but surely anyone can look this up on Wikipedia if and when they need to find out for themselves? Inquiry based learning is gradually taking hold in schools, but not quickly enough. Give the kids questions from which more questions will arise. Send them out confused and wanting more. Get them using the digital tools they are familiar with to go find the knowledge they are unfamiliar with. After all, these are the tools they carry around with them all the time, and these are the tools they will be using when they enter the world of work. And these are the tools that will mould them into independent learners in preparation for challenging times ahead.

From Steve Wheeler’s blogpost.

I don’t believe, however, that we’ll see the so-called knowledge economy because at the same time what we’re seeing is knowledge become free and ubiquitous. You cannot build an economy on something that’s all around you and completely free. So I think the whole concept of the knowledge economy is actually a bit of a fake, so that’s at least one thing we can cross off our worry list.

From Caroline Walters’s speech.

If you have a tolerance for being shouted at by an exuberant American youth then you can see another version of this argument here:

And then there’s the inevitable list of supporting “facts” from Shift Happens UK:

It is estimated that a week’s worth of the Times contains more information than a person was likely to come across in a lifetime in the 18th century.

It is estimated that 4 exabytes (4.0 × 10^19) of unique information will be generated this year. That is more than the previous 5,000 years.

The amount of new technical information is doubling every 2 years. By 2010 it is predicted to double every 72 hours.

For students starting a 3 year university degree this means that half of what they learn in their first year of study will be outdated by the end of their studies.

Much of this is simply based on misconceptions about how we think. As I have argued before, we actually require knowledge to be held in our heads in order to think effectively. Being able to look something up is no substitute for knowing it if what you want to do with the information involves thinking about, understanding or effectively applying that information. Technology has not changed the basics of how we think. Here I don’t mean to replay this argument, just to give a few reasons to doubt the narrative in which this is presented as an issue raised by technological change.

Firstly, we can note that this is not a new argument. According to E.D. Hirsch (in his excellent essay on this topic): “‘You can always look it up’ has always been a watchword of the progressive approach.” The excellent Quote Investigator website, investigating the false claim that Einstein said: “I don’t need to know everything; I just need to know where to find it, when I need it”, found examples of this argument going back to 1914. We are not even the first generation to hear this argument based on technology; it is almost 40 years since Lister (1974) claimed “the revolution in media technology has made the school obsolete as a transmitter of information”.

Secondly, it is not a compelling argument even to those who have played a role in the development of online information sources. One notable critic of the idea that we can look everything up is Larry Sanger, the co-founder of Wikipedia. His essay on the importance of individual knowledge can be found here, arguing:

…to claim that the Internet allows us to learn less, or that it makes memorizing less important, is to belie any profound grasp of the nature of knowledge. Finding out a fact about a topic with a search in Wolfram Alpha, for example, is very different indeed from knowing about and understanding the topic. Having a well-understood fact ready to recall is far different from merely getting an unfamiliar answer to a question. Reading a few sentences in Wikipedia about some theories on the causes of the Great Depression does not mean that one thereby knows or understands this topic. Being able to read (or view) anything quickly on a topic can provide one with information, but actually having a knowledge of or understanding about the topic will always require critical study. The Internet will never change that.

Moreover, if you read an answer to a question, you usually need fairly substantial background knowledge to interpret the answer. For example, if you have never memorized any dates, then when you discover from Wikipedia that the Battle of Hastings took place in 1066, this fact will mean absolutely nothing to you. (Anyone who has tried to teach a little history to young children, as I have to my three-year-old, knows this.) Indeed, you need knowledge in order to know what questions to ask. Defenders of liberal arts education often remind us that the point of a good education is not merely to amass a lot of facts. The point is to develop judgment or understanding of questions that require a nuanced grasp of the various facts and to thereby develop the ability to think about and use those facts. If you do not have copious essential facts at the ready, then you will not be able to make wise judgments that depend on your understanding of those facts, regardless of how fast you can look them up.

He also replied to the blogpost from Steve Wheeler (the one I have been repeatedly quoting) in the strongest possible terms:

Mainly what I think is interesting here is that this is a professor of education, and he is espousing flat-out, pure, unadulterated anti-intellectualism. An educator opposed to teaching knowledge–it’s like a chemist opposed to chemicals–a philosopher opposed to thinking. Beyond the sheer madness of the thing, just look at how simple-minded the argument is, and from what appears to be a rather distinguished academic. I actually find this rather sobering and alarming, as I said. It’s one thing to encounter such sentiments in academic jargon, with sensible hedging and qualifications, or made by callow, unreflective students; it’s another to encounter them in a heartfelt blog post in which, clearly, a serious theorist is airing some of his deeply-held views in forceful language.

Thirdly, we don’t actually know that the ease with which we can look things up is a benefit to learning. Some have argued that it is actually harmful. The BBC reported:

The culture of clicking online for instant answers risks “infantilising” learning, says the head of a charity which runs independent girls schools.

Helen Fraser will warn delegates of the Girls’ Day School Trust about the risk of pupils relying on “nuggets of information” from the internet.

She says that children should be reading whole books, rather than gathering a few shallow impressions.

“I do worry that the ease of access to nuggets of information means that our appetites are becoming infantilised.

“We’re so used to fast facts that we’re in danger of losing sight of the truth that some learning is more of a slow casserole, with knowledge stewing in our minds to form a richer, deeper flavour,” Ms Fraser will tell the conference on Wednesday.

“So I’m a firm believer in the importance for our students of switching off the computer, the radio, the smartphone, the TV, and any other distractions, and reading a whole book – I would say from cover to cover.”

Ms Fraser says she is concerned about the way that quick-fix answers from internet search engines can leave children with a lack of awareness of different views and a one-dimensional view of topics.

Nicholas Carr wrote an essay entitled “Is Google Making Us Stupid?” in which he argued:

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)

While I am deeply sceptical about this argument, I do think that we cannot simply assume that the easy availability of information is always an advantage. Sparrow et al (2011) found evidence in a series of psychological experiments that the belief we can look up information, particularly online, may have a detrimental effect on our ability to recall information.

Good teachers, when asked a question about something which hinges on factual knowledge, don’t simply provide the answer. They provide context, explanation and ways to remember the information in the future. They may even question the student in order to discover whether the problem is a lack of factual knowledge, or a lack of understanding of what factual knowledge is relevant. Being able to acquire an answer quickly and easily from an external source is not always to one’s best advantage.

Finally, we should consider the alleged increase in “information”. None of the statistics I quoted earlier from Shift Happens UK can be confirmed by reliable sources; the one about degree courses being quickly outdated is stupendously unlikely. If they do have some basis in truth then there is an ambiguity here in the word “information”. Information can, particularly in the context of measurements of quantity like those mentioned above, mean little more than collections of bits stored in digital (or other) media. It doesn’t necessarily refer to anything useful or worth knowing. Even with a broader definition there has been more information than could be learnt in a lifetime for many years now. Gleick (2011) describes the problem of “information overload” with examples going back to 1621. Education has never been about passing on all the information in the world, only about passing on what is considered the most worthwhile knowledge in a particular culture. What constitutes this knowledge does change with time. Sometimes it is fascinating to find out when ideas we take for granted first appeared (for instance, the equals sign is apparently only a sixteenth century invention). However, what knowledge is worth knowing cannot be expected to change in reaction to the quantity of new information, only the quality. There is stability in all bodies of knowledge, even science. As Goldacre (2008) put it:

[O]ne of the key humanities graduates’ parodies of science [is that] science is temporary, changeable, constantly revising itself, like a transient fad. Scientific findings, the argument goes, are therefore dismissible.

While this is true at the bleeding edges of various research fields, it’s worth bearing in mind that Archimedes has been right about why things float for a couple of millennia. He also understood why levers work, and Newtonian physics will probably be right about the behaviour of snooker balls forever.

While there is some knowledge that is clearly replaced over time (no need to go back to teaching kids how to use a slide rule or a log table, and we can recycle any maps of the world still showing the USSR) there is no shortage of knowledge that isn’t going to be outdated anytime soon.

References

Gleick, J. (2011). The Information: A History, a Theory, a Flood. London: Fourth Estate.

Goldacre, B. (2008). Bad Science. London: Fourth Estate.

Lister, I. (1974). Deschooling. Cambridge: Cambridge University Press.

Sparrow, B., Liu, J. and Wegner, D. M. (2011). “Google effects on memory: Cognitive consequences of having information at our fingertips”. Science, 333, 776–778.

The Future Part 5: Are We Living in a Time of Unprecedented Technological Change?

July 17, 2012

I have been dealing with the argument that to educate children for the future we need to give less importance to knowledge.

We saw in my last post on the future that if technological change is seen as a long-established feature of human life then this is not easily compatible with the idea that technological change is so unpredictable that it gives grounds to remove factual content from the curriculum. The alternative argument, for those arguing that technological change is too unpredictable to allow us to identify valuable knowledge for the future, is to suggest that contemporary (and upcoming) technological change is different to technological change in the past.

From Thornberg (1991):

Our students must be prepared for life in the next century – a time of unprecedented change. The global changes of 1991 will pale in comparison to those that yet face us, and the key to thriving in the new world order is for all of us to become lifelong learners who have retained our native creativity and who know how to use information technologies effectively.

The most outrageous example of this line of argument is from a blog written by an apparently well-established education “expert”:

For a child at school in 1850, the path that their adult life would take was not a million miles away from that of a child one hundred years later in 1950. They would both leave school having been through years of drilling in the basic academic skills. Their handwriting would not be greatly different, the words that they used more or less the same. Their methods for multiplying numbers identical. And off in to the world they would go. Their social class would determine the path that they followed, but regardless of whether it be one of a banker or shipworker, teacher or bricklayer, politician or journalist, the common factor for the 1850 child and his 1950 counterpart is that both would enter jobs for life…

The world had not changed a great deal. Transplant the 1850 child to the 1950 world and there would not be a great deal that they did not recognize or understand, save the fright of a speeding car or two.

Now think of the next step in the sequence 1850, 1950…

When I was growing up, Britain was pretty much the same place as for the 1950 child. Everyone still talked about the War, overseas travel was rare, TV had two channels. The world was small.

For many centuries Britain had not changed a great deal in terms of society, principles, values and life paths.

But then it happened. The greatest, most rapid evolution of society ever known. I am of course referring to the communication revolution.

Suddenly the world was not small. Cultures now intertwined, opinions, hopes and experiences shared.

This moves us from a debatable claim into an outrageous one. The period from 1850 to 1950 was one of tremendous technological change in almost every sphere, particularly transportation, communication and manufacturing.

According to Cowen (2011):

The period from 1880 to 1940 brought numerous major technological advances into our lives. The long list of new developments includes electricity, electric lights, powerful motors, automobiles, airplanes, household appliances, the telephone, indoor plumbing, pharmaceuticals, mass production, the typewriter, the tape recorder, the phonograph, and radio, to name just a few, with television coming at the end of that period. The railroad and fast international ships were not completely new, but they expanded rapidly during this period, tying together the world economy.

We are perhaps too eager to assume that the changes we see in our lifetimes are incomparable to those in other eras. If the internet revolution seems to be the most exciting development in communications ever then consider the following description of the reaction to the telegraph and telephone:

In their earliest days these inventions inspired exhilaration without precedent in the annals of technology. The excitement passed from place to place in daily newspapers and monthly magazines and, more to the point, along the wires themselves. A new sense of futurity arose: a sense that the world was in a state of change, that life for one’s children and grandchildren would be very different, all because of this force and its uses.

Gleick (2011)

Alongside the case that rapid technological change is not unprecedented, it is also worth considering the argument that there are significant spheres of life where the impact of technological change has, if anything, noticeably slowed down.

According to Cowen (2011):

Today, in contrast, apart from the seemingly magical internet, life in broad material terms isn’t so different from what it was in 1953. We still drive cars, use refrigerators, and turn on the light switch, even if dimmers are more common these days. The wonders portrayed in The Jetsons, the space-age television cartoon from the 1960s, have not come to pass. You don’t have a jet pack. You won’t live forever or visit a Mars colony. Life is better and we have more stuff, but the pace of change has slowed down compared to what people saw two or three generations ago.

… it was easier for the average person to produce an important innovation in the nineteenth century than in the twentieth century. It’s not because everyone back then was so well educated — quite the contrary, hardly anyone went to college — but rather because innovation was easier and it could be done by amateurs. The average rate of innovation peaks in 1873, which is more or less the beginning of the move toward the modern world of electricity and automobiles. The rate of innovations also plummets after about 1955, which heralds the onset of a technological slowdown…

… a consistent pattern shows up in other numbers. Across the years 1965 to 1989, employment in research and development doubled in the United States, tripled in West Germany and France, and quadrupled in Japan. Meanwhile, economic growth has slowed down in those same countries, and the number of patents from those countries has remained fairly steady. The United States produced more patents in 1966 (54,600) than in 1993 (53,200). “Patents per researcher” has been falling for most of the twentieth century.

Another economist, Paul Krugman, was making a similar argument over a decade ago:

…if you measure the progress of technology not by Mips and bytes but by how it affects people’s lives and their ability to get useful work done, you realize that the last 30 years have been a time not of unexpected achievement but of persistent disappointment.

Surely, for example, the startling thing about computers is not how fast and small they have become but how stupid they remain. Back in 1958 the pioneer computer scientist Herbert Simon confidently predicted that a computer would be the world’s chess champion by 1970; this makes the inability of IBM’s Deep Blue to beat Gary Kasparov even now a bit of a letdown. And building a computer that plays high-level chess turns out to be an easy problem — nowhere near as hard as, say, designing a robot that can vacuum your living room, an achievement that is still probably many decades away.

Better yet, think about how a typical middle-class family lives today compared with 40 years ago — and compare those changes with the progress that took place over the previous 40 years.

I happen to be an expert on some of those changes, because I live in a house with a late-50s-vintage kitchen, never remodelled. The nonself-defrosting refrigerator, and the gas range with its open pilot lights, are pretty depressing (anyone know a good contractor?) — but when all is said and done it is still a pretty functional kitchen. The 1957 owners didn’t have a microwave, and we have gone from black and white broadcasts of Sid Caesar to off-color humor on The Comedy Channel, but basically they lived pretty much the way we do. Now turn the clock back another 39 years, to 1918 — and you are in a world in which a horse-drawn wagon delivered blocks of ice to your icebox, a world not only without TV but without mass media of any kind (regularly scheduled radio entertainment began only in 1920). And of course back in 1918 nearly half of Americans still lived on farms, most without electricity and many without running water. By any reasonable standard, the change in how America lived between 1918 and 1957 was immensely greater than the change between 1957 and the present.

Another supporter of the theory that technological change has slowed down is Peter Thiel, the billionaire co-founder of PayPal:

I believe that the late 1960s … scientific and technological progress began to advance much more slowly. Of course, the computer age, with the internet and web 2.0 developments of the past 15 years, is an exception. Perhaps so is finance … There has been a tremendous slowdown everywhere else, however. Look at transportation, for example: Literally, we haven’t been moving any faster.

(Quoted here.)

The argument that we are in an era of unprecedented technological change is far from proven and, if Cowen, Krugman and Thiel are right, it may be the opposite of the truth. It is certainly not grounds for reducing the content of schooling in order to prepare for a flood of new technology.


References:

Cowen, Tyler (2011) The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better, Penguin Group, Kindle edition (locations 90-93).

Gleick, James (2011) The Information: A History, a Theory, a Flood, Fourth Estate, London.

Thornburg, David D. (1991) Edutrends 2010, Starsong Publications.

h1

English Language GCSE – Narrowing the Horizons of the Next Generation

July 10, 2012

Most of the material in this blog was pointed out to me, or sent to me, by English teachers. I won’t mention any names as they may not wish to be associated with the opinions here, but they should all feel free to claim credit in the comments afterwards.

One of the subjects most at risk of having its content hollowed out is English. This is because we apply our knowledge of the English language all the time and so virtually any activity, no matter how mindless, can be presented as an attempt to practise English “skills” and be used to replace the actual teaching of useful content. We see this in the reaction to phonics tests, and proposed grammar tests at primary school. It is claimed that learning definite knowledge, such as what letters mean or what the parts of a sentence are, will distract from more important skills and dispositions like being able to express oneself or having a fondness for books.

However, we also see this at GCSE. Because there is a separate English literature GCSE it is argued that there is no reason for the English language GCSE to rise above mundane and trivial uses of the English language. The following three examples show this is the quite explicit outlook of three different exam boards.

Firstly, from Edexcel, we have the (already widely reported) controlled assessment on “Talent Television”. This assessment included such exemplary uses of English as the front cover of Heat magazine and the Britain’s Got Talent website, and asked questions such as:

Write an article for a television magazine in which you describe your ideas for a new television talent show.

OR

Write the script for a podcast aimed at young people in which you review a television talent show.

Secondly, we have the AQA Spoken Language Study. Among the options for the controlled assessment tasks we have such gems as:

Explore varieties of and attitudes to texting…

….Explore some of the similarities and differences between spoken conversation and web-based communication such as messaging, Twitter and Facebook.

Finally, from WJEC we have a unit on studying spoken language. There is some freedom to choose examples of spoken language, so what guidance is given for selection? What will ensure that students are analysing the best possible examples of spoken English?

Popular culture seems to offer the most engaging and interesting examples. E.g. interviews with people they admire or respect such as J.K.Rowling, Lady Gaga or Alan Sugar.

All of these tasks and guidance relate to assessments that count towards the exam. As a result, teachers have every incentive to spend a good number of lessons preparing students for work on these topics. Obviously, good teachers will find alternative tasks or find a way to teach proper content as well. But equally, bad teachers and bad managers will have every excuse to avoid challenging content and focus on dross.

I know from Twitter discussions following the “Talent Television” story that there will be those who simply refuse to see a problem with looking at such inauspicious examples of English. After all, they argue, English language is all about “skills”, not content, and so studying the mediocre and the inane may develop those skills just as much as studying the best of what has been thought or written (which has its rightful place in the separate English literature exams).

However, to accept this argument is to ignore one of the most important purposes of education: to broaden horizons and to open up the world to those who are studying. Every time students are directed towards things they see every day, an opportunity has been wasted to direct them to something better. If there are skills that can only be developed by applying them to the ordinary and commonplace, then we have to ask whether they are skills worth developing. But if this is not enough to convince you (and I know there are those who refuse to acknowledge the problem even when it is staring them in the face), then can you at least answer this question: where was the consultation and debate in which it was decided that children need to be able to discuss the cover of Heat magazine and analyse an interview with Lady Gaga? Which elected politician argued for it in parliament? Which parents were asking for it? Even if this all seems fine to you, you have to admit that it is not acceptable to many, and that the decision hardly seems to have been made after listening to, and considering, those objections.