If there are no clear grounds for saying that technological progress in general is transforming society in ways which require low-content education, then the alternative is to suggest that there are particular changes happening which require it. The usual suggestion is that developments in information technology, particularly the availability of information on the internet, mean we no longer need to hold onto knowledge. In effect the “knowledge economy” is one in which nobody needs any personal knowledge.
Examples of this line of argument include:
Why do some teachers still provide children with answers when all the answers are out there on the Web? Contemporary pedagogy is only effective if there is a clear understanding of the power and efficacy of the tools that are available. Shakespeare may well have died in 1616, but surely anyone can look this up on Wikipedia if and when they need to find out for themselves? Inquiry based learning is gradually taking hold in schools, but not quickly enough. Give the kids questions from which more questions will arise. Send them out confused and wanting more. Get them using the digital tools they are familiar with to go find the knowledge they are unfamiliar with. After all, these are the tools they carry around with them all the time, and these are the tools they will be using when they enter the world of work. And these are the tools that will mould them into independent learners in preparation for challenging times ahead.
From Steve Wheeler’s blogpost.
I don’t believe, however, that we’ll see the so-called knowledge economy because at the same time what we’re seeing is knowledge become free and ubiquitous. You cannot build an economy on something that’s all around you and completely free. So I think the whole concept of the knowledge economy is actually a bit of a fake, so that’s at least one thing we can cross off our worry list.
From Caroline Walters’s speech.
If you have a tolerance for being shouted at by an exuberant American youth, then you can see another version of this argument here.
And then there’s the inevitable list of supporting “facts” from Shift Happens UK:
It is estimated that a week’s worth of the Times contains more information than a person was likely to come across in a lifetime in the 18th century.
It is estimated that 4 exabytes (4.0×10^19) of unique information will be generated this year.
That is more than the previous 5,000 years.
The amount of new technical information is doubling every 2 years. By 2010 it is predicted to double every 72 hours.
For students starting a 3 year university degree this means that half of what they learn in their first year of study will be outdated by the end of their studies.
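As an aside, these figures do not even survive basic arithmetic. A rough check (sketched in Python purely for illustration, assuming the standard SI definition of an exabyte) shows what the quoted numbers actually imply:

```python
# Sanity-checking the "Shift Happens" figures quoted above.

EXABYTE = 10**18  # bytes in one exabyte (SI definition)

# "4 exabytes (4.0x10^19)": 4 exabytes is 4.0x10^18 bytes,
# an order of magnitude less than the figure given alongside it.
four_exabytes = 4 * EXABYTE
assert four_exabytes == 4.0e18  # not 4.0e19

# "Doubling every 72 hours": that is roughly 122 doublings a year,
# i.e. information multiplying by 2^122 (about 10^36) annually,
# a rate at which the claim refutes itself within months.
doublings_per_year = 365 * 24 / 72
print(round(doublings_per_year, 1))  # ~121.7
```

Whatever one makes of the underlying trend, figures that contradict themselves in a single sentence do not inspire confidence.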
Much of this is simply based on misconceptions about how we think. As I have argued before, we actually require knowledge to be held in our heads in order to think effectively. Being able to look something up is no substitute for knowing it if what you want to do with the information involves thinking about, understanding or effectively applying that information. Technology has not changed the basics of how we think. Here I don’t mean to replay this argument, just to give a few reasons to doubt the narrative in which this is presented as an issue raised by technological change.
Firstly, we can note that this is not a new argument. According to E.D. Hirsch (in his excellent essay on this topic): “‘You can always look it up’ has always been a watchword of the progressive approach.” The excellent Quote Investigator website, investigating the false claim that Einstein said: “I don’t need to know everything; I just need to know where to find it, when I need it”, found examples of this argument going back to 1914. We are not the first generation even to hear this argument being based on technology; it is almost 40 years since Lister (1974) claimed “the revolution in media technology has made the school obsolete as a transmitter of information”.
Secondly, it is not a compelling argument even to those who have played a role in the development of online information sources. One notable critic of the idea that we can look everything up is Larry Sanger, the co-founder of Wikipedia. His essay on the importance of individual knowledge can be found here, where he argues:
…to claim that the Internet allows us to learn less, or that it makes memorizing less important, is to belie any profound grasp of the nature of knowledge. Finding out a fact about a topic with a search in Wolfram Alpha, for example, is very different indeed from knowing about and understanding the topic. Having a well-understood fact ready to recall is far different from merely getting an unfamiliar answer to a question. Reading a few sentences in Wikipedia about some theories on the causes of the Great Depression does not mean that one thereby knows or understands this topic. Being able to read (or view) anything quickly on a topic can provide one with information, but actually having a knowledge of or understanding about the topic will always require critical study. The Internet will never change that.
Moreover, if you read an answer to a question, you usually need fairly substantial background knowledge to interpret the answer. For example, if you have never memorized any dates, then when you discover from Wikipedia that the Battle of Hastings took place in 1066, this fact will mean absolutely nothing to you. (Anyone who has tried to teach a little history to young children, as I have to my three-year-old, knows this.) Indeed, you need knowledge in order to know what questions to ask. Defenders of liberal arts education often remind us that the point of a good education is not merely to amass a lot of facts. The point is to develop judgment or understanding of questions that require a nuanced grasp of the various facts and to thereby develop the ability to think about and use those facts. If you do not have copious essential facts at the ready, then you will not be able to make wise judgments that depend on your understanding of those facts, regardless of how fast you can look them up.
He also replied to the blogpost from Steve Wheeler (the one I have been repeatedly quoting) in the strongest possible terms:
Mainly what I think is interesting here is that this is a professor of education, and he is espousing flat-out, pure, unadulterated anti-intellectualism. An educator opposed to teaching knowledge–it’s like a chemist opposed to chemicals–a philosopher opposed to thinking. Beyond the sheer madness of the thing, just look at how simple-minded the argument is, and from what appears to be a rather distinguished academic. I actually find this rather sobering and alarming, as I said. It’s one thing to encounter such sentiments in academic jargon, with sensible hedging and qualifications, or made by callow, unreflective students; it’s another to encounter them in a heartfelt blog post in which, clearly, a serious theorist is airing some of his deeply-held views in forceful language.
Thirdly, we don’t actually know that the ease with which we can look things up is a benefit to learning. Some have argued that it is actually harmful. The BBC reported:
The culture of clicking online for instant answers risks “infantilising” learning, says the head of a charity which runs independent girls schools.
Helen Fraser will warn delegates of the Girls’ Day School Trust about the risk of pupils relying on “nuggets of information” from the internet.
She says that children should be reading whole books, rather than gathering a few shallow impressions.
“I do worry that the ease of access to nuggets of information means that our appetites are becoming infantilised.
“We’re so used to fast facts that we’re in danger of losing sight of the truth that some learning is more of a slow casserole, with knowledge stewing in our minds to form a richer, deeper flavour,” Ms Fraser will tell the conference on Wednesday.
“So I’m a firm believer in the importance for our students of switching off the computer, the radio, the smartphone, the TV, and any other distractions, and reading a whole book – I would say from cover to cover.”
Ms Fraser says she is concerned about the way that quick-fix answers from internet search engines can leave children with a lack of awareness of different views and a one-dimensional view of topics.
Nicholas Carr wrote an essay entitled “Is Google Making Us Stupid?” in which he argued:
Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets: reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
While I am deeply sceptical about this argument, I do think that we cannot simply assume that the easy availability of information is always an advantage. Sparrow et al. (2011) found evidence, in a series of psychological experiments, that the belief that we can look information up, particularly online, may have a detrimental effect on our ability to recall it.
Good teachers, when asked a question about something which hinges on factual knowledge, don’t simply provide the answer. They provide context, explanation and ways to remember the information in the future. They may even question the student in order to discover whether the problem is a lack of factual knowledge, or a lack of understanding of what factual knowledge is relevant. Being able to acquire an answer quickly and easily from an external source is not always to one’s best advantage.
Finally, we should consider the alleged increase in “information”. None of the statistics I quoted earlier from Shift Happens UK can be confirmed by reliable sources; the one about degree courses being quickly outdated is stupendously unlikely. If they do have some basis in truth, then there is an ambiguity here in the word “information”. Information can, particularly in the context of measurements of quantity like those mentioned above, mean little more than collections of bits stored in digital (or other) media. It doesn’t necessarily refer to anything useful or worth knowing. Even on a broader definition, there has been more information than could be learnt in a lifetime for many years now. Gleick (2011) describes the problem of “information overload” with examples going back to 1621. Education has never been about passing on all the information in the world, only about passing on what is considered the most worthwhile knowledge in a particular culture. What constitutes this knowledge does change with time. Sometimes it is fascinating to find out when ideas we take for granted first appeared (for instance, the equals sign is apparently only a sixteenth-century invention). However, what knowledge is worth knowing cannot be expected to change in reaction to the quantity of new information, only its quality. There is stability in all bodies of knowledge, even science. As Goldacre (2008) put it:
[O]ne of the key humanities graduates’ parodies of science [is that] science is temporary, changeable, constantly revising itself, like a transient fad. Scientific findings, the argument goes, are therefore dismissible.
While this is true at the bleeding edges of various research fields, it’s worth bearing in mind that Archimedes has been right about why things float for a couple of millennia. He also understood why levers work, and Newtonian physics will probably be right about the behaviour of snooker balls forever.
While there is some knowledge that is clearly replaced over time (no need to go back to teaching kids how to use a slide rule or a log table, and we can recycle any maps of the world still showing the USSR) there is no shortage of knowledge that isn’t going to be outdated anytime soon.
Gleick, James (2011). The Information: A History, a Theory, a Flood, London: Fourth Estate.
Goldacre, Ben (2008). Bad Science, London: Fourth Estate.
Lister, Ian (1974). Deschooling, Cambridge: Cambridge University Press.
Sparrow, Betsy, Liu, Jenny and Wegner, Daniel M. (2011). “Google effects on memory: Cognitive consequences of having information at our fingertips”. Science 333: 776–778.