"But over the past millennium, many of us have undergone a profound shift. We've gradually replaced our internal memory with what psychologists refer to as external memory, a vast superstructure of technological crutches that we've invented so that we don't have to store information in our brains. We've gone, you might say, from remembering everything to remembering awfully little. We have photographs to record our experiences, calendars to keep track of our schedules, books (and now the Internet) to store our collective knowledge, and Post-it notes for our scribbles. What have the implications of this outsourcing of memory been for ourselves and for our society? Has something been lost?"
My disagreement with the extended mind thesis is documented, but that ontological dispute aside, I'm certainly willing to admit that the average person--even the average educated person--must remember far fewer facts today than 50 or 100 years ago in order to function. This is, as the National Geographic piece indicates, because the average person now has immediate access to much more reliable information than the average person of 50 or 100 years ago did. Does the fact that access to information is immediate and trustworthy (with the aid of the appropriate technology) make it memory in anything like the biological sense? No--but that's not my point here. I want to discuss the question raised in the last two sentences of the National Geographic snip above: has something been lost in the switch from storing information to skill at accessing it? I think the answer is a resounding 'no.'
It seems, actually, that something has been gained. The October/November issue of "Scientific American Mind" includes an article by James R. Flynn about the 'Flynn effect'--the enormous rise in average IQ scores over the last 100 years. The article notes that "Gains in Full Scale IQ and Raven's [IQ test score] suggest that our parents are some nine to 15 points duller than we are, and that our children [i.e. me and, more than likely, you] are some 9 to 15 points brighter." Taken at face value, two generational steps of that size below today's mean of 100 would imply that our (i.e. post-baby-boomers') grandparents' generation had an average IQ of around 75--a score that today's scales would classify as mild mental retardation. I think most people will agree that their grandparents were not mentally handicapped, so it seems that something odd is going on here.
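To make that back-of-the-envelope arithmetic explicit (assuming today's normed mean of 100 and taking the midpoint of the quoted 9-to-15-point range, about 12 points, as the step per generation): 100 - 2 x 12 = 76, or roughly 75; using the full quoted range instead puts the grandparents' generation somewhere between about 70 and 82.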
The Scientific American article argues (convincingly, I think) that this paradox is explained by the fact that people today are, from a very young age, trained much more for analysis and data processing than ever before. The obvious question, which I don't think the article satisfactorily addresses, is "Why is this the case?" The answer, I think, is relatively obvious: because I have easy access to vast amounts of information--more than my grandparents could have dreamed of--so long as I know how to access it, much more of my brain can be dedicated to skill at accessing and processing data, whereas my grandparents had to dedicate substantial portions of theirs to storing data.
Again, I think the question of whether my skill at using Google constitutes memory (it doesn't) is irrelevant to this discussion: the point is that knowing I can access virtually any fact virtually whenever I want means I don't have to remember, say, the formula for the surface area of a pyramid. Instead, I can spend the effort I might have spent memorizing all those facts on sharpening my critical thinking skills--which, again, is what IQ tests primarily measure.
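(And, as an illustration of just how quick the lookup is: for a regular pyramid with base area B, base perimeter p, slant height l, and height h, the surface area is B + (1/2)pl, and the volume--if that's the formula you half-remember from school--is (1/3)Bh. Seconds to find, which is rather the point.)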
Access to information has increased at an amazing rate even within my (relatively short) lifetime, and I suspect that it will continue to increase as technology advances. It is interesting to consider how human intelligence will advance in kind.