Friday, August 1, 2008

100th Post - An Ode to Technology

This is the 100th post on this blog, and I'm pretty happy about it. As I said in the very first post, I've tried keeping a blog before, and it's never really worked out as well as it has here. I think it is a fitting celebration, then, to talk a little bit about technology.

I'm 2/3 of the way through my second CTY session, and this time I'm teaching philosophy of mind with Eripsa, who, despite being dreadfully wrong about consciousness, is an all-around awesome dude. He works primarily on the philosophy of technology, a disappointingly underrepresented field that deals with questions like "what is the ontological status of a tool," "what is necessary to create an artificial mind," and "how does technology influence human thought?" He does a lot of really interesting work (particularly on robots), so I encourage you to go check out his blog.

Anyway, being around him inevitably gets me thinking even more about technology than I usually do (which is saying something)--I'm particularly interested in that last question I posed above, though: how does technology influence human thought? Eripsa wants to follow Andy Clark and David Chalmers in endorsing the strong-externalist extended mind thesis, which claims that there is a relevant sense in which my cognition and mental states (including beliefs) spend a lot of time in the external world. Their paradigm case for this is that of Otto, a hypothetical Alzheimer's patient who, in lieu of using his deteriorating biological memory, writes down facts in a notebook, which he carries with him at all times. Clark claims that when Otto consults his notebook for a fact (e.g. the location of a restaurant he wants to go to), the notebook is serving as a repository for his beliefs about the world in just the same way that my (or your) biological memory does; that is, his belief about the location of the restaurant is literally stored in the external world.

This thesis seems fraught with problems to me, but that's not the point I want to make (at least not in this post). While I think that Clark (and by extension Eripsa) is wrong about the ontology of technology (Otto's notebook is supposed to stand for a whole host of technological "extensions" of our biological minds into the world), I think he's precisely right about its importance in a cognitive sense. Human beings are, by their very nature, tool users; it's a big part of what makes us human. Of course other primates (and even some birds) can use--or even manufacture--tools to accomplish certain tasks, but nothing else in the known natural world comes even close to doing it as well as humans do. Technology use is a part of who we are, and always has been; we created language as a tool to manipulate our environment, learning to create compression waves in the air for the purpose of communicating our ideas to each other, and in the process beginning the long, slow march toward the incredibly sophisticated tools we have today--tools like the one you're using right now.

Language might have been our first tool--and perhaps even still our best--but in recent years, the computer (and more specifically the Internet) has proven to be one of our most important in terms of cognition. I've argued before that the advent of the information age should herald a radical change in educational strategy, but I want to reiterate that point here. Today's kids are growing up in a world where virtually any fact they want is immediately and reliably accessible at any time. I'd say that at least 1/3 of the kids I'm teaching at CTY--and these are 12-15 year olds--have Internet-enabled cell phones that they keep on their person at all times; this is a very, very big deal, and our educational strategy should reflect it.

100 years ago, a good education was an education of facts. Students memorized times-tables, theorems, names and dates, literary styles, and an endless list of other factual statements about the world, because that's what it took to be an "educated citizen." Information was available, but it was cumbersome (physical books), difficult to access (most areas didn't have high quality libraries), and generally hard to come by for the average citizen--even an educated one. The exact opposite is true today--students don't need to memorize (say) George Washington's birthday, because they can pull that information up within seconds. This frees up an enormous "cognitive surplus" (to borrow Clay Shirky's term) that can be used to learn _how to analyze and work with facts_ rather than memorize the facts themselves.

I've postulated before that the so-called "Flynn Effect"--that is, the steadily increasing IQ of every generation since the close of the 19th century--might be due to the increasing availability of information, and thus the increasingly analysis- and abstraction-oriented brain of the average citizen. If I'm right, we're going to see a huge leap in the IQ of this generation, but only if we start to educate them appropriately. We need a radical shift in emphasis as early as the kindergarten classroom; students need to be taught that it's not what you know, but how well you can work with the almost infinite array of facts available to you. The spotlight should be taken off memorizing names and dates, facts and figures, and focused squarely on approaches to thinking about those facts and figures. Today's child is growing up in a world where he is not a passive consumer of information, but an active participant in working with it, in a way no previous generation of humans has been.

This leads me to my final point, which is that you should all go read this speech by Clay Shirky, author of the book Here Comes Everybody. It's very, very well articulated, and makes exactly the kind of point I'm driving at here. Snip:

I was having dinner with a group of friends about a month ago, and one of them was talking about sitting with his four-year-old daughter watching a DVD. And in the middle of the movie, apropos nothing, she jumps up off the couch and runs around behind the screen. That seems like a cute moment. Maybe she's going back there to see if Dora is really back there or whatever. But that wasn't what she was doing. She started rooting around in the cables. And her dad said, "What you doing?" And she stuck her head out from behind the screen and said, "Looking for the mouse."


Here's something four-year-olds know: A screen that ships without a mouse ships broken. Here's something four-year-olds know: Media that's targeted at you but doesn't include you may not be worth sitting still for. Those are things that make me believe that this is a one-way change. Because four year olds, the people who are soaking most deeply in the current environment, who won't have to go through the trauma that I have to go through of trying to unlearn a childhood spent watching Gilligan's Island, they just assume that media includes consuming, producing and sharing.


It's also become my motto, when people ask me what we're doing--and when I say "we" I mean the larger society trying to figure out how to deploy this cognitive surplus, but I also mean we, especially, the people in this room, the people who are working hammer and tongs at figuring out the next good idea. From now on, that's what I'm going to tell them: We're looking for the mouse. We're going to look at every place that a reader or a listener or a viewer or a user has been locked out, has been served up passive or a fixed or a canned experience, and ask ourselves, "If we carve out a little bit of the cognitive surplus and deploy it here, could we make a good thing happen?" And I'm betting the answer is yes.

I'm betting the same. Thanks for reading, and here's to the next 100 posts.

1 comment:

Adam Harris said...

Jon,


I know your compy died, so you might not be able to read this for a while, but I want to discuss your idea for changing how we are teaching.

I agree, we do have access to an endless number of facts almost instantaneously. That being said, each fact we store in our brain (I'm guessing the term is internal memory...but I don't want to misuse it) factors into how we analyze new entries. For instance, my knowledge of historical dates lets me pick out causal relationships and lessons that others may never get if they were only taught how to interpret facts as they come in, but never the base facts themselves.

While I agree that mundane facts like the birth dates of historical figures are not important, memorized times tables can be recalled much faster than they could be entered into an external device, especially if that memorization begins at an early age. This reduced operating time allows for increased time to think about meaningful things instead of looking up the mundane. It's almost as if critical thinking is a function, and the facts are the input operands. The more you have to look up what you're inputting, the longer the whole thing takes to run.
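To make that analogy concrete, here's a toy Python sketch--the latency numbers are completely made up, chosen only to illustrate the point, not measured from anything:

```python
# A toy sketch of the "critical thinking is a function, facts are its
# operands" analogy. All costs are hypothetical, purely illustrative.

RECALL_COST = 0.001   # hypothetical cost of recalling a memorized fact
LOOKUP_COST = 5.0     # hypothetical cost of looking the fact up externally
REASONING_COST = 1.0  # hypothetical fixed cost of the analysis itself

def fetch_fact(fact, memorized):
    """Return the fact plus the time cost of retrieving it."""
    return fact, (RECALL_COST if memorized else LOOKUP_COST)

def analyze(facts, memorized):
    """'Critical thinking' as a function whose input operands are facts."""
    total = REASONING_COST
    operands = []
    for fact in facts:
        value, cost = fetch_fact(fact, memorized)
        operands.append(value)
        total += cost
    return operands, total

facts = ["7 x 8 = 56", "9 x 6 = 54", "12 x 12 = 144"]
print(analyze(facts, memorized=True))   # dominated by the reasoning step
print(analyze(facts, memorized=False))  # dominated by the lookups
```

The "analysis" costs the same either way; what changes is how much of the total run is eaten by fetching the operands.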

I guess approximating the brain as a computer isn't a reliable method for analyzing this issue, but I'm a physicist...it's like considering a spherical cow...