In the fall of 1995, I took a class on the technological history of communication. I didn't particularly care for the professor -- we got off to a bad start because he preferred his students to join his cult of personality, and I don't do culty -- but I did end up loving the subject, appreciating the insights he patiently guided us toward, and giving reluctant props for how he launched us there: with a reading of Neal Stephenson's Snow Crash.
If you've read the book, you know that one of the central plots hinges on the Chomskyan theory of a generative grammar or, in layman's terms, the idea that there is an innate body of linguistic knowledge hard-coded in our brains, and this hard-coded language is what lets us quickly master our native tongues. In class, we used that idea to springboard to an examination of how oral cultures successfully transmit knowledge: it involves sagas, alliteration, music and repetition. I may not buy into the idea of generative grammar, but I do think we're biologically wired for audio transmission. It is a trait that midwifed human society.
(As we talked about this in class, I had two smaller flashes of self-interested insight: the shift from orality to literacy explains how the substance of the Arthurian myths shifted from Wace (c. 1155) down to Malory (c. 1470) down to Tennyson (c. 1830) down to T.H. White (1938). It also explains why I can remember song lyrics after a few listens, but recalling what I read requires re-reading, and usually rewriting it in my own words.)
We spent the class examining how shifts in communication technology change both the nature of how we learn and what information we absorb and transmit. When we moved to the printed word, we untethered the nature of knowledge from both space and time. Being learned no longer required being in the right place for a duration and listening; you could now consume the information that someone else, somewhere else, had permanently recorded. If you've ever read Walter J. Ong's Orality and Literacy, or Eric J. Havelock's Preface to Plato, this argument is old hat.
As a corollary to the space-time argument, there is also the question of context for the words. When speaking, each word exists in context with the ones around it, and there is also a host of other cues -- intonation, body language -- that further shade meanings. In writing, the words rely on each other for context; in this way, an argument takes shape and carries the reader from one point in time (and understanding) to another. The question we tackled in class: When moving to a digital environment like hypertext, in which words can be stripped of their context, how will the human intellectual tradition change in response?
Remember, this was 1995. We were all still shaking off the effects of CD-ROMs, with their pocket universes of nonlinear narrative and decontextualization. All of us in the class had different theories about what digital media would do to the human intellect; mine was inordinately influenced by Gödel, Escher, Bach and argued that the human mind's tendency toward formal rule systems and our subsequent linguistic bias toward formalized structure in language would eventually impose a new set of rules on how we learned. Frankly, I'm amazed I could summon that argument, as my chief grad school accomplishment was the launch of weekly Melrose Place viewing parties.
Perhaps the Brooke and Billy saga went to my head, because I blithely figured that digital media would make us all little James Burkes. We'd use Web pages and search engines to spin skeins of hypertext between different disciplines and weave together interdisciplinary insights that depended heavily upon the interplay between facts when placed in different contexts. We'd become smarter because we'd have to: we'd need outstanding analytical and synthesizing skills to navigate the sea of information before us.
Twelve years later, Nicholas Carr has come to exactly the opposite conclusion in "Is Google Making Us Stupid?" (Atlantic Monthly, July '08). Permit me to distill his argument to its broadest forms: a decade of Web-surfing has left him unable to sink into a book the way he used to. Thanks to the near-instant gratification of Google, people have lost the ability to patiently plow through primary sources; they expect to find what they want within a few clicks, and to absorb what they need within a few sentences. He writes, "What the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles."
What Carr's experiencing is possible. Patton Oswalt recently riffed on YouTube with:
"YouTube has destroyed our ability to know when we're entertained. It has turned us into a nation of deranged Roman emperors. We're a continent of Caligulas sitting around in our bathrobes, saying, 'Mmm. I'm bored. I'd like to see something poop or sneeze. Hmmm. Yes ... what would I like to see poop or sneeze? Perhaps a reptile or a gay homeless person, or perhaps Della Reese.' And isn't it odd that YouTube actually sounds like the name of the eunuch slave who would be working for that deranged emperor?"
I believe his point was that having access to too much video content completely destroys one's abilities to develop or maintain critical faculties.
However, Carr seems to be implying that merely reading online can alter one's cognitive capacities. Several bloggers anecdotally corroborate this. They all fret that not being able to plow through a thick tome is proof that they're getting dumber.
Here is my counterargument: Yes, the way one absorbs information online is different from the way one absorbs the written word. And that is different from the way one absorbs information via the spoken word. Each medium has its own benefits and drawbacks when it comes to absorbing information. But no one medium has a lock on knowledge-advancing cognition. This is obvious to anyone who's ever listened to David Sedaris, then read his words; it hit me when I read In the Beginning Was the Command Line as a book instead of as a free download.
Let's face it: aural learning is very good for absorbing things by repetition, and for getting the basics engraved in the very bedrock of your brain. There's a reason we sing the A-B-Cs. I'm betting at least one chemistry student got through a test thanks to "The Elements." Print is very good for engendering the cognitive state that 43 Folders types know as "flow," that state of deep concentration which allows you to focus on a specific topic or idea to the exclusion of distraction, then evaluate and apply it in a way that permanently affects your understanding of the world around you. And digital media is good for something I think of as "systemic comprehension," or the ability to create a cognitive model that permits you to come to a deeper understanding of the complex world around you.
Each type of medium has its own physical strengths. Who among us has not hummed the alphabet song when trying to sort file folders? Who hasn't looked for a passage in a book by flipping around the general physical spot where you first read it? Who hasn't retraced their steps online trying to recreate the cues that led to a particularly relevant weblog post or article? The external stimuli we depend on to jog the memory vary depending on the medium. (For example, one panel of a comic will usually remind me of what happened in the entire issue. That's another physical cue unique to a specific medium. And speaking of comics, Hunter: The Age of Magic has a wonderful storyline about the power of words to define and shape the world -- told in a medium that depends on pictures as much as said words.)
I do think it is more difficult to read lengthy pieces online, and it's killing me that this one is going to be an (uncommented-upon) essay. The digital medium rewards brevity -- but that's ultimately enabling that Burkean "Connections" model I imagined. The more discrete the node, the more likely it is to be incorporated into one's personal cognitive Web.
The real challenge to this medium is not that it's making us all ADD-addled idiots. It's that the digital medium is so intensely solipsistic. Aural learning was eminently social: it required a speaker and a listener and a social negotiation between the two. Written media is still somewhat social, as there is still a deal between writer and reader: "I control what information I give you, how I give it to you, and what it means. You pay attention and I'll make it worth your while." Sinking into a book is essentially a submissive act; you are allowing the author to take control of your attention.
However, the emerging model in the digital medium is eminently self-centered, and it requires active control. We collect information that's significant to us. We make connections between discrete nodes that make sense only to ourselves. Sometimes, those connections may not even be what a blogger or Twitterer intended. Even when we get feedback from others, or give it via the comments in a blog post, it is still largely an act of self-reinforcement.
In short, when we go online, we're responsible for our own mental enrichment. It is a privilege and a responsibility, and it requires a whole different set of skills from listening or reading. The smart people among us will be able to cultivate the skills that allow them to learn in any medium.
The anxious will bitch and moan about how the emergence of one medium means the decline of our intellectual civilization. A good example: "The Fate of the Sentence: Is the Writing on the Wall?" (WaPo, June 15, '08). In it, a guardian of culture frets that creeping IMspeak and LOLcattery lead to "creeping inarticulateness," and that will, in turn, lead to the demise of critical thought.
To this, I can only reply: WTF?
Not all communication is meant to carry the explication of abstract thought. Sometimes you just want to find the directions to the nearest In-N-Out. Or you want to tell someone their kid looks adorable on Flickr. Casual communication is a different beast from the stuff used for heavy cognitive lifting. Great sagas may start out with "Hwaet!" but that's only a call to attention; the real meat was within language that was explicitly meant to convey something important to an audience. We are not losing our ability to create sentences that convey great meaning, nor are we losing our ability to comprehend those weighty words.
Those communicative traditions have persisted across millennia and mediums. We have persisted. The search engine will not be the downfall of us all. In the beginning, there was the word. We are at another beginning; we are learning what to do with these words. Hwaet! Read. Click ...
Human comprehension marches on.
Yeah, I had a J-school professor who was always irritated at what he termed "the declinists"--the people who thought that television was going to turn us all into drooling idiots. I felt like you could take Carr's essay, substitute "television" and "changing channels" for all the Internet-specific stuff, and you'd have every declinist screed written over the past 60 years. And they pretty much all begin with the author bemoaning the fact that he doesn't read books any more--I've read that enough times that I'm curious to know if book-reading typically declines with age (distractibility seems to increase with age, so maybe that is Carr's problem right there).
I was also annoyed because I know that there is research out there on Internet use and people's emotional well-being, so when I picked up the article, I was hoping for something a little more substantial. Ever the optimist, I know, expecting people to do all that difficult research and reporting.
Posted by: Polly | 2008.06.17 at 15:50
I find it ironic that the WaPo article alludes to a 1937 article warning of the death of the sentence, seeming to argue that what that author was worried about -- needless complexity -- was nothing: NOW we have a real problem. I learned through my training to teach first-year composition to college students that the types of grammatical errors students make have stayed pretty much the same in type and number over time: see, for example, Harap's "The Most Common Grammatical Errors" (The English Journal 19(6): 440) and note how closely that list of errors from 1930 matches the errors on English Fail or your nearest K-12 or college classroom.
I agree that the written word isn't in any danger. What the declinists might be responding to is that in the past many may have been more "protected" from grammatical errors and heterodox structure, by publishers deciding what would be printed and copyeditors working to ensure it conformed to orthodoxy. With the internet, there is not as much stricture on production of the written word, so suddenly authors "unauthorized" by these power structures can produce writing and be read. These heterodox styles and grammar are coming into the light in a new way, but many aren't new. And still every generation has had their guardians of culture complaining that the new mode of language will be the death of us all.
Side note: just yesterday, my boyfriend and I were sharing amazed chuckles over outmoded language structures. I study 19th-century medical history, and in my research came across the stunningly titled, "Hydrophobia: an account of the awful and lamentable end of a whole family, who died deranged, from drinking the milk of a cow, bit by a mad dog: also, of the death of 2 persons, and the state of others, bit by dogs, in and near Glasgow, on Friday, Aug. 28th. 1824" I'm pretty sure I would have broken out in hives if one of my first-year English students tried to get a title with two colons and eight commas by me, but it was customary then. Point being, language "rules" are often customs, which change. As can rules, when you get down to it.
Posted by: Auntie Maim | 2008.06.17 at 17:12
Oops, forgot to complete the link: English Fail is at http://englishfail.wordpress.com/.
Posted by: Auntie Maim | 2008.06.17 at 17:13
Yeah, you know, I learned to write well (to the extent I do) by reading books and newspapers (the Post, as it happens) -- not by reading notes and letters from my friends; the quality of their writing really didn't affect mine. I don't see why the technological component would change that. And I had a couple high school English teachers whose language skills I considered pitiable, so I refuse to accept that there is some kind of generational problem.
Now. I will concede that the young attorneys and law students who submit writing samples to my firm are starting to fall into some pretty informal patterns. (...don't put "FYI" in your goddamned footnote, you know?)
But I don't think the flat-out incompetent writing we see here is the fault of the interwebs. I certainly feel the disappointment when someone makes it through high school, college, AND law school still under the firm impression that one forms a plural noun with the help of our plucky friend the apostrophe. And I understand the desire to blame -something- for this horrific state of events. And I was so disappointed to realize that widespread internet use would not naturally assist in the creation of BETTER writers (from the constant written self-expression, don't you know). But bad writing is not technology's fault.
Posted by: SP | 2008.06.17 at 17:44
Given that I've spent the week flipping between writing a paper on Dante's treatment of feminine sexuality and chatting with friends in text that is often influenced by LOLcats? I think I will give Carr a hearty UR DOIN IT RONG.
Posted by: Nomie | 2008.06.18 at 17:20
I should have ditched the essay format and written my rebuttal as a series of LOLcats!
MY COUNTERARGUMENT. LET ME SHOW YOU IT.
Posted by: Lisa S. | 2008.06.18 at 22:14