By Ryan McGreal
Published August 20, 2009
A year-old essay in The Atlantic has once again been making the rounds among internet-anxious intellectuals after its mention in a more recent (and more compelling) piece in Slate that suggests the popularity of - nay, obsessive adherence to - internet search, texting and social networking is grounded in the brain's dopamine reward system.
Whereas the Slate article investigates empirically whether the internet is affecting us at a cognitive level, the Atlantic essay approaches the question from a more personal, anecdotal direction:
Over the past few years I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going - so far as I can tell - but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I'd spend hours strolling through long stretches of prose.
That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
The author quotes playwright Richard Foreman arguing that the internet is draining our "inner repertory of dense cultural inheritance" and turning us into "pancake people - spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button."
In the context of that quote, a friend asked me:
Is this just old-fart-style reminiscing and resistance to technological change? Or are we really in danger of losing something important?
Is this really what's happening? Are we all turning into Bilbo Baggins, like butter scraped over too much bread?
Since the Atlantic piece relies on anecdote rather than data, I'm happy to respond in kind with a countervailing anecdote. In fact, if I were feeling cheekier, I'd merely answer the question with another: why hold, say, the most quotable works of Keats in memory when you can look them up in mere seconds, and save your brain for higher-order thinking?
Over the past several years, I've definitely noticed a growing tendency to remember not the articles themselves - whether works of cultural endurance or mere ephemera - but the paths to those articles.
Like the author, I've been inclined to worry about this; but in practice, those saved insights are still at my disposal whenever it comes time to write something and make reference to them.
In fact, I now have access to a far larger system of references than ever before, because the pathways: a) spare me from having to hold the referenced works themselves in memory; and b) link those works into a network of knowledge.
Again, I'm inclined to kvetch over a) above, when on reflection the more significant observation is actually b): my system of knowledge turns out to be not sparser but denser and more richly connected than it was when I was still principally a typographic thinker.
When movable type made cheap, mass printing of books possible, learned people of the time worried that the widespread availability of books would destroy the oral tradition of passing knowledge down through communities (and scribes worried that they would be put out of a job).
While the oral tradition is not yet destroyed (and in fact I attended a fantastic storytelling session earlier this year that was geared to adults), it has certainly been supplanted as the principal means through which people learn about the world - and thank heavens for it!
Widespread literacy has done much to destroy the steady state of ignorance, received wisdom, parochialism and narrow-mindedness that for so many eons retarded the progress of humanity toward a greater understanding of how the world works and how we can use that knowledge to advance our interests.
I don't accept either the heavy-handed liberalism that says progress is inevitable or the reflexive conservatism that says we're losing something precious by casting off tradition-for-tradition's-sake. However, I do believe the following:
For all these reasons, I believe the internet - and the kind of thinking, researching and production that it tends to encourage - is on balance a strongly positive development.
The fact that you can read - on the internet, no less - an unnecessarily long-winded diatribe about how the internet makes it harder to read long diatribes suggests another of its strengths: by increasing by orders of magnitude the speed and extent of diffusion, the internet also tends to support the self-correction of knowledge on which science is based.
Here's an interesting tidbit: the widespread adoption of MP3s has had the side-effect of rekindling a robust cottage industry of vinyl recordings among people who have decided that there's value in the physical substance of an album, with the warm, organic, analog sound of records played on a turntable and the sheer physicality of a 12"x12" dust jacket.
If enough people find themselves thinking about how the internet makes it harder to read War and Peace or (my personal nemesis) Moby Dick, it will provoke a backlash that will have people consciously taking on longer and longer essays to cultivate the ability to read and absorb such pieces.
To the extent that the Atlantic essay signals an alarm, it may itself be part of the solution for the problem it poses. The crucial matter is whether: a) the internet really is changing our reading and comprehension patterns; and b) if so, whether that change is a mutable habit or an immutable property of the brain.
Having considered the latter question (I'll presuppose the former for the sake of this argument), I'm inclined toward the conclusion that the change in our reading and comprehension patterns is more of a mutable habit than an immutable property of how the internet changes the brain.
I have a few reasons for thinking this, not least of which is the increasingly well-understood idea that the structure of the human brain remains dynamic, flexible and malleable even after adulthood. However, my personal experience strongly suggests - at least for me - that the problem of concentration is fairly easy to solve.
The secret to extended, deep concentration lies in achieving the flow state of immersing and losing oneself in a full-spectrum narrative or argumentative work.
It's akin to programming, in that I simply can't do it when I'm being constantly interrupted. Each time I lose focus, I must first go through the exercise of 'uploading' the structure back into my consciousness before I can proceed. That's a lot of overhead if I'm only going to spend five or ten minutes on it before being distracted again.
(Yes, I know I'm using a computer metaphor to talk about the brain, and that metaphors are greedy and opportunistic. Believe me, it's just for convenience. What happens in a brain is a lot different - and orders of magnitude more complex - than what happens in a computer.)
Since flow state is essential for immersing oneself in a large, involved work, and flow state is highly vulnerable to distraction, spending long periods of time on what we can accurately call a Distraction Machine means our ability to achieve flow state will be severely curtailed - and hence our practice at achieving it will be stunted.
What this suggests is that the problem of being able to read large, absorbing works in the internet age is not a problem of a new technology changing our brain patterns so much as it is simply a problem of a new technology requiring us to develop new methods of organizing our time to facilitate what we want to accomplish.
If it's important to you to read large, absorbing works, it follows that you need to develop techniques - of habit and/or technology - to block out chunks of time during which you can read without the myriad interruptions that go along with a network-connected computer.
That might mean installing a program that allows you to lock out distracting websites or web-based services for those blocks of time during which you wish to turn your attention to weightier contents than a blog entry.
This is where cognitive science comes into play: as the Slate article points out, the brain is strongly primed for distraction and interruption.
Actually all our electronic communication devices - email, Facebook feeds, texts, Twitter - are feeding the same drive as our searches. Since we're restless, easily bored creatures, our gadgets give us in abundance qualities the seeking/wanting system finds particularly exciting. Novelty is one. [Washington State University neuroscientist Jaak] Panksepp says the dopamine system is activated by finding something unexpected or by the anticipation of something new. If the rewards come unpredictably - as email, texts, updates do - we get even more carried away.
That is, it's possible that the underlying reason why we can't seem to ignore or turn off the distractions that make it hard to concentrate is that we really don't want them to stop.
And that's an entirely different problem from the one the Atlantic essayist tries to hang on the internet.