
The interface between the self and the Web has been a topic that I think about a lot. I’ve written previously about Sherry Turkle’s work and her new book, Evocative Objects, and some of the ways the porous membrane between a cyber persona and a physical self can almost disappear.

The generational implications of the last ten years of technological development are also provocative. Who knows what will shift and change for my children and their deviced and gadgeted cohorts?

The following excerpt is from an article by Clive Thompson in Wired and suggests some interesting twists on these themes.

This summer, neuroscientist Ian Robertson polled 3,000 people and found that the younger ones were less able than their elders to recall standard personal info. When Robertson asked his subjects to tell him a relative’s birth date, 87 percent of respondents over age 50 could recite it, while less than 40 percent of those under 30 could do so. And when he asked them their own phone number, fully one-third of the youngsters drew a blank. They had to whip out their handsets to look it up.

That reflexive gesture — reaching into your pocket for the answer — tells the story in a nutshell. Mobile phones can store 500 numbers in their memory, so why would you bother trying to cram the same info into your own memory? Younger Americans today are the first generation to grow up with go-everywhere gadgets and services that exist specifically to remember things so that we don’t have to: BlackBerrys, phones, thumb drives, Gmail.

I’ve long noticed this phenomenon in my own life. I can’t remember a single friend’s email address. Hell, sometimes I have to search my inbox to remember an associate’s last name. Friends of mine space out on lunch dates unless Outlook pings them. And when it comes to cultural trivia — celebrity names, song lyrics — I’ve almost given up making an effort to remember anything, because I can instantly retrieve the information online.

In fact, the line between where my memory leaves off and Google picks up is getting blurrier by the second. Often when I’m talking on the phone, I hit Wikipedia and search engines to explore the subject at hand, harnessing the results to buttress my arguments.

My point is that the cyborg future is here. Almost without noticing it, we’ve outsourced important peripheral brain functions to the silicon around us.

And frankly, I kind of like it. I feel much smarter when I’m using the Internet as a mental plug-in during my daily chitchat. Say you mention the movie Once: I’ve never seen it, but in 10 seconds I’ll have reviewed a summary of the plot, the actors, and its cultural impact. Machine memory even changes the way I communicate, because I continually stud my IMs with links, essentially impregnating my very words with extra intelligence.

You could argue that by offloading data onto silicon, we free our own gray matter for more germanely “human” tasks like brainstorming and daydreaming. What’s more, the perfect recall of silicon memory can be an enormous boon to thinking. For example, I’ve been blogging for four years, which means I’ve poured out about a million words’ worth of my thoughts online. This regularly produces the surreal and delightful experience of Googling a topic only to unearth an old post that I don’t even remember writing. The machine helps me rediscover things I’d forgotten I knew — it’s what author Cory Doctorow refers to as an “outboard brain.”

Still, I have nagging worries. Sure, I’m a veritable genius when I’m on the grid, but am I mentally crippled when I’m not? Does an overreliance on machine memory shut down other important ways of understanding the world?

There’s another type of intelligence that comes not from rapid-fire pattern recognition but from slowly ingesting and retaining a lifetime’s worth of facts. You read about the discoveries of Madame Curie and the history of the countries bordering Iraq. You read War and Peace. Then you let it all ferment in the back of your mind for decades, until, bang, it suddenly coalesces into a brilliant insight. (If Afghanistan had stores of uranium, the Russians would’ve discovered nuclear energy before 1917!)

We’ve come to think of human intelligence as being like an Intel processor, able to quickly analyze data and spot patterns. Maybe there’s just as much value in the ability to marinate in the seemingly trivial.

Of course, it’s probably not an either/or proposition. I want both: I want my organic brain to contain vast stores of knowledge and my silicon overmind to contain a stupidly huge amount more.

At the very least, I’d like to be able to remember my own phone number.


I am heartened by the attention being garnered by Timothy Ferriss’ new book, The 4-Hour Workweek. As the texture of all of our lives has been complexified by information overload, Ferriss has been one of the first credible voices to say, Whoa.

From the New York Times:

Mr. Ferriss has seen his book quickly become a best seller, largely on the strength of blog chatter in the tech community. Subsequently, he has become a pet guru of Silicon Valley, precisely by preaching apostasy in the land of shiny gadgets: just pull the plug. Crawl out from beneath the reams of data. Stand firm against the torrent of information.

His methods include practicing “selective ignorance” — tuning out pointless communiqués, random Twitters, and even world affairs (Mr. Ferriss says he gets most of his news by asking waiters)…Once the e-clutter is cleared away, he argues, there will be plenty of time to scuba dive the Blue Hole in Belize, just as he does.

Or at least fantasize about it…

Jason Hoffman, a founder of Joyent, which designs Web-based software for small businesses, urged his employees to cut out the instant-messaging and swear off multitasking. From now on, he told them, severely restrict e-mail use and conduct business the old-fashioned way, by telephone.

“All of a sudden,” Mr. Hoffman said of the results, “their evenings are free. All of a sudden Monday doesn’t feel so overwhelming…”

“BlackBerrys and e-mail aren’t inherently bad,” Ferriss said. “It’s just like medicine: it’s the dose that makes the poison.”

I used to be a technowolf, seduced by gadgetry and the toy value of technology that has been delivered steadily since the PC was introduced in the early 80s. But a few years ago I stepped off that train. I have had to close more doors than I have opened just to stay afloat. I’ve had to cut way back on that “first to know” need in a number of fields like politics, news, science, film, publishing, fiction. A wise friend once described the second half of life as the time when you finally know what you like and what you do well, and so now you just have to do it.

Blogging, increasingly ubiquitous, also needs the pruner’s touch. I now rely on the finds of trusted others more than trawling on my own. And I agree with Ferriss when he prioritizes the range of possibilities facing him:

“I’d be much better off putting my time into three or four really good blog posts.”