
I have more to report on Pacific Standard Time, but a channel change seems like a good idea right about now. So here are a few highlights from “The Visionary,” a portrait of Jaron Lanier by Jennifer Kahn in The New Yorker, July 11 & 18, 2011. (I am particularly fond of Lanier and have written about him previously, here, here and here.)

***
Unlike more Luddite critics, Lanier complains not that technology has taken over our lives but that it has not given us enough back in return. In place of a banquet, we’ve been given a vending machine.

“The thing about technology is that it’s made the world of information ever more dominant,” Lanier told me. “And there’s so much loss in that. It really does feel as if we’ve sworn allegiance to a dwarf world, rather than to a giant world.”

***
About his childhood:
“The trifecta for me was eating chocolate, listening to Bach, and staring at Bosch.”

***
Part of what Lanier finds most regrettable about Facebook—the way it mediates social contact—is precisely what makes it so appealing to most people. “We use technology this way all the time,” Andy van Dam, a professor of computer science at Brown University, notes. “To create a layer of insulation. We send an e-mail so we don’t have to call someone on the phone. Or we call someone so we don’t have to go over to their house.”

***
“My dad was more into ‘Be the Buckminster Fuller or the Frank Lloyd Wright’–be the weird outsider who becomes influential. Which is kind of where I ended up.”

***
Lanier is like “an innovative painter who alternately courts and scorns the establishment.”


The Wasatch Mountains in Utah

This comment from Bill Keller in the New York Times caught my eye:

In “The Uncoupling,” there is a wistful passage about the high-school cohort my daughter is about to join. Wolitzer describes them this way: “The generation that had information, but no context. Butter, but no bread. Craving, but no longing.”

That description could apply to many more cohorts than just high schoolers. Generally I’m not wired for jeremiads or lamentations of discontent, but it is reasonable to ask whether all of us are suffering from contextually challenged information, from too much of the oleaginous spread without the hearty loaf, from cravings that still leave us feeling unsatisfied and undernourished.

On one level our Twitter feeds and Facebook news streams are a customized news service. My partner David calls his Twitter connections his “readers,” grateful for how they flag articles he would never find if he were meandering the halls of cyberspace by himself. And that’s useful, without question.

But like a river that can just as easily carry pleasure boats as it can sewage, the constant flow of information (a term I use loosely) starts to resemble effluvia if you stay in it too long. Perhaps I should just make that a first-person statement: The constant flow of information starts to resemble effluvia if I stay in it too long. Like the sage advice for dealing with a difficult family member, the secret seems to be this: Limit your exposure.

While my online persona gets much less time than my other, more important selves, I still look forward to extended breaks when I get to check out of all my obligations. These sojourns are my own “limit your exposure” control system.

So I’m off for a week, this time to Utah. I’ll be back, refreshed and ready to replunge, re-engage, reconnect.


Jaron Lanier

Most of us can recognize people who think like us. It’s the ease we have in following arguments, the familiarity in the way someone moves from one idea to the next. Sometimes it is subtle, but when you share your thinking mother tongue with someone, there is comfort in that shared vernacular.

Most of us can also recognize when we run up against someone who has a completely different way of thinking about the world. I’ve had that sensation of dis-familiarity when I’ve sat with someone suffering from schizophrenia or Alzheimer’s. But I’ve also been exhilarated when I encounter an extremely different way of seeing the world. That’s what I have been feeling from the very beginning of the brilliant and provocative book You Are Not a Gadget, by Jaron Lanier. A technologist who has been at the forefront of software design and the Web, Lanier lays open many of the missteps made long ago that we have had to adjust to and accommodate. But things didn’t have to be the way they are, and paying attention to those errors matters for the decisions we make going forward.

Lanier describes his book as a manifesto, and in many ways it has the rhetorical power of a political declaration. Chunked into manageable, bite-sized passages, You Are Not a Gadget is a fistful of extraordinary insights and wisdom that come from a mind that can stand still and drill down 50 feet. He’s got extreme verticality, that’s for sure. And since I’m more horizontally inclined—more adept at covering lots of territory than at staying in one spot and digging deep—the perspicacity of Lanier’s thinking just keeps coming with every page.

The thing about Lanier is he doesn’t take anything for granted. Everything is scrutinized. One of his key concepts that explains where things have gone wrong is what he calls “lock-in.” Once a software design is formalized and ubiquitous, everything must conform to that structure. Good ideas that don’t fit that particular approach cannot be considered. Lanier offers a number of great examples of this, but the one I particularly like is his discussion of the ubiquitous software concept of the file.

An even deeper locked-in idea is the notion of the file. Once upon a time, not too long ago, plenty of computer scientists thought the idea of the file was not so great.

The first design for something like the World Wide Web, Ted Nelson’s Xanadu, conceived of one giant, global file, for instance. The first iteration of the Macintosh, which never shipped, didn’t have files. Instead, the whole of a user’s productivity accumulated in one big structure, sort of like a singular personal web page. Steve Jobs took the Mac project over…and soon files appeared.

UNIX had files; the Mac as it shipped had files; Windows had files. Files are now part of life; we teach the idea of a file to computer science students as if it were part of nature. In fact, our conception of files may be more persistent than our ideas about nature. I can imagine that someday physicists might tell us that it is time to stop believing in photons, because they have discovered a better way to think about light—but the file will likely live on.

The file is a set of philosophical ideas made into eternal flesh. The ideas expressed by the file include the notion that human expression comes in severable chunks that can be organized as leaves on an abstract tree—and that the chunks have versions and need to be matched to compatible applications.
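The file passage is concrete enough to sketch in code. Here is a minimal illustration of my own (nothing from Lanier’s book; every name in it is invented) of the philosophical commitments he says are frozen into the file: expression broken into severable chunks, hung as leaves on an abstract tree, each chunk carrying a version and needing a compatible application.

```python
# A minimal sketch of the locked-in file abstraction Lanier describes.
# My illustration, not code from the book; all names are invented.

from dataclasses import dataclass, field

@dataclass
class File:
    """A severable chunk of human expression: a leaf on the tree."""
    name: str
    version: int           # chunks have versions...
    application: str       # ...and need a compatible application
    contents: bytes = b""

@dataclass
class Folder:
    """An interior node of the abstract tree."""
    name: str
    children: list = field(default_factory=list)  # Files and Folders

# The whole of a user's expression, organized the locked-in way:
tree = Folder("home", [
    Folder("essays", [File("gadget.txt", version=2, application="text editor")]),
    Folder("songs", [File("tune.mid", version=1, application="sequencer")]),
])
```

Ted Nelson’s Xanadu, by contrast, would have had no tree at all: one giant, global file, nothing severed.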

That quote is from page 13, and so much more follows. The book’s five parts each deal with topics of profound importance:

What is a person?
What will money be?
The unbearable thinness of flatness
Making the best of bits
Future humors

I’m still swimming in this sea of extraordinary ideas and will be for a while. I am sure I will have more comments to make about the book as I continue reading it. Until then, here’s a Lanierism to keep a spirit hopeful: “If it’s important to find the edge of mystery, to ponder the things that can’t quite be defined—or rendered into a digital standard—then we will have to perpetually seek out entirely new ideas and objects, abandoning old ones like musical notes.”


Kevin Kelly and Steve Johnson (Illustration: Jason Holley, Wired)

This is a follow-up to my earlier post about Steven Johnson’s new book, Where Good Ideas Come From.

These excerpts are from a conversation between Kevin Kelly, author of What Technology Wants, and Steven Johnson, published in Wired:

***
Kelly: Really, we should think of ideas as connections, in our brains and among people. Ideas aren’t self-contained things; they’re more like ecologies and networks. They travel in clusters.

***
Johnson: I was particularly taken with your idea that technology wants increasing diversity—which is what I think also happens in biological systems, as the adjacent possible becomes larger with each innovation. As tech critics, I think we have to keep this in mind, because when you expand the diversity of a system, that leads to an increase in great things and an increase in crap.

***
Kelly: Ten years ago, I was arguing that the problem with TV was that there wasn’t enough bad TV. Making TV was so expensive that accountants prevented it from becoming really crappy—or really great. It was all mediocre. But that was before YouTube. Now there is great TV!

***
Kelly: To create something great, you need the means to make a lot of really bad crap. Another example is spectrum. One reason we have this great explosion of innovation in wireless right now is that the US deregulated spectrum. Before that, spectrum was something too precious to be wasted on silliness. But when you deregulate—and say, OK, now waste it—then you get Wi-Fi.

Johnson: This is another idea with a clear evolutionary parallel, right? If we didn’t have genetic mutations, we wouldn’t have us. You need error to open the door to the adjacent possible.

***
Kelly: In my book, I quote the astrophysicist Paul Davies, who asks whether the laws of nature are “rigged in favor of life.” For my part, I think the laws of nature are rigged in favor of innovation.

Johnson: Life seems to gravitate toward these complex states where there’s just enough disorder to create new things. There’s a rate of mutation just high enough to let interesting new innovations happen, but not so many mutations that every new generation dies off immediately.

Kelly: Right. This is a big theme in your book, too—the idea that the most creative environments allow for repeated failure.


Complexity and flow: Never what it seems

Nicholas Carr’s latest book, The Shallows: What the Internet Is Doing to Our Brains, continues to spawn conversations regarding what we can and cannot know about the effect of cybertechnology on our brains and cognitive abilities. (A recent post about the book is here with links to earlier posts about agent provocateur Carr.)

In an article posted on Miller-McCune by Nate Kornell and Sam Kornell, the authors draw parallels to the jeremiads written in the 1950s about the damage television would do to intelligence and education. That turned out not to be true. (Research the Flynn Effect for more information on this.)

Kornell and Kornell make their case regarding the internet:

Is Nicholas Carr correct to argue that the Internet is remapping our neural circuitry in a harmful way? Critics hoping to poke holes in Carr’s argument have cited a 2009 study by neuroscientists at the University of California, Los Angeles, who found that compared to reading a book, performing Google searches increased brain activity in the area that underlies selective attention and deliberate analysis.

It’s not a bad study to cite, since Carr specifically claims that the Web is bad for our neural circuitry. But it’s also misleading, because the term “intelligence” is so broad and complex that neurological research hasn’t begun to explain it in its totality — which means that the study shouldn’t be used to support the claim that the Internet is making us “smarter,” either.

The authors point out that everything affects neural circuitry, and that neural circuitry per se is not the place to explore effects on intelligence.

So what, finally, of the simple logical argument that skipping from hyperlink to hyperlink online is less mentally nourishing than reading a challenging book or a long magazine article? Here, critics of the Web have a strong case. Life is a daily struggle to attain clarity of thought, and devoting your undivided attention to something for an extended period of time — like a book — is a good way to achieve it. Better, probably, than surfing the Web.

But clarity of thought and IQ — which is the measure of intelligence independent of knowledge — are not the same thing. The great wealth of empirical data gathered by neuroscientists and cognitive psychologists in recent decades suggests that “intelligence” is a broad term for a very complex phenomenon, which makes it tenuous at best to draw conclusions about the effect of the Internet on something as “global” — as the Nobel Prize-winning cognitive psychologist Daniel Kahneman has put it — as intelligence.

So even though book reading requires a formalized concentration and surfing the Internet disperses it, that is no foundation for saying that life on the Web makes us less intelligent. “It can be mentally distracting, but that doesn’t mean it’s mentally deforming.”

It’s useful to remember, when considering the argument that the Web is contributing to our mental downfall, that ruing the invention of new forms of mass communication is a historical tradition of long standing. Television, typewriters, telegrams, telephones, writing in languages other than Latin, writing at all—at one point or another all of these were declared sure signposts of the fall of Western civilization.

None of them did, and if history is any guide, the Internet won’t either.


(Image: Doug Johnson at The Blue Skunk Blog)

In Hamlet’s BlackBerry: A Practical Philosophy for Building a Good Life in the Digital Age, William Powers quotes Henry David Thoreau, who wrote that the man who constantly and desperately keeps going to the post office to check for correspondence from others “has not heard from himself in a long while.”

Sounds like a contemporary proclivity, given how many of us interrupt our lives to constantly check email, Facebook, Twitter and LinkedIn. “Of the two mental worlds everyone inhabits, the inner and the outer, the latter increasingly rules,” says William Powers. “We’re like so many pinballs bouncing around a world of blinking lights and buzzers. There’s lots of movement and noise, but it doesn’t add up to much.”

“Seems, madam! Nay, it is; I know not ‘seems.’”

In her review of the book, Rasha Madkour points to Powers’ economic impact statement:

While recognizing that technology has made tasks like paying bills much easier and faster, Powers disputes the notion that it has made us more efficient. By interrupting our work to check our inboxes throughout the day, we’re actually becoming less productive because of the time it takes to refocus on the task at hand. Powers cites a study that found workers spending more than a quarter of their day managing distractions, adding up to $900 billion in economic loss in 2009.

High-cost proclivities indeed.

Powers, interviewed by Bella English in the Boston Globe, tells the story behind the title of his book:

The more I connect digitally, the more I’m drawn to hard copy, so I decided to look at the history of paper and all related technologies. In reading Shakespeare, I stumbled on this moment where Hamlet pulls this thing out of his pocket that he called his tables. It was this fascinating example of a new technology where you wrote on pages (made of specially coated paper) with a stylus and you erased it at night. It was very much a 400-year-old version of what we’re doing today. It came to me that this thing was like his BlackBerry.

Focusing on seven individuals from previous eras, Powers explores how each of them used new inventions to make connecting with others easier. His point is clear: This is not a new problem. The more appropriate question is how a person can live a life, use these tools, and still maintain some balance.

Powers points to Benjamin Franklin, who had a bit of a social networking addiction. “He was constantly out and about, forming these new clubs and groups and associations, and he reached a point early in his life where he was just extended in too many directions,” says Powers. “So he set up 13 goals he wanted to achieve. For example, he loved conversation, but he said he was going to aim for a little silence, too. It was not a case of withdrawing, but a case of looking for balance.”

Powers’ personal solution for seeking balance? An Internet Sabbath. He and his family turn off the modem on Friday night and keep it off until Monday morning. Sounds a bit too fundamentalist for me, but I like the concept. Ten days spent hiking in Canada outside the range of cell or cyber felt pretty damn good.


Seo 2, mixed media on canvas, 24 x 48″. From a series commissioned by Catherine Seo, professor of business and management and a social media maven. I painted this series with her hyperconnectedness in mind.

Some of you have engaged with me on the topic of the Internet’s impact on the way we think, interact, make sense of and process our world. Based on the street chatter I hear in the Twitter neighborhood where I spend my time, this issue has been a Top Ten-er for months now. Steven Johnson, a reasonable voice throughout this ongoing discussion, has written a piece in the Sunday New York Times that addresses many of the same “yes, that is true, but on the other hand” concerns I have as well on this complex, still-TBD topic. The fact is I am of two minds: I am enchanted and enriched by the chaotic overstimulation of the web AND I need and crave the solitude of my contemplative time in the studio.

Here’s a quick and topical guide to the latest variation on the essential tension between these two nodes. Nicholas Carr (whose earlier article, “Is Google Making Us Stupid?” I wrote about here and here) recently published a new book, The Shallows: What the Internet Is Doing to Our Brains. Also recently released is quite a different take on things, Clay Shirky’s Cognitive Surplus: Creativity and Generosity in a Connected Age. Both books are thorough, well-documented defenses of their respective points of view. And they are looking at the same reality from two completely different ends of the observational spectrum.

There’s the issue of multitasking, for example. Carr makes the case that the distractions so prevalent in the online world are costing us the ability to concentrate. Tests have demonstrated that heavy multitaskers perform 10-20% worse than light multitaskers, but Johnson counters that those same tests “are meaningless as a cultural indicator without measuring what we gain from multitasking.”

Johnson uses himself as a case in point and makes an argument I have made many times:

Thanks to e-mail, Twitter and the blogosphere, I regularly exchange information with hundreds of people in a single day: scheduling meetings, sharing political gossip, trading edits on a book chapter, planning a family vacation, reading tech punditry. How many of those exchanges could happen were I limited exclusively to the technologies of the phone, the post office and the face-to-face meeting? I suspect that the number would be a small fraction of my current rate.

I have no doubt that I am slightly less focused in these interactions, but, frankly, most of what we do during the day doesn’t require our full powers of concentration. Even rocket scientists don’t do rocket science all day long.

In Johnson’s view, the core of the problem with Carr’s model is that it holds the “slow contemplation of deep reading” as the highest form. According to Carr, the quiet solitude of the book is required for society to move forward. But, says Johnson, there is another way to view this:

Many great ideas that have advanced culture over the past centuries have emerged from a more connective space, in the collision of different worldviews and sensibilities, different metaphors and fields of expertise. (Gutenberg himself borrowed his printing press from the screw presses of Rhineland vintners, as Mr. Carr notes.)

It’s no accident that most of the great scientific and technological innovation over the last millennium has taken place in crowded, distracting urban centers. The printed page itself encouraged those manifold connections, by allowing ideas to be stored and shared and circulated more efficiently. One can make the case that the Enlightenment depended more on the exchange of ideas than it did on solitary, deep-focus reading.

Quiet contemplation has led to its fair share of important thoughts. But it cannot be denied that good ideas also emerge in networks.

Yes, we are a little less focused, thanks to the electric stimulus of the screen. Yes, we are reading slightly fewer long-form narratives and arguments than we did 50 years ago…but what of the other side of the ledger? We are reading more text, writing far more often, than we were in the heyday of television.

And the speed with which we can follow the trail of an idea, or discover new perspectives on a problem, has increased by several orders of magnitude. We are marginally less focused, and exponentially more connected. That’s a bargain all of us should be happy to make.

Can we go for the both/and on this?


Is it a fence or a tree? Or both?

A friend described her experience with a therapy technique that has helped her family tremendously. She distilled the approach down to two sentences:

I am doing the best I can.
I can do better.

Learning to hold these two dialectical statements as true at the same time has given her family members a new sense of themselves and each other. Like so many dialectical exercises—philosophical, ideological or otherwise—a hidden power is unleashed when two opposing forces find an unexpected third place to coexist.

A few other unexpected thoughts, some of them also dialectical in nature, have crossed my transom over the last few days. Most of these were picked up at the social media/technology conferences I’ve participated in. Although not related directly to my art making life, these statements have brought me new insights and seem worthy of sharing here. (The content in the parentheses is mine.)

***
From Caterina Fake, founder of Flickr and Hunch, and an optimist of the down deep variety:

“Babies, pets and sunsets–the backbone of the Internet.” (And all this time I thought it was porn.)

“Strangers used to be bad and dangerous. Now strangers are the source of good things online.” (So true.)

“We now live in a culture of generosity–people everywhere spending time to put all sorts of information up for free.” (All those reviews on Amazon, all those lyrics to Bob Dylan’s music entered by hand…)

***
Andrew Rasiej, expert on social media and its political implications:

“Soon 9 billion people will be connected by phones. It is going to create a new form of governance for all of humanity. People will start ignoring government and just start solving problems themselves.” (One dream of a better future for the planet.)

“Technology is not a slice of the pie. It’s the pan.” (I can imagine replacing the nouns in that metaphor to get some interesting variations.)

***
JP Rangaswami, CIO and Chief Scientist at BT Design, speaking to IT professionals:

“Trying to restrict/control something that is meant to be abundant results in an equal and greater effort to restore that abundance.” (This may sound New Age-ish, but that is definitely not where he is coming from on this.)

“Once info is made digital, it will leak.” (The corporate firewall=Swiss cheese)

“It took 50 years for IBM to become evil, 20 years for Microsoft, 10 for Google, 5 for Facebook and 2 for Twitter.” (New variation on Moore’s law…)

“When I used to call my grandmother, she never had to say, ‘Can you hear me now?'” (Back when all phones were black and you could only buy them from the phone company.)

“My father had one job his entire life. I have had 7. My son has 7 all at once.” (Like the reading of books, concomitance is now king.)

“If you are on Second Life, you don’t have a first life.” (Sorry if you are a SL fan)

“My advice is to always start open and then only close when you must.” (In more ways than in the design of IT systems…)

***
Andrew McAfee, author of “Enterprise 2.0”:

“Best practices are a recipe for mediocrity. All that means is that everybody is doing the same thing.” (Yes!)

“Decisions are the least digitized asset.” (This is the bane of knowledge management systems—how do you capture that?)

“As William Gibson has pointed out, ‘the future is already here—it is just unevenly distributed.'” (Oh ye sage, William the Great)


A watercolor by Renee Collins, from my collection. I don’t know the name that Renee originally gave it, but I’ve always referred to it as “Leaky Margins.”

If you spend a fair amount of time online, you have probably come up against The Membrane. It functions a bit like a cell wall: the boundary between the cyber inside and the cyber out there, the me and the you, a sense of being connected and yet protected, visible and yet not. At certain places that membrane is only a few air packets thick, with lots of porosity and flow back and forth. At other places it fits like a glove, impermeable, an invisible armor that makes for easy escapes and incognito travel. There’s nothing quite like it IRL. Sometimes it frustrates me. Sometimes I think it is the coolest thing in all the world.

It was in that context that I found this quote so provocative.

I do not know if it has ever been noted before that one of the main characteristics of life is discreteness. Unless a film of flesh envelops us, we die. Man exists only insofar as he is separated from his surroundings. The cranium is a space-traveler’s helmet. Stay inside or you perish. Death is divestment, death is communion. It may be wonderful to mix with the landscape, but to do so is the end of the tender ego.

–Vladimir Nabokov

Beautiful thing, discreteness, even when contemplated within the entropic morbidity we all face. Thank you, Whiskey River, for bringing me another great quote moment.

Two of my all-time favorite blogs have now been transmogrified into versions of themselves as old media (i.e., books). The first was BibliOdyssey: Amazing Archival Images from the Internet, compiled by my friend and master archivist, the inimitable PK, from his very popular site of the same name. Published at the end of 2007, PK’s book brought together his extraordinary gleaning of images from web image coffers all over the world. The book was a venture undertaken by the British small publisher/design company Fuel, where BibliOdyssey shares a book berth with other Fuel titles such as Home-Made, a compendium of objects made by Russians when the Soviet Union’s demise made access to manufactured goods difficult, and Russian Criminal Tattoo Encyclopedia (Volumes I, II and III).

A second conversion from new media to old is Strange Maps. Frank Jacobs thought his love of weird and eccentric cartographic imaging was something only he and his map-geeky friends were interested in. So the success of his blog—10 million hits as of March of last year—came as a surprise to him. His Strange Maps: An Atlas of Cartographic Curiosities was published late in 2009 and combines maps and commentary. His categories range from cartographic misconceptions to zoomorphic maps to a great catch-all, “whatchamacallit.”

Each medium has its advantages. The random-access quality of stopping by either of these sites and never knowing what will pop up is engaging, but the organizational advantages of the book form have their place as well. I just love the panoply of images that PK and Jacobs have gathered for our general enlightenment and delight. More, more.