

Revisiting the past: “Tuffesse,” 20 x 50″, from a body of work I painted about the same time as this original post

This post first appeared here in April 2007. In looking back through that period of time I found these quotes from Gilles Deleuze and Felix Guattari still relevant and useful. A Thousand Plateaus is one of those timeless books that continues to be a fecund source for ideas, stimulation, provocation, inspiration, insight.

***
I spent last week at the Ad:Tech interactive advertising and technology conference in San Francisco talking to people about where they see the Web heading and what life online is going to look like in a few more years. The range of future views I heard was, as expected, diverse. While I do not have a clear idea of my own about how the plethora of possible scenarios will play out, what did emerge was the distinct view of this space as a potentiality, an undefined, nonlinear, anything-is-possible vortex. I kept being reminded of A Thousand Plateaus, the mind-blowing, rhapsodic “book” (hard to call it that) by Gilles Deleuze and Felix Guattari. A few salient quotes:

The principal characteristics of a rhizome: unlike trees or their roots, the rhizome connects any point to any other point, and its traits are not necessarily linked to traits of the same nature; it brings into play very different regimes of signs, and even nonsign states.

A rhizome has no beginning or end; it is always in the middle, between things, interbeing, intermezzo.

Unlike the tree, the rhizome is not the object of reproduction: neither external reproduction as image-tree nor internal reproduction as tree-structure. The rhizome is an antigenealogy. It is short-term memory, or antimemory. The rhizome operates by variation, expansion, conquest, capture, offshoots.

Once a rhizome has been obstructed, arborified, it’s all over, no desire stirs; for it is always by rhizome that desire moves and produces.

The wisdom of the plants; even when they have roots, there is always an outside where they form a rhizome with something else–with the wind, an animal, human beings.

Write, form a rhizome, increase your territory by deterritorialization, extend the line of flight to the point where it becomes an abstract machine covering the entire plane of consistency.

We have lost the rhizome, or the grass. Henry Miller: …“Grass is the only way out.”

Make rhizomes, not roots, never plant! Don’t sow, grow offshoots! Don’t be one or multiple, be multiplicities! Run lines, never plot a point! Speed turns the point into a line!


Illustration by Joon Mo Kang (New York Times)

In John Logan’s play Red, one of the first topics discussed by the painter with his studio assistant is The Birth of Tragedy by Friedrich Nietzsche. While Rothko waxes rhapsodic about the profundity of the ideas in the book, he also takes time to browbeat his new assistant, a young artist, for never having read it. The arguments and insights as voiced by Rothko are still relevant and compelling to a 21st century audience, part of what makes the play so satisfying. And yes, the studio assistant does, in the course of time covered by the play, read the book for himself. Near the end he articulates his own take on its significance to his generation of artists. Nietzsche’s ideas have the ability to reformat as the pressure points in the culture change and morph.

So Nietzsche continues to be a vital force all these years later. Complex and complicated as a man and a thinker, his legacy still incites debate and multiple interpretations. A new book, American Nietzsche, by Jennifer Ratner-Rosenhagen, focuses specifically on the German philosopher’s imprint on American thinkers. Reviewed in the Times Book Review by Alexander Star, the book is a fascinating exploration of Nietzsche’s impact on the very particular drift of American culture. From the review:

Today’s inescapable and perplexing Nietzsche is not necessarily the same Nietzsche who inspired readers in the past…Though Nietzsche loathed the left, he was loved by it. As Ratner-Rosenhagen explains, the anarchists and “romantic radicals” as well as the “literary cosmopolitans of varying political persuasions” who welcomed him to America believed they had found the perfect manifestation of Emerson’s Poet, for whom a thought is “alive, . . . like the spirit of a plant or an animal.” To read Nietzsche was to overcome an entire civilization’s inhibiting divide between thinking and feeling. Isadora Duncan said he “ravished my being,” while both Jack London and Eugene O’Neill saw him as their Christ. Emma Goldman ended her romance with the Austrian anarchist Ed Brady because he didn’t appreciate the great author who had taken her to “undreamed-of heights.” For such readers, “Thus Spoke Zarathustra,” with its incantatory calls for a race of overmen to establish a new morality that would “remain faithful to the earth,” was the true Nietzsche. Thrilling to its rhapsodies, they felt confirmed in their judgment that pious, stultifying America was no place for a serious thinker. Ratner-Rosenhagen nicely writes, “Many years before members of this generation were ‘lost’ in Europe, they felt at home in Nietzsche, and homeless in modern America.”

Exploring the role Nietzsche played in the evolution of Emersonianism and postmodernism as well as the thinking of H. L. Mencken, Harold Bloom and Stanley Cavell is a worthy journey.

The final paragraph of the review captures an essence of his thinking that I have come back to again and again:

In his 1985 book “Nietzsche: Life as Literature,” the Princeton philosopher Alexander Nehamas argued that Nietzsche’s perspectivism does not imply that all beliefs are equally valid but that “one’s beliefs are not, and need not be, true for everyone.” On this reading, to fully accept a set of beliefs is to accept the values and way of life that are bound up with it, and since there is no single way of life that is right for everyone, there may be no set of beliefs that is fit for everyone. At its best, American individualism is not about the aggrandizement of the self or the acquiescent assumption that everybody simply has a right to think what they want. Rather, it stresses that our convictions are our own, and should be held as seriously as any other possessions. Or, as Nietzsche imagined philosophers would one day say, “‘My judgment is my judgment’: no one else is easily entitled to it.”


Skyline of the Wasatch Mountains in Salt Lake just after a cloudburst

Yesterday I heard an interview with an American journalist on NPR. She has spent most of the last 8 years in Afghanistan reporting on the war. In the process she developed a deep affection for the country and its plight, so much so that she just couldn’t bear to stay and watch as bad decisions and misguided policies made things worse.

For the last few months she has been living on Cape Cod. Instead of reveling in the exquisite summer of that breathtaking landscape she has been restless and dissatisfied, stewing over her discomfort in being back in what was once her homeland. Her turmoil is more than missing the adrenaline of a war zone, she said. It is how much the United States has changed since she last lived here.

“Everyone I speak with now is deeply unhappy with the way things are going in this country. Everyone. They each have a list of what they think is broken, and their concerns vary. But every person I speak with is convinced this country has severe problems and that we are headed in the wrong direction.”

That is my experience as well, and it was brought home to me during a recent trip to the west. Two of my most spiritually inclined friends live off the grid in the wilderness of New Mexico, and they announced, quite unexpectedly, that they are very optimistic about the future. It stopped me in my tracks. I hadn’t heard that kind of optimism from anyone. For years. At that moment I realized the deep divide between life 10 years ago and now. If I had polled my friends about their view of the future just 5 years ago, I would have seen a reasonable bell curve distribution ranging from “life is great!” to “everything sucks”. Now that response would just flatline.

Is this just a case of “end of the American empire” blues? The twilight of our self-professed hegemony and “best country in the world” mythos? Is it generational? Is it a proclivity particular to progressives and liberals (like me and 99% of my closest friends)? Is it a larger story, a global pessimism that transcends national boundaries or political beliefs? Maybe a case of e) all of the above?

I keep thinking about the cultural anthropologist Angeles Arrien, who spent 20 years living with indigenous people and learning how to live from those who seem to do it with more joy than we do.* She was a keynote speaker at a psychology conference a few years ago and told a thousand therapists, “You think you know all about addictions? Well maybe not. We live in a culture that harbors addictions so large you probably don’t even see them.”

Here is her list:

1. Addiction to intensity and drama.
2. Addiction to the myth of perfection.
3. Addiction to focusing on what’s not working.
4. Addiction to having to know.

This past week has been all about #3 for me. Every political update on MSNBC, Facebook and Twitter (and particularly exemplified by the hashtag firestorm of Jeff Jarvis’s “#fuckyouwashington” last weekend) has been about what’s broken, what isn’t working. And yes, it does have an addictive quality to it. You get good at finding what’s broken, and what’s broken gets very good at finding you. It’s a reinforcing loop.

Being a hermit or doing a “Jonathan Franzen” (he wrote The Corrections wearing “earplugs, earmuffs and a blindfold”, and for Freedom he shut down his Ethernet port with Super Glue) is an option. But is it possible to shift to another lens for viewing the world? I am tired of feeling hopeless. Maybe that is part of the old wisdom that things sometimes have to get worse before they can get better. The saturation finally forces a shift.

No answers here, just a public pondering of what it will take to move out of this weather pattern.

________
* Angeles Arrien’s Four Fold Way, culled from her experiences living with several different indigenous populations:

1. Show up and be present.
2. Pay attention to what has heart and meaning for you.
3. Tell your truth without judgment or blame.
4. Be open to outcome, but not attached to outcome.


Susan Sontag

Claims and concerns that we are creating an increasingly voyeuristic culture are heard frequently these days. The deeply disturbing (but essential viewing IMHO) film, Catfish, is just one of a number of movies, books and articles delving more deeply into how we are constructing relationships with others and how we construct our sense of ourselves. Couple that with our culture’s fixation on celebrities and their dramas, often manufactured like plot lines, and it would be easy to see a serious cultural devolution headed our way.

I’m not of that mindset. Yes, we have more tools with which to understand (probe, invade, explore, exploit) the lives of others. But it isn’t all a play to the lowest common denominator in our natures. Small towns used to be the perfect setting for learning from your neighbor’s follies as well as successes. Most of us don’t live in that arena, so we have shifted our “learning by watching others” to novels, films and biographies—the more acceptable and “high brow” variation—or by way of the less acceptable realms of reality TV, E! channel, People magazine and Gawker.

I don’t care about the details of Charlie Sheen’s life or his latest Twitter rant, but there are life details that do compel me. Is it voyeurism or a more respectable desire to learn by watching a pro? Maybe a little of both?

A new memoir, Sempre Susan, by Sigrid Nunez, describes a young woman’s experience as a housemate with the larger than life and brilliant Susan Sontag. In a review of the book in the Boston Globe, Alice Gregory offers this perspective:

The literature that discloses the private lives of public intellectuals is a category of erotica in itself. For a certain sort of person, nothing is more titillating. Deciphering a persona, anecdote-by-anecdote, to reveal the person behind it can be a vexed enterprise, since risking their dignity is almost always an occupational hazard…Nunez quietly gets out of the way in this thin volume. Her own writing style is mostly invisible, which is as it should be. We want Sontag’s eccentricities neat — not shaken or stirred by those who witnessed them.

Sontag’s life, told with her flaws and pretensions on display, is of interest to me. She was a seminal influence on my thinking from my college days, and her point of view and writing still stir me. I’m not surprised to learn she was difficult. But all the more credit to Nunez for being able to get past the voyeuristic fodder and deliver up a more full-bodied portrait of this brilliant, complex, vulnerable woman.

Gregory does offer up the darker side of Sontagism in her closing comments. And there’s some truth in this, even for a perennial Sontag fan like me:

A premature introduction to Susan Sontag is a dangerous thing. How many 18-year-olds have read “Against Interpretation’’ and taken from it permission to write ruthless polemics that they aren’t quite ready to defend? Sontag is often the gateway drug to intellectual life, lionized by students hell-bent on muscling out a critical worldview. And for good reason. Her essays on art and politics are some of the fiercest and most influential of 20th century letters. “Sempre Susan’’ summons those sophomoric yearnings while also giving us a fair and openhearted portrait. Nunez has constructed a eulogy that mythologizes and humanizes one of the most intimidating figures of contemporary culture.

A common theme in my postings over the last few weeks has been the very basic question, “how are we to live?” While it is sometimes hard to be objective about the prevalence of a trend when it is a topic you yourself are interested in (I call it the “car-buying syndrome”—all of a sudden the type of car you are considering starts showing up everywhere), it does seem to be a topic of increased interest in the culture in general. I referenced several new book titles that address various aspects of these concerns in my earlier post, The River of Knowledge, as well as a few inspired by Sarah Bakewell’s very successful book on Montaigne, How to Live (here and here).

In her review of Examined Lives: From Socrates to Nietzsche by James Miller for the New York Times Book Review, Bakewell states her belief that “philosophy is poorer when it loses sight of the messy lives of those who do the philosophizing.” And certainly her book does a great job of bringing together the events of Montaigne’s life with his philosophical writings. “Montaigne’s idea of philosophy, which he inherited from the Greeks and Romans, was mainly of a practical art for living well,” says Bakewell. “It would have seemed odd to him to spend all day studying philosophy in a university classroom, but then have to go to a bookstore’s self-help department to find a book on how to cope with bereavement or depression.” Bakewell’s answer to the query of how to live? “Let life be its own answer,” she said. “You learn to live mainly by living — and making a lot of mistakes.”

More “how to live” wisdom showed up in James Ryerson’s essay, Thinkers and Dreamers. Posing the question, “can a novelist write philosophically?”, Ryerson quotes novelist and philosopher Iris Murdoch. The two pursuits are contrary, says Murdoch in a BBC interview from 1978. Philosophy uses the analytical mind to solve conceptual problems in an “austere, unselfish, candid” prose, and literature calls upon the imagination to produce something “mysterious, ambiguous, particular” about the world.

Murdoch’s distinction between philosophy and fiction applies to life in general it seems to me. The conscious—and conscientious—deployment of our analytical and imaginal skills is an ongoing balancing act. In my case the “mysterious, ambiguous and particular” is where I spend most of my time. For someone else, it may be the reverse. In spite of my own proclivities, I want to be competently bilingual. And as Bakewell suggests, you learn how to do that by living your life. And by making lots of mistakes.

One of my favorite spots on the web is the annual World Question* presented by The Edge. Each year a provocative question is posed, then answers flow in from every profession and point of view. It is a fascinating cross section of thinking, perspectives and insights.

The question being asked for 2011 is:

What scientific concept would improve everybody’s cognitive toolkit?

The answers posted are rich, varied and unexpected. And there is very little overlap. If you are compelled by ideas, reading through them all will be a terrific adventure. Here are a few excerpts that stood out for me:

***
Linda Stone, former executive at Apple and Microsoft:

Suspending Disbelief

Projective thinking is a term coined by Edward de Bono to describe generative rather than reactive thinking…

Articulate, intelligent individuals can skillfully construct a convincing case to argue almost any point of view. This critical, reactive use of intelligence narrows our vision. In contrast, projective thinking is expansive, “open-ended” and speculative, requiring the thinker to create the context, concepts, and the objectives…

When we cling rigidly to our constructs…we can be blinded to what’s right in front of us.

***
Kevin Kelly, author of What Technology Wants:

The Virtues of Negative Results

We can learn nearly as much from an experiment that does not work as from one that does. Failure is not something to be avoided but rather something to be cultivated. That’s a lesson from science that benefits not only laboratory research, but design, sport, engineering, art, entrepreneurship, and even daily life itself. All creative avenues yield the maximum when failures are embraced.

***
Alison Gopnik, author of The Philosophical Baby:

The Rational Unconscious

One of the greatest scientific insights of the twentieth century was that most psychological processes are not conscious. But the “unconscious” that made it into the popular imagination was Freud’s irrational unconscious — the unconscious as a roiling, passionate id, barely held in check by conscious reason and reflection. This picture is still widespread even though Freud has been largely discredited scientifically.

The “unconscious” that has actually led to the greatest scientific and technological advances might be called Turing’s rational unconscious…The greatest advantage of understanding the rational unconscious would be to demonstrate that rational discovery isn’t a specialized abstruse privilege of the few we call “scientists”, but is instead the evolutionary birthright of us all. Really tapping into our inner vision and inner child might not make us happier or more well-adjusted, but it might make us appreciate just how smart we really are.

***
Richard Foreman, playwright:

Negative Capability Is A Profound Therapy

Mistakes, errors, false starts — accept them all. The basis of creativity.

My reference point (as a playwright, not a scientist) was Keats’s notion of negative capability (from his letters). Being able to exist with lucidity and calm amidst uncertainty, mystery and doubt, without “irritable (and always premature) reaching out” after fact and reason.

This toolkit notion of negative capability is a profound therapy for all manner of ills — intellectual, psychological, spiritual and political. I reflect it (amplify it) with Emerson’s notion that “Art (any intellectual activity?) is (best thought of as but) the path of the creator to his work.”

***
Robert Sapolsky, neuroscientist:

The Lure Of A Good Story

Various concepts come to mind for inclusion in that cognitive toolkit. “Emergence,” or related to that, “the failure of reductionism” — mistrust the idea that if you want to understand a complex phenomenon, the only tool of science to use is to break it into its component parts, study them individually in isolation, and then glue the itty-bitty little pieces back together. This often doesn’t work and, increasingly, it seems like it doesn’t work for the most interesting and important phenomena out there.

***
Christine Finn, archaeologist:

Absence and Evidence

I first heard the words “absence of evidence is not evidence of absence” as a first-year archaeology undergraduate. I now know it was part of Carl Sagan’s retort against evidence from ignorance, but at the time the non-ascribed quote was part of the intellectual toolkit offered by my professor to help us make sense of the process of excavation…Recognising the evidence of absence is not about forcing a shape on the intangible, but acknowledging a potency in the not-thereness.

***
John McWhorter, author of That Being Said:

Path Dependence

In an ideal world all people would spontaneously understand that what political scientists call path dependence explains much more of how the world works than is apparent. Path dependence refers to the fact that often, something that seems normal or inevitable today began with a choice that made sense at a particular time in the past, but survived despite the eclipse of the justification for that choice, because once established, external factors discouraged going into reverse to try other alternatives.

The paradigm example is the seemingly illogical arrangement of letters on typewriter keyboards…Most of life looks path dependent to me. If I could create a national educational curriculum from scratch, I would include the concept as one taught to young people as early as possible.

***
Scott D. Sampson, author of Dinosaur Odyssey: Fossil Threads in the Web of Life:

Interbeing

Humanity’s cognitive toolkit would greatly benefit from adoption of “interbeing,” a concept that comes from Vietnamese Buddhist monk Thich Nhat Hanh. In his words:

“If you are a poet, you will see clearly that there is a cloud floating in [a] sheet of paper. Without a cloud, there will be no rain; without rain, the trees cannot grow; and without trees, we cannot make paper. The cloud is essential for the paper to exist. If the cloud is not here, the sheet of paper cannot be here either . . . “Interbeing” is a word that is not in the dictionary yet, but if we combine the prefix “inter-” with the verb “to be,” we have a new verb, inter-be. Without a cloud, we cannot have a paper, so we can say that the cloud and the sheet of paper inter-are. . . . “To be” is to inter-be. You cannot just be by yourself alone. You have to inter-be with every other thing. This sheet of paper is, because everything else is.”

Depending on your perspective, the above passage may sound like profound wisdom or New Age mumbo-jumbo. I would like to propose that interbeing is a robust scientific fact — at least insomuch as such things exist — and, further, that this concept is exceptionally critical and timely.

***
Marco Iacoboni, author of Mirroring People:

Entanglement

Entanglement is “spooky action at a distance”, as Einstein liked to say (he actually did not like it at all, but at some point he had to admit that it exists.) In quantum physics, two particles are entangled when a change in one particle is immediately associated with a change in the other particle. Here comes the spooky part: we can separate our “entangled buddies” as far as we can, they will still remain entangled. A change in one of them is instantly reflected in the other one, even though they are physically far apart (and I mean different countries!)

Entanglement feels like magic. It is really difficult to wrap our heads around it. And yet, entanglement is a real phenomenon, measurable and reproducible in the lab. And there is more. While for many years entanglement was thought to be a very delicate phenomenon, only observable in the infinitesimally small world of quantum physics (“oh good, our world is immune from that weird stuff”) and quite volatile, recent evidence suggests that entanglement may be much more robust and even much more widespread than we initially thought. Photosynthesis may happen through entanglement, and recent brain data suggest that entanglement may play a role in coherent electrical activity of distant groups of neurons in the brain.

Entanglement is a good cognitive chunk because it challenges our cognitive intuitions. Our minds seem built to prefer relatively mechanistic cause-and-effect stories as explanations of natural phenomena. And when we can’t come up with one of those stories, then we tend to resort to irrational thinking, the kind of magic we feel when we think about entanglement. Entangled particles teach us that our beliefs of how the world works can seriously interfere with our understanding of it. But they also teach us that if we stick with the principles of good scientific practice, of observing, measuring, and then reproducing phenomena that we can frame in a theory (or that are predicted by a scientific theory), we can make sense of things. Even very weird things like entanglement.

***
Barry Smith, writer and presenter, BBC World Service series “The Mysteries of the Brain”:

The Senses and the Multi-Sensory

For far too long we have laboured under a faulty conception of the senses. Ask anyone you know how many senses we have and they will probably say five; unless they start talking to you about a sixth sense. But why pick five? What of the sense of balance provided by the vestibular system, telling you whether you are going up or down in a lift, forwards or backwards on a train, or side to side on a boat? What about proprioception that gives you a firm sense of where your limbs are when you close your eyes? What about feeling pain, hot and cold? Are these just part of touch, like feeling velvet or silk? And why think of sensory experiences like seeing, hearing, tasting, touching and smelling as being produced by a single sense?

Contemporary neuroscientists have postulated two visual systems — one responsible for how things look to us, the other for controlling action — that operate independently of one another. The eye may fall for visual illusions but the hand does not, reaching smoothly for a shape that looks larger than it is to the observer.

And it doesn’t stop here. There is good reason to think that we have two senses of smell: an external sense of smell, orthonasal olfaction, produced by inhaling, that enables us to detect things in the environment such as food, predators or smoke; and an internal sense, retronasal olfaction, produced by exhaling, that enables us to detect the quality of what we have just eaten, allowing us to decide whether we want any more or should expel it.

***
Neri Oxman, architect:

It Ain’t Necessarily So

Preceding the scientific method is a way of being in the world that defies the concept of a solid, immutable reality. Challenging this apparent reality in a scientific manner can potentially unveil a revolutionary shift in its representation and thus recreate reality itself. Such suspension of belief implies the temporary forfeiting of some explanatory power of old concepts and the adoption of a new set of assumptions in their place.

Reality is the state of things as they actually exist, rather than the state by which they may appear or be thought to be — a rather ambiguous definition given our known limits to observation and comprehension of concepts and methods. This ambiguity, captured by the aphorism that things are not what they seem, and again with swing in Sportin’ Life’s song It Ain’t Necessarily So, is a thread that seems to consistently appear throughout the history of science and the evolution of the natural world. In fact, ideas that have challenged accepted doctrines and created new realities have prevailed in fields ranging from warfare to flight technology, from physics to medicinal discoveries…

It Ain’t Necessarily So is a drug dealer’s attempt to challenge the gospel of religion by expressing doubts in the Bible: the song is indeed immortal, but Sportin’ himself does not surpass doubt. In science, Sportin’s attitude is an essential first step forward but it ain’t sufficiently so. It is a step that must be followed by scientific concepts and methods. Still, it is worth remembering to take your Gospel with a grain of salt because, sometimes, it ain’t nessa, ain’t nessa, it ain’t necessarily so.

______
*The World Question Center is part of Edge Foundation, Inc., an organization with a mandate to “promote inquiry into and discussion of intellectual, philosophical, artistic, and literary issues, as well as to work for the intellectual and social achievement of society.”

This is an additional serving of Montaigne and an addendum to yesterday’s post regarding the book, How to Live: A Life of Montaigne in One Question and Twenty Attempts at an Answer, by Sarah Bakewell.

A few more passages and thoughts from the book…

On the relevance of Montaigne to our age and time:

Some might question whether there is still any need for an essayist such as Montaigne. Twenty-first-century people, in the developed world, are already individualistic to excess, as well as entwined with one another to a degree beyond the wildest dreams of a sixteenth-century winegrower. His sense of the “I” in all things may seem a case of preaching to the converted, or even feeding drugs to the addicted. But Montaigne offers more than an incitement to self-indulgence. The twenty-first century has everything to gain from a Montaignean sense of life, and, in its most troubled moments so far, it has been sorely in need of a Montaignean politics. It could use his sense of moderation, his love of sociability and courtesy, his suspension of judgment, and his subtle understanding of the psychological mechanisms involved in confrontation and conflict. It needs his conviction that no vision of heaven, no imagined Apocalypse, and no perfectionist fantasy can outweigh the tiniest of selves in the real world.

Bakewell suggests that some of the credit for Montaigne’s ability to be so open to others belongs to his cat (and I’m all for giving credit to insights that come by way of a beloved four-legged):

She was the one who, by wanting to play with Montaigne at an inconvenient moment, reminded him what it was to be alive. They looked at each other, and, just for a moment, he leaped across the gap in order to see himself through her eyes. Out of that moment—and countless others like it—came his whole philosophy.

In Bakewell’s page of Acknowledgments, she describes her unexpected introduction to Montaigne. The final sentence below, the last of her book, is a worthy one:

I first met Montaigne when, some twenty years ago in Budapest, I was so desperate for something to read on a train that I took a chance on a cheap “Essays” translation in a secondhand shop. It was the only English-language book on the shelf; I very much doubted that I would enjoy it. There is no one in particular I can thank for this turn of events: only Fortune, and the Montaignean truth that the best things in life happen when you don’t get what you think you want.

I’m on the lookout for other ways to be with the world since I’ve put myself on a Lenten program of no political reading or discussions. Too bleak. Too close to hopeless. So here’s a bit of advice on “attainable felicity” from Herman Melville, author of what is still, after all these years, our greatest American novel (from a piece by Sean Kelly in the New York Times):

Writing 30 years before Nietzsche, in his great novel “Moby Dick,” the canonical American author [Melville] encourages us to “lower the conceit of attainable felicity”; to find happiness and meaning, in other words, not in some universal religious account of the order of the universe that holds for everyone at all times, but rather in the local and small-scale commitments that animate a life well-lived. The meaning that one finds in a life dedicated to “the wife, the heart, the bed, the table, the saddle, the fire-side, the country,” these are genuine meanings. They are, in other words, completely sufficient to hold off the threat of nihilism, the threat that life will dissolve into a sequence of meaningless events.

By way of a small homage to Melville, Jay Parini offered this paean to the master himself (Parini’s latest historical novel, The Passages of H.M.: A Novel of Herman Melville, was recently released):

I believe Melville had his finger on the American pulse, understood our yearning, our ambivalences, our sense of being cut off from Europe yet somehow wedded to its traditions. Melville understood that Americans are all on a quest, for knowledge, for wealth, for “power” in all its broad expanses. Moby-Dick is our major novel. It is our Odyssey, and Melville our Homer. In “Bartleby the Scrivener,” an incomparable work of art in miniature, we learn all we need to know about the American experience of business and drudgery and obsession. Again and again, Melville holds a mirror up to our souls.

We won’t discuss the theory that the same Mr. Melville may have actually pushed his wife down the stairs…

It is important to have a secret, a premonition of things unknown. It fills life with something impersonal, a numinosum. A man who has never experienced that has missed something important. He must sense that he lives in a world which in some respects is mysterious; that things happen and can be experienced which remain inexplicable; that not everything which happens can be anticipated. The unexpected and the incredible belong in this world. Only then is life whole. For me the world has from the beginning been infinite and ungraspable.

–Carl Jung

For the last few weeks my view of the world has been shifted significantly by reading The Black Swan, by Nassim Nicholas Taleb. Written in 2007 but recently released with updated footnotes, the book has been provoking and inspiring shifts in thinking in a variety of disciplines. It has a horizontality that reminds me of Thomas Kuhn’s The Structure of Scientific Revolutions, that landmark book that appeared in 1962 and introduced the brand-new concepts of paradigms and paradigm shifts to science, history, sociology, psychology et al.

Taleb’s “Black Swan Events” theory is offered up to explain the following:

1) The disproportionate role of high-impact, hard-to-predict, and rare events that are beyond the realm of normal expectations
2) The non-computability of the probability of the consequential rare events using scientific methods (owing to their very nature of small probabilities)
3) The psychological biases that make people individually and collectively blind to uncertainty and unaware of the massive role of the rare event in historical affairs.

It’s a great name. There are no black swans in the Northern Hemisphere, so whiteness was assumed to be an essential quality of swanness. When a Dutch explorer spotted a black one on an expedition to Australia in 1697, that concept had to be restated. It is a simple but useful analogy for how fragile a system of thought actually can be. Our assumptions, whether they result from reason, logic, falsifiability and/or evidence, can be undone in a moment.

From a review of the book by Will Self:

The Black Swans of the title aren’t simply known unknowns; they are unknown unknowns – events, or inventions, or runaway successes, or indeed contingencies of any kind – for which no statistical analysis or inductive reasoning can possibly arm us. They are events like 9/11, or Black Monday, or publishing phenomena like the Harry Potter books, or inventions such as the internet, all of which alter the human world.

And from Taleb himself:

Black Swans being unpredictable, we need to adjust to their existence (rather than naively try to predict them). There are so many things we can do if we focus on antiknowledge, or what we do not know. Among many other benefits, you can set yourself up to collect serendipitous Black Swans (of the positive kind) by maximizing your exposure to them. Indeed, in some domains—such as scientific discovery and venture capital investments—there is a disproportionate payoff from the unknown, since you typically have little to lose and plenty to gain from a rare event…the strategy for the discoverers and entrepreneurs is to rely less on top-down planning and focus on maximum tinkering and recognizing opportunities when they present themselves…The strategy is, then, to tinker as much as possible and try to collect as many Black Swan opportunities as you can.

It is not surprising that a number of venture capitalists have embraced Taleb’s approach as their investment modus operandi. Taleb was a Wall Streeter at one point (don’t hold it against him, although he certainly has no shortage of tonal arrogance), so his examples are primarily in the financial/economic realm. But I read this book as an artist’s manifesto, correlating with another variation on the value of tinkering that came up in the conversation between technologists Kevin Kelly and Steven Johnson (and written about here). As Kelly colloquially put it, “to create something great, you need the means to make a lot of really bad crap.” Or as Johnson phrased it, “You need error to open the door to the adjacent possible.”

So tinker away. Be willing to err, to fail, to “set yourself up to collect serendipitous Black Swans.” And Emily Dickinson’s take on the adjacent possible seems right in line with Taleb, Kelly and Johnson:

To make a prairie it takes a clover and one bee,
One clover, and a bee,
And revery.
The revery alone will do,
If bees are few.


Barnett Newman

I’ve been having a lot of discussions lately about irony, particularly its role in art. Many of these are conversations I have been having with parts of myself, but some of them are with friends and co-travelers. This interest was piqued a few weeks ago when a good friend with an exceptionally developed sense of art and its history offered this comment about my work: “There’s no irony. No appropriation. No erasing of boundaries between high and low. No entertaining riffs and sleights of hand. Which is okay with me, but how do you feel about it?”

How do I feel about it? I’m still pondering that last query, but I am clear that the absence of irony is intentional. When I mentioned this issue to another friend, her response was, “Maybe you should strive to live your life irony-free, like your paintings!” Yet another way to think about it. Irony is a concept so complex and layered that its many permutations can keep the mind occupied for a long time and never come to a final position.

In a review of the Abstract Expressionism in New York show at MoMA that appeared in the Brooklyn Rail, artist and writer Shane McAdams drew some generational distinctions that I found useful, particularly in the context of irony:

I grew up in a generation that would view claims of painting’s or New York’s supremacy as somewhat chauvinistic and confrontational. Our way has been more polite, less opinionated, and more circumspect, opting for the more slippery strategies of relativity and irony to make our points. The tendency has favored not being wrong over being right. If irony is to state one thing and to mean another, our generation has carved an entire worldview out of not actually meaning anything. This is the legacy of Andy Warhol, the high priest of cool detachment. So it’s not such a leap for the children of Warhol to assume that those AbExers were playing fast and loose with meaning as well, when in fact they meant every word they said.

***
While viewing AbExNY, I noticed that at least half the spectators were experiencing the paintings through a camera viewfinder, snapping digital photos, saving the experience for later. The younger the viewer, the less likely they were to engage the work directly. Jackson Pollock mediated through an LCD screen seems an apt metaphor for generational detachment given his determination to dissolve the barriers between him and his painting, the exterior and interior universe: “When I am in my painting, I’m not aware of what I’m doing,” he declared in 1948. We want distance from the world and our consciousness; Pollock didn’t.

A younger generation of artists who want “distance from the world and our consciousness” live in stark contrast to the heady AbExers of the late 1940s. A pendulum swing, or a trend that was just moving through? Who is to say for sure. But I do like McAdams’ move to a larger arc of concern with this closing thought: “When the world looks like it’s falling apart, though, perhaps ironic detachment will begin to look less like an antidote to chauvinism and more like a banal evil, unequipped to fight the pricks of history.”

No answers, and maybe there is no need to look for any. But plenty to ponder.

And thanks to Carl Belz for linking me to McAdams’ review.