You are currently browsing the category archive for the ‘Ideas’ category.
Over the fall months James Elkins, the prodigiously prolific writer on art, art history, criticism and art appreciation, wrote a series of pieces for the Huffington Post. (I wrote about his series here.) One of those articles has a title—Are Artists Bored By Their Work?—that is so provocative you can't not read it.
Elkins starts by addressing a theme kindred to the founding principle of this blog—slow looking:
The philosopher Arthur Danto asked that I not fetishize slow looking. He pointed out that some works of modern art, like Duchamp’s urinal or Warhol’s Brillo Boxes, do not ask to be looked at for hours and hours. A quick look, or even a glance, is right and appropriate. But I’d like to pursue slow looking, and think about it as carefully as I think it deserves. One way to pursue this subject is to ask how long it took the artist to make the work in the first place…It is interesting to be writing about slow looking and slow thinking in a setting like the Huffington Post, where news moves at such a breakneck pace, where you can jump from one post to another as quickly as a click of the mouse. We are all afflicted with a mild attention deficit disorder, and when it comes to images, our flightiness is especially intense. We consume more images per lifetime, per day, and even per minute than any culture before us. Modern paintings often seem to have been made quickly, by comparison with the paintings of earlier centuries, and that seems to give us the license to look at them quickly–to consume them and move on.
Elkins draws a comparison between how much time it takes to make a work of art now and before the modernist era. He postulates that paintings took more time during the Renaissance because of the desire to represent the real world. Capturing a “highly ornamented world” takes more time to draw than the minimalism and single brush stroke styles so common in the postmodern era.
But were the representational artists of the Renaissance just a little bored with the tediousness of their task? Elkins points out that many Renaissance artists painted only the heads and hands. The rest was painted by their tribe of apprentices. Which raises a reasonable question: Were they bored by the task of capturing the rest of nature? “How interested could Titian have been in all those trees?”
Contemporary art making practices are very different. Elkins again:
Modern painting, on the other hand, is said to be potentially interesting throughout, in every mark, at every inch. Frank Auerbach could not possibly have been bored when each brushstroke mattered so much. His best work is exemplary because he risked everything at every moment. Not a single mark is rote, habitual, or routine. Everything is contingent, as the art historian T. J. Clark says, and nothing is settled. Boredom is out of the question. A good work could not possibly be made by a bored artist.
It never occurred to me that it could be otherwise.
But let us not forget that boredom is actually a rather recent invention. It would be easy to treat it as a human trait that has always been with us, but that turns out to not be the case. According to Walter Benjamin, boredom was an invention of the middle classes that started around 1840. Up until then, no one was writing about it, complaining about it, suffering from it. “I don’t doubt that in the Renaissance people often found themselves at a loss about how to spend an afternoon, but no one was vexed by boredom, or in need of continuous distractions. Not a single Renaissance artist left a diary, or a letter, describing the appalling boredom of the long hours spent in the studio,” Elkins writes.
Boredom is now part of what we don’t like about our lives. It also speaks to the easily distracted, ADD-ish culture in which we live. And so many contemporary trends—social, technological, personal, behavioral—are feeding that proclivity. Committed slow looking—that deliberate and disciplined slowing down to really look at something—is just what many of us need to countermand all those distractions that constantly pull us away from center, away from the deeper connection. It may now need to be taught, like learning musical scales and fundamental ball handling skills. Any artist, contemporary or historical, needs and wants that kind of engagement.
A common theme in my postings over the last few weeks has been the very basic question, “how are we to live?” While it is sometimes hard to be objective about the prevalence of a trend when it is a topic you yourself are interested in (I call it the “car buying syndrome”—all of a sudden the type of car you are considering starts showing up everywhere), it does seem to be a topic of increased interest in the culture in general. I referenced several new book titles that address various aspects of these concerns in my earlier post, The River of Knowledge, as well as a few inspired by Sarah Bakewell’s very successful book on Montaigne, How to Live (here and here).
In her review of Examined Lives: From Socrates to Nietzsche by James Miller for the New York Times Book Review, Bakewell states her belief that “philosophy is poorer when it loses sight of the messy lives of those who do the philosophizing.” And certainly her book does a great job of bringing together the events of Montaigne’s life with his philosophical writings. “Montaigne’s idea of philosophy, which he inherited from the Greeks and Romans, was mainly of a practical art for living well,” says Bakewell. “It would have seemed odd to him to spend all day studying philosophy in a university classroom, but then have to go to a bookstore’s self-help department to find a book on how to cope with bereavement or depression.” Bakewell’s answer to the query of how to live? “Let life be its own answer,” she said. “You learn to live mainly by living — and making a lot of mistakes.”
More “how to live” wisdom showed up in James Ryerson’s essay, Thinkers and Dreamers. Posing the question, “can a novelist write philosophically?”, Ryerson quotes novelist and philosopher Iris Murdoch. The two pursuits are contrary, says Murdoch in a BBC interview from 1978. Philosophy uses the analytical mind to solve conceptual problems in an “austere, unselfish, candid” prose, and literature calls upon the imagination to produce something “mysterious, ambiguous, particular” about the world.
Murdoch’s distinction between philosophy and fiction applies to life in general it seems to me. The conscious—and conscientious—deployment of our analytical and imaginal skills is an ongoing balancing act. In my case the “mysterious, ambiguous and particular” is where I spend most of my time. For someone else, it may be the reverse. In spite of my own proclivities, I want to be competently bilingual. And as Bakewell suggests, you learn how to do that by living your life. And by making lots of mistakes.
One of my favorite spots on the web is the annual World Question* presented by The Edge. Each year a provocative question is posed, then answers flow in from every profession and point of view. It is a fascinating cross section of thinking, perspectives and insights.
The question being asked for 2011 is:
What scientific concept would improve everybody’s cognitive toolkit?
The answers posted are rich, varied and unexpected. And there is very little overlap. If you are compelled by ideas, reading through them all will be a terrific adventure. Here are a few excerpts that stood out for me:
Linda Stone, former executive at Apple and Microsoft:
Projective thinking is a term coined by Edward de Bono to describe generative rather than reactive thinking…

Articulate, intelligent individuals can skillfully construct a convincing case to argue almost any point of view. This critical, reactive use of intelligence narrows our vision. In contrast, projective thinking is expansive, “open-ended” and speculative, requiring the thinker to create the context, concepts, and the objectives…

When we cling rigidly to our constructs…we can be blinded to what’s right in front of us.
Kevin Kelly, author of What Technology Wants:
The Virtues of Negative Results
We can learn nearly as much from an experiment that does not work as from one that does. Failure is not something to be avoided but rather something to be cultivated. That’s a lesson from science that benefits not only laboratory research, but design, sport, engineering, art, entrepreneurship, and even daily life itself. All creative avenues yield the maximum when failures are embraced.
Alison Gopnik, author of Philosophical Baby:
The Rational Unconscious
One of the greatest scientific insights of the twentieth century was that most psychological processes are not conscious. But the “unconscious” that made it into the popular imagination was Freud’s irrational unconscious — the unconscious as a roiling, passionate id, barely held in check by conscious reason and reflection. This picture is still widespread even though Freud has been largely discredited scientifically.
The “unconscious” that has actually led to the greatest scientific and technological advances might be called Turing’s rational unconscious…The greatest advantage of understanding the rational unconscious would be to demonstrate that rational discovery isn’t a specialized abstruse privilege of the few we call “scientists”, but is instead the evolutionary birthright of us all. Really tapping into our inner vision and inner child might not make us happier or more well-adjusted, but it might make us appreciate just how smart we really are.
Richard Foreman, playwright:
Negative Capability Is A Profound Therapy
Mistakes, errors, false starts — accept them all. The basis of creativity.
My reference point (as a playwright, not a scientist) was Keats’s notion of negative capability (from his letters). Being able to exist with lucidity and calm amidst uncertainty, mystery and doubt, without “irritable (and always premature) reaching out” after fact and reason.
This toolkit notion of negative capability is a profound therapy for all manner of ills — intellectual, psychological, spiritual and political. I reflect it (amplify it) with Emerson’s notion that “Art (any intellectual activity?) is (best thought of as but) the path of the creator to his work.”
Robert Sapolsky, neuroscientist:
The Lure Of A Good Story
Various concepts come to mind for inclusion in that cognitive toolkit. “Emergence,” or related to that, “the failure of reductionism” — mistrust the idea that if you want to understand a complex phenomenon, the only tool of science to use is to break it into its component parts, study them individually in isolation, and then glue the itty-bitty little pieces back together. This often doesn’t work and, increasingly, it seems like it doesn’t work for the most interesting and important phenomena out there.
Christine Finn, archaeologist:
Absence and Evidence
I first heard the words “absence of evidence is not evidence of absence” as a first-year archaeology undergraduate. I now know it was part of Carl Sagan’s retort against evidence from ignorance, but at the time the non-ascribed quote was part of the intellectual toolkit offered by my professor to help us make sense of the process of excavation…Recognising the evidence of absence is not about forcing a shape on the intangible, but acknowledging a potency in the not-thereness.
John McWhorter, author of That Being Said:
In an ideal world all people would spontaneously understand that what political scientists call path dependence explains much more of how the world works than is apparent. Path dependence refers to the fact that often, something that seems normal or inevitable today began with a choice that made sense at a particular time in the past, but survived despite the eclipse of the justification for that choice, because once established, external factors discouraged going into reverse to try other alternatives.
The paradigm example is the seemingly illogical arrangement of letters on typewriter keyboards…Most of life looks path dependent to me. If I could create a national educational curriculum from scratch, I would include the concept as one taught to young people as early as possible.
Scott D. Sampson, author of Dinosaur Odyssey: Fossil Threads in the Web of Life:
Humanity’s cognitive toolkit would greatly benefit from adoption of “interbeing,” a concept that comes from Vietnamese Buddhist monk Thich Nhat Hanh. In his words:
“If you are a poet, you will see clearly that there is a cloud floating in [a] sheet of paper. Without a cloud, there will be no rain; without rain, the trees cannot grow; and without trees, we cannot make paper. The cloud is essential for the paper to exist. If the cloud is not here, the sheet of paper cannot be here either . . . “Interbeing” is a word that is not in the dictionary yet, but if we combine the prefix “inter-” with the verb “to be,” we have a new verb, inter-be. Without a cloud, we cannot have a paper, so we can say that the cloud and the sheet of paper inter-are. . . . “To be” is to inter-be. You cannot just be by yourself alone. You have to inter-be with every other thing. This sheet of paper is, because everything else is.”
Depending on your perspective, the above passage may sound like profound wisdom or New Age mumbo-jumbo. I would like to propose that interbeing is a robust scientific fact — at least insomuch as such things exist — and, further, that this concept is exceptionally critical and timely.
Marco Iacoboni, author of Mirroring People:
Entanglement is “spooky action at a distance”, as Einstein liked to say (he actually did not like it at all, but at some point he had to admit that it exists). In quantum physics, two particles are entangled when a change in one particle is immediately associated with a change in the other particle. Here comes the spooky part: we can separate our “entangled buddies” as far apart as we like, and they will still remain entangled. A change in one of them is instantly reflected in the other one, even though they are physically far apart (and I mean different countries!)
Entanglement feels like magic. It is really difficult to wrap our heads around it. And yet, entanglement is a real phenomenon, measurable and reproducible in the lab. And there is more. While for many years entanglement was thought to be a very delicate phenomenon, only observable in the infinitesimally small world of quantum physics (“oh good, our world is immune from that weird stuff”) and quite volatile, recent evidence suggests that entanglement may be much more robust and even much more widespread than we initially thought. Photosynthesis may happen through entanglement, and recent brain data suggest that entanglement may play a role in coherent electrical activity of distant groups of neurons in the brain.
Entanglement is a good cognitive chunk because it challenges our cognitive intuitions. Our minds seem built to prefer relatively mechanic cause-and-effect stories as explanations of natural phenomena. And when we can’t come up with one of those stories, then we tend to resort to irrational thinking, the kind of magic we feel when we think about entanglement. Entangled particles teach us that our beliefs of how the world works can seriously interfere with our understanding of it. But they also teach us that if we stick with the principles of good scientific practice, of observing, measuring, and then reproducing phenomena that we can frame in a theory (or that are predicted by a scientific theory), we can make sense of things. Even very weird things like entanglement.
Barry Smith, writer and presenter, BBC World Service series “The Mysteries of the Brain”:
The Senses and the Multi-Sensory
For far too long we have laboured under a faulty conception of the senses. Ask anyone you know how many senses we have and they will probably say five; unless they start talking to you about a sixth sense. But why pick five? What of the sense of balance provided by the vestibular system, telling you whether you are going up or down in a lift, forwards or backwards on a train, or side to side on a boat? What about proprioception that gives you a firm sense of where your limbs are when you close your eyes? What about feeling pain, hot and cold? Are these just part of touch, like feeling velvet or silk? And why think of sensory experiences like seeing, hearing, tasting, touching and smelling as being produced by a single sense?
Contemporary neuroscientists have postulated two visual systems — one responsible for how things look to us, the other for controlling action — that operate independently of one another. The eye may fall for visual illusions but the hand does not, reaching smoothly for a shape that looks larger than it is to the observer.
And it doesn’t stop here. There is good reason to think that we have two senses of smell: an external sense of smell, orthonasal olfaction, produced by inhaling, that enables us to detect things in the environment such as food, predators or smoke; and an internal sense, retronasal olfaction, produced by exhaling, that enables us to detect the quality of what we have just eaten, allowing us to decide whether we want any more or should expel it.
Neri Oxman, architect:
It Ain’t Necessarily So
Preceding the scientific method is a way of being in the world that defies the concept of a solid, immutable reality. Challenging this apparent reality in a scientific manner can potentially unveil a revolutionary shift in its representation and thus recreate reality itself. Such suspension of belief implies the temporary forfeiting of some explanatory power of old concepts and the adoption of a new set of assumptions in their place.
Reality is the state of things as they actually exist, rather than the state in which they may appear or be thought to be — a rather ambiguous definition given our known limits to observation and comprehension of concepts and methods. This ambiguity, captured by the aphorism that things are not what they seem, and again with swing in Sportin’ Life’s song It Ain’t Necessarily So, is a thread that seems to consistently appear throughout the history of science and the evolution of the natural world. In fact, ideas that have challenged accepted doctrines and created new realities have prevailed in fields ranging from warfare to flight technology, from physics to medicinal discoveries…
It Ain’t Necessarily So is a drug dealer’s attempt to challenge the gospel of religion by expressing doubts in the Bible: the song is indeed immortal, but Sportin’ himself does not surpass doubt. In science, Sportin’s attitude is an essential first step forward but it ain’t sufficiently so. It is a step that must be followed by scientific concepts and methods. Still, it is worth remembering to take your Gospel with a grain of salt because, sometimes, it ain’t nessa, ain’t nessa, it ain’t necessarily so.
*The World Question Center is part of Edge Foundation, Inc., an organization with a mandate to “promote inquiry into and discussion of intellectual, philosophical, artistic, and literary issues, as well as to work for the intellectual and social achievement of society.”
A visual/verbal commentary on a few days in New York City, where spring has come and spread its gorgeousness everywhere.
First on the list: The High Line, my favorite urban touchstone for seasonal drift. Two views looking south from 20th Street—two months ago and this weekend:
And of course keeping it all in perspective—here’s what we have to look forward to in the next iteration:
A few more visual remembrances:
A special message for all my young friends out there: Beware of contempt about growing old. It may be harmful to your health later on.
As reported by Kay Lazar in the Boston Globe, what you think about aging while you are still young can impact how it happens to you when it does:
When you think about aging, what words and images come to mind? Wrinkled, forgetful, maybe feeble?
You might want to rethink those, and try spry, wise, and distinguished, because our negative perceptions of our elders may have adverse effects on our own long-term health, according to a growing body of research.
Scientists are increasingly linking negative stereotypes about older adults to a number of health problems, from memory impairments to increased risk of heart disease and even a shortened life span. With elders often portrayed as the dentures, wrinkle cream, and incontinence segment of our youth-obsessed society, negative messages about aging can be pernicious and long-lasting, specialists say.
So I have to ask: Is there a way to retrofit these more positive attitudes? I too was once a 20-year-old who was breezily dismissive of older people, contemptuous and ignorant of what it means to age. So as a preventative, repeat after me: Spry! Wise! Distinguished! Again…
The two entries below, a poem by Moramarco and a quote by Tom Robbins, were included in two separate posts on my favorite random access wisdom source, Whiskey River. But when I landed on the site this morning they happened to share the screen together. Intentional or not, these two are natural bedfellows.
For anyone who is a maker and pulls things into existence from who knows where, the states of mind described in both of these entries should sound familiar. They also perfectly mirror the questions my friend and former Yale art prof Susana would pose to any student who asked her if they should pursue a career in fine arts.
Hers is still the best litmus test I know, and it came in the form of two questions:
1. Can you imagine living your entire life in uncertainty? Of never knowing if your work is any good, of never really being able to get meaningful feedback from anyone else since you and you alone have to be your own measure of success or failure?
2. Can you imagine living your life knowing that at any point in time you could read something or see something that would force you to abandon everything you thought you believed?
On being willing to see the mental forms that inhabit the mind as migratory and transient, Moramarco’s poem is a direct hit. And as for failure, Robbins’ advice should be ensconced on every artist’s studio wall and read daily.
One Hundred and Eighty Degrees
Have you considered the possibility
that everything you believe is wrong,
not merely off a bit, but totally wrong,
nothing like things as they really are?
If you’ve done this, you know how durably fragile
those phantoms we hold in our heads are,
those wisps of thought that people die and kill for,
betray lovers for, give up lifelong friendships for.
If you’ve not done this, you probably don’t understand this poem,
or think it’s not even a poem, but a bit of opaque nonsense,
occupying too much of your day’s time,
so you probably should stop reading it here, now.
But if you’ve arrived at this line,
maybe, just maybe, you’re open to that possibility,
the possibility of being absolutely completely wrong,
about everything that matters.
How different the world seems then:
everyone who was your enemy is your friend,
everything you hated, you now love,
and everything you love slips through your fingers like sand.
So you think that you’re a failure, do you? Well, you probably are. What’s wrong with that? In the first place, if you’ve any sense at all you must have learned by now that we pay just as dearly for our triumphs as we do for our defeats. Go ahead and fail. But fail with wit, fail with grace, fail with style. A mediocre failure is as insufferable as a mediocre success. Embrace failure! Seek it out. Learn to love it. That may be the only way any of us will ever be free.
The essay on the last page of the Sunday Times Book Review by Jennifer Schuessler this week is provocative. Her topic: Boredom. Ah, that dreaded word. Full of moral implications. Antithetical to everything I learned (and probably inherited through epigenetics) from my pioneer heritage. You never let yourself get bored, and you never admit it if for some reason you do. NEVER.
As Schuessler points out, “As a general state of mind, boredom is morally suspect, threatening to shine its dull light back on the person who invokes it. ‘The only horrible thing in the world is ennui,’ Oscar Wilde once wrote, suggesting that boredom doesn’t feel much better in French. ‘That is the one sin for which there is no forgiveness.'”
In fact, says Schuessler, boredom has some benefits. Brain research suggests that when we are awake but not doing anything in particular, our central processors are churning along, particularly in those parts responsible for memory, empathy with others and imagining hypothetical events—in other words, many of the skills needed for a successful literary experience. Hmmm. Something to consider.
Schuessler makes the discussion lively:
It’s common to decry our collective thaasophobia, or fear of boredom, manifested in our addiction to iPhone apps, the cable news crawl and ever mutating varieties of multitasking. One cellphone company has even promoted the idea of “microboredom,” which refers to those moments of inactivity that occur when we’re, say, stuck waiting in line for a latte without our BlackBerry. But novelists, for all their own fears of being dismissed as boring, continue to offer some bold resistance to the broader culture’s zero-tolerance boredom eradication program.
Bringing the discussion around to books, Schuessler highlights the posthumous publication of David Foster Wallace’s unfinished manuscript, The Pale King:
In April 2011, the limits of literary boredom will be tested when Little, Brown & Company publishes “The Pale King,” David Foster Wallace’s novel, found unfinished after his suicide in 2008, about the inner lives of number-crunching I.R.S. agents. An excerpt that appeared last year in The New Yorker depicts a universe of microboredom gone macro…For all the mundanity of its subject matter, the excerpt presents boredom as something more strenuous and exalted than the friendly helper depicted by the neuroscientists, keeping our minds revved up even when we think we’re idling. Boredom isn’t just good for your brain. It’s good for your soul. “Bliss — a second-by-second joy and gratitude at the gift of being alive, conscious — lies on the other side of crushing, crushing boredom,” Wallace wrote in a note left with the manuscript. “Pay close attention to the most tedious thing you can find (Tax Returns, Televised Golf) and, in waves, a boredom like you’ve never known will wash over you and just about kill you. Ride these out, and it’s like stepping from black and white into color. Like water after days in the desert. Instant bliss in every atom.”
Boredom and bliss. Who knew?
A few remembrances from the inimitable John Cage:
“The sound experience I prefer to all others is silence,” he says in this short clip on YouTube. And for most of us on the planet, says Cage, the sound of silence is actually traffic. He rhapsodizes that the sound of traffic is constantly modulating and cannot be predicted. “I don’t need sound to talk to me,” he says simply.
My favorite vignette about Cage has always been the one that I heard during a Laurie Anderson performance. Asked to interview him for the Buddhist magazine Tricycle, Laurie was intent upon asking him a really difficult question: Is life getting better or is it getting worse?
When she finally did pose this query to Cage, he looked at her intently and then answered in a very measured fashion:
“Well of course it is getting better, Laurie. It’s just that it is happening so slowly.”
Since posting the quote from the Roiphe review of David Rieff’s memoir of his mother Susan Sontag, Swimming in a Sea of Death, I have been more conscious of the ambient energy that continues to emanate from Sontag’s thoughts and writings. Here’s a sampling:
Even in modern times, when most artists and critics have discarded the theory of art as representation of an outer reality in favor of the theory of art as subjective expression, the main feature of the mimetic theory persists. Whether we conceive of the work of art on the model of a picture (art as a picture of reality) or on the model of a statement (art as the statement of the artist), content still comes first. The content may have changed. It may now be less figurative, less lucidly realistic. But it is still assumed that a work of art is its content. Or, as it’s usually put today, that a work of art by definition says something. (“What X is saying is . . . ,” “What X is trying to say is . . .,” “What X said is . . .” etc., etc.)
Today is such a time, when the project of interpretation is largely reactionary, stifling. Like the fumes of the automobile and of heavy industry which befoul the urban atmosphere, the effusion of interpretations of art today poisons our sensibilities. In a culture whose already classical dilemma is the hypertrophy of the intellect at the expense of energy and sensual capability, interpretation is the revenge of the intellect upon art.
Even more. It is the revenge of the intellect upon the world. To interpret is to impoverish, to deplete the world – in order to set up a shadow world of “meanings.” It is to turn the world into this world. (“This world”! As if there were any other.)
The world, our world, is depleted, impoverished enough. Away with all duplicates of it, until we again experience more immediately what we have.
Of course, I don’t mean interpretation in the broadest sense, the sense in which Nietzsche (rightly) says, “There are no facts, only interpretations.” By interpretation, I mean here a conscious act of the mind which illustrates a certain code, certain “rules” of interpretation.
Directed to art, interpretation means plucking a set of elements (the X, the Y, the Z, and so forth) from the whole work. The task of interpretation is virtually one of translation. The interpreter says, Look, don’t you see that X is really – or, really means – A? That Y is really B? That Z is really C?
All quotes from the essay, Against Interpretation, by Susan Sontag.
Susan Sontag has been a lifelong beacon for me. Brilliant, articulate, quixotic, complicated, relentless, tenacious, long-suffering, wise—her work and her life have informed so many of my views.
In a New York Times review of Sontag’s son David Rieff’s book, Swimming in a Sea of Death, Katie Roiphe captured a quicksilver and bittersweet vision of Sontag in her last days:
Of course, Sontag’s belief in her exceptionality had a history. In her first bout with breast cancer in her early 40s, she survived. In early interviews after her recovery, she seemed intoxicated by her brush with death. She claimed she had acquired a “fierce intensity” that she would bring to her work; and she incorporated the idea of radical illness into the drama of her intellect, the dark glamour of her writer’s pose. Sontag had written in her diary during her treatment that she needed to learn “how to turn it into a liberation.” And it was that determination, that stubbornness, that constant act of self-transcendence that she thought she could reproduce at 71, when cancer was diagnosed for a third time. But this time it didn’t work. “She had the death that somewhere she must have come to believe that other people had from cancer,” Rieff writes, “the death where knowledge meant nothing, the will to fight meant nothing, the skill of the doctors meant nothing.”