You are currently browsing the category archive for the ‘Against the grain’ category.
The line between being a complainer (who wants that reputation?) and being a precise observer can sometimes be a fine one. I may be grazing close to the edge of grousing by sharing excerpts from two articles by art critic Karen Wright of The Independent. But they are worthy of note, and of discussion.
The first is from Wright’s review of the PST mega show in LA last fall (which I wrote about extensively here last November):
The artist John Baldessari is grumpy, or perhaps just tired. He has been dealing with the press, having received massive attention recently as the most included artist (in 11 shows) in the multi-show extravaganza known as Pacific Standard Time: Art in LA, 1945-1980…
When I asked his opinion of the show he said it was “BM” – “before money” – and that, in fact, all art in LA in Pacific Standard Time, and particularly at MoCA, could be defined this way. “BM” – that is, before artists had money. I entered the cavernous space with his words ringing in my ears. The last time I was here I saw a Takashi Murakami show, and the contrasts between Murakami’s work and Under the Big Black Sun: California Art 1974-1981 could not be more apparent.
Murakami’s mirror-like surfaces speak of money and of the factory. The shimmering surfaces are carefully polished, to remove any trace of the artist or indeed his many assistants’ hands. Tonight, these have given way to the simple objects and hand-worked surfaces of a group of artists, many of whom were deeply engaged with political or gender themes. We are talking about the height of feminism and race issues and the end of the Vietnam War, after all.
The second is from Wright’s short account of her studio visit with the painter Jock McFadyen:
Jock McFadyen’s East End studio is infused with the heady perfume of paint and turps. Painting, now seemingly the least fashionable of arts, is literally getting up my nose here. When I ask McFadyen if he minds practising the art form seemingly not at the forefront of chic curating, his defence is instantaneous and robust: “The great thing about painting is that it’s not fashionable.”
I ask if he always wanted to be an artist, and his response illuminates the current divide in art. “I don’t want to be an artist. I want to be a painter. The man in the street might think you make art out of dirt and string. It is embarrassing to be an artist.”
I’m with you, Jock.
Sharon Begley writes a column at Newsweek and can mince through a problem about as fast as anybody these days. Her mind is sharp, agile and very cool.
The following excerpt is from her column in the February 14 issue and deals with the general failure of prognostication. I have been miffed (you too?) by how piss-poorly the experts performed in anticipating our current malaise. Begley’s article gave me a framework with which to approach this problem.
While her piece deals with political pundits, Begley’s take could be applied to many other areas as well. Berlin’s oft-used analogy of hedgehogs and foxes continues to have a long life. I still find it an entertaining thinking tool.
Pointing out how often pundits’ predictions are not only wrong but egregiously wrong—a 36,000 Dow! euphoric Iraqis welcoming American soldiers with flowers!—is like shooting fish in a barrel, except in this case the fish refuse to die. No matter how often they miss the mark, pundits just won’t shut up…But while we can’t shut pundits up, we can identify those more likely to have an accurate crystal ball when it comes to forecasts from the effect of the stimulus bill to the likelihood of civil unrest in China. Knowing who’s likely to be right comes down to something psychologists call cognitive style, and with that in mind Philip Tetlock, a research psychologist at Stanford University, would like to introduce you to foxes and hedgehogs.
At first, Tetlock’s ongoing study of 82,361 predictions by 284 pundits (most but not all of them American) came up empty. He initially looked at whether accuracy was related to having a Ph.D., being an economist or political scientist rather than a blowhard journalist, having policy experience or access to classified information, or being a realist or neocon, liberal or conservative. The answers were no on all counts. The best predictor, in a backward sort of way, was fame: the more feted by the media, the worse a pundit’s accuracy. And therein lay Tetlock’s first clue. The media’s preferred pundits are forceful, confident and decisive, not tentative and balanced. They are, in short, hedgehogs, not foxes.
That bestiary comes from the political philosopher Isaiah Berlin, who in 1953 argued that hedgehogs “know one big thing.” They apply that one thing (for instance, that ethnicity and language are primal; ergo, any country that contains many ethnic groups will break up) everywhere, express supreme confidence in their forecasts, dismiss opposing views and are drawn to top-down arguments deduced from that Big Idea. Foxes, in contrast, “know many things,” as Berlin put it. They consider competing views, make bottom-up inductive arguments from an array of facts and doubt the power of Big Ideas. “The hedgehog-fox dimension did what none of the other traits did,” says Tetlock…
In short, what experts think matters far less than how they think, or their cognitive style. At one extreme, hedgehogs seek certainty and closure, dismiss information that undercuts their preconceptions and embrace evidence that reinforces them, in what is called “belief defense and bolstering.” At the other extreme, foxes are cognitively flexible, modest and open to self-criticism. White House economics czar Larry Summers is seldom accused of having a modest personality, but he displays the fox’s cognitive style: in briefing the president, he assigns numerical probabilities to possible outcomes of economic policies, rather than saying, “This will [or will not] happen.” Similarly, Yale economist Robert Shiller, who forecast the bursting of both the tech bubble in 2000 and the housing bubble in 2006, deploys a flexible cognitive style that works from the data up and not from one Big Idea down. Here’s how to identify fauna: foxes pepper their speech and writing with “however” and “but,” recognizing uncertainty in the face of competing forces. Hedgehogs suffer from no such doubts, which (combined with their adherence to a Big Idea) makes them especially prone to overpredict change: the House of Saud will fall, the European Monetary Union will collapse, Canada will disintegrate like Yugoslavia—in the last case, from the primal force of ethnicity. Leftist hedgehogs, applying the Big Idea that those who oppose dictators are virtuous, failed to foresee the fierce repressiveness of Iran’s 1979 revolution, which overthrew the shah; applying the Big Idea that involvement in regional war = quagmire, they predicted that the first Gulf war would last 20 years and claim 50,000 American lives. Oops.
The media, of course, eat this up. Bold, decisive assertions make better sound bites; bombast, swagger and certainty make for better TV. As a result, the marketplace of ideas does not punish poor punditry. Few of us even remember who got what wrong. We are instead impressed by credentials, affiliation, fame and even looks—traits that have no bearing on a pundit’s accuracy.
The truly bad news for forecasters, however, is that although foxes beat hedgehogs, math often beats all but the best foxes. If there are three possibilities (say, that China will experience more, less or the same amount of civil unrest), throwing darts at targets representing each one produces a forecast more accurate than most pundits’. Simply extrapolating from recent data on, say, economic output does even better. But booking statistical models on talk shows probably wouldn’t help their ratings.
I found an article in The Independent yesterday that I posted on my filter blog Slow Painting. It has dominated my thinking all day. In a singularly succinct manner, it captures a core set of issues that are at the center of my disaffection with a number of trends in contemporary art. These are some of the same concerns that drove me to start blogging two years ago.
Two imperatives are identified as de rigueur in the high-profile world of contemporary art:
Rule 1) Justification by meaning: the worth and interest of a work resides in what it’s about.
Rule 2) Absolute freedom of interpretation: a work is “about” anything that can, at a pinch, be said about it.
The article goes on to elaborate this conundrum:
In short, meanings are arbitrary, but compulsory. And this double bind holds almost universal sway. Whenever you learn that a work explores or investigates or raises questions about something, that it’s concerned with issues around this or notions of that or debates about the other, you know you’re in its grip.
It’s weird how people can’t resist. If you want to make art sound serious, this is simply the way you do it. Read any gallery wall-caption or leaflet or catalogue, and see how long it is before the writer commends the work solely on the basis of what it’s about. And then note how it isn’t really about that at all.
Meaning comes first – even before the work itself…
That’s the problem with these meanings. They’re not just highly tenuous. They’re depressingly limiting. And we should put them aside. We should stop measuring art by its meaningfulness. We should heed the wise words of Susan Sontag, written almost 50 years ago in her essay “Against Interpretation”.
“Our task is not to find the maximum amount of content in a work of art, much less to squeeze more content out of the work than is already there. Our task is to cut back on content so that we can see the thing at all. The aim of all commentary on art now should be to make works of art – and, by analogy, our own experience – more, rather than less, real to us.”
This runs in a similar vein to much of what Lawrence Weschler has explored in my still current favorite book, Seeing is Forgetting the Name of the Thing One Sees. What Irwin keeps moving in and out of in the interviews included in the book is related to Sontag’s issue of cutting back on content and getting the viewer closer to what is “real.”
I’ve referenced Irwin’s well-known response to a Philip Guston painting in an earlier posting here, but it is particularly pertinent to this discussion. He describes going to a gallery and seeing a small Guston hanging next to a large James Brooks. The Brooks painting was big in every way—large shapes, with strong color. The Guston, an early piece, was small, painted in the subtle and signature muted pinks, greys and greens. But in Irwin’s eyes, it outstripped the Brooks completely.
In Irwin’s words:
My discovery was that from one hundred yards away…I looked over, and that goddamn Guston…Now, I’m talking not on quality, and not on any assumption of what you like or don’t like, but on just pure strength, which was one of the things we were into. Strength was a big word in abstract expressionism; you were trying to get power into the painting, so that the painting really vibrated, had life to it. It wasn’t just colored shapes sitting flat. It had to do with getting a real tension going in the thing, something that made the thing really stand up and hum…Well, that goddamn Guston just blew the Brooks right off the wall…
Not on quality, just on power…some people call it “the inner life of the painting,” all that romantic stuff, and I guess that’s a way of talking about it. But shapes on a painting are just shapes on a canvas unless they start acting on each other and really, in a sense, multiplying. A good painting has a gathering, interactive build-up in it. It’s a psychic build-up, but it’s also a pure energy build-up. And the good artists knew it, too. That’s what a good Vermeer has, or a raku cup, or a Stonehenge. And when they’ve got it, they just jump off the goddamn wall. They just, bam!
What Irwin keeps getting at—that power of the painting itself—lives outside the domain of applied and obligatory meaning. It’s Irwin’s memorable phrase that I referenced in an earlier posting—phenomenal presence. As Weschler writes in describing Irwin’s line canvases:
They only work immediately; they command an incredible presence—“a rich floating sense of energy,” as Irwin describes it—but only to one who is in fact present. Back at home, you may remember what it felt like to stand before the painting, the texture of the meditative stance it put you in, but the canvas itself, its image in your mind, will be evanescent. That is why for many years Irwin declined to allow his work to be photographed, because the image of the canvas was precisely what the painting was not about.
This is the deep furrow I want to plow. The contemporary concerns for obligatory meaning and languaged legitimacy melt away for me in the face of full-bodied power. Overlay and artifice? Enough already.
I just returned from a few days in New York City. I only did about half of what I had intended. When it is over 100 degrees, the walkability of that city drops into negative numbers. Is it just me or do mental functions slow down for all humans in that kind of heat?
And speaking of mental functions, there is a great article in the latest Atlantic, “Is Google Making Us Stupid?” by smart guy Nicholas Carr (who rocked the IT smart set in 2004 with his book Does IT Matter?, setting off a worldwide debate about the role of computers in business, a topic still being argued today).
He’s a facile writer with a clear-thinking mind. And I am particularly impressed that Carr chose to quote one of my favorite playwrights (and in the view of some, a way way out kind of guy), Richard Foreman, who captured his ideal in a beautiful turn of phrase: “the complex, dense and ‘cathedral-like’ structure of the highly educated and articulate personality.” As Foreman posits, we need the “inner repertory of dense cultural inheritance” or we turn into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.” It is a visceral and apt metaphor.
Much of what Carr says could be viewed as slightly tilted toward the Luddite. But at this point, being a Luddite might be the most progressive position any of us can take. In this world everything is being skewed: up is now down, in is now out. Being a Luddite in this manner might be similar to the artist who eschews contemporary rhetoric and au courant posturing, choosing instead to be silent, which may be the most powerfully subversive position of all.
I’m only excerpting a few highlights below, so go to the link above if you are interested in getting the entire experience. And given his arguments, you might be guilted into taking the full read…
Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I’ve got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets—reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they’re sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I’m not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. “I was a lit major in college, and used to be [a] voracious book reader,” he wrote. “What happened?” He speculates on the answer: “What if I do all my reading on the web not so much because the way I read has changed, i.e. I’m just seeking convenience, but because the way I THINK has changed?”…
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it’s a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. “We are not only what we read,” says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. “We are how we read.” Wolf worries that the style of reading promoted by the Net, a style that puts “efficiency” and “immediacy” above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become “mere decoders of information.” Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Reading, explains Wolf, is not an instinctive skill for human beings. It’s not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works…
Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that’s been written about the Net, there’s been little consideration of how, exactly, it’s reprogramming us. The Net’s intellectual ethic remains obscure…the assumption that we’d all “be better off” if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google’s world, the world we enter when we go online, there’s little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network’s reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It’s in their economic interest to drive us to distraction…
Then again, the Net isn’t the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author’s words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.
If we lose those quiet spaces, or fill them up with “content,” we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what’s at stake:
“I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and ‘cathedral-like’ structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the ‘instantly available.'”
As we are drained of our “inner repertory of dense cultural inheritance,” Foreman concluded, we risk turning into “‘pancake people’—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button.”
I’m haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer’s emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—“I can feel it. I can feel it. I’m afraid”—and its final reversion to what can only be called a state of innocence. HAL’s outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they’re following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That’s the essence of Kubrick’s dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.
Has it happened? Are there more blogs now than people on the planet? The uncontrollable sprawl of online scribblers has led to a lot of pondering in the media lately, with cultural critics ready to unpack and dissect the implications of this curious new form of expression and interconnection.
I have intentionally kept clear of this increasingly overexposed dissection of blogs, bloggers, blogging, the blogosphere, the battle for airtime and audience grab. It isn’t because I feel untouched by these issues; that isn’t the case. I’m a blogger like a gazillion other people. But it wasn’t until I read the New York Times magazine cover article on Sunday by Emily Gould that I realized just how much I was chafing against the increasing meaninglessness of the term “blogger.”
If you didn’t read Gould’s article, it was a tell-all confession of a highly charged, high-profile case of “he said/she said,” one that can happen when you live your life out loud, online, without much in the way of editing. Gould began as a blogger who openly shared the details of her relationships and personal life, was hired as an editor at the now infamous website Gawker, pissed off a lot of people, particularly when she defended the ethos of Gawker’s celebrity stalking, lost her job, became a target just as she had targeted others, and is now reconsidering just what it all meant. Gould is 24 years old, which explains a lot. Tact and temperance were not qualities I had honed when I was her age either.
But Gould’s confessional mea culpa—with a twist (there’s always a twist)—has been bouncing in my head for days. Her compulsive need to “overshare” (her term) is a feature of her personality she says, and even though she would like to search and destroy many of her earlier and unwise postings, she seems committed to continue her maturation process online, in full view of the public. Reading her New York Times account has inspired me to articulate my own reasons for writing and for making the determinations about what I share and what I do not.
I have a few favorite bloggers who are regular self-scrutinizers. D at Joe Felso: Ruminations recently wrote one of his ever thoughtful postings on his own blogging oeuvre, including some ideas about where he would like to take his site. Another favorite, G, who currently captains the excellent Writer Reading, taxonomized the categories of bloggers on one of her previous blog incarnations. (I particularly liked the label “Sheherazadists” for bloggers like G–yes, another G name–at How to Survive Suburban Life who use the blogging form to write about their life story in a series of vignette postings.) C at Mariachristina has written about the constraints of writing without the cover of an alias or avatar. She has had to truncate her observations and expressions in order to respect the privacy of her family and friends. The analytical and intellectually probing J at little essays often asks out loud what her blog should and could be, particularly during a time when she is pressured with pursuing an advanced degree in art history and expecting her second child.
I am not of the Gould mold. If anything, I am an undersharer. The oft-evoked distinction Stevens makes in “Thirteen Ways of Looking at a Blackbird” between inflection and innuendo has resonance for me. I want to be subjective, to a point. Idea-driven, to a point. Personal, to a point.
I am not a journalist, a confessionalist, a memoirist, or a dialectician. The closest analog I can find to describe my aspirations for this blog is my aspirations for my paintings: evocative, but not manipulative. Suggestive, but not formulaic. Mysterious, but not self-conscious. Memorable and yet personal, sized for a human being.
One of my favorite descriptions of an artist is from Donald Winnicott and seems apropos for blogging as well:
“Artists are continually torn between the urgent need to communicate, and the still more urgent need not to be found.”
Gould’s blogging style of full disclosure is probably more in keeping with an increasingly confessional, privacy-blind culture. I for one am in search of something more. Or perhaps something less.
An unforgettable exhibit at the San Francisco Museum of Modern Art: Photos of Silicon Valley by Milan-based architect and photographer Gabriele Basilico.
Having grown up in the Bay Area, I remember well when the Valley was mostly apricot orchards and vegetable farms. But Basilico’s images do not sentimentalize the past or assault the viewer with a harsh, urban, edgy vision. These photographs are quiet, almost apocalyptically silent, and most of them capture a people-free version of a region that has become notoriously overpopulated, overtrafficked and drenched in a smug layer of “we’re just a little smarter (and richer) than everyone else” self-satisfaction.
That isn’t what captures Basilico’s eye, however. Instead he discovers what urban theorist Manuel Castells calls the “space of flows.” As described by Jeff Byles in Modern Painters magazine:
That’s what you see beyond the galvanized steel guardrails. That is the informational city, a land of virtual networks ever more severed from their social context…Check out Basilico’s view of US Highway 101 gashing through the flat valley in ominous shades of black and white, a vast parking lot to the left, an empty field to the right. Transmission wires arc low across the sky and trail into the distance. This is the space of flows. On the horizon sit carceral towers, the seeming prison houses of software engineers and product managers. Latent in the image are layers of spatial data: vestigial scraps of nature; the low, defining hills; cars streaking along the highway, their own vectors in the landscape.
Byles goes on to draw specifically from the writings of Castells:
This is where the social meaning of place evaporates… “There is no tangible oppression,” Castells wrote of the informational city, “no identifiable enemy, no center of power that can be held responsible for specific social issues.” There are just flows. Input, output: service stations and taco stands.
Basilico’s photographs capture a centerless, ambient foreboding that something here isn’t right. How he does this is beguiling and mysterious. And he achieves it without resorting to manipulative gestures or a need to patronize the viewer. These images feel fresh. Raw, yes, but starkly fresh.
Perhaps it is his method of work: “To slow down vision,” Basilico wrote, “was for me a small revolution in the way of seeing.” In Byles’ view, the emptiest photographs are the most powerful. “Basilico is the de Chirico of sprawl.” Well put.
To view the Basilico images in the SFMOMA show, click here.
Crown Point Press, a major force in the Bay Area art scene for 40 years, has produced prints with and for some of the greats, including Richard Diebenkorn, John Cage, Richard Tuttle, Wayne Thiebaud and Pat Steir. In addition to a gallery and bookstore in its well-appointed space on Hawthorne Street in San Francisco, CPP has a tremendous set of files, brochures and descriptive spec sheets on the artists who have worked with founder Kathan Brown and her team of master printers.
I spent several hours rifling through the extensive resources and files, during which I found a small monograph on Judy Pfaff, one of my favorite artists. It features an in-depth interview with Pfaff by Constance Lewallen of CPP.
Here’s a memorable passage from that exchange:
CL: [Your] work is not ironic as so much of the work being shown today, in which the artist is the art critic as well…You once said to me that a positive way of looking at this phenomenon is that now artists have created another arena for themselves–they can be critics, they can be businessmen.
JP: When I am in a generous mood I think that. But often I think it is very depressing that the whole art world seems to demonstrate that attitude now–cool, detached, competent. I think one of the things about being an artist is that you should be allowed to test murky, unclear, unsure territory or all you have left are substitutes that signify these positions. Having it all together is the least interesting thing in art, in being alive.
CL: Someone once wrote that your work deals with art at the fringes of confusion of life itself.
JP: I like that.
I found a terrific article about painting and its complex relationship with the contemporary art scene. It is so provocative, and it reflects many of my own beliefs about the “state of the art” (so to speak) of painting that I posted most of it on my Slow Painting blog.
I don’t want to come across as a monomaniacal, loggerheaded defender of the ancient practice of painting, especially now when there are so many options for visual expression. While I am regularly delighted and provoked by art delivered in other media, no other method has ever captured for me the power, scope, reach, and depth of applying gooey stuff to a flat, receptive surface. And that connection happened even though I came of age as an artist during a time when painting was being vociferously declared (once more, with feeling) DEAD. As a result, I began my career as an artist on a definite back beat. Knowingly.
The story of how painting has survived successive waves of being disregarded is certainly more complex than a single newspaper article can cover, but Christopher Knight of the Los Angeles Times pulls on a few of the key threads that feed into a knotty tapestry of influences and trends. He starts by sharing the dilemma of a young painter still in school (whose sentiments are, uncannily, almost exactly the same ones I encountered when I was an art student years ago): “They sneer and say I’m foolish because painting is obsolete, and I don’t know what to say to them,” she said.
Ah, that old chestnut—the belief that art is like science and technology and should be discussed in the context of progress. By that logic, the old traditions, like painting, become obsolete, “like absolute monarchy or 8-track tapes.”
Knight’s advice to the young artist is clear and straightforward: Say thanks, and mean it.
The short explanation for expressing gratitude is that every young artist should take hostile groupthink — the promiscuous pressure to conform — as a cue that she’s on the right track. Those pressures can be especially acute at school. That’s one hazard of the current pervasiveness of academic training for artists.
Knight goes on to demonstrate that the shift in painting’s place within the au courant practices of fine arts has more to do with the decentralization of art (with New York no longer being an essential center of gravity) than a particular trend or movement. His final point is well taken:
Painting, unlike most image-making practices in industrial or post-industrial society, is already pretty much a solitary job. Rarely do production assistants, teams of fabricators and collaborators gather in a painter’s studio, as they do for movies at Paramount, TV shows at HBO and at the far-flung art factories established by video artist Bill Viola, sculptor Jeff Koons or installation artist Ann Hamilton. Usually it’s just one person in a room, with a flat plane and some colors, trying to juice the corpse and make it dance.
That’s the real legerdemain facing anyone determined to be a painter, whether the student who asked the original question gets the support of her teachers and peers or not. Painting isn’t dead — or, more precisely, it always has been and always will be. The perpetual trick is to give a painting life.
In responding to my previous post about theory and art making, Elatia Harris left a comment that is so full of potent issues I felt it needed to be brought forward, into the headlights. She touches on issues that many visual artists (including myself) mull over, struggle with and voice frustration about. I don’t necessarily agree with Elatia’s conclusion, but I also don’t have a hard and fast answer that satisfies me.
So much has been written about authenticity in aesthetic philosophy in all its various meanings, but here I am referring specifically to the use of the term that speaks to Peter Kivy’s definition of authenticity–faithfulness to the artist’s own self, original, not derivative or aping of someone else’s way of working. In this definition, authenticity is being committed to personal expression, being true to one’s artistic self rather than to the precepts of a particular tradition or -ism.
With as open-ended a definition as that, it is still fair to ask, What IS authenticity? How do you know when you have it and when you do not? There’s no answer that satisfies that question for me. I put it in the same category as a question that is often asked of painters and poets and that cannot be languaged: How do you know when a work of art is finished? (Well, it feels balanced. It stops complaining. It hums. It radiates. What can I say?)
Similarly, authenticity in all its inchoate splendor is as close to a religious creed as I have when I’m in my studio. Like a lot of things in life–love, grief, ecstasy–we keep being tempted to define these powerful experiences in language, but they will not abide.
Here’s Elatia’s comment:
For my painting career, I tried to remain outside theory while including it in my awareness. I didn’t want the pigeon-holes for myself, and wondered why anyone would tolerate them. This is quite different from failing to value consistency or vision, and it also never left me feeling at an emotional disadvantage when I painted or thought about painting. After all, if you cannot or will not say what you are as a painter or how you are affiliated with other painters doing work like yours, then you are trusting your instincts, and instincts tend to be rather unfriendly to theory.
But I have to look at where all this got me — all this rejecting of -isms and refusing to be an -ist. I created a great deal of confusion in the minds of viewers — critics and other intellectuals, friends, gallerists, potential clients. I seemed never to represent any “flavor of the month” they could believe in, or to be a part of what they could understand as the coming thing. And I misunderstood how much the classification mania of the art establishment drove the career progress an artist could make. Perhaps one can’t ever truly be outside the system — only irrelevant to it. Post-modernism engineered a slow breakdown of these taxonomies, but then became, itself, theory-ridden.
I saw the way I negotiated all that as the price of being authentic, and even from this distance I still see it that way. Authentic, yes. Intelligent, no.
I think that art should be allowed to go private. It should be a matter of one-on-one. In the last few years, the public has only heard of art when it makes record prices at auction, or is stolen, or allegedly withheld from its rightful owners. We need to concentrate more on art that sits still some place and minds its own business. We all hope for a strong response from art, but the kind of buzz that we have to live with nowadays is the enemy of art. Quietness and slow time are its friends. Let’s hope that their turn will come.
–John Russell, in conversation with Jason Edward Kaufman
This quote captures the essence of the idea behind Slow Art and the reason I started blogging over a year ago. Russell’s advocacy for a more personal one-on-one art experience–an art that has gone “private”–runs against all the tendencies of our culture.
The sentiments Russell expresses remind me of one of my culture heroes, Craig Newmark, founder of Craigslist.org. Even as his social networking site is valued in the billions of dollars, he is not interested in selling out. When asked why, this was his answer:
“Who needs the money? If you’re living comfortably, what’s the point of having more?”
He has talked about starting the site in his spare time as a service to the community, and it just kept expanding. “I believe people are overwhelmingly trustworthy and good.” By taking that approach, the site has become a massive force of its own.
When something authentic and powerful goes against the drag-it-down current of conventional wisdom, who knows what will open up? I long for these new points of view, new ways of thinking, a shift in consciousness.
Thank you, Elatia Harris, for finding the quote by Russell and sending it my way.