By David Ross. Two propositions about James Brown: 1) He was the highest-energy performer ever. In full overdrive, he was the rubber-legged soul incarnation of Robert McKimson’s Tasmanian Devil. Nobody – not Elvis Presley, Jimi Hendrix, The Who, or Bruce Springsteen – ever approached Brown’s expenditure of calories per second or approximated his capacity for what amounts to self-detonated metabolic nuclear explosion. 2) His bands of the late fifties and sixties – featuring vocal backup by the Famous Flames and best captured on the classic album Live at the Apollo (1963) – were the greatest of the R&B and rock era. These bands were not necessarily the most gifted, but they were the best rehearsed, the most cohesive, the most rhythmically agile, and in all ways the most pinpoint. The most virtuosic rock units – The Who, The Jimi Hendrix Experience, Led Zeppelin, Mahavishnu Orchestra, The E Street Band – sound tattered in comparison. While the rock ethos tended toward drug-induced laziness, Brown was an obsessive-compulsive Karajanesque whip-cracker, as his sixties-era sax player Maceo Parker recalls:
You gotta be on time. You gotta have your uniform. Your stuff’s got to be intact. You gotta have the bow tie. You got to have it. You can’t come up without the bow tie. You cannot come up without a cummerbund … [The] patent leather shoes we were wearing at the time gotta be greased. You just gotta have this stuff. This is what [Brown expected] ….
This YouTube chestnut (see above) shows footage from the famous T.A.M.I. (Teenage Awards Music International) concert, which was held in the Santa Monica Civic Auditorium on October 28 and 29, 1964. In addition to Brown, the concert starred the Rolling Stones, the Beach Boys, Marvin Gaye, Chuck Berry, the Supremes, and Smokey Robinson. The Stones famously whined about having to follow Brown onstage – one understands why. You can purchase the full concert here.
By way of bonus, here’s Brown, circa 1966, belting out an epic version of “It’s a Man’s Man’s Man’s World.”
Posted on October 3rd, 2011 at 1:33pm.
By David Ross. In the annals of the prematurely departed, nothing compares to the world-catastrophe of Keats’ death. English literature lost its best chance at another Shakespeare; Western civilization lost its most promising spokesman. Here’s my Hall of Fame of Bereavement, my Tenebrous Top Ten, in descending order of regret:
• John Keats (1795-1821)
• Percy Shelley (1792-1822)
• Jane Austen (1775-1817)
• Jimi Hendrix (1942-1970)
• Andrei Tarkovsky (1932-1986)
• Flannery O’Connor (1925-1964)
• Charlotte Bronte (1816-1855)
• Richard Parkes Bonington (1802-1828). Perhaps the most gifted of all British painters.
• Sandy Denny (1947-1978). See here for additional elegy.
• Stevie Ray Vaughan (1954-1990)
Stevie Ray comes last on this list only because the blues is a relatively blunt instrument. All the same, his death is a raw and bitter recollection. While Jane Austen and possibly even Charlotte Bronte had entered a terminal pattern, Stevie Ray was in the process of transcending the constraints of I-IV-V and taking up the kaleidoscopic jazz fusion that Jimi Hendrix had initiated (see here) before wastefully doing himself in. Listening to “Lenny,” recorded at Toronto’s El Mocambo Club in 1983, we’re haunted by the sound of things never to come. The song is an epitaph for Jimi, for Stevie Ray, and for an entire school of American music that was conceived but never born.
Posted on September 26th, 2011 at 1:46pm.
By David Ross. In Catcher in the Rye, Holden goes down to Greenwich Village and hears Ernie the piano player and says:
“You could hardly check your coat, it was so crowded. It was pretty quiet, though, because Ernie was playing the piano. It was supposed to be something holy, for God’s sake, when he sat down at the piano. Nobody’s that good. About three couples, besides me, were waiting for tables, and they were all shoving and standing on tiptoes to get a look at old Ernie while he played. He had a big damn mirror in front of the piano, with this big spotlight on him, so that everybody could watch his face while he played. You couldn’t see his fingers while he played – just his big old face. Big deal. I’m not too sure what the name of the song was that he was playing when I came in, but whatever it was, he was really stinking it up. He was putting all these dumb, show-offy ripples in the high notes, and a lot of other very tricky stuff that gives me a pain in the ass. You should’ve heard the crowd, though, when he was finished. You would’ve puked. They went mad. [...] In a funny way, though, I felt sort of sorry for him when he was finished. I don’t even think he knows any more when he’s playing right or not. It isn’t all his fault. I partly blame all those dopes that clap their heads off – they’d foul up anybody.”
Whenever I hear Oscar Peterson, this passage goes off like a firecracker in my head. I’m sure this is terribly unfair, but there it is.
Peterson, in any case, is indeed “that good.” He’s preposterously good, impossibly good, infinitely over-the-top in every way relating to the intersection of the piano and human fingers. This heated blues romp – an encyclopedia of forms and variations and cute little subversions thereof – is typical. If you happen to play the piano, be advised that whatever little self-regard you’ve developed over the years will be completely crushed. This is for non-players only.
Posted on September 20th, 2011 at 1:06pm.
By David Ross. I offered a little paean to the independent film here, but what would such a paean to the contemporary Hollywood comedy look like? The Hollywood comedy is, if not dead, at least writhing on the floor and gasping for breath and rapidly turning blue. Netflix’s selection of new comedies is a flotilla of cliché-ridden, two-starred dreck, of which a film like You Again (2010) is representative. Kristen Bell, Jamie Lee Curtis, and Sigourney Weaver head a mildly promising cast, but the film is a mire of clichés. Here’s the plot in Netflix-speak:
History – make that high school – may repeat itself when Marni (Kristen Bell) learns that Joanna (Odette Yustman), the mean girl from her past, is set to be her sister-in-law. Before the wedding bells toll, Marni must show her brother that a tiger doesn’t change its stripes. On Marni’s side is her mother (Jamie Lee Curtis), while Joanna’s backed by her wealthy aunt (Sigourney Weaver).
Every character is a deliberate incarnation of cliché: the nerd-turned-LA-public relations-whiz-who’s-still-a-nerd-at-heart, the head cheerleader who sadistically delights in tormenting her social inferiors, the salacious granny who says things like “If you don’t go for him I will” (this an attempt to reverse the eighteenth-century cliché of Mrs. Grundy, never mind that this cliché was killed off decades ago, possibly by Auntie Mame), the eye-rolling, wise-ass younger brother whose sarcastic sallies are affectionately disregarded. And of course the film ends with psychotherapeutic reconciliations and teary hugs. Comedy depends on surprise and sudden anarchic subversion. Cliché is anathema to surprise and the reverse of subversive. Thus a film like You Again is, by definition, dead on arrival.
I’m interested in You Again only as an example of the countless comedies that go quietly to their doom, watched by bored trans-Atlanticists who last thirty minutes before deciding to snooze or fiddle with their iPhones. Movies like this don’t even bother to resist their fate; they may even seek it on the principle that formulaic death-in-life is less culpable than risk.
Genuinely funny films like David O. Russell’s Flirting with Disaster (1996), the Coen brothers’ Intolerable Cruelty (2003), Borat (2006), and Greg Mottola’s Superbad (2007) are exceptions that prove the rule. Ben Stiller in Flirting with Disaster reminds us of Ben Stiller in Zoolander, Dodgeball, Envy (don’t even remember this stinker, do you?), Starsky and Hutch, Night at the Museum (1 & 2!), The Marc Pease Experience, and so forth. Jonah Hill in Superbad reminds us of Jonah Hill in calibrated conventionalities like Knocked Up, Forgetting Sarah Marshall, and Get Him to the Greek. Rowan Atkinson in Blackadder – perhaps the funniest turn in all of contemporary comedy – reminds us of Rowan Atkinson in The Lion King, Johnny English, and Mr. Bean’s Holiday.
In the last twenty or thirty years, Hollywood has spent billions of dollars on comedies that have produced a few sniggers among thirteen-year-olds, but left the rest of us with a sense of having been gypped and maybe even insulted. It has also serially wasted the talents of generations of not untalented comedians. To those mentioned above, we can add countless others, beginning with Eddie Murphy, a legitimate comic genius (v. Delirious) reduced to whooping it up with barnyard animals. How does this happen?
A comprehensive explanation probably has a few components. There is of course the quest to hedge one’s bets and play it safe, which is part of the innate economics of the modern film industry, in which star vehicles begin $10 or $20 million in the salary hole. There is also the coarsening of the film audience along with the rest of the culture. Verbal high-wire acts like Ball of Fire and His Girl Friday assume audiences conditioned by the mordant witticisms of Mencken and the bright sophistication of Broadway show tunes and the lively bloodsport of cities with numerous daily newspapers. Audiences reared on video games and public school political correctness tend to go glassy-eyed while trying to follow a baseline rally of ironic badinage. Here’s a test: show a young person “Duck Amuck,” the most manic and zanily postmodern of the Looney Tunes. I bet they don’t laugh.
By David Ross. Lyonel Feininger has suddenly and splendidly swung into view, like some rare astral event. The Whitney Museum is holding, through October 16, an exhibition called “Lyonel Feininger: At the Edge of the World,” which should go far toward confirming the obvious: Feininger was for half a century one of the world’s chief painters. The exhibition is a major contention on his behalf, as the magnificent exhibition catalogue – available here – makes clear.
Feininger (1871-1956) is less celebrated than he should be principally because he confuses the national categories that structure so much art history. He was born in New York to German parents. So far so good. At age sixteen, he shifted his studies to Germany and wound up becoming the proverbial American abroad. During his fifty-year German sojourn, he fell in with the expressionists and later joined the faculty of the Bauhaus as an instructor in printmaking. During the 1920s, he became one of the “Blue Four,” an eminent coterie that included Kandinsky, Klee, and Alexej von Jawlensky (see here for an excellent survey). Feininger returned to the U.S. in 1937, after the Nazis sent a not so subtle signal by including his work in their infamous “Degenerate Art Exhibit.”
Neither quite American nor quite German, Feininger figures in nobody’s national tale. Had he remained in the U.S. or expatriated himself in England or France – countries entwined in our own modernist myth – I suspect he would now be considered one of the Titans of twentieth-century American art. Certainly he was a greater painter than Marsden Hartley (born 1877), Georgia O’Keeffe (born 1887), and Thomas Hart Benton (born 1889), who may be his closest American counterparts. As it is, the Wikipedia entry on American art does not even mention Feininger.
Complicating matters further, Feininger passed through three distinct and not easily reconciled phases. He was first a German expressionist, an oil cartoonist of spooky elongations and lurid Halloween scenery; he was second an impeccably elegant cubist of the school of Cezanne in its Weimar manifestation; he was third – especially during his later American years – a sketch artist whose modest drawings of sailboats, waterfront scenes, and New York buildings translated nature into a kind of wiry architecture, a taut cross-hatching whose inspiration, it’s not incredible to think, may have been the rigging of ships. These latter drawings, sometimes overlaid with watercolor, have a wonderful simplicity, a relaxed confidence in the soundness of their own geometry. Which is the primary Feininger? What explains the strange, disjunctive pattern of his career? There are no clear answers and thus few critics inclined to take up the questions.
By David Ross. My preferred form of Internet time-wasting is “Google Images.” I collect photos of great writers, Georgian architecture, Michelin-starred food (the kind I may never get a chance to eat), nineteenth and early twentieth-century art (Samuel Palmer, Lyonel Feininger, Wyndham Lewis, etc.), and, yes, glamour shots of classic actresses, including, but not limited to, Anouk Aimee, Lauren Bacall, Capucine, Audrey Hepburn, Katharine Hepburn, Anna Karina, and Grace Kelly, preferably in Givenchy or Chanel, always in glorious black and white. In short, I’m a minor connoisseur, on which basis I would like to make the reckless assertion that the above photo of Veronica Lake, circa 1941, is the greatest still photo – the most elegant, seductive, multivalent – ever taken of an actress. The photographer was George Hurrell (1904–1992), whom Virginia Postrel calls the “master of Hollywood glamour.” You can read about his revival here and buy his work here.
The photo seems at first glance your standard come-hither boilerplate, elevated, obviously, by Veronica’s preternatural bone structure and hallmark tresses. I find, though, that Veronica’s expression has a kind of Gioconda irreducibility. At once sexy, weary, predatory, and demure, her expression seems to say something like, “I have no interest in you – no interest in the mere world – but if you insist, I will rouse myself to the matter of your destruction – and you will relish every wound.” Notice the faint sneer that registers at the right corner of the mouth; notice the shadowed right eye that carries dual connotations of the harlequin and the gun moll with a shiner; notice the coffin-forming play of light and shadow. This is a disconcerting silhouette indeed: a dark little study of sex and death, a forked image of the sleeping beauty and the stirred succubus, the thirst-awakened vampiress.
In comparison, Rita Hayworth kneeling on her satin-sheeted bed and Marilyn Monroe struggling with her billowing skirt are images of mere adolescent wish fulfillment, of sweaty pubescence. If buxom vistas are your thing — well, enjoy. Hurrell’s version of Veronica Lake belongs to an entirely different category. Its glamour recalls Beardsley, Weimar, what have you; it’s not kid’s stuff.
Posted on September 9th, 2011 at 2:07pm.
By David Ross. Let’s admit it. We all have a weak spot for certain women from the wrong side of the political tracks. Maybe you have little fantasies about discussing Bresson with Susan Sontag while soaping her back in the tub. Maybe you imagine sharing the Sunday paper with Joan Didion. My own weakness – lifelong – is for Patti Smith. I had a girlfriend who gamely stood in line to have Patti sign a CD copy of Horses for me. When she finally got to the front of the queue, she told Patti, “My boyfriend is in love with you.” Patti said, “Doesn’t he notice these grey hairs?” My girlfriend said, “I don’t think he cares.” Well spoken on my behalf.
William Blake offers – perhaps ‘records’ is the more appropriate verb – this exchange with the prophet Isaiah in The Marriage of Heaven and Hell:
Then I asked: “Does a firm persuasion that a thing is so, make it so?”
He replied, “All poets believe that it does, and in ages of imagination the firm persuasion removed mountains; but many are not capable of a firm persuasion of anything.” (V, 27-32).
What’s so alluring in the supra-physical sense is Patti’s capacity for this “firm persuasion.” She’s not mugging (like Bono) or merely howling (like Kurt Cobain): her music is a disciplined act of conviction in her own poetic and prophetic calling. One can look awfully silly as a self-styled poet or prophet (Jim Morrison certainly did) but Patti never wavers and never allows the spell to break; we’re convinced in the end because she’s utterly convinced from the start. Arguably, Patti was the last legitimate keeper of the romantic flame itself, that desperate belief in art that began in the late nineteenth century and guttered utterly in our own time.
A bony, boyish waif from Woodbury Gardens, NJ, Smith cut her teeth at the Chelsea Hotel and St. Mark’s Church during the late 1960s and early 1970s, achieving minor underground celebrity as an actress, playwright, rock journalist, artist, and poet. Her chief inspirations were predictable but nonetheless powerful: Rimbaud, Genet, Burroughs, Ginsberg, Dylan, Hendrix, the Rolling Stones. In 1971, she began to recite her poetry to guitarist Lenny Kaye’s accompaniment. By 1976, she had improbably become the most acclaimed female rock star since Janis Joplin and Grace Slick had emerged ten years earlier.
Smith’s first album, Horses (1975), weds cascades of Beat- and Symbolist-inflected poetry to the lean, driving sound of proto-punk garage rock. The album remains a signature argument for the artistry of rock ’n’ roll and is to my mind one of the ten supreme albums of the rock era. Rolling Stone ranks Horses 44th on its list of the 500 greatest albums of all time, just behind Dark Side of the Moon. This becomes a backhanded compliment when you consider certain albums that rank higher: the Eagles’ Hotel California (#37), Carole King’s Tapestry (#36), David Bowie’s The Rise and Fall of Ziggy Stardust and the Spiders from Mars (#35), U2’s The Joshua Tree (#26), Fleetwood Mac’s Rumours (#25), and Michael Jackson’s Thriller (#20). Preferring Tapestry to Horses is like preferring Jennifer Aniston to Veronica Lake – an aesthetic misjudgment that raises questions about one’s entire world view.
Judge for yourself: here is the epic studio version of “Birdland,” Smith’s fantastic synthesis of Arthur C. Clarke, Shelley’s Queen Mab, and the Book of Revelation. Ponder also these scruffy, raging, nearly epileptic live versions of “Horses” and “Gloria.”
By David Ross. Every so often I dip into contemporary literature to confirm my sense that I’m not missing very much. I recall forays into the work of Paul Auster, Angela Carter, Douglas Coupland, Dave Eggers, Bret Easton Ellis, Jonathan Franzen, Michel Houellebecq, Jay McInerney, Cormac McCarthy, Rick Moody, Chuck Palahniuk, Salman Rushdie, Jeanette Winterson, and other passing fancies of Time and Newsweek. Zadie Smith waits her turn on my shelf. All this sifting of silt has produced only a few glinting nuggets. I discovered in Houellebecq a fierce and welcome fellow despiser of modernity (see my comments here), and something even more in David Foster Wallace: a vast nineteenth-century mind struggling to find itself.
The “covering cherub,” in Blake’s parlance, was the postmodernism that DFW formally embraced against the grain of his personality. He was profoundly sincere, empathetic, and humane, a believer in “the sub-surface unity of things,” as he puts it in his famous Kenyon graduation address of May 2005, and yet devoted his career to self-conscious intricacies of irony and gamesmanship. He made great art in this mode – only Nabokov and Borges are his postmodern betters – but it was not, I can’t help feeling, the art he was born to make.
I have additional misgivings about his prose, though he is the only prose writer of his generation even worth noting. While meticulously attentive to his art, he was ambivalent about the formality of his art, the ideal of the well-wrought urn. His language is often splendid, but always splendid despite a certain scruffiness and loose-limbed sprawl. My eye is always instinctively performing the function of an editor, pruning, reshaping. He was too invested in his own unpretentiousness, too much infected with the modern American ideal of jeans and sandals, which ultimately expresses a yearning to be liked, to be no better or different than the rest of the crowd. I suppose this is the symbolic meaning of DFW’s hallmark bandana, an accouterment of kitchen and field workers, housewives and athletes. Great writers don’t care about being liked. They scorn our right to judge. They discover themselves amid the execrations of the crowd.
Even with his foibles and arguable failings totted up, DFW was the redeemer of his literary generation. He saved it from the humiliation of being the first generation in American history to lay nothing – not the least nosegay – on the graves of Emerson, Thoreau, and Whitman. He saved it from the gaping wound of a great naught.
DFW’s rightly famous Kenyon Commencement Speech (here and here) has become a pop-cultural touchstone. Perhaps enthusiasm for it has already become a bit of a cliché. Yet I defy anybody to listen attentively without succumbing to its moral seriousness and sinking into an inner hush just as the initially boisterous Kenyon audience stills into an outer hush. In the guise and moment of his speech, DFW defies the default setting of the culture. He sheds his celebrity – the unpeelable skin of the Oprah era – and becomes the conduit and servant of a message more urgent than himself. Thus Emerson spoke from the podium of the Concord lyceum.
Alex Niven, a friend of a friend, comments intelligently on the speech and on much else concerning DFW.
Posted on August 24th, 2011 at 2:04pm.
By David Ross. “Independent film” is defined by its circumvention of the Hollywood production mechanism, but this is incidental. The issue is not process but content. Independent film is an indigenous American genre just like the science-fiction film, the noir film, and the Western. Its chief attribute is loquacity. Talk is cheap – literally – and independent movies have made a virtue of necessity by rediscovering what Hollywood can afford to forget: that dialogue is the basis of drama. Less cardinal but still defining attributes include gritty naturalism (almost always urban), cultural and moral skepticism, a penchant for irony and deadpan, identification with pitiable outsiders and addled anti-heroes, impatience with traditional sequential narrative, appreciation for the retro, and a certain seated or merely ambulatory anti-kinesis (not a few films, like Slacker and Before Sunrise, narrate a literal walk). Independent film disdains the happy ending and could not care less about sex or sexiness or conventional good looks (hence Steve Buscemi). The operative politics tend to be amorphously anti-establishmentarian, but too skeptical to be actively liberal. I could make an excellent argument for the conservatism of films like Annie Hall (1977), Metropolitan (1990), Barcelona (1994), and A Serious Man (2009).
The films of the “Easy Rider, Raging Bull” era – films like John Schlesinger’s Midnight Cowboy (1969), Peter Bogdanovich’s Last Picture Show (1971), Martin Scorsese’s Mean Streets (1973) and Alice Doesn’t Live Here Anymore (1974), Roman Polanski’s Chinatown (1974), and Sidney Lumet’s Dog Day Afternoon (1975) – heralded the independent film movement, but were not strictly seminal. They yearned to be epic, mythic, and culturally central, with John Ford, Howard Hawks, and John Huston in mind. Independent cinema would later snort at these manly pretensions, settling for a peripheral and ironic self-awareness, like the satirical wallflower at the high school dance.
The first true – and possibly best – independent film was Annie Hall. Inspired by European conversationalists like Ingmar Bergman and Eric Rohmer, Woody Allen created an aggressively small and verbal film in an era of equally aggressive hypertrophy and hyperactivity. Perhaps even more to the point, Annie Hall was a quietly scathing critique of the post-sixties liberal and pop-cultural order, setting the tone for a whole generation of filmmakers united by the instinct that “something is wrong,” though skeptical of their own capacity for overt social statement in the style of seventies masterpieces like A Clockwork Orange (1971) and Network (1976). Gabby symposia like Louis Malle’s My Dinner with Andre (1981) and Barry Levinson’s Diner (1982) further crystallized the verbal and essentially seated nature of the genre.
[EDITOR'S NOTE: The New York Post's Kyle Smith suddenly has a column out today (8/17) entitled "The Clockwork riots," which compares the London riots to both the Burgess and Kubrick versions of A Clockwork Orange, referring to the "prophetic" nature of those works, as well as to the ongoing crime "orgy" in London. No attribution is made to Libertas. This seems to be a striking coincidence. We would appreciate a clarification from Mr. Smith.]
By David Ross. A Clockwork Orange, Kubrick’s classic interpretation of Anthony Burgess’ 1962 novel, is no longer prophetic. It is actual. The realization of its vision is unmistakable as London mobs of juvenile miscreants burn and loot, differing from Malcolm McDowell’s Alex DeLarge only insofar as they would not be caught dead listening to Beethoven. We are witnessing the nanny state eventuate in its logical terminus: abdication of personal responsibility, dissolution of purpose, collapse of belief, crippling unconscious sense of one’s own infantilization. As Mark Steyn says, “Big government means small citizens.”
This seems to me the movie’s crucial exchange:
Tramp: Well, go on, do me in you bastard cowards! I don’t want to live anyway, not in a stinking world like this!
Alex: Oh? And what’s so stinking about it?
Tramp: It’s a stinking world because there’s no law and order anymore! It’s a stinking world because it lets the young get on to the old, like you done. Oh, it’s no world for an old man any longer. What sort of a world is it at all? Men on the moon, and men spinning around the earth, and there’s not no attention paid to earthly law and order no more.
“No attention paid to earthly law and order” is a curious and pregnant phrase. The tramp wants to complain not merely about crime, but about alienation from something more fundamental than the penal code. The exasperated allusion to “men on the moon” implicates the rationalism and materialism of a post-religious age. Seduced by our new powers of knowledge and control, we’ve lost sight of basic truths and duties.
Just now I’m reading Elizabeth Gaskell’s 1857 biography of Charlotte Bronte. As the yobs spread their mayhem, I can’t help thinking of the three Bronte sisters, their mother dead, their two elder sisters dead, no schooling to speak of, no money to speak of, nothing but the cold and howling Yorkshire wind for company, and yet toiling to master French, German, politics, history, and literature, and eventually promulgating, from the nursery of a provincial parsonage, one of the great literary sprees in all history. How is it that their kind has become so utterly inconceivable?
By David Ross. The kiddy culture – the culture of sneakers, fast food, and video games – has subsumed the adult culture; or rather adolescents have stopped graduating from one to the other. Thus, as I read in Mark Steyn’s latest tome, the chilling and funerary After America, “males 18 to 34 years old play more video games than kids: according to a 2006 Nielsen survey, 48.2 percent of men in that demographic amused themselves in that way for an average of two hours and forty-three minutes every day – that’s thirteen minutes longer than the 12- to 17-year-olds” (181). Kay Hymowitz provides the definitive account of the new “child-man” in City Journal.
The kiddy world is characterized by impulse; the adult world by purpose. The kiddy world belongs to the playpen of the present moment; the adult world tethers itself to both past and future. The kiddy world passively imitates and downloads; the adult world discriminates and invents.
If I had to offer a living symbol of the “adult world” – its tenderness, stoicism, rigor, mature calm – I would point to Tony Rice’s version of “Shenandoah.” I would say to our thirtysomething sneaker-wearers, this is what it means to be grown up, to carry yourself like a man.
Posted on August 15th, 2011 at 10:34am.
By David Ross. Every so often liberal big leaguers take a whack at Thomas Kinkade, the king of mall and mail-order art, the entrepreneurial painter laureate of what Jed Perl calls “Wal-Mart America.” His depictions of gingerbread cottages nestled in what seem to be sleepy Cotswold hamlets are beloved by the masses and equally detested by people who consider themselves – by virtue of college degrees and the occasional glass of white wine with dinner – Blue State sophisticates. In 2001, Susan Orlean gave Kinkade the once-over in the New Yorker (see here), though she semi-restrained her snark on the grounds that Kinkade’s buffoonery speaks for itself. Perl has now followed suit with an inchoate piece of hostility – titled “Bullshit Heaven” no less – in The New Republic. Extending the toilet metaphor, Perl concludes that Kinkade has “urinated on us all.”
There’s no denying that Kinkade’s art is pure kitsch, a confection of Christmas-card nostalgia derived from Wordsworth at his most fey, Norman Rockwell at his most precious, and whoever first had the idea of painting and mass-producing scenes of beagles playing poker. His cotton-candy shire scenes look as if model trains should be running through them or Hobbits should be peeking from the windows. I would no more hang a Kinkade in my living room than a poster of Ashton Kutcher in the buff.
The blame is usually – okay, always – directed at putative yahoos who clamor for this kind of thing and create demand for what were better handled like dog poo in the street (quick condescending glance, wide berth). Articles about Kinkade are never really about Kinkade; they are about the people who buy Kinkade. Essentially, they license the readers of the New Yorker and The New Republic to look down on “Wal-Mart America” from a standpoint of cultural and aesthetic superiority. Their real substance, in other words, is Blue State-Red State politics. (I wonder, by the way, whether a film like Winter’s Bone doesn’t exploit the same condescension.)