The Bloodstained Rise

Christopher Logue: All Day Permanent Red

National Review, November 9, 2003

Christopher Logue has been a dealer in stolen property (briefly), a prisoner in a Crusader castle (16 months), a pornographer (the book Lust), and, probably no less discreditably, an actor, a poet, and a writer of screenplays. As if this weren't enough, for over four decades this versatile Englishman has been engaged in a "reworking" of the Iliad. It is not, he is at pains to stress, a translation (he knows no Greek), but an episodic "account" of the ancient epic that has already taken far longer to produce than Troy took to fall.

And, as you read those words, I can hear you sigh. The prospect of yet another tawdry modernization of a classic that needs none seems like nothing to look forward to. Our age often shows itself too restless, unimaginative, and self-important to attempt a genuine understanding of our culture's past. Hot in the pursuit of some imagined relevance, we are forever reinterpreting and updating, here The Tempest as an allegory of slavery, there a few nipples to spice up that boring old Jane Austen. And if, in the process, the sense of the original is lost, we shrug, and settle for what is left: deracinated pap, bland at best, topically—and inconsequentially—"controversial" at worst. Only later do we bother to wonder where our literature has disappeared to.

But All Day Permanent Red is very different from the usual dross. Logue's previous work on the Iliad has been called a masterpiece (Henry Miller, not always a reliable source, described an early section as better than Homer): a devalued term these days, but, in this case, well deserved. All Day Permanent Red is the latest chapter and it doesn't disappoint. Here is Logue's description of the Greek soldiers rising to face their Trojan opponents:

Think of a raked sky-wide Venetian blind.

Add the receding traction of its slats

Of its slats of its slats as a hand draws it up.

Hear the Greek army getting to its feet.

Then of a stadium when many boards are raised

And many faces change to one vast face.

So, where there were so many masks,

Now one Greek mask glittered from strip to ridge.

In earlier installments—War Music (1981), Kings (1991), and The Husbands (1994)—Logue darted in and out of Homer's chronology, starting with the death of Patroclus and the return of Achilles, then taking his readers back to the early quarrels between Agamemnon and Achilles, and then on to the single combat between wronged Menelaus and spoiled, lethal Paris. In All Day Permanent Red (the title is, wonderfully, borrowed from an advertisement for lipstick), Logue takes a step back—to the very first full day of combat between the two armies.

The language is as ferocious as its subject matter and, in its cinematic intensity, it's easy to see the hand of the former screenwriter:

Sunlight like lamplight.

Brown clouds of dust touch those brown clouds of dust already overhead.

And snuffling through the blood and filth-stained legs

Of those still-standing-thousands goes Nasty, Thersites' little dog.

Now licking this, now tasting that.

But there is more to this saga than a simple recital of slaughter. The savagery on the plains before Troy is echoed in the heavens above. Nowadays we tend to trust in the benign God of the monotheistic imagination or, failing that, in the indifference of a universe that does not actually set out to harm us. The men of Homer's time had no such comfort: "Host must fight host, / And to amuse the Lord our God / Man slaughter man."

The gods of antiquity were capricious, selfish, and vain, playground bullies or the smug members of the smart set in a high-school movie, monsters as often as they were saviors. Pitiless, dangerous Olympus is a recurrent theme that Logue, like Homer, has emphasized throughout his narrative, and this new volume is no exception. Here is Athena's response to a plea for help from Odysseus:

Setting down her topaz saucer heaped with nectarine jelly

Emptying her blood-red mouth set in her ice-white face

Teenaged Athena jumped up and shrieked

"Kill! Kill for me!

Better to die than to live without killing!"

Logue's language, both grand and, at times, oddly conversational ("Only this is certain: when a lull comes—they do— / You hear the whole ridge coughing"), brings immediacy to an ancient epic. His use of deliberately anachronistic wording neither jars (partly because most English-speaking readers, including this one, are not comparing Logue's work against the original Greek) nor does it break that sense of the past that is no small part of the spell of a tale thousands of years old. And, yes, the references to Venetian blinds, plane crashes, and even an aircraft carrier somehow work in this tale of Bronze Age fury. Their very modernity reminds us both of our vast distance from this saga, and of the extraordinary cultural continuity that its survival represents.

And if we want to understand why, beyond an accident of history, the Iliad has been remembered for so long, Logue's extraordinary, compelling poetry gives us a clue. The Iliad has as much to say about the human condition now as it did when Homer began to write, not least the destructive, glorious, inglorious love of battle that will endure until the Armageddon which, one day, it will doubtless bring about:

Your heart beats strong. Your spirit grips.

King Richard calling for another horse (his fifth).

King Marshal Ney shattering his saber on a cannon ball.

King Ivan Kursk, 22.30 hrs, July 4th to 14th '43, 7000 tanks engaged,

"... he clambered up and pushed a stable-bolt

Into that Tiger-tank's red-hot machine-gun's mouth

And bent the bastard up. Woweee!"

Where would we be if he had lost?

Achilles? Let him sulk.

A masterpiece? Of course it is.

Horror Show

Joe Bob Briggs: Profoundly Disturbing - Shocking Movies That Changed History

National Review, August 26, 2003

The title is reassuringly lurid and the cover comfortingly nasty, but, on opening this book, anxious readers may worry that Joe Bob has left the drive-in. Now that would be profoundly disturbing. Author, journalist, cable-TV stalwart, and former NR columnist, Briggs overcame fictitious origins and nonexistent competition to become America's finest drive-in-movie critic. He saw Nail Gun Massacre and he watched All Cheerleaders Die. Who else could take on that sort of responsibility?

He is the Zagat of the Z-movie, the one indispensable guide for those who like slaughter, sex, and lethal household tools with their popcorn. He wallows in the movies that other critics flee. Ebert on Shrunken Heads? Silence. Kael on Fury of the Succubus? No comment. But Joe Bob was there for them both. He's funny, well informed, and succinct (The Evil Dead is "Spam in a cabin"), and he tells his audience what it needs to know (Bloodsucking Freaks: "pretty good fried-eyeball scene . . . 76 breasts . . . excellent midget sadism and dubbed moaning"). If Joe Bob tells you to "check it out," that's what you do.

And when, as a result, you are watching man-eating giant rats starting their gory feast (Gnaw), you will still be laughing at the memory of what Joe Bob had to say. Yes, he both subverts and celebrates these films, but who cares? It's better to lighten up, grab a beer, and just see Joe Bob as someone who delights in rummaging through cinema's trash heap and telling us what he's found.

He does this brilliantly, in a style — Hazzard County, with a touch of Cahiers du Cinema — that is all his own; but, after all these years, is the drive-in still enough for Mr. Briggs? Joe Bob's Jekyll, the erudite and rather more suave "John Bloom," has been developing a journalistic career of his own, while Joe Bob himself has been spotted on stage and screen, and in the pages of Maximum Golf magazine; can the country club be far behind?

In spite of this, it's still startling to find that Briggs chose The Cabinet of Dr. Caligari as the first movie to discuss in his new book. The fact that it's foreign isn't the problem. Joe Bob has written about plenty of foreign films; they usually feature kickboxing, kung fu, gratuitous violence, more kickboxing, incomprehensible dialogue, over-choreographed fight scenes, and the exploitation of attractive young actresses who manage to lose their clothes and their lives in the course of the movie. They are, in short, identical in almost every respect to the domestic offerings he reviews.

Caligari is different. Yes, it's a horror movie, but it's a coffee-on-the-Left-Bank, furrowed-brow unfiltered cigarette of a horror movie and, like a number of the other films described in this book, it's far from typical territory for the sage of the slasher pic. It's a German expressionist masterpiece from 1919, an allegory of totalitarianism often thought to have anticipated the Nazi terror to come. There are no nunchucks in Caligari. Still, there's more than an echo of the drive-in in the irreverent glee with which Joe Bob penetrates the Teutonic gloom. All too often, Caligari is shown with a melodramatic "silent movie" musical backdrop, rather than the modernist score envisaged by its makers. Perhaps worse still, it has also been relentlessly over-analyzed by film highbrows. To Joe Bob, this is like "trying to watch Schindler's List with 'Turkey in the Straw' playing in the background and a professor pointing out every shaft of light as a pivotal moment in German Expressionism."

Caligari is, Briggs argues, a film that "changed history," but in this book that can mean less than you might think. The movies in Profoundly Disturbing may all "have been banned, censored, condemned, or despised" at one time or another, but some of them wouldn't change the course of an afternoon, let alone history.

Perhaps this is why Joe Bob is careful to stress that, in a number of cases, the only history that has been changed is cinema history. How the films he discusses relate to the broader cultural picture is complex: Did a movie influence the culture, merely reflect it, or a bit of both? As he tries to find an answer to this question, quality can be irrelevant. Deep Throat is a terrible film even on its own terms, but somehow it managed to help shape the Ice Storm era and thus had much greater cultural impact than the far more artistically significant Caligari. Caligari may have warned Germans about the dangers of totalitarianism, but little more than ten years later Hitler was in power.

If Profoundly Disturbing doesn't always convince us that the movies it describes "changed history," it is, nonetheless, a hugely entertaining account of the frequently bizarre way they came to be made. Some of these films were made by people operating at the creative edge (the art director of Texas Chainsaw Massacre was, we learn, able "to indulge his lifelong fascination with animal bones") while others were manufactured by those who had hit artistic rock-bottom (Linda Lovelace for President) and didn't care. This is a cinema of desperate improvisation (the night before the "classic tongue-ripping scene" in Blood Feast, the victim still hadn't been cast) and even more desperate finances.

And then there's Mom and Dad (1947), a "sex education" movie that circulated for over 20 years through small-town America. This cautionary tale of the dangers of premarital naughtiness included footage of a live birth and hideous syphilitic sores. It grossed an estimated $100 million. Showings came complete with two women in nurse's outfits and a 20-minute lecture by "Elliot Forbes," an "eminent sexual hygiene commentator." At one point there were no fewer than 26 Elliot Forbeses, "most of them retired or underemployed vaudeville comedians."

If this all sounds like a carny stunt, it's because it was. Profoundly Disturbing includes a good number of more "serious" films (and Briggs writes about them very well), but the movies that make up its sleazy, captivating core are the successors of the freak show, the circus, and old-time burlesque. As told with gusto by an author obviously far from ready to quit the drive-in (whew!), theirs is a story of that wild, ludicrously optimistic entrepreneurial spirit that is, somehow, very typically American. Combine those hucksters, visionaries, and madmen with the dreams of a restless, somewhat deracinated population spreading across a continent and we begin to understand how this country's popular culture became the liveliest in the world — if not always the most elevated. Mencken was right: No one ever went broke underestimating the taste of the American public.

Why so much of that taste revolves around mayhem and gore (that sex has box-office appeal is no surprise) is a mystery beyond the scope of Profoundly Disturbing. Suffice to say that it does, and the result is a book that blends fascinating pop-culture history, first-rate film criticism, and learned commentary on the stunt-vomit in The Exorcist.

Check it out.

Everybody Must Get Stoned?

Jacob Sullum: Saying Yes - In Defense of Drug Use

National Review, June 20, 2003

Jacob Sullum is a brave man. In his first book, the entertaining and provocative For Your Own Good, he attacked the excesses of anti-smoking activism and was duly—and unfairly—vilified as a Marlboro mercenary, a hard-hearted shill for Big Tobacco with little care for nicotine's wheezing victims. Fortunately, he was undeterred. In Saying Yes, Sullum, formerly of NATIONAL REVIEW and now a senior editor at Reason magazine, turns his attention to the most contentious of all the substance wars, the debate over illegal drugs. Sullum being Sullum, he manages to find a bad word for the mothers of MADD and a good one for 19th-century China's opium habit.

Sullum's effort in Saying Yes is more ambitious (or, depending on your viewpoint, outrageous) than that of most critiques of the war on drugs. Supporters of legalization typically base their case on moral or practical grounds, or both. The moral case is broadly libertarian—the individual has the right to decide for himself what drugs to take—while the practical objection to prohibition rests on the notion that it has not only failed, but is also counterproductive: It creates a lucrative (black) market where none would otherwise exist. Sullum repeats these arguments, but then goes further. Taken in moderation, he claims, drugs can be just fine—and he's not talking just about pot.

Whoa. In an era so conflicted about pleasure that wicked old New York City has just banned smoking (tobacco) in bars, this is not the sort of thing Americans are used to reading. Health is the new holiness and in this puritanical, decaf decade, most advocates of a change in the drug laws feel obliged to seem more than a little, well, unenthusiastic about the substances they want to make legal. Their own past drug use was, they intone, nothing more than youthful "experimentation." Most confine themselves to calling for the legalization of "softer" drugs and, even then, they are usually at pains to stress that, no, no, no, they themselves would never recommend drugs for anyone.

Sullum is made of sterner stuff. He admits to "modest but instructive" use of marijuana, psychedelics, cocaine, opioids, and tranquilizers with, apparently, no regrets. (Judging by the quality of his reasoning, I would guess the drugs had no adverse effect on him.) He seems prepared to legalize just about anything that can be smoked, snorted, swallowed, injected, or chewed—and, more heretically still, has no truck with the notion that drug use is automatically "abuse." "Reformers," he warns, "will not make much progress as long as they agree with defenders of the status quo that drug use is always wrong."

In this book Sullum demonstrates that if anything is "wrong"—or at least laughably inconsistent—it is the status quo. The beer-swilling, Starbucks-sipping Prozac Nation is not one that ought to have an objection in principle to the notion of mood-altering substances. Yet the U.S. persists with a war on drugs that is as pointless as it is destructive. This contradiction is supposedly justified by the assumption that certain drugs are simply too risky to be permitted. Unlike alcohol (full disclosure: Over the years I have enjoyed a drink or two with Mr. Sullum), the banned substances are said to be products that cannot be enjoyed in moderation. They will consume their consumers. Either they are so addictive that the user no longer has a free choice, or their side effects are too destructive to be compatible with "normal" life.

To Sullum, most such claims are nonsense, propaganda, and "voodoo pharmacology." Much of his book is dedicated to a highly effective debunking of the myths that surround this "science." There's little that will be new to specialists in this topic, but the more general reader will be startled to discover that, for example, heroin is far less addictive than is often thought. The horrors of cold turkey? Not much worse than a bad case of flu. (John Lennon—not for the only time in his career—was exaggerating.) Even crack gets a break: Of 1988's "crack-related" homicides in New York City, only one was committed by a perpetrator high on the drug. That's one too many, of course, but 85 percent of these murders were the result of black-market disputes, a black market that had been created by prohibition.

So if drug users are neither necessarily dangerous nor, in most cases, addicts, can they be successful CPAs or pillars of the PTA? Sullum argues that many currently illegal drugs can safely be taken in moderation—and over a long period of time. He interviews a number of drug users who have managed to combine their reputedly perilous pastime with 9-to-5 respectability. Sullum concedes that they may not necessarily be representative, but his larger point is correct: The insistence that drugs lead inevitably to a squalid destiny is difficult to reconcile with the millions of former or current drug users who have passed through neither prison nor the Betty Ford. As Sullum points out, "excess is the exception," a claim buttressed by the fact that there are millions of former drug users.

Typically, drug consumption peaks just when it would be expected—high school, college, or shortly thereafter. Then most people grow out of it. The experience begins to pall and the demands of work and family mean that there's no time, or desire, to linger with the lotus-eaters. Others no longer want to run the risks of punishment or stigma associated with an illegal habit. Deterrence does—sometimes—deter, and it may deter some of those who would not be able to combine a routine existence with recreational drug use. But this is not an argument that Sullum is prepared to accept: He counters that the potentially vulnerable population is small and may well become alcoholics anyway, "thereby exposing themselves to more serious health risks than if they had taken up, say, heroin." Sullum is not, we are again reminded, an author who is afraid of controversy.

But is he too blithe about the degree of potential medical problems associated with drug use? As he shows (occasionally amusingly and often devastatingly), much of the "evidence" against drug use has been bunk, little more than crude scaremongering frequently infected with racial, sexual, or moralistic panic; but it doesn't follow that all the dangers are imaginary. To be sure, he does acknowledge some other health hazards associated with drugs; but he can sometimes be disconcertingly relaxed about some of the real risks.

His discussion of LSD is a case in point. The causal relationship between LSD and schizophrenia is complex (and muddled by the fact that both schizophrenics and schizotypal individuals are more likely to be attracted to drugs in the first place), but it's not too unfair to describe an acid trip as a chemically induced psychotic episode. The "heightened sense of reality" often recorded by LSD users is, in fact, exactly the opposite—a blurring of the real with the unreal that is also a hallmark of schizophrenia. Throw in acid's ability to generate the occasional—and utterly unpredictable—"flashback" and, even if many of the horror stories are no more than folklore, it's difficult to feel much enthusiasm for legalizing LSD except, just perhaps, under carefully controlled therapeutic conditions.

What's more, as a substance that, even in small doses, will create a prolonged delusional state, LSD is not exactly the poster pill for responsible drug use. But this exception should not distract us from the overall strength of Sullum's case. It is possible, he writes, to "control" drug consumption "without prohibition. Drug users themselves show that it is." It's unnecessary for him to add that the abolition of prohibition would imply a relearning of the virtue of self-control, a quality long imperiled by the soft tyranny of the nanny state.

For Sullum is not advocating a descent into Dionysian frenzy. The poverty of "Just Say No" may be obvious, he writes, "but moving beyond abstinence does not mean plunging into excess. Without abstaining from food, it is possible to condemn gluttony as sinful, self-destructive, or both . . . Viewing intoxication as a basic human impulse is the beginning of moral judgment, not the end. It brings us into the territory of temperance"—a word Sullum uses, accurately, to mean moderation. The 19th-century anti-alcohol campaigners who hijacked it were as cavalier with vocabulary as they were with science.

Proponents of legalization will, naturally, say yes to this book, but their opponents should read it too. Sullum's arguments deserve a response from those who disagree with him. As he points out, the costs of the war on drugs far exceed the billions of dollars of direct expenditure. They also include "violence, official corruption, disrespect for the law, diversion of law-enforcement resources, years wasted in prison by drug offenders who are not predatory criminals, thefts that would not occur if drugs were more affordable, erosion of privacy rights and other civil liberties, and deaths from tainted drugs, unexpectedly high doses, and unsanitary injection practices." Under these circumstances, it's up to the drug warriors to come up with a convincing explanation as to why we are fighting their drug war. Judging by this well-written, persuasive, and important book, they are unlikely to succeed.

Keepers Without Peace

Frederick Fleitz: Peacekeeping Fiascoes of the 1990s: Causes, Solutions, and U.S. Interests

With his good intentions and his blue helmet, the U.N. peacekeeper was an icon of post-World War II internationalism. He was G.I. Joe for the Eleanor Roosevelt set, muscular assurance that the days of the feeble League of Nations would never return. And for a while it seemed to work. The record was far from perfect, but from Cyprus to West New Guinea to Namibia, the presence of relatively small numbers of U.N. troops was sufficient to separate warring forces and supervise the return to peace. The key to their success was evenhandedness and the consent of those whom they had come to police.

In the wake of the Gulf War and the breakup of the Soviet Union, this comparatively restrained approach to peace-keeping underwent a transmutation. The shambles that ensued is neatly summarized in this book’s delightfully blunt title. The author, Frederick Fleitz Jr., knows his material well: He is a former CIA analyst who covered the U.N. and its peacekeeping efforts during parts of the Reagan, George H. W. Bush, and Clinton administrations. Today, he is special assistant to the undersecretary of state for arms control and international security, though readers are warned that his opinions "do not necessarily represent the views of the Department of State, the Central Intelligence Agency, or the U.S. government." But what Fleitz has to say makes a great deal of sense, so we must hope that warning is not to be taken literally.

The real starting point for this book is the Soviet collapse, which made it possible for the West to intervene more aggressively in some of the world's most dangerous trouble spots. Fleitz's central thesis is that U.S. policymakers threw this opportunity away: Instead of building on the Cold War victory with a foreign policy that combined the judicious use of force with enlightened national interest, the government decided to expand the United Nations' global role in peacekeeping. The Clinton administration's poorly thought-out liberal-internationalist agenda combined sanctimony, parsimony, and ineffectiveness in roughly equal measure. The consequences were bad for the U.N., in that they made a mockery of belief in that organization's potential usefulness, and often disastrous for the U.S. There is a good reason that this book is dedicated to the U.S. Army Rangers and aircrew killed in Somalia in the terrible events of October 1993.

The rot began in the immediate aftermath of the Gulf War. As Fleitz explains, supporters of a more activist U.N. "seized on the fact that Operation Desert Storm was authorized by the U.N. Security Council" as proof that a new era had arrived. The U.N.'s role in approving the Gulf War was said by many liberals to herald "an end to the unilateral use of military force, at least by the United States." But as Fleitz correctly observes, "these claims ... ignored the reality that the first Bush administration used the U.N. endorsement... largely as a fig leaf to protect the sensitivities of America's Middle East allies."

These claims may have ignored reality, but they helped create a climate in which U.N. peacekeeping could be transformed. The scope of peacekeeping operations became more ambitious and the traditional requirements of consent and impartiality were abandoned. U.N. forces could now be empowered to impose "peace" on warring parties and, if necessary, take sides in a conflict. Fleitz argues that this more aggressive definition of peacekeeping (and the expansion of the U.N.'s role it implied) fitted in well with a liberal foreign-policy agenda in Washington. "It represented a way to implement . . . dreams of Wilsonian internationalism while drastically cutting defense spending." Beyond that, it is not necessary to hear the whirring of black helicopters to recall, as Fleitz does, that this was also a time when some foreign-policy gurus who were to be influential in the Clinton administration were "talking about how the new world order meant the lowering of national boundaries . . . and the beginning of a slow movement toward world government." It's also worth noting (although Fleitz never does so explicitly) that arguments for a more activist United Nations were always likely to find favor in a Clinton White House instinctively suspicious of the U.S. military and its use as an instrument of American power.

Much of the rest of the book is devoted to an examination of how these expanded notions of peacekeeping have worked or, far too frequently, failed to work. With topics that include Rwanda, Cambodia, Liberia, and Bosnia, this makes for grim but never sensationalist reading: Despite its title, this book is not an exercise in simple U.N.-bashing, satisfying though that would doubtless be. Fleitz is, quite justifiably, highly critical of the U.N., but he is also quick to acknowledge the way the organization has all too often been used as a scapegoat for feckless Western policymaking. And just as the book's narrative is not sensationalist, neither is its style: The text is often highly detailed (this book will be found on the bookshelves of our more sensible universities for years to come) and brutally burdened by the fact that U.N. military operations are rich in acronyms if not in achievements.

Above all, Fleitz stresses that these fiascoes were nothing if not predictable. With the precondition of consent abandoned, U.N. peacekeepers ran the risk of being seen as an occupying or hostile force, even when the motives for their mission were primarily humanitarian. The umpires had become players. Despite that, the troops sent in to do the dirty work were often as under-equipped as their objectives were ill-defined. In the course of this book, the author offers up various reasons as to why this was, but touches only briefly on one of the most likely explanations: the fact that the U.N. has been used by Western elites to pursue an internationalist agenda that ordinarily would not secure domestic political approval in their home countries. Using the United Nations to this end is a clever trick, but it ensures that peacekeeping missions will almost always be shortchanged when it comes to resources; proper funding would require politicians to admit the full scope of these operations to their electorates. And voters are rarely enthused by the idea of endangering their soldiers in the name of the United Nations.

This absence of democratic accountability—and the level of blame it should bear for foreign-policy disasters—would make an ideal topic for Fleitz's next book. In the meantime, Fleitz offers some highly practical advice: Continue to use U.N. peacekeepers, but only along the lines of the traditional, limited model that used to work so well. Combine a return to that more modest approach with the adoption by Washington of a realistic foreign policy in which bien pensant internationalism is discarded, American interests are put first, and the isolationist temptation is avoided, and the results could be impressive.

It won't be easy, but an intelligent foreign policy never is.

Basic Instinct

Joseph Epstein: Snobbery - The American Version

American Outlook, September 1, 2002

The Englishman said to me, “Oh, you are writing for an American magazine.” The eyebrow arched, the lip curled, the cliché was confirmed over a smugly sipped cup of tea. English snobbery, again. To the rest of the world, it is our defining vice (full disclosure: I’m also from the scepter’d isle), something as English as military defeat is French. Fair enough: mine is a country obsessed by class. Only in England could a humorous essay (published in the 1950s by one of the Mitfords, naturally) on the distinctions between the language (“U”) of the upper classes and that spoken by everyone else (“Non-U”) become a national obsession. Lavatory was “U,” toilet was (and, some would say, still is) a social catastrophe. Of course, such refinement should be no surprise in a nation with a sense of class so acute that, only a few years ago, it was usually possible to tell a man’s social origins by his socks (ideally dark blue or black, calf-length, and never, ever patterned).

But if snobbery is our vice, it isn’t ours alone. England’s trick was to market its snobbery as the best in the world, and then to put it to work. In this, if nothing else, Britain succeeded brilliantly. In his Ornamentalism: How the British Saw Their Empire, historian David Cannadine makes the case that the British colonizers often co-opted the “native” social hierarchy (medals all ’round!) into their own in order to assist in the preservation of colonial rule. As any reader of Rudyard Kipling’s Kim will know, class did not always trump race, but as a prop (in both senses) of the glittering imperial structure, it certainly played its part. Even today snobbery remains a useful weapon in London’s diplomatic arsenal, most notably in the awarding of knighthoods to the occasional friendly foreigner. Step forward, “Sir” Norman Schwarzkopf.

Snobbery, then, is not confined to those damp islands off the northwestern coast of Europe. In his entertaining new book, Snobbery: The American Version, author and Northwestern University lecturer Joseph Epstein gives credit where credit is due (“the English are more practiced in snobbery than any other people”), but chooses not to linger too long in Albion. The main focus of his book is snootiness on the western side of the pond, “its perplexities and its perils, its complications and not least its comedy.” On a more serious note (this is, after all, a book by an American academic), he aims to examine “whether snobbery is a constituent part of human nature or instead an aberration brought about by any particular social conditions.” He succeeds admirably in the analysis of the first part of his objective, stumbles over the second, and has problems too with a third, no less important question: what exactly is a snob?

That last difficulty puts Epstein in good company. In his 1848 collection, The Book of Snobs, Thackeray complains that although “the word snob has taken a place in our honest English vocabulary,” it can’t be defined. “We can’t say what it is, any more than we can define wit or humor or humbug; but we know what it is.” Epstein has a similar problem. His notion of the “essence of snobbery” (“arranging to make yourself superior at the expense of other people”) seems to miss the point. Ray Kroc, no snob icon but the man who made McDonald’s what it is today, reportedly said that if he saw a competitor drowning, he would put a live fire hose in his mouth. Superiority is often achieved at the expense of someone else. Such leapfrogging has taken our species from mud huts to the moon. But how superior is that superiority? Epstein writes that “snobbery often entails taking a petty, superficial, or irrelevant distinction and running with it.” He’s right, and if anything is the essence of snobbery, that would be it. Some of his examples, however, are strangely unpersuasive.

Contrary to what Epstein suggests, the driver of a BMW 740i is indeed quite entitled to feel “quietly, assuredly better than the poor vulgarian in his garish Cadillac.” As is acknowledged elsewhere in this book, good taste is not the same as snobbery. Equally, whatever Epstein may think, the parent of a daughter “studying art history at Harvard” need not be ashamed of the “calm pleasure” with which he greets the news that the child of an acquaintance is able to manage only a major in photojournalism at Arizona State University. That parent has, in all probability, earned that moment of satisfaction. The snob is not distinguished from the man of taste by his ability and willingness to discern the difference between a Beamer and a Caddy but by the use he makes of that discernment. Coming to the conclusion that Harvard is better than ASU is not necessarily the mark of the snob: treating an ASU graduate worse, merely because of where he went to college, most surely is.

These lapses into a dismaying (and, one hopes, insincere) egalitarianism are the exception rather than the rule in this book. Epstein soon finds himself on safer ground. Like Thackeray (a comparison that he would, doubtless, accept with “calm pleasure”), Epstein is rather better at identifying snobs than at analyzing snobbery. From a vantage point of somewhat tweedy, curmudgeonly disdain, he offers his readers an enjoyably vicious introduction to the different types of American snob. They are presented as a ludicrous and absurd spectacle, lampooned with a vim and biliousness that is all too rare in an era wherein there is no offense greater than giving offense. Among Epstein’s victims are Susan Sontag (“when young, a knockout American woman who did a fairly decent impression of a European intellectual”), PC “virtucrats” (“What makes the virtucrat a snob is that not only is he smug about the righteousness of his views, but he imputes bad faith to anyone who doesn’t share them. Upon this imputed bad faith he erects his own superiority.”), Gore Vidal (“Self-love, which in him never goes unrequited, is sufficient for this remarkably confident snob.”), and foodies (“When did my dentist begin using the word pasta?”).

Epstein appears to concede that he himself may be something of a snob, but it would be wrong to dismiss his tastes (there are, for example, touches of PBS, academe, and the hair shirt in his rather ostentatious lack of interest in material gain) as routine examples of intellectual snobbery. As he explains elsewhere in the book:

High standards, far from being snobbish, are required to maintain decency in life. When the people who value these things are called snobs, the word is usually being used in a purely sour-grapes way. Elitist is almost invariably another sour-grapes word, at least when used to denigrate people who insist on a high standard. The distinction is that the elitist desires the best; the snob wants other people to think he has, or is associated with, the best. Delight in excellence is easily confused with snobbery by the ignorant.

Quite. The mere fact that he is so obviously comfortable using a shockingly abrasive word like ignorant tells the reader all he needs to know about Joseph Epstein.

Epstein is even prepared to risk being labeled snobbish about snobbery with his suggestion that American snobbery has itself gone down in the world. In a key chapter (“O WASP, Where Is Thy Sting-a-Ling?”), he chronicles how America’s old White Anglo-Saxon Protestant elite walked away from power (and, as he notes in a brilliant, brutal aside, “came away disliked, diminished, maybe even a little despised for having done so”), leaving snobbery unanchored, “setting it afloat if not aloft, to alight on objects other than those connected exclusively with social class,” including, presumably, pasta.

But that’s an exaggeration. Class sensibility was no longer so rooted in ethnicity or tradition as in the past, but, as Paul Fussell showed in his book Class (1983), it was flourishing well into the Reagan era. It continues to do so today, but, so far as snobs are concerned, class has lost much of its glitter. The years of fluid hierarchy and social change have taken their toll. Old notions of caste no longer suffice for truly effective one-upmanship. In response, snobs did what they had to. They evolved.

As snobbery is such a basic instinct, this was only to be expected. Yet, despite the fact that the force and existence of such an instinct explains much of what Epstein describes, he seems curiously unwilling to accept it. In an attempt designed, presumably, to satisfy his objective of seeing whether snobbery can be linked to “particular social conditions,” Epstein asserts that “snobbery as we know it today, [the] snobbery meant to shore up one’s own sense of importance and to make others sorely feel their insignificance” was rarely seen before the nineteenth century. The reason for its expansion, he argues, was the spread of democracy. By unsettling a previously fixed social order, democracy increased the level of insecurity within society. Epstein quotes H. L. Mencken’s observation that, socially speaking, the American is on a perpetually icy slope, wanting to climb “a notch or two,” but “with no wall of caste to protect him if he slips.” As an ersatz class system, snobbery could assist in the struggle to survive within a society that had become suddenly, and frighteningly, competitive.

It is an ingenious theory, but it fails. Snobbery, and its simpering handmaiden, deference, could be witnessed long before the emergence of mass democracy. Epstein need have no doubt that it is, indeed, “a constituent part of human nature.” Let’s take one example. “Novelists,” writes Epstein, “are our keenest sociologists,” and there were none keener than Jane Austen. At the time she was writing, the ballot box was yet to cast much of a shadow over England’s country gentry, and yet her novels are filled with snobbish tension and social unease. And that’s only natural. People have always understood that no social order can be guaranteed to endure forever. Our species has emerged through millennia of turmoil, conflict, disaster, and war, and the lesson it has drawn has been simple: there is never, ever a bad time to be jockeying for position.

If there’s one person who knows about jockeying for position, it is a snob. On its face, Epstein’s comment that “there is something deeply antisocial about the snob” seems puzzling. There is, on the contrary, no one more social. Lacking the talent to succeed on his own merits, the snob is forced to manipulate social convention in such a way as to ensure that he achieves that all-too-necessary commodity, status. Epstein’s complaint, however, is subtler: it is not the snob who is antisocial, but his methods. The snob, he grumbles, “is, in a profound sense, in business for himself,” to which the obvious retort is, “Who isn’t?” Where snobbery can be said to be antisocial is in the misdirection of effort and ability that it implies; but like it or not, its existence is inevitable in any functioning society: a successful organism will always attract parasites.

It is difficult to avoid the feeling that Epstein’s disapproval of his snooty subjects colors his other main theme: that snobs have no fun. His description of the miseries of the snob’s life is bleak indeed. Epstein contends that the snob has only one standard, “that of comparison,” and that this approach to life can bring no “lengthy contentment” because “comparison inevitably implies competition.” There’s something to this; the snob’s self-esteem may be unusually susceptible to the opinions of others. But this is only a question of degree: almost all of us worry about how we are seen by the outside world. Besides, what’s the problem with competition? Epstein’s notion that competition is automatically an ordeal is a view that I suspect (perhaps snobbishly) only an academic could hold. Competition can be agony (check out the scene in Bret Easton Ellis’s repulsive but perceptive novel American Psycho, in which various Wall Street types compare the quality of their business cards), but it can also be ecstasy (Ray Kroc again). It depends on the nature not of the game (which can be snobbish or not), but of the individual who is playing it.

The truth is that, disapproving of snobbery as he does, Epstein desperately wants to believe that snobs must, by definition, be unhappy. In this he is doomed to be disappointed. Like all primates, we are social animals, and therefore status in itself—deserved or not—can be a source of profound satisfaction. The rewards from the superficial can run very, very deep.

It’s not “fair,” of course, but so far as snobs are concerned, that’s just the point.

The Good Russian

Richard Lourie: Sakharov - A Biography

National Review, August 12, 2002

Sakharov's Grave, Vostryakovskoye Cemetery, Moscow, 1991 © Andrew Stuttaford

It takes more than a Bolshevik to erase history. Lenin intended his revolution to be a clean break with the unruly, uncontrollable past, but, in the end, he failed. Remnants of the older—and, for all its faults, more humane—Russia succeeded in enduring through three-quarters of a century of Communist brutality. Andrei Sakharov, the subject of this new biography by Richard Lourie, may have been born in the formative years of the Soviet dystopia, but he is best seen as a child of the earlier, finer civilization that the revolution had been designed to destroy. Miraculously, he too managed to survive.

More than that, he was even—for a while—to flourish within the Soviet system. The regime knew how to promote talent as well as to punish it. Although Sakharov was never a party member, his scientific ability was enough to bring him into the inner circles of the Soviet establishment. It was his moral strength, however, that was to take him out again. It turned out that the enormously gifted scientist, an explorer of the impossibly complex, was to find fulfillment in his dedication to some very basic truths. Sakharov, the man who gave the Kremlin the H-bomb, became a champion of human rights and—in a delightful irony—an architect of the Soviet collapse.

It was an extraordinary journey, and any attempt to make sense of it must begin with an understanding of the Russian intelligentsia into which Sakharov was born—a group, as Lourie puts it, that is "something between a class and a clan." Its members were, and are, "educated people whose sense of honor and duty compels them to take action against injustice." But, as Lourie also notes, "Lenin and some of the other Bolsheviks [also] were of the intelligentsia, its crude and jagged cutting edge. And there were also spiritual extremists." Indeed there were. Those true believers still shouting Stalin's praise at the very moment his executioners gunned them down were no less representative of the intelligentsia than were those gentle, thoughtful folk found in Turgenev or Chekhov.

What these people had in common was the idea that it was they who should set (and live up to) the standards necessary to build a better Russia. They saw themselves as intellectually and morally superior both to the dangerous and benighted masses below and the crude and despotic rulers above. They believed that they were the nation's true elite, elevated and yet oppressed. Theirs was a state of mind prone to lethal naivete and Utopian fantasy, to dreams of a finer, purer way of life that were to pave the way for the Bolshevik nightmare.

That Sakharov inherited this utopianism can be seen from his "Reflections," the 1968 essay that marked his definitive break with the Communist regime. It was an extraordinarily brave attack on totalitarianism, strangely skewed by a lingering attachment both to collectivism and dopily enthusiastic futurism. Science fiction is blended with Stalinist mega-project ("Gigantic fertilizer factories and irrigation systems using atomic power will be built... gigantic factories will produce synthetic amino acids"). As Lourie notes, Sakharov at that time still had hopes of a worldwide socialist paradise, to be achieved by technological advance, heavy taxation, and "convergence" between "democratic socialism" and "the leftist reformist wing of the bourgeoisie."

If this dreamlike world view was one aspect of Sakharov's fidelity to the traditions of the Russian intelligentsia, so too was his dedication to his work and the notion that he could somehow do something for the greater good. These are demanding standards to maintain in the best of times. Trying to live up to them in the moral slum that was the mid-20th-century Soviet Union was to lead Sakharov to a life of barely comprehensible contradictions. So, in the late 1940s, we find the future winner of the Nobel Peace Prize busily designing weapons of mass destruction, an apparently decent man conscientiously putting his talent for murderous innovation at the disposal of a regime already responsible for the deaths of millions the old-fashioned way.

Loyalty to his country (enhanced by memories of its huge wartime losses) was partly to blame, as were the shreds of belief in a Soviet future (the letter that Sakharov wrote to his first wife on the occasion of Stalin's death makes for nauseating reading). Ignorance, certainly, offered no alibi. Sakharov knew. The facility where he worked was built by slave labor. He wrote later that he saw them everywhere—"long lines of men in quilted jackets, guard dogs at their heels"—but it did not stop him doing his best for the government that had imprisoned them.

Then something changed. This loyal servant of the Soviet state began asking awkward questions. And when he didn't get the answers he wanted, Sakharov did what very few dared do. He persisted—and it is the great weakness of Lourie's book that it never really explains why. Superficially, the story is straightforward, and so is the way that Lourie tells it. Increasing concern over the dangers posed by the atmospheric testing of his nuclear devices led Sakharov to urge restraint. He was told, none too kindly, to keep his thoughts to himself and to get back to work, but he continued with his complaints, embarking on a voyage that would take him from privilege to protest, through gradual alienation to outright dissidence, internal exile, and, ultimately, triumph.

To be fair to Lourie, pinning down what drove Sakharov may be a hopeless task. This most public of dissidents was a private, reserved man. Aged about 50, he claimed to have only one close friend (a friend who subsequently let him down in a characteristically squalid, characteristically Soviet way); it is easy to detect a similar pattern of emotional distance in Sakharov's first marriage.

With Sakharov, however, there is always that capacity for surprise. Whatever the shortcomings in their relationship, he fell apart when his first wife died. A little later this quiet, dry, slightly prudish introvert found himself drawn to the lively, abrasive, and demanding Elena Bonner. Understandably enough, their partnership (they subsequently married) is often (and Lourie's book is no exception) discussed in a primarily political context, but it was, clearly, much, much more than that. This was a great romance, a grand, gorgeous late-flowering love affair that carried all before it, a light in the midst of totalitarian darkness, a bastion of integrity in a state that had none.

But those looking for the source of Sakharov's anti-Soviet struggle need to look further than Elena Bonner. She accelerated the process and made it more bearable for the beleaguered physicist (two against an empire is better than one), but this was a question of speed, not destination. By the time the pair first met, it was 1970—and Sakharov was already in irrevocable opposition.

The key to the puzzle must lie elsewhere. Readers of Lourie's book are given enough clues to draw some conclusions of their own. It is necessary to look again at the influence of what Sakharov once referred to as the intelligentsia's "inherited humanist values." Add those values to a demanding family tradition, courage, and a certain innate goodness, and we start to understand why Sakharov began asking those awkward questions, both of his government and of himself. And once he had begun, there could be no going back. Dedicated scientist that he was, Sakharov could not rest until he had arrived at the solution, no matter the cost.

This quest ought, one day, to be at the core of a more substantial biography. In the meantime, Lourie's book will do, not least because the stories it tells do give a good measure of the man that Sakharov became. Here's a wonderful example dating from the late 1970s (1978 according to Lourie; Sakharov in his Memoirs places it two years earlier). Bonner and Sakharov had been shown photographs of a dissident exiled to Nyurbachan, a settlement in a remote part of Siberia. Troubled by the look on the exile's face (that was all it took), they decided to visit him.

On the way to the airport, their taxi was rammed. Undaunted, they took another. The first leg of their journey brought them within 400 miles of their objective, but the next flight was "unexpectedly" delayed by 24 hours. They camped out at the terminal, and took the plane the next day. On landing, they were told that the bus to Nyurbachan had been canceled. There were still 15 miles to go. The secret police were obviously watching their every move. Lourie tells us what this indomitable duo, no longer young, no longer in good health, then did.

"Though it was getting dark, Sakharov and Bonner decided to walk . . . The forest path was moonlit, the air fresh, a Siberia of stars above the trees. They stopped for bread and cheese, sipping coffee from a thermos . . . All the KGB's machinations had only afforded them hours of happiness."

And, yes, they reached their destination.

Hollow Laughter

Martin Amis: Koba the Dread - Laughter and the Twenty Million

National Review Online, July 16, 2002

Stalin, Moscow, 1997 © Andrew Stuttaford

Back in the time of the revolution he was described as a gray blur, and it is as a gray blur that Stalin survives today, a nullity, a gap in our memory, an absence. In the lands of his old empire, they remember more, far, far more. The absence there is absent fathers, absent mothers, absent grandparents, absent uncles, absent aunts, absences in the millions, all victims of the monster who remains, remarkably, still present in Red Square (there's a small bust at his burial site by the Kremlin's walls and usually someone takes the trouble to leave a flower or two). In our ignorant, spared West, the West that never knew him, not really, we catch only glimpses of what we think he was. The images are caught on fading, flickering newsreel, a friend from the greatest of America's wars, FDR's pal, smiling benignly out, hooded eyes beneath a peaked cap, good old Uncle Joe.

In his new book, Koba the Dread: Laughter and the Twenty Million, the British novelist Martin Amis makes an attempt to fill this gap. It is a curious, compelling but more than occasionally self-indulgent work, a meditation that uneasily combines snatches of its writer's autobiography with tales of the Soviet holocaust.

The tone too seems just slightly off. Amis has long been known as a master of the acid one-liner, but it jars to read his snide reminiscence of the trivial (attendance at Tony Blair's dreary millennium celebrations) within a few pages of this extract from a letter written by the elderly Soviet theater director Vsevolod Meyerhold after his arrest and torture by the secret police:

I was made to lie face down and then beaten on the soles of my feet and my spine with a rubber strap…For the next few days, when those parts of my legs were covered with extensive internal hemorrhaging, they again beat the red-blue-and-yellow bruises with the strap and the pain was so intense that it felt as if boiling water was being poured on these sensitive areas. I howled and wept from the pain…Lying face down on the floor, I discovered that I could wriggle, twist and squeal like a dog when its master whips it.

Meyerhold was shot three weeks later. He managed, at least, to outlive his wife. She was found murdered in their apartment a few days after his arrest. Reportedly, her eyes had been cut out.

And so yes, London's Millennium Dome may, indeed, have resembled a "second-rate German airport," but, in the context of such horror, so what?

It's not just the tone and the awkward snippets of autobiography. Martin Amis's style, mannered, arch and self-consciously clever, also seems out of place, an all too elegant frame for such a crude and bloody canvas. We read of the "fantastic sordor" of the Gulag's slave ships, and that Stalin's "superbity" was "omnivorous." When told of the Wehrmacht's initial successes on the Eastern front, the Soviet dictator apparently "collapsed as a regnant presence." The baroque vocabulary acts as a barrier between the reader and the events that it is being used to describe. It may also signify the emotional distance that Amis himself feels from the Soviet tragedy. Good writer that he is, he understands "why Solzhenitsyn needs his expletives, his italics, his exclamation marks, his thrashing sarcasm," but rarely seems to feel such a compulsion himself.

What Amis does offer is a brief, and competent, introduction to the Stalin years, drawing both on recently published research and, very obviously, a long acquaintanceship with Robert Conquest, the finest English-language historian of Stalinist terror, who happens also to be an old friend of the Amis family. Tics of style and tone apart, the tale is well told, and clearly benefits from the skills of an accomplished and insightful writer. We learn, for instance, that Stalin failed to show up for his mother's funeral, a decision that "scandalized the remains of Georgian public opinion." The insertion of those three bleak words, "the remains of," tells the reader all that he or she needs to know about Stalin's impact on his native land.

Similarly, in describing the catastrophe of collectivization Amis manages in a few short lines both to summarize the onrush of disaster and to speculate what that might say about the differing personalities of Lenin and Stalin. Faced by peasant resistance, "Lenin accepted defeat, withdrawal and compromise. In other words, he accepted reality. Stalin did not. The peasantry no longer faced a frigid intellectual. It faced a passionate lowbrow whose personality was warping and crackling in the heat of power. He would not accept reality. He would break it." The result was a death toll that ran into the millions and, in Amis's vivid phrase, "swaying, howling lines" in front of the few food stores with anything to sell.

It is a hideous story, and Martin Amis should be thanked for retelling it. In forgetting those who were murdered, it is as if we kill them again, and yet with Stalin's dead that is just what the world seems content to do. As many as seven million died in the genocidal Soviet famine of the early 1930s, yet in most histories it usually merits no more than a footnote. Walter Duranty, the New York Times correspondent who tried to deny the famine's existence, earned a Pulitzer for his "reporting" in Moscow, a prize that the "paper of record" still includes on its roll of honor.

As for the other slaughtered millions (Amis believes that Stalin was responsible for a total of at least 20 million deaths — and there are other, much higher, estimates), their fate is often passed over in silence or with the most insultingly cursory of regrets. Almost no one has ever been held accountable. There has never been a Soviet Nuremberg. Solzhenitsyn has calculated that between 1945 and 1966 West Germany convicted some 86,000 people of crimes committed on behalf of the Nazis. The number of those found guilty of similar atrocities on behalf of the Communist Party in the former Soviet Union is unlikely — even now — to run into triple digits. In the countries of the former USSR, however, there is at least an argument (albeit misguided) for inaction: it is said that the long duration of Soviet rule manufactured too many accomplices to permit — yet — a full examination of the past in societies where democracy remains fragile.

In the West there is no such excuse, yet, when Stalin is discussed at all, the tone is often strangely sympathetic, and the tally of victims is frequently subjected to downward revisions on a scale that would embarrass even David Irving. Where Koba The Dread fails, and fails most completely, is in trying to explain why. As a first step, Amis looks again at the old question of whether Hitler's crimes were "worse" than those of Stalin (Conquest, interestingly, believes that they were, but can give no reason other than the fact that he "feels" so), but this controversy is, forgive the phrase, a red herring. Any moral distinction between these two bestial systems is so slight as to be irrelevant, and yet our response to them is strikingly different. In contemporary discourse, the Nazis are totems of wickedness, while Communism (despite accounting for far greater slaughter, a slaughter that still continues) is somehow seen as not so very bad.

As a shorthand for these perversely different responses to two very similar evils, Amis records how, at a debate featuring the two Hitchens brothers (Christopher and Peter), Christopher Hitchens (quoted elsewhere in Koba as, astonishingly, still believing that Lenin was a "great" man) referred to evenings passed in the company of his "old comrades," a remark greeted with affectionate laughter. It is the laughter referred to in the title of Amis's book, and it is laughter that would be inconceivable as a reaction to a light-hearted reference to happy days with the fascists.

As Amis (who admits to laughing himself) concedes, "this isn't right." To explain that laughter, he turns, unconvincingly, to the elements of black farce that were never absent from Communist rule (but which were, he crucially neglects to say, equally present under the Nazis), and then, more believably, "to the laughter of universal fondness for that old, old idea about the perfect society, [which] is also the laughter of forgetting. It forgets the demonic energy unconsciously embedded in that hope. It forgets the Twenty Million."

And in that one word "unconsciously," Martin Amis gets it all wrong. Murder, turmoil, and repression were always explicit in that "old, old idea," and they play no small part in its appeal. Glance, just for a second, at Lenin's writings and you will be amazed by the morbid love of violence that permeates his prose. The "Just City" of Marxism's dreams always came with a concentration camp. The Bolsheviks had the genius to understand this. Their intellectual descendants know enough to try and cover it up: thus the silence about Stalin, thus that disgusting laughter.

Martin Amis's achievement is that, in writing this odd, flawed book, he has done something to help ensure that it is we — and not Stalin's heirs — who will have the last laugh.

Sunday School for Atheists

National Review, March 25, 2002

his-dark-materials-covers-829675.jpg

The His Dark Materials trilogy, by Philip Pullman, was, some said, the moment that literature for the young finally came of age. On January 22, Philip Pullman, a children's writer (although he objects to that label), was awarded Britain's prestigious Whitbread prize for the final installment of his best-selling His Dark Materials trilogy. In the opinion of the judges, The Amber Spyglass was Britain's book of the year. It was an unprecedented honor for a work aimed at younger readers, but Pullman is a man who must be getting used to praise, and not just in Britain. His writing has been described as "very grand indeed" in the New York Times, while reviews in the Washington Post have included adoring references to the "moral complexity" and "extravagant . . . wonders" to be found in Pullman's work.

There can, indeed, be little doubt that the first book in the trilogy, The Golden Compass, is a masterpiece, a sparkling addition to the canon of great children's fiction that leaves poor Harry Potter helplessly stranded in the comparative banality of his Platform 9¾. Within the time it takes to read his first few, skillfully drawn pages, Pullman takes us into a beguiling parallel universe. His spikily endearing heroine, 11-year-old Lyra, lives in an England that is a curious blend of the Edwardian and the modern. It is a place where the boundaries between what we would think of as the natural and the supernatural are blurred, no more distinct than the fraying edges of the alternate realities that Pullman describes so well. In Lyra's world every person has a daemon: a companion in animal form, part soul, part familiar spirit. There are witches in Lapland, and the most feared warriors in the North are a rampaging race of armor-clad bears, ursine Klingons who have fallen into decadence under the rule of a corrupt and vicious usurper.

In constructing this captivating, fascinating fantasy, Pullman has pillaged the past and looted from legend. He is a magpie of myth, an author whose work borrows from saga, folklore, and some delightfully obscure parts of the historical record, and, oh yes, he can write.

Lyra raised her eyes and had to wipe them with the inside of her wrist, for she was so cold that tears were blurring them. When she could see clearly, she gasped at the sight of the sky. The Aurora had faded to a pallid trembling glimmer, but the stars were as bright as diamonds, and across the great dark diamond-scattered vault, hundreds upon hundreds of tiny black shapes were flying out of the east and south toward the north. "Are they birds?" she said. "They are witches," said the bear.

That literature of this, well, literacy is being written for the young (Pullman's target audience begins at around 11, Lyra's age) is wonderful. And finding a large market for it in this grunting, ineloquent era is little short of a miracle. More than a million copies of Pullman's books have been sold in the U.S., and the same again in his native Britain.

Their author, however, would be a little uneasy to hear the use of that word "miracle." For he is, alas, a man with a message, and by the end of the trilogy the message has drowned out the magic. Narrative thrust is abandoned in favor of a hectoring, pontificating preachiness, which has itself probably played no small part in the rise of Pullmania among the chattering classes on both sides of the Atlantic.

Pullman, you see, is a man with an apse to grind. He hates the Church, and he hates it with a passion. This is an unusual fixation for someone from the scepter'd isle; most of the English are rather relaxed about religion, tending to lack strong views about the matter one way or the other. Our predominant faith is a benign, "play nice" agnosticism, vaguely rooted in the Anglican tradition. Metaphysical debate is as foreign to us English people as a sunny day in November.

Philip Pullman is made of more strident stuff. He wants, he once told the Washington Post, "to undermine the basis of Christian belief." This is an immodest ambition even for a winner of the Whitbread prize, and the rationale behind it seems crude, no more sophisticated than that of the high-school heretic, and gratingly simplistic from such a clever writer. The history of the Christian Church is, Pullman intones, a "record of terrible infamy and cruelty and persecution and tyranny." True, to an extent; but the full story is a little more complex than that. It is no surprise to discover that C. S. Lewis is a particular bogeyman: Pullman claims to hate the Narnia books "with deep and bitter passion." Among other offenses, Lewis apparently celebrated "racism [and] misogyny"—a choice of thought crimes that reveals the supposedly skeptical Mr. Pullman as a loyal follower of a very orthodox form of political correctness (the inquisitorial piety of our own time). PC's dismal spoor can be found throughout his books, a spot of class hatred here, a little global warming there.

And, above all, there is his omnipresent attitudinizing vis-a-vis religion. It's not so much the role of a wicked Church that is the problem (malevolent clergymen with twisted creeds are nothing new in fiction), but the tiresome little lectures that come with it. So, for example, in The Subtle Knife a speech attacking the sinister Church of Lyra's world becomes an attack on all churches everywhere: "Every church is the same: control, destroy, obliterate every good feeling." There is plenty more of the same, crude, nagging, and bombastic, its form objectionable, whatever one might think of the content. In writing his tales of Narnia, C. S. Lewis may also have been a man on a mission, but at least he had enough respect for his readers to prefer allegory and parable to assertion and propaganda. It is worth remembering that, compared with Pullman, Lewis was writing for a much younger audience, children of an age at which it is quite possible to read and reread the Narnia adventures and miss most or even all of the Christian references; aged eight or nine, I did. Nevertheless, Lewis was content to leave his message oblique; Pullman never allows his readers such freedom.

Despite these concerns, the second book, The Subtle Knife, remains imaginative and alluring if less startlingly original than its predecessor, and still able to survive increasing amounts of its author's pedestrian philosophizing. By the end of The Subtle Knife, however, it is becoming painfully apparent that Pullman's overall theme (basically, a variation on Paradise Lost) is unlikely ever to soar, a devastating weakness in a work that, like many epics, is structured as a quest. The Amber Spyglass, the allegedly grand finale of the series, is intended to bring resolution, but it is difficult to care. The object of Lyra's quest remains (at best) obscure and (at worst) highly pretentious, an unholy grail that simply does not engage the imagination.

When I asked 11-year-old Holly, the daughter of some friends, what she thought of these books, she said that they were "well-written." The story itself didn't quite catch her attention.

Dust is to blame; The Amber Spyglass is a book in which, despite some sporadically spectacular passages, any real sense of excitement is, quite literally, ground into Dust. Scattered over page after wearying page, this endlessly discussed "Dust" is the substance that represents consciousness in Pullman's universe, but it runs the risk of inducing unconsciousness in his youthful and, doubtless, exhausted readership.

And there is, unfortunately, no escaping it. For there is Dust to be found in every nook and cranny of this wordy, wordy, wordy culmination of Pullman's three-volume morality play, which is, at its core, nothing less than an assault on the notion of Original Sin. In the end, the assault takes very literal form: After a battle that rather uneasily combines elements of Star Wars with the Book of Revelation, God (or, at least, an entity who is clearly meant to be the Christian God) is overthrown, the underworld is liberated, and a "Republic of Heaven" is proclaimed.

The true nature of this apparently marvelous republic is never made clear. It may be the materialist heaven on earth, but there are also hints that it could be the New Age's goblin-infested alternative, that empty-headed, shallowly superstitious zone where everything, and nothing, is sacred. It makes for a somewhat frustrating conclusion to this very frustrating trilogy, a flawed, fascinating creation of great promise that is eventually brought down by its tendency to go too far—much like naughty old Adam himself, as Philip Pullman would never say.

Diana, Again

National Review Online, October 6, 2001

Diana.jpg

There is, let's admit it, something grimly satisfying about having a prejudice confirmed. So, if you are one of those people who believe that there is absolutely nothing more to say about Charles and Di, Christopher Andersen's new work, Diana's Boys, is the book for you. Once again weary readers are presented with the same shop-soiled menagerie (mean queen, pained prince, plain Camilla, horrible Hewitt, foolish Fergie, loveable Tiggy, playboy Dodi), the same exhausted anecdotes (hysteria at Highgrove, bulimia in the palace, Charles' confession of adultery, Diana's TV interview, the rudeness at Harry's birth), and, above all, that same doomed, fascinating heroine, bewitching and manipulative, a Sibyl in Chanel, with her bewildering, ever-shifting personality leading all those around her to ruin and to despair.

We know how her story will end, of course. We are told again about those last tragic hours in Paris, that speedy departure from the Ritz and the disaster in a tunnel, hours that will be particularly familiar to fans of Mr. Andersen, in that he had already discussed them at some length in an earlier bestseller, The Day Diana Died. Now, Mr. Andersen, the author of two books about Katharine Hepburn, three volumes about the Kennedys, and two works about Princess Diana, is clearly a man who is not too worried about reworking a profitable subject. It is best, however, if such a return to the mother lode can be justified by the claim that something fresh is being discovered. The kindly Ms. Hepburn has, most obligingly for her biographers, been very long-lived, leaving plenty of room for the two, doubtless distinct, efforts by Mr. Andersen, Young Kate and The Remarkable Love Story of Katharine Hepburn and Spencer Tracy. The Kennedys enjoyed far less staying power than the formidable actress, but, in their case Mr. Andersen could, presumably, reduce the risk of repeating himself by moving across, then down, the former First Family tree. He followed Jack and Jackie with Jackie after Jack, and then, in a confirmation of his mortuary franchise, he gave us The Day John Died.

In Diana's case, however, going back to the celebrity seam was not so straightforward. The inconveniently dead princess lacked Ms. Hepburn's powers of survival. A "Young Diana" was all there ever was, and all there ever would be. There were no long decades, just a few short years filled with incident, almost all of which Mr. Andersen had already chronicled. The Kennedy alternative, harvesting the family tree, was also tricky in the case of the gloomy royals. Compared with JFK, the poor princess lacked a sellable surviving spouse. Who, other than Camilla, would go for Charles after Diana?

That only left the sons, William and Harry, in Diana's words, her "one splendid achievement", and so they appear to be. But as camouflage for an opportunistic retelling of the Spencer story, her offspring prove hopelessly inadequate. This is hardly their fault. They may, in the words of Mr. Andersen's publisher, be "the world's two most celebrated royals" (eat your heart out, Elizabeth), but they simply have not done enough to carry a biography. This would be true of almost any teenager. Diana's children are no exception, as a quick glance at this book's index reveals.

Entries under "William, Prince" include "backside pinching of … e-mail romances of … formality disliked by … Harry dangled from window by." Take away the story of their parents, and the Windsor princelings' lives are the stuff of trivia. While that is not a bad level for Mr. Andersen's writing style ("Finally, the Princess of Wales leaned forward to see what the boys found so riveting: steamy photos of the buxom Barbi twins, Playboy centerfold models Shane and Sia"), he is astute enough to know that, when it comes to book sales, his best hopes still lie with Diana. So, much of what we get is a tired rehash of a failed marriage and a tragic death, with, on occasion, the only variety coming, quite literally, from the pagination.

On page 43 of Diana's Boys, for instance, we can read that "William's mother indulged in an orgy of self-mutilation. At various times, Diana slashed her wrist with a razor, stabbed herself in the chest with a pocketknife, cut herself with the jagged edge of a lemon peeler, and hurled herself against a glass display case, shattering it." This is a drama that may be familiar to admirers of page 49 of The Day Diana Died, where readers are told that "in an orgy of self-mutilation, at various times Diana slashed her wrist with a razor, stabbed herself in the chest with a pocketknife, cut herself with the jagged edge of a lemon peeler, and hurled herself against a glass display case, shattering it."

The only difference between these two accounts lies in the description of their protagonist. In Diana's Boys the lemon-peeler-wielding princess is, in keeping with the theme of a book allegedly focused on her sons, described as "William's mother," rather than just the "Diana" used in the earlier text.

To be fair, there are some revelations (at least to this Brit) in the more recent book. I was, for example, unaware of the fact that, in an unorthodox variant of the curt handshake generally preferred by the English upper classes, one socialite allegedly prefers to greet Prince William by putting her hand down the front of his trousers. For the most part, however, even those parts of Diana's Boys that relate specifically to the children cover fairly familiar ground, if in ever more excruciating detail. In The Day Diana Died, Mr. Andersen tells us that William once "tried to flush his father's shoe down the toilet", while in Diana's Boys, we learn that they were "four-hundred-dollar" shoes.

More excruciating for William, should he ever look at this book, will be the speculation about his love life, speculation helpfully illustrated by an inspired selection of photographs that manages to include seductive pictures of no fewer than three cuties whose names (Tara Palmer-Tomkinson, Emma Parker Bowles, Davina Duckworth-Chad) seem more substantial than the outfits that they are wearing. For the time being, however, both the young princes seem remarkably well balanced given what they have been through, but it is difficult to read Mr. Andersen's book without wondering whether Diana's boys are destined to share some of the bleaker aspects of their parents' fate.

For, while the source of many of Charles and Diana's problems lay in their own personalities (well summarized in Sally Bedell Smith's Diana in Search of Herself, psychobabble-heavy, but nevertheless the best single account of the whole miserable saga), other factors were also very much to blame. In particular, the royal couple had to contend with the challenge of living in a country that no longer knew what it wanted from its monarchy. Like their predecessors, the prince and princess were public figures, but the public had changed. To their cost, Charles and Diana were to discover that the old deference was dead, taking with it the stuffily comfortable etiquette that once cocooned the inhabitants of Buckingham Palace. It had been replaced by a relentlessly intrusive tabloid-driven agenda that mixed class resentment and prurience with the curiously old-fashioned notion that the Royal Family should set some sort of example, although no one seemed to be able to agree on what that example should be.

It is worth remembering that when, in the bawling, mawkish week that followed Diana's death, the formerly vilified princess was being sanctified for allegedly being able to show her true feelings, the Queen was at the same time coming under attack in the press ("Show us you care") for failing to fake hers. What Fleet Street wanted from Her Majesty ("Speak to us Ma'am — Your people are suffering") was a blubbering expression of regret for a former daughter-in-law she clearly no longer really cared for.

Poor William ("the heir") and, to a lesser extent, Harry ("the spare") face a lifetime of trying to satisfy the conflicting, unclear, and capricious demands of such scrutiny, of which Mr. Andersen's book is an early, and relatively harmless, example.

No wonder William is said to doubt whether he wants to be king.

David Horowitz: Hating Whitey and Other Progressive Causes

National Review, May 22, 2000

david-horowitz.jpg

WAS there a David Horowitz in Bosnia, a Cassandra warning of the cataclysm to come? For most ethnic conflicts are fairly predictable, and it's not too difficult to identify who is going to start them. The underlying message of this collection of essays is that race relations in this country too are being deliberately poisoned, with potentially disastrous results. The culprits are a grubby group of demagogues and ideological hucksters, given their opportunity by the development of identity politics. It is worth reading what Horowitz has to say. After all, he was once a prominent '60s radical, a "progressive" pur et dur. Now, thankfully, he's a conservative (of sorts), but he still writes like an old-fashioned left-wing polemicist. His prose is splendidly savage and invigoratingly rude. David Horowitz has a message to deliver, and if he offends someone in the process, that's just too bad.

This is an angry book, and with good reason. The "progressive causes" related by the author are full of bullying, career destruction, race baiting, rape, and murder. We may giggle about political correctness, but it is, as Horowitz explains, no less than "the stuff that totalitarian dreams are made of." As a former Leninist, he understands how the Left plays the game and the tactics it uses.

The most worrying of these is the manipulation of ethnic antagonism. Today's diversity politics have often been reduced to little more than the "expression of racial paranoia." The consequences could be terrifying. For as Horowitz warns, "by projecting their fear and aggression onto those around them, paranoids create enemies too."

Sure sounds like Bosnia to me.