Wednesday, November 3, 2010

Nation-Building
By RICHARD BROOKHISER

On Sept. 17, 1787, the convention that had been sitting in Philadelphia for four months to design a new form of government for the United States adjourned, offering its handiwork to the nation. Almost a year later, on Sept. 13, 1788, Congress declared that the Constitution had been duly ratified, and prescribed the rules for the first presidential election the following year. Pauline Maier’s delightful and engrossing book shows how America got from the first date to the second — and ultimately to today, since we still live with the same document, however modified.

One caveat: To like this book, you have to like politics. “Ratification” is an ur-text of the Almanac of American Politics. It has process, issues, arguments, local context, major players, minor players — and hoopla. “The popular excitement” that attended the struggle, Maier writes, “reminded me at times of Americans’ obsession with the final games of the World Series, but with greater intensity because everyone understood that the results would last far longer than a season. . . . Politics was in a real sense the first national game.”

The United States’ first form of government was the Articles of Confederation, which established a unicameral Congress, dominated by the state legislatures that picked its members (there was no national executive or judiciary). The framers of the Constitution thought they were replacing a system that was fractious, irresponsible and broken. But many Americans thought the framers had botched the job. The Constitution as it left Philadelphia had no Bill of Rights. Two powers of the new bicameral Congress seemed sinister: levying direct taxes (Article I, Sections 8 and 9), and controlling “the Times, Places and Manner” of electing members of Congress (Article I, Section 4). Congress would not know what ordinary Americans could pay, and it might manipulate elections for its own benefit. Many opponents of the Constitution wanted religious tests that would keep pagans, deists, “Mahometans” and “Popish priests” from holding office. Quakers thought the Constitution upheld slavery; some slave owners feared that it might threaten it.

Article VII said that the Constitution was to be ratified by state conventions, and that 9 of 13 states would be enough to do the job. Supporters of the Constitution, who called themselves Federalists, won four states between December 1787 and early January 1788 as conventions in Delaware, New Jersey and Georgia approved it unanimously, and Pennsylvania — large, populous and diverse — also voted yea. But the Pennsylvania victory was almost Pyrrhic. Local Federalists bullied their way to a win; the State Legislature voted to summon a ratifying convention only because two members who opposed the Constitution were hustled to the Statehouse to make a quorum. The outraged losers — called Antifederalists by their opponents — told their story to the nation, stiffening resistance elsewhere.

Connecticut joined the Federalist column in January 1788, setting up the next great struggle, in Massachusetts — another populous state, with a long tradition of town meetings. Massachusetts Federalists realized they had to fight this battle “with cordiality.” This convention went through the text paragraph by paragraph, and Federalists agreed for the first time to recommend possible amendments. They earned a slim but honorable victory.

After a check — New Hampshire’s convention adjourned without voting — and wins in Maryland and South Carolina, the Constitution faced three decisive tests: Virginia, the largest, proudest state; New York, midsize in population at the time but centrally located; and a revote in New Hampshire. By then, with eight states in the bag, expectations favored the Federalists. In the end, all three states voted yea. But there was no final burst of unanimity: North Carolina would not sign on until November 1789, and Rhode Island held out until May 1790.

Maier, a professor of American history at M.I.T., depicts all the famous founders that one would expect to find. George Washington was a Greek chorus at Mount Vernon, with one eye on his crops and the other on events, knowing that if the United States was to have a president, he would have a new job. Tireless James Madison and a sometimes tiring Alexander Hamilton labored for the Federalist cause. Patrick Henry was the gaudiest of the Antifederalists. At the climax of the Virginia ratifying convention, he declaimed that “intelligent beings which inhabit the aetherial mansions” were watching the contest. As if on cue, a sudden thunderstorm rocked the building.

Maier also gives us founders, famous in their day, whom we have forgotten. James Wilson of Pennsylvania made a learned case for the Constitution; when an Anti caught him in an error concerning the history of juries, he quoted an English lawyer: “Young man, I have forgotten more law than ever you learned.” John Jay lacked the pyrotechnic skills of his ally Hamilton, but his offstage courtesy won more votes at New York’s hard-fought convention.

But Maier’s subtitle directs our attention to the people, and she devotes time to Americans who stepped onto history’s stage only for this drama. At several conventions important speeches were made by back-benchers who sat quietly through most of the proceedings, then gave their judgment. Jonathan Smith of western Massachusetts said he knew “the worth of good government by the want of it.” There was “a time to sow and a time to reap,” and if the Constitution were not ratified now, “we shall never have another opportunity.”

Maier also covers the media campaign for and against. America’s newspapers were lopsidedly Federalist, but New York’s Antis ran “the only prominent organized group that worked across state lines,” disseminating critical essays and trying to coordinate strategy. The war of words enlisted a woman, Mercy Otis Warren, a Massachusetts bluestocking, who compared the Constitution to “shackles on our own necks.” Her fellow Antis found her writing “too sublime & florid.” The level of popular interest in contested states was high. One New York newspaper wrote that “almost every man is now a politician, and can judge for himself.”

Politics is a contact sport, and Maier shows a lot of rough stuff. Federalists in some states hired convention secretaries whose minutes mutilated the speeches of Antis, or suppressed them altogether. Newspaper editors who ran political stories that readers didn’t like were threatened with canceled ads and subscriptions. Mobs in Philadelphia, Albany and New York City trashed buildings and roughed up the un-likeminded. Happily, no one was killed.

Both sides, Maier believes, won something. The Constitution prevailed, but the spirited resistance encouraged the First Congress to propose the amendments now known as the Bill of Rights. Maier does not lard her conclusion with Big Thoughts, so let me rush in. The ratification process was a tribute to what Nathan Dane of Massachusetts, a reluctant convert to the Constitution, called “the attention of this intelligent people.” Elites who disdain or ignore their fellow citizens come to grief. Witness the mess of the European Union, made and run by Brussels wire-pullers. Americans who tut-tut about our political process sometimes have a point — we can always do better — but sometimes they go too far. The process was not that different in 1787-88, and we did all right.

Arms and the Man
By MAX BOOT

THE GUN

By C. J. Chivers

Illustrated. 481 pp. Simon & Schuster. $28

The Soviet Union did not often make a product superior to the American alternative. Nobody in his right mind would have chosen to drive a pokey Lada over a speedy Corvette or even a stolid Packard. One of the few exceptions was in the military sphere, where the Soviets devoted a disproportionate share of their resources. In the long run, their greatest engineering triumph may not have been the construction of an atomic bomb in 1949 (based on stolen Western secrets) or the lofting of the first satellite into orbit (1957) or even the first man in space (1961). Far more enduring, if more low tech, was the development of the world’s most ubiquitous firearm: the AK-47, or as it is often called, after its designer, the Kalashnikov.

Western experts initially dismissed this automatic rifle as crude and simplistic. As the New York Times correspondent (and former Marine) C. J. Chivers explains in “The Gun,” Westerners were used to making guns with “precision tools that allowed assembly lines to work within tight tolerances and mill parts to an exacting fit.” That’s not how the AK-47 was constructed in Russia’s primitive assembly plants by workers who no doubt consumed more vodka than was good for them. “Anyone who removed the return spring from a Kalashnikov, for example, would find that many parts, when not held by its tension, would slide and rattle,” Chivers notes. Even on a test range the AK-47 was not particularly impressive, its accuracy inferior to that of Western competitors.

What made the Kalashnikov the winner in a global arms race that has been going on for more than 60 years was how it performed in the field. The very fact that its parts were “loose fitting, rather than snug” meant that it was “less likely to jam when dirty, inadequately lubricated or clogged with carbon from heavy firing.” “It was so reliable,” Chivers writes, that even when it was “soaked in bog water and coated with sand” its Soviet testers “had trouble making it jam.”

Not only was the AK-47 utterly reliable in the kind of adverse conditions that soldiers encounter in battle, but it was also easy to operate. Its simplicity meant, Chivers writes, that it could be employed by “the small-statured, the mechanically disinclined, the dimwitted and the untrained.” Practically anyone, even child soldiers, could use this compact marvel, less than three feet long and weighing about 10 pounds, which “could push out blistering fire for the lengths of two or three football fields.”

Thus the AK-47 emerged as the Model T of assault rifles. With as many as 100 million copies in circulation (no one knows the exact figure), it is the best-selling gun of all time. The distant runner-up is the M-16 and its descendants, which have been reproduced fewer than 10 million times. Chivers explains how this unusual success for Soviet industry came about. The 47 in the gun’s name refers to the year it was invented — 1947. The AK stands for Avtomat Kalashnikova — “the automatic by Kalashnikov.” That would be Senior Sgt. Mikhail Timofeyevich Kalashnikov, who in 1947 was just 28 years old and had no formal training in metallurgy, engineering or any other technical discipline. As might be expected, the Soviet state built a formidable myth around this proletarian hero.

Among the inconvenient facts that were airbrushed out was that during the Stalinist Terror the Kalashnikov family had been stigmatized as “kulaks” (rich peasants), forcibly removed from their home (which was razed) and relocated to western Siberia. One of Mikhail’s brothers served years of forced labor; his father quickly succumbed to the harsh Siberian winter. Mikhail was one of eight children (out of 18) who survived childhood. He was drafted into the Red Army and, during World War II, was wounded while commanding a T-34 tank.

While in the hospital he began reading an encyclopedia of firearms and doodling his own gun designs, which resulted in the creation of a submachine gun prototype. This prototype was rejected by the Red Army, as were his other early brainstorms, but Kalashnikov succeeded in getting assigned to a weapons design bureau. In 1947 his team entered a contest to develop an automatic rifle for the Red Army — a weapon more portable than a machine gun but with a longer range than a submachine gun.

This would represent a major advance over the bolt-action rifles, like the British Lee-Enfield or the German Mauser, that were common in World War II. The United States Army had already fielded a semiautomatic rifle known as the M-1 (Patton called it “the greatest battle implement ever devised”), but it still required a trigger pull for every shot, and its magazine held only eight rounds. By contrast, the AK-47 could empty a 30-round magazine in seconds.

Exactly how the winning design was created remains murky, but contrary to Soviet propaganda, it is clear that Kalashnikov got plenty of help — not only from other Russian konstruktors but (more embarrassing) from a captured German arms designer, Hugo Schmeisser, who during World War II had created an early assault rifle (the Sturmgewehr) that bore an uncanny resemblance to what became the AK-47. But even though the AK-47 was the product of considerable collaboration, it was Kalashnikov who got the glory. He was twice named a Hero of Socialist Labor and acquired sufficient riches to buy a refrigerator, vacuum cleaner and automobile — all scarce commodities in postwar Russia. Eventually he would become a lieutenant general and a world-famous symbol of the Soviet arms industry.

The AK-47 and various knockoffs would be constructed not only in the Soviet Union but also in China, North Korea, East Germany, Egypt and numerous other countries that set up their own production lines with Soviet help. This proliferation would in time make the AK-47 the emblem of terrorists and guerrillas or, if you prefer, “freedom fighters”; it would even appear on the flags of Mozambique and a number of terrorist groups. Hezbollah’s flag incorporates an image of an assault rifle that may or may not be the AK-47.

The romance of the AK-47 was well justified by its battlefield performance. In Vietnam, for example, it initially gave the Viet Cong a considerable advantage over American troops armed with M-16s, which had been rushed into production and were notorious for jamming in the heat of battle. Eventually the M-16’s design flaws were corrected, but it would never catch up in the global popularity contest with its Eastern bloc rival.

Although Chivers’s book is ostensibly devoted to the AK-47, he actually gives considerable space to tracing the development of other weapons — not only the M-16 but also predecessors like the Gatling, Maxim and Tommy guns. This helps put the AK-47 into context but also causes the book to lose some narrative momentum.

Chivers is a first-rate war correspondent and a prodigious researcher who has tracked down every relevant document (or so it appears). He even interviewed the aging Kalashnikov. He is less adept as a writer, and “The Gun” is full of infelicitous phrases. I am still struggling to figure out the meaning of this sentence: “Whenever an idea organizes for battle it gathers around its guns.” There are also too many repetitions and distracting detours, like the lengthy story of a Kurdish bodyguard who was badly wounded by an AK-47. But these are only quibbles. “The Gun” is likely to become the standard account of the world’s standard assault rifle.

In the Beginning
By LOUISA THOMAS

ADAM & EVE

By Sena Jeter Naslund

339 pp. William Morrow/HarperCollins Publishers. $26.99

Lucy Bergmann sits in front of a little fire in the South of France, thinking of Wordsworth, the Bedouins and the ancient document — an alternative rendering of the Book of Genesis — that her friend Pierre has just read aloud. She is about to express her admiration for pantheism when a gang of intruders bursts into the room, one brandishing a gun. The reader, desperate to be spared Lucy’s musings on religion, thinks: not a moment too soon. “I suppose you’ve come for the manuscript?” Pierre asks the men. “And for your pendant, Lucy,” one of them demands. Ah, yes, the pendant — a USB flash drive that Lucy wears around her neck, containing evidence, discovered by her late husband, of extraterrestrial life.

Sena Jeter Naslund’s new novel, “Adam & Eve,” set a few years in the future, dramatizes the conflict between these “demons of literalism,” as Lucy calls them, and those who have a more aesthetic understanding of human experience, one that is alternately gnostic, scientific, historical and intuitive.

The villains — two Christians, a Muslim and a Jew — are members of an ultraconservative cult called Perpetuity, determined to suppress anything that would shake the foundations of fundamentalism. “The enemies of this codex are those who would shroud the past, our origins, our art, our sacred poetry, with their ignorance,” Pierre, an anthropologist, says to Lucy at one point. But Lucy and her friends are hardly inspiring defenders of a more enlightened view. Lucy in particular is clueless and grating, the kind of woman who manages to crash-land a plane in an Edenic pocket of Mesopotamia and then obsesses over the highly nutritious quality of the produce. Trained as an art therapist, she treats Adam, the handsome American soldier whom she encounters in the oasis and who nurses her back to health, with clinical condescension — except when she finds him quite sexy. Adam is damaged and delusional — he thinks Lucy is Eve, sent by God to be his mate — but he’s sensitive and perceptive, and he notices “something stunted” about Lucy. She had fallen in love with her astrophysicist husband, Thom, when she was 18 and he was 41, and the relationship in some ways arrested and isolated her. (It’s no surprise to learn that her husband might not have been as perfect as he seemed.) To her, though, marriage was “paradise.” Paradise is lost when her husband is crushed — possibly murdered — by a falling grand piano.

After violence enters Lucy and Adam’s idyll, they leave it, determined to deliver the codex that Lucy was smuggling to Pierre in France. The first people they bump into in the wilderness, naturally, are part of Perpetuity.

Naslund, whose 1999 novel “Ahab’s Wife” reworked “Moby-Dick” from a similar feminist liberal Protestant point of view, has created fundamentalists who are cartoonishly scarier and yet weaker than the real thing. The idea that the Bible or Koran contains the actual word of God has managed to survive Copernicus, Darwin, higher criticism and countless minor challenges. Fundamentalists have certainly gone to great lengths to secure their singular vision, from editing high school textbooks to blowing up buildings, but it’s hard to take the members of Perpetuity or their mission seriously. Nor do the other characters seem really human. (Imagine a farmboy-turned-soldier from Idaho saying, “Throughout my youth, my difficult youth, the chief imperative was to escape the domination of my father.”)

Yet “Adam & Eve” is surprisingly affecting — if only because it’s so weird. A grand piano as murder weapon? A feral boy with a thing for eating hearts? For a catalog of the improbable, the novel has few matches — save, of course, the Bible.

Cold Case Files
By JOSHUA HAMMER

TRAVELS IN SIBERIA

By Ian Frazier

Illustrated. 529 pp. Farrar, Straus & Giroux. $30

Ian Frazier’s “Great Plains,” published in 1989, was a tour de force of travel writing: a 25,000-mile jaunt from the Dakotas to Texas that stripped away the region’s seemingly bland facade. From Sitting Bull to Bonnie and Clyde to the Clutter family, whose murder was chronicled in Truman Capote’s “In Cold Blood,” Frazier revisited American archetypes, and in some cases reinvented them. Later, in “On the Rez,” he drew on his 20-year friendship with Le War Lance, a beer-swilling Oglala Sioux, to describe life at the Pine Ridge Reservation in South Dakota. In both books, Frazier’s skillful storytelling, acute powers of observation and wry voice captured the soul of the American West.

Now Frazier has set his sights on another region of wide-open spaces and violent history: the Russian East. Shortly after the collapse of the Soviet Union, he joined some Russian artists he’d met in New York on a trip to Moscow, where he became infected, he writes, with “dread Russia-love.” In particular, Frazier was enthralled by Siberia, that vast, forbidding region that stretches across eight time zones, running from the Ural Mountains to the Pacific Ocean, bordered by Mongolia and China to the south and the Arctic Ocean to the north. Frazier learned Russian, immersed himself in the literature and history of the territory, and embarked on more journeys across the taiga and tundra. The result is “Travels in Siberia,” an uproarious, sometimes dark yarn filled with dubious meals, broken-down vehicles, abandoned slave-labor camps and ubiquitous statues of Lenin — “On the Road” meets “The Gulag Archipelago.”

“For most people, Siberia is not the place itself but a figure of speech,” Frazier remarks at the beginning of the book: a metaphor for cold, remoteness and exile. (The Russian word Sibir derives from two Turkic words roughly translated as “marshy wilderness.”) Turning metaphor into reality, Frazier made the first of several exploratory trips via Nome, the Alaskan port a hundred miles south of the Arctic Circle — and a short hop from Siberia across the Bering Strait. Arriving in Alaska during the post-Soviet diplomatic thaw, Frazier found a flurry of unlikely activity: intrepid Christian missionaries planning snowmobile expeditions across the frozen sea, and an eccentric entrepreneur, the sole member of the Interhemispheric Bering Strait Tunnel and Railroad Group, dreaming of building a 72-mile-long Chunnel across the strait.

Touching down in an airport near the Siberian city of Provideniya, Frazier was instantly enraptured by the aromas of “the tea bags, the cucumber peels, the wet cement, the chilly air, the currant jam. . . . The smell of America says, ‘Come in and buy.’ The smell of Russia says, ‘Ladies and gentlemen: Russia!’ ”

Eventually, the Russian scent enticed him back on a far more ambitious adventure: a trans-Siberian journey in a used Renault delivery van. Accompanied by a pair of raffish guides, Sergei and Volodya, Frazier set out from St. Petersburg and traveled east. He forded giant rivers, waded through piles of trash, overnighted in mosquito-plagued campgrounds and met scientists, poets, scuba divers, sales ladies and many, many others whom fate had tossed to the far end of the Russian frontier. The Renault broke down repeatedly, beginning on Day 1, when “the speedometer needle, which had been fluttering spasmodically, suddenly lay down on the left side of the dial and never moved again.” The two guides came to exemplify a very Russian mix of unreliability and resourcefulness, gregariousness and gloom — miraculously repairing the dying van, then disappearing to party all night with the locals.

In his many visits, Frazier experienced Siberia’s highs and lows. In Tobolsk, the former capital, where Christian knights defeated the Muslim khan in the late 16th century and put Siberia under the control of the czars, he gazed admiringly at the kreml, a medieval walled city. Perched on a promontory at the confluence of the Tobol and Irtysh Rivers, it “rises skyward like the fabled crossroads of Asiatic caravan traffic that it used to be.” On the other hand, the modern industrial city of Omsk, a symbol of Siberian desolation in the post-Soviet era, is little more than “crumbling high-rise apartment buildings, tall roadside weeds, smoky traffic and blowing dust.”

As he demonstrated in “Great Plains,” Frazier is the most amiable of obsessives. From his first encounter with Russian authority — a tense face-off with a boyish-looking border guard at Sheremetyevo Airport in Moscow — he peels away Russia’s stolid veneer to reveal the quirkiness and humanity beneath. The staring contest ends when the guard breaks into a big smile. “It was a kid’s grin,” Frazier concludes, “suggesting that we had only been playing a game, and I was now a point down.”

Frazier has the gumption and sense of wonder shared by every great travel writer, from Bruce Chatwin to Redmond O’Hanlon, as well as the ability to make us see how the most trivial or ephemeral detail is part of the essential texture of a place: the variety of TV antennas on Siberian rooftops, the giant bison skull in the paleontology museum of Irkutsk. Frazier never fully explains the nature of his “dread Russia-love,” though he clearly sees himself as the spiritual descendant of a long line of Russophiles. These include John Reed, the author of “Ten Days That Shook the World,” the classic account of the Russian Revolution, and George Kennan — not the diplomat but the 19th-century American adventurer of the same name, who followed the Siberian Trakt, “Russia’s great trans-Asian road,” along which goods and prisoners passed for centuries.

Frazier suggests that the country’s opaqueness has given it a twisted appeal. “Russia is older, crookeder, more obscure,” he writes, experiencing a “shiver of patriotism” on a flight back to the United States, just days after 9/11. He’s also fascinated by the role Siberia has played in the Russian psyche, recounting in bloody detail the exploits of the Golden Horde, the Mongol conquerors who rode out of the Asian steppe and reduced Kiev and other cities to smoldering ruins strewn with corpses. “Russia can be thought of as an abused country,” Frazier notes. “One has to make allowances for her because she was badly mistreated in her childhood by the Mongols.” The horror of that conquest, he observes, was enough to turn the attention of the czars to the East, and led to the gradual colonization of Siberia. Most of all, this region has served as a place of exile, an end-of-the-world dumping ground for everyone from petty criminals to visionaries and would-be reformers.

Mikhail Bakunin, the anarchist and revolutionary, was one of the few who succeeded in escaping from Siberia’s bleak prison camps, embarking on a lengthy transcontinental getaway in 1861 that eventually landed him in London, by way of Yokohama, San Francisco and New York. Dostoyevsky, sentenced to death for anti-czarist activities, was spared at the last minute and sent to serve a term of hard labor in Siberia. Anton Chekhov, who traveled to Sakhalin Island to examine the conditions of internal exile, described a prison cell with a “black crepe” of cockroaches on the walls. Frazier particularly rues the fate of the Decembrists, a group of idealistic young military officers whose exposure to democracy during the Napoleonic Wars inspired a doomed effort to reform the Russian autocracy. They were dispatched to Siberia, where, true to their reputation, they devised a system called artel — sharing stipends from home to ensure that no inmate was ever in need.

Ultimately, Frazier seems more interested in exploring Siberia’s past than contemplating its future. He barely flicks at its crucial role in gas and oil exploration, which is gradually making this vast territory the prime source of Russia’s wealth. And despite his attempts, he never managed to visit the reindeer herders of the far north, whose nomadic lives have come under threat from Gazprom, the Russian gas giant. However, Siberia’s richness in metaphor is enough to sustain this endlessly fascinating tale.

After a long drive across the frozen wastes of Lake Baikal, Frazier arrived at a long-abandoned prison camp near the town of Topolinoe. The camps along the Topolinskaya Highway were among the most dreaded destinations in Stalin’s gulag, the prison system that claimed the lives of more than a million people during the height of the Great Terror in 1937 and 1938. Frazier walked through one of the barracks where inmates starved and froze in the Siberian winter: “This interior offered little to think about besides the limitless periods of suffering that had been crossed off here, and the unquiet rest these bunks had held.” As always, Frazier locates the apt historical anecdote that captures the horror with precision. He tells the story of two child prisoners who were given a pair of guard-dog puppies to raise, then struggled to find names for them: “The poverty of their surroundings had stripped their imaginations bare. Finally they chose names from common objects they saw every day. They named one puppy Ladle and the other Pail.”


The State of Liberalism
By JONATHAN ALTER

It’s a sign of how poorly liberals market themselves and their ideas that the word “liberal” is still in disrepute despite the election of the most genuinely liberal president that the political culture of this country will probably allow. “Progressive” is now the self-description of choice for liberals, though it’s musty and evasive. The basic equation remains: virtually all Republican politicians call themselves conservative; few Democratic politicians call themselves liberal. Even retired Classic Coke liberals like Walter F. Mondale are skittish about their creed. “I never signed up for any ideology,” he writes in his memoirs.

That would be fine (people are sick of labels) if clarity weren’t such an obvious political advantage. Simple ideology routinely trounces nuanced pragmatism, just as emotion so often beats reason and the varsity fullback will most likely deck the captain of the debate team in a fistfight. For four decades, conservatives have used the word “liberal” as an epithet, while liberals have used “conservative” defensively (“I’m a little conservative on . . .”). And Fox fans range out of factual bounds (“death panels”) more than their NPR-listening counterparts in the liberal “reality-based community” (a term attributed to a Bush White House aide by the author Ron Suskind).

Liberals are also at a disadvantage because politics, at its essence, is about self-interest, an idea that at first glance seems more closely aligned with conservatism. To make their more complex case, liberals must convince a nation of individualists that enlightened self-interest requires mutual interest, and that the liberal project is better constructed for the demands of an increasingly interdependent world.

That challenge is made even harder because of a tactical split within liberalism itself. Think of it as a distinction between “action liberals” and “movement liberals.” Action liberals are policy-oriented pragmatists who use their heads to get something important done, even if their arid deal-making and Big Money connections often turn off the base. Movement liberals can sometimes specialize in logical arguments (e.g., Garry Wills), but they are more often dreamy idealists whose hearts and moral imagination can power the deepest social change (notably the women’s movement and the civil rights movement). They frequently overindulge in fine whines, appear naïve about political realities and prefer emotionally satisfying gestures to incremental but significant change. Many Democrats are an uneasy combination of realpolitik and “gesture politics,” which makes for a complicated approach toward governing.

As Senator Al Franken says of the Republicans: “Their bumper sticker . . . it’s one word: ‘No.’ . . . Our bumper sticker has — it’s just way too many words. And it says, ‘Continued on next bumper sticker.’ ”

Action liberalism has its modern roots in empiricism and the scientific method. Adam Smith was the original liberal. While “The Wealth of Nations” (1776) has long been the bible of laissez-faire conservatism, Smith’s first book, “The Theory of Moral Sentiments” (1759), pioneered liberal ideas of social and moral interdependence. By today’s standards, Abraham Lincoln’s support for large-scale government spending on infrastructure and appeals to “the better angels of our nature” would qualify him as a liberal. In the 20th century, progressives cleaned up and expanded government, trust-busted on behalf of what came to be known as “the public interest,” and experimented with different practical and heavily compromised ways of addressing the Great Depression.

The quintessential example of the pragmatic core of liberalism came in 1943, when President Franklin D. Roosevelt announced that “Dr. New Deal” had become “Dr. Win the War.” Roosevelt believed that the ends of liberalism — advancing democracy, expanding participation, protecting the environment and consumers (first promoted by a progressive Republican, Theodore Roosevelt), securing the vulnerable — were fixed, but that the timing and means of achieving them were highly negotiable, a distinction that often eludes modern liberals.

Whatever F.D.R.’s advantages over President Obama in communicating with the public, they share an unsentimental emphasis on what’s possible and what works. Both men, for instance, rejected the urgent pleas of some liberals to nationalize the banks and tacked toward their goals rather than standing ostentatiously on principle. Roosevelt was criticized by New Deal liberals in 1935 for allowing Congress to water down the Social Security bill before passage. Sound familiar?

Many movement liberals consider such concessions to be a sellout, just as they thought President Bill Clinton sold out by signing welfare reform in 1996. It’s important to criticize parts of Obama’s performance where merited — he didn’t use his leverage over banks when he had it — but some liberal writers have gone further, savaging his motives and integrity. Roger D. Hodge’s book is called THE MENDACITY OF HOPE: Barack Obama and the Betrayal of American Liberalism (Harper/HarperCollins, $25.99), as if Obama’s corporate fund-raising and failure to live up to the unrealistic expectations of purist liberals made him and his team puppets and liars. Hodge says the fact that Obama is “in most respects better” than George W. Bush or Sarah Palin is “completely beside the point.” Really? Since when did the tenets of liberalism demand that politics no longer be viewed as the art of the possible?

Hodge, formerly the editor of Harper’s Magazine, makes valid arguments about the failure of Democrats to undertake the essential liberal function of checking the excesses of capitalism. But the political scientists Jacob S. Hacker and Paul Pierson are closer to the mark in their important new book, WINNER-TAKE-ALL POLITICS: How Washington Made the Rich Richer — and Turned Its Back on the Middle Class (Simon & Schuster, $27). Without rationalizing specific policy choices, they describe the “paradox” Obama confronted on taking office when the country faced a genuine risk of another depression: “how to heal a fragile economy without simply reasserting the dominance of the forces that had brought that economy to the brink of ruin.” It’s the healing part — preventing another depression — that voters often forget in their understandable rage over bailouts, almost all of which, by the way, have already been paid back.

In making the broader case that the rich have essentially bought the country, Hacker and Pierson zero in on two killer statistics. Over the last three decades, the top 1 percent of the country has received 36 percent of all the gains in household incomes — 1 percent got more than a third of the upside. And the top one-tenth of 1 percent acquired much more of the nation’s increased wealth during those years than the bottom 60 percent did. That’s roughly 300,000 super-rich people with a bigger slice of the pie than 180 million Americans. The collapse of the American middle class and the huge transfer of wealth to the already wealthy is the biggest domestic story of our time and a proper focus of liberal energy.

Arianna Huffington wasn’t exaggerating when she entitled her latest book THIRD WORLD AMERICA: How Our Politicians Are Abandoning the Middle Class and Betraying the American Dream (Crown, $23.99). Poverty in the United States isn’t as bad as in the third world, but the disparity between rich and poor is far beyond that of other highly developed nations. While Huffington’s muscular tone fits the mood of today’s liberals, she insists on pivoting to the positive. After excoriating politicians, she cites innovative nonprofits that can help liberals feel less helpless.

The good news reported by Hacker and Pierson is that American wealth disparities — almost exactly as wide as in 1928 — are not the residue of globalization or technology or anything else beyond our control. There’s nothing inevitable about them. They’re the result of politics and policies, which tilted toward the rich beginning in the 1970s and can, with enough effort, be tilted back over time (emphasis added for impatient liberals). The primary authors of the shocking transfer of wealth are Republicans, whose claims to be operating from principle now lie in tatters. It doesn’t take feats of scholarship to prove that simultaneously supporting balanced budgets, status quo entitlement and defense spending, and huge tax cuts for the wealthy (the Republicans’ new plan) is mathematically impossible and intellectually bankrupt.

But of course Democrats, caught up for years in the wonders of the market, are complicit in the winner-take-all ethos. President Clinton and his Treasury secretary Robert E. Rubin played to the bond market, and many of their protégés later came to dominate the Obama administration. Hacker and Pierson call Rahm Emanuel types “Mark Hanna Democrats,” a reference to William McKinley’s campaign manager, who said: “There are two things that are important in politics. The first is money, and I can’t remember what the second one is.” Action liberals can explain that they opposed the ruinous 2001 Bush tax cuts and that their prodigious fund-raising is necessary to stay competitive, but large segments of their base are no longer buying it. They want a more bare-knuckle attack on Wall Street than Obama has so far offered.

But now the president is getting hit from both sides. In FORTUNES OF CHANGE: The Rise of the Liberal Rich and the Remaking of America (Wiley, $25.95), David Callahan points out that Obama raised more than John McCain in 8 of the 10 wealthiest ZIP codes in the United States. Callahan, the author of “The Cheating Culture,” notes that Hollywood money proves that rich donors don’t always push the parties to the right. It can also push the Democrats left on issues like the environment and gay rights. And yet in the months since he finished his book, many wealthy Obama supporters have grown disenchanted with what they see as the president’s “anti-business” language (he attacked “fat-cat bankers”). This was inevitable. “A benign plutocracy is still a plutocracy,” Callahan concludes. He quotes Louis Brandeis: “We can have democracy in this country, or we can have great wealth concentrated in the hands of a few, but we can’t have both.” That’s as relevant today to liberal thinking as it was when Brandeis said it, decades ago.

On social issues, liberals have mostly won, with the public backing them on abortion, gay rights and other live-and-let-live ideas. That doesn’t necessarily make liberals more libertine (in fact, divorce rates are higher in red states than in blue). But it floats them closer to history’s tide. The great hope for the future of liberalism lies in the changing demographics of the country. With younger voters and Hispanics moving sharply into the Democratic column, Republicans are in danger of being marginalized as an old, white, regional party. The Tea Party energy might be seen in retrospect as the last gasp of the “Ozzie and Harriet” order, with Obama as the scary face of a different-looking America. (Why else did Tea Partiers not seem to care over the last decade about President Bush’s profligate spending?) For now, of course, it’s conservatives who have the mojo, and not just because the economy is so bad. Despite historic advances in 2008, liberals remain better at complaining than organizing, which is a big reason they may take a shellacking in November.

A couple of new books recall the story about the civil rights and labor leader A. Philip Randolph, who was visiting F.D.R. to push for a policy. “Make me do it,” the president is said to have replied. Roosevelt meant that his visitors should go out and organize and demonstrate, not just expect him to wave a magic wand. Liberals have a tendency to think that when the “right” person wins, order has been restored. The idea of permanent trench warfare between liberals and conservatives is an abstraction to them rather than a call to arms. One reason health care reform stalled in the summer of 2009 was that Tea Party forces turned up en masse at town meetings in swing districts while liberals stayed home, convinced that after electing Obama they were free to go on Miller Time.

The enactment of the Patient Protection and Affordable Care Act of 2010 was a bad experience for certain movement liberals. If conservatives were mindless in describing as “socialism” what was essentially a plan pioneered by Bob Dole, Howard Baker and Mitt Romney, liberals seemed strangely incapable of taking yes for an answer after more than 70 years of trying to expand coverage. They were right about the value of a public option, but wrong to attack Obama for not obtaining it when the votes were never there in the Senate. Many Democrats were ignorant of all the good things in the legislation (partly the fault of White House mistakes in framing the message) and politically suicidal in echoing Howard Dean’s infamous cry of “Kill the bill!” By the end of the process, voters were revolted by the notorious “Cornhusker kickback” and other smelly deals. If making laws is akin to making sausage (you don’t really want to know what goes into it), the stench from Capitol Hill spoiled everyone’s appetite for the liberal meal.

But somewhere Ted Kennedy is smiling. To the list of revealing Kennedy books, add Burton Hersh’s EDWARD KENNEDY: An Intimate Biography (Counterpoint, $32). Hersh, a Harvard classmate of the future senator, ignores much of his Senate career but makes good use of sources going back six decades to paint a personal portrait. While Hersh’s uncontrolled freight-train prose is loaded with often extraneous details, he nonetheless brings many of the old stories alive again. Kennedy was both the heart and the tactical brains of late-20th-century liberalism, which won many small victories even as it fell out of fashion. Had he been vital in 2009 and able to work his charm across the aisle, senators in both parties agree, the health care debate would have been healthier.

In GETTING IT DONE: How Obama and Congress Finally Broke the Stalemate to Make Way for Health Care Reform (Thomas Dunne/St. Martin’s, $25.99), Tom Daschle, the former Senate majority leader, strips the color from that story in order to maintain his Washington relationships. But Daschle, forced by a tax problem to step down as Obama’s health care czar, has written, with David Nather, an exceptionally clear account of an exceptionally tangled piece of recent history. He’s especially good on why the credibility of Democrats depends on how skillfully they implement the bill over the next 10 years. Left unsaid is that Democrats in 2012 will face not just hostile Republicans favoring repeal but also cost controls on Medicare that will encourage conservatives to resume their pandering to the elderly, an approach long taken by liberals to retain power. Beyond the specifics of the bill, Daschle is obviously right that “health care has become a symbol of the deep divide in Americans’ feelings about the role government should play.”

In his Inaugural Address, Obama tried to define his view of that role when he said, “The question we ask today is not whether our government is too big or too small, but whether it works.” This is a sensible definition of modern liberalism but also a bloodless and incomplete one, evoking Michael Dukakis’s claim that the 1988 presidential election was about “competence, not ideology.” That definitional dispute had first flared four years earlier, when Gary Hart mounted a stiff challenge to Walter Mondale for the Democratic nomination. Both have new books out.

Hart’s eccentric contribution, THE THUNDER AND THE SUNSHINE: Four Seasons in a Burnished Life (Fulcrum, $25), will remind readers why he and the presidency would have been an awkward fit. His bitterness over the sex scandal that ended his political career in 1987 hasn’t fully ebbed. As always, he tries to aim higher: Condoleezza Rice is in the index, but Donna Rice isn’t.

The book contains a sustained and ponderous “Odyssey” metaphor, with one chapter opening, “As he rises from his stony perch above the harbor, Ulysses tells his mariners that he is prepared to sail beyond the sunset.” Hart began his career as a 1960s movement liberal, a seeker and intellectual (he received a doctorate from Oxford in 2001, at the age of 64) with Homeric aspirations. He ruminates well about some of the essential differences between the American political creeds. Conservatives, by nature more skeptical, “accept that life is just one damn thing after another, that we are on our own, and it is up to us to make the most of it. But for those with a sense of commonwealth and common good, the shattering of dreams and hopes is always viewed more tragically.”

True enough, but it raises the question of why attaching emotion to politics makes conservatives stronger but often weakens liberals. In the years since Presidents Kennedy and Johnson, something soft has wormed its way into the heart of liberalism, a diffidence about the cut-and-thrust of politics. Carville-style fisticuffs are satisfying, but they have not yet made liberalism a fighting faith again.

Mondale’s memoir, THE GOOD FIGHT: A Life in Liberal Politics (Scribner, $28), written with David Hage, is, not surprisingly, more conventional than Hart’s, but he comes to terms more squarely with the limits of liberalism. Looking back at his early days in the Senate in 1965, at the peak of Lyndon Johnson’s Great Society — what he calls the “high tide” of liberalism — Mondale says: “A lot of it was wonderful, overdue and much needed. But we also overstated what was possible.” He recalls the years in the wilderness, when “I wanted to talk about poverty and opportunity, but people wondered why I wanted to give away things for free.” Even in feeling vindicated by Obama’s election, he admits that “liberalism is still on trial.”

Especially when it comes to education. It’s encouraging that even a paleoliberal like Mondale now believes that “we should weed out teachers who are unsuited to the profession” and that teachers’ union rules “must have flexibility.” There’s a great struggle under way today within the Democratic Party between Obama and the reformers on one side and, on the other, hidebound adult interest groups (especially the National Education Association) that have until recently dominated the party. If liberalism is about practical problem solving, then establishing the high standards and accountability necessary to rescue a generation of poor minority youths and train the American work force of the future must move to the top of the progressive agenda. Education reform is emerging as the first important social movement of the 21st century, a perfect cause for a new generation of idealists.

Where education might offer grounds for cooperation with conservatives, foreign policy almost certainly will not. After a long period of favoring interventionism to fight fascism and Communism, liberals have been doves since Vietnam, even in a post-9/11 world. If Democrats retain control of the House, they will pressure Obama hard next year to begin withdrawing from Afghanistan as promised. Chalmers Johnson, a noted scholar of Japan, has in recent years made a point of explaining how the Afghan freedom fighters the C.I.A. supported in the 1980s, when they were fighting the Soviet Union, are now the Taliban and Qaeda forces trying to kill Americans. In DISMANTLING THE EMPIRE: America’s Last Best Hope (Metropolitan/Holt, $25), he argues for a complete reordering of the national security state to save not just lives but treasure. While Obama won’t go as far as Johnson urges, a big tussle between the White House and the Pentagon is likely next year, when we’ll learn if neoconservatives can once again convince the country that liberals are “unpatriotic.”

The answer to that question — and to the immediate fate of liberal ideas — depends largely on the performance of one man, the president. Jeffrey C. Alexander’s intriguing argument in THE PERFORMANCE OF POLITICS: Obama’s Victory and the Democratic Struggle for Power (Oxford University, $29.95), a meticulous review of the 2008 campaign, is that his fellow sociologists have overemphasized impersonal social forces at the expense of the theater of public life — the way politicians perform “symbolically.” It’s a prosaic call for a more poetic (or at least aesthetic) understanding of politics. Ideology must connect viscerally, or it doesn’t connect at all. Liberalism, like any idea or product, can succeed only if it sells.

The State of Conservatism
By CHRISTOPHER CALDWELL

Within the space of a week last summer, one judge in Arizona, ruling in a suit brought by the Obama administration, blocked a provision in a new state law permitting police officers to check the immigration status of suspected illegal immigrants, while another blocked the implementation of a California referendum banning gay marriage. The two decisions imposed liberal policies that public opinion opposed. These things happen, of course. Congress had acted contrary to measurable public opinion when it passed health care reform in March. What made the two judicial rulings different was that both seemed to challenge the principle that it is the people who have the last word on how they are governed.

American conservatives, most notably the activists who support various Tea Party groups, have a great variety of anxieties and grievances just now. But what unites them all, at least rhetorically, is the sense that something has gone wrong constitutionally, shutting them out of decisions that rightfully belong to them as citizens. This is why many talk about “taking our country back.”

If polls are to be believed, conservatives should have no difficulty taking the country back or doing whatever else they want with it. Gallup now counts 54 percent of likely voters as self-described conservatives and only 18 percent as liberals. More than half of Americans (55 percent) say they have grown more conservative in the past year, according to the pollsters Scott Rasmussen and Doug Schoen in their new book, MAD AS HELL: How the Tea Party Movement Is Fundamentally Remaking Our Two-Party System (Harper/HarperCollins, $27.99).

America’s self-described conservatives, however, have a problem: They lack a party. While the Tea Party may look like a stalking horse for Republicans, the two have been a bad fit. Insurgents have cut a swath through Republicans’ well-laid election plans. They helped oust Florida’s party chairman. They toppled the favored candidates of the party establishment in Alaska, Colorado, Delaware, Florida, Kentucky, New York, South Carolina, Utah and elsewhere.

More than 70 percent of Republicans embrace the Tea Party, but the feeling is not reciprocated. If conservatives could vote for the Tea Party as a party, they would prefer it to the Republicans, according to Rasmussen. (Lately, Rasmussen’s polling, more than others’, has favored Republicans. Not coincidentally, perhaps, it has picked up certain recent shifts earlier and more reliably — like the surge that won the Republican Scott Brown the late Ted Kennedy’s Massachusetts Senate seat in January.) Much of the Tea Party is made up of conservative-leaning independents. The journalist Jonathan Rauch has called these people “debranded Republicans,” and they are debranded for a reason — 55 percent of them oppose the Republican leadership. While Republicans are likely to reap all the benefit of Tea Party enthusiasm in November’s elections, this is a marriage of convenience. The influential conservative blogger Erick Erickson of RedState.com insists that one of his top goals is denying the Republican establishment credit for any electoral successes.

Hence the Republicans’ problem. After November, the party will need to reform in a conservative direction, in line with its base’s wishes, and without a clear idea of whether the broader public will be well disposed to such reform.

Explaining how Republicans wound up in this situation requires one to state the obvious. Well before George W. Bush presided over the collapse of the global financial system, a reasonable-sounding case was being mustered that he was the worst president in history. Foreign policy was the grounds on which voters repudiated him and his party, starting in 2006, and President Obama’s drawdown of forces in Iraq may be the most popular thing he has done. But foreign policy is unlikely to drive voters’ long-term assessment of the parties. The Iraq misadventure was justified with the same spreading-democracy rhetoric that Bill Clinton, Madeleine Albright and other Democrats used to justify interventions in Haiti and the Balkans in the 1990s. President Obama’s difficulties in resolving Afghanistan and closing Guantánamo show that Bush’s options were narrower than they appeared at the time.

Republicans’ future electoral fortunes will depend on domestic policy and specifically on whether they can reconnect with “small-c” conservatism — the conservatism whose mottoes are “Neither a borrower nor a lender be” and “Mind your own business,” and the opposite of which is not liberalism but utopianism. The Bush administration was a time of “big-C” Conservatism, ideological conservatism, which the party pursued with mixed results. As far as social issues were concerned, this ideology riveted a vast bloc of religious conservatives to the party, and continues to be an electoral asset (although that bloc, by some measures, is shrinking). Had gay marriage not been on several state ballots in 2004, John Kerry might now be sitting in the White House.

Ideological conservatism also meant “supply-side economics” — a misnomer for the doctrine that all tax cuts eventually pay for themselves through economic growth. The problem is, they don’t. So supply-side wound up being a form of permanent Keynesian stimulus — a bad idea during the overheated years before 2008. Huge tax cuts, from which the highest earners drew the biggest benefits, helped knock the budget out of balance and misallocated trillions of dollars. To a dispiriting degree, tax cuts remain the Republican answer to every economic question. Eric Cantor, potentially the House majority leader, told The Wall Street Journal that if Democrats went home without renewing various Bush-era tax cuts (which they did), “I promise you, H.R. 1 will be to retroactively restore the lower rates.”

Until recently, supply side was political gravy for Republicans. It confirmed the rule that in American politics the party most plausibly offering something for nothing wins. In the 1980s, the New York congressman Jack Kemp was the archetype of an ambitious, magnanimous, “sunny” kind of Republican who let you keep more of your taxes while building more housing for the poor. Democrats who questioned the affordability of these policies sounded like killjoys. In a time of scarcity like our own, calculations change. Today your tax cut means shuttering someone else’s AIDS clinic. Your welfare check comes off of someone else’s dinner table.

Deficits in the Obama era are a multiple of the Bush ones, and the product of a more consciously pursued Keynesianism. But that does not absolve Republicans of the need to find a path to balancing the budget. With some exceptions — like Representative Paul Ryan of Wisconsin, a Kemp protégé who has laid out a “Road Map” for reforming (i.e., cutting) Social Security in coming generations — Republicans have not adjusted to zero-sum economics. There is certainly no credible path to budget balance in the “Pledge to America” released in late September.

Yet the case against supply-side economics can never be airtight or decisive, and Republican tax promises will probably help the party this year. That is because taxes are not just an economic benchmark, but a political one. The public should not expect more in services than it pays in taxes. But the government should not expect more in taxes than it offers in representation. And the number of Americans who feel poorly represented has risen alarmingly during the Obama administration.

Americans’ feelings toward the president are complex. On the one hand, there is little of the ad hominem contempt that was in evidence during the Clinton and Bush administrations. There are no campaign spots showing a Congressional candidate’s face morphing into Obama’s. But the president’s ideology, fairly or not, has provoked something approaching panic. Not many Americans agree that Obama is a closet totalitarian, as the Fox News host Glenn Beck has claimed. But they have serious misgivings of a milder kind.

In retrospect it looks inevitable that Republicans would be punished by voters in 2008; but until Lehman Brothers collapsed in mid-September of that year, it was far from certain they would be, despite strong Democratic gains in the 2006 elections. Independent and Republican voters wanted an assurance that Senator Obama would not simply hand over power to the Democratic Party. He consistently provided it. The centerpiece of his campaign was a promise of post-partisanship. He introduced himself as a Senate candidate in 2004 at the Boston convention, deriding as false the tendency of pundits to “slice and dice our country into red states and blue states” — a bracingly subversive thing to do at a partisan convention. He praised Ronald Reagan.

And in 2008 he got more than 52 percent of the vote, a higher percentage than many political consultants thought possible for a Democrat. That means he came into office unusually dependent on the good will of independents and Republicans. And yet, once in power, the president set to work enacting the agenda of the same Congressional Democrats he had implied he would keep at arm’s length. No president in living memory has compiled a slenderer record of bipartisanship.

It is often said in the president’s defense that Republican obstructionism left him no choice. Today, this is true — and it has put an end, for now, to the productive part of his presidency. But it was not true at the time of the stimulus in early 2009, when the president’s poll numbers were so stratospherically high that it appeared risky to oppose him on anything. Republicans certainly cannot be blamed for the way Democrats passed their health care bill. Whether or not the deal-making and parliamentary maneuvering required to secure passage was unprecedented, it was unprecedented in the era of C-Span and blogs, and many voters found it corrupt. The president’s legislative program has been bought at a huge price in public discontent. The expression “picking up nickels in front of a steamroller” has been used to describe a lot of the gambles taken by A.I.G. and other companies on the eve of the financial crisis. It describes the president’s agenda equally well.

It is vital to understand where this steamroller is coming from. According to Gallup, support for Obama has fallen only slightly among Democrats, from 90 percent to 81 percent, and only slightly among Republicans, from 20 percent to 12 percent. It is independents who have abandoned him: 56 percent approved of him when he came into office, versus 38 percent now. The reason the country is getting more conservative is not that conservatives are getting louder. It is that people in the dead center of the electorate are turning into conservatives at an astonishing rate.

The frustration and disappointment of these voters are probably directed as much at themselves as at their president. There were two ways to judge Obama the candidate — by what he said or by the company he kept. The cable-TV loudmouths who dismissed Obama right off the bat were unfair in certain particulars. But, on the question of whether Obama, if elected, would be more liberal or more conservative than his campaign rhetoric indicated, they arrived at a more accurate assessment than those of us who pored over his speeches, parsed his interviews and read his first book.

Some wish the president had governed more to the left, insisting on a public option in the health care bill and pushing for a larger stimulus. But those people make up only a small fraction even of the 18 percent of voters who call themselves liberal.

In a time of growing populism and distrust, Republicans enjoy the advantage of running against the party of the elite. This seems to be a controversial proposition, but it should not be. It is not the same as saying that Democrats are the party of elitism. One can define elitism as, say, resistance to progressive taxation, and make a case that Republicans better merit that description. But, broadly speaking, the Democratic Party is the party to which elites belong. It is the party of Harvard (and most of the Ivy League), of Microsoft and Apple (and most of Silicon Valley), of Hollywood and Manhattan (and most of the media) and, although there is some evidence that numbers are evening out in this election cycle, of Goldman Sachs (and most of the investment banking profession). That the billionaire David Koch’s Americans for Prosperity Foundation supports the Tea Party has recently been much in the news. But the Democrats have the support of more, and more active, billionaires. Of the 20 richest ZIP codes in America, according to the Center for Responsive Politics, 19 gave the bulk of their money to Democrats in the last election, in most cases the vast bulk — 86 percent in 10024 on the Upper West Side. Meanwhile, only 22 percent of white males without a high school education are happy with the direction the country is going in.

The Democrats’ overlap with elites leaves each party with a distinctive liability. The Democrats appear sincerely deluded about whom they actually represent. Democrats — who would have no trouble discerning elite solidarity in the datum that, say, in the 1930s the upper ranks of Britain’s media, church, business and political institutions were dominated by Tories — somehow think their own predominance in similar precincts is . . . what? Coincidence? Irony?

Republicans, meanwhile, do not recognize the liability that their repudiation by elites represents in an age of expertise and specialization — even in the eyes of the non-elite center of the country. Like a European workingman’s party at the turn of the last century, the Republican Party today inspires doubts that it has the expertise required to run a large government bureaucracy. Whatever one thinks of Obama’s economic team, and Bill Clinton’s before it, the Bush White House was never capable, in eight years, of assembling a similarly accomplished one. Nor is there much evidence that Republicans were ever able to conceptualize the serious problems with the nation’s medical system, let alone undertake to reform it on their own terms. “Democrats and Republicans agree that our health care system is broken in fundamental ways,” Eric Cantor notes in YOUNG GUNS: A New Generation of Conservative Leaders (Threshold, $15), a campaign book he has written with Paul Ryan and Representative Kevin McCarthy of California. Well, great. But for years now, Republicans discussing the availability and cost of health care have been like a kid who, when asked why he hasn’t cleaned up his room, replies, “I was just about to!”

It is in the context of class that Sarah Palin’s two-year career on the American political scene is so significant. She “almost seemed to set off a certain trip wire within the political class regarding access to power,” as Rasmussen and Schoen put it. But it is not an ideological trip wire. The Alaska governorship that catapulted Palin onto the national scene requires dealing with oil executives and divvying up the money from their lease payments. It is a job for a pragmatist, not a preacher. Palin has sometimes opposed big government and sometimes favored it, as became clear when journalists discovered that, contrary to Palin’s claims, she had been slow to oppose the wasteful Alaskan “Bridge to Nowhere,” which became a symbol of federal pork.

The controversies over Palin are about class (and markers of class, like religiosity), not ideology. She endorsed several underdog insurgent candidates who wound up winning Republican primaries in the spring and summer. How did she do that, when few observers — no matter how well informed, no matter how close to the Republican base — had given them a chance? Either Palin is a political idiot savant of such gifts that those who have questioned her intelligence should revise their opinion or, more likely, she is hearing signals from the median American that are inaudible to the governing classes — like those frequencies that teenagers can hear but adults can’t.

This talent alone does not make Palin a viable national leader. But until Republican politicians learn to understand the party’s new base, Palin will be their indispensable dragoman. After November’s election, the party will either reform or it will disappoint its most ardent backers. If it reforms, it is unlikely to be in a direction Palin disapproves of.

In The Ruling Class: How They Corrupted America and What We Can Do About It (American Spectator/Beaufort, paper, $12.95), Angelo M. Codevilla, an emeritus professor of international relations at Boston University who formerly was on the staff of the Senate Select Committee on Intelligence, gives a very interesting, conservative account of class politics. Codevilla sees the country as divided into “the Ruling Class” and “the Country Class,” who “have less in common culturally, dislike each other more and embody ways of life more different from one another than did the 19th century’s Northerners and Southerners.” Codevilla’s terms are often frustratingly vague. The Ruling Class, in his definition, includes top Democrats as well as Bush Republicans, despite their many differences; the Country Class seems sometimes to mean the passive remainder of the country, and sometimes the vanguard of ideological insurgents.

And yet Codevilla captures the texture of today’s conservative grievances with admirable boldness and convincing exactitude. Slights are harder to tolerate than exactions, he finds: “Day after day, the Ruling Class’s imputations — racist, stupid, prone to violence, incapable of running things — hit like artillery cover for the advance of legislation and regulation to restrict and delegitimize.” This is a polemic, and people wholly out of sympathy with conservatism will dislike it. But Codevilla makes what we might call the Tea Party case more soberly, bluntly and constructively than anyone else has done.

Codevilla takes seriously the constitutional preoccupations of today’s conservative protesters and their professed desire for enhanced self-rule. He sees that the temptation merely to form “an alternative Ruling Class” in the mirror image of the last one would be self-defeating. Americans must instead reacquire the sinews of self-government, he thinks. Self-government is difficult and time-consuming. If it weren’t, everyone would have it. The “light” social democratic rule that has prevailed for the past 80 years has taken a lot of the burdens of self-government off the shoulders of citizens. They were probably glad to be rid of them. Now, apparently, they are changing their minds.

Codevilla has no illusions about their prospects for success. Americans are not in a position to roll back their politics to before the time when Franklin D. Roosevelt or Woodrow Wilson or whoever-you-like ran roughshod over the Yankee yeomanry. Town, county and state governments no longer have much independent political identity. They are mere “conduits for federal mandates,” as Codevilla puts it. He notes that the 132 million Americans who inhabited the country in 1940 could vote on 117,000 school boards, while today a nation of 310 million votes in only 15,000 school districts. Self-rule depends on constitutional prerogatives that have long been revoked, institutions that have long been abandoned and habits of mind that were unlearned long ago. (Not to mention giving up Social Security and Medicare benefits that have already been paid for.) “Does the Country Class really want to govern itself,” Codevilla asks, “or is it just whining for milder taskmasters?”

We will find out soon enough. With a victory in November, Republicans could claim a mandate to repeal the Obama health care law and roll back a good deal of recent stimulus-related spending, neither of which they’ve made any pretense of tolerating. But achieving the larger goal — a citizenry sufficiently able to govern itself to be left alone by Washington — will require more. The Republican Party’s leaders will need to sit down respectfully with the people who brought them to power and figure out what they agree on. If Republicans make the error that Democrats did under President Obama, mistaking a protest vote for a wide mandate, the public will turn on them just as quickly.
The Myth of Consensus Politics
By SAM TANENHAUS

Two years after an election that seemed to portend a new era of comity, American politics has resumed what now appears to be its permanent condition of polarization, quite possibly worsened by widening rifts within the two major parties.

Jonathan Alter, in his cover essay this week on liberalism, notes that the Democratic Party is split between purists and pragmatists; Christopher Caldwell, assessing the state of conservatism, warns that the Republican Party, should it gain control of Congress, will be accountable to angry insurgents aligned with the Tea Party movement.

If Alter and Caldwell are right, and the books they discuss suggest they are, then Democrats and Republicans seem destined to move even farther apart than they are now. How, then, will they forge the compromises that are the foundation of effective governance?

The answer is: They may not need to. For most of the past century, consensus in American politics has been more phantom than fact, especially when it comes to staking out ideological ground.

Even in the cold war era, a peak period of bipartisan cooperation, liberals and conservatives clashed over first principles, and the most respected spokesmen in either party were not afraid to say so.

Consider an important midcentury manifesto, “A Democrat Looks at His Party” (1955). Its author, Dean Acheson, had left government after the 1952 election, but he remained a formidable presence — perhaps the greatest of all modern secretaries of state. Yet his book, although written in courtly prose, with learned references to Renaissance history and choice aphorisms from Oliver Wendell Holmes, was an act of ideological revivalism, steeped in the glories of the New Deal.

The Democrats’ triumphs, Acheson argued, originated in their discovery that the federal government, in particular the executive branch, should be “an instrument to accomplish what needs to be done, even if this cuts across cherished doctrines,” including those stated or implied in the Constitution.

In fact, Acheson explained, the history of the Democratic Party, dating as far back as Grover Cleveland’s presidency, “is the history of America’s unwritten constitution, of the powers of the federal government, of the nature and authority of the presidential office and its relation to the legislative and judicial powers.” It climaxed in the rush of programs and agencies Democrats devised to meet the crisis of the Great Depression. These innovations required “knowledge, perceptiveness, imagination — in other words, brains,” and it was the Democratic Party that “attracts intellectuals and puts them to work.”

Republicans, meanwhile, clung to an outmoded ideal of a weak federal government: “In the name of checking . . . ‘executive aggrandizements,’ the party historically would subordinate the Executive to the Congress, and the national voice to a babel of local voices,” Acheson wrote.

The unwritten constitution? The babel of local voices? It’s hard to imagine a high-profile Democrat today who would so openly acknowledge these presumptions of modern liberalism.

Republicans in the 1950s were no less direct. Take, for instance, the ideas expressed by Arthur Larson, the under secretary of labor to President Eisenhower, in his book “A Republican Looks at His Party,” published in 1956. Responding to Acheson, Larson accused him of a “thinly veiled contempt for state and municipal government,” formed under “the influence of a school of European political theory” — specifically, the socialist theory of Harold Laski. Larson stated his party’s position in language as strident as Newt Gingrich’s. “Let us put it perfectly bluntly: the typical American is inherently a states’-righter by inclination and sentiment.” That same American had “an instinctive sense that . . . excessive centralization means the threat of ultimate loss of personal liberties, and that our constitutional division of powers between the central government, the state governments and the people is right and must be preserved at all costs.” The Democrats’ ideal of the federal leviathan, Larson warned, would place the nation on the road to “totalitarian dictatorship.”

Acheson and Larson were by no means extremists. Each stood at or near the political center. Acheson had been the architect of the cold war containment policy that included the use of loyalty oaths, enacted under Truman, to expunge suspected Communists from the government payroll, though in his manifesto he regretted this “grave mistake.”

Larson, a self-described “New Republican,” proudly pointed to the Eisenhower administration’s expansion of New Deal programs — unemployment insurance, for one — and advocated a “strong, confident center-of-the-road American consensus,” a view repudiated by the conservative wing of his own party.

The dominant political figure in the 1950s was Eisenhower, a popular president twice elected with sweeping majorities. He disdained ideological debate but it swirled all around him, at times almost paralyzing his administration. For two years, he was locked in battle with his party’s right wing, most conspicuously with a group of legislators led by Senator Joseph R. McCarthy, the ringmaster of loyalty investigations that reached deep into the executive branch. Other Republicans, exploiting wafer-thin majorities in Congress, gave less attention to major appropriations bills than to drafting constitutional amendments that might confound the most devout Tea Partier. In his 1956 book “Affairs of State: The Eisenhower Years,” Richard Rovere, The New Yorker’s Washington correspondent, counted no fewer than 107 amendments that had been submitted to Senate committees as of June 1954. They included one empowering state governors to fill “vacancies” in the House of Representatives should Washington suffer a nuclear attack, another to prevent “interference with or limitation upon the power of any state to regulate health, morals, education, marriage, and good order in the state,” and a third that would have inserted the following words in the Constitution: “This nation devoutly recognizes the authority and law of Jesus Christ, Savior and Ruler of Nations through whom are bestowed the blessings of Almighty God.”

At one point, Eisenhower, frustrated that the nation’s serious business was being ignored, considered quitting the Republican Party and starting a new party of his own.

The problem was temporarily solved by the 1954 election. It was a defeat for Congressional Republicans, but not for Eisenhower since, as Rovere reported, the election had “removed several persons whom the president found offensive and had weakened the authority of quite a few others,” most of them Republicans.

Democrats, all the while, were equally fissured, as the party nearly self-destructed over civil rights, the great social issue of the 1950s (though neither Acheson nor Larson had much to say about it). The fiercest proponents of the states’ rights ideology championed by Larson were not Republicans, but Southern Democrats. Some had already broken with the party, in 1948, when Strom Thurmond, then the governor of South Carolina, headed a third-party, states’ rights ticket, the Dixiecrats, which captured four states in the general election, the first step in the Democratic Party’s eventual loss of “the Solid South.”

Today, much of this history has been forgotten, and the Eisenhower years are remembered instead as an oasis of responsible governance and nonideological, bipartisan calm.

It is too soon to say the same about the politics of the present moment. But it is also too soon to say where we are headed or even to guess how we might get there.
From Bomb to Bust
By STEPHANIE ZACHAREK

There’s been lots of ink and oceans of pixels spilled on the question of whether the Internet has killed film criticism, but the very short answer is that serious (if unpaid) criticism has thrived on the Web. The problem is that it’s all too serious: you don’t have to strain your Google finger to find a knowledgeable enthusiast expending 8,000 words on Ozu or Leone. Locating someone who can write succinctly and intelligently on, say, the 1985 Christmasploitation extravaganza “Santa Claus: The Movie” is much harder.

Intentionally or not, that’s a gap that Nathan Rabin, the head writer for The Onion’s A.V. Club, filled by embarking on a yearlong blog project in 2007, the results of which — rounded out with a few extras — are collected in “My Year of Flops.” Rabin applied what he terms “three unyielding/slippery criteria” in choosing the films: Each had to be a critical and commercial failure upon its release. Each “had to have, at best, a marginal cult following.” “And,” he adds, gearing up for the zinger, each “had to facilitate an endless procession of facile observations and labored one-liners.”

“My Year of Flops” covers some 50 underappreciated pictures; every troubled orphan is assessed and deemed a Failure, a Fiasco or a Secret Success. Rabin scrutinizes stinker after stinker, from the 1956 Howard Hughes-produced anti-miscegenation screed “The Conqueror” (he refers to its central figure, played by John Wayne, as John Wayneghis Khan), to the dismal 2005 film version of “Rent” (which he describes, aptly, as starring “fake 20-somethings playing fake bohemians in a wholly inauthentic take on la vie bohème”), to Cameron Crowe’s woebegone 2005 “Elizabethtown” (which confounded Rabin so much he wrote about it twice).

Many of the entries follow a familiar pattern: Rabin battles his own incredulousness at just how bad a particular movie is — assessing Otto Preminger’s 1968 counterculture oddity “Skidoo,” he’s rendered almost speechless in describing a 46-year-old Carol Channing stripped down to her bright yellow undies — only to come to the endearing if perverse conclusion that he genuinely likes it, flaws and all. His cheerful lack of snobbery is the book’s greatest attribute.

The problem is that while Rabin is terrific at being a wag and a wisecracker, he’s less convincing in making his case for the movies that really speak to him. After feebly defending the Jim Carrey cult favorite “The Cable Guy,” he concludes that for him, the movie “hits awfully close to home.” Rabin occasionally refers to the time he spent, as a teenager, in a group home (he detailed those experiences in his 2009 memoir “The Big Rewind”), and for me, at least, that kind of personal touch is always welcome in a piece of criticism. Unfortunately, in this case it does nothing to make me want to sit through “The Cable Guy” again. It’s easy to see where Rabin is coming from; it’s harder to actually follow him there.

In the end, too many of the entries here resemble the blow-by-blow recaps of reality television shows that have flooded the Web over the past 10 years, in which aggressively clever writers recount ridiculous plot turns and contestants’ bad behavior with a detached, ironic wink. Many of the essays in “My Year of Flops” probably worked beautifully in their original form, as smart online bonbons, something to click on for a 10-minute break from work. But Rabin is better at being funny than he is at cutting to the heart of why bad movies affect us so deeply. He ends the book with a minute-by-minute dissection of Kevin Costner’s notoriously lousy “Waterworld” that appears to exist just so he can prove how good he is at coming up with killer quips. That’s a waste of his time and ours. And in a book whose central argument — a valid one — is that time thrown away on junk movies is time well spent, it’s a luxury Rabin can’t afford.
In the Eyes of Others
By JONATHAN HAIDT

Western societies got weird in the 19th century. I mean that not as an insult but as an acronym. The cultural psychologists Joe Henrich, Steve Heine and Ara Norenzayan recently showed that many psychological processes work differently in people raised in Western, educated, industrialized, rich and democratic (WEIRD) societies. The normal, or default, mode of human cognition, for example, is holistic, given to seeing relationships among elements, but people in WEIRD societies think more analytically. They see a world full of discrete objects, like balls on a billiard table, whose properties are best analyzed individually.

The WEIRDing process has been particularly visible in moral philosophy. In his 2008 book “Experiments in Ethics,” the Princeton philosopher Kwame Anthony Appiah described the loss of relevance that philosophers inflicted on themselves, beginning in the late 19th century, when they abandoned philosophy’s ancient interest in messy human nature and retreated into the conceptual analysis of moral terms.

Appiah is one of the most relevant philosophers today. He writes about ethics in diverse modern societies, where it is often a challenge to find solid ground, let alone common ground. His work reveals the heart and sensitivity of a novelist — or perhaps a mystery writer, given that he’s written three whodunits — and he develops ideas the way a writer develops characters. He shows them in action, in relationships, in context and in flux. He helps us think holistically before turning analytic.

In “The Honor Code,” we accompany Detective Appiah as he tries to figure out who killed three morally repugnant practices: dueling among British gentlemen, foot-binding among the Chinese elite and slavery in the British Empire. In each case he shows how notions of honor sustained the practice for centuries, and how (spoiler alert) it was honor that later killed the practice in just a few decades, making these cases the “moral revolutions” referred to in his subtitle. Appiah also presents a fourth case: honor killings in present-day Pakistan, in which women and girls who are thought to have had sex outside of marriage, even in cases of rape, are murdered by male relatives to preserve the family’s “honor.” In this case the revolution has not yet happened, but Appiah draws on the other three cases to suggest how this horrific practice might someday meet its end.

Take the practice of foot-binding. Nobody knows precisely why aristocratic Chinese parents began, more than 800 years ago, to change the shape of their daughters’ feet. But once tiny, pointed feet became a difficult-to-attain ideal of feminine beauty, an obstacle to infidelity and a mark of elevated social status, there was no way for parents in the upper social strata to abandon the practice without losing honor — and reducing their daughters’ marriage prospects. Honor overpowered compassion: silk straps were used to pull up the middle third of the foot, like an inchworm, gradually bringing the ball of the foot and the heel together over many years. Even a poet who found such feet erotic wrote, “Can’t bear to hear — the cries of a young girl as her feet are bound for the first time.”

As Christian missionaries spread throughout China in the late 19th century, they were appalled by the practice and formed societies opposed to foot-binding. Allied with members of the Chinese literati, they made arguments that appealed to China’s national interest, like the need for strong and healthy women to bear strong and healthy children. Yet these arguments had no effect on the practice until members of the elite class discovered that they and their nation had become objects of ridicule. Foreigners were taking pictures of women’s tiny feet and sending them around the world. Combined with the shame of recent military and commercial defeats at the hands of Japan, Britain and other foreign powers, the thirst to restore national honor created an opening. The anti-foot-binding societies recruited high-ranking families to make a dual pledge: to refrain from binding their daughters’ feet, and from marrying their sons to women with bound feet. With upper-class boys growing up ready to marry a new pool of upper-class, unbound girls, there was now an honorable alternative, and the practice essentially disappeared within a generation.

I have just one criticism of this fascinating, erudite and beautifully written book: Appiah thinks honor survives in WEIRD societies. He distinguishes “competitive honor,” which accrues to people who excel, from “peer honor,” which governs relationships among members of an “honor world” who acknowledge a shared code. Appiah is certainly right that people in modern societies seek competitive honor — earning the highest grade or largest bonus, for example — but this pursuit often motivates unethical behavior, and so this is not the kind of honor that most interests him. Rather, he believes that we moderns have retained a form of peer honor, stripped of gender and re-engineered for a large and diverse society whose moral triumph has been the extension of dignity to all. “Honor is no decaying vestige of a premodern order,” he writes. “It is, for us, what it has always been, an engine, fueled by the dialogue between our self-conceptions and the regard of others, that can drive us to take seriously our responsibilities in a world we share.”

Yet by Appiah’s own analysis, peer honor can survive only in an “honor world,” and that is precisely the kind of world that WEIRD societies asphyxiate. At the University of Virginia, for example, we have a student-run honor system, created in 1842 by a few hundred sons of Virginia planters whose families vigilantly tracked one another’s reputations and arranged marital and commercial alliances accordingly. In that world, a gentleman could not tolerate a stain upon his honor, and neither could a community of gentlemen. We therefore have a “single sanction” based on a psychology of purity: any dishonorable behavior contaminates the whole community, so any violation of the honor code is punishable by expulsion.

Today, however, the university’s 21,000 students come from all over the world, and concerns about purity are mostly confined to the cafeteria. The moral domain has shrunk — as it must to accommodate the individualism, mobility and diversity of a WEIRD society — to its bare minimum: don’t hurt people, treat them fairly but otherwise leave them alone. Students at Virginia work hard and care about their grades, but when they learn about fellow students’ cheating, they usually do nothing. They understand that cheating harms others (in courses graded on a curve), but because WEIRD moral calculus involves only individuals (not the honor of the group), they feel that expulsion is too harsh a punishment. And because they do not feel personally dishonored by a cheater, it’s not clear to them why they should step forward and press charges. The result is that our purity-based single sanction, still in force long after the death of its natal honor world, increases students’ willingness to tolerate dishonorable behavior.

A more accurate subtitle for “The Honor Code” might have been: “How Moral Revolutions Used to Happen, and What We Gained (and Lost) When We Replaced Peer Honor With Respect for All Persons.” That subtitle would have made it clear that Detective Appiah is really working on the hardest case of all: Who are we, morally speaking, and how did we get here?
Teen Spirit, Soured
By JOSHUA MOHR

Too often, our comfort zones are our tombstones. We settle into numbing patterns and that’s that — wake me when it’s over. Not so in the frenetic world of James Franco, whose ambition over the past few years has manifested almost as performance art: he’s been affiliated with multiple M.F.A. programs, in fiction, poetry and filmmaking; he’s angling to add “Dr.” to his name, having recently become a Ph.D. aspirant at some shabby school called Yale. Oh, and in case your particular comfort zone is a cave, he’s a pretty successful actor, too.

Now he’s a published author. So often with young writers, we read regurgitations, remixes, short stories like pop songs we’ve already heard 500 times. This isn’t the case in Franco’s first story collection, “Palo Alto.” Its best entry is “American History,” in which high school freshmen must articulate the pro- and anti-slavery arguments of the mid-1800s. Franco sharply merges historical elements with a modern-day social commentary that makes you wonder how much we’ve actually evolved in post-bellum America. In another story, “I Could Kill Someone” — a wild romp in which the narrator decides to murder his locker-room nemesis — the bouncy, pubescent voice at first seems discombobulated, but ends up perfectly mirroring the undulations of a teenage mind: “In the old days, you could duel. Emotions have been around forever. I wish I had a girlfriend. Or someone.”

Many of the stories end in nihilistic violence and gratuitous gore, which, let’s admit, can be entertaining. Oscillating angst is acted out in car accidents, fistfights, petty felonies and empty threats lobbed at drug dealers. Yet many of these tales have no emotional payoff. They feel vacant, not because the images aren’t vivid — Franco has a talent for viscerally evoking danger — but because throughout, there’s a larger problem: a lack of individual characterization.

One could argue that Franco is modifying the prototype of teenage suburban ennui and the concentric experience of such an ecosystem. His characters are bored, ambivalent, confused, and certainly these generalities can encapsulate adolescence. But the male narrator of “Halloween” sounds suspiciously like the female lead of “Lockheed,” who bears a pungent resemblance to the main player in “Chinatown in Three Parts.” To exclude the specifics of these characters’ emotional lives is a cop-out. What rattles around inside their hearts and skulls is not only as vital as what’s happening in the exterior world; it’s crucial to the kind of storytelling Franco seems to be attempting.

As a writer, Franco needs to harness the skills he’s cultivated as an actor, mainly the ability to inhabit a consciousness independent of his own. We get glimpses of his potential, as in this lovely psychic hint from “Jack-O’,” by a suicide-curious stoner: “Things, shapes, folded in on themselves, . . . and if time is variable, then how do I vary it, and why do I want to? Because everything just focuses in on me and I hate it.”

On-screen, Franco is obviously willing to take chances (see his charismatic portrayal of Scott Smith, the long-suffering lover of Harvey Milk, in “Milk”); he knows how to animate and distinguish each new role with sovereign passions, biases and nuances. That, of course, is what a writer should do: exert a keen ear and a bold heart to spill his characters’ secrets. With “Palo Alto,” Franco’s literary execution hasn’t quite matched his other performances, though I’ll be surprised if he doesn’t keep at it, trying to shake our comfort zones until his players stand onstage and bare their pillaged souls.

A Life Between
By JOHN McWHORTER

As of 2005, the United States had a black, female secretary of state, and yet black America has largely observed this more than celebrated it. There is a tacit sense “out there” that Condoleezza Rice isn’t black in the “real” way, as we might put it. Not “with” us, perhaps.

Part of this is of course because she is a Republican who served under a deeply unpopular president. After the N.A.A.C.P. dutifully honored her with a President’s Image Award in 2002, the black Columbia historian Manning Marable dismissed Rice as a “leading race traitor” and the award as “accommodation” to an antiblack corporate establishment. Around the same time, black audiences chuckled approvingly when Amiri Baraka read the line “Who know what kind of Skeeza is a Condoleezza,” from his poem “Somebody Blew Up America.”

Yet there is more to it than that. Rice’s public self-presentation is distinctly impersonal. Unethnic, for one, but shading into outright ineffability. One grapples for an adjective to describe her personality, even after reading her autobiography, “Extraordinary, Ordinary People.”

She would have us believe that her dazzling journey, from growing up in segregated Birmingham to helping to lead the world, can be credited to attentive parents, the “extraordinary, ordinary” folk of the title. Yet it becomes clear that Rice has always been a wunderkind singleton. As a result, one detects a touch of the perfunctory in the family aspect of her tale, as well as a disinclination toward serious introspection.

Rice’s parents, both educators, provided a fine environment for germination. Rice grew up in the parallel universe that middle-class black parents in the segregated South built for their children, a world of socials, bowling and bonnets, with black children from “rough” neighborhoods kept at a distance. Her recollection of her parents all aflutter trying to teach her about the birds and the bees plays like something out of “Father Knows Best.”

As a young piano student, Rice liked to imagine herself as Mozart’s wife, and in 1968, when she was 13 years old, she spent afternoons mimicking ice-skating moves to Dvorak’s “New World” Symphony. No one in this era was tarring any of this as “white,” and parents insisted that blacks had to be twice as good as whites to succeed. Rice notes, “This was declared as a matter of fact, not a point for debate.”

All of this was ordinary for middle-class blacks in that time and place. What wasn’t ordinary was Rice’s coming out of a political science course at the University of Denver so entranced with Russian history that she decided to become a Soviet specialist. It does not discount black people’s wide range of interests to say that in the early ’70s, Soviet affairs was an unusual career choice for a black woman raised in Jim Crow Birmingham. Members of this first generation of black academics much more commonly sought to explore the black past and present.

You would never know this from Rice’s breezy account of this period in her life. She sometimes sounds like a white debutante from somewhere in Connecticut, as if black people from the Deep South always named their cars after the protagonist of their favorite Russian opera (hers was “Boris Godunov,” for the record). We learn little about Rice’s inner life as she sails to one triumph after another, as a Stanford fellowship becomes a tenure-track assistant professorship, the Council on Foreign Relations sends her to Washington and next thing she knows she’s working at the National Security Council, getting appointed provost of Stanford, and fielding calls from George Bush père, who wants her to meet his son for some foreign policy brush-up.

The rest we know. Yet we will need biographers to give us more than Rice does about her actual work and the reasons for its rapturous reception. Her book comes close only in furnishing scattered childhood evidence of a furiously disciplined, even insular, individual. Rice reminds us that she liked the Temptations and Led Zeppelin and admits a tendency to procrastination (one she has apparently indulged only rarely over the past 30 years). However, this is also someone who as a girl encouraged her father to rat out local kids who were having an unchaperoned party; refused to settle for the kiddie plate in restaurants; and as a teenager adhered enthusiastically to a schedule that had her up at 4:30 for skating practice, followed by school at 7, piano lessons and more skating afterward, and bedtime by 9:30. Plenty of her peers, even the above-average ones with self-sacrificing parents, would have considered this schedule unthinkable.

This singularity presumably helps explain the Republicanism that all but a sliver of her black generation rejected. Her explanation is that she’d rather be ignored by Republicans than patronized by Democrats, but this suggests an ironic, back-door motivation that does not correspond to her general politics, upon which she would find little disagreement from Michael (as well as Shelby) Steele. “There are no excuses and there is no place for victims,” she says she was taught. She rejects the idea that one needs mentors who “look like you,” as well as the term “African-American.”

Yet Rice is not deracialized in the way some suppose. It would be hard to be, growing up in Birmingham in the ’50s. She knew the girls who died in the bombing of the 16th Street Baptist Church, and reminds us that to blacks of that era, the immediate response to John F. Kennedy’s assassination was terror that a Southerner was now in the White House.

Despite reports to the contrary, she favors affirmative action, albeit in the sense of outreach rather than so-called diversity quotas (although her comments here reveal a level of ambivalence about aspects of implementation). At Stanford in the 1990s, she helped found the Centers for a New Generation, a youth-education program in depressed East Palo Alto; and she acknowledges that her own rise at Stanford, early in her academic career, was facilitated by the university’s affirmative-action efforts.

We learn these things as facts, but over all, “Extraordinary, Ordinary People” is oddly detached for an autobiography. People like Henry Louis Gates Jr. and Maya Angelou are fully present in their childhood recollections of similar circumstances in a way that Rice never is; she often seems to be watching rather than writing about herself. Those interested in her romantic life, for instance, will have to be satisfied with elliptical glimpses. (She didn’t marry her main college sweetheart, the Denver Broncos wide receiver Rick Upchurch, because he mysteriously “had too many irons in the fire.”)

In general, her political aperçus rarely go deeper than this: “But the war left the Iraqi dictator in power, able to threaten his neighbors and oppress his people. That would be a problem for another day.” Nor is this “Memoir of Family” an insider’s report on Rice’s life after 2000, to which she devotes a single page: the last one.

If there is a lesson from Rice’s book, it is that the civil rights revolution made it possible for an extremely talented black person (a woman, no less) to embrace a race-neutral subject and ride it into service as secretary of state, all the while thinking of herself largely as just a person. That the story is not exactly exciting can perhaps be taken as confirmation of how considerably times have changed.
Under God . . . or Not
By BEVERLY GAGE

Today’s conservatives often describe themselves as strict constructionists, seeking the “original meaning” of the nation’s founding texts. In the case of the Pledge of Allegiance, a much fetishized if not legally binding document, this approach is unlikely to yield the desired political result. As Jeffrey Owen Jones and Peter Meyer note, the original author of the pledge was a former Christian Socialist minister who hoped to redeem the United States from its class and ethnic antagonisms. Interpretations of its meaning have been growing more conservative, not more liberal, ever since.

The author in question was Francis Bellamy, cousin to the novelist Edward Bellamy, whose “Looking Backward” offered the 19th century’s most popular vision of a future welfare-state utopia. In 1892, after abandoning the ministry, Francis was working at The Youth’s Companion, a mass-market magazine aimed at schoolchildren. For promotional purposes, the magazine planned a national youth pageant to mark the 400th anniversary of Christopher Columbus’s American landfall. Bellamy was assigned to rally the necessary political support and, at the last minute, to compose a few words appropriate to the occasion. He came up with a statement of what he later called “intelligent patriotism,” designed to counteract some of the nation’s most divisive and reactionary impulses.

His original salute to the flag was just 23 words: “I pledge allegiance to my flag and to the republic for which it stands — one nation indivisible — with liberty and justice for all.” Even so, it contained a subtle political message. Amid the heightened class conflict of the Gilded Age, the phrase “liberty and justice for all” was an idealist’s demand as well as a patriotic affirmation. So, too, was the idea of “one nation indivisible.” Just a generation removed from the Civil War, divided over the new immigrants pouring in from Eastern and Southern Europe, Americans of the era could not take their country’s stability for granted. Bellamy hoped his pledge would bind them together in a celebration of the nation’s traditions — and sell a few magazines along the way.

As Jones and Meyer note, Bellamy himself eventually backed away from his early flirtation with radicalism, emerging by World War I as an advocate of immigration restriction and stringent countersubversion. Much of the nation followed a similar path. In the 1920s, patriotic groups like the American Legion campaigned to change the words “my flag” to “the flag of the United States of America,” anxious that immigrant children might secretly be pledging to the flags of their original homelands. Three decades later, Congress added the words “under God” to distinguish American patriotism from “godless Communism,” thus condemning Bellamy’s high-minded call for national unity to decades of court challenges and contention.

As the pledge grew more restrictive, it also became increasingly mandatory. Today, at least 42 states feature some sort of recitation law, usually aimed at public school children. Politicians on both sides of the aisle pay homage to the pledge as an essential and edifying patriotic rite, advertising their willingness to place hand over heart. (Strict constructionists should note that the original pledge was accompanied by a right-side straight-arm salute, a gesture that mysteriously began to lose popularity in the 1930s.)

Jones and Meyer make a good case that Bellamy’s original pledge was more elegant and rhythmic than today’s clause-laden version. They are less effective in explaining how the former “Youth’s Companion Pledge of Allegiance” evolved from a vaguely progressive one-off promotional spot into a mandatory childhood rite of passage and a political weapon. “The Pledge” often relies on exclamation and enthusiasm in lieu of analysis. (“Only in America!” appears as free-standing commentary.) As a result, the book reads more like an amateur hobbyist’s guide to pledge-related happenings than a fully realized history of American patriotism and national identity.

What “The Pledge” does offer is an enthusiast’s fascination with the odd (if not quite “magical”) string of events that led modern conservatives to adopt the ditty of a 19th-century socialist as a 21st-century badge of honor.
Read My Book? Tour My House
By ANNE TRUBEK

This past summer, I sat in Norman Mailer’s living room in Provincetown, Mass., as the sun beat on the windows looking out to the bay, listening to Mailer’s literary executor and longtime friend, J. Michael Lennon, tell stories about “Norm,” his wives and his writing habits. I was there as a writing fellow at the Norman Mailer Center, which is what the brick house on Commercial Street has become, and our group, Lennon told us, was going to have a special opportunity to tour the house, including Mailer’s attic study, which has been “preserved in amber.” Mailer was in the middle of writing a book when he died, and after his death nothing was touched. The plan is to have the room “stay that way forever,” Lennon said.

Having toured dozens of dead writers’ houses over the years, I was familiar with the genre of the Mailer tour: the bedrooms, the bookshelves, the dining room table scratched up by the author. I could also anticipate the quiet awe of my fellow tourgoers once we reached the mecca, the third-floor study. My reaction? Geez, it’s hot up here, and musty, and I hope no one saw me yawn. But I did marvel at one thing: the Universal Gym, in mint condition, installed in the middle of the room. I wonder how that will play to the Provincetown pilgrims 50 years from now?

By my count, there are 73 writers’ houses open to the public in the United States. Not all provoke the awestruck response of the self-selected group at the Mailer house. According to curator and tour-guide estimates, only about half of the 2,000 people who visit the Walt Whitman House in Camden, N.J., each year come because they are interested in Whitman (as opposed to a nice historical stopover after touring the battleship down the road). Just 10 percent of the 9,000 annual visitors to the Thomas Wolfe Memorial in Asheville, N.C., come specifically for the author. Most people who visit the Mount, Edith Wharton’s lavish estate in tourist-friendly Lenox, Mass., are killing time before a concert at Tanglewood (and tend not to continue to Arrowhead, Herman Melville’s modest homestead in the nearby depressed industrial city of Pittsfield). Half of the 182,000 annual visitors to Hemingway’s house in Key West say they come for the cats.

It’s tempting to write off the interest in writers’ real estate to American materialism, but literary pilgrimages — and resistance to them — have a long history. Petrarch’s birthplace in Arezzo was preserved in the 14th century, while he was still alive, though the poet found the idea preposterous, as he had never lived there and considered Florence his home. The Casa di Dante, near the Piazza della Signoria in Florence, is utterly ersatz — the guidebook on sale at the ticket counter says of the bedroom, “This was certainly not Dante’s bedroom.” In England, writers began visiting Shakespeare’s Birthplace in Stratford-upon-Avon (which may or may not have been the house where Shakespeare was born) in the 18th century, followed in the 19th by the tourist hordes. More keep opening up: you can now tour — and sleep in — Agatha Christie’s house in Devon, where several of her books were set.

American writers’ houses can be spotty in their chronology, as well as their historical accuracy. New York may be the nation’s literary capital, but there is only one writer’s house museum in the five boroughs: Edgar Allan Poe’s cottage in the Bronx, where he spent the last few years of his life. The contents of Marianne Moore’s Greenwich Village living room were preserved, but moved to Philadelphia. The Whitman house displays old-timey paper the curator bought across the river at the Ben Franklin print shop in Philadelphia, even though Franklin predates Whitman by a century. Hannibal, Mo., has plaques announcing where events from Mark Twain’s books — Tom painting the fence, Huck fishing — took place, though of course none of these things really happened. The Poe house in Baltimore — one of several dwellings (including a dorm room at the University of Virginia) preserved in honor of the peripatetic writer — shows a video in which school kids tell us Poe is important because he helped spawn the TV series “Moonlighting” (does anyone remember “Moonlighting”?).

These small museums do tell fascinating stories, though not always the ones they intend. In Cleveland, where I live, a community development group bought a house Langston Hughes lived in during his high school years for $100 at a sheriff’s sale in November 2009. The city has condemned the structure, so people are debating whether to raise the $100,000 it would cost to save it from demolition. The questions under discussion — Will a Hughes Museum attract tourists to a dicey part of town? Did Hughes live in the house long enough for it to be significant? — reflect the confusions at the heart of the idea of preserving writers’ houses. I’d argue we’d be better off using the money to help buy every school kid a book of Hughes’s poetry.

Then again, $100,000 is chump change next to the costs associated with Mailer’s bay-front pile in Provincetown, or the old Hemingway place amid the multi-million-dollar retreats of Ketchum, Idaho, where the neighbors blocked efforts to turn the property into a public museum. (At present, it’s mainly visited by potential donors to the Nature Conservancy, which owns the property, the largest undeveloped site in Ketchum.) Sitting at Mailer’s bar overlooking the water, I find it hard not to price the place and wonder how long they will be able to afford to offer writing fellowships.

Literary capital can fluctuate as much as hard currency or real estate. Instead of buying houses, investors in literary futures might consider setting up tours of college campuses, where so many of today’s esteemed writers do much of their work anyway. One imagines a tour of Toni Morrison’s hard drive in her Princeton office, Marilynne Robinson’s archived in-box in Iowa City and the yellowed note card — I’m imagining here — Tobias Wolff tacked to his Stanford office door, announcing fall 2010 office hours. And then there’s the risk that a writer’s reputation will be foreclosed on while the house still stands. Consider the Hamlin Garland Homestead in West Salem, Wis., and the Thomas Bailey Aldrich house in Portsmouth, N.H., which honor writers whom most of us have forgotten.

It may be morbid to imagine the deaths of our greatest writers, but that they will someday die is one thing of which we can be assured, unlike property values or literary reputations. Jack London serves as a cautionary tale: In Sonoma County, he built a dream house on top of earthquake-resistant concrete slabs large enough to reinforce a 40-story building, intended to last 1,000 years. Three days before it was to be completed, it burned down. But that hasn’t stopped visitors from coming to tour the ruins.
Never Grow Old
By KEITH OLBERMANN

An attempt to publish the inventory of Mickey Mantle iconography began in February 2009 and will continue well into next year. From Mickey Mantle vanity license plates to Mickey Mantle postcards to Mickey Mantle bobblehead dolls to Mickey Mantle postcards depicting him holding Mickey Mantle bobblehead dolls, the cataloged, photographed and priced relics will ultimately exceed 2,000, and will have filled 200 pages in the memorabilia industry’s weekly Sports Collectors Digest. Included in the reckoning are at least 30 full-scale Mickey Mantle biographies, half a dozen of which carry the Mantle imprimatur as author, co-author or frontman.

Thus, to wade now into the river of nostalgia, collection and recollection that is Mickey Charles Mantle, 42 years since his last major league at-bat, and 15 years since his death at 63, is like crowding into the last row of the Yankee Stadium bleachers at the start of a World Series game and expecting to get a TV close-up.

Yet as she did in her innovative biography “Sandy Koufax,” Jane Leavy has found a different path through the throng. For her portrait of Koufax, she alternated an inning-by-inning account of that great pitcher’s perfect game in 1965 with deeply researched and fluidly written examinations of the rest of his life and import. “The Last Boy,” a nonlinear biography, takes the form of 20 days in Mantle’s life (something of a conceit; some of the “days” are stretched to cover nearly a season, or an entire childhood).

The approach refreshes and underscores the facts and patterns of a life, and enables Leavy to connect the dots in new and disturbing ways. The Mantle who emerges is perhaps more whole than ever previously captured. His was an almost Dickensian childhood spent atop a veritable toxic waste dump in Commerce, Okla., with piles of lead and zinc mining debris called “chat.” The detritus was dangerous: Leavy offers evidence that it might have induced dyslexia in Mantle, and one of Mantle’s sons suggests it might have contributed more to his father’s fatal liver cancer than did 40 years of alcoholism.

Death is, in fact, the unexpected theme of this biography, and it emerges in the most unexpected places. Leavy’s most salient observation is of the day in June 1969 when the Yankees retired Mantle’s uniform number in front of 60,096 fans:

“He had watched Gary Cooper deliver Lou Gehrig’s farewell address in ‘The Pride of the Yankees.’ Now he was standing in the same spot, invoking Gehrig’s parting words: ‘I always wondered how a man who knew he was going to die could stand here and say he was the luckiest man in the world. Now I think I know how Lou Gehrig felt.’

“What was lost in all the huzzahs attendant to the occasion — the last lap around the stadium in a bullpen cart with hand-painted pinstripes — was that he cast himself as a dying man. In fact, he was already planning his funeral.”

Almost anyone who knows about Mantle knows that the frequently admitted presumption of early death is part of his legend. While Leavy disproves his depiction of a family in which all the men died by 40, she also convincingly identifies this specific fear as the likely outcome of Mantle’s having been repeatedly sexually abused as a child by a half sister and neighborhood boys, and produces heartbreaking on-the-record evidence to support this painful conclusion.

This is not, however, a dark book, no matter how dark parts of the life it portrays surely were. The hero worship of the fans, and the women who constituted a kind of endless batting practice in Mantle’s life, are presented thoroughly and fairly. There are revelations of hidden charity and great empathy, of a hero’s genuine inability to understand what others saw in him, and deeply endearing self-deprecating humor, even when a drunken Mantle is literally in the gutter. Almost everyone in sports over 40 has a “When I met Mickey” story, and Leavy weaves her own through five vignettes interspersed with the main chapters. Hers is too sweetly, horribly, blissfully, embarrassingly Mantlean to give away here.

Most important, the affection with which Mantle’s teammates always embraced him is chronicled abundantly, and stands in stark contrast to his wife and children’s struggles to do the same despite the emotional roadblocks that were seemingly all Mantle was capable of offering them. And as Leavy honors their Sisyphean efforts, she does the same for Mantle’s own attempts to overcome an equally impossible obstacle. Reinforcing the historical record with scientific reinterpretation, she posits that when Mantle injured his right knee swerving out of Joe DiMaggio’s way in the fifth inning of the second game of the 1951 World Series, he in fact tore his meniscus and the anterior cruciate and medial collateral ligaments. Insufficient treatment of the “unhappy triad” would downgrade him from the prospect of being the game’s greatest performer to playing nearly all of his remaining 17 years on one knee. Still, he won three M.V.P. awards and, in 1956, the triple crown.

Leavy has also given us old-fashioned, nonanalytical gumshoe research, enough — and good enough — to make the crowds of amateur baseball sleuths or the pros at the Hall of Fame weep. Mantle’s 565-foot home run at Griffith Stadium in Washington in April 1953 was not merely one of the longest ever hit, nor was it just Mantle’s true self-introduction on the baseball stage. It also sealed the sport’s obsession with the “tape-measure homer,” largely through the artifice of the anecdotal report by the Yankees’ public relations director, Red Patterson, that he found the boy who had come upon the Mantle baseball where it finally stopped, in somebody’s backyard. More than half a century later, Leavy tracked down the man, by then 69 years old, and managed to get just enough detail from him to produce a true picture of the transformational blast.

His was one of 563 interviews Leavy conducted, ranging from the executive responsible for the creation — and scarcity — of Mantle’s landmark 1952 Topps baseball card, to Eric Kandel, who won the 2000 Nobel Prize in Physiology or Medicine. Kandel is asked to try to explain both Mantle’s explosive swing, which made the bat seem of double width, and his inability to explain to others how he did it. (Kandel rightly answers, “I think your question is not dramatically different than asking, ‘What makes Mozart Mozart?’ ”)

But Leavy comes as close as perhaps anyone ever has to answering “What makes Mantle Mantle?” She transcends the familiarity of the subject, cuts through both the hero worship and the backlash of pedestal-wrecking in the late 20th century, treats evenly his belated sobriety and the controversial liver transplant (doomed mid-surgery by an oncologist’s discovery that the cancer had spread), and handles his infidelity with dispassion. Sophocles could have easily worked with a story like Mantle’s — the prominent figure, gifted and beloved, through his own flaws wasteful, given clarity too late to avoid his fate. Leavy spares us the classical tragedy even as she avoids the morality play. “The Last Boy” is something new in the history of the histories of the Mick. It is hard fact, reported by someone greatly skilled at that craft, assembled into an atypical biography by someone equally skilled at doing that, and presented so that the reader and not the author draws nearly all the conclusions.
Hearts Full of Sorrow
By REBECCA NEWBERGER GOLDSTEIN

Admirers of Nicole Krauss’s novel “The History of Love” (and they are many, and I am one) will want to know the answer to this question: How much does her new novel, “Great House,” resemble its predecessor? The good news is: very much indeed. And the good news is also: not so very much.

In themes and preoccupations, “Great House” and “The History of Love” overlap. Both explore shattered characters, with pasts blasted by the sort of loss that makes even the pretense of normal life impossible. (By “normal life” let us mean one in which certain premises can be assumed — for example, that it is possible to put one foot in front of the other, on the way to meet a lover or to buy a loaf of rye bread, without being overtaken by tremors issuing from convulsions of the moral order.) The narrative structures of both books mirror the characters’ own shattering and require readers to reassemble the full story for themselves. And both books coalesce around artifacts from the past. In the case of “The History of Love,” it is a book; in the case of “Great House,” it is a writing desk. Given the nature of these organizing objects, it is no surprise that both novels designate literature as singularly significant. Writing — at least when it is great — is a kind of consecration, placing its practitioners in the way of assaults from large truths and perils. This last theme runs the risk of morbid solemnity; everything rests on the execution.

So what, then, is so different about the two novels? It is their tone. “The History of Love,” despite its tragic underpinnings, is anything but solemn. Its sorrow is rambunctious, its anguish rollicking. Its fulgurating pain comes out in shrieks of unlikely laughter. This extraordinary feat of immaculate blending is accomplished by main characters who are, despite all (and all is truly terrible here), stuffed with unconscionable amounts of charm. In particular, there is Leopold Gursky, who is, to my mind, one of the great characters of recent fiction, as hilarious as he is tragic, an exquisite amalgam of great artist and great clown — think of Beckett, with a Polish-Yiddish accent. Gursky does not choose to be excluded from normal life, but rather strives, with quirky futility, to achieve the ordinary. This is the derivation of so much of the exuberance in what is an essentially tragic novel.

The characters of “Great House” lack all trace of exuberance. Normal life does not beckon them. They inhabit their sorrow with a lover’s ardor, cultivating it into an art form. There is a forbidding, and seductive, remoteness about them that captures those who draw too close and then can get no closer. There are four strands to the novel, and the tasks of narration fall to those who have been caught by these dangerously removed enchanters. (In one case, the enchanter is not human, but the art of writing itself.) These narrators are marooned in a terrible place, unable to return to the safe shore of normal life, unable to follow their enchanters into the deeps where only they can breathe. Their enchanters are themselves enchanted with their own sorrows. They have been shaped around what it is they have lost, a central idea in “Great House” — in fact, the meaning behind its title.

The novel opens with a writer named Nadia telling her story to someone she addresses as “Your honor,” whose identity we will learn when we ought to. She is explaining herself, and her explanation is focused on her relationship with the desk that, decades ago, she came to half-possess. Abandoned by her boyfriend, who left with the furniture, Nadia learned of the desk from a friend of a young Chilean poet, Daniel Varsky, who was leaving New York to go back to Chile, at least temporarily, and needed a place to park his furniture. The desk she got from him is a huge affair, best described by another narrator, whose wife passed it on to Varsky in the first place. “This desk was something else entirely: an enormous, foreboding thing that bore down on the occupants of the room it inhabited, pretending to be inanimate but, like a Venus’ flytrap, ready to pounce on them and digest them via one of its many little terrible drawers.” There are, to be exact, 19 drawers, one of them permanently locked. The desk stayed with Nadia in New York for decades, never reclaimed because Varsky had become one of Pinochet’s disappeared. Then one day a young girl claiming to be Varsky’s daughter called Nadia from Jerusalem and asked for the desk, which Nadia relinquished, a more wrenching parting than any human had ever presented her with.

The narration is then taken up by a father, who lives in Israel and whose connection to the story we must wait to discover. He is addressing his son, Dov, one of the suffering remote who are such a torment to love. Dov had wished to become a writer, but the father, in an outraged protest against literature’s elective affinity with suffering, squashed the ambition. “Who do you think you are? I asked. The hero of your own existence?” This father, both ruthless and heartbroken, is a bundle of contradictions vibrating with a boisterous grief reminiscent of Gursky’s.

The next strand belongs to a cultivated British husband, Arthur Bender, whose wife, Lotte Berg, was the one who gave Varsky the desk. She had come to London from Germany as the chaperone on a Kindertransport, leaving her parents to be murdered in the camps. She, too, is a writer, a locked drawer and a torment to love. Her husband begins to guess at the loss around which she is formed only at the end of their long life together. And then the narration passes to a young American woman who has come to Oxford to study literature before falling in love with the male half of a mysterious sibling pair, Yoav and Leah Weisz, who are themselves held captive by their father, George Weisz, a famous antiques dealer, originally from Hungary, whose genius lies in being able to locate pieces plundered by the Nazis, objects remembered with infinite longing by the original owners and their children.

What gives the quickening of life to this elegiac novel and takes the place of the unlikely laughter of “The History of Love”? The feat is achieved through exquisitely chosen sensory details that reverberate with emotional intensity. So, for example, here is George Weisz describing how, when his clients speak of their lives before the war, “between their words I see the way the light fell across the wooden floor. . . . I see his mother’s legs move about the kitchen, and the crumbs the housekeeper’s broom missed.” Those crumbs are an artist’s true touch. They demonstrate how Krauss is able, despite the formidable remove of the central characters and the mournfulness of their telling, to ground “Great House” in the shock of immediacy.

Krauss has taken great risks in dispensing with the whimsy and humor that she summoned for her tragic vision in “The History of Love.” Here she gives us her tragic vision pure. It is a high-wire performance, only the wire has been replaced by an exposed nerve, and you hold your breath, and she does not fall.

Den of Antiquities
By GEORGE JOHNSON

In a remarkable interlude in Willa Cather’s novel “The Professor’s House,” a New Mexico cowboy named Tom Outland describes climbing a landmark he calls Blue Mesa: “Far up above me, a thousand feet or so, set in a great cavern in the face of the cliff, I saw a little city of stone, asleep. It was as still as sculpture, . . . pale little houses of stone nestling close to one another, perched on top of each other, with flat roofs, narrow windows, straight walls, and in the middle of the group, a round tower. . . . I knew at once that I had come upon the city of some extinct civilization, hidden away in this inaccessible mesa for centuries.”

Blue Mesa was Cather’s stand-in for Green Mesa — Mesa Verde in southern Colorado — and she was evoking what a real cowboy, Richard Wetherill, might have felt when, a week before Christmas 1888, he found Cliff Palace, the centerpiece of what is now Mesa Verde National Park. Craig Childs understands these kinds of epiphanies, and he beautifully captures them in “Finders Keepers: A Tale of Archaeological Plunder and Obsession” — along with the moral ambiguities that come from exposing a long-hidden world.

Wetherill’s city of stone, Childs reminds us, was quickly commandeered by a Swedish archaeologist who, over the objections of outraged locals, shipped crates of Anasazi artifacts to Europe. Worse things could have happened. Childs tells of a self-proclaimed amateur archaeologist who in the 1980s removed hundreds of artifacts from a cave in Nevada, including a basket with the mummified remains of two children. He kept the heads and buried the bodies in his backyard.

In his explorations of the Southwest, Childs has often crossed paths with pot hunters and private collectors. Most were driven by curiosity and the excitement of discovering lost history. Cather understood the impulse: “There is something stirring about finding evidences of human labor and care in the soil of an empty country,” Tom Outland says in the novel. “It comes to you as a sort of message, makes you feel differently about the ground you walk over every day.” Then you have to resist the urge to take home what you have found.

After a raid by federal officers in 2009, prominent citizens of Blanding, Utah, were arrested on charges of illegally trading in artifacts. One man was so humiliated that he asphyxiated himself in his Jeep. The niece of another man asked for Childs’s understanding: “Farmers in the ’10s and ’20s would use pots for target practice, potshots. There were millions of artifacts, for Christ’s sake.” Her uncle collected them out of love, she said, to protect them. But according to the arrest warrant, Childs writes, he also boasted of selling sets of pottery for as much as $500,000.

Childs is no angel either, and that gives his book its drama. He tells of a time in the parched lands of the lower Colorado River when he and some fellow desert rats came across a small seashell, a sign that travelers had passed through. Picking up the shell, delicate as “a curl of white paper,” Childs had a Tom Outland moment: “I saw copper-skinned people filing between isolated mountains, baskets weighted across their foreheads on leather tumps. They had bare legs, hard footsoles, and spun countless generations of themselves before asphalt or steel ever came to this land.” As one of his cohort scooped up beads, Childs tried to intervene. “Aw, man, don’t start that,” the scavenger replied.

The beads, spilled accidentally, weren’t museum quality. There was little in the way of archaeological context to preserve. Why leave them behind for someone else to take? Though he knew it was illegal, Childs acquiesced: “Maybe the beads were meant to travel.”

When they came upon a more important site the next day, the moral calculus seemed clearer. In a cave, a pile of rocks turned out to be a hunting shrine concealing a cache of ceremonial bows. After passing them around, Childs insisted they be carefully replaced. Later, he noticed one was missing. “There is a difference between finding and keeping,” he writes. “The two are often lumped together into one action, but there is a blink that comes in between. It is when a thing goes from being its own to being yours.”

Childs doesn’t say whether, upon returning to civilization, he contacted a university archaeology department. Probably not. Over the years, he and others continued to find artifacts in the wild. They would carefully examine them and then leave them behind, telling no one the location. They were “cat burglars of the desert,” he writes.

In one of the book’s most gripping scenes, Childs visits the Peabody Museum at Harvard to study murals peeled from the walls of ruined Hopi kivas, ancient art that might otherwise have been destroyed. But with archaeologists complaining of a “curation crisis,” in which millions of artifacts sit boxed in museum basements, how many more arrowheads and potsherds do we need?

“If anyone tells you there is only one right answer to the conundrum of archaeology, he is trying to sell you something,” Childs writes. “At this point, considering all that has been removed, it is worth leaving the last pieces where they lie.”

Tom Outland may have come to feel the same way. Toward the end of “The Professor’s House,” he takes a train to Washington, where he tries to enlist the Smithsonian in the preservation of Blue Mesa. When he finally gets an appointment with an archaeologist, Outland lends him some pottery to evaluate. The Smithsonian decides to pass. And Outland never gets the artifacts back.

Revolutionary Road
By JOSEPH BERGER

THE KING’S BEST HIGHWAY

The Lost History of the Boston Post Road, the Route That Made America

By Eric Jaffe

Illustrated. 322 pp. Scribner. $27.50

A few blocks from my home in Westchester County lies Boston Post Road, a ribbon of retail dotted with Dunkin’ Donuts franchises, gas stations and drive-through banks. Indeed, for much of its span between New York and Boston, the Post Road is, in Eric Jaffe’s apt phrase, “a gluttonous commercial wild.” But in his valuable if uneven history, “The King’s Best Highway,” Jaffe burrows under the asphalt to reveal a thoroughfare of deeper distinction. Starting with its pastoral founding by American colonists, who exploited rocky, slippery paths that Indians had burned out of the wilderness, he tells how the road became the artery for an inchoate mail service, with loping riders carrying wax-sealed letters from tavern to tavern — the first post offices — and distributing newspapers that united the scattered colonists to revolt against the British. He traces the changes brought on by stagecoach, trolley and bicycle, reminding us that robust railroads led to the neglect of highways, until the automobile enabled the highways’ revenge.

Too often — and this may be a problem with such conceits — Jaffe, a former editor of Smithsonian magazine’s Web site, strains to make his case for the road’s place in history, telling us about curious happenings and personalities that came in contact with the road but left little significant impact. Did the fact that Lincoln chose Cooper Union — at the two-mile point of the road to Boston — for a speech defending the federal government’s right to control the spread of slavery really mean that the road played a crucial role in shaping events?

Nevertheless, when Jaffe hits his stride, the result can be illuminating and entertaining. He recaptures a time — the early 1700s — when people took two weeks to get from Manhattan to Boston on horseback. (The Post Road is actually two roads: a high road through Springfield and a low coastal road.) John Winthrop Jr., son of a Massachusetts Bay Colony governor, and later Benjamin Franklin envisioned the road’s potential; Franklin worked out a more predictable schedule of “post riders,” financed mail delivery and required night rides that cut the journey to four days.

Jaffe describes how a suitably surfaced road — previously many towns had simply recruited men for annual highway grooming duty — helped spread the products of upstart textile factories early in the Industrial Revolution. Charmingly, he takes us back to a pre-baseball New York-Boston rivalry over transportation, with New York striking first with the Erie Canal in 1825 and Boston bouncing back in the ninth with its railroad to Worcester in 1835.

The railroad consigned the Post Road running alongside it to stepchild status. But the road recovered when Albert Pope, wanting smoother trails for his Columbia bicycles, led a movement for state-of-the-art paving. Jaffe argues persuasively that the automobile industry blossomed along the Post Road because the road provided a testing ground for early-model cars. (The manufacturer Hiram Percy Maxim, for one, drove his “loud, coughing” machines mostly at night to avoid attracting undue attention.)

By 1930, with 26.5 million cars registered, Americans found it easy to live outside cities, and the Post Road became a commuter corridor. As traffic coagulated, pressure built to create Interstates. Jaffe asserts that the expressways Eisenhower initiated sucked wealth out of cities and helped incite the urban unrest of the 1960s. But skeptics might point out that such turmoil also took place in spots like central Brooklyn, where expressways didn’t slice through.

In a delightful final chapter, Jaffe cruises the road’s remnants in a Mini Cooper like an archaeologist, seeking old taverns and milestones. He realizes anew how, thanks to expressways, the old highway has been “relegated, in most places, to a glorified local street.” As Jaffe observes in a graceful passage, landscape features like the Boston Post Road end up “not so much a place one can visit but the idea of a place, the idea that something special took place, here.”

Lost Tribe
By ALANA NEWHOUSE

WHEREVER YOU GO

By Joan Leegant

253 pp. W. W. Norton & Company. $23.95

I must have terrible luck. I’m an American Jew who, from time to time, has — like the characters in Joan Leegant’s “Wherever You Go” — found my way to Israel. But unlike her characters, I’ve never managed to stumble upon small sculpture galleries nestled behind lemon trees. The muezzins haven’t sounded for me at perfect poetic moments. And, though I’ve been approached by good-looking Israeli men with motorcycles, none seemed motivated by a burning desire to talk politics.

But things work more symbolically in the universe of Leegant’s first novel, in which three American Jews suffering from various shades of misguidedness visit Israel in search of meaning (or closure or salvation or . . . you get the picture) and their lives collide in (what else?) an act of terror. The book is an indictment of certain anemic corners of the modern American Jewish experience — spiritually sapped by bourgeois values, rote religious observance, Holocaust fatigue and jingoistic ethnic pride — and an exploration of the radicalism, religious and political, into which some searching people flee.

The story opens with Yona Stern, a 30-year-old art gallery assistant from New York, who has left her married lover and journeyed to Israel in an attempt to reconcile with her estranged sister, Dena. Once an idealistic peace-and-justice type, Dena is now active in the radical settler movement. Her community is (of course) named in honor of Baruch Goldstein, the American-born doctor who massacred 29 Muslims in 1994, and her children have “Norman Rockwell features that suggested an adorable cuteness, a wholesome naïveté, but cold hazel eyes that militated against it.” The reunion isn’t friendly because, obviously, the sisters couldn’t be more different. “Just like I can’t fathom your lurid serial liaisons with adulterers,” Dena tells Yona, “you can’t fathom my belief in the sanctity of this land and the supreme importance of its redemption.” O.K., then.

Yona’s interaction with Dena’s world leads her story to become intertwined with those of two others: Mark Greenglass, a drug addict turned Talmud scholar suffering from a crisis of faith, and Aaron Blinder, an unambiguously named depressive who had “Jewish day school shoved down his throat by his father.” Once the connections among the three become obvious, the plot moves toward its tragic denouement.

Leegant reveals a flair for description — in the overdone Upper East Side apartment of a wealthy assimilated Jewish couple “the ceiling was all swirls, icing for the living room cake” — as well as insight. When Greenglass’s existential confusion begins to lift, he notes that he “had uttered so many prayers for Jerusalem, maybe it has said one for him in return.”

But, alas, nothing can save this book from its clichéd principal characters. And the secondary figures are no better: “square-jawed Israelis who talked only to each other in machine-gun Hebrew,” a rabbi who “without apology ate a pound of brisket every Friday night except when there was chicken,” and so on. These people are, as one of Leegant’s own characters admits, straight out of central casting. They’re too coarsely drawn, and they change too quickly, entering and exiting extremisms as if via turnstiles.

Leegant — whose story collection, “An Hour in Paradise,” was filled with small gems — seems to have fallen victim to an artistic strain of Jerusalem Syndrome, whereby a previously good writer exhibits signs of imbalance in reaction to the Holy Land. Trying to render the confounding Middle East as a more comprehensible landscape, she instead produces a grotesquerie of caricatures and politics drawn so crudely that they defy humanness. As Yona’s motorcycle philosopher explains, the settlers “need black and white. They don’t like the gray. . . . They like absolutes. And drama. . . . They want a big life. Historical, theatrical.” Perhaps. But, then again, so does everyone else in this book. And therein lies the problem.

Losing Battles
By COLM TOIBIN

TO THE END OF THE LAND

By David Grossman

Translated by Jessica Cohen

576 pp. Alfred A. Knopf. $26.95

In the introduction to his 2003 collection of journalism, “Death as a Way of Life,” the Israeli novelist David Grossman wrote: “The daily reality in which I live surpasses anything I could imagine, and it seeps into my deepest parts.” In a note at the conclusion of his somber, haunting new novel, “To the End of the Land,” he explains that he began writing it in May 2003 — around the same time he wrote that introduction, six months before the end of his older son’s military service and a year and a half before his younger son, Uri, enlisted. “At the time,” he writes, “I had the feeling — or rather, a wish — that the book I was writing would protect him.”

“On Aug. 12, 2006,” Grossman continues, “in the final hours of the Second Lebanon War, Uri was killed in Southern Lebanon.” By that time, most of this book “was already written. What changed, above all, was the echo of the reality in which the final draft was written.”

It is a testament to Grossman’s novelistic talent, indeed perhaps his genius, that “To the End of the Land” manages to create and dramatize a world that gives both the reality and the echo their full due. He weaves the essences of private life into the tapestry of history with deliberate and delicate skill; he has created a panorama of breathtaking emotional force, a masterpiece of pacing, of dedicated storytelling, with characters whose lives are etched with extraordinary, vivid detail. While his novel has the vast sweep of pure tragedy, it is also at times playful, and utterly engrossing; it is filled with original and unexpected detail about domestic life, about the shapes and shadows that surround love and memory, and about the sharp and desperate edges of loss and fear.

This novel is, on the one hand, a retelling of Truffaut’s “Jules and Jim,” in which two guys, best friends, fall in love with the same girl. Ora, the girl in this novel, is emotional, introspective, filled with an ability to notice and an ability to love. As for the boys, Ilan is rational, vulnerable, brittle, oddly needy and nerdy; and Avram is impulsive, brilliant, superintelligent, larger than life. Having loved them both, Ora finally decides to marry Ilan, and they have a son, Adam; a few years later, made pregnant by Avram, she has a second son, Ofer, who is brought up as though he were Ilan’s child.

In another society, this might have the makings of a comedy, but in Israel between 1967 and 2000, the years in which the novel takes place, public life had a way of eating into the most private moments and the most intimate relationships and poisoning them. Avram is captured and tortured during the 1973 war; this free-spirited, somewhat goofy genius is thereafter a broken man. He does not want to have anything to do with his old friends and does not want to see his son.

In parallel with the pain and terror of war, there is daily life. Grossman offers a wonderful, almost quirky account of Ora raising her two boys. He has a way of making the most ordinary moments glow, each detail chosen to suggest how odd and engaging people are, and how unsimple and deeply interesting human relations become. Like everything else in the book, the haven of love and care that Ora creates for her sons and her husband is invaded by fear and misery and a sort of coarseness once her sons begin their military service, entering a world of roadblocks, ambushes and arrests that she can only imagine in horror.

With her husband and elder son away in South America, Ora arranges to go on a hike with Ofer when his time with the military is up. Instead, he re-enlists. Ora must again live in fear of the “notifiers” from the army, who might call in the night, knocking on her door to deliver bad news.

Rather than staying at home and waiting, however, Ora settles on an almost magical way of keeping her son safe: she will not be there for the notifiers if they call. She will go to the north of Israel without a phone, where no one can notify her of anything, and she will hike south and not listen to news. She will find Avram, the boy’s father, and she will make him come with her.

The novel traces what happens as they walk and talk. Most of the time this device works brilliantly. Ora needs to tell Avram about his son, every single detail she can think of, to make him come alive for his natural father for the first time. By invoking him with such zeal, however, she is already placing him in the past. This casts a shadow on their walk and imbues their conversation with a sort of dark tension. At times, Ora’s level of self-consciousness, her alertness to the emotional contours of things, her exquisite introspection, give this story the depth and privacy of an Ingmar Bergman film, especially “Scenes From a Marriage.” The story she tells darts between public and private life, between war and torture on the one hand and the sweet anxieties of bourgeois life on the other.

As in other novels of love and loyalty in a time of conflict — Nadine Gordimer’s “Burger’s Daughter,” Michael Ondaatje’s “English Patient” or Shirley Hazzard’s “Great Fire” — there is a palpable urgency here about the carnal and the sexual. The portrait of Ora as a woman alive in her body is one of the triumphs of Grossman’s book.

Grossman also manages to play the ordinary against the highly charged. He displays masterly control over the emotional life of the novel, maintaining it at a very high level indeed, and then pushing it at points where the narrative becomes almost unbearable. There is a moment, for example, when Ora and Avram meet a man on their journey who says, “It’s good to get away from the news a bit, especially after yesterday,” and you simply have to put the book down, so great is your fear for Ora’s son. There is another moment, told in flashback, when Avram, delirious in the hospital after his release from captivity, during which he had been led to believe that Israel had been fully defeated, asks Ora: “Is there . . . Is there an Israel?” Again, the tension becomes so great that you hold your breath.

To say this is an antiwar book is to put it too mildly, and in any case such labels do an injustice to its great sweep, the levels of its sympathy. There is a plenitude of felt life in the book. There is a novelist’s notice taken of the sheer complexity not only of the characters but of the legacy of pain and conflict written into the gnarled and beautiful landscape through which Ora and Avram walk. And there is the story itself, unfolded with care and truth, wit and tenderness and rare understanding. This is one of those few novels that feel as though they have made a difference to the world.

They Did What?
By SUSAN DOMINUS

HOW TO BECOME A SCANDAL

Adventures in Bad Behavior

By Laura Kipnis

Illustrated. 209 pp. Metropolitan Books/Henry Holt & Company. $24

Scandal has never had it so good: typically ogled, mocked, knocked down and dismissed, it’s at last being graciously invited to lie down on the couch. In “How to Become a Scandal,” Laura Kipnis delivers consumers of high and low culture that rare twofer, taking material that self-respecting people are supposed to resist and treating it with such smarts that the reader feels nothing short of enlightened. Her book is filled with sensational subjects (Eliot Spitzer, Linda Tripp, James Frey and that notorious astronaut with the diaper), but Kipnis delivers all the thrills.

One might think that increased wariness in this privacy-free era would offset whatever moral erosion has occurred in the last 20 years, leaving culture with the same net number of blockbuster scandals per election cycle. Instead, we’re flooded by them, as every other superstar seems to fall to a combination of sleaze and naïveté. With each incident, we’re forced anew to try to fathom how it could have happened, with no convenient or definitive analysis for reference.

Into this breach steps Kipnis, a professor at Northwestern University and the author of “Against Love: A Polemic” and other books, with her “scandal psychodynamic,” ready to outline the relationship between the transgressors and a judgmental society. That scandals are public affairs, she argues, is hardly incidental, especially when news of a golfer’s crashed Escalade can go viral by lunchtime. Surely, she says, transgressors need the exposure as much as the culture needs scandal.

The particular obsession of her analysis is not why people have sex with the wrong people (which seems obvious), or tape-record conversations in acts of betrayal, or exercise criminally bad judgment (as did the New York judge Sol Wachtler, sending lascivious notes with condoms enclosed to the 14-year-old daughter of a former girlfriend). What fascinates Kipnis are the elaborate ways those transgressors reassure themselves that they are not bringing colossal ruin upon themselves, that their dalliances will never see the light of day, that no one will ever trace the source of that condom-filled card.

At the heart of any scandal is a mystery, the mystery of the self, and Kipnis scours the transcripts of her chosen debacles, finding quotations that best illustrate the delusion enabling so many implosions. Lisa Nowak, the astronaut who put on a wig and pepper-sprayed her ex-boyfriend’s new lover, seemed to truly believe that all she wanted was to get that young woman’s attention, to convey the urgent information that their love interest had been seeing them both for some time. She was also under the impression that perhaps the police need not mention the episode — the wig, the BB gun found in her car — to her employer, NASA. Such rationalizations make sense, Kipnis says, only in “a hermetically sealed cognitive universe, i.e. the padded cell of your own imagination.”

Kipnis’s book is stuffed with such witty aperçus. “If there were a Nobel Prize for denial,” she says of the disgraced governor Eliot Spitzer, “Stockholm would soon be calling.” But for the most part, she treats her subjects with remarkable compassion, using their failings to illustrate our own potential for the same, ridiculing the idea that the rest of us are above sordid self-immolation. “All the self-examination in the world isn’t going to help anyone bent on self-deception,” she writes, “which is no doubt true of any of us at least some of the time.”

What allows for scandal in Kipnis’s schema is every individual’s blind spot, “a little existential joke on humankind (or in some cases, a ticking time bomb) nestled at the core of every lonely consciousness.” Phrases like that illustrate both the author’s power as a writer and her own blind spot, some unchecked impulse to indulge in thorny syntax and extraneous flourishes. But in general, the verbal extravagance seems appropriate for the feverish pitch of her material. “Scandal loves your appetites,” she writes, “all of them, the more voracious the better.” In this book, the competing drives that result in scandal don’t live in some neurochemical haze; they’re corporeal, engaging in martial arts. Anxiety is “fermenting in every social being’s gut”; all of us fall prey to self-destructive desires that are “deviously tunneling for freedom.”

Rather than feeling dated (what, no Tiger?), her examples serve as case studies for the ages. In Kipnis’s hands, Linda Tripp, who claimed that patriotism led her to record the conversations with Monica Lewinsky that exposed Bill Clinton, is not just a symbol of feminine betrayal, but of the two-facedness essential to most scandals, the divided selves revealed when the public eye alights. Perhaps it wasn’t Tripp’s facial features that provoked such widespread harshness, Kipnis says, but rather the disconnect between what she was saying — that she was only trying to do the best thing — and her own complicated motives, a denial that played itself out in Tripp’s expressions with painful results. This point, once explained by Kipnis, will forever inform the way you watch anyone simultaneously squirm and smile before Chris Matthews.

Kipnis herself hovers quietly around the edges, perhaps as a personalizing stand-in for the rest of us, so wary from her research that she essentially disclaims any real potential for understanding her motives in choosing her subject matter. The scandals that fill the book “chose me as much as I chose them,” she writes. Why the personal fascination with the subject? Kipnis is off the hook for that one, she says — after all, how should she know? Ostensibly about scandal, her book is most memorable as a convincing case for the ultimate unknowability of the self.

Learning to Be Washington
By ANDREW CAYTON

WASHINGTON

A Life

By Ron Chernow

Illustrated. 904 pp. The Penguin Press. $40

George Washington’s corpse was scarcely a month in its grave when an enterprising minister from Maryland named Mason Locke Weems made a pitch to a Philadelphia publisher. “I’ve got something to whisper in your lug,” Weems wrote in January 1800. “Washington, you know is gone! Millions are gaping to read something about him. . . . My plan! I give his history, sufficiently minute” and “go on to show that his unparalleled rise & elevation were due to his Great Virtues.” Weems was on to something. His sentimental and often fictional biography became a best seller, the first in a seemingly endless stream of studies of the man who led the Continental Army to victory in the American War for Independence and who as the first president of the United States did more than anyone else to establish the legitimacy of a national government merely outlined in the Constitution of 1787. Today, books about Washington continue to appear at such an astonishing rate that the publication of Ron Chernow’s prompts the inevitable question: Why another one?

An obvious answer is that Chernow is no ordinary writer. Like his popular biographies of John D. Rockefeller and Alexander Hamilton, his “Washington,” while long, is vivid and well paced. If Chernow’s sense of historical context is sometimes superficial, his understanding of psychology is acute and his portraits of individuals memorable. Most readers will finish this book feeling as if they have actually spent time with human beings. Given Chernow’s considerable literary talent and the continued hunger of some Americans for a steady diet of tales of Washington and his exploits, what publisher could resist the prospect of adding “Washington: A Life” to its list?

A more complicated answer lies in considering why we still long for news of Washington. To be sure, his life reflected, if it didn’t epitomize, the once unimaginable transformation of several British colonies into an imperial Republic whose dominion extended to the Mississippi River. But Chernow and, I suspect, most of his readers are less interested in how the United States became the United States than in how George Washington became George Washington. Accepting the inevitability of our nation, they remain perplexed by the pre-eminence of this man. Chernow is far subtler and far more sophisticated than Weems. Yet there is a familiar ring to his desire to “elucidate the secrets” of Washington’s “uncanny ability to lead a nation” by detailing his acquisition of such “exemplary virtues” as “unerring judgment, sterling character, rectitude, steadfast patriotism, unflagging sense of duty and civic-mindedness.” What, we still wonder, was the secret of his success?

Contemporaries, even those rivals who deeply resented him, observed that Washington seemed to be blessed by Divine Providence — or just plain luck. How else to explain the many bullets that whizzed around but never into his body? Or his emergence from a string of catastrophic military disasters in the French and Indian War and the War for Independence with a reputation enhanced rather than ruined? Over the past two centuries, scholars have detailed more prosaic explanations of Washington’s “unparalleled rise & elevation,” including his acquisition of thousands of acres through fortuitous inheritance and relentless speculation; his marriage to the wealthy widow Martha Dandridge Custis; his connection with members of the powerful Fairfax family, who became important early patrons; his struggle to master his body and his passions within the language and conventions of 18th-century Anglo-American republicanism; and most recently, his creative conflation of his personal ambition with the cause of the Republic. Chernow acknowledges all these interpretations of Washington’s life. But because he tends to slide into the biographer’s quicksand of identifying too closely with his subject, his particular contribution is to argue for the critical role Washington himself played in becoming George Washington.

Few human beings have ever lived a life more self-consciously devoted to proving he merited his fame. In retrospect, Washington seems profoundly insecure. Given to dark moods and angry outbursts, especially at those who questioned his intentions, he compensated by studying rules of etiquette, mimicking successful older men, cultivating the loyalty of younger men and displaying an extraordinary sensitivity to what others thought of him. Nothing was more likely to provoke his legendary rage than accusations that he was driven by base motives.

Like many of his peers, he made a great show of resisting public office, if only to demonstrate the absence of ambition. Washington fretted at length about the performance he would give from the balcony of Federal Hall in Lower Manhattan when he became president on April 30, 1789. What should he wear? How should he behave? Knowing that “the first of everything in our situation will serve to establish a precedent,” he wanted to avoid acting like a king while respecting the dignity of his office and the Republic it represented. Months earlier, he had decided he ought to say something, thereby inventing the presidential Inaugural Address. In an early draft, of which only fragments survive, Washington, Chernow writes, “spent a ridiculous amount of time defending his decision to become president, as if he stood accused of some heinous crime.” This prickliness rarely surfaced in public. Indeed, he tirelessly cultivated an impassive demeanor that suited to perfection his preferred role as a remote, stoic figure towering above the sordid business of ordinary politics.

But of course George Washington was anything but an uninterested observer. He didn’t just learn from events; he shaped them to his own purposes. Throughout his career he wanted the gentlemen of Virginia and then the United States to master the landscape and peoples of North America as well as their bodies and emotions. Where Thomas Jefferson spent much of his life defying power, Washington imagined using power to improve transportation, encourage education, develop commerce, establish federal authority and unite the diverse regions of the sprawling Republic into an imposing whole that transcended the sum of its parts. “The name of AMERICAN,” he said, must override any local attachments.

To command respect and inspire emulation throughout the world, the United States had to balance liberty with order. This breathtaking imperial vision informed virtually everything Washington did. When he decided to provide for the freedom of most of his enslaved Africans upon the death of Martha Washington, he acted in good measure out of a conviction that “nothing but the rooting out of slavery can perpetuate the existence of our union, by consolidating it in a common bond of principle.”

Washington was concerned with his reputation and that of the nation he helped to found because he wisely understood that he could improve both through close attention to the expectations of others. But we are mistaken if we think he offered himself as a democratic example of how an ordinary person could succeed. The Washington who fashioned his public image did not believe he created the core of his character. To the contrary, his rise, he thought, served as proof that he was an extraordinary man who had always possessed exemplary talent and integrity. If he fretted about what posterity would think of him, it was probably because he doubted our willingness to acknowledge his greatness. On that score, at least, he needn’t have worried.

New Birth of Freedom
By BELINDA COOPER

THE LAST UTOPIA

Human Rights in History

By Samuel Moyn

337 pp. The Belknap Press/Harvard University Press. $27.95

Human rights have come to dominate international discourse, but while this fact is often portrayed as the culmination of a centuries-old tradition, Samuel Moyn, a professor of history at Columbia University, takes a different view. The modern concept of human rights, he says in “The Last Utopia,” differs radically from older claims of rights, like those that arose out of the American and French Revolutions. According to Moyn, human rights in their current form — applicable to all and internationally protected — can be traced not to the Enlightenment, nor to the humanitarian impulses of the 19th century nor to the impact of the Holocaust after World War II. Instead, he sees them as dating from the 1970s, exemplified by President Jimmy Carter’s effort to make human rights a pillar of United States foreign policy.

Today’s human rights movement emerged “seemingly from nowhere,” Moyn says, as a depoliticized, moral response to disillusionment with revolutionary political projects, specifically the anticolonial independence struggles of the 1950s and ’60s. Moyn credibly juxtaposes the hopes placed in a new internationalist “utopia” of human rights against the failure of national self-determination to guarantee human dignity.

The idea that international legal protections apply directly to individuals, outside the authority of their governments, is indeed a recent phenomenon. Yet in his untidy attempt to decouple human rights entirely from what went before, Moyn stretches his argument too far. In all-too-brief asides, he dismisses the antislavery campaign and the development of the laws of war in the 19th century because neither was explicitly framed in terms of human rights. Yet both contained universalist and internationalist aspects. Moyn also fails to explain how an early international organization like the Red Cross, which engaged with governments to protect individuals from mistreatment in wartime, differed from the modern human rights organizations he describes.

Moyn argues that the Holocaust played a relatively small role in post-World War II rights debates and correctly reminds us that the Nuremberg tribunal, which put Nazi leaders on trial for war crimes, did not concentrate primarily on the genocide of the Jews. But he ignores Nuremberg’s crucial contribution to the development of the modern human rights movement: for the first time, international law was directly applied to crimes against individuals in a forum that transcended national boundaries.

At the same time, Moyn overestimates the extent to which human rights today take precedence over the sovereignty of states. International treaties designed to protect individuals are still directed to national governments, which remain the first line of defense, even in the modern world of globalized thinking. The concept of national sovereignty has hardly disappeared: the continuing debate over whether to intervene in places like Sudan testifies to the difficulty of overcoming deep-seated resistance to interfering with what are still seen as internal affairs.

In the end, Moyn’s main pieces of evidence for taking the 1970s as the time of a human rights breakthrough are Carter’s abortive steps to inject human rights into foreign policy and the 1975 Helsinki accords with the Soviet Union. But if one must find a recent starting point, a more appropriate decade would be the 1990s, when human rights organizations truly flourished and international criminal tribunals became reality. It was arguably the collapse of the cold war blocs, far more than the end of decolonization, that allowed international human rights to emerge as a viable program, rather than merely a propaganda tool employed by antagonistic political systems. If Moyn’s argument isn’t persuasive, it is in large part because an alternative history to his own is far too easy to construct.

Nabob of Negativism
By BILL SCHEFT

HALF EMPTY

By David Rakoff

224 pp. Doubleday. $24.95

The book jacket of “Half Empty,” David Rakoff’s third essay collection, contains not only the warning “No Inspirational Life Lessons Will Be Found in These Pages,” but the guarantee that the author will have you “positively reveling in the power of negativity.” It’s never clear whether the pessimism alluded to is Rakoff’s philosophy, Rakoff’s device or Rakoff’s publicist clearing his throat. Luckily, we don’t have to judge this book by its cover.

The inherent problem with most collections is that the reader can’t help comparing entries, like a track handicapper setting the morning line. In his ambitious opening essay, “The Bleak Shall Inherit,” an interview with the psychologist Julie Norem (author of “The Positive Power of Negative Thinking”) sets Rakoff off on an attempt to construct his case for the defensive pessimism (expecting the worst so one will never be disappointed) that infuses the nine essays that follow. They don’t all follow, and it doesn’t really matter, in the same way it doesn’t matter whether you buy a film’s premise that Diane Lane can’t get a date.

The factory-installed defining characteristic of those I run across who ply the pessimism trade is a custom blend: 95 percent self-absorbed, 5 percent self-aware. Rakoff has a self-awareness that could be recreated only by a team of geneticists working in a lab. The conviction with which he writes is, at the risk of blowing his jacket, uplifting.

The man has been self-aware almost since birth. Consider this memory: “As a child at a picnic, rocking a hula hoop or something, another child came up to me asking to use it. ‘I’ll be your best friend,’ he entreated. ‘Best friend?’ I said, all of 7 years old but already a prig in the making, ‘I hardly know you.’ ”

Aware of himself and others. In “The Satisfying Crunch of Dreams Underfoot,” a rueful remembrance of his career in publishing, his aborted movie debut in “The First Wives Club” and his relationship with the author Olivia Goldsmith, Rakoff observes, “I once watched Jacqueline Onassis wait for an elevator, and the heightened performance of casualness of everyone around her paying her no notice had about as much in common with ignoring someone as a Father’s Day department-store window resembles an actual barbecue.”

Writing like this can only be a positive experience for all concerned. In the gleefully unapologetic takedown “Isn’t It Romantic?” Rakoff summarizes “Rent” thus: “In ‘Rent,’ AIDS seems only to render one cuter and cuter.” Later, he calls the musical a “middlebrow lie . . . posing as an antidote, like watching a sex-ed film narrated by gonorrhea.”

Occasionally, the essays are mined with people and notions that are simply in Rakoff’s way. And ours. You wonder how many of these selections would be better said than read. But be sure to feast on “Dark Meat,” which chronicles the illicit affair between Jews and pork, and somehow winds up on a book-tour stop at a Holocaust museum in Germany. And suit up for the final triumph of “Another Shoe,” when Rakoff, after receiving a second diagnosis of cancer, prepares for amputation by living life using just one arm.

To file Rakoff under “essayist, brilliant” would be to overlook his formidable gifts as a reporter. In “A Capacity for Wonder,” he scoots from Disneyland to the Hollywood Walk of Fame to Mormon Utah, devouring his surroundings just as Calvin Trillin devoured barbecue. Again, you can’t be self-absorbed and that observant.

While we’re at Trillin, I always find him best in the long form (“Remembering Denny,” “About Alice”). Just the thought of Rakoff abandoning installments and cutting loose for 250-ish pages — on anyone, on anything — is exciting. The reality that he hasn’t done so yet leaves us, well, “Half Empty.” Which is more than enough, for now.

Learning to Be Lincoln
By DAVID S. REYNOLDS

THE FIERY TRIAL

Abraham Lincoln and American Slavery

By Eric Foner

Illustrated. 426 pp. W. W. Norton & Company. $29.95

Do we need yet another book on Lincoln, especially in the wake of all the Lincoln volumes that appeared last year in commemoration of the 200th anniversary of his birth? Well, yes, we do — if the book is by so richly informed a commentator as Eric Foner, the DeWitt Clinton professor of history at Columbia. Foner tackles what would seem to be an obvious topic, Lincoln and slavery, and manages to cast new light on it.

Foner has long been deliberating about Lincoln. He is, most recently, the editor of a collection of essays, “Our Lincoln: New Perspectives on Lincoln and His World,” and among his previous books are a seminal one on the rise of the Republican Party, “Free Soil, Free Labor, Free Men,” and another, “Reconstruction: America’s Unfinished Revolution, 1863-1877,” which traces how Lincoln’s fledgling policies toward the defeated South were revised in the decade just after the Civil War.

Having probed the politics of the Civil War era, Foner is in a strong position to offer what amounts to a political biography of Lincoln. His approach in “The Fiery Trial” underscores the usefulness of contextual study. Many of history’s leading figures, from Shakespeare and Beethoven through American presidents to popular entertainers, have been written about endlessly by traditional biographers. But barring the discovery of new letters, long-hidden diaries or the like, fresh information is hard to find about eminent people whose every small motion has been put under the biographical microscope.

Recent years have witnessed books on Lincoln’s marriage, his supposed homosexuality, and his melancholia and occasional temper tantrums. Such books are often fascinating and provocative, but their originality and reliability can vary greatly, since no new cache of private Lincolniana has recently come to light. Fortunately, there’s a way of re-envisioning even the most famous people: by freshly examining their relationship to their historical contexts. The great figures of history, as Melville wrote, “are parts of the times; they themselves are the times, and possess a correspondent coloring.”

Lincoln was no exception. By venturing into Lincoln’s contexts, Foner doesn’t choose the direction of, say, military history or popular culture or sexual mores. Instead, he keeps sharply focused on Lincoln’s political background. This is a wise move since Lincoln was a politician to the core.

Because of his broad-ranging knowledge of the 19th century, Foner is able to provide the most thorough and judicious account of Lincoln’s attitudes toward slavery that we have to date. Historians have long been puzzled by apparent inconsistencies. On the one hand, Lincoln was the Great Emancipator. There’s no reason to doubt his declaration: “I am naturally antislavery. If slavery is not wrong, nothing is wrong.” On the other hand, he had a racist streak. He used the words “nigger” and “darky” in conversation, and he thought that blacks, whom he regarded as physically different from whites, should be deported to Liberia, Central America or somewhere else, since they couldn’t live on equal terms with whites in America. No one was more eloquent than Lincoln in describing the injustice of the institution of slavery; yet rarely did he dwell on the actual sufferings of America’s four million enslaved blacks.

Foner reveals that these contradictions were part and parcel of Lincoln’s upbringing and his participation in party politics. Born in 1809 in the slave state of Kentucky, Lincoln was taken at 7 to live in southwestern Indiana, a region, Foner informs us, that was moderate in its views of slavery but pervaded by racism. Lincoln’s later move to Illinois immersed him in a milieu that coupled tepid antislavery politics with, again, fierce racial prejudice.

Then came Lincoln’s political service in the Whig Party, which contained a range of factions, from fire-eating Southern planters to antislavery New Englanders. Lincoln’s wife, Mary Todd, belonged to a family of slaveholders. His political idol, Henry Clay, was himself a man of contradiction: he was a Kentucky slave owner who accepted the hidebound racial views of the time, yet looked forward to a day when the nation’s enslaved blacks would be emancipated. Outside the party system were abolitionists like William Lloyd Garrison and Wendell Phillips, who were so outraged by slavery that they called for its immediate abolition or, if that didn’t occur, the separation of the North from the South.

Faced with this welter of attitudes, Foner shows, Lincoln steered a middle course. He believed slavery violated America’s basic principles — a view he expressed forcefully and frequently. Still, he was reluctant to take dramatic action against it, unlike some of the radicals within the Whig Party. He remained so devoted to the American Constitution, with its protections of slavery, that he supported (albeit with reluctance) the Fugitive Slave Act of 1850, which imposed stiff penalties on Northerners who assisted runaway slaves. At the same time, he never faltered in his effort to prevent slavery’s western expansion, and he refused to follow party conservatives who were overly conciliatory to the South. When the Republican Party formed in the 1850s, Foner explains, it was Lincoln’s middling position that made him the North’s most attractive presidential candidate in 1860 and helped him keep his wits about him during the tumultuous war years. So dexterously did he navigate the political waters that he could rightly claim credit for bringing about slavery’s abolition.

While appreciatively discussing Lincoln’s moderation, Foner takes an unblinking look at the blots on his record: a court case during his lawyer years when he defended a Southerner trying to repossess a slave family that had claimed its freedom in Illinois; his early opposition to political rights for blacks; his stubborn belief in the need to deport American blacks, even after the scheme had become untenable; his statement that he conducted the war to preserve the Union, regardless of whether slavery survived; and an astonishing remark he once made that held blacks responsible for bringing on the Civil War because of their presence in America.

Foner adeptly contextualizes these unsavory aspects of Lincoln’s history. He points out that only a handful of whites in that era espoused racial attitudes that today would be considered consistently progressive. Racism was rampant, and Lincoln reflected it. Above all, he treasured the American Union. And though he venerated the law, he was willing to use his powers as a wartime president to supersede the law, as when he suspended habeas corpus as part of his effort to crush the Southern rebellion.

Lincoln also exhibited a remarkable ability to alter his attitudes according to circumstance. At first dismissive of the abilities of black people, he came to sincerely admire them during the Civil War and eventually made strides toward endorsing political rights for them. Once staunchly opposed to the immediate abolition of slavery, he was the first president to take action in the cause of emancipation, and in time, of course, he dedicated the war effort to the goal of freedom.

Lincoln once declared that he couldn’t control events; they controlled him. More cogently than any previous historian, Foner examines the political events that shaped Lincoln and ultimately brought out his true greatness.

The Natural
By GORDON MARINO

THE SILENT SEASON OF A HERO

The Sports Writing of Gay Talese

Edited by Michael Rosenwald

308 pp. Walker & Company. Paper, $16

Remarkable writing reaches through the mind and touches the body, sometimes in the form of rainy eyes or a tremulous chin. There are pages in “The Silent Season of a Hero,” a collection of sportswriting by Gay Talese, that will do just that. Consider, for instance, a passage from the title essay, which first appeared in Esquire in 1966 and which some insist is the best piece of sportswriting in history.

It is Sept. 18, 1965, Mickey Mantle Day at Yankee Stadium. Joe DiMaggio has flown in from San Francisco to introduce the man who succeeded him in center field. The throngs are there to cheer Mantle and perhaps dissuade him from retiring. But there are older fans in attendance for whom No. 7 is in essence No. 5, the Yankee Clipper. Talese writes, “One month before, during a pregame exhibition at Old-Timers’ Day, . . . DiMaggio had hit a pitch into the left-field seats, and suddenly thousands of people had jumped wildly to their feet, joyously screaming — the great DiMaggio had returned; they were young again; it was yesterday.”

The son of a tailor, this writer who is himself a meticulous craftsman grew up in the resort town of Ocean City, N.J. Talese’s career as a journalist began when his high school baseball coach asked him to phone in the scores of their games to the local newspaper. He did, adding snippets to leaven his factual accounts. Soon he was reporting for both the school and the town newspaper. After getting a degree in journalism from the University of Alabama in 1953, he became a copy boy at The New York Times. He had a couple of offbeat articles published in the paper and eventually wrote for the sports pages.

This collection of lapidary essays spans seven decades, from 1948 to 2006. Articles from Talese’s teens (which already shimmer with the perspective that he could all but copyright) are pressed together with sketches from The Times and his classic Esquire articles on DiMaggio, Floyd Patterson, Joe Louis and Muhammad Ali. Accompanying this gallery of short and medium-length pieces are brief but deft commentaries from the editor, Michael Rosenwald of The Washington Post, on the development of Talese’s craft, as well as a set of raw notes for a story on the 1979 Yankees. A scribbler who warms to the task of writing about writing, Talese offers an introduction that brims with indirect, authorial advice, but the master classes are surely to be found in the masterworks of this compendium.

In a time when celebrity, scandal and the number of Web hits seem to determine what gets covered and what remains hidden, it is refreshing to encounter Talese’s quiet reportage on the likes of a boxing timekeeper, a blacksmith for horse shows and a dentist who makes mouthpieces for professional athletes.

In a taut 1958 article for The Times, Talese preserves the memories of 93-year-old Billy Ray, the last of the bare-knuckle fighters. Ray retired when gloves became popular and “the game was getting too soft.” There is a sparkling miniature of the future light-heavyweight star José Torres, which does not mention his name until the final paragraph but, in true Talese fashion, immediately registers these facts: “In the closet of his $11-a-week furnished room at 340 Union Street, Brooklyn, he has 8 suits, a dozen silk shirts and 14 pairs of shoes.” However, this esurient eye for detail can, on rare occasions, cloud the larger picture. In his justly famous and rather sunny portrait of a retired Joe Louis, Talese seems dull to the fact that the timbers of Louis’s life are cracking.

Many of these calmly told tales carry an enormous emotional punch. This power is doubly amazing because, for all the intimacy Talese establishes with his subjects, he maintains a Sinatra-like cool detachment. As an adolescent, he helped in his father’s shop and his mother’s adjoining dress store. There, it seems, with people trying on clothes and being fitted, he learned to step into a person’s psychic space and keep himself apart at the same time. Only rarely, as in the poignant Patterson essays (he wrote a whopping 37 of them), do Talese’s feelings bleed through his prose.

In the end, Talese does not so much convince you that he got a person right — whatever that would mean — as make you feel as if you were actually hanging out with him as he practices what he has always humbly termed “the art of hanging out.” Though the athletes he first shadowed and then captured predated even ESPN by a generation or two, Talese’s work remains riveting.

Early on, Talese studied fiction with the strange intention of writing nonfiction, of elevating real life to literary life. Taking note of his way of setting up scenes, his oddly angled story lines and realistic dialogue, Tom Wolfe credited Talese with stirring a revolution in reporting that Wolfe christened the “new journalism.” This pronouncement was neither fiction nor hyperbole. Gay Talese’s outré method of framing and developing his “factual short stories” (as Rosenwald describes them) was as groundbreaking as it is still arresting. As this marvel of an anthology makes manifest, Talese transformed sportswriting into literature that is both serious and delightful.

The Hezbollah Project
By JOE KLEIN

A PRIVILEGE TO DIE

Inside Hezbollah’s Legions and Their Endless War Against Israel

By Thanassis Cambanis

Illustrated. 317 pp. Free Press. $27

There is a striking episode early on in “A Privilege to Die,” Thanassis Cambanis’s well-reported account of the rise of Lebanon’s Hezbollah movement. The 2006 war between Hezbollah and Israel has just begun, a war Hezbollah provoked by crossing the border and kidnapping two Israeli soldiers. Israel has responded with a massive bombing campaign; thousands of Lebanese Shiites are fleeing their villages in the south, traveling north to the relative safety of Beirut. Hezbollah, remarkably, has replaced bombed-out roads and bridges to ease the evacuation. It has organized reception areas for the refugees.

Cambanis, who spent six very productive years reporting from the Middle East for The Boston Globe and who contributes frequently to The New York Times, is one of those rare foreign correspondents more interested in the impact of the carnage on average humans than in military maneuvers or bang-bang. He follows a farm family to their new home — a parking garage in an almost finished Beirut shopping mall. He is given a tour by a former cop named Jihad Lakkis, who is now working security for Hezbollah and has the place completely organized. Hezbollah is providing crates of bottled water, fresh bread, processed cheese and canned tuna. The toilets are inadequate, but there are Hezbollah workers keeping them clean.

“I was struck by the discipline of Jihad and the other operatives as well as the calmness of the arriving refugees,” Cambanis writes. “None of them seemed as terrified or hysterical as I would have expected, especially after I had seen the fear and stress etched on the faces of the much safer Christians and Sunnis in Beirut. . . . Without being asked, each and every one of them delivered a sort of ode” to the Hezbollah leader Hassan Nasrallah “and conviction in the coming divine victory over Israel. Such displays of groupthink are always unnerving, but this one was particularly impressive, because . . . all of them wanted to confide to us their abiding love for Hezbollah and Nasrallah, even when none of their peers was watching.”

A metaphor, of course: Hezbollah has created a national bomb shelter in the southern half of Lebanon. Its followers are fed and cared for, and they are devoted in a cultlike way, even though many have lost family members, homes and their livelihoods because of Hezbollah’s foolish bellicosity. It is a perverse triumph. Cambanis argues, persuasively, that Hezbollah represents the most successful radical Islamist movement in the region. He also tells a story that hasn’t been told with this much attention to detail by a Western reporter before.

Created in the early 1980s, Hezbollah was a joint venture of Israel and Iran. Israel inadvertently provided the motivation with its brutal 1982 invasion of Lebanon and attempt to establish a pro-Israeli puppet government there — undoubtedly, the worst foreign policy decision the Jewish state ever made. Iran, intoxicated by the euphoria of its 1979 revolution, provided the money, military training and equipment to its fellow Shiites in south Lebanon who, up till then, had been a disdained underclass in Lebanon’s polyglot ethnic mash-up. Israel continued to provide the motivation, by occupying a sliver of southern Lebanon until 2000, and Iran — using its Syrian ally as a go-between — continued to provide money and arms. But along the way, an extraordinary thing happened: Hezbollah developed a successful formula for governing the Shiite districts in southern Lebanon. It provided security, social services and an all-encompassing sense of community, with its own schools, scout troops and television station (Al-Manar, which magically remained on the air throughout the 2006 war, even though Israel bombed its facilities 15 times).

There was a fair amount of magic in the Hezbollah project, much of it emanating from its charismatic leader, Nasrallah. Cambanis, with his concentration on the flock, seems to diminish the leader’s importance early on, claiming that it was the devotion of the followers that made Hezbollah unique. For example, he writes, “As important as Nasrallah were the anonymous musicians who churned out new propaganda songs every week.” But his own reporting gives the lie to that.

Nasrallah is an extraordinarily shrewd leader. He lives modestly and has made sacrifices for the cause; he lost his oldest son in the war. He can be funny and self-deprecating in public. He has an “almost erotic” appeal for his followers, many of whom are afflicted by an eschatological delusion (the return of the Mahdi) that is remarkably similar to the Christian Rapture myth. Nasrallah’s rhetoric is fierce and his anti-Semitism flagrant, but, Cambanis writes, he has none of the pomposity that characterizes the family dynasties in the rest of the region. He makes smart decisions — refusing to take vengeance on those who collaborated with the Israelis during their occupation; allowing a looser, more permissive form of Islam to Lebanon’s Mediterranean sunbathers and beer-drinkers than his Iranian sponsors permit. And, most important of all, he is an ingenious marketer, especially in his ability to redefine success: victory is survival.

In the 2006 war, Lebanon lost roughly 10 times as many people as Israel (1,191 deaths compared with 163), suffered huge destruction and dislocation — and yet, when peace came, Nasrallah declared “divine victory” and much of the world seemed to buy it; even Israel, accustomed to flattening its Arab foes, suffered a period of introspection afterward, initiating official studies of the war, trying to figure out why Hezbollah hadn’t been obliterated. The “loss” led eventually to the election of Benjamin Netanyahu’s right-wing coalition, which suited Nasrallah just fine: the more extreme Israel is, the better (though Hezbollah did choose to stand down, after some internal debate, and refrained from joining the festivities when Hamas, in a copycat war, provoked Israel into the Gaza conflagration in 2009). With Iran’s help — $12,000 allocated to each displaced family — Hezbollah’s brilliant postwar marketing campaign was called “Better Than Before,” with special attention paid to rebuilding and expanding the communities near the Israeli border that had been flattened.

Cambanis is strongest reporting on the things he’s actually seen, especially the individual Hezbollah fighters he meets, like the martyr Rani Bazzi, and others like Inaya, the hospital worker, and Dergham Dergham, his alcohol-swilling, motorcycle-riding, Hezbollah-supporting translator. He gets weaker as he attempts to fly higher and put Hezbollah in a larger context. He says several times that Hez­bollah has had a profound impact on radical Islamist movements in the region, which is probably true, but he doesn’t do any of the meticulous reporting necessary to prove the point. He also fails to put Lebanese Hezbollah in the context of Iran’s larger terrorist network — which includes Saudi Hezbollah and a surprisingly active Latin American wing. Who runs those? How does Hezbollah fit into Iran’s Revolutionary Guard Quds Force structure?

Cambanis can’t seem to make up his mind as to whether the 2006 war really was a victory or defeat for Hezbollah, or whether Nasrallah will simply follow orders and attack Israel if the United States and Israel launch a pre-emptive strike on Iran’s nuclear facilities. These problems are compounded by sloppy, repetitive writing. By my count, Cambanis says that Hezbollah is the most dynamic force in the Middle East no fewer than four times in the first 15 pages — an annoying opening, but one forgotten as the book unfolds and the excellence of the reporting becomes manifest.

In the end, this is a valuable account. Hezbollah has found a supple and sophisticated extremist formula — the combination of social services, an aura of incorruptibility, jihad and inspiration — that can and may well be replicated throughout the region. It also represents an alternative value system, popular and horrifying, to the freedom proselytized as “God’s gift to humanity” by George W. Bush. As such, Hezbollah, sadly, may prove over time to be the strongest indigenous response to the colonial hubris visited upon the Middle East by Western powers since the end of World War I.

Summer of ’44
By LEAH HAGER COHEN

NEMESIS

By Philip Roth

280 pp. Houghton Mifflin Harcourt. $26

I wrote Roth off. Back in my early 20s, in a fit of literary conscientiousness, I undertook to sample his work. At the point when my nose could wrinkle no further in distaste, I was struck by a relieving epiphany — “Oh: these are for boys” — upon which I resumed my reading life unburdened by any expectation of venturing deeper into Rothiana.

Until the Book Review offered me this assignment. So unlikely, so ill-conceived a pairing, I thought it must be an error. Yet no sooner had I restored my jaw to its rightful position than I found myself accepting, and an instant later my office was empty save for a few speed lines, as in a comic strip: I’d high-tailed it to the library in order to begin remediating my embarrassing literary gap, a cause to which I devoted — with a kind of mounting, marveling pleasure — much of this past summer.

All of which is to say: Before you stands a convert. I come to swallow the leek.

But first there is the matter of the cause of my early repulsion, relevant because surely not unique. The trouble for me with Philip Roth’s fiction wasn’t so much the sex thing, or even the sexism thing (although, perhaps especially in the case of a young female reader, one might reasonably expect a barrier to enjoyment to rise from the surfeit of all those women-as-orifices, women-as-booby-traps, women-as-willing-stand-ins for whatever his protagonists are so driven to shtup). The trouble was what seemed to be the curdling vein of hostility and nihilism in the prose. Why, I wondered, if the guy’s so anti-everything, does he keep bothering to write?

From the vantage point of two decades and thousands of pages of Roth later, I don’t think it’s a bad question. My mistake was asking it rhetorically. If treated as a point of real inquiry, the question affords an opening, a way of reading and being reached by the work. For a writer so generously endowed in the irony department, Roth turns out to be astonishingly earnest. We see this in his excesses — not merely the prolificacy of his output, but the outrageousness of his characters’ offenses, their deeds, appetites, shames and confessions. Steaming along on the twin engines of intellect and humor (and what engines — horsepower through the roof), the novels transport us or run us over or both. His characters sometimes get caught up in a kind of Socratic Möbius strip, endlessly debating one another and themselves in a way that can verge on the tedious, but even then one cannot but marvel at his sheer energy, his unremitting investment in — what? Provocation. Interrogation. The feat of living. This is not a nihilist. This is a writer whose creative work lays bare the act of struggle.

For all that, what heat his previous novels give off is the heat of friction, of conflagration. His newest, “Nemesis,” stands out for its warmth. It is suffused with precise and painful tenderness. Set mostly in 1944 Newark, it tells the story of Bucky Cantor, at 23 a freshly minted phys ed teacher and summertime playground director. Life’s dealt him some blows: his mother died in childbirth; his father, a thief, exited the picture long ago. Worse, to his anguish and disgrace, Bucky’s poor vision keeps him from going to fight the Germans alongside his best buddies — alongside, for that matter, “all the able-bodied men his age.”

But life’s dealt him blessings, too, prominent among these the grandparents who raised him: the immigrant grandfather who encouraged him “to stand up for himself as a man and to stand up for himself as a Jew,” and his grandmother, a “tenderhearted little woman” with whom he is living in their tenement flat when the story begins. He has a girl he loves, prospective in-laws thrilled to welcome him into their family and solid aspirations of becoming a high school athletic coach. He’s blessed, too, with an awareness of his blessings, a sense not only of gratitude for them but also of the obligation they confer, and it emerges that this sense of obligation is what allows him to withstand his disappointments; indeed, to flourish within his circumscribed world.

He inhabits the role of playground director with a combination of enthusiasm and dignity that makes him, in the eyes of the children, “an outright hero,” and Bucky’s goals are no less exalted. “He wanted to teach them what his grandfather had taught him: toughness and determination, to be physically brave and physically fit and never to allow themselves to be pushed around or, just because they knew how to use their brains, to be defamed as Jewish weaklings and sissies.”

The school playground becomes Bucky’s Fort Dix, his Normandy landing. And when a polio outbreak hits the city, his sense of duty swells: “This was real war too, a war of slaughter, ruin, waste and damnation, war with the ravages of war — war upon the children of Newark.”

Too decent to relish the chance to serve on the front lines of this battle, too honest not to acknowledge his own fear, Bucky rises to the occasion as best he can, which is to say with seriousness, compassion, bewilderment and anger. Not the sharpest of Roth’s protagonists (he lacks introspection, possesses “barely a trace of wit”), he has an appealing capacity to apprehend happiness. Given its pall of war and disease, “Nemesis” is surprisingly dense with happiness — a happiness that’s ever-tenuous, and the sweeter for it. The word “happy” crops up immoderately throughout, and happiness is made manifest in stirringly specific moments (eating a peach on a hot night, observing a butterfly sip sweat from bare skin). The architecture of Roth’s sentences is almost invisibly elegant; not only doesn’t one notice the art, one barely notices the sentence, registering instead pure function: meaning, rhythm, intent.

Is it impertinent to suggest Roth outdoes himself here by getting out of his own way? This short book has all his brilliance, minus the bluster. And it’s a love story. I’m not thinking of Bucky and his girl, but of the narrator and Bucky. Roth achieves something strange and good here with point of view. From the outset, the narration is evocative of a Greek chorus, at once communal and all-knowing. More than a hundred pages go by before we discover who is telling the story. And even then, until very near the end, I persisted in believing that the narrator was somehow omniscient, speaking perhaps from beyond the grave, as in another of Roth’s recent novels, “Indignation.” There and here, one feels him exploring what role memory and narrative might play in salvaging meaning from a life.

In the end, we learn that Bucky, who had so wished to serve and to save, has been crippled by his own sense of decency. His eventual determination to disallow himself happiness runs deep; he rebuffs the narrator’s attempts to persuade him otherwise. Yet in the final shining pages, the narrator does restore Bucky to happiness, not by changing the man, but by doing what a storyteller can: conjure a moment from the past and fix it for all time.

Anatomy of an Uprising
By ALAN BRINKLEY

GIVE US LIBERTY

A Tea Party Manifesto

By Dick Armey and Matt Kibbe

Illustrated. 265 pp. William Morrow/HarperCollins Publishers. $19.99

BOILING MAD

Inside Tea Party America

By Kate Zernike

Illustrated. 243 pp. Times Books/Henry Holt & Company. $25

THE WHITES OF THEIR EYES

The Tea Party’s Revolution and the Battle Over American History

By Jill Lepore

207 pp. Princeton University Press. $19.95

Trying to describe the ideas of the Tea Party movement is a bit like a blind man trying to describe the elephant. The movement, like the elephant, exists. But no one, not even the Tea Partiers themselves, can seem to get their hands around the whole of it. Jonathan Raban, who attended the first National Tea Party Convention and wrote about it in The New York Review of Books, was struck by how little agreement there was among “members” as they talked about their grievances and aspirations. And yet the Tea Party uprising is the most visible and energized political phenomenon of the last year. Whether it will remain a viable movement remains to be seen, but it shows no sign yet of running out of steam.

What lies behind the Tea Party movement? Some of it is purely partisan. It has close ties to the Republican Party. It is opportunistically promoted by Fox News. One of its best-known leaders is Dick Armey, former Republican majority leader in the House of Representatives, who spent much of the last year or so promoting the new movement through FreedomWorks, an organization he helped to create. Its program is presented in “Give Us Liberty,” by Armey and the group’s president, Matt Kibbe. It is a simple set of goals, consistent with those professed by Republicans over the last 25 years: “lower taxes, less government, more freedom.” But if that were all there were to the Tea Partiers, they would be indistinguishable from the Republican Party itself.

Kate Zernike, a national correspondent for The New York Times, has interviewed a number of Tea Partiers in an effort to understand what they believe and what they want. Her book, “Boiling Mad: Inside Tea Party America,” is an anecdotal description of the movement, supplemented by an April 2010 New York Times/CBS News poll (already a generation away from the fast-moving character of the insurgents). Her interviews, too few to be of any statistical significance, are nevertheless illuminating as a picture of how different some Tea Partiers are from the Republican establishment’s view of the movement.

At some points, Armey describes the Tea Partiers as loyal Republicans who do not want to divide the party. At others, he talks about a “hostile takeover,” although he is not clear about what that would mean. Zernike’s interviews reveal that some of the most outspoken Tea Partiers are disgusted with both major parties. They are enraged by the fecklessness of Wall Street (and the huge bailout in 2008). They argue that cutting government costs (without raising taxes) is the only way to reduce the frightening deficit. They insist that almost everyone in power is corrupt and out of touch with the public. And they believe that the Constitution has been perverted by liberal judges and academics and the political world in general — stretching its meaning well beyond what the founders intended. The people Zernike interviewed rarely expressed bigotry, prejudice or racism, but there are many self-identified Tea Partiers who detest immigration and fear the prospect of an America in which white people will be a minority. Older white men, who seem to constitute the majority of the movement, often rally around the cry “Take Back Our Country.” There is little doubt as to whom they wish to take the country back from.

Jill Lepore, a historian of the American Revolution and a staff writer at The New Yorker, has written a brief but valuable book, “The Whites of Their Eyes: The Tea Party’s Revolution and the Battle Over American History,” which combines her own interviews with Tea Partiers (mostly from her home state, Massachusetts) and her deep knowledge of the founders and of their view of the Constitution. The architects of the Constitution, she makes clear, did not agree about what it meant. Nor did they believe that the Constitution would or should be the final word on the character of the nation and the government. It was the product of much compromise, and few were satisfied with all its parts.

There were enormous omissions — among them the failure to define citizenship, the lack of a clear definition of suffrage, the evasion of most of the issues connected to African-Americans and Native Americans. Jefferson insisted that “laws and institutions must go hand in hand with the progress of the human mind.” Madison asked in Federalist 14, “Is it not the glory of the people of America, that, whilst they have paid a decent regard to the opinions of former times and other nations, they have not suffered a blind veneration for antiquity, for custom, or for names, to overrule the suggestions of their own good sense, the knowledge of their own situation, and the lessons of their own experience?” The reality of the creation of the Constitution is a far cry from the idea that it instituted immutable limits to what government could do.

Listening to the many and diverse demands and ideas that the Tea Partiers express in their rallies, pamphlets and oratory does relatively little to explain why so many Americans are so angry. After all, most of those railing against government deficits (mostly created by Reagan and Bush tax cuts), protesting against taxes (the rates on the top income bracket are lower than at any time since before World War II, with the exception of a brief period two decades ago), and complaining about violations of the Constitution were, only a few years ago, much less concerned about these and many other issues that now loom so large in their vision of the future. Without the economic crisis, these same issues would remain unaddressed. Similar outbreaks of outrage and blame have accompanied most major economic crises over the last century and more. The populist movement during the adversities of the 1890s spawned the People’s Party, a powerful but short-lived organization based on hostility to corporate malfeasance and the gold standard. The Great Depression produced multiple movements that reviled the power of bankers and the concentration of wealth. In both cases, as in our own time, the movements soon became immersed in innumerable other grievances and prejudices.

We should not be surprised that so many Americans are angry. Almost four decades of growing inequality have left most of them no better off than they were in 1970, and many worse off. The recklessness and greed of much of the financial world — the principal causes of the crisis — have done far more damage than taxes or the deficit. The corruption and dysfunction of Congress and much of the rest of the government have disillusioned many. Everyone should be angry about these injustices, even if no one has proposed a workable solution to them. The Tea Partiers are right to be angry. But the objects of their outcries — taxes, deficits, immigration and supposed violations of the Constitution — are of far less consequence than the great failures that plague the nation.

If Walls Could Talk
By DOMINIQUE BROWNING

AT HOME

A Short History of Private Life

By Bill Bryson

Illustrated. 497 pp. Doubleday. $28.95

Many adults have a fantasy that if they could go back to college — now that the desire to party, drink and sleep around has faded to a burnished memory — they’d get so much more out of it. The publishing industry often reflects this wish. Every season brings offerings that are right at home on anyone’s continuing-ed syllabus: innovative, original ways to study world history through lenses trained on the minutiae of salt or cod, earthworms or spices, tea or telephones. Now, finally, for those of us who wrestled with Rocks for Jocks, pined amid Physics for Poets and schlepped through college on 101s of any and every subject — the beloved survey courses — here’s that most popular professor, Bill Bryson, with a fascinating new book, “At Home: A Short History of Private Life.”

Bryson is best known for “A Short History of Nearly Everything,” which took a cosmic perspective on the creation of the place we call home, our planet — no, make that our solar system — and created a run on yellow highlighters. Why he insists on calling these histories “short” is beyond me, when each runs to more than 450 pages. Perhaps they’re short when compared with the stacks of tomes that have to be ingested, digested and egested in order to produce them? With “At Home,” Bryson’s focus is domestic; he intends, as he puts it, to “write a history of the world without leaving home.” You can take this class in your pajamas — and, judging by the book’s laid-back, comfy tone, I have a sneaking suspicion that Bryson wrote much of it in his.

Or he should have. Pajamas might be one of the few subjects not covered in “At Home.” Bryson’s conceit is nifty, providing what business majors might recognize as a “loose-tight” management structure, flexible enough to maintain a global scope without losing track of the mundane. Join this amiable tour guide as he wanders through his house, a former rectory built in 1851 in a tranquil English village. “At Home” takes off from the second half of the 19th century, when, Bryson reminds us, “private life was completely transformed. . . . It is almost impossible to conceive just how much radical day-to-day change people were exposed to.”

Moving from room to room, talking while we walk, please notice that the backs of those antique parlor chairs are never upholstered. They were kept against the wall “to make it easier to walk through rooms without tripping over furniture in the dark.” Now admire the suits hanging in the closet: “When buttons came in, about 1650, people couldn’t get enough of them and arrayed them in decorative profusion on the backs and collars and sleeves of coats, where they didn’t actually do anything. One relic of this is the short row of pointless buttons that are still placed on the underside of jacket sleeves near the cuff.” These, Bryson insists, “have always been purely decorative and have never had a purpose.” (Actually, I can think of a few men for whom the purpose of those buttons is to leave them unbuttoned, a sartorial display of status that only those with custom-tailored suits will recognize.)

A trip to the larder involves a discussion of the servant classes. In the dining room we learn about vitamins (not even a word until 1912). As we move to the second floor, you may be interested to learn that stairs are very dangerous places; they “rank as the second most common cause of accidental death, well behind car accidents, but far ahead of drownings, burns and other similarly grim misfortunes.” A visit to the cellar is the occasion for a survey of building materials in the Western world — stone, wood, brick, concrete. And we can’t talk about hydraulic cement without going into the history of the Erie Canal.

Reach into the freezer for a pint of Ben & Jerry’s “Everything but the . . .” and there you’ll find the history of ice, beginning with the blocks that were carved out of Wenham Lake in Massachusetts in 1844 and shipped to London. (“For several decades, ice was America’s second-largest crop, measured by weight.”) Consider the contents of the bedroom: “It has been calculated that if your pillow is six years old (which is the average age for a pillow), one-tenth of its weight will be made up of sloughed skin, living and dead mites, and mite dung.” This may be more than you want to know, but one thing is clear: Climate change notwithstanding, the history of pestilence, plague and the potty — not to mention sewage — should make us glad to be living in the here and now.

Throughout “At Home,” I kept thinking, “Who knew?” But many’s the time I realized, actually, I did know. If you have any interest in furniture, food, fashion, architecture, energy or world history, chances are you’ve stumbled across some (or all) of the information Bryson has on offer. Countless books have been written on every subject covered in “At Home”; many are credited in the ample bibliography. But while Bryson may not have done much original research, it takes a very particular kind of thoughtfulness, as well as a bold temperament, to stuff all this research into a mattress that’s supportive enough to loll about on while pondering the real subject of this book — the development of the modern world. Very few of us know much about everything, and the appeal of Bryson’s sprawling history is in the warp of the material as much as its weft — in the way the whole thing is strung before it is woven. (And speaking of weaving, in the 17th century “India dominated the cotton trade, as we are reminded by the endless numbers of words that came into English from there: khaki, dungarees, gingham, muslin, pajamas, shawl, seersucker, and so on.”)

Bryson has mastered the art of making sweeping generalizations and bold pronouncements. With the discovery of how to extract rock oil (what we now call petroleum) on an industrial scale, George Bissell “changed the world completely and forever.” The realization that land did not have to be rested between plantings also “changed the world.” “The history of early America is really a history of coping with shortages of building materials.” “If you had to summarize it in a sentence, you could say that the history of private life is a history of getting comfortable slowly.” About guano, used as a fertilizer, “suddenly there was nothing in the world people wanted more.” It must be hard to keep track of so many things. Then again, word processing, and with it the ability to cut and paste with ease, has also changed the world.

“At Home” is baggy, loose-jointed and genial. It moves along at a vigorously restless pace, with the energy of a Labrador retriever off the leash, racing up to each person it encounters, pawing and sniffing and barking at every fragrant thing, plunging into icy waters only to dash off again, invigorated. You do, somehow, maintain forward momentum and eventually get to the end. Bryson is fascinated by everything, and his curiosity is infectious. As a reviewer, I ought to be concerned with matters of focus and organization. Bryson himself seems to have had moments of anxiety on the matter. “We might pause here for a moment,” he writes, about midway through, “to consider where we are and why.”

Indeed. We have wondered, many times, just exactly where we are and why. By the time he thinks to do a head count, we have just examined the skeleton of the Statue of Liberty, watched the erection of the Eiffel Tower and met the nouveaux riches of the Gilded Age — while standing in the gloomy recesses of a windowless corridor. No matter. Bryson’s enthusiasm brightens any dull corner. I recommend that you hand over control and simply enjoy the ride. You’ll be given a delightful smattering of information about everything but, weirdly, the kitchen sink.

The Beat Generation and the Tea Party
By LEE SIEGEL

The counterculture of the late 1950s and early 1960s appears to be everywhere these days. A major exhibition of Allen Ginsberg’s photography just closed at the National Gallery in Washington. A superb book, by the historian Sean Wilentz, about Ginsberg’s dear friend and sometime influence Bob Dylan recently made the best-seller list. “Howl,” a film about Ginsberg and the Beats, opened last month. And everywhere around us, the streets and airwaves hum with attacks on government authority, celebrations of radical individualism, inflammatory rhetoric, political theatrics.

In other words, the spirit of Beat dissent is alive (though some might say not well) in the character of Tea Party protest. Like the Beats, the Tea Partiers are driven by that maddeningly contradictory principle, subject to countless interpretations, at the heart of all American protest movements: individual freedom. The shared DNA of American dissent might be one answer to the question of why the Tea Partiers, so extreme and even anachronistic in their opposition to any type of government, exert such an astounding appeal.

Of course, on the surface the differences between “Beat” and “Tea Party” are so immense as to make comparisons seem frivolous. The Beats, though pacifist, were essentially apolitical. (Kerouac’s hatred of the left at the end of his life seemed most of all to be a revulsion against the New Left’s enthusiastic hating.) Their aims were spiritual and sexual liberation, and a unifying wholeness with nature. Insofar as they had sociopolitical ambitions, their goals — abolishing censorship, protecting the environment, opposing what Ginsberg called “the military-industrial machine civilization” — were the stuff of poetry, not organized politics. In contrast, the Tea Partiers seek the political objectives of “individual liberty, limited government and economic freedom.” Balancing the budget and rejecting cap and trade are their hearts’ desires, not sexual revolution or the quest for spiritual harmony through the use of Zen meditation and hallucinogenics.

Still, American dissent turns on a tradition of troublemaking, suspicion of elites and feelings of powerlessness, no matter where on the political spectrum dissent takes place. Surely just about every Tea Partier agrees with Ginsberg on the enervating effect of the liberal media: “Are you going to let our emotional life,” he once wrote, “be run by Time magazine?”

More seriously, the origin of the word “beat” has a connection to the Tea Partiers’ sense that they are being marginalized as the country is taken away from them. According to Ginsberg, to be “beat” most basically signified “exhausted, at the bottom of the world, looking up or out . . . rejected by society.” Barack Obama meant much the same thing when, during the presidential primaries, he notoriously said that “in a lot of these communities in big industrial states like Ohio and Pennsylvania, people have been beaten down so long, and they feel so betrayed by government.” That he went on to characterize such people as “bitter” souls who “cling to their guns or religion or antipathy toward people who aren’t like them” only strengthened the anxiety among proto-Tea Partiers that they were about to be “rejected by society.”

It’s too bad that the movie “Howl” reduces the sociopolitical meaning of the Beats to the obscenity trial that took place in San Francisco in 1957, when Lawrence Ferlinghetti stood accused of printing and selling “Howl,” Ginsberg’s explosively profane long poem. Hollywood loves self-righteously to portray now-unchallenged liberal causes under siege, even though in this case the cause of free speech was vindicated when the presiding judge ruled that “Howl” was a work of “redeeming social importance” and that Ferlinghetti was innocent. What the movie should have spun out into its own subplot was the fact — never mentioned in the film — that the judge, W. J. Clayton Horn, was a conservative jurist locally renowned for his Sunday-school Bible classes. Horn might well have been as much an outsider in San Francisco’s sophisticated social circles as Ferlinghetti and Ginsberg were in the eyes of the law. It takes an outsider to know an outsider.

Or perhaps Horn had a glimpse of the future. The eventual assimilation of Beat hedonism ensured that by the end of the millennium, white middle-class Christians like him would themselves be marginalized — at least by the dominant culture — as the “silent majority.” (Is the commercialization of Beat values why the film “Howl” mischievously casts Jon Hamm, who plays the boozing, womanizing, yet respectable advertising executive Don Draper in “Mad Men,” as Ferlinghetti’s defense lawyer?)

When the Tea Party came along, however, the silent majority started to get its voice back. Liberals could well be drawn nostalgically to the Beats nowadays because all the countercultural energy belongs to the other side. “When will you be worthy of your million Trotskyites?” Ginsberg asked his fellow Americans in his poem “America.” The Tea Party has an answer to that rhetorical question. A former community organizer might be in the White House, but the Tea Partiers taking to the streets are now the ones supposedly influenced by Saul Alinsky’s Trotskyish “Rules for Radicals,” not the liberals who watch horrified and silent from the sidelines.

Then again, the Beats were as much at odds with the liberals of their time as the Tea Partiers are with the liberals of today. The same liberal air of elite-seeming abstraction that provokes the Tea Partiers drove the Beats around the bend. For the Beats, liberals were part of the power structure: they spoke loftily about conscience and social obligation yet lived comfortably within the plush boundaries of universities, law firms and financial institutions. Worst of all, they accepted the government’s role in organizing their lives. Indeed, in the secret file the F.B.I. kept on him, Ginsberg was described by J. Edgar Hoover himself as having a dangerous “antipathy” toward government. Against the liberals’ seeming complicity with the status quo, the Beats took to the road in quest of what Jack Kerouac (quoting Oswald Spengler) called a “second religiousness” within Western civilization. With their noisy commitment to their churches, the Tea Partiers also seem to want their religious communities to take the place of government in their lives. They would certainly sympathize with Ginsberg’s antipathy.

Perhaps this mutual feeling of cultural exile is why some Tea Partiers share with the Beats a reverence for the power of imprecation — in the matter of unbridled speech, they would have been, with Judge Horn, on the side of Ginsberg and Ferlinghetti. True, the Tea Partiers’ unnerving habit of bringing guns to town-hall meetings would have repelled the Beats. But William S. Burroughs fetishized guns, accidentally killing his wife while trying to shoot a glass off her head. Violence, implicit or explicit, comes with the “beaten” state of mind. So does theatricality, since playing roles — and manipulating symbols — is often the first resort of people who do not feel acknowledged for being who they really are. As the movie “Howl” vividly shows, Ginsberg didn’t merely write poetry, and he didn’t simply recite it. He turned his poetry readings into theatrical performances of Dionysian proportions. Some people might say the difference between Allen Ginsberg and Glenn Beck is the difference between psychedelic and psychopathic, but Beck might well envy Ginsberg’s attempt, in 1967, to help Abbie Hoffman and a band of antiwar protesters levitate the Pentagon by means of tantric chanting, though Beck would no doubt concentrate his telepathic efforts on the I.R.S.

American freedom is a many-splendored thing, and multifaceted too. “We drove in his old Chevy,” Kerouac says, with portentous joy, in “On the Road.” In the course of the exuberant tirade that gave birth to the Tea Party, Rick Santelli of CNBC referred to the ’54 Chevy, “maybe the last great car to come out of Detroit.” That might be as close to a convergence of different ideas of American freedom as our tortured polity will ever come.

Getting to Five
By DAHLIA LITHWICK

JUSTICE BRENNAN

Liberal Champion

By Seth Stern and Stephen Wermiel

Illustrated. 674 pp. Houghton Mifflin Harcourt. $35

The story of this book is really two stories, and only one of them is about Supreme Court Justice William J. Brennan Jr. As Seth Stern and Stephen Wermiel explain in an authors’ note, Brennan first agreed to cooperate on a biography with Wermiel, then the Supreme Court correspondent for The Wall Street Journal, in 1986. Brennan granted more than 60 recorded interviews in chambers and gave Wermiel access to tens of thousands of pages of records, including case files, correspondence and the coveted case histories his clerks had prepared each term. Wermiel completed part of the book before Brennan’s death in 1997, then, as he explains, put the project aside. He came under criticism, most notably from Jeffrey Toobin of The New Yorker, for his unparalleled access to documents closed to other researchers and for “what little he has managed to make of it.” That was in 2004. In 2006, Wermiel brought Stern, of Congressional Quarterly, to the project and, their note continues, it was Stern who reorganized the vast amounts of material and drafted most of the chapters.

“Justice Brennan” provides the most comprehensive and well-organized look at the legendary liberal jurist to date. Stern and Wermiel dig below the popular cliché of Bill Brennan as the Constitution’s Gene Kelly — all twinkling eyes and glad-to-see-ya Irish charm — to reveal the complicated (and quite conservative) man beneath. But the book is weakened by the long delay from conception to completion. With the release in recent years of, among other things, Lewis Powell’s, Harry Blackmun’s and some of Ruth Bader Ginsburg’s papers, plus the access granted Jim Newton to some of Brennan’s case histories for his 2006 book on the Warren court, “Justice Brennan” is very late to the party. There is a sense, at times, that Brennan has slipped away.

The many interviews with the justice don’t always illuminate matters. They reveal a Brennan who frequently recast interpersonal conflicts into neat, mutually satisfactory resolutions (like the episode of a 1966 clerkship he offered, then rescinded, to a Berkeley student with controversial political views). The interviews were also very likely limited by the fact that despite his gregarious exterior, Brennan was intensely private about everything from his wife Marjorie’s longstanding drinking problem to his persistent financial woes. The authors quote an early colleague of Brennan’s who says “he never told you anything about himself,” and later, a former clerk explaining that “so much of who he really was was covered up by the backslapping leprechaun exterior.” It seems that throughout his life, Brennan made an art of being opaque.

The burning question has always been whether Brennan’s influence on the Warren court — which engendered a revolution that has yet to be fully reversed all these years later — was as dramatic and outsized as we’ve been led to believe. In “The Brethren,” Bob Woodward and Scott Armstrong described Brennan glad-handing and horse-trading his way to one victory after another, a depiction Brennan resented for portraying him as the archetypal “Irish ward boss.”

In the decades since, Brennan has come to be seen as an epic strategist and deal-maker who coordinated many of the Warren court’s major decisions behind the scenes. Where this book truly soars is in its account of Brennan’s skills at — as he always described it to his clerks — getting to five: finding a way to string together five fractious votes for some new principle or doctrine, or seeding some future principle or doctrine between the lines. It’s clear from this biography that what Brennan did wasn’t alchemy, even when it wasn’t always perfectly principled. He emerges as so carefully attuned to the concerns and passions of his colleagues that he was able, time after time, to draft opinions, or help them draft opinions, in ways that could achieve five votes.

But if “Justice Brennan” clarifies how Brennan accomplished what he did, it sometimes disappoints when it attempts to explain why. It’s not clear what animated this man, what drove and moved him, and while the authors show us a Brennan occasionally at odds with his own doctrine — a champion of the free press who deeply mistrusted reporters; an advocate of women’s rights terrified by the idea of female justices, or even clerks; a Roman Catholic less interested in attending Mass than in securing the church’s approval — readers will find it difficult to understand such contradictions. Brennan was obviously influenced by his father, who rose from a poor Irish immigrant childhood to become a beloved city commissioner in Newark. Brennan was so determined to please his parents that he kept his marriage a secret for years for fear they would disapprove of his having wed before he could afford to support a wife. His service in World War II (and the death of his brother Charlie) affected him deeply, as did his complicated relationship with the church, although again, one is hard pressed to know how these early experiences shaped him or his jurisprudence.

Time and again, the sense one gets is that Brennan was sensitive to the needs of the people around him, a kind of genius at providing what they wanted. At his Newark law firm Brennan represented business interests in labor disputes, although his father had at one time been one of the city’s most famous union leaders. At 76, he married his secretary, Mary Fowler, only months after Marjorie’s death, at which point his lifestyle changed completely and he began to travel and speak again after years of self-imposed isolation. There is no doubt Brennan was a committed liberal, and that on some matters — like the death penalty — his liberalism was a deeply felt conviction. But his story also raises questions about whether he in fact spearheaded the Warren court revolution for himself, for the country or because it was something that mattered deeply to Chief Justice Earl Warren.

“Justice Brennan” is a scrupulously fair book, and readers looking for hagiography will be dismayed by Brennan’s views of women (he once declared that were a woman ever nominated to the high court, he might have to resign). Similarly, Brennan’s crushing disappointment in the performance of his famed colleague Thurgood Marshall will bother liberals who revere Brennan and Marshall as equal partners in a crucial period. “What the hell happened when he came on the court, I’m not sure,” Brennan said, “but he doesn’t seem to have had the same interest.” Anyone who wants their liberal heroes to hold bone-deep progressive views will be frustrated by a Brennan who hated pornography but ruled it was permissible, and disapproved of abortion but protected it. The membrane between the great jurist’s interior and doctrinal life may have been, in the end, more substantial than the membrane between himself and the rest of the world. Ultimately, “Justice Brennan” is a far more informative account of what the man achieved than why he did it. It seems clear he would have preferred it that way.
