CHAPTER ONE
THE NEW ROME
The Decaying City
 
 
 
The form was still the same, but the animating health and vigor were fled.
—Edward Gibbon, The History of the Decline
and Fall of the Roman Empire
(1776–1789)
 
Picture a man of the late nineteenth century, perhaps your own great-grandfather, sitting in an ordinary American home of 1890. And now pitch him forward in an H. G. Wells machine, not to our time but about halfway—to that same ordinary American home, circa 1950.
Why, the poor gentleman of 1890 would be astonished. His old home is full of mechanical contraptions. There is a huge machine in the corner of the kitchen, full of food and keeping the milk fresh and cold! There is another shiny device whirring away and seemingly washing milady’s bloomers with no human assistance whatsoever! Even more amazingly, there is a full orchestra playing somewhere within his very house. No, wait, it’s coming from a tiny box on the countertop!
The music is briefly disturbed by a low rumble from the front yard, and our time-traveler glances through the window: a metal conveyance is coming up the street at an incredible speed—with not a horse in sight. It’s enclosed with doors and windows, like a house on wheels, and it turns into the yard, and the doors open all at once, and two grown-ups and four children all get out—just like that, as if it’s the most natural thing in the world! He notices there is snow on the ground, and yet the house is toasty warm, even though no fire is lit and there appears to be no stove. A bell jingles from a small black instrument on the hall table. Good heavens! Is this a “telephone”? He’d heard about such things, and that the important people in the big cities had them. But to think one would be here in his very own home! He picks up the speaking tube. A voice at the other end says there is a call from across the country—and immediately there she is, a lady from California talking as if she were standing next to him, without having to shout, or even raise her voice! And she says she’ll see him tomorrow!
Oh, very funny. They’ve got horseless carriages in the sky now, have they?
What marvels! In a mere sixty years!
But then he espies his Victorian time machine sitting invitingly in the corner of the parlor. Suppose he were to climb on and ride even farther into the future. After all, if this is what an ordinary American home looks like in 1950, imagine the wonders he will see if he pushes on another six decades!
So on he gets, and sets the dial for our own time.
And when he dismounts he wonders if he’s made a mistake. Because, aside from a few design adjustments, everything looks pretty much as it did in 1950: the layout of the kitchen, the washer, the telephone.... Oh, wait. It’s got buttons instead of a dial. And the station wagon in the front yard has dropped the woody look and seems boxier than it did. And the folks getting out seem ... larger, and dressed like overgrown children.
And the refrigerator has a magnet on it holding up an endless list from a municipal agency detailing what trash you have to put in which colored boxes on what collection days.
But other than that, and a few cosmetic changes, he might as well have stayed in 1950.
Let’s pause and acknowledge the one exception to the above scenario: the computer. Instead of having to watch Milton Berle on that commode-like thing in the corner, as one would in 1950, you can now watch Uncle Miltie on YouTube clips from your iPhone. But be honest, aside from that, what’s new? Your horseless carriage operates on the same principles it did a century ago. It’s added a CD player and a few cup holders, but you can’t go any faster than you could fifty years back. As for that great metal bird in the sky, commercial flight hasn’t advanced since the introduction of the 707 in the 1950s. Air travel went from Wilbur and Orville to biplanes to flying boats to jetliners in its first half-century, and then for the next half-century it just sat there, like a commuter twin-prop parked at Gate 27B at LaGuardia waiting for the mysteriously absent gate agent to turn up and unlock the jetway.
Other arenas aren’t quite as static as the modern American airport, but nor do they move at the same clip they used to. When was the last big medical breakthrough? I mean “big” in the sense of something that takes a crippling worldwide disease man has accepted as a cruel fact of life and so clobbers it that a generation on nobody gives it a thought. That’s what the polio vaccine did in 1955. Why haven’t we done that for Alzheimer’s? Today, we have endless “races for the cure,” and colored ribbons advertising one’s support for said races for the cure, and yet fewer cures. It’s not just pink ribbons for breast cancer, and gray ribbons for brain cancer, and white for bone cancer, but also yellow ribbons for adenosarcoma, light blue for Addison’s Disease, teal for agoraphobia, periwinkle for acid reflux, pink and blue ribbons for amniotic fluid embolisms, and pinstripe ribbons for amyotrophic lateral sclerosis. We have had phenomenal breakthroughs in hues of awareness-raising ribbons. Yet for all the raised awareness, very few people seem aware of how the whole disease-curing business has ground to a halt.
Compare the Twenties to the Nineties: in the former, the discovery of insulin and penicillin, plus the first vaccines for tuberculosis, diphtheria, tetanus, whooping cough, on and on. In the last decade of the twentieth century, what? A vaccine for Hepatitis A, and Viagra. Good for erectile dysfunction, but what about inventile dysfunction? In October 1920, a doctor in London, Ontario, Frederick Banting, had an idea as to how insulin might be isolated and purified and used to treat diabetes, which in those days killed you.1 By August 1922, Elizabeth Hughes, the daughter of America’s Secretary of State and a diabetic near death, was being given an experimental course of the new treatment. By January 1923, Eli Lilly & Company were selling insulin to American druggists. That’s it: a little over two years from concept to patient. Not today: the U.S. Food and Drug Administration now adds half a decade to the process by which a treatment makes it to market, and they’re getting slower. Between 1996 and 1999, the FDA approved 157 new drugs. Between 2006 and 2009, the approvals fell by half—to 74.2 What happens during that half-decade? People die, nonstop—as young Elizabeth Hughes would have died under the “protection” of today’s FDA. Because statism has no sense of proportion. You can still find interesting articles about new discoveries that might have implications for, say, Parkinson’s disease. But that’s all you’ll find: articles, in periodicals, lying around your doctor’s waiting room. The chances of the new discovery advancing from the magazine on the coffee table to your prescription are less and less. To begin the government-approval process is to enter what the cynics of the twenty-first-century research biz call the valley of death.
When America Alone came out, arguing that the current conflict is about demographic decline, globalized psychoses, and civilizational confidence, a lot of folks objected, as well they might: seeing off supple amorphous abstract nouns is not something advanced societies do well. You’re looking at it the wrong way, I was told. Technocratic solutions, new inventions, the old can-do spirit: that’s the American way, and that’s what will see us through.
Well, okay, so where is it?

CRESCENT MOON

Half a century ago, the future felt different. Take 1969, quite a year in the aerospace biz: in one twelve-month period, we saw the test flight of the Boeing 747, the maiden voyage of the Concorde, the RAF’s deployment of the Harrier “jump jet,” and Neil Armstrong’s “giant step for mankind.” Buzz Aldrin packed a portable tape player with him on Apollo 11, and so Sinatra’s ring-a-ding-ding recording of “Fly Me to the Moon” became the first (human) music to be flown to the moon and played there.3 Had any other nation beaten NASA to it, they’d have marked the occasion with the “Ode to Joy” or Also Sprach Zarathustra, something grand and formal. But there’s something marvelously American about the first human being to place his feet on the surface of a heavenly sphere standing there with a cassette machine blasting out Frank and the Count Basie band in a swingin’ Quincy Jones arrangement—the insouciant swagger of the American century breaking the bounds of the planet.
In 1961, before the eyes of the world, President Kennedy had set American ingenuity a very specific challenge—and put a clock on it:
This nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the earth.4
That’s it. No wiggle room. A monkey on the moon wouldn’t count, nor an unmanned drone, nor a dune buggy that can’t take off again but transmits grainy footage back to Houston as it rusts up in the crater it came to rest in. The only way to win the bet is with a real-live actual American standing on the surface of the moon planting the Stars and Stripes. Even as it happened, the White House was so cautious that William Safire wrote President Nixon a speech to be delivered in the event of disaster:
Fate has ordained that the men who went to the moon to explore in peace will stay on the moon to rest in peace ...5
Yet America did it. “Fly Me to the Moon/Let me sing forever more.” What comes after American yearning and achievement? Democratization: “Everybody Gets to Go to the Moon.” That all but forgotten Jimmy Webb song from 1969 catches the spirit of the age:
Isn’t it a miracle
That we’re the generation
That will touch that shiny bauble with our own two hands?
Whatever happened to that?
Four decades later, Bruce Charlton, professor of Theoretical Medicine at the University of Buckingham in England, wrote that “the landing of men on the moon and bringing them back alive was the supreme achievement of human capability, the most difficult problem ever solved by humans.”6 That’s a good way to look at it: the political class presented the boffins with a highly difficult and specific problem, and they solved it—in eight years. Charlton continued:
Forty years ago, we could do it—repeatedly—but since then we have not been to the moon, and I suggest the real reason we have not been to the moon since 1972 is that we cannot any longer do it. Humans have lost the capability.
Of course, the standard line is that humans stopped going to the moon only because we no longer wanted to go to the moon, or could not afford to, or something.... But I am suggesting that all this is BS.... I suspect that human capability reached its peak or plateau around 1965-75—at the time of the Apollo moon landings—and has been declining ever since.
Can that be true? Charlton is a controversialist gadfly in British academe, but, comparing 1950 to the early twenty-first century, our time traveler from 1890 might well agree with him. And, if you think about it, isn’t it kind of hard even to imagine America pulling off a moon mission now? The countdown, the takeoff, a camera transmitting real-time footage of a young American standing in a dusty crater beyond our planet blasting out from his iPod Lady Gaga and the Black-Eyed Peas or whatever the twenty-first-century version of Sinatra and the Basie band is.... It half-lingers in collective consciousness as a memory of faded grandeur, the way a nineteenth-century date farmer in Nasiriyah might be dimly aware that the Great Ziggurat of Ur used to be around here someplace.
So what happened? According to Professor Charlton, in the 1970s “the human spirit began to be overwhelmed by bureaucracy.” The old can-do spirit? Oh, you can try to do it, but they’ll toss every obstacle in your path. Go on, give it a go: invent a new medical device; start a company; go to the airport to fly to D.C. and file a patent. Everything’s longer, slower, more soul-crushing. And the decline in “human capability” will only worsen in the years ahead, thanks not just to excess bureaucracy but to insufficient cash.
“Yes, we can!” droned the dopey Obamatrons of 2008. No, we can’t, says Charlton, not if you mean “land on the moon, swiftly win wars against weak opposition and then control the defeated nation, secure national borders, discover breakthrough medical treatments, prevent crime, design and build to a tight deadline, educate people so they are ready to work before the age of 22....”
Houston, we have a much bigger problem.
To be sure, there’s still something called “NASA” and it still stands for the “National Aeronautics and Space Administration.” But there’s not a lot of either aeronautics or space in the in-box of the agency’s head honcho. A few days after Charlton penned his elegy for human capability, NASA Administrator Charles Bolden appeared on al-Jazeera and explained the brief he’d been given by President Obama:
One was he wanted me to help re-inspire children to want to get into science and math; he wanted me to expand our international relationships; and third and perhaps foremost, he wanted me to find a way to reach out to the Muslim world and engage much more with dominantly Muslim nations to help them feel good about their historic contribution to science and math and engineering.7
Islam: The final frontier! To boldly go where no diversity outreach consultant has gone before! What’s “foremost” for NASA is to make Muslims “feel good” about their contributions to science. Why, as recently as the early ninth century Muhammad al-Khwarizmi invented the first universal horary quadrant! Things have been a little quiet since then, or at least since Taqi-al-Din’s observatory in Istanbul was razed to the ground by the Sultan’s janissaries in 1580. If you hear a Muslim declaring “We have lift off!” it’s likely to be a triumphant ad-lib after lighting up his crotch. As far as I recall, the most recent Islamic contribution to the subject of space exploration came from Britain’s most prominent imam, Abu Hamza, who in 2003 declared that the fate of the space shuttle Columbia was God’s punishment “because it carried Americans, an Israeli and a Hindu, a trinity of evil against Islam.”8
It’s easy to laugh at the likes of Abu Hamza, although not as easy as it should be, not in Europe and Canada, where the state is eager to haul you into court for “Islamophobia.” But the laugh’s on us. NASA is the government agency whose acronym was known around the planet, to every child who looked up at the stars and wondered what technological marvels the space age would have produced by the time he was out of short pants. Now the starry-eyed moppets are graying boomers, and the agency that symbolized man’s reach for the skies has transformed itself into a self-esteem boosterism operation. Is there an accompanying book—Muslims Are from Mars, Infidels Are from Venus?
There’s your American decline right there: from out-of-this-world to out-of-our-minds, an increasingly unmanned flight from real, historic, technological accomplishment to unreal, ahistorical, therapeutic, touchy-feely multiculti.
So we can’t go to the moon. And, by the time you factor in getting to the airport to do the shoeless shuffle and the enhanced patdown, flying to London takes longer than it did in 1960. If they were trying to build the transcontinental railroad now, they’d be spending the first three decades on the environmental-impact study and hammering in the Golden Spike to celebrate the point at which the Feasibility Commission’s expansion up from the fifth floor met the Zoning Board’s expansion down from the twelfth floor.
Google and Apple and other latter-day American success stories started in somebody’s garage—the one place where innovation isn’t immediately buried by bureaucracy, or at least in most states, not until some minor municipal functionary discovers you neglected to apply for a Not Sitting Around on My Ass All Day permit. What did Apple and company do in those garages? They invented and refined home computers—an entirely logical response to late twentieth-century America: when reality seizes up, freedom retreats and retrenches to virtual reality, to the internal. Where once space was the final frontier, now we frolic in the canyons of our mind. We’re in the Wilbur & Orville era of the Internet right now, but at the Federal Communications Commission and other agencies they’re already designing the TSA uniforms for the enhanced cyber-patdown.
And what do you have to show for all that government? It’s amazing how quickly, even with a multi-trillion-dollar barrel, you wind up scraping the bottom of it. In Obama’s “American Recovery and Reinvestment Plan,” two of the five objectives were to “computerize the health-care system” and “modernize classrooms.”9 That sound you hear is the computerized eye-rolling with which every modernized hack author now comes equipped. For its part, the Congressional Progressive Caucus wanted “green jobs creation” and “construction of libraries in rural communities to expand broadband access.”10 And in a postmodern touch, Mark Pinsky at the New Republic made the pitch for a new Federal Writers’ Project, in which writers laid off by America’s collapsing newspaper industry would be hired by the government to go around the country “documenting the ground-level impact of the Great Recession.”11 America has a money-no-object government with a lot of money but no great objects.

GÖTTERDÄMMERUNG

When the father of Big Government, Franklin Roosevelt, was brought before the Hoover Dam, he declared:
This morning I came, I saw, and I was conquered, as everyone would be who sees for the first time this great feat of mankind.12
But the bigger government gets, the less it actually does. You think a guy like Obama is going to put up a new Hoover Dam (built during the Depression and opened two years ahead of schedule)? No chance. Today’s Big Government crowd is more likely to put up a new regulatory agency to tell the Hoover Dam it’s non-wheelchair accessible and has to close. As Deanna Archuleta, Obama’s Deputy Assistant Secretary of the Interior, assured an audience in Nevada: “You will never see another federal dam.”13 “Great feats of mankind” are an environmental hazard, for mankind has great feats of clay. But hang on, isn’t hydropower “renewable” energy? It doesn’t use coal or oil, it generates electricity from the natural water cycle. If that’s not renewable, what is? Ah, but, according to environmental “dam-busters,” reservoirs are responsible for some 4 percent of the earth’s carbon dioxide emissions. Environmental devastation-wise, the Hoover Dam is the patio pool to Al Gore’s mansion. Out, out, dammed spot!
So, just as the late Roman Empire was no longer an aqueduct-building culture, we are no longer a dam-building one. It’s not just that we no longer invent, but that we are determined to disinvent everything our great-grandparents created to enable the self-indulgent lives we take for granted and that leave us free to chip away at the foundations of our own society. So-called “progressives” actively wage war on progress. They’re opposed to dams, which spurred the growth of California. They’re opposed to air-conditioning, which led to the development of the Southwest. They’re opposed to light bulbs, which expanded man’s day, and they’re opposed to automobiles, which expanded man’s reach. They’re still nominally in favor of mass transit, so maybe we can go back to wood-fired steam trains? No, sorry, no can do. The progressives are opposed to logging; they want a ban on forestry work in environmentally sensitive areas such as forests. Ultimately, progressives are at war with mass prosperity.
In the old days, we didn’t have these kinds of problems. But then Mr. and Mrs. Peasant start remodeling the hovel, adding a rec room and indoor plumbing, replacing the emaciated old nag with a Honda Civic and driving to the mall in it, and next thing you know, instead of just having an extra yard of mead every Boxing Day at the local tavern and adding a couple more pustules to the escutcheon with the local trollop, they begin taking vacations in Florida. When it was just medieval dukes swanking about like that, the planet worked fine: that was “sustainable” consumerism. But now the masses want in. And, once you do that, there goes the global neighborhood.
Human capital is the most important element in any society. The first requirement of the American Dream is Americans. Today we have American sclerosis, to which too many Americans are contributing. Capitalism is liberating: you’re born a peasant but you don’t have to die one. You can work hard and get a nice place in the suburbs. If you’re a nineteenth-century Russian serf and you get to Ellis Island, you’ll be living in a tenement on the Lower East Side, but your kids will get an education and move uptown, and your grandkids will be doctors and accountants in Westchester County.
And your great-grandchild will be a Harvard-educated dam-busting environmental activist demanding an end to all this electricity and indoor toilets.
To go back to 1950, once our friend from 1890 had got his bearings in mid-century, he’d be struck by how our entire conception of time had changed in a mere sixty years. If you live in my part of New Hampshire and you need to pick something up from a guy in the next town, you hop in the truck and you’re back in little more than an hour. In a horse and buggy, that would have been most of your day gone. The first half of the twentieth century overhauled the pattern of our lives: the light bulb abolished night; the internal combustion engine tamed distance. They fundamentally reconceived the rhythms of life. That’s why our young man propelled from 1890 to 1950 would be flummoxed at every turn. A young fellow catapulted from 1950 to today would, on the surface, feel instantly at home—and then notice a few cool electronic toys. And, after that, he might wonder about the defining down of “accomplishment”: Wow, you’ve invented a more compact and portable delivery system for Justin Bieber!
Long before they slump into poverty, great powers succumb to a poverty of ambition. It could be that the Internet is a lone clipper of advancement on a sea of stasis because, as its proponents might argue, we’re on the brink of a reconceptualization of space similar to the reconceptualization of time that our great-grandparents lived through with the development of electricity and automobiles. But you could as easily argue that for most of the citizenry the computer is, in the Roman context, a cyber-circus. In Aldous Huxley’s Brave New World, written shortly after Hollywood introduced us to “the talkies,” the masses are hooked on “the feelies”:
“Take hold of those metal knobs on the arms of your chair,” Lenina whispers to her date. “Otherwise you won’t get any of the feely effects.” He does so. The “scent organ” breathes musk; when the on-screen couple kiss with “stereoscopic lips,” the audience tingles. When they make out on the rug, every moviegoer can feel every hair of the bearskin.
In our time, we don’t even need to go to the theater. We can “feel” what it’s like to drive a car on a thrilling chase through a desert or lead a commando raid on a jungle compound without leaving our own bedrooms. We can photoshop ourselves into pictures with celebrities. We can have any permutation of men, women, and pre-operative transsexuals engaging in every sexual practice known to man or beast just three inches from our eyes: a customized 24-hour virtual circus of diverting games, showbiz gossip, and downloadable porn, a refuge from reality, and a gaudy “feely” playground for the plebs at a time when the regulators have made non-virtual reality a playground for regulators and no one else.
In the end, the computer age may presage not a reconceptualization of space but an abandonment of the very concept of time. According to Mushtaq Yusufzai, the Taliban have a saying:
Americans have all the watches, but we’ve got all the time.14
Cute. If it’s not a Taliban proverb, it would make an excellent country song. It certainly distills the essence of the “clash of civilizations”: Islam is playing for tomorrow, whereas much of the West has, by any traditional indicator, given up on the future. We do not save, we do not produce, we do not reproduce, not in Europe, Canada, Vermont, or San Francisco. Instead, we seek new, faster ways to live in an eternal present, in an unending whirl of sensory distraction. Tocqueville’s prediction of the final stage of democracy prefigures the age of “social media”:
It hides his descendants and separates his contemporaries from him; it throws him back for ever upon himself alone, and threatens in the end to confine him entirely within the solitude of his own heart.

THE HOLE IS GREATER THAN THE SUM OF ITS PARTS

Almost anyone who’s been exposed to western pop culture over the last half-century is familiar with the brutal image that closes Planet of the Apes: a loinclothed Charlton Heston falling to his knees as he comes face to face with a shattered Statue of Liberty poking out of the sand and realizes that the “planet of the apes” is, in fact, his own—or was. What more instantly recognizable shorthand for civilizational ruin? In the film Independence Day, Lady Liberty gets zapped by aliens. In Cloverfield, she’s decapitated by a giant monster. If you’re in the apocalyptic fantasy business, clobbering the statue in the harbor is de rigueur.
As far as I can ascertain, the earliest example of Liberty-toppling dates back to an edition of Life, and a story called “The Next Morning,” illustrated by a pen-and-ink drawing of a headless statue with the smoldering rubble of the city behind her. That was in 1887. The poor old girl had barely got off the boat from France and they couldn’t wait to blow her to kingdom come. Two years later, on the cover of J. A. Mitchell’s story The Last American, she still stands, but the city around her has sunk into a watery grave as a Persian sailing ship navigates the ruins of a once mighty nation called Mehrika in the year 2951.
But liberty is not a statue, and that is not how liberty falls. So what about a different kind of dystopian future? Picture a land where the Statue of Liberty remains in the harbor, yet liberty itself has withered away. The word is still in use. Indeed, we may have a bright shiny array of new “liberties,” new freedoms—“free” health care, “free” college education. If you smash liberty in an instant—as the space aliens do in Independence Day—we can all have our Charlton Heston moment and fall to our knees wailing about the folly and stupidity of man. But when it happens incrementally, and apparently painlessly, free peoples who were once willing to give their lives for liberty can be persuaded very easily to relinquish their liberties for a quiet life. In the days when President Bush was going around promoting the notion of democracy in the Muslim world, there was a line he liked to fall back on:
Freedom is the desire of every human heart.15
If only that were true. It’s doubtful whether that’s actually the case in Gaza and Waziristan, but we know for absolute certain that it’s not in Paris and Stockholm, London and Toronto, Buffalo and San Jose. The story of the western world since 1945 is that, invited to choose between freedom and government “security,” large numbers of people vote to dump freedom every time—the freedom to make their own decisions about health care, education, property rights, the right to eat non-state-licensed homemade pie, and eventually (as we already see in Europe, Canada, the UN Human Rights Council, and U.S. college campuses) what you’re permitted to say and think. An America running out of ideas eventually gives up on the American idea.
The pop-cultural detonation of national landmarks is a mostly American phenomenon. In the rest of the world, it happens for real. At the same time as Amazing Stories and Astounding Science Fiction were running those covers of the Statue of Liberty decapitated and toppled in one lurid fantasy after another, Buckingham Palace took nine direct hits during the Blitz. Reducing British landmarks to rubble wasn’t Fiction and it wasn’t that Astounding, and it didn’t even require much Science. On one occasion, an enterprising lone German bomber flew low up the Mall and dropped his load directly above the Royal Family’s living quarters. The King and Queen were in their drawing room and showered with shards of glass. When American audiences whoop and holler at the vaporizing of the White House in Independence Day, it’s because such thrills are purely the stuff of weekend multiplex diversion.
Or at least they were until a Tuesday morning one September when a guy in a cave remade the Manhattan skyline.
Somewhere along the way, back home in Saudi, at summer school in Oxford, or on a VCR hooked up to the generator at Camp Jihad in Waziristan, Osama bin Laden must surely have seen some of those despised Hollywood blockbusters, because he evidently gave some thought to the iconography of the moment. Planning the operation, did he ever consider taking out the Statue of Liberty? Fewer dead, but what a statement! A couple of days after 9/11, the celebrated German composer Karlheinz Stockhausen told a radio interviewer that the destruction of the World Trade Center was “the greatest work of art ever.”16 I’m reminded of the late Sir Thomas Beecham’s remark when asked if he’d ever conducted any Stockhausen: “No,” he replied. “But I think I’ve trodden in some.”17 Stockhausen stepped in his own that week: in those first days after the assault, even the anti-American Left felt obliged to be somewhat circumspect. But at a certain level the composer understood what Osama was getting at.
Nevertheless, Stockhausen was wrong. The “greatest work of art” is not the morning of 9/11, with the planes slicing through the building, and the smoke and the screaming and the jumping, and the swift, eerily smooth collapse of the towers. No, the most eloquent statement about America in the early twenty-first century is Ground Zero in the years after. 9/11 was something America’s enemies did to us. The hole in the ground a decade later is something we did to ourselves. By 2010, Michael Bloomberg, the take-charge get-it-done make-it-happen mayor of New York, was reduced to promising that that big hole in Lower Manhattan isn’t going to be there for another decade, no, sir. “I’m not going to leave this world with that hole in the ground ten years from now,” he declared defiantly.18 In the twenty-first century, that’s what passes for action, for get-tough leadership, for riding herd. When the going gets tough, the tough boot the can another decade down the road. Sure, those jihad boys got lucky and took out a couple of skyscrapers, but the old can’t-do spirit kicked in, and a mere ten years later we had a seven-storey hole on which seven billion dollars had been lavished. But, if we can’t put up a replacement building within a decade, we can definitely do it within two. Probably. As a lonely steel skeleton began lethargically to rise from the 16-acre site, the unofficial estimated date of completion for the brand new “1 World Trade Center” was said to be 2018.19 That date should shame every American.
What happened? Everyone knows the “amber waves of grain” and “purple mountain majesties” in “America the Beautiful,” but Katharine Lee Bates’ words are also a hymn to modernity:
Oh beautiful for patriot dream
That sees beyond the years
Thine alabaster cities gleam
Undimmed by human tears ...
“America the Beautiful” is not a nostalgic evocation of a pastoral landscape but a paean to its potential, including the gleaming metropolis. Miss Bates visited the Columbian Exposition in Chicago just before July 4, 1893, and she meant the word “alabaster” very literally: the centerpiece of the fair was the “White City” of the future, fourteen blocks of architectural marvels with marble facades painted white, and shining even whiter in the nightly glow of thousands of electric light bulbs, like a primitive prototype of Al Gore’s carbon-offset palace in Tennessee. They were good times, but even in bad ones the United States could still build marvels. Much of the New York skyline dates from the worst of times. As Fred Astaire and Ginger Rogers sang in the Thirties: “They all laughed at Rockefeller Center, Now they’re fighting to get in ...”
The Empire State Building, then the tallest in the world, was put up in eighteen months during a depression—because the head of General Motors wanted to show the head of Chrysler that he could build something that went higher than the Chrysler Building. Three-quarters of a century later, the biggest thing either man’s successor had created was a mountain of unsustainable losses—and both GM and Chrysler were now owned and controlled by government and unions.
In the months after 9/11, I used to get the same joke emailed to me every few days: the proposed design for the replacement World Trade Center. A new skyscraper towering over the city, with the top looking like a stylized hand—three towers cut off at the joint, and the “middle finger” rising above them, flipping the bird not only to Osama bin Laden but also to Karlheinz Stockhausen and the sneering Euro-lefties and all the rest who rejoiced that day at America getting it, pow, right in the kisser: they all laughed at the Twin Towers takedown. Soon they’ll be fighting to get in to whatever reach-for-the-skies only-in-America edifice replaces it. The very word “skyscraper” is quintessentially American: it doesn’t literally scrape the sky, but hell, as soon as we figure out how to build an even more express elevator, there’s no reason why it shouldn’t.
But the years go by, and they stopped emailing that joke, because it’s not quite so funny after two, three, five, nine years of walking past Windows on the Hole every morning. It doesn’t matter what the eventual replacement building is at Ground Zero. The ten-year hole is the memorial: a gaping, multi-story, multi-billion-dollar pit, profound and eloquent in its nullity.
As for the gleam of a brand new “White City,” well, in the interests of saving the planet, Congress went and outlawed Edison’s light bulb. And on the grounds of the White City hymned by Katharine Lee Bates stands Hyde Park, home to community organizer Barack Obama, terrorist educator William Ayers, and Nation of Islam numerologist and Jeremiah Wright Award-winner Louis Farrakhan. That’s one fruited plain all of its own.
In the decade after 9/11, China (which America still thinks of as a cheap assembly plant for your local KrappiMart) built the Three Gorges Dam, the largest electricity-generating plant in the world.20 Dubai, a mere sub-jurisdiction of the United Arab Emirates, put up the world’s tallest building and built a Busby Berkeley geometric kaleidoscope of offshore artificial islands.21 Brazil, an emerging economic power, began diverting the São Francisco River to create some 400 miles of canals to irrigate its parched northeast.22
But the hyperpower can’t put up a building.
Happily, there is one block in Lower Manhattan where ambitious redevelopment is in the air. In 2010, plans were announced to build a 15-story mosque at Ground Zero, on the site of an old Burlington Coat Factory damaged by airplane debris that Tuesday morning.
So, in the ruins of a building reduced to rubble in the name of Islam, a temple to Islam will arise.
A couple of years after the events of that Tuesday morning, James Lileks, the bard of Minnesota, wrote:
If 9/11 had really changed us, there’d be a 150-story building on the site of the World Trade Center today. It would have a classical memorial in the plaza with allegorical figures representing Sorrow and Resolve, and a fountain watched over by stern stone eagles. Instead there’s a pit, and arguments over the usual muted dolorous abstraction approved by the National Association of Grief Counselors.23
The best response to 9/11 on the home front—if only to demonstrate that there is a “home front” (which is the nub of al-Qaeda’s critique of a soft and decadent West)—would have been to rebuild the World Trade Center bigger, better, taller—not 150 stories, but 250, a marvel of the age. And, if there had to be “the usual muted dolorous abstraction,” the National Healing Circle would have been on the penthouse floor with a clear view all the way to al-Qaeda’s executive latrine in Waziristan.
Leslie Gelb, president emeritus of the Council on Foreign Relations, is no right-winger but rather a sober, respected, judicious paragon of torpidly conventional wisdom. Nevertheless, musing on American decline, he writes, “The country’s economy, infrastructure, public schools and political system have been allowed to deteriorate. The result has been diminished economic strength, a less-vital democracy, and a mediocrity of spirit.”24
That last is the one to watch: a great power can survive a lot of things, but not “a mediocrity of spirit.” A wealthy nation living on the accumulated cultural capital of a glorious past can dodge its rendezvous with fate, but only for so long. “Si monumentum requiris, circumspice”25 reads the inscription on the tomb of Sir Christopher Wren in St. Paul’s Cathedral: If you seek my monument, look around. After two-thirds of the City of London was destroyed in the Great Fire of 1666, Wren designed and rebuilt the capital’s tallest building (St. Paul’s), another fifty churches, and a new skyline for a devastated metropolis. Three centuries later, if you seek our monument, look in the hole.
It’s not about al-Qaeda. It’s about us.