Later I remembered thinking: 1967, no problem, no land mines there.
I put on my glasses. I began to read.
“New York was no mere city,” the marked lines began. “It was instead an infinitely romantic notion, the mysterious nexus of all love and money and power, the shining and perishable dream itself.”
I hit the word “perishable” and I could not say it.
I found myself onstage at the Herbst Theater in San Francisco unable to finish reading the passage, unable to speak at all for what must have been thirty seconds. All I can say about the rest of that evening, and about the two weeks that followed, is that they turned out to be nothing I had expected, nothing I had ever before experienced, an extraordinarily open kind of traveling dialogue, an encounter with an America apparently immune to conventional wisdom. The book I was making the trip to talk about was Political Fictions, a series of pieces I had written for The New York Review about the American political process from the 1988 through the 2000 presidential elections. These people to whom I was listening—in San Francisco and Los Angeles and Portland and Seattle—were making connections I had not yet in my numbed condition thought to make: connections between that political process and what had happened on September 11, connections between our political life and the shape our reaction would take and was in fact already taking.
These people recognized that even then, within days after the planes hit, there was a good deal of opportunistic ground being seized under cover of the clearly urgent need for increased security. These people recognized even then, with flames still visible in lower Manhattan, that the words “bipartisanship” and “national unity” had come to mean acquiescence to the administration’s preexisting agenda—for example the imperative for further tax cuts, the necessity for Arctic drilling, the systematic elimination of regulatory and union protections, even the funding for the missile shield—as if we had somehow missed noticing the recent demonstration of how limited, given a few box cutters and the willingness to die, superior technology can be.
These people understood that when Judy Woodruff, on the evening the president first addressed the nation, started talking on CNN about what “a couple of Democratic consultants” had told her about how the president would be needing to position himself, Washington was still doing business as usual. They understood that when the political analyst William Schneider spoke the same night about how the president had “found his vision thing,” about how “this won’t be the Bush economy anymore, it’ll be the Osama bin Laden economy,” Washington was still talking about the protection and perpetuation of its own interests.
These people got it.
They didn’t like it.
They stood up in public and they talked about it.
Only when I got back to New York did I find that people, if they got it, had stopped talking about it. I came in from Kennedy to find American flags flying all over the Upper East Side, at least as far north as 96th Street, flags that had not been there in the first week after the fact. I say “at least as far north as 96th Street” because a few days later, driving down from Washington Heights past the big projects that would provide at least some of the manpower for the “war on terror” that the president had declared—as if terror were a state and not a technique—I saw very few flags: at most, between 168th Street and 96th Street, perhaps a half-dozen. There were that many flags on my building alone. Three at each of the two entrances. I did not interpret this as an absence of feeling for the country above 96th Street. I interpreted it as an absence of trust in the efficacy of rhetorical gestures.
There was much about this return to New York that I had not expected. I had expected to find the annihilating economy of the event—the way in which it had concentrated the complicated arrangements and misarrangements of the last century into a single irreducible image—being explored, made legible. On the contrary, I found that what had happened was being processed, obscured, systematically leached of history and so of meaning, finally rendered less readable than it had seemed on the morning it happened. As if overnight, the irreconcilable event had been made manageable, reduced to the sentimental, to protective talismans, totems, garlands of garlic, repeated pieties that would come to seem in some ways as destructive as the event itself. We now had “the loved ones,” we had “the families,” we had “the heroes.”
In fact it was in the reflexive repetition of the word “hero” that we began to hear what would become in the year that followed an entrenched preference for ignoring the meaning of the event in favor of an impenetrably flattening celebration of its victims, and a troublingly belligerent idealization of historical ignorance. “Taste” and “sensitivity,” it was repeatedly suggested, demanded that we not examine what happened. Images of the intact towers were already being removed from advertising, as if we might conveniently forget they had been there. The Roundabout Theatre had canceled a revival of Stephen Sondheim’s Assassins, on the grounds that it was “not an appropriate time” to ask audiences “to think critically about various aspects of the American experience.” The McCarter Theatre at Princeton had canceled a production of Richard Nelson’s The Vienna Notes, which involves a terrorist act, saying that “it would be insensitive of us to present the play at this moment in our history.”
I found in New York that “the death of irony” had already been declared, repeatedly, and curiously, since irony had been declared dead at the precise moment—given that the gravity of September 11 derived specifically from its designed implosion of historical ironies—when we might have seemed most in need of it. “One good thing could come from this horror: it could spell the end of the age of irony,” Roger Rosenblatt wrote within days of the event in Time, a thought, or not a thought, destined to be frequently echoed but never explicated. Similarly, I found that “the death of postmodernism” had also been declared. (“It seemed bizarre that events so serious would be linked causally with a rarefied form of academic talk,” Stanley Fish wrote after receiving a call from a reporter asking if September 11 meant the end of postmodernist relativism. “But in the days that followed, a growing number of commentators played serious variations on the same theme: that the ideas foisted upon us by postmodern intellectuals have weakened the country’s resolve.”) “Postmodernism” was henceforth to be replaced by “moral clarity,” and those who persisted in the decadent insistence that the one did not necessarily cancel out the other would be subjected to what William J. Bennett would call—in Why We Fight: Moral Clarity and the War on Terrorism—“a vast relearning,” “the reinstatement of a thorough and honest study of our history, undistorted by the lens of political correctness and pseudosophisticated relativism.”
I found in New York, in other words, that the entire event had been seized—even as the less nimble among us were still trying to assimilate it—to stake new ground in old domestic wars. There was the frequent deployment of the phrase “the Blame America Firsters,” or “the Blame America First crowd,” the wearying enthusiasm for excoriating anyone who suggested that it could be useful to bring at least a minimal degree of historical reference to bear on the event. There was the adroit introduction of convenient straw men. There was Christopher Hitchens, engaging in a dialogue with Noam Chomsky, giving himself the opportunity to generalize whatever got said into “the liberal-left tendency to ‘rationalize’ the aggression of September 11.” There was Donald Kagan at Yale, dismissing his colleague Paul Kennedy as “a classic case of blaming the victim,” because the latter had asked his students to try to imagine what resentments they might harbor if America were small and the world dominated by a unified Arab-Muslim state. There was Andrew Sullivan, warning on his Web site that while the American heartland was ready for war, the “decadent left in its enclaves on the coasts” could well mount “what amounts to a fifth column.”
There was the open season on Susan Sontag—on a single page of a single issue of The Weekly Standard that October she was accused of “unusual stupidity,” of “moral vacuity,” and of “sheer tastelessness”—all for three paragraphs in which she said, in closing, that “a few shreds of historical awareness might help us understand what has just happened, and what may continue to happen”; in other words that events have histories, political life has consequences, and the people who led this country and the people who wrote and spoke about the way this country was led were guilty of trying to infantilize its citizens if they continued to pretend otherwise.
Inquiry into the nature of the enemy we faced, in other words, was to be interpreted as sympathy for that enemy. The final allowable word on those who attacked us was to be that they were “evildoers,” or “wrongdoers,” peculiar constructions that served to suggest that those who used them were transmitting messages from some ultimate authority. This was a year in which it would come to seem as if we had been plunged at one fell stroke into a premodern world. The possibilities of the Enlightenment vanished. We had suddenly been asked to accept—and were in fact accepting—a kind of reasoning so extremely fragile that it might have been based on the promised return of the cargo gods.
I recall, early on, after John Ashcroft and Condoleezza Rice warned the networks not to air the bin Laden tapes because he could be “passing information,” heated debate about the First Amendment implications of this warning—as if there were even any possible point to the warning, as if we had all forgotten that our enemies as well as we lived in a world where information gets passed in more efficient ways. A year later, we were still looking for omens, portents, the supernatural manifestations of good or evil. Pathetic fallacy was everywhere. The presence of rain at a memorial for fallen firefighters was gravely reported as evidence that “even the sky cried.” The presence of wind during a memorial at the site was interpreted as another such sign, the spirit of the dead rising up from the dust.
This was a year when Rear Admiral John Stufflebeem, deputy director of operations for the Joint Chiefs of Staff, would say at a Pentagon briefing that he had been “a bit surprised” by the disinclination of the Taliban to accept the “inevitability” of their own defeat. It seemed that Admiral Stufflebeem, along with many other people in Washington, had expected the Taliban to just give up. “The more that I look into it,” he said at this briefing, “and study it from the Taliban perspective, they don’t see the world the same way we do.” It was a year when the publisher of The Sacramento Bee, speaking at the midyear commencement of California State University, Sacramento, would be forced off the stage of the Arco Arena for suggesting that because of the “validity” and “need” for increased security we would be called upon to examine to what degree we might be “willing to compromise our civil liberties in the name of security.” Here was the local verdict on this aborted speech, as expressed in one of many outraged letters to the editor of the Bee:
It was totally and completely inappropriate for her to use this opportunity to speak about civil liberties, military tribunals, terrorist attacks, etc. She should have prepared a speech about the accomplishments that so many of us had just completed, and the future opportunities that await us.
In case you think that’s a Sacramento story, it’s not.
Because this was also a year when one of the student speakers at the 2002 Harvard commencement, Zayed Yasin, a twenty-two-year-old Muslim raised in a Boston suburb by his Bangladeshi father and Irish-American mother, would be caught in a swarm of protests provoked by the announced title of his talk, which was “My American Jihad.” In fact the speech itself, which he had not yet delivered, fell safely within the commencement-address convention: its intention, Mr. Yasin told The New York Times, was to reclaim the original meaning of “jihad” as struggle on behalf of a principle, and to use it to rally his classmates in the fight against social injustice. Such use of “jihad” was not in this country previously uncharted territory: the Democratic pollster Daniel Yankelovich had only a few months before attempted to define the core values that animated what he called “the American jihad”—separation of church and state, the value placed on diversity, and equality of opportunity. In view of the protests, however, Mr. Yasin was encouraged by Harvard faculty members to change his title. He did change it. He called his talk “Of Faith and Citizenship.” This mollified some, but not all. “I don’t think it belonged here today,” one Harvard parent told The Washington Post. “Why bring it up when today should be a day of joy?”
This would in fact be a year when it was to become increasingly hard to know who was infantilizing whom.
2
California Monthly, the alumni magazine for the University of California at Berkeley, published in its November 2002 issue an interview with a member of the university’s political science faculty, Steven Weber, who is the director of the MacArthur Program on Multilateral Governance at Berkeley’s Institute of International Studies and a consultant on risk analysis to both the State Department and such private-sector firms as Shell Oil. It so happened that Mr. Weber was in New York on September 11, 2001, and for the week that followed. “I spent a lot of time talking to people, watching what they were doing, and listening to what they were saying to each other,” he told the interviewer:
The first thing you noticed was in the bookstores. On September 12, the shelves were emptied of books on Islam, on American foreign policy, on Iraq, on Afghanistan. There was a substantive discussion about what it is about the nature of the American presence in the world that created a situation in which movements like al-Qaeda can thrive and prosper. I thought that was a very promising sign.
But that discussion got short-circuited. Sometime in late October, early November 2001, the tone of that discussion switched, and it became: What’s wrong with the Islamic world that it failed to produce democracy, science, education, its own enlightenment, and created societies that breed terror?
The interviewer asked him what he thought had changed the discussion. “I don’t know,” he said, “but I will say that it’s a long-term failure of the political leadership, the intelligentsia, and the media in this country that we didn’t take the discussion that was forming in late September and try to move it forward in a constructive way.”
I was struck by this, since it so coincided with my own impression. Most of us saw that discussion short-circuited, and most of us have some sense of how and why it became a discussion with nowhere to go. One reason, among others, runs back sixty years, through every administration since Franklin Roosevelt’s. Roosevelt was the first American president who tried to grapple with the problems inherent in securing Palestine as a Jewish state.
It was also Roosevelt who laid the groundwork for our relationship with the Saudis. There was an inherent contradiction here, and it was Roosevelt, perhaps the most adroit political animal ever made, who instinctively devised the approach adopted by the administrations that followed his: Stall. Keep the options open. Make certain promises in public, and conflicting ones in private. This was always a high-risk business, and for a while the rewards seemed commensurate: we got the oil for helping the Saudis, we got the moral credit for helping the Israelis, and, for helping both, we enjoyed the continuing business that accrued to an American defense industry significantly based on arming all sides.
Consider the range of possibilities for contradiction.
Sixty years of making promises we had no way of keeping without breaking the promises we’d already made.
Sixty years of long-term conflicting commitments, made in secret and in many cases for short-term political reasons.
Sixty years that tend to demystify the question of why we have become unable to discuss our relationship with the current government of Israel.
Whether the actions taken by that government constitute self-defense or a particularly inclusive form of self-immolation remains an open question. The question of course has a history, a background involving many complicit state and nonstate actors and going back most recently to, but by no means beginning with, the breakup of the Ottoman Empire. This open question, and its history, are discussed rationally and with considerable intellectual subtlety in Jerusalem and Tel Aviv, as anyone who reads Amos Elon or Avishai Margalit in The New York Review or even occasionally sees Ha’aretz on-line is well aware. Where the question is not discussed rationally—where in fact the question is rarely discussed at all, since so few of us are willing to see our evenings turn toxic—is in New York and Washington and in those academic venues where the attitudes and apprehensions of New York and Washington have taken hold. The president of Harvard recently warned that criticisms of the current government of Israel could be construed as “anti-Semitic in their effect if not their intent.”
The very question of the US relationship with Israel, in other words, has come to be seen—at Harvard as well as in New York and Washington—as unraisable, potentially lethal, the conversational equivalent of an unclaimed bag on a bus. We take cover. We wait for the entire subject to be defused, safely insulated behind baffles of invective and counterinvective. Many opinions are expressed. Few are allowed to develop. Even fewer change.
We have come in this country to tolerate many such fixed opinions, or national pieties, each with its own baffles of invective and counterinvective, of euphemism and downright misstatement, its own screen that slides into place whenever actual discussion threatens to surface. We have, for example, allowed American biological research to fall behind that in countries where stem cell programs are not confused with “cloning” and “abortion on demand,” countries, in other words, where rationality is not held hostage to the posturing of the political process. We have allowed all rhetorical stops to be pulled out on nonissues, for example when the Ninth Circuit federal appeals court ruled the words “under God” an unconstitutional addition to the Pledge of Allegiance. The Pledge was written in 1892 by a cousin of Edward Bellamy’s, Francis Bellamy, a socialist Baptist minister who the year before had been pressured to give up his church because of the socialist thrust of his sermons. The clause “under God” was added in 1954 to distinguish the United States from the atheistic Soviet Union.