The Recovery of Equity

Book: The Cold War / Chapter 7


English text of the chapter

CHAPTER FIVE - THE RECOVERY OF EQUITY

For a man who wants to make a profession of good in all regards must come to ruin among so many who are not good. Hence it is necessary to a prince, if he wants to maintain himself, to learn to be able not to be good, and to use this and not use it according to necessity.

—NICCOLÒ MACHIAVELLI1

To the Soviet leadership such a precipitous collapse . . . came as an unpleasant surprise. . . . [T]here was perplexity in the minds of the Kremlin leaders, who were at a loss to understand the mechanics of how a powerful president could be forced into resignation by public pressure and an intricate judicial procedure based on the American Constitution—all because of what they saw as a minor breach of conduct. Soviet history knew no parallel.

—ANATOLY DOBRYNIN2

THE WATERGATE CRISIS surprised Nixon, as well as the Soviet ambassador and the Kremlin leadership. How could the most powerful man in the world be brought down by what his own press spokesman described as a “third-rate burglary,” detected only because the bungling thieves had taped a door lock horizontally instead of vertically, so that the end of the tape was visible to a graveyard shift security guard? The discovery of a break-in at the Democratic National Committee headquarters in the Watergate building in Washington shortly after 1:00 AM on June 17, 1972, set in motion a series of events that would force the first resignation of an American president. The disproportion between the offense and its consequences left Nixon incredulous: “[A]ll the terrible battering we have taken,” he commiserated with himself shortly before leaving office, “is really pygmy-sized when compared to what we have done, and what we can do in the future not only for peace in the world but, indirectly, to effect the well-being of people everywhere.”3 Perhaps so, but what Watergate also revealed was that Americans placed the rule of law above the wielding of power, however praiseworthy the purposes for which power was being used. Ends did not always justify means. Might alone did not make right.

“Well, when the president does it, that means it is not illegal,” Nixon would later explain, in a lame attempt to justify the wiretaps and break-ins he had authorized in an effort to plug leaks within his administration regarding the conduct of the Vietnam War. “If the president, for example, approves something because of . . . national security, or in this case because of a threat to internal peace and order of significant magnitude, then the president’s decision . . . enables those who carry it out to [do so] without violating a law.”4 The claim was not a new one. Every chief executive since Franklin D. Roosevelt had sanctioned acts of questionable legality in the interests of national security, and Abraham Lincoln had done so more flagrantly than any of them in order to preserve national unity. Nixon, however, made several mistakes that were distinctly his own. The first was to exaggerate the problem confronting him: the leaking of The Pentagon Papers to the New York Times was not a threat comparable to secession in 1861, or to the prospect of subversion during World War II and the early Cold War. Nixon’s second mistake was to employ such clumsy agents that they got themselves caught. And his third mistake—the one that ended his presidency—was to lie about what he had done in a futile attempt to cover it up.5

Watergate might have remained only an episode in the domestic history of the United States except for one thing: distinctions between might and right were also beginning to affect the behavior of the Cold War superpowers. The last years of the Nixon administration marked the first point at which the United States and the Soviet Union encountered constraints that did not just come from the nuclear stalemate, or from the failure of ideologies to deliver what they had promised, or from challenges mounted by the deceptively “weak” against the apparently “strong.” They came as well now from a growing insistence that the rule of law—or at least basic standards of human decency—should govern the actions of states, as well as those of the individuals who resided within them.

I.

THERE HAD long been hope that force alone would not always shape relations among nations. “The greatest problem for the human species,” the philosopher Immanuel Kant wrote, as early as 1784, “is that of attaining a civil society which can administer universal justice.”6 Woodrow Wilson intended that the League of Nations impose upon states some of the same legal constraints that states—at least the more progressive ones—imposed upon their own citizens. The founders of the United Nations designed it in such a way as to repair the League’s many deficiencies while preserving its purpose: the new organization’s charter committed it “to the equal rights of men and women and nations large and small,” and to the establishment of conditions “under which justice and respect for the obligations arising from treaties and other sources of international law can be maintained.”7 The order that came from balancing power within the international system was no longer to be an end in itself: the priority henceforth would be to secure agreement, among the states that made up that system, upon some externally derived standard of justice.

It is difficult today to evoke the optimism that existed, at the time of its founding, that the United Nations might actually accomplish this task: such is the disrepute into which the organization has fallen in the eyes of its many critics. In 1946, though, the Truman administration trusted the United Nations sufficiently that it proposed turning over its atomic weapons and the means of producing them—admittedly under conditions it would have specified—to the new international body. Four years later, the United States took the North Korean invasion of South Korea to the United Nations instantly, and for the next three years fought the war that followed under its flag. Truman’s own commitment to global governance was deep and emotional: throughout his adult life he carried in his wallet the passage from Alfred Tennyson’s poem Locksley Hall which looked forward to “the Parliament of Man, the Federation of the World.”8 But the harsh realities of the Cold War quickly ensured that Tennyson’s dream—and Truman’s—remained only that. Although the United States and the Soviet Union were founding members of the United Nations, they each reserved the right of veto within the Security Council, the body charged with enforcing its resolutions. Great Britain, France, and China (still under Chiang Kai-shek’s Nationalists) received the same privilege. This meant that the United Nations could act only when its most powerful members agreed on the action, an arrangement that obscured the distinction between might and right. And the veto-empowered members of the Council were unlikely to reach such agreements because they differed so widely on how to define “justice.” For the Americans, that term meant political democracy, market capitalism, and—in principle if not always in practice—respect for the rights of individuals. For the British and the French, still running colonial empires, it meant something short of that; for the Chinese Nationalists, facing the prospect that the Chinese Communists might eject them from power, it meant even less. And for Stalin’s Soviet Union, “justice” meant the unquestioning acceptance of authoritarian politics, command economies, and the right of the proletariat to advance, by whatever means the dictatorship that guided it chose to employ, toward a worldwide “classless” society.

It was hardly surprising, then, that the United Nations functioned more as a debating society than as an organization capable of defining principles and holding states accountable to them. As George Kennan complained early in 1948, positions taken there resembled “a contest of tableaux morts: there is a long period of preparation in relative obscurity; then the curtain is lifted; the lights go on for a brief moment; the posture of the group is recorded for posterity by the photography of voting; and whoever appears in the most graceful and impressive position has won.” If the great powers could agree to rely on it for that purpose, Kennan added, this “parliamentary shadow-boxing . . . would indeed be a refined and superior manner of settling international differences.”9 But that was not to be. The general view in Washington—certainly Kennan’s—was that, as the Joint Chiefs of Staff had put it, “faith in the ability of the United Nations as presently constituted to protect, now or hereafter, the security of the United States would mean only that the faithful have lost sight of the vital security interest of the United States.”10 The United Nations General Assembly did manage to pass, in December of 1948, a “Universal Declaration of Human Rights.” But it did so without the support of the Soviet Union and its allies as well as Saudi Arabia and South Africa—all of whom abstained—and without providing any enforcement mechanisms.11

Far more deeply entrenched in the organization’s charter and in its practices was the principle of non-intervention in the internal affairs of sovereign states—even when the most powerful of these states violated that principle. There would be, thus, no United Nations condemnation when the Soviet Union used military force to suppress dissent in East Germany in 1953, Hungary in 1956, and Czechoslovakia in 1968, or when the United States employed covert action to overthrow the governments of Iran in 1953, Guatemala in 1954, and attempted to do so in Cuba in 1961 and in Chile a decade later. Nor did the United Nations protest the human costs involved when Stalin launched his postwar purges inside the Soviet Union and Eastern Europe, or when the United States aligned itself with authoritarian regimes to keep communists from coming to power in the “third world,” or when Mao Zedong allowed so many millions of Chinese to starve as a result of his Great Leap Forward.

What all of this meant, then, was that if constraints on power for the purposes of securing justice were to arise at all, they would have to come not from the United Nations but from the states that were themselves fighting the Cold War. That seemed improbable during the late 1940s and the early 1950s: why would a superpower limit its power? By the mid-1970s, though, the improbable had become irreversible. The process by which this happened was most visible in the United States, where the Cold War at first widened, but subsequently narrowed, the gap between the wielding of power in world affairs and the principles of universal justice.

II.

AMERICAN officials were, at first, reasonably confident that they could contain the Soviet Union and international communism without abandoning standards of behavior drawn from their own domestic experience.12 They believed firmly that aggression was linked to autocracy, and that a stable international order could best be built upon such principles as freedom of speech, freedom of belief, freedom of enterprise, and freedom of political choice. “The issue of Soviet-American relations is in essence a test of the over-all worth of the United States as a nation among nations,” Kennan wrote in the summer of 1947. “To avoid destruction the United States need only measure up to its own best traditions and prove itself worthy of preservation as a great nation. Surely, there was never a fairer test . . . than this.”13

It may have been a fair test, but it was not an easy one: almost at once pressures began to build to allow actions abroad that would not have been acceptable at home. The Marshall Plan itself—at first glance a successful projection of domestic values into the Cold War—illustrated the problem. Its goal was to secure political freedom by means of economic rehabilitation in the remaining non-communist states of Europe: only hungry and demoralized people, the plan’s architects assumed, would vote communists into office. But recovery and the restoration of self-confidence would take time; meanwhile balloting was already taking place. The problem was particularly acute in Italy, where a large communist party generously financed from Moscow looked likely to win the April, 1948, elections. Had it done so the effects—in the wake of the February coup in Czechoslovakia—could have been psychologically devastating. “If Italy goes Red,” one State Department adviser warned, “Communism cannot be stopped in Europe.”14 And with American aid only beginning to flow, the Marshall Plan had little beyond promises to rely upon.

The newly established Central Intelligence Agency had neither the capability nor the authority at the time to conduct covert operations: such was the relative innocence of the era. But with the State Department’s encouragement, it stepped into the breach. It quickly organized secret financing for the Christian Democrats and other non-communist parties in Italy, while supporting a letter-writing campaign by Italian-Americans to friends and relatives there. These improvised measures worked: the Italian communists were overwhelmed at the polls on April 18th–19th. Kennan concluded, as he later recalled, that “in the unusual circumstances prevailing . . . there was occasional need for actions by the United States Government that did not fit into its overt operations and for which it could not accept formal responsibility.”15 Shortly thereafter, the National Security Council expanded the role of the C.I.A. to include

propaganda, economic warfare; preventive direct action, including sabotage, anti-sabotage, demolition and evacuation measures; subversion against hostile states, including assistance to underground resistance movements, guerillas and refugee liberation groups, and support of indigenous anti-communist elements in threatened countries of the free world.

All of these activities were to be conducted in such a way “that if uncovered the US Government can plausibly disclaim any responsibility for them.”16 In short, American officials were to learn to lie.

So how did this square with Kennan’s earlier claim that the United States need only “measure up to its own best traditions” to “prove itself worthy of preservation as a great nation”? Kennan insisted that the State Department monitor C.I.A. activities to ensure that “plausible deniability” would not mean the lifting of all restraints: he personally expected “specific knowledge of the objectives of every operation and also of the procedures and methods employed where [these] involve political decisions.” He acknowledged that such initiatives would have to have “the greatest flexibility and freedom from the regulations and administrative standards governing ordinary operations.”17 They would, however, be rare: the option would be available “when and if an occasion arose when it might be needed,” but “[t]here might be years when we wouldn’t have to do anything like this.” As Kennan later admitted: “It did not work out at all the way I had conceived it.”18

The number of C.I.A. employees involved in covert operations grew from 302 in 1949 to 2,812 in 1952, with another 3,142 overseas “contract” personnel. They were stationed, by then, at forty-seven locations outside of the United States—up from seven in 1949—and the annual budget for secret activities had mushroomed from $4.7 million to $82 million.19 Nor were these actions infrequent. As the Eisenhower administration took office, the C.I.A. was regularly attempting to infiltrate spies, saboteurs, and resistance leaders into the Soviet Union, Eastern Europe, and China. It was financing ostensibly independent radio stations broadcasting to those countries, as well as labor unions, academic conferences, scholarly journals, and student organizations—some of them inside the United States. It was cooperating with the Air Force to fly reconnaissance missions that routinely violated the airspace of the U.S.S.R. and other communist states. It was experimenting with toxins and mind-control drugs. It was mounting counter-insurgency operations in the Philippines. And, working with local supporters and exile groups, it would successfully overthrow the left-leaning governments of Mohammed Mossadegh in Iran in 1953, and of Jacobo Arbenz Guzmán in Guatemala in 1954, both of whom had nationalized foreign-owned properties in their respective countries, causing Washington to suspect them of sympathy for communism.20

The expanding scale and audacity of covert operations led Kennan to admit, years later, that recommending them had been “the greatest mistake I ever made.”21 Few officials within the Truman and Eisenhower administrations shared that view. For them the issue was simple: the Soviet Union had been engaging in espionage, financing “front” organizations, subverting foreign governments, and seeking to control minds since the earliest days of the Bolshevik Revolution. It respected no moral or legal constraints. As NSC-68, a top-secret review of national security strategy, pointed out in 1950, “the Kremlin is able to select whatever means are expedient in seeking to carry out its fundamental design.” The principal author of that document was Paul Nitze, Kennan’s successor as director of the State Department’s Policy Planning Staff.
Confronted by such dangers, Nitze insisted, free societies would have to suspend their values if they were to defend themselves:

The integrity of our system will not be jeopardized by any measures, overt or covert, violent or non-violent, which serve the purposes of frustrating the Kremlin design, nor does the necessity for conducting ourselves so as to affirm our values in action as well as words forbid such measures, provided only they are appropriately calculated to that end and are not so excessive or misdirected as to make us enemies of the people instead of the evil men who have enslaved them.22

The chief purpose of NSC-68 had been to make the case for “flexible response”: a strategy of responding to aggression wherever it took place, without expanding the conflict or backing away from it. Eisenhower jettisoned that approach because of its costs, relying instead on the threat of nuclear retaliation.23 But he and subsequent presidents through Nixon retained the view, most clearly articulated in NSC-68, that the legal and moral restraints limiting government action at home need not do so in the world at large: within that wider sphere, the United States had to be free to operate as its adversaries did.

“[W]e are facing an implacable enemy whose avowed objective is world domination,” the Doolittle Report, a highly classified evaluation of C.I.A. covert operations, concluded in 1954. “There are no rules in such a game. Hitherto acceptable norms of human conduct do not apply.”24 Eisenhower agreed. “I have come to the conclusion that some of our traditional ideas of international sportsmanship are scarcely applicable in the morass in which the world now flounders,” he wrote privately in 1955. “Truth, honor, justice, consideration for others, liberty for all—the problem is how to preserve them . . . when we are opposed by people who scorn . . . these values. I believe that we can do it,” and here he underlined his words for emphasis, “but we must not confuse these values with mere procedures, even though these last may have at one time held almost the status of moral concepts.”25 And so the Cold War transformed American leaders into Machiavellians. Confronted with “so many who are not good,” they resolved “to learn to be able not to be good” themselves, and to use this skill or not use it, as the great Italian cynic—and patriot—had put it, “according to necessity.”

III.

IT MIGHT become necessary, the Doolittle Report suggested, for the American people to “be made acquainted with, understand and support this fundamentally repugnant philosophy.”26 But no administration from Eisenhower’s through Nixon’s tried publicly to justify learning “not to be good.” The reasons were obvious: covert operations could hardly remain covert if openly discussed, nor would departures from “hitherto acceptable norms of human conduct” be easy to explain in a society still resolutely committed to the rule of law. The resulting silence postponed, but did not resolve, the issue of how to reconcile Machiavellian practices with the constitutionally based principle of accountability, whether to Congress, the media, or the public at large. As a result, Americans did gradually become acquainted with the “repugnant philosophy” their leaders thought necessary to fight the Cold War, although rarely in ways those leaders had intended.

As the scope and frequency of covert operations increased, it became more difficult to maintain “plausible deniability.”27 Rumors of American involvement in the Iranian and Guatemalan coups began to circulate almost at once, and although these would not be confirmed officially for many years,28 they were persuasive enough at the time to give the C.I.A. publicity it did not want. By the end of the 1950s, it had an almost mythic reputation throughout Latin America and the Middle East as an instrument with which the United States could depose governments it disliked, whenever it wished to do so.

The consequences, in both regions, proved costly. In the Caribbean, the overthrow of Arbenz inadvertently encouraged communism: outraged by what had happened in Guatemala, Fidel Castro, Che Guevara, and their supporters resolved to liberate Cuba from Washington’s sphere of influence and turn it into a Marxist-Leninist state. When, after they seized power in 1959, the C.I.A. tried to overthrow them, it failed miserably. The unsuccessful Bay of Pigs landing in April, 1961, exposed the most ambitious covert operation the Agency had yet attempted, humiliated the newly installed Kennedy administration, strengthened relations between Moscow and Havana, and set in motion the series of events that would, within a year and a half, bring the world to the brink of nuclear war.29

Meanwhile, the Shah of Iran, restored to power by the Americans in 1953, was consolidating an increasingly repressive regime which Washington found impossible to disavow. Once again, a tail wagged a dog, linking the United States to an authoritarian leader whose only virtues were that he maintained order, kept oil flowing, purchased American arms, and was reliably anti-communist. Iranians were sufficiently fed up by 1979 that they overthrew the Shah, denounced the United States for supporting him, and installed in power, under the Ayatollah Ruhollah Khomeini, the first radically Islamist government anywhere in the world.30

Not all C.I.A. operations ended this badly. In April, 1956, one of the most successful of them was, quite literally, exposed when the Russians invited reporters to tour a tunnel the Agency had constructed, extending from West Berlin a third of a mile into East Berlin, by which it had intercepted Soviet and East German cable and telephone communications for more than a year. This early example of wiretapping elicited more praise than criticism in the United States, however: the general reaction was that this was exactly the sort of thing American spies should be doing.31 Two months later, the C.I.A. arranged for the publication of excerpts from Khrushchev’s secret speech denouncing Stalin at the 20th Party Congress. Obtained through Polish and Israeli sources, this purloined document provoked few qualms either, despite the fact that it fed the unrest that led to a near revolt in Poland and a real one in Hungary later that year. What did induce regrets were inadequately supervised broadcasts over the C.I.A.-financed Radio Free Europe which convinced many Hungarians that the United States would defend them from Soviet retaliation. The Agency quietly concluded that in this case it had indeed gone too far, but kept public embarrassment to a minimum.32

The first open debate over the ethics of espionage came in May, 1960, when the Russians shot down Francis Gary Powers’s U-2 near Sverdlovsk. Eisenhower had long worried about how he might justify such flights if they should ever become public: any Soviet violation of American airspace, he once admitted, would lead him to ask Congress for an immediate declaration of war. “Plausible deniability” provided some assurance that this double standard could be maintained. Given the altitude at which the U-2 operated, Eisenhower was told, neither the plane nor the pilot would remain intact if anything went wrong. Informed that the plane was down, the president therefore authorized an official lie: a State Department press spokesman announced that a weather aircraft had simply wandered off course. Khrushchev then gleefully displayed the remnants of the U-2, the photographs it had taken, and its pilot, alive and well—forcing a furious Eisenhower to acknowledge his falsehood. “I didn’t realize how high a price we were going to have to pay for that lie,” he later recalled. “And if I had it to do over again, we would have kept our mouths shut.”33

The idea that their leaders might lie was new to the American people. There were no serious consequences for Eisenhower, however: he would soon be leaving office, and most Americans admired the C.I.A.’s skill in building the U-2 and keeping it flying for so long—even if, like Eisenhower, they would never have tolerated Soviet flights over the United States. Shortly after taking office, President Kennedy had to admit that he too had lied when he denied, at a press conference just prior to the Bay of Pigs landing, that American forces would be used in any effort to overthrow Castro. To Kennedy’s astonishment, his approval rating in the polls went up: getting rid of a Marxist regime in the Caribbean was a popular cause, and the new president got credit for attempting it, even if he had failed. “The worse you do,” he concluded, “the better they like you.”34

But what if a president should lie—and do so repeatedly—in an unpopular cause? Lyndon Johnson knew that an expanded war in Vietnam would be just that. “I don’t think the people . . . know much about Vietnam and I think they care a hell of a lot less,” he worried privately in May, 1964. But “we haven’t got much choice, . . . we are treaty-bound, . . . we are there, [and if South Vietnam falls] this will be a domino that will kick off a whole list of others. . . . [W]e’ve just got to prepare for the worst.”35 Johnson sought to do this by denying, throughout the presidential campaign of that year, any intention to escalate the war, deliberately allowing his opponent, Barry Goldwater, to endorse that course of action. After his overwhelming victory, Johnson authorized the escalation he had promised not to undertake, apparently in the belief that he could win the war quickly before public opinion could turn against it. “I consider it a matter of the highest importance,” he instructed his aides in December, “that the substance of this position should not become public except as I specifically direct.”36

The war, though, did not end quickly: instead it escalated with no end in sight. Johnson knew that the prospects were grim, but he could not bring himself to explain this openly. His reasons went beyond his own personal political fortunes. He had presided, by mid-1965, over the greatest wave of domestic reform legislation since the New Deal, and there was more to be done. “I was determined,” he later recalled, “to keep the war from shattering that dream, which meant that I had no choice but to keep my foreign policy in the wings. . . . I knew the Congress as well as I know Lady Bird, and I knew that the day it exploded into a major debate on the war, that day would be the beginning of the end of the Great Society.”37

The dilemma, then, was a cruel one. American interests in the Cold War, Johnson believed, required that the United States persist in Vietnam until it prevailed. But he was also convinced that he could not reveal what it would take to win without sacrificing the Great Society: the nation would not simultaneously support major expenditures for both “guns” and “butter.” So he sacrificed public trust instead. The term “credibility gap” grew out of Johnson’s sustained attempt to conceal the costs—together with the pessimism with which the C.I.A. and other intelligence agencies, as well as his own war planners, evaluated the prospects for success—of the largest American military operation since the Korean War.38

It is difficult to understand how Johnson thought he could get away with this. Part of the explanation may simply be that when all alternatives are painful, the least painful one is to make no choice: certainly Johnson postponed choosing between the Great Society and the Vietnam War for as long as possible. Part of it may also have been Johnson’s personal belief that the most affluent society in the world could afford to spend whatever was required to ensure security abroad and equity at home, whatever the public or the Congress thought.39 But that economic argument failed to consider whether Americans could sustain their morale as the human costs of the war rose while the prospects of victory faded. By the beginning of 1968 several hundred American troops were being killed in action per week, and yet the Tet Offensive of late January and early February showed that no location within South Vietnam—not even the American embassy in Saigon—was secure. Tet turned out to be a military defeat for the North Vietnamese: the mass uprising they had hoped to provoke did not occur. But it was also a psychological defeat for the Johnson administration, and that at the time was more important. The president acknowledged this at the end of March when he refused to send still more troops to fight in the war, while announcing his own surprise decision not to seek re-election.40

It seems likely, though, that one other legacy of the early Cold War influenced Johnson’s handling of the Vietnam War: it was that American presidents had long been free to act abroad in ways for which they need not account at home. Had not Eisenhower authorized intercepted communications, violations of airspace, and in two instances the actual overthrow of foreign governments? Had not Kennedy failed to overthrow another, and been cheered for making the effort? It was easy to conclude, as Johnson entered the White House in 1963 on a wave of grief over Kennedy’s assassination and of goodwill toward himself, that the presidency was all-powerful: that he could continue to employ, as NSC-68 had put it, “any measures, overt or covert, violent or non-violent,” that would advance the American cause in the Cold War, without jeopardizing “the integrity of our system.” But by the time Johnson left the White House in 1969, that proposition looked much less plausible: the manner in which he had fought the Vietnam War had left the American system, both abroad and at home, in deep trouble.

The authors of NSC-68 had assumed that there could be separate standards of conduct in these two spheres: that American leaders could learn “not to be good” in waging the Cold War while remaining “good” within the framework of their own domestic democratic society. It had been hard enough to maintain that separation during the Eisenhower and Kennedy years: both presidents had been forced to admit that their “denials” in the U-2 and Bay of Pigs incidents had not been “plausible.” With the Vietnam War, the line between what was allowed overseas and what was permitted at home disappeared altogether. The Johnson administration found it impossible to plan or prosecute the war without repeatedly concealing its intentions from the American people, and yet the decisions it made profoundly affected the American people. Far from measuring up to “its own best traditions” in fighting the Cold War, as Kennan had hoped it would, the United States in fighting the Vietnam War appeared to be sacrificing its own best traditions of constitutional and moral responsibility.

IV.

RICHARD NIXON inherited this situation, then made it much worse. One of the most geopolitically adept leaders of modern times, he also happened to be the American president least inclined—ever—to respect constraints on his own authority. After all that had happened during the Johnson years, he still believed that the requirements of national security, as he defined them, outweighed whatever obligations of accountability, even legality, the presidency demanded. Nixon’s actions went well beyond the idea that there could be separate standards of behavior at home and abroad: instead he made the homeland itself a Cold War battleground. There, however, he encountered an adversary more powerful than either the Soviet Union or the international communist movement. It was the Constitution of the United States of America.

“I can say unequivocally,” Nixon wrote after resigning the presidency, “that without secrecy there would have been no opening to China, no SALT agreement with the Soviet Union, and no peace agreement ending the Vietnam war.”41 There is little reason to doubt that claim. To have consulted the Departments of State and Defense, the C.I.A., the appropriate Congressional committees, and all allies whose interests would have been affected prior to Kissinger’s 1971 Beijing trip would only have ensured that it not take place. To have attempted arms control negotiations with Moscow in the absence of a “back channel” that allowed testing positions before taking them would probably have guaranteed failure. And the only way Nixon saw to break the long stalemate in the Vietnamese peace talks—short of accepting Hanoi’s demands for an immediate withdrawal of American forces and the removal from power of the South Vietnamese government—was to increase military and diplomatic pressure on North Vietnam while simultaneously decreasing pressures from within Congress, the anti-war movement, and even former members of the Johnson administration to accept Hanoi’s terms. That too required operating both openly and invisibly.

Where Nixon went wrong was not in his use of secrecy to conduct foreign policy—diplomacy had always required that—but in failing to distinguish between actions he could have justified if exposed and those he could never have justified. Americans excused the lies Eisenhower and Kennedy told because the operations they covered up turned out to be defensible when uncovered. So too did the methods by which Nixon brought about the China opening, the SALT agreement, and the Vietnam cease-fire: the results, in those instances, made reliance on secrecy, even deception, seem reasonable.

But what about the secret bombing of a sovereign state? Or the attempted overthrow of a democratically elected government? Or the bugging of American citizens without legal authorization? Or burglaries carried out with presidential authorization? Or the organization of a conspiracy, inside the White House itself, to hide what had happened? Nixon allowed all of this during his first term; his reliance on secrecy became so compulsive that he employed that tactic in situations for which there could never be a plausible justification. So when plausible denial was no longer possible—in large part because Nixon, with his secret Oval Office taping system, had even bugged himself—a constitutional crisis became unavoidable.

The process began in the spring of 1969 when Nixon ordered the bombing of Cambodia in an effort to interdict the routes through that country and Laos along which the North Vietnamese had for years sent troops and supplies into South Vietnam. The decision was militarily justifiable, but Nixon made no effort to explain it publicly. Instead he authorized the falsification of Air Force records to cover up the bombing, while insisting for months afterward that the United States was respecting Cambodian neutrality. The bombing was no secret, obviously, to the Cambodians themselves, or the North Vietnamese, or their Chinese and Soviet allies. Only Americans were kept in the dark, and the reason, as Nixon later acknowledged, was to avoid anti-war protests. “My administration was only two months old, and I wanted to provoke as little public outcry as possible at the outset.”42 That, however, was how Johnson’s “credibility gap” had developed, and Nixon soon had one too. Exploiting well-placed sources, the New York Times quickly reported the bombing of Cambodia, as well as the administration’s plans to begin a gradual withdrawal of American troops from Vietnam. An angry Nixon responded by ordering wiretaps on the phones of several Kissinger assistants whom the Justice Department and the Federal Bureau of Investigation suspected of having leaked the information. They remained in place, with Kissinger’s approval, even after some of these aides had left the government, and they were soon extended to include journalists who could not have been involved in the original leaks.43 The line between defensible and indefensible secrecy, already blurred in the Johnson administration, was now even less distinct.

Then in October, 1970, the democratically elected Marxist government of Salvador Allende took power in Chile. Nixon claimed, in public, to respect this outcome: “[F]or the United States to have . . . intervened in a free election . . . would have had repercussions all over Latin America that would have been far worse than what has happened in Chile.”44 But his administration had intervened there, and was continuing to do so even as Nixon made this statement early in 1971. Following a precedent set by Johnson, the C.I.A. had undertaken a series of undercover initiatives meant to favor Allende’s opponents during the election campaign. When he won anyway, Nixon authorized the Agency “to prevent Allende from coming to power or to unseat him.”45 This led the C.I.A. to help set in motion a military coup that failed to prevent Allende’s inauguration, but that did result in the kidnapping and assassination of General René Schneider, the commander-in-chief of the Chilean armed forces. Over the next three years, the Agency persisted in its efforts to destabilize Allende’s regime.

Fortunately for the administration, none of this leaked at the time: instead Nixon got credit for his apparent restraint in Chile. But the gap between what appeared to be happening and what was actually happening was widening, while the prospects for defending the disparity—should it become public—were diminishing. Attempting to deny Allende the office he had won, one of Kissinger’s aides commented, was “patently a violation of our own principles. . . . If these principles have any meaning, we normally depart from them only to meet the gravest threat . . . to our survival. Is Allende a mortal threat to the U.S.? It is hard to argue this.”46

At home, even less defensible acts followed. In June, 1971, Daniel Ellsberg, a former Defense Department official, turned over to the New York Times what came to be called The Pentagon Papers, a classified history of the origins and escalation of the Vietnam War ordered by Johnson’s secretary of defense, Robert McNamara. Nothing in this history compromised national security or criticized Nixon’s handling of the war, but he regarded the leak as a dangerous precedent and a personal affront. Lacking confidence in the ability of the F.B.I. or the courts to deal with this and similar cases, the president demanded the formation of a team within the White House that would prevent the further unauthorized release of sensitive material. “We’re up against an enemy, a conspiracy,” he insisted. “We are going to use any means. Is that clear?”47 Nixon’s staff quickly assembled an improbable gang of retired police detectives as well as former C.I.A. and F.B.I. agents—soon to be known, for their assignment to plug leaks, as the “Plumbers.” Over the next year they undertook a series of burglaries, surveillance operations, and wiretaps that had to be kept secret because, despite their White House authorization, they were illegal. “I don’t think this kind of conversation should go on in the attorney general’s office,” a nervous Nixon aide commented, after the Plumbers had briefed the attorney general, John Mitchell, on their operations.48 Mitchell himself became nervous when, on the morning of June 17, 1972, several of the Plumbers found themselves under arrest inside the headquarters of the Democratic National Committee in the Watergate building—a place where, by the laws Mitchell was charged with enforcing, they were definitely not supposed to be.49

It would take until August 9, 1974—the date of Nixon’s resignation—for all the consequences of this bungled break-in to unfold. What was set in motion on the morning of the arrests, however, was a reassertion of moral, legal, and ultimately constitutional principle over presidential authority. It proceeded through the trial and conviction of the hapless burglars, their implication of the administration officials who had supervised and financed their operations, an increasingly startling series of revelations in the media, a diminishingly credible sequence of presidential denials, the appointment of a special prosecutor, a highly public Senate investigation, the exposure of Nixon’s Oval Office taping system, legal challenges to get the tapes released, the approval of impeachment resolutions by the House of Representatives, and in the end a Supreme Court ruling that the president had to turn over the single “smoking-gun” tape that proved his complicity in the cover-up.

At that point, facing conviction and removal from office, Nixon gave up his office. He thereby acknowledged that the president of the United States was not in fact free to use whatever means he considered necessary to protect national security interests. There was, even within that sensitive realm, a standard of behavior that he alone could not determine. Contrary to what Nixon had assumed, the president was not above the law.

V.

NOR DID the law itself remain static. The president’s behavior provoked Congress into reclaiming much of the authority over the conduct of national security policy that it had abdicated during the early Cold War. This happened first with respect to Vietnam, where by the end of January, 1973, Nixon and Kissinger had forced Hanoi to accept a cease-fire on terms that the United States could accept—and could impose on its reluctant South Vietnamese ally. But they had also had to withdraw almost all American troops from the region: that had been necessary to defuse domestic anti-war sentiment, while fending off pressures on Capitol Hill to legislate an end to American involvement in the war.

Nixon had no illusions that the North Vietnamese would willingly abide by the cease-fire. But he did expect to compel compliance by threatening—and if necessary resuming—the bombing that had caused Hanoi to accept the cease-fire in the first place. The United States had, after all, reserved the right to act similarly to enforce a Korean cease-fire that had lasted for two decades. The situation in Vietnam was less promising; still the hope, Kissinger recalled, was “that Nixon’s renown for ruthlessness would deter gross violations.”50

But Watergate had severely weakened the president. Frustrated by a long and bitter war, utterly distrustful of Nixon’s intentions, sensing that his authority was crumbling, Congress voted in the summer of 1973 to terminate all combat operations in Indochina. It then passed the War Powers Act, which imposed a sixty-day limit on all future military deployments without Congressional consent. Nixon’s vetoes were overridden and the restrictions became law. It was left to his successor, Gerald Ford, to suffer the consequences: when North Vietnam invaded and conquered South Vietnam in the spring of 1975, he was unable to do anything about it. “Our domestic drama,” Kissinger later commented, “first paralyzed and then overwhelmed us.”51

Much the same thing happened with intelligence operations. The C.I.A. had always operated under minimal Congressional oversight: the assumption had been that the nation’s representatives neither needed nor wanted to know what the Agency was doing. That attitude survived the U-2 and Bay of Pigs incidents, the onset and escalation of the Vietnam War, even the revelation, in 1967, that the C.I.A. had for years been secretly funding academic conferences, journals, and research, as well as the National Student Association.52 But it did not survive Watergate.

The evidence that former C.I.A. employees had been part of the Plumbers unit—and that Nixon had sought the Agency’s cooperation in arranging a cover-up—led to pressures from within the organization to review potentially illegal activities, and to scrutiny from without that was meant to expose them. In December, 1974, the New York Times revealed that the C.I.A. had run its own program of domestic surveillance against anti-war protesters during the Johnson and the Nixon administrations, involving both wiretaps and the interception of mail. The director of Central Intelligence, William Colby, promptly confirmed the story, acknowledging that the Agency had violated its own charter—which prohibited activities inside the United States—and that it had broken the law.53

There quickly followed the appointment of three commissions, one presidential and one each in the Senate and the House of Representatives, to investigate C.I.A. abuses. With Colby cooperating, the Agency’s “skeletons”—assassination plots, surveillance operations, concealed subsidies, connections to Watergate, and the attempt to prevent a constitutionally elected government in Chile from taking power—all went on public display. As had been the case during Nixon’s last years in office, the nation again faced the question of whether the United States should, or even could, maintain separate standards in fighting the Cold War from what it was prepared to accept at home.

Events in Chile posed the dilemma most clearly. A successful military coup had finally taken place in Santiago in September, 1973. It left Allende dead—probably by suicide—and a reliably anti-communist government in power headed by General Augusto Pinochet. Direct C.I.A. complicity was never established, but Nixon and Kissinger openly welcomed the outcome and sought to cooperate with the new Chilean leader. By the time the C.I.A. investigations got under way in 1975, however, Pinochet’s government had imprisoned, tortured, and executed thousands of Allende supporters—some of them American citizens. Chile, for many years a democracy, now had one of the most repressive dictatorships Latin America had ever seen.54

What the United States did in Chile differed little from what it had done, two decades earlier, in Iran and Guatemala. But the 1970s were not the 1950s: once the information got out that the Nixon administration had tried to keep Allende from the office to which he had been elected and had sought to remove him once there, “plausible denial” became impossible. That made questions about responsibility unavoidable. Could Allende have remained in power if there had been no American campaign against him? Would he have retained democratic procedures had he done so? Should the United States have refrained, to the extent that it did, from condemning Pinochet’s abuses? Had it made a greater effort, might it have stopped them? There are, even today, no clear answers: Washington’s role in Chile’s horrors remains a hotly contested issue among both historians of these events and participants in them.55 What was clear at the time, though, was that the C.I.A.’s license to operate without constraints had produced actions in Chile that, by its own admission, failed the “daylight” test. They could not be justified when exposed to public view.

Congress responded by prohibiting actions that might, in the future, lead to similar results. It chose to make this point in Angola, a former Portuguese colony where a three-way struggle for power was under way in 1975, with the competitors looking to the United States, the Soviet Union, and China for support. There was no possibility, in the aftermath of Vietnam, of direct American military intervention: covert funding for the pro-American National Front for the Liberation of Angola seemed the only available option. But with the C.I.A. under intense scrutiny, there was no way to arrange this without the approval of Congressional leaders, and as soon as they were consulted the plan became overt and opposition to it became intense. Because abuses had taken place in Chile and other parts of the world, the Senate voted in December, 1975, to deny any secret use of funds in Angola, despite the likelihood that this action would leave the country, by default, under Moscow’s influence. It was, Ford complained, an “abdication of responsibility” that would have “the gravest consequences for the long-term position of the United States and for international order in general.”56

That turned out to be an exaggeration. The Soviet Union had been reluctantly dragged into Angola by its Cuban ally, and gained little from the experience.57 What had happened in Washington, though, was significant: distrust between the executive and legislative branches of government was now so deep that the United States Congress was passing laws—always blunt instruments—to constrain the use of United States military and intelligence capabilities. It was as if the nation had become its own worst enemy.

VI.

IF THE White House, the Pentagon, and the C.I.A. were not above the law—indeed if legal standards could shift to guarantee this—then could the overall conduct of American foreign policy be held accountable to some comparably independent set of moral standards? Did learning “not to be good . . . according to necessity” mean abandoning all sense of what it meant to be “good” in working within the Cold War international system? And where, in all of this, did détente fit?

It would have been difficult, by any traditional moral principle, to justify the artificial division of entire countries like Germany, Korea, and Vietnam—and yet the United States and its allies had expended thousands of lives and billions of dollars to maintain those divisions. It strained democratic values to embrace right-wing dictatorships throughout much of the “third world” as a way of preventing the emergence of left-wing dictatorships, and yet every administration since Truman’s had done this. And surely Mutual Assured Destruction could only be defended if one considered hostage-taking on a massive scale—deliberately placing civilian populations at risk for nuclear annihilation—to be a humane act. American strategists did just that, however, because they saw no better way to deter a much greater evil, the possibility of an all-out nuclear war. As the Cold War wore on, they went from regarding these compromises as regrettable to considering them necessary, then normal, and then even desirable.58 A kind of moral anesthesia settled in, leaving the stability of the Soviet-American relationship to be valued over its fairness because the alternative was too frightening to contemplate. Once it became clear that everybody was in the same lifeboat, hardly anybody wanted to rock it.

This moral ambivalence was not moral equivalence. The United States never found it necessary to violate human rights on the scale that the Soviet Union, its Eastern European allies, and the Chinese under Mao Zedong had done. But Washington officials had long since convinced themselves that the only way they could prevent those violations would be to go to war, a prospect that could only make things much worse. American military action, John Foster Dulles warned publicly at the time of the 1956 Hungarian uprising, “would . . . precipitate a full-scale world war and probably the result of that would be all these people wiped out.”59 As late as the Soviet invasion of Czechoslovakia in 1968, the Johnson administration saw little it could do beyond protesting the offense, warning against repeating it elsewhere, and canceling the summit at which the outgoing president and the new Soviet leader, Leonid Brezhnev, were to have begun negotiations on limiting strategic arms. What happened in Eastern Europe, Johnson’s secretary of state, Dean Rusk, later explained, had “never been an issue of war and peace between us and the Soviet Union—however ignoble this sounds.”60 Détente had been meant to lower the risks of nuclear war, to encourage a more predictable relationship among Cold War rivals, and to help them recover from the domestic disorders that had beset them during the 1960s. It had not been intended, in any immediate sense, to secure justice: that could only emerge, most of its supporters believed, from within a balance of power that each of the great powers considered legitimate. Kissinger was the most thoughtful advocate of this position. Legitimacy, he had written in 1957 of the post-1815 European settlement, “should not be confused with justice.”

It implies the acceptance of the framework of the international order by all the major powers, at least to the extent that no state is so dissatisfied that . . . it expresses its dissatisfaction in a revolutionary foreign policy. A legitimate order does not make conflicts impossible, but it limits their scope.61

Kissinger was still making this point in October, 1973, after Nixon appointed him secretary of state: “The attempt to impose absolute justice by one side will be seen as absolute injustice by all others. . . . Stability depends on the relative satisfaction and therefore also the relative dissatisfaction of the various states.”

Kissinger was careful to caution against “becoming obsessed with stability.” An “excessively pragmatic policy” would “lack not only direction, but also roots and heart.” It would provide “no criteria for other nations to assess our performance and no standards to which the American people can rally.” But an “excessively moralistic” approach to Cold War diplomacy could become “quixotic or dangerous,” leading to “ineffectual posturing or adventuristic crusades.” The responsible policy-maker, therefore, “must compromise with others, and this means to some extent compromising with himself.”62 The morality inherent in détente lay in its avoidance of war and revolution, no small accomplishment in a nuclear age. Kant’s goal of universal justice, however, could only follow from a universal acceptance, for the foreseeable future, of the Cold War status quo.

This argument, however, left one issue unresolved: if détente was indeed diminishing the danger of nuclear war, then why would it continue to be so dangerous to apply moral standards in conducting the Cold War? If that conflict was becoming the normal condition of international relations, did that mean the United States would have to accept amorality as a permanent characteristic of its foreign policy? How would that square with Kissinger’s acknowledgment that “America cannot be true to itself without moral purpose”? 63 This was the dilemma the new secretary of state faced in taking over the direction of foreign policy from the increasingly beleaguered Nixon: securing the status quo abroad was making support for it vulnerable at home.

The vulnerabilities appeared most clearly with respect to human rights. Soon after the 1972 Moscow summit, Kremlin leaders imposed an exit tax on emigrants leaving the U.S.S.R., supposedly to recover the costs of their state-financed education. It seemed a small brutality compared to the many larger ones that had preceded it, but it came at a time when concerns were growing within the United States about the treatment of Soviet Jews and dissidents. The exit tax provoked a backlash in Congress, where Senator Henry M. Jackson and Representative Charles Vanik proposed an amendment to the otherwise routine Trade Reform Act that would have denied “most-favored nation” treatment and Export-Import Bank credits to any “non-market economy” that restricted or taxed the right to emigrate. The United States, Jackson argued—no doubt with his own presidential aspirations in mind—should use its economic strength, not to reward the Soviet Union for its external behavior, but to change its internal behavior: “When we have something we feel strongly about . . . [then] we should put that issue of principle on the table knowing that the Russians are not going to agree to it.”64

Kissinger protested that the provisions of the Trade Reform Act had been among the carefully balanced sticks and carrots that had persuaded the Soviet Union at last to agree on the limitation of strategic arms. To add new demands after the deal had been made—especially demands that required the Russians to alter internal policies as a result of outside pressure—could only be a mandate “for an unfulfillable course that sapped our credibility abroad without giving us the tools to deal with the consequences of the resulting tension.” Quiet diplomacy would do more for Soviet Jews, dissidents, and other potential emigrants than public posturing; and in the absence of an amicable Soviet-American relationship it would hardly be possible to do anything for them.65

Moscow’s objections to the Jackson-Vanik amendment had an even deeper basis. As Ambassador Dobrynin later admitted, “the Kremlin was afraid of emigration in general (irrespective of nationality or religion) lest an escape hatch from the happy land of socialism seem to offer a degree of liberalization that might destabilize the domestic situation.”66

What this meant, though, was that in its search for geopolitical stability, the Nixon administration had begun to support domestic stability inside the U.S.S.R. It had sought to manage the Cold War international system much as Metternich and Castlereagh had managed Europe after Napoleon—by balancing the antagonisms within it. But that 19th-century arrangement had accepted the internal character of the states being balanced: calls for reform, in the era Kissinger had written about as a historian, could easily be brushed aside. That was less easy to do in the more transparent and democratic age within which he himself sought to direct the course of events.

Kissinger never intended that détente would secure the future of Soviet authoritarianism. “Brezhnev’s gamble,” he had written Nixon in the summer of 1973, “is that as these policies gather momentum and longevity, their effects will not undermine the very system from which Brezhnev draws his power and legitimacy. Our goal on the other hand is to achieve precisely such effects over the long run.”67 But with Jackson-Vanik, the long run became the present: the amendment won support from opposite ends of the ideological spectrum. Liberals, convinced that foreign policy always ought to pursue justice, condemned Kissinger’s cynicism in seeking stability first. Conservatives, certain that the Soviet Union could never be trusted, denounced Kissinger’s naiveté in being willing to do so. And with Nixon approaching the end of his presidency, there was little he could do to help resist these pressures.

The Jackson-Vanik amendment passed both houses of Congress late in 1974 and became law early in 1975, several months after Nixon left office. The Soviet Union responded by canceling the entire trade deal. The causes of emigration, commerce, and détente itself suffered as a result: whatever “thaw” had occurred in the Cold War now seemed to be ending. But these events had advanced a different cause. Through a circuitous process involving its own constitutional checks and balances, the presidential aspirations of an ambitious senator, and the diminishing power of an ethically challenged president, the United States had wound up taking a position consistent with the 1948 United Nations Universal Declaration of Human Rights: that neither national sovereignty nor the demands of diplomacy should allow states to treat their own citizens in any way they pleased. There was, after all, if not a universal standard of justice, then at least a basic standard of human decency that ought to take precedence, even over efforts to stabilize the Cold War.

VII.

THIS REALIGNMENT of American strategy with legal and moral principles would have had little effect on the course of the Cold War, however, had there not been echoes of it on the other side. These were at first difficult to detect. The Soviet leadership appeared to have become less tolerant of dissent at home and in Eastern Europe than it had been during the last years of the Khrushchev era. The invasion of Czechoslovakia and its subsequent justification, the Brezhnev Doctrine, set the stage for a tightening of ideological discipline, a rejection of experimentation in the media and the arts, and the increasingly harsh repression of even mild political protests.68 However much détente might have improved relations with the West, Brezhnev and his colleagues seemed determined to control everything—even ideas—within their sphere of influence. They justified this not through an appeal to morality or law, but to ideology: to the claim that, in Marxism-Leninism, they had discovered the mechanisms by which history worked, and thus the means by which to improve the lives people lived.

But it had long been clear that history was not working in this way. Khrushchev revealed that Lenin and Stalin had enslaved far more people than they had liberated; and by the time of his overthrow the Soviet Union and its Eastern European satellites had fallen far behind the United States and most of the rest of the capitalist world in most of the economic indices that measured prosperity. It had even been necessary, in 1968, to use force to keep communism in power in Czechoslovakia, an act that shattered whatever illusions remained that anyone might voluntarily embrace that ideology. “Our tanks in Prague . . . ‘fired’ at ideas,” one young Soviet journalist wrote at the time. “With a fist to the jaw of thinking society, they thought they had knocked out . . . its thinking processes. . . . [Instead they] ‘awakened’ new layers within the Party intelligentsia who would repeat the [Prague] attempt with more success.”69

Not immediately, to be sure. It would take time for thoughts alone to ensure that tanks would never again be used. The suppression of the “Prague spring” did, however, have a powerful psychological effect: it led a growing number of people in the Soviet Union and Eastern Europe to continue to defer in public to Marxist-Leninist doctrine, while privately ceasing to believe in it. There developed what the historian Timothy Garton Ash has called a “double life”: “The split between the public and the private self, official and unofficial language, outward conformity and inward dissent. . . . I applaud conduct by the state that I would never endorse in private life.”70

It was just the opposite of what was happening inside the United States, where by the mid-1970s the gap between what people believed in and what their leaders did had significantly narrowed. The credibility gap was migrating from Washington to Moscow. And Brezhnev was even less prepared to deal with it than Nixon had been.

His problem was that the Communist Party of the Soviet Union, like all other ruling communist parties, drew its authority from its claim to historical infallibility: that left it vulnerable when events failed to follow the script. Once it became clear that that was happening, there was little left—apart from a morally and legally indefensible use of force, as in Czechoslovakia—to justify the party’s existence. Its legitimacy rested on an increasingly implausible ideology, and nothing more. Whatever the excesses of American leaders during the Vietnam and Watergate years, they never had to face that difficulty.

Brezhnev could have diminished the party’s vulnerability by qualifying its claim to a monopoly on wisdom—but that would have produced challenges to its monopoly on power, and that he was not prepared to do. “This is dangerous,” K.G.B. chief Yuri Andropov warned in a 1974 Politburo discussion of criticisms that had already surfaced from the Soviet Union’s most distinguished writer, Aleksandr Solzhenitsyn, and its most prominent physicist, Andrei Sakharov. “[T]here are hundreds and thousands of people among whom Solzhenitsyn will find support. . . . [I]f we remain inactive on Sakharov, then how will [other] academicians . . . behave in the future?”71 The only strengths these dissidents deployed lay in their pens, their voices, and their principles. Yet principles were contagious, and the Soviet system, shielded only by ideology, had insufficient immunity to them.

With internal reform too risky, the Kremlin leadership turned toward diplomacy: if the world acknowledged the legitimacy of its rule, then how could a few malcontents—even famous ones—get anyone else to object to it? That was one of the reasons why Brezhnev liked détente, a fundamental premise of which was that the West would not seek to alter the internal character of Marxist-Leninist regimes. The objective instead would be to encourage their responsible behavior within the international arena. That did not mean giving up the class struggle: Brezhnev insisted that it would continue where it safely could, especially in the “third world.”72 He was, however, prepared to concede the permanence of NATO and, by implication, a continuing role for the United States in Europe. In return, he expected the Americans and their NATO allies formally to ratify post–World War II boundaries in Eastern Europe.

This was not a new idea. As early as 1954, Molotov had proposed a conference at which the nations of Europe—but not the United States—would meet to confirm their existing borders. That plan went nowhere, but as Kissinger once noted, Moscow’s diplomacy “makes up in persistence what it lacks in imagination.”73 The Soviet foreign ministry revived Molotov’s proposal regularly over the next decade and a half, modifying it to include the Americans. Meanwhile, NATO had endorsed negotiations with the Warsaw Pact on mutual force reductions in Europe, while Brandt’s Ostpolitik had produced a Soviet–West German treaty recognizing the long-contested boundary of postwar Poland, as well as an agreement among the four powers occupying Berlin to continue the status quo in that city. It was clear, then, that no one had an interest in changing the European political map: that made renewed Soviet pressure for a “Conference on Security and Cooperation in Europe” seem relatively harmless to the Americans and, to several of their NATO partners, a potentially positive development.74

For Brezhnev, however, such a conference would mean much more. It would require the United States and its allies to state publicly and in writing that they accepted the postwar division of Europe. The Kremlin leader was almost capitalist in the importance he attached to this contractual obligation, which he believed would discourage future “Prague springs,” reinforce the Brezhnev Doctrine, deflate dissidents inside the U.S.S.R., and ensure his own reputation as a man of peace.75 And he was willing to make extraordinary concessions to get this commitment. They included promising advance notice for military maneuvers, permitting the peaceful change of international borders, allowing signatory states to join or leave alliances, and, most surprisingly, recognizing “the universal significance of human rights and fundamental freedoms . . . in conformity with the purposes and principles of the Charter of the United Nations and with the Universal Declaration of Human Rights.”76

The Russians were admittedly nervous about this last condition, but it had originated with the West Europeans and the Canadians, not the Americans, which made it difficult to oppose.77 Moreover, the liberties it specified appeared in the largely unimplemented Soviet constitution: that too would have made rejection awkward. Nor would it be easy, solely on these grounds, to back out of a conference for which the U.S.S.R. had pressed for so long. So the Politburo agreed, with misgivings, to the inclusion of human rights provisions in the conference’s “Final Act.” “We are masters in our own house,” Foreign Minister Andrei Gromyko assured Brezhnev. The Soviet government and no one else would decide what the recognition of “human rights and fundamental freedoms” actually meant.78

The Conference on Security and Cooperation in Europe opened in Helsinki on July 30, 1975. Brezhnev dozed through its many speeches, and two days later he, Ford, and the leaders of thirty-three other states signed the long and complex document that had brought them together. The consequences, on all sides, were unexpected. As Kissinger later put it: “Rarely has a diplomatic process so illuminated the limitations of human foresight.”79

VIII.

WITHIN THE UNITED STATES, liberals and conservatives alike denounced Ford and Kissinger for having abandoned the cause of human rights. Brezhnev’s motives in wanting the Helsinki agreement, they argued, were all too transparent: pursuing détente was hardly worth it if it meant perpetuating injustice by recognizing Soviet control in Eastern Europe. A series of administration missteps inadvertently advanced this argument. Just prior to the Helsinki conference, Kissinger had advised Ford not to receive Solzhenitsyn—by then an involuntary exile from the Soviet Union and a bitter critic of détente—at the White House: this came across as excessive deference to Moscow. Then, in December 1975, a Kissinger aide, Helmut Sonnenfeldt, told what he thought was an off-the-record meeting of American diplomats that the administration hoped to end the “inorganic, unnatural relationship” between the Soviet Union and the Eastern Europeans. When the comment leaked, it was taken as acknowledging that the Russians were in that part of the world to stay.80

These episodes made Helsinki a liability to Ford during the 1976 presidential campaign, as both Ronald Reagan, his challenger from within the Republican Party, and Jimmy Carter, the nominee of the Democratic Party, condemned the agreement. Ford found it necessary to prohibit subordinates from even using the word “détente”; he also disassociated himself from Kissinger as the election approached. And then on October 6th, while debating Carter, the president committed one final, fatal gaffe: briefed to deny the existence of the “Sonnenfeldt Doctrine,” he instead denied that the Soviet Union dominated Eastern Europe.81 That ensured Carter’s election, and so after January 20, 1977, neither Ford nor Kissinger retained any further responsibility for the conduct of American foreign policy. The Helsinki conference was one of the reasons.

Helsinki’s effects inside the Soviet Union and Eastern Europe, however, were equally unexpected, and far more significant. Brezhnev had looked forward, Dobrynin recalls, to the “publicity he would gain . . . when the Soviet public learned of the final settlement of the postwar boundaries for which they had sacrificed so much.”

As to the humanitarian issues, these could be mentioned at home just vaguely, without much publicity. He thought this would not bring much trouble inside our country. But he was wrong. The condition of Soviet dissidents certainly did not change overnight, but they were definitely encouraged by this historic document. Its very publication in Pravda gave it the weight of an official document. It gradually became a manifesto of the dissident and liberal movement, a development totally beyond the imagination of the Soviet leadership.82

Helsinki became, in short, a legal and moral trap.83 Having pressed the United States and its allies to commit themselves in writing to recognizing existing boundaries in Eastern Europe, Brezhnev could hardly repudiate what he had agreed to in the same document—also in writing—with respect to human rights. Without realizing the implications, he thereby handed his critics a standard, based on universal principles of justice, rooted in international law, independent of Marxist-Leninist ideology, against which they could evaluate the behavior of his and other communist regimes.

What this meant was that the people who lived under these systems—at least the more courageous—could claim official permission to say what they thought: perhaps it might not be necessary to live a “double life” for all time to come. Andropov’s 1974 nightmare became a reality as thousands of individuals who lacked the prominence of Solzhenitsyn and Sakharov began to stand with them in holding the U.S.S.R. and its satellites accountable for human rights abuses. By the summer of 1976 a Public Group to Promote Observance of the Helsinki Accords was operating in Moscow with Sakharov’s endorsement, and similar “Helsinki Groups” were sprouting throughout Eastern Europe.84 Begun by the Kremlin in an effort to legitimize Soviet control in that part of the world, the Helsinki process became instead the basis for legitimizing opposition to Soviet rule.

The effects, to put it mildly, were unpredictable. It is unlikely, for example, that the aging leaders in Moscow followed the fortunes of a scruffy, anti-establishment Czechoslovak rock band, the “Plastic People of the Universe,” formed in the aftermath of the invasion of that country in 1968. Given to performing in secret while dodging the police, the band ran out of luck in 1976, when its members were arrested. Their trial provoked several hundred intellectuals into signing, on January 1, 1977, a manifesto called Charter 77, which politely but pointedly called upon the Czechoslovak government to respect the free expression provisions of the Helsinki Final Act, which, with Brezhnev’s approval, it had signed. Several of the “Chartists” themselves were then arrested. One of them, the playwright—and lover of rock music—Václav Havel, spent four years in prison, followed by many more years of close surveillance after his release.85 That gave Havel the motive and the time, through his essays and plays, to become the most influential chronicler of his generation’s disillusionment with communism. He was, it has been said, “a Lennonist rather than a Leninist.”86

Havel did not call for outright resistance: given the state’s police powers, there would have been little point in that. Instead he encouraged something more subtle, developing standards for individual behavior apart from those of the state. People who failed to do this, he wrote, “confirm the system, fulfill the system, make the system, are the system.” But people who were true to what they themselves believed—even in so small a matter as a brewer deciding to brew better beer than the official regulations called for—could ultimately subvert the system. “[W]hen one person cries out, ‘The emperor is naked!’—when a single person breaks the rules of the game, thus exposing it as a game—everything suddenly appears in another light, and the whole crust seems then to be made of a tissue on the point of tearing, and disintegrating uncontrollably.”87

Havel gave voice—just as Brezhnev inadvertently gave legitimacy—to the pressures that had been building throughout the Soviet Union and Eastern Europe to end the double life that Marxism-Leninism had seemed to require: all at once a vision beckoned of a society in which universal morality, state morality, and individual morality might all be the same thing. At which point God, or at least His agents, intervened to make that vision an unexpected—and to the Kremlin a profoundly alarming—reality.

Karol Wojtyła, an accomplished actor, poet, playwright, and athlete, had entered the priesthood in 1946, and had been appointed archbishop of Kraków in 1964 with the full approval of the Polish Communist Party, which vetoed seven other candidates. It would be hard to find a clearer example of historical fallibility, for Pope Paul VI made Wojtyła a cardinal in 1967, and then on October 16, 1978, his fellow cardinals elected him, at fifty-eight, the youngest pope in 132 years, the first non-Italian pope in 455 years, and the first Slavic pope ever. “How could you possibly allow the election of a citizen of a socialist country as pope?” Andropov demanded of his unfortunate bureau chief in Warsaw. There was no good answer to this, for not even the K.G.B. controlled papal conclaves.

Nor, as it soon became clear, did it control the spiritual life of the Polish people. “The Pope is our enemy,” a desperate party directive warned, shortly before John Paul II made his first visit, as supreme pontiff, to his native country:

He is dangerous, because he will make St. Stanisław [the patron saint of Poland] . . . a defender of human rights. . . . [O]ur activities designed to atheize the youth not only cannot diminish but must intensely develop. . . . In this respect all means are allowed and we cannot allow any sentiments.

“Take my advice,” Brezhnev told Polish party leader Edward Gierek, “don’t give him any reception. It will only cause trouble.” When Gierek protested that he could hardly turn away the first Polish pope, the old man in the Kremlin relented: “Well, do as you wish. But be careful you don’t regret it.”88

It was, for once from Brezhnev, an accurate prediction of things to come. But it was too late to prevent them, because Wojtyła had been working quietly for years—as priest, archbishop, and cardinal—to preserve, strengthen, and expand the ties between the individual morality of Poles and the universal morality of the Roman Catholic Church. Now, as pope, he witnessed his success.

When John Paul II kissed the ground at the Warsaw airport on June 2, 1979, he began the process by which communism in Poland—and ultimately everywhere else in Europe—would come to an end. Hundreds of thousands of his countrymen cheered his entry into the city, shouting, “We want God, we want God!” A million greeted him the next day in Gniezno. At Częstochowa on the following day the crowds were even larger: here the pope slyly reminded the authorities that the church’s teaching on religious freedom “directly tallies with the principles promulgated in fundamental state and international documents, including the Constitution of the Polish People’s Republic.” By the time the pope reached his home city of Kraków, between 2 and 3 million people were there to welcome him, many of them the young people the party had hoped to “atheize.” “Who’s making all this noise?” the pope joked. “Stay with us!” they chanted in response. “Stay with us!” As he left the city in which, as he put it, “every stone and every brick is dear to me,” John Paul reiterated the great theme of his papacy: “Be not afraid.”

You must be strong, dear brothers and sisters . . . with the strength of faith. . . . You must be strong with the strength of hope. . . . You must be strong with love, which is stronger than death. . . . When we are strong with the Spirit of God, we are also strong with faith in man. . . . There is therefore no need to fear.89

“The Pope!” Josef Stalin was reputedly fond of asking. “How many divisions has he got?”90 John Paul II, during the nine days he spent in Poland in 1979, provided the answer. This too was a development, as Dobrynin might have put it, “totally beyond the imagination of the Soviet leadership.”
