Sometimes A Great Nation
The U.S. has a proud history of advancing human rights—and sometimes forgetting them. Historian Eric Foner proposes we give humility and respect a try.
A history of proud ideals and mixed results.
Throughout our history, the notion that the United States is a showcase for freedom has been a central part of our political culture, coupled with the belief that our nation is obligated to spread—by example, persuasion, or force—basic rights throughout the world. From the beginning we Americans have considered our nation a “universal” one—the embodiment of ideals that other nations should and would adopt. As early as 1776, in his great pamphlet Common Sense, Thomas Paine identified the cause of America with the fate of liberty throughout the world. An independent United States, he proclaimed, would be a unique embodiment of liberty in a world overrun with oppression, an “asylum for mankind.” Six months later, Thomas Jefferson began the Declaration of Independence by invoking “life, liberty, and the pursuit of happiness”—rights not confined to any one country or people but to be enjoyed by all humans.
But the realization of human rights in the United States is not a story of steady evolution toward a predetermined goal. It is the story of cyclical progress and retreat, of debate and struggle over the definition of human rights and over who is entitled to enjoy them. Time and again in American history, the definition of freedom has been transformed by the demands of those denied its blessings—racial minorities, women, workers, and others.
Taking the Lead
At times the United States has worked to realize its founding ideals. For much of its history, the nation has been an asylum for immigrants seeking rights denied them at home. In the 20th century, American leaders played a central role in articulating the ideal of worldwide human rights and in drafting documents that tried to define them. Even before the United States entered World War II, President Franklin D. Roosevelt spoke of the Four Freedoms—freedom of speech and religion, freedom from fear and want—that inspired the struggle against Nazi tyranny, and of his commitment to their enjoyment “everywhere in the world.”
The atrocities committed during World War II, as well as the global language of the Four Freedoms, forcefully raised the issue of human rights in the postwar world. After the war, the victorious Allies put German officials on trial at Nuremberg for crimes against humanity, establishing the principle that the international community may punish gross violations of human rights.
There is a definable set of rights belonging to all humans.
In 1948, the United Nations General Assembly approved the Universal Declaration of Human Rights (UDHR), drafted by a committee chaired by Eleanor Roosevelt. It identified a broad range of rights to be enjoyed by people everywhere, including political rights such as freedom of speech, religious tolerance, and protection against arbitrary government action, along with social and economic rights like the right to an adequate standard of living and access to housing, education, and medical care. However, the document had no enforcement mechanism. For that reason, some considered it an exercise in empty rhetoric. But its core principles—that there is a definable set of rights belonging to all humans and that a nation’s treatment of its own citizens should be subject to international evaluation—slowly became part of the language of world affairs.
During the Cold War, the idea of human rights became a propaganda tool. Neither the United States nor the Soviet Union could resist emphasizing certain provisions of the Universal Declaration while ignoring others. The Soviets claimed to provide all citizens with social and economic rights but violated democratic rights and civil liberties. Many Americans condemned the non-political rights as a step toward socialism (by this time, FDR’s “freedom from want” had disappeared from the political dialogue).
Eleanor Roosevelt had seen the UDHR as an integrated body of principles, a combination of traditional civil and political liberties with the social conditions of freedom. But to make it easier for member states to ratify the document amid Cold War tensions, the U.N. divided it into two “covenants”—Civil and Political Rights, and Economic, Social, and Cultural Rights. It took until 1992 for the U.S. Senate to ratify the first. It has never approved the second.
Ambivalent History
The mixed reception of the UDHR illustrates the complex, contradictory story of human rights in the American past and present. No idea is more fundamental to Americans’ sense of themselves as individuals and as a nation than freedom. The central term in our political vocabulary, freedom—or liberty, with which it is almost always used interchangeably—is deeply embedded in the documentary record of our history and the language of everyday life. The Declaration of Independence lists liberty among mankind’s inalienable rights; the Constitution announces as its purpose to secure liberty’s blessings. The United States fought the Civil War to bring about a new birth of freedom, World War II for the Four Freedoms, the Cold War to defend the Free World.
Yet the American Revolution, which proclaimed freedom as a universal human right, gave birth to a republic resting economically in large measure on slavery. When Jefferson wrote the Declaration, he owned more than 100 slaves, and slaves made up one-fifth of the population of the United States. Even as Americans celebrated their status as an “empire of liberty,” in Jefferson’s phrase, those entitled under the Constitution to enjoy the “blessings of liberty” were defined by race. The first Naturalization Act, passed in 1790, barred non-whites from becoming citizens of this “asylum for mankind.” No black person, declared the Supreme Court on the eve of the Civil War, even if born in this country, could ever be an American citizen or enjoy the rights of white persons.
The principles of birthright citizenship and equal protection of the law without regard to race were products of the antislavery struggle.
The slavery issue illustrates a central element of the development of human rights in this country. Our modern notion of human rights as a set of entitlements that transcend the boundaries of race and nationality owes less to the founding fathers than to the abolitionists, white and black, who struggled to end slavery and redefine freedom as a universal birthright, a truly human ideal.
The crusade against slavery, wrote Angelina Grimké, the daughter of a South Carolina slaveholder who became a leading abolitionist speaker, was the nation’s preeminent “school in which human rights are … investigated.” Moreover, she added, “the investigation of the rights of the slave, necessarily led to a better understanding of my own,” helping to inspire early feminism. “I know nothing,” she continued, “of men’s rights and women’s rights”—it was human rights for which she contended. Although it took many decades for women to gain legal and political equality, the principles of birthright citizenship and equal protection of the law without regard to race, which became central elements of American freedom, were products of the antislavery struggle.
Losing Freedom to Preserve It
The tension between the ideal of human rights and periodic violations of them persisted well after the end of slavery. Wars fought in the name of freedom have produced significant deprivations of liberty at home. World War I witnessed the most massive suppression of freedom of expression in American history, outbreaks of racial violence in major American cities, and severe restrictions on immigration that once again contradicted the image of the United States as an “asylum for mankind.” During World War II, Roosevelt’s Four Freedoms were juxtaposed with the internment of more than 100,000 Japanese-Americans.
Similar contradictions between rhetoric and reality have sometimes characterized American foreign relations. During the Cold War, the stated purpose of foreign policy was to defend freedom worldwide against the threat of communism. Yet to do so, the United States formed alliances with, or helped install, some of the world’s most brutal dictators, who systematically violated the human rights of their own citizens. The “Free World” of the Cold War era included such nations as Iran under the Shah, the Philippines under Marcos, and even South Africa under apartheid.
Freedom often depends on the existence of political power to enforce it.
President Jimmy Carter believed that in the post-Vietnam era, American foreign policy should de-emphasize Cold War thinking. In a 1977 address, he insisted that foreign policy could not be separated from “questions of justice, equity, and human rights.” He attempted to curb the murderous violence of death squads linked to the right-wing government of El Salvador, an ally of the United States. But he often found it impossible to translate rhetoric into action. The United States continued its support of allies with records of serious human rights violations, such as the governments of Guatemala, the Philippines, South Korea, and Iran. A similar contradiction, between insistence on America’s role as an international emblem of freedom and alliances with dictators abroad, marked the administration of Ronald Reagan.
During Bill Clinton’s presidency, reports by Amnesty International and Human Rights Watch strongly influenced world public opinion. Human rights emerged as a justification for intervention in matters once considered the internal affairs of sovereign nations. The United States dispatched the military to distant parts of the world as part of international missions to protect civilians. NATO’s intervention in the Balkans to stop “ethnic cleansing” during the breakup of Yugoslavia gave the organization a new purpose. Supranational institutions like the European Court of Human Rights gained new power to overturn national laws and court decisions that violated international human rights standards. Commentators proclaimed the birth of an international era of human rights.
Yet at the same time, the 1990s drew attention to the challenge to human rights arising from the rapidly accelerating process of economic globalization—the unregulated international flow of capital, labor, and investment. Globalization raises profound questions about the relationship between political sovereignty, national identity, and human rights. Despite the existence of international institutions dedicated to human rights, rights have historically been derived from membership in a nation state, and freedom often depends on the existence of political power to enforce it.
The terrorist attacks of September 11, 2001, brought a new retreat from human rights as official U.S. policy. Even before the attacks, the Bush administration had made clear its unwillingness to adhere to international agreements such as the Kyoto Protocol on global warming. Now, in launching a “war on terrorism,” it repudiated the U.N., the new International Criminal Court, and long-standing treaties governing the treatment of prisoners of war and those accused of crimes.
As in previous wars, the idea of an open-ended global battle between freedom and its opposite was invoked to justify serious infringements on civil liberties at home. Legal protections—including habeas corpus, trial by impartial jury, the right to legal representation, and equality before the law regardless of race or national origin—were curtailed. The Justice Department argued in court that both non-citizens and citizens accused of assisting terrorism could be held indefinitely without a charge—a policy that violates centuries of Anglo-Saxon jurisprudence, the Constitution, and human rights law. That policy was carried out in practice at Guantánamo Bay, where hundreds of detainees have been held for up to five years without charge or trial. It was codified in the Military Commissions Act of 2006 (MCA), which strips the courts of any ability to review detentions ordered by the executive branch. This provision of the MCA has already been challenged in the courts and its fate is uncertain.
The idea of an open-ended global battle between freedom and its opposite was invoked to justify serious infringements on civil liberties at home.
Officials of the Bush administration also insisted that the United States need not be bound by international law in pursuing the war on terrorism. They were especially eager to sidestep the Geneva Conventions and the International Convention Against Torture. White House counsel Alberto Gonzales, who later became attorney general, advised the president that the Geneva accords were “quaint” and “obsolete” in this “new kind of war.” The Defense Department approved methods of interrogation that most observers considered torture, a policy now made law in the MCA. In addition, the CIA set up a series of jails in foreign countries outside the traditional chain of military command and took part in the “rendition” of suspects—that is, kidnapping them and spiriting them away to prisons run by countries like Egypt, Yemen, and former communist states of Eastern Europe, where torture is practiced. The photographs of Abu Ghraib prisoners abused by American soldiers, which spread around the world in newspapers, on television, and on the Internet, did more than any event in living memory to undermine the reputation of the United States as a country that adheres to standards of civilized behavior, human rights, and the rule of law.
Return to Cooperation
If this brief history proves anything, it is that, as an 18th-century jurist remarked, “the price of liberty is eternal vigilance.” Despite the vital role the United States has played at certain points in our history in promoting the idea of human rights, respect for these rights cannot be taken for granted.
Shortly before his death in 1970, the historian Richard Hofstadter was interviewed by Newsweek. The result was a melancholy reflection on a society confronting what he called a “crisis of the spirit.” He referred to the turmoil of the 1960s—the anti-war movement, the black revolution, and the alienation of the young. Ultimately, he said, American society’s conception of itself must change. “I think that part of the trouble is that our sense of ourselves hasn’t diminished as much as it ought to.” The United States, he seemed to say, needs to accept limitations on its power to shape the world.
National humility will be bitter medicine for a nation that has always considered itself a city upon a hill, a beacon to the world. Yet American independence was proclaimed by men anxious to demonstrate, as Jefferson wrote in the Declaration of Independence, “a decent respect to the opinions of mankind.” If our nation’s commitment to human rights at home and abroad is to be reinvigorated after the dark era we are now living through, it will have to be, as it has been in the past, by Americans acting in cooperation with one another, and with the rest of humanity. No unilateral effort to reshape the world in our own image can succeed—not even in the name of freedom.