TIME Opinion

Lessons of the Fall of Saigon

Francoise De Mulder—Roger Viollet/Getty Images Saigon's fall and the taking of the presidential palace, on April 30, 1975

The Vietnam War changed the United States as much as it changed South Vietnam

Forty years ago today, on April 30, 1975, helicopters carried away the last Americans in Saigon as North Vietnamese troops entered the city. What followed showed that the war had changed the United States as much as it had changed South Vietnam.

Only 28 months before the end, President Nixon had announced that the war’s end would come with “peace with honor,” and promised to respond vigorously to any North Vietnamese violations of the peace agreement. But Congress had insisted upon a final end to military action in Southeast Asia in the summer of 1973, and Watergate had driven Nixon out of office a year later. Neither the US government nor its South Vietnamese ally, President Thieu, had shown any interest in implementing the provisions of the peace agreement designed to lead to genuine peace. The millions of young Americans who had served in South Vietnam from 1962 through 1972, and the thousands of planes that had flown bombing missions from carriers and airfields in the region, had proven time and time again that they could hold on to most of the country as long as they were there. But the Americans could do nothing about the political weakness of the South Vietnamese government. The communists still effectively ruled much of the countryside and had infiltrated every level of the South Vietnamese government from the Presidential palace on down. American money, not loyalty, had driven the South Vietnamese war effort. With no prospect of American help, the South Vietnamese Army simply collapsed in the spring of 1975 after Thieu ordered a precipitous withdrawal from the Central Highlands. The North Vietnamese won their final victory almost without fighting.

A variant of this sad story has already been replayed in Iraq, where tens of thousands of supposedly American-trained Iraqi Army troops melted away in 2014 when faced with ISIS. There, too, the American-backed government had totally failed to secure the allegiance of the population in Sunni areas. The same thing may well happen in Afghanistan, where a new President has already persuaded the Obama Administration to delay a final withdrawal. That was the overwhelming lesson of Vietnam: that American forces, no matter how large, cannot create a strong allied government where the local will is lacking.

Like most historical lessons, that one lasted for as long as men and women who were at least 40 years old in 1975 held power. Army officers like Colin Powell were determined never to see anything similar happen on their watch, and they kept the military out of similar situations in El Salvador and Lebanon during the Reagan years. Instead, the Soviet Union found its own Vietnam in Afghanistan, and that last foreign policy adventure helped bring Communism down. In 1990-91, George H. W. Bush decided to expel Saddam Hussein from Kuwait, but Powell and others made sure that operation would be carried out quickly, with overwhelming force, and with no long-term occupation of enemy territory. Bill Clinton, who had opposed the Vietnam War, kept the United States out of any ground wars as well.

The neoconservatives who took over policy and strategy under George W. Bush were either too young to have fought in Vietnam, or, like Bush (and, for that matter, myself), had served in non-combatant roles. Some of them had persuaded themselves that Vietnam would have been successful if the United States had sent South Vietnam more aid, and all of them were certain they could topple the Iraqi government without serious repercussions. Iraq in 2003 was about twice as populous as South Vietnam in 1962 and much larger in area, but they were certain that less than a third of the troops eventually needed in South Vietnam would do the job. They were wrong on all counts. Late in Bush’s second term, American troops showed once again that they could quiet an uprising as long as they remained in the country. But the Iraqi government was determined to see them leave, and last year it seemed that that government might go the way of President Thieu. That has not happened, but Baghdad seems to have lost control of much of the Sunni region for a long time to come.

President Gerald Ford was the American hero of the last phase of the Vietnam War. Although Congress had refused his requests for additional aid to the South in those last desperate weeks, he refused to blame Congress, war protesters, or the media for the fall of Saigon. On April 23, with the complete collapse of South Vietnam only days away, the President gave a major speech in New Orleans. “Today,” he said, “America can regain the sense of pride that existed before Vietnam. But it cannot be achieved by refighting a war that is finished as far as America is concerned. As I see it, the time has come to look forward to an agenda for the future, to unify, to bind up the Nation’s wounds, and to restore its health and its optimistic self-confidence.” This much-underrated President, who was destined to lose a close election in another 18 months, had caught the mood of the American people. Henry Kissinger, who had explained to Nixon in the fall of 1972 that the US could survive the eventual fall of South Vietnam if the South Vietnamese could clearly be held responsible, immediately began blaming the Soviet Union on the one hand, and Congress on the other, for the debacle. But Ford gave the American people permission to feel that they had given far more than anyone could ever have expected to this hopeless cause.

It seems today as if another frustrating series of interventions has temporarily vaccinated the US against any such large-scale deployments. Neither politicians nor military leaders will be eager to repeat the Iraq experience for a long time, and the Obama Administration has moved from “counterinsurgency” to “counterterror,” relying on drone strikes. But since the interventions in Iraq and Afghanistan seem likely to lead to endless chaos rather than to the symbolic fall of a capital, it seems unlikely that Obama or any future President will manage to put our Middle East adventure behind us in the way that Ford did for Vietnam. That is unfortunate, because great powers need to be able to come to grips with the limits of their power, especially in highly troubled times like our own.

The Long View: Historians explain how the past informs the present

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME Opinion

Lessons for Baltimore From 1968

Baltimore Arrest During Riot
Picasa / Getty Images A man carried away by police during riots, Baltimore, Maryland, 1968.

How history can heal a harmed city

In the 20 years that I have lived in Baltimore City, I have seen guns fired only twice; in each instance the targets were black men and the shooters were police. In one case the officer was trying to stop a group of men who had apparently stolen a car. They bailed out in front of my house, and as they were running away, the officer fired, but missed. In the second case the officer’s aim was better; an assailant held up a medical student on a bicycle, then ran through traffic right in front of our car. An off-duty cop saw the scuffle and fired. The assailant turned out to be a 14-year-old with a BB gun. The boy lay in the street, shot in the stomach; my 12-year-old son and I waited until the police told us to move on. I called my district and set up an appointment with a detective. No one ever came to question me.

Those incidents came back to me this week when the death of Freddie Gray triggered days of peaceful protests that splintered into something uglier on Saturday, and anti-police violence erupted on Monday. But those weren’t the only moments from the past that seemed worth thinking about. The looting and arson led to comparisons to the unrest that followed the assassination of the Rev. Dr. Martin Luther King, Jr.—and, as an assistant professor of history at the University of Baltimore who has studied Baltimore in 1968, I can see a number of similarities. After several days of peaceful commemoration of Dr. King’s death, disenfranchised youth instigated disturbances in fifteen neighborhood commercial districts. Curfews were imposed, just as they were in Baltimore this week, and hundreds of citizens were eventually swept into custody. During both of the crises, members of the clergy of all faiths walked the streets in attempts to restore order.

But the real link between the two moments, 1968 and today, runs deeper than that. It’s not about the appearance of similarity, but rather the causes and effects.

As UB discovered in a community-based, multi-disciplinary examination of the riots 40 years later, the causes and consequences of urban unrest are complex and multifaceted. As part of our project, our diverse student body interviewed their friends and family, and we heard stories that illustrated deep systemic trends that led to generations of anger and frustration: practices in the private sector, like residential covenants that forbade sales to black and Jewish buyers; federal policies, like redlining, that discouraged bank loans to poor and aging neighborhoods; urban renewal policies that used federal funds to build highways that cut neighborhoods off from the rest of Baltimore; and limited job opportunities as Baltimore’s blue-collar jobs began to evaporate. All of those forces had been at work long before Dr. King’s assassination, and, as we see violence along the same streets almost five decades later, Baltimoreans still feel their effects today.

We also heard stories about businesses that were destroyed after families had poured years of effort and capital into them. In 1968 the Pats family lost its pharmacy on West North Avenue, just a few blocks from the CVS that burned this Monday evening. Their business was looted, then their entire block was burned, including their apartment. Their neighbors, who lost their jewelry store, had been relocated to Baltimore after surviving the Holocaust. Baltimore’s retail sector has still not recovered in many areas of the city. A number of neighborhoods have been declared food deserts, and no department store exists within the city limits. When a Target arrived at Mondawmin Mall and hired city residents, Baltimoreans welcomed it. But on Monday night we watched with dismay as looters ran out of Mondawmin, their arms full of merchandise.

In 1968, the governor of Maryland called out the National Guard, just as Governor Larry Hogan did on Monday night, and soon tanks patrolled the city streets. The unrest quieted, and by the end of the week the Orioles held opening day on schedule.

Here’s where the stories diverge. Maryland’s then-governor, Spiro Agnew, rode the wake of Baltimore’s disturbances right into the White House, using his tough-on-crime reputation to become Richard Nixon’s vice-presidential running mate. It is too simplistic to say that the policing approach Agnew advocated led directly to the kind of practices that killed Freddie Gray, Michael Brown, and Eric Garner. We cannot exclude from the list of causes Nixon’s War on Drugs, the crack epidemic of the 1980s and ‘90s, the growth of the prison-industrial complex, and the continuing hemorrhaging of blue-collar jobs from America’s aging industrial cities—but the reaction to the urban riots of the 1960s certainly started us down this path.

The similarities can stop there. Knowledge of the aftermath of 1968 can help prevent its repetition. In the early 1970s, law-and-order policing reinforced divisions around race, class, and geography in an attempt to lock up the problems instead of addressing them. We can learn from those mistakes. On Tuesday morning the NAACP announced that it would open a satellite office in Sandtown-Winchester, Freddie Gray’s neighborhood, to provide counsel to residents on a host of legal issues, including police misconduct. An external oversight board to monitor reports of police violence would serve as a powerful partner in this effort. Out on the streets on Tuesday morning, Baltimoreans worked together to clean up the debris from the night. I hope that as we work we will find a chance to tell each other our stories, and that this time we will listen.

The Long View: Historians explain how the past informs the present

Elizabeth M. Nix is a professor of legal, ethical and historical studies at the University of Baltimore, and co-editor with Jessica Elfenbein and Thomas Hollowak of Baltimore ’68: Riots and Rebirth in an American City.

 

TIME Opinion

Exclusive: Dr. Oz Says ‘We’re Not Going Anywhere’

The physician and TV personality slams his critics and responds to their critiques

I started my show to give TV audiences advice on how to find a good life, not to practice medicine on air. This means celebrating them wherever they are in their search for health, and offering tools to nudge them along in the right direction. In the same hour-long show, a board-certified doctor will discuss cancer, followed by a celebrity sharing a personal weight loss story, and concluding with an audience member learning to manage their money better. I don’t expect all of my colleagues to understand this marriage between conventional medicine and the broader definition of wellness that the show pursues. I expect and respect the criticism of colleagues who struggle with my approach and I try to improve the show accordingly.

But I was surprised by a brazen note as I entered the operating room at New York Presbyterian/Columbia University this week. A small group of physicians unknown to me were asking my dean to revoke my faculty position for manifesting “an egregious lack of integrity by promoting quack treatments and cures in the interest of personal financial gain.”

The dean politely reinforced that the academic tradition of all institutions protects freedom of speech for their faculty, and I assumed the matter was over. The surgery went much better than the media fury around this letter. Within 12 hours, most major media outlets had published articles on the note, many mistakenly stating Columbia faculty were trying to oust me. Who were these authors and why were they attacking now?

With a few clicks and some simple searches, a remarkable web of intrigue emerged—one that the mainstream media has completely missed. The lead author, Henry I. Miller, appears to have a history as a pro-biotech scientist, and was mentioned in early tobacco-industry litigation as a potential ally to industry. He also furthered the battle in California to block GMO labeling—a cause that I have been vocal about supporting. Another of the letter signees, Gilbert Ross, was found guilty after trial of 13 counts of fraud related to Medicaid. He is now executive director of American Council on Science and Health, a group that has reportedly received donations from big tobacco and food and agribusiness companies, among others. Another four of the 10 authors are also linked to this organization.

I have spent my entire career searching for ways to lessen the suffering of my patients. The best and safest paths have generally been the traditions of conventional medicine. They are tried and true, well funded, and fast. But there are other routes to healing that offer wisdom as well, so I have been willing to explore alternative routes to healing and share any wisdom that can be gathered. I have done this throughout my career as a surgeon, professor, author and, of late, as a talk-show host. Despite being criticized, I want to continue exploring for myself and my audience. Why?

Because in some instances, I believe unconventional approaches appear to work in some people’s lives. They are often based on long-standing traditions from different cultures that visualize the healing process in very different ways from our Western traditions. They are aimed at chronic conditions like lack of energy, fogginess, or moodiness—which are frequently overlooked or under-treated by conventional practitioners. They are also often inexpensive. With limited profit motive, companies understandably do not wish to invest significant resources into proving benefit, so these unconventional remedies do not undergo rigorous clinical studies. So we are left with practitioners recommending therapies that they find effective in their own practices. When I interview an unusual or interesting person on my show, often it’s expository or out of fascination—not to tell my audience they should see a psychic instead of their primary care physician.

It’s vital that I drive the following point home: My exploration of alternative medicine has never been intended to take the place of conventional medicine, but rather as additive. Critics often imply that any exploration of alternative methods means abandoning conventional approaches. It does not. In fact, many institutions like mine use the names “complementary” or “integrative” medicine, which is also appropriate.

This can lead to confusion and irritation when analyzed by conventional physicians. For example, another daytime TV show and mine were recently noted in a BMJ article for only having proof for half of what we shared with the audience. A similar figure is often used to approximate the amount of randomized clinical trial data underlying conversations in physicians’ offices across America. This reflects the natural gap between what is proven in clinical trials and the needs of our patients.

The BMJ authors were correct in reporting that advising people with the flu to rest or cough into the crook of their arms is completely unproven. But major organizations like the Centers for Disease Control and Prevention (CDC) give rational advice of this nature that isn’t directly linked to a research paper. When there isn’t data, we rely on the non-literature-based guidance of the CDC, the National Institutes of Health, the Food and Drug Administration, the World Health Organization (WHO), as well as specialty professional organizations and experts. (The authors of the BMJ piece later acknowledged being “disappointed that the overwhelming commentary seems to be that our study somehow proves that Dr. Oz or The Doctors are quacks or charlatans or worse. Our data in no way supports these conclusions.”) The reality of being a healer is that we won’t ever know everything about our chosen field, which is what attracts many of us to medicine in the first place.

So I have traveled off the beaten path in search of tools and tips that might help heal. These explorations are fraught with their own unique peril. For example, my voyage into the land of weight loss supplements left me in a very unsavory place. I wish I could take back enthusiastic words I used to support these products years ago. And I understand the criticism I’ve received as a result.

I discovered problems in the promising research papers that supported some products; the products themselves were often poor quality; and scammers stole my image to promote fake pills. So I have not mentioned weight loss supplements for a year and have no plans to return to that neighborhood.

Other times the topics are controversial, but are still worthwhile, like our campaign supporting GMO labeling. And this brings me back to a motive for the letter. These doctors criticized my “baseless and relentless opposition to the genetic engineering of food crops,” which is another false accusation. Whether you support genetically engineered crops or not, the freedom to make an informed choice should belong to consumers. The bill in Congress this month proposing to block states from independently requiring labeling would be a coup for pro-GMO groups.

As a scientist, I am not that concerned about GMOs themselves, but I am worried about why they were created. Highly toxic herbicides would kill crops unless they were genetically modified, but with the genetic upgrade, these plants can be doused with much higher doses, with potential complications to the environment. The WHO believes that glyphosate is “probably a human carcinogen.” Perhaps we are all showing “disdain for science and evidence-based medicine,” but I would argue that unleashing these products creates a real-time experiment on the human species. Sure, we will eventually know if these pesticides are a problem, but at the expense of the pain and suffering and disease in real people. I owe my kids more. And so do you.

I know I have irritated some potential allies. No matter our disagreements, freedom of speech is the most fundamental right we have as Americans. We will not be silenced. We’re not going anywhere.

TIME politics

Once a Liberal Icon, Jefferson’s Now Claimed by Both Left and Right

Thomas Jefferson
Hulton Archive/Getty Images An engraving of American statesman Thomas Jefferson, circa 1790


History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Thomas Jefferson has been problematic for historians and divisive for culture warriors. An idealist who crafted language that remains beautiful and enduringly quotable, he has more recently stood in for the persistence of states’ rights and racial injustice. He has, however, never lost the universality expressed by Mikhail Gorbachev during a pilgrimage to Monticello, when the Russian affirmed that as he was conceiving reform in the Soviet Union he recurred to a college text that expounded Jefferson’s political principles.

Jefferson is a featured player in the political memory game as it has been practiced in America over the last century. Every country needs its national origins story. Ours, wearing the garb of American exceptionalism, has given generations a narrative, dating to 1776, that pronounces the moral worth of the founders and their humane principles. In this project, Jefferson is the principal author of the ingenious, hopeful script that we commemorate on select holidays and reintroduce in times of war or perceived danger. As “democracy’s muse,” he alternately soothes and buoys a militarily strong yet frustrated, disoriented nation marked by social contradictions. It is that phenomenon which I examine in my new book. I cover reimagined Jeffersons across the presidencies of the modern era, in debates on Capitol Hill, and among the columnists and popular authors who have aimed to bring the concept of Jeffersonian democracy up to date.

The political Left “owned” Jefferson from the New Deal through the 1960s; yet Ronald Reagan, as much a Jefferson lover as any Democrat, went far in converting the eloquent founder to the conservative cause–where he has largely remained to this day. Curiously, it is only on the Right that we find those who deny the link between Jefferson DNA and the mixed-race offspring of Monticello slave Sally Hemings; on the Right it is thought that plantation sex is a diminution of founding “greatness” and a threat to the moral underpinnings of the founding narrative.

Franklin D. Roosevelt was integrally involved in the planning and execution of the Jefferson Memorial, which he dedicated on the Virginian’s 200th birthday, April 13, 1943. Thousands gathered to witness the event. “Thomas Jefferson believed, as we believe, in man,” the president said on that day. Enlisting Jefferson in the ongoing struggle against Hitler, he continued: “He believed, as we believe, that … no king, no tyrant, no dictator can govern for them.” In conclusion, FDR explained the choice of the quote that wraps around the interior of the dome, which expressed “Jefferson’s noblest and most urgent meaning.” It reads: “I have sworn before the altar of God, eternal hostility against every form of tyranny over the mind of man.”

It was actually a line Jefferson had written in 1800 to Dr. Benjamin Rush, in reaction to the Christian Right of his day, which considered Jefferson an atheist and feared in a Jefferson presidency an abandonment of religious morals and imposition of non-belief. The recipient of the letter believed, unlike Jefferson, that Jesus was divine. The “altar of God” allusion was a subtle means to reach a man of faith, while stating that more suspicious men of faith need not fear his intrusion into anyone’s right of conscience. A few months after the letter to Dr. Rush, in his First Inaugural Address, the third president sent new signals, intentionally referring to Americans’ “benign” religion, “practiced in its various forms”; and he went on to say that the “blessings” of an “overruling Providence” needed only a “wise and frugal government” to complete the happiness of citizens in their new republic.

He spoke presidentially. The common invocation of “Providence,” like presidential recurrence to “God bless the United States of America” these days, was meant only to soothe. The historical Jefferson wrote most forcefully–and privately–about the dangers of “metaphysical speculations” and the need to employ reason to confront the “mischievous” dogmas repeated by ill-informed (or deceptive, manipulative) preachers.

Of course, not everyone regards the historical Jefferson with calm deliberation. Extracting Jefferson quotes has been a hobby of many over the years, and a major problem for single-issue politicians who have endeavored to translate their Jefferson into a spokesman for whatever they advocate. Public figures are guilty of removing the most emotionally resonant of the founders from historical context, and will often mingle legitimate phrases with invented ones. Monticello’s research department actively susses out spurious Jefferson quotes and posts explanations on its website. Channeling Jefferson during his unsuccessful run for the Republican nomination in 2012, Newt Gingrich told a questioning voter in New Hampshire that Jefferson, who grew hemp, would, if alive today, impose severe penalties for marijuana possession. Adoring Jefferson, Gingrich repeatedly decried the “radical secularists” who were ruining America.

I call this, somewhat tongue-in-cheek, “Jefferson abuse.” Of recent vintage, to complete the example of Jefferson’s religious views, is David Barton’s dramatic recovery of an evangelical Jefferson in his abortive book of 2012 (since pulled from the shelves by his publisher), titled The Jefferson Lies. Barton combined his stern rejection of the Jefferson-Hemings liaison as a moral impossibility with his insistence that Jefferson had never advocated “a secular public square.” His Jefferson “regularly prayed, believing that God would answer his prayers for his family, his country, the unity of the Christian church, and the end of slavery.”

The “wise and frugal government” of Jefferson’s First Inaugural has become, since the 1980s, a touchstone for fiscal conservatives. At the Jefferson Memorial on July 3, 1987, President Reagan broadcast a “Jeffersonian” dictum, citing the magnetic founder’s hope for a constitutional amendment “taking from the federal government the power of borrowing.” For Reagan, big government posed a threat to liberty as granted by Jefferson and his cohort. Ironically, like Reagan, Jefferson was an enemy of spending who ran up a sizeable debt.

Distortion of the historical Jefferson reminds us that people believe what they want to believe. Our democratic politics actually depends on a mass psychology that advances through artful manipulation. We may protest the “long train of abuses” (to quote from the Declaration) that attach to statements made in Jefferson’s name; but he continues to occupy a privileged position as we converse with the past and seek to reconcile it, somehow, with our relatively disorganized present. Whoever “owns” Jefferson (or the collective founders) takes themselves to be inheritors of America’s essential ideals.

Andrew Burstein is the Charles P. Manship Professor of History at Louisiana State University. His most recent book is Democracy’s Muse: How Thomas Jefferson Became an FDR Liberal, a Reagan Republican, and a Tea Party Fanatic, All the While Being Dead.

TIME Opinion

Another Similarity Between Lincoln and Obama: They Polarized the Nation

Abraham Lincoln portrait
Stock Montage / Getty Images Abraham Lincoln (1809-1865) posed for a formal portrait, mid-19th century.

Lincoln was a lightning rod—and Obama is too

Americans yearn for an end to political polarization and partisanship, and many today fault President Obama for failing to achieve consensus on his major initiatives: health care, immigration reform, foreign policy and so on. But consider Abraham Lincoln. From their state of origin to their legal backgrounds, the two presidents have drawn many comparisons, and here’s another: Despite his various efforts at outreach, our sixteenth president was, in life, an intensely polarizing and partisan figure, every bit as polarizing and partisan as our current president.

Lincoln’s presidency, which ended exactly 150 years ago today, sharply differed from the experience of his predecessors. Before Lincoln, five presidents had won a second term: George Washington, Thomas Jefferson, James Madison, James Monroe and Andrew Jackson. Each had carried both North and South in at least one of his presidential bids. By contrast, Lincoln was purely a regional candidate, despised by intense majorities in a large chunk of the country. In 1860, he received zero popular votes south of Virginia, and in 1864, none of the 11 states in Dixie held a valid presidential election, thanks to sectional war precipitated by Lincoln’s prior election. Even Lincoln’s assassination was related to regional differences: John Wilkes Booth was an intense southern partisan.

In the ensuing century and a half, many of America’s most successful presidents have managed to achieve considerable popularity in both North and South. Franklin Roosevelt, Harry Truman, Dwight Eisenhower, John Kennedy, Lyndon Johnson, Ronald Reagan and Bill Clinton all outdid Lincoln in this regard. But our current president won, twice, by following a more emphatically Lincolnian path to power—that is, a distinctly northern route: Of the 11 states in the former confederacy, Obama lost eight twice, and lost a ninth (North Carolina) once, prevailing twice only in Virginia and Florida.

In our era, as in Lincoln’s, regional polarization is on the upswing. Prior to 1850, the winning presidential candidate typically carried both North and South. But that pattern broke down in the 1850s, even before Lincoln rose to national prominence; and a similar fate has befallen Obama. At the presidential level the North and the South have backed different candidates in every one of the six most recent elections; and many states are becoming increasingly red or blue, presidentially. In 2012, only four swing states—Florida, Ohio, North Carolina and Virginia—were close enough to be decided by fewer than five points.

If we shift gears from regional polarization to political polarization, Lincoln and Obama once again appear as political doubles. Both made efforts to reach across the aisle. For example, Lincoln, a Republican, chose a former Democrat, Edwin Stanton, to serve as Secretary of War. Democrat Obama has symmetrically chosen Republicans Robert Gates and Chuck Hagel to fill the same slot, now renamed the Secretary of Defense. Still, Lincoln’s signature executive accomplishments were at risk in a judiciary dominated by appointees of the opposite political party; the same remains true for Obama. Shortly after Lincoln’s death, every single congressional Democrat voted against the Fourteenth Amendment, which codified Lincoln’s dream of birthright equality of all citizens; almost never before had America seen such 100% polarization. In our era, every single congressional Republican likewise opposed Obama’s signature health care plan.

But even on the topics where his proposals were most radical, Lincoln’s opponents’ arguments have not aged well. Shortly before his death, he signed a proposed constitutional amendment providing for an end to American slavery—immediately and with no financial compensation to slaveholders. Nothing like this had ever happened in any American jurisdiction where slavery was widespread. In 1860, less than 1% of America’s black population voted on equal terms. In 1870, all racial disfranchisement was constitutionally forbidden, building on another suggestion made by Lincoln himself in his last public speech, just days before he died.

That level of equality had been a new public stance for Lincoln, a break from his more cautious early views, much as Obama has only recently evolved to a position of open embrace of same-sex marriage. If the Supreme Court later this year constitutionalizes this egalitarian vision, following the lead of the latest lanky lawyer from Illinois to occupy the Oval Office, the decision will likely trigger howls of protest. These howls are likely to be loudest in those regions that hated Lincoln and all that he stood for when he was still standing. But Lincoln’s example should remind us that contemporary controversy does not necessarily mean that the judgment of history will be equivocal. Lincoln’s vision of racial equality has been vindicated by posterity; and the same seems highly likely for Obama’s vision of sexual-orientation equality. As Mark Twain is said to have noted, history never repeats itself—but it sometimes rhymes.

The Long View: Historians explain how the past informs the present

Akhil Reed Amar is a professor of law at Yale and author of the newly released book, The Law of the Land: A Grand Tour of our Constitutional Republic.

TIME Opinion

The ‘Obama Doctrine’ Echoes Kennedy and Nixon

President Barack Obama Hosts Easter Prayer Breakfast
Andrew Harrer—Bloomberg / Getty Images U.S. President Barack Obama smiles while speaking during the Easter Prayer Breakfast in the White House in Washington, D.C., on April 7, 2015.

Where does the “Obama doctrine” fit in with the history of presidents and foreign policy?

In a recent New York Times interview with Thomas Friedman, President Obama enunciated an “Obama doctrine” for dealing with nations such as Cuba and Iran: “We will engage, but we preserve all our capabilities.” By “engage,” he meant engage diplomatically; by “capabilities,” he meant our overwhelmingly superior military. Following the example of President James Monroe, a great many modern Presidents have enunciated explicit or implicit doctrines, including Truman, Eisenhower, Kennedy, Nixon, Carter, Reagan and George W. Bush. Some of them pushed the United States further forward in the world; others represented something of a step back. Putting Obama’s doctrine in historical perspective suggests that he is returning to an earlier tradition of American foreign policy represented above all by those two great rivals, Kennedy and Nixon—but, typically, Obama used vaguer, gentler language than any other President, continuing his endless, so far fruitless search for consensus.

The Truman Doctrine, presented to Congress and the world in a March 1947 speech, set the tone for the next 40 years of American foreign policy. Confronted with a civil war in Greece, President Truman argued that the United States should give aid to allied governments that were resisting internal rebellions aided by outside forces. While he did not specifically identify Moscow as the ultimate enemy, the speech became the basis for the containment strategy that ruled our foreign policy until 1989. And in fact, the Eisenhower, Nixon, Carter and Reagan doctrines were all simply extensions or modifications of containment. Eisenhower in 1957 announced that the United States would resist Communist encroachment in the Middle East. Nixon in 1969 stated that the United States would no longer send ground forces to help third-world allies threatened by Communist aggression, but would use naval and air power. Carter in 1980 announced that the United States would forcibly resist any Soviet attempt to move into the Persian Gulf region. Going a step beyond containment to liberation, Reagan in the 1980s declared that the United States would assist guerrillas fighting Communist regimes in the Third World.

More broadly, however, different Presidents enunciated different principles behind the ends and means of their foreign policy, particularly towards the Soviet Union. Thus, in his American University speech in June 1963, John F. Kennedy called for peaceful coexistence between the United States and the Soviet Union, founded on mutual respect for one another’s institutions and even beliefs. The Test Ban treaty followed in short order. Nixon not only enunciated his own doctrine, but declared, echoing Kennedy, that there could be no winners in a nuclear war, and embarked upon détente with the Soviet Union and arms-control treaties based upon equality. Reagan on the other hand declared, through subordinates, that the United States must prepare to fight and “prevail” in a nuclear war with the Soviets, whom he often argued, until the advent of Mikhail Gorbachev, could not be trusted.

The collapse of Communism and the end of the Cold War in 1991 opened up new vistas for foreign policy. George H. W. Bush essentially unveiled a new approach to aggression in response to Saddam Hussein’s occupation of Kuwait, using the United Nations to form an overwhelming coalition to roll back Saddam’s aggression without trying to overthrow him. (It is interesting that neither Kennedy nor Bush the elder, whom I consider the two most effective diplomats to have occupied the White House in my lifetime, saw fit to enunciate a “doctrine” that might serve as a substitute for case-by-case analysis.) In 1999 Bill Clinton went a fateful step further, undertaking the Kosovo war without the full support of the U.N. Security Council, and establishing a precedent that Vladimir Putin loves to invoke. But the real shift came in 2002, when George W. Bush’s administration, in a new National Security strategy, announced that the United States would use its overwhelming military power to prevent dangerous or hostile states from acquiring weapons of mass destruction, without regard to the rest of world opinion. This doctrine led us into Iraq, and had Iraq gone more smoothly, it might well have led us into Iran and North Korea as well.

The Obama doctrine seems to represent an explicit, although vaguely stated, return to a policy of containment and deterrence, in the tradition of Kennedy and Nixon. It repudiates not only preventive war, but also the fantasy that economic sanctions can bring down or fundamentally alter hostile regimes. Our sanctions against Castro have lasted the whole of Barack Obama’s natural life, without result. While the agreement with Iran may not work out as planned, Obama said, “Iran understands that they cannot fight us.” That argument—that Iran can be, and is being, effectively deterred from war by American power—trumps, in the President’s mind, Iran’s ideological stance, and especially its rejection of Israel’s right to exist. The President rejected Prime Minister Netanyahu’s demand that Iran be required to accept Israel’s existence as part of the deal. In the same way, Nixon and Kennedy (although not Reagan) made important agreements with Soviet leaders without trying to insist that they renounce their ideology.

Iran has in fact made remarkable concessions for the sake of the agreement. Because successive Presidents have declared that Iran must not have a nuclear weapon, we tend to forget that nothing in international law forbids them from enriching uranium, and only the Nonproliferation Treaty, which the Iranian government signed but could denounce, as North Korea did, forbids them from developing weapons. The Iranian government, a proud and militant regime, is surrendering a good deal of its sovereignty for the sake of improving its economy and its relations with the West.

The President is moving towards a new, more realistic approach to the Arab world, one that does not demand a wholesale capitulation to American policies and values as the price of any cooperation with the United States. Given the intense opposition that he faces at home, however, I do not know if he can bring this shift about without using less tentative and more inspiring language than he did in his interview with Friedman. Both Kennedy and Nixon, following parallel courses in their dealings with Moscow, spoke boldly of a new era of peace and a potential end to a nuclear nightmare. Today’s President remains “No drama Obama.” The shift he is proposing is truly dramatic, especially against the background of the last 15 years, but it may need more inspiring rhetoric to turn his vision into reality—lest his opponents’ jeremiads against the dangers of the agreement drown out his very sensible arguments for it.

The Long View: Historians explain how the past informs the present

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME Opinion

Who Are the Nuclear Scofflaws?

atomic bomb
US Army / The LIFE Picture Collection / Getty Images Photos recorded by a U.S. Army automatic motion picture camera six miles distant when an atomic bomb was exploded at Alamogordo in 1945

Surprise: The US is on the list. So is Russia. But Iran? Nope

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Given all the frothing by hawkish U.S. Senators about Iran’s possible development of nuclear weapons, one might think that Iran was violating the nuclear Non-Proliferation Treaty (NPT).

But it’s not. The NPT, signed by 190 nations and in effect since 1970, is a treaty in which the non-nuclear nations agreed to forgo developing nuclear weapons and the nuclear nations agreed to divest themselves of their nuclear weapons. It also granted nations the right to develop peaceful nuclear power. The current negotiations in which Iran is engaged with other nations are merely designed to guarantee that Iran, which signed the NPT, does not cross the line from developing nuclear power to developing nuclear weapons.

Nine nations, however, have flouted the NPT by either developing nuclear weapons since the treaty went into effect or failing to honor the commitment to disarm. These nine scofflaws and their nuclear arsenals are Russia (7,500 nuclear warheads), the United States (7,100 nuclear warheads), France (300 nuclear warheads), China (250 nuclear warheads), Britain (215 nuclear warheads), Pakistan (100-120 nuclear warheads), India (90-110 nuclear warheads), Israel (80 nuclear warheads), and North Korea (10 nuclear warheads).

Nor are the nuclear powers likely to be in compliance with the NPT any time soon. The Indian and Pakistani governments are engaged in a rapid nuclear weapons buildup, while the British government is contemplating the development of a new, more advanced nuclear weapons system. Although, in recent decades, the U.S. and Russian governments did reduce their nuclear arsenals substantially, that process has come to a halt in recent years, as relations have soured between the two nations. Indeed, both countries are currently engaged in a new, extremely dangerous nuclear arms race. The U.S. government has committed itself to spending $1 trillion to “modernize” its nuclear facilities and build new nuclear weapons. For its part, the Russian government is investing heavily in the upgrading of its nuclear warheads and the development of new delivery systems, such as nuclear missiles and nuclear submarines.

What can be done about this flouting of the NPT, some 45 years after it went into operation?

That will almost certainly be a major issue at an NPT Review Conference that will convene at the UN headquarters, in New York City, from April 27 to May 22. These review conferences, held every five years, attract high-level national officials from around the world to discuss the treaty’s implementation. For a very brief time, the review conferences even draw the attention of television and other news commentators before the mass communications media return to their preoccupation with scandals, arrests, and the lives of movie stars.

This spring’s NPT review conference might be particularly lively, given the heightening frustration of the non-nuclear powers at the failure of the nuclear powers to fulfill their NPT commitments. At recent disarmament conferences in Norway, Mexico and Austria, the representatives of a large number of non-nuclear nations, ignoring the opposition of the nuclear powers, focused on the catastrophic humanitarian consequences of nuclear war. One rising demand among restless non-nuclear nations and among nuclear disarmament groups is to develop a nuclear weapons ban treaty, whether or not the nuclear powers are willing to participate in negotiations.

To heighten the pressure for the abolition of nuclear weapons, nuclear disarmament groups are staging a Peace and Planet mobilization, in Manhattan, on the eve of the NPT review conference. Calling for a “Nuclear-Free, Peaceful, Just, and Sustainable World,” the mobilization involves an international conference (comprised of plenaries and workshops) on April 24 and 25, plus a culminating interfaith convocation, rally, march, and festival on April 26. Among the hundreds of endorsing organizations are many devoted to peace (Fellowship of Reconciliation, Pax Christi, Peace Action, Physicians for Social Responsibility, Veterans for Peace, and Women’s International League for Peace & Freedom), environmentalism (Earth Action, Friends of the Earth, and 350NYC), religion (Maryknoll Office for Global Concerns, Unitarian Universalist UN Office, United Church of Christ, and United Methodist General Board of Church & Society), workers’ rights (New Jersey Industrial Union Council, United Electrical Workers, and Working Families Party), and human welfare (American Friends Service Committee and National Association of Social Workers).

Of course, how much effect the proponents of a nuclear weapons-free world will have on the cynical officials of the nuclear powers remains to be seen. After as many as 45 years of stalling on their own nuclear disarmament, it is hard to imagine that they are finally ready to begin negotiating a treaty effectively banning nuclear weapons―or at least their nuclear weapons.

Meanwhile, let us encourage Iran not to follow the bad example set by the nuclear powers. And let us ask the nuclear-armed nations, now telling Iran that it should forgo the possession of nuclear weapons, when they are going to start practicing what they preach.

Dr. Lawrence Wittner (http://lawrenceswittner.com) is Professor of History emeritus at SUNY/Albany. He is the author of “Confronting the Bomb: A Short History of the World Nuclear Disarmament Movement” (Stanford University Press).

TIME Opinion

Before You Pick a College, Decide If You Want to Go Greek

Fraternity house exterior
John Greim—LightRocket via Getty Images Fraternity house exterior

Why deciding whether to join a fraternity or sorority should be a major part of the college selection process

As the college acceptances roll in over the next few weeks, kids and parents will be making some tough decisions about which school to pick: city or country? Big school or small school? Close to home or far away?

But there’s a major consideration that few kids take seriously, one that’s almost as important as financial aid and academic opportunity. Lost in the frenzy about dorm style and class size and sports ranking is one factor that could have an enormous effect on you for the next four years: Greek life.

The truth is, deciding to join a fraternity or sorority is as much about the campus dynamic as it is about a student’s own preferences. At a campus with a prominent Greek presence, so much of the social scene is dominated by fraternities and sororities that deciding not to join may have social consequences. That’s why students should decide how they feel about Greek life before they pick a campus, not after.

Because once you get to school, it may feel like that decision has been made for you. On a heavily Greek campus, choosing not to join can affect your housing and dining options as well as your social life. At many schools, the choice is virtually nonexistent: at the University of Texas-Pan American, 100% of women on campus are in sororities and 99% of men are in fraternities; at Washington and Lee University, 82% of men and women go Greek. This kind of overwhelming majority is rare, but Greek life can still feel pervasive even at campuses with far lower rates of enrollment: at the University of Oklahoma, which has recently been embroiled in scandal over a racist chant sung by frat brothers, only 26% of male students are in frats.

True, the vast majority of people who participate in Greek life are thoughtful, productive members of society with no interest in racist chants or hazing anybody to death. Most fraternities and sororities were originally founded as philanthropic organizations, and many still make enormous contributions to their communities. But as we’ve seen recently, it can take just a few bad apples to change the way fraternity members behave as a group.

Going Greek can be risky business. In the last two weeks, five national fraternity chapters have been suspended for unethical and possibly illegal behavior. First, Sigma Alpha Epsilon frat brothers at the University of Oklahoma were taped singing a racist chant that resulted in the suspension of the chapter and the expulsion of two members. Then, the Penn State chapter of Kappa Delta Rho was suspended after police found a secret Facebook page full of pictures of nude, passed-out women–an incident which could lead to criminal charges. The University of South Carolina chapter of Pi Kappa Alpha was suspended Wednesday after the suspicious death of a student, the same day the University of Houston closed its Sigma Chi chapter after allegations of hazing. And last week, Washington & Lee suspended its chapter of Phi Kappa Psi over allegations that frat brothers hazed pledges with tasers. And that’s not even getting started on the sexual assault statistics: multiple studies have shown that men who join fraternities are statistically more likely to commit rape than men who don’t.

You might be thinking: how could anybody behave like that? But when you join a Greek organization, personal responsibility can get diluted into the group mindset. “People lose their sense of individuality when they become a member of a group,” explains Dr. Brad Bushman, a professor of communication and psychology at Ohio State University. “Although a group is comprised of individuals, the individuals don’t necessarily think for themselves.”

Even Will Ferrell, a former brother of Delta Tau Delta who played an overgrown frat boy in the movie Old School, thinks fraternities are problematic. “The incident in Oklahoma, that is a real argument for getting rid of the system altogether, in my opinion, even having been through a fraternity,” he said in a Q&A with the New York Times. “Because when you break it down, it really is about creating cliques and clubs and being exclusionary.”

And if you want to avoid that atmosphere, your best bet might be to avoid campuses where the Greek scene rules–the Princeton Review lists the schools that have the most Greek life, and US News & World Report lists the schools with the highest percentage of students in frats and sororities.

But even on campuses where fewer than half the students rush, Greek life can feel ubiquitous. “Going into school I didn’t really have any exposure to Greek life,” says Dylan Tucker, a senior psychology major at Cornell University who chose not to rush a frat. “But once I got here, I was a little bit surprised at how prominent Greek life was, how many people who were in frats.” At Cornell, only 27% of men are in fraternities, but it can feel like much more than that.

Tucker was able to make friends through the basketball scene, but he says if you’re not in a frat, it can be hard to meet people unless you participate in another activity. “If you don’t plan on being in a frat or sorority, people should be aware that it can affect your ability to make friends,” he says. “If you’re going to a school that has a very prominent Greek life, be aware that you will be excluded from a lot of events and things.”

So when it comes to going Greek, you can be damned if you do, damned if you don’t: joining can lead to risky situations, but resisting can feel isolating. That’s why you should decide on Greek life before you decide on a campus, so the choice is actually up to you.

TIME Opinion

Monica Lewinsky and Why the Word ‘Slut’ Is Still So Potent

Monica Lewinsky
Amanda Edwards—WireImage/Getty Images Monica Lewinsky in Los Angeles, Dec. 7, 2014.

Lewinsky was a 22-year-old intern when her affair with Bill Clinton branded her with a Scarlet Letter S. Nearly two decades later, she's still suffering the repercussions. Why is the word slut still so damning?

Slut.

Tart.

Whore.

That Woman.

Those were the words used to describe Monica Lewinsky, the once 22-year-old intern who had an affair with the President. She is 41 now and speaking publicly about the impact of that relationship for the first time. When those words weren’t used to describe her, they were simply understood as what defined her.

Almost two decades later, those are the same words — though slightly updated — used daily to harass, threaten and humiliate young women and girls who deviate from the sexual (and sometimes not-so-sexual at all) norm, both at school and online.

History met the present recently at a Manhattan performance of a play called SLUT, where Monica Lewinsky watched the story of a teen girl who is assaulted, reports it, and is slut-shamed by her peers. I sat next to Lewinsky as she watched the drama play out. At the end, she stood up, surveyed the young faces in the audience, and spoke: “Thank you,” she told the crowd, “for standing up against the sexual scapegoating of women and girls.” Afterwards, girls crowded around Lewinsky to express their own gratitude for her outspokenness.

The Lewinsky scandal broke in 1998. SLUT the play takes place today. In between, the word has been used by Rush Limbaugh to discredit Sandra Fluke, a law student who spoke up for birth control; to debate the validity of sexual assault claims; and, more often than one could count, to talk viciously about women on the Internet. (Just this week, Ashley Judd proclaimed she would sue her slut-shaming harassers on Twitter.)

What is it about the word slut that is still so potent?

Slut didn’t begin as a bad word — or a word for women at all — but merely an “untidy” one. Chaucer (yes, that Chaucer) put it in print in the late 1300s, referring to a sloppy male character as “sluttish” in The Canterbury Tales.

But if the word was used for men more broadly it was only for a second: by the 1400s, it had morphed into a term for maids and unkempt, dirty women (like actually dirty, not sexually dirty). It wasn’t long before that notion was infused with sexual connotations. Today, the term is defined by the Oxford Dictionary as a woman who “has many casual sexual partners” or one with “low standards of cleanliness” — though it’s clear that in our modern lexicon, those two might as well be one and the same.

Sure, there have been positive usages or attempts to take slut back: Kathleen Hanna famously scrawled the word across her stomach while on stage with her riot grrrl band Bikini Kill in the ’90s; there is the SlutWalk movement, an effort to reclaim the word.

But by and large one definition remains: Slut is loaded. Slut is bad. Much in the way that Lewinsky became a kind of public symbol, said the linguist Robin Lakoff, “of all that is sexually loathsome and scary about women,” the word slut — and its linguistic sisters, ho, whore, tramp, and skank — is a stand-in for the same: used to describe women who deviate from the norm.

“Girls are still targeted when they cross some kind of boundary,” said Eliza Price, a 16-year-old cast member in the SLUT play, which is produced by an all-girl theater group called the Arts Effect.

But that boundary can be almost anything: clothing, behavior, attitude or something else. As a group of Mississippi teens described it to the author Rachel Simmons in her book Odd Girl Out, a girl can be a slut — or in this particular interview, a “skank” — if she sits with her legs open, wears baggy clothes, wears tight clothes, talks in slang, gets into fights, or shows too much PDA. “In other words: almost anything,” said Simmons. “‘Slut’ and its cousin ‘skank’ are used to denote girls who take up space and break the good girl rules.”

And sometimes that has nothing to do with sex. Leora Tanenbaum, the author of a new book, I Am Not a Slut, has interviewed girls and women who’ve been labeled with the word — coining, in 1999, the term “slut-bashing,” which would later evolve into “slut-shaming.” But being called a slut, she found, actually had little to do with whether or not these girls were sexually active. Rather, anybody could be called a slut, she said. The word was a catch-all to discredit women; for young women, it was a way to define them before they got the chance to define themselves.

And while words like bitch have an action associated with them — i.e., if you change your behavior you might be able to shed the label — the word slut is forever.

“Once you’re labeled a slut, it’s pretty much impossible to rid yourself of it,” explains Winnifred Bonjean-Alpart, 17, the lead actress in the play and a high school student in New York. As another young actress explained it: You can be valedictorian, class president and prom queen, but if more than one person calls you slut, all that gets wiped away.

And the Internet makes that even more the case. “In the 90s, when girls would come to me and say ‘I’m the slut in my school and I can’t bear it, what should I do?’ One of the things I would say is ‘Have you looked into transferring to another school?’” said Tanenbaum. “But you can’t say that anymore, because her reputation is going to follow her. You can’t go off the grid.”

The way slut as epithet plays out is multifold:

It’s the reason young women are so obsessed with their “number” — how many sexual partners they’ve had. It might explain why some women lie to their healthcare providers about those numbers, even when it’s not in their best interest.

It’s the reason why, on more than one occasion, as a young woman I would say “no” when I really wanted to say “yes”: yes, of course, would be considered slutty. (You can imagine how that plays into the complicated conversation we’re now having about consent.)

In one case that Tanenbaum describes, a young college woman believed that being called slut contributed to the reason she was raped. “He must have thought, ‘Well, she sleeps around all the time, so she’ll say yes to me,’” the woman told her.

In Monica Lewinsky’s case, that label is the reason she still can’t find work, and has largely stayed out of the public eye for close to a decade. As she said in her TED talk this past week, “It was easy to forget that ‘that woman’ was dimensional, had a soul and was once unbroken.”

Back in 1998, Lewinsky was condemned by the left and the right, by men and women alike, even self-proclaimed feminists (including the New York Times columnist Maureen Dowd, whose columns on the scandal of President Clinton’s affair and “slutty” Monica Lewinsky won a Pulitzer Prize). Today, Lewinsky would likely have defenders: there are simply more avenues to push back against a singular media narrative, and we have a new language with which to talk about it.

But the word still has the power to wound, diminish and discredit — as so many victims of sexual assault can attest. Which raises the question: Instead of discrediting women, can we simply discredit the word?

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor on special projects for Sheryl Sandberg’s women’s nonprofit, Lean In. You can follow her @jess7bennett.


TIME Opinion

Why We Should Reconsider the War on Crime

Scott Olson—Getty Images Police advance through a cloud of tear gas on Aug. 17, 2014 in Ferguson, Mo.

Fifty years after it began, the initiative has brought America to a crossroads

Fifty years ago this month, President Lyndon B. Johnson called for a “War on Crime,” a declaration that ushered in a new era of American law enforcement. Johnson’s turn toward crime control as a federal priority remains his most enduring legacy—even more than the Great Society programs that scholars often herald as his greatest achievement—and continues to shape what is arguably the most important social crisis the United States now faces.

Until recently, the devastating outcomes of the War on Crime that Johnson began had gone relatively unnoticed. Then, last August, during the series of demonstrations in Ferguson, Mo., images of law-enforcement authorities drawing M-4 carbine rifles and dropping tear gas bombs on protestors and civilians alike shocked much of the American public. Ferguson looked like a war zone. Many commentators attributed this sight to the ongoing technology transfers from the defense sector to local law-enforcement authorities, which began during the War on Drugs and escalated in the climate of the War on Terror.

But the source of those armored cars is much older than that. It was the Law Enforcement Assistance Act that Johnson presented to Congress on March 8, 1965, that first established a direct role for the federal government in local police operations, court systems, and state prisons. Even though the Voting Rights Act is considered the major policy victory of that year, Johnson himself hoped that 1965 would be remembered not as the apex of American liberal reform, but rather as “the year when this country began a thorough, intelligent, and effective war against crime.”

President Johnson saw the urban policeman as the “frontline soldier” of this mission, and, as a result, the administration focused on building the weapons arsenal of local law enforcement. The 1965 legislation created a grant-making agency within the Department of Justice, which—with $30 million at its disposal, or $223 million in today’s dollars—purchased bulletproof vests, helicopters, tanks, rifles, gas masks and other military-grade hardware for police departments. Like the Mine-Resistant Ambush Protected vehicles driven first in Iraq and then in Ferguson, much of this equipment had been used by the military in Vietnam and Latin America.

Those programs culminated in the Omnibus Crime Control and Safe Streets Act of 1968, the last major piece of domestic legislation Johnson passed, which gave the Department of Justice a new degree of influence over social policy by enlarging the grant-making agency into the Law Enforcement Assistance Administration. In contrast, the Office of Economic Opportunity at the center of the War on Poverty never grew into a more permanent agency. Over time, national policymakers retreated from and eventually dismantled many of the social welfare programs of the Great Society; the War on Crime, on the other hand, became the foremost policy approach to the social and demographic challenges of the late twentieth century.

Indeed, federal law-enforcement programs have expanded rapidly over the past five decades. Despite the misconception that the Reagan administration spearheaded the rise of urban surveillance and mass incarceration, federal policymakers had already dedicated a total of $7 billion in taxpayer dollars (roughly $20 billion today) to crime-control programs before Reagan took office in 1981. The most recent available figures from the Bureau of Justice Assistance indicate that federal officials have sustained these funding commitments, appropriating well over $1 billion annually to law enforcement programs at the state and local level.

Law enforcement and criminal justice remain at the heart of the nation’s economic and social programs. That fact began to change life for many Americans long before it drew national attention over the past year. For example, in Detroit in the early 1970s, officers of a decoy squad known as STRESS (an acronym for “Stop the Robberies, Enjoy Safe Streets”) killed 17 African American civilians—the vast majority unarmed—during its two years of operation. If the “War on Crime” was meant to be a useful metaphor that would spur policymakers into action, it quickly evolved into what resembled an actual war.

And it’s never been a matter of policing alone. Proximity to the expanding punitive arm of the federal government puts citizens, often low-income urban Americans, in close contact with the criminal justice system. Federal grants were tied to arrest rates, encouraging more apprehensions in those neighborhoods that had been explicitly targeted for special law-enforcement programs. New sentencing guidelines and criminal categories emerged that increased the chance that men and women from these same communities would serve long sentences in prison. In turn, the penal confinement of disproportionate numbers of young African American men during the 1970s often transformed first-time offenders and drug addicts into hardened criminals. Even Richard Nixon referred to prisons as “colleges of crime.”

Although the Johnson administration had created a blueprint for a national crime-control program to improve American society, the long-term impact of the shift toward surveillance and confinement has brought our nation to a fiscal and moral crossroads.

Last year marked the 50th anniversary of Johnson’s call for a “War on Poverty” in his first State of the Union address. Yet, according to Census Bureau estimates, the poverty rate today is equivalent to its rate in the mid-1960s. This year, with the 50th anniversary of the War on Crime upon us, and with #BlackLivesMatter and other movements against justice disparities gaining momentum, we should include the implications of this less understood dimension of the Great Society in our reconsiderations of the past.

In order to move forward as a nation we must come to terms with the reality that the programs unleashed by the War on Crime a half-century ago have overshadowed much of the War on Poverty’s social promise. President Johnson could not have foreseen the unintended consequences of the path he set in motion. But what is perhaps the central irony of the late 20th century is that one of the most idealistic enterprises in the history of the United States has left a legacy of crime, incarceration and inequality.

The Long View: Historians explain how the past informs the present

Elizabeth Hinton is an Assistant Professor of History and African and African American Studies at Harvard University. She is the author of a forthcoming history of the War on Crime and its long-term impact on domestic policy.

 
