TIME politics

The Conservative Case for Legalizing Marijuana

William F. Buckley Jr.
Truman Moore—The LIFE Images Collection/Getty William F. Buckley Jr., riding in an airplane en route to Washington, D.C., in 1965

American conservatives haven't always opposed legalizing pot

The United States’ latest skirmish in the battle over marijuana laws is still ongoing and, for lawmakers, it hits close to home. On Thursday, possession of a limited amount of the drug became legal for adult residents of Washington, D.C. — but, thanks to the intervention of a group of Congressmen, there’s still no way to legally buy it or sell it there, which may lead to the development of a “free weed economy.”

The legislative action taken to stop the District from developing a monetary economy for pot has broken down along party lines, with Republican lawmakers against the change in stance toward the drug and Democrats urging the city to go ahead.

It may seem like a natural thing for conservatives to be, well, conservative about changing drug laws — polls have shown that Republicans are much less likely than Democrats to support legalization — but that wasn’t always the case. In fact, there was a time during the 1970s when the nation’s leading conservative voices spoke out on behalf of legalizing marijuana, for many of the same reasons that advocates of legalization cite today.

At that time, in late 1972, a large study from the nonpartisan Consumers Union had just come out, urging legalization, as well as government-supported treatment for addictions to other substances. The report found that it was too late for law enforcement to keep pot from becoming part of American culture — and, surprisingly, its authors weren’t the only ones to think so, as TIME reported that December:

…American conservatives may have arched their eyebrows well above the hairline when they glimpsed the latest issue of William F. Buckley Jr.’s staunchly nonpermissive National Review. There on the cover was the headline: THE TIME HAS COME: ABOLISH THE POT LAWS. Inside, Richard C. Cowan, a charter member of the conservative Young Americans for Freedom, sets forth his arguments that the criminal penalties for marijuana possession and use should be stricken from the books. Cowan contends that pot is comparatively harmless, demonstrably ubiquitous and that the laws against it only alienate the young and breed disrespect for American justice.

The attitude was a shift for Buckley, who in 1971 testified against loosening penalties but wrote in 1972 that he agreed with Cowan. “It seems, in fact, that Buckley has smoked grass himself—but only on his sailboat, outside the three-mile limit,” TIME noted. “His verdict: ‘To tell the truth, marijuana didn’t do a thing for me.’”

See the full story, here in the TIME Vault: Concerning Pot and Man at The National Review

TIME People

Why Napoleon Probably Should Have Just Stayed in Exile the First Time

Print Collector/Getty Images An illustration of Napoleon I, Emperor of France, in exile.

Feb. 26, 1815: Napoleon escapes from Elba to begin his second conquest of France

For the man with history’s first recorded Napoleon complex, it must have been the consummate insult. After Napoleon Bonaparte’s disastrous campaign in Russia ended in defeat, he was forced into exile on Elba. He retained the title of emperor — but of the Mediterranean island’s 12,000 inhabitants, not the 70 million Europeans over whom he’d once had dominion.

Two hundred years ago today, on Feb. 26, 1815, just short of a year after his exile began, Napoleon left the tiny island behind and returned to France to reclaim his larger empire. It was an impressive effort, but one that ended in a second defeat, at Waterloo, and a second exile to an even more remote island — Saint Helena, in the South Atlantic, where escape proved impossible. And he didn’t even get to call himself emperor.

From this new prison perspective, he may have missed Elba. After all, as much as he hated the idea of his reduced empire, he didn’t seem to dislike the island itself. His mother and sister had moved there with him, and they occupied lavish mansions. According to a travel writer for the Telegraph, “Though his wife kept away, his Polish mistress visited. He apparently also found comfort in the company of a local girl, Sbarra. According to a contemporary chronicler, he ‘spent many happy hours eating cherries with her.’”

It was easy to believe — until he fled — that he meant what he said when he first arrived: “I want to live from now on like a justice of the peace.” He tended to his empire with apparent gusto, albeit on a smaller scale than he was used to. In his 300 days as Elba’s ruler, Napoleon ordered and oversaw massive infrastructure improvements: building roads and draining marshes, boosting agriculture and developing mines, as well as overhauling the island’s schools and its entire legal system.

The size of the island, it seemed, did not weaken Napoleon’s impulse to shape it in his own image. The title of emperor brought out the unrepentant dictator in him, so confident in his own vision that, as TIME once attested, he “never doubted that [he] was wise enough to teach law to lawyers, science to scientists, and religion to Popes.”

When a collection of Napoleon’s letters was published in 1954, TIME noted that his “prodigious” vanity was most apparent in the letters he’d written from Elba, in which “he referred to his 18 marines as ‘My Guard’ and to his small boats as ‘the Navy.’ ”

The Elbans seemed to think as highly of their short-lived emperor as he did of himself. They still have a parade every year to mark the anniversary of his death (on May 5, 1821, while imprisoned on his other exile island). And, as TIME has pointed out, “not every place that the old Emperor conquered is so fond of his memory that they annually dress a short man in a big hat and parade him around…”

Read TIME’s review of a collection of Napoleon’s letters, here in the archives: From the Pen of N

TIME curiosities

How Sword Swallowing Contributed to Modern Medicine

On World Sword Swallower's Day, practitioners of the ancient art raise awareness that their tradition is more than a circus sideshow

This weekend, spectators will gather at a dozen Ripley’s Believe It or Not! Odditoriums across America to watch performers stick swords down their throats, through their esophageal sphincters and into their stomachs. According to the Sword Swallowers Association International, World Sword Swallower’s Day exists to celebrate the ancient art, dispel myths and “raise awareness of the contributions sword swallowers have made in the fields of science and medicine.”

If that last bit is a little hard to swallow, chew on this historical nugget: The first endoscopy of the upper gastrointestinal tract, or esophagoscopy, was performed on a sword swallower in 1868 by the German physician Adolph Kussmaul. Frustrated at being unable to see far enough into the esophagus of a patient with a tumor, Kussmaul turned to a sword swallower, into whose stomach he could see all the way. The subject swallowed a 47-centimeter tube, through which Kussmaul looked using a laryngeal mirror and a gasoline lamp.

Electrocardiography also owes a debt to the sword swallowing community, as the first electrocardiogram of the esophagus used a sword swallower as a test subject in 1906. The physician, M. Cremer, also a German, inserted an electrode into the sword swallower’s esophagus in order to record his heart activity.

The nineteenth- and twentieth-century medical contributions of sword swallowers are a fortuitous byproduct of the practice, which dates back to 2000 B.C.E. It began in ancient India, where it was performed, like firewalking, as a test of courage and a demonstration of faith. The practice gradually spread across Asia and Europe, morphing over the course of centuries from religious rite to street entertainment.

What was once a widespread global phenomenon is now a dwindling profession, with the SSAI estimating no more than a few dozen professional sword swallowers still performing. But those active in the small community will insist that people inclined to write them off as a circus sideshow acknowledge their contributions to the annals of medicine. So the next time a doctor looks inside your body through a tube, thank a sword swallower.

TIME Race

Advice for Young Black Boys, 3 Years After Trayvon Martin’s Death

Mario Tama—Getty Images A Million Hoodies March protests the death of Trayvon Martin on Mar. 21, 2012, in New York City.

"You could be a Trayvon," columnist Touré wrote in 2012

It was three years ago — on Feb. 26, 2012 — that unarmed Florida teenager Trayvon Martin was shot by George Zimmerman. It would be months before Zimmerman, who had said the shooting was in self-defense, was found not guilty, a decision that inspired a new wave of debate about racism and the law. Following the verdict, TIME devoted a cover story to the way the case had shaken the country, as well as its reverberations on a more intimate scale.

Touré, then a TIME columnist, addressed the situation in the magazine’s Apr. 2, 2012, issue. He responded to the news with a list of eight pieces of advice for people who “could be a Trayvon”:

Many black families have been forced into uncomfortable but necessary conversations since the Feb. 26 killing of 17-year-old Trayvon Martin. His death and the release of the uncharged shooter, George Zimmerman, have reminded many of how vulnerable we still are. The icy cold wind of racism has crept into our homes and made the hairs on the backs of our necks stand up. Blood memories of strange fruit have been stirred. Young black boys have been reminded that they are walking targets for hate. What do you say to them about what happened to Trayvon? Here’s a start:

1. It’s unlikely but possible that you could get killed today. Or any day. I’m sorry, but that’s the truth. Black maleness is a potentially fatal condition. I tell you that not to scare you but because knowing that could save your life. There are people who will look at you and see a villain or a criminal or something fearsome. It’s possible they may act on their prejudice and insecurity. Being black could turn an ordinary situation into a life-or-death moment even if you’re doing nothing wrong…

Read the rest of his advice, here in the TIME Vault: How to Stay Alive While Being Black

Read the cover story from 2013, here in the TIME Vault: After Trayvon

TIME politics

Why Ronald Reagan Is Such a Big Deal at CPAC

Cynthia Johnson—The LIFE Images Collection/Getty President Ronald Reagan speaking at CPAC conference in 1986

The special relationship goes back to the conference's beginnings

It’s no secret that the Conservative Political Action Conference (CPAC) loves Ronald Reagan. The agenda for this year’s event, which takes place this week, includes a screening of Ronald Reagan: Rendezvous with Destiny, the Ronald Reagan Reception and the Ronald Reagan Dinner.

Sure, the late president was conservative and CPAC is a conservative conference, but their connection goes deeper than that: not only did Reagan speak at the first ever CPAC in 1974, he also provided part of the impetus for its creation.

The key to the special relationship between the event and the politician is timing. When the first CPAC took place, Reagan’s position among conservatives was not the established spot on a pedestal that he occupies today. Rather, for many conservatives, he was a source of deep regret. And the reason for that regret was obvious: Richard Nixon.

As TIME explained later that spring, then-President Nixon had seemed like a safe bet, and proved to be anything but:

The alliance between Richard Nixon and the nation’s conservative ideologues has never been automatic or assured. His 1960 campaign, in which he compromised with New York Governor Nelson Rockefeller on matters like civil rights and medical care for the aged, caused many conservatives to worry that he was far too willing to sacrifice philosophical principles for the sake of votes. They backed him for the Republican nomination in 1968 largely because he seemed more likely to win than their preferred candidate, California Governor Ronald Reagan. Explains Texas Senator John Tower: “Having gone through the debacle of 1964 with Barry Goldwater, we were not going to be lemmings again.” Moreover, according to Tower, “we received certain assurances from Nixon. So we felt that his inclination would be in our direction, even though he was never really regarded as one of us.”

…Now, more and more conservatives are uneasy about the President. They were pleased by some of his actions, such as his move to end the antipoverty program, his stance against busing, his Supreme Court appointments, his efforts to scale down the Federal Government’s activity and return revenues to the local levels. But they were dismayed by many of his other moves, including the wage-price controls that he imposed and the rapprochement with Peking. Says Frank Donatelli, executive director of the Young Americans for Freedom: “He certainly is not a conservative President so far as we are concerned. We do not see how his health-care program is much better than [Senator Edward] Kennedy’s. His conception of detente is riding roughshod over our friends, ruining our defense posture and ignoring the basic human rights of people within the Soviet Union.”

In the months that followed, Watergate would prove that distrust of Nixon well-founded, but it was already there when CPAC was held—and, in fact, much of that first conference was devoted to that very point. As the New York Times noted in its coverage, the message was “Richard Nixon has done us dirt” and a prominent political consultant added that “a substantial majority [of attendees] wishes the President would just go away.”

Meanwhile, the decision not to back Reagan seemed more and more of a mistake. He was chosen as a speaker at the very first CPAC, receiving what the Times called a “rousing, placard-waving welcome.” Reagan has been, as long as CPAC has existed, a symbol of the idea that compromising on conservatism is a mistake. After all, choosing the more liberal, electable candidate over him had resulted in the worst presidential disaster in American history.

In the years that followed, the relationship between Reagan and CPAC, established even before that very first meeting, grew. Reagan assumed the presidency and CPAC became a major force in conservative politics, each helping the other along. As TIME put it in 1986, “Speakers and delegates alike credited Reagan with having permanently changed the national agenda to make the conservative voice not just relevant but dominant.”

Read original coverage of CPAC 1986, including Reagan’s speech, here in the TIME Vault: The Tide Is Still Running

TIME politics

How the First Black U.S. Senator Was Nearly Kept From His Seat

MPI/Getty Images Hiram R. Revels, circa 1870

Feb. 25, 1870: Hiram Revels, a Mississippi Republican, is sworn in as the first black member of the U.S. Senate

Hiram Rhodes Revels was a rising star of the Republican Party in 1869. A gifted orator — a skill he’d honed in his pre-political career as a minister — he’d just won a seat in the Mississippi state senate when he delivered an opening prayer so moving it left the statehouse awestruck.

“That prayer, one of the most impressive and eloquent prayers that had ever been delivered in the Senate Chamber, made Revels a United States Senator,” Revels’ fellow Mississippi legislator, John R. Lynch, later wrote. “It impressed those who heard it that Revels was not only a man of great natural ability but that he was also a man of superior attainments.”

So why, when Revels was chosen the following year to fill one of Mississippi’s two empty seats in the U.S. Senate, did his appointment raise the ruckus that would land him on TIME’s top-ten list of contested officeholders? Because some Democrats argued that since the 14th Amendment, which granted citizenship to people of color (including recently freed slaves), had been ratified in 1868, Revels had only technically been a citizen for two years — not long enough to meet the Senate’s requirements.

Their argument was quashed, and on this day, Feb. 25, 1870, Revels became America’s first black Senator, serving out the unexpired term in a Senate seat that had been vacated when Mississippi seceded from the Union. The state’s other seat had formerly been occupied by Confederate President Jefferson Davis.

The irony of that reversal wasn’t lost on Revels’ Senate colleagues, including Nevada Senator James Nye.

“[Jefferson Davis] went out to establish a government whose cornerstone should be the oppression and perpetual enslavement of a race because their skin differed in color from his,” Nye declared on the Senate floor. “Sir, what a magnificent spectacle of retributive justice is witnessed here today! In the place of that proud, defiant man, who marched out to trample under foot the Constitution and the laws of the country he had sworn to support, comes back one of that humble race whom he would have enslaved forever to take and occupy his seat upon this floor.”

While Revels might have taken issue with his characterization as a member of “that humble race,” he apparently didn’t mention it publicly. His time in office was marked by moderation and forgiveness. He was a staunch advocate for granting amnesty to former Confederates, provided they swore an oath of loyalty to the Union, and he spoke out against segregation, believing it only perpetuated prejudice.

“I find that the prejudice in this country to color is very great, and I sometimes fear that it is on the increase,” he said in one floor speech. Amid the tensions of the Reconstruction Era, he attempted to soothe the fears of his fellow politicians. In an argument for educating freed slaves, he promised, “The colored race can be built up and assisted … in acquiring property, in becoming intelligent, valuable, useful citizens, without one hair upon the head of any white man being harmed.”

Read a 1967 story about Edward William Brooke III, the first African American senator elected after the ratification of the 17th Amendment, here in the TIME Vault: An Individual Who Happens To Be a Negro

TIME photography

9 Iconic Photographs From African American History

A new book, Through the African American Lens, and a forthcoming exhibit from the Smithsonian's National Museum of African American History and Culture offer iconic images of black culture, activism and community in America

The casual student of history might not look to Frederick Douglass for wisdom on the power of photography. The abolitionist is best known for his unmatched talent for oration, and when he died in 1895, the medium was still an evolving technology. But Douglass knew that photography had a quality that couldn’t always be found in other art forms. He touched on the transformative energy of the image when he wrote in 1864 that making pictures enables us to “see what ought to be by the reflection of what is, and endeavor to remove the contradiction.”

Douglass’ words introduce a selection of some of the most iconic photographs of African American history in the new book Through the African American Lens, curated by the Smithsonian’s National Museum of African American History and Culture, which is slated to open next year in Washington, D.C. The book is the first in a series of seven, and an exhibit of the same name will open on May 8, 2015.

“The book essentially reflects the vastness and the dynamism that is the subject matter for the museum,” says Rhea Combs, Curator of Film and Photography at the museum, who led the team that distilled a collection of 15,000 images into the 60 photographs that make up the book. While future books will delve into more specific themes in African American history, like the civil rights movement and African American women, the first book takes a sweeping look at more than 150 years of the vast and varied set of African American experiences in America.

Throughout history, photographs have afforded African Americans a way of “inserting themselves into a conversation,” Combs says, especially in a society “that oftentimes dismissed them or discounted them.”

The images reveal how agency can be created in the space between lens and subject. “There is a real, conscientious effort with individuals that are standing in front of the camera to present themselves in a way that shows a regality, a fortitude, a resolve,” she says. Whenever Douglass was photographed, he made sure to see the photographs before they were distributed, as he knew the importance of controlling his image. During the mid-nineteenth century, abolitionists mailed out photographs of slaves in an effort to change hearts and minds on the matter of abolition.

Many of the photographs were taken by photographers who were not African American themselves. When Wayne Miller, a white photographer, knocked on the doors of black Chicagoans in the 1940s, he earned their trust through conversation rather than setting out to conduct an anthropological study. Though this was certainly not always the case, and the relationship between subject and photographer can be quite complicated, “I think the agency was definitely in their gaze at the camera instead of the camera recording them,” says Combs.

Sixty might sound like an impossibly small number of images to capture all of African American history. But the images Combs and her team selected speak volumes. A 1938 photograph of a Harlem Elks Parade shows, rather than the parade itself, the sense of community and togetherness among its spectators. Images of exile—in the form of James Baldwin and Eldridge Cleaver—speak to, in Combs’ words, “freedom movements that are part of American history, but didn’t occur on American soil.”

LIFE photographer Eliot Elisofon’s photo of Zack Brown photographing two men in Harlem is a fitting choice for the book’s cover. In it, a black photographer, behind the lens, documents the dapper and dignified appearance of two black men in Harlem. The photograph is about urban life and the Great Migration, but it’s also about photography itself: that interplay between the voyeurism of viewers and the self-awareness of subjects that brings a static image to life nearly 80 years later.

Many of the photographs have this sense of immediacy, a sometimes startling relevance that belies their age. In a picture of Emmett Till’s funeral taken by Dave Mann in 1955, Till’s mother clutches a handkerchief in one hand and extends the other, searching, it seems, for balance. The track of a single tear, which appears to have fallen just before the shutter clicked, is visible on her face.

“Especially on the heels of things that are happening now,” says Combs, “this story unfortunately—how many years later—feels very, very familiar.”

Liz Ronk, who edited this gallery, is the Photo Editor for LIFE.com. Follow her on Twitter at @LizabethRonk.

TIME Sports

The ‘Death Penalty’ and How the College Sports Conversation Has Changed

Bill Jansch—AP Photo Southern Methodist University tailback Eric Dickerson is all smiles on Nov. 2, 1982, in Texas Stadium.

On Feb. 25, 1987, the Southern Methodist University football team was suspended for an entire season. Nearly three decades later, the program has yet to recover

“It’s like what happened after we dropped the [atom] bomb in World War II. The results were so catastrophic that now we’ll do anything to avoid dropping another one.”

That’s how John Lombardi, former president of the University of Florida, described the so-called “death penalty” levied upon Southern Methodist University in 1987 after the NCAA determined that the school had been paying several of its football players.

Until the punishment came down—on this day, Feb. 25, in 1987—SMU had seemed like the opposite of a cautionary tale. The tiny Dallas university, with just 6,000 students, had finished its 1982 season undefeated, ranking No. 2 in the nation and winning the Cotton Bowl, and added a second Southwest Conference championship to its résumé two years later. The SMU of the early 1980s stood toe-to-toe with conference powers Texas, Texas A&M and Arkansas—and proved itself their equal.

Trouble was, SMU needed help standing with those giants. There aren’t many ways to build a dominant football program on the fly, but if you’re going to try, you need a coach who can convince a bunch of teenagers that they’re better off coming to your unheralded program than they are heading down the road to Austin or College Station or hopping a plane to Los Angeles or South Bend. That’s no easy task, even for a recruiter as gifted as Ron Meyer, who became SMU’s head coach in 1976. Sometimes promises of playing time or TV exposure aren’t enough—especially when your competitors are offering the same things, only more and better. Though the Mustangs weren’t caught till a decade after Meyer arrived in Dallas, there’s every reason to suspect SMU and its boosters had been bending the rules for years.

When the other cleat dropped, it dropped hard. The death penalty—part of the “repeat violators” rule in official NCAA parlance—wiped out SMU’s entire 1987 season and forced the Mustangs to cancel their 1988 campaign as well. So, when Lombardi compared the punishment to the nuclear option, in 2002, the analogy seemed like an apt one. For years, scorched earth was all that remained of the SMU football program, and of the idea of paying players.

Now, however, the conversation has changed.

Dallas itself played a major role in the rapid rise and ferocious fall of the Mustangs. By the 1970s, the northern Texas city was a growing metropolis, a hub for businessmen who had recently acquired their fortunes thanks to oil and real estate. Virtually to a man, each had a college football team he supported, and with that support came an intense sense of pride, not to mention competition. Combine that environment with the enormous success of the NFL’s Dallas Cowboys during the 1970s as they assumed the title of “America’s team,” and it’s easy to see how so much pressure was placed on SMU.

With Ron Meyer’s arrival at the university, the goal became to dovetail the success of the Cowboys with the Mustangs’ performance—and he fit right in with the image that Dallas had begun to embody. He was brash, he was charming, he was dapper; the comparisons with Dallas’ J.R. Ewing came all too easily. And like Ewing, Meyer could be ruthless, pursuing recruits throughout eastern Texas with near-mythic fervor.

And the best myths have a dragon to slay. For Meyer, that dragon was Eric Dickerson. Dickerson was one of the nation’s top prospects—a high school running back so gifted he could have chosen any school in the country to play for in 1979. By all accounts, SMU wasn’t even in the running. They’d come a long way toward respectability since Meyer had arrived, but still weren’t on a level with Oklahoma or USC or Notre Dame. Plus, Dickerson had already committed to Texas A&M (and famously received a Pontiac Trans-Am that SMU supporters had dubbed the ‘Trans A&M’ right around the same time). But then, suddenly, miraculously, Dickerson had a change of heart. He decommitted from A&M and picked SMU shortly thereafter.

To this day, that decision remains a mystery wrapped in an enigma. There’s a section of ESPN 30 for 30’s excellent documentary about the SMU scandal, The Pony Exce$$—a riff on the SMU backfield, Dickerson and classmate Craig James, which was dubbed ‘The Pony Express’—about Dickerson’s recruiting process. No one involved, from Meyer to the boosters to Dickerson himself, would say how he really ended up at SMU. But none of them were able to contain the smirks that crept across their faces when they talked about the coup. There’s a reason that a popular sports joke in the early ’80s was that Dickerson took a pay-cut when he graduated and went to the NFL.

Dickerson changed everything for the Mustangs. With him powering SMU’s vaunted offense, the team became a force to be reckoned with in the Southwest Conference. Greater success, however, brought with it greater scrutiny. SMU was in a difficult position because Dallas had such a vibrant and competitive sports media scene (led by the Dallas Morning News and the Dallas Times Herald) at the time—one increasingly focused on investigative journalism in the wake of Watergate. The school’s status as a relative neophyte in the world of big-time college football and lack of rapport with the NCAA also did them no favors. There’s little question that other programs in the Southwest Conference were engaged in recruiting practices that bent the rules when it was possible, but none had quite as many eyes on them as the Mustangs.

After Meyer left to become head coach of the hapless New England Patriots, Bobby Collins took over in 1982 and led SMU to its undefeated season, but the Mustangs would never again reach those dizzying heights. Despite a growing recruiting reach, Collins failed to lure top-caliber prospects to Dallas, even with the help of the program’s increasingly notorious group of boosters. Instead, SMU became better known for its damning misfires, the first of which was Sean Stopperich, a prep star from Pittsburgh. Stopperich was paid $5,000 to commit and moved his family to Texas, but SMU had failed to realize that his career as a useful football player was already over. The offensive lineman had blown out his knee in high school, spent little time on the field for the Mustangs and left the university after just one year. Upon his departure, Stopperich became the first key witness for the NCAA in its pursuit of SMU.

The first round of penalties came down in 1985, banning SMU from bowl games for two seasons and stripping the program of 45 scholarships over a two-year period. At the time, those were considered some of the harshest sanctions in NCAA history. In response, Bill Clements, chairman of the board of governors for SMU, hung a group of the school’s boosters—dubbed the “Naughty Nine” by the media—out to dry, blaming them for the program’s infractions and the university’s sullied reputation.

Shortly thereafter, the NCAA convened a special meeting to discuss new, harsher rules for cheating, the most severe of which was the death penalty. (Despite Texas’ reputation as a pro-death penalty state for felons, its universities were some of the new rules’ staunchest opponents.) Still, due to the sanction’s power, few believed it would ever be used.

If SMU had cut off its payments to players immediately, it might not have been. Instead, the school and its boosters implemented a “phase-out” plan, which meant they would continue paying the dozen or so athletes to whom they had promised money until their graduation. One of those student-athletes, David Stanley, came forward after being kicked off the team and gave a televised interview outlining the improper benefits he had received from SMU. His words alone may not have been enough to damn the university, but an appearance on Dallas’ ABC affiliate, WFAA, by Coach Collins, athletic director Bob Hitch and recruiting coordinator Henry Lee Parker sealed the program’s fate.

Their interview with WFAA’s sports director Dale Hansen is a mesmerizing watch. Hansen sets a beautiful trap involving a letter that Parker had initialed, and the recruiting coordinator walks right into it, all but proving that payments to players came directly from the recruiting office. The fact that Parker, Collins and Hitch looked uncomfortably guilty the entire time didn’t help their case.

The NCAA continued gathering evidence, and on Feb. 25, 1987—a gray, drizzly day in Dallas—it announced it would be giving SMU the death penalty. The man who made the announcement, the NCAA Director of Enforcement David Berst, fainted moments after handing down the sentence, in full view of the assembled media. SMU football, for all intents and purposes, was dead. The team managed just one winning season from 1989 to 2008, in no small part because the rest of the university community had decided it wanted nothing to do with a program that had brought so much infamy to the school.

The initial reaction to the penalty—both in Dallas and throughout the country—was one of shock. The Mustangs had gone from undefeated to non-existent in just five years. Few, however, could deny that if the NCAA were going to have a death penalty, then SMU was certainly deserving of it. But the fallout from the penalty was worse than anticipated; perhaps not coincidentally, in the decades since 1987, the penalty has never once been used against a Division I school.

Over the last two decades, the conversation surrounding SMU’s fall from grace has changed even more. These days, those in and around the world of college sports don’t talk much about what the penalties for paying players should be; instead, many are wondering whether there should be any penalty at all. The arguments in favor of paying college athletes are manifold, especially considering that those athletes often generate millions of dollars on behalf of their universities. Few, however, would argue that players should be paid in secret (or while still in high school). Any sort of pecuniary compensation that student-athletes receive would, as in pro sports, require some sort of regulation.

Despite the recent groundswell of support, the NCAA appears reluctant to change its rules. At some point, the governing body of college sports may not have a choice, especially if it wants to avoid further legal trouble.

Ron Meyer, the SMU coach who nabbed Eric Dickerson more than 25 years ago, would famously walk into high schools throughout Texas and pin his business card to the biggest bulletin board he could find. Stuck behind it would be a $100 bill. That sort of shenanigan may not be the future of college sports, but we may be getting closer to the day when money isn’t a four-letter word for student-athletes.

Read TIME’s 2013 cover story about the ongoing debate over paying college athletes, here in the TIME Vault: It’s Time to Pay College Athletes

TIME conflict

What Actually Happened in the Falklands, With or Without Bill O’Reilly

Cover Credit: TODD SCHORR The Apr. 19, 1982, cover of TIME, featuring the war in the Falkland Islands

The conflict between Britain and Argentina took the world by surprise

After more than three decades out of the spotlight, the Falkland Islands are back in the news, this time because of controversy over a claim that Bill O’Reilly has made misleading statements about his time covering the conflict that took place there in 1982.

O’Reilly says that he has always been honest about the fact that his reporting on the war was done from Buenos Aires, not the islands themselves—as TIME reported back then, only 27 British reporters were able to get there—but Mother Jones magazine contends that his claim to have reported from active war zones suggests otherwise. The controversy continued Tuesday as O’Reilly further insisted that he never misled anyone.

But what exactly did happen in the Falklands?

In 1982, the archipelago had long been home to little else besides shepherds, sheep, 10 million penguins and a history of diplomatic disputes.

The islands had first been seen by British eyes in the 16th century, were claimed by the U.K. in the 17th century, went to Spain in the 18th century and back to Britain in 1833. Meanwhile, Argentina, which became independent from Spain during the period of Spanish control of the Falklands, claimed the right to the land—Argentines argued that they had gained the Malvinas, their name for the islands, when Spain left—even over the objections of many who actually lived on the islands. Argentina’s military ruler, General Galtieri, hoped to boost his own popularity by scoring a win in the islands. The locals, largely descended from Brits, did not support leaving the shelter of the British crown (which held them as a dependency, not an independent member of the commonwealth) for the rule of their nearby but then-unstable neighbor.

In early April of 1982, the Falklands (and, by extension, the South Georgia and South Sandwich Islands) were defended by a few dozen British marines already on the islands when thousands of Argentine troops suddenly swept in. In fighting that lasted mere hours, the South American nation seized the territories from the U.K., which responded by breaking off diplomatic relations and, via the U.N., demanding that Argentina withdraw. Prime Minister Margaret Thatcher and her government promised that, were the request denied, the islands would be retaken by force. And, when the British navy arrived in the area—to enforce a blockade and evacuate the invaders—that result began to seem more and more likely.

Even as war loomed, TIME observed that the spectacle had come “out of nowhere, it seemed, or out of another century.” One of the world’s major powers, no longer famous for its empire, and a country on another continent, fighting a sudden territorial war over a couple of islands? Just plain weird. Nonetheless, the pride of two nations was on the line, and citizens on both sides supported action.

President Ronald Reagan was unable to mediate a diplomatic solution and, at the end of the month, thousands of Argentine troops prepared for a confrontation. Rather than landing in the Falklands directly, the British forces landed on South Georgia Island, one of the Falklands’ dependencies, to the east of the main archipelago. South Georgia was quickly captured, bringing the two sides within striking distance.

By May, Britain’s Defense Secretary announced that the nation’s aircraft had taken action “to enforce the total exclusion zone and to deny the Argentines use of the airport at Port Stanley,” the Falklands capital. Military targets in the Falklands were bombed and other nations, including the U.S., ended their neutrality in the conflict. (The U.S. sided with England; the Soviets would eventually speak up for Argentina.) Fighting increased, as did patriotic support on both home fronts, even as the costs began to climb.

As the second month of fighting drew to an end, there was nothing quaint about it. As TIME reported:

Meanwhile, preparations for an all-out war over the Falklands continued. To the skirl of bagpipes, some 3,500 Scottish, Welsh and Gurkha troops last week boarded the hastily requisitioned Queen Elizabeth 2 to begin a ten-day journey to the South Atlantic. They were intended to join some 4,000 other British soldiers in the potential invasion force aboard the 20-ship battle squadron surrounding the islands. British warships kept up a harassing bombardment of the Falklands coastline, while Sea Harrier jets sank an Argentine trawler, possibly a spy ship, that was discovered deep within the blockade zone. Argentine warplanes flew a retaliatory sortie against the blockading fleet; London said that three of the aircraft were downed, and the Argentines damaged one British frigate in the action.

Then the British added a daring new twist to their tactics. Late Friday night, a commando force slipped ashore on Pebble Island, a slice of land practically touching West Falkland Island. Supported by naval gunfire, the raiders, who were probably ferried ashore in helicopters, attacked an airstrip and Argentine military outpost, blowing up a large ammunition dump and destroying eleven aircraft. The action was a sustained one; it was only after dawn that the commando force left the island, suffering only two minor casualties. London stressed that the operation was a “raid, not an invasion,” but the assault marked the first time that British troops had set foot on the Falklands since their departure after the Argentine invasion on April 2.

The conflict finally ended in June, after a full-on fight for Port Stanley. The death tolls had reached about 250 British troops and nearly 700 Argentine. The Argentine troops were driven from the islands, and a few days later General Galtieri was replaced, even as his country continued to assert its claim to the Falklands. In England, Thatcher’s popularity soared.

And on the islands themselves, life had changed too: the mellow home of shepherds had become a military stronghold. The military investment improved the local economy and modernized the lifestyle there but did not fully resolve the conflict. Argentina still hopes to regain the territory. A 2013 vote found that 1,513 residents wanted to remain under U.K. control. Only three people voted to leave.

Read TIME’s full coverage of the beginning of the conflict, here in the TIME Vault: Gunboats in the South Atlantic

TIME politics

3 Lessons from the French Revolution European Policymakers Should Keep in Mind

Lebrecht/Getty Images Place de Grève at the Storming of the Bastille, Jul. 14, 1789, from an 18th-century engraving by Letourmy of Orléans

The moment has come to diversify our analogy portfolio

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

It has become a commonplace to compare today’s economic and political news with interwar Europe. The Great Recession is measured against the Great Depression and Paul Krugman has recently suggested that demands being made on Greece are like those the Treaty of Versailles imposed on Germany. In his new Hall of Mirrors, however, economist Barry Eichengreen argues that repeated comparison of our current moment with the 1930s has resulted in poor policy decisions. Policymakers managed to avert another depression but, once they did so, their mission seemed complete and their hands were tied. Radical reforms—of the type necessary to prevent another recession—proved politically impossible.

The moment has come to diversify our analogy portfolio. What happens, for instance, if we think about the Eurozone crisis in terms provided by the history of the French Revolution? The comparison may initially seem contrived, but it offers significant lessons nonetheless.

Consider: debt dominated public debate in 1780s France as it does in Eurozone negotiations today. For decades, the monarchy had struggled to effectively tax the Catholic Church and the nobility. For decades, as Michael Kwass has conclusively shown, the wealthy and super wealthy branded such efforts despotic; they used the resulting power struggles to their own, political ends. Appealing repeatedly to notions of the public good—new taxes would mean “the loss of our liberty, the destruction of our laws, the ruin of our commerce, and the desperate misery of the people”—they blocked all attempts to spread taxation more fairly. Social elites thereby successfully defended their own riches, made themselves popular spokesmen for the common good, and pushed France further into borrowing (since it could not tax). Norman noblemen and Paris magistrates were the Koch Brothers of their day: bent on conserving their own privileges by fueling grass-roots populism. Their effective depiction of the monarchy’s fiscal crisis as a result of its own opulence—even now, don’t we imagine the money was spent on Marie Antoinette’s dresses and the King’s hunting dogs?—made state finances look like moral, rather than political, issues. (For in-depth discussion of this point, see Clare Haru Crowston’s Credit, Fashion, Sex and John Shovlin’s Political Economy of Virtue).

Like many in the United States and Europe today, these critics of the centralizing monarchy played politics with money. None of these men—all members of the privileged elite—intended to start a revolution. But by blocking needed tax reform, they provoked a political showdown that eventually turned summer 1789 into a social, cultural, and economic crisis of unparalleled proportions.

Lesson One: Revolutions are at their most revolutionary when no one sees them coming.

Money and debt played a crucial role in further revolutionizing France. In one of the first acts of the Revolution (prior even to the storming of the Bastille), the National Assembly declared France’s existing debt “sacred.” Unlike the later Bolsheviks (who in 1918 defaulted on everything the Russian Empire had borrowed, thereby giving rise to the doctrine of Odious Debt), the French revolutionaries accepted an inherited burden. The formerly royal debt became the national debt, and the new political body faced the same fiscal problems as had the old. At the same time, however, the National Assembly also challenged all existing taxes because they had been imposed by the monarch alone (the French version of “no taxation without representation”). Monetary-fiscal policy at this point was not intentionally revolutionary. The National Assembly was in many ways conservative. Like Angela Merkel’s Germany and other core economies today, it insisted that debts had to be honored and bills had to be paid. Like critics of quantitative easing, its members recoiled from the thought of merely printing money. They demanded that France put “solid assets” behind its promises to creditors.

With debt on the books and taxes delegitimized, a small majority within the National Assembly moved to nationalize and then monetize lands held by the Catholic Church. Here any parallel between the current moment and the 1790s appears to break down. Today, austerity means shrinking the public sector, whereas balancing the budget in 1789 involved expanding it (since the state took over the Church’s traditional welfare-providing role as well as its property). The social effects of the measures, however, were strikingly similar: widespread resistance, popular unrest, and growing political polarization. And, as today, uncertainty and confused expectations led investors to flee risk and seek safe places to put their money. Commercial credit, the backbone of eighteenth-century economic growth, dried up overnight. Lack of confidence in the new regime quickly metastasized into a generalized collapse in trust. All debts came due at once.

Lesson Two: Insisting on balanced books and sound money may be conservative rhetoric but it often has radical effects.

Since Maastricht, the EU has tried to use currency to create a stronger sense of European identity. So too did policymakers in 1790s France, who believed widespread use of a new currency (paper notes backed by the value of the nationalized properties) would force people to “buy into” the Revolution whether they supported its politics or not.

Like many Europeans and Americans today, most revolutionaries in the 1790s understood “liberty” as entailing both political freedom and market non-regulation. The majority within the National Assembly therefore embraced free trade in money as they had in grain. Radical monetary liberty—such that any merchant was free to accept the nationally issued paper for less than its face value—pushed both freedom and money to their breaking point.

As does the European Union, the French Revolution emphasized equality, rights, and citizenship. In both contexts, this political rhetoric collides with the social reality of growing inequality to create a feedback loop. A political choice (be it free trade in money or European monetary union) disrupts social-economic life; that disruption makes political ideals (such as liberty) seem all the more desirable and elusive; upholding those ideals causes further economic dislocation and social uncertainty.

Lesson Three: Playing politics with money is a sure way to make policy decisions matter to ordinary people. The results are often explosive.

Rebecca L. Spang is the author of “Stuff and Money in the Time of the French Revolution” (Harvard University Press, 2015) and a faculty member at Indiana University, where she directs the Center for Eighteenth-Century Studies and is the Acting Director of the Institute for European Studies. Her first book, “The Invention of the Restaurant: Paris and Modern Gastronomic Culture,” was also published by Harvard.
