TIME Crime

Two Shootings, 30 Years Apart, Linked by Fear

The New York Daily News front page from Dec. 23, 1984 New York Daily News Archive / Getty

Why Bernie Goetz was celebrated for shooting four unarmed black teens, and how things have—or haven’t—changed

Regardless of the grand jury’s decision, America’s response to the shooting of an unarmed black teen by a white police officer in Ferguson, Mo., this year has been, predominantly, one of outrage. Officer Darren Wilson has fallen so far out of public favor that his unpopularity became an uneasy punchline in a Saturday Night Live sketch (albeit one that didn’t air) in which a chef with a similar name has that name pulled from the cover of his newly published cookbook.

Thirty years ago today, on Dec. 22, an altogether different story played out on a New York City subway car, when a white man shot four black youths he believed were about to mug him — and instead of being reviled, he was celebrated. Before his name was known, newspapers dubbed him the “subway vigilante,” and many New Yorkers hailed him as a hero. When Bernhard Goetz finally turned himself in to the police, Joan Rivers reportedly sent him a telegram signed “love and kisses,” offering to help pay his bail.

Why was Goetz glorified while Wilson has been broadly reviled? There is the obvious point that Goetz, a scrawny electrical engineer who carried a .38 revolver inside his windbreaker, was not a law enforcement officer but a civilian who attempted to enforce the law as he saw fit — and many Americans seemed to view him as a triumphant underdog. A police officer would have been held to a different standard, then as now. But the magnitude of the recent public outcry over Michael Brown’s death (and perhaps more tellingly over the death of Trayvon Martin two years ago) suggests that similar vigilantism would not be received as warmly today.

Despite their outward differences, both Goetz and Wilson identified the same motivation in using lethal force: fear. And one other thing the cases may have in common, says University of Texas professor Keisha Bentley-Edwards, is the possibility that longstanding racial stereotypes played a part in the threat that both Goetz and Wilson perceived in the moments before they fired shots.

“They both describe these primal looks in the eyes of the teenagers that made them decide they needed to use lethal force,” says Bentley-Edwards, whose research focuses on the racial experiences of youth.

In his confession, Goetz recalled sensing an ineffable, predatory menace from the four teens: “You see, what they said wasn’t even so much as important as the look, the look, you see — the body language… They wanted to play with me. You know, it’s kind of like a cat plays with a mouse before, you know.”

Wilson, in his grand jury testimony, described Brown’s “intense aggressive face,” explaining, “The only way I can describe it, it looks like a demon. That’s how angry he looked.”

“This is not to say that either of them did not feel threatened,” Bentley-Edwards adds. “It’s a question of whether that threat is rational enough to justify the use of force.”

Fear remains as powerful a motivator today as it was 30 years ago. But 30 years ago the feeling was so prevalent in crime-ridden New York that many people — black and white — identified with Goetz. Between 1965 and 1984, New York’s violent crime rate nearly tripled, driven in part by the city’s economic crisis (the crack epidemic would soon push it higher still). The city’s annual murder toll was fast approaching its 1990 peak of 2,245, or an average of six people a day. Bernie Goetz, it appeared, had every reason to think he was about to become one of them.

Few could blame him. A New York Times survey conducted shortly after the shootings found that 52% of New Yorkers overall — including 56% of whites and 45% of black respondents — believed Goetz’s response was justified.

Compare that to the majority of Americans — 57%, according to a CNN poll — who believe Darren Wilson should have been charged with a crime for shooting Michael Brown. (The CNN poll reflects a much wider chasm between white and non-white opinion, however: 49% of whites said Wilson should face criminal charges, compared to 78% of people of color.)

Goetz rallied support in part because New Yorkers were eager for tales of would-be victims prevailing against the bad guys. His story filled that void — at least initially, according to George Fletcher, a professor at Columbia Law School and the author of A Crime of Self-Defense: Bernhard Goetz and the Law on Trial.

According to news reports, as detailed in Fletcher’s book, the four black youths were “noisy and boisterous,” and menacing enough that the other riders had huddled on the opposite end of the subway car when Goetz got on. Two of the young men approached him and insisted that he give them $5. Instead, he pulled out a gun and shot each of them once. Then, as if scripted in a Western, he turned to one and said, “You seem to be [doing] all right; here’s another,” and fired the shot that severed the teen’s spinal cord, leaving him brain damaged and partly paralyzed. When the car stopped and a conductor appeared, Goetz walked to the platform between the cars, jumped down, and left through the subway tunnel.

“A common man had emerged from the shadows of fear. He shot back when others only fantasized their responses to shakedowns on the New York subways,” Fletcher writes, summarizing the mythology surrounding the shootings. “Like the Lone Ranger, the mysterious gunman subdues the criminal and disappears into the night.”

But when the gunman was unmasked a week or so later, he fell almost immediately from the pedestal of public opinion. His lengthy confession revealed a vindictive streak that complicated his apparent heroism and poked holes in his underdog persona.

Asked if he had intended to kill the teens, Goetz answered, “My intention was to murder them, to hurt them, to make them suffer as much as possible.”

Once it became clear — as more details of Goetz’s troubled past and racist tendencies emerged — that there was more to his story than justified fear and an attempt to make the subways safe, support for Goetz ebbed. Rampant crime had, it seemed, made New Yorkers so quick to identify with him that they hadn’t stopped to consider the possibility that other, less noble motives might be at work. After that point, accounts tended to portray Goetz as unhinged, though legally sane, says John Inazu, a law professor at Washington University in St. Louis who has written about the implications of the Ferguson shooting.

Nonetheless, Goetz eventually served only eight months, for criminal possession of a weapon. (Since all four shooting victims survived, he never faced murder charges; a jury acquitted him of attempted murder and convicted him only on the weapons count.)

Today, statutes governing police use of force and stand-your-ground laws on the books in many states mean a murder conviction still tends to be unlikely in cases where officers or civilians who fear for their own lives respond by taking someone else’s.

“The thirty years between Goetz and the deaths of Brown and Garner have seen many improvements in race relations, but our criminal justice system remains broken in many ways,” Inazu says. “Some of these use-of-force, self-defense, and stand-your-ground statutes are incredibly broad. For example, the current Missouri use-of-force statute is likely unconstitutional as written — it permits deadly force to effect an arrest when an officer suspects any felony, which would include someone who has passed a bad check.”

These statutes — and more permissive concealed weapon laws — can be seen as part of the lingering influence of Goetz’s onetime folk-hero status. But more people seem to be questioning the use of lethal force by police officers and civilian vigilantes today, says Bentley-Edwards. She doesn’t see the recent cases as setbacks in the strides America has made toward greater equality and inclusion.

“I feel like they’re opportunities for more progress, in that they’ve forced more frank conversations, and deeper investigations into policies that may be differently applied,” she says.

The best-case scenario, she says, is that they will lead to new, if awkward, discussions of race and justice — discussions as difficult today as they were 30 years ago, but critical to moving forward. There’s a better chance of that now that the fog of fear that blinded 1980s New York has lifted, and Americans are more likely to scrutinize self-defense stories than to celebrate them.

Read TIME’s original coverage of the Bernhard Goetz case: A Troubled and Troubling Life

TIME Opinion

‘Offensive’ Is the New ‘Obscene’

Lenny Bruce, refused entry to Britain earlier in the day "in the public interest," makes a V-sign as he leaves U.S. customs office after returning to New York's Idlewild Airport on Apr. 8, 1963. John Lindsay—AP Photo

50 years after Lenny Bruce's sentencing, the world is still deciding what a comedian is allowed to say on stage

Reading about Lenny Bruce’s arrest for obscenity 50 years ago makes me think about a popular sketch Amy Schumer recently did on her Comedy Central show. On Dec. 21, 1964, Bruce was sentenced to four months in a workhouse for a set he did in a New York comedy club that included a bit about Eleanor Roosevelt’s “nice tits,” another about how men would like to “come in a chicken,” and other scatological and overtly sexual humor.

How does this relate to Amy Schumer? In the sketch, called “Acting Off Camera,” Schumer signs up to do the voice of what she thinks will be a sexy animated character, because Jessica Alba and Megan Fox are doing the voices of her friends. When she arrives at work, she sees that her character is an idiotic meerkat who defecates continuously, eats worms and has her vagina exposed. She says to her agent, “My character has a pussy.” Schumer is the first woman to say that word on Comedy Central without being censored, a right she fought for. The press commended her fight as a feminist advance, because the word had been banned even as four-letter words for male genitalia got the O.K.

A word that could have landed Bruce in the slammer 50 years ago is now available for public consumption, and its inclusion in the cuss-word canon is applauded. These days each of George Carlin’s “seven words” seems quaint. There is nothing so raunchy, so profane or so over-the-top that it could land a comedian in jail.

Comedians, however, have other reasons to censor themselves — namely, Twitter.

The most dangerous thing that a comedian has to face today is offending political correctness or saying something so racist or sexist that it kicks up an internet firestorm. In 2012, Daniel Tosh made a rape joke at a comedy club, which everyone on the internet seemed to have an opinion about. Many were offended, and he later apologized for the joke. Just last month comedian Artie Lange tweeted a sexist slavery fantasy about an ESPN personality and was met with harsh criticism. Saturday Night Live writer and performer Leslie Jones, a black woman, also took heat for making jokes about slavery; her critics said they were offensive, but she defended her comments, claiming they were misunderstood. Most of this exchange took place on Twitter.

This is a common cycle these days and one that can derail a comedian’s career (just look at what happened to Seinfeld alum Michael Richards after his racist rant became public). It’s also something that comedians are hyper-aware of. “I stopped playing colleges, and the reason is because they’re way too conservative,” Chris Rock said in a recent interview in New York magazine (referring to over-prudence, not political ideology). “Kids raised on a culture of ‘We’re not going to keep score in the game because we don’t want anybody to lose.’ Or just ignoring race to a fault. You can’t say ‘the black kid over there.’ No, it’s ‘the guy with the red shoes.’ You can’t even be offensive on your way to being inoffensive.” In a world where trigger warnings are becoming popular, how can a comedian really push the envelope?

In the interview, Rock says this policing of speech and ideas leads to self-censorship, especially when he’s trying out new material. He says that comedians used to know when they went too far based on the audience reaction within a room; now they know they’ve gone too far based on the reaction of everyone with an Internet connection. Today the slightest step over the line could land a comedian not in the slammer but in a position like Bill Maher’s, where students demanded that he not be allowed to speak at Berkeley because of statements he made about Muslims.

That’s the difference between Lenny Bruce and someone like Leslie Jones. A panel of judges decided that Bruce should face censorship because of what he said. Now Leslie Jones gets called out, but the public is the judge. Everyone with a voice on the internet can start an indecency trial and let the public decide who is guilty and to what degree. (The funny truth is, depending on whom you follow on Twitter, the accused usually turns out to be either universally guilty or universally innocent.)

What hasn’t changed as we’ve shifted from “obscene” to “offensive” is just how blurry the line remains. The Supreme Court famously declined to define “obscene”; Justice Potter Stewart said only that he knew it when he saw it. The same is true of “offensive.” One comedian can make a joke about race or rape and have it be fine, while another can joke about the same subject and become the target of a million blog posts. There was even an academic study to determine which strategies were most effective for making jokes about race.

Whenever one of these edgy jokes makes the news, a rash of comedians comes to defend not the joke itself, necessarily, but the teller’s right to make it in the first place. The same thing happened at Bruce’s trial, when Woody Allen, Bob Dylan, Norman Mailer and James Baldwin all came to his defense. Bruce never apologized for what he said. Though he died before his appeal could make its way through the courts, he received a posthumous pardon in 2003. Then-Governor of New York George Pataki noted that the pardon was a reminder of the importance of the First Amendment.

In 50 years a lot has changed, but comedy, like the First Amendment, really hasn’t. There are always going to be people pushing the boundaries of what is acceptable, because that’s what we find funny. What has changed is who is policing that acceptability — and that makes a big difference. We no longer have too-conservative judges enforcing “community standards” about poop jokes, telling people like Lenny Bruce that they can’t say one thing or another. Instead, today’s comedians are policed by the actual community, using the democratic voices of the Internet and social media to communicate about standards around race, religion, sexuality, gender and identity. The community doesn’t say comedians can’t offend, but that they’ll face consequences if they do. Their First Amendment rights are preserved and, though it may get in the way of the creative process once used by people like Chris Rock, online feedback can often lead to productive conversations.

Just because nothing is obscene anymore doesn’t mean things can’t be offensive, as murky as both of those ideas might be. At least we’ve taken the government out of comedy, which seems safer for everyone. Now it can stick to dealing with the important things, like Janet Jackson’s nipple.

Read TIME’s original coverage of Lenny Bruce’s conviction, here in the archives: Tropic of Illinois

TIME Economy

Just How Much Does the Economy Affect the Outcome of Presidential Elections?

Barack Obama speaks during a rally in Cedar Rapids, Iowa, on Jan. 2, 2008. Mauricio Rubio—Getty Images

It’s time for the media to stop pretending that candidates’ personalities, rhetoric and strategies are what really count

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

In a fascinating paper, Princeton economists Alan S. Blinder (formerly Vice Chairman of the Federal Reserve Board) and Mark W. Watson point to the significance of economic factors in presidential contests (see pages 14-16, especially). Their synopsis of elections since the end of the Second World War reveals that presidential candidates operated with distinct advantages or disadvantages, depending on whether their party or their opponent’s party recently governed in a period of prosperity or economic hardship. In many instances the state of the economy appeared to have as much impact on the presidential race as the candidates’ personal attributes, campaign strategies, or debating skills, if not more.

It is intriguing to expand upon the insights of Blinder and Watson and consider the potential influence of economic conditions on the 2016 presidential race. The state of the economy could play a major role in the outcome. But long-term wage stagnation could make that factor less significant in 2016. The disruptive character of stagnant wages was evident in the 2014 congressional elections. Even though the U.S. economy had improved substantially in recent years, Democrats lost decisively in many sections of the nation. Democrats failed to excite voter support, partly because average American workers had seen little or no personal economic improvement in the years of the Obama presidency and Democratic influence in Washington. If this situation does not change in the next few years, the condition of the overall economy in 2016 may not influence the voters’ decisions as much as it has in the past.

Drawing upon insights presented by Blinder and Watson, it is evident that economic factors often affected voters’ judgments in presidential elections up until recent times.

For instance, historians often cite Harry S. Truman’s fighting spirit and the Republicans’ flawed strategies when identifying causes of the Democratic president’s surprise victory in 1948. Yet Truman’s campaign was buoyed by early signs in 1948 of an impressive post-war economic boom. Real Gross Domestic Product (GDP) had dropped precipitously in 1946 (a development that made pundits think Truman would lose in 1948), but a substantial economic recovery was underway by the time of the November 1948 elections.

Richard Nixon ran for president in 1960. He lost not only because he ran against a handsome, charismatic, and eloquent Democrat named John F. Kennedy; a third recession of the Eisenhower era, stretching from 1960 into 1961, also undermined his campaign. JFK excited voters with a promise to “get America moving again.”

Lyndon Baines Johnson won easily against Republican Barry Goldwater in 1964. Goldwater’s image as an extremist hurt his campaign, but economic conditions also made the Arizona Senator’s efforts difficult. The Kennedy/Johnson tax cut of 1964 quickly stimulated business expansion. Voters were in an optimistic mood when they went to the polls in 1964. Four years later, Richard Nixon benefited from the Johnson Administration’s economic troubles. Worries about inflation related to huge U.S. military commitments in Vietnam cut into voters’ support for the Democratic candidate, Hubert Humphrey. Federal efforts to deal with the emerging economic problems through fiscal and monetary policies aided Nixon, who won a race that turned close in the final days.

The economy first helped and then hurt Democrat Jimmy Carter. Shifts in energy prices made a big impact on Carter’s fortunes. Republican President Gerald Ford campaigned under a cloud in 1976. “Stagflation,” a combination of economic recession and price inflation, created difficulties for the GOP’s candidate, as did Ford’s pardon of Richard Nixon. Jimmy Carter secured a victory. Four years later, Carter’s efforts to remain in the White House failed. Jimmy Carter stumbled as a leader, and economic conditions exacerbated his difficulties. Oil prices surged in 1979 and inflation turned worse. The chairman of the Federal Reserve, Paul Volcker, tried to tame inflation with tight monetary policies. Business and employment slowed considerably during the months that Carter campaigned for re-election.

In 1980 Ronald Reagan excited voters with promises to revive the economy. Reagan’s popularity slipped during his first two years in office, in large part because of a deep recession. By late 1982, however, Paul Volcker’s monetary squeeze appeared to be working. Inflation declined. Additionally, global production of petroleum had expanded and prices dropped substantially. In 1984 Reagan won reelection in a landslide. Perhaps the Republican president’s ebullient personality would have carried him to victory under less promising conditions, but Reagan surely benefited from the favorable economic winds at his back.

Following the Persian Gulf War, President George H. W. Bush received a 90% approval rating and seemed well-positioned to win a second term in 1992. Then a troubling recession in 1990-1991 undermined his popularity. George H. W. Bush’s disapproval rating hit 64%. Bill Clinton projected an effervescent personality in the 1992 campaign, but that was not his only advantage over Bush and an independent candidate, Ross Perot. The voters’ unhappiness with the economy figured prominently. Clinton strategist James Carville famously identified the main issue: “It’s the economy, stupid.” Two years after the 1992 victory, Bill Clinton’s presidency was deeply troubled. Republicans crushed Democrats in the congressional races of 1994, and the GOP appeared to have enough clout in Washington to block Clinton’s initiatives. Republicans hoped to make Clinton a one-term president. In 1996, however, the U.S. economy looked much stronger than it had a few years before. Voter optimism helped Clinton to dispatch his competitors, Republican Bob Dole and independent Ross Perot.

Unfortunately for the Democrats, their candidate in 2000 chose to keep his distance from Bill Clinton. Al Gore, Vice President during the previous eight years, refused to exploit the Clinton connection to the fullest during his presidential campaign. Gore feared that voters would view an association with Clinton negatively because of the president’s scandalous relationship with a young intern. Al Gore made a strategic mistake. The U.S. economy had been on a sustained climb through most of Bill Clinton’s eight years. Gore failed to take adequate credit for Clinton-era prosperity. He won the popular vote but lost the election after the Supreme Court intervened in the Florida vote count.

Barack Obama benefited from economic conditions during both of his presidential campaigns. With the collapse of Lehman Brothers investment bank in September 2008, the U.S. and global economies began to crash. Many voters associated Republicans with the financial crisis. They backed the newcomer, Barack Obama, over Senator John McCain, who displayed little understanding of economics during the campaign. In 2012 Republican Mitt Romney claimed that he, a successful businessman, knew better than President Obama about creating jobs and fostering prosperity. Romney’s message failed to resonate. There were many reasons for Romney’s defeat, but one of the most important was his inability to gain traction on economic issues. Mitt Romney could not effectively characterize Barack Obama’s administration as incompetent in business affairs. Stock markets had climbed steeply since their lows in early 2009, and the unemployment rate had declined substantially by election time.

Since the U.S. economy has been on an upward tear from the first months of Barack Obama’s presidency, Hillary Clinton or some other Democratic presidential candidate should have a distinct advantage in 2016. The Democrats’ future also looks promising because of the sudden drop in energy prices. A slowdown in global demand for oil, declining production costs related to fracking, and a glut of oil in global markets have rapidly cut the cost of a barrel of crude oil from about $100 to less than $70. Price drops work like a large tax cut or a welcome pay raise. In coming months and, perhaps, years, Americans will need less money to purchase gas for their car or heat their home. Consumer products may be cheaper, since they will be manufactured and transported at reduced cost. By the time of the 2016 elections, the benefits of reduced expenditures for energy may be more evident to voters than they were at the time of the 2014 congressional elections. Optimistic voters may reward the Democratic presidential candidate.

Democrats cannot, however, be certain that the U.S. economy will be dynamic in November, 2016. There are some troubling signs on the horizon. Global business has been slowing, especially in Europe. The U.S. economy has been growing more impressively, but Wall Street analysts warn that the lengthy stock market boom cannot last forever. Values have been climbing since early 2009. A serious market “correction” might arrive at a bad time for Democrats – weeks or months before the 2016 election.

Also, despite vigorous business expansion in recent years, most working Americans are not realizing true economic improvement. Employment opportunities have expanded, but many of the new jobs are part-time. They do not pay good wages, and they offer few benefits. In contrast, individuals with technical skills and advanced degrees often command strong earnings. Income inequality has become a glaring issue.

In recent decades individuals and families at the top have realized extraordinary gains, while the rest of the U.S. population saw disappointing returns. The Congressional Budget Office found that between 1979 and 2007 the top 1% of households saw their inflation-adjusted income grow by 275%. In contrast, the bottom 20% of Americans saw just 18% growth over those 28 years. Another study, by the Economic Policy Institute, revealed that between 1983 and 2010 approximately three-quarters of all new wealth went to the richest 5% of households, while the bottom 60% of households actually grew poorer over that period. Data from the Labor Department reveal that income for the middle 60% of the U.S. population has stagnated since 2007.

The angst of working Americans was evident in the 2014 congressional elections. Despite improvements in equity markets and corporate earnings, voters felt a pinch. Republicans cast President Obama as the culprit in their campaign rhetoric. They claimed his flawed leadership left millions of Americans struggling to earn a decent living.

President Barack Obama and Democratic senators have been dominant in Washington in the years of a remarkable economic turnaround, yet they failed to convince voters that their policies helped in significant ways to foster a recovery. A post-2014 election headline in the New York Times read, “Democrats Say Economic Message Was Lacking.” The Times reported that Democrats could not project the kind of broad vision in 2014 that inspires voter turnout. Larry LaRocco, a former Democratic congressman from Idaho, identified the challenge Democrats face as they look ahead: “What do we stand for?” he asked. In 2016 Democrats will need to convince voters that they do, indeed, have an effective plan for economic growth.

The Democrats’ efforts to persuade voters that the Obama presidency has produced results may become easier if recent employment statistics augur a trend. The Labor Department reported that employers added 321,000 jobs in November and, even more significant, the hourly earnings of ordinary workers jumped sharply. If future reports continue to show wage gains, the Democratic candidate will benefit from favorable economic winds. If the November gains prove a fluke and wage stagnation persists, Republicans may be able to capitalize on voter discontent in 2016, much as they did in 2014.

Whatever the situation, economic conditions will likely affect the outcome — as they usually do in presidential contests. Yet when writing and speaking about the campaign, many pundits will overlook this important factor. They will focus on the candidates’ personalities, rhetoric and strategies rather than the evidence from history suggesting that the state of the economy often has a major impact on voters’ decisions.

Robert Brent Toplin taught at Denison University and the University of North Carolina, Wilmington. Since retirement from full-time teaching, he has taught some courses at the University of Virginia. Toplin has published several books and articles about history, politics, and film.

TIME movies

The Very Political History of Annie

Quvenzhané Wallis and Jamie Foxx in Annie Barry Wetcher—Columbia Pictures/Sony

The new movie adaptation finds a new time

The new version of Annie — in theaters Friday — doesn’t exactly shy away from its New Deal origins. Mere minutes of the film have passed before the newest actress to step into the orphan’s shoes, Quvenzhané Wallis, is talking about Franklin Roosevelt and the Great Depression.

Except this time that history is, well, history. The musical that once contained songs with the actual titles “We’d Like to Thank You Herbert Hoover” and “A New Deal for Christmas” has been updated for modern times. And, though its Daddy Warbucks equivalent (Jamie Foxx as Benjamin Stacks, a New York gazillionaire with aspirations à la Michael Bloomberg) is still involved in politics, the story has left behind much of its erstwhile focus on the national political climate.

“The interesting thing about Annie is that it was started as a political cartoon and with pretty biting social and political commentary, and then it was turned into a musical, and people have forgotten that,” says Will Gluck, the writer-director behind the new adaptation. “They just think about ‘Tomorrow,’ the plucky kid and the dog.”

The content of that original social commentary may surprise some of today’s “Tomorrow” singers. In the ’20s, when the strip debuted, Little Orphan Annie was already “issuing a steady stream of far-right propaganda.” In 1935, one newspaper canceled the comic because “Annie has been made the vehicle for a studied, veiled, and alarmingly vindictive propaganda.” Cartoonist Harold Gray was a staunch believer in the way Daddy Warbucks got rich, which was “doing his job and not asking for help from anyone,” as he put it. “Gray agrees that Annie dabbles in dialectics, and he has no intention of stopping her,” TIME commented in 1962. “To Artist Gray, Daddy and Annie are salesmen of the American dream, the ‘pioneer spirit’ that without assistance, even from the State Department, can cope with Castro, neutralize the H-bomb, and eliminate the income tax.”

When Annie went to Broadway in the 1970s, TIME opined that her newspaper-comic twin was “still fight[ing] the Red Menace and bleeding-heart liberals,” but the stage character’s priorities changed. In the musical version of Annie, the spunky orphan — who has already helped her war-profiteering rescuer realize that those who have less are worth taking care of — is brought along to a meeting with FDR, at which point her natural optimism helps inspire the President to institute the New Deal. The general take-away, besides the fact that the sun will come out tomorrow, is that New Deal-style progressive policies help everyone get the fair shake he or she deserves. Annie’s can-do pluck is still important, but she’s optimistic about the government’s ability to help all rather than individuals helping themselves.

Gluck says that, while updating the story for today’s audiences — Annie lives in a foster home rather than an orphanage, for example — he didn’t want to lose that part of the story’s background. “The one thing I wanted to keep is the socioeconomic divide of the Depression,” he says, “which sadly has even gotten bigger now and sadly is not going away.” That was why he made sure to have his Annie teach viewers a little lesson about the Great Depression when, she says, things were just like they are today except without the Internet.

Still, this iteration of Annie ends up bringing the political girl to a more centrist position.

By keeping things local and staying away from specific historical moments — no, new Annie does not inspire the President to believe that there really are plenty of shovel-ready stimulus projects out there — some of the specificity of Annie’s political message is lost too. Stacks thinks that in New York City, if you work hard enough, you can achieve anything you want, just like old-fashioned Daddy Warbucks did. Meanwhile, Annie recognizes that folks in her neighborhood are often ignored and left behind, even when they work hard, just like her theatrical predecessor did. They each come to see the other’s side a little better, but the audience doesn’t come away singing a song about Obamacare.

But, Gluck says, that’s a better fit for the audience anyway — though not because today’s political divides are so treacherous. Adults may see Annie as a rags-to-riches story, he says, but kids don’t really know what that means; the core message of Annie, about hope and optimism, works just as well now as it did in the ’70s or ’30s because it’s a universal story. “I don’t believe the end of the movie is that she got to live with a rich guy,” he says. “I believe that to her the end of the movie is that she got to find a family.”

Besides, he still remembers the first time he saw the original Annie, and the questions he had for his parents when it was over: Who is Herbert Hoover and who is that guy in the wheelchair? When he took his own kids to see Annie on Broadway recently, they had the same exact questions. His movie’s young viewers, however, won’t be left scratching their heads. “You don’t need to study for this essay question,” he says.

Read our original review of the musical Annie, here in the TIME Vault: No Waif Need Apply

TIME Revolution

When Fidel Castro Took Power: How TIME Covered the News

Fidel Castro on the Jan. 26, 1959, cover of TIME Cover Credit: Boris Chaliapin

Castro was on the cover of the magazine three weeks after he seized control of Cuba

When Fidel Castro first ousted Fulgencio Batista at the turn of 1959, there weren’t many non-Cuban journalists there to see it happen — but TIME’s Bruce Henderson was there, and he was soon joined by Bernard Diederich, who would later cover the Caribbean for the magazine. Their presence meant that, throughout that January, TIME’s “Hemispheres” section carried up-to-the-minute news about the changes on the island.

As Diederich recalls in his book 1959: The Year That Changed Our World, the assignment was an unusual one:

Henderson assigned me to cover Fidel’s arrival in Havana. I leaped onto a tank with a group of 26th of July female fighters and rode in Fidel’s wake into Camp Columbia, once the bastion of Batista’s army. It was January 8. Rodríguez Echazábel was already at the camp headquarters when I arrived. My Santiago-issued laissez-passer did wonders too. I was introduced to bearded rebel comandante (Maj.) Camilo Cienfuegos, to whom I explained my challenging assignment. Time would want a full description of Fidel’s first night in Havana. Would the 26th of July leader choose to dance, date, or dive into bed after his arduous trip up the island from the Sierra Maestra to Havana? Camilo smiled broadly when I also told him that I needed to know the color of Fidel’s pajamas—if he wore them!

Though those “female fighters” made it into a story in the Jan. 19, 1959, issue, Castro’s pajamas did not. (Actually, his blue cotton PJs did get their moment, but it wasn’t until that May.)

Castro got far more attention from TIME the following week, however, when he was featured on the cover of the magazine in a story about matters considerably weightier than his sleepwear. The article opened with Castro pushing for the executions of those who had abetted the Batista regime:

…Castro was in no mood for mercy. “They are criminals,” he said. “Everybody knows that. We give them a fair trial. Mothers come in and say, ‘This man killed my son.’ ” To demonstrate, Castro offered to stage the courts-martial in Havana’s Central Park—an unlikely spot for cool justice but perfect for a modern-day Madame Defarge.

In the trials rebels acted as prosecutor, defender and judge. Verdicts, quickly reached, were as quickly carried out. In Santiago the show was under the personal command of Fidel’s brother Raul, 28, a slit-eyed man who had already executed 30 “informers” during two years of guerrilla war. Raul’s firing squads worked in relays, and they worked hour after hour. Said Raul: “There’s always a priest on hand to hear the last confession.”

Read the full 1959 cover story, free of charge, here in the TIME archives: The Vengeful Visionary

TIME politics

Cuba’s Unanswered Questions

U.S. President Barack Obama speaks to the nation about normalizing diplomatic relations with Cuba, at the White House on Dec. 17, 2014 Pool / Getty Images

In 2013, TIME took a look at a changing Cuba

When President Barack Obama announced Wednesday that the United States would work toward normalizing long-severed diplomatic relations with Cuba, it came as a surprise to many.

But as TIME observed in a feature story last July, change has long been underway for an island nation that, in the past, has had a reputation for seeming frozen in time. Rules about commerce and private business had been relaxed, citizens were encouraged to find non-state jobs, tourism was opening up and the possibility of a non-Castro leader suddenly seemed less distant. However, that didn’t mean that Cuba’s future was clear.

Many of the questions raised by writer Pico Iyer are, even in this new phase of Cuban history, still unanswered:

Cubans today are free—at last—to enjoy their own version of Craigslist, to take holidays in fancy local tourist hotels, to savor seafood-and-papaya lasagna with citrus compote, washed down by a $200 bottle of wine, in one of the country’s more than 1,700 paladares, or privately run restaurants. They’re free to speak out against just about everything—except the two brothers at the top—and they strut around their capital in T-shirts featuring the $1 bill or Barack Obama in his “Yes we can” pose, even (in the case of one woman leaning against the gratings in Fraternity Park) in very skimpy briefs decorated with the Stars and Stripes.

Yet as what was long underground is now aboveboard, and as capitalist all-against-all has become official communist policy, no one seems quite sure whether the island is turning right or left. Next to the signs saying EVERYTHING FOR THE REVOLUTION, there’s an Adidas store; and the neglected houses of Old Havana sit among rooftop swimming pools and life-size stuffed bears being sold for $870. “Nobody knows where we’re going,” says a trained economist whose specialty was market research, “and people don’t know what they want. We’re sailing in the dark.”

Read the rest of the story, free of charge, here in TIME’s archives: Cuban Evolution

TIME Media

How Radio History Hinted at the Conclusion of Serial

Circa 1945, a family of four gathers in their living room to listen to their home radio set Harold M. Lambert—Getty Images

The radio serial has been around for nearly a century — and some things haven't changed

Warning: spoilers follow for the end of the first season of Serial

The true-crime podcast Serial, which published its season finale Thursday morning, may be a 2014 phenomenon — but, though the 12-episodes-one-story format may have been new for that style of podcast, it’s actually one of the oldest tricks in the radio book.

The idea of publishing a story a little bit at a time is often traced back to Charles Dickens, so when radio became the popular medium of choice in the early 20th century, the benefits of hooking an audience with the drip-drip of a story were already well known. By the 1930s, Procter & Gamble was the biggest radio advertiser in the country, by dint of its serial dramas, the original “soap operas.” The serial had become one of the most popular formats for radio. A 1939 calculation by the chair of radio writing at Columbia University estimated that there were 20,000,000 words spoken each day on U.S. radio — more than all the words spoken in the movies in a year, or on Broadway in ten — and that, at the time, writers of popular daily serials were some of the best paid staffers in the business, earning $1,000 a week.

As Elena Razlogova notes in her history of early radio, The Listener’s Voice, early serials shared something with Serial beyond merely being split up into multiple episodes. Just as Serial was produced as it aired, allowing those familiar with the case to hear the show and volunteer new information, early radio serials relied on fan mail to help decide how the stories would move along.

Which means there’s something else Serial shares with its predecessors: the fact that, as the podcast’s listeners have discovered by now, the end of the story isn’t known in advance — which often means the conclusion of a radio serial isn’t necessarily an “ending” per se.

According to the Concise Encyclopedia of American Radio, traditional radio serials were popular but “frustrating” because they were “neverending.” Unlike literary serials of the Dickens variety — a format that TIME reported in 2012 was making a comeback on e-readers — radio serials that are written as they go along have tended, historically, not to have a predetermined arc. They just start and see where they go; when the season ends or the show gets canceled, denouement or not, that’s that.

Though Serial‘s season one ended with a guess from host Sarah Koenig that Adnan Syed was probably innocent, her year of research and weeks of podcasting yielded no certainty — a fact that Koenig readily admits in the episode. Syed’s case is still not settled and it seems possible that the world may never know exactly what happened on the fateful day in question, but the season is over so that’s it as far as listeners are concerned. And, given the show’s format, if we ever do find out what happened, it won’t be on Serial season two. So, in that, Serial and old-time serials have something in common with real life as well: unlike in the world of pre-scripted shows, a neat and tidy conclusion is a rarity.

Read more about the return of serial fiction, here in the TIME Vault: Stay Tuned for E-Serials

TIME faith

This Is What Hanukkah Is All About. Hint: It Was a Near-Death Experience

A Hanukkah table in the 1950s Underwood Archives / Getty Images

Though Hanukkah now is a Jewish counterpart to the Christian holiday of Christmas, it commemorates Judaism’s bare survival of an attempt to obliterate it by a Hellenistic king

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Though Hanukkah now is a Jewish counterpart to the Christian holiday of Christmas, it commemorates Judaism’s bare survival of an attempt to obliterate it by a Hellenistic king, Antiochus the Fourth. It is a holiday marking Judaism’s near-miraculous survival of trauma, a trauma that left a permanent mark on both the Jewish and Christian Bibles.

The time was the late 170s BCE (BCE is a secular equivalent to BC), and the land of Israel was ruled by a Hellenistic king, Antiochus the Fourth. Antiochus ruled the Seleucid Empire, whose lands lay along the eastern Mediterranean. The Romans to the west were on the rise, and Antiochus’ kingdom was financially broke from paying high tribute to the Romans after a military loss to them. Though the Seleucid kings had permitted the Jews to practice their ancestral customs, Antiochus’ need for money made him offer the high priesthood of the Jerusalem temple to the highest bidder.

In 172 BCE Antiochus sold the high priesthood to an unsavory character, Menelaus, an advocate for Hellenistic reforms who was not a member of the priestly family. Menelaus soon outraged the local populace by plundering the temple treasury for money to pay the high tribute that he had promised Antiochus. When the populace revolted against Menelaus, Antiochus restored him to power and issued a decree forbidding observance of Jewish laws in Jerusalem and surrounding towns. According to the books of Maccabees, Jews were forced to offer a sacrifice to foreign gods, Torah scrolls were burned, mothers who had allowed their babies to be circumcised were killed with their children. Anyone with a copy of a Torah scroll was executed, and leading citizens were required, on pain of death, to eat pork in public, thus openly disobeying the Torah’s commands. Judaism was faced with a life and death struggle for its continued existence.

The people of Israel had faced crises before, but never such a direct challenge to their religious practices. The Mesopotamia-based Assyrian empire had destroyed towns and dominated Jerusalem for decades. The Babylonian empire destroyed Jerusalem and took many of its leading citizenry into extended exile in Babylon. But no one before Antiochus IV tried so hard to eradicate Jewish Torah observance and monotheism.

Faced with the choice of “eat pig or die,” Jews responded in different ways. Some went ahead and ate some pork or offered the required sacrifices to foreign gods. Others are said to have fled to the barren wilderness to avoid the persecution. Still others openly defied the law and were killed outright. For example, the book of 2 Maccabees (chapters 6-7) tells of the killing of an elderly scribe, Eleazar, and of seven brothers along with their mother.

Still others chose to fight. Around 168 BCE, a provincial priestly family, the Hasmoneans (also known as “the Maccabees”), began a guerrilla operation against the armies of Antiochus. They scored repeated successes against a Seleucid army weakened by problems in other parts of the empire. Within a few years, in 164 BCE, the Hasmoneans had fought the Seleucids to a draw. Antiochus IV issued a decree rescinding his prohibition of Judaism:

King Antiochus to the senate of the Jews and to the other Jews, greeting. If you are well, it is as we desire. We also are in good health. Menelaus has informed us that you wish to return home and look after your own affairs. Therefore those who go home by the thirtieth day of Xanthicus will have our pledge of friendship and full permission for the Jews to enjoy their own food and laws, just as formerly, and none of them shall be molested in any way for what he may have done in ignorance. [2 Macc 11:27-31]

Later that year, in December of 164, the leader of the Hasmoneans at the time, Judah Maccabee, retook Jerusalem, trapped the Seleucid forces in the fortress there, and purified the temple of the non-Yahwistic cult. The holiday of Hanukkah celebrates this event, and the Hasmonean royal-priestly family has been known by Judah’s nickname, “the hammer” (Maccabee), ever since.

Hanukkah celebrates the first decisive victory in the story of Jewish resistance to Greek persecution, but the struggle would leave lasting marks on Judaism and on the Hebrew Scriptures that Jews and Christians share. Eventually the Hasmoneans ended the rule of the Greek Seleucids over Palestine and established their own monarchy based in Judaism.

The Jews triumphed, but the trauma of near-annihilation at the hands of the Greeks left its mark. The Hanukkah resistance struggle gave birth to the Jewish idea of martyrdom that was then expanded in Christianity and became a major theme in Islam as well. This was the time that a fundamental division between “Jew” and “Greek” developed that had not existed before, a perceived hostility between Greek ways of thought and Hebrew traditions.

This may also have been the time when the idea of clearly-defined Hebrew “scripture” developed. The Greeks already had a defined set of educational writings that marked an educated “Greek,” focused on Homer’s epics above all and additional authors from the classical period of Greek literature, e.g. Herodotus, Plato and Euripides. The post-Hanukkah Hasmonean kingdom is the time when we first see signs of the development of a Hebrew counterpart to this Greek curriculum. This Hebrew Bible was focused on Torah first and foremost (rather than Homer) and a set of prophetically-inspired books from the time before Greek rule and the “end of prophecy” (1 Maccabees 9:27). This Hebrew Bible, the basis for the Christian “Old Testament,” was an anti-Greek Bible formed in the wake of hellenistic trauma.

Not every Jewish group immediately adopted such a standardized, defined set of holy scriptures. For centuries, Israelites had worked with a looser idea of holy scripture, and this idea of a more fluid set of writings continued in later Jewish groups, such as the Qumran/Dead Sea community around the time of the Hasmoneans and the early Christians a bit later. The apocalyptic book of Enoch, for example, is still cited as scripture in the Christian scriptural writing of Jude (Jude 14). But as Roman trauma followed Greek trauma, this clearly defined Hebrew Bible of 24 books became the established Bible of rabbinic Judaism, and a model for the Christian New Testament of the early church. Before the trauma of the near-death of Judaism at the hands of Antiochus the Fourth there was no such thing. Hanukkah marked the beginning of the biblical age.

David Carr, Professor of Old Testament/Hebrew Bible at Union Theological Seminary in New York, is the author of Holy Resilience: The Bible’s Traumatic Origins, which was just published by Yale University Press.
