TIME Opinion

‘Offensive’ Is the New ‘Obscene’

Lenny Bruce, refused entry to Britain earlier in the day "in the public interest," makes a V-sign as he leaves U.S. customs office after returning to New York's Idlewild Airport on Apr. 8, 1963. John Lindsay—AP Photo

50 years after Lenny Bruce's sentencing, the world is still deciding what a comedian is allowed to say on stage

Reading about Lenny Bruce’s arrest for obscenity 50 years ago makes me think about a popular sketch Amy Schumer recently did on her Comedy Central show. On Dec. 21, 1964, Bruce was sentenced to four months in a workhouse for a set he did in a New York comedy club that included a bit about Eleanor Roosevelt’s “nice tits,” another about how men would like to “come in a chicken,” and other scatological and overly sexual humor.

How does this relate to Amy Schumer? In the sketch called “Acting Off Camera,” Schumer signs up to do the voice of what she thinks will be a sexy animated character, because Jessica Alba and Megan Fox are doing the voices of her friends. When she arrives at work, she sees that her character is an idiotic meerkat who defecates continuously, eats worms and has her vagina exposed. She says to her agent, “My character has a pussy.” Schumer is the first woman to say that word on Comedy Central without being censored, a right she fought for. The press commended her fight as a feminist advance, since the word had been banned even as four-letter words for male genitalia got the O.K.

A word that could have landed Bruce in the slammer 50 years ago is now available for public consumption, and its inclusion in the cuss-word canon is applauded. These days each of George Carlin’s “seven words” seems quaint. There is nothing so raunchy, so profane or so over-the-top that it could land a comedian in jail.

However, comedians have other reasons to censor themselves — namely, Twitter.

The most dangerous thing a comedian faces today is offending political correctness or saying something so racist or sexist that it kicks up an internet firestorm. In 2012, Daniel Tosh made a rape joke at a comedy club; everyone on the internet seemed to have an opinion about it. Many were offended, and he later apologized for the joke. Just last month comedian Artie Lange tweeted a sexist slavery fantasy about an ESPN personality and was met with harsh criticism. Saturday Night Live writer and performer Leslie Jones, a black woman, also took heat for making jokes about slavery; her critics said they were offensive, but she defended her comments, claiming they were misunderstood. Most of this exchange took place on Twitter.

This is a common cycle these days and one that can derail a comedian’s career (just look at what happened to Seinfeld alum Michael Richards after his racist rant became public). It’s also something that comedians are hyper-aware of. “I stopped playing colleges, and the reason is because they’re way too conservative,” Chris Rock said in a recent interview in New York magazine (referring to over-prudence, not political ideology). “Kids raised on a culture of ‘We’re not going to keep score in the game because we don’t want anybody to lose.’ Or just ignoring race to a fault. You can’t say ‘the black kid over there.’ No, it’s ‘the guy with the red shoes.’ You can’t even be offensive on your way to being inoffensive.” In a world where trigger warnings are becoming popular, how can a comedian really push the envelope?

In the interview, Rock says this policing of speech and ideas leads to self-censorship, especially when he’s trying out new material. He says that comedians used to know when they went too far based on the audience reaction within a room; now they know they’ve gone too far based on the reaction of everyone with an Internet connection. Today the slightest step over the line could land a comedian not in the slammer but in a position like Bill Maher’s, where students demanded that he not be allowed to speak at Berkeley because of statements he made about Muslims.

That’s the difference between Lenny Bruce and someone like Leslie Jones. A panel of judges decided that Bruce should face censorship because of what he said. Now Leslie Jones gets called out, but the public is the judge. Everyone with a voice on the internet can start an indecency trial and let the public decide who is guilty and to what degree. (The funny truth is that, depending on whom you follow on Twitter, the accused is usually either universally guilty or universally innocent.)

What hasn’t changed as we’ve shifted from “obscene” to “offensive” is just how murky the standard remains. The Supreme Court famously declined to define “obscene”; Justice Potter Stewart said only that he knew it when he saw it. The same is true of “offensive.” One comedian can make a joke about race or rape and have it be fine; another can make a joke on the same subject matter and be the victim of a million blog posts. There was even an academic study to determine which strategies were most effective for making jokes about race.

Whenever one of these edgy jokes makes the news, a rash of comedians comes to the defense not necessarily of the joke but of the comedian’s right to tell it in the first place. The same thing happened at Bruce’s trial, when Woody Allen, Bob Dylan, Norman Mailer and James Baldwin all showed up to testify on his behalf. Bruce never apologized for what he said. Though he died before his appeal could make its way through the courts, he received a posthumous pardon in 2003. Then-Governor of New York George Pataki noted that the pardon was a reminder of the importance of the First Amendment.

In 50 years a lot has changed, but comedy, like the First Amendment, really hasn’t. There are always going to be people pushing the boundaries of what is acceptable, because that’s what we find funny. What has changed is who is policing that acceptability — and that makes a big difference. We no longer have too-conservative judges enforcing “community standards” about poop jokes, telling people like Lenny Bruce what they can and cannot say. Instead, today’s comedians are policed by the actual community, which uses the democratic voices of the Internet and social media to communicate standards around race, religion, sexuality, gender and identity. The community doesn’t say comedians can’t offend, only that they’ll face consequences if they do. Their First Amendment rights are preserved and, though it may get in the way of the creative process once used by people like Chris Rock, online feedback can often lead to productive conversations.

In a world where nothing is obscene, things can still be offensive, as murky as both those ideas might be. At least we’ve taken the government out of comedy, which seems safer for everyone. Now it can stick to dealing with the important things, like Janet Jackson’s nipple.

Read TIME’s original coverage of Lenny Bruce’s conviction, here in the archives: Tropic of Illinois

TIME Opinion

Like It or Not, Uber Is Transforming Life in Middle America

Ridesharing services are under fire amid breakneck growth, but the way they’re changing city life in flyover country is not trivial

In cities around the world, Uber is fighting for its life. On Monday, Portland, Oregon, sued to force the company to stop operating after it opened up shop without the city’s blessing. This comes on the heels of a statewide Uber ban in Nevada and a crackdown on the company in New Delhi following a rape accusation against a driver.

Many of Uber’s problems are of its own creation, a result of its extraordinarily fast ascent and equally extraordinary hubris. The company has managed to marry runaway success with a corporate culture that evidently permits executives to say and do some incredibly stupid things. And Uber drivers may have been involved in some truly scary assaults (which is not to say that traditional taxi drivers haven’t). But as taxi drivers, regulators and a critical media push back against the young company, we’re faced with a baby-and-the-bathwater situation. I want to make a case for not throwing out the good with the bad.

Uber, Lyft and other ridesharing services are often framed as alternatives to taxis. But the reality in most of America, where taxi services are often clunky and undependable, is that they aren’t alternatives to anything. They’re something entirely new that is transforming life and culture for the better.

I asked my network of friends who live outside the Beltway and west of the Hudson River whether ridesharing had changed their lives and, if so, how.

“In a city like Houston where you can’t get a cab on the street (or very quickly from a call most of the time) it is life changing to have cheap car service available within minutes,” says Brandie Mask, a 29-year-old attorney. “Also, it’s great to have the same service in different cities so you don’t need to figure out the cab situation in an unfamiliar place. You just open your trusty app.”

When I posed the question to people in Tulsa, Oklahoma (my hometown), the responses were unanimous: the recent introduction of ridesharing companies had changed how people move around town, and for the better.

“I use Uber a lot in Tulsa,” said Mike Villafuerte, 35. “I been using them since I’ve had back surgery and also suffer from fibromyalgia. Last year I was in ICU for a month. I survived and I’m a big user of Uber to get to my doctor appointments.”

Amanda Gammill, 31, who lives with her husband in a suburb just outside of Tulsa, reports using ridesharing services while traveling and even on a night out in Tulsa though she owns a car. “We use it so we don’t get lost,” Gammill says. “And if we happen to decide to drink, we are safe. Cheaper than tolls, parking etc., and no risk of break in.”

The drinking-and-driving issue is key, because in many cities the truth is that, rather than find a taxi dispatcher’s phone number and wait an unknowable amount of time for a generally unclean car that may never come at all, many people just drink and drive. The question has come up before, as analysts—most notably Uber itself on its blog—have tried to parse out any correlation between a decline in DUIs and the introduction of Uber into a market. Whether that is a causal relationship is a question for the statisticians, but as a matter of cultural change there is no question as to the cause and its effect. By allowing people to get affordable rides efficiently between any two points around town, ridesharing services are utterly transformative in the car-bound cities of Middle America.

In parts of the country where taxis are unreliable at best and public transportation is spotty if it exists at all, ridesharing links neighborhoods and opens cities up to the carless—or less car-dependent—lifestyle that many of today’s urban professionals seem to prefer. Cities that shut down ridesharing services at the behest of grumbling taxi companies send a clear message to young professionals—move somewhere else, because this town ain’t for you. But just as it’s not only the kids these days who send text messages and have Facebook accounts, it’s not just the young and hip who use Uber.

Jason Boston, a driver for Uber and Lyft in Cincinnati, says that while he gets a lot of passengers in the younger set, there’s a whole cohort of retired people whom he sees using the services too. “I think the unique thing about the retirees,” he tells TIME, “is that they typically don’t go out that often but these new services allow them to go out with safe reliable options to get to and from a destination where they may have a few drinks.”

Last year I was at home alone in residential Washington, DC, attempting to fix a broken window when I slashed my arm on a piece of broken glass. I hardly needed an ambulance, but I don’t own a car, wasn’t going to wait around for the bus and, with one arm useless and a belt tourniquet held taut through my teeth, I wasn’t inclined to start calling dispatchers trying to find a cab. So I opened up an app and caught a ride to the emergency room. That ride literally saved me hundreds of dollars.

The above is an extreme example, but it illustrates how powerful ridesharing apps can be. In much of the country—especially the places between the coasts where most policymakers and national media people don’t live—the changes brought about by ridesharing aren’t trivial. Uber needs to grow up, and all ridesharing companies need to make peace with regulators, but cities that shut down these services are likely to fall behind cities that don’t.

TIME Opinion

Girl Gone Wild: The Rise of the Lone She-Wolf

Wild
Fox Searchlight

A woman on a solitary journey used to be seen as pitiful, vulnerable or scary. Not any more.

The first few seconds of Wild sound like sex. You hear a woman panting and moaning as the camera pans across the forest, and it seems like the movie is starting off with an outdoor quickie. But it’s not the sound of two hikers hooking up: it’s the sound of Cheryl Strayed, played by Reese Witherspoon, climbing a mountain all by herself.

It lasts only a moment, but that first shot contains everything you need to know about why Wild is so important. It’s a story of a woman who hikes the Pacific Crest Trail for 94 days in the wake of her mother’s death, but more than that, it’s a story of a woman who is no longer anything to anybody. We’re so used to seeing women entangled with other people (with parents, with men, with children, in neurotic friendships with other women), that it’s surprising, almost shocking, to see a woman who is gloriously, intentionally, radically alone.

When it comes to women onscreen, the lone frontier is the last frontier. It’s no big deal to see women play presidents, villains, baseball players, psychopaths, superheroes, math geniuses, or emotionally stunted losers. We’ve even had a female Bob Dylan. But a woman, alone, in the wilderness, for an entire movie? Not until now.

Which is unfair, considering all the books and movies dedicated to the often-tedious excursions of solitary men, from Henry David Thoreau to Jack Kerouac to Christopher McCandless. Audiences have sat through hours of solo-dude time in critically acclaimed movies like Cast Away, Into the Wild, Life of Pi, 127 Hours, and All Is Lost. America loves a Lone Ranger so much, even Superman worked alone.

In fact, the only thing more central to the American canon than a solitary guy hanging out in the woods is a guy on a quest (think Huckleberry Finn or Moby Dick). The road narrative may be the most fundamental American legend, grown from our history of pilgrimage and Western expansion. But adventure stories are almost always no-girls-allowed, partly because the male adventurer is usually fleeing from a smothering domesticity represented by women. In our collective imaginations, women don’t set out on a journey unless they’re fleeing from something, usually violence. As Vanessa Veselka writes in her excellent essay on female road narratives in The American Reader: “A man on the road is caught in the act of a becoming. A woman on the road has something seriously wrong with her. She has not ‘struck out on her own.’ She has been shunned.”

The ‘loner in nature’ and the ‘man on the road’ are our American origin stories, our Genesis and Exodus. They’re fables of an American national character which, as A.O. Scott pointed out in his New York Times essay on the death of adulthood in American culture, has always tended towards the boyish. Wild is the first big movie — or bestselling book, for that matter — to retell that central American story with a female protagonist.

But Wild is just the most visible example of what’s been a slow movement towards loner ladies onscreen. Sandra Bullock’s solo spin through space last year in Gravity was the first step (although her aloneness was accidental, and it was more a survival story than road narrative). Mia Wasikowska’s long walk across Australia in Tracks this year was another. But Wild, based on Strayed’s bestselling memoir and propelled by Witherspoon’s star power, is the movie that has the best shot at moving us past the now-tired “power woman” towards a new kind of feminist role model: the lone female.

Because for women, aloneness is the next frontier. Despite our chirpy boosting of “independent women” and “strong female leads,” it’s easy to forget that women can never be independent if we’re not allowed to be alone.

For men, solitude is noble: it implies moral toughness, intellectual rigor, a deep connection with the environment. For women, solitude is dangerous: a lone woman is considered vulnerable to attacks, pitiful for her lack of male companionship, or threatening to another woman’s relationship. We see women in all kinds of states of loneliness–single, socially isolated, abandoned–but almost never in a state of deliberate, total aloneness.

Not to mention the fact that women’s stories are almost always told in the context of their relationships with other people. Even if you set aside romance narratives, the “girl group” has become the mechanism for telling the stories of “independent” women — that is, women’s stories that don’t necessarily revolve around men. Think Sex and the City, Steel Magnolias, A League of Their Own, Sisterhood of the Traveling Pants, Girls: if a woman’s not half of a couple, she must be part of a gaggle.

When Cheryl Strayed describes her experience of “radical aloneness,” she’s talking about being completely cut off from human contact–no cell phone, no credit card, no GPS. But her aloneness is also radical in that it rejects the female identity that is always viewed through the lens of a relationship with someone else. To be alone, radically alone, is to root yourself in your own life, not the role you play in other people’s lives. Or, as Strayed’s mother Bobbi wistfully puts it, “I always did what someone else wanted me to do. I’ve always been someone’s daughter or mother or wife. I’ve never just been me.”

And that’s the difference between aloneness and independence. The “independent woman” is nothing new — if anything, it’s become a tired catchphrase of a certain kind of rah-rah feminism. “Independence” implies a relationship with another thing, a thing from which you’re severing your ties. It’s inherently conspicuous, even performative. Female independence has become such a trope that it’s now just another role for women to play: independent career woman, independent post-breakup vixen, independent spitfire who doesn’t care what anyone thinks. And usually, that “independence” is just a temporary phase before she meets a guy at the end of the movie who conveniently “likes a woman who speaks her mind.”

Aloneness is more fundamental, and more difficult. It involves cultivating a sense of self that has little to do with the motherhood, daughterhood, wifehood or friendship that society calls “womanhood.” When interviewed by the Hobo Times about being a “female hobo,” Strayed says: “Women can’t walk out of their lives. They have families. They have kids to take care of.” Aloneness, then, isn’t just a choice to focus on one’s self — it’s also a rejection of all the other social functions women are expected to perform.

In 1995, when Strayed hiked for 94 days, that kind of aloneness would have been hard to achieve. In 2014, it’s even harder. Thanks to the internet, our world is more social now than ever before, and it’s harder still to escape other people. But aloneness is at the root of real independence; it’s where self-reliance begins and ends. So these days, if you want to be independent, maybe you can start by trying to be alone.

TIME Opinion

Determined to Cut Taxes Once Again, Republicans Use New Math to Reshape Reality

CBO Director Elmendorf on Aug. 27, 2014, in Washington, DC. Alex Wong—Getty Images

"Republicans and Democrats inhabit different factual universes"

In 2004, an unnamed senior White House staffer, widely thought to be political adviser Karl Rove, gave a famous interview to Ron Suskind. “The aide,” Suskind wrote, “said that guys like me were ‘in what we call the reality-based community,’ which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality.’”

But, the anonymous man told Suskind, that wasn’t the way the world worked. “We’re an empire now,” he’s quoted as saying, “and when we act, we create our own reality. And while you’re studying that reality — judiciously, as you will — we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors . . . and you, all of you, will be left to just study what we do.’”

The staffer may not have known it, but he was identifying one feature of the kind of crisis era into which the United States had just entered. Great political crises like those of the American Revolution, the Civil War, the Depression and the Second World War, and the one that began in 2001 and continues today are also struggles over the meaning of words—words like democracy and dictatorship, freedom, free enterprise and socialism, and so on. They also become struggles over the most basic facts. Writing during the great worldwide crisis of the Second World War, George Orwell identified a number of competing world views whose adherents could not accept various obviously true statements about the world around them. Within ten years more—that is, by the mid-1950s—that was no longer the case. Victory in war and postwar economic growth had created a new consensus in the western world.

But today, divisions over reality are as deep as they have been for a very long time, and Republicans and Democrats inhabit different factual universes. One obvious area of disagreement is climate change. While President Obama seeks agreements with foreign nations to reduce greenhouse gas emissions, Republicans almost unanimously reject the consensus of scientific opinion and argue that those emissions have not been proven to heat up the planet. Another very important area of disagreement concerns fiscal and economic policy — and this week it became clear that some Republicans are struggling to make their version of economic reality prevail in Washington, too.

The New York Times reported on a campaign by prominent Republican conservatives and some Tea Party Congressmen to replace the director of the Congressional Budget Office, Douglas W. Elmendorf, with someone who would calculate the effects of tax cuts in a different way. Led by the anti-tax activist Grover Norquist and the Heritage Foundation, these Republicans specifically want a new director who would use what they call “dynamic scoring” to predict the impact of tax cuts. “Dynamic scoring” is based upon the theory of supply-side economics. That theory, first popularized under Ronald Reagan, held that tax cuts, particularly on the highest income brackets, would unleash extraordinary economic growth and therefore bring in more, rather than less, revenue within a few years. Dynamic scoring is a theoretically more sophisticated application of this idea. Rather than simply deducting the projected cost of tax cuts from federal revenues to estimate their future impact, dynamic scoring predicts how much new tax cuts will increase GDP by unleashing economic growth, and how much that added growth will boost tax revenues and so mitigate the effect of the cuts upon the deficit. Since the new Republican majorities in Congress are determined to cut taxes yet again while claiming to move closer to a balanced budget, this is an idea they need to validate in order to justify their plans.
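To make that contrast concrete, here is a purely illustrative sketch, written in Python, of the difference between conventional “static” scoring and dynamic scoring. Every number in it (the size of the cut, the assumed growth effect, the share of GDP collected as taxes) is invented for the example; these are not CBO figures and not numbers drawn from this article.

```python
# Purely illustrative: hypothetical numbers, not CBO methodology or real data.

def static_score(revenue, tax_cut):
    """Static scoring: the cut is simply subtracted from projected revenue."""
    return revenue - tax_cut

def dynamic_score(revenue, tax_cut, gdp, assumed_growth_boost, tax_share_of_gdp):
    """Dynamic scoring: assume the cut raises GDP, and count the taxes
    collected on that extra output as an offset against the cut's cost.
    Both assumption parameters are chosen by whoever does the scoring."""
    extra_gdp = gdp * assumed_growth_boost            # growth "unleashed" by the cut
    feedback_revenue = extra_gdp * tax_share_of_gdp   # taxes collected on that growth
    return revenue - tax_cut + feedback_revenue

# Hypothetical figures, in billions of dollars.
revenue, tax_cut, gdp = 3_000.0, 100.0, 17_000.0

print(static_score(revenue, tax_cut))                     # 2900.0
print(dynamic_score(revenue, tax_cut, gdp, 0.005, 0.18))  # ~2915.3: the cut now appears
                                                          # to cost only ~$84.7 billion
```

Everything turns on the growth assumption: set it to zero and the dynamic score collapses back into the static one; set it high enough and a cut appears to pay for much of its own cost.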

The history of dynamic scoring is closely tied to the history of Republican economic policy since Ronald Reagan. When Reagan took office in 1981, the federal deficit was $79 billion. His Administration immediately adopted policies based on supply-side economics. It didn’t work. When he left office eight years later after several rounds of tax cuts on the higher brackets, the annual deficit was $152 billion, down from a peak of $221 billion in 1986. Despite George H. W. Bush’s tax increases, which split the Republican Party, a severe recession had raised the deficit back up to $255 billion when he left office in 1993.

Bill Clinton began his Administration with an income tax increase on nearly all Americans. It helped cost the Democrats the House of Representatives in 1994. It was at that point that Newt Gingrich, then the new Speaker of the House, and his fellow Republicans began to advocate dynamic scoring as a means of calculating the impact of further tax cuts. Once again, they claimed they could accurately estimate the beneficial impact of leaving more money in the hands of the wealthy, and thereby ease fears about deficits. But Clinton refused to go along, and eventually the Clinton Administration ran a budget surplus in fiscal 2000.

George W. Bush inherited a deficit of only $32 billion in 2001, and came into office determined to cut taxes again. By 2003, the Bush Administration was basing calls for a second round of cuts on dynamic scoring estimates that once again claimed that the cuts would generate increased revenue. The cuts passed, but their impact, combined with the Iraq war and the Great Recession, was to balloon the deficit up to $641 billion in fiscal 2008, and $1.55 trillion in fiscal 2009. Together, President Obama and the Republican Congress have now reduced the deficit to $483 billion in the fiscal year that was just completed. This pattern actually dates from the 1950s. Beginning with Dwight D. Eisenhower, every Republican President has substantially increased the federal deficit, while every Democratic President except Jimmy Carter has reduced it during his term of office.

Few theories of public policy have been tested so repeatedly, and failed so spectacularly, as the idea that tax cuts in the high brackets will ultimately increase revenue and lower deficits. But led by Representative Paul Ryan, the Republican majority is eager to try it again, pushing for personnel changes that would lead the nonpartisan Congressional Budget Office to adopt dynamic scoring. It is hard for me to believe that any Republican activists seriously believe that a new round of top-bracket and corporate tax cuts will increase revenues. Their real agenda, I suspect, is the one that Grover Norquist—a prime mover in the campaign to replace Elmendorf—has repeatedly spoken of: to force further reductions in government spending by reducing government revenues still further. Meanwhile, wealthy Republican donors will get even wealthier and, presumably, even more generous in their contributions. Once again the Republicans are trying to create their own reality: a world in which making the rich richer will bring down deficits, while only a few poor benighted members of the “reality-based community” take the trouble to notice that this is not so.

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

This post has been updated to include additional economic data.

TIME Opinion

History’s Echoes in the Policing that Made Eric Garner Say ‘Enough’

Protests Continue Across Country In Wake Of NY Grand Jury Verdict In Chokehold Death Case
In Oakland, Calif., Niels Smith holds a sign reading "I can't breathe" on the second night of demonstrations following a Staten Island, New York grand jury's decision not to indict a police officer in the chokehold death of Eric Garner, on Dec. 4, 2014. Elijah Nouvelage—Getty Images

"Racism can work through laws, even seemingly good laws"

Ever since the grand jury’s decision not to indict a police officer in the death of Eric Garner, protests across the country have rallied around Garner’s final words, repeated multiple times as he was choked and restrained: “I can’t breathe.”

But we should also pay attention to Garner’s first words in the video that recorded his death: “Every time you see me you want to mess with me. I’m tired of it.” Those words point us to the past—not only to Garner’s past encounters with the police, but also to the nation’s long history of using the law to “mess with” black citizens.

That history includes the years following the American Civil War. After their defeat, citizens of the former Confederacy could no longer buy or sell black people. Still, in 1865 and 1866, Southern state legislatures quickly passed a series of “Black Codes” that hampered attempts by newly freed people to escape the control of former masters.

These postwar Black Codes redefined customary local privileges, such as hunting wild hogs, as major crimes. Local white police forces enforced criminal laws selectively, using ostensibly color-blind laws to harass and incarcerate the newly freed. Of the nearly three hundred formerly enslaved people sent to the state penitentiary in Texas between 1865 and February 1867, the vast majority were charged with theft, including George Tucker, accused of stealing twenty cents in Montgomery County. Despite his protests of innocence, Tucker was sentenced to two years in prison, as was Tom Gravys of Harris County, accused of stealing half a plug of tobacco.

Even tax codes became weapons in white Southerners’ efforts to control black citizens. Black Southerners briefly thwarted those efforts between 1867 and 1876, when Congressional Reconstruction enabled formerly enslaved people and their allies to seize control of Southern legislatures. Republican governments, elected by black voters, overturned Black Codes and strengthened public services like education. To pay for increased public spending, Republican lawmakers overhauled the antebellum Southern tax system, too, shifting the burden of financing the state to its wealthiest instead of its poorest citizens.

Yet in the mid-1870s, so-called “Redeemer” Democrats, who campaigned on white supremacist platforms, began to wrest control of Southern legislatures away from Republicans, placing the power of the law back in the hands of reactionary whites. Redeemers quickly reinstituted Black Codes, using them not only to imprison but to disenfranchise black voters. According to a 2009 article by historian Pippa Holloway, between 1874 and 1882 every southern state but one passed constitutional amendments that disenfranchised petty thieves and turned misdemeanor crimes like stealing chickens into felony offenses.

Those Democrats also made taxes more regressive, hoping to force fine-harried workers to accept any wages white employers were willing to offer. By increasing licensing fees and excise taxes that targeted black citizens, they shifted tax burdens away from white landowners and rolled back public spending on things that mattered most to the people they disenfranchised. The result, as historian Eric Foner once wrote, was that poor black laborers “bore the heaviest burden of taxation and received the fewest public services.”

For students of this history, it is difficult not to hear echoes of such stories today. Two black men lie dead after encounters with the police that reportedly began because of alleged offenses like petty theft and sales of loose cigarettes designed to evade a regressive tax. Police had already arrested Eric Garner dozens of times on a string of misdemeanor charges, all of which he planned to contest in court.

Meanwhile, last year in Ferguson, Mo., the population of the city was lower than the number of warrants, many of them for offenses like minor traffic infractions or forgetting to sign up for garbage collection. And nationally, data indicates that black citizens are more likely than whites to be arrested, patrolled, stopped, and killed by police officers, even as crime rates have declined.

Such examples of overpolicing black citizens may well appear to future historians much as the application of the Black Codes looks to historians of Reconstruction today: as clear examples of one group of Americans using power to hold another group down. To prove them wrong, we the living will have to confront our own history, stretching back to the Civil War and beyond, more honestly than most of us have done in the past.

Indeed, our collective lack of historical perspective shows even when Americans from different sides of the political spectrum agree that Garner’s homicide was unjust. Earlier this week, Senator Rand Paul of Kentucky blamed high cigarette taxes for Garner’s death. While Paul has commented before on the role of racism in our criminal justice system, his and others’ focus on “big government” betrays a reluctance to grapple with our history. We cannot fix our problems without acknowledging how long “broken windows” policing has targeted black communities, or how often public spending has been vital to protecting their rights as citizens.

On the other hand, without endorsing Paul’s specific analysis or policy proposals, we should also be reluctant to declare, as some did on social media, that “it is harder to get farther from the point than arguing the cause of Eric Garner’s death was cigarette taxes.” History warns against assuming that any part of our society is necessarily far from the reach of racism. Our history reminds us that racism can work through laws, even seemingly good laws, just as it can through lynch mobs. Black citizens have been held down with physical force, but also with fines, fees and felony sentences that took away the political power to change those very things.

In view of that past, there is little, if anything, about our country and our legal system that we should not be subjecting to a critical eye today. None of our institutions stand outside of our history, not our grand juries, not our police and not even our excise tax laws. And every chapter of that history contains evidence of black Americans being harassed by both legal and extralegal means.

The nation has found countless ways to “mess with” black lives for generations, and our past ways are not dead. They are not even past. That is why Eric Garner started by saying “I’m tired,” and why so many other Americans are now shouting “I can’t breathe.”

W. Caleb McDaniel is assistant professor of history at Rice University and the author of The Problem of Democracy in the Age of Slavery: Garrisonian Abolitionists and Transatlantic Reform, which won the Merle Curti Award from the Organization of American Historians.

TIME Opinion

The Problem With Frats Isn’t Just Rape. It’s Power.

The Phi Kappa Psi fraternity house at the University of Virginia in Charlottesville, Va., on Nov. 24, 2014. A Rolling Stone article alleged a gang rape at the house, which has since suspended operations Steve Helber—AP

Too many frats breed sexism and misogyny that lasts long after college. Why we need to ban them—for good.

At the university I called home my freshman year, fraternity row was a tree-lined street full of Southern-style mansions, set against a backdrop of the poor urban ghetto that surrounded the school. Off-campus frat parties weren’t quite how I pictured spending my weekends at a new school – I wasn’t actually part of the Greek system – but it quickly became clear that they were the center of the social structure. The fraternities controlled the alcohol on campus, and thus the social life. So there I was, week after week, joining the throngs of half-naked women trekking to fraternity row.

We learned the rules to frat life quickly, or at least we thought we did. Never let your drink out of your sight. Don’t go upstairs – where the bedrooms were housed – without a girlfriend who could check in on you later. If one of us was denied entry to a party because we weren’t deemed “hot” enough – houses often ranked women on a scale of one to 10, with only “sixes” and up granted entry to a party – we stuck together. Maybe we went to the foam party next door.

In two years at the University of Southern California, I heard plenty of stories of women being drugged at frat parties. At least one woman I knew was date raped, though she didn’t report it. But most of us basically shrugged our shoulders: This was just how it worked… right?

If the recent headlines are any indication, it certainly appears so. Among them: women blacked out and hospitalized after a frat party at the University of Wisconsin, only to discover red or black X’s marked on their hands. An email guide to getting girls in bed called “Luring your rapebait.” A banner displayed at a Texas Tech party reading “No Means Yes, Yes Means Anal” – which happened to be the same slogan chanted by frat brothers at Yale, later part of a civil rights complaint against the university.

And now, the story of Jackie, who alleged in a Rolling Stone article — one swiftly becoming the subject of a debate over fairness in reporting and whether the author was negligent in not reaching out to the alleged rapists — that she was gang raped by seven members of the Phi Kappa Psi house at the University of Virginia, and was discouraged from pressing charges to protect the university’s reputation.

The alleged rape, it turned out, took place at the same house where another rape had occurred some thirty years prior, ultimately landing the perpetrator in jail.

“I’m sick about this,” says Caitlin Flanagan, a writer and UVA alumna who spent a year documenting the culture of fraternity life for a recent cover story in the Atlantic. “It’s been 30 years of education programs by the frats, initiatives to change culture, management policies, and we’re still here.”

Which raises the question: Why isn’t every campus in America dissolving its fraternity program — or at least instituting major, serious reform?

Not every fraternity member is a rapist (nor is every fraternity misogynistic). But fraternity members are three times more likely to rape, according to a 2007 study, which notes that fraternity culture reinforces “within-group attitudes” that perpetuate sexual coercion. Taken together, frats and other traditionally male-dominated social clubs (ahem: the Princeton eating clubs) crystallize the elements of our culture that reinforce inequality, gender and otherwise.

For starters, they are insulated from outside perspective. It wasn’t until the late 1960s that Greek organizations eradicated whites-only membership clauses; as a recent controversy at the University of Alabama revealed, only one black student had been admitted to that Greek system since 1964. Throughout the country, fraternities grew into a “caste system based on socioeconomic status as perceived by students,” John Chandler, the former president of Middlebury, which has banned frats on campus, recently told Newsweek.

And when it comes to campus social life, they exert huge social control: providing the alcohol, hosting the parties, policing who may enter–based on whatever criteria they choose. Because sororities are prohibited from serving alcohol, they can’t host their own parties; they must also abide by strict decorum rules. So night after night, women line up, in tube tops and high heels, vying for entrance. Even their clothes are a signifier of where the power lies. “Those with less power almost invariably dress up for those who have more,” Michael Kimmel, a sociologist at Stony Brook University, wrote in a recent column for TIME. “So, by day, in class, women and men dress pretty much the same … At parties, though, the guys will still be dressed that way, while the women will be sporting party dresses, high heels and make up.”

And when frat boys grow up? They slide right into the boys club of the business world, where brothers land Wall Street jobs via the “fraternity pipeline,” as a recent Bloomberg Businessweek piece put it — a place where secret handshakes mean special treatment in an already male-dominated field. Fraternities have graduated plenty of brilliant Silicon Valley founders, including the creators of Facebook and Instagram. They’ve also brought us Justin Mateen, the Tinder co-founder who stepped down amid a sexual harassment lawsuit, and Evan Spiegel, the Snapchat CEO, who recently apologized for e-mails sent while he was in the Stanford frat where Snapchat was founded — e-mails that discussed convincing sorority women to perform sex acts and drunkenly peeing on a woman in bed.

If we lived in a gender-equal world, fraternities might work. But in an age when one in five college women is raped or assaulted on campus, when dozens of universities are under federal investigation for their handling of such cases, and when the business world remains dominated by men, doesn’t the continued existence of fraternities normalize a kind of white, male-dominated culture that already pervades our society? There is something insidious about a group of men who deny women entry, control the No. 1 asset on campus – alcohol – and make the rules in isolated groups. “[Colleges] should be cultivating the kind of sensibility that makes you a better citizen of a diverse and distressingly fractious society,” Frank Bruni wrote in a New York Times column this week. “How is that served by retreating into an exclusionary clique of people just like you?”

The argument for Greek life – at least for the mainstream, largely white frats that seem to be the problem – goes something like this: It’s about fostering camaraderie. (According to a 2014 Gallup Poll, fraternity and sorority members have stronger relationships with friends and family than other college graduates.) It’s about community: as the Washington Post reported, chapters at UVA raised $400,000 for charity and logged 56,000 hours of community service during the past academic year. It’s part of a student’s right to congregate freely. And it’s about training future leaders: according to Gallup, fraternity and sorority members end up better off financially and are more likely to start businesses than other college graduates.

But the real benefit – at least the unspoken one – may be money. As Flanagan pointed out in her Atlantic piece, fraternities save universities millions of dollars in student housing. And frats breed generous donors: at least one study has confirmed that fraternity brothers tend to be generous to their alma maters.

All of which is part of the problem. Who wants to crack down on frats if it’s going to profoundly disturb campus life?

UVA, for its part, has suspended the frat in question until the new year, a move the Inter-Fraternity Council described as a helpful opportunity for UVA’s Greek system to “take a breath.” The university’s president has said that the school “is too good a place to allow this evil to reside.” But critics saw the punishment as a slap on the wrist: a suspension, when most students are out of town for the holidays?

There are other options on the table: The school is reportedly considering proposals to crack down on underage drinking and even a ban on alcohol. Other universities have explored making fraternities co-ed. And there’s some evidence that fraternity brothers who participate in a rape prevention program at the start of the academic year are less likely to commit a sexually coercive act than a control group of men who also joined fraternities.

Yet all the while, the parade of ugly news continues. A group of frat brothers at San Diego State University interrupted a “Take Back the Night” march last week by screaming obscenities, throwing eggs and waving dildos at marchers. The next night, a woman reported she was sexually assaulted at a party near the school’s campus; she was the seventh person to come forward this semester. And on Monday, Wesleyan announced that its Psi Upsilon fraternity would be banned from hosting social events until the end of 2015, also because of rape accusations.

Fraternities have created something rare in the modern world: a place where young men spend three or four years living with other men whom they have vetted as like them and able to “fit in.” What do you expect to happen at a club where women are viewed as outsiders, or commodities, or worse, as prey, and where men make the rules? It should be no surprise that they end up recreating the boys club — one that isn’t so great for the boys, either.

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor on special projects for Sheryl Sandberg’s women’s non-profit, Lean In. You can follow her @jess7bennett.

Read more views on the debate about preventing sexual assault on campus:

Caitlin Flanagan: We Need More Transparency on the Issue of Fraternity Rape

A Lawyer for the Accused on Why Some Rules About Consent Are Unfair to Men

Ban Frat Parties–Let Sororities Run the Show

TIME Education

Sorry, Fellow Teachers: Standardized Testing Is Not the Devil

scantron
Getty Images

At first, like many teachers, I hated these tests. But what if we used the results to make us better educators?

This story originally appeared on xoJane.com.

What I’m about to say might make me the most unpopular teacher in the lounge.

Standardized testing actually isn’t all that bad. In fact, if used correctly, it can do some great stuff.

Sorry, I just had to duck as a head of lettuce was thrown my way.

I teach adults English as a Second Language (ESL) in the Los Angeles Unified School District. Our students take two or three tests during a semester, and if their scores improve, we get money from the state.

At first, like many teachers, I hated these tests. Then, I got a job as my school’s testing advisor. For nine years, I assisted teachers with the testing, trained them to do it more effectively, scanned all the test results into a giant database, collected a bunch of demographic information, and then sent off the final report to the school district’s main office. From there, it went on to some shadowy place in the government, and somehow this all resulted in a check for some books we might buy.

The thing that bothered me was that all that work never truly resulted in helping struggling teachers or improving our programs. We made some cursory attempts at improvement, but the focus was on the money we earned from performance, not the evaluative process itself. We improved in order to get more funding. We didn’t truly examine core issues.

There was a widespread distrust of the tests among teachers and administrators. In my experience, teachers tend to be, oddly enough, people who distrust authority. We look at ourselves as overworked heroes, and we see standardized tests as the instruments that don’t truly measure the value of our work. But the thing is, the tests did measure a few things that we should have examined more closely.

I had never been much of a computer geek, but I had a knack for this bean-counting database job. So I got pretty familiar with the reporting we did, and year after year, the numbers painted a clear picture. Some teachers were getting bigger improvements than others. Yes, there were outliers and exceptions and individual situations that affected testing. But if you took a step back and looked at trends, you could see which teachers were more effective — at least at the subjects our tests covered.

Moreover, unlike K-12 programs, adult education is not compulsory. Students who don’t enjoy your class can just walk out. They “vote with their feet.” The data showed us which teachers had high retention and which teachers had high drop rates. In other words, we could see who was engaging students and who might be a little, um…boring.

This isn’t to say that a boring teacher is a bad teacher, or that a fun teacher is necessarily a great teacher. But if I could see great retention and consistent test score improvements, term after term, I knew a teacher was pretty right on. At least, this was a teacher we could all learn a bit from.

One criticism that gets lobbed against standardized testing is that it can be used against instructors. But what about the opposite situation, when it can be used in their favor? There was a younger teacher in our school who just amazed me with his stellar test results. He dressed flamboyantly and danced and sang in his classroom. His learner persistence was great, too. Students loved him, because he was fun and he enjoyed the hell out of his job.

He was denied tenure. An administrator simply didn’t like his style. She was a more conservative, buttoned-down type, and she thought he was too silly, too wild. She overlooked the empirical evidence of his great results. Why aren’t we using these kinds of measurements to reward teachers like him or learn to emulate them? (He finally did get tenure.)

There was a lot of useful information that could have helped us improve our teaching. But we didn’t make enough use of that information. No teachers were dismissed for being uninteresting, but sadly, no teachers were trained to be more interesting, either.

In many ways, the cynicism of the staff toward the tests was understandable. The whole testing system was pretty flawed (and still is). And over the years, the pressure to do well on these tests mounted as our other funding was cut. So of course, when people don’t believe something has value, but they are forced to do it to earn money, a predictable result occurs. Teachers started to manipulate the test results.

One day, I was looking at a set of tests, and I noticed some strange patterns in the data. Students were scoring very high at the end of the semester, but then their scores would plummet at the beginning of the next semester. After narrowing this activity down to one class, I realized this teacher was cheating. He was depressing scores at the beginning of the term in order to see bigger results at the end of the term.

Oddly enough, he actually walked into my office right as I was making this discovery. And he proudly told me how he’d been getting such great test results: He’d been giving his students 10 minutes on the first test and an hour on the second. And bam! Improvement points.

I explained to this teacher, and a few others, that we couldn’t play the game like that. And my pitch basically went something like this: “Look, in the most basic sense, teachers should not be the ones trying to cheat on tests. But if that call to honesty doesn’t convince you, just know that your cheating is transparent, and if I can see it, you’ll get caught by someone less nice, who might fire you.”

I cleaned up that sort of behavior at our school, but I was aware that it was happening at other schools. I told some people who maybe could have done something about it. Not much happened.

At first, my school always did well, and I was considered competent. But year by year, we fell behind. Every year, when we didn’t meet our targets, I had to go do a sort of confessional with my supervisor in the district office. I had to examine the error of my ways and make a plan of improvement. There was no question that my job was riding on that improvement.

Just when I thought I’d have to capitulate or lose my job, the cheating at the other schools was discovered, the people involved were disciplined, and I was vindicated. Our school came out smelling clean and fresh and non-cheaty. I wish I could say this was a victory, but funding for our entire program was cut so drastically (because of a little economic crash that happened back in 2008) that hundreds of teachers lost their jobs anyway. Since layoffs were based on seniority, I’m sure a lot of teachers who did well on the tests were let go.

It’s hard to be a teacher. I’m back in the classroom now, and I’m giving these tests once again to my students. At the beginning of the semester, I didn’t even have my textbooks yet. I teach a group of adult students who are among the most economically challenged in the nation, and they show up to class exhausted from working full days, cleaning houses, painting, working at fast food restaurants, or taking care of kids or elderly people for low wages. These tests take up my time, cause me to exercise organizational muscles that I don’t have, and put stress on me. And there’s the rest of my life, too, where I’m not planning lessons but trying to write articles or parent a kid.

Yeah, the tests are a pain in the butt. And of course, the test results aren’t the only measure of a teacher’s worth.

But I know from sifting through data that the tests show us useful information, if only we would use it — and of course, if we are administering the tests honestly and not gaming the system. I don’t have a problem with test results being tied to funding, but when you make educators desperate to earn money because regular funds are so scarce, that’s bound to incentivize dishonesty.

If we had better working conditions to start with, we might have time to read the story of what we do in test measurements, without the pressure of counting on the results for our survival.

Julie Cross is a screenwriter and graduate of UCLA Film School.

TIME Opinion

The Inevitable Rehabilitation of Ray Rice

From left: Janay and Ray Rice arrive for a hearing on Nov. 5, 2014 in New York City. Andrew Burton—Getty Images

Susanna Schrobsdorff is an Assistant Managing Editor at TIME. Previously, she was the Editorial Director for Newsweek Digital. She is the winner of a New York Press Club award for Outstanding Web Coverage and three Front Page Awards for cultural commentary and interactive journalism.

Any NFL team that hires Ray Rice in the next few months will get a little flak. But don’t be surprised if Rice makes a full comeback on the field and off.

Consider that just a few weeks ago, Mike Tyson, a convicted rapist and self-confessed wife batterer, was making small talk on the late-night circuit about his sold-out one-man show, directed by Spike Lee. The show is based on his memoir, Undisputed Truth, which has lines like: “How do you rape someone when they come to your hotel at two in the morning? There’s nothing open that late but legs.” There was the fun game he played on Jimmy Fallon called “Punch Out.” And last year, there was much mirth with Chelsea Handler about his three years in prison, drug tests and conjugal visits. Tyson has also joked about “socking” his ex-wife Robin Givens. According to a biography by his former friend Jose Torres, Tyson said the “best punch” he ever threw was at Givens—it was so hard she “bounced off two different walls” and was knocked out cold. (It’s worth noting that the New York Times’ Michiko Kakutani glossed over that abusive relationship, calling it a “tumultuous marriage,” in her review of Tyson’s book.)

The one journalist to refer to the fighter as a “convicted rapist” in a TV interview got a long, profanity-laced rant from Tyson, who called him “negative” and “a piece of sh-t.” That reporter, a Canadian broadcaster, later apologized for hurting Tyson’s feelings. Undisputed truth indeed.

The moral calculus of whom we shun and for how long is nothing short of perplexing. Let’s not forget that a decade of happy Jell-O salesmanship has intervened since the last time Bill Cosby was caught up in a maelstrom of rape accusations. And what about Chris Brown, who was convicted in 2009 of felony assault of his then-girlfriend Rihanna? Or actor Josh Brolin, who was charged with spousal battery in 2004? (His wife Diane Lane declined to press charges.) Neither man’s career seemed to lose much public momentum after those incidents. And there’s Sean Penn, who was charged with assault during his marriage to Madonna in 1988 and later pled to a lesser offense. Yes, there’s a huge difference between allegations, arrests and convictions, but those distinctions don’t seem to matter much when it comes to the vicissitudes of public opinion.

In Rice’s case, the main thing keeping him from total rehabilitation now that he’s been reinstated will likely be his recent lackluster playing record. Never mind the fact that half the planet has watched a video of him punching his then-fiancée so hard he knocked her unconscious, then dragging her limp body, face down, out of an elevator.

America loves a good comeback.


TIME Opinion

Why Ferguson Should Matter to Asian-Americans

A female protester raises her hands while blocking police cars in Ferguson, Mo. on Nov. 25, 2014. Adrees Latif—Reuters

Ferguson isn’t simply black versus white

A peculiar Vine floated around social media Monday evening following the grand jury announcement in Ferguson, Mo. The short video shows an Asian-American shopkeeper standing in his looted store, with a hands-in-his-pockets matter-of-factness and a sad slump to his facial expression. “Are you okay, sir?” an off-screen cameraman asks. “Yes,” the storeowner says, dejectedly.

The clip is only a few seconds, but it highlights the question of where Asian-Americans stand in the black and white palette often used to paint incidents like Ferguson. In the story of a white cop’s killing of a black teen, Asian-Americans may at first seem irrelevant. They are neither white nor black; they assume the benefits of non-blackness, but also the burdens of non-whiteness. They can appear innocuous on nighttime streets, but also defenseless; getting into Harvard is a result of “one’s own merit,” but also a genetic gift; they are assumed well-off in society, but also perpetually foreign. Asian-Americans’ peculiar gray space on the racial spectrum can translate to detachment from the situation in Ferguson. When that happens, the racialized nature of the events in Ferguson loses relevance to Asian-Americans. But seen with a historical perspective, it’s clear that such moments are decidedly of more colors than two.


Michael Brown’s death has several parallels in Asian-American history. The first to come to mind may be the story of Vincent Chin, a Chinese-American killed in 1982 by a Chrysler plant superintendent and his stepson, both white, both uncharged in a racially-motivated murder; like Brown, Chin unified his community to demand protection under the law. However, most direct parallels have often had one distinct dissimilarity to Ferguson: they have not spurred widespread resistance, nor have they engraved a visible legacy.

There is the story of Kuanchang Kao, an intoxicated Chinese-American fatally shot in 1997 by police threatened by his “martial arts” moves. There is Cau Bich Tran, a Vietnamese-American killed in 2003 after holding a vegetable peeler, which police thought was a cleaver. There is Fong Lee, a Hmong-American shot to death in 2006 by police who believed he was carrying a gun. None of the three cases resulted in criminal charges against the police or in public campaigns that turned the victim’s memory into a commitment to seek justice. One op-ed even declared how little America learned from Tran’s slaying.

While Ferguson captures the world’s attention, why do these Asian-American stories remain comparatively unknown?

One possible answer could be found in the model minority myth. The myth, a decades-old stereotype, casts Asian-Americans as universally successful, and discourages others — even Asian-Americans themselves — from believing in the validity of their struggles. But as protests over Ferguson continue, it’s increasingly important to remember the purpose of the model minority narrative’s construction. The doctored portrayal, which dates to 1966, was intended to shame African-American activists whose demands for equal civil rights threatened a centuries-old white society. (The original story in the New York Times thrust forward an image of Japanese-Americans quietly rising to economic successes despite the racial prejudice responsible for their unjust internment during World War II.)

Racial engineering of Asian-Americans and African-Americans to protect a white-run society was nothing new, but the puppeteering of one minority to slap the other’s wrist was a marked change. The apparent boost of Asian-Americans suggested that racism was no longer a problem for all people of color — it was a problem for people of a specific color. “The model minority discourse has elevated Asian-Americans as a group that’s worked hard, using education to get ahead,” said Daryl Maeda, a professor of ethnic studies at the University of Colorado, Boulder. “But the reality is that it’s a discourse that intends to pit us against other people of color. And that’s a divide and conquer strategy we shouldn’t be complicit with.”

Through the years, that idea erased from the public consciousness the fact that the Asian-American experience was once a story of racially motivated legal exclusion, disenfranchisement and horrific violence — commonalities with the African-American experience that became rallying points in demanding racial equality. That division between racial minorities also erased a history of Afro-Asian solidarity born of the shared experience of sociopolitical marginalization.

As with Ferguson, it’s easy to say the Civil Rights movement was entirely black and white, when in reality there were many moments of interplay between African-American and Asian-American activism. Japanese-American activist Yuri Kochiyama worked alongside Malcolm X until he was assassinated in front of her. Groups protesting America’s involvement in the Vietnam War, like the student-run Third World Liberation Front, united resisters across racial lines under a collective radical political identity. W.E.B. DuBois called on African Americans to support the 1920s Indian anti-colonial resistance, which he compared to whites’ oppression of blacks. Chinese-American activist Grace Lee Boggs, who struggled as a female scholar of color, found passion in fighting similar injustices against African-Americans alongside C.L.R. James in the 1950s. Though Afro-Asian solidarity wasn’t the norm in either group’s resistance movements, these examples highlight the power of cross-racial resistance and the hardships the two groups shared as non-whites.

The concept of non-whiteness is one way to begin the retelling of most hyphenated American histories. In Asian-American history, non-whiteness indelibly characterized the first waves of Asians arriving in the mid-1800s in America. Cases like People v. Hall (1854) placed them alongside unfree blacks, in that case by ruling that a law barring blacks from testifying against whites was intended to block non-white witnesses, while popular images documented Asian-American bodies as dark, faceless and indistinguishable — a racialization strengthened against the white supremacy of Manifest Destiny and naturalization law. Non-whiteness facilitated racism, but it in time also facilitated cross-racial opposition. With issues like post-9/11 racial profiling, anti-racism efforts continue to uphold this tradition of a shared non-white struggle.

“This stuff is what I call M.I.H. — missing in history,” said Helen Zia, an Asian-American historian and activist. “Unfortunately, we have generations growing up thinking there’s no connection [between African-Americans and Asian-Americans]. These things are there, all the linkages of struggles that have been fought together.”

The disassociation of Asian-Americans from Ferguson — not just as absent allies, but as forgotten legacies — is another chapter in that missing history. In the final moments of the Vine depicting an Asian-American shopkeeper’s looted store, the cameraman offers a last thought in a conversation that had fallen into a brief pause. “It’s just a mess,” the cameraman says. The observation, however simplistic, holds a truth: as an Asian-American who’s become collateral damage in a climate often painted as black-and-white, he, like all of Ferguson, must first clean up, and then reassess the unfolding reality outside.

TIME Opinion

This Thanksgiving Let’s Finally Stop the Nonsense About the Puritans and Pilgrims

Puritans Pray At Thanksgiving Dinner
Puritans in prayer as a man leads the blessing at Thanksgiving dinner, from an 1867 illustration Kean Collection / Getty Images

Our ancestors were aliens. It’s time we realize that

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Twenty years ago, James Loewen’s book Lies My Teacher Told Me addressed the paradox that American history was full of gripping stories that bored students silly. The problem, Loewen decided, lay with the textbooks, written in a tone he called ‘mumbling lecturer’ and filled with content off-putting even to adolescents of white European descent, whose ancestry received the most attention. For the rest, these books were either irrelevant or offensive.

Loewen began with the ‘first Thanksgiving’ in 1621, which in the approved version was full of beguiling assumptions. The Pilgrims, it seemed, had carved a world from a wilderness, and everyone else — the Spanish, the Dutch, the Indians — was invisible or passive or wrong. The story had become a pious morality play, a creation myth to preserve and propagate national values. And what about the colonists outside Massachusetts, those who did not sail on the Mayflower? The first permanent colony was Jamestown in 1607, but who had heard of the Susan Constant, Discovery or Godspeed?

I’d like to think that things have improved in US schools since Loewen was writing. But the myths he describes thrive elsewhere, perhaps because previous generations have cherished them into adulthood. Liberty and democracy are historical tripwires. Pilgrim ‘liberty’ was not something we would much fancy today. New Plymouth’s government was more like an oligarchy than a democracy, and the idea of freedom of speech was anathema. Passengers on the Mayflower drew up a compact, often painted as an egalitarian proto-Constitution, whereas in reality it was just a socially exclusive Old World company agreement. ‘In their pious treatment of the Pilgrims,’ Loewen argues, ‘history textbooks introduce the archetype of American exceptionalism.’

Exceptionalism implies that a people were special, selected by God or nature to blaze their own trail. The trend was established early on. Edward Johnson’s History of New England (1653) self-righteously praised migrants’ holy work. ‘Behold how the Lord of Hosts hath carried it on in despight of all opposition from his and their enemies,’ thundered Johnson, ‘in planting of his churches in the New World, with the excellent frame of their government.’ The tradition gathered strength in the post-revolutionary era. Thomas Jefferson viewed the Old World as a redundant Ruritania where brave souls were weighed down by the ‘monkish trammels of priests and kings.’

Some historians have taken an explicitly exceptionalist line; others fall into it. Even well-informed, liberal-minded scholars address readers as ‘we’ (as if all readers were US citizens), and treat Europe as mere backstory – a place to quit, not to communicate with, still less return to. It’s as if the Atlantic had baptismal powers: by crossing it, a person might be reborn and given the chance to realize dreams that elsewhere were fantasies. Puritans, ruthlessly persecuted in England, could worship freely in America.

The truth was different. Of the 10,000 ministers in England, many of whom had radical sympathies, only seventy-six emigrated, and a third of those had never been in trouble with the authorities. So much for the ‘Puritan diaspora.’ The Pilgrim Fathers who dominate our memory were a tiny unrepresentative minority. Even the Puritans, whom the Pilgrims resembled, were outnumbered four-to-one in New England, and made the other four-fifths resent them by laying down the law and monopolizing power. Of 350,000 English migrants between 1630 and 1700, only 21,000 went to New England anyway. Three times as many went to the Chesapeake, and most to the Caribbean. From the English perspective, the West Indies were by far the most significant American destination, likewise from the perspective of the New England farmers who supplied plantation owners with food so they could concentrate on growing sugar.

So what was in these people’s heads? The Pilgrims, and the Puritans of Massachusetts Bay Colony, were undeniably hostile to orthodox religion in England. But they did not despair of the motherland, and hoped their example would be salutary. This is the true meaning of the Puritan John Winthrop’s ‘city on a hill’ speech, so beloved by presidential speechwriters, who have seen in it the mirage of a uniquely precocious libertarian spirit. Then there was reverse migration. A fifth of New England’s colonists had returned home by 1640, as most had always intended. When the English civil war broke out in 1642, hundreds came back to fight for parliament in the place they called home. Finally, the typical migrant, in so far as such a type existed, was deeply anxious about ceasing to be English, and did everything possible to cling to his or her former cultural identity.

In the end, institutions have origins and events have causes. We just need to remember that nothing looked as it does now, that nothing was inevitable, and that early Americans were not like us. It matters that in 1620 it was England that was the global superpower, not the USA, which neither existed nor could be imagined. And it matters that when we mistake our minds for theirs, we do not reanimate the past: we fictionalize it. Our ancestors were aliens. ‘Were we to confront a seventeenth-century Anglo-American,’ the historian David Freeman Hawke once observed, ‘we would experience a sense of culture shock as profound as if we had encountered a member of any other of the world’s exotic cultures.’

Malcolm Gaskill is Professor of Early Modern History at the University of East Anglia. His book “Between Two Worlds: How the English Became Americans” is published by Basic Books.
