TIME Opinion

What History Books Should Say About Ferguson

Michael Brown's mother Lesley McSpadden cries outside the police station in Ferguson, Mo. on Nov. 24, 2014 after hearing the grand jury decision on her son's fatal shooting. Jewel Samad—AFP/Getty Images

How we tell the story of what happened in Missouri matters

When the grand jury decision not to indict officer Darren Wilson in the shooting of Mike Brown was announced late Monday evening in Ferguson, Mo., the world was watching. After hours of delay, misleading “Breaking News” banners, and a preemptive build-up of riot-management forces on Ferguson streets, we were more than ready to hear the outcome. But the lengthy remarks delivered by St. Louis County prosecuting attorney Robert McCulloch were far less welcome.

McCulloch padded his announcement with nearly 30 minutes of narrative, detailing his own particular version of events in Ferguson since August 9, 2014, when Brown, an unarmed black teenager, was fatally shot in the street. He complimented local authorities, conveniently choosing not to mention their internationally panned militarized assault on citizens in the days following Brown’s death. He praised his own management of the process, conveniently ignoring the fact that Attorney General Eric Holder had to step in for oversight and, ultimately, to launch a federal investigation because of a lack of trust in the local “process.” And while no indictment came for Darren Wilson, in McCulloch’s tale, the media, Twitter, eyewitnesses and even Mike Brown himself were tried and found guilty.

Why would McCulloch feel compelled to use his time on the national stage to recount the previous three months and tell his story? Because as a public official and an attorney, he understands the importance of the record: what account is written, what story is told, and, most of all, what remains in our collective memory. What matters most as the chaos of cultural moments and social movements unfolds is the history – or, more accurately, the telling of the history for generations to come.

As the late Nigerian writer Chinua Achebe tells us: “Until lions have their own historians, the history of the hunt will always glorify the hunter.” This history, an account of these past months, matters because as much as we want to believe that the problem upon which these events were built – violent, systemic racism – will be a distant memory by the time our children are themselves adults, the arc of the moral universe is long…very long. It is quite possible (read: highly likely) that the struggle to make a more perfect union will continue and that our grandchildren will turn to the history books for context for their own fight. There they will read about our turning-point moments – and about us: the activists, the officials, the media, the mothers and fathers, the sons and daughters, the heroes and villains of these perilous times.

So, for future generations, let us write some history:

Let the record show that after Mike Brown’s death, Ferguson became ground zero for a movement that had been building in cities all across America. It was not the isolated reaction of a group of disgruntled residents. Thanks to the fearlessness and raw emotion of the Ferguson community, it was the strike of the match that finally lit the flame for people nationwide who felt as if those sworn to protect them were hunting them instead.

Let the record show that a generation of young people rose up in this moment to lead. Tell the story of Ashley Yates, Tef Poe, and Tory Russell, brilliant young people ushering in a new era of activism, media, politics and community engagement. Tell the story of the organizations and networks that they are building in the face of a narrative that claims that young black people will loot and tweet but not strategize and work.

Let the record show that despite widespread celebrity disengagement from issues of racism, Grey’s Anatomy actor Jesse Williams has tirelessly forgone the glamour of his Hollywood career to be a bold, unapologetic presence in Ferguson and beyond, positioning him to be this generation’s Harry Belafonte.

Let the record show that national organizations like the nearly one million member ColorofChange.org worked in solidarity with Ferguson residents to support their leadership and also connect the events on the ground to a larger movement against injustice and police brutality.

Let the record show that members of rival St. Louis gangs stood together, united, protecting the elderly, women, children and physical property during the protests as a show of solidarity for their community.

Let the record show that it was not the Ferguson police department that made history but the hundreds of people who stood peacefully night after night for 15 weeks, chanting, talking and holding one another at youth-organized meetings and healing stations organized by poet Elizabeth Vega.

Let the record show social media’s role in raising the names and stories of unarmed black citizens who have been killed – just as it has for Ezell Ford, Rekia Boyd, Eric Garner, Oscar Grant, Renisha McBride, Jordan Davis, Aiyana Stanley-Jones and countless others.

Let the record show that those very same social media platforms and voices were responsible for shining light on a city using tanks and tear gas on its citizens while mainstream journalists were being arrested and shut out.

Yes, let the record show the rage. Do not be afraid to talk about the disproportionately small number of people who would rather break things – windows, shelves, fences – than stand for the breaking of more people.

And most importantly, let the record show that the George Zimmerman verdict and the Darren Wilson decision are not evidence of black people’s delusions of racism but instead of how deeply entrenched bias and hatred are in a system that was built on, you guessed it, state-sanctioned racism.

Long after the facts of the case have been parsed and forgotten, long after Mike Brown t-shirts are faded and Darren Wilson rides off into a sunset that still hides George Zimmerman, there will be a record.

And if written correctly, it will tell the story of a people who refused to let America run from her promise of justice and equal protection under the law; citizens who used every awful tragedy, every imperfect victim, every messy media firestorm, every conflicting account, every questionable death, every chance they got to scream a truth they know deep in their bones: the police state is dangerous and unequal.

So, dear lions. Those of you black, brown, female, gay, poor, and oppressed; those feared and hunted by a system that won’t recognize its flaws, commit now to being historians. Tell and claim the parts of the Ferguson story that didn’t make it into the President’s remarks or McCulloch’s recap or the 24-hour news coverage.

If we do this, history will undoubtedly show what the state never has: that black lives – and all lives – matter.


TIME Opinion

The Reason Every One of Us Should Be Thankful

Illustration of preparing the Thanksgiving meal circa 1882. Kean Collection / Getty Images

As Thanksgiving approaches, a little bit of historical context goes a long way

Astronomy is a historical science because the distance scales involved are so immense that to look out into space is to look back into time. Even at the almost unfathomable speed of light — 300,000 kilometers per second — the sun is eight light minutes away, the nearest star is 4.3 light years away, the nearest large galaxy, Andromeda, is about 2.5 million light years away and the farthest object ever observed is about 13.8 billion light years away. Astronomers call this way of describing such distances “lookback time.”

The concept is not limited to astronomy: current events also have their own lookback times, accounting for what gave rise to them. Just as looking at a star now actually involves seeing light from the past, looking at the world today actually involves looking at the reverberations of history. We have to think about the past in order to put current events into proper context, because that’s the only way to track human progress.

Consider the longing many people have for the peaceful past, filled with bucolic scenes of pastoral bliss, that existed before overpopulation and pollution, mass hunger and starvation, world wars and civil wars, riots and revolutions, genocides and ethnic cleansing, rape and murder, disease and plagues, and the existential angst that comes from mass consumerism and empty materialism. Given so much bad news, surely things were better then than they are now, yes?

No.

Overall, there has never been a better time to be alive than today. As I document in my 2008 book The Mind of the Market and in my forthcoming book The Moral Arc, if you lived 10,000 years ago you would have been a hunter-gatherer who earned the equivalent of about $100 a year — extreme poverty is defined by the United Nations as less than $1.25 a day, or $456 a year — and the material belongings of your tiny band would have consisted of about 300 different items, such as stone tools, woven baskets and articles of clothing made from animal hides. Today, the average annual income in the Western world — the U.S. and Canada, the countries of the European Union, and other developed industrial nations — is about $40,000 per person per year, and the number of available products is over 10 billion, with the universal product code (barcode) system having surpassed that number in 2008.

Poverty itself may be going extinct, and not just in the West. According to UN data, in 1820 some 85 to 95% of the world’s people lived in poverty; by the 1980s that figure was below 50%, and today it is under 20%. Yes, 1 in 5 people living in poverty is too many, but if the trends continue, then by 2100, and possibly even by 2050, no one in the world will be poor, including in Africa.

Jesus said that one cannot live on bread alone, but our medieval ancestors did nearly that. Over 80% of their daily calories came from the nine loaves a typical family of five consumed each day. Food also devoured 60 to 80% of a family’s income, leaving next to nothing for discretionary spending or retirement after housing and clothing expenses. Most prosperity has happened over the two centuries since the Industrial Revolution, and even more dramatic gains have been enjoyed over the last half-century. From 1950 to 2000, for example, the per capita real Gross Domestic Product of the United States went from $11,087 (adjusted for inflation and computed in 1996 dollars) to $34,365 – more than a threefold increase in comparable dollars. This has allowed more people to own their own homes, and for those homes to double in size even as family size declined.

For centuries human life expectancy bounced around between 30 and 40 years, until the average went from 41 in 1900 to the high 70s and low 80s in the Western world in 2000. Today, no country has a lower life expectancy than the country with the highest life expectancy did 200 years ago. Looking back a little further, around the time of the Black Death in the 14th century, even if you escaped one of the countless diseases and plagues that were wont to strike people down, young men were 500 times more likely to die violently than they are today.

Despite the news stories about murder in cities like Ferguson and rape on college campuses, crime is down. Way down. After the crime wave of the 1970s and 1980s, homicides plummeted between 50 and 75% in such major cities as New York, Los Angeles, Boston, Baltimore and San Diego. Teen criminal acts fell by over 66%. Domestic violence against women dropped 21%. According to the U.S. Department of Justice, the overall rate of rape declined 58% between 1995 and 2010, from 5.0 per 1,000 women age 12 or older to 2.1. And on Nov. 10, 2014, the FBI reported that in 2013, across more than 18,400 city, county, state, and federal law enforcement agencies that report crime data to the FBI, every crime category saw declines.

What about the amount of work we have today compared with that of our ancestors? Didn’t they have more free and family time than we do? Don’t we spend endless hours commuting to work and toiling in the office until late into the neon-lit night? Actually, the total hours of life spent working has been steadily declining over the decades. In 1850, for example, the average person invested 50% of his or her waking hours in the year working, compared to only 20% today. Fewer working hours means more time for doing other things, including doing nothing. In 1880, the average American enjoyed just 11 hours per week in leisure time, compared to today’s 40 hours per week.

That leisure time can be spent in cleaner environments. In my own city of Los Angeles, for example, in the 1980s I had to put up with an average of 150 “health advisory” days per year and 50 stage-one ozone alerts caused by all the fine particulate matter and other pollutants in the air—dirt, dust, pollens, molds, ashes, soot, aerosols, carbon monoxide, sulfur dioxide and nitrogen oxides—AKA smog. Today, thanks to the Clean Air Act and improved engine and fuel technologies, those numbers have plummeted: in 2013 there was only one health advisory day and not a single stage-one ozone alert. Across the country, even with the doubling of the number of automobiles and an increase of 150% in the number of vehicle-miles driven, smog has diminished by a third, acid rain by two-thirds, airborne lead by 97%, and CFCs are a thing of the past.

Today’s world has its problems — many of them serious ones — but, while we work to fix them, it’s important to see them with astronomers’ lookback-time eyes. With their historical context, even our worst problems show that we have made progress.

Rewind the tape to the Middle Ages, the Early Modern Period or the Industrial Revolution and play it back to see what life was really like in a world lit only by fire. Only the tiniest fraction of the population lived in comfort, while the vast majority toiled in squalor, lived in poverty and expected half their children would die before adulthood. Very few people ever traveled beyond the horizon of their landscape, and if they did it was either on horseback or, more likely, on foot. No Egyptian pharaoh, Greek king, Roman ruler, Chinese emperor or Ottoman sultan had anything like the most quotidian technologies and public-health benefits that ordinary people take for granted today. Advances in dentistry alone should encourage us all to stay away from time machines.

As it turns out, these are the good old days, and we should all be thankful for that.

Michael Shermer is the Publisher of Skeptic magazine, a monthly columnist for Scientific American, and a Presidential Fellow at Chapman University. He is the author of a dozen books, including Why People Believe Weird Things and The Believing Brain. His next book, to be published in January, is entitled The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom.

TIME

When One Twin is More Academically Gifted

My son tested into the gifted program at school, but my daughter didn't. Should I split them up?

Splitting up twins in school is never easy. But splitting up twins so that one goes on the advanced learning track and the other follows the regular program is one of the most agonizing decisions a parent can face. And no amount of Internet searches will give you helpful advice. The consensus: Figure it out, parents. That’s what you’re (not) paid for.

As you may have guessed, I have twins, a boy and a girl, and they’re in the first grade. I happen to be a fraternal twin myself, so I’m sensitive to always being compared to a sibling. My son is like his engineer father – completely committed to being a lovable nerd. The other day he found a book of math problems at Barnes and Noble and was so excited it was as if Santa arrived, handed him a gift, and then let him ride a reindeer. My daughter is like her freelance writer mother – studying is not really her thing. She reminds me of the prince in Monty Python and the Holy Grail who is to inherit a large amount of land and says, “But I don’t want any of that. I’d rather sing!” That’s my girl.

We were first introduced to our school’s Spectrum (advanced learning) program last year in Seattle, Washington, at the beginning of kindergarten. The kids could be tested that year and would enter the program—or not—in first grade. I hadn’t really thought about whether to have my kids tested. Other parents apparently had. One asked: “Should we have our child practice at home with the same kind of mouse they’re going to use in the test?”

In the beginning, my husband and I laughed at the idea of advanced learning in the first grade. We joked about “Level Two Crayons” and “Expert Alphabet.” But then, as the day to decide about testing came closer, we started hearing from our son’s teacher about how gifted he was. What first grader wants to practice math and reading on his own during the evenings and weekends? My son. And then there was my daughter, who was right on track, but, like most kids her age, was happy to leave school stuff at school. “Let’s just get them both tested and see what happens,” I said.

As far as my kids knew, they were just going to school to talk about what they know and what they don’t. They were never told that the results of the test had any sort of consequences and weren’t the least bit curious. But when we got the results–my son tested into the advanced program and my daughter didn’t–I immediately became anxious. I wanted to let my son move into the advanced program because I knew he would love it and thrive. But I worried for my vibrant, passionate daughter who at the age of six doesn’t think she has any limits. How was I going to separate her from her brother because he could do something better?

As a child I never felt smart enough. Not because of my twin sister, but because of my mother, who was brilliant. She used her intelligence to get off of the Kentucky farm where she grew up and into a New York City law firm. She placed a lot of value on the power of education and what good grades could do. I felt perpetually unable to meet her high expectations. Now I had a daughter who, in kindergarten, was already resistant to doing her reading homework. I was terrified that placing her brother in a higher academic track would affect my daughter’s self-esteem.

I contacted Christina Baglivi Tingloff from the site Talk About Twins. She’s a mother of adult twins and author of six books, including Double Duty and Parenting School-Age Twins and Multiples. “It’s tough when twins differ in abilities,” she says, “and I’d say that it’s the biggest challenge of parenting multiples. [But] kids take their cues from their parents. If you make this a non-issue in your household, I think your kids will follow suit.”

My husband and I have no lofty goals for our kids besides wanting them to be able to pay their own bills, not hurt themselves or anyone else, and be happy. “So many parents of twins try to even the playing field,” says Tingloff. “In my opinion, that’s a bad course of action because…kids then never develop a strong emotional backbone. Your job as a parent is to help them deal with the disappointments in life.”

We ended up putting our son in the Spectrum program and our daughter in the regular learning track. In the years to come, I will make sure that they understand that advanced or regular doesn’t mean better or worse, it just means different. I want both of my children to do the best they can, whether that means taking advanced classes or singing the hell out of the school musical.

When my daughter wanders through the house making up her own songs and singing at the top of her voice, I support her…most of the time. “Really encourage your daughter in the arts,” says Tingloff. “Find her spotlight. At some point her brother will look at her accomplishments and say, ‘Wow, I can’t do that.'” While I had been worrying all this time about my daughter feeling outshined by her brother, I had never considered that he might also feel outperformed by her.

Despite all of my talk about how my daughter’s interests were every bit as valid as her brother’s, I had not been treating them the same. I saw the dance and drama as diversions and hobbies. I never gave those talents the respect that I gave to her brother’s academic interests.

Now that I am more aware of how I have been valuing their different strengths, I’ll be able to give my daughter’s interests the same amount of focus and praise as her brother’s. Hopefully, I can assure them that our only concern is their happiness. Then my husband and son can go do math problems together, and take things apart to see how they work, and my daughter and I will lie on the grass and find shapes in the clouds while we wonder about the world and sing.

The truth is, both my kids are gifted.


TIME Opinion

Confessions of a Lumbersexual

Jordan Ruiz—Getty Images

Why plaid yoga mats and beards are the future

Several years ago I was riding in a van with two female friends in the front seats when one of them pointed out the window and yelled “Wait! Slow down…is that him?” We were passing the bar that employed her ex-boyfriend.

“I don’t know,” said her friend who was driving. “A guy in Brooklyn with a beard and a plaid shirt? Could be anyone.”

I looked down over my beard at my shirt and both girls looked at me and we all laughed.

I’ve had a beard most of my adult life and my wardrobe consists largely of cowboy-cut plaid shirts and Wrangler blue jeans. On cold days I wear a big Carhartt coat into the office. In my youth in Oklahoma I did cut down some trees and split firewood for use in a house I really did grow up in, but in those days I dressed like a poser gutter punk. I nurture an abiding love for outlaw country and bluegrass, though, again, during my actual lumberjacking days it was all Black Flag, Operation Ivy and an inadvisable amount of The Doors.

After a decade living in urban places like Brooklyn and Washington, I still keep a fishing rod I haven’t used in years and woodworking tools I shouldn’t be trusted with, and when I drink my voice deepens into a sort of a growl the provenance of which I do not know. I like mason jars, craft beer and vintage pickup trucks. An old friend visiting me a few years ago commented, as I propped a booted foot against the wall behind me and adjusted the shirt tucked into my blue jeans, that I looked more Oklahoma than I ever did in Oklahoma.

I am a lumbersexual.

The lumbersexual has been the subject of much Internet musing in the last several weeks. The term is a new one on me but it is not a new phenomenon. In 2010 Urban Dictionary defined the lumbersexual as, “A metro-sexual who has the need to hold on to some outdoor based ruggedness, thus opting to keep a finely trimmed beard.” I was never a metrosexual and I’m actually most amused by Urban Dictionary’s earliest entry for lumbersexual, from February 2004: “A male who humps anyone who gives him wood.” But I do think defining the lumbersexual as a metrosexual grasping at masculinity gets at something.

It doesn’t take a lot of deep self-reflection to see that my lumbersexuality is, in part, a response to the easing of gender identities in society at large over the last few decades. Writing for The New Republic nearly 15 years ago, Andrew Sullivan observed “many areas of life that were once ‘gentlemanly’ have simply been opened to women and thus effectively demasculinized.” The flipside of this happy consequence of social progress is a generation of men left a bit rudderless. “Take their exclusive vocations away, remove their institutions, de-gender their clubs and schools and workplaces, and you leave men with more than a little cultural bewilderment,” writes Sullivan.

If not a breadwinner, not ogreishly aggressive, and not a senior member in good standing at a stuffy old real-life boy’s club, what is a man to be?

On the other hand, the upending of gender norms frees men in mainstream culture to do things verboten by a retrograde man-code once enforced by the most insecure and doltish among us. We carry purses now (and call them murses, or satchels, but don’t kid yourselves, fellas). We do yoga. That the ancient core workout is so associated with femininity that pop culture has invented the term “broga” only goes to show what a sorry state masculinity is in. The lumbersexual is merely a healthier expression of the same identity crisis.

Which is, I think (?), why I dress like a lumberjack (and a lumberjack from like 100 years ago, mind you; real lumberjacks today, orange-clad in helmets and ear protection, do not dress like lumbersexuals). As a 21st-century man who does not identify with the pickup artist thing or the boobs/cars/abs triad of masculinity on display in most 21st-century men’s magazines (Maxim et al), is not particularly fastidious or a member of any clearly identifiable subculture and who is as attracted to notions of old-timey authenticity as anyone else in my 20s-30s hipster cohort (all of you are hipsters; get over it), I guess this is just the fashion sense that felt most natural. I am actually fairly outdoorsy, in a redneck car-camping kind of way. Lumbersexuality just fit right, like an axe handle smoothed out by years of palm grease or an iPhone case weathered in all the right places to the shape of my hand.

There is a dark side to this lumbersexual moment, however. It’s an impulse evident in Tim Allen’s new show Last Man Standing. Whereas in the 1990s, Tim the Tool-Man Taylor from Home Improvement was a confident and self-effacing parody of the Man Cave, complete with silly dude-grunting and fetishizing of tools, Mike Baxter, played by Tim Allen in Last Man Standing, is an entirely un-self-aware, willfully ignorant reactionary. The central theme of the show is Baxter in a household full of women struggling to retain his masculinity, which is presumed to be under assault because of all the estrogen around. He does this through all manner of posturing, complaining and at times being outright weird. In an early episode, Baxter waltzes into the back office at his job in a big-box store modeled on Bass Pro Shops and relishes the fact that it “smells like balls in here.” The joke is a crude attempt at celebrating maleness but it rings distressingly hollow to anyone who has spent any time in rooms redolent with the scent of actual balls. In later seasons the show softened, but the central concern of a man whose masculinity is under assault because he is surrounded by women speaks to this moment in our popular culture.

If my beard is a trend-inspired attempt to reclaim a semblance of masculinity in a world gone mad then so be it. Beats scrotum jokes.

TIME Opinion

Ask an Ethicist: Can I Still Watch The Cosby Show?

Bill Cosby sits for an interview about the exhibit, Conversations: African and African-American Artworks in Dialogue, at the Smithsonian's National Museum of African Art in Washington on Nov. 6, 2014. Evan Vucci—AP

I can get over the fact that Martin Luther King, Jr. cheated on his wife, but I don’t care that Mussolini made the trains run on time. Making that call is a moral calculus: when do the negative aspects of a public figure outweigh the positive? Granted, in Bill Cosby’s case, we’re talking about a comedian, but the question is relevant for The Cosby Show‘s legacy. Should I think less of The Cosby Show‘s power to teach and to change perceptions of race in America if it turns out Bill Cosby is a rapist?

Like most people, when I first heard word of allegations that Bill Cosby had raped multiple women, I impulsively pushed them to the back of my mind. For me, The Cosby Show’s legacy is personal. As a kid, the young Huxtables were among the few children on television with faces that looked like mine living well-adjusted upper middle class existences that resembled my own. When I considered my Cosby experience alongside the actor’s on-screen persona, a doctor and family man who combined life lessons with old-fashioned humor, I intuitively knew that he couldn’t be a serial rapist.

But eventually emotion gave way to reason. Seven women with little to gain have reported that Cosby committed the same heinous crime, rape, in the same way. So if someone like me, a lifelong fan, believes these women, where does that leave The Cosby Show? Are all of Cosby’s indelible life lessons suddenly moot? Does secretly watching an episode when no one is around condone sex crimes?

To help me think through these questions, I turned to ethicists and academics.

First, there’s the question of morality versus art. To condemn his actions, do I also have to repudiate the man and his work? I took this up with Jeremy David Fix, a fellow at Harvard’s Safra Center for Ethics who studies moral philosophy: Would continuing to watch The Cosby Show harm anyone, even indirectly?


On the one hand, watching the show helps in some small way line Bill Cosby’s pockets via residuals. On the other hand, with an estimated net worth of over $350 million at the age of 77, he can already rest assured that he’ll live the rest of his life comfortably. But Harvard’s Fix asks a good question: What about the women who have been assaulted—what sort of message does it send if I keep supporting Cosby, even indirectly? I had to give up watching, I started to conclude. Otherwise, I might inadvertently send the signal that I think sexual assault is something that can be treated flippantly.

But how do I weigh the message that watching the show might send victims against the still-needed message that it sends to America at-large about race? I had finally stumped Fix. So I turned to historians and other thinkers to talk about the show’s legacy and whether it still has a positive role to play in discussions about race.

Joe Feagin, a sociologist who has written about The Cosby Show, talks eloquently about the indelible impression the show left on the country. Black Americans tend to celebrate the achievement of a top-rated show featuring a black cast in a positive light. They will probably keep doing that even if they condemn its creator. White Americans tend to celebrate the show as evidence that African-Americans can succeed in middle class life, Feagin said. While that view leaves society’s entrenched racism unaddressed, I’d still take Cosby over Sanford and Son. Let’s face it, American residential communities are still largely racially homogenous, and it would certainly benefit future generations to see black families like the Huxtables.

So I tried to convince myself that somehow we could condemn Cosby himself while continuing to watch the show. That is, I hoped we could separate Cliff Huxtable from Bill Cosby. But in the end, I don’t think we can any more. The two are so closely linked that as I tried to watch an episode of The Cosby Show this week, the image of Cliff kept reminding me of the actor’s pathetic silence in response to questions about the accusations against him. If that distracted me, I can only imagine how an assault survivor would feel. The show has positively affected millions of Americans, and that legacy remains intact, but maybe it’s time for a new show to teach us about race. It’s a little overdue anyway.

TIME Opinion

Is Obama Overreaching on Immigration? Lincoln and FDR Would Say ‘No’

President Barack Obama announces executive actions on immigration during a nationally televised address from the White House in Washington, D.C., on Nov. 20, 2014 Jim Bourg—AP

Like Lincoln and Roosevelt before him, Obama occupies the White House in a time of great crisis

Last night, President Obama announced new steps that will allow about five million undocumented immigrants to obtain work permits and live free of the threat of imminent deportation. Given that we now have an estimated 10–11 million such people within our nation and that many of them clearly will never leave, this seems a reasonable first step towards giving them all some kind of legal status. But, because of the anti-immigration stance of the Republican Party, which will entirely control Congress starting on Jan. 3, the President will have to base this step solely on executive power. And even before the President spoke, various Republicans had accused him of acting like an emperor or a monarch and warned of anarchy and violence if he went through with his plans.

There are, in fact, substantial legal and historical precedents, including a recent Supreme Court decision, that suggest that Obama’s planned actions would be neither unprecedented nor illegal. This is of course the President’s own position, that no extraordinary explanation is needed—yet we can also put his plans in the broader context of emergency presidential powers, which in fact have a rich history in times of crisis in the United States. It is not accidental that this issue of Presidential power is arising now, because it will inevitably arise—as the founders anticipated—any time a crisis has made it unusually difficult to govern the United States. Like Abraham Lincoln and Franklin Roosevelt, Obama occupies the White House in a time of great crisis, and therefore finds it necessary to take controversial steps.

The Founding Fathers distrusted executive authority, of course, because they had fought a revolution in the previous decade against the arbitrary authority of King George III. But, on the other hand, they had come to Philadelphia in 1787 because their current government, the early version of the U.S. system established by the Articles of Confederation, was so weak that the new nation was sinking into anarchy. So they created a strong executive and a much more powerful central government than the Articles of Confederation had allowed for—and having lived through a revolution, they also understood that governments simply had to exercise exceptional powers in times of emergency.

They made one explicit reference to an emergency power, authorizing the federal government to suspend the right of habeas corpus—freedom from arbitrary arrest—“in cases of rebellion or invasion [when] the public safety may require it.” Nearly 80 years later, when the southern states had denied the authority of the federal government, Abraham Lincoln used this provision to lock up southern sympathizers in the North, and eventually secured the assent of Congress to this measure. He also used traditional powers of a government at war—including the confiscation of enemy property—to emancipate the slaves within the Confederacy in late 1862. With the help of these measures, the North won the war and the Union survived—apparently exactly what the Founders had intended.

When Franklin Roosevelt took the oath of office in the midst of a virtual economic collapse in March of 1933, he not only declared that the nation had “nothing to fear but fear itself,” but also made clear that he would take emergency measures on his own if Congress did not go along. That spring, the country was treated to a remarkable movie, Gabriel Over the White House, in which the President did exactly that—but as it turned out, the Congress was more than happy to go along with Roosevelt’s initial measures. It wasn’t until his second term that Congress turned against him; he, like Obama, used executive authority to find new means of fighting the Depression. In wartime he also claimed and exercised new emergency powers in several ways, including interning Japanese-Americans, this time without a formal suspension of habeas corpus. In retrospect both a majority of Americans and the courts have decided that some of these measures, especially the internment, were unjust and excessive, but the mass of the people accepted them in the midst of a great war as necessary to save the country, preferring to make amends later on. Though opponents continually characterized both Lincoln and FDR as monarchs and dictators trampling on the Constitution, those are judgments which history, for the most part, has not endorsed.

As the late William Strauss and Neil Howe first pointed out about 20 years ago in their remarkable books, Generations and The Fourth Turning, these first three great crises in our national life—the Revolutionary and Constitutional period, the Civil War, and the Depression and the Second World War—came at regular intervals of about 80 years. Sure enough, just as they had predicted, the fourth such great crisis came along in 2001 as a result of 9/11. President Bush immediately secured from Congress the sweeping authority to wage war almost anywhere, and claimed emergency powers to detain suspected terrorists at Guantanamo. (Some of those powers the Supreme Court eventually refused to recognize.) The war against terror was, however, only one aspect of this crisis. The other is the splintering of the nation, once again, into two camps with largely irreconcilable world views, a split that has paralyzed our government to an extent literally never before seen for such a long period. Immigration is only one of several problems—including climate change, inequality and employment—that the government has not been able to address by traditional means because the Republican Party has refused to accept anything President Obama wants to do.

The Founders evidently understood that when the survival of the state is threatened, emergency measures are called for. We are not yet so threatened as we were in the three earlier crises, but our government is effectively paralyzed. Under the circumstances it seems to me that the President has both a right and a duty to use whatever authority he can find to solve pressing national problems. Congressional obstructionism does not relieve him of his own responsibilities to the electorate.

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME Opinion

Bill Cosby, Camille Cosby and the Oppressive Power of Silence

What the couple's response to allegations of sexual assault reveals about the scandal

Camille Cosby smiles, uncomfortably shifting in her chair. Staring off camera, switching positions, silent. In the latest contribution to the Bill Cosby saga, we see husband and wife side by side as he addresses the very act of being questioned about the numerous rape allegations against him in an AP interview (above). Mrs. Cosby continues to smile and looks away from the reporter several times, both she and her husband presuming that the cameras have stopped rolling. I will not read into her silence. I will not draw meaning from this woman’s thoughts and decisions other than to say that in the watching, the silence is palpable, wince-inducing and profoundly painful.

That exchange highlights the most meaningful currency in this 30-plus-year drama that is just now seeing its climax unfold on the public stage: silence. At every turn, it is the silence that serves as a proxy for power in the story of Bill Cosby, his alleged sexual deviance and the current downward spiral of public opinion. Silence here, as in most cases, represents the power wielded and power taken by those who are seen as, well, powerful.

In Cosby’s story we find accusations of women being silenced for decades by threats, lawyers, fear and a generally defensive public, who until now were uninterested in being awakened from sweet dreams of their TV father.

The NPR audio interview released last week showcases Cosby’s clearly pre-determined response to the softest, almost nervous questions about the rape allegations: deafening silence.

This should not be viewed as the mature response of a well-respected, integrity-filled man (and in the case of his wife, a beloved, regal woman) attempting to maintain dignity and stay above the fray. It should be seen as what it is: a power move by someone so arrogant that he thinks he shouldn’t even be asked about the fact that 15 women are accusing him of a horrific crime.

The silence of those publicly associated with Mr. Cosby is also noticeable, as comedians who revere him and actors and actresses whose careers were made by him avoid addressing the not-new bombshell like the plague.

And even in the most recent AP video, as Mrs. Cosby sits idly by, the central tension between Mr. Cosby and the reporter revolves around him pressuring the journalist into, what? Silence. He calmly yet persistently requests the editing out of his own “no comment” response to the reporter’s request for a statement. Be clear: In the actual interview, Mr. Cosby refused to discuss it, saying “I don’t talk about that.” It is that exchange that he wants scrubbed from the record. He even wants his silence silenced.

History teaches us that silence is often the most effective tool of power. It forces others into submission. It attempts to control a narrative. It hides things. And it is often a strategic attempt on the part of the powerful to shame other voices – the victims, the oppressed, the challengers, the inquisitors – into a similar silence.

But right now, as Missouri police use military tactics and tear gas to force silence upon outraged but peaceful Ferguson protesters, and rich executives respond to female reporters who won’t stop talking by threatening them with personal attacks dug up by private investigators (see the latest Uber controversy), silence is not OK.

And that is why, despite our national love of Dr. Heathcliff Huxtable, silence is not an option. Not for me. Not for his countless fans. Not for a media finally ready to deal with the dirt and thankfully, not for the women who are sharing their painful, private stories. It is time to counter his silence with other forms of power. The power of our common sense to see behind a made-for-TV character. The power of these women to, at the very least, have their voices heard. And the power for all of us to seek truth and justice, however unsettling it may be.

TIME Opinion

Feminist Is a 21st Century Word

Gloria Steinem, Jane Fonda, and Robin Morgan, Co-Founders of the Women's Media Center
From left: Gloria Steinem, Jane Fonda and Robin Morgan, co-founders of the Women's Media Center on CBS This Morning in New York City on Sept. 18, 2013 CBS Photo Archive/Getty Images

Robin Morgan is an author, activist and feminist. She is also a co-founder, with Gloria Steinem and Jane Fonda, of the Women's Media Center

I know, I know, TIME’s annual word-banning poll is meant as a joke, and this year’s inclusion of the word feminist wasn’t an attempt to end a movement. But as a writer — and feminist who naturally has no sense of humor — banning words feels, well, uncomfortable. The fault lies in the usage or overusage, not the word — even dumb or faddish words.

Feminist is neither of those. Nevertheless, I once loathed it. In 1968, while organizing the first protest against the Miss America Pageant, I called myself a “women’s liberationist,” because “feminist” seemed so 19th century: ladies scooting around in hoop skirts with ringlet curls cascading over their ears!

What an ignoramus I was. But school hadn’t taught me who they really were, and the media hadn’t either. We Americans forget or rewrite even our recent history, and accomplishments of any group not pale and male have tended to get downplayed or erased — one reason why Gloria Steinem, Jane Fonda and I founded the Women’s Media Center: to make women visible and powerful in media.

No, it took assembling and researching my anthology Sisterhood Is Powerful to teach me about the word feminism. I had no clue that feminists had been a major (or leading) presence in every social-justice movement in the U.S. time line: the Revolutionary War, the campaigns to abolish slavery, debtors’ prisons and sweatshops; mobilizations for suffrage, prison reform, equal credit; fights to establish social security, unions, universal childhood education, halfway houses, free libraries; plus the environmental, antiwar and peace movements. And more. By 1970, I was a feminist.

Throughout that decade, feminism was targeted for ridicule. Here’s how it plays: first they ignore you, then laugh at you, then prosecute you, then try to co-opt you, then — once you win — they claim they gave you your rights: after a century of women organizing, protesting, being jailed, going on hunger strikes and being brutally force-fed, “they” gave women the vote.

We outlasted being a joke only to find our adversaries had repositioned “feminist” as synonymous with “lesbian” — therefore oooh, “dangerous.” These days — given recent wins toward marriage equality and the end of “don’t ask don’t tell” in the military, not to mention the popularity of Orange Is the New Black — it’s strange to recall how, in the ’70s, that connotation scared many heterosexual women away from claiming the word feminist. But at least it gave birth to a witty button of which I’ve always been especially fond: “How dare you assume I’m straight?!”

Yet in the 1980s the word was still being avoided. You’d hear maddening contradictions like “I’m no feminist, but …” after which feminist statements would pour from the speaker’s mouth. Meanwhile, women’s-rights activists of color preferred culturally organic versions: womanist among African Americans, mujerista among Latinas. I began using feminisms to more accurately depict and affirm such a richness of constituencies. Furthermore, those of us working in the global women’s movement found it fitting to celebrate what I termed a “multiplicity of feminisms.”

No matter the name, the movement kept growing. Along the way, the word absorbed the identity politics of the 1980s and ’90s, ergo cultural feminism, radical feminism, liberal/reform feminism, electoral feminism, academic feminism, ecofeminism, lesbian feminism, Marxist feminism, socialist feminism — and at times hybrids of the above.

Flash-forward to today when, despite predictions to the contrary, young women are furiously active online and off, and are adopting “the F word” with far greater ease and rapidity than previous feminists. Women of color have embraced the words feminism and feminist as their own, along with women all over the world, including Afghanistan and Saudi Arabia.

As we move into 2015, feminism is suddenly hot; celebrities want to identify with it. While such irony makes me smile wryly, I know we live in a celebrity culture and this brings more attention to issues like equal pay, full reproductive rights, and ending violence against women. I also know that sincere women (and men of conscience), celebs or not, will stay with the word and what it stands for. Others will just peel off when the next flavor of the month comes along.

Either way, the inexorable forward trajectory of this global movement persists, powered by women in Nepal’s rice paddies fighting for literacy rights; women in Kenya’s Green Belt Movement planting trees for microbusiness and the environment; Texas housewives in solidarity with immigrant women to bring and keep families together; and survivors speaking out about prostitution not being “sex work” or “just another job,” but a human-rights violation. From boardroom to Planned Parenthood clinic, this is feminism.

The dictionary definition is simple: “the theory of the political, economic, and social equality of the sexes.” Anyone who can’t support something that commonsensical and fair is part of a vanishing breed: well over half of all American women and more than 30% of American men approve of the word — the percentages running even higher in communities of color and internationally.

But I confess that for me feminism means something more profound. It means freeing a political force: the power, energy and intelligence of half the human species hitherto ignored or silenced. More than any other time in history, that force is needed to save this imperiled blue planet. Feminism, for me, is the politics of the 21st century.

Robin Morgan, the author of 22 books, hosts Women’s Media Center Live With Robin Morgan (syndicated radio, iTunes, and wmcLive.com).

TIME Opinion

The Little Mermaid: Not as Sexist as You Thought It Was

Disney

"Dinglehoppers work, guys."

In celebration of The Little Mermaid‘s 25th anniversary — it was originally released Nov. 17, 1989 — Eliana Dockterman and Laura Stampler rewatched a film they hadn’t seen in about a decade and a half. They were pleasantly surprised.

Eliana Dockterman: So we just watched The Little Mermaid. I expected it to be really sexist because of articles like this and this and this.

Laura Stampler: It wasn’t.

ED: So, yeah, the fact that she can’t talk isn’t great. But it’s Ursula—the villain—who tells Ariel that men don’t value women talking so it won’t matter if Ariel loses her voice.

LS: And yes, it’s also not great that Ariel is willing to give up her voice for the chance to get a man who plays a mean flute — watch the movie again, he plays the flute a lot — but she is a 16-year-old girl. As a former 16-year-old girl myself, I can confirm we don’t always make the greatest of choices in times of crushing.

ED: And in fact the reason Eric won’t marry her right off the bat is because he fell in love with her voice, not her looks. That’s good, right?

LS: Furthermore, he flat-out says he believes that Ariel lost her voice as a part of some big event. So when he doesn’t want to immediately lock lips with the feisty mute, it’s because he doesn’t want to take advantage of a recent trauma victim.

ED: Basically you think Eric is the perfect guy.

LS: A prince among Disney princes. He loves his dog, which is a good sign, and he’s into it when she has the carriage jump over that ravine. He likes daring, empowered women. And, to counter stereotypes, he falls in love with his savior. Eric was the damsel in distress and Ariel rescued him from drowning during a storm.

ED: It’s true. This is why The Little Mermaid is (secretly?) feminist. She saved him first in that shipwreck and then again from being zapped by Ursula with Triton’s trident at the end. He ultimately kills Ursula, but she wins 2-1 in the savior competition. Now can we talk about the criticism that she’s too sexy?

LS: From some angles, Ariel’s cinched waist and round derriere kind of reminded me of Kim K, which is a little disturbing to see in a mermaid. But most Disney princesses are flawed.

ED: Everyone heralded Frozen as a big feminist moment last year, even though Elsa’s eyes are bigger than her waist. So we haven’t made much progress on realistic body image in cartoons in the last 25 years. It’s obviously a problem, but let’s not single out Ariel just because she has the shell bra and great hair.

LS: The hair truly is fantastic. Dinglehoppers work, guys. Also, props to Disney for going with a redhead, although it would have been nice for Triton’s kingdom to have greater diversity than hair color and fish species…

ED: Yes let’s just say it: “Under the Sea” is kind of a racist song, though Disney has definitely done much, much, much worse. Let’s discuss the end. Specifically Ursula. Is she the best Disney villain ever?

LS: Evil? Yes. Savvy contract negotiator? ABSOLUTELY. To be fair, Ursula gives Ariel a heads-up that there’s a big chance she’s going to end up a sea worm or whatever those terrifying creatures are. And her legalese is so good that even Triton can’t find a loophole. No one is above the law.

ED: Another good life lesson. But yeah, Ursula doesn’t even pretend to be good like Cinderella’s stepmother or Cruella de Vil or Scar. Right off the bat, she owns her evil witchiness, and I love it! Can the next Maleficent be Ursula?

LS: Too bad she ends up harpooned. I mean, good for young love and all, but poor Ursula. Now that we’ve come to the end of the movie, I think we should talk about another big Little Mermaid complaint people have: the fact that she gives up her whole world to live with Prince Eric.

ED: Again, this is a thing that happens in literally every Disney princess film except Frozen—and even there Anna ends up with someone, so it’s baby steps. I don’t know why everyone’s hating on The Little Mermaid.

LS: But really, didn’t Ariel fall in love with the human world before she even knew who Eric was? She has always dreamed of being a “part of your world,” which I’m worried she’ll find disappointing considering my dad definitely still reprimanded me (even though I had legs).

ED: Her father doesn’t understand her, and she wants to explore her own interests. Any teenager can relate to that.

LS: Also, there’s a time when we all must move on from our immediate surroundings. Triton basically lives in her back yard. Most newlyweds would probably find that a little too close for comfort.

ED: And conveniently Eric doesn’t seem to have parents. And who are we to say that if he could turn into a merperson, Eric wouldn’t? Maybe in a more progressive version of the film they have a discussion about who moves where for whose career. But in the version we get, she can turn human. He can’t turn merman. End of conversation.

LS: I smell a sequel…

ED: They made one. I think it was bad.

LS: Shhhh.

TIME Opinion

What Does It Mean to ‘Break the Internet’?

Kim Kardashian Paper Magazine
Jean-Paul Goude—Paper

When it comes to Kim Kardashian's butt, the medium is the message

Late Tuesday night, Kim Kardashian’s butt announced it would “break the Internet” when it appeared on the cover of Paper magazine. But what does “breaking the Internet” even mean? Is the Internet like a Game Boy that can break if someone sits on it by accident?

Obviously, Kim isn’t the first person to claim to “break the Internet.” In September Taylor Swift “broke the Internet” when she wore a T-shirt saying “no it’s Becky,” a super-meta reference to a Tumblr post where a user insisted that a picture of young Taylor was, in fact, someone named Becky. Beyoncé’s surprise album “broke the Internet” when she secretly released it last year. Alex from Target “broke the Internet” just by looking cute at work. Even Obama’s sensationally tan suit was almost able (but not quite) to “break the Internet,” according to Shape magazine.

Apparently, the Internet is about as durable as an 87-year-old hip.

And when it comes to Internet buzz, Kim Kardashian is Shiva the Destroyer — she has created a fame engine so big, she can dominate Twitter by flashing her nether cleavage (which, by the way, everyone has already seen.) But the most interesting circle on the Kim Kardashian cover isn’t her glistening derriere — it’s the tiny zero in $10, which is what that magazine costs. Paper magazine is just what it says it is: a magazine made of paper, and it costs money to buy it. That Kim Kardashian can “break the Internet” with a print magazine cover (as opposed to, say, an Instagram) is perhaps the biggest coup of all.

Paper magazine is a small but prestigious art and fashion publication with an edgy bent. So while her Vogue cover with Kanye helped legitimize Kardashian with the fashion set, Paper is a better print venue for her to bare it all in a non-pornographic way. It’s prestigious in an artsy way, but not too prestigious to shy away from the full-butt experience. Plus, print is always unexpected, and Kim loves the unexpected–remember her divorce from Kris Humphries?

It’s reminiscent of Benedict Cumberbatch’s recent old-fashioned newspaper engagement announcement, which immediately went viral. Most of the fascination was the news that Sherlock was off the market, but there was the added shock that the announcement wasn’t made on Twitter or Instagram, but instead appeared on a piece of pulpy grey newsprint in The Times of London. “It’s a kind of traditional thing to do,” Cumberbatch told People magazine. “I wanted to have some control over the message.”

Obviously, if the Internet does actually break, a paper magazine is probably not going to be what breaks it. Google Chairman Eric Schmidt said in October that surveillance programs like the NSA’s are “going to end up breaking the Internet,” because foreign governments won’t trust the United States not to snoop on their online activities. And according to The Guardian, sharks could “break the Internet” by nibbling at underwater cables.

Those events might actually break the Internet. But in the context of viral media content, “breaking the Internet” means engineering one story to dominate Facebook and Twitter at the expense of more newsworthy things. (Like, for example, the fact that humans have landed a probe on a comet for the first time in history.) So perhaps a more accurate term would be “hijacking the Internet,” since really these stories seem to be manipulating online fervor rather than shutting the whole thing down.

Sometimes people “break the Internet” by accident, which was the case for Alex from Target, the baby-faced Target checkout boy whose photo went viral after he was photographed by a teenage girl (and who is reportedly kind of freaked out by his Internet fame). Another example of accidental Internet takeover is PR director Justine Sacco, whose offensive AIDS tweet went immediately viral and cost her her job.

But for celebrities, Internet destruction is more often a calculated PR maneuver, designed to maximize social media hype and make themselves — or their projects — the center of attention. That’s what happened with Beyoncé last year — her self-titled album dropped the night of Dec. 12, 2013 with no fanfare or PR announcement, and by the next day she dominated Twitter, Facebook and iTunes. And this month, Taylor Swift’s entire rollout of her album 1989 has been calculated to maximize social media buzz, from dropping the first single (“Shake It Off”) through a Yahoo! livestream event to removing her entire catalogue from Spotify. Add her new Tumblr presence and her surprisingly thoughtful interaction with fans, and you’ve got an Internet tornado.

Kim Kardashian and the editors of Paper weren’t quite as strategic as Swift, but they do get points for irony. After all, the web helped eclipse print partially because of the popularity of bare butts online, so if this magazine cover were really able to break the internet, it’d be sweet revenge for paper and ink.

Read Next: Kim Kardashian’s Butt Is an Empty Promise
