TIME europe

Only Gender Quotas Can Stop the E.U. from Being a Boys’ Club

Newly elected President of the European Commission, Jean-Claude Juncker is congratulated on July 15, 2014, in the European Parliament in Strasbourg, France. Frederick Florin—AFP/Getty Images

The European Commission's president has asked that E.U. member states nominate female candidates. Here's why gender quotas are necessary

Gender anxiety is enveloping the top levels of the European Union. By the end of this month, each of the bloc’s 28 countries is expected to put forward its candidate to sit on the European Commission, the powerful body that drives policy-making and enforces E.U. law.

Jean-Claude Juncker, the Commission’s new president, has instructed member states to send female candidates, saying he wants more women in the top jobs. A social media campaign – #10orMore – is also under way to boost female representation at the E.U. to a record high.

Unfortunately, governments are not playing ball: so far only five countries have nominated women. Nineteen others have each nominated a man, with four countries still to announce their candidates.

The goal of getting more women into top decision-making posts is simply common sense given that they represent more than half of the E.U.’s 507 million citizens. Right now this is not reflected by their visibility in politics, business or the media, meaning their interests are often sidelined.

The drive to change the status quo at the top echelons of the E.U. has attracted skepticism. On the Facebook page of Neelie Kroes – one of the nine women in the outgoing Commission and a co-founder of the #10orMore campaign – critics question why gender would qualify a person for one of the 28 commissioner posts.

Such knee-jerk accusations of tokenism greet most attempts to introduce gender quotas in politics or the boardroom. But with so many barriers standing between women and senior positions – from sexism in the workplace to high childcare costs to the unequal distribution of maternity and paternity leave – quotas are one of the few measures that actually have an impact.

The British Labour party introduced all-women shortlists for parliamentary candidates in some constituencies ahead of the 1997 general election. A record number of women were elected that year, and Labour still has the highest proportion of female MPs in Britain.

Britain’s Conservative party, which formed a coalition government with the Liberal Democrats in 2010, does not support all-women shortlists, and a U.N. survey of women in ministerial positions earlier this year shows Britain languishing at around the halfway point, below Morocco and Côte d’Ivoire, with women making up just 15% of the cabinet.

There are other poor performers in Europe, with Greece, Cyprus and Hungary faring even worse, reflecting the problems Juncker is having in rallying enough women for his Commission.

At the other end of the spectrum, however, are Sweden and Finland, which are in the top three of the U.N. survey with over 50% female representation in their cabinets. France and Norway are close to reaching gender parity.

What the top performers have in common are long-term and often legislated programs to improve gender equality across society. In Sweden, political parties have since the early 1990s imposed voluntary quotas for election candidates. Norway was the first to introduce quotas for women on company boards, while France has legally binding quotas for both politics and the boardroom. “Quotas are nobody’s first choice but where they are introduced they do improve representation, they do improve visibility of women,” says Clare McNeil, a senior fellow at the London-based Institute for Public Policy Research, adding that they work best when coupled with penalties for non-compliance.

Given the pool of female talent in the E.U., having just a handful of women in the Commission would be a pitiful performance. It is crucial now that efforts to increase female representation go beyond headline-grabbing promises. Juncker and the European Parliament, which approves the Commission, must make good on threats to reject the line-up if it is too male-dominated.

Hopefully quotas will not need to be in place forever. But right now Europe is so far from being a level playing field that radical measures are needed to kick-start lasting change in society.

Charlotte McDonald-Gibson is a writer and journalist based in Brussels.

TIME Race

Reparations Could Prevent the Next Ferguson

Protesters march in the street as lightning flashes in the distance in Ferguson, Mo., Aug. 20, 2014. Jeff Roberson—AP

The U.S. government and society need to recognize the direct connections between continuing racial disparities in this country and the wrongs that gave rise to them

Watching the events unfold in Ferguson, Missouri, I can’t help thinking about the Holocaust and post-war Germany. As the daughter of a Holocaust survivor, I’ve spent years watching Germany wrestle with its dark past. It’s just one of many places that have made efforts to understand and compensate for a difficult history: For nearly three decades, countries as varied as South Africa, Rwanda, and the nations of Latin America and post-Communist Eastern Europe have been engaged in this process, often called “transitional justice.” That’s a broad term for the ways in which societies deal with the legacies of past injustice. Many believe that countries can only move forward once they have come to terms with their past in this way.

We’re accustomed to looking abroad for examples of such processes. But maybe — especially in light of racial tensions once again revealed in Ferguson — it’s time for us to begin thinking about what “transitional justice” could mean for the U.S.

Like people in many nations, Americans are reluctant to see ourselves in the same light as human rights abusers elsewhere. And yet our history includes a number of glaring atrocities, among them the genocide of Native Americans and slavery and its aftermath. But the United States lags behind other societies in its efforts to confront and make amends for that legacy.

What, exactly, would that entail? Justice means more than putting perpetrators on trial. The transitional justice process also encompasses methods focused on the victims and the wider society, such as truth-seeking, memorialization, education, institutional change, and material compensation — that is, actions that seek not only to punish, but to encourage a shared historical understanding, begin to repair the damage done, and ensure that it can’t happen again.

A first step in the process seems simple: official acknowledgment. Yet societies are often hesitant to admit historical wrongdoing. Armenians have been trying for decades to get Turkish authorities to acknowledge that they were the victims of an organized crime. To understand what this means, I’ve tried to imagine what I would feel had Germany not accepted responsibility for the Holocaust. Official silence negates the experience of the victims, but it’s also damaging to perpetrator societies; it feeds denial and false narratives of history that allow tensions and resentments to persist.

Apology often accompanies acknowledgment. Both Australia and Canada have recently apologized to their aboriginal populations for decades of removing children from their families. German Chancellor Willy Brandt’s famous gesture in Warsaw in 1970, when he fell to his knees before a memorial to the Warsaw Ghetto uprising, enraged many Germans who preferred not to face questions of guilt and responsibility. But this spontaneous gesture of atonement was enormously important to Holocaust survivors. In recent years, the Polish government has reversed decades of denial under Communist rule by acknowledging the participation of some Poles in anti-Semitic atrocities during World War II. Even the U.S. has managed an apology — in 1988, after a long campaign by Japanese-Americans, President Reagan apologized for the internment of Japanese-Americans during World War II.

Yet the U.S. has never officially apologized for slavery or Jim Crow (and a 2009 “apology” to Native Americans, slipped into a Defense Appropriations Act, made little impact). Nor are there memorials to slavery or to the Native American genocide on a scale similar to the Memorial to the Murdered Jews of Europe in Berlin. That memorial, imperfect as it is, represents a conscious public acknowledgment by a perpetrator society of its own wrongdoing — both a rebuke to deniers and a purposeful statement that memory should not only be the job of victims.

One reason societies often resist officially acknowledging wrongdoing is the fear of being held financially accountable. Even years after the fact, victims or their descendants may ask for the return of confiscated property, bank accounts, or uncollected insurance claims, as they have in the case of the Holocaust, Eastern European communism, and the Armenian genocide. Reagan’s apology for our treatment of Japanese-Americans was accompanied by monetary compensation.

Financial reparations are in fact the most direct way to compensate victims for past suffering.

Germany was able to pay millions to survivors of the Holocaust who suffered quantifiable harm, and continues to do so (my father received a small monthly check that made an enormous difference, especially to a penniless new immigrant in the 1950s who had lost his entire family in the Holocaust; my mother, not a survivor, still receives a widow’s pension). Societies with fewer resources have offered other types of reparation: scholarships to victims’ children, affirmative action programs, and preferential housing, health care and other entitlements.

In the United States, however, we are more likely to insist that existing institutions already provide a sufficient foundation for improving conditions, as though we could erase the effects of past atrocity without undertaking any difficult changes. Except in the brief period following the Civil War, direct financial compensation for slavery and Jim Crow has never had a serious place on the national agenda. The most significant effort to compensate for the institutionalized legal, economic and social discrimination against black Americans that persisted into recent decades—a modern legacy of slavery and Jim Crow vividly described in Ta-Nehisi Coates’ recent Atlantic Monthly piece “The Case for Reparations” — was affirmative action, but it has largely been reversed by the Supreme Court. Very little has been done to directly address ongoing racial injustices such as the disproportionate incarceration of black Americans, which author Michelle Alexander has referred to as “The New Jim Crow.”

Transitional justice demands recognition that fulfilling responsibilities to the past requires more than mere lip service from a perpetrator society. Crimes against minority groups in any society bring benefits to the perpetrator group, and compensating for them can necessitate material sacrifice. But remorse often ends where personal sacrifice begins. Marco Williams’ 2006 documentary, Banished, tells the story of several black towns in the American South that were ethnically cleansed in the early 20th century. A black family from one of these towns sought to have a father’s remains reburied near their new home and was met with sympathy from the white residents of the town — until they asked the town to pay the costs. As in Germany, where polls over the years have shown that significant minorities deny any ongoing financial responsibility toward the victims of the Holocaust, many people fail to see why they should be held individually accountable for the acts of their parents or grandparents. The benefits accrued through the injustices of the past are not always apparent.

One of the most important aspects of successful transitional justice, therefore, lies in illuminating not only the victims’ suffering, but the ways in which an entire society continues to bear the burdens of history. This helps elevate an important point: correcting injustice may require affirmative steps. The U.S. government and society need to recognize — and educate citizens on — the direct connections between continuing racial disparities in this country and the wrongs that gave rise to them, and to talk far more about the responsibilities we all share for repairing the damage. Perhaps Ferguson – which has revealed what can happen when we suppress these conversations – will finally motivate us to think about how to address the harms, whether through material reparations or otherwise. If we’re willing to start talking, we’ll find no shortage of role models for transitional justice throughout the world to help us take the next steps.

Belinda Cooper is a Senior Fellow at the World Policy Institute and an adjunct professor at Columbia University’s Institute for the Study of Human Rights and New York University’s Center for Global Affairs. This piece originally appeared on The Weekly Wonk.


TIME U.S.

Shut Up Already About Obama’s Tan Suit! Let’s Talk Substance Over Style

U.S. President Barack Obama makes a statement at the James Brady Press Briefing Room of the White House August 28, 2014 in Washington, DC. Alex Wong—Getty Images

Suitgate is giving the president a taste of what it's like to live in a woman's world. But what good does that do anyone?

Female politicians have been criticized for what they wear since they first began running for office. Hair too long, skirt too short, too much or too little makeup: any and all of it can derail an interview and focus attention on style over substance. It almost doesn’t matter what you say if you don’t look good doing it, the television adage goes.

Welcome to the women’s world, President Obama. Isn’t it fun? The tempest over the President wearing a tan summer suit on Thursday has virtually overshadowed the important messages he delivered on hostilities in Ukraine and Iraq. As a woman, I’m kind of glad to see a man held to the same crazy standards that we are. But that doesn’t make the standards any less ridiculous, male or female.

This President seems to be a particular target of sartorial bullying. Obama has been criticized far more than other recent Presidents; I had to think hard to recall similar Sturm und Drang over George Bushes 1 & 2 or Bill Clinton and came up with virtually nothing (unless you count Clinton making the G7 leaders get dressed up as cowboys, but that seemed more like him having some fun at their expense than an actual fashion misstep). But Obama has drawn ire for his lack of an American flag pin during the primaries, which fed conspiracy theories that he wasn’t really American; for his mom jeans; and, just last week, for his lack of a tie while addressing the crisis in Iraq from Martha’s Vineyard, where he was vacationing.

What we wear has no impact on what we’re saying, so why does it matter so much? Hillary Clinton has been drawing scrutiny and headlines since the velvet headband she wore in her 60 Minutes interview with Bill in 1992. Sarah Palin got savaged for her big hair, heavy makeup and “porn-star looks.” Condoleezza Rice was accused of going too sexy when she wore black leather knee-high boots as Secretary of State. Just last year, the New York Times marked the historic number of women in the 113th Congress by doing a fashion profile of their purses. And these are the things we remember: their hair, their pedicures, their heads photoshopped onto a woman in a bikini, not so much their policies or platforms. Because style is always easier to digest than substance.

Up until recently, men seemed relatively immune to this kind of fluffy criticism. Granted, male politicians rarely venture beyond dull grey suits. Obama told Vanity Fair in 2012 that he wears only grey and blue suits. But when they do break this unspoken rule, as Obama did on Thursday, do they deserve the kind of evisceration that he got? “The Audacity of Taupe,” tweeted Jared Keller, a programming director at startup MicNews. “Yes we tan!” read another headline. Wall Street Journal economic-policy reporter Damian Paletta tweeted, “I’m sorry but you can’t declare war in a suit like that.” Never mind that the President had just announced he had no strategy for the conflict in Iraq and Syria.

Sometimes a boring uniform can be helpful: it creates uniformity and a reassuring predictability. It’s why the military has uniforms. But America isn’t a militarized state. And veering outside the norm shouldn’t detract from important work. Women have learned this the hard way: conform or die, politically. And even a pro like Clinton could still draw criticism after 30 years in the public spotlight when, in the midst of international crises, she didn’t wear makeup or have time to cut her hair. It’s dispiriting to see the same level of scrutiny now being applied to men. I wish the great equalizer would be to leave all comments about appearances off the table.

Jay Newton-Small is TIME’s congressional correspondent and she’s working on a book about women in politics.

TIME

Negrophobia: Michael Brown, Eric Garner, and America’s Fear of Black People

Demonstrators march down West Florissant during a peaceful march in reaction to the shooting of Michael Brown, near Ferguson, Mo., Aug. 18, 2014. Lucas Jackson—Reuters

Phobias are extreme aversions embedded deep in our psyches, activated when we come face-to-face with the thing we fear. Some people are afraid of black people.

Phobias are lethal. This summer’s series of prominent killings of unarmed Black men, Michael Brown’s being the most widely covered, has forced me to come to terms with my own fear: I am an arachnophobe.

A few nights ago, I noticed a dark spot in my periphery. Suddenly it twitched. My stomach dropped. The dark spot was a five-inch spider, looking as if it had muscle and bone. There was no possible way I could sleep soundly until the behemoth was neutralized. I scrambled to find a shoe, then swung it with all my might. With a clap of thunder, the big dark enemy was no more; flattened to a wall stencil. Relief.

Phobias are extreme aversions. They are embedded deep in our psyches, activated when we come face-to-face with the thing we fear. For me, spiders trigger overreactions. For others, it can be people.

Black people.

Before there was Michael Brown, there was Eric Garner, a dark spot in the periphery of the NYPD—a trigger for their phobia. There was no possible way they could patrol confidently that day without assurance the behemoth was neutralized.

Garner’s 400-pound anatomy forms an object of American Negrophobia: the unjustified fear of black people. Studies show that Black people, particularly Black men, are the group most feared by White adults. Negrophobia fuels the triangular system of oppression that keeps people of color pinned into hapless ghettos between the pillars of militarized police, starved inner-city schools, and voracious prisons. And this summer there weren’t only Garner and Brown; there were John Crawford, and Ezell Ford, and many others who will not be eulogized in the media.

Even the most well-intentioned people sometimes have difficulty avoiding discourses that reinforce problematic notions of Black physicality. A few months ago, I got into a conversation with a mentor of mine, a Stanford administrator. This individual told a story of a visit to a penitentiary where there was a stellar performance of Shakespeare’s Othello by a cast of inmates. My mentor’s description of the lead, a brawny African-American male convict, will always fascinate me. In this person’s words, the thespian was a “large, beautiful, intimidating Black man.”

This stream of modifiers—large, beautiful, and intimidating—is normally reserved for majestic, predatory beasts like tigers, bears, or dragons. It describes something both appealing and appalling, but not typically a human. You can see classic buck and brute tropes echoed in various corners of modern popular culture. These types of perceptions of historically marginalized groups can, in the wrong circumstances, foment phobias—and dangerous overreactions.

But misperception is nothing new. The bestial depiction, and treatment, of Black people follows a linear history from the times of pickaninny children to the current United States president.

I hate to think this is what the police see when they approach any unarmed Black person—a predator that has escaped captivity and must be tranquilized before he or she wreaks havoc. And yet. An officer quelling Ferguson protests can be heard screaming on live television, “Bring it, all you f****** animals!” to the predominantly Black demonstrators.

Back to the spider once more: my fear and that spider’s actual ability to produce the threat I had mentally assigned it were completely disproportionate. It was just me spooking myself into fury. Phobic people hyperbolize a threat that is not actually present, and trip themselves into aggression. We as Americans must learn to see each other properly and not through the lens of phobia.

This is a plea to those officers who are unflinching in the gravest of dangers, whose courage is forged in the crucible of our nation’s worst emergencies, yet who lose all composure when facing the grimace of a Black man. The concept of diversity, like Eric Garner, is large, beautiful, and sometimes intimidating. America will only be America once we learn how to fully appreciate it, not fear it. One day, I hope, we won’t see our fellow humans as dark spots.

Brandon Hill is a junior at Stanford University, studying political science and African & African American Studies. Raised in Eden Prairie, Minnesota, he has interned for the White House and UNICEF.

TIME

A Tale of Two 9-Year-Olds: The One on the Playground, and the One With an Uzi

An UZI assault pistol Terry Ashe—Getty Images

You should be absolutely terrified that a 9-year-old’s constitutional right to fire an Uzi trumps your right to decide at what age your kids can play at the park unsupervised

Parents who allow their 9-year-old to play unsupervised at a playground can be arrested, but handing a 9-year-old an Uzi is perfectly acceptable.

Unfortunately, that’s not hyperbole. It’s just the sad state of affairs in which we find ourselves, after a 9-year-old New Jersey girl accidentally shot and killed her instructor at a firing range in Arizona. The girl’s parents paid for her to fire a fully automatic machine gun, but she lost control of the weapon and shot her instructor, Charles Vacca, killing the military veteran.

The chilling ordeal was caught on tape, courtesy of the girl’s parents, but Arizona police officials have said no charges will be filed or arrests made. The Mohave County Sheriff’s Office concluded the incident was an “industrial accident,” and has contacted the Occupational Safety and Health Administration to investigate, according to published reports.

Let’s compare that to a story from earlier this summer, regarding a different 9-year-old, one in South Carolina.

Debra Harrell is a working mother who faces a common problem for parents when school lets out for the summer: finding affordable child care. The McDonald’s employee couldn’t afford to have someone watch her 9-year-old daughter, so the girl was playing on her laptop in the restaurant during her mother’s shifts. However, when that laptop was stolen from their home, Harrell armed her daughter with a cell phone in case of an emergency and let her go unsupervised to an area playground. Another parent noticed the girl there alone and contacted the police, at which point Harrell was arrested and charged with child neglect. If convicted, she faces up to 10 years behind bars.

Is anyone else absolutely scared to death of the horrendous message we’re sending to parents?

Regarding the incident in Arizona, we’re talking about two parents who willingly paid $200 to put a fully automatic weapon in the hands of their 9-year-old daughter. This poor girl, who should’ve been learning to shoot with a .22 rifle or some other weapon she could handle (if indeed she had to learn to fire a gun), was given an Uzi capable of firing up to 600 rounds per minute—creating a recoil difficult for some adults to handle.

And the scariest part? The firing range has a minimum age of eight years old to fire such weapons – one year younger than the girl who is now surely scarred for life. The terrible judgment of the New Jersey parents, combined with the firing range operators’ decision to allow kids that young to fire Uzis, directly contributed to a man’s death. That stands in stark contrast to Harrell’s troubles in South Carolina.

Instead of a loaded weapon, Harrell armed her daughter with a phone and sent her to a playground with lots of other kids and adults. The only shooting that took place was the cool water from a splash pad and some hoops on the basketball courts. There were even volunteers who came by the playground with free snacks. While the arrangement was perhaps not ideal since Harrell was at work, she sent her daughter to a family-friendly place with an environment geared toward fun and summertime frivolity. The same kind of place I routinely rode my bike to at the age of nine.

Yet Harrell is the one arrested. Who lost her job. Who spent 17 days in jail, temporarily lost custody of her daughter, and faces 10 years in prison.

So, when considering charges for the neglect of a child, playgrounds seem to be a greater threat in the eyes of the law than guns. And that is a travesty.

Wherever you fall in this country’s ongoing debate about guns and gun control, this should upset you. It should infuriate you. It should alert you to our disturbingly warped gun culture, and should be more than enough proof that change is desperately needed. And parents, let me state this unequivocally: It is never acceptable to let your 9-year-old fire an Uzi. Never. Under any circumstances.

Harrell’s detractors claim someone could’ve kidnapped her daughter at the playground, which is true. But while there is a low risk of child abduction at a public playground in broad daylight, it pales in comparison to the risks involved with letting a 9-year-old fire a machine gun. So please stop referencing the Second Amendment, because I’m certain our Founding Fathers weren’t contemplating the benefits of letting children fire hundreds of rounds per minute when they drafted the right to bear arms.

If you’re a parent, you should be absolutely terrified that a 9-year-old’s constitutional right to fire an Uzi trumps your right to decide at what age your kids can play at the park unsupervised.

Something has to change. Now.

Aaron Gouveia is a husband, father of two boys, and writes for his site, The Daddy Files.

TIME feminism

Campus Rape: The Problem with ‘Yes Means Yes’

New students at San Diego State University watch a video on sexual consent during an orientation meeting, Aug. 1, 2014, in San Diego. Gregory Bull—AP

Having the government dictate how people should behave in sexual encounters is a terrible idea

The campus crusade against rape has achieved a major victory in California with the passage of a so-called “Yes means yes” law. Unanimously approved by the state Senate yesterday after a 52-16 vote in the Assembly on Monday, SB 967 requires colleges and universities to evaluate disciplinary charges of sexual assault under an “affirmative consent” standard as a condition of qualifying for state funds. The bill’s supporters praise it as an important step in preventing sexual violence on campus. In fact, it is very unlikely to deter predators or protect victims. Instead, its effect will be to codify vague and capricious rules governing student conduct, to shift the burden of proof to (usually male) students accused of sexual offenses, and to create a disturbing precedent for government regulation of consensual sex.

No sane person would quarrel with the principle that sex without consent is rape and should be severely punished. But while sexual consent is widely defined as the absence of a “no” (except in cases of incapacitation), anti-rape activists and many feminists have long argued that this definition needs to shift toward an active “yes.” Or, as the California bill puts it:

“Affirmative consent” means affirmative, conscious, and voluntary agreement to engage in sexual activity. … Lack of protest or resistance does not mean consent, nor does silence mean consent.

The law’s defenders, such as feminist writer Amanda Hess, dismiss as hyperbole claims that it would turn people into unwitting rapists every time they have sex without obtaining an explicit “yes” (or, better yet, a notarized signature) from their partner. Hess points out that consent can include nonverbal cues such as body language. Indeed, the warning that “relying solely on nonverbal communication can lead to misunderstanding,” included in the initial draft of the bill, was dropped from later versions. Yet even after those revisions, one of the bill’s co-authors, Democratic Assemblywoman Bonnie Lowenthal, told the San Gabriel Valley Tribune that the affirmative consent standard means a person “must say ‘yes.’ ”

Nonverbal cues indicating consent are almost certainly present in most consensual sexual encounters. But as a legal standard, nonverbal affirmative consent leaves campus tribunals in the position of trying to answer murky and confusing questions — for instance, whether a passionate response to a kiss was just a kiss, or an expression of “voluntary agreement” to have sexual intercourse. Faced with such ambiguities, administrators are likely to err on the side of caution and treat only explicit verbal agreement as sufficient proof of consent. In fact, many affirmative-consent-based student codes of sexual conduct today either discourage reliance on nonverbal communication as leaving too much room for mistakes (among them California’s Occidental College and North Carolina’s Duke University) or explicitly require asking for and obtaining verbal consent (the University of Houston). At Pennsylvania’s Swarthmore College, nonverbal communication is allowed but a verbal request for consent absolutely requires a verbal response: If you ask, “Do you want this?”, you may not infer consent from the mere fact that your partner pulls you down on the bed and moves to take off your clothes.

Meanwhile, workshops and other activities promoting the idea that one must “ask first and ask often” and that sex without verbal agreement is rape have proliferated on college campuses.

The consent evangelists often admit that discussing consent is widely seen as awkward and likely to kill the mood — though they seem to assume that the problem can be resolved if you just keep repeating that such verbal exchanges can be “hot,” “cool,” and “creative.” It’s not that talk during a sexual encounter is inherently a turn-off — far from it. But there’s a big difference between sexy banter or endearments, and mandatory checks to confirm you aren’t assaulting your partner (especially when you’re told that such checks must be conducted “in an ongoing manner”). Most people prefer spontaneous give-and-take and even some mystery, however old-fashioned that may sound; sex therapists will also tell you that good sex requires “letting go” of self-consciousness. When ThinkProgress.com columnist Tara Culp-Ressler writes approvingly that under affirmative consent “both partners are required to pay more attention to whether they’re feeling enthusiastic about the sexual experience they’re having,” it sounds more like a prescription for overthinking.

Of course anyone who believes that verbal communication about consent is essential to healthy sexual relationships can preach that message to others. The problem is that advocates of affirmative consent don’t rely simply on persuasion but on guilt-tripping (one handout stresses that verbal communication is “worth the risk of embarrassment or awkwardness” since the alternative is the risk of sexual assault) and, more importantly, on the threat of sanctions.

Until now, these sanctions have been voluntarily adopted by colleges; SB 967 gives them the backing of a government mandate. In addition to creating a vaguely and subjectively defined offense of nonconsensual sex, the bill also explicitly places the burden of proof on the accused, who must demonstrate that he (or she) took “reasonable steps … to ascertain whether the complainant affirmatively consented.” When the San Gabriel Valley Tribune asked Lowenthal how an innocent person could prove consent under such a standard, her reply was, “Your guess is as good as mine.”

Meanwhile, Culp-Ressler reassures her readers that passionate trysts without explicit agreement “aren’t necessarily breaches of an affirmative consent standard,” since, “if both partners were enthusiastic about the sexual encounter, there will be no reason for anyone to report a rape later.” But it’s not always that simple. One of the partners could start feeling ambivalent about an encounter after the fact and reinterpret it as coerced — especially after repeatedly hearing the message that only a clear “yes” constitutes real consent. In essence, advocates of affirmative consent are admitting that they’re not sure what constitutes a violation; they are asking people to trust that the system won’t be abused. This is not how the rule of law works.

This is not a matter of criminal trials, and suspension or even expulsion from college is not the same as going to prison. Nonetheless, having the government codify a standard that may implicitly criminalize most human sexual interaction is a very bad idea.

Such rules are unlikely to protect anyone from sexual assault. The activists often cite a scenario in which a woman submits without saying no because she is paralyzed by fear. Yet the perpetrator in such a case is very likely to be a sexual predator, not a clueless guy making an innocent mistake — and there is nothing to stop him from lying and claiming that he obtained explicit consent. As for sex with an incapacitated victim, it is already not only a violation of college codes of conduct but a felony.

Many feminists say that affirmative consent is not about getting permission but about making sure sexual encounters are based on mutual desire and enthusiasm. No one could oppose such a goal. But having the government dictate how people should behave in sexual encounters is hardly the way to go about it.

Cathy Young is a contributing editor at Reason magazine.


TIME Business

The $15 Minimum Wage Is a Bellwether of the New Living Wage

Robert Wideman, a maintenance mechanic at McDonald’s Corp., shines the shoes of a Ronald McDonald statue outside of a restaurant while protesting with fast-food workers and supporters organized by the Service Employees International Union (SEIU) in Los Angeles, California, U.S., on Thursday, Aug. 29, 2013. Bloomberg—Bloomberg via Getty Images

When $13 an hour isn't enough

A little less than two years ago, a group of courageous New York fast food workers went on strike, outlandishly insisting on a $15-an-hour wage and launching an unlikely David-versus-Goliath battle to raise pay for tens of millions of Americans in dead-end jobs.

Goliath is falling.

On Tuesday, word leaked that the mayor of Los Angeles will soon propose raising the city’s minimum wage to more than $13 in the next three years – an increase that would lift pay for hundreds of thousands of struggling Angelenos.

The plan neither meets the now iconic $15 demand of low-wage workers everywhere (though with cost-of-living adjustments built in, it would get there by 2023), nor guarantees the right of workers to freely form a union – a critical step in solidifying wage increases and improving other working conditions.

But pointing out these shortcomings only highlights just how far the nation has come. For who, on that cold November day two years ago, could have envisioned that a proposal to raise the minimum wage in America’s second largest city to more than $13, a nearly 50% increase over three years, would not only be taken seriously but would strike some as being too modest?

Who could have envisioned that under pressure from their left, moderate New York governor Andrew Cuomo would endorse a minimum wage of more than $13 for the nation’s largest city (New York), and Chicago’s dyed-in-the-wool pragmatist mayor Rahm Emanuel would throw his weight behind a $13 wage floor in the nation’s third largest city?

Who could have imagined that in 2014, business leaders in Seattle would actively support and help enact an unprecedented $15 minimum wage law, only to be one-upped by the San Francisco business community, which has agreed to let one of the country’s most liberal electorates vote on an even faster increase to $15 this November?

Who could have foreseen techs and janitors at Baltimore’s Johns Hopkins and teachers’ aides and cafeteria workers at schools in Los Angeles successfully bargaining contracts guaranteeing $15 an hour, or businesses like Michigan’s Moo Cluck Moo deciding to raise employees’ pay to $15 just on principle?

By themselves, any of these victories – along with the passage of more modest but still significant wage increases in cities like San Diego, Berkeley, Santa Fe and Washington and in states including Maryland, Michigan, Minnesota, Hawaii, Vermont, Connecticut and Massachusetts – could be dismissed as an aberration. Together, they represent the start of an inexorable march toward a new social compact, one in which America’s workers are no longer cast aside as dispensable factors of production whose output is to be maximized at the lowest possible cost.

Looking ahead, we can ask: Which state will be the first to set a $15 minimum wage? Which big fast-food company will be the first to guarantee a minimum hourly wage that is double the industry standard? When can we expect to see a living wage become a core labor standard guaranteed to all workers across the country?

For four decades, wages have flatlined, even as worker productivity has continued to grow. Low-wage jobs now form the core of America’s economy, accounting for seven of the ten occupations with the largest projected growth over the next decade. Now middle- and working-class people in this country are rightfully insisting on a larger share of the nation’s prosperity.

In years past, right-wing politicians and their corporate backers may have been able to subdue this agitation with references to “job creators” and patronizing warnings against “hurting those you want to help.” But Americans – low-wage workers, middle class families and even many business owners – have had enough.

A powerful movement is afoot to create a decent life and a truly sustainable economy for us all. Giants beware.

Arun Ivatury is a campaign strategist with the National Employment Law Project.

TIME Innovation

Five Best Ideas of the Day: August 29

1. We must confront the vast gulf between white and black America if we want to secure racial justice after Ferguson.

By the Editors of the Nation

2. As ISIS recruits more western acolytes, it’s clear military might alone can’t defeat it. We must overcome radical Islam on the battleground of ideas.

By Maajid Nawaz in the Catholic Herald

3. Kids spend hours playing the game Minecraft. Now they can learn to code while doing it.

By Klint Finley in Wired

4. One powerful way to raise the quality of America’s workforce: Make community colleges free.

By the Editors of Scientific American

5. Restrictions on where sex offenders can live after prison are pure politics. They do nothing to prevent future offenses.

By Jesse Singal in New York Magazine’s Science of Us

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

TIME Business

Equal Opportunity Is Over. It’s Time for ‘Racial Realism’

Getty Images

A shift in demographics means that, increasingly, many employers are treating race as a qualification

Californians, like other Americans, like to think that race should never be a qualification for a job, that everyone deserves an equal opportunity and a fair shake. This principle undergirds our Civil Rights Act, which turns 50 this month. And yet increasingly, many employers are treating race as a qualification, especially for people of color. Just look at the Los Angeles Lakers’ acquisition of Jeremy Lin. “We think Jeremy will be warmly embraced by our fans and our community,” said General Manager Mitch Kupchak. Putting Lin on the court is a smart economic move in the country’s largest Asian-American market.

The prevalence of this kind of hiring—particularly in California, America’s most populous and most diverse state—suggests that the Civil Rights Act needs to be updated. California in 2014 certainly looks nothing like Alabama and Mississippi of 1964, which were Congress’s focus when it passed that year’s Civil Rights Act. The main question then was how to provide equal opportunity for African-Americans. The answer at that time was Title VII of the act, which prohibited racial discrimination in employment, and later court decisions allowing for affirmative action.

Twenty-first-century employers have come to value racial differences in ways that were unheard of in 1964, and do not fit with traditional conceptions of affirmative action. Organizations of all kinds today hire and place workers using a practice I have called “racial realism”: seeing color as a real and significant part of workers’ identities, a qualification that is good for business.

As with the Lakers and Lin, employers use racial realism to make customers of different backgrounds feel comfortable. As San Francisco-based Wells Fargo explains on its website: “To know our customers and serve them well, the diversity of team members throughout our ranks should reflect the diversity of the communities we serve.”

Government employers, including police departments and school districts, have also invoked racial realism, seeking to mirror the populations they serve to deliver more effective services. For example, California’s Education Code declares the importance of hiring racially diverse teachers so that “the minority student [has] available to him or her the positive image provided by minority classified and certificated employees.”

In low-skilled jobs, racial realism is often linked to perceived variations in abilities, rather than customer reactions. One study of Los Angeles employers found a common pattern of preference for Latinos due to their perceived diligence.

While racial realism lacks the animus that characterized the racism of the Deep South 50 years ago, it is still problematic. The Civil Rights Act provides no authorization for race to be a job qualification. And the Equal Employment Opportunity Commission has denied the legality of motivations like Wells Fargo’s. If employers in Alabama could claim they preferred white workers because their customers preferred white workers, the cause of equal opportunity would never have gotten off the ground. Courts have ruled that firms should have their workforces mirror their job applicant pools, not their customer bases. And California’s rationale for teacher diversity would seem to have been precluded by a 1986 Supreme Court decision, which explicitly stated that hiring teachers to be racial role models was impermissible.

Moreover, the employer preference for Latino workers, often immigrants, is often propelled by stereotypes, and often at the expense of other workers stereotyped differently, especially African-Americans. The Equal Employment Opportunity Commission has initiated action against employers who use this strategy, grouping the cases under a heading no one would have considered in 1964: “Hispanic Preference.”

For high-skilled nonwhite workers, racial realism can be a double-edged sword. They may have ready access to jobs—then find themselves pigeonholed in positions where they deal with same-race clients or citizens.

Why the shift from equal opportunity to racial realism? Demographics. American birthrates declined as the country became more educated, creating a great demand for low-skilled immigrant labor. Employer demand for labor brought immigrant workers here, but now immigrants themselves, and their descendants, are shaping employment patterns as consumers. Employers are feeling pressure to balance the rights of their workers and the interests of customers and citizens, including those of color, who rightfully expect the best service from businesses and especially from government.

The Civil Rights Act, as written, puts employers and employees alike in a bind. It is time to revisit the law, and make adaptations that fit our new demography—and the law’s original goal of equal opportunity for America’s most disadvantaged.

John D. Skrentny—co-director of the Center for Comparative Immigration Studies and professor of sociology at the University of California, San Diego—is author of After Civil Rights: Racial Realism in the New American Workplace (Princeton University Press).

TIME Education

Here Are the Crucial Job Skills Employers Are Really Looking For

Tom Merton—Getty Images

'Soft skills' like professionalism and oral communication rank among the most valued, regardless of education level

Labor Day offers an opportunity for politicians and economists to offer their two cents on the state of labor. It’s a good bet that some of that commentary will focus on the so-called “skills gap” — the notion that millions of jobs in highly technical fields remain unfilled while millions of Americans without those skills remain unemployed.

The solution according to the pundits? Education and training that focus on technical skills like computer engineering, or on crucial but scarce skills like welding. Match these newly trained employees with open jobs that require those skills and, voila, the skills gap is gone — and the labor market is steadied.

If only it were so simple.

Yes, more American workers need to learn skills that are underrepresented in the labor market. And yes, those technology titans who advocate for more challenging school curricula, for greater funding for science and engineering education and for immigration reforms to bring more skilled workers are responding to a real problem. But that’s not all there is to it. The problem with the skills gap argument is that it accounts for only one set of skills that employers consider important.

I work at Books@Work, a non-profit organization that brings university professors to the workplace to lead literature seminars with employees. The employers with whom we work want to provide professional development opportunities for all members of their organizations, and — we like to think — are more creative in their approach to doing so than most. Yet even this group of employers has few ways of helping their employees to develop skills that aren’t about content or subject matter — skills like communication, critical thinking, creativity, empathy and understanding of diversity.

Such skills cut across sector, hierarchy and function – and are, according to employers, crucial to the success of their companies. According to research conducted by the Association of American Colleges and Universities (AACU), 93 percent of business and non-profit leaders who were surveyed consider critical thinking and communication skills to be more important than a person’s undergraduate major when it comes to hiring.

That’s bad news because, while many public programs try to bridge gaps in the knowledge of future workers, there are few programs to address the gap in skills that are more difficult to measure, like creativity and critical thinking. My colleagues and I often hear from hiring managers who are hungry for programs that will encourage their employees (at all levels of the organization) to think more creatively, communicate more effectively and become more adept at reacting to changing circumstances.

The gap in these “soft” skills is very real. Professionalism/work ethic, teamwork/collaboration, and oral communication rank among the top five skills valued by employers hiring candidates at any educational level, according to one study. Yet employers rank significant portions of those entering the workforce as deficient on all these dimensions. The problem is particularly acute among those without a college degree. Employers rate those entering the workforce with a high school degree as deficient on professionalism/work ethic, critical thinking/problem solving, and oral communication. Meanwhile, employers do not regard a majority of college graduates as deficient in any of these areas.

The introduction at the K-12 level of the Common Core, which is supposed to emphasize critical thinking and problem solving, may produce changes in these figures in the years to come. But for now, those without access to a university education — and even some workers with college degrees — enter the workforce lacking the interpersonal, reasoning and thinking skills necessary for success. Unlike direct knowledge areas — like computer basics — that can be taught through employer training sessions, there is no set curriculum for critical thinking or applied reasoning.

There is no silver bullet for addressing this gap, though our approach at Books@Work, having employees read literature and reflect on it, is one example of an attempt to disseminate some of the benefits of a liberal arts education beyond the confines of the traditional university setting. We need many more such efforts. In discussing Macbeth or Frankenstein, workers explore complex (and timeless) interpersonal dynamics — an opportunity that a training on the latest operating system or review of safety regulations is unlikely to provide.

We’ve found that reading literature with colleagues can offer a new perspective on the practice of work itself, leading to greater professionalism and new ways of doing things. Themes of empathy in a powerful novella by May Sarton, As We Are Now, which is about a woman in a terrible nursing home, led workers in one hospitality company to reconsider their approach toward customers, resulting in a renewed awareness of customer needs and expectations. A conversation about the racial tension in the post-war Northwest in David Guterson’s Snow Falling on Cedars became a platform to discuss personal integration issues in a company growing rapidly through acquisition and organizational acculturation.

Programs like Books@Work are not an adequate substitute for public policy solutions to the gap in thinking and interpersonal skills. We do not address disparities in such skills among job applicants — only among those who are hired. And they place the burden for addressing the problem squarely with employers. But programs that address the significant divide in soft skills are a first step toward realizing that solving the so-called skills gap requires more than teaching kids to code, retraining the unemployed as welders or encouraging college dropouts to complete technical degrees. We all need to continue to improve the most important skill of them all – our thinking.

Rachel Burstein, Ph.D. is Academic Director at Books@Work. This piece originally appeared at Zocalo Public Square.
