TIME In the Arena

Burned Books in the Holy Land

Joe Klein is TIME's political columnist and author of six books, most recently Politics Lost. His weekly TIME column, "In the Arena," covers national and international affairs.

Jewish and Arab parents watch as Israel’s hopes for peace fade

The vandals started the fire in the first-grade classroom with a pile of textbooks. But textbooks apparently don’t burn so well. The classroom was destroyed, and the one next to it damaged, but that was all. It was a Saturday evening. The janitor called the principal, Nadia Kinani, to report the fire, and she rushed to the school. She saw that it wasn’t only a fire. There was graffiti that turned her stomach. First she saw “Kahane was right,” a reference to Meir Kahane, a deceased Jewish extremist leader. And then she saw “No coexistence with cancer.” And “Death to Arabs.” Kinani is an Arab, and her school is the rarest of things–a bilingual academy whose students are nearly 50% Jewish and 50% Arab, in the heart of Jerusalem. “My first thought was, Our dream is finished,” she told me three days after the fire. “No parents will want to send their children here anymore.”

The Hand in Hand school in Jerusalem–one of five such schools–opened in 1998, after several years of careful preparation. It was a moment of hope. The Oslo accords had been signed by Yasser Arafat and Yitzhak Rabin; peace was surely on the way. “I believed that if you want to solve any problem, the way to begin is through education,” says Hattam Mattar, an Israeli Arab who sent his daughters to the school. “Some of my friends said, ‘Your daughter will marry some Jew guy.’ But I figured my daughters could meet Jew guys on the bus. I thought that this school would give them a stronger sense of their own identity and who we are living with.”

The school is totally bilingual. There are two teachers per classroom. All holidays are celebrated–or at least noted and discussed, as in the case of Nakba Day, the Palestinian remembrance of those forcibly removed from the land during the 1948 war. In fact, everything–every riot and bombing and “protective” wall–is discussed by parents and children alike. There is no political consensus about one state or two states, just a feeling. “We are all here,” Kinani told me. “We have to figure out a way to live together.”

The school was built next to a railroad track and is close to the original 1948 border between Israel and Jordan. It was built in an Israeli neighborhood but is adjacent to an Arab area. “They say we live in a bubble, but it is more like a cauldron,” said Rebecca Bardach, the school’s director of resource development and strategy, as she led me to a terrace that overlooked a wadi. On the other side of the valley was the arena where the Beitar Jerusalem soccer team plays. The Beitar fans are notorious; one of their favorite chants is “Death to Arabs.”

There was a time–during most of Israeli history, in fact–when such sentiments were considered way out of the mainstream, unacceptable in polite society. But that is changing. There is rising tension in Jerusalem, with near-daily acts of terrorism and humiliation by both sides. Last summer, three Israeli children were kidnapped and killed by Palestinians on the West Bank; some Jews responded by killing a Palestinian child. Israeli Prime Minister Benjamin Netanyahu reacted with emotional disgust to the vengeance killing, but his government has been promoting an entirely unnecessary, and quite possibly meaningless, law that would make Israel a Jewish state. And so you have a steady, bloody dribble of horror in the streets. Palestinians murder four rabbis in a synagogue. Israeli thugs torch the Hand in Hand school.

Gradually, the Oslo dream of two states, Israel and Palestine, living peacefully side by side begins to seem unlikely. There are all sorts of sane arguments for a two-state solution. The West Bank occupation has smashed Israel’s moral compass, and Israel’s democracy will be destroyed as the West Bank Palestinian population increases and is refused the right to vote. But in the Promised Land, fantasies have always trumped reality. There is the fantasy now of a Greater Israel; there is the fantasy of no Israel at all. These views are held by minorities with the dead-eyed arrogance of majorities.

Almost immediately, on the night of the fire, the parents went to the Hand in Hand school. At first, Kinani’s fears seemed justified. A parent told her she was withdrawing her child. But there was a discussion in the library that night, a classic Hand in Hand discussion, with Arab and Jewish parents sharing their anger and fears. The parent changed her mind. “There is no place else I would want my child to be,” she said. A student at the meeting asked if there would be school on Monday. “Yes,” Kinani responded, “and there will be homework.” And on Monday, the students responded with graffiti of their own. We are not enemies, said one sign. And another: We continue together without hatred and without fear.

TO READ JOE’S BLOG POSTS, GO TO time.com/swampland

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Sweden

Sweden Is Holding Snap Elections for the First Time Since 1958

Sweden Government Defeat
Sweden's Prime Minister Stefan Lofven talks at a press conference in Stockholm, Sweden, on Tuesday, Dec. 2, 2014. Pontus Lundahl—AP

Extraordinary measure comes after anti-immigrant party derails the Prime Minister's budget proposal

Sweden’s new Prime Minister Stefan Lofven has called for snap elections, the country’s first in nearly 60 years, after a populist anti-immigration party derailed his attempt to build support for his first budget proposal.

The decision was announced Wednesday, a day after the Sweden Democrats chose to back the opposition’s alternative budget, a move almost unheard of in a country long known for seeking broad political consensus, the Wall Street Journal reports.

Lofven’s minority government, formed between his Swedish Social Democratic Party and the Green Party after the election on Sept. 14, was weak from the start. He has since reached out to center-right parties to find support for his budget, but the Sweden Democrats, who placed third in the election, were systematically shut out of the discussions.

This week, the Sweden Democrats said they planned to derail future budget proposals that continue current spending on immigration.

The snap election will be held on March 22.


TIME politics

Elizabeth Lauten Still Doesn’t Seem to Get How She Dehumanized Young Black Girls on Facebook

Elizabeth Lauten
Elizabeth Lauten Tom Williams—CQ-Roll Call, Inc.

I'm glad she resigned. But her statement speaks to a much larger problem


This story originally appeared on xoJane.com.

Following a long, hard weekend that included much “shade” and reportedly even more prayer, Elizabeth Lauten has finally done the right thing and resigned from her job as spokeswoman for Representative Stephen Fincher (R., Tenn.) after posting inappropriate criticisms of First Daughters Sasha and Malia Obama on Facebook.

This is great because America wasn’t really in the market for a Troll in Chief, and the subsequent apology Lauten offered didn’t help win friends and influence people. The long weekend was a tender time that had already left a lot of people feeling exposed as many Americans wrestled with the meaning of the secret proceedings that led to the Ferguson Decision.

Then, as now, was not the time to revel in shades of racism and mean-girl snark to make a political point, which is exactly what Lauten did. Spectacularly tone-deaf to where we’re at as a country right now, she went all in on Malia and Sasha–Michelle and Barack’s daughters, and Marion Robinson’s grands–for their seeming, and refreshing, lack of interest in the corny tradition that is the annual White House Thanksgiving turkey pardon.

“I get you’re both in those awful teen years, but you’re a part of the First Family, try showing a little class,” Lauten wrote.

What she neglects to acknowledge is just how awful those teen years can be. Instead, she piles on. These young ladies are shown standing exposed to the world when everything about them is changing and adjusting at a rapid pace in ways they might not understand because that is what it means to be an adolescent.

Worst of all, Lauten needlessly sexualized the girls by saying, “Dress like you deserve respect, not a spot at a bar.”

Girls have a hard enough time feeling good about their developing bodies without creepy, inappropriate and out-of-context comments like these. This comment felt just as bad as any leering guy on the street wolf-whistling to female passers-by.

It has never been more important to value a young woman’s humanity as she works to be vital and relevant, living and loving, hoping one day she’ll be valued for her efforts and be paid fairly and rewarded accordingly.

And in a society where bullying is rampant, it’s honestly unbelievable to me that Lauten so blindly bullied these girls. Did Lauten not even see the movie Bully? I still cry thinking about it.

While I appreciate that Lauten later tried to apologize, to me it was a failure.

By not directly addressing her apology to the First Daughters (notice how her initial heartless critique was directly addressed to them, though?), Lauten ascribed “superhuman” qualities to them. Meaning, she didn’t consider how her comments might make them or other girls feel, bearing out what Adam Waytz and his research team revealed in a recent study about white attitudes toward blacks.

“Today, a subtler form of dehumanization of blacks persists, with powerful consequences; it increases endorsement of police brutality against blacks and reduces altruism toward blacks,” according to the paper published in Social Psychological and Personality Science.

It is no surprise to me that social media went apoplectic over the weekend upon learning what Lauten had done and how she handled it. It shows that the public has had it up to here with the nastiness of political discourse, especially when race, gender and sexuality are involved.

In her position as the spokeswoman (now former) for Representative Fincher, it was Lauten’s very job to be a communications expert, yet she proved incapable of reading the signs of the times and the particularly sensitive moment happening in this nation right now.

Lauten appears to be one of those women who vote against their own interests, mistaking proximity to the white power structure for real power.

It isn’t.

The lack of respect in her original Facebook post and the subsequent half-hearted apology was unforgivable and unforgettable. Regardless of what Lauten meant, her bad behavior is a reckoning moment for so many other things.

Now that Lauten has given up her job, perhaps she can spend more time reclaiming her own humanity — on her way to seeing ours.

Douglas is a journalist living in Chicago.


TIME politics

Does America Need More Than One President?

American flags
Getty Images

Two presidents would have a potent incentive to cooperate

The question comes to mind as we watch Barack Obama abandon cooperation in favor of strategies that bypass Republican obstruction on Capitol Hill. Obama’s approach is understandable, but turning to executive orders and other paths to one-party action will only aggravate the problem of political dysfunction. In the short run, Obama may be able to ram his preferences through, but he invites similar action by future Republican presidents. Instead of defusing partisan conflict, Obama will fuel its expansion.

For a solution to our high levels of partisan conflict, we would do well to learn from the legacy of Nelson Mandela. The late South African president understood a key principle for effective governance—if you want everyone to work together on behalf of the common good, you have to give everyone a meaningful voice in government. Mandela rejected one-party control and instead chose a politics of inclusion. As former President Bill Clinton observed about Mandela, that’s the only politics “that works.” Indeed, said Clinton, “it’s the only thing that’s working in American communities today.”

But we currently have a politics of exclusion in Washington, and that explains much of the dysfunction that prevents Congress and the president from solving the country’s pressing problems. Democrats and Republicans both are well represented on Capitol Hill, but only one party is represented in the White House.

It was not always a problem to have a one-party presidency, but over the past 75 years, the Oval Office has amassed an exceptional amount of power. Presidents control policy for air quality, energy exploration, education, health care, consumer protection and many other matters through agency regulations, executive orders, and other unilateral actions. Presidents dominate foreign policy even more. They decide when we go to war, which foreign governments we recognize, and which undocumented immigrants we deport. As Arthur Schlesinger, Jr., wrote, we now have an imperial presidency.

When a single person exercises the immense power of the modern presidency, people fight tooth and nail to secure that power. They spend billions to win the election, and they spend the ensuing four years positioning their party for the next election. One side of the aisle in Congress backs the president; the other side devotes itself to obstruction. Instead of responsible governance, we get the permanent campaign. It is no surprise that the sharp increase in partisan conflict has paralleled the huge expansion of presidential power.

The United States would do well to replace its one-person, one-party imperial presidency with a two-person, two-party presidency. Instead of electing the candidate with the most votes, we would send the top two finishers to the Oval Office. Presidential partners usually would come from the Democratic and Republican Parties, but they also could emerge from third parties.

By giving both sides of the political spectrum a voice in the executive branch, we would temper partisan conflict. Currently, half the public is shut out of the White House and turns readily to partisan opposition. A coalition presidency would represent the views of nearly all Americans. Hence, a much higher percentage of the public would be comfortable with executive branch initiatives. Even if legislators wanted to play partisan ball, they would not find a receptive electorate. There no longer would be a mass of disaffected voters to mobilize against the Oval Office.

Members of Congress from both sides of the aisle would have other reasons to cooperate with a bipartisan White House. For example, they could share in the credit for presidential achievements. During President Obama’s first term, Republicans recognized that even if they voted for the economic stimulus or health care reform, Democrats would receive all of the credit for the programs. GOP members of Congress could benefit politically only by opposing legislative initiatives from the White House and hoping the initiatives would be defeated or would fail after enactment. With bipartisan presidential proposals, both parties could share in the credit for success.

All members of Congress also would be in a better position to get help from the executive branch for their constituents. As I found during my service in the Indiana House of Representatives, legislators often do more for their districts by cutting through governmental red tape than by passing bills. But I also found that I could help my constituents with the executive branch only when it was headed by a governor of my own party. With a two-party executive, every member of Congress could find a receptive ear in the White House.

Shared power would promote better presidential decision-making. The Constitution envisions an executive who primarily implements policy decisions made by Congress. But the modern president has assumed much of the legislative branch’s policymaking authority. While it makes sense to have a single person who can act decisively and with dispatch when the person is an executor of policy made by others, the founding fathers correctly reserved policy making for multiple-person bodies. As Woodrow Wilson observed, “the whole purpose of democracy is that we may hold counsel with one another, so as not to depend upon the understanding of one man.”

Has shared governance ever worked? Experience with multiple executives is not very different from that with single executives. One-person presidential governments have fared well in some countries but poorly in others (e.g., in Eastern European, African and South American nations). Similarly, coalition executives have performed well in some countries, such as Switzerland and Austria, but not in others.

The key question is not whether to have shared governance but how the sharing should be structured. Game theory supplies a sound answer. If we gave two presidents equal power, we would give them the right incentives to cooperate. Elected officials may be highly partisan, but they are partisan for a purpose. In typical power-sharing settings, one person can hope to establish a dominant position by outmaneuvering the other person. In a properly designed coalition presidency, neither president could hope to prevail over the other president. During their terms, they would share power equally, and reelection also would come with half of the executive power.

Two presidents would have a potent incentive to cooperate. If they spent their terms locking horns, they would not be able to implement key policy goals. And having reached the pinnacle of political life, presidents care most about their legacies of achievement. Accordingly, they likely would come to accommodations that would allow them to implement meaningful policy changes. Presidential self-interest would prevent stalemate.

A two-person presidency also would be fairer to voters than a one-person presidency. Barack Obama exercises 100 percent of the executive power after winning only 51 percent of the popular vote. It makes much more sense to give Mitt Romney 50 percent rather than 0 percent of the executive power for his 47 percent support in November 2012.

While the founding fathers preferred a single executive in 1787, they likely would approve of a bipartisan executive today. They wanted the presidency to speak for everyone, not just members of a particular interest group or political party. The founding fathers also believed in radical reform. When their political system failed, they understood the need for major structural change. To restore the Constitution’s vision of a truly representative government, two presidents really would be better than one.

David Orentlicher is Samuel R. Rosen Professor at Indiana University Robert H. McKinney School of Law. A scholar of constitutional law and a former state representative, David also has taught at Princeton University and the University of Chicago Law School. He earned degrees in law and medicine at Harvard and specializes as well in health care law and ethics. He wrote this for Zocalo Public Square.




Supreme Court to Determine Workplace Pregnancy Protections for Moms-To-Be

The court will hear a discrimination case that seeks to make clear what accommodations employers must make to expecting mothers

Should a pregnant worker have the right to workplace accommodations, such as a chair to sit on as she works a cash register or more frequent bathroom breaks during her job as a call center operator?

The Pregnancy Discrimination Act of 1978 was supposed to make the answers to those questions—in both instances—crystal clear. Congress passed it to overturn the Supreme Court’s 1976 decision that pregnancy discrimination is not sex discrimination under Title VII of the Civil Rights Act of 1964.

But over the years, employers have reached differing conclusions about how the Act’s language should be interpreted—specifically the line that says employers must treat pregnant women the same as “other persons not so affected [by pregnancy] but similar in their ability or inability to work.” Some companies have read that phrase to mean that they must meet the needs of pregnant women the same as they would meet the needs of any other worker who’s similarly physically restricted. But other employers believe that so long as their policies are pregnancy-neutral—which often means considering pregnancy the same way they would an off-the-job injury that garners no special treatment—they’re in the clear.

United Parcel Service abided by the latter interpretation in 2006, when it denied former truck driver Peggy Young’s request for light duty during her pregnancy, which forced her into unpaid leave. On Wednesday, the Supreme Court will hear Young’s case and ultimately rule on what accommodations employers must make under the Pregnancy Discrimination Act, a decision that could touch the lives of the 68 million working women in the U.S. and the 62% of new moms in the last year who were part of the workforce.

“This case is of particular importance because so many working women are now working well into their pregnancy,” says Katherine Kimpel, a lawyer at Sanford Heisler who specializes in gender and race discrimination and who filed an amicus brief in the case supporting Young. In the U.S., 65% of working first-time mothers stayed on the job into the last month of their pregnancy, Kimpel says. Among full-time workers, that figure surges to 87%.

All the while, pregnancy discrimination cases are on the rise. In fiscal year 2013, 5,342 pregnancy discrimination charges were filed with the Equal Employment Opportunity Commission and state and local Fair Employment Practices agencies, up from 3,900 in 1997. “For those reasons, how employers think about accommodating pregnancy really matters,” Kimpel says.

Peggy Young started working for UPS in 1999; in 2002, she took on a part-time role as a truck driver, picking up air shipments. Four years later, she took a leave of absence to receive in vitro fertilization. When she became pregnant and a midwife instructed her not to lift packages over 20 pounds, Young asked to return to UPS to do either light duty or her regular job as a truck driver, which seldom required her to lift heavy boxes. According to Young’s Supreme Court petition, her manager told her that UPS offered light duty to workers who sustained on-the-job injuries, employees with ailments covered by the Americans With Disabilities Act, and those who had lost Department of Transportation certification because of physical ailments like sleep apnea; not—the manager said—to pregnant workers. UPS wouldn’t allow Young to return to her former role either, since her lifting restriction made her a liability. As a result, Young was required to go on extended, unpaid leave, during which she lost her medical coverage.

Young sued UPS in October 2008 for allegedly violating the Pregnancy Discrimination Act since the company failed to provide Young with the same accommodations it gave to employees who were not pregnant but equally unable to work. Young has lost the two previous rulings in the case. A district court decided in February 2011 that UPS’s decision not to accommodate Young was “gender-neutral” and ruled in the company’s favor. The Fourth Circuit Court of Appeals later affirmed that decision, ruling UPS had established a “pregnancy-blind policy.”

Since the Supreme Court decided to hear the case in July, UPS has announced changes to its policy for pregnant workers. Next year, it will offer temporary light duty to pregnant workers who need it. Despite that reversal, UPS maintains that its denial of Young’s light duty request was lawful at the time and that its policy change is voluntary and not required by the Pregnancy Discrimination Act. The Chamber of Commerce filed an amicus brief supporting UPS, calling attention to companies that offer pregnant employees “more than what federal law compels them to provide.”

Young, meanwhile, has received support from across the political spectrum. Pro-life organizations as well as groups like the American Civil Liberties Union have filed briefs backing Young and calling on the high court to rule in favor of workplace accommodations for expecting mothers.

The justices will hear Young’s case nearly six months after the EEOC issued new guidelines to employers on how to treat pregnant workers amid the increase in bias complaints.

“There are lots of women like Peggy Young who need temporary changes at work during pregnancy and too often, even if employers are routinely accommodating disabled workers, pregnant workers are pushed out to unpaid leave or fired,” says Emily Martin, vice president and general counsel of National Women’s Law Center. “This case is really about whether pregnant women will continue to be asked to make the impossible choice between their jobs and their health.”

This article originally appeared on Fortune.com

TIME politics

Why Thousands of Washingtonians Loved Marion Barry

Marion Barry Discussed his New Autobiography and Met With Locals
D.C. Council member Marion Barry discussed his new autobiography, "Mayor for Life: The Incredible Story of Marion Barry, Jr." during an event hosted by the Washington Informer at the Old Congress Heights School in S.E. Washington. The Washington Post—The Washington Post/Getty Images

Vincent C. Gray is the Mayor of the District of Columbia.

The former mayor began his career as a civil rights activist and helped build the District’s black middle class

When Marion Barry, Jr. passed away last Sunday at the age of 78, many likely took little more note than to say “good riddance.” After all, most of the nation knew about the so-called “Mayor for Life” primarily from his appearances in national headlines after getting arrested, while in office, for smoking crack cocaine–or for his last term in office, when a congressionally appointed Control Board took over our city’s finances and many of our agencies.

But there was a vastly different side to Barry – a side that endeared him to tens of thousands of Washingtonians over the years and a side that became his greatest political asset, enabling him to win redemption again and again at the ballot box.

Barry cut his political teeth early on, as a civil-rights activist and a leader in the Student Nonviolent Coordinating Committee. There, he organized sit-ins and got young people involved in creating a new future for African Americans. Having been born into a sharecropper’s family in the sweltering oppression of Jim Crow-era Mississippi, he knew well the daunting height of the barriers to advancement and success faced by African Americans.

When Barry came to Washington in 1965 to work for SNCC, he saw a city that, in many ways, was every bit as segregated as the Mississippi of his childhood. Moreover, he found a majority-black city that was ruled not by its residents, but by a Congress in which its residents didn’t even have a voting voice. And that Congress had generally delegated oversight of the District to its most conservative white Southern members.

Marion Barry had found the place where he would make his mark, first as an activist for better relations with police and better employment opportunities for African Americans in the District. He got elected to the school board, and then finally, after the advent of Home Rule in the District, to the first popularly elected D.C. Council.

In his first term as mayor, he achieved some truly remarkable successes. He helped get the city’s chaotic finances under control, helped turn our Metropolitan Police Department into an agency whose officers are much more reflective of the population they serve than many other police agencies around the country, helped build the District’s black middle class through a groundbreaking program that required a share of city business to go to minority-owned enterprises and created a summer employment program for the District’s youth.

I still meet people who remember getting their first job because of a program that Mayor Barry started. And it’s this Marion Barry – the one who fought courageously for fairness and justice for much of his career – that those of us who saw him at his best choose to remember.

It may surprise many who aren’t keenly aware of the District’s history that Marion Barry won his first mayoral term largely by relying on upper-middle-class white voters and good-government advocates. Despite an unfortunate later choice to oppose marriage equality, Mayor Barry was one of the earliest elected officials to openly embrace the LGBT community, passing one of the nation’s earliest LGBT-inclusive non-discrimination laws. His legacy is so much more complex and wide-reaching than that fateful drug sting would reflect.

I knew Barry for years, and one anecdote leaps to mind as an example of his true character. I once served as executive director of what was then known as the Association for Retarded Citizens (now called The Arc of D.C.). One of our key advocacy goals was to move people with intellectual disabilities from an inhumane institution named Forest Haven to community living. There was fierce opposition in many neighborhoods to group homes, bolstered by some of the most egregious myths imaginable. One evening, I was with Mayor Barry in an affluent community where the District was preparing to establish a home. Nearly 200 people showed up for this meeting, with one purpose – to stop this home from opening.

Once Barry had concluded his presentation, a man rose and began to pepper him with questions. When it became apparent that the man’s inquiries had no constructive purpose, Barry said, “You really don’t want any answers, do you? If you want to talk about how we make this work, I will stay with you all night. Otherwise, I have nothing else to say to you!” It was vintage Barry – standing up for disadvantaged people who could not effectively fight for themselves. The meeting ended uneventfully – and the home soon opened and proved a huge success.

And so, like many other Washingtonians, I choose to remember Marion Barry by remembering his lifelong commitment to building up our city and freeing it from congressionally imposed shackles. I hope that history will remember, and honor, his virtues and successes – because they far outnumber his failures and foibles.



TIME Ferguson

Obama to Order More Oversight of Military Surplus Going to Local Police

Ferguson Military Police
Police in riot gear stand in line opposing protesters in Ferguson, Mo. on Nov. 28, 2014. Jim Vondruska—Xinhua Press/Corbis

President will also expand use of body cameras

President Barack Obama is preparing to issue an executive order calling for additional oversight of various federal programs that provide military surplus equipment to local law enforcement agencies, senior administration officials said Monday, but will stop short of banning the transfer of heavy gear to police forces.

The government’s transfer of surplus equipment came under scrutiny this summer after protests in Ferguson, Mo., over the shooting death of Michael Brown by police officer Darren Wilson. Obama ordered a review of the government’s programs in August after clashes between heavily armed officers and both peaceful and violent protesters. The president is discussing those findings Monday afternoon in a Cabinet meeting, officials said, where he will order his aides to draft an executive order seeking to standardize procedures for the five government agencies that support local law enforcement acquisitions.

But beyond calling for additional oversight, community engagement, and training, Obama will not act to curtail the transfer of military-style weapons and vehicles to local law enforcement, officials said, adding that wasn’t even the subject of their review. “Ultimately these were programs that were authorized by Congress, and so congressional intent is really at issue here,” one senior administration official told reporters before Obama’s announcement on the condition of anonymity.

Obama will also announce a three-year, $263 million package to increase the use of police body-worn cameras and expand local law enforcement training. The effort, modeled after a similar program that provides bullet-proof vests for officers, would direct $75 million over three years to the “Body Worn Camera Partnership Program.” Administration officials said it would provide a 50 percent match for body-camera purchases by state and local agencies, enough for 50,000 new cameras. Officials said they hope to secure about $70 million in funding for the effort as part of a government funding deal that must be reached in the coming two weeks.

The review of law enforcement acquisitions programs found “a lack of consistency in how federal programs are structured, implemented and audited,” the White House said. Obama’s order will direct the Departments of Defense, Treasury, Justice, and Homeland Security, and the Office of National Drug Control Policy, which all provide or support the acquisition of equipment, to work together to unify standards, including developing a consistent list of military-style equipment that can be purchased by local law enforcement. The White House said the agencies will consider whether to require civilian review and authorization of purchases of “controlled equipment”—like armored vehicles, weapons, and aircraft—before transfers are authorized, as well as whether to require specific training and use-of-force guidelines in place before those purchases are completed. Obama will order the agencies to develop their specific recommendations within 120 days after consultations with law enforcement, community, and civil rights stakeholders.

Additionally, the administration is considering requiring the filing of after-action reports for federally-provided or -funded equipment involved in significant incidents.

On a call with reporters, officials defended the Pentagon’s 1033 Program, a key vehicle by which local agencies are provided military surplus weapons and vehicles and the program that came under the most intensive scrutiny this summer. They said 96 percent of the equipment transferred through the program, including surplus office supplies, doesn’t have military attributes. It was unclear whether that figure was determined by the purchase price of the items or by the raw count of what was transferred. According to the Pentagon’s Law Enforcement Support Office, $450 million worth of property, by original purchase price, was transferred to local agencies in 2013.

“We found that in many cases, these programs actually serve a very useful purpose,” White House Press Secretary Josh Earnest said Monday. “And what is needed, however, is much greater consistency in oversight of these programs, primarily in how these programs are structured, how they’re implemented and then how the programs themselves are audited.”

An official added that the White House has no opinion on congressional efforts to ban the transfer of Mine-Resistant Ambush Protected Vehicles and other military-style gear to law enforcement agencies. “We haven’t reviewed any specific legislation so I don’t have a specific position for you,” the official said.

Obama is also announcing his intent to issue an executive order creating a Task Force on 21st Century Policing, chaired by Philadelphia Police Commissioner Charles H. Ramsey and Laurie Robinson, a professor at George Mason University and former Assistant Attorney General for the Justice Department’s Office of Justice Programs. The task force will report its recommendations for reducing crime and improving trust between officers and the communities they serve to the president within 90 days.

TIME politics

The Woman Who Broke the U.K.’s Parliamentary Gender Barrier Wasn’t Even Trying

Lady Nancy Astor in Plymouth, England, in November of 1923 Gill / Getty Images

Dec. 1, 1919: American-born socialite Lady Astor is sworn in as the first female member of the British House of Commons

Lady Astor was an unlikely candidate to break the gender barrier in the U.K. Parliament. For one thing, she wasn’t British; for another, she wasn’t a suffragist. She took her seat in the House of Commons on this day, Dec. 1, in 1919, after running for her husband’s vacant spot when he was given the title of Viscount and elevated to the House of Lords. (She was the second woman to have been elected to the House of Commons, but the first to accept the position.)

She barely wanted the job, according to her election pamphlet. At times she seemed to go out of her way to alienate voters, as when she ended a campaign speech in front of a working-class crowd by saying, according to the New York Times, “And now, my dears, I’m going back to one of my beautiful palaces to sit down in my tiara and do nothing, and when I roll out in my car I will splash you all with mud and look the other way.”

But Nancy Astor had a flair for upending expectations. The Virginia native, whose father was a tobacco auctioneer, ascended to the upper crust of the British aristocracy but never lost her frank, outspoken manner or her earthy sense of humor. The latter was even more jarring when combined with her conservative politics: she was a strict teetotaler and a staunch anti-socialist.

History does not cast her as a particularly influential MP. Although she was re-elected seven times before retiring in 1945, the Times notes, “she accomplished nothing more noteworthy than the forcing through of a bill barring teenagers from entering pubs.”

Still, her witticisms made waves. Her sharp tongue could get her in trouble, but its overall effect was, if not endearing, then at least entertaining. Per the Times, “…she was capable in the House of Commons of doing anything from whistling to calling a fellow member a donkey.”

If she hadn’t pursued politics, Astor could have had a promising career as an insult comic. Her best lines became known as Astorisms, and they tended to take harsh aim at her rivals as well as her friends. According to her 1964 obituary in TIME, her favorite targets included “fellow politicians, her fellow rich (“The only thing I like about them is their money”), Communists, Socialists, Nazis, Yankees, liquor manufacturers, newspapers (her husband’s family owned two), antifeminists, the cult of the Common Man.”

Sometimes her attacks were personal — and borderline cruel. After voting to oust Prime Minister Neville Chamberlain, an old friend and former political ally, she famously said, “Duds must be got rid of, even if they are one’s dearest friends.”

Her vote against Chamberlain helped pave the way for Winston Churchill to take office, but she had few kind words for him either. Churchill was, at least, her equal in trading barbs. In one exchange, she is said to have told him, “If I were married to you, I’d put poison in your coffee.” He replied, “If I were married to you, I’d drink it.”

Read the full obituary for Lady Astor here in the TIME Vault: The Ginger Woman

TIME psychology

How Memory Links the Presidency, Ferguson and the Cosby Mess

Do you know me? Relax, you're not alone.

Jeffrey Kluger is Editor at Large for TIME.

The human brain forgets much more than it remembers, and that has an impact on history, criminal justice and more

Here’s a difficult one, history buffs: Who was Harry Truman? I know, I know, I told you it would be tough, but think hard: Some famous general? Maybe a physicist?

If you guessed U.S. president, good for you! And if you also knew that Truman was the one who came right after Roosevelt (Franklin, that is) and right before Eisenhower, go to the head of the class.

OK, so maybe remembering Truman isn’t such a big deal. But here’s the thing: By 2040, according to a new study just published in Science, only 26% of college students will remember to include his name if they are asked to make a list of all U.S. Presidents, regardless of order.

That finding, which is less a function of historical illiteracy than of the mysterious ways the human brain works, reveals a lot about the perishability of memory. And that, in turn, has implications for contemporary dramas like the Ferguson tragedy, the Bill Cosby mess and the very underpinnings of the criminal justice system.

The Science study, conducted by a pair of psychologists at Washington University in St. Louis, was actually four studies that took place over 40 years—in 1974, 1991, 2009 and 2014. In the first three, the investigators asked groups of then-college students to list all of the presidents in the order in which they served, and also to list as many of them as they could by name regardless of where they fell in history.

In all three groups over all three eras, the results were remarkably similar. As a rule, 100% of respondents knew the president currently serving, and virtually all knew the prior one or two. Performance then fell off with each previous presidency. Roughly 75% of students in 1974 placed FDR in the right spot, for example. Fewer than 20% of Millennials—born much later—could do that. In all groups, the historical trail would go effectively cold one or two presidents before the subjects’ birth—falling into single digits.

There were exceptions. The Founding Father presidents, particularly the first three—George Washington, John Adams and Thomas Jefferson—scored high in all groups. As did Abraham Lincoln and his two immediate successors, Andrew Johnson and Ulysses S. Grant. As for the Tylers and Taylors and Fillmores? Forget about them—which most people did. The pattern held again in a single larger survey conducted in 2014, with a mixed-age sample group that included Boomers, Gen X’ers and Millennials, all performing true to their own eras.

Almost none of this had to do with any one President’s historical relevance—apart from the Founding Fathers and Lincoln. James Polk’s enormously consequential, one-term presidency is far less recalled than, say, Jimmy Carter’s much less successful four-year stint. Instead, our memory is personal, a thing of the moment, and deeply fallible—and that means trouble.

One of the most disturbing aspects of the Ferguson drama is the mix of wildly different stories eyewitnesses presented to the grand jury, with Michael Brown portrayed as anything from anger-crazed aggressor to supine victim. Some witnesses may have been led by prosecutors, some may have simply been making things up, but at least some were surely doing their best, trying to remember the details of a lethal scene as it unfolded in a few vivid seconds.

If forensic psychology has shown anything, it’s that every single expectation or bias a witness brings to an experience—to say nothing of all of the noise and press and controversy that may follow—can contaminate recall until it’s little more reliable than that of someone who wasn’t there at all.

Something less deadly—if no less ugly—applies in the Bill Cosby case. In an otherwise reasonable piece in the Nov. 25 Washington Post, columnist Kathleen Parker cautions against a collective rush to judgment and reminds readers that under the American legal system, Cosby is not a rapist, but an alleged rapist; and his victims, similarly, are as yet only alleged victims. Fair enough; that’s what the criminal justice rules say. But then, there’s this:

“…we have formed our opinions… only on the memories of the women, most of whom say they were drugged at the time. Some of them have conceded that their recollections are foggy—which, of course they would be, after decades and under pharmaceutically induced circumstances, allegedly.”

In other words, if Cosby did drug them, then perhaps we must throw their testimony out of court because, um, Cosby drugged them. Talk about the (alleged) criminal making hay on his crime. And yet, when it comes to the science of memory, that’s an argument that could work before a judge.

Finally, too, there is the unseemly business of Ray Rice. Virtually nobody who knows what he did has forgotten it—which is what happens when you’re a massively strong athlete and you cold-cock a woman. But it was the complete elevator video actually showing the blow, as opposed to the earlier one in which Rice was seen merely dragging the unconscious body of his soon-to-be-wife out into a hotel hallway, that spelled his end—at least until his lifetime NFL ban was overturned on Nov. 28. Knowing what happened is very different from seeing what happened—and once you saw the savagery of Rice’s blow, you could never unsee it.

When it comes to presidents, the fallibility of memory can help. In the years immediately following Richard Nixon’s resignation, it was a lot harder to appreciate his manifest triumphs—the Clean Air Act, the opening to China—than it is now. George W. Bush is enjoying his own small historical rebound, with his AIDS in Africa initiative and his compassionate attempt at immigration reform looking better and better in the rear-view mirror—despite the still-recent debacles of his Presidency.

We do ourselves a disservice if we hold historical grudges against even our most flawed presidents; but we do just as much harm if we allow ourselves to forget why ill-planned land wars in countries like Iraq or cheap break-ins at places like the Watergate are so morally criminal. Forget the sequence of the Presidents if you must, but do remember their deeds.


TIME politics

Robert F. Kennedy: Rare and Classic Photos of an Undaunted Man

Of the three Kennedy brothers -- John, Robert and Edward -- Bobby best embodied the contradictions at play within that famed family

Of the three Kennedy brothers — John, Robert and Edward — who ascended to the national political stage in the 1950s and ’60s, it was arguably the middle brother, Bobby, who best embodied the enormous contradictions at play within that famed (and, it sometimes seems, cursed) American family.

There was, for example, RFK’s fraught relationship with liberals — and with American liberalism in general. As the author and historian Sean Wilentz once wrote while reviewing a largely unflattering biography of Kennedy in the New York Times:

Robert F. Kennedy always irked liberals; and they always irked him. . . . Kennedy’s association with the reckless Sen. Joseph McCarthy in the 1950s forever tainted his reputation in some reform circles. As his brother’s presidential campaign manager in 1960, and thereafter as attorney general, he struck many liberals as ruthless in the pursuit of power and reluctant in the pursuit of principle, especially regarding civil rights. Kennedy, for his part, regarded his liberal critics as hopeless, sanctimonious losers who put purity above political realism, and who seemed to think that sure-fire defeat was inherently noble.

That Bobby Kennedy was, like his brothers and many of his other relatives, past and present, a titanically driven individual is hardly news. There’s a reason, after all, that he’s still despised today, five decades after his death, by some liberals and most conservatives: he did not fit into a neat, ideological box and — then as now — neither side knew what to do with a man who refused to act and speak according to their expectations and their rules.

Then there was his relationship with Lyndon Johnson — a man who, according to virtually everyone who knew both men, hated Bobby Kennedy with an intensity matched only by RFK’s loathing for his brother’s successor as president.

But Kennedy also had an intellectual and — in public, at least — an emotional poise that makes most present-day American politicians seem glib and trifling by comparison. (Is there a sitting U.S. senator or representative whom one can picture quoting Herodotus or Sophocles, from memory, as Kennedy so often did?)

Of course, like his brothers — especially John — Robert Kennedy was also able to immediately and powerfully connect with crowds in a way that most politicians can only envy, and there were certainly people who saw greatness in him and in his future.

“He is one of the half-dozen men in the country today qualified for top political leadership,” one of Lyndon Johnson’s advisers told LIFE writer Robert Ajemian. “He really cares about right and wrong. He cares about people.”

Here, LIFE.com shares photos — most of which never ran in LIFE magazine — of Kennedy and his extended and immediate family in 1964. The pictures, by LIFE’s George Silk, capture a man who, as Robert Ajemian wrote in the magazine’s July 3, 1964, issue, “had shouldered massive burdens” in the six months since his brother John was gunned down in Dallas the previous November.

A major preoccupation of Bob Kennedy’s in the past six months [Ajemian wrote] has been his family — and now it includes his brother’s children, Caroline, who is 6, and John, who is 3. Jackie Kennedy brings them out almost every day to their uncle’s home, Hickory Hill, five miles outside Washington. Bob and [his wife] Ethel spend as much time with them as with their own brood of eight. “They think of it as their own home,” says Jackie Kennedy. “Anything that comes up involving a father, like father’s day at school, I always mention Bobby’s name. Caroline shows him her report cards.”

But even surrounded by so many loved ones, and so busy with speeches and appearances around the country, the rawness of the loss of his older brother was, it seems, never far away. After a speech in Pittsburgh, a reporter asked Kennedy, “What do you miss most about your brother?”

“Kennedy looked startled,” Ajemian reported, “and stared at the reporter as he sought the exact answer. His face softened and he said, ‘Just that he’s not here.'”

Four months after the LIFE cover story, Robert F. Kennedy was elected as the Democratic U.S. Senator from New York. He served until June 6, 1968, when he was assassinated by a gunman named Sirhan Sirhan, while campaigning in Los Angeles for his party’s presidential nomination. Robert Kennedy was 42 — four years younger than John Kennedy was when he was killed.

Liz Ronk, who edited this gallery, is the Photo Editor for LIFE.com. Follow her on Twitter at @LizabethRonk.
