TIME Opinion

The Myth of Neutrality in American Sporting Culture

Gold medalist Tommie Smith (C) and bronze medalist John Carlos give the black power salute on the podium during the 1968 Olympic Games. Rolls Press/Popperfoto—Getty Images

As you sit comfortably in front of your television before the championship game, take a moment to think about the images of militarism associated with the contest

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Sport is supposedly an avenue of escape in American culture where fans can seek refuge from the serious political concerns of the day. Ridicule is often reserved for athletes, invariably on the political left, who violate the neutrality of sport by introducing elements of protest and controversy into the games. For example, 1968 Mexico City Olympic medalists Tommie Smith and John Carlos raised their fists in a black power salute and were expelled from the games by the American Olympic Committee. Muhammad Ali refused conscription into the military during the Vietnam War, citing his religious convictions and proclaiming that he had no arguments with the Viet Cong. In response, Ali was stripped of his heavyweight boxing crown and was unable to earn his living as a boxer until the Supreme Court overturned his conviction. In more recent years, baseball player Carlos Delgado refused to acknowledge the singing of “God Bless America” in many ballparks following 9/11. On the basketball court, Mahmoud Abdul-Rauf (formerly Chris Jackson) of the Denver Nuggets was suspended in 1996 by the National Basketball Association for refusing to stand during the playing of the National Anthem before reaching a compromise that allowed him to stand and pray with his head lowered during the song. A similar controversy occurred in November 2014 when Dion Waiters of the Cleveland Cavaliers refused to join his teammates for “The Star Spangled Banner,” asserting that his Islamic faith made it difficult for him to honor a nation in arms against his faith. Waiters retracted his statement, but he was traded and now plays for the Oklahoma City Thunder.

In more recent days, the supposed tranquility of American sport has been disrupted due to concerns with violence perpetrated by the police against black men. In protest of the shooting of Trayvon Martin in 2012 by George Zimmerman, LeBron James and Dwyane Wade got their Miami Heat teammates to join them in a photograph with all the players donning hoodies similar to the one Trayvon Martin was wearing when he was killed. Following the failure of a grand jury in late November 2014 to indict policeman Darren Wilson for the shooting and killing of unarmed black teenager Michael Brown in Ferguson, Missouri, football players from the St. Louis Rams entered the playing field with their hands up—a move of solidarity with protestors asserting that Brown was shot while trying to surrender. In addition, both professional basketball and football players, primarily black, appeared in pregame workouts wearing shirts containing the slogan “I Can’t Breathe”—a reference to the failure of a New York City grand jury to indict a city policeman for the choking death of a black man, Eric Garner, for allegedly selling untaxed cigarettes. Garner was noncompliant but not violent before he was taken down by police, complaining that he could not breathe. In the wake of the Brown and Garner killings, protests were held on the streets and playing fields of the nation.

After the assassination of two New York City police officers in December 2014, the protest against police brutality has largely disappeared from America’s games, and sport has apparently resumed its status as escapist entertainment free from divisive political concerns such as inequality and racism. Thus, as we enter the championship season of football at the professional and collegiate levels, controversy is avoided as we begin the games with military jets flying over stadiums, military honor guards, public recognition for veterans and current members of the military along with their families, fireworks and cannons, the unfurling of gigantic American flags, and the performance of patriotic songs such as “God Bless America” and “The Star Spangled Banner.” These opening game ceremonies are usually accompanied by television broadcasts in which the networks, sponsors, and announcers proclaim that we are able to enjoy the games in the comfort of our own homes because we are under the protection of military personnel stationed in over 700 outposts scattered around the globe. Invariably, we are shown enthusiastic soldiers watching the game at American outposts in areas such as Afghanistan. These images of American exceptionalism linking militarism, patriotism, and the global reach of American empire with the nation’s sporting culture are, of course, considered apolitical. The careful manipulation of patriotic and military symbols in support of consumer culture and advertising dollars obscures a political agenda in favor of capitalism, militarism, and empire at the expense of more humanitarian values.

While honoring veterans, many of whom have been severely injured in their service, why can we not also celebrate those peace activists who have dedicated their lives to eradicating the scourge of war from the planet? Rather than quarantining them, why not honor the health care workers and organizations such as Doctors Without Borders who have risked their lives to battle Ebola in Africa? Why not honor the heroes of the Civil Rights Movement and those continuing the struggle for racial equality in this country? Why not commemorate the environmentalists and scientists fighting the threat of global warming? Why not proclaim the everyday heroism of teachers and workers alongside police, firemen, and soldiers? The answer to these questions is that we have allowed the realm of sport to become the playground of vested interests and the status quo. Issues of racism, economic inequality, police brutality, environmental damage, gun control, and antiwar activities are perceived as controversial, political, and divisive, while militarism, empire, and American exceptionalism are construed as fundamental values above debate or questioning.

Accordingly, our games and circuses are orchestrated to enforce the political status quo, and the context in which we play our games is hardly neutral. As you sit comfortably in front of your television before the championship game, take a moment to think about the images of militarism associated with the contest. Athletes are only considered political when they speak out on racial, social, and economic issues, but they need to recognize that they are regularly employed by the government and corporate America in support of militarism, capitalism, and empire. The games are hardly neutral, and fans, as well as athletes, should stay vigilant and aware of how sport is used. Rather than simply embracing militarism, exceptionalism, and empire, as citizens and athletes we should support a higher patriotism in which we do not blindly follow the flag into military expansionism, but rather insist that we honor America by assuring that the nation adheres to its founding principles. As the Scottish migrant Frances Wright proclaimed in 1824, in what is believed to be the first Fourth of July oration by a woman, patriotism is “employed to signify a lover of human liberty and human improvement rather than a mere lover of the country in which he lives, or the tribe to which he belongs. . . . A patriot is a useful member of society, capable of enlarging all minds and bettering all hearts with which he comes in contact; a useful member of the human family, capable of establishing fundamental principles and of merging his own interests, those of his associates, and those of his nation in the interests of the human race.” A truly neutral playing field.

Ron Briley reviews books for the History News Network and is a history teacher and an assistant headmaster at Sandia Preparatory School, Albuquerque, New Mexico. He is the author of “The Politics of Baseball: Essays on the Pastime and Power at Home and Abroad.”

TIME Opinion

How Caitlin Stasey’s NSFW Website Is Moving Feminism Forward

Actress Caitlin Stasey attends the 25th Annual GLAAD Media Awards at The Beverly Hilton Hotel in Los Angeles on April 12, 2014. Jason Merritt—Getty Images

"Our feminism is nothing less than freedom," the 24-year-old CW actress says

Australian actress and CW star Caitlin Stasey recently launched a website, Herself.com (NSFW), featuring interviews with ordinary women (aka non-celebs) about topics like masturbation, pornography, reproductive rights, sexual identity and the female body. The site is the latest addition to the ongoing conversation about women and the media, which in recent years has stretched from the #freethenipple campaign (which advocates women’s right to go topless in public) to Dove’s “Real Beauty” advertisements to Beyonce’s 2014 VMA performance.

Herself is a much-needed feminist tribute to both the minds and bodies of women, and there’s an abundance of nudity featured on the site — hardly a surprise given Stasey’s outspoken stance on female sexuality. In fact, naked women are the first thing visitors to Herself.com see. Sure, the site’s thought-provoking interviews may get glossed over by viewers focusing on the subjects’ naked bodies, but that overt nakedness is giving a different attitude toward the female body the push it needs in mainstream media.

“A woman rarely gets the opportunity to just live in herself, as herself, a fully autonomous, self-determining human being. I wanted to give women the chance to reclaim that for themselves,” Stasey said in a recent interview. “The fact that it is a matter of controversy that a woman should choose to be nude without it being an act of sex just shows how backward and oppressive our society is. It shouldn’t matter. It shouldn’t warrant questioning.”

By overtly displaying the female form, Herself has the potential to expose its subjects to objectification, and the site has garnered some criticism on social media. But major publications such as Glamour and People are eager to praise Stasey’s uncensored tribute to women of all kinds.

Stasey encourages readers to “witness the female form in all its honesty without the burden of the male gaze.” The initial buzz surrounding Herself led to the site trending on Facebook with the description “actress launches website for women featuring personal interviews and nude photos,” basically guaranteeing men will visit. Yet even allowing for the fact that she’s exposing herself to the male gaze, Stasey remains unfazed.

“I sincerely hope men come to the site and learn to see how women present themselves, how women look when they are not merely objects, but the living, breathing subjects of their own stories,” she says. “This is a site made by a woman for women about women,” Stasey adds.

Going full-frontal in the name of feminism isn’t without its controversies, but the women in Herself are showcasing their own bodies on their own terms. Not only that, but they’re reaching a new generation of teen readers thanks to the 24-year-old actress’ star role on the CW’s Reign. Ultimately Herself gives women the opportunity to decide whether they want their bodies showcased online — and no one should judge them for that.

“If [feminism] is anything less than a woman getting to live her life proudly and freely, then I probably wouldn’t march for it,” says Stasey. “If you think feminism is telling other women how to live, then we’ll never agree, and I’m fine with that. Our feminism is nothing less than freedom.”

Herself is about as safe as it gets for like-minded feminists wanting to share their beliefs, fears and hopes, as well as for Stasey’s teen fans who are just becoming aware of their bodies, sexuality and sexual identity. And for someone visiting the site for the nude pics? They’ll definitely learn something while they’re at it.

TIME Opinion

Dove Really, Really Wants These Little Girls to Accept Their Curls

Hair acceptance is the new body acceptance

Dove has moved on from curve-acceptance to curl-acceptance.

The beauty company’s newest campaign continues its body-positive messaging by focusing on curly-haired girls who wish they had straight hair. The little girls in this new ad are sad because they only see straight hair in advertisements and commercials! Dove claims research shows only 4 in 10 girls with curly hair think their hair is beautiful. And nobody with un-beautiful hair could possibly have a shred of happiness in their lonely little lives.

Until… they get pulled outside by their curly-haired mommies (who are dancing in public, ugh STOP IT mo-om!) and taken to a top-secret location where they have to cover their eyes for a surprise. No, there’s not a pony in there. Or a private Taylor Swift concert. Instead, when they open their eyes, every single curly-haired person they’ve ever met shouts at them: “We all love our curls!”

MORE: Hey Dove, Don’t ‘Redefine Beauty,’ Just Stop Talking About It

Instead of shrieking in terror, the girls join in and it becomes a big dance party where everybody’s curls are bouncing with a special spring that says “empowerment,” and “acceptance” and “buy Dove products.”

TIME Opinion

Should the Federal Government Be in the Business of Policing History?

President Lyndon Johnson hands a souvenir pen to the Reverend Martin Luther King Jr. after signing the Voting Rights Bill at the U.S. Capitol, Washington, D.C., in 1965. PhotoQuest/Getty Images

Defenders of LBJ are less interested in history than in hagiography

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

Mark Updegrove, the federal director of the Lyndon Baines Johnson Presidential Library & Museum in Austin, Texas, is one of the instigators of the current backlash against Selma, the widely-praised film that depicts a crucial series of events in the Civil Rights Movement. Leaving others to engage in the historical debate about the film’s portrayal of LBJ, I would like instead to examine the campaign to discredit the film based on that portrayal. Waged by those intent on protecting and promoting Lyndon Johnson’s image, the efforts are part of a larger trend to use presidential libraries in ways far outside their initial objectives and Congressional intent, and to hire “legacy managers” rather than credentialed archivists and historians to run them.

Updegrove, who also serves, ex-officio, as a trustee of the Lyndon Baines Johnson Presidential Library Foundation, began the wave of criticism in an article last month in Politico (which is published by Robert Allbritton, another trustee of the LBJ Foundation). Updegrove wrote that the film’s “mischaracterization” of LBJ “matters now” because “racial tension is once again high” and that “it does no good to bastardize one of the most hallowed chapters in the Civil Rights Movement by suggesting that the President himself stood in the way of progress.”

A few days later, former LBJ White House aide Joseph A. Califano, Jr. – also a trustee of the LBJ Foundation – in an angry op-ed in the Washington Post (which is published by Politico co-founder Fred Ryan, chairman of the board of the Ronald Reagan Presidential Library Foundation) claimed that the Selma marches actually were Johnson’s idea. While this notion has been labeled false and outrageous by, among others, historian Peniel Joseph, in an illuminating NPR piece, the clamor may harm the film’s reputation, business, and, reportedly, its chances during the upcoming awards season.

From the significant, apparently coordinated endeavors of Updegrove, Califano, and others – and the negative attention they have brought to bear on an otherwise broadly-lauded work – it would seem as if, to them, Johnson was, and is, the point. But, like the movement as a whole, Selma the movie is not, and Selma the historical events were not, about Lyndon Johnson. By trying to make them about LBJ, and by rigorously policing any negative representations of him, those entrusted with managing the legacy of our nation’s 36th president reveal the motivations of the private organizations that build, donate, and utilize presidential libraries for their own purposes.

This manufactured controversy sadly diverts proper attention from the film and its powerful message. It also underscores the main theme of my upcoming book, The Last Campaign: How Presidents Rewrite History, Run for Posterity & Enshrine Their Legacies. In the book, I explore the lengths to which former chief executives, their families, supporters, and foundations go in order to, as in a campaign, present only the most positive – while ignoring all of the negative – elements of a president’s life, career, and administration. Instead of selling a candidate for office, they’re selling an image for posterity. And as in a presidential campaign, image is more important than substance; the reality is more complicated – and less heroic – than the image-makers would have us believe. That doesn’t prevent them from rewriting history, and waging a concerted, and, at times, aggressive, campaign to rectify what they consider to be misrepresentations of their president.

Selling that image takes more than cheery messaging; it also requires the elimination of anything that may harm what often is a fragile narrative, based more on admiring rhapsodies than documented, historical facts. And like a campaign communications staff, members of the late president’s team feel they must hit back, hard, at criticism, negative facts, or even personal opinions that even slightly deviate from the message they have carefully crafted.

To Updegrove, the suggestion that the man whose legacy he was hired to rescue was anything less than heroic, and motivated by anything other than saintly, selfless, devotion to a just cause, is unacceptable, and swiftly must be “corrected.”

In a CNN blog post in February, 2014, Updegrove was quoted as saying, “We want people to know what this President did – what he got done and how it continues to affect us.” That’s a perfectly acceptable desire for a presidential family member or an official of a private foundation dedicated to promoting a president’s legacy to express, but not for a mid-level federal employee responsible for administering a nonpartisan government archival facility.

On the January 4, 2015 edition of Face the Nation, host Bob Schieffer commented on critics’ assertion that the movie was “dead wrong” on its portrayal of LBJ, asking Updegrove – as if he were a disinterested arbiter of the truth, rather than a tender of LBJ’s flame and a leader of that very criticism – “What happened here?” Updegrove answered, “Well, unfortunately, there’s no litmus test for movies that — based on history. There’s no standard that says that you got this wrong, you have got to correct that.”

Apparently, though, Updegrove believes there is such a litmus test, and that he is the one designated to administer it.

An insistence that LBJ was so central to the movement that this film “bastardizes” it conveniently ignores his earlier role in successfully blocking civil rights legislation as Senate Majority Leader – a neat trick replicated in the recently-renovated LBJ Library museum. There, in exhibits depicting his pre-presidential career, Vietnam, foreign affairs, domestic programs, and the Civil Rights Movement, the narrative is clean, simple, and undeviating: Lyndon Baines Johnson Was A Great Man Who Did Nothing Other Than Great Things And Only For Great Reasons.

The LBJ presented in the renovated exhibits – which were overseen by Updegrove – bears little resemblance to the meticulously-detailed and extraordinarily well-documented LBJ of Robert Caro’s multi-volume, Pulitzer Prize-winning biography. The museum’s adulatory portrayal differs little from those in recent presidential libraries, but it is quite different from the other mature museums in the National Archives system, which have, over time, begun to develop more thorough, balanced, and nuanced views of the men to whom they are dedicated. Instead of echoing that progress, the recent changes to the LBJ exhibits go backwards; that they are less factual and more flattering is unprecedented in the history of presidential libraries – as is Updegrove’s assertive campaigning, as a federal employee, to rehabilitate a president’s image.

Will Updegrove’s public scolding of Selma director Ava DuVernay have a chilling effect? Will future filmmakers think twice before daring to express an opinion about a former president whose taxpayer-funded legacy managers stand ready to rescue his image? Will researchers at the Johnson Library worry the director might charge them with “mischaracterizing” Johnson? That our government now appears to be in the business not only of administering these legacy-burnishing shrines but of “correcting” others’ views of history should be unacceptable to the citizens who fund the operation of our presidential libraries.

While it would be a shame if Updegrove’s and his colleagues’ need to police and sanitize Johnson’s image deprives this transformative film of deserved accolades and awards, it would be a greater misfortune if their attempts to discredit Selma prevented it from being seen by a broad audience. It is my hope that the film and the filmmakers succeed in spite of these negative efforts, and, in the face of this latest example of the last campaign, overcome.

Anthony Clark, a former speechwriter and legislative director in the U.S. House of Representatives, was responsible for hearings and investigations of the National Archives and presidential libraries for the Committee on Oversight and Government Reform in the 111th Congress.

TIME Opinion

History Shows How 2 Million Workers Lost Rights

Fast food workers, healthcare workers and their supporters march to demand an increase of the minimum wage, in Los Angeles on Dec. 4, 2014. Robyn Beck—AFP/Getty Images

Home attendants and aides have historically been singled out for denial of basic labor rights

Over the last year, the nation has seen a tumultuous wave of low-wage workers contesting terms of employment that perpetually leave them impoverished and economically insecure. It’s a fight in which home-care workers—one of the fastest growing labor forces—have long participated, as home attendants and aides have historically been singled out for denial of basic labor rights. Their work is becoming ever more important in our economy, with over 40 million elderly Americans today and baby boomers aging into their 70s and 80s; the demand for such workers is projected to nearly double over the next seven years. And yet, this week a federal judge is likely to put up just the latest obstacle to their receiving the minimum wage and overtime compensation granted to other workers through the 1938 Fair Labor Standards Act (FLSA).

The story of how home-care workers ended up without rights begins in the Great Depression. Home care originated as a distinct occupation during the New Deal, and evolved after World War II as part of welfare and health policy aimed at developing alternatives to institutionalization of the elderly and people with disabilities. Prior to the mid-1970s, public agencies provided or coordinated homemaker and home-attendant services. Fiscal constraints subsequently led state and local governments to contract home care first to non-profit and later to for-profit agencies. In 1974, Congress extended FLSA wage and hour standards to long-excluded private household workers. A year later, however, the U.S. Department of Labor (DOL) interpreted the new amendment to exempt home-care workers, even employees of for-profit entities, by misclassifying them as elder companions, akin to babysitters. It provided no explicit reasoning for introducing this new terminology, beyond the need for uniform definitions of domestic service and employer. This exclusion became known as the “companionship rule.”

The rule was a boon for employers. Amid nursing-home scandals and an emergent disability-rights movement, demand for home-based care burgeoned, but the women actually performing the labor were invisible. A distinct home-health industry began to grow following the 1975 exemption, as the rule freed staffing and home-health agencies from paying minimum wages and overtime. Opening Medicaid and other programs to for-profit providers after 1980 led to a tenfold increase in for-profit agencies during the next half decade. By 2000, for-profit groups employed over 60 percent of all home-care workers. Today, the home-care franchise industry is worth $90 billion.

Care workers, however, were never just casual friendly neighbors; even before this expansion, home-care workers were middle-aged, disproportionately African American, female wage earners—neither nurse nor maid, but a combination of both. Despite changes in their title since the 1930s, these workers always performed a combination of bodily care work (bathing, dressing, feeding) and housekeeping necessary to maintain someone at home. They increasingly have become a trained workforce.

With the expansion of the industry, service sector unions and domestic worker associations lobbied to change the “companionship rule.” Recently, they seemed to have won: After extensive public comment, the DOL issued a new rule in September of 2013, which would have finally included home-care workers under FLSA coverage. The Obama Administration also updated the definition of domestic service to match the job as performed by nearly 2 million workers who belong to one of the fastest growing, but lowest paid, occupations, with median hourly wages under $10. It recognized aid with activities of daily living as care, and care as a form of domestic labor. Whereas companionship services had previously included even those who spent more than 20 hours engaged in care, the new rule narrowed the meaning of companionship to mere “fellowship and protection” in order to close the loophole that for-profit agencies were deploying to profit by underpaying live-in home attendants. It was to go into effect on Jan. 1, 2015, though enforcement was delayed until June.

Then, in late December, at the urging of for-profit home care franchise operators, led by the Home Care Association of America, Judge Richard J. Leon (a George W. Bush appointee) of the U.S. District Court for the District of Columbia struck down a key element of the revision. The decision vacated the responsibility of third-party employers (such as home-care businesses) to pay minimum wage and overtime for so-called companionship services. In his opinion, the judge charged the DOL with “arrogance,” “unprecedented authority” and “a wholesale abrogation of Congress’s authority in this area.”

A historical perspective suggests otherwise. In the 1970s, Congress never intended to enhance corporate profits by narrowing wage and hour protections; to the contrary, it expanded them. Granted, the Senate Committee on Labor and Public Welfare refused “to include within the terms ‘domestic service’ such activities as babysitting and acting as a companion”—but it distinguished teenage sitters and friendly visitors from domestic workers by adding “casual” to those exempted from labor standards. It explicitly did not refer to “regular breadwinners,” those “responsible for their families.” Moreover, the Supreme Court has repeatedly affirmed that where Congressional intent is ambiguous, executive agencies—including the DOL—have leeway. In the 2007 case Long Island Care at Home, Ltd. v. Coke, a unanimous Supreme Court commended the expertise of the agency to determine the meaning of undefined phrases like “domestic service employment” and “companionship services.”

During oral argument in Coke, Justice Ruth Bader Ginsburg suggested that the proper way to amend the exemption was either a new rule through the DOL, which is what ended up happening, or legislation. Judge Leon reads back Congressional intent from the fact that legislative fixes have stalled in committee in the years following Coke. But there are many reasons why bills go nowhere in our gridlocked government.

The temporary restraining order from Judge Leon effectively blocked implementation of the new DOL rule in totality, setting off a ripple effect against this primarily female workforce. California, for example, instantly suspended implementation for some 80,000 workers. Then on Jan. 9, he heard oral arguments on whether to strike down the redefinition of the companionship classification. Given his prior decisions, the bet is that his next ruling on Jan. 14 will do so. Continued litigation is in the offing, as the DOL is likely to appeal his decisions all the way to the Supreme Court.

For over 40 years, we’ve relied on cheap labor for care. The structure of home care has exemplified a broader trend of reconfiguring work throughout the economy as casualized and low-waged, outside of labor standards and immune from unionization. But stopping the correction of this injustice means distorting history—and devaluing the care that someday most of us will need.

Eileen Boris is Hull Professor of Feminist Studies and Professor of History, Black Studies, and Global Studies at the University of California, Santa Barbara. Jennifer Klein is Professor of History at Yale and a Public Voices Fellow. They are the authors of Caring For America: Home Health Workers in the Shadow of the Welfare State.

TIME Innovation

Five Best Ideas of the Day: January 13

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

1. The U.S. could improve its counterinsurgency strategy by gathering better public opinion data from people in conflict zones.

By Andrew Shaver and Yang-Yang Zhou in the Washington Post

2. The drought-stricken western U.S. can learn from Israel’s water-management software, which pores over tons of data to detect or prevent leaks.

By Amanda Little in Bloomberg Businessweek

3. Beyond “Teach for Mexico”: To upgrade Latin America’s outdated public education systems, leaders must fight institutional inequality.

By Whitney Eulich and Ruxandra Guidi in the Christian Science Monitor

4. Investment recommendations for retirees are often based on savings levels achieved by only a small fraction of families. Here’s better advice.

By Luke Delorme in the Daily Economy

5. Lessons from the Swiss: We should start making people pay for the trash they throw away.

By Sabine Oishi in the Baltimore Sun



TIME Opinion

Why You Should Care That Selma Gets LBJ Wrong

President Lyndon B. Johnson (1908-1973) discusses the Voting Rights Act with civil rights campaigner Martin Luther King Jr. (1929-1968) in 1965. Hulton Archive—Getty Images

Even in the movies—and especially in this one—accuracy matters

The film Selma—in wide release Jan. 9—tells one of the most dramatic stories in modern American history, of Martin Luther King Jr.’s successful crusade for voting rights in Alabama in 1965. It triggered a smaller drama of its own when former Lyndon Johnson aide Joseph Califano attacked its portrayal of his old boss in the Washington Post. The film is a well-produced and well-acted drama that will draw a lot of Oscar attention. In many respects—but not all—it was well-researched. Some have argued that the inaccuracies are not important to the purpose of the film, or that accuracy is beside the point when it comes to movies that aren’t documentaries. But Califano was right: its portrayal of Lyndon Johnson and his role in the passage of the Voting Rights Act could hardly be more wrong. And this is important not merely for the sake of fidelity to the past, but because of continuing implications for how we see our racial problems and how they could be solved.

Selma suffers as a piece of history, I would guess, because director Ava DuVernay and writer Paul Webb overcompensated for the flaws of movies like Mississippi Burning and Ghosts of Mississippi. Such movies have been justifiably criticized for exaggerating the role of whites compared to blacks in the Civil Rights movement, and for introducing black characters only to have them killed or terrorized. Selma stands this paradigm on its head. With only one exception—federal judge Frank Johnson—the white characters in Selma are either villains (including LBJ, J. Edgar Hoover, George Wallace and Sheriff Clark of Selma), timid wimps, or victims (Unitarian minister James Reeb, who is misidentified at one point as a priest and talks like an Evangelical, and Detroit mother Viola Liuzzo, both of whom were killed by Alabama whites). Crucially, until its last few minutes, the film presents LBJ as the main obstacle to what King is trying to do. There was no shortage of real white villains in the Selma controversy, but LBJ was not one of them. This portrayal depends upon a complete misrepresentation not only of the facts, but also of specific conversations that King and Johnson had during this period.

For example: Selma shows King meeting LBJ in mid-December of 1964 and asking for voting rights legislation. The President is completely negative and highly perturbed, stating that the time has not come to push the issue. But, in reality, though Johnson did say that legislation would have to wait, that wasn’t the gist of the meeting: Johnson fully recognized the problem and promised to use the legal tools provided by the Civil Rights Act of 1964 to fight it. In fact, two days earlier he had told a New York Times reporter about the possibility of a new law that would allow southern black voters to register at post offices. Then, on Jan. 4, in his State of the Union address, Johnson promised to remove all remaining obstacles to the right to vote. Word soon leaked to the press that the Justice Department was working both on a new constitutional amendment to ban some of the practices southern states used to disenfranchise black Americans, and on legislation that would allow the federal government to register voters. Work on both of those plans proceeded apace through January, starting even before King’s Selma campaign had begun. (The most authoritative works on this story are Robert Dallek’s Flawed Giant, David Garrow’s Protest at Selma, and Taylor Branch’s Pillar of Fire.)

Having spent nearly 25 years in Congress, Johnson was acutely sensitive to the issue of legislative timing. A year earlier, after taking office, he had cleverly begun by submitting what looked like a tight fiscal 1965 budget, insisting that total spending be held under $100 billion. That gave him the leverage he needed to get JFK’s tax-cut bill—his other major legislative priority, along with civil rights—through the Congress. Only then did Johnson have the Senate take up the civil rights act, so that the filibuster it was certain to generate would not stop the tax cut and other important matters. In June 1964 the filibuster was overcome, and the Civil Rights Act passed. In 1965, a voting rights bill would probably mean a new filibuster, so Johnson undoubtedly wanted to get at least some of his other major tasks accomplished before it came up. In any case—and this is one of the things I learned studying Johnson’s approach to Vietnam—LBJ never let anyone know what he planned to do until it was absolutely necessary. Because he had gotten the Justice Department going on a voting-rights measure, he knew he would have it ready when he decided he needed it.

King’s Selma protests began on Jan. 14, escalated for the next six weeks, and climaxed on Bloody Sunday, Mar. 7, as Selma shows, on the Edmund Pettus Bridge, where state troopers beat marchers. Meanwhile, King and LBJ continued to talk. On Jan. 15, as Califano pointed out, King and Johnson had a long, cordial phone conversation in which Johnson encouraged King to push for voting-rights legislation. King met with the President in Washington on Feb. 9, and Johnson insisted that King tell the press that the President was going to submit a voting-rights bill. The newspapers not only confirmed this the next day, but also added that the two men had discussed the use of federal registrars, an end to literacy tests and focusing on the most discriminatory areas in the South. They were, in short, agreeing upon the eventual solution to the crisis.

Selma shows LBJ in this period not only refusing to meet any of King’s demands, but also enlisting J. Edgar Hoover to try to discredit and destroy King. Hoover had in fact taken these steps months before, and LBJ had been appalled by them. Like John and Robert Kennedy before him, he was terrified that Hoover would successfully discredit King and set back civil rights for years. Fortunately, because no media outlet would print the salacious material Hoover provided, the FBI Director failed. King himself wrote, in the midst of these events, that while his and Johnson’s approaches to civil rights were far from identical, he had no doubt at all that Johnson was trying to solve the problem of civil rights “with sincerity, realism and, thus far, with wisdom.”

This is not all. LBJ was moving not only out of conviction, but also because events in Selma, even before Bloody Sunday, were arousing northern opinion. Liberal Republicans introduced voting-rights bills of their own in Congress after King’s meeting with LBJ in early February. Mainline churches of all faiths were calling for voting rights too, just as they had lobbied for the 1964 act a year earlier. King and the Selma marchers obviously deserve credit for tapping into broader support for civil rights, but that support had been building for decades. LBJ now knew voting rights could be a winning political issue.

LBJ took no action during February, partly, I suspect, because he was also very busy covertly launching the Vietnam War. (It is typical of this tragic figure in American history that perhaps his best and worst decisions were taken at exactly the same time.) As Selma shows, King met LBJ again on the eve of the planned march to Montgomery that became Bloody Sunday. Johnson did caution King against inflammatory moves in that meeting—though hardly in the tone portrayed in the film—but they also, once again, discussed the details of projected legislation. Within days of Bloody Sunday, Johnson’s press secretary had announced that the President would ask the Congress for legislation in the following week. Selma gives no indication that the speech on Mar. 15, which Johnson concluded with the words, “We shall overcome,” was anything but a complete surprise, and the pacing of the film, I think, implies that the delay between Bloody Sunday and the speech was much longer than eight days. In fact, any newspaper reader knew that the speech only confirmed, in the most dramatic fashion, the direction in which Johnson had been moving for over a month.

The response to Johnson’s speech confirmed that voting rights for all Americans were now supported by an overwhelming consensus. The bill, which abolished literacy tests and sent federal registrars into every southern county with low black registration, passed the House by a vote of 333-85. A Senate filibuster delayed it for 24 days in May, but it eventually passed, 77-19. The “aye” votes included the two Democratic Senators from Tennessee. Meanwhile, Medicare, a huge education bill and other measures moved through the Congress as well. Johnson had all he wanted, and more—and the United States has never been the same. Thanks to King and his marchers, support for the new act was too overwhelming for white southerners to delay it and stall other legislation too.

Selma’s distortion of LBJ’s role is important, I think, because it contributes to a popular but mistaken view of how progress in the United States can occur. The civil rights movement won its greatest triumphs in the 1950s and 1960s by working through the system as well as in the streets; by finding allies among white institutions such as labor unions, universities and churches; and by appealing to fundamental American values. Beginning in the late 1960s a very different view began to take hold: that white people were hopelessly infected by racism and that black people could and should depend only on themselves.

Selma contributes to that view. It not only leaves out much of the story of how the Voting Rights Act was passed, but also fails to illuminate how further progress might be made in the future. We still have serious racial problems in this nation. We can only solve them by working together based on shared values. That is what both Martin Luther King Jr. and Lyndon Baines Johnson understood, and that is why they both deserve to be remembered for their enormous achievements today.

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME Opinion

Despite the Statistics, We Haven’t Lost the War on Poverty

Though it may not look like it, a stable poverty rate is consistent with anti-poverty programs that work

It’s been more than a half-century since President Johnson officially launched the War on Poverty, in his State of the Union address delivered on this day in 1964, and declared that the U.S. had “the power to eliminate poverty from an entire continental nation.” In the decades that followed his decision to invest in jobs, training and aid, the U.S. has experienced, by some measures, tremendous prosperity. GDP per capita—a measure of the value of goods and services produced in the U.S., a commonly used indicator of living standards—has nearly doubled, from less than $25,000 per person to nearly $50,000. At the same time, our key measure of economic hardship, the official poverty rate, has barely budged from 15 percent in the past 50 years.

On the face of it, then, the War on Poverty seems to have accomplished nothing. Critics of Johnson’s programs may also add that the War on Poverty resulted in billions of dollars spent on the poor. Why has there been no return on that investment?

The simple answer is that there have been improvements—but the way we measure poverty hasn’t, until recently, accounted for them. Many direct transfers to the poor do not count in an official poverty measure based on only pre-tax, cash income.

Many key programs established during the Great Society era, such as Medicaid, Food Stamps (now known as SNAP) and the WIC nutrition program, are “in kind” programs that provide non-cash help. In more recent decades, benefits through the tax system, such as the Earned Income Tax Credit (EITC) or child tax credits, have grown in importance but are ignored in official statistics.

Fortunately, the Census Bureau has developed its Supplemental Poverty Measure, which includes non-cash benefits and post-tax income, accounts for differences in regional costs of living and makes several other sensible adjustments to poverty measures. These adjustments can be incredibly important to understanding poverty. For example, the Census Bureau estimates that refundable tax credits, such as the EITC, lower the poverty rate by up to three percentage points (around 9 million people). SNAP benefits lower it by nearly two percentage points (approximately 6 million people).

Another critical failure of measurement, one that falls more on the community of poverty researchers, is the failure, until recently, to convincingly measure the full benefits of safety net spending. We now have credible, peer-reviewed research that measures the real benefits on health, educational attainment and quality of life from a variety of safety-net programs. These studies show over and over that many safety net programs improve the lives of the poor. For example, studies have shown that infants born to families who are at high risk of poverty are healthier because of Medicaid, SNAP and WIC. These benefits may not show up as lower poverty rates, but they lay the groundwork for a healthier, more productive populace.


Marianne Page and Ann Huff Stevens / UC Davis Center for Poverty Research


It is actually remarkable that poverty rates have not substantially grown, considering economic and social trends over the past three decades. Since the 1980s, rising demand for skilled workers and falling demand for their less-skilled counterparts have meant that real wages have increased substantially for workers earning above the median, but not for most of those who earn below it. This fact is at the heart of growing inequality. It explains how strong GDP growth can co-exist with no improvement for low-wage workers. Our own research suggests that the wider gap in earnings between workers at the median and those earning in the lowest 20 percent during the 1980s should alone have increased poverty rates by roughly 2.5 percentage points.

The last 50 years have also brought massive changes in the structure of U.S. families. Since the 1960s, the fraction of the non-elderly population living in single-parent families has more than doubled. Single-parent families are more likely to be poor because, by definition, there is only one parent to earn enough to push family income over the poverty line. The cost and difficulty of child care in single-parent families pose additional barriers.

Our calculations show that if poverty rates within two-parent and single-parent families had each remained constant, and only the frequency of single-parenthood had changed, U.S. poverty rates among the non-elderly since the 1960s would have increased on the order of 4 percentage points. Again, there was actually no significant change in poverty rates over this period.

A major trend identified in the graph above is the downward trend in poverty among the elderly. Unlike the overall poverty rate, poverty among the elderly has declined substantially since the 1960s. Why is this the case? First, the elderly benefit from a stable cash-based benefit from Social Security. Second, because the elderly mostly do not depend on wages, their probability of being poor is largely unaffected by changes in labor markets. Finally, Social Security benefits are indexed to average wage growth in the economy, and so do not lose their value to inflation—in contrast with many welfare programs aimed at younger individuals.

The lesson here is that an income-support program with benefits linked to overall economic growth, aimed at a population unaffected by the deterioration of the low-wage labor market, has led to a significant, lasting decline in poverty since the War on Poverty was launched.

The War on Poverty has clearly not been won. No amount of explaining, interpreting or squinting at the plot of U.S. poverty rates can get us to a declaration of victory. Declarations of defeat are just as misguided, however. The War on Poverty has been fought against a shifting landscape that has made the effort more difficult with each passing decade. Renewed efforts that recognize demographic and labor market realities and the enormous challenges they place on anti-poverty efforts, and measures of that progress, should be the hallmarks of the next phase in this war.

Marianne Page is a professor of Economics and Deputy Director of the Center for Poverty Research at the University of California, Davis. She received her Ph.D. from the University of Michigan.

Ann Huff Stevens is a professor of Economics, Director of the Center for Poverty Research, and Interim Dean of the Graduate School of Management at the University of California, Davis. She received her Ph.D. from the University of Michigan.

The UC Davis Center for Poverty Research was founded in 2011 with core funding from the Office of the Assistant Secretary for Planning and Evaluation in the U.S. Department of Health and Human Services. It is one of three federally designated Poverty Research Centers whose mission is to facilitate non-partisan academic research in the United States.

TIME Opinion

Old Hickory in a New Century

Andrew Jackson (March 15, 1767 - June 8, 1845), 1815. Engraved by C. Phillips after the original painting by Jarvis. Kean Collection / Getty Images

On the bicentennial of the Battle of New Orleans, a new conversation about Andrew Jackson—the most flawed of heroes—should begin

Andrew Jackson could hardly believe it himself. On Sunday, Jan. 8, 1815, after a series of battles in December and into the new year, his American troops met in a climactic struggle against the British army at New Orleans. The weather was foggy, the fighting fierce—and the results, from General Jackson’s point of view, little short of miraculous. The British lost nearly 300 men; another 1,200 were wounded, taken prisoner, or missing. Only 13 Americans were killed. “It appears,” Jackson recalled, “that the unerring hand of providence shielded my men from the powers of balls, bombs, and rockets, when every ball and bomb from our guns carried with them the mission of death.”

The bicentennial of this American Agincourt, which we commemorate this week, deserves wider attention, for, like the War of 1812 itself, the Battle of New Orleans is one of those historical events that remains stubbornly in the shadows of memory. Not every battle matters down the years; nor does every general. (Though don’t tell the generals that.) To understand the America of the 19th century, however, requires at least a passing acquaintance with what happened at New Orleans—and more than a passing acquaintance with the man who commanded the victorious forces there.

Two centuries after New Orleans made Andrew Jackson a national celebrity, setting in motion one of the most momentous military and political careers in American history, Jackson remains one of our most important and least-understood figures. The architect of American democratic populism, he was also an architect of Native American removal. An unapologetic defender of American union, he was also an unapologetic defender of African-American slavery. His contradictions were legion; so, too, were his contributions.

We need not embrace Jackson uncritically to engage with him profitably, which is part of the work now being undertaken by the newly created Andrew Jackson Foundation (of which I am a trustee) in Nashville. The mechanics of memory are complex and sometimes confounding; at its best, history captures the admirable and the ignoble—two elements that are often inextricably linked in the human experience. The foundation, which is opening a new exhibit, “Born for a Storm,” at his Nashville plantation, the Hermitage, hopes to start a new conversation about Jackson, shifting the traditional historical emphasis from his house and grounds to the meaning of his career and presidency. It is not an easy task to draw sustained attention to a man often passed over in a popular memory that zooms from the Founding to Fort Sumter, but it’s a task worth tackling.

He had come from nothing, and New Orleans gave him everything—fame and a future. Lawyer, planter, self-taught military officer, brawler and duelist, Jackson made his way from the Carolinas to Tennessee, married into Nashville’s leading family, and set out to make a life, and a name, for himself. He was deeply attached to the American experiment in liberty and in union; he believed the nation to be “one great family” and was abidingly hostile to those forces that he thought threatened the sovereignty of the will of the white majority.

Jackson’s successors found virtues in him. In drafting his own First Inaugural, Lincoln consulted Old Hickory’s pro-union proclamation to the people of South Carolina who had considered nullification in 1832-33. TR saluted Jackson’s aggressive vision of executive power. FDR admired him and his battles against entrenched financial-class interests; Truman idolized him. “He wanted sincerely to look after the little fellow who had no pull, and that’s what a president is supposed to do,” Truman remarked of Jackson in one of the best definitions of the presidency I have ever encountered.

New Orleans made Jackson’s broader influence possible. (The Treaty of Ghent, ending the war, had been signed on Christmas Eve 1814, but word had not yet reached the United States.) January 8 became a national holiday; within a decade Jackson would seek the presidency. On an anniversary of the battle, Rachel Jackson, the general’s beloved wife, was overwhelmed by the tributes to her husband. “The attention and honors paid to the General far excel a recital by my pen,” she wrote a friend. “They conducted him to the Grand Theater; his box was decorated with elegant hangings. At his appearance the theater rang with loud acclamations, Vive Jackson.”

Vive Jackson: It became a perpetual cry. At New Orleans, when surviving British soldiers came out from beneath the safety of their fallen comrades’ red coats, the image of the living among the dead struck Jackson forcibly. “I never had so grand and awful an idea of the resurrection as on that day,” Jackson recalled. For Jackson the day brought new possibilities, and, for better and for worse, a new America began emerging from the fog and the smoke of that distant morning two centuries ago.

Jon Meacham, a trustee of the Andrew Jackson Foundation, is the author of “American Lion: Andrew Jackson in the White House” (Random House, 2008).

TIME Opinion

The 1919 Theory That Explains Why Police Officers Need Their Guns

Police tape marks a crime scene on June 5, 2014 in Seattle, Washington. Mat Hayward—Getty Images

The long view of race relations and the law in America

In 1919, with the smoke still clearing from the battlefields of the First World War, the German sociologist Max Weber began a systematic study of the nation-state by defining a state as any “human community that successfully claims the monopoly of the legitimate use of physical force within a given territory.” But what constitutes legitimate force? We should keep this question in mind when considering the run of recent events involving conflicts between police and citizens, from the killing of two unarmed black men by cops in Ferguson, Mo., and Staten Island, N.Y., earlier this year, to the murder of two police officers in New York City this past weekend.

Over the course of many centuries the criminal justice system, with its armed police, replaced self-help justice conducted by gangs and individuals. Human communities such as drug gangs and mafia syndicates use physical force, but it is not legitimate and they do not have a monopoly; this leads to higher rates of violence because there is no strong state to act as an objective third party to oversee disputes. In Mexico, for example, the high incidence of violence can be traced in part to the inability of the state to enforce its drug laws, leading gangs to settle their differences themselves. States, for all their faults, have more checks and balances than individuals. This is why Justitia—the Roman goddess of justice—is often depicted wearing a blindfold (symbolizing impartiality) and carrying a scale on which to weigh the evidence in one hand and a double-edged sword, for her power to enforce the law, in the other.

Today the state’s justice is conducted through two systems: criminal and civil. Civil justice deals with disputes between individuals or groups, whereas criminal justice deals with crimes against the laws of the land that are punishable only by the state. This is why criminal cases are labeled The State v. John Doe or The People v. Jane Roe: the state is the injured party. California, for example, continues to pursue charges against the filmmaker Roman Polanski, who pleaded guilty to unlawful sex with an underage girl in 1977, even though his victim—a woman now in her 40s—has requested that the state drop the charges. In late December 2014 Polanski filed another motion to dismiss the case, and the woman has even offered to testify on his behalf, but the state of California is unlikely to relent, because doing so would show weakness in its commitment to the rule of law.

That’s why people sometimes take the law into their own hands, via vigilantism or protest, when they do not feel that the law is fair to them: because the criminal-justice system acts on behalf of only the state, rather than on behalf of individuals, it requires trust in the state in order to work. That trust is lacking for many black communities in America; a 2013 Gallup poll, for example, found that 25% of whites but 68% of blacks believe that “the American justice system is biased against black people.”

This theory is important to keep in mind when thinking about what happened between 12:01pm and 12:03pm on Aug. 9, 2014, in Ferguson, Mo., when police officer Darren Wilson shot and killed teenager Michael Brown. Was his use of physical force legitimate or not? A grand jury, faced with conflicting eyewitness accounts, declined to indict, and so Wilson’s actions will not be considered criminal by the state. Something similar happened in New York when Eric Garner died at the hands of the police — was it a legal “grappling hold” or “headlock” or an illegal chokehold? Even with video evidence the grand jury voted not to indict the police officers. Though Brown’s and Garner’s families may choose to pursue civil cases, only the state can say — or, as the case may be, not say — that a crime has been committed.

Though much remains unclear about what happened in these situations, Weber can also help us understand how a seemingly innocent encounter between a citizen and a police officer could escalate into lethal violence. When Officer Wilson asked Michael Brown to move from the middle of the street to the sidewalk and Brown refused, Wilson was faced with the dilemma all cops face when a suspect refuses to comply: back down and thereby lose both the monopoly and the legitimacy of law enforcement, or ratchet up the intensity of tactics to achieve compliance. The latter is what happened on that fateful day when Brown not only refused to comply with Wilson’s demand, but — according to Wilson — also reached into the squad car to grab the officer’s gun. The case of Eric Garner was similar: when he was approached by the police on suspicion of selling single cigarettes without tax stamps, he responded by complaining of being harassed by the police. In response, the police moved to arrest Garner by physically grabbing him; he resisted, leading to an escalation of violence and his death.

In a world where Weber’s theory is in effect, allowing a civilian to use force against the police is a challenge to the whole fabric of society. Though it’s possible to debate how much police force is legitimate, any non-police force is inherently illegitimate. This is why the killing of two New York City police officers — 40-year-old Rafael Ramos and 32-year-old Wenjian Liu — on December 20 is likely to shift the sympathies of the public away from the peaceful protesters marching against injustice and back toward the police, especially after news reports surfaced that the alleged shooter, Ismaaiyl Brinsley, posted on social media that the killing was in retaliation for the deaths of Michael Brown and Eric Garner. The shooting is most likely a one-off event, but it nonetheless punctuates the point that there’s a reason the police in all civilized states are armed in one way or another.

The deaths of Michael Brown, Eric Garner, Rafael Ramos and Wenjian Liu were tragedies that need this kind of perspective. In fact, race relations today are increasingly positive and the state’s use of force would probably seem to most people, by and large, to be more “legitimate” than ever.

Urban violence from the 1950s through the 1980s was commonplace, a regular feature on the nightly news; it’s now mostly relegated to historical documentaries. And racist attitudes of Americans really are on the decline. Polling data show, for example, that in the 1940s nearly three-quarters of Americans agreed that “black and white students should go to separate schools.” That figure collapsed to almost zero by 1995, when pollsters quit asking the question. In 1960 almost half of all white Americans said that they would move if a black family moved in next door. Today that figure is also close to zero. At the turn of the 20th century lynchings were commonplace, averaging a couple a week through the 1920s, finally coming to an end by the 1950s, as shown by data from the Oxford University economist Max Roser. A 2013 Gallup poll on interracial marriage also shows the positive trend in tolerance over the past half century; in keeping with other rights revolutions where age is a factor in becoming more tolerant, the younger respondents, the population of the future, were the most approving of interracial marriage. And think about the case of Donald Sterling, forced to sell his basketball team, the Clippers, for racist remarks he made in private to his mistress. In the 1950s, a man who thought about blacks the way Sterling apparently does would not have needed to be especially private about his prejudices. Today, the few who still think like this mostly keep it to themselves or publish their views in fringe white supremacist newsletters or web sites.

Clearly, as recent events have shown, racial violence is not zero and we have a ways to go before America is truly color blind, but, as the oft-quoted Martin Luther King line says, the arc of the moral universe really is bending toward justice. It would be good to keep that fact in mind when watching the daily news. There’s good reason for our police to have weapons they feel free to use, and for the most part they apply that force judiciously — and, with that in mind, we can continue working on refining what is meant by Weber’s “legitimate use of force” to ensure that it is rational, reasonable and applied to everyone equally.

Michael Shermer is publisher of Skeptic magazine (skeptic.com), a Presidential Fellow at Chapman University and a monthly columnist for Scientific American. His new book, The Moral Arc, comes out in January. Follow him on Twitter @michaelshermer
