TIME Opinion

Obama Takes a Page From FDR’s Playbook

Scott Olson—Getty Images: President Barack Obama speaks at Chicago's Gwendolyn Brooks College Preparatory Academy on Feb. 19, 2015. Obama used the event to designate Chicago's historic Pullman district a national monument.

In helping to resolve a longshore standoff, the President showed that he’s learned from history

Correction appended, Feb. 24, 2015, 10:45 a.m.

For nine months, trade in the Pacific was stalled. Ships languished at docks, their cargo stuck, lost profits ticking up into the millions, as negotiations between the International Longshore & Warehouse Union (ILWU) and its employers, the Pacific Maritime Association (PMA), dragged on. But the tide may be turning: the dockworkers' union and the PMA announced that they have reached a tentative agreement on their next contract—and the repercussions reach far beyond the West Coast ports at the center of the negotiations. This conflict between workers and employers goes to the heart of the crisis of inequality facing the American economy today.

President Obama has repeatedly claimed that one of his top priorities is supporting middle-class families. And, though he hasn't articulated the comparison himself, this latest news shows that one of his strategies for doing so takes a page from history—namely, from Franklin D. Roosevelt. Eighty-one years ago, Roosevelt helped resolve a dispute not unlike the current ILWU-PMA showdown, and, in doing so, helped tens of thousands of working people and their families.

In 1934, militant longshoremen rose up and declared the "Big Strike," shutting down every West Coast port. Workers demanded decent wages and workplace improvements, but employers called upon their allies in government to break the strike—most notoriously when San Francisco police killed two strikers and wounded dozens on "Bloody Thursday." That day remains an official holiday for ILWU members even today.

Employers did not listen to the cries of their own workers, but President Roosevelt did. In a shocking break from precedent, he sent a trusted aide, Secretary of Labor Frances Perkins—the first woman ever to serve in a presidential cabinet—to mediate negotiations between workers and their bosses.

With Perkins’ assistance, the strikers won an astounding series of concessions: a union-controlled hiring hall, which ended discrimination and favoritism in hiring; a coastwide contract with all workers receiving the same wages and conditions; and a better hourly wage.

Shortly thereafter, working with a sympathetic Congress, FDR made unions, strikes and collective bargaining legal. He also helped abolish child labor and establish a minimum wage, the 8-hour day and overtime rates. In 1936, he explained why he sided with working people over corporations: "The test of our progress is not whether we add more to the abundance of those who have much; it is whether we provide enough for those who have little."

Thanks to unions like the ILWU and public officials like Roosevelt and Perkins, the American middle class grew to a previously unimaginable size. Collectively bargained pay raises in union workplaces lifted wages in non-unionized ones, too, because employers competed for workers. Not coincidentally, the so-called Greatest Generation was also the most heavily unionized generation in U.S. history.

In recent decades, however, that trend has reversed. Unions and the middle class are both weaker than they once were. Enter the ILWU-PMA impasse. During the past month, the PMA had begun rolling lockouts and blamed the union for work slow-downs. Business associations and retailers called for an end to West Coast port congestion and for the destruction of one of America’s last strong unions.

Given how much weaker support for unions is today, President Obama found himself in the same situation FDR had faced. The easy choice would have been to side against the union, if he intervened to break the impasse at all. Some corporate executives and anti-union politicians called for Congress to change labor law to permanently weaken the ILWU, or for Obama to invoke an anti-union "cooling off" provision of the Taft-Hartley Act (as President George W. Bush did in 2002 during previous ILWU-PMA negotiations).

But when President Obama finally acted, it was to make the choice Roosevelt had made 81 years earlier. He dispatched his Secretary of Labor, Tom Perez, who traveled to San Francisco to meet with ILWU and PMA officials. The result is an agreement that could put both labor and business back to work, though the details of the deal remain secret and, given the union's democratic process, subject to a vote by the 20,000 members of its longshore division.

The timing of Obama's intervention is apt. This past Thursday, Obama announced a new national monument in Chicago's Pullman neighborhood to honor workers. Pullman was the birthplace of an important labor union of (mostly African American) sleeping car porters—but it was also ground zero for a mammoth railroad strike that rocked America in its First Gilded Age.

As Obama noted in his announcement, the Brotherhood of Sleeping Car Porters dramatically benefited its members and their families by helping them enter the middle class. Crucially, unionism also facilitated racial equality by empowering black citizens to demand equal treatment.

Perhaps more surprisingly, four Illinois Republicans—one Senator and three Congressmen—endorsed the Pullman designation. In their letter to the president, they also insisted that the porters union “laid the groundwork for the Civil Rights movement.” Furthermore, they recognized that the mammoth Pullman strike in 1894 “provided workers across America with a blueprint for how to achieve a better working environment and secure fair wages and rights in the workplace.”

These statements from Obama and the Illinois Republicans would likely appeal to Roosevelt—but FDR, the best president American working people have ever had, would take even more comfort from what’s happened with the longshoremen. Eight decades after an earlier West Coast longshore dispute, the president has taken an opportunity to demonstrate—rather than merely declare—that the government serves the people rather than the interests of those FDR called “economic royalists.”

Correction: An earlier version of this article misstated how many years ago FDR intervened in the longshoremen’s strike. It was 81. That version also incorrectly referred to the ILWU situation as a strike. It was an impasse in negotiations.

The Long View: Historians explain how the past informs the present

Peter Cole is a Professor of History at Western Illinois University. He is the author of Wobblies on the Waterfront: Interracial Unionism in Progressive-Era Philadelphia. His current book project is entitled Dockworker Power: Struggles in Durban and the San Francisco Bay Area.

TIME Supreme Court

Ruth Bader Ginsburg Upends the Notion of the Silent Justice

Steve Petteway—Collection of the Supreme Court of the United States: Official portrait of Justice Ruth Bader Ginsburg

The Supreme Court Justice isn't just writing opinions; she's sharing them in interviews.

Ruth Bader Ginsburg appears to be on a book tour with no book. The oldest Supreme Court Justice has been on a media tear recently, making headlines with interviews about everything from feminism to her workout routine, even slyly revealing that she was “not 100 percent sober” during the State of the Union.

In the last year, Ginsburg has given interviews to Elle, the Associated Press, the National Journal, The New Republic, Yahoo! News, Bloomberg and MSNBC. She’s done a live event at the 92nd Street Y, performed a monologue in a D.C. play about the Civil War and given her blessing to the Notorious RBG Tumblr page, a fan website in her honor. Only Ginsburg’s opera-buddy Antonin Scalia, who gave a much-discussed 2013 interview to New York magazine, and Sonia Sotomayor, who made the rounds promoting her memoir, come close to rivaling Ginsburg’s recent publicity tour.

Some longtime court watchers think Ginsburg and her colleagues may be reshaping the way the traditionally cloistered justices interact with the public.

“That is a lot, and the frequency of it breaks the pattern,” says Lyle Denniston, a contributor to SCOTUSblog who has been covering the courts for 57 years. “This is a much more open age, with the Internet, and the justices are simply players in the modern drama of greater public exposure. It is pattern-setting, and it is unusual.”

Like many things at the Supreme Court, there may be an unspoken political angle too.

Leading up to the 2012 and 2014 elections, some liberals had argued that Ginsburg should retire, given her age (at 81, she’s the oldest sitting Justice), her history with pancreatic cancer and the possibility that Republicans could retake the White House and/or the Senate.

“If Ginsburg and Breyer abjure retirement and Obama wins, the justices’ subsequent departures will be relatively harmless,” wrote Harvard Law professor Randall Kennedy in the New Republic in 2011. “On the other hand, if Obama loses, they will have contributed to a disaster.”

A brief visit to the hospital over Thanksgiving renewed those fears for liberal court-watchers, giving Ginsburg all the more reason to dispel any concerns about her health. In all her interviews, she’s noted that she’s not going anywhere anytime soon.

“I’ve said many times: once I sense that I am slipping, I will step down,” she told MSNBC earlier this week. “This is a very intense job. It’s the best and the hardest job I’ve ever had. It takes a lot of energy and staying power to do it right. I will step down when I feel I can no longer do the job full steam.”

Ginsburg’s interviews have touched on some other common themes. She discusses what it was like to be one of few women in law school, to have no job offers after graduating at the top of her class at Columbia Law and how her egalitarian relationship with her husband Martin Ginsburg shaped her career. She recalls her time working for the ACLU, fighting laws that discriminated against women. She notes that while Roe v. Wade is unlikely to be overturned, restrictions on abortion rights affect poor women far more than affluent ones. And, inevitably, she calls on the generation of young American women to avoid complacency.

"One thing that concerns me is that today's young women don't seem to care that we have a fundamental instrument of government that makes no express statement about the equal citizenship stature of men and women," she told The New Republic last year. "They know there are no closed doors anymore, and they may take for granted the rights that they have."

Not everyone agrees that Ginsburg's increased public exposure is a good thing, especially after she discussed the upcoming gay marriage case, sparking calls from some conservatives for her to recuse herself.

“Justices are generally more cautious than Justice Ginsburg has been lately in discussing pending issues,” says Denniston. “If they were discussing a tax case or a labor case, nobody would notice, but if you’re discussing the most controversial issues, people do pay very close attention. And they do take offense when a member of the court seems to be forecasting where the court’s going to go.”

But Ginsburg seems secure in her decision to speak out about her opinions, whether in a written dissent or not. She told MSNBC that she’d like to be remembered as “someone who used whatever talents she had to do her work to the very best of her ability and to help repair tears in her society.”

For this Supreme Court Justice, that means more than just writing opinions in a quiet legal chamber. It also means getting out there before the public. And that decision may end up as much a part of her legacy as any of her legal ones.

Read next: Oregon’s Kate Brown Becomes First Openly Bisexual U.S. Governor

Listen to the most important stories of the day.

Correction: An earlier version of this story misstated the number of years that Lyle Denniston has covered courts.

TIME Opinion

What Kayla Mueller’s Life Reveals About Her Generation

Matt Hinshaw—AP: Kayla Mueller after speaking to a group in Prescott, Ariz., on May 30, 2013.

Charlotte Alter covers lifestyle, crime, and breaking news for TIME in New York City. Her writing has also appeared in The New York Times and The Wall Street Journal.

And why she should be a role model for millennials

These days, it seems like every millennial is trying to be a role model, whether by designing a multi-billion-dollar app, recording a blockbuster album or creating a critically acclaimed TV show. Everyone wants to be the influencer, the one who shows people how to act and how to be.

Kayla Mueller didn’t care about any of that–all she wanted to do was end suffering.

Mueller, the 26-year-old aid worker from Arizona who had been held hostage by the Islamic State of Iraq and Greater Syria (ISIS) since 2013, and whose death was confirmed by the White House on Feb. 10, lived the antithesis of the hyper-performative brand management practiced by other people her age. She did not want to be seen helping people: she wanted to help people. She's the role model we really need.

If we’re looking for tips on how to act and how to be, Mueller’s newly released letter home to her family is a better textbook than any quirky essay collection by a 28-year old or professional memoir. It reveals that Mueller represented the best qualities of the millennial generation–our idealism, our optimism, and our love of our families–without the troublesome ones.

Millennials are generally thought to be more socially aware and idealistic than their parents. And they are increasingly demonstrating their idealism through hashtag activism, socially responsible investing and mobile charity donations (crowdfunding site Fundly said in 2013 that 58% of its users were 34 or younger).

But that wasn’t enough for Mueller–she wanted to get her hands dirty, first by demonstrating on campus, then by living in the Palestinian territories (sleeping in front of homes threatened by Israeli bulldozers, and escorting children to school) and finally going to Turkey to provide support to Syrian refugees. “I will always seek God,” she wrote in a letter to her family in 2011, before she was kidnapped on her way to a bus station in Syria. “Some people find God in church. Some people find God in nature. Some people find God in love; I find God in suffering. I’ve known for some time what my life’s work is, using my hands as tools to relieve suffering.”

We’re also known as an optimistic generation, but much of the research on our sunny attitude has been done in the context of financial sluggishness–53% of millennials think they don’t earn enough money now, but will in the future, and three out of four believe they’ll achieve their professional goals. Mueller took that optimism to the next level. Even when she was being held captive by a brutal terrorist group she still managed to hope for the best.

“I have been shown in darkness, light + have learned that even in prison, one can be free. I am grateful,” she wrote. “I have come to see that there is good in every situation, sometimes we just have to look for it.”

Those of us who are pouty about our dead-end internships should take note.

Millennials are known to be closer to their parents than previous generations were: over half of us consider one of our parents our best friend. But Mueller's devotion to her family was so profound that the thought of their pain eclipsed her own suffering. "If you could say I have 'suffered' at all throughout this whole experience it is only in knowing how much suffering I have put you all through; I will never ask you to forgive me as I do not deserve forgiveness," she wrote.

It’s not just that Mueller exemplified the best qualities of her generation–she also repudiated the bad ones. The stereotype of the “whiny” millennial could never apply to her. Because perhaps what’s most striking about Mueller’s letter is the lack of complaining, the omission of any information that might have pained her family to hear. “Please know that I am in a safe location, completely unharmed + healthy (put on weight in fact),” she wrote. “I have been treated w/ the utmost respect + kindness.” While it’s possible that this is true in Mueller’s case–another hostage told the New York Times he believed the female captives were treated relatively well– it’s also possible that she concealed elements of her captivity to spare her family pain. Unnamed US officials even suggested to ABC News that Mueller may have been “given” as a bride to an ISIS commander, which is consistent with the terrorist group’s history of rape and forced marriages with female captives.

So if millennials are looking for role models, we can look past Taylor Swift and Mark Zuckerberg. Kayla Mueller–who never courted the limelight–represents the best in all of us. I look up to her.

TIME Opinion

American Sniper and Our Problem With Military Mistakes

Warner Bros.: Bradley Cooper stars as Chris Kyle in American Sniper

Reaction to the movie shows why it’s important to separate a war from the men and women who fought it

Recent controversy over Clint Eastwood's film American Sniper echoes the aftermath of the Vietnam War. Once again, observers on both the right and the left suggest that one's attitude towards our soldiers must mirror our attitude towards the war in which they fought, and vice versa. Actually, Eastwood managed to walk a very fine line in his film in an effort to avoid such controversy. The film's hero, Chris Kyle, does at least once speak of having to kill terrorists in Iraq so as not to face them in the United States, but this is tossed off too quickly, it seemed to me, to take very seriously. Eastwood also seems to have sanitized Chris Kyle slightly, not with respect to what he did in Iraq, but by omitting some of the boasting he did about killing people within the United States, claims that no one has been able to confirm. Bradley Cooper's Kyle is evidently designed to stand for all our soldiers. For me the film raises the issue that has caused my own generation so much trouble for five decades: how to separate a mistaken war from the dilemmas of the men and women who have to fight it.

The month to come, as I noted when considering the movie Selma, marks the 50th anniversary not only of the Selma March but also of the beginning of the American combat role in the Vietnam War. In fact, it was on this day 50 years ago—Feb. 9, 1965—that a U.S. air-defense battalion was sent to Vietnam. Today's Americans must be reminded of the sheer scale of that effort. Our commitment to Iraq never reached 200,000 men; the commitment to Vietnam reached half a million by early 1968 and stayed there for two years. A draft filled the ranks of our huge Army, either by conscripting men or by inducing others, like me, to enlist, whether for active duty, in the Navy or Air Force, or in the reserves.

When it became clear in 1968-69 that ever-increasing casualties were not bringing us closer to victory, the Boom generation largely turned against the war, including many of those drawn into the Army. Feeling against the military became very strong, although there was much less resentment of the soldiers themselves than is often misremembered today. President Nixon introduced a draft lottery in the fall of 1969 and began significantly to reduce our forces in South Vietnam in 1970. In 1973, he brought the draft to an end. Not only did the Vietnam experience end the draft, but it kept the United States out of any similar wars for the last 15 years of the Cold War, mainly because military leaders knew that another such catastrophe could mean the end of the military as they had known it.

More recent wars have used different strategies. When George H. W. Bush decided to expel Saddam Hussein from Kuwait in 1990, Colin Powell, then Chairman of the Joint Chiefs of Staff, insisted on an overwhelming force, including many units from the reserves and National Guard, which had largely stayed out of Vietnam. That was quick and successful. When George W. Bush went into Iraq in 2003 with a much larger objective than had been the goal during the Gulf War, Donald Rumsfeld insisted that the military try to achieve it with about half the forces, and far fewer allied troops, than Bush Sr. had used to liberate Kuwait. Until 2007, Rumsfeld insisted, in effect, that he had been right, and the military focused upon getting out of Iraq as soon as possible. But the situation was obviously deteriorating, and General Petraeus and the surge managed to stabilize it. President Obama came into office promising to end American participation in the conflict—which is what the Iraqi government wanted as well. In 2010, he did so.

But there were also similarities between Vietnam and the occupation of Iraq. In Vietnam, the South Vietnamese government could remain in power in Saigon as long as American troops remained in the country and American air power was available against North Vietnamese offensives, but Americans could not secure the allegiance of the South Vietnamese people for that government. The problem in Iraq was remarkably similar. Petraeus proved that a mixture of force and blandishments could bring the Sunni areas of Iraq under control, but their future depended on their relations with the new Shi'ite-led government. Nuri Al-Maliki, whom the US hand-picked and stuck with until he faced complete disaster just a few months ago, refused to give the Sunnis any real autonomy, and after his government tried to crush their resistance, they were easily won over by ISIS, a direct descendant of Al-Qaeda in Iraq. Now American air power has been brought back to contain further ISIS advances, but it cannot create a political alternative.

That is the broader context within which Chris Kyle spent several tours in Iraq. As a veteran and a military historian myself, I am rather surprised by the opinion that there is something dishonorable about being a sniper.

The point of battle is to kill the enemy before he kills you, and concealing yourself and firing from a distance is an excellent way to do that. A sniper rifle is a much more discriminating weapon than an artillery shell or an airborne bomb. American Sniper did make the Iraq war look somewhat more traditional than it was. A viewer would never guess that Improvised Explosive Devices were the biggest threat to American troops, and I would be very surprised if an American unit actually had to face the kind of assault that is the climax of the film. But the movie is a fairly realistic portrait of warfare, and it highlights the particular insanity of warfare in the digital age, complete with cell-phone calls from loved ones at home in the midst of battle. It is also in large part about PTSD, a big part of every war, and especially of this one. The American people do have to face what is involved for our soldiers in fighting this kind of war.

Yet it was extremely difficult to come away from the film feeling that Kyle and his fellow soldiers had accomplished anything meaningful. Yes, they killed Iraqis and foreign fighters who were trying to kill them—but only because the US government had decided to send them into Iraq. In an interesting omission, the film never shows American troops cooperating with friendly Iraqi forces. We now know, although not from the film, that the government we set up failed to pacify the Sunni population, and that the civil war our invasion unleashed in Iraq has become just one episode in a regional Shi’ite-Sunni struggle that threatens to become a new Thirty Years’ War. And we know that however many bad guys Kyle managed to kill, the supply of them still seems to be endless.

In every war, men (and now women) fight for many reasons, more or less effectively and more or less heroically. The experience, as so much literature has shown through the ages, brings out the best and the worst of humanity. But in the modern age at least, meaningful war requires meaningful and achievable objectives. Those were lacking in both Vietnam and Iraq. And that, not the portrayal of a single soldier in a film, is what we must focus upon—all the more so since the war in the Middle East still continues.

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME Opinion

Why the Founding Fathers Wouldn’t Have Been Anti-Vaxxers

Hulton Archive/Getty Images: Postcard of 'The Signing of the Declaration of Independence,' painted by John Trumbull

Look to the 18th-century philosophers who created the modern world

Are you a vaccination skeptic? Or are you skeptical of the vaccination skeptics? Your answer will most likely depend less on science and more on political ideology. The science jury is in when it comes to vaccinations, as it is for climate change and evolution. Vaccinations work, climate change is real and evolution happened. But, though skepticism in all three cases tends to be the product of politics, to doubt science is to run up against the very heart of America’s political framework.

The founding principles of America were the product of 18th century Enlightenment thinkers who were inspired by 17th century scientists such as Galileo and Newton. (This is an argument I make in my new book, The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom.) The experimental methods and analytical reasoning of science that these Enlightenment thinkers consciously applied to solve social, political and economic problems created the modern world of liberal democracies, civil rights and civil liberties, equal justice under the law, free minds and free markets, and prosperity the likes of which no human society in history has ever enjoyed.

The founding fathers of the United States often referred to the "American experiment" and to democracy as an "experiment" because democratic elections are analogous to scientific experiments: every couple of years you carefully alter the variables with an election and observe the results. If you want different results, change the variables. Part of the reason that democracies systematically replaced autocracies was the scientific appeal of empowering individuals with a methodology to solve problems instead of an ideology to obey.

Many of the Founding Fathers were, in fact, scientists who deliberately adapted the method of data gathering, hypothesis testing and theory formation to the construction of a nation. They understood that no one knows how to govern a nation in all its complexities, and so they constructed a system that would allow constant tinkering to adjust for unforeseen circumstances. Instead of thinking about government as a place where power is up for the taking, they saw it as a social technology for solving problems. Their conception of democracy was not dissimilar to their vision of science, as Thomas Jefferson articulated it in 1804: "No experiment can be more interesting than that we are now trying, and which we trust will end in establishing the fact, that man may be governed by reason and truth."

Consider the principles underlying the Declaration of Independence. We usually think of this great document as a statement of political philosophy, but it was, in fact, a type of scientific argument. Consider this sentence, one of the most famous in all political philosophy: “We hold these truths to be self-evident, that all men are created equal….” In Thomas Jefferson’s first draft he penned “We hold these truths to be sacred and undeniable.” Why did he change it? He didn’t. Benjamin Franklin did. Here is what happened, as described by Walter Isaacson in his biography Benjamin Franklin: An American Life, in a passage that reveals the scientific foundation of one of the greatest political tracts ever published:

The idea of “self-evident” truths was one that drew less on John Locke, who was Jefferson’s favored philosopher, than on the scientific determinism espoused by Isaac Newton and on the analytic empiricism of Franklin’s close friend David Hume. In what became known as “Hume’s fork,” the great Scottish philosopher, along with Leibniz and others, had developed a theory that distinguished between synthetic truths that describe matters of fact (such as “London is bigger than Philadelphia”) and analytic truths that are self-evident by virtue of reason and definition (“The angles of a triangle equal 180 degrees”; “All bachelors are unmarried.”) By using the word “sacred,” Jefferson had asserted, intentionally or not, that the principle in question—the equality of men and their endowment by their creator with inalienable rights—was an assertion of religion. Franklin’s edit turned it instead into an assertion of rationality.

Sticking with science paid off. Where people embraced the Enlightenment worldview that morals and values must be grounded in reason and science, it was no longer acceptable to merely assert that your beliefs, morals and ways of life are better than others. After the Enlightenment it was necessary to provide reasons for your beliefs and values, and those reasons had better be grounded in rational arguments and empirical evidence or else they could be ignored or rejected.

By contrast, countries that quash free inquiry, distrust reason and practice pseudoscience, such as Revolutionary France, Nazi Germany, Stalinist Russia, Maoist China and, more recently, fundamentalist Islamist states, have historically tended to stagnate, regress and often collapse. Theists and post-modernist critics of science and reason often label the disastrous Soviet and Nazi utopias as “scientific,” but their science was a thin patina covering a deep layer of counter-Enlightenment, pastoral, paradisiacal fantasies of racial ideology grounded in ethnicity and geography.

This idea of equal rights for individuals is the product of the Enlightenment, as is the principle of free speech and the use of reason in an open dialogue that forces us to consider the merit of what the other person is saying. And if the other person makes sense, their superior ideas gradually chip away at our prejudices. Reason alone may not get us there. We need legislation and laws to enforce civil rights. But these institutions are premised on law being grounded in reason, and the legislation being backed by rational arguments. Without that, there is no long-term sustainability to moral progress, as it is just a matter of might makes right. To make morals stick you have to change people’s thinking. And more than any other it is the Classical Liberal worldview grounded in reason and science that is bringing about moral progress—even when politics get in the way.

Michael Shermer is the Publisher of Skeptic magazine, a monthly columnist for Scientific American, the host of the Skeptics Distinguished Science Lecture Series at Caltech, and a Presidential Fellow at Chapman University. His latest book is The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom (Henry Holt, 2015).

TIME Opinion

Why a Bobby Jindal Portrait Sparked a Racial Controversy

Alex Wong—Getty Images: Louisiana Governor Bobby Jindal delivers remarks during the second day of the 40th annual Conservative Political Action Conference (CPAC), March 15, 2013, in National Harbor, Md.

History validates the Twitter storm over a portrait of the Louisiana Governor

“Who’s the white guy?”

Or so went the jokes of many Twitter users who saw a photo tweeted widely this week of Indian-American Louisiana Gov. Bobby Jindal's purported "official portrait," in which, well, he's been whitewashed to the shade of Benedict Cumberbatch. It turned out it wasn't actually Jindal's official portrait but instead a portrait loaned by a constituent, according to Jindal's chief of staff, Kyle Plotkin, who shared on Twitter the actual painting (still with brightened skin, albeit less so) while slamming users for "race-baiting."

But the backbone of these "race-baiting" accusations—often aimed at liberals who slam Jindal for not being "brown enough"—is the assumption that the portraits' lightened skin tone doesn't really matter. Paint has nothing to do with race, says that argument, and in any case, skin color is not one of the Governor's defining features. Besides, the argument continues, the color of a painting certainly shouldn't imply what skin color Jindal or someone else wishes he had.

That’s wishful thinking. For however inconsequential the object of controversy is, the portraits are capable of evoking a deeply unsettling reaction. That’s because they recall a dark history with lasting consequences. In a nation whose first lawmakers had constructed American identity based largely on whether European, Asian and African immigrants’ complexions appeared sufficiently “white”—a category that had been molded and manipulated from America’s early years—that Jindal’s portraits appear to have been scrubbed of his race matters greatly. A “white” complexion once afforded the right to a political voice; it was the lifeblood of the dominant majority.

Jindal’s skin tone in his portraits matters especially because it suggests that the “official” image of an American political leader is someone that is not of South Asian or Asian race. The touchy question of skin color remains regardless of the portrait maker’s intent, because throughout history, and arguably still today, differences in skin tone, such as those between Jindal’s portraits and Jindal himself—even if just a few shades—were specifically used to construct race and Americanness.

In the mid-1700s, the category of whiteness had been open to only Anglo-Saxon immigrants, and not even to Europeans like Italians, Spaniards, French or Swedes—they were "swarthy," said Benjamin Franklin in 1751, while Africans were "black or tawny" and Asians "chiefly tawny." But the acceptability of "swarthy" skin shifted as waves of Asian immigrants entered North America in the 19th century, and as popular imagery of colonial Indians in British Columbia or cheap Chinese laborers in the U.S. continued to liken them to black slaves: dark, faceless, subordinate. Their racialization as disposable and immutably foreign, in contrast to the better-assimilated European labor migrants, in turn lifted these "swarthy" European immigrants to a sufficiently high racial status to merit the title "free white persons."

That was over two centuries ago, but the physical basis for whiteness had permanently shaped what it meant to be a citizen of America, and later, what it meant to be simply a resident of America. It was exactly 98 years ago—Feb. 5, 1917—that the Asiatic Barred Zone Act of 1917 outlawed nearly all Asian immigration, adding South Asia to a blacklist that had already included China and Japan, by blocking “any country not owned by the U.S. adjacent to the continent of Asia” along specific longitudes and latitudes. The act was only the latest of several racially-motivated exclusionary laws that further conflated Americanness with certain physical characteristics, the most obvious of which was skin color. Alongside other physiognomies and cultural stereotypes, skin color determined not only white or non-white, not only eligible citizen or unworthy alien, but also acceptable or unacceptable.

The racial significance of lighter versus darker skin intensified in 1923, when the Supreme Court heard the case of Bhagat Singh Thind, an Indian immigrant. Thind, who was of Punjabi origin like Jindal, had petitioned to regain the U.S. citizenship that a local court had granted him for his military service and that was later revoked when a federal court, upon review of his citizenship documents, discovered that Thind was of Indian origin. Thind's legal argument was that he was a North Indian, a group several ethnological studies had classified as Aryan—the "swarthy"-skinned persons who had been deemed sufficiently "white." Thind thus argued that he was a "free white person," the problematic clause that would be struck from naturalization law only in 1952.

Justice George Sutherland rejected Thind's argument, in line with his ruling in the parallel case Ozawa v. United States (1922), in which a Japanese immigrant had argued that his relatively pale, "white" complexion should afford him the right to citizenship. Sutherland wrote in his opinion in Thind's case:

The immigration of that day [late 18th century] was almost exclusively from the British Isles and Northwestern Europe, whence they and their forbears had come. When they extended the privilege of American citizenship to “any alien, being a free white person,” it was these immigrants—bone of their bone and flesh of their flesh—and their kind whom they must have had affirmatively in mind …

… It may be true that the blond Scandinavian and the brown Hindu have a common ancestor in the dim reaches of antiquity, but the average man knows perfectly well that there are unmistakable and profound differences between them to-day.

As Sutherland’s statement suggests, markers of racial differences deserved attention insofar as they maintained an established order. In the following decades, it slowly became strategic for lawmakers to abandon invoking racial differences. During the Cold War era, for example, Soviet and Asian propaganda had scandalized America’s racial inequality as conflicting with the nation’s support of self-determination. In response, the U.S. established a de facto PR agency to send overseas the stories of successful black and Asians in America, purporting that racial issues were clearly overblown. Later, during the late civil rights era, popular media amplified stories of the economic success of some Asian Americans to discourage black activists’ militancy by suggesting that race no longer limited social mobility.

Modern America might be a different place if the distinction between a lighter-skinned Jindal and a darker-skinned Jindal were a mere question of artistic vision. But today, in an age of expanded civil rights, this pick-and-choose attitude toward race has only heightened. The decision whether to dissect or ignore the paint color of Jindal's portraits is but a small yet important choice among larger, modern issues. It's about whether post-9/11 airport security unfairly targets those who appear to be Middle Eastern; whether affirmative action is anti-Asian; whether grand juries would return different decisions if the defendant were not black. At its core, what Plotkin decries as "race-baiting" is a question of who has the power to decide when an issue deserves to be investigated in racial terms. Choosing to throw the "race-bait" accusation is simply a convenient disengagement from these issues, all of which are complicated by histories that conflate complexion with race, and race with power.

Because, really, why would anyone inherently enjoy the idea of unwanted racialization? As Plotkin’s tweets suggest, that stuff is just plain annoying.

TIME Opinion

Judging the Couple Who Locked Their Kids In a Car to Go Wine Tasting

Schadenfreude is modern parenting's favorite spectator sport.

A Washington, D.C. couple is under arrest after leaving their two young children locked in the car while they were wine tasting at a local restaurant. Yes, wine tasting.

The parents, identified as Christopher Lucas, 41, and Jennie Chang, 45, left their 22-month-old boy and 2½-year-old girl strapped in their car seats in a locked car while they went to wine and dine at a restaurant near the Ritz-Carlton. The temperature was hovering near freezing, according to the Washington Post, and neither child had a hat or gloves; one had bare feet. The parents felt it was okay to leave their kids locked in their Volvo because they were at a restaurant just around the corner and had left an iPhone on to monitor the two children.

“I left to go inside the restaurant,” Lucas said, according to the report, “but I’m watching them.” The parents were gone for an hour, and according to police who checked surveillance cameras, they never came to check on their children. A resident of a local apartment building called police after watching the car for 20 minutes, according to the Post, while NBCWashington reports that another passerby dialed 911 after hearing the little girl sobbing.

The children were brought into a police car to be warmed up; they were checked out by paramedics and were in good health, police said. The parents returned as police were investigating, but the children were turned over to D.C. Child and Family Services, and Lucas and Chang were arrested on two counts of attempted second-degree cruelty to children, which carries a maximum 10-year prison sentence. Their own stupidity, though, will last a lifetime.

To be frank, it seems clear that the parents are idiots. Lucas runs a software company and Chang works for the USDA; they drive a Volvo and live in a townhouse, according to the Post. All solid life choices. Despite this: idiots. Idiots for drinking wine while their children were locked in a car in near-freezing temperatures. Even bigger idiots because these parents clearly had the resources to hire a babysitter for the afternoon. Luckily the children were fine, which is what makes this case so prime for one of the favorite pastimes of modern parenting: Parental Schadenfreude.

Schadenfreude is taken from the German and means "harm-joy"; it's usually used to connote some pleasure derived from the misfortunes of others. In this case, the chucklehead parents. To be clear, this is not about the kids. The kids survived the parents' lousy idea and were just the innocent victims of some astoundingly poor parenting. These parents were arrested for making not just a bad choice, but an astonishingly bad choice. These kids weren't left alone in a car for five minutes while the parents ran into the mini market, they weren't napping in strollers while their parents watched from inside a coffee shop, nor were they 9-year-olds playing at the park while their mother worked. This isn't free-range parenting or an unfortunate but understandable reality for impoverished working parents. It's two seemingly well-educated, upper-middle-class parents who left their toddlers alone for an hour while they imbibed at a tony restaurant around the corner. This is not a mistake that most of us would make. Hence the schadenfreude.

There’s a certain glee that comes with watching other people screw up worse than you, especially when it comes to modernity’s high-stakes parenting. While you may leave your sleeping infant in a car for a minute to buy a gallon of milk or forget to pick up your kid from preschool before 6p.m., not bother to check for trans fats, like, ever, or even drop the baby while trying to cram him into an Ergo, you’re still not even close to locking your children into a car in near-freezing temperatures causing concerned strangers to dial 911 while you’re cozied up around the corner noting the subtle flavor profile of a glass of Rioja.

Thanks to your passable parenting skills, you can click on the headline as you scroll past it on your newsfeed and shake your head in disbelief at the mistakes of others. You can nod along with the local newscasters as they decry the poor decision-making skills of those parents. You can even recognize that parents with two children under the age of 2 probably really needed a glass of wine, while still rolling your eyes at their child care choice. You can understand it, but you would never ever do it, so you can tsk tsk tsk away.

In short, thank you to the police for doing their job and protecting those children and thank you to these parents for making almost everyone else look good by comparison.

TIME Opinion

How to Make History by Tweeting an Old Photo

Arthur Brower/Getty Images: Civil rights activists Norman Hill, Bayard Rustin and Frederick D. Jones in 1964

Why I started the hashtag #HistoricPOC: to prove that people of color are part of the past

Recently, a Tumblr discussion of Agent Carter — the post-World War II Marvel miniseries currently airing on ABC — turned to the fact that the cast is predominantly white. Unsurprisingly, opinions ran strong. But according to some, there was no reason to get worked up: after all, life in 1940s New York was so segregated that — even in superhero-based fiction — no person of color would be present in the same working or living spaces as a white woman, unless they were a servant.

But, in reality, President Roosevelt desegregated federal employment by executive order in 1941.

That’s why I started the hashtag #HistoricPOC on Twitter and Tumblr. I encouraged fellow users to post pictures of people of color (POC) throughout history. Whether they posted family photos or links to famous images, I wanted there to be an easily accessible visual historic record. It doesn’t matter if someone had any training in history; all that was required were photos and some idea of when they were taken. Users of the tag posted pictures of family members’ celebratory moments, important events and even some of the truly mundane aspects of day-to-day life. All of that history is relevant. All of it is important. All of it is proof that they were there too. We cannot erase the people who lived through the past from the spaces they inhabited, and it is incredibly important not to whitewash the past as that can only lend itself to racist myths about the roles of POC in creating and sustaining society.

At the time Roosevelt wrote Executive Order 8802 and established fair employment practice in the defense industries, federal employment was not fully segregated, despite the best efforts of some of his predecessors. People like Mary Church Terrell and W.E.B. DuBois had fought President Woodrow Wilson's efforts to implement Jim Crow laws in the federal service. Black men (like Adam Clayton Powell Jr. and William Dawson) were in the House of Representatives at a time when current popular media — in a bizarre reflection of the Jim Crow mores that made Blackface more popular than Black people in many movies — would have you believe that the past was all white, unless the topic is the oppression of POC. There is more to our history than the pain of legally sanctioned brutality depicted in Roots, Selma, or The Help. To show only those aspects of history might make a modern viewer think that the people of color of the past had no power to effect change. Nothing could be further from the truth: the roots of any liberation enjoyed by modern marginalized people can be found in the work done by those who came before us, who (whether we are taught it or not) stood up against oppression in ways that are still impacting us today.

Was the agency of Black people — the agency of all people of color — limited during Jim Crow? Absolutely. It is still limited by structural racism. However, a history that paints us as objects with no power is a false one indeed. Erasure is not equality, and we need to see what happened in the past, to know how we should engage in the present and the future. There is an old saying that history is written by the victors. It’s a great shorthand for the reality that what we think we know of the past is framed by the people teaching us about it, whether with a textbook that skims right over the details of life in between major events like slavery and the end of Jim Crow laws, or media that portray the past as the province of a particular ethnic group. The reality is most people only know as much history as they’ve been taught before the age of 18. After high school, any further study of history is largely optional; interests can be heavily slanted towards one area with no need to learn more about anyone or anything else.

Thus, the fact that Martin Luther King Jr. was born only six months before Anne Frank escapes most people simply because the two icons are never situated in the same lessons. Others who were instrumental in changing the world are erased simply because they aren’t spoken of at all in textbooks. We may hear more about Rosa Parks than Claudette Colvin, but both were instrumental in effecting change. The same is true of Bayard Rustin, whose sexual orientation meant that even those who worked directly with him rarely spoke publicly about the importance of his contributions to the Civil Rights Movement. We may know that Josephine Baker danced in a banana skirt while never hearing that she was awarded the Croix de guerre (Cross of War), or that she was made a Chevalier of the Légion d’honneur by General Charles de Gaulle for her work with the French Resistance during World War II.

Because of how history is taught, we tend to think of it as discrete events, even though it is all interconnected. #HistoricPOC isn’t a comprehensive look at history, but an effort to use social media to bring out more facts that might otherwise be ignored. History was written by the victors, but now anyone can document the past, and because Twitter and other social media sites are being archived, the use of the hashtag can make that data more accessible to the casual researcher. To know your history — to really know it — is to be proud of where you came from, and to be equipped with a certainty that even the smallest of steps forward can make a long term difference.

As events unfold in the current fight against police brutality in marginalized communities, a truer representation of history can be inspirational — and, more importantly, it can show us how communities worked together to achieve common goals. There is no fairness in the workplace without the labor movement and the Civil Rights Movement making their individual and joint contributions. Nor does any movement happen solely because of the efforts of one person. Rosa Parks, Bayard Rustin, Martin Luther King Jr., Malcolm X, James Boggs, Grace Lee Boggs, Josephine Baker, Ossie Davis, Ruby Dee and Harry Belafonte are the names you might recognize, but there were thousands of people working together to make change happen. A history that erases the importance of money contributed by entertainers, domestic workers and policy players is one that dehumanizes and demeans their sacrifices. Respectability might make some historical figures more appealing than others, but as we are seeing right now in Ferguson, in Ohio, in Chicago, in New York and in so many other places, respectability isn't required to do good and meaningful work against institutional oppression.

#HistoricPOC is written by all of us who want to participate, about all of our ancestors — not just the ones who made it into the history books.

Mikki Kendall is co-editor of hoodfeminism.com, a cultural critic, a historian by training and an occasional feminist by choice.

TIME celebrity

The World’s Obsession With Amal Isn’t About Her Accomplishments

Sandro Weltin/Council of Europe/EPA: Lawyer Amal Alamuddin Clooney attends the hearing in the case Perincek vs. Switzerland at the European Court of Human Rights in Strasbourg, France, Jan. 28, 2015.

Charlotte Alter covers lifestyle, crime, and breaking news for TIME in New York City. Her writing has also appeared in The New York Times and The Wall Street Journal.

They're real, but the gushing isn't

Amal Clooney is at it again— doing something celebrities don’t usually do, and looking like a movie star while doing it.

This time, she’s arguing in the European Court of Human Rights against a Turkish politician who denied the existence of an Armenian genocide 100 years ago in which more than 1.5 million people were brutally murdered. That’s, like, sooo impressive… but who is she wearing?

When a reporter from The Telegraph asked her, she cheekily replied "Ede and Ravenscroft," the legal robes maker that has been selling drab black judge costumes since 1689, the year Benjamin Franklin's parents met.

Once she did that, the focus shifted from the history of the Armenian genocide to Amal's sense of humor and fashion choices. The global reaction to her comments was proof that the jig is up: it's time to stop pretending you care about what Amal Clooney is doing, when you really just care about how she looks while doing it.

The public obsession with Amal Clooney has been outwardly focused on her professional accomplishments, and with good reason. She’s represented high-profile clients like Julian Assange and former Ukrainian Prime Minister Yulia Tymoshenko, fought for the Elgin Marbles to be returned to Greece, and worked to free three Al-Jazeera journalists imprisoned in Egypt. She’s done more in the last ten years than many lawyers do over their entire career.

It sounds great, and it is. But the gushing adoration in the media about her work is false appreciation that crumples under scrutiny. How many other human rights lawyers inspire anything close to Amal-mania? Look at Samira al-Nuaimy, the Iraqi human rights lawyer who was executed by ISIS last year. If the tabloid-buying American public were so obsessed with human rights, why wasn't she on the cover of InTouch?

Let’s face it: no matter how real Amal’s accomplishments are, the breathless celebration of her legal triumphs is just a thinly veiled infatuation with how she looks.

When placed in the glare of celebrity, Clooney’s binders of legal documents and folders of case material become accessories to her shiny hair and perfect manicure, instead of the other way around. What’s worse, there’s something grotesque about using serious work on behalf of genocide victims as a pretense for a fixation on her looks, her clothes, and her marriage to one of the world’s most eligible actors.

Amal’s beauty is the unspoken end of every sentence about her legal career, the sub-head to every headline about her human rights work. Even if the coverage is ostensibly focused on Turkish politics, or the Elgin marbles, or sexual violence in conflict zones, the substance get inevitably lost in the subliminal hum over what Amal’s wearing, how Amal’s hair looks, and the fact that Amal is married to George Clooney. It even happens when there’s nothing to report—the Armenian genocide case was overshadowed by Amal’s non-outfit (she was wearing essentially the same thing as all the other lawyers in the room).

It’s also a weird over-correction to the common sexist problem of focusing on women’s looks over their careers. Instead of focusing on the looks of an accomplished woman (like Kirsten Gillibrand), the media is loudly proclaiming how not-sexist they are by obsessively trumpeting Amal’s professional accomplishments, then mentioning her beauty as a super-conspicuous after-thought.

But discussing Amal Clooney’s human rights work in the same tone as Kim Kardashian’s workouts or Jennifer Lawrence’s pizza cravings isn’t just awkward— it’s bizarre. Imagine if other human rights activists were treated the same way. Next it’ll be “Watch Ban Ki-Moon Go to the Gym Without Makeup” or “Malala’s Celebrity Crush: REVEALED!”

Some celebrities use their existing fame to shine a light on problems in the world, like Amal’s husband’s best friend’s wife Angelina Jolie, who recently wrote an op-ed in the New York Times demanding improved conditions in Syrian refugee camps. But that’s a different story, because Jolie came to activism after she got famous. She’s getting her picture taken in refugee camps and giving impassioned speeches at the U.N. precisely to direct those who are interested in her hair and clothes towards something more important.

But Amal’s just doing her job. Her work isn’t celebrity activism or a publicity stunt. Yet when it’s put in the context of celebrity fodder, Amal Clooney’s work on behalf of marginalized people gets reduced to just another thing a woman does while being beautiful.

So stop gushing. Stop with the headlines that trumpet Amal as a goddess for doing her job. Stop with the shock and awe that someone so beautiful could be so smart as well. Just let Amal keep doing her thing.

TIME Opinion

The Trouble With Disney’s Teeny, Tiny Princesses

Pixar/Disney: Queen Elinor and King Fergus in Brave

A culture populated by absurdly small princesses and hulking male heroes can change the way men and women see themselves

Disney has taken a lot of flak for perpetuating sexist stereotypes in its princess movies. In today's competitive, every-moment-counts child-rearing culture, American parents want their kids' entertainment to be not just fun, but also fulfilling. So if a movie sends the wrong message, many parents stay away. That's why the company has responded to the criticism, shaping more recent princess movies such as Frozen and Brave around female characters for whom romance is not the primary motivation.

I welcome this evolution. But there’s still a lot to wonder about — and even complain about — in today’s animated children’s movies, especially in the radical differences between male and female bodies.

Yes, on average real men’s bodies are bigger, and more muscular, than women’s. And yes, animation is an art form not restricted to the boundaries of realism, which is what makes it great. But the exaggerations in these children’s movies are extreme, they almost always promote the same image of big men and tiny women, and they are especially dramatic in romantic situations.

Consider just the differences in hand size. Here are the hands of romantic couples in (clockwise from top left): Frozen, How to Train Your Dragon 2, Gnomeo and Juliet, Hercules, Tangled and Brave.

Disney (4); Dreamworks; Touchstone Pictures

The differences between men’s and women’s hands and arms in these pictures are more extreme than almost any you can find in real adults. The men’s hands are routinely three or four times larger than the women’s. For comparison, I checked a detailed report that the Army commissioned to design its equipment and uniforms. In real American adults, for example, men’s wrists are on average only about 15% larger in circumference than women’s. In that scene from Frozen, not only is Anna’s hand tiny compared with Hans’, but in fact her eyeball is wider than her wrist.

In the Hercules scene, his bicep is about 2.8 times wider than hers, while the very biggest man in the Army report had a bicep just 2.1 times bigger than the very smallest woman (that bicep difference is also greater than that observed between Shaquille O’Neal and his former wife, Nicole Alexander). The same is true of their neck and wrist measurements.

In the case of Hercules, we can actually compare the Disney depiction to ancient renditions of the demigod and his mistress. From 4th century mosaics to Alessandro Turchi’s 17th century painting, the demigod is portrayed relative to Megara in much more normal human proportions. I know Hercules is not supposed to be a regular human, but if he’s really a different species, maybe Disney shouldn’t feature him kissing a girl in a children’s movie.

(There are exceptions to the Disney/Dreamworks model of couples, even in modern animation. Consider, for example, the teen couple in Japanese animator Hayao Miyazaki's magical film Kiki's Delivery Service; Marge and Homer Simpson; or, of course, Charlie Brown and Lucy. Even the older Disney classics, like the 1937 Snow White and the Seven Dwarfs, had much more normally proportioned couples.)

Because humans reproduce sexually, there are obvious differences between males and females, called sexual dimorphism. However, in the grand scheme, as the sociologist Lisa Wade puts it, “men and women are overwhelmingly alike”; our similarities outweigh our differences. Still, we choose whether to highlight the differences that are apparent. And the amount of energy we devote to emphasizing and acting on the different qualities of men and women changes over time and varies across cultures.

Artists have been pairing men’s and women’s bodies for millennia. And even in art that was not intended to be realistic, the sex differences were usually not as dramatic as those seen in modern children’s movies.

Consider these three works of art. The first is Seated Man and Woman, a sculpture from Mexico about 2,000 years old, showing obvious but modest differences in body type. The second is Michelangelo’s famous rendition of Adam and Eve from the ceiling of the Sistine Chapel, completed in 1512, in which Eve’s robust physique is comparable to Adam’s. And the third is the classic American Gothic, by Grant Wood, from 1930.

Dallas Museum of Art; Getty Images (2)

I wouldn’t argue that differentiating the sexes in animated movies is the most pressing problem we face today. But I do think the choices that artists and producers make — and the popularity of their choices — gives us a window into important cultural dynamics.

In my own area of research, families and gender, many of our modern debates revolve around the different roles that men and women play. Can men warmly nurture children and work as nurses? Can women successfully lead families and companies? The differences between mothers and fathers can create comfortable compatibilities with obvious benefits. But unless we see that men and women have physical, emotional and cognitive qualities in common as well, we will continue to treat single parents — and same-sex couples — as fundamentally deficient instead of evaluating them as complex people with their own strengths and weaknesses.

Having written about this subject frequently in the past few years, I know many people will disagree, arguing that the fundamental differences they perceive between men and women are natural and should be embraced. But what we think of as normal is not simply natural; it’s a product of the interaction between the natural world and our cultural ways. When the beautiful and romantic stories we grow to love in childhood set a standard that exaggerates gender differences and makes them seem natural — built into our very bone structures — it gives us a more limited, and less complex, vision of our human potential.
