TIME ebola

The #Ebowla Jokes Need to Stop

Charlotte Alter covers lifestyle, crime, and breaking news for TIME in New York City. Her writing has appeared in The New York Times and The Wall Street Journal.

As New Yorkers joke about catching Ebola from a bowling ball, just imagine how scared Dr. Craig Spencer must be

As soon as New Yorkers learned that Dr. Craig Spencer, a volunteer Doctors Without Borders physician who had recently returned from West Africa, had been diagnosed with Ebola, panic set in. And as soon as people learned that he’d been bowling the night before, that panic appeared to turn to a kind of sick joke. #Ebowla started making the rounds on Twitter, and finally there was something about Ebola that seemed kind of funny.

As the alarming details of Dr. Spencer’s New York adventure emerged—a heroic stint caring for the sick in Guinea, a flight home, and then later, a subway ride, a walk along the High Line, a meal at a restaurant, an Uber ride—the fact that he went bowling the night before checking himself into Bellevue Hospital, where he was isolated immediately, was the detail that captured the collective imagination.

And just as quickly, that fact turned into a deluge of Twitter jokes, each one trying to be funnier than the last.

One could argue that this was a groupthink defense mechanism, a way to distract ourselves from the horror of Ebola’s presence in America’s most populous city.

But somewhere along the line, the tone changed. It stopped being about bowling and started being about Spencer and his character.

This is a guy who signed up to work with Doctors Without Borders, arguably one of the more difficult jobs in the world, to help strangers in one of the most dangerous health zones on the planet. If you found out a United States Marine had been playing Candy Crush right before he got blown up by a landmine, would you be laughing then?

Some expressed a similar callousness toward Amber Vinson, the Dallas nurse who contracted Ebola while treating Thomas Eric Duncan and then flew—with approval from the Centers for Disease Control and Prevention—from Cleveland to Dallas.

Do people think Vinson wanted to catch Ebola? Thankfully, Vinson appears to have recovered from the virus, according to NBC.

Some of Spencer’s critics are saying that because he began to feel sluggish on Tuesday, he should have immediately stayed home. These must be people who have themselves never felt a little worn out on a rainy day. Spencer told doctors he was taking his temperature twice a day as a precaution, and he did not yet have a fever on Wednesday, which means he was not yet symptomatic of Ebola; people with Ebola are not contagious before symptoms appear.

Doctors Without Borders said Thursday night that Spencer had followed all recommended protocols for medical workers returning from the afflicted regions. “As long as a returned staff member does not experience any symptoms, normal life can proceed,” the organization said in a statement. “Self-quarantine is neither warranted nor recommended when a person is not displaying Ebola-like symptoms.”

While the Twitterverse is having a good chuckle over #Ebowla, Spencer is in an isolation ward. Details of his condition have not yet been released, but it’s easy to imagine his psychological state. He must be terrified. He’s just spent a month watching what Ebola does to afflicted bodies, and now he’s alone, surrounded by hazmat suits, unsure if he’ll ever touch another human being. As Ebola survivor Dr. Kent Brantly wrote in TIME:

“During my own care, I often thought about the patients I had treated. Ebola is a humiliating disease that strips you of your dignity…I finally cried for the first time when I saw my family members through a window and spoke to them over the intercom. I had not been sure I would ever see them again.”

Spencer’s fiancée, Morgan Dixon, is also in isolation at Bellevue. Imagine how she must be feeling. Yesterday was just a normal New York morning, but last night she and her fiancé each went to sleep alone, and while it’s too soon to say for sure, there is the risk—and almost certainly, for them, the fear—that they might never see each other again.

So far, every single person, including Spencer, who has been treated for Ebola in the United States became infected because they risked their lives to help others. That’s true of Brantly and the health worker he worked with, Nancy Writebol. It was true of Thomas Eric Duncan, who’d carried a neighbor to the hospital in Liberia, where she was turned away and sent home. And it was true of the two Dallas health care workers who contracted the virus from Duncan before he died.

We should be praising all of them, not mocking them. And as collective fear has morphed into scorn, the response, on Twitter anyway, is without empathy—and is truly embarrassing.


TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Opinion

This ‘My Little Pony’ Parody Explains Income Inequality

The real world's a little darker than Ponyland

In a typical episode of the 1980s cartoon My Little Pony, Flutter Ponies and Baby Sea Ponies work together to defeat the evil magicians and sinister woodland creatures that seek to disrupt their cheerful way of life. But unlike the goblins and trolls that plagued Ponyland, rising inequality in the U.S. is very real and, in the words of a new Funny or Die parody video, “the crucial political and economic challenge of our age.”

“The Unbelievably Sweet Alpacas” is the brainchild of director Adam McKay, co-founder of Funny or Die and co-writer of movies like Anchorman and Talladega Nights. McKay was enlisted by the We the Economy project, which seeks to “drive awareness and establish a better understanding of the U.S. economy” through a diverse collection of 20 short films.

In the six-minute animated video, three alpacas, voiced by Amy Poehler, Maya Rudolph and Sarah Silverman, visit the lollipop factory to receive their job assignments after graduation. What they find at the factory stands in stark contrast to the theme song in the opening credits, which suggests that if you “just do your best and play by the rules, you’ll have social mobility, Porsches and jewels.” Instead, the alpacas find that nepotism, uneven access to quality education, market forces and government regulations have more to do with job prospects than good old work ethic.

The video’s simple premise allows for an equally simple explanation of the problem, while its satirical bent injects it with an appropriately acerbic tone. “Hey, I just noticed something!” exclaims a lollipop voiced by Billy Eichner. “You three perfectly represent the economic trends of the last 40 years!”

This isn’t Funny or Die’s first use of parody to communicate a social message. Back in July, Kristen Bell starred in a spoof that had Mary Poppins quitting her nanny gig in protest of the too-low minimum wage. As for a solution, raising the minimum wage, both videos hint, would be a good place to start. But it’s going to take more than a few corporations ponying up to get to the heart of the issue.

TIME Opinion

ISIS and American Idealism: Is History Going Our Way?

A member loyal to ISIS waves an ISIS flag in Raqqa, Syria on June 29, 2014. Reuters

In the Middle East, it's theory versus reality

Future historians, I suspect, will look with some puzzlement at the United States’ current effort to “degrade and ultimately destroy” ISIS while simultaneously insisting that President Assad of Syria must step down. Foreign intervention in civil wars is nothing new. France and Sweden intervened in the Thirty Years War between Catholics and Protestants in Germany in the early 17th century, France intervened in the American Revolution, and the United States has intervened in civil wars in Korea, Vietnam and the former Yugoslavia. But while in those previous cases the intervening power took one side of the conflict, in this case the United States opposes both parties. How have we ended up in this position? The answer, I would suggest, goes back at least to the early 1990s, when the collapse of Communism convinced certain intellectuals and the US foreign policy establishment that history was inexorably moving our way.

A new era in world politics began in 1989, with the collapse of Communism across Eastern Europe. In that year a political scientist named Francis Fukuyama, then serving as deputy director of the State Department’s Policy Planning Staff, wrote a sensational article, “The End of History?,” in the conservative journal The National Interest. Communism was about to collapse, and Fukuyama argued tentatively that the world was entering a new era. “What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history,” he wrote, “but the end of history as such: that is, the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.” Within two years Soviet Communism and the Soviet Union itself were dead, and many thought Fukuyama had been proven right. He elaborated his ideas in a scholarly work, The End of History and the Last Man, which appeared in 1992.

Fukuyama had worked with prominent neoconservatives, and neoconservatives in the Bush Administration wrote his fundamental idea into their 2002 National Security Strategy, a blueprint for US domination of the world based upon democratic principles. “The great struggles of the twentieth century between liberty and totalitarianism,” it began, “ended with a decisive victory for the forces of freedom—and a single sustainable model for national success: freedom, democracy, and free enterprise. In the twenty-first century, only nations that share a commitment to protecting basic human rights and guaranteeing political and economic freedom will be able to unleash the potential of their people and assure their future prosperity.” Like the Marxists over whom they believed they had triumphed, the neoconservatives saw history moving in a definite direction, and those on the “right” side believed that they had a right, if not a duty, to push history in the right direction. President Bush repeatedly declared that the Middle East was ready for democracy, and decided to create one by overthrowing Saddam Hussein in Iraq. (Fukuyama, interestingly, declared in 2006 that the Bush Administration and neoconservatism had gone astray.) That did not lead, however, to democracy, but rather to a terrible religious civil war in Iraq, featuring the ethnic cleansing of about four million Iraqis under the noses of 150,000 American troops. The United States finally withdrew from Iraq after seven years of war, and the Shi’ite-led Iraqi government has now lost authority over both the Kurdish and Sunni parts of the country, with ISIS moving into the Sunni areas.

What went wrong? In 1993, Samuel Huntington had put forward an alternative view of the future in a widely read Foreign Affairs essay, “The Clash of Civilizations?,” later expanded into the book The Clash of Civilizations and the Remaking of World Order. To begin with, Huntington—who, ironically, had been a graduate-school professor of Francis Fukuyama’s at Harvard—denied that the western way of life now dominated the globe. How the future would develop, he argued, remained a very open question. Though Huntington painted with a very broad brush, his vision looks more accurate now than Fukuyama’s. The Muslim world is both enormous and diverse, and nothing suggests that Muslims from south Asia through much of Africa are about to embark upon a war with the West. However, most of the major contending factions among the Muslims of the Middle East—the groups that realistically stand to come to power in contested regions like Iraq and Syria—reject, to varying degrees, fundamental principles of western civilization, including religious tolerance and the separation of church and state. Yet various pundits and the leadership of the Obama Administration, including the President himself, remain convinced that the Middle East has a destiny to follow the western model, and that American intervention in their civil wars can encourage them to do so. The Obama Administration reacted to the Arab Spring based upon the assumption that the fall of authoritarian regimes was both inevitable and surely beneficial to the peoples involved. At first that seemed to be true in Tunisia, but the Administration has in effect backtracked on that assumption by accepting the military coup in Egypt, and in Libya and Syria this plan has not worked out at all. Just this week, the New York Times reported that the new freedom in Tunisia has allowed ISIS to recruit numerous fighters there.

Speaking to the United Nations on Sept. 24, President Obama insisted that ISIS must not, and cannot, prevail, because of the evil that it has done. He also called upon the Middle East to reject religious war and called for “a new compact among the civilized peoples of this world” to work against violent ideology. These are inspiring words to American ears, but they are finding almost no echo among the competing factions of the Middle East. For a complex variety of political, religious and cultural reasons, ISIS has commanded more dedicated support than any other Sunni faction in Syria or Iraq. Nor is there any evidence that two of its principal opponents—the Assad regime in Syria and the Shi’ite-led government in Baghdad—share the President’s views on democracy and religious toleration. The Obama Administration has been reduced to trying to stand up a “third force” of more friendly, reliable Sunni insurgents in Syria—a strategy the President rejected a year ago after a CIA paper concluded that it was most unlikely to work.

Nearly 80 years ago, writing in the midst of another great world crisis, an American historian, Charles A. Beard, noted a distressing fact: that history shows no correlation between the justice of a cause and the willingness of men to die for it. This has not changed. We cannot rely upon impersonal forces of history to create a better world. Instead, the current U.S. attempt to impose a vision not supported by any major political group in the region is likely to create more chaos, in which extremism can thrive. We and the peoples of the Middle East both need peace in that region, but that peace must be based upon realities. If we decide ISIS is indeed the most important threat, we shall have to recruit allies from among the actual contending factions, rather than try to build our own from scratch. And, while encouraging cease-fires and the peaceful settlement of ongoing conflicts, we might try to set a better example for the peoples of the world by making democracy work better here at home.

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME politics

Which Republican Party?

Even if it captures the Congress, rivalries could hamper the GOP in power

The genesis of the modern Republican Party may be found in a phone call placed by Arizona Senator Barry Goldwater in the closing days of a deadlocked 1960 presidential campaign between Richard Nixon and John F. Kennedy. With time running out, Goldwater advised GOP national chairman Thruston Morton that Nixon should skip the urban East and concentrate instead on swing states Texas and Illinois. His own motives were far from disinterested. “I’d like to win this goddamned election without New York,” Goldwater told Morton. “Then we could tell New York to kiss our ass, and we could really start a conservative party.”

Four years later, Goldwater got the part of his wish that mattered most. Meeting in San Francisco’s Cow Palace–the same hall where, just eight years earlier, Republicans had renominated Dwight Eisenhower by acclamation–GOP delegates rejected Ike’s Modern Republicanism (“a dime-store New Deal,” sniffed Goldwater) for a sagebrush libertarian who would block federal aid to education, repeal the graduated income tax and make Social Security voluntary.

The stage was thus set for the most divisive GOP convention since 1912, which opened fissures replicated half a century later, as a fading Eastern establishment battled Sun Belt conservatives for the soul of the party. On its second night, a post-midnight donnybrook pitted Goldwater loyalists against their nemesis, New York Governor Nelson Rockefeller. Rockefeller, a modernist in politics as in art, cited the Ku Klux Klan, the American Communist Party and the right-wing John Birch Society as examples of political extremism. As millions of television viewers looked on, he struggled to make himself heard above the booing and catcalls. “You lousy lover,” one woman shouted at Rockefeller, whose recent divorce and remarriage had come to symbolize for traditionalists a popular culture in which judges made war on religion and governors emulated Hollywood adulterers in flouting the marriage code.

What occurred in San Francisco was the excommunication of moderate and liberal elements, presaging today’s GOP–more unswervingly conservative than even Goldwater envisioned. External events played their part in the transformation. As the 1950s Cold War consensus began to fray, racial divisions accelerated the breakup of the old New Deal coalition. The party of Lincoln morphed into the party of Strom Thurmond. Rockefeller-style pragmatism found diminishing support among Republicans for whom government had become an object of suspicion.

From Birchers to birthers, it’s not hard to find parallels between fantasists who imagined Eisenhower “a dedicated and conscious agent of the communist conspiracy” and their latter-day heirs disputing Barack Obama’s origins and loyalty. Obama is hardly the first American President to experience such abuse. In the 19th century, opposition to Andrew Jackson and his policies gave rise to the Whig Party. Depression-era Americans christened shantytowns of tin and cardboard Hoovervilles in mock tribute to their embattled President. Bill Clinton was accused of crimes far worse than perjury, while George W. Bush came in for sustained ridicule, and worse, from the left.

Obama, however, occupies a unique historical position. No mere presidential polarizer, nearly six years into his tenure he defines the opposition party more than his own. Neocons and Pat Buchanan isolationists; Appalachian miners and emotionally bruised billionaires; Mother Angelica Catholics and Ayn Rand objectivists–disdain for the President is seemingly all that unites a coalition as fractious as the one Ronald Reagan successfully bonded through his optimism and conviction politics. How will the GOP cope with life after Obama? We don’t have to wait until January 2017 to find out.

From the outset, the story line of this year’s election has been predictable, unlike many of the races. Would Republicans recapture the Senate after two attempts foiled by the base’s preference for ideological purity over electability? And what would a wholly GOP Congress do to hamper or harass the Obama White House in the continuing effort to tarnish his legitimacy or downsize his place in the history books? (Whether this campaign advances Republican chances to regain the Oval Office in 2016 is another matter altogether.) Massive electoral losses at the same juncture of their presidencies hardly reduced the legacies of Franklin Roosevelt, Eisenhower or Reagan.

The Republican fixation on Obama is just the latest example of a party out of power settling for tactical advantage over the hard work of intellectual renewal. Assume for the moment that at least 51 Republican Senators take the oath of office in January 2015. Will a GOP Senate prefer the ideological red meat served up by Ted Cruz? The war-weary, civil-libertarian message crafted by Rand Paul? Will it follow Marco Rubio through the shifting sands of immigration reform? Will it play to the base, content to remain a congressional party, secure behind its gerrymandered redoubts?

Other Republicans, less incrementalist in their approach, nurture visions of political realignment as sweeping as the Goldwater takeover of 1964. Until last Aug. 5, Justin Amash was the Congressman from Facebook, an obscure Michigan lawmaker and Tea Party favorite noted for his shrewd use of social media to promote a Ron Paul–ish agenda of unquestioning faith in markets, support for a flat tax and opposition to environmental (and virtually all other) regulation. Yet Amash disdains the national-security state no less than the welfare state. Indeed, he may be the National Security Agency’s worst nightmare. Earlier this year he exploited bipartisan anger over NSA snooping to produce a near majority for legislation to rein in the agency’s collection of phone and Internet data.

No small feat for a two-term Congressman, the son of Palestinian immigrants, who had his philosophical epiphany reading Friedrich Hayek’s Road to Serfdom. Then came Aug. 5, and the kind of instant fame–or notoriety–that a lifetime of constituent service fails to produce. Amash handily defeated an Establishment-backed candidate in that day’s Republican primary, but it was his stunningly graceless victory speech that immediately went viral. To his elders it established Amash as the least civil of civil libertarians; to his fellow millennials, on the other hand, such trash talk is confirmation of his authenticity.

Amash’s refusal to honor election-night protocol was inevitably contrasted with the legendary good humor of his most illustrious predecessor from Grand Rapids, Gerald Ford. Yet Ford’s own entry into politics was as an insurgent, taking on an isolationist Republican Congressman who opposed the Marshall Plan and voted the Chicago Tribune line. Later, reeling from Goldwater’s crushing defeat at the hands of Lyndon Johnson and his Great Society, Ford wouldn’t hesitate to challenge his party’s minority leader or demand a more creative response to the question posed with every succeeding generation: What does it mean to be a Republican?

All politics is not local but generational. It was true when 22-year-old Theodore Roosevelt, fresh out of Harvard, ran for the New York State assembly to the horror of his fellow patricians; when 32-year-old Nelson Rockefeller, scion of the nation’s most prominent Republican family, accepted an appointment from FDR to be his Latin American coordinator; when a charismatic young Phoenix businessman named Barry Goldwater, fed up with local corruption, declared his candidacy for the city council; and when Jerry Ford came home from World War II convinced that the U.S. could no longer treat the Atlantic and Pacific as divinely provided moats. None of these agents of change was their grandfather’s Republican.

Is today’s GOP poised for its own break with the past? It’s happened before.

The author of six books of American history, Smith has directed the Lincoln, Hoover, Eisenhower, Ford and Reagan presidential libraries

TIME Opinion

Think Tank Tells Women How to Avoid Sexual Assault: Stop Getting ‘Severely Intoxicated’

AEI

Video says it’s not what men put in women’s drinks, but how many drinks women have

In a vlog titled “The Factual Feminist,” Caroline Kitchens, a senior research associate at conservative think tank the American Enterprise Institute, undertakes a MythBusters-style takedown of the threat posed by date rape drugs, suggesting that they are far less common than most women think. But it’s not her skepticism of Roofies that’s problematic — it’s the way she proposes women stop blaming these mythical drugs for the consequences of their own drunken decisions.

The video’s opening question — just how frequently drug facilitated sexual assault occurs — is a valid one. And Kitchens cites several studies that find the incidence to be quite low. Given the relative scarcity of sexual assaults that take place after a woman’s drink has been drugged, she says, “the evidence doesn’t match the hype.”

But it’s unclear exactly what hype Kitchens is referring to. The vast majority of messaging by sexual assault support and prevention groups relies on awareness, not hysteria. RAINN, the Rape, Abuse & Incest National Network, offers advice to help women protect themselves from sexual assault. Among the group’s suggestions are to “be aware of your surroundings” and “trust your instincts.” Not exactly the picture of fear-mongering. RAINN also suggests refraining from leaving your drink unattended or accepting drinks from strangers, but these tips constitute common sense more than, in Kitchens’ words, “conspiracy.”

Aside from this exaggerated depiction of widespread panic, Kitchens’ debunking of the rampant Roofies myth is largely harmless. That is, until she begins to search for a reason to explain this imbalance between perception and reality. “Most commonly, victims of drug-facilitated sexual assault are severely intoxicated,” Kitchens says, “often from their own volition.” Date rape drugs, she suggests, are “more convenient to guard against than the effects of alcohol itself.” Women would rather blame a “vague, improbable threat,” she says, than take responsibility for their own actions.

It may be true that date rape drugs are used infrequently, but that does not give carte blanche to shift the blame from perpetrator to victim. No, women shouldn’t be unnecessarily panicked about the threat of date rape drugs. But neither should they be shamed for the size of their bar tabs. Because no matter how short her skirt or how strong her drink, a woman never asks to be raped. It takes a rapist to rape a woman.

TIME Opinion

50 Years Later: Why My Fair Lady Is Better Than You Remember

Audrey Hepburn In 'My Fair Lady'
Audrey Hepburn in a scene from the film 'My Fair Lady' Archive Photos / Getty Images

Think it's a sexist relic? Think again

I know what you’re going to say about Eliza Doolittle and Henry Higgins. A snobby British guy in a Sherlock suit tries to “improve” a working woman by teaching her to talk pretty and look bangin’ in necklaces?! Screw you, Henry Higgins! Lean in to the flower business, Eliza! There’s nothing “loverly” about misogynistic woman-shaping narratives! Put My Fair Lady in a folder with all the other movies that “send bad messages,” like Grease and Gone With the Wind!

Screw Henry Higgins, indeed, but please do not underestimate My Fair Lady, a movie that, on Tuesday, celebrates the 50th anniversary of its premiere. And although it may be easy to dismiss the 1964 movie musical as an outdated rom-com from the shady period before feminism got rolling, it’s much more than just a relic of a sexist time. The movie itself isn’t misogynistic – it’s about misogyny.

First, a little history: The 1964 Audrey Hepburn movie version of My Fair Lady is based on the Broadway musical (starring Julie Andrews) with songs written by Alan Jay Lerner and Frederick Loewe. The musical was based on George Bernard Shaw’s 1912 play, Pygmalion, which was itself based on the part in Ovid’s Metamorphoses when a sculptor named Pygmalion falls in love with his statue of the perfect woman. That part of Metamorphoses was based on every guy who ever thought he could create the girl of his dreams (specifically, Freddie Prinze Jr. in She’s All That, of which Ovid was reportedly a mega-fan).

Studio execs, too, are always trying to cultivate the perfect girl, and that led to a bit of behind-the-scenes drama when it came to casting Eliza Doolittle. Julie Andrews had played Eliza on Broadway, and had already mastered the character and the vocals, and her stage co-star Rex Harrison was going to play Higgins in the movie. But studio head Jack Warner didn’t think Julie Andrews had the name recognition or glamor to carry a major motion picture. “With all her charm and ability, Julie Andrews was just a Broadway name known primarily to those who saw the play,” Jack Warner wrote in his 1965 autobiography My First Hundred Years in Hollywood. “I knew Audrey Hepburn had never made a financial flop.” But Andrews got the last laugh — losing the My Fair Lady role freed her to make Mary Poppins, for which she won a Golden Globe and an Oscar for Best Actress.

Audrey herself was still pretty good, even if she had to have her songs dubbed by another singer. As TIME wrote after the movie came out in 1964:

The burning question mark of this sumptuous adaptation is Audrey Hepburn’s casting as Eliza, the role that Julie Andrews had clearly been born to play….after a slow start, when the practiced proficiency of her cockney dialect suggests that Actress Hepburn is really only slumming, she warms her way into a graceful, glamorous performance, the best of her career.

From Ancient Greece to Edwardian England to 1960s Hollywood, the narrative remains the same: an overbearing male “genius” who transforms a pliable (read: vulnerable) woman from her meager, inadequate self into his personal ideal of womanhood. But thanks to Lerner and Loewe’s songs, My Fair Lady critiques that narrative as much as it upholds it. Their musical is not about a genius attempting to transform a weak woman. It’s about a strong woman attempting to retain her identity in spite of the controlling machinations of a small-minded man.

Take, for example, the undisguised misogyny in nearly all of Henry Higgins’s songs (spoken, with droll irony, by Rex Harrison). This is from a song near the end, fittingly titled “A Hymn to Him,” in which Higgins asks “Why can’t a woman be more like a man?”:

Why is thinking something women never do?
Why is logic never even tried?
Straightening up their hair is all they ever do
Why don’t they straighten up the mess that’s inside?

This comes shortly after he says women’s “heads are full of cotton, hay and rags” and calls men a “marvelous sex.” That’s not the only song where he drones on about how amazing he is compared to women: in “You Did It,” he takes complete credit for everything Eliza does, and in “I’m an Ordinary Man,” he idealizes his woman-free “bachelor” life.

Now, it’s entirely possible that Lerner and Loewe were themselves misogynistic jerks, and these songs were meant as appreciative bro-anthems. Maybe if they were alive today, the music videos would feature naked models on leashes. But more likely, they wrote these songs to humiliate Henry Higgins, to show the audience that he’s a jerk and they know it.

And Eliza Doolittle has plenty of songs that demonstrate she is anything but a statue; after all, the musical is written largely from her perspective. By far the best is “Without You,” which is pretty much the Edwardian-showtune version of Beyoncé’s “Irreplaceable”:

Without your pulling it, the tide comes in
Without your twirling it, the Earth can spin
Without your pushing them, the clouds roll by,
If they can do without you, ducky, so can I.

There’s also “Show Me” (where she tells her loser boyfriend Freddy that actions speak louder than words) and “Just You Wait” (where she fantasizes about leaving Henry Higgins to drown in the ocean while she goes off to meet the King). Lerner and Loewe could easily have made Eliza into a love-sick ingenue, just by writing a few more songs like “I Could Have Danced All Night” (where she’s crushing on Higgins because they danced for a hot second; remember, it’s 1912). But they didn’t.

Of course, the whole Eliza-is-a-strong-woman argument gets compromised by the ending. Because after all her proclamations that she can “stand on her own,” Eliza comes back to Higgins. And when he asks “where the devil are my slippers?” she brings them to him. It’s an ending with the same ashy taste as the ending of Grease, because it seems incongruous: Eliza has no business being with Higgins, and it’s clear she’s independent-minded enough to know it.

Except, it’s 1912. And Eliza has no family connections, no money and no formal education, which means she has nowhere to go but back to the streets (or away with the insipid and financially dubious Freddy). She isn’t brainwashed or stupid — when given the choice between an emotionally abusive man and destitution, she chooses the man. Choosing the man doesn’t make My Fair Lady a sexist movie; it makes it a movie about a sexist time.

Of course, 50 years later, there’s another version of My Fair Lady: Selfie, on ABC, is the newest to take up the Pygmalion mantle, with a male marketing exec “rebranding” a girl who has fouled up her social media presence. Let’s see how they do it without Lerner and Loewe.

Read TIME’s 1964 review of My Fair Lady, here in the archives: Still the Fairest of Them All

TIME Careers & Workplace

There’s No Such Thing as Work-Life Balance

Group of office workers in a boardroom presentation
Chris Ryan—Getty Images/OJO Images RF

A mixture of the two creates value in a way that neither does on its own

This post is in partnership with Fortune, which offers the latest business and finance news. Read the article below originally published at Fortune.com.

As parents settle into the new school year — a time for new schedules, new activities and new demands — the pressure to balance life and work is ever present. But the idea that there is some way to find a perfect ‘balance’ (i.e., to focus equal time and attention on work and home) is, to my mind, impossible. Or to put it more bluntly – the whole concept of work-life balance is bull.

I’m still a parent when I walk into work, and I still lead a company when I come home. So if my daughters’ school calls with a question in the middle of a meeting, I’m going to take the call. And if a viral petition breaks out in the middle of dinner, I’ll probably take that call, too.

And that’s okay — at least for me and my family. I have accepted that work and life are layers on top of each other, with rotating levels of emphasis, and I have benefited from celebrating that overlap rather than trying to force it apart.

I refer to this as the “Work/Life Mashup.” In tech-speak, a “mashup” is a webpage or app that is created by combining data and/or functionality from multiple sources. The term became popular in the early days of “Web 2.0,” when APIs (application programming interfaces) started allowing people to easily layer services on top of each other – like photographs of apartment rental listings on top of Google Maps. There is a similar concept in music, where a mashup is a piece of music that combines two or more tracks into one.

One of the key concepts of a mashup is that the resulting product provides value in a way that neither originally did on its own; each layer adds value to the other.
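For readers who like to see the idea in code, here is a minimal sketch, in TypeScript, of the kind of mashup described above: two data sources joined on a shared key. The listing and coordinate data are invented placeholders, not real rental-listing or mapping APIs.

// A toy mashup: join invented apartment listings with invented map
// coordinates so each listing can be plotted on a map. Neither layer
// is very useful alone; combined, they are.

interface Listing { address: string; rent: number; }
interface GeoPoint { address: string; lat: number; lon: number; }

// Source 1: a stand-in for a rental-listings feed.
const listings: Listing[] = [
  { address: "123 Main St", rent: 2400 },
  { address: "456 Oak Ave", rent: 1850 },
];

// Source 2: a stand-in for a geocoding or mapping service.
const coordinates: GeoPoint[] = [
  { address: "123 Main St", lat: 40.71, lon: -74.0 },
  { address: "456 Oak Ave", lat: 40.73, lon: -73.99 },
];

// The mashup: each listing gains a location, so the combined data can
// drive a map view, which is value neither source provided on its own.
const mappedListings = listings.map((listing) => ({
  ...listing,
  location: coordinates.find((c) => c.address === listing.address),
}));

console.log(mappedListings);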

Now, I’m not suggesting this is a guilt-free approach to life. People – and especially women – who try to do a lot often feel like they do none of it well, and I certainly suffer from that myself. But I have learned over time that how I feel about this is up to me. How much or how little guilt I experience at work or at home is in my control.

I also realize that the concept of a mashup is a lot easier (and perhaps only feasible) for people with jobs that allow some flexibility. With these caveats in mind, here are some things to think about to create a work/life mashup early in your career: add value and don’t ask permission.

For the rest of the story, please go to Fortune.com.

TIME Sports

Why Wayne Gretzky Is Still ‘The Great One’

Simply the Best
The March 18, 1985, cover of TIME TIME

Wayne Gretzky became the all-time NHL career scoring leader on Oct. 15, 1989

Correction appended, Oct. 15, 2014, 1:45 pm

If you grew up in a hockey house like I did, your parents might’ve worshipped Wayne Gretzky as if he were the Messiah on Skates. And in a lot of ways he was: The Great One played a full two decades of NHL-level hockey, starting in 1979 with the Edmonton Oilers and ending with my hometown heroes, the New York Rangers, just before the turn of the century, racking up some 2,857 points in 1,487 regular season games. (NHL scoring gives individual players one point for a goal and one point for an assist, but those numbers don’t mean squat for the score of the game at hand.)

Those 2,857 points made him — and still make him — the League’s leading scorer. Gretzky toppled another hockey legend, Gordie Howe (1,850 points), to first take that title on Oct. 15, 1989, 25 years ago Wednesday.

Gretzky’s points total is impressive, to say the absolute least. But for a kid who grew up loving hockey in Gretzky’s twilight years, it’s really this stat that stuck in my mind: If you take 2,857 points and subtract the points he got for goals, he’s still got more assists than any other NHL player has total points. (The next guy down, point-wise? Gretzky teammate and Rangers legend Mark Messier.)
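To spell out the arithmetic (the goal and assist split comes from NHL career records, not from the stat above): 2,857 career points minus 894 goals leaves 1,963 assists, and 1,963 is more than 1,887, the total career points of Messier, the next-highest scorer of all time.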

As a young hockey fan, that fact instilled a simple lesson: Greatness can sometimes come from being the guy who puts the puck in the back of the net. But even more often, it comes from knowing whom you can count on to help you get that job done even better than you can. “How long Gretzky and [NBA star Larry] Bird play at the top and stay at the fair will help determine their ultimate reputations,” TIME wrote of Gretzky in a March 18, 1985 cover story about athletes at the peaks of their careers.

Gretzky stayed at the top for many seasons after that, but 25 years later his ultimate reputation is this: A life lesson that, while being the hero is nice, you don’t always have to shoot — sometimes it’s smarter to pass.

Read a 1981 story about the then-20-year-old hockey star, here in TIME’s archives: Hockey’s Great Gretzky

Correction: The original version of this story misstated the number of individual points an NHL player gets for a goal. The number is one.

TIME Opinion

Company-Paid Egg Freezing Will Be the Great Equalizer

Egg storage Science Photo Library—Getty Images/Science Photo Library RF

From Facebook to Citigroup, more companies are covering the cost of elective egg freezing for women who want to delay child-bearing. Is this the key to real gender equality?

Updated on October 16 at 11:25 am.

I spent last Thursday on the 15th floor of a fertility clinic with a dozen women. It was a free seminar on egg freezing, and I listened, wide-eyed, as a female physician described how, by the time a woman reaches puberty, her egg count will already be reduced by half. The women in the room had presumably come for the same reason as I had – we were single, in our 30s and 40s, and wanted to know our options – and yet we might as well have been entering a brothel. We didn’t make eye contact. We looked straight ahead. It was as if each of us now knew the other’s big secret: the fertility elephant in the room.

Women talk about sex, their vibrators, their orgasms – but a woman’s fertility, and wanting to preserve it, seems to be the last taboo. There’s something about the mere idea of a healthy single female freezing her eggs that seems to play into every last trope: the desperate woman, on the prowl for a baby daddy. The woman who has failed the one true test of her femininity: her ability to reproduce. The hard-headed careerist who is willing to pay to put off the ticking of her biological clock. That or – god forbid – the woman who ends up single, childless and alone.

But that may be changing, in part thanks to an unlikely patron saint: the Man.

This week, Facebook and Apple acknowledged publicly for the first time that they are paying, or will pay, for elective egg freezing for female employees, a process by which women surgically preserve healthy eggs on ice until they’re ready to become parents, at which point they begin the process of in vitro fertilization. Facebook, which told NBC News it has had the policy in place since the start of the year, will cover up to $20,000 under its “lifetime surrogacy reimbursement” program through Aetna (the procedure typically costs around $10,000, plus annual storage fees). Apple will begin coverage in 2015.

There are other companies that cover the procedure, too: Citigroup and JP Morgan Chase tell TIME that their coverage includes preventative freezing. According to interviews with employees, Microsoft includes some preventative coverage, too. And sources say Google is weighing the coverage option for 2015.

The revelations appeared to unleash more immediate questions than they answered: Were these companies simply putting even more pressure on women to keep working and put their personal lives on the back burner? Was it a narrow effort by prosperous tech companies to recruit, or retain, female talent in an industry whose gender breakdown remains dismal? Or was it a step toward actually legitimizing the procedure, and leveling the playing field for women? Could the move — and the public nature of it — destigmatize the practice for good?

It’s been two years since the American Society for Reproductive Medicine lifted the “experimental” label from egg freezing — a procedure initially created to help patients undergoing chemotherapy — leading to a surge in demand. Yet because the non-experimental technology is so new, researchers say it’s too soon to offer solid efficacy data. (While doctors typically recommend women freeze at least 18 eggs — which often requires two rounds of the procedure — there’s no guarantee that the eggs will lead to successful pregnancy when they are implanted via IVF years later.)

Nonetheless, the very idea that there might be a way for women to build their careers and their personal lives on a timetable of their own choosing — not dictated by their biology — is so intriguing that single women are filling informational sessions at clinics and holding egg freezing “parties” to hear about it. They are flocking to financing services like Eggbanxx, which reports it is fielding more than 60 inquiries a week. And on email lists and at dinner parties, women trade egg freezing tips like recipes: which insurers cover what, the right terminology to use when asking for it, side effects of hormone injections that stimulate egg production and the outpatient procedure one must go through to retrieve the eggs.

Sometimes, they’re talking about careers: the relief of knowing that – with your eggs on ice – there is simply more flexibility around when to make the decision to give birth. But more often, they’re talking about dating: the “huge weight lifted off your shoulders,” as one single 32-year-old friend described it, knowing that you no longer have to assess every potential prospect as a future husband and father.

For women of a certain age, reared on the reliability of birth control, egg freezing could, as the technology improves, be our generation’s Pill — a way to circumvent a biological glass ceiling that, even as we make social and professional progress, does not budge. Women today have autonomy – and choice – over virtually every aspect of their lives: marriage, birth control, income, work. And yet our biology is the one thing we can’t control.

“It’s almost as if evolution hasn’t kept up with feminism,” says a friend, a 34-year-old Facebook employee who underwent the procedure using the new policy this year. “But I think that, like with anything, the culture takes a while to catch up. And sometimes it takes a few big people to come out and say, ‘We’re doing this’ to really change things.”

From a practical standpoint, covering elective egg freezing makes sense. It’s an economic issue that could help companies, especially tech companies, attract women and correct a notorious gender imbalance. “Personally – and confidentially – this made me immediately look at Facebook jobs again,” a 37-year-old marketing executive who worked at both Facebook and Google tells me. “I’m looking to control my career and choices around motherhood on my terms, and a company that would allow me to do so — and provide financial support for those choices — is one I’d willingly return to.”

It’s a social issue, set against a backdrop in which men and women are waiting longer than ever to tie the knot, and there are now more single people in this country than at any other moment in history. (No, you’re not some kind of failure because you haven’t met someone and reproduced by 35. You’re just… well, normal.)

And for businesses, of course, it’s a financial issue too. As the Lancet put it in a medical paper earlier this month, covering egg freezing as a preventative measure could save businesses from having to pay for more expensive infertility treatments down the line – a benefit that is already mandated in 15 states. As Dr. Elizabeth Fino, a fertility specialist at New York University, explains it: with all the money we spend on IVF each year, and multiple cycles of it, why wouldn’t healthcare companies jump on this as a way to save? And while success rates for IVF procedures vary significantly by individual, and are often low, using younger eggs can increase the chances of pregnancy.

“Companies with good insurance packages have been paying for IVF for a long time. Why should egg freezing be any different?” says Ruthie Ackerman, a 37-year-old digital strategist who had her egg freezing procedure covered through her husband’s insurance.

Egg freezing is also, of course, an issue of equality: a potential solution to the so-called myth of opting out. An equalizer of gender – men don’t usually worry about their sperm going bad, or at least not with quite the same intensity or cost – and of class (the procedure has typically been available only to those who could afford it). The way egg freezing has worked so far, many women don’t necessarily return to retrieve their eggs. Still others get pregnant naturally. And so, even though it’s too soon to say how successful the procedure will be down the line — for women who return, thaw, and begin the process of IVF — it’s almost like an insurance policy. An egalitarian “peace of mind.”

“I have insurance policies in every other area of my life: my condo, my car, work insurance,” says another friend, another employee of one of these firms, another woman who doesn’t want to be named, but for whom hopefully this will soon no longer be an issue. She points to a recent survey, published in the journal Fertility and Sterility, which found that a majority of patients who froze their eggs reported feeling “empowered.” “This is my body, and arguably the most important thing that you could ever have in your life,” she continues. “Why wouldn’t I at least protect that asset?”

And if your boss is offering it up to you for free, what do you have to lose?

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor for special projects for Sheryl Sandberg’s women’s nonprofit, Lean In. You can follow her @jess7bennett.


MONEY Opinion

What Congress Should Do to Give Student Loan Borrowers Hope For Relief

Blend Images - Hill Street Studios—Getty Images/Brand X

Student loans are virtually the only consumer debt that can’t be discharged in bankruptcy. Joe Valenti and David Bergeron of the Center for American Progress argue for two law changes to fix this.

Steve Mason’s story could keep any parent up at night.

The Redlands, Calif., pastor co-signed $100,000 in private student loans for his daughter Lisa to attend nursing school. But Lisa died suddenly at age 27.

Now, the loans intended to ensure her financial future are threatening to impoverish her parents and their three young grandchildren, because Mason remains on the hook for the loans. He is struggling to provide for his family while trying to negotiate with lenders to settle his daughter’s debt, which, with interest and penalties, now totals about $200,000.

If he had co-signed a car loan for his daughter, or if his family had racked up credit card debt, or nearly any other kind of debt, the Masons would have had a way out: bankruptcy. Our Founding Fathers, appalled by British debtors’ prisons, created bankruptcy courts to give Americans who are struggling with debt a chance to reduce or even erase those financial burdens and gain a fresh start.

Unfortunately, Congress has carved out an exception to this American promise: student loans.

The student loan exception to bankruptcy laws ignores tragic life situations of students, parents, and grandparents alike. And it should be changed. A common-sense approach to bankruptcy reform would help struggling families like the Masons while promoting a better student loan system for everyone.

How Student Loans Became the Exception to the Rule

Until 1976, all types of loans were treated equally under bankruptcy law. But that year, Congress passed the first exception, declaring that bankruptcy judges could discharge federal student loans only under the direst of circumstances.

In 2005, Congress expanded the exception to include private student loans—those made by banks and credit unions.

Now, bankruptcy judges are only allowed to discharge the student loans of those who have proven “undue hardship,” which generally means never being able to work again.

The death or disability of a borrower discharges federal student loans. But private loans—such as those the Masons took out—don’t have those provisions. So private student loans plague those who are disabled as well as the survivors of those who have passed away, such as the Masons.

Altogether, under current law, it is next to impossible to get rid of any kind of student debt in bankruptcy.

How to Fix the Problem

Here are two simple steps that would help make student loans fairer and more bearable:

1) Allow judges to wipe out the private student loans of any private lender that fails to:

A) Discharge loans in the cases of death and disability, as the federal government does.

B) Charge reasonable interest rates.

C) Allow borrowers repayment flexibility, such as deferment and forbearance options for those in financial difficulties.

2) Allow judges to wipe out any student loans—including federal loans—taken out for colleges that:

A) Have high dropout rates.

B) Have high student loan default rates.

Lenders who charge reasonable rates, allow flexible repayment and wipe out the debts of the disabled and deceased could be considered “qualified” and keep the protection of the current tough bankruptcy rules. Bankruptcy would remain the narrow path of last resort it was designed to be for borrowers. But lenders who don’t meet these standards—basically, those that don’t give borrowers any way out—would be subject to the same bankruptcy laws as other lenders.

Schools, too, would need to earn the bankruptcy exemption for the programs they offer. If students are not likely to complete the programs they’re borrowing for, or generally don’t earn enough to pay back the debt, their federal or private student loans would be dischargeable. There is no sense in penalizing students, parents, and grandparents lured by false promises of success.

Indeed, a study two decades ago by the U.S. General Accounting Office found that low-income borrowers who dropped out of poor-performing schools were the borrowers who most frequently defaulted on their loans—not successful young grads simply trying to walk away from their obligations.

It is economic circumstances, rather than moral failings, that often bring families to bankruptcy as a way to deal with difficult and unforeseen situations. Surely the Masons could not have anticipated their current situation. And it’s probably a situation that no member of Congress anticipated either when they closed the doors of bankruptcy court to virtually all student loan debtors.

These are doors that Congress, and Congress alone, can reopen, giving students, parents, and grandparents who have fallen on hard times equal access to the same courts that the wealthy and corporations have used to make a fresh start. And these doors can be opened strategically, to make sure bankruptcy remains a last resort.

Otherwise, families like the Masons will continue to struggle needlessly.

Joe Valenti is the Director of Asset Building at the Center for American Progress. David Bergeron is the Vice President of Postsecondary Education at the Center for American Progress and former assistant secretary for postsecondary education at the U.S. Department of Education.
