TIME Opinion

ISIS and American Idealism: Is History Going Our Way?

A member loyal to ISIS waves an ISIS flag in Raqqa, Syria on June 29, 2014. Reuters

In the Middle East, it's theory versus reality

Future historians, I suspect, will look with some puzzlement at the United States’ current effort to “degrade and ultimately destroy” ISIS while simultaneously insisting that President Assad of Syria must step down. Foreign intervention in civil wars is nothing new. France and Sweden intervened in the Thirty Years War between Catholics and Protestants in Germany in the early 17th century, France intervened in the American Revolution, and the United States has intervened in civil wars in Korea, Vietnam and the former Yugoslavia. But while in those previous cases the intervening power took one side of the conflict, in this case the United States now opposes both parties. How have we ended up in this position? The answer, I would suggest, goes back at least to the early 1990s, when the collapse of Communism convinced certain intellectuals and the US foreign policy establishment that history was inexorably moving our way.

A new era in world politics began in 1989, with the collapse of Communism in Eastern Europe. In that year a political scientist named Francis Fukuyama, then serving as deputy director of the State Department’s Policy Planning Staff, wrote a sensational article, “The End of History?,” in the conservative journal The National Interest. Communism was about to collapse, and Fukuyama argued tentatively that the world was entering a new era. “What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history,” he wrote, “but the end of history as such: that is, the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.” Within two years Soviet Communism and the Soviet Union itself were dead, and many thought Fukuyama had been proven right. He elaborated his ideas in a scholarly work, The End of History and the Last Man, which appeared in 1992.

Fukuyama had worked with prominent neoconservatives, and neoconservatives in the Bush Administration wrote his fundamental idea into their 2002 National Security Strategy, a blueprint for US domination of the world based upon democratic principles. “The great struggles of the twentieth century between liberty and totalitarianism,” it began, “ended with a decisive victory for the forces of freedom—and a single sustainable model for national success: freedom, democracy, and free enterprise. In the twenty-first century, only nations that share a commitment to protecting basic human rights and guaranteeing political and economic freedom will be able to unleash the potential of their people and assure their future prosperity.” Like the Marxists over whom they believed they had triumphed, the neoconservatives saw history moving in a definite direction, and those on the “right” side believed that they had a right, if not a duty, to push history in the right direction. President Bush repeatedly declared that the Middle East was ready for democracy, and decided to create one by overthrowing Saddam Hussein in Iraq. (Fukuyama, interestingly, declared in 2006 that the Bush Administration and neoconservatism had gone astray.) That did not lead, however, to democracy, but rather to a terrible religious civil war in Iraq, featuring the ethnic cleansing of about four million Iraqis under the noses of 150,000 American troops. The United States finally withdrew from Iraq after seven years of war, and the Shi’ite-led Iraqi government has now lost authority over both the Kurdish and Sunni parts of the country, with ISIS moving into the Sunni areas.

What went wrong? In 1993, Samuel Huntington had put forward an alternative view of the future in a widely read Foreign Affairs essay, “The Clash of Civilizations?,” later expanded into the book The Clash of Civilizations and the Remaking of World Order. To begin with, Huntington—who, ironically, had been a graduate-school professor of Francis Fukuyama’s at Harvard—denied that the western way of life now dominated the globe. How the future would develop, he argued, remained a very open question. Though Huntington painted with a very broad brush, his vision looks more accurate now than Fukuyama’s. The Muslim world is both enormous and diverse, and nothing suggests that Muslims from south Asia through much of Africa are about to embark upon a war with the West. However, most of the major contending factions among the Muslims of the Middle East—the groups that realistically stand to come to power in contested regions like Iraq and Syria—reject, to varying degrees, fundamental principles of western civilization, including religious tolerance and the separation of church and state. Yet various pundits and the leadership of the Obama Administration, including the President himself, remain convinced that the Middle East is destined to follow the western model, and that American intervention in its civil wars can encourage it to do so. The Obama Administration reacted to the Arab Spring on the assumption that the fall of authoritarian regimes was both inevitable and surely beneficial to the peoples involved. At first that seemed to be true in Tunisia, but the Administration has in effect backtracked on that assumption by accepting the military coup in Egypt, and in Libya and Syria the approach has not worked out at all. Just this week, the New York Times reports that the new freedom in Tunisia has allowed ISIS to recruit numerous fighters there.

Speaking to the United Nations on Sept. 24, President Obama insisted that ISIS must not, and cannot, prevail, because of the evil that it has done. He also called upon the Middle East to reject religious war and called for “a new compact among the civilized peoples of this world” to work against violent ideology. These are inspiring words to American ears, but they are finding almost no echo among the competing factions of the Middle East. For a complex variety of political, religious and cultural reasons, ISIS has commanded more dedicated support than any other Sunni faction in Syria or Iraq. Nor is there any evidence that two of their principal opponents—the Assad regime in Syria and the Shi’ite-led government in Baghdad—share the President’s views on democracy and religious toleration. The Obama Administration has been reduced to trying to stand up a “third force” of more friendly, reliable Sunni insurgents in Syria—a strategy the President rejected a year ago after a CIA paper explained to him that it was most unlikely to work.

Nearly 80 years ago, writing in the midst of another great world crisis, an American historian, Charles A. Beard, noted a distressing fact: that history shows no correlation between the justice of a cause and the willingness of men to die for it. This has not changed. We cannot rely upon impersonal forces of history to create a better world. Instead, the current U.S. attempt to impose a vision not supported by any major political group in the region is likely to create more chaos, in which extremism can thrive. We and the peoples of the Middle East both need peace in that region, but that peace must be based upon realities. If we decide ISIS is indeed the most important threat, we shall have to recruit allies from among the actual contending factions, rather than try to build our own from scratch. And, while encouraging cease-fires and the peaceful settlement of ongoing conflicts, we might try to set a better example for the peoples of the world by making democracy work better here at home.

David Kaiser, a historian, has taught at Harvard, Carnegie Mellon, Williams College, and the Naval War College. He is the author of seven books, including, most recently, No End Save Victory: How FDR Led the Nation into War. He lives in Watertown, Mass.

TIME politics

Which Republican Party?

Even if it captures the Congress, rivalries could hamper the GOP in power

The genesis of the modern Republican Party may be found in a phone call placed by Arizona Senator Barry Goldwater in the closing days of a deadlocked 1960 presidential campaign between Richard Nixon and John F. Kennedy. With time running out, Goldwater advised GOP national chairman Thruston Morton that Nixon should skip the urban East and concentrate instead on swing states Texas and Illinois. His own motives were far from disinterested. “I’d like to win this goddamned election without New York,” Goldwater told Morton. “Then we could tell New York to kiss our ass, and we could really start a conservative party.”

Four years later, Goldwater got the part of his wish that mattered most. Meeting in San Francisco’s Cow Palace–the same hall where, just eight years earlier, Republicans had renominated Dwight Eisenhower by acclamation–GOP delegates rejected Ike’s Modern Republicanism (“a dime-store New Deal,” sniffed Goldwater) for a sagebrush libertarian who would block federal aid to education, repeal the graduated income tax and make Social Security voluntary.

The stage was thus set for the most divisive GOP convention since 1912, which opened fissures replicated half a century later, as a fading Eastern establishment battled Sun Belt conservatives for the soul of the party. On its second night, a post-midnight donnybrook pitted Goldwater loyalists against their nemesis, New York Governor Nelson Rockefeller. Rockefeller, a modernist in politics as in art, cited the Ku Klux Klan, the American Communist Party and the right-wing John Birch Society as examples of political extremism. As millions of television viewers looked on, he struggled to make himself heard above the booing and catcalls. “You lousy lover,” one woman shouted at Rockefeller, whose recent divorce and remarriage had come to symbolize for traditionalists a popular culture in which judges made war on religion and governors emulated Hollywood adulterers in flouting the marriage code.

What occurred in San Francisco was the excommunication of moderate and liberal elements, presaging today’s GOP–more unswervingly conservative than even Goldwater envisioned. External events played their part in the transformation. As the 1950s Cold War consensus began to fray, racial divisions accelerated the breakup of the old New Deal coalition. The party of Lincoln morphed into the party of Strom Thurmond. Rockefeller-style pragmatism found dwindling support among Republicans for whom government had become an object of suspicion.

From Birchers to birthers, it’s not hard to find parallels between fantasists who imagined Eisenhower “a dedicated and conscious agent of the communist conspiracy” and their latter-day heirs disputing Barack Obama’s origins and loyalty. Obama is hardly the first American President to experience such abuse. In the 19th century, opposition to Andrew Jackson and his policies gave rise to the Whig Party. Depression-era Americans christened shantytowns of tin and cardboard Hoovervilles in mock tribute to their embattled President. Bill Clinton was accused of crimes far worse than perjury, while George W. Bush came in for sustained ridicule, and worse, from the left.

Obama, however, occupies a unique historical position. No mere presidential polarizer, nearly six years into his tenure he defines the opposition party more than his own. Neocons and Pat Buchanan isolationists; Appalachian miners and emotionally bruised billionaires; Mother Angelica Catholics and Ayn Rand objectivists–disdain for the President is seemingly all that unites a coalition as fractious as the one Ronald Reagan successfully bonded through his optimism and conviction politics. How will the GOP cope with life after Obama? We don’t have to wait until January 2017 to find out.

From the outset, the story line of this year’s election has been predictable, unlike many of the races. Would Republicans recapture the Senate after two attempts foiled by the base’s preference for ideological purity over electability? And what would a wholly GOP Congress do to hamper or harass the Obama White House in the continuing effort to tarnish his legitimacy or downsize his place in the history books? (Whether this campaign advances Republican chances to regain the Oval Office in 2016 is another matter altogether.) Massive electoral losses at the same juncture of their presidencies hardly reduced the legacies of Franklin Roosevelt, Eisenhower or Reagan.

The Republican fixation on Obama is just the latest example of a party out of power settling for tactical advantage over the hard work of intellectual renewal. Assume for the moment that at least 51 Republican Senators take the oath of office in January 2015. Will a GOP Senate prefer the ideological red meat served up by Ted Cruz? The war-weary, civil-libertarian message crafted by Rand Paul? Will it follow Marco Rubio through the shifting sands of immigration reform? Will it play to the base, content to remain a congressional party, secure behind its gerrymandered redoubts?

Other Republicans, less incrementalist in their approach, nurture visions of political realignment as sweeping as the Goldwater takeover of 1964. Until last Aug. 5, Justin Amash was the Congressman from Facebook, an obscure Michigan lawmaker and Tea Party favorite noted for his shrewd use of social media to promote a Ron Paul–ish agenda of unquestioning faith in markets, support for a flat tax and opposition to environmental (and virtually all other) regulation. Yet Amash disdains the national-security state no less than the welfare state. Indeed, he may be the National Security Agency’s worst nightmare. Earlier this year he exploited bipartisan anger over NSA snooping to produce a near majority for legislation to rein in the agency’s collection of phone and Internet data.

No small feat for a two-term Congressman, the son of Palestinian immigrants, who had his philosophical epiphany reading Friedrich Hayek’s Road to Serfdom. Then came Aug. 5, and the kind of instant fame–or notoriety–that a lifetime of constituent service fails to produce. Amash handily defeated an Establishment-backed candidate in that day’s Republican primary, but it was his stunningly graceless victory speech that immediately went viral. To his elders it established Amash as the least civil of civil libertarians; to his fellow millennials, on the other hand, such trash talk is confirmation of his authenticity.

Amash’s refusal to honor election-night protocol was inevitably contrasted with the legendary good humor of his most illustrious predecessor from Grand Rapids, Gerald Ford. Yet Ford’s own entry into politics was as an insurgent, taking on an isolationist Republican Congressman who opposed the Marshall Plan and voted the Chicago Tribune line. Later, reeling from Goldwater’s crushing defeat at the hands of Lyndon Johnson and his Great Society, Ford wouldn’t hesitate to challenge his party’s minority leader or demand a more creative response to the question posed with every succeeding generation: What does it mean to be a Republican?

All politics is not local but generational. It was true when 22-year-old Theodore Roosevelt, fresh out of Harvard, ran for the New York State assembly to the horror of his fellow patricians; when 32-year-old Nelson Rockefeller, scion of the nation’s most prominent Republican family, accepted an appointment from FDR to be his Latin American coordinator; when a charismatic young Phoenix businessman named Barry Goldwater, fed up with local corruption, declared his candidacy for the city council; and when Jerry Ford came home from World War II convinced that the U.S. could no longer treat the Atlantic and Pacific as divinely provided moats. None of these agents of change was their grandfather’s Republican.

Is today’s GOP poised for its own break with the past? It’s happened before.

The author of six books of American history, Smith has directed the Lincoln, Hoover, Eisenhower, Ford and Reagan presidential libraries

TIME Opinion

Think Tank Tells Women How to Avoid Sexual Assault: Stop Getting ‘Severely Intoxicated’

AEI

Video says it’s not what men put in women’s drinks, but how many drinks women have

In a vlog titled “The Factual Feminist,” Caroline Kitchens, a senior research associate at conservative think tank the American Enterprise Institute, undertakes a MythBusters-style takedown of the threat posed by date rape drugs, suggesting that they are far less common than most women think. But it’s not her skepticism of Roofies that’s problematic — it’s the way she proposes women stop blaming these mythical drugs for the consequences of their own drunken decisions.

The video’s opening question — just how frequently drug facilitated sexual assault occurs — is a valid one. And Kitchens cites several studies that find the incidence to be quite low. Given the relative scarcity of sexual assaults that take place after a woman’s drink has been drugged, she says, “the evidence doesn’t match the hype.”

But it’s unclear exactly what hype Kitchens is referring to. The vast majority of messaging by sexual assault support and prevention groups aims at awareness, not hysteria. RAINN, the Rape, Abuse & Incest National Network, offers advice to help women protect themselves from sexual assault. Among the group’s suggestions are to “be aware of your surroundings” and “trust your instincts.” Not exactly the picture of fear-mongering. RAINN also suggests refraining from leaving your drink unattended or accepting drinks from strangers, but these tips constitute common sense more than, in Kitchens’ words, “conspiracy.”

Aside from this exaggerated depiction of widespread panic, Kitchens’ debunking of the rampant Roofies myth is largely harmless. That is, until she begins to search for a reason to explain this imbalance between perception and reality. “Most commonly, victims of drug-facilitated sexual assault are severely intoxicated,” Kitchens says, “often from their own volition.” Blaming date rape drugs, she suggests, is “more convenient to guard against than the effects of alcohol itself.” Women would rather blame a “vague, improbable threat,” she says, than take responsibility for their own actions.

It may be true that date rape drugs are used infrequently, but that does not give carte blanche to shift the blame from perpetrator to victim. No, women shouldn’t be unnecessarily panicked about the threat of date rape drugs. But neither should they be shamed for the size of their bar tabs. Because no matter how short her skirt or how strong her drink, a woman never asks to be raped. It takes a rapist to rape a woman.

TIME Opinion

50 Years Later: Why My Fair Lady Is Better Than You Remember

Audrey Hepburn in a scene from the film 'My Fair Lady' Archive Photos / Getty Images

Think it's a sexist relic? Think again

I know what you’re going to say about Eliza Doolittle and Henry Higgins. A snobby British guy in a Sherlock suit tries to “improve” a working woman by teaching her to talk pretty and look bangin’ in necklaces?! Screw you, Henry Higgins! Lean in to the flower business, Eliza! There’s nothing “loverly” about misogynistic woman-shaping narratives! Put My Fair Lady in a folder with all the other movies that “send bad messages,” like Grease and Gone With the Wind!

Screw Henry Higgins, indeed, but please do not underestimate My Fair Lady, a movie that, on Tuesday, celebrates the 50th anniversary of its premiere. And although it may be easy to dismiss the 1964 movie musical as an outdated rom-com from the shady period before feminism got rolling, it’s much more than just a relic of a sexist time. The movie itself isn’t misogynistic – it’s about misogyny.

First, a little history: The 1964 Audrey Hepburn movie version of My Fair Lady is based on the Broadway musical (starring Julie Andrews) with songs written by Alan Jay Lerner and Frederick Loewe. The musical was based on George Bernard Shaw’s 1912 play, Pygmalion, which was itself based on the part in Ovid’s Metamorphoses when a sculptor named Pygmalion falls in love with his statue of the perfect woman. That part of the Metamorphoses was based on every guy who ever thought he could create the girl of his dreams (specifically, Freddie Prinze Jr. in She’s All That, of which Ovid was reportedly a mega-fan).

Studio execs, too, are always trying to cultivate the perfect girl, and that led to a bit of behind-the-scenes drama when it came to casting Eliza Doolittle. Julie Andrews had played Eliza on Broadway and had already mastered the character and the vocals, and her stage co-star Rex Harrison was going to play Higgins in the movie. But studio head Jack Warner didn’t think Julie Andrews had the name recognition or glamor to carry a major motion picture. “With all her charm and ability, Julie Andrews was just a Broadway name known primarily to those who saw the play,” Jack Warner wrote in his 1965 autobiography My First Hundred Years in Hollywood. “I knew Audrey Hepburn had never made a financial flop.” But Andrews got the last word — losing the My Fair Lady role allowed her to make Mary Poppins, for which she won a Golden Globe and an Oscar for Best Actress.

Audrey herself was still pretty good, even if she had to have her songs dubbed by another singer. As TIME wrote after the movie came out in 1964:

The burning question mark of this sumptuous adaptation is Audrey Hepburn’s casting as Eliza, the role that Julie Andrews had clearly been born to play….after a slow start, when the practiced proficiency of her cockney dialect suggests that Actress Hepburn is really only slumming, she warms her way into a graceful, glamorous performance, the best of her career.

From Ancient Greece to Edwardian England to 1960s Hollywood, the narrative remains the same: an overbearing male “genius” who transforms a pliable (read: vulnerable) woman from her meager, inadequate self into his personal ideal of womanhood. But thanks to Lerner and Loewe’s songs, My Fair Lady critiques that narrative as much as it upholds it. Their musical is not about a genius attempting to transform a weak woman. It’s about a strong woman attempting to retain her identity in spite of the controlling machinations of a small-minded man.

Take, for example, the undisguised misogyny in nearly all of Henry Higgins’s songs (spoken, with droll irony, by Rex Harrison). This is from a song near the end, fittingly titled “A Hymn to Him,” in which Higgins asks “Why can’t a woman be more like a man?”:

Why is thinking something women never do?
Why is logic never even tried?
Straightening up their hair is all they ever do
Why don’t they straighten up the mess that’s inside?

This comes shortly after he says women’s “heads are full of cotton, hay and rags” and calls men a “marvelous sex.” That’s not the only song where he drones on about how amazing he is compared to women: in “You Did It,” he takes complete credit for everything Eliza does, and in “I’m an Ordinary Man,” he idealizes his woman-free “bachelor” life.

Now, it’s entirely possible that Lerner and Loewe were themselves misogynistic jerks, and these songs were meant as appreciative bro-anthems. Maybe if they had been alive today, the music videos would have featured naked models on leashes. But more likely, they wrote these songs to humiliate Henry Higgins, to show the audience that he’s a jerk and they know it.

And Eliza Doolittle has plenty of songs that demonstrate she is anything but a statue; after all, the entire musical is written largely from her perspective. By far the best is “Without You,” which is pretty much the Edwardian-showtune version of Beyoncé’s “Irreplaceable:”

Without your pulling it, the tide comes in
Without your twirling it, the Earth can spin
Without your pushing them, the clouds roll by,
If they can do without you, ducky, so can I.

There’s also “Show Me” (where she tells her loser boyfriend Freddy that actions speak louder than words) and “Just You Wait” (where she fantasizes about leaving Henry Higgins to drown in the ocean while she goes off to meet the King). Lerner and Loewe could easily have made Eliza into a love-sick ingenue, just by writing a few more songs like “I Could Have Danced All Night” (where she’s crushing on Higgins because they danced for a hot second; remember, it’s 1912). But they didn’t.

Of course, the whole Eliza-is-a-strong-woman argument gets compromised by the ending. Because after all her proclamations that she can “stand on her own,” Eliza comes back to Higgins. And when he asks “where the devil are my slippers?” she brings them to him. It’s an ending with the same ashy taste as the ending of Grease, because it seems incongruous: Eliza has no business being with Higgins, and it’s clear she’s independent-minded enough to know it.

Except, it’s 1912. And Eliza has no family connections, no money and no formal education, which means she has nowhere to go but back to the streets (or away with the insipid and financially dubious Freddy). She isn’t brainwashed or stupid — when given the choice between an emotionally abusive man and destitution, she chose the man. Choosing the man doesn’t make My Fair Lady a sexist movie; it makes it a movie about a sexist time.

Of course, 50 years later, there’s another version of My Fair Lady: Selfie, on ABC, is the newest to take up the Pygmalion mantle, in which a male marketing exec “rebrands” a girl who has fouled up her social media presence. Let’s see how they do it without Lerner and Loewe.

Read TIME’s 1964 review of My Fair Lady, here in the archives: Still the Fairest of Them All

TIME Careers & Workplace

There’s No Such Thing as Work-Life Balance

Group of office workers in a boardroom presentation
Chris Ryan—Getty Images/OJO Images RF

A mixture of the two creates value in a way that neither does on its own

This post is in partnership with Fortune, which offers the latest business and finance news. Read the article below originally published at Fortune.com.

As parents settle into the new school year — a time for new schedules, new activities and new demands — the pressure to balance life and work is ever present. But finding a perfect ‘balance’ (i.e., focusing equal time and attention on work and home) is, to my mind, impossible. Or to put it more bluntly – the whole concept of work-life balance is bull.

I’m still a parent when I walk into work, and I still lead a company when I come home. So if my daughters’ school calls with a question in the middle of a meeting, I’m going to take the call. And if a viral petition breaks out in the middle of dinner, I’ll probably take that call, too.

And that’s okay — at least for me and my family. I have accepted that work and life are layers on top of each other, with rotating levels of emphasis, and I have benefited from celebrating that overlap rather than trying to force it apart.

I refer to this as the “Work/Life Mashup.” In tech-speak, a “mashup” is a webpage or app that is created by combining data and/or functionality from multiple sources. The term became popular in the early days of “Web 2.0,” when APIs (application programming interfaces) started allowing people to easily layer services on top of each other – like photographs of apartment rental listings on top of Google Maps. There is a similar concept in music, where a mashup is a piece of music that combines two or more tracks into one.

One of the key concepts of a mashup is that the resulting product provides value in a way that neither originally did on its own; each layer adds value to the other.
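For anyone who wants the tech analogy made concrete, here is a minimal sketch of the idea in Python (with hypothetical listings and walkability scores, not any real rental or mapping API), showing how two independent data sources get layered into a single, richer view:

# A toy "mashup": two separate data sources are combined into one view
# that is more useful than either source alone.

listings = [  # source 1: apartment listings (made-up data)
    {"address": "12 Elm St", "neighborhood": "Riverside", "rent": 1800},
    {"address": "48 Oak Ave", "neighborhood": "Midtown", "rent": 2400},
]

walk_scores = {"Riverside": 71, "Midtown": 93}  # source 2: walkability by neighborhood

def mashup(listings, walk_scores):
    """Layer the second source on top of the first, like pins on a map."""
    combined = []
    for listing in listings:
        enriched = dict(listing)  # copy, so the original source stays untouched
        enriched["walk_score"] = walk_scores.get(listing["neighborhood"])
        combined.append(enriched)
    return combined

for row in mashup(listings, walk_scores):
    print(row)  # each row now carries rent *and* walkability

Each source keeps its identity; the layering is where the extra value comes from, which is exactly the claim being made here about work and life.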

Now, I’m not suggesting this is a guilt-free approach to life. People – and especially women – who try to do a lot often feel like they do none of it well, and I certainly suffer from that myself. But I have learned over time that how I feel about this is up to me. How much or how little guilt I experience at work or at home is in my control.

I also realize that the concept of a mashup is a lot easier (and perhaps only possible) for people with jobs where creating flexibility is possible. With these caveats in mind, here are some things to think about to create a work/life mashup early in your career: add value and don’t ask permission.

For the rest of the story, please go to Fortune.com.

TIME Sports

Why Wayne Gretzky Is Still ‘The Great One’

Simply the Best
The March 18, 1985, cover of TIME TIME

Wayne Gretzky became the all-time NHL career scoring leader on Oct. 15, 1989

Correction appended, Oct. 15, 2014, 1:45 pm

If you grew up in a hockey house like I did, your parents might’ve worshipped Wayne Gretzky as if he were the Messiah on Skates. And in a lot of ways he was: The Great One played a full two decades of NHL-level hockey, starting in 1979 with the Edmonton Oilers and ending with my hometown heroes, the New York Rangers, just before the turn of the century, racking up some 2,857 points in 1,487 regular season games. (NHL scoring gives individual players one point for a goal and one point for an assist, but those numbers don’t mean squat for the game at hand.)

Those 2,857 points made him — and still make him — the League’s leading scorer. Gretzky toppled another hockey legend, Gordie Howe (1,850 points), to first take that title on Oct. 15, 1989, 25 years ago Wednesday.

Gretzky’s points total is impressive to say the absolute least. But as a kid who grew up loving hockey in Gretzky’s twilight years, it’s really this stat that stuck in my mind: If you take 2,857 points and subtract the points he got for goals, he’s still got more assists than any other NHL player has total points. (The next guy down, point-wise? Gretzky teammate and Rangers legend Mark Messier.)

As a young hockey fan, that fact instilled a simple lesson: Greatness can sometimes come from being the guy who puts the puck in the back of the net. But even more often, it comes from knowing whom you can count on to help you get that job done even better than you can. “How long Gretzky and [NBA star Larry] Bird play at the top and stay at the fair will help determine their ultimate reputations,” TIME wrote of Gretzky in a March 18, 1985 cover story about athletes at the peaks of their careers.

Gretzky stayed at the top for many seasons after that, but 25 years later his ultimate reputation is this: A life lesson that, while being the hero is nice, you don’t always have to shoot — sometimes it’s smarter to pass.

Read a 1981 story about the then-20-year-old hockey star, here in TIME’s archives: Hockey’s Great Gretzky

Correction: The original version of this story misstated the number of individual points an NHL player gets for a goal. The number is one.

TIME Opinion

Company-Paid Egg Freezing Will Be the Great Equalizer

Egg storage Science Photo Library—Getty Images/Science Photo Library RF

From Facebook to Citigroup, more companies are covering the cost of elective egg freezing for women who want to delay child-bearing. Is this the key to real gender equality?

Updated on October 16 at 11:25 am.

I spent last Thursday on the 15th floor of a fertility clinic with a dozen women. It was a free seminar on egg freezing, and I listened, wide-eyed, as a female physician described how, by the time a woman reaches puberty, her egg count will already be reduced by half. The women in the room had presumably come for the same reason as I had – we were single, in our 30s and 40s, and wanted to know our options – and yet we might as well have been entering a brothel. We didn’t make eye contact. We looked straight ahead. It was as if each of us now knew the other’s big secret: the fertility elephant in the room.

Women talk about sex, their vibrators, their orgasms – but a woman’s fertility, and wanting to preserve it, seems to be the last taboo. There’s something about the mere idea of a healthy single female freezing her eggs that seems to play into every last trope: the desperate woman, on the prowl for a baby daddy. The woman who has failed the one true test of her femininity: her ability to reproduce. The hard-headed careerist who is willing to pay to put off the ticking of her biological clock. That or – god forbid – the woman who ends up single, childless and alone.

But that may be changing, in part thanks to an unlikely patron saint: the Man.

This week, Facebook and Apple acknowledged publicly for the first time that they pay, or will pay, for elective egg freezing for female employees, a process by which women surgically preserve healthy eggs on ice until they’re ready to become parents, at which point they begin the process of in vitro fertilization. Facebook, which told NBC News it has had the policy in place since the start of the year, will cover up to $20,000 under its “lifetime surrogacy reimbursement” program under Aetna (the procedure typically costs around $10,000, plus annual storage fees). Apple will begin coverage in 2015.

There are other companies that cover the procedure, too: Citigroup and JP Morgan Chase tell TIME that their coverage includes preventative freezing. According to interviews with employees, Microsoft includes some preventative coverage, too. And sources say Google is weighing the coverage option for 2015.

The revelations appeared to unleash more immediate questions than they answered: Were these companies simply putting even more pressure on women to keep working and put their personal lives on the back burner? Was it a narrow effort by prosperous tech companies to recruit, or retain, female talent in an industry whose gender breakdown remains dismal? Or was it a step toward actually legitimizing the procedure, and leveling the playing field for women? Could the move – and the public nature of it – destigmatize the practice for good?

It’s been two years since the American Society of Reproductive Medicine lifted the “experimental” label from egg freezing — a procedure initially created to help patients undergoing chemotherapy — leading to a surge in demand. Yet because the technology’s non-experimental use is so new, researchers say it’s too soon for reliable efficacy data. (While doctors typically recommend women freeze at least 18 eggs — which often requires two rounds of the procedure — there’s no guarantee that the eggs will lead to successful pregnancy when they are implanted via IVF years later.)

Nonetheless, the very idea that there might be a way for women to build their careers and their personal lives on a timetable of their own choice — not dictated by their biology — is so intriguing that single women are filling informational sessions at clinics and holding egg freezing “parties” to hear about it. They are flocking to financing services like Eggbanxx, which reports it is fielding more than 60 inquiries a week. And on email lists and at dinner parties, women trade egg freezing tips like recipe binders: which insurers cover what, the right terminology to use when asking for it, side effects of hormone injections that stimulate egg production and the outpatient procedure one must go through to retrieve the eggs.

Sometimes, they’re talking about careers: the relief of knowing that – with your eggs on ice – there is simply more flexibility around when to make the decision to give birth. But more often, they’re talking about dating: the “huge weight lifted off your shoulders,” as one single 32-year-old friend described it, knowing that you no longer have to assess every potential prospect as a future husband and father.

For women of a certain age, reared with the reliability of birth control, this could, as the technology improves, be our generation’s Pill — a way to circumvent a biological glass ceiling that, even as we make social and professional progress, does not budge. Women today have autonomy – and choice – over virtually every aspect of their lives: marriage, birth control, income, work. And yet our biology is the one thing we can’t control.

“It’s almost as if evolution hasn’t kept up with feminism,” says a friend, a 34-year-old Facebook employee who underwent the procedure using the new policy this year. “But I think that, like with anything, the culture takes a while to catch up. And sometimes it takes a few big people to come out and say, ‘We’re doing this’ to really change things.”

From a practical standpoint, covering elective egg freezing makes sense. It’s an economic issue that could help companies, especially tech companies, attract women and correct a notorious gender imbalance. “Personally – and confidentially – this made me immediately look at Facebook jobs again,” a 37-year-old marketing executive who worked at both Facebook and Google tells me. “I’m looking to control my career and choices around motherhood on my terms, and a company that would allow me to do so — and provide financial support for those choices — is one I’d willingly return to.”

It’s a social issue, set against a backdrop in which men and women are waiting longer than ever to tie the knot, and there are now more single people in this country than at any other moment in history. (No, you’re not some kind of failure because you haven’t met someone and reproduced by 35. You’re just… well, normal.)

And for businesses, of course, it’s a financial issue too. As the Lancet put it in a medical paper earlier this month, covering egg freezing as a preventative measure could save businesses from having to pay for more expensive infertility treatments down the line – infertility coverage that is already mandated in 15 states. As Dr. Elizabeth Fino, a fertility specialist at New York University, explains it: with all the money we spend on IVF each year, and multiple cycles of it, why wouldn’t healthcare companies jump on this as a way to save? And while success rates for IVF procedures vary significantly by individual, and are often low, using younger eggs can increase the chances of pregnancy.

“Companies with good insurance packages have been paying for IVF for a long time. Why should egg freezing be any different?” says Ruthie Ackerman, a 37-year-old digital strategist who had her egg freezing procedure covered through her husband’s insurance.

Egg freezing is also, of course, an issue of equality: a potential solution to the so-called myth of opting out. An equalizer of both gender – men don’t usually worry about their sperm going bad, or at least not with quite the same intensity or cost – and class (the procedure has typically been available only to those who could afford it). The way egg freezing has worked so far, many women don’t necessarily return to retrieve their eggs. Still others get pregnant naturally. And so, even though it’s too soon to say how successful the procedure down the line will be — for women who return, thaw, and begin the process of IVF — it’s almost like an insurance policy. An egalitarian “peace of mind.”

“I have insurance policies in every other area of my life: my condo, my car, work insurance,” says another friend, another employee of one of these firms, another woman who doesn’t want to be named, but for whom hopefully this will soon no longer be an issue. She points to a recent survey, published in the journal Fertility and Sterility, which found that a majority of patients who froze their eggs reported feeling “empowered.” “This is my body, and arguably the most important thing that you could ever have in your life,” she continues. “Why wouldn’t I at least protect that asset?”

And if your boss is offering it up to you for free, what do you have to lose?

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor for special projects for Sheryl Sandberg’s women’s nonprofit, Lean In. You can follow her @jess7bennett.

Read next: Perk Up: Facebook and Apple Now Pay for Women to Freeze Eggs

MONEY Opinion

What Congress Should Do to Give Student Loan Borrowers Hope For Relief

Blend Images - Hill Street Studi—Getty Images/Brand X

Student loans are virtually the only consumer debt that can't be discharged in bankruptcy. Joe Valenti and David Bergeron of the Center for American Progress argue for two law changes to fix this.

Steve Mason’s story could keep any parent up at night.

The Redlands, Calif. pastor co-signed $100,000 in private student loans for his daughter Lisa to attend nursing school. But Lisa died suddenly at age 27.

Now, the loans intended to ensure her financial future are threatening to impoverish her parents and their three young grandchildren because Mason remains on the hook for them. He is struggling to provide for his family while trying to negotiate with lenders to settle his daughter’s debt, which, with interest and penalties, now totals about $200,000.

If he had co-signed a car loan for his daughter, or if his family had racked up credit card debt, or nearly any other kind of debt, the Masons would have had a way out: bankruptcy. Our Founding Fathers, appalled by British debtors’ prisons, created bankruptcy courts to give Americans who are struggling with debt a chance to reduce or even erase those financial burdens, and gain a fresh start.

Unfortunately, Congress has carved out an exception to this American promise: student loans.

The student loan exception to bankruptcy laws ignores tragic life situations of students, parents, and grandparents alike. And it should be changed. A common-sense approach to bankruptcy reform would help struggling families like the Masons while promoting a better student loan system for everyone.

How Student Loans Became the Exception to the Rule

Until 1976, all types of loans were treated equally under bankruptcy law. But that year, Congress passed the first exception, declaring that bankruptcy judges could discharge federal student loans only under the direst of circumstances.

In 2005, Congress expanded the exception to include private student loans—those made by banks and credit unions.

Now, bankruptcy judges are only allowed to discharge the student loans of those who have proven “undue hardship,” which generally means never being able to work again.

The death or disability of a borrower discharges federal student loans. But private loans—such as those the Masons took out—don’t have those provisions. So private student loans plague those who are disabled as well as the survivors of those who have passed away, such as the Masons.

Altogether, under current law, it is next to impossible to get rid of any kind of student debt in bankruptcy.

How to Fix the Problem

Here are two simple steps that would help make student loans fairer and more bearable:

1) Allow judges to wipe out the private student loans of any private lender that fails to:

A) Discharge loans in the cases of death and disability, as the federal government does.

B) Charge reasonable interest rates.

C) Allow borrowers repayment flexibility, such as deferment and forbearance options for those in financial difficulties.

2) Allow judges to wipe out any student loans—including federal loans—taken out for colleges that:

A) Have high dropout rates.

B) Have high student loan default rates.

Lenders who charge reasonable rates, allow flexible repayment and wipe out the debts of the disabled and deceased could be considered “qualified” for the current tough bankruptcy rules. Bankruptcy would remain the narrow path of last resort it was designed to be for borrowers. But lenders who don’t meet these standards—basically, those that don’t give borrowers any way out—would be subject to the same bankruptcy laws as other lenders.

Schools, too, would need to earn the bankruptcy exemption for the programs they offer. If students are not likely to complete the programs they’re borrowing for, or generally don’t earn enough to pay back the debt, their federal or private student loans would be dischargeable. There is no sense in penalizing students, parents, and grandparents lured by false promises of success.

Indeed, a study two decades ago by the U.S. General Accounting Office found that low-income borrowers who dropped out of poor-performing schools were the borrowers who most frequently defaulted on their loans—not successful young grads simply trying to walk away from their obligations.

It is economic circumstances, rather than moral failings, that often bring families to bankruptcy as a way to deal with difficult and unforeseen situations. Surely the Masons could not have anticipated their current situation. And it’s probably a situation that no member of Congress anticipated either when they closed the doors of bankruptcy court to virtually all student loan debtors.

These are doors that Congress, and Congress alone, can reopen, giving students, parents, and grandparents who have fallen on hard times equal access to the same courts that the wealthy and corporations have used to make a fresh start. And these doors can be opened strategically, to make sure bankruptcy remains a last resort.

Otherwise, families like the Masons will continue to struggle needlessly.

Joe Valenti is the Director of Asset Building at the Center for American Progress. David Bergeron is the Vice President of Postsecondary Education at the Center for American Progress and former assistant secretary for postsecondary education at the U.S. Department of Education.

TIME ebola

Ebola Lessons We Need To Learn From Dallas

Amesh Adalja, a member of the Infectious Diseases Society of America, is a quadruple-board certified physician. His personal infectious disease blog site is www.trackingzebra.com.

"The virus, unlike many other pathogens, is unforgiving of lapses"

Of the more than 8,000 individuals who have been infected with Ebola during this outbreak, Nina Pham stands apart: her infection represents a milestone in what has been an unprecedented epidemic. Pham, a heroic critical care nurse who cared for America’s most gravely ill Ebola patient to date, wore the appropriate gown, gloves, mask and eye protection that so many of us—myself included—have mentioned time and again are the surefire ways to keep Ebola at bay. Yet she still became infected, not in some austere setting but in a modern tertiary care hospital in one of our nation’s major metropolitan areas.

To try to unravel the circumstances regarding Ms. Pham’s infection requires something we don’t have a lot of at this time: facts. We’ve heard references to a “breach in protocol” and “inconsistencies,” but as of this writing, we don’t know how she became infected.

(PHOTOS: See How A Photographer Is Covering Ebola’s Deadly Spread)

However, we can set the context for her infection to try to understand the factors that may have played pivotal roles in this unfortunate occurrence. First, think of the patient Thomas Eric Duncan. Mr. Duncan was ill and placed in strict isolation in the ICU. Strict isolation meant that individuals attending to him were donning the appropriate personal protective equipment (PPE). However, Mr. Duncan was not just ill. He was critically ill, requiring multiple interventions to support his myriad failing organ systems, including his respiratory system, cardiovascular system and kidneys. Nothing short of heroic measures was employed in the failed attempt to save his life.

These measures, while possibly prolonging Mr. Duncan’s life, are invasive, so they involve much more exposure to blood and bodily fluid—the sole means of acquiring Ebola. Dialysis requires large-sized intravenous catheters to be placed in major veins, while placing someone on a ventilator requires a plastic tube to be placed (or intubated) through the vocal cords into the trachea. Both procedures can involve bleeding, and intubation involves exposure to respiratory secretions during the procedure as well as during routine care of the patient, which may make safe removal of PPE a more daunting task.

(PHOTOS: Inside the Ebola Crisis: The Images That Moved Them Most)

Also, according to CDC guidance based on decades of experience treating Ebola patients, what is required to care for Ebola patients safely is to practice what is known as contact/droplet isolation—not airborne isolation. With airborne precautions, a special type of mask or a special larger device known as a PAPR (powered air purifying respirator) is worn. The hospital in Dallas used PAPRs, a step that goes above and beyond required PPE. Such enhancement adds another layer of complexity not only to the preparation process, but also to the removal process, providing another possible route of exposure.

Though PPE is essential, it has limitations that stem not only from how it is donned, but also from how it is removed. Studies with other important pathogens such as vancomycin-resistant enterococci (VRE) and acinetobacter show that healthcare workers contaminate their hands when removing simple latex gloves up to 11% of the time. Such percentages likely creep higher as more intricate types of PPE are donned. Self-contamination reportedly lies behind the infection of the Spanish assistant nurse, Teresa Romero, the only other case of Ebola acquired outside Africa, who contracted the virus after touching her face with a contaminated gloved hand. Meticulous removal, or doffing, of PPE is as important as its meticulous donning. This fact is reflected in the almost ironclad methods of Doctors Without Borders (MSF), which include a dedicated person tasked solely with observing the removal of PPE in order to prevent any inadvertent contamination.

What does all this mean? What is the road ahead?

Ebola hasn’t changed. It is following the familiar pattern that we’ve seen in all of its 25 outbreaks in 38 years. It is coursing through the blood and body fluids of its victims, awaiting opportunities to expand its reach into even more victims. We can’t do it any favors. Ensuring that meticulous infection control is taught, practiced and implemented is mandatory in our response to this outbreak. For the virus, unlike many other pathogens, is unforgiving of lapses.

Does this mean that some hospitals may find themselves unable to meet such rigorous standards? Unfortunately, I think the answer may be yes—despite what many others and I hoped and believed. This is why finding the root cause of Ms. Pham’s exposure, coupled with education, is a must. Every emergency department must be prepared to handle the initial management of an Ebola patient, beginning with identification through travel history and isolation. Beyond this initial stage, some hospitals may feel that the resources required for an Ebola patient outstrip their abilities, and they may elect to transfer the patient for more advanced treatment at a biocontainment facility such as those that exist at Emory, Nebraska and the NIH.

But it does mean we should seriously consider designating certain medical centers as our primary response centers for any further cases that are treated in the US. Such is the model employed for many conditions, including trauma, burns and strokes. In fact, such a regionalization model organically arose during the H1N1 influenza pandemic, when smaller hospitals worked in a hub-and-spoke model to transfer their sickest patients to major medical centers—a phenomenon I studied. Such tiering of levels of care is being implemented now in the UK, which has treated one airlifted Ebola case successfully.

The lesson I draw from the events in Dallas is that in the fluid situation that characterizes this outbreak, it is necessary to continually integrate new information from the frontlines into response plans, public messaging and clinical care. If we do that, and have a little luck, we will eventually pull ahead of this virus on the long road ahead. We need to manage this outbreak with active minds, because—to borrow the eloquent words of Louis Pasteur, one of the grandest members of the pantheon of infectious disease—“chance favors the prepared mind.”


TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Race

The Real Problem When It Comes to Diversity and Asian-Americans

Asian-American Whiz Kids | Aug. 31, 1987 Ted Thai

The lack of Asian leadership in tech sheds light on a larger issue: Asians are excluded from the idea of diversity

Years ago… they used to think you were Fu Manchu or Charlie Chan. Then they thought you must own a laundry or restaurant. Now they think all we know how to do is sit in front of a computer.

It was 1987 when Virginia Kee, then a 55-year-old high school teacher in New York’s Chinatown, said the above words. She was one of several Asian-Americans who discussed the perception of their race for TIME’s cover story, “Those Asian-American Whiz Kids.” The cover story would elicit small-scale Asian boycotts of the magazine from those who found offensive the portrait of textbook-clutching, big-glasses brainiacs. To them, the images codified hurtful beliefs that Asians and Asian-Americans were one-dimensional: that they were robots of success, worshippers of the alphabet’s first letter, study mules branded with their signature eyes.

Today, Kee is 82. It has been nearly 70 years since the days when she avoided the public restroom “because it was white or colored”; nearly 50 years since she co-founded the Chinese-American Planning Council, then an unlikely social service for Asian-Americans, who were perceived to be sufficiently independent not to need it. And yet Kee, who still recalls the words she told TIME nearly 30 years ago, maintains that not much has changed.

“If you try to navigate the human part of it, we are seeing, as yellow people, our stereotypes still existing in the heads of many people. We don’t get the chance to really go through and break the glass ceiling,” Kee says. “We are putting limitations on our people.”

The longevity of the idea that “all [Asian Americans] know how to do is sit in front of a computer” was highlighted recently when several top technology firms released their first-ever diversity reports. Those reports and media discussion of their findings centered on the obvious, important problem: an under-representation of women, blacks, Hispanics and Native Americans. Very little was said of the discrepancy between the high percentage of Asian tech employees and the disproportionately low percentage of Asian leaders. The fact that Asians’ presence charted in bars more than a few pixels tall, it seemed, disqualified them from scrutiny.


“There is an important conversation to be had in terms of who actually has full access to education and economic opportunities,” says Mary Lui, a professor of American and Asian-American Studies at Yale University. “But at the same time, think about what [not talking about Asian representation] might be saying in terms of Asian-Americans in the U.S.”

What it says is this: Asians and Asian-Americans are smart and successful, so hiring or promoting them does not count as encouraging diversity. It says: there is no such thing as underrepresentation of Asians and Asian-Americans. The problem with this belief, historians and advocates assert, is that it not only obscures the sheer range of experiences within Asian and Asian-American populations, but also excludes them from conversations about diversity and inclusion in leadership and non-tech sectors.

*

Not that this exclusion is a new phenomenon. Historians agree that conversations about diversity have turned a blind eye to Asians and Asian-Americans ever since the 1965 Immigration Act. After the conclusion of World War II, many ex-colonial Asian countries like the Philippines, South Korea and India emphasized technical education to modernize and industrialize their new national economies. The Immigration Act permitted the migration of those highly educated Asians as a means of recruiting science, technology or engineering experts to the U.S. during the Cold War era.

For over half a century, the growth of the Asian-American population in the U.S. had been stunted, first by racially-motivated exclusionary laws that banned Asian immigration and later by annual quotas. But within years of the 1965 act, that population boomed. By the 1970s and 1980s, the image of Asian-Americans was no longer of the alien invaders washing ashore in California during the Gold Rush, the faceless bachelors laying the cold steel of the Transcontinental Railroad, or the land-grabbing and job-stealing migrants. The new Asian-Americans were scientists, doctors, programmers and engineers. They were thriving.

By the mid- to late 1980s, the notion of Asian-Americans as universally successful was everywhere. Major news organizations lauded them as the “model minority” — a term coined in 1966, when first the New York Times and then U.S. News and World Report published stories suggesting that Asian-Americans, through their steely work ethic and quiet perseverance, were uniformly triumphant despite prejudice. The idea drew criticism, particularly from Asian-American groups whose problems were made invisible behind the guise of universal success: the displaced Laotian and Cambodian refugees of the Vietnam War, or the elderly Filipinos fighting to save their low-cost I-Hotel housing complex from urban renewal.

TIME was not immune to the model minority craze. In 1985, two years before its controversial cover story, TIME published an article called “To America With Skills: A Wave of Arrivals From the Far East Enriches the Country’s Talent Pool.” The piece documented the flood of Asian-Americans into high-paying careers and elite universities with decidedly less focus on marginalized groups like poor Chinese launderers, unassisted Vietnamese refugees or underpaid South Asian cab drivers:

What really distinguishes the Asians is that, of all the new immigrants, they are compiling an astonishing record of achievement. Asians are represented far beyond their population share at virtually every top-ranking university: their contingent in Harvard’s freshman class has risen from 3.6% to 10.9% since 1976 … Partly as a result of their academic accomplishments, Asians are climbing the economic ladder with remarkable speed.

“I consider it a two-headed hydra: the stereotype of being the evil invader, or the model minority,” says Helen Zia, an Asian-American activist, journalist and historian. “The conclusion of both is the same. Asian-Americans are too foreign — from the outside, being an invader, or on the inside, being so bland and so good.”

Asian-Americans’ visible success, with numbers to prove it, began to mean they should be excluded from inclusionary practices like affirmative action. More severely, Asian-Americans were seen as a hindrance to diversity. In one case, high school senior Yat-Pang Au and his Hong Kong-born parents filed a formal complaint with the U.S. Department of Justice alleging that the University of California admissions system discriminated against Asian-Americans. Au’s case was profiled by several news organizations, including in TIME’s 1987 “Whiz Kids” cover story:

A straight-A student, Yat-Pang, 18, lettered in cross-country, was elected a justice on the school supreme court and last June graduated first in his class at San Jose’s Gunderson High School. Berkeley turned him down. Watson M. Laetsch, Berkeley’s vice chancellor for undergraduate affairs, insists that Yat-Pang was rejected only for a “highly competitive” engineering program.

Au is now 45. He still recalls his parents’ insistence that he “fight for his rights,” a struggle that concluded with an apology from the chancellor. After two years at De Anza College, a community college in the San Francisco Bay Area, he transferred to UC Berkeley in 1989 for his junior year. Today, he is the CEO and co-founder of Veritas Investments. And though Au managed to find success despite obstacles — the classic model minority narrative — he says it was only by choosing entrepreneurship as a career that he was able to rise to leadership despite systems that assume Asians’ success is a byproduct of their race.

“I was, to be honest, embarrassed that I didn’t get in, embarrassed thinking and expecting that we lived in a relatively color blind society,” Au said.

Today, it appears that Asians and Asian-Americans still pose a threat to diversity. Only now they believe the idea, too. In 2012, a popular New York Times op-ed titled “Asians: Too Smart for Their Own Good?” described Asian-American college students feeling like “a faceless bunch of geeks and virtuosos.” The previous year, an Associated Press article reported that many Asian-Americans were no longer checking the “Asian” box on college applications, in order to circumvent unspoken quotas at top colleges. The perception of them as a threat to diversity is so convincing that Asians and Asian-Americans have begun to offer what is, at its core, an inadvertent apology.

*

As the world’s response to the tech diversity reports shows, Asians and Asian-Americans remain invisible in conversations about underrepresentation: even though companies tend to have disproportionately few Asian leaders relative to the number of Asians in technical jobs, the discrepancy is overlooked. That silence is only one part of a larger issue that experts insist has deep historical roots. It is not simply a first-world complaint or an upper-middle-class problem. It is one with sobering consequences.

“Being the model minority, there’s the expectation that you’re going to do so well you shouldn’t have any problems,” Zia says.

The belief in a blanket Asian-American culture is so thick that it has resulted in confusion when Asian-Americans deviate from the model minority myth. Today, diversity is more visible than ever: There is the commanding John Cho, and there is the awkward William Hung; the funny Mindy Kaling and the serious Indra Nooyi; the talkative local launderer and the mum evil villain; the whitewashed American-born Chinese and the perpetual foreigner. And yet those who display that diversity are often perceived as exceptions. The rule is the single framework — the model minority myth — that persists as the dominant stereotype for the whole race, especially in the tech sector.

“If [executives] assume their Asian-American tech employees are the model minority,” Zia continues, “the baggage that that also brings is that they are good, high-tech coolies who will do their jobs, work like hell, stay up 24/7 grinding out code — and that [executives] can never think of promoting them into management or leadership positions.”

Yet the movement to push Asians and Asian-Americans into conversations about diversity and inclusion has fizzled in recent years. Asian-American activism, historians believe, was at its peak following the national outcry after two white men escaped jail time for the 1982 racially charged murder of Chinese-American Vincent Chin. Nascent groups like American Citizens for Justice and the Coalition Against Anti-Asian Violence demanded equal treatment of Asian-Americans both under the law and in society. The fight for Asian-American equality may be less fierce today, but it is still there.

“I wanted to bring to the conversation that Asians, although they were starting to enter the ranks of these companies, were not moving to the top of these organizations. I think it’s still the case that organizations are still not focused on the issue,” says Korean-American leadership consultant Jane Hyun, whose book Breaking the Bamboo Ceiling: Career Strategies for Asians discusses caps on Asian-American seniority in corporate settings.

The onus, Hyun says, is not only on society and business, but also on Asian-Americans themselves. They must try to untangle how cultural, historical and social factors inhibit their progress, in leadership or in other areas where Asian-American diversity is needed, like film, TV and politics. J.D. Hokoyama, former president of the national nonprofit Leadership Education for Asian Pacifics (LEAP), adds that “[The problem] is not just from the top. Our own communities are also settling.”

The irony is that it is the pride of many Asian and Asian-American cultures not to settle for anything less than they deserve. Unless, that is, they or everyone else believe they’ve already gotten what they deserve, and more: academic success, financial stability, happiness. It is hard to imagine that some have not gotten what they deserve, especially in an age when diversity in Asians and Asian-Americans is seen as the difference between straight-laced, straight-A geniuses and lazy, A- slackers. There are still those facing deeper problems that are dismissed or overlooked. And what it takes to start unraveling these issues is simply to understand that some things are too good to be true.

“We have the good, the bad and the ugly. We’re not models,” Zia said. “We should be seen in our full humanity. That is, in my experience, what everyone really aspires to.”
