TIME society

When Homework Is a Matter of Life and Death

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

My parents fled Iran because they were forbidden from getting an education there. I've spent over one-third of my life on a university campus

The first hint of sunlight glows off the horizon as I rush toward Stanford Hospital from the parking garage, white coat in hand, stethoscope bouncing against my chest. Every few steps, the diaphragm of my stethoscope ricochets off the silver pendant my mother gave me—a nine-pointed star etched with a symbol of my Bahá’í faith. My mother escaped Iran at age 17 as the country was on the cusp of revolution—a revolution that would create a society where, to this day, Bahá’ís like myself are barred from obtaining a university education. But here, in the United States, I’ve spent more than a third of my life on a university campus.

The Bahá’í faith was founded in 19th-century Persia, and is now the largest non-Islamic minority religion in Iran. Persecution of our religion has helped it expand around the world—my own family’s escape to the United States in 1979 guaranteed that I would be born to the freedom and opportunities denied to Bahá’ís back home.

Back in Iran, the state bans Bahá’ís from studying at universities as just one of many different forms of persecution, which has included desecration of cemeteries, confiscation of property, and wrongful imprisonment. However, because education is such a fundamental principle of our faith, Bahá’í students there have to learn in secret—usually through the Bahá’í Institute of Higher Education (BIHE), whose volunteers quietly teach classes in homes or via online portals. The threat of arrest is constant; the government recently imprisoned both BIHE students and professors, some at the notorious Evin Prison, which has held many prisoners of conscience. I, on the other hand, had the freedom to receive a bachelor’s degree in bioengineering from Rice University and am now in an M.D.-Ph.D. program at Stanford University, filling my brain with pathophysiology and methods of statistical analysis, which I hope to use to serve the community.

Sometimes I find the sheer volume of learning to be overwhelming, but then I take a deep breath and remind myself how fortunate I am to be able to acquire knowledge freely. Inside the hospital, it’s all bustle. I’m greeted by beeping pagers, an antiquity forgotten by the outside world, as I make my way to my morning clinic. As soon as I arrive, I glue myself to the computer and begin mentally dissecting patient charts. My first appointment of the day is a lovely woman with Type 2 diabetes who is just beginning to get her blood sugar under control. Between patients, I pore over the medical literature, making sure I understand each patient’s problems.

In my afternoon clinic, one of the residents excitedly approaches me. “You speak Farsi, right?” I nod. “I have a patient who would be really happy to meet you.” She gives me the room number, and I walk gingerly toward the room, already feeling self-conscious about my accent. I walk in and greet Mrs. H. in Farsi; her face instantly glows with a smile. I ask about the course of her cancer, how she’s feeling, and if she has any questions. She tells me she’s doing well and that the therapy has put her in remission. Then, she asks me where my parents live (Dallas), whether I’m married (I have been for three years, to a fellow Bahá’í I met at Stanford), and if I cook Persian food (I wish). At the end, she tells me how proud she is to see a young Iranian woman becoming a physician.

That evening, as I enter my house, I’m surprised to hear voices coming from my living room. But then I remember that my husband, a volunteer BIHE professor of engineering, was scheduled to give a lecture. I peek into the living room, where he is lecturing into his laptop on how circuits work. The information is over my head, but the students halfway around the world are excitedly asking questions. They are huddled on a beautiful scarlet-colored Persian carpet and are dressed like typical American college students—jeans and comfortable sweaters.

I quietly walk in, take off my stethoscope, and sit on the couch across from my husband. I close my eyes and touch the pendant around my neck, trying to imagine, just for a moment, what it would feel like to be on the other side. When I open my eyes, I feel an overwhelming mix of emotions. I’m incensed that rulers anywhere would deprive individuals eager to learn of a chance to contribute to society, and deprive a society of their contribution. And yet I can’t help but feel hope for a generation of Iranian Bahá’ís who are so motivated that not even the threat of arrest can extinguish their passion for knowledge.

And, with a feeling of gratitude, I crack open my 500-page textbook on internal medicine and pore over medications for treating Type 2 diabetes.

Roxana Daneshjou is an M.D.-Ph.D. candidate at the Stanford School of Medicine and a recipient of a Paul and Daisy Soros Fellowship for New Americans. She wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zocalo Public Square.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME society

We’re in a Golden Age of Loyalty

Getty Images

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Today, there is no compelling anti-American ideology or movement with broad appeal seducing our citizens into dividing their loyalties

When I was hired last year by Arizona State University, I faced the customary blizzard of new employee paperwork. You know the drill – forms that ask you to select your health insurance plan, seek the details on where to deposit your paycheck, and invite you to “solemnly” swear to support the Constitution of the United States and of the state, bear them “true faith and allegiance” and defend them against enemies, foreign and domestic.

OK, so the loyalty oath was a new one for me.

I signed it with gusto, even though – just between us – I am not sure what is in the Arizona constitution. But I don’t have any reservations about swearing an oath of loyalty; it’s an honor to work for a public institution. And lest you think this is a red-state quirk, public employees in neighboring California, and many other states, must take similar loyalty oaths.

But the exercise does seem, happily, anachronistic and unnecessary. We’re living in what has to be the nation’s golden age of loyalty.

Today, there is no truly compelling cross-border anti-American ideology or movement with broad appeal seducing our residents or citizens into dividing their loyalties. Despite the disturbing tales of a few troubled Americans picking up and joining Al Qaeda, ISIS, or other terrorist groups, we’re currently in a bear market for global ideologies that transcend nationalism.

A time of such undivided loyalty is a rare luxury in American history. Our nation’s birth, after all, was a searing act of disloyalty against the former sovereign.

And those who did fight for independence had radically different ideas of what their new nation was to stand for, a confusion that would take the Civil War to resolve.

And for all America’s success as a melting pot, the strains of massive immigration and religious diversity once challenged national unity in a way they no longer do. Anti-Catholic prejudice in the mid-19th century, for instance, contributed to mass defections among Irish immigrants during the Mexican-American War, when the notorious St. Patrick’s Battalion switched sides and joined their fellow Catholics in the Mexican Army. During World War I, the political power of Irish and German immigrants arguably kept the country in the neutral column far longer than would have otherwise been the case.

Once the country went to war, concerns about the loyalty of German-Americans proved unwarranted. Indeed, official reaction to perceived disloyalty has usually been far more damaging than any real disloyalty. The Palmer raids toward the end of World War I and thereafter, triggered by fear of anarchists and the new Bolshevik menace; the internment of loyal Japanese-Americans during World War II; and the McCarthyite witch hunts of the early Cold War years all amounted to cases of self-destructive paranoia.

Communism was surely the most powerful cross-border temptress undermining national allegiances in modern times. Educated elites in democratic Western societies were disproportionately drawn to the internationalist communist cause. Last year’s nonfiction thriller, A Spy Among Friends by Ben Macintyre, depicting the treachery of Kim Philby, the urbane English spy who ultimately fled to Moscow, captured the degree to which Communism seduced Philby and his generation of Cambridge-educated elite (and some of their American counterparts). Disloyalty then was sufficiently in vogue to merit this cavalier observation from the famous novelist Graham Greene in a foreword to the memoirs Philby wrote in Moscow: “‘He betrayed his country’ – yes, perhaps he did, but who among us has not committed treason to someone more important than a country.”

There are no such temptations for ideological adultery today, which is another reason we are nostalgic for the Cold War. That showdown between rational superpowers stands in stark contrast to today’s frustrating wars against failed states and amorphous terrorist groups. But we also miss the less tangible contest of ideas and ideologies tailored to Western, modern audiences, and the ensuing double-crossing and conflicted allegiances it provoked. This nostalgia is why TV shows like The Americans are culturally significant.

For now, the whole notion of betrayal as a threat to the nation is so devalued that it was humorous fodder at the Oscars, as host Neil Patrick Harris joked that “for some treason” Edward Snowden couldn’t be in the audience to celebrate the documentary about him. Subsequently, this spy who fled to Moscow chimed in that he found Harris’ joke funny.

We’ve come a long way from the days when divided loyalties were no laughing matter.

Andrés Martinez is editorial director of Zocalo Public Square, for which he writes the Trade Winds column. He is also a professor of journalism at Arizona State University.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME health

I Was on the Front Line of L.A.’s Last Measles Outbreak

Getty Images

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Why doesn't anyone remember the 12,000 people who got sick, or the 600,000 vaccine doses we gave out 25 years ago?

When it comes to diseases, we never seem to worry until they capture the attention of the media or affect us personally. Today, as new cases of measles turn up in California, I feel a sense of dread — and déjà vu.

Between January 1988 and December 1990, Southern California saw 12,434 cases of measles. Los Angeles County was the first to report its problem in what became a statewide epidemic in which 75 people died after developing complications such as pneumonia and encephalitis. In two of those three years, the federal Centers for Disease Control identified Los Angeles as the site of the largest outbreaks in the nation. How could we have forgotten so soon how terrible measles can be?

At the time, I worked as an epidemiology analyst in the Los Angeles County Health Department’s Acute Communicable Disease Control (ACDC) unit. Back then, the unit was a modest group of about 20 people — among them physicians, nurses, epidemiologists, and support staff — that kept track of diseases such as influenza, meningitis, and the vaccine-preventable diseases, all of which by law must be reported to health departments. We also investigated outbreaks, and kept abreast of trends in world health.

When I started at ACDC in 1987, measles was low on the radar. It hadn’t been a scourge since the early 1960s, when an estimated 90 percent of Americans had been infected by age 15. A national campaign to vaccinate children followed. By the mid-1980s, there were fewer than two cases per 100,000 persons.

How did our outbreaks begin? In retrospect, 1987 may have been a harbinger of the epidemic; the number of cases of measles reported in the county that year jumped from 40 to 126. Then, by June 1988, the virus found a particularly vulnerable population — young children in low-income communities, who had very low vaccination rates. It’s not clear exactly why these kids didn’t keep up with their shots and why the pattern changed from previous years when high school and college students made up most of the cases. But whenever a highly contagious disease like measles finds a niche in an unvaccinated population, there is going to be an outbreak.

By early 1989, an epidemic was truly underway. In early 1990, I was brought in from the field office where I worked to help the county’s Immunization Program — which had a staff even smaller than the ACDC’s — handle the onslaught of cases. Reports of measles cases quickly stacked up on all our desks. As soon as we put down the phone, we got another call. I remember one of my colleagues, Dr. Lorraine Chun, who was out in the field investigating deaths among unvaccinated children and speaking with families, remarking that she had never seen anything so sad.

As the epidemic progressed, we recognized that the established vaccination protocol had slipped, and we had lost the progress made in previous decades. Trying to reverse the epidemic would take more than just vaccines, but we had none of the data and tools that epidemiologists use today to expedite their work (like the Internet, GPS, e-mail, and cell phones). We waged the battle with pagers, fax machines, and stand-alone personal computers that we shared.

Soon, there were so many cases that investigating each one and following up on the possible contacts of the infected became impossible. So we had to take stronger action in the community. All schools and child care centers were required to identify unimmunized kids and tell parents that those children had to stay home. Special clinics were set up near schools to vaccinate kids at risk. At colleges that had measles outbreaks, more than 11,500 students and staff were immunized — in both voluntary and mandatory programs. Medical staff at jails, prisons, and juvenile halls throughout California immunized at least 46,000 people. L.A. County alone distributed over 600,000 doses of vaccine.

Educational campaigns in Spanish, Hmong, Samoan, and other languages urging parents to vaccinate their children were distributed through the media, churches, English-as-a-second-language classes, and mailings. And in an important change, authorities decided — based on the number of toddlers who became infected — to lower the age for receiving the first dose of measles vaccine from 15 to 12 months of age. Since the vaccine is slightly less effective when given at that younger age, public health officials also began recommending that children receive a second dose between ages 4 and 6. This schedule is maintained to this day.

Even with the intense effort and over $30.9 million spent statewide, the epidemic lasted for three years. It wasn’t until 1995 that the L.A. Times declared victory, noting that “rates of measles, mumps and other diseases that can be prevented by vaccines dropped to a ‘historic low’ in Los Angeles County.” By 1997, the incidence rate in the U.S. as a whole was less than 1 per 100,000. In 2000, measles was declared eliminated in the United States — making us the envy of a world in which measles remained the eighth-leading cause of death. Though I no longer worked at ACDC by that time, I hoped that the terrible toll exacted by measles during the 1988 to 1990 epidemic had sufficiently spurred us into permanent vigilance.

Why then, less than 20 years later, do 113 countries have higher rates of immunization than we do? Why are we currently facing measles again?

Perhaps we are victims of our own success. The outbreaks that produce mass vaccination campaigns seem to fade from memory after a few decades, and the virus surges back again.

I must confess that the late ‘80s outbreak of measles has even faded from my memory. Even though I spent every work day for nearly two years engaged in addressing an epidemic, I have to wrack my brain to recall details.

We need to find ways to remember better. It should not take having a family member die from measles-induced encephalitis to brand in our minds the effect this virus can have. Public health agencies continue to devote money and time to delivering messages on vaccination, but they are useless unless everyone takes them to heart. Back in the unit, we used to talk about the difficulty of reaching that last group of people reluctant to vaccinate. Our saying: “90 percent of the budget goes to reaching the last one percent.”

Instead of using the tools that all the science-based, peer-reviewed research says will prevent epidemics of disease and save lives, a segment of our population chooses to play a dangerous game by not vaccinating. They fear the odds of an adverse event from the vaccine are greater than those of a severe outcome from acquiring the disease itself.

I have been telling everyone I can about the danger of stepping backward into a hole we have previously fallen into. The question is, how can we get people to listen?

Kenn K. Fujioka is the district manager of the San Gabriel Valley Vector Control District. He is a graduate of UCLA. He wrote this for Thinking L.A., a partnership of UCLA and Zocalo Public Square.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME world affairs

Why I Miss Yemen

Getty Images

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Though the Yemeni government may fail to reconstruct itself, the ties that bind people to one another can step in for the greater cause

I miss Yemen.

That may come as a surprise since whenever the country makes headlines — as it has over the past few weeks — the overwhelming themes are war, violent radicalism, the impending doom of failed statehood, and whatever other ominous-sounding crisis (water shortages, national drug addiction) can be thrown into the mix.

I find that most Americans assume that the country is seething with anti-American sentiment. Yet, that is far from the truth, and I miss Yemen, my home from 2009 to early 2012. I’m not alone. Most foreigners who have been fortunate enough to experience the warmth, humor and kindness of Yemeni people miss it too.

I miss waking up in the old city of Sanaa, Yemen’s 3,000-year-old capital. I would slowly make my way across uneven stone floors that cooled the soles of my feet and into my mafraj, a square room with blue-patterned low cushions lining its perimeter. I would take a moment to stare out into the narrow alleyway below through a green, blue, and red stained-glass window, the kind that decorates nearly every building in Sanaa.

I lived on the top floor of a skinny, four-story, brown brick abode with white gypsum outlining its edges. Many have likened these structures in the old city to gingerbread houses. Out the window, I saw men walking to work, elbows linked, dressed in long white robes that hung to their ankles, suit jackets, and curved daggers secured right at their waistlines. There were also the elderly women draped in intricately patterned red and blue blankets over their black abayas, carrying puffy loaves of bread in clear plastic bags. They’d chat so quickly in clipped, sharp Arabic that I could never understand them—even though I’m comfortable in the language. My ears would then catch the sound of the gas merchant who strolled the neighborhood banging a wrench on a large cooking gas canister. The harsh dinging warmed me in the same way the sounds of Manhattan must warm someone who’s happy to call that city home.

At about 8 a.m., I would make my way down the irregular steps of the house and past the doors of apartments where other foreigners lived, and then I’d pull a small metal lever that opened the heavy wooden slab on the ground floor to the outside world. The sun would be strong and the air bone dry at 7,500 feet. I would walk the 10 steps or so to a hole-in-the-wall canteen, a Yemeni bodega known here as a bagala, and buy a tub of plain yogurt for about 50 American cents that I would mix with Yemeni honey (some of the best in the world!) for breakfast. This was in lieu of the typical Yemeni breakfast of lamb kabob sandwiches or stewed fava beans. The two young guys at the bagala would light up at our daily meetings.

“Good morning, Laura!” they’d say.

“Good morning! How’s it going?”

“Praise be to God! Did you watch the president’s speech?” Mohamed, the older, would ask, or otherwise comment on the political happenings du jour, which were many since part of my time living in Sanaa covered the Arab Spring protests of 2011.

“I did. What do you think?” I would ask.

“Everything will be fine, God willing. We want stability for Yemen,” he’d answer. Then another friend whose face I recognized from the neighborhood would rush up, give me a nod, and shove approximately 10 cents at Mohamed so he could bring back piles of pita bread for his family.

I would head back home, comforted to know that if anything ill ever befell me, these friends would have my back, as happened when they cornered a cab driver who was requesting $200 to give me back the phone that I had left in his taxi (I got it back free, thanks to my neighbors). You give Yemenis a smile, and they give you so much more in return, always bending over backwards for guests of their country. It was an unfair transaction that benefited me most of all.

I miss walking through the narrow cobblestone streets of the old city and seeing faces I recognized. We waved hello along the way, and perhaps shared a sentence or two about the day. My mood always brightened when I passed the old men sipping creamy tea outside one tiny cafe, who wore thick glasses that magnified their eyes and turbans round their heads, and who held canes in their hands. They laughed and told jokes to pass their days. They’d seen it all—including war worse than the current one. They knew the ebbs and flows of time.

Despite that one greedy cabbie who tried to keep my phone, among the things I miss most are the discussions with taxi drivers while we sat stalled in traffic during the post-lunch market rush. Yemenis love to talk—and so do I. They often gave me a handful of soft green qat leaves, the mild narcotic widely consumed in the country. I remember when one driver explained that Yemen’s President Ali Abdullah Saleh was like Marie Antoinette. “Let them eat cake!” the driver exclaimed.

A different cab driver once told me he had worked at the Yemeni embassy in Cuba as a driver and missed the rum like you wouldn’t believe. Alcohol is available in Yemen, at Chinese restaurants that double as brothels, or from Ethiopian smugglers who get their bottles on boats from Djibouti. Of course, getting it involves risks—the social shame of being caught with alcohol for an average Yemeni would be damning not only of his reputation, but of his family and his tribe. I took that taxi driver’s number and the next time I left a diplomat’s party in the fancy part of town where sheikhs and foreigners live behind tall walls, I called him to pick me up. I snuck him a beer, which he uncapped with his teeth and drank during our drive back to the old city.

There are things I don’t miss, like the lack of electricity. Or wading through a foot of muddy, trash-strewn water because the drainage system wasn’t working fast enough for the rainstorm. I certainly don’t miss needing to flee my home in the old city because the war came too close in September 2011, when Yemen’s divided armed forces began to fight one another. I didn’t want to live alone when random artillery fire had fallen nearby. And then there was the gnawing guilt that came with remembering that my suffering was nothing compared to that of Yemenis who couldn’t afford a generator or the rising prices for basic goods, and who didn’t have another home to which they could flee. But the good always outweighed the bad for me in Yemen, and that’s why I stayed for nearly three years. I left when I realized that reporting during wartime, being so close to explosions, death, and violence, had clouded my thoughts so that I was incapable of making safe decisions.

As the country, now leaderless, fractures with little hope of reconciliation, I watch with a breaking heart. Yet, I am confident in this: if the Yemeni government fails to restructure itself into a sustainable organization, and instead continues to mirror a scenario from an apocalyptic future, Yemen will not be a land where every man is for himself. There is a social contract in Yemen more ancient than the one that exists in the United States, and the ties that bind people to one another can step in when the government fails. As an outsider who was fortunate enough to have called Yemen home, I put my hope in that.

Laura Kasinof is an author and freelance journalist. Her book, Don’t Be Afraid of the Bullets: an Accidental War Correspondent in Yemen, is about her time reporting for The New York Times during Yemen’s Arab Spring. She wrote this article for Zocalo Public Square.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Media

Why Can’t Hollywood Tell America’s Stories?

Getty Images

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Our onscreen heroes are white men. But most of us aren't

The 2015 Oscars broadcast may reflect the demographics of the Academy of Motion Picture Arts and Sciences voters—who are overwhelmingly older Anglo men—but it won’t reflect the demographics of the rest of the country. All 20 acting nominees are Anglo, and all the directing and screenwriting nominees are male. The Academy Awards may not tell the whole story, but they certainly indicate that many American stories still aren’t being told on our screens. In advance of the Zócalo/UCLA Bunche Center “Thinking L.A.” event “Why Can’t Hollywood Look Like America?”, we asked media scholars: What are the critical and integral contemporary American stories that Hollywood is not telling?

Camilla Fojas — Stories of inequality and social and economic immobility

Hollywood is an industry in pursuit of profit. It is not an open marketplace of new imaginings or ideas unless those ideas draw audiences and increase profits. That said, it has all but ignored a major social and cultural upheaval. Since the economic collapse that began in 2007, there has been heightened awareness of the deepening economic inequality in U.S. culture. While Hollywood responded quickly to the economic collapse with epic tales of the cruel machinations of the big banks and their minions, it has yet to tell the story of the most economically vulnerable and those burdened by oppressive student loan and mortgage debt. It is more profitable to deliver a melodrama about malevolent banks that fits neatly into the age-old morality tales of good versus evil. We have yet to see the story of those at the bottom of the labor market—those who are out of resources, silver linings, and options. The economic crisis intensified wage stagnation and further limited opportunities for employment and upward mobility. This new scenario does not square with the Hollywood myths around the “American Dream” centered on “rags-to-riches” storylines. The story of inequality, of the deepening divergence between rich and poor, and of social and economic immobility, is the real story of our times.

Camilla Fojas is Vincent de Paul Professor of Latin American and Latino Studies and director of Critical Ethnic Studies at DePaul University. Her newest book is Freefalling: Pop Culture and the Economic Crisis.

Priscilla Peña Ovalle — Universal stories that don’t star white actors

Hollywood trades in the spectacular, the dramatic, the titillating. Even romantic comedies usually elevate the “everyday” business of love with fantasies of wealth. To some degree, that’s OK. Audiences gravitate towards escapist films. I do. But I’d like Hollywood to tell these spectacular tales with actors who look more like contemporary America.

I want to see more women and nonwhite people on screen. I’d like to see a good romantic comedy about a black couple falling in love that can exist without being pigeonholed as a “black movie.” I’d like to see the Latina version of John Wick (2014) or Office Space (1999). I don’t necessarily want to see films about race or women’s issues. Right now, I just want to see some different folks lead.

Hollywood too frequently employs white actors to tell universal stories; a continued reliance on the white “everyman” results in films that lack the texture of (contemporary) America. Where are the sci-fi protagonists with curvy bodies or the vampires with brown skin and curly hair? Such long-standing inequities stem from racist and sexist standards of beauty that have governed a racist and sexist system of film production and stardom in the United States since the silent era. But at a time when 44 percent of moviegoers are nonwhite, it is unbearable that 76 percent of the bodies on screen remain white.

So, I have hopes for the new Ghostbusters starring Kristen Wiig, Melissa McCarthy, Leslie Jones, and Kate McKinnon. While the film highlights Hollywood’s reliance on “sure hits” that often recycle white male protagonists along with narratives, this version of Ghostbusters promises something more: a crew of women that represent radically diverse body types in a film that is presumably not about their looks or struggles as women. What an escapist fantasy!

Priscilla Peña Ovalle teaches film and television in the cinema studies program at the University of Oregon. She is the author of Dance and the Hollywood Latina: Race, Sex, and Stardom (Rutgers 2011) and is currently working on a project about race and hairstyles in Hollywood.

Ellen Scott — Institutional exposés

Hollywood tells many stories about race, but those that lay bare invisible power relations—the struggles of not individuals but of a larger segment of society against institutional constraints—are most rare. These stories are difficult both because such institutional forces are hard to name and personify and because Hollywood, an institution itself, has a vested interest in muting these images.

The problem of incarceration, and the prison industrial complex, is a massive rather than merely personal story. Sixty percent of black male high school dropouts will go to prison before age 35. In the process, they and their families will find many of the gains of the civil rights era effectively reversed, from voting rights (which are often denied to felons) to their prospects of reaching middle-class status.

Such stories remain rare in American media, partly because Hollywood censors long forbade broad condemnation of the criminal justice system as a professional courtesy to police and judges. However, stories from behind prison walls—often told by independent cinema, primarily documentaries but also feature films like Ava DuVernay’s Middle of Nowhere (2012)—stand to reveal much about contemporary America.

Experiments like Richard Linklater’s Fast Food Nation (2006) and John Sayles’ City of Hope (1991) show how narrative cinema can operate as institutional exposé. One challenge is motivating the often individualistic, solipsistic art of cinema to tell the intricate stories of the many. How might we, for example, tell a cinematic story that makes palpable the power and impetus behind the “Black Lives Matter” campaign not only through personal stories but through the story of networks—both digital and human? Such stories are even more difficult to sell than they are to tell. The other challenge is to find funding for films whose politics conflict with the whitewashed stories Hollywood has traditionally enshrined as “the” American narrative.

Ellen Scott is Assistant Professor of Media History at CUNY-Queens College. Her work concerns the relationship between media and the ongoing struggle for African American equality, and her current book, Cinema Civil Rights: Race, Repression and Regulation in the Classical Hollywood Era, is now available from Rutgers University Press.

Ana-Christina Ramón — The Latino version of Parenthood

Latinos are not only the largest minority group, but also one of the fastest growing in a country that is expected to be majority-minority by 2043. Many businesses and political interests have taken notice and made a concerted effort to appeal to and market to Latinos. So why has Hollywood been slow to catch up? Although our research at the Bunche Center at UCLA shows that relatively diverse TV shows excel in ratings, Latinos remain underrepresented onscreen. One underlying reason may be the belief that Latinos will continue to consume media regardless of who makes or appears in movies and TV shows. But will this reasoning (true or not) hold up as younger Latinos become savvier about their entertainment options? And is Hollywood leaving money on the table by not appealing to Latinos’ experiences?

Growing up in Los Angeles as a daughter of Mexican and Peruvian immigrants, I know how varied and rich the Latino experience can be. From my grandmother’s journey as a single mother who worked as a housekeeper to give her kids opportunities to my life as an academic researcher advocating for social justice issues, a multitude of stories exist that are uniquely Latino yet encompass universal themes of struggle and triumph. Recent hits such as Jane the Virgin and Devious Maids show that TV audiences want to see Latino content. But not every Latino show has to be based on a Spanish-language telenovela, either.

Overall, Hollywood needs to move beyond its one-dimensional Latino character tropes. Where can I find a drama about successful Latino professionals who maintain strong ties to their families and culture? Where’s the Latino version of Parenthood or Best Man? Take a chance, Hollywood. The results may surprise you.

Dr. Ana-Christina Ramón is Assistant Director and Associate Researcher of the Ralph J. Bunche Center for African American Studies at UCLA. She currently manages the Hollywood Advancement Project that examines diversity in the entertainment industry and is the co-author (with Darnell Hunt) of the 2014 Hollywood Diversity Report and the new 2015 report.

This article was originally written for Zocalo Public Square.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Culture

Your Chinese Menu Is Really a Time Machine

Getty Images

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Sweet and Sour Pork and Chop Suey aren’t just delicious—they also tell stories of waves of immigration from China

I grew up in a Chinese restaurant called the Peking Restaurant in rural New England during the 1970s and ’80s. I was that kid you saw running around the tables and through the waiters’ legs, and playing with whatever I could get my hands on. I had access to some cool things—pupu platters for my birthdays, all the fortune cookies I could eat, the pleasure of celebrating two different new year’s days every year with treats like a roasted pig during the Year of the Pig. And, when I was old enough, I could use the deep fryer to make dinner. As a child, I didn’t see the complexity of the Chinese-American story hidden amongst the aromatic dishes being served. To me, the restaurant was just home, the place I grew up.

My family’s restaurant was far from the hustle and bustle of the nearest Chinatown, all the way down in New York City. For many years, we were the only Chinese family in West Springfield, Massachusetts. Today, Chinese food is so thoroughly woven into America’s culinary tapestry that you’d be hard pressed not to find a Chinese restaurant in most modest-sized towns. They run the gamut: chic, high-end eateries, barebones take-out counters, and bustling all-you-can-eat buffets. But the success of Chinese food culture was hard-earned. And the history of the challenges it faced appears in a surprisingly common place: the finely inked print of your local Chinese restaurant’s menu.

The most fascinating aspects of the restaurant menu aren’t the exotic names or the daily special, but the wonderful time capsules captured by the food and ingredients that make up each dish. Some of the selections you make for your family’s Chinese food night provide snapshots of the different waves of Chinese immigrants coming to the United States, as well as the American reactions to those new arrivals.

A Chinese restaurant’s menu is usually comfortingly familiar: sections for noodle, vegetable, and meat dishes, and the chef’s specials. One common dish is sweet and sour pork. It is a traditional southern Chinese dish that in its original form looks far different from the Day-Glo red dishes served today. Many of the first Chinese immigrants originated in southern and southeastern China. They took the risk of coming to America for new opportunities and a chance to make their fortunes as miners, railroad builders, farmers, fishermen, launderers, and restaurant owners in the mid-19th century.

Those Chinese immigrants were increasingly looked upon as outsiders who refused to conform to societal norms and took jobs away from Americans. Anti-Chinese feeling began to rise across the United States in the late 19th century, and it was especially strong in the West, where jobs grew scarce as cheap manufactured goods and European immigrants arrived via the railroad. Chinese immigrants found themselves blamed for the nation’s ills. This wave of resentment culminated in the Chinese Exclusion Act of 1882, which barred Chinese immigration.

Around the turn of the 20th century, chop suey became the quintessential “Chinese” dish and was one of the earliest to capture the imaginations of adventurous Americans. During the heart of the exclusionary period, the dish exploded in popularity and actually aided restaurant owners in overcoming the restrictive attitudes and laws of the time. People may not have liked their foreign neighbors, but they loved their chop suey. Soon the dish began appearing in pop culture: One song pined, “Who will chop your suey when I’m gone?” Edward Hopper depicted two women conversing at a restaurant in 1929’s “Chop Suey” painting.

There is some irony here. It’s long been assumed that chop suey is actually a wholly American invention. A recent book by Andrew Coe, Chop Suey: A Cultural History of Chinese Food in the United States, indicates that chop suey in some variation or another does exist in China. I asked my personal expert on the subject, my father. His opinion? Chop suey (sort of) means extra bits or leftovers in Chinese, and who doesn’t have leftovers?

The exclusionary laws were lifted during World War II, when America allied with China, but wholesale changes in immigration laws did not happen until 1965 with the Hart-Celler Act. The new law opened up immigration to the broader Chinese diaspora spread around the world, including those who had fled the civil war on the Chinese mainland for places like Taiwan and Singapore. With the arrival of more Chinese immigrants came a wider range of regional tastes and recipes. Sichuan and Hunan (provinces in China) started showing up in the names of dishes. Recipes inspired by those provinces make up a recognizable portion of many Chinese menus. For example, it is possible that a section of the menu will include an entire row of Hunan beef, chicken, shrimp, or—in my latest takeout menu—something called “Hunan Delight.” One of the more popular and successful Sichuan dishes to come from this period is kung pao chicken, a dish well known for its spiciness and easy-to-remember name.

It was during this period, circa 1968, that my family came to America via Taiwan. My father chose to open a northern-style eatery with specialties such as Peking duck, double-cooked pork, and hot and sour soup. He wanted something different from the traditional southern restaurants that already dotted the landscape. But in the end the restaurant served both newer northern dishes and the more common and expected southern recipes. Business was business.

Opening a Chinese restaurant might not have been my father’s first choice. He came to America originally to carve out a career as a scholar with his degrees in political science and mathematics. But like many Chinese immigrants, even some 25 years after the end of exclusion, he found this was the route to success open to him. It did give my mom, a lifelong fan of cinema, the chance to meet Paul Newman. Newman had stopped by for lunch after a promotional event at our shopping mall. He liked it so much he brought his wife, Joanne Woodward, back with him a few weeks later.

The complexity of the Chinese-American story cannot be unraveled through a menu alone. But it can tell us a little bit about ourselves—how we as Americans treat one another and how we accept the new and the different. These little windows into Chinese-American history show us how, despite restrictive laws and great animosity, Chinese immigrants persevered. They took the limited opportunities given them and succeeded so wildly that Chinese restaurants are a thriving, essential part of the American experience.

In fact, at last count there were over 50,000 Chinese restaurants in the U.S.—more than McDonald’s, Burger King, and Wendy’s put together. Critics say that Chinese food from a takeout menu isn’t authentic Chinese, that it’s more like American Chinese food. I think there’s a clear alternative way to describe it: It’s now all-American food.

Cedric Yeh is the deputy chair of the division of armed forces history at the Smithsonian’s National Museum of American History. His latest exhibit is “Sweet & Sour: Chinese Food and Restaurants in America.” He wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zocalo Public Square.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME health

My Polio, My Mother’s Choice

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Today's parents, thanks to vaccines, have never had to learn—need never learn—about pain and grief and loss of control

It had been a good year for Lois Mace.

She and her husband, only three years beyond college, had bought their first house. A solid red-brick and clapboard Cape Cod, it sat on a leafy street named for a character out of a Longfellow poem. In its driveway glistened a new sedan, silver-gray with burgundy roof and whitewalls, a gift from her father, a Ford dealer.

And under its dormers that last day of August 1954 slept her three children: A sunny toddler with platinum blonde hair and a weak stomach sphincter, known around the house, mostly affectionately, as Miss Urp. A three-year-old bruiser with a devilish twinkle in his eye, whom the neighbor nicknamed Meatball. Then there was the eldest, a lithe towhead with quick feet and an even quicker tongue—him they called Motormouth. He was set in a week’s time to walk the two blocks down the hill and start first grade at Nakoma Elementary School.

Everything was the way she liked it, under control.

In the middle of that night, Lois was roused by sounds from the boys’ bedroom. Tucked under the shed roof at the back of the house, the bedroom was stuffy with the heat of late summer. The older boy, who shared a bed with his little brother and a ratty blue bear, lay feverish and whimpering. Her husband carried the boy to the bathroom. He was too weak to stand and use the toilet.

The next day they drove him to the hospital for a spinal tap. The spinal fluid was cloudy. “During the past three or four days almost complete paralysis of both lower extremities and left upper extremity and trunk musculature has developed,” his doctor would write in the medical record on September 4.

Lois Mace Paul, 28, had come very far, very fast from a Depression childhood in a small Iowa town—husband, house, kids so well behaved that strangers would stop by the table in restaurants to compliment her. But now she was also the mother of a boy with polio. He lay in an isolation unit, afraid and confused, unable to sit or roll over. She could only stand in the doorway, swathed in a surgical gown and mask, forbidden to hold or comfort him for fear of spreading the virus.

We can safely assume these events counted as life changing for Lois. After 10 days in isolation, the boy was put on a children’s ward, where he would remain for 130 days, “for institution of hot packs and passive stretching exercises and later institution of active exercises,” according to his medical record. Every afternoon at 2, Lois traveled the three miles to the hospital to sit with the boy. She would read to him as he ate the sandwich—always peanut butter on white bread—that she smuggled past the nurses; her boy wasn’t keen on hospital food. Her husband took the night shift, arriving at 7 to launch Pooh and Christopher Robin on their next “expotition.”

Even judged by the standard of today’s families balancing work and parenthood, the logistical challenges were daunting. Meals to make, clothes to wash and hang, diapers to change. Schedule babysitters for every afternoon. Change clothes and put on makeup—a respectable woman didn’t go downtown in jeans and without a face. Find a way to get back and forth; there was only the one car. Make dinner so her husband could get back to the hospital on time. Bathe and put the little ones to bed on her own. How much time or energy could there have been for coffee or cocktails with friends, or for nights out with her husband?

And it didn’t end there. When the boy was finally sent home, he had to be carried up and down stairs. Over the next decade there would be braces and crutches that he was always expensively outgrowing. And as he grew and his unbalanced muscles twisted his frame, Lois and her husband would sit eight times in a surgical waiting room while Dr. Wixson used chisel, hammer, wire, and staples to straighten his back and legs. Not until the boy himself waited outside an operating room as his own infant child underwent orthopedic surgery could he imagine how fear had shadowed Lois’s life.

Imagination is about all we have to tell us what those events meant to Lois emotionally. She didn’t talk much about feelings.

The boy’s only hint came one afternoon, about the time of his sixth birthday. A high school running back had injured his neck in a game and had been brought into the ward the night before, his limbs numb. As Lois and the boy looked on, a doctor and nurses, after some probing, helped the player sit, swing his legs off the bed, and, to the delight of staff and parents, stand again. Seeing what pleased adults, the boy turned to Lois. “I’m going to do that soon,” he said. She didn’t reply, but tears streamed down her face.

We know she grieved. Lois shared the bad news in a letter to her best college friend, who had joined the Iowa diaspora to Los Angeles. It read like a funeral notice. “Oh, my beautiful little boy,” she wrote in ending. Lois confided to her favorite aunt that she feared the boy would die.

Why didn’t Lois vaccinate me? Because she had not been given that choice. I had fallen ill 224 days before the announcement, on April 12, 1955, that the field trials of the Salk polio vaccine were a success.

As she lay in bed that night, digesting the news that had been shouted out across the country over radio, television, and public address systems in workplaces and schools, Lois had a choice to make. Because kicking inside her was the boy she had conceived in her grief the previous fall.

Today’s parents make those choices knowing much more than she did about the effectiveness and safety of the vaccines offered to their children. They can rely on decades of experience and scientific research.

Lois faced only scientific uncertainty. The Salk vaccine was new. It had been only 60 to 70 percent effective in the trial but had been deemed safe. Some of the world’s top polio researchers weren’t so sure. They had publicly opposed the trial, thought the vaccine the wrong approach, maybe even dangerous. Their fears materialized within weeks. Cutter Laboratories in Berkeley shipped vaccine contaminated with live virus. More than 200 children and family members were paralyzed, and 11 died. The vaccination campaign was briefly suspended.

But from her own experience, Lois Mace knew things that today’s parents, thanks to vaccines, have never had to learn—need never learn—about pain and grief and loss of control. As soon as she could, she took all her children to get the shots, and went back again after the Cutter fiasco.

She could not be certain it was the best choice for them. She knew, to her very bones, that it was the right choice for her.

Mark Paul, formerly deputy editorial page editor of the Sacramento Bee and deputy treasurer of California, is co-author, with Joe Mathews, of California Crackup: How Reform Broke the Golden State and How We Can Fix It. He wrote this for Thinking L.A., a partnership of UCLA and Zocalo Public Square.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Culture

In Defense of Terrible Coffee

Getty Images

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Mass-brand coffee remains the dominant coffee in the U.S., even in this era of gourmet coffee. And that's okay

One June morning years ago, during a cross-country bike trip, my brothers, a couple of friends, and I sat in a diner in Sandpoint, Idaho, waiting for a drizzle to pass, eating eggs and drinking coffee.

The coffee, as I recall, was no great shakes. It likely came in thick, bone-white mugs, the rims pitted and slightly stained from years of use. We were just becoming aware of gourmet coffee in those days. And, sure, if you’d asked if we wanted the diner joe or a cup of Sumatra Mandheling like they served at Brillig Works in Boulder, we’d all have opted for the latter. But we weren’t in Boulder, the gourmet coffee was not available, and yet we had a blast, drinking the bitter diner joe, joking around, and, finally, too jacked up to sit still, rolling down the road.

These days, gourmet coffee is everywhere. And we’ve got a million new ways to prepare it. In addition to cold-pressed coffee, we’ve got the Japanese siphon process, a plethora of pod brewers, and coffee that comes from fancy machines like the Roasting Plant’s Javabot. And there are concoctions like the flat white—an espresso-and-steamed-milk blend—that suddenly become trendy when the Starbucks marketers put them in heavy rotation.

But it is easy to overlook an enduring truth amidst the gourmet coffee shuffle: Most coffee we drink in the U.S. is not the type favored by coffee connoisseurs. Folgers and Maxwell House remain the nation’s most popular coffee brands, by a long shot. Despite the gourmet coffee boom, this golden age of fine coffee, it’s primarily these mass-market blends that keep America caffeinated, and those diner cups full.

Once, hitchhiking through Wyoming in a snow squall, I caught a ride from a young couple. They were vagabonds who had made a tidy little home in their pickup with a camper shell. We pulled off at a truck stop in Rawlins. And I remember how that coffee—plain old truck-stop coffee—warmed us up, strangers waiting out a blizzard. When they dropped me off in Cheyenne a couple of hours later, I felt I was leaving old friends.

Over the years, how many late-night or early morning road trips, outdoors adventures with friends and family, or travels to remote job sites have been undergirded by diner coffee? Too many to count.

Is it just nostalgia that makes me appreciate—not crave, but appreciate—the coffee so often dissed as inferior? Probably. Who can deny the deep emotions triggered by a late-night cup of Joe, reminiscent of Edward Hopper’s Nighthawks: Adrift in midnight America, the clatter of the dishes, the warm cup of diner coffee. You don’t get that feeling at Starbucks.

So, partly, it’s a matter of nostalgia, but partly it’s a matter of caffeine.

Ounce per ounce, Folgers and Maxwell House coffees are more caffeinated than most specialty coffees. And there are two reasons for this. First, they tend to be lightly roasted. A light-roasted coffee has slightly more caffeine per bean than a dark-roasted coffee. Second, they typically include blends of arabica and robusta beans. Arabica, the mountain-grown coffee beloved by coffee connoisseurs, tends to taste smooth. Robusta, the cheaper, hardier, easier-to-grow coffee, often has a bitter tang (one coffee expert says it tastes like burnt rubber). But here’s the catch—robusta has much more caffeine than arabica, often twice as much.

So that cup of Java in the diner or truck stop, unless it is brewed weakly, will likely give you more of a jolt than a cup from an upscale café. And that caffeine is a big part of what pulls us off the two-lane road to a diner in the middle of nowhere, and brings us back to the downtown deli where the waitress is endlessly refilling your coffee cup.

Recently, I stopped at a country store at a northern Maine crossroads on a frosty morning. I’d only planned to ask directions, but got into a conversation about fishing with a friendly local. So I had a cup of coffee while we talked. Unlike some New England convenience stores, this one did not have 15 flavors of Green Mountain coffee in vacuum pots, just two of the old Pyrex coffee pots on hot plates. It sure wasn’t the gourmet stuff, but it definitely hit the spot.

Murray Carpenter is a Maine journalist, and the author of Caffeinated: How Our Daily Habit Helps, Hurts, and Hooks Us. He tweets at @Murray_journo. He wrote this for Zocalo Public Square.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME health

Unvaccinated Families’ Addresses Should Be Made Public

A single dose of MMR for Measles, Mumps, and Rubella at Kaiser Permanente East Medical offices on Feb. 3, 2015 in Denver, CO.
Joe Amon—Denver Post via Getty Images

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

The names and addresses of parents who won't vaccinate their children should be made available on the Internet through a public registry

Shouldn’t we know where they live?

California’s measles outbreak has touched off a debate about how to reduce the number of parents who choose—in defiance of all credible public health information—not to vaccinate their children. So far, the debate has focused on tightening California laws that make it easy for parents to obtain exemptions from school vaccination requirements. Newly introduced state legislation would eliminate the “Personal Belief Exemption” that thousands of anti-vaccine parents have used.

I’d be more than happy to see this proposal become law. But the politics of reducing parental choice are fraught, and there are limits to the law’s ability to compel good parenting. There’s also a hard cultural fact: few things are more fundamentally Californian than the freedom to believe whatever pseudo-religious or pseudo-scientific nonsense you choose. So, one way or another, it’s likely that parents will still find ways to avoid vaccinating their children, despite the risks to both their own kids and their communities.

A tougher, smarter way of dealing with anti-vaccine parents would be to target not their choice, but the secrecy that surrounds that choice.

Under today’s privacy laws, public schools and health authorities must protect the identity of parents who choose not to vaccinate. That’s wrong for many reasons. First, the secrecy effectively forces public employees, whose first duty should be to public safety, to be enablers of those who threaten that safety. Second, parents who endanger the community’s health don’t deserve official protection. And third, the confidentiality of such exemptions makes it harder for the families who do vaccinate their children to protect themselves.

People deserve privacy in their private spheres. But a parent who won’t vaccinate is not making a private health decision: She is making a public health decision that profoundly affects others.

So let’s treat the exemption she obtains as the public act it is. Every single exemption request should be reviewed in a public meeting and approved by a public body (like a city council or school board). And if the exemption is approved, basic information—the parent’s name, address, and the vaccinations declined—should be available on the Internet via a publicly maintained registry.

The virtues of disclosure are clear. Having your family’s name published as a potential hazard to public health would be a strong disincentive to obtaining an exemption for all but the most committed (i.e., delusional) anti-vaxxers. And the rest of us would be able to identify our unvaccinated neighbors, and our children’s unvaccinated schoolmates. This would be especially helpful to pregnant women and to the parents and caregivers of children who either are too young to be vaccinated (the first measles-mumps-rubella vaccine isn’t given until after a baby’s first birthday) or have serious diseases like cancer (as in the case of the Marin County six-year-old recovering from leukemia) that compromise immune systems and preclude vaccination.

In effect, the question of how to handle unvaccinated children and their parents would move from the realm of school administrators to the community at large. And the community level is where the question is best addressed, since we encounter the unvaccinated not only at school but also in parks, churches, and stores.

There is some risk of community and personal conflict in this shift, to be sure, and anti-harassment laws would have to be strictly enforced. But there would also be potential for the kind of conversations necessary to change minds and get more children vaccinated.

Those who have studied the question of how to convince people to vaccinate report that the voices of distant authorities—public health departments, governors, even President Obama—aren’t particularly effective, given deep public distrust of institutions. People you know—neighbors, friends, co-workers—make better emissaries to the unvaccinated. But you can only be an emissary to unvaccinated neighbors or friends if you know they are unvaccinated.

The recent legislation acknowledges this need for conversation with a proposed requirement that all parents be notified of the vaccination rates at their kids’ schools. But that doesn’t go far enough. Indeed, it might create additional anxiety by instigating guessing games and speculation, without triggering the desirable peer pressure of true disclosure.

Some committed opponents of vaccines may howl about their identities being made public or about the exposure of their children, but such objections are easily turned back against them. If you believe you have the absolute power to make whatever decision you want for your children, why would you deny me the right to do the same, including the right to decide whether my children should be going on play dates to the homes of people who have recklessly opted out of modernity?

That response may sound harsh and insufficiently sensitive to privacy. But for better and for worse, it fits the obligations of 21st-century childrearing. As a parent myself, I’m repeatedly reminded—by doctors, nurses, public officials, schools, and the dozens of legal waivers that daily life requires me to sign—that I am expected to know everything I can about my kids. I’m supposed to know where they are at all times, and to monitor every minute of exercise and each spoonful of sugar. I’m supposed to find out everything I can about the kids they hang out with, and I’m supposed to monitor all their online movements. It’s no coincidence that the most successful public service announcement series in America, now celebrating its 25th anniversary, is NBC’s “The More You Know.”

There are other good ideas out there for putting pressure on parents who don’t vaccinate. You could hand out stickers or buttons to all vaccinated schoolchildren—creating a social pressure on those who don’t. Laws could permit insurers to raise the premiums of those who don’t vaccinate (right now, insurers can only set rates based on age, geography and tobacco use). A new tort could be created to permit people who incur medical and other costs because of an outbreak to sue and recover damages from the unvaccinated. I particularly like a proposal from Dorit Rubinstein Reiss, a law professor at UC Hastings, to charge a significant fee for vaccine exemptions to cover the costs of an outbreak.

This issue is personal. My own children are still little, and it will be a few more years before all three are old enough to have had all their vaccinations. Media outlets have recently compiled data on the number of vaccination exemptions in California schools, and it bothers me that, of the 95 kids who attend kindergarten with my oldest son at our local public school, three are unvaccinated because their parents have obtained Personal Belief Exemptions.

I should have the right to know who those families are. And I look forward to the day when I can engage them in a conversation about what our families owe each other.

Joe Mathews is California & innovation editor at Zócalo Public Square, for which he writes the Connecting California column.

TIME Race

Why Diane Nash Is Selma’s Best Supporting Role

Tessa Thompson plays Diane Nash in Selma (Atsushi Nishijima—Paramount Pictures)

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

The film may have focused on Martin Luther King, Jr., but Nash was the reason he was there in the first place

If you watched the film Selma, you met Diane Nash when you saw her driving with Martin Luther King, Jr., into the Alabama town early in 1965. King’s organization, the Southern Christian Leadership Conference, had just begun to stage demonstrations to illustrate the need for federal forces to protect African-Americans exercising their right to vote in Selma, and throughout the former Confederacy.

Nash, somewhat surprisingly, stays in the background throughout much of the film—though an FBI field report excerpt flashed on screen does include her name. She could very well be mistaken for simply being activist James Bevel’s stunningly beautiful wife.

A film has its own narrative needs, and I understand that this one very much wants to remain focused on King. But when I saw Selma, I couldn’t help but think of the hundreds of people laying the groundwork for the demonstrations and developing the strategy months or years before the charismatic leaders and the news cameras showed up on the scene. Nash was one of those important trailblazers—she was the main reason King and his organization were there in Selma in the first place.

I first learned about Nash watching Eyes on the Prize as a college student in the late 1980s. She comes across as one of the most extraordinary figures to arise in the student movement. Remarkable footage captures Nash leading a march in Nashville that culminates in a face-to-face confrontation with Mayor Ben West, who’s compelled by Nash to make a grudging admission that he felt segregation was morally wrong. I marveled at this forceful, determined woman who was about my own age, but had a will that far exceeded mine. I’ve come to know her better in recent years as she’s advised a number of civil rights history programs I have created for the Smithsonian Institution.

What strikes me about Nash as she talks about the racial problems she saw a half century ago (and still sees today) is the powerful anger at injustice that she channeled into non-violent direct action. Hers was an anger tempered by reason, strategic practicality, and a principled belief in peaceful activism. That holds true for many of the veterans of the civil rights movement whom I’ve met: When I hosted Rosa Parks on a tour of the Henry Ford Museum outside Detroit (the eventual home of the bus on which she refused to give up her seat to a white man), she didn’t seem the quiet and composed “Mother Rosa” when talking about injustice. She seemed pissed off.

For Nash, it was the bombing of Birmingham’s 16th Street Baptist Church in September 1963 that galvanized her into taking action on voting rights. The tragically famous church where four young girls died going to Sunday School had been a training facility for the Birmingham “Children’s Crusade” organized by Bevel.

Nash and her husband had been wrestling with the fact that their activism put people’s lives at risk ever since they became involved in the civil rights movement in 1960. When the integrated groups of young men and women organized by the Congress of Racial Equality to ride buses into the South—the Freedom Riders—were beaten and firebombed by Klansmen in Alabama in the spring of 1961, Nash, a co-founder of the Student Nonviolent Coordinating Committee, could have decided it was too dangerous and stayed away, since the rides had been conceived by another civil rights organization. Instead, with Nashville student organizers Bernard Lafayette and (now congressman) John Lewis, she recruited volunteers to continue the rides. When news reached Washington, Attorney General Robert Kennedy, who was eager to keep embarrassing racial violence off the front pages, demanded, “Who the hell is Diane Nash?” and asked his assistant to stop her. But the 23-year-old had made her calculations: The movement was more important than the lives of its organizers.

“If we allowed the Freedom Ride to stop at that point, just after so much violence had been inflicted, the message would have been sent that all you have to do to stop a nonviolent campaign is inflict massive violence,” she remembered.

So when the bomb exploded in Birmingham, Nash told me, she decided that “a grown man and woman with respect for themselves could not let four little girls be murdered and not do anything about it.” They discussed finding those responsible for the bombing and killing them, but decided their efforts were better directed at getting blacks in Alabama the right to vote and changing the faces of those in power. It was possible to upend the power structure in the state’s “black belt,” where a majority of the population was African-American, if not yet making their presence felt in elections. Of the 15,000 or so black people of voting age in Dallas County (where Selma is located), for instance, fewer than 150 were registered to vote.

Nash and Bevel also knew there were local activists in Selma who would be willing to protest and put their bodies on the line, and so they conceived a step-by-step plan that they presented to the Southern Christian Leadership Conference. Nash may not have literally driven King into town, but she did bring him there in a figurative sense. She was a leader in negotiating the tense relationship between SNCC and SCLC, addressed demonstrators at Brown Chapel AME Church (the Selma campaign’s headquarters), planned out where protesters would go and when, and organized logistics. When the tear gas and clubs rained down on the peaceful marchers as they tried to cross the Edmund Pettus Bridge en route to Montgomery on what came to be known as Bloody Sunday, Nash sent runners out to get the medics. As the final march reached Montgomery, Nash marched the last few blocks with King. In August, after the Voting Rights Act of 1965 was passed, King conferred on her and Bevel the SCLC Freedom Medal for conceiving of the crucial Selma campaign.

Nash’s story reminds us that the Civil Rights Movement wasn’t just about the names in the headlines, but also about the legions of humble citizens on the ground who took action at great risk to themselves. As for Nash, now in her 70s, she is still determined to remind us that this isn’t all ancient history with a tidy beginning, middle, and end—even after the passage of the Voting Rights Act of 1965.

“That 10 minutes that people spend in the voting booth every two years is not enough,” she told a crowd at the Smithsonian in 2011. “I think back sometimes and wonder if we in the civil rights movement had left it to elected officials to desegregate restaurants and lunch counters, to desegregate buses … I wonder how long we would have had to wait. And I think, truly, that we might still be waiting.”

Christopher Wilson is director of the program in African-American history and culture at the Smithsonian’s National Museum of American History. He is also director of experience and program design and founded the museum’s award-winning educational theater program. He wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zocalo Public Square.
