TIME politics

I Feel Ashamed to Tell Others That I Am Republican

xoJane.com is where women go to be their unabashed selves, and where their unabashed selves are applauded

A misunderstanding of a political ideology can leave people feeling like outcasts in society


Like every other stringy-haired, freckle-faced girl, I found that middle school wasn’t exactly a haven of bliss. I’m not sure if it was my dirty no-brand tennis shoes or my ill-fitting mom jeans, but I was never invited to sit at the cool kids’ table.

Ever the realist, I decided not to push my luck with the popular crowd and instead resigned myself to bus rides spent reading and doing homework. It was during that time of solitude that I fell in love. Not with a boy. Not with a girl. But with politics. I buried my nose in my history books and relished the idea of a society where hierarchy wasn’t determined by birthright or the type of shoes you wore. Instead, everyone was born equal and treated equally. It was at this point that my life course was decided: I wanted to work in politics.

So like any other Type-A child, I dedicated my spare time to ensuring my success by signing up for extracurricular online classes in AP Government and Latin, memorizing the map of the world and the capital of each country, and even skipping school on Election Day in order to volunteer at the local call center.

Years later, my persistence paid off. I landed a scholarship at one of the most prestigious liberal arts schools in the country and graduated with my Bachelor’s in Political Science. By the time I was 23, I had already staffed a presidential campaign and was working on Capitol Hill as the press secretary of a prominent congressman.

My mornings started with a 5:30 am workout in the Capitol Hill staff gym, followed by a relaxing shower and scrub with my lemon bar soap. Toweling off there in the restroom, I’d take a look at myself and smile, thinking of how much the awkward 11-year-old girl from middle school had changed. I was no longer just dreaming of having a place in the political world. Now, I actually did. The sacrifices I had to make to get there were hard, but it was worth it. And I was proud of myself.

However, my confident world came crashing down at a friend’s 24th birthday party. Between cinnamon roll shots and bites from the turkey and cheese platter, the group’s conversation turned from the latest gossip toward the concept of blowing off steam from a stressful workweek. Nothing prepared me for the reaction I received when I uttered the six words, “I work for a Republican congressman.”

My new gay acquaintance, with whom I had been chatting the whole night, abruptly cleared his throat and walked away, while the remaining party guests who heard my comment bombarded me with a series of assumptions and questions, like how I could vote for Sarah Palin, why I was in favor of global warming, and whether I considered myself a feminist even though I’m against women’s rights. In the conversation that followed, my confidence evaporated, and I was reduced back to the stringy-haired, freckle-faced kid of middle school.

I was no longer the poised, accomplished woman I was 15 minutes before. Instead, I felt like a mortified child who had just been ejected from the cool kids’ table.

With the prevalence of social media, constituents have a more hands-on interaction with politics than ever before. News articles from the Wall Street Journal and New York Times can be posted on private Facebook walls, e-mails can be leaked, candidates’ conversations can be secretly recorded and then released to the masses. There is no lack of information, but rather a surplus of opinion. Unfortunately, those who scream the loudest are heard the most—a reality that is leaving our nation more polarized than ever before.

Libel and slander have become socially acceptable as we isolate one another in an attempt to garner more votes. I won’t even dare to claim that only one party is to blame for the propagation of lies and falsehoods. However, I have experienced first-hand how a misunderstanding of a political ideology can leave people feeling like outcasts in society.

Although I’m proud of my accomplishments, when I’m in a crowded room with new faces, I secretly hope nobody asks me about my work background. I hold my breath as I admit that I work in politics and wait for the inevitable question of, “For which party?”

Being a staffer in the Republican Party isn’t exactly the sexiest job title, but what makes me cringe is knowing that somebody is assessing who I am as a person based on my business card. I’m proud of my career and believe in my Party, but I’m ashamed to admit to my peers that I’m a Republican because of the stigma associated with it.

Why does everyone assume that all conservatives are homosexual-hating, gun-toting Tea Partiers who demand President Obama’s birth certificate? At what point did the Republican Party become classified as the rich white people party?

Are we all ignorant of the true roots of the Republican party—how we are a political group that favors laissez-faire economic policies rather than government regulation; how we support corporate tax breaks that lead to job creation in place of stronger entitlements; how we believe in equality for every American, even the Americans still within their mothers’ wombs?

Contrary to popular thought, not every conservative is against same-sex marriage and not every liberal is in favor of signing nuclear deals with Iran. As uncomfortable as it may be, it is vital in a democratic society to encourage open conversation and debate rather than pigeonholing people into assumed beliefs.

So allow me to start the conversation by debunking some partisan myths. I’ll admit that I am similar to other conservatives in that I am an advocate for limited government, energy independence and entitlement program reform. However, I do hold a few beliefs that aren’t shared by many others in my party.

For one, capital punishment. Call me a softie, but I’m not a fan of the flawed human race having the capability and authorization to sentence one another to death. Besides, I find the judicial process imbalanced in several states. Although 42% of death row inmates are black and 43% are white, cases involving white victims rather than black victims are significantly more likely to result in a death sentence—a disparity that I find unethical.

“Gun control”: the two scariest words to conservatives. As a born and bred Southern girl, I’m in favor of protecting our Second Amendment rights. However, gun control and regulation are two totally different things. The prevalence of guns that are bought and sold illegally at gun shows is staggering, and each state has different laws regarding gun sales, licensing and concealment. The horrific number of shootings in recent years has made additional conversation regarding arms registration and licensing imperative. It isn’t an issue that we should shy away from.

Undoubtedly, the beliefs of individuals don’t always fall within the dogmas of party lines. Therefore, not every political issue is black and white, nor is every political party segmented by skin color. Contrary to popular belief, the Republican Party is not composed solely of white Americans. In 2014, I worked as the campaign manager for an incredible Congressional candidate, Glo Smith. A Jacksonville, Florida native, Glo was a gorgeous, kind-hearted non-career politician running for Congress in a primarily black district. Going door to door during the campaign, our volunteers were astonished by the number of constituents who assumed Glo was running on the Democratic ticket simply because she was a black woman.

The Republican Party is not a party of exclusion and isolation any more than the Democratic Party is a party of entitlement or marginalization. There are plenty of black American Republicans just as there are copious numbers of white Democrats. There are also old Libertarians and young Tea Partiers as well as wealthy individuals in favor of higher tax brackets and low-income citizens in favor of Social Security reform.

So let’s stop segregating one another with labels and assumptions. Instead, let’s allow the bright reds and deep blues of the political parties to fall to the ground as we enter an election cycle marked not by ideological hatred, but by a willingness to listen and hear the voices of others.

Brittany Tony wrote this article for xoJane.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME society

This Is What Gay Marriage and Obamacare Have in Common

Two cases before the Supreme Court point to the long-running battle between state rights and federal authority

I don’t drink champagne, but if the Supreme Court strikes down state bans on gay marriages this month, I might pop open a bottle in celebration. As a newspaper editorial writer and editor, I’ve been waiting a long time for this one, having fought two publisher bosses in two different cities, going back to the mid-1990s, to editorialize in favor of gay marriage. I won the second fight, but barely, at The Los Angeles Times, some nine years ago.

A Court decision that relies on our federal constitution to legalize gay marriage across the country would be a triumph for individual liberty, common sense, and human decency. It would also amount to a well-deserved blow against that most persistent of villains throughout American history: the destructive creed of states’ rights and state sovereignty.

That same creed is at issue in the Obamacare case that is also expected to be decided this month, as the Court concludes its current term. At first glance, the Affordable Care Act and the institution of gay marriage don’t seem to have much in common as litigation subjects, but this case, too, is as much about the proper relationship between the states and the federal government as it is about anything else – which is true of so many of our political and legal fights these days.

King v. Burwell, the Obamacare decision, is a fluke of a case, an opportunity for opponents of the law to take another swing at the piñata (which they damaged, but did not break, in an earlier challenge) by capitalizing on some careless legislative drafting. The law allows the federal government to provide subsidies to lower-income insurance customers who sign up for coverage on the new exchanges “established by the state.” Trouble is, pursuant to other sections of the law, it was the federal government that ended up establishing an exchange for those states that refused to establish their own – and no one involved in drafting the law intended for those customers to be denied the same subsidies available to people signing up for coverage on a state-created exchange. Now, in their feverish desire to interfere with the relationship between American citizens and their national government, opponents of the law are hoping the Supreme Court will cut off 8 million people from the support and coverage they are receiving.

As we await these landmark decisions that are so of the moment, it’s worth reading Joseph J. Ellis’s new book, The Quartet: Orchestrating the Second American Revolution, 1783-1789. It’s a masterful reminder of how timeless this tension is between the concept of the United States as a singular nation and the United States as merely a confederation of sovereign states.

Ellis chronicles how four of our more visionary Founding Fathers – George Washington, John Jay, Alexander Hamilton, and James Madison – recognized from the earliest days after independence that the individual states, and the excessive power retained by them under the loose Articles of Confederation, were a serious threat to the promise of the American Revolution.

Hence this influential “quartet” pushed for the 1787 Constitutional Convention in Philadelphia. Washington’s greatness lay in the fact that, from his earliest days leading the Continental Army, he transcended his narrow identification with Virginia, to think more broadly in terms of an American nation. He came out of self-imposed retirement to lend his enormous credibility to the Philadelphia proceedings. Washington wrote at the time (in what can be read as a challenge to pro-confederation Virginians then, but also to Virginia Confederates who’d secede from the Union in the following century): “We are either a United people or we are not. If the former, let us, in all matters of general concern act as a nation… If we are not, let us no longer act a farce by pretending to it.”

Ellis captures the rare brilliance and admirable foresight of Washington’s three intellectual partners in this quest – Jay, Hamilton, and the first president’s fellow Virginian, Madison. All three men had a clear vision of an America destined to be a unique power in the world, defined by its collective sense of purpose and its citizens’ liberty. They understood that to survive, and thrive, as a continental power, the United States needed a stronger national government representing, and protecting, all of its people.

Madison, often cited as the father of the Constitution, lost plenty of battles at Philadelphia, starting with his bedrock insistence that sovereign power be shifted entirely from the states to the central government. Madison gave up on what he initially considered his non-negotiable demand for a federal veto power over state laws, as he would later have to surrender on his proposal that some of the Bill of Rights also limit the power of states. Though the closest of political partners at other times, Madison and Jefferson disagreed vehemently over whether it was state governments or the new federal government that would be the biggest threat to individual liberty and rights, and history has proven Jefferson spectacularly wrong in that debate. It’s hard to blame him: Madison’s (and Hamilton’s) belief that the larger, more distant national government could be a more representative embodiment of “We the People” was a very modern concept.

But being so ahead of their time limited The Quartet’s contemporary success. They were able to remedy the immediate flaws of the Articles of Confederation, bind the new nation closer together and set it on the right course, but their new Constitution, by political necessity, was riddled with fraught compromises – such as the electoral college and the equal vote of each state in the Senate – whose underlying tensions would define much of American history.

Abraham Lincoln ratified and reinvigorated the Quartet’s accomplishment to the point where he deserves to join Ellis’ crew, and make it a Quintet. The Civil War and its aftermath – especially the 14th Amendment on which the gay marriage case should hinge – delivered on the Madisonian concept of a federal government empowered to protect citizens – especially minorities – from the bullying of local and state authorities (i.e., majorities). But that doesn’t mean the fight is over.

Nowadays we don’t often think about these federalist debates that have haunted our history, because we are too busy – and this goes for both conservatives and liberals – gaming the tension between Washington and state capitals. Even within the gay marriage legal fights over the last decade, both sides have taken turns, depending on the prevailing winds, arguing in favor of a state’s right to define marriage for itself, damn what the rest of the country thinks.

Too rarely do we ask ourselves the more fundamental question: are we citizens of California or Texas – or of the United States? If the Quartet had invented a time machine and paid us a visit, they’d be astonished at the resilience of the state sovereignty creed, despite all we’ve been through as a nation. Too many Americans stubbornly cling to the belief that the United States is a confederation in which citizens’ fundamental rights – on issues like marriage, access to baseline health care, and what is taught in their public schools – can and should vary across state lines, to accommodate local biases.

Let’s hope in the coming days and weeks that five such Americans aren’t sitting on the Supreme Court.

Andrés Martinez is the editorial director of Zócalo Public Square and a professor at the Walter Cronkite School of Journalism at Arizona State University.

TIME Culture

Why I Won’t Wear War Paint and Feathers in a Movie Again

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

As a Navajo actor, I've learned where Hollywood likes to stick its 'Indian' roles—and where to find real Native American creativity onscreen

At some point, every Native American actor comes to a career crossroads and has to answer the question: Do I participate in stereotyping or maintain my cultural integrity?

As a Navajo man, I answered that question early in my acting career. Fresh out of Yale with a bachelor’s degree in film studies, I moved to Albuquerque in 2010 when the New Mexican film industry was booming. To build up my resume, I took on parts in various short films—including one memorable role as an “Indian” shaman.

Acting parts for Native Americans are few and far between, so I felt I couldn’t say no to the gig. But as I climbed into the feathered costume and began to apply “war paint” to my face, I began to feel very uncomfortable. Even though I’m not of a Plains tribe (as of 2013, the number of federally recognized tribes in the U.S. was 566), I knew that this kind of regalia was not meant for casual, everyday wear. For many tribes, including mine, feathers are sacred.

Looking at myself in the mirror in full costume, I felt ashamed for mocking my spirituality. I promised myself I’d never play “Indian” again—and since then have turned down several auditions for big budget films.

Last month, 12 Native American actors walked off the set of Adam Sandler’s forthcoming comedy, The Ridiculous Six. A few days later, Indian Country Today Media Network leaked several pages from the script, which features jokes depicting Native Americans as dirty, animalistic backdrops.

The film’s producer, Netflix, was quick to defend Sandler’s jokes as “a broad satire of Western movies and the stereotypes they popularized…” In actuality, however, these jokes aren’t anything novel or creative. They’re uninspired facsimiles of old stereotypes that stem from late 19th-century Wild West Shows.

Starting in 1883 with Buffalo Bill, these shows toured the United States to display their “tamed” wild Indians in extravagant rodeo performances. Many prototypes of Native American stereotypes (such as living in teepees, hunting for buffalo, scalping enemies, wearing feathered regalia, and having a savage demeanor) gestated in these vaudevillian theatrics.

Often, Native Americans in Wild West shows reenacted crippling defeats such as the Wounded Knee Massacre. These shows celebrated the conquest of the West and the decimation of the Native American population, but for the Native American actors who participated in them, they were also a means of earning money for their families.

In 1913, director Thomas Ince hired Native Americans who had performed in those traveling shows to work at his film production studio in Santa Ynez Canyon near Santa Monica. He also recruited several Sioux people from the Pine Ridge Reservation in South Dakota. In exchange for room and board, the actors were cast in Ince’s films or loaned out to other directors.

During that decade, white directors like Ince, Cecil B. DeMille, and D.W. Griffith laid the foundation for the Western narrative in Hollywood by borrowing heavily from Wild West shows. In Griffith’s The Battle at Elderbush Gulch (1913), an unspecified tribe of savage “Indians” celebrated their annual “feast of dogs” before raiding a nearby white establishment.

Silver-screen tales about defeating Native American tribes proved to be hugely popular, so Hollywood churned them out. In most Westerns, white cowboys represent the shining future, whereas “Indians” are the dimming past. Cowboys are logical. “Indians” are irrational. Together, cowboys and Indians are the ego and the id of Anglo-Saxon identity.

But even though Western movies were brimming with stereotypical “Indian” roles, making a living in the film industry was difficult for Native American actors, many of whom left reservations after World War II to work in L.A. In Reimagining Indian Country, Nicolas Rosenthal writes about how the more prominent, higher-paying “Indian Chief” roles went to non-Native American actors, while Native Americans were stuck in the background—and paid a lower rate than other actors in the same supporting parts.

In 1926, several Native American actors created the War Paint Club, to provide support to Native American actors looking for work in L.A. and to encourage filmmakers to cast them in the “Indian Chief” roles. The War Paint Club also demanded that film companies pay Native American actors the same rate as non-Native American actors. They organized public powwows in the hopes of dispelling negative stereotypes perpetuated by Westerns.

The War Paint Club evolved into the Indian Actors’ Association in 1936, led by Luther Standing Bear, William Eagleshirt, and Richard Thunderbird. That in turn was later absorbed into the Screen Actors Guild in the early 1940s.

During the American Indian Movement era of the ’60s and ’70s, television news channels broadcast the occupation of Wounded Knee, in which actor Wes Studi took part, and a wider United States audience was exposed to the actual conditions of reservation life.

Around the same time, the “Indian” stereotype evolved from reactionary savage to romantic victim of manifest destiny — from the vicious and bloodthirsty Geronimo seen in Stagecoach (1939) to the Geronimo who fights to protect his tribe in Geronimo (1962).

This softening trend grew, as Native American actors assumed more prominent (if still rather one-note) roles, as with Will Sampson in One Flew Over the Cuckoo’s Nest (1975), Graham Greene in Dances with Wolves (1990), and Russell Means in The Last of the Mohicans (1992).

Outside of the Hollywood system, Native American artists continually wrote, produced, directed and acted in their own short film productions. In 1909, James Young Deer, of the Nanticoke tribe, began his directing career with The Falling Arrow. In 1966, several Navajos near Pine Springs, Arizona, participated in an anthropological study that produced several short films known collectively as Navajos Film Themselves. Victor Masayesva, Jr. directed Weaving in 1981.

Starting with Chris Eyre’s Smoke Signals in 1998, Native American filmmakers began producing feature-length movies on par with the Hollywood production system. In Canada, Zacharias Kunuk brought an Inuit legend to life with Atanarjuat: The Fast Runner (2001), while Georgina Lightning explored the horror genre with Older Than America (2008). Director Neil Diamond explored the birth of the Hollywood “Indian” stereotype in the documentary Reel Injun (2009). Jeff Barnaby released his visceral Rhymes for Young Ghouls (2013). And Sterlin Harjo examined Muscogee-Creek hymns in his documentary, This May Be The Last Time (2014). Just to name a few.

Meanwhile, the Hollywood mainstream has cranked out a fledgling resurgence of Westerns with (mostly panned) movies such as Cowboys & Aliens (2011) and The Lone Ranger (2013). In these projects, Native American actors have been restricted to background roles—which brings us back to some of what’s wrong with The Ridiculous Six.

I have personally experienced the level of ignorance that results from one’s only exposure to a culture being what one sees in movies. During my orientation week freshman year in 2006, many of my classmates, when they discovered my Navajo heritage, seemed to think I lived in a teepee and hunted buffalo in the plains on horseback. (For the record, Navajos are primarily farmers and shepherds. Our traditional houses, hogans, are used mainly for ceremonial purposes. We drive cars to get to places. So, no.)

Further, they wanted to know why I didn’t wear any feathers or have long, black hair. I was shocked by how little my fellow students knew about Native Americans, and how much they based their perception of me and my heritage on what they had seen in westerns.

The most troubling aspect of The Ridiculous Six is how the script depicts Native American women as promiscuous, by using names such as “Sits-On-Face.” This may be presented in a spirit of levity for an audience that appreciates fart jokes as much as Sandler, but it undermines the dire circumstances of Native American women, who experience high levels of sexual assault and violence.

In all likelihood, of course, Sandler’s team wasn’t aware of these disturbing statistics when they began writing The Ridiculous Six. But their ignorance isn’t an excuse. Such carelessness with racist, sexist jokes can establish misperceptions that are hazardous in real life to real people.

Brian Young is a Navajo filmmaker currently living in Albuquerque, New Mexico. Among his projects are two short animated films (Lady and the Eagle and Rainbow Bird) and Yeego Nitl’aa’, an exercise video series narrated in Navajo. He wrote this for Thinking L.A., a partnership of UCLA and Zócalo Public Square.


TIME Parenting

Ed Sheeran to Kids Who Stutter: Embrace Your Weirdness

Ed Sheeran gave this speech at the 9th Annual American Institute for Stuttering Benefit Gala

This is the second award I’ve ever got in America, so that’s pretty nice.

I didn’t actually know I was getting an award tonight, because I didn’t expect one. I was coming here to support the cause. I got an email from Emily [Blunt] a couple of months ago telling me about the thing, I said, “of course I’ll turn up.” So turning up today and being told you’re getting an award is pretty wild, but yeah.

I was a very, very weird child. Very weird child. And I had a port-wine stain birthmark on my face that I got lasered off when I was very young, and one day they forgot to put the anesthetic on, and then ever since then I had a stutter—and I also had very, very big blue NHS glasses – NHS is the National Health Service, one day, I hope you’ll have the same.

And I lacked an eardrum on one side of my face—one side of my ear—so stuttering was actually the least of my problems when I went to school, but it was still quite a difficult thing, and the thing that I found most difficult about it was knowing what to say but not really being able to express it in the right way.

So I did different speech therapies and stuff, which wasn’t very successful. I had homeopathy, which is like herbs and s—, where you’re drinking… It’s alright.

But I got heavily into music at a young age, and got very, very into rap music—Eminem was the first album that my dad bought me. I remember my uncle Jim told my dad that Eminem was the next Bob Dylan when I was—say what you want, it’s pretty similar, but it’s all just story-telling. So my dad bought me the Marshall Mathers LP when I was nine years old, not knowing what was on it. And he let me listen to it, and I learned every word of it back to front by the age I was ten, and he raps very fast and very melodically, and very percussively, and it helped me get rid of the stutter. And then from there, I just carried on and did some music, but it’s I think the one thing I actually wanted to convey in my speech today for not so much the adults here because I feel like the adults are fine—you’re solid, everybody’s got a lot of money and everyone’s chillin’. But more the kids that are going through the therapy, and I want to stress the point that it’s not—stuttering is not a thing you have to be worried about at all, and even if you have quirks and weirdness, you shouldn’t be worried about that. I think the people I went to school with that were the most normal and were the coolest when we grew up—I was telling Emily earlier that one of the cool kids from school now does my plumbing. So that’s a fact. That’s a fact, so being my thing that I want to stress most here tonight is not necessarily to shed light on stuttering or make it a thing. It’s just to stress to kids in general is to just be yourself ‘cause there’s no one in the world that can be a better you than you, and if you try to be the cool kid from class, you’ll end up being very boring, and doing plumbing for someone that you don’t really want to do plumbing for.

And just be yourself, embrace your quirks—being weird is a wonderful thing. But I think, you know, I’m not very good at speeches, I don’t really do a lot of speeches but I think the one thing I want to say is be yourself, embrace yourself, embrace your quirks, and embrace your weirdness.

And from a stuttering point of view, don’t treat it as an issue—work through it and get the treatment that you want to get, but don’t ever treat it as an issue, don’t see it as a blight on your life, and carry on pushing forward. And I did alright—I did alright is all. Emily did alright. Nice, thank you.

TIME women

Why I Don’t Want to Have Children


I’ve spent years carefully crafting the most amazing life I can

What I want is to be happy.

I’m often told that I’d make a good mother. Depending on my relationship with the person making this wildly incorrect statement, I have one of two reactions: either a small, insincere smile and a “mmmm” response that does not invite further discussion or a hearty laugh followed by a firm “No.”

Don’t get me wrong: I love kids. They’re hilarious, they’re adorable, and I (mostly) enjoy spending time with them. But without a doubt, I do not want them. And here’s why.

I don’t want to worry about diaper rash and “tummy time” and I don’t want to know what colic is.

I don’t want to put a kid on a kindergarten waiting list and I don’t want to decide between public and private education. I don’t want to coordinate basketball practice drop-off with ballet lessons pick-up, I don’t want to help with trigonometry and darling, I will not deal with your teenage angst because you best believe I invented that. I’d rather have bamboo shoots shoved under my fingernails than try to figure out how to pay for my child’s college while I still owe roughly twelve kajillion dollars for my own degree. I’ve more than once done something “just to tell the grandkids about it,” but I never actually planned on there being any grandkids.

It amuses me to tell people I don’t want children because no one ever quite knows how to respond. I’ve gotten “Well, when you meet the right guy, you’ll change your mind,” which is basically suggesting I’m incapable of making decisions regarding my own life without consulting a nameless, faceless FutureMan and is, by the way, astonishingly offensive. Others immediately ask what I do for a living, as though my employer holds the key to my womb and has locked it up until I retire. I don’t really consider myself a career-minded kind of girl; I’ve always worked to live, not lived to work.

Two mothers have actually said to me, “I didn’t know what love was before having a baby. You should reconsider.” I’m happy they’re happy now but “not knowing love before kids” is one of the most acutely sad things I’ve ever heard. Occasionally, I get a hearty “yeah!” from like-minded women, some of whom will eventually become mothers and some of whom will not. I appreciate the support.

But at this point, it doesn’t matter how much anyone tries to change my mind because the decision’s been made — permanently.

Last October, I spent a wonderful morning with my doctor, during which he performed a tubal ligation on me.

Yep, I got my tubes tied at 28.

I admit that once my doctor agreed to perform the surgery, I had a moment of panic. It immediately crossed my mind that maybe everyone was right and I was wrong and I would wake up at 30 and want a baby more than anything in the world or that maybe my “hard pass” on kids was a rebellion against expectations simply for the sake of a rebellion.

Maybe I would love the complete upheaval of my priorities and schedule and life in general. Shortly after these hysterical thoughts raced through my mind, though, I regained my sanity. I picked a date for the surgery. Done. Tubes tied.

Here’s the thing: I’ve spent years carefully crafting the most amazing life I can.

I’m surrounded by people I love very much, who love me in return. I’m well-educated and well-traveled. I have endless time to learn about things that interest me and to see wonderful things and to meet the greatest people on earth. I leave piles of library books all over my bedroom and plan fabulous trips all over the world. I stay up until 6 a.m. watching Sons of Anarchy because I know no small person is relying on me to feed them in a few short hours. I occasionally eat chips and salsa for breakfast and drink beer for dinner and feel no guilt that I’m teaching anyone horrific eating habits. I spend my days finding my bliss, like all the inspirational posters beg of me.

All this being said, I can’t wait to be an auntie. Whenever my friends start popping out kids, I’ll be there with inappropriately loud and expensive presents. I’ll be the aunt who slips them a vodka martini on their 16th birthday and I’ll rant and rail with the best of them whenever they feel slighted by other kids.

And when I’m off for six months teaching scuba in Venezuela, I promise to send lovely postcards.

I get the reasons people want kids. I do. I’m not such a heartless, selfish monster that I’m incapable of understanding the appeal of a small person who loves you unconditionally and relies on you to guide them safely through a scary world. Parents are brave and strong and incredible people. But so are astronauts and brain surgeons and I don’t want to be those things, either.

What I want is to be happy.

And I’m doing that. I’m there, I’m living that dream. I’m happiest not being a mom, but hey… Call me if you need a babysitter. I’m great in a pinch.

This article originally appeared on YourTango.



TIME society

Captain America Dons a Turban

Zocalo Public Square is a not-for-profit Ideas Exchange that blends live events and humanities journalism.

Armed with a beard, a shield, and a sense of humor, I learned why the U.S. needs new superheroes

I was born in our nation’s capital in the early 1970s – but sometimes when people see me in my turban, they think of conflicts in faraway lands, terrorism here at home, Hollywood caricatures, and sensationalized news coverage.

Donning the costume of a superhero—complete with unitard and shield, in addition to the turban of my Sikh faith—changed all that. Suddenly, there was no question that I was American.

Like any good comic book, there’s an origin story. Mine covers moving thousands of miles away from home after high school, trying to make invisibility my superpower, and fending off a constant barrage of hate speech.

Sikhs believe in the innate equality of all, striving to merge with energy that traverses every speck of the universe. Fighting against injustice and practicing the art of compassion are part of our spiritual practice.

As a physical manifestation of this journey, Sikhs must keep their hair long and unshorn as a natural extension of the human form. This was how I grew up experiencing my faith, though otherwise I had been brought up fairly secularly. (Our Sunday ritual was a trip to the butcher to buy fresh meat, rather than a temple visit.)

I first experienced hatred directed at me because of my religion in India, where I spent much of my childhood after my parents moved the family back to their country of origin in 1975. Following the assassination of the Indian Prime Minister Indira Gandhi in 1984 by her two Sikh bodyguards, Sikhs were hunted down on the streets of many Indian cities.

We were fortunate to survive the fury of an angry mob that surrounded the apartment building where my family lived in New Delhi. Many Sikh men and boys were not so lucky: across the Indian capital and other cities, they were doused with gasoline and burned alive.

When I graduated from high school, I moved back to the United States, hoping for a calmer transition to adulthood in Los Angeles. However, I began to encounter hatred of a different variety: offensive calls of “genie,” “clown,” and “raghead,” and laughter at my appearance.

In college, I felt overwhelmed at being stereotyped so much that, by sophomore year, I decided to take off my turban and clip my long hair, which had not been cut since I was born. After a short trip to the barber, all eyes were suddenly off me. I had magically transformed and did not stand out anymore.

I wouldn’t don a turban again for almost 10 years. First, I would fall in love with the words of Asimov, Plato, Nietzsche, Abbott, and Freud. I would explore meditation, Buddhism, and Taoism. Finally, I came back around to the Sikh faith through experiences with the religion’s music.

In August 2001, I put my Sikh turban back on. Only a month later, the horrific Sept. 11 attacks happened. As we watched the TV, shocked and horrified on that terrible day, I remember a coworker looking at me with bloodshot eyes, on the verge of tears, as if I was somehow responsible for these attacks. That was a prelude to a new normal in America that would look at my turbaned and bearded countenance as the ultimate “other” in our midst. The most common racist insult hurled my way ever since has been “Osama,” even after bin Laden was killed.

In the aftermath of Sept. 11, after people who look like me were the victims of hate crimes by bigots across the United States, a piece by Pulitzer Prize-winning cartoonist Mark Fiore would change the course of my life. Fiore’s animated cartoon entitled “Find the Terrorist” prompted users to click on the faces of people of all races and countries of origin as a way to point out our own witch-hunting tendencies and prejudices.

The cartoon so powerfully captured my identity and feelings that it inspired me to create my own cartoons featuring Sikh protagonists. In late 2002, my website Sikhtoons.com was born—and it has since provided a way to channel all my whims, frustrations, and inspirations, in cartoon form.

As my website gained attention, I began traveling across the U.S. to showcase my work and host cartoon workshops, mostly at Sikh gatherings and events. In 2011, in preparation for my first trip to the New York Comic Con, I drew a bearded Captain America wearing a turban, inspired by my experience on the streets of America and the release of the Captain America movie that summer. With a flash, reality and fiction collided to present this vision in my imagination.

A photojournalist named Fiona Aboud saw the drawing at the convention and suggested that I come back next year actually dressed as Captain America myself. I swiftly responded, “No.” I had never worn a costume, ever, and being teased my whole life for my skinny frame had further taught me to avoid drawing attention to myself.

Almost a year later, the massacre of six worshippers at a Sikh temple in Wisconsin in 2012 at the hands of a white supremacist prompted me to pen a few cartoons and an op-ed in the Seattle Times, making the case for a new American superhero who doesn’t take on Nazis, mad scientists, or communists, but rather takes down the real evil-doers in our society: those who commit hate crimes.

More than ever, we need a hero to fight intolerance and suspicion of people who are not like us, forces that are ripping our country apart. Fiona emailed me after reading the piece and made a second request for me to dress up as Captain America. This time I agreed to her request for a photo shoot on the streets of New York City.

The shoot took place on a sunny summer day in 2013. It was one of the most amazing days of my life. Hundreds of onlookers snapped pictures of me and with me; police officers posed with me in photos for their kids; strangers hugged me; and I even got roped into participating in a wedding party (complete with a photo-op with the bride and groom).

An essay I wrote about the experience on Salon.com, which had six photos from Fiona’s shoot, went viral, gathering over 50,000 likes on Facebook. The images in the article keep finding a home to this day, on blogs and websites around the globe.

I have now traversed the country in my spandex uniform (later upgraded to the one featured in 2014’s Captain America: The Winter Soldier), from New York to Los Angeles, Kansas to Mississippi. My trips have taken me to universities, museums—even the backstreets of NYC with a late night comedy crew – to see corners of America I never thought I’d be invited to.

I have received messages of support from Americans in every walk of life: police officers, veterans, active service members, teachers, conservatives, liberals, men, women, black, white, Asian. The embrace of fellow Americans reinforced my belief that we have much more in common than our eyes lead us to believe, that we all want to believe in a superhero that embodies the goodness of America, even if that superhero doesn’t resemble the clean-cut Chris Evans. And we can all have a good laugh about what I look like in a unitard.

Along the way, I have learned how much my own insecurity about my body keeps me from taking risks, and experiencing life’s many surprises.

I know comic superheroes are not real. In the American tradition, they have long been an extension of the imagination of many young immigrants. Young Jewish Americans of Eastern European descent, who survived the Depression era and battled forces of anti-Semitism, wound up creating one of the most iconic of superheroes—the Man of Steel, Superman.

Superheroes are always in our midst, in a sense. It turns out that just the uniform of a fictional character from the early 1940s, Captain America, created to fight the Axis powers in World War II, does possess a real superpower: It opens doors to new conversations and new visions of what our country can look like as its best self.

Vishavjit Singh is leading a double life. By day, he is a software analyst. By night (and on weekends), he creates cartoons that can be consumed at Sikhtoons.com. He can be reached at @sikhtoons and vsingh@sikhtoons.com. He wrote this for What It Means to Be American, a national conversation hosted by the Smithsonian and Zocalo Public Square.


TIME society

Here’s a New Way to Show Your Support for Caitlyn Jenner

Start by finding a Diet Coke bottle with the name Bruce on it

You know how you can occasionally find Coke bottles with your name on them? Well, somebody happened to find a bottle with the name Bruce and pulled off some topical humor:

Yes. Just yes.

(A photo posted to Instagram by thefatjewish, @thefatjewish)

This is, of course, a nod to Caitlyn Jenner, formerly known as Bruce, who recently came out as transgender, revealing her new name and image on the cover of Vanity Fair.

According to Coca-Cola’s Share a Coke site, there are already bottles with the name Caitlyn out there — but we still appreciate the ingenuity here.

TIME society

Who Is the Freelance Economy Hurting?


The freelance economy is troubling given the unbending and unforgiving realities of today’s economic environment

These days, it’s all about the freelance economy—or what many experts envision as the new economic normal. As the economy makes its bland recovery, flexible (no-benefits and low-cost) freelance (a.k.a. “shared,” a.k.a. “contingent”) work is all the rage.

But who benefits from this supposed freelance boom? The growth of contract work, driven by companies such as the ride-sharing and taxi-obliterating Uber, along with its competitor and policy partner-in-sector Lyft, has already sparked some good old-fashioned turf wars over innovation. And while the fever-pitched combat between traditional taxis and ride-share services might seem like just another nose-breaking scrap between rivals in a market space, there’s even more to it than that. The outcome promises to radically and forever reshape what it means to be a worker.

The Freelancers Union excitedly sneezed out its “National Survey of the New Workforce” last September, boasting about the 53 million workers disrupting the typical economic model. Gone are the days when the likes of Ward Cleaver strolled through the front door after a 9-to-5 white-collar grind. First add June to that equation—she’s working, too, while the Beav and Wally are left latch-keyed to their own smartphone-fueled devices because their parents’ freelance work comes without the benefit of regular hours. The generational impact of the “freelance revolution” doesn’t stop at the nuclear family—imagine Millennials hunched over laptops in the local Starbucks or overwhelmed Gen-X parents hustling for both paycheck and flexibility.

Let the numbers tell it, and it’s all good—advantage innovators. Popular freelance economy pioneers like Uber enjoy eye-popping valuations of $40 billion or more, thereby validating an emerging freelance ideology. Freelancers, according to the Freelancers Union, are also contributing more than $700 billion of productivity to our $14 trillion economy, which is solid and respectable. As the economy rapidly reconfigures and technology pushes us further into automation, the segment of the workforce that’s contracted will also rise from today’s 34 percent to an astonishing 50 percent by 2020. Hence, the trend shows no sign of reversing, but rather more signs of metastasizing. It’s reasonable to assume that, in our collective lifetimes, freelance or contractual work will be the fundamental core of our global labor market. There will be way more freelancers than full-time permanents. A world of full-timers and full-time-nots looms just over the horizon—and for some workers, it’s already here.

That makes the Affordable Care Act not just conveniently timed, but rather prescient. Permanent non-freelance jobs won’t be an economic staple in 10 years. Instead, they’ll be something of a highly-prized and rather rare privilege.

From a white-collar perspective, freelancing seems like an efficient fit. Philosophically, it appears to thrive off the notion of high-octane entrepreneurship, an attractive social construct where we control our own agency.

But the freelance economy is troubling given the unbending and unforgiving realities of today’s economic environment. Among traditionally underserved populations, who face income inequality, stagnant wages, and underemployment, disrupting tech enthusiasts prompt more anxious questions than giddy answers. A ballooning freelance workforce means a permanent state of non-permanent wages, adding more uncertainty to an economic environment saddled by stuck income. As Pew found recently, “the average wage peaked more than 40 years ago: The $4.03-an-hour rate recorded in January 1973 has the same purchasing power as $22.41 would today.” While poverty is at 15 percent, economic inequality in the United States is obscene, a place where the top 20 percent own 84 percent of … well … everything.

Anecdotally and statistically, we see persistent public anxiety about the economy. A POLITICO poll found that 64 percent of respondents felt the country was “out of control,” and only 36 percent believed it is in a “good position to meet its economic and national security challenges.” When a subsequent POLITICO story highlighting economic concerns as a central issue in the upcoming elections dropped, it was peppered with quotes from average voters expressing “raw” concerns about matters such as “outsourcing” and “job growth.”

The question of who benefits becomes more pressing with each passing year the freelance economy grows. Interestingly enough, the decline of purchasing power since 1973 seems to mirror the upward trend of the “contingent” economy during that same period. And, along with the recession, it also means—eventually—that large segments of the population are getting left behind or will remain behind. Already, as Prospect’s Virginia Durivage pointed out some time ago, “most contingent workers are women and minorities clustered in low-wage jobs with no benefits or opportunities for advancement.”

Official unemployment rates released each month by the Bureau of Labor Statistics make recovery feel as fresh as a detergent commercial, but these reports slickly ignore other indicators such as underemployment or diminished labor force participation rates that actually show joblessness is much higher than we think. One factor, perhaps, could be a rising freelance mindset as fed-up workers tap out of traditional models in a bid to make it on their own.

Another obvious factor is a society still largely discriminating on the basis of race and perceived status, a condition for which the freelance economy may have no solution – especially if, as USA Today showed in a recent analysis, “top universities turn out black and Hispanic computer science and computer engineering graduates at twice the rate that leading technology companies hire them.” Would embracing a dominant freelance economy make that situation worse? It’s unclear at the moment. What we can see, for example, is that freelance pioneers like Uber are disrupting taxi industries largely populated by drivers of color: while 32 percent of taxi drivers are black, less than 20 percent of Uber drivers are, compared to more than 40 percent who are white. Asian and Latino Uber driver rates are, however, nearly identical to their proportions among taxi drivers, even while still low (17 percent each) when compared to white Uber drivers.

That probably doesn’t hint at any pattern of hiring discrimination on the part of Uber when it clears and selects contracted drivers. But what is clear is that companies reliant on outsourcing as their primary operational model have more incentive to circumvent (or altogether neglect) worker rights than industries did when they relied more heavily on permanent positions.

Charles D. Ellison is a veteran political strategist and Contributing Editor for The Root. This piece was originally published in New America’s digital magazine, The Weekly Wonk. Sign up to get it delivered to your inbox each Thursday here, and follow @NewAmerica on Twitter.


TIME world affairs

How FIFA’s Sepp Blatter May Have Outmaneuvered Everyone

Read his "resignation" letter carefully. It isn't what you think

And so Sepp Blatter has defied all expectations and announced his intention to step aside from the presidency of FIFA after 17 years at the helm. Despite numerous scandals afflicting the organisation he ran, he won four successive elections. Now, it seems, the long arm of American law has finally reached close enough to FIFA’s heart to force its leader to step down.

FIFA has been part of Blatter’s life for 40 years. He was headhunted by Horst Dassler, the CEO of German sportswear firm Adidas, and learned his trade at Adidas’ headquarters in Landersheim. He then became a technical director in 1975 before assuming the role of Secretary General in 1981. He finally ousted his mentor, João Havelange, in 1998, to become president.

Significantly, despite every news organisation stating that he had resigned, he did not use the word in his brief press conference. The masterful politician remained in control until the end, and left us not entirely certain if it is indeed the end.

The key passage in Blatter’s announcement stated:

I have decided to lay down my mandate at an extraordinary elective Congress. I will continue to exercise my functions as FIFA President until that election.

Until we hear differently – and this is a fluid situation – he is still at FIFA. More importantly, he is setting the agenda.

Role reversal

In other words, we may have just witnessed a piece of skill and mastery to remind us of Argentina hero Leo Messi. While his opponents are busy fighting over who will succeed him, it appears he will be setting the agenda for when they finally replace him, and the seeds were in his announcement.

The master tactician may have outmanoeuvred everyone. Within his brief announcement there are statements that should cause concern. He suggested that the executive committee must be reduced in size and its members “should be elected through the FIFA Congress”.

This looks like a clever ploy to remove the additional members that are there for historical reasons. Chief amongst these will be the anachronistic position for the home countries such as England, held due to their position as the inventors of the modern game.

Blatter continued his brief manifesto for change by saying that: “The integrity checks for all executive committee members must be organised centrally through FIFA and not through the confederations.” This is another clever piece of politics.

By centralising the checks within FIFA, he is accumulating more power for the organisation – at the expense of the regional confederations. This can be seen as a swipe at UEFA, who have attempted to act as a morality check on Blatter and his cabal. Blatter was angry that his former supporter, UEFA’s president Michel Platini, had threatened to boycott the World Cup.

Blatter is ruthless. When his relationship with his former Secretary General Michel Zen-Ruffinen soured, he established a separate administration within FIFA to ostracise him, and then had him removed. He told his former supporter Mohamed Bin Hammam that he would set a limit of two terms for presidents, to allow the Qatari to stand. When Bin Hammam did stand against him, Blatter brought down the full weight of the FIFA ethics committee onto his head.

Securing the future

Now Blatter is suggesting term limits for the position of president and executive committee roles. By bringing them in now, he neuters whoever replaces him. Despite saying that he has been blocked in the past, he has been the one in control. And he is not relinquishing it yet.

Football fans should not assume that this is the last we will see of Blatter. It is worth noting that Article 19 of the FIFA statutes explicitly states:

The Congress may bestow the title of honorary president, honorary vice-president or honorary member upon any former member of the Executive Committee for meritorious service to football.

After successfully defending his presidency only five days ago, would anyone be surprised if he was voted into an honorary role? His predecessor João Havelange was elected honorary president despite allegations of corruption.

And these positions still attract the highly generous FIFA expenses package. Blatter is a wily old fox. One does not remain atop the FIFA pyramid for so long without knowing how to play politics. Even when the world thinks he has stepped down, it may just be that he has done anything but.

This article originally appeared on The Conversation.


TIME society

This Is What the Ideal Woman Looked Like in the 1930s

She was 12 in. around the neck, 6 in. around the wrist and 19.5 in. around the thigh

The “ideal” body type has long been a topic of fascination. Whether we’re focusing on how those standards of beauty have changed over time, how clothing sizes have evolved or what dress size Marilyn Monroe really wore, it’s clear that the subject is less superficial than it may seem. Conversations about beauty are often conversations about the impacts these changing ideals have on the body images of women and girls.

Twenty years before Monroe stood over a subway grate in a billowing white halter dress, LIFE Magazine described the ideal figure American women hoped to attain. The year was 1938, and the model, 20-year-old June Cox, stood 5 ft. 6 3/4 in. and weighed 124 lbs., though life insurance statistics, the magazine said, suggested she should weigh 135 lbs.

The magazine explained that American women’s increasing involvement in sports in recent years had made them taller and flatter, and as such, “the boyish form became the vogue.” But by the late ’30s, romantic-influenced clothing had returned to fashion, and a “soft feminine figure” was replacing the athletic form as the look du jour:

The perfect 1938 figure must have curves but it differs from the perfect figure of past decades in relationship of curves to straight lines. In the 1890’s women had full bosoms, round hips. In actual measurements they were probably no rounder than Miss Cox but they seemed so because they were shorter, tightened their waists into an hour-glass effect … Now, though, the ideal figure must have a round, high bosom, a slim but not wasp-like waist, and gently rounded hips.

When it comes to issues of body image today, many blame the airbrushing of already-thin models for generating an unhealthy self-image among many women. True as this may be, women were receiving messages about how they should look long before the first love handles were magically eliminated in Photoshop.

Photo: the 1938 ideal figure. Alfred Eisenstaedt—LIFE Magazine
