TIME

Why Having Kids Won’t Fulfill You


Jennifer Aniston, take note. You haven't failed as a woman if you don't have kids.

I was struck by the comments Jennifer Aniston made to Allure magazine this week about the badgering she gets on a topic that she finds painful: her lack of children. She tells the magazine: “I don’t like [the pressure] that people put on me, on women – that you’ve failed yourself as a female because you haven’t procreated. I don’t think it’s fair. You may not have a child come out of your vagina, but that doesn’t mean that you aren’t mothering — dogs, friends, friends’ children.” For Aniston, 45, the topic is fraught with emotion. “Even saying it gets me a little tight in my throat,” she said.

I thought about Aniston’s comments—what many women in their early 40s without children are forced to feel—and then I thought about my own life. In some respects I’m Aniston’s exact opposite: I’m a 41-year-old mother of two who spent my entire adult life telling myself that children were my destiny. I did what society and my family expected, never questioning the choice. But sometimes I wonder how much of the blueprint of my life was drawn by me, and how much was sketched by experiences I had when I was way too young to be the architect of my own destiny.

For all intents and purposes, my mother was a single parent. My father left when I was twelve, but long before then my mother had taken over the head-of-household role. She worked full-time as a waitress while my father flitted between construction jobs. There always seemed to be an injury or a reason he wasn’t able to work. The image of him lying on our living room floor in front of the television is burned into my brain. He was there so much, diagonally and on his side with his head perched on his hand, that I actually thought it was odd when I went to friends’ houses and their fathers weren’t in that position. I also found it odd that my friends’ parents shared a bedroom. My dad had taken up residence on the couch for so long, it seemed normal.

It was the obviously unhappy marriage that birthed the mantra my mother would repeat to me throughout my young life: “Do not depend on a man for anything.” That was followed closely by: “You and your sister are the best things I’ve ever done.” My mother made it clear that we were her reason for living. There was never a time I didn’t feel loved by my mother. But there was also a latent message that became clear after my father left: I am not alone because I have children. If it weren’t for you two I would be falling apart.

Before I hit adolescence, I decided that children were the only things that could fulfill me when I grew older.

“I’ve always wanted kids.” I don’t think I could possibly count the number of times in my life I have uttered those words. But the same enthusiasm never escaped my lips when talking about marriage. I was never that girl who fantasized about her wedding day. So I skipped the marriage part, feeling like a renegade who was bucking the patriarchal confines of society.

It took five years for my partner and me to have a pregnancy that didn’t end in loss. After the third miscarriage, I began to panic: what if I really couldn’t have children? What would my life become? I was a bartender at the time we were trying and my partner was a musician — we were in no way financially prepared for children. But the panic and fear that the narrative I had chosen for myself so many years earlier was not going to play out made me a woman consumed.

For five years we spent month after month trying for a child. The obsession I had with ovulation calendars and pregnancy tests only paused when a test came back positive, then the obsession switched to worrying about whether the pregnancy was going to last. I gave birth to a healthy baby boy in 2010, when I was thirty-eight. I was finally a mom.

My life changed — but only the daily tasks. I was still working full-time. Once we added a baby, the only difference was we now had no downtime. I was not a new person. I was the woman I had always been; I had just added another label to my list of identifiers: friend, photographer, bartender, girlfriend, writer, mother. I had reached the endgame, and nothing about me had changed — save my ability to multitask.

My assumption that I was destined to be maternal made me never consider the idea that maybe I wasn’t. The possibility that I wasn’t actually hard-wired to mother never occurred to me until I looked into my child’s eyes for the first time and didn’t feel that thunderbolt everyone talks so much about. Those overwhelming feelings of love arrived eventually, but they certainly weren’t automatic.

Had we continued having infertility issues and not been able to conceive, I am certain that I would have felt that there was something “missing” from my life. But only because I believed the narrative my mother sold that children bring fulfillment. Since I’ve become a mother and seen that the essence of what makes me who I am has not changed, I’ve learned that nothing outside of you can fulfill you. Fulfillment is all about how you perceive the fullness or emptiness of your life. But how can a woman feel fulfilled if she’s constantly being told her life is empty without children? How can she ever feel certain she’s made the right decision if society is second-guessing her constantly?

There is nothing wrong or incomplete about building a life with a partner or alone, unburdened by the added stress of keeping another human being alive. This is something that men have always been allowed – women, not so much. A woman is constantly reminded of the ticking time bomb that is her biological clock. We don’t believe that a life without children is something a woman could possibly want. It’s why successful, wealthy women like Aniston are still asked the baby question every single time they sit down for an interview. Everyone is always looking for the latent sadness, the regret. What if it’s not there?

It’s been 40 years since the women’s liberation movement told us that having a uterus doesn’t mean we have to use it. We still don’t believe it. Whether we realize it or not, the need to tap into our maternal side is so wired into our being that we can’t escape it. If we could, there wouldn’t be debates about whether women can “have it all” or whether we are turning against our nature when we decide not to procreate.

I never questioned my desire to have children, because I didn’t have to; I took the well-traveled road. That desire is expected of me – it’s expected of all women. It took me decades to realize that the maternal drive I carried with me my entire adult life, the one that led me to try for five years to have children, may not have been a biological imperative at all. It may just have been a program that was placed into my psyche by the repeated mantras of a woman who was let down by a man and comforted by her children. That’s okay. I love my children and I’m happy about the experiences I’ve had and the paths that have led me to this place. But if this isn’t your place, whether you’re a famous movie star or not, you didn’t take a wrong turn.

 

TIME Parenting

From BFF to ‘Friend Divorce’: The 5 Truths We Should Teach Our Girls About Friendship


There's no such thing as a perfect friendship. It’s time to teach girls the truth about the complexities of BFFs.

Girls may love movies about fairytale princes, but their most captivating romance is with their friends. Every year, I stand on the stages of school auditoriums and ask thousands of girls this question: “How many of you have had a friend divorce?”

Instantly, a sea of hands shoots up in the air – this is not a term I need to define. The girls look around furtively, surprise spreading across their faces. They are astonished to discover they are not the only ones who have lost close friends.

That’s because girls receive unrealistic messages about how to have a friendship. Films and television see-saw between two extremes: mean girl-fests (think Real Housewives) and bestie love-fests (Sex and the City). Adults, meanwhile, aren’t always the perfect role models, either. The result is a steady diet of what I call “friendship myths”: find a best friend, and keep her forever. A good friendship is one where you never fight and are always happy. The more friends you have, the cooler you are.

These myths are all part of the pressure girls face to be “good girls”: liked by everyone, nice to all, and putting others before themselves. It’s a subject I wrote an entire book on, and one I see often with my students.

Research has found that girls who are more authentic in their friendships – by being open and honest about their true feelings, and even having conflicts – have closer, happier connections with each other. Yet when a girl’s social life goes awry, she often blames herself. Many girls interpret minor problems as catastrophes. Some may not even tell their parents out of embarrassment.

But there are things we can do to prepare girls for the gritty realities of real-life friendships. We can teach them that friendship challenges are a fact of life. That hiccups – a moody friend, a fight over a love interest, or a mean joke – are simply par for the course. And when we do? Girls won’t beat themselves up as much when conflicts happen. They’ll be more willing to seek out support and to move on. Instead of expecting perfection all the time, they’ll adapt more easily to stress.

Here are five hard but important truths we can teach our girls about their relationships — perhaps sparing them that traumatizing “friend divorce” later on.

There is no such thing as a perfect friendship.

A healthy friendship is one where you share your true feelings without fearing the end of the relationship. It’s also one where you sometimes have to let things that bug you slide. The tough moments will make you wiser about yourself and each other. They will also make you stronger and closer as friends.

You will be left out or excluded.

It may happen because someone is being mean to you, or because someone forgot to include you. It will happen for a big reason or no clear reason at all; it will have everything or nothing to do with you. You will feel sad about it, and as your parent, I will be there to support you.

No matter how hard you try, your apology may not be accepted.

Some people just can’t move on from a conflict. You are only responsible for your own actions, not others’. You cannot make anyone do anything they don’t want to do. If you have done everything you can to make things right on your side, all you can do is wait. Yes, you may wait a long time, maybe even forever, but I will be there to support you.

Friend divorce happens.

Just like people date and break up, friends break up, too. “Best friends forever” rarely happens; it’s just that no one talks about it. Friend divorce is a sign that something was broken in your relationship, and it creates space in your life to let the next good friend in. You may be heartbroken by this experience, but your heart is strong, and you will find a new close friend again soon. I will be there to support you.

Friendships ebb and flow.

There are times in every friendship when you or your friend are too busy to call, or are more focused on other relationships. It will hurt, but it’s rarely personal. Making it personal usually makes things worse, and being too clingy or demanding can drive a friend even further away. Like people, friendships can get “overworked” and need to rest. In the meantime, let’s figure out other friends you can connect with.

I know plenty of grown-ups who still haven’t learned these truths – and they can be painful. But that’s all part of friendship: understanding just how hard – but at the same time, rewarding — it can be.

 

Rachel Simmons is the co-founder of the Girls Leadership Institute and the author of the New York Times bestselling books “Odd Girl Out: The Hidden Culture of Aggression in Girls” and “The Curse of the Good Girl: Raising Authentic Girls With Courage and Confidence.” Follow her on Twitter @racheljsimmons.

TIME Opinion

Girl Gone Wild: The Rise of the Lone She-Wolf


A woman on a solitary journey used to be seen as pitiful, vulnerable or scary. Not any more.

The first few seconds of Wild sound like sex. You hear a woman panting and moaning as the camera pans across the forest, and it seems like the movie is starting off with an outdoor quickie. But it’s not the sound of two hikers hooking up: it’s the sound of Cheryl Strayed, played by Reese Witherspoon, climbing a mountain all by herself.

It lasts only a moment, but that first shot contains everything you need to know about why Wild is so important. It’s a story of a woman who hikes the Pacific Crest Trail for 94 days in the wake of her mother’s death, but more than that, it’s a story of a woman who is no longer anything to anybody. We’re so used to seeing women entangled with other people (with parents, with men, with children, in neurotic friendships with other women), that it’s surprising, almost shocking, to see a woman who is gloriously, intentionally, radically alone.

When it comes to women onscreen, the lone frontier is the last frontier. It’s no big deal to see women play presidents, villains, baseball players, psychopaths, superheroes, math geniuses, or emotionally stunted losers. We’ve even had a female Bob Dylan. But a woman, alone, in the wilderness, for an entire movie? Not until now.

Which is unfair, considering all the books and movies dedicated to the often-tedious excursions of solitary men, from Henry David Thoreau to Jack Kerouac to Christopher McCandless. Audiences have sat through hours of solo-dude time in critically acclaimed movies like Cast Away, Into the Wild, Life of Pi, 127 Hours, and All Is Lost. America loves a Lone Ranger so much, even Superman worked alone.

In fact, the only thing more central to the American canon than a solitary guy hanging out in the woods is a guy on a quest (think Huckleberry Finn or Moby Dick). The road narrative may be the most fundamental American legend, grown from our history of pilgrimage and Western expansion. But adventure stories are almost always no-girls-allowed, partly because the male adventurer is usually fleeing from a smothering domesticity represented by women. In our collective imaginations, women don’t set out on a journey unless they’re fleeing from something, usually violence. As Vanessa Veselka writes in her excellent essay on female road narratives in The American Reader: “A man on the road is caught in the act of a becoming. A woman on the road has something seriously wrong with her. She has not ‘struck out on her own.’ She has been shunned.”


The ‘loner in nature’ and the ‘man on the road’ are our American origin stories, our Genesis and Exodus. They’re fables of an American national character which, as A.O. Scott pointed out in his New York Times essay on the death of adulthood in American culture, has always tended towards the boyish. Wild is the first big movie – or bestselling book, for that matter – to retell that central American story with a female protagonist.

But Wild is just the most visible example of what’s been a slow movement towards loner ladies onscreen. Sandra Bullock’s solo spin through space last year in Gravity was the first step (although her aloneness was accidental, and it was more a survival story than road narrative). Mia Wasikowska’s long walk across Australia in Tracks this year was another. But Wild, based on Strayed’s bestselling memoir and propelled by Witherspoon’s star power, is the movie that has the best shot at moving us past the now-tired “power woman” towards a new kind of feminist role model: the lone female.

Because for women, aloneness is the next frontier. Despite our chirpy boosting of “independent women” and “strong female leads,” it’s easy to forget that women can never be independent if we’re not allowed to be alone.

For men, solitude is noble: it implies moral toughness, intellectual rigor, a deep connection with the environment. For women, solitude is dangerous: a lone woman is considered vulnerable to attacks, pitiful for her lack of male companionship, or threatening to another woman’s relationship. We see women in all kinds of states of loneliness–single, socially isolated, abandoned–but almost never in a state of deliberate, total aloneness.

Not to mention the fact that women’s stories are almost always told in the context of their relationships with other people. Even if you set aside romance narratives, the “girl group” has become the mechanism for telling the stories of “independent” women – that is, women’s stories that don’t necessarily revolve around men. Think Sex and the City, Steel Magnolias, A League of Their Own, Sisterhood of the Traveling Pants, Girls: if a woman’s not half of a couple, she must be part of a gaggle.

When Cheryl Strayed describes her experience of “radical aloneness,” she’s talking about being completely cut off from human contact–no cell phone, no credit card, no GPS. But her aloneness is also radical in that it rejects the female identity that is always viewed through the lens of a relationship with someone else. To be alone, radically alone, is to root yourself in your own life, not the role you play in other people’s lives. Or, as Strayed’s mother Bobbi wistfully puts it, “I always did what someone else wanted me to do. I’ve always been someone’s daughter or mother or wife. I’ve never just been me.”


And that’s the difference between aloneness and independence. The “independent woman” is nothing new – if anything, she’s become a tired catchphrase of a certain kind of rah-rah feminism. “Independence” implies a relationship with another thing, a thing from which you’re severing your ties. It’s inherently conspicuous, even performative. Female independence has become such a trope that it’s now just another role for women to play: independent career woman, independent post-breakup vixen, independent spitfire who doesn’t care what anyone thinks. And usually, that “independence” is just a temporary phase before she meets a guy at the end of the movie who conveniently “likes a woman who speaks her mind.”

Aloneness is more fundamental, and more difficult. It involves cultivating a sense of self that has little to do with the motherhood, daughterhood, wifehood or friendship that society calls “womanhood.” When interviewed by the Hobo Times about being a “female hobo,” Strayed says: “Women can’t walk out of their lives. They have families. They have kids to take care of.” Aloneness then, isn’t just a choice to focus on one’s self– it’s also a rejection of all the other social functions women are expected to perform.

In 1995, when Strayed spent 94 days on the trail, that kind of aloneness was hard to achieve. In 2014, it’s even harder. Thanks to the internet, our world is more social than ever before, and escaping other people is more difficult than ever. But aloneness is at the root of real independence; it’s where self-reliance begins and ends. So these days, if you want to be independent, maybe you can start by trying to be alone.


TIME

Viral Threats

Militants of the Islamic State are seen before the explosion of an air strike on Tilsehir hill, near the Turkish border village of Yumurtalik in Sanliurfa province, Oct. 23, 2014. BULENT KILIC—AFP/Getty Images

Why combating the extremists of ISIS is harder than fighting an Ebola outbreak

As images of brutal beheadings and dying plague victims compete for the world’s shrinking attention span, it is instructive to compare the unexpected terrors of the Islamic State of Iraq and Greater Syria (known as ISIS or ISIL) and Ebola. In October, the U.N. High Commissioner for Human Rights pointed out that “the twin plagues of Ebola and ISIL both fomented quietly, neglected by a world that knew they existed but misread their terrible potential, before exploding into the global consciousness.” Seeking more direct connections, various press stories have cited “experts” discussing the potential for ISIS to weaponize Ebola for bioterrorist attacks on the West.

Sensationalist claims aside, questions about similarities and differences are worth considering. Both burst onto the scene this year, capturing imaginations as they spread with surprising speed and severity. About Ebola, the world knows a lot and is doing relatively little. About ISIS, we know relatively little but are doing a lot.

In the case of Ebola, the first U.S.-funded treatment unit opened on Nov. 10—more than eight months after the epidemic came to the world’s attention. The U.S. has committed more than $350 million and 3,000 troops to this challenge to date. To combat ISIS, President Obama announced on Nov. 7 that he would be sending an additional 1,500 troops to Iraq to supplement his initial deployment of 1,500. And he has asked Congress for a down payment of $5.6 billion in this chapter of the global war on terrorism declared by his predecessor 13 years ago and on which the U.S. has spent more than $4 trillion so far.

Over recent centuries, medicine has made more progress than statecraft. It can be useful therefore to examine ISIS through a public-health lens. When confronting a disease, modern medicine begins by asking: What is the pathogen? How does it spread? Who is at risk? And, informed by this understanding, how can it be treated and possibly prevented?

About Ebola, we know the answers to each. But what about ISIS?

Start with identification of the virus itself. In the case of Ebola, scientists know the genetic code of the specific virus that causes an infected human being to bleed and die. Evidence suggests that the virus is animal-borne, and bats appear to be the most likely source. Scientists have traced the current outbreak to a likely animal-to-human transfer in December 2013.

In the case of ISIS, neither the identity of the virus nor the circumstances that gave rise to it are clear. Most see ISIS as a mutation of al-Qaeda, the Osama bin Laden–led terrorist group that killed nearly 3,000 people in the attacks on the World Trade Center and Pentagon in September 2001. In response to those attacks, President George W. Bush declared the start of a global war on terrorism and sent American troops into direct conflict with the al-Qaeda core in Pakistan and Afghanistan. In the years since, the White House has deployed military personnel and intelligence officers to deal with offshoots of al-Qaeda in Iraq (AQI), Yemen (AQAP), Syria (al-Nusra) and Somalia (al-Shabab).

But while ISIS has its roots in AQI, it was excommunicated by al-Qaeda leadership in February. Moreover, over the past six months, ISIS has distinguished itself as a remarkably purpose-driven organization, achieving unprecedented success on the battlefield—as well as engaging in indiscriminate violence, mass murders, sexual slavery and apparently even attempted genocide.

Horrifying as the symptoms of both Ebola and ISIS are, from an epidemiological perspective, the mere emergence of a deadly disease is not sufficient cause for global concern. For an outbreak to become truly worrying, it must be highly contagious. So how does the ISIS virus spread?

Ebola is transmitted only through contact with infected bodily fluids. No transfer of fluids, no spread. Not so for ISIS, where online images and words can instantly appear worldwide. ISIS’s leadership has demonstrated extraordinary skill and sophistication in crafting persuasive messages for specific audiences. It has won some followers by offering a sense of community and belonging, others by intimidation and a sense of inevitable victory, and still others by claims to restore the purity of Wahhabi Islam. According to CIA estimates, ISIS’s ranks of fighters tripled from initial estimates of 10,000 to more than 31,000 by mid-September. These militants include over 15,000 foreign volunteers from around the globe, including more than 2,000 from Europe and more than 100 from the U.S.

Individuals at risk of Ebola are relatively easy to identify: all have come into direct contact with the bodily fluids of a symptomatic Ebola patient, and almost all these cases occurred in just a handful of countries in West Africa. Once symptoms begin, those with the virus soon find it difficult to move, much less travel, for very long undetected.

But who is most likely to catch the ISIS virus? The most susceptible appear to be 18- to 35-year-old male Sunni Muslims, among whom there are many Western converts, disaffected or isolated in their local environment. But militants’ individual circumstances vary greatly, with foreign fighters hailing from more than 80 countries. These terrorists’ message can also inspire “lone wolf” sympathizers to engage in deadly behavior thousands of miles from any master planner or jihadist cell.

In sum, if Ebola were judged as a serious threat to the U.S., Americans have the knowledge to stop it in its tracks. Imagine an outbreak in the U.S. or another advanced society. The infected would be immediately quarantined, limiting contact to appropriately protected medical professionals—thus breaking the chain of infection. It is no surprise that all but two of the individuals infected by the virus who have returned to the U.S. have recovered and have not infected others. Countries like Liberia, on the other hand, with no comprehensive modern public-health or medical system, face entirely different challenges. International assistance has come slowly, piecemeal and in a largely uncoordinated fashion.

Of course, if ISIS really were a disease, it would be a nightmare: a deadly, highly contagious killer whose identity, origins, transmission and risk factors are poorly understood. Facing it, we find ourselves more like the Founding Fathers of the U.S., who in the 1790s experienced seasonal outbreaks of yellow fever in Philadelphia (then the capital of the country). Imagining that it was caused by the “putrid” airs of hot summers in the city, President John Adams and his Cabinet simply left the city, not returning until later in the fall when the plague subsided. In one particularly virulent year, Adams remained at his home in Quincy, Mass., for four months.

Not until more than a century later did medical science discover that the disease was transmitted by mosquitoes and its spread could be stopped.

We cannot hope to escape, even temporarily, the “putrid” airs of ISIS until our understanding of that scourge improves. Faced with the realities of this threat, how would the medical world suggest we respond?

First, we would begin with humility. Since 9/11, the dominant U.S. strategy to prevent the spread of Islamic extremism has been to kill its hosts. Thirteen years on, having toppled the Taliban in Kabul and Saddam Hussein in Baghdad, waged war in both Iraq and Afghanistan, decimated the al-Qaeda core in Pakistan and Afghanistan and conducted 500 drone strikes against al-Qaeda affiliates in Yemen and Pakistan, and now launched over 1,000 air strikes against ISIS in Iraq and Syria, we should pause and ask: Are the numbers of those currently infected by the disease shrinking—or growing? As former Secretary of Defense Donald Rumsfeld once put it: Are we creating more enemies than we are killing? With our current approach, will we be declaring war on another acronym a decade from now? As we mount a response to ISIS, we must examine honestly past failures and successes and work to improve our limited understanding of what we are facing. We should then proceed with caution, keeping in mind Hippocrates’ wise counsel “to help, or at least, to do no harm.”

Second, we would tailor our treatments to reflect the different theaters of the disease. Health care professionals fighting Ebola in West Africa face quite different challenges of containment, treatment and prevention than do their counterparts dealing with isolated cases in the Western world. Similarly, our strategy to “defeat and ultimately destroy” ISIS in its hotbed of Iraq and Syria must be linked to, but differentiated from, our treatment for foreign fighters likely to “catch” the ISIS virus in Western nations. While continuing to focus on the center of the outbreak, the U.S. must also work to identify, track and—when necessary—isolate infected individuals within its borders.

Just as Ebola quarantines have raised ethical debates, our response to foreign fighters will need to address difficult trade-offs between individual rights and collective security. Should citizens who choose to fight for ISIS be stripped of their citizenship, imprisoned on their return, or denied entry to their home country? Such a response would certainly chill “jihadi tourism.” Should potential foreign fighters be denied passports or have their travel restricted? How closely should security agencies be allowed to monitor individuals who visit the most extremist Salafist websites or espouse ISIS-friendly views? Will punitive measures control the threat or only add fuel to radical beliefs?

Finally, we should acknowledge the fact that for the foreseeable future, there may be no permanent cure for Islamic extremism. Against Ebola, researchers are racing toward a vaccine that could decisively prevent future epidemics. But the past decade has taught us that despite our best efforts, if and when the ISIS outbreak is controlled, another strain of the virus is likely to emerge. In this sense, violent Islamic extremism may be more like the flu than Ebola: a virus for which we have no cure, but for which we can develop a coherent management strategy to minimize the number of annual infections and deaths. And recalling the 1918 influenza pandemic that killed at least 50 million people around the world, we must remain vigilant to the possibility that a new, more virulent and contagious strain of extremism could emerge with even graver consequences.

Allison is director of the Belfer Center for Science and International Affairs at Harvard’s John F. Kennedy School of Government

TIME Opinion

The Problem With Frats Isn’t Just Rape. It’s Power.

The Phi Kappa Psi fraternity house at the University of Virginia in Charlottesville, Va., on Nov. 24, 2014. A Rolling Stone article alleged a gang rape at the house, which has since suspended operations. Steve Helber—AP

Too many frats breed sexism and misogyny that lasts long after college. Why we need to ban them—for good.

At the university I called home my freshman year, fraternity row was a tree-lined street full of Southern-style mansions, set against the backdrop of the poor urban neighborhood that surrounded the school. Off-campus frat parties weren’t quite how I pictured spending my weekends at a new school – I wasn’t actually part of the Greek system – but it became clear quickly that they were the center of the social structure. They controlled the alcohol on campus, and thus, the social life. So there I was, week after week, joining the throngs of half-naked women trekking to fraternity row.

We learned the rules to frat life quickly, or at least we thought we did. Never let your drink out of your sight. Don’t go upstairs – where the bedrooms were housed – without a girlfriend who could check in on you later. If one of us was denied entry to a party because we weren’t deemed “hot” enough – houses often ranked women on a scale of one to 10, with only “sixes” and up granted entry to a party – we stuck together. Maybe we went to the foam party next door.

In two years at the University of Southern California, I heard plenty of stories of women being drugged at frat parties. At least one woman I knew was date raped, though she didn’t report it. But most of us basically shrugged our shoulders: This was just how it worked… right?

If the recent headlines are any indication, it certainly appears so. Among them: women blacked out and hospitalized after a frat party at the University of Wisconsin, only to discover red or black X’s marked on their hands. An email guide to getting girls in bed called “Luring your rapebait.” A banner displayed at a Texas Tech party reading “No Means Yes, Yes Means Anal” – which happened to be the same slogan chanted by frat brothers at Yale, later part of a civil rights complaint against the university.

And now, the story of Jackie, who alleged in a Rolling Stone article — one that has swiftly become the subject of a debate over fairness in reporting, and whether the author was negligent in not reaching out to the alleged rapists — that she was gang raped by seven members of the Phi Kappa Psi house at the University of Virginia, and discouraged from pressing charges to protect the university’s reputation.

The alleged rape, it turned out, took place at the same house where another rape had occurred some 30 years prior, ultimately landing the perpetrator in jail.

“I’m sick about this,” says Caitlin Flanagan, a writer and UVA alumna who spent a year documenting the culture of fraternity life for a recent cover story in the Atlantic. “It’s been 30 years of education programs by the frats, initiatives to change culture, management policies, and we’re still here.”

Which raises the question: Why isn’t every campus in America dissolving its fraternity program — or at least instituting major, serious reform?

Not every fraternity member is a rapist (nor is every fraternity misogynistic). But fraternity members are three times more likely to rape, according to a 2007 study, which notes that fraternity culture reinforces “within-group attitudes” that perpetuate sexual coercion. Taken together, frats and other traditionally male-dominated social clubs (ahem: the Princeton eating club) crystallize the elements of our culture that reinforce inequality, both gender and otherwise.

For starters, they are insulated from outside perspective. It wasn’t until the late 1960s that Greek organizations eradicated whites-only membership clauses; as a recent controversy at the University of Alabama revealed, only one black student had been permitted into that Greek system since 1964. Throughout the country, fraternities grew into a “caste system based on socioeconomic status as perceived by students,” John Chandler, the former president of Middlebury, which has banned frats on campus, recently told Newsweek.

And when it comes to campus social life, they exert huge social control: providing the alcohol, hosting the parties, policing who may enter–based on whatever criteria they choose. Because sororities are prohibited from serving alcohol, they can’t host their own parties; they must also abide by strict decorum rules. So night after night, women line up, in tube tops and high heels, vying for entrance. Even their clothes are a signifier of where the power lies. “Those with less power almost invariably dress up for those who have more,” Michael Kimmel, a sociologist at Stony Brook University, wrote in a recent column for TIME. “So, by day, in class, women and men dress pretty much the same … At parties, though, the guys will still be dressed that way, while the women will be sporting party dresses, high heels and make up.”

And when frat boys grow up? They slide right into the boys club of the business world, where brothers land Wall Street jobs via the “fraternity pipeline,” as a recent Bloomberg Businessweek piece put it — a place where secret handshakes mean special treatment in an already male-dominated field. Fraternities have graduated plenty of brilliant Silicon Valley founders: the creators of Facebook and Instagram, among others. They’ve also brought us Justin Mateen, a co-founder of Tinder, who stepped down amid a sexual harassment lawsuit, and Evan Spiegel, the Snapchat CEO, who recently apologized for e-mails, sent while he was in the Stanford frat where Snapchat was founded, that discussed convincing sorority women to perform sex acts and drunkenly peeing on a woman in bed.

(VIDEO: My Rapist Is Still on Campus: A Columbia Undergrad Tells Her Story)

If we lived in a gender-equal world, fraternities might work. But in an age when 1 in 5 college women are raped or assaulted on campus, when dozens of universities are under federal investigation for their handling of it, and when the business world remains dominated by men, doesn’t the continued existence of fraternities normalize a kind of white, male-dominated culture that already pervades our society? There is something insidious about a group of men who deny women entry, control the No. 1 asset on campus – alcohol – and make the rules in isolated groups. “[Colleges] should be cultivating the kind of sensibility that makes you a better citizen of a diverse and distressingly fractious society,” Frank Bruni wrote in a New York Times column this week. “How is that served by retreating into an exclusionary clique of people just like you?”

The argument for Greek life – at least for the mainstream, largely white frats that seem to be the problem – goes something like this: It’s about fostering camaraderie. (According to a 2014 Gallup Poll, fraternity and sorority members have stronger relationships with friends and family than other college graduates.) It’s about community: As the Washington Post reported, chapters at UVA raised $400,000 for charity and logged 56,000 hours of community service during the past academic year. It’s part of students’ right to congregate freely. And it’s about training future leaders: according to Gallup, fraternity and sorority members end up better off financially, and are more likely to start businesses, than other college graduates.

But the real benefit – at least the unspoken one – may be about money. Frats breed generous donors: as Flanagan pointed out in her Atlantic piece, fraternities save universities millions of dollars in student housing. At least one study has confirmed that fraternity brothers also tend to be generous to their alma maters.

All of which is part of the problem. Who wants to crack down on frats if it’s going to profoundly disturb campus life?

UVA, for its part, has suspended the frat in question until the new year, a move the Inter-Fraternity Council described as a helpful opportunity for UVA’s Greek system to “take a breath.” The university’s president has said that the school “is too good a place to allow this evil to reside.” But critics saw the punishment as a slap on the wrist: a suspension, when most students are out of town for the holidays?

There are other options on the table: The school is reportedly considering proposals to crack down on underage drinking and even a ban on alcohol. Other universities have explored making fraternities co-ed. And there’s some evidence that fraternity brothers who participate in a rape prevention program at the start of the academic year are less likely to commit a sexually coercive act than a control group of men who also joined fraternities.

Yet all the while, the parade of ugly news continues. A group of frat brothers at San Diego State University interrupted a “Take Back the Night” march last week by screaming obscenities, throwing eggs and waving dildos at marchers. The next night, a woman reported she was sexually assaulted at a party near the school’s campus; she was the seventh person to come forward this semester. And on Monday, Wesleyan announced that its Psi Upsilon fraternity would be banned from hosting social events until the end of 2015, also because of rape accusations.

Fraternities have created something rare in the modern world: a place where young men spend three or four years living with other men whom they have vetted as like them and able to “fit in.” What do you expect to happen at a club where women are viewed as outsiders, or commodities, or worse, as prey, and where men make the rules? It should be no surprise that they end up recreating the boys club — one that isn’t so great for the boys, either.

Jessica Bennett is a contributing columnist at Time.com covering the intersection of gender, sexuality, business and pop culture. She writes regularly for the New York Times and is a contributing editor on special projects for Sheryl Sandberg’s women’s non-profit, Lean In. You can follow her @jess7bennett.

Read more views on the debate about preventing sexual assault on campus:

Caitlin Flanagan: We Need More Transparency on the Issue of Fraternity Rape

A Lawyer for the Accused on Why Some Rules About Consent Are Unfair to Men

Ban Frat Parties–Let Sororities Run the Show

TIME Opinion

Why Ferguson Should Matter to Asian-Americans

A female protester raises her hands while blocking police cars in Ferguson, Mo. on Nov. 25, 2014. Adrees Latif—Reuters

Ferguson isn’t simply black versus white

A peculiar Vine floated around social media Monday evening following the grand jury announcement in Ferguson, Mo. The short video shows an Asian-American shopkeeper standing in his looted store, with a hands-in-his-pockets matter-of-factness and a sad slump to his facial expression. “Are you okay, sir?” an off-screen cameraman asks. “Yes,” the storeowner says, dejectedly.

The clip is only a few seconds, but it highlights the question of where Asian-Americans stand in the black and white palette often used to paint incidents like Ferguson. In the story of a white cop’s killing of a black teen, Asian-Americans may at first seem irrelevant. They are neither white nor black; they assume the benefits of non-blackness, but also the burdens of non-whiteness. They can appear innocuous on nighttime streets, but also defenseless; getting into Harvard is a result of “one’s own merit,” but also a genetic gift; they are assumed well-off in society, but also perpetually foreign. Asian-Americans’ peculiar gray space on the racial spectrum can translate to detachment from the situation in Ferguson. When that happens, the racialized nature of the events in Ferguson loses relevance to Asian-Americans. But seen with a historical perspective, it’s clear that such moments are decidedly of more colors than two.

VOTE: Should the Ferguson Protestors Be TIME’s Person of the Year?

Michael Brown’s death has several parallels in Asian-American history. The first to come to mind may be the story of Vincent Chin, a Chinese-American killed in 1982 by a Chrysler plant superintendent and his stepson, both white, neither of whom served jail time for a racially motivated killing; like Brown’s, Chin’s death unified his community to demand protection under the law. However, most direct parallels have had one distinct dissimilarity to Ferguson: they have not spurred widespread resistance, nor have they left a visible legacy.

There is the story of Kuanchang Kao, an intoxicated Chinese-American fatally shot in 1997 by police threatened by his “martial arts” moves. There is Cau Bich Tran, a Vietnamese-American killed in 2003 after holding a vegetable peeler, which police thought was a cleaver. There is Fong Lee, a Hmong-American shot to death in 2006 by police who believed he was carrying a gun. None of the three cases resulted in criminal charges against the police or in public campaigns that turned the victim’s memory into a commitment to seek justice. One op-ed even declared how little America learned from Tran’s slaying.

While Ferguson captures the world’s attention, why do these Asian-American stories remain comparatively unknown?

One possible answer could be found in the model minority myth. The myth, a decades-old stereotype, casts Asian-Americans as universally successful, and discourages others — even Asian-Americans themselves — from believing in the validity of their struggles. But as protests over Ferguson continue, it’s increasingly important to remember the purpose of the model minority narrative’s construction. The doctored portrayal, which dates to 1966, was intended to shame African-American activists whose demands for equal civil rights threatened a centuries-old white society. (The original story in the New York Times thrust forward an image of Japanese-Americans quietly rising to economic successes despite the racial prejudice responsible for their unjust internment during World War II.)

Racial engineering of Asian-Americans and African-Americans to protect a white-run society was nothing new, but the puppeteering of one minority to slap the other’s wrist was a marked change. The apparent boost of Asian-Americans suggested that racism was no longer a problem for all people of color — it was a problem for people of a specific color. “The model minority discourse has elevated Asian-Americans as a group that’s worked hard, using education to get ahead,” said Daryl Maeda, a professor of ethnic studies at the University of Colorado, Boulder. “But the reality is that it’s a discourse that intends to pit us against other people of color. And that’s a divide and conquer strategy we shouldn’t be complicit with.”

Through the years, that idea erased from the public consciousness the fact that the Asian-American experience was once a story of racially motivated legal exclusion, disenfranchisement and horrific violence — commonalities with the African-American experience that became rallying points in demanding racial equality. That division between racial minorities also erased a history of Afro-Asian solidarity born of the shared experience of sociopolitical marginalization.

As with Ferguson, it’s easy to say the Civil Rights movement was entirely black and white, when in reality there were many moments of interplay between African-American and Asian-American activism. Japanese-American activist Yuri Kochiyama worked alongside Malcolm X until he was assassinated in front of her. Groups protesting America’s involvement in the Vietnam War, like the student-run Third World Liberation Front, united resisters across racial lines under a collective radical political identity. W.E.B. DuBois called on African Americans to support the 1920s Indian anti-colonial resistance, which he compared to whites’ oppression of blacks. Chinese-American activist Grace Lee Boggs, who struggled as a female scholar of color, found passion in fighting similar injustices against African-Americans alongside C.L.R. James in the 1950s. Though Afro-Asian solidarity wasn’t the norm in either group’s resistance movements, the examples highlight the power of cross-racial resistance, and the hardships the two groups shared as non-whites.

The concept of non-whiteness is one way to begin the retelling of most hyphenated American histories. In Asian-American history, non-whiteness indelibly characterized the first waves of Asians arriving in the mid-1800s in America. Cases like People v. Hall (1854) placed them alongside unfree blacks, in that case by ruling that a law barring blacks from testifying against whites was intended to block non-white witnesses, while popular images documented Asian-American bodies as dark, faceless and indistinguishable — a racialization strengthened against the white supremacy of Manifest Destiny and naturalization law. Non-whiteness facilitated racism, but it in time also facilitated cross-racial opposition. With issues like post-9/11 racial profiling, anti-racism efforts continue to uphold this tradition of a shared non-white struggle.

“This stuff is what I call M.I.H. — missing in history,” said Helen Zia, an Asian-American historian and activist. “Unfortunately, we have generations growing up thinking there’s no connection [between African-Americans and Asian-Americans]. These things are there, all the linkages of struggles that have been fought together.”

The disassociation of Asian-Americans from Ferguson — not just as absent allies, but as forgotten legacies — is another chapter in that missing history. In the final moments of the Vine depicting the Asian-American shopkeeper’s looted store, the cameraman offers a last thought in a conversation that had briefly paused. “It’s just a mess,” he says. The observation, however simplistic, holds a truth: that, as an Asian-American who has become collateral damage in a climate often painted in black and white, he, like all of Ferguson, must first clean up — and then reassess the unfolding reality outside.

TIME

When One Twin is More Academically Gifted

My son tested into the gifted program at school, but my daughter didn't. Should I split them up?

Splitting up twins in school is never easy. But splitting up twins so that one goes on the advanced learning track and the other follows the regular program is one of the most agonizing decisions a parent can face. And no amount of Internet searches will give you helpful advice. The consensus: Figure it out, parents. That’s what you’re (not) paid for.

As you may have guessed, I have twins, a boy and a girl, and they’re in the first grade. I happen to be a fraternal twin myself, so I’m sensitive to always being compared to a sibling. My son is like his engineer father — completely committed to being a lovable nerd. The other day he found a book of math problems at Barnes and Noble and was so excited it was as if Santa had arrived, handed him a gift, and then let him ride a reindeer. My daughter is like her freelance-writer mother – studying is not really her thing. She reminds me of the prince in Monty Python and the Holy Grail who is to inherit a large amount of land and says, “But I don’t want any of that. I’d rather sing!” That’s my girl.

We were first introduced to our school’s Spectrum (advanced learning) program last year, at the beginning of kindergarten in Seattle, Washington. The kids could be tested that year and would enter the program—or not—in first grade. I hadn’t really thought about whether to have my kids tested. Other parents apparently had. One asked: “Should we have our child practice at home with the same kind of mouse they’re going to use in the test?”

In the beginning, my husband and I laughed at the idea of advanced learning in the first grade. We joked about “Level Two Crayons” and “Expert Alphabet.” But then, as the day to decide about testing came closer, we started hearing from our son’s teacher about how gifted he was. What first grader wants to practice math and reading on his own during the evenings and weekends? My son. And then there was my daughter, who was right on track, but, like most kids her age, was happy to leave school stuff at school. “Let’s just get them both tested and see what happens,” I said.

As far as my kids knew, they were just going to school to talk about what they know and what they don’t. They were never told that the results of the test had any sort of consequences and weren’t the least bit curious. But when we got the results–my son tested into the advanced program and my daughter didn’t–I immediately became anxious. I wanted to let my son move into the advanced program because I knew he would love it and thrive. But I worried for my vibrant, passionate daughter who at the age of six doesn’t think she has any limits. How was I going to separate her from her brother because he could do something better?

As a child I never felt smart enough. Not because of my twin sister, but because of my mother, who was brilliant. She used her intelligence to get off of the Kentucky farm where she grew up and into a New York City law firm. She placed a lot of value on the power of education and what good grades could do. I felt perpetually unable to meet her high expectations. Now I had a daughter who, in kindergarten, was already resistant to doing her reading homework. I was terrified that placing her brother in a higher academic track would affect my daughter’s self-esteem.

I contacted Christina Baglivi Tingloff from the site Talk About Twins. She’s a mother of adult twins and author of six books, including Double Duty and Parenting School-Age Twins and Multiples. “It’s tough when twins differ in abilities,” she says, “and I’d say that it’s the biggest challenge of parenting multiples. [But] kids take their cues from their parents. If you make this a non-issue in your household, I think your kids will follow suit.”

My husband and I have no lofty goals for our kids besides wanting them to be able to pay their own bills, not hurt themselves or anyone else, and be happy. “So many parents of twins try to even the playing field,” says Tingloff. “In my opinion, that’s a bad course of action because…kids then never develop a strong emotional backbone. Your job as a parent is to help them deal with the disappointments in life.”

We ended up putting our son in the Spectrum program and our daughter in the regular learning track. In the years to come, I will make sure that they understand that advanced or regular doesn’t mean better or worse, it just means different. I want both of my children to do the best they can, whether that means taking advanced classes or singing the hell out of the school musical.

When my daughter wanders through the house making up her own songs and singing at the top of her voice, I support her…most of the time. “Really encourage your daughter in the arts,” says Tingloff. “Find her spotlight. At some point her brother will look at her accomplishments and say, ‘Wow, I can’t do that.'” While I had been worrying all this time about my daughter feeling outshined by her brother, I had never considered that he might also feel outperformed by her.

Despite all of my talk about how my daughter’s interests were every bit as valid as her brother’s, I had not been treating them the same. I saw the dance and drama as diversions and hobbies. I never gave those talents the respect that I gave to her brother’s academic interests.

Now that I am more aware of how I have been valuing their different strengths, I’ll be able to give my daughter’s interests the same amount of focus and praise as her brother’s. Hopefully, I can assure them that our only concern is their happiness. Then my husband and son can go do math problems together, and take things apart to see how they work, and my daughter and I will lie on the grass and find shapes in the clouds while we wonder about the world and sing.

The truth is, both my kids are gifted.


TIME Opinion

Confessions of a Lumbersexual

Jordan Ruiz—Getty Images

Why plaid yoga mats and beards are the future

Several years ago I was riding in a van with two female friends in the front seats when one of them pointed out the window and yelled “Wait! Slow down…is that him?” We were passing the bar that employed her ex-boyfriend.

“I don’t know,” said her friend who was driving. “A guy in Brooklyn with a beard and a plaid shirt? Could be anyone.”

I looked down over my beard at my shirt and both girls looked at me and we all laughed.

I’ve had a beard most of my adult life and my wardrobe is comprised largely of cowboy cut, plaid shirts and Wrangler blue jeans. On cold days I wear a big Carhartt coat into the office. In my youth in Oklahoma I did cut down some trees and split firewood for use in a house I really did grow up in, but in those days I dressed like a poser gutter punk. I nurture an abiding love for outlaw country and bluegrass, though, again, during my actual lumberjacking days it was all Black Flag, Operation Ivy and an inadvisable amount of The Doors.

After a decade living in urban places like Brooklyn and Washington, I still keep a fishing rod I haven’t used in years and woodworking tools I shouldn’t be trusted with, and when I drink my voice deepens into a sort of growl the provenance of which I do not know. I like mason jars, craft beer and vintage pickup trucks. An old friend visiting me a few years ago commented, as I propped a booted foot against the wall behind me and adjusted the shirt tucked into my blue jeans, that I looked more Oklahoma than I ever did in Oklahoma.

I am a lumbersexual.

The lumbersexual has been the subject of much Internet musing in the last several weeks. The term is a new one on me but it is not a new phenomenon. In 2010 Urban Dictionary defined the lumbersexual as, “A metro-sexual who has the need to hold on to some outdoor based ruggedness, thus opting to keep a finely trimmed beard.” I was never a metrosexual and I’m actually most amused by Urban Dictionary’s earliest entry for lumbersexual, from February 2004: “A male who humps anyone who gives him wood.” But I do think defining the lumbersexual as a metrosexual grasping at masculinity gets at something.

It doesn’t take a lot of deep self-reflection to see that my lumbersexuality is, in part, a response to the easing of gender identities in society at large over the last few decades. Writing for The New Republic nearly 15 years ago, Andrew Sullivan observed that “many areas of life that were once ‘gentlemanly’ have simply been opened to women and thus effectively demasculinized.” The flipside of this happy consequence of social progress is a generation of men left a bit rudderless. “Take their exclusive vocations away, remove their institutions, de-gender their clubs and schools and workplaces, and you leave men with more than a little cultural bewilderment,” Sullivan wrote.

If not a breadwinner, not ogreishly aggressive, and not a senior member in good standing at a stuffy old real-life boy’s club, what is a man to be?

On the other hand, the upending of gender norms frees men in mainstream culture to do things verboten by a retrograde man-code once enforced by the most insecure and doltish among us. We carry purses now (and call them murses, or satchels, but don’t kid yourselves, fellas). We do yoga. That the ancient core workout is so associated with femininity that pop culture has invented the term “broga” only goes to show what a sorry state masculinity is in. The lumbersexual is merely a healthier expression of the same identity crisis.

Which is, I think (?), why I dress like a lumberjack (and a lumberjack from like 100 years ago, mind you; real lumberjacks today, orange-clad in helmets and ear protection, do not dress like lumbersexuals). As a 21st-century man who does not identify with the pickup-artist thing or the boobs/cars/abs triad of masculinity on display in most 21st-century men’s magazines (Maxim et al.), who is not particularly fastidious or a member of any clearly identifiable subculture, and who is as attracted to notions of old-timey authenticity as anyone else in my 20s-30s hipster cohort (all of you are hipsters; get over it), I guess this is just the fashion sense that felt most natural. I am actually fairly outdoorsy, in a redneck car-camping kind of way. Lumbersexuality just fit right, like an axe handle smoothed out by years of palm grease or an iPhone case weathered in all the right places to the shape of my hand.

There is a dark side to this lumbersexual moment, however. It’s an impulse evident in Tim Allen’s new show Last Man Standing. Whereas in the 1990s, Tim the Tool-Man Taylor from Home Improvement was a confident and self-effacing parody of the Man Cave, complete with silly dude-grunting and fetishizing of tools, Mike Baxter, played by Tim Allen in Last Man Standing, is an entirely un-self-aware, willfully ignorant reactionary. The central theme of the show is Baxter, in a household full of women, struggling to retain his masculinity, which is presumed to be under assault because of all the estrogen around. He does this through all manner of posturing, complaining and at times being outright weird. In an early episode, Baxter waltzes into the back office at his job in a big-box store modeled on Bass Pro Shops and relishes the fact that it “smells like balls in here.” The joke is a crude attempt at celebrating maleness, but it rings distressingly hollow to anyone who has spent any time in rooms redolent with the scent of actual balls. In later seasons the show softened, but its central concern, a man whose masculinity is under assault because he is surrounded by women, speaks to this moment in our popular culture.

If my beard is a trend-inspired attempt to reclaim a semblance of masculinity in a world gone mad then so be it. Beats scrotum jokes.

TIME Family

Why Your Kids Don’t Thank You for Gifts

Tang Ming Tung—Getty Images/Flickr RM

And how to help them develop some gratitude

When we shop for holiday gifts, many of us look for things that will make our children happy. We can’t wait to hear their appreciative cries of “thank you! thank you!” once the wrapping gets ripped off.

But here’s a tip: Don’t count on it.

In this season for thanks and giving, even the most thoughtful children may not offer much gratitude for the gadgets, gizmos, and games they receive. And you’d be wise not to expect it.

I’ve spent the last year living more gratefully because of a book I’m writing on the subject, so I’m confident that gratitude can make us happier, healthier, and even fitter. Seeing the world through grateful eyes can lower depression and improve sleep. It creates a pay-it-forward spirit that is good for the world. Encouraging children to write down events that made them grateful—and not just on Thanksgiving—can begin a habit that lasts a lifetime.

Read: What I’m Thankful For, by Nick Offerman, Wendy Davis and others

But gratitude for the endless stuff we buy them? All the research I’ve done has convinced me that it’s not going to happen. And there are several reasons why.

In one study, Yarrow Dunham, an assistant professor of psychology at Yale, found that 4- to 8-year-old kids responded differently when given a gift they thought they had earned versus one granted out of simple generosity. He called the earned gift part of an “exchange relationship.” The children were happy for the trinket but didn’t experience the deeper resonance of gratitude that might also make them more generous to others. The gift given for no reason, however, had a different emotional impact, and the children showed thanks by being more likely to share candies they received in a follow-up game.

As parents, we don’t consider our holiday gifts an “exchange relationship,” since we know the time, money, and effort we put into buying them. But kids have a different view. One mom told me that when she asked her 16-year-old son to thank her for buying him a cellphone, he said, “But that’s what moms do, isn’t it?” He wasn’t being rude—just practical.

From a teenager’s vantage, it’s a parent’s responsibility to take care of the family, and playing Santa is part of the job. According to Dunham, “when teenagers code it that way, a gift is no longer something given freely and voluntarily”—it’s just mom and dad living up to their obligation. And who’s going to be grateful for parents doing what they’re supposed to do?

Read: 40 Inspiring Motivational Quotes About Gratitude

Asking our children to be grateful for gifts is sending the wrong message, anyway. Cornell psychology professor Tom Gilovich has found that people are more likely to be grateful for experiences than for material possessions. A family dinner, a songfest around the fireplace, or even a hike in the woods creates a spirit of gratitude that outlasts even the nicest Nintendo.

Parents may get exasperated when a teenager tosses a new cashmere sweater on the floor, and, gratitude aside, we do have the right to demand good manners. Children should know to say thank you (profusely) to every parent, child, aunt, and uncle who gives them something.

But kids can’t know how blessed they are unless they have a basis for comparison. And they don’t learn that by a parent complaining that they’re ungrateful. We need to give our children the gift of a wider world view. Take them to a soup kitchen instead of to the mall. Become the secret Santa for a needy family. Show by example that gratitude isn’t about stuff—which ultimately can’t make any of us happy anyway. It’s about realizing how lucky you are and paying your good fortune forward.

My favorite idea: Collect all the charitable appeals you get this time of year into a big basket and find a night when the whole family can sit down together to go through them. You set the budget for giving and the kids decide how it’s distributed. Going through each request, you have the opportunity to discuss with children and teens (and also your spouse) what it means to need a food bank or to live in a part of the world where there is no clean water. You can talk about teenagers who are caught in war zones or those suffering from disabilities. Then write the checks together or go online and make your contributions.

Once the conversation about gratitude gets started, it’s much easier to continue all year. Set up a family ritual at bedtime where kids describe three things that made them grateful. When kids go off to college, text them a picture each week of something that inspired your appreciation. Whether it’s a friend, a snowflake, or a sunset, the spirit of the photos will help you (and them) see the world differently.

Teaching children to focus on the positive and appreciate the good in their lives is perhaps the greatest gift we can give them. And we can all learn together that the things that really matter aren’t on sale at a department store.

So I hope my kids will thank me for the gifts I buy them this year. But gratitude? That needs more than wrapping paper and a bow.
