TIME society

Students With These Names Are the Absolute Worst

According to a survey that claims to reveal the names of students with the best and worst behavior

School Stickers, a website started by a UK teacher where PreK-12 students earn “rewards” from teachers for good behavior and achievement, has compiled lists of the most common names among boys and girls who have been “naughty” and “nice.”

Nicest Girls

Amy
Georgia
Emma
Charlotte
Grace
Sophie
Abigail
Hannah
Emily

Naughtiest Girls

Ella
Bethany
Eleanor
Olivia
Laura
Holly
Courtney
Amber
Caitlin
Jade

Nicest Boys

Jacob
Daniel
Thomas
James
Adam
Harry
Samuel
Jack
Oliver
Ryan

Naughtiest Boys

Joseph
Cameron
William
Jake
Joshua
Jamie
Lewis
Benjamin
Ethan
Luke

(h/t BuzzFeed)

TIME society

Here Are the 10 Most Misquoted Holiday Songs

“Don we now our day of peril”

The following 10 songs are the most frequently misquoted holiday classics, along with the funniest ways people have misheard them, according to an Amazon survey. The survey is a promotion for X-Ray for Music, a feature that displays lyrics as tunes play. Happy sing-along!

  • “Auld Lang Syne”: “Should all acquaintance be forgot and never brought to mine, should all acquaintance be forgot in the land of old man time.”
  • “The Christmas Song (Chestnuts Roasting on an Open Fire)”: “Chestnuts roasting on an open fire, Jack Frost nipping at your toes”
  • “Winter Wonderland”: “Later on, we’ll perspire”
  • “Deck the Halls”: “Don we now our day of peril”
  • “Jingle Bells”: “Bells on cocktail rings”
  • “The Twelve Days of Christmas”: “On the fourth day of Christmas, my true love sent to me, four colly birds”
  • “Silent Night”: “Round John Virgin, mother and child”
  • “Joy to the World”: “Joy to the world! The Lord has gum”
  • “Grandma Got Run Over by a Reindeer”: “Grandma got run over by a reindeer walkin’ home from outhouse on Christmas Eve”
  • “We Three Kings of Orient Are”: “We three kings of porridge and tar”
TIME

This Is the Most Popular Christmas Song Ever

TIME crunches the merry numbers behind the most popular Christmas songs of the modern era

The names Joseph Mohr and Franz Xaver Gruber have largely vanished into the annals of Christmas history, but their greatest triumph lives on. “Silent Night,” whose lyrics Mohr wrote (in German) in 1816 and which Gruber set to music two years later, is the most recorded Christmas song in the modern era of the holiday’s substantial oeuvre.

To determine this fact, TIME crawled the records at the U.S. Copyright Office, which offers digitized registrations going back to 1978, and collected data on every Christmas album recorded since that time. “Silent Night,” it turns out, is not merely the most popular carol; with 733 copyrighted recordings since 1978, it is nearly twice as dominant as “Joy to the World,” a distant second with 391 records to its name.

As one might surmise, songs that are no longer under their original copyright are considerably more prominent on modern Christmas albums, given that one needn’t share the holiday windfall. This lends an obvious advantage to the ecclesiastical hymns and tunes, like “O Holy Night” and “God Rest Ye Merry, Gentlemen.” As intellectual property lawyer Paul C. Jorgensen explains, this does nothing to prevent artists from copyrighting their own recording of a song and collecting royalties whenever a radio station wants to play it–assuming the other 732 renditions weren’t to taste.

Nor is it strictly limited to American recording artists. “A lot of international artists will go ahead and register things in the United States,” Jorgensen said.

To determine secularity, TIME measured the likelihood that a song appears on the same album with either “What Child Is This?”, a decidedly devout 1865 tune, or “Jingle Bell Rock,” roughly its polar opposite. (The choice of those two songs is rather arbitrary, but proved through trial and error to offer the clearest dichotomy.) In true Christmas spirit, “Silent Night” aptly bridges that great divide: It co-headlines with just about anyone.
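
TIME doesn’t spell out the formula, but a minimal sketch of such a co-occurrence score might look like the following Python. The `albums` input and the scoring are illustrative assumptions, not TIME’s actual method:

```python
from collections import defaultdict

def secularity_scores(albums):
    """Score each song by how often it shares an album with the secular
    anchor ("Jingle Bell Rock") versus the sacred one ("What Child Is
    This?"). `albums` is a list of sets of song titles, one set per album.
    Returns song -> score in [-1, 1]: -1 leans sacred, +1 leans secular.
    """
    SACRED = "What Child Is This?"
    SECULAR = "Jingle Bell Rock"
    counts = defaultdict(lambda: [0, 0])  # song -> [sacred hits, secular hits]
    for album in albums:
        has_sacred = SACRED in album
        has_secular = SECULAR in album
        for song in album:
            if has_sacred and song != SACRED:
                counts[song][0] += 1
            if has_secular and song != SECULAR:
                counts[song][1] += 1
    # Songs that never co-occur with either anchor get no score.
    return {song: (sec - sac) / (sec + sac)
            for song, (sac, sec) in counts.items() if sac + sec}
```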

Methodology

This project began by downloading every copyrighted recording of “Jingle Bells,” then expanding to every song on the same album as “Jingle Bells,” and so forth until the universe of Christmas music was exhausted. The data only includes “sound recording” records from the Copyright Office, as opposed to sheet music arrangements, videos, and other formats in which one might copyright a song. Variations on the same material, such as “O Christmas Tree” and “O Tannenbaum,” were grouped as one song.
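
In other words, the crawl is a breadth-first traversal of the album catalog. TIME hasn’t published its code; here is a minimal sketch of that expansion, assuming a hypothetical `albums_containing(song)` helper that looks up sound-recording registrations in the Copyright Office’s digitized records:

```python
from collections import deque

def crawl_christmas_albums(albums_containing, seed="Jingle Bells"):
    """Breadth-first expansion: pull every album the seed song appears on,
    queue every unseen song on those albums, and repeat until no new
    songs turn up. `albums_containing(song)` is the assumed lookup; it
    returns an iterable of albums, each an iterable of song titles.
    """
    seen = {seed}
    queue = deque([seed])
    albums = []
    while queue:
        song = queue.popleft()
        for album in albums_containing(song):
            tracks = set(album)
            albums.append(tracks)  # real code would dedupe by registration number
            for other in tracks - seen:
                seen.add(other)
                queue.append(other)
    return seen, albums
```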

Design by Alexander Ho

TIME relationships

What Every Generation Gets Wrong About Sex

British Mods
Young Mods kissing in the street in London, 1964 John Pratt—Getty Images

Think the past was oppressive and the present is debauched? Think again

It was January 1964, and America was on the brink of cultural upheaval. In less than a month, the Beatles would land at JFK for the first time, providing an outlet for the hormonal enthusiasms of teenage girls everywhere. The previous spring, Betty Friedan had published The Feminine Mystique, giving voice to the languor of middle-class housewives and kick-starting second-wave feminism in the process. In much of the country, the Pill was still only available to married women, but it had nonetheless become a symbol of a new, freewheeling sexuality.

And in the offices of TIME, at least one writer was none too happy about it. The United States was undergoing an ethical revolution that had left young people morally at sea, the magazine argued in an un-bylined 5,000-word cover essay.

The article depicted a nation awash in sex: in its pop music and on the Broadway stage, in the literature of writers like Norman Mailer and Henry Miller, and in the look-but-don’t-touch boudoir of the Playboy Club, which had opened four years earlier. “Greeks who have grown up with the memory of Aphrodite can only gape at the American goddess, silken and seminude, in a million advertisements,” the magazine declared.

But of greatest concern was the “revolution of [social] mores” the article described, which meant that sexual morality, once fixed and overbearing, was now “private and relative” – a matter of individual interpretation. Sex was no longer a source of consternation but a cause for celebration; its presence not what made a person morally suspect, but rather its absence.

The essay may have been published half a century ago, but the concerns it raises continue to loom large in American culture today. TIME’s 1964 fears about the long-term psychological effects of sex in popular culture (“no one can really calculate the effect this exposure is having on individual lives and minds”) mirror today’s concerns about the impacts of internet pornography and Miley Cyrus videos. Its descriptions of “champagne parties for teenagers” and “padded brassieres for twelve-year-olds” could have been lifted from any number of contemporary articles on the sexualization of children.

We can see the early traces of the late-2000s panic about “hook-up culture” in its observations about the rise of premarital sex on college campuses. Even the legal furors it details feel surprisingly contemporary. The 1964 story references the arrest of a Cleveland mother for giving information about birth control to “her delinquent daughter.” In September 2014, a Pennsylvania mother was sentenced to a minimum of 9 months in prison for illegally purchasing her 16-year-old daughter prescription medication to terminate an unwanted pregnancy.

But what feels most modern about the essay is its conviction that while the rebellions of the past were necessary and courageous, today’s social changes have gone too far. The 1964 editorial was titled “The Second Sexual Revolution” — a nod to the social upheavals that had transpired 40 years previously, in the devastating wake of the First World War, “when flaming youth buried the Victorian era and anointed itself as the Jazz Age.” Back then, TIME argued, young people had something truly oppressive to rise up against. The rebels of the 1960s, on the other hand, had only the “tattered remnants” of a moral code to defy. “In the 1920s, to praise sexual freedom was still outrageous,” the magazine opined, “today sex is simply no longer shocking.”

Today, the sexual revolutionaries of the 1960s are typically portrayed as brave and daring, and their predecessors in the 1920s forgotten. But the overarching story of an oppressive past and a debauched, out-of-control present has remained consistent. As Australian newspaper The Age warned in 2009: “[m]any teenagers and young adults have turned the free-sex mantra of the 1970s into a lifestyle, and older generations simply don’t have a clue.”

The truth is that the past is neither as neutered, nor the present as sensationalistic, as the stories we tell ourselves about each of them suggest. Contrary to the famous Philip Larkin poem, premarital sex did not begin in 1963. The “revolution” that we now associate with the late 1960s and early 1970s was more an incremental evolution: set in motion as much by the publication of Marie Stopes’s Married Love in 1918, or the discovery that penicillin could be used to treat syphilis in 1943, as it was by the FDA’s approval of the Pill in 1960. The 1950s weren’t as buttoned up as we like to think, and nor was the decade that followed them a “free love” free-for-all.

Similarly, the sex lives of today’s teenagers and twentysomethings are not all that different from those of their Gen Xer and Boomer parents. A study published in The Journal of Sex Research this year found that although young people today are more likely to have sex with a casual date, stranger or friend than their counterparts 30 years ago were, they do not have any more sexual partners — or for that matter, more sex — than their parents did.

This is not to say that the world is still exactly as it was in 1964. If moralists then were troubled by the emergence of what they called “permissiveness with affection” — that is, the belief that love excused premarital sex – such concerns now seem amusingly old-fashioned. Love is no longer a prerequisite for sexual intimacy; and nor, for that matter, is intimacy a prerequisite for sex. For people born after 1980, the most important sexual ethic is not about how or with whom you have sex, but open-mindedness. As one young man amongst the hundreds I interviewed for my forthcoming book on contemporary sexual politics, a 32-year-old call-center worker from London, put it, “Nothing should be seen as alien, or looked down upon as wrong.”

But America hasn’t transformed into the “sex-affirming culture” TIME predicted it would half a century ago, either. Today, just as in 1964, sex is all over our TV screens, in our literature and infused in the rhythms of popular music. A rich sex life is both a necessity and a fashion accessory, promoted as the key to good health, psychological vitality and robust intimate relationships. But sex also continues to be seen as a sinful and corrupting force: a view that is visible in the ongoing ideological battles over abortion and birth control, the discourses of abstinence education, and the treatment of survivors of rape and sexual assault.

If the sexual revolutionaries of the 1960s made a mistake, it was in assuming that these two ideas – that sex is the origin of all sin, and that it is the source of human transcendence – were inherently opposed, and that one could be overcome by pursuing the other. The “second sexual revolution” was more than just a change in sexual behavior. It was a shift in ideology: a rejection of a cultural order in which all kinds of sex were had (un-wed pregnancies were on the rise decades before the advent of the Pill), but the only type of sex it was acceptable to have was married, missionary and between a man and a woman. If this was oppression, it followed that doing the reverse — that is to say, having lots of sex, in lots of different ways, with whomever you liked — would be freedom.

But today’s twentysomethings aren’t just distinguished by their ethic of openmindedness. They also have a different take on what constitutes sexual freedom; one that reflects the new social rules and regulations that their parents and grandparents unintentionally helped to shape.

Millennials are mad about slut-shaming, homophobia and rape culture, yes. But they are also critical of the notion that being sexually liberated means having a certain type — and amount — of sex. “There is still this view that having sex is an achievement in some way,” observes Courtney, a 22-year-old digital media strategist living in Washington DC. “But I don’t want to just be sex-positive. I want to be ‘good sex’-positive.” And for Courtney, that means resisting the temptation to have sex she doesn’t want, even if having it would make her seem (and feel) more progressive.

Back in 1964, TIME observed a similar contradiction in the battle for sexual freedom, noting that although the new ethic had alleviated some of the pressure to abstain from sex, the “competitive compulsion to prove oneself an acceptable sexual machine” had created a new kind of sexual guilt: the guilt of not being sexual enough.

For all our claims of openmindedness, both forms of anxiety are still alive and well today – and that’s not just a function of either excess or repression. It’s a consequence of a contradiction we are yet to find a way to resolve, and which lies at the heart of sexual regulation in our culture: the sense that sex can be the best thing or the worst thing, but it is always important, always significant, and always central to who we are.

It’s a contradiction we could still stand to challenge today, and doing so might just be key to our ultimate liberation.

Rachel Hills is a New York-based journalist who writes on gender, culture, and the politics of everyday life. Her first book, The Sex Myth: The Gap Between Our Fantasies and Reality, will be published by Simon & Schuster in 2015.


TIME society

Did This Man Really Use an Olive Garden Pasta Pass to Feed the Homeless?

Olive Garden Pasta Pass
Olive Garden's “Never Ending Pasta Pass.” Darden Restaurants/AP

His blog, Random Acts of Pasta, chronicles his time as a carb-laden bringer of happiness

“I was sitting at work one day reading usatoday.com and I came across an article about Olive Garden selling a Never Ending Pasta Pass for $100 which entitled the bearer to unlimited pasta for 7 weeks.”

Thus begins a magical story of charity and pasta. (Maybe.)

Matt Tribe lives in Ogden, Utah. He was one of the lucky ones who snagged the limited-edition unlimited pasta pass from Olive Garden, which he promptly put to use driving around the Ogden area giving away pasta to the hungry. His blog, Random Acts of Pasta, chronicles his time as a carb-laden bringer of happiness.

Here’s a highlight:
“I was cruising around looking for someone homeless that I could give some pasta to. I finally found one sleeping in a park. I debated on whether or not it would be a good idea to wake her up, but in the end I decided to do it. I approached her and asked if she was hungry and would like some Olive Garden. She thanked me for the pasta and said she was going to share it with her friends. First of all, the initial thought of someone who has nothing was to share it with someone else – that’s pretty incredible.”

Tribe’s video has of course gone viral, thanks in large part to coverage from outlets like PEOPLE. But many others have voiced doubts about the video’s veracity: With that whole “Drunk Girl in Public” hoax fresh in the Internet’s mind, people on YouTube and Reddit are already calling this whole thing a well-engineered piece of viral marketing.

Olive Garden, for their part, is denying any involvement beyond furnishing the pass to Tribe.

What do you think? Case of well-shot, curiously brand-forward charity, or crass attempt at manipulating the tenor of the Internet to sell chain-quality Italian food? (We’d really like to hear from Marilyn Hagerty.)

Of course, maybe the provenance of this clip is beside the point, if some hungry people actually got fed.

This article originally appeared on People.com

TIME society

Muhammad Becomes Britain’s Most Popular Name for a Baby Boy

The website BabyCentre also reveals that parents have been heavily influenced by celebrity name choices

The website BabyCentre UK has revealed that, rising 27 places from last year, the name Muhammad has topped the list of the top 100 boys’ names of 2014 (when alternate spellings such as Mohammed are included).

It is closely followed by Oliver and Jack, but royal names have fallen out of favor, with George actually declining in popularity since the birth of Prince George to the Duke and Duchess of Cambridge last year.

Sarah Redshaw, managing editor of BabyCentre, told The Times: “Kate and William have a lot of attention and parents don’t want to always be asked if they named their baby after Prince George.”

Data shows a rising trend in Arabic names, with Omar, Ali, and Ibrahim appearing in the chart, and Nur jumping straight to number 29 in the girls’ top 100. But Biblical names endure in popularity: Jacob, Noah and Gabriel for boys, and Abigail, Elizabeth and Eve for girls.

The influence of popular culture on parents’ choices is also clear: Game of Thrones is likely responsible for Emilia entering the charts at 53, while Frozen‘s Elsa makes an appearance in the top 100. Breaking Bad‘s Skyler, Jesse and Walter have also soared up the charts since the series ended last September.


TIME society

New Study Finds That Weight Discrimination in the Workplace is Just as Horrible and Depressing as Ever

workplace
Getty Images

Still more research finds that fat women earn less than their thinner peers, but the same can't be said for fat men

This story originally appeared on xoJane.com.

A new study out of Vanderbilt University has confirmed prior research — and what many women already know — about the earning power of fat women: they earn less than their peers across different types of work, and this holds true even when their level of education is accounted for.

That the study controls for educational background is important, as education has often been suggested as the “real” reason fat women earn less: they as a group also tend to be less educated, which is often put down to their being poorer, because they tend to get lower-paying jobs, all of which comes down to a hugely depressing and seemingly endless ride on the blame carousel. Because, it seems, no one likes to cop to the possibility that maybe fat women are paid less simply because of cultural bias against them.

A bias, it turns out, that seems to have a specific impact on the paychecks of fat women, and is a statistically insignificant issue for fat men.

Fat women earning less than their smaller peers is hardly new information; a 2010 study published in the Journal of Applied Psychology that analyzed pay discrepancies between people of different sizes found some dramatic differences. This study broke women’s body sizes down into categories of “very thin,” “thin,” “average,” “heavy,” and “very heavy.” It found that when compared to women of average weight, “very thin” women earned $22,000 more a year, while “very heavy” women earned almost $19,000 less.

And lest you think these burdens are shouldered only by the extremely obese, a weight gain of 25 pounds predicted a salary loss of approximately $14,000 per year — or even more, if the woman gaining the weight was previously thin, as thin women who gain weight are penalized more harshly than already-overweight women who do so. Even being as little as 13 pounds overweight resulted in $9,000 less per year. I hope this demonstrates that this issue is not exclusively of concern to the very fat, but to women in general.

The new data, in a study authored by Jennifer Shinall, an assistant professor of law at Vanderbilt Law School, looks not only at differences in salary, but at the types of work fat women are likely to get. Every pound gained lowers the likelihood of a woman working in higher paying jobs that involve interaction with the public, or other forms of personal communication. And if they do get these jobs, they earn an average of 5% less than their average-weight counterparts anyway.

My own personal experience supports the notion that fat women can get a raw deal when job-hunting. I’ve been on lots of job interviews in my adult life, and even as a person with an unusual amount of self-confidence and a meticulous sense of personal presentation, I have lost count of the number of instances in which I have experienced this bias myself. I have had lengthy phone interviews with would-be bosses who seemed barely able to contain their certainty that I was the right person for the job — and then seen their faces perceptibly fall when I appeared in their office. I have met hiring managers who were all too willing to trade social exchanges about music and favorite restaurants during introductory chats via phone and email, but who turned aloof and distant once we were face to face, calculatedly avoiding eye contact, and in some instances plodding through the formal interview as though it was merely a hurdle to be overcome, so they could send me home and very shortly dispatch an email informing me that although they are grateful for my time, I am not the right fit for their needs.

I can’t say I’ve ever regretted missing out on the job in these cases; why would I want to work for an employer so put off by my appearance? I think of it as dodging a particularly slow-moving bullet. Nor do I think that these hiring managers were all horrible callous superficial fat-loathing monsters. Indeed, odds are strongly in favor of their not even being fully conscious of their bias, because it is often the nature of biases to be invisible to those they influence, and a revulsion toward fat people sure seems like a normal cultural response, unless you’ve ever been moved to think critically about it.

But it is frustrating to know — even if I can never prove it — that I was dismissed not because of a lack of expertise or capability, but because I did not look the way that person expected me to look. That is an injustice that you can get used to knowing, but it never stops being a tough thing to accept.

I’d wager that it is the same bias that likely keeps fat women as a group out of more interaction-heavy jobs, as the Vanderbilt study found; it’s likely that many businesses would prefer not to have a fat woman as a public-facing employee representing them to clients or customers, out of a fear that a fat woman’s appearance will suggest certain negative connotations that said public may then negatively associate with the business itself.

And despite the conventional wisdom that describes fat people as lazy humps — the same wisdom that leads hiring managers to assume that a fat job candidate will work less hard than a thinner one — Shinall found the opposite was true in occupational reality.

“As a woman becomes heavier she is actually more likely to work in a physical activity occupation. So morbidly obese women are the most likely to work in a physically demanding occupation,” says Shinall. “Physically demanding positions are healthcare support (nurse’s aides or home health aides), healthcare practitioners (such as registered nurses), food preparation and childcare.”

That’s right: the fatter a woman is, the more likely she is to be working in a physically taxing, extremely active job in which she is on her feet for most of the day, while thinner women are more likely to be chilling behind a desk and getting the bulk of their daytime exercise walking to and from the vending machine. Incidentally, this is also true of the fattest men, according to the study, which states that, “Obese men… are more likely than normal-weight men to work in jobs that emphasize all types of physical activity—including strength, speed, and stamina.” Figure that one out, society.

Ultimately, the purpose of Shinall’s research is to analyze these issues from a legal perspective.

All of this data, says Shinall, could set the stage for some very interesting legal strategies on behalf of overweight women in the coming years. While some women have sought protection from discrimination under the Americans with Disabilities Act, Shinall notes that the fact that many obese women do just fine in physically demanding jobs suggests that may not be the perfect tool.

Title VII of the Civil Rights Act, on the other hand, opens the door to something different: a “sex plus” claim, based on a company’s unequal treatment of men and women facing precisely the same circumstances (such as a refusal to hire someone with preschool-aged children). It’s this law that female flight attendants used to defeat formal weight limits.

Weight discrimination has always been nearly impossible to prove in court, in part because there are so few laws in place explicitly making it illegal. Only the state of Michigan includes weight alongside race, sex, religion, and other protected categories in its employment discrimination law. On a local level, Santa Cruz, CA; Washington, DC; Binghamton, NY; Urbana, IL; Madison, WI; and San Francisco, CA all have laws on the books outlawing discrimination based on weight. But that’s it. In other places, even if by some feat you are able to prove weight discrimination beyond a shadow of a doubt, you then also have to prove that the discrimination is itself against the law — by citing the ADA or other legislation and trying to make it fit your situation.

But employing the ADA as a legal grounds against weight bias has long been controversial, since many able-bodied fat people are reluctant to align obesity with disability as a universal fact. Some fat people have disabilities related to their weight, but there are also a great many who do not, and including able-bodied fat people under the ADA could dilute the ADA’s purpose, especially considering that the biases and challenges disabled folks face trying to live in the world are often very different than those experienced by fat people. We need not conflate the two ideas for each to be important and worthy of recognizing on its own merits.

Using Title VII as a basis for protection would identify weight bias as a matter of sex discrimination, a claim bolstered by Shinall’s findings (and those of many other studies) that fat men do not experience the same pay discrepancies based on weight. Fat bias in the workplace — at least when it comes to a paycheck — is a consistent and severe problem for women at all levels of employment, background, and education, which ought to make gender the central feature. As Shinall notes, in this light, it would make little sense to approach this as a matter of physical ability or accommodation.

The simple fact is, many of us don’t want to believe that weight discrimination in the workplace is real, no matter how many studies and statistics come out demonstrating it. Or, worse yet, we believe it may be real, but that it is acceptable and correct. Rather than acknowledge the injustice of fat women being paid less and given fewer opportunities because of how they look, it is easier to assume that they are, intrinsically, worth less as employees. Such an assertion comes naturally in a culture that treats fat women as though they are likewise worth less as human beings. And so, the argument goes, this is not discrimination, but rather the appropriate way of doing business.

Fat bias is a pervasive issue with measurable effects on people’s lives, and not simply a cosmetic concern, or a source of occasional public discomfort. From public insults, to difficulty securing quality and supportive healthcare, to staggering pay discrepancies, it is not a minor problem, but one that affects millions of people to varying degrees, many of whom don’t even realize they are being hurt by it, because they believe they are getting only what they deserve. And that’s the great tragedy of weight discrimination — the fact that so many people experiencing it believe that it is justified, and not something they are entitled to resent or to fight. As long as that cycle keeps churning, change is going to come slowly indeed.

Lesley Kinzel is Deputy Editor at xoJane.com.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME language

7 English Words You’d Never Guess Have American Indian Roots

Dictionary
Getty Images

English speakers owe Algonquian speakers many thanks

The Pilgrims had plenty of thanks to give the Wampanoag Indians in 1621, around the time they had a certain special meal you might have heard of. Members of that American Indian tribe had been essential to the early settlers’ survival, teaching them which crops to plant and how to fish.

Modern-day English speakers, who are about to gorge themselves on sweet potatoes and napping this Thanksgiving, might not know that they have a smaller joy for which to give thanks: the many words that English adopted from American Indian languages (or at least may have). These are words beyond the ones you learned in elementary school, like moccasins or powwow, as well as the Mayflower-sized pile of place names derived from American Indian words, including the names of about half the states. Here are some that should at the least make good conversation if you and your distant aunt run out of things to talk about over second helpings.

moose (n.): a ruminant mammal with humped shoulders, long legs, and broadly palmated antlers that is the largest existing member of the deer family.

Moose comes from the New England Algonquian word for that animal: moòs. Algonquian describes a family of about three dozen languages spoken by American Indian tribes, like Arapaho and Cree. One of the first known English-speakers to use the word moose was Captain John Smith, who recounted the creatures in his 1616 writings about the New World.

Yankee (n.): a nickname for a native or inhabitant of New England, or, more widely, of the northern States generally.

Yankee, that word the redcoats used to use to mock American doodles who thought they were fancy because of their feathery hats, is of uncertain origin. But one of the earliest theories is that the slang comes from the Cherokee word eankke, meaning slave or coward. In 1789, a British officer said Virginians used that word to describe New Englanders who sat out during war with the Cherokees.

raccoon (n.): a small North American animal with grayish-brown fur that has black fur around its eyes and black rings around its tail.

Our word for what may be the most adorable cat-sized, trash-eating creatures in America comes from a Virginia Algonquian language. In a book about animals written two years before the United States declared independence, the author noted that the raccoon was also sometimes called the “Jamaica rat, as it is found there in great abundance, playing havoc with everything.”

squash (n.): any of various fruits of plants of the gourd family widely cultivated as vegetables.

Squash is a shortened form of what the Narragansett, an Algonquian-speaking tribe from what is now Rhode Island, called that food: asquutasquash. In the 1600s, English speakers used a closer (and now obsolete) derivative: squanter-squash. And they described the squanter-squash as a cake, bread and “kind of Mellon.” Though today considered a vegetable in cooking, the squash is technically a fruit, even if it seems too starch-like to be in the same family.

toboggan (n.): a long, light sled that has a curved front and that is used for sliding over snow and ice.

Early French settlers in North America took the Algonquian word for this vessel and made it tabaganne, and that became the English toboggan. The northern neighbors of the tribes who used this word, Alaska Natives like the Inuit, gave English words too, like kayak and husky.

skunk (n.): a North American animal of the weasel kind, noted for emitting a very offensive odor when attacked or killed.

As you’ve probably noticed, there is more than one animal on this list. Encountering new creatures, English speakers had no words of their own for them and so naturally adapted names from the hundreds of American Indian languages already being spoken in the country. Skunk comes from the Abenaki tribe’s name for this potent weasel: segankw.

caucus (n.): a private meeting of the leaders or representatives of a political party.

Like Yankee, the exact origin of this word is unknown. But a possible derivation is from an Algonquin word cau′-cau-as′u, meaning one who advises, urges or encourages. That word has its own roots, according to the Oxford English Dictionary, in words meaning “to give counsel” and “to urge, promote, incite to action.” American Indian names, the OED notes, were commonly taken by clubs and secret associations in New England.

And here is an eighth word, which you should consider a bonus feature that probably doesn’t have American Indian roots at all, though people in the past have argued that case.

OK (adj., int.): all right; satisfactory, good; well, in good health or order.

The lexicographers at the Oxford English Dictionary do not give a definite origin of this word. They do say it “seems clear” that the heavy favorite theory (O.K. being an abbreviation of “oll korrect,” a play on “all correct”) is true. But they still list competing, underdog origin stories, including the idea that “O.K. represents an alleged Choctaw word” okii, meaning “it is.” The Choctaw may have actually used the word as a suffix to mean “despite what you are wrongly thinking,” as in, “I did too remember to turn the oven off, okii.” It’s an interesting story that would connect well with passive-aggressive uses today. But if you find yourself with free time this holiday, you might peruse the whole history written to support the prevailing theory.

TIME society

Let’s Give Thanks and Stop Whining This Holiday Season

world
Getty Images

The world has never been better

In her classic reinterpretation of Western history, The Legacy of Conquest, Patricia Nelson Limerick writes of an entrepreneurial young man in St. Louis eager to get in on the 1849 Gold Rush. He hands over $200 to a fledgling carrier called the Pioneer Line that promises comfortable and speedy (60 days, give or take) coach service to the coast, food and drinks included.

Things don’t go as smoothly as advertised. Passengers have to walk part of the way on account of the wagons being overloaded and the ponies being spent. Many travelers die on the voyage, some early on from cholera, others later from scurvy. Once in San Francisco, the young man finds he is too late to the party, struggling to make a living taking menial jobs and distraught that his letters back home take up to six or seven months to arrive.

I can relate. On a recent flight out West, for which I also plopped down about $200, we were ordered off the plane after we’d already boarded on account of some mechanical issue, then were forced to wait while United reassigned another aircraft. We made it to Phoenix a full three hours late. I don’t think any of us picked up scurvy along the way, but things were pretty rough: The onboard Wi-Fi wasn’t working, the flight attendants ran out of Diet Coke, and the guy in front of me reclined his seat so that it nearly touched my knees. Cross-country travel remains ghastly, even when you have a book as good as Limerick’s to entertain you.

You see where I’m going with this, don’t you? That’s right: Let’s stop whining already — at least for this holiday season. We’re so spoiled we can’t really relate to how bad previous generations had it.

The “good old days” are a figment of our imagination. Life — here, there, everywhere — has never been better than it is today. Our lives have certainly never been longer: Life expectancy in the U.S. is now 78.8 years, up from 47.3 years in 1900. We are also healthier by almost any imaginable measure, whether we mean that literally, by looking at health indices, or more expansively, by looking at a range of living-standard and social measures (teen pregnancy rates, smoking, air-conditioning penetration, water and air quality, take your pick).

And in the rest of the world, the news is even better. Despite all the horrors in the headlines, fewer people are dying these days in conflicts, or from natural disasters, than in the past. The world has its obvious geopolitical divides, but a nuclear Armageddon triggered by the reckless hostility of great powers doesn’t loom large as a threat, as it did not long ago. Most impressive of all, the number of people in the world living in dire poverty has been cut in half since 1990, fulfilling a key U.N. development goal that once struck many as unrealistic. With infant mortality rates plummeting and education levels rising all over, people are having fewer kids and taking better care of them. In most of the world, the new normal is to send girls to school along with their brothers, an accomplishment whose significance, development experts will tell you, cannot be overstated.

Even as Americans, we don’t have to compare ourselves to our 19th-century forefathers — or to the Pilgrims — to appreciate how life has become better. Things have improved drastically in our own lifetimes. Remember how unsafe cities were not long ago? How we used to smoke on airplanes? How our urban rivers used to catch fire? Reported violent crimes in the United States are down by half since 1993. And consider how much more humane our society has become. We still suffer from inherited racial, gender, and other biases in our society, but to a far lesser extent than in the past; the bigoted among us are finding less and less acceptance. We’ve adopted a default tolerance of others’ choices and values — think of the revolution in attitudes toward gays over the course of one generation. Americans’ ability to pursue happiness as we see fit has never been greater.

And when it comes to how we communicate, entertain, and learn from one another, we might as well live in an alternate universe to the one we inhabited as recently as the 1980s. Today more than 70 percent of homes have broadband connectivity, and more than 90 percent of American adults have a cellphone. (Remember rotary phones?) If you ever feel bored, your 1980s doppelganger should appear before you in the middle of the night—and just slap you.

And no, growing inequality is not the hallmark of our era. On the contrary, when you look at the human community as a whole, the present time will be remembered for the expansion of the global middle class, and the democratization of living and health standards that once were the privileged birthright of only the wealthiest societies. A few months back I sat through a riveting presentation by Steven Rattner on rising inequality in the U.S., but his most telling slide (arguably undercutting the rest of his talk) was his last, entitled “to end on an optimistic note,” which put the issue in a global context. It showed that most of the world’s workers (as opposed to the middle classes in the most developed countries) have seen their incomes rise in the last 30 years at a rapid clip, much like what’s happened to the super rich.

No one has done more to propagate the notion of a “great convergence” of living standards in the world than the charismatic Swedish development economist Hans Rosling. Go online this holiday season and check out his dynamic graphs that chart countries’ life expectancy and incomes over time since the Middle Ages. They will make you smile, and be thankful. His graphs make humanity look like a flock of birds taking flight, with the U.S. and Europe leading the way, but the others following, tentatively at first, then more assuredly.

So why, if life is better all around, do we whine and complain endlessly as if we live in the worst of times? The answer is: Our success allows us to constantly update our expectations. When my flight is three hours late and the Wi-Fi is busted, I couldn’t care less what it took to cross the country in previous centuries. We are all prima donnas that way. Even in China, young middle-class consumers whine as well, instead of counting their blessings that they didn’t suffer through Mao’s Cultural Revolution.

I’ll concede, very grudgingly, that all this whining can be a good thing. As Yuval Noah Harari, the author of Sapiens: A Brief History of Humankind, has written, we’re hard-wired to be disgruntled. It’s the only way we achieve progress. Evolution requires us to demand more and better, all the time.

Otherwise, we would have given each other high-fives when life expectancy reached 50 and a cross-country journey took just two months—and that would have been that. Still, suspend your whining for a moment this holiday season. Let’s appreciate how far we’ve come.

Andres Martinez is editorial director of Zocalo Public Square, for which he writes the Trade Winds column. He is a professor of journalism at Arizona State University.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME society

5 Great Reasons to Celebrate International Buy Nothing Day Instead of Black Friday

Wallet
Getty Images

Staying home on Friday means saving 100%

This story originally appeared on xoJane.com.

The day after Thanksgiving, Black Friday, is also International Buy Nothing Day, a day to take a break from buying to question status quo capitalism and consider how we might participate in a more sustainable economy. There are many reasons to participate in International Buy Nothing Day, but in this modern, short-attention-span era, I’ll give you five, starting with the personal and working my way out.

1) It’s good for your wallet.

When I was a little kid, my grandmother would come home and proudly announce how much she had “saved” while she was out shopping. My grandfather would predictably, jokingly reply, “Vivie, you could have saved 100% by staying home.” It wasn’t that he didn’t appreciate her efforts and he wasn’t trying to be a jerk; he was reminding her that he didn’t need a lot of stuff to feel successful. As corny as his humor sounds, it’s true — staying home on Friday means saving 100%.

Evolutionarily speaking, we are hunter/gatherers; it is super hard to fight the primal urge to gather on Black Friday, to source necessary items for our families for a bargain. Finding a great deal can feel like stumbling upon a field of berries, and you have to fight every genetic urge to turn around and walk in the opposite direction. If you need more proof that Black Friday isn’t the most economical way to gather for your family, consider that multiple sources have indicated that “deals” on Black Friday are often no better than what you can find at other times of the year and, in some cases, not as good as those in the last couple weeks before Christmas, or in the days immediately after. Do you really want to brave the traffic and crowds for a mediocre deal?

Economists use the term “opportunity cost” when talking about how we choose what to do with our time. The theory goes that with each chunk of time, the logical choice is to select the activity with the highest financial or personal benefit.

In short, we should choose the activity that makes the most money (or provides the most happiness) with each chunk of time we have. This is a way to evaluate the costs associated with free time. For example, when considering camping out in front of a store for two weeks, you ought to ask yourself if the amount of money saved (on items to be bought) or the happiness accrued (by camping out) will be greater than the amount of money or happiness that could be had doing something else in those two weeks. If the amount saved is not larger, then camping out has a very high opportunity cost.

2) Quantity is not the same thing as quality.

In a tough economy, thoughtful spending is more important than amount of spending.

It’s generally assumed that spending helps the economy. When consumer confidence is up and people are buying, demand for goods goes up. Demand goes up, more production is required, more workers are needed. But in reality, there’s a big difference between raiding the $1 bin at a mega-chain and spending $30 at a local gift shop or restaurant.

Reports show that when you spend $100 locally, $45 stays in the local economy — that’s as compared to about $13 when you spend the same $100 at a big chain store. How, you ask? Good question.

Just like individuals, local businesses also need goods and services. Some of these goods and services include: banking, tax prep, cleaning, maintenance, catering, office supplies, uniforms, etc. — all of which are more likely to come from other local companies in the area than from chain stores (who have centralized these services to save costs and support a national store base). When you shop at a local store you are supporting the owners of the business, the workers at that store, AND all the other local businesses who provide services to that store.

At least one study in San Francisco showed that a spending shift of 10% (putting 10% of spending money toward local businesses rather than chains) could create 1,300 jobs for the Bay Area. Keep in mind, that’s a 10% shift — it’s not an all-or-nothing proposition; that shift still leaves you 90% to spend at big chain stores, while you make a difference locally.

Black Friday encourages a mindless sort of consumerism. The idea is to show up at the mall and buy, buy, buy because everything is on sale. When you heed the hype of Black Friday, you’re less likely to scrutinize prices and recall that they aren’t much better than those Labor Day sales a couple of months back, and you’re also less likely to weigh the impact of the dollars spent at the mall that could be better spent elsewhere.

3) Solidarity with retail workers.

This is perhaps the point getting the most news right now. Black Friday keeps creeping earlier and earlier, and this year it has finally crept all the way up to the morning of Thanksgiving. We are hearing so much about the poor workers being forced to work ON Thanksgiving, unable to enjoy the holiday with their own families, and that is a valid point; it’s hypocritical for anyone who considers Thanksgiving a “family holiday” to be shopping on Thanksgiving.

However, the issue is much larger than employees being forced to work on Thanksgiving and to give up spending the holiday with their families.

Scheduling is a major issue among retail workers that doesn’t get nearly enough press. In addition to being paid at/near an unlivable minimum wage by exploitative billionaire employers, and often working without employer-paid healthcare or paid sick time, retail workers have little to no control over their schedules. Increasingly, retail workers can expect to be working anytime; schedules are posted with limited notice and changed at the last minute; requests for time off are denied.

Forget Thanksgiving; imagine trying to schedule time with your family to be there for important events (birthdays, school concerts, sports games, parent-teacher meetings, doctor appointments) when your schedule is perpetually subject to change. It is not uncommon for employees to be fired for missing a single scheduled shift no matter the reason.

Boycotting Black Friday is a good first step toward standing in solidarity with workers but it is not the end of the work. For extra credit, consider year-round boycotts of stores that are especially and needlessly stingy with workers – both with wages and benefits.

4) Solidarity with American and international manufacturing workers.

In the U.S., thanks in large part to the historic and ongoing labor movement, workers have some basic protections: a 40-hour work week (beyond which we are required to receive overtime pay), child labor laws, and workplace safety laws. We take a lot of the basics for granted.

Don’t believe me? It wasn’t until 1998 that OSHA established rules for bathroom breaks, mandating: “… it is clear that the standard requires employers allow employees prompt access to sanitary facilities. Restrictions on access must be reasonable, and may not cause extended delays.” Yeah, up until 1998, workers had no protected right to use the bathroom. So what constitutes a “reasonable” restriction? In 2013, a plaintiff lost his case against his employer because he left his spot on the line three times in one shift. Ruling against him, the court declared:

While there is a clear public policy in favor of allowing employees access to workplace restrooms, it does not support the proposition that employees may leave their tasks or stations at any time without responsibly making sure that production is not jeopardized. In recognition of an employer’s legitimate interest in avoiding disruptions, there is also a clear public policy in favor of allowing reasonable restrictions on employees’ access to the restrooms.

So yeah, they have to let you go, but not whenever you want (or need) to go.

If you work somewhere that doesn’t track bathroom breaks, think about THAT the next time you duck into the can and sit down to enjoy a secluded game of Candy Crush. We’ve made a lot of strides, but the American factory — from poultry and meat-packing to the Amazon warehouse fulfilling your Cyber Monday order — can still be a really tough place to earn a living.

And, while American retail workers don’t have it so great, the folks across the globe making all our stuff have it considerably worse. From preventable factory fires in Bangladesh (the courts ruled that the deaths amounted to culpable homicide), to continued dangerous conditions in factories making all of our beloved Apple gadgets, to virtually every item of mass-produced clothing we buy. “Made in the U.S.A.” is not a guarantee that the person doing the making was treated fairly, but any other label is a near-certain guarantee of low wages or even wage theft and indentured servitude, grueling hours, unsafe conditions, sexual harassment, child workers, and other exploitative practices.

Unless you’re planning to live like the Amish, it’s virtually impossible to entirely avoid buying any fruits of exploitation, but you can help by buying less, buying only what you actually need, and shifting to ethical suppliers when you can.

5) You’ll be part of a cultural movement that re-thinks what it means to celebrate, to express love, to reward ourselves.

Last week a photo passed through my Facebook feed. It was a man holding a sign with this text: “Nothing says ‘I love you’ like cheap crap made in China by slave labor. Sold by a company owned by billionaires benefitting from corporate welfare, paying slave wages to employees kept from enjoying Thanksgiving with their families.” And that about sums it up for me.

It’s not that poor labor conditions exist only in China or that people only need time with their families on Thanksgiving. It’s that it matters how we spend money. It’s that it matters that there are real people behind each product we buy and that we have a moral responsibility to consider them when making each purchase.

A few years ago, I suggested to my family that we all stop buying stuff for each other for the holidays. After all, we’re all grown-ups and can buy things for ourselves. (Side note: I also wanted to encourage giving gifts when the moment struck rather than on a cycle of obligation, i.e. holidays, birthdays, anniversaries, etc.) I might as well have announced myself as the unreformed version of The Grinch. My family, my liberal, cares-about-the-welfare-of-others, have-all-held-really-crappy-jobs family, couldn’t fathom a holiday without a preponderance of THINGS.

Despite the familial disapproval, I still think that BEING with each other is more important than BUYING for each other. How much time were you planning to spend shopping on Black Friday? Between scouring the flyers, making a shopping list, driving to/from stores, parking, roaming the aisles, waiting to pay, paying the bills when they’re due — how many hours is that? 2? 5? 10? More than 10?

Why not use that time to make dates with family? Go out to lunch together, go to dinner, go to the movies. Invite each other over for dinner. Volunteer to watch each other’s kids. You get to spend time with little ones and you give the grown-ups a date night. If you live far away, make Skype or Facetime dates with each other. Or, if you’re old school, pick up the phone and have a conversation.

I can buy myself all the fancy hand cream, high-end gadgets, and skinny jeans I want. I can’t buy: a long walk with my partner and our dogs, a hug from my mom, an as-yet-untold family story from my Dad’s bottomless memory bank, a Skype date with my cousin and his family in which his daughter wants nothing more than to exchange silly faces with us, a plate of my aunt’s holiday cookies, a thoughtful hand-made card from my little sister, a shared trip to a nerdy con with my step-kid. That’s what’s on my wish list this year. What’s on yours?

P.S. Want to do more? If you want to do more than sit at home and buy nothing, you can do anything! From going to the mall to hand out free hugs, to cutting up your credit card, to bringing snacks and support to striking retail workers, to using everything you’ve learned watching “The Walking Dead” to organize a zombie walk.

Amy Mendosa wrote this story for xoJane.com.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.
