TIME psychology

Kanye West: Narcissist of the Day

Oh, sit down: Kanye in Sydney, where everyone must stand (Mark Metcalfe/Getty Images)

Jeffrey Kluger is Editor at Large for TIME.

Insisting that your audience members stand before you'll perform is just bad form—especially when some of them can't stand

Memo to Kanye West: The “O” in “standing O” doesn’t stand just for ovation; it also stands for optional. That’s worth remembering the next time you insist that your entire audience—every single one of them—stand up before you’ll even begin a song, especially if, as is often the case, there are people in that same audience who, you know, can’t stand up.

Precisely that unseemly scene played out over the weekend in Sydney, Australia, when West stopped his show and informed the crowd—who had, as is the custom, paid money to see him perform—that, “I decided I can’t do this song, I can’t do the rest of the show until everybody stands up.” There would, he allowed, be exceptions: “Unless you got a handicap pass and you get special parking and sh*t.”

So everyone stood up, except for two people who, as it turned out, did have “special parking and sh*t.” One was in a wheelchair; the other had a prosthetic limb, which initially did not stop the crowd from booing them and chanting, “Stand up, stand up, stand up,” as West egged them on. “This is the longest I’ve had to wait to do a song,” he griped. “It’s unbelievable!”

Finally, the woman removed her prosthesis and waved it over her head and West polled the people around the wheelchair-bound man: “Now if he is in a wheelchair, that’s fine. He in a wheelchair, there? Only if he’s in a wheelchair.” At last, the fabulously rich entertainer agreed to perform for the disabled audience members.

Yes, there is cellphone camera footage of this; yes, West surely knew there would be. And no, he didn’t give a fig.

This is, as I write in my book The Narcissist Next Door, the same Kanye West who famously interrupted Taylor Swift’s acceptance speech at the 2009 MTV awards to announce that Beyonce should have won the award; the West who responded on his blog to the B+ score Entertainment Weekly had given one of his concerts with this blast: “What’s a B+ mean? I’m an extremist, its either pass or fail! A+ or F-! You know what, f**k you and the whole f*****g staff!” And the West who had this to say (in the third person, of course) about, well, Kanye West: “I think what Kanye West is going to mean is something similar to what Steve Jobs means. I am undoubtedly, you know, Steve of Internet, downtown, fashion, culture. Period. By a long jump.”

West is hardly the entertainment world’s only raging narcissist. Indeed, it’s an industry-wide affliction. Narcissism is measured by the Narcissistic Personality Inventory, a 40-question survey with a theoretical bottom score of 0 and high score of 40. But only a few points either way can make a difference. The average American weighs in at about 15.5, depending on age, gender and a few other variables. Inmates convicted of violent crimes score from 21.5 to 23. Celebrities don’t fall far shy of those stratospheric highs, coming in at 18.27, according to one study of 200 stars by pop psychologist Drew Pinsky.

But just which kind of celeb you are makes a difference. Reality show stars—no surprise—top the list at 19.45, followed by comedians at 18.89, actors at 18.45 and musicians at 16.67. That last, comparatively low figure makes sense because, as University of Georgia psychologist Keith Campbell told me, “If you’re a musician, you’ve got to play in a band.” Subsuming the individual into the group—the me into The Who, say—is not something the most florid narcissists would permit.

The musician rule is less applicable, of course, if you’re an individual performer like Miley Cyrus, Justin Bieber or West, because you are the sole—or at least central—star on the stage. West’s star was surely tarnished by his stunt in Sydney—judging at least by the Internet blowback it’s received. But will he care? No he won’t. Will he change? Not a bit. Audiences, of course, could respond on their own, choosing to remain seated—or better yet, not showing up at all. Even a narcissist would notice an empty hall—and, worse, an empty till.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME History

FDR’s Polio: The Steel in His Soul


Disease can break a lot of people. As a new film by Ken Burns and an exclusive video clip show, it helped make Franklin Roosevelt

No one will ever know the name of the Boy Scout who changed the world. Odds are even he never knew he had so great an impact on history. It’s a certainty that he was carrying the poliovirus—but he may not have known that either, since only one in every 200 infected people ever comes down with the paralytic form of the disease. And it’s a certainty too that he had it in late July of 1921, when he and a raucous crowd of other scouts gathered on Bear Mountain in New York for a summer jamboree. So important was the event in the scouting world that it even attracted a visit from the former Assistant Secretary of the Navy and 1920 Democratic vice presidential nominee, Franklin Roosevelt.

This much is painfully certain too: somehow, the virus that inhabited the boy found its way to the man, settling first in his mucous membranes, and later in his gut and lymph system, where it multiplied explosively, finally migrating to the anterior horn cells of his spinal cord. On the evening of August 10, a feverish Roosevelt climbed into bed in his summer cottage on Campobello Island in Canada’s Bay of Fundy. He would never again stand unassisted.

Roosevelt’s polio, which struck him down just as his political star was rising, was supposed to be the end of him. The fact that it wasn’t is a self-evident matter of history. Just why it wasn’t has been the subject of unending study by historians and other academics for generations. This year, Roosevelt and his polio are getting a fresh look—for a few reasons.

October 28 will be the 100th birthday of Jonas Salk, whose work developing the first polio vaccine was backed by the March of Dimes. That organization, then known as the National Foundation for Infantile Paralysis, grew out of the annual President’s Birthday Balls, nationwide events to raise funds for polio research; the first was held on FDR’s 52nd birthday, January 30, 1934, early in his presidency. That initial birthday ball raised a then-unimaginable $1 million in a single evening, a sum so staggering Roosevelt took to the radio that night to thank the nation.

“As the representative of hundreds of thousands of crippled children,” he said, “I accept this tribute. I thank you and bid you goodnight on what to me is the happiest birthday I have ever known.”

This year too marks one more step in what is the hoped-for end game for the poliovirus, as field-workers from the World Health Organization, Rotary International, UNICEF and others work to vaccinate the disease into extinction, focusing their efforts particularly on Pakistan, one of only three countries in the world where polio remains endemic.

Then too there is the much-anticipated, 14-hr. Ken Burns film, The Roosevelts: An Intimate History, which begins airing on Sept. 14. It is by no means the first Roosevelt documentary, but it is the first to gather together all three legendary Roosevelts—Franklin, Theodore and Eleanor—and explore them as historical co-equals. It’s the segments about FDR and his polio that are perhaps the most moving, however—and certainly the most surprising, saying what they do about the genteel way a presidential disability was treated by the media and by other politicians in an era so very different from our own.

“We think we’re better today because we know so much more,” Burns told TIME in a recent conversation. “But FDR couldn’t have gotten out of the Iowa caucuses because of his infirmity. CNN and Fox would have been vying for shots of him sweating and looking uncomfortable in those braces.”

That’s not a hard tableau to imagine—the competing cameras and multiple angles, shown live and streamed wide. And what Americans would have seen would not have been pretty, because no matter how jolly Roosevelt tried to appear, his life involved far, far more pain and struggle than the public ever knew, as a special feature from the film, titled “Able-Bodied,” makes clear. That segment is not part of the broadcast; it is included only on the film’s DVD and Blu-ray versions, which are being released almost contemporaneously with the film, and it was made available exclusively to TIME.

Concealing—or at least minimizing—the president’s paralysis was nothing short of subterfuge, the kind of popular manipulation that wouldn’t be countenanced today. But it’s worth considering what would have been lost by exposing the masquerade that allowed FDR to achieve and hold onto power. Roosevelt, as the Burns film makes clear, was a man whose ambition and native brilliance far exceeded his focus and patience. It was a restlessness that afflicted cousin Teddy too, causing him to make sometimes impulsive decisions, like pledging in 1904 that he wouldn’t run again in 1908—an act he regretted for the rest of his life and tried to undo with his failed third-party presidential bid in 1912.

“Who knows what would have happened if Teddy had had the great crises Franklin had—the Depression and World War II?” Burns says. “I do know he was unstable and always had to be in motion. It fell to FDR, who could not move, to figure out a way to outrun his demons.”

George Will, in an artful turn in the “Able-Bodied” clip, observes that when the steel went onto Roosevelt’s legs it also went into his soul. That may have been true in FDR’s case, but it’s true too that suffering is not ennobling for everyone. Some people are broken by it; some are embittered by it. As polio nears the end of its long and terrible run, the things FDR achieved despite—even partly because of—his affliction remain nothing short of remarkable.


TIME psychology

Ray Rice—An Epic Narcissist

Run away now, Ray: The former Raven is paying a price for his self-regard (Baltimore Sun/MCT via Getty Images)

Criminality in professional sports has become an epidemic—and there's one psychological condition more to blame than any other

No one will ever get inside Ray Rice’s head except for Rice himself—and from the looks of things, it’s not a place you’d want to spend any time anyway. It’s up to the former Baltimore Raven himself to figure out how he arrived at the moral point that he was able—and, more troubling, willing—to cold-cock his then-fiancée across the face, causing her head to strike an elevator hand rail and knocking her utterly unconscious. Dragging her to the floor outside the elevator and using his feet to move her limp legs out of the way was a second level of inexplicable ugly.

But there are answers to be found in Rice’s profession—professional sports in general and the National Football League in particular, as I explore in my new book, The Narcissist Next Door. Pro athletes have increasingly earned a reputation as serial lawbreakers, and it’s a rep they come by rightly. The morning’s rap sheet has become a near staple in the sports pages, with regular reports of which athletes got picked up on which charges the night before—charges that range from DWI to weapons possession to drug possession to brawling to domestic violence to, in the most extreme cases, multiple murder.

It’s gotten so bad that the San Diego Union-Tribune has taken to posting a regularly updated NFL Arrests Database, with 719 entries so far, dating back to Denver wide receiver Rod Smith’s Jan. 24, 2000, bust for third-degree assault. For convenience, you can refine your search by date, team, position, name, incident and resolution. This should be parody; it’s not. And it’s all a part of the suite of narcissist behaviors too many athletes exhibit.

There is the look-at-me showboating of the athlete, the body-centric preening of the athlete, the entitlement of the athlete, the unaccountability of the athlete, the regal third-person self-reference of the athlete.

“I wanted to do what was best for, you know, LeBron James, and what LeBron James was gonna do to make him happy,” said, well, LeBron James, about his 2010 decision to leave the Cleveland Cavaliers for the Miami Heat—or, as he put it, “to take my talents to South Beach.”

“If they don’t sign me, sorry, but I must go. That’s what Carlos Zambrano thinks,” said, yes, pitcher Carlos Zambrano when he was in contract negotiations with the Chicago Cubs.

Narcissists come by their self-regard in a lot of ways. There’s heritability—with studies suggesting that genetics account for as much as 77% of the trait. There is, too, the so-called mask model of narcissism, in which overweening self-regard is in fact a masquerade concealing its exact opposite—a deep well of self-loathing or at least low self-esteem.

And then there is narcissism that is exactly what it appears to be—a toxic mashup of grandiosity, lack of empathy, and indifference to the rules. This is thought to be what’s behind the behavior of most reprobate athletes because the fact is, they are outside the rules. Spend your life being regularly feted by coaches and classmates, getting waved through college classes you may never even have attended, inking an eight-figure signing bonus just for agreeing to terms with a team for which you’ve yet to do a single day’s work, and why should you think you’re part of the same accountable community as everyone else?

Even when athletes misbehave and the hammer does come down, it’s often wrapped in velvet. Rice’s initial punishment was the NFL equivalent of a traffic ticket—a two-game suspension that would have stood had the latest video not seen the light. The Yankees’ Alex Rodriguez was suspended for the entire 2014 season for using performance-enhancing drugs, but what of it? He’s nearing the end of his career anyway, he’s fabulously wealthy, and he’s free to come back next season, never mind the fact that most fans—to say nothing of the Yanks themselves—would probably love to see the back of him. Even the athletes who do wind up booked and charged often get off easy—which is what millions of dollars to spend on a legal team will get you (see e.g. Simpson, O.J., double murder).

The answer, if there is one, begins with removing the velvet from the hammer. Quarterback Michael Vick served 21 months behind bars for running a dogfighting ring and killing the animals that didn’t please him, and while you may or may not think he ever deserved the chance he’s gotten to play again in the NFL, his sentence sent a powerful message—making it far likelier that the NFL has seen its last dogfighting ring. Rice may come back one day too—though surely not with the Ravens—or he may be finished for good. But in this case the unaccountable player was eventually—if too late—held to serious account.

Narcissistic players have only themselves to blame for their misdeeds. But the first important step in making sure they quit thinking they’re outside the rules is for people in authority to start applying them—and in ways that hurt.

TIME Family

Why Being Second Born Can Be a Royal Pain

Meet the family: It's that little one on the right you have to watch out for (John Stillwell/AFP/Getty Images)

An open letter to George's Number Two: regal or not, second-borns can get a rotten deal

Dear Pending Prince or Princess:

First of all, the other seven billion of us are just thrilled to hear the happy news that you’re on the way—in a gender yet to be announced and with a name yet to be determined. I realize you’ll have your hands full for the next several months doing things like, well, growing hands, so I don’t want to burden you with too much right now. But before long you’ll emerge into the world and meet your royal Mum and Dad—and guess what? You’ll have a royal big brother too.

I know, I know, sorry to break it to you. You were kind of hoping you’d be the first and, if it were at all possible to arrange it, the only. Well, welcome to the club, kid. From one Number Two to another, here’s a frank admission: it’s a lousy gig—except when it’s great.

Every first child will always be a family’s crown prince or princess, which is all the more relevant in your family because the whole crown thing is for real. As a rule, first-borns are more serious than later-borns; they work harder, are better students and their IQ tends to be about three points higher than that of second-borns. They are also much more inclined than later-borns to go into the family business—which, yes, in your case is kind of the whole point. You should get accustomed to hearing your brother and you referred to as “an heir and a spare,” which is a term you won’t understand at first, then you will, and will go on to loathe for the rest of your natural life.

There’s a reason all this is true—and in commoner families too, not just yours. Think of your clan not so much as just Royal Family, but as Royal Family Inc. Moms and Dads have a finite supply of hours, energy and money—though in some families (we’re not pointing fingers here) there’s a little more of the latter than in others. The point is, your parents pour all their resources into the first product to come off the assembly line (let’s call it, for example, George v. 1.0). By the time the next one rolls along (let’s call this one You v. 2.0) there’s no getting that early investment back. This is what’s known to business people as sunk costs, which you’ll learn about at Eton and Oxford and will later get to forget about because your exchequers and ministers will see to such things. The upshot is, in both a family and a company, sunk costs lead the board of directors (Mum and Dad in your case) to value the first product more than the second, whether they realize it or not.

This is an arrangement that suits that first product just fine, which is why big brothers and sisters tend to play by the rules. Your job—and the job of any littler royals who may come along after you—will be to try to upset that order. It’s why later-borns tend to be more rebellious and to take more risks than first-borns. You’ll be likelier to play extreme sports than big bro George. Even if you and he play the same sports, you’ll choose a more physical position—a baseball catcher, say, instead of an outfielder. (Baseball is…never mind. Ask someone in the royal court what the soccer and polo analogies are.) In the event you ever become Ruler of the United Kingdom of Great Britain and Northern Ireland and of other Realms and Territories around the world—and you’re fourth in line for the job, so don’t start getting measured for the cape yet—you’d be a more liberal, less conventional monarch than your big bro will be.

Later-borns are more inclined to be artists too, and if there is a comedian in the family, it’s likeliest to be the very last-born. This makes sense, since when you’re the smallest person in the nursery, you are at constant risk of getting clocked by someone bigger—sorry, no royal dispensation on that rule—so you learn to disarm with humor. You also may find you’re more empathic and intuitive than George, since you similarly have to know how to suss out what people are thinking in order to get your way—what scientists call a low-power strategy, rather than the big sib’s high-power one.

There are other perils that come with being a number two, not least figuring out ways to get yourself noticed, and it’s best to go about that one carefully. One day, ask your Uncle Andy about a special friend of his named Miss Stark—and if you really want to get a laugh, call her Auntie Koo. Ask Uncle Harry to show you pictures of his recent visit to a Las Vegas hotel. On second thought, don’t, but do remember that there is only a narrow window available to you for being photographed naked—you’ll get a grace period of about 12 months after you arrive. Uncle Harry exceeded that by a teensy bit.

The point is, you’ll have to figure out ways to be special, to make a difference, while staying off of TMZ and out of the tabs. The upside? Well, you know that thing about big sibs having a higher IQ? That’s because they mentor and look after the little sibs, which isn’t half bad (trust another Number Two’s word on this one too). And if more kids come along, you get to be the mentor, which is its own kind of wonderful. The downside? Then you’ll be a middle child. And I hate to tell you kid, but that gig stinks no matter who you are.

But all that comes later. For now, enjoy the quiet, brace for the noise, and travel safe.

–A Friend in the Colonies

TIME

Bird Watching: The Joys of Loving the Baltimore Orioles

The Sept. 11, 1964, cover of TIME

On the anniversary of Ripken's record-breaking game, a fan reflects

You may not have any interest in being a Baltimore Orioles fan. Indeed, most people have no interest in being Baltimore Orioles fans. It requires you to root for a mid-market team from a relatively small city with a middling record of success. It’s been 31 years since the Orioles were in a World Series (a cool 11,315 days if you’re counting, which Baltimore fans do, and which is one more reason you probably don’t want to be one of them). Then there’s the business of playing in the same division year after year, season after season, as the New York Yankees and the Boston Red Sox. Talk about getting your lunch money stolen.

But if you are an Orioles fan (and again, this is not something you should try at home), there are compensations. There is their blistering, and still ongoing, 2014 season, which saw them enter September with the biggest division lead and the second best record in baseball. How that will ultimately play out is something Orioles fans dare not even contemplate, since hot seasons can turn very cold very fast, and until something is done and in the history books it may as well not have happened at all.

Still, there is plenty in those history books already. There were the four World Series appearances and two championships from 1966 to 1971 — a sweet, buck-a-seat, pre-free-agent era, when little teams sometimes did very big things. There was the four-game sweep of the juggernaut Los Angeles Dodgers in that 1966 Series, in which the Dodgers scored a run apiece in the second and third innings of the first game — and then never scored again. A 5-2 Orioles win was followed by three straight shutouts. And then there was Earl Weaver — which is a phrase that does not require another word to be thrilling and riotous and make you so much happier to be alive than if you couldn’t say a phrase like “and then there was Earl Weaver.”

And there was too, 19 years ago this week, Cal Ripken, the Hall of Fame third baseman turned shortstop turned Iron Man turned legend, who on September 6, 1995, played in his 2,131st consecutive game without a rest — a 13-year run that broke Lou Gehrig’s record of 2,130 games, which up until that point had stood for 56 years. Ripken padded his record for three more years after that, finally taking himself out of a match-up against the Yankees on Sept. 20, 1998, after 2,632 games.

As TIME reported in that record-breaking week in 1995, in the 13 years Ripken had been showing up for work, doing his job, showering and going home, 3,695 other major leaguers had gone on the disabled list at one point or another — no fault of theirs surely, since the body breaks down, especially when it’s going through all the unnatural torquing and torsion a baseball player subjects it to. But Ripken’s body never broke down — or at least not so badly he couldn’t ice it, soak it, wrap it and then go out and play the next day. And if the very same injuries he was willing to play with were the ones that landed those other lunch-bucket players on the DL, well that was on their heads, wasn’t it?

Ripken played with thoroughbred blood in his veins — the son of Cal Ripken, Sr., a longtime Orioles coach and onetime manager; and brother of Billy Ripken, an infielder who played for four Major League teams during a career that included two stints with the Orioles. But it was Cal’s baseball success that seemed strangely fated. In 1972, when he was only 12, he was helping the team as a clubhouse boy during a training stint in Asheville, N.C., when a deranged local began shooting at the field. Then-shortstop Doug DeCinces spotted Ripken out in the open, scooped him up and hustled into the dugout with him. “I don’t think his feet touched the ground,” DeCinces said years later. In some ways they didn’t finally touch down until 1998.

TIME gave its nod to a few other Orioles in the long history of the team (60 years) and the magazine (91 years). There was the July 27, 1979, profile of Weaver, the Orioles’ manager — an issue that featured a somber-looking Jimmy Carter filling almost all of the cover, and a red-faced Weaver screaming from the corner flap. It was an image that perfectly captured the crowing, overachieving, look-at-me rooster that Weaver was — and that he made his teams.

There was, too, the Sept. 11, 1964 cover — 50 years ago this week — of manager Hank Bauer, who would take the Orioles to that memorable World Series two years later. The headline of the story was “Old Potato Face” and it opened with a scene of Bauer meeting with reporters in his stadium office, finishing a beer (“The heat don’t bother them,” he said of his team, “’cause they drink this here good beer”) and crushing the can. Then he walked off to the shower — stark naked.

That, too, is part of what it means to be a Baltimore Orioles fan. It ain’t pretty and it ain’t easy. But now and again — every 50 or 31 or 19 years or so — it can be awfully sweet.

Read the full 1964 article about Hank Bauer and the Baltimore Orioles here, in TIME’s archives: Old Potato Face

TIME Family

Why I Don’t Eat With My Kids

Who invited those two? The 'family dinner' ain't all it's cracked up to be (GMVozd/Getty Images)

The curative properties of the nightly family dinner have been greatly exaggerated

I love my daughters, I really do, more than I can coherently describe. I love my dinner hours too — not nearly as much, of course, but I’ve been on familiar terms with dinner for a lot longer than I’ve been on familiar terms with my children. Frankly, I don’t see much reason to introduce them to each other.

It’s not that my wife and I don’t eat with our daughters sometimes. We do. It’s just that it often goes less well than one might like. For one thing, there’s the no-fly zone surrounding my younger daughter’s spot at the table, an invisible boundary my older daughter dare not cross with touch, gesture or even suspicious glance, lest a round of hostile shelling ensue.

There is too the deep world-weariness my older daughter has begun bringing with her to meals, one that, if she’s feeling especially 13-ish, squashes even the most benign conversational gambit with silence, an eye roll, or a look of disdain so pitiless it could be sold as a bioterror weapon. Finally, there is the coolness they both show to the artfully prepared meal of, say, lemon sole and capers — an entrée that is really just doing its best and, at $18.99 per lb., is accustomed to better treatment.

All of this and oh so much more has always made me greatly prefer feeding the girls first, sitting with them while they eat and, with my own dinner not on the line, enjoying the time we spend together. Later, my wife and I can eat and actually take pleasure in the experience of our food. But that, apparently, is a very big problem.

We live in the era of the family dinner, or, more appropriately, The Family Dinner™, an institution so grimly, unrelentingly invoked that I’ve come to assume it has its own press rep and brand manager. The Family Dinner™, so parents are told, is now recognized as one of the greatest pillars of child-rearing, a nightly tradition you ignore at your peril, since that way lie eating disorders, obesity, drug use and even, according to a recent study out of McGill University, an increased risk of the meal skipper being cyberbullied.

O.K., there is some truth in all of this. Sit your kids down at the table and talk with them over dinner every day and you have a better chance of controlling what they eat, learning about their friends, and sussing out if they’re troubled about something or up to no good. But as with so much in the way of health trends in a gluten-free, no-carb, low-fat nation, enough, at some point, is enough.

For one thing, the always invoked, dew-kissed days of the entire nuclear family sitting down to a balanced, home-cooked meal were less than they’re cracked up to be. Ever hear of the Loud family? Ever watch an episode of Mad Men — particularly one that plays out in the Draper kitchen? Welcome to family dinner in the boomer era.

Much more important, as a new study from North Carolina State University shows, the dinner-hour ideal is simply not possible for a growing number of families. The researchers, a trio of sociologists and anthropologists, spent 18 months conducting extensive interviews with 150 white, African-American and Latina mothers from across the socioeconomic spectrum, and an additional 250 hours observing 12 lower-income and poor families to get at the truth of what’s possible at mealtime and what’s not.

The first problem, the moms in the study almost universally agree, is that it is always more time-consuming to prepare dinner than you think it will be. Michael Pollan, the ubiquitous author and food activist, has written, “Today, the typical American spends a mere twenty-seven minutes a day on food preparation, and another four minutes cleaning up. That’s less than half the time spent cooking and cleaning in 1965.” To which I say, huh? And so do the moms in the study.

“I just hate the kitchen,” said one. “I know I can cook but it’s the planning of the meal, and seeing if they’re going to like it, and the mess that you make, and then the mess afterwards.” Added another: “I don’t want to spend an hour cooking after I pick [my daughter] up from school every day.” All of that sounds a lot more familiar to me than Pollan’s rosy 27+4 formulation.

Even if prep time weren’t a problem, dealing with the scheduling vagaries of two-income households can require day-to-day improvisation that makes regular, predictable mealtimes impossible. One couple studied by the NC State researchers worked for the same fast-food company in different parts of the state. Both parents often didn’t know the next day’s schedule until the night before, which meant inventing dinner plans on the fly and often calling on a grandmother for help. That kind of scrambling is part of what the researchers describe as “invisible labor,” work that is every bit as much a part of dinner as preparing and serving the food, but is rarely acknowledged.

Finally, there is the eternal struggle of trying to prepare a meal that everyone at the table will tolerate — a high-order bit of probability math in which the number of acceptable options shrinks as the number of people who get to weigh in grows. “I don’t need it, I don’t want it, I never had it!” declared one 4-year-old in one observed household. Parents throughout history have dealt with that kind of reaction with all manner of wheedling, bargaining and here-comes-the-airplane-into-the-hangar games, to say nothing of one mother in the study who simply turned a timer on and told her child to keep eating until the buzzer sounded.

Again, none of these problems diminish the psychological and nutritional value of a family sitting down to eat a home-prepared meal together — but perhaps that meal should be an aspirational option, not a nightly requirement. The family-dinner ideal, the authors write, has become “a tasty illusion, one that is moralistic and rather elitist … Intentionally or not, it places the burden of a healthy, home-cooked meal on women.”

With that said, I shall now open some wine and grill my wife and myself some salmon. After all, the girls are in bed.

TIME animals

The Asian Camel Cricket and 10 Other Invasive Species You Might Not Know

TIME takes a look at species that have overstayed their welcome

This is the camel cricket. You hate it, don’t you? You should. Let’s start with the fact that it’s—how to put this nicely?—repulsive. Add the fact that it’s big, by bug standards at least, measuring up to two inches (5 cm) long; that it resembles a spider more than a cricket; and that it will eat nearly anything—including other camel crickets, which is just plain bad form.

Now add the fact that camel crickets are here. And by “here,” we mean everywhere. An Asian species originally, it has now turned up in more than 90% of camel cricket sightings across the U.S. It wasn’t as if we needed the import, thank you very much. The North American continent already had its own species of camel cricket. But the Asian variety arrived and appears to be crowding out the native species. There are, at current estimates, more than twice as many camel crickets of all species in America as there are actual Americans, with the bugs outnumbering us 700 million to 314 million.

In fairness, camel crickets don’t bite or pose any other particular threat to people. And since they’re scavengers, they also help keep ecosystems in balance. So really, we should be glad to have them–even welcome them, right? Nah. Sorry science, this time we’re going with our guts: camel cricket, here’s your tiny hat. Please go home.

TIME psychology

Hooray for the Mundane! Ordinary Memories Are the Best

Life's peak experiences sometimes pale in comparison with the routine business of living, a new study shows. That "what is ordinary now becomes more extraordinary in the future" can have some positive implications for our state of mind

Never mind those dreamy recollections of your fab trip to Rome or that perfect night out last Valentine’s Day. Want a memory with some real sizzle? How about that time last week you went out for a tuna sandwich with the guy in the next cubicle? Or that trip to the supermarket on Sunday? Hot stuff, eh?

Actually, yes. Ordinary memories, it turns out, may be a lot less ordinary than they seem — or at least a lot more memorable — according to a nifty new study published in the journal Psychological Science. And that can have some positive implications for our state of mind.

It’s not entirely surprising that the experiences we often think should have the greatest impact on us sometimes don’t. For one thing, we tend to expect too much of them. The first time you stand in the Colosseum or stare up at the Eiffel Tower is a gobsmacker all right, but while those moments nicely enhance your life, they typically don’t change it. What’s more, in the weeks and years that follow, we tend to rerun the memory loop of the experience over and over and over again. Like a song you hear too much, it finally becomes too familiar. To test how much we underestimate — yet genuinely appreciate — the appeal of our more mundane experiences, a group of researchers at Harvard Business School devised a multipart study.

In the first part, 106 undergraduate volunteers were asked to compile an online, nine-item time capsule that included such unremarkable items as an inside joke they shared with somebody, a list of three songs they were currently listening to, a recent status update on Facebook, an excerpt from a final class paper and a few recollections of a recent social event. They sealed the virtual capsule at the beginning of summer and were asked to predict how interested they’d be, on a scale of 1 to 7, in rereading each item when they reopened it a few months later, and how surprised they thought they’d be by the details of the contents.

After the students did get that opportunity at the beginning of the fall semester, they used the same 1-to-7 scale to rate how meaningful and interesting they found the items. On item after item, the interest, curiosity and surprise they felt was significantly higher than what they had anticipated three months earlier.

In the second part of the study, a different pool of participants did something similar, but this time wrote about a recent conversation they had, rated it on whether it was an ordinary or extraordinary one (what they had for dinner the night before, say, compared with the news of a new romantic interest), and predicted again how interested they thought they’d be about reading the description a few months down the line. Here too they wound up lowballing those predictions — finding themselves much more interested than they predicted they’d be. And significantly, the more mundane the conversation they described was, the wider the gap between their anticipated interest in it and their actual interest when they reread the description.

The third part of the study replicated the second, but this time used only volunteers who did have a romantic partner, and asked them to describe and anticipate their later interest in an ordinary evening the two of them had spent on or before Feb. 8, 2013, and the one they’d spent one week later, on Feb. 14. Here too the Valentine date did less well than the subjects expected compared with the surprise and pleasure they felt in reading about the routine date.

“What is ordinary now becomes more extraordinary in the future,” said lead researcher Ting Zhang, in a statement that accompanied the study’s release. “People find a lot of joy in rediscovering a music playlist from three months ago or an old joke with a neighbor, even if those things did not seem particularly meaningful in the moment.”

One way to correct this imbalance — to take more pleasure in the day-to-day, nothing-special business of living — is merely to try to be more cognizant of those moments as they go by. Another, say Zhang and her colleagues, is to document them more, either by writing them down or, in the social-media era, by sharing them. But there are limits.

“[T]he 5,000 pictures from one’s ‘extraordinary’ wedding may be excessive,” the researchers write. The same is true, they warn, about photo-documenting every plate of food that’s set in front of you rather than just getting down to the pleasurable business of eating it — a practice that they say is leading to “an unhealthy narcissism” growing society-wide. Recording our lives for the biopics that are constantly playing out in our heads is fine, but sometimes that has to give way simply to living those lives.

TIME findings

Why Scientists Should Celebrate Failed Experiments

No losers here: all data is good data (ilyasat/Getty Images)

Researchers live in dread of the null result—when a study turns up nothing. But that's exactly the wrong way to view things

Reporters hate facts that are too good to check—as the phrase in the industry goes. The too-good-to-check fact is the funny or ironic or otherwise delicious detail that just ignites a story and that, if it turns out not to be true, would leave the whole narrative poorer for its absence. It must be checked anyway, of course, and if it doesn’t hold up it has to be cut—with regrets maybe, but cut all the same.

Social scientists face something even more challenging. They develop an intriguing hypothesis, devise a study to test it, assemble a sample group, then run the experiment. If the hypothesis holds up, off goes the paper to the most prestigious journal they can think of. But what if it doesn’t? Suppose the answer to a thought-provoking question like, “Do toddlers whose parents watch football or other violent sports become more physically aggressive?” turns out to be simply, “Nope.”

Do you still try to publish these so-called null results? Do you even go to the bother of writing them up—an exceedingly slow and painstaking process regardless of what the findings are? Or do you just go on to something else, assuming that no one’s going to be interested in a cool idea that turns out not to be true?

That’s a question that plagues whole fields of science, raising the specter of what’s known as publication bias—scientists self-censoring so that they effectively pick and choose what sees print and what doesn’t. There’s nothing fraudulent or unethical about dropping an experiment that doesn’t work out as you thought it would, but it does come at a cost. Null results, after all, are still results, and once they’re in the literature, they help other researchers avoid experimental avenues that have already proven to be dead ends. Now a new study in the journal Science, conducted by a team of researchers at Stanford University, shows that publication bias in the social sciences may be more widespread than anyone knew.

The investigators looked at 221 studies conducted from 2002 to 2012 and made available to them by a research collective known as TESS (Time-Sharing Experiments in the Social Sciences), a National Science Foundation program that makes it easier for researchers to assemble a nationally representative sample group. The best thing about TESS—at least for studies of publication bias—is that the complete history of every experiment is available and searchable, whether it was ever published or not.

When the Stanford investigators reviewed the papers, they found just what they suspected—and feared. Roughly 50% of the 221 studies wound up seeing publication, but that total included only 20% of the ones with null results. That compared unfavorably to the 60% of those studies with strong positive results that were published, and the 50% with mixed results. Worse, one of the reasons so few null results ever saw print is that a significant majority of them, 65%, were never even written up in the first place.

The Stanford investigators went one more—very illuminating—step and contacted as many of the researchers of the null studies as they could via e-mail, asking why they had never written up or published their results. Among the answers: “The unfortunate reality of the publishing world [is] that null effects do not tell a clear story.” There was also: “We determined that there was nothing there that we could publish in a professional journal” and “[the study] was mostly a disappointing wash.” Added one especially baleful scientist: “[The] data were buried in the graveyard of statistical findings.” Among all of the explanations, however, the most telling—if least colorful—was this: “The hypotheses of the study were not confirmed.”

That, all by itself, lays bare the misguided thinking behind publication bias. No less a researcher than Jonas Salk once argued to his lab staff that there is no such thing as a failed experiment, because learning what doesn’t work is a necessary step to learning what does. Salk, history showed, did pretty well for himself. Social scientists—disappointed though they may sometimes be—might want to follow his lead.
