TIME A Year In Space

A Month Spent in Space—and 11 More to Go

.@FLOTUS Thank you. Made it! Moving into crew quarters on @space_station to begin my #yearinspace.
Scott Kelly—NASA Scott Kelly posted this selfie on March 30, shortly after his arrival at the space station, while moving into his living quarters, where he will sleep during his year in space.

The first 30 days of Scott Kelly's mission aboard the ISS are in the books

A year in space is marked in part by the holidays that will pass while you’re away. Christmas? Sorry, out of town. Easter? Ditto. Thanksgiving, New Year’s Eve, Halloween? Catch you next year.

It’s fitting, then, that the first holiday astronaut Scott Kelly spent in the just-completed first month of his planned one-year stay aboard the International Space Station (ISS)—which began with his launch from the Baikonur Cosmodrome in Kazakhstan in the early morning hours of March 29—was Cosmonautics Day. Never heard of it? You would have if you were Russian.

Cosmonautics Day celebrates April 12, 1961, when Yuri Gagarin lifted off from the same launch pad from which Kelly’s mission began, becoming the first human being in space. Kelly and his five crewmates—including fellow one-year marathoner Mikhail Kornienko—got the morning off on this year’s special day, taking the opportunity to enjoy the relative comforts of a spacecraft with more habitable space than a four-bedroom home. But in the afternoon it was back to work—following a moment-by-moment schedule that is scripted on the ground and followed in space, a regimen that, while often grueling, is the best way for astronauts and cosmonauts who have signed on for a long hitch to keep their minds on their work and keep the time from crawling.

Kelly’s first month was, in some ways, typical of the 11 that lie ahead. There was the arrival of a SpaceX cargo ship—a vessel carrying 4,300 lbs (1,950 kg) of equipment and supplies, including a subzero freezer that can preserve experiments at -112°F (-80°C)—that needed to be unloaded; new gear to aid studies of the effects of microgravity on mice; and a sample of so-called synthetic muscle, a strong but pliant material modeled after human muscle, to be used for robotic limbs and joints. Also tucked into the load was a less practical but infinitely more anticipated item—a zero-gravity espresso machine, dubbed the ISSpresso.

There are 250 experiments that must be tended at any one time aboard the ISS, but the most important of them will be Kelly and Kornienko themselves. The human body was built for the one-g environment of Earth, but if we ever hope to achieve our grand dreams of traveling to Mars and beyond, we’d better figure out if we can survive the rigors of zero-g. And that’s no sure thing. Almost every system in the body—circulatory, skeletal, cellular, visual—breaks down in some ways in weightlessness.

In their first month in space alone, the two long-termers submitted to a whole range of preliminary experiments that will track their health throughout their stay: their eyes are being studied to determine the kind of effect the upward shift in fluids caused by zero-g has on the optic nerve and the shape of the eyeball. Space physicians already know the basic answer: not a good one. But the hope is that Kelly and Kornienko will help provide ways to mitigate the damage.

Other biomedical studies in the first month include sampling saliva and sweat to test for bacterial levels and chemical balance; leg scans to determine blood flow; studies of blood pressure, which can fluctuate wildly when the heart no longer has to pump against gravity; analyses of throat and skin samples; bone density tests; and studies of the cells to determine why they change shape in zero-g. As exquisite serendipity has it, Kelly’s identical twin brother, Mark, is a retired astronaut, providing a perfect controlled study of how men with matching genomes and matching backgrounds react to a year spent in decidedly non-matching environments. Nearly all of the studies Scott submits to in space will be duplicated in Mark on the ground.

The eleven months ahead will not all be a Groundhog Day repetition of the first. Kelly will venture out on at least two spacewalks—the first of his four-mission career—and will help oversee a complex reconfiguration of the station, with modules and docking ports repositioned to accommodate commercial crew vehicles built by Boeing and SpaceX, which are supposed to begin arriving in 2017. There will also be movie nights and web-surfing and regular video chats, phone calls and emails with family. And the periodic arrival of cargo ships will provide such luxuries as fresh fruits and vegetables, which don’t last long in space, but don’t have to because six-person crews missing the comforts of home scarf them down fast.

The clubhouse turn of Kelly’s and Kornienko’s one-year mission will occur next December, the 50th anniversary of what was once America’s longest stay in space: the two-week flight of Gemini 7, which astronauts Frank Borman and Jim Lovell passed in the equivalent of two coach airline seats, with the ceiling just three inches over their heads. The ISS is a manor house compared to the Gemini. But the astronauts are still astronauts, human beings in a very strange place experiencing very strange things—in this case for a very long time.

TIME is covering Kelly’s mission in the new series, A Year In Space.

TIME baltimore

The Pain of Watching Baltimore Burn — Again

Anadolu Agency/Getty Images Fire fighters attempt to put out a building that was set on fire during riots in Baltimore, on April 27, 2015.

Jeffrey Kluger is Editor at Large for TIME.

The current violence is—and very much isn't—a repeat of the 1968 riots

I’ve seen this ugly movie before and I didn’t like it much that first time, either. I can’t pretend I was touched directly by the violence when Baltimore was last convulsed by riots, but I was close enough—a boy living in the green suburban ring surrounding Baltimore City during the days of violence that followed Martin Luther King Jr.’s assassination in 1968. I experienced the curfews and the lockdowns and the shuttered schools, the troop trucks on the streets and the olive drab National Guard planes flying low over our house, heading south to the airport where the soldiers would be disgorged and fan out into the city.

I experienced too Barry Oskar (not his real name; I use a pseudonym to spare his surviving family any unnecessary pain), the grandfather of a pair of children who lived on my block. Oskar owned a liquor store in the heart of the violence and announced one evening to family and neighbors that he had shot and killed the first black man to die in the rioting. I didn’t know if it was true or it was a boast — such was Oskar’s naked hatred of African Americans that he would count that a boast. But it had the ring of something he would do, and it wasn’t as if the half dozen killings that occurred in those violent days were going to be aggressively investigated anyway.

The rioting that broke out this week followed the funeral of Freddie Gray, the 25-year-old African American who died at the hands of a dysfunctional white police force that, since 2011 alone, has had to pay out $5.7 million to settle brutality claims. Superficially it felt like a 1968 redux, but the city of now is not the city of then. When I was born, Baltimore was a fault-line place, a town perched on the tipping point between north and south, between the Brown v. Board school desegregation ruling, which had happened not long before, and the voting rights and civil rights acts that were still years away.

The population was openly, racially stratified and businesses that reckoned they could get away with it operated with Jim Crow impunity. When our babysitter, a young African-American woman — as nearly all babysitters were in that time and that place — took my brothers and me to the Uptown Theater to see a revival screening of Pinocchio, the manager scowled at her as soon as we entered, summoned her over and whispered a few cross words.

She came back to us bearing what she gamely framed as the very exciting news that we were all going to get to sit in the balcony. That was fine with me — the balcony was indeed exciting — and it was not until we all got home, my grandmother got word of what had happened and called the theater to rip the bark off the manager that I suspected something more was going on. That something, she made clear to us, had been very wrong.

I didn’t know anything about the primal roots of racism at the time. I hadn’t done any of the reading I’ve done in the decades since about how the brain sorts all people into same and other, tribe and non-tribe, a highly adaptive behavior when we lived in the state of nature but decidedly non-adaptive now. I didn’t know either how powerfully color — of a flag or a uniform or a person — affects that sorting behavior, how easily nonsense like What color is the dress? can morph into What color is the person? And when the answer is black, and the person is young and male, terrible things can happen at the hands of the people who are supposed to be keeping the peace.

I’m glad I didn’t know those things back then, the last time Baltimore burned. Behavioral science can too often be used to make excuses. Laws are fine, but alas, the hearts and minds of men are too often fixed. Best to grow up without that dodge, to learn early on that hearts and minds are as mutable as custom, and that it is every culture’s — and every person’s — responsibility to turn from the dark to the light.

That’s as true of Baltimoreans as of anyone else, but too many people are unsurprised by the current violence. They’ve heard of the city’s stubbornly high murder rate — fifth in the country, behind only Detroit, Newark, New Orleans and St. Louis. Oh, and they’ve seen The Wire — so that pretty much seals things.

But Baltimore’s history and nature run far deeper — the tale of a harbor town, a steel town, a beer-brewing town, an immigrant town. Yes, parts of the city have hollowed out as parts of so many other cities did. And unlike many of those other cities, Baltimore has not enjoyed the same rebound, the same return to the urban core.

But in Baltimore as in the nation as a whole there have been tectonic shifts barely imaginable in 1968. When the lines of authority in a time of civil unrest run from a President named Barack Obama to Attorney General Loretta Lynch to Mayor Stephanie Rawlings-Blake—African Americans all—no one can pretend the world is the same place it was when Lyndon Johnson, Ramsey Clark and Tommy D’Alesandro filled those roles. Rawlings-Blake especially can speak truth to law-breaking power in ways that the white patriarchy never could.

“I’m at a loss for words,” she said angrily as the fires in her hometown flared. “It is idiotic to think that by destroying your city that you’re going to make life better for anybody.”

That idiocy will end. Riots always do when enough rage has been spent. And the question then will be what both the black and white populations of Baltimore, newly chastened, newly shaken, will do. If Rawlings-Blake can shame the looters, someone must also shame the police and their enablers. Otherwise the cycle will never be broken.

Barry Oskar did not live to see Baltimore’s latest burst of violence — indeed, he did not live terribly many years after the murder he liked to brag about. One day he was in his store, behind his counter, when an intruder entered with intent to rob. Oskar reached for his well-loved gun but the robber shot fast and first, and the mortal ledger was once again balanced — murder for murder, death for death, the entry made in the blood ink of the race wars. No more, Baltimore. Please stop. You’ve suffered enough.


TIME Television

Can Science Conquer Late-Night TV?

National Geographic Channels/Scott Gries Smart talk: Tyson and guests in a season one episode

A new talk show starring Neil deGrasse Tyson makes a play in an unlikely time slot

Carl Sagan had it easy. The famed astronomer, author and TV host, who died in 1996, may have mastered one of the most head-crackingly complex disciplines in all of the sciences, but when it came to explaining its mysteries to everyone else, he didn’t have to look hard for an audience. Space is intoxicating, Sagan was engaging and there just weren’t that many distractions that would keep people from tuning in to his Cosmos series or reading his books.

Not so today. With 500 cable channels and an infinity of Internet options all vying for attention, customers are harder to come by. And too many of the ones who are left are being picked off by the forces of informational darkness—the anti-vaxxers and climate change deniers and moon-landing hoaxsters all peddling their chosen rubbish.

It’s those challenges that astronomer Neil deGrasse Tyson, director of the American Museum of Natural History’s Hayden Planetarium and host of the 21st-century reboot of Sagan’s Cosmos, must face every day. He’s done an impressive job so far, wooing science lovers with his books and TV appearances and Star Talk radio show. Now he’s stepping up his game, bringing Star Talk to TV in a frank bid for the minds and eyeballs of science’s non-lovers, non-believers and can’t-be-bothereds too.

Everything about the new Star Talk—which premiered on April 20 on the National Geographic Channel with a first-season, 10-episode run already in the can—breaks the science show rules. It ditches the familiar format of smart guy prowling a set with zillions of special effects to make the technical stuff go down easy in favor of a talk show recorded before a live audience in New York City’s Rose Center for Earth and Space, home to the Hayden. His co-host is a rotating cast of comedians, and his guest is typically a non-scientist—Star Trek’s George Takei, sex columnist Dan Savage, Interstellar director Christopher Nolan, President Jimmy Carter.

Much more daring—or much more reckless, depending on how you look at these things—is the show’s time slot: Monday nights at 11 p.m. EDT, when both basic cable and the broadcast networks bring their comedic sluggers to the plate. But the audience for the likes of Jon Stewart and Jimmy Fallon is exactly the demo Tyson is after, even if he’ll have to work hard to win them.

“We recognize candidly that a scientist alone would not have served as a draw for the audience to come to the show,” he says. “The icon or the celebrity is the excuse to talk about the science and we try to blend that with the pop culture and the comedy.”

It’s a mix that can work in both predictable and unpredictable ways. No surprise if you find Nolan talking wormholes and time dilation and the other cosmological puzzlers that made Interstellar go. No surprise either if Takei talks about the science of Star Trek and how much of it has come true—or still could. But it’s refreshing that Carter does not have to talk about the Middle East peace process and instead can be allowed to show off the nuclear engineering cred he earned in the Navy. Something similar is true of Savage, who gets to talk sex within the guardrails provided by an academic anthropologist, a specialist in the neuroanatomy of love, who appears alongside him.

The premiere episode, with Takei, also featured a scientist, astrophysicist Charles Liu of the College of Staten Island, part of the City University of New York. Judging by that first installment, the show could use a little tweaking—or perhaps relaxing—with everyone on the panel working a little less hard to fill their assigned roles and instead just allowing the conversation to go wherever it wants to go.

Future episodes could push the envelope of the new format further—perhaps even until it rips. Tyson speaks openly about the possibility of having both Charlie Sheen and, discomfitingly, Jenny McCarthy on the show. “Sheen was in the museum and asked some very deep philosophical questions about the universe,” he says. “Jenny McCarthy had me on her show and has a very curious mind about the universe. If she brings up vaccines I’d be all over her, but only if it goes there.”

Guests like that may just be a ratings play. Nothing could make Sheen’s gyros go haywire like talking cosmology—and you wouldn’t want to miss that. But there may be tactical genius behind bringing on someone like McCarthy. Scientific know-nothingism is a modern scourge, and an inquisitive but academically rigorous interlocutor like Tyson might be the kind of person who can school McCarthy in the difference between finding things out and making things up.

“Is science a satchel of facts that is poured into your head or is it an understanding of emergent truths?” he asks—clearly knowing his answer. “Science literacy is not just what you know but how your brain is wired for thought. If you can achieve that, you never have to ask if the moon landings are real again.”

Tyson is willing to take a flier on the possibility that viewers in the playtime slot of late-night TV might be willing to invest an hour in getting smarter. If he doesn’t succeed, it may say more about us than him.



TIME space

See the 50 Best Images Taken by Hubble

After a quarter of a century on the job, the Hubble Space Telescope has returned some of the most extraordinary cosmic images ever captured

The best space machines reveal their purpose with a single glance. The gangly, leggy lunar module could only have been a crude contraption designed to land on another world. A rocket, any rocket, could only be a machine designed to fly—fast, high and violently.

And so it is with the Hubble Space Telescope—a bright silver, 43 ft. (13 m) long, 14 ft. (4.2 m) diameter cylinder, with a wide open eye at one end and a flap-like eyelid that, for practical purposes, never, ever closes. Since shortly after its launch on April 24, 1990, that eye has stared and stared and stared into the deep, and in the 25 years it’s been on watch, it has revealed that deep to be richer, lovelier and more complex than science ever imagined.

Hubble started off sickly: a long-awaited, breathlessly touted, $1.5 billion machine that was supposed to change astronomy forever from almost the moment it went into space. It might have, too, if its celebrated 94.5 in. (2.4 m) primary mirror—polished to tolerances of just 10 nanometers, or 10 one-billionths of a meter—hadn’t turned out to be nearsighted, warped by the equivalent of 1/50th the thickness of a sheet of paper. It would be three and a half years before a fix could be devised and built and flown to orbit and shuttle astronauts could set the myopic mirror right. And then, on January 13, 1994, the newly sharpened eye blinked open, the cosmos appeared before it and the first of one million observations the telescope has made since then began pouring back to Earth.

Some of Hubble’s images have become cultural icons—Pillars of Creation, the Horsehead Nebula. Some have thrilled only scientists. All have been mile-markers in the always-maturing field of astronomy. The fifty images that follow are just a sampling of the telescope’s vast body of work. Hubble still has close to a decade of life left to it. That means a great deal more work and a great many more images—before the metal eyelid closes forever.

TIME animals

Here Is the Biggest Reason You Love Your Dog

Getty Images

Never mind the petting or playing; it's all about the eyes

Humans are irrational in a whole lot of ways, but nothing quite compares to our love for our dogs. They provide us neither food nor conversation nor, in most cases, protection. What’s more, they cost us a fortune—a big share of the $60 billion Americans spend on all pets per year goes to the 70 million dogs living in 43 million U.S. households.

But never mind. Dogs and humans have created an improbable bond that is nearly as close as the one we share with our own kind. Now, a study in Science reveals one of the reasons the two species love each other so: the secret, it turns out, is in the eyes.

The average dog spends a lot of its time gazing at its owner adoringly, and owners—whether they know it or not—spend a lot of time gazing back. That’s very different from the way things work with other species—particularly the dog’s close cousin, the wolf—which typically use eye contact as a threat display or a means of domination.

To test the effect of the human-dog gaze, a team of researchers headed by Miho Nagasawa of Japan’s Azabu University conducted a pair of experiments, both of which involved the hormone oxytocin, nicknamed the cuddle chemical because it facilitates bonding in humans and many other species. Oxytocin levels skyrocket in people who are in love and in new parents, and breastfeeding blows the doors off the concentrations of the stuff in the mother’s blood and milk, which means it goes straight to the babies, making them feel the love too.

In the first part of Nagasawa’s study, urine samples were collected from 21 pairs of dogs and owners, both before and after experimental sessions in which the owners petted the dogs, talked to the dogs, and often simply gazed at the dogs. As a control group, 11 pairs of owners and hand-raised wolves also provided samples and also performed the interactions.

Consistently, the oxytocin levels of both the dogs and the humans were higher at the end of the sessions—and usually by about the same percentage for each owner-dog pair. But it was among the pairs in which there was a lot more gazing and a lot less touching and talking that the levels were highest—high enough to cross the threshold of statistical significance. None of this was true in the wolf-human pairs.

“The duration of the dog-to-owner gaze…significantly explained the oxytocin-change ratio,” the investigators wrote.

In the second experiment, the investigators similarly collected before-and-after urine samples from dog-human pairs. But this time, either oxytocin or an inert solution was administered to the dogs nasally before the interactions began. Each dog was then released into a room with its owner and two strangers, and though the dogs typically approached their owners and nuzzled them, the humans were instructed neither to talk to the dogs nor touch them back, but simply to meet their gaze.

Of all the dogs, the females that had received the oxytocin gazed at their owners most—and it was those females’ owners whose oxytocin levels were the highest afterwards. Female dogs, the researchers believe, are simply more susceptible to the effects of oxytocin than males—no surprise since they’re the ones who bear and nurse puppies. To the extent that the males were affected by the intranasal dosing at all, the impact might have been blunted by the mere fact that there were strangers in the room.

“The results of experiment 2 may indicate that male dogs were attending to both their owners and to unfamiliar people as a form of vigilance,” the researchers wrote.

Whatever the explanation for the dogs’ behavior, it’s clear that it works. It’s been many thousands of years since dogs climbed aboard the human caravan—guarding our campfires and protecting our livestock in exchange for food and a warm place to sleep. But as with all good friends, the relationship deepened, and as with all good friends too, the right chemistry—literally—is one of the reasons.

TIME Biology

Here’s Why You Have a Chin

Chev Wilkinson; Getty Images Gorgeous—and pretty much useless

Hint: You could do perfectly well without it

Nature is nothing if not parsimonious, especially when it comes to the human body. There’s a reason we don’t have webbed feet or nut-cracking beaks like other species, and that’s because we don’t need them. The system isn’t perfect, of course. If you ever wind up having painful abdominal surgery, odds are pretty fair that it will be your good-for-nothing appendix that’s to blame. And wisdom teeth seem a lot less wise when you consider how often they fall down on the job and need to get yanked.

As it turns out, the same why-bother pointlessness is true of what you might consider one of your loveliest features: your chin.

Researchers have long wondered what the adaptive purpose of the chin could possibly be. Sexual selection seems like an obvious answer, since an attractive chin increases your chances of mating. But a feature needs a function before it can appear in the first place. Only then can it be assigned some aesthetic value.

The other, better answer is all about chewing. The jaw exerts enormous forces when it bites and chews—up to 70 lbs. per sq. in. (4.9 kg per sq. cm) for the molars. Conscious clenching increases the figure, and people who grind their teeth in their sleep may exceed the average force 10-fold. What’s more, the jaw moves in more than just one axis, both chewing up and down and grinding side to side.

That, so the thinking went, might increase bone mass in the same way physical exercise builds muscle mass. And bone mass, in turn, may produce the chin. The problem with the theory, however, is that it doesn’t account for Neanderthals and other primates—including the great apes—which lack prominent chins but in many cases have far more powerful bites than we do.

To answer the riddle, Nathan Holton, a post-doctoral researcher who specializes in craniofacial structure at the University of Iowa’s school of orthodontics, selected 37 of the many subjects whose facial measurements have been taken regularly from age 3 to young adulthood, as part of the longstanding Iowa Facial Growth Study (yes, there is such a thing).

With the help of basic physics, it’s possible to determine how much force any one jaw exerts without the subjects’ ever having to be tested directly with a bite gauge. Measuring the geometry of what orthodontic researchers call the mandibular symphysis and what everyone else just calls the chin region, and comparing that to what is known as the bending moment arm—or the distance between where a force is initially applied (in this case the muscles in the jaw) and where that force is eventually felt (the chin)—yields a pretty good measure of force exerted.

“Think about removing the lug nuts from a wheel on your car,” Holton wrote in an e-mail to TIME. “The longer the wrench, the easier it is because the longer wrench increases the moment arm, allowing you to create more force.”
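To make the lever analogy concrete, here is a minimal worked example of the bending-moment arithmetic, using illustrative numbers of my own rather than measurements from the study:

\[ M = F \times d \]

\[ M_{\text{short}} = 300\,\mathrm{N} \times 0.05\,\mathrm{m} = 15\,\mathrm{N \cdot m}, \qquad M_{\text{long}} = 300\,\mathrm{N} \times 0.06\,\mathrm{m} = 18\,\mathrm{N \cdot m} \]

Lengthen the moment arm by 20% and the bending moment felt at the chin grows by 20% with no change in muscle force, which is why measuring the geometry alone lets researchers estimate load without ever reaching for a bite gauge.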

And more force, in this case, should mean more bone mass in the chin—but that’s not what the results of the new research showed. Not only did the two turn out to be unrelated in the 37 subjects studied, but Holton and his colleagues even found that as the face matures, the chin is less adept at resisting mechanical forces, which is the whole reason it was assumed to grow more pronounced in the first place.

So why did we grow chins at all? The answer is, we didn’t. Holton and his collaborator, University of Iowa anthropologist Robert Franciscus, instead suspect that the face shrank away from behind the chin as primitive and pre-humans became modern humans, making it appear larger relative to everything else. The reason, as with so many things in the human species, has to do with male behavior—specifically violent male behavior.

As humans migrated from Africa 20,000 years ago and settled down into societies, males had to become less competitive and more cooperative—giving an advantage to those with lower testosterone levels. And reduced testosterone softens and shrinks the craniofacial structure.

“What we are arguing is that modern humans had an advantage at some point to have a well-connected social network,” Franciscus said in a statement accompanying the study. “And for that to happen, males had to tolerate each other. There had to be more curiosity and inquisitiveness than aggression, and the evidence of that lies in facial architecture.”

It wasn’t until we had our chins that we set about assigning value to them—strong ones, weak ones, angular, round, cleft or dimpled, depending on your tastes. Those tastes—and the mating choices that arise from them—ensure that the chin will stay. It might be biomechanically useless, but you’d look awfully silly without one.



TIME psychology

Here’s What Happens in the Brain When People Kill

George Frey—Getty Images Pulling the trigger is hard—and that's very good

There's a lot of neuroscience and moral juggling behind the decision to take a life

Evil isn’t easy. Say what you will about history’s monsters, they had to overcome a lot of powerful neural wiring to commit the crimes they did. The human brain is coded for compassion, for guilt, for a kind of empathic pain that causes the person inflicting harm to feel a degree of suffering that is in many ways as intense as what the victim is experiencing. Somehow, that all gets decoupled—and a new study published in the journal Social Cognitive and Affective Neuroscience brings science a step closer to understanding exactly what goes on in the brain of a killer.

While psychopaths don’t sit still for science and ordinary people can’t be made to think so savagely, nearly anyone can imagine what it would be like to commit the kind of legal homicide that occurs in war. To study how the brain reacts when it confronts such murder made moral, psychologist Pascal Molenberghs of Monash University in Melbourne, Australia, recruited 48 subjects and asked them to submit to functional magnetic resonance imaging (fMRI), which could scan their brains while they watched three different scenarios on video loops.

In one, a soldier would be killing an enemy soldier; in the next, the soldier would be killing a civilian; and in the last, used as a control, the soldier would shoot a weapon but hit no one. In all cases, the subjects saw the scene from the shooter’s point of view. At the end of each loop, they were asked “Who did you shoot?” and were required to press one of three buttons on a keypad indicating soldier, civilian or no one—a way of making certain they knew what they’d done. After the scans, they were also asked to rate on a 1 to 7 scale how guilty they felt in each scenario.

Even before the study, Molenberghs knew that when he read the scans he would focus first on the activity in the orbitofrontal cortex (OFC), a region of the forebrain that has long been known to be involved with moral sensitivity, moral judgments and making choices about how to behave. The nearby temporoparietal junction (TPJ) also takes on some of this moral load, processing the sense of agency—the act of doing something deliberately and therefore owning the responsibility for it. That doesn’t always make much of a difference in the real world—whether you shoot someone on purpose or the gun goes off accidentally, the victim is still dead. But it makes an enormous difference in how you later reckon with what you’ve done.

In Molenberghs’ study, there was consistently greater activity in the lateral portion of the OFC when subjects imagined shooting civilians than when they shot soldiers. There was also more coupling between the OFC and the TPJ—with the OFC effectively saying I feel guilty and the TPJ effectively answering You should. Significantly, the degree of OFC activation also correlated well with how bad the subjects reported they felt on their 1 to 7 scale, with greater activity in the brains of people who reported feeling greater guilt.

The OFC and TPJ weren’t alone in this moral processing. Another region, known as the fusiform gyrus, was more active when subjects imagined themselves killing civilians—a telling finding since that portion of the brain is involved in analyzing faces, suggesting that the subjects were studying the expressions of their imaginary victims and, in so doing, humanizing them. When subjects were killing soldiers, there was greater activity in a region called the lingual gyrus, which is involved in the much more dispassionate business of spatial reasoning—just the kind of thing you need when you’re going about the colder business of killing someone you feel justified killing.

Soldiers and psychopaths are, of course, two different emotional species. But among people who kill legally and those who kill criminally or promiscuously, the same brain regions are surely involved, even if they operate in different ways. In all of us it’s clear that murder’s neural roots and moral roots are deeply entangled. Learning to untangle them a bit could one day help psychologists and criminologists predict who will kill—and stop them before they do.



TIME psychology

Why Narcissists Will Live Long if They Avoid Risky Business

Dougal Waters; Getty Images Loving the view: Looking great does not necessarily mean living well

A strange mix of living well and taking risks adds one more puzzle to the narcissistic personality

If you’re shopping for a personality disorder to call your own, you might want to avoid becoming a narcissist. It’s true that you’ll be confident, charismatic, extroverted and irresistible, but only until people discover that you’re also arrogant, self-absorbed, insensitive and unlovable. Now, one more contradiction in the narcissistic personality has been revealed. Even as narcissists take better care of themselves than nonnarcissists do — eating well and exercising regularly — they’re also likelier to engage in risky behaviors that could kill them before they can take advantage of those good habits.

That’s the conclusion of a new study by psychologist Erin Hill of West Chester University in Pennsylvania. Like most researchers studying narcissism, Hill knew there is much more nuance to the disorder than there seems to be. Narcissists are cocky, yes, but they’re hungry too — for recognition, applause, approval, validation. Their profound sense of insecurity also bumps up against a paradoxical sense of indestructibility — a belief that they are immune to the kinds of dangers most other people take pains to avoid.

To test the self-enhancing and self-destructive crosscurrents in the narcissistic temperament, Hill recruited 365 undergraduate students and asked them to take the Narcissistic Personality Inventory (NPI), a 40-item questionnaire that is considered the best available tool to diagnose the condition. NPI scores can range from a theoretical low of zero to a theoretical high of 40, but in the U.S. the average is about 15.5. The students in Hill’s study averaged a bit higher — 18.25 for males and 16.04 for females — which is typical for a population of young people who have yet to be chastened by setbacks in life.

Hill next asked her subjects to answer a number of questions about how they live — in both good ways and bad. In the first category, she asked them how many fruits and vegetables they eat per week, how consistently they maintain a healthy eating pattern overall, how often they exercise and whether they regularly practice safe sex. In the second category, she asked them if they smoke, how often and how much they drink, whether they use marijuana or other drugs, and whether they engage in reckless driving behaviors like texting behind the wheel or not wearing a seat belt.


The results were a mix of reasonably good news and very bad news. Narcissism did not seem to be linked to increased smoking, use of drugs other than pot or a greater likelihood of practicing unsafe sex — suggesting that some health messages are getting through even to people who typically think they’re above such concerns. But high NPI scores were significantly related to more drinking—as well as more binge drinking — greater marijuana use and reckless driving.

When it came to healthy behaviors, narcissists weren’t any likelier to eat more fruits and veggies than other people, but they were likelier to maintain a healthy diet overall. They were also significantly more inclined to play sports or otherwise exercise regularly.

Those good habits, while commendable, were not necessarily well motivated, Hill concluded — perhaps little more than part of the narcissist’s deep need to be the prettiest person in any room. If that means going to the gym and saying no to dessert, fine.

The happy news for the trim and toned narcissists is that good health habits can stick for life, while bad risk behaviors do tend to decline over time, as even the hopelessly self-adoring eventually discover that they’re not invulnerable to harm. Narcissism as a whole, however, is a much harder thing to shake — which leads to the final paradox of the narcissistic personality. All that working out and eating well may be perfectly fine, but it does you little good if the people you were trying to impress have long since quit having anything to do with you.

TIME psychology

7 Ways Your Mind Messes With Your Money

Mmmmmoney: Get a grip; it's just paper
KAREN BLEIER; AFP/Getty Images

Jeffrey Kluger is Editor at Large for TIME.

A new book shows the many ways money makes you crazy

If your brain is like most brains, it’s got an awfully high opinion of itself—pretty darned sure it’s pretty darned good at a lot of things. That probably includes handling money. But on that score your brain is almost certainly lying to you. No matter how much you’re worth, no matter how deftly you think you play the market, your reasoning lobes go all to pieces when cash is on the line. That is one of many smart—and scary—points made by author and J.P. Morgan vice president Kabir Sehgal in his new book Coined: The Rich History of Money and How It Has Shaped Us. Here, in no particular order, are seven reasons you should never leave your brain alone with your wallet.

Inflation? What’s that? You’re way too smart to think that if your salary doubles but the price of everything you buy doubles too you’ve somehow come out ahead, right? Wrong. In one study, volunteers were given the opportunity to win money that they could use to buy gifts from a catalogue. In later rounds, the amount they could win went up by 50% but so did the cost of all of the catalogue items. Nonetheless, their prefrontal cortex registered greater arousal after the staged inflation—even when they were warned before the study began that the purchasing power of their money would not increase. The implication: If a corned beef sandwich and a Coke cost $15,000 you’d still be thrilled to be a billionaire.
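The money illusion at work here reduces to one line of algebra: if your wage W and the price level P both double, your real purchasing power is unchanged.

\[ \frac{2W}{2P} = \frac{W}{P} \]

Yet the volunteers’ reward circuitry responded to the bigger nominal numbers all the same.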

Keep yer lousy money: Guess what! I’m going to give you $199. Nice, right? Oh, did I forget to mention that it comes out of $1,000 someone else gave me to divide up between us any way I see fit? In multiple studies, when it’s up to one subject to apportion a fixed amount and up to the other to accept it or neither one gets paid, more than half of recipients will reject anything less than 20% of the total. In other words, you’ll turn down a free $199 to deny me my undeserved $801. Your ego thanks you, your checking account doesn’t.

Losing feels worse than winning feels good: Here’s something the Vegas casinos don’t tell you: That high you get from winning $10,000 at the craps table will fade a lot faster than the what-was-I-thinking self-loathing that comes when you lose the same amount. To get people to wager $20 on a coin flip, researchers have found that they typically have to be given the chance to double their money; betting $20 to win, say, $35 just doesn’t cut it. That seems like good sense—but given the realistic shot you’ve got at winning, it’s also bad math.
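The bad math is a one-line expected-value calculation. Assuming a fair coin and the $35 payoff mentioned above:

\[ E = 0.5 \times (+\$35) + 0.5 \times (-\$20) = +\$7.50 \]

A wager that returns an average of $7.50 per flip is one a coldly rational bettor would take every time; holding out for double-your-money is loss aversion talking, not prudence.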

Simply the best: You know that store that opened on your corner that sold nothing but artisanal beets—the one that you knew would go out of business within a month and that didn’t even last two weeks? The owner totally didn’t see that coming. That’s called the overconfidence bias. The hard fact is, about 80% of new businesses are floating upside down at the top of the aquarium within 18 to 24 months—but nearly all entrepreneurs are convinced they’re going to be in the elite 20%. We bring the same swagger to playing the market and speculating in real estate—and to dancing at a wedding after we’ve had enough drinks and are convinced we’ve got moves. Watch the video later and see how that works out.

The hunt beats the kill: Never mind cigarettes and alcohol, if there’s one substance the government should regulate it’s dopamine—the feel-good neurotransmitter that gives you a little reward pellet of happiness when your brain decides you’ve done something good. The problem is, your brain can be an idiot. There’s far more dopamine released in its nucleus accumbens region—the reward center—when you’re anticipating some kind of payoff than when you’ve actually achieved it. That means expanding your business is more fun than running it and investing in the market is more fun than consolidating your gains. Those are great strategies—but only until the very moment they’re not.

I think therefore I win: I have a perfect three-step plan for winning the Power Ball Lottery: 1) I buy a ticket. 2) About 175 million other people buy tickets. 3) They give me all the tickets they bought. OK, failing that, the odds are pretty good that I may not be the person on TV who gets handed that giant check. But I play anyway thanks to what’s known as the availability heuristic. I think about winning, I see commercials with people who have actually won, I fantasize about what I’ll do with the money when I do win—and pretty soon it seems crazy not to play. The more available thoughts of something unlikely are, the more realistic it seems that it may actually happen. This is the reason there should always be a 48-hour cooling off period after you leave baseball fantasy camp and before you’re allowed to sell your house and try out for the Yankees’ farm club.
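For what it’s worth, the “about 175 million” figure matches the Powerball rules in effect at the time (5 white balls drawn from 59, plus one red ball from 35; the format has since changed):

\[ \binom{59}{5} \times 35 = 5{,}006{,}386 \times 35 = 175{,}223{,}510 \]

Those are the odds a single ticket faces, roughly 1 in 175 million, no matter how available the fantasy feels.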

Fifty shades of green: Perhaps the biggest reason we’re irrational about money is that we’ve come to fetishize not just the idea of wealth but the pieces of currency themselves. In one study, subjects counted out either actual bills or worthless pieces of paper of the same size, and then plunged their hands into 122°F (50°C) water. The ones who had handled real cash experienced less pain—effectively anesthetized by the Benjamins. Other studies have shown heightened brain activity when people witness money being destroyed, with the degree of neuronal excitement increasing in lockstep with the value of the currency. It’s money’s world; we’re just living in it.


TIME psychology

One Sign of Narcissism That Turns Out to Be All Wrong

Hachephotography—Getty Images/Flickr RF The mighty I: Narcissists aren't as easy to spot as you think

Even the most self-adoring people don't use a telltale pronoun any more than you do

There are a lot of enduring truths about narcissists: they’re grandiose, insensitive, entitled, greedy, sexually exploitative and morally indifferent. And they love, love, love to use the pronoun I—except, as it turns out, they don’t. All the narcissist’s other dubious qualities are very real, but the one about language—perhaps the most straightforward of all—appears to be a myth.

That’s the conclusion of a new study in the Journal of Personality and Social Psychology, and while the use of a single pronoun might seem like a small matter when you’re reckoning with a personality disorder as destructive as narcissism, it actually matters a lot. Overuse of the self-referential I was seen as a quick and dirty way for even lay people to diagnose the condition; now that tool appears to be useless.

The I marker has a long history in narcissism research. The original study that made the link was published in 1988 and found that among a sample group of subjects who took the 40-question Narcissistic Personality Inventory (NPI)—an all but universally accepted way of formally diagnosing the condition—people who scored higher indeed tended to utter I more. As I reported in my 2014 book The Narcissist Next Door, other studies in 2011 and 2012 tracked the use of first-person language in popular music from 1980 to 2007, and in literature from 1960 to 2008, and found both to be on the rise.

But there are holes in all of the research: the books and music may be consumed by narcissists drawn by their self-referential tone, but they’re not necessarily written by them, and there’s no telling how the listeners and readers themselves actually talk. What’s more, the 1988 study was a small one of just 48 subjects—a piece of work long considered too thin to go unchallenged.

So challenged it was, and in a very big way, with a research group headed by doctoral candidate Angela Carey of the University of Arizona’s psychology department surveying 4,811 different subjects, all of whom sat for one of 15 different experiments—writing essays or telling stories about their past, completing questionnaires, engaging in stream-of-consciousness conversation, offering their Facebook pages for analysis and more. The subjects also took the NPI. And the result?

Pretty much nothing. Among nearly all of the volunteers participating in all 15 studies, there was simply no there there, with narcissists and non-narcissists using the I pronoun with about the same frequency. Among males there was a slightly higher correlation between I use and narcissism than there was among females, but not enough to cross the threshold of statistical significance.

So what gives? Why was a feature of the condition that was considered so intuitively right so demonstrably wrong? Part of the reason, the researchers speculate, may be that narcissists radiate such confidence and cockiness that the expectation is that they use the I pronoun more even when they don’t. “Perceived I talk,” they wrote, “may be part of a perceptual schema of self-confidence or arrogance which, once activated, selectively…draws attention to a person’s use of the first-person singular.”

No less a person than President Obama—who likely ranks high on the NPI, as studies suggest most U.S. Presidents do—was a victim of this misconception. Early in his first term he was criticized for using far too much I talk, but ironically, the authors write, “Obama’s actual first-person singular pronoun [use]…put him at the very bottom of the distribution of modern U.S. Presidents; much lower, for example, than President G. W. Bush, Clinton and G. H. W. Bush.”

None of this mitigates the radioactive nature of the narcissistic temperament or the importance of avoiding narcissists’ contamination field. Indeed, it makes them even more dangerous if the Geiger counter we’ve been using for decades turns out to have been mis-calibrated all along.
