TIME psychology

Here’s What Happens in the Brain When People Kill

Pulling the trigger is hard—and that's very good
George Frey—Getty Images

There's a lot of neuroscience and moral juggling behind the decision to take a life

Evil isn’t easy. Say what you will about history’s monsters, they had to overcome a lot of powerful neural wiring to commit the crimes they did. The human brain is coded for compassion, for guilt, for a kind of empathic pain that causes the person inflicting harm to feel a degree of suffering that is in many ways as intense as what the victim is experiencing. Somehow, that all gets decoupled—and a new study published in the journal Social Cognitive and Affective Neuroscience brings science a step closer to understanding exactly what goes on in the brain of a killer.

While psychopaths don’t sit still for science and ordinary people can’t be made to think so savagely, nearly anyone can imagine what it would be like to commit the kind of legal homicide that occurs in war. To study how the brain reacts when it confronts such murder made moral, psychologist Pascal Molenberghs of Monash University in Melbourne, Australia, recruited 48 subjects and asked them to submit to functional magnetic resonance imaging (fMRI), which could scan their brains while they watched three different scenarios on video loops.

In one, a soldier would be killing an enemy soldier; in the next, the soldier would be killing a civilian; and in the last, used as a control, the soldier would shoot a weapon but hit no one. In all cases, the subjects saw the scene from the shooter’s point of view. At the end of each loop, they were asked “Who did you shoot?” and were required to press one of three buttons on a keypad indicating soldier, civilian or no one—a way of making certain they knew what they’d done. After the scans, they were also asked to rate on a 1 to 7 scale how guilty they felt in each scenario.

Even before the study, Molenberghs knew that when he read the scans he would focus first on the activity in the orbitofrontal cortex (OFC), a region of the forebrain that has long been known to be involved with moral sensitivity, moral judgments and making choices about how to behave. The nearby temporoparietal junction (TPJ) also takes on some of this moral load, processing the sense of agency—the act of doing something deliberately and therefore owning the responsibility for it. That doesn't always make much of a difference in the real world—whether you shoot someone on purpose or the gun goes off accidentally, the victim is still dead. But it makes an enormous difference in how you later reckon with what you've done.

In Molenberghs' study, there was consistently greater activity in the lateral portion of the OFC when subjects imagined shooting civilians than when they shot soldiers. There was also more coupling between the OFC and the TPJ—with the OFC effectively saying "I feel guilty" and the TPJ effectively answering "You should." Significantly, the degree of OFC activation also correlated well with how bad the subjects reported they felt on their 1 to 7 scale, with greater activity in the brains of people who reported feeling greater guilt.

The OFC and TPJ weren’t alone in this moral processing. Another region, known as the fusiform gyrus, was more active when subjects imagined themselves killing civilians—a telling finding since that portion of the brain is involved in analyzing faces, suggesting that the subjects were studying the expressions of their imaginary victims and, in so doing, humanizing them. When subjects were killing soldiers, there was greater activity in a region called the lingual gyrus, which is involved in the much more dispassionate business of spatial reasoning—just the kind of thing you need when you’re going about the colder business of killing someone you feel justified killing.

Soldiers and psychopaths are, of course, two different emotional species. But among people who kill legally and those who kill criminally or promiscuously, the same brain regions are surely involved, even if they operate in different ways. In all of us it’s clear that murder’s neural roots and moral roots are deeply entangled. Learning to untangle them a bit could one day help psychologists and criminologists predict who will kill—and stop them before they do.

TIME psychology

7 Ways Your Mind Messes With Your Money

Mmmmmoney: Get a grip; it's just paper
KAREN BLEIER; AFP/Getty Images

Jeffrey Kluger is Editor at Large for TIME.

A new book shows the many ways money makes you crazy

If your brain is like most brains, it's got an awfully high opinion of itself—pretty darned sure it's pretty darned good at a lot of things. That probably includes handling money. But on that score your brain is almost certainly lying to you. No matter how much you're worth, no matter how deftly you think you play the market, your reasoning lobes go all to pieces when cash is on the line. That is one of many smart—and scary—points made by author and J.P. Morgan vice president Kabir Sehgal in his new book Coined: The Rich History of Money and How It Has Shaped Us. Here, in no particular order, are seven reasons you should never leave your brain alone with your wallet.

Inflation? What’s that? You’re way too smart to think that if your salary doubles but the price of everything you buy doubles too you’ve somehow come out ahead, right? Wrong. In one study, volunteers were given the opportunity to win money that they could use to buy gifts from a catalogue. In later rounds, the amount they could win went up by 50% but so did the cost of all of the catalogue items. Nonetheless, their prefrontal cortex registered greater arousal after the staged inflation—even when they were warned before the study began that the purchasing power of their money would not increase. The implication: If a corned beef sandwich and a Coke cost $15,000 you’d still be thrilled to be a billionaire.
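
To see why this is a trick, here's a minimal arithmetic sketch of the money illusion, using made-up catalogue numbers (only the 50% jump in both winnings and prices mirrors the study described above): when nominal winnings and prices rise by the same factor, the number of items you can actually buy doesn't budge.

```python
# Money illusion in miniature: winnings rise 50%, but so do prices,
# so real purchasing power (items you can actually buy) is unchanged.
# Illustrative numbers only; not taken from the study itself.

def items_affordable(winnings, item_price):
    """How many catalogue items the winnings actually buy."""
    return winnings / item_price

before = items_affordable(winnings=100.0, item_price=10.0)  # 10.0 items
after = items_affordable(winnings=150.0, item_price=15.0)   # still 10.0 items

print(before, after)  # 10.0 10.0 -- richer on paper, no better off at the checkout
```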

Keep yer lousy money: Guess what! I’m going to give you $199. Nice, right? Oh, did I forget to mention that it comes out of $1,000 someone else gave me to divide up between us any way I see fit? In multiple studies, when it’s up to one subject to apportion a fixed amount and up to the other to accept it or neither one gets paid, more than half of recipients will reject anything less than 20% of the total. In other words, you’ll turn down a free $199 to deny me my undeserved $801. Your ego thanks you, your checking account doesn’t.
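
The setup being described is the classic ultimatum game. Here is a minimal sketch of its payoff rule, using the $1,000 pot and $199 offer from the example above (the roughly 20% rejection threshold is the behavioral finding, not part of the game's rules):

```python
# Ultimatum game: a proposer splits a fixed pot; the responder either
# accepts (both get paid as proposed) or rejects (neither gets anything).

def ultimatum(pot, offer, responder_accepts):
    """Return (proposer_payoff, responder_payoff)."""
    if responder_accepts:
        return pot - offer, offer
    return 0, 0

# Pure self-interest says take any positive offer...
print(ultimatum(1000, 199, responder_accepts=True))   # (801, 199)
# ...but in practice many responders reject low offers to punish the proposer.
print(ultimatum(1000, 199, responder_accepts=False))  # (0, 0)
```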

Losing feels worse than winning feels good: Here’s something the Vegas casinos don’t tell you: That high you get from winning $10,000 at the craps table will fade a lot faster than the what-was-I-thinking self-loathing that comes when you lose the same amount. To get people to wager $20 on a coin flip, researchers have found that they typically have to be given the chance to double their money; betting $20 to win, say, $35 just doesn’t cut it. That seems like good sense—but given the realistic shot you’ve got at winning, it’s also bad math.
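
The "bad math" here is just expected value. A quick sketch, assuming a fair coin and the $20 stake and $35 payout from the example above:

```python
# Expected value of the coin flip: risk $20, win $35 if it lands your way.
# Loss aversion says walk away; the arithmetic says the bet pays on average.

p_win = 0.5      # fair coin (assumed)
stake = 20.0     # lost on a losing flip
payout = 35.0    # gained on a winning flip

expected_value = p_win * payout - (1 - p_win) * stake
print(expected_value)  # 7.5 -- about $7.50 ahead per flip, on average
```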

Simply the best: You know that store that opened on your corner that sold nothing but artisanal beets—the one that you knew would go out of business within a month and that didn’t even last two weeks? The owner totally didn’t see that coming. That’s called the overconfidence bias. The hard fact is, about 80% of new businesses are floating upside down at the top of the aquarium within 18 to 24 months—but nearly all entrepreneurs are convinced they’re going to be in the elite 20%. We bring the same swagger to playing the market and speculating in real estate—and to dancing at a wedding after we’ve had enough drinks and are convinced we’ve got moves. Watch the video later and see how that works out.

The hunt beats the kill: Never mind cigarettes and alcohol, if there's one substance the government should regulate it's dopamine—the feel-good neurotransmitter that gives you a little reward pellet of happiness when your brain decides you've done something good. The problem is, your brain can be an idiot. There's far more dopamine released in its nucleus accumbens region—the reward center—when you're anticipating some kind of payoff than when you've actually achieved it. That means expanding your business is more fun than running it and investing in the market is more fun than consolidating your gains. Those are great strategies—but only until the very moment they're not.

I think therefore I win: I have a perfect three-step plan for winning the Powerball lottery: 1) I buy a ticket. 2) About 175 million other people buy tickets. 3) They give me all the tickets they bought. OK, failing that, the odds are pretty good that I may not be the person on TV who gets handed that giant check. But I play anyway thanks to what's known as the availability heuristic. I think about winning, I see commercials with people who have actually won, I fantasize about what I'll do with the money when I do win—and pretty soon it seems crazy not to play. The more available thoughts of something unlikely are, the more realistic it seems that it may actually happen. This is the reason there should always be a 48-hour cooling off period after you leave baseball fantasy camp and before you're allowed to sell your house and try out for the Yankees' farm club.

Fifty shades of green: Perhaps the biggest reason we're irrational about money is that we've come to fetishize not just the idea of wealth but the pieces of currency themselves. In one study, subjects counted out either actual bills or worthless pieces of paper of the same size, and then plunged their hands into 122°F (50°C) water. The ones who had handled real cash experienced less pain—effectively anesthetized by the Benjamins. Other studies have shown heightened brain activity when people witness money being destroyed, with the degree of neuronal excitement increasing in lockstep with the value of the currency. It's money's world; we're just living in it.

TIME neuroscience

Here’s a New Trick to Help Babies Learn Faster

Surprise them. Not by jumping out of a closet but by challenging their developing notions about the world, and avoiding the same-old same-old

We know that babies like new things. Present them with something they haven’t seen before and they’ll gravitate toward it, touch it, bang it around, put it in their mouths. It’s all part of the learning process so they can build a database of knowledge about the world around them.

But for babies to really learn about how the world works, it takes more than novelty. In a series of experiments with 11-month-olds published Thursday in the journal Science, researchers at Johns Hopkins University found that surprising information—things that went against babies' assumptions about concepts like gravity and the solidness of objects—forms the seed for future learning.

Aimee Stahl, a PhD candidate in the department of psychological and brain sciences at Johns Hopkins, and her colleague Lisa Feigenson conducted a set of experiments with 110 infants to tease out the effect of surprise on how babies learn. The studies began with the assumption that babies are born with certain core knowledge about how the world works — that objects are solid so other things can't pass through them, for example, or that dropped objects fall rather than float.

MORE: Naps May Help Babies Retain Memories, Study Finds

First, Stahl challenged these concepts with some babies by strategically using a screen to hide a wall as they rolled a ball. When they lifted the screen, some babies saw the ball stopped in front of the wall, as they would expect. Other babies, however, saw the ball on the other side of the wall. When both groups were then presented with something entirely new to learn — associating a squeaking sound with a new toy — the babies who saw the contrary event (the ball on the other side of the wall) learned to link the sound to the new toy more quickly than those who saw the expected event (the ball stopped by the wall).

To ensure that the babies weren't just enthralled with the novelty of the new toy, Stahl and Feigenson then repeated the experiment, except this time during the testing phase they played a different, rattling sound instead of the squeaking noise. The learning scores in the first experiment were still higher than those in the second version, strongly suggesting that the babies were actually making new connections and learning something about the objects, rather than just paying attention to the newness of them.

MORE: How to Improve a Baby’s Language Skills Before They Start to Talk

This was supported by the other experiments Stahl and Feigenson conducted, in which babies tried to find an explanation for the contrary results: for the balls that appeared to pass through the solid wall, they bounced and banged the balls to verify their solidity. For situations in which objects seemed to defy gravity and float, they dropped them. "It seemed like they were seeking an explanation to the kind of surprising events they witnessed," says Stahl. "If it was just novelty that was attracting them, they wouldn't be so specific in the way they handled the objects."

These are the first experiments to test the idea that learning involves more than just exploring new things; Stahl's results indicate that surprising or contradictory information helps babies test and confirm their knowledge, and try to explain events that seem to go against what they know.

“It raises exciting questions about whether surprise is something educators, parents and doctors can harness to enhance and shape learning,” says Stahl. She’s exploring, for example, how surprise can help in learning even with older children in more naturalistic environments, outside of artificial lab experiments. “Our research shows that when babies’ predictions about the world don’t match what they observe, that signals a special opportunity to update and revise their knowledge and to learn something new.”

TIME Research

Level Up! Gamers May Learn Visual Skills More Quickly

Xbox fans play games from the popular “Halo” franchise at HaloFest at the Avalon Theatre in Los Angeles on Monday, Nov. 10, 2014
Matt Sayles—Invision/AP

Practice not only makes perfect, it may improve gamers' ability to learn

A small study from Brown University suggests video gamers, who are already known to have better visual-processing skills, may also be able to improve on those attributes faster than the average person.

According to a Brown University press release, the study analyzed nine gamers and compared them with nine nongamers during a two-day trial. Researchers required participants to complete two visual tasks, one right after the other. The next day they repeated the exercises (in a random order) and compared how participants improved.

What they found is that the second task interfered with the ability of nongamers to improve on the first — while gamers improved equally well on both exercises.

“We sometimes see that an expert athlete can learn movements very quickly and accurately and a musician can play the piano at the very first sight of the notes very elegantly … maybe [gamers] can learn more efficiently and quickly as a result of training,” senior author Yuka Sasaki said.

The authors admit the findings require more study, conceding that there is no proof that video games caused the learning improvement, since people with quick visual-processing skills could be naturally drawn to gaming.

TIME neuroscience

Your Brain Learns New Words By Seeing Them, Not Hearing Them

Chris Ryan—Getty Images/Caiaimage

To be a really proficient reader, it’s not enough to “hear” words. You also have to see them

We start to talk before we can read, so hearing words, and getting familiar with their sounds, is obviously a critical part of learning a language. But in order to read, and especially in order to read quickly, our brains have to “see” words as well.

At least that’s what Maximilian Riesenhuber, a neuroscientist at Georgetown University Medical Center, and his colleagues found in an intriguing brain-mapping study published in the Journal of Neuroscience. The scientists recruited a small group of college students to learn a set of 150 nonsense words, and they imaged their brains before and after the training.

Before they learned the words, their brains registered them as a jumble of symbols. But after they were trained to give them a meaning, the words looked more like familiar words they used every day, like car, cat or apple.

MORE: Mistakes to Avoid When Learning a Foreign Language

The difference in the way the brain treated the words involved "seeing" them rather than sounding them out. The closest analogy would be for adults learning a foreign language based on a completely different alphabet system. Students would have to first learn the new alphabet, assigning sounds to each symbol, and in order to read, they would have to sound out each letter to put words together.

In a person’s native language, such reading occurs in an entirely different way. Instead of taking time to sound out each letter, the brain trains itself to recognize groups of letters it frequently sees together — c-a-r for example — and dedicates a set of neurons in a portion of the brain that activates when these letters appear.

In the functional MRI images of the volunteers’ brains, that’s what Riesenhuber saw. The visual word form area, located in the left side of the visual cortex, is like a dictionary for words, and it stores the visual representation of the letters making up thousands of words. This visual dictionary makes it possible to read at a fast pace rather than laboriously sounding out each letter of each word every time we read. After the participants were trained to learn the meaningless words, this part of their brains was activated.

MORE: An Infant’s Brain Maps Language From Birth, Study Says

“Now we are seeing words as visual objects, and phonetics is not involved any more,” he says. “We recognize the word as a chunk so we go directly from a visual pattern to the word’s meaning, and we don’t detour to the auditory system.”

The idea of a visual dictionary could also help researchers better understand reading or learning disorders like dyslexia. More research could reveal whether the visual word form area in people with such disabilities is different in any way, or whether they tend to read via more auditory pathways. "It helps us understand in a general way how the brain learns, the fastest way of learning, and how to build on prior learning," says Riesenhuber.

TIME medicine

Many Doctors Don’t Tell Patients They Have Alzheimer’s

PASIEKA—Getty Images/Science Photo Library RM

It's hard to believe in today's era of transparency in medicine, but there's a diagnosis that doctors still try to keep from their patients

In a surprising new survey of patients asked about their interactions with their doctors, fewer than half of the people treated for Alzheimer's said their doctors had ever told them they had the degenerative brain disorder.

Led by researchers at the Alzheimer's Association, the scientists looked at Medicare claims data from 2008 to 2010 for 16,000 people. The respondents were asked, among other things, whether their doctors had ever told them they had Alzheimer's disease. When the researchers then matched those answers to the patients' medical records, and to the diagnostic codes their doctors used to describe their care, only 45% of those who were billed for Alzheimer's-related care had been told by their doctors of their disease.

MORE New Research on Understanding Alzheimer’s

“What struck us was that physicians generally understand the positive benefits of disclosing the diagnosis, and agree with those benefits,” says Keith Fargo, director of scientific programs and outreach at the Association, who oversaw the analysis of the survey data. “But many still don’t do disclosure in their own practice.”

One of the few papers investigating how Alzheimer's diagnoses are disclosed found that as few as 36% of doctors said they usually told their patients if they had Alzheimer's. The main reasons for the intentional omission? Fear of causing emotional distress in their patients and the lack of time and resources to fully explain what the diagnosis means. This was true of both primary care doctors and neurology specialists, who have more expertise in brain-related disorders.

Dr. Robert Wergin, president of the American Academy of Family Physicians, advocates for transparency and honesty in disclosing diagnoses to his patients in his practice in Milford, Nebraska. But he understands why many physicians might be reluctant to use the word “Alzheimer’s” with their patients. “Labels are important,” he says. “When I label you and say you’ve got Alzheimer’s disease, then you’re likely to say, ‘Well that’s it for me, I better start looking for nursing homes.’”

MORE This Alzheimer’s Breakthrough Could Be a Game Changer

Alzheimer’s is a challenging diagnosis to make on several levels. First, it can only be definitively diagnosed at autopsy, when doctors can see the hallmark amyloid plaques and tangles that cause the gradual loss of memory and cognitive function. There is no blood test or brain scan that can conclusively tell doctors that a patient does or does not have the condition; while promising versions are being developed, it’s still a diagnosis that doctors make based on reports of the patients’ changing intellectual abilities and on psychiatric tests that aren’t specific for Alzheimer’s.

It’s also difficult to tell patients they likely have Alzheimer’s because there are currently no effective drugs for the disease. Medications can slow the effects of the cognitive decline, but nothing can stop or reverse the march of worsening symptoms. Wergin notes that once a patient is labeled with Alzheimer’s it could, at least before the Affordable Care Act, affect that patient’s ability to get insurance for nursing home care. “Once I label you, it’s in your chart. If an insurance company extracts your data, I’m not going to insure you because you are at higher risk of drawing on your coverage,” he says.

MORE New Test May Predict Alzheimer’s 10 Years Before Diagnosis

Wergin says that doctors may be over-anticipating the emotional distress that an Alzheimer's diagnosis can bring. While the news is certainly difficult, most patients and their caregivers may already be aware that a neurodegenerative disease like Alzheimer's may be present. And while there are no treatments that physicians can prescribe for their patients — at least not yet — Fargo and Beth Kallmyer, vice president of constituent services at the Association, note that it's particularly important for Alzheimer's patients and their families to know what to expect so they can begin planning. "There might not be a pill that slows the disease down or there might not be cures, but there are things people can do to impact their everyday quality of life," says Kallmyer. "They can build a care team, and prepare advance directives. And if a caregiver has knowledge of the disease, they can make things better in the day-to-day world of the person with the disease. If they don't know about the diagnosis, they may not get that support."

MORE Breakthrough Discoveries of Alzheimer’s Genes

But making doctors more comfortable with the diagnosis will take more structural changes in the way we deliver health care. The Alzheimer’s Association is supporting legislation that would reimburse doctors and their staff for a longer discussion about Alzheimer’s and how to plan for the disease. More medical schools are also including discussion about such planning in their curricula, as doctors in coming decades will be increasingly called upon to make this difficult disclosure.

TIME Innovation

Five Best Ideas of the Day: March 23

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

1. Modern American life — from smartphones to cars and even missiles — depends on rare earth elements. But we’ve let China take over the industry.

By Lesley Stahl on 60 Minutes

2. Men in New York own 2.5 times as many businesses as women. A new program aims to innovate against that gap.

By Alexis Stephens in Next City

3. In 2014, global violence surged, drawing a sharp contrast between the developed world and everywhere else.

By Peter Apps in the Project for the Study of the 21st Century

4. Tunisia’s response to a terrorist attack provides a hopeful model.

By the editorial board of the Christian Science Monitor

5. Want to change how you see the world? Rewire your brain by learning a second language.

By Nicholas Weiler in Science Magazine

TIME medicine

A Simple 3-Part Test May Predict Alzheimer’s

Chris Parsons—Getty Images

Cognitive decline is a part of aging, but how do doctors separate normal brain changes from the first signs of Alzheimer's? A new test that any physician can perform in their office may help

Diseases like Alzheimer's start years, even decades, before the first symptoms of memory loss show up. And with rates of those diseases rising, experts say that more primary care physicians—not neurology experts—will have the task of identifying these patients early so they can take advantage of whatever early interventions might be available.

“If we had a simple blood test, a cholesterol test for Alzheimer’s disease, that would help,” says Dr. Ronald Petersen, director of the Alzheimer’s Disease Research Center at the Mayo Clinic, “but we don’t.” But Petersen has a potential solution, and according to a new paper released Wednesday in the journal Neurology, his Alzheimer’s test has promise.

Petersen and his team wanted to develop a test that any physician can administer to patients, without the need for any new technology or expensive equipment. Petersen believes that the test they came up with could become a useful tool for any physician, even those without special training in the brain. “What we are trying to do is give them some help so they can be as efficient as possible without ignoring these important cognitive issues,” he says.

In the first phase of the test, his researchers simply collected information from 1,500 patients’ medical charts—their age, family history of Alzheimer’s, factors such as diabetes or smoking that have been linked to Alzheimer’s, and whether the patient had ever reported problems with memory.

In the next phase they studied the results of the patients' basic mental exams as well as their psychiatric evaluations, because depression and anxiety have been connected to Alzheimer's.

Another factor emerged as important in predicting the disease: how quickly the participant could walk a short distance. "We were a little surprised," says Petersen. "But what's nice about it is that it's a nice non-cognitive, motor factor so it's looking at another aspect of brain function."

MORE: This Alzheimer’s Breakthrough Could Be a Game Changer

Petersen suggests that every physician should get this information on their patients at age 65; that way, they can have a baseline against which to compare any changes as their patients age. Only if they show such changes — a slower walk, for example, or worsening signs of depression or memory issues — should they move on to the third phase of the test, which is a blood analysis. That would look for known genetic factors linked to Alzheimer’s, including the presence of certain versions of the ApoE gene.

Currently, the only way to truly separate out those on the road to Alzheimer’s is to conduct expensive imaging tests of the brain, or to do a spinal tap, an invasive procedure that extracts spinal fluid for signs of the amyloid protein that builds up in the disease. “We have either expensive techniques or invasive techniques and it’s not practical to do them from a public health screening standpoint,” says Petersen.

MORE: New Test May Predict Alzheimer’s 10 Years Before Diagnosis

While his test is a possible solution to that problem, he acknowledges that the results need to be repeated before it's recommended on a wide scale to physicians across the country. Still, those who scored higher on the test of risk factors had a seven-fold higher chance of developing mild cognitive impairment than those with lower scores.

For now, even if doctors identify patients around age 65 who might be at higher risk of developing cognitive impairment, there isn’t much they can do to interrupt the process. But they can direct them toward clinical trials of promising new drugs to address Alzheimer’s dementia, which may slow the cognitive decline considerably.

TIME Innovation

Five Best Ideas of the Day: March 12

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

1. Protecting whistleblowers protects national security.

By Mike German at the Brennan Center for Justice

2. Could we treat pain by switching off the region of the brain controlling that feeling?

By the University of Oxford

3. Small businesses are booming in China, and they might just save its economy.

By Steven Butler and Ben Halder in Ozy

4. Not so fast: Apps using Apple’s new health technology could require FDA approval. That doesn’t come quick.

By Jonathan M. Gitlin in Ars Technica

5. We might feel better about driving electric cars, but they’re still not good for the environment.

By Bobby Magill in Quartz

TIME sexuality

No, Ben Carson, Homosexuality Is Not a Choice

Pointing the wrong way: Carson is just plain wrong on the science
Richard Ellis; Getty Images

Jeffrey Kluger is Editor at Large for TIME.

A presidential hopeful (and a doctor) gets the science all wrong—and makes things worse when he tries to explain himself

If you’re a candidate dreaming of the White House with virtually no chance of actually winding up there, it sometimes helps to say something ridiculous—if only to get your name-recognition numbers up. That is the very best and most charitable explanation for comments by Dr. Ben Carson, a neurosurgeon, on CNN, arguing that homosexuality is “absolutely” a choice. His evidence? Prison.

“A lot of people who go into prison go into prison straight and when they come out, they’re gay,” he said. “So did something happen while they were in there?”

Prison, of course, is the worst of all possible examples Carson could have chosen—conflating sexuality with circumstance. Men confined together for years without women remain sexual beings and may take whatever outlet is available to them. Something similar was true in a less enlightened era of gay men and women who were forced to marry people of the opposite sex, and who dutifully produced children and tried to satisfy their partners despite the fact that they were getting little satisfaction themselves.

Carson, who was blowtorched in both social and mainstream media for his remarks, quickly walked them back, issuing a statement that, in some ways, only made things worse. “I’m a doctor trained in multiple fields of medicine, who was blessed to work at perhaps the finest institution of medical knowledge in the world,” he wrote. “Some of our brightest minds have looked at this debate, and up until this point there have been no definitive studies that people are born into a specific sexuality.”

That statement could indeed have the virtue of being true—provided it was issued in 1990. But since then, there's been a steady accumulation of evidence that sexuality—like eye color, nose size, blood type and more—is baked in long before birth. The first great breakthrough was the 1991 study by neuroscientist Simon LeVay finding that a region in the hypothalamus related to sexuality, known as INAH3, is smaller in gay men, as it is in women, than it is in straight men. The following year, investigators at UCLA found that another brain region associated with sexuality, the midsagittal plane of the anterior commissure, is 18% larger in gay men than in straight women and 34% larger than in straight men.

One cause of the differences could be genetic. In 1993, one small study suggested a connection between sexual orientation and a section on the X chromosome called Xq28, which could predispose men toward homosexuality. The small size of the study—only 38 pairs of gay brothers—made it less than entirely reliable. But a study released just last year expanded the sample group to 409 pairs of brothers and reached similar conclusions.

Genes are not the only biological roots of homosexuality. Womb environment is thought to play a significant role too, since part of what determines development of a fetus is the level and mix of hormones to which it is exposed during gestation. In 2006, psychologist Anthony Bogaert of Brock University in Canada looked into the never-explained phenomenon of birth order appearing to shape sexuality, with gay males tending to have more older brothers than straight males. Working with a sample group of 944 homosexual and heterosexual males, Bogaert found that indeed, a firstborn male has about a 3% chance of being gay, a number that goes up 1% at a time for each subsequent boy until it doubles to 6% for a fourth son.

The explanation likely involves the mother’s immune system. Any baby, male or female, is initially treated as an invader by the mother’s body, but multiple mechanisms engage to prevent her system from rejecting the fetus. Male babies, with their male proteins, are perceived as slightly more alien than females, so the mother’s body produces more gender-specific antibodies against them. Over multiple pregnancies with male babies, the womb becomes more “feminized,” and that can shape sexuality.

A range of other physical differences between gay and straight people also argues against Carson's thinking—finger length, for instance. In heterosexual men, the index finger is significantly shorter than the ring finger. In straight women, the index and ring fingers are close to the same length. Lesbian finger length is often more similar to that of straight males. This, too, had been informally observed for a long time, but in 2000 a study at the University of California, Berkeley, seemed to validate it.

Lesbians also seem to have differences in the inner ear—of all unlikely places. In all people, sound not only enters the ear but leaves it, in the form of what are known as otoacoustic emissions—vibrations that are produced by the interaction of the cochlea and eardrum and can be detected by instruments. Heterosexual women tend to have higher frequency otoacoustic emissions than men, but gay women don’t. Still other studies have explored a link between homosexuality and handedness (with gays having a greater likelihood of being left-handed or ambidextrous) as well as hair whorl (with the hair at the crown of gay men’s heads tending to grow counterclockwise), though there are differing views on these last two.

Clearly, none of us choose our genetics or finger length or birth order or ear structure, and none of us choose our sexuality either. As with so many cases of politicians saying scientifically block-headed things, Carson either doesn’t know any of this (and as a doctor, he certainly should) or he does know it and is pretending he doesn’t. Neither answer reflects well on his fitness for political office.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.
