TIME Obesity

This Is What Weight Loss Does To Your Brain


New research shows weight loss surgery can reverse the negative effects body fat may have on the brain

Too much fat weighs down not just your body, but also your brain.

Obesity harms most organs in the body, and new research suggests the brain is no exception. What’s more, the researchers found that getting rid of excess fat actually improves brain function, reversing the ill effects of the extra weight. The new study, which focused on people who underwent bariatric surgery, found that the procedure had positive effects on the brain, but other research has shown that less invasive weight loss strategies, like exercise, can also reverse brain damage thought to be related to body fat.

Here’s why that matters: Obese men and women are estimated to be about 35% more likely to develop Alzheimer’s compared to people of a normal weight. Some research suggests that body fat ups the number of proteins in the brain that trigger a cascade of events that predispose someone to the disease, and other research in mice has suggested that fat cells release a substance called interleukin 1, which can cause severe inflammation and, in turn, gunk up the brain.

In a recent study, a team of researchers looked at 17 obese women prior to bariatric surgery and found that their brains metabolized sugars faster than the brains of a control group of women at a normal weight. The women underwent cognitive function tests before their surgery as well as after. The results show that after surgery, the obese women showed improvement in the troubling brain activity seen prior to going under the knife, and they performed better on their cognitive function tests—especially in the area of executive function, which is used during planning and organization. The findings suggest that losing the fat reverses its bad effects on the brain.

It is possible that the long-term “cerebral metabolic activity”—meaning the way the brains of obese people process sugars—leads to structural damage that can hasten or contribute to cognitive decline, the authors write in their paper.

Researchers are still trying to understand the exact effects of body fat on the brain, but one theory is that it’s a chain-of-events scenario. For instance, insulin resistance, a metabolic disorder that can be brought on by obesity, has been linked to neurodegenerative diseases like Alzheimer’s because it is associated with an increase in fatty acids, inflammation and oxidative stress. Other theories have to do with the effects of certain kinds of fat. The National Institutes of Health (NIH) points out that visceral fat, the most damaging type of body fat, ups a person’s likelihood of developing insulin resistance, and on top of that, belly fat can produce stress hormones that can also hinder cognition. Other research has shown that those stress hormones are tied to hunger signaling, and the resulting disruptions can alter a person’s sense of hunger and fullness and contribute to obesity.

“The more we understand about [body fat], the clearer it becomes that belly fat is its own disease-generating organism,” said Dr. Lenore Launer, chief of NIA’s Neuroepidemiology Section of the Laboratory of Epidemiology, Demography, and Biometry in an NIH statement.

Inflammation continues to be fingered as a culprit in the link between body fat and a variety of disorders, including brain-related diseases and even depression. Body fat, also referred to as adipose tissue, is thought to create substances that cause inflammation, and that could be at least one of the primary ways it irritates the brain.

The bottom line is that excess body fat has a laundry list of effects on the body, and none of them are good. But on the bright side, getting rid of that fat should reverse some of the marks it leaves on the brain, and not everyone needs to go under the knife to do it.

TIME animal behavior

What Are Animals Thinking? (Hint: More Than You Suspect)

The mind of an animal is a far richer, more complex thing than most people know — as a new TIME book reveals

Let’s be honest, you’d probably rather die than wake up tomorrow morning and find out you’d turned into an animal. Dying, after all, is inevitable, and there’s even a certain dignity to it: Shakespeare did it, Einstein did it, Galileo and Washington and Twain all did it. And you, someone who was born a human and will live your life as a human, will end your life that way too.

But living that life as an animal — an insensate brute, incapable of reason, abstraction, perhaps even feeling? Unthinkable. Yes, yes, the animals don’t recognize the difference, and neither would you. If you’re a goat, you possess the knowledge of a goat, and that can’t be much. But there’s more to it than that.

Human beings have always had something of a bipolar relationship with the millions of other species with which we share the planet. We are fascinated by them, often dazzled by them. They can be magnificently beautiful, for one thing: the explosive color and frippery of a bird of paradise, the hallucinatory variety of the fish in a coral reef, the otherworldly markings and architecture of a giraffe. Even the plain or ugly animals — consider the naked, leathery grayness of the rhino or elephant — have a certain solidity and equipoise to them. And to see an animal at what appears to be play — the breaching dolphin, the swooping raptor — is to think that it might be fun to have a taste, a tiny taste, of their lives.

But it’s a taste we’d surely spit right out, because as much as we may admire animals, we pity them too: their ignorance, their inconsequence, and their brief, savage lives. It’s in our interest to see them that way — not so much because we need to press our already considerable advantage over them; we don’t. But because we have certain uses in mind for them. We need the animals to work for us — to pull carts, drag plows, lift logs and carry loads, and stand still for a whipping if they don’t. We need them to entertain us, in our circuses and zoos and stage shows. And most of all, we need them to feed us, with their eggs and milk and their very flesh. A few favored beasts do get a pass — dogs, cats, some horses — but the rest are little more than tools for our use.

But that view is becoming impossible to sustain — as a new TIME book reveals. The more deeply scientists look into the animal mind, the more they’re discovering it to be a place of richness, joy, thought and even nuance. There are the parrots that don’t just mimic words but appear to understand them, for example, assembling them into what can only be described as sentences. There are the gorillas and bonobos that can do the same with sign language or pictograms. Those abilities are hard to dismiss, but they also miss the point; they are, in many ways, limited gifts — animals doing things humans do, but much less well.

A better measure is the suite of behaviors the animals exhibit on their own: crows that can fashion tools, lions that collaborate on elaborate hunts, dolphins and elephants with signature calls that serve as names, and cultural norms like grieving for their dead and caring for grandchildren. There are the complex, even political societies that hyenas create and the factory-like worlds of bees and ants. There are the abiding friendships among animals, too — not just the pairs of dolphins or horses or dogs that seem inseparable but the cross-species loyalties: the monkey and the dog, the sheep and the elephant, the cat and the crow, members of ordinarily incompatible species that appear never to have thought to fight with or eat one another because, well, no one told them they had to.

Animals, the research is proving, are creatures capable of reflection, bliss, worry and more. Not all of them in the same ways or to the same degrees, surely, but all of them in far deeper measures than we’ve ever believed. The animal mind is nothing like the wasteland it’s been made out to be. And if it’s not the mind you’d want to have as your own, it’s one that is still worth getting to know much better.

(The Animal Mind is now available on newsstands.)

TIME Innovation

IBM’s New Processor Sounds More Brain-Like Than Ever

Imagine assistive glasses for the visually impaired that can help them navigate through complex environments—without the need for a wi-fi connection.

IBM unveils a new processor that sips a fraction of the energy today's processors do, but that can deliver radically greater performance at a brain-like synaptic scale.

IBM’s splashy new “brain” chip, TrueNorth, is actually nothing like a real human brain — it’s not going to admire the pointillistic works of Van Gogh, much less fall in love with you before absconding to frolic with a new race of godlike machine beings à la Her — but it is a remarkable-sounding next step in the direction of brain-like computers that mimic the synaptic conversation actual brains have been having for eons.

The chip, designed over the past decade and part of IBM’s TrueNorth computing architecture, is detailed in the August 8 issue of Science, and it’s based on a principle that’s been around for decades known as neuromorphic engineering. That’s a fancy way of describing a system that mimics the biological nervous system, including (though not limited to) biological brains.

TrueNorth is IBM’s stab at a neuromorphic processor, something its authors describe as an “efficient, scalable and flexible non-von Neumann architecture.” John von Neumann came up with the basic architecture for how you’d go about running a digital processor in the 1940s, and it’s that essentially linear notion that forms the basis for the computers we’re still using today.

But linearity has its disadvantages: today’s processors are epically faster than human brains at crunching massively complex mathematical equations, say simulating weather patterns or calculating all the gravitational vectors involved in soft-landing a rover on Mars. But they’re utterly dimwitted at attempting contextual feats we humans perform with ease, say picking a voice out of a crowd or deciding which type of wine goes best with a meal.

That’s where the notion of brain-like parallelism comes in, itself a well-established idea in computing, but TrueNorth is about scaling it to unprecedented levels. The processor simulates a brain with one million neurons and 256 million synapses — about the crunch-power of a honey bee or cockroach — fueled by an on-chip network of 4,096 neurosynaptic cores. That adds up to a 5.4 billion transistor processor — the largest chip IBM’s yet built — but one that sips a mere 70 milliwatts of power during realtime operations, or four orders of magnitude less than conventional chips today. Altogether, the chip can perform 46 billion synaptic operations per second, per watt, says IBM.
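For readers who want to see how those figures hang together, here is a minimal back-of-the-envelope sketch in Python. The per-core breakdown and the total operations-per-second figure are simple arithmetic on the numbers IBM cites above, not additional specifications from the company's paper.

```python
# Back-of-the-envelope arithmetic on the TrueNorth figures quoted above.
# Illustrative calculations only, not specifications from IBM's paper.

neurons = 1_000_000            # simulated neurons on the chip
synapses = 256_000_000         # simulated synapses
cores = 4_096                  # neurosynaptic cores
power_watts = 0.070            # 70 milliwatts during realtime operation
ops_per_watt = 46_000_000_000  # 46 billion synaptic operations per second, per watt

print(f"Neurons per core:  {neurons / cores:,.0f}")   # ~244
print(f"Synapses per core: {synapses / cores:,.0f}")  # 62,500
print(f"Synaptic ops/sec at 70 mW: {ops_per_watt * power_watts:,.0f}")  # ~3.2 billion
```

Taken at face value, the numbers underline the trade-off the article describes: the chip gives up raw numerical speed in exchange for massive, very low-power parallelism.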


Think of it as a little like the old left brain, right brain relationship: language and analytic thinking are left (von Neumann architecture) while sense and pattern recognition are right (neuromorphic processors). And that’s just the start, says IBM, noting that in the years to come, it hopes to bring the two together “to create a holistic computing intelligence.”

While we’re waiting for our holistic machine overlords to take over, what can TrueNorth do after developers have figured out what to design for it? How about improving visual and auditory tasks traditional computers presently stumble over?

As IBM Fellow Dharmendra Modha writes, “The architecture can solve a wide class of problems from vision, audition, and multi-sensory fusion, and has the potential to revolutionize the computer industry by integrating brain-like capability into devices where computation is constrained by power and speed.” Notice the way Modha describes the chips as complementary: the strategy with TrueNorth out of the gate looks to be integrative rather than one of displacing the processors in our smartphones, tablets and laptop computers.

Again, it’s important to bear in mind that TrueNorth isn’t a brain. As Modha himself notes, “we have not built the brain, or any brain. We have built a computer that is inspired by the brain.” But it’s an important step forward: another rung on a ladder that’s as high as our hopes of perhaps someday creating computer-like beings in our own image, or ones better still.

TIME Mental Health/Psychology

The Part of Your Brain That Senses Dread Has Been Discovered

This tiny part of your brain tracks bad experiences

A tiny part of the brain can keep track of your expectations about negative experiences—and predict when you will react to an event—researchers at University College London say.

The brain structure, known as the habenula, activates in response to negative events such as electric shocks, and it may help people learn from bad experiences.

The findings, published in Proceedings of the National Academy of Sciences, mark the first time this association has been proven in humans. Earlier studies showed that the habenula causes animals to avoid negative stimuli by suppressing dopamine, a brain chemical that drives motivation.

In this study, investigators showed 23 people random sequences of pictures followed by a set of good or bad outcomes (an electric shock, losing money, winning money, or neutral). The volunteers were asked to occasionally press a button to show they were paying attention, and researchers scanned their brains for habenula activity using a functional magnetic resonance imaging (fMRI) scanner. Images were taken at high resolution because the habenula is so small—half the size of a pea.

When people saw pictures associated with painful electric shocks, the habenula activated, while it did not for pictures that predicted winning money.

“Fascinatingly, people were slower to press the button when the picture was associated with getting shocked, even though their response had no bearing on the outcome,” lead author Rebecca Lawson from the University College London Institute of Cognitive Neuroscience, said in a statement. “Furthermore, the slower people responded, the more reliably their habenula tracked associations with shocks. This demonstrates a crucial link between the habenula and motivated behavior, which may be the result of dopamine suppression.”

The study also showed that the habenula responds more the worse an experience is predicted to be. For example, researchers said the habenula responds much more strongly when an electric shock is certain than when it is unlikely to happen. This means that your brain can tell how bad an event will be before it occurs.
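To make that scaling concrete, here is a minimal sketch of an "expected aversive value" signal, the probability of a bad outcome multiplied by how bad it is. This is an illustration of the concept only, not the researchers' analysis code, and the numbers are hypothetical.

```python
# Toy illustration of expected aversive value: probability of a bad outcome
# times its magnitude. Not the study's actual analysis code.

def expected_aversive_value(p_shock: float, shock_badness: float = 1.0) -> float:
    """Expected 'badness' of an upcoming event."""
    return p_shock * shock_badness

for p in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"P(shock) = {p:.2f} -> expected aversive value = {expected_aversive_value(p):.2f}")

# Consistent with the finding above: a certain shock (P = 1.0) should drive a
# much stronger habenula response than an unlikely one (e.g., P = 0.25).
```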

The habenula has been linked to depression, and this study shows how it could play a part in symptoms such as low motivation, a focus on negative experiences and general pessimism. Researchers said that understanding the habenula could potentially help them develop new ways of treating depression.

TIME Opinion

I Don’t Love Lucy: The Bad Science in the Sci-Fi Thriller

Maybe if the screenwriters had used 20% of their brains...

You use a whole lot more than 10% of your brain—but a common fallacy that says otherwise is nonetheless the central premise of a new movie

Now there are three Lucys I have to keep straight: the 3.2-million-year-old Australopithecus unearthed in Ethiopia in 1974; the eponymous star of the inexplicably celebrated 1950s sitcom I Love Lucy; and, most recently, the lead character—played by Scarlett Johansson—of the new sci-fi thriller straightforwardly titled Lucy. Going by intellectual heft alone, I’ll pick the 3.2-million-year-old bones.

The premise of the movie, such as it is, is that Lucy, a drug mule living in Taiwan, is exposed to a bit of high-tech pharma that suddenly increases her brain power, giving her the ability to outwit entire police departments, travel through time and space, dematerialize at will and yada-yada-yada, cut to gunfights, special effects and a portentous message about, well, something or other.

The movie poster’s teaser line? “The average person uses 10% of their brain capacity. Imagine what she could do with 100%.”

Let’s forgive the poster its pronoun problem (the average person—as in just one of us—uses 10% of their brain capacity), because the science problem is so much more egregious. The 10% brainpower thing is part of a rich canon of widely believed and entirely untrue science dicta that include “Man is the only animal that kills its own kind” (tell that to the lion cubs that were just murdered by an alpha male trying to take over a pride) and “A goldfish can remember something for only seven seconds” (a premise that was tested…how? With a pop quiz?).

No one is entirely sure where the 10% brainpower canard got started, but it goes back at least a century and is one of the most popular entries in the equally popular book 50 Great Myths of Popular Psychology. There is some speculation that the belief began with an idle quote by American philosopher William James, who, in 1908, wrote, “We are making use of only a small part of our possible mental and physical resources,” an observation vague enough to mean almost anything—or nothing—at all.

Some people attribute it to an explanation Albert Einstein offered when asked to account for his own towering intellect—except that Einstein never said such a thing and even if he had it would not make it true. Still others cite the more scientifically defensible idea that there is a measure of plasticity in the brain, so that if the region that controls, say, the right arm, is damaged by, say, a stroke, it is sometimes possible for other parts of the brain to pick up the slack—a sort of neural rewiring that restores lost motion and function.

But none of that remotely justifies the 10% silliness. The fact is, the brain is overworked as it is, 3 lbs. (1,400 g) of tissue stuffed into a skull that can barely hold it all. There’s a reason the human brain is as wrinkled as it is and that’s because the more it grew as we developed, the more it bumped up against the limits of the cranium; the only way to increase the surface area of the neocortex sufficiently to handle the advanced data crunching we do was to add convolutions. Open up the cerebral cortex and smooth it out and it would measure 2.5 sq. ft. (2,500 sq. cm). Wrinkles are a clumsy solution to a problem that never would have presented itself in the first place if 90% of our disk space were going to waste.

What’s more, our bodies simply couldn’t afford to maintain so much idle neuronal tissue since the brain is an exceedingly expensive organ to own and operate—at least in terms of energy needs. At birth, babies actually have up to 50% more neural connections among the billions of brain cells than adults do, but in the first few years of life (and, to a lesser extent, on through sexual maturity) a process of pruning takes place, with many of those synaptic links being broken and the ones that remain growing stronger. That makes the brain less diffuse and more efficient—which is exactly the way any good central processing unit should operate. It also allows it to use up fewer calories, which is critical.

“We were a nutritionally marginal species early on,” the late William Greenough, a psychologist and brain development expert at the University of Illinois, told me for my 2007 book Simplexity. “A synapse is a very costly thing to support.”

Added Ray Jackendoff, co-director of the Center for Cognitive Studies at Tufts University, “The thing that’s really astonishing might not be that we lose so many connections, but that the brain’s plasticity and growth are able to continue for as long as they do.”

OK, so the Lucy screenwriters aren’t psychologists or directors of cognitive studies institutes. But they do have the same 100 billion neurons everybody else’s brains have. Here’s hoping they take a few billion of them out for an invigorating run before they write their next sci-fi script.

TIME Brain

Learning to Read Does Not End in Fourth Grade


Do you remember when reading stopped requiring so much effort, and became almost second nature?

Probably not, but researchers have long believed that it probably happened some time during fourth grade. That’s when, they thought, word-processing tended to become more automatic and less deliberate, and you started to read to learn, as opposed to learning to read.

But a new study published in the journal Developmental Science questions that assumption, showing that children are still learning to read past fourth and even fifth grade. The shift to automatic word-processing, in which the brain recognizes within milliseconds whether a group of symbols constitutes a word and so allows the fluid reading that lets a reader focus on the content of a text rather than on the words themselves, may occur later than previously thought.

To test when this process develops, researchers fitted 96 students, from third-, fourth- and fifth-graders to college students, with electrode caps to record their brain activity as they were shown real words, fake words, strings of letters and strings of random symbols on a screen.

The third-, fourth- and fifth-graders processed real words, fake words, and letter strings similarly to the college students, showing that some automatic word-processing begins as early as third grade. But only the college students processed the meaningless symbols differently from actual words—which suggests that brain activity in the three groups of young children remained the same whether they were processing real words or not. While they showed some signs of automatic word processing, or no longer exerting effort to read, for the most part the younger children still treated familiar and unfamiliar words in the same way.

However, when the researchers switched to a written test, which presumably gave the participants more time to think about the distinctions, all groups scored above 95 percent, showing that with some effort, or when their conscious brains were involved, the children also recognized the difference between real and fake words.

That suggests that for the young children, the processing wasn’t automatic just yet. Study author Donna Coch, associate professor of education at Dartmouth, says that it’s not that fourth graders can’t read well, but rather they aren’t quite as efficient as adults at reading.

“You have a limited amount of resources, and if you’re using them on words that could not be words in your language, that’s taking up resources that could be used in word processing,” says Coch. “If you don’t have to put in effort to sound out words, you can pay more attention to understanding.”

So if fourth-graders aren’t quite reading to learn, then when does the shift toward more complete automatic word-processing occur? According to Coch, that probably happens some time between fifth grade and college—a period that she says hasn’t been studied.

For now, the results strongly suggest that reading skills need to continue to be nurtured during that period. “This certainly does suggest that teachers beyond fourth grade are still teachers of reading,” says Coch.

TIME Brain

Want to Learn a Language? Don’t Try So Hard

If at first you don't succeed, trying again might not help you when it comes to learning languages.

A new study from MIT shows that trying harder can actually make some aspects of learning a new language more difficult. While researchers have known that adults have a harder time with new languages than children do, the latest findings, published in the journal PLOS ONE, suggest that adults’ stronger cognitive abilities may actually trip them up.

Children have a “sensitive period” for learning language that lasts until puberty, and during these years, certain parts of the brain are more developed than others. For example, they are adept at procedural memory, which study author Amy Finn, a postdoc at MIT’s McGovern Institute for Brain Research, describes as the “memory system we get for free.” It’s involved in things we learn unconsciously, such as riding a bike, dancing or picking up subtle language rules. It’s a system that learns from observing and from experience; neural circuits in the brain build a set of rules for constructing words and sentences by absorbing and analyzing information—like sounds—from the world around them.

“The procedural memory is already in place for an infant and working well, and not interacting with other brain functions,” says Finn. However, as people age, another memory system, one less based on exploratory processes, starts to mature and take control of the language-learning process. “As an adult, you have really useful late-developing memory systems that take over and do everything.”

In essence, adults may over-analyze new language rules or sounds and try to make them fit into some understandable and coherent pattern that makes sense to them. But a new language may involve grammar rules that aren’t so easily explained, and adults have more difficulty overcoming those obstacles than children, who simply absorb the rules or exceptions and learn from them. That’s especially true with pronunciation, since the way we make sounds is something that is established early in life, and becomes second nature.

“Adults are much better at picking up things that are going to immediately help them like words and things that will help them navigate a supermarket,” says Finn. “You can learn language functionally as an adult, but you’ll never sound like a native speaker.”

So how can adults remove their own roadblocks to learning new languages? Finn says more research needs to be done to determine if adults can ever go back to learning languages like children, but linguists are looking at a variety of options. A few include “turning off” certain areas of the brain using a drug or a technique called transcranial magnetic stimulation, which might allow adults to be more open to accepting new language rules and sounds.

Finn also hopes to study adults performing a challenging task while they learn a language, which is another way of distracting the cognitive portions of the brain from focusing on the new language, to see if that can help them to absorb more linguistic information.
