TIME Diet/Nutrition

Here’s The Scientific Way To Make A Perfect Pumpkin Pie

Prebake the crust for pumpkin pie before filling


The kind of fat that goes into a pie dough can totally change the chemistry of the crust—and for a supremely flaky crust, you can’t beat lard, as former White House pastry chef Bill Yosses explains in the above selection from the 2014 World Science Festival event “Biophysics? More like Pie-o-Physics!” (Yosses is something of an authority on deliciousness; earlier this year, President Obama joked that his pies were so good he must be lacing them with crack cocaine.)

But traditional Thanksgiving fare presents additional “pie-o-physics” conundrums. Pumpkin pie filling is closer on the pastry evolutionary tree to flan or custard than to a fruit pie. Baking one requires some special considerations, according to Yosses.

In pumpkin pie, “the eggs coagulate to form a silken smooth network,” Yosses told us. “The egg proteins shrink as they cook, and you need to stop the process at the right time.” The time to remove a pumpkin pie, he says, is when it is “set,” but the center should still jiggle when shaken in the oven. “This is sensitive because too little cooking and the pie will be liquid.”

To avoid overcooking his pumpkin pies, one trick Yosses likes to employ is to lower the bottom of the pie dish into cold water for about 30 seconds right after taking it out of the oven (take care not to splash water or burn yourself). This will stop the protein threads from continuing to cook.

“I like a filling made with acorn squash and some sugar pumpkin, and I love trying all kinds of vegetable and ginger variations—but then it is not really a pumpkin pie,” Yosses says. He prebakes the crust for his pumpkin pie before filling. If you do the same, but don’t want an extra-crispy edge on the crust that forms during the second round in the oven, he recommends covering the edge with aluminum foil before baking.

If any foodies reading this feel guilty about going with canned pumpkin instead of the fresh stuff, take comfort in the fact that Yosses himself often reaches for a can of Libby’s pumpkin pie mix. As he says: “Why reinvent the wheel?”

This piece originally appeared on World Science Festival.

TIME Opinion

The Reason Every One of Us Should Be Thankful

Illustration of preparing the Thanksgiving meal circa 1882. Kean Collection / Getty Images

As Thanksgiving approaches, a little bit of historical context goes a long way

Astronomy is a historical science because the distance scales involved are so immense that to look out into space is to look back into time. Even at the almost unfathomable speed of light — 300,000 kilometers per second — the sun is eight light minutes away, the nearest star is 4.3 light years away, the nearest large galaxy, Andromeda, is about 2.5 million light years away and the farthest object ever observed is about 13.8 billion light years away. Astronomers call this way of describing such distances “lookback time.”
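The conversion behind those figures is simple: distance equals the speed of light multiplied by travel time. Here is a minimal Python sketch using the article's rounded 300,000 km/s; the 150 million km Earth-Sun distance is my added figure, not the article's.

```python
# Lookback time: how long light takes to cross a given distance.
C_KM_PER_S = 300_000             # speed of light, rounded as in the article

def lookback_minutes(distance_km):
    """Light-travel time in minutes for a distance given in kilometers."""
    return distance_km / C_KM_PER_S / 60

SUN_DISTANCE_KM = 150_000_000    # average Earth-Sun distance (assumed figure)
print(f"Sunlight is ~{lookback_minutes(SUN_DISTANCE_KM):.1f} minutes old")  # ~8.3
```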

The concept is not limited to astronomy: current events also have their own lookback times, accounting for what gave rise to them. Just as looking at a star now actually involves seeing light from the past, looking at the world today actually involves looking at the reverberations of history. We have to think about the past in order to put current events into proper context, because that’s the only way to track human progress.

Consider the longing many people have for the peaceful past, filled with bucolic scenes of pastoral bliss, that existed before overpopulation and pollution, mass hunger and starvation, world wars and civil wars, riots and revolutions, genocides and ethnic cleansing, rape and murder, disease and plagues, and the existential angst that comes from mass consumerism and empty materialism. Given so much bad news, surely things were better then than they are now, yes?

No.

Overall, there has never been a better time to be alive than today. As I document in my 2008 book The Mind of the Market and in my forthcoming book The Moral Arc, if you lived 10,000 years ago you would have been a hunter-gatherer who earned the equivalent of about $100 a year — extreme poverty is defined by the United Nations as less than $1.25 a day, or $456 a year — and the material belongings of your tiny band would have consisted of about 300 different items, such as stone tools, woven baskets and articles of clothing made from animal hides. Today, the average annual income in the Western world — the U.S. and Canada, the countries of the European Union, and other developed industrial nations — is about $40,000 per person per year, and the number of available products is over 10 billion, with the universal product code (barcode) system having surpassed that number in 2008.

Poverty itself may be going extinct, and not just in the West. According to UN data, in 1820, 85-95% of the world’s people lived in poverty; by the 1980s that figure was below 50%, and today it is under 20%. Yes, 1 in 5 people living in poverty is too many, but if the trends continue, no one in the world will be poor by 2100, and possibly even by 2050, including in Africa.

Jesus said that one cannot live on bread alone, but our medieval ancestors did nearly that. Over 80% of their daily calories came from the nine loaves a typical family of five consumed each day. Food also devoured 60 to 80% of a family’s income, which, after housing and clothing expenses, left next to nothing for discretionary spending or retirement. Most prosperity has happened over the two centuries since the Industrial Revolution, and even more dramatic gains have been enjoyed over the last half-century. From 1950 to 2000, for example, the per capita real Gross Domestic Product of the United States went from $11,087 (adjusted for inflation and computed in 1996 dollars) to $34,365, more than a threefold increase in comparable dollars! This has allowed more people to own their own homes, and for those homes to double in size even as family size declined.
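For readers who want to check that arithmetic, the quoted figures work out to roughly a 3.1-fold rise, that is, an increase of about 210%. A quick verification:

```python
# Per capita real GDP of the U.S., in 1996 dollars (figures quoted above).
gdp_1950 = 11_087
gdp_2000 = 34_365

ratio = gdp_2000 / gdp_1950          # ~3.10, i.e. more than threefold
pct_increase = (ratio - 1) * 100     # ~210% increase
print(f"{ratio:.2f}x the 1950 level, a {pct_increase:.0f}% increase")
```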

For centuries human life expectancy bounced around between 30 and 40 years, until the average went from 41 in 1900 to the high 70s and low 80s in the Western world in 2000. Today, no country has a lower life expectancy than the country with the highest life expectancy did 200 years ago. Looking back a little further, around the time of the Black Death in the 14th century, even if you escaped one of the countless diseases and plagues that were wont to strike people down, young men were 500 times more likely to die violently than they are today.

Despite the news stories about murder in cities like Ferguson and rape on college campuses, crime is down. Way down. After the crime wave of the 1970s and 1980s, homicides plummeted between 50 and 75% in such major cities as New York, Los Angeles, Boston, Baltimore and San Diego. Teen criminal acts fell by over 66%. Domestic violence against women dropped 21%. According to the U.S. Department of Justice, the overall rate of rape declined 58% between 1995 and 2010, from 5.0 per 1,000 women age 12 or older to 2.1. And on Nov. 10, 2014, the FBI reported that in 2013, across the more than 18,400 city, county, state and federal law enforcement agencies that report crime data to the bureau, every crime category saw declines.

What about the amount of work we have today compared with that of our ancestors? Didn’t they have more free and family time than we do? Don’t we spend endless hours commuting to work and toiling in the office until late into the neon-lit night? Actually, the total hours of life spent working have been steadily declining over the decades. In 1850, for example, the average person spent 50% of his or her waking hours in the year working, compared to only 20% today. Fewer working hours means more time for doing other things, including doing nothing. In 1880, the average American enjoyed just 11 hours per week in leisure time, compared to today’s 40 hours per week.

That leisure time can be spent in cleaner environments. In my own city of Los Angeles, for example, in the 1980s I had to put up with an average of 150 “health advisory” days per year and 50 “stage one” ozone alerts caused by all the particulates and pollutant gases in the air—dirt, dust, pollens, molds, ashes, soot, aerosols, sulfur dioxide and nitrogen oxides—AKA smog. Today, thanks to the Clean Air Act and improved engine and fuel technologies, those numbers have collapsed: in 2013 there was only one health advisory day and not a single stage-one ozone alert. Across the country, even with the doubling of the number of automobiles and an increase of 150% in the number of vehicle-miles driven, smog has diminished by a third, acid rain by two-thirds, airborne lead by 97%, and CFCs are a thing of the past.

Today’s world has its problems — many of them serious ones — but, while we work to fix them, it’s important to see them with astronomers’ lookback-time eyes. With their historical context, even our worst problems show that we have made progress.

Rewind the tape to the Middle Ages, the Early Modern Period or the Industrial Revolution and play it back to see what life was really like in a world lit only by fire. Only the tiniest fraction of the population lived in comfort, while the vast majority toiled in squalor, lived in poverty and expected half their children would die before adulthood. Very few people ever traveled beyond the horizon of their landscape, and if they did it was either on horseback or, more likely, on foot. No Egyptian pharaoh, Greek king, Roman ruler, Chinese emperor or Ottoman sultan had anything like the most quotidian technologies and public-health benefits that ordinary people take for granted today. Advances in dentistry alone should encourage us all to stay away from time machines.

As it turns out, these are the good old days, and we should all be thankful for that.

Michael Shermer is the Publisher of Skeptic magazine, a monthly columnist for Scientific American, and a Presidential Fellow at Chapman University. He is the author of a dozen books, including Why People Believe Weird Things and The Believing Brain. His next book, to be published in January, is entitled The Moral Arc: How Science and Reason Lead Humanity Toward Truth, Justice, and Freedom.

TIME Science

This Is What Happens to Your Body When You Overeat

Just in time for Thanksgiving

In the days leading up to Thanksgiving, the American Chemical Society’s YouTube channel “Reactions” has boiled down the science of stuffing your face in a two-and-a-half-minute video. It explains how signals in the brain tell us when we are feeling full or when it is time to stop eating, as well as how antacids reduce the physical discomfort. Might be fodder for conversation after dinner when everyone is sitting around, running out of things to talk about and on the verge of slipping into a food coma.


TIME

On Evolution Day, Remember That Darwin Knew He’d Meet Resistance

A statue of Darwin in the Natural History Museum, London Philippe Lissac—Godong / Getty Images

Plus, TIME's original coverage of the anti-evolution arguments of the 1925 Scopes trial

Correction appended, Nov. 24, 2014, 5:49 p.m.

Time was, “Darwin” was just a guy’s name. It was not a noun (Darwinism) or an adjective (Darwinian). And it certainly wasn’t a flash point for debate between folks who prefer a Scriptural view of the history of life and those who take a more scientific approach. That started to change 155 years ago today, on Nov. 24, 1859, when Charles Darwin’s seminal work—On the Origin of Species—was published.

Darwin knew that by supporting an empirical theory of evolution as opposed to the Biblical account of Creation, he was asking for trouble. Two weeks before the book’s publication, he sent letters to 11 prominent scientists of his day, asking for their support—or at least their forbearance—and acknowledging that for some of them, that would not be easy. To the celebrated French botanist Alphonse de Candolle he wrote:

Lord, how savage you will be, if you read it, and how you will long to crucify me alive! I fear it will produce no other effect on you; but if it should stagger you in ever so slight a degree, in this case, I am fully convinced that you will become, year after year, less fixed in your belief in the immutability of species.

And to American Asa Gray, another botanist, he conceded:

Let me add I fully admit that there are very many difficulties not satisfactorily explained by my theory of descent with modification, but I cannot possibly believe that a false theory would explain so many classes of facts as I think it certainly does explain.

But the whirlwind came anyway. Speaking of Darwin in 1860, the Bishop of Oxford asked: “Was it through his grandfather or his grandmother that he claimed his descent from a monkey?” The battle raged in the U.S. in the summer of 1925, with the trial of John Scopes, a substitute school teacher charged with violating a Tennessee statute forbidding the teaching of evolution in schools.

But Darwin and his theory of evolution endured, so much so that Nov. 24 is now recognized as Evolution Day. As if serendipity and circumstance were conspiring to validate that decision, it was on another Nov. 24, in 1974, that the fossilized remains of Lucy, the Australopithecus whose discovery did so much to fill in a major gap in human evolution, were found in Ethiopia.

In honor of Lucy and Evolution Day and Darwin himself, check out TIME’s coverage of the florid anti-evolution closing argument of prosecuting attorney and three-time presidential candidate William Jennings Bryan during the Scopes trial, as quoted in the magazine’s Aug. 10, 1925 issue:

“Darwin suggested two laws, sexual selection and natural selection. Sexual selection has been laughed out of the class room, and natural selection is being abandoned, and no new explanation is satisfactory even to scientists. Some of the more rash advocates of Evolution are wont to say that Evolution is as firmly established as the law of gravitation or the Copernician theory.

“The absurdity of such a claim is apparent when we remember that any one can prove the law of gravitation by throwing a weight into the air and that any one can prove the roundness of the earth by going around it, while no one can prove Evolution to be true in any way whatever.”

Bryan died mere days after the trial ended but, as the historical record shows, his strenuous efforts paid off—sort of. Scopes was duly convicted. His sentence for teaching what most of the world now accepts as science: $100.

Read the full text of that story, free of charge, here in the TIME archives, or in its original format, in the TIME Vault: Dixit

Correction: The original version of this article misstated the date of Darwin Day. Darwin Day is typically celebrated on February 12.

TIME Science

What You Didn’t Know About William Jennings Bryan. What You Should Know About Darwin

William Jennings Bryan, center, arrives at Dayton, Tenn., in 1925. AP

Enthusiasm for improving the human race could have negative consequences


This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

The action of the North Carolina legislature to pay compensation to victims of a forced sterilization program brings attention to an almost forgotten chapter of American history. It may also provide an opportunity to set the infamous Scopes trial in a broader light and do justice to the much-maligned William Jennings Bryan for his role in that case.

The sterilizations that were carried out in so many American states were a direct result of Charles Darwin’s writings on the theory of evolution, writings which alarmed William Jennings Bryan and led him to campaign against the teaching of evolution in American schools. Bryan was no theologian but an intensely practical man concerned with consequences. His goal in life was to make the world a better place, and he believed that teaching evolution would not do that. In one area at least he was right.

Darwin followed up his publication of The Origin of Species in 1859 with a second book, The Descent of Man, published in 1871. Bryan had prepared a long statement for the Scopes trial, but the trial came to an end before he could enter it into the record. In his statement, he quoted from The Descent of Man, in which Darwin had written:

With savages, the weak in body or mind are soon eliminated; and those that survive commonly exhibit a vigorous state of health. We civilized men, on the other hand, do our utmost to check the process of elimination; we build asylums for the imbecile, the maimed and the sick: we institute poor laws and our medical men exert their utmost skill to save the life of every one to the last moment. There is reason to believe that vaccination has preserved thousands who from a weak constitution would formerly have succumbed to smallpox. Thus the weak members of civilized society propagate their kind. No one who has attended to the breeding of domestic animals will doubt that this must be highly injurious to the race of man. It is surprising how soon a want of care, or care wrongly directed, leads to the degeneration of a domestic race; but, excepting in the case of man himself, hardly anyone is so ignorant as to allow his worst animals to breed.

Bryan was appalled. Darwin, he wrote,

. . . reveals the barbarous sentiment that runs through evolution and dwarfs the moral nature of those who become obsessed with it. Let us analyze the quotation just given. Darwin speaks with approval of the savage custom of eliminating the weak so that only the strong will survive, and complains that “we civilized men do our utmost to check the process of elimination.” How inhuman such a doctrine as this! He thinks it injurious to “build asylums for the imbecile, the maimed and the sick” or to care for the poor. Even the medical men come in for criticism because they “exert their utmost skill to save the life of everyone to the last moment.” And then note his hostility to vaccination because it has “preserved thousands who, from a weak constitution would, but for vaccination, have succumbed to smallpox!” All of the sympathetic activities of civilized society are condemned because they enable “the weak members to propagate their kind.” . . . Could any doctrine be more destructive of civilization? And what a commentary on evolution! He wants us to believe that evolution develops a human sympathy that finally becomes so tender that it repudiates the law that created it and thus invites a return to a level where the extinguishing of pity and sympathy will permit the brutal instincts to again do their progressive (?) work! . . . Let no one think that this acceptance of barbarism as the basic principle of evolution died with Darwin. (Memoirs, p. 550)

Bryan’s concern was with practical outcomes, not “What do Christians believe?” but “What difference does it make?” Evolution seemed to him to be making the wrong kind of difference, and in his time it often did. The development of what is now called “Social Darwinism” is far too complex to be dealt with fairly in this brief article, but German militarism in World War One seems to have been influenced by it, and the Turkish genocide of Armenians was justified by some on the same grounds.

The same enthusiasm for improving the human race, called “eugenics,” led a majority of the states to pass laws allowing the sterilizing and castrating of selected populations, typically prisoners and those with reduced mental abilities. California, in particular, carried out thousands of sterilizations. Unnoticed at the time of the Scopes trial was the publication in the same year, 1925, of Mein Kampf, in which Adolf Hitler called for the improvement of the race by the elimination of inferior people: Jews, gypsies, homosexuals, the mentally retarded, and others. Once the full significance of that program became visible at the end of World War Two, eugenics and social Darwinism took on a different appearance, and the interest in improving the human race by those methods withered away.

The ongoing struggle over the teaching of evolution in American classrooms and courthouses seems now to be confined to a conflict between what scientists believe and what some Christians believe, but unfortunately William Jennings Bryan’s role in the Scopes trial is caricatured in those same terms. Undoubtedly Bryan had a simplistic view of the Bible, but the cause he was championing was one with which most Americans today would probably agree: human beings are not to be treated as mere tools in some gigantic experiment. Human progress is created when societies find new and better ways to incorporate the weakest and most handicapped as fully as possible in the life of their community. The North Carolina legislature has taken an important step in recognizing a wrong turn in its past and has set an example for many other states to follow.

Perhaps it is time also to rescue William Jennings Bryan’s reputation from the stain of the Scopes trial. He was not a brilliant and original thinker but he was a man who consistently worked for the weaker members of society and set the Democratic party on the side of those who were being left behind in the free-for-all evolutionary struggle of the age of industrialization.

Christopher L. Webber is an Episcopal priest and author of some thirty books including “American to the Backbone,” the biography of the fugitive slave and abolition leader, James W.C. Pennington. He is a graduate of Princeton University and the General Theological Seminary who has served parishes in Tokyo, Japan, and the New York area and currently lives in San Francisco. The work of William Jennings Bryan is dealt with more fully in Christopher Webber’s book, “Give Me Liberty: Speeches and Speakers that Shaped America,” Pegasus, 2014.

TIME Science

A Sheep, a Duck and a Rooster in a Hot-Air Balloon — No Joke

Illustration of a Jean-Francois Pilatre de Rozier flight from 'Histoire des Ballons' by Gaston Tissandier Print Collector / Getty Images

Nov. 21, 1783: Two men take flight over Paris on the world’s first untethered hot-air balloon ride

Before subjecting humans to the unknown dangers of flight in a hot-air balloon, French inventors conducted a trial run, sending a sheep, a duck and a rooster up in the air over Versailles.

Anyone who was anyone in pre-revolution France came out for the September 1783 demonstration in the courtyard of the royal palace. According to Simon Schama, the author of Citizens: A Chronicle of the French Revolution, the spectators included King Louis XVI, his wife Marie Antoinette and 130,000 French citizens who, six years before returning to the palace to riot over the scarcity of bread, were drawn by sheer curiosity over how the animals would fare in the balloon’s basket.

The eight-minute flight, which ended in the woods a few miles from the palace, didn’t seem to do the barnyard trio any harm, Schama writes: “‘It was judged that they had not suffered,’ ran one press comment, ‘but they were, to say the least, much astonished.’”

The public was similarly astonished when, on this day, Nov. 21, two months after the sheep and fowl made their historic trip, two eminent Frenchmen went aloft themselves in the world’s first untethered hot-air balloon ride.

Jean-François Pilâtre de Rozier, a chemistry and physics teacher, and the Marquis d’Arlandes, a military officer, flew nearly six miles, from the center of Paris to the suburbs, in 25 minutes. This time, Benjamin Franklin was among the spectators, according to Space.com. He later marveled in his journal about the experience, writing, “We observed [the balloon] lift off in the most majestic manner. When it reached around 250 feet in altitude, the intrepid voyagers lowered their hats to salute the spectators. We could not help feeling a certain mixture of awe and admiration.”

The flight came more than a century before the Wright brothers lifted the first powered airplane off the ground in 1903, and more than two centuries before another pair — a Swiss psychiatrist and a British balloon instructor — circumnavigated the globe in a balloon in a record-breaking 20 days. This first balloon, rather delicately constructed of paper and silk, and requiring a large supply of fuel to stoke the fire that kept it aloft (but also threatened to burn it down), likely wouldn’t have made it so far.

There were still a few bugs to work out in this novel form of flight. The inventors themselves didn’t quite grasp the physics that made the balloon rise, believing that they had discovered a new kind of gas that was lighter than air. In fact, the gas was air, just hotter and therefore less dense than the air surrounding it.
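The physics they were missing is the ideal gas law: at a fixed pressure, air density falls in proportion to absolute temperature. A minimal sketch using standard constants (the 100 °C envelope temperature is an assumed, illustrative figure):

```python
# Density of air from the ideal gas law: rho = P * M / (R * T).
P = 101_325      # sea-level pressure, Pa
M = 0.02896      # molar mass of dry air, kg/mol
R = 8.314        # universal gas constant, J/(mol*K)

def air_density(temp_k):
    """Air density in kg/m^3 at absolute temperature temp_k (kelvin)."""
    return P * M / (R * temp_k)

ambient = air_density(288)   # ~15 C outside      -> ~1.23 kg/m^3
heated = air_density(373)    # ~100 C in envelope -> ~0.95 kg/m^3
print(f"Lift: ~{ambient - heated:.2f} kg per cubic meter of heated air")
```

Each cubic meter of air heated to around 100 °C can lift roughly 0.3 kg, which is why hot-air balloons have to be so enormous.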

Experimenting with different gases ultimately led to the demise of one of the intrepid voyagers aboard the first balloon flight. Pilâtre de Rozier was killed two years later while attempting to cross the English Channel in a balloon powered by hydrogen and hot air, which exploded.

Read about the 1999 balloon trip around the world, here in the TIME Vault: Around the World in a Balloon in 20 Days

TIME Research

Study Suggests Banking Industry Breeds Dishonesty

Bank industry culture “seems to make [employees] more dishonest,” a study author says

Bank employees are more likely to exhibit dishonesty when discussing their jobs, a new study found.

Researchers in Switzerland tested employees from several industries with a coin-toss game in which each reported toss earned money if it came up on the side designated in advance as the winning one. According to Reuters, there was “a considerable incentive to cheat” given the maximum pay-off of $200. One hundred and twenty-eight employees from one bank were tested and were found to be generally as honest as everyone else when asked questions about their personal lives prior to flipping the coins, the Associated Press reports. But when they were asked about work before the toss, they were more inclined to give false answers, the study determined. Because the tosses were self-reported and unobserved, cheating could not be pinned on any individual; it shows up only statistically, when a group’s reported success rate climbs above the 50% that chance predicts.
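A minimal simulation of that group-level logic, with an invented per-toss cheating rate (an illustration, not the study's code):

```python
# If each losing toss is misreported as a win with some probability,
# the group's reported win rate drifts above the honest 50% baseline.
import random

def reported_win_rate(n_flips, cheat_prob):
    """Fraction of flips reported as wins, given a per-loss cheating chance."""
    wins = 0
    for _ in range(n_flips):
        won = random.random() < 0.5          # the actual fair toss
        if not won and random.random() < cheat_prob:
            won = True                       # the loss is misreported
        wins += won
    return wins / n_flips

random.seed(0)
print(f"honest group:   {reported_win_rate(10_000, 0.0):.1%}")  # ~50%
print(f"cheating group: {reported_win_rate(10_000, 0.3):.1%}")  # ~65%
```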

The author of the study says bankers are not any more dishonest than other people, but that the culture of the industry “seems to make them more dishonest.”

The American Bankers Association pushed back against the study’s findings in comments to the AP.

“While this study looks at one bank, America’s 6,000 banks set a very high bar when it comes to the honesty and integrity of their employees. Banks take the fiduciary responsibility they have for their customers very seriously,” the Association said.

[AP]

TIME Science

Sorting Fact From Fiction and What the Best Science Writing Can Teach Us

Barbara J. King is a biological anthropologist at the College of William and Mary.

The latest volume of 'The Best American Science and Nature Writing' puts today's diseases and outbreaks in context

My bedside table holds a jumble of fiction and non-fiction books, since in my reading I trade off across genres. In any given week, my mood swings between a desire to lose myself in the vivid writing of novelists who create imaginary worlds and a competing wish to keep up with breaking knowledge in science. Sometimes, though, vivid writing and scientific material happily collide in a single volume, as they do in The Best American Science and Nature Writing 2014.

Edited by the Pulitzer Prize-winning writer Deborah Blum, this book isn’t, of course, fiction. It is a work of science, not a novel. But it does contain worlds of imagination, as 26 scientists and science writers offer enticing essays spanning multiple disciplines and topics. (I am honored to be included for a piece I wrote called “When Animals Mourn” that originally appeared in Scientific American and is based on my book How Animals Grieve.)

In her introduction, Blum promises readers “stories that range from the shimmer of deep space to the wayward nature of a wild sheep,” tales that show “the stumbles and the hopes, the unexpected ideas and unexpected beauty” of doing science.

Several of the chapters focus on the body, health and disease. Do you know which is the most infectious microbe in the world, with a 90% rate of transmission? I didn’t, until reading Seth Mnookin’s chapter “The Return of Measles” (originally published in The Boston Globe Magazine). “The fact that measles can live outside the human body for up to two hours,” Mnookin writes, “makes a potential outbreak all the more menacing.” Alarmingly, parents who refuse to allow their children to be vaccinated against measles and other diseases turn this theoretical public-health risk into a risk quite concrete: In 2013 an unvaccinated 17-year-old from Brooklyn caught the virus while in the UK, and once he returned home it spread rapidly through a community where many other deliberately unvaccinated children lived. Fifty-eight people came down with measles, making it, Mnookin says, “the largest outbreak in the country in more than 15 years.” The costs—health- and money-wise—were significant. Measles may be fatal, as it was during France’s recent prolonged outbreak: in 2007 only 44 measles infections were reported there; over the next four years, Mnookin notes, 20,000 people were sickened, almost 5,000 people were hospitalized, and 10 died.
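Mnookin's 90% figure can be put in context with a standard epidemiological rule of thumb (my illustration, not from his chapter): given measles' commonly cited basic reproduction number R0 of 12 to 18, the fraction of a population that must be immune to block sustained spread is 1 - 1/R0, which is why even small pockets of unvaccinated children can sustain an outbreak.

```python
# Herd-immunity threshold 1 - 1/R0, over measles' commonly cited R0 range.
for r0 in (12, 15, 18):
    threshold = 1 - 1 / r0
    print(f"R0 = {r0}: ~{threshold:.0%} of the population must be immune")
```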

By contrast, a disease that’s still greatly misunderstood—and feared—as highly contagious isn’t at all. Rebecca Solnit’s piece “The Separating Sickness” (first published in Harper’s Magazine) profiles people who have Hansen’s Disease, also known as leprosy. Somehow I’d thought that this condition was no longer present in the U.S., but in 2011, 173 people were diagnosed with it in this country. The U.S.’s largest leprosy clinic is located in Baton Rouge, Louisiana, and Solnit’s profile of what goes on there is informative and inspiring. “Contrary to long-standing belief,” Solnit writes, leprosy “is very nearly the least contagious contagious disease on earth. Ninety-five percent of us are naturally immune to the disease, and the rest have a hard time catching it.” Yet those who did catch it in past decades suffered not only physically (with skin lesions and sometimes the need for amputation of limbs owing to neuropathy) but also emotionally, because of the disease’s terrible stigma. At places like the Baton Rouge clinic, that stigma has vanished (though, sadly, it persists in other places). And, if the disease is caught early, the cure may be total.

It’s a real challenge, it seems to me, for the human brain to assess relative risks accurately. We see this also with the Ebola epidemic in Liberia, Sierra Leone and Guinea, specifically in the fear and anxiety that regional epidemic has caused for people living in other parts of the world. This striking map of Africa without Ebola and its accompanying text put the matter into perspective. Yet when isolated cases occur in the U.S. or European countries, panic has ensued, along with disturbing patterns of discrimination against people thought (incorrectly) to be possible sources of contamination through casual contact. (As scientists have widely reported, the virus is transmitted during the acute phase of the illness via bodily fluids.)

Mnookin’s and Solnit‘s chapters intersect powerfully with one about the effect of sudden, violent and debilitating trauma on the body and the mind. In “A Life-or-Death Situation” (originally published in The New York Times Magazine), Robin Marantz Henig describes the day when a retired English professor named Brooke Hopkins goes out for a bicycle ride in Utah canyon country and collides with another cyclist. Gravely injured, with a snapped neck, Hopkins stops breathing but is revived on the trail; his living will, a document unknown to his rescuer, had specified no heroic measures in the case of catastrophic injury or illness. In one of life’s dark ironies, his wife Peggy Battin is a well-known scholar in the bioethics of end-of-life decisions. In captivating prose, Henig recounts the twisting course of the next years as Hopkins copes—just as happens with sufferers of Hansen’s Disease—in rollercoaster ways both physical and emotional. He catapults from good to poor health, from steely determination to shaky hesitation about wanting to continue on.

Henig’s was one of the pieces in the book that affected me most deeply, perhaps because I know that what happened to Hopkins and Battin could happen to any of us when we begin an apparently routine day with a bicycle ride: at some subconscious level, our brains know the risk of trauma exists, but we don’t dwell on it, and surely this is the right approach. (We would do better to be alarmed daily at the rising costs of anthropogenic climate change, after all.)

Since the Paleolithic age, when we gathered in small groups in front of glorious cave images of animals or around a community fire to weave tales of the natural world, we humans have learned best through storytelling. The Best American Science And Nature Writing 2014 is a modern-day equivalent, in written form, of those conversations, this time between science writer and science-intrigued reader.

Barbara J. King is a biological anthropologist at the College of William and Mary who teaches, writes and speaks about animal studies, primate behavior, human evolution and evolutionary perspectives on gender. Her latest book is How Animals Grieve, published in 2013.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME

An Infant’s Brain Maps Language From Birth, Study Says

Rear view of baby girl
Vladimir Godnik—Getty Images

The infant's brain retains language that it hears at birth and recognizes it years later, even if the child no longer speaks that language.

A new study reveals that an infant’s brain may remember a language, even if the child has no idea how to speak a word of it.

The finding comes from a team of researchers at McGill University’s Department of Psychology and the Montreal Neurological Institute who are working to understand how the brain learns language.

As it turns out, the language that an infant hears starting at birth creates neural patterns that the unconscious brain retains years later, even if the child completely stops using the language. The study offers the first neural evidence that traces of so-called “lost” languages remain in the brain.

Because these lost languages commonly occur within the context of international adoptions—when a child is born where one language is spoken and then reared in another country with another language—the researchers recruited test subjects from the international adoption community in Montreal. They studied 48 girls between the ages of nine and 17. One group was born and raised speaking only French. The second group was bilingual, speaking French and Chinese fluently. The third group consisted of children born in China and adopted as infants, who became French speakers and whose exposure to Chinese ended in the first few years of life. They had no conscious recollection of the Chinese language. “They were essentially monolingual French at this point,” explained Dr. Denise Klein, one of the researchers, in an interview with TIME. “But they had been exposed to the Chinese language during the first year or two of their life.”

The three groups were asked to perform a Chinese tonal task. “It’s simply differentiating a tone,” said Klein. “Everybody can do it equally.” Scans were taken of their brains while they performed the task, and the researchers studied the images. The results of the study, published in the November 17 edition of the journal Proceedings of the National Academy of Sciences (PNAS), showed that the brain activation patterns of the adopted children who had “lost” or completely discontinued using Chinese matched those of participants who had spoken Chinese since birth—and differed completely from those of the monolingual French speakers.

The researchers interpret this to mean that the neural pathways for the Chinese language could only have been acquired during the first months of life. In layman’s terms, this means that the infant brain developed Chinese language patterns at birth and never forgot them, even though the child no longer speaks or understands the language.

“We looked at language that was abruptly cut off, so we could see what happens developmentally in that early period,” said Klein. “The sound of languages are acquired relatively early in life, usually within the first year. We’ve learned through a lot of seminal work that is out there that children start out as global citizens who turn their heads equally to all sounds and only later start to edit and become experts in the languages that they’re regularly exposed to.” The question for the researchers was whether the brains of the Chinese-born children who no longer spoke their native language would react like those of monolingual French speakers or like those of the bilingual group.

To see what neural pathways might still exist in a brain and to see what a brain might remember of the mother tongue, the researchers used Chinese language tones, which infants in China would have been exposed to before coming to live in French-speaking Montreal. “If you have never been exposed to Chinese, you would just process the tones as ‘sounds,'” said Klein. However, if someone had been previously exposed to Chinese, like the bilingual Chinese-French speakers, they would process the tone linguistically, using neural pathways in the language-processing hemisphere of their brain, not just the sound-processing ones. Even though they could have completed the task without activating the language hemisphere of their brain, their brains simply couldn’t suppress the fact that the sound was a language that they recognized. Even though they did not speak or understand the language, their brains still processed it as such.

The results were that the brain patterns of the Chinese-born children who had “lost” their native tongue looked like the brains of the bilingual group, and almost nothing like the monolingual French group. This was true, even though the children didn’t actually speak any Chinese. “These templates are maintained in the brain, even though they no longer have any knowledge of Chinese,” said Klein, who was not surprised that these elements remained in the brain.
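To make the idea of “matching” activation patterns concrete, here is a toy sketch, emphatically not the study’s analysis: if each group’s response to the tonal task is summarized as a vector of regional activation levels, similarity can be scored with a simple correlation. Every number below is invented for illustration.

```python
# Toy comparison of activation "templates" via Pearson correlation.
import math

def correlation(xs, ys):
    """Pearson correlation between two equal-length activation vectors."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical mean activations in four language-related regions.
bilinguals = [0.90, 0.70, 0.80, 0.20]
adoptees   = [0.85, 0.75, 0.80, 0.25]
french     = [0.20, 0.30, 0.25, 0.90]

print(correlation(adoptees, bilinguals))  # near 1: the patterns match
print(correlation(adoptees, french))      # negative: the patterns differ
```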

As with most scientific research, this finding opens the door to even more questions, particularly as to whether children exposed to a language early on in life, even if they don’t use the language, will have an easier time learning that language later in life. Don’t go rushing to Baby Einstein quite yet, though. “We haven’t tested whether children who are exposed to language early, re-learn the language more easily later,” said Dr. Klein, “But it is what we predict.”

What the study does suggest, though, is the importance of this early phase of language exposure. “What the study points out is how quite surprisingly early this all takes place,” said Klein. “There has been a lot of debate about what the optimal period for the development of language [is], and lots of people argued for around the ages of 4 or 5 as one period, then around age 7 as another and then around adolescence as another critical period. This really highlights the importance of the first year from a neural perspective.”

“Everything about language processing follows on the early ability to do these phonological discriminations,” said Klein. “You become better readers if you do these things.”

While Klein isn’t an expert in the field of language acquisition, she does surmise that the more languages you are exposed to, the better for neural pathway development, though she hasn’t fully tested that hypothesis. She mentioned other studies showing that early exposure to multiple languages can lead to more lingual “flexibility” down the road. Before you clean out Berlitz and build a Thai-Kurdish-German-Mandarin language playlist for your infant, note that Klein doesn’t recommend loading kids up with “thousands of languages.” She explains: “I don’t think bombarding somebody with multiple languages necessarily improves or changes anything.” Klein thought ensuring future lingual flexibility could come from exposure to just two or three languages at an early age.

To that end, Klein does think it’s important to develop these neural templates early in life, a process she considers similar to wiring a room: put in the plugs, ports and outlets first, and if you need to add a light later, you won’t have to start from scratch. Luckily, no products are required to develop a language template in the brain: simply talking to your baby in your native tongue is enough to develop those all-important neural pathways. If you want to invest in Baby Berlitz, well, the studies aren’t in yet, but it can’t hurt.

TIME Research

How to Survive a Spaceship Disaster


One of the most dangerous parts of an astronaut’s journey is the very beginning


Falling from ten miles up, with no spacesuit on, in air that’s 70 degrees below zero and so thin you can hardly draw breath… Conditions were not ideal for Peter Siebold, a test pilot flying on Virgin Galactic’s SpaceShipTwo, to survive. But he did. Siebold told investigators that he was thrown from the plane as it broke up, and unbuckled from his seat at some point before his parachute deployed automatically. It’s unclear at this point why the same thing didn’t happen for his copilot, Michael Alsbury.

Now, as spaceflight goes commercial, the destruction of both SpaceShipTwo and the unmanned Antares rocket is likely to bring the eyes of federal regulators back toward an industry that has until now enjoyed minimal red tape. The Commercial Space Launch Amendments Act, first passed by Congress in 2004, was designed to encourage innovation by keeping regulation light for the fledgling private space industry. But “the moratorium [was designed to] be in place until a certain date or the event of the first death,” Joanne Irene Gabrynowicz, editor-in-chief of the Journal of Space Law, told the MIT Technology Review. “Unfortunately, the first death has now occurred, and the FAA will likely revisit the need for regulations, if any.”

A Virgin Galactic spokesperson said in an email that the company couldn’t comment too broadly about the escape mechanisms for its spacecraft, due to the pending investigation. The spokesperson did confirm there are two exits from the cabin, but said that “specific design elements of the passenger cabin and spacesuits are still being developed and have not been made public.”

Since the earliest days of the space program, researchers have tried to develop realistic ways to provide astronauts with an emergency exit. But in an emerging field of such complexity, what mechanisms are plausible…and practical? Here’s a brief history of the effort so far.

Condition One: Failure To Launch

One of the most dangerous parts of an astronaut’s journey is the very beginning. To maximize the chance of survival during a launch, most spacecraft from the Mercury project onwards have incorporated a launch escape system (LES), which can carry the module containing the human crew away from a sudden threat to the rest of the craft—either while still on the launch pad, or during the initial ascent.

The Apollo LES was powered by a solid fuel rocket. At the first sign of trouble (transmitted by the loss of signal from wires attached to the launch vehicle), the LES would fire automatically, steering the command module up and away from danger, then jettison and allow the module to open its parachute and land. A similar principle lies behind the launch escape mechanisms used for Russia’s Soyuz capsules and the Shenzhou capsule used by the Chinese space program. The Orion spacecraft, NASA’s next generation of manned craft in development, also features an LES mounted on top of the craft, called a Launch Abort System.

On the private-industry side, SpaceX’s Dragon capsule incorporates the rocket motors of the escape mechanism into the sides of the capsule itself, instead of mounting the LES on top. Since the LES isn’t discarded after launch, this “pusher” method provides the capsule with emergency escape capability throughout the entire flight—something the Space Shuttle and Apollo crafts never had, the company notes. (The drawback is that, if unused, all the fuel for the escape system is extra weight to carry around.) Testing Dragon’s abort system both on the launch pad and in flight is something SpaceX expects to have done by January.

Using one of these devices is no picnic. Orion’s LAS was estimated to subject an astronaut to about 15.5 Gs—more than a fighter pilot experiences, but alleviated a little by the fact that the astronauts are lying on their backs. “They’ll feel the effects,” Orion’s launch abort systems director Roger McNamara told Space.com, but “the bottom line is they’ll be walking away.”
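For a sense of scale, the 15.5-G figure works out as follows; the 80 kg astronaut mass is my assumption for illustration.

```python
# Apparent weight under a 15.5 g acceleration.
G = 9.81                  # standard gravity, m/s^2
mass_kg = 80              # assumed astronaut mass

accel = 15.5 * G                               # ~152 m/s^2
apparent_weight_kn = mass_kg * accel / 1000    # ~12.2 kN
print(f"{accel:.0f} m/s^2, apparent weight ~{apparent_weight_kn:.1f} kN")
```

That is roughly fifteen times the astronaut’s normal weight pressing on the body, which is why lying on their backs matters: the load runs chest-to-back rather than head-to-foot.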

Condition Two: Disaster In Orbit

In the 1960s, General Electric tested an emergency inflatable device called MOOSE (Manned Orbital Operations Safety Equipment, but originally Man Out Of Space Easiest) that was basically a small rocket motor attached to a six-foot-long polyester bag equipped with a heat shield, life support system, radio equipment and parachute. After a space-suited astronaut exited his or her space vehicle and climbed into the bag, he or she would activate pressurized canisters that filled it up with polyurethane foam.

More recently, NASA explored a new escape pod design called the X-38, a 7-person lifeboat designed to provide an escape route for astronauts on the International Space Station (say in case the Soyuz space capsule were damaged, or made unavailable because of political infighting, or hijacked by Sandra Bullock). This design made it as far as test flights, but was scrapped in 2002 over budget concerns.

Condition Three: Extraterrestrial Rescue

What if a disaster trapped astronauts on the moon? To prepare for that contingency, NASA worked on designs for unmanned Gemini Lunar Rescue Vehicles that could scoop up a marooned crew of two or three astronauts from the lunar surface, or from orbit around the moon. But funding cutbacks during the Apollo program prevented the agency from fully exploring these designs.

Condition Four: Trouble With The Landing

NASA’s space shuttles had an inflight escape system, to be used only when the orbiter could not land properly after reentering the atmosphere, built around a pole that extended out from one of the side hatches. The astronauts would hook themselves to the pole with a Kevlar strap and then jump out, allowing the pole to guide them out and underneath the left wing of the spacecraft. However, for this exit system to work, the space shuttle would have to be in pretty good shape, capable of staying in controlled, gliding flight. You can see the pole being used in this test footage here:

This article originally appeared on World Science Festival.
