TIME Research

Scientists Now Know Why People Scream


Your brain processes shrieks differently from speech, finds a new study

A baby wails upon an airplane’s liftoff, a person shrieks when he stumbles upon something shocking, a kid throws a tantrum because she wants to get her way—people scream in reaction to all kinds of situations.

But exactly why we scream has remained a mystery. Now, new research published in the journal Current Biology suggests that hearing a scream may activate the brain’s fear circuitry, acting as a cautionary signal.

Scream science is a new area of study, so David Poeppel, a professor of psychology and neural science at New York University, and his co-authors collected an array of screams from YouTube, films and 19 volunteer screamers who screamed in a lab sound booth. (This last collection method, by the way, was a highlight for Poeppel, who said he found listening to and judging screams an amusing break from the monotony of lab work.)

The researchers first compared the acoustic properties of screams with those of normal conversation, measuring the screams’ volume and observing how volunteers responded to them behaviorally. They then examined brain images of people listening to screams and noticed something fascinating: screams weren’t being interpreted by the brain the way normal sounds were.

Normally, your brain takes a sound you hear and delivers it to a section of your brain dedicated to making sense of these sounds: What is the gender of the speaker? Their age? Their tone?

Screams, however, don’t seem to follow that route. Instead, the team discovered that screams are sent from the ear to the amygdala, the brain’s fear processing warehouse, says Poeppel.

“In brain imaging parts of the experiment, screams activate the fear circuitry of the brain,” he says. “The amygdala is a nucleus in the brain especially sensitive to information about fear.” That means screams are inherently considered not just sound but a trigger for heightened awareness.

From these screams, Poeppel and his team mapped “roughness,” an acoustic measure of how fast a sound changes in loudness. While normal speech modulates between 4 and 5 Hz, screams spike between 30 and 150 Hz. The higher the sound variation, the more terrifying the scream is perceived to be.
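Roughness, in other words, is essentially the rate at which a sound’s amplitude envelope fluctuates. As a toy illustration of that idea (a NumPy sketch, not the authors’ actual analysis pipeline; the function name and synthetic signals are hypothetical), the dominant modulation rate can be estimated from the spectrum of a signal’s envelope:

```python
import numpy as np

def modulation_rate(signal, sample_rate):
    """Estimate the dominant amplitude-modulation rate (Hz) of a sound.

    Approximates "roughness" as the strongest frequency in the spectrum
    of the signal's amplitude envelope (excluding the DC component).
    """
    n = len(signal)
    # Analytic signal via the Hilbert transform (pure-NumPy version).
    spectrum = np.fft.fft(signal)
    weights = np.zeros(n)
    weights[0] = 1
    weights[1:(n + 1) // 2] = 2
    if n % 2 == 0:
        weights[n // 2] = 1
    envelope = np.abs(np.fft.ifft(spectrum * weights))
    envelope -= envelope.mean()                 # drop the DC component
    env_spectrum = np.abs(np.fft.rfft(envelope))
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    return freqs[np.argmax(env_spectrum)]

# Synthetic check: a 300 Hz tone whose loudness fluctuates 50 times per
# second (scream-like by the study's 30-150 Hz criterion) versus a
# speech-like 4 Hz fluctuation.
sr = 8000
t = np.arange(0, 1.0, 1.0 / sr)
carrier = np.sin(2 * np.pi * 300 * t)
scream_like = (1.0 + 0.8 * np.sin(2 * np.pi * 50 * t)) * carrier
speech_like = (1.0 + 0.8 * np.sin(2 * np.pi * 4 * t)) * carrier
print(modulation_rate(scream_like, sr), modulation_rate(speech_like, sr))
```

Run on these synthetic signals, the estimator recovers modulation rates near 50 Hz and 4 Hz respectively, mirroring the scream-versus-speech contrast the study describes.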

Poeppel and his team had volunteers listen to different alarm sounds and found people responded to alarms with similar variations: The more the alarms varied at higher rates, the more terrifying they were judged to be.

That huge variation in scream roughness is a clue to how our brains process danger sounds, Poeppel says. Screaming serves not only to convey danger but also to induce fear in the listener and heighten awareness for both screamer and listener to respond to their environment.

TIME Food & Drink

Stop Everything: There’s a New Seaweed That Tastes Like Bacon and Is Better for You Than Kale

Getty Images Dulse seaweed: a new variety, when cooked, reportedly tastes like bacon

The world's most perfect food may have just arrived

Researchers from Oregon State University’s Hatfield Marine Science Center say they’ve created and patented a new type of seaweed that has the potential to be sold commercially as the next big superfood.

The reason? It tastes just like bacon, they claim.

The bizarre but tasty creation is actually a new strain of red marine algae called dulse that is packed full of minerals and protein and looks like red lettuce.

Dulse normally grows in the wild along the Pacific and Atlantic coastlines and is harvested, dried and sold as a cooking ingredient or nutritional supplement.

“Dulse is a superfood, with twice the nutritional value of kale,” said Chuck Toombs, a faculty member in OSU’s College of Business and a member of the team working to develop the product into a foodstuff. “And OSU had developed this variety that can be farmed, with the potential for a new industry for Oregon.”

The team began researching ways of farming the new strain of dulse to feed abalone, but they quickly realized its potential to do well in the human-food market.

“There hasn’t been a lot of interest in using it in a fresh form. But this stuff is pretty amazing,” said chief researcher Chris Langdon. “When you fry it, which I have done, it tastes like bacon, not seaweed. And it’s a pretty strong bacon flavor.”

They’ve received a grant from the Oregon Department of Agriculture to explore dulse as a “special crop” and are working with the university’s Food Innovation Center in Portland and several chefs to find out ways dulse could be used as a main ingredient.

Though there is currently no commercial operation that grows dulse for human consumption in the U.S., the team is confident the seaweed superfood could make it big. If it really does taste like bacon, that would be no surprise at all.


TIME Research

Why Planned Parenthood Provides Fetal Cells to Scientists

Mario Tama—Getty Images A sign hangs in the offices of the Planned Parenthood Federation of America December 7, 2001 in New York City.

Donating cells from aborted fetuses for research has been practiced for decades

A video showing a Planned Parenthood doctor discussing the transfer of fetal tissue from abortions to researchers put the organization on the defensive this week after its release by undercover anti-abortion activists. The activists claim to have caught the organization perpetrating a federal crime by selling human body parts.

While the casual discussion about abortion procedures depicted in the video may upset people on both sides of the issue, donating cells from aborted fetuses for research has been practiced for decades and affirmed by leading medical organizations. Research using fetal tissue has also led to medical advances, including in Parkinson’s disease research and the development of a polio vaccine.

Rush University neurosurgery professor Jeffrey Kordower said that he and his fellow researchers faced ethical questions when researching Parkinson’s disease, but there was “very little discussion” about the ethics of using aborted fetuses.

The American Medical Association (AMA), for instance, separates a woman’s decision to have an abortion from the decision to donate fetal tissue and argues that fetal tissue has the potential to save lives. “Fetal tissue has intrinsic properties…that make it attractive for transplantation research,” the AMA said in a statement in 1991.

The 1991 statement also lays out a set of rules and procedures researchers and donors must follow to remain ethically sound. Among them is a rule prohibiting the sale of fetuses. The activists point to a moment in the video when Planned Parenthood Senior Director Deborah Nucatola says that receiving a fetus may cost $30 to $100 per specimen, alleging that the organization violated AMA guidelines and federal law. But, as Planned Parenthood explained in a statement, its policy allows recipients of fetal donations to reimburse the organization for costs associated with the donations.

“There is no financial benefit for tissue donation for either the patient or for Planned Parenthood,” said spokesperson Eric Ferrero in a statement. “In some instances, actual costs, such as the cost to transport tissue to leading research centers, are reimbursed, which is standard across the medical field.”

TIME Research

Hair Burning Is Now a Thing

This is This Is Now A Thing, where we check out the science behind new health phenomena.


The thing: Candle cutting, known by its Brazilian name velaterapia, is a $150 to $200 hair treatment that involves running a candle flame along twisted strands of hair to singe off stray and unhealthy ends.

Popular for decades in South America, velaterapia possibly originated in ancient civilizations among pioneering beauties such as Cleopatra, who supposedly had her locks singed regularly to get that thick, glossy, waterfall look for which she was known. Stylists twist modern-day tresses into dozens of strands, dreadlock-style, and then run a lit candle along each one, holding the flame just long enough for the stray ends to briefly catch fire and burn off.

Ricardo Gomes, a stylist at New York’s Maria Bonita salon, performs the service. He says he first heard about the technique when his Brazilian friends swore by its results. Five years ago, he went back to his native Brazil and spent a day watching a stylist do it. “It took me a little time to learn it,” he says. “I tried it on a few friends and got the hang of it.” His clients now include models—who regularly damage their hair with styling chemicals and constant drying and heating, but who don’t want to lose any length off their locks—as well as women who get regular bleaching, relaxing or straightening treatments, which strip hair of its natural shine and leave ends damaged and straw-like.

Gomes says he has perfected an after-treatment, which involves a deep conditioning regimen that he won’t reveal, but that he says takes advantage of the hair’s “open cuticle.” “When you run the flame through the hair, it’s such a shock treatment that you need something really strong and powerful to close that cuticle back, and start the growing process to become a lot stronger than what it was,” he says.

Post-conditioning, he goes over the hair again with a pair of scissors, snipping off the singed ends and any rogue flyaways for the smooth, finished look that his clients desire.

The hype: Proponents claim the heat from the candle opens the hair shaft to make it more receptive to conditioning afterward, and burning the ends seals off the annoying split ones. Just as cauterizing a wound stops bleeding, they claim that lighting up hair makes it smoother-looking. Models Alessandra Ambrosio and Barbara Fialho recently posted photos of their precious locks going up in smoke.

The research: But does it really work? Unsurprisingly, there isn’t much research on the practice, but dermatologists specializing in hair care aren’t convinced it’s the best way to smooth out your tresses. “The best way to treat split ends is to get regular hair cuts,” says Dr. Melissa Piliang from the Cleveland Clinic. “Even small trims, called dusting, every six to eight weeks can make hair grow longer, stay healthier and fuller. It’s a much better option than putting fire near your hair, which is flammable, and seems dangerous.”

And exposing hair to more heat, she says, isn’t a good idea. Hot curling irons, straightening irons and high-heat blow-drying aren’t recommended, so the heat from an open flame, however brief, can’t be beneficial either. Instead, she recommends gentle hair care to keep hair healthy, like using a deep conditioner a few times a week and avoiding treatments that strip hair and leave it damaged, such as bleaching and straightening.

Piliang also notes that healthy, shiny hair starts from under the skin, so factors like a healthy diet and getting enough sleep can also help. Nutrients like zinc, vitamin D, and iron are critical for making hair grow, and omega-3 fatty acids, found in fish like salmon, contribute to healthy, shiny locks. But it takes a few months of taking these supplements or eating well and sleeping enough to see the results, she says.

The bottom line: Searing off damaged ends may seem like a quick and satisfying way to subdue flyaway hair. But beyond the questionable benefit, you’ll have to sit through what many of Gomes’s clients say they fear is the worst part: several three-hour sessions of smelling your own hair on fire.

TIME Sex/Relationships

Being Multiracial May Give You An Advantage In Online Dating


But that's hardly the whole story, recent research suggests

In the otherwise newfangled world of online dating, an old secret remains: All is not fair in love.

This ugly truth was revealed in the book Dataclysm by OkCupid co-founder Christian Rudder, released last year, which used data collected from OkCupid users. It found that while we’d like to claim we have advanced as a society beyond judging people by the color of their skin, our habits show otherwise. Regardless of gender, according to the book, whites are most preferred, while blacks are least preferred. Asians and Hispanics fall somewhere in between. Toss gender into the equation, and the facts get even more uncomfortable: Asian men, black women, and black and Latino men are considered the least desirable in the dating market, while Asian and Latina women are seen as the most desirable—perhaps because of fetishization, Rudder suggested.

But Rudder’s theory does not include a key, growing part of the American population: individuals who identify as multiracial. In a country where the number of people who identify as multiracial has grown substantially and 93% of multiracial people identify as white and black, what does dating data show about them?

A forthcoming study from the Council on Contemporary Families, to be published in August by the American Sociological Review, looks at this very question. Researchers analyzed data collected between 2003 and 2010 from a major online dating website and combed through 6.7 million messages exchanged between heterosexual men and women. The researchers were looking for how often Asian-white, black-white, and Hispanic-white multiracial people received responses to messages, compared to people of one race.

The three groups were the most common multiracial identifications on the site. Reciprocation, measured by response messages, was key to figuring out where multiracial people fell in perceived attractiveness because responses are more “honest,” explains Celeste Curington of the University of Massachusetts Amherst, one of the study’s authors.

“We look at response rate versus attractive rate because of social desirability bias,” she says, noting that being multiracial often carries an added unspoken benefit of being “exotic.” “People will be less likely to claim what they will view. The response rates are more accurate [as a measurement] since we can actually see what they do.”

At first glance, there seems to be a remarkable advantage to being multiracial on the online dating scene.

“The most surprising finding from our study is that some white-minority multiracial daters are, in fact, preferred over white daters,” the authors write in a press release. The authors found that three specific combinations were heavily favored in online dating, a phenomenon they call the “dividend effect”: Asian-white women, Asian-white men, and Hispanic-white men.

But beneath the surface finding that being of mixed race is advantageous lies a more complicated, race-tinged story, write the authors, who note that the study’s results do not suggest a totally even playing field.

“White men and women are still less likely to respond to an individual who identifies as part black and part white than they are to a fellow white,” the authors write. And when they do respond, skin color still plays a role. “In some cases they [the preferences for the three multiracial groups] seem to be closely linked to a continuing partiality for lightness or whiteness,” the study notes.

But being lighter skinned is not the whole story. Virginia Rutter, professor of sociology at Framingham State University, and Stephanie Coontz, professor of history and family studies at Evergreen State College, reviewed the results. The two warn against the takeaway that multiracial people are considered more attractive along skin color lines—a far too simple conclusion, they say.

It’s not as simple as societal preference for lighter-skinned people, and future consequences have yet to be measured, according to Rutter, who says that it helps to consider the results through “the arc of time.” Only 48 years ago, the ban on marrying a person of a different race was lifted nationally, and Rutter thinks societal acceptance of mixed race couples might indicate more acceptance—or, very possibly, less. Curington, one of the study’s authors, points to the multicultural movement of the 1990s that popularized identification of a person beyond being black, white, Asian or Hispanic as a key factor, too. “After those changes came about, there was an increased representation of mixed people in general,” Curington says.

“As these changes lead to a growing multiracial population, is it possible that the multiracial dividend will be extended, or at least begin to counter some of the racial penalties that have existed in the dating and marriage market?” ask Rutter and Coontz in their review. “Or will individuals perceived as mono-racial blacks fall even further behind?”

What further complicates these findings is the exoticizing of multiracial people. Pop culture tends to mark “the ethnically ambiguous” as attractive for their enigma and lack of clear origin, Curington says. “If you look at cultural representations of multiracial people, going back to the early 1900s, they are often portrayed as exotic and sexually wanton,” she says.

But being multiracial might also act as a marker of progressiveness, particularly for Asian-American women. As Asian-American generations ground themselves in American culture and seek mates who can transcend their cultural tradition while also understanding their American upbringing, Asian-American women might prefer multiracial men for two reasons. First, a dual upbringing signals to Asian-American women that a potential date can move between both cultures. Second, multiracial men offer a “middle ground” of sorts for Asian parents: not quite white, and therefore more acceptable to older generations seeking to keep Asian culture intact in their offspring’s mating choice, but not quite Asian either, with the “exotic” factor possibly coming into play as well.

TIME Research

Here’s What Happens When You Name Your Newborn Babygirl

A study finds that a new way to handle unnamed babies resulted in 36% fewer errors

When I was born, my parents couldn’t agree on how to spell my name. My dad wanted T-A-N-I-A; my mom insisted on using a “y.” (Check my byline to see which parent eventually won.)

Because my parents didn’t have a name ready for me at my birth, the hospital wrote “Babygirl Basu” on my birth certificate—and I wasn’t alone. Of the roughly 4 million children born in the U.S. in 2012, many were assigned the generic “Babygirl” or “Babyboy” because their parents didn’t immediately choose a name. Sometimes parents just aren’t ready to name their children; other times, a newborn needs immediate intensive care and there’s no time to enter a name. Still other times, hospitals have a policy of labeling newborns by sex and time of birth as an identifier.

A study in the journal Pediatrics suggests that these placeholder names can result in more medical errors, particularly for children born with conditions requiring intensive care.


The researchers—led by Dr. Jason Adelman, patient safety officer at Montefiore Health System in New York and the recent father of a son unnamed at birth, who was christened Babyboy Adelman—wanted to test a solution. Between July 1, 2013, and June 30, 2014, they implemented a simple system of assigning temporary names to nameless newborns at Montefiore Medical Center by combining the mother’s first name with the baby’s sex. For example, if a newborn girl’s mother was named Jane, the newborn would be assigned the name Janegirl.
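The convention described is simple enough to sketch in a few lines. This is only an illustration of the naming rule from the article, not code from the study; the function name and the F/M sex codes are hypothetical:

```python
def placeholder_name(mother_first_name: str, sex: str) -> str:
    """Build a temporary newborn name under the convention described in
    the study: the mother's first name plus the baby's sex.
    """
    suffix = {"F": "girl", "M": "boy"}[sex.upper()]
    return mother_first_name.capitalize() + suffix

print(placeholder_name("Jane", "F"))   # Janegirl, the article's example
print(placeholder_name("maria", "m"))  # Mariaboy
```

Unlike a generic "Babygirl," the mother-derived name stays distinct across unrelated newborns who share a sex and time of birth, which is the property the researchers were after.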

A year after adopting the new system, researchers saw a 36.3% reduction in the number of times a newborn received an erroneous patient order, compared with the hospital’s standard practice in 2013.

Not all hospitals follow a standard procedure for naming newborns; some use maternal last names, while others assign a number based on time and order of birth. Others simply use the alphabet. In a survey of 339 hospitals, more than 80% had fuzzy policies on naming conventions, and many of them resort to simply writing Babyboy or Babygirl on the newborns’ birth certificates. The problems are magnified for multiple births, Adelman says: the babies not only share the same last name but often the same gender and time of birth, along with an elevated risk of complications that may land them in a NICU. Add a long or hyphenated name into the mix—which often gets truncated on hospital paperwork—and the chance of errors increases exponentially, he says.

Naming the baby afterwards doesn’t solve the problem, Adelman points out, since hospital records register names once, and going back to change them increases the chance of errors and is expensive. That’s why the effects of this naming system are so significant, he says.

The study was inspired by previous research on adults at Montefiore, where researchers looked at certain types of medical errors—those caused when a doctor placed an order, like a medication, to be delivered to a patient, only to have it returned within a short period of time because it was either the wrong order or the wrong patient. While the national number of reported hospital errors of this kind was around 10 per year, the actual number at Montefiore was around 20 per day. The researchers found these errors by measuring “near-miss errors,” showing that Montefiore wasn’t necessarily making more errors but was better at identifying them.

Adelman wondered if some of the discrepancy was because of errors made with babies, who wouldn’t be able to report the errors. “You only hear about errors when they’re devastating, near misses,” he says.

Adelman has made it his mission to convince health policymakers of the need to change and standardize hospital policies on child naming. “I think we should do away with these dangerous names,” he says. “We should encourage parents of multiples to have names ready.” And for parents who can’t decide on a name, “hospitals should have a standard policy to use some sort of naming device,” he says.

As for Babyboy Adelman, he has a name now. It’s Charlie.

TIME Research

Crash Course: 8 Ways to Discipline Your Kids


We surveyed the experts on what to do when they just won't behave, presented in rough order of escalation.

Whether you’re a parent or a babysitter, sometimes you need a go-to trick or two (or three) when it comes to discipline. All families develop their own approach to doling out punishment for bad behavior, but in case you need a little inspiration, we’ve rounded up expert opinions on the most effective strategies.

These are listed roughly in order of escalation. Remember, you’re playing the long game here. You need to immediately stop violent or dangerous behavior (experimenting with the stove or a sibling’s eyes), but for other infractions, bear in mind that you’re trying to create a new human with a sense of right and wrong, empathy and decency. So sometimes, it’s worth trying something a couple of times before moving on, especially since kids really respond to consistency.


Each of these methods has its upside and downside and, for the most part, we are not debating the merits, so much as suggesting the approaches a parent might like to try. Discipline is never less work for parents than it is for kids, so choose your battles wisely.

Let natural consequences play out: The American Academy of Pediatrics (AAP) is a fan of teaching children through natural consequences. For instance, if a child is tossing her crackers on the floor, don’t pick them up. At a certain point she will learn that throwing her food on the floor means she no longer gets to eat it. Throwing toys against the wall could mean that they break, and a child can no longer use them.

Try some logical consequences: When natural consequences are not doing the trick, stepping in to create a consequence of your own can work well. For instance, removing the toy being chucked at the wall and locking it up for the rest of the day. Try to be as consistent as possible when you choose consequences or when reacting to behavior that needs to change.

Guide the child to better behavior: Dr. Ben Siegel, the immediate past chair of the AAP Committee on Psychosocial Aspects of Child & Family Health, is an advocate for positive parenting, which includes guiding a child toward better behaviors. “Discipline means to teach,” he says. According to Siegel, kids do not cognitively understand or remember the rules of the house until age 2.5 or 3, and around that age, kids can be stubborn. Siegel recommends guiding children to appropriate behavior by giving them choices. For example, if a child doesn’t want to put on their jacket, a parent could say, “Fine, but you have to carry it.” Or insist that a child can have dessert only if they finish their dinner.

Other experts have created techniques around a similar idea, arguing that decisions should be made collaboratively with a child and that children should be empowered to suggest their own solutions to behavioral issues they are having. For instance, does the child have any ideas for what would make bath time less onerous?

Withhold a child’s privileges: You know the drill. When a child is acting up, they lose something they like. Experts recommend taking away privileges or cherished items immediately, and choosing something that’s not a necessity; depriving them of a meal would be a bad idea. Depending on the age of the child, canceling a playdate that wasn’t going to happen until the evening may allow too much time to pass for the message to stick.

Scold strategically: The AAP isn’t a big fan of yelling, but at a certain point, raising your voice may be necessary to get a child’s attention or simply to be heard over their tantrums. Experts suggest avoiding screaming things that are humiliating or physically threatening, because they don’t appear to be that effective and because kids are great mimics. When parents totally lose their cool, which can certainly happen, recognizing and talking about any mistakes or regrets in that interaction can be a learning experience for both child and parent.

Use non-negotiable arguments: When the inevitable “It’s not fair” argument arises, some experts suggest using firm responses along the lines of “No it’s not,” or simply, “I know.” It’s an easy trick for stopping a fight in its tracks. Parents can offer some sympathy by acknowledging they understand the child is upset, but that their decision is still final.

Enforce an effective time out: To pull off a successful time out, experts suggest sending a child to a pre-designated corner or to a chair. Avoid sending a child to their room, where there may be more distractions and toys. Some recommend assigning a minute for every year of the child’s age. What if they just refuse? You may need to sit with the child, or remain nearby to monitor them. Other experts even recommend a “time-in” rather than a “time-out,” which consists of sitting with the child to talk and reflect about their behavior.

What to do about spanking: The AAP says don’t do it, arguing it teaches aggression and is not very productive. Yet statistics suggest many parents do it anyway. A 2013 Harris poll, for instance, showed 8 in 10 people surveyed thought spanking was appropriate at least “sometimes,” and 86% reported being spanked themselves as children. For the lowdown on spanking, read TIME’s recent feature on the behavior.

TIME Innovation

How To Engineer Serendipity


The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

It isn’t happy accidents—it’s a state of mind

I’d like to tell the story of a paradox: How do we bring the right people to the right place at the right time to discover something new, when we don’t know who or where or when that is, let alone what it is we’re looking for? This is the paradox of innovation: If so many discoveries — from penicillin to plastics — are the product of serendipity, why do we insist breakthroughs can somehow be planned? Why not embrace serendipity instead? Because here’s an example of what happens when you don’t.

When GlaxoSmithKline finished clinical trials in May of what it had hoped would be a breakthrough in treating heart disease, it found the drug stank — literally. In theory, darapladib was a wonder of genomic medicine, suppressing an enzyme responsible for cholesterol-clogged arteries, thus preventing heart attacks and strokes. But in practice it was a failure, producing odors so pungent that disgusted patients stopped taking it.

Glaxo hadn’t quite bet the company on darapladib, but it did pay nearly $3 billion to buy its partner in developing the drug, Human Genome Sciences. The latter’s founder, William Haseltine, once promised a revolution in drug discovery: After we had mapped every disease to every gene, we could engineer serendipity out of the equation. Darapladib was to have been the proof — the product of scientists carefully picking their way through the company’s vast genetic databases. Instead it’s a multi-billion-dollar write-off.

Big Pharma is hardly alone when it comes to overstating its ability to innovate, although it may be in the worst shape. By one estimate, the rate of new drugs developed per dollar spent by the industry has fallen by roughly a factor of 100 over the last 60 years. Patent statistics tell a similar story across industry after industry, from chemistry to metalworking to clean energy, in which top-down innovation has only grown more expensive and less efficient over time. According to a paper by Deborah Strumsky, José Lobo, and Joseph Tainter, the average size of research teams ballooned by 48 percent between 1974 and 2005, while the number of patents per inventor fell 22 percent during that time. Instead of speeding up the pace of discovery, large hierarchical organizations are slowing down — a stagflationary principle known as “Eroom’s Law,” which is “Moore’s Law” spelled backwards. (Moore’s Law roughly states that computing power doubles every two years, a principle enshrined at the heart of technological progress.)
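To put that factor-of-100 estimate in perspective, a quick back-of-envelope calculation (my arithmetic, not from the cited paper) shows what a steady exponential decline of that size implies per year:

```python
import math

# Eroom's Law back-of-envelope: a 100-fold decline in drugs developed
# per dollar over 60 years, assumed to be a steady exponential trend.
factor, years = 100.0, 60.0

annual_decline = factor ** (1.0 / years) - 1.0         # ~0.08, i.e. ~8% per year
halving_time = years * math.log(2) / math.log(factor)  # ~9 years

print(f"{annual_decline:.1%} per year, halving every {halving_time:.1f} years")
```

In other words, if the trend were smooth, R&D productivity would be losing about 8 percent a year and halving roughly every nine years — an inverted image of Moore’s Law’s two-year doubling.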

While Big Pharma’s American scientists were flailing, their counterparts at Paris Jussieu — the largest medical research complex in France — were doing some of their best work. The difference was asbestos. Between 1997 and 2012, Jussieu’s campus in Paris’s Left Bank reshuffled its labs’ locations five times due to ongoing asbestos removal, giving the faculty no control and little warning of where they would end up. An MIT professor named Christian Catalini later catalogued the 55,000 scientific papers they published during this time and mapped the authors’ locations across more than a hundred labs. Instead of having their life’s work disrupted, Jussieu’s researchers were three to five times more likely to collaborate with their new odd-couple neighbors than their old colleagues, did so nearly four to six times more often, and produced better work because of it (as measured by citations).

The lesson? We still have no idea how to pursue what former U.S. Defense Secretary Donald Rumsfeld famously described as “unknown unknowns.” Even an institution like Paris Jussieu, which presumably places a premium on collaboration across disciplines, couldn’t do better than scattering its labs at random. It’s not enough to ask where good ideas come from — we need to rethink how we go about finding them.

I believe there’s a third way between the diminishing returns of typical organizations and sheer luck. In Silicon Valley, they call it “engineering serendipity,” and if that strikes you as an oxymoron (which it is), perhaps we need to step back and redefine what serendipity means:

  1. Serendipity isn’t magic. It isn’t happy accidents. It’s a state of mind and a property of social networks — which means it can be measured, analyzed, and engineered.
  2. It’s a bountiful source of good ideas. Study after study has shown how chance collaborations often trump top-down organizations when it comes to research and innovation. The challenge is first recognizing the circumstances of these encounters, then replicating and enhancing them.

Any society that values novelty and new ideas (like our innovation-obsessed one) will invariably trend toward greater serendipity over time. The push toward greater diversity, better public spaces, and an expanded public sphere all increase the potential for fortuitous discoveries.

The flip side is that institutions failing to embrace serendipity will ossify and die. This is especially true in our current era of incessant disruption, as seen in rising corporate mortality rates and a surge of unpredictable “black swan” events. (Nassim Taleb’s advice for taming black swans, by the way? “Maximize the serendipity around you.”)

Finally, the greatest opportunities for engineering serendipity lie in software, which means we must take great care as to who can find us and how, before Google (or the NSA) makes these choices for us.

It’s no coincidence Silicon Valley is obsessed with serendipity. Everyone is familiar by now with the origins of the Post-it Note, Velcro, corn flakes, and Nike’s waffle sole, to say nothing of Teflon, Kevlar, dynamite, and vast swaths of modern chemistry and medicine. The Valley’s contributions include microprocessors and inkjet printers, and Steve Jobs didn’t discover desktop computing or the mouse until a reluctant visit to Xerox PARC in 1979 — which begat the Macintosh and everything after.

When Yahoo banned its employees from working from home in 2013, the reasons the struggling company gave had less to do with productivity than serendipity. “Some of the best decisions and insights come from hallway and cafeteria discussions, meeting new people, and impromptu team meetings,” explained an accompanying memo. The message from new CEO Marissa Mayer was clear: Working solo couldn’t compete with lingering around the coffee machine waiting for inspiration — in the form of a colleague — to strike.

Google and Facebook have gone Yahoo one better. Rather than sit back and wait for serendipity to happen, the search giant has commissioned a new campus expressly designed, in the words of its real estate chief, to maximize “casual collisions of the work force.” Rooftop cafés will offer additional opportunities for close encounters, and no employees in the complex will be more than two and a half minutes away from one another. “You can’t schedule innovation,” said David Radcliffe, but you can make introductions — as both Googlers and Mayer know well. The latter attributes the genesis of such projects as Gmail, Google News, and Street View on her watch to engineers meeting fortuitously at lunch.

Meanwhile, Facebook has hired architect Frank Gehry to build “the perfect engineering space: one giant room that fits thousands of people, all close enough to collaborate together,” founder Mark Zuckerberg explained. The goal of each company is the same: to create the best conditions for spreading the most valuable kind of ideas — the hunches locked inside our skulls until a felicitous combination of circumstances sets them free.

Mayer’s demand for proximity ignited a debate that’s still raging: What’s the best way to work, together or alone? Finally breaking her silence on the matter in the spring of 2013, she conceded “people are more productive when they’re alone,” then added, “but they’re more collaborative and innovative when they’re together. Some of the best ideas come from pulling two different ideas together.”

She’s right. (Not that Yahoo has many ideas to show for it.) We experience moments of serendipity daily, each with potentially huge payoffs down the road. But because we can’t predict which ideas will collide and fuse, we cling to boring productivity and efficiency. We not only run our lives but our entire economy this way, using GDP and even grosser statistics to measure progress that has never unfolded in a straight line. Life is emergent and unknowable — we’re just terrified to manage it that way. And because we only attribute our success to serendipity after the fact (if at all), we typically consign it to anecdotes (e.g. Post-it Notes), turning to them only when the numbers don’t add up. The problem is that more and more of the most important numbers — including patent applications, R&D budgets, and even economic growth — have stopped adding up.

We take the pace of innovation for granted. We assume that like Moore’s Law, the rate of scientific discoveries and inventions is smoothly accelerating. But we’re wrong. A growing body of research suggests the opposite is true; Eroom’s Law rules. That this is happening in nearly every industry means something deeper is at work — that the corporation itself is reaching its limits when it comes to invention. Joseph Tainter, most famous for his book The Collapse of Complex Societies, believes companies have become too rigid and hierarchical to survive disruption, seeking only to discover what they already know — much like the long-dead societies he’s excavated. What’s missing is serendipity.

The same phenomenon that produced a gusher of new research papers at Paris Jussieu once produced the laser and transistor at Bell Labs and breakthroughs in linguistics and acoustics at MIT. It’s still happening in places like IBM’s Thomas J. Watson Research Center in Yorktown Heights, New York, where a chance meeting of a physicist and biologist in a hallway a few years ago led to a tiny microchip able to single-handedly sequence long strands of DNA. It’s no accident that the Watson Research Center produces more patents per year than any other building in the world, and IBM more than any other company.

In all of these cases, serendipity was responsible for the bridging of what the University of Chicago sociologist Ronald Burt calls “structural holes,” which appear when org charts and other formal structures create gaps in the informal network of experts floating through a company, campus, or city. In a landmark study a decade ago, Burt found that managers who straddled holes between teams and domains consistently produced better ideas than those who did not (and were rewarded accordingly). “This is not creativity born of genius,” he wrote. “It is creativity as an import-export business.” The easiest way to discover an idea, it would seem, is to borrow one.

Burt’s findings have been borne out again and again; in one study, a slight increase in serendipity generated more revenue and projects while speeding up their completion. (Contrary to Mayer’s mea culpa, it appears bumping into people makes you more productive, too.)

There’s a rich vein of research running through sociology, anthropology, network science, and management theory explaining how serendipity increases one’s “absorptive capacity,” i.e., our ability to recognize, assimilate, and put to use knowledge from outside our personal experience. Other studies demonstrate how successful firms harness serendipity to lower the costs and barriers to collaboration. And still others suggest that how we share “non-redundant information” across a social network is more important than the experience or credentials of any one person in the network itself, which explains how scattering scientists across a campus at random could vastly improve the quality of their work.

But perhaps the most interesting thing about all of these examples is that they were unintentional. Serendipity may not be luck after all — there is a hidden order to how we find new ideas and people — but we will never realize more than the tiniest fraction of its potential as long as we treat it that way. So how do we go further and actually plan for serendipity?

The first step takes place in our own minds. A few years ago, an Australian psychologist named James Lawley realized that no one had mapped the experience of serendipity before. Upon re-reading the letter in which the British aristocrat Horace Walpole coined the word in 1754, he noticed the fabled Three Princes of Serendip “were always making discoveries, by accidents and sagacity, of things they were not in quest of.” Today, all anyone remembers are the accidents. But equally important is sagacity, which the chemist Louis Pasteur famously called “the prepared mind.”

“What kind of mind is it?” Lawley asks. “One that thinks more systematically than simple cause-and-effect.” In other words, it’s a mind that’s open to the unexpected, to thinking in metaphors, to holding back and not jumping to conclusions, and to resisting walls between domains and disciplines. It’s a mind that looks a lot like Joi Ito’s.

Ito is a former DJ, venture capitalist, and entrepreneur who moved to Dubai on a whim to get a better feel for the place. (That’s when he wasn’t traveling 300 days a year.) “My job was running around mostly making connections,” is how he describes it. That was before he was picked to run the MIT Media Lab, despite never finishing college himself.

Headlining a panel at 2013’s South by Southwest titled “The New Serendipity,” Ito talked about the qualities he’s cultivated within himself — being “antidisciplinary” and retaining his “beginner’s mind” — which he hopes will guide the Media Lab. “We aim to capture serendipity,” he said. “You don’t get lucky if you plan everything — and you don’t get serendipity unless you have peripheral vision and creativity.”

That’s also true for the next step, which is engineering serendipity into organizations. For all the talk of “failing faster” and disruptive innovation, an overwhelming majority of companies are still structured along predictable lines. Even Google cancelled “20 percent time,” its celebrated policy of granting engineers one day a week for personal projects. To capture serendipity, the company is looking at space instead of time — hence the design of its new campus, in which everyone is just a short “casual collision” away.

But how can we do a better job of bringing people together than installing bigger cafeteria tables, adding another coffee machine, or locking all the bathrooms but one? A start would be to tear down the walls preventing colleagues in one department or company from bumping into peers from another. That’s what AT&T has done with its worldwide Foundry network, where selected startups and entrepreneurs work alongside its own engineers as well as those from partners such as Intel, Cisco, and Ericsson. One of these startups, Intucell, improved AT&T’s call retention and throughput speeds by 10 percent and was later bought by Cisco for $475 million. In general, Foundry teams have cut the development time of new products from three years to nine months.

It’s telling that the Foundry outpost in Silicon Valley is stationed in downtown Palo Alto, where the chances of someone dropping in on their walk back from lunch are substantially greater than in some exurban office park. Cities are the greatest serendipity engines of all. They began life at crossroads as places to exchange goods and later ideas with others you would never encounter on the farm.

Only recently have we come to recognize great ones for what they are — not as collections of skyscrapers (which China can build but can’t fill), but as the sum of their dense, rich, and overlapping networks of people. “They’re not a set of people, they’re not a set of roads; they’re a set of interactions,” says Luis Bettencourt, a physicist who describes cities as “social reactors.” Like the sun, they’re places where strangers collect, collide, and fuse — releasing tremendous heat and light in the process. What makes a city great, in other words, is how well its people are connected — to the city itself and to each other. And to make a city better, you have to engineer serendipity.

Which is what Tony Hsieh is trying to do in Las Vegas. Much has been written about the Zappos CEO’s flailing efforts to terraform downtown into a desert facsimile of Brooklyn or the Mission district, but his instincts are correct. He envisions every bar and coffee shop around the company’s downtown campus as an extension of its conference rooms, inviting strangers to work alongside his employees. He fervently believes blurring the line between the city and his company will make people in both smarter, happier, and more productive. “If you accelerate serendipity,” he says, “you’ll accelerate learning.”

To ensure that happens, he’s imported dozens of tech startups for his employees to learn from. Stipulated in their contracts is a promise to spend “1,000 hours per year of serendipitous encounters” downtown, searching for collisions and conversations. While it remains to be seen whether Hsieh can build a successful creative class company town, he’s right to believe the energies of the city are greater than any one company.

The final piece is the network. Google has made its ambitions clear — as far as chairman Eric Schmidt is concerned, the future of search is a “serendipity engine” answering questions you never thought to ask. “It’ll just know this is something that you’re going to want to see,” explained artificial intelligence pioneer Ray Kurzweil shortly after joining the company as its director of engineering.

One antidote to this all-encompassing filter bubble is an opposing serendipity engine proposed by MIT’s Ethan Zuckerman. In his book, Rewire, he sketches a set of recommendation and translation tools designed to nudge us out of our media comfort zones and “help us understand whose voices we’re hearing and whom we are ignoring.”

As Zuckerman points out, the greatest threats to serendipity are our ingrained biases and cognitive limits — we intrinsically want more known knowns, not unknown unknowns. This is the bias a startup named Ayasdi is striving to eliminate in Big Data. Rather than asking questions, its software renders its analysis as a network map, revealing hidden connections between tumors or terrorist cells, which CEO Gurjeet Singh calls “digital serendipity.”

IBM is trying something similar with Watson, tasking its fledgling artificial intelligence software with reading millions of scientific papers in hopes of finding leads no human researcher would ever have time to spot. Baylor College of Medicine used it this way to identify six new proteins for cancer research in a month; the entire scientific community typically finds one per year.

Baylor’s experiment — much like Paris Jussieu’s unintentional one — tells us something profound about the potential for new discoveries. Rather than compiling ever-bigger data sets or throwing more bodies at a problem, we need tools, organizations, and environments geared less toward efficiency — which is suffering from decreasing returns — and more toward what John Hagel III and John Seely Brown call “scalable learning,” in which serendipity is crucial.

So, what if we borrowed Ayasdi to power a social serendipity engine — one to identify who’s nearby, parse our hidden relationships, and make introductions? How would it work? We’d want it to be as easy as Tinder, which now owns half the mobile dating market. Next, we’d need context — why do I want to meet this person? Tinder works because its logic is binary: Swipe right or left. Everything else is harder.

That context exists somewhere in our data exhaust. For example, Relationship Science has mapped the connections between 3 million members of the 1 percent using publicly available information from more than 10,000 databases. Its customers use it to trace paths to their quarry via colleagues, corporate boards, and alma maters, with each link graded into strong, medium, and weak ties. Meanwhile, a startup named Rexter mines users’ email, calendars, and contacts to calculate the value of their connections and assign tasks accordingly. And, of course, there’s no shortage of sensors available — from smartphones to beacons to “sociometric badges.”

Now, take all of that and run it through Ayasdi’s digital serendipity engine. We could conceivably perform the equivalent of Baylor’s Watson experiment with the researchers of Paris Jussieu, plugging hundreds if not thousands of structural holes in months or even weeks, rather than fifteen years. What would we find then?

Usually, when I describe this vision, someone will reply, “But that isn’t serendipity!” I’m never quite sure what they mean — because it isn’t random or romantic? Serendipity is such a strange word; invented on a whim in 1754, it didn’t enter widespread circulation until almost two centuries later and is still notoriously difficult to translate. These days, it means practically whatever you want it to be.

So, I’m staking my own claim: Serendipity is the process through which we discover unknown unknowns. Understanding it as an emergent property of social networks, instead of sheer luck, enables us to treat it as a viable strategy for organizing people and sharing ideas, rather than writing it off as magic. And that, in turn, has potentially huge ramifications for everything from how we work to how we learn to where we live by leading to a shift away from efficiency — doing the same thing over and over, only a little bit better — toward novelty and discovery.

This essay was made possible with the generous support of the John S. and James L. Knight Foundation.

This article was originally published by The Aspen Institute on Medium

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Research

Gays and Lesbians Have Different Reasons to Get Married, Study Says

Getty Images

The big differences come down to kids

Same-sex marriage is now legal across the United States, but research on the reasons gays and lesbians get married is sparse. Now, in a recent study published in the journal Demography, a team of researchers looked at earnings and parenting patterns over time among married Swedish couples and found that registered partnership is important to both—but for different reasons.

The researchers followed Swedish couples who entered into registered partnerships between 1995—the year Sweden approved registered partnerships of same-sex couples—and 2007. (They also analyzed data from 1994 to get a glimpse of life before official partnership.) The 1,381 couples in the study—672 lesbian and 709 gay couples—were entering their first unions and were between the ages of 20 and 64. The authors analyzed demographic data—including each couple’s annual earnings, the difference between the partners’ earnings, and the number of children in each union—and compared the results to those of 267,264 heterosexual couples.

Sweden provides an intriguing opportunity to study how policy impacts same-sex marriages; though the country approved registered partnerships of same-sex couples in 1995, it wasn’t until a 2002 law that the country’s registered partners were allowed to jointly adopt children. (Swedish law dictates that married couples can only adopt jointly, thereby making it impossible for one partner to adopt without the other if the two partners are married.)

The authors found that gays and lesbians got married for very different reasons. Most gay couples entered their union without kids, and that number remained close to zero after marriage; “the main function of registered partnership for gays is resource pooling,” the authors write in the paper. “For lesbians, on the other hand, the right to joint or step-parent adoption allowed in 2002 raised fertility and possibly entry into partnership.”

In other words, gay couples were more likely to get married to combine incomes and resources; lesbians tended to use marriage as a stepping stone toward creating a family, a pattern underscored by a spike in lesbians registering for marriage in 2002, the year joint adoption was made legal.

The decision to have children is likely a large factor responsible for these differences, said Lena Edlund, an associate professor at Columbia University and one of the economists involved with the study. “I think the asymmetry results from a much greater difficulty male couples have in finding children that they can parent jointly,” she said in an e-mail. “It is also possible that male couples have a lower desire for joint children.”

For same-sex couples, adoption laws often lag behind marriage recognition laws—as they do in many states in the United States and did in Sweden. Having kids is especially expensive for gay men, who need to find an egg and a gestational carrier—a problem lesbian couples don’t have.

Perhaps most intriguing is the role education plays in determining mates. In heterosexual marriages, assortative mating—choosing a partner more like oneself—is often at play, with partners matched by education level, according to economist Gary Becker’s A Theory of Marriage. A person with a master’s degree would partner with someone with at least a master’s degree; the theory holds that this person would be unlikely to find common ground in parenting style and life philosophy with a person with a high school education.

What’s astonishing about the new research is that it showed that lesbian couples are often not as assortatively matched as heterosexual couples, or even gay men. For lesbians, an already thin marriage market means that education might not necessarily play a role in finding a mate so much as finding a partner who is equally interested—or not—in raising children, Edlund said.

The concept of specialization also seems to play a lesser role in lesbian marriages compared to straight marriages. In a typical heterosexual marriage, the combination of having children and unequal pay means that partners are more likely to specialize, the study notes; the partner who earns less will stay at home with the kids, for example, while the partner who earns more acts as the breadwinner. In the Swedish sample, a higher percentage of lesbian couples remained in the labor force together and, in some instances, saw their incomes nearly match after marriage.

The results of the study can only provide insight into the Swedish experience of same-sex parenting, which may differ from that in America, Edlund said. “American individuals and couples have greater access to fertility treatments and sperm banks,” she said. “There are also more American couples who can afford a surrogate mother.” Swedish couples, regardless of orientation, have access to healthcare and childcare options that American couples don’t necessarily have, which would probably play into labor market options for partners, the study notes. But what can be said for sure is that same-sex marriage, like any heterosexual marriage, has consequences far more complex than simply signing a piece of paper.

TIME Research

90% of Americans Eat Too Much Salt

Getty Images

A new report sheds light on Americans' sodium habits

Consuming too much sodium can be a risk factor for heart problems, and new federal data shows more than 90% of Americans eat too much.

The findings show that from 2011 to 2012, the average daily sodium intake among U.S. adults was 3,592 mg, which is well above the public health target set by the U.S. Department of Health and Human Services (HHS) of 2,300 mg. The data comes from the U.S. Centers for Disease Control and Prevention’s (CDC) 2013 survey of 180,000 American adults in 26 states, D.C. and Puerto Rico. The findings were published Thursday in the CDC’s Morbidity and Mortality Weekly Report (MMWR).

Some Americans, however, are taking action to cut back, the report shows. About half of the U.S. adults surveyed said they were monitoring or reducing their sodium intake, and 20% said they had received medical advice to do so. People with high blood pressure were more likely to report they were doing something about their sodium consumption, and overall, people in Southern states were more likely to report such action or advice from medical providers.

Public health experts argue that people without high blood pressure could also benefit from cutting back. “Among adults without hypertension, most did not report taking action to reduce sodium intake, and an even smaller proportion reported receiving professional advice to reduce sodium,” the study authors write. “These findings suggest an opportunity for promoting strategies to reduce sodium consumption among all adults, with and without hypertension.”

Sodium intake recommendations have been the focus of controversy, with some researchers arguing that current sodium levels are safe and that cutting back to very low recommended levels could be harmful. Others argue that high sodium consumption is related to serious health complications and contributes to millions of deaths every year. Some groups recommend limits that are even lower than the HHS target; for instance, the American Heart Association recommends less than 1,500 mg a day.

In the new CDC report, researchers say that a high sodium habit doesn’t come cheap; medical costs for cardiovascular disease are predicted to triple from $273 billion to $818 billion between 2010 and 2030, and cutting back on sodium intake by 1,200 mg a day could save $18 billion in costs each year, they say.
