TIME The Weekend Read

What Science Says About Race and Genetics

Illustration by Umberto Mischi for TIME

The New York Times' former science editor on research showing that evolution didn't stop when human history began.

A longstanding orthodoxy among social scientists holds that human races are a social construct and have no biological basis. A related assumption is that human evolution halted in the distant past, so long ago that evolutionary explanations need never be considered by historians or economists.

In the decade since the decoding of the human genome, a growing wealth of data has made clear that these two positions, never at all likely to begin with, are simply incorrect. There is indeed a biological basis for race. And it is now beyond doubt that human evolution is a continuous process that has proceeded vigorously within the last 30,000 years and almost certainly — though very recent evolution is hard to measure — throughout the historical period and up until the present day.

New analyses of the human genome have established that human evolution has been recent, copious, and regional. Biologists scanning the genome for evidence of natural selection have detected signals from many genes favored in the recent evolutionary past. No less than 14% of the human genome, according to one estimate, has changed under this recent evolutionary pressure.

Analysis of genomes from around the world establishes that there is a biological basis for race, despite the official statements to the contrary of leading social science organizations. An illustration of the point: in mixed-race populations such as African Americans, geneticists can now track along an individual’s genome and assign each segment to an African or European ancestor, an exercise that would be impossible if race had no basis in biological reality.

Racism and discrimination are wrong as a matter of principle, not of science. That said, it is hard to see anything in the new understanding of race that gives ammunition to racists. The reverse is the case. Exploration of the genome has shown that all humans, whatever their race, share the same set of genes. Each gene exists in a variety of alternative forms known as alleles, so one might suppose that races have distinguishing alleles, but even this is not the case. A few alleles have highly skewed distributions, but these do not suffice to explain the difference between races. The difference between races seems to rest on the subtle matter of relative allele frequencies. The overwhelming verdict of the genome is to declare the basic unity of humankind.

Genetics and Social Behavior

Human evolution has not only been recent and extensive, it has also been regional. The period of 30,000 to 5,000 years ago, from which signals of recent natural selection can be detected, occurred after the splitting of the three major races, and so represents selection that has occurred largely independently within each race. The three principal races are Africans (those who live south of the Sahara), East Asians (Chinese, Japanese, and Koreans), and Caucasians (Europeans and the peoples of the Near East and the Indian subcontinent). In each of these races, a different set of genes has been changed by natural selection. This is just what would be expected for populations that had to adapt to different challenges on each continent. The genes specially affected by natural selection control not only expected traits like skin color and nutritional metabolism, but also some aspects of brain function. Though the role of these selected brain genes is not yet understood, the obvious truth is that genes affecting the brain are just as much subject to natural selection as any other category of gene.

What might be the role of these brain genes favored by natural selection? Edward O. Wilson was pilloried for saying in his 1975 book Sociobiology that humans have many social instincts. But subsequent research has confirmed the idea that we are inherently sociable. From our earliest years we want to belong to a group, conform to its rules and punish those who violate them. Later, our instincts prompt us to make moral judgments and to defend our group, even at the cost of our own lives.

Anything that has a genetic basis, such as these social instincts, can be varied by natural selection. The power of modifying social instincts is most visible in the case of ants, the organisms that, along with humans, occupy the two pinnacles of social behavior. Sociality is rare in nature because to make a society work individuals must moderate their powerful selfish instincts and become at least partly altruistic. But once a social species has come into being, it can rapidly exploit and occupy new niches just by making minor adjustments in social behavior. Thus both ants and humans have conquered the world, though fortunately at different scales.

Conventionally, the differences between human societies are attributed solely to culture. But if that’s so, why is it apparently so hard for tribal societies like those of Iraq or Afghanistan to change their culture and operate like modern states? The explanation could be that tribal behavior has a genetic basis. It’s already known that a genetic system, based on the hormone oxytocin, seems to modulate the degree of in-group trust, and this is one way that natural selection could ratchet the degree of tribal behavior up or down.

Human social structures change so slowly and with such difficulty as to suggest an evolutionary influence at work. Modern humans lived for 185,000 years as hunters and gatherers before settling down in fixed communities. Putting a roof over one’s head and being able to own more than one could carry might seem an obvious move. The fact that it took so long suggests that a genetic change in human social behavior was required and took many generations to evolve.

Tribalism seems to be the default mode of human political organization. It can be highly effective: The world’s largest land empire, that of the Mongols, was a tribal organization. But tribalism is hard to abandon, again suggesting that an evolutionary change may be required.

The various races have evolved along substantially parallel paths, but because they have done so independently, it’s not surprising that they have made these two pivotal transitions in social structure at somewhat different times. Caucasians were the first to establish settled communities, some 15,000 years ago, followed by East Asians and Africans. China, which developed the first modern state, shed tribalism two millennia ago, Europe did so only a thousand years ago, and populations in the Middle East and Africa are in the throes of the process.

Two case studies, one from the Industrial Revolution and the other from the cognitive achievements of Jews, provide further evidence of evolution’s hand in shaping human social behavior within the recent past.

The Behavioral Makeover Behind the Industrial Revolution

The essence of the Industrial Revolution was a quantum leap in society’s productivity. Until then, almost everyone but the nobility lived a notch or two above starvation. This subsistence-level existence was a characteristic of agrarian economies, probably from the time that agriculture was first invented.

The reason for the economic stagnation was not lack of inventiveness: England of 1700 possessed sailing ships, firearms, printing presses, and whole suites of technologies undreamed of by hunter-gatherers. But these technologies did not translate into better living standards for the average person. The reason was a Catch-22 of agrarian economies, called the Malthusian trap, after the Rev. Thomas Malthus. In his 1798 Essay on the Principle of Population, Malthus observed that each time productivity improved and food became more plentiful, more infants survived to maturity, and the extra mouths ate up the surplus. Within a generation, everyone was back to living just above starvation level.

Malthus, strangely enough, wrote his essay at the very moment when England, shortly followed by other European countries, was about to escape from the Malthusian trap. The escape consisted of such a substantial increase in production efficiency that extra workers enhanced incomes instead of constraining them.

This development, known as the Industrial Revolution, is the salient event in economic history, yet economic historians say they have reached no agreement on how to account for it. “Much of modern social science originated in efforts by late nineteenth and twentieth century Europeans to understand what made the economic development path of western Europe unique; yet these efforts have yielded no consensus,” writes the historian Kenneth Pomeranz. Some experts argue that demography was the real driver: Europeans escaped the Malthusian trap by restraining fertility through methods such as late marriage. Others cite institutional changes, such as the beginnings of modern English democracy, secure property rights, the development of competitive markets, or patents that stimulated invention. Yet others point to the growth of knowledge starting from the Enlightenment of the 17th and 18th centuries, or to the easy availability of capital.

This plethora of explanations and the fact that none of them is satisfying to all experts point strongly to the need for an entirely new category of explanation. The economic historian Gregory Clark has provided one by daring to look at a plausible yet unexamined possibility: that productivity increased because the nature of the people had changed.

Clark’s proposal is a challenge to conventional thinking because economists tend to treat people everywhere as identical, interchangeable units. A few economists have recognized the implausibility of this position and have begun to ask if the nature of the humble human units that produce and consume all of an economy’s goods and services might possibly have some bearing on its performance. They have discussed human quality, but by this they usually mean just education and training. Others have suggested that culture might explain why some economies perform very differently from others, but without specifying what aspects of culture they have in mind. None has dared say that culture might include an evolutionary change in behavior — but neither do they explicitly exclude this possibility.

To appreciate the background of Clark’s idea, one has to return to Malthus. Malthus’s essay had a profound effect on Charles Darwin. It was from Malthus that Darwin derived the principle of natural selection, the central mechanism in his theory of evolution. If people were struggling on the edge of starvation, competing to survive, then the slightest advantage would be decisive, Darwin realized, and the owner would bequeath that advantage to his children. These children and their offspring would thrive while others perished.

“In October 1838, that is, fifteen months after I had begun my systematic inquiry,” Darwin wrote in his autobiography, “I happened to read for amusement Malthus on Population, and being well prepared to appreciate the struggle for existence which everywhere goes on from long-continued observation of the habits of animals and plants, it at once struck me that under these circumstances favorable variations would tend to be preserved, and unfavorable ones to be destroyed. The results of this would be the formation of a new species. Here then I had at last got a theory by which to work.”

Given the correctness of Darwin’s theory, there is no reason to doubt that natural selection was working on the very English population that provided the evidence for it. The question is just what traits were being selected for.

The Four Key Traits

Clark has documented four behaviors that steadily changed in the English population between 1200 and 1800, as well as a highly plausible mechanism of change. The four behaviors are those of interpersonal violence, literacy, the propensity to save, and the propensity to work.

Homicide rates for males, for instance, declined from 0.3 per thousand in 1200 to 0.1 in 1600 and to about a tenth of this in 1800. Even from the beginning of this period, the level of personal violence was well below that of modern hunter-gatherer societies. Rates of 15 murders per thousand men have been recorded for the Aché people of Paraguay.

Work hours steadily increased throughout the period, and interest rates fell. When inflation and risk are subtracted, an interest rate reflects the compensation a person demands to postpone consumption of a good from now until a future date. Economists call this attitude time preference, and psychologists call it delayed gratification. In his celebrated marshmallow test, the psychologist Walter Mischel offered young children the choice between one marshmallow now or two in fifteen minutes. This simple decision turned out to have far-reaching consequences: Those able to hold out for the larger reward had higher SAT scores and greater social competence in later life. Children have a very high time preference, which falls as they grow older and develop more self-control. American six-year-olds, for instance, have a time preference of about 3% per day, or roughly 150% per month; this is the extra reward they must be offered to delay instant gratification. Time preferences are also high among hunter-gatherers.

Interest rates, which reflect a society’s time preferences, have been very high — about 10% — from the earliest historical times and for all societies before 1400 AD for which there is data. Interest rates then entered a period of steady decline, reaching about 3% by 1850. Because inflation and other pressures on interest rates were largely absent, Clark argues, the falling interest rates indicate that people were becoming less impulsive, more patient, and more willing to save.

These behavioral changes in the English population between 1200 and 1800 were of pivotal economic importance. They gradually transformed a violent and undisciplined peasant population into an efficient and productive workforce. Turning up punctually for work every day and enduring eight hours or more of repetitive labor is far from being a natural human behavior. Hunter-gatherers do not willingly embrace such occupations, but agrarian societies from their beginning demanded the discipline to labor in the fields and to plant and harvest at the correct times. Disciplined behaviors were probably evolving gradually within the agrarian English population for many centuries before 1200, the point at which they can be documented.

Clark has uncovered a genetic mechanism through which the Malthusian economy may have wrought these changes on the English population: The rich had more surviving children than did the poor. From a study of wills made between 1585 and 1638, he finds that will makers with £9 or less to leave their heirs had, on average, just under two children. The number of heirs rose steadily with assets, such that men with more than £1,000 in their gift, who formed the wealthiest asset class, left just over four children.

The English population was fairly stable in size from 1200 to 1760, meaning that if the rich were having more children than the poor, most children of the rich had to sink in the social scale, given that there were too many of them to remain in the upper class.

Their social descent had the far-reaching genetic consequence that they carried with them inheritance for the same behaviors that had made their parents rich. The values of the upper middle class — nonviolence, literacy, thrift, and patience — were thus infused into lower economic classes and throughout society. Generation after generation, they gradually became the values of the society as a whole. This explains the steady decrease in violence and increase in literacy that Clark has documented for the English population. Moreover, the behaviors emerged gradually over several centuries, a time course more typical of an evolutionary change than a cultural change.

In a broader sense, these changes in behavior were just some of many that occurred as the English population adapted to a market economy. Markets required prices and symbols and rewarded literacy, numeracy, and those who could think in symbolic ways. “The characteristics of the population were changing through Darwinian selection,” Clark writes. “England found itself in the vanguard because of its long, peaceful history stretching back to at least 1200 and probably long before. Middle-class culture spread throughout the society through biological mechanisms.”

Economic historians tend to see the Industrial Revolution as a relatively sudden event and their task as being to uncover the historical conditions that precipitated this immense transformation of economic life. But profound events are likely to have profound causes. The Industrial Revolution was caused not by events of the previous century but by changes in human economic behavior that had been slowly evolving in agrarian societies for the previous 10,000 years.

This of course explains why the practices of the Industrial Revolution were adopted so easily by other European countries, the United States, and East Asia, all of whose populations had been living in agrarian economies and evolving for thousands of years under the same harsh constraints of the Malthusian regime. No single resource or institutional change — the usual suspects in most theories of the Industrial Revolution — is likely to have become effective in all these countries around 1760, and indeed none did.

That leaves the questions of why the Industrial Revolution was perceived as sudden and why it emerged first in England instead of in any of the many other countries where conditions were ripe. Clark’s answer to both these questions lies in the sudden growth spurt in the English population, which tripled between 1770 and 1860. It was this alarming expansion that led Malthus to write his foreboding essay on population.

But contrary to Malthus’s gloomy prediction of a population crash induced by vice and famine, which would have been true at any earlier stage of history, incomes on this occasion rose, heralding the first escape of an economy from the Malthusian trap. English workmen contributed to this spurt, Clark dryly notes, as much by their labors in the bedroom as on the factory floor.

Clark’s data provide substantial evidence that the English population responded genetically to the harsh stresses of a Malthusian regime and that the shifts in its social behavior from 1200 to 1800 were shaped by natural selection. The burden of proof is surely shifted to those who might wish to assert that the English population was miraculously exempt from the very forces of natural selection whose existence it had suggested to Darwin.

Explaining Ashkenazi IQ

A second instance of very recent human evolution may well be in evidence in European Jews, particularly the Ashkenazim of northern and central Europe. In proportion to their population, Jews have made outsize contributions to Western civilization. A simple metric is that of Nobel prizes: Though Jews constitute only 0.2% of the world’s population, they won 14% of Nobel prizes in the first half of the 20th century, 29% in the second and so far 32% in the present century. There is something here that requires explanation. If Jewish success were purely cultural, such as hectoring mothers or a zeal for education, others should have been able to do as well by copying such cultural practices. It’s therefore reasonable to ask if genetic pressures in Jews’ special history may have enhanced their cognitive skills.

Just such a pressure is described by two economic historians, Maristella Botticini and Zvi Eckstein, in their book The Chosen Few. In 63 or 65 AD, the high priest Joshua ben Gamla decreed that every Jewish father should send his sons to school so that they could read and understand Jewish law. Jews at that time earned their living mostly by farming, as did everyone else, and education was both expensive and of little practical use. Many Jews abandoned Judaism for the new and less rigorous Jewish sect now known as Christianity.

Botticini and Eckstein say nothing about genetics but evidently, if generation after generation the Jews less able to acquire literacy became Christians, literacy and related abilities would on average be enhanced among those who remained Jews.

As commerce started to pick up in medieval Europe, Jews as a community turned out to be ideally suited to the role of Europe’s traders and money-lenders. In a world where most people were illiterate, Jews could read contracts, keep accounts, appraise collateral, and do business arithmetic. They formed a natural trading network through their co-religionists in other cities, and they had rabbinical courts to settle disputes. Jews moved into money-lending not because they were forced to do so, as some accounts suggest, but because they chose the profession, Botticini and Eckstein say. It was risky but highly profitable. The more able Jews thrived and, just as in the rest of the pre-19th century world, the richer were able to support more surviving children.

As Jews adapted to a cognitively demanding niche, their abilities increased to the point that the average IQ of Ashkenazi Jews is, at 110 to 115, the highest of any known ethnic group. The population geneticists Henry Harpending and Gregory Cochran have calculated that, assuming a high heritability of intelligence, Ashkenazi IQ could have risen by 15 points in just 500 years. Ashkenazi Jews first appear in Europe around 900 AD, and Jewish cognitive skills may have been increasing well before then.

The emergence of high cognitive ability among the Ashkenazim, if genetically based, is of interest both in itself and as an instance of natural selection shaping a population within the very recent past.

The Adaptive Response to Different Societies

The hand of evolution seems visible in the major transitions in human social structure and in the two case studies described above. This is of course a hypothesis; proof awaits detection of the genes in question. If significant evolutionary changes can occur so recently in history, other major historical events may have evolutionary components. One candidate is the rise of the West, which was prompted by a remarkable expansion of European societies, both in knowledge and geographical sway, while the two other major powers of the medieval world, China and the house of Islam, ascendant until around 1500 AD, were rapidly overtaken.

In his book The Wealth and Poverty of Nations, the economic historian David Landes examines every possible factor for explaining the rise of the West and the stagnation of China and concludes, in essence, that the answer lies in the nature of the people. Landes attributes the decisive factor to culture, but describes culture in such a way as to imply race.

“If we learn anything from the history of economic development, it is that culture makes all the difference,” he writes. “Witness the enterprise of expatriate minorities — the Chinese in East and Southeast Asia, Indians in East Africa, Lebanese in West Africa, Jews and Calvinists throughout much of Europe, and on and on. Yet culture, in the sense of the inner values and attitudes that guide a population, frightens scholars. It has a sulfuric odor of race and inheritance, an air of immutability.”

Sulfuric odor or not, the culture of each race is what Landes suggests has made the difference in economic development. The data gathered by Clark on declining rates of violence and increasing rates of literacy from 1200 to 1800 provide some evidence for a genetic component to culture and social institutions.

Though equivalent data does not exist for the Chinese population, China’s society has been distinctive for at least 2,000 years and intense pressures on survival would have adapted the Chinese to their society just as Europeans became adapted to theirs.

Do Chinese carry genes for conformism and authoritarian rule? May Europeans have alleles that favor open societies and the rule of law? Obviously this is unlikely to be the case. But there is almost certainly a genetic component to the propensity for following society’s rules and punishing those who violate them. If Europeans were slightly less inclined to punish violators and Chinese slightly more so, that could explain why European societies are more tolerant of dissenters and innovators, and Chinese societies less so. Because the genes that govern rule following and punishment of violators have not yet been identified, it is not yet known if these do in fact vary in European and Chinese populations in the way suggested. Nature has many dials to twist in setting the intensities of the various human social behaviors and many different ways of arriving at the same solution.

For most of recorded history, Chinese civilization has been pre-eminent and it’s reasonable to assume that the excellence of Chinese institutions rests on a mix of culture and inherited social behavior.

The rise of the West, too, is unlikely to have been just some cultural accident. As European populations became adapted to the geographic and military conditions of their particular ecological habitat, they produced societies that have turned out to be more innovative and productive than others, at least under present circumstances.

That does not of course mean that Europeans are superior to others — a meaningless term in any case from the evolutionary perspective — any more than Chinese were superior to others during their heyday. China’s more authoritarian society may once again prove more successful, particularly in the wake of some severe environmental stress.

Civilizations may rise and fall but evolution never ceases, which is why genetics may play some role alongside the mighty force of culture in shaping the nature of human societies. History and evolution are not separate processes, with human evolution grinding to a halt some decent interval before history begins. The more that we are able to peer into the human genome, the more it seems that the two processes are delicately intertwined.

Nicholas Wade is a former science editor at The New York Times. This piece is adapted from his new book, A Troublesome Inheritance, published by the Penguin Press.

TIME The Weekend Read

Science Gave My Son the Gift of Sound

Alex, March 2006. Courtesy of Lydia Denworth

Cochlear implants have been controversial in Deaf culture — how would one change my son?

On a cold January night, I was making dinner while my three boys played in and around the kitchen. I heard my husband Mark’s key in the lock. Jake and Matthew, my two older sons, tore down the long, narrow hall toward the door. “Daddy! Daddy! Daddy!” they cried and flung themselves at Mark before he was all the way inside.

I turned and looked at Alex, my baby, who was 20 months old. He was still sitting on the kitchen floor, his back to the door, fully engaged in rolling a toy truck into a tower of blocks. A raw, sharp ache hit my gut. Taking a deep breath, I bent down, tapped Alex on the shoulder and, when he looked up, pointed at the pandemonium down the hall. His gaze followed my finger. When he spotted Mark, he leapt up and raced into his arms.

We had been worried about Alex for months. The day after he was born, four weeks early, in April 2003, a nurse appeared at my hospital bedside. I remember her blue scrubs and her bun and that, when she came in, I was watching the news reports from Baghdad, where Iraqis were throwing shoes at a statue of Saddam Hussein and people thought we had already won the war. The nurse told me Alex had failed a routine hearing test.

“His ears are full of mucus because he was early,” the nurse explained. “That’s probably all it is.” A few weeks later, when I took Alex back to the audiologist as instructed, he passed a test designed to uncover anything worse than mild hearing loss. Relieved, I put hearing out of my mind.

It wasn’t until that January night in the kitchen that it became obvious Alex was unresponsive to sound. Within weeks, tests revealed a moderate to profound sensorineural hearing loss in both of Alex’s ears. That meant that the intricate and finely tuned cochleas in his ears weren’t conveying sound the way they should.

Nonetheless, he still had usable hearing. With hearing aids, there was every reason to think Alex could learn to speak and listen. We decided to make that our goal. He had a lot of catching up to do. He was nearly two and he could say only “Mama,” “Dada,” “hello,” and “up.”

A few months later we got a further unwelcome surprise: All of the hearing in Alex’s right ear was gone. He was now profoundly deaf in that ear. We had discovered in the intervening months that in addition to a congenital deformity of the inner ear called Mondini dysplasia, he had a progressive condition called Enlarged Vestibular Aqueduct (EVA). That meant a bang on the head or even a sudden change in pressure could cause further loss of hearing. It seemed likely to be only a matter of time before the left ear followed the right.

Suddenly Alex was a candidate for a cochlear implant. When we consulted a surgeon, he clipped several CT scan images of our son’s head up on the light board and tapped a file containing reports of Alex’s latest hearing tests and speech/language evaluations, which still put him very near the bottom compared to other children his age: He was in the sixth percentile for what he could understand and the eighth for what he could say.

“He is not getting what he needs from the hearing aids. His language is not developing the way we’d like,” the doctor said. Then he turned and looked directly at us. “We should implant him before he turns three.”

The Cochlear Countdown

A deadline? So there was now a countdown clock to spoken language ticking away in Alex’s head? What would happen when it reached zero? Alex’s third birthday was only a few months away.

As the doctor explained that the age of three marked a critical juncture in the development of language, I began to truly understand that we were not just talking about Alex’s ears. We were talking about his brain.

Approved for adults in 1984 and for children six years later, cochlear implants were the first devices to partially restore a missing sense. How could it be possible to hear without a functioning cochlea? The cochlea is the hub, the O’Hare Airport, of normal hearing, where sound arrives, changes form, and travels out again. When acoustic energy is naturally translated into electrical signals, it produces patterns of activity in the 30,000 fibers of the auditory nerve that the brain ultimately interprets as sound. The more complex the sound, the more complex the pattern of activity. Hearing aids depend on the cochlea. They amplify sound and carry it through the ear to the brain, but only if enough functioning hair cells in the cochlea can transmit the sound to the auditory nerve. Most people with profound deafness have lost that capability. The big idea behind a cochlear implant is to fly direct, to bypass a damaged cochlea and deliver sound — in the form of an electrical signal — to the auditory nerve itself.

A cochlear implant. Doug Finger—The Gainesville Sun

To do that is like bolting a makeshift cochlea to the head and somehow extending its reach deep inside. A device that replicates the work done by the inner ear and creates electrical hearing instead of acoustic hearing requires three basic elements: a microphone to collect sounds; a package of electronics to process those sounds into electrical signals (a “processor”); and an array of electrodes to conduct the signal to the auditory nerve. The processor has to encode the sound it receives into an electrical message the brain can understand; it has to send instructions. For a long time, no one knew what those instructions should say. They could, frankly, have been in Morse code — an idea some researchers considered, since dots and dashes would be straightforward to program and constituted a language people had proven they could learn. By comparison, capturing the nuance and complexity of spoken language in an artificial set of instructions was like leaping straight from the telegraph to the Internet era.

It was such a daunting task that most of the leading auditory neurophysiologists in the 1960s and 1970s, when the idea was first explored in the United States, were convinced cochlear implants would never work. It took decades of work by teams of determined (even stubborn) researchers in the United States, Australia and Europe to solve the considerable engineering problems involved as well as the thorniest challenge: designing a processing program that worked well enough to allow users to discriminate speech. When they finally succeeded on that front, the difference was plain from the start.

“There are only a few times in a career in science when you get goose bumps,” Michael Dorman, a cochlear implant researcher at Arizona State University, once wrote. That’s what happened to him when, as part of a clinical trial, his patient Max Kennedy tried out the new program, which alternated electrodes and sent signals at a relatively high rate. Kennedy was being run through the usual set of word and sentence recognition tests. “Max’s responses [kept] coming up correct,” remembered Dorman. “Near the end of the test, everyone in the room was staring at the monitor, wondering if Max was going to get 100 percent correct on a difficult test of consonant identification. He came close, and at the end of the test, Max sat back, slapped the table in front of him, and said loudly, ‘Hot damn, I want to take this one home with me.’”

A Cure or a Genocide?

So did I. The device sounded momentous and amazing to me — a common reaction for a hearing person. As Steve Parton, the father of one of the first children to receive an implant, once put it, the fact that technology had been invented that could help the deaf hear seemed “a miracle of biblical proportions.”

Many in Deaf culture didn’t agree. As I began to investigate what a cochlear implant would mean for Alex, I spent a lot of time searching the Internet and reading books and articles. I was disturbed by the depth of the divide I perceived in the deaf and hard of hearing community. There seemed to be a long history of disagreement over spoken versus visual language, and between those who saw deafness as a medical condition and those who saw it as an identity. The harshest words and the bitterest battles had come in the 1990s with the advent of the cochlear implant.

By the time I was thinking about this, in 2005, children had been receiving cochlear implants in the United States for 15 years. Though the worst of the enmity had died down, I felt as if I’d entered a city under ceasefire, where the inhabitants had put down their weapons but the unease was still palpable. A few years earlier, the National Association of the Deaf, for instance, had adjusted its official position on cochlear implants to very qualified support of the device as one choice among many. It wasn’t hard, however, to find the earlier version, in which they “deplored” the decision of hearing parents to implant their children. In other reports about the controversy, I found cochlear implantation of children described as “child abuse.”

No doubt those quotes had made it into the press coverage precisely because they were extreme and, therefore, attention-getting. But child abuse?! I just wanted to help my son. What charged waters were we wading into?

Cochlear implants arrived in the world just as the Deaf Civil Rights movement was flourishing. Like many minorities, the deaf had long found comfort in each other. They knew they had a “way of doing things” and that there was what they called a “deaf world.” Largely invisible to hearing people, it was a place where many average deaf people lived contented, fulfilling lives. No one had ever tried to name that world.

Beginning in the 1980s, however, deaf people, particularly in academia and the arts, “became more self-conscious, more deliberate, and more animated, in order to take their place on a larger, more public stage,” wrote Carol Padden and Tom Humphries, professors of communication at the University of California, San Diego, who are both deaf. They called that world Deaf culture in their influential 1988 book Deaf in America: Voices from a Culture. The capital “D” distinguished those who were culturally deaf from those who were audiologically deaf. “The traditional way of writing about Deaf people is to focus on the fact of their condition — that they do not hear — and to interpret all other aspects of their lives as consequences of this fact,” Padden and Humphries wrote. “Our goal … is to write about Deaf people in a new and different way. … Thinking about the linguistic richness uncovered in [work on sign language] has made us realize that the language has developed through the generations as part of an equally rich cultural heritage. It is this heritage — the culture of Deaf people — that we want to begin to portray.”

In this new way of thinking, deafness was not a disability but a difference. With new pride and confidence, and new respect for their own language, American Sign Language, the deaf community began to make itself heard. At Gallaudet University in 1988, students rose up to protest the appointment of a hearing president — and won. In 1990, the Americans with Disabilities Act ushered in new accommodations that made operating in the hearing world far easier. And technological revolutions like the spread of computers and the use of e-mail meant that a deaf person who once might have had to drive an hour to deliver a message to a friend in person (not knowing before setting out if the friend was even home), could now send that message in seconds from a keyboard.

In 1994, Greg Hlibok, one of the student leaders of the Gallaudet protests a few years earlier, declared in a speech: “From the time God made earth until today, this is probably the best time to be Deaf.”

Into the turbulence of nascent deaf civil rights dropped the cochlear implant.

A child with an early cochlear implant on Aug. 24, 1984. Glen Martin—Denver Post/Getty Images

The Food and Drug Administration’s 1990 decision to approve cochlear implants for children as young as two galvanized Deaf culture advocates. They saw the prostheses as just another in a long line of medical fixes for deafness. None of the previous ideas had worked, and it wasn’t hard to find doctors and scientists who maintained that this wouldn’t work either — at least not well. Beyond the complaint that the potential benefits of implants were dubious and unproven, the Deaf community objected to the very premise that deaf people needed to be fixed at all. “I was upset,” Ted Supalla, a linguist who studies ASL at Georgetown University Medical Center, told me. “I never saw myself as deficient ever. The medical community was not able to see that we could possibly see ourselves as perfectly fine and normal just living our lives. To go so far as to put something technical in our brains, at the beginning, was a serious affront.”

The Deaf view was that late-deafened adults could decide for themselves: they were old enough to understand their choice, had not grown up in Deaf culture, and already had spoken language. Young children who had been born deaf were different. The assumption was that cochlear implants would remove children from the Deaf world, thereby threatening the survival of that world. That led to complaints about “genocide” and the eradication of a minority group. The Deaf community felt ignored by the medical and scientific supporters of cochlear implants; many believed deaf children should have the opportunity to make the choice for themselves once they were old enough; still others felt the implant should be outlawed entirely. Tellingly, the ASL sign developed for “cochlear implant” was two fingers stabbed into the neck, vampire-style.

The medical community agreed that the stakes were different for children. “For kids, of course, what really counts is their language development,” says Richard Dowell, who today heads the University of Melbourne’s Department of Audiology and Speech Pathology but in the 1970s was part of an Australian team led by Graeme Clark that played a critical role in developing the modern-day cochlear implant. “You’re trying to give them good enough hearing to actually then use that to assist their language development as close to normal as possible. So the emphasis changes very, very much when you’re talking about kids.”

Implanted and Improving

By the time Alex was born, children were succeeding in developing language with cochlear implants in ever greater numbers. The devices didn’t work perfectly and they didn’t work for everyone, but the benefits could be profound. The access to sound afforded by cochlear implants could serve as a gateway to communication, to spoken language and then to literacy. For hearing children, the ability to break the sound of speech into its component parts — a skill known as phonological awareness — is the foundation for learning to read.

We wanted to give Alex a chance to use sound. In December 2005, four months before he turned three, he received a cochlear implant in his right ear and we dug into the hard work of practicing speaking and listening.

One year later, it was time to measure his progress. We went through the now familiar barrage of tests: flip charts of pictures to check his vocabulary (“point to the horse”), games in which Alex had to follow instructions (“put the purple arms on Mr. Potato Head”), exercises in which he had to repeat sentences or describe pictures. The speech pathologist would assess his understanding, his intelligibility, his general language development.

To avoid prolonging the suspense, the therapist who did the testing calculated his scores for me before we left the office and scribbled them on a yellow Post-It note. First, she wrote the raw scores, which didn’t mean anything to me. Underneath, she put the percentiles: where Alex fell compared to his same-aged peers. These were the scores that had been so stubbornly dismal the year before when Alex seemed stuck in single-digit percentiles.

Now, after 12 months of using the cochlear implant, the change was almost unbelievable. His expressive language had risen to the 63rd percentile and his receptive language to the 88th percentile. He was actually above age level on some measures. And that was compared to hearing children.

I stared at the Post-It note and then at the therapist.

“Oh my god!” was all I could say. I picked Alex up and hugged him tight.

“You did it,” I said.

Listening to Each Other

I was thrilled with his progress and with the cochlear implant. But I still wanted to reconcile my view of this technology with that of Deaf culture. Since those nights early on when I was trolling the Internet for information on hearing loss, Gallaudet University in Washington, D.C., had loomed large as the center of Deaf culture, with what I presumed would be a correspondingly large number of cochlear implant haters. By the time I visited the campus in 2012, I no longer imagined I would be turned back at the front gates, but just the year before a survey had shown that only one-third of the student body believed hearing parents should be permitted to choose cochlear implants for their deaf children.

“About fifteen years ago, during a panel discussion on cochlear implants, I raised this idea that in ten to fifteen years, Gallaudet is going to look different,” says Stephen Weiner, the university’s provost. “There was a lot of resistance. Now, especially the new generation, they don’t care anymore.” ASL is still the language of campus and presumably always will be, but Gallaudet does look different. The number of students with cochlear implants stands at 10 percent of undergraduates and 7 percent overall. In addition to more cochlear implants, there are more hearing students, mostly enrolled in graduate programs for interpreting and audiology.

“I want deaf students here to see everyone as their peers, whether they have a cochlear implant or are hard of hearing, can talk or can’t talk,” Weiner says. “I have friends who are oral. I have one rule: We’re not going to try to convert one another. We’re going to work together to improve the life of our people. The word ‘our’ is important. That’s what this place will be and must be. Otherwise, why bother?” Not everyone agrees with him, but Weiner enjoys the diversity of opinions.

At the end of our visit, he hopped up to shake my hand.

“I really want to thank you again for taking time to meet with me and making me feel so welcome,” I said.

“There are people here who were nervous about me talking to you,” he admitted. “I think it’s important to talk.”

So I made a confession of my own. “I was nervous about coming to Gallaudet as the parent of a child with a cochlear implant,” I said. “I didn’t know how I’d be treated.”

He smiled, reached up above his right ear, and flipped the coil of a cochlear implant off his head. I hadn’t realized it was there, hidden in his brown hair. Our entire conversation had been through an interpreter. He seemed pleased that he had managed to surprise me.

“I was one of the first culturally Deaf people to get one.”

Perhaps it’s not surprising that most of the people who talked to me at Gallaudet turned out to have a relatively favorable view of cochlear implants. When I met Irene Leigh, she was about to retire as chair of the psychology department after more than 20 years there. She doesn’t have an implant, but is among the Gallaudet professors who have devoted the most time to thinking about them.

She and sociology professor John Christiansen teamed up in the late 1990s to (gingerly) write a book about parent perspectives on cochlear implants for children; it was published in 2002. At that time, she says, “A good number of the parents labeled the Deaf community as being misinformed about the merits of cochlear implants and not understanding or respecting the parents’ perspective.” For their part, the Deaf community at Gallaudet was beginning to get used to the idea by then, but true supporters were few and far between.

In 2011, Leigh served as an editor with Raylene Paludneviciene of a follow-up book examining how perspectives had evolved. Culturally Deaf adults who had received implants were no longer viewed as automatic traitors, they wrote. Opposition to pediatric implants was “gradually giving way to a more nuanced view.” The new emphasis on bilingualism and biculturalism, says Leigh, is not so much a change as a continuing fight for validation. The goal of most in the community is to establish a path that allows implant users to still enjoy a Deaf identity. Leigh echoes the inclusive view of Steve Weiner when she says, “There are many ways of being deaf.”

Ted Supalla, the ASL scholar who was so upset by cochlear implants, had deaf parents and deaf brothers, a background that makes him “deaf of deaf” and accords him elite status in Deaf culture. Yet when we met, he had recently left the University of Rochester after many years there to move to Washington, D.C., with his wife, the neuroscientist Elissa Newport. They were setting up a new lab not at Gallaudet but at Georgetown University Medical Center. Waving his hand out the window at the hospital buildings, Supalla acknowledged the unexpectedness of his new surroundings. “It’s odd that I find myself working in a medical community … It’s a real indication that times are different now.”

‘Deaf Like Me’

Alex will never experience deafness in quite the same way Ted Supalla does. And neither do the many deaf adults and children — some 320,000 of them worldwide — who have embraced cochlear implants gratefully.

But they are all still deaf. Alex operated more and more fluently in the hearing world as he got older, yet when he took off his processor and hearing aid, he could no longer hear me unless I spoke loudly within inches of his left ear.

I never wanted us not to be able to communicate. Even if Alex might never need ASL, he might like to know it. And he might someday feel a need to know more deaf people. In the beginning, we had said that Alex would learn ASL as a second language. And we’d meant it — in a vague, well-intentioned way. Though I used a handful of signs with him in the first few months, those had fallen away once he started to talk. I regretted letting sign language lapse. The year Alex was in kindergarten, an ASL tutor named Roni began coming to the house. She, too, was deaf and communicated only in ASL.

Through no fault of Roni’s, those lessons didn’t go so well. It was striking just how difficult it was for my three boys, who were then five, seven, and ten, to pay visual attention, to adjust to the way of interacting that signing requires. (Rule number one is to make eye contact.) Even Alex behaved like a thoroughly hearing child. It didn’t help that our lessons were at seven o’clock at night and the boys were tired. I spent more time each session reining them in than learning to sign. The low point came one night when Alex persisted in hanging upside down and backward off an armchair.

“I can see her,” he insisted.

And yet he was curious about the language. I could tell from the way he played with it between lessons. He decided to create his own version, which seemed to consist of opposite signs: YES was NO and so forth. After trying and failing to steer him right, I concluded that maybe experimenting with signs was a step in the right direction.

Even though we didn’t get all that far that spring, there were other benefits. At the last session, after I had resolved that one big group lesson in the evening was not the way to go, Alex did his usual clowning around and refusing to pay attention. But when it was time for Roni to leave, he gave her a powerful hug that surprised all of us.

“She’s deaf like me,” he announced.

Lydia Denworth is the author of I Can Hear You Whisper: An Intimate Journey through the Science of Sound and Language (Dutton), from which this piece is adapted.

TIME The Weekend Read

Face-to-Face With a Psychopath

Photo from Trapped, a series on mental illness in American prisons, intended to increase the dialogue about prison reform and the mental health crisis in America. Jenn Ackerman

'Shock Richie' lived up to his name—but did he also point toward a new way of looking at the psychopathic mind?

My Sunday morning began with a 60-minute commute through the rain to the maximum-security facility that housed the treatment program for Canada’s most notorious violent offenders. This was a special day: a new cohort of inmates was being transferred in to start treatment. I was excited about the chance to interview 25 new inmates and get them signed up for my research studies.

I arrived at the housing unit before the inmates had left their cells. I entered the nurses’ station and fired up the coffeemaker. The inmates’ cells opened and they rushed for the showers or the TV room. It was football season and the East Coast games were just starting. The inmates crowded into the TV room. I leaned against the door frame, watching the TV to see if I could catch a glimpse of the latest highlights. And then suddenly there was tension in the air. I felt it on the back of my neck before I was even conscious of what was happening. The inmates milling around had slowed, the sound of their feet hitting the cold concrete floor halted, the TV seemed to get louder, and all of a sudden I was acutely aware of the steam from the hot coffee in my mug spiraling up toward my nose.

An inmate had exited his cell completely naked and started walking up the tier. I noticed him out of the corner of my eye. He passed the TV room, shower stalls, and empty nurses’ station and proceeded down the stairs to the doors that led to the outside exercise area. Some of the inmates turned slightly after he had walked by to take a look at him. Others tried not to move or look, but I could see they noticed. The inmates were as confused as they were anxious. What was he doing?

The naked inmate proceeded outside into the rain and walked the perimeter of the short circular track. He walked around the oval track twice. The TV room was on the second floor and the inmates had a good view of the track. Some of the inmates peered outside and watched him. Everyone was distracted; no one spoke. We were all in shock.

The inmate returned, still naked, and walked up the stairs to the second-floor tier and then down to his cell. The tension around the TV room grew. The inmate quickly emerged from his cell with a towel and proceeded to the showers. He walked down the middle of the tier as inmates slowly moved out of his way or retreated into their cells. Other inmates appeared to talk to one another, but they were clearly trying to avoid any direct eye contact with him. I noticed one of the biggest inmates had subtly slowed his pace so that he would not cross the path of the new inmate.

The naked inmate took a quick shower and returned to his cell; there was a slight swagger to his stride. He was not particularly big, but his physique was ripped.

I had to interview him. I took a gulp of coffee and then walked toward his cell.

The first name written on masking tape above his door was “Richard.”

“Good morning. I’m the research guy from the University of British Columbia. We are conducting interviews and brain wave testing on the inmates in treatment here. Would you be interested in hearing more about it?” I asked.

“Sure,” came the reply out of the dark cell.

“All right, then. Why don’t you get dressed and grab a bite to eat, and I’ll come get you in about thirty minutes. We’ll do the interview downstairs in my office.”

I returned to the nurses’ station and had a couple more cups of coffee. I wanted to make sure I was fully awake when I interviewed Richard.

‘Shock Richie’ Pushes My Button

Richard had dressed in classic prison garb: blue jeans, white T-shirt, and dark green jacket. He sauntered down the stairs and through the covered outdoor walkway to the mess hall for breakfast. He returned to his cell after about 15 minutes. I couldn’t wait; I went down early to get him.

He followed me to my office and plopped down in the chair opposite me.

Before I could get the consent form out of the drawer, he stared at me and said: “You ever need to push that red button?” He was referring to the silver-dollar-sized button in the middle of the wall; when depressed, it signaled distress. A buzzer would go off in the guard bubble down the hallway.

We were both about the same distance away from the button. I realized that I might not be able to reach the button before he could get to me. My mind quickly turned to figuring out a new way to organize the office so that I was closer to the button than the inmates being interviewed.

“No,” I replied. “In the five years I’ve worked here, I’ve never had to push the button.” I threw the five years in to let him know that I had some experience behind me.

Without saying another word, he leaped up and slammed his hand on the button. I didn’t have time to react. He returned to his seat as quickly as he had jumped up.

“Let’s see what happens,” he said calmly, leaning back into his chair.

Over a minute later, we heard doors being slammed open in the distance and the unmistakable sound of running footsteps.

I had thought about getting up and opening the door for the guards, but I would have had to pass by Richard to get to the door. So I just sat in my chair and waited. The guards’ response time felt glacial.

A key was jammed into my door and then it was flung open; two guards entered, panting and out of breath, and stared at us.

Richard turned calmly in his chair and said to the guards: “What’s the problem?”

“Someone pushed the alarm button,” the guard stammered. “Everything okay?” His question was directed at me.

“Oh, I must have accidentally pushed it when I took my coat off,” Richard answered. “Everything is just fine; we are just doing the research interview here.”

“Okay,” the guard said. “Don’t do that again.”

I just nodded. I was having trouble speaking.

The guards pulled the door closed and Richard turned and looked at me.

“They call me Shock Richie,” he said. “And I’m going to shock you too.”

Mustering as much inner strength as I could, I replied: “I’m looking forward to it; I’m here to be shocked. Take your best shot.”

Shock Richie smiled.

Prison is never boring, I thought.

‘You ever tried to carry a body?’

We completed the consent form and then I started the Psychopathy Checklist interview with a question I would never ask any other inmate in my career.

“Why did you walk naked out in the rain?”

“Well, I arrived last night. You have to make an impression on the other inmates right away when you get shipped to a new place. I saw you standing there by the TV room. You noticed how all the other inmates got a bit nervous when I walked by. Even the big ones get nervous when you do shit like that. You just got to establish yourself right away. If you don’t, then inmates think they can test you.” He stared quite matter-of-factly at me; the emptiness in his eyes was unnerving.

“When I do stuff like that, inmates don’t know what to think. I’m unpredictable. Sometimes I don’t even know why I do what I do. I just do it.”

My mind was racing again. I completely agreed with his logic, albeit twisted; he had already established his dominance at this prison. He was going to score high on at least a few psychopathic traits. Nike probably never envisioned a psychopathic inmate embracing its slogan Just Do It in a manner quite like this.

“You’ve been working here for five years?”

“Yes, since I started graduate school,” I replied.

“Interviewed lots of guys, right?”

“Yes, hundreds of them.”

“Well, you ain’t never met anyone like me,” he said.

“Really? What makes you so special?”

“I’ve done shit you can’t even imagine. I’m gonna shock you like I shock everyone,” he stated calmly. “Let’s get on with it.”

Richie enjoyed doing bad things. He was only in his late 20s when I interviewed him, but he had a rap sheet like no one I had ever interviewed before. As a teenager he had committed burglary, armed robbery of banks and convenience stores, arson for hire, and all kinds of drug-related crimes from distribution to forcing others to mule drugs for him. He would force women to hide plastic baggies of cocaine in their body cavities and transport them across borders and state lines and on plane flights. One of Richie’s girls got a baggie stuck in her vagina. Richie used a knife to “open her up a bit” so he could retrieve his drugs. He said he didn’t use her again after that. When I asked him what he meant by that, he said that he didn’t use her for sex; she was too loose now, and she lost her nerve about carrying drugs.

Richie smiled as he told me a story of a prostitute he had killed for pissing him off. He actually seemed proud when he described wrapping her up in the same blanket he had suffocated her with so he could keep all the forensic evidence in one place. He put her in the trunk of his car and drove out to a deserted stretch of road bordered by a deep forest. Chuckling, he told me he was pulled over by a highway trooper because he was driving erratically as he searched for a dirt road to drive up so he could bury the body in the woods.

“So the cop pulls me over and comes up to the window and asks me if I have been drinking alcohol. I lied and said no. I told him that I just had to take a piss and I was looking for a place to go. But the cop gave me a field sobriety test anyways. I figured that if I didn’t pass the test, I would have to kill that cop. Otherwise, he might open the trunk and discover the body. The cop didn’t search me when I got out of the car, and I was carrying a knife and a handgun. I’m surprised that I passed that field test since I had had a few drinks that night. I was planning to beat the cop senseless and then I was going to put the girl’s body in the backseat of the cop’s car. Then I would shoot him in the head with his own gun and make it look like a suicide after he accidentally killed the prostitute while raping her in the backseat of his cruiser. Everyone would think it was just another sick dude.”

The irony of his latter statement was completely lost on Shock Richie.

The cop proceeded to point out a dirt road just up the way where Richie could pull over and take a piss. It was fascinating that Richie could remain calm enough not to set off any alarm bells for the cop that something was amiss. After all, Richie had a body decomposing in the trunk of the car. Yet apparently, Richie showed no anxiety in front of the cop. Most psychopaths, like Richie, lack the anxiety and apprehension associated with punishment.

Richie turned up the dirt road the cop pointed out to him and drove in a ways. He pulled over, parked, and removed the body from the trunk.

“I had all these great plans to carry the body miles into the woods and bury it really deep so nobody would ever find it. But it’s f—ing hard to carry a body. You ever tried to carry a body?” he asked.

“No, I don’t have any experience carrying dead bodies,” I told him.

“Well, it’s a lot of work, let me tell you. So I only got about a hundred yards off the road and just into the trees before I was exhausted. Then I went back and got the shovel from the car. I started digging a huge hole.”

He looked up at me with those empty eyes and asked: “You know how hard it is to dig a hole big enough to bury a body?”

“No,” I answered, “I don’t have any experience digging holes to bury bodies.”

“Well, it’s harder than you might think.” He started laughing. “I had all these great plans to carry her miles into the woods and dig this monster hole so nobody would ever find her.”

A couple of weeks later, hikers discovered the body. Shock Richie read about it in the newspapers, but he was never charged with the murder.

Not His Brother’s Keeper

Richie admitted that he had no need for friends. He’d really never been close to anyone in his life. He preferred to do everything on his own. He also didn’t trust anyone.

I believed him. Richie had no friends in prison and no visitors, and all the other inmates said he could not be trusted; he knew not to trust them in return.

He had lived a life supported by crime, never had any vocational training, and never made even a passing attempt at any other lifestyle. He made most of his big scores by taking down rival drug pushers. He would set up deals in different towns and then rob and sometimes kill the other person. Richie had no fear or hesitation about killing. Richie also had more than a dozen fake names and accompanying identification.

For a long time he was a pimp. He used to corral runaways into working for him. He would get them hooked on drugs and then make them work the streets. He’d killed more than a few prostitutes. He saw people as objects, things to be manipulated; we were there just for his entertainment.

When Richie had been released the last time from prison, he was taken in by his older brother. His older brother was not a criminal. He was on the straight and narrow. After a few months of Richie bringing home prostitutes and doing drug deals at the house, his brother had told Richie he had to stop or he was going to kick him out. They argued, but Richie never tried to change his behavior. Finally, his brother had had enough. He picked up the phone to call the police and have Richie arrested for drug possession. “I was high,” said Richie, “but not more than usual. I got the jump on him and beat him with the phone. While he was lying there dazed on the floor, I ran into the kitchen and grabbed a knife. I came back and stabbed him a few times.” He looked up at me intently to see if I was shocked.

“Continue,” I said.

“I figured that I would make it look like somebody had come over and killed him as part of a drug deal gone bad. Then I thought that maybe I should make it look like my brother had raped one of my girls and one of them had stabbed him.” By girls he meant the prostitutes in his “stable.”

After killing his brother, he went out and partied for a day or two. Then he came back home with a prostitute whom he planned to stab, and then put the weapon in the hand of his dead brother. He was going to put them both in the basement and make it look like his brother died quickly during the fight and the girl died slowly from stab wounds.

While he was having sex with the prostitute in the living room, she said she smelled something funny.

“You ever smell a body after it’s been decomposing for a couple days?” he asked.

“No,” I replied, “I don’t have any experience smelling decomposing bodies.”

“Well, they stink. I recommend getting rid of them fast.”

After having sex, he intended to lure the girl down into the basement. But the prostitute excused herself to use the bathroom and she jumped out the window and ran away. Later that evening the police showed up at his door and asked to come inside. Apparently, the prostitute recognized that odd smell to be that of a decomposing body. She had good survival instincts.

Richie told the cops he had been away from the house partying for a few days. He said he didn’t know that his brother had been killed. Confessing to being a pimp and drug dealer, Richie told the officers that he owed a lot of people a lot of money. He gave them a list of a dozen or so names of potential suspects.

The police eventually arrested Richie. Through his attorney, Richie received a plea deal. He pleaded guilty to manslaughter and was sentenced to many years in prison.

No More Little Richies?

Richie had a few more zingers he hit me with that day. He had indeed met my challenge. When I got home that evening, I opened a bottle of wine; it was empty before I knew it.

Richie and I have both spent the last 20 years in prison: Richie as an inmate, me as a scientist trying to understand the mind and brain of the psychopath. Richie scored in the 99th percentile on the Hare Psychopathy Checklist-Revised (PCL-R), the test we use to assess psychopathic traits. There are 20 psychopathic traits on the Hare PCL-R, including Lack of Empathy, Lack of Guilt and Remorse, Callousness, Irresponsibility, and Impulsivity. Richie fit the classic definition of all of those traits.

Richie was also the first psychopath to receive an MRI scan of his brain. Since that first MRI study, my laboratory has scanned the brains of more than 3,000 other inmates, many of them psychopaths like Richie. This MRI database is the world’s largest forensic neuroscience repository, and it is starting to yield some startling discoveries. We know, for example, how Richie’s brain differs from everyone else’s. His limbic system, the area of the brain that controls emotion and affect, is reduced in both structure and function. Additional research has found the same brain abnormalities in incarcerated youth with emerging psychopathic traits. Indeed, some scientists argue that emotional and behavioral antecedents of psychopathic traits can be recognized as early as age 6.

If Shock Richie’s brain has been abnormal since he was a child, is he responsible for his actions as an adult? Does Richie have the same free will as the rest of us?

Finally, the latest science of psychopathy has also illuminated a path that might remedy these problems before they even get started. Indeed, studies are showing that early treatment might prevent little Richies from ever developing.

Excerpts adapted from The Psychopath Whisperer: The Science of Those Without Conscience (Crown), by Dr. Kent Kiehl, available Apr. 22.

TIME The Weekend Read

I’ll Finish the Dishes When I’m Dead

Overwhelmed
Illustration by Leah Goren for TIME

Caught up in what I’ve come to call the Overwhelm, I was nagged by a constant thought: Was I not just bad at time, but squandering my one and only life?

Correction appended: April 7, 2014.

The way you live your days is the way you live your life.

—Annie Dillard

One evening when my kids were young, I was outside weeding my infernal gravel yard that, if left untended, begins to look like a furry Chia Pet. They were bouncing with sheer delight on the trampoline.

“Mommy, come jump with us!” they cried. “In a minute,” I kept saying. “Just let me finish weeding.” It was a time in my life when I used to routinely ask myself, “What do I need to do before I can feel O.K.?” And then I’d run through a never-ending mental list. That evening, with a familiar sense of vague panic rising, I felt compelled to finish at least one thing — the weeding — on that long, long list.

Lost in my churning thoughts, I didn’t notice the sun go down. Or hear my kids go inside. When I looked up again, the sky was dark, the yard still covered in weeds, and I was alone.

I have often thought back to that moment with such regret.

But it wasn’t the only moment. Because this is how it felt to live my life most days: scattered, fragmented and exhausting. I was always doing more than one thing at a time and felt I never did any one particularly well. I was always behind and always late, with one more thing and one more thing and one more thing to do before rushing out the door. Entire hours evaporated while I did stuff that “needed to get done.” But once I’d done it, I couldn’t tell you what it was I had done or why it seemed so important. I felt like the Red Queen of Through the Looking-Glass on speed, running as fast as I could — usually on the fumes of four or five hours of sleep — and getting nowhere. Like the dream I kept having about trying to run a race wearing ski boots.

I have baked Valentine’s Day cupcakes until 2 a.m. and finished writing stories at 4 a.m., when all was quiet and I finally had unbroken time to concentrate. I have held what I hope were professional-sounding interviews sitting on the floor in the hall outside my kids’ dentist’s office, in the teachers’ bathroom at school functions, in the car outside various lessons and on the grass, quickly muting the phone after each question to keep the whooping of a noisy soccer practice to a minimum.

At work, I’ve arranged car pools to ballet and band practice. At home, I have constantly written and returned emails and done interviews and research for work. “Just a sec,” I would hear my daughter mimic me as she mothered her dolls. “Gimme a minute.” She has stuck yellow Post-it notes on my forehead while I sat working at the computer to remind me to come upstairs for story time.

At night, I have often awakened in a panic about all the things I need to do or didn’t get done. I’ve worried that I’ll face my death and realize that my life got lost in this frantic flotsam of daily stuff, that I’ll find that my children grew up right before my eyes and yet I somehow missed it, that I lived my life on the sidelines, watching time scream past, as a friend once told me, like a rabid lunatic.

Caught up in what I’ve come to call the Overwhelm, I was nagged by a constant thought: Was I not just bad at time, but squandering my one and only life?

Sleep, Sail, Sew, Be Happy

I sit at a table with four other people, pencil in hand, paralyzed. In front of each of us lies a blank calendar for one week, starting on Sunday and ending on Saturday. Each day is broken into hourly grids, starting at 6 a.m. and ending at midnight. The task at this daylong Time Triage workshop sounds simple enough: Design Your Perfect Schedule. What would you do, say, on Tuesday at 10 a.m. or on Friday at 3 p.m. to make your life meaningful? What, when you really come down to the quotidian details, does it look like every day to have time to do good work, to spend quality time with your family and friends and to refresh your soul?

I stare at the page.

And so does everyone else: a real estate agent who feels there’s so much chaos between her work and life that it seems as if her time is “bleeding”; a man who just wants to figure out how to relax on the weekends without feeling guilty; his wife, who wants the world to stop for a few days so she can get caught up; a young woman living on fast-forward who has burned through two marriages and snaps photos of beautiful sunsets to post on Facebook as she flies down the road on her way to somewhere else. (“I just feel this tremendous sense of loss all the time,” she says.)

We’d started the workshop that morning with a very different exercise: filling in a schedule of what we’d done in the past week. That was easy. Everyone jam-packed the little hour grids with so much stuff that the cramped handwriting spilled out into the margins of the page.

Terry Monaghan, our no-nonsense leader and self-described “productivity expert,” then asked us what we’d do if our schedules opened up and we suddenly found we had more time.

“Read,” I said. The others chimed in: “Sleep.” “Learn to sail.” “Sew.” “Pray.” “Travel.” “Be happy.”

“Where is the time for that on your schedules?” Monaghan had asked.

There wasn’t any. That’s when she’d given us these blank calendars and told us to find the time. We stare, stumped, for several more uncomfortable minutes.

Monaghan’s approach to time management is simple: You can’t manage time. Time never changes. There will always be 168 hours in a week. What you can manage are the activities you choose to do in that time. And what busy and overwhelmed people need to realize, she said, is that you will never be able to do everything you think you need to, want to or should do. “When we die, the email inbox will still be full. The to-do list will still be there. But you won’t,” she told us. “Eighty percent of the email that comes in is crap anyway, and it takes you the equivalent of 19 1/2 weeks a year just to sort through. Eighty percent of your to-do list is crap. Look, the stuff of life never ends. That is life. You will never clear your plate so you can finally allow yourself to get to the good stuff. So you have to decide. What do you want to accomplish in this life? What’s important to you right now? And realize that what’s important now may not be two years from now. It’s always changing.”

Monaghan looks at us staring forlornly at our blank Perfect Schedules. She sighs. “This is not rocket science here, people,” she says. “Start with time for what’s most important.”

But that’s where I got stuck. Everything seemed important. My work. My family. My friends. My community. Changing the kitty litter. Sorting my daughter’s Barbie shoes. Keeping the incoming tide of clutter in the house at bay.

Ellen Ernst Kossek, an organizational psychologist and management professor at Purdue University, would later tell me that this means I’m not only an “integrator” of work and home duties, but the kind of überintegrator she calls a “fusion lover.” Unlike “separators,” who keep their work and life separated with bright lines, I tend to do everything all at once, all the time. In her book, CEO of Me, Kossek writes that some people thrive on integration, answering work emails from the sidelines of a child’s soccer game or checking in with the babysitter in the afternoon at the office, juggling 100 different balls with aplomb.

But if that integration was making me feel overwhelmed, then I wasn’t doing it particularly well. The downside to being a fusion lover, she said, is that people like me tend to get confused over which demand is more pressing in the moment, so we don’t have clear focus on what to do. We can’t decide. So we end up doing both work and home activities in an ambivalent, halfhearted way, which produces mediocre outcomes and vague disappointment in both.

Fighting Ambivalence Among the WoMoBiJos

Psychologists say that ambivalence is, literally, being of two minds. In their labs, they have found that this nebulous feeling is far more uncomfortable and stressful on the body and mind than either embracing one position over another or merely being neutral. To be ambivalent, say the psychotherapists David Hartman and Diane Zimberoff, is to be preoccupied with both what is wanted and what is not. “The opposite of ambivalence is a rigid intolerance for ambiguity, nuance or paradox,” they write. “The synthesis of the two is ‘passionate commitment in the face of ambiguity.’ ”

Ah, is that it?

Sitting in the Time Triage workshop, staring at my blank Perfect Schedule, I realized I would never be able to schedule my way efficiently out of the Overwhelm. I had to face my own ambivalence about trying to live two clashing ideals at once — ideal worker, ideal mother. There would never be enough room in a day for both. Big social change in workplaces, policies and attitudes is critical to move out of Overwhelm. But change is hard. It takes time. And I may not live long enough to see it. I had to figure out how to embrace my own life with that passionate commitment in the face of ambiguity, right here, right now.

I searched for people who had. That led me to Maia Heyck-Merlin and the group she put together called WoMoBiJos: Working Mothers With Big Jobs. The WoMoBiJos are women in their 30s and 40s who live in different cities and have big careers in finance, the nonprofit world, medicine and other fields. They love their work, yet they are not ideal worker-warriors. They love their kids and families, yet they don’t buy into the ideal mother’s impossibly high standards. “Good enough is the new perfect” is their mantra. They love their lives. And many have found a way to make time for themselves. Though each one lives a busy life, not one described herself as feeling overwhelmed.

In talking to them, it pretty quickly became apparent why: none of the WoMoBiJos felt ambivalent. Their lives certainly weren’t perfect — living with a 2-year-old, one said, is “like living with a bipolar drunken troll.” They were tired. They worked hard to make things work. But without the fog of guilty ambivalence shrouding their days, each was able to embrace her life with passion.

“I found it’s better for me to ask myself: Am I trying my best? Am I doing things for the right reasons? Do I make those I love feel loved? Am I happy? And then adjust as I go,” said Heather Peske, a Boston WoMoBiJo.

The more I spoke with the WoMoBiJos, the clearer it became that they were freed from the mire of ambivalence because the structures of their lives fully support them in work, love and play. They all work in incredibly flexible work environments. Many WoMoBiJos work compressed schedules or work regularly from home. Their time is their own to control and is predictable. They are unapologetic. Their partners are, to greater and lesser extents, equitably sharing care of kids and domestic work. They automate, delegate or drop everything else — shopping for groceries online, hiring help or not caring if the house is less than perfect or if their husbands always make sandwiches for dinner. So, unlike a majority of women who still do about twice the housework and child care even when working full time, none face the double-time bind at home.

Heyck-Merlin has no qualms about hanging up chore lists at big gatherings of family or friends. “Why should someone be sitting on the couch while I do all the work?” she says. “They can empty the dishwasher.”

The WoMoBiJos are also ruthlessly clear about their priorities. They feel no compulsion to spend time on anything that feels obligatory. They are all disciplined and organized and have learned skills to integrate their work and home lives. They carve firm boundaries to protect uninterrupted time at work, undisturbed time to connect with family and guilt-free time to themselves.

More than anything, I was struck by how supremely confident all the WoMoBiJos are, in themselves, their skills, the decisions they’ve made and the way they live their lives — cultural norms be damned. I wondered, Was that it? Their confidence? Were they able to create these rich, complex and full lives and live them wholeheartedly simply because they believed they could? And if that were the case, could the WoMoBiJos, instead of being just a small group of admirable women in enviable special circumstances, really be pioneers showing us all the way? If they could believe their way into living unambiguously, could others? Could I? “I actually really do not care, to a fault, what people think,” Heyck-Merlin told me. “But I also don’t believe anything is due to personality trait. Everything is learned. It’s a mind-set. It’s a skill that needs to be developed. It takes practice. And time.”

That’s the gospel that Kathy Korman Frey, whom some call the Confidence Guardian, has been preaching. Korman Frey, a Harvard MBA, is an entrepreneur, a mother of two and a business professor at George Washington University who runs the Hot Mamas Project, the largest global database of business case studies written by female entrepreneurs about how they run their companies and manage their home lives at the same time. She is adamant that what keeps so many people, especially women, running ragged is that most have yet to develop the skill of confidence, or what she calls self-efficacy.

Self-efficacy, like grit, can be learned, she said. Like a muscle, like willpower, it can be exercised and made strong. And she is devoting her life to teaching the four ways that famed psychologist Albert Bandura said people could learn it. She calls them Jedi Mind Tricks:

  1. Have “mastery experiences.” The more you do some things well, the more you’ll build confidence to do other things well.
  2. Find role models and seek out mentors and sponsors.
  3. Listen to and believe, rather than dismiss, the positive and encouraging words people have for you.
  4. Get a grip. Recognize that perceptions are what shape experience.

And when it comes to negative and self-defeating patterns of thought, she advises, as Cher did in Moonstruck, “Snap out of it!”

“I’m not saying it’s not hard,” Korman Frey said. “But I am saying it’s like you’re wearing the ruby slippers. You have the power. You’ve had it all along.”

Boot Camp: ‘What’s Most Important to You Right Now’

I called Terry Monaghan. If I was ever going to use what I’d been learning on this journey to find a way out of the Overwhelm, if I was ever going to allow myself a moment of peace, if I was ever going to figure out how to embrace my life with passion, I realized I needed boot camp.

At our first meeting, Monaghan asked me: “What’s most important to you right now?” Then she asked me what I planned to do in the coming week to make time for it. I began rattling off an exhaustive list of just about everything that I needed to do, ever, in my life. By the following week, when we were scheduled to talk again, I was feeling guilty and defeated. I’d barely made a dent in all the tasks I’d assigned myself to do.

“So,” she said wryly when she called, “how long did it take for you to figure out you couldn’t do everything on your list in one week?”

In truth, I’d always known it.

“So much of our overwhelm comes from unrealistic expectations,” she said. “And when we don’t meet them, instead of questioning the expectations, we think that we’re doing something wrong.” Managing the overwhelm, she said, comes down to knowing the underlying story that’s driving those unrealistic expectations.

“So what’s my underlying story?” I asked.

“You want to write the perfect book,” she said matter-of-factly. “And you think the perfect book is anything written by anyone else. Your ongoing conversation with yourself is, You’re not enough. So whatever you do will never be enough. Every human being has some flavor of ‘not enough.’ You can either be stopped by it or simply notice it, like the weather.”

I began to try just to notice that stormy internal weather, instead of getting swept away in it. Notice how much I was unconsciously trying to live up to impossible ideals. Notice my ambivalence. And I began to grapple more consciously with the questions that daunt not only perfectionists but, really, anyone with a pulse: How much is enough? When is it good enough? How will I know?

We started small: by clearing my desk. “It gives your brain a rest from visual clutter,” Monaghan said. As we worked to build systems and routines into my days, we always seemed to be coming back to my brain, and how getting a handle on the Overwhelm was not just about creating more space and order on my calendar and in my office but also doing the same in my mind.

When I would second-guess myself or become obsessed about not knowing what I was doing, she’d interrupt me brusquely. “Right now, you need to free up all this energy that’s being consumed by worry.” She told me to take out a piece of paper, set a timer for five minutes and write furiously about absolutely everything that was bugging me. I didn’t have to do anything about this “Worry Journal.” Just getting the ambivalence out of my head and putting it somewhere would give my brain a rest. “It’s a way off the hamster wheel,” she said.

We did the same with the enormous to-do list I carried around in my head. Every Monday morning, I began to set aside time to plan the week. I began with a brain dump. It was the list of everything on my mind from here to eternity. Working memory can hold only about seven things at one time. And if the to-do list is much longer than that, the brain, worried it may forget something, will get stuck in an endless circular loop of mulling, much like a running toilet. That mulling is what social scientists say creates “contaminated time,” when, even in what looks like a moment of leisure on the outside, you can be lost in the churn of your thoughts and feel anything but. The brain dump is like jiggling the handle. “If your to-do list lives on paper, your brain doesn’t have to expend energy to keep remembering it,” Monaghan said.

The Power of the Pulse

As I worked with Monaghan, I also interviewed productivity and time-management experts; read books; clipped magazine articles; watched webinars; listened to podcasts; attended lectures; took my Time Perspective Inventory to see if I viewed the past, present and future in the optimal configuration for happiness; took an Energy Audit to see if I was working at peak performance physically, mentally, emotionally and spiritually; and reviewed dozens of different methodologies all aiming to relieve the time-sucking overwhelm.

The essence of their advice all seemed to boil down to what my kids learned in preschool: Plan. Do. Review. Take time to figure out what’s important in the moment and what you want to accomplish in life. If you’re ambivalent, notice it. Pick something anyway. Embrace it. Play. Try one approach. Assess. If that isn’t working, ditch it and play with something else. Keep yourself accountable but enjoy the process. There is no right answer. This is life.

Like Monaghan herself does, I began using bits of one method, pieces of another. If they seemed to help, I kept on using them. If the methods were too complicated or took too much work, I moved on. But by far, the one skill that I have learned that has transformed my experience of time is the power of the pulse.

Pulsing — deactivating and reactivating the brain — actually makes it pay better attention. The brain evolved to detect and respond to change, always alert to danger. And once the novelty wears off and the brain becomes “habituated,” it no longer notices the nonthreatening sights, sounds or feelings that have been constantly present. And neuroscience is finding that breaks inspire creativity and flashes of insight.

Monaghan sought to train me to work in pulses. The idea was to chunk my time to minimize the constant multitasking, role switching and toggling back and forth between work and home stuff like a brainless flea on a hot stove. The goal was to create periods of uninterrupted time to concentrate on work — the kind of time I usually found in the middle of the night — during the day. And to be more focused and less distracted with my family.

When it was time to work, I began to shut off email and turn off the phone. When it was time to be with family, I tried to do the same. I began to gather home tasks in a pile and block off one period of time every day to do them. It was easier to stay focused on work knowing I’d given myself a grace period to get to the pressing home stuff later.

When I was having difficulty, procrastinating, avoiding a task, stuck in ambivalence, Monaghan had me set a timer for 30 minutes, then take a break. “Your brain can stay focused on anything, even an unpleasant task, if it knows it will last only 30 minutes,” she said. Slowly, I worked up to 45- and then 90-minute stretches. I try to regularly pause, to disrupt the busyness and set my own priorities. That’s helped me flip my to-do list. I focus on what’s important first, pick one thing to do a day, and the rest goes in the Brain Dump.

I’m getting better about defining the mission of my work, setting realistic expectations, communicating them and measuring performance rather than tracking hours. Because change is hard and the urge to conform to the larger culture is hardwired into our brains, I have created my own culture — and set up a network of support with like-minded people who are also committed to resisting the popular culture and finding time for Work, Love and Play.

At home, my husband Tom, the kids and I are getting better at sharing responsibility for the second shift. That has freed up time to play. I’ve spent entire days reading again, whether the laundry is folded or not.

Having a clearer sense of what’s most important to do, I’m not as seized with the feeling that I haven’t done enough and the urge to do “just one more thing.” Clearing the clutter in my head and the guilt that hung over every halfhearted decision has given me more peace of mind than any elaborate time-management system. Time is still a struggle. But I am learning. Time feels better. Rather than ambivalence or Overwhelm, what I feel most of the time is gratitude.

‘Come On, Mom, Let’s Take a Break Together’

When I was 34, I spent months helplessly watching my younger sister die of cancer. For the first time, I clung to each precious minute like a rare jewel. She had so few left. If she had to go down this awful road, then I wanted only to be right there with her, so at least she wouldn’t have to travel it alone. In that singular focus, the smallest gesture, the quietest moment was transformed into an unimaginably exquisite gift of grace. Every detail presented itself in its aching fullness: the bright red Adriamycin dripping into her veins, the way we laughed like little girls who’d done something naughty when I combed her thick, wavy blond hair and a big chunk fell out, the quality of the fading light in her hospital room as evening gently softened to dusk.

The single tear that rolled out of the side of her eye when it was clear that her life was at an end.

The Greeks called that kind of time kairos. When we live by the clock, the Greeks said, we are bound by chronos time. This is the time that races, marches, creeps and flies. It is the life that T. S. Eliot measured out in coffee spoons and the 30 hours of leisure that some time researchers claim we have. But kairos is the time of the “right moment,” the eternal now, when time is not a number on a dial but the enormity of the experience inside it.

On the day that I sought to write the last chapter for my book, I was caught in the gears of chronos, rushing from an early-morning teacher meeting we’d forgotten to the shop to get Tom’s rattling car. The dryer was broken. Soggy clothes were draped all around the house. My son had forgotten his big geometry project. And I’d had to physically remove the keyboard from the computer to keep my daughter from spending most of her waking hours on MovieStarPlanet.com. At a loss for what to write, I went for a walk. As I passed the park near our house, I saw a little girl wearing a bright pink paper crown and giggling with her friends. One asked, “What time is it?” The little girl, completely absorbed in the joy of walking home from school with friends on a gloriously sunny afternoon, started to laugh. “It’s 200 o’clock!”

When my sister was gone, I thought, for her sake, I would remember to live the rest of my days with that same fragile and humble grace, as if it were always 200 o’clock, knowing that one day I, too, would be gone. I even began to wear her watch every day to remind myself. I still do.

But I soon forgot.

One rainy Sunday, not so long ago, the kids and I made soup together. The kitchen was a mess. I immediately began to tackle the sink, which was clogged with vegetable peels and dirty dishes. My daughter Tessa sat on the window seat in the family room to watch the rain pour down.

“Mom, let’s have lunch,” she said.

“I’m doing the dishes right now.”

“Come on, Mom, let’s take a break together.”

“In a minute. Just let me get these dishes done.”

“Mom. Come over.”

It was the third time that hit me. Just stop, I thought. Stop right now. I took a breath. Now, I thought. I can feel O.K. right now. Here, I thought. Here is the best place to be. I keep forgetting, but right now I remember. I remember that life will be over quickly and that this is an amazingly beautiful day.

I poured myself some of the soup we’d just made, left the mess in the kitchen sink and sat across from Tessa on the window seat. My son Liam came to join us. I didn’t yip at them about chores or homework or things to do. We just sat together on the window seat. Eating soup. Watching the rain.

Brigid Schulte is an award-winning journalist for the Washington Post and the Washington Post Magazine. She was part of the team that won the 2008 Pulitzer Prize. She is also a fellow at the New America Foundation. She lives in Alexandria, Va., with her husband and their two children. Adapted from her new book Overwhelmed: Work, Love & Play when No One Has the Time, published by Sarah Crichton Books/FSG.

Correction: The original version of this story misspelled the name of Diane Zimberoff.

TIME The Weekend Read

Parent Like a Mad Scientist

Taking them to my alma mater, U.C. Berkeley, whereupon Yo announced, “Dad, there’s no way I am going here. My days of attending the schools you went to are over.”
Me with my daughter, E, and my son, Yo. I gave the kids unique names based on research showing that this might endow them with superior impulse control. Stephen P. Hudner

Give your kids weird names, expose them to raw sewage, and still be the world’s best dad

As an immigrant society with no common culture, we Americans have always been blessed with the ability to make things up as we go, be it baseball, jazz, the Internet … even Mormonism. Yet, when it comes to parenting, we’ve become obsessed with finding the one best way — whether it’s learning to raise our kids like the Chinese, the French, Finns, or whatever other group is in fashion today. It’s time to stop. No one culture has parenting down pat; there’s no one best model that we can look to for all the answers. And that’s a good thing. Parenting should be an adventure. And more importantly, if we want to keep America’s culture young and prosperous and innovative, parenting should be an experiment.

Yo engaged by something his mother is demonstrating to him; E mad for some reason. Courtesy Dalton Conley

I should know. I’m a bit of a mad-scientist parent myself — just ask my kids, E and Yo.

As a dual-doctorate professor of sociology and medicine at New York University, I gave my kids “unique” names based on research about impulse control. I exposed them to raw sewage (just a little!) and monkeys (O.K., just one!) to build up their immune systems based on the latest research on allergies and T-cell response. I bribed them to do math inspired by a 2005 University of Pennsylvania study of Mexican villagers that demonstrated the effectiveness of monetary incentives for schooling outcomes. And don’t think my offspring were the only ones bearing the brunt of all this trial and error: I got myself a vasectomy based on research showing that fewer kids may mean smarter kids.

There’s a method to my madness (namely, the scientific method). Parentology — as I call this approach to raising kids — involves three skills: first, knowing how to read a scientific study; second, experimenting on your kids by deploying such research; and third, involving your kids in the process, both by talking to them about the results and by revising your hypotheses when necessary.

Kids raised this way won’t necessarily end up with 4.0 GPAs, but they almost certainly will become inquisitive, creative seekers of truth.

Dalton Conley's kids, Yo, left, and E, right.
Often we are asked if E (right) and Yo (left) are twins; they are not. Despite knowing that narrow birth spacing may be disadvantageous, we popped our kids out a mere 18 months apart. Courtesy Dalton Conley

“Parentology” in Practice

I put my approach into practice more or less immediately upon becoming a father, throwing out my copy of Dr. Spock and instead conducting a series of experiments on my two young children, now 16 and 14. No, I didn’t raise one in the woods with wolves and the other in a box. But I did give my children weird names — E (my daughter) and Yo (my son, full name: Yo Xing Heyno Augustus Eisner Alexander Weiser Knuckles) — to teach them impulse control. Evidence shows that kids with unusual names learn not to react when their peers tease them (at least in elementary school). What’s more, a 1977 analysis of Who’s Who by psychologist Richard Zweigenhaft found that unusual names were overrepresented, even after factoring out the effect of social class and background.

Meanwhile, after exploring the literature on verbal development, I decided not to teach my kids to read but instead to read aloud to them constantly. It turns out that exposure to novel words, complex sentences and sustained narratives is what predicts verbal ability later on, not whether a 4-year-old can decode words on a page. And the best predictor of later verbal skills is the number of total and unique words a child hears before kindergarten. Psychologists Betty Hart and Todd Risley observed how poor and middle-class parents interact with their toddlers. They estimated that the middle-class kids heard an average of 45 million words over a four-year period, while the poor children heard a mere 13 million. This difference, in turn, explained later achievement gaps. Unable to mimic Robin Williams and babble away nonstop, I settled on reading to my kids at every opportunity. So while E and Yo were both behind their peers in reading in first grade, by fourth grade they had the best scores in their respective classes.

Dalton Conley reads to his son, Yo.
Reading is fundamental. Though I never taught them to decode words on the page, I was a human Kindle before there were such things. I never stopped reading to them. Courtesy Dalton Conley

Of course, not all my experiments have been successful. (If they were all successful, they’d hardly be experiments.) When my son was 11, his school wanted to medicate him for what administrators suspected was ADHD. I thought there might be a way around it. Scientific studies reviewed by University of California, San Diego, professor Andrew Lakoff in 2002 show that psychopharmacological placebo effects are almost as big as those of the actual drugs. And even student-teacher interaction is not immune to such Pygmalion-like dynamics. In one classic 1968 study, researchers Robert Rosenthal and Lenore Jacobson lied to teachers, telling them that they had identified a new test that could pick out genius kids with remarkable accuracy. Then they randomly picked certain pupils and informed the teachers that these particular kids had aced the test. Lo and behold, when the scientists showed up a year later, the kids who had received the “teacher placebo” treatment had gained 15 IQ points relative to the control-sample kids.

Worried about sleep apnea and its potential role in causing ADHD, we took Yo in for an evaluation. My attempt to cure ADHD with a placebo came to naught. Courtesy Dalton Conley

With this research in mind and fearful of the risks of actual medication, I lied to the school, his sister and my son himself, telling them all that I was giving him a powerful stimulant (when it was actually a vitamin), hoping that if they all thought he was calmer and more attentive, they would treat him as such and his behavior would improve. While his teachers noted an improvement in his concentration and behavior for a few weeks after I started my placebo protocol, he backslid — prompting calls from the school about his inappropriate behavior — and was ultimately given a formal ADHD diagnosis. The real stimulants worked. However, I did decide to experiment with giving him the drugs only during the school week (in order to mitigate long-term effects and possible habituation to the drug), which has been successful so far.

Customizing to the Kid

As you can see, while knowing how to read the existing science is important, even more critical is being able to properly experiment on your own young. What works for one kid (or one population of kids) may not work for all, and your family may require customization in order to make a technique work or just to be comfortable with what you’re doing.

Even when there is a clear scientific consensus on, say, the importance of breast-feeding, we don’t often know the distribution of those effects. If a particular intervention — say, paying a child to do a half hour of math a day, like I did — is shown in a randomized, controlled trial to raise math scores by 20%, that could mean that all the kids in the bribery group saw their scores jump by a fifth. Or it could mean that for 80% of the kids, the bribes did not make a whit of difference, but for 20% it doubled their scores. This is what researchers call heterogeneous treatment effects.
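
To make that distinction concrete, here is a toy calculation, a minimal Python sketch with invented numbers (not drawn from any of the studies discussed): two groups of 100 kids whose scores rise by the same 20% on average, once because every child improves and once because only a fifth of them do.

    # Toy illustration of heterogeneous treatment effects; all numbers
    # here are hypothetical, not taken from the bribery studies above.
    baseline = 100.0

    # Scenario A: every child's score jumps by a fifth.
    uniform = [baseline * 1.20 for _ in range(100)]

    # Scenario B: 80 children gain nothing; 20 children double their scores.
    heterogeneous = [baseline] * 80 + [baseline * 2.0] * 20

    for name, scores in (("uniform", uniform), ("heterogeneous", heterogeneous)):
        avg_gain = (sum(scores) / len(scores) - baseline) / baseline
        print(f"{name}: average gain = {avg_gain:.0%}")
    # Both scenarios print 20% -- the trial's headline number cannot
    # tell you which of these two worlds your own kid lives in.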

Some kids are car-truck-train kids; others are animal kids. Guess which ours are. Courtesy Dalton Conley

Other times, results vary across studies and methods. One 2005 study of Mexican families found that cash rewards conditional on school attendance were hugely effective in improving child outcomes such as health and educational attainment. But a 2009 effort to replicate this in New York City showed only minor educational benefits. And a third study, published in the Review of Economics and Statistics in 2010 and focused on elementary-school students in Coshocton, Ohio, found that it worked to pay the students themselves (as opposed to their families) based on how well they did on outputs (i.e., test scores). But the largest U.S. study of all — conducted in 2011 by Harvard economist Roland Fryer in Chicago, Dallas, New York City and Washington — found that when rewards were focused on outcomes like passing tests, they failed to produce meaningful improvements. But in that study, when the rewards were based on performing input tasks like reading a book or being on time to class, they worked. (Even in this study, however, results were not consistent across cities, age groups or race.)

In short, even when there’s research on a topic, you can’t be sure how it will apply to your own kids — so it’s necessary to embrace experimentation. While I may never know why some studies found big gains from bribery and others found none, I was able to bribe both my kids to do extra math. I simply adjusted the rewards to fit the kid (something that would be impossible for researchers to do in a big study). As a parent, I could play on my son’s love of video games to offer a minute-for-minute swap of online math problems in exchange for World of Warcraft time. For my daughter, the enticement was gummy bears.

I did worry that by providing external motivation in the form of bribery, I might erode their internal motivation for mathematics, as some psychology research has suggested can happen. But that was a risk I was willing to take because — unlike with reading, for instance — they weren’t exactly clamoring for math problems. Here was a case of customizing the existing research to one’s own children. I may or may not have eroded their internal motivation to do math (and I doubt either will end up a professional mathematician), but at least they passed the big tests they needed to in order to get into high school.

How to Know What Matters

Lots of folks think being a scientist is knowing a bunch of esoteric facts that fit together, like how the Earth’s tilt causes the seasons or what mitochondria do or how, exactly, light can be both a wave and a particle. But the scientific method is what’s most important — especially when it comes to parenting. Particularly important, especially for middle- and upper-class parents, is knowing how to read a study and sift out causal relationships from the chaff of mere correlations. (It turns out a lot of great outcomes are correlated with being born in good economic circumstances to well-educated parents, but you want to figure out how to cause better outcomes.)

For instance, take my educational choices for my kids — or, more accurately, their choices. That is, after all the extra math prep I bribed them to do to get them into Stuyvesant (the prestigious New York City high school that students must test into), I allowed them to decide if they actually wanted to go or not.

This may seem, at first blush, to be more like 1970s-style laissez-faire parenting. But actually I was following the latest cutting-edge research in ceding educational choice to my kids. Two studies by economists Stacy Dale and Alan Krueger in 2002 and 2011 showed that if you are white and middle class (which we are), it does not make a difference where you go to college. While it is true that graduates of more-selective institutions fare better in terms of income and wealth later on, compared with graduates of less selective schools, it turns out that this is an artifact of what we scientists call selection bias. It’s not that Harvard is adding so much value to your education as compared with the University of Nebraska — it’s that Harvard admissions is good at picking winners.

This research was about college, but my intuition that it also applied to high school was confirmed when MIT economist Joshua Angrist obtained the data from the selective exam-admission schools in Boston and New York City. He examined the data for what we call regression discontinuities. The logic is the following: if the cutoff to get into Stuyvesant is, say, 560 in a given year, then it is really pretty random whether an individual scores 559 or 560. It could be the difference of a good breakfast or a single vocabulary word that was in one kid’s stack of flash cards by chance. In other words, it probably does not reflect a major difference in innate ability. But the consequences of that point difference determine which school the kid ends up attending. By comparing two groups — the one just above and the one just below the line — we can see how big the “treatment effect” of attending the “better” school is. And it turns out to be essentially zero, in both Boston and New York.
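
For readers who want to see the mechanics, here is a small simulation of that logic, a Python sketch in which the cutoff, the score distributions and the zero effect of the “better” school are all invented for illustration (this is not Angrist’s data):

    # Sketch of the regression-discontinuity logic; every number below
    # is an assumption for illustration.
    import random

    random.seed(0)
    CUTOFF = 560  # hypothetical admission cutoff

    students = []
    for _ in range(200_000):
        ability = random.gauss(550, 30)
        entrance = ability + random.gauss(0, 10)  # a good breakfast, a lucky flash card
        admitted = entrance >= CUTOFF
        outcome = ability + random.gauss(0, 10)   # later score: ability only, no school bonus
        students.append((entrance, admitted, outcome))

    def mean(xs):
        return sum(xs) / len(xs)

    # Naive comparison of all admitted vs. all rejected students is
    # badly biased: the admitted were abler to begin with.
    naive = (mean([o for _, a, o in students if a])
             - mean([o for _, a, o in students if not a]))

    # The discontinuity comparison keeps only students within a hair of
    # the cutoff, where landing above or below it is essentially luck.
    band = [s for s in students if abs(s[0] - CUTOFF) < 2]
    rd = (mean([o for _, a, o in band if a])
          - mean([o for _, a, o in band if not a]))

    print(f"naive gap: {naive:.1f} points; near-cutoff gap: {rd:.1f} points")
    # The naive gap is large; the near-cutoff gap nearly vanishes,
    # matching the finding that the "better" school adds little.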

So, though both my kids gained admission to the most prestigious math and science high school in the country, I let them choose whether they went there or not. I figured, with no overall treatment effect, why not let them go where they sensed they would feel the most comfortable? They knew what environment was best for them. My daughter turned down her offer of admission, while my son decided to go. I, meanwhile, am taking notes to see how this next phase of the experiment turns out. (She is a sophomore and he is a freshman.) Meanwhile, to assuage my own anxieties, I just keep reminding myself just how unimportant going to Harvard really is.

One of the many cross-species interactions that take place in our home. No underworked immune systems here. Courtesy Dalton Conley

The Path to Enlightening Kids

One of the few fake animals in our house. Courtesy Dalton Conley

Finally, perhaps the most important part of parentology is to involve the kids themselves. Whether that means discussing the research about standing desks and their role in preventing obesity, giving them an opportunity to help design the experiment or debriefing them about its results (like when I confessed to my son that I had been giving him a placebo and not the real ADHD medication), the teachable moment is, actually, the most valuable part of the entire experiment.

Having a kid who knows how to separate out causation from mere correlation is more important than having one who can memorize a list of amino acids or Egyptian pharaohs. This is the real goal of experimental parenting: indoctrinating one’s kids into the Enlightenment way of thinking. Helping them learn to question — not authority necessarily: this isn’t 1960s hippie-dippie parenting, after all — but knowledge itself.

So, where tradition fails us (after all, what does the Bible have to say about kids and cell phones?), we can and should resort to the scientific method. Hypothesis formation, trial, error and revision. That is, we should experiment on our own kids.

Worried that screens may be disrupting your teen’s sleep? Do a controlled study in which you take the iPad away at night for two weeks and chart what happens. Want to encourage better study habits? Set up a marketplace for grades or effort and fine-tune the rewards and punishments in real time. Want to exercise the self-discipline muscles of your kids’ brains? Make them wear a mitten on their dominant hand for a couple of hours a day. Want to boost their performance before a big test? Prime them with positive stereotypes about their ethnic and gender identities. Today it is easier than ever — with Google Scholar and the like — to immerse oneself in the most cutting-edge research and apply it to one’s kids.
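
To show how low-tech such an experiment can be, here is a minimal sketch of the screen-and-sleep study described above, with invented sleep numbers standing in for a parent’s real log:

    # A minimal sketch of the iPad experiment: chart two weeks of
    # nightly sleep, one week per condition, then compare averages.
    # The hours below are hypothetical; a real run would use your log.
    ipad_nights = [7.1, 6.8, 7.4, 6.5, 7.0, 6.9, 7.2]     # hours slept, iPad allowed
    no_ipad_nights = [7.9, 8.2, 7.6, 8.0, 8.4, 7.8, 8.1]  # hours slept, iPad removed

    def mean(hours):
        return sum(hours) / len(hours)

    diff = mean(no_ipad_nights) - mean(ipad_nights)
    print(f"with iPad: {mean(ipad_nights):.1f} h, without: {mean(no_ipad_nights):.1f} h")
    print(f"difference: {diff:.1f} h per night")
    # With only a week per condition, treat any gap as a hint, not
    # proof -- alternate conditions and repeat before changing house rules.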

As with patient-driven medicine, in which informed patients advocate for themselves to their doctors rather than just passively receiving information, I predict that American parents and their children will increasingly shun authorities — even good old Dr. Spock — and instead interpret and generate the scientific evidence for themselves.

Rather than a rigid formula of 10,000 hours of violin practice or a focus on a single socially sanctioned pathway to success, American parents should pursue an insurgency strategy: more flexibility and fluidity; attention to often counterintuitive, myth-busting research; and adaptation to each child’s unique and changing circumstances.

E working on her novel. Perhaps my reading-out-loud experiments worked. Courtesy Dalton Conley

If you approach your rug rats this way, by turning them into lab rats, I can’t guarantee they will get into Columbia. But I can predict with statistical confidence that they will be creative, fulfilled members of society and that you will have a lot more fun raising them along the way.

Dalton Conley is a professor of sociology and medicine at New York University and author of Parentology: Everything You Wanted to Know About the Science of Raising Children but Were Too Exhausted to Ask.
