TIME The Weekend Read

Science Gave My Son the Gift of Sound

Alex, March 2006. Courtesy of Lydia Denworth

Cochlear implants have been controversial in Deaf culture — how would one change my son?

On a cold January night, I was making dinner while my three boys played in and around the kitchen. I heard my husband Mark’s key in the lock. Jake and Matthew, my two older sons, tore down the long, narrow hall toward the door. “Daddy! Daddy! Daddy!” they cried and flung themselves at Mark before he was all the way inside.

I turned and looked at Alex, my baby, who was 20 months old. He was still sitting on the kitchen floor, his back to the door, fully engaged in rolling a toy truck into a tower of blocks. A raw, sharp ache hit my gut. Taking a deep breath, I bent down, tapped Alex on the shoulder and, when he looked up, pointed at the pandemonium down the hall. His gaze followed my finger. When he spotted Mark, he leapt up and raced into his arms.

We had been worried about Alex for months. The day after he was born, four weeks early, in April 2003, a nurse appeared at my hospital bedside. I remember her blue scrubs and her bun and that, when she came in, I was watching the news reports from Baghdad, where Iraqis were throwing shoes at a statue of Saddam Hussein and people thought we had already won the war. The nurse told me Alex had failed a routine hearing test.

“His ears are full of mucus because he was early,” the nurse explained, “that’s probably all it is.” A few weeks later, when I took Alex back to the audiologist as instructed, he passed a test designed to uncover anything worse than mild hearing loss. Relieved, I put hearing out of my mind.

It wasn’t until that January night in the kitchen that we saw how totally and obviously unresponsive to sound Alex was. Within weeks, tests revealed a moderate to profound sensorineural hearing loss in both of Alex’s ears. That meant that the intricate and finely tuned cochleas in his ears weren’t conveying sound the way they should.

Nonetheless, he still had usable hearing. With hearing aids, there was every reason to think Alex could learn to speak and listen. We decided to make that our goal. He had a lot of catching up to do. He was nearly two and he could say only “Mama,” “Dada,” “hello,” and “up.”

A few months later we got a further unwelcome surprise: All of the hearing in Alex’s right ear was gone. He was now profoundly deaf in that ear. We had discovered in the intervening months that in addition to a congenital deformity of the inner ear called Mondini dysplasia, he had a progressive condition called Enlarged Vestibular Aqueduct (EVA). That meant a bang on the head or even a sudden change in pressure could cause further loss of hearing. It seemed likely to be only a matter of time before the left ear followed the right.

Suddenly Alex was a candidate for a cochlear implant. When we consulted a surgeon, he clipped several CT scan images of our son’s head up on the light board and tapped a file containing reports of Alex’s latest hearing tests and speech/language evaluations, which still put him very near the bottom compared to other children his age: He was in the sixth percentile for what he could understand and the eighth for what he could say.

“He is not getting what he needs from the hearing aids. His language is not developing the way we’d like,” the doctor said. Then he turned and looked directly at us. “We should implant him before he turns three.”

The Cochlear Countdown

A deadline? So there was now a countdown clock to spoken language ticking away in Alex’s head? What would happen when it reached zero? Alex’s third birthday was only a few months away.

As the doctor explained that the age of three marked a critical juncture in the development of language, I began to truly understand that we were not just talking about Alex’s ears. We were talking about his brain.

When they were approved for adults in 1984 and for children six years later, cochlear implants were the first devices to partially restore a missing sense. How could it be possible to hear without a functioning cochlea? The cochlea is the hub, the O’Hare Airport, of normal hearing, where sound arrives, changes form, and travels out again. When acoustic energy is naturally translated into electrical signals, it produces patterns of activity in the 30,000 fibers of the auditory nerve that the brain ultimately interprets as sound. The more complex the sound, the more complex the pattern of activity. Hearing aids depend on the cochlea. They amplify sound and carry it through the ear to the brain, but only if enough functioning hair cells in the cochlea can transmit the sound to the auditory nerve. Most people with profound deafness have lost that capability. The big idea behind a cochlear implant is to fly direct, to bypass a damaged cochlea and deliver sound — in the form of an electrical signal — to the auditory nerve itself.

A cochlear implant. Doug Finger—The Gainesville Sun

To do that is like bolting a makeshift cochlea to the head and somehow extending its reach deep inside. A device that replicates the work done by the inner ear and creates electrical hearing instead of acoustic hearing requires three basic elements: a microphone to collect sounds; a package of electronics to process those sounds into electrical signals (a “processor”); and an array of electrodes to conduct the signal to the auditory nerve. The processor has to encode the sound it receives into an electrical message the brain can understand; it has to send instructions. For a long time, no one knew what those instructions should say. They could, frankly, have been in Morse code — an idea some researchers considered, since dots and dashes would be straightforward to program and constituted a language people had proven they could learn. By comparison, capturing the nuance and complexity of spoken language in an artificial set of instructions was like leaping straight from the telegraph to the Internet era.

It was such a daunting task that most of the leading auditory neurophysiologists in the 1960s and 1970s, when the idea was first explored in the United States, were convinced cochlear implants would never work. It took decades of work by teams of determined (even stubborn) researchers in the United States, Australia and Europe to solve the considerable engineering problems involved as well as the thorniest challenge: designing a processing program that worked well enough to allow users to discriminate speech. When they finally succeeded on that front, the difference was plain from the start.

“There are only a few times in a career in science when you get goose bumps,” Michael Dorman, a cochlear implant researcher at Arizona State University, once wrote. That’s what happened to him when, as part of a clinical trial, his patient Max Kennedy tried out the new program, which alternated electrodes and sent signals at a relatively high rate. Kennedy was being run through the usual set of word and sentence recognition tests. “Max’s responses [kept] coming up correct,” remembered Dorman. “Near the end of the test, everyone in the room was staring at the monitor, wondering if Max was going to get 100 percent correct on a difficult test of consonant identification. He came close, and at the end of the test, Max sat back, slapped the table in front of him, and said loudly, ‘Hot damn, I want to take this one home with me.’”

A Cure or a Genocide?

So did I. The device sounded momentous and amazing to me — a common reaction for a hearing person. As Steve Parton, the father of one of the first children to receive an implant, once put it, the fact that technology had been invented that could help the deaf hear seemed “a miracle of biblical proportions.”

Many in Deaf culture didn’t agree. As I began to investigate what a cochlear implant would mean for Alex, I spent a lot of time searching the Internet, and reading books and articles. I was disturbed by the depth of the divide I perceived in the deaf and hard of hearing community. There seemed to be a long history of disagreement over spoken versus visual language, and between those who saw deafness as a medical condition and those who saw it as an identity. The harshest words and the bitterest battles had come in the 1990s with the advent of the cochlear implant.

By the time I was thinking about this, in 2005, children had been receiving cochlear implants in the United States for 15 years. Though the worst of the enmity had died down, I felt as if I’d entered a city under ceasefire, where the inhabitants had put down their weapons but the unease was still palpable. A few years earlier, the National Association of the Deaf, for instance, had adjusted its official position on cochlear implants to very qualified support of the device as one choice among many. It wasn’t hard, however, to find the earlier version, in which they “deplored” the decision of hearing parents to implant their children. In other reports about the controversy, I found cochlear implantation of children described as “child abuse.”

No doubt those quotes had made it into the press coverage precisely because they were extreme and, therefore, attention-getting. But child abuse?! I just wanted to help my son. What charged waters were we wading into?

Cochlear implants arrived in the world just as the Deaf Civil Rights movement was flourishing. Like many minorities, the deaf had long found comfort in each other. They knew they had a “way of doing things” and that there was what they called a “deaf world.” Largely invisible to hearing people, it was a place where many average deaf people lived contented, fulfilling lives. No one had ever tried to name that world.

Beginning in the 1980s, however, deaf people, particularly in academia and the arts, “became more self-conscious, more deliberate, and more animated, in order to take their place on a larger, more public stage,” wrote Carol Padden and Tom Humphries, professors of communication at the University of California, San Diego, who are both deaf. They called that world Deaf culture in their influential 1988 book Deaf in America: Voices from a Culture. The capital “D” distinguished those who were culturally deaf from those who were audiologically deaf. “The traditional way of writing about Deaf people is to focus on the fact of their condition — that they do not hear — and to interpret all other aspects of their lives as consequences of this fact,” Padden and Humphries wrote. “Our goal . . . is to write about Deaf people in a new and different way. . . . Thinking about the linguistic richness uncovered in [work on sign language] has made us realize that the language has developed through the generations as part of an equally rich cultural heritage. It is this heritage — the culture of Deaf people — that we want to begin to portray.”

In this new way of thinking, deafness was not a disability but a difference. With new pride and confidence, and new respect for their own language, American Sign Language, the deaf community began to make itself heard. At Gallaudet University in 1988, students rose up to protest the appointment of a hearing president — and won. In 1990, the Americans with Disabilities Act ushered in new accommodations that made operating in the hearing world far easier. And technological revolutions like the spread of computers and the use of e-mail meant that a deaf person who once might have had to drive an hour to deliver a message to a friend in person (not knowing before setting out if the friend was even home), could now send that message in seconds from a keyboard.

In 1994, Greg Hlibok, one of the student leaders of the Gallaudet protests a few years earlier, declared in a speech: “From the time God made earth until today, this is probably the best time to be Deaf.”

Into the turbulence of nascent deaf civil rights dropped the cochlear implant.

A child with an early cochlear implant on Aug. 24, 1984. Glen Martin—Denver Post/Getty Images

The Food and Drug Administration’s 1990 decision to approve cochlear implants for children as young as two galvanized Deaf culture advocates. They saw the prostheses as just another in a long line of medical fixes for deafness. None of the previous ideas had worked, and it wasn’t hard to find doctors and scientists who maintained that this wouldn’t work either — at least not well. Beyond the complaint that the potential benefits of implants were dubious and unproven, the Deaf community objected to the very premise that deaf people needed to be fixed at all. “I was upset,” Ted Supalla, a linguist who studies ASL at Georgetown University Medical Center, told me. “I never saw myself as deficient ever. The medical community was not able to see that we could possibly see ourselves as perfectly fine and normal just living our lives. To go so far as to put something technical in our brains, at the beginning, was a serious affront.”

The Deaf view was that late-deafened adults were old enough to understand their choice, had not grown up in Deaf culture, and already had spoken language. Young children who had been born deaf were different. The assumption was that cochlear implants would remove children from the Deaf world, thereby threatening the survival of that world. That led to complaints about “genocide” and the eradication of a minority group. The Deaf community felt ignored by the medical and scientific supporters of cochlear implants; many believed deaf children should have the opportunity to make the choice for themselves once they were old enough; still others felt the implant should be outlawed entirely. Tellingly, the ASL sign developed for “cochlear implant” was two fingers stabbed into the neck, vampire-style.

The medical community agreed that the stakes were different for children. “For kids, of course, what really counts is their language development,” says Richard Dowell, who today heads the University of Melbourne’s Department of Audiology and Speech Pathology but in the 1970s was part of an Australian team led by Graeme Clark that played a critical role in developing the modern-day cochlear implant. “You’re trying to give them good enough hearing to actually then use that to assist their language development as close to normal as possible. So the emphasis changes very, very much when you’re talking about kids.”

Implanted and improving

By the time Alex was born, children were succeeding in developing language with cochlear implants in ever greater numbers. The devices didn’t work perfectly and they didn’t work for everyone, but the benefits could be profound. The access to sound afforded by cochlear implants could serve as a gateway to communication, to spoken language and then to literacy. For hearing children, the ability to break the sound of speech into its component parts — a skill known as phonological awareness — is the foundation for learning to read.

We wanted to give Alex a chance to use sound. In December 2005, four months before he turned three, he received a cochlear implant in his right ear and we dug into the hard work of practicing speaking and listening.

One year later, it was time to measure his progress. We went through the now familiar barrage of tests: flip charts of pictures to check his vocabulary (“point to the horse”), games in which Alex had to follow instructions (“put the purple arms on Mr. Potato Head”), exercises in which he had to repeat sentences or describe pictures. The speech pathologist would assess his understanding, his intelligibility, his general language development.

To avoid prolonging the suspense, the therapist who did the testing calculated his scores for me before we left the office and scribbled them on a yellow Post-It note. First, she wrote the raw scores, which didn’t mean anything to me. Underneath, she put the percentiles: where Alex fell compared to his same-aged peers. These were the scores that had been so stubbornly dismal the year before when Alex seemed stuck in single-digit percentiles.

Now, after 12 months of using the cochlear implant, the change was almost unbelievable. His expressive language had risen to the 63rd percentile and his receptive language to the 88th percentile. He was actually above age level on some measures. And that was compared to hearing children.

I stared at the Post-It note and then at the therapist.

“Oh my god!” was all I could say. I picked Alex up and hugged him tight.

“You did it,” I said.

Listening to Each Other

I was thrilled with his progress and with the cochlear implant. But I still wanted to reconcile my view of this technology with that of Deaf culture. Since those nights early on when I was trolling the Internet for information on hearing loss, Gallaudet University in Washington, D.C., had loomed large as the center of Deaf culture, with what I presumed would be a correspondingly large number of cochlear implant haters. By the time I visited the campus in 2012, I no longer imagined I would be turned back at the front gates, but just the year before a survey had shown that only one-third of the student body believed hearing parents should be permitted to choose cochlear implants for their deaf children.

“About fifteen years ago, during a panel discussion on cochlear implants, I raised this idea that in ten to fifteen years, Gallaudet is going to look different,” says Stephen Weiner, the university’s provost. “There was a lot of resistance. Now, especially the new generation, they don’t care anymore.” ASL is still the language of campus and presumably always will be, but Gallaudet does look different. The number of students with cochlear implants stands at 10 percent of undergraduates and 7 percent overall. In addition to more cochlear implants, there are more hearing students, mostly enrolled in graduate programs for interpreting and audiology.

“I want deaf students here to see everyone as their peers, whether they have a cochlear implant or are hard of hearing, can talk or can’t talk. I have friends who are oral. I have one rule: We’re not going to try to convert one another. We’re going to work together to improve the life of our people. The word ‘our’ is important. That’s what this place will be and must be. Otherwise, why bother?” Not everyone agrees with him, but Weiner enjoys the diversity of opinions.

At the end of our visit, he hopped up to shake my hand.

“I really want to thank you again for taking time to meet with me and making me feel so welcome,” I said.

“There are people here who were nervous about me talking to you,” he admitted. “I think it’s important to talk.”

So I made a confession of my own. “I was nervous about coming to Gallaudet as the parent of a child with a cochlear implant,” I said. “I didn’t know how I’d be treated.”

He smiled, reached up above his right ear, and flipped the coil of a cochlear implant off his head. I hadn’t realized it was there, hidden in his brown hair. Our entire conversation had been through an interpreter. He seemed pleased that he had managed to surprise me.

“I was one of the first culturally Deaf people to get one.”

Perhaps it’s not surprising that most of the people who talked to me at Gallaudet turned out to have a relatively favorable view of cochlear implants. When I met Irene Leigh, she was about to retire as chair of the psychology department after more than 20 years there. She doesn’t have an implant, but is among the Gallaudet professors who have devoted the most time to thinking about them.

She and sociology professor John Christiansen teamed up in the late 1990s to (gingerly) write a book about parent perspectives on cochlear implants for children; it was published in 2002. At that time, she says, “A good number of the parents labeled the Deaf community as being misinformed about the merits of cochlear implants and not understanding or respecting the parents’ perspective.” For their part, the Deaf community at Gallaudet was beginning to get used to the idea by then, but true supporters were few and far between.

In 2011, Leigh served as an editor with Raylene Paludneviciene of a follow-up book examining how perspectives had evolved. Culturally Deaf adults who had received implants were no longer viewed as automatic traitors, they wrote. Opposition to pediatric implants was “gradually giving way to a more nuanced view.” The new emphasis on bilingualism and biculturalism, says Leigh, is not so much a change as a continuing fight for validation. The goal of most in the community is to establish a path that allows implant users to still enjoy a Deaf identity. Leigh echoes the inclusive view of Steve Weiner when she says, “There are many ways of being deaf.”

Ted Supalla, the ASL scholar who was so upset by cochlear implants, had deaf parents and deaf brothers, a background that makes him “deaf of deaf” and accords him elite status in Deaf culture. Yet when we met, he had recently left the University of Rochester after many years there to move to Washington, D.C., with his wife, the neuroscientist Elissa Newport. They were setting up a new lab not at Gallaudet but at Georgetown University Medical Center. Waving his hand out the window at the hospital buildings, Supalla acknowledged the unexpectedness of his new surroundings. “It’s odd that I find myself working in a medical community . . . It’s a real indication that times are different now.”

‘Deaf like me’

Alex will never experience deafness in quite the same way Ted Supalla does. And neither do the many deaf adults and children — some 320,000 of them worldwide — who have embraced cochlear implants gratefully.

But they are all still deaf. Alex operated more and more fluently in the hearing world as he got older, yet when he took off his processor and hearing aid, he could no longer hear me unless I spoke loudly within inches of his left ear.

I never wanted us not to be able to communicate. Even if Alex might never need ASL, he might like to know it. And he might someday feel a need to know more deaf people. In the beginning, we had said that Alex would learn ASL, as a second language. And we’d meant it — in a vague, well-intentioned way. Though I used a handful of signs with him in the first few months, those had fallen away once he started to talk. I regretted letting sign language lapse. The year Alex was in kindergarten, an ASL tutor named Roni began coming to the house. She, too, was deaf and communicated only in ASL.

Through no fault of Roni’s, those lessons didn’t go so well. It was striking just how difficult it was for my three boys, who were then five, seven and 10, to pay visual attention, to adjust to the way of interacting that was required in order to sign. (Rule number one is to make eye contact.) Even Alex behaved like a thoroughly hearing child. It didn’t help that our lessons were at seven o’clock at night and the boys were tired. I spent more time each session reining them in than learning to sign. The low point came one night when Alex persisted in hanging upside down and backward off an armchair.

“I can see her,” he insisted.

And yet he was curious about the language. I could tell from the way he played with it between lessons. He decided to create his own version, which seemed to consist of opposite signs: YES was NO and so forth. After trying and failing to steer him right, I concluded that maybe experimenting with signs was a step in the right direction.

Even though we didn’t get all that far that spring, there were other benefits. At the last session, after I had resolved that one big group lesson in the evening was not the way to go, Alex did his usual clowning around and refusing to pay attention. But when it was time for Roni to leave, he gave her a powerful hug that surprised all of us.

“She’s deaf like me,” he announced.

Lydia Denworth is the author of I Can Hear You Whisper: An Intimate Journey through the Science of Sound and Language (Dutton), from which this piece is adapted.

TIME medicine

‘Are Your Children Vaccinated?’ Is the New ‘Do You Have a Gun in the House?’

Summer Yukata—Getty Images/Flickr RF

Most of your parenting choices don't affect me. Having a loaded weapon in your house does. The same is true when you don't immunize your children.

I try not to judge other parents. If you want your whole family to sleep together in one giant bed, it is none of my concern. If you feel like breastfeeding your kid until he’s in junior high school, go for it. If you don’t want to or can’t breastfeed, hey, formula is good too. To binky or not to binky? Maybe that is the question in your house, but I am positive you will make the right decision. Either way, I couldn’t care less. Most of your parenting choices don’t affect me or my children. Having a loaded weapon in your house does. It has the potential to do serious harm to, and possibly kill, my child. The same is true when you decide not to immunize your children against preventable infectious diseases.

My kids are five and two. They have gone through most of their early childhood vaccinations. With all the coverage in the news lately about the return of the measles and the mumps (seriously, mumps is a thing again?), I called the pediatrician to confirm that their immunizations were up to date. I found out that I had somehow missed my two year old’s second MMR vaccination. Just in case you don’t know, those two “Ms” stand for measles and mumps! Crud… I was an accidental anti-vaxxer! It was an oversight that I quickly remedied. That was a close one! What if my little dude had come in contact with one of the unvaccinated!? Chances are, nothing. But maybe, something. And if it was something, that thing could have been catastrophic.

I’ve been wondering lately if I have any friends who are anti-vaxxers. Some of the dads in my playdate group are kind of out there: musicians, actors, and such. One is a big conspiracy theory guy. Another is active in the Occupy movement. Who knows what kind of wacky stuff they’re up to? Maybe they hopped aboard the trendy not-getting-your-kids-immunized train. I brought it up with a couple of them. Luckily, no true nut jobs. (Well, about this issue anyway. They’re an odd bunch, but in the best ways.)

There is one dad who is not fully on board with vaccines, deeming some of them unnecessary. He felt that the reason a lot of vaccines are required by schools is because the state has a financial interest in…I don’t know…their sale and distribution or something. It was the conspiracy guy, and I had kind of a hard time following his logic. He also does not agree with the recommended vaccination schedule, asserting that getting too many at a time weakens a child’s immune system. (A reasonable-sounding concern, some might think, though there is absolutely no evidence supporting it.) But, even if somewhat grudgingly, he vaccinates his daughter. Whew! We can still hang out; our children can still be friends.

I’m sort of joking…but the truth is, I’m not sure what I would do if I found out that one of my playgroup buddies was an anti-vaxxer. I really like those dudes! And most of the kids have known each other so long, they view each other as second cousins.

At this point — especially since I rectified my earlier negligence — my children are out of the danger zone. Not all vaccines are 100% effective, but I feel relatively safe. Yet, I remain rankled by the anti-vaxxers. There is still a chance that my children could be among the unlucky few for whom vaccines don’t take. Though the risk to my children is small, there are other children who are too young for certain vaccines. Anti-vaxxers are unnecessarily putting those kids in harm’s way (not to mention the potential danger to their own offspring). They are, in fact, banking on others getting vaccinated to protect their own children from the spread of disease. It just seems so selfish. Of course, they believe that they are doing what is best for their kids and are likely discounting the exposure of other children.

I understand that injecting something into your child that you do not fully comprehend is scary. Most parents are not scientists or doctors. I’m certainly not. I also understand that nothing I say is going to convince anti-vaxxers that vaccinations are safe; their minds are already made up. Other people, who are much smarter than I am, have made a pretty compelling case for the efficacy of immunizations. Yet the anti-vaxxer movement seems to be on the rise. If you are on the fence, I ask only that you don’t just do your “research” on anti-vaxxer websites. That is not really research; it’s confirmation.

Not vaccinating your children is that odd family decision that has potential real life consequences outside your home. It should come with a certain set of responsibilities. If you have a gun in your house, you are expected to safely secure it. If you have decided not to immunize your children, it is incumbent on you to make sure other children are not exposed to an unnecessary threat of infectious disease. It may seem harsh to equate an innocent child with a loaded weapon, but if that child comes into contact with a virus he is not immunized against, the metaphor is apt. Most of the time, because of herd immunity, unvaccinated children are not exposed to these diseases. They are, therefore, harmless: unloaded and secured. As we have seen with recent outbreaks, however, the safety of the herd does not hold up when too many people opt out.

If you are worried about anti-vaxxers in your playgroup, you need to find out for yourself and not wait for other parents to bring it up. It is not a topic you should debate (trust me, you will not persuade your anti-vaxxer friend to immunize her child), but it is important to have the information. If there are unimmunized children in the group, consult your pediatrician about what increased risks there may be to your child. Then, you can make an informed decision about what is best for you and your family.

Lesser blogs at Amateur Idiot/Professional Dad. You can follow him on Facebook and on Twitter (@amateuridiot).

TIME medicine

Study: Children Given Codeine in ER Despite Risks

Too many kids are getting codeine in emergency rooms, say the authors of a new study, which estimates that at least half-a-million children receive prescriptions each year

The painkiller codeine is prescribed to kids in at least half-a-million emergency room visits, a new study suggests, despite recommendations in place to limit its use among children.

Only 3% of children’s ER trips in 2010 resulted in a codeine prescription, but with kids making 25 million ER visits each year, authors of the study say too many children are getting the opiate, the Associated Press reports.

The study, published Monday in Pediatrics, analyzed national data from 2000 to 2010 on emergency room visits by children between the ages of 3 and 17. The study’s authors say the annual number of visits that led to codeine prescriptions ranged from approximately 560,000 to 877,000, though the frequency of codeine treatment slightly declined during the study.

A pediatric drug expert told the AP that codeine use has likely declined further since the study ended, following last year’s strict warning from the Food and Drug Administration about the drug’s risks and possible complications.

[AP]

TIME medicine

Cleveland Clinic’s New Medicine

At one Ohio hospital, patients get herbs as well as drugs

Lora Basch, 59, sometimes suffers from poor sleep and anxiety. She’s uncomfortable with the side effects of drugs, so she’s tried acupuncture and magnesium supplements, but with only minimal success. After years of low energy, she went a different route altogether: gui pi tang, a mix of licorice root, ginseng and ginger meant to rejuvenate the body. Three months later, the Cleveland native is finally falling asleep at night, and she has more energy during the day. “The remedy is a huge relief,” she says. “I have a more stable life.”

Though herbal therapy has been practiced in China for centuries, it is still an afterthought in the U.S., in part because pharmaceutical remedies are usually easier to obtain. Now that’s beginning to change: in January, the Cleveland Clinic opened a Chinese herbal-therapy ward. In the past three months, therapists at the clinic have seen patients suffering from chronic pain, fatigue, poor digestion, infertility and, in the case of Basch, sleep disorders. “Western medicine may not have all the answers,” says Daniel Neides, the clinic’s medical director.

A certified herbalist runs the unit under the supervision of multiple Western-trained M.D.s. Patients must be referred to the clinic by their physician, who in accordance with Ohio law must oversee their treatment for at least a year. Executives at Cleveland say the clinic is the first of its kind to be affiliated with a Western hospital. “We’re incorporating ancient knowledge into patient care,” says in-house herbalist Galina Roofener.

Cleveland is starting modestly: its clinic is a single room with bright pillows, a tapestry, candles and a cot reserved for procedures like acupuncture. The center doesn’t take walk-ins and primarily sees patients with conditions that Western medicine has, for whatever reason, failed to remedy. “For something like acute pneumonia, Western antibiotics may be faster and more cost-effective,” says Roofener. “But if someone has antibiotic resistance, we can strengthen their immune system.”

All herbal formulas at the clinic are encapsulated for easy consumption. (By contrast, in China, patients are usually sent home with raw herbs to brew themselves.) Because the FDA doesn’t regulate herbs and supplements as strictly as drugs, finding pharmacies that could both supply them and still meet hospital safety standards was a top priority. After a lengthy search, the clinic tapped a Kaiser Pharmaceutical subsidiary out of Taiwan as well as compounding pharmacies in Massachusetts and California that specialize in custom Chinese-herb blends.

The primary uncertainty in herbal medicine is the prospect of an unpleasant or dangerous herb-drug interaction, which is why the clinic requires herbalists and physicians to have joint access to patients’ electronic medical records. To become an herbal therapist requires three to four years of master’s-degree-level education in Chinese medicine and a series of certification exams in Oriental medicine, herbology and biomedicine.

As it happened, I was battling a cold when I visited the clinic, so I signed up for the $100 consultation. Roofener spent 30 minutes reviewing my medical history, sleep routine, diet and even my spirituality–I was asked about what I practice and whether I meditate. She took my pulse Chinese-style: holding my wrists, she measured what she said were the multiple “pulses” of my organ systems. “Did you eat breakfast?” she asked. “The pulse on your stomach position is very weak.” I had eaten half a slice of toast.

I left the clinic with my own herbal remedy: 80 capsules of a diverse mixture of ingredients ranging from Lonicera flower to mint leaf, with instructions to take two pills four times a day for 10 days. Though an over-the-counter drug usually does the trick for me, my symptoms cleared up on the herbs alone. Now if only I could find an herb to make me taller.

TIME The Weekend Read

Parent Like a Mad Scientist

Taking them to my alma mater, U.C. Berkeley, whereupon Yo announced, “Dad, there’s no way I am going here. My days of attending the schools you went to are over.”
Me with my daughter, E, and my son, Yo. I gave the kids unique names based on research showing that this might endow them with superior impulse control. Stephen P. Hudner

Give your kids weird names, expose them to raw sewage, and still be the world’s best dad

As an immigrant society with no common culture, we Americans have always been blessed with the ability to make things up as we go, be it baseball, jazz, the Internet … even Mormonism. Yet, when it comes to parenting, we’ve become obsessed with finding the one best way — whether it’s learning to raise our kids like the Chinese, the French, Finns, or whatever other group is in fashion today. It’s time to stop. No one culture has parenting down pat; there’s no one best model that we can look to for all the answers. And that’s a good thing. Parenting should be an adventure. And more importantly, if we want to keep America’s culture young and prosperous and innovative, parenting should be an experiment.

Yo engaged by something his mother is demonstrating to him; E mad for some reason. Courtesy Dalton Conley

I should know. I’m a bit of a mad-scientist parent myself — just ask my kids, E and Yo.

As a dual-doctorate professor of sociology and medicine at New York University, I gave my kids “unique” names based on research about impulse control. I exposed them to raw sewage (just a little!) and monkeys (O.K., just one!) to build up their immune systems based on the latest research on allergies and T-cell response. I bribed them to do math inspired by a 2005 University of Pennsylvania study of Mexican villagers that demonstrated the effectiveness of monetary incentives for schooling outcomes. And don’t think my offspring were the only ones bearing the brunt of all this trial and error: I got myself a vasectomy based on research showing that fewer kids may mean smarter kids.

There’s a method to my madness (namely, the scientific method). Parentology — as I call this approach to raising kids — involves three skills: first, knowing how to read a scientific study; second, experimenting on your kids by deploying such research; and third, involving your kids in the process, both by talking to them about the results and by revising your hypotheses when necessary.

Kids raised this way won’t necessarily end up with 4.0 GPAs, but they almost certainly will become inquisitive, creative seekers of truth.

Dalton Conley's kids, Yo, left, and E, right.
Often we are asked if E (right) and Yo (left) are twins; they are not. Despite knowing that narrow birth spacing may be disadvantageous, we popped our kids out a mere 18 months apart. Courtesy Dalton Conley

“Parentology” in Practice

I put my approach into practice more or less immediately upon becoming a father, throwing out my copy of Dr. Spock and instead conducting a series of experiments on my two young children, now 16 and 14. No, I didn’t raise one in the woods with wolves and the other in a box. But I did give my children weird names — E (my daughter) and Yo (my son, full name: Yo Xing Heyno Augustus Eisner Alexander Weiser Knuckles) — to teach them impulse control. Evidence shows that kids with unusual names learn not to react when their peers tease them (at least in elementary school). What’s more, a 1977 analysis of Who’s Who by psychologist Richard Zweigenhaft found that unusual names were overrepresented, even after factoring out the effect of social class and background.

Meanwhile, after exploring the literature on verbal development, I decided not to teach my kids to read but instead to read aloud to them constantly. It turns out that exposure to novel words, complex sentences and sustained narratives is what predicts verbal ability later on, not whether a 4-year-old can decode words on a page. The best predictor of later verbal skills is the number of total and unique words a child hears before kindergarten. Psychologists Betty Hart and Todd Risley observed how poor and middle-class parents interact with their toddlers. They estimated that the middle-class kids heard an average of 45 million words over a four-year period, while the poor children heard a mere 13 million. This difference, in turn, explained later achievement gaps. Unable to mimic Robin Williams and babble away, I decided the next best thing was to read to my kids at every opportunity. So while E and Yo were both behind their peers in reading in first grade, by fourth grade they had the best scores in their respective classes.

Dalton Conley reads to his son, Yo.
Reading is fundamental. Though I never taught them to decode words on the page, I was a human Kindle before there were such things. I never stopped reading to them. Courtesy Dalton Conley

Of course, not all my experiments have been successful. (If they were all successful, they’d hardly be experiments.) When my son was 11, his school wanted to medicate him for what administrators suspected was ADHD. I thought there might be a way around it. Scientific studies reviewed by University of California, San Diego, professor Andrew Lakoff in 2002 show that psychopharmacological placebo effects are almost as big as those of the actual drugs. And even student-teacher interaction is not immune to such Pygmalion-like dynamics. In one classic 1968 study, researchers Robert Rosenthal and Lenore Jacobson lied to teachers, telling them that they had identified a new test that could pick out genius kids with remarkable accuracy. Then they randomly picked certain pupils and informed the teachers that these particular kids had aced the test. Lo and behold, when the scientists returned a year later, the kids who had received the “teacher placebo” treatment had gained 15 points in actual IQ relative to the control-sample kids.

Worried about sleep apnea and its potential role in causing ADHD, we took Yo in for an evaluation. My attempt to cure ADHD with a placebo came to naught. Courtesy Dalton Conley

With this research in mind and fearful of the risks of actual medication, I lied to the school, his sister and my son himself, telling them all that I was giving him a powerful stimulant (when it was actually a vitamin), hoping that if they all thought he was calmer and more attentive, they would treat him as such and his behavior would improve. While his teachers noted an improvement in his concentration and behavior for a few weeks after I started my placebo protocol, he backslid — prompting calls from the school about his inappropriate behavior — and was ultimately given a formal ADHD diagnosis. The real stimulants worked. However, I did decide to experiment with giving him the drugs only during the school week (to guard against long-term effects and possible habituation), which has been successful so far.

Customizing to the Kid

As you can see, while knowing how to read the existing science is important, even more critical is being able to properly experiment on your own young. What works for one kid (or one population of kids) may not work for all, and your family may require customization in order to make a technique work or just to be comfortable with what you’re doing.

Even when there is a clear scientific consensus on, say, the importance of breast-feeding, we don’t often know the distribution of those effects. If a particular intervention — say, paying a child to do a half hour of math a day, like I did — is shown in a randomized, controlled trial to raise math scores by 20%, that could mean that all the kids in the bribery group saw their scores jump by a fifth. Or it could mean that for 80% of the kids, the bribes did not make a whit of difference, but for 20% it doubled their scores. This is what researchers call heterogeneous treatment effects.
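
That distinction is easy to see with made-up numbers. The following sketch (a toy illustration with invented figures, not data from any actual study) shows two populations that report the identical 20% average lift while individual kids fare very differently:

```python
# Toy illustration (invented numbers): two groups with the same average
# treatment effect but very different distributions of individual effects.
n = 100
baseline = 50  # hypothetical math score for every child before the bribes

# Scenario A: every child's score rises by 20%.
uniform = [baseline * 1.20 for _ in range(n)]

# Scenario B: 80 of the 100 children see no change; 20 double their scores.
mixed = [baseline * 2 if i < 20 else baseline for i in range(n)]

avg_lift_a = sum(uniform) / n / baseline - 1
avg_lift_b = sum(mixed) / n / baseline - 1
print(round(avg_lift_a, 2), round(avg_lift_b, 2))  # both print as 0.2
```

A published trial typically reports only the average, which is why the same headline result can mask very different realities for any individual kid.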

Some kids are car-truck-train kids; others are animal kids. Guess which ours are. Courtesy Dalton Conley

Other times, results vary across studies and methods. One 2005 study of Mexican families found that cash rewards that were conditional on school attendance were hugely effective in improving child outcomes such as health and educational attainment. But a 2009 effort to replicate this in New York City showed only minor educational benefits. And a third study, published in the Review of Economics and Statistics in 2010, which focused on elementary-school students in Coshocton, Ohio, found that it worked to pay the students themselves (as opposed to their families) based on how well they did on outputs (i.e., test scores). But the largest U.S. study of all — conducted in 2011 by Harvard economist Roland Fryer in Chicago, Dallas, New York City and Washington — found that when rewards were focused on outcomes like passing tests, they failed to produce meaningful improvements. In that study, however, rewards based on performing input tasks like reading a book or being on time to class did work. (Even then, results were not consistent across cities, age groups or race.)

In short, even when there’s research on a topic, you can’t be sure how it will apply to your own kids — so it’s necessary to embrace experimentation. While I may never know what explains why some studies found big gains from bribery and others failed to, I was able to bribe both my kids to do extra math. I simply adjusted the rewards to fit the kid (something that would be impossible for researchers to do in a big study). As a parent, I could play on my son’s love of video games to offer a minute-for-minute swap of online math problems in exchange for World of Warcraft time. For my daughter, the enticement was gummy bears.

I did worry that by providing external motivation in the form of bribery, I might erode their internal motivation for mathematics, as some psychology research has suggested can happen. But that was a risk I was willing to take because — unlike with reading, for instance — they weren’t exactly clamoring for math problems. Here was a case of customizing the existing research to one’s own children. I may or may not have eroded their internal motivation to do math (and I doubt either will end up a professional mathematician), but at least they passed the big tests they needed to in order to get into high school.

How to Know What Matters

Lots of folks think being a scientist is knowing a bunch of esoteric facts that fit together, like how the Earth’s tilt causes the seasons or what mitochondria do or how, exactly, light can be both a wave and a particle. But the scientific method is what’s most important — especially when it comes to parenting. Particularly important, especially for middle- and upper-class parents, is knowing how to read a study and sift out causal relationships from the chaff of mere correlations. (It turns out a lot of great outcomes are correlated with being born in good economic circumstances to well-educated parents, but you want to figure out how to cause better outcomes.)

For instance, take my educational choices for my kids — or, more accurately, their choices. That is, after all the extra math prep I bribed them to do to get them into Stuyvesant (the prestigious New York City high school that students must test into), I allowed them to decide if they actually wanted to go or not.

This may seem, at first blush, to be more like 1970s-style laissez-faire parenting. But actually I was following the latest cutting-edge research in ceding educational choice to my kids. Two studies by economists Stacy Dale and Alan Krueger in 2002 and 2011 showed that if you are white and middle class (which we are), it does not make a difference where you go to college. While it is true that graduates of more-selective institutions fare better in terms of income and wealth later on, compared with graduates of less selective schools, it turns out that this is an artifact of what we scientists call selection bias. It’s not that Harvard is adding so much value to your education as compared with the University of Nebraska — it’s that Harvard admissions is good at picking winners.

This research was about college, but my intuition that it also applied to high school was confirmed when MIT economist Joshua Angrist obtained the data from the selective exam-admission schools in Boston and New York City. He examined the data for what we call regression discontinuities. The logic is the following: if the cutoff to get into Stuyvesant is, say, 560 in a given year, then it is really pretty random whether an individual scores 559 or 560. It could be the difference of a good breakfast or a single vocabulary word that was in one kid’s stack of flash cards by chance. In other words, it probably does not reflect a major difference in innate ability. But the consequences of that point difference determine which school the kid ends up attending. By comparing two groups — the one just above and the one just below the line — we can see how big the “treatment effect” of attending the “better” school is. And it turns out not to matter at all, in either Boston or New York.
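
The logic of that comparison can be sketched in a few lines of simulated data (a toy model with invented numbers, not Angrist’s actual data):

```python
import random

random.seed(42)
EFFECT = 0.0  # treatment effect of the selective school; zero, per the studies

def later_outcome(admitted):
    # Within a point of the cutoff, underlying ability is effectively the
    # same on both sides; only admission (and hence the school) differs.
    ability = random.gauss(100, 5)
    return ability + (EFFECT if admitted else 0.0)

just_below = [later_outcome(False) for _ in range(10_000)]  # scored 559
just_above = [later_outcome(True) for _ in range(10_000)]   # scored 560

gap = sum(just_above) / len(just_above) - sum(just_below) / len(just_below)
# With EFFECT = 0, the gap hovers near zero: attending the "better" school
# adds nothing beyond the ability the cutoff already selected for.
```

Raising EFFECT above zero would widen the measured gap accordingly; the empirical finding was that, in both Boston and New York, the gap was statistically indistinguishable from zero.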

So, though both my kids gained admission to the most prestigious math and science high school in the country, I let them choose whether they went there or not. I figured, with no overall treatment effect, why not let them go where they sensed they would feel the most comfortable? They knew what environment was best for them. My daughter turned down her offer of admission, while my son decided to go. I, meanwhile, am taking notes to see how this next phase of the experiment turns out. (She is a sophomore and he is a freshman.) Meanwhile, to assuage my own anxieties, I just keep reminding myself just how unimportant going to Harvard really is.

One of the many cross-species interactions that take place in our home. No underworked immune systems here. Courtesy Dalton Conley

The Path to Enlightening Kids

One of the few fake animals in our house. Courtesy Dalton Conley

Finally, perhaps the most important part of parentology is to involve the kids themselves. Whether that means discussing the research about standing desks and their role in preventing obesity, giving them an opportunity to help design the experiment or debriefing them about its results (like when I confessed to my son that I had been giving him a placebo and not the real ADHD medication), the teachable moment is, actually, the most valuable part of the entire experiment.

Having a kid who knows how to separate out causation from mere correlation is more important than having one who can memorize a list of amino acids or Egyptian pharaohs. This is the real goal of experimental parenting: indoctrinating one’s kids into the Enlightenment way of thinking. Helping them learn to question — not authority necessarily: this isn’t 1960s hippie-dippie parenting, after all — but knowledge itself.

So, where tradition fails us (after all, what does the Bible have to say about kids and cell phones?), we can and should resort to the scientific method. Hypothesis formation, trial, error and revision. That is, we should experiment on our own kids.

Worried that screens may be disrupting your teen’s sleep? Do a controlled study in which you take the iPad away at night for two weeks and chart what happens. Want to encourage better study habits? Set up a marketplace for grades or effort and fine-tune the rewards and punishments in real time. Want to exercise the self-discipline muscles of your kids’ brains? Make them wear a mitten on their dominant hand for a couple of hours a day. Want to boost their performance before a big test? Prime them with positive stereotypes about their ethnic and gender identities. Today it is easier than ever — with Google Scholar and the like — to immerse oneself in the most cutting-edge research and apply it to one’s kids.

As with patient-driven medicine, in which informed patients advocate to their doctors rather than just passively receiving information, I predict that American parents and their children will increasingly shun authorities — even good old Dr. Spock — and instead interpret and generate the scientific evidence for themselves.

Rather than a rigid formula of 10,000 hours of violin practice or a focus on a single socially sanctioned pathway to success, American parents should pursue an insurgency strategy: more flexibility and fluidity; attention to often counterintuitive, myth-busting research; and adaptation to each child’s unique and changing circumstances.

E working on her novel. Perhaps my reading-out-loud experiments worked. Courtesy Dalton Conley

If you approach your rug rats this way, by turning them into lab rats, I can’t guarantee they will get into Columbia. But I can predict with statistical confidence that they will be creative, fulfilled members of society and that you will have a lot more fun raising them along the way.

Dalton Conley is a professor of sociology and medicine at New York University and author of Parentology: Everything You Wanted to Know About the Science of Raising Children but Were Too Exhausted to Ask.

Parentology Quiz

TIME

Wikipedia Founder Sticks It To ‘Lunatic’ Holistic Healers

Wikipedia founder Jimmy Wales gives a lecture in Hanover, Germany, March 14, 2014.
Wikipedia founder Jimmy Wales gives a lecture in Hanover, Germany, March 14, 2014. Christoph Schmidt—Zumapress

Jimmy Wales rejected a Change.org petition calling for more information on holistic medicinal therapies. "Every single person who signed this petition needs to...think harder about what it means to be honest, factual, truthful," he said

Wikipedia founder Jimmy Wales responded definitively to a Change.org petition from holistic healing supporters to “allow for true scientific discourse” on the online encyclopedia.

The petitioners say the representation of holistic healing on Wikipedia is biased, and they have not been allowed to amend the information. The petition, which has over 7,790 supporters, states:

“Wikipedia is widely used and trusted. Unfortunately, much of the information related to holistic approaches to healing is biased, misleading, out-of-date, or just plain wrong. For five years, repeated efforts to correct this misinformation have been blocked and the Wikipedia organization has not addressed these issues. As a result, people who are interested in the benefits of Energy Medicine, Energy Psychology, and specific approaches such as the Emotional Freedom Techniques, Thought Field Therapy and the Tapas Acupressure Technique, turn to your pages, trust what they read, and do not pursue getting help from these approaches which research has, in fact, proven to be of great benefit to many.”

Wales responded to the petition on Sunday and was unapologetic about the way holistic medicine is covered on Wikipedia, saying the site will only publish claims backed by scientific evidence. He wrote:

No, you have to be kidding me. Every single person who signed this petition needs to go back to check their premises and think harder about what it means to be honest, factual, truthful.

Wikipedia’s policies around this kind of thing are exactly spot-on and correct. If you can get your work published in respectable scientific journals – that is to say, if you can produce evidence through replicable scientific experiments, then Wikipedia will cover it appropriately.

What we won’t do is pretend that the work of lunatic charlatans is the equivalent of “true scientific discourse”. It isn’t.

Now perhaps he’ll tell us how he really feels.

TIME

Do You Think the CIA Infected African Americans With HIV? You’re Not Alone

About half of all Americans believe that theory, or one of five other medical conspiracy theories

About half of the adult population in the U.S. believes at least one medical conspiracy theory, a new survey from the University of Chicago shows.

In the study, 1,351 adults were asked about whether they had heard of, and agreed or disagreed with, six popular medical conspiracy theories, such as those that hold that U.S. regulators prevent people from getting natural cures, that the U.S. government knows cell phones cause cancer but does nothing about it, and that the CIA infected a large number of African Americans with HIV.

About 49% of the people agreed with at least one of the theories, which all had distrust of the government or large corporations as the common characteristic.

According to the study’s lead author, J. Eric Oliver, the reason so many people believe in medical conspiracy theories is that they are easier to understand than science. He added that people who believe in one or more of these theories are more likely to use alternative instead of conventional medicine.

[Reuters]

TIME ADHD

Doctor: ADHD Does Not Exist

Adderall
Getty Images

Over the course of my career, I have found more than 20 conditions that can lead to symptoms of ADHD, each of which requires its own approach to treatment. Raising a generation of children — and now adults — who can't live without stimulants is no solution

This Wednesday, an article in the New York Times reported that from 2008 to 2012 the number of adults taking medications for ADHD increased by 53% and that among young American adults, it nearly doubled. While this is a staggering statistic that points to younger generations becoming increasingly reliant on stimulants, frankly, I’m not too surprised. Over my 50-year career in behavioral neurology and treating patients with ADHD, it has been in the past decade that I have seen these diagnoses truly skyrocket. Every day my colleagues and I see more and more people coming in claiming they have trouble paying attention at school or work and diagnosing themselves with ADHD.

And why shouldn’t they?

If someone finds it difficult to pay attention or feels somewhat hyperactive, attention-deficit/hyperactivity disorder has those symptoms right there in its name. It’s an easy catchall phrase that saves time for doctors to boot. But can we really lump all these people together? What if there are other things causing people to feel distracted? I don’t deny that we, as a population, are more distracted today than we ever were before. And I don’t deny that some of these patients who are distracted and impulsive need help. What I do deny is the generally accepted definition of ADHD, which is long overdue for an update. In short, I’ve come to believe based on decades of treating patients that ADHD — as currently defined by the Diagnostic and Statistical Manual of Mental Disorders (DSM) and as understood in the public imagination — does not exist.

Allow me to explain what I mean.

Ever since 1937, when Dr. Charles Bradley discovered that children who displayed symptoms of attention deficit and hyperactivity responded well to Benzedrine, a stimulant, we have been thinking about this “disorder” in almost the same way. Soon after Bradley’s discovery, the medical community began labeling children with these symptoms as having minimal brain dysfunction, or MBD, and treating them with the stimulants Ritalin and Cylert. In the intervening years, the DSM changed the label numerous times, from hyperkinetic reaction of childhood (it wasn’t until 1980 that the DSM-III introduced a classification for adults with the condition) to the current label, ADHD. But regardless of the label, we have been giving patients different variants of stimulant medication to cover up the symptoms. You’d think that after decades of advancements in neuroscience, we would shift our thinking.

Today, the fifth edition of the DSM requires one to exhibit only five of 18 possible symptoms to qualify for an ADHD diagnosis. If you haven’t seen the list, look it up. It will probably bother you. How many of us can claim that we have difficulty with organization or a tendency to lose things; that we are frequently forgetful or distracted or fail to pay close attention to details? Under these subjective criteria, the entire U.S. population could potentially qualify. We’ve all had these moments, and in moderate amounts they’re a normal part of the human condition.

However, there are some instances in which attention symptoms are severe enough that patients truly need help. Over the course of my career, I have found more than 20 conditions that can lead to symptoms of ADHD, each of which requires its own approach to treatment. Among these are sleep disorders, undiagnosed vision and hearing problems, substance abuse (marijuana and alcohol in particular), iron deficiency, allergies (especially airborne and gluten intolerance), bipolar and major depressive disorder, obsessive-compulsive disorder and even learning disabilities like dyslexia, to name a few. Anyone with these issues will fit the ADHD criteria outlined by the DSM, but stimulants are not the way to treat them.

What’s so bad about stimulants? you might wonder. They seem to help a lot of people, don’t they? The article in the Times mentions that the “drugs can temper hallmark symptoms like severe inattention and hyperactivity but also carry risks like sleep deprivation, appetite suppression and, more rarely, addiction and hallucinations.” But this is only part of the picture.

First, addiction to stimulant medication is not rare; it is common. The drugs’ addictive qualities are obvious. We only need to observe the many patients who are forced to periodically increase their dosage if they want to concentrate. This is because the body stops producing the appropriate levels of neurotransmitters that ADHD meds replace — a trademark of addictive substances. I worry that a generation of Americans won’t be able to concentrate without this medication; Big Pharma is understandably not as concerned.

Second, there are many side effects to ADHD medication that most people are not aware of: increased anxiety, irritable or depressed mood, severe weight loss due to appetite suppression, and even potential for suicide. But there are also consequences that are even less well known. For example, many patients on stimulants report having erectile dysfunction when they are on the medication.

Third, stimulants work for many people in the short term, but for those with an underlying condition causing them to feel distracted, the drugs serve as Band-Aids at best, masking and sometimes exacerbating the source of the problem.

In my view, there are two types of people who are diagnosed with ADHD: those who exhibit a normal level of distraction and impulsiveness, and those who have another condition or disorder that requires individual treatment.

For my patients who are in the first category, I recommend that they eat right, exercise more often, get eight hours of quality sleep a night, minimize caffeine intake in the afternoon, monitor their cell-phone use while they’re working and, most important, do something they’re passionate about. Like many children who act out because they are not challenged enough in the classroom, adults whose jobs or class work are not personally fulfilling or who don’t engage in a meaningful hobby will understandably become bored, depressed and distracted. In addition, today’s rising standards are pressuring children and adults to perform better and longer at school and at work. I too often see patients who hope to excel on four hours of sleep a night with help from stimulants, but this is a dangerous, unhealthy and unsustainable way of living over the long term.

For my second group of patients with severe attention issues, I require a full evaluation to find the source of the problem. Usually, once the original condition is found and treated, the ADHD symptoms go away.

It’s time to rethink our understanding of this condition, offer more thorough diagnostic work and help people get the right treatment for attention deficit and hyperactivity.

Dr. Richard Saul is a behavioral neurologist practicing in the Chicago area. His book, ADHD Does Not Exist, is published by HarperCollins.

TIME medicine

Bring the Doctor with You

The logical next step in managing chronic disease is technology that tracks our vitals and guides us to better health

Chronic disease affects 2 out of 3 adults in the U.S., and it is estimated that 8% of the American population suffers from diabetes. Sixty-nine percent of Americans say they would like direct access to their health records. People want to keep track of their health–and we’d be better off as a society if people had an easy way to do so.

As luck would have it, mobile technology is bringing us closer to the day when we’ll be able to essentially wear our doctors. So when TIME asked me to propose an idea for how design can improve the world, my thoughts quickly turned to medicine. I call my concept–and for now, it is only that–LifeTiles: a wearable kit of sensors for monitoring individual health.

The sensors–designed to be aesthetically pleasing–would noninvasively monitor the user’s physical activity, environment and bloodstream. The information would be sent automatically to the cloud, where specialized algorithms could be used to monitor it and notify the individual with personalized feedback.

A user could also volunteer to donate his or her data, which would be made anonymous and shared with medical experts. Researchers could use the data to look for patterns, understand how disease works and find ways to prevent and cure it. Our doctors would always be with us–and everyone would benefit.

Béhar is the founder of Fuseproject and leads design and brand at Jawbone

TIME

Running Out the Clock

Getty Images

In our ongoing 'Doctor-in-Training' series, time is of the essence in more ways than one for a medical student conducting a routine physical on an elderly patient

I’m running out of time. It was right there on the vital signs monitor clock: 30 minutes left to finish the patient’s history and do her physical. And here she was, a real talker, expounding on the pros and cons of Obamacare. I pressed ahead with my questions about her health, not rushing her, but taking advantage of her pauses to steer the conversation in the direction of the information I needed to present in less than an hour to my supervising doctor.

A classmate and I had been assigned to this patient–I’ll call her Mrs. G.–as part of our course on the physical exam. She lay in her bed on the inpatient cardiac ward, frail under gown and blanket, an IV dribbling into her arm. We worked systematically, with lots of ground to cover. At this stage of our medical education, year two, nothing we do is for the patient’s benefit. Not the barrage of questions, not the poking and prodding for findings we’re only just beginning to understand. It’s all for our training. We find our patients catch-as-catch-can. Sometimes one of our physician teachers will ask a patient to let us perform an examination. Other times the nurses tell us which of their charges that day are the nicest, and we ask those patients to put up with us. Invariably, they do. Though sicker than sick, they generously act as guinea pigs so we can learn the skills to help our future patients.

Mrs. G. was hoarse but still chatty as she answered our questions about her heart problems. “Have you experienced any palpitations?” I asked. “Only twice. Right before I came to the hospital, and the first time I saw my husband,” she deadpanned. They’d been married, she said, 63 years. As my classmate and I prepared to move from taking the history to doing the physical exam, it struck me that Mrs. G. was doing me a favor, allowing me to practice my budding physical-exam skills on her frail form. And she was even entertaining. But I couldn’t repay her with the open-ended listening she was clearly hoping for. It’s starting, I found myself thinking. This is why everyone says doctors are always in a rush.

She was still talking. “They say I may go down in days,” she said. “I’m just hoping to get to Christmas with my grandkids.”

It was a mental slap on the wrist. I’m running out of time? I thought. My cheeks warm, I contemplated how few hours she could have left on this earth. A few hundred, probably. If she was lucky. And yet, here she was, spending one of those hours helping me grow into a doctor.
