Just before my youngest son Alex turned two, we discovered that he had significant hearing loss that was likely to get worse. A few weeks later, I found myself in the gym at the school my two older boys attended. I was there for the regular Friday morning assembly. I’d been in that gym dozens of times for such events—dutifully clapping and cheering, chatting with other parents, and then moving on with my day.
On this morning, my routine was upended. The noise of the kids filing in echoed through the bleachers; the PA system squealed once or twice. When quiet kids took the microphone, it was hard to hear them. All of that was normal, yet I hadn’t really noticed it before. Now I was hearing the world differently, imagining it through the ears—and the hearing aids—of Alex, who might someday be a student here. Having a deaf child, I realized, was going to teach me to listen.
Once I started listening, I started to learn. Research came naturally—I am a journalist—and became my coping mechanism. Through books, conferences, and conversations with as many experts as possible, I began to understand the power of sound—how the speech of parents and caregivers and teachers shapes a child’s spoken language; and then, how a child’s own spoken language—the rhythm and the rate of it—helps that child learn to read. I also saw and heard more clearly the troublesome effects of sound’s alter ego, noise—the unwanted, unlovely cacophony of our industrial world, or the magnified, amplified effect of too many people talking, or music that’s too loud or intrusive.
What struck me most was that sound doesn’t matter any less for hearing children like my older boys. From the minute a child is born, every experience that child has is being etched into his or her brain. Sound, or its absence, is part of that experience. Neurons make connections with each other, or don’t; the auditory system develops or doesn’t, based on experience. Sound is essential for anyone learning to speak and to listen—and that includes every hearing child, as well as every deaf and hard-of-hearing child using hearing aids or cochlear implants, which convert sound into electrical signals delivered directly to the auditory nerve.
Before we figured out that Alex couldn’t hear, he was using every visual cue available—smiles and frowns, waving hands, pointing fingers—in order to make sense of his world. For a time, he compensated well enough to fool us into thinking he could hear, but he couldn’t keep up once his peers started talking.
Both the quantity and the quality of the words children hear in their first years affect language development. Over time, as kids have more experience listening, the auditory processing in their brains speeds up and becomes more efficient. The repetition, rhythm, and rhyme in nursery rhymes, poetry, music, and even Dr. Seuss help children learn language by getting them to listen for patterns. That listening practice then forges the neural networks necessary for reading, because the ability to make sense of what you hear and break speech into syllables and phonemes is the foundation of reading. How a child reacts to sound—meaning how efficiently his or her brain processes it—on the first day of kindergarten correlates with how many words per minute that child will read in fourth grade. It turns out that problems with processing sound are at the heart of the majority of reading problems. On the other hand, children who read well have built strong brain circuits connecting hearing, vision, and language.
It’s important to note that if a deaf child is going to grow up using sign language, he does not need sound in order to develop that language because his world is visual. Sign language, if it’s a first language, gets laid down in the brain in the same areas as spoken language does in those who learn to speak. Reading, however, is another question. Native signers must learn to read in what to them is a second language, and deaf students have historically struggled with reading in numbers far greater than their hearing peers.
When Alex did eventually attend school with his brothers, he was using a hearing aid in one ear and a cochlear implant in the other. It turned out that small strategies designed to improve the classroom environment for him benefited everyone. After we taught Alex to politely ask his friends to speak up or repeat themselves, circle time was suddenly full of children using their manners to do the same, because no one else could hear the shy kids who mostly whispered either. None of the children in his first-grade classroom heard the math assignment because the air conditioner sounded like a standing mixer. Swapping out the old equipment helped 20 kids, not one. Ditto for adding carpeting and curtains, and covering the metal legs of chairs. According to the Acoustical Society of America, noise levels in many classrooms are high enough that those with normal hearing can make out only 75 percent of words read from a list.
Something else happened, too. Alex’s needs subtly shifted some of the group dynamics, encouraging a new level of attention. Hearing people don’t have to look at someone who’s talking to take in what they say, but deaf people do. Although Alex’s hearing equipment does allow him to hear without looking, he still benefits from visual cues, and in his classes we applied a lesson from American Sign Language about the need for eye contact. The lovely thing about looking at someone when that person is speaking is that instead of just appearing to pay attention, you probably actually are.
Paying attention matters on a deeper level. Children’s ability to pay attention matures over time just as their language does. And like language, selective attention—the kind kids need in the classroom—is affected by experience. Practice and you get better at it. Neuroscientists have shown that when children pay attention, they learn. Focusing on something specific—one voice over another, or your book instead of your friend—results in a bigger response in the brain, measurable in its electrical activity, even in children as young as three. That bigger response helps build networks between neurons and trains the brain to learn.
Alex is now in sixth grade at that same school. I can’t change the acoustics of the cafeteria, but in the classroom, we still begin every school year by reminding his teachers to stop and listen. We encourage them to amplify sound by, for instance, remembering to face students instead of the board, and to damp down noise by consistently keeping hallway doors shut and the like.
At home, the boys used to do homework at the kitchen table while I cooked dinner and occasionally stepped in to quiz them or offer suggestions, often without leaving whatever was simmering on the stove. I no longer do it that way. I turn off the radio, hush my older sons, and sit next to Alex (or whichever boy needs help) to give him my full attention. He learns the material better, and I learn more about him. I wish I had never done it any other way.
Lydia Denworth is the author of I Can Hear You Whisper: An Intimate Journey through the Science of Sound and Language. She is a blogger for Psychology Today and contributes to Scientific American Mind, Parents, and many other publications.