We start to talk before we can read, so hearing words, and getting familiar with their sounds, is obviously a critical part of learning a language. But in order to read, and especially in order to read quickly, our brains have to “see” words as well.
At least that’s what Maximilian Riesenhuber, a neuroscientist at Georgetown University Medical Center, and his colleagues found in an intriguing brain-mapping study published in the Journal of Neuroscience. The scientists recruited a small group of college students to learn a set of 150 nonsense words, and they imaged their brains before and after the training.
Before the students learned the words, their brains registered them as a jumble of symbols. But after training gave the words meaning, they looked more like familiar, everyday words such as car, cat or apple.
The difference in the way the brain treated the words involved “seeing” them rather than sounding them out. The closest analogy would be adults learning a foreign language written in a completely different alphabet. Students would first have to learn the new alphabet, assigning sounds to each symbol, and in order to read, they would have to sound out each letter to put words together.
In a person’s native language, such reading occurs in an entirely different way. Instead of taking time to sound out each letter, the brain trains itself to recognize groups of letters it frequently sees together — c-a-r for example — and dedicates a set of neurons in a portion of the brain that activates when these letters appear.
In the functional MRI images of the volunteers’ brains, that’s what Riesenhuber saw. The visual word form area, located in the left side of the visual cortex, acts like a dictionary for words, storing the visual representation of the letters making up thousands of words. This visual dictionary makes it possible to read at a fast pace rather than laboriously sounding out each letter of each word every time we read. After the participants learned the nonsense words, this part of their brains was activated when they saw them.
“Now we are seeing words as visual objects, and phonetics is not involved any more,” he says. “We recognize the word as a chunk so we go directly from a visual pattern to the word’s meaning, and we don’t detour to the auditory system.”
The idea of a visual dictionary could also help researchers better understand reading and learning disorders like dyslexia. More research could reveal whether the visual word form area in people with such disabilities differs in any way, or whether they tend to read via more auditory pathways. “It helps us understand in a general way how the brain learns, the fastest way of learning, and how to build on prior learning,” says Riesenhuber.