
How Deaf Advocates Won the Battle for Closed Captioning and Changed the Way Americans Watch TV


When Neil Armstrong took his first step onto the moon on July 20, 1969, Harry Lang had just graduated from college with a bachelor’s degree in physics. But as Americans across the country gathered around their television sets to witness this historic moment, Lang, who is deaf, could not hear the words that went with those famous images and had no way to follow what was happening. In the days before captions became available for television programs, people who could not hear had little means of participating in shared cultural experiences like that one.

“What I saw was great irony,” Lang tells TIME. “Our country’s scientists could send a spaceship to the moon and back, but we couldn’t put captions on television for millions of deaf people who were watching it!”

It wasn’t until March 16, 1980 — 40 years ago this Monday — that the network TV channels ABC, NBC and PBS debuted closed-captioned television shows, in which the show’s dialogue and soundtrack appeared as text on screen as the action proceeded. Starting with The ABC Sunday Night Movie, Disney’s Wonderful World and Masterpiece Theatre, a new world opened up. But getting there was a fight, and that battle continues today.

Lang, now professor emeritus at Rochester Institute of Technology, notes that visual entertainment wasn’t always out of reach for deaf people. Silent films were accessible to those who could not hear, but “talkies” left deaf people “effectively isolated from the world of films.” A federal program to provide captioned versions of Hollywood films for deaf viewers was only established by law in 1958. As television grew, however, it did not follow suit. There was no system in place to provide captions, and finding a solution was not a priority for many in the TV business.

“The networks felt the deaf population was too small to justify funding of captions, while the deaf community viewed captions as a right, not a privilege,” says Philip Bravin, who was CEO of the non-profit National Captioning Institute (NCI) in the mid-1990s and chair of the National Association of the Deaf’s TV Access Committee in the early 1980s.

By the 1970s, however, advocacy efforts led to early experimentation with TV closed captioning. (The word closed in closed captioning refers to the fact that viewers have to turn the captions on; “open captioning” would be like the subtitles in a foreign-language film, with the captions visible all the time.) In 1971, Malcolm “Mac” Norwood, who was deaf, became Chief of Media Services and Captioned Films within the Bureau of Education for the Handicapped at the U.S. Department of Health, Education, and Welfare (DHEW). The agency tasked PBS, particularly its Boston affiliate WGBH, with developing captioning technology. WGBH began experimenting with captions on Julia Child’s The French Chef in 1972. The station then started rebroadcasting ABC’s 6:30 p.m. evening news about five hours after it first aired, this time with the show’s dialogue as text on the screen. It was the only national TV newscast deaf people could follow in the 1970s.

Norwood, who became known as the “father of closed captioning,” played a key role in the Federal Communications Commission’s (FCC) decision to reserve “Line 21,” a line of the television broadcast signal that is not normally visible on screen, for caption data, and the NCI received DHEW funding to figure out how to caption shows faster. Pre-recorded shows would be sent to captioning providers, which encoded the captions into the broadcast, essentially the same workflow used today. The NCI developed a decoder that would sit on top of a TV set and, for shows that carried them, turn the Line 21 codes into text on the screen. Available at Sears and via catalog, early versions retailed at $520 (about $1,600 today). These decoders are what allowed deaf people to enjoy the first closed-captioned network TV broadcasts in 1980.
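
The Line 21 scheme was eventually standardized as EIA-608 (later CEA-608). As a rough illustration only, the Python sketch below shows the general shape of that data stream under simplifying assumptions: two bytes of caption data per video field, each carrying seven data bits plus an odd-parity bit, with control-code pairs skipped and the character set treated as plain ASCII. The real standard has a slightly different character map and richer control codes, and the sample bytes here are hypothetical.

```python
# Minimal, illustrative sketch of Line 21 (EIA-608-style) caption decoding.
# Not a full decoder: control codes are skipped and the character set is
# approximated as ASCII. Sample byte pairs below are hypothetical.

def strip_odd_parity(byte: int):
    """Drop the parity bit (bit 7) if odd parity checks out, else return None."""
    if bin(byte).count("1") % 2 != 1:   # odd parity: total set bits must be odd
        return None                     # a real decoder would handle the error
    return byte & 0x7F

def decode_pair(b1: int, b2: int) -> str:
    """Decode one two-byte caption pair into displayable text (simplified)."""
    d1, d2 = strip_odd_parity(b1), strip_odd_parity(b2)
    if d1 is None or d2 is None:
        return ""
    if 0x10 <= d1 <= 0x1F:              # control-code pair: placement, styling, etc.
        return ""
    text = ""
    for d in (d1, d2):
        if 0x20 <= d <= 0x7F:           # basic character set is approximately ASCII
            text += chr(d)
    return text

# Hypothetical byte pairs as they might arrive, one pair per video field.
fields = [(0xC8, 0xE5), (0xEC, 0xEC), (0xEF, 0x80)]
print("".join(decode_pair(b1, b2) for b1, b2 in fields))  # prints "Hello"
```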

“For the first time, I could watch TV with my family,” says Brian Greenwald, a professor of history and director of the Drs. John S. & Betty J. Schuchman Deaf Documentary Center at Gallaudet University. “A limited number of shows had captions, and while I was in the room my family would only watch the shows that I could watch.”

“The deaf are no longer a silent minority,” TIME declared in a 1980 feature about recent gains for deaf and hard-of-hearing Americans:

There are approximately 14 million such people in the U.S., and until very recently they have been the true silent minority, unheard as well as unhearing.

Now all that is changing. In March PBS, NBC and ABC began captioning some of their programs, sending out signals that can be converted into subtitles on specially adapted sets. PBS’s Masterpiece Theater is captioned, and so are such shows as ABC’s Vega$ and NBC’s Real People. Some advertisers, who realize what a vast market they have been missing, are even captioning their commercials.

One deaf performer, Sesame Street‘s Linda Bove, was so popular with the show’s preschool audience that she became a regular member of the cast, playing the part of a deaf actress and spawning entire playgrounds of tots weaving tiny finger patterns in the air. At least one major theater, Los Angeles’ Mark Taper Forum, reserves two performances of every production for the deaf, with a translator using sign language at the side of the stage to tell what the actors are saying. A major breakthrough came last month when Children of a Lesser God, a play about the romance of a deaf woman and a hearing man, virtually swept the Tonys, Broadway’s equivalent of the Oscars. The most surprising award was to Phyllis Frelich, 36, the first deaf person ever to have a lead role on Broadway.


By 1982, real-time broadcasts — such as the Academy Awards and ABC’s World News Tonight — were available with captions. These were made possible by court reporters or people with similar training, known as “stenocaptioners,” who would do real-time captioning for live events.

But captioning still had a long way to go before it became widely accessible. In the ’70s and ’80s, it could take up to 40 hours a week to caption one TV show, and one going rate for stenocaptioning was $2,000 an hour. And for viewers, “the cost of the decoder was hard to justify because there were so few shows captioned on television,” as Lang put it. Fewer than 200,000 decoders had been sold by 1988, per Karen Peltz Strauss’s book A New Civil Right: Telecommunications Equality for Deaf and Hard of Hearing Americans.

The Television Decoder Circuitry Act of 1990 required closed-caption decoders to be built into all TV sets with screens 13 inches or larger — in this case, the decoder was a chip inside the set rather than a separate set-top box. Frank Bowe, who chaired the Commission on Education of the Deaf in the mid-1980s, found that Japanese manufacturers were eager to design these chips because of the great interest in learning English as a second language.

“The theory was if you expand the audience that can have access to captioning, then there will be more incentive for the broadcasters and the producers and the advertisers to support it,” says Strauss, an attorney who did outreach organizing for the 1990 act. “We did an analysis of who would benefit from captioning, and we calculated 100 million people when you added up people who were deaf and hard of hearing, senior citizens, people who were illiterate, children learning to read, and people learning English as a second language. There was this massive effort to get those chips into the TV sets.”

Strauss was involved in drafting both the 1996 Telecommunications Act amendments requiring television programming to have closed captions and the Twenty-First Century Communications and Video Accessibility Act of 2010, which expanded that mandate to smaller video devices such as tablets and cell phones. Over the past 25 years, captioning companies have proliferated and costs have come down; that growth also led to quality-control problems, which 2014 FCC rules were designed to address by strengthening requirements for accuracy and completeness. Digital cable enabled viewers to customize how captions appear on the screen. Strauss argues that “the booming aging population” has also driven the expansion of captioning, as people who are hard of hearing have become a larger market.

Still, the 2010 act doesn’t require all Internet videos to have captions, and some automated captioning tools produce misspellings and gibberish; viral hashtags have pointed out the worst offenders. While some are turning to crowdsourcing as a solution, advocates continue to campaign to ensure that certain types of Internet videos, such as educational videos, have captioning.

Given the history, there’s reason to believe those campaigns will persist. Plus, captions are no longer seen as useful only for deaf people, but also as a convenience for anyone who might want to watch a video without making noise. One 2016 survey found that as much as 85% of video views on Facebook happened with the sound off, and a 2019 study showed that the vast majority of videos consumed on mobile devices are watched on mute.

As Bravin puts it, whenever people “watch captions in noisy environments such as gyms and bars, they have the deaf community to thank.”


Write to Olivia B. Waxman at olivia.waxman@time.com