What reporters witnessed in a New York City auditorium on April 7, 1927, was fundamentally startling: someone speaking, from hundreds of miles away, in real time. When then-Commerce Secretary Herbert Hoover appeared on screen from Washington, D.C., he declared that “human genius has now destroyed the impediment of distance in a new respect, and in a manner hitherto unknown.”
“It was as if a photograph had suddenly come to life and begun to talk, smile, nod its head and look this way and that,” the New York Times marveled.
In retrospect, we might call that the moment video calling was born.
But few conceived of the technology being used by average Americans. Sure, it was a “phenomenal feat,” according to the Boston Globe—but one with “no definite purpose.” Nonetheless, AT&T president Walter Gifford, who received the call from Hoover, confidently predicted that “in due time it will be found to add substantially to human comfort and happiness.”
He likely could not have imagined just how right he was. The world has turned to modern iterations of that first video call to connect socially during COVID-induced quarantine. Daytime teleworkers transition seamlessly into happy hour revelers; birthdays are celebrated and lost loved ones mourned on virtual platforms. Zoom, the video-meeting platform that has come to symbolize this shift, says it added 100 million participants in the first three weeks of April alone. The pandemic has further entrenched our digital saturation.
“People were already incorporating these technologies into their everyday life before COVID-19 in a way that made everything seem kind of ready-made for the current crisis,” says Lisa Parks, a professor of media studies at MIT.
But the concept of video chatting has not always been embraced. In fact, most of its history is a story of failure.
After that public debut in 1927, work continued at AT&T’s Bell Labs. (The company had monopolistic control of the nation’s incipient phone services, giving it primacy in research and development.) But the research could only go so far. At the time, even if there had been demand for the product, networks lacked the carrying capacity needed to transmit visual calls with desirable resolution.
“The idea of visual communications was still alive at Bell Labs, but waiting for the right moment technologically, socially, culturally,” says Jon Gertner, author of The Idea Factory: Bell Labs and the Great Age of American Innovation.
That moment—or so researchers hoped—arrived in 1964, when AT&T introduced the Picturephone at the World’s Fair in New York City, complete with a promotional cross-country call to Disneyland. Streams of visitors could try the devices, while market researchers gauged interest.
Soon after, the company opened Picturephone rooms in New York, Washington and Chicago. Lady Bird Johnson did the inaugural honors, “with the smiling ghost of Alexander Graham Bell looking over her shoulder,” as the Palm Beach Post noted with flourish. Hoping to build on this momentum, AT&T introduced the service into offices in select markets in 1970, but it was unable to garner a sufficient number of users to make the idea work. The effort sputtered out in 1973.
“It was such a spectacular commercial flop, it’s almost kind of hard to imagine today,” says Gertner.
Why was such a novel product doomed commercially? One reason was what Sheldon Hochheiser, corporate historian at the AT&T Archives and History Center, calls “that chicken-and-egg problem: in a network technology, there’s a disincentive to be an early adopter, because your being able to use a Picturephone is dependent on people you wish to contact also having one.” The steep price tag (Hochheiser estimates device and minimal usage costs as the equivalent of $1,000 today) put it out of reach for many consumers. High-cost, cumbersome calls yielded blurry images. In the end, the reward failed to outweigh the inconvenience.
But another reason came as a surprise, and it had nothing to do with technology: “It turns out people don’t want to be routinely seen on the telephone,” says Hochheiser. One columnist raised the specter of a call “every time we allowed ourselves to relax in our tired old bathrobe.” Such fears undoubtedly resonate with many quarantined teleworkers today.
“For an innovator, being early is pretty close to being wrong,” says Gertner. “To have an innovation that scales, that makes an impact on society or business, you really have to check a lot of boxes.”
In the 1980s, video phones inched forward as domestic and international competitors got in on the game. In 1992, AT&T tried again with the VideoPhone 2500, which worked over existing phone lines, but discontinued it in 1995, stymied once again by a reluctant market.
Yet despite years of defeats, “the idea, the lure of [video] telephony endured,” says Hochheiser.
An accidental breakthrough hastened video calling’s trajectory—and shifted it to an entirely different medium.
In 1993, a University of Cambridge scientist tinkering with a camera that monitored coffee pot levels connected it to the fledgling World Wide Web. Surprisingly, the feed gained millions of fans. Commercial webcams followed soon after, paving the way for services such as Skype to connect PC users in the early 2000s.
Then in 2010, Apple CEO Steve Jobs unveiled the iPhone 4’s FaceTime. “I grew up…dreaming about video calling, and now it’s real,” Jobs told an enthusiastic audience, recalling futuristic depictions of video calls on The Jetsons during his childhood.
The proliferation of smartphones and built-in cameras served as an accelerant. “A host of video-based platforms have all emerged along parallel tracks over the past 15 years, and they’ve had a way of reinforcing one another’s success by normalizing online video interactions,” says Parks. Once perceived as invasive, cameras in personal space became mainstream.
It may have taken longer than innovators of the pre-Internet era expected, but the ubiquity of visual calls fulfills, and perhaps exceeds, their audacious goals. “Every great fundamental discovery of the past,” Hoover predicted back in 1927, “has been followed by use far beyond the vision of its creator.”