Locked down in London at the height of the pandemic, bombarded with scary news, I’d felt my connection with nature starting to fray. The daily hour we were allowed to walk in the park for exercise became for me (and many others) a lifeline. And for these walks, I took my phone—not to chat, but to learn.
Despite being a well-traveled wildlife filmmaker, I was shamefully clueless about the names and habits of many of the species native to my homeland. Soon a tree-identifying app introduced me to the flora I’d been strolling past. A bird-call app helped me tell great tits, blue tits, and goldfinches apart.
My phone’s new role as a soothing nature guide was unexpected, but I’d long understood that tech can supercharge our innate connection with the non-human world.
In 2015, while my friend Charlotte and I were kayaking in Monterey Bay, a humpback whale shot out of the sea, as if a building made of flesh and bone had grown out of the water. Its breach took it arcing through the air and down onto us. It landed on the kayak, smashing us below the surface. Somehow we survived. Afterwards we paddled back to shore, ringing with adrenaline, exhilarated to be alive and sure no one would believe us.
By chance, a tourist filmed the collision on their phone. They put it on YouTube, and their video went viral: a 30-ton whale arcing through the air, tiny humans beneath disappearing in a white explosion. A living wonder was spread via the internet, and a fleeting incident became part of a bigger story.
Scientists analyzed the video and told us that the whale had seen us and turned away from us mid-flight, an action that saved our lives. Then researchers at Happywhale.com used AI to identify who the whale was. The site is a giant “citizen science” project, a database of whale-watchers’ and scientists’ humpback photographs. They told me where it was born, how old it was, who its mother was. I was astonished. As one of the whale’s “followers,” I now receive an email whenever it is sighted again. It has been sighted 16 times since 2015, most recently in 2020.
The initiative now has an app that allows scientists and naturalist guides to identify a whale in a fraction of a second. If you’re going whale-watching and get a good photo, Happywhale’s app can tell you the life story of the whale you’ve snapped, and you can add a piece to its puzzle.
Technology can help us “talk” to nature
It’s easy to feel overwhelmed by technology’s utopian promises or toxic fallout. Many environmentalists, myself included, feel an instinctive uneasiness about machines getting between us and the experience of nature and wildness. But we have a choice in how to wield these powers, and we can be conscious about where and how we use them.
During the pandemic, I spent hours pointing my phone at branches and flowers and holding it up to record the birds as they sang in skies untroubled by aircraft noise. One day I met Eugene, an elderly volunteer who’d planted many of the trees I’d come to know. He was skeptical of the app’s abilities at first, but when I showed him the results, he found them accurate. Then Eugene led me around the park and together we filled in his gaps: the rare species and cultivars he’d been unable to ID himself.
For my book, How To Speak Whale, I have spent the last four years researching how new technologies are transforming what we know of other species, especially whales and dolphins (the cetaceans). With machine help, we have a new appreciation and understanding of these diverse and mysterious animals.
Some pass tests for self-awareness; others seem to intentionally intoxicate themselves. There’s evidence they mourn their dead, form “friendships,” and team up to hunt and play with other species (including us). They have cultures. They have powerful and controlled voices which they use to communicate and sing, teaching each other how to survive. Some seem to have “names” for themselves and their social groups.
We know these things because of tech like waterproofed recording devices, some even suction-mounted onto the bodies of whales, and algorithms that find patterns in the data.
Scientists often take tools developed for humans (cameras and batteries miniaturized for cell phones, waterproofing and shockproofing from extreme sports, facial and voice recognition, the language-mapping techniques of Google Translate) and point them towards other species. In the human realm, these same tools can have dangerous uses: repressive regimes use them to track activists, for example.
But software follows whatever marching orders it’s been given. Right now projects are underway to attempt to use software to decode the communications of whales and perhaps even to speak to them.
At the time of writing, Happywhale has identified almost all the humpback whales in the North Pacific. Its incredible algorithm was co-written across continents by Jinmo Park, a software engineer in South Korea, and Ken Southerland of Portland, Oregon. Happywhale’s database of over half a million photographs is a powerful biological research tool, but it is also an empathy machine—a way for whale watchers of all stripes, professional and casual, to become part of a larger project that forges a connection between human and whale lives.
One of Happywhale’s best-known whales is Fran. When she returned to Monterey Bay with her first calf, her followers rejoiced. When she washed up dead last month, killed by a collision with a boat, her death was mourned by dozens in person and by untold numbers online, prompting calls for better protections for these animals.
Phone up, phone down
As the great environmentalist Rachel Carson wrote, “exploring nature…is largely a matter of becoming receptive to what lies all around you,” and that’s something technology can help us do, more than we often realize.
The year before the whale leapt onto me, my father, Michael, was coming to the end of his life.
He’d always been fascinated by space. We’d call each other up to get excited together about the latest deep-space imagery and the discoveries of satellites and Mars missions. In the last year of his life, I visited him at his cottage in the Yorkshire Dales, where the skies are very clear and there is little light pollution. I’d heard Saturn was unusually close and clear. I carted his telescope out in front of the cottage. I didn’t know how to find Saturn, and my father was too ill to lean down to look through the lens, let alone show me where to point it. I knew there wasn’t much time: it was very cold, and he was tired and frail.
I downloaded an app, which I held up to the sky, and it pointed me to the planet. I lined it up and found its moons bright, its rings crisp. I knelt on the wet grass and my dad leaned on my back and looked through the lens and gasped. This was eight years ago but it is blazed into my heart.
Still, there are times to put our phones down.
Now, strolling around the park, I don’t need the app. I know almost all those trees. I help water the young saplings, and I watch them, not to learn what they are, but to observe them, and to be present, as they change with the seasons.