If you talk to the engineers and dreamers in Silicon Valley, especially anyone over 35, they’ll probably admit to being into science fiction. This genre of movies, comic books and novels was huge in the first half of the last century and remained strong through its second half, when most of today’s engineers were born. That’s not to say science fiction’s allure has faded — if anything, the popularity of shows like Westworld and Stranger Things suggests we’re as fascinated as ever — but to point out that it had a great influence on those creating today’s technology.
I was born in the latter part of the last century, and like many of my geek friends, was into science fiction at all levels. We loved its heady futuristic ideas and reveled in its high-minded prophecies. But there is one theme in science fiction that always troubled me: when technology runs amok and subverts its creators. Usually when this happens, the story becomes a dramatic puzzle, whose solution involves the protagonists expending tons of creative energy in an effort to either destroy their mutinous creation or contain it. I had nightmares for months after I read Mary Shelley’s Frankenstein.
I’ve been involved in dozens of technology projects, but I have to admit that seldom in our design or business discussions do we spend much time on the potential negative impact of our work on the world. Instead, we abide by an engineering mantra often embodied in the concept “We create it because we can.” Indeed, in most cases we create technology because we see a need, or to solve a problem. But sometimes in hindsight it seems we wind up creating new ones.
I recently spent time with key executives in the cybersecurity space. Perhaps no other area of our digital world better underscores the flip side of technological progress. IT execs tell me that security now consumes about 25% of their IT budgets. Each day we hear of hackers targeting user identities, financial networks and power grids, and malware routinely infects PCs, laptops and smartphones, holding them hostage until users pay a ransom to recover their data.
When the folks at DARPA and other agencies blueprinted the Internet in the 1960s, the idea was to have a medium in which to share scientific data and other information quickly and on a global scale. But as the Internet has evolved, it has become the de facto medium for just about any type of communication, commercial transaction and, yes, hacking, affecting us for better and for worse.
It’s also been responsible for an unprecedented age of distraction. I was recently in New York and had to drive from northern New York City to the Elmira area on the state’s freeways. For the first time, I saw signs that said “Next texting stop is 3 miles ahead. Don’t text and drive.” Most states have already outlawed texting while driving, and yet we hear almost weekly of traffic accidents caused by oblivious drivers tapping blithely on smartphones.
The level of distraction caused by technology (driving or no) is at an all-time high. While on vacation in Maui, Hawaii last month, I was stunned to see people pulling out their smartphones and checking them while walking around beautiful Lahaina and other areas of the island. The gravitational pull of these devices is ubiquitous. During a dinner with my wife, my son, his wife and our two granddaughters at a beachside restaurant, I caught all of us looking at our phones as we waited for our food, paying no heed to the gorgeous scenery right in front of us.
I don’t believe Steve Jobs and Apple dreamed that the iPhone, or smartphones in general, would engender this level of diversion. I don’t think Mark Zuckerberg, when he created Facebook, foresaw how distracting and addictive it would become. And I don’t think Niantic, the creators of Pokémon Go, fully thought through the tectonic fantasy-reality collisions of their augmented reality app (shortly after its launch in early July 2016, two people playing the game walked off a cliff). My wife has had close encounters with trees and light posts herself while chasing down some of the game’s elusive critters.
In a recent Harvard Business Review piece titled “Liberal Arts in the Data Age,” author JM Olejarz writes about the importance of reconnecting a lateral, liberal arts mindset with the kind of narrow engineering focus that can lead to myopic creativity. Today’s engineers have been so intent on creating new technologies that their short-term goals risk obscuring unintended long-term outcomes. While a few companies, such as Intel, are forward-thinking enough to include ethics professionals on staff, they remain exceptions. At this point, every tech company serious about ethical grounding should be hiring people with backgrounds in areas like anthropology, psychology and philosophy.
I have no illusions about the cat being out of the bag (it has since shacked up with YouTube), and as a parent and grandparent, I admit I need to be more proactive about self-policing. My hope is that we can all move a little more in that direction, creating technology that is both impactful and thoughtful in its engagement with our lives and the world.
Tim Bajarin is recognized as one of the leading industry consultants, analysts and futurists covering the field of personal computers and consumer technology. Mr. Bajarin is the President of Creative Strategies, Inc. and has been with the company since 1981, where he has served as a consultant providing analysis to most of the leading hardware and software vendors in the industry.