Twelve years ago, when I was 24, I handed my boss a piece of paper describing a job I believed should exist but didn't: a reporter covering the emerging technology startup world.
I was privy to a pocket of innovation. The iPhone had launched in 2007, and Apple's App Store debuted a year later, creating a virtual canvas for entrepreneurs who could code an idea into the hands of millions. It was a moment whose constraints birthed a new breed of creativity. Entrepreneurs were coding apps for mobile phones, companies like Twitter and Facebook (now Meta) were taking off, and Silicon Valley's complicated business model had yet to enter the picture.
Before pitching my idea on a scrap of paper, I'd been a lowly news assistant at CNN's breaking news desk during a pivotal moment. We were in the midst of a global recession, Bernie Madoff had swindled New York's finest out of billions, and swine flu was a looming threat.
My evenings were spent in downtown New York dive bars, listening to entrepreneurs relay their ideas over cheap beers and whiskey. During that time, two little-known entrepreneurs created Instagram, and we all began to filter our life's moments, posting them in exchange for likes. I explained in video interviews that the idea of getting into a stranger's car (Uber) wasn't as crazy as it seemed, and that sleeping in a stranger's home (Airbnb) was part of a growing trend called the sharing economy. I watched as entrepreneurs I knew, the ones who gulped PBR and belted karaoke until the early mornings, either failed, sold their companies for millions, or became household names in a public arena. I became CNN's senior technology correspondent.
It wasn't long, however, before I began to worry. The utopian vision that baby-faced entrepreneurs had promised us was beginning to evolve, as the algorithms had adverse impacts on our mental health, our politics, our lives. With all the good came a host of bad. I stopped asking for descriptions of emerging apps and started asking the human questions: Had they thought about whether they were coding addictive products? Did they understand their platforms were beginning to push us into our own filter bubbles and polarize us further? Did they realize their platforms could be weaponized to shape elections? Did they know their platforms weren't just the pipes that allowed content to run through them?
The laws and regulations weren’t equipped to deal with the speed of innovation. The problems of running a kingdom of content were vast.
In 2018, the Cambridge Analytica scandal rocked Facebook and moved the conversation about data collection and privacy to the global stage. The idea that our data had been used to manipulate us was personal, but it was just the tipping point. People were already beginning to question tech's power and influence in politics as the internet's town square—once hailed as a utopian space that would help us navigate the flow of information and engage in quality conversation—became overrun by misinformation, trolls with varying motives, and election interference.
Sitting across from Mark Zuckerberg that year, I asked the question at the heart of it all: What went wrong? He didn't have a concrete answer, and the truth was there were no simple answers to my question, or to all the other complicated questions emerging from a decade of innovation. Addressing them would be messy, and it took the founders of these platforms too long to build the teams and processes to answer them. As technology transformed our society, I had a lingering thought: The algorithms were getting better, but were we?
I'd always loved the underdogs and had a soft spot for those in the corners: the misunderstood, the courageous out-of-the-box thinkers with the ability to take on the status quo. But increasingly, the underdogs had transformed into the power brokers of society. Still, the point that kept getting lost in the shiny offices of Silicon Valley was simple: not enough people took a beat to understand human behavior. The lack of diversity of thought and background was coded into our everyday experience and how we interacted with technology. Discussions of the ethical implications of products weren't prioritized early in the building phases of innovation. We all paid the price.
There's a lot we can learn from the good and bad of the last generation of technology as we hurtle toward the next phase. It's called Web3, and as someone who has peeked around corners and anticipated the tech that transforms society, I'm confident that this moment will be as disruptive as when smartphones hit the market, or even as the earliest stages of the internet, which enabled people to access and share information freely.
If Web1 was defined by the democratization of information, and Web2 ushered in a more centralized era of the internet, where companies like Twitter, Facebook, Amazon, and YouTube became central hubs for our interactions and data, Web3 promises an internet where creators have more ownership over their digital selves. Web3 is a reaction to the last generation of tech where companies owned our data. If Instagram were to disappear tomorrow, all our data and our carefully curated existences would disappear with it—but Web3 promises protocols that call for an internet where “you own you.”
Web3 also promises a more immersive experience with technology. The pandemic accelerated talk of a word that’s now become mainstream vernacular: the metaverse. The term has different definitions depending on who you ask, but a basic way to understand its meaning is to think about living in a world that’s more interactive.
It’s already happening. Our children spend more time in virtual worlds as they grow up. Gaming company Roblox reported its users spent 10.8 billion hours on the platform in the fourth quarter of 2021—a 28% increase from the same time a year before. It’s not just about gaming. People create community and build friendships in these worlds. There will be jobs we didn’t know existed—and worlds, protocols, and platforms that will be built. It’s an exciting time. But we are also at a pivotal point. We have the opportunity to take a “humanity first” approach to innovation, to take the lessons from the last decade and ask the human questions before it’s too late.
There are already emerging problems with the utopian Web3 reset of tech. For example, virtual properties are being snatched up as investments by those who want to build out infrastructure on new platforms. That infrastructure will be used to build virtual shops, concert venues, and galleries where artists, our children, musicians, and others will spend more time. But the demographic of buyers of these virtual properties skews toward men. If these virtual environments live up to the promise of becoming a crucial part of a new generation of the internet, then the people who own digital land will make decisions about what type of environments they'd like to create. To create healthy environments, it's crucial to have diverse decision makers and architects in these worlds.
Our children will inevitably spend more time in these virtual spaces as they evolve, but we aren’t paying enough attention to the proper safety regulations and psychological implications of young minds being shaped by virtual environments and avatar identities, or how we bring educators into building healthier environments.
There will also be new financial models for funding—decentralized autonomous organizations (DAOs), for example, are growing in popularity. While DAOs give people the ability to organize and fund projects in a setting that could benefit community building and force organizations and businesses to rethink their models, without checks, they could also lead to scams and fraudulent behavior. In a worst-case scenario, they could become an efficient outlet for bad actors to collectively organize and become more powerful.
Despite these issues, we have an opportunity at the dawn of a new era of the web to ask important questions as we build new products and to create room for more voices who can help anticipate the adverse impacts of algorithms. Developers and early adopters don't often look at worst-case scenarios when there's excitement around an idea or movement, but if we learned anything from the last generation of tech, it's that doing so is necessary. We've already examined the business model at the heart of Silicon Valley—the ad model—and discovered that it creates an unhealthy incentive-based ecosystem that optimizes for engagement at all costs. While Web3 promises an alternative model and a more decentralized approach, we have to remember: technology may be neutral, but the humans who build and interact with the products are not.
These are questions tech founders, the government, and users should collectively weigh in on. During the last era of tech, there was a divide between the startup world and the government until the two came to a head over issues like Russian interference and data privacy. These worlds shouldn't be so foreign to one another. Investors interested in the Web3 boom should look beyond their own backyards to fund a diverse group of entrepreneurs, so that the architects of the more immersive internet our children grow up in reflect a healthy and diverse ecosystem.
I’ve always been an optimist when it comes to tech innovation, but pattern recognition tells me we are dangerously close to beginning another cycle of “move fast and break things”—Facebook’s early motto for innovation that later became a warning for the tech sector’s pace of innovation without careful consideration.
We are entering an exciting time, full of disruptive concepts: digital ownership, decentralization, a focus on creators, and more control over our digital selves. If we move fast, ask the right questions, and approach Web3 ethically, we have a shot at making this next era of the internet more compelling and kinder than the last.