It is not easy to protect 1.4 billion people every day. But if Facebook wants to be the home where all those people share their likes and heartbreaks and plans and politics with acquaintances online, it had better try a lot harder.
That was the thrust of the news on March 17, when the Observer of London and the New York Times revealed that analytics firm Cambridge Analytica improperly obtained data from 50 million Facebook accounts. The company, which worked with both Senator Ted Cruz and Donald Trump on their 2016 presidential campaigns, then attempted to build psychological profiles of potential voters — in the hope of using them to determine whom to target.
But in this case, unlike other recent privacy breakdowns — like the Equifax data breach that put 145.5 million accounts at risk — thieves or hackers did not steal information. The company actually just handed the data over, then didn’t watch where it went. As Facebook itself reported, Aleksandr Kogan, the academic researcher who first obtained the information through an app he developed, did so “in a legitimate way and through proper channels” and violated Facebook’s policies only when he passed it on to Cambridge Analytica. The social network was also under the impression until recently that the harvested data had been deleted, but the Times says it has viewed a set of it. Right now, it’s not clear who else can see the data.
All this has prompted sharp criticism of the company, which meticulously tracks its users but failed to keep track of where information about the lives and thinking of those people went. Facebook’s shares were down by 6.8% in the first business day after the reports. Lawmakers in both the U.S. and Britain, where Cambridge Analytica did similar work ahead of the Brexit referendum, have demanded testimony from Facebook chief Mark Zuckerberg; in an interview with CNN, Zuckerberg responded to a question about whether he would go before Congress by saying, “The short answer is I’m happy to if it’s the right thing to do.” The U.S. Federal Trade Commission and state attorneys general have reportedly begun investigations. On the site itself, many users mused, Why are we still here? This all comes at a time when the company has reported its first-ever decrease in daily active users in the U.S. and Canada — from 185 million to 184 million — in the fourth quarter of 2017.
Because Kogan obtained the data through legitimate channels, preventing such a scenario from happening again isn’t as simple as patching a bug or boosting Facebook’s security infrastructure. A fix would require Facebook to be stricter with its actual customers: developers and advertisers of all kinds, from retailers to political groups, who pay to know what you have revealed about yourself. That means keeping a closer eye on who can see what, even if doing so creates repercussions for those partners. Facebook invites you to chronicle your life through its platforms, especially your most cherished moments. There is a natural expectation that a space with such precious material will be guarded. As Zuckerberg said in a statement that in part pledged to restrict developers’ access to data: “We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you.”
There’s another group in need of urgent introspection: users. In an era in which we tell companies like Facebook, Google and Amazon what groceries we eat, whom we’re in touch with and where we’re going (at a minimum), users need to demand to know where their information is being sent and how it will be used, in a form that is readable and accessible. There’s no single obvious answer for preventing future data abuse, but one lesson is evident: Facebook needs to be more transparent with its users when their data is being exploited, and users themselves should be much more vigilant about the personal details they’re willing to share. “It’s clear these platforms can’t police themselves,” Senator Amy Klobuchar posted to Twitter. (Although even she told Vice News Tonight she was skeptical that lawmakers will change the system ahead of this year’s elections, suspecting some will want one last election cycle with these tools in hand.)
Users may not invest in Facebook with cash. Instead, we offer invisible things: our emotions, our interests, our time and, in the end, our trust. As Facebook asks for more and more of us as it expands, from its messenger apps to virtual reality to Instagram, we must ask, Why do we trust what we know so little about? Especially when what we do know is that the site profits off our interests? We need to value our minds and lives at least as much as the advertisers and politicians do.