Facebook is ending a controversial research program in which it paid users up to $20 a month to install a smartphone app that gave the company nearly unfettered access to their activity. The move comes after the program was highlighted in a report by TechCrunch, and after Apple said the app violated its policies and revoked its certificate.
A Facebook spokesperson says the program was not “secret,” as some early reports suggested, and that it was opt-in. “It wasn’t ‘spying’ as all of the people who signed up to participate went through a clear on-boarding process asking for their permission and were paid to participate,” said the spokesperson. Still, the program had major privacy implications, and it seemed likely to prey on the vulnerabilities of Facebook’s most financially desperate users.
All of us who use a digital service for “free” are in fact paying for it in some way. Most free digital services, including Facebook, make money through data collection and advertising. As we use Facebook (and often even when we’re not directly using it), it collects data about what we’re doing. It then sells advertising space to other companies and brands, offering them the ability to precisely target their ads based on users’ activity, demographics and demonstrated interests. Those ads are then shown to us as we use Facebook’s services. This is the core model that’s made companies like Facebook and Google some of the world’s most valuable firms — they are, at heart, advertising companies. (Facebook and other services, of course, offer some options for users to better protect their privacy.)
That Facebook’s business model is built around a convenience vs. privacy tradeoff is not a novel thought. (There’s also an argument that we should be able to pay for an ad-free Facebook and other services.) But Facebook’s now-canceled research program took that tradeoff to a terrifying new extreme. Essentially, it compensated users who sacrificed almost all of their digital privacy. Facebook used the data it collected to better understand users’ mobile activity, which could help the company optimize its current apps — Facebook also owns Instagram and WhatsApp, along with a plethora of lesser-used offerings — or launch new services that compete with its rivals, like Snapchat and WeChat.
“There is a long history of market research firms paying a small amount to consumers in exchange for asking questions or a one-time opportunity [to] gather limited amounts of data,” says Roger McNamee, author of Zucked: Waking Up to the Facebook Catastrophe, via email. “What makes this different is the app would have collected a huge amount of data for a long time, some of the targeted users would be minors as young as 13, and Facebook has already demonstrated that it cannot be trusted to hold user data securely or use it in ways that benefit the user.”
Whether Facebook users who signed up for this program truly understood its implications is up for debate. And a handful of participants — fewer than 5%, according to Facebook — were teenagers, who were required to get a parent or guardian’s permission to participate; whether those guardians understood what their kids were up to is also unclear.
But the most alarming element of Facebook’s research program was its inherently exploitative nature. By offering as paltry a sum as $20 to see nearly everything we’re doing on our smartphones, Facebook is, whether consciously or not, targeting the most desperate among us. Facebook users who make a comfortable wage are unlikely to see that deal as worth the trade-off. But people struggling to pay the bills or put food on the table could see it as a far more tempting offer. Nearly 40 million Americans live in poverty, according to the latest figures from the U.S. Census Bureau, while 85% of Internet-using Americans making less than $30,000 a year use Facebook, per one estimate. Their financial situation makes them more vulnerable to economically exploitative activity of all kinds, from payday lending to loan sharks to, now, privacy sharks. Facebook, a company worth some $421.5 billion at press time, can surely find less problematic ways to do market research.