To hear Facebook describe me, I’m a bit of a lone wolf. I live away from my family, the site says. I’m a frequent traveler. I’m interested in real estate, engineering, chemistry and hotels. I enjoy the German death metal band Obscura. They’re associated with the bands Suffocation and Pestilence — at least that’s what Wikipedia told me, because I had to look them up.
Truth be told, I got a “C” in high school chemistry, I haven’t been on a plane in about a year and I’m not into death metal, German or otherwise. Yet for some reason, Facebook not only thinks this is who I am, it’s also what the social network tells its advertisers about me. I learned this by combing through my Facebook ad preferences, something any of the service’s users can do. And while some of Facebook’s data points made sense and even seemed reasonable to collect, others, like the claim that I’m a “close friend of expats,” are more than unsettling; they could be damaging if advertisers used them the wrong way.
The big problem currently facing Facebook makes this point in a horrific way. U.S. intelligence agencies have concluded that Russia used social media to influence the 2016 presidential election. This is alarming because, according to a Pew Research Center survey, about two-thirds of Americans get at least some of their news from social media. And now Facebook, the largest and most influential social network with more than two billion users, is being threatened by Congress with regulation of how it presents political ads. The concern is that outside actors, like Russia, can manipulate the popular social network and, in turn, influence the thoughts and opinions of its users. You know, regular people like you and me, posting about our cats and Pumpkin Spice Lattes.
If you’ve been purposely disconnected since November 8, 2016, here’s what we’ve learned since the election: The F.B.I. and other U.S. intelligence agencies have confirmed that Russia used social media to influence the election. According to Facebook’s internal investigations, approximately 500 Russia-linked accounts bought 3,000 ads worth about $100,000 during the 2016 campaign. Targeting issues like Islam, gun ownership and the Black Lives Matter movement, these ads were specifically aimed at swing states and intended to sow discord. But virality being what it is, Facebook says some 10 million people ended up seeing the ads.
On November 1, executives from Facebook, Twitter and Google will testify publicly before Congress. And soon the Facebook ads in question will be publicly released. As forthcoming as Facebook may be with Congress both publicly and behind closed doors, it’s unlikely the company will speak plainly about everything users might want it to.
For instance, when it comes to advertising, the bread and butter that brought Facebook roughly $26 billion in revenue in 2016, the company is tight-lipped. It’s a no-brainer that when you “like” something such as Mr. Bean on Facebook (which, amazingly in 2017, 75 million people do), the social network collects that data and uses it to curate content and ads, making the site more relevant. And when you’re served an ad, you can even click on a discreet drop-down menu to see why it targeted you specifically.
But advertisers can use the information Facebook users provide in unsettling ways. For instance, recent reporting by ProPublica showed how repugnant terms that users had entered on their profiles, like “NaziParty,” “how to burn jews,” and “jew hater,” surfaced as targetable categories in Facebook’s ad system. There are no known cases of advertisers actually using those categories to buy anti-Semitic targeted ads, but it’s possible that they could have.
Instead, it’s more likely that advertisers, like the Russian-linked accounts looking to game the social network, used more common and socially acceptable interests to reach users who were susceptible to being influenced by controversial messaging. In other words, if you “like” the National Rifle Association or Planned Parenthood on Facebook, there’s a chance you got played.
And if I hadn’t taken the time to clean up my own ad preferences, I could have been one of them. For the most part, my Facebook ad preferences are tame. I’m conservative with what I “like” on Facebook, I don’t let friends tag me in posts or photos, and I purposely try not to like anything with bias or controversy. In fact, I’m so fastidious with my Facebook presence that none of the roughly 175 advertising data points the company lists about me captures the fact that I’m an enormous Red Sox and Star Wars fan. (I can only imagine all that marketing I’m “missing.”)
Still, given how regimented I am, it makes no sense that Facebook thinks I like racewalking (I don’t), vikings (not interested), leather (I’m more of a fleece guy), and finance (you’re kidding, right?). What the heck is going on here? The answer may be one of the social network’s more closely held secrets.
Facebook works with a number of third-party data providers, forging partnerships that make the seemingly all-knowing website even smarter. The company doesn’t reveal what information it culls about us from those services; that’s Facebook’s secret sauce, and the secrecy irks privacy experts. And the company is unlikely to detail any of these practices when it testifies before Congress next month.
In fact, don’t expect Facebook to give us access to the full scope of our own information any time soon. Instead, in the coming days, we’ll see the ads that the Russian-backed accounts bought to divide us. And when we look at them, it’ll be like peering in a mirror, only the faces staring back will be shrewder. They’ll know exactly how all those nonchalant or impassioned “likes” have been turned against us. And like ideas trapped behind a looking glass, all they’ll tell us is what’s been there all along, hiding in plain sight.
Until Facebook is forced to disclose the full scope of what it knows about its users, as well as the means through which it gathers that information, lingering questions will outnumber answers. And when we demand those answers, our voices carry little weight. We aren’t Facebook’s customers, after all; we’re its products.
And that’s what Russia — I mean Facebook — has known about us all along.
John Patrick Pullen has written about smart devices and home automation for TIME and Fortune since 2009. His column, “Tech in Real Life,” appears weekly on TIME.com and explores the ways that technology impacts people in their daily lives. He lives (in a home that’s much smarter than he is) in Portland, Oregon.