How Big Tech Weaponizes Our Shame

O’Neil is the author of The Shame Machine: Who Profits in the Age of Humiliation and the bestselling Weapons of Math Destruction. She received her PhD in mathematics from Harvard and has worked in finance, tech, and academia. O’Neil is a regular contributor to Bloomberg Opinion.

Shame is a visceral, instinctual response. We react violently to shaming by others, either by feeling shame or by feeling outraged at the attempt. This human hardwiring, which historically salvaged our reputations and preserved our lives, is being hijacked and perverted by the Big Tech companies for profit. In the process, we are needlessly pitted against each other. It doesn’t have to be like this. What I’ve learned, in part from very personal experience, is that shame comes in a number of forms, and the better we understand it, the better we can fight back.

Whereas shame at its best is a useful social mechanism that coerces its target into conforming with a shared norm, the kind of shaming that often goes viral on social media is inappropriate: a punching-down type of shame whose target could not conform even if they tried. That obese woman who fell over in her wheelchair at Walmart? Viral. That overdose victim? Shamed. Kids without lunch money? Stamped with ink on their arms.

Shame’s secondary goal is arguably more effective on social media: broadcasting the norm so that everyone can see what mistakes look like. When we see yet another phone video of an outrageous public “Karen” incident, it can conceivably serve as a lesson for everyone else.

But what exactly are we learning? I’d argue the lessons are bad. The ensuing viral shame is swift and overly simplistic, typically stripped of context and due process. When we do hear further from the target, the shame tends to have backfired, leaving the alleged Karen defiant rather than apologetic, and finding community with equally defiant others.

Finally, the underlying societal problem exposed by a Karen episode is left unaddressed: that white women hold outsized power over others, especially Black men, due to a historical bias in policing. That problem won’t be solved simply by making white women use such power off camera. In other words, the shaming is misdirected in the first place, because it aims too low.

As poorly as shame plays out online, that is exactly how the Big Tech companies have designed it. I should know: I used to work as a data scientist in the world of online ads. I would decide who deserved an opportunity and who did not, based on who had spent money in the past and who hadn’t. My job was to make the lucky people luckier and the unlucky people unluckier.

And that’s a general rule. Most online algorithms quantify and profile you, putting a number on how much you’re worth, whether it’s to sell you a luxury item or to prey upon you if they deem you vulnerable to gambling, predatory loans, or cryptocurrencies. In turn, the advertisers who find you figure out your weaknesses and deftly exploit them. When I realized I was helping build a terrible system, I got out.

What I didn’t realize even then was how much shame sells. Years later, when I was researching bariatric surgery as a way to avoid getting diabetes, I was inundated with online ads that had pegged me as vulnerable to fad diets, liposuction, and plastic surgery. And even though I knew exactly why my online environment was flooded by these ads, and how they were intentionally manipulating me, they still had a visceral effect on my psyche. They had optimized the ads to my particular vulnerability, my body image and my fatness.

For social media, the data scientists are interested in only one thing: sustained attention. That’s why we are made to feel so very comfortable online, surrounded by like-minded friends, perhaps thousands of them. The group is big enough to feel like we’re “in society,” but of course it’s actually quite small, a minute corner of the world. The ways we disagree with people outside our group are filtered straight to us by algorithms, while the ways we agree with them are filtered away, rendered essentially invisible.

That automated boosting of shame-based outrage triggers us, and we become habituated to performing acts of virtue signaling. We jump on the shame train to get our little dopamine boosts for being outraged and righteous. That we get accolades from our inner circle only serves to convince us once again that we’re in the right and that everyone outside our circles is living in a sick cult. This turns what should be a socially cohesive act into mere performance, as we get stuck for hours on the platforms, tearing each other down for the sake of increasing the profits of Big Tech.

What’s particularly tragic about all of this is that the shame doesn’t work at all; it is inherently misdirected. For shame to work, in the sense of persuading someone to behave, we first need to share norms and even a sense of trust, and second, the target of the shame needs to have the choice to conform and the expectation that their better behavior will be noticed. Those preconditions are rarely met in the raucous free-for-all that we inhabit online.

We have had differences of opinions for a long time; that’s nothing new. We are not each other’s enemy even now, even though it can seem like that. By pitting us against each other in these endless shame spirals, Big Tech has successfully prevented us from building solidarity and punching up against the actual enemy, which is them. The first step is for us to critically observe their manipulations and call them what they are: shame machines.



TIME Ideas hosts the world's leading voices, providing commentary on events in news, society, and culture. We welcome outside contributions. Opinions expressed do not necessarily reflect the views of TIME editors.