Social Media Platforms Failed to Tackle Abuse. So Tracy Chou Stepped In


Three years ago, when Tracy Chou was between meetings or even while brushing her teeth, she would frequently open her phone and gravitate to Twitter. “I was a little bit addicted,” admits the 35-year-old founder, software engineer, and advocate for diversity in the tech industry.

As an Asian American woman, Chou often found herself in environments that felt alienating. On Twitter, though, she could connect with a broad community of allies, engage with the world around her, and occasionally air her frustrations at racism and sexism in Silicon Valley and beyond. But as her Twitter following grew, so did the abuse. It ranged from what she calls “drive-by trolling,” including sexist or racist slurs, to a couple of persistent stalkers who supplemented online abuse with real-world harassment, with one even following her from London to San Francisco, she says. It got to the point where anytime she checked Twitter, Chou risked exposing herself to something that made her feel unsafe.


She wasn’t alone. From 2014 to 2020, the share of Americans who experienced sexual harassment online more than doubled, from 5% to 11%, and the share targeted by physical threats doubled, from 7% to 14%, according to a recent Pew Research Center report. Women were also more likely than men to have experienced “extremely” or “very” upsetting forms of abuse, and minorities were more likely to have been abused because of their race or sexual orientation.


Casting around for her next project, Chou decided to try to solve the problem. She knew from personal experience how the design of platforms like Twitter can encourage abusive behavior. When a white male designer looks at a feature that makes it easier to send a private message directly from the Twitter home page, he might see it as a quick way to retain users. Chou, on the other hand, immediately recognizes its potential for abuse. “I think a big aspect of it is gender and race, where most women have had to deal with harassment throughout their lives,” she says. “We have a very different conception around safety.”

Tracy Chou in Brooklyn, N.Y., on Feb. 3, 2022. (Jingyu Lin for TIME)

Chou’s interest in these issues was sparked years ago, when she was one of the few female software engineers at the online question-and-answer site Quora. After learning that 9 out of the top 10 writers on the site were men, Chou dug into the case of one woman who had been an active user on the platform, but who quit the site without explanation. Chou eventually learned that the woman had left because she had repeatedly encountered misogynistic questions.

That moment underscored an idea that would come to define Chou’s career: that the design of an online platform could significantly shape the behaviors of the people who use it. Chou used the insights she gleaned to build the block button for Quora. In the years that came after, she also became a founding member of Project Include, an initiative that aims to help the tech industry become more inclusive, partially out of the belief that a more heterogeneous workforce is more likely to build products with safety in mind.

It’s a path that led Chou to spend months coding Block Party, an app she launched in January 2021, which provides users with fine-grained protections to filter out abuse from Twitter accounts that are likely to be owned by trolls: new accounts, those without profile photos, or those with fewer than 100 followers. A premium version of the app ($12 per month) offers even stricter filters. The app doesn’t delete trolls’ tweets, but hides them from the Block Party user, who can check their “lockout folder,” or nominate a confidant to access it on their behalf.
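The filtering the article describes boils down to a handful of heuristics over account metadata. The sketch below is only an illustration of that idea, not Block Party’s actual implementation: the Account fields, the route_mention helper, and the 30-day “new account” cutoff are assumptions, while the no-profile-photo and under-100-followers signals come from the description above.

```python
# A minimal, hypothetical sketch of the heuristics described above; it is not
# Block Party's actual code. Field names and the 30-day "new account" cutoff
# are assumptions; the 100-follower threshold comes from the article.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import List, Optional


@dataclass
class Account:
    created_at: datetime        # when the account was registered (timezone-aware)
    has_profile_photo: bool     # False for default-avatar accounts
    followers_count: int


def looks_like_troll(account: Account,
                     min_age_days: int = 30,
                     min_followers: int = 100) -> bool:
    """Flag accounts matching any likely-troll signal named in the article:
    very new, no profile photo, or fewer than 100 followers."""
    age = datetime.now(timezone.utc) - account.created_at
    return (age < timedelta(days=min_age_days)
            or not account.has_profile_photo
            or account.followers_count < min_followers)


def route_mention(account: Account, mention: str,
                  lockout_folder: List[str]) -> Optional[str]:
    """Hide mentions from flagged accounts in a 'lockout folder' for later
    review (by the user or a nominated confidant); show everything else."""
    if looks_like_troll(account):
        lockout_folder.append(mention)
        return None   # hidden from the user's main view
    return mention    # displayed as normal
```

In practice the thresholds would be user-configurable, which is roughly what the app’s “fine-grained” and premium filter settings suggest.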

That design choice stems, in part, from Chou’s experience collating the abuse she received on multiple platforms to share with police as evidence against her stalkers. “It was bad the first time you had to see it, and then it’s retraumatizing to go through it again,” Chou says. “There’s a lot of folks who just respond by deleting all of it—they want to make it go away—which then makes it even harder to do anything later.” She hopes that Block Party might one day reduce the burden of that process by collecting evidence automatically.

As online harassment has become more visible in recent years, platforms including Twitter and Instagram have begun rolling out their own features to help users combat abuse. Twitter now lets some users temporarily block accounts that use hateful language, and in August 2021 Instagram introduced a feature called “Limits” that lets users temporarily lock down their accounts if they become the targets of harassment campaigns.

Big Tech companies have a habit of copying successful features pioneered by smaller competitors, from Instagram Stories and Twitter’s discontinued Fleets (both cribbed from Snapchat) to Reels (Instagram’s attempt to compete with TikTok). But Chou is confident the big platforms won’t render Block Party obsolete by copying its functionality. “The incentives for these companies are almost always growth and engagement,” she says. That means the priority is usually new features that might attract more users, rather than improvements to safety. “It’s not like they couldn’t do these things,” she says, “but more like they won’t.”

Block Party works only on Twitter right now, but Chou has plans to expand the service to Instagram and other platforms. Over time she hopes it can become a kind of one-stop shop for online safety, where a user can coordinate their filters across lots of different apps from one place. She also wants to diversify the services on offer, making it even less likely that a boardroom decision at a social media behemoth sinks her business.

For now, Chou’s Twitter experience has been vastly improved by having Block Party activated on her account, particularly after an “Ask Me Anything” event on Reddit brought her torrents of abuse on that site. “The one place that actually felt safe was Twitter,” she says. “It was really good for me to have a place where I could still engage with a supportive community.”
