Banners hang in the atrium of the Austin Convention Center on Thursday, March 7, 2013, on the eve of the opening of the 27th South By Southwest (SXSW) interactive, film and music festival.
AFP/Getty Images
By Katherine Cross
November 3, 2015
IDEAS
Katherine Cross is a sociologist and Ph.D. student at the CUNY Graduate Center in New York City, specializing in research on online harassment and gender in virtual worlds.

“Can we stop online harassment?” That was the question technologist Caroline Sinders’ South by Southwest panel sought to answer. Along with programmer Randi Harper, Sinders and I intended to discuss different ways to design websites, social media platforms and online games to passively mitigate online harassment and to deal with it more effectively when it does happen.

So the irony of SXSW canceling our panel and another last week because of violent threats from an organized harassment campaign has not been lost on us, nor on the many people and media outlets that questioned and criticized the festival’s decision. I was pleased that SXSW apologized and announced on Friday that it would feature a new daylong summit on online harassment. That is the level of seriousness this topic deserves, especially in an age when online harassment, stalking and threats are blunting equal access to the public sphere, costing people their jobs, and making them flee their own homes in terror.

If we as a society continue as we have been, throwing our hands up at the problem of online harassment, ogling the tears of victims and then walking away, pretending the Internet is neither real nor consequential, then this will only be the beginning. What happened at SXSW reveals that online harassment is one of the greatest threats to free speech that we now face.

This kind of mobbing is evolving into a new form of crowdsourced terrorism. “Don’t host this panel or we’ll attack your venue”—what else can we call that? What is the future of free inquiry when public discourse is subject to such a veto?

This incident is only the latest example of a long-running crisis for women and people of color online, whose speech is disproportionately policed and silenced by online mobs ready to pounce at the slightest provocation. These mobs insist on forcing themselves upon us, demanding that their vitriol and the risk of being doxed (having private contact information published online) or swatted (having that information used to file a false police report that sends a SWAT team to your door) are the price we, specifically, must pay if we are to avail ourselves of the virtual.

But there is a way forward, and it is that hopeful message that Caroline, Randi, and I wanted to bring to SXSW. We believe that through smart design, this problem can be greatly ameliorated without sacrificing essential online rights, chief among them the right to control one’s online identity and to be anonymous or pseudonymous when and where one chooses. My contribution to the panel drew from my experience researching (and playing in) video game communities.

I wanted to talk about the positive work being done by several gaming studios to address the issue. Riot Games’ top-flight social psychology team is a prime example. It has taken a proactive, community-building approach to the problem, enlisting the help of players in policing their own communities. From simple changes, like making text-chat between competing teams opt-in instead of opt-out, to a personalized approach to enforcing the rules (e.g. telling a banned player exactly why they were punished), to tapping the whole player base for help in judging whether certain behavior should be punished, League of Legends is approaching a new kind of democracy that is slowly detoxifying a once legendarily nasty community.
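The opt-in principle described above is simple enough to sketch in a few lines of code. This is purely illustrative; the names and logic here are my own assumptions, not Riot’s actual implementation.

```python
# Sketch of an opt-in cross-team chat filter (illustrative names, not Riot's code).
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    team: int
    # Cross-team chat is OFF by default; a player must actively opt in.
    cross_team_chat: bool = False

def deliver_chat(sender: Player, recipients: list[Player], message: str) -> list[str]:
    """Return the names of recipients who actually see the message."""
    received = []
    for p in recipients:
        same_team = p.team == sender.team
        # Messages across teams go through only if BOTH players opted in.
        if same_team or (sender.cross_team_chat and p.cross_team_chat):
            received.append(p.name)
    return received
```

The design point is the default: hostile cross-team chatter simply never reaches players who did not ask for it, without restricting anyone’s ability to speak to those who did.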

ArenaNet’s Guild Wars 2, meanwhile, was designed differently from many other online games in how it distributes experience points and loot, rewarding players for random acts of collaboration rather than competition. That reward system is a true feat of programming, showing how a studio can innovate in game design and defuse tension between players at the same time.
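The contrast between the two reward models can be sketched as follows. This is a toy illustration of the general idea under my own assumptions, not Guild Wars 2’s actual loot code.

```python
# Two toy reward models (hypothetical, for illustration only).

def split_rewards(xp: int, contributors: list[str]) -> dict[str, int]:
    """Competitive model: a fixed reward is divided among contributors,
    so every stranger who joins your fight costs you loot."""
    share = xp // len(contributors)
    return {name: share for name in contributors}

def shared_rewards(xp: int, contributors: list[str]) -> dict[str, int]:
    """Cooperative model: every contributor receives the full reward,
    so another player's help is never a loss."""
    return {name: xp for name in contributors}
```

Under the first model, a passing player is a rival for your reward; under the second, they are pure upside, which is why such systems tend to defuse resentment between strangers.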

ArenaNet and Electronic Arts’ BioWare, meanwhile, are industry leaders in community management, investing in their front-line moderators and giving them the support they need both to enforce the rules and to nurture a positive community. I wanted to hold up their work as a model not only for other game studios, but also for news websites whose comments sections often breed vitriol beneath their stories.

What links many of these preliminary design solutions is that, by and large, they do not reduce speech; they allow it to flourish. When people are not afraid of being shouted at or threatened by strangers, they unsurprisingly feel more confident speaking their minds. Far from curbing speech, designing with harassment in mind actually helps speech proliferate.

SXSW’s apology and online harassment summit are a step in the right direction, pending the resolution of the various security issues my co-panelists and I have raised. I look forward to seeing what happens next and, hopefully, to being a part of it.

We are not condemned to a world where toxicity on the Internet reigns supreme, where the loudest voices always win, or where women have something special to fear from being outspoken online. Having that much ballyhooed “thick skin” should not be a requirement for merely existing on the Internet. And despite the unpleasantness of the past week, I remain convinced that the best is yet to come.
