Like the gunman who killed 50 worshippers at two New Zealand mosques in March, the man who killed one person in a Poway, Calif., synagogue on April 27 announced his intentions on 8chan, an anonymous online message board with a reputation for violent extremism. In the wake of these and other murders, calls to shutter such sites have only grown louder.
Several Australian ISPs blocked 8chan right after the Christchurch, New Zealand, attack. There have also been suggestions for ways websites like Facebook could stop linking to these spaces, as well as demands for censorship or taking the sites down completely. But while these actions may prevent the spread of a video of a massacre in the short term, they may end up helping the ideologies responsible for these tragedies.
Blocking access to these platforms would validate users’ long-running narrative that the mainstream simply can’t deal with their edginess. To be sure, 8chan has been blamed at various times for the spread of extremist bloodshed, campaigns of harassment and generally being a hotbed of hate. The site hosts content that is illegal in several countries, and many of its users encouraged, and delighted in, the acts of anti-Semitic or Islamophobic violence–amid chats on gaming, fitness and anime.
8chan’s users are well aware of its infamy and its association with all that the Establishment finds intolerable, from offensive jokes to extremist politics. Much of the content is designed to shock–and to laugh at whoever is offended. The Christchurch and Poway shooters’ manifestos, posted to the site before the killings started, are strewn with in-jokes, trolling and red herrings–with the twin aims of winking at the community and misleading unwitting, horrified journalists. It is deliberate provocation: search interest in 8chan spiked after the Christchurch massacre, no doubt because of the many articles published about it afterward.
Trying to knock out fringe sites will also create a never-ending game of whack-a-mole. 8chan itself was set up as a “free-speech-friendly” alternative to 4chan, which in turn grew after Reddit banned several controversial sections. And consider ISIS propaganda, which after years of pressure from governments, social-media companies and security services is only marginally harder to find these days and clearly still reaches enough people to inspire harm. Besides, much of the far-right content originating on sites like 8chan is already everywhere, including on the biggest platforms on the web–YouTube, Facebook, Twitter and Google–as well as thousands of other message boards and hosting platforms. 8chan may be among the worst, but it is not alone.
The myopic focus on eliminating hateful content from the Internet fundamentally misses that it is the community–not the content–that drives radicalization. These sites are filled with their own icons, language and culture. On 8chan, the Christchurch killer has been branded a “saint,” as has the Norwegian terrorist who killed 77 people in 2011. The most important things these sites offer a would-be gunman are companionship, recognition and, ultimately, an audience to impress. It’s the desire for this sense of belonging that deserves far more of our focus.
Only a select few platforms can afford the investment in technology and manpower needed to hide hate online: building algorithms that efficiently spot extremist propaganda; hiring exponentially more moderators to quickly remove content; doing more than the self-congratulatory bean counting of the posts scrubbed from one platform, only to leave the content readily findable elsewhere. For smaller forums like 8chan, doing this would be impossible.
Companies should take down more content faster, and perhaps fewer people will stumble across a video or a piece of extremist material they weren’t expecting to see. But in the long term, governments must commit to understanding how and why extremist communities thrive in online spaces, as well as the ways in which their users might be brought back into the fold. There’s a lot to learn from sites like 8chan: alongside the violent extremism, these boards have been among the most innovative and forceful drivers of cultural change and politics online over the past decade. Trying to smother them may only make their worst elements more appealing.