Science Says You Should Ignore Internet Trolls


Commonly found under bridges and in the reader commentary of stories about Apple, trolls have long plagued the good people of fairy tales and the Internet. While banishing them has long been the remedy of choice, new research out of Stanford and Cornell universities might help to identify these persistent pests before they even start wringing their wart-covered hands. Boasting a methodology with 80% accuracy, the study provides hope that once Skynet becomes self-aware, we can wipe this scourge off the face of the web once and for all.

So who, exactly, is a troll? Analyzing comments on news (CNN.com), politics (Breitbart.com), and gaming (IGN.com) sites over a period of 18 months, the study examined more than 40 million posts by at least 1.7 million users, discovering not only what antisocial behavior looks like but also how it festers, grows, and is ultimately dealt with. This allowed the researchers to see how trolls typically evolve over time.

But one thing in particular helped these odious Internet users stand out from their mild-mannered counterparts. “They receive more replies than average users,” says the paper, “suggesting that they might be successful in luring others into fruitless, time-consuming discussions.”

To create the algorithm, the researchers looked at all 1.7 million users surveyed and split them into two groups: future-banned users (FBUs) and never-banned users (NBUs). Assuming the FBUs were all trolls, they then monitored their behavior from when they signed up to when they got shut out. Some clear differences emerged between the trolls and the NBUs: FBUs wrote differently than everyone else, often going off-topic, scribbling posts that were more difficult to read, and saying more negative things. In addition, trolls made more comments per day and posted more times in each thread. They often had the most posts in a particular thread and made more replies to other comments.

In other words, the trolls were hyperactive.
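
For the technically curious, a bare-bones version of that kind of feature-based classifier might look something like the sketch below. The feature list, the random placeholder data, and the choice of logistic regression are all illustrative assumptions, not the researchers' actual pipeline.

```python
# A minimal sketch of a feature-based troll classifier in the spirit of the
# study's setup: per-user activity features, labels for future-banned (FBU)
# vs. never-banned (NBU) users. The features and data below are hypothetical
# placeholders, so accuracy will hover around chance until real comment-log
# features are plugged in.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users = 1_000

# Hypothetical features: posts per day, posts per thread, replies received,
# readability score, sentiment score, share of posts deleted by moderators.
X = rng.normal(size=(n_users, 6))
y = rng.integers(0, 2, size=n_users)  # 1 = FBU ("troll"), 0 = NBU

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```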

But that alone wasn’t enough to separate trolls from your casual cranks. To do that, the researchers looked at how users’ behavior changed over time, analyzing how many of their posts were deleted by site moderators. NBUs weren’t saints — they also had posts deleted — but only a small proportion got worse over the course of the study. The trolls, on the other hand, had an increasing number of posts deleted as time wore on.
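
That rising deletion rate is simple enough to measure. Here is a rough sketch, assuming only a chronological list of which of a user’s posts were deleted; the bucketing and slope calculation are illustrative choices, not the formula from the paper.

```python
# A rough sketch (not the paper's method) of the trajectory signal described
# above: split a user's posts, in chronological order, into equal buckets and
# check whether the share of moderator-deleted posts rises over time.
import numpy as np

def deletion_trend(deleted_flags, n_buckets=4):
    """deleted_flags: chronological 0/1 flags, 1 = the post was deleted.
    Returns per-bucket deletion rates and the slope across buckets
    (a positive slope means the user's posts get deleted more and more often)."""
    flags = np.asarray(deleted_flags, dtype=float)
    buckets = np.array_split(flags, n_buckets)
    rates = np.array([b.mean() if len(b) else 0.0 for b in buckets])
    slope = np.polyfit(np.arange(n_buckets), rates, 1)[0]
    return rates, slope

# Example: a hypothetical user whose posts get deleted more often over time.
rates, slope = deletion_trend([0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 1, 1])
print(rates, slope)  # rising deletion rate -> positive slope
```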

And this all makes sense when you think about it. Trolls start off surly, are met with opposition and then get a little nutty. When their comments are then deleted, they get even crazier, a cycle that gradually spins out of control until they’re ultimately shut down. It happens online. It happens on television. It even happens in the real world.

Admittedly, the study doesn’t take sarcasm into account, a tool no doubt wielded by a mutant strain of super-trolls, users who “purposefully ask overly naive questions or state contrary viewpoints.” Imagine that . . . oh god, the horror.

But the study does give actionable insight into what to do should you ever encounter a troll. “Anti-social behavior is exacerbated when the community feedback is overly harsh,” says the report. In other words — and of course you already know this — don’t feed the trolls. And since FBUs’ behavior gets worse over time, that means not engaging them early or often.

Currently, this research is unfortunately little more than an academic exercise, as its algorithm for detecting trolls has yet to be rolled into software or a service. But it’s a good first step for sites all over the web, especially Twitter, where the formula could be used to scout out future troublesome users.
