
Why Reporting Offensive Players in Online Games Is a Losing Battle


Picture this: After a long day at work, you come home to relax, unwind, and play a video game where you pretend to be a science fiction soldier playing capture the flag for the next four hours. At some point during the evening, the game’s auto-matching program assigns you to a team with a player whose online username can’t be repeated in polite company. Then another teammate uses the in-game voice chat to preach views on “those dirty Mexicans.” Your team grows tired of the rants and, rather than play the match, decides to “go Jew for a while.”

What exactly happened here? All you wanted was a few hours of mindless entertainment before bed and another day at the job, and now you’re wondering if the entire human race lost its mind in the meantime. You think about reporting the ugly behavior of your former teammates, but they’ve since vanished into the online ether. You grumble to yourself, but rejoin the game, hoping the players in the next match aren’t complete airheads.

I’ve played online games since the late-’90s and watched similar problems happen with each of them: Developers focus their time on fixing problems that affect playability, not the player base. Someone won’t play a game because the players are sexist, racist, and otherwise bigoted? Someone untroubled by such prejudice will eventually log on and play.

Many online games state on their boxes and splash screens that their online content is not rated by the Entertainment Software Rating Board, informing players about the no-man’s-land of online content they’re about to enter and (more importantly, from a business standpoint) protecting the parent company from potential lawsuits. The content delivery services that host these games have boilerplate anti-harassment policies, yet the overworked and understaffed game company can’t keep up with anything but the most flagrant of problem players. A “snitches get stitches” culture takes root. Eventually, the players are left to police themselves.

According to Xbox Live’s EULA, a player can’t “use [Xbox products including Xbox Live] to harm, threaten, or harass another person, organization, or Microsoft.” Sony Entertainment’s EULA lists a plethora of activities players cannot do on the PlayStation Network: “You may not take any action, or upload, post, stream, or otherwise transmit any content, language, images or sounds in any forum, communication, public profile, or other publicly viewable areas or in the creation of any [username] that [Sony and its affiliates]…find[s] offensive, hateful, or vulgar. This includes any content or communication that SNEI or its affiliates deem racially, ethnically, religiously or sexually offensive, libelous, defaming, threatening, bullying or stalking.” Even the family-friendly Nintendo Wii comes with an EULA stating its online services may not be used “for commercial or illegal purposes, in a way that may harm another person or company, or in any unauthorized or improper manner.”

For personal computers, three companies dominate the online-gaming content delivery market: Valve’s Steam service, Electronic Arts’ Origin service, and Blizzard Entertainment’s Battle.net. Steam’s subscriber agreement contains a list of conduct rules players must follow, including a clause saying players must not “defame, abuse, harass, stalk, threaten or otherwise violate the legal rights (such as rights of privacy and publicity) of others.” EA’s EULA prohibits users from “Defaming, abusing, harassing, threatening, spamming, violating the rights of others and/or otherwise interfering with others’ use and enjoyment of [Origin and all related software, services, updates, and upgrades];” or “Publishing, transferring or distributing any inappropriate, indecent, obscene, foul or unlawful conduct.” Blizzard’s EULA states that players will not “use or contribute User Content that is unlawful, tortious, defamatory, obscene, invasive of the privacy of another person, threatening, harassing, abusive, hateful, racist or otherwise objectionable or inappropriate.”

For now, content-delivery services for mobile devices come without the social networking options available for similar services on consoles and computers. Google’s Terms of Service states that “content is the sole responsibility of the entity that makes it available. We may review content to determine whether it is illegal or violates our policies, and we may remove or refuse to display content that we reasonably believe violates our policies or the law. But that does not necessarily mean that we review content, so please don’t assume that we do.” Apple’s App Store EULA states: “You understand that by using any of the Services, You may encounter content that may be deemed offensive, indecent, or objectionable, which content may or may not be identified as having explicit language, and that the results of any search or entering of a particular URL may automatically and unintentionally generate links or references to objectionable material. Nevertheless, You agree to use the Services at Your sole risk and that the Application Provider shall not have any liability to You for content that may be found to be offensive, indecent, or objectionable.”

Game apps come with their own Terms of Service. Zynga, creator of Facebook apps like FarmVille and mobile-device apps like Words with Friends, states in its community rules that users agree not to “post any content that is abusive, threatening, obscene, defamatory, libelous, or racially, sexually, religiously, otherwise objectionable or offensive; or violates any applicable law or regulation.”

So if all of these prohibitions are in place, why are you still reporting offensive user names like RapeFace and flagging users referring to earning in-game currency as “jewing”?

First, what can be considered “offensive content” can be debated ad infinitum in a courtroom, costing companies money. Second, staffing shortages lead to prioritization, and actively policing user content usually ends up at the bottom of priority lists, as it’s a problem without a concrete deadline. These two situations combine to form the user-policing system used by nearly all of the aforementioned services: It’s up to players to notify company staff that something is amiss, from flagging content as inappropriate to filling out a Web form akin to a police report, describing the situation and providing screenshots and timestamps when necessary.

At the moment, Halo 4 is the only mainstream multiplayer game to adopt a zero-tolerance policy toward sexist behavior. Halo’s server host, Xbox Live, has the funding to support a team of live humans enforcing its online-content rules. Other online games have in-game monitors or forum moderators in reactive roles, fixing problems on a case-by-case basis like overworked Wild West sheriffs.

Some players live for crapping up someone else’s gaming experience for no other reason than that they can. After all, the EULA doesn’t cover intentionally dropping the captured flag, opening your base to the other team, or other forms of grief play. Some players see it as their divine calling to find the line between permitted and unacceptable behavior and cross it, or, better yet, to troll someone else into crossing it and then report that player for rule-breaking. A recent case of griefing (intentional game disruption meant to harass or annoy) during team events in Lord of the Rings Online caused both players and LOTRO’s developer Turbine to reexamine their definitions of in-game harassment.

Understandably, when reporting a bad player can take longer than playing a game session with said bad player, the path of least resistance is to put up with whoever it is until the session ends or the players change. Platforms like Steam and Battle.net encourage users to mute, squelch, kick, or otherwise dismiss problem players from their personal gaming sessions as the solution in itself, rather than as a first line of defense. Not every online platform has a paid team of employees specifically hired to enforce the rules, and because of this, online gaming culture treats such systemic problems as subjective complaints, an attitude that shades into victim-blaming.

In the online gaming frontier of the ’90s, EverQuest, Ultima Online, and the earliest incarnation of Battle.net hosted live human moderators, but as those games’ scope grew, the moderators’ roles shrank. Now that gaming’s problems with sexism, racism, and homophobia have been laid bare thanks to GamerGate, it’s time to take these problems seriously instead of pushing them aside. Hiring staff dedicated to solving these issues would show that gaming companies won’t tolerate prejudice.

However, this behavior can’t be curbed by user policing alone. As in real life, there’s a fair amount of enabling in the virtual world. Most online gaming groups (clans, corporations, etc.) have at least one player who is an abominable human being yet plays the game like the Pinball Wizard. Other players justify his or her inclusion by stressing the problem player’s skills, abilities, or knowledge, dismissing personality problems with “he’s just like that, you’ll get used to it” or “she’s a great player — we’ll put up with her crap if it means she’s on our team.”

To fix what the game can’t, stop playing with such players. Easier said than done, right? If there’s an in-game group or clan that promotes acceptance and good sportsmanship, join it. Join communities like the Rough Trade Gaming Community and RPG.net. Lurk on subreddits like r/truegaming. Or simply investigate the in-game community for players who fit your play style. I’ve been playing games online since 1998, and the few times I haven’t found a BS-free group, I’ve started one, and I was never short on teammates.

So until the online gaming world gets its act together, I’ll hang out with my gaming group, where the foremost rule is “Don’t be an airhead.” If you’re ready to give gaming one more try, look me up by my Disqus name on Steam. Hope you like space ninjas.

Laura Carruba is a freelance writer and contributor to xoJane. This article originally appeared on xoJane.com.

