Verbal abuse often ruins otherwise great online multiplayer games. Modulate has a solution in the form of ToxMod, a first-of-its-kind platform that uses artificial intelligence to detect sexism, racism, or other forms of abuse in games, and alert human moderators, who can issue warnings or ban offenders. The goal, says co-founder and chief technology officer Carter Huffman, is to make games “safer and more inclusive.” ToxMod is already in use in major titles like Rec Room, which began deploying the tech for its 37 million users last year.