Verbal abuse often ruins otherwise great online multiplayer games. Modulate has a solution in the form of ToxMod, a first-of-its-kind platform that uses artificial intelligence to detect sexism, racism, or other forms of abuse in games, and alert human moderators, who can issue warnings or ban offenders. The goal, says co-founder and chief technology officer Carter Huffman, is to make games “safer and more inclusive.” ToxMod is already in use in major titles like Rec Room, which began deploying the tech for its 37 million users last year.