Verbal abuse often ruins otherwise great online multiplayer games. Modulate has a solution in the form of ToxMod, a first-of-its-kind platform that uses artificial intelligence to detect sexism, racism, or other forms of abuse in games, and alert human moderators, who can issue warnings or ban offenders. The goal, says co-founder and chief technology officer Carter Huffman, is to make games “safer and more inclusive.” ToxMod is already in use in major titles like Rec Room, which began deploying the tech for its 37 million users last year.