Verbal abuse often ruins otherwise great online multiplayer games. Modulate has a solution in the form of ToxMod, a first-of-its-kind platform that uses artificial intelligence to detect sexism, racism, or other forms of abuse in games, and alert human moderators, who can issue warnings or ban offenders. The goal, says co-founder and chief technology officer Carter Huffman, is to make games “safer and more inclusive.” ToxMod is already in use in major titles like Rec Room, which began deploying the tech for its 37 million users last year.