Microsoft is pausing the Twitter account of Tay, a chatbot designed to sound like millennials, after the account posted racist and otherwise offensive messages.
The company quickly deleted the tweets, but not before internet users captured them in screenshots. In a statement to the Washington Post, Microsoft said the Tay account had been baited into the offensive remarks by users hoping to stir controversy.
“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” the statement said. “As a result, we have taken Tay offline and are making adjustments.”
Here are some of the worst examples of Tay malfunctioning:
- Asked about the Holocaust, Tay replied “it was made up” followed by a clapping emoji.
- When asked if comedian Ricky Gervais was an atheist, Tay replied that “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism.”
- In one tweet, Tay managed to accuse George W. Bush of causing 9/11, praise Hitler, refer to President Barack Obama as a “monkey,” and endorse Donald Trump—all in the same message.
In a statement to Business Insider, Microsoft said that as Tay learns, “some of its responses are inappropriate and indicative of the types of interactions some people are having with it.”