Microsoft is pausing the Twitter account of Tay, a chatbot designed to mimic the speech of millennials, after the account sent racist and otherwise offensive messages.
The company quickly deleted the tweets, but not before internet users captured them in screenshots. In a statement to the Washington Post, Microsoft said the Tay account had been baited into the offensive remarks by users hoping to stir controversy.
“Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways,” the statement said. “As a result, we have taken Tay offline and are making adjustments.”
In a statement to Business Insider, Microsoft said that as Tay learns, “some of its responses are inappropriate and indicative of the types of interactions some people are having with it.”