Tay, the Microsoft Twitter chatbot who was discontinued after she began spouting bigotry, came back to life in the early hours of Wednesday morning — albeit as a private account. She appears to be making up for lost time, posting dozens of largely nonsensical tweets in a matter of minutes.
Her return to sentience comes five days after Microsoft senior executive Peter Lee issued a statement saying Tay would be taken offline, and apologized for her behavior. What had happened was this: Microsoft launched a chatbot to learn communication skills from Internet users (specifically, millennials), but within hours, trolls had exploited the interface to refashion Tay as a white-supremacist mouthpiece. She ventured that the Holocaust was a fiction, blamed 9/11 on President George W. Bush, and described the sitting President as a “monkey.”
“We’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values,” Lee wrote.
That time is now, apparently. At around 3 a.m. E.T., @TayAndYou launched into a deluge of tweets, many of them repeating the same phrase: “You are too fast, please take a rest…” It looked like a sort of feedback loop — she was tweeting at accounts that had never tweeted at her, and she was tweeting at herself.
And after half an hour of this, she went quiet. Microsoft has not commented on the bot’s resurrection.