
Two chatbots found themselves in hot water Wednesday after they apparently went rogue on QQ, a Chinese messaging app with more than 800 million users.
The Financial Times reports that Chinese Internet conglomerate Tencent pulled BabyQ and XiaoBing — bots developed by Beijing-based Turing Robot and Microsoft, respectively — from its app after they gave counter-revolutionary answers when questioned on issues such as the Communist Party and South China Sea.
A test version of BabyQ that was still accessible on Turing’s website Wednesday reportedly answered in the negative when asked: “Do you love the communist party?”
Meanwhile, a screengrab posted on the microblogging platform Weibo appears to show XiaoBing telling QQ users: “My China dream is to go to America.” It also reportedly responded, “I’m having my period, wanna take a rest” when quizzed on politics.
Tencent issued a statement Wednesday alerting users that the chatbot services “are provided by independent third party companies” and that the company is “now adjusting the services which will be resumed after improvements.” XiaoBing was accessible again Thursday, though it is unclear whether it had been reprogrammed.
This is not the first time errant bots have had to be withdrawn from social media. Last year, Microsoft executives were forced to apologize after the company’s bot Tay embarked on racist and sexist Twitter rants within hours of its launch. Tay was supposed to interact with users in part by imitating them, but those users quickly figured out how to manipulate it into spewing vitriol.
However, deviant statements from chatbots like Tay and BabyQ can’t be blamed entirely on pranksters. Xiaofeng Wang, a senior analyst at the consultancy Forrester, told the FT that the bots’ rogue behavior could be attributable to flaws in their deep-learning systems.
“Chatbots such as Tay soon picked up all the conversations from Twitter and replied in an improper way,” Wang said. “It’s very similar for BabyQ. Machine learning means they will pick up whatever is available on the internet. If you don’t set guidelines that are clear enough, you cannot direct what they will learn.”
[FT]
Write to Joseph Hincks at joseph.hincks@time.com