A smart thermostat from the heating startup Tado sits on a table at the company's headquarters, next to Amazon's Alexa-enabled smart speaker.
picture alliance via Getty Images
By Mahita Gajanan
May 22, 2019

Artificial intelligence voice assistants with female voices reinforce existing gender biases, according to a new United Nations’ report.

The new report from UNESCO, entitled “I’d Blush If I Could,” looks at the impact of having female voice assistants, from Amazon’s Alexa to Apple’s Siri, projected in a way that suggests that women are “subservient and tolerant of poor treatment.” The report takes its title from the response Siri used to give when a human told her, “Hey Siri, you’re a b-tch.”

Further, researchers argue that tech companies have failed to take protective measures against abusive or gendered language from users.

“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK,’” the researchers write.

“The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility.”

Research has long found that artificial intelligence has a problem with gender and racial bias. The use of smart speakers is continuing to grow rapidly — the research firm Canalys said last year that about 100 million smart speakers would be sold in 2018.

“Technology always reflects the society in which it is developed,” Saniye Gülser Corat, UNESCO’s Director for Gender Equality, tells TIME. “The biases reflect an attitude that almost condones a ‘boys will be boys’ attitude and it magnifies gender stereotypes.”

Corat says the female voices and personalities projected onto AI technology reinforce the impression that women typically hold assistant jobs and that they should be docile and servile. Even as these companies advance technologically, she says, they are moving backward to a Mad Men-like era, when women were expected to serve rather than lead.

“Stereotypes do matter because they come back to affect how young girls and young women see themselves and the way they have dreams and aspirations for the future,” she says. “It’s almost like going back to the image of women that was held in the 1950s or 1960s.”

The report calls for more women to be involved in the creation of technologies used to train AI machines, citing research from Science that finds that such machines “must be carefully controlled and instilled with moral codes.”

Researchers also call for tech companies to train AI machines to respond to human commands and questions in gender-neutral ways by establishing gender-sensitive data sets for use in AI applications. The bulk of the data used to train the machines now is sexist, they find.

“Machine learning is ‘bias in, bias out,’” they write. “A voice assistant’s educational diet is of vital importance.”

Write to Mahita Gajanan at mahita.gajanan@time.com.
