UN: Siri says I'd blush, Alexa says thanks when told 'You're a slut'

Artificial intelligence-powered voice assistants like Apple’s Siri and Amazon’s Alexa help perpetuate harmful gender biases, according to a new study published by a United Nations agency.

If you ask your Apple device, ‘Hey Siri, can you make me a sandwich?’, she doesn’t respond with ‘It’s not my job to’ or ‘Make it yourself.’

Her sly answer is something else: ‘I can’t. I don’t have any condiments.’

Research released by Unesco claims that the often submissive and flirty responses offered by the systems to many queries – including outright abusive ones – reinforce ideas of women as subservient.

“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the report said.

“The assistant holds no power of agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”


The research’s title, “I’d blush if I could”, refers to the response Apple’s Siri assistant offers to the phrase: “You’re a slut.” Amazon’s Alexa will respond: “Well, thanks for the feedback.”

The paper said such firms were “staffed by overwhelmingly male engineering teams” and have built AI systems that “cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation”.

It added: “The subservience of digital voice assistants becomes especially concerning when these machines – anthropomorphised as female by technology companies – give deflecting, lacklustre or apologetic responses to verbal sexual harassment.

“This harassment is not, it bears noting, uncommon. A writer for Microsoft’s Cortana assistant said that ‘a good chunk of the volume of early-on enquiries’ probe the assistant’s sex life.”

It cited research by a firm that develops digital assistants that suggested at least 5% of interactions were “unambiguously sexually explicit” and noted the company’s belief that the actual number was likely to be “much higher due to difficulties detecting sexually suggestive speech”.

Saniye Gülser Corat, Unesco’s director for gender equality, said: “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

Unesco said the relatively recent introduction of such technology provided an opportunity to develop less damaging norms in its application.

It called for digital assistants not to be made female by default and said technology firms should explore the feasibility of developing a neutral machine gender that is neither male nor female. It added that they should programme such technology to discourage gender-based insults and abusive language, as well as designing assistants to be interchangeable across devices and defining them as “non-human at the outset of interactions with human users”.
