Amazon’s Alexa and Apple’s Siri fuel stereotype that women are “subservient” – UN report

Artificial intelligence voice assistants, such as Amazon’s Alexa and Apple’s Siri, are perpetuating and spreading gender stereotypes, says a new UN report.

Titled “I’d blush if I could”, the report from UNESCO says the almost exclusive market of female voice assistants fuels stereotypes that women are “obliging, docile and eager-to-please helpers”.

And with assistants responding to requests no matter the manner in which they are asked, this also reinforces the idea in some communities that women are “subservient and tolerant of poor treatment”.

Canalys, a technology research company, has estimated that 100 million “smart speakers”, which are used to interact with voice assistants, were sold in 2018.

According to the UNESCO report, technology giants such as Amazon and Apple have in the past said consumers prefer female voices for their assistants, with an Amazon spokesperson recently describing female voices as having more “sympathetic and pleasant” traits.

However, further research has shown that preferences are rather more complex: people have been found to favour masculine tones when listening to an authoritative voice, but to prefer feminine tones in a helpful context.

In general, most people prefer the sound of the opposite sex, the report said.

The report specifically notes that the inability of some female-voiced digital assistants to defend themselves against hostile and sexist insults “may highlight her powerlessness”.

In fact, some companies with majority male engineering teams have programmed the assistants to “greet verbal abuse with catch-me-if-you-can flirtation,” the report said.

In some cases, assistants were even found to be “thanking users for sexual harassment”, and sexual advances from male users were tolerated more than those from female users.

Citing a Quartz piece focusing specifically on Siri, the report found the assistant would respond provocatively to requests for sexual favours from male users, with phrases such as “I’d blush if I could” and “Oooh!”, but was less provocative towards female users.

The report added that such programming “projects a digitally encrypted ‘boys will be boys’ attitude” that “may help biases to take hold and spread”.

To tackle the issue, the UN has argued in favour of technology companies adopting more non-human and gender-neutral voices, pointing to the robotic voice used by Stephen Hawking as an example.

“As intelligent digital assistants become ubiquitous, a machine gender might help separate technologies from notions of gender ascribed to humans, and help children and others avoid anthropomorphising them,” the report said.
