Siri, can you fix sexism embedded in voice assistants?

1 Apr 2021


Will Apple’s latest Siri update spark a change in the way AI assistants are so often given female voices by default?

Yesterday (31 March), it was revealed that Apple would no longer set a default voice for Siri. Instead, iOS users would be prompted to choose between a male and a female voice during set-up.

While the default for Siri was male in some regions, the voice of Siri that has been popularised is that of a woman. For many years, it was likely the voice of Susan Bennett. Apple has never officially confirmed Bennett as the original voice of Siri, but many credible sources and your own ears will lead you to believe it.

An iOS update in 2013 introduced the option to switch from one gendered voice to another and altered the original voice somewhat to sound a bit more naturalistic. This week’s update to Siri delivers two new voices that nudge this realism even further. Increased processing aims to make the computer-generated responses, which combine elements from voice actor recordings, sound closer to the voice of a real person. TechCrunch’s Matthew Panzarino said, “They sound pretty fantastic, with natural inflection and smooth transitions.”

And as the makers of voice assistants strive to create ever more human-sounding voices, their efforts have sparked questions around the gendering of assistive technology, what it says about society’s existing gender biases, and how these tools can either exacerbate or address those biases.

In 2019, a group of linguists, technologists and sound designers sought to address some of the concerns around gendered AI with Q, a genderless voice for assistive technologies.

Dr Julie Carpenter, a California-based researcher who worked on Q, told NPR: “One of our big goals with Q was to contribute to a global conversation about gender, and about gender and technology and ethics, and how to be inclusive for people that identify in all sorts of different ways.”

This chimes with Apple’s statement on the move away from gender defaults when it comes to its voice assistant. “This is a continuation of Apple’s long-standing commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in,” the company said this week.

‘Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers’
– UNESCO REPORT

According to TechCrunch, Siri handles 25bn requests each month. The Apple voice assistant is available in 36 countries, and so has been equipped with 21 languages and a range of accents – including Irish – to optimise familiarity.

So Siri doesn’t have to sound like Susan Bennett or any other English-speaking North American woman, though this is often the default for voice assistants. See Google Assistant, Amazon’s Alexa and Microsoft’s Cortana to complete the set of top tech examples.

A report published by UNESCO in 2019 said of this trend: “Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’.”

This report was titled ‘I’d blush if I could’, a line borrowed from one of Siri’s programmed responses to being called a ‘slut’. It was Quartz that conducted this experiment of sexually harassing voice assistants in 2017 – not just to push the buttons of an emotionless piece of technology, but because users are known to be abusive to AI assistants, and how these assistants respond to such abuse can have wider implications.

It’s not for Apple, Google, Amazon or Microsoft to resolve the sociological reasons why human beings would be so unnecessarily cruel to a tool, but programming responses to abuse that can be read as flirtatious does not help the broader issue.

The UNESCO report claimed that the subservient nature of technology that is female by default “reinforces commonly held gender biases that women are subservient and tolerant of poor treatment”. It further warned: “The feminisation of digital assistants may help gender biases to take hold and spread.” This is a particular concern as children become more accustomed to using voice tech.

“According to Calvin Lai, a Harvard University researcher who studies unconscious bias, the gender associations people adopt are contingent on the number of times people are exposed to them. As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase dramatically,” the UNESCO report said.

Led by Big Tech’s innovations in this space, voice tech is set to become a more common interface for interactions, not just with devices, but with customer support and other services. Just as text-based chatbots are already commonly used in this work, sufficiently advanced voice tech can be deployed in the same roles.

Anyone who has ever worked in customer service (myself included) will know that people with problems can be abusive even when they’re dealing with another human being. The prospect of providing customers with a gendered but soulless voice as a probable punching bag will therefore require further rumination on what that means for us as a society. Codifying, in actual code, the assistant as female alongside the assistant as a virtual sounding board for abuse can have real-life repercussions that tech ethicists must be aware of.

Elaine Burke is the host of For Tech’s Sake, a co-production from Silicon Republic and The HeadStuff Podcast Network. She was previously the editor of Silicon Republic.

editorial@siliconrepublic.com