But Sutton points out that people are now highly accustomed to anthropomorphized technologies.
“We are so conditioned to hear certain types of voices in these devices, moving even a little bit away from that can be difficult for users,” she says.
Clearly, part of the problem here is that gender bias and prejudice exist across society, and synthetic voices – like any cultural artefact – run the risk of reflecting that. Although redesigning Siri’s voice is still a worthwhile exercise, it can’t eliminate gender biases on its own. It won’t reverse people’s misogynistic attitudes overnight or suddenly equalize the number of women and men working in the AI industry.
There’s another point to be made here: virtual assistants – by definition – will always be subservient entities. They are, after all, more or less digital servants, so how could we ever speak to them on a level playing field? That is what we would need in order to escape the awkward power dynamics and problematic, dominating behavior we currently direct at them.
“If they’re primarily designed to help people search and shop, how far can you go with meaningful representations and relationships?” asks Charlotte Webb, co-founder of Feminist Internet.
In the near future, we are likely to encounter even more technologies that speak to us. Webb says she is concerned about how voice assistants could continue to perpetuate gender stereotypes once they embody avatars in the “metaverse” – shared virtual reality spaces. People have already been accused of sexually harassing others in the metaverse. Will virtual assistants, inadvertently or not, serve to enable and encourage such behavior?
The history of synthetic voices, and our attitudes towards them, may have perpetuated and even deepened gender biases – like a feedback loop amplifying some of our worst intentions. And yet awareness of such issues has rocketed in recent years, with investigations into the output of AI technologies and changes in social attitudes thanks to the #MeToo movement and similar campaigns.
You could argue that’s the crux of all this, in the end. A more enlightened approach to one another and the wonderful array of human identities that exists in the world begins with us, not a database. In order to vanquish gender-based prejudices, we can’t just update the software. Or return the car to the manufacturer.
“I certainly don’t see a technological solution to it,” says Webb. “I think it’s a human problem.”