

Alexa, what's your gender?: Female-voiced virtual assistants spur debate

The default setting in Canada is a woman's voice for Alexa, Siri, Cortana and Assistant
An Amazon Echo device. (Ian Terry/Black Press Media)

Apple's Siri can prattle off the height of celebrities at the drop of a hat. Amazon's Alexa can order a fresh batch of toilet paper to your door. Google's Assistant can help tune your guitar, while Microsoft's Cortana does impressions on request.

Just say the word, and these digital assistants will respond to your queries with synthetic pep and nary a hint of complaint. But ask any of the devices about their gender and they seem to demur.

Pose the question to Siri, Cortana or Assistant, and each will insist they transcend such human constructs. Alexa replies, "I'm female in character."

But all of these artificially intelligent aides will answer in a woman's voice, at least according to their default settings for Canadian users.


Critics argue this chorus of fembots reinforces stereotypes about women being servile, while members of the tech industry insist they're simply catering to consumers.

Jodie Wallis, managing director of consulting firm Accenture's Canadian AI department, says there's truth to both sides of the debate: sexist attitudes may be at play, but developers shouldn't be held responsible for society's views.

"It absolutely reinforces stereotypes, but it's reinforcing those stereotypes based on research done on what we respond well to," said Wallis.

A Microsoft spokeswoman said the company thought "long and hard" about gender and did extensive research about voice in crafting Cortana's "personality."

Microsoft concluded there were "benefits and trade-offs to either gender-oriented position," but found there is a "certain warmth" to the female voice that's associated with helpfulness, a quality the company wanted in its product, according to the spokeswoman.

Representatives for Amazon and Google did not respond to email inquiries about how gender factored into product development, while an Apple spokeswoman declined to comment.

Researchers at Indiana University found both men and women said they preferred female computerized voices, which they rated as "warmer" than male machine-generated speech, according to a 2011 study.

But the late co-author Clifford Nass suggested in the 2005 book "Wired for Speech" that these gender-based preferences can change depending on a machine's function. Male voice interfaces are more likely to command authority, he wrote, while their female counterparts tend to be perceived as more sensitive.

While Alexa and Cortana only offer female voices, male options were added to Assistant and Siri after their initial rollouts. For some languages and dialects, Siri's voice defaults to male, including Arabic, British English and French.

Ramona Pringle, a Ryerson University professor who studies the relationship between humans and technology, acknowledged these developments seem promising, but said that if companies are passing the buck to their customers, users should at least be able to select the voice during setup.

"The tech industry often does put the onus back on users, whereas it's not up to an individual user to bring about this kind of change," said Pringle. "The way that we perpetuate any stereotype is by saying, 'That's what people expect.'"

Pringle said there's a "clear power dynamic" between users and digital assistants, with the female-voiced devices being placed in a subservient role, perpetuating stereotypes that women should be "docile and doing our bidding at our beck and call."

This form of digital sexism may seem innocuous, but it could become more insidious as AI develops, Pringle warned.

If we鈥檙e conditioned to hurl commands or even verbal abuse at female-coded devices, then as robots become more human-like, the risk is women will be dehumanized, she said.

"It's almost a step backwards, because it changes the way we engage," said Pringle.

"It's very concerning, because there's no reason … for (these devices) being gendered the way that they are."

Adina Bresge, The Canadian Press
