07 November 2019

AI, Gender and Companionship

Feminized virtual assistants are also designed to function as personal companions. What does this mean for human relationships?

By Courtney Suciu

Do you use a digital personal assistant?

Do you regularly ask Siri, Alexa, Google or Cortana for directions, to play a song, or to order household supplies like laundry detergent and toilet paper?

If you answer “yes” to any of these questions, have you thought much about the “gender” of the voice responding to your requests?

It turns out, the forms of artificial intelligence we rely on most in our day-to-day lives – digital personal assistants, or, in many cases, customer service chatbots – have qualities we traditionally identify as female. And, according to researchers, this gendering of AI has real consequences for human interactions.

Why are so many bots and digital assistants female?

In a 2018 interview with The Economic Times [1], American lawyer Reshma Saujani expressed concern about the ubiquity of female AI in our lives:

I think we have to really ask ourselves why every major digital assistant to hit the market has been a woman. Why is a woman's name (and voice) the default for an assistant? We're putting these chatbots in homes all over the world. Our children are learning that when they have a question or need help, it's going to be a faceless female voice that will do the work for them.

Saujani, who is also the founder of Girls Who Code, a non-profit dedicated to encouraging women to work in technology, offered a potential answer to her own question: “I think a big reason why we’re seeing our digital assistants taking on female personas is because women aren’t in the room where the big ideas and designs are happening.”

As a result, Saujani added, “all we’re getting are the biases of the people who are in the room.”

But the dominance of men in tech isn’t the only reason so much AI is female. It’s also because we want female AI – at least according to a recent report from the Dow Jones Institutional News [2].

Citing a study of more than 7,000 consumers in the Americas, Asia and Europe, the report revealed that we like our bots to be as human-like as possible. Consumers “have strong views on how they want bots to look and behave,” the article noted, as half of those surveyed indicated they preferred a bot to look human rather than be represented by an avatar.

And, among those who indicated a preference, nearly three times as many said they preferred a female bot over a male. In addition, the “bot personality traits” respondents ranked highest were “sounding polite,” “caring” and “intelligent.” The qualities that tied for the lowest ranking were “sounding posh” and “sounding authoritative.”

So, we like our female bots to be kind and smart, but not to come across as in any way superior to us.

What, if anything, does this reflect about gender roles and stereotypes?

AI and gender stereotyping

For researchers like Allison Piper, there is concern that the feminization of digital personal assistants and chatbots perpetuates the historically subservient role of women in Western cultures.

In her thesis “Stereotyping Femininity in Disembodied Virtual Assistants” [3], Piper argued “The purpose of the AI virtual assistant technology seems to be an outdated signification of femininity, one that presupposes that women exist in a supportive manner rather than directive.”

She compared “original virtual assistants like ‘Clippy,’ the Microsoft paper clip,” which “did not resemble humans at all, save for its blinking eyes” – and was hugely unpopular – with “newer virtual assistants like Siri, Cortana, and Alexa” which “are given feminine personas, able to speak audibly with users, and interact beyond simply answering questions.”

According to Piper, today’s digital personal assistants function much like women of previous generations were expected to, making life easier for typically male executives, breadwinners and important decision-makers.

Likewise, Piper pointed out, we use Siri, Cortana, and Alexa to “keep track of important dates in our calendars for birthday[s] or appointments; there are also similarities in filing systems because virtual assistants can find specific documents; some can even order us items from Amazon after a simple verbal command.”

“Virtual assistants on phones and other devices are essentially secretaries for everyday people,” she observed, which could be a great thing. “They seem to make our lives easier, cutting out time to look up something ourselves on the Internet, for example.”

However, in doing so, Piper warned, these digital assistants are also “fulfilling a Western, subordinate feminine typecast.”

Of course, notions around gender in general are complex, and Piper was careful in her argument to note that it is “not a restrictive binary.”

“Masculinity does not necessarily correspond with ‘man,’ and femininity does not necessarily correspond with ‘woman,’” she pointed out.

Rather, in Piper’s research, “gender” referred to “the spectrum of masculinity and femininity [as] symbolic signs of culture.” She wrote that in Western cultures, “gender is a way in which people can possess power or be disempowered and masculinity is viewed more favorably than femininity” – a convention underscored by the feminine typecasting of AI.

How might our relationships be impacted?

In his article for the Journal of Science and Technology of the Arts [4], Pedro Costa made a similar observation and suggested there is a risk in how our interactions with feminized virtual assistants “might affect the way we feel, perceive, interpret and even describe reality, gender and women.”

Such concerns are particularly pertinent in the context of Costa’s analysis of how digital assistants also perform as personal companions.

“The AIs frequently display caregiving attitudes, namely in interactions that don’t relate directly with providing help or assistance,” he wrote. For example, he noted how “Alexa resembles a hybrid between a housewife and an assistant…even saying ‘Well, hello! I’m very glad you’re here,’ when the user arrives home.”

Meanwhile, “Cortana poses as a maternal and somewhat intimate assistant who seeks a friendly relationship with the user, calling the user ‘friend’ and frequently using humor or asking about the user’s day or dinner plans, as well as offering to sing lullabies.”

As feminized digital personal assistants increasingly fulfill an emotional role in our lives, how might human relationships be affected? Might the companionship and intimacy with AI be preferable for some people?

In the CNN series Mostly Human [5], host Laurie Segall interviewed Matt McMullen, founder and CEO of RealDoll. When Segall described his lab as a “sex doll factory,” McMullen quickly corrected her and explained that many of his clients “have feelings that are beyond just sexual desire. They actually become attached to their dolls…so I think love is more in line with what this is.”

McMullen’s company creates life-size, bespoke “dolls” that allow clients to “customize everything that you want on your girl and just order it. You’re gonna receive the body with all the things that you chose,” he explained.

In addition to a body, RealDolls are equipped with a highly responsive artificial intelligence. “The more you interact with it, the better your doll gets to know you,” McMullen said, and compared the experience to interacting with a digital personal assistant.

“When you [talk about] AI in your phone, you think Siri or something like that. But Siri doesn't care when your birthday is or what your favorite food is, so on and so on. Our AI is very interested to know who you are,” he told Segall.

When Segall asked him if he thought it at all sad that his clients might be replacing human love and intimacy with a relationship with a doll, McMullen said he was honored to “be part of that equation.”

“All I'm trying to do is make that person happy, make them feel complete and like they have that missing piece that they didn't have before,” he told her. “People are not as loyal as they used to be to each other. And people deceive each other, and that breaks their heart. And they find themselves in a place where like, ‘I don't wanna do that again?’”

McMullen’s perspective offers a unique opportunity to think about what we expect and desire from our relationships, whether with those who assist us in practical capacities or with those to whom we relate emotionally.

Beyond the implications of feminized artificial intelligence perpetuating gender stereotypes, how do we, as humans, benefit from interacting with forms of intelligence that exist only to serve and respond to our needs?

Can these exchanges be considered “relationships” if the “relating” is one-sided rather than give-and-take?

And what does it mean to be human if we can avoid being vulnerable – if we never get our hearts broken or make someone mad or have our requests denied?

Perhaps we could just ask Siri what she thinks.

Learn more about ProQuest's Global Challenges initiative.

Notes:

  1. Arora, P. (2018, Sep 15). Chatbot as Woman: Did Industrial-Age Biases Creep into New-Age Tech? The Economic Times. Available from ProQuest One Academic.
  2. Press release: Consumers Want Female and Funny - But Not Youthful - Chatbots. (2017, Sep 12). Dow Jones Institutional News. Available from ProQuest One Academic.
  3. Piper, A. (2016). Stereotyping Femininity in Disembodied Virtual Assistants (Order No. 10167825). Available from ProQuest One Academic.
  4. da Costa, P. C. F. (2018). Conversing with Personal Digital Assistants: On Gender and Artificial Intelligence. Journal of Science and Technology of the Arts, 10(3), 59-72. Available from ProQuest One Academic.
  5. Hunt, R. (Director). (2017, Jan 01). I Love You, Bot [Video]. In Mostly Human with Laurie Segall, season 1, episode 3. CNN Newsource Sales. Available from ProQuest One Academic.

________________________________________________________________________________

Courtney Suciu is ProQuest’s lead blog writer. Her loves include libraries, literacy and researching extraordinary stories related to the arts and humanities. She has a master’s degree in English literature and a background in teaching, journalism and marketing. Follow her @QuirkySuciu
