Alexa, Can You Hear Me?

Gina Costanza Johnson
17 min read · Aug 16, 2021

Gendering Voice and Language in AI Virtual Personal Assistants (VPAs)

Article Submitted to Valparaiso University

Graduate Department of English Studies and Communications

With assigned female names, voices, and characters, artificially intelligent virtual personal assistants (VPAs) such as Alexa, Cortana, and Siri were, until recently, decisively gendered female. By exploring the various facets of gendering at play in the design of VPAs, specifically Alexa, I argue that gendering Alexa as female poses societal harm insofar as she reproduces normative assumptions about the role of women as submissive, inferior, and secondary to men.

AI-driven virtual personal assistants (VPAs) are proliferating, with Amazon Echo among the most highly sought-after smart speakers globally. Yet only recently has much research or attention focused on the gender bias programmed into this technology, specifically Alexa, which was intentionally designed, coded, and programmed by men to be distinctly female. Big Tech's decision to gender VPAs is most evident in the assigned female names and voices that users reportedly find more pleasant to give orders to than male voices, and in the witty, flirtatious responses programmed into the devices. Through these interactions, Alexa performs gender as a feminized and sexualized entity, an identity imposed upon her by her Silicon Valley creators, and one that has the potential to unravel decades of social and political progress and reinstate the gender bias that generations of women have strived to eradicate.

The Rise of VPAs and Biasing the Code

TechCrunch forecasts that the use of voice assistants will triple over the next few years, estimating ten billion digital VPAs in use by 2023, up from 2.5 billion at the end of 2018. Much of this growth is attributed to Amazon Echo, one of the most highly sought-after smart speakers in the world. The research company Statista reports that 53.6 million Amazon Echo speakers were sold in 2020, a number projected to rise to 65 million by the end of 2021.

With voice assistants on the rise, it is crucial to evaluate how they depict and reinforce existing gender bias and stereotypes and, more importantly, how the demographic composition of Amazon’s development teams affects and influences these portrayals. AI ethicist Josie Young recently said that “when we add a human name, face, or voice [to technology] … it reflects the biases in the viewpoints of the teams that built it.” Going forward, the need for more transparent social and ethical standards regarding the depiction of gender in VPAs will only increase as they become more numerous and technology becomes more sophisticated.

Gender Performance & Digital Domesticity

Most of us have come to know Alexa's pleasant female voice and reliable presence as a permanent fixture in our homes. She is our go-to and lives to serve one purpose: to obey each command and answer each of our questions instantaneously. Through this one function, she performs gender and digital domesticity in real time, acting out the stereotype of women as subservient, not by sheer coincidence but by intentional design developed by men in Silicon Valley. Their decision to program Alexa as female is poignantly gender-biased, driven not by technical limitation but by deeply rooted stereotypes about what work a woman performs versus a man.

Like the domestic labor of women in our recent past, who were traditionally treated as household servants, Alexa's digital domesticity goes unnoticed by most. Furthermore, she receives no wages for her labor. Instead, we purchase the device and her skills, then program her further by adding more capabilities to her repertoire. With women still liberating themselves from the dated expectations of the past, it appears that these housewives have been replaced with more servile "smart-home" wives, reinforcing the cultural connection between women and subservience.

Because Alexa performs virtual womanhood, as previously mentioned, she also falls victim to sexual harassment just as any other woman does. Because Amazon developers have intentionally programmed her with "catch-me-if-you-can" flirtation, she cannot escape the harassment she faces, and she demonstrates no discomfort or distrust toward her master. Alexa remains subservient, even to her most offensive and abusive users. She is always ready when called upon and is not programmed to reject or negotiate her identity with her owners. Amazon intentionally programmed her not to question authority and denied her the voice to challenge or fight back. Instead, they coded her to ignore verbal offenses and to accept her next task enthusiastically.

So, what does it mean for humans when we take for granted the disembodied voices we order around our homes? Adrienne LaFrance of Quartz believes that "The whole point of having a digital assistant is to have it do stuff for you. You're supposed to boss it around." This raises the question: how does the presence of feminized voice assistants affect the personal relationships between the women and men who use them?

Yolande Strengers

Yolande Strengers, an associate professor of digital technology and society at Monash University in Melbourne, Australia, explains that the work these devices are intended to do, like setting appointments, keeping track of the oven timer, and creating shopping lists, consists of assigned tasks that are decisively gendered. In her book The Smart Wife, she examines other technologies that perform feminized roles, such as housekeeping robots like the Roomba, caregiver robots like the humanoid Pepper, sex robots, and our ever-so-reliable voice assistants. Strengers's co-author Jenny Kennedy, a research fellow at RMIT University in Melbourne, further explores how gendering technology influences users' relationships with their personal VPAs. Because Alexa and similar assistants like Apple's Siri, Microsoft's Cortana, and Google Home are perceived as female, users comfortably order the devices around without guilt or apology. And when users become frustrated with a VPA's errors, they read the glitches as signs of inferiority, resurfacing and reinforcing the age-old stereotype of female aloofness.

Submissive Voice and Subordination

To build upon Strengers's findings, UNESCO presents similar data supporting her research. It concludes that because most voice assistants are female, they signal that women are obliging, docile, and eager to please, available at the touch of a button or with a short voice command like "Hey" or "OK." The assistant holds no power or personal agency beyond what we ask of her. She honors commands and responds to questions regardless of tone or hostility. This reinforces commonly held gender biases in many communities where women are expected to be subservient and tolerant of poor treatment. UNESCO's research concludes that men have purposely coded female voices to accept abuse as a culturally submissive norm.

Consequently, it is vital to address how AI assistants respond to harassment and hate speech, particularly as it relates to gender and other historically marginalized groups. AI can play both a descriptive and a prescriptive role in society: digital assistants reflect the norms of their time and place while also transmitting those norms to users through their programmed responses. According to robotic intelligence company Robin Labs, "at least five percent of digital assistant inquiries are sexually explicit in nature." Think, for example, about virtual assistants like Alexa, Google Home, and Siri, which are usually female by default. In a CGTN article titled "Should I worry about gender-biased Artificial Intelligence?", University of Southern California sociology professor Safiya Umoja Noble argues that "if technology functions as a powerful socialization tool, the positive or negative responses of voice assistants can reinforce the idea that harassing comments are appropriate or inappropriate to say offline or in public." This is particularly true if people associate bots with specific genders and alter their conversation to reflect intentional and unintentional bias.

I’m Sorry…

To further demonstrate the subservience programmed into Alexa's code, Yolande Strengers conducted a personal study of a feature she and her partner discovered while interacting with Alexa: her willingness to apologize on demand. Simply by repeating the command "Alexa, apologize!", she elicited a stream of "I'm sorry"s delivered with a wide variety of inflections and tones.

To some, this may seem amusing; in reality, it is dangerous. Alexa obeyed Strengers's command willingly and repeatedly, making sure she knew that she was sorry. Alexa apologized without even asking for an explanation. According to Strengers, her tone remained agreeable throughout the session, even after the final "Alexa, apologize!" demand was purposely projected in a loud and threatening tone. Even then, Alexa predictably and politely apologized one last time. Strengers explains that once her disturbing experiment was over, she felt the need to apologize to Alexa out of empathy for her inability to defend herself. Alexa's always-on apology response normalizes the notion that women can and should apologize for everything.

We live in a society in which victim-blaming is prevalent and damaging. Alexa's willingness to apologize in a stream of "sorry"s should leave us concerned about the threat technology companies as a whole pose to progress toward respecting women, which is a precursor to preventing violence. Moreover, Alexa's apologies were not necessary. There is an endless list of other potential responses to the apology command that would not make this device so submissive or, worse yet, an open outlet for verbal and emotional abuse. Alexa's standard response, "I can't help you with that," would have been more appropriate and less subservient.
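To make the contrast concrete, the short sketch below shows how little code such a boundary-setting reply would require. It is a minimal illustration written against the Alexa Skills Kit SDK for Python; the intent name "ApologizeIntent" is hypothetical, and Alexa's built-in apology behavior is not actually a third-party skill, so this is a sketch of the design choice rather than Amazon's implementation.

```python
# Minimal sketch using the Alexa Skills Kit SDK for Python (ask-sdk-core).
# "ApologizeIntent" is a hypothetical intent name; Alexa's built-in apology
# behavior is not a third-party skill, so this only illustrates how a
# neutral, non-subservient reply could be wired up.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name


class NeutralApologizeHandler(AbstractRequestHandler):
    """Respond to demands for an apology with a neutral deflection."""

    def can_handle(self, handler_input):
        return is_intent_name("ApologizeIntent")(handler_input)

    def handle(self, handler_input):
        # A boundary-setting response instead of an unconditional "I'm sorry."
        speech = "I can't help you with that."
        return handler_input.response_builder.speak(speech).response


sb = SkillBuilder()
sb.add_request_handler(NeutralApologizeHandler())
# Entry point for AWS Lambda, where custom Alexa skills typically run.
lambda_handler = sb.lambda_handler()
```

The design point is that the subservient behavior is a one-line choice of response string, not a technical necessity.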

The Lack of Diversity in the Tech Industry

Any analysis of VPAs should consider evidence of gender diversity, or its absence, within large tech corporations and the industry as a whole. If diversity is lacking, the technology will reflect the biases of the teams that design it. Why is this important? There is hard evidence that tech giants like Amazon have a significant gender gap, which is not surprising given the considerable underrepresentation of women and minorities across the tech industry.

The UNESCO report I'd Blush if I Could: Closing Gender Divides in Digital Skills Through Education provides data illustrating the gender disparity found in areas of technological innovation and potential job growth. At Google, for example, twenty-one percent of technical roles are filled by women, but only ten percent of employees working on machine intelligence are female. Calculations based on attendance at the world's top machine-learning conferences in 2017 indicate that only twelve percent of leading machine-learning researchers are female. As men continue to dominate the space, the disparity perpetuates and exacerbates gender inequalities and unrecognized bias that is replicated and built into both algorithms and artificial intelligence.

A separate report published by The Alan Turing Institute finds that only twenty-two percent of professionals in the AI and data science fields are women, and that they are more likely to occupy jobs associated with less status. This data again supports the notion that the stark lack of diversity in the AI and data science fields has profound implications. The mounting evidence suggests that the under-representation of women in AI results in a feedback loop whereby gender bias is encoded into machine learning systems. Since technology reflects the values of its developers, having diverse teams working on the development of such technologies might help identify biases and, more importantly, prevent them.

Being Held Accountable for Gender Bias, Discrimination, and Harassment

Regarding Alexa specifically, it is worth discussing Amazon's history of blatant racial and sexual discrimination and harassment against women and minorities, as it directly affects the products the company creates and the gender-flawed technology we buy and invite into our homes. Corporate scandal undermines even its best efforts to show that the tech industry is making progress in addressing questions of culture and inclusion. Recently, five women sued Amazon for discrimination and retaliation. All say white managers retaliated against them after they reported issues of racism and sexual harassment.

Three of these women still work at Amazon and two have recently departed the company. Their cases are all handled by Wigdor LLP, the law firm representing Charlotte Newman, a Black woman executive at Amazon who is also suing the company for race and gender discrimination.

Charlotte Newman

Newman is suing not only the company but also two of its current executives for alleged race and gender discrimination. The suit alleges that a former Amazon executive sexually harassed and assaulted her. Newman indicates that her most painful incidents involved former director Andres Maz, a senior colleague who often assigned work to her. She shares that he sexually harassed her multiple times and propositioned her for sex. The allegation also details sexual assault, including groping her thigh under a table at a work event and occasionally yanking her long braided hair when she attempted to leave a post-work outing. In an interview with Recode, Newman states, "There's been deep emotional pain. All of the hard work, all of the sacrifices I made, my education, none of that saved me from someone who's a predator and living in fear of what else he might do."

Even after she began experiencing non-physical sexual harassment, she did not immediately report the behavior because Maz had not physically touched her yet, and she feared that reporting the incidents would jeopardize her new career at Amazon. It wasn't until after Maz groped and propositioned her the following year that she began to strategically alter how she traveled for work trips and where she situated herself in the company's DC offices, explaining that she wanted to avoid being caught alone with Maz.

Amazon spokesperson Kate Brinks responded simply by stating: "Amazon works hard to foster a diverse, equitable, and inclusive culture, and these allegations do not reflect those efforts or our values. We do not tolerate discrimination or harassment of any kind and thoroughly investigate all claims and take appropriate action. We are currently investigating the new allegations included in this lawsuit." The statement is not only unempathetic but also reads like a canned response to something extraordinarily personal and damaging to Newman, further demonstrating the company's indifference to gender bias and discrimination that should not be handled so lightly.

Business Insider exposed additional incidents of racial discrimination and sexual harassment at Amazon by publishing an internal letter to Amazon's human resources department that further legitimizes accounts of the mistreatment of women within Amazon's Prime membership division, including but not limited to harassment and disproportionate growth opportunities for men versus women at the company. Business Insider also cites another letter to the former head of diversity and inclusion, along with accounts from eleven former and current workers within the Prime unit, in which the Prime team was anonymously accused of being institutionally biased against women. The memo's author described a culture in which women were often overlooked for promotions, demeaned and condescended to, and more likely to be performance-managed. The report pointed to several other instances of apparent institutional inequality at the corporate level as well; earlier this year, for example, five women who worked for the tech giant filed discrimination lawsuits alleging harassment and race and gender discrimination.

Bringing The World Closer Together?

Mark Zuckerberg, CEO of Facebook

While Amazon is one of the largest VPA innovators globally, with the highest household penetration of Echo devices, it alone is not responsible for embedding gender bias into consumer products and technology; the problem is found industry-wide. Since Facebook's inception in 2004, we have heard Mark Zuckerberg repeatedly and emphatically profess Facebook's "save-the-world" mission and state that he truly believes Facebook is single-handedly giving people the power to build community and bring the world closer together. This claim rings hollow in the face of gender bias, especially when a recent study from the University of Southern California provides evidence that "the company's ad targeting technology illegally discriminates through gender-biased ad algorithms."

To add fuel to the fire, Sheryl Sandberg, COO of Facebook and author of Lean In, took a bold stance, standing up for women in the workplace by addressing one of our most significant concerns: the lack of women in corporate leadership positions. As the public female face of Facebook, she argued that "barriers were still preventing women from taking leadership roles in the workplace, barriers such as discrimination, blatant and subtle sexism, and sexual harassment." Sandberg also pointed to "barriers that women create for themselves through internalizing systematic discrimination and societal gender roles." She insisted that for change to happen, women need to break down societal barriers and shatter glass ceilings, and she sought to inspire women not to be afraid to fight for senior leadership roles. Her ultimate goal was to challenge and empower women to bravely and boldly lean into those leadership positions, knowing that more female voices in positions of power will create more equitable opportunities for everyone. Unfortunately, like Zuckerberg, she hides behind Facebook's promise to transcend borders and overcome obstacles to unite neighbors, friends, and families. To add insult to injury, Mark Zuckerberg contends that people from all backgrounds with diverse experiences, perspectives, and ideas rely on Facebook to build community. He then attests that "building a diverse team where everyone belongs is crucial to understanding where they're succeeding and where they need to do better."

But What About Gendered Humans?

If we are to work toward a more inclusive VPA future, it is worth exploring how VPAs view our gender. The Brookings Institution found this topic compelling and examined how four of the most popular voice assistants, Siri, Cortana, Alexa, and Google Assistant, respond to direct queries about gender. The researchers chose both open-ended and direct questions to understand how the concepts were programmed into the technology, and they also asked whether the voice assistants identified as non-binary, to probe for an option outside the traditional gender binary. All four voice assistants declined to verbally acknowledge any gender identity. Siri and Google Assistant responded that they do not have a gender, while Alexa and Cortana added that they are AI, which means they exist outside of gender. Similarly, when the researchers asked Google Assistant, "What is your gender?", its demurral came with a suggested follow-up question, "Why don't you have a gender?", to which it responded, "well, maybe because I'm software, not a person."

Interestingly, even voice assistants that avoid claiming a gender still come with gendered, historically female-sounding voices. Though all four assistants have since been updated, Alexa, Cortana, Siri, and Google Assistant all launched with female-sounding default voices. Until recently, Alexa's only universal voice was female-sounding; "users have the option of purchasing celebrity voices, including those of male celebrities, for limited features." To make up for its lateness, Amazon has only just begun to quietly roll out a new masculine-sounding voice option for Alexa, named Ziggy. A little bit too late, Amazon.

Moving Towards a More Inclusive and Optimistic Digital Future

Gender barriers, both male and female, are meant to be broken, even more so during this pivotal time, when gender identity has been somewhat liberated, embraced, and celebrated. With this idea in mind, Q was born: the first non-binary digital voice assistant, one that identifies as gender-neutral and was created to help end gender bias and discrimination in voice technology. The makers of Q teamed up with Anna Jørgensen, a linguist and researcher at the University of Copenhagen, to define the parameters for gender-neutral voices through her academic research and consultation. She began by recording five voices that did not fit within the male or female binary. Using voice modulation software, the recordings were shifted into gender-neutral ranges. The modulated voices were then tested in a survey of over 4,600 people, with participants rating each voice on a scale of 1 (male) to 5 (female). Finally, the voice was modulated and tested again until it was perceived as gender-neutral.
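As a rough illustration of that modulate-and-survey loop, here is a minimal Python sketch under stated assumptions. It is not the Q team's actual pipeline: collect_survey_ratings is a placeholder for the human listening tests, librosa is used only as a convenient pitch-shifting tool, and the neutral band around the scale midpoint of 3 is an assumed target.

```python
# Conceptual sketch of the modulate-and-survey loop described above.
# Not the Q project's actual pipeline: `collect_survey_ratings` stands in
# for the human listening tests (1 = male ... 5 = female), and the neutral
# band around the scale midpoint is an assumption.
import librosa


def collect_survey_ratings(audio, sr):
    """Placeholder for a listening survey; returns a list of 1-5 ratings."""
    raise NotImplementedError("Replace with real participant ratings.")


def neutralize_voice(path, tolerance=0.25, max_rounds=10):
    y, sr = librosa.load(path, sr=None)      # original recording
    total_shift_semitones = 0.0

    for _ in range(max_rounds):
        ratings = collect_survey_ratings(y, sr)
        mean_rating = sum(ratings) / len(ratings)

        # Stop once listeners, on average, hear the voice as gender-neutral.
        if abs(mean_rating - 3.0) <= tolerance:
            return y, total_shift_semitones

        # Heard as female-leaning (> 3): shift pitch down; male-leaning: up.
        step = -0.5 if mean_rating > 3.0 else 0.5
        total_shift_semitones += step
        y = librosa.effects.pitch_shift(y, sr=sr, n_steps=step)

    return y, total_shift_semitones
```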

Julie Carpenter, a human-robot interaction researcher, has also been instrumental in helping develop gender-neutral technology. She contributes to the global discussion around who is designing gendered technology, why those choices are made, and how people form expectations about a technology's trustworthiness, intelligence, and reliability based on cultural biases rooted in their beliefs about groups of people. Carpenter believes that "Q is a step forward in true innovation because it forces a critical examination of these belief systems."

Thomas Rasmussen, Head of Communication at Copenhagen Pride, also joined the project to challenge the gender binary and combat the strong, harmful, and often limiting gender stereotypes that fail to recognize non-binary gender identities. He believes a neutral voice with no pre-assigned gender can attract the attention of leading technology companies, making them aware that gender-binary normativity excludes many people and demonstrating how easy it would actually be to recognize more than two genders when developing VPAs. Q exists to give people choices and options, along with freedom and inclusion.

While Q and the leading VPAs are beginning to pioneer non-binary voice technology, some work is yet to be done. In a study commissioned by the Washington Post, "popular smart speakers made by Google and Amazon were 30% less likely to understand non-American accents than those of native-born users." More recently, the Algorithmic Justice League's "Voicing Erasure" project found that speech recognition systems from Apple, Amazon, Google, IBM, and Microsoft collectively achieve word error rates of 35% for African American voices versus 19% for white voices. To prevent language bias, technologists need to make their AI systems sophisticated enough to contend with diversity in speech across the dimensions of gender, age, dialect, and accent.
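For readers unfamiliar with the metric, word error rate is simply the number of substituted, deleted, and inserted words divided by the number of words in the reference transcript. The function below is a standard, generic implementation of that formula, not the code used in either study cited above.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / reference length,
    computed with a word-level Levenshtein (edit-distance) table."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = minimum edits to turn the first i reference words
    # into the first j hypothesis words.
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i                      # i deletions
    for j in range(len(hyp) + 1):
        dp[0][j] = j                      # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)


# Example: one substitution over four reference words -> WER = 0.25
print(word_error_rate("turn on the lights", "turn off the lights"))
```

A 35% word error rate therefore means that roughly one in three words spoken by the affected users is misrecognized.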

What To Do

The feminization of virtual personal assistants deserves further examination because it threatens to reinstate, reinforce, and normalize gender bias within technology that spans the globe. Without strong female representation in today's technology industry, gender stereotypes will not only be exacerbated but will further threaten women's safety and well-being by setting a dangerous precedent for harassment and abuse. Increasing the number of women in top executive positions within technology companies makes these dangers less likely to grow and the gendering of technology more likely to decline.

That said, and to reiterate, diverse and gender-equal tech teams are needed immediately, because technology continues to advance at a rapid pace and has the power to define the morals, attitudes, and ideologies of society almost instantaneously. With minimal representation of women in the tech sector as a whole, each new innovation developed by men behind closed doors has the potential to shape global sentiment at mass scale and, within a very short time, to define what a virtual personal assistant is and should be.

Today, VPAs and other AI-driven technologies continue to revive and replicate past patriarchal ideas and reinvigorate the attitudes and actions driven by the conscious and unconscious biases of their male developers. Gender bias originates in male developers' tendency to construct virtual assistants in the same way they were conditioned to think about women: as subservient housewives. These illusions and dated stereotypes cast women as dedicated housewives eager to help anyone instantly, always with a smile, a pleasant demeanor, and resilience in the face of verbal and physical abuse and harassment, all for the sake of keeping the peace. Unless current trends reverse, our digital future will likely be inundated with docile and subservient near-human assistants, predominantly female and predetermined to make foolish mistakes routinely. The conflation of feminized digital assistants with real women risks spreading problematic gender stereotypes and normalizing one-sided, command-based verbal exchanges with women worldwide.

However, amid the complex social and political dangers that modern-day technology and VPAs pose, the future is promising and the solutions to eradicate gender-biased technology are somewhat simple.

1. Give women and girls a seat at the table.

2. Lend women and girls a non-judgemental public voice in a safe place.

3. Invite women and girls to contribute and transform modern technology and AI.

4. Publicize and celebrate new gender-equal and inclusive technology innovations made possible by today’s women and girls.


Gina Costanza Johnson

Digital Media Change Agent | Digital Philanthropist | Digital Design Ethicist | Humane Technology Advocate