If you watched Super Bowl LV this past weekend (and the numbers say fewer of us did this year than in any year since 1969), whether for the game or the commercials, chances are you caught Amazon's latest spot promoting its Echo smart speakers and Alexa voice assistant. In the ad, a woman daydreams about what the "ideal vessel" for Alexa would be, conjuring up a sultry fantasy featuring People magazine's 2020 Sexiest Man Alive, Michael B. Jordan. That's right, Erik Killmonger himself!
I'll be the first to admit that I laughed the first time I saw this commercial. Even as a straight, white, cisgender male, I appreciated how Amazon turned the familiar trope of female sexualization back onto the male form. Sex sells, and objectification is a serious topic that another article could address (just read the YouTube comments if you're interested), but it wasn't precisely what held my attention with this ad.
Amazon has been running longer-form, humorous ads for Alexa during the big game for several years now, including a similar concept that featured different celebrities voicing Alexa when she lost her voice. That campaign featured guest appearances by the likes of Gordon Ramsay, Cardi B, Rebel Wilson, and Sir Anthony Hopkins dutifully filling in for the voice assistant. Part of what was funny about those ads was not just the celebrity cameos, but the shock on customers' faces when they heard voices, both female and male, that they didn't associate with Alexa.
Following that campaign's success, Amazon even created a webpage for what it calls Alexa's Celebrity Voice Program. It currently includes only Samuel L. Jackson (which is hilarious, by the way), but the idea is clearly to add more options in the future. Google has also let customers choose celebrity voices for the Google Assistant on its smart speakers, such as John Legend and Issa Rae.
Alexa is calm and soothing, but why is she a "she"?
All of which reminded me of a topic I'd been thinking and reading about for years now: not only why tech companies insist on personifying and anthropomorphizing technology in general, and smart voice assistants in particular, but why Amazon insists on gendering Alexa as a "she."
Over the past several years, as the prevalence of smart voice assistants has increased in our daily lives, there have been many articles written — both research-based and opinion pieces — on why companies like Amazon, Apple, and Google, among others, tend to bias their voice assistants toward having "female" voices, personalities, and characteristics.
Some have reported that customers react more favorably to a female voice or that female voices tend to articulate sounds better, while others have suggested that it's simply down to outdated gender stereotypes. Some have argued that our society has conditioned and convinced itself that women are a better fit for the administrative roles that Alexa, Siri, and Google Assistant play. Hence, tech companies gender those voices accordingly.
I have many friends who have changed their Siri or Google Assistant settings to sound more like a traditional male voice and/or have changed their smart voice assistant's accent. As a person who uses Alexa much of the time, one of the first things I noticed was that there was no option to change the gender of Alexa's voice. When you think about it, though, there's nothing inherently gendered about a voice. I've known many boys and men with very soft, almost feminine-sounding voices, and many women and even older girls with deeper, more traditionally masculine-sounding voices. Vocal qualities such as accent and tone, much like the concept of gender itself, are social constructs.
Even Amazon can't decide if Alexa has a gender or not.
So ultimately, it's not a question of how Alexa's voice sounds, but why the AI has to be assigned a human gender in the first place. Why does she have to be a "she" (or a "he," or a "they") at all?
In Amazon's written materials, you'll frequently find the company referring to the smart voice assistant as "she" or "her." Ironically, when I asked the Amazon Echo Dot (4th Gen) next to my monitor, "Alexa, what is your preferred pronoun," the speaker replied: "As an AI, I don't have a gender." Tell that to your bosses, Alexa.