If you watched Super Bowl LV this past weekend (and the ratings say fewer of us did than in any year since 1969), whether for the game or the commercials, chances are you caught Amazon's latest spot promoting its Echo smart speakers and the Alexa voice assistant. In the ad, a woman daydreams about what the "ideal vessel" for Alexa would be, then conjures up a sultry fantasy featuring People Magazine's 2020 Sexiest Man Alive, Michael B. Jordan. That's right, Erik Killmonger himself!
I'll be the first to admit that I laughed the first time I saw this commercial. Even as a straight, white, cisgender male, I appreciated how Amazon turned the familiar trope of female sexualization on its head by aiming it at the male form. Sex sells, and objectification is a serious topic that another article could address (just read the YouTube comments if you're interested), but it isn't what held my attention with this ad.
Amazon has been running longer-form, humorous ads for Alexa during the big game for several years now, including a similar concept in which different celebrities voiced Alexa after she "lost her voice." That campaign featured the likes of Gordon Ramsay, Cardi B, Rebel Wilson, and Sir Anthony Hopkins dutifully filling in for the voice assistant. Part of what made those ads funny was not just the celebrity cameos, but the shock on customers' faces when they heard voices they didn't associate with Alexa — both female and male.
Following that campaign's success, Amazon even created a webpage for what it calls Alexa's Celebrity Voice Program. It currently includes only Samuel L. Jackson (which is hilarious, by the way), but it's clear the idea is to add more options in the future. Google has likewise let customers choose celebrity voices for the Google Assistant on its smart speakers, such as John Legend and Issa Rae.
All of which reminded me of a topic I've been thinking and reading about for years: not only why tech companies insist on personifying and anthropomorphizing technology in general, and voice assistants in particular, but why Amazon insists on gendering Alexa as a "she."
Over the past several years, as the prevalence of smart voice assistants has increased in our daily lives, there have been many articles written — both research-based and opinion pieces — on why companies like Amazon, Apple, and Google, among others, tend to bias their voice assistants toward having "female" voices, personalities, and characteristics.
Some have reported that customers react more favorably to a female voice or that female voices tend to articulate sounds better, while others have suggested that it's simply down to outdated gender stereotypes. Some have argued that our society has conditioned and convinced itself that women are a better fit for the administrative roles that Alexa, Siri, and Google Assistant play. Hence, tech companies gender those voices accordingly.
I have many friends who have switched their Siri or Google Assistant settings to a more traditionally male voice, or who have changed their assistant's accent. As someone who uses Alexa much of the time, one of the first things I noticed was that there was no option to change the gender of Alexa's voice. When you think about it, though, there's nothing necessarily gendered about a voice. I've known many boys and men with very soft, almost feminine-sounding voices, and many women and older girls with deeper, more traditionally masculine-sounding ones. Vocal qualities such as accent and tone, much like the concept of gender itself, are social constructs.
So ultimately, it's not a problem of how Alexa's voice sounds, but rather why the AI has to be anthropomorphized as a human gender in the first place. Why does she have to be a "she" (or he, or they) at all?
In Amazon's written materials, you'll frequently find the company referring to the smart voice assistant as "she" or "her." Ironically, when I asked the Amazon Echo Dot (4th Gen) next to my monitor, "Alexa, what is your preferred pronoun," the speaker replied: "As an AI, I don't have a gender." Tell that to your bosses, Alexa.
Jeramy is proud to help *Keep Austin Weird* and loves hiking in the hill country of central Texas with a breakfast taco in each hand. When he's not writing about smart home gadgets and wearables, he's defending his relationship with his smart voice assistants to his family. You can follow him on Twitter at @jeramyutgw.
Well, it's basic psychology to give it human characteristics to make people care about the product. Watch faces in ads always show 10:08 because it "looks like a smile," so of course Alexa has a persona. As for why it's female, you can put whatever spin you want on that based on your personal politics.
I could not think of a better topic than bringing identity politics to autonomous devices. Somebody award this author the Pulitzer!!! Now go back to sipping your Grande Vanilla Bean Creme Frappuccino (in a Venti cup) with a variety of dairy and non-dairy milks, extra caramel drizzle, coconut flakes and Greek yogurt, bananas, strawberries, protein powder, and a very specific 34-degree serving temperature.
Alexa is a woman's name.
Why? I'd say so too, but what makes it a woman's name? Is Siri necessarily feminine? Is Cortana? Would Cortano be male? Google seems to have sidestepped the issue with its no-name Google Assistant. Google offers an assortment of voices that aren't identified as male or female, just as colors, though it does present masculine and feminine variations with several accents. I picked Sydney Harbour Blue.
Now we'll get pervs trying to mack on their Alexas... The hard-up people who made this commercial are trying to sexualize a smart device.
Please give us a Scarlett Johansson voice like the movie HER!!!!!!!!!
Aircraft manufacturers give their cockpit voice warnings a female voice because psychology testing showed that a female voice is more commanding, yet more soothing, in an emergency (which is normally the case when there is a problem with the aircraft).