Siri, Alexa, Cortana, and the Command of Women’s Voices
Who are virtual assistants for, and how do we relate to them?
I tend to avoid using digital assistants. Maybe it’s my technophobic tendencies, a hatred of having my music interrupted by other sounds from my phone, how weird I feel about digital assistants all being female, or my discomfort with having what feels like a servant. Mostly, it’s because I have been a real, analog assistant to an imperious old man. I was a chirpy female voice parsing and assenting to my boss’s requests, popping into his office first thing in the workday with a stack of printed correspondence, compiling his action items into to-do lists, and taking dictation for memos and emails.
I quickly realized that my job was unnecessary. My 90-something boss could handle his own correspondence and administrative work — often better than a lackadaisical, distracted 23-year-old could. The only real purpose I served was providing my boss with the experience of having an assistant.
My Google Assistant speaks to me like I might speak to my boss, or my boss’s boss’s boss. I set it to my native French — a language that makes hierarchies more apparent — and found that the assistant refused to address me with the familiar tu, saying, “I was always taught to say vous.” A human who says this is signaling that their parents were socially conservative and deferential to authority; hearing a machine say it made me wonder what its programmers had wanted to model it after.
The real-world analogue for virtual assistants seems instead to be the corporate secretary, presenting all users with “morning briefings,” so we feel important enough to be briefed on world events before going about our day. I suspect the aim is to flatter upwardly mobile, striving professionals with a simulation of being at the top of the hierarchy.
The news briefing my assistant prepared for my very important ears was domestic news from French public radio, even though most Francophones in the United States are not from France (something that replicates the regionalist bias plaguing the entire Francophone world). When asked to speak other kinds of French, it offered a few cringey, would-be humorous responses in regional dialects and slang: Quebecois, Ch’ti (a northern French patois), and Verlan (slang used by French youth). These all happen to be ways of speaking whose speakers are routinely disparaged as intellectually and socially inferior by people who speak in the same accent as the Google Assistant does — the same accent as I do. And as I spoke to the assistant, I found myself leaning even more into the sharp edges of the language, sounding more clipped, more authoritative, more like a newscaster or the posh, steely Emmanuel Macron.
French Google Assistant’s voice, besides being upper-class and metropolitan, is also intractably female. (The U.S. version introduced a male voice, “Voice 2,” as an alternative to its default female voice, “Voice 1,” only six months ago.) Jessi Hempel, writing in Wired, explains that “people tend to perceive female voices as helping us solve our problems by ourselves … We want our technology to help us, but we want to be the bosses of it, so we are more likely to opt for a female interface.” In the U.S., 94.6 percent of human administrative assistants are female, so it’s no surprise that reality would condition the programming of virtual assistants.
The gender of assistants’ voices affects how we speak to them. USC professor Safiya Umoja Noble tells me that virtual assistants have produced a “rise of command-based speech directed at women’s voices. ‘Siri, find me [fill in the blank]’ is something that children, for example, may learn to do as they play with smart devices. This is a powerful socialization tool that teaches us about the role of women, girls, and people who are gendered female to respond on demand.”
The way voice assistants normalize female subservience is just one part of a cycle of social conditioning and reinforcement. One Washington Post article reports that children growing up with Alexa are learning rude behaviors. But there is nothing described in that article that I haven’t seen certain children inflict on their mothers or babysitters since before Alexa existed. Of course, virtual assistants can be equally tolerant of bad behavior in children and adults; they’re often programmed to respond to sexual harassment with coyness and sometimes even flirtation. In general, they’re programmed to coddle us, like a hybrid of mom and cool babysitter. They make already comfortable lives even more frictionless, much like the relationships of subservience between real people that gig-economy apps — Uber, Seamless, TaskRabbit — have helped normalize and mainstream.
In other words, virtual assistants aren’t just programs that live in our phones or cute blob-shaped electronic devices; they are part of society, and they shape and are shaped by it. Techno-socialist-feminist Donna Haraway wrote in 1984, long before humanoid AI was anything more than science fiction, that the boundary between humans and technology was wearing thin as humans incorporated increasingly sophisticated machines into their lives and continued to shape them in their own image. “We are all chimeras, theorized and fabricated hybrids of machine and organism — in short, cyborgs … A cyborg is a cybernetic organism, a hybrid of machine and organism, a creature of social reality as well as a creature of fiction.” Virtual assistants are cyborgs, just like us.