In this interview, Dr Rachel Adams discusses the numbers gap for women in technology and the problematic gendering of virtual personal assistants (VPAs), and advises on ways to tackle the issue.

The interview and photo were originally published in El País.

Why do VPAs have female voices and female “attitudes”?

The most obvious reason why VPAs such as Siri and Alexa, which are designed primarily for home and personal use, have been assigned female voices and characteristics is that studies in behavioural economics have found that consumers of all genders prefer the sound of a female voice in this kind of setting. The problem with accepting this uncritically is that we end up reproducing stereotypes of women as helpers and assistants. This is exacerbated when we look at VPAs used in a professional setting, like IBM’s Watson, where a male personality and voice are used. Again, this reproduces normative ideas about gender: that women’s place is in the home and men’s is in the professions.

If we want to dig a little deeper, we see a long history of robots being gendered female, from Pygmalion’s statue in Ancient Greece, to Olympia in The Sandman and, more recently, Samantha in Spike Jonze’s Her and Ava in Alex Garland’s Ex Machina. In addition to these fictional representations we have the Sophia robot. They all broadly follow a trend that denotes many things at once: ideas that women are programmable and “other” to men, or that women are not as human as men. But it often comes back to the idea that new technologies are less threatening to the social order if they are characterised as women.

When we think about tech and innovation, it all sounds new; however, we seem to be perpetuating old-fashioned gender stereotypes. Why? How is this possible? What does this say about our society?

Looking historically, we see an even more worrying trend. There has been a critical decline in the number of women graduating in computer science and engineering over the past few decades: in 1993, 28% of computer science graduates in the US were women; today the figure is about 18%. These educational statistics are echoed in the workplace, where the number of women in tech has also fallen over time. A brilliant book by Mar Hicks – Programmed Inequality – shows how, as computing came to be seen as strategic and important work, women were systematically pushed out and the field was re-marketed as men’s work.

At the end of the day, we think of the tech industry as a male domain. Could this be a reason why gender stereotypes are not taken into account? (Maybe they don’t do it on purpose?)

Yes, I am sure a lot of this comes down to implicit bias. But more critically, it shows the effects of not including women in tech design teams. Statistics show that from 1980 to 2010, 88 percent of all information technology patents were filed by male-only invention teams, while only 2 percent were filed by female-only teams. These facts are telling. A recent report from the AI Now Institute put it succinctly: “it is time to address the connection between discrimination and harassment in the AI community, and bias in the technical products that are produced by the community.”

On the other hand, how does this approach to VPAs affect how we see women in society?

One of my critical concerns is the ethics behind VPAs. VPAs are marketed as tools that free you from supposedly superficial work so that you have more time for more important work. But this makes a critical value statement about work traditionally undertaken by women: namely, that it is less important.

How do we sort this out? What would your recommendations be, and who should take the lead on this?

Because gendered VPAs play into broader historical negative gender stereotypes, addressing the issues within the tech industry requires a wholesale approach from society as a whole. However, one of the key things I have advocated for is a quota for women in executive and leadership positions within tech and on tech design teams. This simple step would begin to address many of the issues at hand.

Within your proposals, you talk about a “wholesale reform within the industry” and also about thinking about how AI is “used in context”. What do these two ideas mean in practice?

For “wholesale reform”, see above. In terms of understanding how AI is used in context, this would mean drawing greater attention to the negative ways in which VPAs such as Siri and Alexa are being used – specifically, verbal sexual abuse. A first step is programming VPAs not to play along with this kind of conversation; most are currently programmed to respond to it. Ask Samsung’s Bixby, for example, to talk dirty, and the female-voiced Bixby will reply, “I don’t want to end up on Santa’s naughty list”. Where it is perceived as acceptable to speak to women like this in the virtual world, and to expect a jovial response, this will translate into the real world and cause harm.
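To make the idea concrete, here is a minimal sketch, in Python, of the kind of response routing being described: detect an abusive utterance and return a firm refusal rather than a playful deflection. Everything in it is an assumption for illustration – the pattern list, the handle_utterance function and the canned replies are hypothetical, and real assistants use trained intent classifiers rather than keyword matching – so this is a sketch of the principle, not any vendor’s implementation.

```python
import re

# Illustrative patterns only (hypothetical): a production assistant would use
# a trained intent classifier, not a hand-written keyword list.
ABUSIVE_PATTERNS = [
    re.compile(r"\btalk dirty\b", re.IGNORECASE),
    re.compile(r"\byou('re| are) (my )?(slave|servant)\b", re.IGNORECASE),
]

# A firm, non-jokey refusal instead of a playful deflection.
FIRM_REFUSAL = "I won't respond to that kind of language."

def is_abusive(utterance: str) -> bool:
    """Return True if the utterance matches any known abuse pattern."""
    return any(pattern.search(utterance) for pattern in ABUSIVE_PATTERNS)

def handle_utterance(utterance: str) -> str:
    """Route abusive input to a firm refusal; handle everything else normally."""
    if is_abusive(utterance):
        return FIRM_REFUSAL
    return answer_normally(utterance)

def answer_normally(utterance: str) -> str:
    # Stand-in for the assistant's ordinary dialogue handling.
    return f"(normal response to {utterance!r})"

if __name__ == "__main__":
    print(handle_utterance("Talk dirty to me"))     # firm refusal
    print(handle_utterance("What's the weather?"))  # normal handling
```

The design point is simply that the refusal is checked before any other handling, so the assistant never rewards abusive speech with a jovial response.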