If AI were human, it would be male. 

Yes, many bots and AI agents are cast as female, especially voice assistants. Siri and Alexa have names that identify them as women. That’s because when testing the systems, developers discovered that users were more comfortable with female assistants, perhaps because of a stereotype of women as helpmates.

This stands in stark contrast to IBM’s Watson, whose voice is clearly male. It also matches male speech patterns: short phrases delivered in a self-assured manner. Watson is intended for uses where leadership and authority matter, so a male persona was chosen. Since we want users to interact with AI-enabled products as if they were human, it is not surprising that those products are coded to adopt conventional, stereotyped assumptions about gender roles.


Yet the problem of gender bias in AI goes considerably beyond the user experience and interface we encounter when we interact with AI-powered products. There are three main contributing factors: It starts with talent – who’s coding – and becomes more serious when we consider the lack of algorithmic transparency and the biases in the data.

The diversity gap in STEM fields is generally wide, but it’s especially pronounced for programmers contributing to machine learning and, more broadly, computer science. It starts in education, where 80 percent of AI professors are male, according to research from New York University’s AI Now Institute. Then it trickles down to the workplace. Tech giants like Facebook and Google might be on the cutting edge of AI technology and research, but they are mostly boys’ clubs. Only 15 percent and 10 percent of their AI workforces, respectively, are women. The numbers are even more dismal when it comes to racial diversity.


This matters because the algorithms those employees write end up skewed to favor men, especially white men. I’m a technology investor who focuses on AI; I’m not a neuroscientist, but men and women bring different experiences and perspectives to the problems they solve. Consider the ramifications if algorithms are written only by men: left unchecked, personal biases, assumptions, and stereotypes get built into the algorithms. If all of the engineers look alike and come from similar backgrounds, it’s only natural that the algorithms they create will reflect that homogeneous environment. With so few women in the room, their voices get drowned out by the more dominant group.

This, the AI Now Institute report points out, creates a discrimination feedback loop. With AI now used in so many facets of life, one group receives preferential treatment based solely on the programmed algorithms while another group may be ignored altogether. And because the algorithms are not shown to users, people become victims of bias with no way to detect or challenge it.


Now consider that algorithms require data: “[A]lgorithms are not just crunching numbers through static mathematical models but update their behavior iteratively based on models tuned in response to their experience (input data) and performance metrics,” according to a study from the RAND Corporation. Herein lies the third issue – the blatant biases of historical data. In fact, data biases well beyond gender represent such a challenge that they would warrant an article of their own. We are dealing with a real problem.
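The RAND quote can be made concrete with a minimal sketch (not any production system): a model whose behavior is not fixed in advance but is tuned iteratively in response to input data and a performance metric. Whatever pattern the data carries, the model absorbs.

```python
# Minimal illustration of the RAND quote: behavior is updated
# iteratively from experience (input data) and a performance metric.

def train(examples, lr=0.1, epochs=100):
    """Fit a one-dimensional linear model y = w * x by repeated updates."""
    w = 0.0
    for _ in range(epochs):
        for x, y in examples:
            pred = w * x        # the model's current behavior
            error = pred - y    # performance metric (signed error)
            w -= lr * error * x # adjust behavior in response to the data
    return w

# The model ends up reflecting whatever relationship the data encodes:
print(round(train([(1.0, 2.0), (2.0, 4.0)]), 2))  # learns w ≈ 2.0
```

The point of the sketch is that nothing in the code itself decides what the model will do; the data does.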

Here is a quick example of what can happen:

The data that trains AI to find correlations and “learn” depends entirely on what information is fed into the system. Training a facial recognition system on only white male faces, for instance, will produce a system that cannot reliably recognize women or people of color. When algorithms are based on historical data, they will skew in that direction, as Amazon discovered. The company built a hiring algorithm from ten years of accumulated resumes, with the supposed goal of identifying top candidates in an unbiased way. The problem was that the historical data came predominantly from male candidates and employees, so that was the pattern the AI learned and reproduced in its searches. Clearly, great care must be taken when selecting data sets and training algorithms. But few organizations take such care.
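The hiring example can be sketched in a few lines. This is a hypothetical toy scorer, not Amazon’s actual system, and the keywords are invented for illustration: because group-correlated keywords co-occur with “hired” in the skewed history, the model learns to prefer resumes that resemble the historical majority.

```python
from collections import defaultdict

def fit_scores(resumes):
    """Score each resume keyword by how often it appears in hired resumes."""
    scores = defaultdict(float)
    for keywords, hired in resumes:
        for kw in keywords:
            scores[kw] += 1.0 if hired else -1.0
    return scores

def score(scores, keywords):
    """Total score for a candidate's resume keywords."""
    return sum(scores[kw] for kw in keywords)

# Skewed historical data: past hires came predominantly from one group,
# so a group-correlated keyword co-occurs with being hired.
history = [
    (["python", "mens_chess_club"], True),
    (["python", "mens_chess_club"], True),
    (["python", "mens_chess_club"], True),
    (["python", "womens_chess_club"], False),
]
s = fit_scores(history)

# Two equally qualified candidates; the model prefers the one who
# matches the historical majority.
print(score(s, ["python", "mens_chess_club"]) >
      score(s, ["python", "womens_chess_club"]))  # True
```

No one wrote “prefer men” into this code; the preference arrives entirely through the training data, which is exactly why data selection matters.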

AI cannot become gender neutral (or color- and background-blind, for that matter) until the workforce creating the technology becomes more diverse. The good news is that much of the research about gender biases in AI is coming from women, and this not only puts the issue front and center, but it also shows the value of women as researchers and engineers in the space.

This is only the start. Diversity of all kinds – not just women but people of color and people of various national origins – must be incorporated into the AI and technology workforce. That will unquestionably improve the quality and value of the software. The change to the AI workforce has to begin in the classroom, with more diverse faculty and greater encouragement for women and minorities to major in computer science, machine learning, and AI.


Changing the culture also requires companies to present a more balanced public image. The people doing corporate and research presentations and recruiting in this area are overwhelmingly male. (Only 18 percent of AI C-suite leaders are women.) The stage should be shared more equitably. Women need to be given the opportunity to present the same sort of technical work as their male peers.

Siri and Alexa may be female, but AI remains resolutely male. Unless something is done to address the gender gap in the AI workforce, the consequences will be far-reaching, long-lasting, and harmful to society. In an era when we are seeking to close the gender pay gap and reduce other inequalities, we are setting up a future that endangers much of our progress to date.

As funders, entrepreneurs, technologists, and parents (of both daughters and sons), we should proactively focus on and tackle this area of opportunity. If we do, the outcome will be more beneficial than we can imagine.