NSFW (Not Safe For Women): How bad data encodes sexism

NB Team
02.12.2019

Let’s begin with a riddle. A car crash injures a father and his son. At the hospital, the father dies, but the boy is rushed to the operating room. On arrival, the surgeon says, “I can’t operate on this boy, because he’s my son.” How is this possible? Because the surgeon is his mother.

Now if you didn’t get this, you’re not alone. If I ask you to picture a coder, a doctor or a mathematics professor, you’re probably thinking of a man. Almost 80% of people automatically do the same, even with seemingly non-gendered terms such as ‘person’.

This is no coincidence; it is by design. Design shaped by significant gaps in sex-disaggregated data: what Caroline Criado Perez has termed the ‘gender data gap’ in her inspiring book, Invisible Women: Exposing Data Bias in a World Designed for Men.

So what has led to this gender data gap, and is there anything we can do about it?

Women: the anomaly

As far back as 340 BC, the Greek philosopher Aristotle characterised the female as a ‘mutilated male’, and our understanding of the differences between women and men has been shaped by such ancient stereotypes ever since. But surely these days, with rigorous, objective methods in place, we can rely on solid scientific data?

Unfortunately, even today, that research is open to bias. From medical textbooks to drug trials, men are treated as the norm, with little consideration for women. Anything falling outside that norm (including half the population) is ‘atypical’.

In medical studies, researchers often remark that it’s too complex to include women (due to their pesky propensity to menstruate), or say they’ll only include women once enough interesting data has been collected from men[1]. This inherent bias creates a vicious cycle: women aren’t studied, so we don’t have sufficient data on distinct female needs.

The trouble is, the way female bodies work is often very different from the way male bodies work. Ignoring this can have dangerous consequences.

Take, for example, heart attacks. The symptoms of a heart attack are chest pain and pain in the left arm, right?

If you’re a man, yes. But only one in eight women will experience these symptoms. A recent study found that the predominant female symptoms are nausea, dizziness, fatigue and breathlessness[2].

A dangerous trend

Because there has been so little research on these ‘atypical’ symptoms until now, misdiagnosis occurs 50% of the time and can cause dangerous delays. The median time to diagnosis for women is 54 hours, compared with 16 hours for men. As a result, a woman’s chance of survival is significantly lower.

Female Viagra launched in 2015, just 17 years after the male version. It quickly became apparent that the drug reacted very differently when combined with alcohol. The manufacturers ran tests, in which they trialled 23 men and two women. For female Viagra. This is not an isolated incident. In the 1960s pregnant women took Thalidomide to combat morning sickness. Whilst it had been proved safe for men, it had catastrophic effects on foetal development. This was something manufacturers knew as early as 1959, yet the drug wasn’t taken off the market until 1962[3].

A lack of research on women’s specific needs, and broad reluctance to use existing data, means women are routinely misunderstood, misdiagnosed and mistreated. This doesn’t just happen in medicine.

Baby, you can’t drive my car

As car manufacturers began to pay attention to safety, they introduced the crash test dummy (CTD) in the 1930s. ‘Sam’, the anthropomorphic model, was – you guessed it – male. Following years of lobbying, a ‘female’ CTD was finally created, by scaling down the male model and putting it in the passenger seat. This means we still don’t have any data about safety for women driving cars. What we do have is data for a scaled-down male who’s called shotgun.

Every automotive design decision since the 1940s has been based on what is safest for men: from seatbelts which don’t accommodate breasts or pregnant bellies, to headrests which don’t offer sufficient protection. Car design does not keep women safe. As a result, women are 47% more likely to be seriously injured in an accident[4], and three times more likely to suffer whiplash. Instead of adapting tests (and products) to women’s bodies, some manufacturers have even suggested that it’s women who need to change – because they “sit out of position”[5].

Change is possible, but only if companies acknowledge that the issue lies with the data that leads design, not with women. Volvo is leading the way by gathering data on women in vehicles, and this is changing the way it designs its cars[6]. It has helped reduce women’s chance of suffering whiplash to the same level as men’s. But other industries are slow to catch up.

Alexa, can you hear me?

Sadly, this trend continues into the digital world. If you’ve had issues with voice-recognition technologies misunderstanding you, you’re probably a woman. A YouGov study found that speech-recognition software is 70% more likely to recognise male speech than female speech[7]. This is because the software that teaches technologies such as Siri and Alexa to understand voices is trained on sex-biased data: the majority (69%) of its learning material comes from male speech recordings.
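For the technically minded, skew like this is easy to make visible before a model is ever trained. Below is a minimal Python sketch that tallies how much of a speech corpus comes from male versus female speakers; the `recordings` list, its values and its field layout are illustrative assumptions, not figures from the YouGov study or any real corpus.

```python
# Minimal sketch: audit the sex balance of a speech training corpus
# before using it. All metadata below is invented for illustration.
from collections import defaultdict

# Assumed format: (speaker_sex, duration_in_seconds) per recording.
recordings = [
    ("male", 12.4), ("male", 8.1), ("female", 9.7),
    ("male", 15.0), ("female", 6.3), ("male", 11.2),
    # ...in practice this would be loaded from the corpus metadata
]

seconds_by_sex = defaultdict(float)
for sex, duration in recordings:
    seconds_by_sex[sex] += duration

total = sum(seconds_by_sex.values())
for sex, seconds in sorted(seconds_by_sex.items()):
    print(f"{sex}: {seconds / total:.0%} of training audio")
```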

Once again, women are blamed for the problem. Tom Schalk, of voice technology company ATX Group, says there would not be an issue if women were ‘willing’ to undergo ‘lengthy training’ in how to speak better[8].

AI programmes are developed using publicly available data, yet existing biases shape that data. In the top ten image search results for doctors, only 30% feature a woman. Most (90%) of the top images for coders are men. None of the first ten images for mathematicians or professors showed a woman.

Yet 45% of the UK’s doctors, 37% of US mathematicians and 33% of UK professors are women. Moreover, coding started as a female profession. So where are these women in the image searches?

Unfortunately, artificial intelligence is only as good as its training data, so it inherits these biases. This has serious real-world implications: algorithms increasingly make decisions about humans, especially in areas such as recruitment.
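To see why biased training data matters, here is a toy demonstration in Python using entirely synthetic data: a simple model is trained on a sample dominated by one group, and its accuracy collapses for the group it rarely saw. The group sizes, signal values and choice of model are all assumptions made for illustration, not a description of any real system.

```python
# Toy demonstration: a model trained mostly on one group can look accurate
# overall while failing the under-represented group. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, signal):
    """Generate a synthetic group; `signal` controls how the feature relates to the label."""
    y = rng.integers(0, 2, n)
    x = signal * (2 * y - 1) + rng.normal(0, 1, n)
    return x.reshape(-1, 1), y

# Training set: 90% group A, 10% group B, with different feature-label links.
Xa, ya = make_group(900, signal=2.0)    # over-represented group
Xb, yb = make_group(100, signal=-2.0)   # under-represented group
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Evaluate each group separately (disaggregated evaluation in miniature).
Xa_test, ya_test = make_group(1000, signal=2.0)
Xb_test, yb_test = make_group(1000, signal=-2.0)
print("group A accuracy:", model.score(Xa_test, ya_test))  # high
print("group B accuracy:", model.score(Xb_test, yb_test))  # far lower
```

The point is not the specific numbers but the pattern: whichever group dominates the data defines what the model treats as ‘normal’.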

In one case, an algorithm written to sort job applicants for a coding position found that successful candidates (and existing employees) often visited a particular website, one filled with overwhelmingly male-oriented content and not a very welcoming place for women. As a result, most female applicants weren’t considered for the role. A similar situation took place with Amazon’s recruitment tool in 2018: the AI taught itself that male candidates were preferable, marking down CVs that included the word ‘women’s’ or mentioned all-female colleges[9].
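Here is a stripped-down sketch of how such a proxy plays out, using a handful of invented candidates rather than any real recruitment data: the screening rule never mentions sex, yet because the proxy correlates with sex, the shortlist does.

```python
# Minimal sketch: a "neutral" screening rule built on a proxy feature
# (visits a particular website) shortlists men and women at different rates.
# All candidates below are invented for illustration.

candidates = [
    # (sex, visits_site_x, qualified)
    ("male", True, True), ("male", True, False), ("male", True, True),
    ("male", False, True), ("female", False, True), ("female", False, True),
    ("female", True, True), ("female", False, False), ("female", False, True),
]

def shortlisted(candidate):
    """The learned shortcut: keep anyone who visits site X, ignoring qualifications."""
    _, visits_site_x, _ = candidate
    return visits_site_x

for sex in ("male", "female"):
    group = [c for c in candidates if c[0] == sex]
    kept = [c for c in group if shortlisted(c)]
    print(f"{sex}: {len(kept)}/{len(group)} shortlisted")
```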

Correcting the biases

These biases are common. Apple co-founder Steve Wozniak recently hit out at Apple Card for treating his wife differently from him: ‘We have no separate bank or credit card, or any separate assets.’ So why offer him over ten times the credit limit of his wife? And why does Google Translate render sentences built on the gender-neutral Turkish pronoun ‘o’ so that ‘o bir muhendis’ becomes ‘he is an engineer’, while ‘o bir hemsire’ becomes ‘she is a nurse’? It all comes back to dodgy data, which either encodes gender bias or excludes women altogether.

Unless the gender data gap is closed, the world around us will carry on being designed for men, and it will become an increasingly unsafe and unfair place for women. Luckily, the solution is straightforward: research women fairly, sex-disaggregate the data, and base decisions on considering men and women equally. Women are not the niche. They are the norm. Consider the 50%.
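For anyone who works with data, ‘sex-disaggregate’ translates into something very concrete. One last toy example, again with invented numbers: the pooled figure below looks healthy, while the per-sex breakdown tells a very different story. The trial records and response rates are assumptions for illustration only.

```python
# Minimal sketch: the same (invented) trial results, pooled vs. sex-disaggregated.
import pandas as pd

trial = pd.DataFrame({
    "sex":       ["male"] * 80 + ["female"] * 20,           # skewed enrolment
    "responded": [1] * 64 + [0] * 16 + [1] * 6 + [0] * 14,  # 80% vs 30% response
})

print("pooled response rate:", trial["responded"].mean())   # 0.70 overall
print(trial.groupby("sex")["responded"].mean())             # male 0.80, female 0.30
```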

Laura Wynn-Owen

References

[1] The Guardian: The female problem
[2] The Independent: Heart attack gender gap: causing needless deaths of women, charity warns
[3] Science Museum: Thalidomide
[4] CityLab: A clue to the reason for women’s pervasive
[5] Kilden: Cars are still designed for men
[6] Volvo Cars: The future of driving
[7] The Telegraph: ‘Sexist’ smart speakers designed by men…
[8] Time Techland: It’s not you, it’s it: Voice recognition doesn’t recognise women
[9] Reuters: Amazon scraps secret AI recruitment tool that showed bias against women