
NSFW (Not Safe For Women): How bad data encodes sexism

Let’s begin with a riddle. A father and son get in a car crash, and are rushed to the hospital. The father dies. The boy is taken to the operating room and the surgeon says, “I can’t operate on this boy, because he’s my son.”

How is this possible? Because the surgeon is his mother.

Now if you didn’t get this, you’re not alone. If I ask you to picture a coder, a doctor, a professor or a mathematician, you’re probably thinking of a man. Almost 80% of people automatically do the same, even with seemingly non-gendered terms such as ‘person’.

This is no coincidence; it is by design. Design led by significant gaps in sex-disaggregated data – or what Caroline Criado Perez has termed the ‘gender data gap’ in her inspiring book, Invisible Women: Exposing Data Bias in a World Designed for Men.

So what has led to this gender data gap, and is there anything we can do about it?

Women: the anomaly

As far back as 340 BC, the Greek philosopher Aristotle characterised the female as a ‘mutilated male’, and our understanding of the differences between women and men has been built on these ancient stereotypes ever since. But surely these days, with the establishment of rigorous, objective research, we can rely on solid data from scientific studies?

Unfortunately, even today research remains open to bias, and women are overlooked. From medical textbooks to drug trials, men are considered the norm, and anything that falls outside that norm (including half the population) is considered ‘atypical’.

In medical studies, researchers often remark that it’s too complex to include women (due to their pesky propensity to menstruate), or they’ll include women if there’s enough interesting data collected from men[1]. This inherent bias in research creates a vicious cycle. Women aren’t considered worth studying, so we don’t have sufficient data on distinct female needs, so no one bothers to study them.

The trouble is, the way female bodies work can often be very different from the way male bodies work, and ignoring this can have dangerous consequences.

Take for example, heart attacks. The symptoms of a heart attack are chest pain and pain in the left arm, right?

If you’re a man, yes. But only one in eight women will experience these symptoms. A recent study has found that the predominant female symptoms are nausea, dizziness, fatigue and breathlessness[2].

Because there has been so little research on these ‘atypical’ symptoms until now, they are often misdiagnosed (around 50% of the time), leading to dangerous delays. With a median diagnosis time of 54 hours for women compared to 16 hours for men, the survival chances for a woman are significantly lower.

When female Viagra launched in 2015, a mere 17 years after the men’s version, it quickly became apparent that the drug reacted very differently when combined with alcohol. The manufacturers quickly ran tests, in which they trialled 23 men and two women. For female Viagra. This is not an isolated incident. In the 1960s pregnant women were given Thalidomide to combat morning sickness. Whilst proven safe for men, it had catastrophic effects on foetal development (something the manufacturers knew as early as 1959, yet it wasn’t taken off the market until 1962)[3].

A lack of data and research on women’s specific needs, and a widespread reluctance to use that data when it is there, means women are routinely misunderstood, misdiagnosed, and mistreated. And it’s not just in medicine.

Baby, you can’t drive my car

The crash test dummy (CTD) was introduced in the 1930s when car manufacturers began to pay attention to safety. ‘Sam’ the anthropomorphic model was – you guessed it – male. After years of lobbying, a ‘female’ CTD was finally created – by scaling down the male model, and putting it in the passenger seat. This means we still don’t have any data about safety for women in cars, though we do have some data if you’re a scaled-down male who’s called shotgun.

Every automotive design decision since the 1940s (and let’s be honest, probably before that) has been made based on what is best and safest for men. From seatbelts which don’t accommodate breasts or pregnant bellies, to headrests which don’t offer sufficient protection, cars simply haven’t been designed to keep women safe. As a result, women are 47% more likely to be seriously injured in an accident[4], and three times more likely to suffer from whiplash. Instead of adapting tests (and products) to women’s bodies, some manufacturers have even suggested that it’s actually women who need to change – because they “sit out of position”[5].

Change is possible if companies acknowledge that the issue lies with the data that leads design, rather than with women. Volvo is leading the way by gathering data on women in vehicles and changing the way it designs its cars[6]. This has helped reduce women’s risk of whiplash to the same level as men’s. But other industries are slow to catch up.

Alexa, can you hear me?

Sadly, this trend continues into the digital world. If you’ve had issues with voice recognition technologies misunderstanding you, you’re probably a woman. A YouGov study found that speech-recognition software is 70% more likely to recognise male speech than female speech[7]. This is because the software used to teach technologies such as Siri and Alexa to understand voices is trained on sex-biased data: 69% of its learning material comes from male speech recordings.
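To see how a skewed training set produces skewed performance, here is a minimal, hypothetical sketch in Python (synthetic data and made-up features, not the actual Siri or Alexa pipeline): when the cues that matter differ between groups and one group dominates the training data, the model learns the majority’s cues best.

```python
# Minimal sketch (synthetic data, hypothetical numbers): a model trained mostly
# on one group's voices recognises that group's speech more reliably.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, informative_feature):
    """Synthetic 'acoustic features': the word spoken (0 or 1) shows up in a
    different feature for each group, standing in for voices that differ."""
    word = rng.integers(0, 2, n)
    X = rng.normal(0, 1, (n, 2))
    X[:, informative_feature] += 2.0 * word
    return X, word

# Training set skewed 69% male / 31% female, echoing the figure above.
X_m, y_m = make_group(690, informative_feature=0)
X_f, y_f = make_group(310, informative_feature=1)
model = LogisticRegression().fit(np.vstack([X_m, X_f]),
                                 np.concatenate([y_m, y_f]))

# Evaluate on balanced, unseen test sets: the under-represented group fares worse.
for label, feat in [("male", 0), ("female", 1)]:
    X_test, y_test = make_group(5000, informative_feature=feat)
    print(label, round(model.score(X_test, y_test), 3))
```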

And again, women are blamed for the problem. According to Tom Schalk, of voice technology company ATX Group, there wouldn’t be an issue if only women were ‘willing’ to undergo ‘lengthy training’ in how to speak better[8].

Artificial intelligence programmes are often trained on publicly available data. But this data is shaped by our existing biases. In the top ten image search results for doctors, only 30% feature a woman. 90% of the top coder images are men, and none of the first ten images for mathematicians or professors featured a woman.

Yet 45% of the UK’s doctors, 37% of US mathematicians and 33% of UK professors are women, and coding started out as a female profession. So where are these women in the image searches?

Unfortunately, artificial intelligence is only as strong as the data it is trained on, which means models trained on biased data inherit that bias. This has serious real-world implications. Algorithms are increasingly being used to make decisions in place of humans, in areas like recruitment.

In one case, an algorithm written to sort applicants for a coding job found that successful candidates (and existing employees) often visited a particular website. But this site was filled with overwhelmingly male-oriented content, and wasn’t a very welcoming place for women to visit. This meant most female applicants weren’t considered for the job. A similar situation happened with Amazon’s recruitment tool in 2018. The AI taught itself that male candidates were preferable, marking down résumés that included the word ‘women’s’ or that came from candidates who had attended all-women colleges[9].
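To make that failure mode concrete, here is a toy sketch (invented résumé snippets and labels, not Amazon’s actual system): a text classifier trained on biased historical hiring decisions learns the word ‘women’ itself as a negative signal.

```python
# Toy illustration (hypothetical data): a model trained on biased historical
# hiring labels penalises the token 'women' regardless of a candidate's skills.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "captain of chess club, python developer",          # hired in the past
    "python developer, hackathon winner",               # hired
    "captain of women's chess club, python developer",  # rejected
    "women's coding society lead, hackathon winner",    # rejected
]
hired = [1, 1, 0, 0]  # biased historical decisions used as training labels

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned weight for the token 'women' is negative: its presence alone
# lowers a candidate's score, because that is what the biased history 'taught'.
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(weights["women"])  # < 0
```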

These biases are common. Apple co-founder Steve Wozniak recently hit out against Apple Card for treating his wife differently to him. ‘We have no separate bank or credit card, or any separate assets’, so why was he offered a credit limit more than 10 times higher than his wife’s? And why does Google Translate give the gender-neutral Turkish pronoun ‘o’ a gender, rendering ‘o bir mühendis’ as ‘he is an engineer’ while ‘o bir hemşire’ becomes ‘she is a nurse’? It all comes back to dodgy data, which either encodes gender bias or excludes women altogether.

Unless the gender data gap is closed, the world around us will continue to be designed for men, and become an increasingly unsafe and unfair place for women. Luckily, the solution is simple: research women fairly, sex-disaggregate your data, and make your decisions from a position of knowledge, one where women are considered just as important as men. Women are not the niche, they are the norm. Consider the 50%.
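For the practically minded, sex-disaggregation can be as simple as one extra grouping step. A small sketch (purely illustrative numbers, loosely echoing the diagnosis-delay figures above) shows how an aggregate statistic hides a disparity that the disaggregated view reveals.

```python
# Minimal sketch of sex-disaggregation in practice (illustrative numbers only).
import pandas as pd

patients = pd.DataFrame({
    "sex":         ["M", "M", "M", "F", "F", "F"],
    "hours_to_dx": [14,   16,  18,  48,  54,  61],
})

# The aggregate figure hides the problem...
print(patients["hours_to_dx"].median())           # 33.0

# ...the sex-disaggregated figures reveal it.
print(patients.groupby("sex")["hours_to_dx"].median())
# F    54.0
# M    16.0
```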

Laura Wynn-Owen

 

References

[1] The Guardian: The female problem
[2] The Independent: Heart attack gender gap causing needless deaths of women, charity warns
[3] Science Museum: Thalidomide
[4] CityLab: A Clue to the Reason for Women’s Pervasive…
[5] Kilden: Cars are still designed for men
[6] Volvo Cars: The future of driving
[7] The Telegraph: ‘Sexist’ smart speakers designed by men…
[8] Techland (TIME): ‘It’s not you, it’s it’: Voice recognition doesn’t recognise women
[9] Reuters: Amazon scraps secret AI recruiting tool that showed bias against women