I work in AI – we need to deal with how biased technology is against women

The tech world is still dominated by men, which means that those in charge of programming machines are enforcing a biased status quo that has to change – and swiftly, writes Tabitha Goldstaub

Tuesday 09 March 2021 08:40 EST
‘My enforced bedrest brought home how gender, AI technologies and the pandemic had collided over the last year’ (Getty Images/iStockphoto)

Last December, my partner had a migraine. He was laid up in bed for 24 hours and, although he had no other symptoms, took a Covid-19 test just in case. To our surprise, it came back positive, and, as many people have experienced, I had to shield myself and our 16-month-old. As well as handling my concern for my partner, I was suddenly flying solo with our son and trying to work full time.

A few days later, I came down with the virus. I was sicker than I’d ever been and stayed that way for 10 days. By then, luckily, my partner had recovered and could take over looking after our little boy, and I was able to take time off my work in the AI sector without any issues.

I was lucky: not every woman has that kind of support. My enforced bedrest brought home how gender, AI technologies and the pandemic had collided over the last year, not just in my world, but for other women too – and in ways they won’t yet have seen. 

I am on a mission to talk to women and girls about the way AI technologies are already impacting our lives and how they can embrace the coming change by preparing for it, especially in the world of work. I want them to know that we will be working alongside AI, and that many aspects of jobs are going to be assigned to machines – which is not as scary as it sounds, because machines can’t do everything that humans can do! They can’t empathise, care, or evaluate in the same way we can.

Part of that preparation is understanding that AI-driven machines (from the algorithms we recognise, such as those behind Netflix or Facebook, to lesser-seen systems used in healthcare or in deciding mortgage applications) learn from data that humans feed into their systems.

And, despite the proportion of women slowly creeping up to 20 per cent, the tech world is still dominated by men. This means that those in charge of programming machines (consciously or not) are enforcing, or even exacerbating, a biased status quo which many of us are trying to escape or eradicate – such as the gender pay gap, or the idea that a woman’s place is in the home.

To ride roughshod over some complex stuff – this happens through a mix of data sets that underrepresent women, historical biases encoded in that data, and modelling choices skewed by the assumptions of mostly male developers. In this way, the underrepresentation of women, and the lack of diversity in the tech world more generally, produces a kind of feedback loop that can compound existing structural inequalities in the technologies we use.

So, lying in bed sick with Covid, I began to reflect on what I already knew about gender bias in AI, and the disproportionate economic and social impact the pandemic has had on women’s lives. Women are on the front line of the pandemic: they make up the majority of the healthcare and social care workforce.

As well as holding high-risk jobs, women are more likely to see their jobs put at risk. More women than men are employed in insecure work, such as zero-hours contracts, and women’s jobs are more vulnerable simply because women predominantly work in the sectors hit hardest by the pandemic.

Add the unpredictable cycle of schools and nurseries opening and closing to this, and you have an increase in unpaid care work – responsibility which has mainly fallen to women. This makes working from home even more difficult, even for those able to do their jobs digitally.

I believe in the potential of AI as a force for good, but it’s important to stay vigilant. Knowing that women have suffered disproportionately during this pandemic and that AI technology can also be biased against women, what happens as companies increase their use of AI during the crisis, driven by the need to automate?

In the days to come, there will be a huge shift in social behaviour as tracking our own health becomes the new normal and an essential part of protecting women in the future labour market. We’ll be encouraged to do this not in order to get fit or understand fertility, but because health data will identify vulnerability to the virus, as well as tracking whether we have had symptoms of Covid-19 or any future viruses.

But health tracking also poses ethical problems for women. We know that women between the ages of 25 and 49 are penalised in the workplace, due to bias held by employers who deem them a “maternity flight risk”. Of course, it’s illegal to discriminate on the basis of fertility or motherhood – or any aspect of a person’s health – but it’s important to consider how such data might be used, even unconsciously, against women.

So, it’s imperative that, as we go forward and the systems we use become increasingly sophisticated, we track safely and maintain privacy, so that any data collected doesn’t further divide citizens into categories or classes.

Tabitha Goldstaub is a British tech entrepreneur who specialises in communicating the impact of artificial intelligence. Her book, How To Talk To Robots: A Girls’ Guide To a Future Dominated by AI (4th Estate), is out now. She is also presenting at WOW (the Women of the World UK Festival), online until 21 March
