Machine learning without bias: time to solve the gender gap
Everywhere you look, you see artificial intelligence (AI). Smart algorithms are already being used to find your face on social media, assess loan applications and translate excerpts of text instantly. But it doesn’t stop there; soon, machines could take complete control of your car and track your food consumption based on the contents of your fridge.
While AI has the potential to transform our lives for the better, it’s important to be mindful of how these machines are designed. Even in 2017, we still see a significant gender gap in computing jobs: according to new figures from Eurostat, women hold just 16% of ICT specialist roles in the EU, and they remain largely under-represented in tech education as well. Of course, AI reflects the values of those who create it. If the computer professionals behind the machines are predominantly male software engineers – Microsoft researcher Margaret Mitchell has referred to AI as a “sea of dudes” – there is a strong likelihood that ingrained forms of bias will be built into new systems.
So, what can we do to ensure the AI systems and co-bots of the future do not reflect the industry’s gender-diversity shortcomings today and are representative of our communities?
Masculine by design
With many more males than females training computers to act like humans, it’s no real surprise that subtle gender bias is entrenched in the data sets used to teach these skills to machines. As Dr Ileana Stigliani, assistant professor of design and innovation at London's Imperial College Business School, points out, "This could explain why we're seeing sexualised fembots with a view of the world that reflects the social norms of the group who created them – white men."
In turn, these narrow-minded mathematical models are having a major impact on how different communities are served and assessed. Take a recent study at Carnegie Mellon University, for instance, which found that women were served Google ads for high-paying jobs far less often than men (300 times compared with 1,800 times). Similarly, a study by Stanford University discovered that an AI system ranked information about female programmers as less relevant than that of their male counterparts.
What’s being done to address the problem?
While some experts describe a lack of urgency in the industry to monitor and restrict algorithmic bias, small pockets of activity are starting to emerge. For instance, a new AI research consortium has been set up by Amazon, Facebook, Google, Microsoft and IBM to share ideas and formulate best practices, and Apple has recently joined the group, signaling that it will reveal more of its work in the field. Elsewhere, IBM has developed a genderless avatar with its Watson platform that is working with doctors to improve cancer care.
Of course, if AI programs are going to reflect the perspectives of a broader community, data sets must be assembled by both men and women – but how are we to encourage more women into computer science in the first place? According to a recent Microsoft study, fewer than 40% of schoolgirls (aged 11-18) in the Netherlands are considering a career in STEM, highlighting that more must be done to make career prospects in AI appealing to young women. Oracle, IBM, Cisco and Microsoft have come together in the Netherlands to form Platform Diversity in IT (DIT), stressing that schoolgirls need to come into contact with IT studies much earlier so that they can make more informed career choices. At Atos, our diversity program includes a “connecting generations” initiative to demonstrate that technology is fun and exciting, and no longer a “geeky” topic. We invite young girls (aged 8 to 16) – from primary and secondary schools, including the daughters of our employees – to explore technology through a hands-on TechLab experience: programming robots, learning about virtual reality and exploring design thinking.
Nikita Johnson, founder of the all-female-run events firm RE-WORK, puts the lack of diversity within AI down to the fact that there aren’t many “role models for young women and girls to look up to.” This is something Johnson is looking to address by running “Women in Machine Intelligence” networking events, where attendees are encouraged to find peers and mentors in other women, in the hope of growing female representation in the industry.
These examples show that small steps are being taken to tackle gender bias in AI, and it is promising to see the issue being recognized industry-wide. The onus is now on all of us as a community to nurture young female talent in software engineering, to ensure that the machines of the future are wholly representative of our society.
For more information on how technology is shaping the workplace of tomorrow, read our Future of Work report - https://atos.net/en/insights-and-innovation/future-of-work