AI algorithms make decisions for you in virtually every sphere of your life: employment, loan eligibility, medical diagnoses, car safety, and consumer product design. Yet the female point of view is largely missing from these algorithms, a crisis affecting more than half of the world's population.

Men primarily choose the data, program the models, and interpret the results. There is an abundance of big data on men, and male data is treated as the universal norm in a world made for men. This is why your phone is too big for your hand, and why your doctor may prescribe you the wrong medication for your body. Women are 47% more likely to be seriously injured in a car accident because car safety testing is based on male data.

Interested in learning more shocking truths about this dangerous gender data gap? Check out the incredibly insightful, heavily researched "Invisible Women: Data Bias in a World Designed for Men" by Caroline Criado Perez.

How can this be, you may ask? There are many reasons, including our long history of minimizing women's voices and contributions. Today, women are still massively underrepresented in the tech industry: only 22% of AI professionals worldwide are women. And while the number of women earning computer science degrees is on the rise, only 38% of them stay in tech.

I co-founded and scaled a software company starting in 2010. Across eleven years of rapid revenue growth, only 15% of our software developers were female.

What can you do to end this crisis? More women need to enter tech and to stay in tech. Women are desperately needed in data science to prevent bias in algorithms, and it starts early, by encouraging young girls to pursue STEM fields. As a start, I enrolled my eight-year-old daughter in a robotics summer camp.

The lack of diversity in tech is dangerous for all of us. This is why I am so passionate about helping close the gender confidence gap and accelerating equality in tech by empowering women. Women in tech need to be supported. The safety of our world depends on it.
