Blogpost

To Minimize Sexism at Work, AI Must Learn to Stop It

Kristina Finseth

Content Writer at Phenom People

Sexism in the workplace has been an important topic this past year. The headlines abound—"Study: Women Valued for Physical Attractiveness" or "We Need More Women to Overcome the Sexist Tech Industry"—but this pervasive issue could soon be addressed, if only in part, with a little help from technology. As artificial intelligence continues to play a larger role in talent acquisition, it has the potential to expose and reduce gender bias.

During a recent webinar, I spoke with Mahe Bayireddi, CEO of Phenom People and artificial intelligence thought leader, about AI's potential to tackle bias by bringing the issue to light: "People are worried and anticipate that machines will be biased," he said. "However, if machines become biased, where do you think they would have learned it from? Bias may exist, but artificial intelligence will bring these biases to the forefront, enabling HR and talent acquisition professionals to figure out its origin and fix it."

Although machines aren't biased at their core, they can learn to be, a problem that must be tackled by the HR managers using the technology. This may seem like an uphill battle, but the upside is that if these machines can learn bias, they can also learn to spot it—and stop it.

Gender Bias Stems From People, Not AI

Artificial intelligence technology can develop biases over time based on the biases of the people that use it.

Since AI uses algorithms to parse data, develop patterns and make predictions, it can pick up on prominent sexism that exists in an organization's sourcing, recruiting and hiring practices.

For example, while reviewing applicants for a sales position, a recruiter may pass over applicants with female-sounding names, even though the artificial intelligence has identified them as top candidates for the position. If this pattern isn't caught, the recruiter's ongoing biased decisions will be collected, analyzed and folded into the formula the system uses to make suggestions for future positions.
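To make that feedback loop concrete, here is a minimal sketch in Python, using entirely hypothetical screening records and an intentionally naive scoring rule, of how a recruiter's biased decisions, once treated as training labels, resurface in the suggestions a system makes for future candidates:

```python
# Minimal sketch (hypothetical data): how biased screening decisions,
# used as training labels, get baked into a model's future suggestions.
from collections import defaultdict

# Each record: (years_experience, inferred_gender, recruiter_advanced_candidate)
# The recruiter passed over equally qualified women, so that bias is in the labels.
history = [
    (5, "F", False), (5, "M", True),
    (7, "F", False), (7, "M", True),
    (3, "F", False), (3, "M", False),
    (6, "F", True),  (6, "M", True),
]

# A naive "model": score candidates by the historical advance rate
# of records sharing the same inferred gender.
advanced = defaultdict(int)
total = defaultdict(int)
for _, gender, label in history:
    total[gender] += 1
    advanced[gender] += label

def suggestion_score(gender: str) -> float:
    """Predicted chance of being advanced, learned from biased history."""
    return advanced[gender] / total[gender]

print(f"Score for female-coded applicants: {suggestion_score('F'):.2f}")  # 0.25
print(f"Score for male-coded applicants:   {suggestion_score('M'):.2f}")  # 0.75
```

Any real system is far more sophisticated than this toy scoring rule, but the dynamic is the same: whatever pattern sits in the historical decisions is the pattern the model learns to repeat.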

To Minimize AI Bias, Test and Train

The good news is that artificial intelligence can be taught to be unbiased. It comes down to organizations adopting an ethical and responsible approach when implementing artificial intelligence into their recruiting practices—regardless of whether they do it on their own or through a third-party vendor.

AI technology should be trained and tested extensively for known forms of unwanted bias before it's put to work. And once it's live, it's critical to test the technology continuously, ensuring that any unwanted behaviors or biases are identified and corrected immediately.
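As one illustration of what such continuous testing can look like, the following is a minimal sketch with hypothetical data: it compares how often the system recommends candidates from each group and flags the gap using the widely cited four-fifths rule of thumb. Real monitoring would cover more attributes, outcomes and statistical checks.

```python
# A minimal sketch of one continuous check (hypothetical data and threshold):
# compare the rate at which the AI recommends candidates from each group,
# flagging possible adverse impact with the common "four-fifths" heuristic.

def selection_rates(recommendations: list[tuple[str, bool]]) -> dict[str, float]:
    """Map each group to the share of its candidates the AI recommended."""
    totals: dict[str, int] = {}
    selected: dict[str, int] = {}
    for group, was_recommended in recommendations:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + was_recommended
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratio(rates: dict[str, float]) -> float:
    """Lowest group rate divided by highest; below ~0.8 warrants investigation."""
    return min(rates.values()) / max(rates.values())

# Hypothetical batch of live AI recommendations: (inferred_gender, recommended)
batch = [("F", True), ("F", False), ("F", False), ("F", True),
         ("M", True), ("M", True), ("M", True), ("M", False)]

rates = selection_rates(batch)
ratio = adverse_impact_ratio(rates)
print(rates)  # {'F': 0.5, 'M': 0.75}
print(f"ratio = {ratio:.2f}" + ("  -> review for bias" if ratio < 0.8 else ""))
```

Running a check like this on a schedule, and treating a flagged ratio as a prompt to investigate rather than as proof of bias, is one practical way to keep a live system honest.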

Bias May Never Fully Disappear

AI adds a lot of value to the recruiting process, and it will continue to shape the future of work, improving over time. When powered by the right data, artificial intelligence can eliminate many of the headaches and pain points that we associate with sourcing, recruiting and hiring the right employees.

Beyond that, though, AI will eventually bring biases to the forefront, revolutionizing the future of work. Organizations will begin to focus more on how to correct human bias, which may minimize sexism. As for eliminating it entirely, we're not quite there yet, but change may come in the not-so-distant future.

