New research on "resume whitening" confirms bias at the resume screening stage is still a major problem.
Researchers found that up to 40 percent of minority job seekers try to avoid being stereotyped by "whitening" their resumes, for example by using an Anglicized version of their name or removing experience with race-related organizations. And unfortunately, the results suggest these job seekers are justified in being cautious.
For black applicants, 10 percent received callbacks for job interviews if they used their black-sounding names and included involvement with black organizations, compared to 26 percent who received callbacks when they used a "whitened" version of their names and removed extracurriculars such as the Black Students' Association.
For Asian applicants, 12 percent received callbacks if they used their Asian-sounding names and included their experience with Asian organizations, compared to 21 percent of those who used "whitened" resumes, for example, "Luke Zhang" instead of "Lei Zhang."
It's no surprise that a recent Deloitte survey reports 68 percent of companies are exploring technology to reduce bias in recruiting.
Overcoming bias in the screening stage requires not just the right technology but an overall strategy as well. Here are three strategic steps you can take to reduce bias during candidate screening.
1) Identify If a Bias Exists During the Screening Stage
Identify if a bias exists in your candidate screening by comparing the demographics of the qualified candidates who apply to your company to the people who get hired. Ideally, they should be similar.
Companies that fall under the Equal Employment Opportunity Commission (EEOC) or the Office of Federal Contract Compliance Programs (OFCCP) (e.g., 100 or more employees, government contractors with 50 or more employees and at least $50,000 in contracts) are required to collect data on the demographics of both their applicants and employees.
To monitor for adverse impact, many companies include a (voluntary) survey asking applicants about their ethnicity, race and gender during the application process. In addition, companies are required to submit an annual report on the race and gender of their employees.
If you find a discrepancy in the demographics of your applicants compared to new hires, the next step is to figure out why.
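As a rough sketch of that comparison, the idea looks like the snippet below. The group labels and candidate records are hypothetical placeholders; in practice this data would come from your voluntary EEO survey responses.

```python
from collections import Counter

def representation_rates(candidates):
    """Share of each demographic group in a list of candidate records."""
    counts = Counter(c["group"] for c in candidates)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Hypothetical survey data: "A" and "B" are placeholder group labels.
applicants = [{"group": "A"}, {"group": "A"}, {"group": "A"}, {"group": "B"}]
hires = [{"group": "A"}, {"group": "A"}]

print(representation_rates(applicants))  # {'A': 0.75, 'B': 0.25}
print(representation_rates(hires))       # {'A': 1.0} -- a gap worth investigating
```

If the two distributions diverge, as they do here, that is the signal to dig into why.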
2) Examine the Reasons for Disqualifying Candidates at the Screening Stage
The best way to assess if a bias exists is to examine why candidates get disqualified at the screening stage. There may be reasons for disqualifying minority candidates unrelated to bias, for example, their work authorization status.
If, however, the qualifications of minority candidates are similar to non-minority candidates and they're still getting rejected at a higher rate, that's a problem. The key here is to have good data collection practices. Minority candidates getting rejected at a higher rate due to vague descriptions such as "not a good fit" likely won't cut it with the EEOC or OFCCP.
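One common way to quantify "rejected at a higher rate" is the EEOC's four-fifths guideline: compare each group's pass-through rate at the screening stage to the highest group's rate, and flag ratios below 0.8. The sketch below is illustrative, not legal advice; the counts are hypothetical, chosen to mirror the 26 percent versus 10 percent callback rates from the study above.

```python
def selection_rates(applicants_by_group, passed_by_group):
    """Fraction of each group's applicants who passed the screening stage."""
    return {g: passed_by_group.get(g, 0) / n for g, n in applicants_by_group.items()}

# Hypothetical counts mirroring the study's 26% vs. 10% callback rates.
applicants = {"group_x": 200, "group_y": 100}
passed = {"group_x": 52, "group_y": 10}

rates = selection_rates(applicants, passed)  # {'group_x': 0.26, 'group_y': 0.1}
best = max(rates.values())
impact_ratios = {g: r / best for g, r in rates.items()}

# Four-fifths rule of thumb: a ratio under 0.8 suggests possible adverse impact.
flagged = [g for g, ratio in impact_ratios.items() if ratio < 0.8]
print(flagged)  # ['group_y']
```

Here group_y's ratio is about 0.38, well under the 0.8 threshold, so its rejection reasons would need documented, job-related justification.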
If, after analyzing your numbers, you're still unsure why minority candidates are getting disproportionately rejected at the screening stage, then you might have an unconscious bias problem on your hands. At this point, you should consider a technology solution.
3) Invest in Technology That Helps Reduce Bias During the Screening Stage
An unconscious bias is a mental shortcut we use to process information and make decisions quickly. These biases are automatic, outside of our awareness, and pervasive: Wikipedia lists more than 180 biases that affect our decision making, memory and social interactions.
Activities like resume screening that require processing large amounts of data very quickly and making decisions about people are especially susceptible to unconscious bias.
These limitations of the human brain are exactly why AI technology is now being applied to recruiting. AI can reduce unconscious bias by pattern matching between employee resumes and candidate resumes rather than using untested qualifications, such as the school someone graduated from.
To further prevent unconscious bias at the screening stage, AI can be programmed to ignore demographic-related information such as candidates' names, addresses or college or university.
Some organizations, such as Deloitte UK and the federal government of Canada, are taking the extra step of "blinding" resumes by removing these details before passing them to hiring managers for interviews.
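A minimal sketch of that blinding step, assuming resumes have already been parsed into structured fields, could look like this. The field names here are hypothetical; a real applicant tracking system's schema will differ.

```python
# Hypothetical field names; a real ATS schema will differ.
SENSITIVE_FIELDS = {"name", "address", "school"}

def blind_resume(resume):
    """Return a copy of the resume with demographic-correlated fields redacted."""
    return {field: ("[REDACTED]" if field in SENSITIVE_FIELDS else value)
            for field, value in resume.items()}

resume = {
    "name": "Lei Zhang",
    "address": "123 Example St.",
    "school": "Example University",
    "skills": ["Python", "SQL"],
    "years_experience": 5,
}
print(blind_resume(resume))
# {'name': '[REDACTED]', 'address': '[REDACTED]', 'school': '[REDACTED]',
#  'skills': ['Python', 'SQL'], 'years_experience': 5}
```

The qualifications a screener actually needs, skills and experience, pass through untouched, while the signals most associated with unconscious bias are masked.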
These technologies and practices are fairly new, but based on the research above, removing these signals could lift minority candidates' callback rates by an impressive 75 to 160 percent. They represent the next big breakthrough in reducing bias and increasing diversity in the workplace.