
This article was published on January 21, 2021

New AI tool detects hiring discrimination against ethnic minorities and women

The system uses supervised machine learning to analyze the search behavior of recruiters on an employment website.


Image: Gustavo Fring from Pexels

Research shows that various industries have big ethnic and gender pay gaps, but the extent to which discrimination affects these inequalities is tricky to assess.

A new AI tool developed at the London School of Economics has shed some light on how recruitment prejudices influence these outcomes.

The system uses supervised machine learning algorithms to analyze the search behavior of recruiters on employment websites.

The researchers applied the algorithms to the online recruitment platform of the Swiss public employment service.


The tool used data from 452,729 searches by 43,352 recruiters, 17.4 million profiles that appeared in the search lists, and 3.4 million profile views. The researchers then analyzed how much time the recruiters spent looking at each profile, and whether or not they decided to contact a job seeker.

They found that recruiters were up to 19% less likely to follow up with job seekers from immigrant and ethnic minority backgrounds than with equally qualified candidates from the majority population.
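The headline figure is a relative gap in contact rates between equally qualified groups. As a minimal sketch of how such a gap is computed (using hypothetical field names and toy data, not the study's actual dataset or its full statistical controls):

```python
# Sketch: relative contact-rate gap between two groups of profiles
# shown in recruiters' search results. Data and field names are
# illustrative only.

def contact_rate(records, group):
    """Share of profiles in `group` that a recruiter contacted."""
    rows = [r for r in records if r["group"] == group]
    return sum(r["contacted"] for r in rows) / len(rows)

def relative_penalty(records, minority, majority):
    """Relative shortfall in the minority group's contact rate."""
    return 1 - contact_rate(records, minority) / contact_rate(records, majority)

# Toy search log: one row per profile appearing in a results list.
log = (
    [{"group": "majority", "contacted": 1}] * 100   # 100 of 500 contacted
    + [{"group": "majority", "contacted": 0}] * 400
    + [{"group": "minority", "contacted": 1}] * 81  # 81 of 500 contacted
    + [{"group": "minority", "contacted": 0}] * 419
)

print(relative_penalty(log, "minority", "majority"))  # prints 0.19...
```

In the actual study, candidates are compared within the same search list and matched on qualifications, so the real estimate comes from a regression-style analysis rather than a raw rate comparison like this one.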

The study also showed that women experienced a penalty of 7% in professions that are dominated by men, while the opposite pattern was detected for men in industries that are dominated by women.

“Our results demonstrate that recruiters treat otherwise identical job seekers who appear in the same search list differently, depending on their immigrant or minority ethnic background,” said study co-author Dr Dominik Hangartner. “Unsurprisingly, this has a real impact on who gets employed.”

Interestingly, the level of bias varied at different times of the day. Just before lunch or near the end of the workday, recruiters reviewed CVs more quickly, leading immigrant and minority ethnic groups to experience up to 20% higher levels of discrimination.

“These results suggest that unconscious biases, such as stereotypes about minorities, have a larger impact when recruiters are more tired and fall back on ‘intuitive decision-making’,” said Dr Hangartner.

The researchers believe the bias can be reduced by redesigning recruitment platforms to place details such as name and nationality lower down the CV. Their tool could also help by continuously monitoring hiring discrimination and informing approaches to counter it.

You can read the study paper in the journal Nature.
