Are psychometrics subject to machine bias?

Artificial Intelligence, or A.I. (more accurately called machine learning in the vast majority of cases), often influences small aspects of our lives under the radar: recommending shows to watch next on Netflix, suggesting what to buy on Amazon, or making Google searches ever more predictive.

While these are simple examples of how A.I. can make life easier, we are understandably far more cautious about trusting emerging technologies when our future careers are at stake. We all want to be hired on merit, rather than because of some algorithmic quirk.

What is A.I. Bias?

A.I. in recruitment should be fair and objective, since it removes the human, ‘biased’ element from decision making. However, due to a phenomenon known as A.I. bias (sometimes termed machine learning bias or artificial stupidity), this isn’t always the case. If an algorithm is built on code or data that comes from our unfair and prejudiced human world, the machine’s output will reflect that world, making the same erroneous assumptions a human would have made intuitively. And as the machine learns from more biased information, these biases can become more and more extreme.

Hypothetical example

Online advertising uses information (cookies) about you and your past browsing habits to tailor the content you receive, based on trends in what similar people have interacted with. An algorithm designed to show job advertisements to the people best suited to a role will therefore draw on previous data about the type of people who have succeeded in such roles. This means that when a highly paid role is advertised, the algorithm decides who to target based on the type of people already in, or interested in, existing senior roles.

Statistically, women are less likely to hold senior roles due to a number of historical factors, although today many would argue this persists largely because of biases in our society. In this instance, an algorithm might be less likely to target women based on that information. Does this mean the algorithm is biased, or is it just using biased information to make a decision?

The algorithm is just doing its job, making decisions based on the data it receives. But because the input is biased against a certain group, the output is inevitably biased too.

In this case, the problem is exacerbated: women are shown fewer advertisements for highly paid jobs, so they are even less likely to apply for or gain such positions. And so the cycle continues. In this way, A.I. can make the recruitment process more unfair.
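To make that feedback loop concrete, here is a minimal Python sketch. Everything in it is invented for illustration, including the assumed 20% per-round drop-off, so treat it as a cartoon of the mechanism rather than a model of any real ad platform:

# A hypothetical sketch of the ad-targeting feedback loop described
# above. All figures are invented purely for illustration.

def simulate(share_women_in_senior_roles: float, rounds: int) -> None:
    """The 'algorithm' targets job ads in proportion to who already
    holds senior roles, so each round the imbalance compounds."""
    for i in range(1, rounds + 1):
        # Ads reach women only as often as women appear in the data...
        ad_share_women = share_women_in_senior_roles
        # ...and fewer ads mean fewer applications and fewer hires (an
        # assumed 20% drop-off), which then becomes the training data
        # for the next round of targeting.
        share_women_in_senior_roles = 0.8 * ad_share_women
        print(f"round {i}: senior roles held by women = "
              f"{share_women_in_senior_roles:.1%}")

simulate(share_women_in_senior_roles=0.30, rounds=5)

Run it and the share drifts from 30% down towards 10% in just five rounds; the algorithm never ‘decides’ to discriminate, it simply replays the skew it was fed.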

Real life example

A few years back, Amazon was forced to discontinue a recruitment tool used for sifting high volumes of resumes because it disproportionately favoured male applicants. In simple terms, the algorithm scanned the resumes of people hired into those positions over the previous 10 years, and then recommended the candidates it judged to be ‘good’.

However, due to the existing male dominance in the tech industry, the algorithm learned to treat any reference to women, or to being a woman, as a negative attribute, and subsequently disregarded those CVs. The algorithm was eventually amended to correct for this, but as there was no way of knowing whether it would continue to sift on other discriminatory grounds, use of the tool was discontinued.

Amazon claims that no hiring decisions were made based solely on this algorithm, but it goes to show how machine learning bias can present genuine issues for recruiters.

A.I. Bias in assessments

Psychometric soft skills assessments are normative.

In simple terms, this means that a respondent’s results are compared against a wide group of candidates who have completed the assessment before, known as a norm group. Norm-referenced scores are raw scores (what the candidate actually answered) expressed in a way that takes previous respondents’ data into account. The only inputs to our algorithm that influence the outcome of a candidate’s assessment are therefore the way they answered the questionnaire and the way the thousands of previous candidates who make up the norm group answered theirs. The norm group population defines what average, below average and above average look like, so the scoring itself introduces no bias: every candidate is benchmarked against exactly the same group.
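As a rough illustration of norm-referenced scoring in general (a generic sketch, not Clevry’s actual algorithm), here is how a raw scale score might be converted into a z-score and a sten score against a norm group:

import statistics

def norm_referenced_score(raw_score: float, norm_group: list[float]) -> dict:
    """Express a candidate's raw score relative to a norm group.

    The z-score says how many standard deviations the candidate sits
    above or below the norm-group mean; the sten (1-10) is one common
    way of reporting that on personality questionnaires.
    """
    mean = statistics.mean(norm_group)
    sd = statistics.stdev(norm_group)
    z = (raw_score - mean) / sd
    # Sten scores are defined with a mean of 5.5 and an SD of 2,
    # clamped to the 1-10 band.
    sten = max(1, min(10, round(5.5 + 2 * z)))
    return {"z_score": round(z, 2), "sten": sten}

# A hypothetical norm group of previous candidates' raw scale scores.
previous_candidates = [12, 15, 14, 18, 16, 13, 17, 15, 14, 16]
print(norm_referenced_score(raw_score=19, norm_group=previous_candidates))
# {'z_score': 2.19, 'sten': 10}

The point to notice is that nothing about the candidate other than their answers, and nothing about the norm group other than its distribution of answers, enters the calculation.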

I can hear you thinking it… but if the norm group data were biased in some way, wouldn’t that distort a candidate’s results? Well, yes, but we have taken steps to ensure this is not the case. For one thing, we can guarantee that nothing except the pure response data for each scale is fed into our norm group algorithm.

Personality questionnaires are developed to ensure that the scales measure what they set out to measure, and undergo piloting and testing before being rolled out into the recruitment world. Our personality tests are also rationally derived; this means they were developed for a specific use. In our case this was an occupational one, with item content and language based on a business setting.

For cognitive ability tests, there is only one correct solution to each question, and any answer can only be right or wrong, so the way we measure ability isn’t influenced by subjectivity when developing the tool. In other words, the results are already objective, and the purpose of the norm group is simply to aid interpretation and make differences between candidates easier to understand.
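Put another way, scoring a cognitive test is a purely mechanical comparison against an answer key (a hypothetical three-item example below); the norm group only comes in afterwards, to say how that raw score compares with other candidates:

# Hypothetical three-item cognitive test: scoring is purely objective.
answer_key = {"q1": "B", "q2": "D", "q3": "A"}
candidate_answers = {"q1": "B", "q2": "C", "q3": "A"}

# Each item is simply right or wrong; no subjective judgement is involved.
raw_score = sum(candidate_answers[q] == answer_key[q] for q in answer_key)
print(f"raw score: {raw_score}/{len(answer_key)}")  # raw score: 2/3

That raw score could then be passed through a norm-referenced conversion like the earlier sketch to make it comparable across candidates.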

View the output from Clevry assessments by downloading sample reports below.

Therefore, while A.I. may be making other aspects of the recruitment process more unfair through machine learning bias, psychometrics isn’t one of them.

As long as tests are occupationally relevant, measure what they set out to measure, and don’t feed erroneous information into hiring decisions, they will provide some much-needed objectivity to the world of recruitment and selection.

If you’d like some more information about how our assessments can improve your recruitment process then please do get in touch.
