The shape of prejudice to come

Robo job interviewers and AI

The rise of AI in recruitment is a mixed blessing.

It means scanners can sort resumes by keywords and terms (with a high error rate), but it also means they can detect paragraphs that have been ‘cut and pasted’ into resumes by shonky 1300-number resume mills.

There are thousands of cut-and-paste paragraphs, planted by shonky ‘resume writers’, floating around in the resumes of hopeful jobseekers across Australia.

The poor client paid $300 or more for what they thought was an original product reflecting their experience, character, skills and capabilities. They placed their hopes, dreams and ambitions in the hands of a resume mill churning out turgid, plagiarised prose.

They sent it off and never heard from the recruiter again, because the software picked up a cut-and-paste paragraph like the one below:

“A highly qualified and experienced professional known for a combination of focused technical and mechanical skills; leadership, analytical and planning capabilities, and interpersonal strengths. Strong background in Maintenance and Mechanical Operations, Technical Leadership, Stakeholder Engagement, Staff Management, Safety Management and Compliance gained from numerous years of employment history. Possessing highly developed management skills, strong technical aptitude, determination, capabilities and the ability to maintain a safe job site based on legal and company guidelines…”

Blah, blah, blah.

Cut-and-paste resume writers have whole slabs of preformulated text just waiting to be dropped into your resume, for a price.

The resume has failed because the 1300-number writer has failed to do his or her job.

Their eye is on the cash, not on thinking originally and strategically to promote their client to the best of their ability.

Now the scanning machines are sorting them out.
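For a sense of how that sorting might work, here is a minimal sketch of a scanner flagging a recycled paragraph by measuring word-overlap against a library of known templates. The sample text, shingle size and 0.5 threshold are illustrative assumptions, not any vendor's actual method.

```python
# A minimal sketch of boilerplate detection, assuming a library of
# known template paragraphs. Threshold and data are illustrative only.

def shingles(text: str, n: int = 3) -> set:
    """Lower-cased word n-grams ('shingles') from a paragraph."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: set, b: set) -> float:
    """Overlap between two shingle sets: 0 = disjoint, 1 = identical."""
    return len(a & b) / len(a | b) if (a or b) else 0.0

KNOWN_TEMPLATES = [
    "A highly qualified and experienced professional known for a "
    "combination of focused technical and mechanical skills",
]

def is_boilerplate(paragraph: str, threshold: float = 0.5) -> bool:
    """True if the paragraph closely matches any known template."""
    p = shingles(paragraph)
    return any(jaccard(p, shingles(t)) >= threshold for t in KNOWN_TEMPLATES)

print(is_boilerplate(
    "A highly qualified and experienced professional known for a "
    "combination of focused technical and mechanical skills"
))  # True: flagged as cut-and-paste
```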

Not all sunshine and lollipops

AI systems promise to save employers time and money in the recruitment process by using cutting-edge technology, such as CV scanners and vocal assessments, to “classify, rank and score” job applicants.

This means a computer program could be assessing a jobseeker’s application right now, accepting or rejecting it based on its machine understanding before the person ever reaches an interview with a human.
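To make that “classify, rank and score” step concrete, here is a minimal sketch of keyword-based scoring with an automatic cut-off. The job keywords, resume snippets and 0.5 threshold are invented for illustration, not any vendor's real pipeline.

```python
# A minimal sketch of scoring, ranking and auto-rejecting applicants
# before human review. All names, keywords and thresholds are invented.
import re

JOB_KEYWORDS = {"python", "sql", "stakeholder", "leadership"}

applicants = {
    "candidate_a": "Led a Python and SQL reporting team with a stakeholder focus.",
    "candidate_b": "Ten years of warehouse logistics and forklift operation.",
}

def score(resume_text: str) -> float:
    """Fraction of the job's keywords found in the resume."""
    words = set(re.findall(r"[a-z']+", resume_text.lower()))
    return len(JOB_KEYWORDS & words) / len(JOB_KEYWORDS)

# Rank candidates best-match-first; reject those below the cut-off.
for name in sorted(applicants, key=lambda n: score(applicants[n]), reverse=True):
    s = score(applicants[name])
    print(f"{name}: {s:.2f} -> {'advance' if s >= 0.5 else 'reject'}")
```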

New research on algorithm-facilitated discrimination from Dr Natalie Sheard, a lawyer and postdoctoral fellow at the University of Melbourne, has found AI hiring systems may “enable, reinforce and amplify discrimination against historically marginalised groups”.

“There are some serious risks that are created by the way in which these systems are used by employers, so risks for already disadvantaged groups in the labour market — women, jobseekers with disability or [from] non-English-speaking backgrounds, older candidates,” she tells ABC Radio National’s Law Report.

About 62 per cent of Australian organisations used AI “extensively or moderately” as part of their recruitment processes last year, according to the Responsible AI Index.

Yet Australia does not have any specific laws to regulate how these tools operate or how organisations may use them. It was hoped AI would end bias in the hiring process but several well-publicised cases in America have highlighted that the opposite is occurring.

In one example, an AI system developed by Amazon learned to downgrade the applications of jobseekers who used the word “women’s” in their CVs.

The AI tools used in hiring

Dr Sheard interviewed 23 people as part of her research into AI hiring systems. The participants were mainly recruiters who had worked at small, medium and/or large organisations, both private and public, and in a range of industries.

She also spoke to two careers coaches to understand the impact of AI hiring practices on job candidates, as well as a leading Australian AI expert, and two employees of a large AI developer — the director of AI Services and the AI ethics leader.

Her focus was on three aspects of recruitment screening: CVs, candidate assessments (which may include psychological or psychometric tests) and video (“robo”) interviews.

Robo interviews typically involve candidates recording themselves providing answers to a series of questions, which are then assessed by AI. Dr Sheard says there are well-known cases of AI tools initially using “a controversial technique called facial analysis” when analysing these interviews.

“It had a look at the facial features and movements of applicants to assess their behaviour, personality. [For example], it was looking to see [if] they [were] enthusiastic or angry when they spoke to customers.”

In 2019, America’s Electronic Privacy Information Center filed a complaint against third-party recruitment agency HireVue over software it used to analyse video interviews with applicants, arguing the results were “biased, unprovable and not replicable”.

Facial analysis has been scientifically discredited. But Dr Sheard warns there is no law in place to prohibit its use and so “some systems may be still incorporating” it.

How do AI hiring systems impact marginalised candidates?

AI hiring tools can experience data bias, Dr Sheard says, since the systems learn from the information they are fed. Some AI hiring systems are built using large language models, for example, and if they are missing datasets from a disadvantaged group, “they won’t be representative” of the broader population, the academic says.

This is similarly the case if there are biases in the AI system’s training data: the system adopts and reproduces the same biases and discriminatory practices embedded in the data used in its development.

“They’re trained on things that are scraped from the internet. For example, we know … only about 15 per cent of women contribute to articles on Wikipedia. So it incorporates that male view, which can then be brought through into recruitment processes.”

One example of learned gender discrimination occurred at Amazon when it developed an AI hiring model in 2014 based on CVs collected from applicants for software developer positions over a 10-year period.

“That model learnt to systemically discriminate against women because it’s a male-dominated field. [Many of] those CVs came from men, and so the system learnt to downgrade the applications of women who applied for positions through that tool, particularly when they use the word “women’s” in their application. For example, [if] they [wrote they] attended a women’s college,” Dr Sheard says.

“And it also picked up language styles that are used by men, so when particular words were used more typically by men … like executed or actioned, it upgraded those applications.”

The recruitment tool was reportedly found to be sexist and ultimately scrapped.
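To show the mechanism at work, here is a toy reconstruction of that failure mode, not Amazon's actual system: a classifier trained on skewed historical hiring decisions ends up penalising the word “women’s”. All CV snippets and outcomes below are invented.

```python
# A toy sketch of learned gender bias, with invented data: train a
# classifier on skewed historical decisions and inspect the word
# weights it learns. Not Amazon's actual model or data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Historical CV snippets and outcomes (1 = hired). The skew: past CVs
# mentioning "women's" were mostly rejected in this male-dominated field.
cvs = [
    "executed migration project and actioned deployments",   # hired
    "executed trading platform rebuild",                     # hired
    "captain of women's chess club, built compilers",        # rejected
    "women's college graduate with strong coding skills",    # rejected
]
outcomes = [1, 1, 0, 0]

vec = CountVectorizer()          # tokenises "women's" as "women"
X = vec.fit_transform(cvs)
model = LogisticRegression().fit(X, outcomes)

# A negative weight means the model downgrades applications containing
# that word; a positive weight means it upgrades them.
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(f"women    {weights['women']:+.2f}")     # negative: downgraded
print(f"executed {weights['executed']:+.2f}")  # positive: upgraded
```

The same dynamic appears with no explicit mention of gender at all: any word used more often by one group in the historical data, like “executed” or “actioned”, becomes a proxy the model rewards or penalises.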

Put your best foot forward

Malcolm builds expert resumes, cover letters and LinkedIn profiles that make an unbeatable business case, promoting you as a ‘must have’ asset to an employer.