People Interviewed by AI for Jobs Face Discrimination Risks, Australian Study Warns
The Guardian
Details
- Date Published
- 12 May 2025
- Priority Score
- 3
- Australian
- Yes
- Created
- 13 May 2025, 03:04 pm
Description
Data used to train artificial intelligence does not ‘reflect the demographic groups we have in Australia’, says researcher
Summary
An Australian study led by Dr Natalie Sheard highlights significant risks of discrimination in AI recruitment systems. Because many AI models are trained on datasets heavily skewed towards American demographics, non-native English speakers and people with disabilities may face unfair disadvantages during AI interviews. The study underscores the need for stricter legislation governing AI use in hiring, notably in Australia, where an estimated 30% of employers currently use AI recruitment tools. It could also inform global discussions on AI safety and discrimination, pressing the need for transparency and accountability in AI systems.
Body
Australian research estimates about 30% of Australian employers use AI recruitment tools, with that figure expected to grow in the next five years. Illustration: style-photography/Getty Images

Job candidates being interviewed by AI recruiters risk being discriminated against if they don’t have American accents, or are living with a disability, a new study has warned.

This month, videos of job candidates interacting with at-times faulty AI video interviewers as part of the recruitment process have been widely shared on TikTok.

The use of AI video recruitment has grown in recent years.
HireVue, an AI recruitment software company used by many employers, reported in February that, among 4,000 employers surveyed worldwide, AI use in hiring had risen from 58% in 2024 to 72% in 2025.

Australian research published this month estimates the use is significantly lower – about 30% in Australian organisations – but expected to grow in the next five years.

However, the paper, by Dr Natalie Sheard, a University of Melbourne law school researcher, warns that the use of AI hiring systems to screen and shortlist candidates risks discriminating against applicants, due to biases introduced by the limited datasets the AI models were trained on.

In her research, Sheard interviewed 23 human resources professionals in Australia on their use of AI in recruitment. Of these, 13 had used AI recruitment systems in their companies, with the most common tool being CV analysis systems, followed by video interviewing systems.

Datasets based on limited information that often favours American data over international data present a risk of bias in those AI systems, Sheard said. One AI systems company featured in Sheard’s research, for example, has said only 6% of its job applicant training data came from Australia or New Zealand, and 33% of the job applicants in the training data were white.

The same company has said, according to the paper, that its word error rate for transcription of English-language speakers in the US is less than 10%, on average. However, when testing non-native English speakers with accents from other countries, that error rate increases to between 12% and 22%; the latter rate is for non-native English speakers from China.

“The training data will come from the country where they’re built – a lot of them are built in the US, so they don’t reflect the demographic groups we have in Australia,” Sheard said.

Research participants told Sheard that non-native English speakers, or those with a disability affecting their speech, could find their words not being transcribed correctly, and would then not be rated highly by the recruitment algorithm.

This prompted two of the participants to seek reassurance from their software vendor that it did not disadvantage candidates with accents. Sheard said they were given reassurances that the AI was “really good at understanding accents”, but no evidence was provided to support this.

Sheard said there was little to no transparency about the AI interview systems used – for potential recruits, the recruiter, or the employer.

“This is the problem. In a human process, you can go back to the recruiter and ask for feedback, but what I found is recruiters don’t even know why the decisions have been made, so they can’t give feedback,” she said.

“That’s a problem for job seekers … It’s really hard to pick where liability lies, but absolutely vendors and employers are legally liable for any discrimination by these systems.”

No case of AI discrimination had yet reached the courts in Australia, Sheard said, with any instances of discrimination needing first to go to the Australian Human Rights Commission.

In 2022, the federal merit protection commissioner revealed that 11 promotion decisions in Services Australia in the previous year had been overturned, after the agency outsourced the process to a recruitment specialist that used AI automated selection techniques, including psychometric testing, questionnaires and self-recorded video responses. It was found that the selection process “did not always meet the key objective of selecting the most meritorious candidates”.

Sheard said the returned Albanese Labor government should consider a specific AI act to regulate the use of AI, and potentially strengthen existing discrimination laws to guard against AI-based discrimination.