Public Sector AI Hiring Practices Under Scrutiny in the Australian Public Service

The Canberra Times


Details

Date Published
8 Dec 2024

Description

AI in recruitment could overlook qualified professionals. Public sector committee examines AI's role and pitfalls.

Summary

The article highlights the scrutiny faced by AI-assisted recruitment processes within the Australian Public Service (APS). Concerns have been raised about AI systems erroneously deeming experienced candidates as unsuitable due to inadequate algorithmic evaluation. This has been substantiated by the Merit Protection Commissioner's report, detailing multiple overturns of recruitment decisions. The article underscores the potential risks AI poses by unintentionally favoring candidates familiar with generative AI tools like ChatGPT, and outlines criticisms from the Community and Public Sector Union regarding current recruitment processes disadvantaging diverse groups. These discussions feed into broader debates about AI governance and the need for responsible use of AI in public sectors, with implications for policy adjustments aimed at safeguarding merit-based hiring practices.

Body

AI systems used by Services Australia to screen candidates for roles in the mega department rated workers with a proven track record unsuitable, workers' representatives have told a parliamentary committee.

Speaking before the joint parliamentary committee on public accounts and audit (JCPAA), Community and Public Sector Union deputy secretary Rebekah Fawcett said AI tools were making the wrong decisions.

"We are aware, for example in Services Australia, workers with a proven track record being rated unsuitable and ruled out for promotion or permanency because their video recording or written application failed to use language that the algorithm was searching for," Ms Fawcett said.

The incident Ms Fawcett referred to was detailed by the Merit Protection Commissioner in its 2021-22 report. The independent umpire for recruitment decisions in the APS found an AI-assisted bulk recruitment process did not pick the right people for the job. That year, the commissioner overturned 12 decisions, 11 of which came from a single bulk recruitment round.

"The increased number of overturns indicated that the selection process was not always meeting its key objective, which put simply, was to identify and select the most meritorious candidates for the roles advertised," the report said.

A spokesperson for Services Australia said that while the agency used AI in recruitment, it did not make decisions.

"All recruitment decisions, including applicant suitability, are made by a delegated staff member in line with APS merit principles," the spokesperson said. "We don't use AI-assisted tools to screen resumes or assess video interviews."

The agency had updated its processes after the findings of the Merit Protection Commissioner.

Ms Fawcett also said AI used in recruitment processes for public sector roles preferred candidates who had themselves used AI tools for their applications.
"Anecdotally we're aware of AI tools preferring candidates who use ChatGPT for their applications and candidates using [generative] AI to respond to questions in video interviews and other online processes."

The Services Australia spokesperson said AI tools assisted in assessing applications.

"When recruiting to fill thousands of jobs, we engage with service providers who use a range of tools, including some that are AI-assisted, to help efficiently and consistently assess candidate applications," the spokesperson said. "Due to the volume of applications, not using these tools would mean a significantly longer and more expensive recruitment process. These tools are extensively trained and quality checked by our staff, tailoring them to our needs and ensuring they are effectively and accurately assessing candidates against our criteria and are mitigating bias."

The JCPAA is pursuing an inquiry into the use of artificial intelligence by public sector entities. In September, the Digital Transformation Agency published the first policy for the responsible use of AI in government. While the policy does not explicitly mention recruitment, it sets out broad principles and some requirements for the use of AI, including that APS staff must be able to explain, justify and take ownership of advice and decisions when using AI.

While this guidance is relatively new, the use of AI in recruitment is already established in the APS. According to the Merit Protection Commissioner's 2022 guidelines, nearly a quarter of surveyed APS agencies had used AI-assisted and automated tools in recruitment processes in the previous 12 months.

The Merit Protection Commissioner does not rule out using AI in recruitment but warns that although there may be benefits, there are also pitfalls.

"Incorrect or negligent use of AI-assisted and automated recruitment tools can impede the operation of a merit-based recruitment process. Agencies should exercise care when engaging these tools, in order to uphold the merit principle."

More recently, the CPSU has hardened its position on the use of AI in recruitment. In October, the CPSU published a survey of 1800 Commonwealth bureaucrats which found 85 per cent of respondents were concerned about the use of AI in recruitment and promotion decisions. The survey also found staff were still dissatisfied with the way Services Australia uses AI in recruitment.

Of those surveyed, 640 worked for Services Australia, a third of whom said they had participated in a recruitment process, including for a promotion, that included the use of AI. Two-thirds of that cohort, or about 137 respondents, said they had experienced issues with the process.

Staff responses in the 2024 CPSU recruitment survey described what this meant in practice.

"Staff with 20+ years experience cannot get through the current recruitment process. It devalues and demotivates good staff," one said.

"People are able to 'cheat' the process by using AI to answer questions. Fundamentally, there is an obvious lack of comprehension of what Services Australia do and the questions asked," another wrote.

The Services Australia spokesperson said candidates could request feedback.

"Candidates employed at the APS5 level or lower can ask for a review or appeal a promotion decision when applying for a role up to the APS6 level."

CPSU national secretary Melissa Donnelly said AI should be banned from making recruitment decisions and called for outsourced recruiting to be brought back in-house.

"Outsourced recruitment is not meeting the needs of agencies and having an adverse impact on individual employees who are rejected for jobs they are more than qualified for or left sitting in merit pools for months on end," Ms Donnelly said. "People putting themselves forward for a career in the public service deserve to be treated better than this."
In her annual APS Reform speech, Minister for the Public Service and ACT senator Katy Gallagher said more work needed to be done to ensure the APS reflected the broader Australian community and to make it an employer of choice.

"I think that in every area of building a diverse workforce, there is more to do," she said.

While Senator Gallagher said the APS as a whole met its gender and cultural and linguistic diversity targets, this was not the case in the senior ranks.

"There is something not right when we broadly reflect the culturally diverse nature of our population, but then something happens at a senior level where people aren't going through," Senator Gallagher said. "That's something structural there and we need to deal with that."

Ms Donnelly said AI-assisted recruitment processes disadvantaged these very groups.

"We are also concerned that current practices significantly disadvantage groups of people, including culturally and linguistically diverse Australians, Aboriginal and Torres Strait Islander people, and people with disability."

This concern was echoed in staff responses to the CPSU survey.

"There's a lack of consideration to tailor the recruitment process to accommodate and consider needs of people with disability, neurodivergence, CALD and First Nations background," a Department of Employment and Workplace Relations staff member said.

"[Recruitment firm] Chandler McLeod is a US company and didn't target their questions to an Australian market. I was insulted as an Indigenous person when at the end of the recruitment they had a tick list of what background you were from and didn't include Aboriginal/Torres Strait Islanders," a Services Australia staff member said.