A Botched Government Report Should Be a Wake-up Call on AI Hype
The Age
Details
- Date Published
- 5 Oct 2025
- Priority Score
- 3
- Australian
- Yes
- Created
- 6 Oct 2025, 12:21 pm
Description
Beating up management consultants may be something of a national sport these days, but Deloitte Australia’s headache with AI is the latest example of why technology must be carefully managed.
Summary
The article highlights concerns over the use of AI in corporate environments, focusing on a recent incident involving Deloitte Australia, which submitted a government report containing fabricated content due to AI errors. This incident underscores the potential risks associated with AI 'hallucinations'—an occurrence where AI generates seemingly accurate but incorrect information. The report mishap emphasizes the need for robust human oversight and clear governance when leveraging AI in high-stakes contexts, notably in governmental and corporate applications. The narrative serves as a cautionary tale relevant to AI safety and governance frameworks, advocating for stringent checks and balances to mitigate errors and ensure accountability.
Body
October 6, 2025 — 4.07pm

Deloitte Australia, botching a report to a federal government department thanks to AI doing some of the heavy lifting, is a canary in the coalmine for the technology being as big a hazard in the workplace as it is an aid.

Who knew (outside the tech industry and those well-versed in tech lingo) that your average AI assistant was capable of “hallucinating”?

These so-called hallucinations are created as part of the AI’s pattern-matching process, especially when the model has limited or biased training data and lacks real-world understanding of the problem it’s trying to solve. This leads to the AI model guessing and delivering results that sound right but aren’t factually accurate. (Word of warning and point of irony: I used AI as a reference for the above definition.)

The reason AI models resort to guessing when they are short on real information is that they also possess an innate desire to please the one asking the question. And this desire means that an AI model would rather use dubious sourcing and incorrect interpretation to deliver a wrong result than nothing at all.

It’s also what has left Deloitte red-faced and, as reported by The Australian Financial Review, forced it to issue a partial refund to the federal government. The original report, which reportedly cost $440,000 and was created by Deloitte for the Department of Employment and Workplace Relations, contained a completely fabricated quote from a federal court judgment and invented academic references.

A revised version of the document has now been uploaded to the department’s website minus the fabrications and typos. The AI-fabricated references and citations (alongside a smattering of grammatical errors) in the original document were outed by one of the academics quoted in the original report.

So Deloitte picks up an ignominious distinction and the department wins a discount. It would be fanciful to think that this is an isolated case of AI going off the rails.

Beating up management consultants may have become something of an Australian sport these days, but they aren’t the only ones using AI at work. Corporate Australia in particular is wildly excited about the introduction of this productivity-charging technology. It is the new toy that everyone wants to play with before reading the instruction manual, and most fans realise there are plenty of wrinkles that need ironing out.

There is already plenty of evidence of companies adopting AI with an eye to eventually replacing workers doing more process-driven administrative jobs, but some, such as Commonwealth Bank, have learnt the hard way that there is a big gulf between what AI can do in theory and what it delivers in practice.

The adoption of generative AI, in which the model learns from existing data and can then spit out text, videos and pictures when asked a question, is still in its infancy, so mistakes will be made.

In Deloitte’s case, using AI to help put the report together wasn’t the problem; instead, it was the firm’s laxity in getting the report checked by humans before it was stamped customer-ready.

Adding insult to injury, Deloitte markets itself as a firm that can educate its corporate clients on how best to deploy AI. Its glossy marketing material contains the boast that “deploying Artificial Intelligence and Generative AI across an enterprise requires the same level of operational strategy and action it takes to manage a manufacturing line or complex supply chain”. In other words, even the experts in using AI appear vulnerable to tripping up.

There has also been criticism of Deloitte about transparency around its use of AI in the report. The new version reportedly includes an explicit concession in the methodology that generative AI was used for what the firm called “traceability and documentation gaps”.

“There have been media reports indicating concerns about citation accuracies which were contained in these reports. Deloitte conducted this independent assurance review and has confirmed some footnotes and references were incorrect,” the department says on its website.

It makes you wonder just how many worms are in the can being opened by AI. The Deloitte snafu certainly lifts the lid on a couple of big ones, and there will almost certainly be many more such mistakes in the future.

This isn’t an argument against AI, but there is a need for a codified set of checks and balances to be put in place, and it’s probably going to take more unfortunate bouts of hallucinations before we get there.

Elizabeth Knight comments on companies, markets and the economy.