The Sydney Hospital Using AI to Improve Diagnoses Under Pressure
The Sydney Morning Herald
Details
- Date Published
- 11 Oct 2025
- Priority Score
- 2
- Australian
- Yes
- Created
- 12 Oct 2025, 03:19 pm
Description
The trial is one of few to investigate how the technology is used by doctors – and what they do when AI gets it wrong.
Summary
The article reports on a study conducted across Sydney hospital emergency departments in which an AI-assisted X-ray tool improved doctors' diagnostic accuracy by up to 12%. Lead researchers, including Professor Farah Magrabi, emphasised the significance of the study for showing how effective AI is in the hands of doctors working in high-pressure environments such as emergency departments. Alongside these gains, the study also examined how doctors responded when the AI produced errors. The trial forms part of broader efforts by the NSW AI taskforce to integrate advanced technologies into clinical settings. The article focuses on improving specific healthcare processes rather than on broader AI risks.
Body
By Angus Thomson | October 13, 2025

Sydney doctors were up to 12 per cent better at diagnosing conditions in a milestone trial of an AI-assisted X-ray tool already used in NSW hospitals, which could prevent misdiagnosis in one in every 17 scans.

From GP transcription software to complex screening and diagnostic tools, artificial intelligence is already commonplace in almost every corner of Australia's health system. But few studies have investigated how the tools are used by doctors – and what they do when AI gets it wrong.

[Photo: Professor Michael Dinh leads the research arm of the emergency department at Royal Prince Alfred Hospital in Camperdown. Credit: Wolter Peeters]

"One source of risk with these systems is that they're highly accurate, but they're not 100 per cent," said Professor Farah Magrabi, a Macquarie University expert on AI in healthcare whose independent analysis of a chest X-ray tool developed in Australia was published in the peer-reviewed Emergency Medicine Journal on Monday.

"There are very few studies that are actually evaluating AI systems in the hands of doctors," Magrabi said. "This study is really interesting because it delivers the evidence of what [AI] can do, particularly in a high-pressure ED environment."

Magrabi's team recruited 200 senior and junior doctors working in five Sydney emergency departments to test the chest X-ray tool developed by Australian company Harrison.ai.

Each doctor was given 18 X-ray slides and accompanying patient information representing a range of conditions encountered in emergency departments. These included trauma cases, heart failure, pneumonia, fractures and dislocations, viral infections and cancer-related presentations.

The doctors were assisted by AI on a random selection of half of the slides.
Four slides contained X-rays known to produce false-negative or false-positive AI findings, which helped the researchers assess how the doctors reacted when the tool produced inaccurate information.

[Image: The slides were divided into four panes: (1) chest X-ray viewer; (2) AI findings viewer; (3) description of the patient's complaint, symptoms and observations; and (4) a response panel for recording diagnosis. Credit: Emergency Medicine Journal]

The study found one misdiagnosis could be prevented for every 17 patients assessed using the AI tool. The rate of improvement was highest for doctors with around three years of practice, at 11.8 per cent.

Professor Michael Dinh, a senior emergency physician at Royal Prince Alfred Hospital in Camperdown who co-authored the study, said even a marginal improvement would be hugely beneficial for patients with urgent and life-threatening presentations such as pneumonia, collapsed lungs and trauma injuries.

"In an average shift, we probably read about 30 or 40 of these scans, right? So if it stops us from incorrectly diagnosing two patients in a shift, that's a massive thing for us," Dinh said.

RPA is among several NSW hospitals trialling real-world uses for next-generation technologies through the state's AI taskforce.
Harrison.ai's platform is already used at hospitals in the Hunter region, and by medical imaging companies including I-MED, Harbour Radiology and Sonic Healthcare.

The Australian government's privacy watchdog launched an inquiry last year after an investigation by online outlet Crikey alleged its flagship product, Annalise.ai, was trained on thousands of scans provided by I-MED without express consent from patients.

A subsequent report found I-MED had not breached privacy regulations and that the case was an example of "good privacy practice", but did not rule out future investigations.

Angus Thomson is a reporter covering health at the Sydney Morning Herald.