Queensland AI Traffic Cameras Flagged for Missing Ethical Checks

iTnews


Details

Date Published
25 Sept 2025

Description

Part of mobile phone and seatbelt technology program.

Summary

The image-recognition AI system deployed in Queensland to monitor mobile phone and seatbelt use has been criticized in an audit for not sufficiently addressing ethical risks. While the system reduces the need for human review by 98 percent, the audit found deficiencies in ethical risk management against the state's AI Ethics Framework. Although the Queensland Department of Transport and Main Roads has taken steps to improve system reliability and privacy, it has not completed the full ethical assessment required by the state's AI governance policy. The finding underscores the importance of structured governance frameworks, such as Queensland's foundational artificial intelligence risk assessment (FAIRA), in mitigating ethical risks in AI deployments.

Body

An image-recognition AI system used to assess millions of photos captured of drivers in Queensland has come under scrutiny for not adequately managing potential ethical risks.

The system, part of the state's mobile phone and seatbelt technology (MPST) program, uses AI-supported 'heads-up' cameras developed by Acusensus to detect mobile phone use and seatbelt non-compliance. It was first deployed by the Department of Transport and Main Roads (TMR) in July 2021.

While operationally effective, reducing the number of cases requiring human review by 98 percent, the system was flagged in a new audit for potentially falling short of ethical expectations, particularly in light of the Queensland government’s AI Ethics Framework released in September 2024.

According to the audit [pdf], 114,000 fines were issued in 2024 following human reviews of AI-flagged cases by both the vendor and the Queensland Revenue Office, which administers infringements.

“The MPST program uses image-recognition AI to detect driving offences, which introduces a range of ethical risks,” the auditor’s report said.

“TMR has not yet undertaken a full ethical risk assessment as required by the Queensland government’s AI governance policy.

“This means it does not know whether all ethical risks for the MPST program are identified and managed.”

The report, however, noted that the MPST program was introduced before Queensland issued its AI governance policy.

It also acknowledged that TMR “has implemented controls to support the system reliability and accuracy, protect privacy, enable fair outcomes in the fine adjudication process, and manage its contractual arrangements”.

TMR responded to the audit saying it is building a framework to provide “centralised visibility of its AI systems” over the next 12 months, and that it intends to apply the state's foundational artificial intelligence risk assessment (FAIRA) framework by the end of 2025.
TMR is in the middle of a program to expand the number of MPST units it operates, having signed a $27.4 million contract extension with the ASX-listed Acusensus in December last year.

On the whole, the audit raises concerns that TMR “lacks full visibility” over its AI systems in use. This extends to its adoption of the state government’s own AI assistant, QChat.

According to the audit, TMR has not yet configured any entity-specific prompts for QChat. “Using this safeguard could improve the accuracy of responses from QChat and reduce the risk of users receiving misleading guidance,” the audit stated.

The department responded by saying it planned to introduce monitoring procedures and implement them by December 2025, as well as improve internal AI literacy.