Scammers Utilize AI to Mimic Queensland Premier in Investment Scheme
7NEWS
Details
- Date Published
- 15 May 2024
- Priority Score
- 2
- Australian
- Yes
- Created
- 8 Mar 2025, 02:41 pm
Description
Texts and calls that appear to be from Miles could be far more sinister. Here’s what you should know.
Summary
The article reports on a new AI-driven scam where criminals use artificial intelligence to replicate the voice of Queensland Premier Steven Miles to trick individuals into investing in fraudulent schemes. This incident highlights the burgeoning risks associated with AI technologies like deep fake voice generators, which can potentially deceive innocent people and make scams more difficult to detect. The case illustrates an emerging challenge for AI safety, as it emphasizes the necessity for vigilant AI governance and robust cybersecurity measures. While the article focuses on a localized (Australian) example, it underscores a broader global issue concerning AI misuse that could lead to severe societal harms.
Body
Scammers are upping their game, turning to artificial intelligence (AI) in a bid to con innocent Australians out of money, and they have hijacked the Queensland premier to do so.

But when scammers used a social media account claiming to be Steven Miles to send a text message to the premier's long-time friend, marketing executive Dee Madigan, their target knew better.

So Madigan decided to play along as the scammers tried to convince her to invest in cryptocurrency. "I like to troll scammers," Madigan told 7NEWS. "This one was particularly funny because Steven Miles is actually a mate of mine."

In a bid to outwit them, she asked them to call and talk her through how to transfer the money. But when her phone started ringing 10 minutes later, she was stunned to hear what sounded like Miles' voice on the other end of the line.

"It actually really sounds like him, but the inflections are strange," Madigan said. She suspects the scammers were either piecing together audio or using AI to replicate the premier's voice.

In a voice message shared with 7NEWS, the fake premier can be heard sounding eerily similar to the real deal. "Sorry for the rush, Dee," the message said. "I'm about to enter a meeting, but I just wanted to talk to you, as I haven't been able to do that since I promised."

In a statement to 7NEWS, Miles called the fake voice "terrifying". "The fake clip of what sounds like my voice is obviously terrifying — for me and for anyone who might accidentally be conned."

Madigan warned that the rise of AI technologies used to impersonate real people, through deepfake imagery and AI voice generators, means scammers could become harder to detect. "People need to be careful of not just voices but also faces, you know the deep fake faces, (it's) really scary what can be done," she said.

If you think you may have been targeted by a scam, contact Scamwatch.