Clearview AI Continues to Assist Australian Police Despite Privacy Concerns

Crikey


Details

Date Published
25 Jan 2024
Priority Score
4
Australian
Yes
Created
8 Mar 2025, 01:04 pm

Authors (1)

Description

A Crikey investigation reveals that the Australian Federal Police has provided case materials that were then analysed using Clearview AI's technology that the agency can't use.

Summary

The article reveals the ongoing use of Clearview AI's facial recognition technology by Australian law enforcement agencies, despite previous privacy violations and public disapproval. Clearview AI's controversial database has been employed by international partners of the Australian Federal Police in child exploitation investigations under Operation Renewed Hope. This raises significant concerns about privacy and legal compliance, as Australian agencies indirectly utilize tools banned within their jurisdiction. The case underscores the complex interplay between global policing needs and domestic privacy regulations, highlighting a gap in governance frameworks capable of addressing such cross-border technology deployments.

Body

Clearview AI is being used to solve Australian police cases despite the privacy watchdog slamming police for their use of the controversial and unlawful facial recognition technology.

After publicly cutting ties with the company and denying that any third parties are using the technology on their behalf, the Australian Federal Police (AFP) has now confirmed to Crikey that it has provided case material to an international law enforcement agency which was later analysed using Clearview AI's technology.

Internal correspondence obtained by Crikey shows that police monitored the issue, discussed the risk to their agency if the use of Clearview AI's technology gained the attention of the Australian media, and planned how to spin its use in a positive light.

Australian police's relationship with the 'world's most controversial company'

Clearview AI is a US tech company co-founded by Australian Hoan Ton-That. It was dubbed the "world's most controversial company" after a 2020 New York Times article revealed that it had built a facial recognition app based on a database of billions of illegally obtained images scraped from the internet.

The company claimed that anyone with a smartphone could use a photograph to instantly find out someone's identity as well as other information about them. Hundreds of law enforcement agencies were given access to the software — including the AFP and the Queensland Police Service (QPS) — as well as other private customers.

Clearview AI and its users soon earned the attention of regulators around the world. Among the many jurisdictions in which it faced penalties or civil lawsuits, Australia's information and privacy commissioner Angelene Falk found in 2021 that Clearview AI had broken the law through acts including the unauthorised gathering of Australians' data.
She delivered a separate finding about the AFP's use of the technology, criticising the agency for breaching Australians' privacy through its use.

While the AFP no longer directly uses Clearview AI, Crikey can now reveal that the AFP's international law enforcement counterparts have used the company's facial recognition technology on case material that the Australian agency provided to Interpol on at least one occasion last year.

Operation Renewed Hope

Between July 17 and August 4 last year, the United States' Homeland Security Investigations (HSI) ran a global victim identification task force called Operation Renewed Hope that was tasked with reviewing unsolved cases of child sexual exploitation. The task force brought together HSI staff and staff from other US agencies, intergovernmental police bodies like Interpol, as well as representatives from more than a dozen other countries' law enforcement agencies, including the AFP and the QPS.

A briefing by QPS Assistant Commissioner Katherine Innes to the state's police minister in June, obtained through a Queensland right to information request, shows that agencies were informed ahead of time that the operation would use "new and emerging technologies" on old cases.
A June executive briefing prepared by a detective assistant sergeant in the AFP's victim identification team, obtained through a federal freedom of information (FOI) request, similarly states that participation in the operation would provide the AFP an "invaluable opportunity for information sharing" about combating child sexual exploitation.

Three days after the task force concluded, Forbes first reported on the operation's existence — and its use of Clearview AI's facial recognition product Clearview ID: "Sources told Forbes that Clearview and other AI tools were used to scan huge caches of child exploitation material captured by HSI as well as Interpol's Child Sexual Exploitation (ICSE) database, which contains more than 4.3 million images and videos of abuse," wrote reporter Thomas Brewster.

Operation Renewed Hope resulted in 311 referrals for victims identified from the ICSE images. This included the identification of one Australian victim who, according to an ABC interview with QPS victim ID analyst and Operation Renewed Hope attendee Scott Anderson, had already gone to police with previous disclosures.

The AFP has contributed case material to the ICSE database in the past. In late 2022, an AFP and Interpol joint announcement about the Australian agency providing $815,000 in funding for the ICSE stated that 860 victims had been identified and 349 offenders had been arrested in Australia as a result of the collaboration. The press release said that the funding would go towards improving the database "through integration of the latest technologies … [including] facial recognition".

An AFP spokesperson told Crikey that the force does "not use Clearview AI or use any other agency to do so on our behalf" and that AFP staff who attend task forces only use approved tools.
This echoes a statement made by Deputy Commissioner International and Specialist Capabilities Command Lesa Gale during Senate estimates in May last year.

But the spokesperson also confirmed that the AFP provides material to Interpol's ICSE, which formed the basis of material analysed in Operation Renewed Hope. This then resulted in further developments in at least one Australian child sexual exploitation case. In short, the AFP shared Australian case material with a third party, and that material was analysed using Clearview AI technology that the agency wasn't allowed to use.

When asked when the AFP became aware that Operation Renewed Hope — which it was a part of — was using Clearview AI, a spokesperson didn't answer directly, instead giving Crikey a general statement that also sidestepped our question about whether the agency believes it should have access to Clearview AI's technology.

"The AFP is committed to using any online tools appropriately, and to carefully balance the privacy and potential sensitivity of data in an online environment with the important role this information can play in [the] investigation of serious crimes, including terrorism and child exploitation," they said.

'Given the use of Clearview ID … there are risks'

Behind the scenes, emails show, Australian police carefully watched the reaction to the use of Clearview AI in Operation Renewed Hope. On August 9, an Australian Centre to Counter Child Exploitation (ACCCE) detective sergeant emailed one of the centre's leadership team, a detective superintendent, about the Forbes article with the subject line "Article re HSI taskforce and use of facial recognition tools on ICSE data". "This article does not reference any international involvement, however does reference the use of facial recognition tools, and that the material was provided from the Interpol ICSE database," they wrote.
"[Redacted] brought it to me this morning, noting it would be relevant to brief up so that we can prepare responses should we be asked for comment."

The detective superintendent thanked them and said they would raise it with their superior: "We can step through any questions he might have," they replied.

Around the same time, QPS was also considering its strategy. In an August 10 email addressed to "Chief", a staff member sent a briefing note in response to a request from 7 News to do an interview about the operation: "Given the use of Clearview ID by the HSI, there are risks in us doing an interview." The briefing note mentions the privacy commissioner's Clearview AI decision. In a section titled "Media Issues", it says that QPS staff will not mention the use of Clearview AI's technology, and that the agency's media team is "assisting with appropriate messaging around this technology to ensure any comments are brand agnostic and that the mention is made of the potential privacy and human rights implications that the technology can bring".

Since Operation Renewed Hope, Australian police have stepped up their advocacy for the use of Clearview AI.
A September article in The Australian reported comments by the AFP's Assistant Commissioner Hilda Sirec and Deputy Commissioner Lesa Gale — who had previously denied in Senate estimates that the AFP used Clearview AI through a third party — in favour of granting access to tools like Clearview AI based on the success of Operation Renewed Hope.

It also featured an enthusiastic endorsement from former AFP ACCCE operations manager Jon Rouse, who made appearances in the Herald Sun, 7 News and the ABC's coverage of Operation Renewed Hope, arguing that an effective ban on Clearview AI meant that police were "working with one arm tied behind their backs".

"The debate over protecting privacy and whether we should be allowed to scrape data is irrelevant to me," Rouse told the Herald Sun.

(Last September, Crikey reported that Rouse, while still working for the AFP, had met with Ton-That, emailed back and forth and arranged for him to brief a meeting of Australian and New Zealand child protection unit leaders to get "some education on the Clearview Issue", all after the privacy commissioner's adverse findings against Clearview AI and the AFP.)

Australian Greens digital rights spokesperson Senator David Shoebridge said he has been concerned for some time about Australian police "essentially contracting out their use of Clearview AI".

"What is especially troubling is the suggestion that victims' images have been uploaded to the Clearview database without consent and without any effective privacy checks," he said in an emailed statement.
"Analysis of Clearview has shown that it has a very real racial bias; if for no other reason, this should prevent the AFP from participating in it."

Shoebridge said this shows that Australia's privacy legislation is inadequate.

"If the federal police can routinely flout privacy protections, even when they have been called out by the privacy watchdog, then surely that's proof of the need for urgent reforms and far clearer protections," he said.

The attorney-general's office did not answer questions about whether the AFP's use, through a third party, of a technology that breached Australia's privacy laws was meeting its legal obligations and community expectations.

University of Melbourne senior lecturer Dr Jake Goldenfein, who has researched facial recognition technology and surveillance, described the use of Clearview AI on AFP material by a third party as an example of law enforcement "arbitraging jurisdictions".

He said it's remarkable that Clearview AI has emerged as the most controversial and objectionable facial recognition technology company but that, despite this, law enforcement agencies around the world still want to use its product.

"There's this big regulatory effort against Clearview AI," he said over the phone.
"The administrative state is ostensibly against this company, but it's not going to disappear because another part of the state is using it."

UNSW criminology professor and child sexual exploitation expert Dr Michael Salter said it's understandable that police would want access to technology like Clearview AI to investigate the most serious crimes, which are often aided by new technologies.

"The tech industry is delivering powerful tools that can be used to abuse kids, but there isn't the same commercial imperative to create tools that can protect them," he said on a call.

"Like before, we're seeing the same dynamics where we don't get the tools to catch criminals or give relief to the victims, while criminals can use it for their own means."