OAIC probing I-MED for handing private medical data to harrison.ai
Crikey
Details
- Date Published
- 23 Sept 2024
Authors (1)
- Cam Wilson
Description
After Australia's largest medical imaging provider I-MED and its private equity owners Permira went to ground, the information commissioner has stepped in to get answers.
Body
Australia’s privacy regulator is probing I-MED Radiology Network over handing medical data to harrison.ai to train artificial intelligence without patients’ knowledge, intensifying pressure on I-MED, which has remained silent amid the growing controversy.

Last week, a Crikey investigation revealed that patients at Australia’s largest medical imaging provider I-MED were unaware their chest x-rays had been given to health technology company harrison.ai to train AI, prompting concerns from politicians and consumer advocacy groups.

Harrison.ai moved to distance itself by telling investors that questions about patient consent and privacy raised by the investigation were “not matters for Harrison to respond to”, a leaked email obtained by Crikey showed.

The company released a statement saying the data it received is de-identified and that it complies with its legal obligations.
Meanwhile, I-MED and its private equity owners, Permira, have not responded to multiple media requests about their partnership with harrison.ai.

Now, the Office of the Australian Information Commissioner (OAIC) is stepping in to determine if I-MED has complied with privacy requirements.

“The OAIC is making preliminary inquiries with I-MED Radiology Network to ensure it is meeting its obligations under the Australian privacy principles in relation to reports it has provided private medical scans to a third-party entity for the purpose of training an artificial intelligence model,” a spokesperson told Crikey.

Under Australian privacy law, the disclosure of personal information is allowed for the purpose it was collected, or for a secondary purpose that would be reasonably expected.

“However, given the unique characteristics of AI technology, the significant harms that may arise from its use and the level of community concern around the use of AI, in many cases it may be difficult to establish that such a secondary use was within reasonable expectations,” they said.

Both companies have said steps have been taken to protect patients’ privacy. I-MED said the data was “anonymised” when announcing the partnership in 2019, and harrison.ai last week said the data had been “de-identified and cannot be re-identified”.

However, an OAIC spokesperson raised concerns that such steps may not fully remove any privacy risk to the individual.

“Entities should be aware that de-identification is context dependent and may be difficult to achieve,” they said.

After harrison.ai’s public statement was sent out on Friday, Crikey sent the company a list of follow-up questions about the state of its partnership with I-MED, when it first became aware of concerns, and whether it was still receiving this data. We did not receive a response.