Neural Notes: Heidi Health CEO warns healthcare AI can ‘degrade’ under pressure

SmartCompany

Date Published
6 May 2026

Description

Heidi Health CEO Dr Thomas Kelly warns healthcare AI tools can “degrade” under pressure as regulation struggles to keep pace with clinical AI.

Summary

This article highlights warnings from Heidi Health CEO Dr. Thomas Kelly regarding the inconsistent real-world performance of clinical AI models, which can show meaningful degradation during periods of high demand. These findings underscore significant safety risks in high-stakes healthcare environments, particularly as AI transitions from administrative note-taking to clinical decision support. The discussion emphasizes that Australian regulatory frameworks are currently failing to keep pace with these non-repeatable model behaviors, suggesting a need for specialized oversight instead of relying on general-purpose models like ChatGPT for medical applications.

Body

Welcome back to Neural Notes, a weekly column where I look at how AI is influencing Australia. In this edition: what happens when doctors start relying on AI tools that don’t always behave consistently under real-world pressure, particularly when regulation is lagging?

Heidi Health co-founder and CEO Dr Thomas Kelly says Australia’s regulatory framework is not keeping pace with clinical AI. In an interview with SmartCompany at Blackbird Ventures’ recent Sunrise event, Kelly warned that the real-world performance of models can vary in ways that could create patient safety risks as AI tools move beyond note-taking.

“Overall, I would say no,” Kelly said when asked whether regulators are keeping up. “The area I think that is being chronically missed is they’re treating models as repeatable… in practice, it’s not true.”

He said performance can fluctuate depending on how and where models are run, including during periods of high demand.

“When we measure and monitor the performance of Heidi on days where it’s particularly busy, like a Monday morning… you see meaningful degradations in performance of the notes, which can be a safety issue,” he said.

The warning comes as Heidi, best known for its AI medical scribe, shifts into a broader healthcare platform, with Kelly arguing the category label no longer captures what the company is building.
The Melbourne-founded company has emerged as one of Australia’s fastest-growing AI startups, reporting over $69 million in annual recurring revenue earlier this year after a rapid global expansion across hospitals, clinics, and health systems. Blackbird even described Heidi as the fastest-growing company in its portfolio since Canva.

Heidi raised $98 million in late 2025 in a funding round led by Point72 Private Investments, valuing the company at roughly $704 million. That followed a $10 million Series A round back in 2023.

Heidi Health goes beyond note-taking

Kelly says ‘AI scribe’ is now too small a description for what the company actually does. “Even on just the scribing part, it underestimates how hard it is to effectively write like a doctor.”

Heidi’s core product, which transcribes consultations into clinical notes, has been widely adopted by clinicians looking to reduce administrative workload. Kelly said the initial appeal remains straightforward.

“It’s a no-brainer,” he said. “Patients end up with a better visit… doctors have more cognitive load, so they can focus on more issues.”

But Heidi is now expanding into other parts of the clinical workflow, including coding and organisational tools. It has also released an “Evidence” product, which surfaces relevant medical research during consultations.

“We’ve already had some amazing stories of people who stopped themselves [from] doing something that would have harmed the patient,” Kelly said, referring to clinicians using the tool to double-check decisions against guidelines.

Heidi has also expanded beyond general practice in recent years, rolling out across veterinary clinics, pharmacy networks, and major hospital systems in Australia and overseas. Kelly said Heidi’s longer-term focus is increasingly centred on areas where clinicians face the highest administrative burden.
“The areas that interest us most are where the administrative weight is highest, and the human cost is clearest,” he said. That can include chronic disease management, pre-visit preparation, and patient communications. “Anywhere a clinician is spending time on process instead of on the person in front of them.”

This shift toward decision-adjacent support is part of what is raising new regulatory and safety questions around AI in healthcare.

‘ChatGPT shouldn’t be allowed’ in clinical use

Kelly drew a distinction between clinical AI tools and general-purpose models, arguing healthcare applications require a fundamentally different standard.

“AI companies in healthcare need to build their own models,” he said. “ChatGPT shouldn’t be allowed to be used if you’re doing evidence retrievals.”

Kelly’s comments come as OpenAI increasingly pushes into the space through products like ChatGPT Health, which launched in Australia earlier this year. While OpenAI has positioned the product as informational rather than clinical, questions remain around how even ostensibly informational tools intersect with existing healthcare regulation and accountability frameworks.

Kelly said general-purpose models are not designed for clinical reliability, and warned that variability in performance could have real-world consequences as clinicians come to rely more heavily on these tools.

“My concern is, as these tools get closer to care, it requires the regulators and us as industry partners to help them understand the dangers,” he said.

While Heidi currently sits outside some medical device classifications in Australia because of its administrative positioning, Kelly said the company expects regulatory frameworks to evolve as AI tools move closer to delivering care.
He noted Heidi is already registered as a Class I software-as-a-medical-device product with the UK’s MHRA and said the company is operating with “the kinds of systems you would expect from a regulated product” ahead of tighter oversight.

Moving AI away from the cloud

Part of Heidi’s expansion has been bringing more of its technology stack in-house, including running its own transcription and generation models. The company has also moved into hardware with its “Remote” device, which Kelly said is partly about control and consistency. The wearable microphone, launched earlier this year, forms part of Heidi’s broader push toward tighter control over reliability, compliance, and on-device processing in clinical settings.

“Overwhelmingly, it’s just about reducing the number of sessions that clinicians lose or have problems with,” he said.

Longer term, he foresees a broader change in how healthcare AI is deployed.

“I actually think it will make sense for clinicians and organisations to have a lot of what Heidi does stay within their four walls,” he said. “It’s almost like going away from [the] cloud.”

That approach, he said, could allow organisations to maintain tighter control over performance, compliance, and patient data.

AI consent at the doctor and commercial pressure

As AI tools become more embedded in clinical workflows, questions around patient consent are becoming more complex and frequent. Last year, SmartCompany reported on a patient who was refused care after declining the use of an AI scribe during an appointment for their child, raising questions about whether consent remains meaningful as these systems become entrenched in healthcare workflows.
Kelly said patients should retain the ability to opt out, and warned against making such tools mandatory.

“I don’t think blanket mandates are sensible,” Kelly said.

At the same time, he drew a firm line on commercial influence in the doctor’s office.

“No, and that’s a hard line for us,” he said when asked whether Heidi would ever consider selling patient data. “The moment you introduce a commercial incentive into the consult, even if it’s disclosed, you create a tension between what’s best for the patient and what’s best for the business.

“We don’t sell patient data because that erodes the trust that clinicians and patients place in the system. That’s not a position we’ll revisit as we scale.”

Heidi’s cautious path to automation

While Heidi is exploring more advanced use cases, including triage, Kelly said the company is deliberately holding back on deploying higher-risk products and features for now.

“Fully automated AI triaging is the clearest example,” Kelly said. “The technical case for some of these capabilities is closer than people think. That’s exactly why we’re deliberate about not rushing them.”

Kelly said Heidi has developed an internal ethics framework to determine not just what it can build, but what it should build.

“The question we ask isn’t ‘can we build it’, it’s ‘should we, and is the environment around it ready?’” he said. “That means the clinical evidence, the regulatory frameworks, the liability structures, and the trust that clinicians and patients need to have in these systems before they go anywhere near real care.”

If the technology continues to evolve as expected, Kelly said the long-term goal is not to make healthcare more technological, but in some ways, less.

“If we get this right, technology fades into the background,” he said.
“The consult becomes about the relationship again.”