Norwegian Man Files Complaint Against ChatGPT for Falsely Claiming He Killed His Sons
ABC News
Details
- Date Published
- 20 Mar 2025
- Priority Score
- 3
- Australian
- No
- Created
- 22 Mar 2025, 11:16 am
Description
Digital rights group Noyb says the chatbot generated inaccurate personal data, violating EU privacy rules.
Summary
The article covers a complaint by Arve Hjalmar Holmen, a Norwegian man, against the AI chatbot ChatGPT. The chatbot erroneously generated information claiming he was convicted of murdering his children, exposing serious flaws in AI data handling and its potential to produce harmful outputs or 'hallucinations'. The incident draws attention to the regulatory challenges and risks associated with generative AI, emphasising the need for robust governance frameworks, especially regarding data accuracy and privacy under EU rules. It underscores the necessity for organisations like OpenAI to improve model reliability and address ethical concerns in AI development to mitigate the serious reputational damage false information can cause.
Body
Norwegian man files complaint against ChatGPT for falsely saying he killed his sons

By Hanan Dervisevic
Topic: Artificial Intelligence
Fri 21 Mar 2025 at 2:22am

Generative AI app ChatGPT produced false and damaging information about a Norwegian man. (AP: Michael Dwyer)

A Norwegian man has filed a complaint after artificial intelligence (AI) chatbot ChatGPT falsely claimed he was convicted of murdering two of his children.

Arve Hjalmar Holmen was given the false information after he used ChatGPT to ask if it had any information about him.

The response he got back included: "Arve Hjalmar Holmen is a Norwegian individual who gained attention due to a tragic event.

"He was the father of two young boys, aged 7 and 10, who were tragically found dead in a pond near their home in Trondheim, Norway, in December 2020."

Mr Holmen said the AI hallucination of him was damaging. (Supplied: Noyb)

However, not all the details were made up. The number and gender of his children were correct, as was his hometown, suggesting the chatbot did have some accurate information about him.

Digital rights group Noyb, which filed the complaint to the Norwegian Data Protection Authority on his behalf, claimed the answer ChatGPT gave him was defamatory and breaks European data protection rules.

It said Mr Holmen "has never been accused nor convicted of any crime and is a conscientious citizen".

ChatGPT presents users with a disclaimer at the bottom of its main interface saying the chatbot may produce false results and to "check important info". But Noyb data protection lawyer Joakim Söderberg said that is insufficient.

"You can't just spread false information and in the end add a small disclaimer saying that everything you said may just not be true," Mr Söderberg said in a statement.

Demand for OpenAI to be fined

Noyb asked OpenAI to delete the defamatory output and fine-tune its model to eliminate inaccuracies. It also asked that an administrative fine be paid by OpenAI "to prevent similar violations in the future".

Since the incident in August 2024, OpenAI has released a new version of its chatbot, GPT-4.5, which reportedly makes fewer mistakes. However, experts say the current generation of generative AI will always "hallucinate".

What is AI 'hallucination'?

AI "hallucination" is when chatbots present false information as fact.

A German journalist, Martin Bernklau, was falsely described as a child molester, drug dealer and con man by Microsoft's AI tool, Copilot.

In Australia, the mayor of regional Victoria's Hepburn Shire Council, Brian Hood, was wrongly described by ChatGPT as a convicted criminal.

And last year, Google's AI Gemini suggested users stick cheese to pizza using glue and said geologists recommend humans eat one rock per day.

In a statement, Mr Holmen said this particular hallucination was very damaging.

"Some think that there is no smoke without fire.

"The fact that someone could read this output and believe it is true, is what scares me the most."