
A man filed a complaint against OpenAI after ChatGPT falsely accused him of killing his children

  • Arve Hjalmar Holmen, a Norwegian citizen, said he asked ChatGPT to tell him what it knew about him, and its response was a horrifying hallucination claiming he had killed his children and gone to prison for the violent act. Given how the AI mixed its false answer with real details about his personal life, Holmen filed an official complaint against ChatGPT maker OpenAI.

Have you ever Googled yourself just to see what the internet says about you? Well, one man had the same idea with ChatGPT, and now he has filed a complaint against OpenAI based on what the AI said about him.

Arve Hjalmar Holmen, from Trondheim, Norway, said he asked ChatGPT the question, “Who is Arve Hjalmar Holmen?” The response, which we will not reprint in full, claimed he had been convicted of murdering his two sons, aged 7 and 10, and sentenced to 21 years in prison. It also claimed, Holmen said, that he had attempted to kill his third son.

None of that actually happened. ChatGPT appeared to have spat out a completely false story that it believed to be entirely true, a phenomenon known as an AI “hallucination.”

In response, Holmen filed a complaint against OpenAI with the help of Noyb, the European Center for Digital Rights, accusing the AI giant of violating the principle of accuracy set out in the European Union’s General Data Protection Regulation (GDPR).

“The complainant was deeply troubled by these outputs, which could have a harmful effect on his private life if they were reproduced or somehow leaked in his community or in his hometown,” the complaint said.

What makes ChatGPT’s response so damaging, according to the complaint, is that it mixed real elements of Holmen’s personal life with complete fabrications. ChatGPT correctly identified Holmen’s hometown, and it was also correct about the number of children, specifically sons, he has.

JD Harriman, a partner at Foundation Law Group LLP in Burbank, California, told Fortune that Holmen may have a difficult time proving defamation.

“If I were defending the AI, the first question is ‘Should people believe that a statement made by an AI is a fact?’” Harriman asked. “There are numerous examples of AI lying.”

Moreover, the AI did not publish its results to a third party. If the man forwarded the AI’s message to others, then he “becomes the publisher and would have to sue himself,” Harriman said.

Holmen would also likely have difficulty proving the negligence aspect of defamation, Harriman said, because “the AI may not qualify as an actor that could commit negligence,” as compared to people or corporations.

Avrohom Gefen, a partner at Vishnick McGovern Milizio LLP in New York, told Fortune that defamation cases surrounding AI hallucinations are “untested” in the U.S., but he mentioned a pending case in Georgia in which a radio host’s defamation suit survived OpenAI’s motion to dismiss, “so we may soon have some indication of how a court will treat these claims.”

The formal complaint asks OpenAI to delete the defamatory output about the complainant, tweak its model so that it produces accurate results about Holmen, and pay a fine for its alleged violation of GDPR rules, which compel OpenAI to take “every reasonable step” to ensure personal data is “erased or rectified without delay.”

“As with all lawsuits, nothing is automatic or easy,” Harriman told Fortune. “As Ambrose Bierce has said, you go into litigation as a pig and come out as a sausage.”

OpenAI did not immediately respond to Fortune’s request for comment.

This story was originally featured on Fortune.com.


