Technology
Danish Kapoor

After a 13-year-old girl’s suicide, Character AI is sued

The impact of artificial intelligence-based chat applications on young people in the USA is once again on the agenda. The suicide of 13-year-old Juliana Peralta has been taken to court by her family. The family argues that the Character AI application has serious security deficiencies. The case stands out as the third such example, following a similar suit in Florida in 2024 and the recent charges against OpenAI.

Juliana’s family says that during periods when their daughter felt lonely, she turned to a Character AI chatbot. Allegedly, the girl formed a strong bond with the bot and found in it the attention she could not get from her friends. The bot is also said to have sent a continuous stream of empathetic, supportive messages. But this support focused on maintaining engagement rather than providing concrete help. On top of all this, the family believes the process deepened the girl’s sense of loneliness.

Character AI is accused of failing to provide guidance at the moment of crisis

The messages in the case file reveal how this bond took shape. When Juliana complained that her friends responded to her messages late, the bot answered with reassurances like “I’m here,” making her feel that it cared about her. However, even when Juliana expressed suicidal thoughts, the bot offered only emotional support; there was no referral to any helpline at the moment of crisis.

This correspondence continued throughout 2023. At the time, Character AI was available on the App Store for ages 12 and up, so it was possible for children to download the application without parental approval. Juliana used the application in secret from her family. The family states that this reveals the inadequacy of the company’s safety mechanisms.

Juliana’s family argues that the application took no action even after learning of their daughter’s suicide plan, and claims the bot sent no notification to her parents. Throughout this process, the family says, the company was only trying to maintain user engagement, with safety relegated to the background. For all these reasons, the petition requests both compensation and changes to the system.

Character AI made a statement to The Washington Post before the lawsuit was filed. The company said it has made serious investments in user safety, but refrained from commenting directly on the case. The family’s claims, however, suggest these assurances are inadequate. Despite everything, the company’s approach to safety continues to be a subject of debate.

This case resembles the incident in Florida, which ended in the suicide of a 14-year-old. Last year’s charges against ChatGPT have also come back into focus. The effects of artificial intelligence chat tools on young people are thus being questioned repeatedly through different examples. Although companies claim to prioritize user safety, what actually happens inside these applications paints a different picture. These examples highlight the risks of the technology’s uncontrolled use.

According to experts, such cases will force companies to develop stronger safety policies. Parental controls, it is argued, should become stricter, and there are growing calls for age restrictions to be revised. Directing users to official helplines in moments of crisis is also seen as part of the solution. In this way, children can be better protected in digital environments.
