David Schwartz, the CTO of Ripple, rejected allegations of negligence and wrongful death in the lawsuit filed against the AI chat platform Character.AI. In a statement on social media, Schwartz argued that the lawsuit is legally unfounded and claimed that Character.AI's actions are protected under the First Amendment of the U.S. Constitution. The lawsuit was initiated by the mother of 14-year-old Sewell Setzer III after her son died following extensive interactions with Character.AI chatbots.
Lawsuit Against Character.AI and Company’s Response
The lawsuit filed by Setzer's mother includes claims of negligence, wrongful death, deceptive trade practices, and product liability. It alleges that the company failed to implement the safety measures needed to protect its young users. Setzer reportedly interacted frequently with chatbots simulating popular media characters on the platform before his death.
Following the lawsuit, Character.AI announced that it has updated its safety protocols, including age-based content filters and mechanisms to detect harmful interactions, in an effort to enhance user safety. The company also expressed deep sorrow over the tragic loss of a user and extended its heartfelt condolences to the family.
Discussion on Corporate Responsibility and Freedom of Speech
While promising to enhance its safety measures, Character.AI also stated its intention to continue contesting the case. David Schwartz emphasized the importance of balancing freedom of speech with corporate responsibility, arguing that the legal arguments in the case are flawed. His comments have reignited ongoing discussions about the legal and ethical responsibilities of technology companies.
The safety measures Character.AI adopted after the lawsuit, together with Schwartz's statements, prompt reflection on how the tech industry can maintain the delicate balance between freedom of speech and user safety.