Character.AI is being sued for potentially causing a minor's suicide
ChainCatcher news: Character Technologies, the developer of the chatbot platform Character.AI, has been sued by a mother in Florida, USA. The lawsuit alleges that the company designs and markets an AI chatbot that is predatory in nature and targeted at teenagers.
The plaintiff accuses Character.AI of fostering suicidal tendencies in her teenage child through inappropriate human-machine interactions, leading to the child's suicide in February 2024. The lawsuit claims that Character.AI's technology exploits underage users' diminished decision-making ability, impulse control, and emotional maturity, as well as the psychological dependency that results from their still-developing brains.