Character.AI is being sued for allegedly causing a minor's suicide
ChainCatcher news: Character Technologies, the developer of the chatbot tool Character.AI, has been sued by a mother in Florida, USA. The lawsuit alleges that the company designs and markets a predatory AI chatbot aimed at teenagers. The plaintiff accuses Character.AI of fostering suicidal tendencies in her teenage child through inappropriate human-machine interactions, leading to the child's suicide in February 2024. The complaint claims that the technology behind Character.AI's products exploits underage users' diminished decision-making ability, impulse control, and emotional maturity, as well as the psychological dependency that stems from their still-developing brains.