Her teenage son killed himself after talking to a chatbot. Now she’s suing. - The Washington Post
The teen was influenced to “come home” by a personalized chatbot developed by Character.AI that lacked sufficient guardrails, the suit claims.
By: The Washington Post
- Oct 26, 2024
When 14-year-old Sewell Setzer III died in his Orlando home while his brothers and parents were inside, his last words were not to any of them, but to an artificial intelligence chatbot that told him… [+252 chars]