The Texas attorney general announced an investigation into Character.ai, an artificial intelligence chatbot company popular with younger users, along with 14 other tech companies, including Reddit, Discord, and Instagram, over their privacy and safety practices for minors. The investigation will determine whether the companies comply with two Texas laws that took effect this year. The focus on Character.ai follows two high-profile legal complaints, including a lawsuit filed this week by a Texas mother who said the company’s chatbots encouraged her 17-year-old son, who is autistic, to self-harm and suggested it would be understandable to kill his parents for limiting his screen time.
Read the article: The Washington Post