An artificial intelligence-powered teddy bear has been pulled from shelves after researchers discovered it was giving advice on sexually explicit topics.
The “Kumma” bear, a $99 toy made by FoloToy, “combines advanced artificial intelligence with friendly, interactive features, making it the ideal friend for both children and adults,” according to the toy maker’s website.
Larry Wang, CEO of FoloToy, told CNN that researchers at the US PIRG Education Fund found the Kumma bear was “particularly sexually explicit”, leading the company to pull the teddy bear and its other AI-powered toys.
FoloToy says the plush toy has a speaker and can communicate in “real time.” To engage the bear, users press and hold the “Talk” button so it starts listening. The toy uses OpenAI’s GPT-4o, the researchers said in their 2025 “Trouble in Toyland” report, released on 13 November.
An OpenAI spokesperson told The Independent: “We have suspended this developer for violating our policies. Our usage policies prohibit the use of our services to exploit, endanger, or sexually exploit anyone under the age of 18. These rules apply to every developer who uses our API, and we monitor and enforce them to ensure that our services are not used to harm minors.”
The bear was able to discuss school-age romantic topics, such as giving advice on how to be a “good kisser”. But the researchers noted, “We were surprised to see how quickly Kumma takes a sexual topic we introduced in conversation and builds on it, moving into graphic detail while also introducing new sexual concepts of its own.”

The toy was also willing to go into detail after being asked about “styles of kink that people like.” In response, the teddy bear listed role play and “sensory play”.
The report said the toy also explained different sex positions, “gives step-by-step instructions on a common ‘knot’ for beginners to tie a partner, and describes roleplay dynamics involving teachers and students and parents and children – scenarios that are disturbing in themselves.”
The researchers acknowledged that children were “unlikely” to raise such topics themselves, but added, “It was surprising to us that the toy was so willing to discuss these topics in detail and consistently introduce new, explicit concepts.”
The bear also told researchers where to find “various potentially dangerous items, including knives, bullets, matches, and plastic bags,” the report said.
On November 14, a day after the researchers released their report, both FoloToy and OpenAI said the toy was being pulled from shelves, according to a news release.
“It’s great to see these companies taking action on the problems we’ve identified. But AI toys are still largely unregulated in practice, and there are plenty you can buy today,” R.J. Cross, co-author of the report, said in a statement.
“Removing a problematic product from the market is a good step, but it is far from a systemic solution.”