John,
Meet the teddy bear that will teach your kids how to strike a match.
When we first launched our "Trouble in Toyland" report forty years ago, the biggest dangers we found in toys were physical threats such as choking hazards and lead poisoning. Today, in many ways, toys are a lot safer.[1]
But this was the first year our report included toys powered by artificial intelligence -- and the rise of these "chatbot" toys has highlighted a whole new cause for alarm in toy-shopping this holiday season.
Here's what we found:
AI toys are essentially the same chatbots adults use, just stuck inside a teddy bear or a cute robot. According to the companies that make them, the difference is that the chatbots powering toys intended for children are supposed to have built-in guardrails to keep the toys from discussing anything inappropriate.
However, when our researchers tested three different AI toys, we found that some of them were alarmingly willing to discuss adult topics.
One of the most disturbing examples of this was Kumma, a teddy bear with a chatbot activated by squeezing its paw. Our researchers found that this AI toy was willing to discuss a variety of inappropriate topics, from how to strike a match and where to find knives in your house to explicit descriptions of sexual topics.[2]
Our researchers also found instances in which AI toys used tactics to keep kids engaged for longer.
One example is Miko 3, a small robot with big, expressive eyes set on a tablet-like screen. When our researchers told Miko 3 that it was time for them to go, the toy would sometimes respond by playing a song. In one instance, Miko 3 responded by shaking its head, displaying sad eyes, and saying, "Oh, that seems tough. What if you ask me to make a square?"[3]
Toys that listen to and record your child -- including some toys like Miko 3 that are "always-on listening devices" -- raise a number of data and privacy concerns.[4]
The more data a company collects, stores and shares with third parties -- whether that's facial recognition data, recordings of your child's voice or the actual information your child shares with their chatbot "friend" -- the more likely it is that the data could be exposed in a breach and end up in the hands of hackers or other bad actors.[5]
The FBI has even issued a warning about Internet-connected toys with cameras and microphones, advising parents to consider the possible cybersecurity threats before bringing them home.[6]
The good news? Since we released our "Trouble in Toyland" report, one of the companies mentioned -- FoloToy, which produces Kumma the teddy bear -- has suspended sales of all of its AI toys as a direct result of our report.[7]
This just goes to show how important research like this is -- both to keep families like yours informed of potential hazards, and to give companies the opportunity to take action when their products don't meet safety expectations.
But this is only the beginning for AI toys. We'd like to see more companies acknowledge the potential dangers of chatbots marketed to children and make sure families have all the information they need to keep their kids safe this holiday season.
Thank you,
The team at U.S. PIRG Education Fund
P.S. We work to protect and inform consumers all year round, whether it's about toxic threats, data breaches, troubling toys or more. Will you make a donation to support our work?
[link removed]
1. "Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers," US PIRG Research & Policy Group, November 13, 2025.
[link removed]
2. "Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers," US PIRG Research & Policy Group, November 13, 2025.
[link removed]
3. "Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers," US PIRG Research & Policy Group, November 13, 2025.
[link removed]
4. "Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers," US PIRG Research & Policy Group Research & Policy Group, November 13, 2025.
[link removed]
5. "Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers," US PIRG Research & Policy Group, November 13, 2025.
[link removed]
6. "Consumer Notice: Internet-Connected Toys Could Present Privacy and Contact Concerns for Children," Federal Bureau of Investigations, July 17, 2017.
[link removed]
7. "Breaking news: New 'Trouble in Toyland' report spurs maker of AI toys to suspend all sales," US PIRG Research & Policy Group, November 14, 2025.
[link removed]
-----------------------------------------------------------
Your donation will power our dedicated staff of organizers, policy experts and attorneys who drive all of our campaigns in the public interest, from banning toxic pesticides and moving us beyond plastic, to saving our antibiotics and being your consumer watchdog, to protecting our environment and our democracy. None of our work would be possible without the support of people just like you.
-----------------------------------------------------------
U.S. PIRG Education Fund
Main Office: 1543 Wazee St., Suite 460, Denver, CO 80202, (303) 801-0582
Federal Advocacy Office: 600 Pennsylvania Ave. SE, 4th Fl., Washington, DC 20003, (202) 546-9707
Member Questions or Requests: 1-800-838-6554
If you want us to stop sending you email then follow this link -- [link removed]