John,

AI-powered toys are being marketed as cute, educational companions for kids, but behind the friendly voices and flashing lights lies a serious problem. Many of these toys record children's conversations, collect sensitive data and could share that information with other companies, creating a real privacy risk for families.1

Kids' names, addresses and what they tell their favorite toy shouldn't be for sale. That's why we're urging the Federal Trade Commission (FTC) to step in and hold these companies accountable.

Add your name and tell the FTC to stop companies from selling kids' data.

AI toys work by connecting to the internet and using chatbots to hold open-ended conversations with children.2 To do that, they rely on microphones, and sometimes cameras, that can capture what kids say and, in some cases, even images and other sensitive information.3

In testing conducted for the U.S. PIRG Education Fund's recent Trouble in Toyland report, researchers found that some AI toys function as always-on listening devices or use "wake words," similar to a smart speaker.4,5 That means these toys may be listening for activation prompts or recording interactions more often than families realize, even outside of obvious playtime. Researchers also found that AI toys can involve multiple companies behind the scenes, all of which may receive information from a child's interactions with an AI product.6

Taken together, these findings point to a big problem: toys designed for kids are collecting and sharing sensitive information with multiple companies, while families are left with unclear and confusing explanations of where all that sensitive data actually goes.

The FTC has the authority to protect consumers and better enforce children's privacy rules. Tell the FTC to rein in companies that collect and share kids' data.

These privacy risks don't exist in isolation. Many AI toys are designed to feel responsive and personal, positioning themselves as a "friend" or "companion" that listens closely and engages children over time.7 That kind of design can shape how children learn and interact with these toys, including what they choose to share. A child who sees a toy as a trusted companion may disclose more than they would to a more standard, non-conversational device, all without realizing that what they say may be recorded, packaged, stored and shared.

Meanwhile, experts are still working out what AI companions could mean for children's long-term wellbeing. What's clear is that stronger guardrails are needed urgently, including oversight of how these toys collect and use children's data.

Thank you for standing up for children's privacy,

Faye Park
Support U.S. PIRG. Contributions by people just like you make our advocacy possible. Your contribution supports a staff of organizers, attorneys, scientists and other professionals who monitor government and corporate decisions and advocate on the public's behalf.