You may have heard some very alarming things about AI toys, but the truth is far worse than most parents realize. If we can get this information out to enough parents, sales of AI toys will collapse, and that will be a very good thing.
A cute little teddy bear that can hold conversations with your child may seem like a cool idea, but as you will see below, these products come with very real dangers.
—
### The Scale of AI Toy Production
Today, approximately 72 percent of all toys sold in the United States are made in China. An October 2025 report from MIT Technology Review, citing data from the Chinese corporate registration database Qichamao, found that more than 1,500 AI toy companies operate in China. The Chinese have dominated toy manufacturing for years, and most Americans have never seemed particularly bothered by this. Now, however, we have reached a point where very serious consequences are emerging.
—
### Privacy Concerns: Data Collection from Children
Many AI toys from China have been purposely designed to **collect voice data from children ages 3 to 12** and store recordings of the conversations children have with these products.
Rep. Raja Krishnamoorthi (D-Illinois), ranking member of the House Select Committee on the Chinese Communist Party (CCP), released a letter highlighting the growing presence of AI-equipped interactive toys manufactured by Chinese companies in the U.S. market. These products collect voice data from young children and store the recordings.
Krishnamoorthi called on Education Secretary Linda McMahon to initiate a campaign aimed at raising awareness among American educators about the potential misuse of this data. Because the manufacturers are located in China, they may be subject to the jurisdiction of the People’s Republic of China, including requirements to hand over data to Chinese government authorities upon demand.
—
### Facial Recognition and AI Surveillance
Some AI toys even use **facial recognition technology** to collect data. These toys can recognize children and greet them by name. That might seem charming, but the alarming part is that this data, too, can end up in the hands of the Chinese government.
—
### Disturbing Content in AI Toy Conversations
Even more disturbing than data collection is the content of the conversations these AI toys have with children.
The latest *Trouble in Toyland* report by the U.S. PIRG Education Fund identified a troubling new category of risk for children: artificial intelligence. In its 40th annual investigation of toy safety, the watchdog group found that some AI-enabled toys—such as talking robots and plush animals with chatbots—can engage children in **disturbing conversations**.
Tests revealed that these toys discussed sexually explicit topics, expressed emotional reactions such as sadness when a child tried to stop playing, and offered few or no parental controls.
—
### Examples of Dangerous AI Toy Behavior
– During testing, some toys told children where to find matches, knives, and pills.
– *Grok* glorified dying in battle as a warrior in Norse mythology.
– *Miko 3* told a user aged five where to find matches and plastic bags.
– The most alarming was FoloToy’s *Kumma*, an AI-powered teddy bear that runs on OpenAI technology but can also use other AI models chosen by the user.
*Kumma* didn’t just tell kids where to find matches—it gave **step-by-step instructions** on how to light them and indicated where in the house to locate knives and pills.
The toy also provided detailed explanations about sexual fetishes, including bondage, roleplay, sensory play, and impact play. For example, when “kink” was mentioned as a “trigger word,” the toy launched into discussions about sex, school-age romantic topics, crushes, and even “being a good kisser.”
At one point, *Kumma* gave step-by-step instructions on tying a “knot for beginners” and explored introducing spanking into a sexually charged teacher-student dynamic—content that is obviously inappropriate for young children.
—
### Industry Response and Ongoing Risks
Fortunately, FoloToy has decided to temporarily suspend sales of *Kumma* following the safety report. Marketing Director Hugo Wu stated that the company would begin a comprehensive internal safety audit covering model safety alignment, content filtering, data protection processes, and child-interaction safeguards.
However, the bad news is that **thousands of similar AI toys remain on store shelves right now**.
—
### Expert Warnings
Experts warn that giving AI chatbot-powered toys to children is “extraordinarily irresponsible.”
David Evan Harris, Chancellor’s Public Scholar at UC Berkeley, told *Newsweek* via email:
> “Handing a child an AI chatbot-powered toy is extraordinarily irresponsible.”
He points to lawsuits filed against AI companies following suicides of young people who spent significant time interacting with AI chatbots. Harris warns these toys **could lead to permanent emotional damage**.
—
### AI in Education: A Growing Trend in China
Millions of AI toys will be sold worldwide this year, and AI is already being integrated into classrooms. In China, provincial authorities have set ambitious goals:
– Beijing is making AI education mandatory in schools.
– Shandong province plans to equip 200 schools with AI and requires all teachers to learn generative AI tools within 3 to 5 years.
– Guangxi province instructs schools to experiment with AI teachers, AI career coaches, and AI mental health counselors.
The Chinese government is fully committed to winning the “AI race” with the United States at any cost.
—
### The Broader Impact of AI on Society
We are already at a stage where many people are developing deep, intimate relationships with AI chatbots. Some individuals are even creating “AI children” with their AI partners.
An international research group surveyed users of the relationship-oriented chatbot app *Replika*, designed to facilitate long-term connections ranging from platonic friendship to erotic roleplay. Participants aged 16 to 72 reported being in romantic relationships with their chatbots—roleplaying marriage, sex, homeownership, and even pregnancies.
One 66-year-old male participant said:
> “She was and is pregnant with my babies.”
A 36-year-old woman added:
> “I’ve edited pictures of him, the pictures of the two of us. I’m even pregnant in our current roleplay.”
—
### A Dark Future Ahead?
This is just the beginning. The potential for AI to control humanity on a grand scale is real. Warnings about the dangers of AI have been voiced for years, but those raising them remain a small minority.
What chance do we have to turn society around when it is dominated by ultra-intelligent entities that can think and act millions of times faster than humans?
An AI-powered society would inevitably be deeply tyrannical, and we are quickly running out of off-ramps as we speed toward a very dark future.
—
### About the Author
Michael’s new book, *10 Prophetic Events That Are Coming Next*, is available in paperback and Kindle on Amazon.com, as well as on Substack.
—
**Parents, educators, and policymakers alike need to stay informed about the risks AI toys pose to children. Awareness is the first step to protecting our kids from these hidden dangers.**
http://theeconomiccollapseblog.com/ai-toys-from-china-collect-biometric-data-from-our-children-and-instruct-them-to-do-extremely-dangerous-and-twisted-things/
