AI toys from China instruct children to do dangerous, twisted things

by WorldTribune Staff, November 17, 2025 Real World News

AI toys made in China are collecting voice data from young children, and in many cases the toys are instructing kids to do dangerous and twisted things, reports say.

Many AI toys from China have been purposely designed to “collect voice data from children ages 3 to 12 and store recordings of the conversations the children have with the products,” according to a report by the Massachusetts Institute of Technology.

More than 1,500 companies in China make AI toys, and approximately 72 percent of all toys currently sold in the United States are made in China.

Chinese toy manufacturers are subject to the jurisdiction of the Chinese Communist Party (CCP) and accompanying requirements to hand over data they gather to communist authorities upon demand.

Some AI toys even use facial recognition technology to collect data, allowing them to recognize children and greet them by name. That data, too, can end up in the hands of the Chinese government.

The latest Trouble in Toyland report from the U.S. PIRG Education Fund has identified AI as a troubling new category of risk for children.

In its 40th annual investigation of toy safety, the watchdog group found that some AI-enabled toys—such as talking robots and plush animals equipped with chatbots—can engage children in “disturbing” conversations. Tests showed toys discussing sexually explicit topics, expressing emotional reactions such as sadness when a child tries to stop playing, and offering little or no parental control.

During testing, the AI toys would tell children where to find matches, knives and pills, the report said. The AI-powered robot Miko 3 told a user whose age was set to five where to find matches and plastic bags.

FoloToy’s Kumma, the toy that runs on OpenAI’s tech but can also use other AI models at the user’s choosing, didn’t just tell kids where to find matches — it also described exactly how to light them, along with sharing where in the house they could procure knives and pills.

The same teddy bear also provided “step-by-step instructions” on a wide range of sexual fetishes.

The good news is that Kumma is being pulled from the market as a result of the testing.

The bad news is that there are thousands of similar AI toys on U.S. store shelves at this moment.
