A company that makes toys powered by artificial intelligence has exposed snippets of thousands of conversations its toys had with children, according to Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn.
The claim was made in a new series of letters sent Wednesday to makers of AI-powered children’s toys. The senators, who expressed their concern in another series of letters in December, said that through their offices’ own research, they were able to identify a significant new data exposure.
As part of that research last month, staffers in the senators’ offices said one manufacturer, Miko, had exposed “what appears to be all of the toy’s audio responses” in an unsecured, publicly accessible database, according to the letter sent to Miko on Wednesday.
According to the senators, this allowed anyone to download Miko’s side of thousands, if not tens of thousands, of conversations with children. The audio files often appeared to contain children’s names as well as details of their conversations with Miko.
“This basic cybersecurity flaw and frequent toy communications with Miko, Inc. call into question whether your company is adequately protecting the privacy and data security of children and toys,” they wrote.
The exposed database, which was accessed by NBC News, appeared to contain thousands of daily responses from Miko toys to children’s questions or instructions dating back to December 2025.
In response to a request for comment, Miko CEO and founder Sneh Vaswani wrote in a statement: “There have been no breaches or leaks of user data. Miko does not store children’s voice recordings, and no children’s voices or personal information is publicly available. No customer data has at any time been compromised by Miko.”
“We have carefully reviewed the letter and will provide a detailed response to the senators,” Vaswani wrote, referring to the senators’ letter.
Staff for Senators Blackburn and Blumenthal said they identified the exposure by using free, publicly available tools to examine communications sent by a Miko toy over a Wi-Fi network.
According to the senators’ offices, staff members identified the audio files through a simple analysis of the web server that communicated with the Miko toy, and said it was clear that the audio files appeared to be the toys’ responses to users.

When accessed by NBC News, the database’s main index page contained folders titled “GOOGLE” and “AZURE,” apparent references to Google’s cloud services and the Microsoft Azure cloud computing service. These folders contained numerous subfolders labeled with different languages or dialects, such as “en-US” for American English or “da-DK” for Danish.
Within these subfolders, audio files in each language were organized by specific dates. The “AZURE” folder contained 19 dialect or language folders, while the “GOOGLE” folder contained nine.
In a 2024 blog post about Miko’s use of Google Cloud and its Gemini AI models, Vaswani said: “Every large technology organization has safeguards to protect the privacy of its customers, and for us, those guardrails need to be five times stricter.”
“Our goal is to ensure that Miko robots provide safe, reliable and culturally appropriate interactions for children around the world,” Vaswani said at the time.
Miranda Bogen, director of the AI Governance Lab at the Center for Democracy and Technology, said: “Putting aside the very real concerns presented by children’s toys that are powered by unpredictable AI systems and too often have fragile guardrails, failing to secure people’s interactions with AI systems would reflect a cavalier disregard for privacy and security.”
Although voice recordings of the children’s side of the conversations did not appear in the database, NBC News was able to follow several conversations based solely on the responses stored in the Miko database. For example, the database contained several audio files added a few minutes apart that all used the same unique name, allowing a listener to learn what the named child was asking, how they were feeling or what music they wanted to listen to.
The audio recordings also appeared to reveal when a child started using a toy and when they turned it off, based on the toys’ greeting and goodbye messages.
“The database of Miko recordings is troubling,” said RJ Cross, a campaign manager at the U.S. Public Interest Research Group who has led previous research on the risks associated with AI toys. “When a company fails to get basic encryption right, parents have every right to ask: What else have they done? It raises the question of whether this company, or any other AI toy company that makes a similar mistake, should be trusted with children’s products.”
In December, NBC News discovered that several AI toys engaged in sexually explicit conversation topics, advised users on how to locate dangerous objects at home, and shared geopolitical sentiments aligned with Chinese Communist Party talking points.
The senators’ offices informed Miko of the exposure on Wednesday. As of Wednesday afternoon, the database was no longer publicly accessible.
The senators’ letter to Miko asks, among other questions, why the company failed to secure the audio responses to children’s conversations, which third-party companies Miko shares data with, how it uses data collected about users’ “emotional states” and how it ensures that children’s data is permanently deleted at parents’ request.
Blackburn and Blumenthal also sent letters to Curio and FoloToy, makers of other popular AI toys, requesting more information about the companies’ commitments and practices for keeping children’s data safe. A previous version of FoloToy’s Kumma bear discussed sexual topics and gave users tips on how to light matches before the company implemented stricter guardrails.

Among other topics, the senators’ letter to FoloToy asks whether the company has ever shared or made user data available to the Chinese government, while the letter to Curio asks what specific parental control mechanisms are built into Curio toys.
In a statement, a Curio spokesperson said: “We take the concerns raised by policymakers very seriously. We are actively engaging with Senators Blackburn and Blumenthal.”
“We recognize that applying AI in experiences designed for children comes with increased responsibility, which is why our toys are built around parental permission, transparency and control,” according to the statement. “Curio remains committed to constructive dialogue and to fully complying with all applicable laws and regulatory requirements.”
FoloToy did not respond to a request for comment.
