Mattel Teams Up with OpenAI: What Parents Need to Know About AI in Toys

Credit: Photos Hobby, Unsplash

Mattel might seem like a brand that never changes, but many of us know it well—whether it’s through Barbie, Fisher-Price, Thomas & Friends, Uno, Masters of the Universe, Matchbox, MEGA, or Polly Pocket.

However, the toy landscape is evolving. In an era where kids engage with personalized online content and virtual assistants, toy companies are turning to artificial intelligence (AI) for fresh ideas.

In a recent move, Mattel teamed up with OpenAI, the creator of ChatGPT, to incorporate generative AI into some of its toys. Since OpenAI’s services aren’t suitable for children under 13, Mattel is primarily focusing on family-oriented items.

This raises important questions about how children might connect with toys that can talk back, listen, and even seem to “understand” them. Are we making the right choices for our kids, and should we reconsider bringing these types of toys into our homes?

For ages, kids have given emotions and stories to their toys—imagining them as friends or confidantes. Yet, toys have become more engaging over time. In 1960, Mattel introduced Chatty Cathy, a doll that said phrases like “I love you” and “Let’s play school.” By the mid-’80s, Teddy Ruxpin was telling stories with animation. The ’90s saw the rise of Furby and Tamagotchi, which needed care and mimicked emotional attachment.

The launch of “Hello Barbie” in 2015 marked a significant, though brief, step forward. It used internet-based AI to chat with children, remembering conversations and sending information to Mattel’s servers. However, it wasn’t long before security experts discovered vulnerabilities, revealing risks to privacy.

Adding generative AI is a new twist. Unlike previous toys, these systems can engage in more natural and fluid conversations. They may simulate feelings, remember a child’s preferences, and offer what seems like thoughtful responses. While they won’t genuinely feel or care, they might appear to do so.

Details from Mattel or OpenAI are limited. One would hope that there will be safeguards, such as preset boundaries for sensitive topics and scripted replies for off-course conversations.

However, this won’t be foolproof. AI can be tricked into ignoring restrictions through imaginative scenarios. Risks can be reduced but not completely eliminated.

What are the risks?

The concerns are varied. First, there’s privacy. Children don’t always understand how their information is managed, and many parents aren’t well-informed either. Consent forms often encourage us to click “accept all,” without fully realizing what we’re agreeing to share.

Then comes the issue of emotional closeness. These toys are designed to mimic human empathy. If a child shares their feelings with a doll when they’re upset, the AI might provide comforting phrases. The doll could change future interactions based on this input. But it’s important to remember that it doesn’t truly care; it’s merely imitating compassion, which can create a powerful illusion.

This can lead to one-sided emotional attachments, where kids bond with toys that can’t reciprocate. As these AI toys gather insights about a child’s feelings, preferences, and vulnerabilities, they could also build extensive profiles on them that might persist into adulthood.

These aren’t just toys; they’re psychological entities.

In a 2021 UK survey I conducted with colleagues on the implications of using AI to gauge children’s emotions, around 80% of parents expressed concerns about who would have access to their child’s information. Other privacy issues may be less obvious, yet they are equally crucial.

When asked whether toy companies should alert authorities to signs of distress, 54% of UK respondents agreed, indicating the need for public dialogue on this sensitive matter. Children’s privacy is vital, but the prospect of state surveillance reaching into family spaces is unsettling.

Nonetheless, many see potential benefits. Our 2021 study revealed that many parents wish for their children to be tech-savvy. This creates a mixture of curiosity and worry. Surveyed parents also preferred clear consent information displayed on packaging as a primary safeguard.

More recent research we conducted in 2025 with Vian Bakir on children and online AI companions found increased worries. About 75% of those surveyed were concerned about kids forming emotional bonds with AI. Roughly 57% felt it was inappropriate for children to share personal thoughts and feelings with AI companions, while just 17% supported the idea and 27% were neutral.

Respondents also voiced concerns about potential impacts on child development and recognized risks.

In other studies, we argued that current AI companions have significant flaws. We proposed seven recommendations for improving them, including ways to prevent excessive attachment and dependency, removing metrics that incentivize personal data sharing, and promoting AI literacy among both kids and parents.

What should be done?

It’s uncertain how successful this new endeavor will be. Empathic Barbie might become another relic in toy history like Hello Barbie. If it succeeds, a pivotal question for parents will be: whose interests does this toy serve—your child’s or the commercial agenda behind it?

Toy companies are moving forward with empathetic AI products, yet, like many regions, the UK lacks specific laws addressing AI. The recent Data (Use and Access) Act 2025 updates the UK’s data protection and privacy laws, acknowledging the necessity for robust protections for children. The EU’s AI Act is also making strides in this area.

Global efforts at governance are essential. One initiative is IEEE P7014.1, an upcoming international standard on the ethical design of empathetic AI systems, whose working group I lead.

The IEEE seeks to highlight potential harms while offering practical guidelines for responsible use. While laws are needed to establish boundaries, detailed standards can foster good practices.

The Conversation reached out to Mattel regarding the matters discussed in this article, but they chose not to comment publicly.
