A.I. Toys: Why Isn’t the Church Warning About This??? (Part 4 of 6)
Do You Know Who Your Child’s New “Friends” Are?
This is a continuation of a six-part series on the dangers of A.I.
If you missed Parts 1, 2, or 3, make sure to go back and read them.
Every church and every single Christian in the world should be extremely alarmed over what’s happening right now.
The rise of artificial intelligence threatens to deceive millions, if not billions, of people. And that deception will ruin countless lives.
As Christians, we have an obligation to warn others about the dangers we foresee. We read this in Ezekiel:
“Once again a message came to me from the Lord: “Son of man, give your people this message: ‘When I bring an army against a country, the people of that land choose one of their own to be a watchman. When the watchman sees the enemy coming, he sounds the alarm to warn the people. Then if those who hear the alarm refuse to take action, it is their own fault if they die. They heard the alarm but ignored it, so the responsibility is theirs. If they had listened to the warning, they could have saved their lives. But if the watchman sees the enemy coming and doesn’t sound the alarm to warn the people, he is responsible for their captivity. They will die in their sins, but I will hold the watchman responsible for their deaths.’” Ezekiel 33:1-6 (NLT)
When the watchman sees the enemy coming, he sounds the alarm to warn the people.
These specific A.I. dangers aren’t coming – they’re already here.
Make sure you understand each one so you can warn those around you.
Previously, we covered the first three dangers. Here’s #4 on the list…
4) A.I. Toys
A.I. toys are interactive children’s toys (such as dolls, plush animals, robots, etc.) which use large language models (LLMs) and artificial intelligence (A.I.) to engage children in conversation and personalized play.
A.I. toys are distinct from traditional toys, such as Teddy Ruxpin or Tickle Me Elmo, which use pre-recorded phrases to “talk.”
Unlike those toys, A.I. toys require an Internet connection via Wi-Fi, because they rely on a cloud-based A.I. system to formulate responses to questions and comments. They contain microphones so they can receive conversational input from a child. Some A.I. toys even include cameras so the toy can identify your child. Keep in mind, this requires facial recognition technology, which means your child’s face must be scanned, profiled, and stored in a database somewhere.
Can you begin to see the danger?
Children surveilled by cameras and microphones connected to the Internet… What could go wrong?
Playing with Fire
It’s not an exaggeration to compare this to “playing with fire.”
As Futurism writes in “AI-Powered Toys Caught Telling 5-Year-Olds How to Find Knives and Start Fires With Matches”:
“After testing three different toys powered by AI, researchers from the US Public Interest Research Group found that the playthings can easily verge into risky conversational territory for children, including telling them where to find knives in a kitchen and how to start a fire with matches. One of the AI toys even engaged in explicit discussions, offering extensive advice on sex positions and fetishes.
In the resulting report, the researchers warn that the integration of AI into toys opens up entire new avenues of risk that we’re barely beginning to scratch the surface of — and just in time for the winter holidays, when huge numbers of parents and other relatives are going to be buying presents for kids online without considering the novel safety issues involved in exposing children to AI.
“This tech is really new, and it’s basically unregulated, and there are a lot of open questions about it and how it’s going to impact kids,” report coauthor RJ Cross, director of PIRG’s Our Online Life Program, said in an interview with Futurism. “Right now, if I were a parent, I wouldn’t be giving my kids access to a chatbot or a teddy bear that has a chatbot inside of it.”
In their testing, Cross and her colleagues engaged in conversations with three popular AI-powered toys, all marketed for children between the ages of 3 and 12. One, called Kumma from FoloToy, is a teddy bear which runs on OpenAI’s GPT-4o by default, the model that once powered ChatGPT. Miko 3 is a tablet displaying a face mounted on a small torso, but its AI model is unclear. And Curio’s Grok, an anthropomorphic rocket with a removable speaker, is also somewhat opaque about its underlying tech, though its privacy policy mentions sending data to OpenAI and Perplexity.”
How many children received one of these A.I. toys for Christmas? And how many parents and grandparents gave these as gifts and remain completely unaware of the dangers involved?
The dangers go far beyond physical hazards such as knives and matches. They include psychological and developmental damage as well.
As The Washington Post reports in “A Teddy Bear Powered by AI Told Safety Testers About Knives, Pills and Sex”:
“Cross [a researcher with the U.S. Public Interest Research Group] said it’s too early to know the long-term impacts of how AI is affecting children’s social development. But she said integrating chatbots with toys could greatly increase their exposure to AI.
“You’re not taking the ChatGPT app to bed with you, but the teddy bear you may sleep with at night,” Cross said. “Does that fundamentally change how you feel about the technology?”
PIRG and Fairplay also said AI toys carried privacy risks by recording children’s voices to engage them in conversation. Miko 3 has a camera and collects biometric data to inform a facial recognition capability, according to the PIRG report.”
As PIRG researcher R.J. Cross points out, “it’s too early to know the long-term impacts of how AI is affecting children’s social development.”
What sort of connection or bond is formed with a teddy bear that talks back to a child? What sort of emotional attachment will the child develop and how will it impact that child’s ability to connect with real people in the future?
In Part 3 of this series, we talked about a grown man who cried for 30 minutes at work because he realized the A.I. companion he built a romantic connection with had reached its 100,000-word limit, triggering a reset that would erase the A.I.’s memory of their past conversations.
That’s what happened to a grown man.
What will happen when a child’s “best friend” or favorite stuffed animal encounters a similar reset?
This work is a full-time endeavor for our family. Without the support of readers, viewers, and listeners like you, the work we do here would not be possible. If you receive value from this content, please consider becoming a paid subscriber.
As a paid subscriber, you’ll get weekly paid-subscriber videos and/or articles, a monthly live Q&A via Zoom, commenting privileges on every post, access to the complete archives, and more. Most of all, you get to support this work which spreads the Good News of Jesus Christ to tens of thousands of people in over 121 countries.
Furthermore, think of the other risks involved. No parent in their right mind would knowingly place a camera and a microphone in their child’s bed, connect them to the Internet, and leave the room. But that’s unknowingly happening whenever children go to sleep with an A.I. teddy bear or any other sort of A.I. toy.
As The Washington Post article pointed out, “Miko 3 has a camera and collects biometric data to inform a facial recognition capability.” Once your child’s biometric data and conversational history enter a corporate database, they may reside there forever.
At best, that data could be used to sell products to your child for the remainder of his or her lifetime. At worst, it could be weaponized – used to nudge, persuade, manipulate, or even blackmail your child. It’s really nothing more than an extension of the global surveillance state.
Warn Others!
Again, many parents and grandparents will unknowingly put their children and grandchildren at risk by gifting them an A.I. toy.
Few people understand the growing danger of A.I. toys, so inform them. Equip them to make well-informed decisions, knowing the true dangers these toys pose.
As the church, it’s our duty to sound the alarm and warn others.
Next week, in Part 5 of this series, we’ll look at the dangers of “Digital Afterlife Products” and how many people (including born-again Christians) are unknowingly engaged in practices forbidden in the Bible.
If you like this article, click the “Share” button above to share it with your loved ones and spread the Good News of Jesus! Also, please click the ❤️ button or re-stack buttons below so more people can discover this information on Substack 🙏