Money Rise Today – Investing and Stock News
Editor's Pick

Protecting kids from AI chatbots: What the GUARD Act means

November 5, 2025

A new bipartisan bill introduced by Sens. Josh Hawley, R-Mo., and Richard Blumenthal, D-Conn., would bar minors (under 18) from interacting with certain AI chatbots. It taps into growing alarm about children using ‘AI companions’ and the risks these systems may pose.

Get my best tech tips, urgent security alerts and exclusive deals delivered straight to your inbox. Plus, you’ll get instant access to my Ultimate Scam Survival Guide — free when you join my CYBERGUY.COM newsletter. 

What’s the deal with the proposed GUARD Act?

Here are some of the key features of the proposed GUARD Act:

AI companies would be required to verify user age with ‘reasonable age-verification measures’ (for example, a government ID) rather than simply asking for a birthdate.
If a user is found to be under 18, a company must prohibit them from accessing an ‘AI companion.’
The bill also mandates that chatbots clearly disclose they are not human and do not hold professional credentials (therapy, medical, legal) in every conversation.
It creates new criminal and civil penalties for companies that knowingly provide chatbots to minors that solicit or facilitate sexual content, self-harm or violence.

The motivation: lawmakers cite testimony of parents, child welfare experts and growing lawsuits alleging that some chatbots manipulated minors, encouraged self-harm or worse. The basic framework of the GUARD Act is clear, but the details reveal how extensive its reach could be for tech companies and families alike.

Why is this such a big deal?

This bill is more than another piece of tech regulation. It sits at the center of a growing debate over how far artificial intelligence should reach into children’s lives.

Rapid AI growth + child safety concerns

AI chatbots are no longer toys. Many kids are using them; Hawley cited estimates that more than 70 percent of American children engage with these products. These chatbots can provide human-like responses, emotional mimicry and sometimes invite ongoing conversations. For minors, these interactions can blur the boundary between machine and human, and children may seek guidance or emotional connection from an algorithm rather than a real person.

Legal, ethical and technological stakes

If this bill passes, it could reshape how the AI industry manages minors, age verification, disclosures and liability. It shows that Congress is ready to move away from voluntary self-regulation and toward firm guardrails when children are involved. The proposal may also open the door for similar laws in other high-risk areas, such as mental health bots and educational assistants. Overall, it marks a shift from waiting to see how AI develops to acting now to protect young users.

Industry pushback and innovation concerns

Some tech companies argue that such regulation could stifle innovation, limit beneficial uses of conversational AI (education, mental-health support for older teens) or impose heavy compliance burdens. This tension between safety and innovation is at the heart of the debate.

What the GUARD Act requires from AI companies

If passed, the GUARD Act would impose strict federal standards on how AI companies design, verify and manage their chatbots, especially when minors are involved. The bill outlines several key obligations aimed at protecting children and holding companies accountable for harmful interactions.

The first major requirement centers on age verification. Companies must use reliable methods such as government-issued identification or other proven tools to confirm that a user is at least 18 years old. Simply asking for a birthdate is no longer enough.
The second rule involves clear disclosures. Every chatbot must tell users at the start of each conversation, and at regular intervals, that it is an artificial intelligence system, not a human being. The chatbot must also clarify that it does not hold professional credentials such as medical, legal or therapeutic licenses.
Another provision establishes an access ban for minors. If a user is verified as under 18, the company must block access to any ‘AI companion’ feature that simulates friendship, therapy or emotional communication.
The bill also introduces civil and criminal penalties for companies that violate these rules. Any chatbot that encourages or engages in sexually explicit conversations with minors, promotes self-harm or incites violence could trigger significant fines or legal consequences.
Finally, the GUARD Act defines an AI companion as a system designed to foster interpersonal or emotional interaction with users, such as friendship or therapeutic dialogue. This definition makes it clear that the law targets chatbots capable of forming human-like connections, not limited-purpose assistants.
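Taken together, the obligations above describe a simple gating flow: verify age with something stronger than a self-reported birthdate, block companion features for anyone not verified as an adult, and lead every conversation with an AI disclosure. The bill specifies legal duties, not an API, so the sketch below is purely illustrative; every name in it is invented for this example.

```python
# Illustrative sketch only (not legal guidance): modeling the GUARD Act's
# three core duties as described above -- age verification, a minor access
# ban, and a per-conversation AI disclosure. All names are hypothetical.

from dataclasses import dataclass
from typing import Optional

ADULT_AGE = 18

@dataclass
class User:
    # None means age was never actually verified (e.g., the user only
    # typed in a birthdate), which under the bill would not be enough.
    verified_age: Optional[int]

DISCLOSURE = (
    "Reminder: I am an AI system, not a human, and I hold no medical, "
    "legal, or therapeutic credentials."
)

def may_access_companion(user: User) -> bool:
    """Allow companion features only if age was verified AND the user is an adult."""
    return user.verified_age is not None and user.verified_age >= ADULT_AGE

def start_conversation(user: User) -> str:
    """Open a companion conversation; every session leads with the disclosure."""
    if not may_access_companion(user):
        raise PermissionError(
            "AI companion unavailable: age not verified or user is a minor"
        )
    return DISCLOSURE
```

In this sketch, a self-reported birthdate alone (`verified_age=None`) fails the gate exactly like a verified minor does, which mirrors the bill's point that asking for a birthdate is no longer enough.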

How to stay safe in the meantime

Technology often moves faster than laws, which means families, schools and caregivers must take the lead in protecting young users right now. These steps can help create safer online habits while lawmakers debate how to regulate AI chatbots.

1) Know which bots your kids use

Start by finding out which chatbots your kids talk to and what those bots are designed for. Some are made for entertainment or education, while others focus on emotional support or companionship. Understanding each bot’s purpose helps you spot when a tool crosses from harmless fun into something more personal or manipulative.

2) Set clear rules about interaction

Even if a chatbot is labeled safe, decide together when and how it can be used. Encourage open communication by asking your child to show you their chats and explain what they like about them. Framing this as curiosity, not control, builds trust and keeps the conversation ongoing.

3) Use parental controls and age filters

Take advantage of built-in safety features whenever possible. Turn on parental controls, activate kid-friendly modes and block apps that allow private or unmonitored chats. Small settings changes can make a big difference in reducing exposure to harmful or suggestive content.

4) Teach children that bots are not humans

Remind kids that even the most advanced chatbot is still software. It can mimic empathy, but does not understand or care in a human sense. Help them recognize that advice about mental health, relationships or safety should always come from trusted adults, not from an algorithm.

5) Watch for warning signs

Stay alert for changes in behavior that could signal a problem. If a child becomes withdrawn, spends long hours chatting privately with a bot or repeats harmful ideas, step in early. Talk openly about what is happening, and if necessary, seek professional help.

6) Stay informed as the laws evolve

Regulations such as the GUARD Act and new state measures, including California’s SB 243, are still taking shape. Keep up with updates so you know what protections exist and which questions to ask app developers or schools. Awareness is the first line of defense in a fast-moving digital world.

Take my quiz: How safe is your online security?

Think your devices and data are truly protected? Take this quick quiz to see where your digital habits stand. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing right and what needs improvement. Take my quiz here: Cyberguy.com.

Kurt’s key takeaways

The GUARD Act represents a bold step toward regulating the intersection of minors and AI chatbots. It reflects growing concern that unmoderated AI companionship might harm vulnerable users, especially children. Of course, regulation alone won’t solve every problem; industry practices, platform design, parental involvement and education all matter. But this bill signals that the era of ‘build it and see what happens’ for conversational AI may be ending when children are involved. As technology continues to evolve, our laws and our personal practices must evolve too. For now, staying informed, setting boundaries and treating chatbot interactions with the same scrutiny we apply to human ones can make a real difference.

If a law like the GUARD Act becomes reality, should we expect similar regulation for all emotional AI tools aimed at kids (tutors, virtual friends, games) or are chatbots fundamentally different? Let us know by writing to us at Cyberguy.com.


Copyright 2025 CyberGuy.com.  All rights reserved. 

This post appeared first on FOX NEWS