Kids Online Safety Act
H.R. 6484 – Protections for minors on internet platforms
119th Congress
H.R. 6484 sets national rules for how large online platforms must protect users under age 17. It focuses on safety tools, limits on harmful content and addictive features, audits, and clear notices to kids and parents. It would be enforced mainly by the Federal Trade Commission and State attorneys general and would take effect 18 months after enactment.
- Bill Number: H.R. 6484
- Chamber: House
What This Bill Does
The bill creates new duties for certain online platforms, called “covered platforms,” that are open to the public, use accounts and user names, rely on user‑generated content, employ engagement‑boosting design features, and use personal data for ads or recommendations. These platforms must create and enforce reasonable policies to address harms to minors, including threats of physical violence, sexual exploitation and abuse, illegal drugs, tobacco, cannabis, gambling, alcohol, and financial harm from deceptive practices. The bill makes clear that minors can still search for information and resources about these topics, and it bars government enforcement that targets speech based on viewpoint.

Covered platforms must offer special safeguards for users they know are minors, including tools to limit who can contact the minor, default limits on design features that cause compulsive use, and an easy option for minors to limit their own time on the platform. By default, minors must receive the most protective privacy and safety settings the platform offers.

Platforms must also give parents tools to see and, for children under 13, control privacy and account settings, restrict purchases, and view and limit time spent on the platform. Platforms must notify minors when parental tools are active and, for known children, turn the tools on by default unless a parent previously opted out of similar tools.

The bill requires platforms to set up easy reporting systems for harms to minors, including a dedicated electronic contact point, confirmation of receipt, and a substantive response within set time frames. Platforms may not advertise narcotic drugs, cannabis, tobacco, gambling, or alcohol to users they know are minors. Safeguards and parental tools must be clear, age‑appropriate, and provided in the same language and format as the rest of the service, and platforms may use third‑party services to help provide them.
The bill also states that platforms are not required to share minors’ browsing history, messages, or contact lists, and are not forced to collect more age data than they already do in the normal course of business. Before a known minor registers or makes a purchase, covered platforms must clearly explain their minor‑safety policies and how to use the safeguards and parental tools. For known children under 13, they must inform a parent about these tools and obtain verifiable parental consent, which can be combined with existing notice and consent steps under the Children’s Online Privacy Protection Act. Platforms must also clearly label when users’ endorsements of products or services are paid or made for commercial consideration.

Each covered platform must undergo an independent third‑party safety audit every year. Auditors must review the platform’s safeguards and its methods for reducing harms, and must consult with parents, nonprofits, health experts, and free‑expression experts. The audit must cover usage by minors, the number of minor users, time spent on the platform, use of safeguards and parental tools, reports of harm and how the platform handles them, how the platform collects and uses minors’ personal information, and how it designs and evaluates engagement features used by minors. Platforms must cooperate fully and submit audit results to the Federal Trade Commission.

Enforcement runs mainly through the Federal Trade Commission, which can treat violations as unfair or deceptive acts and use its usual powers and penalties. State attorneys general and certain State officials can also sue in State or Federal court to stop violations, seek money for residents, and obtain other relief, though they may not bring overlapping cases while a federal case is pending.
The bill also creates a Kids Online Safety Council within the Department of Commerce to study online risks and benefits for minors, recommend methods to reduce harms, suggest research topics, and propose best practices for audits and reports. The Council includes experts, parents, minors, educators, platforms, civil‑liberties experts, and State officials, and must issue a final report within three years, after which it terminates.

Finally, the bill states that it does not change the Children’s Online Privacy Protection Act, the scope or meaning of Section 230 of the Communications Act, or certain Federal Trade Commission rules. It allows platforms to cooperate with law enforcement and respond to legal demands and security threats. It preempts State and local laws that relate to the provisions of this Act, sets an 18‑month delayed effective date, and includes a standard severability clause so that if one part is struck down, the rest remains in force.
