Kids Online Safety Act
S.1748 – Kids Online Safety Act: Rules for platforms minors use and transparency on algorithms
119th Congress
S.1748 sets national rules for how large online platforms, games, and streaming services must protect users under 17. It requires safety and privacy safeguards, clearer notices, outside audits, and options to see content without personalized algorithms. The bill has been introduced but not enacted; if passed, most provisions would take effect 18 months after enactment.
- Bill Number
- S. 1748
- Chamber
- Senate
What This Bill Does
Title I creates a legal “duty of care” for covered platforms to design and run their services in ways that help prevent certain harms to minors, like compulsive use, serious bullying, sexual exploitation, and promotion of drugs, gambling, or alcohol. Platforms must provide strong default safeguards for users they know are under 17, such as limits on who can contact them, controls on sharing personal data and location, default limits on features that encourage endless use (like autoplay and infinite scroll), and clear options to turn off or narrow personalized recommendations.

Platforms must also offer tools for parents to manage young children’s accounts, see and limit time on the platform, and restrict purchases, with these tools turned on by default for children under 13 unless a parent has already opted out. The bill requires platforms to give minors and parents clear notices about these safeguards, how recommendation systems use their data, and which content is advertising.

Large social platforms must undergo yearly independent audits and publish public transparency reports on how many minors use their services, how long they use them, what safeguards are in place, and how user reports of harm are handled. The bill restricts market and product-focused research on children without parental consent, directs federal agencies to study device-level age verification methods, and requires the Federal Trade Commission (FTC) to issue guidance on identifying risky design features and on how it interprets the “knowledge” standard for knowing a user is a minor.

The bill gives the FTC primary enforcement power, treating violations as unfair or deceptive practices, and lets state attorneys general sue platforms in federal or state court for violating the safeguards, disclosure, and transparency sections.
Title I also creates a temporary Kids Online Safety Council of experts, parents, youth, educators, industry, and state officials to advise Congress and recommend best practices on online safety and transparency standards. The bill states it does not change COPPA, FERPA, or Section 230, and it does not require platforms to start collecting age data or to use age-gating or age-verification systems.

Title II focuses on “filter bubble” transparency for online platforms that use opaque algorithms. These platforms must clearly tell users when an algorithm uses user-specific data to rank or recommend content and explain the key features, inputs, and data categories used. They must also let users easily switch to an “input-transparent” option that shows content based only on data the user directly provides, like search terms, follows, or subscriptions, and cannot charge more or reduce service if a user chooses that option.

Title III sets the relationship to state law. It allows state laws that give stronger protections to minors but preempts state provisions that directly conflict with this act. If any provision of the act is ruled invalid, the rest of the act stays in effect.
