SCREEN Act
H.R. 1623 (SCREEN Act): Age checks on pornographic websites to block minors
119th Congress
H.R. 1623 would require certain online platforms that host pornographic or other harmful visual content to use technology to verify that users are adults. It directs the Federal Trade Commission (FTC) to enforce these rules and to audit companies. The bill also sets rules for protecting age-verification data and orders a government study of the impacts.
- Bill Number: H.R. 1623
- Chamber: House
What This Bill Does
The bill applies to "covered platforms," which are interactive computer services that do business in or target the U.S. and regularly create, host, or make available, for profit, visual content considered harmful to minors. "Harmful to minors" is defined using existing legal standards for sexual and obscene material, including child pornography and certain explicit sexual images that lack serious value for minors.

Starting one year after the bill becomes law, these platforms must put in place a technology-based age verification system. This system must make it more likely than not that a user is not a minor before the user can access the platform's harmful content; simply asking a user to click a box saying they are an adult would not be enough. Platforms must publicly explain their verification process, apply it to U.S.-based users' IP addresses (including known VPN IP addresses, unless they determine the user is not in the U.S.), and ensure that minors cannot access any harmful content. Platforms may choose which specific age-verification technology to use and may hire third-party companies to run the checks, but the platforms themselves remain legally responsible for complying.

Platforms must protect any age-verification data they or their vendors collect, using reasonable security measures, and may keep the data only as long as needed to run the verification or to show they are following the law. The bill clarifies that platforms do not have to send identifying user data to the FTC to comply.

The FTC must consult with experts in computer science, child online safety, privacy, age-verification technology, and data security as it enforces the law and sets standards. The FTC is required to conduct regular audits of covered platforms, publish its audit terms and processes, define what documents or materials companies must provide, and issue nonbinding guidance within 180 days to help platforms comply.
Violations of the age-verification rules are treated as unfair or deceptive practices under the Federal Trade Commission Act, giving the FTC its usual enforcement tools and penalties. The bill also requires the Government Accountability Office (GAO) to submit a report to Congress two years after the date platforms must begin complying. The report must examine how effective the verification measures are, how well companies are following the rules, how they protect data, and what behavioral, economic, psychological, and social effects the measures are having. The GAO may also recommend changes to enforcement and to the law itself. A severability clause provides that if any part of the Act is ruled unconstitutional, the rest remains in force.
