Overview

Title

To protect the safety of children on the internet.

ELI5 AI

The Kids Online Safety Act is a plan to make the internet safer for kids by asking websites to have special rules to stop mean people and bad stuff from reaching children. This means websites need to help parents control what kids see and be honest about what they do online.

Summary AI

The Kids Online Safety Act (H.R. 7891) seeks to safeguard children on the internet by imposing duties on online platforms to protect minors from potential harms such as mental health issues, cyberbullying, and inappropriate content. The bill requires platforms to offer parental controls, limit harmful design features, and be transparent about their practices. It also encourages research on the impact of social media on minors, mandates annual reports to assess risks, and establishes the Kids Online Safety Council to advise on online safety. The Federal Trade Commission is tasked with enforcing these regulations.

Published

2024-04-09
Congress: 118
Session: 2
Chamber: HOUSE
Status: Introduced in House
Date: 2024-04-09
Package ID: BILLS-118hr7891ih

Bill Statistics

Size

Sections: 19
Words: 12,369
Pages: 62
Sentences: 220

Language

Nouns: 3,573
Verbs: 1,083
Adjectives: 822
Adverbs: 118
Numbers: 337
Entities: 392

Complexity

Average Token Length: 4.29 characters
Average Sentence Length: 56.22 words
Token Entropy: 5.62
Readability (ARI): 30.38

Analysis AI

General Summary of the Bill

The Kids Online Safety Act, introduced as H.R. 7891 in the 118th Congress, aims to enhance the safety of children and minors on the internet. This legislation is structured into three main titles. Title I addresses online safety, laying out definitions and duties for platforms, as well as requirements for transparency, research, and enforcement mechanisms. Title II focuses on filter bubble transparency, requiring platforms to allow users to view unmanipulated content. Title III clarifies the Act's relationship with state laws, stating that state laws with greater protection for minors are not preempted by this federal law.

Summary of Significant Issues

A major issue with this Act is the broad definition of "covered platform" (Section 101), which could include a wide variety of online services, potentially causing enforcement challenges. Section 301’s preemption clause might lead to legal disputes due to ambiguous language about what constitutes a conflict between federal and state laws. Additionally, the requirement for platforms to undergo independent audits (Section 105) might involve significant costs and privacy concerns, particularly for smaller companies. The subjective language regarding "clear, conspicuous, and easy-to-understand" notices in Section 104 also raises potential compliance and communication challenges. Furthermore, the mandates for research and studies (Sections 106 and 108) lack specific budgeting or funding sources, posing fiscal challenges and concerns about resource allocation.

Impact on the Public

The general public, especially parents and minors, could see both positive and negative impacts from this Act. On the positive side, the Act's focus on protecting children online is likely to offer greater safety and oversight over digital interactions for younger users. Parents might feel more empowered to monitor and control their children’s online activities thanks to enhanced parental tools and safety features included in the bill.

However, the broad and sometimes vague terminology used throughout the legislation could lead to inconsistent application and enforcement, potentially diminishing these positive outcomes. The cost and effort of compliance might also weaken platforms' incentives to implement these features well, indirectly degrading the user experience.

Impact on Specific Stakeholders

Online Platforms and Companies: Larger companies may have the resources to comply with the extensive requirements (such as audits and transparency reports), but smaller platforms could face financial strain. The Act's definitions impose broad obligations, making it difficult for smaller companies to navigate the requirements without incurring high compliance costs. Moreover, the ambiguous definition of “high impact online company” poses a challenge in determining which companies are subject to the law, potentially leading to uneven enforcement.

State Governments: The relationship to state law provisions could create tension between federal and state regulations, particularly if state laws offer more robust protections. This tension might lead to inconsistent enforcement and legal disputes over preemption.

Consumer Advocacy Groups and Legal Bodies: These stakeholders might be concerned with privacy implications due to the extensive data sharing required for compliance. They are likely to advocate for stricter privacy protections and clarity on compliance standards to prevent misuse or breaches of sensitive information.

In summary, while the Kids Online Safety Act is laudable in its mission to protect minors online, its effectiveness may be hindered by broad definitions, potential compliance burdens, and legal ambiguities. Addressing these issues through amendments or clearer regulatory guidelines could enhance the bill's practicality and effectiveness.

Financial Assessment

The Kids Online Safety Act, H.R. 7891, aims to enhance the safety of children on the internet by imposing responsibilities and regulations on online platforms. The bill also has financial dimensions whose implications warrant careful analysis.

Financial Provisions and Allocations

While the bill contains substantial regulatory language, it includes few explicit financial provisions or allocations and no direct spending or appropriations. This absence raises the question of how the bill's objectives will be funded and implemented.

Independent Audits and Data Sharing Costs

One of the bill's significant financial implications is the requirement for independent, third-party audits in Section 105, intended to assess risks to minors on covered platforms. The extensive data sharing these audits entail may impose notable costs, particularly on smaller platforms, raising the concern that companies unable to afford them will simply fall out of compliance.

Furthermore, the privacy risks of extensive data sharing call for stringent safeguards, which add further compliance and legal-risk-management costs. The bill specifies no funding or resources to offset these costs, compounding the burden on smaller platforms.

Issues of Unspecified Funding for Studies

The bill mandates numerous studies and reports, particularly on the impacts of social media on minors, as described in Section 106, yet it provides no budgetary allocations or cost estimates for them. Without identified funding sources, the research risks being underfunded or narrowed in scope, undermining its comprehensiveness and value.

Additionally, Section 108 mandates a study on age verification systems but similarly lacks details about funding sources or cost estimates. Without clear financial planning, the feasibility and effectiveness of this crucial study could be compromised, affecting the overall goal of enhancing online safety for minors.

Impact on "High Impact Online Companies"

The definition of a "high impact online company" turns on scale thresholds: $2,500,000,000 or more in annual revenue, or 150,000,000 or more global monthly active users. However, the bill does not clarify whether these companies will bear additional financial responsibilities or face specific financial penalties or incentives. This lack of clarity can lead to ambiguity in determining which companies owe heightened duties under the law and may impact overall legal enforcement and compliance.

Conclusion

In sum, while the Kids Online Safety Act outlines rigorous measures to protect children online, it lacks explicit financial details necessary for its effective implementation. The absence of specified budgets, cost estimates for mandated studies, and clear funding sources poses significant challenges. These ambiguities can lead to financial inefficiencies and complicate compliance, particularly for smaller entities required to meet the bill’s regulatory demands. Addressing these financial elements could strengthen the Act's enforcement and effectiveness, ensuring that the objectives of protecting minors online are met without undue financial strain.

Issues

  • The definition of 'covered platform' in Section 101 is broad, which could encompass a wide range of online services, potentially leading to enforcement challenges and creating loopholes for platforms seeking to avoid regulation, affecting both legal clarity and consumer protection.

  • Section 301's preemption clause is ambiguous about what constitutes 'a conflict' between state and federal laws, which could lead to legal disputes and confusion over whether state laws that offer more protection to minors are preempted.

  • The requirement for independent audits and extensive data sharing in Section 105 poses potential privacy risks and significant costs, especially for smaller platforms, which could lead to non-compliance and legal and financial challenges.

  • The subjective language in Section 104 regarding what constitutes 'clear, conspicuous, and easy-to-understand' notices potentially leads to ambiguity in compliance and enforcement, affecting both legal and communication clarity for platforms and users.

  • The bill's mandate for studies on social media and minors in Section 106 lacks a specific budget or cost estimate, potentially leading to inefficient use of resources and financial concerns about wasteful spending.

  • Section 108's lack of specification regarding funding sources for the age verification study could lead to inadequate budgeting and possible financial waste, potentially affecting its feasibility and effectiveness.

  • The definition of 'high impact online company' in Section 101 combines revenue and user thresholds with a content criterion in an ambiguous "or ... and" structure, making it unclear which companies are subject to the Section 102 duty of care and affecting legal enforcement and responsibility allocation.

  • Section 203's severability clause lacks specific examples or scenarios, which might make it difficult to understand the impact of certain provisions being deemed unenforceable, affecting the overall legal robustness of the Act.

Sections

Sections are presented as annotated in the original legislative text. Any missing headers or numbers, and any non-consecutive ordering, reflect the original text.

1. Short title; table of contents

Summary AI

The Kids Online Safety Act is introduced in Section 1, which includes its short title and the table of contents. The Act is organized into three main titles: Title I focuses on online safety for kids with sections that define terms, establish duties to protect minors, and require transparency and various studies. Title II deals with transparency in content manipulation on internet platforms, and Title III addresses how this Act interacts with state laws.

101. Definitions

Summary AI

The section explains various terms related to online platforms, including what defines a child, a minor, and a covered platform. It also outlines specifics like what constitutes a design feature that keeps minors engaged, what qualifies as compulsive usage, and what makes an online company high impact, as well as other related definitions.

Money References

  • Design features include, but are not limited to—
    (A) infinite scrolling or auto play;
    (B) rewards for time spent on the platform;
    (C) notifications;
    (D) push alerts that urge a user to spend more time engaged with the platform when they are not actively using it;
    (E) badges or other visual award symbols based on elevated levels of engagement with the platform;
    (F) personalized recommendation systems;
    (G) in-game purchases; or
    (H) appearance altering filters.

  • (5) HIGH IMPACT ONLINE COMPANY.—The term “high impact online company” means an online platform or online video game that provides any internet-accessible platform where—
    (A) such online platform or online video game generates $2,500,000,000 or more in annual revenue, including the revenue generated by any affiliate of such covered platform; or
    (B) such online platform or online video game has 150,000,000 or more global monthly active users for not fewer than 3 of the preceding 12 months on the online product or service of such covered platform; and
    (C) such online platform or online video game constitutes an online product or service that is primarily used by users to access or share, user-generated content.

  • (6) KNOW; KNOWS.—The term “know” or “knows” means—
    (A) with respect to a high impact online company, the platform knew or should have known the individual was a child or minor;
    (B) with respect to a covered platform that had an annual gross revenue of $200,000,000 or more, collects the personal information of 200,000 individuals or more, and does not meet the qualifications of subparagraph (A), that covered platform knew or acted in willful disregard of the fact that the individual was a child or minor; and
    (C) with respect to a covered platform that does not meet the requirements of subparagraph (A) or (B), actual knowledge.

  • (7) MENTAL HEALTH DISORDER.—The term “mental health disorder” has the meaning given the term “mental disorder” in the Diagnostic and Statistical Manual of Mental Health Disorders, 5th Edition (or the most current successor edition).

  • (8) MICROTRANSACTION.
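
The dollar and user thresholds above amount to a tiered classification scheme. Below is a minimal sketch of that logic; the class, field names, and the reading of the ambiguous "(A) or (B); and (C)" structure as (A or B) and C are illustrative assumptions, not anything the bill specifies.

    from dataclasses import dataclass

    @dataclass
    class Platform:
        # Hypothetical fields for illustration; the bill sets thresholds
        # but defines no data schema.
        annual_revenue: float   # includes affiliate revenue
        global_mau: int         # global monthly active users
        months_at_mau: int      # months (of the last 12) above the MAU bar
        primarily_ugc: bool     # primarily user-generated content
        pi_individuals: int     # individuals whose personal info is collected

    def is_high_impact(p: Platform) -> bool:
        # Sec. 101(5): $2.5B+ revenue OR 150M+ global MAU in 3+ of the
        # preceding 12 months, AND a primarily user-generated service
        # (assuming the ambiguous structure parses as (A or B) and C).
        meets_scale = (p.annual_revenue >= 2_500_000_000
                       or (p.global_mau >= 150_000_000
                           and p.months_at_mau >= 3))
        return meets_scale and p.primarily_ugc

    def knowledge_standard(p: Platform) -> str:
        # Sec. 101(6): the knowledge standard tightens with platform size.
        if is_high_impact(p):
            return "knew or should have known"  # constructive knowledge
        if p.annual_revenue >= 200_000_000 and p.pi_individuals >= 200_000:
            return "knew or acted in willful disregard"
        return "actual knowledge"

Note how the standard relaxes from constructive knowledge down to actual knowledge as platform size shrinks, which appears to be the tiering the drafters intend.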

102. Duty of care

Summary AI

A high impact online company must take reasonable steps to prevent and mitigate various harms to minors, including mental health issues such as anxiety and depression, compulsive usage patterns, violence, cyberbullying, sexual exploitation, and the promotion of harmful substances such as drugs and alcohol. However, the company is not required to block content that minors deliberately and independently search for, and nothing in the duty prevents the platform from providing resources aimed at preventing or mitigating these harms.

103. Safeguards for minors

Summary AI

The bill section focuses on protecting minors on online platforms by requiring safeguards that limit their interactions and provide parental control tools. It mandates platforms to offer default protective settings for minors, prohibits advertising illegal products to them, and ensures controls are clear, accessible, and do not use deceptive design tactics.

104. Disclosure

Summary AI

The section requires platforms to clearly inform minors and their parents about safety measures, parental tools, and potential risks before a minor registers for or uses the platform. It also requires platforms to make reasonable efforts to obtain parental consent, to provide details about personalized recommendation systems, and to make these notices and resources available in multiple languages.

105. Transparency

Summary AI

A covered platform with more than 10 million monthly U.S. users must issue an annual public report detailing potential risks to minors and the safety measures in place, informed by independent, third-party audits. The report, posted publicly online, must include information such as user demographics, identified risks, safety tools, and privacy safeguards, with its contents de-identified to protect users' privacy.
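
A rough sketch of the two mechanical pieces of this section, the user threshold and de-identification, follows; the function and field names are hypothetical, and the bill prescribes no particular de-identification method.

    MONTHLY_US_USERS_THRESHOLD = 10_000_000

    def must_publish_annual_report(monthly_us_users: int) -> bool:
        # Sec. 105 applies above 10 million monthly U.S. users.
        return monthly_us_users > MONTHLY_US_USERS_THRESHOLD

    def de_identify(records: list[dict]) -> list[dict]:
        # Strip direct identifiers before aggregating report data.
        # Field names are hypothetical; real de-identification would
        # also need to guard against re-identification via quasi-identifiers.
        direct_identifiers = {"user_id", "name", "email", "ip_address"}
        return [{k: v for k, v in rec.items() if k not in direct_identifiers}
                for rec in records]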

106. Research on social media and minors

Summary AI

The section outlines that the Federal Trade Commission will work with the National Academy of Sciences to conduct studies on the harms social media may pose to minors, focusing on issues like mental health, substance use, and exploitation. The studies will involve access to data from social media platforms and the participation of health officials to better understand and address these risks.

107. Market research

Summary AI

The Federal Trade Commission, collaborating with the Secretary of Commerce, will provide guidelines for companies conducting market research involving minors. These guidelines will include a consent form understandable by both minors and parents, instructions for getting parental consent, and research recommendations for different age groups, with a deadline for issuing the guidance set at 18 months from the law's enactment.

108. Age verification study and report

Summary AI

The section requires the Secretary of Commerce to work with two federal commissions to study ways to verify a person's age using their device or operating system. The study will explore the benefits, necessary information, accuracy, privacy impacts, technical feasibility, and competitive impacts of such age verification systems, and a report on the findings must be submitted to Congress within a year.

109. Guidance

Summary AI

The section requires the Federal Trade Commission to work with the Kids Online Safety Council to create guidance within 18 months, helping online platforms understand safety features for minors and how to protect them. It also mandates the Secretary of Education to help schools use these tools and provides that this guidance does not grant any rights or bind the FTC but aids enforcement of existing laws.

110. Enforcement

Summary AI

The section explains how the Federal Trade Commission can enforce violations of the Act as unfair or deceptive practices, granting the Commission the same powers, penalties, and privileges as under current law. It also outlines how state attorneys general can bring civil actions on behalf of state residents if they believe a platform has violated specific sections of the Act, with provisions for notifying the Commission and allowing it to intervene in these legal actions.

111. Kids Online Safety Council

Summary AI

The Kids Online Safety Council is being set up by the Secretary of Commerce to advise on online safety for minors, involving experts from various fields, parents, youth, and government representatives. The council's job is to pinpoint risks to minors online, suggest ways to protect them, and recommend best practices and research methods, but it won't be bound by rules that usually apply to federal advisory committees.

112. Effective date

Summary AI

The section explains that, unless specified otherwise, the title will become effective 18 months after the Act is officially enacted.

113. Rules of construction and other matters

Summary AI

The section clarifies that this bill does not override existing laws on student privacy and children's online privacy, nor does it require platforms to collect users' ages or implement age verification. It allows platforms to cooperate with law enforcement and comply with legal investigations. A video streaming service that mainly provides preselected rather than user-generated content is considered compliant if it offers tools for parental control and the protection of minors.

114. Severability

Summary AI

If any part of this title or its amendments is found to be unenforceable or invalid, the rest of the provisions and amendments will continue to be effective.

201. Definitions

Summary AI

This section defines terms used in the bill, such as "algorithmic ranking system," a computational process that ranks content on online platforms, and "connected device," an electronic device that connects to the internet to send and receive data. It also clarifies what constitutes user-specific data and geolocation information, and distinguishes input-transparent from opaque algorithms based on how they use user data.

202. Requirement to allow users to see unmanipulated content on internet platforms

Summary AI

This section mandates that online platforms using opaque algorithms must disclose this usage to users and allow them to switch to transparent algorithms without facing different prices or services. It also empowers the Federal Trade Commission to treat violations as unfair or deceptive practices and ensures the protection of trade secrets and personalized content restrictions.
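
In practice, the opaque versus input-transparent distinction turns on whether ranking may use user-specific data the user did not expressly supply. A minimal sketch of a user-selectable toggle in that spirit follows; the function and field names are hypothetical, and a reverse-chronological feed is only one possible input-transparent ranking.

    def rank_feed(items: list[dict], user: dict,
                  opaque_enabled: bool) -> list[dict]:
        # Sec. 202 sketch: the user may switch the opaque algorithm off
        # and must receive the same service and price either way.
        if opaque_enabled:
            # Opaque: ranking may draw on user-specific data the user
            # did not expressly supply (e.g., inferred interests).
            return sorted(items,
                          key=lambda i: personalized_score(i, user),
                          reverse=True)
        # Input-transparent: rank only on data the user expressly
        # provided; reverse-chronological is the simplest example.
        return sorted(items, key=lambda i: i["timestamp"], reverse=True)

    def personalized_score(item: dict, user: dict) -> float:
        # Hypothetical engagement model standing in for an opaque ranker.
        weights = user.get("interest_weights", {})
        return sum(weights.get(topic, 0.0)
                   for topic in item.get("topics", []))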

203. Severability

Summary AI

If any part of the title or its amendments is found to be invalid or cannot be enforced, the rest of the title and its amendments will still remain in effect.

301. Relationship to State laws

Summary AI

The Act overrides state laws only if they contradict its provisions, but it allows states to create laws that offer more protection to minors than what the Act provides.