Overview

Title

To provide Americans with foundational data privacy rights, create strong oversight mechanisms, and establish meaningful enforcement, and for other purposes.

ELI5 AI

H.R. 8818, called the American Privacy Rights Act of 2024, is like a big rulebook that helps protect people's personal data: companies have to ask for permission to use it, especially for kids, and everyone has to follow the same privacy rules all over the country. It also plans to use money from penalties to help people whose data was mishandled and to teach others how to protect their data better.

Summary AI

H.R. 8818, also known as the American Privacy Rights Act of 2024, establishes rights for Americans related to data privacy. It creates oversight mechanisms and enforces strong privacy protections, focusing on transparency, individual control over personal data, and prohibiting the misuse of data, especially for minors. The bill outlines requirements for companies handling data, including obtaining consent for data use, data security measures, and specific provisions for data collected from children. Additionally, it provides guidelines for federal and state enforcement and preempts certain state laws to create a unified data privacy standard.

Published

Congress: 118
Session: 2
Chamber: HOUSE
Status: Introduced in House
Date: 2024-06-25
Package ID: BILLS-118hr8818ih

Bill Statistics

Size

Sections: 29
Words: 36,517
Pages: 184
Sentences: 499

Language

Nouns: 9,844
Verbs: 3,473
Adjectives: 2,294
Adverbs: 363
Numbers: 942
Entities: 1,038

Complexity

Average Token Length: 4.32
Average Sentence Length: 73.18
Token Entropy: 5.64
Readability (ARI): 38.99

Analysis AI

The "American Privacy Rights Act of 2024" is a legislative proposal aimed at strengthening data privacy and protection for individuals across the United States. The bill seeks to establish foundational rights for data privacy, providing mechanisms for oversight and enforcement. It outlines specific requirements for collecting, processing, and sharing personal data, emphasizing minimization and consent, and includes robust protections for children and teens under the "Children’s Online Privacy Protection Act 2.0." Additionally, the bill aims to streamline how personal data is handled, focusing on transparency and user control.

Summary of Significant Issues

Several critical issues arise in this bill. First, the term "covered minor" in Section 120 is not clearly defined, which could complicate protections for minors, who are uniquely vulnerable to privacy invasions. Additionally, the preemption clause in Section 118 may weaken state-level data privacy protections and conflict with existing, more robust local laws.

Furthermore, the creation of a new Federal Trade Commission (FTC) bureau in Section 115 could expand the bureaucracy without demonstrated necessity, risking inefficiencies and increased government spending. The ambiguous definition of "dark patterns" in Section 107 presents another issue: without a clear standard, enforcement of the provisions designed to prevent manipulative design techniques could be inconsistent.

Broad Impact on the Public

Broadly, the bill seeks to protect consumer data privacy across the United States, which could benefit individuals increasingly concerned about digital privacy. However, the preemption of state laws might limit protections in states with more stringent local standards, weakening safeguards for residents there and underscoring the need for a national framework that is uniform yet comprehensive enough to address local concerns.

Impact on Specific Stakeholders

Businesses: Companies, especially those handling large volumes of data, will need to make substantial changes to comply with the new data minimization and transparency requirements. Small businesses and nonprofits, which may be excluded from the definition of "covered entities," could see mixed effects: reduced regulatory burdens on one hand, but loopholes that undermine comprehensive enforcement of privacy rights on the other.

Regulatory Bodies: Establishing a new bureau within the FTC might improve structured oversight and execution of privacy protection laws, provided it functions efficiently. However, the cost and potential overlap with existing efforts may draw public scrutiny if not properly managed.

Consumers: On the positive side, the bill could enhance consumer confidence by increasing data protection and granting individuals more control over their information. On the downside, the lack of clarity in certain provisions might lead to confusion or limited effectiveness in upholding privacy rights.

Children and Teens: Barring targeted advertising to minors and restricting the sharing of their data could make online environments safer for younger users. Nevertheless, the complexity of the consent mechanisms for collecting children's data poses compliance challenges for service providers and could lead to inconsistent protections.

Conclusion

The "American Privacy Rights Act of 2024" is a significant legislative step towards enhancing data privacy in the United States, addressing crucial areas such as children's online privacy and individuals' rights over personal data. While it proposes valuable measures for securing consumer data, unresolved issues around clear definitions and bureaucratic expansion indicate the need for further refinement to ensure thorough and effective implementation of its provisions. Balancing the enforcement of privacy rights with maintaining state-level protections and efficient regulatory structures will be vital in realizing the bill's full potential impact.

Financial Assessment

Money is referenced in several ways throughout H.R. 8818, the American Privacy Rights Act of 2024. This commentary will explore these references and their potential implications, particularly concerning the identified issues.

Financial References and Implications

  1. Registration Fee for Data Brokers: The bill requires data brokers to register with the Federal Trade Commission (FTC) and pay a registration fee of $100. This monetary requirement is part of a broader effort to regulate data brokers, ensuring transparency and accountability. However, this relatively low fee raises questions about its sufficiency to cover the administrative costs associated with overseeing compliance. Furthermore, while this fee establishes a nominal barrier to entry, it may not significantly deter smaller entities, which might still exploit data handling loopholes, contributing to the issue of entities excluded from the definition of 'covered entities.'

  2. Establishment of a New FTC Bureau: Section 115 calls for the establishment of a new bureau within the FTC to assist in enforcing the legislation. This bureau is mandated to have staffing comparable to existing bureaus, including technologists, attorneys, and other professionals. Although not quantified explicitly within the bill, this establishment implies significant new government spending. The concern here lies in the potential bureaucratic expansion without a clear accountability structure, as pointed out in the issues list. Without stringent oversight, this could lead to inefficiencies and inflated costs without demonstrable advantages in consumer protection.

  3. Privacy and Security Victims Relief Fund: The bill proposes a "Privacy and Security Victims Relief Fund" to be financed by civil penalties collected from entities that violate the act. This fund is intended to provide financial redress to individuals harmed by such violations and support consumer education and technological research. However, the criteria for allocation of these funds remain vague, which could lead to discretionary misuse or inefficient fund allocation. The potential lack of oversight and clear usage guidelines ties into concerns about mismanagement, highlighting the need for well-defined accountability measures.

  4. Revenue Criteria for Large and Small Entities: Certain sections define financial thresholds to categorize entities as small businesses or large data holders. A small business is defined as having average annual gross revenues not exceeding $40,000,000, while a large data holder must have annual gross revenue of at least $250,000,000. These financial boundaries determine the obligations and exemptions entities face under the bill. The high threshold for large data holders may overlook significant data privacy risks posed by smaller entities managing substantial amounts of sensitive data. This, in turn, exacerbates concerns about data protection loopholes and inadequate enforcement scope.

Overall, while the financial references within the bill aim to establish regulatory frameworks and provide remedies for privacy violations, they also bring attention to concerns about administrative efficacy, enforcement rigor, and sufficient oversight mechanisms to ensure financial allocations achieve their intended purposes. These monetary details are critical to understanding the bill's potential impact and the effectiveness of its provisions.

Issues

  • The lack of a clear definition for 'covered minor' in Section 120 poses ambiguity in protecting the privacy of minors, which is critical as it relates to the handling of vulnerable individuals' data.

  • The preemption clause in Section 118 could limit the ability of states to enforce stricter data privacy laws, potentially weakening protections for individuals in states with more stringent regulations.

  • Section 115 establishes a new bureau within the FTC, which could lead to bureaucratic expansion and increased spending without clear accountability or demonstrated necessity, risking inefficiencies in government spending.

  • The ambiguous definition of 'dark patterns' in Section 107 could lead to inconsistent enforcement, as it relies on subjective interpretation of user interface designs that may impair user autonomy.

  • The complexity and lack of clarity in consent mechanisms for the online collection of children's data (Section 202) add administrative burdens for operators and may hinder compliance, impacting children's privacy.

  • In Section 115, the establishment of a 'Privacy and Security Victims Relief Fund' includes vague criteria for fund allocation, potentially leading to mismanagement or discretionary misuse of funds without adequate oversight.

  • The exclusion of some entities such as small businesses and certain nonprofits from being considered 'covered entities' in Section 101 can create potential data protection loopholes, affecting the scope of privacy enforcement.

  • The criteria for defining a 'large data holder' in Section 101 may overlook smaller entities handling sensitive data, potentially leaving important data privacy issues inadequately addressed.

  • The termination of the FTC's rulemaking on commercial surveillance and data security in Section 121 lacks justification, leaving questions about future consumer protection measures against these threats.

  • The ambiguous language regarding 'substantial privacy harm' in Section 117 could lead to varied interpretations and legal challenges, complicating individuals' ability to pursue claims for privacy infringements.

Sections

Sections are presented as they are annotated in the original legislative text. Any missing headers, numbers, or non-consecutive ordering reflects the original text.

1. Short title; table of contents

Summary AI

The "American Privacy Rights Act of 2024" comprises comprehensive guidelines on protecting consumer data privacy. It outlines various aspects such as data minimization, transparency, individual rights over personal data, data security, and enforcement measures by different entities. Additionally, it introduces the "Children’s Online Privacy Protection Act 2.0" which addresses the handling of children's personal information online.

101. Definitions

Summary AI

The document provides detailed definitions of terms related to data privacy and usage, including what constitutes affirmative express consent, biometric information, covered data, service providers, and many others. It outlines how each term is applied in the context of data collection, processing, and sharing while specifying exclusions and conditions where applicable.

Money References

  • — (A) IN GENERAL.—The term “covered high-impact social media company” means a covered entity that provides any internet-accessible platform that— (i) generates $3,000,000,000 or more in global annual revenue, including the revenue generated by any affiliate of such covered entity; (ii) has 300,000,000 or more global monthly active users for not fewer than 3 of the preceding 12 months; and (iii) constitutes an online product or service that is primarily used by users to access or share user-generated content.
  • — (A) IN GENERAL.—The term “large data holder” means a covered entity or service provider that, in the most recent calendar year, had an annual gross revenue of not less than $250,000,000 and, subject to subparagraph (B), collected, processed, retained, or transferred— (i) the covered data of— (I) more than 5,000,000 individuals; (II) more than 15,000,000 portable connected devices that identify or are linked or reasonably linkable to 1 or more individuals; or (III) more than 35,000,000 connected devices that identify or are linked or reasonably linkable to 1 or more individuals; or (ii) the sensitive covered data of— (I) more than 200,000 individuals; (II) more than 300,000 portable connected devices that identify or are linked or reasonably linkable to 1 or more individuals; or (III) more than 700,000 connected devices that identify or are linked or reasonably linkable to 1 or more individuals. (B) EXCLUSIONS.—For the purposes of subparagraph (A), a covered entity or service provider may not be considered a large data holder solely on the basis of collecting, processing, retaining, or transferring to a service provider— (i) personal mailing or email addresses; (ii) personal telephone numbers; (iii) log-in information of an individual or device to allow the individual or device to log in to an account administered by the covered entity; or (iv) in the case of a covered entity that is a seller of goods or services (other than an entity that facilitates payment, such as a bank, credit card processor, mobile payment system, or payment platform), credit, debit, or mobile payment information necessary and used to initiate, render, bill for, finalize, complete, or otherwise facilitate payments for such goods or services.
  • — (A) IN GENERAL.—The term “small business” means an entity (including any affiliate of the entity)— (i) that has average annual gross revenues for the period of the 3 preceding calendar years (or for the period during which the entity has been in existence, if such period is less than 3 calendar years) not exceeding $40,000,000, indexed to the Producer Price Index reported by the Bureau of Labor Statistics; (ii) that, on average for the period described in clause (i), did not annually collect, process, retain, or transfer the covered data of more than 200,000 individuals for any purpose other than initiating, rendering, billing for, finalizing, completing, or otherwise collecting payment for a requested service or product; and (iii) that did not, during the period described in clause (i), transfer covered data to a third party in exchange for revenue or anything of value, except for purposes of initiating, rendering, billing for, finalizing, completing, or otherwise collecting payment for a requested service or product or facilitating web analytics that are not used to create an online activity profile.
  • (53) SUBSTANTIAL PRIVACY HARM.—The term “substantial privacy harm” means— (A) any alleged financial harm of not less than $10,000; or (B) any alleged physical or mental harm to an individual that involves— (i) treatment by a licensed, credentialed, or otherwise bona fide health care provider, hospital, community health center, clinic, hospice, or residential or outpatient facility for medical, mental health, or addiction care; or (ii) physical injury, highly offensive intrusion into the privacy expectations of a reasonable individual under the circumstances, or discrimination on the basis of race, color, religion, national origin, sex, or disability. (54) TARGETED ADVERTISING.—The term “targeted advertising”— (A) means displaying or presenting an online advertisement to an individual or to a device identified by a unique persistent identifier (or to a group of individuals or devices identified by unique persistent identifiers), if the advertisement is selected based, in whole or in part, on known or predicted preferences or interests associated with the individual or device; (B) includes— (i) an online advertisement by a covered high-impact social media company for a product or service that is not a product or service offered by the covered high-impact social media company; and (ii) an online advertisement for a product or service based on the previous interaction of an individual or a device identified by a unique persistent identifier with such product or service on a website or online service that does not share common branding or affiliation with the website or online service displaying or presenting the advertisement; and (C) excludes contextual advertising and first-party advertising.

102. Data minimization

Summary AI

The section outlines rules to ensure that companies collect, use, and share personal data only when needed and with consent, especially when it comes to sensitive data like biometric and genetic information. It lists several specific reasons why data can be shared without consent, such as to protect against fraud or comply with legal obligations, but emphasizes that these actions must be necessary, appropriate, and not excessive.

103. Privacy by design

Summary AI

Each company or service provider must create and maintain policies and procedures to handle personal data responsibly, especially for vulnerable groups such as minors, the elderly, and people with disabilities. They need to evaluate their privacy risks based on their business size and the sensitivity of the data, and they must follow the latest technological and security practices. The government will provide guidance on what these reasonable policies look like within a year of the law's enactment.

104. Transparency

Summary AI

The section outlines requirements for covered entities and service providers to publish a clear and accessible privacy policy detailing their data practices, including data collection, retention, and sharing. It also mandates that privacy policies be accessible to individuals with disabilities, available in multiple languages, and that any material changes must be communicated to users with an option to opt out if desired. Additional transparency measures apply to large data holders, requiring them to retain and disclose previous privacy policies and provide concise short-form notices of data practices.

105. Individual control over covered data

Summary AI

The section outlines the rights individuals have regarding their personal data held by companies. It details how people can access, correct, delete, and transfer their data, and sets rules for how quickly companies must respond, what exceptions exist, and what information companies must report annually.

106. Opt-out rights and universal mechanisms

Summary AI

A covered entity must provide individuals with the ability to opt out of their data being shared with third parties and used for targeted advertising. The law outlines that within two years, clear and easy-to-use universal opt-out mechanisms should be established, allowing individuals to exercise these rights through a single interface, ensuring accessibility for all, including those with disabilities.

107. Interference with consumer rights

Summary AI

A covered entity is not allowed to use "dark patterns," which are manipulative design techniques, to distract someone from important notices, hinder their rights, or trick them into giving consent. Additionally, they cannot mislead individuals with false or deceptive statements to make them waive their rights.

108. Prohibition on denial of service and waiver of rights

Summary AI

In Section 108, the bill states that companies cannot punish individuals for using their rights by changing the price or quality of goods and services. However, companies can offer different prices or services through genuine loyalty programs with customer consent, as long as the sale of personal data is not required, and can also provide incentives for market research, or decide not to offer a product if it's necessary to maintain privacy rights.

109. Data security and protection of covered data

Summary AI

The section requires every organization handling sensitive data to establish and maintain strong security practices to protect the data from unauthorized access. It includes conducting regular risk assessments, taking preventive and corrective actions, following strict data retention and disposal practices, providing employee training, and having procedures for responding to security incidents.

110. Executive responsibility

Summary AI

A covered entity or service provider must appoint privacy and data security officers to ensure compliance with data privacy rules, while large data holders are required to designate specialized officers for managing privacy and security separately. Large data holders must conduct regular internal audits, establish reporting structures, and complete privacy impact assessments to evaluate and document the impact of their data handling practices on individual privacy.

111. Service providers and third parties

Summary AI

Service providers must follow strict rules when handling data for other businesses, including adhering to specific instructions, deleting data after use, and ensuring compliance with various data protection laws. They also need to enter contracts with third parties to ensure data is used properly, exercise reasonable care in selecting service providers, and halt data transfers if they know a violation could occur.

112. Data brokers

Summary AI

Data brokers must create informative websites to help individuals exercise certain rights, are prohibited from engaging in harmful practices, and must register with the Commission if they handle data from over 5,000 people or devices. The Commission will maintain a public registry for data brokers, allowing individuals to submit "Do Not Collect" or "Delete My Data" requests, which must be honored within 30 days unless exceptions apply.

Money References

  • (2) REGISTRATION REQUIREMENTS.—In registering with the Commission as required under paragraph (1), a data broker shall do the following: (A) Pay to the Commission a registration fee of $100. (B) Provide the Commission with the following information: (i) The legal name and primary valid physical postal address, email address, and internet address of the data broker. (ii) A description of the categories of covered data the data broker collects, processes, retains, or transfers.

113. Commission-approved compliance guidelines

Summary AI

A covered entity or group that is not a large data holder can apply to the Commission to have its data compliance guidelines approved. Approval requires meeting certain criteria, including independent review and enforcement, and any significant changes to the guidelines must be submitted for re-approval. Entities that follow approved guidelines can self-certify their compliance and enjoy a presumption of compliance. If the guidelines fail to meet standards, approval can be withdrawn, but the entity must first be given a chance to fix any issues. Service providers that are not large data holders are likewise eligible to apply and participate in these guidelines.

114. Privacy-enhancing technology pilot program

Summary AI

The section describes a pilot program encouraging the use of privacy-enhancing technologies to protect data, where eligible companies can join by meeting specific security standards. It outlines the program's establishment, requirements for participation, responsibilities of the overseeing Commission, evaluation through audits and studies, possible withdrawal from the program if standards aren't maintained, and offers legal protections for compliant participants.

115. Enforcement by Federal Trade Commission

Summary AI

The bill requires the Federal Trade Commission to establish a new bureau focused on consumer protection and competition, with a specialized staff that includes attorneys and technologists. It also creates a "Privacy and Security Victims Relief Fund" to help people affected by violations and mandates regular reports to Congress about investigations and plans related to enforcing the new rules.

116. Enforcement by States

Summary AI

In this section, states are given the power to take legal action if a company violates privacy or data security laws, but they must notify the federal Commission first. It sets rules for how states can legally pursue these cases, work with outside firms, and collaborate with the federal government, while ensuring all existing state powers are preserved.

117. Enforcement by persons

Summary AI

In this section, individuals can file lawsuits against certain organizations for violating data privacy laws, with possible remedies including damages and legal fees. It explains the process for seeking injunctive relief or damages, conditions under which arbitration agreements may be invalid, and emphasizes that notification requirements may be waived in cases of substantial privacy harm.

118. Relation to other laws

Summary AI

The section establishes a national standard for privacy and data security, overriding most state laws to create consistency across the U.S. However, it allows states to keep laws on topics like consumer protection, civil rights, and data breach notifications. It also respects existing federal laws and does not interfere with civil lawsuits based on common law rights or state statutes. Certain FCC privacy regulations under the Communications Act of 1934 and the Telecommunications Act of 1996 do not apply to covered entities, although some exceptions exist, such as protections for emergency services and telecommunications confidentiality.

119. Children’s Online Privacy Protection Act of 1998

Summary AI

This section clarifies that nothing in the current title changes or removes any responsibilities or requirements that a company or individual may have under the Children’s Online Privacy Protection Act of 1998.

120. Data protections for covered minors

Summary AI

Covered entities and service providers are prohibited from showing targeted ads to minors and cannot share minors' data with third parties without consent unless certain legal exceptions apply. The Commission can set rules to help parents and teens manage their rights under this law, ensuring alignment with existing child privacy regulations.

121. Termination of FTC rulemaking on commercial surveillance and data security

Summary AI

This section states that as soon as this Act is enacted, the rulemaking process for the "Trade Regulation Rule on Commercial Surveillance and Data Security," which was initiated on August 22, 2022, will be stopped.

122. Severability

Summary AI

If any part of this title is found to be invalid, the rest of the title remains in effect and can still be applied to other people or situations.

123. Innovation rulemakings

Summary AI

The Commission can create new rules to add more types of data to the definition of "sensitive covered data," but it cannot expand a specific category of information already defined. It can also update the list of allowed reasons for handling this data.

124. Effective date

Summary AI

This section states that, unless stated otherwise, the provisions in this title will become effective 180 days after the date when the Act is officially passed into law.

201. Short title

Summary AI

The section introduces the Children’s Online Privacy Protection Act 2.0, which is the official name for this part of the legislation.

202. Online collection, use, disclosure, and deletion of personal information of children

Summary AI

The proposed amendments to the Children's Online Privacy Protection Act aim to update and clarify definitions and provisions regarding the collection, use, and safeguarding of children's personal information online. It establishes rules for operators of websites and services directed at children, ensuring they obtain proper parental consent and adhere to guidelines that protect children's privacy, while providing conditions for educational institutions to handle such data responsibly.

203. Study and reports on mobile and online application oversight and enforcement

Summary AI

The section mandates that within specific timeframes, the Federal Trade Commission (FTC) must submit reports to Congress about its oversight and enforcement efforts concerning mobile and online applications that target children. These reports will cover compliance with children’s online privacy laws, details of enforcement actions taken, and policy recommendations, while the FTC's Inspector General will evaluate the effectiveness of existing safe harbor provisions meant to protect children's online privacy.

204. Severability

Summary AI

This section states that if any part of the bill or its amendments is found to be invalid, the rest of the bill and its amendments should still remain effective and applicable to other situations or people.