Overview

Title

To combat the sexual exploitation of children by supporting victims and promoting accountability and transparency by the tech industry.

ELI5 AI

The "STOP CSAM Act of 2024" is a plan to help kids by making sure websites don’t host bad material about children and act quickly to remove it. It makes the rules stricter for internet companies, so they have to follow new important rules to keep kids safe online.

Summary AI

H.R. 7949, known as the "STOP CSAM Act of 2024," aims to combat the online sexual exploitation of children by improving victim support, enhancing accountability, and ensuring transparency from the tech industry. The bill amends federal laws to better protect child victims and witnesses in court, mandates reporting and removal procedures for child sexual abuse material, and holds online platforms accountable for hosting or promoting such content. It also establishes the Child Online Protection Board to oversee complaints and ensure quick removal of harmful material, while providing civil remedies for victims. Additionally, it introduces fines and penalties for tech companies that fail to comply with these provisions.

Published

2024-04-11
Congress: 118
Session: 2
Chamber: HOUSE
Status: Introduced in House
Date: 2024-04-11
Package ID: BILLS-118hr7949ih

Bill Statistics

Size

Sections:
9
Words:
27,482
Pages:
141
Sentences:
428

Language

Nouns: 6,879
Verbs: 2,309
Adjectives: 1,525
Adverbs: 195
Numbers: 767
Entities: 751

Complexity

Average Token Length:
4.13
Average Sentence Length:
64.21
Token Entropy:
5.56
Readability (ARI):
33.54
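The figures above come from standard text metrics. As a hedged illustration (the exact tokenization and sentence-splitting used to produce the statistics on this page are not specified, so these simplified functions will not exactly reproduce the bill's numbers), the Automated Readability Index and token entropy can be computed like this:

```python
import math
from collections import Counter

def ari(text: str) -> float:
    """Automated Readability Index:
    4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43."""
    words = text.split()
    chars = sum(len(w.strip(".,;:!?")) for w in words)
    sentences = max(1, sum(text.count(p) for p in ".!?"))
    return 4.71 * (chars / len(words)) + 0.5 * (len(words) / sentences) - 21.43

def token_entropy(text: str) -> float:
    """Shannon entropy (bits) of the token frequency distribution."""
    tokens = [w.strip(".,;:!?").lower() for w in text.split()]
    counts = Counter(t for t in tokens if t)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = "The quick brown fox jumps over the lazy dog. The dog did not mind."
print(round(ari(sample), 2))            # simple text scores low (here, below 0)
print(round(token_entropy(sample), 2))  # more varied vocabulary -> higher entropy
```

An ARI of 33.54 and an average sentence length of 64 words both indicate text far beyond ordinary reading levels, which is typical of statutory language.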

Analysis AI

Overview of the Bill

The proposed legislation, titled the "Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2024" or the "STOP CSAM Act of 2024," aims to combat the sexual exploitation of children. The bill outlines various measures to support victims and to increase accountability and transparency within the tech industry. It proposes amendments to existing U.S. laws, focusing on improving the legal handling of child exploitation cases, enforcing proper compensation for victims, and mandating that tech companies report and remove child sexual abuse material (CSAM) swiftly.

Key components of the bill include modifications to Federal court processes to better protect child victims and witnesses, enhancements to facilitate payments of restitution for victims, and improvements to the CyberTipline reporting structure. It establishes significant penalties for tech companies failing to comply with new reporting and removal guidelines. Additionally, the legislation seeks to expand civil remedies for victims and create a Child Online Protection Board to oversee the implementation of these processes.

Significant Issues

One of the considerable issues with the bill relates to financial penalties, especially for tech companies. Sections such as the one on CyberTipline improvements impose substantial fines that some observers view as arbitrary and that could hit smaller companies hardest. There are also concerns about the $40,000,000 annual budget for the newly established Child Online Protection Board, with worries about potential waste in the absence of a detailed expense justification.

The bill's elimination of the statute of limitations for filing civil complaints could result in legal uncertainty for service providers, increasing litigation risks and compliance costs. Terms like "knowing" conduct and "good faith" effort within the defense clauses are vague, leading to potential inconsistencies in legal interpretations and applications.

Moreover, the bill's technical and complex language is a recurring issue across sections, potentially hindering public understanding and complicating compliance for laypersons and smaller providers.

Impact on the Public

For the general public, the STOP CSAM Act of 2024 underscores a commitment to protecting children and holds tech companies accountable for handling harmful content. By focusing on expeditious removal and clear procedures, the bill aims to reduce the circulation of CSAM online, aligning with broader societal interests in safety and justice for victims. These steps might increase public confidence in digital safety and foster greater industry responsibility for child protection.

Impact on Specific Stakeholders

Victims and Their Families: The bill expands the rights of victims to seek restitution and civil remedies, offering greater opportunities for justice and financial compensation. This empowerment might provide victims and their families with both emotional and practical support in recovering from exploitation.

Tech Industry: Companies providing online services are significantly impacted by the bill. Large penalties and stringent reporting requirements may place a substantial burden on tech companies, especially smaller entities with fewer resources to navigate complex compliance issues. This could lead to an increase in operational costs and encourage more investment in compliance infrastructure.

Legal Community: Legal professionals might see increased demand for expertise in navigating these new regulations, with potential growth in litigation related to the bill's implementation. Lawyers and advocates working on child protection issues may find the legislation an additional tool to support their causes.

In summary, while the STOP CSAM Act of 2024 seeks to bolster protection measures and advance accountability in tackling child sexual exploitation, its effectiveness may hinge on the balance between stringent enforcement and practical feasibility for stakeholders involved. The issues of implementation complexity, financial burden on smaller companies, and the need for clear guidelines will determine the bill's reception and impact.

Financial Assessment

In examining H.R. 7949, the "STOP CSAM Act of 2024," various financial elements warrant particular attention. These references and allocations are inherently tied to several issues outlined in the bill's sections, particularly concerning penalties, funding provisions, and overall financial oversight.

Financial Penalties and Fines

The bill imposes significant financial penalties on technology providers that violate specific requirements related to child sexual exploitation material. For instance, Section 2260B establishes fines of up to $1,000,000, rising to $5,000,000 where a violation involves a conscious or reckless risk of serious personal injury or an individual is harmed as a direct and proximate result. Additionally, providers face a civil penalty of between $50,000 and $250,000 for failing to comply with CyberTipline reporting obligations, as outlined in Section 4.

These financial penalties are criticized as potentially arbitrary and disproportionately severe, particularly for smaller companies, which could find compliance financially burdensome. The concern is underscored by Section 4's tiered fine structure, keyed to whether a provider has at least 100,000,000 monthly active users: maximum fines range from $600,000 (a smaller provider's first violation) to $1,000,000 (a larger provider's repeat violation), potentially exacerbating financial strain on growing platforms.
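That tier logic can be made concrete. The following is a minimal sketch based solely on the figures in the bill text quoted in Section 4 below ($850,000/$600,000 for an initial violation and $1,000,000/$850,000 for a repeat violation, split at the 100,000,000 monthly-active-user threshold); the statute, not this sketch, is authoritative:

```python
def max_cybertipline_fine(monthly_active_users: int, prior_violation: bool) -> int:
    """Maximum fine under the tiers quoted in Section 4: keyed to the
    100,000,000 monthly-active-user threshold ("not fewer than" = at least)
    and whether this is a first or a repeat violation."""
    large = monthly_active_users >= 100_000_000
    if prior_violation:
        return 1_000_000 if large else 850_000
    return 850_000 if large else 600_000

print(max_cybertipline_fine(150_000_000, False))  # large provider, first violation: 850000
print(max_cybertipline_fine(5_000_000, True))     # small provider, repeat violation: 850000
```

Note that for a small provider, a repeat violation caps at the same amount as a large provider's first violation, so the schedule scales with both size and history.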

Appropriations and Funding

There are substantial allocations for appropriations in multiple sections. Section 2 authorizes $25,000,000 each fiscal year for use by the U.S. courts to implement victim protection measures. Similarly, Section 6 allocates $40,000,000 annually for the operational costs of the Child Online Protection Board.

Critics have raised concerns about the perceived excessive nature of the $40,000,000 annual funding for the Child Online Protection Board: without a comprehensive breakdown of expected costs and expenses, there are worries about potential wasteful spending. Similarly, the $15,000,000 authorized annually under Section 3 to facilitate restitution payments is noted as lacking detailed oversight mechanisms.

Restitution and Civil Remedies

Section 5 and the new section 2255A eliminate the statute of limitations for the specified civil actions, allowing victims to file at any time; this could increase litigation and compliance costs for technology platforms. The act also entitles prevailing victims to recover actual damages or liquidated damages of $300,000, plus costs, which highlights the financial exposure of online platforms targeted by such suits.

While the intent behind these measures is to offer comprehensive support to victims and maintain accountability within the tech industry, the financial burdens imposed by these regulations may cause significant challenges for technology providers, particularly smaller entities. Additionally, the complex language of financial stipulations may limit understanding and complicate compliance efforts, which is a notable issue given the intricate nature of the bill.

Issues

  • Section 4 - Cybertipline improvements: The financial penalties for providers not complying with the reporting and preservation requirements are substantial and perceived as arbitrary, possibly disproportionately affecting smaller companies. This could discourage compliance or lead to financial strain on smaller tech companies.

  • Section 6 - Reporting and removal of child sexual abuse material: The $40,000,000 annual funding for the Child Online Protection Board appears excessive without a detailed breakdown of expected expenses, raising concerns about potential wasteful spending.

  • Section 5 & 2255A - Expanding civil remedies and civil remedy against online platforms: The elimination of the statute of limitations for filing complaints may lead to legal uncertainty for providers and challenges in defending against old misconduct, potentially increasing litigation and compliance costs.

  • Section 2255A - Civil remedy against online platforms and app stores: The vagueness in terms such as 'knowing' conduct and 'good faith' efforts in defense conditions could lead to inconsistent application and legal interpretations, affecting the ability of platforms to defend themselves.

  • Section 3 - Facilitating payment of restitution: The provision authorizing $15,000,000 annually for use under restitution amendments lacks detailed oversight mechanisms and justification of fund allocation, raising concerns about potential misuse or unjustified expenditure.

  • General Issue - Complexity and Transparency: Across multiple sections, the dense legal and technical language complicates understanding and compliance for laypersons and smaller providers, potentially limiting public understanding and transparency of the bill's implications.

Sections

Sections are presented as they are annotated in the original legislative text. Any missing headers, numbers, or non-consecutive order is due to the original text.

1. Short title

Summary AI

The first section of the bill specifies that it can be officially called the "Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2024" or the "STOP CSAM Act of 2024."

2. Protecting child victims and witnesses in Federal court

Summary AI

The bill modifies existing laws to better protect child victims and witnesses in federal court by updating definitions and language, expanding the scope of protections, and specifying how sensitive information should be handled. It also authorizes funding for related activities and ensures these changes apply retroactively to past, present, and future conduct.

Money References

  • — “(A) IN GENERAL.—There is authorized to be appropriated to the United States courts to carry out this subsection $25,000,000 for each fiscal year.

3. Facilitating payment of restitution; technical amendments to restitution statutes

Summary AI

The proposed amendments to Title 18 of the U.S. Code update restitution laws to facilitate better victim compensation, particularly in cases involving child pornography and trafficking crimes. Key changes include clarifying the definition of restitution rights, allowing courts to appoint trustees to manage restitution funds for certain victims, and providing guidelines for trustee duties, fees, and potential use of appropriated funds for these purposes.

Money References

  • Title 18, United States Code, is amended— (1) in section 1593(c)— (A) by inserting “(1)” after “(c)”; (B) by striking “chapter, including, in” and inserting the following: “chapter. “(2) In”; and (C) in paragraph (2), as so designated, by inserting “may assume the rights of the victim under this section” after “suitable by the court”; (2) in section 2248(c)— (A) by striking “For purposes” and inserting the following: “(1) IN GENERAL.—For purposes”; (B) by striking “chapter, including, in” and inserting the following: “chapter. “(2) ASSUMPTION OF CRIME VICTIM'S RIGHTS.—In”; and (C) in paragraph (2), as so designated, by inserting “may assume the rights of the victim under this section” after “suitable by the court”; (3) in section 2259— (A) by striking subsection (a) and inserting the following: “(a) In general.—Notwithstanding section 3663 or 3663A, and in addition to any other civil or criminal penalty authorized by law, the court shall order restitution for any offense under— “(1) section 1466A, to the extent the conduct involves a visual depiction of an identifiable minor; or “(2) this chapter.”; (B) in subsection (b)— (i) in paragraph (1), by striking “Directions.—Except as provided in paragraph (2), the” and inserting “Restitution for child pornography production.—If the defendant was convicted of child pornography production, the”; and (ii) in paragraph (2)(B), by striking “$3,000.” and inserting the following: “— “(i) $3,000; or “(ii) 10 percent of the full amount of the victim’s losses, if the full amount of the victim's losses is less than $3,000.”; and (C) in subsection (c)— (i) by striking paragraph (1) and inserting the following: “(1) CHILD PORNOGRAPHY PRODUCTION.—For purposes of this section and section 2259A, the term ‘child pornography production’ means— “(A) a violation of, attempted violation of, or conspiracy to violate section 1466A(a) to the extent the conduct involves production of a visual depiction of an identifiable minor; “(B) a 
violation of, attempted violation of, or conspiracy to violate section 1466A(a) involving possession with intent to distribute, or section 1466A(b), to the extent the conduct involves a visual depiction of an identifiable minor— “(i) produced by the defendant; or “(ii) that the defendant attempted or conspired to produce; “(C) a violation of subsection (a), (b), or (c) of section 2251, or an attempt or conspiracy to violate any of those subsections under subsection (e) of that section; “(D) a violation of section 2251A; “(E) a violation of section 2252(a)(4) or 2252A(a)(5), or an attempt or conspiracy to violate either of those sections under section 2252(b)(2) or 2252A(b)(2), to the extent such conduct involves child pornography— “(i) produced by the defendant; or “(ii) that the defendant attempted or conspired to produce; “(F) a violation of subsection (a)(7) of section 2252A, or an attempt or conspiracy to violate that subsection under subsection (b)(3) of that section, to the extent the conduct involves production with intent to distribute; “(G) a violation of section 2252A(g) if the series of felony violations involves not fewer than 1 violation— “(i) described in subparagraph (A), (B), (E), or (F) of this paragraph; “(ii) of section 1591; or “(iii) of section 1201, chapter 109A, or chapter 117, if the victim is a minor; “(H) a violation of subsection (a) of section 2260, or an attempt or conspiracy to violate that subsection under subsection (c)(1) of that section; “(I) a violation of section 2260B(a)(2) for promoting or facilitating an offense— “(i) described in subparagraph (A), (B), (D), or (E) of this paragraph; or “(ii) under section 2422(b); and “(J) a violation of chapter 109A or chapter 117, if the offense involves the production or attempted production of, or conspiracy to produce, child pornography.
  • — “(A) IN GENERAL.—There is authorized to be appropriated to the United States courts to carry out this subsection $15,000,000 for each fiscal year.

4. Cybertipline improvements, and accountability and transparency by the tech industry

Summary AI

This section amends U.S. laws to improve efforts against online child exploitation. It mandates that tech providers report child exploitation promptly, imposes penalties for non-compliance, and enhances transparency by requiring annual reports, while also protecting providers who act in good faith to stop such crimes.

Money References

  • that violates subparagraph (A) shall be fined— “(I) in the case of an initial violation, not more than— “(aa) $850,000 if the provider has not fewer than 100,000,000 monthly active users; or “(bb) $600,000 if the provider has fewer than 100,000,000 monthly active users; and “(II) in the case of any second or subsequent violation, not more than— “(aa) $1,000,000 if the provider has not fewer than 100,000,000 monthly active users; or “(bb) $850,000 if the provider has fewer than 100,000,000 monthly active users.
  • — “(A) VIOLATIONS RELATING TO CYBERTIPLINE REPORTS AND MATERIAL PRESERVATION.—A provider shall be liable to the United States Government for a civil penalty in an amount of not less than $50,000 and not more than $250,000 if the provider knowingly— “(i) fails to submit a report under subsection (a)(1) within the time period required by that subsection; “(ii) fails to preserve material as required under subsection (h); or “(iii) submits a report under subsection (a)(1)
  • “(B) ANNUAL REPORT VIOLATIONS.—A provider shall be liable to the United States Government for a civil penalty in an amount of not less than $100,000 and not more than $1,000,000 if the provider knowingly— “(i) fails to submit an annual report as required under subsection (i); or “(ii) submits an annual report under subsection (i) that— “(I) contains a materially false, fraudulent, or misleading statement; or “(II) omits information described in subsection (i)(1) that is reasonably available.
  • — “(1) IN GENERAL.—Not later than March 31 of the second year beginning after the date of enactment of the STOP CSAM Act of 2024, and of each year thereafter, a provider that had more than 1,000,000 unique monthly visitors or users during each month of the preceding year and accrued revenue of more than $50,000,000 during the preceding year shall submit to the Attorney General and the Chair of the Federal Trade Commission a report, disaggregated by subsidiary, that provides the following information for the preceding year to the extent such information is applicable and reasonably available: “(A) CYBERTIPLINE DATA.— “(i) The total number of reports that the provider submitted under subsection (a)(1). “(ii) Which items of information described in subsection (b)(2) are routinely included in the reports submitted by the provider under subsection (a)(1).
  • “(II) AGENCY DISCRETION.—The Attorney General and Chair of the Federal Trade Commission— “(aa) shall consider a request made under subclause (I); and “(bb) may, in their discretion, redact from a report published under subparagraph (A) any information that is law enforcement sensitive or otherwise not suitable for public distribution, whether or not requested.”; (2) in section 2258B— (A) by striking subsection (a) and inserting the following: “(a) In general.— “(1) LIMITED LIABILITY.—Except as provided in subsection (b), a civil claim or criminal charge described in paragraph (2) may not be brought in any Federal or State court. “(2) COVERED CLAIMS AND CHARGES.—A civil claim or criminal charge referred to in paragraph (1) is a civil claim or criminal charge against a provider or domain name registrar, including any director, officer, employee, or agent of such provider or domain name registrar, that is directly attributable to— “(A) the performance of the reporting or preservation responsibilities of such provider or domain name registrar under this section, section 2258A, or section 2258C; “(B) transmitting, distributing, or mailing child pornography to any Federal, State, or local law enforcement agency, or giving such agency access to child pornography, in response to a search warrant, court order, or other legal process issued or obtained by such agency; or “(C) the use by the provider or domain name registrar of any material being preserved under section 2258A(h) by such provider or registrar for research conducted voluntarily and in good faith for the sole and exclusive purpose of— “(i) improving or facilitating reporting under this section, section 2258A, or section 2258C; or “(ii) stopping the online sexual exploitation of children.”; and (B) in subsection (b)— (i) in paragraph (1), by striking “; or” and inserting “or knowingly failed to comply with a requirement under section 2258A;”; (ii) in paragraph (2)(C)— (I) by striking “this section, sections” 
and inserting “this section or section”; and (II) by striking the period and inserting “; or”; and (iii) by adding at the end the following: “(3) for purposes of subsection (a)(2)(C), knowingly distributed or transmitted the material, or made the material available, except as required by law, to— “(A) any other entity; “(B) any person not employed by the provider or domain name registrar; or “(C) any person employed by the provider or domain name registrar who is not conducting any research described in that subsection.”; (3) in section 2258C— (A) in the section heading, by striking “the CyberTipline” and inserting “NCMEC”; (B) in subsection (a)— (i) in the subsection heading, by striking “Elements” and inserting “Provision to providers and nonprofit entities”; (ii) in paragraph (1)— (I) by striking “to a provider” and inserting the following: “or submission to the child victim identification program to— “(A) a provider”; (II) in subparagraph (A), as so designated— (aa) by inserting “use of the provider’s products or services to commit” after “stop the”; and (bb) by striking the period at the end and inserting “; or”; and (III) by adding at the end the following: “(B) a nonprofit entity for the sole and exclusive purpose of preventing and curtailing the online sexual exploitation of children.”; and (iii) in paragraph (2)— (I) in the heading, by striking “Inclusions” and inserting “Elements”; (II) by striking “unique identifiers” and inserting “similar technical identifiers”; and (III) by inserting “or submission to the child victim identification program” after “CyberTipline report”; (C) in subsection (b)— (i) in the heading, by inserting “or nonprofit entities” after “providers”; (ii) by striking “Any provider” and inserting the following: “(1) IN GENERAL.—Any provider or nonprofit entity”; (iii) in paragraph (1), as so designated— (I) by striking “receives” and inserting “obtains”; and (II) by inserting “or submission to the child victim identification program” 
after “CyberTipline report”; and (iv) by adding at the end the following: “(2) LIMITATION ON SHARING WITH OTHER ENTITIES.—A provider or nonprofit entity that obtains elements under subsection (a)(1) may not distribute those elements, or make those elements available, to any other entity, except for the sole and exclusive purpose of stopping the online sexual exploitation of children.”; (D) in subsection (c)— (i) by striking “subsections” and inserting “subsection”; (ii) by striking “providers receiving” and inserting “a provider to obtain”; (iii) by inserting “or submission to the child victim identification program” after “CyberTipline report”; and (iv) by striking “to use the elements to stop the online sexual exploitation of children”; and (E) in subsection (d), by inserting “or to the child victim identification program” after “CyberTipline”; (4) in section 2258E— (A) in paragraph (6), by striking “electronic communication service provider” and inserting “electronic communication service”; (B) in paragraph (7), by striking “and” at the end; (C) in paragraph (8), by striking the period at the end and inserting a semicolon; and (D) by adding at the end the following: “(9) the term ‘publicly available’, with respect to a visual depiction on a provider's service, means the visual depiction can be viewed by or is accessible to all users of the service, regardless of the steps, if any, a user must take to create an account or to gain access to the service in order to access or view the visual depiction; and “(10) the term ‘child victim identification program’ means the program described in section 404(b)(1)(K)(ii) of the Juvenile Justice and Delinquency Prevention Act of 1974 (34 U.S.C. 
11293(b)(1)(K)(ii)).”; (5) in section 2259B(a), by inserting “, any fine or penalty collected under section 2258A(e) or subparagraph (A) of section 6(g)(24) of the STOP CSAM Act of 2024 (except as provided in clauses (i) and (ii)(I) of subparagraph (B) of such section 6(g)(24)),” after “2259A”; and (6) by adding at the end the following: “§ 2260B. Liability for certain child exploitation offenses “(a) Offense.—It shall be unlawful for a provider of an interactive computer service, as that term is defined in section 230 of the Communications Act of 1934 (47 U.S.C. 230), that operates through the use of any facility or means of interstate or foreign commerce or in or affecting interstate or foreign commerce, through such service to— “(1) intentionally host or store child pornography or make child pornography available to any person; or “(2) knowingly promote or facilitate a violation of section 2251, 2251A, 2252, 2252A, or 2422(b). “(b) Penalty.—A provider of an interactive computer service that violates subsection (a)— “(1) subject to paragraph (2), shall be fined not more than $1,000,000; and “(2) if the offense involves a conscious or reckless risk of serious personal injury or an individual is harmed as a direct and proximate result of the violation, shall be fined not more than $5,000,000. “(c) Rule of construction.—Nothing in this section shall be construed to apply to any good faith action by a provider of an interactive computer service that is necessary to comply with a valid court order, subpoena, search warrant, statutory obligation, or preservation request from law enforcement.”. (b) Clerical amendment.—The table of sections for chapter 110 of title 18, United States Code, is amended by adding at the end the following: “2260B. Liability for certain child exploitation offenses.”. 
(c) Effective date for amendments to reporting requirements of providers.—The amendments made by subsection (a)(1) of this section shall take effect on the date that is 120 days after the date of enactment of this Act. ---

2260B. Liability for certain child exploitation offenses

Summary AI

The section makes it illegal for providers of online services to intentionally host, store, or share child pornography, or to knowingly encourage violations related to child exploitation laws. Providers violating these rules can be fined up to $1,000,000, or $5,000,000 if the violation risks serious injury or causes harm, but good faith actions to comply with legal obligations are not affected.

Money References

  • (a) Offense.—It shall be unlawful for a provider of an interactive computer service, as that term is defined in section 230 of the Communications Act of 1934 (47 U.S.C. 230), that operates through the use of any facility or means of interstate or foreign commerce or in or affecting interstate or foreign commerce, through such service to— (1) intentionally host or store child pornography or make child pornography available to any person; or (2) knowingly promote or facilitate a violation of section 2251, 2251A, 2252, 2252A, or 2422(b). (b) Penalty.—A provider of an interactive computer service that violates subsection (a)— (1) subject to paragraph (2), shall be fined not more than $1,000,000; and (2) if the offense involves a conscious or reckless risk of serious personal injury or an individual is harmed as a direct and proximate result of the violation, shall be fined not more than $5,000,000. (c) Rule of construction.—Nothing in this section shall be construed to apply to any good faith action by a provider of an interactive computer service that is necessary to comply with a valid court order, subpoena, search warrant, statutory obligation, or preservation request from law enforcement. ---

5. Expanding civil remedies for victims of online child sexual exploitation

Summary AI

The section expands civil remedies for victims of online child sexual exploitation by allowing them to sue individuals and online platforms that promote, aid, or host illegal activities involving minors. It specifies damages, allows no time limit for filing claims, and includes rules about encryption, venue, service of process, and defenses for interactive computer services that take timely actions.

Money References

  • “(b) Relief.—In a civil action brought by a person under subsection (a)— “(1) the person shall recover the actual damages the person sustains or liquidated damages in the amount of $300,000, and the cost of the action, including reasonable attorney fees and other litigation costs reasonably incurred; and “(2) the court may, in addition to any other relief available at law, award punitive damages and such other preliminary and equitable relief as the court determines to be appropriate, including a temporary restraining order, a preliminary injunction, or a permanent injunction ordering the defendant to cease the offending conduct.

2255A. Civil remedy against online platforms and app stores

Summary AI

Any person who is harmed by an online platform or app store that intentionally promotes or assists certain criminal activities, such as child pornography, can sue for damages in the United States District Court. The law allows victims to recover significant compensation, including actual or liquidated damages and attorney fees, and it removes any time limits for filing such lawsuits.

Money References

  • (1) PROMOTION OR AIDING AND ABETTING OF CERTAIN VIOLATIONS.—Any person who is a victim of the intentional or knowing promotion, or aiding and abetting, of a violation of section 1591 or 1594(c) (involving a minor), or section 2251, 2251A, 2252, 2252A, or 2422(b), where such promotion, or aiding and abetting, is by a provider of an interactive computer service or an app store, and who suffers personal injury as a result of such promotion or aiding and abetting, regardless of when the injury occurred, may bring a civil action in any appropriate United States District Court for relief set forth in subsection (b). (2) ACTIVITIES INVOLVING CHILD PORNOGRAPHY.—Any person who is a victim of the intentional or knowing hosting or storing of child pornography or making child pornography available to any person by a provider of an interactive computer service, and who suffers personal injury as a result of such hosting, storing, or making available, regardless of when the injury occurred, may bring a civil action in any appropriate United States District Court for relief set forth in subsection (b). (b) Relief.—In a civil action brought by a person under subsection (a)— (1) the person shall recover the actual damages the person sustains or liquidated damages in the amount of $300,000, and the cost of the action, including reasonable attorney fees and other litigation costs reasonably incurred; and (2) the court may, in addition to any other relief available at law, award punitive damages and such other preliminary and equitable relief as the court determines to be appropriate, including a temporary restraining order, a preliminary injunction, or a permanent injunction ordering the defendant to cease the offending conduct.

6. Reporting and removal of child sexual abuse material; establishment of Child Online Protection Board

Summary AI

The section of the bill establishes the Child Online Protection Board within the Federal Trade Commission to oversee and enforce procedures for the swift removal of child sexual abuse material (CSAM) from online platforms. The board will handle notifications from victims or their representatives about CSAM, require service providers to act on these notices quickly, and impose fines for non-compliance, while also safeguarding the privacy and legal rights of involved parties.

Money References

  • — (A) IN GENERAL.—If the Board grants a complainant’s petition filed under this section, notwithstanding any other law, the Board shall— (i) order the provider to immediately remove the child sexual abuse material, and to permanently delete all copies of the child sexual abuse material known to and under the control of the provider unless the Board orders the provider to preserve the child sexual abuse material; (ii) impose a fine of $50,000 per item of child sexual abuse material covered by the determination,
  • but if the Board finds that— (I) the provider removed the child sexual abuse material after the period set forth in subsection (c)(1)(A)(i), but before the complainant filed a petition, such fine shall be $25,000; (II) the provider has engaged in recidivist hosting for the first time with respect to the child sexual abuse material at issue, such fine shall be $100,000 per item of child sexual abuse material; or (III) the provider has engaged in recidivist hosting of the child sexual abuse material at issue 2 or more times, such fine shall be $200,000 per item of child sexual abuse material; (iii) order the provider to pay reasonable costs to the complainant; and (iv) refer any matters involving intentional or willful conduct by a provider with respect to child sexual abuse material, or recidivist hosting, to the Attorney General for prosecution under any applicable laws.
  • (B) COSTS FOR INCOMPLETE OR FRIVOLOUS NOTIFICATION AND HARASSMENT.—If, in granting or denying a petition as described in subparagraph (A), the Board finds that the notification contested in the petition could not be made complete under subsection (c)(2)(D), is frivolous, or is duplicative under subsection (c)(2)(C)(i), the Board may order the complainant to pay costs to the provider and any interested owner, which shall not exceed a total of $10,000, or, if the Board finds that the complainant filed the notification with an intent to harass the provider or any person, a total of $15,000.
  • (p) Funding.—There are authorized to be appropriated to pay the costs incurred by the Commission under this section, including the costs of establishing and maintaining the Board and its facilities, $40,000,000 for each year during the period that begins with the year in which this Act is enacted and ends with the year in which certain subsections of this section expire under subsection (q). (q) Sunset.—Except for subsections (a), (h), (k), (l), (m), (n), (o), and (r), this section shall expire 5 years after the date on which the Child Online Protection Board issues its first determination under this section. (r) Definitions.—In this section: (1) BOARD.—The term “Board” means the Child Online Protection Board established under subsection (d). (2) CHILD SEXUAL ABUSE MATERIAL.—The term “child sexual abuse material” has the meaning provided in section 2256(8) of title 18, United States Code.
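The per-item fine schedule quoted above can be summarized as a small lookup. This is a hedged sketch built only from the figures in the quoted text; it assumes recidivist hosting takes precedence over the late-removal discount, an ordering the quoted excerpt implies but does not state outright:

```python
def board_fine_per_item(removed_before_petition: bool, recidivist_count: int) -> int:
    """Per-item fine under the Section 6 schedule quoted above:
    $50,000 base; $25,000 if the provider removed the material after the
    deadline but before the petition was filed; $100,000 for first-time
    recidivist hosting; $200,000 for recidivist hosting 2 or more times."""
    if recidivist_count >= 2:
        return 200_000
    if recidivist_count == 1:
        return 100_000
    if removed_before_petition:
        return 25_000
    return 50_000
```

Because the fine applies per item of material, total exposure scales linearly with the number of items covered by a determination.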

7. Severability

Summary AI

If any part of this Act or its amendments is found to be unconstitutional, the rest of the Act and its amendments will remain in effect, as will their application to other persons and circumstances.