United States Senate Committee on the Judiciary

Testimony of Mary Graw Leary
Professor of Law
The Catholic University of America, Columbus School of Law

Introduction

Good morning Chairman Graham, Ranking Member Feinstein, and members of the Judiciary Committee. Thank you for inviting me to participate in this hearing to discuss legislation concerning “Holding the Tech Industry Accountable in the Fight Against Online Child Sexual Exploitation.” The growth in both the quantity and severity of child sexual abuse material (CSAM) over the last several years is indeed a significant problem worthy of Congressional response, and I appreciate Congress’s and this Committee’s hearings to address this form of exploitation.

I must be clear that I am not present specifically to testify either in favor of or against the EARN IT Act of 2020. I appear as a witness who has studied the issues of exploitation of children as an academic and who has worked on these issues directly as a former child abuse prosecutor at both the state and federal levels, as well as an attorney in the non-profit sector. I hope to offer some historical and practical context for the problems of online child sexual exploitation, as well as some issues this Committee may wish to consider moving forward to disrupt this national and global problem. Having held hearings addressing these issues, I know this Committee is well versed in them. I hope my thoughts can provide assistance in framing the many issues that flow from online CSAM, and that I can answer any questions to assist in this important process.

Child Sexual Abuse Material

The United States has lagged behind much of the rest of the world in continuing to label child sexual abuse material as “child pornography.”[1] The label “child pornography” has been problematic for several reasons. The primary flaw in such a label is that it sanitizes the content of the images, suggesting that they are images of adult pornography with younger-looking subjects. In fact, as the federal definition indicates, such content includes graphic depictions of numerous sexual acts involving children.[2] CSAM more accurately describes the content of the images as crime scene photographs.

Additionally, CSAM makes an important point essential to the criminal law: these images are harmful in and of themselves. A crime has been defined generally as a voluntary act that causes a social harm.[3] The many social harms of CSAM have been recognized both by Congress and the Supreme Court, and these include not only the harm in production that many children experience, but also the harm in the dissemination of the images and their perpetual existence. As the Supreme Court noted in its initial CSAM case, “the materials produced are a permanent record of the children's participation and the harm to the child is exacerbated by their circulation… [t]he distribution network for child pornography must be closed if the production of material which requires the sexual exploitation of children is to be effectively controlled.”[4]
The Court later expounded on this harm by noting how the resultant child “pornography’s continued existence causes the child victims continuing harm by haunting the children in years to come.”[5] That is to say, the Court has recognized that the existence of the images themselves is harmful:

    Pornography poses an even greater threat to the child victim than does sexual abuse or prostitution. Because the child’s actions are reduced to a recording, the pornography may haunt him in future years, long after the original misdeed took place. A child must go through life knowing that the recording is circulating within the mass distribution system for child pornography.[6]

This concept is essential to the contemporary understanding of CSAM and how society responds to it. The label CSAM more accurately describes the recognized social harm of this material. It also connotes that the very existence of the images harms the subject in the images and potentially those exposed to them.[7] This harm is uniquely pernicious. Unlike many other crimes, a component of the social harm of these crimes is a recurring one, which continues to victimize in perpetuity. In a recent CSAM case, the Court recognized this:

    The full extent of this victim's suffering is hard to grasp. Her abuser took away her childhood, her self-conception of her innocence, and her freedom from the kind of nightmares and memories that most others will never know. These crimes were compounded by the distribution of images of her abuser's horrific acts, which meant the wrongs inflicted upon her were in effect repeated; for she knew her humiliation and hurt were and would be renewed into the future as an ever-increasing number of wrongdoers witnessed the crimes committed against her.[8]

These harms extend not only to the children in the images but to those exposed to them as well.[9] Therefore, this change in the labeling of child pornography to CSAM is an important advancement in addressing this issue.

Mens Rea

In the criminal law, the mens rea term “knowingly” is generally accepted to be one of the highest levels of culpability. It is a mental state in which the actor is practically certain his criminal conduct will cause the social harm.[10] Victim survivors of CSAM have been afforded a private right of action to give them access to justice when a defendant “knowingly” violates certain statutes, including those related to child sexual exploitation.[11] Consequently, a victim survivor of such exploitation has a very difficult burden to prove that an actor, including an interactive computer service provider, acted knowingly as to his or her exploitation.
A standard of “recklessly” has been proposed. Recklessly remains a difficult standard to meet, as a victim survivor generally must establish that an actor consciously disregarded a substantial and unjustifiable risk that his conduct would cause such harm.[12] This is not an objective test; it requires a showing that the actor subjectively saw the risk and proceeded. Moreover, not any risk will suffice: it must be one that is substantial and unjustifiable. Furthermore, the risk is evaluated by considering the nature and purpose of the actor’s conduct and the circumstances known to the actor at the time. This conduct, in order to be actionable, must involve a gross deviation from the standard of conduct that a law-abiding person would observe.[13] As Congress discusses the appropriate mens rea, recklessly should not be understood as an easy burden to meet. Given the need for deterrence of practices that disregard known and identifiable risks to children, as well as the reality that the Internet is a forum where such images have exploded exponentially,[14] opening up this dialog is a positive step towards diminishing the proliferation of such images.

Unresolved Challenges

Congress recently addressed the immunity provided by Section 230 of the Communications Decency Act within the context of sex trafficking.[15] It amended Section 230, inter alia, to allow sex trafficking victim survivors to utilize their private right of action under 18 U.S.C. § 1595.[16] That amendment took place after two significant occurrences.

First, prior to the amendment, when victim survivors went into court to exercise their right to sue under federal law, they were often denied that right because online entities asserted they were immune from such suits, notwithstanding apparent Congressional intent otherwise. This effectively precluded a victim survivor from holding a party with Section 230 protection responsible for its potentially exploitive acts. The most developed example of this was Backpage.com (“Backpage”), which took that litigation position both when sued civilly and when states attempted to enforce state laws. Section 230 was used by Backpage to dismiss cases at the motion to dismiss stage, which precluded discovery and prevented plaintiffs from obtaining access to the internal documents needed to prove the level of culpability in which Backpage engaged.[17]

With that being the legal landscape, the second significant event took place. The Senate Permanent Subcommittee on Investigations stepped in to investigate Backpage. After a two-year contentious investigation, it issued a final report.
This report noted that Backpage repeatedly claimed that it was merely a host of content created by others and not involved in facilitating prostitution. However, the report found that “internal company documents conclusively show that Backpage’s public defense is a fiction.”[18]

With the amendment to Section 230 immunity for cases involving sex trafficking, Congress resolved that problem and at least afforded victim survivors some access to justice by allowing them the possibility of filing a valid complaint and proceeding past the motion to dismiss stage to have the opportunity to prove their claims when appropriate. As Congress contemplates the appropriate limits of Section 230 in the space of CSAM more generally, particular attention should be paid to this issue and its history.

In the current form of the EARN IT Act, an interactive computer service can claim a safe harbor from both state enforcement of its criminal laws and civil action through two fairly easily attainable paths. First, it obtains broad immunity if an officer certifies that the provider conducted a thorough review of the implementation and operation of the best practices and that he has a “reasonable basis to conclude that review does not reveal any material non-compliance” with the best practices.[19] This should be understood to be a relatively low standard, one that does not establish actual compliance with the best practices. This is particularly true given that the best practices will be generated by a commission whose membership largely comes from the tech community or its allies.[20] Such a certification, which notably does not certify that the entity is in fact in compliance with the practices, will provide the entity immunity from civil suit or state-level prosecution. This sweeping protection is available in exchange for a very qualified certification. While the provider does risk prosecution, such a risk is remote, as prosecution can only occur if it can be established that the provider knowingly submitted a false certification.[21] The variance between the mens rea necessary for certification (a reasonable basis to believe there is no material noncompliance) and that necessary for prosecution (knowingly submitting a false statement) is significant.

The second path to immunity is for a provider to establish that it has implemented “reasonable measures” to prevent its service from being used for the exploitation of minors.[22] This will require a trial court, possibly at the motion to dismiss stage, to determine whether the provider was reasonable in its measures. Such an approach risks an outcome similar to the aforementioned Section 230 sex trafficking caselaw, in which courts unfamiliar with the technology and relying on outdated precedent arguably expanded Section 230 immunity further than congressionally intended.[23] In such a regime, a victim survivor can attempt to hold a provider responsible for its actions and file a suit with facts and a good faith belief that the provider violated the law, was not in compliance with the practices, and in fact did not make a certification in good faith. Yet because Section 230 immunity is awarded to the provider at such a low standard, the risk exists that the case will be dismissed prior to discovery.
Thus, the legal landscape that led to Backpage successfully avoiding liability for several years could be repeated. That is to say, an actor could be engaged in activity for which liability is appropriate, but a victim survivor would be precluded from proving that case due to the sweeping scope of the immunity available to providers merely by arguing that they were reasonable or that they had no reason to believe a material non-compliance occurred.

Relevant Considerations

Should Congress implement a commission system to develop best practices, the current bill includes a number of considerations for the commission, most of which focus on the needs of providers.[24] Absent from this list are considerations of the safety of children or the protection of children as it relates to CSAM. While such matters are referenced in the Matters Addressed section,[25] they are critical to the Relevant Considerations as well. The Relevant Considerations appear to encompass the list of factors the commission can turn to in executing its duties. As such, the commission should be specifically charged to consider not only factors of interest to tech companies, but also, and with equal importance, factors of interest to children and to children’s protection from exploitation.

Conclusion

The problem of online CSAM is beyond dispute. It claims as its victims countless children who experience life-altering harm. That harm is exacerbated by the role of online entities that directly or indirectly contribute to this form of exploitation. Section 230 provides such entities with sweeping immunity protection. That protection was created with the goals of shielding children from objectionable content, protecting good Samaritans, and protecting a nascent internet. The internet is no longer nascent, and children are at risk of exploitation at unprecedented levels. At the center of congressional action in this space should be those at risk of exploitation, crime victim survivors, and their ability to access the laws created to protect them.

Notes

[1] See, e.g., Ethel Quayle, The Impact of Viewing on Offending Behavior, in Child Sexual Abuse and the Internet: Tackling the New Frontier 26 (Martin C. Calder ed., 2004) (“Many professionals working in the area have expressed the belief that such terminology [‘child pornography’] allows us to distance ourselves from the true nature of the material. A preferred term is abuse images.”).
[2] 18 U.S.C. § 2256(2), (5), (8).
[3] See, e.g., Albin Eser, The Principle of ‘Harm’ in the Concept of Crime: A Comparative Analysis of the Criminally Protected Legal Interests, 4 Duq. L. Rev. 345, 386 (1965).
[4] New York v. Ferber, 458 U.S. 747, 759 (1982).
[5] Osborne v. Ohio, 495 U.S. 103, 111 (1990).
[6] Ferber, 458 U.S. at 760 n.10.
[7] See Mary Graw Leary, The Third Dimension of Victimization, 13 Ohio St. J. Crim. L. 139, 151 (2016).
[8] Paroline v. United States, 572 U.S. 434, 441 (2014). Indeed, this fact was underscored by the victim survivor in the Paroline case, who describes the harm: “Every day of my life I live in constant fear that someone will see my pictures and recognize me and that I will be humiliated all over again. It hurts me to know someone is looking at them… I did not choose to be there, but now I am there forever in pictures that people are using to do sick things. I want it all erased. I want it all stopped. But I am powerless to stop it. My life and my feelings are worse now because the crime has never really stopped and will never really stop. It's like I am being abused over and over and over again.” Id. at 440-441.
[9] See, e.g., Ethel Quayle, The Impact of Viewing on Offending Behavior, in Child Sexual Abuse and the Internet: Tackling the New Frontier (Martin C. Calder ed., 2004) (noting that such images fuel a market for CSAM: “the viewing of child pornography in and of itself increases the likelihood that children will continue to be abused in the service of providing pictures for people to download”); Diana E. H. Russell & Natalie J. Purcell, Exposure to Pornography as a Cause of Child Sexual Victimization, in Handbook of Children, Culture, and Violence 59, 66 (Nancy E. Dowd, Dorothy G. Singer & Robin Fretwell Wilson eds., 2006) (discussing grooming).
[10] See M.P.C. § 2.02(2)(b).
[11] 18 U.S.C. § 2255.
[12] See M.P.C. § 2.02(2)(c).
[13] Id.
[14] See, e.g., Testimony of John Clark to the Senate Judiciary Committee, July 9, 2019; Elie Bursztein et al., Rethinking Detection of Child Sexual Abuse Images on the Internet (2019).
[15] 47 U.S.C. § 230.
[16] Allow States and Victims to Fight Online Sex Trafficking Act, Pub. L. No. 115-164 (2018); 47 U.S.C. § 230(e)(5).
[17] E.g., Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016).
[18] See generally Staff of S. Permanent Subcomm. on Investigations, 115th Cong., Backpage.com’s Knowing Facilitation of Online Sex Trafficking 1 (2017).
[19] Sec. 4(d).
[20] Sec. 3(c)(2).
[21] Sec. 5(a).
[22] Sec. 6(a).
[23] See, e.g., Mary Graw Leary, The Indecency and Injustice of the Communications Decency Act, 41 Harvard Journal of Law and Public Policy 554, 559-565 (2018).
[24] Sec. 4(a)(4).
[25] Sec. 4(a)(3).