18-397-cv

IN THE UNITED STATES COURT OF APPEALS FOR THE SECOND CIRCUIT

STUART FORCE, individually and as Administrator on behalf of the ESTATE OF TAYLOR FORCE, ROBBI FORCE, KRISTIN ANN FORCE, ABRAHAM RON FRAENKEL, individually and as Administrator on behalf of the ESTATE OF YAAKOV NAFTALI FRAENKEL, and as the natural and legal guardian of minor plaintiffs A.H.H.F., A.L.F., N.E.F., N.S.F., and S.R.F., A.H.H.F., A.L.F., N.E.F., N.S.F., S.R.F., RACHEL DEVORA SPRECHER FRAENKEL, individually and as Administrator on behalf of the ESTATE OF YAAKOV NAFTALI FRAENKEL and as the natural and legal guardian of minor plaintiffs A.H.H.F., A.L.F., N.E.F., N.S.F., and S.R.F., TZVI AMITAY FRAENKEL, SHMUEL ELIMELECH BRAUN, individually and as Administrator on behalf of the ESTATE OF CHAYA ZISSEL BRAUN, CHANA BRAUN, individually and as Administrator on behalf of the ESTATE OF CHAYA ZISSEL BRAUN, SHIMSHON SAM HALPERIN, SARA HALPERIN, MURRAY BRAUN, ESTHER BRAUN, MICAH LAKIN AVNI, individually, and as Joint Administrator on behalf of the ESTATE OF RICHARD LAKIN, MAYA LAKIN, individually, and as Joint Administrator on behalf of the ESTATE OF RICHARD LAKIN, MENACHEM MENDEL RIVKIN, individually, and as the natural and legal guardian of minor plaintiffs S.S.R., M.M.R., R.M.R., S.Z.R., BRACHA RIVKIN, individually, and as the natural and legal guardian of minor plaintiffs S.S.R., M.M.R., R.M.R., and S.Z.R., S.S.R., M.M.R., R.M.R., S.Z.R.,

Plaintiffs-Appellants,

v.

FACEBOOK, INC.,

Defendant-Appellee.

On Appeal from the United States District Court for the Eastern District of New York

BRIEF OF ELECTRONIC FRONTIER FOUNDATION AS AMICUS CURIAE IN SUPPORT OF DEFENDANT-APPELLEE

Sophia Cope
David Greene
ELECTRONIC FRONTIER FOUNDATION
815 Eddy Street
San Francisco, CA 94109
(415) 436-9333
sophia@eff.org
Attorneys for Amicus Curiae

DISCLOSURE OF CORPORATE AFFILIATIONS AND OTHER ENTITIES WITH A DIRECT FINANCIAL INTEREST IN LITIGATION

Pursuant to Rule 26.1 of the Federal Rules of Appellate Procedure, Amicus Curiae Electronic Frontier Foundation states that it does not have a parent corporation and that no publicly held corporation owns 10% or more of its stock.

TABLE OF CONTENTS

CORPORATE DISCLOSURE STATEMENT
TABLE OF CONTENTS
TABLE OF AUTHORITIES
STATEMENT OF INTEREST
INTRODUCTION
ARGUMENT
I. THE FIRST AMENDMENT PREVENTS IMPOSING LIABILITY ON FACEBOOK FOR HOSTING CONTENT ABOUT TERRORISM.
A. Speech Discussing or Promoting Terrorism Is Not Categorically Excluded From the First Amendment.
B. Facebook Cannot Be Held Liable for Incitement Based on the Knowing Publication of Hamas Content on Its Platform.
II. IMPOSING LIABILITY ON FACEBOOK WOULD VIOLATE INTERNET USERS' FIRST AMENDMENT RIGHTS TO RECEIVE AND GATHER INFORMATION ABOUT TERRORISM.
III. SECTION 230 IS ESSENTIAL TO INTERNET USERS' FREEDOM OF EXPRESSION ONLINE.
A. Congress Passed Section 230 to Encourage the Development of Open Platforms and Enable Robust Online Speech.
B. Section 230 Applies to All Federal Civil Claims, Including Those That Are Attendant to Federal Criminal Statutes.
IV. FAILURE TO APPLY SECTION 230 AND THE FIRST AMENDMENT WOULD HARM INTERNET USERS' FREE SPEECH AND PLATFORMS' WILLINGNESS TO HOST THAT SPEECH.
CONCLUSION
CERTIFICATE OF COMPLIANCE
CERTIFICATE OF SERVICE

TABLE OF AUTHORITIES

Cases

Application of Dow Jones & Co., Inc., 842 F.2d 603 (2d Cir. 1988)
Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003)
Bd. of Educ. v. Pico, 457 U.S. 853 (1982)
Brandenburg v. Ohio, 395 U.S. 444 (1969)
Brown v. Entertainment Merchants Ass'n, 564 U.S. 786 (2011)
Cohen v. Facebook, Inc., 252 F. Supp. 3d 140 (E.D.N.Y. 2017)
Crosby v. Twitter, Inc., 303 F. Supp. 3d 564 (E.D. Mich. 2018)
Fields v. Twitter, Inc., 200 F. Supp. 3d 964 (N.D. Cal. 2016)
Fields v. Twitter, Inc., 881 F.3d 739 (9th Cir. 2018)
Gonzalez et al. v. Google, Inc., Case No. 16-cv-03282 (N.D. Cal. Aug. 15, 2018)
Herceg v. Hustler Magazine, Inc., 814 F.2d 1017 (5th Cir. 1987)
James v. Meow Media, Inc., 300 F.3d 683 (6th Cir. 2002)
Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016)
Lamont v. Postmaster Gen'l, 381 U.S. 301 (1965)
Packingham v. North Carolina, 137 S. Ct. 1730 (2017)
Pennie v. Twitter, Inc., 281 F. Supp. 3d 874 (N.D. Cal. 2017)
Reno v. ACLU, 521 U.S. 844 (1997)
Rice v. Paladin Enters., 128 F.3d 233 (4th Cir. 1997)
Richmond Newspapers v. Virginia, 448 U.S. 555 (1980)
Stanley v. Georgia, 394 U.S. 557 (1969)
U.S. v. Stevens, 559 U.S. 460 (2010)
Universal Studios, Inc. v. Corley, 273 F.3d 429 (2d Cir. 2001)
Watts v. United States, 394 U.S. 705 (1969)
Wicks v. Miss. State Emp't Servs., 41 F.3d 991 (5th Cir. 1995)
Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997)

Statutes

18 U.S.C. § 1595
47 U.S.C. § 230 (passim)

Other Authorities

Anupam Chander & Uyên P. Lê, Free Speech, 100 Iowa L. Rev. 501 (2015)
Bree Brouwer, YouTube Now Gets Over 400 Hours of Content Uploaded Every Minute, Tubefilter (July 26, 2015)
Diana Bradley, How the Grindr ecosystem evolved into more for its 4 million users, PR Week (Feb. 26, 2018)
Facebook, Violence and Criminal Behavior, Community Standards (2018)
Hayley Tsukayama, Facebook Turns to Artificial Intelligence to Fight Hate and Misinformation in Myanmar, Washington Post (Aug. 15, 2018)
Heather J. Williams & Ilana Blum, Defining Second Generation Open Source Intelligence (OSINT) for the Defense Enterprise, RAND Corporation (2018)
Internet Live Stats, Total Number of Websites
Ken Yeung, Medium grows 140% to 60 million monthly visitors, VentureBeat (Dec. 14, 2016)
Nathan McDonald, Digital in 2018: World's Internet Users Pass the 4 Billion Mark, We Are Social (Jan. 2018)
Philip N. Howard, et al., Opening Closed Regimes: What Was the Role of Social Media During the Arab Spring? (Sep. 1, 2011)
Protect the Protest
Public Participation Project
Shopify Partners, Platform Features
SmallBusiness.com, What Percentage of Small Businesses Have Websites? (2017)
Statista, Number of monthly active Facebook users worldwide as of 2nd quarter 2018 (in millions)
Sydney Li & Jamie Williams, Despite What Zuckerberg's Testimony May Imply, AI Cannot Save Us, Electronic Frontier Foundation "Deeplinks" (Apr. 11, 2018)
WhoIsHostingThis.com, Shocking WordPress Stats 2017

STATEMENT OF INTEREST1

Amicus Curiae Electronic Frontier Foundation is a member-supported, nonprofit civil liberties organization that works to protect free speech and privacy in the digital world. Founded in 1990, EFF has more than 37,000 members. EFF represents the interests of technology users in both court cases and broader policy debates surrounding the application of law to technology. EFF has litigated or otherwise participated in a broad range of Internet free expression and intermediary liability cases.

1 Pursuant to Federal Rule of Appellate Procedure 29(c), EFF certifies that no person or entity, other than amicus, its members, or its counsel, made a monetary contribution to the preparation or submission of this brief or authored this brief in whole or in part. Pursuant to Federal Rule of Appellate Procedure 29(a)(2), amicus has filed a motion for leave to file this brief.

INTRODUCTION

Plaintiffs-Appellants' attempt to hold Facebook liable for third-party speech jeopardizes online platforms' ability to offer Internet users robust and open forums for speech. Holding online platforms liable for what their users post is also contrary to both Section 230 (47 U.S.C. § 230)2 and the First Amendment.

2 The statute was passed as Section 509 of the Communications Decency Act, part of the Telecommunications Act of 1996, Pub. L. 104-104. It is sometimes colloquially referred to as "CDA 230" or "Section 230 of the Communications Decency Act." Amicus refers to it as Section 230.

The district court was correct to dismiss Plaintiffs-Appellants' amended complaint under Section 230. Cohen v. Facebook, Inc., 252 F. Supp. 3d 140, 155-61 (E.D.N.Y. 2017). Section 230 plainly bars Plaintiffs-Appellants' claims, and this Court should affirm on that basis. Given the proliferation of lawsuits that seek to impose liability on online platforms under the Anti-Terrorism Act ("ATA") for hosting user-generated content,3 it is important that this Court expressly hold that Section 230 bars the claims here. Amicus EFF writes separately to emphasize the important policy goals underlying Section 230 and why it is vitally important that Section 230 continue to apply to those civil claims that are attendant to federal criminal statutes.

3 Fields v. Twitter, Inc., 200 F. Supp. 3d 964, 972 (N.D. Cal. 2016), aff'd on other grounds, Fields v. Twitter, Inc., 881 F.3d 739 (9th Cir. 2018); Crosby v. Twitter, Inc., 303 F. Supp. 3d 564 (E.D. Mich. 2018), on appeal, Case No. 18-1426 (6th Cir. 2018); Pennie v. Twitter, Inc., 281 F. Supp. 3d 874 (N.D. Cal. 2017), on appeal, Case No. 17-17536 (9th Cir. 2017); Gonzalez et al. v. Google, Inc., Case No. 16-cv-03282 (N.D. Cal. Aug. 15, 2018) (order granting motion to dismiss), on appeal, Case No. 18-16700 (9th Cir. 2018).

Additionally, even if Section 230 did not bar Plaintiffs-Appellants' claims, the First Amendment would. First, online platforms have a First Amendment right to publish speech discussing or promoting terrorism. Such content falls outside the First Amendment's protection only if it fits within narrow categories of unprotected speech, such as true threats or direct incitement to violence. Imposing civil liability on Facebook for the publication of user-generated content about terrorism would sweep up large swaths of protected speech—or this Court would be required to fashion an entirely new categorical First Amendment exception. This proposition is ill-advised and contrary to well-settled law.
Second, imposing liability on online platforms for the publication of content concerning terrorism would violate the First Amendment rights of Internet users. Internet users have a right to receive and gather information on a variety of topics, including speech about terrorism. By imposing liability on Facebook for hosting such speech, Plaintiffs-Appellants would force online platforms to restrict access to terrorist content or to remove it entirely, limiting the range of content and viewpoints available to Internet users.

Finally, allowing Plaintiffs-Appellants to undermine Section 230's and the First Amendment's protections for online platforms would mean undermining the free and open Internet. Internet intermediaries are an essential element of the modern Internet. They provide the very "vast democratic forums of the Internet" that, in the words of the Supreme Court, make the Internet one of the "most important places . . . for the exchange of views." Packingham v. North Carolina, 137 S. Ct. 1730, 1735 (2017). Users rely on intermediaries to express themselves and to communicate with others online. Online platforms like Facebook give anyone the ability to reach an audience or engage with others—without having to learn how to code or expend significant financial resources—on all manner of topics and for all manner of reasons. If Plaintiffs-Appellants are successful in their efforts to impose liability on Facebook, Internet intermediaries would likely take steps to restrict the openness of their platforms, such as limiting who can use their services, screening content before users even post it, and removing even more speech than they already do. And those platforms that cannot afford to take these steps to avoid liability would simply cease to exist. This outcome would blunt the Internet's ability to be a powerful, diverse forum for political and social discourse—including about controversial topics such as terrorism.

ARGUMENT

I. THE FIRST AMENDMENT PREVENTS IMPOSING LIABILITY ON FACEBOOK FOR HOSTING CONTENT ABOUT TERRORISM.

A. Speech Discussing or Promoting Terrorism Is Not Categorically Excluded From the First Amendment.

Imposing liability on Facebook for hosting content discussing or promoting terrorism would violate its First Amendment rights by punishing Facebook for disseminating speech that is fully protected by the First Amendment. An unstated premise in Plaintiffs-Appellants' attempt to hold Facebook liable for hosting user-generated content discussing or promoting terrorism is that the speech itself is unlawful, that it enjoys no First Amendment protection, and that Facebook can be held liable for publishing it. See AOB at 49 (describing Hamas' use of Facebook as "not merely bad; it is criminal").

But there are only a handful of historically unprotected categories of speech, and terrorist speech is not one of them. Although certain types of terrorist speech may be unprotected, such as true threats, see Watts v. United States, 394 U.S. 705 (1969), and speech directly inciting imminent lawless acts, see Brandenburg v. Ohio, 395 U.S. 444 (1969), the vast majority of speech about terrorism is fully protected by the First Amendment.
Further, the Supreme Court has been loath to expand the list of unprotected categories of speech, even in cases involving toxic and extremely offensive speech. In U.S. v. Stevens, 559 U.S. 460, 469 (2010), for example, the government sought to create a new category of unprotected speech that it could punish: graphic and disturbing depictions of animal cruelty. The government proposed a balancing test for determining whether a category of speech falls outside the First Amendment, weighing "the value of the speech against its societal costs." Id. at 470. The Supreme Court rejected the government's proposal as both "startling and dangerous." Id. The First Amendment does not permit the creation of new categories of unprotected speech, the Court held, because the "guarantee of free speech does not extend only to categories of speech that survive an ad hoc balancing of relative social costs and benefits." Id.

The Court reaffirmed this principle a year later when it struck down a California law that banned the sale of violent video games to minors—a law that would have created a de facto new category of unprotected speech by grafting portions of the definition of obscenity onto depictions of extremely violent video game content. Brown v. Entertainment Merchants Ass'n, 564 U.S. 786, 792-93 (2011).

There is no historical basis for expanding the list of speech unprotected by the First Amendment to include speech discussing or promoting terrorism. Further, the First Amendment does not permit ad hoc judgments regarding the social value of speech to determine whether that speech is protected. Plaintiffs-Appellants cannot impose categorical liability on Facebook for publishing terrorist content without punishing platforms for disseminating speech fully protected by the First Amendment. And the creation of a new category of unprotected speech is unwarranted and inappropriate.

B. Facebook Cannot Be Held Liable for Incitement Based on the Knowing Publication of Hamas Content on Its Platform.

Some online speech promoting terrorism may constitute speech directly inciting violence and thus fall outside the First Amendment; but even then, Facebook could not be held liable for merely publishing such speech. The First Amendment generally bars claims against publishers for inciting harmful conduct via the knowing publication of motivational or instructional speech. In Herceg v. Hustler Magazine, Inc., 814 F.2d 1017, 1021-22 (5th Cir. 1987), for example, the Fifth Circuit overturned a jury verdict finding Hustler liable for a teen's death resulting from its publication of an article about autoerotic asphyxiation. The court held that liability could not be imposed on Hustler "without impermissibly infringing upon freedom of speech" because there was no evidence that the publisher intended, advocated for, or directly incited the teen to attempt the act. Id. (citing Brandenburg, 395 U.S. 444). See also Universal Studios, Inc. v. Corley, 273 F.3d 429, 447 n.18 (2d Cir. 2001) (noting that even the publication of instructions on how to commit illegal acts is subject to First Amendment scrutiny); James v. Meow Media, Inc., 300 F.3d 683, 695, 699 (6th Cir. 2002) (refusing to hold video game, movie, and Internet companies liable for the murder of students by a fellow classmate, stating that "attaching tort liability to the effect that such ideas have on a criminal actor would raise significant constitutional problems under the First Amendment").
Courts have held that publishers can be held liable for content that results in death or bodily injury only where (i) the publisher has the specific intent to encourage the commission of violent acts, and (ii) the publisher provides specific instructions to commit the acts, rather than abstract advocacy. See Rice v. Paladin Enters., 128 F.3d 233, 242 (4th Cir. 1997).

This narrow class of cases—in which the First Amendment will not bar publisher liability for content that resulted in death or bodily injury—is not applicable here. Plaintiffs-Appellants cannot show that Facebook possessed the specific intent, or provided the specific instruction, required to hold the company liable for the violence Hamas promoted online and ultimately perpetrated against Plaintiffs-Appellants in this case. Even if Facebook had actual knowledge of content that directly incited terrorism or other criminal acts, it would still lack the specific intent required to vitiate the First Amendment protection recognized in both Herceg and Rice. The allegations of Facebook's wrongful behavior are essentially that bad actors—in this case, alleged members of Hamas—used the platform's tools in much the same way as any other Facebook user would: to post content and to connect with other users. See AOB 48-49. There is no evidence that Facebook made any efforts to direct, incite, or encourage Hamas' violent actions beyond providing an open platform to all users. In fact, Facebook has policies that prohibit speech that encourages violence. See Facebook Community Standards – Violence and Criminal Behavior.4

4 Facebook, Violence and Criminal Behavior, Community Standards (2018), https://www.facebook.com/communitystandards/violence_criminal_behavior.

II. IMPOSING LIABILITY ON FACEBOOK WOULD VIOLATE INTERNET USERS' FIRST AMENDMENT RIGHTS TO RECEIVE AND GATHER INFORMATION ABOUT TERRORISM.

The First Amendment protects the right of platform users to receive and gather information, including offensive rhetoric advocating terrorism that neither constitutes a true threat nor directly incites violence. The Supreme Court has held that "the right to receive ideas is a necessary predicate to the recipient's meaningful exercise of his own rights of speech, press, and political freedom." Bd. of Educ. v. Pico, 457 U.S. 853, 867 (1982) (plurality). This Court has recognized that "the First Amendment unwaveringly protects the right to receive information and ideas." Application of Dow Jones & Co., Inc., 842 F.2d 603, 607 (2d Cir. 1988). Similarly, the First Amendment protects the right to gather information. See Richmond Newspapers v. Virginia, 448 U.S. 555, 576 (1980) (plurality) (protecting the right to gather information in courtrooms, because "free speech carries with it some freedom to listen").

The right to receive information does not turn on the underlying merit of the ideas communicated. Quite the opposite: it ensures that people have access to different, controversial ideas and views. As the Supreme Court has recognized, "the right to receive information and ideas, regardless of their social worth . . . is fundamental to our free society," Stanley v. Georgia, 394 U.S. 557, 564 (1969) (protecting the right to possess obscene materials at home), because it is essential to fostering open debate. Indeed, "[i]t would be a barren marketplace of ideas that had only sellers and no buyers." Lamont v. Postmaster Gen'l, 381 U.S. 301, 308 (1965) (Brennan, J., concurring) (protecting the "right to receive" foreign publications).
Holding platforms liable for publishing speech on certain topics thus interferes with users' rights. Platforms such as Facebook would likely react to this new legal liability by simply not publishing any speech about terrorism—not merely speech expressing true threats or directly inciting imminent terrorist attacks. But users have the right to receive speech even on unpopular and abhorrent topics such as terrorism, and even from unpopular speakers who advocate terrorist ideology.

Depriving users of their right to receive and gather information discussing terrorism would do far more than simply limit which content is available online; it would stunt people's ability to be informed about the world and to form opinions. Depriving platform users of their ability to decide for themselves whether to receive certain kinds of speech would short-circuit the marketplace of ideas in a way that runs directly counter to the First Amendment. As the Supreme Court recognized in Pico, the ability to access information is antecedent to engaging in speech protected by the First Amendment. See 457 U.S. at 867. The interplay between receiving information and engaging in speech exists for terrorism just as it does for any other subject matter: journalists need to access and gather information about terrorism in order to report on it; academic researchers need the information to inform our social and political beliefs; and government officials and the general public need the information to engage in political and social debates about terrorism and related foreign and domestic policy.5

5 See, e.g., Heather J. Williams & Ilana Blum, Defining Second Generation Open Source Intelligence (OSINT) for the Defense Enterprise, RAND Corporation, at 12 (2018) (discussing social media as open sources of intelligence information), https://www.rand.org/pubs/research_reports/RR1964.html.

III. SECTION 230 IS ESSENTIAL TO INTERNET USERS' FREEDOM OF EXPRESSION ONLINE.

While Plaintiffs-Appellants have attempted to plead around Section 230, the district court rightly concluded: "The essence of the Force complaint is not that Plaintiffs were harmed by Hamas' ability to obtain Facebook accounts[, but] . . . that Facebook contributed to their harm by allowing Hamas to use its platform to post particular offensive content that incited or encouraged those attacks." Cohen, 252 F. Supp. 3d at 158 (emphasis in original). Amicus endorses the supporting arguments in Facebook's brief. Brief for Defendant-Appellee Facebook at 13-25. See also Fields, 200 F. Supp. 3d at 972 (holding that online platforms' decisions about what third-party content is posted to their sites are publishing activity protected by Section 230), aff'd on other grounds, 881 F.3d 739 (9th Cir. 2018). Keeping in mind the policy rationales underlying Section 230, amicus urges this Court to affirm on that ground, and to hold that Section 230 applies to civil claims that are attendant to federal criminal statutes.

A. Congress Passed Section 230 to Encourage the Development of Open Platforms and Enable Robust Online Speech.

Plaintiffs-Appellants' claims threaten not just Facebook, but the ability of all Internet intermediaries to host platforms for diverse online speech. Congress recognized in passing Section 230 that the Internet depends upon intermediaries, which serve "as a vehicle for the speech of others."6 Intermediaries create democratic forums in which anyone can become "a pamphleteer" or "a town crier with a voice that resonates farther than it could from any soapbox." Reno v. ACLU, 521 U.S. 844, 870 (1997).
They give a single person—with minimal resources and technical expertise, anywhere in the world—the ability to communicate with others across the globe. Online platforms host a wide range of diverse speech on behalf of their users, ensuring that all views—especially controversial ones—can be presented and received by platform users. Intermediary platforms—such as social media websites, blogging platforms, video-sharing services, and web-hosting companies—are the essential architecture of today's Internet. Indeed, they are often the primary way in which the majority of people engage with one another online. Thus, Section 230 benefits the Internet as a whole.

6 Anupam Chander & Uyên P. Lê, Free Speech, 100 Iowa L. Rev. 501, 514 (2015).

Congress clearly understood the essential function online platforms play in our digital lives. In passing Section 230, Congress recognized the Internet's power to sustain and promote robust individual speech, a value rooted in the First Amendment. Congress sought to further encourage the already robust free speech occurring online in the mid-1990s, and to speed the development of online platforms, by providing broad immunity to service providers that host user-generated content. See 47 U.S.C. § 230(b)(2), (3) ("It is the policy of the United States . . . to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation" and "to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services."). See also Batzel v. Smith, 333 F.3d 1018, 1027 (9th Cir. 2003) ("Congress wanted to encourage the unfettered and unregulated development of free speech on the Internet, and to promote the development of e-commerce.").

Congress recognized that if our legal system failed to robustly protect intermediaries, it would fail to protect free speech online. Zeran v. AOL, 129 F.3d 327, 330 (4th Cir. 1997). Given the volume of information being published online, it would be impossible for most intermediaries to review every single bit of information published through their platforms prior to publication. "Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted." Id. at 331. The resulting Internet would include a far more limited number of forums if intermediaries were forced to second-guess decisions about managing and presenting content authored by third parties.

By creating Section 230's platform immunity, Congress made the intentional policy choice that individuals harmed by speech online need to seek relief from the speakers themselves, rather than from the platforms those speakers used. Id. at 330-31. By limiting liability in this way, Congress decided that creating a forum for unrestrained and robust communication was of utmost importance, even if it might result—depending on how platforms moderate their sites—in the presence of harmful content online. Thus, while Congress certainly did not intend to promote speech from terrorist organizations, Congress did decide that promoting robust online dialogue was more important than ridding the Internet of all harmful speech.
Placing liability on Facebook in this case not only conflicts with the plain text and purpose of Section 230; it also would severely undercut the essential role online platforms play in fostering our modern political and social discourse.

B. Section 230 Applies to All Federal Civil Claims, Including Those That Are Attendant to Federal Criminal Statutes.

Holding that Section 230's exception for federal criminal prosecutions also applies to civil ATA claims—meaning that Section 230 would not bar civil claims that are attendant to federal criminal statutes—would swallow Section 230's broad protections for online platforms and frustrate Congress' purpose in enacting the law. Plaintiffs mischaracterize Section 230(e)(1), which states that "[n]othing in this section shall be construed to impair the enforcement of . . . any other Federal criminal statute," to argue that Section 230 cannot provide online platforms immunity for a civil violation of the ATA. AOB 52-53. With the exception of recently enacted legislation that is not relevant here, however, Section 230(e)(1)'s limited exception is reserved solely for prosecutions brought by the government itself. See Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 23-24 (1st Cir. 2016) (holding that Section 230(e)(1)'s language "quite clearly indicates that the provision is limited to criminal prosecutions").7

7 Congress clearly agrees with this position. Congress recently amended Section 230(e) to include a new exception to the immunity that permits civil actions against online platforms under the civil provisions of the criminal anti-trafficking law, 18 U.S.C. § 1595. See 47 U.S.C. § 230(e)(4)(A). The fact that Congress added a specific Section 230 exception enabling civil litigants to recover under the civil anti-trafficking statute demonstrates that Section 230(e)(1)'s criminal prosecution exception is indeed limited to federal prosecutions.

Extending Section 230's federal criminal law exception beyond prosecutions to include civil actions authorized by the ATA—or by any other criminal statute with a civil recovery corollary—would have very serious practical consequences for freedom of speech and innovation online. Prosecutorial discretion and the higher standard of proof together mitigate the chilling effect created by the lack of immunity from federal criminal prosecutions under Section 230(e)(1). Limiting Section 230's federal criminal law exception to actual prosecutions makes practical sense given that government prosecutors generally exercise their discretion to bring criminal charges with care, because so much is at stake in criminal cases—namely, the defendant's life or liberty. Additionally, the standard of proof in criminal cases is much higher than in civil cases. By contrast, private plaintiffs typically do not exercise such judiciousness in deciding whether to bring lawsuits where money damages are the remedy and the standard of proof is lower. It is very easy to bring a civil case—and sometimes private plaintiffs do not even intend to see their case through to the end; rather, they simply want to scare the defendant into silence.8

8 Such cases are called "Strategic Lawsuits Against Public Participation" (SLAPPs). See Protect the Protest, https://www.protecttheprotest.org/; Public Participation Project, https://anti-slapp.org/.

Exposing Internet intermediaries to the decisions of a broad array of civil litigants whose claims are authorized by criminal statutes would disincentivize online innovation and ultimately diminish the free speech and free exchange of ideas and information that Internet platforms facilitate.
This unfortunate result would be contrary to Congress' recognition that "[t]he Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity."9

9 47 U.S.C. § 230(a)(3).

IV. FAILURE TO APPLY SECTION 230 AND THE FIRST AMENDMENT WOULD HARM INTERNET USERS' FREE SPEECH AND PLATFORMS' WILLINGNESS TO HOST THAT SPEECH.

Placing liability on Facebook in this case would severely undercut the essential role all online platforms play in fostering our modern political and social discourse. It would fundamentally alter the relationship between platforms and their users by incentivizing platforms to dramatically curtail what people can discuss online.

Instead of offering open forums for participation by users around the world—a quintessential feature of Internet intermediaries—online platforms, saddled with potential or explicit liability based on the content their users post, would likely screen user-generated content to avoid the risk that their users might post offensive content that would create liability for the companies. And companies would take down any and all content that drew any complaint—especially complaints from those with the resources to fund litigation—just as Congress feared when it passed Section 230. These overreactions could include not only removing content after the fact, but also reviewing all content users intend to post—thereby preventing any potentially controversial comments or criticism from being published in the first place—and removing accounts whose content drew objections, potentially reaching far beyond the content Hamas posted at issue here.

The ability—both logistical and financial—of modern platforms to conduct a fair review is dubious given the incredible volume of content generated by platform users. When Congress passed Section 230 in 1996, about 40 million people used the Internet worldwide, and commercial online services in the United States had almost 12 million individual subscribers. See Reno v. ACLU, 521 U.S. 844, 850-51 (1997). Today's Internet hosts third-party contributions from a broad array of voices, facilitating the speech of billions of people. As of January 2018, roughly 4 billion people were online—53 percent of the global population.10 And the Web continues to grow at an accelerating pace. At the end of 2016, there were 1.046 billion websites on the Web; only a year later, at the end of 2017, that number had reached 1.767 billion, and in 2018 we are nearing 2 billion.11 Users of the video platform YouTube today upload roughly 400 hours of video to the website every minute.12

10 See Nathan McDonald, Digital in 2018: World's Internet Users Pass the 4 Billion Mark, We Are Social (Jan. 2018), https://wearesocial.com/us/blog/2018/01/global-digital-report-2018.

11 Internet Live Stats, Total Number of Websites, http://www.internetlivestats.com/total-number-of-websites/.

12 Bree Brouwer, YouTube Now Gets Over 400 Hours of Content Uploaded Every Minute, Tubefilter (July 26, 2015), http://www.tubefilter.com/2015/07/26/youtube-400-hours-content-every-minute/.
WordPress—a free and open-source content management system, available in over 50 languages, that allows users around the globe to create free websites or blogs—as of 2017 had been used to create 75,000,000 websites.13 Medium—an online publishing platform that allows amateur and professional writers alike to publish their work—had 60 million unique monthly visitors in 2016, only four years after the company launched.14 And the Grindr dating app is today approaching 4 million daily active users.15

13 WhoIsHostingThis.com, Shocking WordPress Stats 2017, https://www.whoishostingthis.com/compare/wordpress/stats/.

14 Ken Yeung, Medium grows 140% to 60 million monthly visitors, VentureBeat (Dec. 14, 2016), https://venturebeat.com/2016/12/14/medium-grows-140-to-60-million-monthly-visitors/.

15 See Diana Bradley, How the Grindr ecosystem evolved into more for its 4 million users, PR Week (Feb. 26, 2018), https://www.prweek.com/article/1457079/grindr-ecosystem-evolved-its-4-million-users.

Small businesses also have an increasing online presence. In 2017, over 70 percent of small businesses in the United States had a website, and 79 percent of those small businesses had a mobile-friendly website.16 And with platforms like Shopify—an e-commerce platform that helps customers create websites for their online stores and currently powers over 600,000 online merchants17—it is easier than ever for small businesses to get online.

16 SmallBusiness.com, What Percentage of Small Businesses Have Websites? (2017), https://smallbusiness.com/digital-marketing/how-many-small-businesses-have-websites/.

17 Shopify Partners, Platform Features, https://www.shopify.com/partners/platform-features.

With such a staggering number of Internet users, the content-screening regime that a ruling in favor of Plaintiffs-Appellants would require would be costly. To keep the cost of human reviewers down, larger, more sophisticated platforms would likely turn to algorithms or artificial intelligence to flag and block controversial comments or criticism. Defendant-Appellee Facebook, which now has 2.23 billion users,18 is already using algorithms and artificial intelligence to moderate content on its platform.19 This is despite the fact that such systems are notoriously terrible at understanding context and cultural differences, are capable of being gamed by those looking to censor speech, and therefore are more likely to result in censorship of journalists, human rights activists, artists, or any other creators of lawful content.20 Use of these automated systems would only increase, and censorship would increase along with it.

18 Statista, Number of monthly active Facebook users worldwide as of 2nd quarter 2018 (in millions), https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/.

19 See, e.g., Hayley Tsukayama, Facebook Turns to Artificial Intelligence to Fight Hate and Misinformation in Myanmar, Washington Post (Aug. 15, 2018), https://www.washingtonpost.com/technology/2018/08/16/facebook-turns-artificial-intelligence-fight-hate-misinformation-myanmar/.

20 Sydney Li & Jamie Williams, Despite What Zuckerberg's Testimony May Imply, AI Cannot Save Us, Electronic Frontier Foundation "Deeplinks" (Apr. 11, 2018), https://www.eff.org/deeplinks/2018/04/despite-what-zuckerbergs-testimony-may-imply-ai-cannot-save-us.

Meanwhile, smaller platforms without the substantial resources required to manage potential liability in this way—or to weather the significant litigation costs they would face if they choose not to—would be forced to shut down.
And new companies would be deterred from even trying to offer open platforms for speech. Indeed, even those platforms that would prevail on the merits of a lawsuit will incur significant legal fees. Cf. Wicks v. Miss. State Emp't Servs., 41 F.3d 991, 995 n.16 (5th Cir. 1995) ("[I]mmunity means more than just immunity from liability; it means immunity from the burdens of defending a suit[.]").

If platforms are required to take some or all of the measures described above, the result would be sanitized, milquetoast online platforms. Platforms would be incentivized to engage in self-censorship and host only non-controversial conversations, while actively discouraging any discussions that might draw objection, out of concern that the content may one day form the basis of a lawsuit against the company. The end result: the less controversial the content, the more likely it will remain on the platform.

Increased platform censorship would end the essential role intermediaries play in fostering social and political discourse on the Internet. Indeed, many individuals around the world use U.S.-based services to access and distribute all manner of content, from organizing in opposition to oppressive regimes21 to sharing pictures of children with grandparents. Such robust, global online participation would never have been achieved without the immunity provided by Section 230, while the First Amendment continues to provide meaningful protections against publisher liability for incitement to violence. Granting would-be plaintiffs a clear avenue to circumvent Section 230's protections, or those of the First Amendment, would undermine this global phenomenon. Because platforms would be unwilling to take a chance on provocative or unpopular speech, the online marketplace of ideas would be artificially stunted. This is precisely what Congress sought to avoid in passing Section 230, and what the First Amendment should continue to protect against.

21 See, e.g., Philip N. Howard, et al., Opening Closed Regimes: What Was the Role of Social Media During the Arab Spring? (Sep. 1, 2011), http://philhoward.org/opening-closed-regimes-what-was-the-role-of-social-media-during-the-arab-spring/.

CONCLUSION

For the reasons outlined here, this Court should affirm the district court's dismissal based on Section 230, and should also hold that the First Amendment immunizes Facebook from Plaintiffs-Appellants' claims in this case.

Dated: October 10, 2018

By: /s/ Sophia Cope
Sophia Cope
David Greene
ELECTRONIC FRONTIER FOUNDATION
815 Eddy Street
San Francisco, CA 94109
Telephone: (415) 436-9333
sophia@eff.org
Counsel for Electronic Frontier Foundation

CERTIFICATE OF COMPLIANCE WITH TYPE-VOLUME LIMITATION, TYPEFACE REQUIREMENTS AND TYPE STYLE REQUIREMENTS PURSUANT TO FED. R. APP. P. 32(a)(7)(C)

Pursuant to Fed. R. App. P. 32(a)(7)(C), I certify as follows:

1. This Brief of Amicus Curiae Electronic Frontier Foundation In Support of Defendant-Appellee complies with the type-volume limitation because it contains 4,962 words, excluding the parts of the brief exempted by Fed. R. App. P. 32(a)(7)(B)(iii); and

2. This brief complies with the typeface requirements of Fed. R. App. P. 32(a)(5) and the type style requirements of Fed. R. App. P. 32(a)(6) because it has been prepared in a proportionally spaced typeface using Microsoft Word 2011 in 14-point Times New Roman font.
Dated: October 10, 2018

By: /s/ Sophia Cope
Sophia Cope
Counsel for Amicus Curiae Electronic Frontier Foundation

CERTIFICATE OF SERVICE

I hereby certify that I electronically filed the foregoing Brief of Amicus Curiae Electronic Frontier Foundation with the Clerk of the Court for the United States Court of Appeals for the Second Circuit by using the appellate CM/ECF system on October 10, 2018. I certify that all participants in the case are registered CM/ECF users and that service will be accomplished by the appellate CM/ECF system.

Dated: October 10, 2018

By: /s/ Sophia Cope
Sophia Cope
Counsel for Amicus Curiae Electronic Frontier Foundation