NO. 16-17165

IN THE UNITED STATES COURT OF APPEALS FOR THE NINTH CIRCUIT

TAMARA FIELDS, ET AL.,
PLAINTIFFS-APPELLANTS,
V.
TWITTER, INC.,
DEFENDANT-APPELLEE.

On Appeal from the United States District Court for the Northern District of California
Case No. 16-cv-00213-WHO
The Honorable William Horsley Orrick, District Court Judge

BRIEF OF AMICI CURIAE ELECTRONIC FRONTIER FOUNDATION AND CENTER FOR DEMOCRACY & TECHNOLOGY IN SUPPORT OF DEFENDANT-APPELLEE AND AFFIRMANCE

Aaron Mackey
Counsel of Record
Jamie Williams
Sophia Cope
ELECTRONIC FRONTIER FOUNDATION
815 Eddy Street
San Francisco, CA 94109
Email: amackey@eff.org
Telephone: (415) 436-9333

Counsel for Amici Curiae Electronic Frontier Foundation and Center for Democracy & Technology

DISCLOSURE OF CORPORATE AFFILIATIONS AND OTHER ENTITIES WITH A DIRECT FINANCIAL INTEREST IN LITIGATION

Pursuant to Rule 26.1 of the Federal Rules of Appellate Procedure, Amici Curiae Electronic Frontier Foundation and Center for Democracy & Technology state that they do not have a parent corporation and that no publicly held corporation owns 10 percent or more of their stock.

TABLE OF CONTENTS

CORPORATE DISCLOSURE STATEMENT
TABLE OF CONTENTS
TABLE OF AUTHORITIES
STATEMENTS OF INTEREST
INTRODUCTION
ARGUMENT
I. HOLDING PLATFORMS LIABLE FOR THIRD-PARTY SPEECH VIOLATES USERS' AND PLATFORMS' FIRST AMENDMENT RIGHTS.
  A. The First Amendment Protects Users' Rights to Receive Information, Including Speech About Terrorism.
  B. The First Amendment Protects Twitter as a Publisher of Controversial Speech.
    i. Speech Promoting Terrorism Is Not Categorically Excluded From First Amendment Protection.
    ii. Twitter Cannot Be Liable for Incitement Based on the Knowing Publication of Terrorism Tweets on Its Site.
II. SECTION 230 BARS PLAINTIFFS' CLAIMS.
  A. Congress Passed Section 230 to Encourage the Development of Open Platforms and Enable Robust Online Speech.
  B. Because the Harm Plaintiffs Seek to Hold Twitter Liable for Flows From Third-Party Speech, Section 230 Immunity Applies to Twitter.
  C. Providing Accounts to Third-Party Users Is a Classic Editorial Function Protected by Section 230 Immunity.
III. HOLDING TWITTER LIABLE WILL FORCE IT AND OTHER INTERMEDIARIES TO ADOPT PRACTICES THAT WILL SEVERELY CURTAIL ONLINE SPEECH.
  A. Platforms Will Need to Aggressively Vet Their Users and the Content They Create.
  B. Platforms Will Likely Restrict or Prohibit Users' Ability to Speak Anonymously.
  C. The Internet Will Become a Place of Closed, Sanitized Forums, Rather Than an Essential Place for Social and Political Discourse.
CONCLUSION

TABLE OF AUTHORITIES

Cases

Art of Living Foundation v. Does, No. C10-05022 LHK (HRL) (N.D. Cal. Aug. 10, 2011)
Ascentive, LLC v. Opinion Corp., 842 F. Supp. 2d 450 (E.D.N.Y. 2011)
Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009)
Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003)
Bd. of Educ. v. Pico, 457 U.S. 853 (1982)
Brown v. Entertainment Merchants Ass'n, 564 U.S. 786 (2011)
Carafano v. Metrosplash.com, Inc., 339 F.3d 1119 (9th Cir. 2003)
Cohen v. Facebook, Inc., No. 16-CV-4453 (E.D.N.Y. May 18, 2017)
Columbia Ins. Co. v. Seescandy.Com, 185 F.R.D. 573 (N.D. Cal. 1999)
Conant v. Walters, 309 F.3d 629 (9th Cir. 2002)
Dart v. Craigslist, Inc., 665 F. Supp. 2d 961 (N.D. Ill. 2009)
Doe v. 2TheMart.com Inc., 140 F. Supp. 2d 1088 (W.D. Wash. 2001)
Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008)
Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008)
Fields v. Twitter, No. 16-cv-00213-WHO (N.D. Cal. Nov. 18, 2016)
FTC v. LeadClick Media, LLC, 838 F.3d 158 (2d Cir. 2016)
Gentry v. eBay, Inc., 99 Cal. App. 4th 816 (2002)
Green v. Am. Online (AOL), 318 F.3d 465 (3d Cir. 2003)
Herceg v. Hustler Magazine, Inc., 814 F.2d 1017 (5th Cir. 1987)
In re Terrorist Attacks on September 11, 2001 (Al Rajhi Bank, et al.), 714 F.3d 118 (2d Cir. 2013)
Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016)
Jones v. Dirty World Entm't Recordings LLC, 755 F.3d 398 (6th Cir. 2014)
Lamont v. Postmaster Gen'l, 381 U.S. 301 (1965)
McIntyre v. Ohio Elections Comm'n, 514 U.S. 334 (1995)
Reno v. ACLU, 521 U.S. 844 (1997)
Rice v. Paladin Enters., 128 F.3d 233 (4th Cir. 1997)
Richmond Newspapers v. Virginia, 448 U.S. 555 (1980)
Stanley v. Georgia, 394 U.S. 557 (1969)
Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94 (N.Y. Sup. Ct. Dec. 15, 1995)
U.S. v. Stevens, 559 U.S. 460 (2010)
Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997)

Statutes

18 U.S.C. § 2333(a)
47 U.S.C. § 230
47 U.S.C. § 230(b)(2)
47 U.S.C. § 230(b)(3)
47 U.S.C. § 230(c)(1)

Constitutional Provisions

U.S. Constitution, amendment I

Other Authorities

Anupam Chander & Uyên P. Lê, Free Speech, 100 Iowa L. Rev. 501 (2015)
Bree Brouwer, YouTube Now Gets Over 400 Hours of Content Uploaded Every Minute, Tubefilter (July 26, 2015)
ICT Facts & Figures 2016, International Telecommunications Union (U.N. agency for information and communications technology)
Investor Relations, Yelp 1Q17 Data Sheet
Jeremy Kessel, Our tenth Twitter #Transparency Report (Mar. 21, 2017)
Number of monthly active Facebook users worldwide as of 3rd quarter 2016, Statista
Philip N. Howard, et al., Opening Closed Regimes: What Was the Role of Social Media During the Arab Spring? (Sep. 1, 2011)
Ryan Grenoble, Twitter Transparency Report Details Escalating Crackdown On Terrorists, The Huffington Post (Mar. 21, 2017)
Violent Threats, Abuse or Harassment, Twitter Safety Policies

STATEMENTS OF INTEREST

The Electronic Frontier Foundation (EFF), a non-profit civil liberties organization with more than 36,000 members, works to protect rights in the digital world. Based in San Francisco and founded in 1990, EFF regularly advocates in courts on behalf of users and creators of technology in support of free expression, privacy, and innovation online.

The Center for Democracy & Technology (CDT) is a non-profit public interest organization that advocates for individual rights in Internet law and policy. CDT represents the public's interest in an open, innovative, and decentralized Internet that promotes constitutional and democratic values of free expression, privacy, and individual liberty. CDT has litigated or otherwise participated in a broad range of Internet free expression and intermediary liability cases.

INTRODUCTION

Plaintiffs' aggressive attempt to plead around 47 U.S.C. § 230 ("Section 230")[1] to hold Twitter liable for third-party speech jeopardizes online platforms' ability to offer Internet users robust and open forums for speech. Although Plaintiffs go to great lengths to argue that their legal claims and grievances do not implicate Section 230, there is no escaping the fact that the harm they allege flows directly from third-party speech. Section 230 bars these claims and, as Twitter shows, this Court should affirm the lower court's holding on this basis alone.

[1] The statute was passed as Section 509 of the Communications Decency Act, part of the Telecommunications Act of 1996, Pub. L. 104–104. It is sometimes colloquially referred to as "CDA 230" or "Section 230 of the Communications Decency Act." Amici refer to it as Section 230.

This Court should affirm the district court's decision for two independent reasons: First, Plaintiffs' effort to hold Twitter liable for offensive speech violates the First Amendment rights of Internet users, who have a right to receive information, and of Twitter, which has a right to publish information. Imposing civil liability for the publication of speech promoting terrorism would require this Court to fashion an entirely new category of speech that falls outside the First Amendment's protection—a proposition that is not only ill-advised but also contrary to well-settled law.

Second, enabling Plaintiffs to circumvent Section 230 via a "provision of accounts" theory is bad public policy. Online intermediaries like Twitter are an essential element of the modern Internet. End users rely on intermediaries to express themselves and communicate with others online.
Platforms like Twitter give everyone and anyone the ability to reach an audience or engage with others, without having to learn how to code or expend significant financial resources, on all manner of topics, for all manner of reasons. Thanks to these platforms, Internet users can easily connect with family and friends, follow the news, express opinions, share personal experiences, create and share art, and debate politics.

The immunity Section 230 affords Internet intermediaries is the key factor enabling platforms to host vibrant, robust, and diverse forums for online speech. Undermining Section 230 means undermining the free and open Internet. Indeed, if Plaintiffs were to prevail in their effort to circumvent Section 230, intermediaries would likely take immediate steps to restrict the openness of their platforms, such as by scrutinizing users, limiting accounts, and screening content. And those platforms that cannot afford to screen their users will simply cease to exist. That outcome will blunt the Internet's ability to be a powerful, diverse forum for political and social discourse.

This Court should affirm the district court's order granting Twitter's motion to dismiss without leave to amend.

ARGUMENT

I. HOLDING PLATFORMS LIABLE FOR THIRD-PARTY SPEECH VIOLATES USERS' AND PLATFORMS' FIRST AMENDMENT RIGHTS.

A. The First Amendment Protects Users' Rights to Receive Information, Including Speech About Terrorism.

The First Amendment protects the right of platform users to receive information, including offensive rhetoric advocating for terrorism that neither constitutes a true threat nor directly incites violence. The Supreme Court has held that "the right to receive ideas is a necessary predicate to the recipient's meaningful exercise of his own rights of speech, press, and political freedom." Bd. of Educ. v. Pico, 457 U.S. 853, 867 (1982) (plurality). This Court has acknowledged that the right to receive information "and the right to speak are flip sides of the same coin." Conant v. Walters, 309 F.3d 629, 643 (9th Cir. 2002).

The right to receive information does not turn on the underlying merit of the ideas communicated. Quite the opposite: it ensures that people have access to different, controversial ideas and views. As the Supreme Court has recognized, "the right to receive information and ideas, regardless of their social worth . . . is fundamental to our free society," Stanley v. Georgia, 394 U.S. 557, 564 (1969) (protecting the right to possess obscene materials at home), because it is essential to fostering open debate. Indeed, "[i]t would be a barren marketplace of ideas that had only sellers and no buyers." Lamont v. Postmaster Gen'l, 381 U.S. 301, 308 (1965) (Brennan, J., concurring) (protecting the "right to receive" foreign publications). Similarly, the First Amendment protects the right to gather information. See Richmond Newspapers v. Virginia, 448 U.S. 555, 576 (1980) (plurality) (protecting the right to gather information in courtrooms, because "free speech carries with it some freedom to listen").

Holding platforms liable for publishing speech on certain topics thus interferes with users' rights. Platforms will likely react to such legal liability by simply not publishing any speech about terrorism—not merely speech directly inciting imminent terrorist attacks or expressing true threats.
But users have the right to receive speech, even on unpopular and abhorrent topics such as terrorism, and even from unpopular speakers who advocate terrorist ideology. Depriving users of their right to receive and gather information discussing terrorism will do far more than simply limit which content is available online; it will stunt people's ability to be informed about the world and form opinions.

Depriving platform users of their ability to decide for themselves whether to receive speech on certain subjects will short-circuit the marketplace of ideas in a way that runs directly counter to the First Amendment. As the Supreme Court recognized in Pico, the ability to access information is antecedent to engaging in speech protected by the First Amendment. See 457 U.S. at 867. The interplay between receiving information and engaging in speech exists for terrorism just like any other subject matter: journalists need to access and gather information about terrorism to report about it; academic researchers need the information to inform our social and political beliefs; government officials and the general public need the information to engage in political and social debates about terrorism and related foreign and domestic policy.

B. The First Amendment Protects Twitter as a Publisher of Controversial Speech.

i. Speech Promoting Terrorism Is Not Categorically Excluded From First Amendment Protection.

Imposing liability on intermediaries for hosting content promoting terrorism would also violate the intermediaries' First Amendment rights. Specifically, it would punish Twitter for disseminating speech that is fully protected by the First Amendment.

Underlying Plaintiffs' legal theory is the premise that speech about or otherwise promoting terrorism is of so little value that it enjoys no First Amendment protection, and that platforms can thus be held liable for publishing such content. But there are only a handful of historically unprotected categories of speech, and terrorist speech is not one of them. Although certain types of terrorist speech may be unprotected, such as true threats and speech directly inciting imminent lawless acts, the vast majority of speech about terrorism is fully protected by the First Amendment.

Further, the Supreme Court has been loath to expand the list of unprotected categories of speech, even in cases involving toxic and extremely offensive speech. In U.S. v. Stevens, 559 U.S. 460, 469 (2010), for example, the government sought to create a new category of unprotected speech that it could punish: graphic and disturbing depictions of animal cruelty. The government proposed a balancing test—weighing "the value of the speech against its societal costs"—to determine whether certain categories of speech fall outside the First Amendment. Id. at 470. The Supreme Court rejected the government's proposal as both "startling and dangerous." Id. The First Amendment does not permit the creation of new categories of unprotected speech, the Court held, because the "guarantee of free speech does not extend only to categories of speech that survive an ad hoc balancing of relative social costs and benefits." Id.
The Court reaffirmed this principle less than a year later in striking down a California law that banned the sale of violent video games to minors and would have created a de facto new category of unprotected speech by grafting portions of the definition of obscenity onto depictions of extremely violent video game content. Brown v. Entertainment Merchants Ass'n, 564 U.S. 786, 792–93 (2011).

There is no historical basis for expanding the list of speech unprotected by the First Amendment to include speech promoting terrorism. Further, the First Amendment does not permit ad hoc judgments regarding the social value of speech to determine whether that speech is protected. Plaintiffs cannot impose categorical liability on Twitter for publishing terrorist content without punishing platforms for disseminating speech fully protected by the First Amendment. The creation of a new category of unprotected speech is unwarranted and inappropriate.

ii. Twitter Cannot Be Liable for Incitement Based on the Knowing Publication of Terrorism Tweets on Its Site.

Some online speech promoting terrorism may constitute speech directly inciting violence and thus fall outside the First Amendment, but even then Twitter could not be held liable merely for publishing such speech. The First Amendment generally bars claims against publishers for inciting harmful conduct via the knowing publication of motivational or instructional speech. In Herceg v. Hustler Magazine, Inc., 814 F.2d 1017, 1021–22 (5th Cir. 1987), for example, the Fifth Circuit overturned a jury verdict finding Hustler liable for a teen's death as a result of its publication of an article about autoerotic asphyxiation. The court held that liability could not be imposed on Hustler "without impermissibly infringing on freedom of speech" because there was no evidence that the publisher intended, advocated for, or directly incited the teen to attempt the act. Id.

Courts have held that publishers can be held liable for content that results in death or bodily injury only where (i) the publisher has the specific intent to encourage the commission of violent acts, and (ii) the publisher provides specific instructions to commit the acts, rather than abstract advocacy. Rice v. Paladin Enters., 128 F.3d 233, 242 (4th Cir. 1997).

This narrow class of cases in which the First Amendment will not bar platform liability based on content that resulted in death or bodily injury is not applicable here. Plaintiffs cannot show that Twitter possessed the specific intent, or provided the specific direction, required for liability based on the user-generated content that incited the violence in this case. There is no allegation that Twitter made any effort to direct or incite the offensive tweets that form the basis of Plaintiffs' case, and in fact Twitter has policies prohibiting such speech. See Violent Threats, Abuse or Harassment, Twitter Safety Policies.[2] Even if Twitter had actual knowledge of ISIS tweets that directly incited terrorism or other criminal acts, it would still lack the specific intent required to vitiate the First Amendment protection recognized in both Herceg and Rice.

[2] Available at https://about.twitter.com/safety/policies.
II. SECTION 230 BARS PLAINTIFFS' CLAIMS.

A. Congress Passed Section 230 to Encourage the Development of Open Platforms and Enable Robust Online Speech.

Plaintiffs' legal theory threatens not just Twitter, but the ability of all Internet intermediaries to host platforms for diverse online speech. Congress recognized in passing Section 230 that the Internet depends upon intermediaries, which serve "as a vehicle for the speech of others." Anupam Chander & Uyên P. Lê, Free Speech, 100 Iowa L. Rev. 501, 514 (2015). Intermediaries create democratic forums in which anyone can become "a pamphleteer" or "a town crier with a voice that resonates farther than it could from any soapbox." Reno v. ACLU, 521 U.S. 844, 870 (1997). They give a single person anywhere in the world, with minimal resources and technical expertise, the ability to communicate with others across the globe. Online platforms host a wide range of diverse speech on behalf of their users, ensuring that all views—especially controversial ones—can be presented and received by platform users.

Intermediary platforms—such as social media websites, blogging platforms, video-sharing services, and web-hosting companies—are the essential architecture of today's Internet. Indeed, they are often the primary way in which the majority of people engage with one another online. Thus, any effort to weaken Section 230 threatens the Internet as a whole.

Congress clearly understood the essential function online intermediaries play in our digital lives. In passing Section 230, Congress recognized the Internet's power to sustain and promote robust individual speech, a value rooted in the First Amendment. Congress sought to further encourage the already robust free speech occurring online and to speed the development of online platforms by providing broad immunity to service providers that host user-generated content. See 47 U.S.C. § 230(b)(2), (b)(3) ("It is the policy of the United States . . . to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services" and "to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation."); see also Batzel v. Smith, 333 F.3d 1018, 1027 (9th Cir. 2003) ("Congress wanted to encourage the unfettered and unregulated development of free speech on the Internet, and to promote the development of e-commerce.").

Congress recognized that if our legal system failed to robustly protect intermediaries, it would fail to protect free speech online. Zeran v. AOL, 129 F.3d 327, 330 (4th Cir. 1997). Given the volume of information being published online, it would be impossible for most intermediaries to review every single bit of information published through their platforms prior to publication. "Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted." Id. at 331.
The resulting Internet would include a far more limited number of forums if intermediaries were forced to second-guess decisions about managing and presenting content authored by third parties. By creating Section 230's platform immunity, Congress made the intentional policy choice that individuals harmed by speech online must seek relief from the speakers themselves, rather than from the platforms those speakers used. Id. at 330–31.

By limiting liability in this way, Congress decided that creating a forum for unrestrained and robust communication was of utmost importance, even if it resulted in the presence of harmful content online. Thus, while Congress certainly did not intend to promote speech that aids terrorist organizations, as Plaintiffs point out, Congress did decide that promoting robust online dialogue was more important than ridding the Internet of all harmful speech. See Appellants' Opening Brief ("AOB") at 23. Placing liability on Twitter in this case not only conflicts with the plain text and purpose of Section 230; it also would severely undercut the essential role online platforms play in fostering our modern political and social discourse.

B. Because the Harm Plaintiffs Seek to Hold Twitter Liable for Flows From Third-Party Speech, Section 230 Immunity Applies to Twitter.

Plaintiffs' liability theory requires this Court to fashion a judicial exception to Section 230's broad platform immunity, because the relief they seek depends on finding Twitter liable for third-party speech on its platform. Congress plainly barred this result in passing Section 230.[3]

[3] Section 230(c)(1) states: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." 47 U.S.C. § 230(c)(1). The statute protects an online service provider that a plaintiff seeks to treat as the publisher or speaker of content created by a third party. Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1100–01 (9th Cir. 2009).

Plaintiffs' claims begin and end with concern over user-generated content: Plaintiffs believe ISIS tweets were the proximate cause of the harm they suffered, and they seek to hold Twitter liable for failing to prevent those tweets. See AOB 9–10, 19–22. Plaintiffs explicitly acknowledge that content is the crux of their causation arguments—an acknowledgement that belies their repeated assertions that their material support allegations against Twitter have nothing to do with speech. See AOB at 20. They believe that preventing terrorist propaganda—namely, tweets and messages by ISIS that "spread propaganda and incite fear," solicited "funds for its terrorists' feats," conveyed "instructional guidelines," or attempted to recruit new members—would have prevented the tragedies that gave rise to this case. Fields v. Twitter, No. 16-cv-00213-WHO, 2016 WL 6822065, at *2 (N.D. Cal. Nov. 18, 2016). They believe tweets of ISIS terrorists radicalized and inspired Abu Zaid, who in turn committed the act of violence that resulted in the tragic deaths of Plaintiffs' loved ones.[4]

[4] Plaintiffs present no evidence that Abu Zaid had a Twitter account or otherwise used Twitter.
Plaintiffs rely entirely on the content of tweets as their showing of the proximate causation necessary for a civil claim under the Anti-Terrorism Act (ATA). See 18 U.S.C. § 2333(a); In re Terrorist Attacks on September 11, 2001 (Al Rajhi Bank, et al.), 714 F.3d 118, 123–125 (2d Cir. 2013). Indeed, in all ATA cases where the defendant is an Internet communications platform and the supposed nexus between the plaintiff's injury and the defendant's alleged provision of material support involves the defendant's mere hosting of objectionable content, the claims will always be based on third-party speech.

Because Plaintiffs' claims attempt to causally link Twitter's liability under the ATA to user-generated content, Section 230's immunity must apply. If "the duty that the plaintiff alleges the defendant violated derives from the defendant's status or conduct as a 'publisher or speaker,'" then "section 230(c)(1) precludes liability." Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1102 (9th Cir. 2009). "[W]hat matters is not the name of the cause of action—defamation versus negligence versus intentional infliction of emotional distress—what matters is whether the cause of action inherently requires the court to treat the defendant as the 'publisher or speaker' of content provided by another." Id. at 1101–02; see also Doe v. MySpace, Inc., 528 F.3d 413, 420 (5th Cir. 2008) (rejecting an attempt to circumvent Section 230's protection by styling the claims as a failure to implement basic safety measures to protect minors).

Nor can Plaintiffs divorce Twitter's "provision of accounts" from its function as a platform for third-party speech. Twitter accounts serve no function other than to facilitate a user's publishing or receiving of information from others. Thus, Plaintiffs' "attempt to draw a narrow distinction between policing accounts and policing content must ultimately be rejected." Cohen v. Facebook, Inc., No. 16-CV-4453, 2017 WL 2192621, at *12 (E.D.N.Y. May 18, 2017). Their allegations are "merely another way of claiming that [Twitter is] liable for publishing the communications and they speak to [Twitter's] role as a publisher of online third-party-generated content." MySpace, 528 F.3d at 420.

Thus, holding Twitter liable for third-party speech that promotes terrorism is one and the same as holding Twitter liable for creating the content itself. To imagine otherwise requires that Plaintiffs either (1) completely dissociate the harm stemming from the user-generated content they complain of from Twitter's role as an online platform, or (2) run roughshod over Section 230's immunity. The former cannot be done either as a matter of logic or legal causation. The latter cannot be done without a judicial rewrite of a duly passed act of Congress—a dangerous request that this Court should reject.

C. Providing Accounts to Third-Party Users Is a Classic Editorial Function Protected by Section 230 Immunity.

In providing accounts to the public, Twitter creates opportunities for users to publish content; providing accounts is thus a traditional publication function immunized by Section 230.
Both the district court below and the Eastern District of New York have correctly recognized that a platform's decision to provide accounts is "just like monitoring, reviewing, and editing content." Fields, 2016 WL 6822065, at *5; see also Cohen, 2017 WL 2192621, at *12. Twitter's "choices as to who may use its platform are inherently bound up in its decisions as to what may be said on its platform, and so liability imposed based on its failure to remove users would equally 'derive[ ] from [Twitter's] status or conduct as a publisher or speaker.'" Cohen, 2017 WL 2192621, at *12 (quoting FTC v. LeadClick Media, LLC, 838 F.3d 158, 175 (2d Cir. 2016)) (internal quotation marks and citations omitted). The entire purpose of Plaintiffs' suit is to restrict certain users from having accounts and thereby prevent the publication of "content, ideas, and affiliations" by those would-be account holders. Fields, 2016 WL 6822065, at *6.

In enacting Section 230, Congress in part sought to reverse judicial decisions, such as Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94, 1995 WL 323710 (N.Y. Sup. Ct. Dec. 15, 1995), that held online platforms liable when they undertook traditional editing and publishing functions for user-generated content. See S. Rep. No. 104–230, at 194 (1996) ("One of the specific purposes of [Section 230] is to overrule Stratton Oakmont v. Prodigy and any other similar decisions[.]"); Zeran v. AOL, 129 F.3d 327, 331 (4th Cir. 1997) (recognizing that one of Section 230's purposes was to encourage platforms to exercise traditional editorial functions).

The immunity Section 230 provides for platforms in their role as editors and publishers of third-party speech is broad. This Court, for example, has held that Section 230 immunizes "any activity that can be boiled down to deciding whether to exclude material that third parties seek to post online[.]" Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1170–71 (9th Cir. 2008) (en banc). The Fourth Circuit has held similarly, recognizing that any "lawsuits seeking to hold a service provider liable for its exercise of a publisher's traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content—are barred." Zeran, 129 F.3d at 330; see also Green v. Am. Online (AOL), 318 F.3d 465, 471 (3d Cir. 2003) (same). This immunity also covers claims relating to a platform's "editing and selection" process, Carafano v. Metrosplash.com, Inc., 339 F.3d 1119, 1124 (9th Cir. 2003), and claims related to a platform's "'monitoring, screening, and deletion of content.'" MySpace, 528 F.3d at 420 (citation omitted).

Section 230 also bars claims related to a platform's decisions about "the structure and operation" of a website, "which reflect choices about what content can appear on the website and in what form." Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 21 (1st Cir. 2016). The statute also protects a platform's choices about how its search engine should function. See Roommates.com, 521 F.3d at 1167; see also Ascentive, LLC v. Opinion Corp., 842 F. Supp. 2d 450, 476 (E.D.N.Y. 2011). Further, Section 230 immunizes platforms from claims regarding how a platform displays, categorizes, or allows access to user-generated content.
Jones v. Dirty World Entm't Recordings LLC, 755 F.3d 398, 409–10 (6th Cir. 2014); Dart v. Craigslist, Inc., 665 F. Supp. 2d 961, 967 (N.D. Ill. 2009); see also Gentry v. eBay, Inc., 99 Cal. App. 4th 816, 832 (2002).

III. HOLDING TWITTER LIABLE WILL FORCE IT AND OTHER INTERMEDIARIES TO ADOPT PRACTICES THAT WILL SEVERELY CURTAIL ONLINE SPEECH.

A. Platforms Will Need to Aggressively Vet Their Users and the Content They Create.

If causes of action against intermediaries based on the provision of accounts to users who post objectionable content are permitted, the relationship between platforms and their users will be fundamentally altered. Instead of offering open forums for participation by people around the entire world—a quintessential feature of Internet intermediaries—these platforms, saddled with an active duty to monitor the creation of accounts, will choose to limit access to their platforms to individuals whose identities can be verified and who can be expected to post noncontroversial content. Those platforms without the substantial resources required to manage potential liability in this way will shut down.

The platforms that remain will thus become active gatekeepers, screening users during the account creation process and likely requiring individuals to verify their identities—a requirement that will bar huge swathes of the population from participating, as many people do not possess adequate identification. Worse, such a practice will violate personal privacy and the right to speak anonymously, as discussed below in Section III.B. If an individual cannot pass the screening process for any reason—i.e., if the platform has some reason to believe that the individual will potentially post controversial content—that individual will be completely blocked from opening an account and thus from speaking at all.

Platforms also will likely attempt to screen user-generated content even after vetting their users, to avoid the risk that their users might post offensive content that will create liability for the platform. Twitter is already removing thousands of posts and hundreds of thousands of accounts in response to concerns about terrorist speech on its platform. Ryan Grenoble, Twitter Transparency Report Details Escalating Crackdown On Terrorists, The Huffington Post (Mar. 21, 2017) (noting that Twitter suspended roughly 376,000 accounts in 2016).[5] The liability Plaintiffs seek to impose will simply push Twitter to adopt even more extreme forms of censorship. These overreactions could include reviewing all content users intend to post before it is published, preventing certain content from being published at all, and deleting content or removing accounts that discuss anything remotely related to terrorism—or any other controversial subject—even if it is critical commentary or a journalist's account.

[5] Available at http://www.huffingtonpost.com/entry/twitter-transparency-anti-terrorism-isis_us_58d1691ee4b0be71dcf89b7b; see also Jeremy Kessel, Our tenth Twitter #Transparency Report (Mar. 21, 2017), https://blog.twitter.com/2017/our-tenth-twitter-transparency-report.

Moreover, platforms' ability—both logistical and financial—to conduct such review is dubious given the incredible volume of content generated by platform users. When Congress passed Section 230 in 1996, about 40 million people used the Internet worldwide, and commercial online services in the United States had almost 12 million individual subscribers. Reno, 521 U.S. at 850–51. Today's Internet hosts third-party contributions from a broad array of voices, facilitating the speech of billions.
In 2016, roughly 3.5 billion people were online, 47 percent of the global population, and prominent online service Facebook had 1.79 billion users.[6] Users of the video platform YouTube today upload roughly 400 hours of video to the website every minute. Bree Brouwer, YouTube Now Gets Over 400 Hours of Content Uploaded Every Minute, Tubefilter (July 26, 2015).[7] In early 2017, review website Yelp saw an average of 183 million visitors to its site monthly and hosted an estimated 127 million user-generated reviews of restaurants, businesses, and services. See Investor Relations, Yelp 1Q17 Data Sheet.[8]

[6] See ICT Facts & Figures 2016, International Telecommunications Union (U.N. agency for information and communications technology) (June 2016), http://www.itu.int/en/ITU-D/Statistics/Documents/facts/ICTFactsFigures2016.pdf; Number of monthly active Facebook users worldwide as of 3rd quarter 2016, Statista, available at https://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/ (last visited June 5, 2017).

[7] Available at http://www.tubefilter.com/2015/07/26/youtube-400-hours-content-every-minute/.

[8] Available at http://www.yelp-ir.com/phoenix.zhtml?c=250809&p=irol-irhome (last visited June 5, 2017).

B. Platforms Will Likely Restrict or Prohibit Users' Ability to Speak Anonymously.

If Section 230's immunity fails, platforms also will likely restrict the ability of their users to speak anonymously, as intermediaries will need to know who users are before they can screen them. Anonymous speech, which is fully protected by the First Amendment, is part of an "honorable tradition of advocacy and of dissent." McIntyre v. Ohio Elections Comm'n, 514 U.S. 334, 357 (1995). The Internet has historically served as one of the most robust spaces for anonymous speech. Art of Living Foundation v. Does, No. C10-05022 LHK (HRL), 2011 WL 3501830, at *2 (N.D. Cal. Aug. 10, 2011). "Internet anonymity facilitates the rich, diverse, and far ranging exchange of ideas" and "'can foster open communication and robust debate.'" Doe v. 2TheMart.com Inc., 140 F. Supp. 2d 1088, 1092 (W.D. Wash. 2001) (quoting Columbia Ins. Co. v. Seescandy.Com, 185 F.R.D. 573, 578 (N.D. Cal. 1999)). Plaintiffs' proposed liability regime will thwart the Internet's ability to remain "a valuable forum for robust exchange and debate." Art of Living Foundation, 2011 WL 3501830, at *2.

C. The Internet Will Become a Place of Closed, Sanitized Forums, Rather Than an Essential Place for Social and Political Discourse.

If platforms are required to take some or all of the measures described above, it will lead to sanitized and milquetoast online platforms. Platforms will outright prevent certain individuals from participating in online communities in the first place.
And they will encourage self-censorship and sanitized conversation: the less controversial one's online persona, the more likely one will be able to obtain and maintain an account.

This will end the essential role intermediaries play in fostering social and political discourse on the Internet. Indeed, many individuals around the world use U.S.-based services to access and distribute all manner of content, from organizing in opposition to oppressive regimes,[9] to sharing pictures of children with grandparents. Such robust, global participation would never have been achieved without the immunity provided by U.S. law via Section 230, and granting a plaintiff a clear avenue to circumvent Section 230's protections will undermine it. Because platforms will be unwilling to take a chance on provocative or unpopular speech, the online marketplace of ideas will be artificially stunted, despite such speech being protected by the First Amendment.

[9] See, e.g., Philip N. Howard, et al., Opening Closed Regimes: What Was the Role of Social Media During the Arab Spring? (Sep. 1, 2011), available at http://philhoward.org/opening-closed-regimes-what-was-the-role-of-social-media-during-the-arab-spring/.

CONCLUSION

For the reasons outlined here, this Court should affirm the district court's holding that Section 230 immunizes Twitter from Plaintiffs' claims in this case.

Dated: June 7, 2017

By: /s/ Aaron Mackey
Aaron Mackey
Jamie Williams
Sophia Cope
ELECTRONIC FRONTIER FOUNDATION
815 Eddy Street
San Francisco, CA 94109
Telephone: (415) 436-9333
amackey@eff.org

Counsel for Amici Curiae Electronic Frontier Foundation and Center for Democracy & Technology

CERTIFICATE OF COMPLIANCE WITH TYPE-VOLUME LIMITATION, TYPEFACE REQUIREMENTS AND TYPE STYLE REQUIREMENTS PURSUANT TO FED. R. APP. P. 32(a)(7)(C)

Pursuant to Fed. R. App. P. 32(a)(7)(C), I certify as follows:

1. This Brief of Amici Curiae Electronic Frontier Foundation and Center for Democracy & Technology in Support of Defendant-Appellee complies with the type-volume limitation, because this brief contains 4,986 words, excluding the parts of the brief exempted by Fed. R. App. P. 32(a)(7)(B)(iii); and

2. This brief complies with the typeface requirements of Fed. R. App. P. 32(a)(5) and the type style requirements of Fed. R. App. P. 32(a)(6) because this brief has been prepared in a proportionally spaced typeface using Microsoft Word 2011 in 14-point Times New Roman font.

Dated: June 7, 2017

By: /s/ Aaron Mackey
Aaron Mackey
Counsel for Amici Curiae Electronic Frontier Foundation and Center for Democracy & Technology

CERTIFICATE OF SERVICE

I hereby certify that I electronically filed the foregoing with the Clerk of the Court for the United States Court of Appeals for the Ninth Circuit by using the appellate CM/ECF system on June 7, 2017.

I certify that all participants in the case are registered CM/ECF users and that service will be accomplished by the appellate CM/ECF system.

Dated: June 7, 2017

By: /s/ Aaron Mackey
Aaron Mackey
Counsel for Amici Curiae Electronic Frontier Foundation and Center for Democracy & Technology