UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA

TAMARA FIELDS, et al., Plaintiffs,
v.
TWITTER, INC., Defendant.

Case No. 16-cv-00213-WHO
Re: Dkt. No. 49

ORDER GRANTING MOTION TO DISMISS

INTRODUCTION

In November 2015, Lloyd "Carl" Fields, Jr. and James Damon Creach were shot and killed while working as United States government contractors at a law enforcement training center in Amman, Jordan. The shooter, Anwar Abu Zaid, was a Jordanian police officer who had been studying at the center. In subsequent statements, the Islamic State of Iraq and Syria ("ISIS") claimed responsibility for the attack, and according to Israeli intelligence, the gunman belonged to a clandestine ISIS terror cell. In their Second Amended Complaint, plaintiffs, the wife of Fields and the wife and children of Creach, seek to hold defendant Twitter, Inc. ("Twitter") liable for Abu Zaid's despicable acts and ISIS's terrorism under 18 U.S.C. § 2333(a), part of the Anti-Terrorism Act ("ATA"), on the theory that Twitter provided material support to ISIS by allowing ISIS to sign up for Twitter accounts, and that this material support was a proximate cause of the November 2015 shooting.

I dismissed plaintiffs' First Amended Complaint because their claims were barred by the Communications Decency Act ("CDA"), 47 U.S.C. § 230(c). In the Second Amended Complaint, plaintiffs attempt to plead around the CDA by asserting that Twitter provided ISIS with material support by allowing ISIS members to sign up for accounts, not by allowing them to publish content. But no amount of careful pleading can change the fact that, in substance, plaintiffs aim to hold Twitter liable as a publisher or speaker of ISIS's hateful rhetoric, and that such liability is barred by the CDA. Twitter's motion to dismiss is GRANTED without leave to amend.

BACKGROUND

In 2015, Fields and Creach travelled to Jordan through their work as government contractors. Second Amended Complaint ¶¶ 72-73 ("SAC") (Dkt. No. 48). Both had served as law enforcement officers in the United States, and both were assigned to the International Police Training Center ("IPTC"), a facility in Amman run by the United States Department of State. Id. ¶ 74.

One of the men studying at the IPTC was Anwar Abu Zaid, a Jordanian police captain. Id. ¶ 77. On November 9, 2015, Abu Zaid smuggled an assault rifle and two handguns into the IPTC and shot and killed Fields, Creach, and three other individuals. Id. ¶ 78. ISIS subsequently "claimed responsibility" for the attack, stating,

    And on '9 November 2015,' Anwar Abu Zeid—after repenting from his former occupation—attacked the American crusaders and their apostate allies, killing two American crusaders, two Jordanian apostates, and one South African crusader. These are the deeds of those upon the methodology of the revived Khilāfah. They will not let its enemies enjoy rest until enemy blood is spilled in revenge for the religion and the Ummah.

Id. ¶ 80. Plaintiffs do not allege that ISIS recruited or communicated with Abu Zaid over Twitter, that ISIS or Abu Zaid used Twitter to plan, carry out, or raise funds for the attack, or that Abu Zaid ever viewed ISIS-related content on Twitter or even had a Twitter account. There is no connection between Abu Zaid and Twitter alleged in the SAC.

Plaintiffs accuse Twitter of violating 18 U.S.C. § 2333(a), part of the ATA, by knowingly providing material support to ISIS, in violation of 18 U.S.C. § 2339A and 18 U.S.C. § 2339B. SAC ¶¶ 84-87 (Count 1, section 2339A), 88-91 (Count 2, section 2339B). Section 2333(a) provides:

    Any national of the United States injured in his or her person, property, or business by reason of an act of international terrorism, or his or her estate, survivors, or heirs, may sue therefor in any appropriate district court of the United States and shall recover threefold the damages he or she sustains and the cost of the suit, including attorney's fees.

18 U.S.C. § 2333(a). Sections 2339A and 2339B prohibit the knowing provision of "material support or resources" for terrorist activities or foreign terrorist organizations. 18 U.S.C. §§ 2339A(a), 2339B(a)(1). The term "material support or resources" is defined to include "any property, tangible or intangible, or service," including "communications equipment." 18 U.S.C. §§ 2339A(b)(1), 2339B(g)(4).

Plaintiffs assert that Twitter's "provision of material support to ISIS was a proximate cause of [their] injur[ies]." SAC ¶¶ 86, 90. They allege that Twitter "knowingly and recklessly provided ISIS with accounts on its social network" and that "[t]hrough this provision of material support, Twitter enabled ISIS to acquire the resources needed to carry out numerous terrorist attacks," including the attack that took place "on November 9, 2015 when an ISIS operative in Amman, Jordan shot and killed Lloyd 'Carl' Fields, Jr. and James Damon Creach." Id. ¶ 1.

Plaintiffs contend that ISIS uses Twitter "to spread propaganda and incite fear by posting graphic photos and videos of its terrorist feats." Id. ¶ 58. ISIS also uses Twitter "to raise funds for its terrorist activities," id. ¶ 30, and to "post instructional guidelines and promotional videos," id. ¶ 46.

In addition, ISIS uses Twitter as a recruitment platform, "reach[ing] potential recruits by maintaining accounts on Twitter so that individuals across the globe can reach out to [ISIS] directly." Id. ¶ 43. "After first contact, potential recruits and ISIS recruiters often communicate via Twitter's Direct Messaging capabilities."[1] Id. Plaintiffs allege that "[t]hrough its use of Twitter, ISIS has recruited more than 30,000 foreign recruits over the last year." Id. ¶ 52.

[1] Twitter's Direct Messaging capabilities allow Twitter users to communicate privately through messages that can be seen only by the people included on them. SAC ¶ 43.

Plaintiffs cite a number of media reports from between 2011 and 2014 concerning ISIS's use of Twitter and Twitter's "refusal to take any meaningful action to stop it." Id. ¶¶ 19-26. They also describe several attempts by members of the public and the United States government to persuade Twitter to crack down on ISIS's use of its services. Id. ¶¶ 27-32. They allege that, while Twitter has now instituted a rule prohibiting threats of violence and the promotion of terrorism, and announced in August 2016 that "it has suspended 235,000 accounts since February for promoting terrorism," it still permits groups designated by the U.S. government as Foreign Terrorist Organizations to maintain official accounts. Id. ¶ 40.
LEGAL STANDARD

Federal Rule of Civil Procedure 8(a)(2) requires a complaint to contain "a short and plain statement of the claim showing that the pleader is entitled to relief," Fed. R. Civ. P. 8(a)(2), in order to "give the defendant fair notice of what the claim is and the grounds upon which it rests," Bell Atl. Corp. v. Twombly, 550 U.S. 544, 555 (2007) (internal quotation marks and alterations omitted).

A motion to dismiss for failure to state a claim under Federal Rule of Civil Procedure 12(b)(6) tests the legal sufficiency of a complaint. Navarro v. Block, 250 F.3d 729, 732 (9th Cir. 2001). "Dismissal under Rule 12(b)(6) is appropriate only where the complaint lacks a cognizable legal theory or sufficient facts to support a cognizable legal theory." Mendiondo v. Centinela Hosp. Med. Ctr., 521 F.3d 1097, 1104 (9th Cir. 2008). While a complaint "need not contain detailed factual allegations" to survive a Rule 12(b)(6) motion, "it must plead enough facts to state a claim to relief that is plausible on its face." Cousins v. Lockyer, 568 F.3d 1063, 1067-68 (9th Cir. 2009) (internal quotation marks and citations omitted). A claim is facially plausible when it "allows the court to draw the reasonable inference that the defendant is liable for the misconduct alleged." Ashcroft v. Iqbal, 556 U.S. 662, 678 (2009) (internal quotation marks omitted).

In considering whether a claim satisfies this standard, the court must "accept factual allegations in the complaint as true and construe the pleadings in the light most favorable to the nonmoving party." Manzarek v. St. Paul Fire & Marine Ins. Co., 519 F.3d 1025, 1031 (9th Cir. 2008). However, "conclusory allegations of law and unwarranted inferences are insufficient to avoid a Rule 12(b)(6) dismissal." Cousins, 568 F.3d at 1067 (internal quotation marks omitted). A court may "reject, as implausible, allegations that are too speculative to warrant further factual development." Dahlia v. Rodriguez, 735 F.3d 1060, 1076 (9th Cir. 2013).

DISCUSSION

In the First Amended Complaint ("FAC"), plaintiffs asserted that Twitter provided material support to ISIS because it had "knowingly permitted . . . ISIS to use its social network as a tool for spreading extremist propaganda, raising funds and attracting new recruits." FAC ¶ 1. I concluded that plaintiffs' claims were barred by the CDA because they sought to hold Twitter liable as a publisher or speaker of ISIS's speech. Order Dismissing FAC at 1 (Dkt. No. 47). In the SAC, plaintiffs attempt to avoid the CDA's protection by reframing their claims – alleging that Twitter provided ISIS with material support, not by permitting it to use the social network, but by furnishing ISIS with accounts in the first place. SAC ¶ 1. Despite this careful repleading, plaintiffs' claims have not changed in a meaningful way. They seek to hold Twitter liable for allowing ISIS to use its network to spread propaganda and objectionable, destructive content. But for the reasons outlined below, and in my prior order, these claims are barred under the CDA.

Twitter moves to dismiss on multiple grounds, but its principal argument is that plaintiffs' claims are barred by section 230(c), the "protection for 'Good Samaritan' blocking and screening of offensive material" provision of the CDA. 47 U.S.C. § 230(c).
Section 230(c) contains two subsections, only the first of which, section 230(c)(1), is relevant here:

    (1) Treatment of publisher or speaker

    No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

47 U.S.C. § 230(c)(1).

While the Ninth Circuit has described the reach of section 230(c)(1) in broad terms, stating that it "immunizes providers of interactive computer services against liability arising from content created by third parties," the statute does not "create a lawless no-man's-land on the internet." Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1162, 1164 (9th Cir. 2008); see also Doe v. Internet Brands, Inc., No. 12-56638, 2016 WL 3067995, at *6 (9th Cir. May 31, 2016) (noting that "the CDA does not declare a general immunity from liability deriving from third-party content") (internal quotation marks omitted). Rather, separated into its elements, section 230(c)(1) protects from liability only (a) a provider or user of an interactive computer service (b) that the plaintiff seeks to treat as a publisher or speaker (c) of information provided by another information content provider. Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1100-01 (9th Cir. 2009).

Plaintiffs do not dispute that Twitter is an interactive computer service provider, or that the offending content highlighted in the SAC was provided by another information content provider. They dispute only the second element of Twitter's section 230(c)(1) defense, i.e., whether they seek to treat Twitter as a publisher or speaker.

The prototypical cause of action seeking to treat an interactive computer service provider as a publisher or speaker is defamation. See, e.g., Internet Brands, Inc., 2016 WL 3067995, at *4; Barnes, 570 F.3d at 1101.[2] However, "the language of the statute does not limit its application to defamation cases." Barnes, 570 F.3d at 1101. Courts have applied section 230(c)(1) to a variety of claims, including negligent undertaking, id. at 1102-03, intentional assault, Klayman v. Zuckerberg, 753 F.3d 1354, 1357-60 (D.C. Cir. 2014), and violation of anti-sex-trafficking laws, Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 18-24 (1st Cir. 2016). "[W]hat matters is not the name of the cause of action – defamation versus negligence versus intentional infliction of emotional distress – [but] whether the cause of action inherently requires the court to treat the defendant as the publisher or speaker of content provided by another." Barnes, 570 F.3d at 1101-02 (internal quotation marks omitted). "[C]ourts must ask whether the duty that the plaintiff alleges the defendant violated derives from the defendant's status or conduct as a publisher or speaker. If it does, section 230(c)(1) precludes liability." Id. (internal quotation marks omitted).

[2] Congress enacted section 230(c)(1) in part to respond to a New York state court decision, Stratton Oakmont, Inc. v. Prodigy Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995), finding that an internet service provider could be held liable for defamation based on third-party content posted on its message boards. See Internet Brands, 2016 WL 3067995, at *5; Barnes, 570 F.3d at 1101; Roommates, 521 F.3d at 1163.

Twitter contends that, despite plaintiffs' attempt to reframe their claims, plaintiffs still seek to hold it liable as the publisher of content created by ISIS. Motion to Dismiss ("Mot.") at 7-8 (Dkt. No. 49). It notes that while the SAC contains "some allegations . . . concerning Twitter's provision of accounts to ISIS," it still "contains, and necessarily relies on 'detailed descriptions of ISIS-related messages, images, and videos disseminated through Twitter and the harms allegedly caused by the dissemination of that content.'" Mot. at 8 (quoting Order Dismissing FAC at 7-8). Twitter highlights that many of the allegations in the SAC still fault it for failing to detect and prevent ISIS-related content: Twitter "failed to respond to pleas to shut down clear incitements to violence"; Twitter "does not actively monitor and will not censor user content"; "experts agree that Twitter could and should be doing more to stop ISIS from using its social network"; the White House announced it will "press the biggest Internet firms to take a more proactive approach to countering terrorist messages and recruitment online." SAC ¶¶ 30, 32, 36, 37. According to Twitter, plaintiffs' claims are based on Twitter's alleged failure to exclude this third-party content, a quintessential responsibility of a publisher. See Mot. at 8-9; see also Klayman, 753 F.3d at 1359 ("the very essence of publishing is making the decision whether to print or retract a given piece of content"); Doe v. MySpace, Inc., 528 F.3d 413, 420 (5th Cir. 2008) ("decisions relating to the monitoring, screening, and deletion of content [are] actions quintessentially related to a publisher's role") (internal quotation marks omitted); Roommates, 521 F.3d at 1170-71 (noting that section 230(c)(1) applies to "any activity that can be boiled down to deciding whether to exclude material that third parties seek to post online," and that "determin[ing] whether or not to prevent [the] posting" of material by third parties is "precisely the kind of activity" covered by the statute); Batzel v. Smith, 333 F.3d 1018, 1031 (9th Cir. 2003) ("the exclusion of 'publisher' liability necessarily precludes liability for exercising the usual prerogative of publishers to choose among proffered material"). Twitter adds that plaintiffs' provision of accounts theory does not resolve this issue because "decisions about whether particular third parties may have Twitter accounts" are also publishing activity. Mot. at 9 (quoting Order Dismissing FAC at 10).

Plaintiffs make two main arguments in response. First, they contend that their claims are not content based because "[t]he theory of liability set out in the SAC is based purely on Defendant's knowing provision of Twitter accounts to ISIS, not content created with those accounts." Opposition ("Oppo.") at 1-2 (Dkt. No. 52). They assert that "[t]he decision to provide ISIS with a Twitter account is wholly distinct from permitting ISIS to tweet propaganda. The content-neutral decision about whether to provide someone with a tool is not publishing activity as defined by the Ninth Circuit." Id.
Plaintiffs further explain that

    Providing ISIS with a Twitter account is not publishing under [the Ninth Circuit's] definitions because it does not involve reviewing, editing, or deciding whether to publish or withdraw tweets. Nor is deciding whether someone can sign up for a Twitter account the same thing as deciding what content can be published; handing someone a tool is not the same thing as supervising their use of that tool. The CDA bars claims based on the latter, but the theory of liability in this case is based solely on the former. And, notably, many Twitter users who sign up for accounts never issue a single tweet. In other words, account creation and content creation on Twitter are two distinct activities.

Id. at 3.

Plaintiffs also argue that reliance on content to prove causation does not turn their non-content-based provision of accounts theory into a CDA-barred claim. They insist that "[a]ll of the content-based allegations in the SAC are strictly limited to Section III, titled 'TWITTER PROXIMATELY CAUSED PLAINTIFFS' INJURIES'." Id. at 4. And they urge that "where a theory of liability relies on content purely for purposes of causation, but otherwise does not depend on content as a critical element, the CDA does not apply." Id.

Second, plaintiffs highlight their allegations regarding Twitter's Direct Messaging capabilities and assert that "[b]ecause Direct Messages are unpublished private communications, this theory of liability does not seek to treat Defendant as a publisher or speaker and, accordingly, the CDA does not apply." Id. at 5. Plaintiffs assert that the "ordinary meaning of 'publisher' is one who disseminates information to the public" and therefore, plaintiffs' "theory of liability, based on purely private content, is not barred by the CDA." Id. at 5-6.

I. PROVISION OF ACCOUNTS THEORY

There are at least three problems with plaintiffs' provision of accounts theory. First, providing accounts to ISIS is publishing activity, just like monitoring, reviewing, and editing content. Courts have repeatedly described publishing activity under section 230(c)(1) as including decisions about what third-party content may be posted online. See, e.g., Klayman, 753 F.3d at 1359 ("the very essence of publishing is making the decision whether to print or retract a given piece of content"); MySpace, 528 F.3d at 420 ("decisions relating to the monitoring, screening, and deletion of content [are] actions quintessentially related to a publisher's role"); Roommates, 521 F.3d at 1170-71 ("determin[ing] whether or not to prevent [the] posting" of third-party material online is "precisely the kind of activity" covered by the CDA); Batzel, 333 F.3d at 1031 ("the exclusion of 'publisher' liability necessarily precludes liability for exercising the usual prerogative of publishers to choose among proffered material"). Plaintiffs' provision of accounts theory is slightly different, in that it is based on Twitter's decisions about whether particular third parties may have Twitter accounts, as opposed to what particular third-party content may be posted.
Plaintiffs urge that Twitter's decision to provide ISIS with Twitter accounts is not barred by section 230(c)(1) because a "content-neutral decision about whether to provide someone with a tool is not publishing activity."

Although plaintiffs assert that the decision to provide an account to or withhold an account from ISIS is "content-neutral," they offer no explanation for why this is so, and I do not see how this is the case. A policy that selectively prohibits ISIS members from opening accounts would necessarily be content based, as Twitter could not possibly identify ISIS members without analyzing some speech, idea, or content expressed by the would-be account holder, i.e., "I am associated with ISIS." The decision to furnish accounts would be content-neutral if Twitter made no attempt to distinguish between users based on content – for example, if it prohibited everyone from obtaining an account, or it prohibited every fifth person from obtaining an account. But plaintiffs do not assert that Twitter should shut down its entire site or impose an arbitrary, content-neutral policy. Instead, they ask Twitter to specifically prohibit ISIS members and affiliates from acquiring accounts – a policy that necessarily targets the content, ideas, and affiliations of particular account holders. There is nothing content-neutral about such a policy.

Further, as Twitter points out, even if a user never posts a tweet, "a user who opens an account necessarily puts content online," as each Twitter account displays a public user name – such as @ISIS_Media_Hub – and a user photograph, such as a bearded man's face. Reply at 4 (Dkt. No. 53). The user name @ISIS_Media_Hub, on its own, expresses a number of ideas: "I am affiliated with ISIS"; "I am a media source"; and "Follow me for information and publicity about ISIS's activities." A decision to decline to furnish an account to this user, based on its apparent ISIS affiliation, would be a publishing decision to prohibit the public dissemination of these ideas.

Functionally, plaintiffs urge that Twitter should have imposed a blanket ban on pro-ISIS content by prohibiting ISIS affiliates from opening accounts at all. While the timing and scope of such a censorship policy differs from one barring specific objectionable tweets, it is still a content-based policy, and therefore, would constitute publishing activity. Despite being aimed at blocking Twitter accounts instead of particular tweets, plaintiffs' provision of accounts theory is still based on Twitter's alleged violation of a "duty . . . derive[d] from [its] status or conduct as a publisher." Barnes, 570 F.3d at 1102.

A recent First Circuit case, Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016), adds further support to this conclusion, as the Backpage court held that decisions about the structure and operation of a website are content-based decisions.
In Backpage, the plaintiffs, each of whom had been a victim of sex trafficking, sued the defendant website provider under the Trafficking Victims Protection Reauthorization Act, asserting that the defendant had violated the statute through various "choices [it] ha[d] made about the posting standards for advertisements," including "the lack of controls on the display of phone numbers, the option to anonymize email addresses, [and the] acceptance of anonymous payments." Id. at 20. The plaintiffs argued that "these choices are distinguishable from publisher functions." Id. The First Circuit disagreed, holding that section 230(c)(1) "extends to the formulation of precisely the sort of website policies and practices [the plaintiffs] assail." Id. The court explained that decisions regarding the "structure and operation of [a] website" – such as "permitt[ing] users to register under multiple screen names" and other decisions regarding "features that are part and parcel of the overall design and operation of the website" – "reflect choices about what content can appear on the website and in what form" and thus "fall within the purview of traditional publisher functions." Id. at 20-21.

Likewise, Twitter's decisions to structure and operate itself as a "platform . . . allow[ing] for the freedom of expression [of] hundreds of millions of people around the world," SAC ¶ 35, and, through its hands-off policy, allowing ISIS to obtain "dozens of accounts on its social network," id. ¶ 9, "reflect choices about what [third-party] content can appear on [Twitter] and in what form," Backpage, 817 F.3d at 21. Where such choices form the basis of a plaintiff's claim, section 230(c)(1) applies.

Second, although plaintiffs have carefully restructured their SAC to focus on their provision of accounts theory of liability, at their core, plaintiffs' allegations are still that Twitter knowingly failed to prevent ISIS from disseminating content through the Twitter platform, not its mere provision of accounts to ISIS.

In the SAC, plaintiffs have consolidated their allegations about Twitter's provision of accounts to ISIS under a section titled "TWITTER PROVIDED ACCOUNTS TO ISIS," where they assert, for example, that: "Since 2010, Twitter has provided ISIS with dozens of accounts on its social network"; "Twitter permitted Al-Furqan, ISIS's official media arm, to maintain an account with 19,000 followers"; "Al-Hayat Media Center, ISIS's official public relations group, maintained at least a half dozen accounts"; "@ISIS_Media_Hub, had 8,954 followers as of September 2014"; "As of December 2014, ISIS had an estimated 70,000 Twitter accounts, at least 79 of which were 'official.'" SAC ¶¶ 9-13. Plaintiffs characterize these allegations as "based purely on Defendant's knowing provision of Twitter accounts to ISIS, not content created with those accounts" and contend that their claims against Twitter are therefore not barred by the CDA. Oppo. at 1-2.

As discussed above, the decision to furnish an account, or prohibit a particular user from obtaining an account, is itself publishing activity.
Further, while plaintiffs urge me to focus exclusively on those five short paragraphs, I cannot ignore that the majority of the SAC still focuses on ISIS's objectionable use of Twitter and Twitter's failure to prevent ISIS from using the site, not its failure to prevent ISIS from obtaining accounts. For example, plaintiffs spend almost nine pages, more than half of the complaint, explaining that "Twitter Knew That ISIS Was Using Its Social Network But Did Nothing"; "ISIS Used Twitter to Recruit New Members"; "ISIS Used Twitter to Fundraise"; and "ISIS Used Twitter To Spread Propaganda." SAC ¶¶ 19-71 (emphasis added). These sections are riddled with detailed descriptions of ISIS-related messages, images, and videos disseminated through Twitter and the harms allegedly caused by the dissemination of that content.

The SAC also includes a number of allegations specifically faulting Twitter for failing to detect and prevent the dissemination of ISIS-related content through the Twitter platform. See, e.g., id. ¶¶ 30 (Twitter "failed to respond to pleas to shut down clear incitements to violence"), 36 (Twitter "does not actively monitor and will not censor user content").

It is no surprise that plaintiffs have struggled to excise their content-based allegations; their claims are inherently tied up with ISIS's objectionable use of Twitter, not its mere acquisition of accounts. Though plaintiffs allege that Twitter should not have provided accounts to ISIS, the unspoken end to that allegation is the rationale behind it: namely, that Twitter should not have provided accounts to ISIS because ISIS would and has used those accounts to post objectionable content.

In short, the theory of liability alleged in the SAC is not that Twitter provides material support to ISIS by providing it with Twitter accounts, but that Twitter does so by allowing ISIS to use Twitter "to send its propaganda and messaging out to the world and to draw in people vulnerable to radicalization." SAC ¶ 41. Plaintiffs do not dispute that this theory seeks to treat Twitter as a publisher and is barred by section 230(c)(1). Oppo. at 3.

Acknowledging that the SAC contains substantial references to ISIS content on Twitter, plaintiffs nevertheless argue that their claims are not barred by the CDA because "[a]ll of the content-based allegations in the SAC are strictly limited to Section III, titled 'TWITTER PROXIMATELY CAUSED PLAINTIFFS' INJURIES,'" and the mere fact that a claim relies on content to demonstrate proximate cause does not make the underlying claim content-based. They point to Barnes and Internet Brands in support of their contention that a claim that relies on third-party content to demonstrate proximate cause is not necessarily barred by the CDA. While this may be true, those cases, which involve substantially different facts than those at issue here, only highlight that plaintiffs' theory of liability is not content-neutral and cannot survive.

Plaintiffs attempt to liken the provision of accounts theory to the promissory estoppel claim raised in Barnes, but the facts of that case are substantially different. The plaintiff in Barnes sued Yahoo alleging that she had relied on its promise that it would remove explicit photographs her ex-boyfriend had posted online without her consent, but that it had failed to fulfill this promise. 570 F.3d at 1098-99.
The Ninth Circuit found that this promissory estoppel claim was not precluded by section 230(c)(1) because the plaintiff did not "seek to hold Yahoo liable as a publisher or speaker of third-party content, but rather as the counterparty to a contract, as a promisor who has breached." Id. at 1107. In other words, the plaintiff's theory of liability was based not on Yahoo's "publishing conduct," but rather on its "manifest intention to be legally obligated to do something." Id. By contrast, plaintiffs here assert no theory based on contract liability and allege no promise made or breached by Twitter. Barnes does not indicate that the conduct underlying the provision of accounts theory is beyond the scope of publishing conduct.

Plaintiffs also rely on Internet Brands, but again, that case involves a substantially different set of facts from this one. The plaintiff there sued the defendant website operator for negligent failure to warn, alleging that the defendant knowingly failed to warn her that two individuals were using the website to identify and lure rape victims. 2016 WL 3067995, at *2-3. Although the plaintiff had posted information on the website, the two individuals had not. Id. In holding that the plaintiff did not seek to hold the defendant liable as a publisher of third-party content, the Ninth Circuit emphasized that her negligent failure to warn claim

    would not require [the defendant] to remove any user content or otherwise affect how it publishes or monitors such content . . . Any alleged obligation to warn could have been satisfied without changes to the content posted by the website's users and without conducting a detailed investigation. [The defendant] could have given a warning to . . . users, perhaps by posting a notice on the website or by informing users by email what it knew about the activities of [the individuals]. Posting or emailing such a warning could be deemed an act of publishing information, but section 230(c)(1) bars only liability that treats a website as a publisher or speaker of content provided by somebody else . . . A post or email warning that [the defendant] generated would involve only content that [the defendant] itself produced.

Id. at *4. Plaintiffs' provision of accounts theory, on the other hand, has nothing to do with information Twitter itself should have posted online. Moreover, it would significantly affect Twitter's monitoring and publication of third-party content by effectively requiring Twitter to police and restrict its provision of Twitter accounts. Internet Brands, like Barnes, does not help plaintiffs' case.

Plaintiffs' reliance on Barnes and Internet Brands fails because, unlike those cases, and regardless of how plaintiffs have organized their complaint, their theory of liability is inherently tied to content. Creative use of headings does not change the nature of their claims, and plaintiffs cannot avoid section 230(c)(1) immunity by segregating their content-based allegations under a proximate cause banner. In the SAC, plaintiffs still seek to hold Twitter liable for allowing ISIS to post propaganda and other objectionable content on its site. This is fundamental publishing activity and falls under section 230(c)(1). Plaintiffs' attempt to limit their content-based allegations to their proximate cause section does not save their provision of accounts theory.
The third problem with the provision of accounts theory is that plaintiffs have not adequately alleged causation. Although the parties dispute the exact formulation of the appropriate causal test for civil liability under the ATA, they agree that the statute requires a showing of proximate causation. See Reply at 9-10; Oppo. at 10-11; see also 18 U.S.C. § 2333(a) (authorizing a suit for damages by "[a]ny national of the United States injured . . . by reason of an act of international terrorism") (emphasis added); 18 U.S.C. § 2339A(a) (establishing that an individual provides material support by providing resources "knowing or intending that they are to be used in preparation for, or in carrying out, a violation of" the ATA) (emphasis added); In re Terrorist Attacks on Sept. 11, 2001, 714 F.3d 118, 123-25 (2d Cir. 2013) (affirming a Rule 12(b)(6) dismissal of ATA claims for failure to plausibly allege proximate causation); Rothstein v. UBS AG, 708 F.3d 82, 94-98 (2d Cir. 2013) (same).

Even under plaintiffs' proposed "substantial factor" test, see Oppo. at 11, the allegations in the SAC do not support a plausible inference of proximate causation between Twitter's provision of accounts to ISIS and the deaths of Fields and Creach. Plaintiffs allege no connection between the shooter, Abu Zaid, and Twitter. There are no facts indicating that Abu Zaid's attack was in any way impacted, helped by, or the result of ISIS's presence on the social network. Instead they insist they have adequately pleaded proximate causation because they have alleged "(1) that Twitter provided fungible material support to ISIS, and (2) that ISIS was responsible for the attack in which Lloyd Fields, Jr. and James Damon Creach were killed." Id. at 13. Under such an expansive proximate cause theory, any plaintiff could hold Twitter liable for any ISIS-related injury without alleging any connection between a particular terrorist act and Twitter's provision of accounts. And, since plaintiffs allege that Twitter has already provided ISIS with material support, Twitter's liability would theoretically persist indefinitely and attach to any and all future ISIS attacks. Such a standard cannot be and is not the law.

As Twitter points out, courts have rejected similarly expansive proximate cause theories under the ATA. For example, in In re Terrorist Attacks on Sept. 11, 2001, the Second Circuit rejected the allegation that "providing routine banking services to organizations and individuals said to be affiliated with al Qaeda—as alleged by plaintiffs—proximately caused the September 11, 2001 attacks or plaintiffs' injuries." 714 F.3d at 124. It similarly found that the plaintiffs in Rothstein v. UBS had not demonstrated proximate cause by simply alleging that "they were injured after UBS violated federal law" and noted that such a standard "would mean that any provider of U.S. currency to a state sponsor of terrorism would be strictly liable for injuries subsequently caused by a terrorist organization associated with that state." Rothstein, 708 F.3d at 96.

Plaintiffs have not alleged any facts linking Twitter's provision of accounts to ISIS to Abu Zaid's attack.
Instead they assert that they have demonstrated proximate cause by alleging that Twitter provided ISIS with material support in the form of a powerful communication tool and that ISIS has claimed responsibility for Abu Zaid's actions. These allegations do not plausibly suggest the necessary causal connection between Twitter's provision of accounts and the attack that killed Lloyd Fields, Jr. and James Damon Creach.[3]

[3] While courts have not required plaintiffs bringing ATA claims based on material support theories to "trace specific dollars to specific attacks," Strauss v. Credit Lyonnais, S.A., 925 F. Supp. 2d 414, 433 (E.D.N.Y. 2013); accord Linde v. Arab Bank, PLC, 97 F. Supp. 3d 287, 328 (E.D.N.Y. 2015) (on appeal), they have nevertheless rejected alleged causal connections that are too speculative or attenuated to raise a plausible inference of proximate causation, Terrorist Attacks, 714 F.3d at 123-25; Rothstein, 708 F.3d at 94-98.

For these three reasons, plaintiffs' claims based on the provision of accounts theory are DISMISSED.

II. DIRECT MESSAGING THEORY

Plaintiffs' other attempt to evade section 230(c)(1) is based on Twitter's Direct Messaging capabilities, which allow for the sending of private messages through the Twitter platform. Oppo. at 5-8. Plaintiffs contend that a "publisher" under section 230(c)(1) is "one who disseminates information to the public" and thus "the CDA does not apply to claims based on purely private communications, including claims based on ISIS's use of Twitter's direct messages." Id. at 7. In support of their Direct Messaging theory, plaintiffs abandon all pretense of a content-less basis for liability and assert instead that ISIS has "used [Direct Messaging] to its great advantage," specifically, to contact and communicate with potential recruits. Id. at 5; see also SAC ¶¶ 43-45.

Publishing activity under section 230(c)(1) extends to Twitter's Direct Messaging capabilities. As noted earlier, Congress enacted section 230(c)(1) in part to respond to a New York state court decision finding that an internet service provider could be held liable for defamation based on third-party content posted on its message boards. See, e.g., Roommates, 521 F.3d at 1163. In defamation law, the term "publication" means "communication [of the defamatory matter] intentionally or by a negligent act to one other than the person defamed." Barnes, 570 F.3d at 1104 (internal quotation marks omitted). The Fourth Circuit has held that an internet service provider covered by the "traditional definition" of publisher is protected by section 230(c)(1). Zeran v. Am. Online, Inc., 129 F.3d 327, 332 (4th Cir. 1997) (explaining that "every repetition of a defamatory statement is considered a publication," and "every one who takes part in the publication is charged with publication") (internal quotation marks and alterations omitted).

While plaintiffs insist that reference to the legislative history is improper and that "publisher" must be given its normal, ordinary meaning – "to make public" – the Ninth Circuit has already indicated that section 230(c)(1) extends at least as far as prohibiting internet service providers from being treated as publishers for the purposes of defamation liability. Barnes, 570 F.3d at 1101 (the language of the statute does not limit its application to defamation cases) (emphasis added).
In response to this, plaintiffs assert that "even if Congress had intended that the defamation definition of 'publisher' be applied in defamation cases, it makes no sense to apply that definition outside of the context of defamation claims." Oppo. at 8. But this is contradicted by the Ninth Circuit's statement that "section 230(c)(1) precludes courts from treating internet service providers as publishers not just for the purposes of defamation law . . . but in general." Id. Further, if the goal of the CDA is to promote unfettered and unregulated free speech on the internet, there is no reason that section 230(c)(1) should shield providers of private messaging services from defamation liability, but no other civil liability. Treating service providers as publishers in non-defamation cases would undermine the protections afforded for defamation claims by forcing providers to implement general, content-restricting policies to filter and block objectionable content that might lead to non-defamation liability. Under this analysis, the private nature of Direct Messaging does not remove the transmission of such messages from the scope of publishing activity under section 230(c)(1).

In addition, a number of courts have applied the CDA to bar claims predicated on a defendant's transmission of nonpublic messages, and have done so without questioning whether the CDA applies in such circumstances. See Hung Tan Phan v. Lang Van Pham, 182 Cal. App. 4th 323, 324-28 (2010); Delfino v. Agilent Techs., Inc., 145 Cal. App. 4th 790, 795-96, 804-08 (2006); Beyond Sys., Inc. v. Keynetics, Inc., 422 F. Supp. 2d 523, 528, 536-37 (D. Md. 2006).

Apart from the private nature of Direct Messaging, plaintiffs identify no other way in which their Direct Messaging theory seeks to treat Twitter as anything other than a publisher of information provided by another information content provider. Accordingly, plaintiffs' claims based on this theory are DISMISSED.

III. PLAINTIFFS' PUBLIC POLICY THEORY

Finally, plaintiffs argue that barring their claims "would be at odds with the purported goals of the CDA . . . '[to] encourage the unfettered and unregulated development of free speech on the Internet'" because "Congress surely did not intend to promote speech that aids designated terrorist organizations." Oppo. at 8 (quoting Batzel, 333 F.3d at 1027). But if Congress intended such a carve out to the CDA, it would have included one. I am not at liberty to create it.

Moreover, plaintiffs' argument is incoherent. If the goal of the CDA is to "encourage the unfettered and unregulated development of free speech," any policy that requires interactive computer service providers to remove or filter particular content undermines this purpose. Such a policy would require companies like Twitter to institute (1) expensive and likely imperfect content-specific controls or (2) broad content-neutral restrictions that suppress content across the board. If content-specific controls are expensive enough to institute, and the penalties for failure to adequately control objectionable content are sufficiently severe, companies like Twitter will be encouraged to reduce their services or stop offering them altogether. There is no debate that pro-ISIS content is objectionable, but that does not mean that a carve out is consistent with the CDA's purpose. The exact opposite is true.
Shielding interactive computer service providers from publisher liability for all content encourages these companies to create "platform[s] . . . allow[ing] for the freedom of expression [of] hundreds of millions of people around the world," SAC ¶ 35, just as the CDA intended.

Plaintiffs acknowledge that allowing their claims might have a minor "'chilling effect' on Internet free speech," but insist that, at most, allowing these claims "would deter interactive computer services from knowingly providing material support to terrorists." Id. This oversimplifies the obligation plaintiffs seek to impose on Twitter. Twitter provides its services to the public indiscriminately. It does not actively provide material support to terrorists. A policy holding Twitter liable for allowing ISIS to use its services would require it to institute new procedures and policies for screening and vetting accounts before they are opened; identify and suspend the accounts of users posting pro-ISIS content; and even identify and suspend the accounts of users promoting terrorism through the direct messaging feature. These are not minor obligations, as they would require Twitter to fundamentally change certain aspects of its services and overturn its hands-off, content-neutral approach. That plaintiffs seek to hold Twitter liable for allowing only one type of content – content that supports terrorism – does not make this a minor exception to the CDA's protections. Requiring Twitter to make any content-based publishing decisions would require it to create and implement filtering procedures that it does not currently have. The difference between treating it as a publisher for one type of content, as compared to no content, is substantial.

Congress, not the courts, has the authority to amend the CDA. Plaintiffs' policy arguments do not justify allowing their claims to proceed.

CONCLUSION

Twitter's motion to dismiss the SAC is GRANTED WITHOUT LEAVE TO AMEND. I granted leave to amend the FAC, but the restructured SAC suffers from the same fatal infirmities as the FAC. I confirmed with plaintiffs' counsel at the hearing that they did not seek further amendment, and I have concluded that further amendment would be futile.

IT IS SO ORDERED.

Dated: November 18, 2016

______________________________________
WILLIAM H. ORRICK
United States District Judge