The Advocacy of Terrorism on the Internet: Freedom of Speech Issues and the Material Support Statutes

Kathleen Ann Ruane, Legislative Attorney
September 8, 2016
Congressional Research Service, 7-5700, www.crs.gov, R44626

Summary

The development of the Internet has revolutionized communications. It has never been easier to speak to wide audiences or to communicate with people who may be located more than half a world away from the speaker. However, like any neutral platform, the Internet can be used for many different ends, including illegal, offensive, or dangerous purposes. Terrorist groups, such as the Islamic State (IS, also referred to as ISIS or ISIL), Al Qaeda, Hamas, and Al Shabaab, use the Internet to disseminate their ideology, to recruit new members, and to take credit for attacks around the world. In addition, some people who are not members of these groups may view this content and could begin to sympathize with or to adhere to the violent philosophies these groups advocate. They might even act on these beliefs. Several U.S. policymakers, including some Members of Congress, have expressed concern about the influence that terrorist advocacy may have upon those who view or read it. The ease with which such speech may be disseminated over the Internet, using popular social media services, has been highlighted by some observers as potentially increasing the likelihood that persons who might otherwise not have been exposed to the ideology or recruitment efforts of terrorist entities may become radicalized. These concerns raise the question of whether it would be permissible for the federal government to restrict or prohibit the publication and distribution of speech that advocates the commission of terrorist acts when that speech appears on the Internet.
Significant First Amendment freedom of speech issues are raised by the prospect of government restrictions on the publication and distribution of speech, even speech that advocates terrorism. This report discusses relevant precedent concerning the extent to which advocacy of terrorism may be restricted in a manner consistent with the First Amendment’s Freedom of Speech Clause. The report also discusses the potential application of the federal ban on the provision of material support to foreign terrorist organizations (FTOs) to the advocacy of terrorism, including as it relates to the dissemination of such advocacy via online services like Twitter or Facebook.

Contents

Introduction
Constitutional Principles: Advocacy of Violence or Lawlessness Under the First Amendment
  Pure Advocacy of Violence and Law Breaking Under Brandenburg v. Ohio
  Ambiguity in Brandenburg’s Scope
  Holder v. Humanitarian Law Project
Restricting the Advocacy of Terrorism on the Internet
  The Constitutionality of a Criminal Law that Would Wholly Prohibit Terrorist Advocacy
  Can Terrorist Advocacy Be Restricted More Easily on the Internet?
Application of Material Support Statutes to Advocacy of Terrorism
  Section 2339A: Material Support of Acts of Terrorism
  Section 2339B: Material Support of Designated Terrorist Organizations
    Advocacy Directed to, Coordinated with, or Under the Direction of an FTO
    The Application of Section 2339B to Social Media Services
    Private Civil Lawsuits and Section 230 of the Communications Decency Act
Contacts
  Author Contact Information

Introduction

The development of the Internet has revolutionized communications.1 It has never been easier to speak to wide audiences or to communicate with people who may be located more than half a world away from the speaker.2 However, like any neutral platform, the Internet can be used for many different ends, including illegal, offensive, or dangerous purposes.3 Terrorist groups, such as the Islamic State (IS, also referred to as ISIS or ISIL),4 Al Qaeda,5 Hamas,6 and Al Shabaab,7 use the Internet to disseminate their ideology, recruit new members, and take credit for attacks around the world. In addition, people who are not members of these groups may view such content and could begin to sympathize with or to adhere to the violent philosophies these groups advocate.
They might even act on these beliefs.8 For example, it has been reported that the sermons of Anwar al Awlaki were instrumental in influencing the ideology of certain individuals accused of terrorist activities, including the perpetrators of the San Bernardino shooting and the Boston Marathon bombers.9 Awlaki was a U.S. citizen who was targeted and killed by a drone strike on foreign soil.10 Awlaki left behind numerous digital videos on websites like YouTube of himself preaching his interpretation of the Islamic faith.11 Some of his videos expound upon less controversial topics such as respect for the holy month of Ramadan, the nature of marriage, or the relationship between Islam and Jesus Christ.12 However, other videos depict Awlaki exhorting his followers never to trust a non-Muslim; declaring that Muslims are at war with the United States; and, in a video entitled “Call to Jihad,” asserting that “it is every Muslim’s religious duty to kill Americans.”13 It was messages like these,

1 Protecting and Promoting the Open Internet, 80 Fed. Reg. 19738 (Apr. 13, 2015) (“The open Internet drives the American economy and serves, every day, as a critical tool for America’s citizens to conduct commerce, communicate, educate, entertain, and engage in the world around them.”). 2 Id. 3 See Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157, 1169 (9th Cir. 2008) (noting that Internet content and service providers often provide neutral tools that may be used to carry out illegal acts or to communicate illegal content). 4 Rick Gladstone and Vindu Goel, ISIS Adept on Twitter, Study Finds, N.Y. TIMES (Mar. 5, 2015), http://www.nytimes.com/2015/03/06/world/middleeast/isis-is-skilled-on-twitter-using-thousands-of-accounts-studysays.html?_r=0. 5 Scott Shane, The Lessons of Anwar al-Awlaki, N.Y. TIMES (Aug. 27, 2015), http://www.nytimes.com/2015/08/30/magazine/the-lessons-of-anwar-al-awlaki.html. 6 Gwen Ackerman, Facebook Accused in $1 Billion Suit of Being Hamas Tool, BLOOMBERG (Jul.
11, 2016), http://www.bloomberg.com/news/articles/2016-07-11/facebook-sued-for-1b-for-alleged-hamas-use-of-medium-forterror. 7 Harriet Alexander, Tweeting Terrorism: How al-Shabaab Live Blogged the Nairobi Terrorist Attack, THE TELEGRAPH (Sept. 22, 2013), http://www.telegraph.co.uk/news/worldnews/africaandindianocean/kenya/10326863/Tweetingterrorism-How-al-Shabaab-live-blogged-the-Nairobi-attacks.html. 8 Id. 9 Shane, supra note 5. 10 Id. 11 Id. 12 Id. 13 Id.

particularly the “Call to Jihad,” that reportedly motivated the San Bernardino and Boston Marathon attackers.14 More broadly, the Islamic State organization has been known to use popular Internet services such as Twitter and YouTube to disseminate videos of its fighters executing prisoners, claim credit for organizing terrorist attacks such as the attack that occurred in Paris in November of 2015, and recruit new members to its cause.15 IS personnel also disseminate high-quality electronic publications encouraging supporters to conduct violent attacks in their communities. Members of Hamas and Al Shabaab have reportedly used Facebook and Twitter to disseminate their ideology as well.16 The media arm of Al Shabaab used Twitter to claim credit for the terrorist attack on the Westgate Shopping Mall in Nairobi, Kenya, and to distribute information and pictures of the attack while it remained ongoing.17 Speech advocating violence and terrorism is prohibited by the terms of service of Twitter, Facebook, and other social media outlets.18 Sites like Twitter reportedly have increased their efforts to disable accounts that are associated with terrorist groups or the advocacy of terrorist ideologies. However, these efforts have not been wholly successful.19 When one account is disabled, another might soon appear to replace it.
Many policymakers, including some Members of Congress, have expressed concern about the influence that the speech of terrorist groups, and the speech of others who advocate terrorism, can have on those who view or read it.20 Some policymakers have expressed particular concern regarding the ease with which persons who might otherwise not have been exposed to the ideology or recruitment efforts of terrorist entities may become radicalized.21 These concerns raise the question of whether it would be permissible for the federal government to restrict or prohibit the publication and distribution of speech that advocates the commission of terrorist acts when that speech appears on the Internet.22

14 Id. See also Erik Eckholm, ISIS Influence on Web Prompts Second Thoughts on First Amendment, N.Y. TIMES (Dec. 27, 2015), available at http://www.nytimes.com/2015/12/28/us/isis-influence-on-web-prompts-second-thoughts-onfirst-amendment.html. Awlaki also reportedly helped to create Inspire Magazine. Shane, supra note 5. Inspire Magazine, published by Al Qaeda in the Arabian Peninsula, is freely available on the Internet, and can be found via a simple Internet search on an engine like Google. Bob Drogin, The ‘Vanity Fair’ of Al Qaeda, L.A. TIMES (Nov. 26, 2010), http://articles.latimes.com/2010/nov/26/nation/la-na-terror-magazine-20101126. It is written in English and commonly defends perpetrators of terrorist attacks and encourages readers to commit terrorist acts. Id. 15 Gladstone and Goel, supra note 4. 16 Ackerman, supra note 6; Alexander, supra note 7. 17 Alexander, supra note 7. 18 See, e.g., Twitter Rules, https://support.twitter.com/articles/18311# (last visited Aug. 26, 2016); Facebook Terms of Service, https://www.facebook.com/terms (last visited Aug. 26, 2016); YouTube Community Guidelines, https://www.youtube.com/yt/policyandsafety/communityguidelines.html (last visited Aug. 26, 2016). 19 Mike Isaac, Twitter Steps Up Efforts to Thwart Terrorists’ Tweets, N.Y. TIMES (Feb.
5, 2016), http://www.nytimes.com/2016/02/06/technology/twitter-account-suspensions-terrorism.html. 20 See, e.g., Scott Shane, Internet Firms Urged to Limit Work of Anwar al-Awlaki, N.Y. TIMES (Dec. 18, 2015), http://www.nytimes.com/2015/12/19/us/politics/internet-firms-urged-to-limit-work-of-anwar-al-awlaki.html [hereinafter “Shane II”]; Eric Posner, ISIS Gives Us No Choice But to Consider Limits on Speech, SLATE (Dec. 15, 2015), http://www.slate.com/articles/news_and_politics/view_from_chicago/2015/12/isis_s_online_radicalization_efforts_present_an_unprecedented_danger.html; Press Release, House Committee on Foreign Affairs, Poe, Sherman, Royce, Engel: Shut Down Terrorists on Twitter (Mar. 12, 2015), https://democratsforeignaffairs.house.gov/news/press-releases/poe-sherman-royce-engel-shut-down-terrorists-twitter [hereinafter “Committee Member Press Release”]. 21 See Committee Member Press Release, supra note 20. 22 Posner, supra note 20; but see David Post, Protecting the First Amendment in the Internet Age, Volokh Conspiracy, (continued...)

Significant First Amendment freedom of speech issues are raised by the prospect of government restrictions on the publication and distribution of speech, even speech that advocates terrorism. However, government restrictions on advocacy that is provided to foreign terrorist organizations as material support have been upheld as permissible. This report will discuss relevant precedent that may limit the extent to which advocacy of terrorism may be restricted. The report will also discuss the potential application of the federal ban on the provision of material support to foreign terrorist organizations (FTOs) to the advocacy of terrorism and the dissemination of such advocacy by online service providers like Twitter or Facebook.
Constitutional Principles: Advocacy of Violence or Lawlessness Under the First Amendment

The First Amendment to the Constitution states that “Congress shall make no law ... abridging the freedom of speech....”23 According to the Supreme Court, “the First Amendment [generally] means that government has no power to restrict expression because of its message, its ideas, its subject matter, or its content.”24 However, the freedom of speech is not absolute.25 Some speech, including fighting words, incitements to imminent violence, child pornography, and obscenity, can be restricted by the government without constitutional concern.26 Furthermore, courts do allow the government to place restrictions on protected speech under certain circumstances.27 When a restriction applies to speech based upon its content28 or upon the viewpoint expressed,29 courts generally apply the highest level of scrutiny, known as strict scrutiny.30 In order to satisfy strict scrutiny, a speech restriction must directly advance a compelling government interest and must be the least restrictive means for achieving that interest.31 It is rare that a law will survive this level of scrutiny.32 Generally, when a restriction is not directed at the content of speech or the viewpoint expressed, courts apply a less-exacting

(...continued) WASH. POST (Dec. 21, 2015), https://www.washingtonpost.com/news/volokh-conspiracy/wp/2015/12/21/protectingthe-first-amendment-in-the-internet-age/?utm_term=.cfe4340d93ad. 23 U.S. CONST. amend. I. 24 Ashcroft v. Am. Civ. Liberties Union, 535 U.S. 564, 573 (2002) (internal quotation marks omitted). 25 United States v. Stevens, 559 U.S. 460, 468 (2010). 26 Id. (listing the traditionally recognized categories of speech that may be prohibited). 27 For example, the government may place content-neutral time, place, and manner restrictions on speech as long as such restrictions serve a significant interest and are narrowly tailored to directly advance that interest. Ward v.
Rock Against Racism, 491 U.S. 781 (1989). 28 A law that restricted speech because of the subject matter discussed likely would be considered by a reviewing court to be a content-based restriction. Reed v. Town of Gilbert, ___ U.S. ___, 135 S. Ct. 2218, 2227 (2015) (“Government regulation of speech is content based if a law applies to particular speech because of the topic discussed or the idea or message expressed.”). 29 Laws that single out speech expressing a particular viewpoint on a subject (e.g., negative views of a racial group) for special restriction are “viewpoint discriminatory” and a particularly egregious form of content discrimination. Rosenberger v. Rector and Visitors of Univ. of Va., 515 U.S. 819, 829 (1995) (finding that government discrimination among viewpoints is a “more blatant” and “egregious form of content discrimination”). 30 Reed, 135 S. Ct. at 2227. 31 Sable Commc’ns of Cal., Inc. v. FCC, 492 U.S. 115, 126 (1989). 32 Williams-Yulee v. Fla. Bar, ___ U.S. ___, 135 S. Ct. 1656, 1666 (2015) (noting that it is the “rare case [] in which a speech restriction survives strict scrutiny”).

standard of scrutiny, known as intermediate scrutiny.33 Content-neutral restrictions on protected speech will be upheld if the government can show that the restriction advances a substantial government interest and is narrowly tailored to achieve that interest.34 In this way, the government is permitted to impose reasonable regulations on the time, place, and manner of speech.35

Pure Advocacy of Violence and Law Breaking Under Brandenburg v. Ohio

In Brandenburg v. Ohio, the Supreme Court held that the First Amendment can protect the advocacy of lawbreaking and violence. Brandenburg overturned a conviction under Ohio’s criminal syndicalism statute, invalidating that statute.36 The statute, like other syndicalism laws, prohibited “advocat[ing] ...
the duty, necessity, or propriety of crime, sabotage, violence, or unlawful methods of terrorism.”37 Members of the Ku Klux Klan had been convicted under that statute for statements made at a rally that a Cincinnati television station had covered at the request of one of the Klan members.38 At the rally, among other incendiary comments, one of the members said “We’re not a revengent [sic] organization, but if our President, our Congress, our Supreme Court, continues to suppress the white, Caucasian race, it’s possible that there might have to be some revengeance [sic] taken.”39 Other statements advocating violence were made, as well. The Supreme Court overturned the convictions at issue because “the mere abstract teaching ... of the moral propriety or even moral necessity for a resort to force and violence is not the same as preparing a group for violent action” and cannot be punished by the government in a manner consistent with the First Amendment.40 In order for speech inciting violence or lawlessness to fall outside of the ambit of the Free Speech Clause, the Court held that:

1. the speech must be directed at ...
2. inciting “imminent lawless action,” and
3. the speech must also be likely to produce such action.41

33 See Ward, 491 U.S. at 797; United States v. O’Brien, 391 U.S. 367, 377 (1968) (sustaining regulations that burden speech where the government interest in enacting the regulation is unrelated to the suppression of speech). 34 Ward, 491 U.S. at 797 (“[E]ven in a public forum the government may impose reasonable restrictions on the time, place, or manner of protected speech, provided the restrictions are justified without reference to the content of the regulated speech, that they are narrowly tailored to serve a significant governmental interest, and that they leave open ample alternative channels for communication of the information.”) (internal quotations omitted). 35 Id. 36 395 U.S. 444 (1969) (per curiam). Brandenburg overturned a previous decision, Whitney v.
California, upholding California’s criminal syndicalism statute, which was similar in substance to the Ohio statute struck down by Brandenburg. 274 U.S. 357 (1927). In upholding the constitutionality of California’s statute, the Whitney Court gave great deference to the judgment of the state regarding the dangers presented by the speech at issue. Id. at 371. The state had declared that becoming a member of a group like the Communist Party “involves such danger to the public peace and the security of the state, that these acts should be penalized.” Id. The Court said that it would not declare such judgment to be unconstitutional unless it was “an arbitrary or unreasonable attempt to exercise the authority vested in the State in the public interest.” Id. Whitney was decided over forty years before Brandenburg, and the Brandenburg Court wrote that cases subsequent to Whitney had discredited the Court’s ruling in that case. 395 U.S. at 447. 37 Brandenburg, 395 U.S. at 445. 38 Id. 39 Id. at 446. 40 Id. at 448 (quoting Noto v. United States, 367 U.S. 290 (1961)). 41 Id. 
In other words, for punishment of speech advocating violence to be constitutional, the speaker must both intend to incite a violent or lawless action and that action must be likely to imminently occur as a result.42 In so holding, the Court invalidated Ohio’s criminal syndicalism statute,43 reasoning that the statute failed to draw the distinction between “mere abstract teaching” and “preparing a group for violent action” and, therefore, swept too broadly.44 Consequently, in order for a statute that restricts the advocacy of violence or lawlessness to be constitutional, the statute must apply only to speech meeting the standard announced by the Court.45

Ambiguity in Brandenburg’s Scope

As indicated by the Court in Brandenburg, speech that is intended to incite violent action and is likely to imminently produce such action is not protected by the First Amendment and may freely be proscribed by the government. However, the Court did not elaborate upon what it might mean for speech to be “likely to imminently produce” unlawful action.46 Therefore, it is unclear how imminent the violence advocated must be in order for the speech to be proscribed.47 In Hess v.
Indiana, the Supreme Court provided some guidance regarding “imminence” pursuant to Brandenburg.48 The defendant in Hess had been convicted of disorderly conduct.49 At an antiwar rally, he had been arrested for shouting “[we’ll] take the [expletive] street later.”50 The Court overturned his conviction because his statement “amounted to nothing more than advocacy of illegal action at some indefinite future time” and his statements were not directed to any person or group of persons.51 In the Court’s words, “there was no evidence, or rational inference from the import of the language that his words were intended to produce and likely to produce imminent disorder.”52 Some argue that the Court’s decision in Hess indicated the Court viewed the “imminence” requirement to mean that violence must be likely to occur immediately as a result of the speech at

42 See, e.g., Bible Believers v. Wayne Cty., 805 F.3d 228, 244 (6th Cir. 2015) (“The right to freedom of speech provides that a state cannot proscribe advocacy of the use of force or of law violation except where such advocacy is directed to inciting or producing imminent lawless action and is likely to incite or produce such action.”) (internal quotations omitted); Blum v. Holder, 744 F.3d 790, 802 (1st Cir. 2014) (finding that the Animal Enterprise Terrorism Act did not apply to expressive conduct protected by the First Amendment, including speech meeting Brandenburg’s standard); United States v. Bell, 414 F.3d 474 (3d Cir. 2005) (finding that, while speech meeting Brandenburg’s standard cannot be restricted, the speech at issue was false commercial speech that aided and abetted violations of the tax laws and, therefore, could be enjoined). 43 Brandenburg, 395 U.S. at 450. 44 Id. at 448. 45 Id. at 448 n.2 (citing Dennis v. United States, 341 U.S. 494 (1951), which had upheld the constitutionality of the Smith Act, 18 U.S.C. § 2385, which prohibits advocating the overthrow of the United States government, and Yates v.
United States, 354 U.S. 298 (1957), which overturned convictions under the Smith Act because the jury instructions had permitted punishment for pure advocacy “unrelated to its tendency to produce forcible action”). 46 See Michael Buchhandler-Raphael, Overcriminalizing Speech, 36 CARDOZO L. REV. 1667, 1678-80 (2015); Thomas Healy, Brandenburg in a Time of Terror, 84 NOTRE DAME L. REV. 655, 681 (2009). 47 Healy, supra note 46, at 681 (“For instance, what does imminence mean? Does it mean within a few hours ... or does it indicate a longer time frame?”). 48 414 U.S. 105 (1973) (per curiam). 49 Id. 50 Id. at 106-07. 51 Id. at 108-09. 52 Id. at 109 (emphasis in original).

issue.53 The Hess Court apparently did not consider a threat to “take the street” at some later point to be imminent or likely enough to be punished under Brandenburg.54 However, state and federal courts have not always applied Hess or the imminence requirement of Brandenburg so strictly.55 For example, in People v. Rubin, a California state court ruling, Rubin was charged with solicitation of murder.56 During a press conference to protest a planned march by the Nazi Party through Skokie, Illinois, Rubin offered money to anyone who “kills, maims, or seriously injures a member of the American Nazi Party ... . The fact of the matter is that we’re deadly serious.
This is not said in jest, we are deadly serious.”57 The trial court found that his speech was protected by the First Amendment, but the appeals court reversed.58 After reciting Brandenburg’s standard, the state appeals court held the defendant’s speech was directed to inciting lawless action, and that such action was likely to imminently occur, even though the march in Skokie was not scheduled to take place until five weeks after the defendant had spoken.59 In justifying its holding, the court wrote that “time is a relative dimension and imminence a relative term, and the imminence of an event is related to its nature.... We think solicitation of murder in connection with a public event of this notoriety, even though five weeks away, can qualify as incitement to imminent lawless action.”60 The only other Supreme Court case to apply the standard announced in Brandenburg was NAACP v. Claiborne Hardware.61 In that case, the Supreme Court overturned civil judgments against black defendants who had organized a boycott of white-owned businesses in a Mississippi town in order to protest discrimination and advocate for racial equality.62 According to the lawsuit, one of the defendants, Charles Evers, had advocated the use of violence to enforce the boycott against unwilling participants.63 Plaintiffs argued that Evers’s advocacy of violence should make him liable to the plaintiffs for their losses resulting from the boycott.64 The Court disagreed, stating “[this] Court has made clear ... 
that mere advocacy of the use of force or violence does not remove speech from the protection of the First Amendment.”65 The Court specifically noted that Evers’s speech primarily consisted of a plea for black people to unify wherein strong language was used that might have been construed as advocacy of violence.66 While the Court noted that “a

53 See Buchhandler-Raphael, supra note 46, at 1677 (“In light of Hess, imminent means nothing but immediate action, which is an almost impossible burden to satisfy.”); Marc Rohr, Grand Illusion? The Brandenburg Test and Speech that Encourages or Facilitates Criminal Acts, 38 WILLAMETTE L. REV. 1, 18-19 (2002) (“In Hess, the Court did appear to require that the interval between speech and called-for response must be quite brief.”); KENT GREENAWALT, SPEECH, CRIME, AND THE USES OF LANGUAGE 209 (1989) (finding that Hess’s imminence interpretation was “very restrictive”). 54 This interpretation coincides with the plain meaning of the word “imminent,” which generally means “close at hand in its incidence,” seemingly requiring an immediate risk of the occurrence of violence. Imminent, OXFORD ENGLISH DICTIONARY (2016), http://www.oed.com/view/Entry/91904?redirectedFrom=imminent&. 55 Healy, supra note 46, at 670-72 (noting cases that arguably misapplied Brandenburg). 56 96 Cal. App. 3d 968 (Cal. Ct. App. 1979). See Healy, supra note 46, at 671-72 (citing People v. Rubin as an example that lower courts have not strictly applied Brandenburg’s imminence requirement). 57 Rubin, 96 Cal. App. 3d at 972. 58 Id. 59 Id. at 977. 60 Id. at 978. 61 458 U.S. 886 (1982). 62 Id. at 934. 63 Id. at 926. 64 Id. at 898. 65 Id. at 927-28. 66 Id. at 928.

substantial question would be presented whether Evers could be held liable for the consequences of that unlawful conduct” had immediate violence erupted from his speech, any violence that occurred happened weeks later.
67 The Court therefore concluded that Evers’s speech was protected under the First Amendment. However, the Court has also found that First Amendment protection for speech related to illegal activity has its limits. In United States v. Williams, the Court upheld the constitutionality of a statute that outlawed knowing offers to provide or requests to receive child pornography.68 Citing Brandenburg, the Court noted that “there remains an important distinction between a proposal to engage in illegal activity and the abstract advocacy of illegality.”69 Distinguishing the statute from one that would raise issues under Brandenburg, the Court found that it did not ban the advocacy of the creation of child pornography, but instead prohibited what were essentially attempts to give or receive it.70 Child pornography is not protected by the First Amendment.71 In Williams, the Court held that offers to give or receive it are also categorically excluded from First Amendment protection.72 For that reason, the Court found that the statute “[fell] well within constitutional bounds.”73 Because the Court found that the statute did not raise issues regarding permissible restrictions on the pure advocacy of violence or lawlessness, the precise boundaries of the Brandenburg standard remain unclear.

Holder v. Humanitarian Law Project

Beyond the issue of regulating advocacy of imminent lawlessness or violence, the more recent 2010 case of Holder v. Humanitarian Law Project (“Humanitarian Law Project”) is also relevant to questions regarding the advocacy of terrorism, as the opinion analyzes the permissibility of burdening speech that may more generally benefit terrorist organizations.
In Humanitarian Law Project, the Supreme Court upheld the constitutionality of the federal criminal prohibition on the provision of material support to entities that have been officially designated as foreign terrorist organizations (FTOs) by the United States.74 The criminal prohibition applies to, among other forms of support, the provision of “personnel,” “training,” “service,” and “expert advice and support” to designated entities.75 The plaintiffs in that case sought to provide certain services to FTOs, but feared that the provision of such services was barred by the law.76 While the plaintiffs did not claim that the statute violated the First Amendment in all instances, they alleged that the law was constitutionally invalid if applied to the assistance they contemplated providing the FTOs, which was intended to assist those groups’ legitimate, non-terrorism related activities.77 In particular, the plaintiffs sought pre-enforcement review in order to ensure that they could teach these organizations how to

67 Id. 68 553 U.S. 285 (2008). 69 Id. at 299. 70 Id. 71 New York v. Ferber, 458 U.S. 747, 764 (1982) (finding that child pornography is not protected speech). 72 Williams, 553 U.S. at 299. 73 Id. 74 561 U.S. 1 (2010). 75 Id. at 14; 18 U.S.C. § 2339B. 76 Humanitarian Law Project, 561 U.S. 1, 10-11 (2010). 77 Id.
apply for certain humanitarian aid, engage in political advocacy on behalf of minority groups represented by these organizations, and give legal advice regarding the negotiation of peace agreements.78 The Court agreed with the plaintiffs that the prohibition on the provision of material support extended not only to conduct, but also to speech in a manner that burdened the plaintiffs’ First Amendment rights.79 The majority acknowledged that the restriction was content-based, and accordingly applied “a more demanding standard” of scrutiny to the provision than would otherwise have been employed if it reached only conduct.80 Nonetheless, the Court upheld the application of the statute to the speech in question. The Court’s assessment of the permissibility of the burden imposed on the plaintiffs’ speech rights was informed by its reading of the underlying statute. The Court construed the material support statute as having been “carefully drawn to cover only a narrow category of speech to, under the direction of, or in coordination with foreign groups that the speaker knows to be terrorist organizations.”81 The fact that the statute covered speech coordinated with foreign terrorist groups, and not “independent advocacy” that happened to support those groups, made it easier for the government to demonstrate that the restriction was narrowly tailored to advance the government’s interest.82 Turning to the government’s justification, the Court began its analysis by reiterating that Congress has a compelling interest in combating terrorism and protecting national security.83 In banning the provision of all material support to foreign terrorist organizations, the Court observed that Congress had reasoned that any support provided to these organizations, even support not intended to aid in terrorist endeavors, can free resources to support terrorist activities.84 The Court noted congressional
findings that if American citizens provided material support to entities the U.S. government had identified as terrorist groups, that activity could strain diplomatic relationships between the United States and countries in which the FTOs operate.85 In the Court’s view, official cooperation and interaction with non-governmental entities can lend legitimacy to terrorist groups, which might undermine the government’s interest in combating those organizations.86 When examining the support that the plaintiffs wished to provide to FTOs, the Court observed that “[a] foreign terrorist organization introduced to the structures of the international legal system might use the information to threaten, manipulate, and disrupt.”87 According to the Court, prohibiting the provision of material support to FTOs therefore directly advanced the government’s compelling interest in combating terrorism, and because the prohibition applied only to material support coordinated with FTOs, the statute was narrowly tailored to achieve that interest. 78 Id. 79 Id. at 28. 80 Id. at 27-28. 81 Id. at 26. 82 Humanitarian Law Project, 561 U.S. at 26. 83 Id. at 28 (“Everyone agrees that the Government’s interest in combating terrorism is an urgent objective of the highest order.”) (internal quotations omitted). 84 Id. at 29-30. 85 Id. at 32. 86 Id. at 30. 87 Id. at 37. In reaching its conclusion upholding the statute, the Court emphasized that its holding applied only to the factual situation before the Court, and that it was not deciding the constitutionality of more difficult cases that might arise in other circumstances.88 The Court also stated that it “in no way suggest[ed] ...
that a regulation of independent speech”—i.e., speech that is not coordinated with an FTO—“would pass constitutional muster, even if the Government were to show that such speech benefits foreign terrorist organizations” or that a similar statute banning the provision of material support to a domestic organization would survive review.89 The Court focused upon the significance of the fact that Congress had carefully crafted the statute to avoid burdening constitutional rights90 and emphasized that Congress’s views on matters of national security are entitled to “significant weight.”91

Restricting the Advocacy of Terrorism on the Internet

Some have argued that it should be permissible under the First Amendment to restrict advocacy of terrorism disseminated via the Internet, including perhaps in situations where current case law suggests that significant constitutional questions might be raised.92 Such advocates contend that eliminating or restricting such speech in the digital environment will reduce the risk of “self-radicalization”93 and will restrict the ability of terrorist groups to use social media to spread their propaganda.94

The Constitutionality of a Criminal Law that Would Wholly Prohibit Terrorist Advocacy

Under Brandenburg, it appears that laws criminalizing the dissemination of the pure advocacy of terrorism, without more, would likely be deemed unconstitutional.95 Despite the ambiguities in its application, Brandenburg remains controlling precedent and has been cited by the Supreme 88 Id. at 39 (“All this is not to say that any future applications of the material-support statute to speech or advocacy will survive First Amendment scrutiny.”). 89 Id. (emphasis added). 90 Id. at 35-36. Here the Court stressed again that “Congress has avoided any restriction on independent advocacy, or indeed any activities not directed to, coordinated with, or controlled by foreign terrorist groups.” Id. at 36. 91 Id. at 36.
92 Eckholm, supra note 14; Posner, supra note 20. 93 Self-radicalization describes the process by which an individual or group of individuals comes to adhere to and, eventually, act on extreme philosophies. See Threats to the American Homeland After Killing Bin Laden: An Assessment: Hearing Before the H. Comm. on Homeland Security, 112th Cong. 12 (2011) (statement of Evan Kohlmann, Flashpoint Global Partners) (“We are seeing individuals who are popping up who were not recruited by any individual cleric or any individual mosque. They are being motivated purely by what they see on the web.”). 94 Eckholm, supra note 14 (“Recently, though, a few legal scholars, too, have engaged in what others call First Amendment heresy. What does clear and present danger mean when terrorists are provoking violence over the Internet? Should not the government have a way, they ask, to block messages that facilitate terrorist acts?”); Cass R. Sunstein, Islamic State’s Challenge to Free Speech, BLOOMBERG VIEW (Nov. 28, 2015, 12:38 PM), https://www.bloomberg.com/view/articles/2015-11-23/islamic-state-s-challenge-to-free-speech (arguing that due to the dangers posed by the advocacy of terrorism, the constitutional test for restricting such advocacy may need to be reconsidered). 95 Brandenburg v. Ohio, 395 U.S. 444, 448 (1969) (per curiam) (“A statute which fails to draw [the distinction between abstract teaching of the moral propriety of violence and preparing a group for action] impermissibly intrudes upon the freedoms guaranteed by the First and Fourteenth Amendments.”). Court as such.96 Consequently, speech that does no more than independently advocate the moral propriety or the moral good of terrorist acts is likely protected by the First Amendment.97 According to Brandenburg, statutes that fail to draw the distinction between abstract advocacy of violence and incitements directed at and likely to produce imminent lawless action sweep too broadly to be upheld.98 Therefore, any law that would generally restrict the independent advocacy of terrorist action on the Internet, without narrowing its application to only that advocacy that meets Brandenburg’s definition of incitement, would likely be unconstitutional. The limiting language in the Supreme Court’s majority opinion in Humanitarian Law Project appears to support this argument.99 The Court stressed that the application of the prohibition on material support to FTOs was properly tailored to withstand scrutiny, in part, because the statute did not apply to independent advocacy of terrorism.100 The Court also stressed that it was not addressing whether Congress could burden independent advocacy, even if it could be shown that FTOs would benefit from that advocacy.101 Some have argued that speech that advocates terrorist acts is so inherently dangerous that it should be distinguished from other speech that advocates violence or law breaking.102 These commentators posit that the government should be able to ban the dissemination of terrorist advocacy in the same way that the dissemination of child pornography is restricted.103 The comparison to child pornography is likely inapt, however. While the technology that permits the identification and filtering of child pornography might be adapted to filter the advocacy of terrorism,104 the constitutional justification for allowing the government to police the distribution of child pornography arguably does not justify treating the abstract advocacy of terrorism in a similar manner. 96 See, e.g., United States v. Stevens, 559 U.S. 460, 468 (2010) (citing Brandenburg for the proposition that “incitement” is not protected by the First Amendment). 97 Brandenburg, 395 U.S. at 447. 98 Id. at 448.
99 See Eugene Volokh, Speech That Aids Foreign Terrorist Organizations and Strict Scrutiny, VOLOKH CONSPIRACY (June 21, 2010, 5:43 PM), http://volokh.com/2010/06/21/speech-that-aids-foreign-terrorist-organizations-and-strict-scrutiny/ (“We Americans must have the right to try to persuade our fellow citizens, and our government, that our government is on the wrong side in various foreign policy controversies, that groups that the government says are bad guys are actually good guys (or at least less bad than the really bad guys), or that we should change our policies about which kinds of support to the bad guys are barred and which are allowed. To do that, we need to be able to make arguments defending or even praising those groups, even when such arguments help designated foreign terrorist organizations ... ”) (emphasis in original). 100 Holder v. Humanitarian Law Project, 561 U.S. 1, 36 (2010) (“Finally, and most importantly, Congress has avoided any restriction on independent advocacy, or indeed any activities not directed to, coordinated with, or controlled by foreign terrorist groups.”) (emphasis in original). 101 Id. at 39. See also Volokh, supra note 99 (“To be sure, the majority doesn’t hold that a ban on independent advocacy would be unconstitutional even though such a ban might be necessary to serve a compelling government interest; it expressly reserves that question. But I think that the majority’s repeated stress that the law doesn’t restrict independent advocacy suggests that the Court would indeed strike down such a ban that applied to independent advocacy.”). 102 See, e.g., Eckholm, supra note 14; Sunstein, supra note 94. 103 See Posner, supra note 20 (“The idea would be to get out the word that looking at ISIS-related websites, like looking at websites that display child pornography, is strictly forbidden.”); Shane II, supra note 20 (quoting Mark D.
Wallace, a former diplomat, arguing that terrorist advocacy should be removed from the Internet similarly to child pornography). 104 See, e.g., Ellen Nakashima, There’s a new tool to take down terrorism images online. But social-media companies are wary of it, WASH. POST (June 21, 2016), https://www.washingtonpost.com/world/national-security/new-tool-to-take-down-terrorism-images-online-spurs-debate-on-what-constitutes-extremist-content/2016/06/20/0ca4f73a-3492-11e6-8758-d58e76e11b12_story.html. Child pornography, the depiction of a minor engaged in sexual conduct that is not necessarily obscene, is unprotected by the First Amendment.105 The Supreme Court held that the possession and distribution of child pornography could be completely prohibited because the government has an overriding interest in destroying the market for speech that requires the injury and exploitation of children in order to create it.106 However, the Court has held that this reasoning does not extend to pornography that merely appears to, but does not in actuality, depict a child engaging in sexual conduct (“virtual child pornography”).107 In reaching this holding, the Court explained that restrictions that apply to pornography depicting actual children were upheld because the laws targeted the production of the work and the harm that it caused children, not its content.108 In the case of virtual depictions of child pornography, no children are harmed in the creation of the content.109 The government had attempted to justify similar restrictions on virtual child pornography by arguing that such material, like actual child pornography, increased the risk that consumers of that content would victimize children in the future.110 Addressing that concern, the Supreme Court reiterated that the government “may not prohibit speech because it increases the chance an unlawful act will be committed ‘at some indefinite future
time.’”111 Applying this reasoning to the advocacy of terrorist activity, it does not appear that the creation of speech that advocates terrorism always inherently harms someone in the course of its production in a way that would be similar to the creation of actual child pornography. To be sure, some terrorist propaganda depicts terrorist attacks or executions, but terrorist advocacy does not necessarily require someone to be harmed in order for the speech to occur.112 Instead, the advocacy of terrorism, like virtual child pornography, arguably creates or increases a risk that a crime will be committed “at some indefinite future time.”113 And the Supreme Court has held that the government may not prohibit speech solely on the basis of that indefinite risk.114

Can Terrorist Advocacy Be Restricted More Easily on the Internet?

Speech advocating terrorism that is distributed via the Internet may pose a significant danger to the public due to the ease of propagation to people who might be willing to act on those messages.115 For that reason, some lawmakers and scholars argue that courts should permit 105 Ferber, 458 U.S. at 764. The definition of “sexually explicit conduct” in the federal child pornography statute includes “lascivious exhibition of the genitals or pubic area of any person [under 18]” and “is not limited to nude exhibitions or exhibitions in which the outlines of those areas [are] discernible through clothing.” 18 U.S.C. §§ 2256(2)(A)(v), 2252 note. 106 Osborne v. Ohio, 495 U.S. 103 (1990). 107 Ashcroft v. Free Speech Coal., 535 U.S. 234, 249 (2002). Virtual child pornography includes depictions that appear to be children, but are not. Id. at 241. They may be images of adults that are digitally altered to make the subjects appear younger, or simply images of very young-looking adults. Id. 108 Id. 109 Id. 110 Id. at 253. 111 Id. 112 See, e.g., Terrence McCoy, ISIS, beheadings and the success of horrifying violence, WASH.
POST (June 13, 2014), https://www.washingtonpost.com/news/morning-mix/wp/2014/06/13/isis-beheadings-and-the-success-of-horrifying-violence/. 113 Ashcroft, 535 U.S. at 253. 114 Id. 115 Eckholm, supra note 14; Posner, supra note 20. terrorist advocacy to be more easily restricted when the Internet is used to disseminate it.116 The Supreme Court has recognized that “each medium of expression presents special First Amendment problems.”117 For example, the Court has permitted broadcast speech to be more easily regulated than speech via other mediums because, among other factors, broadcast speech is uniquely accessible to children.118 The Supreme Court has yet to consider the specific question of whether the ease of dissemination of information over the Internet warrants treating advocacy of violence via that medium differently than the same speech communicated through other means. However, the Court has had the opportunity to examine what standard of review should be applied to restrictions on Internet speech more generally. In Reno v. American Civil Liberties Union, the Supreme Court struck down restrictions on the communication of indecent speech to minors.119 In doing so, the Court held that content-based restrictions on Internet speech should be subject to strict scrutiny.120 The Court examined whether the medium of the Internet, like the broadcast medium, justified greater latitude for the government to restrict speech on that platform and concluded that it did not.121 Comparing the reasons for permitting greater restrictions on broadcast speech to the medium of the Internet, the Court explained that broadcast speech is uniquely accessible to children.
Specifically, the Court noted that there is an appreciable risk that if indecent speech were broadcast at times when children would be likely to be in the audience, children might accidentally be exposed to that speech.122 In contrast, the risk of accidentally encountering indecent speech on the Internet was, in the Court’s assessment, far lower and did not justify departing from the general rules regarding content-based restrictions on speech.123 The Court therefore accorded the highest degree of protection to speech on the Internet because “the interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship.”124 Assuming that the Court would apply the same reasoning to a content-based restriction on the advocacy of terrorism, the fact that speech is distributed via the Internet would not seem to permit the government to more easily regulate its dissemination under current case law. 116 Id. 117 FCC v. Pacifica Found., 438 U.S. 726, 748 (1978). 118 Id. at 749. In the Supreme Court’s most recent opinion addressing the broadcast indecency rules, the Court did not reach the question of their constitutionality. FCC v. Fox Television Stations, __ U.S. __, 132 S. Ct. 2307 (2012). However, Justice Ginsburg filed a concurring opinion arguing that Pacifica was wrongly decided and inviting its reconsideration. 132 S. Ct. at 2321 (Ginsburg, J., concurring). 119 Reno, 521 U.S. at 885. Indecent speech is speech that describes or depicts sexual or excretory organs or activities in a way that is patently offensive to the audience. Pacifica, 438 U.S. at 732. 120 Reno, 521 U.S. at 885. 121 Id. at 868-69. 122 Id. 123 Id. (“[T]he Internet is not as ‘invasive’ as radio or television. The District Court specifically found that ‘communications over the Internet do not “invade” an individual’s home or appear on one’s computer screen unbidden. Users seldom encounter content “by accident.”’”) (internal citations omitted). 124 Id. at 882.
Application of Material Support Statutes to Advocacy of Terrorism

Under current law, the two federal statutes criminalizing material support for terrorism or foreign terrorist organizations, 18 U.S.C. §§ 2339A and 2339B, appear most relevant to online advocacy of terrorism. Neither statute squarely prohibits the advocacy of terrorism. But both statutes potentially cover non-tangible forms of support for terrorist activities or foreign terrorist groups, including through the recruitment of personnel or the provision of training, expert advice or assistance, funding, or financial services.125 The material support statutes have been the most frequently prosecuted anti-terrorism offenses.126 Section 2339A of Title 18 of the United States Code prohibits the provision of material support with the knowledge or intent that the support be used to carry out a terrorist attack.127 Section 2339B prohibits the provision of material support to designated foreign terrorist organizations.128 As previously discussed, the Supreme Court in Humanitarian Law Project held that material support provided in the form of certain kinds of speech under the direction of or in coordination with foreign terrorist organizations in violation of Section 2339B may be punished consistent with the First Amendment, even if such support is for purposes other than advancing the group’s terrorist activities.129

Section 2339A: Material Support of Acts of Terrorism

Section 2339A of Title 18 of the U.S. Code prohibits:
1. (a) attempting to, (b) conspiring to, or (c) actually
2. (a) providing material support or resources, or (b) concealing or disguising (i) the nature, (ii) location, (iii) source, or (iv) ownership of material support or resources
3. knowing or intending that they be used (a) in preparation for, (b) in carrying out, (c) in preparation for concealment of an escape from, or
125 18 U.S.C. § 2339A(b); 18 U.S.C. § 2339B(g).
126 CRS Report R41333, Terrorist Material Support: An Overview of 18 U.S.C. 2339A and 2339B, by Charles Doyle. 127 18 U.S.C. § 2339A. 128 18 U.S.C. § 2339B. 129 Holder v. Humanitarian Law Project, 561 U.S. 1, 36 (2010). (d) in carrying out the concealment of an escape from
4. an offense identified as a federal crime of terrorism, as enumerated by the statute.130
Material support or resources covers “any property, tangible or intangible, or service,”131 but excludes medicine and religious materials.132 For the purposes of the discussion of whether the material support statutes can apply to the advocacy of terrorism, it is sufficient to note that the definition of material support explicitly includes “training” and “expert advice or assistance.”133 The statute defines training as “instruction or teaching designed to impart a specific skill, as opposed to general knowledge,”134 whereas expert advice or assistance is defined to mean “advice or assistance derived from scientific, technical, or other specialized knowledge.”135 Unlike the material support statute at issue in Humanitarian Law Project, which potentially applied to activities in furtherance of a foreign entity’s efforts unrelated to terrorism, a requisite for the application of Section 2339A is that the support was given with the knowledge or intention that it would be used to facilitate a terrorist crime.136 It seems that the provision of speech in the form of training or expert advice or assistance with the intent that it be used to support a specific act of terrorism can be constitutionally punished.
Even when a violation of the law may take the form of speech, the Supreme Court has held that the Constitution is not necessarily a barrier to punishment.137 For example, an agreement to violate the law may be punished as a criminal conspiracy, and an agreement to fix prices in a market might constitute a violation of the antitrust laws.138 Constitutional challenges to the scope of Section 2339A, including those based on arguments that it is incompatible with the First Amendment, thus far have proven unsuccessful.139 130 18 U.S.C. § 2339A. 131 18 U.S.C. § 2339A(b)(2)-(b)(3). 132 Id. 133 Id. 134 Id. 135 Id. 136 Id. Convictions for a criminal violation of either Section 2339A or 2339B are punishable by imprisonment for not more than 15 years and/or a fine of not more than $250,000 (not more than $500,000 for an organizational defendant). Id.; 18 U.S.C. § 2339B. See also CRS Report R41334, Terrorist Material Support: A Sketch of 18 U.S.C. 2339A and 2339B, by Charles Doyle. 137 Giboney v. Empire Storage & Ice Co., 336 U.S. 490, 498-502 (1949) (“[It] has never been deemed an abridgement of freedom of speech or press to make a course of conduct illegal merely because the conduct was in part initiated, evidenced, or carried out by means of language, either spoken, written, or printed.”). 138 Id. (“Such an expansive interpretation of the constitutional guaranties of speech and press would make it practically impossible ever to enforce laws against agreements in restraint of trade as well as many other agreements and conspiracies deemed injurious to society.”). 139 See, e.g., United States v. Amawi, 545 F. Supp. 2d 681 (N.D. Ohio 2008) (upholding Section 2339A against a challenge that the statute was constitutionally overbroad and could infringe lawful expression); United States v. Sattar, 314 F. Supp. 2d 279, 305 (S.D.N.Y.
2004) (“There can be no doubt, in any event, that § 2339A is a legitimate exercise of Congress’s power to enact criminal laws that reflect legitimate state interests in maintaining comprehensive controls over harmful, constitutionally unprotected conduct.”) (internal quotations omitted). See generally CRS Report R41333, Terrorist Material Support: An Overview of 18 U.S.C. 2339A and 2339B, by Charles Doyle.

Section 2339B: Material Support of Designated Terrorist Organizations

Section 2339B of Title 18 of the United States Code predicates liability on material support provided to certain designated terrorist organizations, rather than on the commission of a specific crime of terrorism.140 As acknowledged by the Supreme Court in Humanitarian Law Project, this difference may create closer constitutional questions when the statute is applied to speech.141 Section 2339B outlaws:
1. (a) attempting to provide, (b) conspiring to provide, or (c) actually providing
2. material support or resources
3. to a foreign terrorist organization
4. knowing that the organization (a) has been designated a foreign terrorist organization, or (b) engages, or has engaged, in “terrorism” or “terrorist activity.”142
Section 2339B’s definition of “material support” is the same definition that applies to Section 2339A and covers “any property, tangible or intangible, or service,” explicitly including “training” and “expert advice or assistance.”143 The Humanitarian Law Project Court clarified that advocacy may constitute material support only when it is “concerted activity,” i.e., activity in connection with or under the direction of an FTO.144 To violate Section 2339B, one need not intend to aid in a terrorist attack or to further an organization’s terrorist activities.145 One need only have “knowledge that the organization is a designated terrorist organization ...
that the organization has engaged in terrorist activity ... or that the organization has engaged or engages in terrorism.”146

Advocacy Directed to, Coordinated with, or Under the Direction of an FTO

Assuming that a particular type of terrorist advocacy is within the scope of the “material support” proscribed by Section 2339B (e.g., it involves training or the giving of expert advice), the primary question that remains is whether the speech at issue would constitute advocacy “directed to, coordinated with, or controlled by” an FTO.147 In examining the clarity of the statute in Humanitarian Law Project and holding that the statute was not impermissibly vague, the Court pointed out that, by its terms, 140 18 U.S.C. § 2339B. 141 561 U.S. at 39. 142 18 U.S.C. § 2339B. In addition to criminal prosecutions, Section 2339B(c) permits the Attorney General or the Secretary of the Treasury to bring a civil suit to enjoin violations of this section. 143 18 U.S.C. § 2339B(g)(4). 144 Humanitarian Law Project, 561 U.S. at 24. 145 18 U.S.C. § 2339B. 146 Id.; Humanitarian Law Project, 561 U.S. at 17-18. 147 Humanitarian Law Project, 561 U.S. at 6. [the] statute prohibits providing a service “to a foreign terrorist organization.” The use of the word “to” indicates a connection between the service and the foreign group.
We think a person of ordinary intelligence would understand that independently advocating for a cause is different from providing a service to a group that is advocating for that cause.148 However, the Court did acknowledge that it was leaving open “questions of exactly how much direction or coordination is necessary for an activity to constitute a ‘service.’”149 The majority decided that it would be more appropriate to address those questions when particular factual situations arose.150 A 2013 case decided by the First Circuit Court of Appeals may indicate that some courts could be inclined to interpret broadly what it means to direct speech to, coordinate with, or act under the direction of an FTO.151 In that case, Tarek Mehanna was convicted on several terrorism-related charges, including providing material support to Al Qaeda, an organization Mehanna knew to be a designated FTO.152 Mehanna had traveled to Yemen in an attempt to join Al Qaeda, but had been unable to locate the training camp he sought.153 Mehanna had also translated publicly available Arabic-language documents, some of which were Al Qaeda-generated propaganda, into English and had posted his translations on a website that was sympathetic to Al Qaeda.154 Mehanna argued that his translations were independent advocacy protected by the First Amendment, and that the jury had not been properly instructed regarding what it means to coordinate with an FTO under Section 2339B.155 The First Circuit upheld his convictions. In doing so, the court examined the trial judge’s instructions to the jury regarding the First Amendment and the defendant’s translations. The First Circuit noted that the trial judge had defined “coordination” by explaining that “[i]ndividuals who act entirely independently of the [FTO] to advance its goals or objectives shall not be considered to be working under the FTO’s direction.”156 The First Circuit could find no legal error in these jury instructions. 148 Id. at 24 (internal citations omitted). 149 Id. 150 Id. 151 United States v. Mehanna, 735 F.3d 32 (1st Cir. 2013), cert. denied, 135 S. Ct. 49 (2014). 152 Id. at 41. 153 Id. 154 Id. 155 Id. at 47. 156 Id. at 48-49. 157 Id. at 50. 158 Id. at 51. 159 Id.
In response to the contention that the jury should have been instructed that a direct connection between an FTO and a defendant must be proven in order for the defendant to have acted in coordination with the organization, the appeals court stated that neither the statute nor the Supreme Court’s decision in Humanitarian Law Project required “a direct link” between a defendant and an FTO for a violation to occur.157 Nonetheless, the appeals court did not explicitly hold that Mehanna’s translation activities were sufficient to sustain his conviction.158 At the same time, the court held that, even if the defendant’s translation activities did not constitute an attempt to provide material support, the evidence surrounding his trip to Yemen in an attempt to join Al Qaeda supported his convictions.159
The Mehanna case does not provide definitive answers as to when speech activity that may benefit an FTO is sufficiently directed to, coordinated with, or under the direction of the FTO to be considered material support.160 However, it does suggest that certain forms of speech-related activity might be considered material support even if the defendant never actually has direct contact with the FTO.161 If speech can be material support when the defendant has never succeeded in making direct contact with the FTO, it seems unclear where the line might be drawn between the independent advocacy of terrorism that the Supreme Court in Humanitarian Law Project suggested could not be restricted and the advocacy of terrorism that constitutes a violation of Section 2339B.162

The Application of Section 2339B to Social Media Services

Some observers have suggested that when members of FTOs obtain accounts for social media sites like Twitter and Facebook, those sites are providing a service to FTOs that constitutes material support in violation of Section 2339B.163 Some FTOs have reportedly acquired social media accounts from social media companies.164 One recent study estimated that over 30,000 accounts on Twitter were controlled by the Islamic State organization alone as of 2014.165 Others have noted the apparent presence of other terrorist groups on Twitter and other social media outlets.166 The outstanding questions appear to be whether providing a social media 160 See Buchhandler-Raphael, supra note 46, at 1683-84 (“Cognizant of the difficulties that the government’s ‘translation as service’ theory posed for First Amendment jurisprudence, the court attempted to play down the significance of the translation-centric charge by characterizing it only as an ‘alternative basis’ for conviction.”); Emily Goldberg Knox, Note: The Slippery Slope of Material Support Prosecutions: Social Media
Support to Terrorists, 66 HASTINGS L.J. 295, 314-15 (“In affirming Mehanna’s conviction, the First Circuit held that the trial court adequately instructed the jury. However, the court skirted the need to decide anything about coordination by affirming Mehanna’s conviction on the grounds that his trip to Yemen was sufficient to support a guilty verdict, even if his Internet activities were constitutionally protected.”). 161 See Buchhandler-Raphael, supra note 46, at 1683-84 (arguing that Mehanna “demonstrates how the prohibition against providing material support to FTOs criminalizes a lot of speech, despite the arguably weak causal link between the speech in question and the risk of future harm”). 162 See Noah Feldman, Courts Blur Line Between Violent Speech and Crime, BLOOMBERG VIEW (Jul. 12, 2016), https://www.bloomberg.com/view/articles/2016-07-12/courts-blur-line-between-violent-speech-and-crime (“Create a pro-Islamic State music video and post it on a known IS website and you could find yourself convicted of a crime, material support for terrorism. ... The First Circuit should’ve analyzed whether Mehanna’s translation activities counted as coordination. But it didn’t ... ”). 163 See Fields v. Twitter, No. 16-cv-00213-WHO, 2016 U.S. Dist. LEXIS 105768 (N.D. Cal. Aug. 10, 2016); Compl. at 2-3, Force, et al. v. Facebook, No. 16 Civ. 05490 (S.D.N.Y. Jul. 10, 2016); see also Jacob Bogage, Family of ISIS Paris Attack Victim Sues Google, Facebook, and Twitter, WASH.
POST (June 16, 2016), https://www.washingtonpost.com/ news/the-switch/wp/2016/06/16/family-of-isis-paris-attack-victim-sues-google-facebook-and-twitter/; Knox, supra note 160, at 311 (“The plain language, legislative history, and legislative intent provide strong support for a broad understanding of the term ‘service’” and it likely includes social media accounts); Zoe Bedell and Benjamin Wittes, Tweeting Terrorists, Part II: Does it Violate the Law for Twitter to Let Terrorist Groups Have Accounts?, LAWFARE (Feb. 14, 2016, 6:35 PM), https://www.lawfareblog.com/tweeting-terrorists-part-ii-does-it-violate-law-twitter-letterrorist-groups-have-accounts [hereinafter “Bedell and Wittes Part II”]. 164 See Knox, supra note 160, at 308-10; Zoe Bedell and Benjamin Wittes, Tweeting Terrorists, Part I: Don’t Look Now But a Lot of Terrorist Groups are Using Twitter, LAWFARE (Feb. 14, 2016, 5:05 PM), https://www.lawfareblog.com/tweeting-terrorists-part-i-dont-look-now-lot-terrorist-groups-are-using-twitter (taking note of the number of terrorist groups that appear to have affiliated and official Twitter accounts). 165 J.M. Berger, The ISIS Twitter Census: Making Sense of ISIS’s Use of Twitter, BROOKINGS INST. (Mar. 6, 2015), https://www.brookings.edu/blog/order-from-chaos/2015/03/06/the-isis-twitter-census-making-sense-of-isiss-use-oftwitter/. 166 See Complaint at 2-3, Force, et al. v. Facebook, No. 16 Civ. 05490 (S.D.N.Y. filed Jul. 10, 2016); Jacob Bogage, Family of ISIS Paris Attack Victim Sues Google, Facebook, and Twitter, WASH. POST (June 16, 2016), (continued...)
account could constitute material support and whether a social media company has knowledge sufficient to support a conviction.167 However, it does not appear that the Department of Justice (DOJ) has ever brought a criminal or civil case against any social media outlet alleging such a violation.168 As a result, there is no case law clarifying whether the statute can properly be applied to social media companies whose generally available services are used by FTOs to communicate. If a court were to evaluate whether Section 2339B can be applied to social media companies, one of the most difficult questions presented would be whether a social media site could be said to be acting in coordination with or under the direction of an FTO. As noted above, Humanitarian Law Project did not provide guidance as to the level of coordination necessary to constitute the provision of a service to an FTO under the statute.169 Mehanna suggests that defendants need not have direct contact with the FTO in order for a violation to occur.170 Websites generally do not engage in background checks or any other form of verification prior to permitting an account to be created.171 Given the number of people who use their services, such verification may be impossible.172 Given the difficult burden that would be imposed on social media companies in performing background checks on every user, a court may simply conclude that providing an account to a user who happens to be affiliated with an FTO is insufficient in and of itself to rise to the level of “coordination” necessary to violate Section 2339B.173 Nonetheless, a number of social media sites have policies to remove terrorist content or the accounts of terrorist groups.174 Some have argued that because social media sites often fail to suspend accounts that are associated with FTOs, the government could argue that this failure is evidence of coordination with
FTOs.175 Without case law interpreting this question, it remains unclear whether the fact that (...continued) https://www.washingtonpost.com/news/the-switch/wp/2016/06/16/family-of-isis-paris-attack-victim-sues-googlefacebook-and-twitter/. 167 See Knox, supra note 160, at 310-21. 168 See Quintan Wiktorowicz, Working to Counter Online Radicalization to Violence in the United States, WHITE HOUSE BLOG (Feb. 5, 2013, 10:02 AM), https://www.whitehouse.gov/blog/2013/02/05/working-counter-onlineradicalization-violence-united-states (noting that the Obama administration investigates and prosecutes those “who use the Internet to recruit others to plan and carry out acts of violence” and works to inform citizens regarding the threats of terrorist activity online, but does not attempt to remove content from the Internet in its approach). 169 Humanitarian Law Project, 561 U.S. at 24. 170 Mehanna, 735 F.3d at 50. 171 See, e.g., Chi. Lawyers’ Comm. for Civ. Rights Under Law, Inc. v. Craigslist, 519 F.3d 666, 668-69 (7th Cir. 2008) (noting that “[every] month more than 30 million notices are posted” to Craigslist); Stats, FACEBOOK (last visited Aug. 27, 2016), http://newsroom.fb.com/company-info/ (“1.13 billion daily active users on average for June 2016”). 172 See Chi. Lawyers’ Comm., 519 F.3d at 668 (“An online service could hire a staff to vet the postings, but that would be expensive and may well be futile: if postings had to be reviewed before being put online, long delay could make the service much less useful ... ”). 173 See also Gabe Rottman, Hamas, Twitter and the First Amendment, AM. CIV. LIBERTIES UNION (Nov.
21, 2012, 3:25 PM), https://www.aclu.org/blog/hamas-twitter-and-first-amendment (“For de facto common carriers like Twitter, open to all, the provision of service is not ‘coordinated’ in the way the Court seems to argue is necessary, and it certainly does not suggest ‘support’ in the sense of concerted activity in furtherance of the goals of the designated terrorist organization.”). 174 See Katie Benner, Twitter Suspends 235,000 More Accounts Over Extremism, N.Y. TIMES (Aug. 18, 2016), http://www.nytimes.com/2016/08/19/technology/twitter-suspends-accounts-extremism.html?_r=0; Natalie Andrews and Deepa Seetharama, Facebook Steps Up Efforts Against Terrorism, WALL ST. J. (Feb. 11, 2016), http://www.wsj.com/articles/facebook-steps-up-efforts-against-terrorism-1455237595. 175 See Knox, supra note 160, at 316 (“The failure to suspend FTO accounts could be seen as a conscious decision that provides evidence of coordination, particularly as social media companies have the ability to monitor and proactively take down these accounts.”); Bedell and Wittes Part II, supra note 163 (noting that the Humanitarian Law Project (continued...) terrorist groups and their representatives often are able to obtain and use social media accounts represents a sufficient connection between social media services and the terrorist groups to support a finding of the provision of material support.
In addition, in order to hold a social media company criminally or civilly liable for the provision of material support to an FTO under Section 2339B, the government must prove that the defendant knew that the organization to which the support was provided was a designated FTO or that the organization was engaged in terrorism.176 Social media companies could be expected to argue that they do not have sufficient knowledge of a new user’s affiliation with an FTO upon the activation of an account.177 Over one billion people use Facebook,178 and millions also use Twitter.179 The companies do not, and might argue that they cannot, attempt to discover with any degree of certainty which users of their services will be terrorist affiliates prior to the activation of an account.180 Nonetheless, social media sites appear to be generally aware that their services are used by some percentage of their subscribers to disseminate terrorist advocacy and that they may be used by FTOs themselves. Twitter, Facebook, and other social media sites can and do disable accounts that violate their terms of service.181 Certain violent speech, including speech that advocates or glorifies terrorism, violates the terms of service of these sites.182 It has been reported that Twitter has recently increased the number of accounts that it has disabled for promoting terrorism.183 Facebook has also reportedly removed posts that advocate terrorism and has disabled accounts for the same reason.184 The fact that social media companies take steps to remove that content when it appears on their services may be evidence that they know that their services are being used by (...continued) Court left unclear what coordination might mean, but arguing that “it’s worth noting that even if the requisite relationship includes ‘formal elements,’ Twitter’s relationship with its users—including Hezbollah, Hamas, and the PKK—is defined by precisely such elements, including a legal contract.”). 176 18 U.S.C. § 2339B.
Humanitarian Law Project did not address the question of what level of awareness would be sufficient to sustain a conviction under Section 2339B because the plaintiffs in that case did not dispute their knowledge that the groups at issue were FTOs. Humanitarian Law Project, 561 U.S. at 9-10. 177 See Knox, supra note 160, at 319 (“Social media companies will reject any assertion that they knew they were providing a service to designated FTOs.”); Bedell and Wittes Part II, supra note 163 (“One could imagine Twitter’s arguing that the statute is vague as applied to it, because there is no way it reasonably can know which of the millions of users who sign up for its services are affiliated with designated foreign terrorist organizations.”). 178 Stats, FACEBOOK (last visited Aug. 27, 2016), http://newsroom.fb.com/company-info/ (“1.13 billion daily active users on average for June 2016”). 179 @Twitter, Twitter (Dec. 18, 2012, 10:01 AM), https://twitter.com/twitter/status/281051652235087872 (stating that Twitter now has over 200 million users). 180 See Knox, supra note 160, at 319 (“Social media companies will reject any assertion that they knew they were providing a service to designated FTOs.”); Bedell and Wittes Part II, supra note 163; Gabe Rottman, Hamas, Twitter and the First Amendment, AM. CIV. LIBERTIES UNION (Nov. 21, 2012, 3:25 PM), https://www.aclu.org/blog/hamastwitter-and-first-amendment. 181 See, e.g., Benner, supra note 174; Ryan Lucas, New Software Aims to Curb Extremism Online, CQ.COM (June 17, 2016, 3:04 PM), http://www.cq.com/doc/news-4912514?0. 182 See, e.g., Twitter Rules, https://support.twitter.com/articles/18311# (last visited Aug. 26, 2016); Facebook Terms of Service, https://www.facebook.com/terms (last visited Aug.
26, 2016); YouTube Community Guidelines, https://www.youtube.com/yt/policyandsafety/communityguidelines.html (last visited Aug. 26, 2016). 183 Benner, supra note 174. 184 Andrews and Seetharama, supra note 174. FTOs.185 On the other hand, a social media website might not be sure if an account actually belongs to an FTO or is operated by another user identifying itself as the FTO.186 It is unclear whether the general knowledge of social media sites that terrorist organizations are using their services, despite lacking actual knowledge that a particular account is used by a specific FTO, is sufficient to support a conviction for the provision of material support under Section 2339B.187 Private Civil Lawsuits and Section 230 of the Communications Decency Act Section 2333 of Title 18 of the United States Code permits U.S. citizens injured in their persons, property, or business by acts of international terrorism to recover treble damages.188 Courts have interpreted violations of Sections 2339A and 2339B to be acts of international terrorism for the purposes of Section 2333.189 Recently, a number of lawsuits have been filed by private plaintiffs against social media websites alleging that they are providing material support to terrorist groups in violation of 18 U.S.C.
§ 2339B.190 One such lawsuit was brought against Twitter and alleged that Twitter’s dissemination of propaganda by the Islamic State organization led to a terrorist attack that caused the death of three government contractors at a training facility in Jordan.191 In another lawsuit, the father of one of the Americans killed in the orchestrated terrorist attack in Paris in 2015 similarly accused Twitter, Google, and Facebook of providing material support to the Islamic State in violation of Section 2339B, arguing that without the services provided by these sites, “the explosive growth of ISIS over the last few years into the most-feared terrorist group in the world would not have been possible.”192 Beyond the difficulty of determining whether social media companies can be held civilly liable for providing material support under the terms of Section 2339B, such private civil lawsuits may face an additional hurdle in attempting to hold social media services accountable for allowing FTOs to use them.193 Section 230(c) of the Communications Decency Act (CDA) prohibits 185 See Knox, supra note 160, at 320 (“Furthermore, after the brutal execution of reporter James Foley, Twitter and YouTube began aggressively cracking down on accounts affiliated with ISIL. This swift and decisive action suggests that these companies have the ability to identify and shut down accounts affiliated with FTOs despite the large number of users.”). 186 See Zoe Bedell and Benjamin Wittes, Tweeting Terrorists, Part III: How Would Twitter Defend Itself Against a Material Support Prosecution?, LAWFARE (Feb. 14, 2016, 7:16 PM), https://www.lawfareblog.com/tweeting-terroristspart-iii-how-would-twitter-defend-itself-against-material-support-prosecution (“Twitter might plausibly argue that it cannot know if the Hamas account really is run by Hamas or is, in fact, precisely the sort of independent advocacy the statute exempts.”) [hereinafter “Bedell and Wittes Part III”]. 187 Id. 188 18 U.S.C. § 2333.
189 See Boim v. Quranic Literacy Inst., 291 F.3d 1000, 1015 (7th Cir. 2002); Goldberg v. UBS, 609 F. Supp. 2d 92, 114 (E.D.N.Y. 2010). See also CRS Report R41333, Terrorist Material Support: An Overview of 18 U.S.C. 2339A and 2339B, by Charles Doyle. 190 Fields v. Twitter, No. 16-cv-00213-WHO, 2016 U.S. Dist. LEXIS 105768, at *3-*6 (N.D. Cal. Aug. 10, 2016); Complaint at 2-3, Force, et al. v. Facebook, No. 16 Civ. 05490 (S.D.N.Y. filed Jul. 10, 2016); Jacob Bogage, Family of ISIS Paris Attack Victim Sues Google, Facebook, and Twitter, WASH. POST (June 16, 2016), https://www.washingtonpost.com/news/the-switch/wp/2016/06/16/family-of-isis-paris-attack-victim-sues-googlefacebook-and-twitter/. 191 Fields, 2016 U.S. Dist. LEXIS 105768, at *3-*6. 192 Jacob Bogage, Family of ISIS Paris Attack Victim Sues Google, Facebook, and Twitter, WASH. POST (June 16, 2016), https://www.washingtonpost.com/news/the-switch/wp/2016/06/16/family-of-isis-paris-attack-victim-sues-googlefacebook-and-twitter/. 193 Fields, 2016 U.S. Dist. LEXIS 105768, at *28. See also Cyrus Farivar, It’ll be very hard for terrorism victim’s family to win lawsuit against Twitter, ARSTECHNICA (June 17, 2016, 5:00 AM), http://arstechnica.com/tech-policy/2016/ (continued...) holding interactive computer service providers liable for content provided by third parties.194 While the liability shield does not apply to violations of the federal criminal law,195 the U.S. Court of Appeals for the First Circuit recently held that, even when the civil lawsuit is based upon a violation of federal criminal law, Section 230’s shield may bar recovery if the lawsuit seeks to treat a service provider as a publisher.196 Consequently, even if social media service providers do violate Section 2339B when providing accounts to FTOs, it is possible that Section 230 of the CDA would shield them from civil liability for that violation.
Section 230 of the Communications Decency Act Section 230(c)(1) of the CDA states, in pertinent part, that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”197 This language has been interpreted by courts to provide immunity from civil liability to providers and users of interactive computer services for content provided by third parties.198 In creating this liability shield, Congress sought to preserve the robust and vibrant communication that occurs on the Internet and to “keep government interference to a minimum.”199 However, Section 230 is not an absolute bar to liability.200 A three-part test has been developed to determine whether a defendant is eligible for Section 230’s protection. If the lawsuit is: 1. brought against an interactive computer service provider or user (e.g., a website like NYTimes.com, or a social media service, like Twitter or Facebook),201 (...continued) 06/itll-be-very-hard-for-terrorism-victims-family-to-win-lawsuit-against-twitter/; Benjamin Wittes and Zoe Bedell, Did Congress Immunize Twitter Against Lawsuits for Supporting ISIS?, LAWFARE (Jan. 22, 2016, 9:14 AM), https://www.lawfareblog.com/did-congress-immunize-twitter-against-lawsuits-supporting-isis (“companies would, indeed, have powerful arguments for immunity under” Section 230) [hereinafter “Wittes and Bedell Section 230”]. 194 47 U.S.C. § 230(c)(1). 195 47 U.S.C. § 230(e). 196 Doe v. Backpage.com, 817 F.3d 12, 22 (1st Cir. 2016) (holding that the plain meaning of Section 230(e)’s text compelled the conclusion that the exception from the liability shield for federal criminal cases did not apply to civil liability even when the civil suit was based upon a violation of federal criminal law). 197 47 U.S.C. § 230(c)(1).
The liability shield is not available for violations of federal criminal law, intellectual property law, state civil and criminal laws that do not conflict with Section 230, or the Electronic Communications Privacy Act of 1986 (ECPA), its amendments, or state laws similar to ECPA. 47 U.S.C. § 230(e). 198 Jones v. Dirty World Entertainment, 755 F.3d 398, 406 (6th Cir. 2014) (“Although §230(c)(1) does not explicitly mention immunity or a synonym thereof, this and other circuits have recognized the provision to protect internet service providers for the display of content created by someone else.”) (citations omitted). 199 Id. at 407; see also 47 U.S.C. § 230(b)(2) (“It is the policy of the United States ... to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal and State regulation.”). 200 Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157, 1164 (9th Cir. 2008) (stating that the statute did not “create a lawless no-man’s-land on the Internet”). 201 An interactive computer service is defined by Section 230 as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.” 47 U.S.C. § 230(f)(2). This definition has been interpreted broadly by courts to encompass websites, such as Backpage.com or Facebook, internet service providers, such as Verizon or Comcast, and all other types of computer services that provide a neutral platform for users of the service to communicate via the Internet. See, e.g., Dirty World, 755 F.3d at 407 (“These providers include broadband providers, hosting companies, and website operators like Dirty World and Richie.”); Backpage.com v. Hoffman, No. 13-cv-03952, 2013 U.S. Dist.
LEXIS 119811 (D.N.J. (continued...) 2. based upon information provided by another content provider, and 3. seeks to hold the defendant liable as a publisher or speaker of that content, then Section 230’s liability shield applies.202 If it can be shown that the interactive computer service provider contributed to the development of the illegal content, Section 230’s liability shield is not available.203 To date, only one court has ruled with regard to Section 230’s application to a lawsuit alleging material support of an FTO by a social media company. In Fields v. Twitter, the plaintiffs, family members of United States government contractors who were killed in a terrorist attack in Jordan, alleged that the Islamic State organization used Twitter to spread propaganda, raise funds, and attract recruits, and that Twitter knowingly permitted such use of its services.204 The plaintiffs further alleged that Twitter’s provision of these services generally caused their injuries.205 However, the plaintiffs did not specifically allege that the Islamic State used Twitter to recruit the person who committed the terrorist attack that injured the plaintiffs or that Twitter was used to plan the attack.206 Instead, the plaintiffs alleged only that the attacker was generally inspired by propaganda that he had seen on Twitter.207 The United States District Court for the Northern District of California dismissed the plaintiffs’ lawsuit, but gave the plaintiffs leave to amend their complaint.208 In doing so, the court did not reach the question of whether Twitter actually had provided material support to a foreign terrorist organization.
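The three-part eligibility test described above, together with the content-development exception, can be restated as a simple decision procedure. The following Python sketch is purely illustrative: the `Claim` type and its field names are invented for this example and are not drawn from the statute or the case law.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    """Hypothetical representation of a civil claim, for illustration only."""
    against_provider_or_user: bool   # prong 1: defendant is an interactive computer service provider or user
    third_party_content: bool        # prong 2: claim is based on information from another content provider
    treated_as_publisher: bool       # prong 3: claim treats the defendant as publisher or speaker
    helped_develop_content: bool     # exception: defendant materially contributed to the illegal content

def shield_applies(claim: Claim) -> bool:
    """Return True if the Section 230(c)(1) shield would apply under the
    three-part test, subject to the content-development exception
    (Fair Hous. Council v. Roommates.com)."""
    if claim.helped_develop_content:
        return False
    return (claim.against_provider_or_user
            and claim.third_party_content
            and claim.treated_as_publisher)

# Fields v. Twitter as the district court characterized it: all three
# prongs satisfied, so the shield barred the suit.
print(shield_applies(Claim(True, True, True, False)))   # True
# A suit that does not treat the defendant as a publisher falls outside
# the shield.
print(shield_applies(Claim(True, True, False, False)))  # False
```

The sketch simply encodes the conjunctive structure of the test: all three prongs must be met, and the Roommates.com exception defeats the shield regardless of the prongs.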
Instead, the district court dismissed the lawsuit because the court found that Section 230 of the CDA shielded Twitter from liability.209 The central argument in Fields with regard to Section 230 was whether the lawsuit was attempting to hold Twitter liable as a publisher.210 To support their argument that the lawsuit was not targeting Twitter as a publisher, the plaintiffs offered two justifications. First, they argued that their lawsuit was not based upon the content of the speech disseminated by Twitter, but instead was based solely upon Twitter’s provision of services in the form of Twitter accounts to the Islamic State organization.211 Second, they alleged that because much of the organization’s recruiting communication is accomplished via Twitter’s Direct Messaging services, which are private communications between individual users, those direct messages are not “published” to (...continued) 2013). 202 See Klayman v. Zuckerberg, 753 F.3d 1354, 1357 (D.C. Cir. 2014). 203 Fair Hous. Council, 521 F.3d at 1168 (“In other words, a website helps to develop unlawful content, and thus falls within the exception to section 230, if it contributes materially to the alleged illegality of the conduct.”). 204 Fields v. Twitter, No. 16-cv-00213-WHO, 2016 U.S. Dist. LEXIS 105768, at *3 (N.D. Cal. Aug. 10, 2016). 205 Id. at *4-*6. 206 Id. 207 Id. 208 Id. at *28. 209 Id. 210 Id. at *8. The plaintiffs in Fields did not dispute that Twitter was providing an interactive computer service. Id. Nor did the plaintiffs argue that the content that allegedly inspired the attack that injured them was provided by a third party. Id. 211 Id. at *11. the public and recovery based upon those communications cannot be barred by Section 230.212 The district court rejected both arguments.
In rejecting the plaintiffs’ provision-of-accounts theory of liability, the court noted that previous cases analyzing the breadth of Section 230’s liability shield have defined publishing broadly to include any decision related to what third-party content might be made available on a site or removed from it.213 Generally, previous courts had applied that reasoning to specific offensive content and not to the decision of whether to provide access to a platform to specific persons.214 However, the district court did not see a substantive distinction between the two concepts for the purposes of whether Twitter was being treated as a publisher. In the court’s view, granting permission to a particular person to post whatever content that person liked was no less a publishing decision than allowing content to be posted in the first place, or removing content that the site decided violated its rules.215 “Twitter’s decisions to structure and operate itself as a platform ... allowing for the freedom of expression [of] hundreds of millions of people around the world and to allow even ISIS to sign up for accounts on its social network,” in the court’s view, “reflect choices about what [third-party] content can appear on Twitter and in what form.”216 These were quintessential publishing decisions in the view of the court. Consequently, Section 230’s liability shield applied.217 Turning to whether content that was not made publicly available was published for the purposes of Section 230, the court again was not convinced by the plaintiffs’ arguments.218 In making this determination, the court noted that Section 230 was first enacted to provide a liability shield for defamatory content posted by third parties.219 Accordingly, the district court took into account that appeals courts have looked to defamation law when determining the scope of the shield 212 Id. at *13. 213 Id.
at *17 (citing Klayman, 753 F.3d at 1359 (“the very essence of publishing is making the decision whether to print or retract a given piece of content”); see generally Fair Hous. Council v. Roommates.com, LLC, 521 F.3d 1157, 1170-71 (9th Cir. 2008) (“‘determin[ing] whether or not to prevent [the] posting’ of third-party material online is ‘precisely the kind of activity’ covered by the CDA”). 214 Fields, 2016 U.S. Dist. LEXIS 105768, at *17. 215 Id. at *19. The court analogized this case to a recent First Circuit decision. In Doe v. Backpage.com, 817 F.3d 12 (1st Cir. 2016), plaintiffs had argued that Backpage.com was liable to them for violating the Trafficking Victims Protection Reauthorization Act because it had hosted advertisements whereby the plaintiffs were trafficked in violation of the statute. Id. at 15. The First Circuit held that Section 230(c)’s liability shield barred recovery because it “extends to the formulation of precisely the sort of website policies and practices” at issue in the case. Id. at 20. “Features such as these, which reflect choices about what content can appear on the website and in what form, are editorial choices that fall within the purview of traditional publisher functions.” Id. at 21. 216 Fields, 2016 U.S. Dist. LEXIS 105768, at *20 (internal citations and quotations omitted). 217 The court also found two further deficiencies in the plaintiffs’ argument that seeking to hold Twitter liable for providing accounts to FTOs was not an attempt to hold Twitter liable as a publisher. Id. at *13. First, the court found that much of the plaintiffs’ complaint focused on content posted to Twitter by the Islamic State organization. Id. “In short, the theory of liability ... is not that Twitter provides material support to ISIS by providing it with Twitter accounts, but that Twitter does so by ‘knowingly permitt[ing] [ISIS] to use [those accounts] as a tool for spreading extremist propaganda ...
” which essentially sought to hold Twitter liable for content it published. Id. at *16-*17. Furthermore, plaintiffs had not properly alleged causation. Id. at *22 (“[The] allegations in the [complaint] do not support a plausible inference of proximate causation between Twitter’s provision of accounts to ISIS and the deaths of Fields and Creach.”). 218 Id. at *26. 219 Id. (“As noted above, Congress enacted section 230(c)(1) in part to respond to a New York state court decision finding that an internet service provider could be held liable for defamation based on third-party content posted on its message boards.”). provided by Section 230.220 Because, under defamation law, the term publication means to communicate to another person, other than the person defamed,221 the Fields court determined that in order for an interactive service provider to be treated as a publisher for the purposes of Section 230, the third-party content need not be made publicly available.222 Section 230’s liability shield therefore also applied to content communicated via Twitter’s Direct Messaging service. As noted above, similar private civil lawsuits alleging that social media sites provide material support to terrorist organizations are pending in court. It remains to be seen whether those cases will be decided similarly to Fields v. Twitter. However, Fields does suggest that reviewing courts may be inclined to continue the general trend of broadly applying the liability shield provided by Section 230 to civil suits brought against Internet platforms that display third-party content like Twitter, Facebook, and Google.223 If so, they may provide a barrier to attempts to curb the advocacy of terrorism on the Internet through civil litigation. Author Contact Information Kathleen Ann Ruane Legislative Attorney kruane@crs.loc.gov, 7-9135 220 Id. 221 Fields, 2016 U.S. Dist. LEXIS 105768, at *27 (citing Zeran v.
America Online, Inc., 129 F.3d 327, 332 (4th Cir. 1997)). 222 Id. The district court also noted a number of other cases where Section 230 was held to bar recovery for claims based upon the transmission of third-party content that was nonpublic and in which the courts did not question whether the content could be said to have been “published.” Id. at *28 (citing Hung Tan Phan v. Lang Van Pham, 182 Cal. App. 4th 323, 324-28 (App. 2010); Delfino v. Agilent Techs., Inc., 145 Cal. App. 4th 790, 795-96, 804-08 (App. 2006); Beyond Sys., Inc. v. Keynetics, Inc., 422 F. Supp. 2d 523, 528, 536-37 (D. Md. 2006)). 223 See Eric Goldman, Section 230 Immunizes Twitter From Liability For ISIS’s Terrorist Activities–Fields v. Twitter, TECH. & MARKETING L. BLOG (Aug. 11, 2016), http://blog.ericgoldman.org/archives/2016/08/section-230-immunizestwitter-from-liability-for-isiss-terrorist-activities-fields-v-twitter.htm (“There are other pending lawsuits against social media sites for materially supporting terrorists. [Fields v. Twitter] reinforces the likelihood that those cases will also fail for Section 230 and causation reasons.”); but see Wittes and Bedell Section 230, supra note 193 (arguing that despite the fact that “companies would, indeed, have powerful arguments for immunity under” Section 230, the liability shield should not apply).