STATE OF WISCONSIN
IN SUPREME COURT

YASMEEN DANIEL, Individually, and as Special Administrator of the Estate of Zina Daniel Haughton,
     Plaintiff-Appellant,

TRAVELERS INDEMNITY COMPANY OF CONNECTICUT, as Subrogee for Jalisco's LLC,
     Intervening Plaintiff,

v.

ARMSLIST, LLC, an Oklahoma Limited Liability Company, BRIAN MANCINI and JONATHAN GIBBON,
     Defendants-Respondents-Petitioners,

BROC ELMORE, ABC INSURANCE CO., the fictitious name for an unknown insurance company, DEF INSURANCE CO., the fictitious name for an unknown insurance company, and ESTATE OF RADCLIFFE HAUGHTON, by his Special Administrator, Jennifer Valenti,
     Defendants,

PROGRESSIVE UNIVERSAL INSURANCE COMPANY,
     Intervening Defendant.

APPEAL NO. 2017-AP-344
Milwaukee County Case No. 15-CV-8710

BRIEF OF AMICUS CURIAE FLOOR64, INC. D/B/A THE COPIA INSTITUTE AND THE ELECTRONIC FRONTIER FOUNDATION

GIMBEL, REILLY, GUERIN & BROWN LLP
Kathryn A. Keppel (State Bar No. 1005149)
Steven C. McGaver (State Bar No. 1051898)
330 E. Kilbourn Ave., Suite 1170
Milwaukee, WI 53202
(414) 271-1440

CATHERINE R. GELLIS, ESQ.
PO Box 2477
Sausalito, CA 94966
cathy@cgcounsel.com
(202) 642-2849

Counsel for Amicus Curiae Floor64, Inc., d/b/a The Copia Institute and the Electronic Frontier Foundation

TABLE OF CONTENTS

INTRODUCTION
ARGUMENT
I. If the Decision Stands, It Will Chill Online Speech and Innovation.
   A. This Is a Case about Holding Platforms Liable for User Speech, Which Section 230 Forbids.
   B. Online Speech and Innovation Depend on Internet Platforms Being Able to Rely on Robust Section 230 Protection.
II. The Court of Appeals Erred in Refusing to Apply Section 230 to Armslist.
   A. Congress Intended Section 230 to Apply to All Internet Platforms, Including Those Like Armslist.
   B. Congress Pre-empted States from Interfering with the Application of Section 230 to Internet Platforms, Including Those Like Armslist.
CONCLUSION
CERTIFICATIONS
CERTIFICATE OF SERVICE

TABLE OF AUTHORITIES

Cases

Batzel v. Smith, 333 F.3d 1018 (9th Cir. 2003), cert. denied, 541 U.S. 1085 (2004)
Barnes v. Yahoo, 570 F.3d 1096 (9th Cir. 2009)
Doe 14 v. Internet Brands, 824 F.3d 846 (9th Cir. 2016)
Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008)
Homeaway.com v. City of Santa Monica, No. 18-55367 (9th Cir.), Brief of Amicus Curiae for Chris Cox and NetChoice, filed Apr. 25, 2018
Reno v. ACLU, 521 U.S. 844, 870 (1997)
Stratton Oakmont v. Prodigy Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995)
Zeran v. AOL, 129 F.3d 327 (4th Cir. 1997)

Statutes
17 U.S.C. §512(c)(1)(C)
47 U.S.C. §230
47 U.S.C. §230(a)
47 U.S.C. §230(b)(1)
47 U.S.C. §230(c)(1)
47 U.S.C. §230(c)(2)
47 U.S.C. §230(e)(2)
47 U.S.C. §230(e)(3)
47 U.S.C. §230(e)(5)
Pub. L. 115–164, § 2, Apr. 11, 2018, 132 Stat. 1255

Other Authorities

Anupam Chander & Uyên P. Lê, Free Speech, 100 IOWA L. REV. 501, 514 (2015)
Mike Masnick, SESTA's First Victim: Craigslist Shuts Down Personals Section, TECHDIRT.COM, Mar. 23, 2018

INTRODUCTION

Tragic events like the one at the heart of this case often complicate the proper adjudication of litigation brought against Internet platforms. Justice would seem to call for a remedy, and if it appears that some twenty-year-old federal statute is all that prevents a worthy plaintiff from obtaining one, it is tempting for courts to ignore the statute in order to find a way to provide that remedy. The problem is that, in cases like this one, there is more at stake than the plaintiff's interest alone. This case is not a gun policy case, or even a negligence case. It is a speech case, and the laws that protect speech exist for good reason. They are ignored at our peril, because ignoring them endangers all the important expression they exist to protect.

But that is what the Court of Appeals has done. In its effort to provide the plaintiff a remedy, the court ignored the prohibitions imposed by a key federal statute limiting the court's ability to extract that remedy from an Internet platform like defendant-respondent-petitioner Armslist LLC. The statute in question, 47 U.S.C. §230 ("Section 230"), precludes a cause of action from proceeding against an Internet platform for liability arising from content created by a user. It further prohibits states from imposing law inconsistent with this immunity. It does so because Congress long ago recognized that the only way the Internet could thrive as a place for innovative speech and services would be for platforms such as Armslist to be immune from suits arising from the user expression they enable. In finding that Section 230 does not apply to Armslist, however, the Court of Appeals flouted these provisions and in doing so undermined all that Congress had sought to protect.

Thus, there are two interrelated reasons why this Court should review the Court of Appeals' decision. One relates to the mechanics of Section 230: the Court of Appeals misread it as a statute of narrow applicability that did not reach the claims at hand, despite the pre-emption provision barring states from imposing their own law in ways that conflict with the statute's essential operation. Which leads to the other reason: in deciding that Section 230 could not reach Armslist, the decision inherently conflicts with that operation.
In doing so it jeopardizes all the online speech and innovation the statute was designed to protect, not just on topics related to gun sales, and not just for Internet businesses and users in Wisconsin. It is a decision whose effect will be felt far beyond Wisconsin's borders, and with a far more deleterious impact on all sorts of valuable speech and innovation than the Court of Appeals likely anticipated. Review by this Court is therefore necessary and appropriate.

ARGUMENT

I. If the Decision Stands, It Will Chill Online Speech and Innovation.

A. This Is a Case about Holding Platforms Liable for User Speech, Which Section 230 Forbids.

This case presents as a gun policy and negligence case, but the core legal question it raises is whether an Internet platform can be held liable for the consequences of speech a user expressed through its services. Here, the speech in question is the speech offering the sale of the gun. All questions of liability flow from this speech because, had it not been made, the gun would not have been sold to the shooter. But, crucially, the case is not about holding a speaker liable for the consequences of his or her speech, which Section 230 permits. Rather, this case is about a plaintiff attempting to hold a platform liable for the consequences of its user's speech, which Section 230 expressly forbids. 47 U.S.C. §230(c)(1) ("No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.").

As Armslist's Petition notes, there is plenty of case law affirming this prohibition. Pet. 14. Speech can often have negative consequences, but courts have been clear that Section 230 prevents holding the intermediating platforms liable for them, even where the types of speech a platform attracts may be more likely to have negative consequences. See id. at 16-17.

When courts have found potential liability for Internet platforms, those cases have had key differences from this one. One such difference arises when there is a question as to who created the potentially wrongful expression, the platform or the user. Fair Housing Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1166-67 (9th Cir. 2008). Here, however, there is no allegation that the Armslist platform created the content offering the sale of the gun; it was the seller who did. Furthermore, unlike in Roommates, where the court found the platform had helped give the content its wrongful quality, it appears that under Wisconsin law the speech offering the gun sale by an unlicensed dealer was not illegal. Op. ¶9.

Other cases where courts have allowed claims to proceed against platforms have been those where the platform's potential culpability had nothing to do with its facilitation of user speech. For instance, in Barnes v. Yahoo, the Ninth Circuit affirmed that Section 230 would have applied to Yahoo's intermediation of the user speech in question. 570 F.3d 1096, 1105-06 (9th Cir. 2009). It found only the possibility of promissory estoppel liability for the separate act of having promised to delete the content and then failing to do so. Id. at 1109. Likewise, in Doe 14 v. Internet Brands, the theory of liability against the platform (a duty to warn) was found to be separate from any of its speech intermediation activities. 824 F.3d 846, 851 (9th Cir. 2016).
In this case, however, the entire theory of liability is predicated on dissatisfaction with how Armslist handled its user's speech. Op. ¶17. Neither Barnes nor Internet Brands supports such a finding of liability. Nor does the twenty-plus years of jurisprudence interpreting Section 230. Thus, review by this Court is warranted to reverse the Court of Appeals' error.

B. Online Speech and Innovation Depend on Internet Platforms Being Able to Rely on Robust Section 230 Protection.

For Section 230 to function at all, the judicial temptation to whittle away at its protection must be resisted. As described further in Section II.A, the statutory language allows few exemptions to its coverage. But there is plenty of evidence that when platforms do face potential liability for user speech, the result is chilling to all online expression.

A notable illustration of this dynamic is the censoring effect that results from claims alleging violations of intellectual property rights. One of the few exemptions written into Section 230 is that it provides no liability protection to platforms hosting user content that may violate these rights. 47 U.S.C. §230(e)(2). As a result, when claims are made alleging that user content is infringing, platforms find themselves having to censor that content pre-emptively, without any adjudication as to whether it was truly infringing or not. See, e.g., 17 U.S.C. §512(c)(1)(C) (conditioning platform liability protection on the "removal" of allegedly infringing content upon being notified of its presence on the platform, not its ultimate adjudication). When there is no protection from liability for user expression, the choice for a platform is stark: censor, or potentially be obliterated by the enormous costs of even litigating liability over user content. "Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted." Zeran v. AOL, 129 F.3d 327, 331 (4th Cir. 1997). If platforms had to fear liability for their users' content, the resulting Internet would inevitably include far less speech, if not also far fewer platforms altogether.

And it is not just the Armslists of the world that can find themselves at this crossroads. Platforms of all types depend on the immunity Section 230 provides. Internet platforms such as social media websites, blogging platforms, video-sharing services, and web-hosting companies—platforms that are the essential architecture of today's Internet—depend on it. These platforms are often the primary way people engage with one another online. They are the "vehicle for the speech of others," Anupam Chander & Uyên P. Lê, Free Speech, 100 IOWA L. REV. 501, 514 (2015), hosting a wide range of diverse ideas that can be presented and received all over the world. Internet platforms enable anyone, even those with minimal resources and technical expertise, to become "a pamphleteer" or "a town crier with a voice that resonates farther than it could from any soapbox." Reno v. ACLU, 521 U.S. 844, 870 (1997). But platforms can only afford to provide these services when they are protected from liability arising from all the speech they enable. Thus, any effort to weaken Section 230 weakens the ability of others to speak online. Review by this Court is therefore warranted to avoid inviting this chilling effect.

II. The Court of Appeals Erred in Refusing to Apply Section 230 to Armslist.
A. Congress Intended Section 230 to Apply to All Internet Platforms, Including Those Like Armslist.

In reading Section 230 more narrowly than its text supports, the Court of Appeals ignored the Congressional intent behind the statute. Section 230 was not a solution to a hypothetical problem: in 1995 a New York state court had found Prodigy, an early online communications service, liable for $200 million in damages arising from a user's defamatory speech. Stratton Oakmont v. Prodigy Servs. Co., 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995). Damage awards like these can wipe technologies off the map. If platforms had to fear the crippling effect that even just one award, arising from just one user, could have on their developing online services, it would force them to monitor all the expression they facilitate to ensure none could invite such trouble. Br. amicus curiae for Chris Cox and NetChoice at 15, Homeaway.com v. City of Santa Monica, No. 18-55367 (9th Cir. filed Apr. 25, 2018) ("Cox Brief"), available at https://tdrt.io/gPJ ("The inevitable consequence of attaching platform liability to user-generated content is to force intermediaries to monitor everything posted on their sites.").¹ Given the sheer amount of expression they handle, however, such monitoring would be an impossible task. Id. at 2 ("While the volume of users [in 1995] was only in the millions, not the billions as today, it was evident […] even then that no group of human beings would ever be able to keep pace with the growth of user-generated content on the Web."). Platforms would thus be forced either to over-censor broad swaths of legitimate expression pre-emptively, or to cease to be platforms at all. Id. at 26 ("[Platforms], facing massive exposure to potential liability if they do not monitor user content and take responsibility for third parties' legal compliance, would encounter significant obstacles to capital formation."). Congress passed Section 230 to relieve platforms of this monitoring burden and the speech-inhibiting decisions it would force. Id. at 12 ("All of the unique benefits the Internet provides are dependent upon platforms being able to facilitate communication among vast numbers of people without being required to review those communications individually.").

Stratton Oakmont notably was a defamation case. Congress could have limited Section 230 only to liability for defamation, or only to the sorts of platforms that might face that sort of liability. But it chose broader language, because if all Section 230 spared platforms from was defamation liability, they would still need to monitor content for all other possible sources of liability, and little would have been accomplished. Furthermore, as much as Congress wanted to protect platforms' ability to promote discourse, id. at 12, it also wanted to advance e-commerce, which required treating platforms equally. Id. at 15, 23-24. See also Batzel v. Smith, 333 F.3d 1018, 1028 (9th Cir. 2003), cert. denied, 541 U.S. 1085 (2004).

___________________
¹ The Court does not need to guess how Congress intended Section 230 to work: earlier this year former member of Congress Chris Cox, the statute's co-author, submitted an amicus brief in a similar case in which a lower court had denied Section 230 applicability to certain types of platforms. In it, he explained that Congress intended Section 230 to apply broadly, because only by being broad could it achieve Congress's goal of fostering the growth of the Internet while most effectively limiting its downsides. Cox Brief 11-12.
Congress is certainly capable of narrowing Section 230 should it so desire. It recently added a new exemption to the statute's coverage explicitly allowing platform liability for user speech connected with human trafficking. Pub. L. 115–164, § 2, Apr. 11, 2018, 132 Stat. 1255, codified at 47 U.S.C. §230(e)(5). Congress could similarly narrow Section 230 further by creating an exemption for online gun sales. But when Section 230 is limited, platforms have a perverse incentive to limit online speech and services that are otherwise legitimate and valuable. See, e.g., Mike Masnick, SESTA's First Victim: Craigslist Shuts Down Personals Section, TECHDIRT.COM, Mar. 23, 2018, https://tdrt.io/gIw. This Court should therefore review the appeals court decision to ensure Wisconsin jurisprudence does not invite the speech-limiting consequences Congress sought to avoid.

B. Congress Pre-empted States from Interfering with the Application of Section 230 to Internet Platforms, Including Those Like Armslist.

In 1996, when Section 230 was codified, Congress could not know what the Internet would become. So Congress drew Section 230 broadly and in accordance with a general policy principle: encourage the most good online expression, and the least bad. It achieved this policy goal with a regulatory approach that protected against both liability for carrying speech, 47 U.S.C. §230(c)(1), and liability for removing it, 47 U.S.C. §230(c)(2). With the threat of sanction removed, platforms would be able to facilitate the most beneficial speech and allocate their resources most efficiently to minimize the most undesirable. But imposing liability on platforms distorts this balance and undermines both objectives. It co-opts resources that could be better spent optimizing speech intermediation and pressures sites to reject more content, even content that may be perfectly legitimate, because, as discussed above, it may be prohibitively expensive, if not also impractical or even impossible, to weed out the acceptable from the problematic.

Of course, the Internet inherently transcends state boundaries, meaning platforms could be exposed to regulators in every jurisdiction they reach. Cox Brief 27 ("A website […] is immediately and uninterruptedly exposed to billions of Internet users in every U.S. jurisdiction and around the planet. This makes Internet commerce uniquely vulnerable to regulatory burdens in thousands of jurisdictions."). Congress worried that state and local authorities would be tempted to impose liability on platforms, and in doing so interfere with the operation of the Internet by separately creating the very monitoring obligations Section 230 was intended to avoid. Id. at 25 ("While one monitoring requirement in one city may seem a tractable compliance burden, myriad similar-but-not-identical regulations could easily damage or shut down Internet platforms."). The pre-emption provision of Section 230 was supposed to forestall this result. 47 U.S.C. §230(e)(3) ("No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section."). It was particularly needed because not every jurisdiction will agree on the best policy for imposing liability on certain kinds of expression. But if one jurisdiction can effectively chill certain types of speech facilitation with the threat of potential liability, it will chill that facilitation everywhere, regardless of whether other jurisdictions agree with the policy choice.
So while it may seem desirable here for Wisconsin to take the regulatory lead with regard to online gun sales, if the Court of Appeals' decision were to stand, its disregard for the pre-emption provision could easily prompt other states to disregard it as well and impose their own policy preferences to shape what speech may be available online, even in Wisconsin, which may not share those preferences.

The Court of Appeals thus misunderstood the purpose of Section 230's pre-emption provision. The purpose was not to pre-empt any particular policy "domain" normally left to the states. Op. ¶33. Instead, Congress used its commerce powers to pre-empt the "field" of Internet platform regulation itself. "To ensure the quintessentially interstate commerce of the Internet would be governed by a uniform national policy[,]" sparing platforms the need to monitor the expression they facilitate, Congress deliberately foreclosed the ability of state and local authorities to interfere with that policy. Cox Brief 10. Congress did so because without this provision the statute would be useless. Cox Brief 13 ("Were every state and municipality free to adopt its own policy concerning when an Internet platform must assume duties in connection with content created by third party users, not only would compliance become oppressive, but the federal policy itself could quickly be undone."). When it comes to online speech, the only policy that is supposed to be favored is the one Congress originally chose, "to promote the continued development of the Internet and other interactive computer services and other interactive media," 47 U.S.C. §230(b)(1), and all that these services offer. See 47 U.S.C. §230(a) (enumerating the many benefits of these services). The only way to give that policy the effect Congress intended is to ensure local regulatory efforts cannot distort the careful balance Congress codified to achieve it. This Court therefore should review the Court of Appeals' decision, which threatens that fundamental balance.

CONCLUSION

The Court of Appeals' erroneous interpretation of Section 230 and its refusal to apply the statute to the Internet platform Armslist will chill speech and innovation. Therefore, this Court should grant the petition for review.

Dated this 7th day of June, 2018.

Respectfully submitted,

GIMBEL, REILLY, GUERIN & BROWN LLP

By: _____________________________
KATHRYN A. KEPPEL
State Bar No. 1005149
kkeppel@grgblaw.com
STEVEN C. MCGAVER
State Bar No. 1051898
smcgaver@grgblaw.com
330 East Kilbourn Avenue, Suite 1170
Milwaukee, Wisconsin 53202
Telephone: 414/271-1440

Local Counsel for Amicus Curiae Floor64, Inc., d/b/a The Copia Institute and the Electronic Frontier Foundation

Catherine R. Gellis, Esq.
cathy@cgcounsel.com
PO Box 2477
Sausalito, CA 94966
(202) 642-2849

Counsel for Amicus Curiae Floor64, Inc., d/b/a The Copia Institute and the Electronic Frontier Foundation

FORM AND LENGTH CERTIFICATION

I hereby certify that this brief conforms to the rules contained in Wis. Stat. §§ 809.19(8)(b) and (c) as to form and length for a non-party brief produced with a proportional serif font. The length of this brief, including footnotes, is 2,966 words.

KATHRYN A. KEPPEL

CERTIFICATION REGARDING ELECTRONIC BRIEF

I hereby certify that I have submitted an electronic copy of this brief which complies with the requirements of Wis. Stat. §809.19(12).
I further certify that the text of the electronic copy of the brief is identical to the text of the paper copy of the brief filed as of this date.

KATHRYN A. KEPPEL

CERTIFICATE OF SERVICE

I hereby certify that on this 7th day of June, 2018, I caused one copy of this brief supporting the petition for review to be served upon each of the following persons via U.S. Mail, First Class:

QUARLES & BRADY LLP
Joshua D. Maggard
Eric J. Van Schyndle
James Goldschmidt
411 East Wisconsin Avenue, Suite 2400
Milwaukee, WI 53202

BRADY CENTER TO PREVENT GUN VIOLENCE LEGAL ACTION PROJECT
Jonathan E. Lowy
Alla Lefkowitz
840 First Street, NE, Suite 400
Washington, D.C. 20002

REED SMITH LLP
Brian A. Sutherland
101 Second Street, Suite 1800
San Francisco, CA 94105

CANNON & DUNPHY
Patrick O. Dunphy
Brett A. Eckstein
594 North Barker Road
P.O. Box 1750
Brookfield, WI 53008

MANATT, PHELPS & PHILLIPS, LLP
Jacqueline C. Wolff
Arunabha Bhoumik
Samantha J. Katz
7 Times Square
New York, NY 10036

CHAPIN & ASSOCIATES
John A. Griner
13935 Bishops Dr., Suite 250
Brookfield, WI 53005

DEVINE HAHN SC
Thomas M. Devine
Anthony P. Hahn
840 Lake Ave., Ste. 300
Racine, WI 53403

FUCHS & BOYLE SC
John F. Fuchs
13500 Watertown Plank Rd., Ste. 100
Elm Grove, WI 53122

Dated: June 7, 2018

KATHRYN A. KEPPEL