Before the
U.S. COPYRIGHT OFFICE
LIBRARY OF CONGRESS
Washington, DC

____________________________________
                                    )
In the Matter of                    )
                                    )
Section 512 Study                   )    Docket No. 2015-7
                                    )
____________________________________)

SECOND COMMENT OF FLOOR64, INC. d/b/a THE COPIA INSTITUTE

I. Preliminary Statement

Floor64, Inc., d/b/a the Copia Institute, is a corporation that regularly advises and educates innovative technology startups on a variety of issues, including those relating to intermediary liability and the protective effects of safe harbors on free speech. Through the Copia Institute it works directly with innovators and entrepreneurs to better understand innovation and policy issues, while Floor64's online publication, Techdirt.com, has published over 60,000 posts on these subjects and regularly receives more than two million page views per month. These posts have also attracted more than one million comments, third-party speech that advances discovery and discussion around these topics. Floor64 depends on statutory protections for intermediaries, including those under the Digital Millennium Copyright Act (“DMCA” or “Section 512”), both to enable the robust public discourse found on its pages and to allow its own speech to be shared and read throughout the Internet.

We submit this comment to continue to highlight the deleterious effects the DMCA often has on free speech. To the extent that the Copyright Office can or should advocate for policy, it should be for policy that alleviates these effects. In any event it should not be for policy that further exacerbates them.

II. Operation of the DMCA must be constrained by First Amendment principles.

The overarching issue pertinent to many of the questions asked in this follow-up inquiry (as well as the original one) is that the DMCA continues to exert a tremendously powerful, and effectively unchecked, censorious effect on legitimate speech interests. Furthermore, this effect undermines the “twin goals” of the DMCA discussed in Question #3, one of which is to support the growth of the Internet. Growing the Internet requires both more speech and more Internet platforms available to intermediate that speech.

Question #2 asks whether the interests of individual users need to be taken into account in any inquiry about the function of the DMCA. The answer is absolutely yes, because there is no functional difference between the interests of individual users and those of the Internet platforms that intermediate their speech. Together, speakers and platforms form a symbiotic relationship, with each dependent on the robust protections for speech afforded by the First Amendment. Without those protections not only is speech itself vulnerable to attack, but so are the platforms in the business of intermediating that speech. Such attacks cause intermediaries to lose content and users, and they put the statutory protection intermediaries depend on at risk – and with it all the user speech they enable.

To keep speech and platforms from being wrongfully targeted we must return to first principles. Congress can make no law abridging the freedom of speech, so for the DMCA to be constitutional it cannot provide any less protection for speech than the First Amendment ordinarily requires. As described in our previous comment, there are unfortunately several areas where its present operation provides fewer safeguards than the Constitution demands.
The following responses explain further how the statute should instead be interpreted in order to remove these constitutional infirmities.

A. Speech should not be vulnerable to infringement allegations that could not survive judicial scrutiny.

Several questions contemplate shortcomings with the DMCA takedown system. Because this takedown system functions as a system of extra-judicial injunctions, it is critical that the speech these notices target have at least as much protection as speech targeted by any request for injunctive relief. Ordinarily someone seeking to enjoin speech would need to properly plead and then prove that the targeted speech was indeed actionable. Under present practice, however, senders of takedown notices have not needed to overcome these sorts of hurdles prior to effecting the removal of targeted content via their takedown demands.

A significant reason takedown notice senders have been able to evade these constitutional requirements is that there is no effective consequence for sending nonmeritorious takedown demands. As discussed in the previous proceedings and earlier comments, intermediaries expose themselves to significant legal risk if they opt not to remove content in response to a takedown demand. Furthermore, even if intermediaries could review each takedown notice without legal risk, they often receive too many to be able to scrutinize each one manually and individually. Intermediaries are also ill-positioned to make effective determinations about a takedown notice’s validity.1 Thus the only way to stem the tide of illegitimate takedown notices sent to intermediaries is for there to be an adequate consequence for sending them. That consequence is written into Section 512(f).

1 Questions of validity include whether there is a legitimate copyright interest, whether that interest belongs to the takedown notice sender, whether the user might have had some license allowing the use, or whether the use was otherwise fair. Intermediary platforms rarely have adequate information available to them to answer these questions.

In practice, however, Section 512(f) has not been serving as an effective deterrent because courts have read into the requirements for sending takedown notices under Section 512(c) that the good faith belief that the targeted content is wrongful need only be “subjective.” Statutory interpretations allowing a subjective good faith belief, rather than requiring an objective one, have invited all sorts of illegitimate takedown notices to be dispatched and all sorts of lawful content to be removed, among other speech-chilling consequences discussed below.

Under no circumstance should the Copyright Office advocate for exacerbating any of the consequences to speech that the DMCA already inflicts. For instance, any proposal to increase the power of a takedown notice, such as by turning it into a permanent injunction through the “takedown-and-staydown” proposed by Question #12, would only increase the severity of the constitutional injury the DMCA inflicts, as would requiring any additional delay in restoring content after receiving a counter-notice, as proposed by Question #5.2 If the Copyright Office is to do anything, it should only be to encourage alleviation of the incursions on free speech that these unchecked takedown notices allow. Towards that end Question #7 contemplates stiffening penalties for wrongful takedown notices.3 At this point, however, the deficiencies in the operation of the DMCA relate to there effectively never being any penalty at all.

2 Elrod v. Burns, 427 U.S. 347, 373 (1976) (“The loss of First Amendment freedoms, for even minimal periods of time, unquestionably constitutes irreparable injury.”).

3 It also contemplates stiffening penalties for “false” or “abusive” counter-notices. Setting aside that a “false” counter-notice is far more likely to be a notice merely sent in error, where a user is simply mistaken about the scope of their rights, and that constitutional principles such as due process mean that a defense of one’s speech should never be considered “abusive,” such a proposal would only expand the worst censoring, and self-censoring, effects of the DMCA. As discussed previously, there are already plenty of deterrents to sending counter-notices, even for non-infringing content, and the Copyright Office should not be advocating for more.
While Question #13 asks for statutory language to improve the situation, the most needed fix is simply to stop reading the word “subjective” into the good faith standard and instead to read that standard as requiring an objective one. Although a legislative change could remove the ambiguity, such a change should be unnecessary. The word “subjective” is not part of the statute, and even just ceasing to infer it as an operative term will go a long way toward addressing the current constitutional deficiencies in the DMCA’s operation.

B. The safe harbor should not be vulnerable to infringement allegations that could not withstand judicial scrutiny.

Takedown notices do not merely cause speech to be disappeared from the Internet. As the recent cases cited by Question #14 have explored, they can potentially cause speakers to be disappeared as well. The reason for this effect is that the safe harbor is contingent on an intermediary doing certain things once it knows of infringement, including removing or disabling access to targeted content. But those recent cases have also suggested that these takedown notices effectively start a clock on the intermediary, where once it learns too much about a user’s predilection for potentially infringing activities it must act to remove that user’s access to its systems entirely.

These cases are troublesome for several reasons, not the least of which is that, like the jurisprudence relating to Section 512(f), they also read into the statute a requirement that is not actually there. Section 512(i) only says that an intermediary must have a policy for terminating repeat infringers; it is otherwise silent as to what that policy should be, and post-hoc decisions by a court threaten to make safe harbor protection illusory, given that a platform can never be sure whether it has complied with the statute or not.

They are also troublesome because they give the takedown demand a sort of power that such demands would never have outside of the DMCA. As discussed above, and in prior comments and proceedings, infringement allegations can often be false (or even merely mistaken), which is why injunctions are not granted without due process. Due process allows the allegations to be tested, so that only meritorious accusations can result in any penalty.4 Allowing a penalty for unproven allegations, particularly with respect to speech, amounts to prior restraint, which is itself anathema to the First Amendment.5

4 See Neb. Press Ass’n v. Stuart, 427 U.S. 539, 562 (1976) (“Only after judgment has become final, correct or otherwise, does the law’s sanction become fully operative.”).

5 See, e.g., Bantam Books, Inc. v. Sullivan, 372 U.S. 58, 70 (1963) (“Any system of prior restraints ... bear[s] a heavy presumption against its constitutional validity. We have tolerated such a system only where it operated under judicial superintendence and assured an almost immediate judicial determination of the validity of the restraint.”) (internal citations omitted); Carroll v. President & Comm’rs of Princess Anne, 393 U.S. 175, 181 (1968) (“The elimination of prior restraints was a ‘leading purpose’ in the adoption of the First Amendment.”).

A penalty that censors speech is bad enough, but a penalty that censors speakers altogether raises the constitutional injury to a whole other level. We have already seen malevolent actors abuse takedown notices to try to suppress criticism. We should not also be handing them the power to use takedown notices to suppress critics’ ability to speak out at all.6 But as long as platforms are effectively forced to respond to every takedown notice as if it were meritorious, and to bar users from the forums they provide (or, in the case of the Section 512(a)-eligible service providers discussed in Question #8, the connections they provide), the potential for abuse is tremendous. Removing lawful content is already injurious enough to the values that the DMCA, and the First Amendment itself, are intended to promote. Pressuring platforms to further silence any speech their users have yet to make, particularly when predicated on unproven claims, is even more contrary to those fundamental policy goals.

6 Being banned from an Internet platform like a social media site with strong network effects is itself of serious consequence, but in BMG Rights Management (US) v. Cox Communications, Inc., 149 F. Supp. 3d 634, 653-62 (E.D. Va. 2015), the court required Cox to terminate users from their Internet service altogether. Such a termination did not just remove users’ access to one social media site but to every social media site, every educational site, every travel site, every banking site, every governmental site, etc. Such a termination cannot possibly be deemed a reasonable and proportional response to allegations of infringement, even valid ones. Moreover, what makes the Cox case particularly worrisome is how many of the takedown notices sent to Cox were invalid – so many that one of the plaintiffs was even dismissed from the case for lack of standing to bring an infringement claim. Id. at 6542-53.

III. Conclusion

The conclusion here echoes that of our prior comment: it is not possible to have a valid copyright law that is in any part inconsistent with the First Amendment. For the DMCA's provisions to be valid, and for the safe harbor protection they are intended to offer to be meaningful, the statute must operate consistently with these constitutional precepts, and that requires reducing the power that unproven takedown demands can have over platforms and the speech they intermediate.

Dated: February 21, 2017

Respectfully submitted,

/s/ Catherine R. Gellis
Catherine R. Gellis (CA Bar # 251927)
P.O. Box 2477
Sausalito, CA 94966
Phone: 202-642-2849
cathy@cgcounsel.com

Counsel for Floor64, Inc., d/b/a the Copia Institute