STATE OF VERMONT
SUPERIOR COURT
CHITTENDEN UNIT

State of Vermont,
        Plaintiff,
                                                            Index No.: 226-3-20
v.

Clearview AI, Inc.,
        Defendant.

Defendant's Memorandum of Law in Opposition to Vermont's Motion for a Preliminary Injunction

Tor Ekeland, Pro Hac Vice pending, NY Bar No. 4493631
Tor Ekeland Law, PLLC
195 Montague Street, 14th Floor, Brooklyn, NY 11201
(718) 737-7264
tor@torekeland.com

Corinne Mullen, Of Counsel, Pro Hac Vice pending, NY Bar No. 2451391
Mullen Law Firm
1201 Hudson St., Ste. 230, Hoboken, NJ 07030
(201) 420-1911
corinne@mullenlawfirm.com

Timothy C. Doherty, Jr., VT Bar No. 4849
Tristram J. Coffin, VT Bar No. 2445
Downs Rachlin Martin PLLC
199 Main Street, PO Box 190, Burlington, VT 05402-0190
(802) 863-2375
TDoherty@drm.com
TCoffin@drm.com

Table of Contents

Table of Contents .............................................................. ii
Table of Authorities ........................................................... iv
Introduction .................................................................... 1
Background ...................................................................... 2
    How the Clearview AI App Works ............................................. 3
    Law Enforcements' Use of Clearview AI ...................................... 7
    Public Interest Uses of Clearview AI ....................................... 8
    The Media Firestorm ........................................................ 9
    Cyber Security ............................................................. 9
Argument ....................................................................... 10
I.   Communications Decency Act § 230 Preempts Vermont's Action ............... 11
     A. Clearview is an Interactive Computer Service ........................... 12
     B. Vermont's Claims are Based on Third-Party Content ...................... 13
     C. Vermont is treating Clearview as a Speaker and Publisher ............... 16
II.  Clearview's Computer Code is Protected First Amendment Speech ............. 19
     A. The First Amendment Protects Computer Languages ........................ 20
     B. Search Engine Results Are Expressive First Amendment Speech ............ 26
     C. Vermont is Engaged in Content Based Speech Discrimination .............. 27
     D. Strict Scrutiny is Fatal to Vermont's Content Based Restrictions ....... 29
     E. A Preliminary Injunction Violates the First Amendment's Prohibitions on Overbreadth and Vagueness ..... 32
        i. Facial Overbreadth & Vagueness ...................................... 32
        ii. As Applied Overbreadth & Vagueness ................................. 35
     F. Vermont Seeks an Unconstitutional Prior Restraint on Speech ............ 35
     G. There is a First Amendment Right to Public Data on the Internet ........ 37
III. Vermont's Action is Void for Vagueness as Applied Under the Due Process Clauses of the Fifth and Fourteenth Amendments ..... 42
     A. The Court Should Apply the Regular Four-Part Balancing Test for Injunctions ..... 43
     B. This Case is Not Appropriate For a Statutory Preliminary Injunction .... 44
        i. Vermont Law Does Not Allow It ....................................... 44
        ii. Other Jurisdictions Reject the State's Interpretation of the Doctrine ..... 45
        iii. Prudence Requires Applying the Regular Rule to this Unique Case ... 46
     C. The State Has Made No Showing Justifying an Injunction Under the Proper Standard ..... 47
        i. Vermont Supreme Court's Narrow Expectation of Privacy ............... 48
        ii. Vermont Statutory Definition of "Reasonable Expectation of Privacy"  54
     D. Vermont Cannot Succeed on the Merits – Unfairness ...................... 57
        i. Clearview AI's Practices Do Not Injure The Public ................... 60
        ii. Clearview AI's Practices Do Not Violate Any Clear and Well-Established Public Policy ..... 61
        iii. Clearview AI's Practices Are Not Immoral .......................... 62
     E. Vermont Cannot Succeed on the Merits: Deception ........................ 63
     F. Vermont's Legislature Has Implemented a Statute Regulating Biometric Technology in Limited Contexts But Permitting it Generally ..... 67
     G. Vermont's Data Broker Laws Don't Apply to Clearview .................... 68
IV.  Vermont Lacks Standing Because there is No Concrete, Material Injury ...... 68
CONCLUSION ..................................................................... 70

Table of Authorities

CASES

Ashcroft v. Free Speech Coalition, 535 U.S. 234 (2002) ......................... 22
Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009) .......................... 13
Bartnicki v. Vopper, 532 U.S. 514 (2001) ....................................... 31
Bennett v. Google, LLC, 882 F.3d 1163 (D.C. Cir. 2018) ......................... 15
Brooklyn Inst. of Arts & Scis. v. City of New York & Rudolph W. Giuliani, 64 F. Supp. 2d 184 (E.D.N.Y. 1999) ..... 22, 37
Brown v. Entm't Merchs. Ass'n, 564 U.S. 786 (2011) ......................... 30, 32
Carter v. Gugliuzzi, 168 Vt. 48 (1998) ......................................... 64
Cf. Reno v. ACLU, 521 U.S. 844 (1997) ...................................... 32, 33
CFTC v. Vartuli, 228 F.3d 94 (2d Cir. 2000) .................................... 25
Chicago v. Morales, 527 U.S. 41 (1999) ......................................... 44
Christie v. Dalmig, Inc., 136 Vt.
597, 396 A.2d 1385 (1979) ...................................................... 64 City of New York v. Golden Feather Smoke Shop, Inc., 597 F.3d. 115 (2d Cir. 2010) ................ 45 Cohen v. California, 403 U.S. 15, 21 (1971) ................................................................................ 22 Def. Distributed v. United States Dep't of State, 838 F.3d 451 (5th Cir. 2016) ..................... 20, 29 Doctor’s Assocs., Inc. v. QIP Holders, LLC (“Doctor’s Assocs. I”), No. 06-cv-1710, 2007 WL 1186026 (D. Conn. Apr. 19, 2007) ........................................................................................... 12 Edwards v. District of Columbia, 755 F.3d 996 (D.C. Cir. 2014) ................................................ 32 Elrod v. Burns, 427 U.S. 347 (1976) ............................................................................................ 46 F.T.C. v. Accusearch Inc., 570 F.3d 1187 (10th Cir. 2009).............................................. 14, 16, 18 iv F.T.C. v. Sperry & Hutchinson Co., 405 U.S. 233 (1972). ........................................................... 58 Fair Hous. Council of San Fernando Valley v Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008..................................................................................................................................... 13, 16 Fed. Trade Comm'n v. LeadClick Media, LLC, 838 F.3d 158 (2d Cir. 2016) .................. 15, 16, 18 Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019) ....................................................... 12, 17, 18 Fort Wayne Books, Inc. v. Indiana, 489 U.S. 46 (1989) ............................................................... 46 FTC v Accusearch, Inc., 570 F.3d 1187 (10th Cir. 2009)............................................................. 13 FTC v. Motion Picture Advert. Serv. Co., 344 U.S. 392 (1953) ................................................... 66 FTC v. Sterling Drug, Inc., 317 F.2d 669 (2d Cir. 1963) ............................................................. 66 Gibson v Craigslist, Inc., 2009 WL 1704355 (S.D.N.Y. June 15, 2009) ..................................... 13 Grayned v. City of Rockford, 408 U.S. 104 (1972)....................................................................... 43 Green v. United States DOJ, 392 F. Supp. 3d 68 (D.D.C. 2019) ................................................. 20 Hill v. Colo., 530 U.S. 703 (U.S. June 28, 2000).......................................................................... 29 Hill v. Colorado, 530 U.S. 703 (2000) ......................................................................................... 44 HiQ Labs, Inc. v. LinkedIn Corp., 938 F.3d 985 (9th Cir. 2019). ................................................ 41 Hogdon v Mt. Mansfield Co., Ins., 160 Vt. 150, 162, 624 A.2d 1122 (1992) .............................. 56 Holder v. Humanitarian Law Project, 561 U.S. 1 (2010). ........................................................... 43 Hustler Magazine v. Falwell, 485 U.S. 46 (1988) ........................................................................ 22 Int'l Airport Ctrs., L.L.C. v. Citrin, 440 F.3d 418 (7th Cir. 2006) ................................................ 43 Jane Doe No. 1 v Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016) ............................................ 13 Jones v Dirty World Entm’t Recordings LLC, 755 F.3d 398 (6th Cir. 2014) ............................... 13 Junger v. Daley, 209 F.3d 481, (6th Cir. 
2000) ............................................................................ 20 v Junger v. Daley, 209 F.3d 481 (6th Cir. 2000) ................................................................. 21, 22, 24 Klay v. United Healthgroup, Inc., 376 F.3d, 1092 (11th Cir. 2004). ........................................... 47 Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014) ............................................................ 17 Lujan v. Defs. Of Wildlife, 504 U.S. 555 (1992) ........................................................................... 69 Marshall's Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263 (D.C. Cir. 2019).......... 12, 14, 16 McBoyle v. United States, 283 U.S. 25 (1931) ............................................................................. 43 MIC, Ltd. v. Bedford Township, 463 U.S. 1341 (1983) ................................................................ 46 Near v. Minnesota ex rel. Olson, 283 U.S. 697 (1931)................................................................. 36 Nebraska Press Ass'n v. Stuart, 427 U.S. 539 (1976)................................................................... 36 Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250 (4th Cir. 2009).............. 11, 12 O’Kroley v Fastcase Inc., 2014 WL 2881526 (M.D. Tenn, June 25, 2014) ................................ 13 O'Kroley v. Fastcase, Inc., 831 F.3d 352 (6th Cir. 2016) ................................................. 14, 15, 16 Organization for a Better Austin v. Keefe, 402 U.S. 415 (1971) .................................................. 36 Packingham v. North Carolina, 137 S. Ct. 1730 (2017) .............................................................. 38 Parker v. Google, Inc., 422 F. Supp. 2d 492 (E.D. Pa. 2006) ...................................................... 14 Patel v. Facebook, 932 F.2d 1264 (2019) ..................................................................................... 70 Pion v Bean, 2003 VT 79, Para 34, 176 Vt. 1, 833 A.2d 1248 ..................................................... 57 Pittsburgh Press Co. v. Pittsburgh Commission on Human Relations, 413 U.S. 376 (1973). ..... 36 Prayze FM v. FCC, 214 F.3d 245, 248 (2d Cir. 2008)…………………………………..46 R. A. V. v. City of St. Paul, 505 U.S. 377 (1992) .......................................................................... 30 Reno v. ACLU, 521 U.S. 844 (1997)....................................................................................... 20, 21 Richardson v. City of Rutland, 164 Vt. 422, 671 A.2d 1245 (1995) ............................................ 46 vi Roth v. United States, 354 U.S. 476 (1957) .................................................................................. 21 Sandvig v. Barr 16-CV-1368 2020 U.S. Dist. LEXIS 53631 (D.D.C. March 27, 2020) . 34, 35, 42 Sandvig v. Sessions, 315 F. Supp. 3d 1 (D.D.C. 2018) ................................................................. 32 Saponaro v. Grindr, LLC, 93 F. Supp. 3d 319, (D.N.J. 2015)...................................................... 13 See HiQ Labs, Inc. v. LinkedIn Corp., 938 F.3d 985 (9th Cir. 2019) ........................................... 40 Seldon v Magedson, 2012 WL 4475274 (S.D.N.Y. July 10, 2012) .............................................. 13 Shahi v Madden, 2008 VT 25, Para 24, 184 Vt. 320, 949 A.2d 1022 .......................................... 57 Smith v. Goguen, 415 U.S. 566, 573 (1974). ................................................................................ 43 Sony Comput. 
Entm't, Inc. v. Connectix Corp., 203 F.3d 596, 602 (9th Cir. 2000) ............... 20, 21 Sorrell v. IMS Health Inc., 564 U.S. 552 (2011) ........................................................ 21, 28, 29, 37 Spokeo , Inc. v. Robins, 136 S.Ct. 1540, 194 L.Ed.2d 635 (2016) ............................................... 70 State v. Albarelli, 2011 VT 24, 189 Vt. 293, 19 A.3d 130 ........................................................... 49 State v. VanBuren, 2018 VT 95 214 A.3d 791, 822 (Vt. 2019).............................................. 49, 50 States v. Carmichael, 326 F. Supp. 2d 1267 (M.D. Ala. July 20, 2004) ...................................... 20 Taylor v. Town of Cabot, 205 Vt.586, 205 A.2d. 586 (Vt. 2017) ................................................. 44 Taylor, 205 Vt.586, 205 A.2d. 586 .............................................................................................. 45 Texas v. Johnson, 491 U.S. 397 (1989) ........................................................................................ 22 Town of Sherburne v. Carpenter, 155 Vt. 126 (1990). ................................................................. 45 Turner Broad. Sys. v. FCC, 512 U.S. 622, 642 (1994) ................................................................. 28 vii United States v. Bondarenko 2:17-CR-306 JCM, 2019 U.S. Dist. LEXIS 98423 (D. Nev. June 12, 2019) ......................................................................................................................................... 20 United States v. Drew, 259 F.R.D. 449 (C.D. Ca. 2009) .............................................................. 42 United States v. John, 597 F.3d 263 (5th Cir. 2010) .................................................................... 43 United States v. Mass. Water Resources Auth., 256 F.3d 36 (1st Cir. 2001) ............................... 47 United States v. Nosal, 676 F.3d 854 (9th Cir. 2012). .................................................................. 42 United States v. Nutri-cology, 982 F.2d.394 (1992) ..................................................................... 47 United States v. O’Brien, 391 U.S. 367 (1968) ............................................................................ 22 United States v. Playboy Entm't Grp., 529 U.S. 803 (2000)......................................................... 30 United States v. Rodriguez, 628 F.3d 1258 (11th Cir. 2010) ........................................................ 43 United States v. Stevens, 559 U.S. 460 (2010).............................................................................. 21 United States v. Valle, 807 F.3d 508 (2d Cir. 2015) ..................................................................... 42 Universal City Studios v. Corley, 273 F.3d 429 (2d Cir. 2001) ...................... 20, 21, 23, 24, 25, 26 Universal City Studios v. Reimerdes, 111 F. Supp. 2d 294 (S.D.N.Y. 2000) ............................... 20 Universal Commc’n Sys., Inc. v. Lycos, Inc., 478 F.3d 413 (1st Cir. 2007) ................................. 12 Vance v. Universal Amusement Co., 445 U.S. 308 (1980) ........................................................... 46 Vance v. Universal, 445 U.S. 308, (1980) .................................................................................... 36 Vermont Ins. Mgmt., Inc. v Lumbermens’ Mut. Cas, Co., 171 Vt. 601, 604, 764 A.2d 1213 (2000) ........................................................................................................................................ 
56 WEC Carolina Energy Sols. LLC v. Miller, 687 F.3d 199 (4th Cir. 2012) .................................. 42 Weinberger v. Romero–Barcelo, 456 U.S. 305, 102 S.Ct. 1798, 72 L.Ed.2d 91 (1982) .............. 47 Weinstein v Leonard, 2015 VT 136, 134 A.3d 547 ...................................................................... 56 viii Zeran v America Online, Inc., 129 F.3d 327 (4th Cir. 1997)........................................................ 13 Zhang v. Baidu.Com, Inc., 10 F. Supp. 3d 433 (S.D.N.Y. 2014). ................................................ 27 STATUTES 13 V.S.A. 2605........................................................................................................................ 48, 55 13 V.S.A. 2606 (b)(1) ................................................................................................................... 50 13 V.S.A. 2606(d)(1). ................................................................................................................... 50 15 U.S.C.§ 45(n). .......................................................................................................................... 62 47 U.S.C. § 230 ................................................................................................................. 11, 13, 14 9 V.S.A. § 2451a(a) ...................................................................................................................... 65 Restatement (Second) of Torts § 652A (1977). ............................................................................ 56 OTHER AUTHORITIES Eugene Volokh and Donald K. Falk, "Google First Amendment Protection for Search Engine Search Results," 8 J.L. Econ. & Pol'y 883, 888-89 (2012). ...................................................... 27 Orin S. Kerr Norms of Computer Trespass, 116 Colum. L. Rev. 1143 (May 2016) ................... 43 Orin S. Kerr Vagueness Challenges to the Computer Fraud and Abuse Act, 94 MINN. L. REV. 1561, 1562-63 (May 2010) ....................................................................................................... 43 Samuel D. Warren & Louis D. Brandeis, Right to Privacy, 4 Harv. L. Rev. 193 ........................ 49 William L. Prosser, Privacy, 48 Calif. L. Rev. 383 at 391, 394 (1960)....................................... 49 ix Introduction Defendant Clearview AI, Inc. ("Clearview") hereby opposes the Vermont Attorney General's Motion for a Preliminary Injunction for the reasons argued below. First, the Vermont Attorney General's action is preempted by the federal Communications Decency Act § 230(c)(1), which immunizes interactive computer services like Clearview whose search engines use third-party internet content. This is a threshold issue that bars Vermont's action in its entirety, and with prejudice. Second, as the Second Circuit holds, computer languages like Clearview AI's search engine and biometric facial recognition algorithm are protected speech under the First Amendment of the United States Constitution, and thus they also enjoy the free speech protections of Article 13 of the Vermont Constitution. This means that not only is Vermont's content-based attempt to regulate Clearview AI's protected speech subject to strict scrutiny, it is an unconstitutional prior restraint on Clearview's, and the public's, First Amendment rights to access and use public data on the internet. 
Third, Vermont's action is void for vagueness under the Fifth and Fourteenth Amendments' Due Process Clauses as applied here, because neither the Vermont Legislature nor the Vermont courts have endorsed the shadow privacy rights the Attorney General is asserting.

Fourth, Vermont is subject to, and cannot meet, the traditional elements for obtaining a preliminary injunction.

Fifth, Vermont lacks standing because there is no concrete, material injury.

For all these reasons this Court should deny Vermont's Motion for a Preliminary Injunction.

Background

In 2016, Hoan Ton-That, an Australian-American immigrant with Vietnamese roots, moved to New York City.1 Interested in complex, intricate systems, he started reading academic papers on artificial intelligence, neural net machine learning, and computer vision in his spare time. Conceiving of a way to address the accuracy problems raised in what he was reading, Ton-That recruited a pair of engineers. With the help of a financial backer he met at an event that same year, they launched a small startup named Clearview AI, Inc. One engineer worked on a search engine to gather photos from the internet, including news sites, and the other worked on developing Ton-That's idea for a mathematical facial biometric algorithm drawn from his study of academic papers.2

Together, Clearview AI's public photo search engine and facial biometric algorithm search engine combine to create a state-of-the-art neural net that algorithmically converts the publicly collected photos into mathematical formulas based on facial geometry.3 "The algorithm doesn't use race, gender, or age as metrics."4 Instead, it uses a 512-point vector which turns the image into a mathematical representation of the face.5 "The user photo's vector is then run against Clearview's database, which have already been vectorized, to identify other images that are very similar to the searched."6 Like the latest generation of facial recognition technology coming to the fore in America (there are currently over 100 facial recognition companies),7 Clearview AI does not suffer from the early problems of racial bias that first iterations of facial recognition tech were prone to.

1 Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, N.Y. Times, Jan. 20, 2020, at A1, available at https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facialrecognition.html (last accessed Apr. 10, 2020).
2 Id.
3 Id.
4 Mulcaire Aff. Ex. A, at ¶ 4.
5 Id. at 9.
6 Id.

How the Clearview AI App Works

Clearview AI ("Clearview AI" or the "App") is a public photo search engine that uses a proprietary algorithm to match a face on a user-inputted photo with public photos collected from the internet.8 Clearview only collects publicly accessible photos available to anyone with an internet connection.9 If a photo is marked private on a social media site, Clearview does not index or cache it, and it is not searched when a user inputs a photo search query.10 Clearview AI's search engine is similar to standard internet search engines like Google's or Bing's.
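By way of illustration only (this is not Clearview's proprietary code; the embedding values, example URLs, and similarity threshold below are hypothetical stand-ins), the vector-matching step described above can be sketched in a few lines of Python:

```python
# Conceptual sketch only -- not Clearview's software. It shows how a query
# photo's face vector could be compared against a set of pre-computed
# ("vectorized") photos using cosine similarity. All URLs, the 512-dimension
# vectors, and the 0.8 threshold are hypothetical.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Return the cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(seed=0)
database = {  # stand-in for a database of already-vectorized public photos
    "https://example.com/photo-1.jpg": rng.normal(size=512),
    "https://example.com/photo-2.jpg": rng.normal(size=512),
}
query_vector = rng.normal(size=512)  # stand-in for the vector of the user's photo

# Rank the cached photos by similarity to the query vector.
ranked = sorted(
    ((cosine_similarity(query_vector, vector), url) for url, vector in database.items()),
    reverse=True,
)
for score, url in ranked:
    label = "possible match" if score > 0.8 else "no match"  # illustrative threshold
    print(f"{url}: similarity {score:.3f} ({label})")
```

A score above the threshold is only a lead: as described below, the user follows the public link and can confirm or reject the purported match with the naked eye.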
Clearview uses automated computer programs called "bots," of a particular type called "spiders," that "crawl" the internet indexing, copying, and caching information on Clearview AI's secure servers.11 These are the same servers used by major financial institutions and Fortune 500 companies.12 Search engine speed comes from the fact that the spiders have already indexed the internet and cached it on a quickly accessible server.13 Like Google and Bing, Clearview AI caches the images it collects on servers to make its searches more efficient. But unlike Google and Bing, Clearview AI collects far less data.14

7 See About Face: Examining the Department of Homeland Security's Use of Facial Recognition and Other Biometric Technologies, Part II: Hearing before the H. Comm. on Homeland Security, 116th Cong. 6 (2020) (Testimony of John P. Wagner) (https://docs.house.gov/meetings/HM/HM00/20200206/110460/HHRG-116-HM00-Wstate-WagnerJ20200206.pdf) (last accessed April 10, 2020).
8 Mulcaire Aff. Ex. A, at ¶ 7.
9 Id. at ¶ 2.
10 Id. at ¶ 8.
11 Id. at ¶ 2.
12 Id. at ¶ 10.
13 Id. at ¶ 2.

Clearview AI doesn't collect what the law calls Personally Identifying Information ("PII").15 This means it doesn't intentionally collect names, addresses, social security numbers, account numbers, or the like. Nor does the App match names to photos.16 All the App collects is public photos from public servers, and whatever incidental metadata may be attached to a photo, if any.17 "Approximately 10% of all photos indexed by Clearview contain EXIF metadata."18 EXIF metadata is the type of data that identifies the geolocation of where a photo was taken.19 So few photos carry EXIF metadata because the large social media platforms strip that data from photos uploaded onto their servers.20 Thus, Clearview AI has little access to the data necessary to identify someone, and no way of identifying who, if anyone, in its database is a Vermont citizen unless that information somehow is in the metadata.21 Clearview's capacity to identify the name and location of a person pictured in its database is the same capacity as any user who copies a publicly available photo online and reads the metadata manually.

All of the data created by the neural net algorithm is stored on secured servers separate from the database of photos.22 The form that biometric data takes is a mathematical equation, a series of numbers, which can only be accessed in a readable and understandable manner with the aid of a database of photos, the neural net code, and a database of, or access to, the URLs where those photos are stored on the internet.23 This is a high hurdle for any hacker to surmount, even if they could breach Clearview's biometric data server. And this biometric data is based solely on a separate database of photos compiled from publicly available photos that are already on the internet.24

14 Id.
15 Id. at ¶ 4.
16 Id. at ¶ 3.
17 Id. at ¶ 14.
18 Id. at ¶ 16.
19 Id.
20 Id.
21 Id. at ¶ 15.
22 Id. at ¶¶ 11-12.

The image below shows a search that undersigned counsel ran via the App on an iPhone, using a photo of Rembrandt's Self Portrait at the Norton Simon Museum in Pasadena, California. The small circle with the face of Rembrandt at the bottom of the page is the original photo of the Rembrandt at the museum; the other photos, 325 of them that you can scroll through, are matches with photos on the internet. If you tap any of the photos you are linked to the public website where that photo is located.
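As a purely illustrative aside (the file name below is hypothetical, and this snippet is not part of Clearview's software), the EXIF metadata discussed above can be read from any saved public photo with a commonly available image library such as Pillow:

```python
# Purely illustrative sketch, not Clearview software: reading whatever EXIF
# metadata (camera model, timestamp, GPS tags) is embedded in a saved photo.
# "photo.jpg" is a hypothetical file name; this assumes the Pillow library.
from PIL import Image
from PIL.ExifTags import GPSTAGS, TAGS

image = Image.open("photo.jpg")
exif_data = image.getexif()  # empty mapping if the photo carries no EXIF data

# Translate numeric tag IDs into human-readable names.
exif = {TAGS.get(tag_id, tag_id): value for tag_id, value in exif_data.items()}
gps_ifd = exif_data.get_ifd(0x8825)  # 0x8825 is the standard GPSInfo tag
gps = {GPSTAGS.get(tag_id, tag_id): value for tag_id, value in gps_ifd.items()}

# Most photos uploaded to large social media platforms have had these tags
# stripped, so the dictionaries below are frequently empty.
print("Camera model:", exif.get("Model"))
print("Timestamp:", exif.get("DateTime"))
print("GPS tags:", gps)
```

Reading these tags requires no special access: they are available to anyone who downloads the photo from its public URL, and, as noted above, most large platforms strip them on upload.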
That search result screen is the extent of what a user sees when they use the App. Note that nowhere are the photos identified as Rembrandt, and the names that do appear on the images are the public URLs of the sites that have the photo. It is the investigator who determines the identity of the photo search results by following the public links.25 And unlike forensic results with DNA or fingerprints, facial matches between two photos are easily checked for accuracy with the naked eye.

23 Id. at ¶ 13.
24 Id. at ¶ 2.
25 Id. at ¶ 3.

[Screenshot: the App's search results screen for the "Rembrandt Self Portrait" query.]

For the last 20 years, prior to the App, police departments had limited access to facial recognition technology and could only search government images, such as driver's license photos and mug shots.26 After a few law enforcement agencies had success with the App, word of mouth spread, and soon hundreds of law enforcement agencies and major companies with security needs were using the App at the federal, state, and municipal levels.27

Law Enforcements' Use of Clearview AI

Federal, state, and local law enforcement have used the App to help solve crimes all over the country.28 The types of crimes run the gamut from murder and child abuse to sex trafficking, theft, fraud, and battery.29 Over 600 law enforcement departments have used the App in the last year.30 The App is an effective investigative tool in criminal cases,31 particularly child exploitation, abduction, and human trafficking cases.32 In Nevada, a Child Exploitation Investigations Unit had photos in which the abuser appeared in the background.33 Law enforcement databases failed to identify the suspect, but using the App they were able to locate information that led to the true identity of the suspect.34 In Alabama, "a law enforcement analyst investigating the stabbing of a sex trafficking victim identified her suspected pimp and attacker with information Clearview directed the analyst to on the internet."35 Almost 30 years after he was believed to have fled to another country, a man suspected of molesting a 10-year-old child in California was identified using the App.36

26 Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, N.Y. Times, Jan. 20, 2020, at A1.
27 Mulcaire Aff. Ex. A, at ¶ 20.
28 Id. at ¶¶ 18-19.
29 Id. at ¶ 19.
30 Id.
31 Webb Aff. Ex. C.
32 Id.
33 Garrison Aff. Ex. B, at ¶ 4.
34 Id.

Public Interest Uses of Clearview AI

A variety of technological tools are being deployed in the current pandemic, including Google's geolocation data being used to track the spread of infection, and biometric and other computer vision technologies helping find people who may have been infected by someone who tested positive.37 Had the App been applied to public social media postings related to the Westport house party that spawned thousands of Covid-19 cases, those exposed could have been notified of the need to immediately self-quarantine, and perhaps the spread of the virus could have been prevented.38 Biometric facial recognition technology has numerous benefits for the community beyond public safety, one of them being its potential as an epidemiological tool. The field of computer vision that facial recognition is part of has important medical, scientific, industrial, and public interest applications, to say nothing of its artistic potential.

35 Id. at ¶ 5.
36 Id. at ¶ 7.
37 See Tony Romm, Google Taps Vast Trove of Location Data to Aid Global Effort to Combat Coronavirus, Wash. Post, April 3, 2020, available at https://www.washingtonpost.com/technology/2020/04/03/google-data-distancing-coronavirus/ (last accessed April 5, 2020).
38 Elizabeth Williamson and Kristin Hussey, Party Zero: How a Soirée in Connecticut Became a 'Super Spreader', N.Y. Times, March 23, 2020, available at https://www.nytimes.com/2020/03/23/us/coronavirus-westport-connecticut-party-zero.html (last accessed April 9, 2020).
Computer vision is a major academic field.39 Computer vision can help the blind see, guide driverless cars, guide robots through dangerous jobs, and help the disabled communicate through technologies like eye and muscle tracking. Biometric algorithms are critical to that field, and computer vision is still a long way from being as good as the human visual system.

The Media Firestorm

Clearview AI was a small startup still developing its service when the New York Times ran a front-page Sunday story proclaiming that the end of privacy was nigh because of the App.40 A flurry of media attention followed, including aggressive investigative reporting by BuzzFeed and follow-up stories by the New York Times and others, much of it politically and class focused, decrying the politics of those involved with Clearview, or the fact that wealthy investors were allowed to try the App in the ordinary course of due diligence.41 Despite this, there still are no reported abuses of Clearview's App.42

Cyber Security

As part of the media firestorm and political condemnation, Clearview AI was under constant attack by hackers. Despite this, no one ever breached its servers or gained access to personally identifiable information, because Clearview doesn't collect PII. Clearview restricts access to its image database to only a small number of employees with the highest administrative access. It is Clearview's policy not to access any client search histories (if the client has not disabled search history) unless the client requests it, or unless necessary to enforce the Terms of Service.43

39 See, e.g., University of California at Berkeley, Computer Vision Group, available at https://www2.eecs.berkeley.edu/Research/Projects/CS/vision/ (last accessed April 10, 2020).
40 Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, N.Y. Times, Jan. 20, 2020, at A1.
41 Id.
42 Mulcaire Aff. Ex. A, at ¶ 20.
43 Id. at ¶ 12.

The only incident of unauthorized access to any information possessed by Clearview AI was in February 2020, when a hacker copied an old customer list and the number of searches performed by customers (but not search history content) from a developer version of the App. No facial images or any other biometric data were accessed as part of the incident. Clearview AI has never suffered a loss of biometric data since its inception. The information accessed in the February 2020 incident was limited to Clearview AI's list of client organizations, the number of accounts they hold, and the number of searches they had performed. The vulnerability that led to this incident of unauthorized access has been eliminated.

Argument

The Court should deny Vermont's motion for a preliminary injunction because:

First, the Vermont Attorney General's action is preempted by the federal Communications Decency Act § 230, which provides near absolute immunity for interactive computer services' use of third-party content on the internet. This is a threshold issue that bars Vermont's action in its entirety, and with prejudice.
Second, as the Second Circuit holds, computer languages like Clearview AI's biometric facial recognition algorithm are protected speech under the First Amendment of the United States Constitution, as well as the Vermont Constitution's Chapter I, Article 13 Free Speech and Expression guarantees. This means that Vermont's content-based attempt to regulate Clearview's protected speech is subject to strict scrutiny, which is fatal to its case on the merits. Moreover, because Vermont's Preliminary Injunction Motion seeks to restrain Clearview's protected speech, issuance of the injunction constitutes an unconstitutional prior restraint.

Third, Vermont's action is void for vagueness as applied under the Due Process Clauses of the Fifth and Fourteenth Amendments, as no reasonable person would be on notice of what Vermont is claiming is illegal.

Fourth, Vermont must, and cannot, meet the standard elements for obtaining a preliminary injunction.

Fifth, Vermont lacks standing because there's no concrete, material injury.

This Court should deny Vermont's motion for a preliminary injunction.

I. Communications Decency Act § 230 Preempts Vermont's Action

The federal Communications Decency Act § 230 ("CDA") states, "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."44 This broad, almost absolute immunity preempts state laws to the contrary.45 Simply put, "the CDA bars the institution of a cause of action or imposition of liability under 'any State or local law that is inconsistent' with the terms of § 230."46

44 47 U.S.C. § 230(c)(1).
45 47 U.S.C. § 230(e)(3) ("Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.").
46 See Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250, 254 (4th Cir. 2009).

Clearview is entitled to immunity under the CDA because: (1) Defendant is an interactive computer service provider or user; (2) Plaintiff's claims are based on "information provided by another information content provider;" and (3) Plaintiff's claims would treat Defendant as the "publisher or speaker" of such information.47 Although often said to be an affirmative defense, courts routinely dismiss cases at the pleading stage based on CDA § 230 immunity when the immunity is apparent from the face of the complaint.48 Immunity under § 230 should be determined at the "earliest possible stage of the case because that immunity protects [interactive computer services] not only from ultimate liability, but also from having to fight costly and protracted legal battles."49 Thus, the Court should decide this threshold issue before granting any hearings on the motion for a preliminary injunction.

A. Clearview is an Interactive Computer Service

The CDA defines an interactive computer service ("ICS") as "any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server."50 Clearview AI's public photo search engine fits this definition.

47 See Doctor's Assocs., Inc. v. QIP Holders, LLC ("Doctor's Assocs. I"), No. 06-cv-1710, 2007 WL 1186026, at *2 (D. Conn. Apr. 19, 2007) (citing Universal Commc'n Sys., Inc. v. Lycos, Inc., 478 F.3d 413, 418 (1st Cir. 2007)).
48 See, e.g., Marshall's Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263, 1267 (D.C. Cir. 2019) (noting "[p]reemption under the Communications Decency Act is an affirmative defense, but it can still support a motion to dismiss if the statute's barrier to suit is evident from the face of the complaint."); Force v. Facebook, Inc., 934 F.3d 53, 63 (2d Cir. 2019) ("application of Section 230(c)(1) is appropriate at the pleading stage when, as here, the "statute's barrier to suit is evident from the face of" plaintiffs' proposed complaint.").
49 Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250, 255 (4th Cir. 2009) (citations omitted).
50 47 U.S.C. § 230(f)(2).

Clearview AI enables computer access by multiple users, its subscribers, to computer servers, just as other interactive computer services hosting online message boards, gossip sites, or online marketplaces do.51 Courts routinely grant CDA § 230 immunity to search engines, business reviews, online marketplaces, online message boards, gossip sites, and a host of other internet services.52

B. Vermont's Claims are Based on Third-Party Content

Merriam-Webster defines a search engine as "computer software used to search data (such as text or a database) for specified information."53 Clearview publishes only an already existing public photo, and the public link to that photo. Subscribers input a photo search query, which then searches Clearview's database of public photos for a match. The Attorney General seeks to prohibit Clearview from accessing or using publicly distributed photos, none of which Clearview AI created.

Clearview AI's republication of third-party content is the result of its search engine algorithm, which in this instance happens to be a biometric facial algorithm. The underlying technology does not transform Clearview into an information content provider that would be ineligible for CDA immunity. The CDA protects the publication of search engine results.54 This protection has persisted even as search engines have evolved from merely searching for links to delivering algorithmically generated GPS "pins" on Google Maps.55 That a search engine runs on increasingly advanced software – in this case, software capable of translating a photograph into a "searchable" result – does not alter the fact that it remains a search engine.

51 See 47 U.S.C. § 230(f)(2).
52 See, e.g., Saponaro v. Grindr, LLC, 93 F. Supp. 3d 319 (D.N.J. 2015); Green v. America Online (AOL), 318 F.3d 465 (3d Cir. 2003) (online message board); Barnes v. Yahoo!, Inc., 570 F.3d 1096 (9th Cir. 2009) (search engine); Gibson v. Craigslist, Inc., 2009 WL 1704355 (S.D.N.Y. June 15, 2009) (online marketplace); Jones v. Dirty World Entm't Recordings LLC, 755 F.3d 398 (6th Cir. 2014) (gossip site); Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997) (search engine); Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016) (online marketplace); Seldon v. Magedson, 2012 WL 4475274 (S.D.N.Y. July 10, 2012) (business review); FTC v. Accusearch, Inc., 570 F.3d 1187 (10th Cir. 2009) (search); Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157 (9th Cir. 2008) (marketplace); O'Kroley v. Fastcase Inc., 2014 WL 2881526 (M.D. Tenn. June 25, 2014) (search engine).
53 See "search engine" at Merriam-Webster.com, accessible at https://www.merriamwebster.com/dictionary/search%20engine (last accessed April 7, 2020).
Search engines are the core technology that has fulfilled Congress's policy to "promote the continued development of the internet."56 The almost unfathomable amount of content available on the internet is functionally inaccessible without them. To stifle search engine innovation would necessarily inhibit the internet's future evolution.

The CDA defines "information content provider" as an entity that is "responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service."57 Clearview is not an information content provider because "a website does not create or develop content when it merely provides a neutral means by which third parties can post information of their own independent choosing online."58 Clearview's publication of its biometric facial algorithm's results does not make it an information content provider any more than Google becomes one when it publishes its search algorithm results. Simply put, "[i]f a website displays content that is created entirely by third parties, ... [it] is immune from claims predicated on that content."59

54 See, e.g., Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1101 (9th Cir. 2009), as amended (Sept. 28, 2009) (holding that plaintiff's negligence claim against search engine for improperly removing dating profile was barred by CDA § 230); O'Kroley v. Fastcase, Inc., 831 F.3d 352, 355 (6th Cir. 2016) (holding that under CDA § 230 search engines cannot be held liable for the manner in which results are displayed); Marshall's Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263 (D.C. Cir. 2019) (holding that CDA § 230 immunized search engine for publishing false information by third parties).
55 See, e.g., Parker v. Google, Inc., 422 F. Supp. 2d 492, 500–01 (E.D. Pa. 2006), aff'd, 242 F. App'x 833 (3d Cir. 2007) (holding that Google was immune under § 230 from suit for linking to defamatory UseNet posts); Marshall's Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263, 1267 (D.C. Cir. 2019) (holding that Google was immune under § 230 for the algorithmic generation of pinpoints on a map).
56 47 U.S.C. § 230(b)(1).
57 F.T.C. v. Accusearch Inc., 570 F.3d 1187, 1199 (10th Cir. 2009) ("We therefore conclude that a service provider is 'responsible' for the development of offensive content only if it in some way specifically encourages development of what is offensive about the content.").

Clearview did not plausibly have a role in the "creation" of the publicly accessible third-party photographs its search engines access. Courts have conducted a searching inquiry into the meaning of the word "developed" as used within the statute, and determined it can be construed as "covering even those who are responsible for the development of content only 'in part.'"60 However, "to be 'responsible' for the development of offensive content, one must be more than a neutral conduit for that content. [A] service provider is 'responsible' . . . if it in some way specifically encourages development of what is offensive about the content."61 An entity "helps to develop unlawful content, and thus falls within the exception to section 230, if it contributes materially to the alleged illegality of the conduct."62 Put another way, "[a] defendant…will not be held responsible unless it assisted in the development of what made the content unlawful."63

This is a high bar that requires a level of intervention or modification of information not contemplated by Clearview. For example, the Second Circuit found that a company that affirmatively played a role in editing "fake news" websites promoting diets could be considered an information content provider.64 The Tenth Circuit found that a company that actively solicited confidential phone records and sold that information to third parties could be considered an information content provider.65 Clearview only finds information that already exists and provides it to end users, without any modification and without violating confidentiality. Clearview delivers results through a neural algorithm, the use of which is insufficient to transform an ICS into an ICP.66 Simply providing end users a neutral tool is insufficient to render an ICS a "developer" of information.67 Similarly, filtering information, drop-down menus, and search engine results do not cross the threshold of turning an ICS into an ICP.68

58 Bennett v. Google, LLC, 882 F.3d 1163, 1167 (D.C. Cir. 2018).
59 O'Kroley v. Fastcase, Inc., 831 F.3d 352, 355 (6th Cir. 2016).
60 Accusearch, 570 F.3d at 1197 ("[T]here may be several information content providers with respect to a single item of information (each being 'responsible,' at least 'in part,' for its 'creation or development').") (citation omitted).
61 Accusearch, 570 F.3d at 1198-99; accord Roommates.com, 521 F.3d at 1171-72 & n.33, 1175.
62 Roommates.com, 521 F.3d at 1167-68.
63 Fed. Trade Comm'n v. LeadClick Media, LLC, 838 F.3d 158, 174 (2d Cir. 2016).
64 See id. at 164 ("LeadClick employees requested content edits to some of its affiliates using fake news sites.").
65 See F.T.C. v. Accusearch Inc., 570 F.3d 1187, 1192 (10th Cir. 2009) (noting that "[a]cquisition of [telephone records] would almost inevitably require someone to violate the Telecommunications Act or to circumvent it by fraud or theft.").
66 See Marshall's Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263, 1270 (D.C. Cir. 2019) (holding "defendants' translation of third-party information into map pinpoints does not convert them into 'information content providers' because defendants use a neutral algorithm to make that translation").
67 See Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1169 (9th Cir. 2008) ("providing neutral tools to carry out what may be unlawful or illicit searches does not amount to "development" for purposes of the immunity exception.") (emphasis added).
68 See Fair Hous. Council of San Fernando Valley v. Roommates.Com, LLC, 521 F.3d 1157, 1169 (9th Cir. 2008) (regarding drop-down menus); O'Kroley v. Fastcase, Inc., 831 F.3d 352, 355 (6th Cir. 2016) (holding that automated editorial acts such as removing spaces and altering fonts in search results are traditional publication functions and protected by CDA § 230); Marshall's Locksmith Serv. Inc. v. Google, LLC, 925 F.3d 1263, 1271 (D.C. Cir. 2019) (holding that a search engine's rendering of information provided by third parties into "map pinpoints" does not convert it into an information content provider because a neutral algorithm was used to make the translation).

C. Vermont is treating Clearview as a Speaker and Publisher
Vermont claims that, “at a minimum, Clearview ‘must obtain the other party’s consent before’ using consumers’ photos from any website.”69 Google, in contrast, is said to “respect the Terms of Service of the websites they visit."70 But Google searches are filled with information that individuals wanted to remain private, such as nonconsensually distributed intimate images. 71 Nevertheless, Google has repeatedly been protected by § 230 because courts have correctly viewed it as a publisher. The core purpose of § 230 was to prevent interactive computer services, like Clearview, from being treated as the publisher or speaker of information provided by third parties in order to protect interactive computer services from ruinous litigation. At issue here are third-party images which are a form of information. Vermont would treat Clearview as the publisher of these images. But the Second Circuit (and others) have, "generally looked to [the word ‘publisher’] ordinary meanings: 'one that makes public' or 'the reproducer of a work intended for public consumption'.”72 Decades of case law have resulted in a, “capacious conception of what it means to treat a website operator as the publisher ... of information provided by a third party.” 73 Vermont’s claims against Clearview are derived from how end-users may interpret content published by third parties. The purpose of the lawsuit is to deprive Clearview of its 69 Compl. at ¶ 40. Compl. at ¶ 39. 71 See Carrie Goldberg How Google Destroyed the Lives of Revenge Porn Victims, N.Y. POST, Aug. 17, 2019, accessible at https://nypost.com/2019/08/17/how-google-has-destroyed-the-lives-ofrevenge-porn-victims/, (last accessed April 10, 2020). 72 Force v. Facebook, Inc., 934 F.3d 53, 65 (2d Cir. 2019) (quoting Klayman v. Zuckerberg, 753 F.3d 1354, 1359 (D.C. Cir. 2014). 73 Id. 70 17 decision as to whether it should republish existing images, despite the fact that,“[a]t its core, § 230 bars ‘lawsuits seeking to hold a service provider liable for its exercise of a publisher's traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content.”74 Vermont is seeking to hold Clearview responsible for creating a means of efficiently locating an image that already exists on the “public” internet, which the CDA incentivizes. The Tenth Circuit noted that the purpose of the CDA was to “encourage Internet services that increase the flow of information by protecting them from liability when independent persons negligently or intentionally use those services to supply harmful content." 75 Clearview’s use a complex algorithm does not negate the fact that it is performing the traditional role of a publisher. The Second Circuit emphatically rejected a claim that Facebook’s “matching” algorithm deprecated its status as a publisher. The Second Circuit has stated that, “we find no basis in the ordinary meaning of ‘publisher,’ the other text of Section 230, or decisions interpreting Section 230, for concluding that an interactive computer service is not the ‘publisher’ of third-party information when it uses tools such as algorithms that are designed to match that information with a consumer's interests." 76 The Court further observed that, “arranging and distributing third-party information inherently forms connections and matches among speakers, content, and viewers of content, whether in interactive internet forums or in more traditional media…is an essential result of publishing.” 77 Clearview’s consolidation of 74 See Fed. Trade Comm'n v. 
LeadClick Media, LLC, 838 F.3d 158, 174 (2d Cir. 2016).
75 F.T.C. v. Accusearch Inc., 570 F.3d 1187, 1199 (10th Cir. 2009).
76 Force v. Facebook, Inc., 934 F.3d 53, 66 (2d Cir. 2019).
77 Id.

existing images in a readily searchable form performs a traditional publication function by enabling end users to make connections between the images, and nothing more.

II. Clearview's Computer Code is Protected First Amendment Speech

The Attorney General unconstitutionally seeks to censor Clearview's speech, and the public's right to access and use public information on the internet using a search engine of their choice. Computer languages are a form of speech and expression protected by the First Amendment to the United States Constitution, and Chapter I, Article 13 of the Vermont Constitution.78 Vermont's crusade for content-based restrictions on Clearview's speech - its argued prohibition on Clearview's public internet search engine, the caching of public photos on Clearview's private servers, and Clearview's use of a facial biometric algorithmic search engine - is subject to, and cannot survive, the heightened judicial scrutiny the First Amendment calls for.

Vermont's legal theory unreasonably and dangerously broadens the power of the government, and of data oligopolies like Facebook and Google, over public access to and use of information on the internet stored on publicly facing private servers. Creative drafting and prosecution of terms of service agreements leads to the government and social media data oligopolies controlling access to, and use of, public data. This is a threat to internet and computer speech, including the publication of expressive computer languages like search engines.

78 See U.S. CONST. amend. I ("Congress shall make no law . . . abridging the freedom of speech"); VT. CONST. art. 13 ("That the people have a right to freedom of speech, and of writing and publishing their sentiments, concerning the transactions of government, and therefore the freedom of the press ought not to be restrained.").

Vermont's argument that Clearview's facial biometric algorithm and search engines are illegal because they are "immoral, unethical, and oppressive" threatens to chill a host of public interest speech and expression involving computer languages, including the type of computer code, bots, and algorithms at issue here. The First Amendment forbids the kind of content-based restrictions, rooted in state animus and overbreadth, that result from Vermont's vague and ambiguous application of its Consumer Protection and Data Broker laws here.79 Any preliminary injunction would therefore be an unconstitutional prior restraint on speech, and the Court should deny Vermont's motion.

A. The First Amendment Protects Computer Languages

The Second, Fifth, and Sixth Circuit Courts of Appeals hold that computer code is protected First Amendment speech.80 The United States Supreme Court, in striking down a ban on virtual child pornography, held that the First Amendment applies to the internet, and that video games - a product of expressive computer code - enjoy First Amendment protection.81 And the United States Supreme Court has struck down Vermont's attempted regulation of pharmaceutical price data miners despite Vermont's invocation of the public safety.82 And the fact that computer software can be copyrighted is an acknowledgment by the law that computer code is creative and expressive.83

Computer languages are the lingua franca of the 21st Century. The First Amendment covers a broad range of speech and expression, from the profound to the trivial and profane. "Even dry information, devoid of advocacy, political relevance, or artistic expression, has been accorded First Amendment protection."84 The First Amendment protects speech and expressive conduct like: violent video games; torture videos of animals;85 draft card and flag burning;86 virtual child pornography;87 walking in a California courthouse

79 See Reno v. ACLU, 521 U.S. 844, 858-59 (1997) ("[The sections at issue] are informally described as the 'indecent transmission' provision and the 'patently offensive display' provision.").
80 Universal City Studios v. Corley, 273 F.3d 429, 447-48 (2d Cir. 2001); Def. Distributed v. United States Dep't of State, 838 F.3d 451, 458, 469 (5th Cir. 2016) (holding government's national security interest in imposing a restriction on computer code for the 3-D printing of guns outweighed plaintiff-appellants' "very strong constitutional rights."); Junger v. Daley, 209 F.3d 481 (6th Cir. 2000) ("Because computer source code is an expressive means for the exchange of information and ideas about computer programming we hold that it is protected by the First Amendment."); Sony Computer Entm't v. Connectix Corp., 203 F.3d 596, 602 (9th Cir. 2000); see also United States v. Bondarenko, No. 2:17-CR-306 JCM (VCF), 2019 U.S. Dist. LEXIS 98423, at *28 (D. Nev. June 12, 2019) ("Source code, object code, and other computer software languages can involve the expression of ideas."); Green v. United States DOJ, 392 F. Supp. 3d 68, 86 (D.D.C. 2019) ("The Court . . . agrees with plaintiffs that the DMCA and its triennial rulemaking process burden the use and dissemination of computer code, thereby implicating the First Amendment."); United States v. Carmichael, 326 F. Supp. 2d 1267, 1290 (M.D. Ala. July 20, 2004) ("Based on all of the above factors and the evidence presented, the court holds that Carmichael's website is not a 'true threat' and is thus protected by the First Amendment."); Universal City Studios v. Reimerdes, 111 F. Supp. 2d 294, 327 (S.D.N.Y. 2000) ("As computer code--whether source or
234, 245 (2002) ("As a general principle, the First Amendment bars the government from dictating what we see or read or speak or hear."). 86 21 with "Fuck the Draft" on a jacket; 88 crude, disgusting parodies of public figures;89 the abstractions of mathematics and music;90 “the artwork of Jackson Pollack, the music of Arnold Schoenberg; or the Jabberwocky verse of Lewis Carroll.” 91 And it protects expressive computer languages conveying information like Clearview's public photo search engine from state censors labeling it "immoral, unethical, and oppressive." Because, "[i]f there is a bedrock principle underlying the First Amendment, it is that the government may not prohibit the expression of an idea simply because society finds the idea itself offensive or disagreeable." 92 Computer language is the language of the internet, and to allow the government to restrict the access and use of information on the internet based on its dislike of particular algorithms endangers the greatest tool humanity has yet created for the public access, use, and sharing of information. This is why computer code is protected speech, and why courts hold it is. In Universal City Studios v. Corley, the Second Circuit Court held that expressive computer languages that convey information are protected speech under the First Amendment: Computer programs are not exempted from the category of First Amendment speech simply because their instructions require use of a computer. A recipe is no less "speech" because it calls for the use of an oven, and a musical score is no less "speech" because it specifies performance on an electric guitar. Arguably distinguishing computer programs from conventional language instructions is the fact that programs are executable on a computer But the fact that a program has the capacity to direct the 88 Cohen v. California, 403 U.S. 15, 21 (1971) ("The ability of government, consonant with the Constitution, to shut off discourse solely to protect others from hearing it is, in other words, dependent upon a showing that substantial privacy interests are being invaded in an essentially intolerable manner."). 89 Hustler Magazine v. Falwell, 485 U.S. 46 (1988). 90 See United States v. O’Brien, 391 U.S. 367 (1968). 91 Junger v. Daley, 209 F.3d 481, 484 (6th Cir. 2000). 92 Texas v. Johnson, 491 U.S. at 414 (quoted by Brooklyn Inst. of Arts & Scis v. City of New York & Rudolph W. Giuliani, 64 F. Supp. 2d 184, 198 (E.D.N.Y.1999).). 22 functioning of a computer does not mean that it lacks the additional capacity to convey information, and it is the conveying of information that renders instructions "speech" for purposes of the First Amendment.93 Clearview AI's algorithmic search engines, including its facial biometric algorithm, are protected speech because they convey information in reaction to search queries from Clearview's users. The information conveyed includes the facts that the algorithm has determined there's a high probability of a match between a user input photo and a photo publicly available on the internet; and that the users can choose to access that public image to investigate further by clicking the photo to link to its location on the public internet. This information is then evaluated by the user, who can choose to follow up on any of the purported matches by clicking on them. 
Alternatively, because facial recognition matches can be checked by the naked eye in a way that fingerprints or DNA tests cannot, the user can visually inspect the information and reject that the algorithm is correct in saying there's a match. This is analogous to listening to recorded music. A digital recording is nothing more than binary zeroes and ones converted into sound when played. But no one would argue that a musical score isn't protected artistic expression, or that the musical score was protected expression but the act of listening to the music wasn't. Likewise, Clearview AI's computer code is like a musical score, and a user reviewing search results is like someone listening to a music score - someone who has accessed and used information. In one instance musical notation 93 Universal City Studios v. Corley, 273 F.3d 429, 447-48 (2d Cir. 2001). "Computer code conveying information is 'speech' within the meaning of the First Amendment." id. at 450-51 23 converted into music as sound, in the other a biometric facial algorithm converted into information about match probabilities between faces in two photos. Corley was an appeal challenging the constitutionality of the Digital Millennium Copyright Act, and the validity of a permanent injunction to enforce the DMCA. The injunction prohibited the defendant from publishing, and to a limited degree linking to, a computer code called DeCSS. DeCSS decrypted copyright protection encryption for movies on DVD and thereby facilitated their illegal copying and sale. 94 On appeal the appellant argued that the DMCA and the Injunction against appellant publishing DeCSS violated the First Amendment because it imposed a content-based restriction on the publication of computer code. The case attracted significant input from amicus on both sides of the issues. 95 The Corley court adapted the Sixth Circuit's First Amendment analysis in Junger v. Daley which separated the encryption computer code at issue into expressive speech components and functional components.96 Corley develops the analysis further, aiming to separate the expressive speech component of computer language that conveys information and engages a user, from the purely functional aspects of computer code that require no active response from a user.97 Under Corley, restrictions on expressive computer languages that convey information are 94 Id. at 434-43. Id. at 443. 96 Junger v. Daley, 209 F.3d 481, 484 (6th Cir. 2000) ("The issue of whether or not the First Amendment protects encryption source code is a difficult one because source code has both an expressive feature and a functional feature."). 97 Corley, 273 F.3d 429, 448; see also CFTC v. Vartuli, 228 F.3d 94, 111 (2d Cir. 2000) (holding the First Amendment didn't apply to computer code because in the sales and marketing of the code at issue only its functional aspects were implicated). 95 24 generally content based restrictions subject to strict scrutiny, while restrictions on functional computer languages generally are content neutral and subject to intermediate scrutiny. 98 Corley is careful to point out that standards of review are context dependent and need to be adjusted for the internet. The court in Corley was particularly concerned with the fact that DeCSS could quickly be distributed on the internet - a feature unique to the internet - and that it would foster the harm the DMCA was directly targeting - copyright piracy. With this concern in mind the court examined the expressive and functional features of DeCSS. 
Before starting its analysis, the court noted its preference for narrow holdings when it came to novel technological issues, so the law could "mature on a 'case-by-case' basis." 99 Then it sketched a two-part analysis focused on separating the information-conveying, expressive speech aspects of computer code from its functional aspects that convey little or no information and require no reaction from a user.100 If the computer code is expressive, as here, the government regulation of it is content-based and subject to strict scrutiny review. If the government targets the computer code merely in its functional capacity, with only incidental effect on expressive speech, intermediate scrutiny applies.101 In Corley, the court held that the decryption code in question was only implicated in its functional capacity, and any effect on speech was incidental and counterbalanced by the clear legislative mandates of the DMCA against copyright piracy.102 98 Corley, 273 F.3d at 450-51. Id. at 445 (quoting Name.Space, Inc. v. Network Solutions, Inc., 202 F.3d 573, 584 n.11 (2d Cir. 2000)). 100 Id. at 450-51. 101 Id. at 458. 102 Id. 99 25 Clearview AI's search engines - both those that index and cache public photos from the internet and its facial biometric algorithm that searches those cached public photos - are computer languages that expressively convey information for a computer user to evaluate and act on. A user can review search results from Clearview's biometric facial algorithm, and accept or reject them with their own eyes, knowing that they have the choice to follow up on the results on the public internet. As such, Clearview's public photo search engine is no different than Google, Bing, or Baidu, except that it collects far less information and no PII. Like them, it matches information on the public internet to a specific user search query by running algorithms on public data stored on publicly networked private servers across the internet. To hold that expressive computer languages like the algorithms at the heart of the internet's search engines aren't protected speech is to invite state interference with a primary purpose of the internet and a core First Amendment concern: the free exchange and expression of information and ideas. And it cedes control of public data on the internet to data oligopolies who can manipulate their terms of service agreements to control the public data on their servers. B. Search Engine Results Are Expressive First Amendment Speech In Zhang v. Baidu.com, Inc., the federal district court for the Southern District of New York held that search engine results are constitutionally protected speech. 103 The court applied the First Amendment to algorithmic search engine results, finding that the selection and the arrangement of the results were protected expressive speech: 103 Zhang v. Baidu.Com, Inc., 10 F. Supp. 3d 433 (S.D.N.Y. 2014). 26 [N]or does the fact that search-engine results may be produced algorithmically matter for the analysis. After all, the algorithms themselves were written by human beings, and they "inherently incorporate the search engine company engineers' judgments about what material users are most likely to find responsive to their queries" . . . . When search engines select and arrange others' materials, and add the all-important ordering that causes some materials to be displayed first and others last, they are engaging in fully protected First Amendment expression.
104 Clearview AI's App is a public photo search engine entitled to First Amendment protection like all internet search engines. Algorithmic automation doesn't deprive it of First Amendment protection any more than a musical score being played by a computer deprives its composer of copyright.105 Clearview’s search engine algorithms function the same for the purposes of this inquiry as other search engines. C. Vermont is Engaged in Content Based Speech Discrimination Vermont's animus towards Clearview's caching of public photos available on the internet, and its ire towards biometric facial recognition software are evidence that it is engaged in content-based discrimination. "We have said that the principal inquiry in determining contentneutrality . . . is whether the government has adopted a regulation of speech because of [agreement or] disagreement with the message it conveys." 106 104 Id. Eugene Volokh and Donald K. Falk, "Google First Amendment Protection for Search Engine Search Results," 8 J.L. ECON. & Pol'y 883, 888-89 (2012). 106 Turner Broad. Sys. v. FCC, 512 U.S. 622, 642 (1994) (quoting Ward v. Rock Against Racism, 491 U.S. 781, 791(1989); see also R. A. V. v. St Paul, 505 U.S. 377, 386 (1992) ("The government may not regulate [speech] based on hostility--or favoritism--towards the underlying message expressed"). 105 27 Like the content-based restriction on "indecent" and "patently offensive" messages in Reno v. ACLU, the Attorney General seeks to restrict Clearview's facial biometric algorithm because it is "immoral, unethical, and oppressive." The content-based restriction threatened here is similar to the restraint on speech Vermont attempted in Sorrell v. IMS Health Inc.107 In Sorrell, Vermont advanced several grounds purportedly justifying a limitation on sales of pharmacy records revealing the prescribing practices of individual doctors. Those records were gathered by internet data miners, who in turn, leased the reports to pharmaceutical manufacturers.108 Among a variety of grounds to justify the restriction, Vermont argued, as it does in this case, that the privacy interests of residents justified the restriction on sale of the records. 109 The Supreme Court rejected this argument, holding that the gathering and distribution of the records by commercial data miners was speech protected by the First Amendment, notwithstanding the threatened release of private information. Moreover, the Supreme Court held that heightened scrutiny of the restriction was mandated because the Vermont law focused on the content and the speaker. Vermont’s current effort to limit speech is equally infirm because it focuses on the speech of Clearview and the content that is produced by Clearview’s search engines. As in Sorrell, Vermont in this instance is attempting to regulate Clearview's expressive speech it thinks is harmful to its citizens: it does not like the copying, indexing and caching of public photos from the internet and repeatedly says that Clearview's use of a biometric facial 107 Sorrell v. IMS Health Inc., 564 U.S. 552 (2011)). Id. 109 Id. at 579-80. 108 28 recognition algorithm on that database of publicly available photos will result in a "Orwellian Dystopia."110 Yet Vermont hasn't brought any action like this against any other search engine. Vermont is not attempting to apply a content neutral regulation that applies to all search engines. 
It is singling out Clearview because it doesn't like that Clearview's search engine has collected "3,000,000,000 photos" publicly available on the internet and used them for a subscription facial recognition App targeted to law enforcement. And Vermont in particular is targeting expressive computer code that conveys, through the publication of its search results, the high probability of a match to a user's search query, and that provides choices about what to do with that information. And because Vermont is targeting just one type of search engine - facial recognition algorithms - and not all search engines, its application of the Vermont Consumer Protection and Data Broker Acts is not content neutral. Vermont is categorizing search engines and computer languages by type and is picking which ones it thinks are illegal because it believes them "immoral, unethical, or oppressive." Because this is a content-based restriction, Vermont's lawsuit fails strict scrutiny and should be dismissed with prejudice. 111 D. Strict Scrutiny is Fatal to Vermont's Content Based Restrictions Content-based restrictions on protected speech are subject to strict scrutiny and are invalid unless the State can justify the restriction by a compelling government interest that is 110 Id. at 552. See, e.g., Hill v. Colo., 530 U.S. 703, 719 (2000) ("The principal inquiry in determining content neutrality, in speech cases generally and in time, place, or manner cases in particular, is whether the government has adopted a regulation of speech because of disagreement with the message it conveys."); Def. Distributed v. United States Dep't of State, 838 F.3d 451, 469 (5th Cir. 2016) ("Only because Defense Distributed posted technical data referring to firearms covered generically by the USML does the government purport to require prepublication approval or licensing. This is pure content-based regulation."). 111 29 narrowly drawn to serve that interest.112 It is rare for a content-based restriction on speech to survive strict scrutiny.113 The government must identify an "actual problem," backed up by evidence, that necessarily requires the curtailment of free speech to address in order to survive strict scrutiny.114 Vermont cannot meet that standard because its causes of action and alleged harms are speculative. Neither the Vermont legislature nor the Vermont courts recognize the Attorney General's novel theory of invasion of privacy based on alleged violations of terms of service agreements by internet search engines indexing and caching public information on the internet. Not a single specific Vermont consumer is identified as having any material, concrete harm. Moreover, Clearview is a Delaware corporation with its principal place of business in New York City that does no business, and has no revenue, in Vermont. So this cannot be justified as a reasonable business regulation. There is no compelling interest here that Vermont is narrowly enforcing in the least restrictive way. A judicial declaration that facial recognition technology constitutes an invasion of privacy would substantially burden more speech than any legitimate interest Vermont has in its shadow privacy right that is unrecognized by Vermont's legislature. A narrowly tailored, least restrictive regulation of facial recognition search engines would be one that punished only people who actually caused harm in a way that society explicitly recognizes, whether that be stalking, identity theft, or other recognized crimes. No one seeks to 112 R. A. V. v. City of St.
Paul, 505 U.S. 377 (1992). See Brown v. Entm't Merchs. Ass'n, 564 U.S. 786, 799 (2011) (citing United States v. Playboy Entm't Grp., 529 U.S. 803, 818 (2000)). 114 United States v. Playboy Entm't Grp., 529 U.S. 803, 818 (2000). 113 30 ban phones because criminals use them to plan crimes. That's because "[t]he normal method of deterring unlawful conduct is to impose an appropriate punishment on the person who engages in it," not blanket prohibitions on public access to, and use of, public information. 115 Vermont's proposed de facto ban on facial recognition search engines is an unconstitutional publication ban - it says one cannot publish the results of a class of computer matching algorithms. Where does this stop? Given the large volume of algorithms involved with matching data on the internet, the Vermont Attorney General's lawsuit places a substantial burden on a wide range of protected internet speech. Vermont is not arguing that specific, harmful conduct that occurs because of use of a facial recognition App should be illegal. Vermont is arguing that the use of facial recognition software to search the public internet is always illegal. It doesn't matter to Vermont if the facial match is 100% accurate and the information being published by Clearview AI's public photo search engine is entirely true.116 Vermont's position is incredibly broad in its implications: it forbids the publication of classes of true information, and it forbids a substantial class of academic, medical, and public interest research, speech, and expressive conduct. 117 115 Bartnicki v. Vopper, 532 U.S. 514, 529 (2001). See Bartnicki v. Vopper, 532 U.S. 514, 527-28 (2001) ("More specifically, this Court has repeatedly held that 'if a newspaper lawfully obtains truthful information about a matter of public significance then state officials may not constitutionally punish publication of the information, absent a need . . . of the highest order.'"). 117 Cf. Reno v. ACLU, 521 U.S. 844, 882 (1997) ("The CDA, casting a far darker shadow over free speech, threatens to torch a large segment of the Internet community."); see also Sandvig v. Sessions, 315 F. Supp. 3d 1, 29 (D.D.C. 2018) ("As the Access Provision both limits access to and burdens speech in the public forum that is the public Internet . . . heightened First Amendment scrutiny is appropriate." (citation omitted)). 116 31 Thus, this Court should apply strict scrutiny to Vermont's attempted content-based restrictions and deny its motion for a preliminary injunction. 118 E. A Preliminary Injunction Violates the First Amendment's Prohibitions on Overbreadth and Vagueness If the Attorney General's interpretation of Vermont's Consumer Protection and Data Broker laws is correct, then this Court should strike them down as facially, and as applied, unconstitutional under the First Amendment's prohibitions on overbreadth and vagueness. "'[T]he distinction between facial and as-applied challenges . . . goes to the breadth of the remedy employed by the Court, not what must be pleaded in a complaint.' . . . The substantive rule of law is the same for both challenges."119 i. Facial Overbreadth & Vagueness If the Attorney General is correct that Vermont's Consumer Protection and Data Broker Acts make algorithmic search engines illegal upon the Attorney General's determination that something immoral is going on, this interpretation impacts a substantial range of academic, medical, and public interest protected speech.
120 The statutes' interpretational elasticity, as evidenced here by the Attorney General's prosecution, is a result of their vagueness, and is what leads to their facial overbreadth. And this overbreadth impacts a 118 Brown v. Entm't Merchs. Ass'n, 564 U.S. 786, 799-800 (2011) (applying strict scrutiny to, and striking down, California's attempt to regulate violent video games). 119 Sandvig v. Sessions, 315 F. Supp. 3d 1, 28 (D.D.C. 2018) (citing Edwards v. District of Columbia, 755 F.3d 996, 1001 (D.C. Cir. 2014)). 120 Cf. Reno v. ACLU, 521 U.S. 844, 881 (1997) ("[W]e find no textual support for the Government's submission that material having scientific, educational, or other redeeming social value will necessarily fall outside the CDA's "patently offensive" and "indecent" prohibitions."). 32 substantial amount of protected speech, not only that of internet search engines, but also that of anyone who seeks to use the internet for information. The overbreadth is illustrated by the case of Sandvig v. Sessions ("Sandvig I"), a declaratory judgment action brought by academic researchers who wanted to use methodologies that violated website terms of service agreements to conduct research. The case is also a good example of the perils of predicating tort and criminal liability on breaches of contract like violations of terms of service agreements. Sandvig I involves academic researchers who wanted to use bots, fake user profiles, and other methods that violated terms of service agreements to scrape data from real estate and job websites to see if those websites violated employment and housing laws: Plaintiffs are all aware that their activities will violate certain website ToS. All intend to use scraping to record data, which is banned by many of the websites plaintiffs seek to study. Many of the housing websites that Sandvig and Karahalios will study prohibit the use of bots. All of the hiring websites that Mislove and Wilson will study prohibit the use of sock puppets, and most prohibit crawling. Additionally, some websites control when and how visitors may speak about any information gained through the site—even in other forums—by including non-disparagement clauses in their ToS. Some sites also have ToS that require advance permission before using the sites for research purposes, which, plaintiffs allege, creates the possibility of viewpoint-discriminatory permission schemes. Aside from their ToS violations, plaintiffs' experiments will have at most a minimal impact on the operations of the target websites.121 121 Sandvig v. Sessions, 315 F. Supp. 3d 1, 10 (D.D.C. 2018) (denying DOJ's motion to dismiss on the basis that plaintiffs had made a colorable claim to First Amendment protection) ("Sandvig I"); see also Sandvig v. Barr, No. 16-CV-1368 (JDB), 2020 U.S. Dist. LEXIS 53631 (D.D.C. March 27, 2020) (holding on summary judgment that Plaintiffs' proposed actions didn't constitute a statutory violation of the CFAA) ("Sandvig II"). 33 The researchers sued United States Attorney General Jeff Sessions seeking a declaration that they wouldn't be federally prosecuted under the Computer Fraud and Abuse Act's ("CFAA") prohibitions against unauthorized access based on terms of service violations. Plaintiffs raised First Amendment overbreadth and Fifth Amendment Due Process vagueness arguments, among others.122 The government moved to dismiss. The court dismissed all the counts in the complaint except the count alleging a First Amendment, as applied, challenge claiming the CFAA restricted the researchers' protected speech.
The Sandvig I court found the argument plausible enough to make it pass the motion to dismiss stage. In so doing, it held that the CFAA wasn't a content-based restriction on speech because it didn't target speech, and thus found its application in the researchers' instance to be a content-neutral one. It applied an intermediate scrutiny analysis and found that analysis favorable to the researchers' claim that the CFAA was unconstitutional under the First Amendment as applied. On summary judgment, in Sandvig II, the court held on narrow CFAA statutory grounds that breaches of terms of service agreements didn't violate the CFAA's criminal provisions, thereby avoiding the constitutional issue.123 Sandvig I & II illustrate the substantial range of legitimate public interest speech Vermont's action threatens to chill. The internet is critical to academic research, just as it is the primary source of information for most people in the 21st Century. Vermont's statutes, if the Attorney General's interpretation is correct, substantially and unconstitutionally impact broad 122 123 Sandvig v. Sessions, 315 F. Supp. 3d 1, 10 (D.D.C. 2018). Sandvig v. Barr, No. 16-CV-1368 (JDB), 2020 U.S. Dist. LEXIS 53631 (D.D.C. March 27, 2020). 34 swaths of protected First Amendment activity that academics and ordinary people engage in every day. ii. As Applied Overbreadth & Vagueness For the same reasons Vermont's prosecution is unconstitutionally overbroad and vague under the First Amendment facially, it is unconstitutional as applied. The Vermont legislature didn't intend for the Vermont Consumer Protection and Data Broker Acts to ban algorithms, and neither it nor the Vermont courts have recognized the inchoate and long-distance intrusion upon seclusion the Attorney General argues for. This court should deny the motion for a preliminary injunction. F. Vermont Seeks an Unconstitutional Prior Restraint on Speech The First Amendment abhors the prior restraint of speech and publication. "Prior restraints on speech and publication are the most serious and the least tolerable infringement on First Amendment rights."124 The prevention of prior restraints is one of the core purposes of the First Amendment.125 Thus, any form of prior restraint is presumptively unconstitutional. 126 The preliminary injunction sought by Vermont's Attorney General is an unconstitutional prior restraint on protected speech - computer code. It is even more onerous and constitutionally 124 See Nebraska Press Ass'n v. Stuart, 427 U.S. 539, 559 (1976). See Near v. Minnesota ex rel. Olson, 283 U.S. 697, 713-14 (1931) ("[I]t has been generally, if not universally, considered that it is the chief purpose of [the First Amendment] to prevent previous restraints upon publication."). 126 See Organization for a Better Austin v. Keefe, 402 U.S. 415, 419 (1971) (noting that "any prior restraint on expression[] comes . . . with a heavy presumption against its constitutional validity" (citation omitted)). 125 35 objectionable because it comes in the form of a Preliminary Injunction. The special threat of a preliminary injunction enjoining speech or publication "is that communication will be suppressed . . . before an adequate determination that it is unprotected by the First Amendment."127 The First Amendment issues here are substantial, and this court should not restrain Clearview AI from publicly accessing information on the internet before the issues can be fully litigated. In Vance v.
Universal Amusement Co., the United States Supreme Court held in the context of a preliminary injunction that a state scheme that permitted enjoining alleged obscenity "based on a showing of probable success on the merits and without a final determination of obscenity" was unconstitutional because it permitted "prior restraints of indefinite duration on the exhibition of motion pictures that [had] not been finally adjudicated to be obscene." 128 The Court reasoned further: "[T]hat a state trial judge might be thought more likely than an administrative censor to determine accurately that a work is obscene does not change the unconstitutional character of the restraint if erroneously entered."129 Vermont faces the same issue in trying to preliminarily enjoin Clearview AI's search engines. As discussed above, Vermont is attempting to enjoin, via a content-based restriction, Clearview AI's protected speech as expressed by its search engine algorithms' publication and conveyance of information. It's doing so under a novel theory of privacy law recognized by neither Vermont's legislature nor its courts. The tentative and speculative nature of Vermont's claims, 127 Pittsburgh Press Co. v. Pittsburgh Commission on Human Relations, 413 U.S. 376, 390 (1973). 128 Vance v. Universal Amusement Co., 445 U.S. 308 (1980). 129 Id. 36 combined with the fact that it hasn't identified a single Vermont resident with a concrete harm, counsels this Court against granting the draconian and constitutionally disfavored prior restraint on speech via preliminary injunction. G. There is a First Amendment Right to Public Data on the Internet The First Amendment protects the public's right to access, and use, public data – whether from a public library, a public museum, or the public internet. 130 "An individual's right to speak is implicated when information he or she possesses is subjected to 'restraints on the way in which the information might be used' or disseminated." 131 Moreover, "[a] fundamental principle of the First Amendment is that all persons have access to places where they can speak and listen, and then, after reflection, speak and listen once more." 132 And the public's right to listen and speak includes access to public data openly stored on publicly facing, private servers. 133 Vermont's prosecution threatens to chill the free flow of information necessary for scientific, medical, academic, artistic, and public interest discourse because it seeks to restrict the use of a particular type of internet search engine. The United States Supreme Court recognizes the public's right to access public data on the internet. In 2017, the Court affirmed this First Amendment right in Packingham v. North 130 See, e.g., Brooklyn Inst. of Arts & Scis v. City of N.Y. & Rudolph W. Giuliani, 64 F. Supp. 2d 184 (E.D.N.Y. 1999) (granting the Brooklyn Museum a preliminary injunction against the Mayor for his attack on the museum's funding due to its exhibition of a painting he considered offensive). 131 Sorrell v. IMS Health Inc., 564 U.S. 552, 568 (2011). 132 Packingham v. North Carolina, 137 S. Ct. 1730, 1735 (2017). 133 See HiQ Labs, Inc. v. LinkedIn Corp., 938 F.3d 985 (9th Cir. 2019). 37 Carolina, when it held that restricting sex offenders from social media sites violated the First Amendment: Even with these assumptions about the scope of the law and the State's interest, the statute here enacts a prohibition unprecedented in the scope of First Amendment speech it burdens.
Social media allows users to gain access to information and communicate with one another about it on any subject that might come to mind. By prohibiting sex offenders from using those websites, North Carolina with one broad stroke bars access to what for many are the principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge. These websites can provide perhaps the most powerful mechanisms available to a private citizen to make his or her voice heard. They allow a person with an Internet connection to "become a town crier with a voice that resonates farther than it could from any soapbox."134 Vermont's lawsuit threatens the right of the public at large, as well as Clearview's, to access and use public information on the internet through a search engine of their choosing. Vermont argues that it's illegal for Clearview AI to have cached publicly available photos from the public internet and to use a facial recognition search engine on them. But Vermont isn't targeting any other search engines in this action. Vermont has singled out Clearview AI and is saying that while Google may access those public photos with its search engine and cache them, Clearview may not do the same thing, because Vermont doesn't like facial recognition search engines. The First Amendment forbids this type of content-based discrimination because it 134 Packingham v. North Carolina, 137 S. Ct. 1730, 1737 (2017) (citation omitted). 38 interferes with the free expression, exchange, and use of ideas in the vibrant public forum known as the internet.135 It is not for Vermont to tell anyone what public data they can look at, how they can look at it, and what kind of mathematical algorithms they can run to analyze this public data. Under the free speech protections of the United States Constitution and the Vermont Constitution, both speakers and listeners are protected. This means that Vermont cannot make a particular type of search engine result illegal any more than it could prevent the New York Times from publishing while simultaneously allowing the Wall Street Journal to publish. Vermont tries to avoid its First Amendment problems by arguing that Clearview AI's "scraping" of public data from private servers violates the Terms of Service agreements of the server owners. Bootstrapping a breach of contract theory into a tort and criminal theory of fraud, contrary to traditional contract law principles, Vermont claims that Vermont consumers have been victimized by Clearview because their expectation was that their photos wouldn't be indexed, cached, and searched by a facial recognition algorithm. Vermont spins this into an intrusion on the expectation of privacy on the part of unnamed Vermont consumers with whom Clearview has never done business. But over the last twenty years, the majority of courts that have considered the issue have held that a breach of a terms of service agreement doesn't constitute computer fraud. And they do so because holding that it does would put too much power in the hands of the government to criminally prosecute people for felony computer crimes over trivial infractions like lying about their age or weight in a user 135 See Sandvig v. Sessions, 315 F. Supp. 3d 1, 11-14 (D.D.C. 2018) (discussing internet as a public forum).
39 profile.136And it risks ceding power to data oligopolies running publicly facing private servers housing massive troves of public data. As the federal Ninth Circuit Court of Appeals noted in HiQ v. LinkedIn in 2019: We agree with the district court that giving companies like LinkedIn free rein to decide, on any basis, who can collect and use data—data that the companies do not own, that they otherwise make publicly available to viewers, and that the companies themselves collect and use—risks the possible creation of information monopolies that would disserve the public interest. 137 In HiQ v. LinkedIn, the Ninth Circuit affirmed the issuance of a preliminary injunction enjoining the business networking website LinkedIn from stopping plaintiff hiQ from scraping public data from LinkedIn’s private servers. 138 HiQ scraped data not designated private by LinkedIn's users from LinkedIn's publicly facing private servers, and then analyzed it to see who might be thinking about leaving their job, hiQ then sold the information to companies who wanted to retain talented employees.139 After LinkedIn decided to start selling the same services as hiQ, LinkedIn sent hiQ a cease and desist letter ordering it to stop collecting data from its servers.140 HiQ filed for a declaratory judgment affirming its right to access public data on LinkedIn's private servers, and moved for a preliminary injunction against LinkedIn on the basis 136 See United States v. Drew, 259 F.R.D. 449 (C.D. Ca. 2009). HiQ Labs, Inc. v. LinkedIn Corp., 938 F.3d 985, 991 (9th Cir. 2019). 138 See HiQ Labs, Inc. v. LinkedIn Corp., 938 F.3d 985, 1003-04 (9th Cir. 2019) ("The data hiQ seeks to access is not owned by LinkedIn and has not been demarcated by LinkedIn as private using such an authorization system"). LinkedIn has filed a petition for certiorari with the United States Supreme Court, available at https://www.supremecourt.gov/search.aspx?filename=/docket/docketfiles/html/public/19-1116.html (last accessed Apr. 8, 2020). 139 Id. 140 Id. 137 40 that scraping data in violation of LinkedIn's terms of service agreement didn't constitute a CFAA violation.141 The district court granted the injunction, and the Ninth Circuit affirmed, holding: It is likely that when a computer network generally permits public access to its data, a user's accessing that publicly available data will not constitute access without authorization under the CFAA. The data hiQ seeks to access is not owned by LinkedIn and has not been demarcated by LinkedIn as private using such an authorization system. HiQ has therefore raised serious questions about whether LinkedIn may invoke the CFAA to preempt hiQ's possibly meritorious tortious interference claim. 142 HiQ came down after the decision in Sandvig 1. The district court in HiQ cites Sandvig 1 in its discussion of the First Amendment issues. Sandvig 2 came down after HiQ, and the Sandvig 2 court picked up on HiQ's reasoning when rendering summary judgment in favor of the plaintiff academic researchers' rights to access public data on the public internet, citing HiQ's discussion of public and private data on the internet. 
143 As Sandvig 2 notes, HiQ follows the logic of most cases, going back almost two decades, holding that breaches of terms of service agreements aren't sufficient to constitute a computer intrusion leading to civil or criminal CFAA liability.144 The lesson from the case law is that computer law is not a branch of contract law, and to import contract law principles into computer law is to confuse principles and unreasonably and excessively expand civil and criminal liability. As one of the lead cases, United States v. Nosal, notes, when it comes to predicating liability on 141 Id. 142 HiQ Labs, Inc. v. LinkedIn Corp., 938 F.3d 985, 1003-04 (9th Cir. 2019). 143 Sandvig v. Barr, No. 16-CV-1368 (JDB), 2020 U.S. Dist. LEXIS 53631 (D.D.C. March 27, 2020) ("Sandvig 2"). 144 Sandvig 2, 2020 U.S. Dist. LEXIS 53631, at *26 ("Most courts agree with the hiQ Labs court's interpretation . . . ."). 142 41 violations of terms of service agreements, there are "[s]ignificant notice problems" with criminal prohibitions that "turn on the vagaries of private policies that are lengthy, opaque, subject to change and seldom read."145 The majority of the case law counsels against such vague and broad expansions of traditional contract law liability. 146 This Court should deny Vermont's preliminary injunction motion. III. Vermont's Action is Void for Vagueness as Applied Under the Due Process Clauses of the Fifth and Fourteenth Amendments Ignorance of the law is no excuse if the law can be looked up. But if the law cannot be looked up, it is obviously unfair to prosecute someone for what a person of ordinary intelligence had no fair notice was illegal.147 Thus, "it is a basic principle of due process that an enactment is void for vagueness if its prohibitions are not clearly defined." 148 Vermont's novel theory of a long-distance, digital invasion of the privacy of unnamed Vermont consumers is not something a 145 United States v. Nosal, 676 F.3d 854, 860 (9th Cir. 2012). See United States v. Valle, 807 F.3d 508, 527 (2d Cir. 2015) ("We agree with the Ninth and Fourth Circuits that courts that have adopted the broader construction . . . failed to consider the effect on millions of ordinary citizens . . . ."); United States v. Nosal, 676 F.3d 854, 862 (9th Cir. 2012) ("We remain unpersuaded by the decisions of our sister circuits that interpret the CFAA broadly to cover violations of corporate computer use restrictions or violations of a duty of loyalty."); WEC Carolina Energy Sols. LLC v. Miller, 687 F.3d 199, 207 (4th Cir. 2012) ("Thus, we reject an interpretation of the CFAA that imposes liability on employees who violate a use policy . . . ."); United States v. Drew, 259 F.R.D. 449 (C.D. Cal. 2009) (reversing on 5th Amendment void for vagueness grounds criminal CFAA conviction based on violation of a terms of service agreement); but see United States v. Rodriguez, 628 F.3d 1258 (11th Cir. 2010); United States v. John, 597 F.3d 263 (5th Cir. 2010); Int'l Airport Ctrs., L.L.C. v. Citrin, 440 F.3d 418 (7th Cir. 2006); see generally Orin S. Kerr, Norms of Computer Trespass, 116 Colum. L. Rev. 1143 (May 2016); Orin S. Kerr, Vagueness Challenges to the Computer Fraud and Abuse Act, 94 Minn. L. Rev. 1561, 1562-63 (May 2010). 147 Holder v. Humanitarian Law Project, 561 U.S. 1, 18 (2010). 148 See Grayned v. City of Rockford, 408 U.S. 104, 108 (1972) (collecting cases); McBoyle v. United States, 283 U.S.
25, 27 (1931) ("[I]t is reasonable that a fair warning should be given to the world in language that the common world will understand, of what the law intends to do if a certain line is passed. To make the warning fair, so far as possible the line should be clear.") (Holmes, J.). 146 42 person of ordinary intelligence would conclude to be Vermont law after a survey of Vermont's statutes and case law. There are no statutes creating the tort Vermont seeks to enforce, and all the case law involves physical intrusions on privacy for which the courts set a substantially high bar. 149 This Court should not permit the Attorney General to use its vague policy yearnings to arbitrarily prosecute Clearview AI. The void-for-vagueness doctrine is meant to prevent this type of arbitrary and discriminatory enforcement. 150 No other search engines have been targeted by the Attorney General for this type of lawsuit based on shadowy privacy rights. Vermont is prosecuting this case on principles that a person of ordinary intelligence reviewing Vermont law would not be on notice of, because Vermont has discriminatorily and arbitrarily decided to prosecute facial biometric algorithms based on animus. 151 The Court should deny Vermont's preliminary injunction motion. IV. Vermont Does Not Meet the Elements for a Preliminary Injunction A. The Court Should Apply the Regular Four-Part Balancing Test for Injunctions Under Vermont law, "[a] preliminary injunction is an extraordinary remedy never awarded as of right." 'In each instance, we must balance the competing claims of injury and must consider the effect on each party of the granting or withholding of the requested relief.' The movant bears the burden of establishing that the relevant factors call for imposition of a 149 See below, Part IV.C (discussing Vermont's privacy case law). 150 Smith v. Goguen, 415 U.S. 566, 573 (1974). 151 See Hill v. Colorado, 530 U.S. 703, 732 (2000) (citing Chicago v. Morales, 527 U.S. 41, 56-57 (1999)). 43 preliminary injunction. The trial court here rightly identified the main factors guiding its review under Vermont law: (1) the threat of irreparable harm to the movant; (2) the potential harm to the other parties; (3) the likelihood of success on the merits; and (4) the public interest." 152 B. This Case is Not Appropriate For a Statutory Preliminary Injunction i. Vermont Law Does Not Allow It The State has not shown that this court should depart in this case from the regular standards for deciding whether to issue a preliminary injunction, which require a balancing of interests "in each case."153 Citing aged authority from other jurisdictions, the State urges the court to use a different standard for assessing its motion for a preliminary injunction, urging it to draw upon the doctrine of statutory injunctions to support its position that it need only show a statutory violation to be presumed entitled to a sweeping injunction. This position is made even more extreme because the Attorney General's Office leaves it to itself to decide what is a statutory violation, and thus what it is empowered to enjoin. While the Vermont Supreme Court has clearly and repeatedly enunciated the standard for a preliminary injunction as set forth above in a broad array of cases, the Vermont Supreme Court has not forthrightly adopted this other doctrine that special rules apply when a government agency seeks an injunction.
Research reveals that to the extent this rationale has been adopted, it has been in contexts very far removed from the case at bar, almost entirely in the context of an injunction enforcing a local zoning 152 Taylor v. Town of Cabot, 205 Vt.586, 205 A.2d. 586, 595-96 (Vt. 2017) (citing Winter v. Nat. Res. Def. Council, Inc., 555 U.S. 7, 24, 129 S.Ct. 365, 172 L.Ed.2d 249 (2008) (“A preliminary injunction is an extraordinary remedy never awarded as of right.”). 153 Taylor, 205 Vt.586, 205 A.2d. 586, 595-96. 44 ordinance, such as a setback, without needing to find irreparable harm and fully weighing all equitable interests.154 ii. Other Jurisdictions Reject the State’s Interpretation of the Doctrine Indeed, even reported decisions from other jurisdictions on the so-called statutory injunction doctrine undermine the State’s strict liability manner of applying the doctrine to the highly disputed legal and factual landscape here. For example, some courts, while upholding the doctrine in appropriate cases, have noted that it does not apply when the constitutionality of the statute that serves as the basis for the statutory injunction is being challenged. 155 Indeed, such a holding makes sense since deprivation of constitutional rights, such as freedom of speech, is often seen in and of itself to manifest irreparable harm, the preliminary injunction of which is a prior restraint.156 The cases in which the statutory injunction doctrine has been invoked typically involve far more obvious and clear-cut legislative pronouncements of law and violations of a statute or specific regulation. For example, Golden Feather involved the sale of cigarettes that did not bear the statutorily required New York state tax stamp.157 Prayze FM involved an FM radio 154 See e.g., Town of Sherburne v. Carpenter, 155 Vt. 126 (1990). City of New York v. Golden Feather Smoke Shop, Inc., 597 F.3d. 115 (2d Cir. 2010) (“where the government seeks injunctive relief for a statutory violation, there is a presumption of irreparable harm except when the constitutionality of the statute being violated was at issue.” Finding no need to show irreparable harm for violation of New York statute prohibiting sale or possession of unstamped cigarettes; no challenge to constitutionality of statute) (internal quotes and cites omitted). See also Prayze FM v. FCC, 214 F.3d 245, 248 (2d Cir. 2008). The Second Circuit applied the remaining factors of the multi-part balancing of the equities test in both Golden Feather and Prayze FM while finding the irreparable harm was to be presumed on the facts of those cases. 156 See e.g., Elrod v. Burns, 427 U.S. 347 (1976) (loss of free speech, even for short period of time, is irreparable harm); Fort Wayne Books, Inc. v. Indiana, 489 U.S. 46, 66 (1989); MIC, Ltd. v. Bedford Township, 463 U.S. 1341, 1343 (1983); Vance v. Universal Amusement Co., 445 U.S. 308, 316-17 (1980) (finding that a preliminary injunction constitutes a prior restraint of speech). 157 Golden Feather, 597 F.3d at 199. 155 45 broadcaster who did not obtain the required FCC license. 158 Town of Sherburne involved a property owner who erected a building that extended into the town set back limits. 159 All of these cases involved a clear announcement of the legislative intent through a specific rule or law that the injunction sought by the executive in those case was seeking to enforce. Accordingly, it made more sense that the public interest had already been weighed by the legislature in those cases, and that interest could be presumed. 
Even in cases where there is a clear cut violation of a declared rule or statute that the party moving for a statutory injunction seeks to invoke, courts applying the statutory injunction standard only do so in a manner presuming the existence of irreparable harm, and still go on to require a showing of the other equitable factors – such as substantial likelihood of success on the merits and balance of equities – to support enforcement of a preliminary injunction. 160 iii. Prudence Requires Applying the Regular Rule to this Unique Case Further, given the unique situation presented in this case, the Court should as a matter of judicial prudence require the State to prove the traditional factors required for a preliminary injunction in order to grant the “extraordinary” relief sought here. 161 158 Prayze FM, at 248. Again, the progeny of Town of Sherburne has been confined solely to permitting a statutory injunction in the context of enforcement of zoning violations. Even in this context, courts have permitted limited balancing of interests in a zoning violation case, directing that the zoning violation be “substantial” and/or “conscious” before granting an injunction based on the zoning and development statute. Town of Sherburne, 155 Vt. at 131-32. See Richardson v. City of Rutland, 164 Vt. 422, 671 A.2d 1245 (1995). 160 See e.g. Golden Feather, supra; Prayze, supra; United States v. Mass. Water Resources Auth., 256 F.3d 36 , 51 (1st Cir. 2001) (if Congress meant to limit the court’s equitable powers in the statutory injunction at issue, it needed to state so clearly in the statute); Weinberger v. Romero–Barcelo, 456 U.S. 305, 313, 102 S.Ct. 1798, 72 L.Ed.2d 91 (1982) (same); United States v. Nutri-cology, 982 F.2d.394 , 395 (1992); Klay v. United Healthgroup, Inc., 376 F.3d, 1092, 1098-99 (11th Cir. 2004). 161 See Taylor, 205 Vt. 586. 159 46 In short, the statutory injunction standard as a matter of law and practicality should have no application here, and the State should be required to provide proof of irreparable harm, likelihood of success on the merits, and the balance of equities in its favor in order to obtain preliminary relief here. C. The State Has Made No Showing Justifying an Injunction Under the Proper Standard Applying the appropriate injunction standard articulated in Taylor and numerous other cases – and as argued at length below -- the State has made no showing on any of the equitable factors, other than a conclusory assertion that the conduct is against the law and therefore enjoining it is in the public interest. They have shown no harm whatsoever, much less harm that is irreparable in nature. Second, the harm on the other parties – namely Clearview and its users and the public– is concrete and real, ranging from denial of First Amendment rights to delay or prevention in solving or stopping violent crimes. Third and fourth, as described above and below, the State cannot show a decided likelihood of success on the merits, nor can they clearly and authoritatively state that the injunction they seek is in the public interest. However, as also argued at length below, the motion must still fail even applying the more lenient injunction standard which they urge the Court to adopt. This is because they cannot prove a violation has occurred, namely that a deceptive or unfair act in commerce has occurred under Vermont law in violation of the Vermont Consumer Protection Act. 
Plaintiff says Clearview has violated Vermonters' expectation of privacy and that its conduct is therefore commercially unfair. But Vermont only legislates on privacy interests in the most narrow and rare of circumstances and for purposes of protecting sexual privacy, or within the context of an 47 existing special relationship.162 Those factors aren't present in this case. This is a case about public photos on the internet, and who can access and use them. And Louis Brandeis and William Prosser would agree that there are no privacy rights in a public photo. In the seminal The Right to Privacy, Samuel Warren and Louis Brandeis argued that there are no privacy protections for material already released to the public. 163 The right to privacy, according to Brandeis, "ceases upon the publication of the facts by the individual, or with his consent."164 Professor William Prosser, in his article Privacy, concurs that there is no privacy interest in public images, whether taken publicly or available to the public: On the public street, or in any other public place, the plaintiff has no right to be alone, and it is no such invasion of his privacy to do no more than follow him about. Neither is it such an invasion to take his photograph in such a place, since this amounts to nothing more than making a record, not different essentially from a full written description, of a public sight which any one present would be free to see. . . . Certainly no one can complain when publicity is given to information about which he himself leaves open to the public eye.165 i. Vermont Supreme Court's Narrow Expectation of Privacy In 2019, the Supreme Court of Vermont adopted an objective standard for determining privacy in the digital realm.166 In State v. VanBuren, a woman sent naked pictures of 162 See 13 V.S.A. § 2605 (Voyeurism), 13 V.S.A. § 2606 (Disclosure of Sexually Explicit Images Without Consent); 3 V.S.A. § 129a (Unprofessional Conduct). 163 Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193, 215 (1890) ("The general object in view is to protect the privacy of private life, and to whatever degree and in whatever connection a man's life has ceased to be private, before the publication under consideration has been made, to that extent the protection is to be withdrawn."). 164 Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 Harv. L. Rev. 193, 218 (1890). 165 William L. Prosser, Privacy, 48 Calif. L. Rev. 383, 391, 394 (1960). 166 State v. VanBuren, 2018 VT 95, ¶ 105, 214 A.3d 791, 822 (Vt. 2019), as supplemented (June 7, 2019), citing State v. Albarelli, 2011 VT 24, ¶ 14, 189 Vt. 293, 19 A.3d 130, noting "[t]his reflects a 48 herself via Facebook Messenger to an individual she was dating. 167 The offender intercepted the pictures by accessing them without authorization from the intended recipient's Facebook account. The offender then publicly posted the images on that account, which the victim was not authorized to access, tagging the victim so all her friends and family could see.
While the victim pleaded for the images to be removed, the offender replied that she was going to "ruin" her and "get revenge."168 The offender was charged under Vermont's 2015 nonconsensual pornography statute, a civil and criminal law, which prohibits one from "knowingly disclos[ing] a visual image of an identifiable person who is nude or who is engaged in sexual conduct, without his or her consent, with the intent to harass, intimidate, threaten, or coerce the person depicted, and the disclosure would cause a reasonable person to suffer harm." 169 The law has several exceptions, including disclosures made in the public interest, matters of public concern, and most relevantly "[i]mages involving nudity or sexual conduct in public or commercial settings or in a place where a person does not have a reasonable expectation of privacy." 170 In its lengthy decision, the Vermont Supreme Court said the statute, a content-based restriction on free speech, only passes strict scrutiny if the language about there being "a reasonable expectation of privacy" is an element of the statute, as opposed to an affirmative decision by the Legislature that the expectation-of-privacy determination should be based on what a reasonable person would think, not what the person depicted thought." 167 Id. 168 Id. 169 13 V.S.A. § 2606(b)(1). 170 13 V.S.A. § 2606(d)(1). 49 defense.171 The court said this expectation of privacy could be nullified at either of two stages – the circumstances surrounding the capture of the image or the distribution of the image.172 After further briefing by the parties, the Court upheld dismissal of the case because it found no reasonable expectation of privacy. The court said a reasonable person could not be expected to know the contours of the relationship between the sender and the recipient, nor whether they had expressly agreed to keep intimate images private. 173 The court said that the reasonable expectation of privacy does not require exclusivity between the depicted person and the recipient, but nearly so.174 The VanBuren court considered the following facts to be irrelevant: that the offender's intent was to destroy the depicted person's privacy; that the intimate images were only intercepted through criminal means (hacking); that there was a destructive result caused by the image's distribution; and that the offender violated Facebook's Community Standards. 175 Given that VanBuren shows there is no reasonable expectation of privacy for an image depicting nudity, taken in private, sent to one person through a password-protected app, and unlawfully intercepted, there surely cannot be an expectation of privacy in images that are not sexually graphic and that already exist on the public internet. The images in Clearview's database, already 171 State v. VanBuren, 2018 VT 95, ¶ 66, 214 A.3d 791, 813 (Vt. 2019), as supplemented (June 7, 2019) (noting "there is no practical difference between a nude photo someone voluntarily poses for in the public park and one taken in private that the person then voluntarily posts in that same public park."). 172 Id. 173 Id. at 823. 174 Id. 175 Id. 50 made public online, depict faces – not genitals. To apply a higher standard of privacy to pictures of faces than to nude photos is irrational. And as for individuals captured non-consensually in images, or whose images are posted onto social media without consent, per the standards established in VanBuren, no reasonable person could be expected to know that the images were captured or posted without consent.
Once posted publicly, with or without consent, the reasonable expectation of privacy for all subsequent republications diminishes. Any grievance a depicted person may have as to images non-consensually captured or posted is best worked out between the depicted person and the photographer or poster, without roping in unknowing third parties. Under the Vermont Supreme Court's framework, downstream third parties, who cannot know what putative restrictions may exist on a publicly available image, cannot be held responsible for their republication of an image.176

176 An exception may be warranted as to downstream re-publisher liability if it is clear from the circumstances that an image is posted without consent, such as if it is published on a dedicated revenge porn website.

When it comes to images posted on the specific social media platforms (i.e., Facebook, Google, Twitter, YouTube, LinkedIn, and Venmo) from which Plaintiff claims Clearview cached public photos with its search engines, an expectation of privacy could not be less reasonable under Vermont law. By posting an image publicly onto those platforms, the individual – if any expectation of privacy was even still intact – most certainly relinquishes it at that time.

The act of uploading an image onto a social media site with billions of users nullifies any reasonable expectation of privacy. It goes far beyond Prosser's assertion that individuals lack privacy when walking on a public street; it is the equivalent of posting a billboard with your likeness alongside a highway, a billboard with potentially interminable duration. We all know that anybody – whether human or corporation – with access to the public internet has the ability to screen capture, download, upload, repost, organize, and categorize information that is publicly viewable. Venmo's own policy puts into words the common-sense notion that images published onto its platform are essentially released into the wild: "[w]hen you broadcast information to . . . third-party social networks, such information is no longer under the control of Venmo and is subject to the terms of use and privacy policies of such third parties."177

On top of that, by their own terms of service, the social media platforms cited by Vermont expressly require users to relinquish control over the images they post. Under those terms, users also acknowledge that even when they delete images from the platforms, the images may continue to exist. Vermont is asserting rights that, under its own theory of the terms of service agreements, its consumers have waived:

● Facebook: "[W]hen you share, post, or upload content...
you grant us a non-exclusive, transferable, sub-licensable, royalty-free, and worldwide license to host, use, distribute, modify, run, copy, publicly perform or display, translate, and create derivative works of your content… This means, for example, that if you share a photo on Facebook, you give us permission to store, copy, and share it with others." "When you delete content, it's no longer visible to other users, however it may continue to exist elsewhere on our systems."178

● Google: "This license allows Google to: host, reproduce, distribute, communicate, and use your content… publish, publicly perform, or publicly display your content, if you've made it visible to others… [and] sublicense these rights to… other users [and] our contractors." "[I]f you shared a photo with a friend who then made a copy of it, or shared it again, then that photo may continue to appear in your friend's Google Account even after you remove it from your Google Account."179

177 Venmo Privacy Policy, subsection "How We Share Personal Information with Other Parties", available at https://venmo.com/legal/us-privacy-policy (last accessed April 9, 2020).
178 Facebook Terms of Services, Section 3, Item 3, available at https://www.facebook.com/terms.php (last accessed April 9, 2020).

● Twitter: "By submitting, posting or displaying Content on or through the Services, you grant us a worldwide, non-exclusive, royalty-free license (with the right to sublicense) to use, copy, reproduce, process, adapt, modify, publish, transmit, display and distribute such Content in any and all media or distribution methods now known or later developed."180

● YouTube: "By providing Content to the Service, you grant to YouTube a worldwide, non-exclusive, royalty-free, sublicensable and transferable license to use that Content (including to reproduce, distribute, prepare derivative works, display and perform it)... The licenses granted by you continue for a commercially reasonable period of time after you remove or delete your Content from the Service."181

● LinkedIn: "As between you and LinkedIn… you are only granting LinkedIn and our affiliates the following non-exclusive license: A worldwide, transferable and sublicensable right to use, copy, modify, distribute, publish and process, information and content that you provide through our Services and the services of others, without any further consent, notice and/or compensation to you or others." "You and LinkedIn agree that we may access, store, process and use any information and personal data that you provide in accordance with, the terms of the Privacy Policy and your choices (including settings)."182 "Information you have shared with others (e.g., through InMail, updates or group posts) will remain visible after you close your account or delete the information from your own profile or mailbox, and we do not control data that other Members have copied out of our Services."183

● Venmo: "We may share your personal information with: Our parent company, PayPal, Inc. and affiliates and subsidiaries it controls,… [c]ompanies that PayPal, Inc.
plans to merge with or be acquired by,… third party service providers,… [and] [t]he other Venmo user participating in the transaction and, depending on the privacy setting of each Venmo account transaction, your Venmo friends and the Venmo friends of the other user participating in the transaction, or the public." "When you broadcast information to… third-party social networks, such information is no longer under the control of Venmo and is subject to the terms of use and privacy policies of such third parties."184

179 Google Terms of Service, subsection "Rights", available at https://policies.google.com/terms?hl=en-US#toc-software (last accessed April 9, 2020).
180 Twitter Terms of Service, subsection "Your Rights and Grant of Rights in the Content", available at https://twitter.com/en/tos (last accessed April 9, 2020).
181 YouTube Terms of Service, subsection "Your Content and Conduct", available at https://www.youtube.com/static?template=terms (last accessed April 9, 2020).
182 LinkedIn User Agreement, subsection "Your License to LinkedIn", available at https://www.linkedin.com/legal/user-agreement (last accessed April 9, 2020).
183 LinkedIn Privacy Policy, subsection 4.3 "Account Closure", available at https://www.linkedin.com/legal/privacy-policy (last accessed April 9, 2020).

ii. Vermont Statutory Definition of "Reasonable Expectation of Privacy"

Vermont's statutory definition of "reasonable expectation of privacy" is contained in Vermont's voyeurism law.185 The plain language of the statute explicitly states that the reasonable expectation of privacy stems from circumstances and from place.186 Notably, the expectation of privacy embodied in the statute refers only to intimate privacy.

Clearview cannot be found "unfair" on the theory that its conduct violated Vermonters' reasonable expectation of privacy. The public images accessed by Clearview do not depict intimate parts of the body, and any hypothetically depicted person's reasonable expectation of privacy was nullified upon the original publication of the image onto the internet. The downstream use of the images by Clearview or anybody else, no matter the purpose or use, cannot restore an expectation of privacy. Aside from the voyeurism and nonconsensual pornography statutes, Vermont has no other laws that define Vermonters' reasonable expectation of privacy.187

184 Venmo Privacy Policy, subsection "How We Share Personal Information with Other Parties", available at https://venmo.com/legal/us-privacy-policy (last accessed April 9, 2020).
185 13 V.S.A. § 2605.
186 See 13 V.S.A. § 2605(a)(3) ("'[c]ircumstances in which a person has a reasonable expectation of privacy' means circumstances in which a reasonable person would believe that his or her intimate areas would not be visible to the public, regardless of whether that person is in a public or private area. This definition includes circumstances in which a person knowingly disrobes in front of another, but does not expect nor give consent for the other person to photograph, film or record his or her intimate areas") and 13 V.S.A. § 2605(a)(5) ("'[p]lace where a person has a reasonable expectation of privacy' means: (A) a place in which a reasonable person would believe that he or she could disrobe in privacy, without his or her undressing being viewed by another; or (B) a place in which a reasonable person would expect to be safe from unwanted intrusion or surveillance.").
187 The term "reasonable expectation of privacy" is also used, though not defined, in 3 V.S.A. § 129a(a)(26) (Unprofessional Conduct). Persons licensed by the State are considered to have engaged in "unprofessional conduct" when "violating a patient, client, or consumer's reasonable expectation of privacy." Note that the licensee's conduct is only deemed unprofessional if it impacted a person with whom they had an existing special legal relationship as a patient, client, or consumer.
Plaintiff accuses Clearview of intruding upon the seclusion of Vermonters, but this, too, is wrong. Restatement (Second) of Torts § 652B (1977) defines intrusion upon seclusion as "an intentional interference with his interest in solitude or seclusion, either as to his person or as to his private affairs or concerns, of a kind that would be highly offensive to a reasonable man."188 The invasion, per the Restatement (Second), can be a physical intrusion, such as a trespass into one's home, or an intrusion through the use of the defendant's senses, such as peeping through a window or eavesdropping by tapping telephone wires.189 Or the intrusion can be through an over-the-top examination into something obviously private, such as opening personal mail, rifling through somebody's wallet, or opening a safe.190

Historically, Vermont has not recognized the tort of unreasonable intrusion upon seclusion.191 In recent years, though, several cases have reached the courts. These cases demonstrate that only extreme, persistent, and directed action taken against the plaintiff is sufficient to state a claim, and always the intrusion is a physical one. The intrusion "must be substantial."192 The intrusion must be "repeated with such persistence and frequency as to amount to a course of hounding the plaintiff."193 The plaintiff must "face a legitimate threat" from defendant.194 All intrusion upon seclusion claims Vermont has recognized have involved "egregious behavior" and a physical trespass.195 Claims cannot be based on the cumulative effect of multiple minor incidents.196 In a case involving extreme ongoing harassment and threats (e.g., the defendant engaged in identity theft, fired guns from plaintiff's property, poisoned plaintiff's dog, placed live bullets in plaintiff's yard, and keyed plaintiff's car), a jury awarded damages for "invasion of privacy," though even under these circumstances intrusion upon seclusion was not pleaded by the plaintiff.197

Once again, Clearview's accessing of publicly available images already published on the internet is not an intrusion upon seclusion. It involves no prying into private affairs. Clearview has not directed extreme, persistent, or threatening acts against anybody in Vermont. Clearview has not trespassed onto the property of anybody in Vermont. Clearview has only accessed publicly available material. The same material is available to any single one of us by opening a browser, typing a name into a search engine, or logging onto Facebook. Any single one of us can screenshot an image we find online and put it somewhere else online.

188 Restatement (Second) of Torts § 652A (1977).
189 Restatement (Second) of Torts § 652A (1977).
190 Restatement (Second) of Torts § 652A (1977).
191 Restatement (Second) of Torts § 652A (1977). The Reporter's Notes state "[t]here appear to be no holdings, in Maine, Vermont and Wyoming."
192 Hodgdon v. Mt. Mansfield Co., Inc., 160 Vt. 150, 162, 624 A.2d 1122, 1129 (1992).
193 Weinstein v. Leonard, 2015 VT 136, 134 A.3d 547 (citing Hodgdon v. Mt. Mansfield Co., Inc., 160 Vt. 150, 162, 624 A.2d 1122, 1129 (1992)).
This is not an unfair practice, nor an intrusion upon anybody's seclusion. When it comes to images on the internet, posted onto social media platforms with billions of users, the notion of solitude or seclusion is anachronistic.

194 Vermont Ins. Mgmt., Inc. v. Lumbermens' Mut. Cas. Co., 171 Vt. 601, 604, 764 A.2d 1213, 1216 (2000).
195 See Pion v. Bean, 2003 VT 79, ¶ 34, 176 Vt. 1, 833 A.2d 1248 (offender cut down the victim's trees, flooded their property by rerouting a streambed, made false complaints to the police, and erected a fence and retaining wall).
196 Weinstein v. Leonard, 2015 VT 136, 134 A.3d 547 ("[A] handful of minor offenses are insufficient to constitute a tortious intrusion upon seclusion.").
197 Shahi v. Madden, 2008 VT 25, ¶ 24, 184 Vt. 320, 949 A.2d 1022.

D. Vermont Cannot Succeed on the Merits – Unfairness

The VCPA prohibits "unfair or deceptive acts or practices in commerce."198 In construing the terms "unfair" and "deceptive," Vermont courts are to "be guided by the construction of similar terms contained in Section 5(a)(1) of the Federal Trade Commission Act as from time to time amended by the Federal Trade Commission and the courts of the United States."199 According to the State, the criteria for determining whether a practice is unfair are as follows:

(1) whether the practice, without necessarily having been previously considered unlawful, offends public policy as it has been established by statutes, the common law, or otherwise – whether, in other words, it is within at least the penumbra of some common-law, statutory, or other established concept of unfairness; (2) whether it is immoral, unethical, oppressive or unscrupulous; (3) whether it causes substantial injury to consumers . . . .200

198 9 V.S.A. § 2453(a).
199 9 V.S.A. § 2453(b).
200 (Pls.' Mot. For Prelim. Injunct. at 18 (citing Christie v. Dalmig, Inc., 136 Vt. 597, 601 (1979) and F.T.C. v. Sperry & Hutchinson Co., 405 U.S. 233, 244 n.5 (1972))).

Yet the State fails to acknowledge, let alone analyze, the considerable evolution in the unfairness criteria that has occurred since Christie and Sperry were decided in the 1970s. After the U.S. Supreme Court's decision in Sperry, the FTC "claimed the power to sit as a court in equity over acts and practices within its jurisdiction that either offended public policy, or were immoral, etcetera, or caused substantial injury to consumers."201 Under this view of the FTC's unfairness authority, "no consideration need[ed] to be given to the offsetting benefits that a challenged act or practice may have on consumers."202 Not surprisingly, the FTC's efforts to engage in unfairness rulemaking based upon its own assessment of public policy provoked considerable backlash from the public and Congress.203

In response to this backlash, in 1980 the FTC adopted the Unfairness Policy Statement,204 which limited the role that public policy should play in the FTC's unfairness analysis by emphasizing that "unjustified consumer injury is the primary focus of the FTC Act, and the most important" of the three criteria for unfairness set forth in Sperry.205 The Unfairness Policy Statement sets forth the following factors for determining whether an unjustified consumer injury has occurred. First, "the injury must be substantial" because the FTC "is not concerned with trivial or merely speculative harms."206 The FTC clarified that "[i]n most cases a substantial injury involves monetary harm" and that "[e]motional impact and other more subjective types of harm . . .
will not ordinarily make a practice unfair."207 Second, "the injury must not be outweighed by any offsetting consumer or competitive benefits." The FTC will take "account of the various costs that a remedy would entail," including "not only the costs to the parties directly before the agency, but also the burdens on society in general."208 Third, "the injury must be one which consumers could not reasonably have avoided."209

201 J. Howard Beales (Former Director, Bureau of Consumer Protection), The FTC's Use of Unfairness Authority: Its Rise, Fall, and Resurrection, at Section II.A (May 30, 2003), available at https://www.ftc.gov/publicstatements/2003/05/ftcs-use-unfairness-authority-its-rise-fall-and-resurrection#N_43_ (last visited April 8, 2020).
202 Id.
203 Id.
204 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *95, *98-99; see also State of Vermont v. CSA-Credit Solutions of Am., LLC & Doug Van Arsdale, Dec. and Order: Mot. for Summ. J. (Vt. Super. Ct. March 5, 2012) (Pls. Mot. For Prelim. Injunct., Ex. A).
205 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *97.
206 Id.
207 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *97.

The Unfairness Policy Statement acknowledges that public policy considerations may inform the FTC's actions, but specified that, to the "extent that the Commission relies heavily on public policy to support a finding of unfairness, the policy should be clear and well-established."210 Specifically, "the policy should be declared or embodied in formal sources such as statutes, judicial decisions, or the Constitution as interpreted by the courts, rather than being ascertained from a general sense of the national values."211 The Unfairness Policy Statement also disavows Sperry's "immoral, unethical, oppressive or unscrupulous" criterion as an independent basis for determining that conduct is unfair.212

208 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *97.
209 Id.
210 Id. at *99 (emphasis added).
211 Id.
212 Id.

In 1994, when Congress re-authorized the FTC, it codified the Unfairness Policy Statement's criteria:

The Commission shall have no authority under this section . . . to declare unlawful an act or practice on the grounds that such act or practice is unfair unless the act or practice causes or is likely to cause substantial injury to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition. In determining whether an act or practice is unfair, the Commission may consider established public policies as evidence to be considered with all other evidence. Such public policy considerations may not serve as a primary basis for such determination.213

In sum, the criteria for determining whether a practice is unfair are considerably more stringent than the State asserts. When the appropriate legal standard is applied, the State falls far short of establishing that Clearview AI's practices are unfair pursuant to the VCPA.

i. Clearview AI's Practices Do Not Injure The Public

The State's principal claim is that Clearview AI's practices and Constitutionally protected speech are unfair because those practices violate Vermonters' right to privacy. The Court should make short work of this misplaced argument.

First, the FTC has made clear that a merely "speculative" injury is insufficient to establish unfairness.214
The State's alleged privacy interest—which, as discussed above, finds no footing in Vermont law—is the very definition of speculative. The speculative injury that the State does describe is quintessentially "emotional" and "subjective" rather than economic. According to the FTC, alleged harm of this type "will not ordinarily make a practice unfair."215

213 15 U.S.C. § 45(n).
214 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *97.
215 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *97. The State's claim that Clearview AI's practices are unfair because Clearview AI fails to protect consumers' data (Pls.' Mot. For Prelim. Injunct. at 41-42) likewise fails to allege a substantial injury. First, the State does not allege that any data, let alone data belonging to Clearview AI's consumers, was compromised. Second, if being the victim of a computer hacking crime is equivalent to violating the VCPA, then many, many Vermont companies who conduct business via the internet are perpetrators. Such a result would be absurd.

Second, the State fails to establish that the speculative and subjective privacy harm it envisions is "not outweighed by any offsetting consumer or competitive benefits."216 The State's silence on this issue speaks volumes. As discussed above in the Public Interest Uses of Clearview AI and Law Enforcements' Use of Clearview AI sections, Clearview AI provides significant benefits to its customers, and to the public as a whole, by providing law enforcement critical investigatory leads. These concrete and quantifiable benefits far outweigh the State's parade of imagined horribles.

Finally, the State does not acknowledge that the purported privacy harm is, in fact, one that consumers could reasonably avoid.217 Clearview AI only accesses photographs that are publicly available on the internet, and anyone can refrain from publishing his or her photographs to the web. In addition, Clearview AI has a robust opt-out policy for individuals who want their photographs to be removed from Clearview AI's indexing process.

Because the State's alleged privacy harm is speculative, outweighed by the benefits that Clearview AI provides to the public, and one that consumers can reasonably avoid, it does not constitute the type of substantial injury to consumers that would permit the State, or this Court, to declare Clearview AI's practices unfair pursuant to the VCPA.

216 Id.
217 Id. See also 15 U.S.C. § 45(n).

ii. Clearview AI's Practices Do Not Violate Any Clear and Well-Established Public Policy

Contrary to its assertion, the State is not simply free to declare any conduct that it dislikes as contrary to public policy and therefore unfair pursuant to the VCPA. Instead, if public policy is to be a factor in the State's (or this Court's) unfairness analysis, such a public policy "should be clear and well-established" via declaration "in formal sources such as statutes, judicial decisions, or the Constitution as interpreted by the courts."218

The privacy harm that the State attempts to articulate is certainly not a matter of clear and well-established public policy. The State does not, and cannot, point to a Vermont statute that prohibits Clearview AI's conduct. Nor can the State cite to a single court decision, in Vermont or any other jurisdiction, that declares Clearview AI's practices to be an unconstitutional violation of privacy. Indeed, as detailed in
Section IV.D., Privacy, above, the State's own practices with respect to facial recognition software have been all over the map. Far from being "clear and well-established," the public policy questions regarding facial recognition software are in a considerable state of flux and evolution. The public's views on the benefits and drawbacks of such software in specific contexts have yet to be ascertained. Here, the State is attempting to do precisely what the FTC has stated that it cannot: ascertain public policy from the State's own general sense of the national values.219 The Court should not permit this overreach.

iii. Clearview AI's Practices Are Not Immoral

Finally, the State's opinion that a particular practice is "immoral, unethical, oppressive or unscrupulous" does not provide an independent basis for declaring that practice unfair pursuant to the VCPA.220 Simply put, neither the FTC nor the State of Vermont is entitled to serve as the morality police in this context. To permit otherwise would be profoundly dangerous and undemocratic.

218 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *99 (emphasis added).
219 In re International Harvester, 104 F.T.C. 949, 1984 WL 565290 at *99 (emphasis added).

The Court should note that in 2012 the FTC published a "Best Practices" paper on facial recognition issues but did not engage in rulemaking.221 In the eight years since, the FTC has also refrained from promulgating rules that would govern Clearview AI's practices. Similarly, the Attorney General's Office is statutorily authorized to engage in rulemaking in order to effectuate the purposes of the VCPA and protect Vermont consumers.222 Although in recent years the State has promulgated rules on controversial subjects like genetically modified foods,223 it has chosen not to engage in this process here, despite the fact that rulemaking would engage the public's views about the morality and privacy issues related to the use of facial recognition software.224 This Court should not permit the State to achieve through a preliminary injunction what it is unwilling—or perhaps unable—to achieve through rulemaking.

E. Vermont Cannot Succeed on the Merits: Deception

The State also contends that Clearview AI's practices violate the VCPA because those practices are deceptive. The Court should reject this misplaced argument.

220 Id.
221 FTC, Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies (Oct. 2012).
222 9 V.S.A. § 2453(c).
223 Consumer Protection Rule 121, available at https://ago.vermont.gov/wp-content/uploads/2018/03/CPRule-121.pdf (last accessed April 8, 2020).
224 The Attorney General's rulemaking is statutorily constrained, however, because its rules may "not be inconsistent with the rules, regulations, and decisions of the Federal Trade Commission and the federal courts interpreting the Federal Trade Commission Act." 9 V.S.A. § 2453(c). The Vermont Supreme Court is willing to declare a rule invalid if the State outstrips its authority in this regard. See Christie v. Dalmig, Inc., 136 Vt. 597, 600, 396 A.2d 1385, 1387 (1979) (declaring Consumer Protection Rule overly broad and invalid).

Under the VCPA, for a practice to qualify as deceptive: (1) there must be a representation, omission, or practice likely to mislead consumers; (2) the consumer must be interpreting the message reasonably under the circumstances; and (3) the misleading effects must be material, that is, likely to affect the consumer's conduct or decision regarding the product.225
Even assuming, arguendo, that the State's description of Clearview AI's practices is accurate, this standard is not met.

The State principally argues that members of the Vermont public—whose photographs are allegedly in Clearview AI's database—have been deceived by Clearview AI's practices. Yet the State does not, and cannot, contend that the Vermont public writ large are consumers of Clearview AI's products. In pertinent part, the VCPA defines a "consumer" as "any person who purchases, leases, contracts for, or otherwise agrees to pay consideration for goods or services not for resale in the ordinary course of his or her trade or business but for his or her use or benefit."226 Accordingly, the only consumers of Clearview AI's product are the law enforcement agencies, financial institutions, and security services who purchase the product for use in conducting law enforcement and related activities. The State does not allege that Clearview AI misled any of these consumers in any way, shape, or form. Simply put, the general Vermont public does not fall within the statutorily defined ambit of Clearview AI's consumers.227

225 See, e.g., Carter v. Gugliuzzi, 168 Vt. 48, 56 (1998).
226 9 V.S.A. § 2451a(a).
227 For the same reason, the State's claim that Clearview AI's practices are unfair because Clearview AI allegedly violates the terms and conditions of various websites also fails. The websites described by the State are not Clearview AI's consumers as defined by the VCPA.

Regardless of the fact that the Vermont public are not consumers of Clearview's product, a plain reading of Clearview's Privacy Policy cannot sustain the State's theory of deception. The policy plainly states, in bold lettering, that data rights are not absolute, and are subject to limitations that vary by jurisdiction and will only be honored under the applicable rules. Here, the State's theory of deception falls afoul of the Second Circuit's holding that "a concise statement of established fact, immediately thereafter expanded . . . cannot fairly be proscribed by [the Federal Trade Commission]"; the alternatives are "complete omission of the admittedly true statement or long-winded qualification and picayune circumlocution, neither of which we believe was in the contemplation of Congress."228

Here, Clearview's Privacy Policy clearly referenced a true fact – that some members of the public, including non-Americans, may have certain data rights, which are subject to limits that vary by jurisdiction. The State's theory of deception here would certainly require privacy policies to contain a great deal of "long-winded qualification and picayune circumlocution." In any case, Clearview has a good reason for denying data deletion and access requests it is not legally required to honor: as a very small company, which has received extensive media attention, it receives thousands of requests, and opening the floodgates to requests from all 50 states would impair its ability to use its limited resources to honor the requests it is required to honor. This is just the sort of "exigenc[y] of a particular situation and . . . practical requirement of the business in question" which must inform a court's interpretation of the Federal Trade Commission Act.229

The State also contends that Clearview AI's representation that its data is "secure" is deceptive pursuant to the VCPA.230 This argument misses the mark.

228 FTC v. Sterling Drug, Inc., 317 F.2d 669, 675-76 (2d Cir. 1963).
229 FTC v. Motion Picture Advert. Serv. Co., 344 U.S. 392, 396 (1953).
230 (Pls. Prelim. Injunct. Mot. at 43).
First, and most importantly, the State falls far short of establishing that Clearview AI's representation is, in any way, false. Clearview AI's data is not insecure merely because it has been the victim of a computer hacking crime. Moreover, this is a representation regarding the quality of Clearview AI's data protections and, as such, is not actionable under the VCPA. See, e.g., Repucci v. Lake Champagne Campground, Inc., 251 F. Supp. 2d 1235, 1241 (D. Vt. 2002) (dismissing VCPA claim alleging misrepresentation about the quality of a campground because the representation was one of opinion, not fact).

The State contends that Clearview misrepresents the accuracy of its results. However, "Clearview AI has tested the accuracy of its App using the Megaface benchmark test—a 1 million photo dataset containing over 690,000 unique individuals which is made public for facial recognition algorithm evaluation by the University of Washington. The Megaface test is recognized worldwide as a leading method for evaluation of facial recognition accuracy."231 "In late 2018, Clearview AI conducted a Megaface test internally with an accuracy rate of 99.6% for the toughest Facescrub challenge. This test measures the true positive rate of picking out a face accurately out of a gallery of 1 million other faces. The only major company ranking higher than Clearview AI is SenseTime, with a 99.8% accuracy. SenseTime is the largest facial recognition provider in China."232

The State also claims Clearview misrepresents that it only sells to law enforcement. Clearview AI currently provides its technology to security and anti-fraud staff at a handful of financial institutions. In the past, it has provided its technology to security employees at a small number of large national retailers and a handful of private security companies, but it has terminated those accounts out of an abundance of caution. At present its only private clients are financial institutions.233

231 Mulcaire Aff. Ex. A, at ¶ 5.
232 Id.
233 Id. at ¶ 21.

F. Vermont's Legislature Has Implemented a Statute Regulating Biometric Technology in Limited Contexts But Permitting it Generally

The equities weigh particularly heavily against granting an injunction against Clearview based on a supposedly cut-and-dried violation of the Vermont Consumer Protection Act, given the State of Vermont's legislative efforts in this area. Vermont indeed does have a statute regulating biometric analysis of photographic images. In 2004, when Vermont finally began requiring photographs on driver's licenses and certain other forms of identification, it passed and enacted a law regulating in part and permitting in part the use of biometric software to identify applicants for licenses and identification cards. As subsection (c) of 23 V.S.A. § 634 states, "[t]he Department of Motor Vehicles shall not implement any procedures or processes for identifying applicants for licenses, learner permits, or nondriver identification cards that involve the use of biometric identifiers. Pursuant to the provisions of 49 U.S.C. sec 31308, this subsection shall not apply to applicants for commercial driver licenses or endorsements on these licenses".

The statute is interesting and relevant to the injunction issue for a number of reasons: First, its enactment shows the legislature and governor have the capacity to legislate in this area if doing so is in the public interest. Indeed, they first did so over 15 years ago.
Second, the legislature chose to prohibit a narrow class of uses of biometric analysis and to leave a broad realm unregulated. Even with regard to the DMV, under § 634, some use is permitted; some is prohibited. There is nothing in the statute that would apply any prohibition to conduct beyond the DMV.

Finally, the way the statute is worded, it is only sensibly read as leaving open the use of facial recognition in circumstances other than the prohibition in subsection (c) on use by the DMV for identifying applicants for regular drivers' licenses or ID cards.

In sum, by the terms of the Vermont law on biometric analysis, the legislature has legislated in this area and has not decided to prohibit the conduct the Attorney General proclaims is unlawful.

G. Vermont's Data Broker Laws Don't Apply to Clearview

The State also argues in passing that Clearview has violated the Data Broker Protection Statute, 9 V.S.A. § 2431.234 It alleges that Clearview has engaged in a fraudulent acquisition of data in violation of § 2431(b). For the reasons stated at length above, Clearview's technology in obtaining data is not "acquisition of brokered data through fraudulent means" and cannot form the basis for recovery as a VCPA violation. The data is not brokered. It is not acquired by fraud.

234 9 V.S.A. § 2431.

IV. Vermont Lacks Standing Because there is No Concrete, Material Injury

Because Vermont can't point to a clear statutory harm, it lacks standing for lack of a concrete, material injury. In order to establish standing, it is well established that a plaintiff "must have suffered an 'injury in fact' – an invasion of a legally protected interest which is (a) concrete and particularized; and (b) actual or imminent, not conjectural or hypothetical."235 Considering this requirement, the United States Supreme Court observes that even where "'a statute grants a person a statutory right and purports to authorize that person to sue to vindicate that right,'"236 a plaintiff may not necessarily meet the concrete injury requirement. Therefore, it is insufficient for standing purposes that an individual has a statutory right. A court must still determine that the individual has suffered a "concrete" injury in fact due to the violation before standing is conferred in any case.237

In Patel v. Facebook, a biometric invasion of privacy case, the Ninth Circuit held that a two-step injury inquiry is required: "'(1) whether the statutory provisions at issue were established to protect [the plaintiff's] concrete interests (as opposed to purely procedural rights), and if so, (2) whether the specific procedural violations alleged in this case actually harm, or present a material risk of harm to, such interests.'"238

Vermont does not meet this test. Vermont's Complaint and motion for a preliminary injunction make broad and amorphous claims of injury under the Vermont Consumer Protection Act and Vermont Data Brokerage Statute. But no specific injury related to any specific person is identified. Rather, shadowy rights unrecognized by the Vermont legislature are asserted on behalf of unnamed Vermont consumers that Clearview AI has never done business with, because it's never done business in Vermont.

235 Lujan v. Defs. of Wildlife, 504 U.S. 555, 561, 112 S. Ct. 2130, 119 L. Ed. 2d 351 (1992).
236 Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1549, 194 L. Ed. 2d 635 (2016).
237 Patel v. Facebook, 932 F.3d 1264, 1270 (9th Cir. 2019).
238 Patel v. Facebook, 932 F.3d 1264, 1270-71 (9th Cir. 2019) (citation omitted).
The Vermont Consumer Protection Act and the Data Broker statutes were not passed to create an invasion-of-privacy tort of the type the Attorney General fancies. If the Vermont legislature had intended to do that, it would have said so. The statutes simply do not apply in this case, and therefore do not grant Vermont standing.

Turning to the second prong of standing – whether the violations alleged actually "harm, or present a material risk of harm to, such interests"239 – it is clear that no harm, or material risk of harm, to the interests of Vermont is presented. Likewise, to date there are no reported, actual abuses of Clearview AI's innovative facial biometric algorithmic search engine. As a matter of law, this Court should conclude that no standing exists for Vermont to bring this action, and dismiss the Complaint with prejudice.

239 Patel v. Facebook, 932 F.3d 1264, 1274 (9th Cir. 2019) (internal citation omitted).

CONCLUSION

This Court should deny the Attorney General's motion for a preliminary injunction and dismiss the Complaint with prejudice, because:

1. As a threshold matter, Vermont's action is preempted by the federal Communications Decency Act § 230, which bars holding interactive computer services like Clearview AI liable for the posting and use of third-party content. Clearview AI is no different than Google on this front.

2. The First Amendment protects Clearview AI's search engine algorithms and its facial biometric algorithm because they are expressive, information-conveying computer codes; there is a First Amendment right to access and use public data on the internet; and a preliminary injunction constitutes a prior restraint.

3. Vermont's action is void for vagueness under the Fifth and Fourteenth Amendments' Due Process Clauses, as no one of ordinary intelligence would be on notice that the actions Vermont complains of are illegal.

4. Vermont doesn't meet all the elements for a preliminary injunction, which it is required to meet, including likelihood of success on the merits.

5. Vermont lacks standing because there is no concrete, material injury.

Date: April 10, 2020

Respectfully submitted,

/s/ Tor Ekeland

Tor Ekeland
Pro Hac Vice pending, NY Bar No. 4493631
Tor Ekeland Law, PLLC
195 Montague Street, 14th Floor, Brooklyn, NY 11201
(718) 737-7264
tor@torekeland.com

Timothy C. Doherty, Jr., VT Bar No. 4849
Tristram J. Coffin, VT Bar No. 2445
Downs Rachlin Martin PLLC
199 Main Street, PO Box 190, Burlington, VT 05402-0190
(802) 863-2375
TDoherty@drm.com
TCoffin@drm.com

Corinne Mullen, Of Counsel
Pro Hac Vice pending, NY Bar No 2451391
Mullen Law Firm
1201 Hudson St, Ste 230
Hoboken, NJ 07030
(201) 420-1911
corinne@mullenlawfirm.com

Attorneys for Defendant Clearview AI, Inc.