Potential Policy Proposals for Regulation of Social Media and Technology Firms

Social media and wider digital communications have changed our lives in innumerable ways. They have transformed the way we do everything from shopping for groceries to growing our small businesses, and have radically lowered the cost of, and barriers to, global communication. The American companies behind these products and services (Facebook, Twitter, Amazon, and Apple, among others) have been some of the most successful and innovative in the world. As such, each of them deserves recognition for the transformation they have engendered around the world.

Given their collective influence, however, these tech giants also deserve increased scrutiny. In the course of investigating Russia's unprecedented interference in the 2016 election, the extent to which many of these technologies have been exploited, and their providers caught repeatedly flat-footed, has been unmistakable. More than illuminating the capacity of these technologies to be exploited by bad actors, the revelations of the last year have revealed the dark underbelly of an entire ecosystem. The speed with which these products have grown and come to dominate nearly every aspect of our social, political, and economic lives has in many ways obscured the shortcomings of their creators in anticipating the harmful effects of their use. Government has failed to adapt and has been incapable or unwilling to adequately address the impacts of these trends on privacy, competition, and public discourse.

Armed with this knowledge, it is time to begin to address these issues and to adapt our regulations and laws. There are three areas that should be of particular focus for policymakers.

First, understanding the capacity of communications technologies to promote disinformation that undermines trust in our institutions, democracy, free press, and markets. In many ways, this threat is not new. The Russians, for instance, have been conducting information warfare for decades. During the Cold War, the Soviets tried to spread "fake news" denigrating Martin Luther King Jr. and alleging that the American military had manufactured the AIDS virus.[1] Much like today, their aim was to undermine Americans' faith in democratic government. But what is new is the advent of social media tools with the power to magnify and target propaganda and fake news on a scale that was unimaginable back in the days of the Berlin Wall. As one witness noted during the March 2017 hearing on Russian disinformation efforts before the Senate Select Committee on Intelligence, today's tools seem almost purpose-built for Russian disinformation techniques.[2]

Just as we are trying to sort through the disinformation playbook used in the 2016 election, and as we prepare for additional attacks in 2018, a new set of tools is being developed that is poised to exacerbate these problems. Aided in large part by advances in machine learning, tools like DeepFake allow a user to superimpose existing images and videos onto unrelated images or videos. In addition, we are seeing an increasing amount of evidence that bad actors are beginning to shift disinformation campaigns to messaging applications rather than using the relatively more open social media platforms. Closed applications like WhatsApp, Telegram, Viber, and others present new challenges for identifying, rapidly responding to, and fact-checking misinformation and disinformation targeted to specific users.[3]

But it is also important to recognize that manipulation and exploitation of the tools and scale these platforms provide goes beyond just foreign disinformation efforts.
In the same way that bots, trolls, click-farms, fake pages and groups, ads, and algorithm-gaming can be used to propagate political disinformation, these same tools can be, and have been, used to assist financial frauds such as stock-pumping schemes, click fraud in digital advertising markets, schemes to sell counterfeit prescription drugs, and efforts to convince large numbers of users to download malicious apps on their phones.[4] Addressing these diseconomies of scale, the negative externalities borne by users and society as a result of the size of these platforms, represents a priority for technology policy in the 21st century.

[1] U.S. Department of State, Soviet Influence Activities: A Report on Active Measures and Propaganda, 1986-87 (August 1987).
[2] U.S. Congress, Senate, Select Committee on Intelligence, Open Hearing: Disinformation: A Primer in Russian Active Measures and Influence Campaigns, 115th Cong., 1st sess., 2017.
[3] See Elizabeth Dwoskin and Annie Gowen, "On WhatsApp, fake news is fast and can be fatal," Washington Post, July 23, 2018; Nic Dias, "The Era of WhatsApp Propaganda Is Upon Us," Foreign Policy, August 2017.
[4] See, e.g., Robert Gorwa, "Computational Propaganda in Poland: False Amplifiers and the Digital Public Sphere," Working Paper No. 2017.4, Oxford Internet Institute, University of Oxford; Renae Merle, "Scheme created fake news stories to manipulate stock prices, SEC alleges," Los Angeles Times, July 5; Lauren Moss, "Xanax drug sold on social media found to be fake," BBC News, March 2018; Danny Palmer, "Android malware found inside apps downloaded 500,000 times," ZDNet, March 2018.

A second dimension relates to consumer protection in the digital age. As online platforms have gained greater prominence in our lives, they have developed more advanced capabilities to track and model consumer behavior, typically across the multiple devices a consumer owns. This includes detailed information on viewing, window-shopping, and purchasing habits, but also more sensitive information. The prevailing business model involves offering nominally free services, a model that results in consumers providing ever-more data in exchange for continued usage. User tracking can have important consumer benefits, for instance by showing users more relevant ads and helping to optimize user experience across different apps. At the same time, these user profiles could provide opportunities for consumer harm, and in surreptitious, undetectable ways. Pervasive tracking may give platforms important behavioral information on a consumer's willingness to pay, or on behavioral tendencies that can be exploited to drive engagement with an app or service. These technologies might even be used to influence how we engage with our own democracy here at home, as we saw in recent months with the Cambridge Analytica scandal, where sensitive Facebook data from up to 87 million people may have been used to inappropriately target U.S. voters. The allure of pervasive tracking also creates incentives to predicate services and credit on user behavior. Users have no reason to expect that certain browsing behavior could determine the interest they pay on an auto loan, much less that what their friends post could be used to determine that. Further, numerous studies indicate users have no idea their information is being
used in this manner, resulting in a massive informational asymmetry.[5] Important policy mechanisms include requiring greater disclosure by platforms, in clear, concise ways, about the types of information they collect and the specific ways they are utilizing it.

Lastly, the rise of a few dominant platforms poses key problems for long-term competition and innovation across multiple markets, including digital advertising markets (which support much of the Internet economy), future markets driven by machine learning and artificial intelligence, and communications technology markets. User data is increasingly the single most important economic input in information markets, allowing for more targeted and relevant advertisements, facilitating refinement of services to make them more engaging and efficient, and providing the basis for the machine-learning algorithms (which, for instance, develop decisional rules based on pattern-matching in large datasets) on which all industries will increasingly rely. Unlike many other assets, which tend to illustrate declining marginal utility, the value of any piece of data increases in combination with additional data.[6] Relatedly, data exhibits economies of scale, enabling more effective data analysis, computationally intensive pattern recognition, and computational learning as more data is collected.[7] As a consequence, firms with large preexisting datasets have potentially insuperable competitive advantages over new entrants and nascent firms.[8]

[5] Lee Rainie, "Americans' Complicated Feelings About Social Media in an Era of Privacy Concerns," Pew Research Center, March 2018 (noting that "people struggle to understand the nature and scope of the data collected about them"); Timothy Morey et al., "Customer Data: Designing for Transparency and Trust," Harvard Business Review, May 2015 ("While awareness varied by country... overall the survey revealed an astonishingly low recognition of the specific types of information tracked online. On average, only 25% of people knew that their data footprints included information on their location, and just 14% understood that they were sharing their web-surfing history.").
[6] Maurice Stucke & Allen P. Grunes, Big Data and Competition Policy (Oxford University Press, 2016), 200-201; OECD, "Data-Driven Innovation for Growth and Well-Being: Interim Report" (October 2014), 29 ("The diversification of services leads to even better insights if data linkage is possible. This is because data linkage enables 'super-additive' insights, leading to increasing 'returns to scope.' Linked data is a means to contextualize data and thus a source for insights and value that are greater than the sum of its isolated parts.").
[7] OECD, "Exploring the Economics of Personal Data" (2013), 34 ("The monetary, economic and social value of personal data is likely to be governed by non-linear, increasing returns to scale. The value of an individual record, alone, may be very low, but the value and usability of the record increases as the number of records to compare it with increases."); see also Frank Pasquale, "Paradoxes of Digital Antitrust," Harvard Journal of Law & Technology, July 2013 (describing the "Matthew Effect" in digital markets).
Dominant platforms have also aggressively commercialized behavioral research, identifying ways to exploit cognitive biases and vulnerabilities to keep users on the site and addicted to their products, generating more behavior data to mine.[9] As machine learning and AI begin to animate a wider variety of fields (medicine, transportation, law, accounting/bookkeeping, financial services), a handful of large platforms may be able to leverage large datasets to develop products faster and more efficiently than competitors. These advantages are especially pronounced because many machine-learning and AI techniques are openly extensible: pattern recognition, decisional rules, and computational learning tools can be applied to a new dataset (like tumor images) even if they were developed from a completely dissimilar dataset (such as cat pictures).

Policy Options

The size and reach of these platforms demand that we ensure proper oversight, transparency, and effective management of technologies that in large measure undergird our social lives, our economy, and our politics. Numerous opportunities exist to work with these companies, other stakeholders, and policymakers to make sure that we are adopting appropriate safeguards to ensure that this ecosystem no longer exists as the "Wild West," unmanaged and not accountable to users or broader society, and instead operates to the broader advantage of society, competition, and broad-based innovation. The purpose of this document is to explore a suite of options Congress may consider to achieve these objectives. In many cases there may be flaws in each proposal that may undercut the goal the proposal is trying to achieve, or pose a political problem that simply cannot be overcome at this time. This list does not represent every idea, and it certainly does not purport to answer all of the complex and challenging questions that are out there. The hope is that the ideas enclosed here stir the pot and spark a wider discussion among policymakers, stakeholders, and civil society groups on the appropriate trajectory of technology policy in the coming years.

[8] See Tom Simonite, "AI and 'Enormous Data' Could Make Tech Giants Harder to Topple," Wired, July 2017; see also Alon Halevy, Peter Norvig, and Fernando Pereira, "The Unreasonable Effectiveness of Data," IEEE Intelligent Systems, March/April 2009 (concluding that "invariably, simple models and a lot of data trump more elaborate models based on less data"); Chen Sun, Abhinav Shrivastava, Saurabh Singh, and Abhinav Gupta, "Revisiting Unreasonable Effectiveness of Data in Deep Learning Era," Google AI Blog, July 11, 2017 (finding that performance of computer vision models increases logarithmically based on the volume of training data).
[9] Ian Leslie, "The Scientists Who Make Apps Addictive," The Economist: 1843 Magazine, October/November 2016.

Disinformation and Misinformation/Exploitation of Technology

Duty to clearly and conspicuously label bots

Bots play a significant role in the amplification and dissemination of disinformation. Bot-enabled amplification and dissemination have also been utilized for promoting scams and financial frauds.[10] New technologies, such as Google Assistant's AI-enabled Duplex, will increasingly make bots indistinguishable from humans (even in voice interfaces).
To protect consumers, and to inhibit the use of bots for amplification of both disinformation and misinformation, platforms should be under an obligation to label bots, both those they provide (like Google's Duplex) and those used on the platforms they maintain (e.g., bot-enabled accounts on Twitter). California lawmakers have proposed something like this, colloquially referred to as a "Blade Runner" law after the 1980s movie, to do just that.[11]

Duty to determine origin of posts and/or accounts

Anonymity and pseudo-anonymity on social media platforms have enabled bad actors to assume false identities (and associated locations), allowing them to participate in and influence political debate on social media platforms. We saw this during the 2016 election, as IRA-affiliated actors pretended to be real Americans, flooding Facebook and Twitter newsfeeds with propaganda and disinformation. Forcing the platform companies to determine and/or authenticate the origin of accounts or posts would go far in limiting the influence of bad actors outside the United States. Facebook appears to have trialed an approach similar to this in May 2018:

[10] Samuel C. Woolley and Philip N. Howard, "Computational Propaganda Worldwide: Executive Summary," Working Paper No. 2017.11, Oxford Internet Institute, University of Oxford.
[11] SB-1001, "Bolstering Online Transparency (BOT) Act of 2018," CA Legislature, 2018.

[Screenshot: a Facebook Page transparency panel showing the Page's creation date, its history, and a note that there are no ads currently running from the Page.]

However, due to the widespread use of VPNs and other methods for masking IP addresses, determining the true origin of posts or accounts can be technically challenging. Such a scheme could result in a large number of false positives, potentially undermining its value. Facebook's trial, for instance, apparently associated pages with particular locations simply because a page admin had logged into their Facebook account from that country while traveling.

A duty on the part of service providers to identify the origin of posts or accounts also raises a number of privacy concerns. For one, it may incentivize online service providers to adopt identity verification policies at the cost of user privacy. Facebook has, for instance, come under criticism from a variety of groups and advocates (LGBT, Native American, and human rights groups) for its real-name policy. It may also better enable online platforms to track users. Lastly, location identification could potentially enable oppressive regimes to undermine and attack freedom of expression and privacy, particularly for those most vulnerable, including religious and ethnic minorities, dissidents, human rights defenders, journalists, and others. Any effort on this front must address the real safety and security concerns of these types of at-risk individuals.

Duty to identify inauthentic accounts

A major enabler of disinformation is the ease of creating and maintaining inauthentic accounts (not just bots but, in general, accounts that are based on false identities). Inauthentic accounts not only pose threats to our democratic process (with inauthentic accounts disseminating disinformation or harassing other users), but also undermine the integrity of digital markets (such as digital advertising).
Platforms have perverse incentives not to take inauthentic account creation seriously: the steady creation of new accounts allows them to show continued user growth to financial markets, and generates additional digital advertising money (both in the form of inauthentic views and from additional, often highly sensational, content to run ads against). A law could be crafted imposing an affirmative, ongoing duty on platforms to identify and curtail inauthentic accounts, with an SEC reporting duty to disclose to the public (and advertisers) the number of identified inauthentic accounts and the percentage of the user base they represented. Legislation could also direct the FTC to investigate lapses in addressing inauthentic accounts under its authority to address unfair and deceptive trade practices. Failure to appropriately address inauthentic account activity, or misrepresentation of the extent of the problem, could be considered a violation of SEC disclosure rules and/or Section 5 of the FTC Act.

Like a duty to determine the origin of accounts or posts, however, a duty on the part of online service providers to identify inauthentic accounts may have the effect of incentivizing providers to adopt identity verification policies, at the cost of user privacy. Mandatory identity verification is likely to arouse significant opposition from digital privacy groups, and potentially from civil rights and human rights organizations who fear that such policies will harm at-risk populations. In addition, any effort in this area needs to distinguish between inauthentic accounts created in order to mislead or spread disinformation and accounts clearly set up for satire and other legitimate forms of entertainment or parody.

Make platforms liable for state-law torts (defamation, false light, public disclosure of private facts) for failure to take down deep fake or other manipulated audio/video content

Due to Section 230 of the Communications Decency Act, internet intermediaries like social media platforms are immunized from state tort and criminal liability. However, the rise of technology like DeepFakes, sophisticated image and audio tools that can generate fake audio or video files falsely depicting someone saying or doing something, is poised to usher in an unprecedented wave of false and defamatory content, with state law-based torts (dignitary torts) potentially offering the only effective redress to victims. Dignitary torts such as defamation, invasion of privacy, false light, and public disclosure of private facts represent key mechanisms for victims to enjoin and deter sharing of this kind of content. Currently the onus is on victims to exhaustively search for, and report, this content to platforms, who frequently take months to respond and who are under no obligation thereafter to proactively prevent the same content from being re-uploaded in the future.[12] Many victims describe a "whack-a-mole" situation.[13] Even if a victim has successfully secured a judgment against the user who created the offending content, the content in question in many cases will be re-uploaded by other users. In economic terms, platforms represent "least-cost avoiders" of these harms; they are in the best place to identify and prevent this kind of content from being propagated on their platforms.
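To make that prevention mechanism concrete: the perceptual hashing technique referenced in the proposal below works by reducing a piece of media to a compact fingerprint that survives re-encoding, resizing, and minor edits, so a platform can cheaply compare every new upload against fingerprints of content already adjudicated tortious. The following is a minimal sketch of one common variant (a difference hash) in Python; it assumes the Pillow imaging library, and the function names and match threshold are illustrative rather than drawn from any platform's actual system.

```python
# Minimal perceptual-hash sketch (illustrative only, not any platform's real system).
# Assumes the Pillow imaging library: pip install Pillow
from PIL import Image

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Compute a difference hash: shrink, grayscale, compare adjacent pixels."""
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def is_reupload(candidate_hash: int, blocked_hashes: list[int], max_distance: int = 5) -> bool:
    """Flag a match when the Hamming distance to any blocked hash is small."""
    for blocked in blocked_hashes:
        if bin(candidate_hash ^ blocked).count("1") <= max_distance:
            return True
    return False
```

Production systems, such as Microsoft's PhotoDNA used against child-exploitation imagery, are far more robust to adversarial manipulation, but the underlying principle, fingerprint once and compare cheaply at upload time, is the same.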
Thus, a revision to Section 230 could provide the ability for users who have successfully proved that the sharing of particular content by another user constituted a dignitary tort to give notice of this judgment to a platform; with this notice, platforms would be liable in instances where they did not prevent the content in question from being re-uploaded in the future, a process made possible by existing perceptual hashing technology (e.g., the technology they use to identify and automatically take down child pornography). Any effort on this front would need to address the challenge of distinguishing true DeepFakes aimed at spreading disinformation from satire or other legitimate forms of entertainment and parody.

Reforms to Section 230 are bound to elicit vigorous opposition, including from digital liberties groups and online technology providers. Opponents of revisions to Section 230 have claimed that the threat of liability will encourage online service providers to err on the side of content takedown, even in non-meritorious instances. Attempting to distinguish between true disinformation and legitimate satire could prove difficult. However, the requirement that plaintiffs successfully obtain court judgments that the content in question constitutes a dignitary tort, which provides significantly more process than something like the Digital Millennium Copyright Act (DMCA) notice-and-takedown regime for copyright-infringing works, may limit the potential for frivolous or adversarial reporting. Further, courts already must make distinctions between satire and defamation/libel.

[12] Chris Silver Smith, "Paradigm Shift: Has Google Suspended Defamation Removals?" Search Engine Land, December 30, 2016.
[13] Kari Paul, "Reddit's Revenge Porn Policy Still Puts the Onus on Victims, Advocates Say," Motherboard, February 26, 2015.

Public Interest Data Access Bill

One of the gravest problems identified by people like Tristan Harris, Wael Ghonim, and Tom Wheeler is that regulators, users, and relevant NGOs lack the ability to identify potential problems (public health/addiction effects, anticompetitive behavior, radicalization) and misuses (scams, targeted disinformation, user-propagated misinformation, harassment) on the platforms, because access to data is zealously guarded by the platforms.[14] Under this view, we could propose legislation that guarantees that platforms above a certain size provide independent, public interest researchers with access to anonymized activity data, at scale, via a secure API. The goal would be to allow researchers to measure and audit social trends on platforms. This would ensure that problems on, and misuse of, the platforms were being evaluated by researchers and academics, helping generate data and analysis that could inform actions by regulators or Congress.

While at first glance this might seem drastic, the upshot is that the platforms have already developed methods by which researchers can gain anonymized activity data, at scale; the current problem is that much of this research is proprietary, and platforms typically condition access to it on a researcher signing an NDA (compromising their independence). Further, as Bloomberg has reported, platforms have typically sought collaborations with researchers whose projects comport with their business goals, while excluding researchers whose work may be adverse to their interests.[15]

[14] Wael Ghonim and Jake Rashbass, "It's Time to End the Secrecy and Opacity of Social Media," Washington Post, October 31, 2017; Stefan Verhulst and Andrew Young, "How the Data that Internet Companies Collect Can Be Used for the Public Good," Harvard Business Review, January 23, 2018; Tom Wheeler, "How to Monitor Fake News," New York Times, February 26, 2018.
[15] Karen Weise and Sarah Frier, "If You're a Facebook User, You're Also a Research Subject," Bloomberg, June 14, 2018.
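What such a "secure API" might enforce can be sketched briefly. One plausible control, assumed here purely for illustration (the threshold, field names, and query shape are hypothetical, not a description of any existing platform API), is a minimum-aggregation rule: researchers receive only cohort-level counts, and any cohort small enough to risk identifying an individual is suppressed.

```python
# Illustrative sketch of a public-interest research endpoint (hypothetical, not a real API).
# Researchers receive only aggregate counts; cohorts smaller than K are suppressed so that
# no individual user's activity can be singled out from the released data.
from collections import Counter

K_ANONYMITY_THRESHOLD = 50  # assumed policy parameter, to be set by regulation or an IRB

def aggregate_activity(events: list[dict], group_by: str) -> dict[str, int]:
    """Return per-group event counts, withholding any group below the threshold."""
    counts = Counter(event[group_by] for event in events)
    return {group: n for group, n in counts.items() if n >= K_ANONYMITY_THRESHOLD}

# Example: how widely each URL was shared, released only where >= 50 share events exist.
shares = [{"url": "example.com/a"}] * 120 + [{"url": "example.com/rare"}] * 3
print(aggregate_activity(shares, group_by="url"))  # the 3-event cohort is suppressed
```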
Under immense public and political pressure, Facebook has proposed a system somewhat similar to a public interest research access regime, in collaboration with the Social Science Research Council. Large-scale implementation of such an initiative does present a number of practical challenges, however. To protect user privacy, a number of controls would need to be required, including contractual controls, technical controls, criminal penalties for misuse of data by researchers, extensive auditing, compliance checks, and institutional review boards (IRBs). At the same time, extensive privacy protections may simultaneously inhibit the ability of researchers to effectively use platform data for research.

Further, experts point out that as important as ensuring researcher access to platform data is regulating the use of behavior data by platforms. Experts have pointed to a need to regulate the use of corporate behavioral science, focusing on research controls (such as requiring companies to run research through an IRB) and the implications of behavior research on their business models. Commercial behavioral science may provide large platforms with unfair competitive advantages, allowing platforms to use behavior data to model new features that drive higher levels of user engagement. These practices even extend to conditioning user behavior, designing (and refining) products to be intentionally habit-forming. These practices raise important questions related to consumer protection, competition, and privacy.

Require Interagency Task Force for Countering Asymmetric Threats to Democratic Institutions

After multiple briefings and discussions, it is evident that the intelligence and national security communities are not as well-positioned to detect, track, attribute, or counter malicious threats to our political system as they should be. From information operations to cyber-attacks to illicit finance and money laundering, our democratic institutions face a wide array of new threats that do not fit easily into our current national security authorities and responsibilities. As just one example, programs to detect and protect against information operations are disparately positioned, with unclear reporting chains, and lack metrics for measuring success. Standing up a congressionally-required task force would help bring about a whole-of-government approach to countering asymmetric attacks against our election infrastructure and would reduce gaps that currently exist in tracking and addressing the threat. This could typically be done by the President without legislation; however, President Trump seems unwilling to touch the issue, and as such, Congress could force the issue as it did with the creation of the State Department Global Engagement Center.
However, as the Global Engagement Center has proven, without engaged leadership these types of legislated entities can easily be starved of resources or authorities.

Disclosure Requirements for Online Political Advertisements

As the Senate Select Committee on Intelligence helped to uncover during its investigation into Russian interference in the 2016 elections, the ease with which our foreign adversaries purchased and targeted politically oriented ads during the campaign exposed an obvious threat to the integrity of our democracy. Because outdated election laws have failed to keep up with evolving technology, online political ads have had very little accountability or transparency, as compared to ads sold on TV, radio, and satellite. Improving disclosure requirements for online political advertisements, and requiring online platforms to make all reasonable efforts to ensure that foreign individuals and entities are not purchasing political ads, seem like a good first step in bringing more transparency online. The Honest Ads Act (S.1989) is one potential path, but there are other reasonable ways to increase disclosure requirements in this space.

Public Initiative for Media Literacy

Addressing the challenge of misinformation and disinformation in the long term will ultimately need to be tackled by an informed and discerning population of citizens who are both alert to the threat and armed with the critical thinking skills necessary to protect against malicious influence. A public initiative, propelled by federal funding but led in large part by state and local education institutions, focused on building media literacy from an early age would help build long-term resilience to foreign manipulation of our democracy. Such an effort could benefit from the resources and knowledge of private sector tech companies, as well as the expertise and training of some of the country's most credible and trustworthy media entities. One particularly difficult challenge in any long-term effort like this, however, is establishing and tracking metrics for real success. It is not enough for social media companies or the tech community to simply give lip service to building long-term resiliency and media literacy without taking much more significant short-term steps to address the threat we face in the here and now. A public effort like this should be seen as augmenting or supporting more assertive and more aggressive policy steps.

At the same time, technology scholars such as danah boyd have argued that emphasis on media literacy obscures the real problems around online consumption of misinformation: distrust of media sources and a proclivity of users to deploy online information in service of strongly-held ideological or identity-based claims or beliefs.[16] A recent study by Gallup and the Knight Foundation found that "[p]eople rate partisan news stories as more or less trustworthy depending on whether the source is viewed as sympathetic or hostile to their political preferences," rather than on the content of the story.[17] Under this view, empowering individuals as fact-checkers and critics may exacerbate distrust of institutions and information intermediaries. More important than building capacity for individuals to scrutinize sources is cultivating a recognition that information can (and will) be weaponized in novel ways, along with an understanding of the pathways by which misinformation spreads.

[16] danah boyd, "You Think You Want Media Literacy... Do You?" Medium, March 9, 2018.
[17] "An Online Experimental Platform to Assess Trust in the Media," Gallup Inc. and the John S. and James L. Knight Foundation, July 18, 2018.

Increasing Deterrence Against Foreign Manipulation
The U.S. government needs to do more to strengthen our security against these types of asymmetric threats. We have to admit that our strategies and our resources have not shifted to aggressively address these new threats in cyberspace and on social media that target our democratic institutions. Russia spends about $70 billion a year on its military. We spend ten times that. But we are spending it mostly on physical weapons designed to win wars that take place in the air, on land, and at sea. While we need to have these conventional capabilities, we must also expand our capabilities so that we can win on the expanded battlefields of the 21st century. Until we do that, Russia is going to continue getting a lot more bang for its buck.

The consequences of this problem are magnified because we lack a deterrence strategy that would discourage cyberattacks or information warfare targeting our democracy. In the absence of a credible deterrent, there is nothing preventing Russia or another adversary from continuing to use a tool that, frankly, has been working. It is not even clear which of the numerous agencies and departments tasked with responding to the cyber threat is supposed to be in charge. We must spell out a deterrence doctrine, so that our adversaries do not see information warfare or cyberattacks against us as a "free lunch." The U.S. has often done too little to respond to these attacks against us or our allies. When we do respond, it has often been done quietly, and on a one-off basis. That has not been enough to deter future action. We need to make clear to Russia and other nations that if you go after us in the cyber realm, we are going to punch back using our own cyber capabilities. And we need to increase the costs of this activity with robust sanctions and other tools.

Privacy and Data Protection

Information fiduciary

Yale law professor Jack Balkin has formulated a concept of "information fiduciaries": service providers who, because of the nature of their relationship with users, assume special duties to respect and protect the information they obtain in the course of the relationship. Balkin has proposed that certain types of online service providers, including search engines, social networks, ISPs, and cloud computing providers, be deemed information fiduciaries because of the extent of user dependence on them, as well as the extent to which they are entrusted with sensitive information.[18] A fiduciary duty extends beyond a mere tort duty (that is, a duty to take appropriate care): a fiduciary duty would stipulate not only that providers had to zealously protect user data, but also that they pledge not to utilize or manipulate the data for the benefit of the platform or third parties (rather than the user). This duty could be established statutorily, with defined functions/services qualifying for classification as an information fiduciary.

[18] Jack M. Balkin, "Information Fiduciaries and the First Amendment," UC Davis Law Review, Vol. 49, No. 4, April 2016 (noting that in addition to performing professional services, "fiduciaries also handle sensitive personal information. That is because, at their core, fiduciary relationships are relationships of trust and confidence that involve the use and exchange of information"); Jack M. Balkin and Jonathan Zittrain, "A Grand Bargain to Make Tech Companies Trustworthy," The Atlantic, October 3, 2016.
Dctober 3, 2'31 (v 1 l4 estahlished statutorily, with defined l?unctionsi?seryices qualifying for classification as an in form ation l?i iary . Concretely defining what responsihilities a fiduciary relationship entails presents a more difficult challenge. Appropriate responsibilities ntay yary based on a number ol" factors, including the yalue that consumers deriye front the seryice, whether consumers are paying ntonetarily tor the service, and the extent ol" data collection by the seryice proyider. Applying a one?size?l?its?all set of fiduciary duties may inhibit the range ol? seryices consunters cart access, while driying online business models towards more uniform ol?l?erings. Privacy rulentaking authority at FTC Man,r attribute the thilure to adequately police data protection and unfair competition in digital markets to its lack of genuine rulentaking authority {which it has lacked since 1980). Efforts to endow the FTC with rulentaking authority most recently in the contest ol?Dodd?Frank haye been defeated. the FTC had genuine rulentaking authority, rnany claim, it would he able to respond to changes in technology and business practices. In addition, many haye suggested that Congress shouch proyide the FTC with additional resources. The funding since 2010 has fallen by Significantly more funding is necessary in order for the FTC to deyelop tools necessary to eyaluate contples algorithntic systems for unl?airness, deception, or competition concerns. Comprehensive {GDPR-like) data protection legislation The US could adopt rules ruirroring GDPR, with key features like data portahility, the right to be forgotten, TZ?hour data breach notification, 1-?1 party consent, and other major data protections. Business processes that handle personal data would he built with data protection by design and by default, nteaning personal data must he stored using pseudonyntisation or full anonyntization. Under a regime similar to GDPR, no personal data could he processed unless it is done under a lawful hasis specified by the regulation, or it" the data processor has receiyed an unambiguous and indiyidualized consent from the data subject {1?1 party consent). In addition, data suhjects haye the right to request a portable copy ol" the data collected by a processor and the right to haye their data erased. Businesses must report any data breaches within 7?2 hours it" they haye an adyerse effect on user priyacy. One major tenant ol?the GDPR {that the US could or could not adopt} is the potential of IS high penalties for non?compliance in which a corupany or organization can be fined (in the EU, penalties are up to 4% of its annual global turnover or ?20 million - whicheyer is higher). US. firrus haye yoiced seyeral concerns about the GDPR, including how it will be impleruented and the scale ofpotential fines. In addition, if like legislation were to be proposed, a central authority would need to be created to enforce these regulations. EU. rueruber states haye their own data priyacy authorities to enforce the GDPR, but this does not exist in the US. Delegating this responsibility to states could result in a patchwork of data protection and priyacy regulations. In some respects, there are also indications GDPR ruay take too extreme a yiew of what constitutes personal data. For instance, doruain registration information the historically public inforruation about the indiyidual who has registered a giyen doruain, which operates ruuch like a phonebook is treated as personal data under GDPR. 
Treating domain registration information this way poses serious problems for the operation of the WHOIS database, a vital repository of domain registry information for those investigating online scammers, and many have suggested it will undermine cybersecurity investigations.

1st Party Consent for Data Collection

The U.S. could adopt one specific element of the GDPR: requiring 1st party consent for any data collection and use. This would prevent third parties from collecting or processing a user's data without the user's explicit and informed consent. Because third-party data collection is a practice reliant on consent that is not explicit, the GDPR renders much third-party activity obsolete. Critics have acknowledged the need to remove some of the more salacious practices that go on with third-party access to data, but have also called for more clarity on the explicit consent side, due to the negative consequences that can result from removing the third-party data market in its entirety. Under such a regime, the supply of first-party data will likely decrease. A 2010 study by the European Commission (EC) found that "89% of respondents agreed that they avoid disclosing their personal information online."

Critics have noted, however, that a focus on user consent tends to mask greater imbalances in the bargaining power between users and service providers. The strong network effects of certain online services, and the costs to users of foregoing those services, may undermine the extent to which consent is "freely" given. Further, absent restrictions on practices like "dark patterns" (which manipulate user interfaces to steer users towards consenting to settings and practices advantageous to the platform), an emphasis on user consent may be naive.

Statutory determination that so-called "dark patterns" are unfair and deceptive trade practices

Dark patterns are user interfaces that have been intentionally designed to sway (or trick) users towards taking actions they would otherwise not take under effective, informed consent. Often, these interfaces exploit the power of defaults, framing a user choice as agreeing with a skewed default option (which benefits the service provider) and minimizing the alternative options available to the user. A vivid example of this practice is below, where Facebook deceptively prods users into consenting to upload their phone contacts to Facebook (something highly lucrative to Facebook in tracking a user's "social graph"):

[Screenshots: two Messenger dialogs. The first screen, "Adding your contacts to Messenger," states that continuously uploading contacts helps users connect with friends, presents a prominent "OK" button, and warns that skipping the step means adding each contact one by one. The second screen, reached only by clicking "Learn More," repeats that syncing contacts helps friends connect, with the opt-in highlighted and the opt-out de-emphasized.]

(First screen presented to users; second screen presented to users upon clicking "Learn More")

The first screen gives the false impression that there is only one option, with a bouncing arrow below the option pushing users towards consent. If users click "Learn More" (which is the path towards declining consent), they are presented with yet another deceptively-designed
Tho FTC Act coulrl ho uprlatorl to rlofino thoso kinrls of practicos which aro hasorl on rlosign tricks to osploit human as per so unfair anrl rlocoptiyo. Ono clrawhack of corlifying this prohibition in statuto is that tho law may ho slow to noon forms of thoso practicos not anticipatorl hy rlraftors. To this, tho FTC coulrl ho giyon rulomaking authority to onsuro that tho law koops paco with husinoss practicos. Algorithmic aurlitahilityffairnoss Tho forloral goyornmont coulrl sot manrlatory stanrlarrls for algorithms to ho aurlitahlo hoth so that tho outputs of algorithms aro oyaluatorl for officacyi?fairnoss woro you justifiably rojoctorl for a mortgago hasorl on tho clofinorl factors 15') as woll as for potontial hirlrlon hias. This coulrl ho ostahlishorl for algorithms anrl Al?hasorl systoms usorl for .rpcr'ifficjinn-Irons {liko oligihility for omploymont, anrl housing opportunities). For ins tanco, Finland rocontly pass ed a law prohibiting tho ?discriminatory uso? of artificial intolligonco in clocisions ahout financial crorlit. Or it coulrl ho ostahlishorl hasorl on magniturlo {in othor worrls, an algorithmic systom that coyors oyor BOOM pooplo). Unrlor GDPR, usors hayo numorous rights rolatorl to automatorl rlocision?making, particularly ifthoso procossos hayo logal or significant offocts. Thoso inclurlo furnishing inrliyirluals with information ahout tho automatot'l rlocision-making procoss, proyirling ways for tho consumor to roquost human intoryontion in tho procoss (or a challongo oftho automatorl procoss that is arljurlicatorl by a human), as woll as rogular aurlits of tho automatorl rlocision?making procoss to onsuro it is working as intonrlorl. A first stop towarrls this {somothing that coulrl, for instanco, ho insortorl into tho annual National Dofonso Authorization fact) woulrl ho to roquiro that any algorithmic rlocision?making prorluct tho goyornmont huys must satisfy algorithmic aurlital'iility stanrlarrls rlologatorl to NIST to rloyolop. Moro hroarlly, a manrlato woulrl roquiro soryico proyirlors to proyirlo consumors with tho sourcos of rlata usorl to mako algorithmic rlotorminations or classifications. Soryico proyirlors woulrl also noorl to furnish cons umors with information on tho rocipionts of that rlata or whilc also cstahlishing by which consumcrs can or crroncous data. Critics of this approach will likcly arguc that many mcthorls of machinc lcarning proclucc outputs that cannot hc prcciscly caplaincrl and that rcquiring caplainahility will comc at tho cost of computational cfficicncy or, hccausc outputs of machinc lcarning-hasccl arc not strictly caplainahility is not l?casihlc. Particularly in and housing opportunitics, a ofcomputational incl?l?icicncy an cost to promotc grcatcr auclitahility, and Morcoycr, whilc algorithmic may not hc l?casihlc or prcl?crahlc, a rangc of tools and caist to algorithms align with Roy yalucs, and lcgal rulcs.? ompctition Data Transparency Bill opacity of platforms? and ass data as a major ohstaclc to agcncics likc FTC compctitiyc [or cons umcr) harms. This lack of transparency is also an impcdimcnt to consumcrs ?voting with thcir and moying to compcting scryiccs that cithcr protcct thcir priyacy or for uscs ol? thcir data.2U Unc of major among dominant platforms, for instancc, is that of hargain scryiccs in cachangc for to consumcr data co nti nuc to bc amcnd cd in favor of platform. 
Googlc?s cmail scryicc, for ins tancc, was oncc prcdicatcrl on tho itlca that uscrs a l?rcc cmail scryicc in cachangc for Googlc using cmail data for morc ads. Incrcasingly, Googlc has found miter uscs for this data, hcyoncl of original rlcal. Similarly, Facchook has marlc cons umcrs agrcc to giyc up additional data as a condition for using its mcssaging scryicc on thcir smartphoncs: prcyiously thcy coulcl usc mcssaging l?caturc through thcir Facchook latcr madc download a icatcd app that cons id crably morc data. Joshua r?t. Kroll cl 165 U. Pa. L. Rcy. 633 Mauricc E. Sluckc ?5 Allcn P. Emacs. Big Data and Pat?t'cjr (Oxford: Oxford Uttiycrsily 2016}, 333. Many ohseryers haye noted that the collection and use of data is at odds with consumer expectations. Legislation could require companies to more granularly (and continuously) alert consumers to the ways in which their data was being used, counterparties it was being shared with, and {perhaps most importantly) what each user?s data was worth. to the platform. Requiring that ?free? platforms provide users with an annual estimate ofwhat their data was worth to the platform would provide significant ?price? transparency, educating consumers on the true value of their data and potentially attracting new competitors whose seryices (and data collectionfuse policies) the consumer could eyaluate against existing seryices. Lastly, data transparency wouch also assist antitrust enforcement agencies like the FTC and by proyiding concrete and granular metrics on how much yalue data proyides a giyen company, allowing enforcement agencies to identify {particularly retrospectiyely) anticompetitiye transactions as ones that significantly increase the yalue a company extracts from users [which in the data-centric markets is equiyalent to a price increase). Data Portability Bill As platforms grow in size and scope, network effects and lock?in effects increase; consumers face diminished incentiyes to contract with new proyiders, particularly if they haye to once again proyide a full set of data to access desired functions. The goal of data portability is to reduce consumer switching costs between digital seryices (whose efficiency and customization depends on user data). The adoption of a local nuruber portability requirement by Congress in the Telecommunications Act of 1996 had a substantial procompetitiye effect, particularly in the mobile market by facilitating competitiye switching by customers. A data portability requirement would be predicated on a legal recognition that data supplied by [or generated from) users {or user actiyity) is the users not the service proyid er?s. In other words, users would be endowed with property rights to their data. This approach is already taken in Europe {under GDPR, seryice proyiders must proyide data, free of charge, in structured, commonly?used, machine-readable format) but a robust data ownership proposal might garner pushback in the US. More modestly, a requirement that consumers be permitted to their data in structured data, macl'tine?readable format without addressing the underlying ownership issue, would be more feasible. [d Ono potcntial in cstahlishing a clata portahility is to it to 039mm clata by a scryicc proyirlcr. ln onc this clata is clata noon! uscr, clcriyccl from tho actiyity of uscr. Scryicc arc likcly to claim that clata for instancc, classifications or about a uscr hasccl on actiyity hclong to scryicc proyiclcr. 
Scryicc may inyokc l?l protcctions against sharing that clata which thcy may charactcrizc as with thircl partics. Additionally, clata portability can posc a numhcr of risks if not Spccil?ically, it incrcascs attack surl?acc hy cnlarging numhcr ol?sourccs l?or attackers to siphon uscr clata; ifthc mcchanism hy which clata is portct'l {typically an API) is not unauthoriaccl partics coulrl usc it to clata guisc ol? portahility rcqucsts. is also a risk that, if not clcyisccl appropriatcly, clata portahility coulcl hc usccl hy clominant rites ol?smallcr, proyit'lcrs. Largc arc hcst?positionccl to ol?l?cr to uscrs to suhmit portahility rcqucsts to now cntrants who may posc a compctitiyc throat to thcm. Smallcr also may hayc ability to proccss portahility rcqucsts, ancl ability to portahility mcchanisms sccurcly. For this rcason, any portahility manclatc shouch iclcally hc imposch on photo a ccrtain sizc, or who hayc to holcl clominant positions in particular Imposing an intcropcrahility on clominant platforms to blunt thcir ahility to thcir clominancc oycr onc markct or l?caturc into or acljaccnt or proclucts coulcl ho a powcrl?ul catalyst of compctition in cligital Moro importantly, an intcropcrahility that in somc l?or instancc, nctwork cl?l?ccts arc so or it woulrl hc uncconomical for a now platform to raclically kcy l'unctions by a clominant inc clata portahility alonc will not proclucc procompctitiyc outcomcs. For instancc, allowing mcssaging or photo? sharing startups access to tho ?social graph? ofFaccbook would allow uscrs to communicatc morc hroaclly without a now startup haying to {unfcasihly ancl uncconomically) an entirelv new Facehook. A prominent template for this was in the AOLlTime Warner merger, where the FCC identified instant ntessaging as the ?killer app? the app so popular and dominant that it would drive consumers to continue to pav for AOL service despite the eaistence of more innovative and efficient email and internet connectivitv services. To address this, the FCC required AOL to make its instant messaging service (AIM, which also included a social graph) interoperahle with at least one rival immediater and with two ether rivals within 6 months. Another example was the interoperability decrees with respect to Intel?s treatment of NVIDIA. Interoperabilityr is seen as falling within the ?ex is ting toolkit" regulators have to address a dominant platform; ohservers have noted that ?Regional Bell Operating Company" (RBOC) interoperability with long distance carriers actually worked quite well. Some experts have expressed concern with the managed interoperabilitv approach, suggesting it might create too coav a relationship between regulatorv agencies and the platforms. ['Iowever, a tailored interoperahilitv requirement mav not pose the same regulatorv capture concerns. lnteroperahilitv could he achieved hv mandating that dominant platforms maintain APIs for third partv access. Anticipating platforms? counter-arguments that fullv open APIs could invite abuse, the requirement could he that platforms maintain transparent, third?partv accessihle APIs under terms that are fair, reasonable, and non?discriminatorv As with data portahilitv, securitv esperts have observed that interoperahilitv could increase the attack surface of any given platform. Implementing APIs securelv cart he difficult for even mature providers; for instance, it was a weakness in Apple?s iCIloud API (allowing attackers to make unlimited attempts at guessing victims? 
passwords) that contributed to the 2014 hacks of major celebrities' photos.

Opening federal datasets to university researchers and qualified small businesses/startups

Structured data is increasingly the single most important economic input in information markets, allowing for more targeted and relevant advertisements, facilitating refinement of services to make them more engaging and efficient, and providing the basis for the machine-learning algorithms (which develop decisional rules based on pattern-matching in large sets of training data) on which all industries will increasingly rely. Large platforms have successfully built lucrative datasets by mining consumer data over significant timescales, and separately through buying smaller companies that have unique datasets. For startups and researchers, however, access to large datasets increasingly represents the largest barrier to innovation, so much so that university researchers are steadily leaving academia not only for higher pay, but also for access to unrivalled or unique datasets to continue their work. The federal government, across many different agencies, maintains some of the most sought-after data in many different fields, such that even the largest platforms are pushing the Trump Administration to open this data to them. To catalyze and sustain long-term competition, however, Congress could ensure that this data be provided only to university researchers and qualified small businesses, with contractual prohibitions on sharing this data with companies above a certain size. Numerous precedents already exist for government contractual agreements only with smaller or non-commercial entities (e.g., in procurement).

Essential Facilities Determinations

Certain technologies serve as critical, enabling inputs to wider technology ecosystems, such that control over them can be leveraged by a dominant provider to extract unfair terms from, or otherwise disadvantage, third parties. For instance, Google Maps maintains a dominant position in digital mapping (enhanced by its purchase of Waze), serving as the key mapping technology behind millions of third party applications (mobile and desktop) and enabling Google to extract preferential terms and conditions (such as getting lucrative in-app user data from the third-party apps as a condition of using the Maps function). Legislation could define thresholds, for instance user base size, market share, or level of dependence of wider ecosystems, beyond which certain core services would constitute "essential facilities," requiring a platform to provide third party access on fair, reasonable and non-discriminatory (FRAND) terms and preventing platforms from engaging in self-dealing or preferential conduct. In other words, the law would not mandate that a dominant provider offer the service for free; rather, it would be required to offer it on reasonable and non-discriminatory terms (including, potentially, requiring that the platform not give itself better terms than it gives third parties). Examples of this kind of condition are rife in areas such as telecommunications regulation, where similar conditions have been imposed on how Comcast's NBC-Universal subsidiary engages with Comcast and Comcast rivals.