HEARING BEFORE THE UNITED STATES SENATE COMMITTEE ON COMMERCE, SCIENCE, AND TRANSPORTATION

October 28, 2020

Testimony of Mark Zuckerberg
Facebook, Inc.

I. Introduction

Chairman Wicker, Ranking Member Cantwell, and members of the Committee, thank you for the opportunity to be here today. Facebook’s mission is to give people the power to build community and bring the world closer together. Our products enable more than 3 billion people around the world to share ideas, offer support, and discuss important issues. We know we have a responsibility to make sure people using our products can do so safely, and we work hard to set and enforce policies that meet this goal.

II. CDA Section 230’s Role in Giving People a Voice and Keeping Them Safe

Section 230 of the Communications Decency Act is a foundational law that allows us to provide our products and services to users. At a high level, Section 230 does two things:

• First, it encourages free expression. Without Section 230, platforms could potentially be held liable for everything people say. Platforms would likely censor more content to avoid legal risk and would be less likely to invest in technologies that enable people to express themselves in new ways.

• Second, it allows platforms to moderate content. Without Section 230, platforms could face liability for doing even basic moderation, such as removing hate speech and harassment that impacts the safety and security of their communities.

Thanks to Section 230, people have the freedom to use the internet to express themselves. At Facebook, this is one of our core principles. We believe in giving people a voice, even when that means defending the rights of people we disagree with. Free expression is central to how we move forward together as a society. We’ve seen this in the fight for democracy around the world, and in movements like Black Lives Matter and #MeToo.
Section 230 allows us to empower people to engage on important issues like these—and to provide a space where non-profits, religious groups, news organizations, and businesses of all sizes can reach people.

Section 230 also allows us to work to keep people safe. Facebook was built to enable people to express themselves and share, but we know that some people use their voice to cause harm by trying to organize violence, undermine elections, or otherwise hurt people. We have a responsibility to address these risks, and Section 230 enables us to do this more effectively by removing the threat of constant litigation we might otherwise face.

We want Facebook to be a platform for ideas of all kinds, but there are specific types of harmful content that we don’t allow. We publish our content policies in our Community Standards, and we update them regularly to address emerging threats. To address each type of harmful content, we’ve built specific systems that combine sophisticated technology and human judgment. These systems enabled us to take down over 250 million pieces of content that violated our policies on Facebook and Instagram in the first half of 2020, including almost 25 million pieces of content relating to terrorism and organized hate, almost 20 million pieces of content involving child nudity or sexual exploitation, and about 8.5 million pieces of content identified as bullying or harassment. We report these numbers as part of our Transparency Reports, and we believe all other major platforms should do the same so that we can better understand the full picture of online harms.

However, the debate about Section 230 shows that people of all political persuasions are unhappy with the status quo. People want to know that companies are taking responsibility for combatting harmful content—especially illegal activity—on their platforms. They want to know that when platforms remove content, they are doing so fairly and transparently.
And they want to make sure that platforms are held accountable.

Section 230 made it possible for every major internet service to be built and ensured important values like free expression and openness were part of how platforms operate. Changing it is a significant decision. However, I believe Congress should update the law to make sure it’s working as intended. We support the ideas around transparency and industry collaboration that are being discussed in some of the current bipartisan proposals, and I look forward to a meaningful dialogue about how we might update the law to deal with the problems we face today.

At Facebook, we don’t think tech companies should be making so many decisions about these important issues alone. I believe we need a more active role for governments and regulators, which is why in March last year I called for regulation on harmful content, privacy, elections, and data portability. We stand ready to work with Congress on what regulation could look like in these areas. By updating the rules for the internet, we can preserve what’s best about it—the freedom for people to express themselves and for entrepreneurs to build new things—while also protecting society from broader harms. I would encourage this Committee and other stakeholders to make sure that any changes do not have unintended consequences that stifle expression or impede innovation.

III. Preparing for the 2020 Election and Beyond

The issues of expression and safety are timely as we are days away from a presidential election in the midst of a pandemic. With COVID-19 affecting communities around the country, people will face unusual challenges when voting. Facebook is committed to doing our part to help ensure everyone has the chance to make their voice heard. That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of election-related violence and unrest.
This election season, Facebook has run the largest voting information campaign in American history. Based on conversion rates we calculated from a few states we partnered with, we’ve helped an estimated 4.4 million people register to vote across Facebook, Instagram, and Messenger. We launched a Voting Information Center to connect people with reliable information on deadlines for registering and voting and details about how to vote by mail or vote early in person, and we’re displaying links to the Voting Information Center when people post about voting on Facebook. We’ve directed more than 39 million people so far to the Voting Information Center, and we estimate we’ve helped about 100,000 people sign up to be poll workers.

We’re also working to tackle misinformation and voter suppression. We’ve displayed warnings on more than 150 million pieces of content that have been debunked by our third-party fact-checkers. We’re partnering with election officials to remove false claims about polling conditions, and we’ve put in place strong voter suppression policies that prohibit explicit or implicit misrepresentations about how or when to vote, as well as attempts to use threats related to COVID-19 to scare people into not voting. We’re removing calls for people to engage in poll watching that use militarized language or suggest that the goal is to intimidate, exert control, or display power over election officials or voters. In addition, we’re blocking new political and issue ads during the final week of the campaign, as well as all political and issue ads after the polls close on election night.

Since many people will be voting by mail, and since some states may still be counting valid ballots after election day, many experts are predicting that we may not have a final result on election night.
It’s important that we prepare for this possibility in advance and understand that there could be a period of uncertainty as the final results are counted, so we’ve announced a variety of measures to help in the days and weeks after voting ends:

• We’ll use the Voting Information Center to prepare people for the possibility that it may take a while to get official results. This information will help people understand that there is nothing illegitimate about not having a result on election night.

• We’re partnering with Reuters and the National Election Pool to provide reliable information about election results. We’ll show this in the Voting Information Center so it’s easily accessible, and we’ll notify people proactively as results become available. Importantly, if any candidate or campaign tries to declare victory before the results are in, we’ll add a label to their post stating that official results are not yet in and directing people to the official results.

• We’ll attach an informational label to content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, for example, by claiming that lawful methods of voting will lead to fraud. This label will provide basic reliable information about the integrity of the election and voting methods.

• We’ll enforce our violence and harm policies more broadly by expanding our definition of high-risk targets to include election officials in order to help prevent any attempts to pressure or harm them, especially while they’re fulfilling their critical obligations to oversee the vote counting.

• We’ve strengthened our enforcement against militias, conspiracy networks, and other groups that could be used to organize violence or civil unrest in the period after the election. We have already removed thousands of these groups from our platform, and we will continue to ramp up enforcement over the coming weeks.
It’s important to recognize that there may be legitimate concerns about the electoral process over the coming months. We want to make sure people can speak up if they encounter problems at the polls or have been prevented from voting, but that doesn’t extend to spreading misinformation.

Four years ago we encountered a new threat: coordinated online efforts by foreign governments and individuals to interfere in our elections. This threat hasn’t gone away. We’ve invested heavily in our security systems and now have some of the most sophisticated teams and systems in the world to prevent these attacks, including the teams working in our dedicated Election Operations Center. Since 2017, we’ve removed more than 100 networks worldwide engaging in coordinated inauthentic behavior, including ahead of major democratic elections, and we’ve taken down 30 networks so far this year. We’re also blocking ads from state-controlled media outlets in the US to provide an extra layer of protection against various types of foreign influence in the public debate ahead of the election.

IV. Supporting a Healthy News Ecosystem

Facebook also supports our democracy by supporting journalism—particularly local journalism, which is vital for helping people be informed and engaged citizens. Facebook has invested hundreds of millions of dollars across a variety of initiatives to support a healthy news and journalism ecosystem. We launched Facebook News in October 2019, making a $300 million commitment to help publishers invest in building their readership and subscription models. We now have multi-year partnerships with ABC News, The New York Times, The Wall Street Journal, The Washington Post, BuzzFeed, Fox News, The Dallas Morning News, and many more.
Among other benefits, Facebook provides publishers with free organic distribution of news and other content, which grows audience and revenue for news publishers; customized tools and products to help publishers monetize their content; and initiatives to help them innovate with online news content. We’ve also built tools to help publishers increase their subscribers by driving people from Facebook links to publisher websites. Helping publishers reach new audiences has been one of our most important goals, and we have found that over 95% of the traffic Facebook News now delivers to publishers is in addition to the traffic they already get from News Feed.

The Facebook Journalism Project is another initiative to create stronger ties between Facebook and the news industry. Over the past three years, we’ve invested more than $425 million in this effort, including developing news products; providing grants, training, and tools for journalists; and working with publishers and educators to increase media literacy. Since launching the Facebook Journalism Project, we have met with more than 2,600 publishers around the world to understand how they use our products and how we can make improvements to better support their needs. This investment includes support for organizations like the Pulitzer Center, Report for America, the Knight-Lenfest Local News Transformation Fund, the Local Media Association and Local Media Consortium, the American Journalism Project, and the Community News Project. We’ve seen how important it is that people have information they can rely on, and we’re proud to support organizations like these that play a critical role in our democracy.

V. Conclusion

I’d like to close by thanking this Committee, and particularly Chairman Wicker and Ranking Member Cantwell, for your leadership on the issue of online privacy.
Facebook has long supported a comprehensive federal privacy law, and we have had many constructive conversations with you and your staffs as you have crafted your proposals. I understand that there are still difficult issues to be worked out, but I am optimistic that legislators from both parties, consumer advocates, and industry all agree on many of the fundamental pieces. I look forward to continuing to work with you and other stakeholders to ensure that we provide consumers with the transparency, control, and accountability they deserve.

I know we will be judged by how we perform at this pivotal time, and we’re going to continue doing everything we can to live up to the trust that people have placed in us by making our products a part of their lives.