November 29, 2017

The Honorable Cedric Richmond
Chairman
Congressional Black Caucus
420 Cannon House Office Building
Washington, D.C. 20515

Dear Chairman Richmond:

I want to thank you and your colleagues of the Congressional Black Caucus (CBC) for taking the time to meet with me in Washington last month. It was a difficult but important meeting, and I appreciate the CBC's commitment to asking hard questions and holding us accountable for our progress. We know that African Americans have been the victims of divisive and abusive content online, and we appreciate the leadership the CBC has shown in speaking out in defense of democratic values and in favor of more diversity in technology. We take the abuse of our platform and the lack of representation in our industry very seriously, and we are grateful for your input as we work through these issues.

In that spirit, I am writing today to respond to a series of recent letters from you and your colleagues, Representatives Cleaver, Conyers, Ellison, Kelly, and Watson Coleman. These letters raise a number of questions related to diversity, content and ads, and the ongoing investigation into Russian election interference. We have compiled our responses below to ensure that all of your colleagues in the CBC have the opportunity to see and review the work that is already underway and the steps we plan to take moving forward.

Solving these problems will not happen overnight, but we are committed to addressing and making progress on the issues raised in our meeting and in these letters. As we work to connect people and create community, we understand that we must work just as diligently to ensure our platform is used as a force for good — not as a vehicle for discrimination or division. This challenge cuts to the heart of our service and our society, and we are committed to getting it right. Your engagement is an invaluable part of this work. We will continue working with the CBC, other members of Congress, the administration, and the American people to ensure the integrity of our elections and the independence of our democracy.

FACEBOOK'S ADVERTISING POLICIES

To begin: we recently published the principles that guide our decision making when it comes to advertising across Facebook, Messenger and Instagram, including that advertising should be safe, civil, and non-discriminatory, that control is in the hands of people on our platforms, and that we do not sell your data. Many of your questions concerned our policies on ads: how ads are purchased, how they are reviewed, our efforts to stop discrimination in connection with our ad tools, and our latest changes with regard to transparency, including around election-related ads. We're pleased to be able to address these issues below.

How our ads work

Facebook generates most of our revenue from selling ads to businesses and other organizations. Our ad tools enable advertisers to define the audience for their ads based on age, gender, location, interests, and other criteria. This ability to carefully define audiences is particularly helpful to small businesses and non-profits that do not have the resources for mass advertising or television ads. Ads can appear in multiple places on our services, including on Facebook, Instagram, and Messenger. The way an advertiser purchases ads on Facebook varies based on their marketing objective, budget, and the interfaces they prefer to use for advertising.
Most advertisers simply access our self-serve "Ads Manager" tool through their Facebook account and set up an advertising campaign. When an advertiser creates an ad using Ads Manager, they choose the marketing objective (e.g., getting people to install their mobile app), the audience they want to see the ad, the places they want to show the ad (e.g., Facebook or Instagram), and the ad format (e.g., a single image or a video ad). Advertisers can also turn posts from their Pages into ads, an action we call "boosting" a post. When an advertiser creates a post on their Facebook Page, they will sometimes see a "Boost Post" button, which allows the advertiser to quickly create a Facebook ad using the post. During and at the end of an ad campaign, we share aggregated information about the campaign's performance—such as the number of people who saw the ads—with the advertiser.

Preventing abuse of ads aimed at multicultural affinity groups

Since 2013, we have given advertisers the option to show ads to people who belong to what we call "multicultural affinity" groups. These groups (or "segments") are made up of people whose activities on Facebook suggest they may be interested in ads related to the African American, Hispanic American, or Asian American communities. Multicultural marketing is a common practice in the ad industry. There are many legitimate uses for this kind of marketing—for example, advertisers may want to reach people with content that reflects the community with which they identify, or with products that are frequently purchased by a specific community. Advocacy organizations may want to reach people to provide opportunities, like telling them about healthcare enrollment.

Last year, we heard concerns—including from members of the CBC—that these advertising options could be used to discriminate against people in the areas of housing, employment, and the extension of credit. In response to the CBC and others, and working closely with Representative Kelly's office and leading civil rights and consumer advocacy organizations, we developed a series of improvements that would address those concerns by:

• strengthening our policies prohibiting discrimination in advertising
• providing more education to advertisers about their obligation not to discriminate
• prohibiting the use of the multicultural affinity segments to advertise offers of housing, employment, or credit
• requiring advertisers running ads that offer housing, employment or credit to certify compliance with our anti-discrimination policy and with applicable law

As we act to make these changes, we remain on alert to correct errors if and when they arise. For example, we recently learned that a journalist was able to run ads for apartment rentals that failed to trigger the protections we put in place around housing ads. We took action immediately to correct the issue that allowed this to occur.

Going forward, we are determined to do better. Until we can better ensure that our tools will not be used inappropriately, we are disabling the option that permits advertisers to exclude multicultural affinity segments from the audience for their ads. In addition, all advertisers will have to complete the certification described above when they choose to include any of the multicultural affinity segments. We will also conduct a full review of how exclusion targeting is being used across audience segments, focusing particularly on potentially sensitive segments (e.g., segments that relate to the LGBTQ community or to people with disabilities).
We worked in consultation with Representative Kelly on these steps, and we plan to work closely with the CBC and others to help ensure that our ad targeting tools are used in compliance with our policies and in a responsible manner.

Working to stop discrimination and divisiveness in connection with ads

We also allow people who use Facebook to fill in their education or employer on their profile, and we allow advertisers to target ads on the basis of those education and employment fields. We found that a small percentage of people have entered offensive responses in these sections in violation of our policies. To help ensure that targeting is not used for discriminatory purposes, we clarified our advertising policies and tightened our enforcement processes in an effort to prevent content that goes against our community standards from being used to target ads. This includes anything that directly attacks people based on their race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender or gender identity, or health. We have also expanded our policy that prohibits shocking content—such as ads depicting violence or threats of violence, direct threats, and the promotion of the sale or use of weapons—to prevent ads that use even more subtle expressions of violence.

Strengthening ads review

Before any ad can appear on Facebook or Instagram, it must go through our ad review process. We rely on both automated and manual review, and we're taking aggressive steps to strengthen both. The process includes automated checks of an ad's images, text, targeting, and positioning, in addition to the content on the ad's Facebook and landing pages. Our automated systems also flag content for human review. Reviewing ads means assessing not just what's in an ad, but also the context in which it was bought and the intended audience—so we're changing our ads review system to pay more attention to these signals.

We're adding more than 1,000 people to our global ads review teams over the next year and simultaneously working to hire more people from African American and Hispanic communities. This will help increase the diversity of our workforce and improve our understanding and awareness of ads that are meant to exploit culturally sensitive issues. In addition, we are investing more in machine learning to better understand when to flag and take down ads.

Providing more transparency in ads

Facebook is firmly committed to transparency in political advertising. As the Federal Election Commission contemplates requiring disclaimers in online political ads, we submitted comments in support of such efforts. And as Congress considers ad transparency legislation, we stand ready to work with lawmakers.

We are also undertaking our own efforts to bring more transparency to advertising on Facebook. This month, we began testing a feature that allows people to view all the ads a Page is running on Facebook, Instagram and Messenger—even if they are not the intended target audience for the ad. All Pages will eventually be part of this effort, and we will require that all ads be associated with a Page as part of the ad creation process. We have started this test in Canada and expect to roll it out to the United States by next summer, ahead of the U.S. midterm elections in November 2018.
During this initial test, we will only show active ads. However, when this feature is rolled out in the United States, it will include a growing archive so that people can see both current and historic federal election-related ads. This archive will ultimately go back four years (the time period proposed by the Honest Ads Act for covered ads). In addition, for each federal election-related ad, we will:

• include the ad in a searchable archive that will ultimately cover a rolling four-year period
• provide details on the total amounts spent
• provide the number of times we showed the ad
• provide demographic information (i.e., age, general location, gender) about the audience that the ad reached

We try to catch content that shouldn't be on Facebook before it surfaces on the platform, but because this is not always possible, we also take action when people report ads that violate our policies. We hope that more transparency will mean more people can report inappropriate ads.

Beyond these steps, we're also updating our policies to require advertisers who want to run U.S. federal election-related ads to verify the identity of the business or organization they represent. As part of this documentation process, advertisers will be required to identify that they are running election-related advertising and verify both their entity and location. Once verified, these advertisers will be required to include a disclosure in their election-related ads. When people click on the disclosure, they will be able to see details about the advertiser. Like other ads on Facebook, they will also be able to see an explanation of why they saw that particular ad.

Protecting against foreign interference and malicious use of Facebook ads

Facebook recognizes the extraordinary nature of Russian interference in the 2016 U.S. election, and we are cooperating fully with the ongoing congressional investigations. We believe that the public deserves a full accounting of the facts, and that congressional investigators are best positioned to deliver that based on all the information to which they have access, including what Facebook has provided. Should congressional committees elect to release the content provided by Facebook, we stand ready to support their efforts by helping to redact any personally identifying information and addressing any other privacy considerations.

We provided all of the Russian ads we identified as part of our internal investigation—along with all of the targeting and payment information and the relevant Page and Instagram posts—to the committees prior to the public hearings. And our investigation continues to this day. All of the ads that Russia's Internet Research Agency ran during the 2016 U.S. election cycle were run by accounts engaged in coordinated inauthentic activity, which is against Facebook's policies. Going forward, all of the efforts discussed here are designed to prevent a repeat of such abuse, including by catching malicious actors faster and by preventing or quickly taking down the content put up by such coordinated inauthentic activity. Even if we eliminated all such abuse of our platform, there would still be political and social content that is controversial and that I, and others in leadership positions at Facebook, would find deeply offensive—but that we would not take down due to this nation's commitment to free expression and robust debate.
Accordingly, in addition to the steps we have taken to increase enforcement of our content policies, we believe that it is equally important that we work collaboratively with the government and our peer companies to detect and deter inauthentic behavior by foreign actors.

Additionally, based on feedback we heard from the CBC and others, we will unveil a feature by the end of this year that will allow people to learn which Internet Research Agency Facebook and Instagram accounts and Pages they may have liked or followed between January 2015 and August 2017. This tool is one of a number of transparency initiatives we are undertaking as a company.

Finally, we are making a number of investments and changes to protect our platform from the kind of abuse we saw in the 2016 U.S. election cycle, including adding more than 10,000 people to work on safety and security by the end of 2018 and requiring additional documentation and disclosures from political advertisers.

FACEBOOK CONTENT POLICY

Community Standards

With regard to organic content on Facebook, we have a set of policies—our Community Standards—that explain what is and isn't allowed on the platform. The goal of our Standards is to encourage expression and create a safe and respectful community. We are constantly evaluating—and, where necessary, iterating on—our content policies to account for shifts in cultural and social norms around the world.

For example, we recently updated our hate speech policies to remove violent speech directed at groups of people defined by protected characteristics, even if the basis for the attack may be ambiguous. Under the previous hate speech policy, a direct attack targeting women on the basis of gender, for example, would have been removed from Facebook, but the same content directed at women drivers would have remained on the platform. We have come to see that this distinction was a mistake, and we no longer differentiate between the two forms of attack when it comes to the most violent hate speech. We continue to explore how we can adopt a more granular approach to hate speech, both in the way we draft our policies and the way we enforce them.

Our Community Standards also apply to Facebook Live video. If we learn of someone violating our Community Standards while using Live, we will interrupt that stream. Viewers can report potential violations during a Live broadcast, and we work hard to address reports quickly. We also monitor Live videos that are being seen and shared by a high volume of people, even if they are not reported to us.

In May, we announced that we would be hiring 3,000 people to our Community Operations team to review the millions of reports we get every week and to improve the process for doing so quickly. That hiring is now complete, bringing our Community Operations team around the world to 7,500 people. To increase the cultural competency of our content reviewers, we specifically retained a vendor committed to employing a diverse workforce and with experience operating in locations with significant ethnic diversity.

In addition to investing in more people, we're building better technology and tools to keep our community safe. Automation helps us collate the millions of reports we get each week by recognizing duplicates, so that if 1,000 people report a piece of content, we don't have 1,000 people reviewing the same piece of content. We also use automation to make sure we quickly get reports to reviewers who have the right subject matter or language expertise.
We have made it simpler for people to report problems to us, faster for our reviewers to determine which posts violate our standards, and easier for them to contact law enforcement if someone needs help. As a result of these investments in our live review capabilities, we are now better able to proactively detect and accelerate reports that require immediate attention. For instance, in the case of live videos depicting suicide and self-harm, we're able to escalate these to local authorities twice as quickly as other reports. Determining whether a particular comment is hateful often depends on context, so automation primarily serves to assist with manual review and enforcement of reported content. As technology improves, we continue to explore new ways that automation can help us more effectively enforce our policies and make the platform safer.

DIVERSITY AND INCLUSION AT FACEBOOK

A diverse workforce is crucial for our company and our industry, and while we have made progress, we recognize that we have to do more. To demonstrate our commitment, we began publicly reporting our workforce diversity statistics annually in 2014 to ensure accountability. The top-line trend is that both our hiring rates and our overall representation of traditionally underrepresented people (a result of both hiring and retention) have increased over the past three years, with most of the improvement in non-technical fields. The number of African Americans at the company overall has increased by a factor of five since 2014, and our percentage of African Americans in non-technical roles has increased from two percent to six percent. While this improvement represents real progress, as I said above, both Facebook and the tech industry as a whole have a long way to go on hiring, representation in leadership, and strengthening the pipeline for women and minorities to fill more jobs in the tech industry.

While you may be familiar with our efforts from reading our annual diversity report, below are details on some of our short-, medium-, and long-term initiatives. Our partnerships with minority-focused organizations have helped us build relationships, and we will continue to invest in these and many other efforts as we push for greater diversity.

Short Term: Hiring computer scientists and computer engineers and providing an inclusive workplace

Given the variety of disciplines within STEM, and Facebook's particular focus on computer science and computer engineering (CS&E), we partner with many institutions and groups for hiring, including the National Society of Black Engineers, whose membership includes CS&E graduates. We're also working with a number of Historically Black Colleges and Universities (HBCUs) to develop and hire talented individuals in both technical and non-technical disciplines. Our efforts in non-technical hiring have outperformed those in the technical realm, leading us to invest heavily in building talent partnerships for technical talent and being as transparent as possible about the skills and experience we seek.

With only eight percent of STEM graduates in the U.S. studying computer science, we have an interest not only in hiring as many as we can today, but also in investing for the future. We recently hosted our annual HBCU Faculty Summit at our headquarters in Menlo Park to help faculty members better understand our technical interview process, as well as the typical Facebook intern experience.
Beyond our own Summit, we partner with the United Negro College Fund (UNCF) on their HBCU Innovation Summit, and we have agreed to design the advanced software engineering online coursework track for participating HBCUs. Three weeks ago, we hosted 30 faculty members from 21 different HBCUs to discuss innovative approaches to computer science curriculum design and career pathways into engineering and analytics in the tech industry.

We know that all our efforts to bring currently underrepresented people into the company would be fruitless if we did not maintain an inclusive culture where both equity and equality are the norm. All of our employees, including our engineers, policy developers, and content moderators, undergo training to help address and manage implicit bias. After receiving positive feedback on our Managing Bias training, we produced a version of this course for free, public use so that any company interested in addressing bias in its own workforce could have the benefit of this training without the cost or burden of producing its own. The session can be found online at https://managingbias.fb.com/ and has been viewed and used thousands of times across a wide array of companies and industries. We want a workforce that reflects the range of perspectives, experiences and identities of those we hope to serve, and our partnerships with professional organizations, like Black MBA Women and Management Leadership for Tomorrow, are helping us make progress toward that goal.

Medium Term: Supporting university students with an interest in tech

To increase transparency and build skills in the specific aspects of computer science and engineering that are a focus in our hiring, we partner with Google on the Engineer in Residence Program, where Facebook software engineers teach at Morgan State University. As visiting professors, our engineers teach "Data Structures and Algorithms," one of the computer science subjects most in demand at companies like Facebook. In addition to training faculty to administer this class at Morgan State, we build in comprehensive professional development along with coursework, opportunities for networking, and added mentorship for students. The students also receive the benefit of having a Facebook employee on staff and available to help coach them through the interview process for a summer internship. We have extended this "Engineer in Residence" program to other HBCUs and Hispanic-Serving Institutions (HSIs), and we share information with participating companies in order to leverage our joint learnings and investments.

This year, we are also running a ten-week interview prep workshop at Facebook New York with students from CUNY, Cornell Tech, NYU, NJIT and Rutgers. We're working closely with the NYC Tech Talent Pipeline, which is funded by the New York City Mayor's office and aims to double the number of graduates with tech degrees by 2022.

In 2013, we launched the pilot class of Facebook University (FBU), a minority-geared training program. FBU was created in recognition of the fact that the earlier students are exposed to both hard technical training and soft workplace navigation skills, the more successful they are likely to be. Building on the success of that first summer class of 30 students, we have now grown FBU to roughly 200 students a year and have graduated more than 500 students from the program.
Understanding that roles in non-technical functions are as attractive to many students as technical roles, we have expanded FBU beyond engineering to analytics, sales, product design and business operations. As part of our Facebook Academy initiative, a six-week summer internship program for local teens, we have enrolled 100 high school students from our local communities since 2012. We've also hired an economic opportunities manager at our headquarters to help connect our local residents to job opportunities at Facebook and with our vendors.

Long Term: Expanding Opportunity and Access

We are also proud of our Computer Science Education (CS Edu) and Social & Emotional Learning (SEL) skills and resources partnerships and programs. We have CS Edu partnerships and programs with Code.org and the College Board, among others. Through these partnerships we have trained over 1,250 new teachers in AP Computer Science nationwide, and we have set a goal of enrolling 27,000 underrepresented students in Computer Science in 2017-2018. In Illinois and California, we engaged 650 new learners, 80 educators and 20 internal representatives. We are also working to engage learners, educators, and internal representatives in Texas and New York. All four states are priority U.S. markets for our programs in the second half of this year.

In addition, we have held CS Edu workshops that brought together approximately 200 education, youth and community action leaders to address school climate/SEL needs at the Connecting Communities of Courage Summit, which was held in partnership with the National School Climate Center. We are also launching an eight-week student coding pilot program with the Harlem Children's Zone in New York. The goal is to incentivize and reward underrepresented minority students in 5th through 9th grade as they develop computer science-related skills.

In 2015, Facebook launched our TechPrep website—our own national outreach effort to expose underrepresented minority communities to computer programming. TechPrep is a free online resource hub, available in both English and Spanish, that helps parents, guardians, and students of all ages explore computer programming, the jobs available to programmers, and the skills required to become one. We've had more than one million unique users across all 50 states since our launch in October 2015. We brought TechPrep to cities across the country in a road show during the summer of 2016, including events with Representatives Butterfield, Davis and Lee in their respective districts. We set a goal of reaching 200,000 new unique TechPrep users this year, and we have currently reached 182,000. We continue to optimize the program based on community feedback.

We recognize that although we've made progress toward diversity, there is much more work to do to ensure that our workforce, management and board are reflective of the people and communities we serve. To that end, we greatly appreciate the ideas, suggestions and feedback we have received from the CBC, and we are committed to continuing to work with you to improve diversity in our company and our sector.

SUPPORTING COMMUNITY

Economic Opportunity

Facebook is committed to increasing economic opportunity both on and off our platform. Since 2011, we've partnered with Year Up to provide training and internships to young adults from communities of color. Facebook has hired 54 Year Up alumni to fill contract or full-time positions.
We have hosted 167 interns in three markets (Bay Area, New York, and Seattle) across six training specialties: IT Helpdesk, Project Coordination, Business Operations, Quality Assurance, Cyber Security and Data Analytics/Business Intelligence.

In 2017, we partnered with a local workforce development organization, JobTrain, to support its Construction Pre-Apprenticeship Program. Through its Local Priority Hiring Initiative, JobTrain is encouraging contractors and subcontractors working on our headquarters expansion to hire apprentices from our surrounding communities who have graduated from the program.

We have also trained over 6,000 African American-owned small businesses through our Boost Your Business program, which helps small businesses use online tools to grow. We've partnered with both the U.S. Black Chambers, Inc. and the National Black Chamber of Commerce to reach even more African American business owners. We have relationships with a number of coding organizations focused on growing African American representation in tech, such as Black Girls Code, All Star Code, Hack the Hood, The Hidden Genius Project, Level Playing Field Institute, Yes We Code, Streetcode Academy, Dev Color, Dev Bootcamp, and Techbridge.

We know we have both the ability and the responsibility to create economic opportunity and promote equality outside of our company. We continue to grow our supplier diversity program, and earlier this year we implemented billing guidelines for law firms that provide Facebook with outside counsel. The expectation for law firms is that 33% of outside counsel teams staffed across all Facebook matters should be composed of women and minorities. We have made it clear that the women and minorities who are staffed on Facebook matters should be given clear and measurable leadership opportunities—for example, they should serve as a relationship manager, make significant contributions in the courtroom or on deals, and so on.

Addressing housing shortages and creating economic opportunity in the Bay Area

Our headquarters in Menlo Park is adjacent to three of the lowest-income communities of color in Silicon Valley. Last year we announced a new partnership with Envision Transform Build and the cities of East Palo Alto and Menlo Park to create regional solutions to the affordable housing crisis, seeded with a $20 million contribution from Facebook. The partnership is starting with the creation of more affordable housing, since this is what our neighbors have told us their community needs most urgently. The partnership is built on three initial pillars:

Affordable Housing: A $75 million Catalyst Housing Fund, starting with an $18.5 million contribution from Facebook, will help ensure that people of different means can continue to live and thrive in Silicon Valley. Local Initiatives Support Corporation (LISC) will manage this fund in collaboration with Housing Trust Silicon Valley. The Catalyst Housing Fund will pursue innovative and scalable ways to increase the production and protection of affordable housing. An additional $250,000 will go to Rebuilding Together Peninsula to support the building and upkeep of homes for low-income residents.

Economic Opportunity: The partnership will devote $625,000 to job training in the science, technology, engineering and mathematics (STEM) fields. We have also hired a dedicated local community liaison who will help connect community members with open positions at Facebook.
Legal Support: Since the announcement of the community partnership last December, the partners have provided $500,000 to Community Legal Services in East Palo Alto (CLSEPA) to support Belle Haven and East Palo Alto residents threatened with displacement from evictions or abuse by landlords.

We see the need for policy and funding solutions to work in tandem, and we know we can't solve these issues by working alone. This unique coalition is composed of initial partners that share our goals, including Youth United for Community Action, Faith in Action Bay Area, Community Legal Services in East Palo Alto, Comité de Vecinos del Lado Oeste – East Palo Alto, the local governments of East Palo Alto and Menlo Park, and other community groups.

We work in many ways to keep our local neighborhoods strong, including donating to local nonprofit organizations and creating educational programs and housing initiatives that our neighbors tell us are very important to them. The Facebook Local Community Fund has donated $750,000 to more than 75 organizations serving the Menlo Park, Belle Haven and East Palo Alto communities. We have also committed to building mixed-income housing, including 225 below-market-rate units, on our campus.

DEVELOPING TOOLS TO INCREASE CIVIC ENGAGEMENT

Since 2008, Facebook has developed tools to help people engage politically and express their civic voice. The products we have launched during elections show people how to register to vote, when and where to vote, what is on their ballot, and where candidates and parties stand on the issues. For example, our efforts in 2016 helped nearly 2 million people register to vote, and we are making it easier for people to find, follow, and contact their elected representatives at all levels of government.

In addition to keeping individuals informed, Facebook has developed tools to combat misinformation and fraud. We remove content that tries to deceive and disenfranchise voters; in 2016, we removed content that promoted false voting dates and times. We have also developed post-election tools to make it simple for people to connect with their newly elected representatives. Through our Town Hall feature, people can find their representatives and choose to follow them on Facebook.

Finally, we make it easier for elected officials to connect with their communities. We have created constituent badges to help representatives see whether commenters on their posts are constituents. This product ties directly into our insight and targeting features and helps elected officials hear directly from their constituents about the issues that matter most to them.

FACEBOOK DATA USE POLICY

Law enforcement and criminal activity

People who use Facebook can control who can see the information and content they share by selecting the audience for each of their posts. When people choose to share information publicly, that information can be viewed by anyone with access to the Internet. We disclose user data to law enforcement in accordance with our Data Use Policy and applicable law, including the federal Stored Communications Act. Law enforcement has to obtain a valid subpoena connected to an official criminal investigation to compel the disclosure of basic subscriber records, which may include name, length of service, credit card information, email address, and recent login/logout IP address, if available.
A court order is required to compel the disclosure of certain additional records or information pertaining to an account, such as message headers and IP addresses. To obtain non-public account content, such as messages, photos, videos, timeline posts, and location information, law enforcement must demonstrate probable cause and obtain a search warrant issued under the procedures described in the Federal Rules of Criminal Procedure or equivalent state warrant procedures. We interpret the "national security letter" provision as applied to Facebook to require the production of only two categories of information: name and length of service.

Federal and state law set the requisite evidentiary thresholds and definitions for valid legal process from law enforcement authorities. Facebook does not comply with unlawful demands to provide data. Such requests may be inadequate because they are based on invalid legal process, because they are overly broad, or for other reasons. Facebook does not collect ethnic data related to requests from law enforcement authorities or related to proactive sharing of information with law enforcement, consistent with the terms of our Data Use Policy.

***

As a company on whose platform the Black Lives Matter movement and many others were started, we know firsthand how important voice, understanding and cultural competence are in achieving our mission to give people the power to build strong communities. We will continue to work in partnership with individuals, organizations, and industry peers toward this end.

Thank you once again for our meeting. It was a powerful reminder of how valuable engagement with the CBC is and will continue to be. We look forward to continuing to work with all of you to ensure Facebook is a force for good for communities here and around the world.

Respectfully,

Sheryl Sandberg
Chief Operating Officer
Facebook, Inc.
1 Hacker Way
Menlo Park, CA 94025