October 18, 2019

Office of the General Counsel
Rules Docket Clerk
Department of Housing and Urban Development
451 Seventh Street SW, Room 10276
Washington, D.C. 20410-0001

Re: Docket No. FR-6111-P-02 - HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard

Dear Sir/Madam:

The Center on Race, Inequality, and the Law at New York University School of Law (“Center”) is a multifaceted institute created to confront laws, policies, and practices that lead to the oppression and marginalization of people of color. The AI Now Institute at New York University (“AI Now”) is an interdisciplinary research center dedicated to understanding the social implications of artificial intelligence. We submit this public comment on behalf of the Center, AI Now, and the undersigned collection of scholars in response to the Department of Housing and Urban Development’s (“HUD”) proposed rule to amend the interpretation of the Fair Housing Act’s (“FHA”) disparate impact standard.

The Center, AI Now, and the signatories to this comment letter are particularly concerned with the use of algorithmic decision-making tools that can exacerbate existing racial disparities and racial discrimination against communities of color. HUD’s proposed rule heightens the already significant burden that plaintiffs must meet to raise claims of housing discrimination, and allows defendants accused of housing discrimination to use algorithmic tools to shield themselves from liability for discriminatory behavior. The proposed rule is concerning because: (1) it is antithetical to the purpose of the FHA; (2) it will significantly weaken the effectiveness of the disparate impact standard by enabling racial discrimination facilitated by algorithms to escape enforcement; and (3) by enabling racial discrimination, it will disadvantage communities of color even further in the housing market. For these reasons, we strongly oppose HUD’s proposed rule and urge HUD to withdraw it in light of the following considerations.

I. The Proposed Rule Is Antithetical to the Purpose of the Fair Housing Act, Which Is to Eliminate Discriminatory Practices in Housing

1. The Fair Housing Act Was Enacted to Remedy Generations of Racial Discrimination in Housing

Housing discrimination has firmly established its place in our country’s narrative. This ugly reality circumscribed the composition of neighborhoods and communities along racial lines and has perpetuated racial inequality for generations. Nor has the federal government been merely a passive player: through its policies and practices, the federal government has actively shaped the housing market into its present-day unequal and discriminatory state. For example, the Federal Housing Administration’s own Underwriting Manual once dictated that “incompatible racial groups should not be permitted to live in the same communities,” denying aspiring homeowners of color the fruits of the post-war housing boom. 1 Barriers like this diminished homeownership among Black and Brown communities and contributed to the stark wealth disparities that persist today. Numerous other government practices, such as urban renewal programs and the strategic construction of highways, also displaced families of color by demolishing their communities, limiting their access to homes, and constraining their ability to relocate outside of disinvested neighborhoods. 2
In 1968, the presidentially established Kerner Commission released a report declaring that “[o]ur nation is moving toward two societies, one black, one white—separate and unequal.” 3 It is in this context that the FHA was enacted on April 11, 1968, one week after the assassination of Dr. Martin Luther King, Jr. Operating as a bulwark against discrimination and a follow-up to the Civil Rights Act of 1964, the FHA prohibited discrimination concerning the sale, rental, and financing of housing based on race, religion, national origin, sex, and (as amended) handicap and familial status. 4 HUD, entrusted with enforcing the FHA, was charged with “eliminat[ing] housing discrimination, promot[ing] economic opportunity, and achiev[ing] diverse, inclusive communities by leading the nation in the enforcement, administration, development, and public understanding of federal fair housing policies and laws.” 5

1 Terry Gross, A ‘Forgotten History’ of How the U.S. Government Segregated America, NPR (May 3, 2017, 12:47 PM), https://www.npr.org/2017/05/03/526655831/a-forgotten-history-of-how-the-u-s-government-segregated-america.
2 See Marc Seitles, The Perpetuation of Residential Racial Segregation in America: Historical Discrimination, Modern Forms of Exclusion, and Inclusionary Remedies, 14 J. LAND USE & ENVTL. L. 89, 91–106 (1998).
3 NAT’L ADVISORY COMM’N ON CIVIL DISORDERS, THE KERNER REPORT 1 (Sean Wilentz ed., Princeton University Press 2016) (1968).

2. The Disparate Impact Standard Is a Critical Tool for Fulfilling HUD’s Mission to Challenge Racial Discrimination

The wide-ranging harms inflicted by housing discrimination cannot be adequately rectified without the disparate impact standard. The proposed rule undermines the strength of this safeguard and imposes unnecessary barriers to disparate impact claims, which can root out an array of discriminatory housing practices that would otherwise go unchecked. Disparate impact claims are employed to challenge exclusionary zoning practices that harm people of color and the destruction of housing under the guise of “urban renewal.” 6 They are also a tool to challenge the adverse effects of neighborhood preference policies, arbitrary landlord screening practices, and racially discriminatory mortgage underwriting and home insurance standards. 7 Disparate impact liability is triggered “even if the practice was not motivated by a discriminatory intent.” 8 On June 25, 2015, in Texas Department of Housing and Community Affairs v. Inclusive Communities Project, Inc., the Supreme Court upheld and recognized the critical role that the disparate impact standard plays: “[U]nlawful practices . . . that function unfairly to exclude minorities from certain neighborhoods without any sufficient justification . . . reside at the heartland of disparate-impact liability.” 9 The current HUD rule governing disparate impact claims is consistent with the Inclusive Communities decision.

4 History of Fair Housing, U.S. DEP’T HOUSING & URB. DEV., https://www.hud.gov/program_offices/fair_housing_equal_opp/aboutfheo/history (last visited Oct. 14, 2019).
5 OFFICE OF FAIR HOUS. & EQUAL OPPORTUNITY, U.S. DEP’T OF HOUS. & URBAN DEV., ANNUAL REPORT TO CONGRESS FY 2017, at 3 (2017), https://www.hud.gov/sites/dfiles/FHEO/images/FHEO_Annual_Report_2017-508c.pdf.
6 See Robert G. Schwemm, Fair Housing Litigation After Inclusive Communities: What’s New and What’s Not, 115 COLUM. L. REV. SIDEBAR 106 (2015), https://columbialawreview.org/content/fair-housing-litigation-after-inclusive-communities-whats-new-and-whats-not/.
7 See, e.g., Conn. Fair Hous. Ctr. v. CoreLogic Rental Prop. Sols., LLC, 369 F. Supp. 3d 362, 376–79 (D. Conn. 2019), http://www.ctfairhousing.com/PDFs/CoreLogicMTDOrder.pdf (finding a consumer-reporting agency potentially liable under the FHA’s disparate impact standard for its use of a tenant screening product).
8 24 C.F.R. § 100.500 (2019).
9 Tex. Dep’t of Hous. & Cmty. Affairs v. Inclusive Cmtys. Project, Inc., 135 S. Ct. 2507, 2521–22 (2015) (citations omitted).

Past practices continue to exert a striking contemporary impact. For example, neighborhoods once “redlined”—marked by the Home Owners’ Loan Corporation as hazardous risks to creditors due to the prevalence of residents of color—continue to experience “both economic disadvantage and majority-minority presence.” 10 And HUD’s own programs continue to subject Black renters to segregated conditions and fewer housing options. 11 Despite HUD’s acknowledgement that housing “[d]iscrimination isn’t always obvious,” 12 with this proposed rule the agency is taking extraordinary steps to dismantle the very protections and tools that counteract the often covert policies and practices that disparately impact communities of color. In light of both HUD’s history of culpability and its mandate to affirmatively further fair housing, it is incumbent upon HUD to ensure that people of all races can participate equally in the housing market.

II. The Proposed Rule Will Significantly Weaken the Effectiveness of the Disparate Impact Standard by Enabling Racial Discrimination Facilitated by Algorithms to Escape Enforcement

1. The Use of Algorithmic Tools Can Perpetuate Racial Discrimination

Algorithmic tools operate within the historical and social context of the data they use. This means that the destructive racial and social conditions of our society are reflected in housing data and, in turn, in the algorithmic tools trained upon it. Therefore, while some argue that algorithms are desirable because they are trained on extensive, empirically accurate, and objective data, such data in fact reflects existing and historical social inequities as well as discriminatory institutional values, systems, and practices. One such example can be seen in the context of predictive policing algorithms, for which the training data frequently reflects historical police practices and policies rather than the actual prevalence and frequency of crime. 13

10 BRUCE MITCHELL & JUAN FRANCO, NAT’L CMTY. REINVESTMENT COAL., HOLC “REDLINING” MAPS: THE PERSISTENT STRUCTURE OF SEGREGATION AND ECONOMIC INEQUALITY 19 (2018), https://ncrc.org/wp-content/uploads/dlm_uploads/2018/02/NCRC-Research-HOLC-10.pdf.
11 Elizabeth Julian & Michael M. Daniel, HUD-Assisted Low-Income Housing: Is It Working and for Whom?, POVERTY & RACE, July–Aug. 2009, at 3, 6–7, https://www.prrac.org/newsletters/julaug2009.pdf (finding that the “unavoidable conclusion one comes to after reviewing the somewhat tedious data in the HUD report is that poor Black renters, as a result of accepting HUD rental assistance, will be subjected to worse conditions or more segregated conditions, or both”).
12 Examples of Housing Discrimination, U.S. DEP’T HOUSING & URB. DEV., https://www.hud.gov/program_offices/fair_housing_equal_opp/examples_housing_discrimination. HUD’s first example of housing discrimination on its Office of Fair Housing and Equal Opportunity website is the story of “John.” John’s landlord rejects his application on the pretense of a negative recommendation from John’s past landlord. Following an investigation into the present landlord’s practices, HUD finds a pattern of discrimination based on race and color.
Yet because racial discrimination in housing has been so historically prevalent, even a predictively accurate algorithm that is based on data that accurately reflects historical conditions will reproduce racial bias and discriminatory outcomes.

Algorithmic tools are also subject to the biases of the individuals and institutions that create and design them. 14 Individual and institutional bias can be introduced at many different stages, including framing the problem that the algorithm is designed to solve, choosing which metrics to optimize, collecting and preparing the data, developing the model that guides the performance of the tool, and deciding how to present that information to practitioners. 15 The pervasive nature of racial bias has led researchers across a variety of sectors, including computer science, criminal justice, education, health, and employment, to conclude that algorithmic tools carry vast potential to disproportionately affect the lives of communities of color by perpetuating and concealing inequality and racial discrimination. 16 Any introduction of guidelines, protocols, or procedures relying on algorithmic tools must therefore acknowledge that race can unduly influence the design, implementation, and effects of those tools. Comprehensive guidelines must be established to mitigate these effects, ensuring that algorithmic tools are adequately accounted for and that communities of color are protected.
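To make the predictive policing dynamic described above concrete, the following short simulation, which is purely illustrative and uses synthetic districts and hypothetical parameters of our own choosing rather than any real police data, shows how a tool trained on recorded incidents can entrench an initial disparity in patrol allocation even when true underlying crime rates are identical:

```python
# Illustrative sketch only: synthetic districts, hypothetical parameters.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.05            # identical in both districts by construction
DETECTIONS_PER_PATROL = 20        # observations each patrol makes per year
patrols = {"A": 10, "B": 30}      # District B starts out over-policed

for year in range(5):
    # Recorded incidents scale with patrol presence, not with underlying crime:
    # each observation uncovers an incident with the same probability everywhere.
    recorded = {
        d: sum(random.random() < TRUE_CRIME_RATE
               for _ in range(patrols[d] * DETECTIONS_PER_PATROL))
        for d in patrols
    }
    # A naive "predictive" allocator assigns next year's 40 patrols in
    # proportion to last year's recorded incidents, entrenching the skew.
    total = max(sum(recorded.values()), 1)
    patrols = {d: round(40 * recorded[d] / total) for d in recorded}
    print(f"year {year}: recorded={recorded} next_patrols={patrols}")
```

District B continues to receive roughly three-quarters of the patrols in every round, even though crime is identical by construction: the "accurate" model faithfully reproduces the historical deployment pattern, not the crime.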
13 See, e.g., Rashida Richardson et al., Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems and Justice, 94 N.Y.U. L. REV. 193, 199–203, 218–19 (2019), https://www.nyulawreview.org/wp-content/uploads/2019/04/NYULawReview-94-Richardson-Schultz-Crawford.pdf (discussing the bias inherent in police data, leading to significant challenges in predictive policing); Sarah Brayne, Big Data Surveillance: The Case of Policing, 82 AM. SOC. REV. 977 (2017), https://journals.sagepub.com/doi/10.1177/0003122417725865 (discussing the rise of big data in surveillance and policing using a research study conducted within the Los Angeles Police Department); Kristian Lum & William Isaac, To Predict and Serve?, SIGNIFICANCE MAG., Oct. 2016, at 5, 14, https://rss.onlinelibrary.wiley.com/doi/epdf/10.1111/j.1740-9713.2016.00960.x (finding that the big data analytics used in policing amplify prior surveillance practices that create greater social inequities and consequences).
14 Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 WASH. L. REV. 1, 4 (2014).
15 Karen Hao, This Is How AI Bias Really Happens—and Why It’s So Hard to Fix, MIT TECH. REV. (Feb. 4, 2019), https://www.technologyreview.com/s/612876/this-is-how-ai-bias-really-happens-and-why-its-so-hard-to-fix/; see also Timnit Gebru et al., Datasheets for Datasets (Apr. 14, 2019) (working paper), https://arxiv.org/abs/1803.09010 (highlighting the lack of standards in creating and organizing data sets); Ben Green, The Just City: Machine Learning’s Social and Political Foundations, in THE SMART ENOUGH CITY (2019), https://smartenoughcity.mitpress.mit.edu/pub/vmjl8djz (describing the variety of design choices that shape the impacts of machine learning models).
16 See, e.g., VIRGINIA EUBANKS, AUTOMATING INEQUALITY: HOW HIGH-TECH TOOLS PROFILE, POLICE, AND PUNISH THE POOR (2018); SAFIYA U. NOBLE, ALGORITHMS OF OPPRESSION (2018).

2. Racial Discrimination by Algorithms Already Contributes to Housing Inequality

In housing, algorithmic tools used for mortgage underwriting (such as credit scoring 17 and disqualification 18), tenant screening practices, 19 and social media advertising 20 have all been found to discriminate against communities of color. 21 One such example can be seen in HUD’s 2016 guidance stating that the use of criminal records to exclude rental applicants disproportionately harmed Black and Latinx populations. 22 Accordingly, while we recognize that it is appropriate to assess individuals for housing applications and that algorithms can play a role in this assessment, safeguards must be established to prevent the exacerbation of racial harms and social exclusion experienced by communities of color. Yet the proposed rule fails to acknowledge the potential misuse of, or negative outcomes produced by, these algorithmic tools, or to provide substantial safeguards or protections against these harms. In fact, the proposed rule gives unprecedented deference to mortgage lenders, landlords, banks, insurance companies, and others in the housing industry, placing potential tenants and homebuyers of color at a severe disadvantage.
17 See David Murakami Wood, Spatial Profiling, Sorting and Prediction, in UNDERSTANDING SPATIAL MEDIA 225, 226–32 (Rob Kitchin et al. eds., 2015).
18 Susan Saegert et al., Mortgage Foreclosure and Health Disparities: Serial Displacement as Asset Extraction in African American Populations, 88 J. URB. HEALTH 390 (2011), https://link.springer.com/article/10.1007/s11524-011-9584-3 (finding that mortgage foreclosure operates as serial displacement, situating the current crisis within a history of repeated extraction of capital from Black households, and that the scale of this displacement reflects and reinforces structural inequality in health and housing).
19 See, e.g., Conn. Fair Hous. Ctr. v. CoreLogic Rental Prop. Sols., LLC, 369 F. Supp. 3d 362, 373 (D. Conn. 2019), http://www.ctfairhousing.com/PDFs/CoreLogicMTDOrder.pdf (finding that the lack of an individualized review of a Latinx disabled man’s suitability for tenancy, which relied heavily upon an automated algorithm that determined him “disqualified” for housing, prevented him from obtaining housing).
20 Julia Angwin & Terry Parris Jr., Facebook Lets Advertisers Exclude Users by Race, PROPUBLICA (Oct. 28, 2016, 1:00 PM), https://www.propublica.org/article/facebook-lets-advertisers-exclude-users-by-race.
21 While this comment centers on race, other algorithms used in the housing context have been found to discriminate based on other social characteristics. See, e.g., Marie C. Baca, Housing Companies Used Facebook’s Ad System to Discriminate Against Older People, According to New Human Rights Complaints, WASH. POST (Sept. 18, 2019), https://www.washingtonpost.com/technology/2019/09/18/housing-companies-used-facebooks-ad-system-discriminate-against-older-people-according-new-human-rights-charges/ (discussing discrimination based on age); HUD Files Housing Discrimination Complaint Against Facebook, Press Release, HUD No. 18-085 (Aug. 17, 2018), https://www.hud.gov/press/press_releases_media_advisories/HUD_No_18_085 (discussing HUD’s complaint against Facebook regarding ads that discriminated on the basis of characteristics including gender, disability, religion, and zip code); Muhammad Ali et al., Discrimination Through Optimization: How Facebook’s Ad Delivery Can Lead to Skewed Outcomes, 2019 PROC. ACM ON HUM.-COMPUTER INTERACTION (forthcoming), https://arxiv.org/abs/1904.02095 (finding that algorithms used by Facebook resulted in housing ads that align with race and gender stereotypes).
22 OFFICE OF GEN. COUNSEL, U.S. DEP’T OF HOUS. & URBAN DEV., GUIDANCE ON APPLICATION OF FAIR HOUSING ACT STANDARDS TO THE USE OF CRIMINAL RECORDS BY PROVIDERS OF HOUSING AND REAL ESTATE-RELATED TRANSACTIONS 2 (2016), https://www.hud.gov/sites/documents/HUD_OGCGUIDAPPFHASTANDCR.PDF.

3. The Proposed Rule Will Encourage, Facilitate, and Exacerbate Racial Discrimination by Algorithms in Housing

The proposed rule encourages and facilitates discrimination by providing those in the housing industry with new ways to discriminate and to conceal discriminatory motives under the protection of the law, and it thus dramatically reduces their responsibility for discriminatory outcomes. 23 In addition, because of the discriminatory nature of housing algorithms, the rule will enable discrimination even in cases where housing industry actors had no intention to discriminate. When a plaintiff alleges that the cause of discrimination is attributable to an algorithmic tool, the proposed rule creates three defenses that absolve the defendant of responsibility. These three defenses against claims of discrimination by algorithm provide such parties with a feasible pathway to discriminate, both intentionally and unintentionally, with little accountability. As detailed below, the defenses are inherently flawed, making it more difficult for plaintiffs to successfully challenge racial discrimination and exacerbating racial inequality in housing. 24

The First Defense

The first defense allows parties in the housing industry to rely on algorithmic tools provided the inputs “do not rely in any material part on factors that are substitutes or close proxies for protected classes under the FHA.” 25 However, there are significant challenges relating to the use of proxies in algorithmic tools. The emphasis upon substitutes or close proxies suggests that there is a hierarchy of relevant proxies for protected classes. In fact, the scope and hierarchy of close and relevant proxies is impossible to define in an algorithmic tool, given the social construction and changing nature of many protected classes. 26

23 Emily Badger, Who’s to Blame When Algorithms Discriminate?, N.Y. TIMES (Aug. 20, 2019), https://www.nytimes.com/2019/08/20/upshot/housing-discrimination-algorithms-hud.html; Andrew D. Selbst, A New HUD Rule Would Effectively Encourage Discrimination by Algorithm, SLATE (Aug. 19, 2019), https://slate.com/technology/2019/08/hud-disparate-impact-discrimination-algorithm.html; Olatunde Johnson & Michelle Aronowitz, The Trump Administration’s Assault on Fair Housing, TAKE CARE BLOG (Aug. 19, 2019), https://takecareblog.com/blog/the-trump-administration-s-assault-on-fair-housing.
24 See, e.g., Alistair Croll, Big Data Is Our Generation’s Civil Rights Issue, and We Don’t Know It, SOLVE FOR INTERESTING (Jul. 31, 2012), https://perma.cc/BS8S-6T7S; see also Solon Barocas & Andrew D. Selbst, Big Data’s Disparate Impact, 104 CAL. L. REV. 671 (2016), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2477899.
25 HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard, 84 Fed. Reg. 42,854, 42,862 (proposed Aug. 19, 2019).

As noted by computer and information scientists, “how we measure race often changes how we conceive of it, and changing conceptions of race may force us to alter what we measure.” 27 In the context of race, proxies that were once considered far removed from race may now be construed as racial in nature. As a practical matter, the cost of proving that a given factor is a proxy may make prosecuting many legitimate discrimination cases economically infeasible.

In a society as racially stratified as ours, many things seemingly unrelated to race can act as proxies. Machine learning algorithms “learn” from historical data to identify patterns of related attributes or activities that can collectively serve as proxies for protected groups. 28 For example, the well-known history of racially segregated housing suggests that neighborhoods and zip codes could reasonably be considered close proxies for race. 29 Yet the power of machine learning lies not just in finding salient connections, but in “produc[ing] novel insights that probably couldn’t have been revealed in any other way.” 30 This means that many factors that seem unrelated to race—such as musical taste, the number of “likes” on a social network such as Facebook, and a network of friends 31—can act as proxies for race.

The challenge is that there is no clear way of knowing how “close” an attribute must be in order for it to be considered “a close proxy or substitute” for a protected group. 32 Even attributes that individually appear unrelated to protected categories can, when combined with other attributes, collectively be close proxies for them. Whether an attribute serves as a proxy for a protected category depends in part on factors such as which dataset is being used and which other attributes are included. It is therefore impossible to identify in isolation whether a specific attribute is a close proxy. Almost all algorithmic patterns are infused with data that could be construed as substitutes or proxies for race. The proposed rule’s limited focus on “close proxies” ignores this essential fact.
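To illustrate why no attribute-by-attribute review can settle the question, consider the following sketch. It is our own construction on synthetic data, and the feature names are hypothetical. Neither attribute would appear to be a “close proxy” when examined on its own, one being only weakly associated with the protected class and the other not associated at all, yet together they reconstruct the protected attribute almost perfectly:

```python
# Illustrative sketch only: synthetic data, hypothetical feature names.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 20_000
race = rng.integers(0, 2, n)          # synthetic protected attribute
noise = rng.normal(0.0, 2.0, size=n)

x1 = race + noise   # e.g., a neighborhood score: a weak proxy on its own
x2 = noise          # e.g., a market index: no correlation with race at all

def proxy_auc(features):
    """How well can these features alone reconstruct the protected attribute?"""
    model = LogisticRegression(max_iter=1000).fit(features, race)
    return roc_auc_score(race, model.predict_proba(features)[:, 1])

print("x1 alone :", round(proxy_auc(x1.reshape(-1, 1)), 2))          # ~0.64
print("x2 alone :", round(proxy_auc(x2.reshape(-1, 1)), 2))          # ~0.50
print("x1 and x2:", round(proxy_auc(np.column_stack([x1, x2])), 2))  # ~1.00
```

Because the second feature lets a model cancel the noise in the first, the pair functions as a near-exact proxy for race, a fact that no review of each input in isolation would reveal.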
26 See, e.g., Ian Haney López, The Social Construction of Race, 29 HARV. C.R.-C.L. L. REV. 1 (1994), https://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?article=2815&context=facpubs (explaining that the law plays a significant role in carving out notions of race, often reflecting or working in tandem with societal norms and culture, and noting, through a review of the jurisprudence, that the notion of race as it pertains to different groups has changed throughout history); Angela Onwuachi-Willig, Race and Racial Identity Are Social Constructs, N.Y. TIMES (Sept. 6, 2016), https://www.nytimes.com/roomfordebate/2015/06/16/how-fluid-is-racial-identity/race-and-racial-identity-are-social-constructs (explaining that while the concept of race is socially constructed and fluid, the social, political, and economic meaning attached to racial groups has remained the same).
27 Solon Barocas et al., Introduction, in FAIRNESS AND MACHINE LEARNING (2019), https://fairmlbook.org/pdf/introduction.pdf.
28 Pedro Domingos, A Few Useful Things to Know About Machine Learning, COMM. ACM, Oct. 2012, at 78, 78–80, https://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf.
29 The Justice Map demonstrates that zip code and census data, combined, do track race. JUSTICE MAP, http://www.justicemap.org/ (last visited Oct. 16, 2019).
30 VIKTOR MAYER-SCHÖNBERGER & KENNETH CUKIER, BIG DATA: A REVOLUTION THAT WILL TRANSFORM HOW WE LIVE, WORK, AND THINK (2013).
31 See, e.g., Barocas, supra note 24, at 712.
32 Id. at 720.

The first defense also allows those in the housing industry to rely on an algorithmic tool provided that the model “is predictive of credit risk or other similar valid objective.” 33 But even a tool that is predictive can still be discriminatory. Indeed, given the deeply racialized nature of housing in the United States, 34 many predictions that accurately forecast risk will do so in part or in whole by using attributes that identify, or are correlated with, the race of the subject. In the context of housing, it would be almost impossible for a model to make accurate predictions without making decisions based on race.

Moreover, while algorithms rely upon a range of patterns in order to make predictions, what constitutes reasonable predictive accuracy varies greatly according to context and involves tradeoffs. 35 A comparative example illustrates this point: algorithmic tools used to classify radiographs can have a predictive accuracy measure of 0.98, 36 while recidivism prediction models tend to have a predictive accuracy measure of approximately 0.7. 37 Given the context-specific nature of “predictive accuracy,” it is difficult to determine by what standard or means a model could be deemed sufficiently predictive of credit risk. This ambiguity would again make it exceedingly difficult for plaintiffs to succeed against this defense, and correspondingly easy for those engaging in housing discrimination to escape liability.
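The following sketch, again using synthetic data and hypothetical parameters of our own choosing, illustrates the point: a model can be genuinely “predictive of credit risk,” at roughly the accuracy level of widely used risk tools, while still denying the historically excluded group at a far higher rate, because the feature it relies on encodes historical disadvantage:

```python
# Illustrative sketch only: synthetic data, hypothetical parameters.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 50_000
group = rng.integers(0, 2, n)   # 1 = historically excluded group (synthetic)

# Household wealth encodes historical disadvantage: group 1 starts one
# standard deviation lower on average.
wealth = rng.normal(loc=np.where(group == 1, -1.0, 0.0), scale=1.0)
# True default risk genuinely falls as wealth rises.
default = rng.random(n) < 1.0 / (1.0 + np.exp(wealth))

model = LogisticRegression(max_iter=1000).fit(wealth.reshape(-1, 1), default)
score = model.predict_proba(wealth.reshape(-1, 1))[:, 1]

# The model is "predictive of credit risk" at roughly the accuracy level of
# widely used recidivism tools...
print("AUC:", round(roc_auc_score(default, score), 2))   # ~0.7

# ...yet a single denial threshold produces starkly unequal outcomes.
denied = score > 0.5
for g in (0, 1):
    print(f"group {g} denial rate: {denied[group == g].mean():.2f}")
```

In this construction the denial rate for the historically excluded group runs well above that of the other group, even though the model never sees group membership and is, by the rule’s proposed test, “predictive of credit risk.”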
The Second Defense

The second defense permits those in the housing industry to rely on algorithms that may have a discriminatory impact, provided they were developed by “a recognized third-party that determines industry standards.” 38 This defense raises serious concerns given the lack of transparency surrounding the use of third-party vendors and the lack of uniformity in industry standard setting.

33 HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard, 84 Fed. Reg. 42,854, 42,862 (proposed Aug. 19, 2019).
34 See supra Section I.1.
35 Sam Corbett-Davies et al., Algorithmic Decision Making and the Cost of Fairness, 23 PROC. ACM SIGKDD INT’L CONF. KNOWLEDGE DISCOVERY & DATA MINING 797 (2017), https://dl.acm.org/citation.cfm?id=3098095 (finding that the accuracy and fairness of a machine learning model can be in tension).
36 Jared A. Dunnmon et al., Assessment of Convolutional Neural Networks for Automated Classification of Chest Radiographs, 2018 RADIOLOGY 1, 1 (2018), https://jdunnmon.github.io/dunnmon_radiology_2018.pdf.
37 See SARAH L. DESMARAIS & JAY P. SINGH, RISK ASSESSMENT INSTRUMENTS VALIDATED AND IMPLEMENTED IN CORRECTIONAL SETTINGS IN THE UNITED STATES (2013), https://csgjusticecenter.org/wp-content/uploads/2014/07/Risk-Assessment-Instruments-Validated-and-Implemented-in-Correctional-Settings-in-the-United-States.pdf; see also NORTHPOINTE, PRACTITIONER’S GUIDE TO COMPAS (2012), http://www.northpointeinc.com/files/technical_documents/FieldGuide2_081412.pdf.
38 HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard, 84 Fed. Reg. at 42,862.

The onus-shifting approach to third-party vendors is troubling. It provides those in the housing industry with little incentive to evaluate the potential discriminatory effects of the algorithmic tools that they are using. Some may argue that this defense follows an approach similar to § 230 of the Communications Decency Act, which shields online providers from liability for third-party content they host or republish online. Yet even the broad protection offered under § 230 exhorts online providers to monitor and eliminate the worst of that content and does not absolve them of all responsibility. 39 The proposed HUD defense fails to incorporate this type of provision and enables those in the housing industry to shift their responsibilities onto third-party vendors.

This defense is particularly troubling given the opaque nature of algorithmic tools and the challenges to transparency and accountability that accompany them. For instance, the Center and AI Now, in their 2018 and 2019 Litigating Algorithms reports, highlighted legal challenges to algorithmic tools used in education, public benefits, and criminal justice, in which third-party vendors used broad, and ultimately illegitimate, trade secrecy or confidentiality claims to obstruct efforts to examine the algorithm. 40 Due to trade secrecy laws and non-disclosure agreements, third-party developers are generally free to design algorithmic tools within a “black box.” 41 Details pertaining to how algorithmic tools are designed and created are therefore inaccessible to those who may wish to challenge their use. 42 This makes any effort to hold third parties accountable extremely challenging. Researchers and scholars have already noted the tension between trade secrecy and transparency, and have called for mechanisms to make these processes more transparent and accountable. 43

39 Joshua A. Geltzer, The President and Congress Are Thinking of Changing This Important Internet Law, SLATE (Feb. 25, 2019), https://slate.com/technology/2019/02/cda-section-230-trump-congress.html.
40 RASHIDA RICHARDSON ET AL., AI NOW INST. & CTR. ON RACE, INEQUALITY, & THE LAW, LITIGATING ALGORITHMS 2019 US REPORT: NEW CHALLENGES TO GOVERNMENT USE OF ALGORITHMIC DECISION SYSTEMS (2019), https://ainowinstitute.org/litigatingalgorithms-2019-us.pdf; MEREDITH WHITTAKER ET AL., AI NOW INST., AI NOW REPORT 2018, at 39–40 (2018), https://ainowinstitute.org/AI_Now_2018_Report.pdf.
41 See, e.g., FRANK PASQUALE, THE BLACK BOX SOCIETY: THE SECRET ALGORITHMS THAT CONTROL MONEY AND INFORMATION 12–15 (2015).
42 Robert Brauneis & Ellen P. Goodman, Algorithmic Transparency for the Smart City, 20 YALE J.L. & TECH. 103 (2018), https://yjolt.org/sites/default/files/20_yale_j._l._tech._103.pdf; Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 STAN. L. REV. 1343 (2018), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2920883 (highlighting the lack of meaningful transparency regarding government use of algorithmic systems and how the aggressive use of trade secrecy and confidentiality serves as an obstacle).
43 See, e.g., Joshua A. Kroll et al., Accountable Algorithms, 165 U. PA. L. REV. 633, 636 (2017) (observing that “accountability mechanisms and legal standards that govern decision processes have not kept pace with technology” and arguing that “[c]itizens, and society as a whole, have an interest in making these processes more accountable”).

Yet HUD’s proposed defense steps away from these recommendations and encourages an approach that increases third-party opacity. As we have seen in other contexts, such as policing, the lack of third-party transparency can lead to third-party vendors having an undue influence on direct service providers. 44 In the housing context, this could have drastic consequences for communities that are already marginalized and subject to racial discrimination.

The reference to industry standards raises further cause for concern. There are no criteria determining the industry standards that standard-setting boards should follow when regulating algorithmic models. Even within the research community, there are ongoing debates about what standards or metrics should govern such models. This challenge, along with the highly politicized nature of industry standard setting, has been observed in the context of newly created WiFi connectivity standards, 45 privacy certifications, 46 and DRM-protected video. 47 The result is that third-party developers are free to design and implement algorithmic tools with no requirement to test the data sets or the algorithms that are used, no regulation, no oversight, and no clear standards against which to test the models. HUD has provided no guidance regarding the industry standards required of third-party developers. Yet even if it had, there is no current method or standard in place that would ensure that algorithmic tools are “discrimination proof.” 48 As noted in the examples above, industry standard creation is an inherently political process that will likely undermine the societal goal of reducing discrimination in housing. Even the most advanced systems still pose a risk of amplifying and perpetuating structural or systemic bias. Contrary to what the proposed rule suggests, it is unclear what industry is best suited to create standards that could mitigate discrimination.

44 See Elizabeth E. Joh, The Undue Influence of Surveillance Technology Companies on Policing, 92 N.Y.U. L. REV. ONLINE 101 (2017), https://www.nyulawreview.org/wp-content/uploads/2018/08/Joh-FINAL_0.pdf; William Alden, There’s a Fight Brewing Between the NYPD and Silicon Valley’s Palantir, BUZZFEED NEWS (Jun. 28, 2017), https://www.buzzfeednews.com/article/williamalden/theres-a-fight-brewing-between-the-nypd-and-silicon-valley (demonstrating the power vendors can have).
45 Joshua Sisco & Leah Nylen, DOJ Probes Role of Special Interest Group in New WiFi Standard, MARKET INSIGHT (Jan. 26, 2018), https://mlexmarketinsight.com/insights-center/editors-picks/antitrust/north-america/doj-probes-role-of-special-interest-group-in-new-wifi-standard.
46 TRUSTe Settles FTC Charges It Deceived Consumers Through Its Privacy Seal Program, FED. TRADE COMM’N (Nov. 17, 2014), https://www.ftc.gov/news-events/press-releases/2014/11/truste-settles-ftc-charges-it-deceived-consumers-through-its.
47 Jacob Kastrenakes, A DRM Standard Has Been Approved for the Web, and Security Researchers Are Worried, VERGE (Jul. 8, 2017), https://www.theverge.com/2017/7/8/15942238/web-drm-standard-eme-approved-controversy.
48 Sandra G. Mayson, Bias In, Bias Out, 128 YALE L.J. 2218, 2230–51 (2019), https://www.yalelawjournal.org/pdf/Mayson_p5g2tz2m.pdf (describing how attempts to standardize racial equality are hampered by the lack of consensus on definitions and adequate technical approaches to balance the tradeoffs of varying notions of equality in the criminal justice context); Jon Kleinberg et al., Inherent Trade-Offs in the Fair Determination of Risk Scores, 8 PROC. INNOVATIONS THEORETICAL COMPUTER SCI. (2017), https://arxiv.org/pdf/1609.05807.pdf (describing how different notions of “fairness” in algorithms are inherently in conflict with one another).

The Third Defense

The third defense permits those in the housing industry to rely upon a neutral third-party expert to analyze the algorithm and determine that its inputs are not substitutes for protected characteristics. The defense seems to suggest that a third-party expert could determine whether the inputs for the algorithmic tool were intended to be discriminatory. As described above, however, it is unclear how this could be proven, and it is likely impossible. Research on data privacy has shown that “one can never know what information may later be extracted from any particular collection of big data”; 49 by the same token, it is impossible to determine that any piece of information could not (either now or in the future) be connected to race and used to create discriminatory models.

While biased tools can sometimes be identified by tracing the discriminatory effects of an algorithmic model, it is far less clear what the absence of bias looks like in the design process and what could constitute a “fair” algorithmic tool. 50 Indeed, there are myriad definitions of what it means for a machine learning model to be “fair,” 51 and those definitions are often in tension with one another. 52 While a third-party expert may claim that an algorithmic model is fair, there are no approved standards establishing what constitutes “fair” or whether a given determination of fairness aligns with societal and constitutional notions of equality. In fact, researchers have shown that common notions of “fairness” are in tension with broader notions of justice and equality. 53 What counts as “fairness” to a third-party expert may also differ from the fairness experienced by affected communities.
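This tension is arithmetic, not merely philosophical. The toy example below, constructed on synthetic data in the spirit of the impossibility results cited in note 48, builds a score that is perfectly calibrated within each group and then shows that, because the groups’ base rates differ, a single decision threshold necessarily produces sharply different false positive rates:

```python
# Illustrative sketch only: synthetic data constructed for this example.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
group = rng.integers(0, 2, n)

# A score that is perfectly calibrated within each group by construction:
# outcomes occur with probability exactly equal to the score.
score = np.where(group == 0,
                 rng.choice([0.2, 0.6], size=n),   # group 0 skews higher risk
                 rng.choice([0.1, 0.3], size=n))   # group 1 skews lower risk
outcome = rng.random(n) < score

flagged = score >= 0.5   # classify "high risk" at a fixed threshold
for g in (0, 1):
    members = group == g
    for s in np.unique(score[members]):
        rate = outcome[members & (score == s)].mean()
        print(f"group {g}, score {s}: observed outcome rate {rate:.2f}")
    # Calibration holds for both groups, yet false positive rates diverge.
    fpr = flagged[members & ~outcome].mean()
    print(f"group {g} false positive rate: {fpr:.2f}")
```

A third-party expert could truthfully certify this score as “fair” under the calibration definition even though it flags one group’s non-defaulters at a rate of roughly one in three and the other’s essentially never; which definition the expert chooses determines the answer.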
Finally, there is no way to determine the credibility of such experts. No standards currently exist to assess which experts could confirm that tools are “fair” and will produce racially equitable outcomes.

HUD claims that these defenses “are not intended to provide a special exemption for parties who use algorithmic models, but merely to recognize that additional guidance is necessary in response to the complexity of disparate impact cases challenging these models.” 54 In practice, however, the proposed rule provides just such an exemption, by imposing a burden on plaintiffs to interrogate opaque systems and to prove what algorithmic fairness means in the absence of any standard for doing so.

49 PRESIDENT’S COUNCIL OF ADVISORS ON SCI. & TECH., BIG DATA AND PRIVACY: A TECHNOLOGICAL PERSPECTIVE ix (2014), https://bigdatawg.nist.gov/pdf/pcast_big_data_and_privacy_-_may_2014.pdf.
50 Hao, supra note 15.
51 Arvind Narayanan, Tutorial: 21 Fairness Definitions and Their Politics, YOUTUBE (Mar. 1, 2018), https://www.youtube.com/watch?v=jIXIuYdnyyk.
52 Kleinberg, supra note 48.
53 Anna Lauren Hoffmann, Where Fairness Fails: Data, Algorithms, and the Limits of Antidiscrimination Discourse, 7 J. INFO. COMM. & SOC. 22 (2019), https://www.tandfonline.com/doi/full/10.1080/1369118X.2019.1573912; Ben Green & Lily Hu, The Myth in the Methodology: Towards a Recontextualization of Fairness in Machine Learning, 35 INT’L CONF. ON MACHINE LEARNING (2018), https://www.benzevgreen.com/wp-content/uploads/2019/02/18icmldebates.pdf.

An Onerous and Unreasonable Burden upon Plaintiffs

The broad defenses that the proposed rule would grant defendants are complemented by the onerous burden it places on plaintiffs. Under the proposed rule’s five-part prima facie evidentiary test, plaintiffs would essentially need to establish the elements of their claim at the outset of the case, prior to any discovery. This heavy burden, which will disqualify many meritorious claims, is particularly ill-suited to cases involving algorithmic tools. As discussed, the internal workings of algorithmic tools are frequently hidden within a “black box” that prevents outside analysis under ordinary circumstances. 55 For example, in Houston Federation of Teachers, Local 2415 v. Houston Independent School District, the court found that open discovery was essential to understanding the inner workings of an algorithm adopted to improve teaching quality by providing a standardized assessment of teachers. 56 The plaintiffs succeeded on due process grounds, with the court noting that without open discovery, it was almost impossible for teachers to access, understand, or act on their own evaluations. 57 Without access to discovery, it is highly implausible that a victim of a racially discriminatory algorithmic tool will possess the information needed to capably contest the validity of the tool’s inputs or to otherwise discuss the tool with the specificity that the proposed rule would require.

III. By Enabling Racial Discrimination, the Proposed Rule Will Disadvantage Communities of Color Even Further in the Housing Market

1. The Housing Market Is Already Rife with Racial Discrimination

The challenges that algorithmic tools pose and the broad defenses that the proposed rule offers to defendants will make it substantially more difficult for plaintiffs with meritorious claims to obtain relief. This is especially troubling because, even with existing protections, the realm of housing is plagued by racial discrimination and inequality.

54 HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard, 84 Fed. Reg. 42,854, 42,859 (proposed Aug. 19, 2019).
55 See PASQUALE, supra note 41, at 12–15.
56 See Houston Fed’n of Teachers, Local 2415 v. Houston Indep. Sch. Dist., 251 F. Supp. 3d 1168, 1179 (S.D. Tex. 2017); see also RICHARDSON ET AL., supra note 40.
57 Houston Fed’n of Teachers, 251 F. Supp. 3d at 1176–80.

This discrimination and the racial inequality that results manifest in a variety of forms. A mortgage loan is often a prerequisite for buying a house, yet Black and Latinx borrowers face unique obstacles in acquiring such loans.
A recent lawsuit against Wells Fargo, for example, revealed that Black borrowers who qualified for regular loans were 2.9 times more likely than similarly situated White borrowers to be directed to a subprime loan, and Latinx borrowers 1.8 times more likely, encumbering Black and Latinx borrowers with higher interest rates and unconscionable terms. Further data indicated that in Chicago, relative to similarly situated White customers, customers who borrowed $300,000 through an independent broker paid an average of $2,937 more in broker fees if Black and $2,187 more if Latinx. 58 Wells Fargo is not alone in these types of practices, 59 nor are banks the only actors at fault. Courts have recognized that there is a “direct connection of availability of property insurance and ability to purchase a house.” 60 Yet property insurers have overcharged or denied coverage to residents of Black neighborhoods, further restricting free and fair access to the housing market. 61

Buyers and renters of color face additional discrimination in the housing market even when they can acquire adequate funds. A 2012 HUD-sponsored study found that homeseekers of color are shown and told about fewer homes than White homeseekers. 62 And homeseekers of color who get beyond this stage must then contend with racial discrimination on the part of sellers and renters. 63

58 Charlie Savage, Wells Fargo Will Settle Mortgage Bias Charges, N.Y. TIMES (July 12, 2012), https://www.nytimes.com/2012/07/13/business/wells-fargo-to-settle-mortgage-discrimination-charges.html; see also Robert Bartlett et al., Consumer-Lending Discrimination in the FinTech Era 4 (Nat’l Bureau of Econ. Research, Working Paper No. 25943, 2019), https://www.nber.org/papers/w25943.pdf (finding that “accepted Latinx and African-American borrowers pay 7.9 and 3.6 basis points more in interest for home purchase and refinance mortgages respectively because of discrimination”).
59 See, e.g., Richard Rothstein, A Comment on Bank of America/Countrywide’s Discriminatory Mortgage Lending and Its Implications for Racial Segregation, ECON. POL’Y INST. (Jan. 23, 2012), https://www.epi.org/publication/bp335-boa-countrywide-discriminatory-lending; see also EMMANUEL MARTINEZ & AARON GLANTZ, CTR. FOR INVESTIGATIVE REPORTING, HOW REVEAL IDENTIFIED LENDING DISPARITIES IN FEDERAL MORTGAGE DATA (2018), https://s3-us-west-2.amazonaws.com/revealnews.org/uploads/lending_disparities_whitepaper_180214.pdf (finding, based on 2015 and 2016 records, that “in certain areas of the country, people of color were more likely to be denied a conventional mortgage than white applicants, even after controlling for a wide array of economic and social factors”).
60 Nationwide Mut. Ins. Co. v. Cisneros, 52 F.3d 1351, 1359 (6th Cir. 1995).
61 See Stephen Koff, HUD Tries to Crack Down on Discrimination by Insurers, While Insurers Deny They’re Discriminating, CLEVELAND.COM (Oct. 6, 2016), https://www.cleveland.com/metro/2016/10/hud_to_crack_down_on_discrimin.html (reporting on HUD’s recognition of this issue); see also John H. Gilmore, Insurance Redlining & the Fair Housing Act: The Lost Opportunity of Mackey v. Nationwide Insurance Companies, 34 CATH. U. L. REV. 563, 575–78 (1985) (discussing how this practice has functioned).
62 MARGERY AUSTIN TURNER ET AL., URBAN INST., HOUSING DISCRIMINATION AGAINST RACIAL AND ETHNIC MINORITIES 2012, at xi (2013), https://www.huduser.gov/portal/Publications/pdf/HUD-514_HDS2012.pdf.
In addition to explicit racial discrimination, sellers discriminate through the use of nominally race-neutral screening factors that apply in a highly racialized manner. 64 HUD has recognized this problem and the importance of adequate redress. 65

For Black homeowners, the ultimate acquisition of a house does not signal an end to racial discrimination in housing. A recent Brookings Institution study found that homes in neighborhoods where the residents are 50 percent Black are valued at about half the price of homes in neighborhoods without any Black residents. Considering whether this could be explained by non-racial factors, the study concluded that “we are left with the fact that a square foot of residential real-estate is worth 23 percent less in neighborhoods where half the population is black compared to neighborhood[s] with few or no black residents, even after adjusting for housing quality and neighborhood quality.” This devaluation of Black homes has resulted in a cumulative loss of $156 billion. 66 Additionally, homeowners of color are disproportionately subjected to improper enforcement of local laws and regulations that can result in illegal seizures and foreclosures, undermining the opportunities for building wealth that homeownership offers. 67

63 See, e.g., B. Rose Kelly, Hispanics Face Racial Discrimination in New York City’s Rental Housing Market, PRINCETON U. (Oct. 24, 2018, 4:34 PM), https://www.princeton.edu/news/2018/10/24/hispanics-face-racial-discrimination-new-york-citys-rental-housing-market.
64 See, e.g., Esme Caramello & Nora Mahlberg, Combating Tenant Blacklisting Based on Housing Court Records: A Survey of Approaches, SARGENT SHRIVER NAT’L CENTER ON POVERTY L. (Sept. 2017), https://www.povertylaw.org/clearinghouse/article/blacklisting (discussing one “fair housing challenge to a landlord’s blanket policy of rejecting or adversely treating any tenant with a record of a housing court case,” despite the fact that “in the county of the lawsuit, African Americans are almost four times more likely than whites to have been sued in an eviction case, and African American women are sued more than five times as often as households headed by white men”); Jennifer Safstrom & Rachel Goodman, Lawsuit Challenges Discriminatory Housing Policy in Chesterfield County, Virginia, ACLU (June 4, 2019, 12:00 PM), https://www.aclu.org/blog/racial-justice/race-and-economic-justice/lawsuit-challenges-discriminatory-housing-policy (discussing a challenge to a landlord’s blanket ban on tenants with felony convictions, even though “individuals who are Black represented 46% of those convicted of a felony between 2007 and 2017 [in the county], despite only accounting for 22% of the population”).
65 See OFFICE OF GEN. COUNSEL, supra note 22 (noting that since “African Americans and Hispanics are arrested, convicted and incarcerated at rates disproportionate to their share of the general population. . . . [C]riminal history-based restrictions on housing opportunities violate the Act if, without justification, their burden falls more often on renters or other housing market participants of one race or national origin over another”).
66 ANDRE M. PERRY, JONATHAN ROTHWELL & DAVID HARSHBARGER, THE DEVALUATION OF ASSETS IN BLACK NEIGHBORHOODS: THE CASE OF RESIDENTIAL PROPERTY 11–15 (2018), https://www.brookings.edu/wp-content/uploads/2018/11/2018.11_Brookings-Metro_Devaluation-Assets-Black-Neighborhoods_final.pdf.
67 See, e.g., Mary Frost, Brooklyn Officials Demand Full-Scale Investigation of Home Theft in Black & Brown Neighborhoods, BROOKLYN DAILY EAGLE (Nov. 26, 2018), https://brooklyneagle.com/articles/2018/11/26/brooklyn-officials-demand-full-scale-investigation-of-home-theft-in-black-brown-neighborhoods/; Stephen Witt, City Caught Trying To Grab Senior Citizen’s Brownstone, KINGS COUNTY POL. (Sept. 17, 2018), https://www.kingscountypolitics.com/1217-dean-street/; Stephen Witt & Kelly Mena, City Takes Property From Working Class Latinos, KINGS COUNTY POL. (Oct. 9, 2018), https://www.kingscountypolitics.com/city-takes-property-from-working-class-latinos/.

2. Racial Discrimination in Housing Has Substantial Collateral Effects

Racial discrimination in housing is not only a profound wrong in itself, but one that substantially affects other areas of life. Wealth accumulation, involvement with the criminal legal system, education, and health are all shaped by housing discrimination.

For most American families, a house is one of their most valuable assets. However, homeownership rates and typical home equity vary greatly along racial lines, with Black and Latinx individuals sharply disadvantaged. 68 The encumbrance of homeseekers of color with excessive mortgage interest rates, property devaluation, and other discriminatory measures thus contributes significantly to the growing racial wealth gap in the United States. 69

This spillover effect of housing discrimination can also be seen in the criminal legal system. In Boston, for example, one study found that, controlling for crime and other non-racial factors, the number of police-civilian interactions in a neighborhood was driven by the concentration of Black residents. 70 Similarly, statistical data submitted in New York’s stop-and-frisk litigation showed that the number of stops per crime was highest in areas with the highest concentration of Black residents. 71 The racially concentrated state of America’s neighborhoods—an inevitable result of racially discriminatory housing practices—can only encourage this brand of racially targeted law enforcement.

Racial segregation in housing also engenders racial segregation in schooling. Since most school districts rely on residence-based school assignment and a large portion of school district funding is derived from local property taxes, housing discrimination is also a primary driver of the extreme racial segregation and resource disparities in public schools. 72

68 LAURA SULLIVAN ET AL., INSTITUTE FOR ASSETS & SOCIAL POLICY, BRANDEIS UNIVERSITY & DEMOS, THE RACIAL WEALTH GAP: WHY POLICY MATTERS 9 (2015), https://www.demos.org/sites/default/files/publications/RacialWealthGap_1.pdf (finding that “73 percent of whites as compared to 47 percent of Latinos and 45 percent of Blacks” own homes, and that typical home equity is “$86,800 for white homeowners at the median as compared to $50,000 for Black homeowners and $48,000 for Latino homeowners”).
69 See id. at 13 (finding that “[e]qualizing wealth returns to homeownership raised wealth among Black and Latino families while white wealth was held constant, significantly reducing the racial wealth gap,” and that “[e]qualizing the returns to homeownership reduces the wealth gap between white and Black families by” 16 percent and between White and Latinx families by 41 percent).
70 ACLU, A REPORT ON BOSTON POLICE DEPARTMENT STREET ENCOUNTERS FROM 2007–2010, at 1 (2014), https://www.aclum.org/sites/default/files/wp-content/uploads/2015/06/reports-black-brown-and-targeted.pdf.
71 Second Supplemental Report of Jeffrey Fagan, Ph.D. at 13, Floyd v. City of New York, 959 F. Supp. 2d 540 (S.D.N.Y. 2013), https://ccrjustice.org/sites/default/files/assets/files/FaganSecondSupplementalReport.pdf.

For example, a recent Washington Post investigation found that children throughout the country “remain locked in deeply segregated districts,” and that Black students—also more likely to grow up in impoverished neighborhoods 73—are especially likely to be enrolled in segregated districts. 74

Housing discrimination even endangers the health of its victims. Even at higher income levels, families of color are still subjected to racial discrimination and are thus more likely than similarly situated White families to live in high-poverty neighborhoods. Such neighborhoods present greater risks to children, including exposure to lead and vermin, fewer safe spaces for play, and more limited access to nutritious food. 75 As HUD has written, “for people residing in neighborhoods of concentrated poverty, a number of neighborhood level indicators are linked to important outcomes . . . . [E]ducation, psychological distress, and various health problems, among many other issues, are affected by neighborhood characteristics.” 76 And when Black homeowners face foreclosure, as mortgage discrimination makes more likely, it brings with it a host of dangers both to the health of individuals and to the social cohesiveness of the community. 77

72 See EDBUILD, $23 BILLION 2–5 (2019), https://edbuild.org/content/23-billion/full-report.pdf (analyzing data on this relationship).
73 Sociologist Patrick Sharkey has found that “young African Americans (from 13 to 28 years old) are now ten times as likely to live in poor neighborhoods . . . as young whites.” Richard Rothstein, The Racial Achievement Gap, Segregated Schools, and Segregated Neighborhoods – A Constitutional Insult, ECON. POL’Y INST. (Nov. 12, 2014), https://www.epi.org/publication/the-racial-achievement-gap-segregated-schools-and-segregated-neighborhoods-a-constitutional-insult/.
74 Laura Meckler & Kate Rabinowitz, The Changing Face of School Integration, WASH. POST (Sept. 12, 2019), https://www.washingtonpost.com/education/2019/09/12/more-students-are-going-school-with-children-different-races-schools-big-cities-remain-deeply-segregated/?arc404=true; see also GROVER J. WHITEHURST ET AL., BROOKINGS INST., BALANCING ACT: SCHOOLS, NEIGHBORHOODS AND RACIAL IMBALANCE 14–19 (2017), https://www.brookings.edu/wp-content/uploads/2017/11/es_20171120_schoolsegregation.pdf (analyzing the relationship between racial imbalances in schooling and racially segregated housing); Erica Frankenberg, The Role of Residential Segregation in Contemporary School Segregation, 45 EDUC. & URB. SOC’Y 548, 549–56 (2013), https://journals.sagepub.com/doi/pdf/10.1177/0013124513486288 (discussing data on the relationship between housing and school segregation); Adam Harris, The Whiter, Richer School District Right Next Door, ATLANTIC (Aug. 1, 2019), https://www.theatlantic.com/education/archive/2019/08/segregated-school-districts-trapped-their-borders/595318 (discussing how the composition of segregated schools reflects the surrounding neighborhoods).
75 Brian Smedley & Rachel A. Davis, The Effects of Housing Discrimination on Health Can Reverberate for Decades, HILL (Aug. 27, 2019, 8:00 AM), https://thehill.com/opinion/civil-rights/458899-the-effects-of-housing-discrimination-on-health-can-reverberate-for.
76 Understanding Neighborhood Effects of Concentrated Poverty, EVIDENCE MATTERS, Winter 2011, https://www.huduser.gov/portal/periodicals/em/winter11/highlight2.html.
77 See Susan Saegert, Desiree Fields & Kimberly Libman, Mortgage Foreclosure and Health Disparities: Serial Displacement as Asset Extraction in African American Populations, 88 J. URB. HEALTH 390 (2011), https://link.springer.com/content/pdf/10.1007%2Fs11524-011-9584-3.pdf (discussing how “[t]he health impacts of serial displacement for individuals, social groups, and neighborhoods . . . contribut[e] to poorer health for individuals while maintaining and exacerbating existing health disparities between African Americans and whites, socially and spatially,” as well as how foreclosures cause neighborhoods to “gain hazards including vacant properties and their negative effects on crime, public safety, and health”).
78 See supra Section II.2.

More protection is needed for buyers and renters of color—not less. This includes protection from algorithmic tools, which have contributed to racial discrimination even without the broad defenses that the proposed rule would create. 78 By weakening existing protections in general, and with regard to algorithmic tools specifically, the proposed rule will only exacerbate racial discrimination in housing, with devastating effects for communities of color.

Conclusion

Racial discrimination in housing has a long, disturbing history. It was a predominant practice and policy of the Jim Crow era and a major focus of the civil rights movement, exemplified by Dr. Martin Luther King, Jr.’s 1966 campaign in Chicago for fair housing. In enacting the FHA, the federal government signaled its will to combat this manifestation of racism. Although racial discrimination continues to be all too prevalent in the realm of housing, the FHA has served as an important legal tool for challenging it. If HUD adopts this rule, it will dramatically undermine the FHA, enable racially discriminatory housing practices, and further disadvantage communities of color. At a time when racial discrimination increasingly manifests through the use of algorithmic tools alongside explicit, intentional discriminatory action, the animating spirit of the FHA demands that HUD reject a proposed rule that would erect impossibly high hurdles for countless victims of racial discrimination. We therefore urge HUD to withdraw the proposed rule.

Sincerely,

Deborah N. Archer
Associate Professor of Clinical Law, New York University School of Law; and Faculty Co-Director, Center on Race, Inequality, and the Law

Ruha Benjamin
Associate Professor of African American Studies, Princeton University

Meredith Broussard
Associate Professor, New York University

Wendy Chun
Professor and Simon Fraser University’s Canada 150 Research Chair in New Media in the School of Communication

Kate Crawford
Distinguished Research Professor, New York University; and Co-founder of AI Now Institute

Steve Demarest
Research Scholar, Center on Race, Inequality, and the Law

Ben Green
Research Fellow, AI Now Institute

Sarah L. Hamilton-Jiang
Research Scholar, Center on Race, Inequality, and the Law

Frank Pasquale
Piper & Marbury Professor of Law, University of Maryland

Rashida Richardson
Director of Policy Research, AI Now Institute

Jason M. Schultz
Professor of Clinical Law, New York University School of Law

Robert Seamans
Associate Professor of Management and Organizations, New York University Stern School of Business

Vincent M. Southerland
Executive Director, Center on Race, Inequality, and the Law

Meredith Whittaker
Co-founder of AI Now Institute