WILLIAM R. JOHNSTON, JOHN ENGBERG, ISAAC M. OPPER, LISA SONTAG-PADILLA, LEA XENAKIS

Illustrating the Promise of Community Schools: An Assessment of the Impact of the New York City Community Schools Initiative

Sponsored by the New York City Mayor's Office for Economic Opportunity

For more information on this publication, visit www.rand.org/t/RR3245

Published by the RAND Corporation, Santa Monica, Calif. © Copyright 2020 RAND Corporation. R® is a registered trademark. Cover image: Rachael Hacking for Show the Good.

Limited Print and Electronic Distribution Rights: This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited. Permission is given to duplicate this document for personal use only, as long as it is unaltered and complete. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial use. For information on reprint and linking permissions, please visit www.rand.org/pubs/permissions.

The RAND Corporation is a research organization that develops solutions to public policy challenges to help make communities throughout the world safer and more secure, healthier and more prosperous. RAND is nonprofit, nonpartisan, and committed to the public interest. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.

Support RAND: Make a tax-deductible charitable contribution at www.rand.org/giving/contribute. www.rand.org

Preface

With the launch of the New York City Community Schools Initiative (NYC-CS) in 2014, the New York City Department of Education (NYCDOE) increased its focus on implementing a holistic strategy of education reform to address the social consequences of poverty as a means of improving student outcomes. NYC-CS is a strategy to organize resources in schools and share leadership among stakeholders so that academics, health and wellness, youth development, and family engagement are integrated into the fabric of each school. New York City is implementing this strategy at a scale unmatched nationally.

In this study, we assessed the impact of the NYC-CS through the 2017–2018 school year. We assessed the effects along seven outcome domains (attendance, educational attainment, academic performance, disciplinary incidents, teachers' shared responsibility for student success, student connectedness to adults and peers, and family empowerment opportunities) and explored the extent to which there was variation in programmatic impact based on student- and school-level characteristics. We leveraged innovative quasi-experimental methodology to determine whether students in community schools are performing better than they would be had their schools not been designated as community schools. The findings of this report will contribute to the emerging evidence base on the efficacy of the community school strategy and will be useful for other school district– and state-level policymakers interested in developing or refining similar interventions that support students' and communities' academic, social, and emotional well-being.

This research was undertaken by RAND Education and Labor, a division of the RAND Corporation that conducts research on early childhood through postsecondary education programs, workforce development, and programs and policies affecting workers, entrepreneurship, and financial literacy and decisionmaking.
The research was conducted under a contract with the New York City Mayor's Office for Economic Opportunity (NYC Opportunity). Funding to support the evaluation has been provided by NYC Opportunity, the NYCDOE and New York City Department of Health and Mental Hygiene (DOHMH). Although RAND has worked collaboratively with staffs from NYC Opportunity, NYCDOE, and DOHMH, the research team has maintained independent control of all aspects of the study design, as well as final editorial control of all published reports, including this study. More information about RAND can be found at www.rand.org. Questions about this report should be directed to williamj@rand.org, and questions about RAND Education and Labor should be directed to educationandlabor@rand.org.

Contents

Preface
Figures
Tables
Summary
Acknowledgments
Abbreviations
CHAPTER ONE
Introduction
  Background
CHAPTER TWO
Community Schools in New York City
  New York City Community Schools Initiative's Theory of Change
  Implementation Findings
  The Present Study
CHAPTER THREE
Data and Methods
  Data Sources
  Measures
  Methods
CHAPTER FOUR
Results
  Matching Results
  School-Level Average Impact
  Changes in School Demand and Composition
  Effect on Student Subgroups and School Heterogeneous Effects
  Grade-by-Year Analysis Results
CHAPTER FIVE
Discussion
  Community Schools Had a Positive Impact on Most of the Examined Student Outcomes
  There Was Limited and Inconsistent Evidence of Community Schools Supporting Improvements in Aspects of School Culture
  The Estimated Impact Increased over Time for Some Outcomes
  Limitations
  Implications for Policy and Practice
  Directions for Future Research
APPENDIXES
  A. Review Memo
  B. Core Capacity Score Item Summary and Distributions
  C. Data Sources for Mental Health Implementation Profiles
  D. Mental Health Implementation Profile Estimation Results
References

Figures

1.1. Timeline of NYC-CS Implementation and RAND Evaluation Activities
2.1. New York City Community Schools Initiative Theory of Change
2.2. Three-Tiered Model of Mental Health Services
4.1. Average Outcomes of Non-Community Schools, Community Schools, and Matched Comparison Schools over Time: Elementary and Middle Schools
4.2. Average Outcomes of Non-Community Schools, Community Schools, and Matched Comparison Schools over Time: High Schools
4.3. Difference Between Community Schools and Matched Comparison Schools: Elementary and Middle Schools
4.4. Difference Between Community Schools and Matched Comparison Schools: High Schools
B.1. Box-and-Whisker Plot of Core Capacity Index Scores
D.1. Cluster Analysis Structure Comparison

Tables

3.1. Summary of Data Sources
3.2. New York City School Survey Items Used to Calculate Outcome Measures
4.1. Elementary and Middle School Summary Statistics
4.2. High School Summary Statistics
4.3. Average Impact of NYC-CS on Elementary and Middle Schools
4.4. Average Impact of NYC-CS on High Schools
4.5. Measures of Elementary and Middle School Demand and Demographics
4.6. Measures of High School Demand and Demographics
4.7. Estimates of NYC-CS on Student Subgroups in Elementary and Middle Schools
4.8. Estimates of NYC-CS on Student Subgroups in High Schools
4.9a. Effect Heterogeneity in Elementary and Middle Schools: Differences by School Zoning
4.9b. Effect Heterogeneity in Elementary and Middle Schools: Differences by Principal Tenure
4.9c. Effect Heterogeneity in Elementary and Middle Schools: Differences by School Size
4.9d. Effect Heterogeneity in Elementary and Middle Schools: Differences by Renewal School Status
4.10a. Effect Heterogeneity in High Schools: Differences by School Zoning
4.10b. Effect Heterogeneity in High Schools: Differences by Principal Tenure
4.10c. Effect Heterogeneity in High Schools: Differences by School Size
4.10d. Effect Heterogeneity in High Schools: Differences by Renewal School Status
4.11a. Effect Heterogeneity Based on Implementation Metrics: Coordination
4.11b. Effect Heterogeneity Based on Implementation Metrics: Collaboration
4.11c. Effect Heterogeneity Based on Implementation Metrics: Connection
4.11d. Effect Heterogeneity Based on Implementation Metrics: Continuous Improvement
4.11e. Effect Heterogeneity Based on Implementation Metrics: Mental Health
4.12. Maturity and Exposure for Middle School Outcomes
4.13. Maturity and Exposure for High School Outcomes
B.1. Continuous Improvement Capacity Score
B.2. Coordination Capacity Score
B.3. Connectedness Capacity Score
B.4. Collaboration Capacity Score
C.1. Domains and Data Points for Mental Health Implementation Profiles
D.1. Implementation Profile Demographic Comparison

Summary

There is a growing body of research suggesting that community school interventions are a promising strategy to improve student outcomes through coordinated services and collaborative leadership practices (Maier et al., 2017).
The community school strategy entails an integrated focus on academics, youth development, family support, health and social services, and community development with strategic partnerships among the school and local organizations and community members (Blank, Melaville, and Jacobson, 2012). Community schools are experiencing a dramatic expansion across the country, with more than 5,000 such schools in place nationwide (Blank and Villarreal, 2015; National Center for Community Schools, undated). This expansion is linked to a broader movement of place-based, comprehensive interventions that endeavor to strengthen and organize disparate agencies and institutions in an effort to mitigate the harmful effects of poverty.

To date, the largest implementation of the community school strategy occurred in New York City, where Mayor Bill de Blasio in 2014 designated $52 million to create 45 community schools just after taking office. By the 2018–2019 school year, the New York City Community Schools Initiative (NYC-CS) expanded to include more than 200 community schools, with a total budget of $195 million (Jacobson, 2019). The NYC-CS builds on the existing framework of the community schools model, which includes the four core evidence-based features—(1) collaborative leadership and practices, (2) family and community engagement, (3) expanded learning time and opportunities, and (4) integrated student supports (Maier et al., 2017). The initiative has adapted those features to meet the unique needs of New York City students, families, and communities at a scale that is unprecedented in the United States thus far.

The RAND Corporation has been engaged in an evaluation of the NYC-CS since 2016 through a contract agreement with the New York City Mayor's Office for Economic Opportunity. The first report focused on implementation of the NYC-CS in 108 schools and was released in 2017 (Johnston et al., 2017). We found that the community school strategy was alive and well in New York City, with the vast majority of community schools already implementing most (if not all) of the strategy's core features. For example, we found that most schools had formed well-aligned partnerships with community-based organizations, provided expanded learning time opportunities, and were implementing a three-tiered model of mental health programs and services by the 2015–2016 school year.

The Present Study

This report is an extension of our 2017 report that focused on the implementation of the NYC-CS program through the 2015–2016 school year. The primary objective of this study was to leverage a quasi-experimental methodology to estimate the overall program impact through the 2017–2018 school year by comparing the outcomes of students attending NYC-CS schools with those of students attending a carefully constructed comparison group of schools that are similar to NYC-CS schools in many ways, except for their designation as "community schools." We also assessed variation in estimated program impacts based on a series of student- and school-level characteristics.

Findings

We found that the NYC-CS had a positive impact on students across various outcome measures, with some notable exceptions. In particular, we found that the NYC-CS had a positive impact on student attendance in all types of schools (elementary, middle, and high schools) and across all three years that outcomes were measured (2015–2016, 2016–2017, and 2017–2018).
We also found positive and significant impacts on elementary and middle school students' on-time grade progression in both years for which we have data and on high school students' graduation rates in two of the three years. Our analyses suggest that the NYC-CS led to a reduction in disciplinary incidents for elementary and middle school students but not for high school students. Finally, we found that the NYC-CS had a positive impact on math achievement in the third and final year, but the impact estimates on reading achievement in all three years and on math achievement in the first two years were smaller and not statistically significant.

Our evaluation found limited and inconsistent evidence of the NYC-CS supporting improvements in school climate and culture for elementary and middle schools. For example, we found that teachers' reports of shared responsibility for student success increased at elementary and middle schools in the second and third years of the study. In addition, we found a positive effect on students' sense of connectedness to adults and peers for elementary and middle school students, but only in the second year of the study period. Finally, we found no statistically significant impact on families' reports of engagement opportunities in elementary and middle schools. For high schools, none of the impacts on school climate and culture measures was statistically significant.

We also examined whether estimated program impacts were being driven by certain subgroups of students or schools, including schools that had been found to be implementing key aspects of the NYC-CS program at higher levels than others. Although all community schools experienced reductions in chronic absenteeism, we found that community schools with higher levels of implementation of mental health programs and services saw a stronger impact on this outcome, compared with community schools with lower levels of mental health implementation. We also found evidence of variation in impact for some outcomes when comparing community schools that were concurrently a part of the NYCDOE's Renewal Schools program with those that were not. (The Renewal Schools program is a concurrent school-improvement initiative that endeavors to turn around some of the lowest-performing schools in the city with a combination of instructional supports for teachers and social supports for students.) Otherwise, we did not find a consistent pattern of differential impacts, suggesting that the overall pattern in estimated effects was not being driven by particular subgroups of students, schools, or program implementation factors.

Limitations

Although our findings were robust to numerous sensitivity checks, this study did have a number of limitations. First, New York City and its school system are unique; therefore, whether this initiative can be replicated elsewhere with similar findings is unknown. Second, because the initiative was launched at a large scale and focused on all New York City schools that failed to meet specific academic goals, we were unable to use a randomized design or create a perfectly balanced comparison group. We utilized a quasi-experimental design that created a comparison group that included nonprogram schools with similar baseline trajectories to the NYC-CS schools and excluded nonprogram schools that were substantially different from the NYC-CS schools.
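As a concrete illustration of this design and of the difference-in-differences contrast discussed next, the sketch below shows one simplified way such an analysis could be set up. It is a minimal sketch under stated assumptions: the variable names, the use of school-level attendance rates, the nearest-neighbor matching on a two-year baseline trajectory, and the single pre/post contrast are all illustrative choices, not the study's actual data structure or specification.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

def make_schools(n, is_cs):
    """Simulate school-level attendance rates; is_cs = 1 flags a community school."""
    base = rng.normal(88 - 3 * is_cs, 3, n)   # pre-period attendance level (%)
    trend = rng.normal(0.2, 0.3, n)           # pre-period year-to-year trend
    return pd.DataFrame({
        "school_id": [f"{'cs' if is_cs else 'np'}_{i}" for i in range(n)],
        "is_cs": is_cs,
        "attend_2013": base,
        "attend_2014": base + trend,                                          # last pre-designation year
        "attend_2018": base + 5 * trend + 1.5 * is_cs + rng.normal(0, 2, n),  # follow-up year
    })

schools = pd.concat([make_schools(100, 1), make_schools(400, 0)], ignore_index=True)

# 1. Build the comparison group: match each community school (with replacement)
#    to the nonprogram school whose baseline attendance trajectory is closest.
pre = ["attend_2013", "attend_2014"]
cs = schools[schools.is_cs == 1].reset_index(drop=True)
pool = schools[schools.is_cs == 0].reset_index(drop=True)
dists = np.linalg.norm(cs[pre].to_numpy()[:, None, :] - pool[pre].to_numpy()[None, :, :], axis=2)
comparison = pool.iloc[dists.argmin(axis=1)].reset_index(drop=True)

# 2. Difference-in-differences: the community schools' change from the last
#    pre-designation year to the follow-up year, minus the same change among
#    their matched comparison schools.
did = ((cs["attend_2018"] - cs["attend_2014"]).mean()
       - (comparison["attend_2018"] - comparison["attend_2014"]).mean())
print(f"Illustrative difference-in-differences estimate: {did:.2f} percentage points")
```

Because the comparison schools are chosen to track the community schools before designation, the second difference nets out citywide shifts that affect both groups, which is the intuition behind the analytic approach described next.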
We also employed an analytic approach commonly referred to as "difference-in-difference" to overcome the lack of random assignment and the remaining differences between the NYC-CS and comparison schools. Nevertheless, we cannot rule out that impact estimates could be biased by unobserved differences between the NYC-CS schools and the comparison schools. The study was also limited to existing NYCDOE administrative and survey data, which posed some constraints regarding the availability and consistent measurement of some outcomes over time, particularly for measures of school climate and student mental health. In addition, although we were able to follow schools and students for four years after the program's initiation, this type of holistic intervention could continue to affect participants for years to come.

Implications

The positive findings of the impact of the NYC-CS suggest that the strategy can be a promising approach to support student success in traditionally disadvantaged communities. These positive impacts are particularly important because the NYC-CS is such a large program compared with other instances of the community school strategy that have been rigorously evaluated thus far. We contend that if other, smaller efforts that instill a whole-child, whole-school strategy for student support are to be developed, the coordinated efforts of the New York City Office of Community Schools, along with other key agencies in the city, may provide a promising template for such initiatives.

Acknowledgments

We would like to acknowledge the following people and organizations for their contributions to this work. First, we are grateful to the large number of central office staff, school leaders, and community school directors who gave generously of their time to share their insights and experiences with the New York City Community Schools Initiative. In particular, we appreciate the guidance provided by key staff from the New York City Office of Community Schools, the Bureau of Children, Youth, and Families in the Department of Health and Mental Hygiene, and the Office of School Health. We also thank Fatih Unlu at the RAND Corporation, James Kemple at New York University, and Aaron Pallas at Teachers College, Columbia University, for reviewing the document and providing constructive feedback.

Abbreviations

AIDP  Attendance Improvement and Dropout Prevention
CBO  community-based organization
CSD  Community School director
ELA  English language arts
HCZ  Harlem Children's Zone
NV  New Visions for Public Schools
NYC-CS  New York City Community Schools Initiative
NYCDOE  New York City Department of Education
NYCSS  New York City School Survey
OCS  Office of Community Schools
PAM  Partitioning Around Medoids
PCA  principal component analysis
SLT  school leadership team
SMHM  school mental health manager

CHAPTER ONE
Introduction

Decades of research have shown that many disparities in educational outcomes are related to nonacademic factors, such as poverty, housing instability, exposure to trauma and violence, and limited access to health care (e.g., Knopf et al., 2016; Reardon, 2011). These outside-of-school factors represent barriers that inhibit many students' ability to attend school regularly or arrive ready to learn and succeed (Nauer et al., 2014).
One strategy for mitigating these negative consequences of poverty and socioeconomic disadvantage is to empower schools to become service hubs that provide children and families with coordinated, cross-sector assistance and support in various areas (Jacobson and Blank, 2011). Although the provision of school-linked services is not new in the United States, there is an emerging movement to provide holistic educational reform as a contrast to more narrowly defined interventions that strictly focus on academics, with the community school strategy being the most commonly implemented reform of this effort (Bronstein, Mason, and Quinn, 2016). Broadly, a community school is a partnership involving the school, students' families, and the surrounding community that maintains an integrated focus on academics, youth development, family support, health and social services, and community development (Blank, Melaville, and Jacobson, 2012). A community school serves not only the students but also students' families and the surrounding community by providing access to such critical programs and services as health care, mentoring, expanded learning programs, and adult education (Bronstein, Mason, and Quinn, 2016; Coalition for Community Schools, 2017; Dryfoos and Maguire, 2002; Warren, 2005).

To date, the largest implementation of the community school strategy is in New York City, where in 2014 Mayor Bill de Blasio designated $52 million to create an initial cohort of 45 community schools just after taking office. By the 2018–2019 school year, the New York City Community Schools Initiative (NYC-CS) had expanded to include more than 200 community schools, with a total annual budget of $198 million (New York City Independent Budget Office, 2018). (Examples of community schools existed in New York City long before 2014, going as far back as the urban settlement houses in the 1800s. See Belay, Mader, and Miller [2014] for a more detailed account of early community school efforts in New York City.) The growth of the NYC-CS was designed to complement existing initiatives that served as the foundation for the city's goal of achieving an equitable educational system. The first cohort of community schools comprised 45 schools receiving the city's Attendance Improvement and Dropout Prevention (AIDP) grant, with these schools gradually being onboarded as community schools during the 2014–2015 school year (see Figure 1.1).

[Figure 1.1. Timeline of NYC-CS Implementation and RAND Evaluation Activities. The timeline spans the pre-trend period (2009–2010 through 2013–2014), the transition year (2014–2015), and Years 1–3 of the study (2015–2016 through 2017–2018). Milestones include the announcement and launch of NYC-CS as part of the AIDP grant, the selection of 45 schools to partner with CBOs for NYC-CS, the gradual onboarding of NYC-CS schools (including partnerships with lead CBOs), 94 Renewal Schools beginning to receive NYC-CS supports, the first through third full years of NYC-CS (RAND Study Years 1–3), RAND implementation data collection in Years 1 and 2, and publication of the RAND implementation report in fall 2017. NOTE: CBO = community-based organization.]

The program expanded in 2015, when 94 additional schools that were also designated as Renewal Schools were onboarded as community schools in fall 2015. (In addition to receiving the community schools supports, Renewal Schools benefit from additional interventions focused on improving classroom instruction and boosting academic achievement. Eleven of the 45 schools in the initial AIDP cohort were also Renewal Schools; in all, 83 new schools were added to the program when the Renewal Schools were integrated into the NYC-CS.)
Additional expansions took place in 2017 and 2018, bringing the total number of community schools to 258 by fall 2018. (These newer cohorts of community schools are not included in the present analysis because of the limited time frame available to measure outcomes. Also, these newer sets of community schools have a slightly different service model, with no contracted mental health service providers using the three-tiered model described later in this report.) To lead implementation and facilitate centralized coordination and support of the city's growing cadre of community schools, the New York City Department of Education (NYCDOE) created the Office of Community Schools (OCS) in 2015.

In this study, we estimated the impact of the NYC-CS through the 2017–2018 school year. We assessed the effects along seven outcome domains. Four of these were at the student level:
• attendance
• educational attainment
• academic performance
• disciplinary incidents.
The other three were school-level outcomes related to culture and climate:
• teachers' shared responsibility for student success
• student connectedness to adults and peers
• family empowerment opportunities.
We also explored the extent to which there was heterogeneity in programmatic impact based on student- and school-level characteristics. We leveraged innovative quasi-experimental methodology to determine whether students in the community schools are doing better than they would be had their schools not been designated community schools. In the sections that follow, we discuss the background literature on community school models across the country (Chapter Two), followed by a detailed discussion of the data and methods (Chapter Three), and then our findings (Chapter Four). We conclude with a discussion of implications for policy and practice (Chapter Five).

Background

Community schools is an umbrella term describing schools that provide various services to address the comprehensive needs of students, families, and communities, through collaboration with community agencies and local government (Jenkins and Duffey, 2016; Oakes, Maier, and Daniel, 2017). The U.S. Department of Education articulated that the primary purpose of full-service community schools is to "provide comprehensive academic, social, and health services for students, students' family members, and community members that will result in improved educational outcomes for children" (U.S. Department of Education, 2014). Community schools provide various services, and each model aims to address the local needs of the students, families, and neighborhood through a variety of comprehensive services (Jenkins and Duffy, 2016). Community schools are also characterized by the partnerships formed with the community through engagement and services. Community schools often partner with CBOs to provide schoolwide preventative approaches, including social and emotional learning programs, expanded learning time, adult and family services, and health and mental health supports (Báez et al., 2019).
Community schools were originally designed to provide a quality educational environment for families in low-income neighborhoods by creating local partnerships with various community organizations, encouraging and providing opportunities for engagement by parents, and providing extracurricular activities (Bireda, 2009; Blank, Melaville, and Shah, 2003; Castrechini and London, 2012; Dryfoos, 2008; Jacobson and Blank, 2015). The combination of services, engagement, and improved educational opportunities that community schools provide can alleviate the challenges that underresourced and segregated communities often experience (Biag and Castrechini, 2016; Fehrer and Leos-Urbel, 2016; Sanders, Galindo, and DeTablan, 2019). However, there is also an empowerment component to the community schools strategy, because schools strive to address students', families', and communities' complex needs while also empowering families to generate sustainable changes in their communities. A general goal of the community schools strategy is to build strong ties among key stakeholders through the establishment of inclusive, collaborative climates that value and expand families' social capital (Galindo, Sanders, and Abel, 2017).

Community Schools in the United States

Community schools are experiencing a notable expansion across the United States, with current estimates indicating there are approximately 5,000 such schools across the country (National Center for Community Schools, undated). This expansion is linked to a broader movement of place-based, comprehensive education interventions that endeavor to strengthen and organize disparate agencies and institutions in an effort to mitigate the harmful effects of poverty. For example, the Obama-era Promise Neighborhoods program has helped more than 70 communities create systems of integrated supports, from early childhood programming to after-school programs to job training (Jacobson, 2019). The core tenets of the strategy are echoed in the broader, bolder approach to education reform that aims to address social and economic disadvantages through the provision of such supports as early childhood and preschool programs, the extension of learning opportunities after school and in the summer, and school-based health services (Steen and Noguera, 2010). Community schools have had staunch support from teacher unions, with the American Federation of Teachers and the National Education Association advocating the expansion of community schools and supporting members interested in pursuing the strategy. Recently, a key component of the agreement ending a protracted teacher strike in Los Angeles was the Los Angeles Unified School District's promise to transform 30 schools into community schools, investing approximately $400,000 per school (Quartz, 2019).

A prominent example of a community schools model is the Harlem Children's Zone (HCZ) in New York City, a comprehensive strategy for children and families with longer school days and after-school activities. HCZ's goal is to break "the cycle of generational poverty for the thousands of children and families it serves" (HCZ, undated) through education interventions and community services. Two other examples garnering national attention are Pennsylvania and Oklahoma's models. The United Way of the Lehigh Valley in Pennsylvania provides services, along with its partners, to 14 community schools.
Similar to other models, it includes comprehensive services, but it relies heavily on the partners and leaders rather than the district and requires substantial support from corporate partners. The Tulsa Area Community Schools Initiative in Oklahoma is another comprehensive model that engages families and the community by encouraging them to participate in the governance of the initiative. Many cities have employed the community schools strategy in some of their schools, ranging from just a handful of schools to larger programs, such as Cincinnati's Community Learning Centers program, with 39 schools as of 2019 and dozens more in the works. In 2011, the country's first districtwide community schools initiative, with an overarching model for student support in all schools, was established in Oakland, California (Jacobson, 2019).

Community school models are quite varied in terms of size, budget, menu of programming, and partnership structure; however, most community school initiatives have many commonalities that converge in four core evidence-based features, as Oakes, Maier, and Daniel (2017, p. 1) articulate in their comprehensive literature review:

• Integrated student supports. Youth development is integrated across academics, programs and services. Additionally, mental health, medical, and social services are integrated into the schools and available to students who need them.
• Expanded learning time and opportunities. Expanded learning time includes academic interventions and enrichment activities and is aligned with school-day curriculum and expectations.
• Family and community engagement. Parents and the community help design and plan the Community School according to its strengths and needs, and parents and caregivers are active partners in their children's education. Additionally, family members have access to education opportunities and programs that strengthen families.
• Collaborative leadership and practices. Schools implement a collaborative school governance structure that includes a lead Community-Based Organization (CBO) partner and members of a School Leadership Team. Additionally, school leadership has a clear instructional vision and high expectations for all students.

These four components align with the core structures that the NYC-CS has in place (as shown in the "NYC-CS Theory of Change" section in Chapter Two). In the next section, we summarize evidence about the effectiveness of the community schools strategy, organized around these features. We also discuss the evidence about comprehensive models that incorporate some or all of these components.

Prior Research on the Community School Strategy

There is an emerging evidence base on the efficacy of the community schools approach, both in the quantity of studies and the statistical rigor of the work. Over the past decade, various high-quality studies (i.e., randomized controlled trials or quasi-experimental designs) evaluating the impact of the community schools model as a comprehensive strategy, as well as studies of the core features (e.g., integrated student supports, expanded learning time, family and community engagement, and collaborative leadership), have emerged (for systematic reviews, see Maier et al. [2017] and Moore et al. [2017]). Broadly, the evidence base suggests some positive academic, behavioral, and social-emotional gains for students in schools participating in comprehensive community school initiatives.
Full-service community schools have been linked to greater access to coordinated services for families, lower family stress, increased family engagement, and lower chronic student absenteeism (Arimura and Corter, 2010; Hancock, Cooper, and Bahn, 2009; Olson, 2014). In addition, several studies have found that community school programming has been linked to reductions in student absences (e.g., Dobbie and Fryer, 2011; ICF International, 2010b; Kemple, Herlihy, and Smith, 2005). There is also evidence that comprehensive community school reforms may improve school climate outcomes. In Baltimore, the families of students at community schools were more likely to report that school staff cared about their children, connected them with community resources, and worked closely with their children to learn, compared with parents at non-community schools (Durham and Connelly, 2016). In addition, Olson (2014) found that parents at community schools had higher response rates to school climate surveys than parents from a comparison group of non-community schools. And finally, LaFrance Associates (2005) found that students at full-service community schools had greater improvements in attitudes toward school and improved quality of relationships with adults.

Findings for academic achievement have been mixed, with some evidence of improvements in students' grades (LaFrance Associates, 2005) but other results suggesting no differences in math and reading achievement (Adams, 2010). In contrast, multiple quasi-experimental design studies of the City Connects program in Boston found positive impacts on math and English language arts (ELA) test scores of high-poverty elementary school students (Walsh et al., 2014), particularly first-generation immigrants and English language learners (Dearing et al., 2016). Although Cummings, Dyson, and Todd (2011) found some evidence of academic improvements for the most-disadvantaged students, there was no evidence of an overall effect for all students. In addition, a series of randomized controlled trials of Communities in Schools interventions found positive impacts on math and reading test scores for some cohorts of students in some study sites, but there were no consistently positive impacts for all students in all study locations (ICF International, 2010a, 2010b).

Fewer evaluations examined disciplinary outcomes, and those that did found mixed results (Maier et al., 2017). Additionally, evidence regarding behavioral health outcomes is limited, although there is suggestive evidence that community schools might help cultivate healthy student behaviors through improved school culture and climate (Maier et al., 2017). The recent literature review by Child Trends reported on four studies that found declines in risky behavior and behavioral problems. Similarly, the report listed four studies that found a positive relationship between socioemotional development and integrated student support models (Moore et al., 2017). On the other hand, evaluations of Communities in Schools programs—in Austin, Texas; Jacksonville, Florida; and Wichita, Kansas—all found null effects on behavioral problems (ICF International, 2010a, 2010b, 2010c). Finally, there is emerging evidence that implementation timing and fidelity are likely to be important factors. For example, two studies report that the greater the extent and fidelity of implementation, the greater the positive outcomes (Comer and Emmons, 2006; Kalafat, Illback, and Sanders, 2007).
In addition, Dryfoos (2008) and Smith, Anderson, and Abell (2008) point out that achievement outcomes develop slowly, only after an intervention has been in place for several years. Thus, evaluation results that are based on early implementation may not be an accurate estimate of a program's full potential impact.

Although the community school strategy is not particularly new in the United States or even in New York City, the model has expanded rapidly, from 45 schools in 2014 to more than 100 schools in 2016 to 258 schools as of fall 2018. With that scale-up (both in New York City and nationwide) comes heightened interest in understanding the impact of the model and whether the rapid expansion may come at the expense of program quality (Kirp, 2019). It is therefore imperative to conduct a rigorous evaluation of the impact of the community schools strategy in New York City, to which we turn our attention in Chapter Two.

CHAPTER TWO
Community Schools in New York City

Similar to other community school models that have been implemented across the country, the NYC-CS is based on principles of organizing resources and establishing collaborative leadership so that academics, health and wellness, and family empowerment are integrated into the fabric of each school. The NYC-CS embodies the existing framework of the community schools model that includes the four core evidence-based features (collaborative leadership and practices, family and community engagement, expanded learning time and opportunities, and integrated student supports) most commonly seen among community schools (Maier et al., 2017), and it has adapted those elements to meet the unique needs of New York City students, families, and communities, at a scale that is unprecedented in the United States thus far.

New York City Community Schools Initiative's Theory of Change

The NYC-CS uses a capacity-building model to support community schools' positive development along four key capacity domains—continuous improvement, coordination, connectedness, and collaboration (see the center column of Figure 2.1 for definitions). This capacity-building model moves beyond merely injecting services into schools and is intended to be more sustainable so that schools and communities are able to work together and effectively support students and communities. As shown in Figure 2.1, the NYC-CS Theory of Change posits that community schools will develop along the four capacities in support of the whole child when NYCDOE invests in operations and administrative support, infrastructure and technical assistance, holistic tools and resources, new programs and initiatives for students and families, and organizing strategies for schools and CBOs. In addition to the four core capacities being fostered within each school, there are four core evidence-based features that are present within each of the community schools, including collaborative leadership and practice, family and community empowerment, expanded learning time, and wellness and integrated student supports. The model posits a feedback loop between the development of the core capacities and the implementation of the key evidence-based features, such that capacity improvements are likely to beget more effective implementation of the program features, which in turn might contribute to greater capacity development.
Although not an explicit goal of the NYC-CS, we also included school demand as an intermediate outcome and potential mediator of student outcomes, such as average school academic performance and attendance. As the community perceives that community schools are improving, we expect the number of students applying to and enrolling in these schools will increase.

[Figure 2.1. New York City Community Schools Initiative Theory of Change. If the Office of Community Schools provides operations and administrative support to schools (e.g., budgets, coordination of partnerships); infrastructure and technical assistance to provide resources and sharing of best practices for schools and CBOs; holistic tools and resources using real-time data for strategic decisionmaking; new programs and initiatives to complement ongoing efforts to create healthy and thriving learning communities; and organizing strategies for schools and CBOs that focus efforts around student success, then community schools develop their capacity in continuous improvement (ongoing collection and analysis of data to assess needs and guide decisions), coordination (across programs and agencies to ensure equitable delivery of the right services to the right students at the right time), connectedness (among adults and students, fostering a sense of community among all stakeholders and encouraging resilient academic and personal behaviors among students), and collaboration (strengthening school and CBO partnerships and supporting families' voices in school engagement and student learning), and they institute core evidence-based features, including collaborative leadership and practice (CBO partnerships and CSDs; data-informed planning and interventions; interagency and public-private partnerships; assets and needs assessment), family and community empowerment (family nights; family leadership training; specialized programs such as adult education classes and home visits), expanded learning time (hands-on learning experiences; CBO cofacilitation of programs before, during, and after school; summer programming), and wellness and integrated student supports (mental health; reproductive health; vision screenings; success mentoring; vulnerable youth services covering homelessness, immigration, and relationship violence). This results in improved school climate and culture (shared responsibility for student success; student connectedness to adults and peers; family empowerment opportunities) and improved student outcomes (attendance; educational attainment; academic performance; disciplinary incidents), mediated by school demand and composition (more advantaged students; more transfers in; fewer transfers out). SOURCE: Adapted from the New York City Community Schools Strategic Plan (New York City Community Schools, undated) and authors' correspondence with the New York City OCS. NOTE: CSD = Community School director.]

We also expect more applications and enrollment from less disadvantaged students who previously opted for other schools that required families to find, enroll in, and travel to resources. A decrease in the proportion of disadvantaged students could lead to improved average student outcomes at the community schools, even if the community schools were not improving the outcomes of individual students.
Therefore, we examined not only whether demand increases but also whether any improvement in average outcomes is attributable to changes in school composition, improvements in individual outcomes, or both.

Each community school has the flexibility and autonomy to select its lead CBO partner with whom it works to develop the specific menu of programs and services for its community. The OCS provides schools with implementation support, primarily through program managers (who are each responsible for 13 to 15 schools). Program managers serve as coaches to support schools' capacity development as they implement the four core features of the community school strategy. Program managers also support effective implementation of collaborative planning meetings in which multiple stakeholders analyze academic and financial data to ensure equitable allocation of resources to meet student needs. Schools are required to hold at least three of these meetings per year, with attendees including the principal (or designee), the CSD, key staff from CBO partners, and the school mental health manager (SMHM). SMHMs are based in New York City's Office of School Health and are responsible for supporting the implementation of mental health programs, which fall under the umbrella of Wellness and Integrated Student Supports. SMHMs support these efforts by helping establish, expand, and promote the three-tiered model (described in more detail below); collaborating with schools, CBOs, mental health providers, and other key stakeholders; and monitoring progress within the schools. In addition, outreach specialists from the Division of Family and Community Engagement in the NYCDOE work with schools to support family engagement efforts in the NYC-CS. With schools implementing the four core evidence-based features and developing along the four core capacities, the Theory of Change hypothesizes, in part, that these efforts will lead to improvements in school climate and students' academic and behavioral outcomes. Next, we describe the four core features of the NYC-CS in more detail.

Core Evidence-Based Features of a New York City Community School

As part of its Theory of Change, the NYCDOE identified four core evidence-based features to be implemented by all community schools in New York City. The features include (1) collaborative leadership and practice, (2) family and community empowerment, (3) expanded learning time, and (4) wellness and integrated student supports. These core evidence-based features were informed by national research (e.g., Maier et al., 2017), as well as local input from New York City principals, CBO providers, community partners, and members of the NYC Community Schools Advisory Board. According to the New York City Community Schools Strategic Plan, every community school is intended to uniquely reflect the strengths and needs of its students, families, and local community. (See the New York City Community Schools website for the full strategic plan [New York City Community Schools, undated].) Given this need for flexibility and responsiveness to the needs of the local community, it is best to think of the NYC-CS features as a way to ensure consistency and accountability across community schools while allowing flexibility for schools to innovate and customize their services to best meet the needs of their community and student population. That being said, the schools in the NYC-CS are expected to implement some type of programming related to each feature.
Collaborative Leadership and Practice

In the NYC-CS model, collaborative leadership and practice includes four key components: (1) CBO partnerships and CSDs, (2) data-informed planning and interventions, (3) interagency and public-private partnerships, and (4) assets and needs assessment. Under this model, each school is paired with a lead CBO partner who works collaboratively with the principal and other school leaders to carry out the NYC-CS at the school level, which includes the hiring of a CSD—a full-time staff person in the school building who is focused on assessing school and student needs, securing resources, and coordinating services for students, families, and the school community across organizations and partners. In addition to the lead CBO, which hires the CSD and serves to coordinate services at the school, most schools also work with various other partner CBOs to implement the programs associated with the NYC-CS. The CBOs are often nonprofit social service, education, or health/mental health organizations; their partnerships with schools are formalized in contracts, memoranda of understanding, or linkage agreements.

The use of data to inform continuous improvement is also a core component of the NYC-CS. The initiative encourages schools to engage in strategic data collection and analysis that will inform program decisions and help align outcomes with the school's needs. Key staff from the school and CBO partners conduct an annual assets and needs assessment of the school and community to determine their academic, health, social, and emotional needs, along with resources and assets present in the school and community. School and student goals (and the school's progress toward achieving those goals) are regularly shared among all school partners through data inquiry and collaborative data review. To support the use of real-time data to inform school efforts, schools and lead CBOs have been provided with access to a data portal developed by New Visions for Public Schools (NV). The portal allows for regular conversations between school administrators and CBO staff to manage critical school processes, such as course programming and interventions for student academics, attendance, and well-being. The conversations are grounded in school-specific data tools, which organize key data on each student and help facilitate the workflow of critical student- and school-level tasks.

Family and Community Empowerment

Core to the NYC-CS is successful family engagement that ensures that parents and caregivers are enlisted as partners in their children's education and well-being through involvement in implementing the community schools model. This model is rooted in principles of community organizing, which see parents as partners with capacities of their own that can contribute to educational improvements (Mapp and Kuttner, 2013). The NYC-CS seeks to engage and empower families through collaborative school-based governance, family organizing, and leadership development. These activities may manifest through the work of an existing school leadership team (SLT) and community school team, as required by the NYCDOE, or via data-sharing with families to engage them in decisionmaking about school initiatives. Multiple school- and central office–level staff members are dedicated to supporting family engagement in the NYC-CS. For example, each community school has a parent coordinator who focuses on meeting parents' needs and creating opportunities to engage families in school activities.
In addition, outreach specialists from the Division of Family and Community Engagement, an agency within the NYCDOE, work with community school staff and school leadership to develop schools' capacity to carry out effective family engagement practices.

Expanded Learning Time
Expanded learning time is a strategy used by schools to redesign their school days and/or yearly calendar to provide students, particularly in communities of concentrated poverty, with substantially more and better learning time (Jacobson and Blank, 2015). Given the focus on improving students' academic success, additional learning time—through expanding the traditional school day and/or offering after-school and/or summer enrichment programs—is core to the community school strategy in New York City. Under the NYC-CS, expanded learning time includes the following components: (1) more hands-on learning experiences; (2) CBOs' cofacilitation of programs before, during, and after school with school administration and staff; and (3) the availability of summer programming.

Wellness and Integrated Student Supports
Under the NYC-CS model, community schools offer various wellness and integrated student supports that address mental health, reproductive health, vision, mentoring for students at risk of chronic absenteeism, and services for vulnerable youth (e.g., homeless youth, immigrants, witnesses of domestic violence). The availability of specific services and programs might vary from school to school based on the needs of the students and the existing programs in the school. Although all of these services fall under the NYC-CS umbrella, we focus here on the mental health component. Under the NYC-CS, community schools are expected to enhance the inclusion and presence of mental health programs and services, foster a seamless integration of these programs and services with other academic and health supports, and facilitate the coordination and integration of efforts across institutions (e.g., schools, communities, schooling systems, and government entities). In this context, NYC-CS has adopted a public health model for mental health and a three-tiered model for delivering mental health programs (Fox et al., 2003; Fox et al., 2009). Mental health programs and services are intended to promote the emotional well-being and healthy functioning of all students through three tiers of supports (Tier 1: Universal; Tier 2: Selective; and Tier 3: Targeted [see Figure 2.2]). Tier 1 programs are usually preventive in nature, addressing social-emotional health before problems arise, and are inclusive of all students. Tier 2 interventions do not replace Tier 1 interventions but rather are supplemental, focusing on early intervention for at-risk students. Tier 3 services or treatments are designed to meet the needs of a few students with diagnosable mental health disorders. In tiered models, the intensity of the service or program increases progressively, and the determination of which services or programs are offered to a student is based on a combination of the individual needs of the student and the outcome goals of the service (e.g., building universally beneficial social and emotional skills versus addressing clinical symptoms of a disorder by accessing onsite mental health treatment services).
There is a strong evidence base linking participation in schoolwide multitiered models of mental health support with improvements in a variety of educational outcomes (e.g., standardized assessment test scores), attendance, behavioral and mental health outcomes, and school engagement, as well as fewer discipline referrals and suspensions (Kase et al., 2017; Sanchez et al., 2018).

Figure 2.2
Three-Tiered Model of Mental Health Services
Tier 1: Universal (for all students). Schoolwide supports and resources appropriate for all students to impart knowledge, awareness, and skills that promote social, emotional, and mental well-being and that encourage help-seeking.
Tier 2: Selected (for some students). School supports and resources for a subset of students who are identified as being at risk of developing mental health or substance use conditions, to prevent these conditions from developing or to detect a condition early.
Tier 3: Targeted (for a few students). Supports and resources for the few students who have diagnosable mental health conditions and who are already displaying or have been identified with particular emotional, behavioral, or mental health problems.
SOURCE: Definitions for tiers provided by the New York City Department of Health and Mental Hygiene and adapted from Fox et al. (2003 and 2009).

The services implemented by the Office of School Health do not follow a cookie-cutter model. The breadth and depth of programming and services differ across schools based on their funding profile and are tailored to school and student needs. To support the adoption of the three-tiered model for delivering mental health programs and services, and to help foster a cultural shift within the community schools toward viewing mental health and well-being as an integral part of students' academic success, the Office of School Health assigned each school an SMHM who supports implementation efforts across multiple schools. Mental health managers work closely with the CSD, principal, assistant principal or dean, and the school-based support team, which might include social workers, the school psychologist, a guidance counselor, or a community mental health provider, to establish, expand, and promote the three-tiered model and monitor progress within the schools.

Core Capacities
In addition to the core components described above, the NYC-CS Theory of Change also considers the extent to which schools are developing in key areas of school governance and operations. The four core capacity domains are briefly defined in Figure 2.1, and we provide more information on our operationalization later in this chapter, in Appendix B of this report, and in our previous report on NYC-CS implementation (Johnston et al., 2017). First, the continuous improvement core capacity represents the ongoing collection and analysis of data to assess needs and guide decisions. This capacity is measured via a composite score of responses from a school leader survey administered by RAND in 2016 related to whether school staff use data regularly to set benchmarks, track progress, and guide programming, both for individual students and for the school as a whole (see Appendix B for specific items). Second, the coordination core capacity is defined as the strategic alignment of varied programs and agencies to ensure equitable delivery of the right services to the right students at the right time.
This score is based on survey items related to schools' ability to strategically align various CBO partners and integrated programs to efficiently provide students with high-quality, well-integrated programming. Third, the connectedness core capacity is defined as positive relationships among adults and students that foster a sense of community among all stakeholders and encourage resilient academic and personal behaviors among students. We measured connectedness via school leader reports of the sense of community among school and CBO staff, as well as students and families. The collaboration core capacity is defined as the effective alliances between schools and their CBO partners, along with the integration of families' voices in school engagement and student learning. This score is based on school leader reports of the strength of the partnership between the school and CBO, the principal and the CSD, and the level of family engagement at the school. These core capacities are intended to be mutually reinforcing with the effective implementation of the more-tangible core components—i.e., when schools implement more aspects of the program, they will develop their capacities in key ways, which will in turn facilitate more effective programmatic implementation. This capacity-building approach to school reform moves beyond merely injecting services into schools and is intended to be more sustainable, so that schools and communities are able to work together and effectively support students and communities. With schools implementing the core components and developing along the four core capacities, the Theory of Change argues that these efforts will lead to positive academic, behavioral, and socioemotional outcomes for both students and their families.

Implementation Findings
In 2017, RAND published the results of an analysis of the first two full years of implementation of the NYC-CS model (Johnston et al., 2017). Our analysis focused on the implementation of the core program components and schools' development of the core capacities for the same set of community schools that we examine in this impact study. In this section, we provide a summary of the key findings.

Implementation of the Core Components
Our analysis of the implementation of collaborative leadership in community schools found that the majority of schools in the NYC-CS were implementing most of the key program components. For example, all schools had established partnerships with lead CBOs and hired CSDs by the 2016–2017 school year, with most school leaders (approximately 90 percent) indicating that the programming being provided by CBOs was aligned with their vision for schools' needs. We also found that 80 percent of schools reported having data-driven meetings to address attendance trends in the 2015–2016 and 2016–2017 school years, up from 59 percent in 2014–2015. Lastly, we found prevalent use of strategic data check-in practices by the 2015–2016 school year, with 74 percent of community schools having conducted three or more strategic data check-ins and 92 percent indicating that they were using NV real-time data analysis tools, such as the NV Data Sorter. Regarding family and community empowerment, we found that community schools were becoming fertile ground for robust family engagement and empowerment.
Surveyed principals and CSDs said they felt that the transformation into a community school increased participation among family members, and 81 percent of respondents indicated that families were more present in the school as a result of the NYC-CS. In our review of administrative data, three main categories of family engagement activities and opportunities emerged: (1) leadership opportunities, collective decisionmaking, and relationship building; (2) social and educational services that meet the needs of the whole family; and (3) opportunities to share and collect data with families. Regarding expanded learning time, we found that more than 90 percent of community schools were offering expanded learning time programming after school by the 2015–2016 school year, an increase from 59 percent the prior year. This programming included various activities, such as visual and performing arts, academic enrichment, and standardized test preparation. Finally, regarding wellness and integrated student supports, we found that the three-tiered model of mental health services was increasingly common but not universal at community schools by the 2016–2017 school year. Specifically, we found that more than 80 percent of community schools implemented a three-tiered mental health service model in the 2016–2017 school year, up from approximately 50 percent in 2014–2015. Our analysis suggested a great deal of variation in the types of mental health services that schools were administering, with many schools still planning to implement staff professional development, student skill-building, family services, crisis intervention, and mental health screening and assessments.

Development of the Core Capacities
In our implementation report (Johnston et al., 2017), we describe the creation of the core capacity scores, which we then compare with principals' self-assessment of their levels of development along an established "stages of development" rubric used by the OCS as a planning tool for program managers.2 We found considerable variation in schools' scores along the core capacities, as shown in Figure B.1 in Appendix B. To help with interpretation of the scores, we centered all of the index scores at the mean, so that positive index scores should be interpreted as being above the mean and negative scores as below the mean for that index. As we show in Figure B.1, we found that the minimum values were much further from the mean than the maximum values. In other words, although we do not see many exceptionally high-scoring schools across the four indexes, there are a handful of schools that appear to be substantially lower in capacity than the average community school. When comparing capacity scores with school leaders' assessments of their schools' status on the stages of development rubric, we found that principals felt their schools were more developed in their initiatives related to coordination and connectedness than in continuous improvement and collaboration.
2 The stages of development rubric identifies four sequential stages that schools pass through as they develop along the capacities. The stages are: (1) exploring, a planning stage before implementation in which schools express optimism and curiosity about the work; (2) emerging, a stage in which schools deepen collaboration among all stakeholders and define community partnerships to facilitate program implementation; (3) maturing, a stage in which schools make steady, intentional progress toward the community school vision as implementation begins and service usage increases; and (4) excelling, a stage in which schools are implementing quality programs that are guided by the collective governance of many community stakeholders.
Specifically, 81 percent and 70 percent of school leaders, respectively, said they felt they were "maturing" or "excelling" in their coordination and connectedness efforts. Furthermore, more than half of the school leaders rated themselves in these two more-advanced stages on the continuous improvement and collaboration capacity domains. However, across all four core capacities, the largest share of school leaders indicated that they were in the "maturing" stage, suggesting that schools are progressing toward implementing the full community schools model.

The Present Study
The primary goal of this evaluation was to assess programmatic impact on various outcomes through the 2017–2018 school year. In this study, we assessed impact along seven outcome domains (attendance, educational attainment, academic performance, disciplinary incidents, teachers' shared responsibility for student success, student connectedness to adults and peers, and family empowerment opportunities) and explored the extent to which there was heterogeneity in programmatic impact based on student- and school-level characteristics. We leveraged innovative quasi-experimental methodology to determine whether students in the community schools were performing better than they would have been had their schools not been designated as community schools. The specific research questions that we sought to answer are as follows:
• What is the impact of the NYC-CS on outcomes related to attendance, educational attainment, academic achievement, student behavior, and school climate and culture?
• To what extent are the overall impacts of NYC-CS being observed among key subgroups of students within schools?
• To what extent are there differences in program impact related to such school characteristics as programmatic implementation, grade configuration, principal experience, and the residential dispersion of students?
In Chapter Three, we describe the data sources, key measures, and methodology for our analysis. This is followed by a presentation of our findings and then a discussion of implications for policy and practice.

CHAPTER THREE
Data and Methods
In this chapter, we describe the data and methodology used to answer the research questions listed in Chapter Two. Our general approach was to leverage a quasi-experimental design that compares the outcomes of students in community schools with the outcomes of their peers who attend a carefully selected group of similar comparison schools. We examined the effect on seven outcome domains. Four of these were at the student level (attendance, educational attainment, academic performance, and disciplinary incidents), and the other three were at the school level (teachers' shared responsibility for student success, student connectedness to adults and peers, and family empowerment opportunities).
Our comparison schools were chosen to match the community schools on preinitiative outcomes, as well as on other characteristics that are known to be related to changes in these outcomes. We first describe the various data sets and measures we used for matching and impact estimation and then present the details of our matching and impact estimation methods.

Data Sources
The NYCDOE provided virtually all of the data used in our analysis. The OCS provided programmatic data that allowed us to identify which schools were among those in the NYC-CS in each year of the study period, as well as additional details about resources and programming. The Research and Policy Support Group provided deidentified student-level data under a data-use agreement and school-level data on students and staff available to the general public on school websites. Table 3.1 lists these data sources and briefly describes their content. Data are for all NYCDOE schools, staff, and students for 2009–2010 through 2017–2018 unless otherwise noted.

Table 3.1
Summary of Data Sources
Data | Content Summary | Reporting Unit (Level of Aggregation) | Notes
School survey—student module | Students' connectedness to adults and peers | School | Only 2015–2018
School survey—teacher module | Shared responsibility for student success | School | Only 2015–2018
School survey—parent module | Opportunities for family empowerment | School | Only 2015–2018
Community school program designation | Community school status by year | School |
Student characteristics | Demographics, economic disadvantage, English language learner, special education, year of birth, grade | Student |
New York test scores | ELA, math, science and social studies (grades 3–8) | Student |
Graduation | Status four years after entering grade 9 | Student |
Attendance | Days present, days absent, chronic absenteeism | Student |
Credits earned | Course credits earned | Student |
Discipline | Disciplinary incidents, coded by type and severity | Incident |
Enrollment | Entrance and exit date and reason | Student by school |
Choice | Data on whether the student listed the school as first choice | Student |
Transfer | Fraction of students transferring schools midyear or during nontransition years | School |
AIDP application review | AIDP grant application score | School | One-time application from 2014
OSH program information | Program implementation | School | Community schools only
RAND-developed school leader survey | Program implementation | School (principal or CSD) | Community schools only

Measures
Outcome Measures
This section provides more detail on the primary outcomes that we analyzed. These outcomes are listed in the blue box in Figure 2.1 in Chapter Two.
Student Outcomes
In our primary analysis, we used a total of five student outcomes from NYCDOE administrative data, four of which were available for elementary and middle school students and four of which were available for high school students. As we describe below, we used student-level data to create averages for these outcomes, either for all students within each school or for student subgroups within each school.
For elementary and middle schools, we used the following: a measure of the proportion of students in the school who are chronically absent (i.e., who miss more than 10 percent of the school days), the proportion of students who progress on time to the next grade, the average end-of-year test scores for students in the school, and the number of disciplinary incidents per student in the school.1 Our on-time progression measure excludes students who are recorded as transferring out of the NYCDOE system. The test scores are available every year for students in grades 3 through 8 and are standardized for every grade and year; that is, the average test score is rescaled to zero and the standard deviation (at the student level) to 1 for all grades and years. For example, scores have an average of zero and a standard deviation of 1 for students in grade 5 in 2014–2015 and for students in grade 6 in 2017–2018.2 Rescaling test scores in this fashion is common in education research, and it provides a familiar scale for impact estimates: roughly one-third of students score between the mean and one standard deviation above it. If a program has an impact of 1 on standardized test scores, that suggests that a student who would have had a median test score in the absence of the program will end up with a test score at roughly the 84th percentile. For high schools, we used many of the outcomes described above but did not include the average test score, since high school students no longer take a year-end test.3 We also used the cohort four-year graduation rate rather than on-time progression to the next grade.4 As with the younger students, we excluded students who were recorded as transferring out of the system from the on-time progression calculation. We also included a measure of the average number of credits accumulated by each student in each year, which can be viewed as a measure of both whether students are progressing through high school on time and their academic achievement. In addition to these primary outcomes, we analyzed additional secondary outcomes in each of these domains. For example, we examined the percentage of enrolled days absent in addition to the primary outcome of chronic absenteeism. We also examined suspension rates in addition to the primary outcome of disciplinary incidents. Impacts on these secondary outcomes were generally aligned with the estimated impacts on the primary outcomes that we report here. The estimates of the impact on secondary outcomes are available upon request.

School Climate and Culture
As described above, NYC-CS uses a capacity-building approach to support community schools' positive development along key capacity domains—continuous improvement, coordination, connectedness, and collaboration—that are geared toward improving school culture and climate as well as student outcomes (Johnston et al., 2017).
1 We had initial interest in examining additional measures that reflect student mental health. However, because of data constraints, we focused on disciplinary incidents as the main mental health correlate in this study.
2 Although the standardization of test scores is done at the student level, the summary statistics shown in Table 4.1 are at the school level. Therefore, the mean school-level average will not be equal to zero and the standard deviation will not be equal to 1. Also, we should point out that the test underwent a change in 2017–2018, but the strategy of standardizing at the student level remained consistent.
3 High school students take the New York State Regents Examination. However, because this exam is only available for a relatively small portion of high school students each year and because of complications arising from multiple types of Regents exams and the nonrandom timing of when students take the exams, we have chosen not to use Regents exam records as an outcome.
4 We are aware that many of the students in the graduating classes were exposed to NYC-CS during the last year or two of their academic careers. However, our analysis of the data indicates that most of the important variation in graduation rates involves differences in outcomes for seniors rather than for students in grades 9 through 11. Therefore, we contend that graduation rate is a valid measure to assess the initiative's impact at the high school level. Also, we use credit accumulation as an additional high school outcome, which allows us to estimate the effect of the initiative on students in grades 9 through 11.
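To make the grade-by-year standardization described above concrete, the following is a minimal sketch in Python; the data frame and column names are hypothetical placeholders rather than the study's actual data or code.

```python
import pandas as pd

# Hypothetical student-level test records; values and column names are illustrative.
scores = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "grade":      [5, 5, 5, 6, 6, 6],
    "year":       [2015, 2015, 2015, 2018, 2018, 2018],
    "score":      [640.0, 655.0, 670.0, 612.0, 630.0, 648.0],
})

# Standardize within each grade-by-year cell so that every cell has a student-level
# mean of 0 and standard deviation of 1, mirroring the rescaling described above.
def zscore(s: pd.Series) -> pd.Series:
    return (s - s.mean()) / s.std(ddof=0)

scores["z_score"] = scores.groupby(["grade", "year"])["score"].transform(zscore)

# School-by-year averages of z_score (using a school identifier from enrollment
# records, omitted here) would then serve as the school-level outcome.
print(scores)
```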
Although there is an emerging body of literature on the impact of the community school strategy on student outcomes (Maier et al., 2017), there is limited evidence showing how the program is related to school climate and culture (for a notable exception, see Daniel et al., 2019). Therefore, in this report, we consider the impact of NYC-CS on a series of measures of school culture and climate that are hypothesized to support school improvement and student success (Bryk et al., 2010).5 Specifically, we describe the impact of the NYC-CS strategy on (1) teachers' perception of shared responsibility for student success, (2) students' sense of connectedness to adults and classmates, and (3) families' reports of opportunities for engagement and empowerment. These indicators are derived from the New York City School Survey (NYCSS) and aligned with New York City's Framework for Great Schools (NYCDOE, undated-a). The indicators are measured at the school level based on teacher, student, and parent responses to the NYCSS. Following the strategy used by NYCDOE to estimate School Quality Reports, we created a composite indicator through a two-step process (NYCDOE, undated-b). First, we calculated the percentage of positive responses for each item (e.g., the percentage of respondents indicating they "Agree" or "Strongly Agree" with a particular item). Second, we calculated the average percentage of positive responses across all the items in that particular domain. See Table 3.2 for a summary of the NYCSS survey items as they were worded in the 2016 survey. Before 2015, there was too little overlap between the survey items to enable the creation of comparable measures, so we do not have measures of these outcomes prior to 2015. Furthermore, although the survey went through a substantial revision following the 2015 version, we were able to crosswalk items from that year, thus enabling us to include measures for four academic years (2014–2015 through 2017–2018).
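As an illustration of this two-step composite, here is a minimal sketch; the item labels and responses are made up, and the set of affirmative options would follow the underlined responses shown in Table 3.2.

```python
import pandas as pd

# Hypothetical item-level responses for one school and one survey domain.
responses = pd.DataFrame({
    "item":     ["a", "a", "a", "b", "b", "b", "c", "c", "c"],
    "response": ["Agree", "Strongly Agree", "Disagree",
                 "Agree", "Agree", "Strongly Disagree",
                 "Strongly Agree", "Disagree", "Agree"],
})

POSITIVE = {"Agree", "Strongly Agree"}  # affirmative options for these items

# Step 1: percentage of positive responses for each item.
pct_positive = (
    responses.assign(positive=responses["response"].isin(POSITIVE))
             .groupby("item")["positive"]
             .mean()
)

# Step 2: the domain score is the average of the item-level percentages.
domain_score = pct_positive.mean()
print(pct_positive)
print(f"Domain composite: {domain_score:.2f}")
```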
Demand Measures
To measure whether the NYC-CS affected school demand, we used three demand measures; we also estimated whether the demographic makeup of the schools was affected. The first measure of demand is derived from NYCDOE's school choice data and is referred to as "fraction who listed as first choice." This measure is the number of students who listed a school as their first-choice school divided by the overall number of students who were admitted that year. A ratio greater than one means that more students listed the school as their first choice than were admitted, while a ratio less than one means that the school was not the top choice of some of the students who were admitted. The second measure we used was the fraction of students who left the school early. For elementary schools, this is the fraction of students in grades 1 through 4 who were observed in a different NYC public school in the following year; for middle schools, it is the fraction of students in grades 6 and 7 who switched to a different NYC public school in the next year; and for high schools, it is the fraction of students in grades 9 through 11 who did so. The final measure is the fraction of students who transferred into the school in nontraditional grades. For elementary schools, this measures the fraction of students in grades 2 through 5 who attended a different NYC public school in the previous year. For middle schools, we focus on students in grades 7 and 8, and for high schools we focus on grades 10 through 12.6
5 Although these climate measures may be considered by some to be potential mechanisms that in turn influence student outcomes, we did not formally test for an indirect or mediated effect because of data constraints related to the length of the study period. Therefore, we treated these measures as distal outcomes, just as we analyzed such student outcomes as attendance and academic achievement.
6 Given the relatively high rates of turnover between the elementary and middle school grades, we ignored student transitions between grades 5 and 6 (even in schools teaching kindergarten through grade 8) when constructing this measure.

Table 3.2
New York City School Survey Items Used to Calculate Outcome Measures
Shared responsibility for student success (NYCSS teacher module)
How many teachers at this school . . . [None, Some, A Lot, All]
a. Help maintain discipline in the entire school, not just their classroom?
b. Are really trying to improve their teaching?
c. Take responsibility for improving the school?
d. Feel responsible for helping students develop self-management?
e. Are willing to take risks to make the school better?
f. Are eager to try new ideas?
g. Feel responsible that all students learn?
Student connectedness to adults and classmates (NYCSS student module)
How much do you agree with the following statements? [Strongly Disagree, Disagree, Agree, Strongly Agree]
a. I'm learning a lot in my classes at this school to prepare me for the next grade level.
b. There is at least one adult in the school that I can confide in.
c. My teachers will always listen to students' ideas.
d. My teachers always keep their promises.
e. My teachers treat me with respect.
f. When my teachers tell me not to do something, I know they have a good reason.
g. My classes at this school really make me think.
h. Discipline is applied fairly in my school.
i. School safety agents promote a safe and respectful environment at this school.
Opportunities for family empowerment (NYCSS parent module)
Please mark the extent to which you disagree or agree with each of the following statements about this school. [Strongly Disagree, Disagree, Agree, Strongly Agree]
a. School staff regularly communicate with parents or guardians about how parents can help students learn.
b. Parents/guardians are invited to visit classrooms to observe instruction.
c. Parents/guardians are greeted warmly when they call or visit the school.
d. Teachers and parents/guardians think of each other as partners in educating children.
e. Teachers work closely with families to meet students' needs.
f. Teachers communicate regularly with parents/guardians.
g. The principal or school leaders encourage feedback from parents/guardians and the community through regular meetings with parent and teacher leaders.
h. Teachers understand families' problems and concerns.
NOTE: Underlined response options indicate an affirmative or positive response. The average school-level response rate for the NYCSS across the four years of data was 62.4 percent for teachers, 61.3 percent for students, and 53.7 percent for parents.

Student Subgroup and School-Level Heterogeneity Measures
Student Subgroup Analysis
In addition to estimating overall program effects, we considered whether particular subgroups of students within the schools were affected by the NYC-CS differently than the rest of the students. For example, we asked whether the NYC-CS had larger impacts on students classified with a disability than would be expected based only on the average impact. We start by defining the subgroups of students that have been identified by NYCDOE and OCS as being of particular interest: students classified as being in poverty, students in temporary housing, English language learners, students with high-incidence disabilities, males, females, black students, and Hispanic students. We calculated the average outcomes for each school in each year, using only students who are in the relevant subgroup. For example, we calculated the average test scores of students who are classified as having a high-incidence disability in each school and each year.
School-Level Heterogeneity
Additionally, we explored whether schools with certain characteristics were affected by the NYC-CS differently than others. To do so, we first identified subgroups of schools that might have been affected by the NYC-CS differently than the average school. We settled on six school subgroups that were of interest to NYCDOE and OCS: large schools, small schools, schools that predominantly served students who were zoned to that school ("highly zoned schools"), schools that predominantly served students who were not zoned to that school ("lightly zoned schools"), schools run by a more recently hired principal, and schools run by a longer-tenured principal. We present the zoned analysis only for elementary and middle schools; NYC does not typically zone high schools. We used measures from spring 2015 to define all of these subgroups because we wanted to measure such aspects as principal stability in what was the first year of implementation for most of the schools. We defined large schools as those above the median size and small schools as those below the median size, doing this separately for elementary, middle, and high schools. For example, we defined a high school as a small school if it was below the median size of all high schools. Similarly, we defined highly zoned schools as those serving a higher percentage of students zoned to their school than the median school and lightly zoned schools as those serving a lower percentage of students zoned to their school. For this analysis, we also included nonzoned schools as a third category.
We defined principal status based on the number of years the principal had worked at the school as of the 2014–2015 school year, with values above the median indicating a longer-tenured principal and values below the median indicating a more recently hired principal. Finally, we also estimated whether the estimated effects differed depending on a school's codesignation as a Renewal School. We also assessed school-level heterogeneity based on schools' varying levels of program implementation, which we describe in the following section.
Implementation Measures
Core capacity scores. To understand the relationship between implementation and subsequent impact, we used a scaled numeric value for each core capacity. As described in Johnston et al. (2017), the core capacity scores were derived from a school leader survey administered by RAND in 2016 and focused on schools' capacity development through the 2015–2016 school year. The implementation scores we present are continuous measures, which are most useful when comparing schools with one another. The NYC-CS model is intended to be a developmental, capacity-building approach to school improvement, so we contend that it makes sense to measure and analyze schools' development on a spectrum rather than with a binary indicator of simply "present" versus "absent." These scores are weighted composites of multiple survey items, each one a Likert-scale-type response about the presence or absence of a particular program element or, in some cases, a judgment about the relationships and behaviors of key staff and institutions. All survey items were phrased with a positive valence such that a higher value indicates a greater degree of implementation—higher values on the resultant indexes suggest greater levels of capacity development. These capacity dimensions are defined in Chapter Two; in Appendix B, we provide a summary of the survey items that contributed to each capacity index measure, along with the principal component analysis (PCA) weight that was used to calculate the scores.
Mental health implementation profiles. The breadth and depth of mental health programming and services differ across schools and are tailored to school and student needs. In order to assess the impact of mental health programs and services, we created mental health service implementation profiles among the community schools, which represent a typology of implementation experiences rather than a qualitative evaluation of high versus low implementation levels. We endeavored to capture variations in how programs and services were provided for students, teachers, and families, while also capturing information on buy-in and awareness among school staff and other stakeholders. To determine how many different types of profiles there are across schools, we conducted a multivariate cluster analysis. A cluster analysis groups the data points in such a way that schools in the same profile, or cluster, are more similar to each other than to those in other profiles. Cluster analysis is a data analysis technique that seeks to maximize differences between clusters while minimizing differences within clusters (Peck, 2005). This approach allows us to capture the differences in implementation across schools while also identifying common characteristics among schools that have the same type of profile.
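As an illustration of this type of grouping, the following is a minimal PAM-style (k-medoids) sketch over a simulated school-by-feature matrix; the feature values, the Euclidean dissimilarity, and the number of clusters are placeholders rather than the measures used in the actual analysis, which relied on the implementation data described in Appendix C.

```python
import numpy as np

# Simulated implementation features for 20 hypothetical schools; in practice,
# mixed categorical/continuous measures would call for a suitable dissimilarity
# (e.g., Gower distance) rather than plain Euclidean distance.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise dissimilarities

def pam_like(D: np.ndarray, k: int, n_iter: int = 50):
    """Assign each school to its nearest medoid, then update each medoid to the
    member that minimizes total within-cluster dissimilarity; repeat to convergence."""
    n = D.shape[0]
    medoids = np.random.default_rng(1).choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.flatnonzero(labels == c)
            if members.size == 0:
                continue
            costs = D[np.ix_(members, members)].sum(axis=1)
            new_medoids[c] = members[np.argmin(costs)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    labels = np.argmin(D[:, medoids], axis=1)
    return labels, medoids

labels, medoids = pam_like(D, k=3)
print(labels, medoids)
```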
In other words, cluster analysis seeks to divide the schools into groups that are as dissimilar from one another as possible, while making sure that the groups themselves are as internally consistent as possible.7 See Appendix C for additional information on the data used to calculate the implementation profiles and Appendix D for a summary of the cluster analysis results.
7 We used a specific type of cluster analysis known as Partitioning Around Medoids (PAM), which is more adept than traditional k-means clustering at handling categorical and continuous measures, data with missing values, and data with potential outliers (Kaufman and Rousseeuw, 1990). In PAM, each cluster is denoted by a representative observation, or medoid, which is the most centrally located data point within the cluster.

Methods
In order to determine the average effectiveness of the NYC-CS program, we compared the outcomes of community schools in the 2015–2016, 2016–2017, and 2017–2018 school years with the outcomes that we predict would have occurred for those schools in the absence of the community school designation. These predicted outcomes are based on the predesignation characteristics of each community school and on the outcomes of a strategically chosen set of comparison schools, using a difference-in-difference style methodology. We conducted all of our analyses at the school level, using average student outcomes in each school, either for all students in the school during each year or for a select subgroup of students in each year based on socioeconomic characteristics. An alternative method would have been to conduct our analysis at the student level, using outcome information for each student and conditioning on the individual student's pre–NYC-CS outcomes through a difference-in-difference method or through lagged dependent variables in a regression. We chose a school-level analysis for two main reasons. First, we contend that the most important research question is how students in the community schools each year are doing compared with students in the same schools before those schools became a part of the NYC-CS. The school-level analysis is best suited to answer this question, whereas a student-level analysis is better able to answer questions about whether outcomes for a given set of students have improved since those same students entered the community school program. The second reason for choosing the school-level analysis is that a student-level analysis is most useful when individual students have an outcome data series that extends from preprogram years through the end of the study period. For a student-level analysis, it is also important to have students who are continually exposed to the program throughout its existence. In our case, our final year of outcome data was four years after program initiation. Students who entered grade 2 when the NYC-CS began in fall 2015 would have two years of attendance data prior to program initiation and would still be in the same school in our final study year. There is no other cohort of students that has two years of preprogram data and would be expected to be in the same school throughout the four years of program existence (or three years if we exclude the transition year of 2014–2015). Therefore, we prefer a school-level analysis to a student-level analysis. Below, we first discuss the matching approach we used to choose a set of comparison schools; then, we discuss the difference-in-difference estimator we used to generate the impact estimates.
Matching
We chose our comparison schools to be similar to the community schools on baseline outcomes in three outcome dimensions (academic achievement, attendance, and discipline), demographic makeup, and characteristics that determined treatment (i.e., whether they applied to AIDP and, if so, the score they received). Finding schools that were similar to the community schools on all of these factors was difficult for two reasons. First, community schools were selected for inclusion in the NYC-CS because of their difficulties in reaching student achievement and attendance goals, which suggests that comparison schools will be systematically higher on these measures, by design. Second, although there are more than 1,600 schools in NYCDOE, finding close matches on all of these metrics for any given school was impossible. To solve the first issue, we complemented the matching approach with a difference-in-difference estimator. To solve the second issue, we used the following strategy to define the set of comparison schools.
1. Defining the base measures to match on. We started by defining the measures that we would theoretically like to match on. These measures are:
a. Outcome measures
◦ Elementary and middle schools: information from the 2011–2012, 2012–2013, and 2013–2014 school years on the percentage of students who are chronically absent, the attendance rate, the average English score, the average math score, and the average number of disciplinary incidents per student.
◦ High schools: information from the 2011–2012, 2012–2013, and 2013–2014 school years on the percentage of students who are chronically absent, the attendance rate, the graduation rate, the average number of credits that students attain, and the average number of disciplinary incidents per student.
b. Demographic measures
◦ All schools: information on the percentage of students in the 2011–2012, 2012–2013, and 2013–2014 school years who received free or reduced-price lunch, were classified as English language learners, were diagnosed with a disability, or were male, black, or Hispanic.
c. AIDP application information
◦ All schools: information on whether the schools applied for the AIDP program and, if so, what score they received from the selection committee on their application.
2. Aggregating the base measures. Finding non-community schools that are similar to community schools on all of the above measures would be impossible. Therefore, we used a PCA to transform these measures into a smaller number of dimensions. As is standard practice, we kept the components with an eigenvalue greater than 1, which corresponded to eight components for both elementary/middle schools and high schools. (The eigenvalue is a measure of how much of the variation in the data is explained by the associated component.)
3. Matching on the principal components. We finished by matching community schools to comparison schools on the eight largest principal components estimated above. To be clear, each component is a weighted average of the 35 base measures categorized above in Step 1.

Main Analyses
Our estimates of the impact of the NYC-CS program in each year following implementation use a weighted difference-in-difference estimator:

Y_{st} = \alpha_t + \gamma_s + \sum_{k=2015}^{2018} \beta_k T_{st}^{k} + \varepsilon_{st}

where Y is any of the many outcome measures we examined, which are listed in Figure 1.1 in Chapter One, and the observations are weighted by the matching weight estimated above.
The value s indexes the school and t indexes the year, so Y_{st} is the outcome value for school s in year t. The term \alpha_t is a year fixed effect that accounts for any changes that affect both the community schools and the matched comparison schools similarly, and \gamma_s is a school fixed effect that adjusts for differences across schools in the years prior to 2015.8 By using the matched weights, the effects of the NYC-CS are estimated using only schools that were similar to the community schools in the three years prior to implementation. The main treatment variable T_{st}^{k} equals 1 if school s is a community school and the year t equals k. Thus, \beta_{2016} is the estimated effect of the NYC-CS in 2016, \beta_{2017} is the estimated effect of the NYC-CS in 2017, and \beta_{2018} is the estimated effect of the NYC-CS in 2018. Although we included a dummy for 2015, we do not present those results because that year was a transition period. This estimator is implemented using a weighted least squares regression of the outcomes on school dummies and a treatment dummy in the impact year. We estimated standard errors after clustering by school to account for the correlation over time in each of the schools' outcomes and for some comparison schools being matched with more than one community school. Two additional points merit discussion. First, we also ran an analysis that estimated a pooled effect of the program across post-program years instead of separately estimating effects for each year. To do so, we ran a regression similar to the one above, but now included only T_{st}^{2015} (i.e., a measure of whether the school is a community school and the year is 2015) and T_{st}^{2016+} (i.e., a measure of whether the school is a community school and the year is 2016, 2017, or 2018). The coefficient \beta_{2016+} is then a measure of the average impact of the NYC-CS over the three years. Second, we had survey data only from 2015 onward. Therefore, when using survey measures, we omitted T_{st}^{2015} from the regression above, which means that 2015 served as the baseline year for these analyses.

Analysis of Impact on Subgroups of Students and Schools
Analyses that probe heterogeneity in program impacts by the student characteristics described above were conducted using school-level average outcomes calculated from the outcomes of students with a given characteristic. As an example, consider the subgroup analyses that estimate the NYC-CS effects on English language learners. For this analysis, we started by calculating the average outcome in each school and year using outcomes of all the English language learner students in a specific school in a given school year. Then, we essentially repeated the main impact analysis as described above to estimate the effect of the NYC-CS for the subgroup of English language learner students.9 We note, however, that we could only do this for outcomes recorded in administrative data (for which we had student-level measures) and not for the survey outcomes (for which we only had school-level averages).
8 We do not include other covariates in the equation. The impact of time-invariant school-level covariates is captured by the school fixed effects. Time-varying covariates, such as changing school demographics, could be caused by participation in NYC-CS, and so would be endogenous. Therefore, our estimated impact of NYC-CS includes any indirect effect of the initiative on the outcome measure through its impact on these omitted time-varying covariates.
9 Technically, focusing on subgroups of students within each school makes the matching weights no longer valid. For example, ensuring that on average students in School X had similar test scores to students in School Y in 2014 does not guarantee that English language learner students in School X had similar test scores as English language learner students in School Y. Regardless, we continue to use the matching weights in this analysis.
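Before turning to the specific subgroups, the main weighted difference-in-difference estimator described above can be sketched as follows; the panel layout, file name, and column names are hypothetical placeholders rather than the study's actual data or code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical school-by-year panel with columns: outcome, school_id, year,
# community (1 if a community school), and match_weight (matching weight).
df = pd.read_csv("school_year_panel.csv")  # placeholder file name

# Year-specific treatment indicators T_st^k for k = 2015, ..., 2018.
for k in [2015, 2016, 2017, 2018]:
    df[f"T_{k}"] = ((df["community"] == 1) & (df["year"] == k)).astype(int)

# Weighted least squares with school and year fixed effects, weighted by the
# matching weights; standard errors are clustered by school.
model = smf.wls(
    "outcome ~ C(year) + C(school_id) + T_2015 + T_2016 + T_2017 + T_2018",
    data=df,
    weights=df["match_weight"],
)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["school_id"]})
print(result.params.filter(like="T_"))
```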
For the student subgroup analysis, we focus on historically underperforming groups: students in poverty, English language learners, students with disabilities, black students, and Hispanic students. We also provide separate estimates for male and female students. We also estimated whether the effect of NYC-CS is moderated by school-level characteristics. To do so, we defined mutually exclusive groups of schools based on their characteristics measured in the 2015 school year. For example, we grouped schools based on how many years their principals had worked at the schools in the 2015 school year, defining a school as having a longer-tenured principal if the principal in the 2015 school year had more than the median experience and as having a recently hired principal if the principal in that school year had less than the median experience. We then defined separate treatment variables depending on what subgroup a given community school belonged to. Consider the subgroup analyses that probed whether the average effects across the 2016 through 2018 school years differ by whether the school had a longer-tenured or recently hired principal. For this analysis, we defined T^{exp}_{st,2016+} as equal to 1 if school s is a community school with a longer-tenured principal in the 2015 school year and year t is the 2016 school year or later. Similarly, T^{inexp}_{st,2016+} equals 1 if school s is a community school with a recently hired principal in the 2015 school year and year t is the 2016 school year or later. We estimated the treatment effects for the two subgroups (longer-tenured versus recently hired principal) by running a modified version of the difference-in-difference specification used for the main analyses such that the model now has multiple treatment indicators, one for each subgroup, as follows:10

Y_{st} = \alpha_t + \gamma_s + \beta^{inexp}_{2015} T^{inexp}_{st,2015} + \beta^{exp}_{2015} T^{exp}_{st,2015} + \beta^{inexp}_{2016+} T^{inexp}_{st,2016+} + \beta^{exp}_{2016+} T^{exp}_{st,2016+} + \varepsilon_{st}

The coefficient \beta^{inexp}_{2016+} is thus the estimated effect of NYC-CS on schools with a recently hired principal in the 2015 school year, and \beta^{exp}_{2016+} is the estimated effect of NYC-CS on schools with a longer-tenured principal.
10 An alternative approach allows the yearly time effects to vary by school grouping (i.e., allows schools with a longer-tenured principal to have different time trends than schools with a recently hired principal), which provides coefficient estimates equivalent to running separate regressions for each group of schools. We opted against this for two reasons. First, this would mean that the time trends for one group of community schools are estimated using some non-community schools that they were not matched to. Second, we would not be able to use this approach to estimate differential effects for Renewal and non-Renewal Schools, because all Renewal Schools are community schools.
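A small, self-contained sketch of how these subgroup-specific treatment indicators could be built from a school-by-year panel follows; the principal tenure values and column names are invented for illustration only.

```python
import pandas as pd

# Toy school-by-year panel; tenure_2015 is the principal's years at the school
# as of the 2015 school year (hypothetical values).
df = pd.DataFrame({
    "school_id":   [1, 1, 2, 2, 3, 3],
    "year":        [2015, 2017, 2015, 2017, 2015, 2017],
    "community":   [1, 1, 1, 1, 0, 0],
    "tenure_2015": [8, 8, 2, 2, 5, 5],
})

median_tenure = df.loc[df["year"] == 2015, "tenure_2015"].median()
df["longer_tenured"] = (df["tenure_2015"] > median_tenure).astype(int)

post = df["year"] >= 2016
df["T_exp_2016plus"] = ((df["community"] == 1) & post & (df["longer_tenured"] == 1)).astype(int)
df["T_inexp_2016plus"] = ((df["community"] == 1) & post & (df["longer_tenured"] == 0)).astype(int)

# Together with the analogous 2015 transition-year dummies, these indicators
# replace the single treatment dummies in the weighted regression sketched earlier.
print(df)
```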
In addition to estimating the effect for each subgroup, we also tested the hypothesis that the effects for each subgroup are identical, i.e., whether

\beta^{exp}_{2016+} = \beta^{inexp}_{2016+}

In addition, we also estimated subgroup effects in each post-program year by including in the model treatment indicators for each subgroup and post-program year. Finally, we estimated whether the way in which schools implemented NYC-CS was correlated with the estimated impacts of the initiative. To do so, we conducted the same analysis as described above, but defined the treatment school groupings based on our implementation measures.

Analysis of Grade-by-Year Impact
The methods described thus far provide estimates of the impact of community schools in each of the three years following implementation: the 2016, 2017, and 2018 school years. Changes in the impact over time can occur both because the program matures and because some students have more exposure to the program. The analysis described in this section decomposes the year-by-year changes in impact into a maturity component and an exposure component. We did this by taking advantage of the different number of years of exposure experienced by students in each grade in each year. This analysis begins by reformulating our main impact estimator to be at the school-by-grade-by-year level rather than the school-by-year level:

Y_{sgt} = \alpha_{gt} + \gamma_{gs} + \sum_{g=1}^{H} \sum_{k=2015}^{2018} \beta_{gk} T_{sgt}^{k} + \varepsilon_{sgt}

where g indexes the grades offered by a school. The equation now includes fixed effects for each grade in each year (\alpha_{gt}) and each grade in each school (\gamma_{gs}). We used g equal to 1 to indicate the lowest grade offered by the school and g equal to H to indicate the highest grade offered by the school. We conducted this analysis for middle schools that run from grades 6 through 8 and high schools that run from grades 9 through 12. This analysis also did not estimate the impact of exposure for elementary school students because we do not have test scores in the entry grade (kindergarten) or the immediately following grades (grades 1 and 2). This means that all the elementary school students for whom we have data were exposed to the program for the same period, limiting the independent variation needed to identify exposure effects. We then used the grade-by-year impact estimates to examine whether changes over time are better explained by program maturity or by level of exposure:

\beta_{gk} = \alpha + \theta M_k + \delta E_{gk} + \lambda G_g + \upsilon_{gk}

where M_k is the maturity of the program, E_{gk} is the exposure to the program of the students in grade g at time k, and G_g is the grade level. Maturity is measured as the number of years since program initiation, with the 2016 school year equal to 1 and the 2018 school year equal to 3. Exposure is measured as the number of years a grade has been exposed to the program at the end of each school year. Exposure is equal to 1 for all grades in the 2016 school year, which is the first year of the program. It is equal to 2 for all grades in the 2017 school year except the lowest grade in a given school, whose students just entered the school and for whom it takes a value of 1. Likewise, it is equal to 3 for all grades except the lowest two in the 2018 school year; that is, in the 2018 school year, it is equal to 1 for the lowest grade and 2 for the second-lowest grade, reflecting their exposure. The equation also includes a variable, G_g, which measures the grade level relative to the lowest grade in the school.
In our middle school analysis, for example, G_g is 1 for grade 6, 2 for grade 7, and 3 for grade 8. G_g is included to ensure that our exposure variable is truly capturing exposure rather than differential effects by grade. This regression is run separately for each middle school and high school outcome and is weighted by the inverse of the standard error of \beta_{gk}. For comparison purposes, we also estimated a restricted regression that omits the exposure and grade-level variables, thereby attributing all change over time to the maturity variable. There are some limitations to this analysis. For example, we know that many students change schools between school years that are not terminal grades for their school, as well as during the school year. We also know that some students enter grade 1 at a school after having been at a community school in the previous year. This analysis implicitly assumes these types of transitions do not exist. For parsimony, we also implicitly assume that the effects of maturity, exposure, and grade level are linear.

CHAPTER FOUR
Results

Matching Results
Before presenting the estimated average impact of the program, we illustrate the results of the matching analysis. We present the characteristics of all non-community schools and all community schools, as well as the subset of non-community schools that were used in the analysis as comparison schools. This subset was chosen via the matching approach described in Chapter Three. Table 4.1 shows these results for elementary and middle schools, and Table 4.2 shows the results for high schools. In both tables, the first column shows the average for all non-community schools, the second column shows the average for community schools, and the third column shows the average for the matched comparison group. Finally, the fourth column reports the difference between the community school and matched comparison averages and indicates whether this difference is statistically significant. The demographic characteristics and outcome measures reported here are all from the 2013–2014 school year, which is the year before the announcement of the NYC-CS. Thus, any differences should be considered to measure baseline differences and cannot be due to the effect of NYC-CS. The first and second columns in Tables 4.1 and 4.2 clearly show that the community schools are quite different than non-community schools. The elementary and middle community schools serve a population that is more likely to be economically disadvantaged, disabled, Hispanic, black, or chronically absent, or to have a disciplinary incident. The population is also less likely to be Asian or white, and it has lower on-time progression rates and baseline test scores. The differences between high school community schools and non-community schools are similar but less dramatic. Our matching strategy, however, eliminates most of these differences. This is seen by comparing the second and third columns or looking directly at the fourth column. Of note, the difference in test scores and on-time progression between treatment and matched comparison schools remains statistically significant after matching. This reflects the Renewal School selection process for that part of NYC-CS, which used low test scores and graduation rates as criteria for the Renewal School designation and the resulting designation as a community school. Therefore, it is not possible to find schools with similarly low test scores or high dropout rates for the matched comparison group.
CHAPTER FOUR

Results

Matching Results

Before presenting the estimated average impact of the program, we illustrate the results of the matching analysis. We present the characteristics of all non-community schools and all community schools, as well as the subset of non-community schools that were used in the analysis as comparison schools. This subset was chosen via the matching approach described in Chapter Three. Table 4.1 shows these results for elementary and middle schools, and Table 4.2 shows the results for high schools. In both tables, the first column shows the average for all non-community schools, the second column shows the average for community schools, and the third column shows the average for the matched comparison group. Finally, the fourth column reports the difference between the community school and matched comparison averages and indicates whether this difference is statistically significant. The demographic characteristics and outcome measures reported here are all from the 2013–2014 school year, which is the year before the announcement of the NYC-CS. Thus, any differences should be considered to measure baseline differences and cannot be due to the effect of NYC-CS.

The first and second columns in Tables 4.1 and 4.2 clearly show that the community schools are quite different from non-community schools. The elementary and middle community schools serve a population that is more likely to be economically disadvantaged, disabled, Hispanic, black, chronically absent, or to have a disciplinary incident. The population is also less likely to be Asian or white, and it has lower on-time progression rates and baseline test scores. The differences between high school community schools and non-community schools are similar but less dramatic. Our matching strategy, however, eliminates most of these differences. This is seen by comparing the second and third columns or looking directly at the fourth column. Of note, the difference in test scores and on-time progression between treatment and matched comparison schools remains statistically significant after matching. This reflects the Renewal School selection process for that part of NYC-CS, which used low test scores and graduation rates as criteria for the Renewal School designation and the resulting designation as a community school. Therefore, it is not possible to find schools with similarly low test scores or high dropout rates for the matched comparison group. However, in the difference-in-difference specification we employed, the critical assumption is not that treatment and matched comparison schools have similar preprogram characteristics but that they have the same preprogram trends.

Table 4.1
Elementary and Middle School Summary Statistics

Variable | (1) All Non-Community Schools | (2) Community Schools | (3) Matched Comparison Group | (4) Difference Between Columns 2 and 3
Proportion English language learners | 0.140 (0.130) | 0.173 (0.111) | 0.168 (0.104) | 0.00470 (0.0171)
Proportion in poverty | 0.752 (0.216) | 0.897 (0.073) | 0.903 (0.075) | –0.00630 (0.0119)
Proportion with disability | 0.188 (0.065) | 0.246 (0.046) | 0.243 (0.048) | 0.00234 (0.00712)
Percentage in temporary housing | 0.093 (0.082) | 0.164 (0.082) | 0.165 (0.070) | –0.00100 (0.0118)
Proportion who are Hispanic | 0.408 (0.266) | 0.518 (0.243) | 0.525 (0.226) | –0.00645 (0.0368)
Proportion who are black | 0.293 (0.287) | 0.425 (0.240) | 0.418 (0.221) | 0.00651 (0.0360)
Proportion who are Asian | 0.134 (0.187) | 0.025 (0.047) | 0.024 (0.042) | 0.000725 (0.00661)
Proportion who are white | 0.149 (0.213) | 0.023 (0.044) | 0.021 (0.036) | 0.00215 (0.00612)
Proportion who are chronically absent | 0.243 (0.124) | 0.402 (0.095) | 0.391 (0.082) | 0.0103 (0.0139)
Average attendance rate | 0.926 (0.027) | 0.893 (0.022) | 0.897 (0.021) | –0.00421 (0.00358)
Average math score | –0.040 (0.514) | –0.707 (0.191) | –0.591 (0.189) | –0.116*** (0.0277)
Average ELA score | –0.014 (0.470) | –0.622 (0.171) | –0.512 (0.174) | –0.111*** (0.0261)
On-time progression | 0.953 (0.036) | 0.939 (0.035) | 0.940 (0.038) | –0.000945 (0.00592)
Proportion of students with disciplinary incident | 0.125 (0.375) | 0.290 (0.327) | 0.201 (0.370) | 0.0890 (0.0549)
Number of schools | 1,060 | 72 | 269 | 341

NOTE: This table reports average outcomes and school demographics in 2014 for all New York City elementary and middle schools, as well as the subset of community and non-community schools that are included in the analysis. When calculating the averages for the non-community schools, we used the same matching weights as used in the analysis. The number of schools on the bottom row does not include the matching weights and reflects the number of unique schools that receive a non-zero matching weight. Finally, Column 4 shows the difference between Column 2 and Column 3, as well as the estimated standard error and the statistical significance of the difference. * p < 0.10; ** p < 0.05; *** p < 0.01.
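The weighted group averages and differences reported in Tables 4.1 and 4.2 can be reproduced with a few lines of code. The sketch below is illustrative only, assuming a hypothetical 2014 school-level file and column names (community, match_weight, and one column per characteristic); it omits the standard errors and significance tests that the report attaches to the difference column.

```python
# Sketch only (not the authors' code): baseline balance comparison in the
# spirit of Tables 4.1 and 4.2.
import numpy as np
import pandas as pd

schools = pd.read_csv("schools_2014.csv")  # hypothetical input file
covariates = ["pct_ell", "pct_poverty", "pct_disability", "avg_math", "avg_ela"]

community = schools[schools["community"] == 1]
comparison = schools[(schools["community"] == 0) & (schools["match_weight"] > 0)]

rows = []
for var in covariates:
    cs_mean = community[var].mean()
    # The matched-comparison mean uses the matching weights, as in the report.
    mc_mean = np.average(comparison[var], weights=comparison["match_weight"])
    rows.append({"variable": var,
                 "community": cs_mean,
                 "matched_comparison": mc_mean,
                 "difference": cs_mean - mc_mean})

balance = pd.DataFrame(rows)
print(balance.round(3))
```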
Table 4.2
High School Summary Statistics

Variable | (1) All Non-Community Schools | (2) Community Schools | (3) Matched Comparison Group | (4) Difference Between (2) and (3)
Proportion of English language learners | 0.144 (0.226) | 0.205 (0.183) | 0.154 (0.182) | 0.0512 (0.0368)
Proportion in poverty | 0.776 (0.141) | 0.832 (0.091) | 0.839 (0.083) | –0.00678 (0.0183)
Proportion with disability | 0.158 (0.078) | 0.207 (0.066) | 0.204 (0.068) | 0.00274 (0.0142)
Proportion in temporary housing | 0.069 (0.056) | 0.089 (0.041) | 0.082 (0.050) | 0.00736 (0.00922)
Proportion who are Hispanic | 0.431 (0.231) | 0.560 (0.238) | 0.544 (0.207) | 0.0162 (0.0437)
Proportion who are black | 0.362 (0.250) | 0.357 (0.228) | 0.366 (0.211) | –0.00884 (0.0423)
Proportion who are Asian | 0.109 (0.153) | 0.047 (0.078) | 0.054 (0.077) | –0.00661 (0.0179)
Proportion who are white | 0.085 (0.127) | 0.024 (0.039) | 0.027 (0.036) | –0.00250 (0.00779)
Proportion who are chronically absent | 0.411 (0.240) | 0.553 (0.144) | 0.527 (0.167) | 0.0261 (0.0296)
Average attendance rate | 0.845 (0.115) | 0.784 (0.095) | 0.808 (0.077) | –0.0247 (0.0168)
Proportion who graduated | 0.787 (0.190) | 0.673 (0.137) | 0.720 (0.141) | –0.0473* (0.0266)
Credits per year | 11.599 (1.976) | 10.462 (1.279) | 11.218 (1.360) | –0.756*** (0.273)
Proportion of students with disciplinary incident | 0.337 (0.437) | 0.291 (0.218) | 0.313 (0.264) | –0.0225 (0.0452)
Number of schools | 339 | 41 | 130 | 171

NOTE: This table reports average outcomes and school demographics in 2014 for all New York City high schools, as well as the subset of community and non-community schools that are included in the analysis. When calculating the averages for the non-community schools, we used the same matching weights as used in the analysis. The number of schools on the bottom row does not include the matching weights and reflects the number of unique schools that receive a non-zero matching weight. Finally, Column 4 shows the difference between Column 2 and Column 3, as well as the estimated standard error and the statistical significance of the difference. * p < 0.10; ** p < 0.05; *** p < 0.01.

Figures 4.1 and 4.2 suggest that this is the case. Like Tables 4.1 and 4.2, these figures show the average of four outcomes for all non-community schools, community schools, and the matched comparison group, but they show how these averages change over time. These figures illustrate that the community schools were substantially different from the non-community schools, with worse average outcomes on nearly every measure. Figures 4.1 and 4.2 also suggest that the differences between the community schools and non-community schools were generally stable before the policy started, a finding that supports the difference-in-difference specification. This point can be seen more clearly in Figures 4.3 and 4.4 and is discussed in more detail in the following section. These two figures explicitly show the difference between community and non-community schools. Lines that are fairly horizontal and near zero prior to the start of community schools in 2014–2015 indicate that the preinitiative trends were similar for the two groups. The relative stability of the differences between the community schools and non-community schools in the period from the 2010 school year to the 2014 school year, along with the improvement of the community schools relative to the non-community schools in the period from the 2016 school year to the 2018 school year, when NYC-CS was implemented, provides suggestive evidence that NYC-CS had a positive effect on these outcomes.
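The school-level impact estimates discussed below come from the weighted difference-in-difference specification described in Chapter Three. As a rough illustration only, and not the authors' actual code, the sketch below shows how year-specific treatment effects with school-clustered standard errors could be estimated; the file name and columns (school, year, outcome, community, match_weight) are hypothetical placeholders, and handling of the 2014–2015 transition year is omitted.

```python
# Sketch only (not the authors' code): weighted difference-in-difference with
# year-specific treatment effects and school-clustered standard errors.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("school_year_panel.csv")  # hypothetical input file

# One treatment indicator per post-program year (2016-2018).
for k in (2016, 2017, 2018):
    panel[f"treat_{k}"] = ((panel["community"] == 1)
                           & (panel["year"] == k)).astype(int)

model = smf.wls(
    "outcome ~ C(year) + C(school) + treat_2016 + treat_2017 + treat_2018",
    data=panel,
    weights=panel["match_weight"],
).fit(cov_type="cluster", cov_kwds={"groups": panel["school"]})

print(model.params.filter(like="treat_"))  # year-by-year impact estimates
print(model.bse.filter(like="treat_"))     # clustered standard errors
```

Plotting the treat_2016 through treat_2018 coefficients (and analogous pre-period coefficients) against the year gives a difference series of the kind shown in Figures 4.3 and 4.4.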
The next section illustrates these trends more clearly and provides specific estimates of the impact of NYC-CS on a variety of school outcomes.

School-Level Average Impact

Next, we turn our attention to the estimated school-level impacts of the NYC-CS. These estimates are calculated using the methods and measures discussed in Chapter Three. In short, we estimated whether the outcomes of the community schools improved relative to the matched comparison group of schools. The results are illustrated in Figures 4.3 and 4.4; Figure 4.3 shows the results for elementary and middle schools, and Figure 4.4 shows the results for high schools. These graphs show how the community schools' performance compares with that of the matched comparison group, relative to that difference in 2014. As shown by the values that are mostly near zero prior to 2014, the differences between the community schools and the matched comparison schools were mostly stable between 2010 and 2014, with the one exception being that the average test scores in elementary and middle community schools seemed to be improving slightly relative to the matched comparison schools. Although these differences were mostly stable before 2014, community schools improved after 2014 relative to the matched comparison schools on nearly all of the measures. This suggests that the NYC-CS had positive impacts on a variety of school-level outcomes for elementary, middle, and high schools.

Average Effect of NYC-CS on Student Outcomes

Tables 4.3 and 4.4 report the estimated impacts for elementary/middle schools and high schools, respectively. There are two main modifications between these estimates and the results shown in Figures 4.1 and 4.2. Most importantly, Tables 4.3 and 4.4 include additional outcomes. As described in Chapter Two, these outcomes come from the NYCSS and are only available from 2015 onward. For these outcomes, we therefore measured how community schools compare with the matched comparison schools relative to the observed difference in 2015. To the extent that the 2015 survey results already partially reflected a positive effect of NYC-CS, the results we present in Tables 4.3 and 4.4 would underestimate the actual effect of NYC-CS.

[Figure 4.1. Average Outcomes of Non-Community Schools, Community Schools, and Matched Comparison Schools over Time: Elementary and Middle Schools. Panels show chronic absenteeism, on-time progression, average test scores, and disciplinary incidents per student, 2010–2018, for treated schools, all non-treated schools, and matched controls; dashed vertical lines indicate that 2014–2015 is considered a transition year.]
The second modification between the results in Tables 4.3 and 4.4 and those shown in Figures 4.1 and 4.2 is that, whenever possible, we measured how community schools compared with the matched comparison schools relative to the average difference in the entire pre-period (i.e., 2010–2014) rather than relative to a single year (i.e., 2014); we did this to improve the statistical precision of the estimates.

To assist with interpretation of the tables, we consider the first column of Table 4.3, which provides estimates of the impact on chronic absenteeism in elementary and middle schools. As indicated toward the bottom of the table, these estimates are based on a comparison with the average of the outcome across the five years from 2010 to 2014 during the baseline period. The top estimate shows that NYC-CS had an impact of –0.0545 in 2016, which was statistically significant at the p < 0.01 level. Referring back to Table 4.1, we see that the average fraction of community school students who were chronically absent in elementary and middle schools in 2014 was 0.401, or 40.1 percent. The impact of NYC-CS would have reduced this average by 5.45 percentage points to 0.346, or 34.6 percent. As a point of comparison, the chronic absenteeism rate in non-community schools was 24.3 percent in 2014, which is nearly 16 percentage points lower than that for community schools. That means that NYC-CS eliminated almost one-third of the difference in the chronic absenteeism rate between community and non-community schools in its first year. In 2017 and 2018, the impact was even larger, eliminating about half of the absenteeism gap between community and non-community schools in 2014. The average effect shown in the fourth row of Table 4.3 indicates that, across 2016, 2017, and 2018, NYC-CS reduced chronic absenteeism by 7.3 percentage points. The other impact estimates in Tables 4.3 and 4.4 can be interpreted in similar fashion by referring to the appropriate statistics in Tables 4.1 and 4.2.

[Figure 4.2. Average Outcomes of Non-Community Schools, Community Schools, and Matched Comparison Schools over Time: High Schools. Panels show chronic absenteeism, graduation, credits accumulated, and disciplinary incidents per student, 2010–2018, for treated schools, all non-treated schools, and matched controls; dashed vertical lines indicate that 2014–2015 is considered a transition year.]

Tables 4.3 and 4.4 paint a promising picture, suggesting that the NYC-CS had positive effects on most of the examined measures, with some notable exceptions. In particular, we found that the NYC-CS had a positive impact on student attendance in elementary, middle, and high schools and across all three years in which outcomes were measured. We also found positive and significant impacts on elementary and middle school students' on-time grade progression in the two years for which we have data and on high school students' graduation rates in two of the three years.
In addition, the results suggest that the NYC-CS also increased math test scores of elementary and middle school students in 2018, reduced disciplinary incidents for elementary and middle school students, increased the number of credits that high school students obtained, and had positive effects on perceived teacher responsibility in elementary and middle schools in 2017 and 2018. It is important to note that impacts for some outcomes were close to zero and not statistically significant, including ELA test scores of elementary and middle school students in any of the three years we examined and math test scores in the first two years; student connectedness to adults and family empowerment opportunities for all schools; and the number of disciplinary incidents and perceived teacher responsibility in high schools.

[Figure 4.3. Difference Between Community Schools and Matched Comparison Schools: Elementary and Middle Schools. Panels show chronic absenteeism, on-time progression, average test scores, and disciplinary incidents per student, 2010–2018. The vertical axis reflects the difference in each outcome between community schools and non-community schools, normalized so that the difference in 2014 is equal to zero; solid bars represent 95-percent confidence intervals, and dashed vertical lines indicate that 2014–2015 is considered a transition year.]

Tables 4.3 and 4.4 also suggest that the impact of NYC-CS grew over time for some of the outcomes, a result that is most apparent at elementary and middle schools. Although a large part of the reduction in chronic absenteeism occurred immediately, the improvements in average math test scores generally took one or two years to appear. An interesting question is whether that is because of the increase in how much students themselves are exposed to the NYC-CS program or whether it is because the schools get better at implementing the NYC-CS over time. We explored this possibility by conducting a grade-by-year analysis, which is described in this chapter's "Grade-by-Year Analysis Results" section.

Finally, it is worth noting that the effects in Tables 4.3 and 4.4 are robust to a number of sensitivity checks that we conducted. First, we estimated a difference-in-difference specification that controls for school-specific linear trends (in lieu of year fixed effects) in the pre-NYC-CS period and found nearly identical results (a simplified sketch of this variant is shown below). We also conducted these analyses using all non-community schools as the comparison group, thereby implementing a traditional difference-in-difference design, and found similar results. Finally, we estimated the effects using an approach different from the difference-in-difference specification discussed in Chapter Three. In this analysis, we used a stronger threshold for the matching process, which increased the similarity of the community schools and the matched comparison schools in their baseline outcomes at the cost of excluding from the analytic sample some of the community schools with the lowest baseline performance. We then used a doubly robust estimation approach that compared the outcomes of the community schools in the post-program years (2016 through 2018) with those of their matched comparison schools while controlling for schools' baseline characteristics. Again, this approach yielded results similar to those reported in Tables 4.3 and 4.4. Results from all of these sensitivity analyses are available upon request.
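The sketch below illustrates one simple way to implement the school-specific linear-trend variant mentioned above. It is an assumption-laden illustration rather than the authors' specification: the trend here is fit over the full panel rather than only the pre-period, and the file and column names are the same hypothetical placeholders used earlier.

```python
# Sketch only (not the authors' code): robustness variant replacing common
# year effects with school fixed effects plus school-specific linear trends.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("school_year_panel.csv")  # hypothetical input file
for k in (2016, 2017, 2018):
    panel[f"treat_{k}"] = ((panel["community"] == 1)
                           & (panel["year"] == k)).astype(int)
panel["t"] = panel["year"] - panel["year"].min()  # linear time index

trend_model = smf.wls(
    # School fixed effects plus a school-specific slope on the time index.
    "outcome ~ C(school) + C(school):t + treat_2016 + treat_2017 + treat_2018",
    data=panel,
    weights=panel["match_weight"],
).fit(cov_type="cluster", cov_kwds={"groups": panel["school"]})

print(trend_model.params.filter(like="treat_"))
```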
[Figure 4.4. Difference Between Community Schools and Matched Comparison Schools: High Schools. Panels show chronic absenteeism, graduation, credits accumulated, and disciplinary incidents per student, 2010–2018. The vertical axis reflects the difference in each outcome between community schools and non-community schools, normalized so that the difference in 2014 is equal to zero; solid bars represent 95-percent confidence intervals, and dashed vertical lines indicate that 2014–2015 is considered a transition year.]

Changes in School Demand and Composition

One potential reason for the positive impacts highlighted in the previous section is that the demographic makeup of the community schools might have changed during this time period, either because the NYC-CS made these schools more attractive to students and parents or for unrelated reasons.
In this section, we examine whether the NYC-CS affected Table 4.3 Average Impact of NYC-CS on Elementary and Middle Schools Estimated effect of community school program 2016 2017 2018 Average effect (1) (2) (3) (4) Proportion Chronically Absent Proportion On-Time Progression –0.0545*** 0.0123** 0.0345 –0.00225 –0.0707** 1.806 –0.220 –0.893 (0.0108) (0.00596) (0.0288) (0.0281) (0.0356) (1.695) (1.358) (0.900) –0.0804*** 0.0110** 0.0364 –0.0166 –0.119*** 7.370*** 2.164* 0.0901 (0.0110) (0.00479) (0.0374) (0.0359) (0.0407) (2.271) (1.259) (0.888) –0.0870*** N/A 0.131*** 0.0539 –0.111*** 9.274*** 1.319 0.503 (0.0120) N/A (0.0385) (0.0396) (0.0394) (2.735) (1.611) (0.807) –0.0734*** 0.0117** 0.0657** 0.0108 –0.0995*** 5.976*** 1.036 –0.132 (0.0101) (0.00477) (0.0316) (0.0314) (0.0349) (1.809) (1.134) (0.747) Average Math Average ELA Test Scores Test Scores (5) Number of Disciplinary Incidents (6) (7) Student Teacher Connectedness to Responsibility Adults (8) Family Empowerment Opportunities   Base year 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle 341 341 341 341 341 341 167 341 2,994 2,667 2,970 2,970 2,673 1,339 647 1,342 Number of clusters (i.e., schools) Number of school-year observations NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. * p < 0.10; ** p < 0.05; *** p < 0.01. Results 39 (1) (2) (3) (4) (5) (6) (7) Proportion Chronically Absent Proportion Chronically Absent Proportion Graduated Credits Accumulated Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities 2016 –0.0606*** 0.0468**  1.265***  0.00944 –0.253 0.0713  1.619 (0.0177) (0.0258) (0.253) (0.0483) (1.985) (1.071) (1.317) –0.0940*** 0.0278 1.206*** –0.0253 0.958 1.083 1.463 (0.0216) (0.0189) (0.312) (0.0584) (2.501) (1.176) (1.331) –0.0952*** 0.0724** 1.346*** 0.00439 –0.0279 0.673 1.246 (0.0245) (0.0289) (0.304) (0.0670) (2.402) (1.282) (1.245) –0.0828*** 0.0487*** 1.271*** –0.00377 0.225 0.601 1.451 (0.0191) (0.0239) (0.269) (0.0464) (1.915) (1.035) (1.110) 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 High schools High schools High schools High schools High schools High schools High schools 171 171 171 171 171 171 171 1,492 1,464 1,511 1,353 671 667 661 2017 2018 Average Effect Base year(s) Schools included Number of clusters (i.e., schools) Number of school-year observations NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Credits accumulated are measured as credits per student per year, and the number of disciplinary incidents is measured per student per year. 
* p < 0.10; ** p < 0.05; *** p < 0.01 Illustrating the Promise of Community Schools Estimated effect of community school program 40 Table 4.4 Average Impact of NYC-CS on High Schools Results 41 the demand for community schools and whether the demographic makeup of the community schools changed relative to their matched comparison schools. The results of this analysis are shown in Table 4.5 for elementary and middle schools and Table 4.6 for high schools. The results suggest that the NYC-CS had mixed impact on demand, if any. The first column in the two tables shows there was an increase in the fraction of students who listed a community school as their first choice in the NYC school choice program relative to the matched comparison group, although this is only statistically significant for one school year and only for high schools. The second and third columns show that the NYC-CS had little effect on the degree to which students voluntarily left before the terminal grade or entered the schools after the opening grade, with the only statistically significant estimates suggesting NYC-CS reduced late entry. Therefore, we conclude that the impact findings presented in the previous section are not related to an increased number of students enrolling and staying in these schools. The remaining columns in Tables 4.5 and 4.6 report the demographic changes that occurred in the community schools relative to the matched comparison schools. Overall, there were no consistent patterns of change in student demographics in community schools compared with non-community schools. However, there were some changes that are worth noting. The fourth column shows that there was a relative increase in the proportion of students who were classified as English language learners in the community schools at all grade levels in 2016 and 2017. The fifth column in Table 4.5 shows that there was also a relative increase in the fraction of students who were in poverty at the elementary and middle community schools in 2016 and 2018. The other statistically significant changes include that the community high schools saw a relative increase in the proportion of their students who were Asian in all three years and a slight decrease in the proportion of their students who had a disability in 2018. This lack of substantial demographic change in the community schools suggests that the estimated community school effects in the previous sections are due to an impact of the program on enrolled students rather than changes in the backgrounds of students who attend the community schools. Effect on Student Subgroups and School Heterogeneous Effects The previous section focused on estimating the average effect of the NYC-CS; here, we investigate the effect of the program on particularly relevant student subgroups and whether school characteristics are related to the overall effectiveness of the program. Effect on Student Subgroups The results of this analysis are shown in Tables 4.7 and 4.8 for elementary/middle and high school, respectively. We provide estimates for groups that historically underperform on educational benchmarks: students in poverty, students in temporary housing, English language learners, students with disabilities, black students, and Hispanic students. We also provide separate estimates for male and female students. 
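One plausible way to produce subgroup estimates of this kind is to collapse the outcome to school-by-year means for the subgroup of interest and re-run the same weighted difference-in-difference model. The sketch below is illustrative only and assumes hypothetical student-level records and column names; the report's exact construction of subgroup outcomes may differ.

```python
# Sketch only (not the authors' code): subgroup impact in the spirit of
# Tables 4.7 and 4.8, here for students in poverty.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("student_year_records.csv")  # hypothetical input file
schools = pd.read_csv("school_year_panel.csv")      # hypothetical input file

# Collapse the subgroup to school-by-year outcome means.
sub = (students[students["in_poverty"] == 1]
       .groupby(["school", "year"], as_index=False)["outcome"].mean())
sub = sub.merge(schools[["school", "year", "community", "match_weight"]],
                on=["school", "year"])

# Average post-program effect (2016-2018) for the subgroup.
sub["treat_post"] = ((sub["community"] == 1) & (sub["year"] >= 2016)).astype(int)
fit = smf.wls("outcome ~ C(year) + C(school) + treat_post", data=sub,
              weights=sub["match_weight"]).fit(
    cov_type="cluster", cov_kwds={"groups": sub["school"]})

print(fit.params["treat_post"], fit.bse["treat_post"])
```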
In elementary and middle schools, the impact estimates on the proportion of those chronically absent, the average math score, and the number of disciplinary incidents are all statistically significant at the 5-percent level for all subgroups, with three exceptions. The impact on the average math score is only marginally significant (at the p < 0.10 level) for black students, the average math score is not significant for students in temporary housing, and the impact on the number of disciplinary incidents is not significant for English language learners. The impact on percent on-time progression is statistically sig- 2017 2018 Average effect Base year Schools included Number of clusters (i.e., schools) Number of school-year observations (1) (2) (3) (4) (5) (6) (7) (8) (9) (10) Fraction Who Listed as First Choice Proportion Who Left School Early Proportion Who Entered School Late Proportion ELL Proportion in Poverty Proportion with Disability Proportion White Proportion Asian Proportion Black Proportion Hispanic 0.00243 0.00322 –0.00884 0.0209*** 0.0252*** 0.0150* 0.000244 –0.00193 –0.00756 0.0115 (0.0411) (0.00575) (0.00641) (0.00692) (0.00964) (0.00812) (0.00257) (0.00247) (0.00793) (0.00743) 0.0404 –0.00182 –0.0112 0.0178** 0.0119 0.00597 –0.00242 –0.00220 –0.00343 0.0112 (0.0524) (0.00504) (0.00771) (0.00759) (0.0109) (0.00806) (0.00369) (0.00282) (0.00918) (0.00826) 0.0566 N/A –0.0130* 0.00842 0.0319** –0.0113 –0.00705 –0.00421 –0.00369 0.0177* (0.0650) N/A (0.00724) (0.00903) (0.0125) (0.00949) (0.00485) (0.00260) (0.0104) (0.0104) 0.000779 0.000779 –0.0109* 0.0160** 0.0229** 0.00384 –0.00290 –0.00273 –0.00499 0.0133* (0.00469) (0.00469) (0.00556) (0.00717) (0.00936) (0.00767) (0.00341) (0.00232) (0.00821) (0.00770) 2013–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 Middle Elementary and middle Elementary and middle Elementary Elementary and middle and middle Elementary and middle Elementary and middle Elementary and middle Elementary Elementary and middle and middle 340 340 338 340 340 340 340 340 340 340 2,550 2,550 2,542 2,929 2,929 2,929 2,929 2,929 2,929 2,929 NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. * p < 0.10; ** p < 0.05; *** p < 0.01. ELL = English language learner. 
Illustrating the Promise of Community Schools Estimated  2016 effect of community school program 42 Table 4.5 Measures of Elementary and Middle School Demand and Demographics Table 4.6 Measures of High School Demand and Demographics Estimated effect of community school program 2016 2017 2018   Average effect Base year Schools included Number of clusters (i.e., schools) Number of school-year observations (1) (2) (3) Fraction Who Listed as First Choice Proportion Who Left School Early 0.0319 0.000437 –0.0151*** (0.0333) (0.00834) 0.0769*** (4) (5) (6) (7) (8) Proportion in Poverty Proportion with Disability Proportion White Proportion Asian 0.0173** –0.00698 –0.000216 0.00109 0.00981** –0.00553 –0.00211 (0.00490) (0.00835) (0.0133) (0.00745) (0.00277) (0.00399) (0.00956) (0.00965) –0.00276 –0.00521 0.0241** –0.00277 –0.0151 0.00253 0.0139*** –0.00780 –0.00579 (0.0296) (0.00766) (0.00664) (0.0108) (0.0138) (0.00927) (0.00326) (0.00498) (0.0120) (0.0123) 0.0507 N/A 0.00944 0.0216 –0.00364 –0.0238** 0.00373 0.0163*** –0.0142 –0.00568 (0.0555) N/A (0.00938) (0.0132) (0.0133) (0.0114) (0.00337) (0.00552) (0.0117) (0.0120)                     0.0528 –0.00417 –0.00670 0.0209** –0.00451 –0.0127 0.00241 0.0132*** –0.00906 –0.00448 (0.0332) (0.00557) (0.0211) (0.0101) (0.0114) (0.00875) (0.00289) (0.00461) (0.0106) (0.0109) 2012–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 High schools High schools High schools High schools High schools High schools High schools High schools High schools High schools 203 231 205 231 231 231 231 231 231 231 1,169 1,688 1,514 1,945 1,945 1,945 1,945 1,945 1,945 1,945 Proportion Who Entered Proportion School Late ELL (9) (10) Proportion Proportion Black Hispanic Results NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. * p < 0.10; ** p < 0.05; *** p < 0.01. ELL = English language learner. 43 44 Illustrating the Promise of Community Schools Table 4.7 Estimates of NYC-CS on Student Subgroups in Elementary and Middle Schools (1) (2) (3) (4) (5) Proportion Chronically Absent Percentage On-Time Progression Average Math Score Average ELA Score Number of Disciplinary Incidents –0.0750*** 0.0112** 0.0643** 0.0175 –0.105*** (0.0104) (0.00530) (0.0316) (0.0308) (0.0364) In temporary housing –0.0419*** 0.0101 0.0550 0.0507 –0.123** (0.0141) (0.00857) (0.0356) (0.0353) (0.0539) English language learners –0.0676*** 0.00578 0.104*** 0.0654* –0.0342 (0.0133) (0.00894) (0.0366) (0.0366) (0.0261) With disability –0.0758*** -0.00281 0.0999*** 0.0439 –0.159** (0.0120) (0.00592) (0.0296) (0.0326) (0.0634) Male –0.0753*** 0.00787 0.0685** 0.0104 –0.115** (0.0101) (0.00528) (0.0330) (0.0321) (0.0458) –0.0716*** 0.0158*** 0.0754** 0.0262 –0.0850*** (0.0122) (0.00495) (0.0301) (0.0309) (0.0272) –0.0628*** 0.0165*** 0.0625* 0.0318 –0.147*** (0.0140) (0.00626) (0.0326) (0.0351) (0.0538) –0.0773*** 0.00503 0.0640** 0.00405 –0.0679** (0.0109) (0.00471) (0.0294) (0.0295) (0.0297) Base year 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 Schools included Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Estimated effect of community school program In poverty Female Black Hispanic NOTES: This table shows the results of multiple difference-in-difference regressions, which estimate the effect of NYC-CS on a number of student subgroups. The reported coefficients are the average effect of NYC-CS from the 2016 school year to the 2018 school year. 
Each coefficient comes from a different regression. * p < 0.10; ** p < 0.05 *** p < 0.01. Results 45 Table 4.8 Estimates of NYC-CS on Student Subgroups in High Schools Estimated effect of community school program (1) (2) (3) (4) Proportion Chronically Absent Proportion Graduated Credits Accumulated Number of Disciplinary Incidents –0.0778*** 0.0463** 1.305*** –0.00630 (0.0201) (0.0223) (0.263) (0.0479) In temporary housing –0.0925*** 0.0118 1.365*** 0.00698 (0.0250) (0.0420) (0.344) (0.0684) English language learners –0.0493** 0.0641 1.401*** 0.0340 (0.0205) (0.0593) (0.304) (0.0461) With disability –0.0778*** 0.0463** 1.305*** –0.00630 (0.0201) (0.0223) (0.263) (0.0479) Male –0.0884*** 0.0495* 1.265*** 0.0177 (0.0200) (0.0279) (0.294) (0.0576) –0.0776*** 0.0482* 1.251*** –0.0325 (0.0216) (0.0244) (0.268) (0.0348) –0.101*** 0.0385* 1.178*** –0.0490 (0.0259) (0.0230) (0.301) (0.0705) –0.0735*** 0.0608** 1.418*** 0.0320 (0.0207) (0.0293) (0.277) (0.0432) 2010–2014 2010–2014 2010–2014 2010–2014 High schools High schools High schools High schools In poverty Female Black Hispanic Base year Schools included NOTES: This table shows the results of multiple difference-in-difference regressions, which estimate the effect of NYC-CS on a number of student subgroups. The reported coefficients are the average effect of NYC-CS from the 2016 school year to the 2018 school year. Each coefficient comes from a different regression. * p < 0.10; ** p < 0.05; *** p < 0.01. 46 Illustrating the Promise of Community Schools nificant (at the p < 0.05 level) for four of the eight groups: students in poverty, students in temporary housing, females, and black students. Interestingly, English language learner students are the only subgroup that saw marginally significant impacts (at the p < 0.10 level) on their English scores. In high schools, the estimated impacts on the proportion chronically absent and credits accumulated were significant at the p < 0.05 level for all subgroups. The impact on the proportion graduated was significant at the p < 0.05 level for three subgroups (students in poverty, students with disabilities, and Hispanic students) and marginally significant at the p < 0.10 level for three subgroups (male, female, and black students) for all subgroups, except for students in temporary housing and English language learners. The impact on the number of disciplinary incidents is not significant for any of the subgroups. Evidence of School-Level Heterogeneity The results of our analysis of impact heterogeneity based on school characteristics are shown in Tables 4.9 and 4.10. The differential impact between any two groups tested can be found by subtracting the estimate for one group from the estimate for the other. For example, in the first column of Table 4.9, we see that among elementary and middle schools, the impact on chronic absenteeism is –0.0666 for highly zoned schools, –0.0646 for lightly zoned schools, and –0.0798 for unzoned schools. Therefore, the differential impact between highly zoned and unzoned schools is 0.0132. A test of the null hypothesis that the three coefficients are identical yields a p-value of 0.541, suggesting that the impact of NYC-CS is similar across the three groups. Overall, we found limited evidence of treatment effect heterogeneity based on school characteristics. 
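The group-specific effects and the tests of their equality reported in Tables 4.9 through 4.11 can be illustrated with a sketch like the one below. It is not the authors' code: the zoning categories, file name, and column names are hypothetical, and the equality test is shown here as a standard Wald/F test on the fitted model.

```python
# Sketch only (not the authors' code): group-specific treatment effects and a
# test that they are identical, in the spirit of Tables 4.9-4.11.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("school_year_panel.csv")  # hypothetical input file
panel["post"] = (panel["year"] >= 2016).astype(int)
for grp in ("high", "low", "unzoned"):  # hypothetical values of a zoning column
    panel[f"treat_{grp}"] = ((panel["community"] == 1) & (panel["post"] == 1)
                             & (panel["zoning"] == grp)).astype(int)

fit = smf.wls(
    "outcome ~ C(year) + C(school) + treat_high + treat_low + treat_unzoned",
    data=panel,
    weights=panel["match_weight"],
).fit(cov_type="cluster", cov_kwds={"groups": panel["school"]})

# Wald/F test of the null that the three group-specific effects are identical,
# analogous to the "p-value on test of equality" rows in the tables.
equality = fit.f_test("treat_high = treat_low, treat_low = treat_unzoned")
print(fit.params.filter(like="treat_"))
print("p-value for equal effects:", float(equality.pvalue))
```

Because zoning status is a fixed school characteristic, its main effect is absorbed by the school fixed effects, so only the group-by-treatment indicators enter the model.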
Interestingly, there is some evidence that the NYC-CS was more effective in improving levels of credit accumulation at high schools with recently hired principals, which could indicate that the strategy is even more helpful for these principals than for longer-tenured principals, or could reflect greater buy-in from recently hired principals. There also was some evidence that community schools that were also Renewal Schools were more successful at reducing chronic absenteeism, increasing on-time grade progression (among elementary and middle schools), and increasing graduation rates and credit accumulation (among high schools). In other words, while all community schools saw positive effects on these outcomes, community schools that were also Renewal Schools saw the strongest impact. The findings of differential impact based on Renewal School status mostly align with recent work estimating the short-term impact of that program (Opper et al., 2019), in which we also found that Renewal Schools had improvements in attendance and reductions in chronic absenteeism, along with more credits earned at high schools, but no significant impact on ELA test scores at elementary/middle schools or on discipline at any level. One notable difference between the previous and current findings is that the effect of Renewal Schools on math scores was not statistically significant in the former report, but it is in this report.1

1 The findings reported in Opper et al. (2019) are based on one less year of follow-up data and use a regression discontinuity design rather than the matching design used here, which may account for this difference.

Program Implementation Was Not Related to Differential Impacts

As described in Chapter Three and Appendix B, we classified schools into different groups based on their development along the program's four core capacities (coordination, collaboration, connection, and continuous improvement) and the nature of their implementation of mental health programs and services. To account for concerns regarding sample size due to the disaggregation of schools into several subgroups, we conducted these heterogeneity analyses on all schools, instead of conducting separate analyses for elementary/middle and high schools. First, as shown across the first four panels of Table 4.11, we found that although many impact estimates were slightly larger for schools with higher values on the core capacity index scores, there was only one case in which higher capacity scores were statistically significantly associated with larger program impact: schools with above-median collaboration levels had a stronger impact on student connectedness to adults. Second, we classified schools as fitting into one of two implementation profiles with respect to mental health program implementation (higher and lower; see Appendixes B and C for more details about these profiles). We investigated whether the impacts of NYC-CS differed depending on which of the two implementation profiles the schools followed. To do so, we used the same approach outlined in Chapter Three, but we now defined two treatment indicators (one for each profile). The first consisted of NYC-CS schools with the first implementation profile, which tended to have higher levels of mental health program implementation, and the second consisted of NYC-CS schools with the second implementation profile, which had lower levels of program implementation.
We estimated two treatment effects, one for each implementation profile, and also tested whether these two estimated effects were statistically distinguishable. As shown in the last panel of Table 4.11, we found that schools in the higher mental health implementation cluster were more effective at reducing chronic absenteeism than those in the lower implementation cluster. Otherwise, we did not reject the hypothesis that the estimated effects were the same regardless of the level of mental health implementation.

Grade-by-Year Analysis Results

Table 4.12 presents our estimates of the impact of exposure and maturity for middle school outcomes. Each set of rows is from a different regression, as is each column. For the chronically absent outcome in the first column, the first row indicates that the average impact across the three post-program years is negative and significant. The maturity coefficient in the second regression (–0.019) indicates that when we do not distinguish between program maturity and student exposure, the impact increases (i.e., becomes more negative) with program maturity. The third regression indicates that, when we control for cohort exposure, we still find that the increased impact over time is due to increased program maturity rather than increased cohort exposure. In other words, the longer a school participates in the NYC-CS, the lower its level of chronic absenteeism, even among students who are new to the school and have not had as much exposure to the program. The remaining middle school outcomes do not show any statistically significant change over time. Table 4.13 shows similar information for high school outcomes. As with middle schools, the impact on chronic absenteeism increases over time. Although the maturity coefficient in the third regression is not significant at the 5-percent level, it is marginally significant at the 10-percent level, and it is approximately the same size as when we do not include the exposure variable.
The findings for on-time progression are similar to that for chronic absenteeism—the impact increases over time, and including exposure reduces the significance of maturity but Proportion Chronically Absent (2) (3) (4) Proportion On- Average Math Average ELA Test Time Progression Test Scores Scores (5) (6) (7) (8) Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities Lowly zoned schools –0.0646*** 0.00889 0.00359 0.00352 –0.0412 9.645*** 1.557 –0.659 (0.0131) (0.00600) (0.0408) (0.0479) (0.0407) (2.491) (1.254) (1.037) Highly zoned schools –0.0666*** 0.00263 0.0355 –0.0385 –0.0309 6.239** –2.698 –1.564 (0.0179) (0.00879) (0.0426) (0.0360) (0.0440) (3.122) (2.455) (1.115) Unzoned schools –0.0798*** 0.0160*** 0.105*** 0.0311 –0.150*** 4.230** 1.530 0.580 (0.0123) (0.00545) (0.0358) (0.0396) (0.0423) (2.065) (1.248) (0.864) 0.541 0.247 0.0298 0.261 0.0157 0.111 0.205 0.147       P-value on test of equality of above coefficients   Base year 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Number of clusters (i.e., schools) 341 341 341 341 341 341 167 341 Number of school-year observations 2,994 2,667 2,970 2,970 2,673 1,339 647 1,342 NOTES: Zoned schools are sorted by what fraction of their students in the 2015 school year lived in the zone; those in the top half of the distribution (e.g., with more students living in the zone) are considered highly zoned schools and those in the bottom half are considered lightly zoned schools. * p < 0.10; ** p < 0.05; *** p < 0.01. Illustrating the Promise of Community Schools (1) 48 Table 4.9a Effect Heterogeneity in Elementary and Middle Schools: Differences by School Zoning Table 4.9b Effect Heterogeneity in Elementary and Middle Schools: Differences by Principal Tenure (1) Proportion Chronically Absent (2) (3) (4) Proportion On- Average Math Average ELA Test Time Progression Test Scores Scores (5) (6) (7) (8) Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities Recently hired principal –0.0834*** 0.0136** 0.0891** 0.0214 –0.109** 6.025*** 1.345 –0.0746 (0.0109) (0.00560) (0.0346) (0.0376) (0.0426) (1.978) (1.206) (0.876) Longer-tenured principal –0.0568*** 0.00920* 0.0267 –0.00788 –0.0886*** 6.027** 0.183 –0.393 (0.0143) (0.00540) (0.0377) (0.0376) (0.0338) (2.472) (1.594) (0.832) 0.0656 0.444 0.0775 0.491 0.580 0.999 0.450 0.718 Base year 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Number of clusters (i.e., schools) 338 338 338 338 338 338 165 338 Number of school-year observations 2,967 2,643 2,948 2,948 2,649 1,328 639 1,330 P-value on test of equality of above coefficients Results NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. 
Schools are sorted by how experienced their principal is in the 2015 school year; those with principals in the top half of the distribution are considered longer-tenured principal schools and those in the bottom half are considered recently hired principal schools. * p < 0.10 ;** p < 0.05; *** p < 0.01. 49   Small schools Proportion Chronically Absent (2) (3) (4) Proportion On- Average Math Average ELA Test Time Progression Test Scores Scores (5) (6) (7) (8) Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities –0.0752*** 0.0135*** 0.0703** 0.0121 –0.116*** 5.832*** 1.118 –0.0905 (0.0107) (0.00512) (0.0342) (0.0356) (0.0381) (1.946) (1.205) (0.819) –0.0675*** 0.00534 0.0506 0.00642 –0.0450 6.462** 0.736 –0.273 (0.0170) (0.00642) (0.0356) (0.0341) (0.0390) (2.556) (1.608) (0.863)                 0.652 0.195 0.547 0.880 0.0618 0.804 0.806 0.831 Base year 2010–2014 2010–2014 2010–2014 2010-2014 2010–2014 2015 2015 2015 Schools included Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Number of clusters (i.e., schools) 341 341 341 341 341 341 167 341 Number of school-year observations 2,994 2,667 2,970 2,970 2,673 1,339 647 1,342 Large schools   P-value on test of equality of above coefficients NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. Schools are sorted by their number of students in the 2015 school year; those in the top half of the distribution are considered large schools and those in the bottom half are considered small schools. * p < 0.10; ** p < 0.05; *** p < 0.01. Illustrating the Promise of Community Schools (1) 50 Table 4.9c Effect Heterogeneity in Elementary and Middle Schools: Differences by School Size Table 4.9d Effect Heterogeneity in Elementary and Middle Schools: Differences by Renewal School Status (1)   Non-Renewal School Proportion Chronically Absent (2) (3) (4) Proportion On- Average Math Average ELA Test Time Progression Test Scores Scores (5) (6) (7) (8) Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities –0.0370** –0.00243 0.0269 0.0388 –0.0934 3.299 –0.743 0.572 (0.0144) (0.00766) (0.0373) (0.0505) (0.0653) (2.316) (1.545) (0.950) –0.0830*** 0.0153*** 0.0760** 0.00358 –0.101*** 6.684*** 1.352 -0.319 (0.0106) (0.00489) (0.0336) (0.0336) (0.0354) (1.951) (1.186) (0.804)                 0.00154 0.0173 0.151 0.495 0.900 0.145 0.159 0.339 Base year 2010–2014 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Elementary and middle Number of clusters (i.e., schools) 341 341 341 341 341 341 167 341 Number of school-year observations 2,994 2,667 2,970 2,970 2,673 1,339 647 1,342 Renewal School   P-value on test of equality of above coefficients Results NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. 
Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. * p < 0.10; ** p < 0.05; *** p < 0.01. 51 52 Table 4.10a Effect Heterogeneity in High Schools: Differences by School Zoning (2) (3) (4) (5) (6) (7) Proportion Chronically Absent Proportion Graduated Credits Accumulated Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities Lowly zoned schools –0.107*** 0.0395 2.094*** –0.00620 1.839 –0.957 –1.739 (0.0203) (0.0347) (0.328) (0.0386) (1.718) (1.639) (1.279) Highly zoned schools –0.0781 –0.0753** 1.472*** –0.0620 6.622** 4.007*** 1.137 (0.0617) (0.0357) (0.387) (0.0557) (2.899) (0.809) (2.446) –0.0782*** 0.0587** 1.099*** 0.000325 –0.470 0.675 2.082 (0.0225) (0.0269) (0.302) (0.0532) (2.171) (1.104) (1.283) 0.564 0.00401 0.0368 0.530 0.0874 0.00000240 0.0664       Unzoned schools P-value on test of equality of above coefficients Base year   2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 High schools High schools High schools High schools High schools High schools High schools Number of clusters (i.e., schools) 171 171 171 171 171 171 171 Number of schoolyear observations 1,492 1,464 1,511 1,353 671 667 661 Schools included NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. Zoned schools are sorted by what fraction of their students in the 2015 school year live in the zone; those in the top half of the distribution (e.g., with more students living in the zone) are considered highly zoned schools and those in the bottom half are considered lightly zoned schools. * p < 0.10; ** p < 0.05; *** p < 0.01. Illustrating the Promise of Community Schools (1) Table 4.10b Effect Heterogeneity in High Schools: Differences by Principal Tenure (1) (2) (3) (4) (5) (6) (7) Proportion Chronically Absent Proportion Graduated Credits Accumulated Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities –0.0912*** 0.0632*** 1.421*** –0.0112 0.621 1.436 1.887 (0.0222) (0.0239) (0.306) (0.0564) (2.283) (1.177) (1.360) –0.0278 –0.0184 0.817*** –0.0271 –2.135 0.467 0.251 (0.0374) (0.0640) (0.214) (0.0514) (3.213) (1.830) (1.662) 0.127 0.210 0.0403 0.793 0.426 0.590 0.423 Base year 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included High schools High schools High schools High schools High schools High schools High schools Number of clusters (i.e., schools) 149 149 149 149 149 149 149 Number of school-year observations 1,298 1,268 1,315 1,177 583 581 575   Newly hired principal Longer tenured principal P-value on test of equality of above coefficients Results NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. 
Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. Schools are sorted by how experienced their principals are in the 2015 school year; those with principals in the top half of the distribution are considered longer-tenured principal schools and those in the bottom half are considered recently hired principal schools. * p < 0.10; ** p < 0.05; *** p < 0.01. 53 54 Table 4.10c Effect Heterogeneity in High Schools: Differences by School Size (2) (3) (4) (5) (6) (7) Proportion Chronically Absent Proportion Graduated Credits Accumulated Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities –0.0737*** 0.0748** 0.983*** 0.0259 –2.618 1.242 2.553 (0.0261) (0.0343) (0.346) (0.0625) (2.552) (1.233) (1.668) –0.0935*** 0.0172 1.631*** –0.0411 3.776** –0.196 0.158 (0.0244) (0.0232) (0.316) (0.0484) (1.880) (1.204) (1.096) 0.558 0.120 0.116 0.311 0.0176 0.277 0.197 Base year 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included High schools High schools High schools High schools High schools High schools High schools Number of clusters (i.e., schools) 171 171 171 171 171 171 171 Number of school-year observations 1,492 1,446 1,511 1,353 671 667 661   Small schools Large schools P-value on test of equality of above coefficients NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. Schools are sorted by their number of students in the 2015 school year; those in the top half of the distribution are considered large schools and those in the bottom half are considered small schools. * p < 0.10; ** p < 0.05; *** p < 0.01. Illustrating the Promise of Community Schools (1) Table 4.10d Effect Heterogeneity in High Schools: Differences by Renewal School Status (1) (2) (3) (4) (5) (6) (7) Proportion Chronically Absent Proportion Graduated Credits Accumulated Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities –0.0459 –0.0204 0.0549 0.0683 –2.869 0.495 1.269 (0.0279) (0.0418) (0.324) (0.0762) (3.341) (1.345) (1.783) –0.0990*** 0.0812*** 1.854*** –0.0377 1.715 0.652 1.542 (0.0224) (0.0240) (0.268) (0.0483) (1.941) (1.158) (1.302) 0.118 0.0231 >0.001 0.182 0.187 0.910 0.895 Base year 2010–2014 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included High schools High schools High schools High schools High schools High schools High schools Number of clusters (i.e., schools) 171 171 171 171 171 171 171 Number of school-year observations 1,492 1,446 1,511 1,353 671 667 661   Non-Renewal School Renewal School P-value on test of equality of above coefficients Results NOTES: The coefficients shown are the result of a weighted difference-in-difference specification. Administrative outcomes include data from the 2010 school year to the 2018 school year; survey outcomes include data from the 2015 school year to the 2018 school year. Math and ELA test scores are measured in student standard deviation units, and the number of disciplinary incidents is measured per student per year. * p < 0.10; ** p < 0.05; *** p < 0.01. 
55 56 Illustrating the Promise of Community Schools Table 4.11a Effect Heterogeneity Based on Implementation Metrics: Coordination (1) (2) (3) (4) (5) (6) Proportion Chronically Absent Proportion On-Time Progression Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities Below median coordination –0.0680*** 0.0197** –0.0275 4.013** –0.510 0.406 (0.0160) (0.00868) (0.0457) (1.880) (1.005) (1.056) Above median coordination –0.0748*** 0.00106 –0.0675** 3.828** 1.352 0.144 (0.0124) (0.00908) (0.0287) (1.638) (1.022) (0.695) P-value on test of equality of above coefficients 0.714 0.100 0.407 0.931 0.137 0.820 Base year 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included All schools All schools All schools All schools All schools All schools Number of clusters (i.e., schools) 512 512 512 512 338 512 Number of school-year observations 4,486 4,013 4,026 2,010 1,314 2,003 NOTES: The coefficients shown are the result of a weighted difference-in-difference specification described in this report. * p < 0.10 ** p < 0.05 *** p < 0.01 shows no indication that exposure accounts for the change in impact.2 The impact on the other two outcomes, credit accumulation and disciplinary incidents, does not appear to change over time. In sum, our analyses suggest that any increases of program impact over time (as shown in our main effects in Tables 4.3 and 4.4) appear to be due to increasing program maturity that affects all students rather than from multiple years of exposure to the program experienced by students in the earlier cohorts. In other words, for outcomes that get stronger from one year to the next, the increasing impact seems to have affected all students and not just those who have been at the school for the entire time. 2 In the grade-by-year analysis, we used on-time grade progression as an outcome for high school students rather than high school graduation as seen elsewhere in the report. This adjustment allows us to have a consistent set of outcomes across grade levels and also enables us to include information about all high school students rather than focusing only on those in grade 12. Results 57 Table 4.11b Effect Heterogeneity Based on Implementation Metrics: Collaboration (1) (2) (3) (4) (5) (6) Proportion Chronically Absent Proportion On-Time Progression Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities Below median collaboration –0.0641*** 0.00610 –0.0170 2.729 –1.221 –0.0101 (0.0156) (0.00869) (0.0485) (1.962) (0.955) (0.978) Above median collaboration –0.0670*** 0.00567 –0.0414 4.689*** 2.199** 0.536 (0.0139) (0.00900) (0.0289) (1.423) (0.974) (0.792) P-value on test of equality of above coefficients 0.878 0.969 0.633 0.338 0.00355 0.634 Base year 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included All schools All schools All schools All schools All schools All schools Number of clusters (i.e. schools) 512 512 512 512 338 512 Number of school-year observations 4,486 4,013 4,026 2,010 1,314 2,003 NOTES: The coefficients shown are the result of a weighted difference-in-difference specification described in this report. 
* p <0.10 ** p < 0.05 *** p < 0.01 58 Illustrating the Promise of Community Schools Table 4.11c Effect Heterogeneity Based on Implementation Metrics: Connection (1) (2) (3) (4) (5) (6) Proportion Chronically Absent Proportion On-Time Progression Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities Below median connection –0.0512*** 0.0108 –0.0327 3.567* –0.718 0.0320 (0.0143) (0.00871) (0.0405) (1.879) (0.967) (1.016) Above median connection –0.0745*** 0.0139 –0.0765** 4.787*** 1.506 0.670 (0.0143) (0.00890) (0.0317) (1.576) (1.057) (0.751) P-value on test of equality of above coefficients 0.208 0.784 0.329 0.559 0.0746 0.580 Base year 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included All schools All schools All schools All schools All schools All schools Number of clusters (i.e., schools) 512 512 512 512 338 512 Number of school-year observations 4,486 4,013 4,026 2,010 1,314 2,003 NOTES: The coefficients shown are the result of a weighted difference-in-difference specification described in this report. * p < 0.10 ** p < 0.05 *** p < 0.01. Results 59 Table 4.11d Effect Heterogeneity Based on Implementation Metrics: Continuous Improvement (1) (2) (3) (4) (5) (6) Proportion Chronically Absent Proportion On-Time Progression Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities Below median continuous improvement –0.0766*** 0.00438 –0.0626* 2.520 –0.0596 –0.00130 (0.0163) (0.00924) (0.0326) (1.687) (0.997) (0.852) Above median continuous improvement –0.0573*** 0.0173** –0.0212 5.947*** 1.207 0.506 (0.0115) (0.00810) (0.0386) (1.650) (1.012) (0.873) P-value on test of equality of above coefficients 0.287 0.237 0.344 0.0816 0.302 0.645 Base year 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included All schools All schools All schools All schools All schools All schools Number of clusters (i.e., schools) 512 512 512 512 338 512 Number of school-year observations 4,486 4,013 4,026 2,010 1,314 2,003 NOTES: The coefficients shown are the result of a weighted difference-in-difference specification described in this report. * p < 0.10; ** p < 0.05; *** p < 0.01. 60 Illustrating the Promise of Community Schools Table 4.11e Effect Heterogeneity Based on Implementation Metrics: Mental Health (1) (2) (3) (4) (5) (6) Proportion Chronically Absent Proportion On-Time Progression Number of Disciplinary Incidents Teacher Responsibility Student Connectedness to Adults Family Empowerment Opportunities High mental health implementation cluster –0.106*** 0.0127 –0.0342 3.825** –1.078 0.277 (0.0162) (0.00974) (0.0311) (1.900) (1.129) (0.763) Low mental implementation cluster –0.0651*** 0.0222*** –0.0391 4.877*** 0.834 0.739 (0.0110) (0.00730) (0.0330) (1.513) (0.848) (0.745)             0.0244 0.373 0.891 0.599 0.100 0.608 Base year 2010–2014 2010–2014 2010–2014 2015 2015 2015 Schools included All schools All schools All schools All schools All schools All schools Number of clusters (i.e., schools) 512 512 512 512 338 512 Number of school-year observations 4,486 4,013 4,026 2,010 1,314 2,003     P-value on test of equality of above coefficients NOTES: The coefficients shown are the result of a weighted difference-in-difference specification described in the report. * p < 0.10 ** p < 0.05 *** p < 0.01. 
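Tables 4.12 and 4.13, which follow, separate two explanations for impacts that grow over time: program maturity (how long the school has operated as a community school) and student exposure (how long a given cohort has attended it). The table notes describe three nested regressions of grade-by-year effect estimates on these terms; written out, the third regression takes roughly the form below. The notation is ours rather than the report's, so treat this as a hedged reconstruction.

```latex
% \hat{\tau}_{gt}: estimated program effect for grade g in year t (grade-by-year analysis)
% M_t: maturity, years the school has operated as a community school by year t
% E_{gt}: exposure, program years experienced by the cohort in grade g by year t
% g: grade, included as a control
\hat{\tau}_{gt} = \alpha + \beta_{M} M_{t} + \beta_{E} E_{gt} + \gamma\, g + \varepsilon_{gt}
```

Under this reading, a sizable maturity coefficient combined with an exposure coefficient near zero is the pattern the report interprets as impacts growing with program maturity rather than with students' cumulative exposure.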
Results Table 4.12 Maturity and Exposure for Middle School Outcomes (1) Proportion Chronically Absent Constant (Average Effect) Constant Maturity Exposure Maturity   (2) (3) Proportion On-Time Progression Math Test Score (4) (5) ELA Test Score Number of Disciplinary Incidents –0.063*** 0.010*** 0.097*** 0.016 –0.087*** (0.009) (0.004) (0.017) (0.017) (0.020) –0.027 –0.001 0.046 –0.050 –0.099* (0.020) (0.012) (0.040) (0.040) (0.057) –0.019** 0.008 0.026 0.033* 0.006 (0.009) (0.008) (0.019) (0.019) (0.026) 0.004 0.009 –0.017 –0.001 –0.034 (0.015) (0.015) (0.047) (0.046) (0.066) –0.020** 0.003 0.035 0.033 0.025 (0.010) (0.012) (0.030) (0.031) (0.046) NOTES: Each column and each set of rows is a separate regression. The first regression includes only a constant. The second regression includes only a constant and the maturity variable. The third regression presents the maturity and exposure coefficients from a regression that also contains a constant and the grade variable. * p < 0.10; ** p < 0.05; *** p < 0.01. 61 62 Illustrating the Promise of Community Schools Table 4.13 Maturity and Exposure for High School Outcomes (1) (2) (3) (4) Proportion Chronically Absent Proportion On-Time Progression Credits Accumulated Number of Disciplinary Incidents Constant (average effect) –0.082*** 0.039*** 1.230*** –0.008 (0.008) (0.007) (0.042) (0.014) Constant –0.043** –0.001 1.176*** –0.005 (0.018) (0.015) (0.113) (0.038) –0.021** 0.027*** 0.028 –0.002 (0.008) (0.010) (0.054) (0.018) 0.009 –0.009 -0.172 0.073** (0.020) (0.022) (0.114) (0.029) Maturity –0.026* 0.033* 0.141 –0.053**   (0.015) (0.020) (0.091) (0.025) Maturity Exposure NOTES: Each column and each set of rows is a separate regression. The first regression includes only a constant. The second regression includes only a constant and the maturity variable. The third regression presents the maturity and exposure coefficients from a regression that also contains a constant and the grade variable. * p < 0.10; ** p < 0.05; *** p < 0.01. CHAPTER FIVE Discussion Since the launch of the NYC-CS in 2014, New York City has been at the forefront of education reform efforts to transform city schools into places where student success is fostered through an integrated, holistic system of school-based supports. NYCDOE believes schools are based on relationships and that student achievement and well-being are elevated through authentic and empowering partnerships among school staff, families, and CBOs. The NYC-CS is designed to implement this idea on a scale that has not been seen before in the United States. Our evaluation of the NYC-CS is based on a quasi-experimental design to estimate the impact of the program on various outcomes through the 2017–2018 school year. Considered as a whole, our findings suggest promise for the community school strategy, which is currently expanding across the country. In this chapter, we provide a discussion of our findings as they relate to prior rigorous research on community schools more broadly, and we discuss implications for policymakers and practitioners who are interested in designing, supporting, or evaluating similar programs. Community Schools Had a Positive Impact on Most of the Examined Student Outcomes As we describe in Chapter Four, we found that the NYC-CS has proven to be beneficial for students across a variety of outcome measures. 
Regarding student attendance, we found that for students of all grade levels, the NYC-CS initiative has led to a statistically significant decrease in the percentage of students who are chronically absent across the three years of the study, with three-year average effects of –0.074 for elementary and middle schools and –0.083 for high schools (equivalent to decreases of 7.4 and 8.3 percentage points, respectively). These findings provide new evidence of the promise of the community schools strategy for improving student attendance, as prior studies of similar comprehensive programs have found only marginal improvements in attendance. Moreover, no prior work has explicitly considered the impact on chronic absenteeism, which has been a particular policy target for NYCDOE since the mayoral administration of Michael Bloomberg (Balfanz and Byrnes, 2013).

We also found that the NYC-CS had a positive impact on students' on-time grade progression and credit accumulation, particularly among high school students. Specifically, we found that the NYC-CS led to a 4.7 percentage point increase in the share of students graduating from high school on time in 2016 and a 7.2 percentage point increase in 2018 (results were not statistically significant in 2017). Averaging across the three years, we found an increase of 4.9 percentage points in the share of students graduating on time. Although there is minimal prior evidence about the impact of the community school strategy on this outcome domain, our results do align with the evaluation by Kemple et al. (2005) of the Talent Development intervention for high schools, which led to a higher percentage of students in grade 9 progressing to grades 10 and 11 on time compared with students not participating in the program. Otherwise, we believe that this is the first study to report a positive impact on educational attainment for students, perhaps because prior studies did not include attainment among the achievement and other outcomes they examined.

Regarding academic achievement, we found that among students in elementary and middle schools, attending a school in the NYC-CS was associated with a difference in math achievement of over a tenth of a standard deviation in the final year of the study period. However, we did not find significant effects on ELA test scores or on math scores in the first two years. The positive impact on math achievement in the earlier grades aligns with prior results from quasi-experimental analyses of the City Connects program in Boston, which found positive impacts for students in grades 6 and 7 (Dearing et al., 2016; Walsh et al., 2014). However, these studies also found positive impacts on reading achievement, which we did not observe in the NYC-CS.

Our final student-level outcome was the number of disciplinary incidents for students. Although we found no impact for high school students, we did observe that among students in elementary and middle schools, attending a community school was associated with a statistically significant reduction of approximately 0.1 disciplinary incidents per student per year. Prior studies of community schools that tested for differences in student behavior found no impact (ICF International, 2010a, 2010b, 2010c; LaFrance Associates, 2005).
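To make the standardized and per-student effect sizes above more concrete, the short calculation below converts a test-score effect of roughly 0.1 student standard deviations into an approximate percentile shift and scales the per-student reduction in disciplinary incidents to a school of a given size. The normal approximation of the score distribution and the 500-student enrollment are illustrative assumptions, not figures taken from the report.

```python
from scipy.stats import norm

# A math effect of ~0.1 student standard deviations: a student who would
# otherwise score at the median would move to roughly the 54th percentile,
# assuming approximately normally distributed test scores.
effect_sd = 0.10
print(f"Approximate percentile for a median student: {norm.cdf(effect_sd) * 100:.1f}")

# A reduction of ~0.1 disciplinary incidents per student per year, scaled to
# a hypothetical school of 500 students.
enrollment = 500
print(f"Approximate incidents avoided per year: {0.1 * enrollment:.0f}")
```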
Evidence of Impact for Many Student Subgroups

We found evidence that the NYC-CS affected multiple subgroups of students, including English language learners, students in temporary housing, black students, Hispanic students, and students in poverty. Although we did not formally test whether program impacts differed for a particular subgroup compared with the rest of the sample, our finding that program impacts were consistently experienced across key student subpopulations lends credence to the universal utility of the NYC-CS, at least in the New York City context.

Limited Evidence of School-Level Impact Heterogeneity

Regarding our hypothesis that program impact may be more pronounced for certain schools, we found evidence suggesting that, for some outcomes, the NYC-CS was more effective in schools with newly hired principals and in schools that were also designated as Renewal Schools. Finally, although we mostly found no association between the level of program implementation and estimated impacts, the results were as we would expect in the handful of cases where we did see differential impacts: Schools that demonstrated higher levels of program implementation (e.g., collaborative practices and mental health programming) also had stronger impacts on certain outcome measures (e.g., students' connectedness to adults and classmates and chronic absenteeism, respectively). Related to the latter finding, we should point out that although all community schools experienced a statistically significant reduction in chronic absenteeism, schools that implemented a greater amount of mental health programs and services saw particularly large reductions in this area.

There Was Limited and Inconsistent Evidence of Community Schools Supporting Improvements in Aspects of School Culture

Our evaluation found some evidence of the NYC-CS supporting improvements in school climate and culture in elementary and middle schools. We found an impact on teachers' shared responsibility for student success among the elementary and middle schools, but only for the final two years of the study. In addition, we found a positive effect on students' sense of connectedness to adults and peers for elementary and middle school students, but only during the second year of the study and not the first or third. Finally, although we found positive gains in family engagement opportunities, the differences between community schools and the comparison group did not reach statistical significance. For the high schools, none of the impacts on school climate and culture measures were statistically significant. The positive findings align with prior research on the benefits of the community school strategy. For example, LaFrance Associates (2005) found that students in community schools showed improved relationships with adults and improved self-esteem, and Durham and Connolly (2016) found that parents of students in community schools were more likely to report that school staff cared about their children and were willing to work with their children to support their learning.

The Estimated Impact Increased over Time for Some Outcomes

We conducted various sensitivity analyses to understand whether the estimated effects were due in part to changes in the schools' composition or to nuances of our statistical specifications. With respect to composition, we found no evidence suggesting that community schools were serving new types of students because of a potential increase in desirability after being designated community schools.
In addition, we found that our estimated impacts were robust to various sensitivity analyses, such as re-estimating the comparisons using alternative comparison groups: one with all non-community schools in the city and one with a very restricted comparison group of nearly identical schools. In both cases, we found no notable differences in estimated impacts.

Finally, our analysis of program impact over time suggests that the impact of the NYC-CS has increased for some outcomes over the three-year period we examined. We do not find evidence that the increasing impact for these outcomes is due to increased exposure to the program among the first cohorts of students. Instead, the increased impact appears to apply to new and old cohorts equally, which is consistent with impact that grows because of program maturity rather than because of additional exposure. This finding is consistent with prior research indicating that the community schools strategy is particularly impactful after several years of program implementation. For example, Olson (2014) found that community schools' impacts were most pronounced among schools that had been operating for five or more years. In addition, Durham and Connolly (2016) found that impacts on some student outcomes (in particular, attendance) were present only in schools that had been open longer than others. Therefore, we encourage program developers and policymakers to take implementation timing into account when planning evaluations of their programs in the future.

Limitations

This study has several limitations that are important to keep in mind when considering its implications for policy and practice. First, New York City is a unique school system, with vast resources and even bigger challenges. This evaluation examines the impact of the NYC-CS on New York City schools and students, and the intervention represents the largest program of its kind in the history of the United States. Although it is plausible that many of the features of the NYC-CS strategy could be implemented in other locations and that many of our findings may generalize to other urban school systems, it is beyond the scope of our study to demonstrate that the program or our impact estimates would generalize to other settings. In particular, as discussed in our earlier report (Johnston et al., 2017), New York City OCS and the participating schools and CBOs devoted a remarkable amount of energy and other resources to the NYC-CS. Whether other school systems can follow this lead is an open question.

Our impact study design also has some limitations because this initiative was launched at scale. We did not have an opportunity to randomly assign schools or students to the initiative. Furthermore, the administrative assignment to the community school program of all schools that failed to meet specified achievement goals made it impossible to construct a comparison group of schools that was equivalent to participating schools at baseline. Our quasi-experimental design created a comparison group with similar baseline trajectories and employed a difference-in-difference analytic approach that attempted to compensate for the lack of random assignment and baseline equivalence. However, our impact estimates could still be biased by unobserved differences between the community schools and the comparison schools.

Our findings are also subject to limitations in the nature and extent of the data at our disposal.
For example, although NYCDOE has rich survey data from teachers, students, and families via the NYCSS, these data are no substitute for surveys designed specifically for the purpose at hand. A study with the opportunity to design its own surveys, administer them at baseline and during the initiative, and administer them to respondents affiliated with both community schools and comparison schools would likely be able to learn more about the program's impact on social and emotional learning, about specific barriers to and enablers of the initiative's impact on child development and school engagement, and about other questions specific to the understanding of community schools. Access to administrative data regarding student health, justice involvement, and other aspects of students' lives would also expand our knowledge beyond the current study. In particular, given the focus on mental health in the NYC-CS, our study could have benefited from greater access to data on student mental health outcomes (or proxies thereof); however, because of data privacy constraints, we were unable to extend our analysis of mental health impact beyond the core outcome measures described thus far.

Finally, the study is limited in its duration. By current research standards, our examination of data from four school years following program initiation is rather generous (or three years if we do not include the transition year of 2014–2015). However, it is possible that the impact of an intervention that assists schools and students in a holistic fashion could grow over time. Additional years of study are necessary to examine that question.

Implications for Policy and Practice

Our findings have important implications for policymakers and practitioners interested in developing or improving similar community school initiatives in other contexts. The most obvious implication of our analysis is that the community school strategy appears to be having a tangible impact on a variety of student outcomes, such as attendance, grade completion, credit accumulation, math achievement, and disciplinary incidents. These positive impacts are particularly important because the NYC-CS is such a large program compared with other instances of the community school strategy that have been rigorously evaluated thus far. Even though the NYC-CS represents a uniquely large initiative implemented at scale, the program's alignment with prior research regarding the hallmark components of the community schools strategy makes the initiative a relevant template for others to follow (Oakes, Maier, and Daniel, 2017).

Although the results presented in this report are encouraging for state and local education agencies that might consider implementing the community schools strategy in their jurisdictions, we understand that the resource demands may represent a limiting factor in some contexts. Therefore, policymakers should consider forging strategic partnerships with service providers and community-based organizations, while also pursuing grant funding through programs such as the federal Full-Service Community Schools grant program. This grant program has been in place for several years and generally supports smaller instances of community school development (e.g., five to ten schools within a district or urban community).
In addition to federal policy, state and local education agencies should consider working with a growing field of organizations that provide critical technical support and guidance for community school implementation, including the Coalition for Community Schools and the National Center for Community Schools, both of which have been influential in the development and refinement of the NYC-CS.

Directions for Future Research

As mentioned in our initial report on the implementation of the NYC-CS (Johnston et al., 2017), there are numerous ways in which this evaluation was limited in its scope, and these limitations represent opportunities for further research on community school programs nationwide and on the NYC-CS in particular. First, we see the need for an explicit inquiry into the district-level strategies and processes that shape the program as a whole and are likely to affect the implementation experiences of schools. Although we found evidence of a positive effect of the NYC-CS as told through the outcomes of students and schools, we believe that an important mechanism behind this observed impact across such a large group of schools is likely the efforts of the OCS and such other key agencies as the Office of School Health and the Division of Family and Community Engagement, along with the Bureau of Children, Youth and Families at the Department of Health and Mental Hygiene. Therefore, we encourage scholars of community school program implementation to consider integrating data collection activities at the district or city level to fully understand the work involved in large-scale community school program implementation and scale-up.

Second, although we see great merit in this comprehensive evaluation that considers the cumulative impact of the various program features of the NYC-CS, we encourage scholars to conduct focused investigations into particular community school components, such as family engagement, extended-day activities, or mental health service provision. And finally, we encourage scholars to examine longer-term impacts of the community school strategy, ideally extending beyond the limited time frame that we used for our study. We found that impacts increased for some outcomes across the three years of the study, and it will be helpful to examine whether and how impacts change beyond this period.

APPENDIX A

Review Memo

The following is the review memo from the New York City Mayor's Office for Economic Opportunity.

New York City Department of Education Response

Mayor Bill de Blasio has long championed the adoption of a holistic "whole child" approach to student learning in New York City. In 2014, he pledged to create 100 community schools as an innovation that would lift up every child through new opportunities for students and families both in and out of school. Community schools have become a critical pillar of New York City's educational strategy to integrate academics, health, wellness, and family empowerment into the fabric of schools. Along with the Mayor's Equity and Excellence for All agenda (including Universal 3-K and Pre-K, Universal Literacy, AP for All, and Computer Science for All), community schools demonstrate New York City's commitment to enabling all students to thrive.
In six years, New York City has more than doubled the initial goal of 100 community schools, with 267 community schools now serving more than 135,000 students. As a result of this unprecedented commitment and growth, NYC has become the national leader in an educational movement focused on addressing students' diverse needs, empowering families to be active participants in their child's education, and engaging entire communities in support of student success. Now, thanks to the RAND Corporation's evaluation, additional evidence demonstrates that community schools are effective at increasing elementary and middle school students' on-time grade progression and high school students' graduation rates, improving math achievement in elementary and middle grades, improving attendance, and reducing disciplinary incidents in elementary and middle schools.

In 2016, the Mayor's Office for Economic Opportunity (NYC Opportunity), in partnership with the NYC Department of Education (NYCDOE) and the NYC Department of Health and Mental Hygiene (DOHMH), contracted with the RAND Corporation to deliver two evaluations of the Community Schools Initiative: an implementation study and an impact study. Published in 2017, the implementation study found that in two years, New York City launched the largest community school initiative in the United States. It also found that virtually all NYC community schools successfully implemented, as intended, the key evidence-based programs and structures needed to support students and families. Key programs and structures of the NYC community schools include:
• A Community School Director (CSD) responsible for coordinating services, programs, partnerships, and support systems for students and families
• Expanded learning opportunities before and after school, on weekends, and during the summer
• Integrated health, wellness, and social services in the school and community
• Family and community empowerment that elevates parents and community members to leadership roles in the school community
• Collaborative leadership and interrelationships between school leaders and community-based organizations (in contract with the NYC Department of Education), a unique aspect of the NYC community school strategy

In early 2020, RAND published the impact study, which is titled "Illustrating the Promise of Community Schools: An Assessment of the Impact of the NYC Community Schools Initiative." The study looks at changes in student outcomes and school culture and climate over three years (School Year 2015–16 through School Year 2017–18) and examines the impact of the community school strategy on academics, attendance, behavior, and school culture and climate. The impact study addresses three main questions:
1. What is the impact of community schools on attendance, educational attainment, academic achievement, student behavior, and school climate and culture?
2. To what extent are the overall impacts of community schools being observed among key subgroups of students within schools?
3. To what extent are there differences in program impact related to school characteristics such as programmatic implementation, grade configuration, principal experience, and the residential dispersion of students?
The primary objective of the impact study is to estimate the overall impact of the community school strategy by comparing the outcomes of students attending community schools with those of students attending a carefully constructed comparison group of matched schools.

The RAND impact study found statistically significant and positive results for a broad range of outcomes when community schools were compared to the comparison group of non-community schools. Those outcomes include graduation rates, absenteeism, math scores, credit accumulation, on-time grade progression, and disciplinary incidents. Specifically:

1. Graduation rates significantly increased in community schools compared to non-community schools. Over three years, graduation rates in community schools were 4.9 percentage points higher than in comparison schools. This effect was particularly pronounced in the third and final year of the study, when graduation rates in community schools were 7.2 percentage points higher than in comparison schools.

2. Chronic absenteeism declined significantly compared to non-community schools, especially for vulnerable populations. Community schools across all grade bands and several vulnerable populations showed significantly lower chronic absence (CA) rates than their counterparts in comparison schools. Specifically, CA rates in community schools were lower than the control group by 1.3 percentage points for elementary and middle school students; 3.3 percentage points for high school students; 9.3 percentage points for high school students in temporary housing; 7.4 percentage points for Hispanic high school students; and 10.1 percentage points for black high school students. Although all community schools experienced reductions in chronic absenteeism, community schools with broader and more intensive implementation of mental health programs and services saw a stronger impact on this outcome, compared with community schools with lower levels of mental health implementation.

3. Math scores, credit accumulation, and on-time grade progression significantly increased in community schools compared to non-community schools.
• Math scores for third through eighth grade students were significantly higher in community schools than in comparison schools. According to the study, in 2017–18 math test scores in community schools were 0.13 standard deviations, or 4.2 percentile points, higher than the comparison group. For example, an average student at a comparison school might perform at the 50th percentile, while a similar student at a community school would perform at the 54th percentile.
• High school students in community schools accumulated an average of 1.3 more credits per year compared to students in comparison schools. This finding was significant across all subgroups of students, with English language learners and Hispanic students earning an average of 1.4 more credits than their counterparts at comparison schools. Assuming on-track progress is 11 credits per year, 1.3 credits is equivalent to 12% of a regular school year's accumulation.
• Community school students matriculated to the next grade on time at higher rates than students in comparison schools. Those rates were higher for community school students by 1.2 percentage points for elementary and middle school students.

4. Disciplinary incidents declined sharply in elementary and middle schools compared to non-community schools.
Community schools saw an average of 0.10 fewer disciplinary incidents per student per year at the elementary and middle school level compared to the comparison schools. For example, in a community school with 500 students, there were 50 fewer incidents every year compared to a matched school with the same number of students.

Other results in the study were positive but not statistically significant, including reading achievement, disciplinary incidents in high schools, and school culture and climate measures such as student connectedness to adults and family empowerment opportunities.

Based on the impact of the New York City Community Schools Initiative since 2014, the following policy implications can be surmised:

1. NYC has successfully launched the largest community school initiative in the nation. As community schools gain momentum across the country, NYC is a model for implementation and has demonstrated that it is feasible to successfully provide this type of innovation at a large scale.

2. RAND found that the impacts of the community school strategy increased over time for some outcomes. This finding is consistent with prior research indicating that the community schools strategy is particularly impactful after several years of program implementation and underscores the importance of continued assessment of effects in the years to come as the program matures.

3. While continuing to scale the number of community schools, New York City is also using key lessons learned from the community school strategy to guide its approach toward supporting students living in temporary housing and decreasing chronic absenteeism across the NYC school system.

APPENDIX B

Core Capacity Score Item Summary and Distributions

Tables B.1 through B.4 summarize the items from the RAND-developed survey that were used to calculate each of the four core capacity measures. In Figure B.1, we display the distribution of each of the resultant scores, with the box representing the interquartile range, the central horizontal bar representing the median, the vertical whisker bars representing the range between the 5th and 95th percentiles, and the dots representing outliers that fall outside of that range. See Johnston et al. (2017) for more information on these indexes. Table B.1 Continuous Improvement Capacity Score Survey Item PCA Weight The principal and community school team both attend the weekly data meeting. 0.329 Our community school team uses the NV Data Sorter to assess progress against benchmarks and goals for individual students. 0.506 Our community school team uses the NV Data Sorter to assess progress against benchmarks for the school. 0.493 Our community school team uses data to determine whether our services and programs are meeting the needs of the student body. 0.460 Our community school has clear, data-driven benchmarks that guide continuous improvement across school and CBO. 0.426 NOTE: Continuous Improvement (Cronbach alpha = 0.800). Table B.2 Coordination Capacity Score Survey Item PCA Weight Expanded learning time is available to meet students' needs before and/or after school. 0.250 Community school programs are available during the summer. 0.321 Teachers successfully interact with staff from our lead CBO partner. 0.412 Teachers are aware of the services that are available to students through the lead CBO partner. 0.413 Teachers and staff in our school are aware that the Tier 1 (universal), Tier 2 (selective), and Tier 3 (targeted) mental health programs and services exist.
0.376 All community partners and CBOs (in and outside of school building) meet monthly with the CSD to coordinate and assign services across students in building. 0.404 There is a communication and student referral system implemented among school and CBO staff. 0.369 Our community school’s expanded learning time programs use rigorous, standards-based curricula. 0.233 NOTE: Coordination (Cronbach alpha = 0.780). 73 74 Illustrating the Promise of Community Schools Table B.3 Connectedness Capacity Score Survey Item PCA Weight Our school and CBO developed a shared and strategy for addressing social, emotional, and behavioral problems. 0.406 As a result of our community school partnerships and programs, our school has a more positive and welcoming environment that is conducive to learning. 0.457 We have a culture of connectedness and belonging for staff, students, and families. 0.420 Students who are at risk of being chronically absent are quickly identified (i.e., within one to two weeks of initial absence). 0.338 Students at risk of being chronically absent are quickly assigned a success mentor (i.e., within one to two weeks of initial absence). 0.265 Students are aware of school-based mental health services provided by the partner CBO. 0.392 Families are receptive to opportunities for their children to participate in school-based programs and services that support their social, emotional, and behavioral needs. 0.332 NOTE: Connectedness (Cronbach alpha = 0.755). Table B.4 Collaboration Capacity Score Survey Item PCA Weight The principal and CSD have established a trusting relationship. 0.300 School and CBO staff attend trainings together. 0.290 The principal, members of the SLT, and CSD worked together to create the Renewal School Comprehensive Educational Plan or Community School Work Plan (for AIDP schools). 0.240 The principal, CSD, and SLT collaborated in creating the community school budget. 0.240 The CSD and CBO staff have a visible presence throughout the school day. 0.290 CBO services align with our school’s vision, priorities, and procedures. 0.330 Universal, selective, and targeted mental health programs and services are provided collaboratively by CBO staff, guidance counselors, social workers, teachers, and/or other school or district staff. 0.220 Teachers view the efforts of community partners as supporting their work as educators. 0.31 Our community school has implemented systems for communication with families on a weekly basis (or more frequently) around student attendance, achievement, and behavior. 0.24 As a result of our community school partnerships and programs, families come to the school more frequently. 0.22 School administrators, teachers, parents, family members, CBO staff, and community partners trust each other. 0.31 Families have input in planning for services related to child and family mental health needs. 0.250 Families have a say in decisions and plans related to school improvement. 0.290 NOTE: Collaboration (Cronbach alpha = 0.847). Core Capacity Score Item Summary and Distributions Figure B.1 Box-and-Whisker Plot of Core Capacity Index Scores 4 3 2 1 Score 0 –1 –2 –3 •• –4 •• • Connectedness Continuous improvement –5 6 –7 • • Coordination Collaboration Core Capacity 75 APPENDIX C Data Sources for Mental Health Implementation Profiles Table C.1 outlines the data points that were used to create the mental health implementation profiles. 
The variables are organized into conceptual groups that are informed by implementation literature and information shared by NYC’s Office of School Health. We considered various data on the scope of services within schools using mental health provider data, school assessments conducted by SMHMs, and data from the RAND-administered school leader survey. Table C.1 Domains and Data Points for Mental Health Implementation Profiles Domain Implementation of the threetiered model Data Point Indicator of at least one Tier 1 service (PD) Indicator of at least one Tier 2 service (PD) Indicator of at least one Tier 3 service (PD) Indicator of three tiers present (PD) To what extent does the school implement universal mental health services (3-point assessment) To what extent does the school implement selective mental health services (3-point assessment) To what extent does the school implement targeted mental health services (3-point assessment) Service delivery, dosage, and reach Number of types of services available at each tier (PD) Number of sessions of services at each tier (PD) Number of services targeting teachers at each tier (PD) Number of services targeting parents at each tier (PD) Number of services targeting students at each tier (PD) Mental health provider and service integration Number of CBOs providing mental health service (PD) Number of mental health providers (3-point assessment) 77 78 Illustrating the Promise of Community Schools Table C.1—Continued Domain Parent engagement Data Point Our community school has implemented systems for communication with families on a weekly basis (or more frequently) around student attendance, achievement, and behavior. (SL Survey) As a result of our community school partnerships and programs, families come to the school more frequently (SL Survey) School administrators, teachers, parents, family members, CBO staff, and community partners trust each other (SL Survey) Families have input in planning for services related to child and family mental health needs (SL Survey) Families have a say in decisions and plans related to school improvement. (SL Survey) Tier 1 services targeting parents (e.g., campaigns, promotions, workshops) (PD) Tier 2 services targeting parents (e.g., assessments, case management, collateral contacts, de-escalation, workshops) (PD) Tier 3 services targeting parents (e.g., assessments, collateral contacts, crisis deescalation, group counseling, individual counseling, referrals) (PD) To what extent does the school involve parents/caregivers when providing mental health services at the school? (3-point assessment) To what extent do the school mental health providers involve parents/caregivers when administering mental health services at the school? (3-point assessment) Awareness and buy-in To what extent does the school internally showcase and share knowledge on the mental health component of the community school model? (3-point assessment) To what extent does the lead CBO showcase and share knowledge on the mental health component of the community school model? (3-point assessment) To what extent is there awareness of the crisis protocols that are in place? (3-point assessment) To what extent is the CSD engaged and supportive of the integration of mental health services within the school? (3-point assessment) To what extent is school leadership engaged and supportive of the integration of mental health services within the school? 
(3-point assessment) Were crisis protocols established as a collaborative effort between the school, mental health providers, and the CBOs? (3-point assessment) NOTE: PD = Mental health provider data. 3-point assessment = three-point assessment conducted by an SMHM in spring 2017. SL Survey = RAND-administered survey in fall 2016 (93% of schools surveyed [N = 110 of 118] had at least one school leader complete the online survey).

APPENDIX D

Mental Health Implementation Profile Estimation Results

In this appendix, we present the results of the development of the mental health implementation profiles. First, we describe the results of the creation of the implementation profiles via cluster analysis. Second, we present the results showing the extent to which impact estimates (as presented in Chapter Four) varied based on schools' implementation profiles.

Determining the Appropriate Number of Clusters

The first step of cluster analysis is to determine the appropriate number of clusters. To do this, we iterated over several versions of the process, each with a different number of predetermined clusters. To assess model fit and compare cluster structures, we calculated the average "silhouette width" of each cluster structure, with the goal of identifying the cluster structure with the largest silhouette width. Silhouette values were calculated for each observation: Values near one mean that the observation was well placed in its cluster; values near zero mean that the observation might really belong in some other cluster. As we illustrate in Figure D.1, we found that a two-cluster solution had the best fit for our implementation profiles.

Cluster Comparison of Implementation Profile Groups

The two clusters that were estimated using the PAM strategy had many notable differences when comparing median and mean responses on the individual data points that went into the analysis. We summarize the following overall differences, by domain.1
• Implementation of the three-tiered approach: Both clusters had very high implementation, but cluster 1 had nearly universal implementation of all three tiers.
• Service delivery, dosage, and reach: Cluster 1 had more reported types of services and more documented "sessions" of actual service provision.
• Mental health provider and service integration: Minimal differences were noted between the two clusters, other than that cluster 1 schools were more likely to have more than one mental health service provider.
• Parent engagement: There were no differences in the school leader survey items, but cluster 1 schools had more parent-oriented mental health services.
• Awareness and buy-in: Cluster 1 schools had higher average scores on all three-point assessment items, but none of the differences were statistically significant.
1 Full item-by-item comparisons are available upon request.

Figure D.1 Cluster Analysis Structure Comparison [figure: average silhouette width (y-axis, approximately 0.25 to 0.55) plotted against the number of clusters (x-axis, one through six)]

Due to the general trend of cluster 1 schools experiencing higher levels of program implementation, we hereafter refer to this group as the high-implementation cluster, with cluster 2 being the low-implementation cluster.

Demographic Comparison of Implementation Profile Groups

In Table D.1, we present a comparison of the demographics of the schools in the high-implementation cluster and the low-implementation cluster.
The schools had no statistically significant differences other than school size (cluster 1 schools were much larger than cluster 2 schools) and the percentage of students who were Asian (cluster 1 schools had slightly more Asian students). The size difference between schools in the high- and low-implementation clusters was likely the main driver of differences in the various data points. To summarize, schools in cluster 1 tended to have program implementation, as documented through the provider data and three-point assessment. Mental Health Implementation Profile Estimation Results Table D.1 Implementation Profile Demographic Comparison High Implementation Low Implementation 31 77 Elementary school 29% 16% Kindergarten through grade 8 3% 10% Middle school 29% 27% High school 29% 16% Middle school/high school 6% 5% Transfer schoola 3% 4% Enrollment (mean) 745.7 471.1 Female (mean) 47% 48% White (mean) 2% 2% Black (mean) 37% 42% Hispanic (mean) 53% 53% Asian (mean) 6% 2% Other (mean) 1% 1% Students with disabilities 23% 24% English language learners 16% 18% Percentage of free or reduced-priced lunch 91% 91% n= Grade level a Transfer school is a specialty school for older students at risk of permanently dropping out. 81 References Adams, C. M., The Community School Effect: Evidence from an Evaluation of the Tulsa Area Community School Initiative, Tulsa: University of Oklahoma, The Oklahoma Center for Educational Policy, 2010. As of October 18, 2019: http://www.csctulsa.org/files/file/Achievement%20Evidence%20from%20an%20Evaluation%20of%20 TACSI.pdf Arimura, T., and C. Corter, “School-Based Integrated Early Childhood Programs: Impact on the Well-Being of Children and Parents,” Interaction, Vol. 20, No. 1, 2010, pp. 23–32. Báez, J. C., K. J. Renshaw, L. E. M. Bachman, D. Kim, V. D. Smith, and R. E. Stafford, “Understanding the Necessity of Trauma-Informed Care in Community Schools: A Mixed-Methods Program Evaluation,” Children and Schools, Vol. 41, No. 2, 2019, pp. 101–110. Balfanz, R., and V. Byrnes, Meeting the Challenge of Combating Chronic Absenteeism: Impact of the NYC Mayor’s Interagency Task Force on Chronic Absenteeism and School Attendance and Its Implications for Other Cities, Baltimore, Md.: Johns Hopkins School of Education, 2013. As of October 18, 2019: https://eric.ed.gov/?id=ED544570 Belay, Kassa, Nicole Mader, and Laura Miller, Scaling the Community School Strategy in New York City: A Systems-Building Guide, New York: Center for New York City Affairs, September 2014. Biag, Manuelito, and S. Castrechini, “Coordinated Strategies to Help the Whole Child: Examining the Contributions of Full-Service Community Schools,” Journal of Education for Students Placed at Risk (JESPAR), Vol. 21, No. 3, July 2, 2016, pp. 157–173. As of October 18, 2019: https://doi.org/10.1080/10824669.2016.1172231 Bireda, Saba, “A Look at Community Schools,” Center for American Progress, October 2009. As of October 18, 2019: https://www.americanprogress.org/issues/education-k-12/ reports/2009/10/28/6754/a-look-at-community-schools/ Blank, Martin J., Atelia Melaville, and Reuben Jacobson, “Achieving Results Through Community School Partnerships,” Center for American Progress, January 18, 2012. As of June 10, 2019: https://www.americanprogress.org/issues/education-k-12/reports/2012/01/18/10987/ achieving-results-through-community-school-partnerships/ Blank, Martin J., Atelia Melaville, and B. 
Shah, Making the Difference: Research and Practice in Community Schools, Washington, D.C.: Coalition for Community Schools, 2003. Blank, Martin J., and L. Villarreal, “Where It All Comes Together: How Partnerships Connect Communities and Schools,” American Educator, Vol. 39, No. 3, 2015, pp. 4–9. Bronstein, L., Susan Mason, and Jane Quinn, School-Linked Services: Promoting Equity for Children, Families and Communities, New York: Columbia University Press, 2016. Bryk, A. S., P. B. Sebring, E. Allensworth, J. Q. Easton, and S. Luppescu,  Organizing Schools for Improvement: Lessons from Chicago, Chicago: University of Chicago Press, 2010. Castrechini, S., and R. A. London, Positive Student Outcomes in Community Schools, Washington, D.C.: Center for American Progress, 2012. As of June 8, 2019: https://cdn.americanprogress.org/wp-content/uploads/issues/2012/02/pdf/positive_student_outcomes.pdf 83 84 Illustrating the Promise of Community Schools Coalition for Community Schools, “What Is a Community School?” webpage, undated. As of June 5, 2017: http://www.communityschools.org/aboutschools/what_is_a_community_school.aspx Comer, J. P., and C. Emmons, “The Research Program of the Yale Child Study Center School Development Program,” Journal of Negro Education, Vol. 75, 2006. Cummings, C., A. Dyson, and L. Todd, Beyond the School Gates: Can Full-Service and Extended Schools Overcome Disadvantage? New York: Taylor and Francis Group, 2011. Daniel, J., K. H. Quartz, and J. Oakes, “Teaching in Community Schools: Creating Conditions for Deeper Learning,” Review of Research in Education, Vol. 43, No. 1, 2019, pp. 453–480. Dearing, E., M. E. Walsh, E. Sibley, T. J. Lee-St. John, C. Foley, and A. E. Raczek, “Can Community and School‐Based Supports Improve the Achievement of First‐Generation Immigrant Children Attending High‐ Poverty Schools?” Child Development, Vol. 87, No. 3, 2016, pp. 883–897. Dobbie, W., and R. G. Fryer, Jr., “Are High-Quality Schools Enough to Increase Achievement Among the Poor? Evidence from the Harlem Children’s Zone,” American Economic Journal: Applied Economics, Vol. 3, No. 3, 2011, pp. 158–187. Dryfoos, Joy G., “Centers of Hope,” Educational Leadership, Vol. 65, 2008. Dryfoos, Joy G., and Sue Maguire, Inside Full Service Community Schools, Thousand Oaks, Calif.: Corwin, 2002. Durham, R. E., and F. Connolly, Baltimore Community Schools: Promise and Progress, Baltimore, Md.: Baltimore Education Research Consortium, 2016. Fehrer, K., and J. Leos-Urbel, “We’re One Team: Examining Community School Implementation Strategies in Oakland,” Education Sciences, Vol. 6, No. 3, 2016. Fox, L., J. Carta, P. S. Strain, G. Dunlap, and M. L. Hemmeter, Response to Intervention and the Pyramid Model, Tampa, Fla.: University of South Florida: Technical Assistance Center on Social Emotional Intervention for Young Children, 2009. Fox, L., G. Dunlap, M. L. Hemmeter, G. E. Joseph, and P. S. Strain, “The Teaching Pyramid: A Model for Supporting Social Competence and Preventing Challenging Behavior in Young Children,” Young Children, Vol. 58, No. 4, 2003. Galindo, Claudia, Mavis Sanders, and Yolanda Abel, “Transforming Educational Experiences in Low-Income Communities: A Qualitative Case Study of Social Capital in a Full-Service Community School,” American Educational Research Journal, Vol. 54, No. 1, April 11, 2017. Hancock, P., T. Cooper, and S. Bahn, “Evaluation of the Integrated Services Pilot Program from Western Australia,” Evaluation and Program Planning, Vol. 32, 2009. 
Harlem Children’s Zone, homepage, undated. As of June 10, 2019: https://hcz.org/ HCZ—See Harlem Children’s Zone. ICF International, Communities in Schools National Evaluation, Vol. 4: Randomized Controlled Trial Study Jacksonville, Florida, Fairfax, Va., 2010a. ICF International, Communities in Schools National Evaluation, Vol. 5: Randomized Controlled Trial Study Austin, Texas, Fairfax, Va., 2010b. ICF International, Communities in Schools National Evaluation, Vol. 6: Randomized Controlled Trial Study Wichita, Kansas, Fairfax, Va., 2010c. Jacobson, Reuben, “The Community Schools Movement: Emergence and Growth Trends,” in Reuben Jacobson and JoAnne Ferrara, eds., Community Schools: People and Places Transforming Education and Communities, Lanham, Md.: Rowman and Littlefield, 2019, pp. 1–20. Jacobson, Reuben, and Martin J. Blank, “A Framework for More and Better Learning Through Community Schools Partnerships,” Coalition for Community Schools, September 2015. As of June 10, 2019: http://www.communityschools.org/betterlearning/ References 85 Jenkins, Della, and Mark Duffy, “Community Schools in Practice: Research on Implementation and Impact, a PACER Policy Brief,” Research for Action, January 2016. As of October 18, 2019: https://eric.ed.gov/?id=ED570123 Johnston, William R., Celia J. Gomez, Lisa Sontag-Padilla, Lea Xenakis, and Brent Anderson, Developing Community Schools at Scale: Implementation of the New York City Community Schools Initiative, Santa Monica, Calif.: RAND Corporation, RR-2100-NYCCEO, 2017. As of October 10, 2019: https://www.rand.org/pubs/research_reports/RR2100.html Kalafat, J., R. J. Illback, and D. Sanders, Jr., “The Relationship Between Implementation Fidelity and Educational Outcomes in a School-Based Family Support Program: Development of a Model for Evaluating Multidimensional Full-Service Programs,” Evaluation and Program Planning, Vol. 30, No. 2, May 2007, pp. 136–148. Kase, Courtney, Sharon Hoover, Gina Boyd, Kristina D. West, Joel Dubenitz, Pamala A. Trivedi, Hilary J. Peterson, and Bradley D. Stein, “Educational Outcomes Associated with School Behavioral Health Interventions: A Review of the Literature,” Journal of School Health, Vol. 87, No. 7, 2017, pp. 554–562. As of October 18, 2019: https://onlinelibrary.wiley.com/doi/abs/10.1111/josh.12524 Kaufman, L., and P. J. Rousseeuw, “Partitioning Around Medoids (Program PAM),” in P. J. Rousseeuw and L. Kaufman, eds., Finding Groups in Data, Hoboken, N.J.: John Wiley and Sons, 1990, pp. 68–125. As of October 18, 2019: https://onlinelibrary.wiley.com/doi/abs/10.1002/9780470316801.ch2 Kemple, J. J., C. M. Herlihy, and T. J. Smith, Making Progress Toward Graduation: Evidence from the Talent Development High School Model, New York: MDRC, 2005. Kirp, D. L., “The Community School Comes of Age,” New York Times, January 10, 2019. As of October 18, 2019: https://www.nytimes.com/2019/01/10/opinion/community-school-new-york.html Knopf, John A., Ramona K. C. Finnie, Yinan Peng, Robert A. Hahn, Benedict I. Truman, Mary VernonSmiley, Veda C. Johnson, Robert L. Johnson, Jonathan E. Fielding, Carles Muntaner, Pete C. Hunt, Camara Phyllis Jones, and Mindy T. Fullilove, “School-Based Health Centers to Advance Health Equity: A Community Guide Systematic Review,” American Journal of Preventive Medicine, Vol. 51, No. 1, July 1, 2016, pp. 114–126. 
As of October 18, 2019: http://www.sciencedirect.com/science/article/pii/S0749379716000350 LaFrance Associates, LLC, Comprehensive Evaluation of the Full-Service Community Schools Model in Iowa: Harding Middle School and Moulton Extended Learning Center, San Francisco: Milton S. Eisenhower Foundation, September 2005. Maier, A., J. Daniel, J. Oakes, and L. Lam, Community Schools as an Effective School Improvement Strategy: A Review of the Evidence, Palo Alto, Calif.: Learning Policy Institute, 2017. Mapp, Karen L., and Paul J. Kuttner, Partners in Education: A Dual Capacity Building Framework for FamilySchool Partnerships, Washington, D.C.: American Institute for Research, 2013. As of October 18, 2019: www.sedl.org Moore, K. A., S. Caal, R. Carney, L. Lippman, W. Li, K. Muenks, D. Murphey, D. Princiotta, A. N. Ramirez, A. Rojas, R. Ryberg, H. Schmitz, B. Stratford, and M. A. Terzian, Making the Grade: Assessing the Evidence for Integrated Student Supports, Bethesda, Md.: Child Trends, February 2014. As of October 18, 2019: https://www.childtrends.org/publications/making-the-grade-assessing-the-evidence-for-integrated-studentsupports/ National Center for Community Schools, “How Many Community Schools Are There in the United States?” webpage, undated. As of November 20, 2019: https://www.nccs.org/block/how-many-community-schools-are-there-united-states 86 Illustrating the Promise of Community Schools Nauer, K., N. Mader, G. Robinson, and T. Jacobs, A Better Picture of Poverty: What Chronic Absenteeism and Risk Load Reveal About NYC’s Lowest-Income Elementary Schools, New York: New York Center for New York City Affairs, Milano School of International Affairs, Management, and Urban Policy, November 2014. As of October 18, 2019: https://www.attendanceworks.org/wp-content/uploads/2017/06/BetterPictureofPoverty_PA_FINAL_001.pdf New York City Department of Education, “Framework for Great Schools,” webpage, undated-a. As of October 18, 2019: https://www.schools.nyc.gov/about-us/vision-and-mission/framework-for-great-schools New York City Department of Education, “School Quality,” webpage, undated-b. As of October 18, 2019: https://www.schools.nyc.gov/about-us/reports/school-quality New York City Community Schools, “Community Schools Strategic Plan,” webpage, undated. As of October 18, 2019: http://www1.nyc.gov/site/communityschools/plan/plan.page New York City Independent Budget Office, “How Much Funding Is Allocated to the Community Schools Program and for Which Services?” December 19, 2018. As of November 20, 2019: https://ibo.nyc.ny.us/iboreports/community-school-funding-btn-2018.pdf Oakes, J., A. Maier, and J. Daniel, Community Schools: An Evidence-Based Strategy for Equitable School Improvement, Boulder, Colo.: National Education Policy Center, June 2017. As of October 18, 2019: http://nepc.colorado.edu/publication/equitable-community-schools Olson, L. S., A First Look at Community Schools in Baltimore, Baltimore, Md.: Baltimore Education Research Consortium, December 2014. As of October 18, 2019: http://baltimore-berc.org/wp-content/uploads/2014/12/CommunitySchoolsReportDec2014.pdf Opper, Isaac M., William R. Johnston, John Engberg, and Lea Xenakis, Assessing the Short-Term Impact of the New York City Renewal Schools Program, Santa Monica, Calif.: RAND Corporation, WR-1303-NYCDOE, 2019. As of August 16, 2019: https://www.rand.org/pubs/working_papers/WR1303.html Peck, L. R., “Using Cluster Analysis in Program Evaluation,” Evaluation Review, Vol. 29, No. 2, 2005, pp. 178–196. Quartz, K. 
H., “Community Schools Score Key Victory in L.A. Teachers Strike,” The Conversation, January 29, 2019. As of June 10, 2019: http://theconversation.com/community-schools-score-key-victory-in-la-teachers-strike-110149 Reardon, Sean F., The Widening Academic Achievement Gap Between the Rich and the Poor: New Evidence and Possible Explanations, Stanford University, July 2011. Sanchez, Amanda L., Danielle Cornacchio, Bridget Poznanski, Alejandra M. Golik, Tommy Chou, and Jonathan S. Comer, “The Effectiveness of School-Based Mental Health Services for Elementary-Aged Children: A Meta-Analysis,” Journal of the American Academy of Child and Adolescent Psychiatry, Vol. 57, No. 3, 2018, pp. 153–165. As of February 22, 2019: https://doi.org/10.1016/j.jaac.2017.11.022 Sanders, Mavis, Claudia Galindo, and Dante DeTablan, “Leadership for Collaboration: Exploring How Community School Coordinators Advance the Goals of Full-Service Community Schools,” Children and Schools, Vol. 41, No. 2, 2019, pp. 89–100. As of July 19, 2019: https://doi.org/10.1093/cs/cdz006 Smith, J. S., J. A. Anderson, and A. K. Abell, “Preliminary Evaluation of the Full-Purpose Partnership Schoolwide Model,” Preventing School Failure, Vol. 53, 2008. Steen, S., and Pedro A. Noguera, “A Broader and Bolder Approach to School Reform: Expanded Partnership Roles for School Counselors,” Professional School Counseling, Vol. 14, No. 1, February 15, 2018. Todd, P.E., “Matching Estimators,” in S. N. Durlauf, and L. E. Blue, eds., Microeconometrics: The New Palgrave Economics Collection, London: Palgrave Macmillan, 2010. References 87 U.S. Department of Education, “Full-Service Community Schools Program,” webpage, 2014. As of October 18, 2019: http://www2.ed.gov/programs/communityschools/index.html Walsh, M. E., G. F. Madaus, A. E. Raczek, E. Dearing, C. Foley, C. An, T. J. Lee-St. John, and A. Beaton, “A New Model for Student Support in High‐Poverty Urban Elementary Schools: Effects on Elementary and Middle School Academic Outcomes,” American Educational Research Journal, Vol. 51, No. 4, 2014, pp. 704–737. Warren, Mark, “Communities and Schools: A New View of Urban Education Reform,” Harvard Educational Review, Vol. 75, No. 2, July 2005.