November 3, 2017

Mike Kirst, President
California State Board of Education
1430 N Street, Suite 5111
Sacramento, CA 95814

Via email only (sbe@cde.ca.gov)

LCFF Equity Coalition Comments re:
• Developing an Integrated Local, State, and Federal Accountability and Continuous Improvement System etc. – November 2017 Board Meeting Item 3;
• Update on Development of CA's System of Support for LEAs and Schools – November 2017 Board Meeting Item 4; and
• Update on Issues Re California's Implementation of ESSA – November 2017 Board Meeting Item 5.

Dear President Kirst:

We represent a coalition of civil rights, advocacy, community, parent, student and other organizations who have worked diligently on the passage and implementation of the Local Control Funding Formula (LCFF). LCFF creates an historic opportunity to focus resources on helping California's neediest students overcome the barriers they face in closing the achievement gap and graduating college and career ready. It also promises a new level of transparency and local engagement for parents, students, and community members in the design of their local schools. As you know, in an effort to give life to these objectives, we have commented jointly multiple times over the last four years on the State Board of Education's LCFF regulatory proposals and accountability system items. We offer the following comments and recommendations concerning Items 3, 4, and 5.

I. Item 3 – Developing an Integrated Local, State and Federal Accountability System (Attachment 1: Altering the Academic Indicator)

Staff have recommended a number of changes, some of them quite significant, to how cut scores and performance color designations are established for the Academic Indicator. Some of the adjustments are reasonable, such as defining "Maintained" along the Change dimension to represent a decline or increase of 3 points (as opposed to the current definition, which is actually skewed toward an increase). We also appreciate the proposed modification to ensure that the Math indicator (like the ELA indicator already does) would operate so that no district or school receives a High status designation unless its average outcomes are at least at the "Standard Met" Achievement Level.

However, we have serious reservations, and we think the Board should too, about supporting the lowering of Math cut scores and the re-labeling of color cells in the ELA and Math 5x5 grids to qualify more schools as Green and Blue and fewer as Red than would be the case under the current cut scores and color designations. It appears to subvert the accountability system, and risks undermining public confidence, when the State so significantly alters the rubric by which performance is judged after disappointing test scores are released. What is more, all of this has come at the last minute, with very little notice to stakeholders and the public, a rushed closed-door meeting with "the Technical Design Group (TDG)," and no time to run analyses on how these proposed changes affect student subgroup performance. We should not be changing expectations for schools and districts without more data and understanding; as such, the Equity Coalition cannot support the proposed modifications to the Academic Indicator at this time.

We appreciate the volatility that would result were the current cut scores and color designations used with no modifications. However, we do not think the answer is to establish the significant negative precedent of altering the rubrics to produce the result the State finds more palatable.
In this instance, testing experts have argued that some of the problem may lie with the SBAC test itself. Given that SBAC scores "dropped like a stone" in most SBAC states, there very well may be an issue with the test that must be disentangled from any conclusions we might draw about student learning. Before modifying the accountability system so significantly, the State should have the TDG and its experts examine the SBAC test more closely.

Finally, we would further urge the Board to direct staff to make meetings of the Technical Design Group open to the public so that stakeholders can be aware of and weigh in on such significant conversations in the future, rather than "discover" them for the first time in a memo on an action item to the Board on the eve of a Board meeting.

II. Item 3 – Developing an Integrated Local, State and Federal Accountability System (Attachment 2: Chronic Absenteeism)

We are pleased to see progress on the Chronic Absenteeism Indicator and to hear from CDE that the data underlying the indicator has proved to be of good quality. We think more should be reported in the Dashboard, however, than simply a link to data on DataQuest. While the latter has more robust data available than can reasonably be included in the Dashboard, simply reporting a weblink will often not prove particularly transparent or effective in terms of immediate transparency. This is especially so for parents who may be reading a printed version of the Dashboard and/or do not have the time or immediate inclination to wade through a set of reports from DataQuest. The Dashboard released earlier this year, which included the first-time release of data for the College and Career Indicator (status only, without color-coding), offers strong precedent for including comparable data on chronic absence directly in the Dashboard.

Accordingly, we recommend that, along with the link to DataQuest, a topline summary of chronic absenteeism also be reported for the LEA or school. This should report, at minimum, the percent of students who were chronically absent; even better would be data similar to the chart in Attachment 2 at page 2, identifying the count of students in each of four absence categories, from Satisfactory Attendance to Severely Chronic Absent.

III. Item 3 – Developing an Integrated Local, State and Federal Accountability System (Attachment 3: School Climate)

As CDE's memo lays out, the Board has been discussing how to measure School Climate through the use of surveys since June 2016. The School Conditions and Climate Work Group (CCWG) has worked diligently for over a year reviewing and incorporating feedback from a broad array of stakeholders, updating the Board and public along the way. The Equity Coalition strongly urges the Board to show some urgency on this priority and:

1. Fully adopt the CCWG's Recommendation Framework by March 2018 for implementation in the 2018-19 Dashboard, including the recommendation that LEAs administer a school conditions and climate survey annually to students, parents/guardians and school staff.

Requiring school climate surveys annually is consistent with LCFF's intent and structure. LEAs are required to focus their LCAPs on annual outcomes and modifications. LCFF requires annual updates of 3-year plans, the LCAP template calls for annual goals and actions for each statutory metric, and LCFF holds districts accountable for annual changes in performance.
The Board sends a powerful negative message that school climate is not as important as the other state priorities when it allows LEAs to measure it only every other year. See the attached brief outlining the benefits of annual school climate surveys.

2. Incorporate the framework into California's system of support by:
a. directing that it be integrated with tools, resources and supports about school conditions and climate; and
b. directing that capacity building among all stakeholders, including students, parents and teachers, be part of any plan for continuous improvement.

IV. Item 3 – Developing an Integrated Local, State and Federal Accountability System (Attachment 4: Broad Course Access)

We very much appreciate your staff's efforts in moving forward the state priority to measure students' access to a broad course of study as required by LCFF Priority 7. This indicator is at the core of equity. Having access to a rich education across all subject matters significantly affects a student's concept development, language and overall academic achievement. This access must begin in transitional kindergarten (TK) and continue consistently through graduation in 12th grade.

As an initial matter, we urge the Board not to automatically adopt the staff's recommendation to make Priority 7 solely a local indicator. We think it worth exploring development of a hybrid state/local indicator on broad course access. We note that CALPADS has data that could support a statewide indicator on Priority 7, at least for core courses in middle and high school. This data meets the Board and staff's requirements for data needed to support a state indicator: it is valid and reliable, comparable across the state, and capable of being disaggregated by student subgroups. Accordingly, course access information for the middle and high school grades could and should be reported (and compared) as a state indicator, and a local indicator should be used for the elementary grades, where individual inquiries would need to be made to determine the breadth of course access available to all students in self-contained, elementary school classrooms.

As to the proposed shape of the local indicator, we submitted comments to staff after our most recent stakeholder input session on this topic and were disappointed to see that none were incorporated into the current proposal. We repeat that the self-reflection tool, which is basically three open-ended "softball" questions, falls short of providing LEAs and local stakeholders with the tools and guidance needed to meaningfully assess course access status and identify disparities. A meaningful self-reflection tool should include guiding questions, considerations, and analyses that districts should explore with respect to a broad course of study, or provide spreadsheets and data tools with visualizations that identify disparities by grade level, by subject matter, and for unduplicated students and other subgroups. Furthermore, a meaningful tool should clearly distinguish between single-subject courses (which are essentially tracked in CALPADS) and elementary and middle school self-contained classrooms, which must be individually analyzed by the LEA to ascertain the extent to which teachers are fully providing access to the required course of study at those grade levels and to identify disparities in access.
V. Item 4 – Update on Developing California's System of Support for LEAs and Schools

We appreciate some of the further thinking and articulation of actions presented in the memo on the state's emerging system of support and technical assistance for LEAs and schools in need of improvement. In particular, as previously noted, we support the notion that one of the key elements of the System of Support in this first year of technical assistance is the undertaking by LEAs and local stakeholders of a needs assessment and root cause analysis of student outcomes. We think that such an analysis is an essential element in any improvement plan and, by necessity, should be undertaken early in the process.

One major recommendation at this stage with respect to the described system of support continues to be to urge the Board to require that the system explicitly call first for an analysis of the capacity for local change. We agree that prior state and federal program improvement plans have largely been unsuccessful, in no small part because such improvement plans, however sound, were imposed upon local districts and schools by external entities without regard to local capability, willingness or ownership of plan implementation. While much in the Item suggests that the new support system could avoid such a pitfall by potentially surfacing such issues in a needs/root cause analysis, we believe it is essential that the system be explicitly designed to do so. Accordingly, we recommend that the State Board direct staff, in future iterations, to add a separate and independent initial step in the improvement process that calls for an analysis of the local district's or school's capacity to implement a major improvement plan.

Before a local entity can implement an improvement plan successfully (whether a new coherent instructional program, a new system of supports for high-need students, new school climate and engagement initiatives, or some other measures), the district or school unit must be sufficiently functional to embrace and implement the initiatives effectively. Sometimes key dysfunctions, e.g., ineffective school leadership, low staff morale, a lack of staff cohesion or of parent and student engagement, or severely adverse working conditions, will need to be identified and corrected before new substantive initiatives can take hold. If there is little hope that the local staff and community can work effectively together to agree on and implement any type of improvement plan, whatever may eventually be put forward will have little chance of success.

Regarding the individuals in the entities providing support to LEAs and schools, we believe it is essential that they be required to demonstrate experience and expertise in working with, and getting improved results for, the specific student group whose performance has triggered support. We have also been surprised that there has been virtually no mention of the new English Learner Roadmap in the stated plans for the new system of support for troubled schools, a good number of which will have issues with EL outcomes. As such, we recommend that the State Board direct staff to incorporate explicit instructions in the system of support that utilize and rely on the EL Roadmap when addressing issues of EL underperformance.
This meeting's memo on the System of Support lays out, for the first time, five illustrative examples of school districts that could find themselves identified for support and assistance. While the illustrations are helpful as far as they go, it would prove even more useful if the Board could clarify the types of criteria or performance whose increasing severity qualifies an LEA for increasing levels of assistance, for example, the number of priorities on which an LEA is Red or Orange, the number of low-performing subgroups, or the number of years of underperformance.

Finally, we strongly recommend that the State Board and staff establish, within the System of Support, clear procedures and protocols by which LEAs and schools will be supported in working through the change process with local stakeholders. It will not be enough for the State to recommend or even direct LEAs to include local stakeholders; the State Board and its partner agencies need to provide LEAs with the roadmap and the support to work effectively through the change process with the local community. Only by ensuring this can the State Board ensure that a whole-school or whole-district process takes place that builds the local capacity and agency needed to make continuous improvement a reality.

VI. Item #5: Implementation of ESSA: School-site per-pupil spending reporting

We believe there should be more guidance for LEAs on the LCAP Federal Addendum around the supplement-not-supplant rule. We recommend that for each of the Title funding sources – Title I A, Title I D, Title II, Title III, and Title VI – there be a box that requires an explanation of how the use of funding meets the federal supplement-not-supplant requirements of ESSA.

Federal law has added a new fiscal reporting requirement that will begin with the 2018-19 school year: the state will report, for every school in the state, per-pupil expenditures for personnel and non-personnel costs, disaggregated by funding source. This new fiscal data could be a powerful tool to support fiscal transparency and reflection in the LCAP and LCAP Federal Addendum regarding how a district decides to invest future LCFF supplemental and concentration funds and federal funds. Education leaders, community groups and other stakeholders will use the reporting to make comparisons across schools within a district and across districts. But there is a role for the state to play in helping ensure that these comparisons are meaningful and helpful to the self-reflection process. Absent guidance and leadership from the state on how this data should be collected and reported to the state, a thousand school districts will each have to invest staff time and resources to figure out how to report for their schools. And, unfortunately, the results are not likely to be comparable across districts even though stakeholders are likely to make such comparisons. We encourage the state to figure this out once and not leave it to locals to figure it out a thousand times.

To that end, we recommend that the Board create a working group of interested stakeholders to work with CDE, including, for example, members of the equity community, FCMAT, CCSESA's Business and Administration Steering Committee, ACSA, CSBA, CASBO, parents and community groups, to support the CDE in developing guidance to the field on meeting this new reporting requirement and submitting the data.

* * *

Thank you for the opportunity to comment.
We look forward to continuing to work with the State Board of Education to realize the full promise of LCFF for our neediest students.

Sincerely,

John T. Affeldt
Managing Attorney
Public Advocates Inc.

Kevine Boggess
Director of Policy
Coleman Advocates for Children and Youth

Shelley Spiegel-Coleman
Executive Director
Californians Together

Jan Gustafson-Corea
CEO
California Association for Bilingual Education (CABE)

Oscar E. Cruz
President & CEO
Families in Schools

Shimica Gaskins
Executive Director
Children's Defense Fund – California

Sylvia Torres-Guillén
Director of Education Equity
ACLU of California

Taryn Ishida
Executive Director
Californians for Justice

Brian Lee
California State Director
Fight Crime: Invest in Kids

Chris Norwood
Bay Area Tutoring Association

Luis Santana
Executive Director
Reading and Beyond

Ryan J. Smith
Executive Director
The Education Trust-West

Samantha Tran
Senior Managing Director, Education
Children Now

Geoffrey Winder and Ginna Brelsford
Co-Executive Directors
Genders & Sexualities Alliance Network

Attachment: The Need for Annual School Climate Surveys, Brian Lee, California State Director, Fight Crime: Invest in Kids

cc: Karen Stapf Walters, Executive Director, State Board of Education (SBE)
Judy Cias, Chief Counsel, SBE
Dave Sapp, Deputy Policy Director and Assistant Legal Counsel, SBE
Jeff Breshears, Director of the Local Agency Systems Support Office, CDE
Barbara Murchison, State Lead, ESSA State Plan Office, CDE
Jenny Singh, Analysis, Measurement & Accountability Reporting Division, CDE
Cindy Kazanis, Analysis, Measurement & Accountability Reporting Division, CDE
Jeff Bell, Department of Finance
Jannelle Kubinec, Director of National, State and Special Projects, WestEd

The Need for Annual School Climate Surveys

CDE's School Conditions and Climate Working Group is recommending that the State Board of Education require annual school climate surveys of students, parents, and staff as part of the California School Dashboard and related accountability system. School climate surveys are required of districts as part of LCFF's School Climate state priority area, and are identified as a local indicator in the Dashboard. Currently, districts are deemed to have "met" this local indicator if they conduct surveys every other year.

Annual surveys are essential for continuous improvement and are quite practical, given that the vast majority of districts have already been administering the state-subsidized California Healthy Kids Survey (CHKS) for more than a decade, and close to 200 districts already do so annually. Annual surveys also can trigger prompt action: districts receive reports from online CHKS surveys within 2 weeks on average.
Annual Surveys:

Recognize the importance of school climate
• The current singling out of school climate surveys as the only state or local indicator in the Dashboard that does not require annual reporting sends a clear message that school climate isn't as important as other indicators, even though: (1) research shows the impact of school climate on school performance1; and (2) the public prioritizes creating a safe and positive school climate as the most important way to evaluate schools, above graduation rates, college and career readiness, and test scores.2
• School climate surveys should not be under-prioritized any further, especially given that they already have been deemed a local rather than state indicator, no standards have been set for measuring survey outcomes, and uniform survey questions, which would enable statewide data collection and comparisons across districts, have been rejected.

Are consistent with the intent and structure of LCFF and LCAPs
• The LCFF statute requires LCAPs to focus on year-to-year outcomes and modifications. LCFF requires annual updates of 3-year plans, the LCAP template calls for annual goals and actions for each statutory metric, and LCFF holds districts accountable for annual changes in performance.
• In practice, surveying every other year means that often a district will only review its school climate survey results once in a 3-year LCAP time frame (after Year 2).

Ensure that all student voices are heard
• Since surveys like the popular California Healthy Kids Survey (CHKS), which is already used by approximately 70% of districts, target students in every other grade (i.e., 5th, 7th, 9th, 11th), surveying every other year means that one cohort of students will never be surveyed.

More effectively identify emerging tensions and challenges and support continuous learning
• A lot can change in one year, such as the recent impact of concerns over more aggressive immigration policies, and CHKS content is reviewed annually to be responsive to emerging issues. Annual survey results lead to quicker action.
• CHKS offers quick turnaround of results: districts receive reports from online surveys within 2 weeks on average.3
• Understanding and monitoring year-to-year progress is essential to continuous learning.

Provide the opportunity to ask more questions
• Districts can use different combinations of supplementary questions year-to-year to explore in more depth specific problems and issues or help assess the impact of improvement efforts that are being implemented. Many surveys offer supplemental sets of questions and/or customizable features. For example, in addition to its required core module, CHKS includes 22 separate additional modules on topics including social emotional health, resilience and youth development, safety and violence, physical health and nutrition, and a more in-depth module focused just on school climate. Many districts do one combination of modules one year and another combination the next.

Improve capacity
• Annual surveys will increase familiarity with the surveys, enabling districts and schools to build better capacity to successfully administer the survey and use the data. This will likely improve the level of survey participation, thus improving data quality as well.
Would replace a past relic of every-other-year surveys
• Most districts survey every other year because that was the initial recommendation of a state Advisory Panel when the CHKS was first fully implemented in 1999, and there was the precedent of the state Biennial California Student Survey, which started in 1985. This recommendation was based on two factors: (1) the initial survey was heavily focused on behavioral health, and most behavioral surveys (like the Biennial California Student Survey) were biennial because the behaviors didn't change that quickly; and (2) there were concerns about the burden of surveying on schools.
• These factors are less relevant today. Now that schools are accustomed to doing student surveys, an annual administration would actually reduce the survey burden in many respects as it becomes more institutionalized. Moreover, starting in 2007-08, the CHKS became focused on school climate, school connectedness, and pupil engagement, indicators that are more susceptible to year-to-year change than the behaviors tracked by behavioral surveys, warranting annual assessment.

No Legitimate Reason to Reject Annual Surveys

Any concerns about the burdens and costs of climate surveys are misplaced. Fortunately, California schools have an option that is neither burdensome nor costly: the state-subsidized California Healthy Kids Survey, which most school districts (714, or approximately 70% of districts statewide) already use.4

The California Healthy Kids Survey does not take much time
• Student time commitment is minimal, especially now that 93% of surveys are completed electronically. The maximum time for secondary students to complete the California Healthy Kids "Core" surveys is 21 to 24 minutes, depending on grade. If districts use the new "Mini Core" module, it takes a maximum of only 8-10 minutes. The online survey also greatly reduces staff labor time.
• Districts do not need extra time to tabulate results: for the CHKS, WestEd crunches the numbers and shares the results.

The CHKS is not costly
• The California Healthy Kids Survey costs only 40 cents per student because of state support, and less than 25% of students are expected to be surveyed (one grade from each grade span, i.e., K-5, 6-8, and 9-12). So even a district with 40,000 students would pay no more than $4,000 annually to administer the CHKS (fewer than 20 districts in California have that many students).
• Online surveying has greatly reduced the labor cost involved in administering the survey and the direct costs to districts for photocopying and distributing printed surveys.
• Over 400 districts get funding to support administration of the California Healthy Kids Survey through, and as a condition of, their Tobacco Use Prevention and Education (TUPE) grants, which require the CHKS every other year. Through the Proposition 56 tobacco tax, more of these grants are expected to be awarded, and existing grants are expected to increase in part to cover the increased administration cost of the California Healthy Kids Survey, which recently rose from 30 to 40 cents per student.

Annual Surveys Are Already the Norm for Many Districts

A final telling statistic: the logic for annual administration of a school climate survey is so strong that 186 districts already administered the CHKS annually in the past two years (26% of all districts that conducted the survey), and another approximately 20 districts committed to annual surveys adapted from the CHKS by the CORE districts.
The number of districts conducting the CHKS annually is on a sharp upward trend.5 This indicates that requiring California school districts to conduct an annual school climate survey is not only logical, but also feasible without placing an undue burden on them.

1 Voight, A., Austin, G., & Hanson, T. (2013). A climate for academic success: How school climate distinguishes schools that are beating the achievement odds (Full Report). San Francisco: WestEd. Retrieved from https://www.wested.org/online_pubs/hd-13-10.pdf
2 Berkeley IGS Poll. (2017, September). Educating California's children and youth: A summary of the findings from a survey of voters about K-12 schools. Retrieved from https://www.documentcloud.org/documents/4067526-Educating-Californias-Children-Survey-Report2017.html
3 For the few districts (7%) still using paper surveys, reports of survey findings are provided in 30 days on average.
4 432 districts used the related staff survey, and 210 used the related parent survey, between 2015-17. Other than a $150 start-up fee for each, there are no additional costs for administering either survey when conducted with the California Healthy Kids Survey.
5 The number of districts doing the CHKS annually increased by 44% (from 18% to 26%) between 2014-16 and 2015-17.

Data about implementation of the California Healthy Kids Survey provided by Greg Austin and Tom Hanson at WestEd, which administers the survey.