May 6, 2016

Mike Kirst, President
California State Board of Education
1430 N Street, Suite 5111
Sacramento, CA 95814

Via email only (sbe@cde.ca.gov)

Re: LCFF Equity Coalition Comments re:
Developing a New Accountability System: LCFF Evaluation Rubrics – May 2016 Board Meeting Item 2
Proposed Revision of the LCAP Template – May 2016 Board Meeting Item 3

Dear President Kirst:

We represent a coalition of civil rights, advocacy, community, parent, student, and other organizations that have worked diligently on the passage and implementation of the Local Control Funding Formula (LCFF). LCFF creates a historic opportunity to focus resources on helping California's neediest students overcome the barriers they face in closing the achievement gap and graduating college and career ready. It also promises a new level of transparency and local engagement for parents/caregivers, students, and community members in the design of their local schools. As you know, in an effort to give life to these objectives, we have commented jointly multiple times over the last year regarding the State Board of Education's LCFF regulatory proposals and evaluation rubrics/accountability system items.

We very much appreciate your staff's work thus far on the development of the evaluation rubrics. In particular, we applaud the SBE staff's proposals to develop a top-level display that flags disparities in subgroup performance, to include suspension rates as a key indicator, and to adopt an annual process to refine and improve the key indicators used in the rubrics. However, we write to raise several remaining concerns and alternative recommendations regarding key decision points under Items 2 and 3 of the May Board meeting.

I. Top-Level Display and Equity

We thank SBE staff for the recommendation to design a top-level display that flags for all stakeholders any disparities in subgroup performance.
As you know, it is a priority for our coalition that equity be an important dimension both in the architecture of the new accountability system and in its visual display. Highlighting disparities between groups of students in the top-level look at an LEA or school is a critical piece of operationalizing equity in the new state accountability system.

We also strongly recommend that the SBE operationalize equity by designing and adopting a top-level display that is understandable and accessible to community stakeholders. The rubrics must convey the strengths and weaknesses of an education program not only to state policymakers, LEAs, and schools, but especially to parents, students, and community groups. For our comprehensive accountability system to work, we must ensure that our state accountability system works in tandem with local accountability of LEAs and schools. When parents and other community stakeholders have easy and transparent access to data on school and district performance, they will be able to use that information to advocate more effectively and meaningfully for an LEA or school to improve in areas of weakness, particularly with regard to gaps between subgroups of students. The rubrics serve as a critical tool for both state and community accountability.

II. Current Proposal of Key Indicators

We are pleased to see the SBE staff proposal for the inclusion of suspension rates as a key indicator in the evaluation rubrics. We strongly urge the SBE to keep this indicator as proposed.

While we support a focus on academic achievement, we do not support the inclusion of the Grade 3 English Language Arts/Grade 8 Math CAASPP Score indicator. As the SBE staff memo states in Attachment 2 to Item 2, including this indicator when the rubric will already include test scores for all grades in English Language Arts and Math double-counts and overemphasizes test scores within the system.
Instead, we propose that the SBE eliminate the Grade 3 ELA/Grade 8 Math CAASPP indicator and replace it with a "growth measure" indicator as described below. We also reiterate that when the SBE refers to test scores within the system, those scores should be calculated in English or in the student's primary language. Finally, as the SBE considers specific proposals for an indicator on the progress of English learners toward English language proficiency, we reiterate our recommendation of an EL composite including CELDT, reclassification, and Long-Term English Learners (LTELs).

III. Commit to a Tangible, Long-Term Vision on Additional Key Indicators

We strongly support SBE staff's proposal that the SBE engage in an annual process to refine and improve the set of key indicators used within the rubrics. We support such a process because we are concerned that the current set of key indicators proposed for adoption falls short of implementing the full intent of the Local Control Funding Formula and the scope of the eight state priorities. We understand that data limitations prevent the SBE from including a broader range of indicators at this time. However, consistent with the proposal for an annual review of key indicators, we strongly urge the SBE to establish a concrete vision for a broader set of key indicators to ensure timely improvements to the evaluation rubrics in the coming years. Although further time and work are needed to capture and display data on indicators such as college and career readiness, student academic growth, science achievement, and chronic absence, we believe that the SBE can take further steps to ensure that these critical measures of LEA performance are reflected in the evaluation rubrics.
Specifically, we request that the SBE instruct staff to provide preliminary analyses of the following for inclusion as key indicators and commit to the timeline below:

· A growth measure for 3rd through 8th grade test scores by March 2017, in time for potential approval at the September 2017 meeting. Growth data should be available in Fall 2016, allowing for analysis of a potential growth measure.

· A College and Career Readiness Index by March 2017, in time for potential approval at the September 2017 meeting. This could include a composite of various potential metrics that are already available statewide through individualized data, such as A-G completion, CTE pathway completion, Early Assessment Program/11th grade test scores, Advanced Placement scores, SAT/ACT scores, and dual enrollment course completion. It could also include additional metrics if reliable individualized statewide data become available, such as GPA, the State Seal of Biliteracy, and International Baccalaureate measures.

· A chronic absence measure by March 2019, in time for potential approval at the September 2019 meeting. Two years of chronic absence data, to allow for analysis of performance and year-to-year improvement, should be available by Fall 2018.

· A measure for NGSS-aligned science test scores by March 2021, in time for potential approval at the September 2021 meeting. Two years of science test scores should be available by Fall 2020. We are concerned that the SBE staff proposal uses only scores for ELA and math, but not science. Considering our efforts to develop and adopt new state science standards, this omission is inconsistent with current state priorities and values.

Even with limitations on current data, we believe that the SBE can nevertheless create the accountability system we want in the coming years by committing to preliminary analyses of additional key indicators and the timeline proposed above.

IV. Consider a Gap-Closing Method for Calculating Performance and Improvement

We appreciate the considerable work that has gone into modeling scenarios for performance and improvement of graduation rates. However, we offer an alternative proposal for discussion and review by the SBE. We believe that this proposal, discussed in detail in Attachment A to this letter, better operationalizes equity and accountability within the evaluation rubrics and state accountability system and is still consistent with many aspects of prior proposals from SBE staff.

Specifically, we propose that the SBE set a long-term, ambitious goal for each of the key indicators in the rubric and calculate whether, at the current rate of improvement, schools, districts, and subgroups are on track to reach that goal. In other words, the SBE should set expectations for improvement for LEAs, schools, and subgroups that encourage continuous improvement toward one explicit, ambitious goal. Technical assistance and intervention, however, could be pegged to a more modest standard, not necessarily to the long-term, ambitious goal. The rubric will report whether each entity is on track to attain the long-term goal within a reasonable period of time. Districts, schools, and subgroups that are further from the goal will therefore be expected to make more rapid improvement.

We believe that this proposal maintains a focus on one long-term, ambitious standard and encourages continuous improvement toward that standard without labeling LEAs or schools as failures. If used with the top-level display flagging of student subgroup disparities as currently proposed by SBE staff, we also believe that this approach will better incentivize gap-closing improvement among LEAs and schools.
Finally, this proposal is consistent with prior proposals from the SBE recommending both "quality standards" and "assistance and support standards." Please review Attachment A for details on this technically feasible way by which the SBE can operationalize equity while maintaining a focus on performance and continuous improvement.

V. Local Data and Importance of the Eight State Priorities

We strongly support SBE staff's recommendation to include local data as indicators in the LCFF evaluation rubrics. We think it is critical that the SBE consider how the external design of the rubric can better align with all eight state priorities so that stakeholders at the local level can use the rubric to inform the LCAP development process. Ultimately, we want a rubric that aligns not only with basic ESSA requirements, but also with the broader priorities envisioned under LCFF.

We also support staff's recommendation to include for the Board's consideration a list of indicators where local data and a common statewide definition exist for the purposes of providing differentiated intervention and support. We recommend that the California Practitioners Advisory Group (CPAG) review these local indicators, focusing specifically on the indicators for parental involvement as an example. SBE staff referenced the work of CPAG in reviewing the Statements of Model Practices, which included data points that already have statewide definitions, such as school site councils, community advisory committees (for students with disabilities), English learner advisory committees, the Williams basic services and conditions, and A-G coursework. For those data where there are no statewide definitions or where the definitions are weak, a menu of research-based indicators should be provided as options at the local level. We believe all eight state priorities are measures by which all schools and districts should be held accountable under LCFF.
This is the benefit of California's new accountability system: one that does not focus only on test scores or other easily quantifiable measures. We look forward to working with the Board and its staff as it "supports analysis of local data" and determines how the local metric component interacts with the overall LCFF accountability framework. We request that a fuller discussion of local data indicators be included in the agenda for the next stakeholder input session so that we can provide further input.

VI. Extend the Timeline for Stakeholder Input on LCAP Template Revisions

We continue to appreciate the SBE staff's outreach to stakeholders and the equity community, particularly around proposed LCAP template revisions. While we support the overarching principles expressed by SBE staff in its proposal for LCAP template revisions, we urge the SBE to release an actual proposed template to which stakeholders can respond and provide feedback. Without a draft mock-up, stakeholders cannot meaningfully engage or provide specific input on how a new template will function at the local level. Therefore, we recommend that the SBE ask staff to release a first mock-up of a new template as soon as possible, or offer more time after the release of a draft template, so that stakeholders can provide substantive feedback and review.

Thank you for the opportunity to comment. We look forward to continued work with the SBE to realize the full promise of LCFF for our state's highest-need students.

Sincerely,

Nayna Gupta
Racial Justice Fellow/Staff Attorney
ACLU of California

Liz Guillen
Director of Legislative & Community Affairs
Public Advocates Inc.

Marvin Andrade
Leadership Development Director
Asian Americans Advancing Justice Los Angeles

Kevine Boggess
Director of Policy
Coleman Advocates for Children and Youth

Lauren Brady
Statewide Education Rights Director
Public Counsel

Oscar E. Cruz
President and CEO
Families In Schools

Jesse Hahnel
Executive Director
National Center for Youth Law

Taryn Ishida
Executive Director
Californians for Justice

Adam Kruggel
Director of Organizing
PICO California

Brian Lee
California State Director
Fight Crime: Invest in Kids

Luis Santana
Executive Director
Reading and Beyond

Ryan J. Smith
Executive Director
The Education Trust–West

Samantha Tran
Senior Managing Director, Education
Children Now

David Valladolid
President & CEO
Parent Institute for Quality Education (PIQE)

Debra Watkins
Founder and Executive Director
California Alliance of African American Educators

Attachment A: Memo, May 6, 2016, Equity Coalition recommendations to State Board of Education, Setting standards for performance and expectations for improvement

cc:
Members, California State Board of Education
Karen Stapf Walters, Executive Director, California State Board of Education
Judy Cias, Chief Counsel, California State Board of Education
David Sapp, Deputy Policy Director and Assistant Legal Counsel
Nancy Brownell, Senior Fellow, Local Control and Accountability
Michelle Magyar, Local Control Funding Formula
Jeff Bell, Department of Finance
Cathy McBride, Governor's Office
Jannelle Kubinec, Director of National, State and Special Projects, WestEd

Attachment A

MEMO
May 6, 2016
Equity Coalition recommendations to State Board of Education
Setting standards for performance and expectations for improvement

Background

In our February 18, 2016 memo entitled "Operationalizing Equity in the Accountability System," the LCFF Equity Coalition wrote that "Equity is reporting and making progress toward closing subgroup gaps relative to a common standard of performance which requires differential expectations for improvement consistent with ESSA and LCFF." Further, in that memo the coalition recommended that "the system should set ambitious goals and accelerated growth targets for LEAs, schools and subgroups that are starting off further behind."
This memo builds on the February 18th memo by describing a way in which that recommendation can be achieved.

Standards for Performance

We recommend the state set a long-term, ambitious goal for each key indicator. This goal will serve as a "north star" for districts and schools to work toward. Technical assistance and intervention, however, could be pegged against a more modest standard. This recommendation is generally consistent with prior proposals from the SBE suggesting that we have both "quality standards" and "assistance and support standards." (These are referenced, for example, in the SBE February 2016 memo, Potential Architecture of a Single, Coherent System.)

Expectations for Improvement

We recommend the state set expectations for improvement for each LEA, school, and subgroup that encourage continuous improvement toward the state's long-term goal. Here is how it could work, using graduation as an example. For each LEA, school, and subgroup:

· Calculate the starting point. For instance, the three-year average graduation rate for an example high school might be 85%. A rolling average takes into account natural fluctuations in the data, especially for small schools and subgroups.

· Calculate the average annual improvement using several years of data. For instance, average annual improvement at this example school might be 1.5% per year. Since graduation rates tend to bounce around over time, it is important to find an average rate of improvement using multiple years of data. By calculating the average improvement the LEA, school, or subgroup is making, we establish a trend line rather than focusing on a particular point in time.

· Calculate whether it is on track to meet the goal based on its starting point and its rate of improvement. This will require setting a time horizon for purposes of the calculation.
For instance, we could ask whether the school that is currently at an 85% graduation rate is likely to reach 95% in 10 years if it continues to improve at a rate of 1.5% per year. If we wanted to further differentiate improvement, we could also ask whether the LEA, school, or subgroup is likely to reach the Quality Standard in a shorter period of time, say 5 years.

Why we think this works well:

· It maintains a focus on one long-term, ambitious standard. This allows educators and leaders to keep their eye on a single goal, not separate targets for performance and improvement. It also communicates high expectations for all subgroups of students.

· It encourages continuous improvement. Because the starting point and rate of improvement are recalculated on a rolling basis, the clock never "runs out" on the district, school, or subgroup. This avoids some of the pitfalls of NCLB, where schools were all expected to achieve 100% proficiency in a finite amount of time.

· It does not label LEAs or schools as failures. This model proposes a time horizon for purposes of calculating an LEA, school, or subgroup's improvement trend. However, this proposal does not mean that LEAs or schools that fall short of the standard in that length of time have categorically failed, or that they must necessarily receive intervention. The rolling 10-year time horizon could be used for purposes of calculating an improvement trend line. A finite year by which the standard must be met, akin to NCLB, is not part of this proposal. The decisions about how to label schools or districts, or about how to assist or intervene, may be made separately from the determination of whether the school or district is on track to reach the standard.

· It incentivizes gap-closing improvement. If the LEA or school needs to make progress, as in this example, to 95% of students graduating, the only way to achieve that is by focusing on students who are not currently graduating.
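For illustration only, the on-track calculation described above can be sketched in a few lines of code. The function names and data below are ours, not part of any SBE proposal; the numbers are the memo's hypothetical example (an 85% graduation rate improving 1.5 percentage points per year, measured against a 95% long-term goal).

```python
# Illustrative sketch of the on-track calculation (hypothetical data and names).

def average_annual_improvement(history):
    """Average year-over-year change across several years of data,
    establishing a trend line rather than a single point in time."""
    changes = [later - earlier for earlier, later in zip(history, history[1:])]
    return sum(changes) / len(changes)

def on_track(starting_point, rate, goal, horizon_years):
    """Project the current trend forward and ask whether the entity
    reaches the long-term goal within the time horizon."""
    projected = starting_point + rate * horizon_years
    return projected >= goal

grad_rates = [82.0, 83.5, 85.0]                # three years of graduation rates
rate = average_annual_improvement(grad_rates)  # 1.5 points per year
print(on_track(85.0, rate, goal=95.0, horizon_years=10))  # True: 85 + 15 >= 95
print(on_track(85.0, rate, goal=95.0, horizon_years=5))   # False: 85 + 7.5 < 95
```

Because the starting point and rate would be recomputed on a rolling basis each year, the clock never runs out; an entity starting further from the goal simply needs a larger average rate of improvement to be on track, which is the gap-closing incentive.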
Of course, it could be possible for districts to make improvement while still having large disparities. For example, a district might improve its graduation rate for all students and most subgroups while failing to improve graduation rates for foster youth. For this reason, it is still important that the top-level display flag subgroups that are underperforming in order to highlight disparities for stakeholders. Further, it is important that technical assistance be targeted based on subgroup-level results.

An Alternate Framework

We recommend the state put the results of its outcome/improvement analysis into a simplified framework that looks at performance and improvement in concert with each other. By using 5 rather than 25 cells to communicate performance and improvement, the SBE can make this more accessible to stakeholders. It also orients the conversation around improvement toward the standard. Further, it will give SBE staff more flexibility in calculating each band of performance and improvement as they consider additional key indicators, such as English language proficiency for English learners and suspension rates. See the example below.

Performance and Improvement Trend - Rubric (Overall)

· Far below standard and not improving fast enough to reach standard, OR far below standard and maintaining or declining: Concern

· Below standard and not improving fast enough to reach standard: Issue

· Below standard but making improvement toward standard: Emerging

· Nearly at standard and making improvement toward it, OR at standard but declining: Good

· At standard and maintaining or improving: Excellent

Conclusion

The approach proposed in this memo identifies another technically feasible way by which the SBE could operationalize equity while maintaining a focus on performance and continuous improvement. We are happy to share more details on this proposal and look forward to continuing the conversation with staff and stakeholders.