Case 3:96-cv-04179-TEH Document 1890 Filed 01/09/14

United States District Court
Office of the Court Monitor
Emma C., et al., v. Delaine Eastin, et al. (No. C96-4179 TEH)
P.O. Box 51170
Palo Alto, CA 94303-9998
(650) 329-2800 x60125
(650) 326-3410 (fax)
Mark A. Mlawer
Court Monitor

MEMO

TO: Judge Thelton E. Henderson
FROM: Mark A. Mlawer
DATE: January 9, 2014
RE: Court Monitor's Determinations Regarding Plaintiffs' State Monitoring Design and SESR Implementation Objections

I. Introduction

Pursuant to the Second Joint Stipulation Re Amendment of Dispute Resolution Timelines in Fifth Joint Statement, the parties have met and conferred regarding Plaintiffs' objections to the 1) design of Defendant California Department of Education's (CDE's) state special education monitoring system, and 2) implementation of the Special Education Self Review (SESR) portion of the monitoring system in Defendant Ravenswood City School District. Despite the hope expressed in the Stipulation that the "issues could be narrowed and/or resolved during the meet-and-confer period" (at 2), the parties have neither reached an agreement nor narrowed the issues. Therefore, pursuant to Sections IV. B., VI. D., and VII. E. of the Fifth Joint Statement, the Court Monitor has made the determinations below regarding Plaintiffs' objections.

II. Standards

The standards used in these determinations are those that govern state monitoring and enforcement systems in the Individuals with Disabilities Education Act (IDEA). Specific standards set forth in the IDEA will be drawn upon as needed in these determinations. In addition, standards are set by the First Amended Consent Decree (Consent Decree) and the Court's 11/26/12 Order Denying Motions Objecting to the Monitor's July 16, 2012 Determinations (Order). Section 13.0 of the Consent Decree states that CDE must have a monitoring system that meets this standard: "...the state-level system in place is capable of ensuring continued compliance with the law and the provision of FAPE to children with disabilities in Ravenswood...." In addition, the parties' 10/24/12 Joint Supplemental Brief Re: The Monitor's Analysis of Section 13.0 of the FACD and the Parties' Fourth Joint Statement states that Defendant CDE "bears the initial burden of showing" that its monitoring system complies with this section of the Consent Decree (at 4). The Court's Order, upholding the Monitor's determination on this issue, resulted in the following standard: "CDE's statewide monitoring system, as applied to Ravenswood," must be "implemented adequately," identify "both noncompliance and compliance appropriately based on adequate evidence and reasoning," and result "in appropriate corrective actions, the implementation of required corrective actions, and the timely correction of identified noncompliance" (at 8-9).[1]

III. System Design Objections

A. Inadequate Policies and Procedures

A state educational agency (SEA) is required to "have in effect policies and procedures to ensure that it complies with the monitoring and enforcement requirements..." (34 C.F.R. § 300.149(b)). Plaintiffs argue first that the "purpose and use" of the documents produced by CDE as policies and procedures is "mostly unclear," revealing the appearance of a system design that is "disorganized and inefficient" (Plaintiffs' Design Challenge, 4/12/13,[2] at 4-5).
CDE responds with a table of documents previously produced that constitute its policies and procedures, and argues that the "very existence" of these documents disproves Plaintiffs' claim (CDE Design Response, 5/31/13,[3] at 6-7). CDE is correct that the documents produced are monitoring policies and procedures. Plaintiffs offer no argument from these specific documents to support their contention that the policies and procedures are unclear, disorganized, or inefficient. Moreover, apart from the content specified in the regulation cited above, the regulation merely requires that the policies and procedures be in effect.

[1] Both Plaintiffs and CDE also rely to some extent for monitoring system standards on a document coauthored over a decade ago by the current Court Monitor in the Emma C. case. See Plaintiffs' Design Challenge, 4/12/13 at 11-12; CDE Design Response, 5/31/13 at 3-21. This document, called by CDE the White Paper and by Plaintiffs Focused Monitoring-A Model for Change, was written by external experts retained by the plaintiffs in Angel G. v. TEA, pre-dates the reauthorization of IDEA in 2004, and has not been adopted by this Court as a set of standards to be applied in the Emma C. case. Because the standards and guidance set forth in this document are irrelevant to this case, and may not be current, they will not be used in these determinations.
[2] This document is cited below as PDC.
[3] This document is cited below as CDE Design Response I.

Turning next to the content of the documents, Plaintiffs allege that CDE does not have policies and procedures that address a number of required topics:

• monitoring implementation of its enforcement activities;
• making determinations annually and reporting annually about the performance of each local educational agency (LEA) using specific categories;
• taking specific actions after making compliance determinations;
• placing primary focus of monitoring activities on improving educational and functional results/outcomes, and ensuring compliance with IDEA requirements with an emphasis on those most closely related to improved results for students;
• using quantitative indicators and such qualitative indicators as are needed to measure performance adequately in the priority areas, which include, but are not limited to, free appropriate public education (FAPE) in the least restrictive environment (LRE), state exercise of general supervision, and child find;
• establishing measurable and rigorous targets for indicators in the priority areas, and collecting valid and reliable data to report annually; and
• ensuring noncompliance is corrected as soon as possible, and in no case later than one year after the SEA identifies noncompliance (PDC at 5-6; paraphrased with quotations and citations omitted).

CDE describes the documents listed in the table of policies and procedures referenced above as "specific written procedures for the implementation of every aspect of the development and implementation of its monitoring system..." (CDE Design Response I at 6; emphasis added). But when it turns to Plaintiffs' specific objection that its policies and procedures do not address the listed requirements, CDE attempts to dispose of this point in a single sentence: "Further, the policies and procedures do address the IDEA monitoring and enforcement requirements listed in Plaintiffs challenge" (CDE Design Response I at 7).
No arguments are offered to support this assertion, nor are specific references made to any parts of the listed documents.

Further, as part of their challenge to CDE's monitoring and enforcement policies and procedures, Plaintiffs include a list of additional alleged flaws in the policies and procedures (PDC at 6-7; lettered a-e and organized under their policies and procedures challenge). Plaintiffs write that their document will "review in detail each of the three primary components of CDE's statewide monitoring system (annual district level data collection and analysis, SESR, and VR[4]) and demonstrate that each of these components suffers from...major design flaws" (PDC at 7-8). Each of these alleged design flaws will be considered below as raised by Plaintiffs in the context of the primary components of CDE's monitoring system. To the extent that any aspect of the monitoring system is found by these determinations to be inadequate to meet the standards set by the statute, Consent Decree, and/or the Court's Order, the remedy for such inadequacies must include, in addition to any necessary substantive changes to its monitoring and enforcement system set forth in the specific determinations, adequate policies and procedures reflecting those changes.

[4] Verification Review (Monitor's note).

Monitor's Determination: Plaintiffs have not shown that CDE's monitoring policies and procedures are disorganized, unclear, or inefficient. CDE has not demonstrated that it has policies and procedures that address the requirements specified above. Therefore, CDE shall engage in corrective action steps to demonstrate through a submission to the Monitor and the parties that it has policies and procedures in effect that fully address the requirements in Plaintiffs' list by either 1) supporting its assertion with specific references to the documents in its table for each of the requirements in Plaintiffs' list, or 2) developing policies and procedures that address the requirements.[5]

B. Inadequate Staffing and Training

Plaintiffs allege that CDE's staffing of its monitoring functions is inadequate, and that its staff is not properly trained (PDC at 6, 19; at 19 Plaintiffs state that they "have not found any clear documentation" that staffing, training, and oversight of monitors are adequate). The argument offered to support this conclusion is that some of the alleged failures of CDE's monitoring system (this set of alleged failings is noted in a paragraph at PDC 19-20) are "demonstrated" by CDE documents showing a number of districts that did not correct noncompliance within one year as required, and the small number of districts that received a Verification Review (VR) in recent years (PDC at 20). Plaintiffs also cite their 3/8/13 comments on CDE's 2/22/13 response on these specific issues.

CDE does not offer a response to these arguments in either its Design Response I or its 8/26/13 Supplemental Response.[6] But in its 2/22/13 document, CDE included budgetary information regarding all of its quality assurance processes, staffing charts for its Special Education Division, staff assignments, sample duty statements, minimum qualifications for positions, information regarding contractors that perform specific tasks related to monitoring, and a description of its formal and informal training, peer mentoring, and special training (at 1-3).
While the information provided by CDE is illuminating, CDE offers no clear argument as to how this information shows that its staffing is adequate, and its staff adequately trained, to fulfill its monitoring and enforcement responsibilities as envisioned by the statute and the controlling documents of the Emma C. case.

Although CDE offers no argument in response, Plaintiffs' arguments fall far short of showing that monitoring staffing and training are inadequate to fulfill CDE's responsibilities. First, Plaintiffs do not set forth any standards by which the adequacy of state monitoring system staffing and training could be judged. Second, even if the assumption is made that CDE's monitoring system has the many failings Plaintiffs say it has, it does not follow that inadequate staffing and training are responsible for those failings; while that could be the case, establishing such a connection requires specific evidence.[7] Third, Plaintiffs' 3/8/13 comments on these topics, cited in their PDC, consist largely of questions for CDE and requests for documents, with one exception--staff vacancies. Plaintiffs note that CDE's monitoring staffing chart shows "some" vacancies, including vacancies that are "longstanding and apparently key positions" (Plaintiffs' 3/8/13 comments at 2). But Plaintiffs do not show that this state of affairs has had a deleterious effect on CDE's monitoring efforts; instead, they pose additional questions. While the Monitor has previously made plain his agreement with Plaintiffs' contention in their 3/8/13 comments that "CDE's system design cannot pass muster if it lacks necessary staffing" (at 2),[8] Plaintiffs do not show that this is the case.

[5] For all corrective actions set forth in these determinations, the parties shall have an opportunity to comment on CDE submissions.
[6] This document is cited below as CDE Design Response II.
[7] As shown in Section III. E. 1. below, CDE conducts very few VRs. But it is unknown to the Monitor whether this is a result of inadequate staffing, as Plaintiffs have not submitted evidence showing that it is.
[8] In an exchange of correspondence on 2/4/13 with counsel for CDE related to Plaintiffs' initial submission of system-design objections, the Monitor wrote, "... CDE argues that the submission exceeds the scope of the Fifth Joint Statement in that it raises non-design issues such as funding and staffing.... Regarding funding and staffing, if a well-designed monitoring system had insufficient funding or staffing, or inadequately trained staff, then such a system could arguably fail to meet standards in the statute and/or controlling documents of this case. Thus, these issues do not exceed the scope of the inquiry into the state monitoring system's efficacy to which the parties have agreed" (Mlawer to Tillman, at 2).

Monitor's Determination: Plaintiffs have not shown that CDE's monitoring functions are inadequately staffed, or its monitors inadequately trained, such that CDE cannot meet the standards set by the controlling documents of this case. However, as CDE has not demonstrated the adequacy of its monitoring staffing and training, and as inadequacies in these areas may come to light during CDE's implementation of the corrective actions called for in these determinations, the Monitor may reconsider this determination in light of any new facts that emerge.

C. Annual Collection of District Information

1. Too Limited Data Collection

Plaintiffs argue that CDE's annual collection of data is "largely" based on its California Special Education Management Information System (CASEMIS) database, and that CASEMIS data include "little if anything" that focuses on the statutory requirement to improve student results and outcomes by measuring performance in the priority areas of FAPE in the LRE and child find. Arguing that the federal State Performance Plan (SPP) indicators are too limited for this purpose, Plaintiffs provide lists of data they believe could and should be collected by CDE, data that in their view would speak more adequately to performance in the priority areas. Responding to CDE's claim that additional sources of information on results and outcomes are used by CDE to tailor SESRs and VRs, Plaintiffs respond that these two components of the monitoring system are ineffective,[9] and that this claim from CDE ignores Plaintiffs' point regarding the annual collection of data. Further, using CDE's response to the federal Critical Elements Analysis Guide (CrEAG) document, Plaintiffs note that CDE did not include Indicator 5 (LRE) among the ways it uses data to identify noncompliance, and argue that CDE's responses show that CASEMIS data are "evaluated for very limited purposes" (PDC at 18, 21-23).

[9] For determinations on these issues, see Sections III. D. and E. below.

Regarding CASEMIS data, CDE argues that this database is the "primary" source of information for federal reporting of required data, but not the sole source of information used to inform its monitoring activities. CDE lists the additional data it collects: local budget and service plans, data-based findings of noncompliance, ongoing compliance history (due process results, complaints, and timely correction of noncompliance), SPP indicators, compliance determinations, and significant disproportionality (CDE Design Response I at 8). However, CDE does not show that these additional pieces of data are adequate to measure performance in the priority areas of FAPE in the LRE and child find, the argument advanced by Plaintiffs, nor does it explain with precision how the data are used on an annual basis. Instead, CDE argues that it does not have to collect data for monitoring purposes on any indicators other than those set forth as part of the federally mandated SPP document. After citing the regulatory requirement that states must "use quantifiable indicators and such qualitative indicators as are needed to adequately measure performance in the priority areas" (34 C.F.R. § 300.600(d)), CDE goes on to argue that the statute does not require SEAs to create targets in priority areas outside the SPP, citing § 300.601(a)(3): "As part of the State performance plan, each State must establish measurable and rigorous targets for the indicators established by the Secretary under the priority areas described in § 300.600(d)." CDE then shows that each SPP indicator falls under one of the priority areas, citing the federal Office of Special Education Programs' (OSEP's) Part B Indicator Table, and argues that these indicators "reflect" the priority areas and are "meant to work in conjunction with, and not separate from, the priority areas" (CDE Design Response II at 2). CDE's argument here attempts to evade Plaintiffs' critique and misreads the regulatory requirements.
Plaintiffs are not arguing that additional indicators must be added to the state's SPP[10]; rather, they argue that for monitoring purposes the SPP indicators and CASEMIS data are collectively insufficient to measure performance in the priority areas of FAPE in the LRE and child find. Hence, CDE's citation to § 300.601, a section entitled "State performance plans and data collection," is inapposite to Plaintiffs' argument. The relevant requirements are at § 300.600, which is entitled "State monitoring and enforcement." As CDE offers only a truncated quotation from § 300.600(d), it is important to look at the relevant regulations from § 300.600(c)-(d) in full:

(c) As a part of its responsibilities under paragraph (a) of this section [a(1) requires monitoring], the State must use quantifiable indicators and such qualitative indicators as are needed to adequately measure performance in the priority areas identified in paragraph (d) of this section, and the indicators established by the Secretary for the State performance plans.

(d) The State must monitor the LEAs located in the State, using quantifiable indicators in each of the following priority areas, and using such qualitative indicators as are needed to adequately measure performance in those areas:

(1) Provision of FAPE in the least restrictive environment.

(2) State exercise of general supervision, including child find, effective monitoring, the use of resolution meetings, mediation, and a system of transition services as defined in § 300.43 and in 20 U.S.C. § 1437(a)(9).

(3) Disproportionate representation of racial and ethnic groups in special education and related services, to the extent the representation is the result of inappropriate identification. (emphases added)

[10] Thus the Monitor makes no determination on this issue.

Subsection (c) plainly requires the use of indicators adequate to measure performance in the priority areas for monitoring purposes, in addition to the SPP indicators. Subsection (d) stresses that the qualitative indicators used must be adequate to measure performance in these areas, and that quantifiable indicators must be used in each of the priority areas. As CDE does not offer an argument for the adequacy of the SPP indicators and its specified additional areas of data collection to measure performance in the priority areas, it is unnecessary to spend much time on this issue. While the Monitor will not make a determination on the specific items in Plaintiffs' lists of suggestions for additional areas of data collection, one example from this list will suffice to show the inadequacy of the SPP indicators for monitoring purposes: the area of child find. The relevant regulation on this issue requires the state to have policies and procedures in effect that ensure that all children with disabilities in the state "who are in need of special education and related services, are identified, located, and evaluated" (§ 300.111(a)). The SPP indicators that speak to this priority area are Indicators 11 and 12. But Indicator 11 only measures the percentage of students referred for evaluation who received timely evaluations and eligibility determinations, and Indicator 12 only measures whether children referred from Part C of the IDEA to Part B have IEPs in effect by their third birthday. If a student was not referred for evaluation and not served by Part C, the student will not be included in these indicators.
Clearly, an LEA can be fully compliant with these indicators yet still have children with disabilities in its jurisdiction who need special education and related services but who have not been identified, located, and evaluated. Thus, the SPP indicators are not fully adequate to measure performance in the priority areas.[11]

[11] With respect to the child find example, it should also be noted that 1) CDE already collects relevant data, special education enrollment data, for Indicators 9 (disproportionate representation in special education) and 10 (disproportionate representation in specific disability categories); and 2) CDE has proposed a pilot quality assurance process for Ravenswood for the 2013-14 school year that would use these data as a factor in compliance determinations, in selection for VR, and as an element of the VR monitoring plan (for the latter two issues, if the district's data fell outside a target range) (Quality Assurance Process Pilot, Ravenswood City School District 2013-14, at 9-10). With respect to the VR monitoring process, CDE also proposes a methodology for conducting child find monitoring as part of its VR pilot proposal for Ravenswood (Verification Review Policy and Procedure Guide, Ravenswood City School District 2013-14, at 56-59). While CDE's proposal suffers from a lack of precision regarding the use of these data in determinations and VR selection, and a too-limited approach to child find monitoring in the VR process, the proposal takes clear steps in the right direction. However, the proposed pilot has not been accepted by the other parties and, thus, is irrelevant to these determinations. Moreover, a proposal for a pilot project, and one that is limited to Ravenswood, is clearly not part of the state-level system "in place," as required by Consent Decree § 13.0.

Turning next to Plaintiffs' argument that CDE's responses show that CASEMIS data are evaluated for purposes that are too limited, and to their related Indicator 5 (LRE) argument, Plaintiffs cite CDE's 3/22/13 response at 9-11 and its 2/22/13 response at 13. In the 3/22/13 document CDE argues as follows:

The requirement is not to collect data but to monitor LEAs in the state (34 CFR 300.600(d)) using quantifiable indicators and using such qualitative indicators as are needed to adequately measure performance in those areas. This requirement is not about selection but is about the activities onsite that produce findings of compliance or noncompliance. The CDE asserts that it uses quantifiable (the SPP/APR[12] indicators) and such qualitative indicators (parent input, complaints history, and due process findings) to select items to be included in all reviews and to select individual items to be included in a review that are unique to a particular district. (at 9, emphasis in original)

[12] Annual Performance Report (Monitor's note).

CDE's approach here again ignores Plaintiffs' argument, which concerns annual data collection. Further, in order to use indicator and other data for any purpose, including on-site monitoring, the data must first be collected; hence, the requirement also involves collecting the necessary data. As selection for on-site or other monitoring activities is part of a monitoring system, CDE's attempt to drive a wedge between selection and monitoring does not persuade. A monitoring system that complies with the statute uses data adequate to measure performance in the priority areas to drive its activities, including selection for those activities.

With respect specifically to LRE data, while Plaintiffs attempt to read too much into CDE's CrEAG response, CDE's response reinforces the concern discussed above. Plaintiffs regard CDE's failure to list LRE data among the ways it uses data to identify noncompliance as an "inexcusable oversight" (PDC at 23). CDE responds that the CrEAG is not an official submission to OSEP, and that the document was ultimately withdrawn by OSEP as it was not approved by the federal Office of Management and Budget as a data-collection vehicle. CDE goes on to argue that particular placement levels do not establish compliance or noncompliance with the LRE requirements, and adds that its monitoring system "tests" how placement decisions are made by IEP Teams (CDE 2/22/13 Response at 13). CDE is correct that the CrEAG is not an official document, and also correct that placement levels alone cannot determine compliance or noncompliance with the LRE requirements. However, even if the assumption is made that the approach to LRE monitoring in the SESR and VR processes is adequate, that again is not the claim Plaintiffs are advancing here. While CDE's statements on the CrEAG document should be treated as irrelevant, CDE has not shown whether, and if so how, it uses LRE data annually for monitoring purposes outside of the SESR and VR processes, such that potential LRE violations that placement data may suggest are investigated and, if found to be violations, corrected.

Monitor's Determination: CDE has not demonstrated that it collects data adequate to measure performance in the priority areas for monitoring. Therefore, CDE shall engage in corrective action steps to ensure that it collects data adequate to measure performance in the priority areas for monitoring. CDE shall set forth through a submission to the Monitor and the parties all data it will collect, analyze, and evaluate on an annual basis for monitoring purposes. For each type of data, CDE shall describe with precision how it will be used annually, including identifying the specific levels of data that will result in specific CDE monitoring and enforcement activities. In addition, CDE shall fully set forth in this submission the basis for its belief that these data are collectively adequate to measure performance in the priority areas. The necessity of further corrective actions will be determined after review of the submission.

2. Inadequate Systems to Ensure Valid and Reliable Data

Plaintiffs allege that CDE's annual activities to validate the data it collects are insufficient to ensure the data are accurate. Specifically, Plaintiffs argue that CDE's data verification is "computer-driven and mostly limited to mistakes identified by software," and that CDE fails to request back-up materials from districts, use specific methods to cross-check data, monitor itself to ensure consistency across districts, and evaluate analyses "interjected" by data collectors other than districts.
In addition, Plaintiffs question whether CDE actually analyzes qualitative information instead of converting it into quantitative information, state again their conclusion that CDE does not collect sufficient qualitative information, and assert that CDE does not have any evaluative criteria for qualitative information that would allow providers of special education and related services to improve their performance (PDC at 23-25).

In response, CDE describes its approach to data accuracy verification as a "double-validation process." The process includes CASEMIS file validations that look for logical inconsistencies and year-to-year anomalies. CASEMIS data are also "cross-verified" with California Longitudinal Pupil Achievement Data System (CALPADS) data; the results of this analysis for the 2011-12 school year showed a match rate for basic student information of 94% or higher. In addition, special education staff members perform unspecified statistical analyses on the data. Further, CDE compares CASEMIS data to IEPs and student records during VRs and SESR follow-up visits. This process looks at the accuracy of a limited set of fields in the CASEMIS database, and appears to have begun with the 2012-13 VRs and the follow-ups to the 2011-12 SESRs.[13]

With one exception, Plaintiffs' critique of CDE's approach to ensuring the accuracy of data consists of a series of largely unsupported assertions rather than of arguments supported by evidence. Plaintiffs do not show that any qualitative information is converted to quantitative information, nor do they state clearly why such a process would be inadequate; do not identify which types of data would require back-up materials and explain why that would be necessary; do not explain why CDE's data verification activities do not ensure consistency across districts; do not identify any data collected or analyses performed by third parties, or explain why such data or analyses should be treated any differently from other data; do not acknowledge that CDE does, in fact, use specific tools and procedures to validate and cross-check data (CASEMIS/CALPADS), or explain why the procedures employed by CDE are inadequate; and do not identify qualitative information that requires evaluative criteria in order to facilitate improvements in provider performance. Plaintiffs cite to their 3/8/13 comments at 6-9, but that document is of little help, as Plaintiffs do not support these specific concerns in those comments.

However, Plaintiffs offer a clear argument that CDE data verification is largely limited to software-identified inaccuracies. They add in their 3/8/13 comments that the fields subjected to the on-site validation process are too limited (but do not identify additional fields they believe should be validated), and that this process is further limited to "rare" VRs and once-every-four-years SESRs (Plaintiffs' 3/8/13 comments at 6, 8).

As set forth above, CDE engages in a variety of data validation activities. But CDE does not include in its responses the results of those activities, leaving an important question unanswered: how accurate has CDE found the reported data to be? While software tests and cross-database checks can find potentially inaccurate data, the most reliable results on data accuracy will come from the comparison of reported data with the actual content of student records.
As these results were not conveyed in CDE's responses, the adequacy of CDE's data validation efforts cannot currently be judged against the scope of the potential problem. In addition, while the results of testing CASEMIS against CALPADS were conveyed by CDE for the eight fields tested (a 94%+ accuracy level), beyond the fact that the test looked only at eight fields of very basic student information, it is unknown whether the CALPADS database itself has been found to be accurate at a high level. Further, the CDE document (CASEMIS CALPADS Data Matching Project, undated) appears to show that over 99,000 students had records in CASEMIS but not in CALPADS (at 2), and does not state the number and percentage of students for whom data matched in all fields tested; field-by-field accuracy of 94% or higher is consistent with a considerably lower share of students whose records match on all fields tested. These results do not inspire confidence in the accuracy of data collected by CDE. The issue of the accuracy of CALPADS data has additional importance, as CDE is now apparently relying on this database for suspension/expulsion data (CDE Design Response I at 11; see Section III. C. 3. c. below).

[13] CDE Design Response I at 10, CDE Design Response II at 6-7, CDE 2/22/13 Response at 5-9; CDE's "2011-12 SESR Followup Review: CASEMIS Student Level Data Verification"; CDE's "2011-12 SESR Followup Reviews: CASEMIS Data Verification Protocol."

Monitor's Determination: Plaintiffs have not shown that qualitative information is converted to quantitative information, nor why such a process would be inadequate; what types of data require back-up materials, nor why that would be necessary; that CDE's data verification activities do not ensure consistency across districts; that any data are collected or analyses performed by third parties; and that any qualitative information requires evaluative criteria in order to facilitate improvements in provider performance. CDE has not demonstrated that its data validation activities are adequate to ensure that it is collecting and using accurate data for monitoring and enforcement purposes. Therefore, CDE shall engage in corrective action steps to ensure that it collects and uses accurate data for monitoring and enforcement purposes. For each type of data identified in response to the determination made at Section III. C. 1. above, CDE shall set forth through a submission to the Monitor and the parties the results of its data validation activities for that type of data in the last three school years. The submission should clearly indicate which data validation activity was used to reach each result. For results from the on-site data validation activities, CDE should ensure that the submission shows the number and percentage of students for whom all fields checked were found to be fully accurate, and the percentage accuracy for each field in the database. For any type of data identified by CDE for which it has not validated accuracy, CDE shall set forth the steps necessary to do so. In addition, as CDE has stated that it has made findings of noncompliance related to data accuracy as a result of its on-site data validation efforts (CDE Design Response I at 10, CDE Design Response II at 7), the submission should identify the number of such findings made, and the number and percentage of districts that were found noncompliant through its on-site activities in the 2012-13 school year. The necessity of further corrective actions will be determined after review of the submissions.

3. Too Limited Follow-Up with Districts
a. APR Measure/Annual Correction of Noncompliance

Plaintiffs argue that CDE's claims to identify and correct noncompliance on an annual basis are not supported by any documents CDE produced, with the exception of what Plaintiffs regard as the "flawed APR measure." Plaintiffs accuse CDE of using this annual public reporting vehicle unevenly. In support of this claim, using the 2012 reports, they note that in response to Ravenswood's low state assessment proficiency rates (Indicator 3) CDE required action by the District, but did not require action for the District's failure to meet Indicator 11 (timely assessments). In addition, Plaintiffs point out what in their view is a lack of uniformity in the application of "action required" on the APR measure: action was not required of other districts that failed to meet targets on performance indicators. Plaintiffs offer the examples of two districts that did not have action required for low performance on state assessments, while action was required of a district with much higher levels of proficiency (PDC at 16, 17, 25).

CDE responds that it will avoid the use of "action required" in the future on this report, and will instead simply report on whether or not the target was met. CDE states that its APR measure "in substance, will continue (1) to correct, within a year, all findings of noncompliance for the compliance indicators applicable to LEAs...." This appears to contradict CDE's comment on this subject in its 3/22/13 response, in which it described its APR report in the following manner: "The APR indicator report is just that--a report. Findings of noncompliance are made through other means--the databased noncompliance process, the disproportionality reviews and through SESRs and VRs." CDE argues that the lack of an "action required" column for Indicator 11 in the report does not indicate that no action was required on this issue: it asserts that it notified the District of this area of noncompliance in a letter and required corrective action within a year. While CDE cites its data-based monitoring and technical assistance guide in support of this point, it does not attach or link to the letter it claims to have sent.

After describing its process for identifying and correcting noncompliance with the APR compliance indicators, CDE turns to Plaintiffs' example from the performance indicators, Indicator 3, and argues that the discrepancy between the treatment of districts pointed to by Plaintiffs is due to nothing more than the fact that the districts for which action was not required did not meet the minimum "n" size for CDE to make accountability determinations, citing its Accountability Workbook. On the performance indicators more generally, CDE claims to assess performance annually in the selection process for VRs, through which it is "more likely to select an LEA with low proficiency on performance indicators...." Elsewhere it writes that the "performance measures are monitored through the SESR or VR" (CDE Design Response II at 4-6; CDE 3/22/13 Response at 13-14).

In spite of the inconsistency between the two CDE documents cited regarding the purpose of the APR indicator report, Plaintiffs' concern about the report is misdirected.
First, CDE is required by the statute to report annually about the performance of each LEA (§ 300.600(a)(4)), must do so using the targets in the SPP[14] within 120 days of submitting its APR each year (§ 300.602(a) and (b)(1)(i)(A)), and cannot report information on performance that would not be statistically reliable or that would result in the disclosure of information that could identify individual students (§ 300.602(b)(3)). CDE is not required by the regulations to use this report as a tool to directly improve compliance or performance; it is simply a vehicle for public reporting.

[14] The regulation also requires the use of the priority areas in the annual report, but as Plaintiffs do not specifically raise this issue, the Monitor will not make a determination on it.

Second, as noted above, Plaintiffs regard the APR Measure as the sole means CDE identified in its documents through which it identifies and corrects noncompliance annually (PDC at 25). But CDE has also provided links to what it regards as "extensive evidence" that aspects of this are accomplished through other means (CDE's 3/22/13 Response at 13; CDE Design Response II at 5). Plaintiffs offer no arguments based on these documents and, thus, the Monitor makes no determination related to them. However, CDE must support its claim regarding its identification and correction of noncompliance in Ravenswood with the relevant compliance indicators (but for Indicator 4, see Section III. C. 3. c. below). As CDE has claimed that it monitors the performance indicators through the SESR and VR processes, including the VR selection process, the adequacy of these processes for this purpose--considered in light of Plaintiffs' objections--will be discussed below.

Monitor's Determination: Plaintiffs have not shown that CDE's LEA APR reports are inconsistent, nor have they shown that CDE is required to use these reports for compliance purposes. In addition, Plaintiffs have not shown that CDE lacks effective means for identifying and correcting noncompliance with compliance indicators annually. However, within 60 days of the date of this memo, CDE shall submit to the Monitor and parties all documents relevant to, and showing the adequacy of, its monitoring of Indicators 11 and 12 in Ravenswood for the last three years. The necessity of further corrective actions will be determined after review of the submission.

b. Use of Determinations

Plaintiffs turn next to the annual determinations required by the IDEA, and claim that CDE has put forth no evidence that it uses these determinations in any part of its monitoring system or in its annual reviews. Plaintiffs, however, do not explain their view of how the IDEA requires determinations to be used as part of an SEA monitoring system. Plaintiffs do not regard the responses made by CDE on the issue of the use of determinations as "meaningful," and critique two documents produced by CDE: while they agree that one undated document "purports" to apply the determinations, they state that it "cannot be confirmed to be a CDE monitoring document"; the other, a sample letter sent to a district, Plaintiffs do not regard as evidence supporting CDE's claim.
Plaintiffs' PDC does not state the reasons for the latter judgment, although their 3/8/13 comments, which they cite in support on this point, appear to describe this document as "vague, ambiguous, unclear and lacks timeframes for response, see e.g., sample letter to district re failures to meet SPPI measures" (PDC at 16-17, 25-26; Plaintiffs' 3/8/13 comments at 13[15]).

[15] This comment by Plaintiffs also includes the claim that "OSEP has registered the same concerns with respect to CDE's failure to make determinations annually consistent with the above requirements," citing Exhibit 5 (at 2) of CDE's 9/24/12 Request for Judicial Notice. But the second page of this 2012 OSEP letter responding to CDE's submission of its FFY 2010 APR and revised SPP does not state any concerns regarding CDE's performance on determinations; it appears to be merely a reminder to the state to make the determinations.

CDE addresses determinations in several places in its responses to Plaintiffs' objections. It states that it makes determinations annually, and that the determinations are "dispositive" of the issues LEAs must address in their SESRs. In addition, the annual data collection for the determinations "pinpoints" the issues LEAs must address in their SESRs. CDE also notes that it "has available" both determinations and VR findings to make findings of noncompliance. Citing its pilot proposal, CDE claims that determinations are calculated based on SPP indicators (reflecting, in CDE's view, the priority areas), and indicators for audits and timely correction of noncompliance.

One of CDE's attachments to its 2/22/13 document is a 10/1/12 sample letter to districts conveying determinations for 2010-11. According to this document, the indicators used in CDE's determinations for that year include Indicators 4A, 4B, 9, 10, 11, 12, timely and complete reporting, and audit findings. Timely correction of noncompliance is also listed in the table in the letter, but CDE states there that this item was not used in the determinations for that year (CDE 10/1/12 sample letter at 3). Another attached document, entitled California Local District Compliance Determination Process Under Section 616, IDEA 2004, is undated and not on CDE letterhead. This document explains how CDE arrives at determinations. Points are awarded for each of the indicators noted above in the sample letter in accordance with standards stated in the document (4=meets requirements, 3=needs assistance, 2=needs intervention, and 1=needs substantial intervention). To arrive at the overall determination, CDE sums the determination for each indicator and divides by the number of indicators with numerical values (excluding those that are not applicable or not calculated). Regardless of whether a district's overall determination is needs assistance (NA), needs intervention (NI), or needs substantial intervention (NSI), the document prescribes identical next steps: the LEA "must" seek technical assistance from its assigned CDE consultant (at 14); however, the document also states that additional sanctions "may apply" to districts that were NA or NI for two or more consecutive years (at 15). NSI districts are not mentioned (CDE Design Response I at 8, 13, 16, 19; CDE Design Response II at 1-3; attachment to CDE 3/22/13 Response at 7; attachments to CDE 2/22/13 Response at 13).
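To illustrate the arithmetic of this averaging method, consider a hypothetical district (the indicator scores below are invented for illustration and are not drawn from any CDE document or any actual district). Suppose the district receives scores of 4, 4, 3, 2, and 1 on five applicable indicators, with its remaining indicators not applicable or not calculated:

\[
\text{overall determination score} = \frac{4 + 4 + 3 + 2 + 1}{5} = \frac{14}{5} = 2.8
\]

The documents CDE produced do not appear to specify how a fractional score such as 2.8 is mapped onto the four determination categories; the corrective action ordered below accordingly requires CDE to show with clarity the scores that will result in each level of determination.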
CDE is required by the regulations to make determinations "about the performance of each LEA" annually using the four specified categories (§ 300.600(a)(2); emphasis added). As part of its responsibilities under paragraph (a)--which include monitoring at (1), determinations at (2), enforcement at (3), and annual reporting at (4)--CDE "must use quantifiable indicators and such qualitative indicators as are needed to adequately measure performance in the priority areas identified in paragraph (d) of this section, and the indicators established by the Secretary for the State performance plans" (§ 300.600(c); emphases added). Again, the priority areas identified at (d) include FAPE in the LRE; state exercise of general supervision, including child find, effective monitoring, the use of resolution meetings, mediation, and a system of transition services; and disproportionate representation of racial and ethnic groups in special education (to the extent the representation is the result of inappropriate identification). The determination categories available to CDE are set forth at § 300.603(b)(1), which also states that the determinations should be "[b]ased on" the information in the APR, "information obtained through monitoring visits, and any other public information made available..." (emphasis added). The enforcement steps that flow from the determinations are listed at § 300.604(a)-(c), and additional enforcement steps at § 300.608.

Setting aside once again CDE's proposal for a pilot for Ravenswood for the reasons set forth above, except to note on this issue as well that the proposal takes steps in the right direction, CDE's process for making annual determinations does not comply with the statutory requirements. The determinations are required to be "about" performance, and CDE is further required to use indicators in its determinations that are adequate to measure performance in the priority areas in addition to the SPP indicators. Yet it is plain from the process described in CDE's documents that its determinations have been limited to the compliance indicators, in addition to timely/complete reporting of data and audit findings. CDE attempts to take refuge behind OSEP guidance to states on this subject, citing a 2009 federal document that allows states, in spite of the clear regulatory requirements for determinations, to consider only compliance indicators, valid and reliable data,[16] correction of identified noncompliance,[17] and other data available to the state about districts' compliance with the statute (CDE Design Response II at 2-3). But, in addition to the absence of an argument for its position from the regulatory requirements for determinations, CDE does not mention that OSEP now acknowledges the conflicts between the statutory requirements and its former approach to determinations, and is shifting in a more defensible direction. As OSEP wrote in 2012:

The current system places heavy emphasis on procedural compliance without consideration of how the requirements impact student learning outcomes. In order to fulfill the IDEA’s requirements, a more balanced approach to determining program effectiveness in special education is necessary. ...The Department is required to annually make determinations of each State’s performance status using data from the APR and other publicly available data.
The designation “meets requirements” should acknowledge a State’s effectiveness in improving outcomes for children with disabilities relative to other states and to the nation. Determinations under RDA will be based on States’ overall performance on a set of priority indicators and other relevant data rather than only on compliance indicators.[18]

[16] However, CDE's indicator for determinations looks for timeliness and completeness of data reporting, not validity and reliability.
[17] However, timely correction of noncompliance was not considered by CDE for 2010-11.
[18] OSEP, Results-Driven Accountability in Special Education, Summary, April 5, 2012, at 1-2; emphasis added (http://www2.ed.gov/about/offices/list/osers/osep/rda-summary.pdf).

CDE is required to use indicators adequate to measure performance in its annual determinations. In addition, CDE's determinations process is not based on "information obtained through monitoring visits." While it appears that timely correction of noncompliance was at least intended to be included (but was not, for the 2010-11 school year), limiting the inclusion of monitoring information in determinations to just whether noncompliance has been corrected timely has no basis in the IDEA's requirements. Further, basing the SESR in part on the determinations is laudable, but has the relation between monitoring and determinations backwards: the determinations are to be based in part on monitoring findings. Moreover, the failure to use current monitoring findings as a factor in the determinations can also produce results that strain common sense. For example, if CDE makes findings of noncompliance in a district and has not yet verified that the noncompliance has been corrected, CDE's process appears to allow it to label such a district "meets requirements" at the same point in time during which it is in possession of information showing that the district does not, in fact, meet all requirements.

Further, CDE's documents related to determinations are unduly vague regarding the consequences of certain determinations. One cannot determine in accordance with the Consent Decree whether the state-level system in place is or is not capable of ensuring continued compliance with the law and the provision of FAPE to children with disabilities in Ravenswood without clarity regarding the consequences of NA, NI, or NSI determinations. Stating that unspecified additional sanctions "may apply" to certain districts does not provide the needed clarity. Finally, the documents produced by CDE do not include any that the Monitor has been able to locate conveying the determinations CDE has applied to Ravenswood in recent years, or data regarding CDE's use of the four categories of determinations statewide. As the parties and Court have a good deal of information regarding compliance in Ravenswood in recent years, CDE's determinations for the District may be instructive.

Monitor's Determination: CDE has not demonstrated that its process for making determinations is compliant with the statute or Consent Decree. Therefore, CDE shall engage in corrective action steps to develop and implement a process for making determinations that is compliant with the statute and Consent Decree. For each type of data identified by CDE pursuant to the Monitor's Determination for Section III. C. 1. above, CDE shall set forth with precision through a submission to the Monitor and the parties whether, and if so how, the type of data will be used for annual determinations.
In addition to setting forth the manner in which determinations will be calculated, the submission shall show with clarity the scores that will result in each level of determination. Further, the submission shall show how determinations will be based on information obtained through monitoring activities. The submission shall also set forth the consequences of each determination, considering the number of years an LEA has been in a determination level. Finally, for each of the last five years, CDE shall also convey in its submission the determination given to Ravenswood, the reason(s) for the determination, and the number and percentage of California districts placed in each determination level.

c. Suspension/Expulsion/Disproportionate Representation

Plaintiffs argue first that CDE's monitoring system neither captures nor corrects disproportionate suspensions of students with disabilities, particularly non-white students. Citing 2009 U.S. Department of Education (USDOE) data and a 2012 UCLA Civil Rights Project document, Plaintiffs argue that Ravenswood and other districts in the state have disproportionately high rates of suspension of students with disabilities, especially African-American students with disabilities. The UCLA document in particular shows, Plaintiffs argue, that of approximately 500 California districts Ravenswood ranks 43rd in the use of more than one out-of-school suspension for students with disabilities, and 11th in such suspensions of African-American students with disabilities. Plaintiffs add that these data do not count in-school suspensions, which can also affect compliance with the requirements associated with the priority areas. Plaintiffs argue that while CDE claims to collect such data through CASEMIS, it has not produced reports or systems to correct the problems indicated by the data. Responding to CDE's claim in its 2/22/13 response (at 12) that the conclusions of the UCLA study cannot be replicated using CDE data, and that the rate of suspensions and expulsions is half that reported by UCLA and "appears" to have decreased over the last three years, Plaintiffs counter that CDE's response relies on CASEMIS data, which it assumes to be accurate, and add that the USDOE data on which the UCLA study was based were collected directly from the state or districts (PDC at 26-28; Plaintiffs' 3/8/13 comments at 12).

Plaintiffs argue that because SPP Indicator 4 data are only collected for suspensions greater than 10 days, Ravenswood's alleged problems in this area are masked by CDE's exclusive use of Indicator 4 data for compliance purposes. Plaintiffs argue further that this approach does not connect suspensions of any length to potential child find and FAPE violations. In addition, Plaintiffs state that they have not identified any process for individualized review of students suspended or expelled to monitor whether a denial of FAPE and/or behavior related to students' disabilities has caused the high rates of suspension (PDC at 28).

CDE responds that suspension/expulsion data were formerly collected through CASEMIS, but are currently collected through CALPADS. It is not clear from CDE's response whether the former is continuing to collect, and will continue to collect, these data as well.
CDE uses the data to identify districts that are significantly discrepant overall from the state rate of suspensions/expulsions, and districts significantly discrepant by race/ethnicity, for removals of greater than 10 days. CDE has calculated the state rate of suspensions/expulsions for greater than 10 days as .60 (presumably 0.60%) and adds, for unexplained reasons, a 2% "variation," which sets the state "bar" at 2.6% (0.60% + 2% = 2.6%). Districts that exceed this rate overall or by race/ethnicity are required to conduct a Special Self-Review (SSR). The SSR includes review of the district's policies, procedures, and practices. CDE consultants are "available" to assist districts in this self-review process. CDE's earlier response included links to SSR instructions, a student practices review form, and a policies and procedures review form, all dated 10/12.

CDE argues that it should not be "compelled to go beyond" the OSEP requirement to collect data on suspensions/expulsions of greater than 10 days as a result of the data from the UCLA study cited by Plaintiffs. In support, CDE offers two arguments: first, CDE uses 2009-10 data to show that 0% of Ravenswood students with disabilities were suspended or expelled for greater than 10 days, compared with a state average of .11%; and second, CDE disagrees with the conclusions and methodology of the UCLA study. As noted above, CDE states that CALPADS began collecting suspension/expulsion data in the 2011-12 school year, and explains that all expulsion data are being collected, even if the expulsion's term changed or was suspended. CDE does not state whether all suspension data are being collected as well, or whether these data include in-school suspensions. Finally, CDE states that the state 2011-12 data do in fact indicate that racial/ethnic groups were suspended at different rates.[19] In response to the data, CDE spotlights several initiatives and adds that a district's policies, procedures, and practices related to child find and IEPs "may be" revised by CDE after "proper review" of districts' CALPADS data. No timeline or process is set forth for that review (CDE Design Response I at 10-12, 18-19; ftp://ftp.cde.ca.gov/sp/se/ds/201112%20Special%20Self%20Review%20of%20Disproportionality/).

The Monitor has made a determination above related to data accuracy. In addition, there is no need to resolve the specific dispute regarding the respective accuracy of the UCLA and CDE data in order to resolve Plaintiffs' objection on the issues of suspensions/expulsions[20]: even if one assumes the UCLA data and conclusions to be inaccurate, Plaintiffs' argument may still persuade and reach its desired conclusion in a weaker form. In other words, any problems that Ravenswood or any other district may have regarding disproportionate use of suspensions of any length may not be revealed by CDE's exclusive use of Indicator 4 data for compliance purposes, as disproportionality can also exist in a district's use of removals of fewer than 10 days. Moreover, Plaintiffs have also argued that CDE's approach to this issue cannot connect suspensions of any length to potential child find and FAPE violations, and that they have not identified any process for individualized review of students suspended or expelled to determine whether a denial of FAPE and/or behavior related to students' disabilities is a causal factor in suspensions.
Plaintiffs here are not quarreling with CDE's approach to Indicator 4 noncompliance, and have not offered any critique of the SSR process and instruments CDE uses for that purpose;21 thus much of what CDE says in response, as it is based on Indicator 4 comparative data and concerns the Indicator 4 process, is not relevant to Plaintiffs' argument. CDE acknowledges that the 2011-12 data show an unspecified amount of disproportionate use of suspensions, but its response to the most important part of Plaintiffs' argument is vague regarding when the "proper review" of districts' CALPADS data will take place, what this review will consist of, the level of disproportionality that will result in a review of districts' policies, procedures and practices related to child find and IEPs, how this review will be conducted and by whom, and whether such reviews will take place annually. In addition, CDE has not clarified with sufficient precision the suspension/expulsion data it is collecting and whether those data include in-school suspensions.

19 The press release from CDE on this subject states: "... the data show African-American students are 6.5 percent of total enrollment, but make up 19 percent of suspensions. White students are 26 percent of total enrollment, but represent 20 percent of suspensions. Hispanic students are 52 percent of total enrollment, and 54 percent of suspensions" ("State Schools Chief Tom Torlakson Releases First Detailed Data on Student Suspension and Expulsion Rates," 4/19/13).
20 Nor is it possible for the Monitor to do so without collecting data in California school districts.
21 While Plaintiffs do not critique the SSR process and instruments, they do note in the PDC (at 30, fn. 2) that CDE has not provided additional information related to SSRs requested by Plaintiffs, including rates of SSRs, resulting findings, and follow-up visits (Plaintiffs' 3/8/13 comments at 11; CDE 3/22/13 Response at 22-23). But, as noted above, CDE has stated the circumstances that provoke SSRs related to Indicator 4, and has also produced the instructions and forms used in SSRs. As will be seen in Section III. C. 4. below, Plaintiffs are also aware that findings of noncompliance related to Indicator 4 have been made by CDE in the past. As Plaintiffs have not critiqued the process, the Monitor will not require CDE to produce additional information related to SSRs.

Monitor's Determination: CDE has not demonstrated that it uses any individualized process to ensure that students with disabilities subjected to disciplinary removals for fewer than 10 days are receiving FAPE, including any positive behavior supports necessary for them to receive FAPE; nor has it set forth such a process for students subjected to disciplinary removals for fewer than 10 days who do not currently have IEPs to ensure that such students are evaluated if they are suspected of having disabilities. Therefore, CDE shall engage in corrective action steps reasonably calculated to ensure that students with disabilities subjected to disciplinary removals for fewer than 10 days are receiving FAPE, including any positive behavior supports necessary for them to receive FAPE; and to ensure that students subjected to disciplinary removals for fewer than 10 days who do not currently have IEPs are evaluated if they are suspected of having disabilities.
CDE shall set forth with precision through a submission to the Monitor and the parties the suspension/expulsion data it is collecting, the database(s) from which it is collecting those data, and whether the data include in-school suspensions; the frequency of the review of districts' CALPADS data; the substance of this review; the level of disproportionality that will result in a subsequent review of districts' policies, procedures and practices related to child find and IEPs; and how the latter review will be conducted and by whom. The necessity of further corrective actions will be determined after review of the submission.

4. Lagging Follow-Up with Noncompliant Districts

Plaintiffs argue that CDE has not consistently ensured that identified noncompliance is corrected within one year. CDE's sample letter to districts, according to Plaintiffs, is not clear regarding the one-year timeline to correct noncompliance. In addition, Plaintiffs believe that CDE documents show failure to correct noncompliance in a number of areas and districts, and point specifically to findings of noncompliance related to Indicator 4A: only 171 out of 821 of these findings were corrected within one year.22 Although Plaintiffs assert that the system design "is not set up" to correct noncompliance timely, they offer no analysis to show what the design lacks in order to do so (PDC at 29; Plaintiffs' 3/8/13 comments at 7).

22 The document shows the number to be 178.

CDE's two major responses to Plaintiffs' objections do not respond to these concerns. In its 3/22/13 response CDE does respond in a limited way to Plaintiffs' 3/8/13 comment on this subject. Discussing the documents characterized in their PDC and paraphrased above, Plaintiffs wrote that these documents "demonstrate that CDE is not currently on top of the limited identified noncompliance that has continued for more than one year in a number of districts" (Plaintiffs' 3/8/13 comments at 7). CDE responded that it "does not understand what is meant by the phrase, 'not on top of.' CDE has and is tracking a large number of findings" (CDE 3/22/13 Response at 17).

Turning first to the sample letter ("NC District Letter" dated 1/15/13), Plaintiffs are correct that it is not clear regarding the timeline for compliance. However, the letter states that the CDE consultant assigned to the district would follow up with the district "to identify timelines." Further, Plaintiffs do not acknowledge that this letter concerns findings that have already passed the one-year timeline for correction; the findings were originally made in June 2011 and the data one year later showed that the noncompliance was not corrected. For that reason the letter uses the phrase "continued noncompliance," and requires the district to complete a root cause analysis (RCA) and a corrective action plan (CAP) to address the root cause of the noncompliance (at 2). CDE's "Data NC Webinar" sets 5/31 as the deadline for individual student correction, RCA and CAP, and 11/1 as the final deadline for completing all corrective actions and the Prong II review (an additional review of students to ensure that no additional violations are found) (at slide 25).
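The correction-rate figures in the "Noncompliant Findings Report," discussed immediately below, follow from simple arithmetic on the counts cited in the text and in note 22; the following minimal sketch, with an illustrative helper name, reproduces them:

```python
# Illustrative arithmetic only, using counts cited in the text: 821
# Indicator 4A findings (178 corrected within one year per note 22,
# 171 per the PDC) and 76,480 findings tracked overall at a 98.1%
# one-year correction rate.

def one_year_correction_rate(corrected: int, total: int) -> float:
    """Percentage of findings corrected within one year, to one decimal."""
    return round(100 * corrected / total, 1)

print(one_year_correction_rate(178, 821))  # 21.7 -- matches the report
print(one_year_correction_rate(171, 821))  # 20.8 -- the PDC's 171 does not
print(round(0.981 * 76480))                # 75027 findings corrected overall
```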
Second, while the "Noncompliant Findings Report" shows that only 21.7% of Indicator 4A violations (178 of the 821 findings noted above) were corrected within one year, the report also shows that 98.1% of all violations were corrected within one year (roughly 75,000 of the 76,480 total violations being tracked by CDE that year), which is not evidence of a systemic problem stemming from a faulty design of this aspect of CDE's monitoring system. However, CDE should account for the Indicator 4A results.

Monitor's Determination: Plaintiffs have not established that CDE is not consistently ensuring that identified noncompliance is corrected within one year, nor that the system design is not set up to correct such noncompliance timely. However, within 60 days of the date of this memo, CDE shall submit to the Monitor and parties an explanation setting forth the reasons why only 21.7% of Indicator 4A findings of noncompliance were corrected within one year. The necessity of further corrective actions will be determined after review of the submission.

D. SESRs

1. Ineffective SESRs

Plaintiffs note that SESRs "only" take place every four years on a cyclical schedule, and assert that this frequency, when considered along with what Plaintiffs regard as the ineffectiveness of CDE's annual review process, results in a monitoring system that cannot identify noncompliance "at or approaching real time as required by law" (PDC at 30-31).

CDE construes Plaintiffs' position on this issue as "...SESRs should be conducted annually...," and argues in response that annual SESRs would not be "meaningful." CDE believes that its use of SESRs every four years allows multiple years of data collection to identify statistically relevant trends, rather than anomalies, to inform a district's development of its monitoring plan (CDE Design Response I at 13).

Plaintiffs have not argued that SESRs should be conducted annually, but do not state or support a position on how often such a self-monitoring process should take place. Plaintiffs also do not support their contention that "at or approaching real time" identification of noncompliance in all districts is required by the statute. Although CDE's conclusion is stronger than its argument merits (as multiple years of data are available each year, a data-based monitoring process could be conducted annually based on statistically relevant trends), Plaintiffs' argument for more frequent SESRs is not convincing.23

23 As Plaintiffs do not argue that more frequent SESRs should be performed in some circumstances in a subset of districts based on, for example, districts' determination levels, the Monitor does not make a determination on this issue.

Monitor's Determination: Plaintiffs have not shown that the current frequency of SESRs does not comply with the statute, Consent Decree, or Court Order.

2. Too Limited Data Collection

Plaintiffs assert that they have not identified any description showing how the SESR process is used to focus monitoring activities on improving student results and outcomes and on performance in the priority areas, FAPE in the LRE and child find in particular. Plaintiffs argue that CDE's prior responses indicated that the SESR process does not include parent meetings or surveys, parent or student interviews, and observations of students. Regarding the educational benefit review, only five student records are reviewed; additional records are only reviewed if noncompliance is found in the initial reviews. In the IEP implementation review only ten records are reviewed regardless of district size.
Thus, Plaintiffs conclude that the lack of interviews, surveys, parent meetings, and student observations, in addition to the stated inadequacies of the sample sizes used in the educational benefit and IEP implementation reviews, amounts to inadequate quantitative and qualitative indicators to measure performance in the priority areas (PDC at 31-32).

CDE, however, believes that the SESR "concentrates" on the priority areas, including child find, FAPE in the LRE, and educational benefit. It explains that a district's monitoring plan is based on qualitative and quantitative information, which includes parent input gained through meetings and surveys conducted by the district, the SPP indicators on which action by the district was required, and three years of the district's complaint and due process history. In addition to the monitoring plan, CDE points to the SESR monitoring activities as another aspect of the process that includes a range of data. Specifically CDE highlights the review of a sample of students' special education records, the educational benefit review of five student files, and the IEP implementation review of ten students. CDE claims that "...the SESR must demonstrate a district's compliance with fourteen" SPP indicators (emphasis in original), but does not note that a number of the indicators do not apply to an elementary district such as Ravenswood (CDE Design Response I at 13-14).

With respect to the individualized monitoring plan, CDE has stated that it is only the SPP indicators on which action was required of a district that form part of the basis of the plan.24 Hence, it is difficult to understand how the SESR could "demonstrate" compliance with all indicators, as all indicators would include those on which action was not required. Second, as some of those indicators are performance indicators, it is unclear how the SESR would demonstrate "compliance" with them (although CDE is likely referring imprecisely here to compliance with IDEA requirements related to the performance indicators25).

24 See also "Final Activity One 12-13 SESR PPT," slide 32; "Monitoring Plan Evaluation Checklist 2012-13" at 2; SESR Instruction and Forms Manual (10/12) at 7-9; 2011-2012 SelfReview Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants at 13.
25 See § 300.600 (b)(2); CDE's 2/22/13 response at 44: "There are no findings of noncompliance with an indicator. There are findings of noncompliance with individual items in the item table related to the indicator in question."

Contrary to Plaintiffs' claim, CDE documents confirm that at least one parent meeting, and parent surveys if attendance at the meeting is below 20% of the special education population, must be conducted as part of the development of the monitoring plan. Further, if the survey does not bring the total attendance/response up to the 20% level, districts "must use additional means" to gather parent input, such as phone or email surveys, or surveys at IEP meetings. Issues identified by parents must be in the monitoring plan if those issues: 1) were identified by parents and validated by data; 2) appear to be a violation of law; or 3) were expressed by "several" parents, affect a number of students, or occur at a number of sites. However, while the 20% standard for parent input appears to be verified by CDE consultants, the specific standards for including issues raised by parents in the monitoring plan do not appear to be verified by consultants.
In addition to complaint and due process history, other data included in the monitoring plan concern highly qualified teachers and overdue annual and triennial IEPs.26 Thus, while parent input is included in the monitoring plan, it does not appear from CDE's documents that CDE verifies that issues raised by parents have been appropriately included in the plan in accordance with its standards. Second, for reasons similar to those stated in Section III. C. 1. above, because the data that influence the monitoring plan are largely limited to the SPP indicators, and only those on which action was required by CDE and/or the target was not met, CDE does not ensure that sufficient qualitative and quantitative indicators are used by districts in the development of the plan. It is noteworthy in this regard that a key CDE document does not even list improving student results and outcomes as part of the purpose of the SESR or among the "six broad questions" the SESR is "designed to answer."27

26 "Final Activity One 12-13 SESR PPT," slides 13, 18-19, 22, 37; "Monitoring Plan Evaluation Checklist 2012-13" at 1-3; 2011-2012 SelfReview Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants at 3, 9-12; SESR Instruction and Forms Manual (10/12) at 7.
27 SESR Instruction and Forms Manual (10/12) at 1; Plaintiffs' 3/8/13 comments at 15; see also CDE's SESR Implementation Response, 8/26/13, at 1-2.

With respect to the sample sizes for the educational benefit and IEP implementation reviews, CDE documents verify that samples for the former are five students receiving mental health services (with other students added if there are fewer than five), and ten students ("up to" five of whom are labeled emotionally disturbed (ED) or are receiving mental health services) for the latter review. However, CDE's SESR training document states that the sample for the latter review is 15 students, five of whom are ED or receiving mental health services. Unlike the record review portion of the SESR, no allowances are made in the CDE documents for different sample sizes based on the size of the district for either the educational benefit or IEP implementation reviews. The IEP implementation review also includes parent interviews, and interviews of teachers and/or service providers ("Final Activity Two 12-13 SESR PPT," slides 25, 67; SESR Instruction and Forms Manual (10/12) at 22). The Monitor has not located in CDE's SESR documents any mention of student observations.
Hence, while parent interviews are included in the IEP implementation review, the sample sizes used for both reviews are not sufficiently tailored to the size of the district to ensure FAPE in the LRE is being provided and that the SESR portion of the monitoring system can identify both noncompliance and compliance appropriately based on adequate evidence.28 While student observations are not included in the SESR process, Plaintiffs have not offered a convincing rationale for the inclusion of that methodology in all SESRs.29

28 As Plaintiffs have not raised issues related to the composition of samples beyond the size, nor issues related to the methodologies used in these reviews, the Monitor makes no determinations related to them. However, such concerns are raised by Plaintiffs in the context of SESR implementation; see Section IV. A. 3. below.
29 As Plaintiffs do not argue that student observations should be included in some SESRs based on data in a district indicating, for example, poor student results and outcomes or student and/or parent dissatisfaction with the quality of services, the Monitor makes no determination on this issue.

Monitor's Determination: Plaintiffs have not shown that parent meetings and/or surveys, and parent interviews, are not used in the SESR process, nor have they convincingly argued that student observations should be included in all SESRs. CDE has not demonstrated that its SESR process uses adequate quantitative and qualitative indicators to measure performance in the priority areas in the creation of the monitoring plan, that it exercises oversight sufficient to ensure that its own standards regarding inclusion of parent input in the monitoring plan are followed, or that it uses sample sizes sufficient to ensure compliance with the Consent Decree standards and standards from the Court's Order. Therefore, CDE shall engage in corrective action steps reasonably calculated to ensure that the SESR process uses adequate quantitative and qualitative indicators to measure performance in the priority areas in the creation of the monitoring plan, that CDE exercises oversight sufficient to ensure compliance with its standards regarding inclusion of parent input in the monitoring plan, and that sample sizes sufficient to ensure compliance with the Consent Decree standards and standards from the Court's Order are used. For each type of data identified by CDE pursuant to the Monitor's Determination in Section III. C. 1. above, CDE shall set forth with precision through a submission to the Monitor and the parties whether, and if so how, that type of data will be used in the development of SESR monitoring plans. The submission shall explain the basis for CDE's belief that the resulting data on which SESR monitoring plans will be based will be adequate to measure performance in the priority areas. Further, the submission shall include amended guidance to CDE consultants adequate to ensure that CDE's standards with respect to inclusion of parent input in monitoring plans are followed. In addition, the submission shall include amended SESR procedures and guidance to its consultants to ensure that adequate sample sizes are used in the educational benefit and IEP implementation reviews.
3. Too Limited Data Verification

Plaintiffs claim that computer programs generate SESR monitoring plans with minimal intervention from CDE staff after district input of data, a process that does not include "any" focus on the IDEA priority areas. Plaintiffs claim further that the design of the SESR process lacks "data integrity instructions" to the Special Education Local Plan Areas (SELPAs), the entities Plaintiffs believe collect SESR data. Such instructions are necessary in Plaintiffs' view to understand CDE's monitoring of SESR data collection. With respect to CDE verification of the adequacy of SESR monitoring plans, Plaintiffs believe CDE verifies the plans "in some basic detail," but does not ensure that districts' self-selection of items from parent surveys is done "validly and reliably," nor does CDE have processes to ensure that districts enter items on which CDE required action through its APR reports. Plaintiffs argue concerning SELPAs that the system design has no methods through which it checks districts' SESRs within one SELPA against the SESRs of districts in other SELPAs, "or against itself" to ensure consistency; no methods to ensure that a SELPA does not offer "school district reports through its own prism"; no methods to ensure that CDE "connects the dots" between districts in the same SELPA; and no system for "monitoring itself," which Plaintiffs believe is required by § 300.600 (a)(1) (PDC at 32-33; Plaintiffs' 3/8/13 comments at 13, 15, 17).

CDE responds by noting that it trains both SELPAs and districts before each stage of the SESR begins, "review[s] and approve[s] the conclusion of each activity," and provides technical assistance during the process. A CDE consultant also "checks the quality" of the monitoring plan using a checklist; the consultants and SELPAs also provide training, and review and approve each of the SESR activities (CDE Design Response I at 14-15).

As the SESR process is part of its monitoring system, CDE must ensure that monitoring plans are developed that comply with its standards for such plans. In addition, it must ensure that the monitoring process itself is conducted with integrity and results in accurate findings of both noncompliance and compliance based on adequate evidence and reasoning. The methods through which CDE attempts to do so are either adequate or they are not; for this reason, the role of SELPAs in the SESR process is irrelevant, as is the role of computer programs in the creation of monitoring plans. In addition, Plaintiffs offer no reasons to believe that SELPAs have any greater self-interest at stake in the SESR process than districts themselves. Nor do Plaintiffs set forth any standard from the IDEA, Consent Decree, or Court's Order that would require CDE to crosscheck the SESRs from within one SELPA against those from other SELPAs, or to connect any similar SESR findings within a SELPA.30

30 CDE would, of course, be wise to do the latter under some circumstances (for example, LRE violations within a SELPA's geographic area regarding the use of county self-contained programs), but it is not required to do so as part of the process of ensuring that the SESR process is conducted appropriately.

That said, CDE's processes for ensuring the fidelity of monitoring plans and of findings from the SESR process, regardless of what entity develops those plans and makes the actual findings, are clearly relevant. A determination has been made at Section III. D. 2. above that it does not appear from CDE's documents that CDE verifies that issues raised by parents have been appropriately included in monitoring plans in accordance with CDE's standards.
In addition, a determination has been made regarding the extent to which the SESR process uses adequate quantitative and qualitative indicators to measure performance in the priority areas in the creation of the monitoring plan. CDE has claimed that it reviews and approves each stage of the SESR process, and that it checks the quality of the monitoring plan. The tool CDE consultants use to check the quality of the plan has two clear areas in which consultants judge whether all indicators noted as "action required" or "not met" were included ("Monitoring Plan Evaluation Checklist 2012-13" at 2-3). The manual for CDE consultants to use in the SESR process calls for this check as well (2011-2012 SelfReview Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants at 13, 17). Hence, CDE's processes contain adequate safeguards to ensure that districts enter items on which CDE required action related to SPP indicators.

Turning to oversight of the monitoring process itself, CDE points to two processes it uses to ensure the integrity of SESR monitoring: the first is the use of the instruction document for CDE consultants, and the second is the SESR follow-up process. In addition to its use in evaluating monitoring plans, CDE describes the instruction document as addressing "student selection, accuracy and completeness of findings, student and district level corrective actions, and assurances" (CDE 2/22/13 Response at 15). A review of the document indicates that it prompts CDE consultants to verify that the district reviewed the number of records called for by the monitoring plan, and reviewed the correct number of records of school-age students, preschool, infants, special populations (for example, English-language learners), and based on race/ethnicity in similar proportions to the district's demographics. The number of educational benefit reviews is also verified. Although a section of the document is entitled "Review Student Findings for Accuracy and Completeness," only noncompliant and not applicable findings are reviewed, and the "review" of noncompliant findings is limited to whether the finding relates directly to the legal requirement, demonstrates with evidence how the requirement was not met, states the source of the evidence, and is written in a complete sentence. A finding of noncompliance will be questioned if it appears "incongruent" with the stated compliance test for that requirement (2011-2012 SelfReview Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants at 21-40, 48-50).

The review process called for by this document is insufficient to ensure the accuracy and integrity of any substantive aspect of the SESR monitoring process beyond the number and type of student records reviewed. Student reviews that do not result in findings of noncompliance or not applicable are not verified at all, and the review of noncompliant findings cannot ensure that the reviews were conducted accurately, as student records are not reviewed by CDE consultants as part of this process.
While CDE also points to the "regular interaction" between consultants, SELPAs, and districts throughout the monitoring process (CDE 2/22/13 Response at 17), that is an insufficient safeguard against inadequate SESR implementation and inaccurate findings.31

31 This is shown in the context of the Ravenswood SESR by CDE's Ravenswood SESR follow-up report, which found that the District did not implement the IEP implementation and educational benefit reviews properly (Greenwood letter to Hernandez, 8/15/13, at 1).

With respect to the SESR follow-up process, CDE explains that until the 2011-12 school year 5% of districts that performed SESRs were scheduled for follow-up. This was increased to 10% in 2011-12. Districts are selected for follow-up "if there are few or no findings of noncompliance or few or no findings of noncompliance in frequently noncompliant items..." (CDE 2/22/13 Response at 18). However, while another CDE document confirms the 10% target for 2011-12, the criteria for selection are different and include 1) random selection of districts, and/or 2) the selection of districts that failed to complete the SESR timely, failed to correct noncompliance timely, or did not have any student findings of noncompliance.

Putting aside aspects of the methodologies used by CDE that concern the efficacy of corrective actions resulting from a SESR (see Section III. D. 4. below), for districts that report findings of noncompliance for students, CDE claims to examine the district's "documentation" to ensure that the IEP implementation and educational benefit reviews were "conducted correctly and reported accurately"; for districts that did not report findings of noncompliance for students, CDE uses the same file review protocol used by the district during its SESR to review 20 files reviewed as part of the monitoring process by the district.32 The 20 files are selected randomly. The district's SESR review forms are also examined, and the documentation related to the IEP implementation and educational benefit reviews is examined for these districts as well. CDE then determines whether the district's findings match CDE's ("SESR On-site Follow-up Review Guidance" at 1-3).

32 However, another CDE document states that ten records from districts that did not make any student findings of noncompliance are sampled (SESR Instruction and Forms Manual (10/12), Appendix IV, at 44).

Aside from the differences regarding the standards for selection for follow-up reviews in the two cited CDE documents, it is unclear whether the "documentation" examined by CDE in the follow-up review related to the IEP implementation and educational benefit reviews includes all relevant student records. For districts that made findings of noncompliance for students, CDE's stated methodology does not include any steps adequate to ensure that findings of compliance and noncompliance were made appropriately. However, such steps are included in CDE's methodology for districts that did not make findings of noncompliance for students. Moreover, the follow-up monitoring process is only applied to 10% of districts and, thus, cannot be regarded as an approach that can ensure that findings of compliance and noncompliance in all SESRs are made appropriately. Plaintiffs assert, and CDE confirms, that CDE has completed 77 follow-ups ("only" 77, in Plaintiffs' view) in the last three years (Plaintiffs' 3/8/13 comments at 19; CDE 3/22/13 Response at 30).
However, while a CDE spreadsheet shows that 77 have been completed thus far, CDE projects that the three-year rate of follow-ups will be approximately 13.2% when all are completed ("SESR Districts 2009-2012"). Unfortunately, CDE does not set forth the results of those reviews.

Monitor's Determination: Plaintiffs have not shown that the role of SELPAs and computer programs impairs the SESR process, nor have they shown that CDE's process for ensuring that monitoring plans include items related to the SPP indicators on which action was required is inadequate. In the respects set forth above, CDE's processes for validating the accuracy of SESR data collection are collectively insufficient. Therefore, CDE shall engage in corrective action steps to ensure that its processes for validating the accuracy of SESR data collection are reasonably calculated to result in accurate SESR data. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, guidance and training documents it will adopt to ensure that SESR findings are accurate. With respect to the follow-up reviews it has conducted during the last two years, CDE's submission shall also set forth the number and percentages of SESRs that were found to be fully accurate.

4. No Evidence of Required Compliance Determinations/Follow-Up

Plaintiffs indicate that none of CDE's SESR documents mention determinations or enforcement requirements. In addition, Plaintiffs note that CDE has not produced its SESR corrective action follow-up reports. Plaintiffs characterize CDE's Ravenswood SESR monitoring plan, findings, and corrective action documents as flawed in several respects, including showing no evidence of IDEA's required focus and no reports in a number of areas of review33 (PDC at 33-34).

33 Plaintiffs' concerns regarding SESR implementation in Ravenswood will be discussed in Section IV below. See Section IV. J. for discussion of monitoring reports related to the Ravenswood SESR.

CDE asserts that it responds to SESR findings of noncompliance by producing both student-level and district-level corrective actions. The former must be corrected within 45 days and the latter within 90 days. Any noncompliance identified in additional record reviews must be corrected within the same timelines, and 100% compliance achieved within a year (CDE Design Response I at 15).

The instruction manual for CDE consultants calls upon consultants to determine that due dates for student-level corrective actions are present and accurate, and that each corrective action matches the finding and has not been changed from the action CDE approved. For district-level corrective actions, due dates are reviewed, consultants are told to "[e]valuate" the corrective actions--although the specific instruction is limited to determining whether the district changed any "automated" corrective actions and whether the corrective action addresses the finding and compliance standard--and corrective actions are reviewed to ensure they contain evidence that the finding has been corrected and the location of that evidence. The document also informs consultants that CDE's "SESR Closure" letter will apprise districts that 10% of them will be selected for follow-up monitoring (2011-2012 SelfReview Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants at 51-58).
CDE's on-site follow-up guidance document describes the methodologies used in the follow-up reviews to ensure that the corrective actions were efficacious. For districts that reported noncompliance for students, a review protocol is created in the areas found noncompliant by the district. Twenty files of students whose IEP meetings were held after the district's completion of the corrective actions are randomly selected and reviewed with the protocol. If continuing noncompliance is found, a letter is sent to the district that requires a corrective action and timeline for each violation. When the district completes these additional corrective actions, another letter is generated stating that the SESR has been completed satisfactorily. No process for ensuring that the additional corrective actions have been efficacious is set forth in this document, although the sanctions process is initiated if the corrective actions are not completed and the district does not respond to the consultant's "attempt to intervene and assist" ("SESR On-site Follow-up Review Guidance" at 2, 4).

If the follow-up review finds that a district did not implement the SESR as required, and/or "could not document and support" the data and findings, the consultant is to bring this to the attention of the "FMTA unit Manager. An appropriate response will be determined, which may include any or all of the following: the imposition of special conditions, an immediate Verification Review, or that the LEA is recommended for sanctions" ("SESR On-site Follow-up Review Guidance" at 4; emphasis added).

As noted above, Plaintiffs express concerns that neither determinations nor enforcement are mentioned in CDE's SESR documents. With respect to the determinations issue, a determination has been made at Section III. C. 3. b. above. There is no specific requirement for determinations related to a state's LEA self-monitoring process like the SESR, with the exception of including SESR monitoring results as a factor in determinations.

Turning to enforcement, potential enforcement steps are set forth in CDE's on-site follow-up guidance document. However, two concerns are raised by CDE's presentation of its SESR enforcement process in that document. First, when CDE finds that a district's SESR was not implemented correctly, no clear standards are articulated for next steps: the document uses the word "may," and does not state which of the possible sanctions will be used in specific circumstances. Second, for districts that did not complete corrective actions and did not respond to the CDE consultant, the document only states that the sanctions process would be initiated. The specific sanctions to be used in this circumstance are not spelled out.

For the 90% of SESR districts not selected for a follow-up review, the steps specified in the instruction document for CDE consultants are clearly inadequate to determine whether the corrective actions have been efficacious, as those steps do not call for the consultants to review student records or any other pertinent documentation. For the 10% of districts selected for follow-up, the process set forth in the follow-up guidance document is adequate to ensure that noncompliance found through the SESR has been corrected.
However, when the follow-up reveals that the original corrective actions have not been efficacious, no process to ensure that the mandated additional corrective actions had the desired effect is set forth in that document.

Monitor's Determination: Plaintiffs have not shown that determinations must be specifically addressed in SESR documents. CDE has not demonstrated that it uses appropriate sanctions in cases of inadequate SESRs or incomplete corrective actions, or that its process for ensuring the efficacy of corrective actions for districts not selected for follow-ups is adequate. While the process for ensuring the efficacy of corrective actions for districts selected for follow-ups is adequate, no process is set forth for ensuring the efficacy of any additional corrective actions mandated by CDE as a result of the follow-up. Therefore, CDE shall engage in corrective action steps to ensure that its documents are clear regarding the sanctions to be employed in cases of inadequate SESRs or incomplete corrective actions, and that its processes for 1) ensuring the efficacy of corrective actions for districts not selected for follow-ups, and 2) ensuring the efficacy of any additional corrective actions mandated as a result of the follow-up, are adequate to do so. CDE shall set forth with precision through a submission to the Monitor and the parties the sanctions to be used in cases of inadequate SESRs or incomplete corrective actions. In addition, the submission shall set forth processes reasonably calculated to ensure the efficacy of SESR corrective actions, and any mandated additional corrective actions.

E. VRs

1. Ineffective VRs

Plaintiffs assert that there is "insufficient evidence" that CDE selects districts for VRs "when it should," particularly in light of what Plaintiffs regard as the defects in the APR measure and CrEAG document. They point out that CDE data show that from 2008-09 to 2012-13 between 1.5% and 3.5% of districts were chosen for VRs each year, in Plaintiffs' view "an extremely small number." It is clear to Plaintiffs from these data that CDE does not conduct "enough good-faith (i.e., non-flagged) VRs to meet a good stewardship standard that comports with IDEA's mandates" (PDC at 34-35; Plaintiffs' 3/8/13 comments at 29). In addition, Plaintiffs argue that the criteria for selecting districts for VRs "do not match or reflect" the IDEA determinations Plaintiffs believe CDE must use for this purpose, or the primary focus and priority areas for monitoring. Further, Plaintiffs claim that CDE has not provided evidence to support its contention that VRs are performed when districts do not adequately implement a SESR (PDC at 35; Plaintiffs' 3/8/13 comments at 21-22, 26).

CDE does not respond to Plaintiffs' point that it does not select districts for VRs "when it should," or to their point regarding non-flagged VRs. With respect to the number of VRs conducted in recent years, CDE argues that Plaintiffs' argument "lacks foundation" due to their failure to offer different criteria for VR selection or a number of districts that should be selected for VR. CDE also does not defend the number of VRs it has conducted in recent years.
Citing its VR procedures guide, VR selection document, and 2005-10 SPP, CDE sets forth criteria for VR selection:

• triage factors (fiscal improprieties or numerous complaints);
• legislative requirements;
• automatic selection of districts that did not complete the SESR process "successfully";
• an allegation or reason to believe that a compliance issue exists based on "data" from CDE program approval and compliance staff;
• significantly sub-average performance on indicators such as overidentification, LRE, and academic performance;
• declines in compliance in recent and current years;
• lowest compliance determinations (along with program improvement status under No Child Left Behind and "other compliance information"); and
• "may also result" from complaint investigations.

CDE further explains that Ravenswood was chosen for a 2013-14 VR based on "triage factors"--the longstanding Emma C. litigation and "implementation issues" in the 2012-13 SESR (CDE Design Response I at 17; CDE Design Response II at 4). CDE also argues that, although Plaintiffs have "intimated" that the VR selection process should be "uniform and systematic," flexibility is built into the VR selection process, flexibility that enables it "to make qualitative judgments appropriate for each unique circumstance." Indicator data "informs" selection, but CDE uses discretion in selecting districts for VR (CDE Design Response II at 3). Further, CDE responds to Plaintiffs' point regarding the use of determinations in the VR selection process by noting that the federal regulations cited by Plaintiffs contain only "broad statements" and no specific references to VRs34 (CDE Design Response I at 17).

34 The Monitor notes that VRs are not mentioned in the federal regulations at all. However, CDE's point still holds with respect to SEA direct on-site monitoring activities.

Plaintiffs do not offer standards by which judgments could be made regarding the extent to which CDE selects districts for VRs "when it should," or examples of districts that should have received VRs but did not. In addition, Plaintiffs do not offer a critique of the standards CDE claims to use to select districts for VRs, although standards for selecting districts for VRs were supplied by CDE in its 2/22/13 Response (at 21-22). Setting aside for the moment the inadequate SESR standard (see below), Plaintiffs' comments on CDE's 2/22/13 list were limited to asking where these criteria are to be found in "official monitoring documents" and noting that different criteria were used in another CDE document (Plaintiffs' 3/8/13 comments at 21-22).

Set out below, by source document, are CDE's statements of the criteria for selecting districts for VRs. As the Monitor cannot locate the document referred to by Plaintiffs, the quotation offered by Plaintiffs is used in its stead. CDE's Verification Review Manual 2011-12, Volume I is excluded from the comparison, as that document does not include standards for selection of districts for VRs.
Verification Review Procedure Guide 2012-13 (at 5, quoted):

LEAs are selected for participation in the VR process by the CDE leadership team based on legislated requirements, CDE compliance concerns, or an inadequate SESR. Factors for inclusion in the process are an LEA that does not meet the targets for State Performance Plan Indicators (SPPIs) or demonstrated slippage toward indicator targets, an LEA's Program Improvement status, failure to ensure timely and correct submission of data and other information to CDE, the LEA's compliance history, CDE selected focus area, random selection, and concerns from the public regarding the LEA. In the course of working with LEAs in the consultant's assigned geographic area there may be instances the consultant is provided information that would make it pertinent for the LEA to be considered for VR. There may be some LEAs selected for VR based upon legislated requirements, for example, VRs completed for county court and community schools. The consultant should review the following reports to assist in making the determination regarding LEA participation:
• LEA Annual Performance Reports
• Compliance Determinations
• LEA Complaint History
• LEA Office of Administrative Hearing Due Process History
• Timely and Complete Report

"Selection of Districts for Verification Reviews 2012-13" (at 1-2, quoted):

With the completion of legislatively required reviews of Juvenile Court Schools and Court and Community Schools, Unit Managers for the Special Education Division met beginning in the fall of 2011 to discuss criteria for selection districts for Verification Reviews for the 2012-13 school year. A variety of criteria were discussed including data based selection, legislative requirements, triage factors, unsuccessful SESR, districts with repeating risk factors, and a random selection process. The following criteria were generally discussed:

General
1. Districts are selected as a district of residence. All districts of service must be sampled in review. County Offices of Education are rarely the district of residence – they are reviewed through the district of residence.
2. In order to spread the workload, selections would be made on a FMTA by FMTA basis using uniform selection criteria.
3. Districts would be selected to represent multiple counties to reach multiple SELPAs and should be selected to represent multiple size groups.
4. Districts having SESRs with low numbers of findings and high correction rates would be given a lower priority
5. Districts identified as having Significant Disproportionality would be given lower priority as they are conducting significant self-review and developing a Coordinated Early Intervening Services plan with CDE oversight.
6. Legislative requirement would be the highest priority.
7. Next priority would be data based selection
8. Districts whose SESRs were not adequate would always be included for verification review.

Legislative
1. Districts would be selected to satisfy monitoring priorities required in Law, Regulation or Budget Act (e.g., Juvenile Court Schools).

Data Based Selection
1. Districts would be selected based on the percentage of SPP/APR indicators not met in a given year, highest percentage of unmet receiving the highest priority.
2. Data based "tie breakers" would include:
• Districts with larger number of decreased indicator values from prior year would receive higher priority for selection
• Districts having high number of unmet indicators in the compliance, LRE or outcome clusters would receive higher priority for selection.
• Districts having a high per capita number of complaints would receive a higher priority for selection
• Districts having audit compliance determinations of 1 or 2 would receive a higher priority for selection

Triage
1. Districts can be selected for review based on triage factors alone if the SED managers determine that one or a combination of factors requires a CDE review.
2. Triage factors can add priority to selection of districts having a high percentage of unmet indicators
3. Triage factors include:
• SELPA indicates that the district should be reviewed
• Fiscal improprieties have been identified
• Influx of compliance complaints
• Newspaper expose
• Consultant concerns – number of parent calls and concerns
• Failure to respond to CDE direction
• Frequent, continuous administrative change

Facilitated Year 2 SESR
1. A district is always selected for a Verification Review if they do not have a satisfactory SESR based on consideration of the following factors:
• Failure to report completely or in a timely fashion
• FMTA determines that the district is unable to conduct/complete a satisfactory review
• Inability to determine that the district has made proper correction
• There are few or no findings
• SESR follow-up demonstrates that reported findings or corrective actions cannot be verified

CDE 2/22/13 Response (at 20-22, quoted):

See the general section on selection criteria in the Verification Review Procedure Guide for 2012-13. Between 2008 and 2012, the legislature required CDE to focus primarily on Juvenile Court Schools. Triage and SESR follow-up VRs were also conducted. Unit Managers for the Special Education Division met (beginning in the fall of 2011) to discuss criteria for selection of districts for Verification Reviews for the 2012-13 school year. A variety of criteria were discussed including data based selection, legislative requirements, triage factors, unsuccessful SESR, districts with repeating risk factors, and a random selection process. The following criteria were identified:

General
1. Districts are selected as a district of residence. All districts of service must be sampled in review. County Offices of Education are rarely the district of residence – they are reviewed through the district of residence.
2. In order to spread the workload, selections would be made on a FMTA by FMTA basis using uniform selection criteria.
3. Districts would be selected to represent multiple counties to reach multiple SELPAs and should be selected to represent multiple size groups.
4. Districts having SESRs with low numbers of findings and high correction rates would be given a lower priority.
5. Districts identified as having Significant Disproportionality would be given lower priority as they are conducting special self-reviews and developing a Coordinated Early Intervening Services plan with CDE oversight.
6. Legislative requirement would be the highest priority.
7. Next priority would be data based selection
8. Districts whose SESRs were not adequate would always be included for verification review.

Legislative
1. Districts would be selected to satisfy monitoring priorities required in law, regulation or Budget Act (e.g., Juvenile Court Schools).

Data Based Selection
1. Districts would be selected based on the percentage of SPP/APR indicators not met in a given year, highest percentage of unmet receiving the highest priority.
2. Data based "tie breakers" would include:
• Decrease from prior year would receive higher priority for selection
• Districts having high number of unmet indicators in the compliance, LRE or outcome clusters would receive higher priority for selection.
• Districts having a high per capita number of complaints would receive a higher priority for selection.
• Districts having audit compliance determinations of 1 or 2 would receive a higher priority for selection.

Triage
1. Districts can be selected for review based on triage factors alone if the SED managers determine that one or a combination of factors requires a CDE review.
2. Triage factors can add priority to selection of districts having a high percentage of unmet indicators
3. Triage factors include:
• SELPA indicates that the district should be reviewed
• Fiscal improprieties have been identified
• Influx of compliance complaints
• Newspaper expose
• Consultant concerns – number of parent calls and concerns
• Failure to respond to CDE direction
• Frequent, continuous administrative change

Facilitated Year 2 SESR
1. A district is always selected for a Verification Review if they do not have a satisfactory SESR based on consideration of the following factors:
• Failure to report completely or in a timely fashion
• FMTA determines that the district is unable to conduct/complete a satisfactory review
• Inability to determine that the district has made proper correction
• There are few or no findings
• SESR follow-up demonstrates that reported findings or corrective actions cannot be verified

CDE Design Response I (at 17, paraphrased):

Triage factors (fiscal improprieties or numerous complaints); legislative requirements; automatic selection of districts that did not complete the SESR process "successfully"; an allegation or reason to believe that a compliance issue exists based on "data" from CDE program approval and compliance staff; significantly sub-average performance on indicators such as overidentification, LRE, and academic performance; declines in compliance in recent and current years; lowest compliance determinations (along with program improvement status under No Child Left Behind and "other compliance information"); and "may also result" from complaint investigations.

Plaintiffs' 3/8/13 comments (at 22, quoted):

Further, as noted above, these criteria do not comport with CDE's stated criteria for VRs, in SELPA 103-08 link produced on 2-28-13: "I want to take a moment to provide some clarification about how districts are selected to participate in VRs. In the past, when we were using Key Performance Indicators to monitor districts, we selected the districts on a random basis. However, now that we are using SPP indicators, we have a more objective system for selecting districts on the basis of the data they need to collect. Specifically, the criteria are as follows: 1) the districts SPP indicators; 2) the district's program improvement status; 3) and the district's compliance history."

As this comparison indicates, criteria for VR selection are found in CDE monitoring documents, and are substantially similar although expressed at different levels of generality in some of the documents. Plaintiffs' quotation from a CDE document is the most different, but that may be due to its apparent 2008 date. Plaintiffs express concerns that determinations are not included in the criteria for selection. However, determinations are mentioned among the criteria in two of the documents, and SPP indicators, which affect determinations to some extent, are also included. With respect to the IDEA's determinations, a determination has been made at Section III. C. 3. b. above. As CDE argues, there is no specific requirement for determinations related to a state's direct on-site monitoring component, beyond including monitoring results as a factor in determinations.35 Plaintiffs do not ground their "good stewardship standard" in any of the IDEA's specific requirements, and do not explain their view of how many non-flagged VRs should be conducted to meet such a standard. Further, raw data regarding the percentage of districts chosen for VRs each year do not show that any of the VRs were or were not non-flagged.36 In addition, Plaintiffs do not explain why any of the finite resources available for CDE direct on-site monitoring should be devoted to monitoring districts for which CDE has no data-based or other compelling reasons to suspect noncompliance.

35 That is not to say that CDE would not be wise to use determinations, based in part on district performance, as a major factor in selecting districts for VRs, only that it is not required to do so.
36 It should be noted that random selection of districts is mentioned in the criteria included in CDE's Verification Review Procedure Guide 2012-13 (at 5).
Turning to Plaintiffs' claim of insufficient evidence that VRs take place whenever districts do not adequately implement SESRs, CDE had asserted in one response to Plaintiffs that VRs "always" occur under these circumstances and set forth standards by which unsatisfactory SESRs are judged:

• failure to report completely or timely,
• inability to conduct or complete a satisfactory review,
• inability to determine that the district had made "proper correction,"
• few or no SESR findings, and
• CDE follow-up monitoring shows that findings or corrective actions cannot be verified (CDE 2/22/13 Response at 22).

Plaintiffs commented regarding CDE's statement of these standards, "This assertion does not comport with number of SESR violations/corrective action plans required last four years identified above as well as number of districts with identified noncompliance lasting more than one year, compared to very low number of VRs conducted in same period." Plaintiffs asked in their comments that CDE explain this "discrepancy" (Plaintiffs' 3/8/13 comments at 22). CDE responded that the discrepancy referred to by Plaintiffs was unclear (CDE 3/22/13 Response at 32).

The apparent discrepancy referred to by Plaintiffs is clear: it is simply the number of VRs compared with the number of districts with SESR findings and CAPs, and the number of districts that did not correct noncompliance within one year. However, Plaintiffs appear to have misread CDE's standards for automatic VRs. The standards do not include SESR districts with findings of noncompliance and CAPs, or districts with noncompliance uncorrected after one year.37 Thus, there is no discrepancy, although CDE has not demonstrated that its standards for unsatisfactory SESRs always lead to automatic VRs.

37 Plaintiffs may have in mind on this point CDE's "Noncompliant Findings Report," but it is unclear from that document that the districts with noncompliance uncorrected after one year had even identified the noncompliance through SESRs; these findings may have been made through data-based monitoring by CDE (such as Indicator 11 or 12), VRs, or SSRs. See Column 2 in the "Noncompliant Findings Report."

With respect to Plaintiffs' assertion that the primary focus and priority areas for monitoring are inadequately reflected in the selection criteria for VRs, for the reasons stated in Section III. C. 1. above the Monitor agrees.

As noted above, CDE has argued that "flexibility" is built into the VR selection process that facilitates its ability to consider the unique circumstances of districts. While flexibility allows districts' special circumstances to influence selection, a positive aspect of the exercise of discretion, it also has the negative consequence of preventing the parties and the Court from understanding with clarity the data and circumstances that will result in Ravenswood's selection for VR in the future. Between 2000 and the current year, Ravenswood was selected for a VR only once, in 2006-07. Much was known about the District's compliance status during that period of time, and it is both surprising and concerning that it was only selected once by CDE.
In addition, triage factors, rather than the other possible criteria listed in CDE's documents, resulted in the District's selection for VR this year, according to CDE.38 One cannot determine in accordance with the Consent Decree whether the state-level monitoring system in place is or is not capable of ensuring continued compliance with the law and the provision of FAPE to children with disabilities in Ravenswood without clarity regarding the data and circumstances that will result in Ravenswood's selection for VR, the direct on-site component of CDE's monitoring system.

38 In addition, of course, to the agreement between the parties reflected in the Fifth Joint Statement that Ravenswood would receive a VR this year.

The table below displays the number and percentage of VRs conducted by CDE in each of the last five years:

Year       Number of Districts (n. 39)   Number of VRs (n. 40)   Percentage of VRs
2008-09    1,042                         16                      1.5%
2009-10    1,032                         25                      2.4%
2010-11    1,037                         19                      1.8%
2011-12    1,043                         33                      3.2%
2012-13    1,038                         31                      3.0%
TOTAL      5,192                         124                     2.4%

39 Data from http://dq.cde.ca.gov/dataquest/content.asp.
40 Data from CDE's "VR Districts 2008-2012."

As Plaintiffs note, the number of VRs is indeed quite low. This raises additional concerns regarding the probability that Ravenswood will be selected for a VR in future years when data and other information indicate that a VR is necessary to ensure compliance.

Monitor's Determination: Plaintiffs have not shown that CDE does not select districts for VRs when it should, that the standards CDE uses to select districts are inadequate, that determinations must be used as part of the criteria for selection, that determinations are not used as part of the criteria for selection, that any IDEA standard requires non-flagged VRs, or that there is a discrepancy between CDE's inadequate-SESR standard for VRs and the number of VRs conducted. CDE has not demonstrated that it always conducts VRs when its standards for unsatisfactory SESRs are met; that the primary focus and priority areas for monitoring are adequately reflected in the selection criteria for VRs; and that its criteria for selecting districts for VRs, in addition to the low number of VRs it conducts annually, will result in the selection of the District for VR when data and other information available to CDE indicate that a VR is necessary in order to ensure compliance. Therefore, CDE shall engage in corrective action steps reasonably calculated to ensure that the primary focus and priority areas for monitoring are adequately reflected in the selection criteria for VRs, and that its criteria for selecting districts for VRs will result in the selection of the District for VR when data and other information available to CDE indicate that a VR is necessary in order to ensure compliance. For each type of data identified in response to the determination made at Section III. C. 1. above, CDE shall set forth through a submission to the Monitor and the parties whether, and if so how, that data will be used as part of the VR selection criteria. Further, CDE shall set forth with precision VR selection criteria, and provide current Ravenswood data, to the extent available, for each of the criteria in order to illustrate the implications of the criteria for the selection of Ravenswood for VR. In addition, CDE shall state with clarity the number of SESRs found to be unsatisfactory through application of its standards during the last four years, and demonstrate that each resulted in a VR. For each VR conducted by CDE in the 2011-12 and 2012-13 school years, CDE shall also set forth the reason(s) the public agency was selected for VR. The necessity of further corrective actions will be determined after review of the submission.
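For verification purposes only, the percentages in the table above can be reproduced from the district and VR counts cited there (footnotes 39 and 40); the short script below performs the arithmetic.

    # Reproduces the percentage column of the table above from its own counts.
    data = {
        "2008-09": (1042, 16),
        "2009-10": (1032, 25),
        "2010-11": (1037, 19),
        "2011-12": (1043, 33),
        "2012-13": (1038, 31),
    }
    total_districts = sum(d for d, _ in data.values())  # 5,192
    total_vrs = sum(v for _, v in data.values())        # 124
    for year, (districts, vrs) in data.items():
        print(f"{year}: {vrs}/{districts} = {100 * vrs / districts:.1f}%")
    print(f"TOTAL: {total_vrs}/{total_districts} = {100 * total_vrs / total_districts:.1f}%")

Each yearly figure rounds to the percentage shown in the table, and the five-year rate of 2.4% confirms how small a share of districts receives direct on-site monitoring in any given year.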
2. Too Limited Data Collection

Plaintiffs claim that CDE does not have "triggers, algorithms, thresholds, targets or criteria for flagging" the monitoring priority areas. Plaintiffs cite one of CDE's VR selection documents, and assert that it cannot be confirmed to be an "official" CDE document and does not use the monitoring priority areas. The latter is also true of the VR manual, according to Plaintiffs (PDC at 35). Plaintiffs assert that the design of the VR process calls for the gathering of information, but "lacks detail" regarding how this information is processed. In addition, Plaintiffs argue that the VR process does not collect enough information. First, CDE conducts too few record reviews. Plaintiffs offer the IEP implementation review as an example, and argue that only fifteen students are required to be reviewed in all districts regardless of size, and only current student records are reviewed. Second, although the educational benefit review looks at three years of student records, Plaintiffs believe that CDE documents show that the district, not CDE, randomly selects five records for review and performs "most" of the reviews without training or quality control, which could lead to "self-interested" selections and reviews. Responding to CDE's claim that its staff members conduct the educational benefit review, Plaintiffs argue that CDE's 2011-12 VR manual does not support that claim. Citing again the VR manual, Plaintiffs argue that the educational benefit review does not include parent meetings, surveys or interviews, or student interviews and observations. Taken together, this information indicates to Plaintiffs that the educational benefit review does not sufficiently focus on the IDEA's priority areas and does not use indicators adequate to measure the performance of an entire school district (PDC at 36-37). Plaintiffs also argue that, while the VR process solicits parent input, the system contains no "assurances" that this input is considered adequately. Citing the VR manual, Plaintiffs argue that CDE's standard that parental concerns be from multiple parents and multiple sites in order to meet the requirement for systemic noncompliance is too high, as one parent can identify systemic noncompliance. In addition, only one parent meeting is required. The process for incorporating parental input into the monitoring plan is, in Plaintiffs' view, "ad hoc and discretionary" (PDC at 37-38). Responding to CDE's claim that it uses sufficient qualitative indicators in the form of parental input, complaint history, and due process findings to select the issues to be included in the review, Plaintiffs assert that these indicators are inadequate to measure district performance in FAPE, LRE, and child find (PDC at 37).

In its major response documents CDE does not respond to Plaintiffs' claim that the VR process lacks sufficient detail regarding how information is processed, and does not respond to Plaintiffs' points regarding the number of IEP implementation and educational benefit reviews conducted. However, CDE defends the number of student record reviews conducted, which are based on overall district enrollment and the number of special education students.
CDE also does not respond regarding which entity selects students for the educational benefit review, and whether that review includes parent meetings, surveys or interviews, or student interviews and observations; or to Plaintiffs' points regarding assurances that parent input is considered, the standard used for parental identification of systemic noncompliance, and the adequacy of qualitative indicators (CDE Design Response I at 18).

Although not entirely clear, Plaintiffs' first point appears to concern selection for VR. A determination regarding selection has been made at Section III. E. 1. above. With respect to Plaintiffs' assertion that the design of the system lacks sufficient detail regarding how the information gathered is processed, Plaintiffs do not offer specific examples of such insufficient detail, and the claim therefore cannot be adjudicated as stated.

Turning to the number of students reviewed in the IEP implementation review, CDE's VR manuals and VR powerpoint confirm that the implementation of fifteen students' IEPs is reviewed in the VR process, regardless of the size of the district (Verification Review Procedure Guide 2012-13 at 41-42; Verification Review Manual 2011-12, Volume I at 16; Special Education Verification Review in California (2010) at 18). Because the sample size used for this review is not sufficiently tailored to the size of the district, the aspect of FAPE related to the implementation of IEPs is not adequately monitored in the VR process for districts of every size; further, if IEP implementation problems are found for some of the students reviewed, the process does not appear to call for the review of additional students.

With respect to the educational benefit review, CDE has stated that the VR team lead, not the district, selects the students for the review from the group of students who were in the samples for both the student record and IEP implementation reviews (CDE 2/22/13 Response at 41, 42). Contrary to Plaintiffs' claim, this is confirmed by VR manuals and training documents, which call for the team lead to select the students from a random list generated by CDE (Special Education Verification Review in California (2010) at 15; Verification Review Manual 2011-12, Volume I at 16, n. 41; Verification Review Procedure Guide 2012-13 at 32, n. 42).

41 "Contact Assessment, Evaluation and Support Unit (AES), Chris Drouin, for the list of student names randomly generated via the database, this list will be used for all student record review activities, including school age, IEP implementation, and educational benefit review."
42 "Contact the Assessment, Evaluation and Support Unit (AES), Patricia Skelton, for a list of student names randomly generated via the database, this list will be used for all student record review activities, including school age, IEP implementation, CASEMIS review, and the educational benefit review."

Turning to who conducts the educational benefit review, the table below displays the relevant instructions from CDE's VR manuals:

Verification Review Manual 2011-12, Volume I (at 22, quoted; emphasis in original):
"Process: The Educational Benefit Review requires five student files to be reviewed to determine if the IEP was reasonably calculated to provide educational benefit. The CDE team demonstrates the first educational benefit file review and then facilitates the district team through the remaining educational benefit file reviews. If any of the five student files selected is found to be noncompliant the team should pull an additional five files up to ten in order to verify systemic noncompliance. The CDE team must conduct one demonstration educational benefit review using one of the district's randomly selected student files. The educational benefit review demonstration generally takes several hours and is a key element of the VR process. Once the demonstration is completed, CDE and the district teams can be divided into groups to complete the remaining four educational benefit reviews."
Verification Review Procedure Guide 2012-13 (at 39, quoted):
"The CDE team must conduct one demonstration educational benefit review using one of the files from the student record review. The demonstration usually takes several hours and is a key element of the VR process. The CDE team will later facilitate the district team through the remaining educational benefit file review.
1. Participants should include CDE staff, special education teachers, and assessment staff
2. Divide staff into teams of two or three
Identify one or two team member(s) to review the record while the other team member records information on the Educational Benefit Chart specifying areas of assessment, present levels of performance, areas of need, goals, services, and year-to-year progress.
It is helpful to include one CDE consultant with each group completing the review."

Plaintiffs are correct that district staff members are involved in the educational benefit review. However, the older manual appears to call for CDE staff to participate in each review after the demonstration review, although it is not entirely clear that a CDE staff person would be in each group; the more recent manual does not explain what the CDE facilitation it calls for consists of, and regards CDE staff participation in the reviews as "helpful," rather than required. Due to this vagueness, it is not clear that quality control of the reviews can be ensured, as Plaintiffs suggest.

Plaintiffs also claim that district staff members participate in the reviews without clear training. Training of district staff on the student record review process does take place, according to CDE's VR manuals, and the demonstration of an educational benefit review noted above is also training for the review (Verification Review Manual 2011-12, Volume I at 18). The more recent manual explains the training provided in this manner:

The review team in conjunction with selected members of the district and SELPA will review records using the CDE record review forms containing frequently found noncompliant items and items identified for investigation in the Monitoring Plan. The LEA can select reviewers for the student level document reviews from staff that have the best background for the subject matter. For example, the LEA may wish to have teachers and clinical staff participation in the review of the student records and have administrators assist with the review of fiscal documentation. The CDE consultant will use the actual review protocols during training activities so that a common understanding is established about what each item is requiring and what constitutes adequate evidence. (Verification Review Procedure Guide 2012-13 at 34-35)

However, it is not clear from either manual that specific training for the educational benefit review, aside from the demonstration, takes place.
With respect to Plaintiffs' claim that the educational benefit review does not include parent meetings, surveys or interviews, or student interviews and observations, the VR process according to the more recent VR manual includes interviews of parents and district staff for the IEP implementation review; all students selected for the educational benefit review are also in the samples for the IEP implementation review (Verification Review Procedure Guide 2012-13 at 41, n. 43). On the other hand, "It is also desirable to select sites in such a fashion that the team can interview parents, district staff, and administrators of students who have been involved in all three types of record reviews (procedural guarantees, educational benefit, and IEP implementation)" (Verification Review Procedure Guide 2012-13 at 52). However, "desirable" and required are different concepts, and interviews of parents of students involved in the educational benefit review, and of the students' teachers, related service providers, and administrators, do not appear to be required. Such interviews are necessary to paint a full picture of the extent to which the IEP is reasonably calculated to result in educational benefit. Further, although "[i]nformation gathered from interviews is collected, analyzed, and discussed at the Post-Review Meeting" (Verification Review Procedure Guide 2012-13 at 53), the specific instructions for the review do not include guidance regarding whether, and if so how, the information gained through interviews would influence the findings (Verification Review Procedure Guide 2012-13 at 39-41, n. 44). As to parent meetings and surveys, Plaintiffs offer no argument regarding the relevance of the views of parents in general to whether a particular group of specific students have IEPs that are reasonably calculated to result in educational benefit. However, student observations should be included if the review results in questions that can be answered best through observations, such as poor student outcomes that may be due to flaws in service delivery.

43 "All of the files used for the Educational Benefit Review must be used in the IEP Implementation Review."
44 For example, "The review team will conclude if educational benefit was provided to the student based on a comparison of year-to-year evaluation of IEPs. Determining educational benefit is an iterative process based on review of student records and evidence of efforts to address the learning needs of students in the special education program" (Verification Review Procedure Guide 2012-13 at 39). The use of information from interviews is not mentioned.
Competent reviews of educational benefit must follow evidence where it leads, including to observations if and when necessary, but a rule that observations always be conducted would lead to an inefficient use of monitoring resources in exchange for little gain.45

45 The Monitor notes that CDE's pilot proposal for Ravenswood VR includes student observations as part of the IEP implementation and educational benefit reviews (CDE Design Response I at 4; Verification Review Policy and Procedure Guide, Ravenswood City School District 2013-14, at 51-56).

Regarding Plaintiffs' argument that the educational benefit review does not use indicators adequate to measure the performance of an entire school district, the sample size of five students used for the review is not sufficiently tailored to the size of the district to ensure that FAPE in the LRE is being provided and that noncompliance and compliance are identified appropriately based on adequate evidence.46 An illustrative calculation of the consequences of such small fixed samples appears following the Monitor's Determination below.

46 Plaintiffs do not raise issues related to the composition of the sample beyond its size; thus, the Monitor makes no determination on CDE's use of random samples for the educational benefit review. However, Plaintiffs raise this concern in the context of SESR implementation; see Section IV. A. 3. below.

Turning to Plaintiffs' concerns related to parent input, the older VR manual states: "To determine if a parent concern meets the systemic noncompliance requirement, it must be a concern from multiple parents and from multiple school sites" (Verification Review Manual 2011-12, Volume I at 6). As Plaintiffs argue, this standard would prevent the monitoring plan from including a legitimate systemic noncompliance concern raised by one parent. However, CDE's more recent manual changes this standard: information from parent input activities "must" be included in the monitoring plan if "either" 1) "[t]he information is identified during one of the parent input processes and validated by other LEA reports," or 2) "[t]he information alleges a violation of state or federal statute or regulation and similar information was expressed by several parents, affects a number of students, or occurred at a number of school sites" (Verification Review Procedure Guide 2012-13 at 9; emphasis added). This standard requires the inclusion of potential systemic noncompliance in the monitoring plan when identified by a parent. In addition, the newer standards for the inclusion of parental input in the monitoring plan are not ad hoc and discretionary; on the contrary, the instructions appear to be clear (Verification Review Procedure Guide 2012-13 at 9-11). Plaintiffs are correct that one parent meeting is required (Verification Review Manual 2011-12, Volume I at 11; Verification Review Procedure Guide 2012-13 at 6). While this is likely to be problematic in school districts larger than Ravenswood, with respect to VRs in the District one parent meeting, together with the other methodologies for gathering parental input--the on-line survey, the mailed survey, parent input cards from the meeting, and the standard that input be gathered from at least 20% of a district's parents--constitutes an adequate means of collecting parental input for the VR monitoring plan (Verification Review Procedure Guide 2012-13 at 6-9). With respect to Plaintiffs' argument that the indicators used by CDE to create the monitoring plan are insufficient measures of district performance in FAPE, LRE, and
child find, the more recent VR manual requires that the monitoring plan include SPP indicators on which action was required of the district (Verification Review Procedure Guide 2012-13 at 13). As stated in Sections III. C. 1. and D. 2. above, because the data that influence the monitoring plan are largely limited to the SPP indicators, and only those on which action was required by CDE and/or the target was not met, CDE does not ensure that sufficient qualitative and quantitative indicators are used in the development of the VR monitoring plan. Moreover, while the CDE consultant is called upon to analyze the SPP data, specific instructions to ensure that the data are drilled down and analyzed by disability, race/ethnicity, school site, free and reduced lunch status, etc. are not offered.47 Such analyses are important, as meeting an overall target can mask very poor performance by some students, and competent monitoring under the IDEA must look closely and substantively at such concerns if they are suggested by data. In addition, ensuring compliance with the IDEA's substantive FAPE, LRE, and child find requirements is not set forth in the VR manuals among the questions the VR seeks to answer, and improving student results and outcomes is not listed as part of the purpose of the VR process (Verification Review Manual 2011-12, Volume I at 4; Verification Review Procedure Guide 2012-13 at 1-2). CDE's recent VR training document lists a similar set of questions that the VR sets out to answer ("Special Education Verification Review (VR) 2013-14 Training Module, Student Record Reviews," at slides 3-4).

47 "The CDE consultant is responsible for obtaining a copy of the special education data report from the CDE Web site and analyzing the information for inclusion in the Monitoring Plan" (Verification Review Procedure Guide 2012-13 at 13).

Monitor's Determination: Plaintiffs have not shown that districts select students for the educational benefit review, that parent meetings and surveys are necessary for the educational benefit review, that the system contains no assurances that parental input is considered adequately, that parental concerns must be from multiple parents and multiple sites in order to meet the requirement for systemic noncompliance, that one parent meeting for the VR process in Ravenswood is insufficient, or that the process for incorporating parental input into the monitoring plan is ad hoc and discretionary. CDE has not demonstrated that it ensures quality control of, and adequate training for, the educational benefit reviews; that it ensures that interviews are conducted of parents of students involved in the educational benefit review, and of the students' teachers, related service providers, and administrators; that it provides sufficient guidance regarding whether, and if so how, the information gained through interviews can affect the educational benefit findings; that observations are conducted as part of the educational benefit review when necessary; that its VR process uses adequate quantitative and qualitative indicators to measure performance in the priority areas in the creation of the monitoring plan; that data are fully analyzed in order to develop the monitoring plan; and that it uses sample sizes in the IEP implementation and educational benefit reviews sufficient to ensure compliance with the Consent Decree standards and standards from the Court's Order.
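To illustrate the sample-size concern underlying this determination, the sketch below computes, under the simplifying assumption that each sampled record is independently noncompliant with a given probability (an assumption adopted solely for illustration, not a finding about any district), how often fixed samples of five and fifteen records would fail to surface noncompliance.

    # Probability that a review of n randomly sampled records finds no
    # noncompliance when the true noncompliance rate is p. Assumes
    # independence, which approximates random sampling in a large district.
    def miss_probability(n, p):
        return (1 - p) ** n

    for n in (5, 15):  # educational benefit and IEP implementation sample sizes
        for p in (0.05, 0.10, 0.20):
            print(f"sample of {n:2d}, true rate {p:.0%}: "
                  f"review finds nothing {miss_probability(n, p):.0%} of the time")

Under these assumptions, a five-record review would fail to surface a problem affecting one in ten records roughly 59% of the time, and even a fifteen-record review would miss it about 21% of the time.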
Therefore, CDE shall engage in corrective action steps reasonably calculated to ensure quality control of, and adequate training for, the educational benefit reviews; that interviews are conducted of parents, teachers, related service providers, and administrators as part of the educational benefit review; that sufficient guidance is provided regarding how information gained through interviews will affect educational benefit findings; that observations are conducted as part of the educational benefit review when necessary; that the VR process uses adequate quantitative and qualitative indicators to measure performance in the priority areas in the creation of the monitoring plan; that data are fully analyzed in order to develop the monitoring plan; and that sample sizes are used in the IEP implementation and educational benefit reviews sufficient to ensure compliance with the Consent Decree standards and standards from the Court's Order. For each type of data identified by CDE pursuant to the Monitor's Determination in Section III. C. 1. above, CDE shall set forth with precision through a submission to the Monitor and the parties whether, and if so how, that type of data will be used in the development of VR monitoring plans. The submission shall explain the basis for CDE's belief that the resulting data on which VR monitoring plans will be based will be adequate to measure performance in the priority areas. In addition, the submission shall include amended VR procedures, manuals, interview instruments, and guidance to its consultants to ensure that data are fully analyzed in order to develop the monitoring plan; that adequate sample sizes are used in the educational benefit and IEP implementation reviews; that there is quality control of, and adequate training for, the educational benefit reviews; that interviews are conducted of parents, teachers, related service providers, and administrators as part of the educational benefit review; that information gained through interviews affects VR findings, if appropriate; and that observations are conducted as part of the educational benefit review when necessary.

3. Too Limited Data Verification

Plaintiffs argue that the VR process does not "ensure effective human agency involvement" in VR decision-making. By this Plaintiffs appear to mean that the system lacks "assurances against arbitrary or inappropriate determinations and oversights," citing CDE's VR powerpoint document regarding analyses performed by SELPAs. Plaintiffs claim that "sufficient instructions" to VR monitors to evaluate information gathered by SELPAs are not provided by CDE, and that the evaluation is limited to meetings with SELPA staff. In addition, citing the VR manual, Plaintiffs argue that while the VR process calls for interviews with parents and district staff and administrators, instructions are not offered regarding the number of interviews to be conducted, how the information gathered should be considered, and how to ensure that input from interviews is considered (PDC at 38-39). Plaintiffs further claim that the VR process uses "a few computer programs in some vague ways" to analyze VR data; these ways, in Plaintiffs' view, fall short of meeting IDEA requirements.
Plaintiffs do not explain the ways in which this claim is allegedly true, but assert that the programs used, and what they are assessing, are not identified, and that the manner in which CDE ensures the efficacy of the process is unclear (PDC at 39). Plaintiffs believe it is unclear who the VR monitors are, and that the instructions to monitors are also unclear. Plaintiffs are particularly concerned about the possibility of "self-interested responses," and do not think it is clear whether data are collected by school district staff and/or third parties. Plaintiffs assert that the VR process calls for districts to "self-select" issues for correction when called for by the APR measure. Further, Plaintiffs argue that CDE's standards regard districts as compliant until evidence of noncompliance is found, and regard this standard as "illegal," as the statute does not contain a presumption of compliance (PDC at 39-40).

In its major response documents CDE does not respond to Plaintiffs' point related to SELPAs, or to its points regarding interviews, self-interested responses, self-selection of issues for correction, and the standard of regarding districts as compliant until evidence of noncompliance is found. CDE does respond to the issue of computer programs determining corrective actions, and explains that software "merely 'suggests'" corrective actions, suggestions that are "subject to editing" by CDE staff; the only findings developed by the software are those required by OSEP, that SEAs must ensure the subsequent review of other students to ensure that violations found are not present for additional students (CDE Design Response I at 21).

The pages from the VR powerpoint document cited by Plaintiffs were reviewed, and do not make it clear who conducts the SELPA governance review portion of the VR (Special Education Verification Review in California (2010) at 20-28). The older VR manual calls for the SELPA review to be conducted by CDE, district, and SELPA staff (Verification Review Manual 2011-12, Volume I at 31). However, the more recent manual calls for this review to be conducted by the CDE consultant (Verification Review Procedure Guide 2012-13 at 45-46). With respect to Plaintiffs' claim that sufficient instructions are not given to VR monitors in order to evaluate information gathered by SELPAs, the table below displays the SELPA-related items from CDE's "Item Table Pre selected items" spreadsheet:

Item No: 111-1
Compliance Test: If the SELPA is a multi-district SELPA, have the entities participating in the local plan entered into written agreements and procedures for the ongoing review of programs and a mechanism for correcting any identified problems?
Revised Compliance Standard: The multidistrict SELPA must have a written agreement that includes provision for the ongoing review of programs.
Revised Other Guidance: Check to see if the multidistrict SELPA has a written agreement among all of the participating entities. Review the agreement to see if it makes provision for the ongoing review of the programs and procedures used to implement the local plan (which includes all of the federal procedures under IDEA). Interview SELPA and district staff to: 1) determine the review schedule; 2) to determine what findings were made; and, 3) what was done to correct any findings of noncompliance.

Item No: 111-2
Compliance Test: Does the SELPA have in effect policies, procedures, and programs that are consistent with state laws, regulations, and policies governing: (1) Free appropriate public education. (2) Full educational opportunity. (3) Child find and referral. (4) Individualized education programs, including development, implementation, review, and revision. (5) Least restrictive environment. (6) Procedural safeguards. (7) Annual and triennial assessments. (8) Confidentiality. (9) Transition from Part C of IDEA to the preschool program under Part B of IDEA. (10) Students in private schools. (11) Compliance assurances, including general compliance with the Individuals with Disabilities Education Act (20 U.S.C. Sec. 1400 et seq.), Section 504 of the Rehabilitation Act of 1973 (29 U.S.C. Sec. 794), the Americans with Disabilities Act of 1990 (42 U.S.C. Sec. 12101 et seq.), federal regulations relating thereto, and this part. (12) A description of the governance and administration of the plan consistent with 30 EC 56205. (13) Comprehensive system of personnel development and personnel standards, including standards for training and supervision of paraprofessionals. (14) Performance goals and indicators. (15) Participation in state and districtwide assessments, and reports relating to assessments. (16) Supplementation of state, local, and other federal funds, including nonsupplantation of funds. (17) Maintenance of financial effort. (18) Opportunities for public participation prior to adoption of policies and procedures. (19) Suspension and expulsion rates. (20) Access to instructional materials by blind individuals with exceptional needs and others with print disabilities in accordance with Section 1412(a)(23) of Title 20 of the United States Code. (21) Overidentification and disproportionate representation by race and ethnicity of children as individuals with exceptional needs, including children with disabilities with a particular impairment described in Section 1401 of Title 20 of the United States Code and in accordance with Section 1412(a)(24) of Title 20 of the United States Code. (22) Prohibition of mandatory medication use pursuant to Section 56040.5 and in accordance with Section 1412(a)(25) of Title 20 of the United States Code?
Revised Compliance Standard: The SELPA must have policies, procedures and programs that comply with state and federal laws and regulations.
Revised Other Guidance: Review SELPA local plan, policies and procedures to ensure that required content is included.

Item No: 111-3
Compliance Test: Does the SELPA have an annual budget plan on file that includes expected expenditures for all items related to the local plan including: (A) Funds received; (B) Administrative costs of the plan; (C) Special education services to pupils with severe disabilities and low incidence disabilities; (D) Special education services to pupils with nonsevere disabilities; (E) Supplemental aids and services to meet the individual needs of pupils placed in general education classrooms and environments; (F) Regionalized operations and services, and direct instructional support by program specialists; (G) The use of property taxes allocated to the special education local plan area?
Revised Compliance Standard: The SELPA must have an annual budget plan on file.
Revised Other Guidance: Check to see if the SELPA has an annual budget plan on file. Check to see if it has all of the required contents.

Item No: 111-3.1
Compliance Test: Was the annual budget plan adopted at a public hearing held by the SELPA?
Revised Compliance Standard: The annual budget plan must be adopted at a public hearing held by the SELPA.
Revised Other Guidance: Review the annual budget plan to see if it has an adoption date. Interview the SELPA Director to determine how the plan was adopted.

Item No: 111-3.2
Compliance Test: Was notice of the hearing related to the annual budget plan posted in each school in the local plan area at least 15 days prior to the hearing?
Revised Compliance Standard: Notice of the hearing must be posted at least 15 days prior to the hearing.
Revised Other Guidance: Interview the SELPA Director to determine how and when notices for the hearing were disseminated.

Item No: 111-4
Compliance Test: Does the SELPA have an annual service plan that includes a description of services to be provided by each district and county office, including the nature of the services and the physical location at which the services will be provided, including alternative schools, charter schools, opportunity schools and classes, community day schools operated by school districts, community schools operated by county offices of education, and juvenile court schools, regardless of whether the district or county office of education is participating in the local plan?
Revised Compliance Standard: The SELPA must have an annual service plan on file.
Revised Other Guidance: Check to see if the SELPA has an annual service plan on file. Check to see if it has all of the required contents.

Item No: 111-4.1
Compliance Test: Was the annual service plan adopted at a public hearing held by the SELPA?
Revised Compliance Standard: The annual service plan must be adopted at a public hearing held by the SELPA.
Revised Other Guidance: Review the annual service plan to see if it has an adoption date. Interview the SELPA Director to determine how the plan was adopted.

Item No: 111-4.2
Compliance Test: Was notice of the hearing related to the annual service plan posted in each school district in the local plan area at least 15 days prior to the hearing?
Revised Compliance Standard: Notice of the hearing must be posted at least 15 days prior to the hearing.
Revised Other Guidance: Interview the SELPA Director to determine how and when notices for the hearing were disseminated.

Item No: 111-1.1
Compliance Test: Does a single district SELPA have a written procedure for the ongoing review of programs and procedures for correcting any identified problems?
Revised Compliance Standard: The single district SELPA must have a written procedure that includes provision for the ongoing review of programs.
Revised Other Guidance: Check to see if the single district SELPA has a written procedure. Review the procedure to see if it makes provision for the ongoing review of the programs and procedures used to implement the local plan (which includes all of the federal procedures under IDEA). Interview SELPA and district staff to: 1) determine the review schedule; 2) to determine what findings were made; and, 3) what was done to correct any findings of noncompliance.

For the most part, the instruction given to monitors in the standard and guidance columns is sufficient. However, item 111-2 is problematic: the compliance test includes the words "consistent with," but the revised compliance standard uses the words "comply with," and the revised other guidance seeks to ensure merely that the "required content" is included.
Further, the compliance test is to be applied to "policies, procedures, and programs," and covers many crucial aspects of state special education law, regulations, and policies: while one can judge the extent to which policies and procedures comply with requirements based on the policy and procedure documents, without monitoring all programs it is difficult to understand how the answer to the programmatic aspect of this monitoring item can be meaningful.

Turning to Plaintiffs' points about interviews, a determination has been made at Section III. E. 2. above regarding interviews related to the educational benefit review portion of the VR process. Plaintiffs' points here are wider, in that they argue that CDE does not provide instructions about the number of interviews, the consideration of interview information, and how such consideration is ensured. The older VR manual calls for interviews of administrators, staff and parents "to follow up on issues and concerns noted in the monitoring plan in order to evaluate district practices related to implementing policies and procedures." More specifically, the manual states that the purpose of interviews is to verify information from other sources, collect additional information, resolve discrepancies from other sources of information, and corroborate the district's compliance with requirements. The CDE lead consultant selects the interviewees. The manual does not provide guidance regarding the number of interviews to be conducted. The protocols for the interviews are based on the same issues and sources that impacted the monitoring plan (Verification Review Manual 2011-12, Volume I at 29-30). Guidance is offered in the manual regarding the consideration of information gained through interviews:

Information gathered from interviews is collected, analyzed, and discussed at the Post-Review Meeting. This meeting may corroborate a finding of systemic noncompliance based on the interviews of parents, general education teachers, special education service providers and administrators. Issues found as a result of the interviews must be marked noncompliant if there is evidence from other sources that corroborates the information from the interviews. In some cases, this will be clear (e.g., the child did not receive speech therapy during the month of April as indicated on the IEP). In any case, a finding of noncompliance requires the consultant to write a finding statement that includes both the source(s) of the information as well as the content. (Verification Review Manual 2011-12, Volume I at 30-31)

The more recent manual contains guidance that is substantially the same, although the additional information regarding the use of interview results in findings does not appear to be included (Verification Review Procedure Guide 2012-13 at 52-53, 64-65). Thus, information regarding the consideration of interview information is provided by CDE in the older VR manual, and partially in the newer manual. However, while CDE points out in the older manual that there will be clarity at times regarding how interview results are corroborated through other forms of evidence, in other cases this will not be clear. Unfortunately, CDE does not offer sufficient guidance to assist VR monitoring teams in both sorts of circumstances.
With respect to Plaintiffs' concern regarding the use of computer programs, Plaintiffs have not specified the ways in which the use of such programs allegedly does not meet IDEA requirements. In addition, the VR manual does not indicate the use of such programs to analyze VR data. The software is used to enter the monitoring plan after it has been created and approved (Verification Review Procedure Guide 2012-13 at 18, n. 48). The software generates forms for each review based on the monitoring plan (Verification Review Procedure Guide 2012-13 at 29-31). Findings are made by VR monitors and then entered into the software (Verification Review Procedure Guide 2012-13 at 53-61). The software generates corrective actions, which can then be edited by the VR monitors at the post-review meeting; additional corrective actions for district-level findings can also be ordered, and some suggestions of additional corrective actions are included in the manual (Verification Review Procedure Guide 2012-13 at 64-65, n. 49).

48 "The Monitoring Plan is approved prior to entry of the items and information for investigation from the Monitoring Plan into the VR software."
49 "The purpose of the post review meeting is to review the software-generated list of findings and to edit, as needed."

It is clear from the manual that the VR review team includes CDE staff, and district and SELPA staff (Verification Review Procedure Guide 2012-13 at 34-35). With respect to the possibility of self-interested responses, CDE staff review student record review findings made by other reviewers, but this appears to be limited to findings of noncompliance, unless there is disagreement between CDE and other reviewers in compliance interpretations (Verification Review Procedure Guide 2012-13 at 37). This form of oversight is insufficient to ensure that findings of compliance are made appropriately. Contrary to Plaintiffs' claim, districts do not self-select items for correction from the APR measure. CDE selects these items for the monitoring plan if the target was not met or action was required (Verification Review Procedure Guide 2012-13 at 13, 24). Turning to Plaintiffs' argument that CDE's standards illegitimately regard districts as compliant until evidence of noncompliance is found, CDE has verified that this is its standard (CDE 2/22/13 Response at 33; for a slightly different formulation, see Verification Review Manual 2011-12, Volume I at 24, 31). The Monitor agrees with Plaintiffs that the IDEA does not support a presumption that districts are compliant unless evidence of noncompliance is found.50

50 As a practical matter CDE's standard does not appear to have significant implications, as findings of noncompliance are only made if evidence of noncompliance is found. However, in the absence of information indicating compliance or noncompliance, a presumption of any sort is inappropriate.

Monitor's Determination: Plaintiffs have not shown that the SELPA governance review is limited to meetings with SELPA staff, that the use of computer programs violates IDEA requirements, that computer programs are used to analyze VR data, or that districts self-select items for correction from the APR measure. CDE has not demonstrated that the instruction given to monitors for all items in the SELPA governance review is sufficient to enable appropriate judgments of compliance and noncompliance, that sufficient guidance is provided regarding the number of interviews to be conducted, that sufficient guidance is provided regarding the consideration of interview information in the development of monitoring findings, that effective and full oversight is exercised by CDE regarding all findings made in the VR process, and that its standard for compliance comports fully with the IDEA.
Therefore, CDE shall engage in corrective action steps reasonably calculated to ensure that the instructions given to monitors for all items in the SELPA governance review are sufficient to enable appropriate judgments of compliance and noncompliance, that sufficient guidance is provided regarding the number of interviews to be conducted, that sufficient guidance is provided regarding the consideration of interview information in the development of monitoring findings, that effective and full oversight is exercised by CDE regarding all findings made in the VR process, and that its standard for compliance comports with the IDEA. CDE shall set forth with precision through a submission to the Monitor and the parties amended VR procedures, manuals, interview instruments, and guidance to its consultants adequate to ensure that the instructions given to monitors for all items in the SELPA governance review are sufficient to enable appropriate judgments of compliance and noncompliance, that sufficient guidance is provided regarding the number of interviews to be conducted, that sufficient guidance is provided regarding the consideration of interview information in the development of monitoring findings, that effective and full oversight is exercised by CDE regarding all findings made in the VR process, and that its standards do not include a presumption of compliance.

4. No Evidence of Required Compliance Determinations/Follow-Up

Plaintiffs assert that the "outputs" of the VR process--monitoring findings, monitoring plans, and CAPs--do not comply with the IDEA's requirements, citing § 300.600 (a)(2) and (a)(3), § 300.603 (b)(1), and § 300.604. Plaintiffs do not state how these outputs fail to comply with the cited requirements, but note that the requirements are not mentioned in VR guidance documents. In addition, Plaintiffs state that CDE has refused to produce VR monitoring plans, citing CDE's 3/22/13 Response (PDC at 40). As noted in Section III. E. 1. above, CDE responds to Plaintiffs' point regarding the use of determinations in the VR selection process by noting that the federal regulations cited by Plaintiffs contain only "broad statements" and no specific references to VRs (CDE Design Response I at 17). Additionally with respect to determinations, CDE explains that it reviews data submitted by districts and makes determinations, and that these determinations can result in technical assistance or other support to "compel compliance." Further, corrective actions can result from the VR process. If student-level correction is required, the violation must be corrected within 45 days, and a follow-up review of other students demonstrating 100% compliance is also required. CDE claims that systemic corrective actions "require" the development or review of policies and procedures, dissemination of the policies and procedures to relevant staff, and training of staff. In all cases noncompliance must be corrected within one year. Citing its SPP, CDE states that if noncompliance is not corrected, CDE has the authority to impose sanctions.
The available sanctions are structured as a hierarchy, and include submission of data demonstrating correction of violations, letters of noncompliance, local board hearings, additional "focused and continuous" monitoring, negative certification steps for nonpublic schools, "requiring intermediary agency assurance," "specialized" corrective actions, requiring compensatory services, attaching "special conditions" to grants, withholding funds, and using writs of mandate (CDE Design Response I at 19-21; see also 12/12 Revised SPP at 86-87).

First, there is no requirement that VR guidance documents mention the IDEA provisions cited by Plaintiffs. Second, although most of the enforcement options specified in § 300.604 appear on CDE's list of available sanctions, several do not: directing the use of funds (§ 300.604 (a)(2)), the use of compliance agreements if there are reasons to believe that compliance cannot be achieved within one year (§ 300.604 (b)(2)(ii)), and recovering funds (§ 300.604 (b)(2)(iv) and (c)(1)).51 Third, with respect to monitoring findings, as noted in Section III. C. 3. b. above, information gathered through monitoring visits is required to be part of the basis of determinations (§ 300.603 (b)(1)). None of the other regulations cited by Plaintiffs apply to VR monitoring findings, unless the findings are uncorrected. Fourth, the regulations cited do not appear to have specific applicability to VR monitoring plans, and Plaintiffs do not set forth an argument to show such applicability. Fifth, CAPs are specifically mentioned in the requirements, but in the context of an NI determination for three or more consecutive years (§ 300.604 (b)(2)(i)). This requirement does not apply to routine VR corrective actions.

51 The Monitor is interpreting writs of mandate as the functional equivalent of referrals for enforcement (§ 300.604 (b)(2)(vi) and (c)(3) and (4)).

Turning to CDE's refusal to produce VR monitoring plans, CDE responded to this request from Plaintiffs by arguing that "detailed information regarding VRs conducted in other LEAs as long as five years ago" is not "critical in determining whether CDE currently" has a monitoring system that meets the standard set by Consent Decree § 13.0 (CDE's 3/22/13 Response at 40, 52). The Monitor disagrees. While VR monitoring plans from five years ago are not likely to be relevant, recent VR monitoring plans are likely to be helpful in understanding this aspect of the VR monitoring process.

Monitor's Determination: Plaintiffs have not shown that VR guidance documents must mention the cited requirements, or that VR outputs do not comply with the cited requirements. CDE has not demonstrated that it has available to it all enforcement options specified in § 300.604. Therefore, CDE shall engage in corrective action steps to ensure that it has available to it the enforcement options of directing the use of funds, the use of compliance agreements if it has reasons to believe that noncompliance cannot be corrected within one year, and recovering funds. CDE shall set forth in a submission to the Monitor and parties citations to appropriate state laws and/or regulations showing that it has the specified enforcement options available. If it does not have these options available, CDE shall set forth with precision the steps necessary to make those options available, and timelines for taking those steps.
In addition, within 60 days of the date of this memo, CDE shall produce VR monitoring plans for a random sample of ten districts that received VRs in the 2012-13 school year.

Monitor's Overall Determination for the Design of CDE's Monitoring System: In the specific respects set forth in the System Design determinations above, the Monitor has determined that CDE has not demonstrated that the design of its monitoring system complies with statutory requirements, can be implemented adequately, can identify both noncompliance and compliance appropriately based on adequate evidence and reasoning, and can result in appropriate corrective actions. Therefore, the design of the state-level monitoring system currently in place is not capable of ensuring continued compliance with the law and the provision of FAPE to children with disabilities in Ravenswood. Hence, a CAP that includes at minimum the steps specified in each determination above should be developed and implemented. The CAP should include timelines and persons responsible for each activity and outcome.

IV. SESR Implementation Objections

Plaintiffs argue that CDE has waived its objections to Plaintiffs' challenge to SESR implementation, citing Section VI. F. of the Fifth Joint Statement (Cummings email to Mlawer, 8/26/13). CDE responds by citing Section B. 6. of the Second Joint Stipulation Re Amendment of Dispute Resolution Timelines in Fifth Joint Statement (Wise email to Mlawer, 8/26/13). The Fifth Joint Statement at VI. C. requires the parties to inform the Monitor of the bases of their positions on disputed issues related to SESR implementation within a week of the expiration of the meet-and-confer period, which Plaintiffs argue ended on 6/7/13. Section VI. F. states that failure to timely object constitutes a waiver of any objection. However, in the Second Joint Stipulation the parties changed the deadline for the meet-and-confer to 7/26 (at B. 5.), and changed the deadline for submitting disputed issues to the Monitor to 8/26 (at B. 6.). Thus, CDE has not waived its objections to Plaintiffs' SESR implementation challenge.

A. Creation of Monitoring Plan

1. Lack of Meaningful and Representative Parent Input

Plaintiffs assert that the parent data relied upon for the creation of the monitoring plan was unreliable and not representative of the District's special education population. Plaintiffs claim that parent meetings were held during the summer, a period of time when "many" parents were away, and thus the parent meetings were not well attended. The parent surveys were also sent during the summer, and used, in Plaintiffs' view, "high-level vocabulary," a barrier to meaningful responses. Further, because sufficient responses to the survey were not received, District staff members were "overly burdened" by the necessity of making phone calls to parents, often in Spanish, to seek sufficient numbers of responses to the survey (Plaintiffs' SESR Implementation Objections, 5/10/13,52 at 5-6).

52 This document is cited below as SESR Objections.

CDE responds that input was gathered from "at least" 20% of parents. With respect to the timing of the parent survey, CDE notes that this was due to the SESR timelines agreed to by Plaintiffs, citing the Fifth Joint Statement, as these surveys are typically sent out during the school year; the parent meetings were held during the first week of school, not the summer break, as Plaintiffs claim.
CDE agrees that aspects of the survey instrument may be challenging for some parents to understand, and commits to adopting a new instrument for the 2014-15 school year. CDE does not respond to Plaintiffs' point that District staff members were unduly burdened by having to call additional parents, but does note that this activity took place during the school year (CDE's SESR Implementation Response, 8/26/13,53 at 4).

53 This document is cited below as SESR Response.

Section II. E. of the Fifth Joint Statement (at 4) calls for the collection and analysis of parent survey data to take place in August 2012. However, that passage is written in the past tense ("collected and analyzed"), as the document was filed in December 2012. Thus, in the relevant section of the document cited by CDE, Plaintiffs were only stipulating to events that had already taken place. But in Section II. E. of the Fourth Joint Statement (at 7), filed in July of 2012, the same timeline is called for. Hence, Plaintiffs did agree in advance to a timeline that necessitated collecting data from parents during the summer.

Plaintiffs do not cite specific survey questions to support their conclusion that the survey instrument uses vocabulary with which many parents may not be familiar. However, CDE concedes this point and has agreed to redraft the instrument.54 Although CDE did not respond to Plaintiffs' point that District staff were overly burdened by calling parents, the self-monitoring component of a state monitoring system always creates some additional burdens and obligations for districts. The necessity of contacting additional parents in order to ensure adequate parent input into a SESR monitoring plan was not, in the Monitor's judgment, an excessive burden on the District.

54 In the Monitor's view, terms like "good faith effort" (Q1) and "facilitate" (Q5), for example, may be problematic in a parent survey.

Monitor's Determination: Plaintiffs have not shown that the District was unduly burdened by calling additional parents to administer the parent survey, nor have they shown that the timing of the parent input activities was imposed by CDE alone. CDE has agreed that the meaning of some questions in its current parent survey instrument may not be clear to some parents. Therefore, CDE should engage in corrective action steps to develop a SESR parent survey comprehensible to the vast majority of parents. CDE shall set forth in a submission to the Monitor and the parties a draft parent survey instrument.

2. Inadequate CDE Oversight

Plaintiffs argue that CDE did not oversee the development of the parent input aspects of the monitoring plan effectively. They point out that the District noted on the plan in one area that parents did not understand the question, yet CDE approved the plan as written. Plaintiffs also assert that the role of the SELPA in the development of the monitoring plan is unclear (SESR Objections at 6). Unfortunately, CDE does not offer a response to these specific objections (SESR Response at 4).

The Monitor agrees that any role the SELPA may have played in the development of the monitoring plan is opaque from CDE's documents. But Plaintiffs offer no explanation regarding why this matters: in the Monitor's judgment, any role the SELPA played is irrelevant; what is important is the accuracy and comprehensiveness of the content of the document, not what entities contributed to its development.

However, CDE is ultimately and fully responsible for the implementation of every aspect of its monitoring system, including the development of monitoring plans for SESRs. The monitoring plan for Ravenswood's SESR (Section I) for individual item 8-4-2 under "Title and Description" reads:

The IFSP for an infant or toddler and the infant's or toddler's family is reviewed every six months or more frequently. Total Responses were 24 parents. Out of 24, 9 replied Yes, 7 replied don't know, 8 replied No. Clearly parents did not understand the question. Question validity-RSCD (sic) has only 9 students under the age of 3.

In spite of that concern from the District, on the last page of the document the box next to the statement "This Monitoring Plan has been reviewed and is approved as submitted" is checked, and the signature of the CDE Administrator appears under that statement. As this concern expressed by the District appears to have validity, it does not seem to be the case that CDE reviewed the parent input portion of the monitoring plan thoroughly.
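For illustration only, the figures quoted from item 8-4-2 above can be checked mechanically; a simple screen of this kind, run before approval, would flag survey items whose response counts exceed the eligible population. The tallies below are taken from the monitoring plan quote, and the screen itself is a hypothetical sketch rather than any procedure CDE currently uses.

    # Hypothetical consistency screen using the figures quoted from item 8-4-2.
    responses = {"yes": 9, "don't know": 7, "no": 8}
    total_responses = sum(responses.values())  # 24, matching the plan's total
    eligible_students = 9                      # students under age 3, per the District

    if total_responses > eligible_students:
        print(f"Flag: {total_responses} responses received for an item that "
              f"applies to only {eligible_students} students")

A screen of this sort would have flagged the item before the plan was approved as submitted.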
However, CDE is ultimately and fully responsible for the implementation of every aspect of its monitoring system, including the development of monitoring plans for SESRs. The monitoring plan for Ravenswood's SESR (Section I), for individual item 8-4-2 under "Title and Description," reads:

The IFSP for an infant or toddler and the infant's or toddler's family is reviewed every six months or more frequently. Total Responses were 24 parents. Out of 24, 9 replied Yes, 7 replied don't know, 8 replied No. Clearly parents did not understand the question. Question validity-RSCD (sic) has only 9 students under the age of 3.

In spite of that concern from the District, on the last page of the document the box next to the statement "This Monitoring Plan has been reviewed and is approved as submitted" is checked, and the signature of the CDE Administrator appears under that statement. As this concern expressed by the District appears to have validity, it does not seem to be the case that CDE reviewed the parent input portion of the monitoring plan thoroughly.

Monitor's Determination: Plaintiffs have not shown any effect or potential effect of the role the SELPA may or may not have played in the development of the monitoring plan. CDE has not demonstrated that it exercised effective oversight of the development of the parent input portions of the monitoring plan. Therefore, CDE should engage in corrective action steps to ensure that it exercises effective oversight over the development of the parent input portions of SESR monitoring plans. CDE shall set forth in a submission to the Monitor and the parties draft policies, procedures, and guidance documents for CDE consultants adequate to ensure that it exercises effective oversight over the development of the parent input portions of SESR monitoring plans.

3. Lack of Transparency Regarding Other Data

Plaintiffs argue that the rationale for choosing some of the areas of focus for the Ravenswood SESR is unclear in three respects. First, they note that two areas of focus were chosen based on performance on SPP indicators. For one of these, performance on state assessments in English/Language Arts (ELA) and Math (Indicator 3), Plaintiffs argue that none of the questions the District was required to answer through the SESR focused on achievement in these areas, with the possible exception of questions related to student goals/objectives. Further, the District was not asked to look at actual achievement scores during the process, nor was it required to select samples of students for investigation who scored poorly on state assessments in these areas. Second, Plaintiffs argue that the monitoring plan is not clear regarding how the special populations were selected for review, adding that CDE's SESR manual is also unclear on this issue. Third, Plaintiffs allege that it is unclear if, and if so how, compliance complaints and due process hearings influenced the content of the monitoring plan. It appears to Plaintiffs that the area of Other Health Impaired (OHI)-Diabetes was identified for monitoring based on complaints, but how complaints and hearings affected the monitoring plan is not transparent.
In addition, though the point is raised only in a footnote, Plaintiffs also argue that data on students with disabilities in Ravenswood receiving more than one day of out-of-school suspension, and data regarding suspensions of African-American students with disabilities, should have resulted in the monitoring of this issue in Ravenswood's SESR. Yet Plaintiffs did not find evidence that this issue was probed (SESR Objections at 5, fn. 3; 6-8).

CDE offers partial responses to three of Plaintiffs' points. With respect to Indicator 3 monitoring, CDE asserts that 26 items related to performance on state assessments were "subject to analysis" through the SESR student record review and the policies and procedures review. CDE adds that Plaintiffs' point on this issue reflects nothing more than Plaintiffs' "misunderstanding" of SESR implementation (SESR Response at 4). However, CDE has misunderstood Plaintiffs' argument. Plaintiffs argued that none of the items pointed to by CDE actually focused on math and ELA achievement. CDE offers no response to this critique, and ignores Plaintiffs' points regarding the failure to look at student achievement scores and to create and investigate samples of students with poor math and ELA achievement. The latter is particularly important. The claim that one can competently investigate potential FAPE violations related to poor student achievement in math and ELA through samples that do not purposefully include students with poor performance in these areas is not defensible: noncompliance that may exist for such students is far less likely to be found if sufficient numbers of them are not in the samples reviewed. In addition, the monitoring methodologies used must be capable of finding such violations, but CDE does not set forth evidence that the methodologies used were capable of doing so.

With respect to the putative lack of clarity concerning how special populations were chosen for review, CDE says only that the District consulted with CDE to ensure proper selection of special populations, and that its SESR manual will be revised in this regard. No explanation is offered on this issue with respect to the Ravenswood SESR monitoring plan (SESR Response at 4). It is important to note that Section IV of the monitoring plan states that the specified special populations were chosen for review based on data from Dataquest; however, the plan does not specify the data that resulted in the selection of the groups identified for monitoring. OHI-Diabetes is also listed in this section of the monitoring plan, and is not listed in Section III (Compliance History), which suggests that data rather than complaints resulted in the selection of this issue for monitoring. But CDE's response is silent on this issue.

Turning to the suspension issue raised by Plaintiffs, CDE responds that Indicator 4 data did not point to this issue as a potential problem in the District, but states that it will use CALPADS data for future SESRs (SESR Response at 4). The Monitor has made a determination related to CDE's exclusive use of Indicator 4 data for monitoring purposes (see Section III. C. 3. c. above), which applies to this issue in the SESR context as well.
While CDE disputes the data on which Plaintiffs rely, and in addition to the concerns expressed in Section III. C. 2. above regarding the accuracy of CDE data, CDE's exclusive reliance on Indicator 4 data as a rationale for excluding suspension/expulsion-related issues from the Ravenswood SESR monitoring plan again misses the force of Plaintiffs' concern. Even assuming the accuracy of CDE's Indicator 4 data, which concern suspensions/expulsions of greater than ten days in a school year, the exclusive use of these data may mask disproportionate use of suspensions of under ten days,55 and any potential FAPE or child find violations such suspensions may reveal if investigated competently.

55 See Section III. C. 3. c. above for CDE data on disproportionate use of suspensions statewide.

Monitor's Determination: CDE has not demonstrated that 1) the items related to Indicator 3 investigated as part of the District's SESR focused on potential FAPE violations related to math and ELA achievement; 2) actual student achievement data related to these areas were reviewed; 3) samples were selected that included a sufficient number of students who scored poorly on state assessments in these areas; 4) its SESR monitoring plan is sufficiently transparent regarding the rationale for the selection of special populations for review, and how the District's compliance history influenced the content of the monitoring plan; and 5) potential disproportionate use of suspensions of under ten days, and FAPE and child find violations that may be related to such suspensions, were investigated. Therefore, CDE should engage in corrective action steps to ensure that 1) its SESR monitoring plans, policies, procedures, methodologies, student samples, and instruments are reasonably calculated to result in competent monitoring of potential FAPE violations related to math and ELA achievement; 2) its monitoring plans, guidance documents, and policies and procedures are transparent regarding the rationale for the inclusion of special populations for review and how compliance history influences the content of the plans; and 3) its monitoring plans, policies, procedures, methodologies, student samples, and instruments ensure that potential disproportionate use of suspensions of under ten days, and FAPE and child find violations that may be related to such suspensions, are investigated. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, manuals, instruments, and guidance documents it will adopt in this regard.

B. Student Records Review

1. Organizational Burden

Plaintiffs argue that the student record review portion of the SESR is unduly costly and burdensome to districts. In support, Plaintiffs point to the process of staff first filling out a paper document and then entering the monitoring data electronically, adding that the mandated corrective actions would require the same method. Plaintiffs assert that the process of converting information from paper documents to electronic formats also "led to inconsistency." Plaintiffs do not offer examples of alleged inconsistencies caused by this process (SESR Objections at 8-9). While CDE makes comments on the subject of consistency (see below), it does not respond to Plaintiffs' specific concern regarding inconsistencies caused by the paper-to-electronic conversion process (SESR Response at 4-6).
As Plaintiffs do not offer supporting evidence for their claim of paper-to-electronic inconsistencies, the Monitor will not search for evidence to support Plaintiffs' claim.56

56 See SESR Response at 3 ("CDE also contests those objections that are overly general and lacking in evidentiary support."), and Fifth Joint Statement at Section VI. A.

With respect to the claim of undue burdens and costs to Ravenswood as a result of SESR implementation, self-monitoring components of state monitoring systems often create additional burdens and costs for districts. While unfortunate, that fact does not render the record review portion of CDE's SESR process an ineffective tool for identifying compliance and noncompliance appropriately.

Monitor's Determination: Plaintiffs have not shown that any inconsistencies resulted from the conversion of monitoring data from paper to electronic formats, nor have they shown that unreasonable costs and burdens were incurred by the District through implementing the student record review portion of the SESR.

2. Lack of Clarity/Consistency Across Reviews

Plaintiffs allege that CDE did not provide the District with guidance sufficient to produce consistency in the findings of compliance made during the SESR. In support, Plaintiffs offer four arguments. First, items that did not apply to students were marked compliant for some, but not applicable for others; the prior written notice requirement is used to illustrate this issue. Second, some requirements monitored appear procedural in nature but actually "implicate substantive determinations." Item 3-2-11.1, the example offered by Plaintiffs, queries whether the IEP includes extended school year (ESY) services "when appropriate." Plaintiffs argue that answering this question requires a substantive judgment of whether ESY was appropriate for the student, but CDE did not "clearly articulate" a substantive standard to be used in making this judgment. Thus, the word "appropriate" in this item was ignored by the District in the SESR, Plaintiffs claim, as were similar concepts in other items. Third, Plaintiffs argue again that items monitored due to a presumed relation with an SPP indicator did not include reviewing the actual data that comprise the indicator, nor did the monitoring target the indicator itself. Fourth, Plaintiffs note that an 11/6/12 email from the CDE Administrator to the District Assistant Superintendent pointed out that some reviewers did not answer questions in complete sentences, submit the number of data collection summary sheets called for, or document student data appropriately.

A determination related to Plaintiffs' argument concerning SPP indicators has been made at Section IV. A. 3. above. A determination on the consistency of reviews will be made at Section IV. B. 3. below, as CDE offers a response on this issue applicable to both findings of compliance and noncompliance. With respect to the problems revealed by the email from the CDE Administrator, this may show CDE consultants going beyond the review process required by CDE's 2011-2012 Self-Review Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants (see Section III. D. 3. above); in this case, the CDE consultant took action to ensure that certain basic requirements of the SESR process were fulfilled, an active form of CDE oversight over the SESR process in Ravenswood (see also SESR Response at 5).
As called for by the determination set forth at Section III. D. 3. above, such steps should be built into the CDE SESR oversight process so that they take place routinely in all SESRs, both for findings of compliance and noncompliance.

Turning to Plaintiffs' argument that some monitoring items appear procedural in nature but actually implicate substantive judgments in the absence of clear standards with which to make such judgments, the example cited by Plaintiffs was examined in CDE's "Item Table 12-13" spreadsheet and is displayed below:

Item No: 3-2-11.1
Compliance Test: Does the IEP, when appropriate, include extended school year services for the student?
Revised Compliance Standard: Must be documented in the IEP, when item is appropriate for the student.
Revised Other Guidance: Review IEP to determine if extended school year is indicated as a check-box or other method for the district to indicate if extended year is necessary.

Here the item purports to judge whether ESY services are included in the IEP for a student when "appropriate." The compliance standard looks for documentation of ESY in the IEP when it is "appropriate" for the student, but does not include a standard of appropriateness, a substantive judgment; nor does the other guidance column, which simply substitutes the word "necessary" for the word "appropriate." Plaintiffs' objection on this issue is correct with respect to this item. Although they allege that this problem affects other items, Plaintiffs do not put forth any as additional examples. However, CDE must ensure that its item table and supporting documents provide sufficient guidance for district staff in the SESR process to enable them to make substantive determinations of compliance and noncompliance with substantive requirements.

Monitor's Determination: Determinations have been made above or will be made below on three of Plaintiffs' specific objections. CDE has not demonstrated that it provides guidance to district staff sufficient to enable substantive determinations of compliance or noncompliance with substantive requirements. Therefore, CDE should engage in corrective action steps to ensure that it provides guidance to SESR monitors reasonably calculated to enable substantive determinations regarding compliance and noncompliance with substantive requirements. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, manuals, and guidance documents it will adopt in this regard.

3. Findings of Noncompliance Lack Consistency

Plaintiffs argue that inconsistency of reviews is shown by disagreements between reviewers and administrators on proper compliance determinations. In addition, Plaintiffs point to inconsistent documentation of noncompliance in SESR software submissions, summary sheets, and SESR worksheets, and add that the rationale for findings of noncompliance was not clearly stated (SESR Objections at 10-11).

CDE responds that Plaintiffs cite few specific examples of inconsistencies, and states that the "student record review is designed to produce consistent results." But if there were inconsistencies, CDE continues, this would only be a problem if it led to unreliable results. However, CDE believes that Plaintiffs acknowledged in the meet-and-confer discussions, and CDE confirmed in its on-site follow-up, that both Plaintiffs and CDE were "generally" able to replicate the District's SESR findings.
Thus, in CDE's view, the SESR results are reliable and "generate meaningful information." CDE also points to the above-mentioned intervention of its Administrator and its follow-up review as providing important guidance and feedback to the District, and states that it is considering changes to its manual to require district staff to review files as a team during future SESRs (SESR Response at 5).

The Monitor reviewed the SESR Student Non-Compliant Report and finds that the rationale for each finding of noncompliance is stated in that document, although not always with all relevant details included (see Section IV. B. 6. below). While Plaintiffs do not indicate specific inconsistencies in the review of any particular students, they have alleged confusion between compliant and not applicable designations, disagreements between reviewers and administrators on findings, and a lack of alignment between summary sheets, worksheets, and software submissions. CDE addresses none of these allegations, instead arguing that the findings were "generally" replicated and "meaningful" information was provided through the SESR. But meaningfulness and general replication are not the applicable standards. CDE's monitoring system must be implemented adequately, and must identify both noncompliance and compliance appropriately based on adequate evidence and reasoning; CDE has not demonstrated that either was the case in the Ravenswood SESR.

Monitor's Determination: Plaintiffs have not shown that rationales for student-level findings of noncompliance are not stated clearly. CDE has not demonstrated that the SESR process in Ravenswood was implemented adequately and identified noncompliance and compliance appropriately based on adequate evidence and reasoning. Therefore, the determination reached at Section III. D. 3. above is reiterated: CDE shall engage in corrective action steps to ensure that its processes for validating the accuracy of SESR data collection are reasonably calculated to result in accurate SESR data. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, guidance and training documents it will adopt to ensure that SESR findings are accurate.

4. Exercise Often Reduced to Box-Checking

Plaintiffs identify a review question that appears simple but in their view requires a much more important judgment. The example offered is item 4-1-1, "Does the LEA make a free appropriate public education available to all eligible students, who are between the ages of 3 and 22?" Such questions require, Plaintiffs argue, "full-scale determinations of FAPE" that "cannot possibly be completed in the checklist format" of the record review. Hence, these sorts of questions turn the review into nothing more than the mere checking of boxes (SESR Objections at 11).

CDE responds very generally to this objection, claiming that Plaintiffs' "box-checking" criticism "ignores the very purpose of the SESR." CDE believes that the investment of time called for by the various steps of the SESR process is rewarded by the valuable feedback the District received through the findings of noncompliance. Rather than being little more than box-checking, "...the student record review requires that the District provide a sufficient basis for each finding of noncompliance by demonstrating why the legal requirement was not met and stating the source of the evidence for noncompliance."
Unfortunately, CDE does not respond to Plaintiffs' specific example (SESR Response at 5). The example cited by Plaintiffs was examined in CDE's "Item Table 12-13" spreadsheet and is displayed below:

Item No: 4-1-1
Compliance Test: Does the LEA make a free appropriate public education available to all eligible students, who are between the ages of 3 and 22?
Revised Compliance Standard: Records, documentation and interviews must indicate that all children with a disability have a free appropriate public education available to them.
Revised Other Guidance: Review the components of FAPE in 34 CFR 300.13 and in 34 CFR 300.300-313. Generally, does the LEA serve all eligible children? Are they served at no cost to the parents? Are services provided under public supervision? Does it include a regular preschool, elementary and secondary education? And are the services provided in conformity with an IEP?

First, both the compliance test and the compliance standard refer to "all" students, yet the guidance asks whether the LEA "generally" serves all eligible children. It is difficult to reconcile those two concepts and, thus, answers to this question are likely to be largely devoid of meaning. Second, as only samples of students are reviewed through the SESR process, there is no evidentiary basis for a judgment regarding "all" students. Third, the question would also require a full monitoring of child find in order to be answered, as students may be eligible who have not been located, identified, and served. Finally, Plaintiffs' argument that a "full-scale" determination of FAPE is required by the question is correct, and Plaintiffs are also correct that the content of the record review collectively does not provide a basis for a defensible answer to the question, as the record review is not, for example, an educational benefit review.

However, it is unfortunate that CDE did not respond to this specific objection from Plaintiffs, as it does not appear to the Monitor that item 4-1-1 was reviewed as part of the student record review. Rather, this item appears to be answered through the educational benefit review, according to CDE's SESR training document ("Final Activity Two 12-13 SESR PPT," slide 63). Moreover, the Monitor reviewed the worksheets filled out by District staff, and they indicate that item 4-1-1 was reviewed as part of the educational benefit portion of the SESR. Nevertheless, the Monitor's first three points above are applicable to this question in the context of the educational benefit review.

Monitor's Determination: CDE has not demonstrated that item 4-1-1 is capable of a meaningful answer as currently structured. Therefore, CDE should engage in corrective action steps to ensure that all questions in its review instruments match the compliance standard and guidance, and are capable of being answered accurately in light of the evidence gathered during the review. CDE shall set forth with precision through a submission to the Monitor and the parties the review instruments it will adopt in this regard.

5. Exclusive Focus on Annual IEP Insufficient

Plaintiffs argue that noncompliance can be missed because the student record review does not go beyond the current IEP. First, for students with behavior plans, the process does not require looking at the adequacy of the functional analysis assessment (FAA) on which the behavior plan was based.
Second, very few of the files reviewed were of students who were recently identified as having a disability, had a triennial assessment, or transferred into the District, a new school, or a special education setting. As a result, monitoring items that address such situations are skipped rather than investigated using older records. The same is true for the prior written notice requirement. Third, reviewers skip a number of monitoring items identified by Plaintiffs if the student does not have a specific learning disability (SLD). However, Plaintiffs believe, for unspecified reasons, that these items also apply to other students. Plaintiffs add that it is not clear why SLD students were chosen as a special population for review57 (SESR Objections at 11-12).

57 See determination on the issue of choosing special populations for review at Section IV. A. 3. above.

CDE does not respond to these specific objections except to note that Plaintiffs acknowledge that the educational benefit portion of the SESR does go beyond the current IEP (SESR Response at 6). CDE's argument that it has another aspect of the SESR process that looks beyond the current IEP does not attempt to answer Plaintiffs' objections or explain why the student record review portion of the SESR cannot or should not include reviewing older functional analyses or prior written notices, or older IEPs that followed initial or triennial assessments or transfers.58 Nor does CDE explain why SLD students were chosen as a special population for review, or why items applicable to other students were only monitored for SLD students, or why those items were not, in fact, applicable to other students. CDE's failure to offer a response to the last claim is particularly puzzling.

58 The Monitor notes on the subject of reviewing older IEPs that there is evidence in the "Student Non-Compliant Report" that an older IEP was reviewed for one student (at 25, for requirements 20-3-5, 4-1-4, and 20-3-5.1). In addition, a review of the SESR worksheets for two students with behavior problems indicated that an older FAA was reviewed for one student but not the other.

The requirements pointed to by Plaintiffs, drawn from CDE's "Item Table 12-13" and including the citation column from CDE's spreadsheet, are displayed below:

Item No: 3-5-9.7
Compliance Test: On initial IEP's and each triennial, for students determined to have a specific learning disability, did the IEP team certify in writing that the disability is due to a disorder in one or more of the basic psychological processes and is not the result of environmental, cultural, or economic disadvantages?
Revised Compliance Standard: Must be documented in student record.
Revised Other Guidance: The report includes information considered by the team: data from tests, information from parents, pupil performance in the classroom setting, age of child and other relevant information.
Citation: 34 CFR 300.311(a)(6), 30 EC 56337(a).

Item No: 3-5-9.2
Compliance Test: On initial IEP's and each triennial, for students determined to have a specific learning disability, did the IEP team certify in writing that observations of relevant behavior have been made by at least one team member other than the student's teacher (in the general education classroom or other appropriate environment)?
Revised Compliance Standard: Must be documented on the IEP.
Revised Other Guidance: The observation must be made by a qualified professional other than the child's regular classroom teacher. An Assessment Report statement and an observation report check off on the IEP or the form for SLD eligibility criteria.
Citation: 34 CFR 300.310 (a-c), 30 EC 56341(c).

Item No: 3-5-9.4
Compliance Test: On initial IEP's and each triennial, for students determined to have a specific learning disability, did the IEP team certify in writing that there exists a disorder in one or more of the basic psychological processes and a severe discrepancy between intellectual ability and academic achievement in oral and written language, reading, or mathematics which cannot be corrected through general education or categorical services?
Revised Compliance Standard: Must be documented in student record.
Revised Other Guidance: Check in the Assessment Report, "SLD eligibility form"; "IEP checklist for SLD eligibility". The report rules out limited school attendance.
Citation: 34 CFR 300.311(a)(7)(i), 5 CCR 3030(j).

Item No: 3-5-9.5
Compliance Test: On initial IEP's and each triennial, for students determined to have a specific learning disability, did the IEP team certify in writing that when standardized tests are considered to be invalid for a specific pupil, the discrepancy was measured by alternative means?
Revised Compliance Standard: Must be documented in student record.
Revised Other Guidance: The report includes a statement of the area, the degree, and the basis and method used in determining the discrepancy. This includes information considered by the team: data from alternative tests, information from parents, pupil performance in the classroom setting, age of child and other relevant information.
Citation: 34 CFR 300.307(a)(3), 5 CCR 3030(j)(4)(B).

The section of the regulations at §§ 300.307-311 is entitled "Additional Procedures for Identifying Children With Specific Learning Disabilities," and is only applicable to students suspected of having SLD. Plaintiffs may have in mind that some of the other students to whom these items should have applied were suspected of having SLD, but they neither state that nor identify any such students.

Monitor's Determination: Plaintiffs have not shown that any additional students should have had the SLD-specific items applied to their records in the SESR record review. CDE has not demonstrated that the student record review portion of the SESR cannot or should not include reviewing older functional analyses for students with behavior plans and prior written notices, and older IEPs that followed initial and triennial assessments, or transfers into the District, a new school, or special education setting. Therefore, CDE shall engage in corrective action steps to ensure that the student record review portion of the SESR is applied to sufficient numbers of students in each of the categories specified above, either by reviewing additional student files or by reviewing older records for the students chosen for review. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, manuals, guidance and training documents it will adopt in this regard.

6. Limited Verification of Findings by CDE

Plaintiffs assert that corrective actions were developed by CDE based on the SESR findings of noncompliance made by the District without reviewing student files "or engaging in any other sort of verification." Due to CDE's failure to train District staff adequately regarding data collection and submission, CDE's ability to verify findings was attenuated (SESR Objections at 12).
In response, CDE points to the guidance given to the District by the CDE Administrator during the SESR process, guidance acknowledged by Plaintiffs, and to its listing in its documents of the actions required of its consultants during the SESR process. In addition, CDE claims that its "consultants are involved in training, guiding, and monitoring every SESR activity." Further, CDE emphasizes its follow-up review, which it describes as "a comprehensive review of all evidence the District relied upon and all the documents created in completing the components of the SESR" (SESR Response at 5, 9).

Plaintiffs' point here concerns CDE's alleged adoption of corrective actions prior to any verification of the accuracy of the SESR findings of noncompliance. CDE's follow-up review cannot be regarded as responsive to this concern: according to CDE's "Chronology of SESR Activities in Ravenswood City School District" document, the follow-up review took place in June of 2013, while CDE's statements of student- and district-level corrective actions are dated 1/30/13; in addition, CDE trained District staff regarding corrective actions in January, received the District's corrective action statements in March, and received the revised corrective action statements in May. Determinations related to CDE's guidance documents for consultants, and the extent of CDE's oversight of SESR monitoring activities and data collection, have been made at Sections III. D. 2. and III. D. 3. above.

CDE's monitoring system must identify both compliance and noncompliance appropriately based on adequate evidence and reasoning, and result in "appropriate corrective actions." CDE's system of SESR oversight did not actually verify the adequacy of findings through reviewing files until the follow-up review: CDE's "Chronology" document does not show any oversight activities until the follow-up, nor do CDE's guidance documents call for on-site oversight of SESR activities or the review of files by CDE itself prior to the development of corrective actions. At the time of CDE's adoption of corrective actions, it did not know the extent to which SESR findings had identified compliance and noncompliance appropriately based on adequate evidence and reasoning; thus, it had no way of knowing that the corrective actions it had developed were appropriate. It is worth noting in this regard that CDE's SESR follow-up "revealed a number of instances of noncompliance not previously identified by the District" during the SESR, and found flawed SESR implementation by the District (Greenwood letter to Hernandez, 8/15/13, at 1). Thus, not all noncompliance was appropriately identified through the SESR, and the original corrective actions were not fully appropriate. Additional corrective actions were therefore mandated by CDE as a result of the follow-up review.

Monitor's Determination: CDE has not demonstrated that it developed appropriate corrective actions in response to the District's SESR findings based on appropriate identification of both compliance and noncompliance. Therefore, CDE shall engage in corrective action steps to ensure that SESR findings of compliance and noncompliance are verified to be correct prior to the adoption of corrective actions responding to SESR findings.
CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, manuals and guidance documents it will adopt to ensure that SESR findings of compliance and noncompliance are verified to be correct prior to the adoption of corrective actions responding to SESR findings.

7. Student-Level Compliance Errors

Plaintiffs believe that for the most part the District conducted the student record review accurately, but Plaintiffs' review of files found "many" compliance errors due to incorrect or inconsistent answers to questions. Plaintiffs offer a number of examples:

• vague findings of noncompliance ("SLP goals are appropriate"), and reasons given for noncompliance that are "sometimes vague or limited";
• inconsistent findings of noncompliance, including confusion between compliant and not applicable; reasons for findings were challenging to determine "[w]ithout clear evidence and rationale";
• district-level findings of noncompliance that were not "explicitly tied" to student performance or data; it was "unclear where these findings came from";
• lack of consensus among reviewers on findings, shown by discrepancies on worksheets, submissions, and document notes;
• findings of compliance although there was evidence of noncompliance: for example, when asked if a number of placement settings were considered, although some parents were concerned that their child might need more restrictive settings, these concerns were "not incorporated meaningfully" into IEPs and it was not clear that more restrictive settings were actually considered;
• findings of noncompliance although there was evidence of compliance with item 3-2-5 (support for school personnel): Plaintiffs note that "several" IEPs contained descriptions of student supports from paraprofessionals and other service providers; and
• student goals that seemed both appropriate and similar to the goals of other students but were sometimes found noncompliant, suggesting to Plaintiffs misidentification of either compliance or noncompliance.

None of Plaintiffs' examples includes references to findings for specific students (SESR Objections at 13-14). CDE does not offer responses to these objections, although its general comments on consistency are applicable (SESR Response at 4-6; see Section IV. B. 3. above).

While the Monitor will not search for supporting evidence not provided by Plaintiffs, some comments are in order. First, the "Student Non-Compliant Report" was reviewed on the subject of allegedly vague findings of noncompliance. The review indicates that three of the 63 student-level findings of noncompliance in the report (4.8%) were, in the Monitor's view, unduly vague; all concerned IEP goals.59

59 This problem, which affected a small percentage of findings, is of less importance in a self-monitoring effort than in direct state monitoring, as the District is presumably cognizant of its rationale for these findings regarding inappropriate IEP goals and is therefore capable of correcting the noncompliance.

Second, Plaintiffs assert that district-level findings of noncompliance were not tied explicitly to student performance or data, and that the genesis of these findings is not clear. With respect to the lack of ties to student performance or data, a review of the district-level findings does not indicate any that require ties to performance or data, in the Monitor's view, nor do Plaintiffs point out any findings that they believe require such ties. However, as suggested by the determination made in Section IV. A. 3. above, the use of monitoring methodologies capable of making substantive determinations of denial of FAPE might well require student performance data for support.
With respect to the source of these findings, CDE has explained: "Every student-level finding results in a district-level finding of noncompliance. Additional district-level findings may be made by CDE or district staff" (CDE 3/22/13 Response at 50). The district-level findings resulting from the Ravenswood SESR appear to match the student-level findings.60

60 However, see Section IV. H. 1. below for an additional determination on this issue.

Third, lack of consensus among reviewers does not necessarily indicate a compliance error, assuming the ultimate finding was accurate. Monitors can and do sometimes disagree on findings prior to a final decision. However, concerns have been noted in Sections III. D. 3. and IV. B. 3. above regarding CDE oversight of the collection of SESR data, and ineffective oversight can result in inaccurate ultimate findings. The remaining objections concern the consistency and correctness of findings. The Monitor has made determinations in Sections III. D. 3. and IV. B. 3. above that CDE does not currently have reliable methods for ensuring the accuracy of SESR findings of compliance and noncompliance.

Monitor's Determination: Plaintiffs have not shown that any of the SESR district-level findings of noncompliance required ties to student performance or data, or that the source of these findings is unclear, or that disagreements between reviewers resulted in inaccurate findings. CDE has not demonstrated that the SESR in Ravenswood was implemented adequately and identified noncompliance and compliance appropriately based on adequate evidence and reasoning. Therefore, the determinations reached at Sections III. D. 3. and IV. B. 3. above are reiterated: CDE shall engage in corrective action steps to ensure that its processes for validating the accuracy of SESR data collection are reasonably calculated to result in accurate SESR data. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, guidance and training documents it will adopt to ensure that SESR findings are accurate.

C. Implementation Review

1. Lack of Parent Interviews

Plaintiffs claim that the District did not interview parents as part of the review as called for by CDE's SESR guidelines, instead relying on the District's service delivery database, MySPED (SESR Objections at 14-15). CDE agrees that the SESR guidelines require parent interviews as part of the implementation review, and concedes that the District did not conduct such interviews as part of the process. But CDE points out that its follow-up review "caught this error" and will require correction of it during the VR. CDE believes that this "highlights" both the strength of the SESR and of its follow-up process (SESR Response at 6). CDE's SESR follow-up report "found that the District had failed to properly implement" the IEP implementation portion of the SESR; the specific ways in which implementation fell short are not included in CDE's report (Greenwood to Hernandez, 8/15/13, at 1).
While CDE does not explain how the District's failure to follow CDE's SESR guidelines could highlight the strength of its SESR process, the Monitor agrees that this circumstance shows the ability of its follow-up process to catch this sort of SESR implementation failure. But, aside from the fact that the follow-up process is applied to only 10% of SESR districts, this circumstance also shows clearly that CDE's oversight of SESR implementation in Ravenswood during and immediately following the process was insufficient to ensure that the SESR was implemented appropriately.

Monitor's Determination: CDE has not demonstrated that the IEP implementation review in Ravenswood was implemented adequately and identified noncompliance and compliance appropriately based on adequate evidence, as the review did not include parent interviews. Therefore, CDE shall engage in corrective action steps to develop processes for oversight of SESR implementation that are sufficient to ensure that the SESR is implemented in accordance with its guidelines. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures it will adopt to ensure that parent interviews are conducted as part of the IEP Implementation Review portion of the SESR.

2. Required Minimal Documentation Submission

Plaintiffs assert that the District "did not appear" to verify the accuracy of the MySPED data before relying on it for the implementation review. Thus, the IEP implementation review was based on unverified entries in the database made by service providers. They add that only one file contained a database printout against which the worksheet could be checked. Further, Plaintiffs assert that CDE's SESR guidelines are clear that service delivery logs are insufficient to determine whether services have been delivered. No citation to the guidelines is included (SESR Objections at 14-15). CDE responds:

However, the MySped logs are the data source relied upon by the Monitor in determining whether the District is compliant with the RSIP requirements. Because the reliability of the MySped logs has already been established to the satisfaction of the Monitor, plaintiffs' objection should be disregarded. (SESR Response at 6)

CDE does not cite any documents or reports from the Monitor establishing the reliability of the MySPED database, nor does it respond to Plaintiffs' point regarding CDE's SESR guidelines. A service delivery database is not required by the RSIP; RSIP 12.1.1 requires service provider timelogs. Compliance with that requirement is established through copies of the timelogs that are "complete" for 95% of providers. A complete timelog, according to the requirement, includes the type and nature of the service provided, the date and time of service delivery, and the signature of the provider. A specific level of accuracy of the timelogs is not required for compliance. RSIP 12.1.3 and 12.3.1 measure service delivery based, in part, on the timelogs: the former includes review of IEPs and student progress reports in addition to timelogs; the latter requires interviews and observations in addition to timelogs.
Over the years of monitoring compliance with these requirements, timelogs that misrepresented the status of service delivery have not been discovered, although, as is well known to the parties, incomplete and missing timelogs were often found until recent years. The RSIP does not call for a systematic judgment regarding the accuracy of the MySPED database.61

61 RSIP 1.3.2 calls for a review of the accuracy of another database, the District's student tracking database, and allows the District ten days to correct any inaccuracies found. If the inaccuracies communicated to the District are corrected within ten days, the District is found compliant with that requirement. Thus, as the Monitor has discussed with the parties in the past, even RSIP 1.3.2, the RSIP's sole database accuracy requirement, does not call for systematic judgments regarding the overall accuracy of the student tracking database.

However, none of this is relevant to the question of whether the IEP implementation portion of the SESR in Ravenswood was implemented appropriately and whether CDE exercised sufficient oversight to ensure that it was. The pertinent instructions in CDE's SESR manual state: "Secure or arrange to receive any documentation about service delivery since the beginning of the school year or the last IEP including service logs or other evidence (student attendance records, staff attendance records, contractor billing records)" (SESR Instruction and Forms Manual (10/12) at 22; emphasis added). The manual does not require evidence in addition to service logs. However, one of CDE's SESR training documents calls for interviews of parents and service providers, and the review of "documented" evidence of service provision, listing service logs, student and service provider attendance records, and payment vouchers for contracted services as the documentation needed. The word "or" does not appear in this list ("Final Activity Two 12-13 SESR PPT," at slide 67; emphasis in original). CDE's "Item Table 12-13" does not mention the specific documentation needed, but clearly calls for staff interviews:

Item No: 4-1-3
Compliance Test: Does the LEA provide special education and related services in accordance with the student's IEP?
Revised Compliance Standard: Records, documentation and interviews must confirm that the LEA provides special education and related services as indicated on the IEP.
Revised Other Guidance: Check to make sure that all services are provided as described in the IEP (e.g., frequency, intensity, duration, location). Can staff confirm that students receive the special education and related services as stated on the IEP? Can staff describe methods that document how they ensure that students receive services as stated on the IEP? Can staff and administration describe how they ensure that special education and related services are provided to children with disabilities according to their IEPs? Can staff and administration describe how they determine which children are currently receiving special education and related services as stated on the IEP? And can staff and administration describe how they determine which children are not receiving special education and related services as stated on the IEP?
CDE's response does not state whether staff interviews were conducted to corroborate the data in MySPED, or whether any other corroborating steps were taken.

Monitor's Determination: CDE has not demonstrated that the IEP implementation review in Ravenswood was implemented adequately and identified noncompliance and compliance appropriately based on adequate evidence. Therefore, CDE shall engage in corrective action steps to ensure that its processes for oversight of SESR implementation are reasonably calculated to result in the implementation of the IEP implementation review in accordance with CDE guidelines. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures it will adopt to ensure that the SESR IEP implementation review is implemented adequately and identifies noncompliance and compliance appropriately based on adequate evidence. Further, CDE shall revise its SESR manual, item table, and training documents to clarify the required forms of evidence necessary for the IEP implementation review portion of the SESR.

3. Lack of Qualitative Review

Plaintiffs argue that the quality of the services delivered was not monitored during the SESR, in part due to the lack of data regarding parent or student satisfaction with the services (SESR Objections at 15). CDE agrees that the quality of service delivery is not measured by the IEP implementation review, but regards such an inquiry as tantamount to questioning the IEP document: "The IEP implementation review is not designed for the state to question the IEP itself, which is formatted with input from parents and service providers who observe the student on a daily basis" (SESR Response at 6-7).

First, judging the quality of the special education and related services provided to students has nothing to do with questioning the IEP itself; the members of an IEP Team presuppose that services will be delivered, and delivered competently, when they agree that specific services and supports are necessary for a student to receive FAPE. Second, monitoring systems that comply with the statute question IEPs as necessary, as it would be impossible for a monitoring system to monitor substantively in the priority areas of FAPE and LRE without doing so.

The Monitor reached a determination at Section III. D. 2. above that Plaintiffs did not argue convincingly that observations of students should be included in all SESRs, and commented that a determination would not be made on whether observations should be conducted in some SESRs due to data indicating poor student outcomes or parental or student dissatisfaction with the quality of service delivery, as Plaintiffs did not raise that argument. As CDE did not ensure that parent interviews were conducted as part of the IEP Implementation review, the Monitor cannot judge whether the results of such interviews should have resulted in direct monitoring of the quality of service delivery as part of the Ravenswood SESR.

Monitor's Determination: A determination has been made at Section IV. C. 1. above regarding CDE's failure to ensure that parent interviews were conducted as part of the IEP Implementation review. Plaintiffs have not shown that student observations should have been conducted as part of the IEP implementation review as a result of other data.
4. Reliance on MySPED

Plaintiffs believe that the District's use of MySPED suggests that conducting the IEP implementation review would be challenging for districts that lack a service delivery database (SESR Objections at 15). CDE responds that the District does, in fact, have the database, and notes that the standard from the Court's Order includes the words "as applied to Ravenswood" with respect to the state's monitoring system; thus, whether other districts could perform the review without such a database is irrelevant to the standard (SESR Response at 6).

In light of the unclear guidance from CDE regarding the evidence to be used in the IEP implementation review, the Monitor agrees that conducting the review might well be challenging for other districts. However, as CDE correctly points out, this is not relevant to whether the SESR has been implemented adequately in Ravenswood.

Monitor's Determination: Plaintiffs have not shown the relevance of the possibility that some districts could not perform the IEP implementation review, due to the lack of a service delivery database, to the standards governing SESR implementation in Ravenswood.

D. Educational Benefit Review

1. Lack of Procedural Clarity

Plaintiffs argue that, due to a lack of clarity in CDE's SESR guidance, implementation of the educational benefit portion of the Ravenswood SESR was inconsistent. Some educational benefit reviews gave detailed information about students while other reviews were very limited or left some sections blank. Plaintiffs believe that it was particularly difficult to complete the reviews meaningfully when fewer than three years of student records were available (SESR Objections at 16).

CDE does not respond to the last point, and believes that the District implemented the educational benefit review "with fidelity." However, CDE also states, without specificity, that its on-site follow-up review "noted many of the same mistakes observed by plaintiffs." CDE promises additional guidance and training to districts in 2013-14 SESRs, including training on the educational benefit review "at the time the LEA staff is conducting the review" (SESR Response at 7). CDE's SESR follow-up report "found that the District had failed to properly implement" the educational benefit portion of the SESR; the specific ways in which implementation fell short are not included in CDE's report (Greenwood to Hernandez, 8/15/13, at 1).

CDE has conceded that the reviews were inconsistent and that additional guidance for the educational benefit review will be necessary in the future. In addition, the Monitor's review of the educational benefit worksheets also revealed some inconsistency in the amount of information entered on these sheets. With respect to Plaintiffs' point regarding the difficulty of completing the reviews if there were fewer than three years of records available for a student, CDE's training document suggests, but does not appear to require, only reviewing students for whom a district has three years of records.62

62 "Start with the year in which an initial or triennial IEP was completed, review that year and two subsequent years of IEP information" ("Final Activity Two 12-13 SESR PPT" at slide 25).

Monitor's Determination: CDE has not demonstrated that the educational benefit review in Ravenswood was implemented adequately and identified noncompliance and compliance appropriately based on adequate evidence.
Therefore, CDE shall engage in corrective action steps to develop processes for guidance and training related to implementation of the educational benefit review portion of the SESR that are reasonably calculated to ensure that the review is implemented consistently, including guidance related to whether three years of records are required for students chosen for the educational benefit review. CDE shall set forth with precision through a submission to the Monitor and the parties the procedures, guidance documents, and training it will adopt and implement to ensure that the educational benefit review portion of the SESR is conducted consistently and identifies compliance and noncompliance appropriately.

2. Lack of Consistency

Plaintiffs regard the review as "entirely subjective," as "broad claims" regarding benefit are made without reference to "meaningful information" regarding student progress such as assessment data or other quantifiable data. Thus, in Plaintiffs' view, the reviews are both subjective and based on limited information. In addition, the reviews do not include observations or interviews (SESR Objections at 16).

In addition to its comments set forth above, CDE notes that including student observations in the reviews would be even more subjective than the current methodology. It does not respond with respect to interviews, and regards the educational benefit review as a "thorough and evenhanded examination" of educational benefit. However, its response does not include a defense of the objectivity and accuracy of the educational benefit reviews conducted as part of the Ravenswood SESR (SESR Response at 7).

As noted above, CDE has conceded that the reviews were inconsistent and that additional guidance for the educational benefit review will be necessary in the future. The Monitor's review of educational benefit worksheets indicates that Plaintiffs' claim that the reviews were "entirely subjective" is incorrect, as some worksheets show evidence of consideration of objective and quantifiable data such as performance on state assessments. Others, however, do not. CDE's Educational Benefit Chart documents with instructions for districts were reviewed; they do not make clear whether quantifiable data should be drawn from students' assessments and IEP present levels of performance and used to judge the amount of progress made by the student ("Ed Ben Chart Pg 1," "Ed Ben Chart Pg 2"). The use of state assessment performance data is not mentioned in these two documents. CDE's SESR manual, however, does contain specific instructions regarding the use of such data as part of the review of the assessment: "The district should document conclusions from the assessment information that is quantifiable such as measurement instrument scores or narrative specifying areas of service need related to the student's disability" (SESR Instruction and Forms Manual (10/12) at 20).
CDE's training document also calls for measuring whether an IEP is reasonably calculated to result in educational benefit, in part, through the achievement of passing grades and "[i]mproved scores on clinical, district, and statewide assessments"; and for identifying the present levels of student performance by including "as much information as possible," information that includes "not just test scores," suggesting that test scores should be included at a minimum. In addition, review teams are urged to ascertain whether the IEP Team considered the results of statewide assessments ("Final Activity Two 12-13 SESR PPT" at slides 31, 37, 53). Thus, according to some of CDE's SESR instruction documents, quantifiable data should be gathered and considered in the review.

While Plaintiffs do not lay out a rationale for including observations and interviews in the educational benefit review, gathering the perspectives of parents and service providers on the level of progress achieved by the student is crucial to a full review of educational benefit. However, as noted in Section III. E. 2. above, observations should be included if the review of records and interviews points to questions that can only be answered through observations, such as poor student outcomes that may be due to flaws in service delivery. But a rule that observations always be conducted would lead to an unproductive use of monitoring resources in exchange for little gain.

Monitor's Determination: Plaintiffs have not shown that the educational benefit reviews were entirely subjective, or that observations should have been included in the reviews. CDE has not demonstrated that the educational benefit review in the Ravenswood SESR was implemented consistently and adequately, and identified noncompliance and compliance appropriately based on adequate evidence. Therefore, CDE shall engage in corrective action steps to ensure that its processes for oversight, guidance, and training related to implementation of the educational benefit review portion of the SESR are reasonably calculated to result in the consistent implementation of the review, the use of objective data and interviews as part of the methodology, and the use of student observations when necessary. CDE shall set forth with precision, through a submission to the Monitor and the parties, the oversight procedures, guidance documents, and training it will adopt and implement to ensure that the educational benefit review portion of the SESR is conducted consistently, uses objective data and interviews as part of its methodology, and conducts observations when necessary.

3. Required Minimal Documentation Submission

Although the use of worksheets is required as part of the review, the worksheets are not required to be submitted to CDE. Instead, only an answer to item 4.15.1 (compliant or noncompliant) is required.63 Hence, CDE is only given limited information regarding programs in the District and benefit to the student (SESR Objections at 17).

63 This item number appears to be a typographical error in Plaintiffs' SESR Objections.

CDE does not respond to this objection (SESR Response at 7).

Review of the worksheets filled out by Ravenswood staff as part of the educational benefit review indicates that the review requires answers to five items. CDE's training document states that no worksheets are to be submitted to CDE ("Final Activity Two 12-13 SESR PPT" at slide 47).
Thus, it would be very difficult for CDE to verify that the review was implemented adequately and identified noncompliance and compliance appropriately based on adequate evidence, as the evidence relied upon by the District for its findings is not submitted to CDE.

Monitor's Determination: Plaintiffs have not shown that the educational benefit review requires an answer to only one item. CDE has not demonstrated that the educational benefit review in the Ravenswood SESR could be overseen effectively, as the submission of relevant documents was not required. Therefore, CDE shall engage in corrective action steps to ensure that its processes for oversight related to implementation of the educational benefit review portion of the SESR are reasonably calculated to result in the adequate implementation of the review and the identification of noncompliance and compliance appropriately based on adequate evidence. CDE shall set forth with precision, through a submission to the Monitor and the parties, the procedures, manuals, and guidance documents it will adopt in this regard.

4. Improper Composition of Review Team

Plaintiffs point out that the SESR manual requires the District to conduct the educational benefit review in teams. Plaintiffs found no indication that the District used teams to conduct the reviews or that CDE verified that this was done. Plaintiffs add that the purpose of using teams for this process is not clearly stated by CDE (SESR Objections at 17).

CDE confirms that this review should be performed in teams of two, but does not respond to Plaintiffs' point that this did not take place (SESR Response at 7).

The Monitor cannot determine from the SESR worksheets whether the reviews were performed in teams. CDE's training document does not state that the educational benefit reviews should be conducted in teams ("Final Activity Two 12-13 SESR PPT" at slides 25-66). However, the SESR manual is clear that teams of two should be used to conduct the reviews (SESR Instruction and Forms Manual (10/12) at 20). As to CDE oversight of the use of teams, CDE's guidance to its consultants states on the one hand that the educational benefit review "should be conducted in a manner consistent with the requirements" contained in the SESR manual; on the other hand, the requirement to conduct the review in teams does not appear to be overseen by consultants: "If the LEA either did not submit Educational Benefit review data or did not conduct Educational Benefit reviews on the correct number of student files or if the form is incomplete, missing, or the responses appear inadequate, contact the LEA, provide technical assistance, and require correction or resubmission." The use of teams does not appear on this list (2011-2012 Self-Review Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants at 27, 40).

Monitor's Determination: CDE has not demonstrated that its SESR guidelines regarding performing the educational benefit review in teams were followed in the Ravenswood SESR. Therefore, CDE shall engage in corrective action steps to ensure that its processes for oversight related to implementation of the educational benefit review portion of the SESR are reasonably calculated to result in reviews that are implemented in accordance with CDE guidelines.
CDE shall set forth with precision, through a submission to the Monitor and the parties, the oversight procedures, manuals, and guidance documents it will adopt in this regard.

E. Fiscal Review

1. Lack of Procedural Clarity

Plaintiffs point out that the SESR guidelines require the collection and review of a number of documents related to budget and expenditures, and argue that there was "limited transparency" regarding the process and no reason to believe that CDE could verify that the review was conducted appropriately. According to Plaintiffs, the District entered noncompliant findings when applicable, but was not required to submit supporting documentation or a description of its review process (SESR Objections at 17-18).

CDE responds that the fiscal review portion of the SESR is not meant to be as "intensive" as a CDE fiscal audit, which it notes it recently completed in the District. CDE does not use the results of its audit to defend the accuracy of the SESR fiscal review, nor does it respond to Plaintiffs' points regarding the transparency of the process, the lack of supporting documents and description of the process, and the verifiability of the results (SESR Response at 7-8).

The SESR manual does not require the submission of supporting documents or a description of the review process, nor does the SESR training document (SESR Instruction and Forms Manual (10/12) at 26-27; "Final Activity Two 12-13 SESR PPT" at slides 80-82). The lack of such documents would make it very difficult to verify the results of the fiscal review. The instruction document for CDE consultants does not appear to contain any oversight steps related to the fiscal review, and the SESR manual's appendix listing required actions by consultants merely requires verification that the review was conducted (SESR Instruction and Forms Manual (10/12) at 43).

Monitor's Determination: CDE has not demonstrated that it requires documentation and information that would enable it to verify the results of the fiscal review. Therefore, CDE shall engage in corrective action steps to ensure that it requires the submission of documents and information necessary for it to verify the accuracy of the fiscal review portion of the SESR. CDE shall set forth with precision, through a submission to the Monitor and the parties, the procedures, manuals, and guidance documents it will adopt in this regard.

2. Lack of Substantive Clarity

Plaintiffs argue that the process calls for "significant subjectivity" as it lacks clear measures or definitions to be used in the application of concepts like "proportionality" and "appropriate" levels of spending, pointing to items 30-1-1 and 30-1-6 as examples. Thus, noncompliant findings may be missed, as the lack of measures and definitions makes "broad interpretations" by the District likely (SESR Objections at 18).

CDE does not offer a response to this objection (SESR Response at 7-8).

The items pointed to by Plaintiffs were examined in CDE's "Item Table 12-13" and are displayed below:

Item No: 30-1-1
Compliance Test: Does the District allocate and spend special education resources appropriately for special education staff and the provision of special education services?
Revised Compliance Standard: Special education revenue sources are appropriately applied to special education staffing and the provision of special education services.
Revised Other Guidance: Review the District's certificated and non-certificated special education staff assignment lists. Compare the staff assignment lists with the District's timesheet and payroll records to determine if all sources of special education revenue have been properly allocated and spent on assigned special education staff.

Item No: 30-1-6
Compliance Test: Does the LEA ensure that a proportionate share of IDEA funding has been correctly calculated for parentally placed private school special education students?
Revised Compliance Standard: The proportionate share of IDEA funds was properly calculated for parentally placed private school students receiving special education services.
Revised Other Guidance: Review the District's budget to determine if a proportionate share of IDEA funding has been correctly calculated for parentally placed private school special education students.

In light of the minimal guidance offered in the item table regarding concepts like "appropriately," "proportionate," and "properly allocated," it is possible that districts in their implementation of the fiscal review will apply incorrect interpretations of these concepts. While this may not have been the case in the Ravenswood SESR, CDE has offered no defense of the conduct of this review.

Monitor's Determination: CDE has not demonstrated that these concepts were interpreted appropriately in the fiscal review portion of the Ravenswood SESR. Therefore, CDE shall engage in corrective action steps to ensure that its guidance and training documents offer clear interpretations of all concepts used in the fiscal review portion of the SESR. CDE shall set forth with precision, through a submission to the Monitor and the parties, the guidance and training documents it will adopt in this regard.

3. Limited Verification by CDE

Plaintiffs assert that CDE did not oversee the fiscal review process, leaving room for mistakes and implementation that "could" vary from a complete review to a cursory one (SESR Objections at 17-18).

CDE does not explain its level of oversight or verification of the District's fiscal review findings (SESR Response at 7-8).

As noted in Section IV. E. 1. above, CDE does not require the submission of documents necessary to enable it to verify the accuracy of the fiscal review. CDE has offered no explanation of its verification of the accuracy of the District's fiscal review.

Monitor's Determination: CDE has not demonstrated that it verified the accuracy of the District's fiscal review. Therefore, CDE shall engage in corrective action steps to ensure that it verifies the accuracy of the fiscal review portion of the SESR. CDE shall set forth with precision, through a submission to the Monitor and the parties, the policies, procedures, and oversight processes it will adopt in this regard.

F. SELPA Governance Review

Plaintiffs point out that the SELPA governance review was performed by another district without any indication regarding how that district was chosen or whether other districts in the SELPA were consulted during the process (SESR Objections at 18).

CDE responds that the purpose of this review "is for the SELPA to review its governance obligations." Construing Plaintiffs' objection as a complaint that the SESR manual does not explain how a district is chosen to lead the review, CDE states that it will revise the manual in this regard. CDE explains that the SELPA itself "conducts" the review, and that districts in the SELPA agree on the selection of a district to report the findings to CDE.
CDE does not respond to Plaintiffs' concern regarding whether other districts were consulted during the process (SESR Response at 8).

CDE has explained that the SELPA governance review is a self-monitoring endeavor by the SELPA, and this is confirmed by the SESR manual and SESR training document (SESR Instruction and Forms Manual (10/12) at 25-26; "Final Activity Two 12-13 SESR PPT" at slides 76-79). In order to ensure the accuracy of findings regarding the SELPA's performance of its governance obligations, it is reasonable for such a review to consult each of the districts in the SELPA for its perspective. However, CDE has not demonstrated that this was done.

Monitor's Determination: CDE has not demonstrated that the SELPA governance review included the perspectives of the SELPA's member districts. Therefore, CDE shall engage in corrective action steps to ensure that districts' perspectives on SELPAs' performance of their governance obligations are sought and considered as part of the process. CDE shall set forth with precision, through a submission to the Monitor and the parties, the policies, procedures, and guidance documents it will adopt in this regard.

G. Policies and Procedures Review

1. Lack of Clarity or Direction for Process

Plaintiffs argue that the monitoring process for District policies and procedures basically calls for the review of written policies and procedures, but "in many instances the District is encouraged" to review IEPs and conduct interviews. However, CDE does not indicate how many or which IEPs should be reviewed or individuals interviewed. Plaintiffs point specifically to items 10-1-2.1 and 10-2-1 in this regard. Plaintiffs note that the District found itself compliant in the policies and procedures review, but Plaintiffs found no indication that files were reviewed or individuals interviewed as part of this review (SESR Objections at 18-19).

CDE replies that the policies and procedures review does not require interviews or the review of IEPs. The review is intended to answer "the broad questions related to whether the LEA has the necessary policies and procedures in place and is implementing them correctly" (SESR Response at 8; emphasis added). CDE does not explain how the implementation of policies and procedures could be judged in the absence of interviews or review of IEPs; nor does it explain if, and if so how, the results of the student record, IEP implementation, and educational benefit reviews are used to judge the extent to which the District's policies and procedures have been implemented.

The SESR manual does not call for the review of IEPs or interviews as part of this process, nor do the relevant SESR training documents (SESR Instruction and Forms Manual (10/12) at 24-25; "Final Activity Two 12-13 SESR PPT" at slides 74-75).

The items pointed to by Plaintiffs were examined in CDE's "Item Table 12-13" and are displayed below:

Item No: 10-1-2.1
Compliance Test: Has the governing board of the LEA adopted a policy to implement a course of instruction that sufficiently prepares pupils to meet state graduation requirements?
Revised Compliance Standard: The governing board of the LEA must certify to the Superintendent of Public Instruction that it has adopted a policy to implement a course of instruction that sufficiently prepares students to meet state graduation requirements.
Revised Other Guidance: Review board policies. Interview staff and administrators about how the LEA prepares students with disabilities to meet state graduation requirements.
Item No: 10-2-1
Compliance Test: Are all students whose home language survey indicates a language other than English assessed using the California English Language Development Test (CELDT) or an alternate to determine English language proficiency?
Revised Compliance Standard: The district must assess any pupil whose native language is other than English as determined by the home language survey for English language proficiency.
Revised Other Guidance: Review policies and procedures. Review files of students with disabilities whose home language is other than English to see if their English language proficiency has been assessed. Interview administrators and staff.

Plaintiffs are correct that interviews are called for by both of these items, and file reviews by item 10-2-1; and are also correct that CDE does not indicate how many IEPs should be reviewed or individuals interviewed. But CDE does indicate which files should be reviewed: those of students whose home language is other than English.64 The Monitor found no indication in the District's SESR worksheets that interviews were conducted as part of this review. However, the worksheets indicate that item 10-2-1 was monitored for students designated as English learners. As CDE has stated that file reviews are not required as part of the policies and procedures review, it is unclear whether this item was reviewed as part of the student record review, the policies and procedures review, or both.

64 CDE's SESR follow-up report contains evidence that item 10-2-1 was monitored by CDE in the SESR follow-up for at least one student, resulting in a finding of noncompliance (see 8/15/13 CDE "Student Level Corrective Actions" report at 10).

Monitor's Determination: Plaintiffs have not shown that CDE does not indicate which files should be reviewed for item 10-2-1. CDE has not demonstrated if, and if so how, implementation of policies and procedures is monitored in the policies and procedures review; nor has it demonstrated that it gives sufficient guidance to district SESR monitors regarding how many and which staff should be interviewed as part of the policies and procedures review. Therefore, CDE shall engage in corrective action steps to ensure that its guidance and training related to implementation of the policies and procedures review portion of the SESR are reasonably calculated to result in interviews and file reviews being conducted as part of the review when necessary, and in the amounts necessary. CDE shall set forth with precision, through a submission to the Monitor and the parties, the guidance documents and training it will adopt and implement in this regard.

2. Limited Verification by CDE

Since relevant District policies and procedures were not submitted to CDE as part of the SESR, Plaintiffs regard CDE's ability to verify that the review took place correctly and reached proper conclusions as impaired (SESR Objections at 19).

CDE does not respond to this objection (SESR Response at 8).

CDE's guidance document for its consultants has consultants check to ensure that the policies and procedures review was conducted. It does not call for any substantive review of policies and procedures by the consultants.
The SESR manual's appendix listing required actions by consultants also only requires verification that the review was conducted (2011-2012 Self-Review Analysis Instructions for Regional Focused Monitoring and Technical Assistance Consultants at 44-47; SESR Instruction and Forms Manual (10/12) at 43).

Monitor's Determination: CDE has not demonstrated that it verified the accuracy of the results of the policies and procedures review. Therefore, CDE shall engage in corrective action steps to ensure that the results of SESR policies and procedures reviews are accurate. CDE shall set forth with precision, through a submission to the Monitor and the parties, the procedures and guidance documents it will adopt and implement to ensure this outcome.

H. District-Level Findings of Noncompliance and Corrective Actions

1. Rationale for Findings Not Clearly Stated

Plaintiffs assert that the SESR district-level findings of noncompliance simply restate the federal regulations, but are not explanations of noncompliance (SESR Objections at 20).

CDE does not respond to this objection (SESR Response at 8-9).

The Ravenswood SESR district-level findings of noncompliance were reviewed. The document contains 39 such findings, and the language entered in the "NonCompliant Findings" column for each finding appears to be identical: "Federal rules require the district to review an additional number of student records and evaluate the records to confirm 100 percent compliance with the standard." This language cannot be regarded as a finding of noncompliance, as it does not state with precision that a requirement has been violated and how it has been violated. At best it calls for the final step of a corrective action process, a step that can judge whether a set of corrective actions has been efficacious.

Monitor's Determination: CDE has not demonstrated that the district-level findings of noncompliance are accurate and meaningful statements of noncompliance. Therefore, CDE shall engage in corrective action steps to ensure that the SESR process results in accurate and meaningful district-level findings of noncompliance. CDE shall set forth with precision, through a submission to the Monitor and the parties, the policies, procedures, guidance documents, and training documents it will adopt to ensure that district-level findings of noncompliance are accurate and meaningful statements of noncompliance.

2. Lack of Clarity

While Plaintiffs understand that district-level findings of noncompliance were "apparently" generated from student-level findings, the district-level findings do not indicate the specific data on which the findings were based. This, in Plaintiffs' view, makes meaningful corrective actions less likely (SESR Objections at 20).

CDE does not respond to this objection (SESR Response at 8-9).

As Plaintiffs state, the district-level findings of noncompliance do not contain any data. For each requirement monitored, one cannot tell from the finding how many students were monitored and how many were found compliant or noncompliant.65 Further, as the reports only contain information regarding requirements with which the District was found noncompliant, the reports do not include information regarding other requirements monitored and found compliant.
Thus, the evidence and reasoning leading to findings of both compliance and noncompliance cannot be judged and, as the scope of noncompliance found is not known, the adequacy of corrective actions to resolve the noncompliance cannot be determined.

65 One could argue that CDE's failure to quantify SESR findings of noncompliance does not comply with OSEP guidance on this subject: in order to ultimately demonstrate correction of noncompliance, "the percentage level of noncompliance in each of those sites" must be identified (OSEP Memo 09-02, 10/17/08, at 2).

Monitor's Determination: CDE has not demonstrated that the SESR process in Ravenswood identified both noncompliance and compliance appropriately based on adequate evidence and reasoning, as quantitative information regarding compliance and noncompliance is not included in the report. Therefore, CDE shall engage in corrective action steps to ensure that SESR monitoring reports contain quantitative data regarding district-level findings of compliance and noncompliance. CDE shall set forth with precision, through a submission to the Monitor and the parties, the policies, procedures, report templates, guidance documents, and training documents it will adopt to ensure that district-level findings contain quantitative data regarding findings of compliance and noncompliance.

3. Root Cause Analysis

Plaintiffs assert that the SESR manual "suggests" that a root cause analysis (RCA) be completed for certain findings, although it is not clear if and when some findings "demand" an RCA. In addition, it is not clear that the District performed RCAs when required (SESR Objections at 20).

CDE explains that for items "repeatedly found noncompliant, the LEA is advised but not required" to complete an RCA "to aid in detecting the cause of the noncompliance and in implementing a lasting remedy." CDE adds that, for district-level findings of noncompliance, a district "generally must undergo training to develop policies and procedures that correct the noncompliance." CDE does not state whether the District was advised to perform an RCA in response to any SESR findings (SESR Response at 8-9).

CDE's SESR manual states that districts should "[p]erform a root cause analysis to isolate and remedy identified noncompliance as needed" (SESR Instruction and Forms Manual (10/12) at 3). No guidance is given regarding when an RCA is "needed." Later in the document CDE explains that the RCA

was introduced for the 2011–12 SESR to ensure that corrective actions have a meaningful and lasting effect. RCA is a process to determine the causes for noncompliance and helps direct district actions to resolve causes. Understanding the specifics of a noncompliant practice can prevent future occurrences of the practice. Occasionally districts may correct individual instances of noncompliant practices without investigating the causes of the event, and therefore, allow future occurrences without taking action on a systemic or in depth level. Failure to carry out well defined district policies or practices can be traced to defined causes that are detectable through relatively simple analysis or investigation. Once discovered and located, the situation can be permanently remedied. RCA is a mechanism for carrying out lasting and effective problem resolution.
(SESR Instruction and Forms Manual (10/12) at 30)

However, after this persuasive explanation of the importance of the RCA process, CDE describes the RCA as an "option" (SESR Instruction and Forms Manual (10/12) at 31). RCA is also described as an option in CDE's relevant SESR training document ("Final Activity Three 12-13 SESR PPT" at slides 20-21). CDE's explanations indicate that an RCA is not required for any SESR findings, even if a finding reveals a systemic problem.66

66 One could argue that CDE's failure to ensure that RCAs are conducted in response to systemic findings of noncompliance does not comply with OSEP guidance on this subject: in order to ultimately demonstrate correction of noncompliance, "the root cause(s) of the noncompliance" must be identified (OSEP Memo 09-02, 10/17/08, at 2).

Monitor's Determination: Plaintiffs have not shown a lack of clarity regarding whether the District performed RCAs when required, as RCAs are not required.67

67 As Plaintiffs do not argue that an RCA should have been required in response to any findings of noncompliance in the SESR, no determination will be made on that issue. However, the Monitor notes in this regard that the lack of data in the report would have made it challenging for Plaintiffs to determine whether the scope of any finding merited an RCA.

4. Meaningless Corrective Actions

Plaintiffs argue that CDE's assigned corrective actions for district-level findings of noncompliance are nothing more than additional file reviews "without requiring any particular remedial actions." Corrective actions that could improve the delivery of services, such as training, are not required. The additional file reviews are required until 100% compliance is found, but Plaintiffs point out that it is unclear how many "iterations" of file reviews are appropriate, and unclear how the files are chosen for review (SESR Objections at 20).

CDE believes that district-level findings of noncompliance "do force" a district to change its district-level practices, and uses the example of a high school district required to undergo training and change its policies and procedures to ensure compliant student progress reports. Although CDE's example suggests that it required those corrective action steps in that district, CDE does not defend the corrective actions it required as a result of the Ravenswood SESR (SESR Response at 9).

The corrective actions prescribed in response to the SESR district-level findings of noncompliance do not include training, development of policies and procedures, or any other remedial steps. As Plaintiffs point out, the corrective actions simply require the review of additional files. As Plaintiffs also point out, it is unclear how many additional reviews should be conducted if 100% compliance is not found in the initial reviews. However, the document makes clear which files to initially review (for example, those of students referred in the last six months, those with IEPs developed in the prior six months, etc.).

Additional file reviews are not corrective actions. As stated above, such steps do nothing more than verify that corrective actions have been efficacious. Corrective actions are steps taken to correct noncompliance and prevent future noncompliance. It is true that in order to verify that noncompliance has been corrected, OSEP requires "review of updated data such as data from subsequent on-site monitoring or data collected through a State data system."
CDE's prescription of additional file reviews, for violations found through the review of files alone, complies with that aspect of OSEP's guidance. But the same OSEP memo makes clear that in order to correct noncompliance, states should "[i]f needed, change, or require each LEA or EIS program to change, policies, procedures and/or practices that contributed to or resulted in noncompliance...." OSEP continues:

In determining the steps that the LEA or EIS program must take to correct the noncompliance and to document such correction, the State may consider a variety of factors, including whether the noncompliance: (1) was extensive or found in only a small percentage of files; (2) resulted in the denial of a basic right under the IDEA (e.g., an extended delay in an initial evaluation with a corresponding delay in the child’s receipt of a free appropriate public education or early intervention services, or a failure to provide services in accordance with the individualized education program or individualized family service plan); and (3) represents an isolated incident in the LEA or EIS program, or reflects a longstanding failure to meet the IDEA requirements. (OSEP Memo 09-02, 10/17/08, at 2)

In order to correct the district-level noncompliance found through the Ravenswood SESR, CDE should have determined the steps necessary to do so, considering at minimum the listed factors. Unfortunately, the corrective actions required by CDE are not such steps.

Monitor's Determination: Plaintiffs have not shown that CDE's corrective actions do not make clear which additional files to review. CDE has not demonstrated that it adopted appropriate corrective actions in response to the district-level noncompliance identified through the SESR process in Ravenswood. Therefore, CDE shall engage in corrective action steps to ensure that SESR district-level corrective actions set forth steps reasonably calculated to resolve the noncompliance. CDE shall set forth with precision, through a submission to the Monitor and the parties, the policies, procedures, guidance documents, and training documents it will adopt to ensure that SESR district-level corrective actions set forth steps reasonably calculated to resolve the noncompliance.

I. Limited Transparency of CDE Verification and Oversight

Plaintiffs assert that CDE's verification, monitoring, and oversight of the Ravenswood SESR do not appear to go beyond a paper review and are characterized by "limited transparency" (SESR Objections at 21).

CDE in response points to its listing of all SESR-related actions required of its consultants in its documents. In addition, as noted above in Section IV. B. 6., CDE claims that its consultants are involved in "training, guiding, and monitoring every SESR activity," citing again its documents rather than any particular steps taken by its consultants during the Ravenswood SESR. CDE then points to its on-site follow-up review, which in CDE's view was "a comprehensive review of all evidence the District relied upon and all documents created in completing the components of the SESR" (SESR Response at 9).

Monitor's Determination: Determinations regarding CDE's verification, monitoring, and oversight of the SESR process have been made above at Sections III. D. 2., III. D. 3., III. D. 4., IV. A. 2., IV. B. 2., IV. B. 3., IV. B. 6., IV. B. 7., IV. C. 1., IV. C. 2., IV. D. 1., IV. D. 2., IV. D. 3., IV. D. 4., IV. E. 1., IV. E. 3., and IV. G. 2.
J. Monitoring Reports

As noted in Section III. D. 4. above, Plaintiffs argue that "...despite CDE’s SESR guidance describing numerous distinct kinds of reviews, including Education Benefit, IEP Implementation, SELPA governance, district policies and procedures, and fiscal reviews, no relevant reports in these areas have been produced to date" related to the Ravenswood SESR (PDC at 34).

No response to this point from CDE has been located in its documents.

As noted above, the reports produced as a result of the Ravenswood SESR consist of student-level and district-level findings of noncompliance. Determinations regarding the deficiencies in these reports have been reached above. In order to judge the extent to which a monitoring system has been implemented adequately, and has made findings of compliance and noncompliance appropriately based on adequate evidence and reasoning, the monitoring methods, selection of samples of students for review, quantitative data, documents reviewed, standards used, and evidence and reasoning leading to conclusions of compliance or noncompliance must be set forth. Without such information it is difficult to judge whether findings have been reached appropriately and are based on persuasive evidence and reasoning. As Plaintiffs note, reports concerning each aspect of the SESR process have not been produced. Thus, one cannot judge whether the fiscal, SELPA governance, and policies and procedures reviews reached their conclusions persuasively, nor can one judge whether findings of compliance resulting from other reviews rest on adequate evidence and reasoning.

Monitor's Determination: CDE has not demonstrated that the conclusions of its Ravenswood SESR fiscal, SELPA governance, and policies and procedures reviews, and findings of compliance made in other reviews, are supported by adequate evidence and reasoning, due to its current process for reporting the results of SESR monitoring. Therefore, CDE shall engage in corrective action steps to ensure that SESR monitoring results are set forth in reports that contain a level of detail sufficient to enable a determination of the extent to which the SESR process has been implemented adequately and has made findings of compliance and noncompliance appropriately based on adequate evidence and reasoning. CDE shall set forth with precision, through a submission to the Monitor and the parties, the policies, procedures, and report templates it will adopt to ensure that SESR monitoring reports present the evidence relied upon for their conclusions.

Monitor's Overall Determination for SESR Implementation: In the specific respects set forth in the SESR Implementation determinations above, the Monitor has determined that CDE has not demonstrated that its SESR process, as applied to Ravenswood, has been implemented adequately, identified both noncompliance and compliance appropriately based on adequate evidence and reasoning, and resulted in appropriate corrective actions. Therefore, the SESR aspect of the state-level monitoring system currently in place is not capable of ensuring continued compliance with the law and the provision of FAPE to children with disabilities in Ravenswood. Hence, a CAP that includes at minimum the steps specified in each determination above should be developed and implemented. The CAP should include timelines and persons responsible for each activity and outcome.
V. Requested Relief

Plaintiffs request that the Monitor determine that an impartial court-appointed expert assessment of CDE's monitoring of the District should be conducted as the monitoring of Ravenswood takes place. They also request that the expert conduct "further analyses," including of revised policies and procedures (SESR Objections at 21-22; PDC at 2, 41-42).

CDE argues that this request should be denied for four reasons. First, Plaintiffs have neither established "any generally accepted standard" for a monitoring system, nor identified any state's monitoring system that they believe California should adopt. Second, to the extent that the White Paper sets forth such a standard, CDE believes that its monitoring system meets, if not exceeds, that standard, and further believes the current Emma C. Court Monitor's knowledge of the White Paper makes the appointment of another expert unnecessary. Third, an expert is unnecessary because CDE believes it has demonstrated in its submissions that its monitoring system complies with the IDEA. Fourth, CDE asserts that OSEP has reached an "expert determination" that the state's monitoring system "meets IDEA requirements" and that this "moots" Plaintiffs' request (CDE Design Response I at 2-3 (fn. 2), 3, 4 (fn. 3), 21; CDE Design Response II at 7; SESR Response at 10).

To the extent that Plaintiffs' request for further analyses by a Court-appointed expert can be read as a request for analyses of additional issues not raised by Plaintiffs in their objections, the Monitor disagrees. Plaintiffs have had an opportunity to raise their objections, and those objections have been subject to responses from CDE and determinations from the Monitor.

The Monitor has previously rejected CDE's argument that Plaintiffs should have identified generally accepted standards or another state's model for CDE to adopt, as there is no requirement in place that Plaintiffs do so.68 The statute, Consent Decree, and Court's Order set forth the applicable standards.69 Further, as discussed above,70 the White Paper is irrelevant, as standards are set by the statute and controlling documents of the Emma C. case. As to CDE's argument that its monitoring system complies with the IDEA, in the specific respects set forth in the relevant determinations above that is not the case.71

68 In an exchange of correspondence on 2/4/13 with counsel for CDE related to Plaintiffs' initial submission of system-design objections, the Monitor wrote, "...CDE argues that Plaintiffs do not identify any monitoring model that CDE's monitoring system should, but does not, meet. However, there is no requirement in the Fifth Joint Statement that Plaintiffs' objections identify any such model. Plaintiffs' 1/25 submission identifies and argues from standards in the underlying statute, the parties' joint statements, the First Amended Consent Decree, and the Court's Orders. Without judging at this time the completeness, cogency, and persuasiveness of Plaintiffs' arguments, there is no need for them to identify any model or models that Defendant CDE's monitoring system must meet" (Mlawer to Tillman, at 1).

69 See Section II above.

70 See footnote 1 above.

71 The Monitor expresses no opinion regarding the extent to which CDE's monitoring system may or may not comply with the IDEA in respects not raised, or not raised persuasively, by Plaintiffs.

Moreover, the Court has previously rejected CDE's OSEP-acceptance argument:

CDE has not presented any evidence that OSEP has issued a ruling on the question presented in this case – whether CDE’s monitoring system is “capable of ensuring continued compliance with the law and the provision of FAPE to children with disabilities in Ravenswood.” FACD § 13.0. And even if OSEP had ruled on that precise question, Douglas at most suggests that OSEP’s decision would be entitled to some level of deference – it in no way implies that the Court should decline to exercise its jurisdiction. (Order at 5-6)

...the USDOE has not made a specific determination that CDE’s monitoring system is capable of ensuring compliance with the law and provision of FAPE in Ravenswood, and even if it had, such a determination would not be dispositive. (Order at 8)

Therefore, CDE's argument is again rejected.72

72 Moreover, in the most recent OSEP verification letter known to the Monitor, noncompliance was found:

OSEP found noncompliance and has required corrective action in the following areas: (1) the State has not, in verifying correction of findings of noncompliance for State Performance Plan (SPP)/Annual Performance Report (APR) Indicators 11 and 12, verified, based on updated data, that the local educational agency (LEA) is correctly implementing the specific regulatory requirements; (2) the State has not ensured that, within 15 days of receiving notice of a parent’s due process complaint, the LEA convenes a resolution meeting; (3) the State has not ensured that LEAs implement corrective actions that a due process hearing officer includes in a due process hearing decision; (4) the State has not implemented procedures and practices that are consistent with Part B requirements regarding the identification of LEAs with significant disproportionality; (5) LEAs that submit timely, substantially approvable applications are not notified that they have a 27-month period in which they may obligate Part B funds; and (6) the State is not including all sources of State financial support in calculating State-level funds made available for special education and related services for children with disabilities. (Musgrove letter to Torlakson, 2/7/11, at 1-2)

However, with respect to the timely identification of noncompliance, OSEP wrote: "Based on the review of documents, analysis of data, and interviews with State personnel, OSEP concludes that the State’s systems for general supervision are reasonably designed to identify noncompliance in a timely manner." But no analysis is set forth in the OSEP document to support that conclusion; thus, that aspect of the OSEP document has no persuasive force. In addition, OSEP states: "However, without also collecting data at the local level, OSEP cannot determine whether the State’s systems are fully effective in identifying noncompliance in a timely manner" ("California Part B Verification Visit Letter Enclosure" at 2).

The Court, in its 2012 ruling denying Plaintiffs' request for expert assessments of CDE's monitoring work in Ravenswood at that time, wrote:

No dispute currently exists between the parties concerning the adequacy or efficacy of CDE’s monitoring work. It is possible that such a dispute will arise during the course of the monitoring work that is now being performed and will be performed in the 2013-14 school year. But it would be inappropriate for the Court to presume that a dispute will arise, particularly since no concerns about CDE’s monitoring arose during the most recent prior monitoring events, which took place in 2006 and 2007. While Plaintiffs have informed the Court that they have hired an expert to evaluate the design of CDE’s monitoring system, they have not yet put any evidence in the record demonstrating design deficiencies that might render CDE’s monitoring system incapable of ensuring compliance with the law and the provision of FAPE to children with disabilities in Ravenswood. In short, Plaintiffs have not yet demonstrated to the Court’s satisfaction that the addition of expert monitoring of CDE’s monitoring is warranted. (Order at 4-5; footnote omitted)

In the specific respects set forth in the above determinations, Plaintiffs have shown inadequacies in the design of CDE's monitoring system and in the implementation of its SESR component in Ravenswood. Hence, the Monitor concludes that expert assessments of CDE's monitoring under the auspices of the Court are now warranted. These assessments should take the form of monitoring and oversight of CDE's implementation of the CAP to be developed pursuant to these determinations.

VI. Going Forward

A. Development of CAP

If none of the parties seek relief from these determinations, the Monitor recommends that the parties be given an opportunity to negotiate and agree on a CAP, but that they do so under a timeline of 90 days. If those negotiations are not fruitful within the timeline, the Monitor recommends that the Court direct the Monitor to develop the CAP with the assistance of outside consultants. The Monitor also recommends that funding for any needed consultants be provided by CDE. If relief is sought from these determinations and the Court ultimately upholds them, the Court may wish to consider the above approach as it reaches questions of appropriate relief.

B. Alternative Approach

The Monitor has noted above that aspects of CDE's pilot proposal for Ravenswood have promise. Negotiations between the parties regarding the pilot proposal during the summer of 2013 did not result in an agreement. While it is difficult to be optimistic regarding the probability of fruitful negotiations between the parties about the pilot proposal, the Monitor would recommend that the parties be given the opportunity to reopen these discussions if they so desire. If requested by the parties, the Monitor recommends to the Court that a brief extension of the timeline for seeking relief from these determinations, an additional 45 days, be granted to pursue such discussions. If that takes place, the Court may wish to consider directing the Monitor to provide informal biweekly updates to the Court regarding the progress of the negotiations so that the extension of time can be reconsidered if progress is not made.

VII. Conclusion

Pursuant to Sections A3 and B7 of the Second Joint Stipulation Re Amendment of Dispute Resolution Timelines in Fifth Joint Statement, any party aggrieved by the above determinations may seek relief from the Court through filing a motion within 45 days.

c: Aimee Armsby
William S. Koski
Arlene Mayerson/Larisa Cummings
Lisa A. Tillman/R. Matthew Wise
Ava C. Yajima
Gloria Hernandez
Carolyn Schwartzbord
Fred Balcom
Chris Drouin