California Department of Education
Executive Office
SBE-004 (REV. 01/2011)

ITEM ADDENDUM

DATE: March 6, 2015

TO: MEMBERS, State Board of Education

FROM: TOM TORLAKSON, State Superintendent of Public Instruction

SUBJECT: Item 4 – California Assessment of Student Performance and Progress: Designation of the California Assessment of Student Performance and Progress Contractor

Summary of Key Issues

Pursuant to California Education Code (EC) Section 60643(b), the California Department of Education (CDE) shall develop, and the State Superintendent of Public Instruction (SSPI) and the State Board of Education (SBE) shall approve, California Assessment of Student Performance and Progress (CAASPP) contracts. EC Section 60643(b) also states that the SBE shall consider the following criteria in selecting a CAASPP contractor:

(A) The ability of the contractor to produce valid and reliable scores
(B) The ability of the contractor to report accurate results in a timely fashion
(C) Exclusive of the consortium assessments, the ability of the contractor to ensure technical adequacy of the tests, inclusive of the alignment between the CAASPP tests and the state-adopted content standards
(D) The cost of the assessment system
(E) The ability and proposed procedures to ensure the security and integrity of the assessment system
(F) The experience of the contractor in successfully conducting statewide testing programs in other states

Request for Submissions

As required by EC Section 60643, the CDE used a competitive and open Request for Submissions (RFS) process with standardized scoring criteria to select a potential contractor or contractors for recommendation to the SBE. The RFS addressed the following tasks as part of the RFS Scope of Work (SOW):

• Task 1: Comprehensive Plan and Schedule of Deliverables
• Task 2: Program Support Services
• Task 3: Technology Services
• Task 4: Test Security
• Task 5: Accessibility and Accommodations
• Task 6: Assessment Development
• Task 7: Test Administration
• Task 8: Scoring and Analysis
• Task 9: Reporting Results

The bidders were required to propose a timeline for implementation of the assessments contemplated by this RFS, as set forth in Table 1.1. The bidders were instructed to plan on developing and administering only one form per grade level/span. The bidders were also instructed to plan on using previously developed, pre-equated forms for the California Standards Tests (CST) in science, the California Modified Assessment (CMA) in science, the California Alternate Performance Assessment (CAPA) in science, and the Standards-based Tests in Spanish (STS) for reading/language arts (RLA) until successor assessments are developed.
Table 1.1: CAASPP System – Test Administration Schedule

School Year‡ | Status | Assessment* | Type
2015–16 | Existing | Smarter Balanced Summative Assessments, ELA and mathematics in grades 3–8 and grade 11 | CAT/PT
2015–16 | Existing | Smarter Balanced Interim Assessments, ELA and mathematics designed for grades 3–8 and grade 11 (available to K–12 educators) (optional for LEA) | CAT/PT
2015–16 | Existing | CST/CMA/CAPA for Science Assessments in grades 5, 8, and 10 | Paper-Pencil
2015–16 | Existing | STS – RLA Assessments in grades 2–11 (optional for LEA) | Paper-Pencil
2015–16 | New | Alternate Assessments (successor to CAPA), ELA and mathematics in grades 3–8 and grade 11 | CBT
2016–17 | Existing | Smarter Balanced Summative Assessments, ELA and mathematics in grades 3–8 and grade 11 | CAT/PT
2016–17 | Existing | Smarter Balanced Interim Assessments, ELA and mathematics designed for grades 3–8 and grade 11 (available to K–12 educators) (optional for LEA) | CAT/PT
2016–17 | Existing | CST/CMA/CAPA for Science Assessments in grades 5, 8, and 10 | Paper-Pencil
2016–17 | Pilot Test | Science Assessments (successor to CST/CMA/CAPA), including alternate assessments, in grade spans 3–5, 6–8, and 9–12 | CBT only
2016–17 | Existing | STS – RLA Assessments in grades 2–11 (optional for LEA) | Paper-Pencil
2016–17 | Pilot Test | Primary Language Assessments (successor to STS) for RLA in grades 3–11 | CBT only
2016–17 | Existing | Alternate Assessments, ELA and mathematics in grades 3–8 and grade 11 | CBT
2017–18 | Existing | Smarter Balanced Summative Assessments, ELA and mathematics in grades 3–8 and grade 11 | CAT/PT
2017–18 | Existing | Smarter Balanced Interim Assessments, ELA and mathematics designed for grades 3–8 and grade 11 (available to K–12 educators) (optional for LEA) | CAT/PT
2017–18 | Field Test | Science Assessments (successor to CST/CMA/CAPA), including alternate assessments, in grade spans 3–5, 6–8, and 9–12 | CBT only
2017–18 | Field Test | Primary Language Assessments (successor to STS) for RLA in grades 3–11 | CBT only
2017–18 | Existing | Alternate Assessments, ELA and mathematics in grades 3–8 and grade 11 | CBT

‡ Excluding the Smarter Balanced assessments, for purposes of this submission the vendor should plan on providing only one form per grade/span.
* CST: California Standards Test; CMA: California Modified Assessment; CAPA: California Alternate Performance Assessment; STS: Standards-based Tests in Spanish; ELA: English–language arts; RLA: reading/language arts; LEA: local educational agency; CAT: computer-adaptive test; PT: performance task; CBT: computer-based test; K–12: kindergarten through grade 12

Once the SBE designates the successful bidder, the CDE, the SBE, and the Department of Finance (DOF) will negotiate and finalize the SOW and budget for the resulting contract. The selected submission is the working document for beginning negotiations on the final SOW. Prior to negotiations, the successful bidder designated by the SBE may be requested to provide additional cost detail beyond that requested in the RFS cost submission, including costs per subtask, per pupil, fixed and variable, and per test administration cycle, as well as other documents necessary to negotiate the SOW. It is anticipated that a negotiated SOW and contract will be presented to the SBE for approval at its May 2015 meeting.
RFS Process

In an open and competitive process, the RFS invited bidders to provide submissions for the development, administration, scoring, reporting, and analysis of assessments, and for technology support, for the CAASPP System as defined in EC sections 60601 through 60649. The contract awarded through this RFS will cover the 2015–16, 2016–17, and 2017–18 test administration cycles and is anticipated to be in effect from July 1, 2015, through December 31, 2018, contingent upon funding through the annual budget process. The SBE has the option of extending the contract to cover additional test administration cycles.

The CAASPP RFS and related documents are posted on the CDE RFS for CAASPP Web page at http://www.cde.ca.gov/fg/fo/r19/caaspp14rfs.asp. To ensure a competitive and open process for all submissions, the RFS included Web addresses for current CAASPP and Smarter Balanced resources that provided bidders with an equal opportunity to become familiar with the complete context of the CAASPP System. Additionally, all bidders were provided the opportunity to submit questions, requests for clarification, concerns, and/or comments regarding the RFS. The complete list of questions and answers is posted on the CDE Web site at http://www.cde.ca.gov/fg/fo/r19/caaspp14rfs.asp.

Submissions Received

The CDE received submissions from three vendors: California Test Bureau/McGraw-Hill (CTB), Educational Testing Service (ETS), and NCS Pearson (Pearson). Each bidder provided a list and description of the subcontractors with which it would work if selected. (Order of submissions is alphabetical.)

• CTB proposed subcontractors:
  o Caveon
  o Data Recognition Corporation
  o Kelley Services
  o MetaMetrics

• ETS proposed subcontractors:
  o Accenture
  o American Institutes for Research (AIR)
  o Center for Assessment
  o Computerized Assessments and Learning
  o InTouch/Insight
  o Measurement, Inc. (MI)
  o Red Dog Records

• Pearson proposed subcontractors:
  o Amplify
  o Caveon
  o MetaMetrics
  o Pacific Metrics
  o Sacramento County Office of Education
  o WestEd

Submission Evaluation Process Overview

The evaluation process consisted of three phases and five steps:

Phase I
  Step One: Pre-Evaluation Review of the Format Requirements Checklist
Phase II
  Step Two: Evaluation Panels' Review of the Submissions
  Step Three: Review of the Submissions by the Independent Consultant from Sabot Consulting, the CDE's Independent Validation and Verification (IV&V) Contractor
  Step Four: Review of the Implementation Readiness Package (IRP) Evidence
Phase III
  Step Five: Internal CDE Review for the Development of this Recommendation

The CDE submission and evaluation processes were developed and designed to meet and exceed the requirements of EC Section 60643, to ensure "a competitive and open process utilizing standardized scoring criteria," and to arrive at a recommendation that accurately reflects the requirements set forth in the RFS, which the SBE approved at its November 2014 meeting. For example, the CDE established two evaluation panels; although not required by law, they were included in Phase II to enable local educational agency (LEA) input and give further clarity to the recommendation. Section 6 of the RFS set forth the evaluation panels' process for evaluating the submissions.
Attachment 12 of the RFS specified the evaluation criteria and scoring rubric for the competitive process used to evaluate each task of every submission, including the individual weights to be applied to the consensus scores for each specific task.

The CDE's IV&V consultant, Sabot Consulting, reviewed Task 3–Technology Services to determine not only responsiveness to the general requirements of the RFS, but also to identify any potential deficiencies in the submissions' plans for technology services, in order to inform the pending negotiations.

CDE staff, with technical assistance from the University of California, Los Angeles (UCLA), reviewed the final IRP evidence submitted by the three bidders for compliance with the requirements. The IRP was developed by UCLA to assist Smarter Balanced states and their contractors in verifying that an assessment delivery system displays items with authenticity, scores items and tests properly, and delivers assessment results to the Smarter Balanced Data Warehouse according to specifications. Smarter Balanced states are responsible for ensuring that their assessment delivery systems support all functionality needed to successfully administer Smarter Balanced assessments.

Additionally, CDE staff thoroughly reviewed the evaluation panels' findings, each bidder's submission, the IV&V consultant's findings, and the IRP submissions to develop the required recommendation.

Step One: Pre-Evaluation Review of the Format Requirements Checklist

Section 6 of the RFS called for the CDE to review the contents of the Format Requirements Checklist (Attachment 11) for the presence of all completed forms/attachments. All submissions moved on to Phase II after this review.

Step Two: Evaluation Panels' Review of the Submissions

Two evaluation panels were established to review the bidder submissions: (1) technology and (2) assessment. The assessment evaluation panel reviewed Tasks 1, 2, and 4 through 9, inclusive. A listing of all the tasks and subtasks can be found in Table 1.2. The technology evaluation panel reviewed only the technology services requirements addressed in Task 3 of the SOW. A cost submission subpanel, a subset of the two panels, worked jointly to review the cost submissions.

The panel members were selected from LEAs throughout the state and from internal CDE staff. Table 1.3 provides information on the composition of the evaluation panels. The LEA members were selected based on their long professional histories of providing service to their LEAs and the state in local and statewide assessment operations; their LEAs' regional size and location; and their testing, evaluation, or technology experience. CDE members were selected based on their expertise and background in statewide testing, technology, data management, language acquisition, curriculum, instruction, accountability, special education, measurement, and/or contracting.

All panel members reviewed the management, personnel, facilities, and resource capacity of the bidders for each identified task. The cost submission subpanel reviewed the bidders' cost submissions for compliance. The technology evaluation panel met January 20 through 23, 2015; the assessment evaluation panel met January 20 through 27, 2015; and the cost submission subpanel met on January 28, 2015.
1. The assessment evaluation panel consisted of 11 members, including a member of the state assessment Technical Advisory Group (TAG), LEA testing directors, LEA data management specialists, and CDE staff.

2. The technology evaluation panel consisted of 8 members, including an LEA chief technology officer and an LEA director of administrative operations, as well as other LEA technology specialists/coordinators and CDE technology services and CAASPP program staff members.

3. The cost submission subpanel consisted of a subgroup of 8 members drawn from the two panels.

Panel members concurrently reviewed each submission, one at a time, using the RFS Evaluation Criteria Score Sheets (RFS, Attachment 12), which are available on the CDE Web site at http://www.cde.ca.gov/fg/fo/r19/caaspp14rfs.asp. The pace of the review periods was adjusted by task or groups of tasks. At the completion of the individual panel member reviews of each task or group of tasks, the panel members convened to discuss the strengths and weaknesses of each submission's description of each task and to arrive at a consensus score for each evaluation criterion by task.

General Findings from the Panel Reviews

Each of the three submissions was deemed responsive to the general requirements described in the RFS. All contractors and subcontractors included in each submission met the experience requirement of a minimum of three years of recent (within the last five years) full-time experience in both computer- and paper-based assessments. Each submission proposed the use of multiple subcontractors with defined roles and responsibilities. Each of the three submissions addressed and provided cost detail for all tasks outlined in the RFS SOW. Each submission proposed new test development for a successor primary language assessment; no submission proposed the use of a pre-developed primary language assessment.

Bidders could earn a Total Weighted Score of 1,200 per submission: 1,000 for the SOW and 200 for the cost submission. The raw scores for each task were weighted as set forth in the RFS Evaluation Criteria Score Sheets (RFS, Attachment 12). ETS scored highest on eight of the nine SOW tasks and received the highest total weighted score, 932 points. CTB received the next highest total weighted score, 794 points. Pearson received a total weighted score of 769 points.

Score category | Possible | CTB/McGraw-Hill | Educational Testing Service | NCS Pearson
Scope of Work Consensus Score (not weighted) | 454 | 280 | 357 | 277
Scope of Work Total Weighted Score | 1,000 | 646 | 796 | 624
Cost Submission Consensus Score (not weighted) | 200 | 148 | 136 | 145
Total Weighted Score | 1,200 | 794 | 932 | 769
Cost Submission Grand Total (total costs covering the 2015–16, 2016–17, and 2017–18 test administrations) | | $223,769,974.58 | $239,998,122.30 | $205,840,739.00

The complete score summary, which displays the scores for each submission by task and by cost submission, can be found in Attachment 1; the overall total costs proposed in each cost submission are provided in Attachment 2.

Step Three: Review of the Submissions by the Independent Consultant from Sabot Consulting, the CDE's Independent Validation and Verification Contractor

The independent consultant from Sabot Consulting reviewed Task 3–Technology Services of each submission to determine whether the vendors' submissions fully addressed the corresponding technical requirements in the RFS. The Sabot report did not rank the submissions, nor did it rate one bidder over another.
The technical deficiencies identified by the independent consultant were consistent with the deficiencies identified during the review by the technology evaluation panel as well as the IRP review.

Step Four: Implementation Readiness Package Evidence Review

As part of the final stage of the submission evaluation process, bidders were required to complete an Evidence of Meeting Implementation Readiness Package (IRP) Form (RFS Attachment 14, Amended), outlined in Section 3.3.2.B.2 of the SOW and Section 5.4 of the RFS, and to provide documentation of conducting the IRP simulated assessment administration by February 24, 2015. At a minimum, the evidence was required to include the Client Summary Report Output produced by the IRP and evidence of meeting the rendering and interaction requirements of the Phase I IRP standards. The Client Summary Report Output provides evidence that the proposed test delivery system can accurately capture and score student responses and produce an output file meeting the requirements for submission to the Smarter Balanced Data Warehouse. Additionally, the IRP provided a means for bidders to self-document the assessment delivery system's capacity to accurately render items and allow for item interactions (e.g., drag and drop, select response) on various secure browsers and their corresponding operating systems.

CDE staff, with technical assistance from UCLA, reviewed the final IRP evidence submitted by the three bidders for compliance with the requirements outlined in Section 3.3.2.B.2 of the SOW and Section 5.4 of the RFS. (Note: The IRP process is not a one-time procedure; it is a process that the successful bidder will be required to perform periodically over the life of the contract.)

The following table summarizes the item rendering and interaction IRP evidence submitted by each of the bidders. All cells should have an "X," indicating that the item types rendered and interacted as required by the RFS. The RFS required the bidders to be compliant with all of the Smarter Balanced supported operating systems. (Order of submissions is alphabetical.)

[Table: self-reported IRP rendering and interaction evidence ("X," "R," or blank; see footnote 4), with one column per bidder (CTB¹, ETS², Pearson³) and one row per Smarter Balanced supported device and operating system: Android tablets (Android OS 4.0.4, 4.1.x, and 4.2.x); Apple iPads (iOS 7.1, 8.0, and 8.1 on iPad 2, iPad 3rd Generation, iPad 4th Generation, and iPad Air); Apple Mac laptops or desktops (OS X 10.4.4, 10.5, 10.6, 10.7, 10.8, 10.9, and 10.10); Chromebooks (Chrome OS 31, 32, 33, 34, 35, and 40); Linux (Ubuntu 9, 10, 11, 12, and 14.04; Fedora Core 6 through 14, 16, 19, and 20); Windows laptops or desktops (Windows XP Service Pack 3, Vista, 7, 8 and 8.1, Server 2003, Server 2008, Server 2012, and virtual desktop); and Windows tablets (Windows 8.0 and 8.1). Footnotes 5 and 6 qualify individual entries.]
Footnotes:

1. CTB indicated use of the DRC Insight Secure Browser. The bidder identified the following supports and accommodations as being enabled during the IRP process: text-to-speech, human voice control, video sign language, voice capture response, color overlay, color contrasts, and masking.

2. ETS indicated use of an open source secure browser. The bidder identified the following supports and accommodations as being enabled during the IRP process: three different calculators, expandable passages, passage font size and type, highlight, item font size, item tools menu, mark for review, student comments/notepad, strikethrough, system volume control, tutorial, text-to-speech audio adjustments, text-to-speech pausing, and glossary.

3. Pearson did not indicate use of a secure browser, but instead indicated use of various Web browsers (e.g., Firefox, Internet Explorer) that would not result in the lockdown of the testing device. The bidder identified the following supports as enabled during the IRP: calculator, masking, and zoom.

4. "X" indicates all item types were rendering and interaction compliant. "R" indicates all item types were rendering compliant but not all were interaction compliant. A blank cell indicates that rendering and interaction compliance information for that operating system was not included in the IRP evidence submission.

5. Indicated compliance with all rendering and interaction for iOS 8.1.2 but not iOS 8.1.

6. The submission states that CTB will not support these operating systems after September 1, 2015.

CTB

Strengths of the IRP Submission

• The evidence demonstrates that the bidder conducted the simulated assessment and the self-evaluation.
• The evidence demonstrates that the bidder's assessment delivery system can produce an item scoring output file that is accurate and in a format usable by the Smarter Balanced Data Warehouse.

Weaknesses of the IRP Submission

• The evidence indicates that refreshable braille is not currently supported.
• The bidder did not include rendering and interaction evidence for all the Smarter Balanced secure browsers and their corresponding operating systems (e.g., Apple Mac OS X 10.5). Additionally, the evidence indicates a lack of current support for Android tablets, with an indication that they will be supported in fall 2015.
• The evidence indicated which supports and accommodations were enabled during the IRP, but it was unclear, based on the IRP evidence submitted, which supports and accommodations are supported by the assessment delivery system.

ETS

Strengths of the IRP Submission

• The evidence demonstrates that the bidder conducted the simulated assessment and the self-evaluation.
• The evidence demonstrates that the bidder's assessment delivery system can produce an item scoring output file that is accurate and in a format usable by the Smarter Balanced Data Warehouse.

Weaknesses of the IRP Submission

• The evidence indicates that two required parameters in the item scoring output file were not included in the IRP. Additionally, these two parameters were incorrectly identified as optional by the bidder.
• The bidder did not include rendering and interaction evidence for all the Smarter Balanced secure browsers and their corresponding operating systems (e.g., Chrome OS 40).
• The evidence indicated which supports and accommodations were enabled during the IRP, but it was unclear, based on the IRP evidence submitted, which supports and accommodations are supported by the assessment delivery system.
Pearson

Strengths of the IRP Submission

• The evidence demonstrates that the bidder conducted the simulated assessment and the self-evaluation.
• The evidence demonstrates that the bidder's assessment delivery system can produce an item scoring output file that is accurate and in a format usable by the Smarter Balanced Data Warehouse.

Weaknesses of the IRP Submission

• The evidence indicates that the rendering and interaction compliance determination was done using Web browsers, not the secure browsers outlined in the instructions. Additionally, some item types rendered only (i.e., no evidence to support interactions) on some Web browsers.
• The bidder did not include rendering and interaction evidence for all the Smarter Balanced secure browsers and their corresponding operating systems (e.g., iPad 3rd Generation).
• The evidence indicated which supports and accommodations were enabled during the IRP, but it was unclear, based on the IRP evidence submitted, which supports and accommodations are supported by the assessment delivery system.
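To make the three-valued legend in footnote 4 concrete, the following is a minimal illustrative sketch of how a reviewer could tabulate gaps in a bidder's self-reported rendering and interaction evidence. All of the entries, names, and findings in it are hypothetical examples chosen for illustration; none are data drawn from any bidder's actual IRP submission.

```python
# Illustrative sketch only (not part of the RFS or any bidder's IRP evidence).
# Per footnote 4: "X" = rendering and interaction compliant; "R" = rendering
# compliant but not interaction compliant; None (a blank cell) = no evidence
# submitted for that operating system.

from typing import Optional

# Hypothetical sample entries; a real review would cover every operating
# system listed in the summary table above.
evidence: dict[str, Optional[str]] = {
    "Android OS 4.2.x": None,       # blank cell: no evidence submitted
    "iOS 8.1": "R",                 # rendered, but interactions not demonstrated
    "Apple Mac OS X 10.10": "X",
    "Chrome OS 40": "X",
    "Windows 8.1": "X",
}

def flag_gaps(evidence: dict[str, Optional[str]]) -> list[str]:
    """Return a human-readable finding for every cell that is not an 'X'."""
    findings = []
    for os_name, mark in evidence.items():
        if mark is None:
            findings.append(f"{os_name}: no rendering/interaction evidence submitted")
        elif mark == "R":
            findings.append(f"{os_name}: rendering compliant, interaction not demonstrated")
    return findings

for finding in flag_gaps(evidence):
    print(finding)
```

Findings of exactly this shape (a missing operating system, or rendering without interaction) are what appear in the bidder-specific weakness lists above.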
Step Five: Internal CDE Review for the Development of this Recommendation

CDE staff with expertise in contract monitoring and statewide assessments thoroughly reviewed each bidder's submission, the evaluation panels' findings, the IV&V consultant's findings, and the IRP submissions to ensure the accuracy of the findings and to develop this recommendation on behalf of the SSPI. These CDE staff did not participate in the evaluation panels, nor did they rank or score the submissions; rather, they verified and accounted for statements made by the review panels and presented a summary for this item addendum.

CDE's Identification and Summary of the Strengths and Weaknesses

After reviewing each bidder's submission, the evaluation panels' findings, the IV&V consultant's findings, and the IRP submissions, CDE staff developed the following compilation of strengths and weaknesses. (Order of submissions is alphabetical.)

Strengths

Following is a summary of the strengths of the three submissions:

CTB Strengths

• Provided a work plan that includes a scoring center in California, which will facilitate California teachers' participation in test scoring.
• Provided a detailed and holistic description of artificial intelligence (AI) scoring and hand scoring processes, including the training of scorers, item analyses, and relevant item statistics.

ETS Strengths

• Provided a detailed timeline and work schedule and clear descriptions of meetings, including the roles and assignments of officers and the role of the bidder in its oversight of subcontractors.
• Provided a description of how data will be used to inform technical assistance center staffing, including the ability to adjust staffing levels to meet the demands of the CAASPP System.
• Provided details that included a designated office for testing security, a clear description of the anti-hacking system for CAASPP tests, and a thorough process for investigating security breaches and processing social media breaches in collaboration with the CDE.
• Provided extra statistics for items, inter-rater reliability, and a clear description of AI scoring.
• Provided a clear process for collecting school readiness information, with clear timelines, LEA site visits, and follow-up for those LEAs that do not have the technological capacity to administer computer-based assessments.
• Provided a description of a proven methodology and process for delivering online statewide assessments, as well as a description of a test delivery system architecture with strong fault tolerance, including how the system handles network connectivity and hardware failures and how different architectural components will be geographically dispersed.
• Provided a clear description of application vulnerability testing, dedicated Cisco firewalls, and threat management systems.

Pearson Strengths

• Provided detailed descriptions of the work plan, meetings, progress reports, major components, and designated staff for the transition processes from the current CAASPP contractor, ETS, to the bidder, and from the bidder to the next contractor.
• Presented clear descriptions of training, material production processes, in-house scanning, test manuals, and paper-pencil materials.
• Provided a discussion of how the vendor will be prepared in advance of new operating system releases.
• Provided a test development and item development plan (exclusive of schedule) that mimics the Smarter Balanced design and a basic item development review process that meets industry standards.

Weaknesses

Following is a summary of the weaknesses of the three submissions:

CTB Weaknesses

• The submission lacked depth, breadth, and clarity in its responses to multiple sections of the RFS, such as the following:
  - Training, accommodations, project management, scoring, and reporting.
  - Alternate assessments and a plan for future development of ELA and mathematics.
  - The assessment delivery system interface requirements, data security system implementation, and user experience.
• The submission did not specifically address, specify, and/or describe all of the system requirements in Task 3 of the RFS, such as the following:
  - How periodic modifications to the assessment delivery system minimum system requirements, to incorporate system enhancements and fixes, will be accomplished and documented.
  - The ability to complete the processing of 6.5 million student registration information records daily without impacting any other nightly batch processing or maintenance windows.
  - The development of a detailed Disaster Recovery and Business Continuity Plan that meets the minimum system requirements found in Table 3.3.1. (Specifically, it did not include the availability rate of 99.9 percent annually and the use of a Tier 3 data center or the required 99.82 percent availability.)
• The submission did not address ten of the 13 Performance requirements in Section 3.3.2.B.9 and Table 3.3.1.
• The submission did not include requirements of the RFS, as documented by the evaluation panels and confirmed by CDE staff, such as the following:
  - A detailed description of how a System Delivery Release Management Plan, inclusive of all the components outlined in the Minimum System Requirements, will be developed.
  - Performance tasks for any of the assessments in the submission.
  - A detailed procedure for data retention and destruction consistent with the Minimum System Requirements found in Table 3.3.1, or a detailed process for system maintenance and operation.
  - The system configurations that differ from open-source system default settings, for modifying the system architecture documentation.
  - Support for the Android operating system.
  - Support for all embedded accessibility supports (e.g., refreshable braille).
ETS Weaknesses

• The submission lacked depth, breadth, and clarity in response to some parts of the work plan, such as the following:
  - Reliance on AI scoring in the first year, with minimal hand scoring in the out years.
• The submission did not specifically address, specify, and/or describe each of the minimum system requirements in Task 3 of the RFS, such as the following:
  - The purge or disposal of sensitive data, and a description of integrity controls such as source authentication, checksums, and message authentication methods; protections against denial-of-service attacks; logging and audit controls; inventory of storage media; and encryption of portable storage devices.
  - The development of a detailed Disaster Recovery and Business Continuity Plan that meets the Minimum System Requirements found in Table 3.3.1. (Specifically, it did not include the availability rate of 99.9 percent annually and the use of a Tier 3 data center or the required 99.82 percent availability.)
• The submission did not address five of the 13 Performance requirements in Section 3.3.2.B.9 and Table 3.3.1.

Pearson Weaknesses

• The submission lacked depth, breadth, and clarity in its responses to multiple sections of the RFS, such as the following:
  - The test development schedule.
  - FAQs and emergency notifications; the submission also did not acknowledge the WebART review or the annual review, and it confused the data delivery system with a different Web site, demonstrating a lack of understanding of the RFS requirements.
  - Responses to LEAs' requests for devices and the approval process for individualized aids.
  - The appeals process (Smarter Balanced appeals) and how the bidder will work with LEAs.
  - The system requirements for the Interface Requirements and System Implementation.
• The submission did not specifically address, specify, and/or describe multiple sections of the RFS, such as the following:
  - Webinars and trainings for the alternate assessment.
  - A consistent look and feel for each class of user for all components of the system.
  - Level 1 support for technical issues.
  - A description of the recruitment process for the standard setting process.
  - A System Delivery Release Management Plan inclusive of all components outlined in the Minimum System Requirements.
  - A detailed process to develop a detailed System Functional Test Plan.
  - The development of a detailed Disaster Recovery and Business Continuity Plan that meets the Minimum System Requirements found in Table 3.3.1. (Specifically, it did not include the availability rate of 99.9 percent annually and the use of a Tier 3 data center or the required 99.82 percent availability.)
• The submission did not address ten of the 13 Performance requirements in Section 3.3.2.B.9 and Table 3.3.1.
• The submission did not include requirements of the RFS, as documented by the evaluation panels and confirmed by CDE staff. Examples include the following:
  - Large parts of the submission requirements for the Assessment and Delivery Architecture were missing.
  - Submission requirements for Data Security were missing.

Cost Submission Findings

CTB Weaknesses

• The submission provided subcontractor costs only on Attachment 10D, Assessment Delivery Cost Detail. Attachment 10D for fiscal year 2018–19 was not provided.
• The bidder noted "a/n" for several names, but reviewers were unable to determine what the "a/n" referenced.
• The bidder noted "TBD" for hours for several names.
ETS Weaknesses

• Subcontractor AIR rates and hours in Attachment 10C are not listed. The note for AIR on page C1 appears insufficient to explain why hours are not provided.
• Names listed on worksheet 10A were not listed on worksheet 10C (a psychometrician listed on page 7 was not listed in 10C).

Pearson Weaknesses

• The submission grouped key staff together instead of listing them individually (e.g., multiple staff under Program Manager/Program Associate).

Table 1.2—SCOPE OF WORK

3.1 TASK 1: Comprehensive Plan and Schedule of Deliverables
  3.1.1. Work Plan, Narrative Schedule, and Timeline
  3.1.2. Orientation Meeting
  3.1.3. Management Meetings
    3.1.3.A. Weekly Meetings
    3.1.3.B. Annual Meetings
    3.1.3.C. State Board of Education Meetings
    3.1.3.D. Technical Advisory Group Meetings
  3.1.4. Coordination, Continuous Improvement, and Independent Evaluation
    3.1.4.A. Coordination with the Consortium (UCLA) and CDE Entities and Staff
    3.1.4.B. Development of Plan for Continuous Improvement
    3.1.4.C. Coordination with the Independent Evaluator
    3.1.4.D. Responding to Concerns
  3.1.5. Transition of Contracts
  3.1.6. Records and Minutes
  3.1.7. Progress Reports
  3.1.8. Document Format and Style
  3.1.9. CDE Notification and Approval
3.2 TASK 2: Program Support Services
  3.2.1. CAASPP Coordinators
  3.2.2. Administration Management System LEA Support
  3.2.3. Data-Driven Improvement
  3.2.4. Technical Assistance Center
  3.2.5. Student Accessibility Tool
  3.2.6. Internet Resource Site
  3.2.7. Workshops and Webcasts
    3.2.7.A. Pre-test Workshops and Pre-test Webcast
    3.2.7.B. Training for Users of the Interim Assessments and the Digital Library
    3.2.7.C. Additional Webcasts
  3.2.8. Local Assessments: Smarter Balanced Interim Assessments and Digital Library
    3.2.8.A. Smarter Balanced Interim Assessments
    3.2.8.B. Digital Library of Formative Assessment Resources
3.3 TASK 3: Technology Services
  3.3.1. School Technology Readiness
  3.3.2. Assessment Delivery System
    3.3.2.A. Project Management Plan
    3.3.2.B. System Requirements
      3.3.2.B.1. Assessment Delivery System Architecture
      3.3.2.B.2. Interface Requirements
      3.3.2.B.3. Data Security
      3.3.2.B.4. System Development Process
      3.3.2.B.5. System Implementation
      3.3.2.B.6. User Experience
      3.3.2.B.7. Technical Assistance Center (Technology Support)
      3.3.2.B.8. System Delivery Release Management
      3.3.2.B.9. Performance
      3.3.2.B.10. Disaster Recovery and Business Continuity
      3.3.2.B.11. Data Policy Retention and Destruction
      3.3.2.B.12. Maintenance and Operations
3.4 TASK 4: Test Security
  3.4.1. Test Security Plan
  3.4.2. Test Administration Monitoring
  3.4.3. Investigating Security Breaches
3.5 TASK 5: Accessibility and Accommodations
  3.5.1. Accessibility Plan for Computer-based and Paper-pencil Tests
    3.5.1.A. Computer-based Tests
      3.5.1.A.1. Print on Demand
      3.5.1.A.2. Assistive Technology
      3.5.1.A.3. Translations
    3.5.1.B. Paper-pencil Tests
      3.5.1.B.1. Braille and Large Print Testing Materials
  3.5.2. Individualized Aids
3.6 TASK 6: Assessment Development
  3.6.1. Assessment Design
    3.6.1.A. Pre-Developed Primary Language Assessment
  3.6.2. Item and Task Development
    3.6.2.A. Pilot Testing
    3.6.2.B. Field Testing
    3.6.2.C. Test Form Construction
  3.6.3. Standard Setting
  3.6.4. Test Administration System Familiarization
  3.6.5. Released Test Questions
  3.6.6. Analysis of Test Results
  3.6.7. Item Bank
  3.6.8. Activities in Support of Future Assessment Development
    3.6.8.A. Test Development Experience
    3.6.8.B. Test Development for New Science Assessment Based on the California NGSS
    3.6.8.C. Test Development for Science Assessment Based on Alternate Performance Standards
    3.6.8.D. Test Development for Primary Language Assessment
    3.6.8.E. Test Development of Additional Assessments
3.7 TASK 7: Test Administration
  3.7.1. CAASPP Test Administration Requirements
    3.7.1.A. Manuals
  3.7.2. Paper-pencil Assessments
    3.7.2.A.1. Paper Test Booklets and Answer Documents
    3.7.2.A.2. Special Versions (Braille and Large-Print)
    3.7.2.A.3. Paper-Pencil Test Administration
  3.7.3. Computer-based Assessments
    3.7.3.A.1. Interim Assessments
    3.7.3.A.2. Appeals for Computer-Based Assessments
  3.7.4. Contracting with LEAs for STS for Dual Immersion Programs
3.8 TASK 8: Scoring and Analysis
  3.8.1. Scoring
    3.8.1.A. Methods of Scoring
      3.8.1.A.1. Deterministic or Machine Scoring
      3.8.1.A.2. Performance Task and Constructed-Response Scoring
    3.8.1.B. Interim Assessment Scoring
    3.8.1.C. Cumulative Scores
  3.8.2. Analysis of Test Results
    3.8.2.A. Item Analyses
    3.8.2.B. Summary Analyses
    3.8.2.C. Replication of Analyses
    3.8.2.D. Interim Assessment Analyses
3.9 TASK 9: Reporting Results
  3.9.1. Reporting to Local Educational Agencies
  3.9.2. Reporting to the CDE – Public Reporting Web Site
  3.9.3. Data Files
  3.9.4. Secure File Transfer System
  3.9.5. Technical Report
  3.9.6. Other Analyses or Reports

Table 1.3—Evaluation Panel

[Table: evaluation panel composition. For each member, the table lists whether the member also reviewed the cost submissions (Y/N), the member's LEA size (S/M/L) or CDE branch, role, and experience.]

Assessment panel, LEA members (role; LEA size; experience):
• Administrative Director; large LEA; 39 years
• Testing Operations Manager; large LEA; 13 years
• Supervisor, Achievement Assessments Office; large LEA; 20 years
• LEA CAASPP Coordinator; small LEA; 15 years
• Director of Assessment, Research and Evaluation Office; large LEA; 13 years

Assessment panel, CDE members (branch; role; experience):
• District, School, and Innovation Branch; CAASPP LEA; 5 years
• District, School, and Innovation Branch; CAASPP Science; 4 years, 8 months
• District, School, and Innovation Branch; Psychometrics; 14 years
• District, School, and Innovation Branch; Academic Accountability; 14 months with the CDE plus 25 years of assessment experience with the state (DCA)
• District, School, and Innovation Branch; ELPAC; 8 months with the CDE and 6 years with the University of California
• Personnel Services Division; Contracts; 25 years

Technology panel, LEA members (role; LEA size; experience):
• Data Coordinator/Trainer; small LEA; 7.5 years
• Chief Technology Officer/Technology Coordinator; medium LEA; 17 years
• Network Technology Lead; small LEA; 18 years
• Director, Administrative Operations; large LEA; 18 years
• Education Research Analyst; large LEA; 12.5 years

Technology panel, CDE members (branch; role; experience):
• District, School, and Innovation Branch; CAASPP; 10 years
• Services for Administration, Finance, Technology, and Infrastructure Branch; Technology Services Division (TSD); 16 years at the CDE, 2 years in current position
• Services for Administration, Finance, Technology, and Infrastructure Branch; CALPADS; 3 years

Attachments

Attachment 1: California Assessment of Student Performance and Progress Request for Submissions CN 150012 Submission Score Summary (1 page)
Attachment 2: California Assessment of Student Performance and Progress Request for Submissions CN 150012 Cost Submission Summary (2 pages)
Attachment 3: State Superintendent of Public Instruction's Letter to Michael Kirst, SBE President (3 pages)
Attachment 4: California Assessment of Student Performance and Progress Request for Submissions CN150012 Notice of Public Viewing (1 page)

Attachment 1
California Assessment of Student Performance and Progress
Request for Submissions CN 150012
Submission Score Summary

Task (RFS Section) | Possible Points / Weight Percent | CTB/McGraw-Hill: Consensus, Percent Earned¹, Weighted² | Educational Testing Service: Consensus, Percent Earned¹, Weighted² | NCS Pearson: Consensus, Percent Earned¹, Weighted²
Comprehensive Plan and Schedule of Deliverables (3.1) | 16 / 5 | 13, 81%, 41 | 16, 100%, 50 | 14, 88%, 44
Program Support Services (3.2) | 61 / 5 | 33, 54%, 27 | 46, 75%, 38 | 38, 62%, 31
Technology Services (3.3) | 82 / 15 | 41, 50%, 75 | 60, 73%, 110 | 42, 51%, 77
Test Security (3.4) | 28 / 10 | 22, 79%, 79 | 23, 82%, 82 | 21, 75%, 75
Accessibility and Accommodations (3.5) | 18 / 10 | 14, 78%, 78 | 14, 78%, 78 | 10, 56%, 56
Assessment Development (3.6) | 85 / 20 | 46, 54%, 108 | 65, 76%, 152 | 53, 62%, 124
Test Administration (3.7) | 49 / 5 | 33, 67%, 34 | 40, 82%, 41 | 27, 55%, 28
Scoring and Analysis (3.8) | 71 / 15 | 47, 66%, 99 | 55, 77%, 116 | 43, 60%, 90
Reporting Results (3.9) | 44 / 15 | 31, 70%, 105 | 38, 86%, 129 | 29, 66%, 99
Scope of Work Score Total | 454 / 100 | 280 consensus, 646 weighted | 357 consensus, 796 weighted | 277 consensus, 624 weighted
Cost Submission Consensus Score (not weighted) | 200 | 148 | 136 | 145
Total | 1,200 | 794 | 932 | 769

Cost Submission Grand Total (total costs covering the 2015–16, 2016–17, and 2017–18 test administrations): CTB/McGraw-Hill $223,769,974.58; Educational Testing Service $239,998,122.30; NCS Pearson $205,840,739.00

¹ Percent Earned = (Consensus Score / Possible Points) × 100
² Total Weighted Score in Points = (Percent Earned × Weight Percent) / 10
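As a quick illustration of how the two footnote formulas combine, the sketch below recomputes the Task 3.1 row for all three bidders. It is a minimal sketch, not part of the RFS; the attachment does not state a rounding rule, so conventional half-up rounding is assumed here because it reproduces the published figures for this row, and the function names are the author's own.

```python
import math

# A minimal sketch reproducing the two Submission Score Summary formulas for
# Task 3.1 (16 possible points, weight 5 percent). The attachment does not
# specify a rounding rule; half-up rounding is assumed, as it matches this row.

def round_half_up(x: float) -> int:
    return math.floor(x + 0.5)

def weighted_score(consensus: int, possible: int, weight_percent: int) -> int:
    percent_earned = round_half_up(consensus / possible * 100)   # footnote 1
    return round_half_up(percent_earned * weight_percent / 10)   # footnote 2

assert weighted_score(13, 16, 5) == 41   # CTB:     13/16 -> 81% -> 41 points
assert weighted_score(16, 16, 5) == 50   # ETS:     16/16 -> 100% -> 50 points
assert weighted_score(14, 16, 5) == 44   # Pearson: 14/16 -> 88% -> 44 points
```

Summing a bidder's nine weighted task scores gives its Scope of Work weighted score (for CTB, 41 + 27 + 75 + 79 + 78 + 108 + 34 + 99 + 105 = 646); adding the cost submission consensus score (148 for CTB) yields the total reported in this item (794).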
Attachment 2
California Assessment of Student Performance and Progress
Request for Submissions CN 150012
Cost Submission Summary

Per the Request for Submissions, once the State Board of Education (SBE) approves the successful bidder, the contract scope of work and budget will be finalized through negotiations between the successful bidder, the California Department of Education, SBE staff and members, and the Department of Finance. The successful bidder designated by the SBE will be requested to provide costs per subtask, per pupil, fixed and variable, and per test administration prior to entering budget negotiations.

Fiscal Year 2015–16 (12 months)

Task | CTB/McGraw-Hill | Educational Testing Service | NCS Pearson
Task 1: Comprehensive Plan and Schedule of Deliverables | $5,639,100.32 | $4,681,624.69 | $1,839,250.88
Task 2: Program Support Services | $10,631,031.94 | $8,201,490.53 | $3,804,554.86
Task 3: Technology Services | $4,656,594.11 | $5,027,486.35 | $3,916,177.36
Task 4: Test Security | $1,138,395.80 | $99,832.25 | $294,057.85
Task 5: Accessibility and Accommodations | $498,339.54 | $171,457.16 | $44,497.92
Task 6: Assessment Development | $7,390,788.17 | $3,234,494.40 | $4,752,770.61
Task 7: Test Administration | $20,700,545.83 | $31,285,609.29 | $11,553,703.38
Task 8: Scoring and Analysis | $13,970,048.38 | $21,193,220.27 | $41,848,475.21
Task 9: Reporting Results | $9,607,538.16 | $2,104,720.09 | $5,939,246.93
Total | $74,232,382.25 | $75,999,935.03 | $73,992,735.00

Fiscal Year 2016–17 (12 months)

Task | CTB/McGraw-Hill | Educational Testing Service | NCS Pearson
Task 1: Comprehensive Plan and Schedule of Deliverables | $6,583,423.94 | $4,754,962.16 | $1,422,289.31
Task 2: Program Support Services | $12,231,393.58 | $7,888,022.58 | $3,610,438.42
Task 3: Technology Services | $4,874,479.31 | $4,733,928.99 | $2,989,059.58
Task 4: Test Security | $1,052,030.18 | $100,576.58 | $298,158.50
Task 5: Accessibility and Accommodations | $383,021.40 | $100,491.20 | $43,349.70
Task 6: Assessment Development | $11,258,653.06 | $6,020,004.06 | $3,224,694.57
Task 7: Test Administration | $17,302,083.43 | $32,426,307.55 | $10,743,409.47
Task 8: Scoring and Analysis | $14,443,766.44 | $23,904,257.53 | $41,717,262.46
Task 9: Reporting Results | $7,538,252.34 | $2,930,378.71 | $5,057,542.99
Total | $75,667,103.68 | $82,858,929.36 | $69,106,205.00

Fiscal Year 2017–18 (12 months)

Task | CTB/McGraw-Hill | Educational Testing Service | NCS Pearson
Task 1: Comprehensive Plan and Schedule of Deliverables | $6,469,858.30 | $4,869,001.59 | $1,411,292.57
Task 2: Program Support Services | $13,374,213.70 | $8,107,584.37 | $2,952,473.05
Task 3: Technology Services | $5,118,305.72 | $4,293,781.43 | $2,828,768.39
Task 4: Test Security | $1,029,133.30 | $102,410.63 | $304,959.70
Task 5: Accessibility and Accommodations | $412,452.40 | $110,762.83 | $43,642.82
Task 6: Assessment Development | $9,581,647.50 | $8,499,556.24 | $4,076,019.88
Task 7: Test Administration | $14,317,104.91 | $27,574,358.66 | $4,996,955.75
Task 8: Scoring and Analysis | $14,426,523.66 | $19,648,999.02 | $38,447,251.10
Task 9: Reporting Results | $7,517,563.69 | $2,552,625.25 | $4,804,022.74
Total | $72,246,803.18 | $75,759,080.02 | $59,865,386.00

Fiscal Year 2018–19 (6 months only)

Task | CTB/McGraw-Hill | Educational Testing Service | NCS Pearson
Task 1: Comprehensive Plan and Schedule of Deliverables | $946,631.67 | $1,615,287.23 | $390,793.55
Task 2: Program Support Services | $219,520.50 | $383,409.70 | $50,731.20
Task 3: Technology Services | $0 | $601,494.00 | $629,000.00
Task 4: Test Security | $0 | $0 | $6,592.08
Task 5: Accessibility and Accommodations | $0 | $0 | $0
Task 6: Assessment Development | $131,804.30 | $1,401,847.08 | $587,564.59
Task 7: Test Administration | $11,010.80 | $192,217.58 | $6,292.44
Task 8: Scoring and Analysis | $72,854.00 | $50,844.00 | $191,056.42
Task 9: Reporting Results | $241,864.20 | $1,135,078.30 | $1,014,382.72
Total | $1,623,685.47 | $5,380,177.89 | $2,876,413.00

Cost Submission Total

Task | CTB/McGraw-Hill | Educational Testing Service | NCS Pearson
Task 1: Comprehensive Plan and Schedule of Deliverables | $19,639,014.23 | $15,920,875.67 | $5,063,626.31
Task 2: Program Support Services | $36,456,159.72 | $24,580,507.18 | $10,418,197.53
Task 3: Technology Services | $14,649,379.14 | $14,656,690.77 | $10,363,005.33
Task 4: Test Security | $3,219,559.28 | $302,819.46 | $903,768.13
Task 5: Accessibility and Accommodations | $1,293,813.34 | $382,711.19 | $131,490.44
Task 6: Assessment Development | $28,362,893.03 | $19,155,901.78 | $12,641,049.65
Task 7: Test Administration | $52,330,744.97 | $91,478,493.08 | $27,300,361.04
Task 8: Scoring and Analysis | $42,913,192.48 | $64,797,320.82 | $122,204,045.19
Task 9: Reporting Results | $24,905,218.39 | $8,722,802.35 | $16,815,195.38
Total | $223,769,974.58 | $239,998,122.30 | $205,840,739.00

Attachment 3

March 6, 2015

Michael Kirst, President
State Board of Education
1430 N Street
Sacramento, CA 95814

Dear President Kirst:

California Education Code (EC) Section 60643 requires that I provide you the California Department of Education's (CDE) recommendation for the selection of the next California Assessment of Student Performance and Progress (CAASPP) contractor. This decision has many long-lasting effects on the programs administered by the CDE and implemented by the 1,874 local educational agencies (LEAs) and approximately 10,000 schools throughout the state. More importantly, this decision will have a lasting and enduring impact on the children of California; their parents and guardians; teachers and administrators; and the people of California. The direct and indirect consequences of this decision cannot be overstated, and it is for that reason that I feel fortunate the CDE received three responsive proposals from three highly respected companies: CTB/McGraw-Hill (CTB), Educational Testing Service (ETS), and NCS Pearson, Inc. (Pearson).

After a very long, thorough, and deliberate evaluation process, I recommend the following action be taken by the California State Board of Education (SBE):

"ETS be designated as the CAASPP contractor, and that the SBE designation be expressly conditioned on ETS meeting each of the stated conditions that follow. If these conditions are not met by the May 2015 SBE meeting, the SBE gives notice that it expressly reserves its right to rescind this designation and select another contractor at the May meeting. The conditions to be met by ETS are as follows:

• A draft contract and scope of work to which the parties will have reached substantial agreement shall be presented at the May 2015 SBE meeting. The contract shall be executed shortly thereafter with the approval of the State Board President or his designee and the Superintendent of Public Instruction or his designee.

• In no event shall the contract price exceed the amount that is estimated to be included in the annual Budget Act for this contract; nor shall the contract price exceed the published cost submission total, except as stipulated below.

• Require the designated contractor to ensure that online individual student results for all CAASPP computer-based assessments be available in the Dynamic Online Reporting System within three (3) weeks after the student has completed all components of the assessment for that content area.

• Require the designated contractor to ensure that online individual student results for all CAASPP paper-pencil tests be available in the Dynamic Online Reporting System within six (6) weeks after the scoring center receives a complete, clean set of answer documents for processing and scoring and after receipt of the score keys and conversion tables.
• If directed by the SBE or the CDE, the designated contractor and/or its subcontractors for a specific task will agree to provide the same approach/work described in another bidder's submission for that task (the desired approach) at costs not to exceed the cost proposed for that task in the other submission.

• Further, the SBE reserves the right to extend the ETS designation for a longer period, with additional test administrations and fiscal years and cost to be negotiated and approved by the Department of Finance in accordance with EC Section 60643."

I am proud of the work performed by the staff of the CDE and that of the LEA evaluators. My recommendation is the culmination of an evaluation process consisting of four steps:

• Review of the submissions by evaluation panel members;
• Review of the submissions by the independent consultant from Sabot Consulting, the CDE's Independent Validation and Verification (IV&V) contractor;
• Implementation Readiness Package (IRP) evidence review; and
• Internal review for the development of this recommendation.

Our submission and evaluation process was developed and designed to meet and exceed the requirements of EC Section 60643, to ensure "a competitive and open process utilizing standardized scoring criteria," and to arrive at a recommendation that accurately reflects the requirements set forth in the Request for Submissions (RFS), approved by the SBE at the November 2014 SBE meeting. The processes were extremely effective and provided the CDE with the information needed to arrive at this recommendation, and I can state assuredly that the people of California are receiving the benefits of our good stewardship of their trust and tax dollars.

Teachers; school district testing, evaluation, technical, and curriculum staff; and CDE staff evaluated the submissions. They served on panels to review the comprehensive plan and schedule of deliverables; program support services; technology services; test security; accessibility and accommodations; assessment development; scoring and analysis; reporting; as well as the cost submissions. We had our independent IV&V consultant review Task 3–Technology Services to determine not only responsiveness to the general requirements of the RFS, but also to identify any potential deficiencies in the submissions regarding technology services for use during the pending contract negotiations. We also had additional CDE staff thoroughly review the evaluation panel findings and each bidder's submission for the development of this recommendation.

While all of the proposals were identified as being "responsive to the general requirements described in the RFS" and able to meet the "technical solutions to successfully host the Smarter Balanced Assessment Delivery System," I believe that the submission by ETS presents California with the technical qualities most beneficial for our successful administration of the CAASPP System. The evaluation panels' consensus opinions identified ETS as the bidder with the most solid, well-written proposal, one that clearly identified and described how ETS would support the administration of the CAASPP System and provide the CDE and the SBE with the necessary supports to carry out the critical elements needed to maintain and improve California's world-class assessment system.
ETS’s flexibility and corporate agility will allow negotiations and scope(s) of work to benefit not only the CDE but also our students, educators, and the people of California. I appreciate the time and efforts by the bidders, evaluators, and staff and respectfully encourage your support and approval of my motion. Sincerely, Tom Torlakson TT:dk 3/13/2015 9:53 AM addendum-dsib-adad-mar15item04 Attachment 4 Page 1 of 1 CALIFORNIA DEPARTMENT OF CALIFORNIA STATE EDUCATION BOARD OF EDUCATION TOM TORLAKSON, State Superintendent of Public Instruction 916-319-0800 1430 N Street Sacramento, CA 95814-5901 MICHAEL W. KIRST, President 916-319-0827 March 9 and March 10, 2015 NOTICE OF PUBLIC VIEWING California Assessment of Student Performance and Progress Request for Submissions CN150012 This Request for Submissions (RFS) invited submissions for the development, administration, scoring, reporting, and analysis of assessments and technology support for the California Assessment of Student Performance and Progress (CAASPP) System as defined in California Education Code (EC) sections 60601 through 60649. Pursuant to EC Section 60643, the California Department of Education (CDE) will evaluate the submissions received in response to this competitive-bidding process to recommend a CAASPP testing contractor(s) to the State Board of Education (SBE) for approval. Submissions will be evaluated as set forth in RFS Section 6, Evaluation Process and the CDE will present the evaluation results, each bidder’s cost submission, and the CDE recommendation to the SBE for the selection of the CAASPP testing contractor at the SBE’s March 2015 meeting. PUBLIC VIEWING The CDE will host a public viewing of the three submissions received in response to the RFS on March 9 and 10, 2015, from 9 a.m. to 4 p.m. Viewing will be available at the California Department of Education, 1430 N Street, Sacramento, CA 95814, in room 1103. Visitors must check-in with security before proceeding to the room. No photography or recording of the submission will be permitted. Space and available copies of the submissions will be limited to first come – first served. Only the submissions will be made available at the viewing. 3/13/2015 9:53 AM