Memorandum

To: Interested Parties
From: Mark Johnson, NC Superintendent of Public Instruction
Re: Public Records from the Read to Achieve RFP

There is a great deal of misinformation in public forums regarding the Read to Achieve diagnostic selection process. DPI cannot release every detail of the procurement process until the process is complete (i.e., until after the protest is decided). Unfortunately, that means the public records released now might not present a full picture of the process. It is our hope, though, that these public records help to eliminate some of the misinformation.

Public Records Context

A review of evaluation committee notes and an internal update presentation included in these public records will reveal misstatements of fact put forth by members of the evaluation committee. Many of these misstatements were clarified and corrected later in the process, such as statements regarding dyslexia screening. In the case of the update presentation from December of 2018, the slides were not updated to correct misstatements of fact, missing information, or the Phase 1 rankings based on such misstatements because the presentation was never publicly delivered. The Contract Award Recommendation and presentation to the State Board of Education contain correct information.

Statement from Superintendent Johnson

"Istation is the best reading diagnostic tool for North Carolina, and I believe using Istation will yield quality data that will better support success for our students, meeting students where they are and helping them grow, while also reducing the time teachers must spend testing students. DPI and the State Board adhered to all laws, rules, and policies during this procurement to ensure fairness and objectivity. We are excited about the end result of a partnership with Istation to support students and teachers across North Carolina."

Reading Diagnostic Tool Statewide Professional Development Plan
Dr. Tara Galloway, K-3 Literacy Director

ISIP Authorship Team: Joseph Torgesen, Ph.D.; Patricia Mathes, Ph.D.; Jeannine Herron, Ph.D. The assessment takes on average 25-45 minutes.

ISIP Subtests by Grade
• Kindergarten: Listening Comprehension, Phonemic Awareness, Letter Knowledge, Vocabulary
• 1st Grade: Phonemic Awareness, Letter Knowledge, Vocabulary, Alphabetic Decoding, Comprehension, Spelling, Connected Text Fluency* (Maze/Cloze Passage), Oral Reading Fluency
• 2nd and 3rd Grade: Vocabulary, Comprehension, Spelling, Connected Text Fluency* (Maze/Cloze Passage), Oral Reading Fluency
* Text Fluency subtest not included in overall ability score

Valid and Reliable Measures Drive Instruction Based on the Science of Reading

Implementation Plan
• Use Istation at the beginning of the 2019-2020 school year (with delay in metrics)
• Gather data to become familiar with the assessments during the fall
• Use data to inform instruction
• Delay the use of data to measure growth for EVAAS until the MOY benchmark
• Use the first official benchmark in the winter (MOY) and the end-of-year benchmark (EOY) for EVAAS purposes
• Train all teachers by the start of school by continuing an aggressive schedule including:
  - in-person workshops
  - on-demand webinars
  - learning modules (podcasts)
  - technical assistance
  - ongoing support from the K-3 Literacy team

Impact of Measuring Growth Using Middle of Year (MOY) to End of Year (EOY)
• Since EVAAS growth is a relative measure of performance, there is no predetermined level a student must reach to show growth.
• As long as the measurement period for all participants is roughly the same, the model will yield a valid estimate of growth for a teacher relative to peers in the same grade and subject.
• Teachers in the state will not be disadvantaged by the shorter measurement period because we are comparing the progress one teacher's students made to all the other teachers' progress with their students (in the same grade and subject) in the same amount of time.
• The State has always measured kindergarten growth this way.
• Growth for third grade is not based on the diagnostic assessment and will continue to be measured with BOG/EOG for EVAAS.
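To make the "relative measure" idea in the bullets above concrete, the sketch below is a simplified, hypothetical illustration only. It is not the SAS EVAAS model, and every teacher name and score gain in it is invented; it simply shows each class's gain over the same MOY-to-EOY window being expressed relative to the peer average, so no fixed cut score is involved.

```python
# Simplified illustration (NOT the SAS EVAAS model): growth is expressed
# relative to peers measured over the same MOY-to-EOY window, so there is
# no fixed score a class must reach to "show growth."
import random
import statistics

random.seed(1)

def class_gain(n_students: int) -> float:
    """Average MOY-to-EOY scale-score gain for one hypothetical class."""
    gains = [random.gauss(10, 4) for _ in range(n_students)]  # invented gains
    return statistics.mean(gains)

# Hypothetical teachers in the same grade and subject, all measured MOY -> EOY.
teacher_gains = {f"Teacher {i}": class_gain(20) for i in range(1, 6)}

state_mean = statistics.mean(teacher_gains.values())
state_sd = statistics.stdev(teacher_gains.values())

for name, gain in teacher_gains.items():
    # A class's "growth index" here is simply its distance from the peer
    # average, in standard-deviation units, over the identical window.
    index = (gain - state_mean) / state_sd
    print(f"{name}: mean gain {gain:5.1f}, relative growth index {index:+.2f}")
```

Because every class in the comparison is measured over the same shortened window, shortening the window shifts everyone's raw gains together and leaves the relative comparison intact, which is the point the bullets above are making.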
[Calendar graphic, June 2019 through April 2020, showing: dates for technical and educator webinars conducted in a live Q&A environment (recorded and available for on-demand viewing); release of recorded online modules for educators to view a brief course, complete a quiz, and attain a certificate of completion; podcasts; planning and meetings to prepare for the launch of Istation across North Carolina; and face-to-face training provided by Istation for chosen campus leaders (region-based interactive workshops to bring back to the school level). Annotations: January (traditional), first benchmark window for 2019-2020 opens for EVAAS purposes (MOY) (December for YR); end-of-year benchmark window for 2019-2020 opens for EVAAS purposes (April for YR).]

Training Opportunities: Face to Face – Webinars – Podcasts
Podcasts (Online Modules) begin July 15th.
Note: Upon completion of training, participants will receive a certificate which can be used to earn CEU credit.

Getting Started with Istation Roll-out Plan

Summer 2019 (June, July, August)
• Istation enrollment and deployment activated in schools
• Live webinars hosted
• In-person regional trainings hosted
• Ongoing implementation support
• Districts can begin using Istation as early as July, but it will not count in metrics

Fall 2019 (September, October, November)
• Students take Istation's ISIP assessment to begin to learn from the program
• Additional in-person regional trainings hosted
• Additional live webinars hosted
• Fall is a "getting started" learning opportunity: data will not feed into EVAAS

Winter 2019-2020 (December, January, February)
• Ongoing progress monitoring continues
• Ongoing training continues
• JANUARY: The first benchmark window for 2019-2020 opens for EVAAS purposes (MOY) *
Spring 2020 (March, April, May)
• Ongoing progress monitoring continues
• Ongoing training continues
• MAY: End-of-year benchmark window for 2019-2020 opens for EVAAS purposes (EOY) *
* Traditional calendar (December / April for YR)

Istation Measures for Dyslexia Screening

Kindergarten
• Automatically screened upon login
• Additional subtest: alphabetic decoding *
• Normal ISIP kindergarten subtests: listening/language comprehension, phonological and phonemic awareness *, letter knowledge *, vocabulary

1st Grade
• Automatically receive the relevant subtests based on the initial screening: phonological and phonemic awareness *, letter knowledge *, vocabulary, alphabetic decoding *, reading comprehension, spelling

2nd Grade
• Automatically screened upon login
• Additional subtests: alphabetic decoding *, letter knowledge *, phonological and phonemic awareness *
• Normal ISIP subtests for 2nd grade: vocabulary, reading comprehension, spelling, text fluency

3rd Grade
• Automatically screened upon login
• Additional subtests: alphabetic decoding *, letter knowledge *, phonological and phonemic awareness *
• Normal ISIP subtests for 3rd grade: vocabulary, reading comprehension, spelling, text fluency

Next Steps - Read to Achieve
• DPI is purchasing the diagnostic assessment directly, as before
• DPI has Read to Achieve funding available for implementation of the new diagnostic assessment:
  - Devices: each classroom should have sufficient devices to implement a work station approach (4 is the recommended guideline)
  - In addition to this funding, DPI has iPads available to distribute
  - Accessories: for example, each device should be equipped with a headset with microphone
  - Training expenses such as summer stipends

Next Steps – PRC 085 Allotment Policy Manual Update
• Policy was updated at the June SBE meeting
  - to remove one-time allotments from 2017-18 not continued in 2018-19
  - to provide a clean slate for 2019-20 changes once the assessment contract is awarded
• Propose update to the current version of the policy based on feedback from local superintendents in June (current version only provides for device refresh)
• Empower districts to make locally-informed choices on K-3 Literacy spending while maintaining State-defined parameters aligned with the Read to Achieve law, including anticipated 2019 amendments
• Update to add the following allowable expenditures:
  - K-3 literacy aligned instructional supports
  - Training and personnel to support K-3 literacy instruction
• Continue allotment of funding based on K-3 ADM

Next Steps - Read to Achieve
• Operational & Policy Steps
  - Update Read to Achieve Guidebook
  - Annual updates to Local Alternative Assessment list
  - Grade level expectations for proficiency

READ TO ACHIEVE
K-3 Literacy Division, NC Department of Public Instruction
Istation and Read to Achieve in North Carolina [video]
Public Schools of North Carolina

Questions?
Dr. Tara Galloway, K-3 Literacy Director

Roy Cooper, Governor
Eric Boyette, Secretary of Information Technology and State Chief Information Officer

Contract Award Recommendation

To: Andrea Pacyna, Deputy Chief IT Procurement Officer, Department of Information Technology
From: Tymica Dunn, Procurement Chief, Department of Public Instruction
Date: June 7, 2019
Subject: Contract Award Recommendation, Read to Achieve Diagnostics, Requisition # RQ20680730, DIT File #300042
Reference #: Request for Negotiations 40-RQ20680730A, DIT File #300042

Enclosed for your review and approval is the award recommendation for Requisition # RQ20680730. Bids received pursuant to RFN #40-RQ20680730A have been reviewed, and the Evaluation Committee hereby requests the Statewide IT Procurement Office to award the contract as follows:

Description: Read to Achieve Diagnostics – Software as a Service
Recommended Vendor: Imagination Station Inc., dba Istation
Cost: $8,405,820 for 3 years
Contract Term: Two (2) years plus one (1) optional one-year renewal at the discretion of the State
Project Name and Number: Read to Achieve Diagnostics - 2018, DIT file # 300042

Thank you for your assistance. If additional information is required, please do not hesitate to contact me.

cc: Evaluation Committee; Patti Bowers, DSCIO; Glenn Poplawski, DSCIO; Kathy Bromead, PMA

Table of Contents
Section 1: Introduction ..... 4
Section 2: Evaluation Committee ..... 4
Section 3: Evaluation Criteria / Methodology ..... 5
Section 4: Timeline ..... 6
Section 5: Evaluation of Bid Submission ..... 6
Section 6: Vendors ..... 7
  A. Evaluation Criteria ..... 7
  B. Cost ..... 7
  C. Vendor Financial Stability ..... 9
  D. Formative and Diagnostic Assessment ..... 9
  E. Personalized Learning ..... 11
Section 7: Finalist Vendor(s) ..... 13
Section 8: Award Recommendation ..... 14
Section 9: Supporting Documentation ..... 14
Section 1: Introduction

The North Carolina Department of Public Instruction posted Request for Proposal number 40-RQ20680730A to the North Carolina Interactive Purchasing System on September 6, 2018. A total of four (4) bids were received; however, the evaluation committee could not reach a consensus and deemed it most advantageous to the State to cancel and negotiate with sources of supply. NCDPI requested and received approval from the DIT DSCIO/Chief Procurement Officer to negotiate. Requests for Negotiation were sent to Amplify and Istation on March 28, 2019, and negotiation meetings were conducted with both vendors on April 11, 2019 at the North Carolina Department of Public Instruction.

The purpose of this award recommendation and the resulting contract award is to identify the vendor best qualified to offer a Read to Achieve Diagnostic Software as a Service solution (RtAD) to meet NCDPI's obligations under state law, N.C.G.S. § 115C-83.1 et seq. North Carolina state law requires kindergarten through third grade students to be assessed with valid, reliable, formative, and diagnostic reading assessments. NCDPI is obligated to adopt and provide these developmentally appropriate assessments. The solution must assess student progress, diagnose difficulties, inform instruction and remediation, and yield data that can be used with the Education Value-Added Assessment System (EVAAS).

Section 2: Evaluation Committee

Name / Title, Agency / Participation Level
• Berry, Erika: Senior Policy Advisor, NCDPI (Decision Maker)
• Craver, Nathan: Digital Teaching and Learning Consultant, NCDPI (Decision Maker)
• Karkee, Thakur: Psychometrician, NCDPI (Decision Maker)
• Shue, Pam: Deputy Superintendent of Early Education, NCDPI (Decision Maker)
• AlHour, Julien: Director - Architecture, Integration, & Quality Assurance, NCDPI (SME)
• Dunn, Tymica: Purchasing Section Chief, NCDPI (Procurement Officer)
• Gossage, Chloe: Chief Strategy Officer, NCDPI (SME)
• Strong, Melissa: State Board of Education Attorney (SME)
• Viswanathan, Srirekha: Project Manager, NCDPI (Project Manager)

Role Definitions:
• Decision Maker: Key business stakeholders evaluating the bid responses. Voting.
• Project Manager: Overall responsibility includes successful initiation, planning, design, execution, implementation, and closure of a project. Non-Voting.
• Subject Matter Expert (SME): Person who is an authority in a particular technical area pertaining to the procurement. Non-Voting.

Section 3: Evaluation Criteria / Methodology

The selection process was conducted using the "best value" methodology authorized by N.C.G.S. §§ 143-135.9 and 143B-1350(h). The evaluation committee met as a group and evaluated the responsive proposals.
The evaluation criteria listed below are in order of importance:
1. Cost
2. Vendor Financial Stability
3. Formative and Diagnostic Assessment
4. Personalized Learning

Section 4: Timeline

• March 21, 2019: RFP cancellation notifications sent to vendors; Request to Negotiate review period
• March 27, 2019: RFP proposals were extended to June 29, 2019 (Clarification 1)
• April 11, 2019: Negotiation meeting with vendors
• April 17, 2019: Clarification issued to vendors (Clarification 2)
• April 23, 2019: Clarification response received and shared with evaluation team
• April 25, 2019: Evaluation Committee meeting and discussion of proposal strengths and weaknesses
• May 3, 2019: Clarification issued to vendor (Clarification 3); clarification response received and shared with evaluation team
• May 15, 2019: Clarification issued to vendor (Clarification 4); clarification response received and shared with evaluation team
• June 4, 2019: Best and Final Offer (BAFO)
• June 6, 2019: Award Recommendation

Section 5: Evaluation of Bid Submission

Proposal responses from the following two vendors were considered for further negotiations:
1. Amplify Education Inc., 55 Washington Street, Suite 800, Brooklyn, NY 11201
2. Imagination Station Inc., dba Istation, 8150 North Central Expressway, Suite 2000, Dallas, TX 75206

Section 6: Vendors

Listed below is a synopsis of each proposal submitted based on the criteria defined in Section 3.

A. Evaluation Criteria

The "best value" procurement method authorized by N.C.G.S. §§ 143-135.9 and 143B-1350(h) was used for this evaluation. A one-step source selection was used. The proposals were objectively evaluated using the evaluation criteria described below. The evaluation team members did their due diligence and issued clarifications for each proposal before meeting the vendors on April 11, 2019. Strengths and weaknesses were discussed during the evaluation meeting on April 25, 2019. The following evaluation criteria were used to determine strengths and weaknesses:
1. Cost
2. Vendor Financial Stability
3. Formative and Diagnostic Assessment
4. Personalized Learning

B. Cost

The strengths and weaknesses identified by the Evaluation Team for the responsive vendors are summarized below.

Amplify
Strengths: No strengths noted.
Weaknesses:
1. Amplify submitted two cost offers: one for assessment only at $4,312,210 (Year 1), $3,895,210 (Year 2), and $3,883,760 (Year 3), totaling $12,102,096.08; and another for a personalized and blended approach to learning at $11,948,912.75 (Year 1), $10,934,412.75 (Year 2), and $10,922,962.75 (Year 3), totaling $33,806,288.25. The assessment-only cost, which was considered for this proposal review, is significantly higher than Istation's assessment-only tool.
2. The assessment cost of $8.00 per student is higher than that of Istation and does not include online assessments or remote student or parent access. This cost does not include teacher lessons.
3. The assessment is not automated and requires teacher intervention (reading the tests aloud), which takes significant classroom time away from teaching.
4. The Professional Development cost for Year 1 is $556,650; however, it is limited to training Master Literacy Trainers and NCDPI consultants.
5. The proposal response did not adequately include strategies for ensuring consistent scoring to evaluate training effectiveness.

Istation
Strengths:
1. Istation submitted two cost offers: one for the assessment component only and one for both the assessment and curriculum components. The cost for the assessment was $2,751,940 (Year 1), $2,751,940 (Year 2), and $2,751,940 (Year 3), totaling $8,255,820. The cost for both the assessment and curriculum was $9,934,813 (Year 1), $9,934,813 (Year 2), and $9,934,813 (Year 3), totaling $29,804,438.
2. The assessment cost of $5.70 per student is less expensive than Amplify and includes more features, such as 3,000 teacher-directed lessons and remote student and parent access to Istation's iPractice.
3. $76,103 for professional development offers 22 onsite trainings, 14 recorded live webinars, and 10 virtual teacher trainings annually; in addition, the vendor will provide up to 5 additional onsite trainings and 10 recorded webinars annually at no additional cost.
4. The vendor will provide additional professional development beyond these allowances at a rate of $5,800.00 per day of professional development and $550.00 per webinar.
5. The cost for professional development also covers logistics, which includes securing learning facilities, paying the cost to host the training, coordinating training dates, communication to participants, etc.
Weaknesses:
1. The solution is not compatible with screen readers or keyboards and will cost extra to ensure compatibility.

C. Vendor Financial Stability

The strengths and weaknesses identified by the Evaluation Team for the responsive vendors are summarized below.

Amplify
Strengths: The NCDPI Financial Director finds no going concern.
Weaknesses: None

Istation
Strengths: The NCDPI Financial Director finds no going concern.
Weaknesses: None

D. Formative and Diagnostic Assessment

The strengths and weaknesses identified by the Evaluation Team for the responsive vendors are summarized below.

Amplify
Strengths:
1. The assessment covers all five areas of early literacy, which is mandated by law. The service has the capability to appropriately assess K-3 students.
2. The Amplify service has enough item pool for 20 assessments (i.e., the number of items aligned to NC standards is enough for 20 tests). It is also to be noted that schools have three tests per grade level for this age group.
3. The reports are easily understandable. Home Connect letters for parents are clear. There are multiple reports for teachers about instruction and areas that need intervention.
Weaknesses:
1. Benchmarking and progress monitoring per student per grade level consumes a lot of time and requires excessive teacher involvement to manually administer and enter test results. The fixed-form manual test takes more testing time to find where students are. This takes away significant instructional time.
2. The $8 option is not adaptive, i.e., it does not measure the student's exact level of achievement. It was difficult to gauge from the proposal response how the service adapts when students gain mastery.
3. The fixed-form tests don't always provide feedback on the student's exact level of achievement, which calls into question the effectiveness of the data-driven instructional support.

Istation
Strengths:
1. The adaptive assessment (also known as computer adaptive assessment) allows students to reach their full potential. The assessment measures a student's mastery with a minimal amount of teacher time.
2. The aggregate reports for teachers are easy to read and interpret.
3. Istation has enough item pool for 10 assessments (i.e., the number of items aligned to NC standards is enough for 10 tests). It is also to be noted that schools have three tests per grade level for this age group.
Weaknesses: None

E. Personalized Learning

The strengths and weaknesses identified by the Evaluation Team for the responsive vendors are summarized below.

Amplify
Strengths:
1. Personalized learning was only offered in the alternate cost proposal, which came with increased pricing.
2. Progress monitoring, when a student is identified as at risk for achievement, is at the individual skill level.
3. Amplify offers a dyslexia component.
Weaknesses:
1. The basic cost proposal offered does not have all aspects of personalized learning and is not computer adaptive.
2. Progress monitoring for students at risk requiring intervention takes up a lot of time for teachers. The basic assessment solution option is not computer-based; it takes significant instruction time away from teachers, and the reliability and validity of results vary significantly.
3. Home Reading is not included in the bid offering. This limits students' access to resources outside of school, which limits their learning and the participation of parents.

Istation
Strengths:
1. The assessment is computer adaptive and caters to the individual student's needs.
2. The time for the assessment offered by Istation is 40 minutes per student, and it is fully online (i.e., the teacher can work with other students in class while a group of students is taking the assessment). Amplify's assessment is 45 minutes per student on the low end and requires teachers to spend time with the students while they are being assessed. The reduced assessment time and the fact that the teacher does not have to be with students who are being assessed (using the computerized model) allow teachers more time to support students' individual needs.
3. Istation allows students to see their own academic needs and take responsibility for their learning by providing feedback after each subtest. This feedback is available to students, parents, and teachers. Further, students are allowed access outside of school. They can personalize their learning by choosing games and activities to further enhance their learning.
Weaknesses:
1. Although Istation stated that their assessment can be used to screen for dyslexia, the vendor does not have a separate dyslexia component at this time.

Section 7: Finalist Vendor(s)

NCDPI entered into negotiations with both vendors. Each vendor was given the opportunity to present their assessment solution and how it would best meet the needs of the department.

Clarification 1 was issued to both vendors, extending their RFP bid submission, as the proposal response was used in the negotiation process.

Clarification 2 was issued to both vendors prior to the negotiation meeting. The questions provided in this request were focal points during the meeting. This clarification request also gave the Evaluation Team some guidance and understanding of both vendors' offerings. After the negotiation meeting held on April 11, 2019, the team unanimously agreed to continue further negotiation efforts with Istation.

In Clarification 3, Istation was asked by NCDPI to provide the cost of both the assessment and curriculum. This request was made to compare against Alternative Cost Proposal 2 submitted by Amplify, which included the curriculum portion.
After reviewing Istation's submission, the team agreed to go with only the assessment portion, which is required in legislation. While there was interest in the curriculum offering, it is not required by the law.

Clarification 4 was issued to negotiate the Terms of Use and Privacy Policy that Istation has in place. NCDPI's legal team negotiated the language that was provided by Istation. Istation was in agreement and signed the clarification, giving the department permission to incorporate it in the final contract offering.

While Amplify was able to submit an offer to satisfy the agency's needs, it was not cost effective. Under Amplify as the incumbent, the progress made by students in reading has not been significant, the effectiveness of the data-driven instructional support is questionable, and the current test scores do not support the inflated cost offered by Amplify. Istation provided a solution that was robust and cost effective, offered additional enhancements that were required, and met the business needs of NCDPI. While Istation's dyslexia component may be missing key measures, the service substantially conforms to the requirements specified under N.C.G.S. 115C-83.1, which is the primary obligation of this procurement.

Negotiations were conducted with Istation and memorialized in BAFO # 40-20680730A dated June 4, 2019, in which Istation agreed to the following changes in specifications: ADA compliance (high contrast reports), voice recognition software, onsite training and recorded webinars, growth calculation, summer reading camps, customizations and enhancements, and BAFO cost, as well as modifications to the Istation Terms of Use and Privacy Policy, which comprise the license grant and agreement for the State's use of the Istation resources. Istation also completed the Vendor Security Assessment Guide (VRAR), which was reviewed and approved by NCDPI and DIT technical teams.

Section 8: Award Recommendation

The Evaluation Committee has determined that Istation's bid substantially conforms to the specifications and requirements of the law and therefore recommends awarding RFP No. 40-RQ20680730A to Imagination Station Inc. (Istation) in the amount of $8,405,820 (Year 1: $2,751,940; Year 2: $2,751,940; Year 3: $2,751,940) for two (2) years with the option of one (1) additional one-year renewal.

Section 9: Supporting Documentation

The following supporting documents that reflect the vendor selection are included:
1. Bid Response
2. Clarification documents – Signed
3. BAFO document
4. Hosting Exception and Privacy and Threshold Analysis (approved by DIT)

EVALUATION COMMITTEE MEMBER'S CONFIDENTIALITY AGREEMENT
For RFP # 40-RQ20680730 – Read to Achieve Diagnostics – Software as a Service (RtAD-SaaS)

Pursuant to North Carolina Administrative Code 09 NCAC 06B.0103, all information and documentation (verbal and written) relative to the development of a contractual document is deemed "confidential" and shall remain confidential until successful completion of the procurement process. Therefore, Evaluation Committee Members (both voting and contributing advisors) are required to keep all comments, discussions, and documentation confidential until a notification of award has been made by the Issuing Agency for this solicitation. By participating in this Evaluation Committee, you agree not to divulge any information to an unauthorized person in advance of the time prescribed for its authorized release to the public. This includes coworkers, supervisors, family, friends, etc.
If it is discovered that there has been a breach of confidentiality by a member of this Committee, he/she will be immediately excused by the Committee Chair until further notice. The solicitation may be cancelled and a new solicitation may be issued with a new Evaluation Committee. In addition, the issue will be referred to the employee's department director or agency head. Department directors or the heads of autonomous agencies shall be responsible for the preliminary examination and investigation of reports from employees of any violations which compromise the procurement process. If, following a preliminary examination and investigation, the department director or agency head finds evidence of a violation or finds that further investigation is warranted, a report shall be submitted to the respective Human Resources Office for potential disciplinary action.

By signing below, I certify that, as a member of this Evaluation Committee, I will keep all comments and discussions, preliminary/working evaluation notes, and all other information (verbal and written) regarding the above referenced solicitation confidential until after a notification of award has been made by the Issuing Agency.

Signature: Samiel Fuller        Date: ____________

EVALUATION COMMITTEE MEMBER'S STATEMENT REGARDING CONFLICT OF INTEREST AND DISCLOSURE
For RFP # 40-RQ20680730 – Read to Achieve Diagnostics – Software as a Service (RtAD-SaaS)

The following organizations have submitted a bid proposal and response to the above solicitation:
1. Amplify Education Inc.
2. Curriculum Associates (i-Ready)
3. Imagination Station Inc. (Istation)
4. NWEA

Each member involved in the evaluation process must verify that he/she has no personal, financial, business, or other conflicts of interest with regard to this procurement and his/her official duties as an evaluator. North Carolina General Statute § 143-58.1 prohibits unauthorized use of public purchase or contract procedures for private benefit. Therefore, by signing this statement, you certify that neither you nor members of your immediate family currently have, or expect to gain, any personal, financial, business, or other benefit from the potential contract awarded to any of the competing, bidding vendors listed above; and that neither you nor members of your immediate family have any potential conflicts of interest in the organization(s) listed above, including any subcontractor referenced in their respective proposals, that could influence, or be reasonably perceived as influencing, your evaluation or recommendations for this solicitation.

If it appears that a potential conflict of interest exists between your official duties as an evaluator and your personal interests, you will be excused from participation by the Evaluation Committee Chair. Please return this form unsigned and a replacement evaluator will be assigned; you need not disclose the relationship or conflict. In addition, the issue will be referred to the employee's Department Director or Agency Head. Department Directors or the Heads of autonomous agencies shall be responsible for the preliminary examination and investigation of reports from employees of any violations which compromise the procurement process.
If, following a preliminary examination and investigation, the Department Director or Agency Head finds evidence of a violation or finds that further investigation is warranted, a report shall be submitted to the respective Human Resources Office for potential disciplinary action.

By signing below, I certify that I do not have, nor does any member of my immediate family have, any personal, financial, business, or other conflicts of interest in the bidding vendors listed above.

Signature: ____________        Date: ____________

Read to Achieve 2018 (RtAD) Evaluation Consensus Meeting Notes
Meeting to update the team on the status of the RtA procurement

Location: Conference Room 385, Education Building, Raleigh
Date & Time: March 8, 2019; 9:00 AM – 10:00 AM
Facilitator(s): Tymica Dunn
Next Meeting: TBD

Voting Member Participants: Cynthia Dewey, Kristi Day, Pam Shue, Susan Laney, Thakur Karkee
Non-Voting Member Participants: Jonathan Sink, Lynne Loeser, Matt Hoskins, Chloe Gossage, Srirekha Viswanathan, Tymica Dunn

Meeting Purpose / Agenda Items
The agenda for this meeting was to update the evaluation team on the status of the procurement.

Meeting Summary
At the start of the meeting, the Procurement Officer informed the team that the participants would be addressed by the General Counsel. The General Counsel emphasized the importance of confidentiality and objectivity in an RFP procurement. He added that one of the voting members had breached the confidentiality of the procurement process, which jeopardized the legality of this procurement. It should also be noted that the team did not reach a unanimous consensus on the choice of the finalist vendors. Because of these issues, the current Read to Achieve procurement has to be cancelled again. Discussions were underway with DIT on the best possible approach to proceed. The meeting was adjourned.

Next Steps: Guidance from DIT. Action items resulting from the meeting are as follows.
Action Items: Item / Assignee / Due Date / Status

Read to Achieve 2018 (RtAD) Evaluation Consensus Meeting Notes
Consensus Meeting to recommend finalist for negotiations

Location: Conference Room 504 A, Education Building, Raleigh
Date & Time: January 8, 2019; 1:30 PM – 3:00 PM
Facilitator(s): Srirekha Viswanathan and Tymica Dunn
Next Meeting: TBD

Voting Member Participants: Chloe Gossage, Cynthia Dewey, Kristi Day, Lynne Loeser, Matt Hoskins, Pam Shue, Rebecca Belcastro, Abbey Whitford, Susan Laney, Thakur Karkee
Non-Voting Member Participants: Mark Johnson, Srirekha Viswanathan, Tymica Dunn

Meeting Purpose / Agenda Items
The agenda for this meeting was to recommend a finalist for approval and negotiations.

Meeting Summary
The Superintendent thanked the evaluation team for their hard work and time spent on this most important RFP. He also mentioned that he had reviewed the proposals over the holidays to get a full understanding of the various offerings. The Superintendent discussed his vision of empowering teachers and giving teachers their time back to teach. Empowering teachers includes providing teachers the right tools and appropriate professional development and training. It is important to allow teachers to teach by reducing assessment time. He requested voting members to keep this vision in mind while making recommendations on the vendor(s) for negotiations. To maintain the integrity of the process, he stepped out and requested the voting team members to proceed with voting.
The next steps in this process, i.e., recommendations by voting members, approval by the Superintendent, and negotiations, were elaborated by the Business Owners and Procurement Officer. To further ensure that an impartial and unbiased process was followed, the voting members were provided "Post-It" cards to enter their recommendations. Sri tallied the votes and the recommendation was announced to the team.
• Six (6) voting members recommended negotiating with Amplify only.
• Three (3) voting members recommended negotiating with Istation only.
• One (1) voting member recommended negotiating with both Amplify and Istation.

The team discussed further and recommended that, in order to align with the vision of the Superintendent, it is important that if negotiations are conducted with Amplify, the assessment measures be reduced to the core measures of DIBELS. The current implementation package includes TRC, and it takes away significant teaching time. The team also noted that when negotiations are held with Istation, it is important to further understand their recording and playback feature, as it may also impact teaching time. In all, the team felt that it is important to understand the overall assessment time with both vendors and work toward reducing the assessment time.

Next Steps: The Business Owner will provide an update to the Superintendent on the team's recommendation. Upon approval from the Superintendent, the next steps will be planned. Action items resulting from the meeting are as follows.

Action Items (Item / Assignee / Due Date / Status):
• Inform the State Superintendent of the team recommendation / Dr. Pam Shue / 1-9-19
• Gather negotiation questions and get team input on the questions / Sri / 1-15-19
• Set up meetings with the finalist vendor / Tymica Dunn / TBD

Read to Achieve – 2018
December 4th, 2018
Business Owner(s): Dr. Amy Jablonski, Dr. Pamela Shue
Project Manager: Srirekha Viswanathan
Procurement Officer: Tymica Dunn

Background • Review • Ranking and Weaknesses • Next Steps

Background

SECTION 7.27.(b) The State Superintendent shall issue a Request for Proposals (RFP) to vendors of diagnostic reading assessment instruments to provide one or more valid, reliable, formative, and diagnostic reading assessment instrument or instruments for use pursuant to G.S. 115C-174.11. At a minimum, the diagnostic reading assessment instrument or instruments provided by the selected vendor shall meet all of the following criteria:
a. Yield data that can be used with the Education Value-Added Assessment System (EVAAS).
b. Demonstrate close alignment with student performance on State assessments.
c. Demonstrate high rates of predictability as to student performance on State assessments.

SECTION 7.27.(c) The State Superintendent shall form and supervise an Evaluation Panel to review the proposals received pursuant to the RFP issued in accordance with subsection (b) of this section. The Evaluation Panel shall be composed of persons employed within the Department of Public Instruction. By December 1, 2018, the Evaluation Panel, with the approval of the State Superintendent, shall select one vendor to provide the assessment instrument or instruments for the 2019-2020 school year.
Where we are …
RFP Posted → Demonstration & Evaluation → Finalist Selection & Negotiation

Evaluation Ranking
The evaluation criteria are stated in relative order of importance. Rankings by vendor (Amplify Education Inc. / Curriculum Associates / Istation / NWEA):
• Substantial Conformity to Specification: 1 / 3 / 2 / 4
• RtAD Desired Specifications: 1 / 3 / 2 / 4
• Proof of Concept / Demonstration: 1 / 3 / 2 / 3
• Vendor Cost Proposal: 4 / 3 / 1 / 2
• Strength of References: 1 / 1 / 1 / 1
• Vendor Financial Stability: 4 / 1 / 1 / 1
• Overall Rank: 1 / 3 / 2 / 4

Evaluation Ranking – Ranked 4th: NWEA (MAP Assessment)

Strengths
1. The computer adaptive nature of the assessment helps each student stay engaged.
2. Good reporting features based on the data collected from the screening measure.
3. Parent communications could be available in multiple languages other than Spanish and English, but that requires customizations.

Weaknesses
1. Progress monitoring is not yet in place and is currently under development.
2. The progress monitoring tool currently under development is the only progress monitoring tool that is going to exist, because the benchmark assessments can only be given three times a year.
3. This tool cannot accurately identify risk indicators for dyslexia, and the company has not provided any data for the same. Their statement in the proposal was that the developers expect the service to be a sensitive and specific screener for dyslexia. This will require multiple tools for assessment.
4. Formative assessment is only given to some students because, once students read independently, the fluency assessment is optional.
5. The proposal was not for a statewide implementation.
6. The vendor has mentioned that they will negotiate with the state on the proposed security standards and has not given a clear timeline for the SOC 2 Type II audit.
7. Equity of technology in schools may lead to loss of instructional time.

Evaluation Ranking – Ranked 3rd: Curriculum Associates (i-Ready Assessment)

Strengths
1. The computer adaptive nature of the assessments helps each student stay engaged.
2. The service has sound identification that is well described.
3. This service has the Standards Mastery Results (Student) report, which helps teachers understand how students performed on an assessment, including how students performed on each skill in the assessment.

Weaknesses
1. Fluency has not been developed and will not be available until the 2021 school year.
2. i-Ready is not a reliable screener for dyslexia because it lacks measures for fluency and nonsense word recognition. This will require multiple tools for assessment.
3. The approach for service deployment is not statewide but by district. Also, student transfers from one district to another are a manual process and will take about 48 hours.
4. The reporting feature requires a lot of customizations. Some reports have to be requested from the vendor and will not be immediately available to school districts.
5. The vendor has mentioned that the SOC 2 Type II assessment will be completed by Summer 2019. (This may compromise contract award.)
6. Equity of technology in schools may lead to loss of instructional time.

Evaluation Ranking – Ranked 2nd: Istation (Istation Assessment)

Strengths
1. The assessment is adaptive in nature and adjusts to each student's true abilities in early literacy.
2. Teachers are incorporated in this service for early education, in that the student's reading is recorded and the teacher will play it back and grade the student.
3. In this assessment, students internalize the learning goals and will be able to set targets for themselves. A student's self-assessment process allows a transition to independent learning.
4. Has robust reporting capabilities.

Weaknesses
1. Text fluency and oral language are not a part of the overall ability score. Fluency is a new assessment.
2. No method to determine decoding.
3. This assessment is not diagnostic in nature.
4. Istation is not a reliable screener for dyslexia because it lacks some key measures for dyslexia risk factors, like letter naming fluency. This will require multiple tools for assessment.
5. The vendor has mentioned that the SOC 2 Type II assessment will be completed by Spring 2019. (This may compromise contract award.)
6. Equity of technology in schools may lead to loss of instructional time.

Evaluation Ranking – Ranked 1st: Amplify Education Inc. (mClass Assessment)

Strengths
1. Offers online as well as observational assessment.
2. The core measures of DIBELS are a valid and reliable screener for risk factors for dyslexia.
3. Offline assessment is available.
4. The service has robust reporting capabilities.
5. The service is SOC 2 Type II certified.

Weaknesses
1. There are many assessment measures that need to be turned off.

Cost
Vendor / Proposed Cost per Student / Annual Cost / Rank:
• Amplify Education Inc.: $25.78 / $12,102,096.08 / 4
• Curriculum Associates: $22.48 / $10,551,955.67 / 3
• Istation: $6.60 / $3,098,606.17 / 1
• NWEA: $21.14 / $9,925,148.58 / 2

Notes:
1. Costs will be negotiated with finalists.
2. Costs include the potential cost for headsets.
3. For vendors who provided multiple costs, the higher cost was considered.
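As a quick arithmetic cross-check of the Phase 1 Cost slide above, the short sketch below re-derives the Rank column by sorting the proposals from lowest to highest annual cost. The figures are copied verbatim from the slide; the lowest-cost-first ranking rule is an inference from the slide, not a documented DPI scoring formula.

```python
# Cost figures copied from the Phase 1 "Cost" slide; the ranking rule
# (lowest annual cost = rank 1) is inferred from the slide, not a
# documented DPI formula.
proposals = {
    "Amplify Education Inc.": {"per_student": 25.78, "annual": 12_102_096.08},
    "Curriculum Associates":  {"per_student": 22.48, "annual": 10_551_955.67},
    "Istation":               {"per_student": 6.60,  "annual": 3_098_606.17},
    "NWEA":                   {"per_student": 21.14, "annual": 9_925_148.58},
}

# Rank vendors from least to most expensive annual cost.
ordered = sorted(proposals.items(), key=lambda item: item[1]["annual"])
for rank, (vendor, cost) in enumerate(ordered, start=1):
    print(f"{rank}. {vendor}: ${cost['per_student']:.2f}/student, "
          f"${cost['annual']:,.2f}/year")
```

Running this reproduces the slide's ordering (Istation 1, NWEA 2, Curriculum Associates 3, Amplify 4), whether sorted by annual cost or by cost per student.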
Next Steps
1. Finalist Identification & Negotiations
2. Best and Final Offer (BAFO)

Read to Achieve 2018 (RtAD) Evaluation Consensus Meeting Notes
Consensus Meeting to rank the proposal vendors

Location: State Board Room, Education Building, Raleigh
Date & Time: November 19, 2018, 8:30 AM – 5:00 PM, and November 20, 2018, 8:30 AM – 2:30 PM
Facilitator(s): Linda Lowe and Srirekha Viswanathan
Next Meeting: TBD

Voting Member Participants: Amy Jablonski, Chloe Gossage, Lynne Loeser, Matt Hoskins, Pam Shue, Rebecca Belcastro, Susan Laney, Kristi Day, Abbey Whitford, Thakur Karkee, Cynthia Dewey
Non-Voting Member Participants: Courtney Moates, Constance Bridges, Deborah Wilkes, Erika Berry, Giancarlo Anselmo, Gin Hodge, Julien AlHour, KC Hunt, K.C. Elander, Linda Lowe, Meera Phaltankar, Mia Johnson, Paola Pilonieta, Shaunda Cooper, Srirekha Viswanathan, Tonia Parrish, Tymica Dunn

Meeting Purpose / Agenda Items
The agenda for this meeting was to discuss the evaluation notes from the independent reviews by voting and non-voting members, reach consensus to rank the proposals, and determine the next steps in this procurement.

Meeting Summary (11-19-2018)
This meeting summary includes notes from the meetings on 11-19-2018 and 11-20-2018.

1. Sri kicked off the meeting by thanking the participants for their thorough review and participation at the consensus meeting.
2. The intent of the meeting and the approach to evaluating all the criteria were discussed, including being objective, impartial, unbiased, and fair in all aspects of the evaluation process and arriving at a consensus. All proposals should be ranked consistently. Consensus means general agreement, not unanimity.
3. The six evaluation criteria in the proposal were reiterated:
   a. Substantial Conformity to Solicitation Specifications
   b. RFP Desired Specification
   c. Proof of Concept / Demonstration
   d. Vendor Cost Proposal
   e. Vendor Relevant Experience and Reference Checks
   f. Vendor Financial Stability
4. All responsive vendors were evaluated on all six evaluation criteria.
5. To evaluate substantial conformity to specifications, the team unanimously agreed to take the following approach:
   a. Review the legislatively mandated specifications for all responsive vendors.
   b. Vendors deemed substantially conforming to statutory requirements would be further evaluated on all RFP specifications, to ideally reach a group agreement and further rank the vendors.
   c. Those vendors that were not substantially conforming to statutory requirements were ranked lower by the team for this evaluation criterion.
6. The following ratings were used for each specification:
   a. "Yes" implies conforms to specifications.
   b. "No" implies does not conform to specifications.
   c. "Maybe" implies that the team is unsure about conformity.

Discussion during the consensus meeting is summarized below. The voting members were issued three colored cards: green to show compliance with a specification, pink to show that the specification was not complied with, and yellow to show that there may have been compliance. In the case of Maybe responses, further clarifications may occur during negotiation, prior to Best and Final Offer (BAFO) submission and award. Negotiation questions matter for vendors in the competitive range that are selected for further consideration. The voting members discussed each mandatory requirement in full and arrived at a consensus by showing the appropriate cards. Outcomes from the consensus meeting for the various specifications are provided in a separate spreadsheet for each bidder, with appropriate strengths and weaknesses. The proposals were taken up in alphabetical order for ranking.

1. Substantial Conformity to Specifications: Review of Legislated Specifications

Amplify Education Inc.

Business Specification 1: Describe how the proposed solution directly assesses reading and pre-reading behaviors to support students' learning development at the various grade levels to inform instruction, including any observation-based practices if applicable:
a. oral language (expressive and receptive)
b. phonological and phonemic awareness
c. phonics
d. vocabulary
e. fluency
f. comprehension

Consensus Ranking: The voting members were unanimous in their agreement that Amplify complied with this specification. Two of the voting members mentioned that while online versions are available for students with appropriate self-regulation and computer skills, teachers continue to have the option to directly assess/observe students. The voting members were 11-0 Yes on Amplify's ability to comply with this specification.

Business Specification 3: Describe the validity and reliability of the assessment in the following areas:
a. oral language
b. phonological and phonemic awareness
c. phonics
d. vocabulary
e. fluency
f. comprehension

Consensus Ranking: Two of the voting members mentioned that the data is good and reliable on most assessments using DIBELS. Oral Language reliability data was sound. TRC data for Inter-Rater Reliability (IRR) is low in many areas.
The voting members voted 10 Yes and 1 Maybe.

Negotiation Question: TRC online shows concurrent validity to two measures of reading. Need the alpha data for the lower online TRC book levels. Early literacy measures to be included as part of negotiations.

Business Specification 5: Describe how the assessment identifies and reports students who may need intervention and enrichment.

Consensus Ranking: The voting members were unanimous in their Yes votes for Amplify on this specification because the team felt that multiple reports and data are available for teachers about instruction and areas that need intervention.

Business Specification 6: Describe how the following characteristics for progress monitoring between benchmarks are met by the proposed solution:
a. brief
b. repeatable
c. sensitive to improvement over time, including short-term change
d. multiple equivalent forms of screening assessments that enable the teacher to gauge short-term growth (weekly or every other week)
e. reliable
f. valid
g. measure accuracy and fluency with skills
h. quantitative results charted over time to calculate and document rates of improvement
i. allow for off-grade-level progress monitoring
j. ability for the results to be graphed against a goal (national norms and/or individualized goals) with 12-14 data points in 10 weeks' time

Consensus Ranking: The team voted 8 Yes and 3 Maybe on the question of Amplify's progress monitoring meeting the characteristics defined above. Some of the team members felt that TRC did not meet all of the above characteristics; however, DAZE as an outcome measure of reading comprehension does meet them. The team felt that teachers choose their own book outside of the Atlas set for progress monitoring, which probably impacts reliability and validity.

Negotiation Question: As part of further negotiations, the team agreed that further data is required on TRC's validity and on customizations needed for Oral Language measures.

Business Specification 8: Describe how the measures align with best practices and adequately and accurately identify indicators of risk for dyslexia in grades K-3 as outlined in NC Session Law 2017-127: http://www.ncleg.net/Sessions/2017/Bills/House/PDF/H149v4.pdf
Business Specification 10: Describe how the system incorporates educators and/or students using digital devices to assess reading and prereading behaviors. AN C Consensus Ranking: The team was unanimous in their votes about Amplify’s ability to assess reading and pre-reading behaviors of students and voted 11-0 Yes for this specification because of the availability of online and observational assessment. Business Specification 11: Describe how the proposed solution is a formative reading assessment(s) tool for grades K, 1, 2, 3. C Consensus Ranking: The team voted 9 Yes and 2 Maybe on this specification. Some members felt that the proposal response did not adequately respond to this question. While Amplify shared research in support of formative assessment, the response did not include how the proposed solution is a formative reading assessment for grades K-3. Some SMEs also mentioned that individual skills measured by DIBELS assessments lend to formative assessment of different isolated skills. However, TRC component did not appear to be easy for formative reading assessment. FP Business Specification 12: Describe how the proposed solution is a diagnostic reading assessment(s) tool for grades K, 1, 2, 3. R Consensus Ranking: The team voted 10 Yes and 1 Maybe. There was a question whether the TRC and MSV type analysis is truly diagnostic in nature and whether TRC’s diagnostics capacity is dependent on the teachers’ ability to interpret the student’s responses. Business Specification 15: Describe how the proposed solution minimizes impact to instructional time while ensuring formative and diagnostic assessments are conducted. Provide estimates of assessment time, for both benchmarking and progress monitoring, per student per grade. RtAD-Consensus Meeting_11192018 & 11202018 Page 4 of 31 Meeting Agenda & Summary D Consensus Ranking: The team voted 7 Yes, 3 No and 1 Maybe on this specification. Some members felt that the response was described well, however the estimate of time was not answered. It was unclear as to the estimates of assessment time, for benchmarking and progress monitoring per student per grade level. Some voting members pointed out that time differentials between the online and observational versions of assessment was provided in the demonstration clarification document. It was also mentioned that TRC assessment could take longer and would require further negotiations and customizations. EL LE Business Specification 17: Describe how the content standards will be aligned and realigned to State Board of Education adopted ELA Standard Course of Study (Spring 2017). Provide specific mapping to the current standards. http://www.ncpublicschools.org/curriculum/languagearts/scos/. Consensus Ranking: The team voted 7 Yes, 2 No and 2 Maybe for this specification. The members pointed out that during the demo the online instruction was aligned, however the gaming piece was not aligned to the Standard Course of Study. There were insufficient examples for alignment for the assessment piece. It is also unclear as to how the assessment questions are aligned to the ELA SCoS. C Business Specification 19: Explain how the proposed solution can yield data that can be used with EVAAS. Describe and provide any information that explains any alignment or relationship between the assessment and the Education Value-Added Assessment System (EVAAS). 
http://www.ncpublicschools.org/effectiveness-model/evaas/ https://www.sas.com/en_us/software/evaas.html

Consensus Ranking: The EVAAS expert on the evaluation panel mentioned that the service can yield the data required for EVAAS, and the team unanimously voted 11-0 Yes on Amplify's ability to provide data for EVAAS.

Business Specification 24: Describe how the Benchmarking process occurs in the proposed solution. NCDPI expects benchmarking three times a year for grades K, 1, 2 and 3.

Consensus Ranking: The team voted 11-0 Yes on Amplify's ability to benchmark according to the State Board's guidelines.

Reporting Specification 3: Reporting feature is expected to provide the following capabilities: a. timely assessment results to teachers/administrators b. timely assessment results to parents/guardians c. reporting results at the district, school, grade, teacher, group, and individual student level by all ESSA subgroups d. an end-of-year student summary report for cumulative folder historical data year after year to identify consistent gaps and learning trends at the district, school, grade, teacher, group, and individual student level by all subgroups. For each of the above, provide a timeframe for how frequently the data is refreshed (real-time, on demand, or some other interval).

Consensus Ranking: The Business Team voted 11-0 in favor of Amplify's service providing timely assessment results. The team felt that the teacher reports are easy to read and interpret. The reporting feature provides drill-down capability into the previous year's assessment results. The service allows creating unique groups and assigning view rights to assorted individuals.

Reporting Specification 4: Provide communication to parents in a format that is clear and easy to understand after each benchmark.

Consensus Ranking: The team voted 7 Yes and 4 Maybe for this specification. The team felt that there should be different methods of communicating with parents, and this was not explained clearly. Some members felt that parent communication should be available in different languages, and this was not clear. The team agreed that the Home Connect letters are easily understandable.

Negotiation Question – Engagement webinars need to be archived and available for parents.

Technical Specification 6: This service will be classified as "Program Critical/Moderate" based on the sensitivity of the data used; the security controls under the "Moderate" category column need to be implemented. The vendor's security policy should include all the control categories as specified under the "Moderate" classification. Please refer to pages 4 through 10 for the security control baselines table in the State Information Security Manual document. For example, AC-1 (Access Control Policy and Procedures) under the "Access Control" Family/Category is discussed in detail in the NIST 800-53 document. NC Statewide Information Security Manual: https://it.nc.gov/documents/statewide-information-security-manual
a) Describe how you will ensure compliance to the NC Statewide Security Manual.

Consensus Meeting – Without much detail, the vendor said they would comply with the Security Manual. The use of developers in Ukraine by this service was brought up. There is a letter from the vendor stating that they do not use foreign workers for development under the current contract.
This needs to be further clarified: is the request about offshore development for the new RFP, or is the vendor currently engaging their offshore developers for development? This will be a risk to be escalated if the vendor currently engages this development team. The Security SME did clarify that the intent of the question is whether data goes to Ukraine and whether the developers from Ukraine can log in and see the data. The vendor has mentioned that the production data will not be shared. Previously, under the current agreement, DPI verified that all data resides within the United States. Clarification was also provided that under the current contract with DPI the vendor has demonstrated compliance with the NC Statewide Security Manual; if they had not, DPI would not have been approved by DIT to renew contracts. The team voted 11 Maybe because clarification needs to be sought from the vendor about the use of developers in Ukraine.

Negotiation Question – Clarify whether the request for offshore development is for the new RFP or whether the vendor is currently engaging their offshore developers for development. This will be a risk to be escalated if the vendor currently engages this development team.

Technical Specification 35: Provide a 3rd party attestation, one of the following based on the system proposed: SOC2 Type II, SSAE-16, FEDRAMP, ISO 27001.

Consensus Meeting: The team voted 11 Yes to this question because they were informed by the IT Security expert on the team that the vendor has completed the SOC 2 Type II audit.

Project Management Specification 1: Include an initial schedule and the associated Work Breakdown Structure (WBS) for the proposed implementation plan. The Project Schedule in the proposal is to include significant phases, activities, tasks, milestones and resource requirements necessary for NCDPI to evaluate the plan.

Consensus Meeting: The voting team unanimously voted Yes for this question because this service can be implemented for the 2019 School Year. The team agreed that the Project Schedule is well laid out and that, if further enhancements are required under the new contract, they can be completed prior to the start of the 2019 School Year. One thing that needs to be added to the project plan is the timeframe for disaster recovery testing.

Negotiation Question – Update the schedule to include disaster recovery testing.

The team moved on to review Curriculum Associates for mandatory specifications.

Curriculum Associates (i-Ready)

Business Specification 1: Describe how the proposed solution directly assesses reading and pre-reading behaviors to support student's learning development at the various grade levels to inform instruction, including any observation-based practices if applicable: a. oral language (expressive and receptive) b. phonological and phonemic awareness c. phonics d. vocabulary e. fluency f. comprehension

Consensus Ranking: The voting members voted 8 No and 3 Maybe. Some voting members felt that the vendor is currently not measuring oral language and fluency; the vendor's rationale is that teacher interaction is needed for these measures. Also, in the demonstration clarifications this vendor indicated, "We propose working with NCDPI to identify one of the current traditional teacher-administered fluency assessments from another vendor and make it available to all RTAD participants, at no additional cost to our original RTAD RFP response," which raised many questions.
Some Subject Matter Experts mentioned that just measuring comprehension skills without fluency raised the question of how the foundational skills were assessed. They had questions about how a student could be strong in foundational reading yet still not be a fluent reader. Some members felt that the observational aspect is missing; in early reading assessment it is important to ask students to read aloud, and that key component is missing. The team felt that the company did not understand the value of adding fluency and that it was added because of demand, and cited that the timeline would need to be adjusted and planned to onboard another vendor if this vendor is chosen as the finalist. The team was also concerned about having two different normative data sets, one for fluency and another for all other measures. The measures for oral language were also very limited. CA has an estimated timeline for fluency development and does not say how close they are to completing development.

Negotiation Question: Need to discuss how the vendor would add the fluency component, along with any cost implications. More details are needed on how they can present this to the end user as a one-vendor package. Get a firmed-up timeline for fluency development.

Business Specification 3: Describe the validity and reliability of the assessment in the following areas: a. oral language b. phonological and phonemic awareness c. phonics d. vocabulary e. fluency f. comprehension

Consensus Ranking: The team voted 10 No and 1 Yes for this requirement because of the reasons cited above for fluency. It was mentioned that some team members were not able to access any of CA's psychometric data.

Business Specification 5: Describe how the assessment identifies and reports students who may need intervention and enrichment.

Consensus Ranking: The team voted 1 Yes, 4 No and 6 Maybe. It was said that the enrichment piece was not clear in the proposal, and the other voting members agreed. The team was doubtful about the validity and reliability of the assessment, and that is a concern for appropriately identifying students in need.

Business Specification 6: Describe how the following characteristics for progress monitoring between benchmarks are met by the proposed solution: a. brief, b. repeatable, c. sensitive to improvement over time, including short term change d. multiple equivalent forms of screening assessments that enable the teacher to gauge short term growth (weekly or every other week), e. reliable, f. valid, g. measure accuracy and fluency with skills h. quantitative results charted over time to calculate and document rates of improvement i. Allow for off-grade level progress monitoring j. Ability for the results to be graphed against a goal (national norms and/or individualized goals) with 12-14 data points in 10 weeks' time.

Consensus Ranking: The team voted 10 No and 1 Maybe on this question because of the reliability and validity of the assessment. Progress Monitoring appeared to be a shortened version of the assessment, and the team mentioned that as a concern: reliability almost always goes down to some extent when you reduce test items. Progress Monitoring itself takes 15 minutes.
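The shortened-form concern can be illustrated, outside the meeting record, with the classical Spearman-Brown prophecy formula, which predicts how reliability changes when a test is lengthened or shortened. The sketch below is in Python and uses hypothetical reliability figures, not any vendor's published statistics.

def spearman_brown(reliability_full, length_ratio):
    # Predicted reliability when the test length is scaled by length_ratio
    # (e.g., 0.5 for a half-length progress-monitoring form).
    return (length_ratio * reliability_full) / (1 + (length_ratio - 1) * reliability_full)

# Hypothetical figures: a full-length benchmark with reliability 0.90, cut to half length.
print(round(spearman_brown(0.90, 0.5), 2))  # about 0.82; shorter forms trend lower

Under these assumed numbers, halving the item count drops a 0.90 reliability to roughly 0.82, which is the direction of change the team was flagging.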
Business Specification 8: Describe how the measures align with best practices and adequately and accurately identify indicators of risk for dyslexia in grades K-3 as outlined in NC Session Law 2017-127: http://www.ncleg.net/Sessions/2017/Bills/House/PDF/H149v4.pdf

Consensus Ranking: The team voted 11 No because they did not think this service can be a reliable screener for dyslexia, as it lacks measures for fluency and nonsense (or pseudoword) word recognition.

Business Specification 9: Describe how the system uses developmentally appropriate practices to assess K-3 students.

Consensus Ranking: The team voted 1 Yes, 6 No and 4 Maybe. It was mentioned that the assessment is entirely online, and this is a problem for early learners, particularly for economically disadvantaged children who may lack access to computers. Young children would have to master computer skills before they can be assessed.

Business Specification 10: Describe how the system incorporates educators and/or students using digital devices to assess reading and prereading behaviors.

Consensus Ranking: The team voted 1 Yes, 1 No and 9 Maybe. Some team members felt that prereading behaviors cannot be assessed completely online. A voting member mentioned that online assessment can be limited to identification of sound but was concerned about how production of sound can be assessed for young children without the fluency component in place. The service's sound identification is well described.

Business Specification 11: Describe how the proposed solution is a formative reading assessment(s) tool for grades K, 1, 2, 3.

Consensus Ranking: The team voted 9 Yes, 1 No and 1 Maybe.

Business Specification 12: Describe how the proposed solution is a diagnostic reading assessment(s) tool for grades K, 1, 2, 3.

Consensus Ranking: The team voted 9 Yes and 2 Maybe. Some team members cited that in the proposal response the vendor had noted the following: "The Standards Mastery Results (Student) report helps teachers understand how students performed on an assessment, including how students performed on each skill in the assessment. This report also displays the actual assessment taken by a student along with the correct answer, the student's answer, and any misconceptions that may have led to an incorrect or partially correct answer. This information can help teachers understand which concepts an individual student is struggling with and potential reasons why."

Business Specification 15: Describe how the proposed solution minimizes impact to instructional time while ensuring formative and diagnostic assessments are conducted. Provide estimates of assessment time, for both benchmarking and progress monitoring, per student per grade.

Consensus Ranking: The team voted 1 Yes and 10 Maybe. Lynne mentioned that the assessments take 48 minutes, and one member mentioned that this company said a child may take 1 or 2 days to complete the assessment, which called into question the reliability and validity of the test results on the whole. Should there be guidelines around how long assessments can take by grade level if this vendor is a finalist? Some team members mentioned that the end user would have to make some movement by moving the mouse. The technology representatives mentioned that the session appears to remain active upon clicking on the web page. This needs to be clarified.
Negotiation Clarification: Is activity on the screen registered by clicking or by mouse movement?

Business Specification 17: Describe how the content standards will be aligned and realigned to State Board of Education adopted ELA Standard Course of Study (Spring 2017). Provide specific mapping to the current standards. http://www.ncpublicschools.org/curriculum/languagearts/scos/.

Consensus Ranking: Some voting members mentioned that the continuum is not aligned to standards and that it was a forced alignment. The standards mastery examples presented were not aligned. The team voted 3 Yes, 6 No and 2 Maybe.

Business Specification 19: Explain how the proposed solution can yield data that can be used with EVAAS. Describe and provide any information that explains any alignment or relationship between the assessment and the Education Value-Added Assessment System (EVAAS). http://www.ncpublicschools.org/effectiveness-model/evaas/ https://www.sas.com/en_us/software/evaas.html

Consensus Ranking: The EVAAS expert on the evaluation panel mentioned that the service can yield the data required for EVAAS, and the team unanimously voted 11-0 Yes on Curriculum Associates' ability to provide data for EVAAS.

Business Specification 24: Describe how the Benchmarking process occurs in the proposed solution. NCDPI expects benchmarking three times a year for grades K, 1, 2 and 3.

Consensus Ranking: The team unanimously voted Yes for this specification.

Reporting Specification 3: Reporting feature is expected to provide the following capabilities: a. timely assessment results to teachers/administrators b. timely assessment results to parents/guardians c. reporting results at the district, school, grade, teacher, group, and individual student level by all ESSA subgroups d. an end-of-year student summary report for cumulative folder historical data year after year to identify consistent gaps and learning trends at the district, school, grade, teacher, group, and individual student level by all subgroups. For each of the above, provide a timeframe for how frequently the data is refreshed (real-time, on demand, or some other interval).

Consensus Ranking: The team voted 1 Yes, 9 No and 1 Maybe. The team was concerned about the complexity and difficulty of manually exporting data to create subgroups. They wanted to know how long it would take to customize and add an additional report should an outside request be made for certain reports that are not readily available, and what the response time would be.

Negotiation Question: Can reports be customized for subgroups? What is the turnaround time and cost impact if requests for custom reports are made? Should this be included in the SLA?

Reporting Specification 4: Provide communication to parents in a format that is clear and easy to understand after each benchmark.

Consensus Ranking: The team voted 4 Yes, 2 No and 5 Maybe. Two team members said they read in the response that the parent had to log on to get the report. It was clarified that the report is acceptable and that it can be printed and sent home by the teacher if the parent is unable to access a computer.

Technical Specification 6: This service will be classified as "Program Critical/Moderate" based on the sensitivity of the data used; the security controls under the "Moderate" category column need to be implemented. The vendor's security policy should include all the control categories as specified under the "Moderate" classification.
Please refer to pages 4 through 10 for the security control baselines table in the State Information Security Manual document. For example, AC-1 (Access Control Policy and Procedures) under the "Access Control" Family/Category is discussed in detail in the NIST 800-53 document. NC Statewide Information Security Manual: https://it.nc.gov/documents/statewide-information-security-manual
a) Describe how you will ensure compliance to the NC Statewide Security Manual.

Consensus Meeting – The team voted a unanimous 11 Maybe for this specification, as CA stated in their proposal response, "i-Ready Diagnostic is a SaaS product, so that some of the policies in the NC Statewide Security Manual do not apply. However, we feel i-Ready meets the intent of the security practices and policies as outlined." In their subsequent clarification the vendor indicated that they will complete the SOC 2 Type II in the summer of 2019. The Security Officer clarified that he could not clearly say whether they would want to negotiate because they are a SaaS shop.

Negotiation Question – If this vendor is a finalist, their timeframe for SOC 2 Type II should be clarified, and confirmation is also required about their compliance with the Security Manual.

Technical Specification 35: Provide a 3rd party attestation, one of the following based on the system proposed: SOC2 Type II, SSAE-16, FEDRAMP, ISO 27001.

Consensus Meeting: The team voted 11 Maybe for this specification because of the reasons cited above for Technical Specification 6. Currently DIT will not permit agencies to issue new contracts for moderate level solutions that have not received an acceptable 3rd party attestation.

Project Management Specification 1: Include an initial schedule and the associated Work Breakdown Structure (WBS) for the proposed implementation plan. The Project Schedule in the proposal is to include significant phases, activities, tasks, milestones and resource requirements necessary for NCDPI to evaluate the plan.

Consensus Meeting: It was mentioned that the approach defined is not statewide and could end up being very time consuming. In addition, the fact that the fluency component needs to be added on requires planning and additional time. The team has to factor in other customizations for a statewide deployment for the 2019 School Year. Implementation of the EdFi ODS is targeted for 2020 in the proposal response. The team voted 10 No and 1 Maybe for this specification.

The team moved on to reviewing Istation for mandatory specifications.

Istation

Business Specification 1: Describe how the proposed solution directly assesses reading and pre-reading behaviors to support student's learning development at the various grade levels to inform instruction, including any observation-based practices if applicable: a. oral language (expressive and receptive) b. phonological and phonemic awareness c. phonics d. vocabulary e. fluency f. comprehension

Consensus Ranking: The voting members voted 1 No and 10 Maybe for this specification. Some team members were concerned about the oral language and phonological awareness measures in this service. They were not sure the measure for fluency was the correct measure. Additional equipment will be needed. The observational piece is optional. Some voting members felt that this was a low-level assessment, especially for oral language.
They also mentioned that even though the proposal stated that the reading can be recorded for the teacher to evaluate and grade fluency, this was not demonstrated. Students speak into a microphone and there is nothing that stops them, which could be frustrating.

Negotiation Question: Oral language assessment is unclear or missing and needs to be further understood.

Business Specification 3: Describe the validity and reliability of the assessment in the following areas: a. oral language b. phonological and phonemic awareness c. phonics d. vocabulary e. fluency f. comprehension

Consensus Ranking: The team voted 9 Yes, 1 No and 1 Maybe. Some SMEs mentioned there is a need for a clear description of the normed group demographics. Some team members also mentioned that not every assessment measure was checked for validity and reliability and that the sample size is very low. Their criterion measure was based on one school in Texas. The Subject Matter Experts felt that the AUC data was very low, and the sample size was 25 students. It was pointed out that much of their comparison was to DIBELS.

Negotiation Question: Can the language around the composite score be changed?

Business Specification 5: Describe how the assessment identifies and reports students who may need intervention and enrichment.

Consensus Ranking: The voting members were unanimous in their Yes votes for this question. It was mentioned that the enrichment piece was not clear.

Business Specification 6: Describe how the following characteristics for progress monitoring between benchmarks are met by the proposed solution: a. brief, b. repeatable, c. sensitive to improvement over time, including short term change d. multiple equivalent forms of screening assessments that enable the teacher to gauge short term growth (weekly or every other week), e. reliable, f. valid, g. measure accuracy and fluency with skills h. quantitative results charted over time to calculate and document rates of improvement i. Allow for off-grade level progress monitoring j. Ability for the results to be graphed against a goal (national norms and/or individualized goals) with 12-14 data points in 10 weeks' time.

Consensus Ranking: The team voted 1 Yes, 5 No and 5 Maybe. It was mentioned that the company recommended monthly progress monitoring, but the service allows you to do as many administrations as you want.

Business Specification 8: Describe how the measures align with best practices and adequately and accurately identify indicators of risk for dyslexia in grades K-3 as outlined in NC Session Law 2017-127: http://www.ncleg.net/Sessions/2017/Bills/House/PDF/H149v4.pdf

Consensus Ranking: The team voted 9 No and 2 Maybe for this requirement. It was mentioned that the service was missing some of the key measures for dyslexia risk factor identification, such as letter naming fluency.

Business Specification 9: Describe how the system uses developmentally appropriate practices to assess K-3 students.

Consensus Ranking: The team voted 7 No and 4 Maybe for this one. Some voting members were concerned about the all-online nature of this assessment and how appropriate it is for kindergarteners and struggling learners. The assessment also takes 40 minutes.
Business Specification 10: Describe how the system incorporates educators and/or students using digital devices to assess reading and prereading behaviors.

Consensus Ranking: The team voted 3 Yes, 2 No and 6 Maybe. It was indicated that online assessment of prereading behaviors will be a concern given that the targeted audience is kindergarteners. The team also mentioned that the vendor better described how teachers are incorporated, which set it apart from the other online assessments. Teachers can also go back and listen to recordings.

Business Specification 11: Describe how the proposed solution is a formative reading assessment(s) tool for grades K, 1, 2, 3.

Consensus Ranking: The team voted 7 Yes and 4 Maybe on this specification. It was mentioned that the strength of this service compared to the other online assessment services is that, per NCDPI's definition of formative assessment (pg. 9 of the RFP), students internalize the learning goals and become able to see the target themselves; a student's self-assessment process marks the transition to independent learning. It was added that the evaluation team would like to see how this is driven by the solution and less by the teacher.

Negotiation Question: Clarify how formative reading assessment is driven less by teachers and more by the solution.

Business Specification 12: Describe how the proposed solution is a diagnostic reading assessment(s) tool for grades K, 1, 2, 3.

Consensus Ranking: The team voted 2 Yes, 2 No and 7 Maybe. It was mentioned that the service allows teachers to be diagnosticians but does not say much about how the service itself diagnoses reading deficiencies. It was felt that the nature of the assessment is not diagnostic.

Business Specification 15: Describe how the proposed solution minimizes impact to instructional time while ensuring formative and diagnostic assessments are conducted. Provide estimates of assessment time, for both benchmarking and progress monitoring, per student per grade.

Consensus Ranking: The team voted 3 Yes, 4 No and 4 Maybe on this specification. The team was not certain where the impact to instructional time was clearly addressed in the proposal response. One SME mentioned that the timeframes were addressed in question number 11 of the clarification document. The team was allowed to review the response again, and the votes were based on that review. It was also noted that the assessment time far outweighs the impact to instructional time.

Business Specification 17: Describe how the content standards will be aligned and realigned to State Board of Education adopted ELA Standard Course of Study (Spring 2017). Provide specific mapping to the current standards. http://www.ncpublicschools.org/curriculum/languagearts/scos/.

Consensus Ranking: The team voted 3 No and 8 Maybe for this specification. It was pointed out that most of the questions and activities are not aligned to the current SCoS. The continuum that is provided does not show an alignment between the NCSCoS and the questions/examples.

Business Specification 19: Explain how the proposed solution can yield data that can be used with EVAAS. Describe and provide any information that explains any alignment or relationship between the assessment and the Education Value-Added Assessment System (EVAAS).
http://www.ncpublicschools.org/effectiveness-model/evaas/ https://www.sas.com/en_us/software/evaas.html

Consensus Ranking: The EVAAS expert on the evaluation panel mentioned that the service can yield the data required for EVAAS, and the team unanimously voted 11-0 Yes on Istation's ability to provide data for EVAAS.

Business Specification 24: Describe how the Benchmarking process occurs in the proposed solution. NCDPI expects benchmarking three times a year for grades K, 1, 2 and 3.

Consensus Ranking: The team voted 10 Yes and 1 Maybe. One team member had questions around how the benchmark timeframes can be opened and closed.

Negotiation Question: How can the benchmarking window be opened and closed?

Reporting Specification 3: Reporting feature is expected to provide the following capabilities: a. timely assessment results to teachers/administrators b. timely assessment results to parents/guardians c. reporting results at the district, school, grade, teacher, group, and individual student level by all ESSA subgroups d. an end-of-year student summary report for cumulative folder historical data year after year to identify consistent gaps and learning trends at the district, school, grade, teacher, group, and individual student level by all subgroups. For each of the above, provide a timeframe for how frequently the data is refreshed (real-time, on demand, or some other interval).

Consensus Ranking: The voting members voted 11-0 in favor of Istation's capability to provide timely assessment results. The team felt that the teacher reports are easy to read and interpret.

Reporting Specification 4: Provide communication to parents in a format that is clear and easy to understand after each benchmark.

Consensus Ranking: The team voted 3 Yes, 2 No and 6 Maybe for this specification. It was noted that there was no separate report for parents; only a summary report for teachers was available for parents. There was a letter that could go home, but it was not automatically system generated and had to be filled in by the teachers. It was felt that most reports for teachers cannot be interpreted by parents.

Technical Specification 6: This service will be classified as "Program Critical/Moderate" based on the sensitivity of the data used; the security controls under the "Moderate" category column need to be implemented. The vendor's security policy should include all the control categories as specified under the "Moderate" classification. Please refer to pages 4 through 10 for the security control baselines table in the State Information Security Manual document. For example, AC-1 (Access Control Policy and Procedures) under the "Access Control" Family/Category is discussed in detail in the NIST 800-53 document. NC Statewide Information Security Manual: https://it.nc.gov/documents/statewide-information-security-manual
a) Describe how you will ensure compliance to the NC Statewide Security Manual.

Consensus Meeting – The team voted 11 Yes because the vendor mentioned that they will agree to work with the state's security manual.

Technical Specification 35: Provide a 3rd party attestation, one of the following based on the system proposed: SOC2 Type II, SSAE-16, FEDRAMP, ISO 27001.

Consensus Meeting: The team voted 11 Maybe to this question because the vendor has agreed to complete the SOC 2 Type II by Spring 2019 in order to meet this requirement.
Currently DIT will not permit agencies to issue new contracts for moderate level solutions that have not received an acceptable 3rd party attestation. Obtaining approval from DIT to award a contract to a vendor that does not already meet this requirement will be challenging.

Negotiation Question: Clarify and get the timeframe for SOC 2 Type II completion.

Project Management Specification 1: Include an initial schedule and the associated Work Breakdown Structure (WBS) for the proposed implementation plan. The Project Schedule in the proposal is to include significant phases, activities, tasks, milestones and resource requirements necessary for NCDPI to evaluate the plan.

Consensus Meeting: Their plan had training in mid and late August; this will be too late for year-round schools. Data Integration (IAM, EVAAS) was not indicated in the Project Schedule. The team voted 2 Yes and 9 Maybe.

Negotiation Question – Training dates need to be negotiated. Need a firm GoLive date.

The team moved on to reviewing NWEA for mandatory specifications.

NWEA

Business Specification 1: Describe how the proposed solution directly assesses reading and pre-reading behaviors to support student's learning development at the various grade levels to inform instruction, including any observation-based practices if applicable: a. oral language (expressive and receptive) b. phonological and phonemic awareness c. phonics d. vocabulary e. fluency f. comprehension

Consensus Ranking: The voting members voted 7 No and 4 Maybe on this question for NWEA. Some team members pointed out that there were many assessment components and it was hard to sort them out: there was MAP Growth, a skills checklist, and so on. The service did not directly assess oral language. It was brought to the team's attention that page 45 of the proposal states, "Beginning in the 2019-2020 school year, we anticipate MAP Reading Fluency will include progress monitoring forms that can be used in between benchmark tests." While the benchmark system meets the requirements, the Progress Monitoring is yet to be in place. MAP was also planning to include audio for their K-1 assessments.

Business Specification 3: Describe the validity and reliability of the assessment in the following areas: a. oral language b. phonological and phonemic awareness c. phonics d. vocabulary e. fluency f. comprehension

Consensus Ranking: The team voted 1 Yes, 6 No and 4 Maybe for this specification. It was mentioned that NWEA has a fluency and comprehension threshold in the MAP Growth Assessment. This will be problematic.

Business Specification 5: Describe how the assessment identifies and reports students who may need intervention and enrichment.

Consensus Ranking: The voting members voted 4 Yes, 3 No and 4 Maybe for this specification. One team member mentioned that because she was not confident in the validity and reliability, she was not sure the system can identify students who need intervention.

Business Specification 6: Describe how the following characteristics for progress monitoring between benchmarks are met by the proposed solution: a. brief, b. repeatable, c. sensitive to improvement over time, including short term change d. multiple equivalent forms of screening assessments that enable the teacher to gauge short term growth (weekly or every other week), e. reliable, f. valid, g.
measure accuracy and fluency with skills h. quantitative results charted over time to calculate and document rates of improvement i. Allow for off-grade level progress monitoring j. Ability for the results to be graphed against a goal (national norms and/or individualized goals) with 12-14 data points in 10 weeks' time.

Consensus Ranking: The team voted 10 No and 1 Maybe. It was mentioned that Progress Monitoring is under development. An evaluation team member mentioned that the under-development Progress Monitoring tool is the only Progress Monitoring tool that is going to exist because the regular assessment can only be given three times a year. The Progress Monitoring fix is their skills checklist. From a growth perspective, going from a norm reference to a criterion reference and basing it on a growth checklist can be problematic.

Business Specification 8: Describe how the measures align with best practices and adequately and accurately identify indicators of risk for dyslexia in grades K-3 as outlined in NC Session Law 2017-127: http://www.ncleg.net/Sessions/2017/Bills/House/PDF/H149v4.pdf

Consensus Ranking: The team voted 11 No on this specification. It was mentioned that their response was very brief, and even their clarification was brief. The team was concerned about the statement in the proposal that the developers expect the service to be sensitive and specific for screening for dyslexia, but no data is currently available.

Business Specification 9: Describe how the system uses developmentally appropriate practices to assess K-3 students.

Consensus Ranking: The team voted 2 Yes, 3 No and 6 Maybe on NWEA's ability to appropriately assess K-3 students. It was mentioned that the demo included drag and drop, which is not developmentally appropriate for the target group, especially kindergarteners.

Business Specification 10: Describe how the system incorporates educators and/or students using digital devices to assess reading and prereading behaviors.

Consensus Ranking: The team voted 3 Yes, 2 No and 6 Maybe. It was noted that there is no observational aspect to assess pre-reading behaviors.

Business Specification 11: Describe how the proposed solution is a formative reading assessment(s) tool for grades K, 1, 2, 3.

Consensus Ranking: The team voted 6 Yes and 5 Maybe on this specification. It was pointed out that page 51 of the response to the RFP states, "NWEA recommends administering MAP Growth and MAP Reading Fluency formative assessments at regular benchmark intervals across the year in grades K–3. Once students can read independently with adequate rate, accuracy, and literal comprehension, MAP Reading Fluency no longer needs to be given." This definition seems to suggest that the formative assessment is only for some students and is a benchmark assessment.

Business Specification 12: Describe how the proposed solution is a diagnostic reading assessment(s) tool for grades K, 1, 2, 3.

Consensus Ranking: The team voted 3 Yes, 3 No and 5 Maybe.

Business Specification 15: Describe how the proposed solution minimizes impact to instructional time while ensuring formative and diagnostic assessments are conducted. Provide estimates of assessment time, for both benchmarking and progress monitoring, per student per grade.

Consensus Ranking: The team voted 2 Yes, 3 No and 6 Maybe on this specification. This product does not have Progress Monitoring built yet.
It can take up to an hour to complete a benchmark, and for students who are falling behind, the teacher will have to go back and review the recording (a total of 2 hours per student). This time needs to be added to the overall time. This vendor talked about going to a lab, and it depends on the school having that kind of lab. A field member noted that network bandwidth limits the number of concurrent tests that a school can support in their labs in Buncombe County; fewer than half of the students in a classroom can go online at the same time. So, in some districts, even when labs exist in schools with the proper computer equipment (with the required high quality microphones), students may not be able to be assessed at the same time due to noise levels and network limitations. It was also pointed out that even in those labs there may be a combination of iPads and Chromebooks, and the kids have to interact with them differently. This could present a problem for kindergartners because the teachers will have to train them accordingly. If it is a true formative assessment, then it does not take time away from instruction; formative assessments inform instruction, and for younger kids they are observational.

Business Specification 17: Describe how the content standards will be aligned and realigned to State Board of Education adopted ELA Standard Course of Study (Spring 2017). Provide specific mapping to the current standards. http://www.ncpublicschools.org/curriculum/languagearts/scos/.

Consensus Ranking: The team voted 6 No and 5 Maybe for this specification. It was noted that the questions in the examples were not aligned to the Standard Course of Study and the chart was confusing.

Business Specification 19: Explain how the proposed solution can yield data that can be used with EVAAS. Describe and provide any information that explains any alignment or relationship between the assessment and the Education Value-Added Assessment System (EVAAS). http://www.ncpublicschools.org/effectiveness-model/evaas/ https://www.sas.com/en_us/software/evaas.html

Consensus Ranking: The EVAAS expert on the team mentioned that the service can yield the data required for EVAAS, and the team unanimously voted 11-0 Yes on NWEA's ability to provide data for EVAAS.

Business Specification 24: Describe how the Benchmarking process occurs in the proposed solution. NCDPI expects benchmarking three times a year for grades K, 1, 2 and 3.

Consensus Ranking: The team voted 11-0 Yes on this specification and required clarification on the benchmark testing window.

Negotiation Clarification: Need clarification on the benchmark testing timeframes.

Reporting Specification 3: Reporting feature is expected to provide the following capabilities: a. timely assessment results to teachers/administrators b. timely assessment results to parents/guardians c. reporting results at the district, school, grade, teacher, group, and individual student level by all ESSA subgroups d. an end-of-year student summary report for cumulative folder historical data year after year to identify consistent gaps and learning trends at the district, school, grade, teacher, group, and individual student level by all subgroups. For each of the above, provide a timeframe for how frequently the data is refreshed (real-time, on demand, or some other interval).

Consensus Ranking: The team voted 11 Yes for this specification. It was mentioned that subgroup reporting is unclear.
Reporting Specification 4: Provide communication to parents in a format that is clear and easy to understand after each benchmark.

Consensus Ranking: The team voted 2 Yes, 2 No and 7 Maybe. Some members were not sure what the vendor offered for parents. It was clarified that Progress Monitoring reports would be shared with parents, which would be too hard for parents to comprehend.

Technical Specification 6: This service will be classified as "Program Critical/Moderate" based on the sensitivity of the data used; the security controls under the "Moderate" category column need to be implemented. The vendor's security policy should include all the control categories as specified under the "Moderate" classification. Please refer to pages 4 through 10 for the security control baselines table in the State Information Security Manual document. For example, AC-1 (Access Control Policy and Procedures) under the "Access Control" Family/Category is discussed in detail in the NIST 800-53 document. NC Statewide Information Security Manual: https://it.nc.gov/documents/statewide-information-security-manual
a) Describe how you will ensure compliance to the NC Statewide Security Manual.

Consensus Meeting – The team voted 11 No because this vendor would like to preserve their opportunity to negotiate on all security related questions. The vendor was interested in discussing a different security standard (CIS) rather than NIST, which is the standard followed by the State of North Carolina. In addition, this vendor's solution appears to permit teachers to see data for students who are not their own.

Negotiation Question – Need to find out what they want to negotiate on all the security questions.

Technical Specification 35: Provide a 3rd party attestation, one of the following based on the system proposed: SOC2 Type II, SSAE-16, FEDRAMP, ISO 27001.

Consensus Meeting: The team voted 11 No to this question because the vendor wanted to preserve the opportunity to negotiate the SOC 2 Type II. Currently DIT will not permit agencies to issue new contracts for moderate level solutions that have not received an acceptable 3rd party attestation.

Project Management Specification 1: Include an initial schedule and the associated Work Breakdown Structure (WBS) for the proposed implementation plan. The Project Schedule in the proposal is to include significant phases, activities, tasks, milestones and resource requirements necessary for NCDPI to evaluate the plan.

Consensus Meeting: The voting team unanimously voted No. The implementation plan was not statewide, and the vendor was asking for a primary point of contact for each charter and district. Their GoLive was May 27th in the plan, and their Quality Assurance and testing came after the GoLive.

Negotiation Question – Need to revamp the schedule for a statewide implementation.

After the initial round of deep dive review of the mandatory specifications, the evaluation team ranked the vendors for substantial conformity to mandatory requirements and selected the vendors in the competitive range for a deep dive of all RFP specifications. For substantial conformity to the legislative specification, the vendors were ranked as follows:

1. Amplify Education Inc.
2. Istation
3. Curriculum Associates
4. NWEA
Summary:

The team deliberated and summarized that in the case of Curriculum Associates (i-Ready), their fluency measure is not ready and will not be available until the 2021 School Year. There were also concerns and skepticism about how the foundational skills were assessed by measuring only comprehension skills without fluency. The measures for oral language were also very limited. Progress Monitoring appeared to be a shortened version of the assessment, and reliability almost always goes down to some extent when you reduce test items. This service cannot be a reliable screener for dyslexia because it does not have fluency or nonsense word recognition, which is a means to assess a student's ability to apply letter/sound knowledge to unknown words, a core deficit in students with dyslexia. The continuum is not aligned to standards, and the examples presented were a forced alignment. The standards mastery examples presented at the demo were not aligned. This service requires a lot of reporting customization. It also requires manual intervention for student transfers. The company plans to complete the SOC 2 Type II audit in the summer of 2019; this presents a challenge when negotiating with DIT for contract award and will extend the contract award timeline. The implementation plan proposed is not for a statewide implementation and appeared time consuming, which called into question the ability to deploy for the 2019 School Year.

In the case of NWEA, this service does not directly assess oral language. Progress Monitoring is not in place, and implementing this solution will be a problem for the 2019 School Year. Since this service does not have Progress Monitoring in place, the growth checklist is used in the interim. This service does not screen for many of the key indicators of risk for dyslexia. This tool is also not a good diagnostic and formative assessment screener. Their example questions were not aligned to standards. Their project implementation timeline is not for a statewide implementation, and there were serious doubts about how a statewide implementation could be handled. There were concerns raised about teachers viewing other teachers' students, and time has to be allotted for the customizations needed to be FERPA compliant. The vendor has indicated it wants to negotiate the state security standards and has not given a clear response to the SOC 2 Type II audit requirement, which will delay approval from DIT or even potentially get the contract award rejected.

Considering all of the above for both vendors, the team decided that it would not do a deep dive review of the non-mandatory questions for Curriculum Associates and NWEA. The team completed a deep dive of all substantial conformity items for Amplify Education Inc. and Istation. The independent vote counts were retained for the non-mandatory substantial conformity items for NWEA and Curriculum Associates.

Review of Non-Mandatory substantial conformity specifications:

The team continued with ranking substantial conformity for Amplify Education Inc. and Istation. There were discussions of both strengths and weaknesses for each vendor. The following is a summary of key discussion points for each vendor, followed by the ranking.

Amplify Education Inc.
Business Specification
While reviewing how the solution adapts as students gain mastery and have demonstrated proficiency, it was pointed out that this solution is not adaptive, and there were serious doubts about how it can adapt as students gain mastery. There were questions about how grade level, rather than student mastery of content, determines the universal screening. The team was concerned about the proposed training for Master Literacy Trainers. Also, the proposal response did not adequately include strategies for ensuring consistent scoring to evaluate training effectiveness. This needs to be further negotiated with the vendor during negotiation.

Reporting Specification
Julien pointed out to the team that the reporting permissions need to be enhanced. Overall, the voting team was satisfied with Amplify's reporting capabilities and the reporting offering.

Technical Specification
At the start of the evaluation of the technical specifications, with permission from Procurement, an evaluation team member expressed concerns that the vendor's support for iPads was not adequate and that districts had difficulty accessing the service with the newer iPads. The district representatives at the evaluation expressed that in their specific districts there was no real trouble accessing the service using the newer iPads; the only setup required was using the correct URL to get to the IAM-integrated service, and some teachers were still using the old URL for login. Concerns were also expressed about the ETL process with this vendor, and it was said that this needs to be enhanced if this vendor is awarded the contract. On the question of IAM integration, it was expressed that even though the current service is IAM integrated, the protocols can be improved, and there were concerns with the architecture. It was also pointed out that offline access to the service required additional coding to remain compliant. The team voted a unanimous Maybe for the IAM integration question based on this need to enhance. There were questions about tier 1 through tier 3 support, and it was pointed out that the response time for closing tickets should be negotiated with the vendor should this vendor be selected for further consideration and negotiations. This vendor has a SOC 2 Type 2 audit completed by a third-party auditor, has also been completing penetration testing under the current contract, and is highly rated in terms of security. There was a question about a physical audit of the data center by NCDPI. This question is irrelevant now because most SaaS services are hosted in the cloud. Although there was a unanimous No for this question, Sri and Tymica were advised to remove this question from future RFPs.

Project Management
During discussions it was pointed out that the key technical resumes were lacking. Maturity to manage technical aspects needs to be improved. The vendor failed to acknowledge agreement to this term: "Prior to making personnel changes for key human resources outlined in the project plan, the vendor must provide an opportunity for NCDPI to review resumes and transition plan and request a meeting with replacement resources." This needs to be clarified. The team unanimously voted 11 Maybe for Project Management resumes. Other than this weakness, there were strengths in the implementation approach, which is statewide and supports a regional model. There were strengths in Project Processes as well.
Service Level Agreement
Amplify had shared the current service level agreement that is in place. If this vendor is selected for negotiation, the tiered support process, the associated reports, and the timeframe for issue resolution should be improved.

Negotiation Questions:
1. Professional Development and training needs to be enhanced and negotiated
2. Enhancement of reporting permissions
3. Additional insights on the ETL process
4. Tiered support to be finalized and response time for issues to be negotiated
5. NCDPI to see the technical resumes and confirm their review and approval of resource transitions

The team moved on to review Istation.

Istation

Business Specification
The team had serious concerns about the validity and reliability of the service as a universal screener because the results presented in the RFP were from a study using ISIP Early Reading that was conducted in five elementary schools in a north Texas school district. Some voting members questioned the study parameters and their transferability to NC EOG standards. While the main classification study had a very good 'n', it again demonstrates predictive validity from only one district in Texas. Classification accuracy data in RFP Attachment 1 suggests very low sample sizes when determining the AUC data. The team also felt that the RFP response did not specifically address sub-group and like-peer group reporting features to assess progress. It would be hard to use the system for the frequent monitoring needed for SLD policy compliance. Some of the voting team members were concerned with the reports noting effectiveness of core instruction. Group intervention effectiveness for supplemental support was also unclear in the proposal.

The team liked the consultative approach described in the proposal to designing PD based on local needs. There were also strengths in the virtual modules offered. However, there should be more specifics for each content strand of PD offered. This needs to be clarified during negotiations.

Reporting Specification
Reporting appeared to be easy to use, but there were limited specifics on reports to track service usage. There need to be further negotiations on the SLA reports.

Technical Specification
It was clarified to the team that the architecture approach for this service is current. There needs to be clarification on penetration testing and its frequency. With regard to service scaling, it was pointed out that this service is not elastic, and this needs to be negotiated with the vendor during negotiations. The solution maps to CEDS, but there still need to be negotiations on the use of SIF. The data transfer capabilities are not quite state of the art. Otherwise the TASD shows well.

There was a question about a physical audit of the data center by NCDPI. This question is irrelevant now because most SaaS services are hosted in the cloud. Although there was a unanimous "No" vote for this question, Sri and Tymica were advised to remove this question from future RFPs.

Project Management Specifications
There were serious concerns about the proposed Project Manager. The resume shows training; however, actual implementation experience is missing. The team agreed that negotiation clarifications should be undertaken to probe deeper into how the essential project management and technical roles will be staffed to ensure success.
The team also agreed that clarification should be sought on how User Acceptance Testing (UAT) bug fixes will be conducted, and that this should be incorporated in the plan.

SLA Specifications
There were questions around SLA availability and how the vendor reported that the 99.9% availability will be ensured during non-peak time. The team had questions around peak time availability. Clarification needs to be sought on the Tier 1 to Tier 3 support issue response times.

Negotiation Questions
The following questions were noted for the vendor:
1. Need classification accuracy for a larger sample size
2. Reports to track service usage need to be defined
3. The frequency and process for penetration testing to be clarified
4. Application performance monitoring should be elastic
5. Schedule to include a UAT bug-fix timeframe and to incorporate timeframes for all deliverables
6. Project Management and technical resources to be clarified
7. SLA availability and SLA terms and conditions to be negotiated

Overall ranking for Substantial Conformity
Based on the discussions among the evaluation team members, the voting team unanimously agreed with the following ranking for substantial conformity to specifications:
1. Amplify Education Inc.
2. Istation
3. Curriculum Associates
4. NWEA

2. RFP Desired Specification
The RFP desired specifications were ranked for all four vendors. The following is a summary from the discussions.

Amplify:
The team was uncertain about the ability to upload evidence of learning. This appeared to be a negotiation item and was not included as part of the proposed package. The team was split on Amplify's ability to incorporate a personalized blended approach to assessment and learning to meet the demands of diverse student populations. The team liked the online and observation options that Amplify offered. They were also satisfied with Amplify's ability to assess reading behaviors and print concepts of connected text.

Curriculum Associates:
The voting members felt that the RFP response did not include clear measurement of print awareness for young children. They also agreed that this service did not provide the ability to upload evidence of student learning. The team liked the easy-to-read reports and the adaptive assessment.

Istation:
The team was concerned about the capability of the system to provide a personalized blended approach to meet the demands of diverse student populations. Oral language assessment was unclear or missing; the vendor reported working on ways to use voice recognition for oral language, and no specific test is yet developed. From page 8 of the RFP response, "Students' expressive language knowledge is captured by their ability to identify a rhyming word from an orally given target word"; this was not convincing for the voting members. The RFP response also did not specify constructed response type assessment features. There was also uncertainty among some members about the availability of a touchscreen option for students, which was clarified by the technical team member as being available. The team liked the adaptive nature of the assessment and how ISIP adjusts to each student's true abilities in early literacy to provide more accurate assessments and targeted, personalized instruction.

NWEA:
According to the RFP response, expressive language is "not directly assessed.
With the inclusion of audio on our grades K–1 tests, we can assess more of the receptive components of oral language, including grammar, vocabulary, and syntax. Additionally, our grades K–1 assessment measure speaking and listening standards through questions about text read aloud and describing people, places, things, and events." Based on this response, the team had concerns about the capability of the system to provide a personalized, blended approach to meet the demands of a diverse student population. Teaching modalities were not specifically stated in the RFP.

The team was concerned about the vendor not providing response times directly but stating that they are willing to negotiate.

The team liked the adaptive nature of the MAP assessment.

Summary
After discussion, the team voted on the Desired Specification and the ranking is listed below:
1. Amplify Education Inc.
2. Istation
3. Curriculum Associates
4. NWEA

3. Proof of Concept/Demonstration
The Proof of Concept/Demonstration was ranked for all four vendors. The vendors were ranked on the following three questions:
1. Demo script adherence
2. Ability to meet the RFP specification
3. Ability to meet the legislated timeline for implementation

Amplify Education Inc.
The voting members were unanimous in agreement that Amplify's demonstration adhered to the demo script and expressed confidence in the vendor's ability to meet the RFP specification. It was pointed out that, if the state moves forward with this vendor, the measures would have to be scaled down to make them easier for educators and students. The team was confident that, with the customizations that will be needed to scale back measures, the service can be implemented statewide for the 2019–2020 school year.

Curriculum Associates
The team was in agreement that some of i-Ready's processes demonstrated at the meeting were robust. However, they were split in their votes about i-Ready's adherence to the demo script; the team voted 6 Yes and 5 No on demo script adherence. i-Ready's oral reading fluency measure is currently not available, and the vendor had indicated that this measure will not be available until the 2021 school year. During the demonstrations, the vendor tied rapid naming, which is needed to screen for dyslexia, to the fluency measure. Some of the voting members pointed out that rapid naming could be accomplished without a words-per-minute fluency measure; some newer research uses object naming and number naming as predictors. On i-Ready's ability to meet the RFP specification, one voting member voted Yes; the rest voted No. The team unanimously voted "No" on i-Ready's ability to implement for the 2019 school year because the vendor's proposal and subsequent clarifications followed a districtwide implementation model, and based on DPI's lessons learned with such a model, implementation requires more resources and more time. Also, based on the vendor's clarification response, to achieve a 2019 school year implementation the state would, at a minimum, have to plan to supplement fluency measures, work with the vendor to update the SAML integration capabilities, and add additional regional and statewide roles. These enhancements, combined with a district implementation model, make implementation for the 2019 school year questionable.

Istation
The voting members agreed that Istation's demonstration showed that they had robust reporting capabilities.
The team was split on Istation's adherence to the demo scripts and voted 5 Yes and 6 No. Oral language is a new assessment; text fluency and oral language are not a part of the overall ability score. There is no pure measure of letter knowledge. Fluency is incorporated in MAZE and not included in the overall score. The team voted 4 in favor and 7 against Istation's ability to meet the RFP specifications.

In their vendor demonstration clarification response, Istation mentioned that they have engaged a third-party auditor to provide SOC 2 Type II attestation of their software application; this attestation is expected to be completed in a matter of months. This led to a question of whether approval of the vendor award recommendation should be delayed if this vendor were chosen for award, which had to be considered against the overall implementation timeline. The team unanimously agreed that this service can be implemented as-is for the 2019 school year and voted 11 in favor of Istation.

NWEA
The team voted 6 in favor and 5 against NWEA's adherence to the demo script. The key discussion point was the district implementation model and the district-by-district rostering that was presented at the demo. The team felt that the demo was unclear on some of the activities that students complete for each measure. During the demo it was mentioned that the MAP assessment is still being studied for its suitability as a dyslexia screener. Also, there was a disconnect between the standards and the student questions during the demo. Text complexity was mentioned during the demo, but it was stated that only Lexile is used for this. Clusters were incorrect in several places in the demo.

Second graders can be benchmarked in different ways using either the K–2 or the 2–5 level. Appropriate policy would have to be in place for second grade assessment, as a student should stay in the same test for the calendar year. It also has to be considered that the 2–5 level does not dip down to foundational skills if needed. Progress monitoring is currently under development and being validated; MAP Reading Fluency for progress monitoring can be on demand. The biggest concern of the voting members was that student transfer between districts is a manual process and would require up to 48 hours.

The team unanimously voted "No" on NWEA's ability to meet the RFP specifications. They also voted 1 in favor and 10 against NWEA's ability to meet the legislatively mandated deadline of the 2019 school year.

After this the team ranked the vendors:
1. Amplify Education Inc.
2. Istation
3. Curriculum Associates & NWEA (tied)

4. Vendor Cost Proposal
The Total Cost of Ownership (TCO) was determined for all four vendors. This included the proposed cost and any additional costs indicated in the proposal. Certain vendors proposed alternate costs. Accordingly, the TCO for each of the vendors is tabulated below:

Vendor                  Low Cost per Student   High Cost per Student   Rank
Amplify                 $15.36                 $25.78                  4
Curriculum Associates   $8.08                  $22.48                  3
Istation                $6.60                  $6.60                   1
NWEA                    $21.14                 $21.14                  2

The following assumptions were made in determining TCO (an illustrative calculation of how these assumptions fold into a per-student figure appears after the Vendor Financial Stability notes below):
1. In the case of Amplify, the higher cost was chosen because the higher-cost option included the appropriate professional development component.
2. In the case of Curriculum Associates, the higher of the two costs was chosen because this cost included the Assessment and Instruction component.
3. In the case of Amplify, a total of 20,000 kits was assumed at $125 per kit; this cost was distributed over three years.
4. In the case of Curriculum Associates, the demonstration clarification document stated that fluency will be offered from another vendor at no additional cost. There was also another statement to the effect of: "If we are selected for award, we understand there are many details to work out with NCDPI — not least of which will be cost. We anticipate the third-party fluency assessment purchase would be in line with currently available offerings, in the approximate $1/student/year range." An additional $1 per student was therefore added to account for fluency. Since the solution requires a headset, the business team decided that each device would require 4 headsets; TCO was arrived at assuming 26,000 RtA devices and $10 per headset.
5. Istation indicated that "While Istation recommends that students use headsets to reduce distractions during the assessment, they are not required." The business team nonetheless decided that each device would require 4 headsets; TCO was arrived at assuming 26,000 RtA devices and $10 per headset.
6. NWEA indicated that "MAP reading fluency requires each student to use an over-ear headset with a boom-style microphone. Districts and Schools will be responsible for purchasing and providing these to students. Built-in computer microphones and microphones in-line on a headset cord are not supported for administration of MAP reading fluency." The business team decided that each device would require 4 headsets; TCO was arrived at assuming 26,000 RtA devices and $50 per headset.

For cost, as indicated above, the vendors were ranked as follows:
1. Istation
2. NWEA
3. Curriculum Associates
4. Amplify Education Inc.

5. Vendor Relevant Experience and Reference Checks
Constance Bridges walked the team through her reference check results. Per the procurement guidelines, NCDPI reached out to every reference at least three times. Three reference check responses were received each for Amplify, Curriculum Associates, and NWEA; two of the three Istation references responded. All the references said the vendors have been highly responsive and have addressed all issues, and all respondents agreed that they would renew the vendor's contract and would refer them. With this information, the voting members ranked all the vendors the same for reference checks: Amplify, Curriculum Associates, Istation, and NWEA were each ranked 1.

6. Vendor Financial Stability
Meera Phaltankar walked the team through the results of her financial analysis. She said all four vendors have good liquidity positions and she did not see any going-concern issues. An unqualified auditor's report was available for Curriculum Associates, Istation, and NWEA, so she was able to determine the quick ratio for each. Since an unqualified auditor's report was not available for Amplify, she could not compute its quick ratio. Meera recommended that Curriculum Associates, Istation, and NWEA be ranked number 1 and Amplify be ranked number 4 for financial stability.

Negotiation Question: If NCDPI proceeds further with Amplify, the company's unqualified auditor's report should be obtained.

Based on Meera's recommendation, the team ranked Curriculum Associates, Istation, and NWEA as number 1 and Amplify as number 4 for Financial Stability.
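To make the cost assumptions above concrete, here is a minimal sketch of how one-time add-ons such as headsets and kits could be folded into a per-student, per-year TCO figure. The base subscription price is a placeholder, and the three-year spread and 500,000-student K-3 population are assumptions for illustration; only the device, headset, and kit figures come from the assumptions listed above, and the committee's actual cost workbook is not part of these records.

```python
# Illustrative only: folds one-time add-ons (headsets, kits) into a per-student,
# per-year TCO. The base subscription price and student count are placeholders,
# not the committee's actual numbers.

STUDENTS = 500_000            # assumed K-3 student population for illustration
YEARS = 3                     # assumed term for spreading one-time costs
DEVICES = 26_000              # RtA devices assumed in the minutes
HEADSETS_PER_DEVICE = 4       # business team assumption from the minutes


def per_student_tco(base_per_student, headset_cost=0.0, one_time_costs=0.0,
                    extra_per_student=0.0):
    """Per-student, per-year TCO under the stated assumptions."""
    headset_total = DEVICES * HEADSETS_PER_DEVICE * headset_cost
    spread = (headset_total + one_time_costs) / YEARS / STUDENTS
    return base_per_student + extra_per_student + spread


# $10 headsets only (comparable to the Istation / Curriculum Associates assumption)
print(round(per_student_tco(5.00, headset_cost=10.00), 2))

# $10 headsets plus 20,000 kits at $125 and a $1/student fluency add-on
# (comparable to the Amplify kit and Curriculum Associates fluency assumptions)
print(round(per_student_tco(5.00, headset_cost=10.00,
                            one_time_costs=20_000 * 125,
                            extra_per_student=1.00), 2))
```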
Final Ranking
The team completed the ranking of the four vendors and the outcomes are given below:

Criterion                                        Amplify   Curriculum Associates   Istation   NWEA
1. Substantial conformity to solicitation spec      1               3                 2         4
2. RtAD SaaS Desired Specifications                 1               3                 2         4
3. Proof of Concept / Demo                          1               3                 2         3
4. Vendor Cost Proposal                             4               3                 1         2
5. Strength of References                           1               1                 1         1
6. Vendor Financial Stability                       4               1                 1         1
Phase 1 Rank Order                                  1               3                 2         4

After this, the team deliberated the merits of the services reviewed, as summarized below, and reconfirmed the ranking. In summary, the team expressed unanimous agreement with the ranking outcome above.

In the case of Amplify Education Inc., the service provides online assessment as well as observational assessment. Online assessment is available for students with appropriate self-regulation and computer skills; teachers continue to have the option to directly assess and observe students who are in need of regulation. This is especially critical because this Read to Achieve solution is expected to assess the pre-reading and reading behavior of students. The target population for this assessment is K-3 students, who may come from different socio-economic backgrounds and ethnicities. The core measures of DIBELS have always been recognized as a valid and reliable screener for risk factors for dyslexia. This would satisfy the needs of HB 149 without overburdening the school districts to develop or identify additional tools for dyslexia screening. The online assessment takes about 17 minutes and the observational assessment about 12 minutes; however, it has to be remembered that the observational assessment takes 12 minutes per child, while the online assessment can be administered to a group. This assessment is available offline. The issue of using developers in Ukraine for coding should be further discussed with DIT and Legal; further clarification is needed from the vendor, including identification of all associated risks.

In the case of Istation, which came second in the ranking, oral language is a new assessment; text fluency and oral language are not a part of the overall ability score, and there is no pure measure of letter knowledge. This service does not satisfy the needs of HB 149 to act as a valid and reliable screener for dyslexia. Consideration should be given to how much teachers may be overburdened by using different assessments for dyslexia and for the Read to Achieve legislation; the school districts could be challenged to develop or identify a tool to satisfy the dyslexia legislation. The assessment is also not diagnostic in nature, in that it expects teachers to be the diagnosticians. Also, this assessment takes about 40 minutes for assessment and progress monitoring. The service allows recording students and playing the recordings back. The team also agreed that most schools have at least about two computers (Chromebook and iPad) per class.

Additionally, the parent letters that are required to be sent home for each child require manual entry of student data, and this time should also be taken into account when considering the instructional time taken away from teachers.
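As a rough illustration of the instructional-time and device-availability points above, the sketch below compares elapsed classroom time under the two models; the class size of 20 and the number of directly observed students are assumptions for illustration, while the per-student times and the roughly two devices per class come from the notes above.

```python
# Rough illustration only: elapsed classroom time to benchmark one class.
# Class size and the number of directly observed students are assumed; the
# per-student minutes and device count come from the meeting notes.
CLASS_SIZE = 20          # hypothetical class size
DEVICES = 2              # "at least about two computers per class"


def rotation_minutes(minutes_per_student, devices=DEVICES):
    """Elapsed time when students rotate through a limited number of devices."""
    rounds = -(-CLASS_SIZE // devices)     # ceiling division
    return rounds * minutes_per_student


# ~40-minute computer-based assessment, students rotating through 2 devices
print(rotation_minutes(40))                # 400 minutes of elapsed rotation

# ~17-minute group online assessment plus ~12 minutes per directly observed child
observed_children = 5                      # hypothetical
print(17 + 12 * observed_children)         # 77 minutes
```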
Next Steps
The budget bill authorizes the State Superintendent to supervise and approve the vendor selection. As directed by the budget bill, the business owners will collaborate with the State Superintendent to inform him of the ranking and understand his priorities. The meeting was adjourned, and the evaluation team was told that they would be notified of the next steps after the meeting with the State Superintendent. Action items resulting from the meeting follow.

Action Items
1. Confirm the process with the State Superintendent (Assignee: Tymica; Due Date: 12-5-18)
2. Notify the evaluation team about the next steps based on guidance received from Procurement (Assignee: Sri; Due Date: 12-15-18)

Read to Achieve Diagnostics Software as a Service
Proposal Evaluation Kick-Off Meeting
Dr. Pam Shue & Dr. Amy Jablonski
October 5, 2018, 1:00 PM

Agenda
1. Introduction
2. Project Background and Objective
3. Evaluation Team Composition
4. Evaluation Ground Rules
5. Proposal Evaluation Process
6. Reminders to Ensure Success
7. Accessing the Evaluation Team Site
8. Questions
9. Wrap Up

Project Background
The purpose of this project is to pursue a competitive bidding process, as enacted in Session Law 2018-5, by the Office of the State Superintendent (OSS) to find the best solution(s) for a formative, diagnostic assessment for the Read to Achieve diagnostics and to satisfy obligations outlined in NC House Bill 149 to screen students for dyslexia in grades K, 1, 2, and 3.

Project Background
North Carolina state law requires kindergarten through third grade students to be assessed with valid, reliable, formative, and diagnostic reading assessments. NCDPI is obligated to provide these developmentally appropriate assessments. Further, pursuant to state law, the solution must assess student progress, diagnose difficulties, inform instruction and remediation, and yield data that can be used with the Education Value-Added Assessment System (EVAAS).

Project Background
The assessments should also support:
• Dyslexia
• Multi-Tiered System of Support
• Specific Learning Disability Policy

RFP Evaluation Objective
To complete the RFP evaluation and select the finalist by November 14, 2018, in order to conduct further negotiations and award the contract on or before January 31, 2019.

Evaluation Team Composition
The evaluation team is composed of:
• Voting Members
• Non-Voting Members (who are Subject Matter Experts)
The specific roles of each of these groups are discussed in the following slides.
Prerequisites: All evaluation team members must sign and submit the confidentiality and conflict of interest forms.
Evaluation Team – Voting Members
(Evaluation Team Member | Organization | Title; Role: Voting Member)
Pond, Karl | NCDPI Enterprise Data & Reporting | Enterprise Data Manager
Jablonski, Amy | NCDPI Integrated Academic & Behavior Systems | Director, Integrated Academic and Behavior Systems
Shue, Pamela | NCDPI Office of State Superintendent | Associate Superintendent for Early Childhood Education
Berry, Erika | NCDPI Office of State Superintendent | Senior Policy Advisor
Gossage, Chloe | NCDPI Office of State Superintendent | Chief Strategy Officer
Belcastro, Rebecca | NCDPI K-3 Literacy | K-3 Literacy Piedmont-Triad Consultant
Whitford, Abbey | NCDPI K-3 Literacy | K-3 Literacy Northeast Consultant
Laney, Susan | NCDPI Integrated Academic & Behavior Systems | Integrated Academic and Behavior Systems Consultant, Research and Evaluation Specialist
Loeser, Lynne | NCDPI Exceptional Children | Statewide Consultant for Specific Learning Disabilities and ADHD
Karkee, Thakur | NCDPI Accountability Services | Psychometrician
Parrish, Tonia | NCDPI K-12 Standards, Curriculum and Instruction | Interim Section Chief for ELA
Johnson, Mia | NCDPI K-3 Literacy | K-3 Literacy Consultant
Day, Kristi | NCDPI K-3 Literacy | K-3 Literacy Consultant
Dewey, Cynthia | NCDPI Office of Early Learning | K-3 Education Consultant
Hoskins, Matt | NCDPI Integrated Academic & Behavior Systems | Integrated Academic and Behavior Systems Consultant, Research and Evaluation Lead Consultant

Evaluation Team – Non-Voting Members
(Evaluation Team Member | Organization | Title; Role: Non-Voting SME)
Dunn, Tymica | NCDPI Purchasing | Procurement Specialist
Lowe, Linda | NCDPI Technology Services | PMO Manager
Viswanathan, Srirekha | NCDPI Technology Services | Project Manager
Snider, Eric | NCDPI State Board of Education | Attorney
AlHour, Julien | NCDPI Technology Services | Director, Technology Services
Hunt, KC | NCDPI Technology Services | Information Security Officer
Phaltankar, Meera | NCDPI Financial Services | Director, Financial Services
Hodge, Gin | Buncombe County Schools | Instructional Coach
Lanier, Claudia | NCDPI K-3 Literacy | K-3 Literacy North Central Regional Consultant
Moates, Courtney | New Hanover County Schools | MTSS Instruction Specialist
Cantey, Joy T. | Guilford County Schools | Director of K-12 Literacy
Reap-Klosty, Darlene | Chatham County Schools | MTSS Instructional Program Facilitator
Anselmo, Giancarlo | Cleveland County Schools | School Psychologist
Roberts, Amy | Cabarrus County Schools | County Instructional Coach
Wilkes, Deborah | Cumberland County Schools | ESL Coordinator
Cooper, Shaunda | NCDPI Office of Charter Schools | Education Consultant
Tomberlin, Thomas | NCDPI School Research, Data and Reporting | Director
Pilonieta, Paola | UNCC | Associate Professor, Coordinator of the Undergraduate Reading Program
Evaluation Voting Team Member Roles
Consists of representatives from DPI who are required to participate in all evaluation meetings for the entire RFP evaluation process, from start to finish:
• Review RFP objectives prior to beginning evaluations
• Participate in demos/orals
• Review each responsive proposal and record strengths, weaknesses, and clarification questions
• Notify the Project Manager of clarification questions or concerns that arise
• Through team consensus based on proposal review and demos, rank each proposal relative to other proposals to determine Finalists (short list)
• Participate in Best And Final Offers (BAFOs)
• Select and recommend Vendor(s) for contract award

Evaluation Non-Voting Team Member Roles
Consists of representatives from DPI, LEAs, and IHEs who supplement knowledge and provide feedback to the Evaluation Team. The evaluation team will require advisors skilled in a variety of technical fields.
• Review business, legal, technical, security, project management, and procurement aspects of proposals
• Ensure the project schedule is adhered to
• Review financial statements to determine the level of financial risk (high, medium, low)
• Provide guidance on cost evaluation
• Provide input on business, technical, financial, reference check, and project management aspects of the RFP

RFP Evaluation Ground Rules
• Contact outside of the Evaluation Team
  - Procurement initiates ALL communication to/from vendors
  - No discussion is permitted with co-workers, managers, family members, or anyone else outside of the Evaluation Team (unless authorized and signed Conflict of Interest and Disclosure and Confidentiality Agreement forms are on file)
  - Do not speak to any vendor about the RFP, responses, or the selection process
  - All questions and clarification points that arise throughout the process must go through the Procurement Specialist
  - The Procurement Specialist will establish contact with Vendor(s) and make arrangements for conference calls, webinars, face-to-face meetings, etc. as appropriate
  - Proposals must be treated as confidential and proprietary
  - Proposals and evaluation team materials, including any portable storage devices, must remain locked and secure when you are not reviewing them
  - Evaluation Team members should refrain from sending email messages that contain proposal information, rankings, or any other information that must remain confidential. Any clarifications can be posted in the individual folder assigned in SharePoint and an email sent to the Project Manager.

General Guidelines for Evaluation
• Public Record: As an Evaluation Committee member you are accountable for everything you write and do regarding the RFP, each Proposal, and the evaluation process. Proposal evaluations are part of the RFP and contract files and, as such, are public records, including the names of the Evaluators.
• Proposers may request to review evaluations of all Proposals and may use the information to submit a protest. In addition, Proposers are entitled to ask for a debriefing, and Evaluation Committee members could be required to attend and explain scoring.
• Any member of the public may also request to review all documents relating to the RFP process in compliance with North Carolina's public records law.
• Written comments will be disclosed to any requesting party as part of the public record.
• Please do not transmit any confidential proposal-related details via email. If you need clarifications, please send an email to the Project Manager and the Project Manager will contact you to find out more.
• In SharePoint, each evaluator has a folder assigned to them. Please upload a document with questions or clarifications and send a SharePoint message to the Project Manager.

Proposal Evaluation Process
1. Proposal Opening
2. Initial Purchasing Office Review / Check for Responsiveness
3. Kick-Off Meeting
4. Proposal Evaluation – Phase 1 (Competitive Range)
   a. Vendor Demos (record demo script & ask questions)
   b. Individual Review & Recording of Strengths, Weaknesses, Clarifications
   c. Team Review of Compiled S,W,C; Initial Consensus Ranking
   d. Formal Clarification Questions to Purchasing for Vendor Responses
   e. Conduct In-depth Reference Checks
      i. Assess K-3 Solution Relevance & Experience
   f. Consider Total Cost of Ownership, Relevance, Confidence
   g. Team Consensus and "Short List" of Finalists for BAFO
5. Best And Final Offers (BAFOs) – Phase 2
6. Award Recommendation Package

RFP Process Completed So Far
1. Bid Opening on 10/02/2018
2. The following bidders have sent their proposals:
   i. Amplify Education Inc.  ii. Curriculum Associates  iii. Imagination Station Inc.  iv. NWEA
3. Initial responsiveness evaluation update from Procurement:
   i. Amplify Education Inc.  ii. Curriculum Associates  iii. Imagination Station Inc.  iv. NWEA
4. The responsive bidder responses have been published in SharePoint.

RFP Evaluation Criteria
The evaluation criteria included in the RFP are listed below. The criteria at the top of the list are relatively more important than those at the bottom of the list.
1. Substantial Conformity to Solicitation Specifications – refer to Attachment A, Tables A, B, C, D, and E
2. RFP Desired Specification – refer to Attachment A, Table F
3. Proof of Concept/Demonstration – responsive Vendors
4. Vendor Cost Proposal – refer to Attachment A, Table G
5. Vendor Relevant Experience and Reference Checks – see Section III, Paragraph 14
6. Vendor Financial Stability – refer to Section V, Paragraph 3

RtAD RFP Proposed Evaluation Schedule
• Kick-off Meeting: 10/05/18, 1:00 pm - 3:00 pm, NCDPI Room 504
• Vendor Demonstrations: 10/22/18 & 10/23/18, 8 am - 5 pm, NCDPI State Board Room (7th floor)
• Consensus Meeting: 11/19/18 & 11/20/18, 8 am - 5 pm, NCDPI Room 504 (11/19) & State Board Room (7th floor)
• Best and Final Offer (BAFO): from 11/26/18 to 12/14/18, times TBD, NCDPI conference calls
There will be additional conference call meetings as needed.
• Return the demo scripts with review feedback no later than October 10th.
• Please complete your review of proposals for the consensus meeting by November 8th to allow time for compilation of results.

Proposal Evaluation – Phase 1
Vendor Demonstrations
• Vendor demonstration scripts will be shared with all evaluators on 10/8/18.
• Please review and provide your feedback no later than 10/10/18.
• Each Evaluation Team Member and Non-voting SME will closely observe product demonstrations and Vendor presenters to document additional strengths and weaknesses of the solution and team.
• Tymica Dunn will issue a Clarification Document to Vendors the day after the onsite demonstrations to obtain written documentation of the demo session.
• Evaluation Team members can leverage 30-day trial licenses to further assess solution(s) as needed; the trial period allows access to proposed solution features, customer tools, user guides, and training materials.

Individual Proposal Evaluation
All responsive proposals are available in the RtA Evaluation SharePoint.
Evaluation Committee Members will be expected to:
- Read the RFP and all Addenda.
- Read each Proposal and independently review and respond to the questions in the checklist.
- Include strengths and weaknesses observed during demonstrations and vendor clarifications to further your review feedback.
- Evaluate Proposals based only on the responses in the RFP and vendor clarifications.
- Complete the checklist provided by 11/8/2018 in preparation for the Evaluation Committee Consensus meeting.

Rationale for Scoring
• Each evaluator will be provided an Excel workbook to document their feedback in a consistent manner.
• In the checklist, the specifications are listed as recorded in the RFP.
• In the column "Meets Requirements":
  - Yes indicates that the vendor has addressed the specification and the evaluator is satisfied.
  - No indicates that the vendor has not addressed the specification in the RFP response.
  - Maybe indicates that the response is unclear and that the evaluator needs further clarification.
• Document the strengths and weaknesses of each specification in their appropriate columns.
• Any ambiguities should be noted in the Clarification column.

Consensus Meeting on November 19th and 20th
• After all the Evaluation Committee members have completed their checklists, the Evaluation Committee will meet to jointly discuss the merits of each Proposal.
• It is not necessary that the voting members concur on any given point; however, this meeting is an opportunity for Evaluation Committee members to discuss as a group, with input from SMEs, and ideally reach a group agreement to rank the vendors.
• Based on the ranking during the consensus meeting, the finalist vendors will be selected for Best and Final Offer (BAFO) negotiation.

Team Consensus Ranking Methodology
Team Consensus Ranking, Clarifications & Finalist Selection
• Support for cost analysis will be provided; as you review proposals, pay attention to areas that may increase cost or result in savings relative to other proposals.
• Outcomes from in-depth reference checks will be provided to the Evaluation Team.
• The Project Manager will compile all recorded strengths and weaknesses and validate them against vendor clarification responses, where appropriate, before the team meets for consensus ranking.
• Evaluation Team Members and Non-voting SMEs will review the compiled strengths, weaknesses, cost analysis, and clarifications.
• Everyone will meet and the voting members will conduct a consensus ranking to make recommendations for vendor finalist(s) for live product demonstration.
• Timely approval from the voting members on the consensus meeting notes and demonstration script is very important.
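As an illustration of how the individual checklist responses described above might be tallied ahead of the consensus discussion, here is a minimal sketch. The spec indices mirror the checklist numbering, but the evaluator responses shown are hypothetical, and the actual ranking was reached through group discussion rather than by any automatic count.

```python
# Illustrative only: tallies hypothetical "Meets Requirements" responses
# (Yes / No / Maybe) from individual checklists ahead of a consensus discussion.
from collections import Counter

# checklist[spec_index] = one response per evaluator (hypothetical data)
checklist = {
    "TS-009": ["Maybe", "No", "Maybe", "Yes", "Maybe", "Maybe", "No", "Maybe",
               "Maybe", "Yes", "Maybe"],
    "BUS-019": ["Yes", "Maybe", "Yes", "Yes", "Maybe", "Yes", "Yes", "Yes",
                "Yes", "Yes", "Yes"],
}

for spec, votes in checklist.items():
    tally = Counter(votes)
    print(spec, dict(tally))   # e.g. BUS-019 {'Yes': 9, 'Maybe': 2}
```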
Best And Final Offers (BAFOs) – Phase 2
The purpose of the BAFO step is to:
• allow bid offerors to revise their offers; revisions may apply to price, schedule, technical requirements, or other terms of the proposed contract
• respond to any errata in the vendor's proposal
• obtain the Vendor's best and final cost offer
• The Evaluation Team will narrow the Finalist list down, before beginning the BAFO process, to a preferred and possibly a second preferred Vendor.
• The Procurement Specialist will coordinate Legal reviews as appropriate.
• Negotiation meetings are allowed during BAFO; when the committee and the evaluation team are comfortable, a single BAFO meeting is conducted to finalize discussions and obtain approval.

Award Recommendation Package
• The Evaluation Team and Project Manager prepare an Award Recommendation Package with supporting documentation to justify the best-value decision.
• The award recommendation with supporting details will be presented by the Project Manager to the Voting Members and DPI Leadership for review and approval.
• Upon approval, the Project Manager submits the award recommendation package to the Procurement Specialist and Legal Counsel.
• The Procurement Specialist prepares the Award Recommendation Letter with supporting details and submits it to DIT for permission to award the contract(s).

Reminders to Ensure Success
• Failure to adhere to the ground rules may compromise the entire RFP process.
• If you have any questions about what is and is not permissible, please contact the Project Manager.
• The less information you share with those not on the Evaluation Team (or others required to support the decision-making process), the better.
• A document with contact information has been posted in the project SharePoint repository. Please provide your contact information, including alternate phone numbers to call or text in case of need.

RFP Deliverables (part of public record)
1. Request for Proposal and Responses
2. Evaluation Team (names, business title, and role in the evaluation)
3. Confidentiality Agreement and Non-Disclosure Forms from all evaluation team members
4. Kickoff Meeting Presentation and Minutes
5. Evaluation Checklists
6. Team Consensus Ranking
7. Meeting Minutes from the Consensus Meetings (approvals from all voting members are required on these documents)
8. Demonstration Script
9. Demonstration Vendor Clarification
10. Best and Final Offer documentation
11. Contract Award Recommendation Document

Evaluation Document Repository
• Evaluation Team Members have been granted access to the SharePoint site.
• Each member has an assigned folder with their name, to which only that evaluator has access.
• Non-DPI users should use their Microsoft credentials to log in.
[Screenshot: the Read to Achieve RFP Evaluation SharePoint site, showing the RFP and Addendum, RFP Responses, Evaluation Team Members, and Team Communication document libraries.]

Evaluation Checklist
• Each evaluator has a workbook per vendor in their assigned folder.
• Each workbook has two worksheets:
  1. Substantial Conformity Evaluation
  2. Desired Specification Evaluation
• Please document the strengths, weaknesses, and clarifications of the service as identified in the proposal.
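As a concrete picture of the checklist workbook just described, here is a minimal sketch that lays out one evaluator's workbook using openpyxl. The worksheet names and column layout follow the slides above, but the file name, the shortened specification text, and the use of openpyxl itself are illustrative assumptions, not the actual DPI template.

```python
# Illustrative only: builds a skeleton of one evaluator's checklist workbook.
# Column layout mirrors the slide (Meets Requirements, Strengths, Weaknesses,
# Clarifications); the spec rows and file name are placeholders.
from openpyxl import Workbook

wb = Workbook()

conformity = wb.active
conformity.title = "Substantial Conformity"
desired = wb.create_sheet("Desired Specification")

header = ["Index", "Evaluation Specification", "Meets Requirements",
          "Strengths", "Weaknesses", "Clarifications"]

for sheet, rows in (
    (conformity, [("TS-001", "Compatibility with common digital devices")]),
    (desired,    [("DS-001", "Assessments on supported devices")]),
):
    sheet.append(header)
    for index, spec in rows:
        sheet.append([index, spec, "", "", "", ""])   # evaluator fills these in

wb.save("evaluator_checklist_example.xlsx")
```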
Desired Specification Evaluation R FP C • Please document the strengths, weakness and clarifications of the service as identified in the proposal. 29 Questions m: 28 Public Schools 0f N?rth Carolina Fm . $298 552 ?oo?m 0295 LE HE Read to Achieve - Evaluation Checklist for Substantial Conformity to Technical Speci?cations Evaluators Amplify Education Inc. Curriculum Associates lstation NWEA Index Evaluation Sped?mtlon -Technical Speci?cation Meets Requlrernents Meets Requlremeno: Meets Requirements Meets Requirements Fr ?s Describe how the proposed solution is compatible with common digital devices including mobile and desktop devices. Describe any differences in the mobile offerings. Yes Yes Yes Yes Please include a preliminary Technical Architecture System Design document that illustrates the proposed solution. [Describe in the TASD how the items outlined in this attachment are expected to be addressed with due consideration for all speci?cations in this document. Provide supporting narrative, appropriate technical diagrams depicting the flow of data and system architecture} Yes Yes Yes Maybe The preliminary TASD submitted with RFP response is expected to be revised after solution delivery to TS-OOB ensure the "as designed" and "as delivered" solution still conforms to NCDPI and NCDIT standards. Any architectural or security changes require NCDPI and NCDIT approval. Describe your proposed approach for meeting this speci?cation. Yes No Yes YES TS-004 Describe how the proposed solution aligns with State Technical Architecture Yes Maybe Yes No Describe the following in the TASD referenced in Spec ll 2 - -Availability, IISecurabIlIty, -Scalability and - nteroperability Yes Yes Yes Yes This service will be classi?ed as "Program Critical/Moderate" based on the sensitivity of data used, the security controls under the ?Moderate? category column need to be implemented. The vendor?s security policy should include all the control categories as speci?ed under "Moderate" classi?cation. Please refer to pages 4 through 10 for the security control baselines table in the State Information TS-ODG Security Manual document. For example, {Access Control Policy and Procedures} under "Access Control" Family/Category is discussed in detail in the NIST 800-53 document. NC Statewide Information Security Manual - 3} Describe how you will ensure compliance to the NC Statewide Security Manual. Yes Maybe Yes No TS-DD7 Describe how the proposed service protects and FERPA data. Include details related to security of data stored at the vendor?s site as well as any server security policies. Yes Yes Yes Maybe Describe the Vendor?s proposed hosting site. All hosting sites must reside in the continental United States of America. Include in the hosting description answers to the following questions: TS-DGS i. Who is the hosting provider? n. Where is the primary site? Where is the disaster recovery site? iv. Are the hosting fac ies compliant with applicable governance (such as FERPA, Pll, or SAS 70 certification)?? yes, please provide copies of the most recent auditls}. Yes Yes Yes TS-OOB Describe how penetration testing is done and the current frequency. Maybe No Maybe Yes Describe the proposed solution?s system management practices with information on security patching. How often servers are patched, and what the Vendor's methodologies are for handling patching? 
Yes Yes Yes Maybe Describe what processes the Vendor has in place to allow the NCDPI to audit the physical environment T5011 [could apply to production, secondary site, etc.) where the application/service is hosted. The reserves the right to audit the physical environment. No No No No TS-012 Describe how is used within the application. Include in your description whether database network SSL, iPsec, SSH, etc], data?at?restfdata-in-motion and/or backup are used. lfthe proposed solution uses any of the foregoing typesfmethods of describe the Yes Yes 13 Describe the Vendor's process for handling and notifying a breach of PII and other non-public data. YES Yes TS-014 Describe the security auditing and related capabilities in place. Refer to State Security Manual referenced above. Yes NO TS-OIS Describe any proposed system security provisions not already addressed above. Currently NCDPI has about 26,000 K-3 staff and 500,000 K-3 students. Describe how the proposed solution will scale without impacting performance. Maybe Maybe Yes TS-Clli?r Describe how the proposed solution can integrate with Identity and Access Management Service. Here is a brief description ofthe integration methodology using Yes Yes Maybe TS-O 18 Describe how the proposed solution restricts access to users. What are the various attributes to restrict access and maintain con?dentiality. Define their hierarchy and hierarchy attributes. Yes Yes Maybe TS-019 Describe the proposed system?s data integration capab ties with other NCDPI authorized system{s}. Explain how your solution consumes and publishes data with other solutions. De?ne the integration priorities and integration interface. Planned data integration points may include but are not limited to Student Information System (student enrollment, transfers across districts, dual enrollment, summer camps, teacher data, school calendars, etc], Every Child Accountability Tracking System and State Operational Data Store List all other products [suite] that may integrate with the service and the mechanism of integration. Maybe Yes Yes TS-DZO Describe how the proposed solution is aligned with CEDS. Maybe No Yes Describe the solution?s use of SIF. No No No Provide a list of data elements currently in the system. Yes Yes Yes TS-023 Describe in detail the ETL process in place. Yes Yes Yes TS-024 Describe in detail the items and services to be covered under operational maintenance and support of the proposed solution. Maybe Yes Yes TS-DZS Describe in detail the data conversion processes to migrate detailed historical data and setup new students. Historical data should be retained for a minimum of four years based on current retention requirements and can be updated depending on need. This information can include previous assessment data and student information data. Maybe No Describe your capabilities and approach for transitioning the NC K-3 assessment data to the State at the end of the subscription service should NCDPI decide to end the use of the service in the future. Include what format the data will be provided e.g. Excel. Comma Delimited. Maybe Maybe TS-027 NCDPI will provide student and staff information which should be used as a system of record for students and staff. Describe the proposed solution?s data processing, cleansing and security process envisioned for NCDPI. include any data transformation, data latency messaging capabilities. NO TS-OZS All K-3 Assessment results (benchmark and progress monitoring] will be reported back to NCDPI or systems authorized by NCDPI. 
Describe your current data transfer capabilities using state ofthe art protocols and services. Describe the system?s ability to recover from failed or partial data transfer and your current noti?cation process for the same. No Yes Maybe Maybe TS-OZS NCDPI will be involved in User Acceptance Testing prior to initial deployment and testing enhancements before each planned release or adhoc bug ?xes. Describe the vendor?s software delivery process including the types of testing undertaken and Test Environment Management Process in supporting application releases and project delivery. No Describe' how the proposed solution conforms to current accessib' 'ty standards, including Section 508, W3C WCAG 2.0, in accordance with N.C.G.S. 168A-7 and the Code of Federal Regulations at 28 CFR parts 35 [title Ill and 36 [title Ill]. Maybe Describe the vendor's disaster recovery plan including the current Recovery Time Objective (RTO) and Recoveryr Point Objective Yes Yes Maybe TS-032 Describe the vendor?s capability to provide Tier 1 through Tier 3 customer support and help desk capab ties for school districts to provide a uni?ed single point of contact assistance with Technical issues. Describe the industry standard that is adopted for support Maybe Yes Maybe TS-033 Describe your proposed process for collecting and prioritizing user feedback and providing NCDPI a roadmap for enhancements and changes every quarter. Maybe Maybe Yes TS-034 Provide the minimum hardware, software and network bandwidth requirements for optimal performance ofthe proposed solution. Aiso indicate the sunsetting plan. Yes Yes Provide a 3rd party attestation, one of the following based on the system proposed: SOCZ Type II, FEDRAMP, ISO 2?001 Yes No No 036 Provide completed Vendor Readiness Assessment Report (VRAR) No Maybe Maybe No 22 17 26 9 Read to Achieve - Evaluation Checklist for Desired Speci?cation Evaluation Vendor Evaluators _Voting Members Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Mel Votetount?efou' Index [35-001 atlan - Desired about The Vendor?s SaaS Solution may provide functionality to conduct student assessments ori supported devices. Meets Yes Meeu? Maybe Meal: Yes Meet! Yes Meats Maybe Meeu Melt! Mee?bE Yes Yes Yes Yes Mm Bil!" No IDS-002 Support at least 99.9% uptime availability. However, it the vendor proposal is recommended by the evaluation team to the competitive range. all SLA terms ma be he otiated at that time. 05?003 Maybe Maybe Maybe Maybe Maybe Yes Support 3 to 5 second or less web page response times. However. it the vendor proposal is recommended by the evaluation team to the competitive range. all SLA terms ma be he otiated at that time. [35-004 Maybe Maybe Maybe Maybe Maybe Maybe Incorporates a personalized blended approach to assessment and teaming, including multiple teaching and assessing models to meet the demands of diverse student populations with a wide range of teaming needs. Yes Maybe Maybe N0 DS-OOS The assessment system incorporates innovative and evidence-based approaches ng assessment results to assist in recommending instructional strategies. Describe the system's capab to provide on demand (real time) assessment data and instructional strategies recommendation. Maybe Maybe Yes Maybe Maybe DS-DDG Electronic student, class, school. and district reports on assessment results to help all educators make instructional decisions based on the data. including a report that tracks student prooresstorowth. 
Maybe Yes Maybe Onltne professional development options for teachers and administrators pertaining to the use of the assessment system and how to analyze and use the data to make informed instructional decisions for students. 05?008 Ves Maybe Describe how your solution provides communications to parents including the ability to generate strategiesl'toots for them to be able to help their children at home. Explain the research and vetting process for these recommendations. Maybe Yes YES 05-009 05-010 05-011 D5012 05-013 14 Describe how the proposed solution includes a constructed response feature for responding to text dependent questions (NAEP) - ovtnationsre or?tcardtr No No no No Describe the proposed service's ability for authorized users to upload evidence of teaming. No No I10 ND Describe the proposed service's ability to maintain a portfolio of students ongoing development over time. No maybe Describe how the proposed solution approaches print awareness for young children. Maybe Maybe Maybe maybe Maybe Describe other open standards (other than CEDS de?ned in Technical 5 eci?cation #20 that are used torintero erabitity. Maybe Maybe maybe Maybe Describe the open standards that can be used for interoperability with your service. Maybe Maybe maybe Maybe a 4? 21 Isl. Vendor Evaluators Voting Members Voting Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Me Voting Mel VoteCotmt Before tomensus Index Wonm-Mmm ?Remit!" Meets" New? Hm V95 Describe how the proposed solution direct?iy assesses reading and pro-reading behaviors to support student's leam?mg development at the various grade leveis to inform instruction, including any observation-based practices itapplloable: a.oral language {Writs-Siva and reoeptivei Bus-001 .b.phonologionl and phonemic awareness cphonlcs d.vocabu ary e.?uency f.oomprehension No Yes Yes No Maybe Yes Yes Maybe Maybe No Yes 5 7 Describe how the proposed solution measures accuracy and rate for grades K, 1. 2. 3 {oral language. Bus-002 phonological and phonemic awareness, phonics, vocabulary. fluency, and comprehension) including any observation?based practices if applicable Maybe Yes Yes Maybe Maybe Maybe Yes Maybe Maybe Maybe Yes 4 4 0 Describe the validity and reliability of the assessment in the following areas: a.oral language b.phonological and phonemic awareness Bus-003 cphonics d.vocabulary e.f uency f.oornprehension Maybe Yes Yes Maybe Maybe Maybe Yes Yes Maybe No Yes 5 1 6 Describe how the proposed solution meets the requirements for a universal screener, including the following: a.reliabi ity at .30 or higher and concurrent or predictive validity at .60 or above Bus-004 b.benchmarking that provides large scale norm groups andfor research-based criterion cadequate sensitivity and classi?cation accuracy d.n1ultip e equivalent forms of screening assessments that enable the teacher to gauge short term growth No Maybe Yes Maybe Maybe Maybe Yes Yes No No Yes 4 4 3 Bus-ops Describe how the assessment identifies and reports students who may need intervention and enrichment. Yes Yes Yes Maybe Yes Yes Yes Yes Yes No Yes 9 4 3 pescrioe now the wuumug Lllaldum mm. 1mm are met oy Bus-006 the proposed solution: No Maybe Yes Maybe Maybe Maybe Yes Maybe No No Yes 3 0 Describe how the proposed solution will be able to assess progress based on large scale norm groups andz?or research-based criteria for district, school, grade level. 
class, group, individual, sub-group, and like peer group {refer to NC SLD policy Maybe Maybe Yes Maybe Maybe Maybe Yes Maybe Maybe No Yes 3 3 Describe how the measures align with best practices and adequately and accurately identify indicators BUS-003 of risk for dyslexia in grades K-3 as outlined in NC Session Law 201?- 149v4.pdf No Maybe Yes No Maybe No Yes Maybe No No Maybe 2 11 Bus-009 Describe how the system uses developmentally appropriate practices to assess students. No Yes Yes Maybe Maybe Maybe Yes Yes 5-010 Describe how the system incorporates educators andfor students using digital devices to assess reading and pro-reading behaviors. No Maybe Yes Yes Maybe Yes Yes Yes No Yes 6 3 2 BUS-011 . . . . Describe how the proposed solution Is a formative reading assessmen?s} tool for grades K. 1, 2, 3. Yes Yes Yes Maybe Maybe Yes Yes Yes Yes Yes Yes 9 6 0 Bus-012 . . . Describe how the proposed solution Is a dia??OS?IIl: reading assessment(s] tool for grades Yes Yes Yes Maybe No Yes Yes Ir' 3 3 Bus?013 Describe the miscue and skills analysis features to assist with analyzing and identifying students reading dif?culties. Maybe Yes Yes No Maybe No Yes Maybe Maybe No Yes 4 4 3 Bus-014 Describe how the proposed solution reports and displays results of progress monitoring. No Maybe Yes No Maybe No Yes Maybe No No Maybe 2 2 5 Describe how the proposed solution minimizes impact to instructional time while ensuring formative Bus-015 and diagnostic assessments are oonoucted. Provide estimates of assessment time. for both benchmarking and progress monitoring. per student per grade. No Maybe Yes Bus-016 Describe how the solution adapts as students gain mastery and have demonstrated proficiency. Testing should not repeat for mastered skills unless the educator selects to repeat testing. Yes Yes Yes Yes Maybe Yes Yes Yes Yes Yes Yes 10 10 Desaibe how the content standards I be aligned and realigned to State Board of Education adopted ?5?917 ELA Standard Course of Study {Spring 2017}. Provide speci?c mappingto the current standards. . Yes Maybe Yes Maybe Bus-013 Describe how the proposed solution can demonstrate high rates of predictability as to student performance on National and State assessments. Maybe Maybe Yes Yes Maybe Yes Yes Yes Maybe Yes 5 6 0 Explain honr the proposed solution an yield data that can be used with EVMS. Describe and provide any information that explains any alignment or relationship between the Bus-019 assessment and the Education Value?Added Assessment System (EVAASJ. us/software/evaas.htm Yes Maybe Yes Yes Maybe Yes Yes Yes Yes Yes Yes 9 11 Bu 5-0 20 NCDPI prefers a web-based software as a service application with the capability to support classroom student assessments without an internet connection, Describe how the proposed solution can satisfy this specification. No No BUS-021 Describe how the proposed solution allows for grouping and assigning student and educators outside of the Student Information System. Maybe Yes Yes BUS-022 Describe howI the proposed solution establishes instructional reading groups based on data-specific student performance data. Maybe Yes Yes 10 10 Bu 5-0 23 Describe how the proposed solution helps educators meet the individual needs of students by recommending adjustments to instructional practices. Maybe Yes Yes No Yes BUS-024 Describe how the Benchmarking process ocwrs in the proposed solution. NCDPI expects benchmarking three times a year for grades Describe how the proposed solution w'l provide data on the effectiveness of core support. 
Yes Maybe Describe how the proposed solution will provide data on the effectiveness of supplemental and intensive support. Yes Maybe No Maybe Yes No Bu 5?02? Describe the vendor?s proposed training model to train DPI Stakeholders (Estimated about 100), Master Literacy Coaches at the school districts (Estimated about 500} and Teachers, Exceptional Children Teachers, English as a Second Language Teachers, and literacy specialists (at least 25,000) initially and on an ongoing basis {refresher training}. Include any real-time training in the classroom, practice components, etc. Maybe Maybe Yes Maybe Bu S-0 28 Describe how the vendor evaluates training effectiveness and adapts to meet the needs. Include any strategies for ensuring consistent scoring. No Yes Maybe Maybe Maybe Eu 5?0 29 Describe in detail the training and professional development content areas and variety of levels. For example, product training, usability for both diagnostic and progress monitoring implementation, data analysis. etc. Maybe Yes Yes Yes Yes Yes Bu 5-030 Describe all training methods that the vendor will make available for the trainers like Technology based training or Training Presentation. Include all associated training costs in Attachment 6. Yes Maybe Maybe Yes Describe the strategy to provide demo site I accounts for trainers taking into account appropriate protections are in place to mask sensitive production data in the demo site. Please be sure to elaborate how the masked data resembles production data and is repeatable, while maintaining referential integrity, Maybe Yes Index EvaMtIon Speci?cation - Reporting Speci?cation Meets Meet: Meets Meets .Mee?tl? Idem iMectsRe Describe the proposed permissions for reporting services Maybe Maybe Yes RS-OOZ Describe the various report output formats graphs, charts, CW, and the report delivery methods Email, Excel etc. If email is offered as an option describe the data security po icies in place. Maybe Maybe Yes Yes Yes 515-003 Reporting feature is expected to provide the following capabil les: a.timely assessment results to teachers/administrators b.timely assessment results to oreporting results at the district, school, grade, teacher, group, and individual student level by all subgroups ESSA d.an en d-of~year student summary report for cumulative folder historical data year after year to identify consistent gaps and learning trends for district. school, grad e, teacher, group, and individual student level by all subgroups For each of the above, provide a timeframe for how frequently the data is refreshed {real-time, on demand, or some other interval}. Maybe Maybe Yes 11 RS-004 Provide communication to parents in a format that is clear and easy to understand after each benchmark Maybe RS-U 05 Describe the capability to track and report service usage. Maybe Maybe Yes NJ ?4 Describe any other reports that the solution offers. Maybe Maybe Index Evaluation Speci?cation - Todinio?yci?u?on Meetslloti Describe how the proposed solution is compatible with common di ital devices including mobile and desktop devices. Describe any differences in the mobile offerings. Maybe Maybe TS-UOZ Please include a preliminary Technical Architecture System Design (TASD) document publicfd ocu men tsliilesfl?echnical'ismArchit ecture% $520 Design%20Tem platedoc] that illustrates the proposed solution. {Describe in the TASD how the items outlined in this attachment are expected to be addressed with due consideration for all specifications in this document. 
Provide supporting narrative, appropriate technical diagrams depicting the flow of data and system architecture.) Maybe Maybe Maybe Maybe TS-OOB The preliminary TASD submitted with RFP response is expected to be revised after solution delivery to ensure the "as designed" and ?as delivered? solution still c0nforms to NCDPI and NCDIT standards. Any architectural or security changes require NCDPI and NCDIT approval, Describe your proposed approach for meeting this spec Maybe Maybe Maybe Yes Maybe Describe how the proposed solution aligns with State Technical Architecture Maybe Maybe Maybe Maybe TS-OOS Describe the following in the TASD referenced in Spec ti 2 OSecu rability, and olnteroperability Maybe Maybe Maybe Maybe security controls under the ?Moderate? category column need to be implemented. The vendors security policy should include all the control categories as speci?ed under ?Moderate? classi?catian. Please refer to pages 4 through 10 for the security control baselines table in the State Information security Manual document. For example, AC-J. [Access Control Policy and Procedures] under ?Access Control? Familnyategory is discussed in detail in the NIST 300-53 doCument. NC Statewide Information Security Manual - a} Describe how you will ensure compliance to the NC Statewide Security Manual. This service will be classified as ?9rogram CriticallModerate" based on the sensitivity of data used, the Maybe Maybe Maybe Yes Maybe 11 Describe how the proposed service protects PII and FERPA data. include details related to security of data stored at the vendor?s site as well as any server security policies. Maybe Maybe Maybe Maybe Yes TS-UUS Describe the Vendor?s proposed hosting site. All hosting sites must reside in the continental United States of America. Include in the hosting description answers to the following questions: i. who is the hosting provider? where is the primary site? Where is the disaster recovery site? iv, Are the hosting faci ies compliant with applicable governance {such as FERPA. PII, or SAS 70 certification]? if yes, please provide copies of the most recent auditlsl. Maybe Maybe Maybe Maybe TS-DOS Describe how penetration testing is done and the current frequency. Maybe Maybe Maybe Maybe TS-O 10 How often servers are patched. and what the Vendor?s methodologies are for handling patching? Describe the proposed solution?s system management practices with information on security patching, Maybe Maybe Maybe Maybe [could apply to production, secondary site, etc.) where the applicationfservice is hosted. The NCDPI reserves the right to audit the physical environment. Describe what processes the Vendor has in place to allow the NCDPI to audit the physical environment Maybe Maybe Maybe Maybe Maybe T5-012 network le.g. SSL, IPsec. SSH, SFTPIFTPS. etc], andfor backup are used. if the proposed solution uses any of the foregoing typesj'methods of describe the Describe how is used within the application. Include in your description whether database Maybe Maybe Maybe Maybe TS-O 13 Describe the Vendor's process for handling and notifying a breach of PII and other non?pu data. Maybe Maybe Maybe Maybe TS-O 14 Describe the security auditing and related capab referenced above. ies in place, Refer to State Security Manual Maybe Maybe Maybe Maybe TS-D 15 Describe any proposed system security provisions not already addressed above. Maybe Maybe Maybe Maybe TS-D 16 Currently NCDPI has about 26,000 staff and 500,000 K-S students. 
TS-017 - Describe how the proposed solution can integrate with the Identity and Access Management (IAM) service. Here is a brief description of the integration methodology using ... [remainder/link not legible in the scan]. [Ratings: Maybe, Maybe, Maybe, Yes, Maybe]
TS-018 - Describe how the proposed solution restricts access to users. What are the various attributes used to restrict access and maintain confidentiality? Define their hierarchy and hierarchy attributes. [Ratings: Maybe, Maybe, Maybe, Yes]
TS-019 - Describe the proposed system's data integration capabilities with other NCDPI-authorized system(s). Explain how your solution consumes and publishes data with other solutions. Define the integration priorities and integration interface. Planned data integration points may include but are not limited to the Student Information System (student enrollment, transfers across districts, dual enrollment, summer ..., teacher data, school calendars, etc.), the Every Child Accountability Tracking System, and the State Operational Data Store. List all other products (suite) that may integrate with the service and the mechanism of integration. [Ratings: Maybe, Maybe, Maybe, No]
TS-020 - Describe how the proposed solution is aligned with CEDS. [Ratings: Maybe, Maybe, No]
TS-021 - Describe the solution's use of SIF. [Ratings: Maybe, Maybe, Maybe, Maybe, No]
TS-022 - Provide a list of data elements currently in the system. [Ratings: Maybe, Maybe, Maybe, Maybe, Maybe]
TS-023 - Describe in detail the ETL process in place. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-024 - Describe in detail the items and services to be covered under operational maintenance and support of the proposed solution. [Ratings: Maybe, Maybe, Maybe, Maybe, Yes]
TS-025 - Describe in detail the data conversion processes to migrate detailed historical data and set up new students. Historical data should be retained for a minimum of four years based on current retention requirements and can be updated depending on need. This information can include previous assessment data and student information data. [Ratings: Maybe, Maybe, Maybe, Maybe, Yes]
TS-026 - Describe your capabilities and approach for transitioning the NC assessment data to the State at the end of the subscription service, should NCDPI decide to end the use of the service in the future. Include the format in which the data will be provided, e.g., Excel or comma-delimited. [Ratings: Maybe, Maybe]
TS-027 - NCDPI will provide student and staff information, which should be used as the system of record for students and staff. Describe the proposed solution's data processing, cleansing, and security process envisioned for NCDPI. Include any data transformation, data latency, and messaging capabilities. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-028 - All K-3 assessment results (benchmark and progress monitoring) will be reported back to NCDPI or systems authorized by NCDPI. Describe your current data transfer capabilities using state-of-the-art protocols and services. Describe the system's ability to recover from failed or partial data transfer and your current notification process for the same. (An illustrative transfer-verification sketch follows TS-030 below.) [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-029 - ... will be involved in User Acceptance Testing prior to deployment and in testing enhancements before each planned release or ad hoc bug fixes. Describe the vendor's software delivery process, including the types of testing undertaken and the Test Environment Management Process, in supporting application releases and project delivery. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-030 - Describe how the proposed solution conforms to current accessibility standards, including Section 508 and W3C WCAG 2.0, in accordance with 11.12.65 and the Code of Federal Regulations at 28 CFR Parts 35 (Title II) and 36 (Title III). [Ratings: Maybe, Maybe, Maybe, Maybe]
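The recover-from-partial-transfer language in TS-028 can be read as a checksum-verify-and-retry pattern. The Python sketch below is a generic stand-in under that assumption: the transfer function, file names, and notification step are hypothetical, not NCDPI's or any vendor's actual interface.

    # Illustrative sketch only: verify a transferred file by checksum and retry on failure.
    import hashlib
    import os
    import shutil
    import tempfile

    def sha256(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def transfer(src: str, dst: str) -> None:
        """Stand-in for an SFTP/HTTPS upload; a real partial failure would leave dst incomplete."""
        shutil.copyfile(src, dst)

    def send_with_verification(src: str, dst: str, attempts: int = 3) -> bool:
        expected = sha256(src)
        for attempt in range(1, attempts + 1):
            try:
                transfer(src, dst)
                if sha256(dst) == expected:      # complete, uncorrupted copy received
                    return True
            except OSError as exc:
                print(f"attempt {attempt} failed: {exc}")
        print("transfer failed after retries; notify the receiving system")  # notification hook
        return False

    # Tiny self-contained demo with a temporary file.
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "results.csv")
        with open(src, "w") as f:
            f.write("student_id,subtest,score\nSTU-01,ORF,145\n")
        assert send_with_verification(src, os.path.join(tmp, "results_copy.csv"))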
TS-031 - Describe the vendor's disaster recovery plan, including the current Recovery Time Objective and Recovery Point Objective. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-032 - Describe the vendor's capability to provide Tier 1 through Tier 3 customer support and help desk capabilities for school districts, providing a unified single point of contact for assistance with technical issues. Describe the industry standard that is adopted for support. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-033 - Describe your proposed process for collecting and prioritizing user feedback and providing NCDPI a roadmap for enhancements and changes every quarter. [Ratings: Maybe, Maybe, Maybe]
TS-034 - Provide the minimum hardware, software, and network bandwidth requirements for optimal performance of the proposed solution. Also indicate the sunsetting plan. [Ratings: Maybe, Maybe, Maybe, Maybe, Yes]
TS-035 - Provide a 3rd-party attestation, one of the following based on the system proposed: ... Type II, FedRAMP, ISO 27001. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-036 - Provide a completed Vendor Readiness Assessment Report. [Ratings: Maybe, Maybe, Maybe, Yes, Maybe]
Index / Evaluation Specification - Project Management Specification / Meets Requirements (one rating column per evaluator) [unattached marks in the scan: Maybe, Yes]
PM-001 - Include an initial schedule and the associated Work Breakdown Structure (WBS) for the proposed implementation plan. The Project Schedule in the proposal is to include significant phases, activities, tasks, milestones, and resource requirements necessary for NCDPI to evaluate the plan. [Ratings: Maybe, Maybe, Maybe, Yes, Maybe]
PM-002 - Include your current processes for the following: a. Configuration Management; b. Change Management; c. Quality Assurance; d. Risk and Issue Management; e. Communication Management. [Ratings: Maybe, Maybe, Maybe, Maybe]
PM-003 - The Vendor is expected to provide a full-time, experienced Project Manager to oversee and coordinate the daily activities of the Vendor's project team and serve as the primary contact for the project. [Ratings: Yes, Maybe, Maybe, Yes]
PM-004 - Acknowledge that the Vendor shall comply with and support State IT project processes (State-required processes, including participation in ..., and forms are described here: [link not legible in the scan]). [Ratings: Maybe, Maybe, Yes]
PM-005 - The vendor will be expected to deliver the following documents. Please acknowledge your agreement to deliver, and where the deliverables are tailored, please provide supporting justification. [Ratings: Maybe, Maybe, Yes]
PM-006 - Include resumes for key personnel required to deliver the work. [Ratings: Yes, Yes, Maybe, Maybe]
Index / Evaluation Specification - Service Level Specification / Meets Requirements (one rating column per evaluator) [unattached marks in the scan: Yes, No, Maybe]
SLA-001 - Provide a copy of the standard Service Level Agreement with this proposal submission, including provisions establishing remedies, such as refunds or service credits for NCDPI, in the event that the Vendor fails to meet the performance metrics established in the SLA. [Ratings: Maybe, Maybe, Maybe, Maybe]
SLA-002 - Describe the proposed solution's historical uptime and availability. [Ratings: Maybe, Maybe, Maybe, Maybe, Yes]
SLA-003 - [question text not legible in the scan] [Ratings: Maybe, Maybe, Maybe, Yes, Maybe, Maybe, Maybe, Maybe, Maybe]
Column totals (garbled in the scan): 320, 61, 27?, 122, 2?5

Read to Achieve - Evaluation Checklist for Desired Specification
Vendor Evaluators: Non-Voting SMEs (SME 1, SME 2, SME 3, SME ..., SME ...)
Index / Evaluation Specification - Desired Specification / Meets Requirements (one rating column per evaluator)
DS-001 - The Vendor's SaaS Solution may provide functionality to conduct student assessments on supported devices. [Ratings: Yes, Yes, Maybe, No, Yes, Yes, Maybe, Yes]
DS-002 - Support at least 99.9% uptime availability. However, if the vendor proposal is recommended by the evaluation team to the competitive range, all SLA terms may be negotiated at that time. [Ratings: Yes, Yes, Yes, Yes, Maybe, Yes]
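For context on the 99.9% uptime target in DS-002, the short calculation below converts an availability percentage into the downtime it allows per year and per month. It is plain arithmetic for reference, not an SLA term from the RFP.

    # Allowable downtime implied by a given availability percentage.
    HOURS_PER_YEAR = 24 * 365

    for availability in (99.0, 99.9, 99.99):
        down_hours_year = HOURS_PER_YEAR * (1 - availability / 100)
        down_minutes_month = down_hours_year * 60 / 12
        print(f"{availability}% uptime -> {down_hours_year:.1f} h/year "
              f"(~{down_minutes_month:.0f} min/month) of allowable downtime")

At 99.9%, that works out to roughly 8.8 hours of downtime per year, or about 44 minutes per month.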
DS-003 - Support 3-to-5-second or less web page response times. However, if the vendor proposal is recommended by the evaluation team to the competitive range, all SLA terms may be negotiated at that time. [Ratings: Maybe, Yes, Yes, Maybe, Maybe, Maybe]
DS-004 - Incorporates a personalized, blended approach to assessment and learning, including multiple teaching and assessing models to meet the demands of diverse student populations with a wide range of learning needs. [Ratings: No, Maybe, Maybe, Yes, Maybe, Maybe, Maybe]
DS-005 - The assessment system incorporates innovative and evidence-based approaches utilizing assessment results to assist in recommending instructional strategies. Describe the system's capability to provide on-demand (real-time) assessment data and instructional strategy recommendations. [Ratings: Yes, Yes, Yes, Yes, Maybe, Maybe, Maybe]
DS-006 - Electronic student, class, school, and district reports on assessment results to help all educators make instructional decisions based on the data, including a report that tracks student progress/growth. [Ratings: Yes, Yes, Yes, Maybe, Maybe, Yes]
DS-007 - Online professional development options for teachers and administrators pertaining to the use of the assessment system and how to analyze and use the data to make informed instructional decisions for students. [Ratings: Yes, Yes, Maybe, Yes, Yes, Maybe, Yes]
DS-008 - Describe how your solution provides communications to parents, including the ability to generate strategies/tools for them to be able to help their children at home. Explain the research and vetting process for these recommendations. [Ratings: Yes, Yes, Yes, No, Yes, Yes, Maybe]
DS-009 - Describe how the proposed solution includes a constructed-response feature for responding to text-dependent questions (NAEP) [reference link garbled in the scan]. [Ratings: No, Maybe, No, No, Maybe]
DS-010 - Describe the proposed service's ability for authorized users to upload evidence of learning. [Ratings: No, No, No, No, Maybe]
DS-011 - Describe the proposed service's ability to maintain a portfolio of a student's ongoing development over time. [Ratings: Yes, Yes, Yes, Maybe]
DS-012 - Describe how the proposed solution approaches print awareness for young children. [Ratings: No, Yes, Maybe]
DS-013 - Describe other open standards (other than CEDS, defined in Technical Specification #20) that are used for interoperability. [Ratings: Yes, Yes, Yes, Maybe, Maybe, Maybe]
DS-014 - Describe the open standards that can be used for interoperability with your service. [Ratings: Yes, Yes, Yes, Maybe, Maybe, Maybe]

Read to Achieve - Evaluation Checklist for Substantial Conformity to Specification
Vendor Evaluators: Non-Voting SMEs (SME, PMO, Security, Technology)
Index / Evaluation Specification - Business Specification / Meets Requirements (one rating column per evaluator)
BUS-001 - [question opening not legible in the scan] ... behaviors to support students' learning development at the various grade levels to inform instruction, including any observation-based practices if applicable: a. oral language (expressive and receptive); b. phonological and phonemic awareness; c. phonics; d. vocabulary; e. fluency; f. comprehension. [Ratings: Maybe, Yes, Yes, Maybe, Yes, Yes]
BUS-003 - Describe the validity and reliability of the assessment in the following areas: a. oral language; b. phonological and phonemic awareness; c. phonics; d. vocabulary; e. fluency; f. comprehension. (An illustrative reliability calculation follows BUS-005 below.) [Ratings: Yes, No, Maybe, Maybe, Maybe, Maybe, Yes, Maybe]
BUS-005 - Describe how the assessment identifies and reports students who may need intervention and enrichment. [Ratings: Yes, Yes, Yes, Yes]
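As a hedged illustration of the reliability evidence BUS-003 asks about (a later copy of the checklist, BUS-004, references reliability at .80 or higher), the Python sketch below computes a test-retest correlation on invented scores. The numbers are made up, and this is only one common form of reliability evidence, not Istation data or the evaluation team's method.

    # Illustrative sketch only: test-retest reliability as a Pearson correlation.
    from math import sqrt

    def pearson(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical scores for the same students on two administrations two weeks apart.
    first_admin = [185, 203, 221, 198, 240, 210, 195, 228]
    second_admin = [190, 200, 225, 201, 235, 214, 192, 230]

    r = pearson(first_admin, second_admin)
    print(f"test-retest reliability r = {r:.2f}")  # compared against the .80-or-higher benchmark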
BUS-006 - Describe how the following characteristics for progress monitoring between benchmarks are met by the proposed solution: a. brief; b. repeatable; c. sensitive to improvement over time, including short-term change; d. multiple equivalent forms of screening assessments that enable the teacher to gauge short-term growth (weekly or every other week); e. reliable; f. valid; g. measure accuracy and fluency with skills; h. quantitative results charted over time to calculate and document rates of improvement; i. allow for off-grade-level progress monitoring; j. ability for the results to be graphed against a goal (national norms and/or individualized goals) with 12-14 data points in 10 weeks' time. (An illustrative rate-of-improvement calculation appears after TS-035 at the end of this block.) [Ratings: Maybe, Yes, Maybe, No, Yes, No, Yes, No]
BUS-008 - Describe how the measures align with best practices and adequately and accurately identify indicators of risk for dyslexia in grades K-3 as outlined in NC Session Law 2017-... [bill link garbled in the scan]. [Ratings: Yes, No, No, Maybe, Maybe, Yes]
BUS-009 - [question text not legible in the scan apart from a reference to K-3 students] [Ratings: Maybe, Yes, Yes, Yes]
BUS-010 - Describe how the system incorporates educators and/or students using digital devices to assess reading and pre-reading behaviors. [Ratings: Yes, Yes, Yes, Yes, Yes, Yes]
BUS-011 - Describe how the proposed solution is a formative reading assessment(s) tool for grades K, 1, 2, 3. [Ratings: Yes, Yes, Yes, Maybe, Yes, Yes, No, Yes]
BUS-012 - Describe how the proposed solution is a diagnostic reading assessment(s) tool for grades K, 1, 2, 3. [Ratings: Yes, Yes, Yes, Yes, Yes]
BUS-015 - Describe how the proposed solution minimizes impact to instructional time while ensuring formative and diagnostic assessments are conducted. Provide estimates of assessment time, for both benchmarking and progress monitoring, per student per grade. [Ratings: Yes, Maybe, Maybe, Yes, Yes, Yes, Yes]
BUS-017 - Describe how the content standards will be aligned and realigned to the State Board of Education adopted ELA Standard Course of Study (Spring 2017). Provide specific mapping to the current standards. [Ratings: Yes, Yes, Yes, Yes, Yes, Yes, Yes]
BUS-019 - Describe and provide any information that explains any alignment or relationship between the assessment and the Education Value-Added Assessment System (EVAAS). [Ratings: Yes, Maybe, Yes, Yes, Yes, Yes, Yes, Yes]
BUS-024 - Describe how the benchmarking process occurs in the proposed solution. NCDPI expects benchmarking three times a year for grades K, 1, 2, and 3. [Ratings: Yes, Yes, Yes, Yes, Yes, Yes, Yes]
RS-003 - The reporting feature is expected to provide the following capabilities: a. timely assessment results to ...; b. timely assessment results to ...; c. reporting results at the district, school, grade, teacher, group, and individual student level by all ESSA subgroups; d. an end-of-year student summary report for cumulative folder historical data, year after year, to identify consistent gaps and learning trends at the district, school, grade, teacher, group, and individual student level by all subgroups. For each of the above, provide a timeframe for how frequently the data is refreshed (real-time, on demand, or some other interval). [Ratings: Yes, Yes, Yes, Yes, Yes]
RS-004 - Provide communication to parents in a format that is clear and easy to understand after each benchmark. [Ratings: Yes, Yes, Maybe, Yes, Yes, Yes]
TS-006 - This service will be classified as "Program Critical/Moderate" based on the sensitivity of data used; the security controls under the "Moderate" category column need to be implemented. The vendor's security policy should include all the control categories as specified under the "Moderate" classification. Please refer to pages 4 through 10 for the security control baselines table in the State Information Security Manual document. For example, AC-1 (Access Control Policy and Procedures) under the "Access Control" family/category is discussed in detail in the NIST 800-53 document. NC Statewide Information Security Manual - [link not legible in the scan]. a) Describe how you will ensure compliance with the NC Statewide Security Manual. [Ratings: Maybe, No, Maybe, Maybe, Maybe, Maybe, Maybe, Maybe, No]
TS-035 - Provide a 3rd-party attestation, one of the following based on the system proposed. [Ratings: Yes, No, Yes, No, Maybe, No]
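To illustrate the rate-of-improvement charting described in BUS-006 items h through j, here is a small Python sketch that fits a least-squares slope to invented weekly progress-monitoring scores and compares it to an aim line toward a goal. The scores, goal, and ten-week window are made up, and a least-squares slope is just one common way to compute a rate of improvement.

    # Illustrative sketch only: rate of improvement vs. an aim line, on invented data.
    def slope(weeks, scores):
        n = len(weeks)
        mw, ms = sum(weeks) / n, sum(scores) / n
        num = sum((w - mw) * (s - ms) for w, s in zip(weeks, scores))
        den = sum((w - mw) ** 2 for w in weeks)
        return num / den

    weeks = list(range(1, 11))                          # ten weekly probes
    scores = [42, 45, 44, 48, 51, 50, 54, 57, 59, 62]   # e.g., words correct per minute

    baseline, goal, span_weeks = scores[0], 70, 10
    observed_rate = slope(weeks, scores)                # least-squares gain per week
    needed_rate = (goal - baseline) / span_weeks        # aim-line rate to reach the goal

    print(f"observed rate of improvement: {observed_rate:.2f} per week")
    print(f"aim-line rate needed to reach {goal}: {needed_rate:.2f} per week")
    print("on track" if observed_rate >= needed_rate else "below aim line")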
PM-001 - Include an initial schedule and the associated Work Breakdown Structure (WBS) for the proposed implementation plan. The Project Schedule in the proposal is to include significant phases, activities, tasks, milestones, and resource requirements necessary for NCDPI to evaluate the plan. [Ratings: Yes, Yes, Yes, Yes, Maybe, Yes, No]

Read to Achieve - Evaluation Checklist for Desired Specification
Vendor Evaluators: Voting Members (Voting Member 1 through Voting Member 11), plus vote-count columns (column label partially garbled in the scan)
Index / Evaluation Specification - Desired Specification / Meets Requirements (one rating column per voting member)
DS-001 - The Vendor's SaaS Solution may provide functionality to conduct student assessments on supported devices. [Votes: Maybe, Yes, Yes, Yes, Maybe, Yes, Maybe, Maybe, Yes; counts: 5, 4, 11, 0]
DS-002 - Support at least 99.9% uptime availability. However, if the vendor proposal is recommended by the evaluation team to the competitive range, all SLA terms may be negotiated at that time. [Votes: Maybe, Yes, Maybe, Yes, Maybe, Yes, Maybe; counts: 3, 4, 11, 0, 0]
DS-003 - Support 3-to-5-second or less web page response times. However, if the vendor proposal is recommended by the evaluation team to the competitive range, all SLA terms may be negotiated at that time. [Votes: Maybe, Yes, Maybe, Yes, Maybe, Yes, Yes; counts: 4, 0, 3, 0, 9, 11]
DS-004 - Incorporates a personalized, blended approach to assessment and learning, including multiple teaching and assessing models to meet the demands of diverse student populations with a wide range of learning needs. [Votes: No, Yes, Yes, Maybe, Yes, Yes]
DS-005 - The assessment system incorporates innovative and evidence-based approaches utilizing assessment results to assist in recommending instructional strategies. Describe the system's capability to provide on-demand (real-time) assessment data and instructional strategy recommendations. [Votes: No, Maybe, Yes, Yes, Maybe, Yes]
DS-006 - Electronic student, class, school, and district reports on assessment results to help all educators make instructional decisions based on the data, including a report that tracks student progress/growth. [Votes: Yes, Yes, Yes, Yes, Maybe, Yes, Yes, Yes, Yes, Yes, Yes; counts (partially garbled): 10, 1, 10, 0, 1]
DS-007 - Online professional development options for teachers and administrators pertaining to the use of the assessment system and how to analyze and use the data to make informed instructional decisions for students. [Votes: Yes, Yes, Yes, Yes, Maybe, Yes, Yes, Yes, Yes, Yes, Yes; counts: 10, 1, 10, 1]
DS-008 - Describe how your solution provides communications to parents, including the ability to generate strategies/tools for them to be able to help their children at home. Explain the research and vetting process for these recommendations. [Votes: Maybe, Maybe, Yes, Maybe, Maybe, Maybe, Yes, Maybe, Maybe, Yes, Yes; counts (partially garbled): 4, 0, 4, 0]
DS-009 - Describe how the proposed solution includes a constructed-response feature for responding to text-dependent questions (NAEP) [reference link garbled in the scan]. [Votes recorded: Maybe; counts (partially garbled): 10, 1, 10, 1]
DS-010 - Describe the proposed service's ability for authorized users to upload evidence of learning. [Votes: No, No, No, No, Maybe, No, Maybe]
DS-011 - Describe the proposed service's ability to maintain a portfolio of a student's ongoing development over time. [Votes: Yes, No, Yes, Maybe, Maybe, Yes, Yes, Yes, Yes; counts: 6, 1, 2, 5, 1, 5]
DS-012 - Describe how the proposed solution approaches print awareness for young children. [Votes: Yes, Maybe, Yes]
DS-013 - Describe other open standards (other than CEDS, defined in Technical Specification #20) that are used for interoperability.
[Votes: Maybe, Maybe, Maybe, Maybe; counts (partially garbled): ..., 2, 9]
DS-014 - Describe the open standards that can be used for interoperability with your service. [Votes: Maybe, Maybe, Maybe, Maybe; counts: 4, 11, 0]
Total votes (partially garbled in the scan): 61, 24, 3?, ..., 31, 53
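The per-item Yes/Maybe/No counts shown in the voting-member checklist above can be reproduced by a simple tally. The Python sketch below does this for two items using the votes as transcribed (which may be incomplete where the scan is illegible); it is illustrative only, not DPI's actual tabulation.

    # Illustrative sketch only: tallying Yes/Maybe/No votes per checklist item.
    from collections import Counter

    ballots = {
        "DS-001": ["Maybe", "Yes", "Yes", "Yes", "Maybe", "Yes", "Maybe", "Maybe", "Yes"],
        "DS-010": ["No", "No", "No", "No", "Maybe", "No", "Maybe"],
    }

    for item, votes in ballots.items():
        tally = Counter(votes)
        print(item, f"Yes={tally['Yes']}", f"Maybe={tally['Maybe']}", f"No={tally['No']}")

For DS-001 this yields 5 Yes and 4 Maybe, which matches the first two counts recorded for that item above.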
Read to Achieve - Evaluation Checklist for Substantial Conformity to Specification (additional copy; column headings and most rating marks are illegible in this scan, so items are listed below as far as they can be read)
BUS-001 - ... behaviors to support students' learning development at the various grade levels to inform instruction, including any observation-based practices if applicable: oral language (expressive and receptive), phonological and phonemic awareness, ...
BUS-002 - Describe how the proposed solution measures accuracy and rate for grades K, 1, 2, 3 (oral language, phonological and phonemic awareness, phonics, vocabulary, fluency, and ...).
BUS-003 - Describe the validity and reliability of the assessment in the following areas: oral language, phonological and phonemic awareness, ...
BUS-004 - Describe how the assessment meets the following: ... at .80 or higher and concurrent or predictive validity at ... or above.
BUS-005 - Describe how the assessment identifies and reports students who may need intervention and enrichment.
BUS-006 - Describe how the characteristics for progress monitoring between benchmarks are met by the proposed solution: sensitive to improvement over time, including short-term change; equivalent forms of screening assessments that enable the teacher to gauge short-term growth (weekly or every other week); accuracy and ...; results charted over time to calculate and document rates of improvement; off-grade-level progress monitoring; results graphed against a goal (national norms and/or individualized goals).
BUS-007 - Describe how the proposed solution will be able to assess progress based on large-scale norms and/or research-based criteria for district, school, grade level, class, group, individual, and like peer group (refer to NC ... policy ...).
BUS-008 - Describe how the measures align with best practices and adequately and accurately identify indicators of risk for dyslexia in grades K-3 as outlined in NC Session Law 2017-....
BUS-010 - Describe how the system incorporates educators and/or students using digital devices to assess ....
BUS-011 - Describe how the proposed solution is a formative reading assessment tool for grades K, 1, 2, 3.
BUS-012 - Describe how the proposed solution is a diagnostic reading assessment tool for grades K, 1, ....
BUS-013 - Describe the ... and skills analysis features to assist with analyzing and identifying ....
BUS-014 - Describe how the proposed solution reports and displays results of progress monitoring.
BUS-015 - Describe how the proposed solution minimizes impact to instructional time while ensuring formative and diagnostic assessments are conducted. Provide estimates of assessment time.
BUS-016 - Describe how the solution adapts as students gain mastery and have demonstrated proficiency.
BUS-017 - Describe how the content standards will be aligned and realigned to the State Board of Education ELA Standard Course of Study (Spring 2017). Provide specific mapping to the current standards.
BUS-018 - Describe how the proposed solution can demonstrate high rates of predictability as to student ....
BUS-019 - Describe how the proposed solution can yield data that can be used with ... and any information that explains any alignment or relationship between the assessment and the Education Value-Added Assessment System (EVAAS).
BUS-020 - ... prefers a web-based software-as-a-service application with the capability to support student assessments without an Internet connection. Describe how the proposed ....
BUS-021 - Describe how the proposed solution allows for grouping and assigning students and educators.
BUS-022 - Describe how the proposed solution establishes instructional reading groups based on data.
BUS-023 - Describe how the proposed solution ... educators meet the individual needs of students by ....
BUS-024 - Describe how the benchmarking process occurs in the proposed solution. NCDPI expects ....
BUS-025 - [question text not legible in the scan]
BUS-026 - Describe how the proposed solution will provide data on the effectiveness of supplemental and ....
BUS-027 - Describe the vendor's proposed training model to train stakeholders (estimated about 100), Master Literacy Coaches at the school districts (estimated about 500), and Teachers, Exceptional Children Teachers, English as a Second Language Teachers, and literacy specialists (at least 25,000) initially and on an ongoing basis (refresher training). Include any real-time ....
BUS-028 - Describe how the vendor evaluates training effectiveness and adapts to meet the needs.
BUS-029 - Describe in detail the training and professional development content areas and variety of levels, for example, product training, usability for both diagnostic and progress monitoring.
BUS-030 - Describe all training methods that the vendor will make available for the trainers, like technology-based training or a training presentation. Include all associated training costs in ....
BUS-031 - Describe the strategy to provide demo site/accounts for trainers, taking into account appropriate protections in place to mask sensitive production data in the demo site. Please be sure to elaborate on how the masked data resembles production data and is repeatable, while ....
RS-002 - Describe the various report output formats (graphs, charts, CSV, etc.) and the report delivery methods (email, Excel, etc.). If email is offered as an option, describe the data ....
RS-003 - The reporting feature is expected to provide the following capabilities: timely assessment results to ...; reporting results at the district, school, ..., and individual student level by all ESSA subgroups; an end-of-year student summary report for cumulative folder historical data, year after year, to identify consistent gaps and learning trends for district, school, grade, teacher, group, and individual student level by all subgroups. For each of the above, provide a timeframe for how frequently the data is refreshed (real-time, on demand, or some other interval). [Ratings: Maybe, Maybe]
RS-004 - Provide communication to parents in a format that is clear and easy to understand after each benchmark. [Ratings: No, Maybe]
RS-005 - Describe the capability to track and report service usage. [Ratings: Maybe, Yes, Yes, Yes]
RS-006 - Describe any other reports that the solution offers. [Ratings and surrounding column marks garbled; legible marks: Yes, Yes, Maybe]
TS-001 - Describe how the proposed solution is compatible with common digital devices, including mobile and desktop devices. Describe any differences in the mobile offerings. [Ratings: Maybe, Maybe]
TS-002 - Please include a preliminary Technical Architecture System Design (TASD) document (template link garbled in the scan) that illustrates the proposed solution. Describe in the TASD how the items outlined in this attachment are expected to be addressed, with due consideration for all specifications in this document. Provide supporting narrative and appropriate technical diagrams depicting the flow of data and system architecture. [Ratings: Maybe, Maybe, Maybe]
TS-003 - The preliminary TASD submitted with the RFP response is expected to be revised after solution delivery to ensure the "as designed" and "as delivered" solution still conforms to NCDPI and NCDIT standards. Any architectural or security changes require NCDPI and NCDIT approval. Describe your proposed approach for meeting this specification. [Ratings (some marks garbled): Maybe, Maybe, Maybe, Maybe, Maybe]
TS-004 - Describe how the proposed solution aligns with the State Technical Architecture (it.nc.gov statewide architecture framework; URL garbled in the scan). [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-005 - Describe the following in the TASD referenced in Spec 2: availability, securability, scalability, and interoperability. [Ratings: Maybe, Maybe, Maybe, Maybe, Maybe]
TS-006 - This service is classified as "Program Critical/Moderate" based on the sensitivity of data used; the security controls under the "Moderate" category column need to be implemented, and the vendor's security policy should include all the control categories specified under the "Moderate" classification. Please refer to pages 4 through 10 for the security control baselines table in the State Information Security Manual document. For example, AC-1 (Access Control Policy and Procedures) under the "Access Control" family/category is discussed in detail in the NIST 800-53 document. NC Statewide Information Security Manual - [link not legible in the scan]. Describe how you will ensure compliance with the NC Statewide Security Manual. [Ratings: Maybe, Maybe, Maybe, Maybe, Maybe]
TS-007 - Describe how the proposed service protects PII and FERPA data. Include details related to security of data stored at the vendor's site as well as any server security policies. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-008 - Describe the Vendor's proposed hosting site. All hosting sites must reside in the continental United States of America. Include in the hosting description answers to the following questions: i. Who is the hosting provider? ii. Where is the primary site? iii. Where is the disaster recovery site? iv. Are the hosting facilities compliant with applicable governance (such as FERPA, PII, or SAS 70 certification)? If yes, please provide copies of the most recent audit(s). [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-009 - Describe how penetration testing is done and the current frequency. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-010 - Describe the proposed solution's system management practices, with information on security patching. How often are servers patched, and what are the Vendor's methodologies for handling patching? [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-011 - Describe what processes the Vendor has in place to allow the NCDPI to audit the physical environment (could apply to production, secondary site, etc.) where the application/service is hosted. The NCDPI reserves the right to audit the physical environment. [Ratings: Maybe, Maybe, Maybe, Maybe]
TS-012 - Describe how encryption is used within the application. Include in your description whether database encryption, network encryption (e.g., SSL, IPsec, SSH), data-at-rest/data-in-motion encryption, and/or backup encryption are used. If the proposed solution uses any of the foregoing types/methods of des