DEPARTMENT OF FINANCE
GAVIN NEWSOM, GOVERNOR
STATE CAPITOL, ROOM 1145, SACRAMENTO, CA 95814-4998

August 9, 2019

Ms. Ana Matosantos, Cabinet Secretary
Office of Governor Gavin Newsom
State Capitol, Governor's Office
Sacramento, CA 95814

Dear Ms. Matosantos:

Final Results: Ernst & Young Independent Assessment of the Department of Motor Vehicles' New Motor Voter Application

Pursuant to a request by former Governor Brown in September 2018, the Department of Finance initiated a performance audit of the Department of Motor Vehicles' (DMV) Information Technology and Customer Service Functions. The results of that audit were released on March 1, 2019. To complement the performance audit, Finance also contracted for an independent technical assessment of DMV's Enterprise Applications Systems Environment, the California New Motor Voter Application (Motor Voter). On October 12, 2018, Finance released a Request for Offer, and the contract was awarded to Ernst & Young. The contract term was November 6, 2018, to July 1, 2019, with a maximum contract total of $417,736.00. Ernst & Young performed fieldwork at both the DMV and Secretary of State (SOS) offices.

The contract included five separate deliverables. These deliverables have been received over the last few months and have informed the work of the Department of Motor Vehicles. The fifth and final data validation report was received at the beginning of July. The topic of each assessment is as follows:

1. Business Process Assessment Report: A review and analysis of Motor Voter business processes.
2. System Development Assessment Report: A review and validation of the Motor Voter design and development code.
3. Motor Voter Risk Assessment: An identification of the most critical risks.
4. Quality Assurance Assessment Report: A review of the Motor Voter electronic files and file transfers between the DMV and SOS.
5. Pre-Integration with EASE File Validation Report: A review of the file transfers between April 23, 2018, and September 28, 2018. (This period was selected specifically for review because administrative processing errors had occurred during this time frame, ahead of full system integration.)

The first four assessments were performed against an established set of evaluation criteria to highlight risks and develop recommendations for DMV, and, where applicable, for SOS, to conduct future improvements to enhance the effectiveness of the Motor Voter program. The criteria were evaluated using a combination of techniques, including the review of project documentation (i.e., artifacts), stakeholder interviews, research on vendors and technologies used, technical code review and, in the case of the Risk Assessment, utilization of a web-based collaborative application to capture risk exposures through anonymous contributions from program team members. Many of the findings from the first four reports mirror those in Finance's audit report, and DMV has already addressed or is in the process of addressing these.

The scope of the fifth deliverable, the Validation Report, was limited to providing the results of data comparisons only and did not include an evaluation of the results. The scope did not include an explanation of these differences or whether they were significant. DMV and SOS technical staff have reviewed the results of the data transfer validation to determine whether differences were expected or whether these differences need further action to ensure the efficacy of the Motor Voter application.
From discussions with DMV and SOS, Finance has learned the differences cited were expected and did not jeopardize voter registration through the Motor Voter application. Highlights of the assessments are described below.

• An exact match comparison was performed between 3,184,724 records received by SOS and the corresponding DMV records sent. Each record within the data sets contained 41 fields, for a total of 130,573,684 data points checked. Ernst & Young identified 1,759,121 data points that did not contain the exact same data. The majority of the mismatches are not associated with core voter information. A review of the data found that 1,378,430 records, or 78 percent, had a mismatched "Creation date" or "Effective date," which is an expected outcome because each department has small differences in the format of stored date-stamp data. Additionally, data files are often batched and processed by DMV and SOS at different times. Other differences include 91,927 records without a matching "Former residence zip code" and 40,742 with a mismatched "Current mailing street address." These are both the result of differences in field management between the two systems but do not indicate any issues with actual voter registrations.

• Of the identified data points referenced above, 173,243 records contained mismatches between the DMV data set and the SOS data set in the "Political party" field. All were due to different designations between the two data sets in recording "no party preference" and "opting out" of registration versus the field being blank. This difference did not result in voters being registered to the wrong party.

• The report identified 92,201 records in the DMV data set that were not in the SOS data set. This number includes 55,662 records that were flagged and checked for quality control by DMV; subsequently, all approved records were sent to SOS in later batches as new records. The fact that the original records were not received by SOS was a result of the system working as expected. Additionally, 28,558 records were processed into the SOS system after the cutoff date for this analysis, and another 7,981 were not sent due to a missing or invalid address or county.

• In a separate but related finding, the report identified 83,684 records shown as duplicates in the SOS data set because they were sent initially to SOS with a blank "Political party" field value, and later the same records were resent with the corrected value of "No party preference" in the same field. This action resulted in no impact to voter eligibility.

As noted earlier, the data analyzed as part of the file validation was from April through September 2018. It is our understanding that DMV staff, with assistance from SOS staff where necessary, has subsequently addressed many of the factors that led to the differences noted in the report.

If you have any questions or need additional information regarding this matter, please contact Steve Wells, Principal Program Budget Analyst, at (916) 322-2283.

KEELY MARTIN BOSLER
Director

By: VIVEK
Chief Deputy Director
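To illustrate the date-stamp mismatch pattern described in the letter, the following minimal Python sketch shows how two records describing the same moment can fail a byte-for-byte comparison purely because of storage format. The specific formats are assumptions for illustration; the letter does not state the actual layouts used by DMV and SOS.

```python
from datetime import datetime

# Hypothetical formats: the letter notes only that DMV and SOS store
# date stamps slightly differently, not the actual layouts.
DMV_FORMAT = "%m/%d/%Y %H:%M:%S"   # assumed DMV layout
SOS_FORMAT = "%Y-%m-%dT%H:%M:%S"   # assumed SOS layout

def normalize(raw: str, fmt: str) -> datetime:
    """Parse a department-specific date stamp into a comparable value."""
    return datetime.strptime(raw, fmt)

dmv_value = "04/23/2018 09:15:00"
sos_value = "2018-04-23T09:15:00"

# An exact string comparison reports a mismatch even though both stamps
# describe the same moment -- the pattern behind the large "Creation
# date"/"Effective date" mismatch counts cited in the letter.
print(dmv_value == sos_value)                 # False
print(normalize(dmv_value, DMV_FORMAT) ==
      normalize(sos_value, SOS_FORMAT))       # True
```

Normalizing both sides before comparing would separate formatting noise of this kind from substantive mismatches.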
Department of Finance
Department of Motor Vehicles ─ Independent System Assessment
Business Process Assessment Report
February 21, 2019

Table of contents
1 Executive summary
2 Assessment
   Findings and Recommendations
   2.1 Customer Feedback Analysis
Appendix A: Deliverable Expectation Document
   Introduction
   1. Deliverable Description
   2. Deliverable Contents
   3. Key Activities
   4. Business Process Assessment Evaluation Criteria
   5. Deliverable Assumptions
   6. Acceptance Criteria
Appendix B: Interviews Performed
Appendix C: Artifacts Reviewed
Appendix D: Voter Registration Criteria
Appendix E: Acronyms and Definitions

1 Executive summary

Assembly Bill (AB) 1461 required California's Secretary of State (SOS) and Department of Motor Vehicles (DMV) to establish the California New Motor Voter (CNMV) program, which provides DMV customers opportunities to register to vote if they qualify. Pursuant to AB 1461, DMV is required to electronically provide SOS the records of individuals who apply for an original or renewal of a driver license or state identification card, or who provide a change of address, and who may be eligible to vote.

The Motor Voter project was created with collaboration from California's Department of Technology (CDT), DMV and SOS. CDT led development and implementation of the application. DMV acts as the customer-facing agency to collect, filter and send voter registration data. SOS receives voter registration data from DMV and incorporates the information into the VoteCal system to update voter registrations.

This business process assessment is the first in a series of assessments of the CNMV program.
The objective is to review the Motor Voter business processes against the following evaluation criteria to highlight risks and develop recommendations for improvement of the ongoing program's effectiveness. The criteria were evaluated using a combination of techniques including the review of project documentation (i.e., artifacts), stakeholder interviews, analytics and demonstrations of the Motor Voter application.

Business process assessment evaluation criteria:
• Business process owners accountable to the requirements for the application are clearly defined
• Software requirements were formally approved by the business process owners
• Federal, state and other requirements were identified and compliance was incorporated into the overall application requirements management
• Software requirements were formally approved prior to changes
• Requirements management process was thorough and rigorous
• Application requirements are complete, correct and consistent
• Data validation activities were designed to confirm the completeness, integrity and accuracy of data that is transferred from end to end, including the front-end application, back-end processing, source data files, intermediate data files and target data files
• File transfer records (pre-transmission) are evaluated for anomalies against a baseline
• The target database is evaluated for voter anomalies against a baseline
• The application is intuitive and efficient for the public

Future assessments are planned around system design and data integrity. A facilitated risk assessment is also planned.

The Motor Voter program represents a substantial investment and will require ongoing solution enhancements, upgrades and production support. Effective management of this program will require a systems development life cycle that includes defined roles and responsibilities across DMV and SOS, and ownership of decision-making to mitigate risk and maintain alignment with the program's stated objectives. The program should implement proficient independent program oversight to provide objective progress monitoring and advise program leadership on risk management. This assessment has identified risks that, if not addressed, will adversely impact the realization of the intended benefits of the program.

The following business process recommendations are prioritized to first implement foundational improvements to support the continued maintenance and operations of the program, followed by recommended enhancements to continuously improve information and usability of the application. These recommendations should be reviewed and considered for implementation by program stakeholders:

1. Redesign the existing program governance model to include Secretary of State accountability for program success at all levels of the project (e.g., steering committee, requirements management, testing, incident management).
2. Design a formal incident management process with clear roles and responsibilities, a single source of truth and formal root-cause analysis.
3. Utilize a proven software development life cycle (SDLC) methodology to structure, plan and control the program, including a standard set of tools with access for all stakeholders.
The SDLC should incorporate a requirements management process, including the definition of clear roles and responsibilities and formal approvals, and a testing strategy with entrance and exit criteria that includes unit, systems (hardware, software and security), integration (with SOS), performance and user acceptance testing (UAT).
4. Create required documentation, such as a requirements traceability matrix, data mapping specifications and a data validation strategy, which was absent from the project, so that program maintenance and operations (M&O) can better understand the current system, support future decision-making and avoid potential problems.
5. Create and implement a quality management plan for the SDLC so that ongoing application problems and future changes have clear quality assurance measures and clear decision-making protocols.
6. Establish a program work stream responsible for optimizing current DMV field operations business processes and preparing the field offices for ongoing changes.
7. Establish a program work stream responsible for public research to assess and resolve challenges with the application.
8. Utilize data-driven analytic tools to assist in understanding trends in the voter registration data compared to a baseline and to evaluate customer complaint data for enhancements based on public research.
9. Enhance the existing solution with real-time validation of voter eligibility using definitive data sources.

2 Assessment

This business process assessment was performed against an established set of evaluation criteria as detailed in Appendix A: Deliverable Expectation Document. The criteria were evaluated using a combination of techniques including the review of project documentation (i.e., artifacts), stakeholder interviews, analytics and demonstrations of the Motor Voter application. The table below represents findings observed throughout the assessment, risk implications related to the findings and the associated recommendations. Each finding (F1-F14) may have recommendations that tie to multiple overarching recommendations (1-9) identified in the Executive Summary above.

Findings and Recommendations

Evaluation criteria: Business process owners accountable to the requirements for the application are clearly defined

F1. Finding: Roles accountable to the business requirements were not clearly defined, communicated or upheld during the project, and this situation remains in the ongoing maintenance of the program.
• The project's Responsible, Accountable, Consulted and Informed (RACI) matrix and a project organizational chart were provided to the assessment team. However, there was a lack of awareness, use and enforcement of the RACI and organizational chart throughout the project.
• A program RACI and organizational chart, including stakeholders from both DMV and SOS, have not been established or communicated.
• Project and program personnel consistently communicated during assessment interviews that multiple people were thought to be responsible for tasks/areas, and there is inconsistency in lines of reporting and ownership of decision-making authority.
• Assignments of responsibilities did/do not align to individuals with the requisite experience and knowledge (e.g., individuals knowledgeable in relevant business requirements were not assigned the role of business process owner; developers wrote their own business requirements and defined priorities, due to lack of product owners).
Risk Implication: Without clear accountability and business process owners responsible for application requirements, there is risk of incomplete and inaccurate requirements being released into production.

Recommendation (R)
R1 - Redesign the existing program governance model to include Secretary of State accountability for program success at all levels of the project (e.g., steering committee, requirements management, testing, incident management).
R2 - Given that the voter registration application integrates with other DMV applications, SOS and DMV business process owners will need to work very closely together. Because SOS ultimately defines the voter eligibility requirements, a qualified SOS representative should be formally designated as a business process owner going forward in the program.
R3 - Select qualified business process owners/product owners who will be responsible for the requirements of the application and business processes. These may be different people or departments for different applications.
R4 - Update the project documentation (i.e., organization chart and RACI) to reflect these roles, and communicate and enforce the specific responsibilities of roles within the program.

Evaluation criteria: Software requirements were formally approved by the business process owners

F2. Finding: The software development methodology used was inconsistent throughout the project life cycle, and the program has not determined a methodology for changes and maintenance and operations. The Motor Voter project management processes did not enforce formal definition, documentation, review and approval of the project's business requirements.
• Portions of the project were conducted using an Agile methodology that relied on user stories in a tool called Jira to track the business requirements. However, product owners did not have access to Jira to provide direct input and formally document their approvals. Other portions of the project used a Waterfall methodology.
• Although the Secretary of State provided input to the applicable software requirements, there was no formal approval process between CDT, DMV and SOS over these requirements.
• The Secretary of State was not well integrated into the project review of the application prototypes and was shown the solution on a limited basis within the course of the project.

Risk Implication: Without an accountable business process owner (Waterfall) or product owner (Agile) formally approving requirements, there is risk that the software does not meet the needs of the business or that software requirements for the target system may be incomplete or inaccurate.

Recommendation (R)
R1 - Establish a process for formal, written approval of business requirements in whichever software development methodology will be utilized going forward. The process should include the definition of clear roles and responsibilities and formal approvals.
R2 - Business process/product owners should be responsible for approval and acceptance of proposed requirements, including ownership of business requirements development or the product backlog.
R3 - Utilize a common and proven SDLC methodology to structure, plan and control the program, including a standard set of tools with access for all stakeholders.

Evaluation criteria: Federal, state and other requirements were identified and compliance was incorporated into the overall application requirements management
F3. Finding: Although DMV and SOS resources collaborated to incorporate requirements under AB 1461, there was no formal process established for legal resources to review and approve that legislative requirements under AB 1461 were met. Additionally, there is no formal process in place to ensure new requirements or necessary changes to existing requirements are met.
• Language and flow within the voter screens changed throughout the project without notification to SOS.
• There did not appear to be active legal representation within the project to ensure requirements were met.
• The SOS representative identified as the lead for VoteCal and Motor Voter integration was not effectively incorporated into critical project tasks/activities.

Risk Implication: Lack of an appropriate compliance management plan could result in misinterpretation of requirements and the introduction of noncompliant requirements into the target system.

Recommendation (R)
R1 - Legal and compliance resources should be assigned to the program and included in periodic touch points for legal and compliance validation. Release processes for new/changed functionality should be updated to include defined interim touch points for legal validation. These could include production incident review sessions, requirements sign-off, sprint reviews, test case planning and user acceptance testing.
R2 - Legal and/or compliance-related defects or incidents should be immediately escalated to the legal resources assigned to the program.
R3 - Create a compliance management plan to ensure legal and compliance requirements are identified, incorporated, validated for accuracy after implementation and periodically re-validated over time.
R4 - A requirements traceability matrix can facilitate tracking of legal and compliance requirements and provide transparency.

Evaluation criteria: Software requirements were formally approved prior to changes

F4. Finding: The project change control process was not formalized, resulting in changes to software requirements without traceability to business requirements or documentation of approvals prior to implementation in production.
• A project change control log was kept for July, August and September of 2018. However, prior change requests were not tracked.
• There is no formal process in place for addressing changes to the SOS and DMV interface.

Risk Implication: Lack of an adequate and effective change control process and formal approval for software requirements can increase the risk of unintended modifications being released into production.

Recommendation (R)
R1 - A requirements traceability matrix should be developed and maintained to enable bi-directional tracing between changes and existing requirements.
R2 - Create, maintain and enforce a formal change control process to ensure changes to software requirements are formally validated and approved prior to implementation. The change control process should include not only a systematic approach to making configuration changes to the software, but also impact analysis to reduce potential disruption to ongoing project activities and to effectively manage impact to project schedule, cost and resource allocation.
R3 - All changes should be transparent to SOS, and SOS should be actively involved in the design, testing and approval of the interface from DMV to SOS.

Evaluation criteria: Requirements management process was thorough and rigorous
F5. Finding: The project did not maintain a requirements traceability matrix to validate that all requirements were built and tested. The lack of a requirements traceability matrix during the software development cycle created a gap in linking requirements to their origin and tracing them to test cases.

Risk Implication: Lack of an effective requirements management process, including a traceability matrix, may hinder the organization's ability to validate that all requirements are built and tested during the design, testing, training, and maintenance and operations processes.

Recommendation (R)
R1 - A requirements traceability matrix should be developed and maintained that allows bi-directional tracing of all requirements to test cases.
R2 - Create required documentation, which was absent from the project, so that program maintenance and operations can better understand the current system, support future decision-making and avoid potential problems. Some examples of required documentation are a requirements traceability matrix, data mapping specifications and a data validation strategy.

F6. Finding: Business requirements for the Motor Voter front-end application (Enterprise Application Services Environment - EASE) were not all captured in the standard DMV business requirements documentation and formally approved.

Risk Implication: Without a formal requirements document and approval, there is risk that the software does not meet the needs of the business or that software requirements for the target system may be incomplete or inaccurate.

Recommendation (R)
R1 - A requirements management approach should be clearly defined which identifies roles and responsibilities and provides approved tools and templates that are communicated and available for use by the project team. The process should include the formal review and approval of requirements by SOS and DMV.
R2 - The full set of requirements should be inventoried, recorded and formally approved so there is future traceability of the requirements to origin, design and test cases.

Evaluation criteria: Application requirements are complete, correct and consistent

F7. Finding: There was no formal requirements process by which SOS could review, verify and approve application requirements during the project, nor is there one in the current program.
• Requirements completion, accuracy and consistency cannot be validated due to missing project documentation and/or lack of a formal requirements management process for the current program.
• Available project documentation, and current program processes, do not reflect a distinction between legal and other categories of requirements. There is an absence of legal requirements formally documented and approved by legal representatives.
• There was no defined process by which SOS could review, verify and approve requirements.

Risk Implication: Lack of an effective requirements management process, including a traceability matrix, may hinder the organization's ability to validate that all requirements are built and tested during the design, testing, training, and maintenance and operations processes.

Recommendation (R)
R1 - Create and implement a quality management plan for the SDLC so that ongoing application problems and future changes have defined quality assurance measures and clear decision-making protocols.
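Several recommendations above (F4 R1, F5 R1) call for a requirements traceability matrix. As context, the sketch below shows one minimal way such a matrix could be structured; the field names and the sample entry are illustrative assumptions, since the report prescribes only bi-directional tracing between requirements, their origin, design and test cases.

```python
from dataclasses import dataclass, field

@dataclass
class Requirement:
    """One row of a minimal requirements traceability matrix (RTM)."""
    req_id: str
    origin: str                            # e.g., "AB 1461" or a change request ID
    description: str
    design_refs: list = field(default_factory=list)    # links to design artifacts
    test_case_ids: list = field(default_factory=list)  # links to test cases
    approved_by: str = ""                  # formal business-owner sign-off

# Hypothetical entry for illustration only.
rtm = [
    Requirement("REQ-001", "AB 1461",
                "Offer voter registration during eDL-44 transactions",
                design_refs=["DSN-EDL44-03"],
                test_case_ids=["TC-101", "TC-102"],
                approved_by="Business process owner"),
]

# Forward trace: flag any requirement with no test coverage.
untested = [r.req_id for r in rtm if not r.test_case_ids]
print(untested)   # [] -- every requirement above traces to at least one test
```

Even a simple structure like this supports the backward trace (test case to requirement to origin) that the findings note was missing.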
Evaluation criteria: Data validation activities were designed to confirm the completeness, integrity and accuracy of data that is transferred from end to end, including the front-end application, back-end processing, source data files, intermediate data files and target data files

F8. Finding: The Motor Voter system's data transfer and validation strategy is not formally documented. Additionally, documented requirements specific to interfaces, data transfer and data validation do not exist.
• There is no interface control document (ICD) or interface design specification documentation.
• Documentation of Motor Voter system source input and output target data for all data entry points (e.g., paper forms, electronic sources) does not exist. This information is necessary to validate that input and output data is being processed as defined by the system requirements.

Risk Implication: Without a comprehensive data validation process, there is increased risk of incomplete or inaccurate data being transferred to the target system.

Recommendation (R)
R1 - Create and execute a data validation test plan to substantiate that all mappings and transformation rules are correct.
R2 - Create/enhance the system integration testing and regression testing processes to compare source and target data results to demonstrate that mapping and transformation rules are correct and complete.
R3 - Create and maintain the interface design documentation to serve as supporting information for the M&O team.
R4 - Enhance the existing solution with real-time validation of voter eligibility using definitive data sources.

F9. Finding: The master test plan describes a project decision to intentionally "limit" the testing time for the application and to perform all types of testing simultaneously, rather than in successive phases (integration, system and user acceptance). Testing duration was limited to meet the imposed April 3, 2018 project deadline.

Risk Implication: Without sufficient testing time, there is risk that the team did not have adequate time to plan and execute all relevant test cases.

Recommendation (R)
R1 - Create and execute a formal and comprehensive test plan to fully re-test end-to-end business processes, incorporating all the interfaces, validations and reports.
R2 - Create and enforce a testing strategy for future changes with entrance and exit criteria that includes unit, systems (e.g., hardware, software and security), integration (with SOS), performance and user acceptance testing.

F10. Finding: The program does not have a formal production incident management process in place.
• No single database is used to capture, assess, manage and report on production incidents.

Risk Implication: Without a formal production incident management process, there is greater risk that incidents will not be identified, fully assessed and resolved in a timely manner. Tracking and managing incidents in disparate systems rather than a centralized database limits the team's visibility into the volume, categories, criticality, resource needs and trends of incidents.

Recommendation (R)
R1 - Develop an incident management process with defined roles and responsibilities. The process should log and track incidents in a central database, be workflow-driven and scalable, and have the capability to provide reporting for progress monitoring and root-cause analysis.
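As context for the data validation test plan recommended under F8, the sketch below shows one minimal way a mapping-and-transformation check could work. The field names and transforms are hypothetical; the finding notes that the actual data mapping specifications were never documented.

```python
# Hypothetical mapping rules from DMV source fields to SOS target fields.
MAPPING_RULES = {
    # target field       (source field,      transform)
    "mailing_zip":       ("zip_code",        lambda v: v.strip()[:5]),
    "party_preference":  ("political_party", lambda v: v or "No party preference"),
}

def validate_record(source: dict, target: dict) -> list:
    """Return the target fields whose values violate a mapping rule."""
    failures = []
    for tgt_field, (src_field, transform) in MAPPING_RULES.items():
        if target.get(tgt_field) != transform(source[src_field]):
            failures.append(tgt_field)
    return failures

src = {"zip_code": "95814-4998", "political_party": ""}
tgt = {"mailing_zip": "95814", "party_preference": "No party preference"}
print(validate_record(src, tgt))   # [] -- this record passes both rules
```

Run over every transferred record, a check of this shape substantiates the rules end to end rather than by sampling.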
Evaluation criteria: File transfer records (pre-transmission) are evaluated for anomalies against a baseline, and the target database is evaluated for voter anomalies against a baseline

F11. Finding: A baseline of typical daily records for trending analysis has not been established to identify abnormal changes that could indicate issues within the data records. SOS performs certain statistical analysis over data records transferred from DMV; however, the analysis does not include trending and comparison against an established baseline.
• The SOS weekly analysis includes a comparison and reconciliation of transactions processed in the DMV application programming interface to transaction results in VoteCal to confirm completeness of the data transfer. However, the analysis does not include a comparison to baseline trends to identify abnormal changes that could indicate underlying issues within the data records.

Risk Implication: Without an evaluation of expected data against a baseline, there is risk of incomplete or inaccurate data.

Recommendation (R)
R1 - Utilize data-driven analytic tools to assist in understanding trends in the voter registration data compared to a baseline to identify incomplete or inaccurate data.

F12. Finding: The DMV Enterprise Resource Management (ERM) group is responsible for performing manual audits of a sample of Motor Voter data.
• Currently, the Driver License Automation Development (DLAD) unit investigates issues identified in the manual audit.
• Manual audit issues are not formally logged/tracked in a central incident database.
• The audit performed by DMV ERM is manual in nature and designed to be a mechanism to validate the integrity of Motor Voter data. Due to the manual nature of the audit and the large number of data records, the audit is limited to a sampling approach and may not uncover systemic issues.
• The manual audit process, procedures and results are not formally documented.

Risk Implication: Without a documented audit process, there is risk that any resource turnover could result in poorly executed audits or delays in the process. A manual and partial audit (sampling) may not uncover systemic issues as well as a trend analysis compared to an established baseline would. Without a central database for audit findings, there is risk of not understanding trends or where to focus resources and remediation efforts.

Recommendation (R)
R1 - Document audit findings in a central database to allow for trend analysis, root-cause assessments, and prioritization of resources and remediation efforts.
R2 - Document the manual audit procedures to facilitate a consistent audit process that can be seamlessly transitioned to new audit team members as necessary.
R3 - Create a business case to automate the manual audit process to allow for greater coverage of data review (as close to 100% as practical) and a refocus of personnel on trend analysis and remediation activities.
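The baseline trend comparison recommended under F11 and F12 can be illustrated with a minimal sketch. The rolling window and deviation threshold below are illustrative tuning assumptions, not values from the report.

```python
import statistics

def flag_anomalies(daily_counts, window=30, threshold=3.0):
    """Flag days whose record volume deviates sharply from a rolling baseline.

    daily_counts: chronological list of records transferred per day.
    A day is flagged when it lies more than `threshold` standard
    deviations from the mean of the preceding `window` days.
    """
    flags = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline) or 1.0   # guard against zero spread
        if abs(daily_counts[i] - mean) > threshold * stdev:
            flags.append(i)
    return flags

# Thirty ordinary days followed by one abnormal spike.
counts = [10_000 + (d % 7) * 150 for d in range(30)] + [28_000]
print(flag_anomalies(counts))   # [30] -- the spike is flagged for review
```

A check like this would complement, rather than replace, the record-level reconciliation SOS already performs weekly.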
Evaluation criteria: The application is intuitive and efficient for the public

F13. Finding: Elements related to the Motor Voter questions within the Driver License or Identification Card Application (eDL-44) are not intuitive to the public, creating confusion and uncertainty for the customer when completing the application.
• DMV customers (the public) have indicated they do not understand why voter registration questions are included in the process of completing DMV transactions (i.e., driver license renewal, change of address), resulting in additional questions/clarification being required to complete the transaction.
• The registration options within the Motor Voter questionnaire are confusing to the public. The language used in the options does not provide a clear choice to maintain voter registration as is, without being removed from the current voter roll.
• The UX team was released from the project prior to go-live. As a result, not all potential enhancements could be assessed and implemented. In addition, the team was not available to assist with difficulties identified post-go-live.
• Members of the DMV project and program teams have communicated that not all recommendations provided by the UX team were implemented, which could have enhanced the intuitiveness of the current application.

Risk Implication: The lack of an intuitive and easy-to-use application may result in customer frustration/confusion and could result in longer transaction times, erroneous responses and reduced voter confidence in their registration status.

Recommendation (R)
R1 - Establish a user experience (UX) work stream to capture and implement changes to the application based on customer feedback.
R2 - Simplify/clarify the wording of the voter registration questions. This may be facilitated by a UX study of the application.
R3 - Assess and deliver additional training to DMV field offices to address questions asked by the public regarding the voter registration process. Consider a public outreach campaign to educate the public regarding the New Motor Voter program.
R4 - Establish a program work stream responsible for public research to assess and resolve challenges with the application.
R5 - Ongoing usability analytics should be created, evaluated and implemented so that DMV and SOS leadership can continually understand New Motor Voter program acceptance and issue trends based on public input.

F14. Finding: Training for the DMV field offices regarding the eDL-44 was communicated. However, offices were not provided with training to answer voter registration questions and concerns.
• Based on field visit interviews, DMV technicians do not know how to answer voter eligibility questions from the public, such as "If I'm on parole, am I still eligible to register?"

Risk Implication: The lack of an intuitive and easy-to-use application may result in customer frustration/confusion and could result in longer transaction times, erroneous responses and reduced voter confidence in their registration status.

Recommendation (R)
R1 - Establish a program work stream (organizational change management) responsible for optimizing current DMV field operations business processes and preparing the field offices for ongoing changes.

2.1 Customer Feedback Analysis

DMV provided available public feedback as input to the analysis of whether the New Motor Voter application is intuitive and efficient for the public. A set of customer feedback data was provided which included the following fields: date of the provided feedback, source of the comment (e.g., DMV website, call center, in-office) and the specific public comments. The data consisted of approximately 550 records which were deemed Motor Voter-related and used for analysis. The analysis consisted of a sentiment and category analysis.

Sentiment analysis performs a word-based identification of sentiment (i.e., positive, negative and neutral) within sentences, as well as the frequency of negative and positive words. The category analysis is a sentence-based analysis of text using machine learning methods to identify categories for each of the comments (e.g., operational, technical).
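A minimal sketch of the word-based sentiment counting described above follows. The word lists are illustrative assumptions only; the actual analysis used richer sentiment scoring plus a machine-learning category model.

```python
from collections import Counter

# Illustrative word lists, drawn from terms cited later in this section.
NEGATIVE = {"confusing", "confused", "forced", "unclear", "don't want"}
POSITIVE = {"easy", "fast", "helpful", "simple"}

def score_comments(comments):
    """Count positive and negative sentiment words across comments."""
    tally = Counter()
    for comment in comments:
        text = comment.lower()
        tally["negative"] += sum(word in text for word in NEGATIVE)
        tally["positive"] += sum(word in text for word in POSITIVE)
    return tally

feedback = [
    "The voter registration screen is confusing.",
    "I felt forced to register even though I already had.",
    "Renewal itself was easy.",
]
print(score_comments(feedback))   # Counter({'negative': 2, 'positive': 1})
```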
The results of this analysis highlight an overwhelmingly negative sentiment, with four times the number of negative comments over positive comments. Negative sentiment steadily increased from May through September (note that the data collected for the month of November is only partial). From the negative word analysis, the four most frequently observed negative words are "confusing" ("confused"), "forced", "not clear" ("unclear") and "do not want" ("don't want"), which suggests that the public is confused overall and feels forced to register to vote. Approximately 30% of the customer feedback comments analyzed indicated the public's suggestion for an option to bypass voter registration when they have already registered (see "Already Registered", "Option to Skip Voter Registration" and "Comments on Mandatory Voter Registration" in the graphic below).

Note: During the period of analysis, it was noted that wait times at the DMV field offices were unacceptably high, which may also have contributed to the public's negative sentiments.

Ongoing sentiment and category analytics should be conducted and evaluated regularly so that DMV and SOS leadership can continually understand New Motor Voter program acceptance and issue trends based on public input.

[Figure: customer feedback category analysis showing comment counts by category (e.g., operational, technical, already registered)]

Appendix A: Deliverable Expectation Document

Introduction

The objective is to review and analyze the Motor Voter business processes, including delivery channels. The assessment will highlight exposures and develop recommendations for improvement to help facilitate achievement of business objectives.

1. Deliverable Description

The business process assessment will provide findings and recommendations based on artifact analysis, stakeholder interviews and application demonstrations of the EASE Motor Voter application. Stakeholders interviewed are expected to be the project sponsor(s), process owners, developers, testers, DMV branch personnel or other relevant stakeholders represented in the project org chart.

2. Deliverable Contents

The Business Process Assessment will be a written report containing the results of the assessment in the form of findings and recommendations. The report will include an executive summary, identification of findings and actionable recommendations for improvement. Additionally, the report will list the received artifacts and stakeholders interviewed. The intention of the report is to identify risks and opportunities for improvement. The report will not disclose the specific stakeholder source(s) of the findings, but will attempt to corroborate findings with multiple sources. The report is advisory in nature. The assessment will not render an assurance report or opinion, nor will the Services constitute an audit, review, examination or other form of attestation as those terms are defined by the American Institute of Certified Public Accountants.
None of the Services or any reports will constitute any legal opinion or advice.

3. Key Activities
• Develop a basic understanding of the existing project processes, governance, organizational change management strategy and controls through document review and stakeholder interviews
• Create a business process artifacts request list and gather relevant documentation
• Perform initial analysis and formulate hypotheses
• Create interview questionnaires, conduct interviews and develop an understanding of the existing Motor Voter business processes and delivery channels
• Analyze and capture related issues and risks, and prepare assessment results and recommendations for resolution
• Provide and walk through the initial draft of the Business Process Assessment Report with the Department of Finance to obtain feedback
• Incorporate Department of Finance feedback and finalize the Report within five business days
• Deliver a soft copy of the Business Process Assessment Report for approval

4. Business Process Assessment Evaluation Criteria

Listed below are the specific criteria that will be evaluated as part of the business process assessment. Each of these criteria will be evaluated using a combination of techniques including the review of project documentation (i.e., artifacts), stakeholder interviews, demonstrations of the Motor Voter application and analytics. This is a current-state assessment; however, understanding data points leading up to EASE go-live will provide input to our analysis.

Item 1. Are clearly defined "business process owners" accountable to the requirements for the application?
- Implication: Without clearly accountable and skilled business process owners responsible for the complete and accurate requirements, there is risk of requirement gaps.
- Assessment approach: Confirm the scope of applications and delivery channels that feed into the data transfer; assess the defined roles and responsibilities of the requirements analysis; assess if business process owners understand their role and are accountable to the correct and complete requirements.
- Example stakeholders engaged: Business process owners; project sponsor.
- Example artifacts requested: Project RACI; requirements documentation and approval evidence.

Item 2. Were software requirements formally approved by the business process owners?
- Implication: The approved set of requirements for the target system may be incomplete or inaccurate.
- Assessment approach: Assess the business requirements documentation to determine if formal approval was completed.
- Example stakeholders engaged: Business process owners; Project Management Office.
- Example artifacts requested: Software requirements documentation.

Item 3. Were federal, state and other requirements identified and compliance incorporated into the overall application requirements management?
- Implication: The approved set of requirements for the target system may be incomplete or inaccurate.
- Assessment approach: Assess if compliance requirements were understood and included in the software requirements documentation.
- Example stakeholders engaged: Business process owners; developers; testers.
- Example artifacts requested: Business requirement documentation defining federal, state and other requirements; software requirements documentation; requirements traceability matrix.

Item 4. Were software requirements formally approved prior to changes?
- Implication: Increased risk of missed or misunderstood requirements.
- Assessment approach: Assess the change control log for changing business requirements and traceability back to approved business requirements.
- Example stakeholders engaged: Business process owners; developers; testers; Project Management Office.
- Example artifacts requested: Change control process; change control documents; software requirements documents.

Item 5. Is the requirements management process thorough and rigorous?
- Implication: Increased risk of missed or misunderstood requirements.
- Assessment approach: Assess the requirements management process to understand the controls used and whether it follows standard practices; assess the requirements management plan and determine if there is both documented evidence and stakeholder confirmation of the plan's adherence.
- Example stakeholders engaged: Business process owners; developers; testers; Project Management Office.
- Example artifacts requested: Requirements management plan; software requirements documentation.

Item 6. Are application requirements complete, correct and consistent?
- Implication: Increased risk of missed or misunderstood requirements.
- Assessment approach: Business process walkthrough demonstration of the solution with the process owners to observe the end-to-end solution; verify with stakeholders that the requirements and the current application and associated business processes satisfy their needs; assess the consistency of assumptions and requirements across the software requirements documentation.
- Example stakeholders engaged: Business process owners; developers; field office super users.
- Example artifacts requested: Change control process; change control log; software requirements documentation; software training materials.
Item 7. Were data validation activities designed to confirm the completeness, integrity and accuracy of data that is transferred from end to end, including the front-end application, back-end processing, source data files, intermediate data files and target data files?
- Implication: Without a comprehensive data validation process there is increased risk of incomplete or inaccurate data.
- Assessment approach: Identify the scope of applications that feed into the data transfer; observe a walk-through of the end-to-end data validation process; assess the data validation strategy and procedures to understand if there are procedures to proactively identify data integrity issues before transmission.
- Example stakeholders engaged: Data lead; data stewards; developers; testers.
- Example artifacts requested: Data validation strategy; data validation procedures; application data validation algorithms.

Item 8. Are the file transfer records (pre-transmission) evaluated for anomalies against a baseline?
- Implication: Without an evaluation of expected data there is risk of incomplete or inaccurate data.
- Assessment approach: Assess the pre-transmission process to understand if there are procedures to proactively identify data integrity issues before transmission; assess the data transmission process to determine if the process is designed to detect and respond to failed file transfers.
- Example stakeholders engaged: Data lead; data stewards; developers; testers.
- Example artifacts requested: Data analytics requirements.

Item 9. Is the target database evaluated for voter anomalies against a baseline?
- Implication: Without an evaluation of expected data there is risk of incomplete or inaccurate data.
- Assessment approach: Assess the post-transmission process to understand if there are procedures to proactively identify data integrity issues after transmission.
- Example stakeholders engaged: Data lead; data stewards; developers; testers.
- Example artifacts requested: Data analytics requirements.
Item 10. Is the application intuitive and efficient for the public?
- Implication: Without an intuitive and easy-to-use application, the public may be more likely to make mistakes or get frustrated and decide not to register, defeating the original intent of Motor Voter.
- Assessment approach: As part of the application walkthrough, evaluate the ease of use and intuitiveness of the application; assess a sample of training material developed for field office employee use; review a sample of customer complaints from field offices; confirm if training has occurred and obtain feedback on perception of training quality; conduct a walk-through of a transaction; assess field office personnel comprehension of the application and associated business processes.
- Example stakeholders engaged: Process owners; field office employees (two field office locations).
- Example artifacts requested: Customer complaints; customer survey data (if available).

5. Deliverable Assumptions
• 10-15 stakeholder interviews will be conducted
• 40-50 artifacts will be reviewed
• 1 deliverable for this task is expected - Business Process Assessment Report
• Requested artifacts will be received within two business days of request
• The scope of business process analysis does NOT include the interim phase (Phoebe)

6. Acceptance Criteria

Below are the specific acceptance criteria for this deliverable:
• Evaluation criteria are assessed
• Written acceptance of the deliverable received from one of the following individuals: Erica Gonzales, Chief, IT Consulting Unit (DOF), or Thomas Giordano, Oversight Manager, IT Consulting Unit
• Written deliverable acceptance form complete

Appendix B: Interviews Performed

The following is a list of interviews conducted as part of this business process assessment (individual names and titles omitted).

Table 1. Interviews conducted (organization and date)
DMV: 12/6/2018 (two interviews); 12/11/2018 (two interviews); 12/13/2018; 12/14/2018 (four interviews); 12/17/2018 (two interviews); 12/20/2018; 12/31/2018 (two interviews)
CDT: 12/19/2018 (two interviews); 12/21/2018 (two interviews); 1/16/2019
Census: 12/20/2018
SOS: 1/9/2019 (seven interviews)
Organization not recorded: 12/12/2018; 12/13/2018

Appendix C: Artifacts Reviewed

The following is a list of artifacts reviewed during this business process assessment.

Table 2. Artifacts reviewed

Appendix D: Voter Registration Criteria

Below is a table describing the current validation of registration data obtained via the California New Motor Voter program against the California voter registration eligibility requirements. Each requirement is listed with the DMV California New Motor Voter data validation approach and the SOS VoteCal data validation of California New Motor Voter records.

Requirement: A citizen of the United States
- DMV: None (self-attestation by the public via a signed affidavit at DMV).
- SOS VoteCal: None (self-attestation by the public via a signed affidavit).

Requirement: A California resident
- DMV: None (self-attestation by the public via a signed affidavit at DMV).
- SOS VoteCal: None (self-attestation by the public via a signed affidavit).
  Note: California counties subsequently send voter notification postcards to new/updated registrants. If a postcard is undeliverable, the county will inactivate the registrant.

Requirement: 18 years old or older on Election Day
- DMV: None (self-attestation by the public via a signed affidavit at DMV). Note: From an application perspective, there is validation using the birth date, and records are not released unless the record reflects that the customer is at least 16 years old; however, this is all based upon self-attestation. In the field offices, technicians are trained to enter the name and date of birth from the actual documents that are acceptable birth date/legal presence documents.
- SOS VoteCal: None. The birth date provided by the registrant is used by SOS but is not validated further.

Requirement: Not currently in prison or on parole for a felony conviction
- DMV: None (self-attestation by the public via a signed affidavit at DMV).
- SOS VoteCal: The California Department of Corrections and Rehabilitation sends monthly new and updated felon information to the SOS registration database. VoteCal then automatically sends potential voter-to-felon matches to the county. The counties then make the final determination on the match and confirm or deny it. Voters with confirmed matches are then cancelled.

Requirement: Not found to be mentally incompetent to vote by a court
- DMV: None (self-attestation by the public via a signed affidavit at DMV).
- SOS VoteCal: Counties receive conservatorship records directly from the courts. All new registrations are validated against the county records during processing.

Appendix E: Acronyms and Definitions

AB - Assembly Bill
CDT - California Department of Technology
CNMV - California New Motor Voter
DLAD - Driver License Automation Development
DLIR - Driver License Issuance Replacement
DMV - California Department of Motor Vehicles
EASE - Enterprise Application Services Environment
eDL-44 - Electronic Driver License/Identification Card Application
ERM - Enterprise Resources Management
ICD - Interface Control Document
M&O - Maintenance and Operations
RACI - Responsible, Accountable, Consulted and Informed (matrix)
SDLC - System Development Life Cycle
SOS - California Secretary of State
UAT - User Acceptance Testing
UX - User Experience

Department of Finance
Department of Motor Vehicles ─ Independent System Assessment
System Development Assessment Report
February 21, 2019

Table of contents
1. Executive summary
2. Assessment
   2.1 Findings and Recommendations
Appendix A: Deliverable Expectation Document
   Introduction
   1. Deliverable Description
   2. Deliverable Contents
   3. Key Activities
   4. System Development Evaluation Criteria
   5. Deliverable Assumptions
   6. Acceptance Criteria
Appendix B: Interviews Performed
Appendix C: Artifacts Reviewed
Appendix D: Acronyms and Terminology

1. Executive summary

Assembly Bill (AB) 1461 required California's Secretary of State (SOS) and Department of Motor Vehicles (DMV) to establish the California New Motor Voter (CNMV) program, which provides DMV customers opportunities to register to vote if they qualify. Pursuant to AB 1461, DMV is required to electronically provide SOS the records of individuals who apply for an original or renewal of a driver license or state identification card, or who provide a change of address, and who may be eligible to vote.

The Motor Voter project was created with collaboration from California's Department of Technology (CDT), DMV and SOS. CDT led development and implementation of the application. DMV acts as the customer-facing agency to collect, filter and send voter registration data. SOS receives voter registration data from DMV and incorporates the information into the VoteCal system to update voter registrations.

This system development assessment is the second in a series of assessments of the CNMV program. The objective is to review the Motor Voter system development against the following evaluation criteria to highlight risks and develop recommendations for the future improvement of the program's effectiveness. The criteria were evaluated using a combination of techniques including the review of project documentation (i.e., artifacts), stakeholder interviews, research on vendors and technologies used, and technical code review.
System development assessment evaluation criteria:
- Clearly defined “IT stakeholders” are accountable to the system design and programming quality for the application
- System and software design requirements are formally approved by the IT Architect
- Standards exist for code quality and there is a process in place for code review and approval
- Data governance processes are defined to manage the data quality
- The software test design and associated test cases achieve the desirable test coverage
- The project follows a standard software development methodology
- The software development process follows a rigorous and robust promote-to-production process
- The architecture took into consideration and characterized the concurrent users, communication channels and devices (e.g., mobile)
- The interface architecture was designed to provide sufficient availability and recoverability
- Nonfunctional requirements were identified and analyzed, and the resulting architecture satisfies the requirements
- The project has a transition strategy/plan, and it is being effectively implemented

A future assessment is planned around data integrity. A facilitated risk assessment is also planned.

Page 2 Executive Summary

The Motor Voter program represents a substantial investment and will require ongoing solution enhancements, upgrades and production support. Effective management of this program will require a software development life cycle (SDLC) that includes defined roles and responsibilities across DMV and SOS, and ownership of decision-making to mitigate risk and maintain alignment with the program’s stated objectives. The program should implement proficient independent program oversight to provide objective progress monitoring and advise program leadership on risk management. This assessment has identified risks that, if not addressed, may adversely impact the realization of the intended benefits of the program. The following system development recommendations are prioritized to first implement improvements to support the continued maintenance and operations of the program (1-5), followed by recommended enhancements to continuously improve the technical architecture and optimize the development process (6-8). These recommendations should be reviewed and considered for implementation by program stakeholders:
1. Transfer of knowledge of the California New Motor Voter application and infrastructure will be critical to DMV operational performance and to the effective management of ongoing incidents. For DMV to be prepared for that responsibility, there must be a robust transition plan to replicate and transfer the skills and knowledge from CDT and Cambria (consultant resources) to DMV personnel. At a minimum, the transition plan should include: a single owner responsible for developing and implementing the plan, definition and assignment of roles and responsibilities, processes and procedures, technical architecture, tools and an assessment of quantifiable completion criteria.
2. Appoint a qualified Solution Architect to mitigate risks with technical architecture to safeguard the application availability, scalability, recoverability and maintainability.
3. Utilize a proven SDLC methodology to structure, plan and control the program, including a standard set of tools with access for all stakeholders. Establish a process by which system design requirements and trade-offs are formally reviewed, evaluated and approved.
Migrating any changes to production should require additional formal approvals from the following accountable stakeholders: Product Owner, Solution Architect, Organizational Change Management, and Training and Communications.
4. DMV and SOS should jointly create a data governance plan which includes (but may not be limited to) defining data accountability, data management processes, data quality metrics and reporting. The plan should also include a defined data council, a set of data owners and a set of management procedures. Data quality metrics should be published and monitored.
5. Manage future change deployment risk through a phased deployment approach using a limited subset of users and field offices to minimize risk and increase the speed of fallback/recovery when there are deployment issues.
6. Set up separate/dedicated environments for development and testing to safeguard the isolation of development and testing activities. Design and implement automated end-to-end integration tests which are executed prior to any production deployment.
7. Implement automated monitoring of system workload and configure automated alerts so that operations can take appropriate and timely action if the system were to have capacity/performance issues or failed data transmissions.
8. Reassess the required infrastructure sizing estimates based on load testing and baselining/benchmarking and optimize the infrastructure service costs. Perform a single point-of-failure analysis to eliminate any single component causing an outage. Design and conduct periodic infrastructure resiliency tests. Determine the data retention policy for CNMV records and implement an archiving solution to avoid unnecessary storage costs and improve overall system performance.

Page 3 Assessment

2. Assessment
This system development assessment was performed against an established set of evaluation criteria as detailed in Appendix A: Deliverable Expectation Document. The criteria were evaluated using a combination of techniques including the review of project documentation (i.e., artifacts), stakeholder interviews and technical code review. Please see Appendix D for a glossary of acronyms and technical terminology. Below are the findings observed throughout the assessment, the risk implications related to the findings and the associated recommendations. Each finding (F1-F15) may have recommendations that tie to multiple overarching recommendations (1-8) identified in the Executive Summary above. Previous technical assessments were conducted by third parties. Findings from these third-party assessments are cross-referenced in the Findings and Recommendations table below where those findings were related to those identified in this technical assessment.

2.1 Findings and Recommendations

Evaluation criteria: Clearly defined “IT stakeholders” are accountable to the system design and programming quality for the application
F1. Finding: The current program does not have resources in key technical roles such as a solution architect and lead developer. Related finding also noted in a previous third-party assessment report.
Risk Implication: Without clearly accountable architects and lead developers, there is a risk of system design issues such as availability, scalability, maintainability and recoverability not being addressed effectively; there is currently no accountability for determining technical priorities or resolving development and architectural issues.
Recommendation (R)
R1 - Select qualified personnel to fill key technical roles, such as solution architect and lead developer, who will be responsible for the architecture, software design and programming quality.

Evaluation criteria: System and software design requirements are formally approved by the IT Architect
F2. Finding: The cloud-based system design requirements were not formally documented and approved prior to implementation.
Risk Implication: The set of system design and software requirements for the target system may be incomplete or inaccurate.

Page 4 Assessment

Lack of fully vetted system design requirements can result in an unreliable, non-scalable infrastructure.
Recommendation (R)
R1 - The architecture should be reviewed against enterprise standards (e.g., CDT guidelines for state-wide use or DMV standards) and go through a formal architectural review process to assess potential simplification and enhancement opportunities.

F3. Finding: The program has not conducted a formal architecture review and evaluation to assess technical options and trade-offs based on requirements.
- The architecture is considered overly complex by project personnel.
Risk Implication: An overly complex architecture will result in greater challenges for knowledge transfer, delays in issue resolution and a higher likelihood of issues during implementation of future changes or enhancements to the system.
Recommendation (R)
R1 - Establish a process by which the responsible solution architect formally defines system design requirements, evaluates design trade-offs and conducts peer reviews.

Page 5 Assessment

Evaluation criteria: Standards exist for code quality and there is a process in place for code review and approval
F4. Finding: Formal coding standards and application-wide code quality oversight are not in place.
- An informal code peer review process is followed; however, it is not consistently used across teams and there is no application-wide oversight. There is no lead developer responsible for code quality across the entire program.
- A source code analysis tool is being used by the development team to provide a health check and determine the test coverage of the code.
- Documented coding standards do not exist for the project. A developer’s guide is being compiled for DMV as part of the transition.
- A static code analysis performed on a sample of 80k lines of code resulted in 333 code “smells”, which is an indication of a design weakness in the source code.
Risk Implication: Without the use of coding standards and a code review process, there is an increased risk of software incidents, performance issues and application stability problems. Lack of an appropriate code review process could result in misinterpretation and faulty implementation of requirements. Effective code development standards, analysis and reviews improve code quality and consistency, lower the risk of incidents and increase the maintainability of the system.
Recommendation (R)
R1 - In addition to the peer review process, there should be an independent and periodic review of the code quality application-wide by the development team to uncover areas of risk and where there may be need for additional support.
R2 - Perform a static code analysis on the entire code base periodically (e.g., quarterly) and whenever a major release is migrated to user acceptance testing.
R3 - Document, communicate and enforce coding standards to provide existing and new program team members with a consistent set of standards to facilitate the creation of high-quality code.
R4 - Implement a process to identify and prioritize code smells with an emphasis on resolving the higher-priority items first and eliminating remaining smells over time.
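To make R2 concrete, the short sketch below shows one way a recurring static-analysis run could be turned into a pass/fail gate ahead of a UAT or production migration. It is illustrative only: the report does not prescribe a tool, and the JSON report format, field names and budgets here are assumptions.

    import json
    import sys

    # Hypothetical quality gate: fail the pipeline when static-analysis findings
    # exceed agreed budgets. Report format and thresholds are assumed, not drawn
    # from the CNMV program.
    SMELL_BUDGET = 100      # maximum tolerated code smells per scan
    BLOCKER_BUDGET = 0      # blocker-severity issues are never tolerated

    def check_quality_gate(report_path):
        with open(report_path) as f:
            findings = json.load(f)  # e.g., [{"type": "CODE_SMELL", "severity": "MAJOR"}, ...]
        smells = sum(1 for i in findings if i.get("type") == "CODE_SMELL")
        blockers = sum(1 for i in findings if i.get("severity") == "BLOCKER")
        print(f"code smells: {smells}/{SMELL_BUDGET}, blockers: {blockers}/{BLOCKER_BUDGET}")
        # A non-zero exit code fails the CI step and blocks the migration.
        return 0 if smells <= SMELL_BUDGET and blockers <= BLOCKER_BUDGET else 1

    if __name__ == "__main__":
        sys.exit(check_quality_gate(sys.argv[1]))

Run quarterly and on each UAT migration, a failing exit from such a gate forces triage of new findings before the release proceeds.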
Page 6 Assessment

F5. Finding: Configuration properties are being placed in the application source directory, which could negatively impact the performance of every application build.
Risk Implication: Without separating the configuration properties (e.g., screen resolution, database connections) from the source code there may be an unnecessary delay in the build process.
Recommendation (R)
R1 - Re-evaluate the project technical configuration setup to determine if there is an opportunity to prevent performance issues with the build process.

Evaluation criteria: A data governance process is defined to manage the data quality
F6. Finding: The program does not have a formal and approved data governance plan to manage the availability, usability and integrity of data.
Risk Implication: Without data governance, there is an increased risk of inconsistent, incomplete and inaccurate data.
Recommendation (R)
R1 - DMV and SOS should jointly create a data governance plan including a defined council, a set of data owners and a set of management procedures. Data quality metrics should be published and monitored.

Evaluation criteria: The software test design and associated test cases achieve the desirable test coverage
F7. Finding: The software testing design for the ongoing program (post-implementation) requires further enhancements for a program of this complexity and criticality.
- The project team did not utilize any type of formal analysis, such as a “pairwise” assessment, to understand all possible discrete combinations of parameters required for testing the application.
- There are no automated integration tests for regression testing of future changes to the system.
Risk Implication: Without assessing and testing all the possible combinations of the application parameters, defects may not be discovered, leading to an increased risk of data defects during development and data incidents in production. Without end-to-end automated test scripts for regression testing there is a higher risk of production incidents when changes are made to the applications.
Recommendation (R)
R1 - Consider the use of a pairwise assessment to optimize the number of test cases and test coverage.

Page 7 Assessment

R2 - Create and execute a formal and comprehensive integration test plan to fully re-test end-to-end business processes incorporating all the interfaces, validations and reports. These tests can then form the basis of automated end-to-end tests for ongoing regression testing.
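R1's pairwise technique (defined in Appendix D) can be illustrated with a small sketch. The parameters below are invented for illustration and are not the application's actual inputs; the point is that a greedy all-pairs selection covers every discrete value pair with far fewer cases than exhaustive enumeration.

    from itertools import combinations, product

    # Hypothetical test parameters; a pairwise suite must cover every value pair.
    params = {
        "channel":  ["field_office", "web", "mail"],
        "txn_type": ["original", "renewal", "change_of_address"],
        "opt_out":  [True, False],
    }
    names = list(params)
    # Every discrete value pair that must be exercised at least once.
    uncovered = {((a, va), (b, vb))
                 for a, b in combinations(names, 2)
                 for va in params[a] for vb in params[b]}

    def pairs_of(case):
        return {((a, case[a]), (b, case[b])) for a, b in combinations(names, 2)}

    suite = []
    for values in product(*params.values()):      # 3 * 3 * 2 = 18 combinations
        case = dict(zip(names, values))
        new = pairs_of(case) & uncovered
        if new:                                   # keep only cases covering new pairs
            suite.append(case)
            uncovered -= new

    print(f"{len(suite)} cases cover all pairs (vs. 18 exhaustive)")

A production-grade pairwise tool would minimize the suite further, but even this naive greedy pass typically cuts the case count roughly in half while preserving full pair coverage.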
Evaluation criteria: The project follows a standard software development methodology
F8. Finding: The software development methodology used during the project was inconsistent throughout the project life cycle. The program has not determined a methodology for changes and maintenance and operations. The Motor Voter project management processes did not enforce formal definition, documentation, review and approval of the project’s business requirements.
- Portions of the project were conducted using an Agile methodology that relied on user stories in a tool called JIRA to track the business requirements; however, product owners did not have access to JIRA to provide direct input for collaboration. Other portions of the project used a Waterfall methodology.
- Although the Secretary of State provided input to the applicable software requirements, there was not a formal approval process between CDT, DMV and SOS over these requirements.
- The Secretary of State was not well integrated into the project review of the application prototypes and was shown the solution on a limited basis within the course of the project.
Risk Implication: Without an accountable business process owner (Waterfall) or product owner (Agile) formally approving requirements, there is risk that the software does not meet the needs of the business or that software requirements for the target system may be incomplete or inaccurate. Lack of a common set of development tools for the project team impedes collaboration, which can result in incomplete project inputs and unnecessary project delays.
Recommendation (R)
R1 - Establish a process for formal, written approval of business requirements in whichever software development methodology will be utilized going forward. The process should include the definition of clear roles and responsibilities and formal approvals.
R2 - Business process/product owners should be responsible for approval and acceptance of proposed requirements, including ownership of business requirements development or the product backlog.
R3 - Utilize a common and proven SDLC methodology to structure, plan and control the program, including a standard set of tools with access for all stakeholders.
R4 - Create and implement a quality management plan for the SDLC so that ongoing application problems and future changes have defined quality assurance measures and clear decision-making protocols.

Evaluation criteria: The software development process follows a rigorous and robust promote-to-production process

Page 8 Assessment

F9. Finding: The program’s software development process does not follow a rigorous and robust promote-to-production process.
- Several of the application components do not have separate development or test integration environments.
- For a period after implementation, changes to the production application have been made via a “big bang” approach whereby all field offices were affected simultaneously.
Risk Implication: Without separate development and testing environments, there is a lack of isolation for testing changes, creating opportunity for version control issues. By having “big bang” changes there is risk of greater operational impact affecting all field offices when there are any issues with the rollout.
Recommendation (R)
R1 - Prior to migrating changes to production, require formal approvals from the following accountable stakeholders (as applicable):
o Product owner(s)
o Solution architect
o Organizational change management
o Training and communication
R2 - Evaluate options to phase in changes rather than deploying in a “big bang” approach. Application or infrastructure changes to production should be released initially to a limited subset of users and field offices to minimize risk and increase the speed of fallback/recovery.
R3 - Set up separate/dedicated environments for development and testing to safeguard the isolation of development and testing activities.
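One way to operationalize R2 is sketched below: changes roll out in expanding waves of field offices, with a health check gating each wave so that a rollback touches only the offices reached so far. The office count, wave sizes and health check are hypothetical placeholders, not the program's actual deployment tooling.

    # Hypothetical phased (canary-style) deployment across field offices.
    OFFICES = [f"FO-{n:03d}" for n in range(1, 173)]   # assume 172 field offices
    WAVES = [0.02, 0.10, 0.50, 1.00]                   # pilot, then widening waves

    def healthy() -> bool:
        # Placeholder: compare post-deployment error/incident rates to a baseline.
        return True

    def rollout(deploy, rollback):
        deployed = set()
        for fraction in WAVES:
            wave = [o for o in OFFICES[: int(len(OFFICES) * fraction)]
                    if o not in deployed]
            for office in wave:
                deploy(office)
            deployed.update(wave)
            print(f"wave complete: {len(deployed)} offices live")
            if not healthy():
                for office in deployed:
                    rollback(office)   # fallback limited to offices reached so far
                return False
        return True

    rollout(deploy=lambda o: None, rollback=lambda o: None)

Because the first wave touches only a handful of offices, a bad change surfaces (and is reversed) before it can affect statewide operations.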
Evaluation criteria: The architecture took into consideration and characterized the concurrent users, communication channels and devices (e.g., mobile)
F10. Finding: Per our interviews and review of project documentation, a systematic evaluation of the cloud-computing platform requirements was not performed and issues remain open.
- In July 2018, a third-party technical evaluation was performed and seven items were noted: five critical items and two unanswered questions. These seven items remain in an unknown state.
- No evidence was provided for load testing or benchmarking to demonstrate the analysis and basis for architecture sizing.
- A cloud-computing vendor is hosting the California New Motor Voter application. The service includes auto scaling, which can dynamically increase and decrease capacity as needed. However, auto scaling is not enabled, which can lead to an oversized infrastructure and unnecessary costs.
- There is no proactive periodic health check being performed of the cloud-computing platform.
Risk Implication: Without a thorough architecture characterization, there is increased risk of incomplete or inaccurate requirements and inadequate design.

Page 9 Assessment

Without periodic proactive analysis of the health of the cloud-computing platform configuration and utilization, there is risk of unexpected and negative operational impacts. By not utilizing the auto scaling capabilities, there is risk of performance degradation during peak periods. An oversized and non-optimized infrastructure will incur the risk of unnecessary costs.
Recommendation (R)
R1 - Appoint a solution architect to take responsibility for resolution of the critical and unanswered questions in the third-party assessment report.
R2 - Enable auto scaling within the cloud-computing platform as soon as possible.
R3 - Reassess the required infrastructure sizing estimates to determine if the service cost is optimal based on the actual workload.
R4 - Establish a monthly health check review to proactively monitor the cloud-computing platform utilization.

Evaluation criteria: The interface architecture was designed to provide sufficient availability and recoverability
F11. Finding: There is no ongoing process to test the system for resiliency failures. Related finding also noted in a previous third-party assessment report.
Risk Implication: Without periodic failover testing, there is an increased risk of component failures causing application outages and performance issues.
Recommendation (R)
R1 - Design and execute periodic resiliency tests to verify that components fail over properly; this testing should be performed quarterly, or biannually at a minimum.
R2 - Perform a single-point-of-failure analysis to eliminate any single component capable of causing an outage.

F12. Finding: There is no formal strategy defined for disaster recovery and activation of the recovery. Related finding also noted in a previous third-party assessment report.
Risk Implication: Without a disaster recovery strategy and viable plan, there is risk of an extended outage of the application in the event of a disaster.
Recommendation (R)
R1 - Define and develop a disaster recovery strategy and test and validate the implementation.

Page 10 Assessment
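A minimal sketch of the periodic resiliency test recommended in F11/R1 follows. The health endpoint and the stop_instance hook are assumptions for illustration; a real drill would target the program's actual redundant components and record results for trending.

    import time
    import urllib.request

    HEALTH_URL = "https://example.invalid/healthz"   # hypothetical health endpoint
    FAILOVER_BUDGET_SECONDS = 60                     # agreed recovery window

    def service_up() -> bool:
        try:
            with urllib.request.urlopen(HEALTH_URL, timeout=5) as resp:
                return resp.status == 200
        except OSError:
            return False

    def failover_drill(stop_instance) -> bool:
        stop_instance("app-node-1")                  # inject failure on one redundant node
        deadline = time.time() + FAILOVER_BUDGET_SECONDS
        while time.time() < deadline:
            if service_up():
                print("failover succeeded within the recovery budget")
                return True
            time.sleep(5)
        print("FAILED: service did not recover in time; open an incident")
        return False

Run quarterly (per R1), the drill either confirms that redundancy works or produces an actionable failure before a real outage does.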
Evaluation criteria: Nonfunctional requirements were identified and analyzed, and the resulting architecture satisfies the requirements
F13. Finding: There is some ad hoc monitoring but no evidence of end-to-end monitoring for risks of data transmission failures or system availability issues.
Risk Implication: Without adequate monitoring and proactive alerting on application availability and data errors, there is an increased risk of service interruption to the public and of untimely or inaccurate transmission of records to SOS.
Recommendation (R)
R1 - Set up end-to-end automated monitoring and alerting of utilization issues and data errors so that operations can respond in a timely manner. Related recommendation also noted in a previous third-party assessment report.

F14. Finding: A data retention and archiving strategy is not in place to reduce data storage consumption, reduce costs and improve performance.
Risk Implication: Without a data retention policy and archiving process, there is a risk of reduced system performance and unnecessary costs due to unwarranted data storage.
Recommendation (R)
R1 - Working closely with SOS, determine a data retention policy and implement an archiving approach so that unnecessary data is not maintained in the system indefinitely.

Page 11 Assessment

Evaluation criteria: The project has a transition strategy/plan, and it is being effectively implemented
F15. Finding: A detailed knowledge transfer plan has been initiated but requires completion.
Current plan strengths:
- Identification and inventory of 56 program maintenance tools
- Transition team assignments, including contract owners
- Documentation of owners of previously delivered custom software products and associated production support
- A System Outage Communication Plan is included and identifies individuals to be called to triage and resolve system outages or problems
Current plan development needs (including, but not limited to):
- Training materials and activities are not included for each tool and software product identified
- Processes and procedures are lacking, such as how to perform daily support activities or check the health of the cloud services
- Owner(s) are not identified to be responsible for overseeing the knowledge transfer process; the knowledge transfer process should be treated like a project and managed accordingly
- There is no plan to conduct a formal skills assessment of the knowledge transfer recipients
- Knowledge transfer from current program architects to the DMV architects is not included
- There is no scorecard of knowledge transfer metrics (quantifiable where possible) to measure progress
Risk Implication: Without the implementation of a robust transition plan there is increased risk of degraded customer service, operational inefficiencies, compliance risks and substantial remediation/problem-solving efforts.
Recommendation (R)
R1 - Identify a single owner and sponsor for the transition plan from CDT/Cambria to DMV.
R2 - Complete and enhance the current plan.
R3 - Define and publish quantifiable completion criteria for knowledge transfer.
R4 - Regularly assess the confidence of both DMV and CDT/Cambria in the recipients’ level of comprehension.
R5 - Publish weekly metrics and escalate any issues or risks to leadership.

Page 12 Appendix A
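The end-to-end monitoring in F13/R1 can be reduced to a simple nightly reconciliation, sketched below. The record counts and the alert hook are invented; in practice the inputs would come from DMV transmission logs and SOS acknowledgement files.

    from dataclasses import dataclass

    @dataclass
    class WindowStats:
        window: str
        sent: int        # per DMV transmission log (assumed source)
        acked: int       # per SOS acknowledgement file (assumed source)
        rejected: int    # per SOS error report (assumed source)

    def check_window(stats: WindowStats, alert) -> None:
        unaccounted = stats.sent - stats.acked - stats.rejected
        if unaccounted != 0:
            alert(f"{stats.window}: {unaccounted} records unaccounted for "
                  f"(sent={stats.sent}, acked={stats.acked}, rejected={stats.rejected})")
        elif stats.rejected:
            alert(f"{stats.window}: {stats.rejected} records rejected; review the error report")

    # Example nightly batch window with invented counts.
    check_window(WindowStats("2019-02-20", sent=5120, acked=5098, rejected=22), alert=print)

Any nonzero gap pages operations the same night, rather than surfacing months later in a retrospective file comparison.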
Appendix A: Deliverable Expectation Document

Introduction
The objective is to assess and review the validity and quality of the Motor Voter design and development programming code. The assessment will highlight exposures and develop recommendations for improvement to help facilitate achievement of business objectives.

1. Deliverable Description
The system development assessment will provide findings and recommendations based on artifact analysis, stakeholder interviews, software code walk-throughs and application demonstrations of the EASE Motor Voter application. Stakeholders interviewed are expected to be the project management office, project architect(s), developers, testers and DMV branch personnel.

2. Deliverable Contents
The System Development Assessment will be a written report containing the results of the assessment in the form of findings and recommendations. The report will include an executive summary, identification of findings and actionable recommendations for improvement. Additionally, the report will list the received artifacts and stakeholders interviewed. The intention of the report is to identify risks and opportunities for improvement. The report will not disclose the specific stakeholder source(s) of the findings but will attempt to corroborate findings with multiple sources. The report is advisory in nature. The assessment will not render an assurance report or opinion under this report, nor will the Services constitute an audit, review, examination or other form of attestation as those terms are defined by the American Institute of Certified Public Accountants. None of the Services or any reports will constitute any legal opinion or advice.

3. Key Activities
- Create a design and development artifact request list and gather relevant documentation
- Perform initial analysis and formulate hypotheses
- Create interview questionnaires, conduct interviews and develop understanding of the existing Motor Voter design and development
- Assess the application software design to determine if it satisfies the requirements and evaluate for correctness, consistency, completeness and testability
- Assess the technical architecture and interface design between system elements to determine if they satisfy the requirements and evaluate for correctness, completeness and testability

Page 13 Appendix A

- Conduct inspection (sample) of code quality and adherence to development standards
- Evaluate the sizing estimates, performance test plan and performance test results
- Capture related issues and risks and prepare assessment results and recommendations for resolution
- Vet findings and recommendations with stakeholders prior to deliverable development and walk through the initial draft of the System Development Assessment Final Report with the Department of Finance to obtain feedback
- Incorporate Department of Finance feedback and finalize the report within five business days
- Deliver soft copy of the final System Development Assessment Final Report for approval

4. System Development Evaluation Criteria
Listed below are the specific criteria that will be evaluated as part of the system development assessment. Each of these criteria will be evaluated using a combination of techniques, including the review of project documentation (i.e., artifacts) and stakeholder interviews and through demonstrations of the Motor Voter application programming code. This is a current-state assessment; however, understanding data points leading up to EASE go-live will provide input to our analysis.

Item 1. Evaluation criteria: Are clearly defined “IT stakeholders” accountable to the system design and programming quality for the application?
Implication: Without clearly accountable architects and lead developers there is a risk of system design issues such as availability, scalability, maintainability and recoverability.
Assessment approach:
- Confirm the scope of applications and delivery channels that feed into the data transfer
- Assess the defined roles and responsibilities of the system design requirements analysis
- Assess if IT architects, developers or vendors understand their role and are accountable to the correct and complete requirements
Example stakeholders engaged: Lead Architect; Lead Developer; IT vendor(s); Project sponsor
Example artifacts requested: RACI; system design requirements documentation and approval evidence; vendor contracts

Page 14 Appendix A

Item 2. Evaluation criteria: Were system and software design requirements formally approved by the IT Architects?
Implication: The set of system design and software requirements for the target system may be incomplete or inaccurate.
Assessment approach:
- Assess the architecture review process
- Assess the system design and software requirements documentation to determine if formal approval was completed
- Assess the system design architecture documentation to determine if formal approval was completed
Example stakeholders engaged: Lead Architect; Project Management Office
Example artifacts requested: system design and software design requirements documentation; system design architecture approved documents

Item 3. Evaluation criteria: Are there standards for code quality and a process in place for code review and approval?
Implication: Without the use of coding standards and a code review process, there is increased risk of software incidents, performance issues and application stability problems.
Assessment approach:
- Assess the scan reports for bugs, vulnerabilities, smells and duplications
- Assess the coding standards
- Assess standards for parameter and configuration changes
- Assess the code review and quality assurance processes
- Assess the test-driven development
- Assess processes for migration of new code (and changes) into production
Example stakeholders engaged: Lead Developer; IT Developers
Example artifacts requested: static code analysis reports; sample code; coding standards; code review processes

Page 15 Appendix A

Item 4. Evaluation criteria: Are data governance processes defined to manage the data quality?
Implication: Increased risk of data errors.
Assessment approach:
- Assess the data governance processes and procedures documentation
- Assess the structured and unstructured data types
- Review the data schema/model for consistency across different data sources
Example stakeholders engaged: Data Owners; Data Architect; Data Lead
Example artifacts requested: data quality production incidents; data schema/model; technical architecture design showing the interactions of DMV applications; data dictionary; data governance processes and procedures

Item 5. Evaluation criteria: Does the software test design and associated test cases achieve the desirable test coverage?
Implication: Increased risk of data defects during development and data incidents in production.
Assessment approach:
- Review the test plan document
- Assess defect triage/remediation processes
- Assess and analyze the results of test cases
- Assess coverage analysis graphs
- Review the input parameters and their dependencies
Example stakeholders engaged: Test Manager; Testers
Example artifacts requested: testing strategy; test coverage reports; pairwise testing analysis and reports; UAT testing process; testing tools and frameworks

Item 6. Evaluation criteria: Does the project follow a standard software development methodology?
Implication: Increased risk of missed or misunderstood requirements.
Assessment approach:
- Assess the software development processes for backlog tracking, prioritization and automated testing
- Assess the user story prioritization in JIRA and other tools
Example stakeholders engaged: Product Owner; Scrum Master; Lead Developer; DevOps
Example artifacts requested: Scrum reports; product backlog documents; sprint backlog documents; product prioritization documents

Page 16 Appendix A

Item 7. Evaluation criteria: Does the development follow a rigorous and robust promote-to-production (CICD, continuous integration/continuous deployment) process?
Implication: Delay to production release and increased incidents.
Assessment approach:
- Assess the plan and processes for quality assurance activities (e.g., defect tracking, unit testing, source code tracking, technical reviews, integration testing and system testing)
- Assess the automation deployment strategy and setup of environments in the cloud/on-premise
- Assess the build quality of the CICD pipeline
- Assess the sampling of development objects to understand potential integrity issues
- Assess the automated deployment scripts
- Assess the code review process
Example stakeholders engaged: IT Architect; IT Developers; IT Environment Manager; DevOps
Example artifacts requested: automation deployment scripts/code; cloud computing deployments; technical architecture design document

Item 8. Evaluation criteria: Did the architecture take into consideration and characterize the concurrent users, communication channels and devices (e.g., mobile)?
Implication: Without a thorough architecture characterization there is increased risk of incomplete or inaccurate requirements and inadequate design.
Assessment approach:
- Assess the concurrent user traffic
- Assess whether the design accommodates possible devices
- Assess whether the infrastructure suits the business requirements
Example stakeholders engaged: Cloud Architect; Lead Developer
Example artifacts requested: technical architecture design document; cloud computing deployments; cloud computing current environment; sizing estimates

Page 17 Appendix A

Item 9. Evaluation criteria: Was the interface architecture designed to provide sufficient availability and recoverability?
Implication: Increased risk of application downtime and performance issues.
Assessment approach:
- Assess the current state of the interface architecture based on the use cases and assess risk
- Assess the project design to define synchronous vs. asynchronous interaction between interface services
- Assess processes for monitoring unplanned downtime (HA and DR strategy)
Example stakeholders engaged: Enterprise Architect; Integration Architect
Example artifacts requested: technical architecture design document; interface architecture; interface specifications
Item 10. Evaluation criteria: Are nonfunctional requirements identified and analyzed, and does the resulting architecture satisfy the requirements?
Implication: Increased risk of performance and availability issues.
Assessment approach:
- Assess the nonfunctional requirements for availability, scalability, operability, maintainability and recoverability
Example stakeholders engaged: IT Architect; IT Developers
Example artifacts requested: technical architecture design document; sizing estimates and design alternatives considered; single-point-of-failure analysis and design alternatives considered

Page 18 Appendix A

Item 11. Evaluation criteria: Does the program have a transition strategy/plan, and is it being effectively implemented?
Implication: A transition plan and implementation that does not lay out the tasks and activities needed to efficiently move a software solution to an operations and maintenance environment may result in degraded customer service, operational inaccuracies and inefficiencies, compliance risks and substantial remediation/problem-solving efforts.
Assessment approach:
- Determine if a transition strategy/plan has been developed and communicated
- Determine if processes requiring transition have been inventoried and documented
- Determine if SLAs have been established that clearly define business roles and expectations, as well as service level expectations
- Determine if operational metrics have been created to monitor success
Example stakeholders engaged: Project Management Office; Product Owner; Business Analyst; Developers; Production Support
Example artifacts requested: transition plan that contains approach, deployment schedules, resource estimates, and identification of special resources and staffing needs; detailed knowledge transfer plan; management reports associated with the transition

Page 19 Appendix A

5. Deliverable Assumptions
- 8-10 stakeholder interviews will be conducted
- 20-30 artifacts will be reviewed
- 1 deliverable for this task is expected: the System Development Assessment Final Report
- Requested artifacts will be received within two business days of request
- The scope of the system development assessment does NOT include the interim phase (Phoebe)

6. Acceptance Criteria
Below are the specific acceptance criteria for this deliverable.
- Evaluation criteria are assessed
- Written acceptance of the deliverable received from one of the following individuals: Erica Gonzales, Chief IT Consulting Unit (DOF) or Thomas Giordano, Oversight Manager, IT Consulting Unit
- Written deliverable acceptance form complete

Page 20 Appendix B

Appendix B: Interviews Performed
The following is a list of individuals interviewed as part of the system development assessment.
Table 1. Interviews conducted
Individual Title Organization Date
01/08/2019
01/08/2019
01/09/2019
01/15/2019
CDT 01/16/2019
01/16/2019
01/17/2019
DMV 01/17/2019
DMV 01/22/2019
DMV 01/23/2019
CDT 01/24/2019
01/26/2019

Page 21 Appendix C

Appendix C: Artifacts Reviewed
The following is the list of artifacts requested and reviewed during the system development assessment. In addition, the team leveraged the artifacts from the previous business process assessment.
Table 2. Artifacts reviewed

Page 22 Appendix D

Appendix D: Acronyms and Terminology
AB - Assembly Bill
CNMV - California New Motor Voter
CDT - California Department of Technology
Artifacts - Project documentation
CICD - Continuous integration/continuous deployment; continuous integration (CI) is the practice of merging all developer working copies to a shared mainline several times a day, and continuous deployment (CD) aims at building, testing and releasing software with greater speed and frequency
SDLC - Systems development life cycle
DED - Deliverable Expectations Document
JIRA - Business requirements tool tracking user stories (requirements)
Smells - An indication of potential design weakness in source code
React CLI - Code library for building user interfaces
Pairwise - A combinatorial method of software testing where, given a set of input parameters, there exists a minimum number of scenarios with all discrete value pairs tested at least once
Big Bang - A type of implementation whereby all sites or end users are affected in one switchover to a new system, rather than a phased approach which reduces risk to the organization
EASE - The California Department of Motor Vehicles Enterprise Application Services Environment
HA - High availability; a characteristic of a system which aims to ensure an agreed level of operational performance, usually uptime, for a higher-than-normal period
DR - Disaster recovery
GitHub - A tool for source code management
DevOps - A software development methodology that combines software development and information technology operations to shorten the systems development life cycle while delivering features, fixes and updates frequently in close alignment with business objectives
SLA - Service level agreement; a commitment between a service provider and a client where particular aspects of the service are agreed upon between the service provider and the service user
UAT - User acceptance testing

Page 23 Appendix D

Cloud-based - A term that refers to applications, services or resources made available to users on demand via the internet from a cloud computing provider’s servers
Auto scaling - Auto scaling monitors applications automatically and adjusts capacity to maintain steady, predictable performance at the lowest possible cost

Page 24

Department of Finance
Department of Motor Vehicles ─ Independent System Assessment
Risk Assessment Report
March 8, 2019

Table of contents
Executive summary ....................................................................................................................................... 2
Risk assessment process ............................................................................................................................... 7
2.1 Risk identification session ............................................................................................................. 7
2.2 Risk validation session .................................................................................................................. 8
2.3 Risk Register ................................................................................................................................. 8
2.4 Impact and Likelihood Heat Map................................................................................................... 9
2.5 Program Action Plan Journey Map .............................................................................................. 10
Appendix A: Deliverable Expectation Document ........................................................................................ 11
Introduction .............................................................................................................................................. 11
1. Deliverable Description ....................................................................................................... 11
2. Deliverable Contents ........................................................................................................... 11
3. Key Activities ....................................................................................................................... 12
4. Deliverable Assumptions ..................................................................................................... 12
5. Acceptance Criteria ............................................................................................................. 13
Appendix B: ThinkTank Session Participants .............................................................................................. 14
Appendix C: Identification Session Risks ..................................................................................................... 16
Appendix D: Likelihood and Impact Scales ................................................................................................. 21
Appendix E: Acronyms and Definitions ....................................................................................................... 22

Page i Executive Summary

Executive summary
Assembly Bill (AB) 1461 required California’s Secretary of State (SOS) and Department of Motor Vehicles (DMV) to establish the California New Motor Voter (CNMV) program, which provides DMV customers opportunities to register to vote if they qualify. Pursuant to AB 1461, DMV is required to electronically provide SOS the records of individuals who apply for an original or renewal of a driver license or state identification card or who provide a change of address who may be eligible to vote. The Motor Voter project was created with collaboration from California's Department of Technology (CDT), DMV and SOS. CDT led development and implementation of the application. DMV acts as the customer-facing agency to collect, filter and send voter registration data. SOS receives voter registration data from DMV and incorporates the information into the VoteCal system to update voter registrations. This risk assessment is the third in a series of assessments of the CNMV program. The objective is to use a web-based application (ThinkTank) as a virtual and collaborative platform to engage California New Motor Voter program stakeholders in a facilitated brainstorming session to capture risk exposures through anonymous contributions that will lead to actions that drive program improvements. Conducting the risk assessment is intended to help answer the questions “what risks are most critical to the ongoing performance of the New Motor Voter program” and “where should program management focus their attention and resources in the short and long term.” Participants were asked to participate in one (1) one-hour web-enabled risk identification session and one (1) one-hour web-enabled risk validation session over the course of a two-week period. During the risk identification sessions participants were asked to provide two or more risks that would affect the success of CNMV maintenance and operations, and future enhancements.
During the risk validation sessions participants were asked to score the likelihood and impact of the most frequently identified risks. The Motor Voter program represents a substantial investment and will require ongoing solution enhancements, upgrades and production support. Effective management of this program will require continuous identification and evaluation of risks by all parties involved to mitigate risk and maintain alignment with the program’s stated objectives.

Page 2 Executive Summary

This risk assessment has identified risks that, if not addressed, will adversely impact the realization of the intended benefits of the program. The risks and corresponding likelihood and impact scores in this report were identified by the CNMV stakeholders who participated in the facilitated sessions. Participants included representatives from DMV, CDT, SOS, and Cambria. The participants in these sessions identified numerous areas of risk which they assessed as “highly likely” to occur (> 75% probability on average) and as having “high impact” on program objectives. These risks should be further reviewed and mitigation plans developed.
1. Ineffective governance
2. Insufficient application testing
3. Ineffective organizational change management
4. Ineffective knowledge transfer
5. Failure to create adequate documentation
6. Failure to reach common understanding of requirements
7. Ineffective project staffing plans
8. Ineffective user experience research and design

There is a direct correlation between the program risks uncovered by the CNMV stakeholders in this study and the recommendations previously identified in the CNMV independent assessments (Business Process Assessment and System Development Assessment). The recommendations communicated in those prior independent assessments that map to the risks identified by the CNMV stakeholders are listed below.
- Redesign the existing program governance model to include Secretary of State accountability for program success at all levels of the project.

Page 3 Executive Summary

- Design a formal incident management process with clear roles and responsibilities, a single source of truth and formal root-cause analysis.
- Utilize a proven software development life cycle (SDLC) methodology to structure, plan and control the program including a standard set of tools with access for all stakeholders.
The SDLC should incorporate a requirements management process, including the definition of clear roles and responsibilities and formal approvals, and a testing strategy with entrance and exit criteria which includes unit, systems (hardware, software and security), integration (with SOS), performance and user acceptance testing (UAT).
- Create required documentation, such as a requirements traceability matrix, data mapping specifications and data validation strategy, which was absent from the project, so that program maintenance and operations (M&O) can better understand the current system, support future decision-making and avoid potential problems.
- Create and implement a quality management plan for the SDLC so that ongoing application problems and future changes have clear quality assurance measures and clear decision-making protocols.

Page 4 Executive Summary

- Establish a program work stream responsible for optimizing current DMV field operations business processes and preparing the field offices for ongoing changes.
- Establish a program work stream responsible for public research to assess and resolve challenges with the application.
- Transfer of knowledge of the California New Motor Voter application and infrastructure will be critical to DMV operational performance and to the effective management of ongoing incidents.
- Appoint a qualified Solution Architect to mitigate risks with technical architecture to safeguard the application availability, scalability, recoverability and maintainability.
- DMV and SOS should jointly create a data governance plan which includes (but may not be limited to) defining data accountability, data management processes, data quality metrics and reporting. The plan should also include a defined data council, a set of data owners and a set of management procedures. Data quality metrics should be published and monitored.

Page 5 Executive Summary
- Utilize a proven SDLC methodology to structure, plan and control the program, including a standard set of tools with access for all stakeholders. Establish a process by which system design requirements and trade-offs are formally reviewed, evaluated and approved. Migrating any changes to production should require additional formal approvals from the following accountable stakeholders: Product Owner, Solution Architect, Organizational Change Management, and Training and Communications.
- Manage future change deployment risk through a phased deployment approach using a limited subset of users and field offices to minimize risk and increase the speed of fallback/recovery when there are deployment issues.
- Set up separate/dedicated environments for development and testing to safeguard the isolation of development and testing activities. Design and implement automated end-to-end integration tests which are executed prior to any production deployment.

Page 6

Risk assessment process
This risk assessment was performed against an established set of activities as detailed in Appendix A: Deliverable Expectation Document. California New Motor Voter program stakeholders from DMV, CDT, SOS, and Cambria were invited to participate in this risk assessment. The list of invitees was vetted and approved by leadership of each organization. The risk assessment was performed in two phases, an identification phase and a validation phase. The first phase consisted of sessions in which participants anonymously identified risks that could impact the CNMV program’s ongoing maintenance and operations and future enhancements. The second phase invited participants of the previous sessions to assess the impact and likelihood of the risks identified. The ThinkTank application was utilized to conduct these sessions virtually, enabling participation regardless of location. The application enables real-time anonymous input and participant voting, which allows for immediate analysis and identification of anomalies that require further discussion. Refer to Appendix B for the listing of participants.

2.1 Risk identification session
For the risk identification session, participants were provided with pre-read materials and were asked to be prepared to provide a minimum of two to three risks that would affect the success of DMV maintenance and operations and future enhancements of the CNMV program. The materials included a “risk universe” which provided a listing of risk examples. Risk identification sessions were held on 2/12/19 and 2/14/19 (participants elected which date to attend). The risk identification sessions were attended by thirty-five (35) of the forty-two (42) invitees and comprised program stakeholders from DMV, CDT, SOS and Cambria. During the sessions participants provided details of risks that they had identified during their involvement with the program, including: the risk name, risk definition, root cause(s) of the risk, a description of how the risk could impact CNMV performance, and the current or potential mitigation efforts related to the risk. As a result of the two risk identification sessions, seventy-nine (79) risks were identified by the thirty-five (35) participants. The seventy-nine (79) risks were categorized by combining risks with similar information provided by the participants.
To provide clarity of the data, the facilitation team consolidated the risks by theme, which resulted in thirteen (13) overarching risks. Eight (8) of the thirteen (13) were supported by five (5) or more of the seventy-nine (79) risks; those eight (8) were used in the subsequent risk validation sessions. Refer to Appendix C for a mapping of the risks identified to the overarching risks.

Page 7

2.2 Risk validation session
Risk validation sessions were held on 2/20/19 and 2/21/19 (participants elected which date to attend). The risk validation sessions were attended by thirty-two (32) of forty-two (42) invitees and comprised program stakeholders from DMV, CDT, SOS and Cambria. During the sessions participants were provided with a scale to use in rating the impact and likelihood of the eight (8) overarching risks identified in the risk identification sessions. The scales ranged from critical (5) to low (1) for the impact scale and almost certain (5) to rare (1) for the likelihood scale. Refer to Appendix D for the impact and likelihood scales with definitions.

2.3 Risk Register
Below are the results of the quantitative scores for impact and likelihood as assessed by the participants against the eight (8) overarching risks. The impact and likelihood scores are calculated based on the arithmetic average of input from all participants during the risk validation sessions. The overall score was calculated by averaging the impact and likelihood scores for each individual risk (e.g., Ineffective governance: (4.45 + 4.31)/2 = 4.38). 88.6% of the risks identified by CNMV stakeholders are rated as, or on the cusp of, high likelihood and high impact to the program based on their ratings. This is based on the seventy (70) risks identified in the table below as a percentage of the total risks identified (79) by the participants, which excluded incomplete risk statements.

Risk Register
No. | Overarching risk | Risk identification occurrences | Impact | Likelihood | Overall score
1 | Ineffective governance | 14 | 4.45 | 4.31 | 4.38
2 | Insufficient application testing | 6 | 4.44 | 4.29 | 4.37
3 | Ineffective organizational change management | 5 | 4.12 | 4.11 | 4.12
4 | Ineffective knowledge transfer | 12 | 4.18 | 3.70 | 3.94
5 | Failure to create adequate documentation | 9 | 4.06 | 3.79 | 3.93
6 | Failure to reach common understanding of requirements | 8 | 4.09 | 3.76 | 3.93
7 | Ineffective project staffing plan | 9 | 3.75 | 3.82 | 3.79
8 | Ineffective user experience research and design | 7 | 3.56 | 3.78 | 3.67

Page 8

2.4 Impact and Likelihood Heat Map
Below is a heat map representing the qualitative and quantitative results from the risk identification and risk validation sessions, which illustrates the likelihood of occurrence and the impact on the program if a particular risk is experienced.
[Heat map figure: the eight overarching risks (1. Ineffective governance; 2. Insufficient application testing; 3. Ineffective organizational change management; 4. Ineffective knowledge transfer; 5. Failure to create adequate documentation; 6. Failure to reach common understanding of requirements; 7. Ineffective project staffing; 8. Ineffective user experience research and design) plotted by Likelihood (x-axis, 1-5) against Impact (y-axis, 1-5).]

Page 9

2.5 Program Action Plan Journey Map
This journey map provides a conceptual set of sequenced actions to address the risks identified by the risk assessment participants. This visual can be leveraged by the CNMV program to drive improvement activities in a logical sequence over time.
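For clarity, the arithmetic in section 2.3 can be restated as a short sketch. The per-participant ratings below are invented; only the formula (impact and likelihood are means of the 1-5 ratings, and the overall score is their average) mirrors the report.

    from statistics import mean

    ratings = {
        # risk: ([impact ratings], [likelihood ratings]) on the 1-5 scales
        "Ineffective governance": ([5, 4, 5, 4], [4, 5, 4, 4]),
        "Insufficient application testing": ([5, 4, 4, 5], [4, 4, 5, 4]),
    }

    for risk, (impact, likelihood) in ratings.items():
        i, l = mean(impact), mean(likelihood)
        overall = (i + l) / 2      # e.g., (4.45 + 4.31) / 2 = 4.38 in the register
        print(f"{risk}: impact={i:.2f} likelihood={l:.2f} overall={overall:.2f}")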
Appendix A: Deliverable Expectation Document

Introduction

The objective is to use a virtual and collaborative platform to engage a broad spectrum of New Motor Voter program stakeholders in a facilitated brainstorming session to quickly capture risk exposures through anonymous contributions and provide the foundation for implementing program improvements. Conducting the risk assessment is intended to help answer the questions "what risks are most critical to the ongoing performance of the New Motor Voter program" and "where should program management focus their attention and resources in the short and long term." The process will use the ThinkTank application to conduct these sessions virtually, enabling participation regardless of location. The application provides the ability for anonymous input to promote honest feedback regardless of position or who else is participating. It facilitates an efficient risk identification process, which allows more time for virtual discussion, voting on key risks and generating buy-in. The voting capability allows for immediate analysis, quickly identifying alignment or strong differences that require further discussion.

1. Deliverable Description

Gather New Motor Voter program risks (forward looking) with input from the key stakeholders across the program. Collect qualitative and quantitative risk data to construct a risk register and enable program management to prioritize actions for program improvement. Participants will be New Motor Voter program staff and key stakeholders from DMV, CDT, SOS, and Cambria.

2. Deliverable Contents

The Risk Assessment will be a written report with an executive summary, descriptions of the risk assessment process, a populated risk register depicting the relative likelihood and impact of each risk, and a proposed set of sequenced actions for program improvement. Additionally, the report will include the facilitator questions and a list of the session participants. The report is advisory in nature. The assessment will not render an assurance report or opinion under this report, nor will the Services constitute an audit, review, examination or other form of attestation as those terms are defined by the American Institute of Certified Public Accountants. None of the Services or any reports will constitute any legal opinion or advice.

3. Key Activities

 Prepare
  o Communicate the purpose and benefit of the risk assessment to participating department leadership (DMV, CDT, SOS, and Cambria)
  o Prepare pre-read materials for participants
  o Create facilitated questions for session participants and review with DOF
  o Identify the list of session participants and review with DOF
  o Solicit executive support for participation and communicate the session(s) schedule to targeted participants
  o Set up the collaboration platform and test
  o Schedule and prepare sessions
 Conduct two facilitated risk assessment sessions to collect, analyze and aggregate risks
 Synthesize session data, populate the risk register and validate findings with department leaders (SOS, CDT, Cambria and DMV) and DOF
 Create a high-level roadmap of program improvements – a roadmap reflecting the major steps and/or milestones needed to reach the program's goals, including recommended specific steps for critical actions.
The roadmap and critical steps can be used to track the delivery of results
 Provide and walk through an initial draft of the Risk Assessment with the Department of Finance to obtain feedback
 Incorporate Department of Finance feedback and finalize the report within five business days
 Deliver a soft copy of the final Risk Assessment Final Report for approval

4. Deliverable Assumptions

 30–40 participants (95% attendance is assumed)
 Department leadership to approve the attendance list
 2 facilitated sessions to gather risks (a third session for risk prioritization may be considered, if needed)
 1 deliverable is expected for this task – Risk Assessment Final Report

5. Acceptance Criteria

Below are the specific acceptance criteria for this deliverable.
 A New Motor Voter risk register is populated with qualitative and quantitative data (impact and likelihood)
 A high-level roadmap is provided consisting of proposed sequenced actions for implementing program improvements
 Written acceptance of the deliverable received by the following individuals:
  o Erica Gonzales, Chief IT Consulting Unit (DOF) or Thomas Giordano, Oversight Manager, IT Consulting Unit
 Written deliverable acceptance form complete

Appendix B: ThinkTank Session Participants

The following is a list of individuals who participated in at least one ThinkTank session for the risk identification sessions and/or the risk validation sessions.

Table 1. ThinkTank session participants, by organization: CDT – 3; Census (former CDT) – 1; DMV – 25; SOS – 5; N/A – 1.

Appendix C: Identification Session Risks

This table provides the mapping of participant risk statements (paraphrased) to the overarching risks, as compiled by the risk assessment facilitators.

Table 2: Identification Session Risks/Issues

Ineffective governance
1. Lack of project champions that are fully engaged
2. Lack of transparency
3. Leadership needs to foster a culture of innovation
4. Stakeholders didn't reach mutual agreement on process
5. DMV has a legacy-based culture and is not open to agile or to how changes can be made to modernize work
6. Lack of established guardrails (technical and procedural) to help guide the project
7. There was no clear decision maker on the project as a whole
8. Lack of active engagement by DMV executive and senior leadership in the project
9. Too much oversight from CDT
10. Failure to use the correct methodology
11. Decisions driven by the business; DMV IT left out of the decision-making process
12. Lack of leadership
13. Lack of an agile mindset does not allow reaping the benefits of agile methodology
14. Lack of commitment from DMV leadership

Ineffective knowledge transfer
15. Department does not have adequate in-house knowledge to maintain applications moving forward
16. DMV resources may not have all the (technical) skills needed to take over the applications built for NMV
17. New tools were introduced; however, DMV was not allowed to shadow or participate in the development of the application using the new tools
18. Knowledge transfer sessions are occurring but there is still an overwhelming amount of information/processes that needs to be shared/learned
19. Having staff not trained in the environment can cause issues with the deployment and the inability to support the environment
20. DMV teams are not proficient with the new tools and will take longer to navigate, use, and diagnose when there are problems or outages
21. Lack of AWS cloud knowledge within DMV
22. Many applications are used in the environment that DMV is not experienced with
23. If there is no knowledge transfer the project will fail and there will be no one to maintain it long term
24. DMV staff are not trained in cloud technologies and do not have the expertise to continue to support or maintain the NMV environment
25. Lack of clarity and agreement with internal stakeholders as to what the definition of success means in terms of a complete and meaningful knowledge transfer
26. Tools not readily available to those in the knowledge transfer sessions

Failure to create adequate documentation
27. Inadequate requirements documentation for the current motor voter process
28. No documentation makes it difficult to know how the environment was built
29. No business requirements for motor voter
30. Inadequate requirements documentation for future motor voter process modifications
31. No documentation on the build
32. There is a lack of documentation on the motor voter program
33. New agile processes utilized led to poor documentation left behind for DMV to sort through
34. No documentation makes it difficult to know how the environment was built
35. Requirements not properly documented for future enhancement requests (impact analysis)

Ineffective project staffing
36. Resources overcommitted to other projects
37. The time allotted to complete the project is not based on what needs to be done to complete the project (staff is burned out)
38. Employees are sometimes assigned to too many projects outside this one, and are unable to dedicate the amount of time needed
39. Lack of DMV Subject Matter Experts (SMEs) with long-term institutional knowledge of the DMV systems involved in the motor voter project
40. Limited staffing across the IT organization
41. Lack of the right skills needed for the development effort
42. New workload, no new positions to support the new workload
43. Lack of DMV resources with knowledge of all motor voter applications
44. Insufficient staffing

Ineffective user experience research and design
45. Lack of usability testing, i.e., testing with customers before releasing the program/process
46. Customer usability is a risk
47. Lack of customer basic understanding of the Motor Voter process and intentions
48. Customer doesn't understand the questions related to motor voter
49. Customer confusion between voter registration and DMV business (DL renewals)
50. No clear choice on the touchscreen for citizens who do not wish to register or edit their registration
51. Customers unfamiliar with PCs and touchscreen technology

Insufficient application testing
52. Not having any product ready to test
53. A lack of common records for testing the end-to-end system
54. Not enough end-to-end testing with a large set of different inputs and expected outputs
55. Lack of system testing of all environments
56. Insufficient testing
57. Lack of time to thoroughly test applications and environment

Failure to reach common understanding of requirements
58. Program builders were unaware of legal requirements
59. If registration records aren't received in a timely manner, voters may be disenfranchised
60. The 10-day NVRA requirement is not being satisfied
61. Any delay(s) in accurately and efficiently collecting and transferring information results in the possibility that the customer cannot vote
62. Timeliness of voter registration data transfer
63. Paper applications received increase DMV's risk of keying errors
64. Paper forms allow for inaccurate and incomplete information (provided by the customer) and/or inaccurate and incomplete information entered into the system by DMV
65. No saved image of voter registration

Ineffective organizational change management
66. There was little thought given to the organizational impact of the proposed change
67. If the public loses trust in DMV, we may not receive support from the Governor's Office going forward with new projects
68. For DMV to embrace and truly leverage cloud capabilities, they need to establish a Cloud CoE
69. Business needs to be more involved as they own the program and policy sides
70. The project was run as "agile-like" and does not fit into the SDLC process or infrastructure support model

Ineffective project team communications
71. Lack of communication between Product Owner and Scrum Master
72. Top managers do not permit lower-level managers to communicate directly to resolve problems
73. Lack of communications, especially around the roadmap for CNMV and priorities
74. Lack of communication between business and IT

Lack of transparency around production incidents and resolutions
75. Staff are not allowed to bring up what is really going on for fear of getting in trouble
76. Lack of a viable schedule prevents transparency for sponsors and Executive Management

Failure to create data integrity controls
77. Information entered by the customer/voter being corrupted, changed or inaccurately entered and then transferred to the SOS

Failure to manage scope
78. Business is allowed to add scope at any time

Failure to adequately assess go-live readiness
79. Releasing a product that was not nearly ready

Incomplete risk statements without supporting or determinable evidence
 Training
 Communication
 Clear vision statement
 Limited application knowledge
 Lack of governance and stage gates that define success
 Lack of communications, especially around the roadmap for NMV and priorities
 Poor hardware equipment in the field
 Having different APIs and none to manage them all
 No experience with cloud environments
 Skills & Tools + Learning!
 Late introduction to new tools

Appendix D: Likelihood and Impact Scales

The following tables were presented to the participants as part of the risk validation session to assign impact and likelihood.

Likelihood Scale
Score/Rating | Probability
5 – Almost Certain | > 90%
4 – Highly Likely | 75%
3 – 50/50 | 50%
2 – Less Likely | 25%
1 – Rare | < 10%

Impact Scale
5 – Critical / 4 – High:
► Failure to maintain adequate application/infrastructure availability over a prolonged or critical period (e.g., leading up to an election)
► Failure to maintain adequate application/infrastructure system performance over prolonged or critical periods
► Failure to provide timely resolution for the majority of production incidents
► Failure to adequately test most future changes and enhancements, resulting in a high number of production incidents
► Inadequate knowledge transfer to DMV, resulting in consistently poor decision making and delays to problem resolution
► Field office line length increases significantly due to application/infrastructure issues
► Prolonged negative press regarding wait times, data integrity or application usability
► Pervasive loss of confidence in the data integrity of voter eligibility records

3 – Medium / 2 – Moderate:
► Failure to maintain adequate application/infrastructure availability periodically
► Failure to maintain adequate application/infrastructure system performance periodically
► Failure to provide timely resolution for a moderate number of production incidents
► Failure to adequately test a moderate number of future changes and enhancements, resulting in production incidents
► Inadequate knowledge transfer to DMV, resulting in periodically poor decisions and delays to problem resolution
► Field office line length increases moderately due to application/infrastructure issues
► Occasional negative press regarding wait times, data integrity or application usability
► Moderate loss of confidence in the data integrity of voter eligibility records

1 – Low:
► Limited impact to field office operations
► None or very limited negative press
► New Motor Voter program future changes and enhancements go live successfully with minimal disruption
► Support resources have occasional delays in incident response

Appendix E: Acronyms and Definitions

Acronym – Definition
AB – Assembly Bill
CDT – California Department of Technology
CNMV – California New Motor Voter
DLAD – Driver License Automation Development
DLIR – Driver License Issuance Replacement
DMV – California Department of Motor Vehicles
EASE – Enterprise Application Services Environment
eDL-44 – Electronic Driver License/Identification Card Application
ERM – Enterprise Resources Management
ICD – Interface Control Document
M&O – Maintenance and Operations
RACI – Responsible, Accountable, Consulted and Informed (matrix)
SDLC – System Development Life Cycle
SOS – California Secretary of State
UAT – User Acceptance Testing
UX – User Experience

Department of Finance
Department of Motor Vehicles ─ Independent System Assessment
Quality Assurance Assessment Report
March 15, 2019

Table of contents
1 Executive summary
2 Assessment
  Findings and Recommendations
Appendix A: Deliverable Expectation Document
  Introduction
  1. Deliverable Description
  2. Deliverable Contents
  3. Key Activities
  4. Quality of File Transfer Assessment Evaluation Criteria
  5. Deliverable Assumptions
  6. Acceptance Criteria
Appendix B: Interviews Performed
Appendix C: Artifacts Reviewed
Appendix D: Acronyms and Definitions

1 Executive summary

Assembly Bill (AB) 1461 required California's Secretary of State (SOS) and Department of Motor Vehicles (DMV) to establish the California New Motor Voter (CNMV) program, which provides DMV customers opportunities to register to vote if they qualify. Pursuant to AB 1461, DMV is required to electronically provide SOS the records of individuals who apply for an original or renewal of a driver license or state identification card, or who provide a change of address, and who may be eligible to vote.

The Motor Voter project was created in collaboration with California's Department of Technology (CDT), DMV and SOS. CDT led development and implementation of the application. DMV acts as the customer-facing agency to collect, filter and send voter registration data. SOS receives voter registration data from DMV and incorporates the information into the VoteCal system to update voter registrations.

This quality assurance assessment is the fourth in a series of assessments of the CNMV program. The objective is to review the Motor Voter data transfer processes against the following evaluation criteria to highlight risks and develop recommendations for improving the ongoing program's effectiveness. The criteria were evaluated using a combination of techniques including review of project documentation (i.e., artifacts), stakeholder interviews and walk-throughs of Motor Voter application data transfers, data validation activities and SOS-related operational reporting.

Quality assurance assessment evaluation criteria:
 Data validation rules exist and are applied prior to data transfer from DMV to SOS
 The manual DMV data validation process includes steps for identifying, capturing, communicating, and remediating data issues
 Data validation processes are implemented by SOS, which captures and reports data defects to DMV timely
 Data defects/exceptions are centrally logged and remediated by DMV (pre-transfer) and SOS (post-transfer)
 Defined data metrics are used to measure data completeness, accuracy, and quality
 Access control is applied to the DMV/SOS data transfer process

This assessment has identified risks that, if not addressed, will adversely impact the realization of the intended benefits of the program.
The quality assurance assessment recommendations summarized below should be reviewed and considered for implementation by program stakeholders:

1. Implement a process for timely DMV second-level review and release of California New Motor Voter applicant data being held subsequent to automated Help America Vote Act (HAVA) data validation checks.
2. Assess the benefit of DMV's manual review of CNMV applicant data prior to transfer to SOS to determine the effectiveness and efficiency of the review as currently designed; consider whether the review addresses critical risks and whether the validation could be faster and more comprehensive if automated.
3. Implement a process for DMV to communicate to SOS the inventory of records being held for DMV review, to promote awareness of all records unavailable in the current data transfer and to provide visibility of the applicant backlog and the plan for resolution and release.
4. Implement a process whereby DMV communicates to SOS the expected applicant record count for each transfer, enabling SOS to implement a process to compare the records received to the expected count.
5. Modify the SOS voter data load table format to allow for complete date and time entry, and implement an SOS evaluation of records to determine that each applicant's most recent registration information is being applied and not overwritten by data that may have been transmitted out of chronological order.
6. Perform a DMV automated validation of applicant-provided California residential address data prior to transfer to SOS.
7. Design and implement a DMV-managed interface testing plan that includes a framework for testing future changes or enhancements to the end-to-end DMV-to-SOS voter data transfer process.
8. Design and implement a single defect tracking and management process for the CNMV program that consolidates SOS and DMV defect identification, assignment, tracking, remediation, retesting, and closure. The process could include automated reporting of issues experienced by SOS which require DMV resolution.
9. Create a service level agreement (SLA) that communicates aspects of both DMV and SOS services, including responsibilities and key performance indicators for quality, timeliness, and availability of data transferred from DMV to SOS.

2 Assessment

This quality assurance assessment was performed against an established set of evaluation criteria as detailed in Appendix A: Deliverable Expectation Document. The criteria were evaluated using a combination of techniques including review of project documentation (i.e., artifacts), stakeholder interviews and walk-throughs of data transfers, data validation activities and SOS-related operational reporting.

The table below presents findings observed throughout the assessment, the risk implications related to the findings, and the associated recommendations. Each finding (F1–F13) may have recommendations that tie to the nine overarching recommendations identified in the Executive Summary above. Where relevant, complementary findings and recommendations from the prior Business Process Assessment Report are referenced for consideration along with the findings and recommendations in this report.

Findings and Recommendations

Evaluation criteria: Data validation rules exist and are applied prior to data transfer from DMV to SOS.
F1. Finding: CNMV applicant records that do not pass the automated Help America Vote Act (HAVA) data validation checks may represent individuals who are eligible to vote; however, these eligible voter records may not be released to SOS timely because they are held in "pending" status by DMV. HAVA checks include validation of AB 60 eligibility.

Risk Implication: Not all eligible voter applications processed by the CNMV application are downloaded and recorded in the SOS voter system in time to allow registrants to vote.

Recommendation (R)
R1 – DMV should identify, review, remediate, and release pending HAVA records to SOS timely. DMV management should closely monitor execution and provide metrics on this process to SOS to support the timely registration of eligible voters.

F2. Finding: There is no requirement and/or validation in the CNMV application to confirm that a valid California residence address is submitted by an applicant. Refer to complementary finding F8 in the Business Process Assessment Report.

Risk Implication: SOS may unnecessarily receive voter registration information from DMV for applicants who are not California residents. Receiving voter registration information for ineligible applicants creates additional and unnecessary workload for SOS.

Recommendation (R)
R1 – DMV should implement an automated process, ideally in the front-end application, to validate that voter registration applicants have provided valid California residential addresses.
R2 – Voter registration applicants should be notified if their registration was unsuccessful.
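To make recommendation R1 under F2 concrete, the sketch below shows one way an automated pre-transfer screen for California residential addresses could look. The #applicant_address table, its columns and the ZIP-code pattern are illustrative assumptions, not DMV's actual schema or rules; a production implementation would likely rely on a full address verification service.

```sql
-- Hypothetical pre-transfer screen for non-California residence addresses.
-- Table and column names are illustrative, not the actual CNMV schema.
CREATE TABLE #applicant_address (
    transaction_id  BIGINT,
    residence_state CHAR(2),
    residence_zip   CHAR(5)
);

INSERT INTO #applicant_address VALUES
    (1001, 'CA', '95814'),   -- passes the screen
    (1002, 'NV', '89101');   -- flagged: out-of-state residence

-- Flag records that are clearly not valid California residential addresses.
-- The 90xxx-96xxx pattern is only a coarse, illustrative screen.
SELECT transaction_id
FROM #applicant_address
WHERE residence_state <> 'CA'
   OR residence_zip IS NULL
   OR residence_zip NOT LIKE '9[0-6][0-9][0-9][0-9]';
-- Flagged records would be held for correction, and the applicant notified
-- that the registration was unsuccessful (recommendation R2).
```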
F3. Finding: There is no formal interface testing process for Motor Voter application updates or modifications.

Risk Implication: Lack of adequate and effective interface testing in the change control process increases the risk of unintended impacts to the successful transfer of complete and accurate voter registration data records.

Recommendation (R)
R1 – Finding F9 in the Business Process Assessment Report includes a complementary recommendation (R1) to create and execute a formal and comprehensive test plan to fully re-test end-to-end business processes incorporating all interfaces, validations and reports. In conjunction with our recommendation to develop and implement a test plan, DMV should:
 Implement an interface testing process that tests the data transfer process from the DMV source to the SOS data tables, and document testing results.
 Develop an interface test plan, test scripts and a repository to store the scripts.
 Automate the execution of these scripts to enable an efficient process for testing the impact of Motor Voter changes and enhancements on the data transfer process.

Evaluation criteria: The manual DMV data validation process includes steps for identifying, capturing, communicating and remediating data issues.

F4. Finding: The manual DMV data validation process compares the voter registration applicant's answers to the US citizenship question and the voter registration requirement question against the associated CNMV system record downstream in the architecture. This provides confidence in data integrity across the architecture. The review validates that the information entered matches the information captured in the downstream CNMV system record; however, it does not validate that the information provided by the applicant is accurate (i.e., if the applicant responds that they are a US citizen, the manual review does not validate that the applicant is in fact a US citizen).

Risk Implication: There is a continuous DMV resource allocation, consisting of fifteen (15) full-time-equivalent reviewers and five (5) full-time-equivalent supervisors, for this manual review without a clear understanding or definition of its costs and benefits. The process potentially delays voter registration.

Recommendation (R)
R1 – DMV should review the current process to assess the cost and benefits of this manual review. An automated solution should be considered if the review is determined to be of ongoing benefit.
R2 – DMV should communicate the list of records in review so that SOS is aware of records unavailable for the current transfer. This would provide transparency to all involved parties and, in addition, allow DMV and SOS to confirm that all records are transferred.

Evaluation criteria: Data validation processes are implemented by SOS, which captures and reports data defects to DMV timely.

F5. Finding: In some instances, CNMV registration applicants generate more than one registration record. Each record in the CNMV source file that is available to be pulled from DMV by SOS has a field in a timestamp format. SOS is not currently evaluating the timestamp to determine the most current voter registration record because the value is truncated upon loading to the SOS table, leaving SOS with only a date and no timestamp.

Risk Implication: For applicants who have submitted multiple registrations in the same day, SOS may not be registering the voter with the most recent and accurate voter registration applicant information.

Recommendation (R)
R1 – SOS should modify the load table to allow for a complete date/time format. SOS should also evaluate records to determine that each applicant's most recent registration information is applied.

F6. Finding: The CNMV data transfer error logs are recorded and stored by SOS after the pull from the CNMV application, and in some cases the errors result in a "hard stop" in which the registration does not complete and the applicant is not notified (e.g., an "out of state" residence provided in the registration will result in a "hard stop" to the registration process). Errors logged by SOS are not communicated to DMV for transparency and possible remediation.

Risk Implication: Without a periodic joint DMV/SOS evaluation of errors, there is potential that eligible voters are not registered. Unaddressed errors can also cause unnecessary effort and/or expense in operating and maintaining the Motor Voter program and hinder improvement in the overall voter registration process.

Recommendation (R)
R1 – Implement a joint SOS/DMV process to track all errors, research root causes, remediate functional issues and correct data errors.

F7. Finding: There is no process that validates that the record count pulled by SOS agrees with the record count available for SOS in the CNMV application.

Risk Implication: Without a completeness validation, SOS may not detect that records are missing from the data transfer from DMV.

Recommendation (R)
R1 – DMV should implement a process that communicates to SOS the available (expected) voter registration applicant records for each pull. Additionally, SOS should implement a process to compare the actual records pulled with the expected record count.
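A minimal sketch of the completeness check recommended under F7: DMV publishes the expected record count for each pull, and SOS reconciles it against what was actually loaded. The tables and columns are hypothetical, intended only to show the shape of the reconciliation.

```sql
-- Hypothetical per-transfer reconciliation of expected vs. received counts.
CREATE TABLE #dmv_transfer_manifest (
    transfer_id    INT,
    expected_count INT        -- count DMV reports as available for the pull
);
CREATE TABLE #sos_loaded_record (
    transfer_id    INT,
    transaction_id BIGINT     -- one row per record actually loaded by SOS
);

-- Flag any pull where the loaded count disagrees with DMV's manifest.
SELECT m.transfer_id,
       m.expected_count,
       COUNT(s.transaction_id)                    AS received_count,
       m.expected_count - COUNT(s.transaction_id) AS shortfall
FROM #dmv_transfer_manifest AS m
LEFT JOIN #sos_loaded_record AS s
       ON s.transfer_id = m.transfer_id
GROUP BY m.transfer_id, m.expected_count
HAVING COUNT(s.transaction_id) <> m.expected_count;
```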
F8. Finding: There is no post-transmission SOS process that verifies the accuracy of the transferred data records pulled from DMV.

Risk Implication: There is an increased risk of data errors occurring during the transfer process without detection.

Recommendation (R)
R1 – SOS should implement an automated procedure that compares the data contained in a sample of received records to the DMV source file for quality and accuracy.

F9. Finding: There is no SLA between SOS and DMV regarding data quality/transmission between the two organizations.

Risk Implication: Without an agreed-upon SLA, there is no expectation of the level of service, no definition of the specific services provided, and no ability to set registration targets and measurements by which successful delivery will be determined.

Recommendation (R)
R1 – Create an SLA between DMV and SOS to define responsibilities, processes, timeliness, and measurements for determining successful delivery.

Evaluation criteria: Data defects/exceptions are centrally logged and remediated by DMV (pre-transfer) and SOS (post-transfer).

F10. Finding: There is no consolidated incident logging process or single-source repository (DMV and SOS) to log, prioritize and remediate CNMV defects and issues. Refer to complementary finding F10 in the Business Process Assessment Report.

Risk Implication: Defects recognized by one organization may not be communicated to the other for research and remediation. Both parties may not be aware of defects or issues, resulting in a lack of visibility into their resolution. Without shared information and a common record of defects, the ability to assess the priority of future enhancements is limited. Data issues may not be communicated or resolved timely, which can lead to duplication of effort and/or rework.

Recommendation (R)
R1 – DMV and SOS should implement a life cycle defect process that consolidates SOS and DMV defect identification, assignment, tracking, remediation, retesting, and closure.

Evaluation criteria: Defined data metrics are used to measure data completeness, accuracy, and quality.

F11. Finding: There is no automated reporting built into the SOS data transfer process that reports the completeness, accuracy and quality of the data being transferred. SOS checks for a valid county and for data in mandatory fields; after loading, SOS checks the effective date and duplicates. Communicating the results of the file transfer would help DMV gauge the accuracy and sufficiency of the information sent to SOS.

Risk Implication: DMV is not aware of potential issues with the data transfers, hindering its ability to remediate and reduce or eliminate the issues going forward.

Recommendation (R)
R1 – SOS should create automated reports that identify issues encountered and the number of records impacted. These reports should be developed to align with the requirements of the DMV and SOS SLA (see F9 above) and published to DMV for review and follow-up.

F12. Finding:

Evaluation criteria: Access control is applied to the DMV/SOS data transfer process.

F13. Finding:

Appendix A: Deliverable Expectation Document

Introduction

The objective is to assess and review the quality of the data file transfer process from the Department of Motor Vehicles (DMV) to the Secretary of State (SOS). The assessment team will identify risks and issues and develop recommendations to help resolve these issues.

1. Deliverable Description

The Quality of File Transfer Assessment will provide findings and recommendations based on artifact reviews, stakeholder interviews, and process and tool walk-throughs.
Stakeholders interviewed are expected to include the data integration lead, developers, business process owners, testers, risk management personnel, and other relevant stakeholders.

2. Deliverable Contents

The Quality of File Transfer Assessment will be a written report containing the results of the assessment in the form of findings and recommendations. The report will include an executive summary, identification of findings, and actionable recommendations for improvement. Additionally, the report will list the received artifacts, stakeholders interviewed and process walk-throughs. The intention of the report is to identify risks and opportunities for improvement. The report will not disclose the specific stakeholder source(s) of the findings but will attempt to corroborate findings with multiple sources. The report is advisory in nature. The assessment will not render an assurance report or opinion under this report, nor will the Services constitute an audit, review, examination, or other form of attestation as those terms are defined by the American Institute of Certified Public Accountants. None of the Services or any reports will constitute any legal opinion or advice.

3. Key Activities

 Evaluate DMV and SOS data validation rules and processes
  o The inclusion of voter eligibility criteria including name, address, date of birth/age, citizenship and effective date
  o Duplicate data record prevention
  o Batch record counts
 Evaluate the quality of data file transfer reports and metrics
  o Number of records transferred
  o Error exception reporting
  o Data file transfer controls to track which records have been transferred and which are yet to be transferred
 Review data transfer defect resolution processes
  o Logging of issues in a designated defect management system
  o Processes for identification, communication, and remediation of issues
 Observe the data transfer process from DMV to SOS
 Prepare assessment report draft findings and recommendations
 Provide and walk through the initial draft of the Quality of File Transfer Assessment Final Report with the Department of Finance to obtain feedback
 Incorporate Department of Finance feedback and finalize the report within five business days
 Deliver a soft copy of the final Quality of File Transfer Assessment Final Report for approval

4. Quality of File Transfer Assessment Evaluation Criteria

Listed below are the specific criteria that will be evaluated as part of the Quality of File Transfer Assessment. Each of these criteria will be evaluated using a combination of techniques including the review of project documentation (i.e., artifacts), stakeholder interviews and demonstrations of the Motor Voter application and analytics. This is a current-state assessment; however, understanding data points leading up to go-live will provide input to our analysis.

Item 1
Evaluation criteria: Data validation rules exist and are applied prior to data transfer from DMV to SOS.
Implication: Without applying appropriate data validation rules prior to data transfer, there is an increased risk of data errors, including duplicate records, being passed to SOS.
Assessment approach:
 Through documentation review and/or walk-through, obtain an understanding of the data validation rules/criteria in place at DMV prior to data transfer to SOS
 Observe the operation of data validation rules by observing the execution of sample transaction(s) (i.e., test script execution)
 Review a sample of DMV data validation test results
 Determine if processes exist for DMV/SOS interface testing in support of current development and enhancements
Example stakeholders engaged: business process owners, data lead, developers
Example artifacts requested: data validation plan, DMV data validation rules/scripts, data validation documentation, data validation scripts, interface test scripts, interface test results

Item 2
Evaluation criteria: The manual DMV data validation process includes steps for identifying, capturing, communicating and remediating data issues.
Implication: The effectiveness of a manual review process is diminished when it does not include baseline criteria to identify errors, a method to capture and communicate errors for root-cause analysis, or steps to engage necessary stakeholders for remediation of the issues resulting in data errors. Without capture and analysis of data issues, the manual review will not support continued improvement in the Motor Voter application or a related reduction in data issues. Data issues may not be identified or remediated timely.
Assessment approach:
 Review the scope of the current DMV manual validation process
 Through inquiry, review of available documentation and walk-through, assess the validation criteria and steps performed
 Determine if manual validation criteria are also subject to automated data validation by the Motor Voter application
 Determine if the process includes steps for logging, tracking, reporting and remediating issues
 Determine if the manual validation process includes root-cause or trending analysis of identified issues
Example stakeholders engaged: business process owner, developers, risk management owner
Example artifacts requested: manual data validation process documentation, validation results (e.g., issue tracking log/spreadsheet, analysis reports/dashboards)

Item 3
Evaluation criteria: Data validation processes are implemented by SOS, which captures and reports data defects to DMV timely.
Implication: Increased risk of data errors being transferred without detection. Review of received data can increase the likelihood of data conformity to the SOS system requirements and minimize redundancy and rework for DMV, SOS, and counties, thus increasing the likelihood of complete and accurate voter registration information.
Assessment approach:
 Through inquiry, review of available documentation and walk-through, assess the validation criteria and steps performed
 Observe a walk-through of SOS validation processes, including the pre- and post-filtered file validation process
 Review the process for logging, tracking and reporting defects
 Review the process for communicating defect remediation
Example stakeholders engaged: business process owner, SOS integration lead, data analyst, data lead, developers
Example artifacts requested: functional validation rules, data validation process documentation, SOS filter/cleansing rules/scripts, sample source file and post-filtered file

Item 4
Evaluation criteria: Data defects/exceptions are centrally logged and remediated by DMV (pre-transfer) and SOS (post-transfer).
Implication: Without a single-source (centralized) tool to log data defects and production incidents, there is increased risk of data issues not being communicated or resolved timely, and/or duplication of effort and/or rework.
Assessment approach:
 Determine which defect management tool(s) are being utilized by DMV and SOS
 Through inquiry, review of available documentation and walk-through, gain an understanding of the shared process between DMV and SOS for the logging of data-related defects, assignment of remediation ownership, communication, and steps for closure/remediation
 Through inquiry and/or review of example documentation, determine if the defect triage process includes root-cause analysis
Example stakeholders engaged: business process owner, data lead, defect manager, SOS integration lead
Example artifacts requested: DMV/SOS defect/issue/exception logs/reports, defect life cycle process, error resolution procedures

Item 5
Evaluation criteria: Defined data metrics are used to measure data completeness, accuracy, and quality.
Implication: Without a review of data metrics, data transfer performance may not be accurately measured and monitored, which can negatively impact data quality and/or mask issues that have not yet been identified or addressed.
Assessment approach:
 Through inquiry and/or review of available documentation, obtain an understanding of processes for measuring and reporting on the success of data transfer processes
 Observe a walk-through of the current data metric review process
Example stakeholders engaged: business process owner, data lead, developer, SOS integration lead
Example artifacts requested: data metrics report, data scripts

Item 6
Evaluation criteria: Access control is applied to the DMV/SOS data transfer process.
Implication: Access to data transfer files and mechanisms should be limited to reduce the risk of unauthorized and/or unintended access and/or changes to data/data transfers, which could negatively impact data confidentiality, integrity, and availability.
Assessment approach:
 Through inquiry and/or review of available documentation, obtain an understanding of access control requirements, and observe a walk-through to validate the implementation of those access controls

5. Deliverable Assumptions

 3–6 stakeholder interviews/observation walk-throughs will be conducted.
 Primary validation will include data validation processes within the DMV "Shovel" component of the Motor Voter application. Shovel is the database where all sources of voter registration data are collected and validated prior to the data transfer to SOS.
 10–15 artifacts will be reviewed.
 1 deliverable is expected for this task – Quality of File Transfer Assessment Final Report.
 Requested artifacts will be received within two business days of the request.
 The scope of the Quality of File Transfer Assessment does not include the interim phase of the Motor Voter application (Phoebe).

6. Acceptance Criteria

Below are the specific acceptance criteria for this deliverable.
 Evaluation criteria are assessed
 Written acceptance of the deliverable received from one of the following individuals:
  o Erica Gonzales, Chief IT Consulting Unit (DOF)
  o Thomas Giordano, Oversight Manager, IT Consulting Unit
 Written deliverable acceptance form complete

Appendix B: Interviews Performed

The following is a list of individuals interviewed as part of this Quality of File Transfer Assessment.

Table 1. Interviews conducted (organization and interview dates)
DMV | 2/21/19
DMV | 2/21/19
SOS | 2/20/19, 2/26/19, 2/28/19, 3/4/19
SOS | 2/20/19, 2/26/19, 2/28/19, 3/4/19
SOS | 2/26/19, 2/28/19, 3/4/19
DMV | 2/26/19
DMV | 2/26/19, 2/28/19
DMV | 2/28/19
 | 2/20/19, 2/26/19, 2/28/19
DMV | 3/11/19, 3/12/19

Appendix C: Artifacts Reviewed

The following is a list of artifacts reviewed during this Quality of File Transfer Assessment. In addition, the team leveraged the artifacts from the previous business process and system development assessments.

Table 2. Artifacts reviewed

Appendix D: Acronyms and Definitions

Acronym – Definition
AB – Assembly Bill
ADA – Americans with Disabilities Act
API – Application Programming Interface
CDT – California Department of Technology
CNMV – California New Motor Voter
DMV – California Department of Motor Vehicles
EASE – Enterprise Application Services Environment
eDL-44 – Electronic Driver License/Identification Card Application
HAVA – Help America Vote Act; in this report, the HAVA check is the process that checks records in Shovel for AB 60 compliance
JSON – JavaScript Object Notation
QAE – Quality Assurance Evaluation
SDLC – System Development Life Cycle
SLA – Service Level Agreement
SOS – California Secretary of State

Department of Finance
Department of Motor Vehicles ─ Independent System Assessment
Pre-Integration with EASE File Validation Strategy Report
July 1, 2019

Table of contents
Executive Summary
Analysis
  1.1 Data Collection
  1.2 Data Validation
  1.3 Data Loading
  1.4 Data Analysis
  1.5 Reporting
Appendix A: Deliverable Expectation Document
  Introduction
Appendix B: Points of Contact
Appendix C: Data Files Utilized in Analysis
Appendix D: Acronyms and Terminology

Executive Summary

Assembly Bill (AB) 1461 required California's Secretary of State (SOS) and Department of Motor Vehicles (DMV) to establish the California New Motor Voter (CNMV) program, which provides DMV customers opportunities to register to vote if they qualify. Pursuant to AB 1461, DMV is required to electronically provide SOS the records of individuals who apply for an original or renewal of a driver license or state identification card, or who provide a change of address, and who may be eligible to vote.

The Motor Voter project was created in collaboration with California's Department of Technology (CDT), DMV and SOS. CDT led development and implementation of the application. DMV acts as the customer-facing agency to collect, filter and send voter registration data. SOS receives voter registration data from DMV and incorporates the information into the VoteCal system to update voter registrations.

This data transfer validation follows a series of four assessments of the CNMV program: the Business Process Assessment, System Development Assessment, Risk Assessment and Quality Assurance Assessment. As described in Appendix A: Deliverable Expectation Document, the objective is to propose and execute a Department of Finance approved strategy to validate the integrity of the California New Motor Voter (CNMV) data transferred from DMV to SOS during the period prior to EASE integration (Apr. 23, 2018 – Sept. 26, 2018). The data transfer validation compared the full population of DMV records to the full population of records received by SOS, based on data transfer time stamp. The data transfer validation scope is limited to providing the results of specific data comparisons only and does not include evaluation of the results, nor the identification of findings and recommendations. Records that do not match have been provided to DMV and SOS for review, analysis and determination of risk, at their discretion.

Differences between the DMV and SOS pre-EASE integration voter registration data sets were identified. The scope of this data transfer validation does not include the vetting or rationalization of these items to determine if they exist in the current voter registration data sets or if they represent risk to the CNMV program objectives. The record details have been provided to DMV and SOS to determine the need for such review and analysis. Due to the sensitive nature of the voter registration data, the results included in this report are presented in an aggregated format exclusive of personally identifiable voter registration information.

Data differences noted during the pre-EASE integration voter registration data transfer validation are summarized as follows:

1. Record count analysis: Total DMV voter registration data includes 3,192,789 records and total SOS voter registration data includes 3,184,724 records, a difference of 8,065 records.
2. Record count analysis: SOS voter registration data includes 452 records which are not included in the DMV voter registration data.
3. Record count analysis: DMV voter registration data includes 92,201 records which are not included in the SOS voter registration data.
4. Record count analysis: SOS voter registration data includes 83,684 duplicate voter registration records as submitted by DMV.
5. Blank field analysis: 32 of 41 field types were identified as having blanks during transmission, with five fields (labeled as information related to former mailing address and designation of other political party) being blank in over 99% of the data records.
6. Key field analysis: DMV voter registration data includes 55,716 records from DMV source types described as "Release" or "Do not send". These source types are not included in the defined list of SOS-approved source types.
7. Key field analysis: SOS voter registration data includes two records from the unapproved source type described as "Do not send".
8. Key field analysis: DMV voter registration data includes 171,145 records containing an approved party preference within the DMV data; however, these records do not have the associated designation within the SOS voter registration data.
9. Key field analysis: DMV voter registration data includes 2,256 records containing unapproved county names; these records were received and processed with a blank county within the SOS voter data set. A further 3,517 DMV records have unapproved county names, and these records were not identified within the SOS data.
10. Field truncation analysis: 24 of 41 fields exist with potential for truncation between the DMV and SOS voter registration data sets.
11. Exact match analysis: 130,573,684 data points were compared for an exact match across the DMV and SOS voter registration data sets. Of this number, there are 1,759,121 instances within 27 field types where an exact match was not identified between the two data sets. Approximately 78% of these differences were in fields labeled as the date the record was created in the DMV system and the date the record was effective in the DMV system.

Analysis

Data transfer validation strategy and results

This data transfer validation analysis was performed against an established set of activities as detailed in Appendix A: Deliverable Expectation Document, and against the Department of Finance approved data transfer validation strategy detailed in this section.

1.1 Data Collection

STRATEGY
Request and collect voter registration data files from DMV and SOS for the pre-EASE integration period of the California New Motor Voter program (April 23, 2018 through September 26, 2018) for analysis. The method of data transfer will be determined by DMV and SOS.

RESULTS
On May 14, 2019 SOS provided two comma-delimited files containing addresses and related voter registration records as received from DMV for the period April 23, 2018 through September 26, 2018. The data was transferred utilizing protocols determined by DMV and SOS.

On May 15, 2019 and May 16, 2019 DMV provided seven comma-delimited files containing voter registration records as sent to SOS during the period April 23, 2018 through September 26, 2018. The data was transferred utilizing protocols determined by DMV and SOS.

1.2 Data Validation

STRATEGY
Data validations are to be performed prior to analysis to confirm the ability to perform analytics.
This validation may include, but is not limited to:
 Verifying files are in consistent formats
  o Readability in a text editor
  o Presence of headers
  o Presence of a pipe/comma as the file delimiter
 Integrating data on a Structured Query Language (SQL) Server
  o Errors occurring during the integration will be documented
  o The list of fields will be compared against the data request criteria
 Additional verification tests
  o Check the first and last log date/time
  o Compare each raw data file's total row count with the row count in the uploaded table
  o Identify fields with no values and evaluate their criticality

Issues raised from this verification process will be investigated by DMV and SOS to identify the issues' origination (i.e., issues with the original source data files, issues during data upload, etc.).

RESULTS
All files received from SOS and DMV were readable within text editors and contained usable headers and delimiters. The DMV data included qualifiers for all data fields, providing a clean data import. All record counts were matched against counts provided by DMV from their source system.

The SOS data did not include qualifiers for data fields, which yielded data shifting for 24 records. These records were adjusted via SQL scripting and reviewed for accuracy with SOS. All record counts were matched against counts provided by SOS from their source system.

1.3 Data Loading

STRATEGY
Data is to be loaded into SQL Server via the SQL Server Import and Export Wizard. All data points are to be input as text and converted via scripting into native data formats as they relate to the source systems.

RESULTS
All files were loaded as text using the SQL definition VARCHAR(4000). Fields were then converted to native data types via SQL scripting and appended into testing tables.

1.4 Data Analysis

1.4.1 RECORD COUNT ANALYSIS

STRATEGY
Objective: To confirm at a high level that the number of records produced from the source system (DMV) is the same as the number of records received in the downstream system (SOS).

Concept Logic:
 Count the number of records in the source system file
 Count the number of records in the downstream system file
 Compare record counts and report on differences
 Count the number of unique records in the source system file based on driver license identification number, time stamp and row position
 Count the number of unique records in the downstream system file based on driver license identification number, time stamp and row position
 Compare unique record counts and report on differences

RESULTS

Record Count
Total record counts were performed on each of the SOS and DMV data sets. Record counts were also performed based on unique driver license identification number (considering the creation date and time of the record) plus unique transfer identifier, and based on unique transaction identifier plus unique transfer identifier. A sketch of this counting logic follows Table 1.

Table 1. Record count
Validation description | DMV count | SOS count | Difference
Total record count | 3,192,789 | 3,184,724 | 8,065
Total record count of unique driver license identification number + creation date and time + unique transfer identifier | 3,192,789 | 3,181,481 | 11,308
Total record count of unique transaction identifier + unique transfer identifier | 3,192,789 | 3,101,040 | 91,749
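The following T-SQL sketch mirrors the counting logic behind Table 1. The table and column names are hypothetical stand-ins for the typed testing tables described in Section 1.3 (loaded as VARCHAR(4000), then converted to native types such as DATETIME2); the actual scripts used for the analysis are not reproduced in this report.

```sql
-- Hypothetical typed table holding one data set (DMV or SOS side).
CREATE TABLE #voter_record (
    transaction_id BIGINT,        -- unique transaction identifier
    transfer_id    INT,           -- unique transfer identifier
    dl_number      VARCHAR(10),   -- driver license identification number
    created_at     DATETIME2(0)   -- record creation date and time
);

-- Total rows plus the two uniqueness counts reported in Table 1.
SELECT COUNT(*) AS total_record_count,
       COUNT(DISTINCT CONCAT(dl_number, '|',
                             CONVERT(VARCHAR(19), created_at, 121), '|',
                             transfer_id)) AS unique_dl_created_transfer,
       COUNT(DISTINCT CONCAT(transaction_id, '|', transfer_id))
                                           AS unique_transaction_transfer
FROM #voter_record;

-- Identifier combinations occurring more than once surface complete-record
-- duplicates, such as the 83,684 found in the SOS data set.
SELECT transaction_id, transfer_id, COUNT(*) AS copies
FROM #voter_record
GROUP BY transaction_id, transfer_id
HAVING COUNT(*) > 1;
```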
1.4 Data Analysis

1.4.1 RECORD COUNT ANALYSIS

STRATEGY
Objective: To confirm at a high level that the number of records produced from the source system (DMV) is the same as the number of records received in the downstream system (SOS).

Concept Logic:
• Count the number of records in the source system file
• Count the number of records in the downstream system file
• Compare record counts and report on differences
• Count the number of unique records in the source system file based on driver license identification number, time stamp, and row position
• Count the number of unique records in the downstream system file based on driver license identification number, time stamp, and row position
• Compare unique record counts and report on differences

RESULTS

Record Count
Total record counts were performed on each of the SOS and DMV data sets. Record counts were also performed based on unique driver license identification number combined with the creation date and time of the record and the unique transfer identifier, and based on unique transaction identifier combined with unique transfer identifier.

Table 1. Record count

Validation description                                           DMV count   SOS count   Difference
Total record count                                               3,192,789   3,184,724        8,065
Total record count of unique driver license identification
number + creation date and time + unique transfer identifier     3,192,789   3,181,481       11,308
Total record count of unique transaction identifier
+ unique transfer identifier                                     3,192,789   3,101,040       91,749

Data Validation
A comparison of DMV and SOS data based on unique transaction identifier and unique transfer identifier was performed to identify potential gaps between the two data sets. 92,201 records were identified that were included in the DMV data set but not in the SOS data set. Similarly, 452 records were included within the SOS data set but not in the DMV data set.

A comparison of the total record count against the unique transfer identifier count was performed to identify duplicate records within each data set. In the SOS data set, 83,684 records were identified as having a complete record duplicate. Duplicate records were not identified in the DMV data set.

Table 2. Data validation

Validation description                       Count
DMV records not included in SOS data set    92,201
SOS records not included in DMV data set       452
Duplicate records within SOS data set       83,684
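The record count, gap, and duplicate checks above could be sketched in T-SQL as follows, assuming the hypothetical tables from the earlier examples plus a corresponding sos_records table, with hypothetical key columns transaction_id and transfer_id.

    -- Total vs. unique record counts within one data set
    SELECT COUNT_BIG(*) AS total_records,
           COUNT_BIG(DISTINCT CONCAT(transaction_id, '|', transfer_id))
               AS unique_records
    FROM sos_records;

    -- Records present in the DMV data set but missing from the SOS data set
    SELECT d.transaction_id, d.transfer_id
    FROM dmv_records AS d
    LEFT JOIN sos_records AS s
           ON s.transaction_id = d.transaction_id
          AND s.transfer_id   = d.transfer_id
    WHERE s.transaction_id IS NULL;

    -- Complete duplicates within the SOS data set
    SELECT transaction_id, transfer_id, COUNT_BIG(*) AS copies
    FROM sos_records
    GROUP BY transaction_id, transfer_id
    HAVING COUNT_BIG(*) > 1;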
1.4.2 BLANK FIELD ANALYSIS

STRATEGY
Objective: To identify potential data quality issues at the start of the transfer process to assist with narrowing false positive matches from other analytics.

Concept Logic:
• Identify the required fields for analysis
• Count the number of blank records per field in the source system file
• Compare the blank record count to the number of records and compute the percentage blank

RESULTS
A data quality assessment was performed by identifying the number of blank data points within the fields required for transfer. This test helps identify potential issues downstream, and it can also assist with increasing performance by identifying fields that are rarely or never used. 32 field types were identified as having blanks during transmission, with the following five fields blank in over 99% of the data records.

Table 3. Blank field analysis

Field type description                    Count of blanks   Blank percentage
Other party designation                         3,178,642             99.56%
Former mailing address – state                  3,174,799             99.44%
Former mailing address – zip code               3,169,384             99.27%
Former mailing address – city                   3,168,246             99.23%
Former mailing address – street address         3,168,245             99.23%

1.4.3 KEY FIELD ANALYSIS

STRATEGY
Objective: To identify potential data quality issues within the source system file as they relate to standard data definitions. For example, confirming that date records are in a valid date format or that the county code carries the standard assignment value as per requirements.

Concept Logic:
• Isolate key fields for analysis
• Summarize the data within the key fields into a unique data list
• Join the unique data list to the associated reference list
• Compare data values to the reference list and report on differences

RESULTS
Analysis was performed on three key fields to compare values with the approved standardized values for downstream use. The key fields were labeled as political party, county, and source of data. The analysis used standard definitions provided by SOS and compared them against DMV data. Records that contained blank values within the three key fields were omitted from testing because the information needed to relate them was missing.

171,145 DMV records contained approved party preference text, but these records do not have the associated designation within the SOS data. 87,170 DMV records contained approved party preference text and are not contained within the SOS data.

2,256 DMV records contained unapproved county names and were identified as received and processed with a blank county within the SOS data set. 3,517 DMV records were noted as having unapproved county names, and these records were not identified within the SOS data.

Two "DMV source type description" categories ("Do not send" and "Release") in Table 4 are included within the DMV data set but are not included in the defined list of SOS approved source types. The DMV data set included 55,662 records with the "Release" source type; these records were not identified within the SOS data set. The DMV data set included 54 records with the "Do not send" source type; two of these records were identified within the SOS data set.

Table 4. Source validation

DMV source type description            DMV count   SOS count   Difference   Percentage difference
Change of address                        163,957     154,270        9,687                   5.91%
Field office                           2,261,850   2,261,653          197                   0.01%
Do not send                                   54           2           52                  96.30%
Release                                   55,662           0       55,662                 100.00%
Request by mail                          207,926     198,226        9,700                   4.67%
Internet change of address               112,475      97,416       15,059                  13.39%
Driver license issuance replacement      470,631     469,073        1,558                   0.33%
Form 410 – renewal by mail                 3,918       3,632          286                   7.30%
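Minimal T-SQL sketches of the blank field count and the key field comparison follow, again using the hypothetical dmv_records table from the earlier examples, plus a hypothetical SOS reference table sos_approved_parties holding the approved standardized values.

    -- Blank field analysis: count blanks in one field and compute the percentage
    SELECT COUNT_BIG(*) AS total_records,
           SUM(CASE WHEN political_party IS NULL
                      OR LTRIM(RTRIM(political_party)) = '' THEN 1 ELSE 0 END)
               AS blank_count,
           100.0 * SUM(CASE WHEN political_party IS NULL
                              OR LTRIM(RTRIM(political_party)) = '' THEN 1 ELSE 0 END)
                 / COUNT_BIG(*) AS blank_percentage
    FROM dmv_records;

    -- Key field analysis: join the unique value list to the reference list;
    -- values with no match are not on the SOS approved list
    SELECT d.political_party, COUNT_BIG(*) AS record_count
    FROM dmv_records AS d
    LEFT JOIN sos_approved_parties AS r
           ON r.party_name = d.political_party
    WHERE d.political_party IS NOT NULL
      AND LTRIM(RTRIM(d.political_party)) <> ''   -- blanks omitted from testing
      AND r.party_name IS NULL                    -- value not in reference list
    GROUP BY d.political_party;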
1.4.4 FIELD TRUNCATION ANALYSIS

STRATEGY
Objective: To identify potential data quality issues between the source system and the downstream system related to field truncation. This may arise when the two systems have different field length definitions or data types.

Concept Logic:
• Identify the minimum and maximum length per field within the source system
• Identify the minimum and maximum length per field within the downstream system
• Compare the source system field lengths with the downstream system field lengths (per field)
• Report on differences

RESULTS
The analysis identified 24 fields with potential truncation between the DMV and SOS data sets.

1.4.5 EXACT MATCH ANALYSIS

STRATEGY
Objective: To identify potential data quality issues between the source system and the downstream system related to inaccurate data between systems. This may arise when records are mapped inaccurately or during loading from transmission.

Concept Logic:
• Identify key fields within the source system
• Identify key fields within the downstream system
• Perform field level mapping between the source and downstream systems
• Join the data sets based on exact match criteria
• Report on differences

RESULTS
The analysis identified 1,759,121 instances within 27 field types where an exact match was not identified between the DMV and SOS data sets. 88% of these items reside within the fields labeled as the date the record was created in the DMV system, the date the record was effective in the DMV system, and political party. Fields with over 1% difference are included in Table 5 below.

Table 5. Unmatched field types over 1%

Field type description            Unmatched count   Unmatched percentage
Created date                              902,719                 28.35%
Effective date                            475,711                 14.94%
Political party                           173,243                  5.44%
Former residence zip code                  91,927                  2.89%
Current mailing street address             40,742                  1.28%
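For illustration, the truncation and exact match checks could be sketched in T-SQL as follows, reusing the hypothetical tables and columns from the earlier examples; in practice the exact match comparison was repeated across all 41 mapped fields.

    -- Field truncation analysis: compare per-field minimum / maximum lengths
    -- between the source and downstream systems
    SELECT 'dmv' AS data_set,
           MIN(LEN(political_party)) AS min_length,
           MAX(LEN(political_party)) AS max_length
    FROM dmv_records
    UNION ALL
    SELECT 'sos',
           MIN(LEN(political_party)),
           MAX(LEN(political_party))
    FROM sos_records;

    -- Exact match analysis: join the data sets on their match keys and flag
    -- records whose field values are not byte-for-byte identical
    SELECT d.transaction_id,
           d.political_party AS dmv_value,
           s.political_party AS sos_value
    FROM dmv_records AS d
    INNER JOIN sos_records AS s
            ON s.transaction_id = d.transaction_id
           AND s.transfer_id   = d.transfer_id
    WHERE ISNULL(d.political_party, '') <> ISNULL(s.political_party, '');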
1.5 Reporting

STRATEGY
The results of the analytic procedures are to be provided to DMV and SOS in tabular format utilizing Microsoft Excel. Observations and the procedures followed to produce the results will be shared, and additional support can be provided to help understand the results. DMV and SOS will be responsible for all follow-ups on data differences and action items.

RESULTS
An appended DMV data file, an appended SOS data file, and the detailed results of the analytics described above were provided to DMV and SOS personnel on encrypted storage devices as authorized by DMV and SOS legal counsel. The hard drive that stored the voter registration data on the analytics laptop during the validation procedures has been removed from the laptop and remains with SOS, as agreed with DMV and SOS legal counsel.

The analytics team provided a debrief of the data transfer validation strategy and results in a meeting with both DMV and SOS on May 30, 2019. Rationalization, analysis, and follow-up activities related to the results of the analytics are to be determined and executed by DMV and SOS.

Appendix A: Deliverable Expectation Document

Introduction
The objective is to propose and execute a Department of Finance approved strategy to validate the integrity of the California New Motor Voter (CNMV) data transferred from the Department of Motor Vehicles (DMV) to the Secretary of State (SOS) during the period prior to EASE integration (Apr. 23, 2018 – Sept. 26, 2018). The validation will utilize scripts to compare the full population of DMV records to the full population of records received by SOS based on the data transfer time stamp. Records that do not match will be provided to DMV for review and analysis.

1. Deliverable Description
The Pre-Integration File Validation Strategy deliverable will include a written strategy and a description of the procedures to be performed to accomplish the comparison of the DMV records transferred to the records received by SOS for the period noted above. The deliverable will include the results of our procedures and an inventory of DMV and SOS records that do not match each other.

2. Deliverable Contents
The Pre-Integration File Validation Strategy deliverable will be a written report containing the DOF approved strategy, the detailed procedures performed, a summary of the results of our procedures, and an inventory of DMV and SOS records that do not match each other. Additionally, the deliverable will list the received artifacts and the stakeholders interviewed.

The intention of the deliverable is to provide a strategy for validating the integrity of the California New Motor Voter data transferred from DMV to SOS during the period prior to EASE integration and to identify records that do not match between the DMV and SOS repositories. This strategy may be leveraged by the CNMV program team during future data validation procedures.

The report is advisory in nature. The data transfer validation procedures will not render an assurance report or opinion under this report, nor will the Services constitute an audit, review, examination, or other form of attestation as those terms are defined by the American Institute of Certified Public Accountants. None of the Services or any deliverables or reports will constitute any legal opinion or advice.

3. Key Activities
The activities below will be performed as part of the pre-EASE data transfer validation procedures.

Activity 1. Develop pre-EASE integration data validation strategy
  Description: Develop written procedures to validate the integrity of the CNMV data transferred from DMV to SOS during the period prior to EASE integration (Apr. 23, 2018 – Sept. 26, 2018)
  Outcome: Written data validation strategy, including procedures
  Key Assumptions: Procedures are based on the data structures previously provided by DMV and SOS
  DMV/SOS Resource Requirements: None expected

Activity 2. Obtain DOF leadership approval of pre-EASE integration data validation strategy
  Description: Discuss the data validation strategy and written procedures with DOF leadership and obtain approval for execution
  Outcome: Formal approval of strategy by DOF leadership
  DMV/SOS Resource Requirements: None expected

Activity 3. Data request and collection
  Description: Request CNMV registration data from DMV and SOS for the pre-EASE integration period (Apr. 23, 2018 – Sept. 26, 2018)
  Outcome: DMV CNMV registration data file (pre-EASE integration); SOS CNMV registration data file (pre-EASE integration)
  Key Assumptions: DMV and SOS will provide data within 24 hours; data files provided meet the requirements included in the data request
  DMV/SOS Resource Requirements: DMV and SOS data integration leads; DMV and SOS business process owners

Activity 4. Reporting
  Description: Provide and walk through the initial draft of the Pre-Integration File Validation Strategy deliverable with the Department of Finance to obtain feedback; incorporate Department of Finance feedback and finalize the report within five business days; deliver a soft copy of the final Pre-Integration File Validation Strategy deliverable for approval
  Outcome: Pre-Integration File Validation Strategy final deliverable
  DMV/SOS Resource Requirements: DMV and SOS representatives attend a knowledge transfer session

4. Acceptance Criteria
Below are the specific acceptance criteria for this deliverable.
• Key activities were performed
• Written acceptance of the deliverable received from one of the following individuals:
  o Erica Gonzales, Chief, IT Consulting Unit (DOF), or
  o Thomas Giordano, Oversight Manager, IT Consulting Unit
• Written deliverable acceptance form complete

Appendix B: Points of Contact

The following is a list of the points of contact for the pre-EASE data transfer validation procedures.

Table 6. Points of contact

Individual   Title   Organization
                     DMV
                     DMV
                     SOS
                     SOS
                     SOS
                     DMV
                     SOS
                     DMV
                     DMV
                     DMV

Appendix C: Data Files Utilized in Analysis

Table 7. DMV and SOS pre-EASE data files included in analysis

File Name   Date Received   Provided By   File Format   Record Count Provided   Record Count Loaded
            5/14/2019       (SOS)         CSV                       3,184,724             3,184,724
            5/14/2019       (SOS)         CSV                       3,184,724             3,184,724
            5/15/2019       (DMV)         CSV                         430,977               430,977
            5/15/2019       (DMV)         CSV                         264,124               264,124
            5/15/2019       (DMV)         CSV                         313,499               313,499
            5/15/2019       (DMV)         CSV                         254,813               254,813
            5/15/2019       (DMV)         CSV                         289,526               289,526
            5/15/2019       (DMV)         CSV                         399,094               399,094
            5/16/2019       (DMV)         CSV                       1,240,756             1,240,756

Table 8. Appended data files and results files provided to DMV and SOS

File Name   Date Provided   Received By     File Format
            5/22/2019       (DMV), (SOS)    Text - pipe delimited with double quote qualifiers
            5/22/2019       (DMV), (SOS)    Text - pipe delimited with double quote qualifiers
            5/22/2019       (DMV), (SOS)    Microsoft Excel
            5/22/2019       (DMV), (SOS)    Text - pipe delimited with double quote qualifiers

Appendix D: Acronyms and Terminology

Acronym/Term     Definition
CNMV             California New Motor Voter
DMV              California Department of Motor Vehicles
EASE             Enterprise Application Services Environment
SOS              California Secretary of State
SQL              Structured Query Language
VARCHAR(4000)    Variable Character Field: a set of character data with a maximum length of 4,000 characters