UNITED STATES DEPARTMENT OF COMMERCE
Economics and Statistics Administration
U.S. Census Bureau
Washington, DC 20233-0001
OFFICE OF THE DIRECTOR

May 17, 2018

Mr. Austin R. Evers
American Oversight
1030 15th Street NW, Suite B255
Washington, DC 20005
foia@americanoversight.org

Dear Mr. Evers:

This letter is in further response to your correspondence, dated March 14, 2018, to the U.S. Census Bureau's Freedom of Information Act (FOIA) Office. We received your request in this office on March 16, 2018, and have assigned it tracking number DOC-CEN-2018-000937. We are responding under the FOIA to your request for:

1. Records sufficient to show the IT systems necessary for carrying out the Bureau's 2020 Census redesign, including, if such a record exists, a list of these necessary IT systems.
2. Records sufficient to show the most recent delivery schedule, or schedules, for all IT systems that will be a part of the Bureau's 2018 End-to-End Test for the 2020 Census.
3. Records sufficient to show the most up-to-date delivery and testing schedule, or schedules, for all IT systems that will be used, or may be used, in the 2020 Census.
4. Records sufficient to show the Bureau's contingency plans for additional testing of IT systems that are not fully tested by the completion of the 2018 End-to-End Test.
5. All records deemed responsive to the U.S. House of Representatives Committee on Oversight and Government Reform's February 20, 2018 letter requesting documents related to the Bureau's IT system testing and deployment for the 2018 End-to-End Test, and any records produced to the Committee related to the Bureau's IT systems and the 2018 End-to-End Test.

Enclosed are nine (9) documents (191 pages) that are responsive to your request; these records are fully releasable under the FOIA. There is no charge for these records.

Please contact Deloris Reed of my staff by telephone at 301-763-2127 or by email at census.efoia@census.gov if you have any questions regarding your request.

[Signature], CIPP/G
Freedom of Information Act/Privacy Act Officer
Chief, Freedom of Information Act Office

Enclosures


UNITED STATES DEPARTMENT OF COMMERCE
Economics and Statistics Administration
U.S. Census Bureau
Office of the Director
Washington, DC 20233-0001

February 22, 2018

The Honorable Gerald E. Connolly
Ranking Member
Subcommittee on Government Operations
Committee on Oversight and Government Reform
U.S. House of Representatives
Washington, DC 20515-6143

Dear Representative Connolly:

This responds to your November 21, 2017 and February 20, 2018 letters requesting information regarding delivery of the Information Technology (IT) systems for the 2018 End-to-End Census Test and the 2020 Decennial Census.

As the Census Bureau approaches the 2018 End-to-End Census Test, it is rigorously tracking the status of the systems, each with its own well-defined scope, requirements, schedule, and costs, and each overseen by experienced project management teams. The Government Accountability Office (GAO) has conducted a series of meetings about IT systems readiness with the Census Bureau. GAO has suggested that delays in system deployment may prevent the Census Bureau from properly testing each system. Since we met with GAO, we have revised our testing deployment schedule to prepare for the 2018 End-to-End Test. The revised deployment schedule allows us to incorporate broad fraud protection measures.
The Census Bureau updated GAO on its plans for the 2018 End-to-End Census Test, including testing of the fraud detection system, which will run the fraud statistical models on response data as a post-data-collection operation. The Census Bureau regularly meets with GAO at multiple levels to review and discuss IT systems readiness. If it would be helpful, we would be happy to participate in a joint meeting between Census Bureau staff, GAO, and your office.

The Census Bureau continues to make significant progress in the development and integration of the various systems designed to handle the collection, processing, tabulation, and other operational functions for a successful count in the 2020 Census. Forty-four systems are being deployed in the 2018 End-to-End Census Test. As of February 15, 2018, 40 of those 44 systems have been delivered, and the remaining four systems are scheduled to be delivered beginning in April 2018 to support activities starting in July 2018. In addition, 24 of the 40 delivered systems have successfully completed integration testing. Integration testing is currently being conducted for the remaining systems and will be completed for those systems before the 2018 End-to-End Test begins. No system will be released without completing the necessary integration testing.

Your letter requests a list of the critical path of IT systems necessary for successfully carrying out the Bureau's redesign of the Decennial Census. In addition to this list, which is enclosed, we are pleased to provide the Committee with the responsive documents listed below.[1]

1. Systems Completed and Future Releases with 2020 Releases
2. Automation Test Plan
3. Integration and Implementation Plan
4. Technical Integrator Performance Test Strategy
5. Test and Evaluation Management Plan
6. Revised Baseline of the Census Enterprise Data Collection and Processing (CEDCaP)

Finally, our Office of Congressional and Intergovernmental Affairs will be reaching out to your staff to arrange a briefing on recent changes to the Decennial Programs directorate, as your letter requests.

We hope this information has been helpful and appreciate your interest in this matter. If you have any additional questions or concerns, please contact our Office of Congressional and Intergovernmental Affairs at (301) 763-6100.

Sincerely,

Ron S. Jarmin
Performing the Non-Exclusive Functions and Duties of the Director

Enclosures

[1] Please note that these documents contain sensitive IT information, including descriptions of testing, names of internal servers, and schedules of system deployment for testing and production, which if not safeguarded could pose a security risk to the Census Bureau's IT infrastructure, processes, or data.
PRE-DECISIONAL - 2020 Census Systems
(As of 12/19/2017. Sources: Systems List v.38; 2018 IIP v.57.3; 2020 IIP v.02_2)

[The enclosed status spreadsheet tracks, for each system, its 2018 End-to-End Census Test releases, Test Readiness Review (TRR) and Production Readiness Review (PRR) dates, Conduct Operation dates, Authority to Operate (ATO) status, and planned 2020 Census releases. The per-system schedule and status cells did not survive extraction of this copy; the system inventory, legend, and release definitions are reproduced below.]

2020 Census systems:
1. 2020 Website
2. ATAC (Automated Tracking and Control)
3. BARCA (Block Assessment, Research, and Classification Application)
4. CAES (Concurrent Analysis and Estimation System)
5. CaRDS (Control and Response Data System)
6. CBS (Commerce Business System)
7. CDL (Census Data Lake)
8. CEDSCI (Center for Enterprise Dissemination Services and Consumer Innovation)
9. CEM (Customer Experience Management)
10. CENDOCS (Census Document System)
11. Centurion
12. CHEC (Census Hiring and Employment Check System)
13. CHRIS (Census Human Resources Information System)
14. CIRA (Census Image Retrieval Application)
15. CQA (Census Questionnaire Assistance)
16. CRM (Customer Relationship Management)
17. DAPPS (Decennial Applicant, Personnel and Payroll Systems)
18. Desktop Services
19. DMP (Data Management Platform)
20. DRPS (Decennial Response Processing System)
21. DPACS (Decennial Physical Access Control System)
22. DSC (Decennial Service Center)
23. ECaSE ENUM (Enterprise Census and Survey Enabling Platform - Enumeration)
24. ECaSE FLD OCS (Enterprise Census and Survey Enabling Platform - Field Operation Control System)
25. ECaSE ISR (Enterprise Census and Survey Enabling Platform - Internet Self-Response)
26. ECaSE OCS (Enterprise Census and Survey Enabling Platform - Operational Control System)
27. FDS (Fraud Detection System)
28. Geospatial Services
29. GUPS (Geographic Update Partnership Software)
30. iCADE (Integrated Computer Assisted Data Entry)
31. IDMS (Identity Management System)
32. ILMS (Integrated Logistics Management System)
33. IPTS (Intelligent Postal Tracking System)
34. LiMA (Listing and Mapping Application)
35. MaCS (Matching and Coding Software)
36. MAF/TIGER (Master Address File/Topologically Integrated Geographic Encoding and Referencing Database)
37. MCM (Mobile Case Management)
38. MOJO Optimizer/Modeler (MOJO - Optimizer/Modeling)
39. MOJO Recruiting Dashboard
40. NPC Printing (Printing at the National Processing Center)
41. OneForm Designer Plus
42. PEARSIS (Production Environment for Administrative Records, Staging, Integration, and Storage)
43. PES Clerical Match and Map Update (Post-Enumeration Survey - Clerical Matching System and Map Update)
44. PES Imputation and Estimation (Post-Enumeration Survey - Imputation and Estimation System)
45. PES PCS (Post-Enumeration Survey - Processing and Control System)
46. R&A (Recruiting and Assessment)
47. RTNP (Real Time Non-ID Processing)
48. SMaRCS (Sampling, Matching, Reviewing, and Coding System)
49. SOA (Service Oriented Architecture)
50. Tabulation (Decennial Tabulation System)
51. UTS (Unified Tracking System)
52. WebTQA (Web Telephone Questionnaire Assistance)

Acquired and support systems (standalone systems; no program-level integration testing required):
A1. CES (Center of Economic Studies)
A2. CFS Hotline (Census Field Supervisor Hotline)
A3. Commercial Printing
A4. dDaaS (Decennial Device as a Service)
A5. DSSD (Decennial Statistical Studies Division)
A6. ENS (Emergency Notification System)
A7. Fingerprint Vendor
A8. NPC (National Processing Center)
A9. POP (Population Division)
A10. Sunflower

External systems:
B1. Bureau of Fiscal Service
B2. CLER (Centralized Enrollment Clearinghouse System)
B3. DHS USCIS (U.S. Citizenship and Immigration Services)
B4. EWS (Equifax Workforce Solutions)
B5. FBI (Federal Bureau of Investigation)
B6. Federal LTC Insurance (LTC Partners/B58 Federal Long Term Care Insurance Program)
B7. NFC (National Finance Center)
B8. NGA Imagery Service (National Geospatial-Intelligence Agency Imagery Service)
B9. OCSE (Office of Child Support Enforcement)
B10. OPM (Office of Personnel Management)
B11. RITS (Retirement and Insurance Transfer System)
B12. SSA (Social Security Administration)
B13. USACCESS
B14. USPS (United States Postal Service)
B15. WebTA (Web Time and Attendance System)

Key (legend pairings are only partially recoverable from this copy):
- X,X* = Participated/will participate in Census/Census Test (with TRR information)
- G = GAO identified system development and integration testing as "complete"
- Release G for Geographic Programs
- CR = Change Request pending
- SIA = Security Impact Assessment (SIA) required to determine if reauthorization is necessary
- Color/status flags: 2020 Census system, not included in the 2018 End-to-End Census Test; Not Applicable (grey); CEDCaP system; At risk of meeting Conduct Operation date; At risk of meeting Test Readiness Review (TRR) date (yellow); In production (green); Status has changed from At Risk to On Track
- Reasons for At Risk status: Schedule; Technical blockers; Budget/Resources

2018 End-to-End Census Test releases:
- Release I (In-Office Address Canvassing)
- Recruiting Release 1 (AdCan Recruiting; PES IL/IHUFU Recruiting*)
- Training Release 1 (AdCan Training)
- Release A (In-Field Address Canvassing); TRR 1 = final functionality for all AdCan systems except ECaSE, MCM/LiMA, and UTS; TRR 2 = final ECaSE, UTS, and MCM/LiMA functionality
- Recruiting Release 2 (Field Enumeration Recruiting; PES PI/PFU/FHUFU Recruiting*)
- Training Release 2 (NRFU Training)
- Release C-2 (Self-Response; includes Printing/Mailing/Workload and CQA/Self-Response)
- Release C-3 (GQ Workload/Advanced Contact/All GQ Training)
- Release D-1 (Field Enumeration; includes UL/NRFU/Coverage Improvement operations)
- Release D-2 (GQ eResponse/GQ Enumeration/Service Based Enumeration)
- Release E-1 (Tabulation and Dissemination - Residual Coding)
- Release E-2 (Tabulation and Dissemination - Post Capture Data Interface, Center for Dissemination Information, Primary Selection Algorithm, Census Unedited File)
- Release E-3 (Tabulation and Dissemination - Census Edited File, Micro Data File, Disseminate PL)

* 2020 only

2020 Census releases:
- Release 1 (Recruiting for all positions; Selection/Hiring/Training of RAs, PAs, OOS, and Clerks)
- Release 2 (AdCan selection of CFSs, Enumerators, and Listers; PES Sample Release: Initial Sample for PES; AdCan Training; In-Field Address Canvassing; Peak Operation Recruiting)
- Release 3 (Advertising and Earned Media; HU Count Review; Peak Operation Training (includes UL/GQ/UE/NRFU); Post-Enumeration Survey - IL Training; Post-Enumeration Survey - Independent Listing; GQ Workload and Advanced Contact/CQA Training/Printing & Mailing Workload; Remote Alaska; Island Areas Censuses; Enumeration at Transitory Locations; Self-Response (includes Mailing/Self-Response/CQA/Coverage Improvement); Peak Operations (includes UL/UE/GQ/SBE/Early NRFU/NRFU); Post-Enumeration Survey - Person Interview; Post-Enumeration Survey - Initial Housing Unit Follow-up; Post-Enumeration Survey - PI Matching (E-Sample ID, Computer Matching, BFU Clerical Matching))
- Release 4 (Tabulation/Dissemination; Archiving; Federally Affiliated Count Overseas; Redistricting Data; Post-Enumeration Survey - Person Follow-up; Count Question Resolution; Post-Enumeration Survey - Final Housing Unit Follow-up; Post-Enumeration Survey - Reports & Release Findings)


2020 Census
Automation Test Plan
Version 2.0
April 21, 2017

Technical Directive # 007
DCN: WP007-003-002
Work Order # YA1323-15-BU-0033/003
Prepared by: T-Rex Corporation

Version History

Version | Date | Author(s) | Description of Change
1.0 | 1/31/17 | Trong Bui | Initial draft
1.1 | 2/10/2017 | Casie, Shubra, Kabir, Puneet, Rajeev and Gayathri | Updated comments from Candice P. peer review.
2.0 | 4/21/2017 | TI Automation Test Team | Updated comments from Beverly.
Table of Contents

1 Introduction
  1.1 Purpose
  1.2 Scope of Testing
2 Automation Test Approach
  2.1 Test Tools
  2.2 Test Environment
3 Test Automation Process
  3.1 Phase 1: Discovery
    3.1.1 High Level Scope
    3.1.2 Architecture
    3.1.3 Investigation
    3.1.4 Entry Criteria
    3.1.5 Exit Criteria
  3.2 Phase 2: Planning
    3.2.1 Detailed Planning
    3.2.2 Framework Creation
    3.2.3 Entry Criteria
    3.2.4 Exit Criteria
  3.3 Phase 3: Execution
    3.3.1 Test Automation Scripting
    3.3.2 Test Data Creation
    3.3.3 Test Automation Execution
    3.3.4 Results and Analysis
    3.3.5 Entry Criteria
    3.3.6 Exit Criteria
  3.4 Phase 4: Maintenance
    3.4.1 Re-test and Validation
    3.4.2 Regression Testing Support
    3.4.3 Entry Criteria
    3.4.4 Exit Criteria
4 Testing Dependency and Risks
  4.1 Dependency
  4.2 Risks and Contingency
5 Key Roles and Responsibility
6 Assumptions and Constraints
  6.1 Assumptions
  6.2 Test Constraints
Appendix A: Acronyms
Appendix B: Glossary
Appendix C: Referenced Documents

Table of Tables
Table 1: Test Environments
Table 2: Key Roles and Responsibility

Table of Figures
Figure 1: Test Automation Approach Diagram
Figure 2: Test Automation Process
Figure 3: Hybrid Automation Framework
Figure 4: Automation Test Script Flow
Figure 5: Defect Lifecycle

1 Introduction

1.1 Purpose

The 2020 Census Program will maximize the use of automated testing to improve efficiency and quality. Test automation offers many benefits: it allows the TI test team to execute test cases faster, and it increases test coverage by making it possible to test complex data combinations and to process complex test scenarios. We provide a solid foundation by designing an effective test automation strategy through a phase-by-phase process (refer to Figure 2). Our goal is to create a sustainable test automation framework tailored to the entirety of the 2020 Census systems.

The Automation Test Plan describes the testing approach, tasks, activities, and overall framework that will drive the testing of the 2020 Census Program in accordance with the 2020 Census Decennial Test and Evaluation Management Plan (TEMP). The TEMP identifies the tasks and activities that must be performed to ensure that all aspects of the Program are adequately tested and that the systems can be successfully implemented.
The Automation Test Plan identifies the automation test approach, the environment in which testing will be conducted, the automation test process to be implemented, entry and exit criteria for each phase, and test dependencies and risk mitigation. This document is a work product and will be continuously adjusted throughout the 2020 Census Program based on the progress of each individual test team.

1.2 Scope of Testing

The functional automated test cases are identified by the 2020 Census Program Level Test Team. The scope of automation is to reduce time and maximize efficiency by building a reusable process that increases the quality of the work. The Automation Test Team will focus on automating recurring, high-priority test cases identified by the Program Level Test Team. The 2020 Census Program Test Strategy is an overall approach for validating the systems under development by ensuring that they are designed, built, and implemented to meet Census business requirements and are fully operational upon deployment (refer to TEMP, Appendix C). In addition to the 2020 Program Test Strategy, the Automation Test Team will focus on defining test strategies for the different test phases, which will be executed by the Program Level Test Team.

2 Automation Test Approach

The approach is to test and ensure that the 2020 Census System of Systems (SoS) under development evolves as an integrated system. This requires that the system be integrated and tested in a logical, incremental fashion as the development phase of the program matures (refer to the Integration and Test Plan, Appendix C). Figure 1 below illustrates the automation test approach, which follows all other Program Level test types, such as Regression, Infrastructure, Interface, and Business Thread testing.

[Figure 1: Test Automation Approach Diagram. The diagram notes that successful test automation implementations are predicated on regression testing efforts.]

During Phase 1: Discovery, the Automation Test Team and Program Level Test Teams will work closely with the TI Segment Teams and the Project Level Teams to understand business needs, the project infrastructure, and the project application lifecycle. In addition, using the Architecture and Design document, the Automation Test Team will identify dependencies and risks.

During Phase 2: Planning, the Automation Test Team will develop an Automation Test Plan and milestone schedule to ensure stakeholder expectations are met. The Automation Test Team will identify and prioritize the test cases best suited for automation; the selection criteria depend on the complexity of the test cases and the prioritization of business requirements. Not all test cases will be automated. During this phase the Automation Test Team will also create the automation framework.

During Phase 3: Execution, the Automation Test Team will create and execute test scripts.

During Phase 4: Maintenance, the Automation Test Team will update and re-execute test scripts based on enhancements and bug fixes.

2.1 Test Tools

Automation test tools for Program Level Testing include:
- HP Unified Functional Tester (UFT 12.53) - used for automation testing.
- HP ALM 12.53 - used for requirements management, developing test cases, tracking defects, and metrics generation.
- Perfecto - used for mobile testing.

2.2 Test Environment

Automation testing will be conducted within the System of Systems to perform common, repeatable tests that check the behavior of the Application Under Test (AUT). The application is closely tied to the environment and infrastructure that support it, so a large focus will be placed on test environment management. During all test phases, process/procedure documentation from the Project Level Team needs to be established to ensure consistent configurations across different environments and builds. It will be critical to properly manage the test environment to ensure the availability and stability of the systems during automation testing. Refer to the TEMP for further details on the testing environment.

Table 1: Test Environments

Environment | Test Level | Brief Description of Environment
Independent Test | Program Level | Environment used for Program Level Integration Testing, Security Controls Testing, and CAT (Output)
Staging | Program Level / Operational Level | Environment used for Checkout and Certification (C&C), System Readiness Test (SRT), Performance and Scalability Testing, and ORT

3 Test Automation Process

Figure 2 below illustrates the TI Team's test automation process, which contains four phases:
- Phase 1: Discovery
- Phase 2: Planning
- Phase 3: Execution
- Phase 4: Maintenance

This test automation process will be repeated for each release.

[Figure 2: Test Automation Process - cycle of the Discovery, Planning, Execution, and Maintenance phases.]

3.1 Phase 1: Discovery

During Phase 1: Discovery, the Program Level Test Team will work closely with the TI Segment Teams and Project Test Teams to evaluate the SoS and gather capability requirements. The assessment effort will dive deep into the solution architecture to understand the maturity level of the functionality of each system.

3.1.1 High Level Scope
- Project planning that involves determining and documenting a list of specific program goals, deliverables, features, functions, and tasks.
- Understand stakeholders' priorities for automation testing tasks.

3.1.2 Architecture
- Use the architectural diagram of the 2020 Census System of Systems to map work flows, release by release.
- Understand the high-level architecture across all systems and the integration of systems.

3.1.3 Investigation
- Discover a process to test the SoS based on work flow.
- Understand the Project Development Team's dynamics.
- Understand the maturity level of the application.

3.1.4 Entry Criteria
The following list provides the entry criteria for Phase 1:
- Acquire synthetic automation test data.
- Complete systems maturity level and status documentation.

3.1.5 Exit Criteria
The following list provides the exit criteria for Phase 1:
- Automation test environment
- Automation test priority/schedule
- Program automation test framework and tools selection

3.2 Phase 2: Planning

In Phase 2: Planning, the Automation Test Team will work with all required stakeholders to design the Automation Test Plan in preparation for test execution.
3.2.1 Detailed Planning

As the other Program Level Test Teams create their test cases based on procedures and scenarios, the Automation Test Team will work with them to identify which test cases are the best candidates for automation. Since testing is cumulative, new test cases are added to the automation test suite to ensure full test coverage. Additional tests may be run from the Automation Test Suite at the request of the Program Level Test Team. Criteria used to determine candidates for automation:
- Repeatable test cases
- System stability
- Complex manual test cases
- Cross-browser and cross-platform testing

3.2.2 Framework Creation

As part of framework creation, the Automation Test Team will determine the appropriate automation tool(s) to automate re-usable components, which will accelerate the test process and improve maintainability across the application, and will develop the automation test plan and schedule for each component based on its level of maturity. A Hybrid Automation Framework will be created for the 2020 Census Program to maximize testing efficiency and increase quality.
- The Hybrid Framework is a combination of the Data-Driven Framework (DDF) and the Functional Decomposition Framework. Data-driven testing is ideal for testing systems that are large, complex, and have varied test environments.
- The Functional Decomposition Framework reduces redundancy and repetition when creating test scripts. Script maintenance is easier because the re-usable code is available in a single place, no matter how many times it is called. If there is a change in any re-usable function, the tester needs to make the change in only one place.

Data-Driven Framework (DDF): Tests data and/or output values that are read from data files instead of using the same hard-coded values each time the test runs. Actual results are compared against the expected results.
- A set of data is read from a file.
- Data is simulated and the action is processed.

Functional Decomposition Framework (FDF):
- Analyze test cases and identify re-usable functions and flows.
- Write the test scripts, create re-usable functions, and update function libraries.

Hybrid Framework Folder Structure (an illustrative sketch of how these pieces fit together follows this list):
- Master Driver Script: The script that drives the entire execution. It performs the prerequisite and initial settings required for the execution.
- Library Files: The associated functions that form the function libraries.
- Data Table: The test data required for execution.
- Object Repository: The objects and their properties that enable UFT to recognize the objects seamlessly.
- Execution Logs: This folder contains the Execution Log file with user functions and function execution history.
- Results: This folder contains all the execution results (Pass/Fail/No Run).
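The plan itself does not include sample scripts, and no UFT code is reproduced here. Purely as an illustrative sketch of the hybrid approach described above, the Python snippet below shows a master driver reading rows from a data table (data-driven) and calling re-usable library functions (functional decomposition). All names in it - the `app` client, `login`, `submit_case`, and the CSV columns - are hypothetical and are not part of the Census Bureau's actual framework.

```python
"""Illustrative sketch only: a hybrid data-driven / functional-decomposition driver.
Hypothetical names; not the Census Bureau's or UFT's actual code."""
import csv

# --- Function library (functional decomposition): re-usable steps -----------
def login(app, user, password):
    # Re-usable step shared by many test cases; maintained in one place.
    return app.authenticate(user, password)        # 'app' is a hypothetical AUT client

def submit_case(app, case_id, answers):
    # Re-usable step: submit a response and return the resulting status string.
    return app.submit(case_id, answers)

# --- Master driver (data-driven): read the data table, run steps, compare ----
def run_suite(app, data_table="test_data.csv"):
    results = []
    with open(data_table, newline="") as f:
        for row in csv.DictReader(f):              # one row = one data-driven test case
            login(app, row["user"], row["password"])
            actual = submit_case(app, row["case_id"], row["answers"])
            # Actual results are compared against the expected results from the data file.
            results.append((row["test_case"], "Pass" if actual == row["expected"] else "Fail"))
    return results
```

Because each step lives in the function library, a change to a shared flow is made once and is picked up by every data row that exercises it.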
[Figure 3: Hybrid Automation Framework - diagram showing the hybrid framework engine combining the Functional Decomposition and Data-Driven components, the script driver and report engines, the function library, test management in ALM, and test results and defects.]

- Function Library
  - Re-usable functions
  - Version control repository (ALM)
- Object Repository
  - Stored identified objects
  - Descriptive programming
- Script Driver Engine and Report Engine
  - HP ALM; the UFT driver script will generate test results after execution.
- Test Execution Management
  - Scheduled test automation script execution.
- Test Data
  - The Test Data Management (TDM) Team will generate synthetic test data for different kinds of testing, without referencing existing production data, to avoid confidentiality or privacy risks. Test data creation and maintenance is an integral aspect of automation testing. High-quality, production-like test data yields a high level of data permutation and ensures wider test scenario coverage during automation testing. A structured approach to the creation and configuration management of the test data will enable and expedite testing activities. This will result in a collection of realistic test artifacts with thorough coverage and correlation at statistically relevant volumes.

3.2.3 Entry Criteria
The following list provides the entry criteria for Phase 2:
- Successful completion of Phase 1 activities.
- Test cases analyzed and re-usable functions and flows identified.
- Software must successfully complete Project Level Testing prior to entry into TI Integration Level Testing.
- Stable builds with no urgent or high-priority defects.

3.2.4 Exit Criteria
The following list provides the exit criteria for Phase 2:
- Prioritized test cases to be automated
- Creation of the Hybrid Automation Framework
- Generation of automation test scripts

3.3 Phase 3: Execution

In Phase 3: Execution, the first step is to analyze and select candidate test cases to automate. Next, automation test scripts are created for the selected test cases. The test scripts are then peer-reviewed before they are finalized and executed. Finally, failing test results are analyzed. Automation scripts may need to be updated, re-tested, and developed iteratively as needed.

[Figure 4: Automation Test Script Flow]

3.3.1 Test Automation Scripting
- Select and/or update suitable test cases.

3.3.2 Test Data Creation
- Simulated data will be provided by ExactData as part of TI Task Directive 004 (refer to TEMP, Appendix C).

3.3.3 Test Automation Execution
- Test automation execution uses the Hybrid Automation Framework within UFT (refer to Figure 3: Hybrid Automation Framework).

3.3.4 Results and Analysis

An HTML report and a log file will be generated after each execution through UFT. The HTML report will be used for reporting purposes and the log file for debugging purposes. Additional execution reports can be regenerated in UFT.
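UFT's report format is not reproduced in this plan. As a rough, hypothetical illustration only, the sketch below shows one way a per-execution record carrying the required report fields could be appended to a log file; the field names mirror the list that follows, and everything else (the function, the file name, the JSON-lines format) is an assumption for the sake of the example.

```python
"""Hypothetical sketch of a per-execution result record; not UFT's actual report format."""
import json
from datetime import datetime

def log_result(test_case, status, function_name, test_data, log_path="execution_log.jsonl"):
    """Append one execution record carrying the report fields required by the plan."""
    record = {
        "test_case_name": test_case,
        "execution_datetime": datetime.now().isoformat(timespec="seconds"),
        "execution_status": status,              # e.g., Pass / Fail / No Run
        "function_name": function_name,          # re-usable function exercised
        "test_data": test_data,                  # data row driving this execution
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")       # one line per execution, easy to debug
    return record
```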
The HTML and log file reports will include the following:
- Test case name
- Test execution date and time
- Execution status
- Function name (suggested names of re-usable functions)
- Test data

3.3.5 Entry Criteria
The following list provides the entry criteria for Phase 3:
- Successful completion of Phase 1 and Phase 2
- Test environment availability
- Availability of test data

3.3.6 Exit Criteria
The following list provides the exit criteria for Phase 3:
- Updated/modified automation test scripts for testing
- Updated automation test data for testing
- Test Automation Results Report is generated

3.4 Phase 4: Maintenance

In Phase 4: Maintenance, the Automation Test Team will update and re-test the automation scripts as necessary. When a defect is identified, the Automation Test Team will log the defect details into HP ALM to ensure that the Project Team and other stakeholders are aware that a defect exists and must be addressed. The defect reporting process is described in detail in the Defect Remediation Plan (refer to Defect Remediation, Appendix C). After defect fixes, test scripts directly related to those defects need to be run to ensure that the fixes do not alter other functionality (refer to Figure 5: Defect Lifecycle). The Test Team must also conduct an impact and risk analysis for the defects and identify additional test scripts that should be run based on code complexity, defect density, and code priority. Progress and issues are continuously communicated through various channels (status dashboard, daily standups, weekly meetings, and TARs) to apprise stakeholders of the current test status.

[Figure 5: Defect Lifecycle - defect states include Assigned, Resolved, Re-test, and Re-Open.]

3.4.1 Re-test and Validation
- The Automation Test Team will continue re-testing and validating the scripts and functions as defects are found in the System of Systems, to ensure that the automation test scripts are running properly.

3.4.2 Regression Testing Support
- An automated regression test will be performed when there is:
  - a change in requirements and the code is modified per the requirement;
  - a new feature added to the software;
  - a defect fixed; or
  - a request by other Program Test Teams.

3.4.3 Entry Criteria
The following provides the entry criteria for Phase 4:
- All planned test cases have been scripted and executed
- New enhancements are made and/or defects are found

3.4.4 Exit Criteria
The following provides the exit criteria for Phase 4:
- Test execution is completed successfully
- Test Automation Results Report is generated
- All defects are addressed

4 Testing Dependency and Risks

4.1 Dependency
The following dependencies have been identified for automation testing:
- The Program Level Test Team is dependent on the Project Level Test documentation, which aligns with the functional capabilities for each release.
- Automation test tools, such as Unified Functional Testing (UFT) and Application Lifecycle Management (ALM), and the supporting infrastructure must be available.
- Project Level Testing needs to be completed and Test Readiness Reviews (TRRs) must be up to date for automation testing.
- Test data will be delivered based on the test schedules.

4.2 Risks and Contingency
The following risks have been identified for automation testing, each of which may have a negative impact on it:

No. | Risk | Impact | Contingency
1 | Testing release deadlines | High | Prioritize the workload in order to meet the deadline.
2 | Urgent/high-priority defects found at a late stage in the cycle | High | Minimize implementing new or updated requirements that could impact downstream and/or upstream applications at a late stage of the testing cycle.
3 | Parallel test execution using the same test environment may affect test data and produce irrelevant test results | Moderate | Schedule and coordinate within the Program Level Test Team to execute the test scripts for the systems under test.
4 | Limitations of the data set provided by TMD006 could impact automation testing | Moderate | The Program Level Test Team will assist the Automation Test Team in creating the test data based on the systems under test.

5 Key Roles and Responsibility

Table 2: Key Roles and Responsibility outlines the key roles associated with automation test activities. The roles described in the table will be involved in all phases of automation testing to support the various activities as needed.

Table 2: Key Roles and Responsibility

Key Roles | Responsibilities
Automation Test Lead | Manages test activities and tasks. Leads automation test planning. Monitors, measures, controls, and reports on test progress. Manages responsibilities and influences the direction of the automation effort, its schedule, and prioritization. Escalates technical risks/issues to the appropriate leads. Monitors and reports automation test results. Reviews product requirements, functional specifications, and design documents to determine and prepare automated test cases.
Automation Testers | Plan and execute automation test activities based on priority. Report defects for issues found. Generate and analyze Automation Test Reports.
Test Data Management Team | Assists in generating the test data needed by the Program Level Test Team.

6 Assumptions and Constraints

6.1 Assumptions
- System design documents are completed and available.
- Automation testing tools are installed and licensed.
- Automation Test Design and Planning is completed.
- Synthetic test data is available for automation testing.
- Final versions of the Architecture and Design documents for the entire Project are available.
- The Independent Test Environment is available and functional before test execution.
- Issue and defect resolution supports the testing timeline.

6.2 Test Constraints
- Access to multiple systems, such as UTS, DAPPS, CHEC, databases, etc.
- Defect fixes must be delivered in a timely manner to avoid backlogs of test cases and to minimize down-time of test resources, especially for showstoppers.

Appendix A: Acronyms

Acronym | Definition
ALM | Application Lifecycle Management
AUT | Application Under Test
DDF | Data-Driven Framework
DevOps | Development Operation Resources
FDF | Functional Decomposition Framework
HP | Hewlett Packard
HTML | Hyper Text Markup Language
ORR | Operation Readiness Review
PII | Personally-Identifiable Information
PRR | Production Readiness Review
SoS | 2020 Census System of Systems
SRT | System Readiness Test
TAR | Testing Analysis Reporting
TDM | Test Data Management
TEMP | Test and Evaluation Management Plan
TRR | Test Readiness Review
UFT | Unified Functional Tester

Appendix B: Glossary

Term | Definition
Automation Test Suite | Collection of test cases intended to be used to test a software program to show that it has a specified set of behaviors.

Appendix C: Referenced Documents

Work Product | Description of Supporting Document or Site | Path or Link
Integration and Test Plan | Planned Delivery Date: TBD |
Test and Evaluation Management Plan (TEMP) | |
Defect Remediation Plan | Delivered on: 11/30/2016 |


2020 Census
Integration and Implementation Plan
Version 3.5

Due Date: March 14, 2017
Delivered Date: March 14, 2017
Last Update: September 21, 2017

Work Order # YA1323-15-BU-0033/003
Technical Directive # 006
DCN: TD006-014-004
Prepared by: T-Rex Corporation

This document is an interim draft until formally accepted by the Census Bureau. The document will be watermarked as Final upon acceptance.

Version History

Version | Date | Author(s) | Description of Change
2.0 | 09/29/2016 | Mitre | Updated Appendix C with data from System Engineering and Integration (SE&I); updated all timeline, operations, systems, and milestone diagrams; complete redesign of the "Other Supporting Operations" section; added significant content in Sections 6 and 7.
3.0 | 11/16/2016 | Mitre | Deleted references to the 2017 Puerto Rico Test except where retained for historical purposes; updated Census test information as appropriate; added the Executive Summary; made other updates based on updates to the Census Enterprise Architecture Transition Plan v3.0.
3.1 | 01/27/2017 | Natasha Barilaro, Jerome Carter, T-Rex | Comprehensive re-write of v3.0 to remove content duplicated in the Census Enterprise Architecture Transition Plan and dates managed in the 2020 Integrated Master Schedule (IMS); provided more detailed information on each 2020 release; provided information on how the 2020 Census Program has used the IIP to date.
3.2 | 02/16/2017 | Natasha Barilaro, T-Rex | Revised formal submittal.
3.3 | 03/14/2017 | Natasha Barilaro, T-Rex | Revised formal submittal based on comments received from Census Enterprise Data Collection and Processing (CEDCaP) after the formal submittal review period.
3.4 | 05/27/2017 | Natasha Barilaro, David Galdi, T-Rex | Revised to incorporate the following SE&I Change Requests (CRs) per the SE&I Configuration Management Process: 337, 197, 180, 173, 167. Removed views of the IIP spreadsheets and 2020 IMS dates to ensure these dates are not referenced in multiple baselined artifacts.
3.5 | 09/21/2017 | Natasha Barilaro, David Galdi, T-Rex | Updated to reflect CR 407: "2020 System Release Schedule and Supplemental SRR/CDR Dates."

Table of Contents

1 Executive Summary
2 Introduction
  2.1 Terminology in this Document
  2.2 Document Purpose and Scope
  2.3 Relationship to Other Documents
  2.4 Audience
3 Integration and Implementation Plan Overview
  3.1 Program Reviews
  3.2 Program Releases
4 Applying the Integration and Implementation Plan
  4.1 2020 Census IIP Approach
    4.1.1 Approach: Census Tests
    4.1.2 IIP Approach: Ongoing Geographic Programs
    4.1.3 IIP Approach: 2020 Census
  4.2 2020 IIP Reviews
  4.3 2020 IIP Releases
  4.4 Managing Changes to the 2020 IIP
5 Tracking Systems Against the IIP
  5.1 Star Milestones
  5.2 Business Rhythms and the IMS
  5.3 Release Management
6 Lessons Learned and Known Updates Required
  6.1 Lessons Learned
    6.1.1 General Benefits of the IIP
    6.1.2 Lessons Learned
  6.2 Known Updates Required
Appendix A: Acronyms List
Appendix B: References
B-1 APPENDIX Integration and Implementation Plan Version 3.5 iii COMM-CB-18-0152-A-000031 Table of Contents LIST OF TABLES Table 1: Review/Milestone Details ............................................................................................15 Table 2: 2020 Census Tests Releases ......................................................................................20 Table 3: 2020 Releases and Census Tests ...............................................................................21 Table 4: Scope of Releases ......................................................................................................22 Table 5: Star Milestones ...........................................................................................................30 Table 6: Established Business Rhythms ...................................................................................37 Table 7: IIP Lessons Learned ...................................................................................................40 Table 8: Known IIP Updates Required ......................................................................................41 LIST OF FIGURES Figure 1: 2020 Census Document Relationships........................................................................ 4 Figure 2: Census Bureau Enterprise Program Framework for Systems Readiness .................... 6 Figure 3: 2020 Census IIP Milestones .......................................................................................10 Figure 4: Overlap of 2018 E2E Census Tests vs 2020 Census Activities ..................................11 A \11 ICIntegration and Implementation Plan PVERSIGHT Version 3.5 iv COMM-CB-18-0152-A-000032 Executive Summary 1 Executive Summary The 2020 Census Integration and Implementation Plan (IIP) describes the framework for integrating Information Technology (IT) solutions comprising the 2020 Census Architecture. By providing an actionable framework for functional integration of IT activities and operational milestones, the 2020 IIP helps to realize the desired Solution Architecture that can fulfill the business goals of achieving a cost-efficient Census through modern technology, and implementing innovative Census operations as laid out in the 2020 Census Operation Plan for the following:  Re-engineering Address Canvassing (AdCan).  Optimizing Self-Response.  Utilizing Administrative Records and Third-Party Data.  Re-engineering Field Operations. The 2020 IIP describes the framework for integrating IT solutions comprising the 2020 Census Architecture by:  Providing an overview of an integration framework and its two major tenets, namely Program Reviews (e.g., Critical Design Reviews [CDRs], Systems Requirements Reviews [SRRs], Test Readiness Reviews [TRRs], Production Readiness Reviews [PRRs], etc.) and Program Releases (e.g., Release C: Self-Response, Release D: Field Enumeration, etc.).  Describing the approach the 2020 Census Program used to establish an integration framework for the 2020 Census, Census Tests, and ongoing geographic programs.  Describing how the 2020 IIP has been implemented in Decennial and is actively used to track systems progress and communicate key milestones across disciplines and organizations.  Providing a brief assessment on the success of the IIP within Decennial and a list of known updates required to the IIP to date. 
The 2020 IIP is a living document that drives the collaborative process, which will be refined or adjusted to minimize risk and maximize efficiency, yet meet the required timelines of the 2017 Census Test, 2018 End-to-End Census Test, and 2020 Census. The IIP achieves this by driving coordination and resolution of integration issues much earlier in the lifecycle than in the past, helping to ensure all systems come together in time for both testing and deployment. A \11 ICIntegration and Implementation Plan PVERSIGHT Version 3.5 1 COMM-CB-18-0152-A-000033 Introduction 2 Introduction This document describes how the Decennial established the IIP framework and how it is using this framework to manage systems integration activities. This section provides key information, including the use of specific terminology, scope and purpose of the document, relationship to other documents, and the intended audience. 2.1 Terminology in this Document Several terms in this document have a diverse set of meanings depending on their use. This section of the document intends to level-set the reader on the meaning of these terms in the context of the IIP.  Program-Level: This term refers only to the 2020 Census Program-Level. While there are several other “programs” referenced in this document, such as Center for Enterprise Dissemination Services and Consumer Innovation (CEDSCI) and CEDCaP, in the context of the IIP, CEDSCI and CEDCaP are considered solution providers.  Program-Level or Program Reviews: These terms refer only to the seven major reviews defined by the IIP, namely the Critical Business Proposal Review (CBR), Project Baseline Review (PBR), SRR, CDR, TRR, PRR, and Operational Readiness Review (ORR).  Milestone: This term generically refers to any key activity, point in time, or integration point.  Key Milestones: This term refers to a special set of interim integration points between the Program-Level Reviews.  Census Test: This term is used to reflect the series of operational live tests, sometimes referred to as “Field Tests” that are performed leading up to the 2020 Census, such as the Address Canvassing (AdCan) Test, 2017 Census Test, and 2018 End-to-End Census Test.  Solution Providers: The term “solution provider” is used synonymously with “system” in this document.  Conduct Operation: This term refers to a specific milestone, namely the first time a system capability within a release needs to support production operations. 2.2 Document Purpose and Scope The purpose of the 2020 Census IIP is to: Integration and Implementation Plan Version 3.5 2 COMM-CB-18-0152-A-000034 Introduction  Provide an overview of an integration framework and its two major tenets, namely Program Reviews and Program Releases.  Describe the approach the 2020 Census Program used to establish an integration framework for the 2020 Census, Census Tests, and ongoing geographic programs.  Describe how the 2020 IIP has been implemented in Decennial and is actively used to track systems progress and communicate key milestones across disciplines and organizations.  Provide a brief assessment on the success of the IIP within Decennial and a list of known updates required to the IIP. The following item is out of scope for the IIP:  While the 2020 Census IIP does include a high-level description of 2020 Census Program releases, its scope does not include the process for managing each individual release and its deployment. 
The 2020 Census Release and Deployment Management Plan (RDMP) provides the process for defining the functionality within each release, tracking the integration of that functionality, the deployment process, and the production tracking of releases.

2.3 Relationship to Other Documents

As shown in Figure 1: 2020 Census Document Relationships, the 2020 IIP (highlighted in green) is part of a broader set of inter-related documentation developed for the 2020 Census. The 2020 IIP is directed (denoted by a solid line in Figure 1) by the 2020 Census Architecture, because that document describes the target solution architecture, with the systems and interfaces planned for 2020 Census operations. It also provides guidance on the development of the systems that comprise the solution architecture and communicates the architectural principles that shall be considered when developing or providing capabilities for the 2020 Census.

The 2020 IIP is informed by (denoted by a dotted line in Figure 1) the 2020 Census Enterprise Architecture and Infrastructure Transition Plan (CEATP), as it contains the milestones necessary for further engineering planning to ensure a successful transition of the 2020 Census solution architecture at the 2020 Census Program level. The 2020 IIP informs the 2020 RDMP, which is currently under development. Section 5.3 further describes the relationship between the RDMP and the IIP.

[Figure 1: 2020 Census Document Relationships. Diagram placing the 2020 Census Integration and Implementation Plan within the broader documentation set: Operational Design documents (2020 Census Operational Plan, 2020 Census Detailed Operational Plans); Architecture and IT Solutions documents (2020 Census Architecture, CEDSCI and CEDCaP Segment Architectures, 2020 Census Interface Catalog, 2020 Census System List, 2020 Census Enterprise Architecture and Infrastructure Transition Plan, CEDCaP and CEDSCI Transition Plans, and the 2020 Census IT Infrastructure Strategy and Roadmap); and Engineering Integration documents (2020 Census IIP and 2020 Census Release and Deployment Management Plan). Solid lines denote "directs" relationships; dotted lines denote "informs" relationships.]

The 2020 IIP also informs many other 2020 Census Program artifacts that leverage the framework. For example, the 2020 Census Test and Evaluation Management Plan (TEMP) defines the types of testing performed between each of the major reviews. The 2020 Integrated Master Schedule (IMS) and lower-level systems schedules use the milestones within the IIP as a framework for detailed dates and activities. Similarly, business rhythms for multiple meetings are planned in the context of the IIP. Section 5.2 depicts the Decennial Systems Engineering and Integration (SE&I) business rhythms in the context of the IIP.

2.4 Audience

This document is intended for several different audiences:

 Solution Providers, such as CEDCaP, the Decennial Information Technology Division (DITD), CEDSCI, and the Applications Development Services Division - Enterprise Testing Services Branch (ADSD-ETSB).
   o Informs Solution Providers regarding the systems and technical capabilities required to support the various phases of testing and deployment leading to the 2020 Census.
 Key Stakeholders, such as the Decennial Census Management Division (DCMD), the Chief Technology Officer (CTO), and the Technical Integrator (TI).
   o Ensures all Census Bureau-required key elements are identified for a successful 2020 Census by providing release, milestone, and system information in views that stakeholders can use to identify gaps in system participation or availability.
 Census Bureau Leadership.
   o Provides information on what systems and infrastructure elements need to be in place, and when, to support testing for the Census Tests and the 2020 Census.
 Oversight.
   o Provides assurances to oversight bodies (e.g., the Government Accountability Office [GAO] and the Commerce Inspector General [IG]) that the Census Bureau has a plan for incrementally integrating and implementing the architecture over the phases of testing that occur in the years leading up to the 2020 Census.
 Office of Innovation and Implementation (OII).
   o Ensures that OII is aware of key milestones, dates, and deliverables.
 Segment Leadership (e.g., Decennial, CEDCaP).
   o Provides key milestones and dates to system owners in sufficient time to support readiness for each Census Test and ultimately the 2020 Census.

This plan can be used or referenced by associated projects and other programs, or separate documentation can be crafted, provided there is no contradictory content or misalignment with the strategies and actions prescribed in this document. This plan can be applied regardless of development methodology, whether agile or waterfall, in accordance with Census Enterprise Software Development Lifecycle (eSDLC) standards.

3 Integration and Implementation Plan Overview

The 2020 IIP leverages the Census Enterprise Program Readiness Framework to define a path for the implementation and integration of a set of identified systems, as shown in Figure 2 below.

[Figure 2 diagram: the Program Life Cycle (PgLC) phases (Concept, Program Definition, Development, Execution, and Operations and Disposal), with the Systems Readiness Reviews that Enterprise organizations and governance bodies conduct during each phase and prior to each release. The CBR, PBR, SRR, and CDR are held once per event, while the TRR, PRR, and ORR are held for each release. These high-level, formal reviews, similar to those held at the project level, ensure that the initial high-level approach is agreed to at the Decennial Program level and placed under configuration management control thereafter, that initial direction is clearly conveyed to projects, and that systems are appropriately tested and ready for operations to begin. Program SEIT teams execute the technical PgLC processes and review projects' technical activities, eSDLC artifacts, and outputs throughout the project life cycle.]
Figure 2: Census Bureau Enterprise Program Framework for Systems Readiness

Figure 2 also highlights the two key facets of the Enterprise Program Readiness Framework, namely: (1) Program-Level Reviews and (2) Program-Level Releases. The subsequent subsections describe these two key facets in more detail.

3.1 Program Reviews

The Enterprise Program Readiness Framework defines a series of high-level, formal reviews to ensure:

 The initial high-level approach is agreed to at the Program level and placed under configuration management (CM) control thereafter.
 Initial direction is clearly conveyed to projects.
 The system of systems (SoS) is appropriately tested and ready for operations to begin testing.

The major high-level program reviews defined by the Enterprise Program Readiness Framework are as follows:

 CBR – Review of the initial high-level architecture to ensure inclusion of the appropriate systems to implement the desired subset of the operations.
 PBR – Review of program baselines, including schedule, organizational structure, and risk.
 SRR – Joint review of business requirements and business process models (BPMs) by engineering and operations.
 CDR – Review of the high-level architecture and the business processes that define operations, interfaces, etc. Includes allocation of requirements to systems.
 TRR – Review to ensure appropriate test objectives, methods, procedures, scope, systems, and environments are ready.
 PRR – Review to assess program-level test results to ensure systems are ready for operational testing.
 ORR – Review to assess operational testing results to ensure systems are ready for production operations to begin.

The exclamation point in the upper right-hand portion of Figure 2 highlights the three milestones (TRR, PRR, and ORR) that repeat for every release, while the other milestones (CBR, PBR, SRR, and CDR) are performed only once per event (such as a Census Test). Section 4.2 provides a detailed description of each of these reviews as implemented by the 2020 Census Program.

3.2 Program Releases

Releases entail the implementation of a collection of target system capabilities necessary for specific operations within a specific period. The benefits of establishing a release structure include the following:

 Releases simplify SoS deliveries. A common set of releases and key milestones for each release limits the amount of coordination required to manage multiple (often complex) dependencies between systems and between systems and operations.
 Releases reduce the complexity of dependencies. While it is not completely possible to remove dependencies across releases, the release structure does help to avoid system capabilities and interfaces that span releases. This means that a set of interdependencies is delivered by the same key milestone.
 Releases oriented around operations naturally provide testable and deployable segmentation of functionality that ensures no gaps in function.

A comprehensive effort is organized into releases by:

 Identifying all the key activities and assessing the entire objective of the effort (for example, the 2017 Census Test).
 Identifying the operations and their need dates (e.g., Internet Self-Response [ISR] needed by March 20, 2017).
 Identifying the supporting systems (e.g., ISR requires RTNP, the Enterprise Censuses and Surveys Enabling Platform – Internet Self-Response [ECaSE-ISR], the Enterprise Censuses and Surveys Enabling Platform – Operational Control System [ECaSE-OCS], etc.).
 Repeating for all operations and supporting systems, grouping by similar need dates (e.g., ISR systems are included in a self-response release, grouped with paper data collection and Census Questionnaire Assistance [CQA]).

Section 4.3 provides a detailed description of each of the releases as implemented by the 2020 Census Program.

4 Applying the Integration and Implementation Plan

The 2020 Census Program used a highly regimented approach to establish a comprehensive set of releases and to further define and schedule program-level reviews. This section describes the steps taken and the resulting baselined set of releases and reviews.

4.1 2020 Census IIP Approach

There are several operational events on the path to the 2020 Census, including the Census Tests, ongoing geographic programs, and the 2020 Census itself. The 2020 Census Program used a slightly different approach for implementing an IIP framework for each of these three distinct types of events to properly account for the distinct timelines, complexity of development, and amount of testing required for each. Figure 3 provides a high-level pictorial of the Program-Level review timelines and visually displays the difference in the implementation of the IIP between the Census Tests and the 2020 Census. Figure 4 shows another view of these activities, focusing on the 2018 End-to-End Census Test and the 2020 Census to demonstrate overlaps between their releases.
[Figure 3: 2020 Census IIP Milestones. Pictorial timeline of the 2020 Census Systems Engineering and Integration (SE&I) integration and implementation milestones (SRR, CDR, TRR, PRR, ORR, and Conduct Operation) for the Census Test and 2020 Census releases across FY 2016 through FY 2020.]

[Figure 4: Overlap of 2018 E2E Census Test vs. 2020 Census Activities. Timeline showing the 2018 End-to-End Census Test releases (Recruiting Releases 1 and 2, Training Releases 1 and 2, and Releases A, C, D, and E) overlapping with 2020 Census Releases 1 through 4.]

The following subsections describe the 2020 Census Program's approach for implementing an IIP framework for the Census Tests, ongoing geographic programs, and the 2020 Census. These subsections describe how the 2020 Census Program creates a baseline plan for each of these distinct events. However, these releases and systems are continuously assessed and managed. The 2020 Census Program iteratively assesses readiness on an ongoing basis, as described in Sections 4.2 and 5.0. As part of these iterative readiness assessments, the 2020 Census Program has the latitude to adjust lifecycle processes when and where appropriate to mitigate risks to "Conduct Operation" dates and operations. Section 4.4 describes the management of changes to the IIP.

4.1.1 Approach: Census Tests

The 2020 Census Program used the following approach to determine the releases and major IIP review dates for the Census Tests leading up to the 2020 Census, beginning with the AdCan Test:

1. Assigned dates for each CBR/PBR, SRR, and CDR based on the proposed schedules for baselining scope and developing SE&I documentation for each test.
2. Created a superset of releases based on 2020 Census operations (see Section 4.3 for the list of releases).
3. Assigned the appropriate subset of releases to each Census Test (AdCan Test, 2017 Census Test, and 2018 End-to-End Census Test) based on the draft operational timelines and scope available for each one.
4. Assigned systems to each release based on the Census Test Solution Architecture, the Census Test BPMs, and subject matter expertise.
5. For each release in each Census Test, assigned dates for "Conduct Operation" based on key operational start dates (such as "Begin Internet Self-Response").
6. For each Conduct Operation date, designated an ORR approximately 10 days prior to the Conduct Operation.
7. Allocated specific timeframes for operational testing and integrated testing to determine the PRR and TRR dates, respectively (illustrated schematically in the sketch below).
   o The baseline time allocated for program-level operational readiness testing of the systems (between PRR and ORR) was four to six weeks. (In some cases, such as the 2017 Census Test release, this time was reduced from the original four-to-six-week plan in order to maximize project-level development timeframes; in those instances, Operational Readiness Testing [ORT] is conducted in parallel with Program-Level testing, starting at TRR rather than PRR.)
   o The baseline time allocated for program-level integration testing (between TRR and PRR) was four months, then tailored based on the complexity of the release and the project-level development and testing time required. The four-month program-level testing baseline was established primarily based on the amount of integration testing required for previous Census Tests.
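The backward scheduling described in steps 5 through 7 can be pictured as a simple calculation from each release's Conduct Operation date: an ORR roughly 10 business days earlier, a PRR four to six weeks before the ORR, and a TRR roughly four months before the PRR. The sketch below is purely illustrative and is not drawn from any Census Bureau tool or schedule; the function name, the default offsets, and the example date are assumptions used only to make the cadence concrete (actual review dates are baselined in the 2020 IMS).

```python
from datetime import date, timedelta

# Illustrative only: derive draft review dates for a release by working
# backward from its Conduct Operation date, using the nominal offsets
# described in Section 4.1.1 (names and default values are schematic).
def draft_review_dates(conduct_operation: date,
                       orr_lead_business_days: int = 10,
                       prr_to_orr_weeks: int = 6,        # baseline 4-6 weeks
                       trr_to_prr_months: int = 4) -> dict:
    def minus_business_days(d: date, n: int) -> date:
        while n > 0:
            d -= timedelta(days=1)
            if d.weekday() < 5:          # count Monday-Friday only
                n -= 1
        return d

    orr = minus_business_days(conduct_operation, orr_lead_business_days)
    prr = orr - timedelta(weeks=prr_to_orr_weeks)
    trr = prr - timedelta(weeks=4 * trr_to_prr_months)   # approximate months
    return {"TRR": trr, "PRR": prr, "ORR": orr,
            "Conduct Operation": conduct_operation}

# Example: a hypothetical self-response release whose operation begins
# March 20 (date chosen only because it matches the ISR example above).
print(draft_review_dates(date(2017, 3, 20)))
```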
4.1.2 IIP Approach: Ongoing Geographic Programs

Prior to establishing an IIP framework, several 2020 Census operations (such as In-Office Address Canvassing [IOAC]) were already in production as part of the ongoing geographic activities performed by the Census Bureau. As such, the 2020 Census Program implemented IIP checkpoints for these operations rather than full-scale Program Reviews:

 Assigned a Production Readiness Review (PRR) date for key operational milestones (such as performing IOAC in the Census Test location).
 Determined whether any major software upgrades to ongoing geographic programs are necessary to support an aspect of the 2020 Census. If Program-Level testing is required, assign TRRs, PRRs, and ORRs accordingly. (See Known Updates, Section 6.2.)

4.1.3 IIP Approach: 2020 Census

For the 2020 Census, the 2020 Census Program proposes that functionality be consolidated into as few releases as possible. The goal in limiting the number of 2020 releases is to maximize end-to-end integration and testing efforts while still allowing sufficient time for project-level development and still meeting the operational timelines of the 2020 Census. For most capabilities within a release, this also provides a schedule buffer between the time a release is scheduled to complete program-level testing and the start of the operation that the release capability supports. These factors help mitigate risk to 2020 Census production operations. However, the 2020 Census Program must weigh the risk mitigation this schedule buffer provides against the need to allow enough time to assess and address any changes resulting from 2017 Census Test and 2018 End-to-End Census Test lessons learned.

Accordingly, the 2020 Census Program used the following approach to determine draft releases and IIP review dates for the 2020 Census:

1. Created a superset of releases based on the 2020 Census.
2. Assigned dates for a limited number of consolidated reviews: CBR/PBR, SRR, CDR, TRR, and PRR.
3. For each release, assigned Conduct Operation dates based on key operational start dates.
4. Added follow-up SRRs and CDRs for operations that have not yet been fully defined.
5. Determined appropriate ORR dates based on the Conduct Operation dates and key operational readiness milestones.

4.2 2020 IIP Reviews

Decennial SE&I, in conjunction with DCMD, further defines, tailors, and schedules each program review. For each review, the 2020 Census Program designates:

 A specific owner, responsible for the full coordination and conduct of the review.
 The primary artifacts required for the review.
 Criteria for assessing the review.
 A voting board to determine whether the review is Approved, Pass with Exceptions, or Fail.

Table 1 depicts this information for each of the program reviews.
In addition to the Program Reviews, Decennial SE&I continues to manage and reference a "Conduct Operation" date rather than the ORR date. The Conduct Operation date designates the first time a system capability within a release needs to begin operational activities. Per the defined IIP framework, the ORR typically precedes this Conduct Operation date and provides the approval required to ensure comprehensive readiness to begin an operation (including the system Conduct Operation). Currently, several Conduct Operation dates are not preceded by an ORR because the Conduct Operation date represents the start of an operational activity that is strictly systems-based and does not include the other operational aspects associated with an ORR (e.g., people and processes). For example, the systems supporting Release C: Self-Response must be deployed to support creation of the workload long before the people and processes supporting Self-Response are needed. For this release in the Census Tests, the ORR does not precede the Conduct Operation date; currently, for these releases, Conduct Operation is approved through the PRR. (See Known Updates, Section 6.2.)

Table 1: Review/Milestone Details

Critical Business Proposal Review (CBR)
 Owner: Ops.
 Description: Review of the initial high-level scope to ensure inclusion of the appropriate systems to implement the desired subset of the 34 operations.
 Timing: One per Census Test and one for the 2020 Census; aligns with the Goals, Objectives, and Success Criteria (GOSC) and Test Plan baseline date; can be combined with the PBR.
 Primary Artifacts that Feed the Review: Operations: GOSC, Draft Test Plan, High-Level Timeline, and relevant Operational Plans. SE&I: Draft Systems List and Draft Review/Release Schedule.
 Approval Criteria: The operations Integrated Project Teams (IPTs) have finalized and communicated to stakeholders their goals and objectives for the Field Test. The "vision" for the test is clear.

Project Baseline Review (PBR)
 Owner: Ops.
 Description: Review of program baselines, including schedule, organizational structure, and risk.
 Timing: One per Census Test and one for the 2020 Census; can be combined with the CBR.
 Primary Artifacts that Feed the Review: Operations: Final GOSC, Final Test Plan, Schedule Baseline, Operations/SME POC List, Risk Baseline, and OMB Pre-submission. SE&I: Draft Systems List, Draft Review/Release Schedule, and DART Schedule.
 Approval Criteria: The vision for the field test is complete and has been communicated. Key operational dates and program baselines have been finalized. SE&I DART and the operations IPTs have the information they need to create BPMs and capability requirements.
Systems Requirements Review (SRR)
 Owner: SE&I/TI.
 Description: Joint review of Project-Level Business Requirements (PLBRs) / Capability Requirements (CAPs) by engineering and operations.
 Timing: One each for the 2017 Census Test and the Address Canvassing Test, two for the 2018 End-to-End Test, and multiple for the 2020 Census, starting May 2017; aligned with the approximate delivery of baselined BPMs and requirements from SE&I DART.
 Primary Artifacts that Feed the Review: Operations/SE&I DART: Final BPMs and Capability Requirements. SE&I: Draft Systems List and Draft LOB Solution Architecture.
 Approval Criteria: The full scope of the field test is clear. DART has completed its work with the Operations IPTs participating in the test and has created a final set of Business Process Models and capability requirements. The Architecture Team has the information it needs to finalize the allocation of capability requirements to systems/projects and create the required SE&I Architecture artifacts.

Critical Design Review (CDR)
 Owner: SE&I/TI.
 Description: Review of the high-level architecture and the detailed business processes that define the subset of operations, interfaces, etc. Includes allocation of requirements to systems and the defined releases.
 Timing: One per Census Test and multiple for the 2020 Census, starting June 2017; aligned with SE&I Architecture delivery of allocated capability requirements to solutions, the Solution Architecture Diagram, and the Interface List.
 Primary Artifacts that Feed the Review: SE&I: Allocated Capability Requirements, Final Systems List, LOB Solution Architecture Diagram, Interface Catalog, and Final Review/Release Schedule.
 Approval Criteria: The Architecture Team has allocated capability requirements to systems (projects) and finalized the Solution Architecture Diagram, Systems List, and Interface Catalog. The projects/systems (including CEDCaP) have the information they need to begin the eSDLC.

Test Readiness Review (TRR)
 Owner: SE&I/Program Test Team.
 Description: Review to ensure appropriate test objectives, methods, procedures, scope, and environments are ready. Assesses the readiness of systems to begin independent Program-Level testing and marks the beginning of Program-Level testing.
 Timing: Typically one per release. Occurs prior to the PRR, anywhere between six weeks and five months before it, depending on the complexity of the release.
 Primary Artifacts that Feed the Review: Projects: Requirements Traceability Matrix, Interface Documentation, Authorization Status, Test Analysis Report (including test cases, test results, and defects), and Release Notes. Program Test Team: Test Plan and Test Threads.
 Approval Criteria: Systems have completed all project-level development and testing necessary to support full independent integration testing. The Integrated Test Environment (ITE) is fully operational, and systems have delivered all capabilities to the ITE. All systems have received an Authority to Operate (ATO). Operations have provided input to the Program Test Plan and scenarios. Systems have provided the Program Test Team with project-level information to assist in its testing effort. The Program Test Team can begin integrated testing of the release. Explicit expectations of systems (the "TRR Checklist") can be found in the 2020 Census TEMP.
Production Readiness Review (PRR)
 Owner: SE&I/TI.
 Description: Review to assess test results and ensure systems are ready for operational testing.
 Timing: Typically one per release. Occurs prior to the ORR, anywhere between three weeks and four months before it, depending on the complexity of the release.
 Primary Artifacts that Feed the Review: Program Test Team: Test Analysis Report. Projects: Production Readiness Status (infrastructure, deployment, and initialization).
 Approval Criteria: The Program Test Team has completed testing of the release. Key stakeholders agree that any outstanding defects will not negatively impact the goals and objectives of the field test. The ORT environment is operational. Systems are ready for production and ready to support any ORT, and have been deployed to the environment in which ORT will take place.

Operational Readiness Review (ORR)
 Owner: Ops.
 Description: Review to validate that all components of an operation are ready before the operation is conducted. It is a final check that the needed resources (i.e., people, systems, processes, and facilities) have been acquired or developed. The ORR process ensures expectations are set early and met before the conduct of the operation. Note: ORR formality and rigor depend on the complexity of the operation.
 Timing: Aligned with key operational activities from the high-level milestone schedule. Typically occurs approximately 10 business days prior to Conduct Operation and/or key operational start dates.
 Primary Artifacts that Feed the Review: Operations: Operational Readiness Test (ORT) Results Report. SE&I/Projects: System Checkout Results, PRR Results, TRR Results, and daily/weekly status updates from operations.
 Approval Criteria: All of the people, processes and procedures, facilities, and technology are in place to support the operation. Systems have been deployed to the production environment.

Conduct Operation
 Owner: SE&I/TI.
 Description: The first time a capability within the release needs to support Operations and Maintenance (O&M).
 Timing: Aligned with key operational start dates from the high-level milestone schedule. Conduct Operation approval is received at the ORR or, in some exceptional cases, at the PRR.
 Primary Artifacts that Feed the Review: N/A; not a review.
 Approval Criteria: N/A; not a review.

4.3 2020 IIP Releases

Using the approach described in Section 4.1, the 2020 Census Program established the following program releases. Table 2: 2020 Census Tests Releases depicts the baselined set of releases and the Census Tests (beginning with the AdCan Test) in which each is included. Table 3: 2020 Releases and Census Tests shows the four 2020 Census releases, along with the Field Test releases to which they map. Note that the 2020 Census releases may be updated to accommodate the needs of operations that have not yet been fully defined or funded (e.g., Count Question Resolution), especially leading up to the final SRR and CDR for the 2020 Census. Section 4.4 describes how these types of changes will be managed.
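Tables 2 through 4 below define the releases, their mapping to the Census Tests and to the consolidated 2020 Census releases, and their scope. As a reading aid only, the sketch below shows one hypothetical way the Table 3 mapping of 2020 Census releases to Field Test releases could be represented and queried; the release identifiers come from Table 3, while the data structure and function are assumptions and are not part of the baselined plan or any Census Bureau system.

```python
# Illustrative only: the mapping of 2020 Census releases to the Census Test
# releases they consolidate, as listed in Table 3 (the structure is hypothetical).
RELEASE_MAP = {
    "2020 Release 1": ["Early RR1"],
    "2020 Release 2": ["RR1", "TR1", "A", "RR2", "PES Sample"],
    "2020 Release 3": ["TR2", "P", "B", "C", "D", "K", "L"],
    "2020 Release 4": ["E", "M", "N", "O"],
}

def releases_containing(field_test_release: str) -> list:
    """Return the 2020 release(s) that include a given Field Test release."""
    return [name for name, members in RELEASE_MAP.items()
            if field_test_release in members]

print(releases_containing("C"))   # e.g., ['2020 Release 3']
```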
Table 2: 2020 Census Tests Releases
(Each entry lists the release ID, release name, and release title; for each release, the baselined table marks with an "X" whether it is included in the 2017 Census Test, the 2018 End-to-End Census Test, and/or the Address Canvassing Test.)

 RR-1 – Recruiting Release 1 – In-Field AdCan Recruiting, PES IL/IHUFU Recruiting*
 RR-2 – Recruiting Release 2 – Field Enumeration Recruiting, PES PI/Person Follow-up (PFU)/FHUFU Recruiting*
 TR-1 – Training Release 1 – In-Field AdCan Training
 TR-2 – Training Release 2 – Field Enumeration Training
 A – Release A – In-Field Address Canvassing
 C – Release C – Self-Response (includes Print/Mailing/Workload & CQA/Self-Response/GQ Advanced Contact/CQA Training)
 D – Release D – Field Enumeration (includes UE*/UL/GQ/ETL*/NRFU operations)
 E – Release E – Tabulation/Dissemination/Archiving
 G – Release G* – Geographic Programs / Local Update of Census Addresses (LUCA)
 I – Release I – In-Office Address Canvassing
 P – Partnership Release* – Partnership Activities
 PES-0 – PES Sample Release* – Initial Sample for Post Enumeration Survey (PES)
 PES-1 – Release B* – Post Enumeration Survey - Independent Listing
 PES-2 – Release K* – Post Enumeration Survey - Initial Housing Unit Follow-up, Person Interview (PI)
 PES-3 – Release L* – Post Enumeration Survey - PI Matching (E-Sample ID, Computer Matching, Before Follow-up [BFU] Clerical Matching)
 PES-4 – Release M* – Post Enumeration Survey - Person Follow-up
 PES-5 – Release N* – Post Enumeration Survey - Final Housing Unit Follow-up (FHUFU)
 PES-6 – Release O* – Post Enumeration Survey - Reports and Release Findings

*2020-only; not included in the Census Tests, but maintained in the table for reference to the 2020 releases.

Table 3: 2020 Releases and Census Tests
(Each entry lists the ID, release name, and the Census Test releases to which it maps; see Table 4 for complete release descriptions.)

 1 – Release 1 – Early RR1
 2 – Release 2 – RR1, TR1, A, RR2, PES Sample
 3 – Release 3 – TR2, P, B, C, D, K, L
 4 – Release 4 – E, M, N, O

It is essential for systems and operations to understand the scope of each release to ensure systems are building and delivering the appropriate functionality to meet operational timelines. The table below provides a high-level description of each release. (The verbiage used in these high-level descriptions was derived from two primary sources: the 2020 Census Project-Level Business Requirements [PLBRs] and the 2020 Census Operational Plan.) Note that the RDMP describes the process and documentation used to define, communicate, and verify the detailed scope of each release.

Table 4: Scope of Releases

RR-1 – Recruiting Release 1: In-Field AdCan Recruiting, PES IL/IHUFU Recruiting
Scope: The scope of this release is the recruiting and selecting of applicants to support Address Canvassing and the PES Independent Listing (IL) / Initial Housing Unit Follow-up (IHUFU) operations. Within this release, applicants undergo a background check; upon successful completion of that background check, they are eligible to be hired. During this time, an account is created for the applicant but is not activated. This release also includes the hiring of Recruiting Assistants and Clerks. The deployment of IT infrastructure for the field offices is also currently included in this release. [See Known Updates, Section 6.2.]
RR-2 – Recruiting Release 2: Field Enumeration Recruiting, PES PI/PFU/FHUFU Recruiting
Scope: The scope of this release is the recruiting and selecting of applicants to support Field Enumeration and the PES PI/PFU/FHUFU operations. Within this release, applicants undergo a background check; upon successful completion of that background check, they are eligible to be hired. During this time, an account is created for the applicant but is not activated. This release also includes the hiring of Recruiting Assistants and Clerks. The deployment of IT infrastructure for the field offices is also currently included in this release. [See Known Updates, Section 6.2.]

TR-1 – Training Release 1: In-Field Address Canvassing Training
Scope: The scope of this release is the hiring and training (both online and in the classroom) of the Field Workers recruited during Recruiting Release 1. Hired applicants' accounts are activated during this time. Field Workers are issued a device to be used during fieldwork.

TR-2 – Training Release 2: Field Enumeration Training
Scope: The scope of this release is the hiring and training (both online and in the classroom) of the Field Workers recruited during Recruiting Release 2. Hired applicants' accounts are activated during this time. Field Workers are issued a device to be used during fieldwork.

A – Release A: In-Field Address Canvassing
Scope: The scope of this release is to conduct In-Field Address Canvassing in a continued effort (following In-Office AdCan) to update and verify the frame, which is needed to create the universe of households for Census responses. During this operation, listers verify the address, location, and classification of living quarters (housing unit [HU] or group quarters [GQ]).

C – Release C: Self-Response (includes Print/Mailing/Workload & CQA/Self-Response/GQ Advanced Contact/CQA Training)
Scope: The scope of this release includes all activities for self-response via the Internet, telephone, and paper channels. This release also includes workload creation, printing, and mailing of all self-response-related materials (Internet invitation letters, reminder letters or postcards, and questionnaire mail packages), as well as the training of CQA agents. This release also includes GQ Advanced Contact, which allows for the collection of information about a GQ prior to enumeration.
D – Release D: Field Enumeration (includes UE, U/L, GQE, SBE, and NRFU operations)
Scope: The scope of this release is the field components of the Update Enumerate (UE), Update/Leave (U/L), Group Quarters Enumeration (GQE), Service-Based Enumeration (SBE), and Non-Response Follow-up (NRFU) operations. UE serves as a combined Address Canvassing and Field Enumeration activity covering geographic areas identified as having unique challenges for self-response; the UE Field Worker verifies the address listing and attempts enumeration of the living quarter at the same time. UE consists of Production, Follow-up, Listing Quality Control (QC), and re-interview components. U/L includes updating of the address list and includes a self-response option; LiMA is used for the address update, and a choice questionnaire packet is left at each unit (no enumeration). The follow-up operation for non-responding households folds into the larger NRFU operation that covers Self-Response areas. GQE serves to enumerate the residents of a Group Quarter; this can be performed either in person with the enumerator or by scheduling a return visit to collect the GQ's response. SBE serves to enumerate people experiencing homelessness or utilizing transitional shelters, soup kitchens, regularly scheduled mobile food vans, and targeted non-sheltered outdoor locations. NRFU targets non-responding addresses and addresses from other operations requiring fieldwork; Administrative Records modeling and contact strategies are used to minimize fieldwork and reduce costs. The fieldwork includes enumerating households, performing multi-unit manager visits, and conducting re-interviews and field verification. For quality control, NRFU includes re-interviews for a sample of production cases as well as those determined to be potential falsifications.

E – Release E: Tabulation/Dissemination
Scope: The scope of this release is primarily to prepare and provide the results of the Census. This includes delivery of a prototype Apportionment Table, creation of State Redistricting Data Products and other data products, and dissemination or publication of results on the Census website. This release also includes archiving and destruction of records. Note that this release does not include dissemination of Post Enumeration Survey data.

G – Release G: Geographic Programs / Local Update of Census Addresses (LUCA)
Scope: This release includes all geographic partnership programs, namely the Boundary and Annexation Survey (BAS), Participant Statistical Areas Program / Tribal Statistical Areas Program (PSAP/TSAP), Boundary Validation Program (BVP), Public Use Microdata Areas (PUMAs), Redistricting Data Program (RDP), and LUCA. [See Known Updates, Section 6.2.]

I – Release I: In-Office Address Canvassing (IOAC)
Scope: The scope of this release is to conduct IOAC to update and verify the frame, which is needed to create the universe of households for Census responses. During this operation, IOAC workers verify the address, location, and classification of living quarters, e.g., housing unit (HU), Group Quarters (GQ), or Transitory Location (TL). IOAC consists of Interactive Review, Active Block Resolution (ABR), and Un-Geocoded Resolution. This release also includes GQ In-Office Review.

P – Partnership Release: Partnership Activities
Scope: All aspects of the Integrated Partnership Communications activities and the systems needed to support them, including digital advertising and social media targeting, the use of texting and e-mailing to motivate self-response, and an online portal that allows for posting and downloading materials as well as online fulfillment.

PES-0 – Post Enumeration Survey Sample Release: Initial Sample for PES
Scope: The scope of this release is to select the PES sample initial Basic Collection Units (BCUs) and identify surrounding BCUs.

B – Release B: PES Independent Listing
Scope: The scope of this release is PES Independent Listing (IL) Census Field Supervisor (CFS) training, PES IL Lister training, PES IL and QC, and IL post-processing.
K – Release K: Post Enumeration Survey - IHUFU/PI
Scope: The scope of this release includes creation of the PES Sample Final BCUs and all activities for PES IHUFU and PI. IHUFU includes training for CFSs, Listers, Computer Matching, and Clerical Matching, as well as the conduct of Initial Housing Unit Computer Matching, Initial Housing Unit BFU Clerical Matching, IHUFU and QC, IHUFU post-processing, and PES Initial Housing Unit After Follow-up (AFU) Clerical Matching. PI includes training for CFSs and Interviewers, identification of the PI sample, the conduct of PES PI and Re-interview (RI), post-processing, and Automated Geocoding of Alternate Addresses from the Person Interview. Currently, this release also includes all waves of PES Clerical Geocoding and Residence Status Coding.

L – Release L: Post Enumeration Survey - PI Matching (E-Sample ID, Computer Matching, BFU Clerical Matching)
Scope: The scope of this release is Person Interview (PI) matching activities, including the conduct of E-Sample Identification, PES Person Computer Matching, and PES Person Before Follow-up (BFU) Clerical Matching.

M – Release M: Post Enumeration Survey - Person Follow-up
Scope: The scope of this release is all aspects of PES PFU, including CFS and Interviewer training, PES PFU and QC, as well as post-processing and Person After Follow-up (AFU) Clerical Matching.

N – Release N: Post Enumeration Survey - Final Housing Unit Follow-up
Scope: The scope of this release is all aspects of PES FHUFU, including CFS and Interviewer training, PES FHUFU and QC, as well as post-processing and Person After Follow-up (AFU) Clerical Matching.

O – Release O: Post Enumeration Survey - Reports and Release Findings
Scope: The scope of this release is the production of PES Reports and Release Findings, including PES National Person Estimation, PES Domain Person Estimation, and all PES Housing Unit Reports and Release Findings.

1 – 2020 Release 1
Scope: Recruiting for all positions.

2 – 2020 Release 2
Scope: Selection, hiring, and training of RAs, PAs, OOS, and Clerks; AdCan selection of CFSs, Enumerators, and Listers; PES Sample Release (Initial Sample for PES); AdCan Training; In-Field Address Canvassing; Peak Operation Recruiting; Advertising and Earned Media; HU Count Review.

3 – 2020 Release 3
Scope: Peak Operation Training (includes UL/GQ/UE/NRFU); Post-Enumeration Survey IL Training; Post-Enumeration Survey Independent Listing; GQ Workload and Advanced Contact/CQA Training/Printing and Mailing Workload; Remote Alaska; Island Areas Censuses; Self-Response (includes Mailing/Self-Response/CQA/Coverage Improvement); Peak Operations (includes UL/UE/GQ/ETL/Early NRFU/NRFU); Post-Enumeration Survey Person Interview; Post-Enumeration Survey Initial Housing Unit Follow-up; Post-Enumeration Survey PI Matching (E-Sample ID, Computer Matching, BFU Clerical Matching).
Examples of possible changes may include:  The shift of functionality from one release to another due to an operational schedule change.  The splitting of one release into two releases to provide more systems development or testing time.  The addition of new releases if a new operation or activity is added to the scope of a Census Test, 2020 Census, or if any of the 2020 Census operations not included in a test requires additional releases. Integration and Implementation Plan Version 3.5 27 COMM-CB-18-0152-A-000059 Tracking Systems Against the IIP 5 Tracking Systems Against the IIP As described in this document, the IIP framework establishes key high-level milestones for the progress of delivering tested, comprehensive, and operational releases. It is critical that systems will be tracked and managed at a level lower than what is provided by these key milestones. This section of the document describes how the 2020 Census Program tracks progress against the IIP framework by defining interim or “star” milestones using the 2020 IMS, standard business rhythms, and using a defined release management process per the RDMP. 5.1 Star Milestones Star milestones are a set of interim integration milestones / dependencies between the major Program-Level reviews, primarily between CDR and TRR. These milestones serve several purposes:  Provide more refined points of integration that can be managed at the program level.  Serve as a common timeline for meeting dependencies between systems and operations as well as between systems.  Simple to incorporate into the IMS and use to track progress on a weekly/monthly basis in accordance with SE&I business rhythms. Table 5 lists milestones based on the following input:  eSDLC Phase Gate Review checklists: Extracted a common set of key project-level artifacts/activities for which a system would be dependent on other systems or organizations to complete (for example, S-8: System Architectures Complete).  Program Test Team’s TRR checklist (taken from 2020 Census TEMP) extracted any items required by the Test Team that may not be required by eSDLC (for example, S-23.5: Release Notes).  2020 IMS: Extracted explicit integration points between the Operations and Systems teams (for example, S-17: Specifications Complete). These star milestones are actively being used to assess a release’s progress toward TRR. They serve as the basis for the weekly SEIT segment status meetings as described in Section 5.2. Additionally, several of the key star milestones are actively being incorporated into the 2020 IMS as part of a broader system schedule integration effort. It should also be noted that the TI is currently working with Decennial SE&I to assess this list against their integration dependencies and work products to determine if additional interim milestones should be managed. Integration and Implementation Plan Version 3.5 28 COMM-CB-18-0152-A-000060 Tracking Systems Against the Integration and Implementation Plan Table 5: Star Milestonesprovides a list of the star milestones that the Decennial SE&I tracks and a description of the milestone, the implementer of the milestone (organization responsible for performing work to achieve the milestone), the milestone receiver (organization dependent upon completion of the milestone), and the key milestone’s approximate timing and predecessors. 
Table 5: Star Milestones

S-1 – Final Capability Requirements baselined. Description: This milestone represents SE&I's formal deliverable of baselined requirements for the Field Test. Implementer: Decennial SE&I (DART) / Ops. Receiver: Systems. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after SRR (usually about four weeks). Predecessor: SRR.

S-2 – System sign-off on Capability Requirements complete. Description: All systems must have signed off on their capability requirements by this date; this serves as SE&I's formal confirmation that the capability requirements have been accepted by the systems. Implementer: Non-CEDCaP Systems / CEDCaP PGMO and OII for CEDCaP systems. Receiver: Decennial SE&I. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after CDR (approximately four weeks); dependent on baselined requirements post-SRR. Predecessor: S-1.

S-3 – Project Critical Business Proposal Review (p-CBR), formerly the "Regurgitation Meeting," held. Description: For CEDCaP, this meeting takes place after OII has received the Decennial capability requirements, developed (or confirmed) corresponding Enterprise Requirements, and then provided the CEDCaP systems with the requirements they need to begin their work. OII also creates the Vision Document and Operational Flow Document. Implementer: CEDCaP – OII. Receiver: CEDCaP Systems. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after CDR (~4 weeks). Predecessor: CDR.

S-4 – Description: For non-CEDCaP projects, the corresponding review is held as part of the concept development phase. The Architecture Review Board (ARB) often chooses to hold this review for particular key systems. It provides a high-level review of the system architecture and verifies that the system has what it needs to move on to the Planning/Sprint 0 phase. Implementer: Non-CEDCaP SEIT Segment Leads. Receiver: Non-CEDCaP Systems. Predecessor: CDR.

S-5 – Initial system engagement with the Office of Information Security (OIS) complete. Description: All systems must have met with their Information System Security Officer (ISSO) to review planned changes for the test and should understand whether re-authorization activities are required by this date. Implementer: Non-CEDCaP Systems / CEDCaP PGMO and OII for CEDCaP systems. Receiver: Decennial SE&I / OIS. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after CDR (~6 weeks). Predecessor: CDR.

S-6 – System schedules complete. Description: All systems must have provided their system schedules to the appropriate scheduling team for integration by this milestone. Implementer: Non-CEDCaP Systems / CEDCaP PGMO and OII for CEDCaP systems, in conjunction with the Scheduling team. Receiver: Decennial SE&I / DCMD. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after CDR (~8 weeks); coincides with DBP approval. Predecessor: CDR.
S-7 – All Assessments of Alternatives (AoAs) Complete. Description: Any AoAs for systems, commercial-off-the-shelf (COTS) products, etc., should have been completed by this date. Implementer: CEDCaP / Decennial SE&I. Receiver: Systems and Program Test Team. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after CDR (~8 weeks). Predecessor: CDR.

S-8 – System Architectures Complete; Project Critical Design Review (p-CDR) or Detailed Design Review (DDR) held. Description: All systems should have completed their low-level system architecture by this date and held a p-CDR or DDR if required by the eSDLC. Implementer: Systems. Receiver: Security, ITD, and SE&I. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after CDR (~8 weeks). Predecessor: CDR.

S-9 – External Models baselined. Description: All external demand models applicable to the Field Test will be baselined via SE&I CR by this date. Implementer: Decennial SE&I – TI. Receiver: Systems, Program Test Team, Operations, and Budget. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after SRR (~6 weeks); an initial draft needs to be completed by SRR to provide best-estimate input to the internal models that feed the systems, which need time to develop optimization models. Predecessor: SRR.

S-10 – Internal Models baselined. Description: All internal demand models applicable to the Field Test will be baselined via SE&I CR by this date. Implementer: Decennial SE&I – TI. Receiver: Systems, Program Test Team, Operations, and Budget. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after SRR (~10 weeks); an initial draft needs to be completed by SRR to provide best-estimate input to the internal models that feed the systems, which need time to develop optimization models. Predecessor: S-9.

S-10.5 – Data Model baselined. Description: The data model for the Census Test / 2020 Census is baselined and reflects the complete dataset. Implementer: Decennial SE&I – TI. Receiver: Systems and Program Test Team. Approximate timing: Once per Census Test / 2020 Census, as soon as possible after CDR (~4 weeks). Predecessor: CDR.

S-11 – High Availability and Scalability Designs Complete. Description: This represents the point at which systems have developed high availability and scalability designs for the test. Implementer: Systems. Receiver: N/A. Approximate timing: Once per Census Test / 2020 Census, as soon after CDR as possible, prior to the system's first TRR. Predecessors: S-8, S-10.

S-12 – System Optimization Models complete. Description: All required system optimization models will be completed by this date. Implementer: Systems. Receiver: N/A. Approximate timing: Once per Census Test / 2020 Census, ~2 increments prior to a system's first TRR. Predecessor: S-10.

S-13 – COTS Upgrades complete. Description: If a system needs to upgrade any COTS products (Oracle, etc.) for the Field Test, the upgrade needs to be complete by this date. Implementer: Systems. Receiver: N/A. Approximate timing: Once per Census Test / 2020 Census, ~2 increments prior to a system's first TRR. Predecessor: S-11.

S-14 – Infrastructure Changes Done (cloud, etc.). Description: If a system requires any infrastructure changes for the test, the changes need to be implemented by this date. Implementer: Systems. Receiver: N/A. Approximate timing: Once per Census Test / 2020 Census, ~2 increments prior to a system's first TRR. Predecessors: S-8, S-11.

S-15 – Integrated Test Environment Ready. Description: EDITE / TI / the Computer Services Division (CSvD) need to have provided a test environment that allows the Program Test Team to test all test threads in the release and allows systems to deploy final code for the release. Implementer: EDITE / TI and Systems. Receiver: Systems and Program Test Team. Approximate timing: Once per Census Test / 2020 Census, prior to the first incremental delivery of software to the Program Test Team; no later than 10 business days prior to a system's first TRR. Predecessors: S-13, S-14.

S-16 – Release Workflow Diagram Complete. Description: This document represents the scope and vision for the business threads within a release. Implementer: Technical Integrator. Receiver: Program Test Team, CEDCaP Systems, and non-CEDCaP Systems. Approximate timing: As soon as possible after SRR; once per release, no later than two increments prior to the relevant TRR. Predecessors: S-3, S-4.

S-17 – Specifications Complete. Description: Represents the date by which the operations teams have provided CEDCaP and non-CEDCaP systems the specifications required for development of the release functionality. Note: individual specification need-by dates are managed at the project level and tracked by Decennial SE&I / CEDCaP. Implementer: Decennial Ops. Receiver: Systems. Approximate timing: As soon as possible after SRR; once per release, no later than two increments prior to the relevant TRR. Predecessor: S-2.

S-18 – Reports Mocked Up. Description: If a system implements reports in a release, these reports must be mocked up based on Operations reporting specifications by this date. Implementer: Decennial Ops and Systems. Receiver: Systems. Approximate timing: Once per release (as needed), ~2 months prior to each TRR. Predecessor: S-17.

S-19 – Interface Control Documents (ICDs) completed and signed. Description: All systems should have finalized ICDs for the relevant interfaces for the release by this date. Implementer: Systems. Receiver: Systems and Program Test Team. Approximate timing: Once per release, ideally two increments prior to TRR but no later than one increment prior to TRR. Predecessor: CDR.

S-20 – Interfaces peer-to-peer tested by project. Description: All systems should have completed peer-to-peer testing of the relevant interfaces for the release by this date. Implementer: Systems. Receiver: Program Test Team. Approximate timing: Once per release, ~1 increment prior to TRR. Predecessors: S-10.5, S-19.

S-21 – All ATOs received. Description: Any systems requiring authorization or re-authorization for the release should have received an ATO by this date. Note: this timing is desired by Decennial SE&I; it is not an OIS requirement to obtain an ATO prior to the start of program-level testing. Implementer: Systems / OIS. Receiver: OIS. Approximate timing: Once per release, ~10 business days prior to the relevant TRR. Predecessor: S-5.

S-22 – Change Request (CR) Cutoff. Description: Represents the point at which any change impacting a system's development may seriously impact a project's TRR readiness and its ability to meet operational dates and needs, training materials, and translation. Implementer: Decennial Ops. Receiver: Systems and Program Test Team. Approximate timing: Once per release, roughly halfway between CDR and the relevant TRR. Predecessors: CDR, S-17.

S-23 – User Interface (UI) Freeze. Description: Any systems with user interfaces being deployed for the release should have finalized the UI by this date; any changes after this date may seriously impact a project's TRR readiness. Implementer: Decennial Ops and Systems (Primus, ECaSE, etc.). Receiver: Systems. Approximate timing: Once per release; ideally coincides with the start of training material preparation, ~2 increments prior to TRR. Predecessors: CDR, S-17.

S-23.5 – Release Notes complete. Description: Represents the last day to provide release notes for any systems deploying code in the release; the TI Release Management Team will consolidate and summarize systems release notes. Implementer: TI / Systems. Receiver: Program Test Team. Approximate timing: Once per release, no later than five days prior to TRR. Predecessors: CDR, S-20, S-22, S-23.

S-24 – Final Code Deployed to Test Environment. Description: Represents the last day to deploy code to the target test environment in preparation for Program-Level testing for the release. Implementer: Systems (current state); CM/Program Test Team (future state). Receiver: Program Test Team. Approximate timing: Once per release, no later than five days prior to TRR; dependent upon the Integrated Test Environment being ready. Predecessor: S-15.
CDR, S-20, S-22, S-23 Systems (current state), CM/Program Test Team (future state) Program Test Team Once per release, no later than 5 days prior to TRR. Dependent upon ITE Ready. S-15

5.2 Business Rhythms and the IMS

While the IIP framework establishes releases and initial review dates, it is important to note that these dates are then incorporated into the 2020 IMS. It is within the IMS that the review dates for each Census Test, the 2020 Census, and the releases therein are baselined and statused. The IMS creates linkages from activities performed by each system in a release to the appropriate TRR. As such, the IMS can be used to track the status of systems for each Census Test and each release.

In addition to the IMS status, Decennial SE&I establishes business rhythms that provide an additional mechanism for tracking systems against the IIP. Table 6: Established Business Rhythms depicts these established business rhythms and where they occur within the IIP framework. Decennial SE&I hosts weekly meetings with each non-CEDCaP system (organized by segment) and uses the star milestones to track each system's progress toward completion of a release and toward the TRR, PRR, and conduct operation. Decennial SE&I also meets regularly with CEDCaP and uses the TRR dates as the common readiness milestone when assessing the status of each CEDCaP system. Additionally, Decennial SE&I hosts a weekly Systems Integration meeting that is used as a forum for discussing cross-systems integration issues.

Table 6: Established Business Rhythms
SE&I Meeting Title | Day / Frequency | Timing in IIP Framework | Description
SEIT Segment Status | Weekly | Between CDR and ORR | Serves as the primary forum for Segment/System status, metrics, schedule, issues, and risks for non-CEDCaP systems.
CEDCaP SEIT | Twice weekly | Between CDR and ORR | Serves as the primary forum for Segment/System status, metrics, schedule, issues, and risks for CEDCaP systems specific to Decennial.
Weekly Release Meetings | Weekly (Wednesday) | Between CDR and TRR | Focused cross-discipline forum to review comprehensive status of upcoming releases.
Daily Program Test Status | Daily, 8 a.m. | Between TRR and PRR | Provides status on Program-Level integration testing progress to SE&I, CEDCaP, DCMD Operations, and systems during Program-Level testing.
Daily Systems Production Status | Daily, 8 a.m. | Post-ORR | Forum for reporting systems production status and issues.
Technical Review Board | As needed | Throughout | Reviews impacts of pending Change Requests across multiple systems, organizations, and stakeholders.

5.3 Release Management

As described in this document, the TRR, PRR, and ORR serve as established Program-Level milestones for the progress of delivering tested, comprehensive, operational releases. The star milestones, established business rhythms, and IMS all serve to track the progress of individual projects and teams toward this common set of deliveries. However, it is also important to note that each release must be explicitly defined, managed, and deployed to ensure its success. The IIP provides a basis for this management, but it does not include the process for managing each individual release and its deployment.
The 2020 Census RDMP provides the process for defining the detailed functionality within each release, tracking the integration of that functionality, managing the deployment process, and tracking releases in production.

6 Lessons Learned and Known Updates Required

As previously described, the 2020 IIP is a living document that actively drives the collaborative process within Decennial. The IIP will be refined or adjusted to minimize risk and maximize efficiency while still meeting the required timelines of the Census Tests and the 2020 Census. This section of the document serves as a mechanism both for assessing the IIP's effectiveness to date and for providing a list of known updates required.

6.1 Lessons Learned

The IIP Framework was first officially established and used for the AdCan Test and was also used successfully to plan and manage the 2017 Census Test. It is now actively being used to plan and manage the 2018 End-to-End Census Test. This section of the document describes a general assessment of the impact the IIP has had on the 2020 Census Program, lessons learned, and known outstanding items that need to be addressed within some aspect of the IIP.

6.1.1 General Benefits of the IIP

The IIP has had a significant impact on the 2020 Census Program within the limited time it has been established and used within Decennial:

• It drives coordination and resolution of integration issues much earlier in the lifecycle than in the past, helping to ensure all systems come together in time for both testing and deployment.
• It creates a common "language" that allows all systems (both enterprise and Decennial-specific solutions), operations stakeholders, Decennial and executive leadership, and external entities to communicate status and progress toward milestones with a shared meaning. This dramatically simplifies status tracking and reporting.
• It establishes a common development and release-to-program-level-testing schedule for groups of systems. This allows for integrated testing of all systems that support operations occurring during a particular time period or operations that naturally relate to one another. It also significantly increases the time allotted for testing and acknowledges that the recruiting and training efforts that precede actual production operations have deliverables of their own during the software development cycle.

6.1.2 Lessons Learned

Although the IIP has only been established and used within Decennial for a limited time, Table 7 lists several lessons learned associated with the IIP as documented by the 2020 Program.

Table 7: IIP Lessons Learned

ID 1. Lessons learned: The Program needs to clearly define the scope of each release and verify systems understand the program-level testing scope and the TRR expectations. Recommended actions: 1. Document a clear, concise description of each release in the IIP framework. 2. Map capability requirements to their releases and map solution-level requirements and user stories to capability requirements. 3. Clearly communicate TRR entry / exit criteria and verify all systems understand.

ID 2. Lessons learned: The IIP dates should not be managed in multiple artifacts. The dates should be baselined and then statused only through the 2020 IMS.
Item 1 is complete and included in this version of the IIP. Item 3 is complete. The Program Test Team now has a checklist of TRR inputs required of the systems included in the TEMP and clearly assessed at TRRs. Item 2 is within scope for the Technical Integrator to complete, and is complete for the 2018 E2E Test. These lessons learned and recommended actions are documented as part of the AdCan Test Lessons Learned.  This action was completed upon full implementation of Program Level Change Request P-0301 and incorporated in version 4.4 of this document. 6.2 Known Updates Required While the IIP is now a well-established framework actively in use by the 2020 Census Program, several updates are still pending to the release / review structure. Table 8 provides a list of these known items at present, along with the expected dates for resolution. It should be noted that each of these changes will be processed through the appropriate change control process(es), depending on the organization and configuration item(s) affected by the change. Integration and Implementation Plan Version 3.5 40 COMM-CB-18-0152-A-000072 Lessons Learned and Known Updates Required Table 8: Known IIP Updates Required ID 1 Description of Key Outstanding Items Expected Resolution Date Finalize 2020 milestones, including additional CDR1 / SRR1, resolution of operations that occur significantly later in the 2020 timeline. 9/1/2017 status update: Complete with the v0.1 of the 2020 release schedule. All 2020 releases and SRRs/CDRs have been documented. Create draft allocation of 2020 systems to baseline set of releases. ~ 7/1/2017 (after 2020 SRR/CDR) CLOSED 3 Review LUCA systems enhancements required to support 2020 to determine PRR checkpoints and/or program-level testing and associated TRR/PRR/ORR reviews. ~11/10/2017 (after 2020 SRR3/CDR3) 4 Determine whether a release needs to be added to the 2020 IIP to account for the IT Infrastructure / systems required to support the Regional Census Centers (RCCs)/Area Census Offices (ACOs). 9/1/2017 status update: Complete with the v0.1 of the 2020 release schedule. All 2020 releases have been documented. The “Go Live” terminology has explicit meaning to systems and stakeholders at the Census Bureau that does not precisely align with the IIP’s use of the term. Decennial is considering a recommendation to eliminate the “Go Live” term and instead use “Conduct Operation.” The term “Conduct Operation” will be used for all operations and will be defined as follows: “Systems are approved to move to their final production environment and have been approved by SE&I and the appropriate operations for use.” 5/23/2017 status update: Complete with v4.4 of this document. In the current IMS, there are several “Conduct Operation” dates that precede an ORR. DCMD and Decennial SE&I are developing a plan / schedule to eliminate/reduce these occurrences. 9/1/2017 status update: Complete with IIP Spreadsheet v0.55_1. The only remaining instance where the Conduct Operation date precedes the ORR is the 2018 End-to-End Census Test Training Release 2. 
~7/1/2017 (after 2020 SRR1/CDR1) CLOSED 2 5 6 Integration and Implementation Plan Version 3.5 ~11/1/2017 (after 2020 SRR3/CDR3) ~ 2/12/2017 CLOSED ~7/1/2017 (after 2020 SRR1/CDR1) – CLOSED 41 COMM-CB-18-0152-A-000073 Appendix A: Acronyms List Appendix A: Acronyms List Acronym ABR ACO AdCan ADSD AFU AoA ATO BAS BCU BFU BPM CBPR CDR CEA CEAF CEDCaP CEDSCI CFS CM COTS CQA CSvD CTO DART DCMD DDR DITD ECaSE-ISR ECaSE-OCS EDITE eSDLC ETSB FHUFU GAO GOSC GQ Definition Active Block Resolution Area Census Office Address Canvassing Applications Development Services Division After Follow-Up Analysis of Alternatives Authority to Operate Boundary and Annexation Survey Basic Collection Unit Before Follow-Up Business Process Model Critical Business Proposal Review Critical Design Review Census Enterprise Architecture Census Enterprise Architecture Framework Census Enterprise Data Collection and Processing Center for Enterprise Dissemination Services and Consumer Innovation Census Field Supervisor Configuration Management Commercial Off the Shelf Census Questionnaire Assistance Computer Services Division Chief Technology Officer Decennial Architecture Requirements Team Decennial Census Management Division Detailed Design Review Decennial Information Technology Division Enterprise Censuses and Surveys Enabling Platform – Internet SelfResponse Enterprise Censuses and Surveys Enabling Platform – Operational Control System Enterprise Development, Integration, and Test Environment Enterprise System Development Life Cycle Enterprise Testing Services Branch Final Housing Unit Follow-Up Government Accountability Office Goals, Objectives, and Success Criteria Group Quarters Integration and Implementation Plan Version 3.5 - A-1 COMM-CB-18-0152-A-000074 Appendix A: Acronyms List Acronym HU ICD IG IHUFU IIP IL IMS IOAC IPTs ISSO ISR IT ITE LUCA NRFU O&M OCS OII OIS ORR PBR PES PFU PI PLBRs PRR PSAP QC RCC RI SE&I SoS SRR TI TSAP TRR UE Definition Housing Unit Interface Control Document Inspector General Initial Housing Unit Follow-up Integration and Implementation Plan Independent Listing Integrated Master Schedule In-Office Address Canvassing Integrated Project Teams Information Systems Security Officer Internet Self-Response Information Technology Integrated Test Environment Local Update of Census Addresses Non-response Follow-Up Operations and Maintenance Operational Control System Office of Innovation and Implementation Office of Information Security Operational Readiness Review Project Baseline Review Post Enumeration Survey Person Follow-Up Person Interview Project Level Business Requirements Production Readiness Review Participant Statistical Areas Program Quality Control Regional Census Center Re-interview System Engineering and Integration System of Systems Systems Requirements Review Technical Integrator Tribal Statistical Areas Program Test Readiness Review Update Enumerate A \11 ICIntegration and Implementation Plan PVERSIGHT Version 3.5 A-2 COMM-CB-18-0152-A-000075 Appendix B: References Appendix B: References Document Title Author 2020 Census Test and Evaluation Management Plan Trong Bui 11/30/2016 Census Solution Architecture T-Rex Corp. 06/07/2017 Census Business Process Models Various Decennial Interface Catalog Current Baselined Version Current Baselined Version 08/15/2017 Systems List Technical Integrator (TI) 2020 Census Release and Deployment Management Plan Integration and Implementation Plan Version 3.5 VERSIGHT T-Rex Corp. 
- Location Date https://collab.ecm.census.gov/teamsites/dceo/2020TIGPMO/2 020_TI/TI%20Site/PMO%20Functions/PalDocs/ITP%20002% 20Test%20and%20Evaluation%20Management%20Plan.pdf https://collab.ecm.census.gov/div/20rpo/R2_sei/CM/SEI%20C M%20Repository/Forms/AllItems.aspx (Solutions Architecture Folder within 2020 Census) https://collab.ecm.census.gov/div/20rpo/R2_sei/CM/SEI%20C M%20Repository/Forms/AllItems.aspx (Business Process Models folder) https://collab.ecm.census.gov/div/20rpo/R2_sei/CM/SEI%20C M%20Repository/Decennial%20Interface%20Catalog%20-%20Link.aspx https://collab.ecm.census.gov/div/20rpo/R2_sei/CM/SEI%20C M%20Repository/2020%20Decennial%20System%20List.pdf DCN: TD006-015-004 B-1 COMM-CB-18-0152-A-000076 nl y O se nm en tU Technical Integrator (TI) Performance Test Strategy G ov er Version 1.0 DCN: TD007-006-001 Se ns iti ve -F or O ffi ci al Due Date: December 30, 2016 Delivered Date: December 30, 2016 Work Order # YA1323-15-BU-0033/003 Technical Directive # 007 AMERICAN pVERSIGHT Prepared By: T-Rex Corporation COMM-CB-18-0152-A-000077 VERSION HISTORY Author Trong Bui Trong Bui Description Initial Draft Initial Formal Submittal nl y Date 12/07/2016 12/30/2016 Se ns iti ve -F or O ffi ci al G ov er nm en tU se O Version 0.1 1.0 Performance Test Strategy Version 1.0 VERSIGHT ii COMM-CB-18-0152-A-000078 Table of Contents Table of Contents Introduction ............................................................................................................. 1 1.1 Document Purpose .......................................................................................... 1 1.2 Decennial 2020 Integration and Test Framework .......................................... 1 1.3 Document Scope .............................................................................................. 2 1.4 Intended Audience ........................................................................................... 2 2 Technical Approach ............................................................................................... 4 2.1 Performance Test Release Integration ........................................................... 7 2.2 Performance Testing Process......................................................................... 8 2.2.1 Phase 0 ....................................................................................................... 9 en tU se O nl y 1 nm 2.2.1.1 Objectives .......................................................................................................... 9 2.2.1.2 Entry Criteria ................................................................................................... 10 2.2.1.3 Activities .......................................................................................................... 10 2.2.1.4 Exit Criteria ...................................................................................................... 11 2.2.2 Phase 1 ..................................................................................................... 11 G ov er 2.2.2.1 Objectives ........................................................................................................ 11 2.2.2.2 Entry Criteria ................................................................................................... 11 2.2.2.3 Activities .......................................................................................................... 11 2.2.2.4 Exit Criteria ...................................................................................................... 
12 2.2.3 Phase 2 ..................................................................................................... 12 O ffi ci al 2.2.3.1 Objectives ........................................................................................................ 12 2.2.3.2 Entry criteria .................................................................................................... 12 2.2.3.3 Activities .......................................................................................................... 13 2.2.3.4 Exit Criteria ...................................................................................................... 13 2.2.4 Phase 3 ..................................................................................................... 13 ve -F or 2.2.4.1 Objectives ........................................................................................................ 13 2.2.4.2 Entry Criteria ................................................................................................... 14 2.2.4.3 Activities .......................................................................................................... 14 2.2.4.4 Exit Criteria ...................................................................................................... 14 2.2.5 Phase 4 ..................................................................................................... 15 Se ns iti 2.2.5.1 Objectives ........................................................................................................ 15 2.2.5.2 Entry Criteria ................................................................................................... 15 2.2.5.3 Activities .......................................................................................................... 15 2.2.5.4 Exit Criteria ...................................................................................................... 15 2.3 Performance Testing Types .......................................................................... 16 2.4 Performance Testing and Monitoring Tools ................................................ 20 3 Performance Test Planning ................................................................................. 22 3.1 Establish Performance Test Objectives ....................................................... 22 3.2 Establish NFORs for Performance Testing .................................................. 23 4 Performance Test Environment........................................................................... 24 4.1 Environment Sizing ........................................................................................ 24 Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT iii COMM-CB-18-0152-A-000079 Table of Contents nm en tU se O nl y 4.2 Test Environment Management .................................................................... 24 5 Test Data Strategy ................................................................................................ 25 5.1 Types of Test Data ......................................................................................... 25 5.2 Test Data Management (TDM) ....................................................................... 26 5.3 Test Data Provisioning .................................................................................. 26 6 Performance Testing Reporting Procedures ..................................................... 
27 6.1 Test Results .................................................................................................... 27 6.2 Engineering Level Reporting ........................................................................ 27 7 Key Roles .............................................................................................................. 29 8 Assumptions and Constraints ............................................................................. 30 8.1 Assumptions .................................................................................................. 30 8.2 Constraints ..................................................................................................... 30 9 Dependencies ....................................................................................................... 31 10 Risk and Mitigation ............................................................................................... 32 Appendix A: Acronyms List....................................................................................... A-1 er Table of Figures al G ov Figure 1: Decennial 2020 Integration and Test Framework ....................................... 2 Figure 2: Performance Test Strategy Approach ........................................................ 5 Figure 3: Performance Test Release Integration ........................................................ 7 Figure 4: Performance Testing Process...................................................................... 9 ci Table of Tables Se ns iti ve -F or O ffi Table 1: List of Systems................................................................................................. 6 Table 2: Types of Performance Tests ......................................................................... 17 Table 3: Performance Testing and Monitoring Tools ................................................ 20 Table 4: Performance Test Objectives Activities ....................................................... 22 Table 5: Key Roles ........................................................................................................ 29 Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT iv COMM-CB-18-0152-A-000080 Introduction 1 Introduction en tU se O nl y The 2020 Census Program seeks to use Performance Testing to ensure the quality of Decennial Operations throughout the design, implementation, and operation phases. Performance testing represents an underlying, parallel thread of specialized testing during the Enterprise System Development Life Cycle (ESDLC) Integration and Testing phase. There are many categories of performance testing which may apply to a given Project, but in general, the results will help the 2020 Census Program realize the following benefits: 1.1 Document Purpose ov er nm The purpose of this Performance Test Strategy is to provide an outline and a technical approach for the 2020 Census Program Performance Testing. It describes the methods and tools used by performance test engineers to validate and tune the performance of the system. The Performance Test Strategy provides the decisions and testing activities that are conducted to ensure comprehensive performance testing. G 1.2 Decennial 2020 Integration and Test Framework Se ns iti ve -F or O ffi ci al The Decennial 2020 Integration and Test Framework shown in Figure 1 describes the Test and Evaluation Management Plan (TEMP) in details and depicts the major levels of testing for the 2020 Census Program. 
The framework includes the types of testing conducted at each level. Also highlighted in the diagram are the relevant performance test activities that are covered by the Performance Test Strategy.

Figure 1: Decennial 2020 Integration and Test Framework

1.3 Document Scope

The scope of this document focuses on a strategic approach leading up to the 2020 Census Program. Performance testing can be conducted in parallel with other test phases if relevant information is rolled into test planning and reporting as required.

1.4 Intended Audience

The Performance Test Strategy is targeted at the 2020 Census Program and other Programs supporting the projects for the 2020 mission, such as Census Enterprise Data Collection and Processing (CEDCaP). The following is a list of the suggested audiences that may be interested in how the 2020 Census Program will accomplish Performance Testing:

• Program Managers and Program Technical Staff (e.g., the Chief Program Engineer and the Chief Program Architect)
• Project Leads and Team Members (e.g., Technical Lead and Systems Developers)
• Information Technology (IT) Directorate Divisions and Offices
• United States Census Bureau (USCB) Governance Bodies (e.g., Architecture Review Board (ARB) and Standards Working Group (SWG))
• Relevant USCB IT process and standards owners (e.g., Enterprise System Development Life Cycle (ESDLC), Center of Excellence (CoE)), and the Office of Information Security (OIS)
• External oversight (Government Accountability Office (GAO), National Academy of Sciences (NAS), and Congress)

The 2020 Census Program will leverage this Performance Test Strategy to establish and execute repeatable processes for performance testing, leading to the successful deployment of scalable systems that meet the operational needs of all stakeholders.

2 Technical Approach

Performance testing is performed at the project and program levels to ensure performance variables are within the acceptable levels defined in the Non-Functional Operational Requirements (NFORs) for the system/program. During the execution of any performance test, automation tools will be used to the maximum extent possible to reduce the workload of manual testing, reduce defects introduced through manual configuration, and ensure repeatability. Performance testing cannot rely solely on automation tools, however, because in some instances the goal is to identify what the end user is experiencing.

Overall, the Program Level Test Team will leverage a phased approach to performance testing. The phased approach will begin at the assessment and information-gathering level and gradually increase in scope and complexity through five phases. Each phase will demonstrate an increased level of maturity and stability of the 2020 Census System of Systems (SoS). Figure 2 illustrates the high-level concept of the 2020 Census Program performance testing life cycle. For a more detailed description of the tests included in each phase, please see Table 2 in Section 2.3.
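To make the NFOR-driven pass/fail idea concrete, the sketch below shows, in simplified form, how a 95th-percentile response-time target might be checked against measured results. It is illustrative only and is not the Program's tooling (program-level testing relies on the tools listed in Table 3); the transaction name, the target value, and the simulated submit_response stub are hypothetical placeholders.

import statistics, time, random
from concurrent.futures import ThreadPoolExecutor

# Hypothetical NFOR target: 95th-percentile latency for one transaction type.
NFOR_TARGETS = {"internet_self_response_submit": {"p95_seconds": 2.0}}

def submit_response() -> float:
    """Stand-in for one timed transaction against a system under test."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.05, 0.4))   # simulated work; a real test would call the system here
    return time.perf_counter() - start

def run_load(transaction, users: int, iterations: int):
    """Drive the transaction with a fixed number of concurrent virtual users."""
    latencies = []
    with ThreadPoolExecutor(max_workers=users) as pool:
        for elapsed in pool.map(lambda _: transaction(), range(users * iterations)):
            latencies.append(elapsed)
    return latencies

def p95(samples):
    return statistics.quantiles(samples, n=20)[18]   # 95th-percentile cut point

if __name__ == "__main__":
    samples = run_load(submit_response, users=25, iterations=4)
    observed = p95(samples)
    target = NFOR_TARGETS["internet_self_response_submit"]["p95_seconds"]
    print(f"p95={observed:.3f}s target={target:.3f}s "
          f"{'PASS' if observed <= target else 'FAIL'}")

In practice the same comparison would be applied to each NFOR metric (throughput, error rate, etc.) and to each transaction defined in the demand models.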
Figure 2: Performance Test Strategy Approach (the phases progress from information gathering, discovery, analysis, and assessment; to baseline and isolated performance tests; to business thread end-to-end performance and scalability tests; to full end-to-end, scalability, soak, and failover tests)

Phase 0

During this first phase, the Program Level Test Team will work closely with the TI Assessment Team and Project Teams to evaluate the 2020 Census SoS with a focus on performance and scalability. The assessment efforts will dive deep into the solution architecture, resulting in a thorough understanding of the maturity level and performance characteristics of each system.

In addition, the Program Level Test Team will review the NFORs and Demand Models for each system to determine the performance targets and test data needs. This effort will also play a key role in sizing a performance test execution environment (automated test tools, load generators, controllers, Application Performance Monitoring (APM) tools, etc.) that can support the projected volumes. Finally, all key points of contact supporting testing efforts will also be identified (e.g., Test Manager, Test Lead, Environment Management Team, Application Development Team, System Administrators, etc.).

Phases 1, 2, 3, and 4

Based on the analysis and requirements gathered in Phase 0, the Program Level Test Team will work with all required stakeholders to design the Performance Test Plan in preparation for test execution. The first wave of performance testing will be conducted to baseline the performance results and quickly identify any bottlenecks that need to be addressed. Each additional phase will focus on a set of performance attributes (e.g., scalability, integration, stress, resilience, failover), tested in a properly sized environment. While each phase has a different focus, they all share a common iterative testing approach. Sections 2.2.1 through 2.2.5 provide a more detailed breakdown of the five phases.

The Program Level Test Team will ensure all systems that support the Decennial program can meet the performance targets set forth by the NFORs. Certain systems may fall into an exception category, such as systems of service, where the performance testing approach still needs to be determined. As the approach for these systems is identified, the performance test strategy will be updated. The systems in Table 1 below have been divided into three categories based on their performance testing scope.

Table 1: List of Systems

All systems listed as Category 1 have been identified as systems that play an integral part in the 2020 Census and will be involved in the full suite of performance testing efforts. Category 2 systems require further analysis to determine the appropriate performance testing scope and approach. The last category consists of systems that have been determined to be out of scope, though a validation of their performance metrics is still required. It is important to note that for each system listed below, a full assessment needs to occur during Phase 0 to refine and customize the testing scope and approach.
Category 1: These systems are In-Scope based on this Performance Test Strategy
2020 Website DSC CQA BARCA DSSD CM Clerical Match and Map Update CaRDS ECaSE - ALM CM Imputation and Estimation CEM ECaSE - Enum CM PCS CENDOCS ECaSE - FLD - OCS MOJO - Optimizer/Modeling Centurion (SWIM) ECaSE - ISR NPC Printing CHEC Fraud Detection System RTNP CHRIS ECaSE - OCS SOA CIRA ECaSE - ISR (GQ) SMaRCS Count Review System GEO Imagery Sunflower C-SHaRPS IDMS UTS DAD ILMS CAES DAPPS IPTS LMS DMP MaCS MAF/TIGER DMS MCM iCADE DRPS MCS

Category 2: These systems require further research to determine the Performance Testing Approach
CBS CRM Tabulation CEDSCI ENS GUPS CFS Hotline NFC

Category 3: These systems are Out-of-Scope in terms of Performance Testing
Commercial Printing OneForm Designer Plus POP Desktop Services

2.1 Performance Test Release Integration

Figure 3 depicts the Performance Test Release Integration schedule.

Figure 3: Performance Test Release Integration (elements include the artifact repository, the test environment with tools such as SonarQube and HP Performance Center, and the 2018 Census Testing and 2020 Census Testing timelines)

The Release schedule defined in the Integration and Implementation Plan (IIP) provides the systems that support each release and the scheduled gateway reviews for the release as defined in the Test Framework. Once the capabilities developed within the release are identified, they will be evaluated as part of the Phase 0 approach to determine whether activities in Phase 1 or Phase 2, such as isolated performance measurements, can be conducted in the Project Test environments, which will provide a baseline measurement for that capability. The program level test team will work with the project teams to identify which performance measurements can be conducted in the project test environments for the capabilities available in each release, depending on the limitations of each project team's environment.

Once project level testing is completed, the capabilities are delivered to the Integrated Test Environment (ITE) based on the Test Readiness Review (TRR) date in the IIP. The Program Level Test Team will include those capabilities in the integrated performance measurements described in Phase 2 and Phase 3 to evaluate the baseline measurement along with other impacted components.

The Program Level Test Team will conduct the Phase 4 performance measurements in the Staging environment after the Production Readiness Review (PRR) is conducted as scheduled in the IIP. These performance measurements will be based on the delivery of changes identified in the IIP schedule for system capabilities that were initially released and tested in a prior release. They will be scheduled for performance regression testing and/or integration with new components delivered, as appropriate.

The Performance Test Strategy is designed to ensure that all systems meet or exceed the performance load targets for 2020, which are calculated based on the External and Internal Demand Models. In preparation for the Census 2018 test, the systems will be tested against 2018 volumes (derived from the 2020 Demand Models).
In addition, the systems will be scaled up to 2020 load targets to identify any scalability issues from the onset.

2.2 Performance Testing Process

Performance Testing is an iterative process of assessing the current performance of the targeted system(s) against the Service Level Agreements (SLAs). Figure 4 illustrates the testing process and phase alignment.

Figure 4: Performance Testing Process (Phase 0: performance assessment of systems, NFORs review, demand modeling review, NFORs and demand modeling signoff; Phases 1-4: performance test plan, environment and data setup, script development, test execution, results analysis, tuning and fixes, ongoing reports, final report and wrap-up meeting)

2.2.1 Phase 0

2.2.1.1 Objectives

The primary objective during Phase 0 is to analyze each system with a focus on performance and scalability. To accomplish this goal, each system's architecture and previous test artifacts are carefully reviewed. A performance assessment will be conducted based on the solution architecture and the projected demand on the system. This step is critical for understanding the load profile and potential problem areas. Furthermore, the Program Level Test Team will be able to prioritize the testing of each system based on the maturity level assessment. The maturity level assessment will be conducted by the TI Assessment Team and the TI Program Level Test Team by
Below is the list of tasks that needs to be performed during the project assessments: 2.2.1.2 Entry Criteria al G ov er Completed Performance Goals Completed Demand Models Available NFORs Comprehension of Performance Test Data needs Available Architecture and Design Documents Projects under assessment Completed Systems Maturity Level and Status documentation Systems Functional Testing Results and Defects status ci        nm For a project to be considered ready for performance testing (Phase 0), the following criteria must be met: ffi 2.2.1.3 Activities or Performance assessment of systems o Establish the Assessment Team o Assess the maturity level of each system with a focus on performance (e.g., response time, throughput) o Assess testing infrastructure o Review the Project Team’s Business Rhythm and Build Process o Review existing project level performance test results o Review previous Census Tests performance results o Review high-level architecture and integration points o Review functional testing results and functional testing defects status o Prioritize the system based on mission criticality and readiness o Identify dependencies and risks o Identify those areas requiring the Census Bureau’s support to resolve challenges o Assess test environment architecture and configuration o Assess performance testing scope for each system Se ns iti ve -F  O The following is a list of high-level activities that will be performed during Phase 0: Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT 10 COMM-CB-18-0152-A-000090  2.2.1.4 Exit Criteria ov er Completion of systems assessments Development of Performance Test Environment Plan (including items such as build out of load generators, Automation tools) Identification of performance test priorities for test execution Enterprise Performance Test and Support Tool Selection G   nm The following list provides the exit criteria from Phase 0:   O se  Review Demand Models o Review external demand models for all external facing systems o Review internal demand models for all intra system interfaces Review NFORs o Review business needs and expectations o Collaborate with Stakeholders and the Project Team to define performance goals and expectations o Establish environment limitations and expectations o Assess the complexity of test data for the performance tests NFORs and Demand Model signoff en tU  nl y Technical Approach al 2.2.2 Phase 1 ci 2.2.2.1 Objectives or O ffi The primary objective of Phase 1 is to develop the Performance Test Plan and baseline the performance of each system. Furthermore, initial scalability testing takes place during this phase to identify any issues as early as possible. 
2.2.2.2 Entry Criteria ve Successful completion of systems assessments Availability of ITE Stable builds with no major defects Development of Test data provisioning plan ns iti     -F For performance testing in (Phase 1) to begin, the following criteria must be met: Se 2.2.2.3 Activities The following is a list of high-level activities that will be performed during Phase 1:  Development of Performance Test Plan o Performance Test Designing and Planning  Script Development Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT 11 COMM-CB-18-0152-A-000091 Technical Approach Test data Setup o Create and utilize Smart Stubs (A program which is used as a substitution for a real interface during testing activities until the real systems have matured and are available for testing) o Define Performance Test Data requirements for Baseline and Isolation Tests o Performance Test Script Designing and Creation for Baseline and Isolation Tests Test Execution o Baseline Performance Test execution o Establish each System’s Performance Baseline results o Isolated Performance Test execution o Scalability Test execution o Execute Isolation test Tuning and Fixes o Re-test and tune to meet performance goals and expectations o Re-test and tune to meet scalability goals and expectations o Monitor test execution Results Analysis o Analyze results Development Performance Test Results Report Establish each System’s Performance Models   en tU nm er ov  G  al  se O nl y  ci 2.2.2.4 Exit Criteria or Successful completion of planned isolation and baselined testing Completion of Performance Test Plan Performance Models for each system -F    O ffi The following list provides the exit criteria from Phase 1: 2.2.3 Phase 2 ve 2.2.3.1 Objectives ns iti The primary objective during Phase 2 is to conduct integrated and business thread endto-end tests with production-like volumes. In addition, a continuation of scalability testing will also occur during this phase. 
Se 2.2.3.2 Entry criteria    Successful completion of Performance Test Plan Completed Performance Test Results Report Completion of Test Data setup Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT 12 COMM-CB-18-0152-A-000092 Technical Approach 2.2.3.3 Activities The following is a list of high-level activities that will be performed during Phase 2: nl y Script Development o Create/update performance test scripts for integration and business thread end-to-end tests o Create/update performance test data for integration and business thread End-to-End tests Test Data Setup o Replace Smart Stubs with live interfaces between available systems Test Execution o Execute integrated performance test o Re-test and tune to meet performance goals and expectations o Execute business thread End-to-End test o Re-test and tune to meet performance goals and expectations o Execute Scalability test o Execute Stress test Tuning and Fixes Results Analysis o Monitor the Application and System o Analyze and document performance results o Validation of the business thread end-to-end performance models o Generate Performance Test Results Report ov ci al G   er nm  en tU  se O  ffi 2.2.3.4 Exit Criteria ve  Successful completion of planned Scalability, Business thread end-to-end, and Stress Test Successful Completion of Performance Test Results Report for Scalability, Business thread end-to end, and Stress Test -F  or O The following list provides the exit criteria from Phase 2: iti 2.2.4 Phase 3 ns 2.2.4.1 Objectives Se The primary objective in Phase 3 is to execute full end-to-end and scalability tests. Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT 13 COMM-CB-18-0152-A-000093 Technical Approach 2.2.4.2 Entry Criteria nl y Successful completion of Performance Test scripts Successful completion of Test Data Setup Scalable environment availability Stable code with tested interfaces O     se 2.2.4.3 Activities ffi or  Tuning and Fixes o Re-test and tune to meet performance goals and expectations o Re-test and tune to meet the scalability goals and expectations Results Analysis O  ci al G ov  nm  Script Development o Create/Update Performance Test Scripts for Full End-to-End Performance Test Test data Setup o Create/Update Performance Test Data for Full End-to-End Performance Test Test Execution o Execute Full End-to-End Test o Execute Load Test o Execute Spike Test o Execute Scalability Testing o Execute End-to-End Scalability Testing o Performance regression testing er  en tU The following is a list of high-level activities that will be performed during Phase 3: -F 2.2.4.4 Exit Criteria Successful Completion of planned Full End-to-End, Load, Spike, Scalability, and Regression Test Successful Completion of Performance Test Results Report for Full End-to-End, Load, Spike, Scalability, and Regression Test Full End-to-End Performance Models Completion of Performance Test Results Report Documented Full End-to-End performance characteristics iti  ve The following list provides the exit criteria from Phase 3: Se ns     Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT 14 COMM-CB-18-0152-A-000094 Technical Approach 2.2.5 Phase 4 2.2.5.1 Objectives se O nl y The primary objectives in Phase 4 are to execute scalability, stress, soak, failover, disaster recovery, and Security tests. Performance testing in the to-be Production environment will also be a priority during this phase to ensure the stability of critical systems ahead of go-live. 
2.2.5.2 Entry Criteria Completion of Performance Test scripts Availability of to-be Production environment en tU   nm 2.2.5.3 Activities The following is a list of high-level activities that will be constituted in Phase 4: er Script Development Test Executions o Execute Scalability Tests o Test End-to-End Scalability including Auto-Scaling in the Census Cloud Solution (GovCloud, TI Cloud, etc.) o Execute soak Tests o Execute Failover Tests o Test End-to-End Failover of the Systems o Execute Disaster Recovery Tests o Test End-to-End Disaster Recovery of the Systems o Test End-to-End Security Tests of the Systems Under Load o Tests will be performed with Operational activities in parallel o Execute End-to-End Performance Tests in the Production environments Tuning and Fixes Results Analysis o Completion of Final Report and Wrap Up Meeting o Validate Graceful Recovery and Handling of Outage -F iti ve   or O ffi ci al G ov   ns 2.2.5.4 Exit Criteria Se The following list provides the exit criteria from Phase 4:  Successful Completion of planned Soak, Failover, Disaster Recovery, and Scalability Test  Successful Completion of planned End-to-End Performance Test in Production environment  Successful Completion of Final Report o Test Analysis and Engineering-Level reports with recommendations Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT 15 COMM-CB-18-0152-A-000095 Technical Approach nl y O  o Assessment documentations of end-to-end performance characteristics under the following conditions:  Systems’ Ability to Scale Up and/or Down  Systems’ Ability to achieve Graceful Recovery  Systems’ Ability to Handle Outages Completion of Defined End-to-End Performance Model se 2.3 Performance Testing Types nm en tU Performance testing plays a crucial role in determining the overall performance of the system through validation and verification of system attributes such as speed, scalability, responsiveness, and stability under a specific load. The goal of performance testing is to identify and eliminate bottlenecks to ensure a stable and scalable system. Performance testing encompasses a wide range of testing activities outlined below. er Certain types of testing (e.g. Load, Scalability testing) will be repeated across phases, as seen in Table 2, however they differ in the following ways:  Se ns iti ve -F or O ffi ci al G ov Tests conducted in phase 1 are isolated performance tests and are constrained by the environment’s size. This may result in testing at lower volumes during load tests and fewer available clusters during scalability testing. There is still value from running these types of tests in a scaled down environment to ensure no major issues exist.  As these testing types are repeated during phase 2, they are able to leverage a larger environment (e.g. ITE) where there is an ability to test multiple integrated systems together. This will result in the ability to increase the volume during load testing and further scale up the environment (e.g. number of clusters) during scalability testing.  Phase 3 builds on phase 2 with the addition of testing being conducted in a production-like environment (Staging) as well as end-to-end testing activities. In this phase, as with the progression of each phase, volumes that can be supported, and the resources available for scalability testing, are increased to 2020 Production-like conditions due to the available environments and the ability for all systems to be tested concurrently. 
 Finally, phase 4 testing efforts will continue to follow the progression of testing complexity as previously described. In this phase, scalability and load testing will be conducted in the Staging and to-be Production environments to ensure Production environment readiness. In summary, testing types repeated in multiple phases will differ due to the fact that scaled up environments will become available, allowing for higher volumes and more clusters to be tested. The addition of each phase also increases the number of systems that can be tested concurrently thus allowing for a more Production-like scenario to be tested. Performance Test Strategy Version AVf HI(,/\ \J 1.0 PVERSIGHT 16 COMM-CB-18-0152-A-000096 Technical Approach Table 2: Types of Performance Tests     en tU  ov er nm  se O nl y Phase 0 Phase 1 Phase 2 Phase 3 Phase 4   Se ns iti ve -F or O ffi ci al Stress Testing Description Load tests conducted at a specific, controlled transaction load and data volume load based on the demand models. Simulate performance at an expected transaction volume level is processed by the fullyintegrated system. Load tests are executed at any chosen volume level, such as levels corresponding to average conditions or corresponding to peak conditions. Involves intentionally ramping up volume to reach a breaking point for a critical resource, such as server, I/O, or memory. This breaking point may have no direct correspondence to an expected peak volume – this type of test is intended to identify which resource will saturate first. For workloads that have volatile peaking patterns, the Performance Analysis Team may want to understand the behavior of the system at 2X or 3X of expected peak conditions. Involves executing longrunning or repeating workload mix scripts to assess whether the performance and stability of the system G Test Name Load Testing Soak Testing Performance Test Strategy Version 1.0 VERSIGHT  17 COMM-CB-18-0152-A-000097 Technical Approach  ov er nm en tU se O nl y Phase 0 Phase 1 Phase 2 Phase 3 Phase 4 G Description remains consistent over long periods of time, such as between 16-48 hours. This type of test is used to identify issues that may become apparent only over an extended testing period, such as a memory leak. Spike The objective of Spike Testing Testing is to analyze the software’s reaction to sudden large spikes in the load generated by users. The goal is to determine whether performance will suffer, the system will fail, or it will be able to handle dramatic changes in load. Scalability The objective of Testing Scalability Testing is to determine the software application's effectiveness in "scaling up" to support an increase in user load. It helps plan capacity addition to the system. Scalability testing is the testing of a software application to test or measure its scale up or scale out features in terms of any of its nonfunctional abilities. Business Business Thread EndThread End- to-End Performance to-End Testing is a technique Performance used to test whether the Testing flow of an application right from start to finish is behaving as expected. 
The Purpose of this testing is to identify system dependencies       Se ns iti ve -F or O ffi ci al Test Name Performance Test Strategy Version 1.0 VERSIGHT 18 COMM-CB-18-0152-A-000098 Technical Approach ov er nm en tU se O nl y Phase 0 Phase 1 Phase 2 Phase 3 Phase 4   ffi ci al Failover Testing Description and to ensure that the data integrity is maintained between various system components and systems. The Entire application is tested for critical functionalities under load such as communicating with the other systems, interfaces, database, network and other applications. It Is used to verify the system’s ability to continue day-to-day operations while the processing part is transferred to a back-up. It can determine if a system can allocate additional resources when needed, or even if it’s able to recognize when the need has arisen. It is the examination of each step in a disaster recovery plan. Disaster Recovery Testing Helps to ensure that an application within an organization can really recover data, restore business critical applications and continue operations after an interruption of services. Security Testing is one type of testing to identify vulnerabilities in an application and is a distinct phase that occurs as part of many application-level tiers. It G Test Name  Se ns iti ve -F or O Disaster Recovery Testing Security Testing Performance Test Strategy Version 1.0 VERSIGHT  19 COMM-CB-18-0152-A-000099 Technical Approach ov er nm en tU se O nl y Phase 0 Phase 1 Phase 2 Phase 3 Phase 4   al Isolation Testing Description models various stages of attacks to help identify application level vulnerabilities or potential vulnerabilities and either remediate or risk accept them, attempts to exploit security flaws. It also involves a human tester who does understand the business context of an application and allows them some creativity and uniqueness in their tests. This type of test is used to isolate one part of an application to determine where a flaw or issue exists within the system. G Test Name ffi ci 2.4 Performance Testing and Monitoring Tools -F or O To simulate Production like conditions, performance testing and monitoring tools will be selected and utilized for script and scenario creation, load distributions, execution, and analysis. Table 3 lists samples of widely adopted tools that may be leveraged for performance testing. 
Table 3: Performance Testing and Monitoring Tools ve Tool Name Type System Administrator/Owner Comments Scripting Tool System Performance Team Create new scripts for load testing HP Performance Center (PC) Controller System Performance Team Create and run new load test Load Generator System Performance Team Generate Load Se ns iti HP Virtual User Generator Performance Test Strategy Version 1.0 VERSIGHT 20 COMM-CB-18-0152-A-000100 Technical Approach Tool Name Type System Administrator/Owner Comments Cloud-based distributed testing tool System Performance Team Cloud-based distributed testing tool Cloud-based distributed testing tool System Performance Team Cloud-based distributed testing tool Monitoring System Performance Team Monitor resources while the scenario is running HP SiteScope Monitoring System Performance Team AWS CloudWatch Monitoring System Performance Team AppDynamics APM Monitoring System Performance Team en tU se O nl y Cloud-based distributed testing toolWeb App Cloud-based distributed testing toolMobile App HP PC Monitors nm Monitor resources while the scenario is running Monitor resources Se ns iti ve -F or O ffi ci al G ov er Monitor resources Performance Test Strategy Version 1.0 VERSIGHT 21 COMM-CB-18-0152-A-000101 Performance Test Planning 3 Performance Test Planning nl y 3.1 Establish Performance Test Objectives en tU se O The definition of measurable performance goals and objectives is critical early in the Performance Engineering Life Cycle. The definition of performance objectives involves a negotiation process that strikes a balance between service levels desired by the business and realistic service levels that can be provided by the system and its supporting technology. Table 4 identifies the recommended activities and performance artifacts which Programs and Projects should focus on for setting Performance Test Objectives. Performance Artifacts Programs develop the overarching performance strategy which considers the purpose of the Program, as well as incorporates each of the Project’s performance considerations. Performance Strategy Projects follow the direction and guidance provided the Program and the Program’s Performance Strategy to execute the Performance Engineering Life Cycle throughout the ESDLC. N/A Programs develop performance targets and metrics for the Program and allocate to the Projects. Performance Targets and Metrics Project Projects adopt Program performance targets and metrics, develop a plan to report and meet the Program allocated targets and metrics, and develop Project-specific metrics and targets, as needed. NFORs Program Programs develop external and internal demand models which forecast the demand on the Program as a whole, and which feed into Project demand levels. Demand Models NFORs Program al G Define Performance Strategy Responsibilities er Program/Project ov Activity nm Table 4: Performance Test Objectives Activities Program Se ns iti ve -F Define Performance Targets and Metrics or O ffi ci Project Develop External and Internal Demand Models Performance Test Strategy Version 1.0 VERSIGHT 22 COMM-CB-18-0152-A-000102 Performance Test Planning Program/Project Performance Artifacts Responsibilities Projects derive Project-specific demand models from Program provided demand models. 
3.2 Establish NFORs for Performance Testing

Performance NFORs are the building blocks on which many performance engineering processes are built. They determine the technical specifications and help communicate the "-ilities" characteristics (e.g., availability, scalability) of the systems being delivered. A good understanding of, and agreement on, performance NFORs ensures that the Business, Program, and Project teams are clear about the performance constraints that are evaluated within the scope and context of the performance test.

4 Performance Test Environment

Performance testing is conducted to check the behavior of the application under test (AUT) under different load conditions. However, the application is closely tied to the environment and infrastructure that support it. Therefore, a large focus is also placed on the environment sizing that will yield the necessary performance targets. It is not always possible to build the Performance Test environment as an exact replica of the Production environment; the expectation is that the Performance Test environment is representative of the Production environment. The following section expands on these sizing requirements.

4.1 Environment Sizing

The testing environment must demonstrate sufficient realism in terms of design and loads to prove that its capabilities can support the operational needs of the system. As the 2020 operational environment design and architecture become final, the Test environment needs to provide an adequate representation of the operational environment.

The main objective is to performance test as early as possible in a scalable environment to rule out any potential bottlenecks that might hinder the performance of the systems. Continuous activities focusing on configuration of the Performance Test environments will be in place to support the different types of performance testing that will run in multiple test phases. Systems will be tested and benchmarked as the environment goes through a series of configuration updates to improve performance. These environment configuration changes need to be closely tracked to ensure they are implemented in the to-be Production environment. The environment changes will be made following the established TI configuration management processes.

4.2 Test Environment Management

To ensure consistent configurations across different environments and builds during all testing phases, the Program Level Test Team will follow the process established by the Program Level Configuration Management Team. It will be critical to properly manage the Test environment to ensure availability and stability during performance testing windows.
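Because Section 4.1 expects the Performance Test environment to be representative of Production rather than an exact replica, a common practice is to scale the applied load to the capacity of the environment actually provisioned. The arithmetic below is a generic illustration of that kind of scaling, not a sizing rule from this strategy; the capacity ratio and workload figures are hypothetical.

    def scaled_load(production_peak_users: int, capacity_ratio: float) -> int:
        """Scale a Production peak workload down to a test environment
        provisioned at capacity_ratio (0 < ratio <= 1) of Production.
        Illustrative only; real sizing would follow the demand models."""
        if not 0 < capacity_ratio <= 1:
            raise ValueError("capacity_ratio must be in (0, 1]")
        return round(production_peak_users * capacity_ratio)

    # Hypothetical example: a test environment at 25% of Production capacity
    # would be driven at roughly 25% of the projected peak concurrency.
    print(scaled_load(production_peak_users=200_000, capacity_ratio=0.25))  # 50000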
5 Test Data Strategy

Test data creation and maintenance is an integral and important aspect of testing. High-quality, Production-like test data will ensure a high level of data permutation and therefore wider test scenario coverage during performance testing. A structured approach to the creation and configuration management of test data will enable and expedite testing activities.

A dedicated Test Data Management (TDM) Team will support 2020 Census performance testing by developing and maintaining secure test datasets that simulate Census data formats but are not actual Census data or based on actual Census data (and are therefore not considered Title 5, Title 13, Title 26, Sensitive, or PII).

The TDM Team will work together with the Performance Test Team to collect test data requirements and to develop a delivery schedule and data provisioning plan as required in Phases 1 through 4.

5.1 Types of Test Data

A diverse set of datasets will be required to fully exercise the performance of the 2020 Census SoS.

• Address Master File (AMF) data sets: Foundational data model framework that supports all components of address data required anywhere within the SoS.
• Residence Master File (RMF) data sets: Correlative data model framework that supports all core paper and Group Quarters (GQ) data required throughout the SoS. This data model relies on address data from the AMF. It also provides Administrative Records (postal, tribal, IRS, CMS, SSA, etc.) that correlate with core paper and GQ forms. Language support for Spanish, Chinese, Korean, and Vietnamese is also built into this model.
• Capture data: Correlative data model framework that supports all data capture modes (internet, questionnaire, phone, field enumeration) and paradata, along with a simulated workflow engine.
• Field data: Correlative data model framework that supports all field data (hiring process documentation such as resumes and credit checks, HR records for 300,000 enumerators, enumerator timesheets for 10 weeks, enumerator expense reports for 10 weeks, submissions and updates for 10 weeks, and enumerator fraud scenarios for 10 weeks, along with simulated enumerator assignments).
• For all types of test data generated, both happy-path and erroneous datasets will be included to ensure that all expected and unexpected error-handling conditions are exercised (see the illustrative sketch at the end of this section).

5.2 Test Data Management (TDM)

Effective test data management plays a key role during the different performance test phases. A dedicated TDM Team will be responsible for developing and disseminating test data sets to the Test Teams per an agreed-upon set of requirements. Performance Test Engineers will leverage a metadata repository, maintained by the TDM Team, to determine the required datasets that need to be requested.

5.3 Test Data Provisioning

Performance Test Engineers will require new and/or refreshed datasets to enable their testing efforts. A request will be sent to the TDM Team using a data request template with specific requirements and target completion dates. Data dictionaries and file layout documentation will accompany the delivery of the dataset(s).
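Section 5.1 calls for every generated dataset to mix happy-path records with erroneous ones so that error handling is exercised. The sketch below shows one generic way to produce such a mix of synthetic records; the field names, error types, and proportions are hypothetical stand-ins and do not describe the TDM Team's actual data models or tools.

    import random

    def make_record(record_id: int, inject_error: bool) -> dict:
        """Build one synthetic response record; optionally corrupt it (illustrative)."""
        record = {
            "id": record_id,
            "mode": random.choice(["internet", "paper", "phone", "field"]),
            "household_size": random.randint(1, 8),
        }
        if inject_error:
            # Hypothetical erroneous case: blank out one required field.
            record[random.choice(["mode", "household_size"])] = None
        return record

    def generate_dataset(size: int, error_rate: float = 0.05) -> list:
        """Generate a dataset with a controlled share of erroneous records."""
        return [make_record(i, random.random() < error_rate) for i in range(size)]

    dataset = generate_dataset(1000)
    bad = sum(1 for r in dataset if None in r.values())
    print(bad, "erroneous records out of", len(dataset))

Controlling the error rate explicitly, as in this sketch, is what allows exception handling to be exercised at known, statistically relevant volumes.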
6 Performance Testing Reporting Procedures

Reports containing scalability details, transaction performance summaries, and a list of findings will be provided at the end of every iteration. Reports will also detail whether the system meets previous baselines or is able to scale up, and will list any performance bottlenecks found in the various tiers (application or database) of the architecture.

Two levels of reporting will be provided to the stakeholders: Test Analysis and Engineering-Level reports.

6.1 Test Results

The test result report will include high-level data such as, but not limited to, the following:

• Test Name: Name of the test scenario
• Test Time: Test execution time
• # of Users: Number of users utilized in the scenario
• Test Duration: Duration of the test run
• Transaction Name: List of transactions included in the test run
• Average Response Time: Average response time across all transactions
• 90th Percentile: 90th percentile response time across all transactions
• Volume: Number of transactions executed in the scenario
• Failures: Number of failures observed in the scenario
• HyperText Transfer Protocol (HTTP) Error Types: HTTP error types observed in the test
• Throughput: The amount of data, in bytes, received from the server
• Hits Per Second: The number of hits made on the web server by virtual users during each second of the load test

An illustrative computation of several of these values appears at the end of this section.

6.2 Engineering-Level Reporting

Engineering-level reporting will provide details on individual server metrics for the Web, Application, and Database servers.

Web Server and Application Server

The web server metrics provide resource usage data for the web server during the performance test and give useful information on web server performance and its bottlenecks. Most of the application's computational activities are performed on the application server, which is the backbone of the application, especially in the case of highly complex business applications. Some of the common metrics for Web and Application servers include, but are not limited to:

• Average CPU Utilization (%): Percentage usage of CPU time (across all processors)
• Available Memory: Total amount of memory remaining after non-paged pool and paged pool allocations
• Pages/sec: The number of page inputs and outputs per second
• Committed Bytes: Total amount of memory that has been allocated for the exclusive use of any process
• Non-Pageable Memory Pool Bytes: The size of physical memory used for objects that cannot be written to disk when they are not being used
• Requests/sec: The number of requests executed per second
• Request Wait Time: The amount of time the most recent request waited in the queue
• Java Virtual Machine (JVM) Heap Size: The amount of memory allocated to an application running in the JVM
• Garbage Collector: A program running on the JVM that disposes of objects no longer being used by an application

Database Server

All application data are stored in the database, and the database server contains all the critical information for the application. Some of the common metrics include, but are not limited to:

• Database Transaction Summary: Provides information on the volume of data sent and received by the database server
• Database Connection Summary: Provides the total number of connections opened and closed on the database server during the test
• Database Thread Summary: Provides information on the number of new threads created, connected, and used
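As a complement to the report fields listed in Section 6.1, the snippet below computes a few of those summary values (average response time, 90th percentile, throughput, hits per second) from raw per-transaction timings using only the Python standard library. It is a generic illustration; the sample data, window length, and byte counts are hypothetical, and the real reports would come from the tools in Table 3.

    import random
    import statistics

    def summarize(response_times_s, failures, duration_s, bytes_received):
        """Compute a minimal test-result summary (illustrative only)."""
        deciles = statistics.quantiles(response_times_s, n=10)  # nine cut points
        return {
            "transactions": len(response_times_s),
            "average_response_time_s": round(statistics.mean(response_times_s), 3),
            "90th_percentile_s": round(deciles[-1], 3),   # last decile cut = ~p90
            "failures": failures,
            "throughput_bytes_per_s": round(bytes_received / duration_s, 1),
            "hits_per_s": round(len(response_times_s) / duration_s, 1),
        }

    # Hypothetical sample: 1,000 transactions observed over a 60-second window.
    samples = [random.uniform(0.2, 2.5) for _ in range(1000)]
    print(summarize(samples, failures=4, duration_s=60, bytes_received=48_000_000))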
7 Key Roles

Table 5 outlines the key roles associated with the performance test activities. The roles described in the table will be involved in all phases of performance testing to support the various activities as needed. However, the number of resources in each role may vary by phase; for example, fewer performance testers will be needed in Phase 0 than in Phase 4.

Table 5: Key Roles

Performance Engineer
• Monitors system performance (workload, application, and infrastructure)
• Leads capacity planning and monitoring activities
• Assists with performance tuning activities

Infrastructure Team
• Monitors and reports on infrastructure performance
• Assists with performance tuning activities

Development Team
• Implements code optimization techniques, as necessary
• Participates in code reviews
• Adheres to coding guidelines

Performance Testers
• Plan and execute performance test activities
• Support performance tuning, as necessary
• Generate performance test reports

Business Representatives
• Participate in performance test scenario identification

8 Assumptions and Constraints

8.1 Assumptions

The following are the high-level assumptions on which the Performance Test Strategy is based:

• Performance objectives, including working assumptions and testing scenarios, have been agreed upon prior to test execution
• Completed system design documents exist and are available
• Performance testing tools and supporting technologies will be installed and fully licensed
• The environment needed for each test and phase is available
• External and internal performance models are completed
• Project-level performance characteristics are documented
• Code readiness is verified prior to test execution
• NFORs are available
• Performance test data needs are understood
• Architecture and design documents are available for all Projects (post-2020 Critical Design Review (CDR) version)
• The testing team has access to the environment needed for the test and phase
• The testing environment will be populated with Production-like / simulated data in sufficient volume prior to performance test execution, where applicable for the test cases
• Issue and defect resolution will support the testing timeline

8.2 Constraints

The following is the key constraint identified during the development of the Performance Test Strategy:

• With the deployment of a new build, some underlying objects in the system may change, requiring automated test scripts to be rewritten and the data setup to be changed. This increases script preparation effort and affects readiness to execute a performance test.

9 Dependencies

The following are high-level dependencies identified during the development of the Performance Test Strategy:

• Availability of internal and external demand models for all systems
• Completed NFORs for 2020
• Test data will be delivered per an agreed-upon schedule
• Test environments to support performance testing at the various levels (Project Test, Independent Test Environment, Staging, Production, etc.)
• Performance testing tools shall be installed in the ITE, Staging, and Production environments
• A Performance Test environment for the performance test tool (cloud-based load generation, etc.)
10 Risk and Mitigation

All risks and issues related to testing, and in particular to performance testing, will be managed through the TI Risk Management Process. This process includes identifying mitigation strategies and tracking them to completion.

Appendix A: Acronyms List

AMF: Address Master File
APM: Applications Performance Monitoring
ARB: Architecture Review Board
AUT: Application Under Test
CDR: Critical Design Review
CEDCaP: Census Enterprise Data Collection and Processing
COE: Center of Excellence
DB: Database
ESDLC: Enterprise System Development Life Cycle
GAO: Government Accountability Office
HP ALM: Hewlett Packard Application Lifecycle Management
HTTP: HyperText Transfer Protocol
IIP: Integration and Implementation Plan
IT: Information Technology
ITE: Independent Test Environment
MAR: Maturity Assessment Report
NAS: National Academy of Sciences
NFOR: Non-Functional Operational Requirements
OIS: Office of Information Security
PRR: Production Readiness Review
PTE: Project Test Environment
RMF: Residence Master File
SE: Staging Environment
SLA: Service Level Agreement
SoS: System of Systems
SWG: Standards Working Group
TAR: Test Analysis Report
TDM: Test Data Management
TEMP: Test and Evaluation Management Plan
TI: Technical Integrator
TRR: Test Readiness Review
USCB: United States Census Bureau

2020 Census
Test and Evaluation Management Plan
Version 2.0

Due Date: December 22, 2016
Delivered Date: December 22, 2016
Technical Directive # 007
DCN: TD-007-004-002
Work Order # YA1323-15-BU-0033/003
Prepared By: T-Rex Corporation

Document Revision History

Version 0.1, 11/23/2016, Trong Bui: Initial Draft
Version 1.0, 11/30/2016, Trong Bui: Initial Formal Submittal
Version 2.0, 12/22/2016, Trong Bui: Revised Formal Submittal

Table of Contents

1. Program Overview ........................................................... 1
2. Test Strategy ............................................................... 2
   2.1. Overall Test and Evaluation Management Plan Purpose ................... 2
   2.2. Overall Test Strategy .................................................. 2
   2.3. Test Approach .......................................................... 3
   2.4. Program Test Life Cycle ................................................ 3
        2.4.1. Test Planning ................................................... 4
        2.4.2. Test Design ..................................................... 4
        2.4.3. Test Execution .................................................. 5
        2.4.4. Test Reporting .................................................. 5
   2.5. Test Metrics ........................................................... 5
   2.6. Test Data .............................................................. 6
   2.7. Test Data Security ..................................................... 7
   2.8. Test Tools ............................................................. 7
   2.9. Test Automation ........................................................ 7
   2.10. Test Environments ..................................................... 8
   2.11. Test Roles & Responsibilities ......................................... 9
3. Test Levels ................................................................ 12
   3.1. Project Level ......................................................... 13
        3.1.1. Development Test Strategy ...................................... 13
        3.1.2. System Test Strategy ........................................... 14
        3.1.3. User Acceptance Test Strategy .................................. 15
        3.1.4. Customer Acceptance Strategy - Instrument ...................... 16
   3.2. Program Level ......................................................... 16
        3.2.1. Technical Integration Test Strategy ............................ 17
        3.2.2. Customer Acceptance Strategy - Output .......................... 18
        3.2.3. Checkout & Certification Strategy .............................. 20
        3.2.4. System Readiness Test Strategy ................................. 21
   3.3. Operational Level ..................................................... 22
        3.3.1. Performance & Scalability Test Strategy ........................ 22
        3.3.2. Operational Readiness Test Strategy ............................ 24
        3.3.3. System Checkout ................................................ 25
   3.4. Test Types ............................................................ 26
4. Security Control Assessment ................................................ 29
   4.1. Security Assessment of the Environments ............................... 29
        4.1.1. Authority to Test .............................................. 29
        4.1.2. Authority to Operate ........................................... 29
   4.2. Security Controls Testing ............................................. 29
5. Additional Information ..................................................... 31
   5.1. Test Documentation .................................................... 31
   5.2. Assumptions & Constraints ............................................. 35
        5.2.1. Test Assumptions ............................................... 35
        5.2.2. Test Constraints ............................................... 37
6. Point of Contact ........................................................... 38
APPENDIX A  Decennial 2020 Test Type Definitions .............................. A-1
APPENDIX B  Acronyms .......................................................... B-1
APPENDIX C  Referenced Documents .............................................. C-1

Table of Figures

Figure 1: Test Approach. Assessment results and system status yield comprehensive test planning, design, execution and report generation ........ 4
Figure 2: Test Automation Process ............................................. 8
Figure 3: Decennial 2020 Integration and Test Framework ...................... 13
Figure 4: Test Documentation Tree ............................................ 35

Table of Tables

Table 1: Test Environments .................................................... 8
Table 2: Roles and Responsibilities ........................................... 9
Table 3: Test Categories Mapped to Test Approach ............................. 27
Table 4: Test Documentation .................................................. 31

1. Program Overview

The purpose of the 2020 Census is to conduct a census of population and housing and to disseminate the results to the President, the 50 states, and the American people. The goal of the 2020 Census is to count everyone, only once, and in the right location. The challenge is to conduct the 2020 Census at a lower cost per household (adjusted for inflation) than the 2010 Census, while maintaining high-quality results.

The 2020 Census aims to incorporate internet response, portable devices for field operations, paper responses, administrative records (ADREC), and an integrated field operational control system (OCS). Between these ambitious goals and the challenges the U.S.
Census Bureau encountered during the 2010 Census, a flexible, yet controlled, systems engineering approach is required to successfully implement new systems. The 2020 Census Program faces many unique challenges in scale, volume, complexity, and duration. The program is a nationwide distributed operation with a large and diverse temporary workforce that will often lack technical savvy. The program needs to cover every household or place of residence over a short period and still meet an immutable, constitutionally mandated deadline.

Systems engineering efforts structure the activities required to satisfy the operations of the 2020 Census Operational Plan after meeting all of the prescribed operational readiness criteria.

2. Test Strategy

2.1. Overall Test and Evaluation Management Plan Purpose

The 2020 Census Program Test and Evaluation Management Plan (TEMP) identifies the tasks and activities that need to be performed to ensure that all aspects of the system are adequately tested and that the system(s) can be successfully implemented. The TEMP describes the test activities of the Program in progressive levels of detail. In addition, the TEMP lays out the program test and evaluation strategy by which projects within the Decennial program will benefit from common, repeatable test processes across all levels of testing for the various product releases. The TEMP describes the approach for the various levels and types of testing between systems. Program-level testing will focus on the test guidelines, activities, and support to be provided by the Program Team as well as the other Project teams involved in systems development. The TEMP distinguishes between project- and program-level responsibilities for testing activities.

2.2. Overall Test Strategy

This document defines the 2020 Census Program test strategy, an overall approach for validating the systems under development by ensuring they are designed, built, and implemented to meet the Census Bureau's business requirements and are fully operational upon deployment. The strategy defines the test levels that will be executed within the Program and its component systems. 2020 Census testing is divided at the top level into three major categories: 1) Project Level, 2) Program Level, and 3) Operational Level. The approach identifies the objectives, types of testing conducted, environment, entry and exit criteria, and the inputs and outputs related to each test level.

Testing is a major activity within the quality control process. Testing reduces the risk of delivering a solution with defective or missing functionality, unacceptable performance or usability, or vulnerabilities in security and privacy. The costs associated with detecting and correcting defects rise exponentially as the project proceeds further along the Software Development Life Cycle (SDLC), to the point where major defects detected in the last stages of pre-release testing may render the entire product unfit for service due to the excessive amount of rework required. Consequently, it is critical that any testing approach aim to detect errors as early as possible in the project timeline and release cycle.
Therefore, the 2020 Census Test Strategy, as depicted in Figure 3, "Decennial 2020 Integration and Test Framework," is designed to ensure that defects are identified as early as possible in the process.

To help ensure comprehensive testing, multiple levels of testing are conducted. As test levels progress from one level to the next, they become increasingly imitative of the Production environment. The environment for conducting the tests also moves from the Development environment, for Unit and Project Integration testing, into environments appropriate for System Integration Testing, Program Integration Testing, and Operational Testing. The objective of this approach is to identify defects as early in the process as possible, regardless of the team's methodology (whether Waterfall or Agile). As testing progresses through each level, different types of testing are conducted to achieve the greatest amount of test scenario coverage and confidence in the system.

2.3. Test Approach

Various development approaches are being managed by the project teams for project-level development. Some of the projects follow the Waterfall methodology while others use the Agile approach.

For project teams executing a Waterfall development approach, there is a direct correlation between the SDLC phase and the associated test level and test type, where each life cycle phase and test environment has well-defined test levels and entry and exit criteria.

For projects executing an Agile/iterative development approach, testing will be performed to the same levels (i.e., unit, integration, regression, security); however, the sequencing of testing will be integrated within the individual sprint/iteration. In keeping with Agile practice, upfront collaborative requirements gathering with the product owner and knowledgeable Subject Matter Experts (SMEs) will include simultaneously defining the test levels and test requirements needed to satisfy acceptance testing for the work product. Following the multiple sprints comprising a program increment, and the multiple program increments making up a release, additional levels of testing (i.e., system, regression, security, and performance) will take place prior to deployment. Oftentimes, this testing could take the form of a "hardening" sprint, during which the Development Team will participate in both testing and defect remediation.

As testing moves past Project System Test, the code will be delivered to the Program Level Test Team incrementally. For projects that are not aligned with the release cycles, the TI Team will work with each team to determine the best schedule for integrating the code based on the capabilities delivered within each increment timeframe. The Program Level Test Team depends on the Functional Integration Plan, which aligns the functional capabilities provided across and within systems in the various deliveries, to plan the testing that can be conducted in each release.

2.4. Program Test Life Cycle

The Program Level test approach starts with an in-depth knowledge of the 2020 Census systems and is enhanced through our assessments. Testing is conducted early, often, and iteratively.
Communication of progress, issues, and risks is continuously conveyed through various channels (status dashboard, daily standups, weekly meetings, and Test Analysis Reports (TARs)) to apprise stakeholders of the current test status.

Our cyclical, phased process, shown in Figure 1, is continuously updated with assessment and segment lead findings. This tailored approach strategically grows both horizontally (the number of systems under integration and test) and vertically (the test types for each system, thread, and System of Systems (SoS)). Priority systems, interfaces, and operations are completed first while continuing toward the full end-to-end suite of SoS tests that the TI Team uses to mimic Decennial operations. This horizontal and vertical testing approach uses thread-level tests, for example the Non-Response Follow-Up (NRFU) or Self-Response threads, and SoS-level tests focused on the overall operational-level scenarios that span multiple threads and response channels.

Figure 1: Test Approach. Assessment results and system status yield comprehensive test planning, design, execution, and report generation.

2.4.1. Test Planning

During the planning phase, the Program Level Test Team develops the approach to break down Test Scenarios and Test Cases by Test Type.

The Test Scenarios and Test Cases are work products that will be continuously updated by the test team based on a variety of inputs, such as system documents including requirements, architecture, design, system models, and existing test cases. The Program Level Test Team defines these types of inputs and works with the segment teams to help gather the documentation needed from the Project Teams.

The Program Level Test Team will identify dependencies in order to determine which procedures are affected by any limitations found and what can be done to resolve them from a testing perspective. This defined testing approach will be utilized for each release.

The Program Level Test Team will quickly develop an inventory that identifies the Test Scenarios and Test Cases by Test Type. This inventory provides the universe of test scenarios that will be planned for execution based on the delivery of the capability. The Program Level Test Team will then work with Census Bureau stakeholders to select and prioritize the test cases to be run within each increment.

Refer to the Integration and Test Plan for details on Program Level Testing.

2.4.2. Test Design

In this phase, we develop test data needs and detailed test cases that verify specific requirements using operational scenarios, and we identify acceptance criteria in conjunction with project SMEs and operations personnel. The Program Level Test Team develops the Program-level test data strategy by assessing the requirements of each system and the business threads across systems, coordinating with stakeholders on test data needs, and architecting a test data plan. We use existing test data together with our synthetic test data to ensure that the entirety of test data satisfies the needs of the SoS solution, from spatial to frame and administrative records.
2.4.2.1. Test Data Design

The Program Level Test Team generates synthetic test data that can be used for different kinds of testing, without referencing existing production data (to avoid confidentiality or privacy risks). The result is a collection of realistic test artifacts with thorough coverage and correlation, at statistically relevant volumes, and with known ground truth. The test data set is used in smaller-scale integration testing and builds up to large-scale performance tests. We maximize test efficiency and cost savings by using in-house tools to create representative, yet registered and labeled, data that can be used across all systems and segments and for different kinds of tests.

Existing synthetic data generation technology is used by establishing data models that automatically generate test data meeting the requirements of the system being tested. Combined with Agile methods, these models lead to discovery and preparation for follow-up tests. The Program TI Team uses the synthetic data generator and other tools (e.g., test automation tools) to generate synthetic data for different types of testing. We particularly focus on injecting statistically accurate errors into the test data to test exception handling at appropriate volumes.

2.4.3. Test Execution

The approach is to test every system and integration point as early as possible and with as many test types as are applicable to the system or interface. Test execution will be conducted by the testers using the test cases designed in the design phase. During testing, defects will be identified, recorded, and addressed by the overall program. As defect fixes are provided within the releases, the Program Level Test Team will also verify that the fixes address the defects as written. These defects will then be marked as "closed."

2.4.4. Test Reporting

Test reporting includes metrics on test requirements coverage, test execution progress, and defect metrics. Defect metrics include the total and open defects per test type and system and are categorized by priority and severity. Reporting depicts the defect burndown status and trends for identification, resolution, and closure. Test execution and defect status, available workarounds for defects, and improvement recommendations are communicated immediately and are summarized in the TARs. Test reporting serves as a major input to the Production Readiness Review, where the 2020 Census Program assesses the readiness of the System of Systems to support final Operational Readiness Testing (ORT).

2.5. Test Metrics

The management of the 2020 Census Program will rely on the generation of status reports and metrics to effectively quantify progress. Test metrics are collected and reported during all levels of testing, including unit testing, code integration, system testing, program-level testing, user acceptance testing, customer acceptance testing, and security vulnerability testing.

The test teams need to provide the following types of metrics:

1. Execution Metrics (Pass/Fail/Not Tested)
2. Defect Metrics

Execution metrics are used to provide status on test execution coverage, including "pass," "fail," and "not tested" statuses, and are used to determine the readiness of the components and systems as the releases move through the test levels.
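The execution metrics described above reduce to simple counts over test-case statuses. The snippet below is a minimal, illustrative roll-up of those statuses into coverage figures; the status labels mirror the "pass," "fail," and "not tested" values named in this section, while the test-case identifiers and counts are hypothetical.

    from collections import Counter

    def execution_summary(results):
        """Summarize pass/fail/not-tested counts and percent executed (illustrative)."""
        counts = Counter(status for _, status in results)
        total = sum(counts.values())
        executed = counts["pass"] + counts["fail"]
        return {
            "total": total,
            "pass": counts["pass"],
            "fail": counts["fail"],
            "not tested": counts["not tested"],
            "percent executed": round(100 * executed / total, 1) if total else 0.0,
        }

    # Hypothetical results: (test case ID, status) pairs for one release.
    results = [("TC-001", "pass"), ("TC-002", "fail"),
               ("TC-003", "not tested"), ("TC-004", "pass")]
    print(execution_summary(results))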
The projects shall use the approved enterprise defect tracking and management tool. When a defect is identified, the person finding the defect (typically a Quality Assurance (QA) tester or a developer) shall log the defect details into the approved enterprise defect tracking and management tool to ensure the Development Team and other stakeholders are aware that a defect exists and must be addressed. For each defect identified, information shall be logged to assist the team in evaluating the defect. The defect reporting process (including the fields to be logged, such as the severity levels and priorities) is described in detail in the Defect Remediation Plan, which will be developed by the TI Team.

The Defect Remediation Plan will identify what constitutes a defect as well as the severity levels, priority levels, and fields included in a defect. The Plan will also discuss the defect flow and triage steps once a defect has been identified.

The Program Level Test and Project Team Leads shall meet frequently to:

• Review active defects
• Ensure mutual understanding of each defect
• Develop a plan for fixing each defect
• Prevent recurrence of the defect
• Schedule re-testing to validate the fix
• Review defect trends to find areas of improvement
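Each logged defect carries fields such as severity and priority so that the leads listed above can triage it consistently. The sketch below shows one generic shape for such a record and a filter a triage meeting might use; the field names, level scales, and sample defects are hypothetical and do not describe the approved enterprise tool or its schema.

    from dataclasses import dataclass

    @dataclass
    class Defect:
        """Minimal defect record (illustrative; not the enterprise tool's schema)."""
        defect_id: str
        system: str
        severity: int    # 1 = critical ... 4 = low (hypothetical scale)
        priority: int    # 1 = fix first ... 3 = fix later (hypothetical scale)
        status: str      # e.g., "open", "fixed", "closed"

    def triage_queue(defects, max_severity=2):
        """Open defects at or above the given severity, worst first."""
        open_defects = [d for d in defects
                        if d.status == "open" and d.severity <= max_severity]
        return sorted(open_defects, key=lambda d: (d.severity, d.priority))

    defects = [
        Defect("D-101", "Internet Self-Response", severity=1, priority=1, status="open"),
        Defect("D-102", "Field OCS", severity=3, priority=2, status="open"),
        Defect("D-103", "Internet Self-Response", severity=2, priority=1, status="closed"),
    ]
    for d in triage_queue(defects):
        print(d.defect_id, d.system, d.severity, d.priority)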
Resources using data has to be Title 13-trained. al 2.8. Test Tools ci The following Test Tools will be used during Program Level Testing: HP ALM - used for requirements management, developing test cases, tracking defects, and metrics generation.   HP Unified Functional Tester (UFT) - used for automation testing. JAWS - used for Section 508 testing. -F or O ffi  ve Additional test tools will be researched and evaluated to be added as needed to facilitate Program Level testing. iti 2.9. Test Automation Se ns The 2020 Census Program will maximize the use of automated testing to the extent possible to take advantage of efficiency and increased quality. Care must be taken to avoid automating tests too early to avoid costly re-factoring of automations in the event of application changes and UI adjustments. To increase the software quality with ever increasing software complexity, we maximize the use of test automation. Among the many test automation benefits, two of the most significant are the ability for test organizations to execute test cases faster and increase test coverage by being able to test complex data combinations and/or process complex test scenarios. We provide a solid foundation by designing an effective test automation Test and Evaluation Management Plan Version AVf HI(,/\ \J 2.0 PVERSIGHT 7 COMM-CB-18-0152-A-000125 2020 Census Test and Evaluation Management Plan • Te t Automation • cripting • Te t Data Creation • Te t Automation • Execution • Results and Analysis Planning • Re-test and validation • Regre ion te ting support O Ewcution • Detailed Plaiming • Framework Creation '.\laintenance se Discoven • I Iigh-level Scope • Architecture • Investigation • Solution Creation nl y strategy through a phased process. Our goal is to create a practical test automation framework tailored for the entirety of 2020 Census systems. Similar to the software development life cycle, as shown in Figure 2, the Program TI Team’s test automation process contains four phases: Discovery, Planning, Execution, and Maintenance. en tU Figure 2: Test Automation Process Following this phased approach allows Program TI Team to implement a successful test automation effort while delivering improved software quality and test coverage. nm Regression test cases would be ideal candidates for automation as the suite of regression test cases would remain relatively stable. er The Program Level Test Team plans to coordinate with Project teams to leverage automation already implemented there. G ov Refer to the Automation Test Plan for details on the Automation Testing Plan and activities. al 2.10. Test Environments ffi ci Below is the list of the five Test environments planned for testing as depicted in the Decennial 2020 Integration and Test Framework (Figure 3) in Section 3. 
2.10. Test Environments

Below is the list of the five test environments planned for testing, as depicted in the Decennial 2020 Integration and Test Framework (Figure 3) in Section 3.

Table 1: Test Environments

Development (Project Level): Environment used for Development Testing.
Project Test (Project Level): Environment used for Project Level System Testing, Security Controls Testing, UAT, and CAT (Instrument).
Independent Test (Program Level): Environment used for Program Level Integration Testing, Security Controls Testing, and CAT (Output).
Staging (Program Level / Operational Level): Environment used for Checkout and Certification (C&C), System Readiness Test (SRT), Performance and Scalability Testing, and ORT.
Production (Operational Level): Environment used by the Operations Team for Production.

2.11. Test Roles & Responsibilities

Test roles and responsibilities are defined below.

Table 2: Roles and Responsibilities

Project Development Teams (Developer)
• Responsible for performing unit test procedures
• Responsible for performing integration tests in the Development environment

Project Test Team(s) (Project Tester(s))
• Consists of resources from the project who are accountable and responsible for project/system-level testing
• Responsible for performing project-level testing
• Escalates technical risks/issues to SEIT

System Users (System Users)
• Responsible for performing User Acceptance Testing (UAT)

Subject Matter Experts (SMEs)
• Responsible for performing Customer Acceptance Testing (CAT), both Instrument Test and Output Test
• Participate in a supporting role for ORT

Operational Readiness Test Team (ORT Team)
• Responsible for performing ORT
Program Level Test Team

2020 Census Test Lead
• Responsible for managing the 2020 Census Test Team and all testing approaches
• Gathers test metrics and coordinates test approaches with scheduling and execution
• Produces documentation for test approaches and schedules software installations

I&T Deputy Test Manager
• Creates or coordinates and delivers all formal document deliverables and Special Interest Artifacts
• Assists the Test Lead with gathering test metrics, helps coordinate the different types of testing, and maintains the schedule

I&T Defect Manager
• Manages defects and coordinates with other teams (e.g., Project Teams, Operations)

I&T Interface Test Lead
• Leads the Interface Test Team
• Coordinates with the different systems
• Escalates technical risks/issues to the Census Test Lead

I&T Interface Testers
• Perform interface-related test cases/scenarios
• Write defects for issues found

Performance Test Lead
• Leads the Performance Test Team
• Escalates technical risks/issues to the Census Test Lead

Performance Tester
• Creates performance scripts, executes them, and documents results
• Writes defects for issues found

System Management/Infrastructure Test Lead
• Leads the System Management/Infrastructure Test Team
• Escalates technical risks/issues to the Census Test Lead

System Management/Infrastructure Tester
• Performs System Management/Infrastructure-related test cases/scenarios
• Writes defects for issues found

Regression Test Lead
• Leads the Regression Test Team
• Escalates technical risks/issues to the Census Test Lead

Regression Tester
• Performs regression-related test cases/scenarios
• Writes defects for issues found

Test Data Specialists
• Assist in generating the test data needed by the test team

3 Test Levels

This section explains the test levels, the objectives of each level of testing, the types of testing conducted within each level, entry and exit criteria, inputs and outputs, the environment used, and the team responsible for each level of testing.

Figure 3, "Decennial 2020 Integration and Test Framework," depicts the major levels of testing for the 2020 Census Program:

• Project Level
  o Development Testing
    - Unit Testing
    - Integration Testing
  o Readiness Review
  o System Testing
  o User Acceptance Testing (UAT)
  o Customer Acceptance Testing (CAT) - Instrument Test
• Program Level
  o Test Readiness Review
  o Integration Testing
  o Security Controls Testing
  o Customer Acceptance Testing (CAT) - Output Test
  o Pre-Production Readiness Review
  o C&C
  o System Readiness Testing (SRT)
• Operational Level
  o Production Readiness Review
  o Performance and Scalability Test
  o ORT
  o Pre-Operational Readiness Review (ORR)
  o System Checkout
  o ORR
Figure 3: Decennial 2020 Integration and Test Framework. The figure maps the test activities and readiness reviews listed above to the responsible teams and to the Development, Project Test, Independent Test, Staging, and Production environments, and flags test automation candidates.

3.1 Project Level

3.1.1 Development Test Strategy

Development testing consists of Unit Testing and Integration Testing. Unit testing tests individual software components as an integral and continual activity of system development, while Integration Testing validates that software modules created from the integration of previously unit-tested software integrate properly.

• Responsible Team: Project Development Team
• Types of Testing Conducted
  o Unit Test
  o Integration Test
• Environment: The Development environment will be utilized for Unit and Integration Testing
• Entry Criteria for Development Test
  o The Development environment is available
  o A process is in place to deploy code into the Development environment
• Inputs for Development Test
  o ESDLC Requirements Document
  o ESDLC Requirements Traceability Matrix (RTM)
  o Detailed Design Specification
  o Solution Architecture
  o Obtain the appropriate level of Security Authorization
• Exit Criteria from Development Test
  o Test execution for Unit and Integration Testing has been completed
  o Release Notes for deployment to System Test
• Outputs from Development Test
  o List of outstanding defects
  o Unit and Integration test results (TAR)
  o System Administration Manual
  o System Operations Manual
  o Test Cases
  o System Deployment Plan
  o Release Notes for deployment to System Test
• Readiness Review
  o The gate review between project development and project test, conducted when necessary
3.1.2 System Test Strategy

System testing ensures that the system performs the necessary business functions as outlined in the system requirements. As part of Project Level System Testing, the Functional, Usability, Path, Performance, Interface, Application Regression, Infrastructure, and End-to-End test types will be conducted.

• Responsible Team
  o Project Test Team
• Types of Testing Conducted
  o Functional Testing
  o Usability Testing
  o Interface (Pair-wise Internal and External) Testing
  o Performance Testing
  o Path Testing
  o Application Regression Testing
  o Infrastructure Testing
  o End-to-End Testing
• Environment
  o The Project Test environment will be utilized for System Testing
• Entry Criteria for System Testing
  o The test lab is configured and the software is installed
• Inputs for System Testing
  o ESDLC Requirements Document
  o ESDLC RTM
  o Detailed Design Specification
  o Solution Architecture
  o Unit and Integration test results (Test Analysis Report (TAR))
  o List of outstanding defects
  o System Administration Manual
  o System Operations Manual
  o Test Cases from Development Testing
  o System Deployment Plan
  o Release Notes for deployment to System Test
  o Obtain the appropriate level of Security Authorization
• Exit Criteria from System Testing
  o Test execution has been completed for System Testing
• Outputs from System Testing
  o TAR
  o List of outstanding defects from System Test
  o Test Cases from System Testing
  o Release Notes for deployment to Integration Test
  o ESDLC RTM
  o Obtain the appropriate level of Security Authorization

3.1.3 User Acceptance Test Strategy

UAT enables end users to evaluate real-world scenarios to test the system before it is deployed in the Production environment. This testing is essential for the Business/Mission Area to determine whether the system correctly implements business processes, and it increases end-user acceptance of the new or modified system.

• Responsible Team
  o System Users
• Types of Testing Conducted
  o User Acceptance Test (UAT)
• Environment
  o The Project Test environment will be utilized for UAT
• Entry Criteria for UAT
  o The Project Test environment is available to perform UAT
• Inputs for UAT
  o Operational test scenarios
• Exit Criteria for UAT
  o Test execution has been completed for UAT
• Outputs from UAT
  o Test results and summaries
  o Defect reports

3.1.4 Customer Acceptance Strategy - Instrument

CAT-IT is intended to demonstrate the graphical user interface (GUI) to SME personnel to elicit their feedback and approval. The schedule for CAT testing needs to align with the UI freeze dates.

• Responsible Team
  o SMEs
• Types of Testing Conducted
  o Customer Acceptance Test (CAT) - Instrument
• Environment
  o The Project Test environment will be utilized for CAT
• Entry Criteria for CAT
  o The Project Test environment is available to perform CAT
• Inputs for CAT
  o Acceptance test scenarios
• Exit Criteria for CAT
  o Test execution has been completed for CAT
• Outputs from CAT
  o Test results and summaries
  o Defect reports

3.2 Program Level

The Test Readiness Review (TRR) is the gate review between project- and program-level test. This review ensures that appropriate test objectives, methods, procedures, scope, and environments are ready. It assesses the readiness of systems to begin independent program-level testing.
3.2.1 Technical Integration Test Strategy

Integration and Test (I&T) is the process of combining the various systems into one overall program. I&T focuses on executing tests across system boundaries, validating the physical and logical interfaces between systems, hardware products, software products, and external interfaces.

The primary integration objective is to ensure that the system under development evolves as a homogeneous system, as opposed to a collection of incompatible hardware and software products. This requires that the system be integrated and tested in a logical, incremental fashion as the development phase of the program progresses.

I&T planning is performed by the Program Level Test Engineers. During the planning phase, the Program Level Test Team will develop a detailed Integration Test Plan as well as identify the high-level test cases and scenarios. An illustrative pair-wise interface check is sketched at the end of this subsection.

• Responsible Team
  o Program Level Test Team
• Types of Testing Conducted
  o Build Checkout
  o Thread/End-to-End Testing
  o Exception Testing
  o Performance Testing (Load, Volume, and Stress)
  o Section 508 Testing
  o Mobile Testing
  o Infrastructure Testing
  o Interface Testing
  o Regression Testing
• Environment
  o The Independent Test environment will be utilized for Technical Integration Testing
• Entry Criteria for Technical Integration Testing
  o Software must successfully complete System Testing prior to entry into Technical Integration Testing
  o The Independent Test environment is configured and available for testing
  o A process is in place to deploy code into the Independent Test environment
• Inputs for Technical Integration Testing
  o ESDLC Requirements Document
  o ESDLC RTM from the Project Level
  o Detailed Design Specification
  o Interface Control Documents (ICDs) (if point-to-point) or API Profiles and SLA (if SOA-based interface)
  o Solution Architecture
  o System-level TAR
  o List of outstanding defects from System Test
  o Test Cases from System Testing
  o System Deployment Plan
  o Release Notes for deployment to Integration Test
  o Ensure the appropriate level of Security Authorization
• Exit Criteria from Technical Integration Testing
  o Integration testing results have been reviewed
  o Open defects have a disposition or an acceptable operational workaround is available
• Outputs from Technical Integration Testing
  o TAR
  o Test Status Reports
  o Test Scenarios
  o Test Cases
  o Meeting Minutes (Test Meetings)
  o Test Metrics
  o Defect reports
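Technical integration testing, as described in Section 3.2.1, exercises the interfaces between systems rather than the systems in isolation. The snippet below is a generic illustration of a pair-wise interface check in which one side of the interface is replaced with a mock so the contract (payload fields and status handling) can be validated early, before the partner system is available; the client class, endpoint path, and field names are hypothetical and are not drawn from any 2020 Census interface.

    import unittest
    from unittest import mock

    class ResponseIntake:
        """Hypothetical client for a downstream interface (illustrative only)."""
        def __init__(self, transport):
            self.transport = transport

        def submit(self, response_id: str, payload: dict) -> bool:
            reply = self.transport.post("/responses", {"id": response_id, **payload})
            return reply.get("status") == "accepted"

    class ResponseIntakeInterfaceTest(unittest.TestCase):
        def test_submit_sends_contracted_fields(self):
            # The partner system is mocked; the test validates the interface contract.
            transport = mock.Mock()
            transport.post.return_value = {"status": "accepted"}
            intake = ResponseIntake(transport)

            self.assertTrue(intake.submit("R-1", {"mode": "internet"}))
            transport.post.assert_called_once_with(
                "/responses", {"id": "R-1", "mode": "internet"})

    if __name__ == "__main__":
        unittest.main()

Checks of this kind complement, but do not replace, the pair-wise interface tests run against the real systems in the Independent Test environment.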
 Responsible Team o SMEs  Types of Testing Conducted Test and Evaluation Management Plan Version AVf HI(,/\ \J 2.0 PVERSIGHT 18 COMM-CB-18-0152-A-000136 2020 Census Test and Evaluation Management Plan o CAT- Output  Environment o Independent Test environment will be utilized for CAT (Output Test)  nl y Entry Criteria for CAT  O o Independent Test environment is available for testing o Software is available for testing se Inputs  Exit Criteria from CAT en tU o CAT- Output testing scenarios o Test execution has been completed for Customer Acceptance Output Testing. o Outputs from CAT Test results and summaries Se ns iti ve -F or O ffi ci al G ov er nm o Defect reports Test and Evaluation Management Plan Version AVf HI(,/\ \J 2.0 PVERSIGHT 19 COMM-CB-18-0152-A-000137 2020 Census Test and Evaluation Management Plan  nl y Pre - Production Readiness Review o Pre - Production review is the gate review after testing conducted in Independent Test environment and the start of testing in the Staging environment within the program level test. O This review assesses readiness of systems to be installed in the Staging environment. 3.2.3 Checkout & Certification Strategy  en tU se The C&C consists of the plan and procedures to ensure that the site has been properly installed and is ready for SRT approaches. The C&C Plan outlines the specific approaches to be conducted during the C&C. Responsible Team  Program Level Program Level Test Team nm o  System Checkout Test ov o er Types of Testing Conducted Staging environment will be utilized for Checkout and Certification. al o G Environment   ci Entry Criteria for Checkout and Certification The site installation and deployment task have been completed. o Initial installation of Custom Code has been completed. O ffi o -F o or Inputs C&C Test Procedures iti Exit Criteria from Checkout and Certification o The C&C procedures have been executed on the equipment installed o The applicable C&C procedures have been executed at each of the sites. o The C&C report is delivered. Se ns  ve o Ensure appropriate level of Security Authorization Test and Evaluation Management Plan Version AVf HI(,/\ \J 2.0 PVERSIGHT 20 COMM-CB-18-0152-A-000138 2020 Census Test and Evaluation Management Plan  Outputs from Checkout and Certification C&C test results o Discrepancies (e.g., missing hardware components) found during the execution of the C&C procedures. o Software defects of the 2020 Census Test custom code are not expected during the C&C but if they are discovered during the C&C, they will be documented and a defect will be created. These defects will not prevent the completion of the C&C. se O nl y o en tU 3.2.4 System Readiness Test Strategy  nm SRT will be a system-wide test to exercise the 2018 End-to-End Census Test solution to check that the system can process data through multiple response channels and can successfully communicate and transfer data between external interfaces. Program Level Program Level Test Team ov o er Responsible Team  G For further detail, look to the System Readiness Test Plan. 
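As an illustration of the kind of cross-channel check SRT performs, the sketch below sends a known test response through each response channel and confirms it reaches a downstream interface. This is a hypothetical outline only; the channel names and the send/query functions are illustrative stand-ins, not the actual SRT procedures or system interfaces.

    def check_channel(channel, send_test_response, query_downstream):
        """Send one known test response through a channel and confirm the
        downstream interface reports having received it."""
        token = send_test_response(channel)
        return query_downstream(token) is not None

    def srt_smoke_check(channels, send_test_response, query_downstream):
        results = {channel: check_channel(channel, send_test_response, query_downstream)
                   for channel in channels}
        return all(results.values()), results

    # Illustrative stubs standing in for the real channel and interface calls.
    def fake_send(channel):
        return "token-" + channel

    def fake_query(token):
        return {"token": token, "received": True}

    ok, results = srt_smoke_check(["internet", "telephone", "paper"],
                                  fake_send, fake_query)
    print(ok, results)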
• Responsible Team
  o Program Level Test Team
• Types of Testing Conducted
  o Interface Testing
  o End-to-End Testing
  o Performance Testing
  o Infrastructure Testing
• Environment
  o The Staging environment will be utilized for the System Readiness Test
• Entry Criteria for System Readiness Testing
  o Successful completion of C&C
• Inputs
  o System Readiness Test Plan
• Exit Criteria from System Readiness Testing
  o The solution as installed at the site is complete and ready for use by the Operations Staff
• Outputs from System Readiness Testing
  o SRT Test results/Report
  o Defects found during testing

3.3 Operational Level

PRR is the gate review held after the staging environment checkout testing has been conducted. This review is an assessment of test results to ensure systems are ready for Operations Testing.

3.3.1 Performance & Scalability Test Strategy

Performance and Scalability Testing validates that systems perform as needed based on the target infrastructure and projected workloads.

• Responsible Team
  o Program Level Test Team
• Types of Testing Conducted
  o Performance and Scalability Test
• Environment
  o The Staging environment will be utilized for Performance and Scalability Testing
• Entry Criteria for Performance and Scalability Testing
  o Completion of Project Level and Program Level Performance testing
• Inputs
  o Performance and Scalability Test Plan
  o Mobile Application Performance Test Plan
  o Test scenarios and test data
• Exit Criteria from Performance and Scalability Testing
  o Performance and scalability testing has been completed.
• Outputs from Performance and Scalability Testing
  o Performance and Scalability Test results
  o Defects are documented

Performance and Scalability Testing will be conducted in the following five phases:

• Phase 0: Activities planned in this phase are information gathering and discovery, analysis, and assessment.
• Phase 1: Activities planned in this phase are Performance Test Design, Baseline Performance Tests, and Isolated Performance Tests.
• Phase 2: Activities planned in this phase are Integrated Performance Tests and Business Thread End-to-End Performance Tests.
• Phase 3: In this phase, full End-to-End Performance Tests will be conducted.
• Phase 4: Activities planned in this phase are Scalability Tests, Soak Tests, and Failover Tests.

Detailed information on Performance and Scalability Testing activities will be in the Performance and Scalability Test Plan.

The Mobile Application Performance Test Plan will be derived from the Performance Test Strategy and will focus on the approach for ensuring the mobile applications perform as expected under load.

The Mobile Performance Strategy includes:

• Identifying mobile performance expectations
  o What the expectations are for mobile users to see (i.e., response time on mobile devices, etc.)
  o Measuring the real mobile end-user experience
• Executing on various mobile scenario conditions
  o Real mobile devices
  o Real mobile carriers
  o Simulated network conditions
• Collecting key mobile metrics
  o Device vitals
  o Mobile application response time (end-user experience)
• Integrating the Mobile Performance Testing tool with the traditional performance testing tool
• Troubleshooting, performance tuning, and analysis for mobile execution
• Reporting via the performance test tool dashboard

Detailed information on Mobile Application Performance Testing activities will be in the Mobile Application Performance Test Plan.

3.3.2 Operational Readiness Test Strategy

ORT is an Operations checkout that confirms the system is functioning correctly in the production environment; it occurs prior to ORR.

• Responsible Team
  o Operation Test Team
• Types of Testing Conducted
  o End-to-End
  o Interface
  o BPM Processes validation
• Environment
  o The Staging environment will be utilized for ORT
• Entry Criteria for ORT
  o Completion of System Readiness Testing
• Inputs
  o ORT Plan
• Exit Criteria from ORT
  o Refer to the Production Level Test Plan
• Outputs from ORT
  o ORT Test Results/Report
  o Defects found during testing

• Pre-ORR
  o Pre-ORR is the gate review prior to the start of testing in the Production environment. This review assesses the readiness of systems to be installed in the Production environment.

3.3.3 System Checkout

System Checkout is an Operations checkout that confirms the system is functioning correctly in the production environment; it occurs prior to ORR.

• Responsible Team
  o Operation Team
• Types of Testing Conducted
  o System Checkout
• Environment
  o The Production environment will be utilized for System Checkout
• Entry Criteria for Testing
  o Completion of ORT in the Staging Environment
• Inputs
  o ORT Plan
• Exit Criteria from System Checkout
  o Ensure appropriate level of Security Authorization
  o Refer to the Production Level Test Plan
• Outputs from System Checkout
  o Checkout Test Results/Report
  o Defects found during testing

• ORR
  o ORR is the gate review before going live.

ORR validates that all components of an operation are ready before the operation is conducted. It is a final check that needed resources (i.e., people, systems, processes, and facilities) have been acquired and developed. The ORR process ensures expectations are set early and met before the conduct of the operation. Interim ORR activities that occur throughout the Product Generation Life Cycle (PgLC) ensure early identification and mitigation of risks and issues. Those activities also foster collaboration between the program, operations, and systems.

3.4 Test Types

The following table depicts which test types are executed in each test approach. Appendix A, Decennial 2020 Test Type Definitions, contains the test levels/test types and environments associated with these approaches.
Table 3: Test Categories Mapped to Test Approach

• Development Test: Unit Testing; Project-Level Integration Testing; Security Control Testing support
• Project Test: Thread/End-to-End Testing; Usability Testing; Functional Testing; Interface Testing; Performance Testing; Path Testing; Application Regression Testing; Infrastructure Testing; User Acceptance Testing support; Customer Acceptance Testing support
• Program Level Test: Thread/End-to-End Testing; Interface Testing; Performance Testing (Load, Volume, Stress); Infrastructure Testing; Regression Testing; Build Checkout; Section 508 Testing; Mobile Testing; Exception Testing; System Checkout Test; Security Control Testing support; Customer Acceptance Testing support
• Operational Test: End-to-End Testing; Interface Testing; System Checkout Test; Performance and Scalability Testing; BPM Processes validation

4 Security Control Assessment

The 2020 Census Test and Evaluation Management Plan is designed to capture the overarching test strategy for the 2020 Decennial Census program and does not address a single system; therefore, there is not an applicable security assessment performed by the Office of Information Security (OIS) to describe in this section.

The following paragraphs do, however, address the responsibilities of the Security Team and the Test Teams as they relate to the security requirements of the program, specifically the authorization of use of the environments and ensuring security controls are in place in the systems design. Security control requirements are defined by OIS, per the required security controls for Census Bureau information systems, and are elaborated in the Security Management section of the 2020 Census SEMP document.

4.1 Security Assessment of the Environments

There are two types of approvals, driven by the Security Integrated Product Team (IPT), for the environments defined in the TEMP: Authority to Test (ATT) and Authority to Operate (ATO).

4.1.1 Authority to Test

All systems must obtain an ATT before testing begins. An ATT request is completed for any component, application, or system in development that meets one of the following criteria:

• Testing involves the use of production data or information types specified in National Institute of Standards and Technology (NIST) SP 800-60 Vol II
• Testing involves a new platform (new operating system or major upgrade of an existing platform)
• The component, application, or product is identified for an enterprise deployment

Note that all Decennial systems, by definition, meet these criteria for testing and thus should plan to request an ATT (a minimal illustration of this check appears at the end of Section 4.1).

4.1.2 Authority to Operate

The ATO assessment is performed by an independent agent in the operational (production) environment under the guidance of the Office of Information Security. This level of security testing is not covered by the TEMP, but it governs the ongoing operational use of the system, including Assessment and Authorization (A&A).
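The following is a minimal, hypothetical sketch of the ATT decision described in Section 4.1.1, expressed as a simple predicate over the three criteria; it is illustrative only and is not OIS tooling.

    def needs_att(uses_production_data: bool,
                  new_platform_or_major_upgrade: bool,
                  enterprise_deployment: bool) -> bool:
        """A system requires an ATT if it meets any one of the three criteria."""
        return (uses_production_data
                or new_platform_or_major_upgrade
                or enterprise_deployment)

    # Per the note above, Decennial systems meet the criteria by definition,
    # so the check returns True for them (example inputs are illustrative).
    print(needs_att(uses_production_data=True,
                    new_platform_or_major_upgrade=False,
                    enterprise_deployment=True))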
4.2 Security Controls Testing

Security Controls testing will be conducted at the project level by the Project Test Teams and by the Security IPT in the appropriate environment for their scenarios. The Program Level Test Team will support the Security IPT as needed to verify security test scenarios.

Project Teams shall design and develop a solution that conforms to Bureau security standards. The security approach and testing must:

• Comply with Federal guidelines, including NIST, U.S. Department of Commerce (Commerce), and Bureau security standards as defined in the Bureau IT Security Program Policy (ITSSP)
• Incorporate technical controls appropriate for the assigned NIST system security categorization
• Leverage existing Bureau-wide security services to the fullest extent possible

OIS has identified three categories of security requirements that apply to Census systems depending on the sensitivity of the data these systems process. The specific details of these requirements are subject to change based on the ongoing upgrades and enhancements to enterprise security within the Bureau. As a system enters the "Initiation" phase of the ESDLC, the Chief Security Engineer will work with the Project Team to establish which of the following three categories applies to the system in question and assign the specific details of the security requirements for incorporation into the system requirements specification.

• Platform Testing without Production Data – In this scenario, only dummy test data will be used for testing purposes. The objective is to test the functionality of system security features and the overall installation. This reflects a comprehensive, but less stringent, level of security requirement.
• Platform Testing with Production Data – In this scenario, actual production test data will be used for testing purposes, provided it is not Title 13, Title 26, PII, or designated as sensitive data. The objective, again, is to test the functionality of system security features and the overall installation. This reflects a comprehensive, but stringent, level of security requirement.
• Platform Testing with PII, Title Data, and/or Sensitive Production Data – Actual protected production data will be used for testing purposes in specific circumstances. Any system identified for this type of security testing will necessarily be assigned the most comprehensive list of security requirements and will be subjected to the most thorough and stringent validation and testing.

5 Additional Information

This section details Test Documentation and Assumptions and Constraints.

5.1 Test Documentation

The following table (Table 4) and figure (Figure 4) summarize all work products and deliverables covered under the TEMP. The document tree (depicted in Figure 4) is a graphical representation of the document hierarchy. Deliverables are shown in green, and work products are shown in blue as Special Interest Artifacts and Procedures.

The Census Bureau will have access to both the work products and the formal deliverables created by the Program Level Test Team.
Both types of documents are created collaboratively to meet both Customer and Working Team needs. Work product documents are more dynamic, more current, and more flexible to adjust to current information or changes as needed.

Formal deliverables are work products that go through an additional process, including management within the team schedule, to ensure higher visibility and confirmation of content. Changes to a formal deliverable require a change request (CR) and additional process, which in some cases causes the official version to be out-of-date; therefore, this process is applied to documents that are not expected to change frequently.

Table 4: Test Documentation

• 2020 Census Program Test and Evaluation Management Plan (TEMP) – Deliverable. The purpose of the 2020 Census Program Test and Evaluation Management Plan (TEMP) is to communicate the overarching test approach for the 2020 Decennial Program.
• Performance Test Strategy (Note: includes Mobile Performance Testing) – Deliverable. The Performance Test Strategy is intended to describe the overall performance test approach, from component level to system level to operational performance testing, based on the 2020 Census models and approach.
• Performance & Scalability Test Plan – Deliverable. The Performance & Scalability Test Plan will be the approach, schedule, objectives, and plan for the Performance & Scalability testing planned for the Staging Environment.
• Mobile Application Performance Test Plan – The Mobile Application Performance Test Plan will be derived from the Performance Test Strategy and will focus on the approach for ensuring the mobile applications perform as expected under load.
• Test Analysis Reports – Deliverable. The Test Analysis Reports will summarize test objectives, execution status, defect status, and disposition results based on a set of testing (i.e., by Increment, Release, or event).
• Decennial 2020 Integration and Test Framework – Deliverable. The Test Framework is a graphical representation of the test teams, environments, types of testing, and test events leading to Operations.
• Test Status Report Template – Work Product. The Test Status Report Template is a template to be used to communicate Program Level Test status to the Census Bureau stakeholder.
• Test Status Reports – Work Product. The Test Status Reports will be a summary of the test execution metrics, the defect metrics, schedule status, and issues provided at regular intervals to ensure transparency with the Program Level Test progress throughout the releases and events.
• Test Scenarios – Work Product. Test Scenarios are the individual test objectives to be exercised within a certain release and type of testing (i.e., submit an internet form with a valid address and internet access code, and process through the internet solution as a nominal scenario). A test scenario may have one or more test cases associated with it.
• Test Cases – Work Product. The test cases are the set of valid and invalid executable procedures that are part of the test scenarios (i.e., submit an internet form with the maximum number of respondents).
• Meeting Minutes (Test Meetings) – Work Product. Meeting Minutes will capture the topics and outcomes of the various test meetings.
• Defect Remediation Plan – Work Product. The Defect Remediation Plan describes how the Program Level Test Team will record, prioritize, communicate, and disposition defects with the customer and project teams.
• Test Metrics – Work Product. The test metrics will reflect the execution status and defect status of the various test scenarios throughout the test releases and events. They may be incorporated into the Test Status Report and/or meetings (a minimal illustration follows this table).
• Checkout and Certification Plan – Work Product. The Checkout and Certification Plan outlines the specific approaches to be conducted during the C&C.
• System Readiness Test Plan – Work Product. The System Readiness Test Plan is the approach, objectives, and schedule specifically focused on the System Readiness Test, which will be executed to ensure the system is configured and ready for production.
• Integration & Test Plan – Work Product. The Integration & Test Plan contains the detailed test plan for the Program Level Test Team test types specified within the Decennial 2020 Integration and Test Framework.
• Automation Test Plan – Work Product. The Automation Test Plan will describe the test automation framework, the approach to determine which scenarios will be automated, and how to incorporate the automation into the test process.
• Build Schedule – Work Product. The build schedule is a list of the various builds, the dates when they will be installed on the TI Independent Test Environment, and the functionality and/or defect fixes included.
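The sketch below illustrates, with a hypothetical record layout, how test-case execution results could be rolled up into the kind of execution and defect metrics described for the Test Metrics and Test Status Reports work products above; it is not the program's actual reporting tooling.

    from collections import Counter

    # Hypothetical test-case execution records (scenario names are examples).
    test_cases = [
        {"scenario": "Internet self-response with valid access code", "result": "pass"},
        {"scenario": "Internet self-response with valid access code", "result": "fail"},
        {"scenario": "Paper form with maximum number of respondents", "result": "pass"},
        {"scenario": "Paper form with maximum number of respondents", "result": "blocked"},
    ]

    def execution_metrics(cases):
        """Summarize execution status across a set of test cases."""
        counts = Counter(case["result"] for case in cases)
        executed = counts["pass"] + counts["fail"]
        return {
            "total_cases": len(cases),
            "executed": executed,
            "pass_rate": counts["pass"] / executed if executed else 0.0,
            "blocked": counts["blocked"],
        }

    print(execution_metrics(test_cases))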
Figure 4: Test Documentation Tree (a document hierarchy diagram showing the 2020 Census Test & Evaluation Management Plan (TEMP) at the top level, with the subordinate test plans, test procedures, test scenarios, and test cases beneath it; the key distinguishes deliverables from work products)

5.2 Assumptions & Constraints

The assumptions and constraints below are used to plan the test strategy, Program Level Test schedules, and Program Level Test plans. As the assumptions and constraints change, there may be impacts to the planning and approach of the Program Level Test Team, which will be reflected in change requests (CRs) to update the plans accordingly.

5.2.1 Test Assumptions

1. The TI Task 007 will receive simulated data sets from Task #004 that are appropriate for robust and comprehensive testing procedures.
2. Releases of projects within and across programs (e.g., Census Enterprise Data Collection and Processing (CEDCaP) and Decennial) are coordinated and properly planned in the Integration and Implementation Plan (IIP) to facilitate stable baselines for testing activities.
3. The Test Environment is dedicated to the purposes of the Integration & Test Team.
4. The Test Environment can support both functional end-to-end testing and performance testing.
5. The Test Environment mirrors the Production Environment in relation to the version of the infrastructure (e.g., cloud vs. data center), operating system, and COTS software.
6. There is a Test Environment that contains the official version of the software application as released via Configuration Management.
7. All systems identified in the scope have passed previous testing activities, including Unit, Project Integration, System, and Project CAT, prior to the TRR for the release in which they are included.
8. The Mobile Device Management (MDM) solution is installed on all mobile devices during mobile performance testing.
9. All mobile applications needed for the 2020 Census are installed on the phone within the MDM solution.
10. The Test Environment is sized to approximate the volume and scale of a full production system.
11. An approved defect-tracking system is used to capture and track defects.
12. The Integrator will adopt all USCB-level testing policies, procedures, and tools as directed by the 2020 Census TI GPMO.
13. Project-Level test results are available to the TI prior to beginning a system test.
14. Mobile device requirements and footprints have been baselined prior to the ORR for Release A of the 2018 End-to-End Census Test.
15. Program-Level requirements will be provided to the TI 30 days post-award.
16. Code will be delivered as part of the Census Program Increment Release Plan, where applicable, to the Test Environment, and the final version of the code will be delivered prior to TRR for each of the 2018 End-to-End Census Test systems.
17. Mobile devices will be provided to the Program Level Test Team for test purposes.
18. The Program Level Test Team will not directly participate in the 2017 Census Test; however, the Integrator will travel to observe the testing for purposes of gathering lessons learned to apply to the 2018 End-to-End Census Test.

5.2.2 Test Constraints

The constraints for the test approach include:

• Test activities are not planned for the Production environment.

6 Points of Contact

Below is the list of Points of Contact.

Table 5: Points of Contact

• Role: Stakeholder. Responsibility: Census Coordinator for Program Level Test Team. Name: Beverly Harris. Organization: Decennial Information Technology Division. Phone: 301-763-2606. Email: Beverly.Anne.Harris@census.gov
• Role: Program Level Test Lead. Responsibility: Coordination of Program Level Test Planning and Execution. Name: Trong Bui. Organization: TI Contract. Phone: 240-608-6811. Email: trongkhuong.bui@vidoori.com

APPENDIX A  Decennial 2020 Test Type Definitions

The definitions below list, for each test approach grouping, the test types conducted, a description of each, and the responsible and supporting stakeholders, along with the environment used.

Development Test (Development environment)
• Unit Test – Unit testing tests individual software components as an integral and continual activity of system development. Responsible: Project Development Team.
• Project-Level Integration Testing – Validates software modules created from the integration of previously unit-tested software, to validate proper integration. Responsible: Project Development Team.
• Security Controls Testing – Focuses on validating that security requirements have been satisfied. Responsible: Security Team. Supporting: Project Test Team.
Project Level System Test (Project Test environment; responsible: Project Test Team) – System Testing is used to determine if the product performs the business functions as documented.
• Functional Testing – Functional testing is a process used during software development to verify that the software conforms to all requirements. Functional testing validates the functionality specified in the functional requirements.
• Usability Testing – Checks whether the user interface is easy to use and understand; it is concerned mainly with the use of the application and its usability.
• Performance Testing – Validates that the system can perform as designed based on the existing or target infrastructure and projected workloads and user load. It also determines that the existing infrastructure does not introduce any security issues under load.
• Interface Testing (pair-wise internal and external components) – Focuses on the functionality of the interfaces between project systems.
• Application Regression Testing – Validates that already-existing functionality, which was not changed for a baseline, is working as designed and that new code has had no negative impact on it. This type of testing generally has some basic business function scripts; these scripts are generally automated as much as possible to limit resource utilization.
• Infrastructure Testing – Infrastructure refers to the hardware, software, networks, data centers, facilities, and related equipment used to develop, test, operate, monitor, manage, and/or support information technology services within the Census Enterprise.
• End-to-End Testing – End-to-End integration testing validates interfaces between systems, and that the sub-system interfaces external to the program function and communicate properly in combination with each other.

User Acceptance Test (UAT)
• UAT – UAT is performed by the system users to solicit feedback from users. Responsible: System Users. Supporting: Project Test Team (if necessary).

Customer Acceptance Test (CAT)
• CAT – Instrument – CAT-IT is intended to demonstrate the graphical user interface (GUI) to SME personnel to elicit their feedback and approval. Responsible: SMEs. Supporting: Project Team.

Program-Level Integration Test (Independent Test environment; responsible: Program Level Test Team)
• Security Controls Testing – Focuses on validating that security requirements have been satisfied. Responsible: Security Team.
• Build Checkout – Build Checkout tests focus on providing consistent software deployment results and confirmation that the system is ready for testing.
• Thread/End-to-End Testing – End-to-End tests are designed to assure that the system flows, or threads, are systematically exercised to make sure that the systems have been integrated successfully and, ultimately, that the business process need has been met.
• Exception Testing – Focuses on the exception scenarios for the Census SoS system behavior and the handling of exception scenarios across business process scenarios.
• Interface Testing – Focuses on the functionality of the interfaces between project systems.
• Section 508 Testing – Tests user interfaces to ensure they meet Section 508 requirements.
• Mobile Testing – Focuses on systems that need to be exercised on mobile devices.
• Regression Testing – Consists of test cases that ensure the system functions as expected after major releases.
• Performance Testing (Load, Volume, Stress) – Performance measurements are executed under production-like conditions and at certain times during system operations.
• Infrastructure Testing (Continuous Operation, Backup and Recovery) – Ensures continuity of infrastructure operations remains in place through failures such as network, hardware, and power related issues.
• Customer Acceptance Test (CAT) – Output Test – CAT-OT testing validates that the back ends are storing and updating data as expected through an SME-led demonstration of end-to-end capabilities. Responsible: SMEs. Supporting: Program Level Test Team.

Checkout and Certification (Staging environment; responsible: Program Level Test Team)
• System Checkout Test – C&C testing is an engineering checkout that confirms the system has been installed correctly.
• Infrastructure Testing – Ensures continuity of operations remains in place through failures such as network, hardware, and power related issues.

System Readiness Test (SRT) (Staging environment; responsible: Program Level Test Team) – SRT is a system-wide test to exercise the 2018 End-to-End Census Test solution to check that the system can process data through multiple response channels and can successfully communicate and transfer data between external interfaces.
• Interface Testing – Focuses on the functionality of the interfaces between project systems.
• End-to-End Testing – Designed to assure that the system flows, or threads, are systematically exercised to make sure that the systems have been integrated successfully and, ultimately, that the business process need has been met. Responsible: Development Team, Project Test Team, and Environment Infrastructure Team. Supporting: Program Level Test Team.
• Performance Testing – Performance measurements are executed under production-like conditions and at certain times during system operations.
Performance & Scalability Test (operational level; Staging environment; responsible: Program Level Test Team)
• Performance & Scalability Test – Performance and Scalability Testing validates that systems perform as needed based on the target infrastructure and projected workloads.

Operational Readiness Test (ORT) (operational level; responsible: Operation Test Team; supporting: System Users, DSC Helpdesk Team (Field Testing only), Program Level Test Team)
• ORT – Confirms the functional operation of the system by accurately executing and transmitting the tasks identified in the Business Process Models (BPMs), in an environment similar to Production.
• End-to-End Testing – Exercises the Business Process Models (BPMs) for each operation and tests inputs and outputs from start until the end delivery. The testing exercises all operational components of the system to ensure that they can fully operate during Production.
• BPM Processes Validation – Confirms that the process flow for each task identified in the BPM diagrams is executed and meets the business requirements. Testing of these tasks will ensure that all processes and operational materials can be used in Production. This testing focuses on the business processes used in Production, with a functional focus.

System Checkout (operational level; Production environment; responsible: Operation Team)
• System Checkout – This testing is a checkout that confirms the system has been installed correctly in the production environment.
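The regression and automation definitions above note that regression scripts are automated where possible. The following is a minimal, hypothetical illustration of such an automated regression check, written with Python's standard unittest module; it is not the program's actual automation framework, and the function under test is an invented stand-in.

    import unittest

    def validate_access_code(code: str) -> bool:
        """Invented stand-in for existing, unchanged functionality:
        accepts only 12-character alphanumeric access codes."""
        return len(code) == 12 and code.isalnum()

    class AccessCodeRegressionTest(unittest.TestCase):
        # Regression cases confirm previously working behavior still holds
        # after a new release.
        def test_valid_code_still_accepted(self):
            self.assertTrue(validate_access_code("ABC123DEF456"))

        def test_short_code_still_rejected(self):
            self.assertFalse(validate_access_code("ABC123"))

    if __name__ == "__main__":
        unittest.main()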
APPENDIX B  Acronyms

• A&A – Assessment and Authorization
• ADREC – Administrative Records
• ALM – Application Lifecycle Management
• ATO – Authority to Operate
• ATT – Authority to Test
• CAT – Customer Acceptance Test
• CEDCaP – Census Enterprise Data Collection and Processing
• CI – Continuous Integration
• CR – Change Request
• C&C – Checkout & Certification
• ECaSE-OCS – Enterprise Census and Survey Enabling Platform – Operational Control System
• HP – Hewlett Packard
• IIP – Integration and Implementation Plan
• IPT – Integrated Product Team
• MDM – Mobile Device Management
• NIST – National Institute of Standards and Technology
• NRFU – Non-Response Follow-Up
• OCS – Operational Control System
• OIS – Office of Information Security
• ORR – Operation Readiness Review
• ORT – Operational Readiness Testing
• PgLC – Product Generation Life Cycle
• PII – Personally Identifiable Information
• PRR – Production Readiness Review
• QA – Quality Assurance
• RTM – Requirement Traceability Matrix
• SDLC – Software Development Life Cycle
• SEIT – Systems Engineering Integration Test
• SME – Subject Matter Expert
• SoS – System of Systems
• SRT – System Readiness Test
• TAR – Test Analysis Report
• TEMP – Test and Evaluation Management Plan
• TI – Technical Integrator
• TRR – Test Readiness Review
• UAT – User Acceptance Test
• UFT – Unified Functional Tester

APPENDIX C  Referenced Documents

This appendix provides the list of referenced documents in the TEMP.

• Automation Test Plan – Planned delivery date: TBD
• Defect Remediation Plan – Planned delivery date: TBD
• Integration Test Plan – Planned delivery date: TBD
• Performance and Scalability Test Plan – Planned delivery date of 1/30/17
• Mobile Application Performance Test Plan – Planned delivery date of 3/31/17

Revised Baseline of the Census Enterprise Data Collection and Processing
December 2017

Summary

In early January 2017, a draft program office estimate (POE) developed by the CEDCaP Program Office and an independent cost estimate (ICE) developed by the Census Bureau's Office of Cost Estimation, Analysis and Assessment (OCEAA) suggested significant cost growth. Through March, the CEDCaP Program Office and OCEAA have refined and reconciled both the program office estimate and the independent cost estimate to ensure incorporation of all programmatic decisions. Both estimating teams collected the project requirements and data inputs simultaneously to ensure they were estimating the same program. They used different methodologies or tools on most cost elements to ensure a reasonableness test was completed, and to identify areas of risk.
The POE primarily used a parametric methodology, while the ICE used a combination of parametric estimation and extrapolation from actuals.1 The final estimates confirm that the projected costs increased compared to the baseline from May 2016. The baseline estimate provided in May 2016 was $656.4 million and the final reestimate in March 2017 is $965.2 million, a difference of $308.8 million, or a 47% increase.

1 As an example of the detailed steps in producing these estimates, the POE process included 1) collecting and digesting available documentation, 2) developing an understanding of the project scope, 3) identifying applicable cost elements, 4) sizing project requirements, 5) quantifying risks and adjusting estimate inputs accordingly, 6) developing cost models, and 7) reviewing and finalizing costs with project teams.

Root causes

The causes of the growth of the CEDCaP estimate and the shift to prioritize CEDCaP's focus on the 2020 Census are closely interrelated. We cannot separately quantify the impact of each element described in this document on the differences between the previous and current estimates. As an example, program integration/management costs were not appropriately reflected in the original estimates, which meant that the CEDCaP PgMO was not resourced as needed and thus was not fully integrated with the 2020 Census program from a schedule perspective. Budget shortfalls on the 2020 Census led to insufficient resources to define some operations, which meant that requirements and schedule were not defined, which in turn led to a misalignment of requirements and schedule between the 2020 Census program and the CEDCaP program.

a) Deficiencies in the May 2016 Cost Baseline

There were several deficiencies in the May 2016 program office estimate that have since been remedied. The May 2016 estimate was developed in FY 2013 to support the FY 2015 Congressional Budget Request. It was not developed by certified cost estimators following best practices. It was not accompanied by an independent cost estimate to identify gaps in the estimate, and it was not guided by a detailed Cost Analysis Requirements Document that laid out a clear plan for the program that could serve as the basis of the estimates. The March 2017 program office estimate and accompanying ICE remedied these issues and represent a significant increase in reliability over the May 2016 estimate.

The initial estimate was developed, project by project, by the teams leading each of the 13 original individual projects that comprised the program before the Commercial Off the Shelf (COTS) Capabilities Assessment and Analysis (CCAA), colloquially known as a build-buy decision, based on what they knew about the CEDCaP concept at that point in time. The budget estimates were aggregated and formed the first cost estimate. The estimates were based on contract costs and Federal employee level of effort for each project. A detailed Cost Analysis Requirements Document (CARD) was not developed to document the program baseline and guide the cost estimation process. The subject matter experts were not trained cost estimators. Most importantly, from a cost perspective, CEDCaP was not treated as an integrated program, but instead as a compilation of individual projects. This estimate did not include the cost of the overall management of the technical, business, and operational integration across the portfolio of individual CEDCaP projects.
It did not include the cost of a program management office to provide integrated requirements management and change control, integrated schedule management, or program-level risk management.

Additionally, the initial estimate did not benefit from having an ICE. The ICE was produced by the Census Bureau's Office of Cost Estimation, Analysis and Assessment in 2015 to support the Department of Commerce's milestone review board process. The ICE total estimate was $1.2 billion through FY 2021, nearly double the program office estimate from 2013. The ICE was also requested by and delivered to the GAO in January 2016 in the course of their audit of the CEDCaP program.

The 2015 ICE suffered from some of the challenges of the program office estimate. Without a CARD to document the programmatic baseline and thereby ensure that the program office and the independent cost estimators were in fact estimating the same set of requirements, the program manager and the Census CEDCaP Executive Steering Committee could not rely on the ICE at that time to estimate growth in program office costs. So while the ICE was superior to the initial program budget estimate methodology, including its use of parametric estimation and inputs such as source lines of code (SLOC) instead of subject matter expert judgments on level of effort, there were concerns about the reliability of both sets of estimates.

Beginning September 2015, the Census Bureau conducted an analysis of alternatives that resulted in the decision to pursue a hybrid contractor/government solution for CEDCaP capabilities. A request for quotes was released on 9/2/2015, and purchase orders were awarded to purchase software licenses from five vendors on 9/30/2015. The Census Bureau reduced the five vendors to two after an initial evaluation phase and engaged Carnegie Mellon University's Software Engineering Institute to assist with a Commercial Off the Shelf software capability analysis, which culminated in a decision announced in May 2016. Once that decision was made, contractors were on-boarded and the Census Bureau began building a new Program Office Estimate and an Independent Cost Estimate, efforts which, for a program of this size and complexity, take approximately six to nine months.
These estimates were finalized by the end of March 2017.

b) Requirements and Schedule

The March 2017 CEDCaP program office estimate also reflects significant changes to the program compared to the estimate last reported in May 2016. The most significant of these changes resulted from the Commercial Off the Shelf (COTS) Capabilities Assessment and Analysis (CCAA). The assessment framework was comprised of two major elements: 1) assessing the most viable COTS products against the CEDCaP capability needs, and 2) comparing the COTS products to the current infrastructure and custom solutions. This work was completed in April 2016, and a decision was announced in May 2016 that the Census Bureau would use a hybrid COTS solution going forward to deliver some of the CEDCaP capabilities, specifically those for data collection.

The CCAA focused on the core operational control capabilities provided within CEDCaP and the prioritized subset of the corresponding 2020 Census requirements, and it used specifications from the 2016 Census Test to evaluate cost, schedule, and technical considerations. At that time, there was less certainty about the lower-level business rules and user stories needed for subsequent Census Tests and the 2020 Census. As highlighted by the Government Accountability Office (GAO), there was also a disconnect between the CEDCaP and 2020 Census schedules. As a result, the cost, schedule, and technical analysis in the CCAA was based on 1) a subset of the requirements and business rules used for the proof-of-concept systems and 2) the high-level delivery/product release timeline in CEDCaP's Transition Plan, not the more detailed schedules in the 2020 Census's Integrated Master Schedule.

After the May 2016 decision, the CEDCaP program worked with the 2020 Census program and the selected vendor, Pegasystems (Pega), to finalize the detailed business rules and user stories and confirm the schedule for the systems supporting the 2020 Census, the 2017 Census Test, and the 2018 End-to-End Census Test. The agile software development process used by CEDCaP and Pegasystems calls for detailed business rules and user stories to configure the Pega product. This refinement at the low level, ongoing since the May 2016 decision, added a layer of complexity to CEDCaP, added user stories, and required an earlier delivery of the needed capabilities than documented in the CEDCaP Transition Plan and planned for by the projects.

While the 2020 Census program had assembled business requirements from several tests using Census Bureau-built prototypes, more work than expected was performed by the government and vendor staff to transform these business requirements into user stories that could be used in the agile software development process. The number of user stories developed from the 2020 Census program's business rules has exceeded the number of user stories that the vendor Pegasystems was exposed to during the CCAA. As a result, cost growth in the CEDCaP program is attributable to a larger number of user stories than was previously estimated during the CCAA. While there was an expectation that the estimate used in the CCAA analysis would change once the selected vendor was exposed to all information regarding the schedule and detailed technical specifications required to deliver all the necessary functionality prior to the 2018 End-to-End Census Test, the gap was more than estimated at the time of the CCAA.

After the CCAA and related decision, and after determining the refined 2020 Census business rules, user stories, and schedule, work began on developing the detailed cost analysis documentation for CEDCaP, including updating the CEDCaP estimate that was put in place when the program began as a separate budget initiative in October 2014. This work revealed that the change in priority to deliver the 2020 Census first, the integrated schedule impacts, and the low-level scope decisions had increased costs for CEDCaP.

c) Insufficient Resources To Define All Operations Timely

Compounding the issues described above, the 2020 Census has not received its entire budget requests in FY 2015 and FY 2016 necessary to build out the operational business rules, user stories, and schedule for every 2020 Census operation, which in turn drive the establishment of CEDCaP business requirements. The 2020 Census program has had to continually delay work on coverage measurement (also known as the Post-Enumeration Survey), Group Quarters enumeration, and unique non-response follow-up procedures for some special operations in order to get the necessary work done in the priority areas within the existing resources.
The Census Bureau has been logically and responsibly focusing on business requirements that will both cover the largest segments of the population and bring the Bureau the largest return on investment in automation. Some operations, or segments of operations, that do not have mature business requirements (e.g., that cannot be readily translated to user stories for the agile development process) can be efficiently removed from the CEDCaP automation workload. While this will add to the cost of the 2020 Census field operations, particularly in 2019 and 2020, the 2020 Census program will not sacrifice the quality of the count and will conduct a well-designed and complete 2020 Census.

Actions and Proposals to Control Future Cost Growth

a) Improvements to the Reliability of the CEDCaP Estimates

To address these gaps, the Census Bureau has worked to advance the CEDCaP program cost estimate based on sound estimation methodology. The Census Bureau's Office of Cost Estimation, Analysis and Assessment (OCEAA), which produced the ICE, and the Census Information Technology Directorate (ITD) cost estimation team, which produced the Program Office Estimate, had a shared understanding of the required program capabilities. OCEAA and the ITD team jointly met with the CEDCaP program teams and conducted several reconciliation meetings to ensure a consistent program baseline. Both sets of estimates were produced by certified cost estimators (one in the case of the POE and five for the ICE). Both sets of estimates include the appropriate funding estimates to cover the Census Program Management Office to manage requirements, schedule, risk, and budget. The CEDCaP program developed a CARD in tandem with the POE and ICE. The program manager and the Census Executive Steering Committee are confident that the reconciliation process completed by OCEAA and the ITD team adds rigor to the reliability of the estimates.
The CEDCaP deliveries for the 2020 Census program are integrated into the 2020 Census IMS. The 2020 Census Program drives the schedule for all solutions that support it, including CEDCaP. The key milestones of the 2020 Census therefore become key milestones for CEDCaP. The Census Bureau also maintains an integrated, comprehensive risk register for the 2020 Census and within it, CEDCaP. Regular risk reviews are held by senior 2020 Census leadership to examine and mitigate risks. Additional measures in place to monitor performance include regular review of the detailed business rules, user stories and the development backlog to ensure only what is necessary and critical to the 2020 Census program is developed. c) Corrective Action for Requirements The 2020 Census program has and will continue to prioritize its operational needs for CEDCaP and implement only the critical functionality needed to complete the 2020 Census operations. The 2020 Census program is prioritizing the business rules and user stories to focus the automation for the targeted functionality. Key business leads have collaborated to remove non-priority user stories from the development backlog and ensure focus on delivering the core functionality needed to execute the 2020 Census Program. For example, a decision was researched and presented to the Census Bureau 2020 Executive Steering Committee, and finalized to remove requirements for the Automated Listing and Mapping from the vendor’s user story backlog for the 2020 Census. The Census Bureau will 5 VERSIGHT COMM-CB-18-0152-A-000168 revert to the proven technology for this function. The 2020 Census will use LIMA (Listing and Mapping Application), which is the solution for the current surveys and was successful in the 2016 Addressing Canvassing test and the address canvassing portion of the 2018 End to End Census test. Other reductions to the requirements backlog are not dramatic, but are critical to controlling cost. The 2020 Census program manager and the team of 2020 Census subject matter experts are reviewing the requirements backlogs of all the CEDCaP functionalities. The requirements backlog of the Internet Self Response capability has been reviewed and reduced. The 2020 Census program manager, in collaboration with experts from the Field Directorate, are reviewing and reducing the requirements backlogs in the field operational control capability and the enumeration capability. While the reductions are decreasing the scope of CEDCaP in meaningful ways with respect to controlling cost, the quality of the 2020 Census is being considered in each decision. The Census Bureau will not sacrifice a high-quality 2020 Census. Another action the Census Bureau has taken to control costs and maintain the critical path of the 2020 Census in light of budget shortfalls is to delay CEDCaP requirements development for the current surveys including both demographic and economic surveys until the development of functionality for the 2020 Census is nearly complete. The objective of the 2018 End-to-End Census Test is to field test the major operations of the 2020 Census along with the automation. FY 2019 funding is reserved in CEDCaP to address deficiencies uncovered during the 2018 Endto-End Census Test and to complete the work necessary to ensure all the capabilities appropriately and successfully scale for nationwide data collection and processing. 
The program office estimate includes minimal transition planning through FY 2020 for the current surveys/enterprise and concentrates resources on CEDCaP's support of the 2020 Census and the 2017 Economic Census.

d) Corrective Action for Budget Shortfalls and Controlling Future Cost Growth

As mentioned in the previous section, 2020 Census budget shortfalls in FY 2015 and FY 2016, and constraints associated with the timing of funding in FY 2017, have delayed the operational planning associated with the Post-Enumeration Survey, Group Quarters enumeration, and the non-response follow-up procedures for some unique operations. As a result of these delays, the business rules were not available to be translated by business analysts into the user stories the CEDCaP developers needed in order to build the requirements for these operations. Instead, the Census Bureau will revert to proven paper-based processes and legacy systems. Reverting to paper-based operations for a subset of smaller operations will not harm the quality of the census, because we are reverting to proven solutions.

New Estimates of the Total Project Costs

The following new estimate of the total project costs stems from the program office estimate, which has been reconciled with the independent cost estimate. Changes made to the program office estimate during the reconciliation process increased the estimate, but they were necessary to ensure the estimate was comprehensive.

($000s)
Fiscal Year    Original Estimate    Revised Estimate    Difference
2015                  $ 66,174            $ 65,855       $   (319)
2016                  $ 77,623            $117,864       $  40,241
2017                  $104,045            $185,173       $  81,128
2018²                 $136,311            $160,402       $  24,091
2019                  $105,847            $143,201       $  37,354
2020                  $ 79,548            $176,704       $  97,156
2021                  $ 86,870            $115,998       $  29,128
Total                 $656,418            $965,197       $ 308,779

See Appendix A for the Original and Revised total cost estimates with further detail by capability.

Conclusion

A number of programmatic and technical factors led to the CEDCaP program cost increase, including an incomplete initial cost estimation process, re-prioritization of scope, and schedule disconnects. Additionally, the Census Bureau believes it has the governance, change control, and management structure in place to effectively address the risk of future cost growth. These structures are on top of the measures already in place to control costs, which are the result of a robust program analysis that is integrated across the CEDCaP Program Management Office and the 2020 Census Program.
Text for Footnote 2 (see col 1 in table above) missing from draft

Appendix A: Original and Revised Total Costs

Original Program of Record – May 2016, Total Costs FY15–21 ($000s)

CEDCaP Capability (EDCADS/DECENNIAL)                     DM&E       O&M   Lifecycle
Electronic Correspondence Portal                        8,296     4,292      12,588
Survey Response Processing                             33,578    13,483      47,061
Internet & Mobile Data Collection                      35,338     8,899      44,237
Address Listing & Mapping                              24,698     3,662      28,360
Questionnaire Design & Metadata                        13,627     9,207      22,834
Scanning Data Capture from Paper                       23,184    20,773      43,957
Decennial Scale-up                                     27,826   117,967     145,793
Centralized Development & Testing Environment          10,084     8,375      18,459
Survey (and Listing) Interview Operational Control     56,021    46,399     102,420
Centralized Operational Analysis & Control             46,157    31,031      77,188
Service Oriented Architecture                          47,003    43,985      90,988
Dashboard for Monitoring                               14,103     8,430      22,533
TOTAL                                                $339,915  $316,503    $656,418

Revised Program of Record – May 2017 (Decennial & Other Surveys), Total Costs FY15–21 ($000s)

CEDCaP Capability (EDCADS/DECENNIAL)                Planning       DME       O&M   Lifecycle
Electronic Correspondence Portal                                32,326     2,276      34,602
Survey Response Processing                                      51,367     4,800      56,167
Internet & Mobile Data Collection                               13,336         0      13,336
Address Listing & Mapping                                       18,426         0      18,426
Questionnaire Design & Metadata                                 11,427         0      11,427
Scanning Data Capture from Paper                                65,824    11,464      77,288
Platform Implementation Team (Planning)               31,356         0         1      31,357
PIT - Internet Self-Response (ISR)                              49,889     2,880      52,769
PIT - Enumeration (ENUM)                                        66,225     2,916      69,141
PIT - Operational Control Systems (OCS)                         71,025     3,856      74,881
PIT - Address Listing & Mapping (LiMA/ALM)                      90,858    11,148     102,006
PIT - MOJO                                                      68,329     2,960      71,289
Scale-Up                                                        56,639   145,634     202,273
Centralized Development & Testing Environment                   19,159         0      19,159
Survey (and Listing) Interview Operational Control              17,547         0      17,547
Centralized Operational Analysis & Control                      18,495         0      18,495
Service Oriented Architecture                                   35,064         0      35,064
Dashboard for Monitoring                                         5,630         0       5,630
Adaptive Survey Design                                          53,176     1,164      54,340
Total                                                $31,356  $744,742  $189,099    $965,197

Revised Decennial Sub-Set Program – FY15–21 ($000s)

CEDCaP Capability (EDCADS/DECENNIAL)                Planning       DME       O&M   Lifecycle
Electronic Correspondence Portal                                     0         0           0
Survey Response Processing                                      46,389         0      46,389
Internet & Mobile Data Collection                                    0         0           0
Address Listing & Mapping                                       18,426         0      18,426
Questionnaire Design & Metadata                                      0         0           0
Scanning Data Capture from Paper                                63,026         0      63,026
Platform Implementation Team (Planning)               31,356         0         0      31,356
PIT - Internet Self-Response (ISR)                              40,491     1,940      42,431
PIT - Enumeration (ENUM)                                        54,456     1,918      56,374
PIT - Operational Control Systems (OCS)                         56,133     2,502      58,635
PIT - Address Listing & Mapping (LiMA/ALM)                      75,198     5,800      80,998
PIT - MOJO                                                      67,502         0      67,502
Scale-Up                                                        56,639   145,634     202,273
Centralized Development & Testing Environment                   16,769         0      16,769
Survey (and Listing) Interview Operational Control              17,547         0      17,547
Centralized Operational Analysis & Control                      18,495         0      18,495
Service Oriented Architecture                                   31,062         0      31,062
Dashboard for Monitoring                                         5,630         0       5,630
Adaptive Survey Design                                          40,454         0      40,454
Total                                                $31,356  $608,217  $157,794    $797,367

Revised Other Surveys Sub-Set Program – FY15–21 ($000s)

CEDCaP Capability (EDCADS/DECENNIAL)                Planning       DME       O&M   Lifecycle
Electronic Correspondence Portal                                32,326     2,276      34,602
Survey Response Processing                                       4,978     4,800       9,778
Internet & Mobile Data Collection                               13,336         0      13,336
Address Listing & Mapping                                            0         0           0
Questionnaire Design & Metadata                                 11,427         0      11,427
Scanning Data Capture from Paper                                 2,798    11,464      14,262
Platform Implementation Team (Planning)                              0         1           1
PIT - Internet Self-Response (ISR)                               9,398       940      10,338
PIT - Enumeration (ENUM)                                        11,769       998      12,767
PIT - Operational Control Systems (OCS)                         14,892     1,354      16,246
PIT - Address Listing & Mapping (LiMA/ALM)                      15,660     5,348      21,008
PIT - MOJO                                                         827     2,960       3,787
Scale-Up                                                             0         0           0
Centralized Development & Testing Environment                    2,390         0       2,390
Survey (and Listing) Interview Operational Control                   0         0           0
Centralized Operational Analysis & Control                           0         0           0
Service Oriented Architecture                                    4,002         0       4,002
Dashboard for Monitoring                                             0         0           0
Adaptive Survey Design                                          12,722     1,164      13,886
Total                                                      -  $136,525   $31,305    $167,830

The Decennial and Other Surveys sub-set tables decompose the Revised Program of Record: for each capability, the two sub-set rows sum to the corresponding row of the revised table above.
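The cost tables above are internally consistent: in the fiscal-year table, each Difference is the Revised Estimate minus the Original Estimate and the Total row is the sum of the fiscal years, while in the Appendix A tables each Lifecycle value is the sum of the Planning, DME, and O&M columns. The short sketch below (Python, illustrative only; the figures are transcribed from the tables above and expressed in thousands of dollars, and the check is not part of the Census Bureau's estimation process) shows that arithmetic:

    # Illustrative consistency check for the CEDCaP cost figures above.
    # All figures are in thousands of dollars, transcribed from the tables;
    # this is not an official Census Bureau calculation.
    fiscal_years = {
        # fiscal year: (original estimate, revised estimate)
        2015: (66_174, 65_855),
        2016: (77_623, 117_864),
        2017: (104_045, 185_173),
        2018: (136_311, 160_402),
        2019: (105_847, 143_201),
        2020: (79_548, 176_704),
        2021: (86_870, 115_998),
    }

    # Difference column: revised minus original for each fiscal year.
    for year, (original, revised) in fiscal_years.items():
        print(f"FY{year}: difference = {revised - original:+,}")

    # Total row: sum of the fiscal years for each column.
    total_original = sum(o for o, _ in fiscal_years.values())
    total_revised = sum(r for _, r in fiscal_years.values())
    print(f"Totals: {total_original:,} original, {total_revised:,} revised, "
          f"{total_revised - total_original:+,} difference")
    # Expected: Totals: 656,418 original, 965,197 revised, +308,779 difference

    # Appendix A rows follow Lifecycle = Planning + DME + O&M, e.g. the
    # Platform Implementation Team (Planning) row of the revised program:
    planning, dme, om = 31_356, 0, 1
    assert planning + dme + om == 31_357  # matches the Lifecycle column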
PRE-DECISIONAL - 2020 Census Systems

The systems status matrix tracks each system across the 2018 End-to-End Census Test releases (In-Office Address Canvassing (Release I), Recruiting Release 1, Training Release 1, Release A, Recruiting Release 2, Training Release 2, and Releases C, D, and E), its Authority to Operate (ATO) status, and the 2020 Census releases (Releases 1 through 4), with Test Readiness Review, Production Readiness Review, and Conduct Operation dates and an indication of which systems are in production for each release.

2020 Census systems (52):
2020 Website (G), ATAC (Automated Tracking and Control), BARCA (G) (Block Assessment, Research, and Classification Application), CAES (Concurrent Analysis and Estimation System), CaRDS (Control and Response Data System), CBS (Commerce Business System), CDL (Census Data Lake), CEDSCI (Center for Enterprise Dissemination Services and Consumer Innovation), CEM (Customer Experience Management), CENDOCS (Census Document System), Centurion, CHEC (Census Hiring and Employment Check System), CHRIS (Census Human Resources Information System), CIRA (Census Image Retrieval Application), CQA (Census Questionnaire Assistance), CRM (Customer Relationship Management), DAPPS (Decennial Applicant, Personnel and Payroll Systems), Desktop Services, DMP (Data Management Platform), DRPS (Decennial Response Processing System), DPACS (Decennial Physical Access Control System (PACS)), DSC (Decennial Service Center), ECaSE ENUM (Enterprise Censuses and Surveys Enabling Platform - Enumeration), ECaSE FLD OCS (Enterprise Censuses and Surveys Enabling Platform - Field Operation Control System), ECaSE ISR (Enterprise Censuses and Surveys Enabling Platform - Internet Self-Response), ECaSE OCS (Enterprise Censuses and Surveys Enabling Platform - Operational Control System), FDS (Fraud Detection System), Geospatial Services, GUPS (Geographic Update Partnership Software), iCADE (Integrated Computer Assisted Data Entry), IDMS (Identity Management System), ILMS (Integrated Logistics Management System), IPTS (Intelligent Postal Tracking System), LiMA (Listing and Mapping Application), MaCS (Matching and Coding Software), MAF/TIGER (Master Address File/Topologically Integrated Geographic Encoding and Referencing Database), MCM (Mobile Case Management), MOJO Optimizer/Modeler (MOJO - Optimizer/Modeling), MOJO Recruiting Dashboard (G), NPC Printing (Printing at the National Processing Center), OneForm Designer Plus (G), PEARSIS (Production Environment for Administrative Records, Staging, Integration, and Storage), PES Clerical Match and Map Update (Post-Enumeration Survey - Clerical Matching System and Map Update), PES Imputation and Estimation (Post-Enumeration Survey - Imputation and Estimation System), PES PCS (Post-Enumeration Survey - Processing and Control System), R&A (Recruiting and Assessment), RTNP (Real Time Non-ID Processing), SMaRCS (Sampling, Matching, Reviewing, and Coding System), SOA (Service Oriented Architecture), Tabulation (Decennial Tabulation System), UTS (Unified Tracking System), WebTQA (Web Telephone Questionnaire Assistance).

Acquired and support systems (10):
CES (Center for Economic Studies), CFS Hotline (Census Field Supervisor Hotline), Commercial Printing, dDaaS (Decennial Device as a Service), DSSD (Decennial Statistical Studies Division), ENS (Emergency Notification System), Fingerprint Vendor, NPC (National Processing Center), POP (Population Division), Sunflower.

External systems (16):
Bureau of the Fiscal Service, CLER (Centralized Enrollment Clearinghouse System), DHS USCIS (U.S. Citizenship and Immigration Services), EWS (Equifax Workforce Solutions), FBI (Federal Bureau of Investigation), Federal LTC Insurance (LTC Partners/B58 Federal Long Term Care Insurance Program), NARA (National Archives and Records Administration), NFC (National Finance Center), NGA Imagery Service (National Geospatial-Intelligence Agency Imagery Service), OCSE (Office of Child Support Enforcement), OPM (Office of Personnel Management), RITS (Retirement and Insurance Transfer System), SSA (Social Security Administration), USACCESS, USPS (United States Postal Service), WebTA (Web Time and Attendance System).

KEY:
‡ = GQ Workload/Advance Contact was moved from Release C-1 to Release C-3; to avoid confusion, Release C-1 was removed
X = Participated/Will participate in Census/Census Test (with TRR information)
G = Release G for Geographic Programs
* = GAO identified system development and integration testing as "complete"
SIA = Security Impact Assessment (SIA) required to determine if reauthorization is necessary
CR = Change Request pending
Color coding: Lt. Blue = 2020 Census system, not included in the 2018 End-to-End Census Test; Grey = Not applicable; Purple = CEDCaP system; Red = At risk of meeting Conduct Operation date; Yellow = At risk of meeting Test Readiness Review (TRR) date; Orange = At risk of meeting Production Readiness Review (PRR) date; Green = In production. Additional markers indicate that an ATO status has been updated and that a system's status has changed from At Risk to On Track.

REASONS FOR AT RISK STATUS: B = Budget/Resources; S = Schedule; T = Technical Blockers

Sources: Systems List v.41 and 2018 IIP v.58.0; 2020 IIP v.02_6

2018 End-to-End Census Test Releases:
Release I (In-Office Address Canvassing (AdCan))
Recruiting Release 1 (AdCan Recruiting)
Training Release 1 (AdCan Training)
Release A (In-Field AdCan); TRR 1 = final functionality for all AdCan systems except ECaSE, LiMA/MCM, and UTS; TRR 2 = final ECaSE, LiMA/MCM, and UTS
Recruiting Release 2 (Field Enumeration Recruiting)
Training Release 2 (Nonresponse Followup (NRFU) Training)
Release C-2 (Self-Response; includes Printing/Mailing/Workload and Census Questionnaire Assistance (CQA)/Self-Response)
Release C-3 (Group Quarters (GQ) Workload/Advanced Contact/All GQ)
Release D-1 (Field Enumeration; includes Update Leave/NRFU/Coverage Improvement operations)
Release D-2 (GQ eResponse/GQ Enumeration/Service Based Enumeration)
Release E-1 (Tabulation and Dissemination - Residual Coding)
Release E-2 (Tabulation and Dissemination - Post Capture Data Interface, Primary Selection Algorithm, Census Unedited File/Fraud Detection)
Release E-3 (Tabulation and Dissemination - Census Edited File, Micro Data File, Disseminate redistricting data required by Public Law 94-171)

2020 Census Releases:
Release 1 (Recruiting for all positions/AdCan Recruiting; Selection/Hiring/Training of Recruiting Assistants, Partnership Assistants, ...)
Release 2 (Address Canvassing selection of Census Field Supervisors, Enumerators, and Listers; Post-Enumeration Survey (PES) Sample Release: Initial Sample for PES; AdCan Training; In-Field Address Canvassing; Peak Operation ...)
Release 3 (Advertising and Earned Media; Housing Unit Count Review; Peak Operation Training (includes UL/GQ/Update Enumerate (UE)/NRFU); PES - Independent Listing Training; PES - Independent Listing; GQ Workload and Advanced Contact/CQA Training/Printing and Mailing Workload; Remote Alaska; Island Areas Censuses; Enumeration at Transitory Locations; Self-Response (includes Mailing/Self-Response/CQA/Coverage Improvement); Peak Operations (includes UL/UE/GQ/SBE/Early NRFU/NRFU); PES - Person Interview; PES - Initial Housing Unit Follow-up; PES - Person Interview Matching (E-Sample ID, Computer Matching, Before Followup Clerical ...)
Release 4 (Tabulation/Dissemination; Archiving; Federally Affiliated Count Overseas; Redistricting Data; PES - Person Follow-up; Count Question Resolution; PES - Final Housing Unit Follow-up; PES - Reports and Release ...)

As of 3/5/2018

United States House Committee on Oversight and Government Reform
2020 Census Status Update

2018 Census End-to-End Test Objectives
Test and validate 2020 Census operations, procedures, systems, and field infrastructure together to ensure proper integration and conformance with functional and nonfunctional requirements.

Goals of today's update:
• Provide a comprehensive update on Census Bureau systems readiness for the 2018 Test and the 2020 Census
• Provide context, current status, and a clear comparison with the GAO statement of readiness of Census IT systems
• Provide an update on cyber security and fraud plans and progress
• Provide a current overview of the changes in operations, controls implemented, and governance processes related to the revised baseline for CEDCaP
• Answer any questions necessary to provide the Committee with a level of comfort that the Census Bureau is on track to meet the objectives of the 2018 Census Test and is ultimately in a position to execute a successful 2020 Census

The 2020 Census: Where Are We Now
[Timeline graphic, 2013-2021: estimate on-the-ground workloads and define the operations and systems needed for the census; develop and award major contracts for the systems that will support the census; put field infrastructure and offices in place throughout the country; key census activities started in 2015 and continue through 2021; Address Canvassing and Peak Operations shown leading up to April 1, 2020, Census Day.]

Key Activities:
• Making Key Decisions: Continuously make timely decisions based on research and evidence
• Awarding Key Contracts: Continue to award key contracts for the 2020 Census
• Planning and Execution of the 2018 End-to-End Census Test: Focuses on the overall integration of systems and operational procedures for 24 of the 35 operations of the 2020 Census

U.S. CENSUS BUREAU census.gov On-the-