Library of Congress
Office of the Inspector General
NOT FOR PUBLIC RELEASE
2016-IT-102
February 2017

OFFICE OF THE INSPECTOR GENERAL
LIBRARY OF CONGRESS
101 INDEPENDENCE AVE., WASHINGTON, D.C. 20540

February 17, 2017

MEMORANDUM FOR: Dr. Carla D. Hayden, Librarian of Congress

FROM: Kurt W. Hyde, Inspector General

SUBJECT: Audit Report No. 2016-IT-102, FY 2016 Review of Systems Development Life Cycle

This transmits the audit report summarizing the results of Kearney & Company's (Kearney) FY 2016 Review of Systems Development Life Cycle. The Executive Summary begins on page i, and the full text of Kearney's report begins in Appendix A. Management's response to the recommendations appears in Appendix B. This report is not for public release.

Based on management's written responses to the draft report, we consider all of the recommendations resolved. Please provide, within 30 calendar days, a corrective action plan addressing implementation of the recommendations, including an implementation date, in accordance with LCR 2023-9, Rights and Responsibilities of Library Employees to the Inspector General.

We appreciate the cooperation and courtesies extended by the Copyright Office, Library Services, the Office of the Chief Information Officer and its Web Services team, and other units within the Library during this review.

cc: Deputy Librarian

Table of Contents
Appendix A: Kearney & Company FY 2016 Review of Systems Development Life Cycle
Appendix B: Management's Response

Appendix A: Kearney & Company FY 2016 Review of Systems Development Life Cycle
Executive Summary

For the Library of Congress (Library) to achieve a secure, efficient, and effective portfolio of business and program applications, its information system policies and procedures must establish a framework of sound System Development Life Cycle (SDLC) practices. Senior management must complement SDLC practices with an effective Project Management Life Cycle (PMLC) process that provides thorough development oversight, full investment transparency, and periodic variance analysis.

As part of the Office of the Inspector General's (OIG) ongoing audit emphasis on the Library's information technology (IT) governance, operations, and best practices, OIG engaged Kearney & Company, P.C. (Kearney) to assess the Library's SDLC processes. This assessment involved a review of three recent system development efforts within the Library: the U.S. Copyright Office's (Copyright) Electronic Licensing System (eLi), Library Services' Overseas Field Office Replacement System (OFORS), and the Office of the Chief Information Officer's (OCIO) Congress.gov.[1]

Agencies with successful system development results employ SDLC practices that take new system concepts and products through clearly defined phases, including planning, requirements gathering, designing, building, and testing, to deliver quality systems within investment and development targets. Inadequately developed IT systems expose agencies to waste throughout the system life cycle (development, operations and maintenance, and retirement) and to weaknesses in data confidentiality, availability, and integrity.

At the time the service units initiated each of the systems development efforts reviewed in this report (fiscal years 2010-2012), Library management began to implement disjointed elements of information systems governance. Those elements included establishing an IT Steering Committee, a Project Management Office website, and an SDLC methodology. Despite establishing those elements, Library senior management at the time made it optional for service unit management to comply with prescribed SDLC/PMLC practices and requirements when funding new system development from their base budgets. In addition, top Library officials at that time did not hire a qualified CIO. These issues were identified in previous OIG reports.[2]

Beginning in FY 2014, new Library leadership reorganized its IT resources into an infrastructure service unit. The changes initiated by that reorganization continued with the recent Library Special Announcement 16-9 advising that the CIO reports directly to the Librarian. The realignment of the CIO role at the Library is consistent with the principles of the Clinger-Cohen Act[3] and should further serve to break down the silos that previously fostered waste and an absence of transparency for IT expenditures and investments. OIG believes this top-down leadership approach should result in substantial benefits to the service units, such as greater accountability and performance. Progress in the delivery of efficient and secure information systems will demonstrate to Congress that appropriating funds to the Library for new system investments will deliver the promised investment results.

[1] Congress.gov was initiated in 2012 by the Library's Web Governance Board (WGB) and continues to be overseen by the WGB. Development and implementation has been managed by Web Services, originally part of the Office of Strategic Initiatives (OSI) and currently part of OCIO, since project inception.
[2] See: Audit Report No. [...]-101, Report on the Design of Library-wide Internal Controls for Tracking Investments, March 2015; Audit Report No. [...], The Library Needs to Determine an eDeposit and eCollections Strategy, April 2015; and Report No. 2015-IT-101, Benchmarking the Library of Congress Information Technology Fiscal Year 2014 Budgetary Obligations and Human Capital, March 2016.
[3] The law requires each agency head to establish clear accountability for IT management activities by appointing an agency Chief Information Officer (CIO) with the visibility and management responsibilities necessary to carry out the specific provisions of the Act.

What the Assessment Found

In summary, Kearney determined that two of the three systems reviewed did not establish and utilize SDLC practices from the outset of development activities. As a result, key program and project controls were not instituted early on in the eLi and OFORS projects. In contrast, Kearney observed SDLC practices being enforced and followed for the Congress.gov project, which was managed by the OCIO (formerly ITS).

Further, contracts for system development work for eLi and OFORS did not require vendors to comply with systems development best practices. Without SDLC compliance requirements, contractors with fixed-price contracts may seek to strictly meet contractual requirements and save costs on internal controls and quality compliance. At the time the service units let the contracts, the Library's contracts office and Contracting Officer's Representatives did not demonstrate the knowledge, skills, and abilities to enforce best practices for system development projects. Specifically, Kearney found that:

- Copyright did not follow sound SDLC methodologies, which resulted in its scrapping the eLi project development after six years and $11.5 million in project expenditures. The eLi project began in 2010 with a budget approval of [...] and increased to approximately [...] for full implementation in 2012. Ultimately, Copyright spent over [...] through 2016, when it decided to terminate the contracts and abandon development activities. During that six-year period, Copyright continued to report in eLCplans (the Library's performance management system) that eLi development was occurring near or on schedule. Neither Copyright Licensing nor its contractors made use of appropriate SDLC standards during the eLi development.
Continuous failure of vendor-developed software to meet Copyright Licensing requirements was attributed to poor requirements and software code management, both key elements of effective SDLC management. Most evident in the eLi project was a lack of demonstrated project management skills. Copyright did not ensure it properly controlled the project and its contractor. Additionally, Copyright did not employ an earned value management approach to proactively identify cost overruns and plan corrective actions. An appendix to OMB Circular A-11 defines earned value management as a management tool used to mitigate risks in developing capital assets.

- Library Services did not follow sound SDLC methodologies, which resulted in late and incomplete deployment of OFORS with inadequate security. Library Services initiated the OFORS development program in 2010 with an approximate budget of [...]. An additional congressional budget request was denied in 2011, leaving Library Services to provide that additional funding from base program sources. Library Services did not mandate the use of SDLC standards during the OFORS development efforts, either internally or by the contract firms performing the work. The absence of SDLC requirements management and product testing led to the vendor delivering an incomplete system, resulting in legal action by the Library. Completion of all the requirements in the originally designed system is now forecasted for the end of 2017. Although the development work for OFORS remains relatively near budget to date, the absence of the required functionality and the delays until the end of 2017 will contribute to additional internal project costs along with undelivered operating improvements. During the six years of development, Library Services was not consistently reporting the system's development status and performance in eLCplans.
Because of incomplete security requirements and controls implementation, security issues relating to OFORS were further identified for remediation. The most significant security issues identified related to inconsistent server configurations and vulnerability scanning across the foreign offices. The required remediation activities were agreed to with the system owner and project team.

- OCIO's development of Congress.gov resulted in system delivery on time, within investment budget, and with limited post-implementation security repairs required. Congress.gov development was initiated in 2010. As part of the OCIO (formerly ITS), the project team adopted the available SDLC and PMLC policies and standards. Congress.gov development was performed using an iterative (or Sprint) methodology, which delivers packages of incremental functionality in a manner prioritized and communicated with the users providing their requirements. OCIO adequately monitored the development and implementation of requirements for completeness and user acceptance.

Recommendations

With the recent Library reorganization placing all IT oversight under the OCIO, the CIO should focus on improving the Library's system development practices and compliance. The CIO should undertake a review of all systems development work currently in planning or in progress, compile an inventory of the projects, and evaluate their compliance with Library technology investment, SDLC, and PMLC standards. The CIO should share the review's results and evaluation with the Strategic Planning and Performance Management Office, Budget Office, and Financial Reports Office, as well as with the Librarian's Office and Executive Committee.

While not noted in this report as a non-compliance issue, the OIG has stated in various reports that improvement opportunities exist for capturing and reporting full-time employee costs related to specific development projects.
OCIO, the Office of the Chief Financial Officer, and Human Resources Services should collaborate on finding solutions for financial system tracking of employee costs involved in new system development.

Management Response

In response to the draft report (see Appendix B), the Library's senior leadership agreed with all of the recommendations. Our office acknowledges and appreciates the comments from the Copyright Office on the recommendations, along with their concurrence. With regard to Copyright's comments, the review and conditions noted are clear that Copyright executives at that time did not disclose in the Library's performance management system (eLCplans) and annual Congressional Budget Justifications the magnitude of issues and cost overruns related to the project. As a result, Congress and Library executives did not have adequate information to timely act on and address the issues.

Library of Congress
Performance Audit of the System Development Life Cycle Practices and IT Security Assessments
Report Date: January 4, 2017
KEARNEY & COMPANY

Points of Contact: William Kubisraf, Partner; Phil Moore, Partner
1701 Duke Street, Suite 500, Alexandria, VA 22314
703-931-5600, fax 703-931-3655

Library of Congress Performance Audit of SDLC and IT Security -- Audit Report

TABLE OF CONTENTS
COVER LETTER
OBJECTIVES
BACKGROUND
SUMMARY OF AUDIT RESULTS
LIBRARY-WIDE (SYSTEMIC ISSUES)
ELECTRONIC LICENSING SYSTEM (ELI)
OVERSEAS FIELD OFFICE REPLACEMENT SYSTEM (OFORS)
CONGRESS.GOV
AUDIT METHODOLOGY AND SCOPE
DETAILED AUDIT FINDINGS - ELI
DETAILED AUDIT FINDINGS - OFORS
DETAILED AUDIT FINDINGS - CONGRESS.GOV

COVER LETTER

January 4, 2017

Kurt W. Hyde
Inspector General
Library of Congress
101 Independence Ave SE
Washington, D.C. 20540

Dear Mr. Hyde,

Kearney & Company, P.C. (Kearney) has performed an audit of the Library of Congress' (LOC) System Development Life Cycle (SDLC) practices and Information Technology (IT) security of the Copyright Office's Electronic Licensing System (eLi), the Library Services Overseas Field Office Replacement System (OFORS), and the Congressional Research Service's Congress.gov website and supporting applications. This performance audit, conducted under Contract No. [...], was designed to meet the objectives identified in the "Objectives" section of this report. Kearney conducted this performance audit in accordance with Generally Accepted Government Auditing Standards (GAGAS), 2011 Revision, issued by the Government Accountability Office (GAO).

The purpose of this report is to communicate the results of Kearney's performance audit, as well as our related findings and recommendations. This report includes language that is intended solely for the information and use of LOC and is not intended to be and should not be used by anyone other than these specified parties.

An audit involves performing procedures to obtain evidence about the performance of a program. The procedures selected depend on the auditor's judgment, including an assessment of the risks of system development and IT security, whether due to fraud or error. An audit also includes evaluating the appropriateness of policies used and the reasonableness of decisions made by management, as well as evaluating the overall presentation of assertions made by management.

Based on our audit work, we concluded that the Copyright Office did not have the appropriate project management and contracting procedures in place to ensure system development delivery of required technical elements within the development timeframe and project budget. The Copyright Office did not have a project oversight function to evaluate development delays, additional funding requests, and recommended courses of action.
The Copyright Office did not define project management oversight responsibilities for third-party vendors and did not contractually define vendor technical deliverables, timelines, and project management activities. As a result, the Copyright Office ceased current development activities in October 2016 after six years and approximately $11 million of expenditures.

Kearney also concluded that the Library Services service unit did not have the appropriate project management and contracting procedures in place to ensure system development delivery of required technical elements within the development timeframe and project budget. In addition, the Library Services service unit did not have a project oversight function to evaluate development delays, additional funding requests, and recommended courses of action until fiscal year (FY) 2016. Library Services did not define project management oversight responsibilities for third-party vendors and did not contractually define vendor technical deliverables, timelines, and project management activities. Library Services has halted development efforts and is requesting a project remediation plan from the vendor for missing and/or late deliverables. While the vendor delivered the base product functionality, the vendor has not delivered key user functional requirements related to the Printing, Binding, and Managing Suppliers Inventory and In-Transit processes.

Kearney also concluded that the Congress.gov project team did have the appropriate project management and contracting procedures in place to ensure system development delivery of required technical elements within the development timeframe and project budget.

Kearney appreciates the cooperation provided by LOC personnel during our audit.

Sincerely,
Kearney & Company, P.C.
January 4, 2017

OBJECTIVES

The Library of Congress's (referred to as "LOC" or "the Library") Office of Inspector General (OIG) contracted Kearney & Company, P.C. (referred to as "Kearney," "we," and "our") to conduct a performance audit on three LOC system development efforts:

- The Electronic Licensing System (eLi), managed by the Copyright Office (USCO)
- The Overseas Field Office Replacement System (OFORS), managed by the Overseas Operations Division (OvOp) of Library Services
- The Congress.gov website, formerly managed by the Congressional Research Service and now managed by the LOC Office of the Chief Information Officer (OCIO).

During the performance audit, Kearney evaluated the Library's information technology (IT) system development practices, known in industry as the Systems Development Life Cycle (SDLC), and reviewed critical IT security elements for the programs mentioned above, using Federal standards and industry best practices as a benchmark. Additionally, Kearney evaluated the appropriateness of IT-related policies against recognized industry best practices, the reasonableness of management decisions, and the performance of SDLC and IT security programs against LOC's own policy framework.

BACKGROUND

USCO

The USCO is one of eight LOC service units. The USCO administers United States copyright laws, including: registration; the recordation of titles and licenses; a number of statutory licensing provisions; and the collection, investment, and disbursement of copyright fees. The USCO employs approximately 420 employees, including 25 IT employees assisting with IT oversight and daily operations, business analysis, and project and contract management. The USCO receives funding from two different sources: annual appropriations and a congressionally mandated ceiling of collected fees under Title 17 of the United States Code (U.S.C.). In fiscal year (FY) 2016, Congress appropriated $23 million and authorized expenditures up to a $36 million fee ceiling under Title 17.
The USCO is exempt from complying with LOC OCIO system development policies when funding that development from its base budget. The USCO manages and operates 17 applications in support of its mission. The service unit last managed an initial system development project in 2008.

Library Services

Library Services is one of eight service units and supports the mission of LOC through the acquisition, cataloging, preservation, and referencing services of traditional and digital collections. Library Services employs approximately 1,300 employees, including 60 IT employees. Library Services receives funding via annual appropriations. In FY 2016, through congressional appropriations, LOC allotted Library Services $214 million. Library Services is exempt from complying with LOC OCIO system development policies when funding those systems out of its base budget. Library Services manages and operates 33 applications in support of its mission.

OCIO

OCIO[1] supports the mission of LOC by providing IT strategic direction, leadership, services, and capabilities. OCIO employs approximately 295 employees. The LOC OCIO receives funding via annual appropriations. In FY 2016, through congressional appropriations, LOC allotted OCIO $85 million.

SUMMARY OF AUDIT RESULTS

While Kearney conducted performance audits for each of these three systems (eLi, OFORS, and Congress.gov) individually, we noted the following overarching factors that affected some, if not all, of the programs reviewed:

- LOC did not have fully developed SDLC policies, procedures, and oversight practices established at inception of eLi and OFORS development activities. As LOC developed a more specific SDLC, the new policies were not applied retroactively to existing development activities
- The eLi and OFORS projects were not subject to the oversight and mandates of OCIO guidance at the time they were initiated.
  The system-owning service units had the authority to develop these systems on their own and, therefore, moved forward without requesting guidance from OCIO or its IT Steering Committee (ITSC), or developing comparable project oversight policies
- The eLi and OFORS-related contracts did not have clearly defined requirements, deliverables, or timelines, indicating a lack of standardized IT-related contract templates or coordination with OCIO or the Library's Office of Contracts and Grants Management
- The eLi and OFORS projects did not define required project management and contractor oversight responsibilities
- The eLi and OFORS projects did not create a management body to evaluate development delays, additional funding requests, and recommended courses of action. eLi and OFORS project management and the respective service units did not clearly and timely report project delays and funding needs in excess of approved amounts to LOC management and the budget office.

Below is a summary of audit findings, broken out by system. Detailed findings for each audited system are listed in Appendix A.

[1] Web Services, which manages Congress.gov, began under the Office of Strategic Initiatives (OSI), which later became OCIO. OCIO was initially aligned as a unit under the Office of the Librarian/Office of the Chief Operating Officer (OCOO) in FY 2015. In September 2016, OCIO became its own service unit directly reporting to the Librarian of Congress.

Library-Wide (Systemic Issues)

LOC created the ITSC in 2010 by establishing a charter and broadly defined processes, roles, and responsibilities of the Committee and service units related to IT system development activities. The following timeline identifies key milestones in the development of IT system development authorities, responsibilities, and oversight activities.
Exhibit 1: ITSC Development Timeline (timeline graphic, 2010-2015)

Program Management

At the inception of the eLi and OFORS development process, LOC did not have fully developed SDLC oversight policies, procedures, and practices. As LOC developed more specific SDLC policies and procedures, the new policies were not applied retroactively to existing development activities. Because of this policy gap, both system development activities proceeded without the structure of a comprehensive LOC oversight framework. Additionally, neither service unit responsible for the system development activities adopted industry best practices. Compounding this oversight, neither responsible service unit retroactively applied new LOC policies and procedures as they were implemented. As a result, eLi and OFORS project management did not report development delays, vendor performance issues, key contract modifications (e.g., terms, billing conventions, technical milestones, and contract value), and cost increases to LOC, budget oversight, and the service units. Currently, both project management teams have halted development activities, and neither project team has deployed a system that met user-defined functionality.

Project Management

At the inception of the eLi and OFORS development process, LOC did not have fully developed SDLC project management policies, procedures, and practices. As LOC developed more specific SDLC project management policies and procedures, the new policies were not applied retroactively to existing development activities.
Because of this policy gap, both system development activities proceeded without clearly defining the roles and responsibilities of the project management teams, the minimum documentation necessary to demonstrate compliance with policy, and the degree and detail of vendor oversight as part of a comprehensive LOC project management framework. Additionally, neither service unit responsible for the system development activities adopted industry best practices. Compounding this oversight, neither responsible service unit retroactively applied new LOC policies and procedures as they were implemented. As a result, eLi and OFORS project management did not consistently perform vendor oversight, document vendor oversight when performed, or request vendor documentation and project plans necessary to effectively execute project oversight activities. As discussed in the subsequent Contract Issues section, project management teams and Contracting Officers (CO) did not jointly ensure that vendor technical deliverables, project timelines, and milestones were appropriately included in vendor contracts. Currently, both project management teams have halted development activities, and neither project team has deployed a system that met user-defined functionality.

Contract Issues

The following key contracting elements were not present to establish vendor accountability for the eLi and OFORS projects:

- Requirements for project management best practices, customer oversight, and acceptance
- Technical requirement details to ensure user functionality
- Contractor deliverable details (e.g., software source code, programmer's documentation)
- Vendor oversight requirements
- Interim and final review criteria (e.g., milestones and expectations for development at those milestones)
- A clearly defined technical framework, the absence of which resulted in the inability to match technical requirements to deliverables.

Security Issues

There were no significant Library-wide (systemic) findings in this audit area.
Recommendations

1. LOC should compare current SDLC policies and procedures to industry best practices to ensure development risks are actively monitored and managed
2. LOC should monitor current SDLC activity and environmental factors as part of a structured risk assessment framework to ensure policies and procedures identify and address emerging issues and new risks
3. COs and Contracting Officer's Representatives (COR) should collaboratively identify standard SDLC contract elements, including vendor timelines, technical deliverables, required documentation, and internal review and acceptance procedures, as well as ensure that SDLC contracts contain these elements
4. LOC should develop policies that clearly delineate the oversight approval required for additional funding requests, contract modifications, delivery delays, and inability to meet original technical requirements in all LOC service units
5. LOC should clarify funding sources and status-of-funds reporting requirements.

Electronic Licensing System (eLi)

The Licensing Division began development of the eLi system in 2010. eLi is intended to streamline the receipt of Copyright royalty payments and the management of Copyright royalty investment accounts. The initially approved contract budget was approximately $1.1 million, which was subsequently increased to approximately $2 million in 2011. To date, the USCO has spent over $11.6 million with third-party vendors to develop this system. In October 2016, the USCO cancelled the developer's contract prior to deployment of this project. The USCO is currently evaluating options regarding next steps, including identifying what elements are functional or recoverable.

Exhibit 2: eLi Development Timeline (timeline graphic, 2010-2016; development halted in October 2016; cumulative eLi contract costs approximately $11.6 million)

The USCO embarked upon this development activity without specific and detailed policies, procedures, guidelines, and responsibilities related to program and project management. This lack of process guidance and accountability resulted from the failure to proactively address an existing gap in governance policies.

The USCO project management team did not demonstrate effective, proactive project cost management practices. Over the six-year development period, USCO project management expended $11.6 million in vendor costs. The USCO project management team received specific funding for approximately $1.9 million in the first two years of the project. USCO project management did not update project budgets for the subsequent six years of development activity, nor did it perform an analysis of estimated cost overruns. Subsequent development funding activities occurred that were inconsistent with initial funding requests. As discussed below, the USCO had no management body to evaluate and approve additional funding requests in conjunction with experienced development delays, analyses, and recommended courses of action. Additionally, the USCO did not have an oversight body with authority to halt project activities based on cost overruns, delivery delays, and/or lack of functionality until appropriate remediation plans or project management structure was in place.

Further, the USCO annual budget requests did not convey the eLi development challenges.
Below is a summary of the USCO's annual Congressional Budget Justifications (CBJ) regarding eLi. These project status summaries are inconsistent with the actual project delays, cost overruns, and the eventual halt in development activities.

Exhibit 3: Licensing Division Reporting for eLi in the CBJ

CBJ Year | Priority Activities Comments (Licensing Reengineering/eLi)
2010 | Begin business process reengineering project, to be fully implemented in FY 2012.
2011 | Licensing will continue its reengineering efforts with the goal of fully implementing the process in FY 2012.
2012 | Licensing will continue implementing and refining the reengineered processes and system.
2013 | Licensing will continue implementing and refining the reengineered processes and system.
2014 | Licensing will continue implementing and refining the reengineered processes and system.
2015 | Licensing will continue implementing and refining the reengineered processes and eLi.
2016 | Licensing will continue to work toward a fully automated system for receiving and examining Statements of Account.
2017 | Licensing will continue to work toward a fully automated system for examining and making available Statements of Account.

From 2010 to 2016, the USCO reported eLi strategic planning information to the Library's planning office via eLCplans. eLCplans is a centralized database tool that houses strategic planning information and annual performance data, automating the annual planning and performance process. In 2010, 2014, and 2016, the USCO reported no performance metrics or goals for eLi. Exhibit 4 provides a summary of the years in which the USCO reported performance metrics for eLi in eLCplans. These self-reported metrics are inconsistent with actual project delays, cost overruns, and the current halt in development activities.
Exhibit 4: eLi Annual Reporting in eLCplans

- 2011: Goal: Develop functional requirements documents for online cable licensing interface by September 30, 2011. Results: Functional Requirements Document for the project has been developed. Rating: Green.²
- 2012: Goal: Pilot systems and processes for online cable licensing by September 30, 2012. Results: The USCO successfully launched the pilot of the cable Statement of Account submission system in September. For the first time, users of statutory licenses were allowed to see and comment on an electronic system for submitting statements and paying royalties to copyright owners. Implementation of other relevant system components (IT security, training, procedure manuals, Licensing Division system data migration) was also begun. Completion of user acceptance testing/piloting and finalization of the e-system build for cable Statements of Account and fees are planned for completion in FY 2013. Rating: Green.²
- 2013: Goal: Host online licensing system for filing cable Statements of Account in the cloud, with an Approval to Operate (ATO), by September 30, 2013. Results: USCO accomplished the milestones of the project: December 31, awarded hosting contract; March 31, successfully completed proof of concept in the Amazon cloud (AWS); June 30, completed development environment in AWS; September 30, ATO gained for next phase of pilot. Rating: Green.²
- [Year and goal not legible in source]: Results: Planned milestones were completed (eLi batch submission pilot; pilot results analyzed and communicated to stakeholders; fiscal requirements for eLi).

² eLCplans used a three-color rating scheme. Green indicated that the service unit was on track to accomplish its annual goal; amber indicated that the service unit was behind its plan targets but adjustments could result in accomplishing the plan; and red indicated that the service unit would not accomplish the plan's annual target.
- 2015: Goal: Complete FY 2015 scheduled reengineering phases. Results: The Independent Assessment was reviewed and, as a result, subsequent steps have been taken and other steps are planned for implementation. The ownership of the eLi project is transitioning to the Copyright Technology Office, which will manage its development and new targets/milestones for FY 2016 and after. Rating: Green.²

The USCO began eLi development as the current SDLC policy was being developed and matured. As the current policy developed, LOC did not require retroactive application to existing development efforts. In essence, the USCO developed eLi without an LOC policy framework. The USCO did not voluntarily adopt the matured LOC SDLC policy, nor did it develop a comparable framework based on industry best practices.

The USCO did not establish a program oversight function similar to LOC's IT Investment Board, nor did the USCO regularly review project management and development activities. No management body existed to evaluate development delays, additional funding requests, and recommended courses of action. The USCO did not have an oversight body with authority to halt project activities based on cost overruns, delivery delays, and/or lack of functionality until appropriate remediation plans were in place.

The USCO also failed to develop in detail the roles and responsibilities of project management. Project management practices exhibited several gaps from best practices, including not establishing accountability for specific project management activities, failing to perform and document oversight on a recurring basis, inconsistently documenting the matching of system requirements to development activities, failing to create and track project plan milestones, and not performing an analysis of additional funding requests. This lack of specific oversight activity precluded project management from identifying missing system functionality and cost overruns early in the development cycle.
Early remediation efforts are more cost-effective than later rework efforts.

Additionally, vendors' contracts did not contain the technical requirements, specific deliverables, or timelines needed to support program and project management oversight. These missing elements amplified the issues resulting from the lack of specific project management procedures (e.g., the specific contractor deliverables necessary to complete project management tasks).

Significant challenges identified in the eLi program audit include:

Program Management at the Service Unit Level
- The USCO failed to develop service unit-level policies and procedures to establish accountability and clearly define required program and cost management activities, including:
  - Project schedule monitoring
  - Project budget approval processes
  - Regularly scheduled project budget reporting
  - Cost variance analysis
  - Accountability for project contractor oversight
  - Tracking of corrective actions.
- The USCO did not conduct periodic service unit management reviews of project progress, variances, and development breakdowns. As a result, USCO was unable to make assessments to continue, alter, or cease project development. Additionally, there was no evidence of service unit management reporting to LOC management on capital project development.

Project Management at the Functional Level (Development Activity)
- The eLi project did not have a thorough project management framework to ensure all phases of the development project were thoroughly planned and executed (e.g., Planning, Analysis, Design, Deployment, and Maintenance). There was no evidence of a comprehensive Project Management Plan (PMP) to define how the project is to be executed, monitored, and controlled, which would enable accurate reporting, planning, and project adjustments.
Additionally, project managers did not build a Risk Management Plan and Risk Register; therefore, they were unable to identify and recognize potential events or conditions (risks) that could negatively affect one or more project objectives, such as scope, schedule, cost, and quality.
- Project management did not effectively track scope and schedule changes to closure, as they stopped tracking changes in 2014. Additionally, project managers failed to document departures from the planned project schedule (with associated justification) in the project log, resulting in numerous and significant scope changes affecting the schedule and increasing the overall cost of the project.
- Project managers did not create a concept proposal or PMP to capture all human capital requirements. As a result, USCO management could not match the necessary human capital to project needs or properly assign roles/responsibilities, reporting relationships, availability, etc.
- There was no evidence of a standardized Requirements Management Plan used to elicit, document, and track development requirements; therefore:
  - The project team defined its own method for establishing and documenting requirements in multiple documents, formats, and locations
  - The project management team did not have a standardized process to validate technical requirements and verify whether stakeholder needs were fully defined and implemented, causing scope changes across the project
  - The project management team was unable to fully define requirements in supporting vendor contracts (see Contract Issues).
- There was no evidence that an Analysis of Alternatives was conducted to ensure the selected project direction was the best solution to meet user needs with minimal cost and complexity. The resulting system is not operational, at a cost of $11.6 million, $9.7 million over budget.
- eLi did not have an observable system requirements baseline providing a defined, confirmed, and validated set of system requirements needed to ensure user needs were met.
- There was no evidence of a System Development Plan or "blueprint" for eLi defining development methodologies and work standards. Without a solid System Development Plan as a reference, project managers were unaware of how contractors were constructing the system (e.g., coding standards, testing schedules, and tools used) and could not validate whether the system was built using industry-acceptable standards. Furthermore, without this reference document, the project management team could not properly monitor the development, testing, deployment, and verification of software in the operational environment, creating a dependency of the government on the vendor for support.
- There was no evidence of change management processes to receive, analyze, and validate proposed system changes before implementation and avoid unnecessary or potentially harmful changes to the system. Additionally, there was no evidence to support that system changes made were verified for accuracy and compliance with operational and security requirements.
- Although system development for eLi began in 2010, the USCO did not approve the eLi project charter until May 2016.
Contract Issues
- The following key contracting elements were not present to establish vendor accountability for the eLi project:
  - Requirements for project management best practices, customer oversight, and acceptance
  - Technical requirement details to ensure user functionality
  - Contractor deliverable details (e.g., software source code, programmer's documentation)
  - Vendor oversight requirements
  - Interim and final review criteria (e.g., milestones and expectations for development at those milestones)
  - Clearly defined technical framework, resulting in the inability to match technical requirements to deliverables.

Security Issues
- Security testing was not performed on eLi, as the system is not operational.

Summary of Project Results Culminating from the Above Deficiencies
- Project terminated after six years
- Initial project plan and contract vehicle significantly modified after inception; existing Library server infrastructure did not have capacity to meet technical requirements, and upgrading was not feasible or cost-effective
- No transparency to LOC or Congress regarding the funds used or time lost
- $11.6 million in wasted costs due to mismanagement of the project development and resultant excessive expenditures without delivery of a functioning system

Recommendations
1.
For all future system development activities, USCO should ensure that current LOC policies and relevant industry best practices are adopted by service unit oversight and project management teams
2. USCO should clearly define technical requirements and functionality of the systems
3. USCO should clearly define vendor timelines, technical deliverables, and required documentation as part of the contract and Statement of Work (SOW)
4. USCO should develop reasonable and reliable cost estimates for subsequent development activities and obtain LOC oversight approval
5. USCO should clarify funding sources and status-of-funds reporting
6. If development activities for eLi resume, USCO should assess elements of existing development work products for possible reuse.

Overseas Field Office Replacement System (OFORS)

The Overseas Operations Division (OvOp) of Library Services initiated the OFORS development program in 2010. OFORS is designed to support the acquisition activities of the six LOC field offices around the world. OFORS functionality includes ordering and receiving, claims for missing and overdue items, recording financial obligations, and documenting payments, as well as credits for LOC and participants in the Cooperative Acquisitions Programs. The initial September 2010 awarded contract value was $1,730,109, which was subsequently increased to $1,771,000 in 2016. In May 2013, the contract type was changed from "Time and Materials with Firm Fixed Unit Pricing" to "Firm Fixed Price" with an unchanged total contract value.

OvOp submitted a five-year, $500,000 annual funding request in FY 2011. Congress did not fund this request. Instead, OvOp has funded development costs from the Acquisitions and Bibliographic Access (ABA) Directorate and the six field offices' operating funds (Cooperative Acquisitions Program System). Additionally, OvOp did not include OFORS in the LOC strategic reporting and planning process until 2015.
Several contract modifications have been issued to extend the period of performance, correct dates, add funding, change the CO, change the COR, and for various other reasons. The system is currently deployed and operational; however, it is missing key functionality identified to support overseas office activities, including binding, shipping, inventory, and managing suppliers. Full feature capability is now expected by December 2017. To date, over $1,296,000 in contract costs has been invested in this system development project.

Exhibit 5: OFORS Development Timeline
[Timeline graphic not reproduced. Legible milestones: initial solicitation, June 2010; original contract awarded at $1,730,109, September 2010; five-year, $500,000 annual budget appropriation request denied, FY 2011; OFORS initial planned completion, September 2012; contract type changed, May 2013; first field office with baseline OFORS functionality, September 2014; OFORS passes LOC security assessment and authorization; cure notice issued, June 2016; scanning of OFORS system completed, October 2016; refined end of development, December 2017; total vendor costs to date, $1,296,000.]

OvOp project managers embarked upon this development activity without specific and detailed policies, procedures, guidelines, and responsibilities related to program and project management. This lack of process guidance and accountability resulted from OvOp's failure to proactively address an existing gap in governance policies.

Library Services did not consistently report OFORS strategic planning information via eLCplans from FY 2010 to 2016. eLCplans is a centralized database tool that houses strategic planning information and annual performance data, automating the annual planning and performance process. In 2010-2014, Library Services reported no performance metrics or goals for OFORS.
Below is a summary of the years in which Library Services reported OFORS performance metrics in eLCplans. These self-reported metrics are inconsistent with actual project delays, cost overruns, and the current halt in development activities.

Exhibit 6: OFORS Annual Reporting in eLCplans

- 2015: Goal: Implement OFORS in the Jakarta, Islamabad, Nairobi, and Rio de Janeiro offices. Results: In FY 2015, LOC implemented OFORS in all six overseas offices, in Brazil, Egypt, India, Indonesia, Kenya, and Pakistan. The new system enables the Library to retire its disparate, obsolete systems for accounting, billing, and tracking in the overseas offices and will improve its service to Cooperative Acquisitions Program customers. Rating: Green.
- 2016: Goal: Optimize OFORS in all six offices and reduce use of legacy systems by 70% to achieve efficiencies and improve service to the Library and its Cooperative Acquisitions Program participants. Results: OFORS was optimized in all six offices, and use of legacy systems has been reduced by 90%. Rating: Green.

OvOp began OFORS development as the current SDLC policy was being developed and matured. As the current policy developed, LOC did not require retroactive application to existing development efforts. In essence, OvOp developed OFORS without an LOC policy framework. OvOp did not voluntarily adopt the matured LOC OCIO policy, nor did it develop a comparable framework based on industry best practices.

OvOp did not establish a program oversight function similar to LOC's IT Investment Board until FY 2016. OvOp did not regularly review project management and development activities. No management body existed to evaluate development delays, additional funding requests, and recommended courses of action. OvOp did not have an oversight body with authority to halt project activities based on cost overruns, delivery delays, and/or lack of functionality until appropriate remediation plans were in place. OvOp also failed to develop in detail the roles and responsibilities of project management.
Project management practices exhibited several gaps from best practices, including not establishing accountability for specific project management activities, failure to perform and document oversight on a recurring basis, inconsistently documenting the matching of system requirements to development activities, a lack of project plan milestones, and missing analysis of additional funding requests. This lack of specific oversight activity precluded project management from identifying missing system functionality and cost overruns early in the development cycle. Early remediation efforts are more cost-effective than later rework efforts.

The Project Manager did not identify the ramifications of deploying OFORS in six dissimilar operating environments until late in the Testing Phase. Project management did not complete initial security testing because they did not identify testing methods and assign adequate time and resources to complete security testing in all six environments. Testing identified four different operating system versions, with varying levels of security patching, running OFORS without security testing completed prior to our audit. Similarly, project management encountered delays in deploying OFORS in the different operating environments. Project management had not identified the resources and skill sets necessary to customize OFORS for each operating environment.

Initial plans scheduled completion of the OFORS development in September 2012, but multiple delays and non-performance by the contractor resulted in numerous contract modifications, including a change in the contract type in May 2013 from Time and Materials to Firm Fixed Price. Finally, because the contractor had not delivered key functionality, a cure notice was issued in June 2016, which resulted in an agreement to provide all development functionality by December 2017.
Additionally, the vendor's contracts did not contain the technical requirements, specific deliverables, and timelines needed to support program and project management oversight. These missing elements amplified the issues resulting from the lack of specific project management procedures (e.g., the specific contractor deliverables necessary to complete project management tasks).

Significant challenges identified in the OFORS program audit include:

Program Management at the Service Unit Level
- Library Services failed to develop service unit-level policies and procedures to establish stakeholder accountability and clearly define required program and cost management activities, including:
  - Development and tracking of a PMP
  - Project budget approval processes
  - Regularly scheduled project budget reporting
  - Cost variance analysis
  - Accountability for project contractor oversight
  - Tracking of corrective actions.
- Library Services did not conduct periodic service unit management reviews of project progress, variances, and development breakdowns. As a result, Library Services was unable to make assessments to continue, alter, or cease project development. Additionally, there was no evidence of service unit management reporting to Library management on capital project development.

Project Management at the Functional Level (Development Activity)
- The OFORS project did not have a thorough project management framework to ensure all phases of the development project were thoroughly planned and executed (e.g., Planning, Analysis, Design, Deployment, and Maintenance). There was no evidence of a comprehensive Project Management Plan (PMP) to define how the project is to be executed, monitored, and controlled, which would enable accurate reporting, planning, and project adjustments.
Additionally, project managers did not build a Risk Management Plan and Risk Register; therefore, they were unable to identify and recognize potential events or conditions (risks) that could negatively affect one or more project objectives, such as scope, schedule, cost, and quality.
- Project management did not effectively track scope and schedule changes to closure. Additionally, project managers failed to document departures from the planned project schedule (with associated justification) in the project log, resulting in numerous and significant scope changes affecting the schedule and increasing the overall cost of the project.
- Project managers did not create a concept proposal or PMP to capture all human capital requirements. As a result, management could not match the necessary human capital to project needs or properly assign roles/responsibilities, reporting relationships, availability, etc.
- There was no evidence of a standardized Requirements Management Plan used to elicit, document, and track development requirements; therefore:
  - The project team defined its own method for establishing and documenting requirements in multiple documents, formats, and locations, limiting clarity and increasing the risk of delays and the extra expense of adding functionality later in development
  - The project management team did not have a standardized process to validate technical requirements and verify whether stakeholder needs were fully defined and implemented, causing scope changes across the project
  - The project management team was unable to fully define requirements in supporting vendor contracts (see Contract Issues).
- OFORS did not have an observable system requirements baseline providing a defined, confirmed, and validated set of system requirements needed to ensure user needs were met. Accordingly, OFORS project managers were unable to verify that system requirements were appropriate for the unique operating environments (six different locations).
- There was no evidence of a System Development Plan or "blueprint" for OFORS defining development methodologies and work standards. Without a solid System Development Plan as a reference, project managers were unaware of how contractors were constructing the system (e.g., coding standards, testing schedules, and tools used) and could not validate whether the system was built using industry-acceptable standards. Furthermore, without this reference document, the project management team could not properly monitor the development, testing, deployment, and verification of software in the operational environment, creating a dependency of the government on the vendor for support.
- There was no evidence of change management processes to receive, analyze, and validate proposed system changes before implementation and avoid unnecessary or potentially harmful changes to the system. Additionally, there was no evidence to support that system changes made were verified for accuracy and compliance with operational and security requirements.

Contract Issues
- The following key contracting elements were not present to establish vendor accountability for the OFORS project:
  - Requirements for project management best practices, customer oversight, and acceptance
  - Technical requirement details to ensure user functionality
  - Contractor deliverable details (e.g., software source code, programmer's documentation)
  - Vendor oversight requirements
  - Interim and final review criteria (e.g., milestones and expectations for development at those milestones)
  - Clearly defined technical framework, resulting in the inability to match technical requirements to deliverables.

Security Issues
- There were inadequate system access control policies and procedures in place for OFORS. LOC management failed to finalize and approve account management and system monitoring control procedures affecting the six overseas OFORS implementations.
This resulted in the following:
  - Users being granted system access prior to formal approval
  - Privileged users with system access without the required documentation
  - No procedures to recertify privileged users
  - Sixty-nine user accounts, inactive for over 30 days, that were not disabled
  - No application audit logging functionality, and no review of server or network logs for privileged user activity.
- LOC management did not implement adequate Security Assessment procedures for OFORS. Specifically, the Security Program Manager did not prepare the Security Control Assessment documentation, the Security Control Assessor did not produce a Security Assessment Report, and LOC management did not approve the Security Assessment Plan for OFORS.
- OFORS program management did not implement adequate Configuration Management (CM) procedures:
  - LOC management did not finalize and approve a CM Plan
  - The System Owner did not clearly map changes to OFORS to the Test Plan
  - Security impact assessments of proposed changes were not accomplished prior to change implementation
  - There was no documentation of supervisory or management review and approval of OFORS releases prior to their being put into production.
- The OFORS System Security Plan (SSP) was not fully developed:
  - The System Owner did not include documentation of the system boundary
  - The System Owner did not update SSP references and alignment with current guidance (National Institute of Standards and Technology [NIST] Special Publication [SP] 800-53, Revision [Rev.] 4, Recommended Security Controls for Federal Information Systems and Organizations)
  - The System Owner failed to ensure the SSP included sufficient details on implemented security controls and residual planned actions to address any weaknesses
  - The System Owner failed to include security planning updates based on results from the continuous monitoring process.
- OvOp management failed to register OFORS in the centralized IT risk management and security documentation repository.
- The OFORS System Owner did not implement adequate vulnerability management and configuration management, as evidenced below:
  - Of the six OFORS instances, four different Linux operating systems were identified
  - Vulnerability scanning/continuous monitoring was not completed prior to October 2016, although the "Approval to Operate" was issued in August 2014
  - The System Owner inadequately performed the initial certification and accreditation testing, which System Owners are currently re-performing
  - The OFORS Contingency Plan was not finalized; it is still a draft document.

Summary of Project Results Culminating from the Above Deficiencies
1. Initial expected project completion of September 2012 extended to December 2017
2. Key user-defined functionality not provided; standard application functionality has been deployed. Development and deployment of specific modules to support and streamline LOC field office operations has not been implemented
3. OvOp did not obtain specific funding for OFORS development. OvOp used the ABA and six field office appropriations to fund the project. $500,000 of the original development contract of $1,700,000 remains to fund delivery of the five remaining modules. Active project management is needed to ensure the original budget is not exceeded and originally planned functionality is delivered
4. Security management is not consistently documented and lacks continuous monitoring and updating
5. Insecure and inconsistent system configurations.

Recommendations
1. For all future system development activities, Library Services should ensure that current LOC policies and relevant industry best practices are adopted by service unit oversight and project management teams
2. OvOp should update and clearly define technical requirements and functionality of the systems
3.
OvOp should clearly define vendor timelines, technical deliverables, and required documentation as part of the contract and SOW
4. OvOp should develop reasonable and reliable cost estimates for subsequent development activities and obtain LOC oversight approval
5. OvOp should clarify funding sources and status-of-funds reporting
6. OvOp should address security risks, perform required remediation, and complete all required documentation
7. OvOp should identify necessary personnel requirements to successfully perform project management and security oversight.

Congress.gov

Initiated in 2010, the development of Congress.gov sought to modernize aging legislative information platforms. Congress.gov provides current and historical information about Congress, legislation, and the legislative process. As part of a permanent program to provide service enhancements and support for this authoritative website, LOC has invested $15 million from 2012 through 2016. Currently, the system is operational, with additional development planned for feature enhancements and eventual replacement of the Legislative Information System (LIS).

Exhibit 7: Congress.gov Development Timeline
[Timeline graphic not reproduced. Legible milestones: project commenced, 2010; initial Congress.gov contract awarded, September 2011; beta site released and Phase One complete, August 2012; subsequent development contracts awarded; Congress.gov replaces Thomas.gov; increased functionality and planned retirement of LIS, FY 2017; contract costs of $665K (FY 2012), $1M (FY 2013), $740K (FY 2014), and $1.1M (FY 2015).]

The LOC OCIO develops policies and procedures governing system development that are consistent with many best practices.
The Office of Web Services within OCIO was able to comply with the policy using a recognized development method that stresses a quick, iterative approach to challenges while reducing the volume of project documentation. OCIO maintains Congress.gov in its hosting environment. The IT Security Group (ITSG) identified hosting environment security vulnerabilities through scans; currently, OCIO ITSG is in the process of determining whether these vulnerabilities impact Congress.gov.

LOC reported Congress.gov's development progress against annual objectives in the Management Discussion and Analysis section of its annual financial report from FY 2012 to 2016. Below is a summary of the discussion of Congress.gov's progress. These discussions are consistent with Congress.gov's development efforts and on-time Phase I delivery.

Exhibit 8: Congress.gov Discussion Summary

- 2012: Objective: LOC initiates development of next-generation legislative information system platform. Accomplishment: A multi-department team successfully accomplished the initial release of beta.congress.gov.
- 2012: Objective: Web presence based on new architecture has been built and tested. Accomplishment: Congressional web presence culminated in the release of beta.congress.gov.
- 2012: Objective: LOC has developed and implemented plans for improving user experience and access. Accomplishment: Web metrics policies and Key Performance Indicators were implemented in the releases of beta.congress.gov.
- 2012: Objective: Launch Law.gov with guidelines for proofs of concepts and best practices standards. Accomplishment: Congress.gov Beta, a mobile-friendly system for legislative materials, was successfully launched on September 19, 2012.
- 2013: Objective: LOC has completed development of next-generation legislative information system platform and services. Accomplishment: Users of Congress.gov have more effective access to a growing body of legislative content, including legislation post-1993, the Congressional Record post-1995, Committee reports post-1995, and enhanced search functions.
- 2013: Objective: [partially illegible] has been migrated to Congress.gov. Accomplishment: Congress.gov has expanded with additions of legislation, the Congressional Record, and Committee reports.
- 2014: Objective: Complete the planned web design and development tasks related to Congress.gov. Accomplishment: All planned design and development tasks completed, with five major releases.
- 2015: Objective: Complete the planned web design and development tasks related to Congress.gov. Accomplishment: All planned design and development tasks completed, with six major releases.
- 2016: Objective: Complete the planned web design and development tasks related to Congress.gov. Accomplishment: Congress.gov continues to be enhanced to replace the Legislative Information System, and Congress.gov replaced THOMAS.

Program Management at the Service Unit Level
- There were no significant findings in this audit area.

Project Management at the Functional Level
- There were no significant findings in this audit area.

Contract Issues
- There were no significant findings in this audit area.

Security Issues
- The System Owner failed to fully document the Congress.gov SSP:
  - System interconnections were not identified
  - The SSP references and tested controls are consistent with outdated guidance
  - The SSP is missing details regarding implemented security controls and any residual actions planned to address remaining weaknesses
  - Security planning lacks updates based on results from the continuous monitoring process.
The System Owner failed to register Congress.gov in the centralized IT risk management and security documentation repository.

There were inadequate Security Assessment procedures:
- The Information Systems Security Officer (ISSO) did not retain a Congress.gov Security Assessment Plan
- The ISSO failed to document planned remedial actions in the 10 sampled Plan of Action and Milestones (POA&M) items
- The ISSO failed to attach evidence to support one closed POA&M item

Inadequate System Access control policy and procedures:
- The System Owner did not ensure procedures for account management and monitoring controls for Congress.gov were finalized and approved
- The ISSO failed to recertify privileged users' (e.g., administrators who manage applications, database servers, and other hardware) access during FY 2016

Inadequate Configuration Management documentation:
- The System Owner failed to develop and maintain a CM Plan specific to Congress.gov and did not detail configuration items to be managed by the CM Plan
- The System Owner failed to complete security impact assessments prior to updated Congress.gov versions being released to production

System security vulnerabilities were not corrected in the allocated time set by LOC policies:
- OCIO lacks automated methods to patch Linux servers in the LOC Application Hosting Environment (AHE), which hosts Congress.gov
- OCIO failed to remediate over 40% of vulnerabilities present on the AHE servers for over 90 days.

Summary of Project Results Culminating from the Above Deficiencies
1. Project Phase I placed in service on schedule and on budget
2. Vendor is addressing Phase II requirements according to project plan
3. Security issues related primarily to documentation; some hosting environment vulnerabilities may impact Congress.gov.

Recommendations
1.
OCIO should ensure that continuing development activities incorporate current LOC policies and that relevant industry best practices are adopted by service unit oversight and project management teams
2. OCIO should address security issues for all operating systems and environments, develop timelines for remediating deficiencies, and monitor progress towards resolution.

AUDIT METHODOLOGY AND SCOPE

The objective of this performance audit is to evaluate the effectiveness of system development and information security policies, programs, and practices. Kearney addressed these objectives by directing testing at project, cost, change management, and related information security policies, procedures, standards, and guidelines. We conducted this performance audit in accordance with Generally Accepted Government Auditing Standards (GAGAS), as presented in the Government Accountability Office's (GAO) Yellow Book (2011).

Kearney's testing approach is based upon LOC procedures, Federal criteria, Federal best practices, and best practice methodology found in the CMMI Institute's Capability Maturity Model Integration (CMMI) process models and the Project Management Institute's (PMI) Project Management Body of Knowledge (PMBOK). These criteria include the following:
- LOC's Information Technology Security Directive (ITSDir) 01, General Information Technology Security, June 3, 2016
- LCR 1620, Information Technology Security Policy of the Library of Congress
- A Guide to the Project Management Body of Knowledge (PMBOK Guide), 2013
- CMMI for Development, Guidelines for Process Integration and Product Improvement, 3rd Edition
- E-Government Act of 2002
- NIST SP 800-37, Rev. 1, Guide for Applying the Risk Management Framework to Federal Information Systems: A Security Life Cycle Approach
- NIST SP 800-53, Rev.
4, Security and Privacy Controls for Federal Information Systems and Organizations, Appendix J, Privacy Control Catalog
- Federal Information Processing Standards (FIPS) Publication (PUB) 199, Standards for Security Categorization of Federal Information and Information Systems
- FIPS PUB 200, Minimum Security Requirements for Federal Information and Information Systems
- Office of Management and Budget (OMB) Circular A-130, Appendix III, Security of Federal Automated Information Resources.

Systems Development Life Cycle (SDLC) Maturity

Kearney conducted a risk assessment of LOC's system development process maturity based on CMMI for Development (CMMI-DEV) guidelines for process integration and product improvement. CMMI-DEV contains practices that cover project management, process management, software/system engineering, and other supporting processes used in development and maintenance. The practices specific to software/system development include requirements development, technical solution, product integration, verification, and validation. Basic project management practices address the activities related to project planning, project monitoring and control, requirements management, risk management, integrated project management, and supplier agreement management. Support process areas focus on activities that support product development and maintenance. These include process and product quality assurance, configuration management, measurement and analysis, and decision analysis and resolution.

CMMI capability maturity levels are used in CMMI-DEV to describe a progressive maturity path for an organization that seeks to improve its processes for software/system development. There are five maturity levels in the model:

Exhibit 9: Characteristics of the Maturity Levels
  Level 5 (Optimizing): Focus on process improvement
  Level 4 (Quantitatively Managed): Processes measured and controlled
  Level 3 (Defined): Processes characterized for the organization and proactive (projects tailor their processes from the organization's standards)
  Level 2 (Managed): Processes characterized for projects and often reactive
  Level 1 (Initial): Processes unpredictable, poorly controlled, and reactive

- Maturity Level 1 (Initial): Processes are considered performed but do not follow specific organization policy or a defined set of standard processes
- Maturity Level 2 (Managed): Requires that an organization has policies in place that mandate the use of a specific process
- Maturity Level 3 (Defined): Requires that standard processes for each process area exist at the organization level and can be tailored for use to meet specific project needs. The goal is to have standard defined processes that are applied consistently across the organization
- Maturity Levels 4 (Quantitatively Managed) and 5 (Optimizing): Are considered "high-maturity" process areas that focus on improving processes already in use through statistical and other quantitative methods.

The mapping of processes to process areas enables an organization to assess and track its progress against the CMMI-DEV model, as well as plan for process improvements over time. For this audit, Kearney assessed LOC processes for alignment with the following CMMI Maturity Level 2 and Level 3 Process Areas (PA):
- Maturity Level 2: Requirements Management (RM), Project Planning (PP), Project Monitoring and Control (PMC), Supplier Agreement Management (SAM), and Process and Product Quality Assurance (PPQA)
- Maturity Level 3: Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification, Validation, Integrated Project Management, and Risk Management.

Kearney assessed LOC processes across three systems: eLi, OFORS, and Congress.gov. We assessed each system individually, then summarized the risks to indicate the overall risks of alignment with the CMMI-DEV maturity model.
Risks are identified as follows:
- Low Risk: PA goals are fully implemented (FI)
- Medium Risk: PA goals are largely implemented (LI); meets most PA goals
- Medium High Risk: PA goals are partially implemented (PI); meets some PA goals
- High Risk: PA goals are not implemented (NI); no PA goals are met.

LOC process areas at risk for aligning to the CMMI-DEV model include:

Maturity Level 2
- Medium High Risks: Project Planning; Project Monitoring and Control; Process and Product Quality Assurance
- Medium Risks: Requirements Management; Configuration Management; Supplier Agreement Management.

Maturity Level 3
- Medium High Risks: Validation; Integrated Project Management; Risk Management
- Medium Risks: Requirements Development; Technical Solution; Product Integration; Verification.

The following list includes the status of PAs in Maturity Model Levels 2 and 3 as discussed above, and depicted in the chart below:
- FI: Low risk, PA goals are fully implemented
- LI: Medium risk, PA goals are largely implemented and meet most PA goals
- PI: Medium high risk, PA goals are partially implemented and meet some PA goals
- NI: High risk, PA goals are not implemented and no PA goals addressed

The assessment indicates that Congress.gov had successfully addressed Levels 2 and 3 PA risks; therefore, the process is organized and proactively addresses risks. The assessment also indicates that eLi and OFORS had not successfully addressed Levels 2 and 3 PA risks; therefore, the process is unpredictable, poorly controlled, and reactive.

Exhibit 10: CMMI Assessment [table largely illegible in source; legible fragments reference Monitoring and Control, Process and Product Quality Assurance, Technical Solution, Product Integration (LI), and Verification]

NIST, an agency of the U.S. Department of Commerce, is an internationally recognized leader in IT security, with benchmark publications leveraged by Federal agencies and industry alike. Much of the IT security evaluation was based on NIST publications.
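As a rough illustrative sketch (the function, thresholds, and labels below are ours, not Kearney's tooling), the four-point FI/LI/PI/NI scale used in the assessment maps a process area's satisfied goals onto a risk rating:

```python
# Illustrative sketch of the report's four-point CMMI-DEV risk scale.
# The thresholds for "most" and "some" goals are assumptions for demonstration.
RATING_TO_RISK = {
    "FI": "Low Risk",          # PA goals fully implemented
    "LI": "Medium Risk",       # largely implemented; meets most PA goals
    "PI": "Medium High Risk",  # partially implemented; meets some PA goals
    "NI": "High Risk",         # not implemented; no PA goals met
}

def rate_process_area(goals_met: int, goals_total: int) -> str:
    """Map the fraction of satisfied PA goals onto the FI/LI/PI/NI scale."""
    if goals_total <= 0:
        raise ValueError("a process area must define at least one goal")
    fraction = goals_met / goals_total
    if fraction == 1.0:
        return "FI"
    if fraction >= 0.75:   # assumed cutoff for "most" goals
        return "LI"
    if fraction > 0.0:
        return "PI"
    return "NI"

# e.g., a PA with 3 of 4 goals met is largely implemented (Medium Risk)
print(RATING_TO_RISK[rate_process_area(3, 4)])  # -> Medium Risk
```

The point of such a mapping is that the rating follows mechanically from evidence of goal satisfaction, rather than from an assessor's overall impression.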
For the following areas for each application, Kearney:
- Assessed application development and IT delivery methods against existing LOC policy requirements and CMMI best practices
- Conducted a gap analysis of the system cost sufficiency analysis against CMMI and PMI best practices
- Assessed the design, development, and implementation of the application/system's project management practices
- Assessed whether project deliverables met objectives and functionality defined in design documents
- Conducted a gap analysis of LOC security procedures against industry best practices
- Evaluated security and privacy controls based on NIST SP 800-53 best practices
- Evaluated access controls' effectiveness in limiting and/or detecting inappropriate access.

APPENDIX A: DETAILED AUDIT FINDINGS

During the course of the performance audit, Kearney evaluated program elements using the criteria listed above, along with industry best practices. As deficiencies were identified, Kearney used a "Notification of Findings and Recommendations" (NFR) process to highlight the Background (relevant and/or historical information), Condition (the specific deficiency), Criteria (laws, legislation, and standards we measured against), and Effect (the result) that the deficiency creates, along with Recommendations for corrective actions. These NFRs are generated by the audit team, evaluated by the OIG, and reviewed with the auditee for factual accuracy before formal release. The following sections describe the detailed findings/issues we identified in our performance audit.

Detailed Audit Findings - eLi

1.1.1 Requirements Definition Improvements

Condition: The eLi project is not using a standardized process or format for eliciting, documenting, or tracking development requirements. There are multiple different versions, formats, and levels of detail used in the documentation and tracking of system requirements.
The process of developing customer requirements into detailed product requirements is critical to ensuring customers' high-level needs are understood at the level needed to correctly implement new functionality that meets those needs. Through analysis, any additional requirements, interactions, and rules become apparent which may not otherwise be found until after the incorrect product has been developed. The later in the process that requirements and changes are uncovered, the more expensive they are to address, as late discovery increases the amount of rework and time required to fulfill the customer requirements.

Effect: The eLi project was unable to demonstrate a repeatable or provable process that standardizes the requirements definition tasks needed to elicit, analyze, or establish requirements based on the customer's need. The information, where it exists, is contained in multiple documents and locations, and it does not provide a comprehensive, consolidated picture for a point in time. This has resulted in the inability of the USCO and contractors to:
- Confirm that the stakeholder requirements have been captured and delineated into detailed requirements
- Confirm that the functionality built meets the requirements
- Validate/verify the functionality during Testing and Acceptance Phases
- Have a single managed source for tracking the development of requirements.
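For illustration only (the field names and requirement IDs below are invented, not taken from eLi documentation), a "single managed source" for requirements tracing can be quite minimal: one record per requirement, linking the stakeholder need, the derived detail, and the test cases that verify it:

```python
# Minimal illustration of a consolidated requirements trace of the kind
# the finding says eLi lacked. All identifiers here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str                  # e.g., "ELI-REQ-042" (invented ID scheme)
    stakeholder_need: str        # the high-level customer need
    detail: str                  # the derived, testable product requirement
    test_case_ids: list = field(default_factory=list)  # links used for V&V

rtm = {}  # one managed source, keyed by requirement ID

def add_requirement(req: Requirement) -> None:
    rtm[req.req_id] = req

def untested_requirements() -> list:
    """Requirements with no linked test case cannot be verified at acceptance."""
    return [r.req_id for r in rtm.values() if not r.test_case_ids]

add_requirement(Requirement("ELI-REQ-001", "Accept license filings online",
                            "System shall accept filing submissions via web",
                            ["TC-101"]))
add_requirement(Requirement("ELI-REQ-002", "Track filing status",
                            "System shall display filing status to the filer"))
print(untested_requirements())  # -> ['ELI-REQ-002']
```

Even this small structure answers the questions the audit says eLi could not: whether each stakeholder need maps to a detailed requirement, and whether each requirement has a verification path.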
1.2.1 Source Control Configuration Management Improvements

Condition: The USCO does not require software vendors to conform to a well-defined process for managing and versioning the source control of the systems they have been contracted to build. The USCO has not established best practices, in accordance with the CMMI Configuration Management (CM) process area, for management and versioning of the source code for eLi. The USCO eLi project management staff are aware that the contracted vendor performs fundamental program source code control, but they do not know the level of controls, or compliance with best practices, the source code library is achieving. Since minimum compliance expectations are not provided to vendors, the USCO is left unsure whether the developer is following industry best practices in the storage and management of the software, which the USCO will own at the end of the development contract. Below, we noted examples of inconsistencies in source control and CM best practices.

To assess whether formal CM practices were implemented in accordance with the CMMI best practices detailed above, Kearney reviewed several documents provided and noted that the eLi contractor uses the Subversion software versioning tool for source control. However, we did not discover any documentation that explains how Subversion is used to manage source control. For example:
- What the branching (software code version control) strategy is
- How rollbacks to prior software code versions can occur
- Whether change sets (e.g., patches addressing multiple areas of the application code) are always tied to tickets (authorized requests).

Kearney provided a questionnaire to the eLi developers to gain more in-depth information about source code library practices used for the project. When asked about the branching strategy, the USCO stated: "To the best of our knowledge, in looking at the code repository, it doesn't appear that there was a strategy."
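For illustration only (the "ELI-" ticket prefix and the messages below are invented; the report does not describe any such tooling at USCO or its vendor), one element of a documented source control process, tying every change set to an authorized ticket, can be enforced with a simple commit-message policy check:

```python
# Hypothetical commit-message policy check. A documented source control
# process might require every Subversion change set to cite a ticket ID,
# making each change traceable to an authorized request.
import re

TICKET_RE = re.compile(r"^ELI-\d+: ")  # assumed ticket-prefix convention

def check_commit_message(message: str) -> bool:
    """Return True if the change set cites a ticket (authorized request)."""
    return bool(TICKET_RE.match(message))

messages = [
    "ELI-124: fix licensing fee rounding in the filing handler",  # compliant
    "misc cleanup",                                               # untraceable
]
print([check_commit_message(m) for m in messages])  # -> [True, False]
```

A check like this, run as a repository pre-commit hook, is one way an owner can verify traceability without inspecting the vendor's internal practices directly.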
Because the source code library control process was never documented and provided to the LOC, there is an unknown level of assurance that the software asset developed for the USCO has been stored in a manner that allows traceability back to functionality requirements. Lack of such information will hinder future alternate vendors' or USCO abilities to support or alter the software code.

Kearney assessed the status of CM processes performed by the USCO in the "eLi Security Assessment Report" (SAR), dated January 24, 2015, which confirmed that key CM processes are not formally implemented or documented. The SAR document notes:
1. CM-02 Baseline Configuration: There currently is not a formal eLi application CM process that develops, documents, and maintains, under configuration control, a current baseline configuration of the eLi application environments (e.g., Proof of Concept, Development, Test, Staging, Production, Shared)
2. Baseline Configuration (Review and Updates): There currently is not a formal eLi application CM process for reviews and updates
3. Baseline Configuration (Retention of Previous Configurations): There currently is not a formal eLi application CM process to develop, document, and maintain the retention of previous configurations under configuration control
4. Configuration Change Control: There currently is not a formal eLi application CM and configuration change control process
5. Configuration Change Control (Test/Validate/Document Changes): There currently is not a formal configuration change control process to test/validate/document changes
6. CM-09 Configuration Management Plan: There currently is not a formal eLi application CM Plan.

In our review of the Requirements Traceability Matrix (RTM) documents, we noted that there does not appear to be any documentation that relates requirements to source control or explains how source control should be used by the development team.
The eLi system design document has a section devoted to source control, which states: "Software AG Designer and webMethods Developer enable you to create, maintain, and manage custom integration packages for use by webMethods Integration Server. Often, many enterprise organizations employ a version control system (VCS) for the development of software solutions, providing automatic auditing, versioning, and security to software development projects. For the eLi project Subversion 1.6 will be used as version control system." Guidelines or requirements for how Subversion is to be used are not mentioned. Kearney also noted that the documentation provided indicates that there is no documented baseline.

Effect: The following list describes the effects of the findings stated in this NFR:
1. Because the source control process is not documented, if there is any kind of traceability from requirements back to source code, the USCO is unaware of it. It is possible that eLi contractors are following best practices and could easily trace a requirement back to all the change sets that make up that requirement, but the USCO is not aware whether this is happening or not. Without this information:
   a. It can be extremely difficult to deploy specific features or choose not to deploy those features
   b. It is difficult to understand the history of source code and why a change was made
   c. There is no guarantee that there is a documented reason for a change. If changes are required to be associated to requirements/bugs within the source control system, then each change is something that is ultimately within the purview of the USCO and not a change that a developer might complete on their own initiative
   d. It becomes more difficult to gauge how complex a feature was to implement
   e. It becomes more difficult to gauge how long a requirement took to implement (level of effort, costs)
2.
It is not possible for the USCO to monitor and confirm that the CM processes are being followed
3. The USCO has no insight into whether the vendor's Standard Operating Procedures (SOP), if they exist, are in accordance with the LOC SDLC process
4. Without documented baselines, software rollbacks may not be successful. Subversion is being used in the case of eLi; thus, rolling back to different versions of the software is possible, but the SOPs for accomplishing this process are not currently within the purview of the USCO
5. It is unknown whether any version of the software could be identified as a "baseline"
6. It is unknown whether questions could be answered, such as:
   a. What were the implemented requirements during the last release, and what are they during this release? From several of the eLi test documents, we noted that many bugs were listed as "no change." This could be attributable to deploying the wrong version of the code because a proper version control branching system does not exist
   b. Did this functionality work during the last release? Can we deploy that version and test it out?
   c. What is every code change that has ever been associated to this requirement? Who wrote the code? Who reviewed it?

Overall, there is a risk that, in the event that the USCO replaces the original vendor, the replacement vendor cannot access the source code in a workable, organized manner (e.g., the code and the controls placed on it in Subversion [code library system] may not fully port over to the next vendor).

1.3.1 Design and Analysis of Alternatives Improvements

Condition: The USCO did not have established policies and requirements to assure that an Analysis of Alternatives (AoA) is formally documented during the Technical Analysis, Design, and Architecture Phases of its SDLC. The absence of a proper AoA and an explanation of why a certain design is required can lead to overly complicated designs that may ultimately cause significant project delays, increased costs, and, in some cases, project failures.
For the eLi project, we did not observe evidence that an AoA was performed during the Design Phase.

Below, we noted examples of inconsistencies in processes used for the design and analysis of alternative solutions when compared with CMMI best practices for Technical Solutions. In the design documents provided, we did not observe any discussion of how alternatives were evaluated or how solutions were weighted against each other. For example, the webMethods Business Process Modeling tool suite was chosen as a Commercial Off-the-Shelf (COTS) product for eLi because it was expected that a significant amount of functionality would be available out of the box. However, there is no discussion about how determinations were made that a certain feature or functionality should be custom software development versus configuring webMethods components. None of the documents reviewed explored the idea that it might take more time to customize or configure webMethods versus creating a brand new component or using a different component to interface with webMethods. Additionally, the decision to build part of the system using webMethods and other parts of the system using custom development code (e.g., .NET or Java) came years after the start of the project. This exposed the USCO to the time and costs associated with significantly re-addressing requirements against alternative solutions.

According to the COTS Selection Criteria Evaluation Worksheet completed in 2011 as part of the technical proposal by the selected contractor, zero customization was to be required for webMethods. That is, 100% of the functionality required by eLi was either considered "out of the box" or "configurable." Zero requirements were considered either "not met" or "need[ing] customization." Nevertheless, at some point during the development life cycle, it became evident that webMethods could not support the entire eLi system with zero customization.
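As a purely illustrative sketch (the criteria, weights, and scores below are invented, not drawn from any eLi document), the kind of weighted alternatives evaluation the design documents lacked can be stated in a few lines:

```python
# Hypothetical weighted decision matrix for an Analysis of Alternatives (AoA).
# Criteria, weights (summing to 1.0), and 1-5 scores are invented examples.
CRITERIA = {
    "out_of_box_fit": 0.4,
    "customization_effort": 0.3,  # higher score = less effort required
    "lifecycle_cost": 0.2,        # higher score = cheaper to own
    "vendor_support": 0.1,
}

ALTERNATIVES = {
    "configure COTS (webMethods)": {"out_of_box_fit": 4, "customization_effort": 3,
                                    "lifecycle_cost": 3, "vendor_support": 4},
    "custom build (.NET/Java)":    {"out_of_box_fit": 2, "customization_effort": 2,
                                    "lifecycle_cost": 4, "vendor_support": 3},
}

def weighted_score(scores: dict) -> float:
    """Sum of each criterion score multiplied by its weight."""
    return round(sum(CRITERIA[c] * s for c, s in scores.items()), 2)

ranked = sorted(ALTERNATIVES, key=lambda a: weighted_score(ALTERNATIVES[a]),
                reverse=True)
for alt in ranked:
    print(alt, weighted_score(ALTERNATIVES[alt]))
```

The value of recording such a matrix is less the arithmetic than the audit trail: it documents which criteria were considered, how they were weighted, and why the losing alternatives were rejected.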
Kearney reviewed the documents provided as part of project management, system design, and system development documentation, but we could find no indication that the design team undertook the task of establishing criteria to help them determine the best course of action (e.g., customize, configure, reuse, build) for each requirement of the system. A significant aspect that is not addressed in any solutions documentation we reviewed is the justification for provisioning the massive amount of infrastructure they built. According to the eLi SAR dated December 10, 2014,3 over 100 servers were originally proposed for eLi. This number was spread over five environments: Proof of Concept, Development, Test, Pre-Production, and Production. Averaging 20 servers per environment for an application that is expected to process 3,300 applications every six months appears to be an astonishing amount of computing power. For comparison, StackOverflow.com, one of the top 50 most popular sites on the internet, uses around 25 servers. StackOverflow.com receives roughly 66 million page loads per day. Additionally, for comparison purposes, Congress.gov has approximately 100 servers, including the backup recovery site.

Ultimately, the number of servers for eLi was reduced to 46. Excluding the database servers in each environment, there remained nine servers used for production. That number still appears to be significantly higher than necessary for the amount of traffic eLi is expected to serve. In the eLi SAR, the contracted developer noted that "there were possibilities that each of the nine servers/services for each environment did not have to reside on a separate physical server, resulting in less than nine physical servers required per environment." This being the case, and without decision analysis documentation, it is not clear why this multi-server design was left in place. Traffic engineering or simulated "load" testing would provide a realistic estimate of the processing platforms required to support the application and user access. According to eLi management's response to Kearney's questionnaire, "[the] USCO doesn't believe that any load testing was done since the application was still in the development phase," which once again raises the question as to why the amount of servers in the design was considered necessary. Kearney was not able to interview the vendor team to inquire why their documentation did not reflect the rationale for the initial number of proposed servers. Again, we would note that for a system to have a design as complex as the one that was ultimately adopted for eLi, it should be expected that the design analysis documents justify why that amount of complexity is required.

3 eLi Assessment Project Final eLi System Assessment Report Draft, December 10, 2014

Effect: The lack of a proper AoA and an explanation of why a certain design is required can lead to overly complicated designs that can ultimately cause significant project delays, increased costs, and, in some cases, project failures. The fact that the document had to be completely rewritten is indicative of the complexity of the design proposed by the original contractors. With as complicated as the design and installation guides are, it is not surprising that the eLi SAR completed on December 10, 2014 states: "once [the developer] gets the go ahead to proceed, the remaining environments could be brought on-line within several months."

1.4.1 Development Phase Improvements

Condition: The USCO does not have adequate policies and procedures, as defined in the Project Management Life Cycle (PMLC) and SDLC guidance, for conducting oversight and monitoring of the Development Phase processes and practices of its contractors. Although the PMLC and SDLC include planning for and monitoring Development Phase activities, there is no specific language to require "flowdown"
of the PMLC and SDLC processes to contractors performing development work for the USCO, nor guidance for USCO staff to monitor contractor adherence to the PMLC and SDLC. Monitoring of a contractor's work is primarily tied to contract deliverable (end-product) approvals via coordination between Project Managers and the Contracting Officer's Representative (COR). However, on the eLi project, there is a lack of oversight and knowledge of the internal development methodologies of the software development contractor, including coding standards and procedures. Based on our inquiries and assessment of the Copyright eLi project team, there is little information known about how the contractors are developing code for systems and what practices they are following.

Below, Kearney noted examples of inconsistencies in processes used for development and coding practices when compared with CMMI development best practices in Technical Solution processes. Kearney requested documentation from the eLi project team intended to encompass "system development methodologies and work environment standards for eLi to include: development environment standards and tools, coding standards and procedures." Based on the documentation provided by the eLi project team, we were unable to confirm that the tools, standards, and procedures were in accordance with the standards required by the SDLC practice, as that level of detail was not maintained and monitored by the USCO eLi project team. Specifically, the documentation provided does not contain the information necessary to properly assess whether the vendor was following a proper SDLC. For example:
- What are the coding standards?
- Are there peer reviews of custom code?
- Are unit tests required? How often are tests run? Are tests automated to run upon check-in?
- Is there a Continuous Integration (CI)/Continuous Development (CD) pipeline?
- What tools are used by the development team?
- How is branching and merging of code done? For example, is it possible to work on bug fixes in the current production release while simultaneously working on new features intended for the same release?
- Can a version of the code be deployed from any point in time?

Kearney was told that the information to answer the questions above was not provided to the USCO by the contractor. Without this information, the USCO cannot confirm whether LOC SDLC practices are being followed by the vendor. In response to our questionnaire, eLi management in the USCO noted: "Since development was outsourced to the contractor, the USCO is unaware of what toolsets were utilized for development, compiling, and deployment. Since these items were not fully specified within the software Architecture Design Document (ADD), we assume that there was no automated build and deployment."

Kearney reviewed the eLi SAR4 and noted the following consistent responses from Copyright eLi project team members recorded in that assessment:
- "Copyright Technology Office (CTO) is not aware of any existing unit test data, but it is assumed that unit tests were run. However, unit test reports were not specified as deliverables"
- "To the best of our knowledge, coding standards were neither defined (at acquisition) nor used on this project"
- "Since development was outsourced to the contractor, United States Copyright Office (USCO) is unaware of any existing development guides. None were provided as Government Furnished Information"
- "Since development was outsourced to the contractor, USCO is unaware of any code reviews that were held. The software was a straight deliverable."

4 eLi Assessment Project Final eLi SAR Draft,
dated December 10, 2014, developed by vCentra

Because the USCO does not have insight into the methods and practices used to develop eLi, it cannot have confidence that the software is built in a manner that is robust, well-tested, and easily maintainable. Additionally, software that follows development best practices (e.g., consistent unit testing, automated deployments, and continuous integration) ultimately saves significant time and money over software that is not developed using these practices.

Effect: The USCO is not able to confirm that its SDLC and the best practices outlined in the CMMI sub-practices are being followed by the vendor. The effects and benefits of following best practices in software development have recently gained attention in the Federal Government. The Government has recently acknowledged that the processes within the Implementation Phase are critical for successful software; the activities within the Implementation Phase (or lack thereof/failure) determine the overall success of the project. The United States Digital Services Playbook (https://playbook.cio.gov) reflects this fact by focusing no less than eight of its 13 "plays" on the Development/Implementation Phase of software. Projects not following modern development best practices will result in:
- More bugs
- Significantly longer development times
- Difficult-to-maintain/fragile code bases
- More system downtime
- A system that is more expensive to maintain
- A system that performs poorly under load
- Difficulty in finding developers to maintain the system due to unconventional implementations.

1.5.1 Deployment and Operations Improvements

Condition: The USCO does not have adequate policies and procedures defined for deployment and operations, or requirements defined in the contract and SOW, requiring contractors to provide detailed documentation explaining how custom software is deployed or how COTS products are integrated with custom components.
In some cases, there is almost no information known about how the contractors are deploying, testing, and verifying the software deployed into an environment. Policies, contracts, and SOWs should specify that contractors provide a detailed deployment guide describing how changes are tested and deployed, as well as an installation guide explaining how to perform the customizations. Below, Kearney noted examples of incomplete processes, including a lack of documentation explaining procedures and processes related to testing and verifying deployed software and procedures for performing customizations.

We reviewed the contractor's system development testing procedures and noted that there are checklists, smoke tests (tests run to verify that an application's main features work properly before proceeding with more rigorous testing), and automated tests that are provided to ensure that an environment has been correctly configured. We did not observe, however, any documentation that explains how new code is deployed into an existing environment. Kearney also did not observe evidence that any kind of automated build, continuous integration, or continuous deployment system is in place. Automating the deployment process has several important benefits, including:

- Significant cost and time savings. For complicated deployments, manual processes can take several hours; automated deployments eliminate this cost.
- The process is completely repeatable and immune to human error. This gives the team assurance that deployments will succeed in any environment and eliminates a common cause of production issues: forgetting to perform one or more deployment steps.
- The knowledge of deployment is not limited to a few individuals. When deployments are automated, anyone on the team can perform deployments, as opposed to manual deployments, which are generally performed by one individual.
If this individual leaves the team or is unavailable, the other team members have to learn the process, which costs significant time and money.

The eLi Test and Evaluation Master Plan explains that the development team:

- "Designs, develops, and updates the webMethods and Data Pro components
- Performs 'smoke' testing in the development environment prior to updating the testing environment
- Performs system testing for webMethods and Data Pro components
- Documents and resolves problems found during testing
- Reviews resolution entries in the test report."

The contract development team did not adequately correct, track, and communicate remediation of faulty code identified by Copyright acceptance testing. Kearney noted that test results for Contract Line Item Numbers (CLIN) 3 and 4 contain large numbers of test cases that are marked as "no change." CLIN 3 had a 96% fail rate and CLIN 4 had a 74% fail rate. This indicates that either: 1) bugs are not being properly resolved; 2) bug fixes are not being properly deployed to the test environment; 3) bug fixes that are actually being deployed to the test environment were not being communicated to the acceptance testing team; or 4) developers improperly managed source code library modules and builds. Deficiencies in the software deployment processes (as evidenced by a high code failure rate) will result in additional costs and time necessary to complete the project.

Effect: The USCO is not able to confirm that the contractor is following internal best practices for product integration and deployment in accordance with the LOC SDLC Implementation Phase. In addition, the process of deployment was performed manually, which is costly, time-consuming, and more error-prone. Best practices recommend an automated deployment process, which is less labor-intensive and results in fewer errors. Finally, the large number of test cases marked "no change"
seems to indicate that deployments are failing to include all the bug fixes meant for a new deployment.

1.6.1 IT Governance Improvements

Condition: The USCO did not verify and monitor contracted system development work for alignment with LOC OCIO governance for project management and system development processes being performed by contracted development firms. The eLi PMP, dated August 14, 2014, provided guidelines "to ensure alignment with the Project Management Institute's Project Management Body of Knowledge and the Library of Congress System Development Lifecycle (LOC SDLC) requirements." The PMP included project management guidance that aligned with the PMLC and deliverables aligned with the LOC SDLC. However, the eLi project team did not perform effective monitoring of contractor execution against the LOC SDLC, including:

- The USCO did not require or approve the contractor's SDLC practices to ensure alignment with the LOC SDLC
- The project management team did not proactively monitor the development contractor's execution to these standards, as evidenced by the project's lack of knowledge of the contractor's development practices.

Kearney observed an absence of requirements in contracts and SOWs for development contractors to adhere to SDLC practices and produce specified deliverables, as follows: "There is no mention of a requirement to adhere to an SDLC in the original contract for implementation of a configured version of webMethods, which is a COTS product requiring extensive configuration. The omission of requiring contractor development work to be conducted in a structured, professional manner could lead to late or non-delivery of software products, or provide a contractor with lower-cost options to deliver software, resulting in higher failure and non-compliance rates."
Effect: The lack of mature IT governance and monitoring processes leads to an inconsistent application of project standards and controls, causing potential issues with quality software development and implementation. An absence of development standards may affect delivery of fully defined and properly implemented stakeholder requirements. A lack of systems development governance oversight can result in inconsistencies with the quality and standards of deliverables, leading to cost overruns and missed project milestones.

1.10.1 USCO Project Oversight Policies

Condition: The project cost management procedures do not include effective cost monitoring and controls, including reporting and analyzing cost variances and their causes, as well as tracking corrective actions for IT investments. Cost tracking information was observed in multiple formats with varying levels of detail, but no evidence was provided of project cost monitoring and analysis of cost variances and corrective actions. Below, we noted examples of inconsistencies in project cost management, including tracking, reporting, and analyzing cost variances, as well as managing corrective actions. The cost information for eLi was available in various formats, including General Ledger postings, draft budget estimates, and total project costs by vendor. We did not observe life-cycle analysis of variances of projected costs versus actual costs, including corrective actions. Without the cost analysis, variance explanations were not included in any cost information provided.
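Earned value analysis, the formal technique this section recommends for forecasting cost to complete, reduces to a few ratios. The sketch below uses the total approved funding figure from Exhibit 11 as the budget at completion; the planned value, earned value, and actual cost figures are illustrative only, not eLi's actual numbers.

```python
# Earned value analysis sketch (BAC from Exhibit 11; other figures illustrative).
# BAC: budget at completion; PV: planned value of work scheduled to date;
# EV: earned value (budgeted cost of work actually done); AC: actual cost.

def evm(bac, pv, ev, ac):
    cpi = ev / ac    # cost performance index (<1 means over budget)
    spi = ev / pv    # schedule performance index (<1 means behind schedule)
    eac = bac / cpi  # estimate at completion, assuming current cost efficiency
    return {"CV": ev - ac, "SV": ev - pv, "CPI": cpi, "SPI": spi, "EAC": eac}

m = evm(bac=1_890_000, pv=1_000_000, ev=800_000, ac=1_600_000)
assert m["CPI"] == 0.5        # each dollar spent earned 50 cents of planned work
assert m["EAC"] == 3_780_000  # projected final cost at this efficiency
```

Run at regular intervals, these metrics surface cost and schedule variances early, which is exactly the monitoring the report found missing.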
Below, we detail cost information provided from various sources:

Exhibit 11: eLi Observed Funding and Costs

- 2010 Original funding (Copyright Licensing Division Congressional Budget Request): $1,100,000
- 2011 Additional funding request (Copyright Licensing Division Congressional Budget Request): $790,000
- 2011 Total approved funding: $1,890,000
- 2011 Estimated acquisition costs in General Ledger ($613,000 personnel costs and $2,014,000 contract costs): $2,627,000
- Total expenditures in General Ledger through October 13, 2016: $11,623,000

Although we observed spreadsheets showing total project costs by vendor, contract value, and amounts expended, the eLi project management team recorded its analysis inconsistently, making it difficult to determine the correlation between cost increases and SDLC phases. The USCO oversight practices did not include a strategic review and approval of project cost increases and timeline extensions. With significant cost increases, project management should have used a formal cost analysis practice, such as earned value analysis, to forecast cost to complete based on work completed versus remaining work. Using a formal estimation process and analyzing variances in the schedule and budget as the project progresses provides more accurate estimates of cost to complete. Project management could not produce a formal process for cost tracking, variance analysis, or review of corrective actions outside thresholds.

Effect: The lack of a formal project cost analysis and tracking methodology at the project level, and of an LOC policy requiring eLi to report investment activities to OCIO, has several negative effects in relation to adequately assessing project costs and risks.
These include:

- eLi project management has not adopted formal cost analysis techniques for tracking and reporting estimated and actual costs
- eLi has not applied formal cost variance analysis methods, including identifying causes and tracking corrective actions
- Without regular and frequent monitoring and reporting of costs, variances, and corrective actions, there is increased risk surrounding the ability to make informed and timely decisions about investments.

3.1.1 Project Management Scope and Schedule Improvements

Condition: eLi project management did not have adequate scope and schedule management controls in place that aligned with best practices. Below, we provide evidence of these conditions:

- Project management did not effectively manage risks that impacted the project scope and schedule. The Risk Register was last updated in 2014 and contains risks impacting scope and schedule that were not mitigated and tracked to closure
- Project management did not effectively track scope and schedule issue remediation to closure. The issue log was last updated in 2014, and it does not document issues that led to scope and schedule changes
- Project management did not effectively manage changes to the scope and schedule
- Project management did not document departures from the planned project schedule in the project log with an associated justification/reason. The eLi project log documents "expanded scope to include certification;" however, we did not observe change requests documenting schedule and scope changes.

Effect: Mismanaged eLi scope changes directly affected the project schedule and increased the cost of the project. As the LOC PMLC states, controlling the project scope ensures all requested changes and recommended corrective or preventive actions are processed through the Perform Integrated Change Control process.
The LOC PMLC also discusses how communications, risk, and unanticipated changes can impact the schedule and/or outcome of the project.

Human Resource Management Improvements

Condition: USCO project management did not have adequate Human Resource management in place that aligned with best practices. Below, we provide evidence of these conditions:

- Project management did not follow the plans identified in the eLi PMP to review the project at each phase end, or at least quarterly, for accuracy and compliance with project documentation and to update the project plans. Kearney observed outdated Resource Management Plans and PMPs; the Resource Management Plan was last updated in 2013 and the PMP was last updated in 2014
- Project management did not follow the guidance to document a plan for adjusting resources during the project Closeout Phase. As a result, it is unclear whether excessive or sufficient resources are assigned to complete closeout tasks.

Effect: If project team members do not possess required competencies and proper training is not provided to new resources, the performance and success of the project can be jeopardized. When resource mismatches and changes are identified, proactive responses, such as hiring, schedule changes, or scope changes, and documentation updates should be initiated. USCO project costs, schedules, risks, quality, and ultimate success may be significantly affected by inadequate resource management and a misunderstanding of required roles and responsibilities. Without effective Human Resource planning and management, staffing issues may disrupt the project team from adhering to the PMP, causing the schedule to be extended or the budget to be exceeded. Key benefits of effectively managing the project team include influencing team behavior, managing conflict, resolving issues, and appraising team member performance.
Failing to formally plan the method and timing of releasing resources from a project can significantly increase the likelihood of Human Resources risks occurring during or at the end of the project and of unnecessary resource costs being charged to the project. Inefficient resource management planning may have been a key factor in the eLi project management team failing to properly maintain project documentation, track/meet deliverables, and report on the performance and requirements of project resources.

3.4.1 Communications Management Improvements

Condition: The USCO project did not follow the LOC PMLC guidance for maintaining a PMP and Communications Management Plan. The PMP and Communications Management Plan were last updated in 2014. Per the eLi PMP, "project managers will review the project at each phase end, or at least quarterly, for accuracy and compliance with project documentation." Ineffective communication creates a gap between diverse stakeholders who may have different cultural and organizational backgrounds, levels of expertise, and perspectives and interests, which may impact or have an influence upon the project execution or outcome. Timely communication among diverse project team members should be addressed in a communications management plan for items such as requirements updates, design reviews, test readiness, and project status, including risks and issues.

Effect: The lack of a requirement for eLi to follow the LOC PMLC has several negative effects with regard to eLi project management maintaining a formal documented Communications Management Plan. As stated in Section 2.3 of the LOC PMLC, "communications, risk, and unanticipated changes can impact the schedule and/or outcome of a project and it is important to plan in advance how these changes will be addressed."
Section 2.3 also discusses how ineffective communication creates a gap between diverse stakeholders who may have different cultural and organizational backgrounds, levels of expertise, and perspectives and interests, which may impact or have an influence upon the project execution or outcome. The USCO also projects that lacking a formal Communications Management Plan may lead to conflicts with suppliers and internal team members due to miscommunication.

3.5.1 Risk and Issue Management Improvements

Condition: The eLi project team did not follow the LOC PMLC requirement to deliver and maintain a Risk Management Plan describing how project risk assessments will be structured and performed. Additionally, the eLi project team did not follow the LOC PMLC requirement to maintain a Risk Register and issue log throughout the life cycle of the project. Although a Risk Register and issue log were created early in the project, they were not maintained and updated throughout the project; eLi project management last updated these documents in 2015 and omitted issues related to project resources. Issues and risks were identified as findings in the 2014 eLi SAR but were not acknowledged and tracked for remediation on the Risk Register/issue log. The 20150701 Updated eLi Governance Board Meeting Notes document the concerns/issues with resources dating back to the beginning of the project; however, these issues are not captured in the Risk Register or issue log.

Effect: The lack of a requirement for eLi to follow the LOC PMLC has several negative effects with regard to eLi Risk Management Plans, issue logs, and Risk Registers. As stated in Section 2.3 of the LOC PMLC, "communications, risk, and unanticipated changes can impact the schedule and/or outcome of a project and it is important to plan in advance how these changes will be addressed."
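A Risk Register of the kind the PMLC requires is, at minimum, a maintained list of risks in which each entry carries a mitigation strategy and a status, so that unmitigated items surface at every review. A minimal sketch (the fields and entries are illustrative, not eLi's actual register):

```python
# Minimal Risk Register sketch; fields and sample entries are illustrative.
from dataclasses import dataclass

@dataclass
class Risk:
    rid: str
    description: str
    impact: str       # e.g., "scope", "schedule", "cost", "quality"
    mitigation: str   # empty string = no documented strategy
    status: str       # "open", "mitigated", or "closed"

register = [
    Risk("R-01", "Resource shortages", "schedule", "", "open"),
    Risk("R-02", "Requirements churn", "scope",
         "Baseline requirements and apply change control", "mitigated"),
]

# Review step: flag open risks with no documented mitigation strategy,
# like the resource-shortage risk the report cites.
unmitigated = [r.rid for r in register if r.status == "open" and not r.mitigation]
assert unmitigated == ["R-01"]
```

Running a check like this at each quarterly review would have surfaced the undocumented mitigation for the resource-shortage risk long before it became a project issue.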
Project risk is defined as an uncertain event or condition that, if it occurs, has a positive or negative effect on one or more project objectives (e.g., scope, schedule, cost, quality). The Risk Management Plan documented resource shortage risks, which eventually led to project issues. The Risk Register stated that "if resource shortages are not addressed, project timelines and/or project quality will suffer (i.e., either agree to live with greater risks due to lower quality or lengthen the timeline)." The Register does not document a mitigation strategy for this risk. Ineffective risk management may have been a factor in risks and issues not being addressed in a timely and formal manner.

4.2.1 Requirements Management Improvements

Condition: The USCO does not have guidelines, policies, or standardized processes for managing requirements throughout the SDLC or the ongoing support and enhancement phases. Kearney observed requirements documentation in multiple formats with varying levels of detail and limited bidirectional traceability. Additionally, we did not observe requirement baselines, meaning that they either do not exist or have not been maintained as part of the ongoing project. A requirements baseline provides a defined, confirmed, and validated set of detailed requirements at specified checkpoints throughout a project life cycle. These baselines should be used to obtain metrics showing the amount of change (and when the changes occurred) throughout the project. The cost of a project increases exponentially depending on how late project additions or changes to requirements are discovered. By tracking changes to requirements against documented baselines, the USCO is better able to determine and manage risks to project schedules and budgets. Below, we noted examples of inconsistencies in processes used for requirements management when comparing with best practices for project management.
While there was a requirements baseline and traceability for the eLi project, this baseline document was created in 2011, and we were unable to validate that this baseline has been maintained and managed since that time. Current processes for the eLi project appear to create a new Requirements Traceability Matrix (RTM) for each change, with new numbering and limited context of where the requirements fall in the hierarchy of the overall system. The original baseline should have been maintained as a living document, with iterations for subsequent releases resulting in new baselines over time to show how the system has matured and changed over the life of the project. Subsequent RTMs and requirements documentation do not contain the same level of detail or traceability to detailed use cases or test cases, which are critical for management and validation of the requirements. Because new RTMs are created for each change, there is no way for the USCO to evaluate whether changes implement new functionality, revise existing functions, or remove functionality no longer required. Kearney observed many different versions of RTMs containing a limited subset of requirements without the level of detail needed to properly implement the requirements. In addition, we were unable to observe how these requirements were validated and approved (statuses) or how changes were managed throughout the development life cycle for individual releases. Without a clear and formal process for validating and approving requirements, there is no way for the USCO to ensure that stakeholder requirements have been fully or correctly defined into detailed product requirements for development. The lack of change management also results in an inability to track and manage impacts to schedule, scope, or final deliverables.
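The baseline comparison and bidirectional traceability described above become mechanical checks once requirements live in a single maintained RTM. A sketch with hypothetical requirement and test-case IDs (not eLi's actual artifacts):

```python
# Sketch: diff two requirement baselines, then check bidirectional
# traceability (requirement -> test case and back). IDs are hypothetical.

baseline_2011 = {"REQ-1": "File claim", "REQ-2": "Pay fee", "REQ-3": "Print receipt"}
baseline_next = {"REQ-1": "File claim", "REQ-2": "Pay fee online", "REQ-4": "Email receipt"}

added   = sorted(baseline_next.keys() - baseline_2011.keys())
removed = sorted(baseline_2011.keys() - baseline_next.keys())
changed = sorted(k for k in baseline_2011.keys() & baseline_next.keys()
                 if baseline_2011[k] != baseline_next[k])
assert (added, removed, changed) == (["REQ-4"], ["REQ-3"], ["REQ-2"])

# Traceability: every current requirement needs at least one test case,
# and every traced test case must map to a current requirement.
rtm = {"REQ-1": ["TC-10"], "REQ-2": ["TC-11", "TC-12"]}  # REQ-4 has no test
untested = sorted(baseline_next.keys() - rtm.keys())
orphans  = sorted(rtm.keys() - baseline_next.keys())
assert untested == ["REQ-4"] and orphans == []
```

The added/removed/changed counts at each baseline give exactly the change metrics the report says baselines should provide, and the untested/orphan lists answer whether functionality was added, revised, or removed.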
Effect: The lack of formal policy and guidance regarding the establishment of standard processes to manage requirements throughout the complete SDLC, including ongoing maintenance and enhancement releases, resulted in the following:

- Inability to effectively view and manage project requirements
- Inability to fully trace requirements throughout the development process
- Inability to effectively manage changes to existing requirements or new requirements as they are developed.

4.3.1 Requirements Validation Improvements

Condition: The USCO does not have standard guidelines, processes, or templates across programs for ensuring requirements are fully refined and updated as needed during the Validation and Design Phases. Documentation is maintained in various formats and levels of detail across and within projects.

Below, we noted examples of inconsistencies in processes used for ensuring requirements are fully captured during validation. The use case documentation, which is necessary to prepare validation test cases by showing exactly how requirements will be implemented, does not include traceability back to originating requirements. Traceability is needed to ensure that all requirements have been accounted for and will meet stakeholder needs. Additionally, the use cases provided for review are in multiple formats and do not provide enough detail to correctly determine how the functionality gets triggered or where it fits into the overall system. The impact of this deficiency was confirmed by review of the testing and test validation documents, which show where the application does not meet the requirements and what the shortcomings are, but do not show how the product was validated to arrive at these conclusions.
The test plan documentation reviews show references to the RTM and design document; however, although we requested those documents for audit purposes to verify how they interfaced or whether they contained the full information needed to validate the product, the USCO was unable to produce them as requested. Therefore, we concluded that they did not exist or that the USCO did not have document control in place.

Effect: The lack of defined processes and policies to validate the requirements through detailed design (e.g., use cases, wireframes, functional design) results in the USCO being unable to verify whether stakeholder requirements have been fully defined and implemented throughout the release cycle. This lack has resulted in the inability of the USCO and contractors to:

- Validate that stakeholder requirements have been captured in the detailed requirements
- Verify that the functionality built meets the requirements
- Validate/verify the functionality during the Testing and Acceptance Phases.

4.4.1 Requirements Verification Improvements

Condition: The USCO does not use LOC SDLC standard guidelines, processes, or standard templates for verifying that requirements are fully tested and implemented during the Verification Phase. Documentation is maintained in various formats and levels of detail across the project. Below, we noted examples of inconsistencies in processes used for requirements verification when comparing with best practices for requirements management and quality assurance for the eLi system. Documentation provided for review on the eLi project shows where the application does not meet the requirements and what the shortcomings are, but it does not show how the product was validated to arrive at these conclusions. The provided test plan references an RTM and design document; however, these associated documents were not available for review to verify how these documents work together or whether they contain the full information needed to validate the product.
Based on responses to the auditor's questionnaire and the work products provided, we were unable to determine whether any peer reviews were conducted throughout the development life cycle.

Effect: Each program at the USCO has implemented its own formats and processes for requirements verification, to differing levels of success. There is no clear way for the USCO to review individual projects against the stakeholder requirements to verify what has been implemented against the stakeholder and detailed product requirements.

Detailed Audit Findings: OFORS

1.1.2 Requirements Definition Improvements

Condition: Library Services is not using a standardized process or format for eliciting, documenting, or tracking development requirements. The stakeholder requirements are well-defined in the tickets/issues and gap analysis documentation; however, the decomposition of these into detailed product development requirements was not apparent in the specifications provided. While the stakeholder requirements trace to work items, these work items and deliverables are split out into multiple documents and formats, hindering a traceable view of the stakeholder requirements to specific required code changes and releases. The process of developing customer requirements into detailed, traceable product development requirements is critical to ensuring that customers' high-level needs are understood to the level needed to correctly implement new functionality that meets these needs. The later in the process that requirements and changes are discovered, the more expensive they are to address, as it increases the amount of rework and time required to fulfill the customer requirements.

Effect: The OFORS project was unable to demonstrate a repeatable or provable process that standardizes the requirements definition tasks needed to elicit, analyze, or establish requirements based on the customer's need.
The information, where it exists, is contained in multiple documents and locations and does not provide a comprehensive, consolidated picture for a point in time. This has resulted in the inability of Library Services and contractors to:

- Confirm that the stakeholder requirements have been captured and delineated into detailed requirements
- Confirm that the functionality built meets the requirements
- Validate/verify the functionality during the Testing and Acceptance Phases
- Have a single managed source for tracking the development of requirements.

The process of developing customer requirements into detailed product requirements is critical to ensuring that customers' high-level needs are understood to the level needed to correctly implement new functionality that meets these needs. Through analysis, additional requirements, interactions, and rules become apparent which may not otherwise be found until after the incorrect product has been developed. The later in the process that requirements and changes are uncovered, the more expensive they are to address, as it increases the amount of rework and time required to fulfill the customer requirements.

1.2.2 Source Control and Configuration Management Improvements

Condition: Library Services does not require contractors to conform to a well-defined process for managing and versioning the source control of the systems they have contracted to have built. It is not known whether source control and, therefore, configuration management (CM) is being employed at all. For example, the practice of code check-in/check-out procedures and branching/source code CM strategies (critical elements of version control) was only enforced for in-house development efforts. Also, there was no knowledge of development guides used by the contractor and no requirement in the contract for any type of development guide.
The lack of formal CM practices (e.g., check-in and check-out procedures and branching strategies) impacts the ability to safely control changes, which poses significant risk to the ability to maintain the integrity of the code and to prevent unauthorized changes. To assess whether formal CM practices were implemented in accordance with the CMMI best practices detailed above, we reviewed the CM Plan which, while explaining the process for adding a change to the system, does not speak to procedures for performing the actual updates to the software. Under "Perform Update," the document simply states: "The OFORS Project Manager at VTLS (development contractor) will assign a responsible party to perform the system update in accordance with VTLS Standard Operating Procedures." Critical elements of a CM Plan should include details about the source control system and procedures for using it, a branching strategy that defines the process for integrating code, and rollback procedures for returning code to a previous version. The CM Plan did not address the following elements:

- What, if any, source control system is used
- What the branching (software code version control) strategy is
- How rollbacks to prior software code versions can occur.

Further, if the VTLS Standard Operating Procedures (SOP) are documented, they were not provided to LOC. Kearney was told that this was not any kind of official document; rather, it was just a promise from VTLS (the OFORS developer) to follow its SOPs. The CM Plan indicates that there is a sound change control process in place, in which changes are reviewed, approved, and deployed. However, Kearney could not verify how baselines, or archives, could be retrieved in the case a rollback is necessary. None of the documentation we received explains how, or even whether, the contractor is using source control.
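The missing CM Plan elements (source control system, branching strategy, rollback procedure) reduce to questions any source control system answers directly. The sketch below simulates tagged baselines and a rollback with an in-memory store; a real project would use its actual version control system (e.g., a tag in Subversion, which the report later notes OFORS may use) rather than this illustrative class.

```python
# Sketch: tagged baselines and rollback, the CM elements the Plan omits.
# This in-memory store is purely illustrative of what source control provides.

class Repo:
    def __init__(self):
        self.versions = []   # ordered history of committed code states
        self.baselines = {}  # baseline tag -> index into versions

    def commit(self, state):
        self.versions.append(state)

    def tag_baseline(self, tag):
        """Record the current state as a named, retrievable baseline."""
        self.baselines[tag] = len(self.versions) - 1

    def rollback(self, tag):
        """Return the exact code state recorded for a tagged baseline."""
        return self.versions[self.baselines[tag]]

repo = Repo()
repo.commit("release 1.0 code")
repo.tag_baseline("1.0")
repo.commit("release 1.1 code with a bad change")

# A failed deployment can be rolled back to the last known-good baseline:
assert repo.rollback("1.0") == "release 1.0 code"
```

Without documented baselines of this kind, the "roll back in accordance with Standard Operating Procedures" step the CM Plan invokes has nothing concrete to roll back to.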
The CM Plan states: "If [sic] the OFORS Project Manager does not override a negative ISSO recommendation [sic], the changes to the production systems must be rolled back in accordance with OFORS Standard Operating Procedures." The reference to rolling back changes implies that some kind of source control is being used, but how that source control is used is not currently within the purview of LOC. The SOPs of VTLS were never provided to LOC.

Effect: The following list describes the effects of this finding:

1. Because the source control process is not documented, if there is any kind of traceability from requirements back to source control, Library Services is unaware of it. It is possible that OFORS contractors are following best practices and could easily trace a requirement back to all the change-sets that make up that requirement, but LOC is not tracking whether this is happening. Without this information:
   a. It can be extremely difficult to deploy specific features or to choose not to deploy those features
   b. It is difficult to understand the history of source code and why a change was made
   c. There is no guarantee that there is a documented reason for a change. If changes are required to be associated with requirements/bugs within the source control system, then each change is ultimately within the purview of Library Services and not a change that a developer might complete on his/her own initiative
   d. It becomes more difficult to gauge how complex a feature was to implement
   e. It becomes more difficult to gauge how long a requirement took to implement. With change-sets associated with requirements, Library Services can form a rough idea of how long each requirement took to implement. This information can be used to determine whether the contractor's Level of Effort (LOE) estimates are accurate.
Combined with regular sprints in an agile SDLC, this can be an extremely effective early warning system for determining whether LOEs are significantly off base and, therefore, whether the project schedule or manpower needs to be adjusted.
2. It is not possible for Library Services to monitor and confirm that the CM processes are being followed; LOC has no insight into whether the vendor's SOPs (if they exist) are in accordance with Library Services' SDLC process.
3. Without documented baselines, software rollbacks may not be successful. In the case of OFORS, it is unknown if source control is being used at all. It is unknown whether any version of the software could be identified as a "baseline." It is unknown whether questions could be answered, such as:
   a. What were the implemented requirements during the last release and what are they during this release?
   b. Did this functionality work during the last release? Can we deploy that version and test it?
   c. What is every code change that has ever been associated with this requirement? Who wrote the code? Who reviewed it?
4. Overall, there is a risk that, in the event Library Services replaces the original vendor, a replacement vendor cannot access the source code in a workable, organized manner (e.g., the code and the controls placed on it in Subversion [a code library system] handed over to the next vendor).

1.4.2 Development Phase Improvements

Condition: Library Services does not have adequate policies and procedures, as defined in the LOC PMLC and SDLC, for monitoring the Development Phase processes and practices of its contractors. Although the PMLC and SDLC include planning for and monitoring Development Phase activities, there is no specific requirement to "flow down" processes to contractors performing development work for LOC, nor guidance for LOC staff to monitor contractor adherence to best practices in its PMLC and SDLC. Monitoring of contractors'
work is primarily tied to deliverable approvals via coordination between Project Managers and the COR. However, for OFORS, there is a lack of oversight and knowledge of the internal development methodologies of the software development contractor, including coding standards and procedures. We observed in the assessments of OFORS that little is known about how the contractors are building systems and what practices they are following. Similarly, the OFORS project team did not receive details from the contractor regarding development, compilation, and deployment toolsets. The contractor may be following best practices, but without the agency requiring implementation documentation, this cannot be confirmed. It is critical for Library Services to assess what toolsets contractors plan to use for developing, compiling, and deploying (such as coding standards, unit testing procedures, and development guides), as well as to monitor the contractor's implementation of these development standards and procedures during the Development Phase. Without knowledge and approval of contractors' development methodologies and toolsets, it is difficult for Library Services to confirm alignment with SDLC, Enterprise Architecture, and CMMI best practices for development. Below, Kearney noted examples of inconsistencies in the processes used for development and coding practices when compared with CMMI development best practices in Technical Solution processes. Kearney requested documents for the assessment of Development Phase best practices that encompass "system development methodologies and work environment standards for OFORS to include: development environment standards and tools, and coding standards and procedures." Based on the documentation provided, we intended to confirm that the tools, standards, and procedures were in accordance with best practices or the requirements in the SDLC. The documentation provided, however, does not provide that level of detail.
The documentation provided includes a requirements gap analysis and a vision document, which, while helpful, do not explain how the development team builds OFORS. Because LOC staff did not obtain details of the contractor's development methodologies and toolsets, the answers to the following questions posed in our audit questionnaire for developers are not known:

- What are the coding standards?
- Are there peer reviews of custom code?
- Are unit tests required? How often are tests run? Are they automated to run upon check-in? Is there a pipeline?
- What tools are used by the development team?
- How is branching and merging of code done? For example, is it possible to work on bug fixes in the current production release while simultaneously working on new features intended for the same release?
- Can a version of the code be deployed from any point in time?

Kearney was told that information supporting the questions above was not provided to Library Services by the contractor. Best practices for assessing and monitoring a contractor's development methodologies to ensure alignment with the LOC SDLC and CMMI best practices were not in place.

Effect: Library Services is not able to confirm that SDLC procedures and the best practices outlined in the CMMI sub-practices are being followed by the vendor. Library Services is simply relying on the vendor to follow terms, such as SOPs, mentioned in the OFORS change management document. The effects and benefits of following best practices in software development have recently gained attention in the Federal Government. Historically, the Federal Government has taken a waterfall approach to software development and focused primarily on the Requirements Phase. The Government has recently endorsed the view that the processes within the Implementation Phase are critical for successful software.
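As an illustration of what the unit-testing questions above probe for, a minimal automated unit test might look like the following. The function, the test, and the office-code example are invented for illustration (the OFORS code base was not available to the audit); what matters is that a check-in hook or pipeline can run such a suite automatically and report pass/fail without human effort.

```python
# Minimal sketch of an automated unit test (hypothetical function and
# data; not drawn from the OFORS code base).
import unittest

def normalize_office_code(code):
    """Normalize an overseas office code: trim whitespace, upper-case."""
    return code.strip().upper()

class NormalizeOfficeCodeTest(unittest.TestCase):
    def test_strips_and_uppercases(self):
        self.assertEqual(normalize_office_code("  cairo "), "CAIRO")

    def test_already_normal(self):
        self.assertEqual(normalize_office_code("NAIROBI"), "NAIROBI")

# A check-in hook or CI pipeline would run the whole suite like this
# and fail the build if any test fails.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeOfficeCodeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

An affirmative answer to "are they automated to run upon check-in?" simply means this run happens on every commit, so a regression is caught before it reaches a release.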
The activities within the Implementation Phase (or the lack or failure thereof) determine the overall success of the project. The U.S. Chief Information Officer (CIO) and the Federal CIO Council's Digital Services Playbook (https://playbook.cio.gov) reflects this fact by focusing no fewer than eight of its 13 "plays" on the Development/Implementation Phase of software. Specifically, projects not following modern development best practices will result in:

- More bugs
- Significantly longer development times
- Difficult-to-maintain, fragile code bases
- More system downtime
- A system that is more expensive to maintain
- A system that performs poorly under load
- Difficulty in finding developers to maintain the system due to unconventional implementations.

1.5.2 Deployment and Operations Improvements

Condition: Library Services does not have adequate policies, procedures, or SOW requirements for contractors to provide detailed documentation explaining how custom software is deployed or how COTS products are integrated with custom components. For OFORS, there is little information within the project team on how the third-party vendor assembles the custom OFORS deployment packages and scripts. Policies and SOWs should specify that contractors provide a detailed deployment guide describing how changes are tested and deployed, as well as an installation guide explaining how to perform the customizations. There is no specific language in contracts to require "flowdown" of SDLC processes or best practices to contractors performing deployment and operations work for Library Services, nor guidance for Library Services staff to monitor contractor adherence to deployment and operations best practices. Below, Kearney noted examples of incomplete processes, including a lack of documentation explaining procedures and processes related to testing and verifying deployed software and procedures for performing customizations.
The OFORS implementation plan explains that Phase III (including customization) "is the most important phase of the OFORS implementation plan. This phase deals with the actual details of how OFORS is set up for each office, transitioning of the data and preparing the system so the offices can begin using OFORS for some, if not all operations." Additionally, the document states: "This phase can be summarized into a few categories and may take anywhere between a couple of weeks to a month to be completely ready to start operating in [OFORS]." Kearney agreed that the Customization Phase of Virtua, the vendor-proposed software, is the most important phase. However, we did not observe any documentation explaining what procedures and processes are used during this phase. Kearney also reviewed the OFORS System Administration Manual (SAM), which states: "The Systems Administration Manual contains key information and Standard Operating Procedures (SOPs) necessary to maintain the system effectively. The manual provides the definition of the software support environment, the roles and responsibilities of the various personnel, and the regular activities essential to the support and maintenance of the system." However, we noted that this document appears to be in draft mode. Several sections appear to be questions posed to the development contractor. For example, the document contains comments such as, "Can Virtua [software] report account inactivity? Can Linux provide a clue?" In all of the documents provided for the audit team to review, there is little to no information on the tools, coding languages, or procedures the development contractor uses in assembling the custom OFORS deployment packages and scripts. We did not observe evidence that OFORS has an automated software build or deployment process as part of the product integration process.
Automating the deployment process has several important benefits, including:

- Significant cost and time savings. For complicated deployments, manual processes can take several hours; automated deployments largely eliminate this cost.
- The process is completely repeatable and immune to human error. This gives the team assurance that deployments will succeed in any environment and eliminates a common cause of production issues: forgetting to perform one or more deployment steps.
- The knowledge of deployment is not limited to a few individuals. When deployments are automated, anyone on the team can perform them, as opposed to manual deployments, which are generally performed by one individual. If this individual leaves the team or is unavailable, the other team members have to learn the process, which costs significant time and money.

Effects: Library Services is not able to confirm that the contractor is following internal best practices for product integration and deployment in accordance with the LOC SDLC Implementation Phase. In addition, the deployment process was performed manually, which is costly, time-consuming, and more error-prone. Best practices recommend an automated deployment process, which is less labor-intensive and results in fewer errors.

1.6.2 IT Governance Improvements

Condition: Library Services did not verify and monitor contracted system development work for alignment with best practices or LOC OCIO governance for the project management and system development processes performed by contracted development firms. Kearney observed an absence of requirements in contracts and SOWs for development contractors to adhere to SDLC practices. There is no mention of a requirement to adhere to an SDLC in the original SOW.
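The repeatability benefit listed above can be sketched by modeling a deployment as an ordered list of named steps that runs identically in every environment. The step names below are invented for illustration; the point is that no step can be forgotten, and a failed step is reported rather than silently skipped.

```python
# Minimal sketch of a repeatable, automated deployment (hypothetical
# step names; OFORS's actual packages and environments are unknown).
def deploy(environment, steps, run_step):
    """Run every deployment step in order; stop at the first failure.

    Returns (completed_steps, failed_step). Because the same step list
    runs in every environment, nothing depends on one person's memory.
    """
    completed = []
    for step in steps:
        if not run_step(step, environment):
            return completed, step  # report the failed step for rollback
        completed.append(step)
    return completed, None

STEPS = [
    "backup-database",
    "stop-services",
    "copy-release",
    "migrate-schema",
    "start-services",
    "smoke-test",
]

# A fully successful run: every step executes, none fails.
done, failed = deploy("staging", STEPS, lambda step, env: True)
```

In a real pipeline, `run_step` would shell out to the actual packaging and installation commands; the control flow above is what makes the process auditable and immune to a forgotten step.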
The omission of a requirement that contractor development work be conducted in a structured, professional manner could lead to late delivery or non-delivery of software products, or provide a contractor with lower-cost options to deliver software, resulting in higher failure and non-compliance rates.

Effect: The lack of mature IT governance and monitoring processes leads to an inconsistent application of project standards and controls, causing potential issues with quality software development and implementation. An absence of development standards may affect the delivery of fully defined and properly implemented stakeholder requirements. A lack of systems development governance oversight can result in inconsistencies in the quality and standards of deliverables, leading to cost overruns and missed project milestones.

1.10 Library Services Oversight Policies

Condition: Library Services' project cost management policies and procedures do not include effective cost monitoring and control, including reporting and analyzing cost variances and their causes, as well as tracking corrective actions for IT investments. Cost tracking information was observed in multiple formats with varying levels of detail, but no evidence was provided of project cost monitoring, analysis of cost variances against original estimates, or corrective actions. OFORS provided cost information in various formats, including internal consolidated payment histories, budgets based on CLINs, and an acquisition cost schedule. We did not observe evidence that the project followed a formal process for analyzing costs, including variances, causes, and corrective actions. Original budget requests, estimated acquisition costs, and General Ledger postings are listed below:

- The 2011 Original Budget Appropriation request was $500,000 (total investment estimate of $2,500,000 over a five-year period)
- The 2011 original acquisition costs (Schedule Attachment 1) were estimated at $1,736,000
- The 2016 General Ledger postings of invoices for the OFORS contract, from the start of the project through December 2016, totaled $1.23 million.

Library Services did not monitor OFORS under the IT Investment Management (ITIM) process until the third quarter of FY 2016, and did not follow the LOC standard cost tracking and monitoring methodology prior to that point. As a Firm Fixed Price contract, there is no expectation for OFORS of projected-versus-actual variance reporting, since any cost variances are considered the vendor's concern and risk. Original delivery of the developed system with all functionality was planned for September 2012; after ultimately having to pursue a legal cure with the contractor, the contractor agreed to provide all functional requirements by December 2017. However, proactive monitoring of the contractor's actual work completed against costs and schedule might have provided earlier insight into the risks the contractor was experiencing, resulting in more effective mitigation of the risk of non-delivery. Ultimately, the contractor did not deliver all the content required in the timeframe expected, and a settlement had to be made. While contracting costs were contained by contractual agreements (the Firm Fixed Price contract), product delivery was incomplete and delayed, as were the benefits of the envisioned end product. Additionally, as the investment continues past the planned completion dates, LOC incurs added internal costs related to contract oversight that were not initially identified.

Effect: The lack of formal policy and guidance regarding cost monitoring and control methodologies across the agency, including all service units, has several negative effects on the ability to adequately assess project costs and risks.
These effects include the following:

- OFORS has not adopted proactive monitoring of the contractor's actual work completed against costs and schedule, which might have provided earlier insight into risks and enabled more effective mitigation
- OFORS has not applied formal cost variance analysis methods, including identifying causes and tracking corrective actions to closure
- Without regular and frequent monitoring and reporting of costs, variances, and corrective actions, there is increased risk to the ability to make informed and timely decisions about IT investments.

3.1.2 Project Management Scope and Schedule Improvement

Condition: OFORS did not have adequate scope and schedule management controls in place that aligned with best practices. Below, we provide evidence of these conditions:

- OFORS project management did not document and track project scope/schedule risks and issues in a Risk Register or issue log
- OFORS project management did not effectively manage changes to the scope and schedule following the Scope Management process. We did not observe change requests documenting schedule and scope changes aligned with best practices.

Effect: The lack of plans for managing scope and schedule for OFORS poses risks that can lead to mismanaged scope and schedule changes, which could affect the project schedule and increase project costs. There is no documented process that guides the project in managing changes, risks, and issues related to scope and schedule. As best practice guidance states, "controlling the project scope ensures all requested changes and recommended corrective or preventive actions are processed through the Perform Integrated Change Control process."

3.3.2 Human Resources Management Improvements

Condition: Library Services project management did not have adequate Human Resource Management processes in alignment with best practices. Kearney noted the following conditions for OFORS:
- Project management did not develop a Human Resource Management Plan that aligned with project management standards. Project management did not outline roles, responsibilities, required skills, and reporting relationships using techniques such as a RACI (Responsible, Accountable, Consulted, Informed) chart, a matrix clarifying who holds each of those roles
- Project management did not follow the guidance to document a plan for adjusting resources during the Project Closeout Phase. As a result, it is unclear whether excessive or sufficient resources are assigned to complete closeout tasks.

Effect: If project team members do not possess the required competencies and proper training is not provided to new resources, the performance and success of the project can be jeopardized. When resource mismatches and changes are identified, proactive responses, such as training, hiring, schedule changes, or scope changes and documentation updates, should be initiated. Library Services project costs, schedules, risks, quality, and other project areas may be significantly affected by inadequate resources and a misunderstanding of roles and responsibilities. According to best practices, effective Human Resources planning should consider and plan for the availability of, or competition for, scarce resources. Having a clear understanding of project resource requirements can help avoid conflicts with other projects competing for Human Resources with the same competencies or skill sets. Project roles should be designated for teams or team members, and those teams or team members can be from inside or outside the organization performing the project.
Without effective Human Resource planning and management, staffing issues may prevent the project team from adhering to the PMP, causing the schedule to be extended or the budget to be exceeded. A key benefit of effectively managing the project team is that doing so influences team behavior, manages conflict, resolves issues, and appraises team member performance. Failing to formally plan the method and timing of releasing resources from a project can significantly increase the likelihood of Human Resources risks occurring during or at the end of the project, and of unnecessary resource costs being charged to the project. Inefficient resource management planning may have been a key factor in LOC projects failing to properly maintain documentation, track and meet deliverables, and report on the performance of project resources.

3.4.2 Communications Management Improvements

Condition: Library Services projects did not follow the LOC PMLC guidance for delivering and maintaining a Communications Management Plan. Project management did not follow best practices or LOC guidance to create and maintain a Communications Management Plan, which was listed as a project deliverable due 14 days after award in the base contract. Project resources communicate through various status meetings, utilize tools to manage project deliverables and schedules, and prepare project reports; however, project communications requirements were not centrally documented and maintained. A Communications Management Plan facilitates effective and efficient communications with the various audiences who have a major stake in the project. Effective two-way communication between stakeholders is key to the success of the project. Good communication limits surprises, prevents duplication of effort, and helps reveal omissions and misallocations of resources early enough to permit corrections.

Effect: Library Services'
lack of a requirement for OFORS to follow the LOC PMLC has several negative effects with regard to OFORS project management maintaining a formally documented Communications Management Plan. As stated in Section 2.3 of the LOC PMLC, "communications, risk and unanticipated changes can impact the schedule and/or outcome of a project and it is important to plan in advance how these changes will be addressed." Best practice guidance also discusses how ineffective communication creates a gap between diverse stakeholders who may have different cultural and organizational backgrounds, levels of expertise, and perspectives and interests, which may impact or influence the project execution or outcome. Library Services projects lacking a formal Communications Management Plan may experience conflicts with suppliers and internal team members due to ineffective communication.

3.5.2 Risk and Issue Management Improvements

Condition: The OFORS project team did not follow best practices or the LOC PMLC requirement to deliver and maintain a Risk Management Plan that describes how project risk assessments will be structured and performed. Additionally, the OFORS project team did not follow best practices or the LOC PMLC requirement to deliver and maintain a Risk Register, which serves as a record of risks, mitigation strategies, contingency plans, and resolutions throughout the life cycle of the project. Although an issue log was created, it was not maintained and updated throughout the project. The issue log was last updated in 2015 and did not include project management issues. Interviews with OFORS staff revealed that project issues occurred related to the vendor not meeting deliverable requirements; these issues were not included in the issue log and tracked to closure.

Effect: Lack of enforcement of the requirement for OFORS to follow the LOC PMLC has several negative effects with regard to OFORS maintaining Risk Management Plans, issue logs, and Risk Registers.
As stated in Section 2.3 of the LOC PMLC, "communications, risk and unanticipated changes can impact the schedule and/or outcome of a project and it is important to plan in advance how these changes will be addressed." Project risk is defined as an uncertain event or condition that, if it occurs, has a positive or negative effect on one or more project objectives, such as scope, schedule, cost, and quality. Project management relied on the vendor to maintain project documentation but did not include a Risk Management Plan as a deliverable. The vendor's PMP did not address managing risk or maintaining issue logs and Risk Registers. OFORS project management did not ensure that critical risks impacting scope, schedule, budget, business performance, and/or change management were proactively identified, communicated, mitigated, and escalated in a timely manner.

4.2.2 Improvements to Requirements Management

Condition: Library Services does not have guidelines, policies, or standardized processes for managing requirements throughout the SDLC or the ongoing support and enhancement phases. Kearney observed requirements documentation in multiple formats with varying levels of detail and limited bidirectional traceability. Additionally, we did not observe requirement baselines, meaning that they either do not exist or have not been maintained as part of the ongoing projects. A requirement baseline provides a defined, confirmed, and validated set of detailed requirements at specified checkpoints throughout a project life cycle. These baselines should be used to obtain metrics showing the amount of change, and when the changes occurred, throughout the project. The cost of a project increases exponentially depending on when additions or changes to requirements are discovered.
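The change metrics described above amount to diffing successive baselines. The sketch below assumes each baseline is recorded as a set of requirement IDs at a checkpoint; the IDs and checkpoint names are hypothetical.

```python
# Minimal sketch of measuring change between two requirement baselines
# (hypothetical requirement IDs; illustrative only).
def baseline_delta(previous, current):
    """Report requirement IDs added and removed between two baselines."""
    return {
        "added": sorted(current - previous),
        "removed": sorted(previous - current),
    }

# Baselines captured at two hypothetical checkpoints.
design_baseline = {"REQ-101", "REQ-102", "REQ-103"}
build_baseline = {"REQ-101", "REQ-103", "REQ-104"}

delta = baseline_delta(design_baseline, build_baseline)
# delta shows a requirement was added and another dropped after design sign-off
```

Recorded at each checkpoint, these deltas provide exactly the "amount of change and when it occurred" metric, and late additions (the most expensive kind) become visible immediately.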
By tracking these changes against documented baselines, Library Services is better able to determine and manage project risks to schedules and budgets. Below, we noted examples of inconsistencies in the processes used for requirements management when compared with best practices for project management. OFORS did not provide an observable baseline for the overall system. Requirements documentation provided for the evaluation is contained in many disjointed documents of varying types and formats. The lack of cohesion in the documentation provided is a result of the lack of an organization-wide policy or guidance on what the development team needs to deliver and how this information should be managed. The OFORS program creates detailed functional specifications, which capture requirements and provide detailed implementation guidelines; however, the requirements are not documented in a format that facilitates overall requirement tracking or change management. This practice of progressing to detailed design without fully documenting and linking requirements does not provide a demonstrable way for Library Services stakeholders to validate that their requirements are understood and complete prior to moving into the Design Phase, which increases the risk of missing requirements. Because the requirements are embedded in the functional specifications, the OFORS requirements are not manageable: there is no way to track when changes are made or when requirements are added or removed. Currently, documents must be reviewed individually, and reviewers must either have extensive system knowledge or hold multiple meetings to understand and ensure the documentation captures all system functionality.

Effect: The lack of formal policy and guidance regarding the establishment of standard processes to manage requirements throughout the complete SDLC, including ongoing maintenance and enhancement releases, resulted in the following:

- Inability to effectively view and manage project requirements
- Inability to fully trace requirements throughout the development process
- Inability to effectively manage changes to existing requirements, or new requirements as they are developed.

4.4.2 Requirements Verification Improvements

Condition: Library Services does not use best practices or LOC SDLC standard guidelines, processes, or standard templates for verifying that requirements are fully tested and implemented during the Verification Phase. Documentation is maintained in various formats and levels of detail across and within projects. Below, we noted examples of inconsistencies in the processes used for requirements verification when compared with best practices for requirements management and quality assurance for OFORS. Documentation provided for review (e.g., use cases, release requirements, release notes, tickets/issues, test cases) is disjointed and requires management or peer reviewers to track information in multiple sources. The use cases do not contain enough detail to ensure correct implementation or verification. Based on information provided by the project team, peer review meetings were conducted as needed, but evidence was not available to confirm or verify the output from these meetings.

Effect: Each program at Library Services has implemented its own formats and processes for requirements verification, with differing levels of success. There is no clear way for Library Services to review individual projects against stakeholder requirements to verify what has been implemented against the stakeholder and detailed product requirements.

5.1.1 Inadequate Access Control Policy and Procedures

Condition: The OFORS Information Technology Security Program Manager (ITSPM), in coordination with the System Owner, did not formally approve and ensure consistent implementation of documented procedures for account management and monitoring controls across the six OFORS overseas offices.
The Library Services OFORS Operations SAM was not finalized and formally approved and has not been updated since 2014. The SAM includes the settings and procedures to follow for controls, including regular and privileged user account management. The lack of clear guidance compounds system administration issues, as OFORS operates in loose coordination across multiple remote staff offices, making consistent and current administration documentation critical to maintaining a secure systems landscape. Improper or inconsistent application of account management controls can lead to unauthorized system access and changes to the data within the system. We identified the following specific violations of LOC policy and non-compliance with the NIST Security Guidance:

- Two of eight new users added during FY 2016 across all six OFORS overseas offices were not properly approved prior to receiving system access
- Eleven of all 14 privileged users did not have Privileged User Rules of Behavior forms approved by the System Owner
- Sixty-nine of a total of 397 user accounts had been inactive for over 30 days but were not disabled as required by LOC directives
- The OFORS ITSPM, in coordination with the System Owner, has not implemented and monitored procedures for periodic recertification of users' continued access. Recertification includes validating that each user's permissions are still required based on his or her job function
- The OFORS ITSPM, in coordination with the System Owner, did not ensure periodic reviews were performed on audit logs containing privileged user activity, through either application or network/server audit logs.

Effect: Failure to implement and follow the system access control policies identified in IT SDir 01 could lead to unauthorized access to OFORS, including unauthorized transaction entries and approvals.
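The inactive-account check the directive requires is straightforward to automate once last-login dates are available per account. The sketch below uses the 30-day threshold cited above; the account names and dates are invented for illustration.

```python
# Minimal sketch of detecting accounts idle past the 30-day threshold
# (hypothetical accounts; assumes last-login dates are recorded).
from datetime import date, timedelta

def accounts_to_disable(accounts, today, max_idle_days=30):
    """Return names of accounts whose last login is older than allowed."""
    cutoff = today - timedelta(days=max_idle_days)
    return sorted(
        name for name, last_login in accounts.items() if last_login < cutoff
    )

# Illustrative data only.
accounts = {
    "user_a": date(2016, 9, 1),   # idle well past 30 days
    "user_b": date(2016, 11, 20), # recently active
}
stale = accounts_to_disable(accounts, today=date(2016, 11, 30))
```

Run on a schedule, such a check would have flagged the 69 inactive accounts noted above before they accumulated, rather than leaving detection to an audit.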
Failure to periodically review and update the OFORS SAM, including system-specific security measures, contributes to the risk of system compromise by unauthorized users. The absence of appropriately completed and authorized OFORS Access Management Forms and Privileged User Rules of Behavior, along with a lack of annual user recertification, could lead to granting users access to unauthorized job functions. If inactive accounts are not reviewed and removed in a timely manner, they provide an opportunity for malicious attacks and increase the risk of loss, theft, or misuse of OFORS resources. Lastly, failure to implement the audit logging and review procedures reduces the system's ability to identify attempted or completed actions and respond to adverse events affecting data managed by OFORS.

5.2.2 Lack of Security Control Assessment Documentation and Detailed Planned Remediation Procedures

Conditions: The OFORS ITSPM did not effectively oversee and maintain security assessment and authorization documentation. Documentation of the most recent Security Control Assessment (SCA) for the OFORS application in Archer GRC did not include the following:

- A final and approved Security Assessment Plan (SAP); LOC provided Kearney with the OFORS SAP draft with no approval signatures from the appropriate LOC management officials
- Evidence of a completed Security Assessment Report (SAR).

The ITSPM also did not effectively manage system vulnerabilities. In our evaluation of the OFORS POA&M, Kearney identified the following:

- No new weaknesses have been added to the listing since the initial accreditation for OFORS in 2014
- Of a total of five POA&M items, Kearney identified one open, low-impact weakness, started on August 13, 2014, that did not have documented planned remedial actions.
Effect: Failure to detail the remediation actions required to address weaknesses may lead to weaknesses lingering longer than necessary, thus increasing risk to the system. Further, these weaknesses might never receive attention from the agency's executives and System Owners, as the planned remedial actions are not documented in the POA&M. The documented control assessment provides evidence of the current security posture of OFORS. A lack of documented control assessment results may hinder efforts to address any vulnerabilities identified. Thus, the SAR is a key requirement and serves as formal documentation for inclusion in a completed authorization package. Finally, in order to make sound strategic decisions about enterprise-wide priorities and the allocation of resources, the agency's executives require a comprehensive understanding of all information security risks. Failure to perform assessments of the OFORS security posture weakens management's decision-making ability.

5.3.2 Lack of Configuration Management Plan and Security Impact Analysis

Conditions: LOC policy states that the System Owner shall develop and maintain a CM Plan for systems under his or her purview. The OFORS System Owner was unable to provide a finalized copy of the OFORS CM Plan, although one existed in draft form. Kearney assessed the process for testing and approving changes prior to release to production for OFORS. The OFORS draft CM Plan states that changes to any element covered by the plan must have a Technical Service Request (TSR) associated with them. In our testing of the tracking of OFORS system changes to TSRs, Kearney noted a lack of clear mapping between OFORS release test plans and TSRs. We could not reconcile from the documentation maintained that all functionality intended for release within the TSRs was tested. Kearney also noted that OFORS software releases did not have documented supervisory or management review and approval prior to being put into production.
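The TSR-to-test-plan reconciliation that could not be performed from the documentation amounts to a set difference: every TSR scheduled for a release should appear in that release's test plan. The TSR numbers below are hypothetical.

```python
# Minimal sketch of reconciling a release's TSRs against its test plan
# (hypothetical TSR numbers; illustrative only).
def untested_tsrs(release_tsrs, test_plan_tsrs):
    """Return TSRs scheduled for the release but absent from the test plan."""
    return sorted(set(release_tsrs) - set(test_plan_tsrs))

# Illustrative release contents and test-plan coverage.
release = ["TSR-14", "TSR-15", "TSR-16"]
tested = ["TSR-14", "TSR-16"]

gap = untested_tsrs(release, tested)  # any entry here reached production untested
```

When release and test-plan records both key on TSR numbers, this check can gate the release approval step noted above: an empty gap is a precondition for sign-off.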
Improperly tested or unauthorized functionality may introduce failures or vulnerabilities into the system. The ability to trace functionality introduced to authorized and tested requests is a key requirement in software development and deployment activities.

The System Owner did not ensure that configuration management processes under his/her purview included and considered the results of a Security Impact Analysis prior to OFORS releases. The LOC Chief Information Security Officer (CISO) stated that this was a recognized issue that is now being prioritized and incorporated into Change Advisory Board procedures.

Effects: Lack of a finalized CM Plan for OFORS may lead personnel to perform configuration or change management processes using an unauthorized draft plan. Without completing a security impact analysis, OFORS changes may have an adverse, unexpected impact on the security state of the system. Failure to trace TSRs to the test plan can lead to unauthorized changes to OFORS. Agreed changes to OFORS may go untracked or unmonitored by the ITSPM and CO, resulting in undelivered changes.

6.1.2 Incomplete Risk Management Framework (RMF)

Conditions: The OFORS System Owner did not ensure that the current OFORS SSP contains the necessary detail to describe the management, operational, and technical safeguards or countermeasures for the information system. Specifically, the following deficiencies were identified:

• Lack of documentation of the system boundary for OFORS in the SSP provided during the Testing Phase of the audit; updates to the OFORS SSP were in progress toward the end of the Testing Phase and into the Reporting Phase of the audit
• Reference and control testing consistent with outdated guidance (NIST SP 800-53, Rev. 3, Recommended Security Controls for Federal Information Systems and Organizations)
• Lack of detailed descriptions as to how OFORS management (ISSO, Control Assessor, Common Control Provider, and System Owner):
  - Has applied security controls, including detail of implemented security controls and residual planned actions to address any weaknesses
  - Updated the system's Archer record based on results from the ongoing continuous monitoring process

Kearney determined that the OFORS System Owner did not ensure completion of the steps within Archer to register the system with appropriate organizational program/management offices. Specifically, the Archer entry noted that the Parent/Child Relationship⁵ section was not addressed.

Effect: Improper documentation of an information system's boundary and identification of the scope of protection for the information system can lead to a failure in communicating the required internal/external control measures. The lack of a clearly defined system description, including system boundaries, could lead to ineffective determination or prioritization of controls to adequately safeguard that system. The use of outdated NIST guidelines can lead to inadequate assessment of the controls needed to prevent system vulnerabilities as threat landscapes change; using current guidance ensures the latest countermeasures are in place, such as provisions to consider advanced persistent threats. Completion and documentation of the RMF for OFORS in the SSP would help mitigate risks to the system, including those inherent to its use overseas. Specifically, the lack of consistent guidelines for the six overseas offices and LOC in Washington, D.C. increases the risk to the confidentiality, integrity, and availability of information and data managed by OFORS.

⁵ A "parent/child relationship" is defined as a governing organization (parent) that owns, manages, and/or controls a system (child).
The RMF is a process where each step is critical to the outcome of the subsequent step. Within each step are separate tasks (e.g., Task 1-3, IS Registration, is part of Step 1, Categorize IS). If the information system categorization step is not completed correctly, System Managers may not adequately apply succeeding steps, such as the selection of controls to implement for the system.

7.1.1 Inconsistency of Operating System Platforms and Inadequate Vulnerability Management

Conditions: The System Owner failed to ensure that OFORS instances are securely and consistently configured. OFORS has a complex operating environment, as it is configured and managed from the LOC Office in New Delhi, India. However, the ISSO sends updated versions to offices located in Cairo, Egypt; Islamabad, Pakistan; Jakarta, Indonesia; Nairobi, Kenya; and Rio de Janeiro, Brazil. According to the Project Manager, these updates would be installed on comparable server configurations at the other locations to keep all OFORS software instances running as free of defects and security issues as the initially approved New Delhi instance. However, this was not the case: scanning completed in October 2016 identified that LOC implemented four different versions of the Linux operating system (OS) across the six offices, at varying levels of security patching.

Additionally, Kearney found that vulnerability scanning/continuous monitoring had not been completed for the OFORS instances prior to October 2016. Further, the OFORS POA&M items available in the Archer Governance, Risk, and Compliance (GRC) tool were created by the ISSO in 2014, and there is no evidence of the ongoing identification of weaknesses through other means and their inclusion in the POA&M listing. In the October 2016 testing, the vulnerabilities for the four servers had varying levels of patch and configuration weaknesses, demonstrating the need for improved vulnerability management. Below is a table with a summary of the results of the latest vulnerability scan performed for OFORS,
which provides the count and severity of vulnerabilities on each server:

Exhibit 12: OFORS Server Vulnerabilities
[Table: vulnerability counts for each OFORS server by severity (Urgent, Critical, Serious, Medium, Minimal); the per-server figures are not legible in the source, where only the totals 139 and 378 survive.]

Effect: Maintaining multiple configurations reduces cost-effective management of the security posture because LOC has to ensure tests for each system are separately planned, executed, and remediated. As evidenced by the table above, the deviations in configuration have led to some system instances having multiple urgent and critical vulnerabilities, while others have none. Adding weaknesses to the POA&M ensures tracking and allows LOC to coordinate remediation actions. A lack of constant updates lessens the attention provided by the agency's executives and System Owners, as the planned remedial actions are not documented in the POA&M. Without implementing the vulnerability identification and remediation process, the OFORS System Owner is unable to determine whether high severity vulnerabilities could lead to compromise of systems and data. If critical or high-risk vulnerabilities remain on servers, there is an increased risk that the system instances could experience a loss of confidentiality, integrity,
or availability.

Detailed Findings: Congress.gov

5.1.2 Lack of Privileged User Review

Condition: The ISSO within the OCIO did not recertify Congress.gov privileged user accounts in FY 2016, as required by LOC policy. Recertification includes validating that each privileged user's privileges are still required based on his/her job function. Kearney noted that administrative accounts have compensating controls, as they are tied to Active Directory accounts, which are managed at the network level. Active Directory accounts should be deactivated or removed within 48 hours upon a user's voluntary separation from the LOC and immediately for involuntary termination.

Regular recertification helps ensure that system users do not accrete access inconsistent with current position responsibilities resulting from new positions and temporary assignments. Regular recertification policies ensure that users are provisioned only with authorized access to accomplish assigned tasks.

Effect: The absence of appropriately completed/authorized privileged user recertification could lead to granting, or continuing to grant, users access to unauthorized Congress.gov job system capabilities and functions. This, in turn, could result in unauthorized access to Congress.gov.

5.2.2 Lack of Security Assessment Plan and Detailed Planned Remediation Procedures

Conditions: Kearney identified the following conditions in our evaluation of LOC's Congress.gov POA&M management and security assessments:

• The Congress.gov ISSO did not retain a SAP (the Congress.gov SCA and SAR were available and reviewed)
• Twenty-two of 23 total POA&M items were listed as open and ongoing; for 10 of the ongoing items sampled, the ISSO did not document planned remedial actions
• For one closed, low-priority item, the ISSO did not attach the Scanning/Pen Test results used as evidence to close the completed item

Effect: Failure to detail remediation actions required to address the weaknesses may lead to weaknesses lingering longer than necessary, thus increasing risk to the system. Weaknesses without planned remedial actions documented in the POA&M would not be considered as started when assessing current security performance metrics. Without a finalized assessment plan, System Managers do not know that an effective or complete control assessment has been performed as advised by the CISO and Security Control Assessor. The System Owner may not be able to assess whether the procedures followed to perform the control assessment were according to a pre-approved scope and consistent with approved roles and responsibilities.

5.3.1 Lack of Configuration Management Planning and Security Impact Analysis

Condition: LOC did not have a CM Plan specific to Congress.gov. CM controls ensure that processes are in place to document changes made to a system's hardware, software, and documentation throughout the development and operational life of the system. Kearney noted that the Congress.gov System Owner did not follow LOC security policy to develop and maintain the Congress.gov CM Plan. We also noted that the Congress.gov System Owner failed to comply with NIST guidance directing organizations to identify and document information system configuration items to be managed in a CM Plan. LOC uses the Change Management Process document as the Congress.gov CM Plan, which is not specific to the Congress.gov system and does not contain Congress.gov configuration items.

Additionally, the System Owner did not follow the LOC security policies to ensure a security impact analysis was performed prior to releases of Congress.gov into the operational environment.
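The pre-release control described above, requiring a security impact analysis before a release enters production, can be sketched as a simple gate. The field names and checks below are assumptions for illustration, not LOC's actual release procedure.

```python
# Hedged sketch of a pre-production release gate: a release proceeds only if
# a security impact analysis (SIA) record, management approval, and a CM Plan
# reference exist. Field names are illustrative assumptions.

def ready_for_production(release):
    """Return (ok, reasons): ok is True only when every gate check passes."""
    reasons = []
    if not release.get("sia_completed"):
        reasons.append("security impact analysis missing")
    if not release.get("approved_by"):
        reasons.append("management approval missing")
    if not release.get("cm_plan_reference"):
        reasons.append("no reference to a finalized CM Plan")
    return (len(reasons) == 0, reasons)

ok, reasons = ready_for_production(
    {"sia_completed": True, "approved_by": "", "cm_plan_reference": "CMP-2016-04"}
)
```

Embedding such a gate in the deployment pipeline makes the missing analysis a blocking condition rather than an after-the-fact audit finding.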
The LOC CISO stated that the lack of a pre-production security impact analysis is a recognized issue that is currently being prioritized.

Effect: Congress.gov configuration items are not documented due to the lack of a CM Plan specific to Congress.gov. Configuration items refer to the various components of the Congress.gov system under configuration control, and they are important for planning for effective management of the system. Without identification of configuration items within a Congress.gov CM Plan, the potential impact of changes to the Congress.gov system may be unclear to stakeholders. LOC may not be able to determine the extent to which changes, vulnerabilities, or operations will affect the security state of the Congress.gov system.

6.1.1 Incomplete Risk Management Framework

Conditions: The System Owner has not fully developed the Congress.gov SSP. Specifically, the following deficiencies were identified:

• The SSP does not identify system interconnections for Congress.gov
• The SSP refers to, and control testing is consistent with, outdated guidance (NIST SP 800-53, Rev. 3, Recommended Security Controls for Federal Information Systems and Organizations)
• There is a lack of detailed descriptions regarding how Congress.gov management (ISSO, Control Assessor, Common Control Provider, and System Owner):
  - Has applied security controls, including detail of implemented security controls and residual planned actions to address any weaknesses
  - Performs periodic updates to Archer (the OCIO system inventory system) based on results from the ongoing continuous monitoring process

Kearney determined that the Congress.gov System Owner did not ensure completion of the steps within Archer to register the information system with appropriate organizational program/management offices. Specifically, for the Congress.gov Archer entry, we noted that entries for network architecture and data flow documentation were not addressed.
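An SSP completeness check covering deficiencies of the kind listed above can be sketched as follows. The section names are hypothetical simplifications for illustration, not a NIST-defined schema.

```python
# Hedged sketch: verify an SSP record covers the sections this report cites as
# deficient (system boundary, interconnections, current control baseline,
# control implementation detail). Section names are illustrative assumptions.

REQUIRED_SECTIONS = {
    "system_boundary",
    "system_interconnections",
    "control_baseline",        # should cite the current NIST SP 800-53 revision
    "control_implementation",
    "continuous_monitoring",
}

def missing_ssp_sections(ssp):
    """Return, sorted, the required sections that are absent or empty."""
    return sorted(s for s in REQUIRED_SECTIONS if not ssp.get(s))

gaps = missing_ssp_sections({
    "system_boundary": "",  # present as a heading but not documented
    "control_baseline": "NIST SP 800-53 Rev. 3",  # outdated, but present
    "control_implementation": "partial narrative",
})
```

A check of this shape only confirms that sections exist and are non-empty; judging whether the cited baseline is current or the narrative is adequate still requires a human assessor.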
Effect: The SSP outlines the security controls in place for the system and provides explanations of the reasons for those protections. The Interconnection Security Agreement (ISA) describes the risks posed by creating trusted connections and transmissions of data between systems. Without clear identification of the systems connecting to Congress.gov, security professionals may not be aware of the protections needed to mitigate potential vulnerabilities. Describing interconnections and taking a coordinated approach allows System Owners to carefully consider the risks that may be introduced when information systems are connected to other systems with different security requirements.

The use of outdated NIST guidelines can lead to inadequate assessment of the controls needed to prevent system vulnerabilities as threat landscapes change; using current guidance ensures the latest countermeasures are in place, such as provisions to consider advanced persistent threats. Non-completion and lack of documentation of the RMF for Congress.gov in the SSP elevates risks to the ongoing operations of the Congress.gov system. Specifically, it is important that Congress.gov stakeholders understand their roles and responsibilities, from documented references in the SSP to procedures for security control assessment and system continuous monitoring, which are key processes for the management of risk to the Congress.gov system. The RMF is a process where each step is critical to the outcome of the subsequent step. Within each step are separate tasks (e.g., the IS Registration task is part of Step 1, Categorization). If the information system categorization step is not completed correctly, System Managers may not adequately apply succeeding steps, such as the selection of controls to implement for the system.

7.1.2 Lack of Timely Vulnerability Mitigation for the OCIO Application Hosting Environment (AHE)

Condition: In coordination with the System Owner, IT security staff failed to remediate vulnerabilities in a timely manner for the servers on which Congress.gov resides. The ISSO failed to ensure that vulnerabilities are remediated in a timely manner.

Condition: In coordination with the Congress.gov System Owner, the AHE System Owner failed to ensure vulnerabilities were remediated in a timely manner for Congress.gov servers hosted in the AHE. The LOC uses Qualys, an enterprise systems scanning tool, to identify vulnerabilities within the LOC systems environments. Qualys reports elevated vulnerabilities discovered in a ranked format: "Urgent," "Critical," and "Serious." Based on the initial discovery of the elevated security vulnerabilities, as reported by the Qualys tool, Kearney noted that vulnerabilities have not been resolved for Red Hat Enterprise Linux (RHEL) servers used for Congress.gov, dating back to June of 2016. Kearney's analysis shows that over 40% of these vulnerabilities have been present on the AHE servers on which Congress.gov resides for over 90 days and have not been remediated. Below is a table of the AHE vulnerabilities with a rating of severity (based on the Qualys scanner rating) and the age of each vulnerability, grouped by operating system.

Exhibit 13: AHE Vulnerabilities
[Table: AHE vulnerability counts by severity and age bucket (<30 days, 31-90 days, 91-180 days, and older), grouped by operating system; the figures are not legible in the source.]

LOC has reviewed the weaknesses on the AHE servers and is in the process of implementing automated patch management. A POA&M item tracking this issue was initially created in October 2014. On November 28, 2016, the AHE System Owner requested a waiver, which was accepted on December 14, 2016, granting an extended remediation date of June 14, 2017. While OCIO is managing the remediation process,
the systems residing on the AHE servers continue operations with elevated threat due to highly exploitable vulnerabilities. This threat increases over time as published security weaknesses become available to the public and automated attacks are created and disseminated.

Effect: Without timely remediation of identified vulnerabilities, the Congress.gov System Owner is unable to determine whether high severity vulnerabilities could lead to compromise of LOC systems and data. If critical or high-risk vulnerabilities remain unmitigated on Congress.gov servers, there is an increased risk that the system servers and data could experience a loss of confidentiality, integrity, or availability. A lack of constant updates decreases the attention provided by the agency's executives and System Owners, as the planned remedial actions are not documented in the POA&M. This can lead to increased risk of compromise of confidentiality, integrity, and availability.

Appendix B: Management's Response

LIBRARY OF CONGRESS

DATE: February 9, 2017
TO: Kurt W. Hyde, Inspector General
FROM: David S. Mao, Deputy Librarian of Congress
SUBJECT: Comments on Draft OIG Audit Report No. 2016-IT-102, FY16 Review of System Development Life Cycle (SDLC)

Thank you for the opportunity to comment on the subject draft report. We appreciate Kearney & Company's review, which includes an assessment of the Library's system development life cycle processes and a detailed review of three development efforts: Congress.gov, the Library Services Overseas Field Office Replacement System (OFORS), and the U.S. Copyright Office Electronic Licensing System (eLi). As described on the attached chart, the Library concurs with the report's findings and recommendations. In the past 18 months, the Library has made significant improvements in the management of its information technology.
Through our centralized Office of the Chief Information Officer, the Library is implementing (and documenting) standards and processes for managing IT investments, the system development life cycle, and the project management life cycle. These practices are enhancing the Library's ability to assess risks and make decisions at each stage of a system's development, including, importantly, the ability to identify quickly when a system has gone off track and to correct course. Last summer, based on an independent third-party assessment of eLi ordered by the Copyright Office, and an internal assessment of OFORS by Library Services, the Library took action on the eLi and OFORS contracts. At the Copyright Office's request, the Library terminated eLi and accepted delivery of existing code. The Library also reduced OFORS scope to essential functionality. The Library will conduct a further review of these projects. Please let me know if you have questions or would like to discuss this in greater detail.

Attachment

cc: Bernard A. Barton, Chief Information Officer
I. Mark Sweeney, Associate Librarian for Library Services
Karyn Temple Claggett, Acting Register of Copyrights
Elizabeth Pugh, General Counsel
Nicole L. Marcou, Special Assistant to the Deputy Librarian; OIG Audit Liaison

Management Comments on Draft OIG Audit Report No. 2016-IT-102, FY16 Review of System Development Life Cycle (SDLC)

Library-Wide Issues

Recommendation 1: LOC should compare current SDLC policies and procedures to industry best practices to ensure development risks are actively monitored and managed.

Comments: Concur. Library Approach: As part of the current effort to ensure consistent use of SDLC and Project Management Life Cycle (PMLC) practices Library-wide, the Library, with consultant support, is developing directives which describe detailed SDLC and PMLC procedures and process steps, including risk management guidance.
To develop the directives, the consultant is taking into consideration industry best practices as well as GAO guidelines on risk management and prior experience from other government agencies. The directives are expected to be completed by Q2 FY17.

Recommendation 2: LOC should monitor current SDLC activity and environmental factors as part of a structured risk assessment framework to ensure policies and procedures identify and address emerging issues/new risks.

Comments: Concur. Library Approach: As noted above, the Library is currently developing SDLC and PMLC directives which include procedures and process steps for ensuring project risks are identified, assigned an owner, tracked, and mitigated as part of an overall risk management plan. These directives are expected to be completed by Q2 FY17. In addition, the Library has recently established an annual independent verification and validation (IV&V) IT project review process through which the risk management process for all major projects will be regularly monitored by the Project Management Office (PMO).

Recommendation 3: COs and Contracting Officer's Technical Representatives (COTR) should collaboratively identify standard SDLC contract elements, including vendor timelines, technical deliverables, required documentation, and internal review and acceptance procedures, as well as ensure that SDLC contracts contain these elements.

Comments: Concur. Library Approach: The Library currently includes in all IT contracts a standard SDLC clause that requires contractors to follow the Library's SDLC and PMLC. By Q2 FY17, the Library will issue a policy and standard operating procedures to ensure that SDLC elements which are relevant to particular IT development projects are incorporated as scheduled deliverables into contracts and that internal review and acceptance procedures are clearly identified and described.
Recommendation 4: LOC should develop policies that clearly delineate required oversight approval for additional funding requests, contract modifications, delivery delays, and inability to meet original technical requirements in all LOC service units.

Comments: Concur. Library Approach: As part of the newly implemented Information Technology Investment Management (ITIM) process at the Library, any changes to an approved investment must be submitted for review and approval by the Library's Information Technology Steering Committee (ITSC) and Executive Committee (EC). Also, the ITIM framework requires quarterly updates on every IT investment, with updates on investment cost estimates and variances, milestone status (including current and anticipated milestone completion dates and delays), and an assessment of overall investment risks. An LCR requiring all service units to follow ITIM is currently under review and expected to be promulgated by Q3 FY17.

Recommendation 5: LOC should clarify funding sources and status of funds reporting requirements.

Comments: Concur. Library Approach: The newly implemented ITIM process at the Library includes identification of funding sources at the proposal phase, and a comparison of proposed budget to actual execution each quarter during implementation. In addition, with the enforcement of the SDLC processes for all Library IT projects, rather than just those within OCIO, all IT projects will be required to budget and report on resources at the project level as well. LCRs requiring Library-wide compliance with the ITIM and SDLC processes are currently under review and expected to be completed by Q3 FY17.

Electronic Licensing System (eLi)

Recommendation 1: If development activities resume, ensure that current LOC policies and relevant industry best practices are adopted by service unit oversight and project management teams.

Comments: Concur.
USCO Approach: USCO will ensure that future efforts to develop electronic submission of cable statements of account will be managed in accordance with Library guidance, and will follow the detailed SDLC and PMLC procedures and process steps that are being formulated by the Library.

For broader context, USCO notes that this audit concerned only a single software project, eLi, which for most of its life was managed outside of the Copyright Technology Office (CTO). IT projects managed within CTO have consistently followed LOC-mandated SDLC methodology. Indeed, a 2015 Inspector General audit found that the USCO was "compliant with SDLC methodology" with respect to its primary software applications, the Electronic Copyright Office and Copyright Imaging System, which support registration and recordation functions and are managed by CTO. See Library of Congress Office of the Inspector General, Report on the Maturity of the Library's System Development Life Cycle Processes and Procedures 8 (Feb. 2015). Accordingly, USCO does not understand this report to be opining on the SDLC practices of USCO as a whole.

USCO also notes that, of its own accord and before this audit commenced, USCO ordered an independent third-party assessment of the eLi project, which was delivered to USCO in December 2014. Although the 2014 assessment found that the "eLi project ha[d] many artifacts that are required by a system development life cycle methodology," the assessment found problems with the SDLC practices used on the eLi project. As a result of the assessment, USCO took immediate steps to remedy these issues, including transferring management of the project to the Copyright CIO, which has expertise regarding SDLC management. In addition, USCO implemented numerous recommendations from the assessment to improve the eLi project, including the creation of a senior steering committee and appointing the Director of CTO as the project manager.
USCO's decision to terminate the eLi project was the direct result of these project management improvements adopted by USCO. After the eLi project came under Copyright CIO oversight, it became apparent that the contractor hired to develop eLi was not performing adequately, and that future operation and maintenance costs would well exceed anticipated costs. As discussed below, USCO has proposed to shift the effort to make an electronic submission system for statements of account in a more cost-effective and efficient direction.

Recommendation 2: USCO should update and clearly define technical requirements and functionality of the systems.

Comments: Concur. USCO Approach: USCO intends to update and define existing business and technical requirements for future efforts to develop electronic submission of cable statements of account, in accordance with Library guidance.

Recommendation 3: USCO should assess elements of existing development activity for possible reuse.

Comments: Concur with comments. USCO notes that developments subsequent to the close of the audit suggest an alternative approach may be more cost-effective and desirable to our external stakeholders. USCO Approach: As the audit report notes, before terminating the eLi project, USCO obtained a copy of the complete customized source code for eLi, so that if there was desire from stakeholders, USCO could continue with the eLi project. Accordingly, to the extent there is such a desire, USCO will ensure that the source code is assessed for reuse. Since the close of the audit, USCO submitted an Investment Proposal to the Library's IT Steering Committee to adopt a spreadsheet-based form for electronic submission of statements of account; many cable companies already prepare the current paper form using a spreadsheet tool. This solution is likely to be cost-effective and easy for the cable industry to implement.
Although initially conceived of as an interim step, feedback from stakeholders regarding USCO's decision to terminate the eLi project and implement spreadsheet-based remittance has been extremely positive. For example, an attorney representing one of the companies that submits a significant number of statements of account described the solution as a "positive development" and a "smart approach." Accordingly, it may be that this solution will fully satisfy stakeholders.

Recommendation 4: USCO should clearly define vendor timelines, technical deliverables, and required documentation as part of the contract and Statement of Work (SOW).

Comments: Concur. USCO Approach: USCO will ensure that contracts and SOWs for any future efforts to develop electronic submission of cable statements of account clearly define vendor timelines, technical deliverables, and required documentation, in accordance with guidance from the Library.

Recommendation 5: Develop reasonable and reliable cost estimates for subsequent development activities and obtain LOC oversight approval.

Comments: Concur. USCO Approach: USCO will develop reasonable and reliable cost estimates for any future efforts to develop electronic submission of cable statements of account, in accordance with guidance from the Library. Like eLi, such efforts will be funded entirely out of the collected royalties (rather than taxpayer funds), and the Office will accordingly ensure adequate transparency regarding the impact of future development on the royalty pool. With respect to Library oversight of eLi, the Licensing Division notes that it first briefed the ITSC about the eLi project in March 2012. As the audit report acknowledges, under then-prevailing Library policies, the eLi project was not selected by the ITSC for continued oversight, and was understood to be moving forward fully under direction of the USCO.
Notwithstanding the understanding that the eLi project was to operate without ongoing Library involvement, USCO reported to various Library components about the status of the eLi project. In August 2015, the Copyright CIO briefed the ITSC on the eLi project to provide an update. Similarly, USCO reported on the eLi status to the Library's Strategic Planning Office (SPO) (now Strategic Planning and Performance Management) as a "secondary" performance target in 2014 and 2015, even though SPO does not require reporting of secondary targets. In addition, the Library's financial statements in 2014 and 2015 did not reference eLi merely because secondary targets are not included in those statements (i.e., only "primary" performance targets are included).

Recommendation 6: USCO should clarify funding sources and status of funds reporting.

Comments: Concur. USCO Approach: USCO will ensure that funding sources and status of funds reporting are reported appropriately for any future efforts to develop electronic submission of cable statements of account, in accordance with guidance from the Library.

Overseas Field Office Replacement System (OFORS)

Recommendation 1: If development activities resume, ensure that current LOC policies and relevant industry best practices are adopted by service unit oversight and project management teams.

Comments: Concur. Library Approach: The Library issued a Cure Notice to the contractor on June 15, 2016. In lieu of a termination for default, development activities have resumed under revised requirements and delivery schedules. Library Services/Overseas Operations (LS/OvOp) will work with the OCIO PMO to ensure that Library policies are followed and relevant industry best practices are adopted for the remaining phases of the PMLC. The expected date for this to be completed is by Q1 FY18.

Recommendation 2: OvOp should update and clearly define technical requirements and functionality of the systems.

Comments: Concur.
Library Approach: As part of the negotiated alternative to termination for default, the Library is revising the OFORS statement of technical requirements and functionality and modifying the contract accordingly. The Functional Design documents for the remaining modules of the system are expected to be completed by Q2 FY17.

3. OvOp should clearly define vendor timelines, technical deliverables, and required documentation as part of the contract and SOW.

Concur. Library Approach: As part of the negotiated alternative to termination for default, the Library is revising the OFORS deliverables and delivery schedule and modifying the contract accordingly. LS/OvOp will provide the documentation for vendor timelines and technical deliverables that are defined as part of the most recent modification to the contract in response to the Cure Notice. The expected date for this to be completed is Q2 FY17.

4. Develop reasonable and reliable cost estimates for subsequent development activities and obtain LOC oversight approval.

Concur. Library Approach: To ensure that there is appropriate and ongoing Library oversight of all development activities for this project going forward, LS/OvOp will prepare an IT investment package that includes all cost estimates currently projected for completion of the project. This package will be submitted to the IT Investment Management Portfolio Office (ITIMPO) for review by the ITSC. If recommended by the ITSC, the IT investment will be incorporated into the Library's IT Investment Portfolio and be provided to the EC and Librarian. As part of the process, LS/OvOp will report quarterly on the health of the investment in the areas of schedule milestones, risks, and costs. The expected date to have this activity fully managed by these processes is Q3 FY17.

5. OvOp should clarify funding sources and status of funds reporting.

Concur.
Library Approach: The LC Financial Reporting System (FRS) Status of Funds Report and Spending Lines reports give breakdowns of expenditures to the BOC level on each contract, but not down to the CLIN level of each contract. Therefore, the funding sources and status of obligated contract funds will be established primarily from invoices paid through Momentum, as well as from Spending Lines reports requested from the Office of the Chief Financial Officer (OCFO). It should also be noted that OCFO did not require or issue cost variance reports, since the OFORS contract is firm fixed-price. As a corrective action, LS/OvOp will develop spreadsheets and/or other documentation to clearly indicate the source of funding for each CLIN as shown on vendor invoices and the CLIN list (Section B) in the contract. The expected date for this to be completed is Q4 FY17.

6. OvOp should address security risks, perform required remediation, and complete all required documentation.

Concur. Library Approach: Library Services had recognized the need to address the security concerns and proper documentation for this project; hence, the service unit had already initiated a re-accreditation process starting in August 2016. This process is currently underway under the guidance of the IT Security Group (ITSG). As a corrective action, LS/OvOp will cover aspects of security risks, required remediation, and associated documentation via the re-accreditation process. The culmination of this process will cover several aspects of risk by including the system in the ITSG-approved Archer system and bringing the system under the purview of the Qualys Continuous Monitoring process to identify and remediate risks wherever possible within the system limitations. Furthermore, LS/OvOp will also work with ITSG to update all required documentation. The expected date for completion of the re-accreditation is Q4 FY17.

7. OvOp should identify necessary personnel requirements
to successfully perform project management and security oversight.

Concur. Library Approach: As a corrective action, LS/OvOp will work with the PMO to ensure the necessary personnel requirements are reported and documented by clearly identifying and defining the project roles and subsequently aligning the project with OCIO PMLC steps and deliverables. Also, the security oversight of the project will be addressed by representing the system in the Library's security risk and compliance system (Archer) and completing the re-accreditation process. The expected date for completion of the Project Roles documentation is Q2 FY17.

Congress.gov

1. OCIO should ensure that continuing development activities incorporate current LOC policies and that relevant industry best practices are adopted by service unit oversight and project management teams.

Concur. Library Approach: Members of the Congress.gov project team have a direct role in several Library-wide initiatives to set policy and best practices, including Agile Software Development, Service Management, Change Management, and Project Management processes. The team will work with OCIO staff and management to ensure that the Congress.gov project adopts best practices and follows OCIO policy in all of these areas, including all elements related to security.

2. OCIO should address security issues for all operating systems and environments, develop timelines for remediating deficiencies, and monitor progress towards resolution.

Concur. Library Approach: The System Owner and Information System Security Officer (ISSO) have reviewed the entire set of security-related processes, documents, and controls. All open items in the plan of actions and milestones have either been addressed or are being addressed. A new security evaluation process has been implemented by the project team and ISSO.
This updated process will ensure that all relevant security documents and controls will be reviewed and revised both on a regular basis and for all new releases. This process was put in place in January 2017. The Congress.gov project team is working with the infrastructure team to test the updated infrastructure. We expect to have the updates and tests completed by Q3 FY17.

Infrastructure changes to improve Linux patching include implementing a new server that is dedicated to patching all Unix operating systems in the LOC environment. This server will automate the pushing of security patches to the Unix systems and will be implemented by Q3 FY17. Additionally, OCIO is updating a standard operating procedure for the management of security patches for all Library servers, in conjunction with applicable guidance, with an expected completion date of Q4 FY17.

Below is a list of milestones:
- Implement new Unix Patching Server: December 2016 – April 2017
- Develop Patching SOP: January – April 2017
- Train Staff on Patching SOP process: April – May 2017
- Implement new process: June – September 2017
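The dedicated patching server described above could push security-only updates with a loop of roughly this shape. This is a minimal sketch only: the host names, the use of ssh, and the yum security-update invocation are illustrative assumptions, as the report does not describe the Library's actual tooling.

```shell
#!/bin/sh
# Illustrative sketch, not the Library's actual implementation.
# push_patches HOST...
# For each host, print the command a central patch server might run to push
# security-only updates. Kept as a dry run so the sketch has no side effects;
# a real deployment would execute the ssh command instead of echoing it.
push_patches() {
  for h in "$@"; do
    echo "would run: ssh $h sudo yum -y --security update"
  done
}

# Example: a hypothetical two-host fleet
push_patches unix01 unix02
```

In practice an automated patching server would also log results per host and report failures, so that remediation status can feed into the continuous-monitoring process mentioned above.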