STATE OF MAINE
PUBLIC UTILITIES COMMISSION

CENTRAL MAINE POWER CO.
Docket No. 2019-00015

Re: Commission Initiated Investigation of Metering and Billing Issues Pertaining to Central Maine Power Company

BRIEF OF THE OFFICE OF THE PUBLIC ADVOCATE

November 19, 2019

Office of the Public Advocate
112 State House Station
Augusta, ME 04333-0112

Table of Contents

I. Introduction
II. Prudence Standard and Burden of Proof
III. Argument
A. CMP’s SmartCare billing and metering system is flawed and customers do not trust that their bills are accurate.
1. BerryDunn’s analysis of customer accounts demonstrates continuing problems with CMP’s billing system.
2. Individual billing system problems continue to exist that undermine confidence in the system.
3. CMP Violated Chapter 815 of the Commission’s Rules.
B. CMP Management was imprudent in the way it developed and implemented SmartCare.
1. CMP’s testing of SmartCare prior to go-live did not align with best practices and CMP did not even follow its own flawed strategy.
i. User Acceptance Testing.
ii. Stress Testing.
iii. Regression Testing.
2. CMP’s failure to create and use a complete Requirements Traceability Matrix was imprudent and points to a need for retesting.
3. CMP’s management of defects in the development and implementation of SmartCare did not conform with best practices and exposed ratepayers to a flawed billing system.
4. CMP’s management of risk in the process of developing and implementing SmartCare was imprudent.
IV. Recommendations
A. The Commission should impose a financial penalty on CMP.
B. The Commission should order third party validation testing of SmartCare.
C. The Commission should require the implementation of an action plan to restore public confidence in SmartCare.
D. The Commission should adopt the Staff’s proposal for independent audits and review process for customers with high usage complaints.
E. The Commission should make no findings with respect to individual high bill complaints to allow customers to pursue any such claims on their own.
V. Conclusion

I. Introduction

The Office of the Public Advocate (OPA) offers this brief on issues arising from the implementation by Central Maine Power Company (CMP) of a new $57 million billing system known as SmartCare in October 2017, including concerns with metering and billing accuracy that have generated an unusually high number of customer complaints.

The records in this proceeding, and in Docket No. 2018-00194, have demonstrated that CMP clearly and inarguably experienced a large number of problems in implementing its SmartCare system. Further, CMP provided exceptionally poor customer service in dealing with customer inquiries and complaints about their bills. To this day, CMP does not appear to have admitted the existence of a problem with SmartCare, even though the sheer number of complaints and the number of actual problems identified by the Commission, the parties, their witnesses and consultants belie any believable claim that “all is well.” Instead, CMP’s responses have the tone of blaming billing issues on the customer, such as having a meter reading outside of a scheduled reading, exchanging meters, etc. Another common refrain has been to deny that there is a problem because the dollar amount billed is correct, despite the fact that much of the information presented on the bill is incorrect. Because of the presentation of such incorrect information, however, a customer will be unable to replicate the calculation of their bill, thereby undermining trust and, in some instances, violating Commission billing rules.

Indeed, this proceeding feels like a game of “whack-a-mole” in which every time a problem is knocked down, another one pops up somewhere else. While CMP does continue to make progress in addressing a number of real issues with SmartCare identified in this proceeding, a large number of outstanding issues remain unresolved.1

1 As of October 24, 2019, there were 9 open defects and 7 open billing cases. EXM-004-007.

Almost two years post go-live, SmartCare continues to suffer significant problems with producing and presenting invoices for customers. Further, some problems that CMP has claimed were fixed resurface later, casting doubt on the effectiveness of the initial fix. Finally, it remains unclear whether all problems have even been identified, due to CMP’s inadequate testing during implementation.
Based on the number of defects described in Ms. Keim’s testimony and the number of defects CMP has identified in its Summary of Billing Cases in EXM-004-004, it is unlikely that a typical customer would understand their invoices as presented, and likely that they would assume the usage shown on the invoices is inaccurate and/or that they are being overbilled for usage that did not occur.2

2 Keim Direct at 51.

Because of the number of problems identified with SmartCare, the continued existence of such problems, the continued identification of new problems, the inadequate testing to demonstrate that all problems have been identified, and CMP’s poor response to customer complaints, the OPA is herein recommending the following actions be ordered by the Commission:

• Impose a $6.5 million downward adjustment to CMP’s cost of equity, as recommended in the Bench Analysis dated February 22, 2019 in Docket No. 2018-00194, based on the Bench recommendations in that proceeding as supplemented by the record in this proceeding.

• Validation of appropriate test plan and test coverage by an independent and qualified testing professional that will provide confidence that testing coverage is adequate, and testing is executed in accordance with industry standards and best practices.

• Develop an action plan to restore stakeholder confidence in SmartCare. This would include the recommendation in the next bullet and continued reporting on the identification and correction of defects in the SmartCare system until all such flaws are identified and corrected.

• Direct the Commission’s Consumer Assistance and Safety Division (CASD) to resolve billing disputes as described in the Bench Analysis by (1) determining whether the customer has been billed accurately for the customer’s actual usage; (2) ordering the Company to adjust the customer’s bill if necessary; and (3) if a customer’s billing concern relates only to high usage, referring the customer’s complaint to CMP’s high-use team and/or Efficiency Maine.

• Make no findings with respect to individual high bill complaints to allow customers to pursue any such claims in court or through individual complaint to the CASD.

As we discuss in the Recommendations section of this brief, these remedies will accomplish the following key outcomes:

• Ensure that SmartCare is working correctly;
• Ensure that adequate systems are in place to respond to future metering and billing complaints;
• Restore public confidence in CMP and its SmartCare system; and
• Hold CMP accountable for shortcomings in its implementation of SmartCare and its response to customer complaints and inquiries relating to billing and metering issues arising from that implementation.

Background

In late October 2017, CMP went operational with its new Customer Information System (CIS) billing program, sometimes referred to as SmartCare or the CRM&B. From November 1, 2017, until the present time, CASD has received numerous calls from CMP customers, with the vast majority related to high bills and possible billing errors. While the CASD has been able to resolve a number of the complaints that have been filed to date, the Commission has found that additional information is needed in order to determine and understand the existence of any metering, billing, and customer communication problems affecting CMP’s ability to provide service to its customers, as well as the source of such problems.
Based on these facts, on March 1, 2018, the Commission initiated a Summary Investigation of these metering, billing, and customer communication issues under 35-A M.R.S. § 1303(1)(C). The Notice of Investigation identified a broad group of issues to be addressed as part of the Summary Investigation, including issues relating to metering, billing and customer communications. Given the highly technical nature of the billing and metering issues to be investigated, the Commission engaged The Liberty Consulting Group to perform a forensic audit of these systems. On July 10, 2018, the Commission expanded the scope of Liberty’s audit to include the customer communication issues.

On December 20, 2018, Liberty submitted its Final Report of the Forensic Audit of CMP’s Metering and Billing Systems to the Commission. The Liberty Report identified numerous shortcomings in CMP’s implementation of the new SmartCare system, including significant gaps in SmartCare testing and training, that gave rise to a high level of customer concern. The Hearing Examiner issued a Procedural Order seeking comments from interested persons on the process to be followed to consider the Report. After receiving comments, the Commission concluded that the deficiencies identified in the Liberty Report warranted a full investigation. On January 14, 2019, the Commission issued an Order and Notice of Investigation opening this docket to investigate metering and billing issues affecting CMP customers since October 2017 and, as part of CMP’s ongoing general rate case, Docket No. 2018-00194, an investigation of CMP’s customer service and communication issues.

On May 8, 2019, CMP filed testimony responding to issues raised in the Notice of Investigation and the Liberty Report, including metering and billing issues and CMP’s implementation of SmartCare. On June 24, 2019, the OPA submitted a proposal to test CMP customer accounts for a specified customer type and period of time. By Procedural Order dated June 27, 2019, the Commission authorized the OPA’s proposed testing plan. Combined public witness hearings were held in this proceeding and Docket No. 2018-00194 on July 16, 18 and 22, 2019 in Portland, Farmington and Hallowell, respectively. The Public Advocate and other intervenors filed direct testimony and the Commission Staff filed a Bench Analysis on August 30, September 3 and September 6, 2019. CMP filed Rebuttal Testimony on October 16, 2019. Hearings were held on November 5 and 6, 2019.

II. Prudence Standard and Burden of Proof

This Commission has defined “prudence” as the determination of “whether the utility followed a course of conduct that a capably managed utility would have followed in light of existing and reasonably knowable circumstances.”3 If the utility did not follow such a course of conduct, then the costs that resulted from such action (or inaction) are not recovered from ratepayers. The concept of imprudence in utility regulation has its roots in the “efficient investment” theory, whereby regulation acts as a surrogate for competition, which would charge “inefficiency” to shareholders. The prudence standard applied in Seabrook is the Commission’s elaboration of the statutory directive in 35-A M.R.S. § 301(4) that: “In determining just and reasonable rates, the Commission… [m]ay consider whether the utility is operating as efficiently as possible and is utilizing sound management practices.”

3 Maine Public Utilities Commission, Investigation into Annual Reconciliation of CMP’s Stranded Cost Revenue Requirements and Costs, Docket No. 2006-200, Order at 8-9 (March 24, 2008).
The fundamental principle of 35-A M.R.S. § 301 is that ratepayers are not to be charged for imprudently incurred costs.4

4 Public Utilities Commission, Investigation of Seabrook Involvements by Maine Utilities, Docket No. 84-113 (Phase II), Order at 12 (May 28, 1985) (hereinafter Seabrook).

The Commission in Seabrook noted the following factors that it would consider in assessing imprudence:

1. Senior utility executives are expected to possess a high degree of financial and technical expertise.
2. While the prevailing practice is relevant, it is not determinative. The decisions of utility executives must also be reasonable when viewed against the decisions and courses of conduct of other corporations that make investment decisions of comparable size and complexity.
3. The size and nature of the undertaking being reviewed must be considered.
4. Review of utility decisions should consider the utility’s legal obligation to provide safe, reasonable and adequate service at the lowest possible cost over time throughout its service territory and to operate “as efficiently as possible” using “sound management practices.”
5. A review of prudency requires not only a review of the initial investment decision but also of the continuing action of the utilities in response to changing circumstances.
6. If a utility has selected from among several reasonable courses of action, one of which turns out badly, the utility’s decision was not imprudent.
7. The utility’s course of conduct must be reviewed in light of existing facts and circumstances that either were known or knowable through an effort consistent with the size of the risk at the time decisions were made. The prudence of a decision cannot be based on hindsight.5

5 Seabrook at 166-167.

If imprudence is found, the Commission must determine if the imprudent action caused harm to ratepayers. If harm is found, then the injury or damage needs to be quantified.6

6 Seabrook at 13-14; see also Emera Maine, Request for Approval of a Proposed Rate Increase, Docket No. 2015-00360, Order at 21-22 (December 22, 2016).

Under the provisions of 35-A M.R.S. § 1314, in all original proceedings where a utility is seeking to defend its actions, the burden of proof is on the utility to show that its management decisions were prudent.7

7 Central Maine Power Company, Application for Fuel Cost Adjustment Pursuant to Chapter 34 and Establishment of Short-Term Energy Only Rates for Small Power Producers Less Than 1 MW Pursuant to Chapter 36 (Investigation of QF Contracts), Docket No. 92-102, Order at 12-14 (Oct. 28, 1993).

Thus, it is not the burden of the staff or the OPA to rebut, item-by-item, the evidence produced by CMP. Rather, it is the role of the Commission to evaluate whether CMP has met its burden to show that its implementation of its $57 million SmartCare system was reasonable in light of the substantial doubts raised by the OPA and in the Liberty Report about the prudence of CMP’s management. For the reasons explained further below, CMP has not met this burden.

III. Argument

A. CMP’s SmartCare billing and metering system is flawed and customers do not trust that their bills are accurate.

1. BerryDunn’s analysis of customer accounts demonstrates continuing problems with CMP’s billing system.
The OPA engaged consultant BerryDunn to analyze the bills of a group of customers who had made high bill complaints.8 The OPA, through BerryDunn, compared information retained in CMP’s head-end system (HES) to the information on customer invoices for the period of time following the close of the Liberty forensic audit period through the date of the proposal.9 The OPA was seeking to determine the number of accounts that contained anomalies between the HES or field collection system (FCS) readings and the usage amount recorded on bills.10 From the subset of accounts showing anomalies, the OPA proposed to engage in additional testing, such as examining hourly interval data or other information from which the source of the anomaly could be identified.11

8 This work was presented in the Direct Testimony of Julie Keim. Laurel Arnold, also of BerryDunn, provided testimony on CMP’s implementation of SmartCare, which is discussed infra.
9 Keim Direct at 4.
10 Id.
11 Id.

This testing protocol was designed to focus on accounts involving customers with high-usage complaints, which were, understandably, of great concern and had generated numerous complaints and press coverage.12 Although the cause for high-usage complaints remains elusive, BerryDunn’s review identified widespread and varied problems with the SmartCare system.13 These issues raise serious and continuing concerns regarding the overall integrity of the system and its ability to provide accurate, timely and reliable invoices to CMP customers.

12 Id. at 5.
13 Id.

BerryDunn’s analysis of 1,370 customer accounts and more than 5,400 customer invoices revealed numerous ongoing errors in the billing process.14 Customer invoices are depicting incorrect information in a number of critical areas, and these defects do not allow the customer to recalculate or determine if the dollar amount billed is correct, including:

• Meter readings
• Meter read dates
• The total number of days presented on the invoice is not the actual number of days the usage was recorded
• Calculation errors in total kWh in the “Your Meter Details” box on the invoice
• Total kWh billed per the delivery charges calculation differs from the total kWh in the “Your Meter Details” box on the invoice
• Billing periods that are not sequential and/or overlap in the delivery charges calculation, and additional kWh displayed in the delivery charges section that do not factor into the delivery charges calculation
• Two separate invoices with the same invoice number, showing different usage amounts, mailed to a customer15

14 Id. at 11.
15 Id.

After eight weeks of intense review and analysis, BerryDunn was unable to isolate a defect, set of defects or root cause for the numerous complaints relating to high usage.16

16 Id. at 16.
Nevertheless, based upon this review and analysis, it is not possible to conclude that these complaints are without merit.17

17 Id.

Based on the results of the BerryDunn analysis, the SmartCare system continues to produce numerous invoices that contain inaccurate and misleading information.18 These instances are not insignificant, since they can impact all CMP customers and are not isolated to just the subset of customers with high-usage complaints that we sought and reviewed in our analysis.19 We know this because the errors affect customers with meter exchanges, delayed billing and other non-usage invoice presentation issues.20 Moreover, because of the wide range of problems, it is highly likely that these types of problems have occurred, and continue to occur, on the invoices of other CMP customers.21

18 Id.
19 Id.
20 Id.
21 Id.

BerryDunn’s work reveals that SmartCare still does not consistently deliver accurate information on customer invoices.22 Customers have received invoices containing errors related to the current or prior meter reading, meter read dates, calculated total kWh, delivery charges presentation, and number of days usage occurred. Some of these instances involve long periods of unbilled usage, resulting in invoices that present confusing and misleading information regarding usage without explanation.23 Numerous customer invoices required extensive review and examination, including review of backup information from the system, as well as discussions with CMP personnel, to gain an understanding of the problem.24 It is unlikely that a typical customer would understand these invoices as presented; customers instead would assume the invoices are inaccurate and/or that they are being overbilled for usage that did not occur.25

22 Id. at 51.
23 Id.
24 Id.
25 Id.

The analysis also identified instances in which CMP personnel had manually changed the coding in the SmartCare system to indicate that a meter reading had occurred when it had not.26 The apparent deficiency in internal control which permits this type of coding change appears to contribute to the issues that have raised concerns from customers.27

26 Id.
27 Id.

These problems demonstrate that the SmartCare system, almost two years post go-live, has significant problems with producing and presenting invoices for customers.28 Given the parameters of BerryDunn’s analysis, it is not possible to conclusively determine the extent or pervasiveness of these problems. However, the nature and extent of the issues identified call into question the overall integrity of the system and its ability to provide accurate, timely and reliable invoices to CMP customers. Indeed, the number of customers continuing to register complaints regarding the accuracy of their invoices reflects the continuing problems; such complaints likely will not subside until deliberate and effective measures are taken to identify the root causes of the problems and create a system for resolving the problems identified.

28 Id.

2. Individual billing system problems continue to exist that undermine confidence in the system.

The following discussion addresses a number of billing system problems that continued to exist through the date of hearings and, in some cases, beyond. This discussion is intended to assist in understanding the issues identified in this proceeding and is not intended to be a comprehensive list.
Indeed, as some of these examples indicate, there is significant doubt as to whether all problems have yet been identified and whether problems identified as having been fixed have actually been permanently resolved. Further, there remain a large number of complaints that have been made, the cause of which has not yet been determined.29

29 Bench Analysis at 10, fn 6.

Certain cases identified in the responses to Data Requests LOO-001-031 and EXM-004-004 are inconsistent between May 1, 2019 and October 24, 2019. A comparison of these two data responses reveals that CMP has added twelve new billing cases and twelve new defects since May 1, 2019. Critically, these problems have surfaced 18 to 24 months post go-live in some instances. These are ongoing issues. Even if they are being fixed, their existence supports the conclusion that, without adequate testing, it is impossible to know how many issues continue to exist.

The response to EXM-004-019 lists defects that were opened, closed and then reopened. This demonstrates a lack of certainty with respect to CMP’s success in fixing defects even after it claims to have done so. This again supports the conclusion that additional testing is necessary to ensure that all problems have been identified.

The response to EXM-004-025 lists a defect as having been identified on September 13, 2017. However, in the response to EXM-004-004, CMP states that the defect was identified on June 14, 2018. This again points to inconsistencies and raises the question of how the Commission can be assured, without further testing, that all defects have been identified, properly communicated to the Commission and/or fixed.

BerryDunn’s work was different from that of Liberty, as demonstrated in OPA-010-001. BerryDunn focused on the customer-facing information, such as invoices, while Liberty focused on the SAP data. There is a disconnect between the information that SAP captures and the information provided to a customer on their invoice. BerryDunn demonstrated numerous errors in what is presented on the customer invoice, giving customers valid reasons for not trusting the information provided on that invoice.

CMP states in rebuttal that Customer Service Guarantee Credits had been applied to certain accounts as described in the testimony.30 In Data Request OPA-010-009, CMP was asked to provide support for the panel’s testimony that it had provided Customer Service Guarantee credits for the billing issues identified in the examples identified by BerryDunn. In its response, CMP indicated that it “realized” that it had not provided these credits and that it had then credited the customer’s account on October 28, 2019, the day the data response was due.31 This undermines any of CMP’s claims that actions have been taken or that problems have been corrected.

30 CMP Metering and Billing Panel Rebuttal at 30.
31 OPA-010-009.

In OPA-010-012, CMP included a defect (Defect 6564) that was not listed in EXM-004-004 as a defect. This demonstrates that the list of defects included in response to Data Request EXM-004-004 is not complete and that there may be other defects that the Company has not yet disclosed.

These are serious problems for a utility that does not enjoy the trust of its customers that its bills are accurate. CMP asks that customers trust that the dollar amount billed is correct, but other issues, such as bill presentment and excessive monthly estimating, do not lead to an atmosphere where such trust can exist.
Finally, BerryDunn found many instances where customers’ usage was estimated for successive months. This led to the discovery that customers may be overcharged Maine sales tax.32 We recommend that CMP be required to indicate clearly on the invoice if the prior meter reading was estimated. This will allow customers to identify when they have a “trued-up” invoice.

32 Keim Direct at 10, 13, 14.

3. CMP Violated Chapter 815 of the Commission’s Rules.

Although the focus of this proceeding has largely been on ensuring that the CMP metering and billing systems are working properly, it is worth noting that many of the customer complaints included in the record of the proceeding have identified likely violations of Chapter 815.

Chapter 815, Section 8(L) provides:

A utility must obtain an actual meter reading every month, unless:
1. extreme weather conditions, emergencies, equipment failure, work stoppages or other similar circumstances prevent an actual meter reading by utility employees;
2. the utility must have access to the customer’s premises to obtain a reading and the utility is unable to gain access after using reasonable efforts to obtain access; or
3. a customer is billed on a seasonal basis according to terms included in the rate schedule of the utility.

An “actual meter read” includes an electronic read obtained via an automated read system. A review of the many high bill complaints analyzed in this proceeding demonstrates that many were the result of CMP’s failure to obtain an actual meter reading.33 While some of these may have been due to allowable causes, the fact that some number of them were failures over multiple months suggests that many such failures violated Section 8(L).

33 See, e.g., Keim Direct at 27.

Chapter 815, Section 8(C)(1) provides that each bill issued by a utility shall clearly state the “beginning and ending dates of the period for which service was provided.” The presentment issues identified and described in Ms. Keim’s direct included instances where the metering date was misstated.34 CMP identified this as Defect 6621. This is a violation of Section 8(C)(1).

34 Keim Direct at 10.

Chapter 815, Section 8(C)(10) provides that each bill issued by a utility shall include a “clear and conspicuous marking of all estimates.” Again, the presentment issues identified and described in Ms. Keim’s direct included instances where the fact that the bill presented was an estimate was not clearly identified.35 This is a violation of Section 8(C)(10).

35 See, e.g., Keim Direct at 28.

Chapter 815, Section 8(C)(5) provides that each bill issued by a utility shall include an “itemization of State and Federal taxes.” Ms. Keim, at pages 20-22, 31, 33 and 51 of her direct, describes instances where inaccurately estimated bills result in the miscalculation of tax amounts owed. This is a violation of Section 8(C)(5).

Chapter 815, Section 8(D)(2) requires that the bill format include “comparative usage information for the prior 12-month period and for the equivalent period 12 months ago.” Ms. Keim, at pages 31-34 of her direct, identifies a number of instances where usage for prior months was clearly incorrect. These are all violations of Section 8(D)(2).

This discussion is not intended to be an exhaustive list of all Chapter 815 violations that CMP may have committed. However, the Commission enacted the billing requirements in Chapter 815 deliberately and with purpose.
The failure of CMP’s SmartCare system to comply with these basic requirements undermines customer confidence regarding all aspects of its accuracy and functionality. Further, it provides a separate basis for imposing a cost of equity adjustment and for disallowing a portion of the costs incurred by CMP in implementing SmartCare.

B. CMP Management was imprudent in the way it developed and implemented SmartCare.

Beginning earlier in this decade, CMP began planning to implement SmartCare to replace its legacy billing system. Project planning documents produced by Deloitte, CMP’s System Integrator for SmartCare, and approved by CMP, acknowledged the applicability of industry standards and best practices, but the documentation and testimony provided by CMP failed to demonstrate that these standards and practices were consistently followed in the execution of the project. These failures have been documented in the Direct Testimony of Laurel Arnold. CMP’s rebuttal testimony and responses to data requests have asserted that CMP adhered to industry standards and best practices, but these bald assertions are not supported by the documentary evidence examined and relied upon by Ms. Arnold, which has not been meaningfully supplemented by CMP. This helps explain why CMP immediately encountered problems with SmartCare at go-live, and why it continues to experience significant problems with this billing system.

Ms. Arnold’s testimony made six findings. For the purposes of this brief, we focus on four of these because they strongly underscore the problems and issues that customers continue to have with bills more than two years after go-live. Initially, we note that Ms. Arnold fully supports the findings and conclusions of the Liberty Report with regard to SmartCare implementation.36

36 Arnold Direct at 5-6.

1. CMP’s testing of SmartCare prior to go-live did not align with best practices and CMP did not even follow its own flawed strategy.

In 2016, CMP approved a Testing Strategy document that had been prepared by Deloitte.37 This document was created to provide an outline of “the testing activities that will be carried out during the Test and Validation phase of the project.”38 It was also supposed to be a “living document” that would be updated throughout the project.39 It was never updated.40 Ms. Arnold testified that

A qualified reviewer with experience in management of testing for a system of this size and complexity should have known from the minimal details provided in the Testing Strategy that it was insufficient and recommended rejection of this document.41

This document contains numerous shortcomings that cause it to fall outside of best practices for the project. Moreover, in many cases CMP failed even to follow the Testing Strategy’s direction. Ms. Arnold described how many of the tests described in the Testing Strategy did not align with best practices. In this brief, we focus on three: User Acceptance Testing, Stress Testing and Regression Testing.42

37 OPA-008-011, Attachment 1 (hereafter referred to as the Testing Strategy); Tr. 6/13/19 at 122; Arnold Direct at 15.
38 Testing Strategy at 4.
39 Id.
40 Tr. 6/13/19 at 123.
41 Arnold Direct at 13.
42 We do not concede that the problems found by Ms. Arnold with the other forms of testing were adequately rebutted by CMP, but for the sake of brevity rely upon her testimony to speak for itself. Arnold Direct at 6-11.

i. User Acceptance Testing.
Citing multiple authorities, Ms. Arnold stated that best practices for user acceptance testing (UAT) require that it occur after integration testing and quality assurance have been completed. Ms. Arnold states that

Starting acceptance testing before the completion of integration testing risks distributing a buggy and immature system to the business users, duplicated logging of bugs by business users and testers, and expanded turnaround time for triage of the duplicate defect entries.43

Section 2.4 of the Testing Strategy states that UAT could occur “as a part of the last cycle of Integration Test.”44 And this is in fact what happened, as Ms. McNally confirmed.45 CMP ran its UAT in conjunction with other tests, abandoning the serial approach to testing and using instead the parallel approach,46 presumably because of time pressure. This decision created a serious risk that business users received an incomplete version of the system for validation. Ms. Arnold described the risk:

This creates a potential for duplicated bugs introduced by business users and testers resulting in expanded turnaround time for defect analysis and resolution and slower development team response due to unanticipated load. There is also a risk that management will partially lose control over both releases and test environments, especially if there are separate test environments for ITC 5[47] and UAT. Final results of these risks are the inability to complete testing in time; thus, the product is not fully validated by the business users as meeting all business requirements and needs.48

43 Arnold Direct at 8.
44 Testing Strategy at 8.
45 Docket No. 2018-00194, Tr. 4/29/19 p. 17.
46 Arnold Direct at 13-14.
47 ITC 5 was a fifth cycle of integration testing added by CMP in September 2017. Arnold Direct at 14.
48 Id. at 14-15.

Again citing multiple authorities, Ms. Arnold testified that “UAT must be the final testing phase and must occur after all other testing is complete; this is true for all testing methodologies.”49 CMP’s tepid rebuttal fails to address this flaw, admits that it ran these tests in parallel, and provides no justification for doing so.50

49 Arnold Direct at 14.
50 CMP Implementation Panel Rebuttal at 19.

ii. Stress Testing.

Stress testing is supposed to “verify the robustness of a system.”51 This testing “exercises” the system by subjecting it to uses “within or beyond specified limits.”52 The Testing Strategy, however, had “vague and inadequate” definitions that resulted in “ambiguous and incomplete testing goals.” Stress testing should be designed to root out defects by placing demands on the system that mimic potential real-life scenarios, such as one where “hundreds or thousands of users visit the application in real time.”53 It is not apparent that CMP performed this type of stress testing. Company witnesses did not say so in their rebuttal.54

51 Arnold Direct at 10.
52 Id.
53 Id.
54 CMP Implementation Panel Rebuttal at 20-21.

iii. Regression Testing.

Regression testing is used to verify that prior fixes to defects were adequate and have not created other defects.55 CMP’s Testing Strategy failed to include procedures for selection of test cases, removal of obsolete test cases and criteria for entrance and exit of regression testing, including how failed tests are treated.56

55 Arnold Direct at 11; CMP agrees with this. CMP Implementation Panel Rebuttal at 23.
56 Arnold Direct at 11.
CMP essentially accuses Ms. Arnold of crying wolf, but fails to state how it handled these risks, other than to make the bland assertion that regression testing was used “to ensure the business practices were executed with no unintended impacts.”57 The Commission should not simply accept this statement that project managers made the proper selections and decisions when doing regression testing, since the Testing Strategy did not provide any such guidance and CMP’s rebuttal did not specifically rebut Ms. Arnold’s clear elucidation of the problem. The risk that CMP’s project team would “misunderstand the procedures, goals and objectives, and [thus]… preclude a complete or reliable test result” is real.58

57 CMP Implementation Panel Rebuttal at 24.
58 Arnold Direct at 11.

These three test shortcomings during implementation point to management decisions that fell outside of best practices.

There are two other important points concerning testing. First, Ms. Arnold also testified that “[w]hen a system is not fully exercised during the test phases, latent defects in areas missed during testing will be uncovered by real-world users after production release.”59 She further stated that “[w]ithout documented objectives and intended end results, most developers will focus on creating test cases that follow successful logic paths as opposed to negative or error paths.” Given human nature, this could especially be the case when managers are facing budgetary and time constraints and therefore need to push for results.

59 Id.

Second, as this flawed approach to testing was winding down in October 2017, CMP made another highly questionable decision. The Testing Strategy required that there be no “open high or medium priority defects” at go-live.60 This requirement was relaxed per a “conscious decision.”61 This was a “substantive change” that “represented a lowering of criteria for Go-Live.”62 No adequate explanation was provided for this potentially crucial decision, and it is not even clear if higher management knew of it.63 This decision, combined with the flaws in testing described above and in Ms. Arnold’s testimony, helps explain, at least in part, why SmartCare has failed to serve as a functional billing system that enjoys the confidence of the customers served by it.

60 Testing Strategy at 17.
61 Tr. 6/13/19 at 148, lines 2-6.
62 Arnold Direct at 15.
63 Id.

2. CMP’s failure to create and use a complete Requirements Traceability Matrix was imprudent and points to a need for retesting.

In order to gain the trust of customers in this system, the Commission must order further testing. As a general matter, what testing was and was not done, whether described in the Testing Strategy or not, can now only be discovered with the use of a proper Requirements Traceability Matrix (RTM). Ms. Arnold described this as a key artifact at hearing, saying that a usable RTM would “provide the evidence of what testing was done and what -- in accordance with good practice and appropriate coverage and what was missing.”64 An RTM that is fully populated with all test data and linkages would provide evidence documenting “that all the test cases that were necessary to adequately test each requirement were actually written and executed. Right now, we don’t have that.”65 CMP did not create an RTM during implementation that allowed traceability, either before go-live or since.

64 Tr. 11/6/19 at 120.
65 Id. at 120-121.
The RTM used by CMP during SmartCare implementation, produced in response to OPA-007-083, Attachment 2, was never completed by CMP and thus was flawed. A review of the “requirements” tab in this Excel document shows no data in columns Y through AF, all of which are under the heading “Traceability.”66 These columns should have been fully populated with appropriate data and used prior to go-live, and the document could have been used post go-live to manage regression testing and ongoing management of the system. The incomplete RTM produced in discovery points to management imprudence.

66 See Table 1 and discussion, infra.

In her testimony, Ms. Arnold quoted from the Project Management Body of Knowledge (PMBOK), which describes the purpose of a Requirements Traceability Matrix (RTM) as follows:

“The requirements traceability matrix is a grid that links product requirements from their origin to the deliverables that satisfy them. The implementation of a requirements traceability matrix helps ensure that each requirement adds business value by linking it to the business and project objectives. It provides a means to track requirements throughout the project life cycle, helping to ensure that requirements approved in the requirements documentation are delivered at the end of the project. Finally, it provides a structure for managing changes to product scope.”67

67 Arnold Direct at 19.

Best practices in connection with the use of an RTM dictate that it should include system requirements, the test cases that cover those requirements and the defects that turn up during testing. These defects should be “mapped to test cases, and all test cases mapped to requirements.”68 In this way, defects can be traced.69

68 Arnold Direct at 20.
69 The PMBOK quoted by Ms. Arnold, and the Enterprise Value Delivery (EVD) employed by Deloitte in its help with the SmartCare project, are very similar compendiums of knowledge and management approaches that constitute best practices with regard to projects like SmartCare. Mr. Tovar from Deloitte acknowledged this similarity when he agreed that “EVD is simply a tailoring, if you will, of best practice and industry standard to a specific type of implementation.” Tr. 10/31/19 at 105. EVD is described on page 15 of CMP Implementation Panel Rebuttal.

The Testing Strategy document that governed testing for SmartCare was developed by Deloitte and approved by CMP.70 Section 6 (p. 15) of this document acknowledges the purpose of the RTM and explains how traceability was to be maintained in HPQC (a/k/a HPALM)71 and in the RTM for SmartCare:

In the context of testing, requirement traceability represents the ability to verify that the solution delivered by the CMP CRM&B project has addressed all requirements. This is achieved through building linkages between individual requirements and the tests that are used to verify that the system has satisfied the requirements.72

This section similarly states that “[w]here a test is developed to address a specific requirement, that test case and a specific test step should be explicitly link (sic) to the requirement in HPQC.”73 These are necessary and appropriate requirements for testing, but CMP failed to follow this plan.

70 Tr. 6/13/19 at 122.
71 Tr. 6/13/19 at 123.
72 OPA-008-011, Att. 1, p. 15 (emphasis added).
73 Id.

The OPA made the following data request of CMP (OPA-007-083):
Please provide the Final Requirements Traceability Matrix - showing disposition of all requirements, business rules and desired capabilities in the initial baseline scope, inclusive of those requirements removed from scope or deferred via change control, test cases written for each requirement, test cases executed for each requirement, final test outcome for each test case, defects associated with each requirements, priority and/or severity of each defect, go-live status of each defect.

In response, CMP provided OPA-007-083, Attachment 2. Table 1 below draws from this spreadsheet document and summarizes information about each column and its completion status.74

74 The asterisks are in CMP’s original document.

Table 1: Requirements Tab of RTM (OPA-007-083, Attachment 2)

Columns A through P (Category of Information: Business Requirements)
Fields: Business Requirement ID*; Business Requirement Name*; Requirement Description / Definition*; High Level Business Requirement; Requirements From; Requirement Type*; Team*; Subteam; Business Unit; Business Stakeholder; Priority*; Complexity; Rationale; Localization; Identified By*; Status*
Notes about Usage: These fields were populated with data for business requirements or designated as “N/A.”

Columns Q through X (Category of Information: Fit Gap Analysis)
Fields: Fit/Gap*; Fit/Gap Description*; Resolution Type (Transaction / RICEF)*; Gap Resolution Description*; Gap Resolution Complexity*; Gap Decision; Solution Status*; Development Object System
Notes about Usage: These fields indicate which requirements are met by the Commercial Off The Shelf (COTS) solutions and which will be custom. They were populated with data derived from the Fit Gap Analysis or designated as “N/A.”

Columns Y through AF (Category of Information: Traceability Fields)
Fields: Functional Specification; Technical Specification; Unit Test Case*; String Test Case*; Integration Test Case*; UAT Case*; Performance Test Case*; Remarks
Notes about Usage: The fields designed to demonstrate traceability between requirements and test cases were not filled in.

Table 1 clearly shows that the RTM ultimately produced for the project and approved by CMP management included only requirements-specific information and did not provide traceability.75 Test case coverage was not provided in the RTM nor in the test cases and test scenarios matrix.76 The linkages required by the Testing Strategy in order to map defects to test cases were never developed and to this day apparently do not exist. The RTM was not completed as planned and thus failed (and continues to fail) to serve its intended purpose to “verify that the solution delivered by the CMP CRM&B project has addressed all requirements,” as described in the Testing Strategy.77

75 OPA-007-083, Attachment 2.
76 TLCG-001-0185, Attachment 1.
77 Testing Strategy at 15.

CMP acknowledged that the disposition of each requirement did not inform the go-live decision. Ms. McNally confirmed that no single report, data extract or document was presented to decision makers to show that each requirement had been fully tested, that all test cases for each requirement were executed, which had passed, and which requirements still had outstanding defects.78 The absence of this type of traceability and reporting does not align with the Testing Strategy produced by Deloitte and approved by CMP. The decision of CMP management to approve implementation without this information is contrary to industry standards and best practices used by other corporations that make investment decisions of a comparable size and complexity.79

78 Tr. 6/13/19 p. 156, line 6 – p. 157 line 12.
79 Arnold Direct at 2.
The lack of traceability demonstrates that CMP management made the decision to go live with SmartCare without evidence that the system had been adequately tested, without full assurance that the requirements of the system were being delivered as expected, and blind to a potential host of system defects. As Ms. Arnold testified, “[t]he absence of this type of traceability and reporting is contrary to the best practices cited above and does not align with the traceability included in the Testing Strategy approved by CMP.”80

80 Arnold Direct at 21.

At hearing, Ms. Arnold was asked by CMP counsel if she had assessed the overall performance of the SmartCare system against requirements. She responded in part by saying

It was not possible to do that given the data that was provided. Had we been provided with a complete requirements traceability matrix as was planned for and envisioned in the test strategy document by -- that was developed by Deloitte and approved by CMP, we would have been able to see the traceability. And when I say that, I mean the linkage and the coverage that would have been provided in such a document that showed that all requirements were accounted for, all requirements had been evaluated for an appropriate type of test coverage, that all test cases were written to provide that coverage, that all such test cases were executed, and what the outcome of that testing was. The fact that that was not provided made it difficult to ascertain the coverage to the extent to which test coverage covered all requirements.81

81 Tr. 11/6/19 at 96-97.

This testimony highlights the present problem: it is not possible to know with any level of certainty what defects and problems exist in SmartCare until they actually surface, which has been, and will often be, when a customer receives a flawed invoice. Ms. Arnold clearly articulated CMP’s imprudence at hearing when she said that “had they provided [the RTM] prior to go live and used it, we wouldn't have many -- as many findings about the -- we wouldn't be left at questioning the quality or the unknown unknowns within the system right now because we would have evidence, and that's what was intended in this project. They intended to do that, but they didn't do it.”82

82 Id. at 123, lines 17-22.

Rather than use the RTM specified in CMP’s Testing Strategy, managers created and used a “dashboard” that relied upon the number and type of unresolved defects to measure readiness for go-live.83 Reliance on defect counts alone to measure the quality of the system and readiness for go-live is risky.84 It is akin to a mechanic telling you only that he looked your car over and found some deficiencies, but not what those deficiencies are. You don’t know if the problem is the brakes or the windshield wipers, or if he even checked the integrity of the frame -- yet you choose to drive the car anyway. Defects are logged when a test case has been written for a condition and fails, or when testers observe something unexpected during testing. Defect counts alone can only be used as a reliable measure of quality when test coverage is adequate, all test scripts have been executed to plan and all defects have been logged and accounted for. When test coverage is not adequate prior to go-live (e.g., your mechanic did not inspect the brakes) or is conducted in a manner that makes it difficult to isolate defects, defects often remain undetected.

83 Tr. 6/13/19 at 161.
84 Arnold Direct at 21.
Without a complete RTM, CMP managers unreasonably limited their ability to understand the cumulative impact of defects and identify potential gaps in testing.

Ms. Arnold discovered that test cases for alternate, boundary, edge, error, and exception code paths were not documented in either the Testing Strategy or the list of test cases and test scenarios provided. These gaps in test coverage greatly increase the likelihood that defects remained undetected until production use.85

85 Arnold Direct at 12; TLCG-001-185.

After go-live, CMP continues to operate without a complete RTM, apparently relying on unresolved defect counts as a measure of system quality. In production, defects may not be readily apparent to company staff or customers and thus remain undetected. The inaccurate values and processing glitches detected may not be easily traced to one defect but can be the net result of multiple defects. Thus, isolating defects in a production environment can prove difficult and expensive. Finally, changes to a production system require adequate regression testing to avoid introducing new defects into the system. Had a complete RTM been delivered prior to go-live, CMP would have a mapping of test cases to requirements from which to select appropriate regression test cases. When a defect is not associated with and traceable to a business requirement, the root cause of the defect is more difficult to establish, the instability of the requirement will not be visible, and the identification or creation of test cases needed to retest or regression test the defect fix is more difficult. CMP has not provided a regression test plan or evidence of test coverage being used to close defects.

CMP’s rebuttal of Ms. Arnold’s findings and conclusions about the RTM fails to rebut her findings.86 In sum, an RTM that lacks the T (traceability) is not a usable RTM. This lack creates serious risks to a project and makes detecting and curing defects difficult and expensive in production. We urge the Commission to find that CMP management’s decision not to have and use a complete RTM was imprudent.

86 CMP Implementation Panel Rebuttal at 26-27.

3. CMP’s management of defects in the development and implementation of SmartCare did not conform with best practices and exposed ratepayers to a flawed billing system.

Ms. Arnold found that the Testing Strategy laid out a process of defect management that was “incomplete and inconsistent with best practices and industry standards.”87 She further found that testing teams did not follow this flawed document, and both of these shortcomings led to her conclusion that CMP failed to employ best practices in the area of defect management.88

87 Arnold Direct at 16.
88 Id.

CMP failed to consistently manage defects in a single tool, employing shared Excel spreadsheets at some points and the HP QC tool at others. CMP’s rebuttal points out that once the HP QC tool was available, it was consistently used,89 but defects were logged very early in the process and the record does not reveal how many defects preceded the use of HP QC.

89 CMP Implementation Panel Rebuttal at 26.
Ms. Arnold also found that CMP testers did not use the HP QC tool properly, failing to enter defects detected in unit and string testing.90 As a result, in later test phases, “testers and analysts could not easily compare new defects with previous defects to determine patterns or trends in quality,” thus making it difficult for management to prioritize defect fixes and allow appropriate creation of regression test cases. These failures impaired CMP management’s ability to conduct root cause analyses, which are important for seeing the full chain of events and thus preventing future defects.91 CMP did not rebut this observation other than to point out how its testers were instructed.92

90 Arnold Direct at 18.
91 Id.
92 CMP Implementation Panel Rebuttal at 27.

Along with CMP management’s other failures, the failure to keep accurate track of defects in SmartCare implementation serves to explain the problems customers have had with this billing system. CMP’s management of system defects did not conform to best practices and was imprudent.

4. CMP’s management of risk in the process of developing and implementing SmartCare was imprudent.

Ms. Arnold identified serious issues with the manner in which CMP managed risk in the implementation of SmartCare. First, she quotes the PMBOK where it states that “the effectiveness of Project Risk Management is directly related to project success.”93 She then notes that CMP’s Project Charter94 provided some guidance, giving the Executive Committee primary strategic responsibility.95 It is vital that the project’s risk tolerance be explicitly stated, since it is used to set the processes and definitions used to quantify risk.96

93 Arnold Direct at 22.
94 OPA-008-001, Att. 1.
95 Arnold Direct at 22.
96 Id.

Of great importance is identification of the project’s primary constraint, whether that be scope, schedule or cost. This allows the project team to make important decisions on tradeoffs between competing demands during implementation. But project management was confused about the primary constraint. The planning documents indicated that the primary constraint was scope.97 Ms. McNally stated that it was “quality.”98 Mr. Stinneford, however, who sat on the Steering Committee, a management step above Ms. McNally, indicated that it was cost.99 Ms. Arnold described how this confusion affected the project:

If quality was paramount and delivery of system functions (i.e. scope) was the primary constraint as suggested by Ms. McNally’s testimony, quality could have been safeguarded by adding resources to complete all planned testing before the scheduled deadline and/or the schedule could have been modified. If cost was the primary driver of decision making, management could have helped to maintain quality by looking for opportunities to reduce the scope of what would be deployed at Go-Live. Instead, CMP management significantly modified its approach to testing, held to a fixed date without making adequate compensating changes to scope and/or cost and quality predictably suffered.100

97 OPA-007-008; Arnold Direct at 23.
98 Docket No. 2018-00194, Tr. 4/29/19 at 15; Arnold Direct at 23.
99 Docket No. 2018-00194, Tr. 4/29/19 at 22-23.
100 Arnold Direct at 24.

A project like SmartCare needs clear articulation of how risk is to be managed, and this articulation needs to be understood by all who work on the project.
Because SmartCare involved 240 resources and 32 vendors in 6 geographical locations, and because interaction was required with 52 interfaces,101 this clear articulation of risk management was crucial to success: it would "ensure adherence to common expectations, project norms and standardized methods of measuring, monitoring, and reporting of project performance and product quality."102 This type of coordinated approach, with standardized definitions, was absent from SmartCare.103

CMP's management of the project's schedule lacked identification of a "critical path," which is the sequence of activities, regardless of responsible entity, that represents the single longest path through a project and determines the project's shortest possible duration. Any delay to any activity on the critical path therefore jeopardizes the scheduled end date of the project. In a project schedule comprising thousands of tasks, this analysis is critical to managing risk.104 Failure to identify the critical path "compromised the project's ability to proactively monitor and aggressively manage risk."105

101 Id. at 2.
102 Id. at 24.
103 Id.
104 Id. at 25.
105 Id.
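To make the concept concrete, the following minimal sketch (prepared for this brief; the task names and durations are hypothetical, not SmartCare's) computes a critical path as the longest path through a dependency graph of tasks, identifying the activities whose slippage would move the go-live date:

```python
# Illustrative sketch only: computing a project's critical path as the
# longest path through a task dependency graph. Task names and durations
# are hypothetical, not SmartCare's.
from functools import lru_cache

# task -> (duration in days, list of prerequisite tasks)
tasks: dict[str, tuple[int, list[str]]] = {
    "design":  (30, []),
    "build":   (60, ["design"]),
    "convert": (45, ["design"]),
    "test":    (40, ["build", "convert"]),
    "train":   (20, ["build"]),
    "go_live": (5,  ["test", "train"]),
}

@lru_cache(maxsize=None)
def earliest_finish(task: str) -> int:
    """Longest cumulative duration from project start through `task`."""
    duration, prereqs = tasks[task]
    return duration + max((earliest_finish(p) for p in prereqs), default=0)

def critical_path(end: str) -> list[str]:
    """Walk backward from `end`, always following the prerequisite whose
    finish time binds; the result is the schedule's single longest path."""
    path = [end]
    while tasks[path[-1]][1]:
        path.append(max(tasks[path[-1]][1], key=earliest_finish))
    return list(reversed(path))

print(critical_path("go_live"), earliest_finish("go_live"))
# ['design', 'build', 'test', 'go_live'] 135 -- shortest possible duration
```

On this hypothetical schedule, a one-day slip in "convert" or "train" is harmless because those tasks carry float, but a one-day slip anywhere on the critical path delays go-live by a day. Without this analysis, management cannot know which of its thousands of tasks deserve the most aggressive monitoring.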
Ms. McNally testified that quality was paramount, but CMP's Project Management Office Plan devoted only a single paragraph to the subject, stating that quality "will be monitored" and that "there should be a project quality plan to measure the delivery and process quality."106 Neither Liberty nor Ms. Arnold was able to find any metrics associated with quality.107 One wonders how CMP management thought quality would be delivered.

Given the high number of vendors associated with SmartCare, Ms. Arnold expected that the Project Management Office Plan would contain language aimed at enforcing the accountability of these third parties. There is none. This failure left quality management "at risk of shifting expectations, demands and approaches."108

Ms. Arnold testified that all these failures compromised CMP's ability to manage project risk. We have shown that CMP management did not develop and use an RTM; that it established poor criteria for some of its testing (and then did not even follow those criteria); that it botched its defect management; and that its poor risk management reflected confusion at the highest levels that can only have filtered down to lower levels as implementation proceeded. It is no wonder that SmartCare had, and continues to have, so many problems.

106 OPA-007-005 Att. 1; Arnold Direct at 26.
107 Arnold Direct at 27.
108 Id.

IV. Recommendations

A. The Commission should impose a financial penalty on CMP.

Given the imprudence described in this brief, we reiterate our support for the downward adjustment to CMP's ROE that Staff recommended in its Bench Analysis in Docket No. 2018-00194 and that we supported in our brief in that docket. The record in this proceeding underscores the management failures that led to the post-go-live customer service problems.

B. The Commission should order third party validation testing of SmartCare.

We recommend that a qualified, independent third party be engaged to support validation testing of SmartCare. This testing, together with production of a complete Requirements Traceability Matrix, will provide evidence that does not currently exist: specifically, evidence of the extent to which:

• all functional and non-functional requirements were accounted for in testing;
• requirements were specific, measurable, attainable, relevant, and time-bound;
• test scripts written for each requirement covered not only positive and negative paths but also alternate, boundary, edge, error, and exception paths;
• all planned and approved test scenarios and scripts were executed; and
• defects were traced to test cases and requirements, allowing visibility into requirements that are flawed or incomplete.

Validation of the test plan and test coverage by an independent and qualified testing professional will ensure that test coverage is adequate and that testing is executed in accordance with industry standards and best practices. Production of a complete RTM will provide objective and reasonable proof of the extent to which the system is or is not working. Simply reporting that known defects have been fixed, without demonstrating adequate test coverage, will not provide such assurance.

In her Direct Testimony, Ms. Arnold recommended that "CMP should be required to test the system as it should have been tested prior to go-live in accordance with applicable best practices."109 At hearing, she clarified that she is not recommending a soup-to-nuts retesting of SmartCare, but rather a more targeted approach aimed at finding and testing the gaps left by CMP's testing. The key, she said, is to develop and use a complete RTM.110 Without one, these testing activities will require CMP to expend additional time and resources; with one, the effort may take less time.111 She described the current approach as "whack-a-mole": "we are chasing defects in production. We only see those that are identifiable by customers or through a review of outcomes in the system. It is not an effective way to identify defects in production."112

We recommend that a third party perform the retesting. Ms. Arnold stated that it was not clear that "CMP leadership and staff possess the knowledge of best practices and have experience to apply them."113 Moreover, as indicated, performing the testing with the help of an independent third party will help restore public confidence in the system.114 We believe this is a reasonable request given that CMP purportedly intended to follow a methodology aligned with industry standards and best practices and yet failed to do so. More specifically, as briefed above, CMP and Deloitte acknowledged the need for a complete RTM and failed to deliver or use one.

109 Arnold Direct at 31.
110 Tr. 11/6/19 at 120.
111 Id.
112 Id. at 121, lines 21-24.
113 Arnold Direct at 31.
114 Id.; Tr. 11/6/19 at 121.

Documenting the traceability of test cases to requirements will provide ongoing value to CMP and its customers. It will allow CMP to enhance its regression testing capabilities for ongoing management of the system. The compilation and validation of industry standard test cases could allow for automation of testing and lessen the ongoing demand on business resources. A complete RTM provides CMP decision makers and stakeholders, as well as the Commission, with objective evidence of the quality of requirements, test coverage, test execution, and the extent to which the system is working as intended.
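As an illustration of what such traceability looks like in practice (a minimal sketch prepared for this brief; the requirement, test, and defect identifiers are hypothetical, not SmartCare's), a complete RTM lets coverage gaps be found and regression suites be selected mechanically rather than by guesswork:

```python
# Illustrative sketch only: the traceability a complete RTM provides. Each
# requirement maps to the test cases that cover it, and each defect traces
# back to a requirement, so regression tests for a fix can be selected
# mechanically. All identifiers are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    description: str
    test_cases: set[str] = field(default_factory=set)  # covering tests

rtm: dict[str, Requirement] = {
    "REQ-BILL-042": Requirement(
        "REQ-BILL-042",
        "Prorate charges across a mid-cycle rate change",
        {"TC-101-positive", "TC-102-boundary", "TC-103-exception"},
    ),
}
defect_to_requirement = {"DEF-7731": "REQ-BILL-042"}  # defect traceability

def uncovered(requirements: dict[str, Requirement]) -> list[str]:
    """Requirements with no covering test case: gaps in test coverage."""
    return [r.req_id for r in requirements.values() if not r.test_cases]

def regression_suite(defect_id: str) -> set[str]:
    """Select regression tests for a defect fix by tracing the defect back
    to its requirement and returning every test that covers it."""
    return rtm[defect_to_requirement[defect_id]].test_cases

print(regression_suite("DEF-7731"))  # the three tests covering REQ-BILL-042
```

With this structure, a defect fix can be closed against objective evidence, namely the full regression suite for the traced requirement, rather than a bare report that the defect was fixed.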
Approaching the problem in this way may help to avoid negative impacts to customers, as well as expensive proceedings such as this one, in the future.

Customers have been harmed by SmartCare's shortcomings, and we have demonstrated many of these failures in a qualitative way through the testimony of Ms. Keim and Ms. Arnold. Because of this harm, a portion of the cost of SmartCare should be removed from rate base. Quantifying the gap between the test coverage and test execution that existed prior to go-live and the test coverage and execution prescribed by the independent third party will help provide a fair and objective basis for calculating the amount of SmartCare costs that should be disallowed. We urge the Commission to provide for further process in this regard.

C. The Commission should require the implementation of an action plan to restore public confidence in SmartCare.

In a technical conference, Ms. Arnold was asked about the action plan she recommends on page 31 of her Direct Testimony. She spoke about the importance of building public confidence in a system such as SmartCare. Drawing on her participation in another state's recovery from a failed implementation of a system involving the public's access to medical care, one in which public confidence was ultimately restored, she offered insights into how to rebuild trust. A key element is buy-in from stakeholders. Because this is a contested adjudicatory proceeding, buy-in starts with the Commission, "to show the public that there's a commitment and investment in just moving forward and resolving these outstanding frustration, (sic) uncertainty, doubt, suspicions."115 This effort should, of course, be transparent, particularly with regard to the validation effort, so that ratepayers and other stakeholders know that the system is undergoing important third party testing.

Another important component would be a mechanism designed to make aggrieved customers whole. Adoption of the Staff's proposal for outstanding high-use complaints, as we recommend below, is an important part of this. CMP should also provide periodic reports on the identification and correction of flaws uncovered in SmartCare.

D. The Commission should adopt the Staff's proposal for independent audits and a review process for customers with high-usage complaints.

In its Bench Analysis, Commission Staff proposed a process for addressing the concerns of customers who have unresolved high-usage complaints.116 With the exception described below, we support the adoption of this approach as a way of helping customers reach closure on these concerns and complaints. Staff recommends that customers with unresolved complaints be given the option of an audit performed by a third party, such as Efficiency Maine Trust, that would compare their actual usage to the metered usage in real time. If there is no discrepancy, the auditor could offer advice about electricity usage. If a discrepancy between the two is revealed, further analysis would ensue.

115 Tr. 10/24/19 at 99, lines 15-18.
116 Bench Analysis at 8-12.

The Staff proposes that customers would qualify for this program only if they had complained to the CASD and had found no explanation or other relief. We have a significant concern with this. It is our understanding from interactions with many customers that some are assuming that their high bill concerns will be resolved in this docket.
Because of press coverage surrounding the appointment of Liberty and BerryDunn to review SmartCare, and certain orders from the Commission, many customers with high bill concerns have probably not filed complaints with the CASD. They have a not-unreasonable expectation that CMP's billing system will be repaired as a result of this process and that their high bill concerns will be addressed. We believe these customers would be wrongly excluded from the program under Staff's approach. While the filing of a complaint is a convenient marker to use, we suggest instead that a more liberal plan be developed for these customers. This would necessarily require a public information campaign at the end of this proceeding (one may be required in any event), and in that process customers who seek to enter the program could be instructed to file a high-usage complaint if they have not already done so. There may be other ways to include these customers, and we remain open to other ideas that would not exclude them from possible remedies and closure.

E. The Commission should make no findings with respect to individual high bill complaints, to allow customers to pursue any such claims on their own.

We do not believe that a Commission order resulting from this case should preclude the ability of customers, should they so choose, to seek remedies either in court or through further interaction with the CASD. We urge the Commission to make an explicit finding on this point.

V. Conclusion

As briefed herein, CMP management understood what best practices required in the implementation of SmartCare, but it failed to follow those practices in many crucial areas. We urge the Commission to find CMP's implementation of SmartCare imprudent and to remedy the situation as outlined above.

Respectfully submitted,

Eric J. Bryant
Senior Counsel

Andrew Landry
Deputy Public Advocate