Department of Health and Human Services
Centers for Disease Control and Prevention

National Program of Cancer Registries
Data Quality Evaluations

California Cancer Registry
Diagnosis Year: 2010

CDC Contract No. 200-2008-27957

ACKNOWLEDGMENTS

The staff of the Data Quality Evaluations Program would like to thank Janet Bates, M.D., M.P.H., Chief, Cancer Surveillance Section, of the California Cancer Registry, and her staff for their assistance with this evaluation. Its success was made possible by their generous and diligent efforts. We gratefully acknowledge their invaluable contribution to achieving standards of cancer data quality.

We also thank the Centers for Disease Control and Prevention's Contracting Officer's Technical Representative and California State Program Consultant, Mary Lewis, CTR, for her participation in and guidance on the technical and administrative aspects of the evaluation. Last but not least, we thank the ICF Publications group for its contributions in preparing this report.

TABLE OF CONTENTS

Acknowledgments
I. INTRODUCTION
II. PURPOSE
III. CONFIDENTIALITY AND SECURITY
IV. MATERIALS AND METHODS
    Eligibility for NPCR Evaluation
    Data Sources
    Reconsolidation Process
    Abstract-Level Visual Editing Evaluation Process
    Policy and Procedure Manual Review
V. EVALUATION WORK PLAN
    Master Extract File of Reportable Cases
    Reconsolidation Activities
    Abstract-Level Visual Editing Evaluation Activities
    Policy and Procedure Manual Review
VI. RECONCILIATION
    Reconsolidation Cases
    Abstract-Level Visual Editing Evaluation Cases
VII. RESULTS AND DISCUSSION
    Reconsolidation
    Abstract-Level Visual Editing Evaluation and Reconsolidation
VIII. CONCLUSION AND RECOMMENDATIONS
    Recommendations
IX. EVALUATION TEAM
APPENDIX: REFERENCES
APPENDIX: TABLES
APPENDIX: POLICY AND PROCEDURE MANUAL CHECKLIST

I. INTRODUCTION

Since the beginning of the National Program of Cancer Registries (NPCR), the Division of Cancer Prevention and Control within the Centers for Disease Control and Prevention (CDC) has provided assistance to States funded under NPCR. This assistance includes support in developing and enhancing State cancer registries; maintaining effective registry operations; and monitoring compliance with NPCR program standards for completeness, timeliness, and the quality of data reported to the central cancer registry under the auspices of Public Law 102-515 (the National Cancer Registries Amendment Act).

On October 1, 2011, ICF Macro, Inc. (an ICF International Company, hereafter referred to as ICF) was awarded a 2-year contract for a pilot project to assess the accuracy of State central cancer registry data under the auspices of NPCR. The Data Quality Evaluations (DQE) activity follows the guidelines set by CDC and NPCR for assessment of State cancer registry data. The results provide a comparison of the State's performance with NPCR program standards, along with recommendations to improve the State cancer registry's data accuracy.

II. PURPOSE

The primary purpose of NPCR-DQE is to assess the quality of the data of NPCR-funded, statewide, population-based cancer registries. These data are a crucial part of cancer surveillance systems because they are used for planning, operating, funding, and evaluating cancer control programs. Complete and accurate data are essential for estimating variations in and changes among population subgroups over time. The evaluation assessment is based on the existence of appropriate policies and procedures for the following:

1) Data consolidation
2) Assessment of data quality
3) Text documentation.

III. CONFIDENTIALITY AND SECURITY

All evaluation functions are performed under the pertinent confidentiality statutes. Even though patient identifiers have been deleted from the extract files, data security practices are still maintained. Data accessed by DQE evaluators during the evaluation were used only for the purpose of conducting the evaluation. Upon completion of the evaluation analysis and reporting activities, California Cancer Registry (CCR) data were either returned to the State or destroyed, as required by the statement of disposition in the confidentiality agreement.

IV. MATERIALS AND METHODS

ELIGIBILITY FOR NPCR EVALUATION

All States receiving funding from CDC/NPCR for the operation of a central cancer registry are eligible for the Data Quality Evaluation.

DATA SOURCES

CCR prepared two extract files.
The first was a "master abstract file" of colon, rectum, lung, female breast, corpus uteri, uterus NOS, and prostate cases diagnosed in 2010. That file was a merged, unduplicated file (that is, it did not contain multiple facility reports for the same reportable malignancy). A second extract file containing source abstract-level data for the same sites was also received from CCR.

DQE staff used PC SAS software for the sampling. The customized client-server application and NPCR-DQE evaluation utility, CRS Plus, was used by DQE evaluators for record Reconsolidation. A customized version of Microsoft (MS) Access was used as the reconciliation tool, as well as for data analysis and report generation.

ICF assessed the reliability of CCR data by performing electronic testing of required data elements to ensure compatibility with the North American Association of Central Cancer Registries' (NAACCR) Standards for Cancer Registries, Volume II: Data Standards and Data Dictionary, Fifteenth Edition, and determined that the data were sufficiently reliable for the purposes of this report. Clinical check edits were also applied to the data and run on merged-level data to assess the completeness of treatment data. Any discrepancies generated warnings indicating standard treatment not captured or recorded properly; however, no reconciliation by the central registry was required in response to clinical check activities. A report of the findings was generated, and the results were provided to NPCR as supplemental information. These results are displayed in table 1.

Table 1. Clinical Check Edits Results

NPCR Clinical Check Edits—2010 Data | Total Eligible Cases | Total Cases With Warning Messages | Total Cases Without Warning Messages | Percentage of Cases Without Warning Messages
Prognostic and Staging Info, Breast (Clin2) | 19,373 | 5,342 | 14,031 | 72.43%
Prognostic and Staging Info, Colon (Clin2) | 7,447 | 2,370 | 5,077 | 68.18%
Radiation With Breast-Conserving Surg (Clin2) | 7,562 | 2,819 | 4,743 | 62.72%
Radiation With Rectal Cancer Surgery (Clin2) | 459 | 0 | 459 | 100.00%
Surgically Treated Non-metastatic Colon Canc (Clin2) | 4,480 | 1,693 | 2,787 | 62.21%
Systemic Treatment With Breast Surgery (Clin2) | 5,085 | 2,866 | 2,219 | 43.64%

The Radiation With Rectal Cancer Surgery (Clin2) edit had the highest warning-message-free percentage. The Systemic Treatment With Breast Surgery (Clin2) edit had the lowest warning-message-free percentage.

RECONSOLIDATION PROCESS

Data evaluations are done to assess the accuracy (agreement with the source record) and reproducibility (agreement among data collectors) of registry data. The purposes of Reconsolidation are:

• To assess the consistency of the interpretation and abstracting of the medical record
• To estimate rates of agreement
• To identify problems in data collection and interpretation.

The CCR Reconsolidation sample included 200 cases, along with all abstract-level cases associated with them. The evaluators reviewed all abstract-level (Pending) cases associated with the 200 cases in the CCR sample and compared each of the abstract-level data elements included in the evaluation to the corresponding data elements in the central registry Summary case. Reconsolidation of abstract-level data was conducted using the customized CRS Plus utility.
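To make the comparison step concrete, the following is a minimal sketch in Python of how abstract-level (Pending) data elements can be checked against the consolidated Summary case. The dictionary layout and field names are illustrative assumptions for this report, not the internals of the customized CRS Plus utility.

```python
# Illustrative sketch only: compare each abstract-level ("Pending") case against
# the consolidated ("Summary") case and collect discrepant data elements for
# evaluator review. Data layout and names are assumptions, not CRS Plus internals.

def find_discrepancies(summary, pending_cases, elements):
    """Return (case_id, element, pending_value, summary_value) for each mismatch."""
    discrepancies = []
    for pending in pending_cases:
        for element in elements:
            if pending.get(element) != summary.get(element):
                discrepancies.append(
                    (pending["case_id"], element, pending.get(element), summary.get(element))
                )
    return discrepancies

summary = {"Grade": "1", "CS Lymph Nodes": "000"}
pending_cases = [{"case_id": "A1", "Grade": "9", "CS Lymph Nodes": "000"}]
print(find_discrepancies(summary, pending_cases, ["Grade", "CS Lymph Nodes"]))
# [('A1', 'Grade', '9', '1')] -- the evaluator would then review the supporting text
```

In the actual evaluation, each flagged mismatch triggered a review of the supporting text in both the Pending and Summary cases before any recode was made.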
Upon completion of the Reconsolidation process, the data were imported into a customized version of MS Access that displayed any queried data elements to be addressed during the reconciliation process.

ABSTRACT-LEVEL VISUAL EDITING EVALUATION PROCESS

The CCR Visual Editing evaluation sample included 10 cases, along with all abstract-level cases associated with them. The evaluator reviewed all data elements included in the evaluation, as well as the corresponding text, for each abstract-level case. Any abstract-level codes not substantiated by text were recoded. Upon completion of the Visual Editing evaluation, the data were provided to CCR in an MS Excel spreadsheet to be addressed during the reconciliation process.

Once the central registry had reconciled the data queries and returned the file to ICF, the data were reconsolidated and compared to the merged central registry data to identify any final errors. These errors resulted when there was a complete lack of text to support the coded data element or when text was available but the coded data element was incorrect.

POLICY AND PROCEDURE MANUAL REVIEW

The CCR Policy and Procedure Manual was reviewed to assess the quality of CCR data on the basis of the registry's quality control/editing and consolidation procedures.

V. EVALUATION WORK PLAN

MASTER EXTRACT FILE OF REPORTABLE CASES

CCR prepared a master extract file of reportable cases diagnosed in 2010 among California residents. This master extract file contained consolidated cases with multiple abstracts from different facilities for the same reportable case for colon, rectum (including rectosigmoid), lung, female breast, corpus uteri, uterus NOS, and prostate cases.

RECONSOLIDATION ACTIVITIES

The DQE evaluator reviewed all abstract-level cases (Pending cases) associated with the 200 cases (Summary cases) contained in CCR's sample, which had been loaded into a customized version of CRS Plus software provided by CDC. If a data element in the Pending case did not match the corresponding data element in the Summary case, the evaluator reviewed the supporting text in both the Pending and Summary cases and identified the most appropriate data value. If the Pending case information was deemed most appropriate, the evaluator updated the Summary case data element in question and provided a reason for recode to support the decision, thus reconsolidating the case. If the Summary case information was deemed most appropriate, no change was made.

After the 200 cases had been reviewed and reconsolidated, the data were compared to the original master extract file of merged, consolidated data and then loaded into a customized MS Access database. The evaluator then compiled the data discrepancies to be provided to CCR for reconciliation activities.

ABSTRACT-LEVEL VISUAL EDITING EVALUATION ACTIVITIES

To evaluate Visual Editing accuracy, an additional sample of 10 merged cases was drawn from the CCR extract file and imported into CRS Plus. The evaluator reviewed every data element related to the evaluation, and its associated text, for each abstract-level case associated with each of the CCR unique patient identifiers/merged cases. If the text did not support the data element code, the evaluator recoded the data element based on the text provided and then supplied a reason for recode to explain the new coded value. If the text was missing entirely, the evaluator recoded the data element to "unknown" (9, 99, or 999) and provided that explanation in the Reason for Recode field.

After all abstract-level cases had been evaluated for the 10 cases, the data for each case were entered into an MS Excel spreadsheet for CCR reconciliation activities. Once CCR completed the reconciliation and had an opportunity to resolve any queried data elements, the evaluator reversed those queries that had been resolved by the central registry. For example, one original Grade was coded as 1 (well differentiated) in the central registry, and the evaluator recoded it to grade 9 due to lack of supporting text documentation; the central registry responded, citing pathology report documentation of well differentiated, and the recode was reversed to the original code of grade 1.

The evaluator then reconsolidated the abstract-level cases for each of the 10 sample cases and compared this reconsolidated database to the central registry's original consolidated database. This Reconsolidation identified any abstract-level coding issues resulting from incomplete or missing text. Of specific significance are those abstract-level coding issues that impacted the accuracy of the Summary (merged) data, demonstrating the importance of providing accurate and complete text to substantiate coding choices (see table 5).

To assess the quality of the data for the 200-case Reconsolidation and the 10 Visual Editing cases, the evaluators reviewed the following 34 data elements. CS Site-Specific Factor 1 was reviewed only for female breast and lung cases; CS Site-Specific Factor 2 only for female breast and corpus uteri cases; CS Site-Specific Factor 3 only for prostate cancer cases; and CS Site-Specific Factors 8–14 only for female breast cases.

1) Tumor data:
   a. Primary Site (first 3 digits of the International Classification of Diseases for Oncology, Third Edition [ICD-O-3] topography code) (NAACCR #400)
   b. Subsite (fourth digit of the ICD-O-3 topography code) (NAACCR #400)
   c. Laterality (NAACCR #410)
   d. Histology (first 4 digits of the ICD-O-3 morphology code) (NAACCR #522)
   e. Behavior (fifth digit of the ICD-O-3 morphology code) (NAACCR #523)
   f. Grade (sixth digit of the ICD-O-3 morphology code) (NAACCR #440)
   g. Date of Diagnosis (yyyy/mm/dd) (NAACCR #390)
   h. Sequence Number (central registry) (NAACCR—Sequence Number—Central) (NAACCR #380)
   i. Collaborative Stage Tumor Size (NAACCR #2800)
   j. Collaborative Stage Extension (NAACCR—CS Extension) (NAACCR #2810)
   k. Collaborative Stage Tumor Size/Ext Eval (NAACCR #2820)
   l. Collaborative Stage Lymph Nodes (NAACCR—CS Lymph Nodes) (NAACCR #2830)
   m. Collaborative Stage Metastasis (NAACCR—CS Metastasis at Diagnosis) (NAACCR #2850)
   n. Collaborative Stage Site-Specific Factor 1 (NAACCR #2880)
      i. Lung: Separate Tumor Nodules—Ipsilateral Lung
      ii. Breast: Estrogen Receptor (ER) Assay
   o. Collaborative Stage Site-Specific Factor 2 (NAACCR #2890)
      i. Breast: Progesterone Receptor (PR) Assay
      ii. Corpus uteri: Peritoneal Cytology
   p. Collaborative Stage Site-Specific Factor 3 (NAACCR—CS Site-Specific Factor 3) (NAACCR #2900)
      i. Prostate: CS Extension—Pathologic Extension
   q. Collaborative Stage Site-Specific Factor 8 (NAACCR #2862)
      i. Breast: HER2: Immunohistochemistry (IHC) Lab Value
   r. Collaborative Stage Site-Specific Factor 9 (NAACCR #2863)
      i. Breast: HER2: Immunohistochemistry (IHC) Test Interpretation
   s. Collaborative Stage Site-Specific Factor 10 (NAACCR #2864)
      i. Breast: HER2: Fluorescence In Situ Hybridization (FISH) Lab Value
   t. Collaborative Stage Site-Specific Factor 11 (NAACCR #2865)
      i. Breast: HER2: Fluorescence In Situ Hybridization (FISH) Test Interpretation
   u. Collaborative Stage Site-Specific Factor 12 (NAACCR #2866)
      i. Breast: HER2: Chromogenic In Situ Hybridization (CISH) Lab Value
   v. Collaborative Stage Site-Specific Factor 13 (NAACCR #2867)
      i. Breast: HER2: Chromogenic In Situ Hybridization (CISH) Test Interpretation
   w. Collaborative Stage Site-Specific Factor 14 (NAACCR #2868)
      i. Breast: HER2: Result of Other or Unknown Test
   x. Derived Summary Stage 2000 (NAACCR—Derived SS2000) (NAACCR #3020)
2) Treatment data:
   a. Date of first course treatment (SEER) (NAACCR—Date of Initial RX—SEER) (NAACCR #1260)
   b. Surgery of primary site (NAACCR—RX Summary Surgery Primary Site) (NAACCR #1290)
   c. Regional lymph node surgery (NAACCR—RX Summary Scope Regional Lymph Node Surgery) (NAACCR #1292)
   d. Surgery of other regional/distant site (NAACCR—RX Summary Surgery Other Regional/Distant) (NAACCR #1294)
   e. Radiation therapy (NAACCR—Radiation Regional RX Modality) (NAACCR #1570)
   f. Chemotherapy (NAACCR—RX Summary Chemotherapy) (NAACCR #1390)
   g. Hormone therapy (NAACCR—RX Summary Hormone) (NAACCR #1400)
   h. Biological response modifier therapy (NAACCR—RX Summary Biological Response Modifier) (NAACCR #1410)
   i. Transplant/endocrine therapy (NAACCR—RX Summary Transplant/Endocrine) (NAACCR #3250)
   j. Other therapy (NAACCR—RX Summary Other) (NAACCR #1420)

POLICY AND PROCEDURE MANUAL REVIEW

The evaluator reviewed the CCR Policy and Procedure Manual. CCR conducts visual editing and documents its data consolidation procedures (i.e., manual review and use of tumor linkage and consolidation). NPCR requires text documentation, and CCR is encouraged to include this requirement in the central registry policy and procedure manual. CCR provides in-depth feedback to reporting facilities following its data quality audits. This type of central registry feedback to data reporters is essential to improving data quality and allows the central registry to identify and initiate training priorities.

VI. RECONCILIATION

RECONSOLIDATION CASES

The MS Access database containing the 200 Reconsolidation cases was provided to CCR, via the secure document server, for initiation of reconciliation within 1 week of the evaluation. CCR staff reviewed each queried data element and the reason for recode provided by the evaluator. If CCR staff disagreed with the evaluator's recode, CCR provided an explanation in the Central Registry Response section. At the completion of the reconciliation period, the MS Access database containing CCR's responses was uploaded to the California folder on the secure document server. The DQE staff evaluator reviewed CCR's responses and reversed any recoded data elements that had been justified by CCR. The remaining errors were then analyzed to write the final report.

ABSTRACT-LEVEL VISUAL EDITING EVALUATION CASES

The MS Excel spreadsheet containing the 10 Visual Editing cases was provided to CCR via the secure document server for initiation of reconciliation. CCR staff reviewed each queried data element and the reason for recode provided by the evaluator. If CCR staff disagreed with the evaluator's recode, an explanation was provided in the Central Registry Response section of the database. The DQE staff evaluator then reviewed CCR's responses and reversed any recoded data elements that had been resolved by CCR. The remaining errors were then analyzed to write the final report.
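As a concrete illustration of this reconciliation cycle, the short Python sketch below mirrors the query/response/reversal logic described above, using the Grade example from Section V. The data layout is an assumption made for illustration; it does not represent the actual MS Access or MS Excel tooling used in the evaluation.

```python
# A minimal sketch (assumed data layout, not the actual reconciliation tooling):
# recodes that the central registry justified are reversed to the original value;
# the rest stand and are counted as final errors for the report.

def reconcile(queries):
    """Each query carries the element name, original and recoded values, and the
    registry's response ("justified" with documentation, or None if it stands)."""
    final_errors = []
    for q in queries:
        if q["registry_response"] == "justified":
            q["final_value"] = q["original_value"]   # reversal: registry documentation accepted
        else:
            q["final_value"] = q["recoded_value"]    # recode stands; counted as a final error
            final_errors.append(q)
    return final_errors

# Example: the Grade query described in Section V.
queries = [{
    "element": "Grade",
    "original_value": "1",             # well differentiated
    "recoded_value": "9",              # recoded for lack of supporting text
    "registry_response": "justified",  # pathology report documents well differentiated
}]
print(reconcile(queries))  # [] -- the recode is reversed, so no final error remains
```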
VII. RESULTS AND DISCUSSION

RECONSOLIDATION

A total of 200 cases were reconsolidated. Table 2 describes data elements with and without errors. A total of 24 base data elements were included in the evaluation; in addition, CS Site-Specific Factor 1 was evaluated for lung; CS Site-Specific Factors 1, 2, and 8–14 were evaluated for female breast; CS Site-Specific Factor 2 was evaluated for corpus uteri; and CS Site-Specific Factor 3 was evaluated for prostate. Of a total of 5,616 possible data elements that could have had errors, 89 data elements (1.58 percent) were found to have errors. The resultant aggregate data accuracy rate was 98.42 percent.

Data Accuracy (%) = (Number of Data Elements Without Discrepancy x 100) / Total Data Elements, where Total Data Elements = Number of Cases Reabstracted x Number of Data Elements Reviewed.

Table 2. Number of Data Elements With and Without Errors, by Site

Site | Number of Elements Reviewed | Number of Elements With Errors | Number of Elements Without Errors | Accuracy Rate
Colon | 336 | 3 | 333 | 99.11%
Rectum | 432 | 8 | 424 | 98.15%
Lung | 800 | 11 | 789 | 98.63%
Female Breast | 2,673 | 55 | 2,618 | 97.94%
Corpus Uteri | 400 | 3 | 397 | 99.25%
Prostate | 975 | 9 | 966 | 99.08%
Total | 5,616 | 89 | 5,527 | 98.42%

Figure 1 illustrates the number of final errors per case.

Figure 1. Error-Count Distribution (percentage of total cases, by errors per case): 0 errors, 71.0%; 1 error, 20.0%; 2 errors, 5.5%; 3 errors, 2.0%; 5 errors, 1.5%.

Table 3 illustrates the number of cases with and without errors, distributed among the 6 primary site groups.

Table 3. Number of Cases With and Without Errors, by Site

Site | Number of Cases Reviewed | Number of Cases With Errors | Number of Cases Without Errors | Percentage of Error-Free Cases
Colon | 14 | 2 | 12 | 85.71%
Rectum | 18 | 7 | 11 | 61.11%
Lung | 32 | 8 | 24 | 75.00%
Female Breast | 81 | 30 | 51 | 62.96%
Corpus Uteri | 16 | 3 | 13 | 81.25%
Prostate | 39 | 8 | 31 | 79.49%
Total | 200 | 58 | 142 | 71.00%

Table 4 illustrates the number of data element errors by primary site. A review of the final errors demonstrated that lack of supporting documentation or misinterpretation of available text impacted data quality in the following ways (these examples are not inclusive of all final errors):

• Radiation Regional RX Modality (10):
  - Text available, but no text regarding specific energy
  - Text available, but corresponding code incorrect
  - No text to support coded data element.
• CS Site-Specific Factor 8 (8):
  - Text available, but corresponding code incorrect
  - No text to support coded data element.
• Date of Diagnosis (7):
  - Text available, but corresponding code incorrect.
• CS Lymph Nodes (6):
  - Text available, but corresponding code incorrect
  - Text available, but no text regarding specific lymph node chain.
• RX Summary Surgery Primary Site (5):
  - Text available, but corresponding code incorrect
  - Text available, but no information regarding specific type of surgery.
• Grade (5):
  - Text available, but corresponding code incorrect.
• CS Extension (5):
  - Text available, but corresponding code incorrect.
• CS Site-Specific Factor 10 (5):
  - Text available, but corresponding code incorrect
  - Text available, but no specific information regarding FISH lab value.
• Date of Initial RX—SEER (5):
  - Text available, but corresponding code incorrect
  - Text available, but no text regarding specific date
  - No text.
• RX Summary Chemotherapy (5):
  - Text available, but corresponding code incorrect.

Table 4. Data Errors by Data Element and Primary Site

Data Element | Colon | Rectum | Lung | Female Breast | Corpus Uteri | Prostate | Total | Error Rate
No. of Cases Reviewed | 14 | 18 | 32 | 81 | 16 | 39 | 200 |
Primary Site | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
Subsite | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0.5%
Laterality | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
Histology | 0 | 0 | 0 | 2 | 1 | 0 | 3 | 1.5%
Behavior | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
Grade | 0 | 1 | 3 | 0 | 0 | 1 | 5 | 2.5%
Date of Diagnosis | 1 | 0 | 1 | 4 | 0 | 1 | 7 | 3.5%
Sequence Number—Central | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
CS Tumor Size | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0.5%
CS Extension | 0 | 0 | 1 | 0 | 1 | 3 | 5 | 2.5%
CS Tumor Size/Ext Eval | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0.5%
CS Lymph Nodes | 1 | 1 | 1 | 3 | 0 | 0 | 6 | 3.0%
CS Mets at Dx | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0.5%
CS Site-Specific Factor 1 | 0 | 0 | 0* | 2# | 0 | 0 | 2 | 1.0%
CS Site-Specific Factor 2 | 0 | 0 | 0 | 1# | 1* | 0 | 2 | 1.0%
CS Site-Specific Factor 3 | 0 | 0 | 0 | 0 | 0 | 0* | 0 | 0.0%
CS Site-Specific Factor 8 | 0 | 0 | 0 | 8# | 0 | 0 | 8 | 4.0%
CS Site-Specific Factor 9 | 0 | 0 | 0 | 4# | 0 | 0 | 4 | 2.0%
CS Site-Specific Factor 10 | 0 | 0 | 0 | 5# | 0 | 0 | 5 | 2.5%
CS Site-Specific Factor 11 | 0 | 0 | 0 | 2# | 0 | 0 | 2 | 1.0%
CS Site-Specific Factor 12 | 0 | 0 | 0 | 1# | 0 | 0 | 1 | 0.5%
CS Site-Specific Factor 13 | 0 | 0 | 0 | 1# | 0 | 0 | 1 | 0.5%
CS Site-Specific Factor 14 | 0 | 0 | 0 | 2# | 0 | 0 | 2 | 1.0%
Derived SS2000 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0.5%
Date of Initial RX—SEER | 1 | 1 | 1 | 0 | 0 | 2 | 5 | 2.5%
RX Summ—Surg Prim Site | 0 | 1 | 0 | 4 | 0 | 0 | 5 | 2.5%
RX Summ—Scope Reg LN Sur | 0 | 1 | 0 | 3 | 0 | 0 | 4 | 2.0%
RX Summ—Surg Oth Reg/Dis | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
Rad—Regional RX Modality | 0 | 0 | 2 | 7 | 0 | 1 | 10 | 5.0%
RX Summ—Chemo | 0 | 3 | 1 | 1 | 0 | 0 | 5 | 2.5%
RX Summ—Hormone | 0 | 0 | 0 | 2 | 0 | 0 | 2 | 1.0%
RX Summ—BRM | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
RX Summ—Transplnt/Endocr | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
RX Summ—Other | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0%
Total | 3 | 8 | 11 | 55 | 3 | 9 | 89 |

Key: * = Impacts stage; # = Clinically relevant

ABSTRACT-LEVEL VISUAL EDITING EVALUATION AND RECONSOLIDATION

Table 5 illustrates the errors found during the Visual Editing evaluation and Reconsolidation. Text documentation is an NPCR requirement and is essential to substantiate the data element codes. A total of 21 abstract-level cases associated with the 10 merged cases were reviewed during the evaluation of Visual Editing Reconsolidation. Twenty-four base data elements were reviewed for each abstract. Additional CS Site-Specific Factors were reviewed for lung (CS Site-Specific Factor 1), female breast (CS Site-Specific Factors 1, 2, and 8–14), corpus uteri (CS Site-Specific Factor 2), and prostate (CS Site-Specific Factor 3). A total of 568 data elements were reviewed:

• Female breast: 3 cases: 6 abstracts x 33 data elements = 198
• Corpus uteri: 2 cases: 4 abstracts x 25 data elements = 100
• Prostate: 2 cases: 4 abstracts x 25 data elements = 100
• Rectum: 1 case: 3 abstracts x 24 data elements = 72
• Lung: 1 case: 2 abstracts x 25 data elements = 50
• Colon: 1 case: 2 abstracts x 24 data elements = 48

A total of 16 errors (2.82 percent) were found during the Visual Editing evaluation, and the resulting accuracy rate was 97.18 percent.
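Both accuracy rates follow directly from the formula given earlier in this section. The short Python sketch below reproduces the two figures as a worked check (illustrative only).

```python
# Worked check of the accuracy formula:
# Data Accuracy (%) = (elements without discrepancy * 100) / total elements,
# where total elements = cases reabstracted * data elements reviewed per case.

def accuracy_rate(total_elements, elements_with_errors):
    """Aggregate data accuracy as a percentage."""
    return (total_elements - elements_with_errors) * 100 / total_elements

print(round(accuracy_rate(5616, 89), 2))  # 98.42 -- 200-case Reconsolidation
print(round(accuracy_rate(568, 16), 2))   # 97.18 -- 10-case Visual Editing evaluation
```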
Of the total 16 errors found during the Reconsolidation of the 10 Visual Editing cases, 5 errors (31.25 percent) resulted from a complete lack of supporting text documentation. The remaining 11 errors (68.75 percent) occurred when text was available but the code was incorrect.

Table 5 (Abstract-Level Visual Editing Consolidation Errors, by data element: No Text | Coding Errors Due to Text/NA | Total) breaks the 16 errors down into the 5 with no supporting text and the 11 with coding errors despite available text; the affected data elements included CS Tumor Size/Ext Eval, CS Lymph Nodes, CS Site-Specific Factors 2, 8, and 9, RX Summ—Scope Reg LN Sur, and RX Summ—Chemo.

VIII. CONCLUSION AND RECOMMENDATIONS

CCR's overall data accuracy rate for merged, consolidated data was 98.42 percent; CCR is to be highly commended for this outstanding result. CCR's data accuracy rate for the Visual Editing evaluation and reconsolidation was 97.18 percent, and CCR is to be commended for this excellent result. A total of 5 errors resulted when text was completely missing; a total of 11 errors resulted when text was available but incorrectly coded for various data elements, impacting the consolidated information.

CCR is encouraged to continue conducting visual editing to maintain the high standard of data quality in the State, in addition to reviewing basic abstracting principles with staff and data reporters and emphasizing to all reporting facilities that text documentation to support coding and consolidation decisions is required by NPCR (unless otherwise specified in the CCR Policy and Procedure Manual).

RECOMMENDATIONS

In summary, our recommendations are as follows:

1) Statewide training should include a focus on the following items:
   a. Educating all reporting facilities that text documentation is required for all data elements, preferably using hands-on training, with an emphasis on Radiation Regional RX Modality (specific energy), CS Site-Specific Factor 8, and Date of Initial RX—SEER
   b. Reviewing CS Site-Specific Factor 8 coding rules
   c. Reviewing all radiographic imaging and pathology reports when coding Date of Diagnosis
   d. Reviewing Collaborative Staging rules, focusing on CS Extension and CS Lymph Nodes and on documentation of the specific lymph node chain
   e. Reviewing the rules regarding RX Summary Surgery Primary Site and RX Summary Chemotherapy.

IX. EVALUATION TEAM

The following individuals participated in the CCR audit:

ICF DQE
1) Don McMaster, M.S., M.B.A., Project Officer
2) Vaishali Joshi, Project Director
3) Brenda Lange, CTR, EMT, Project Manager
4) Qiming He, Ph.D., Statistician
5) Phil Schaeffer, M.P.A., Senior Programmer/Analyst
6) Janice Gregoire, M.S.H.S., CTR, Oncology Data Analyst
7) Stephanie Boyd, Project Assistant
8) Rick Piet, Editor

California Cancer Registry
1) Janet Bates, M.D., M.P.H., Chief, Cancer Surveillance Section
2) Kyle Ziegler, CTR, Quality Control Analyst, Auditor-Trainer

CDC
1) Mary Lewis, CTR, Contracting Officer's Technical Representative and California State Program Consultant

The CCR evaluation was conducted by Janice Gregoire, M.S.H.S., CTR, and Brenda Lange, CTR, EMT, of ICF. All members of the evaluation team are trained professionals in the areas of cancer registry operations and management.

APPENDIX: REFERENCES

1. International Classification of Diseases for Oncology, Third Edition. Geneva: World Health Organization, 2000.
2. Collaborative Stage Data Collection System User Documentation and Coding Instructions, version 02.04. Collaborative Stage Work Group of the American Joint Committee on Cancer. Chicago: American Joint Committee on Cancer, December 2011.
3. FORDS Manual. Commission on Cancer (CoC). Chicago: American College of Surgeons, 2010.
4. SEER Program Manual, Third Edition. Cancer Statistics Branch, National Cancer Institute. Washington, DC: U.S. Department of Health and Human Services, January 1998.
5. SEER Inquiry System (http://www.seer.cancer.gov/). Maintained by the SEER Program, National Cancer Institute. Updated annually.
6. Inquiry and Response System of the American College of Surgeons (http://cancerbulletin.facs.org/forums/). Maintained by CoC. Updated annually.
7. Standards for Cancer Registries, Volume II: Data Standards and Data Dictionary, Fifteenth Edition, record layout version 12.1. Ed. Thornton, M. Springfield, IL: North American Association of Central Cancer Registries, June 2010.
8. California Cancer Registry Data Dictionary.
9. Chromy, J.R. 1979. "Sequential Sample Selection Methods." Proceedings of the Survey Research Methods Section of the American Statistical Association, 401–406.
10. Brewer, K.R.W., and Hanif, M. 1982. Sampling With Unequal Probabilities. New York: Springer-Verlag.

APPENDIX: TABLES

DISTRIBUTION OF ERRORS BY PRIMARY SITE

TABLE 6. ERROR COUNT DISTRIBUTION FOR COLON (14 CASES)

Number of Errors per Case | Number of Cases | Percentage
0 | 12 | 85.71%
1 | 1 | 7.14%
2 | 1 | 7.14%
Total | 14 | 100%
Note: Details may not total to 100 percent due to rounding.

TABLE 7. ERROR COUNT DISTRIBUTION FOR RECTUM (18 CASES)

Number of Errors per Case | Number of Cases | Percentage
0 | 11 | 61.11%
1 | 6 | 33.33%
2 | 1 | 5.56%
Total | 18 | 100%

TABLE 8. ERROR COUNT DISTRIBUTION FOR LUNG (32 CASES)

Number of Errors per Case | Number of Cases | Percentage
0 | 24 | 75.00%
1 | 6 | 18.75%
2 | 1 | 3.13%
3 | 1 | 3.13%
Total | 32 | 100%
Note: Details may not total to 100 percent due to rounding.

TABLE 9. ERROR COUNT DISTRIBUTION FOR FEMALE BREAST (81 CASES)

Number of Errors per Case | Number of Cases | Percentage
0 | 51 | 62.96%
1 | 17 | 20.99%
2 | 7 | 8.64%
3 | 3 | 3.70%
5 | 3 | 3.70%
Total | 81 | 100%
Note: Details may not total to 100 percent due to rounding.

TABLE 10. ERROR COUNT DISTRIBUTION FOR CORPUS UTERI (16 CASES)

Number of Errors per Case | Number of Cases | Percentage
0 | 13 | 81.25%
1 | 3 | 18.75%
Total | 16 | 100%
TABLE 11. ERROR COUNT DISTRIBUTION FOR PROSTATE (39 CASES)

Number of Errors per Case | Number of Cases | Percentage
0 | 31 | 79.49%
1 | 7 | 17.95%
2 | 1 | 2.56%
Total | 39 | 100%

APPENDIX: POLICY AND PROCEDURE MANUAL CHECKLIST

Policy and Procedure Manual Review
State: California

NPCR Program Manual Reference | Documented | Comments

Quality Control
Electronic reporting format | X |
Secure Internet-based, FTP, https, or encrypted e-mail mechanism to receive electronic data | X |
Standardized data elements | X |
Standardized data edits | X |
Data linkages | X |
Designated CTR responsible for QA program | X |
Reabstracting audits from a sample of source documents | X | Staff from both Regional offices and the Central Registry visit cancer reporting facilities to perform QC audits.
Text documentation requirement | X | Documentation from the Coding section: codes must be supported by text documentation. The CCR manual includes a list of common, acceptable symbols and abbreviations.
Feedback to facilities | X | Audits of a routine sample of consolidated cases at CCR

Record Consolidation
Consolidation procedure | X |
Education program | X | Combination of both manual and electronic