HOSPITAL SURVEY ON PATIENT SAFETY CULTURE
2016 User Comparative Database Report

Agency for Healthcare Research and Quality
Advancing Excellence in Health Care
www.ahrq.gov

Hospital Survey on Patient Safety Culture: 2016 User Comparative Database Report

Prepared for:
Agency for Healthcare Research and Quality
U.S. Department of Health and Human Services
5600 Fishers Lane
Rockville, MD 20857
www.ahrq.gov
Contract No. HHSA 290201300003C

Managed and prepared by:
Westat, Rockville, MD
Theresa Famolaro, M.P.S., M.S.
Naomi Dyer Yount, Ph.D.
Willow Burns, M.A.
Elizabeth Flashner, M.H.A.
Helen Liu
Joann Sorra, Ph.D.

AHRQ Publication No. 16-0021-EF
March 2016

The content of this document may be used and reprinted without permission except for the following: Federal Government logos, items noted with specific restrictions, and those copyrighted materials that are clearly noted in the document. Further reproduction of those copyrighted materials is prohibited without the specific permission of copyright holders.

Suggested Citation: Famolaro T, Yount N, Burns W, et al. Hospital Survey on Patient Safety Culture 2016 User Comparative Database Report. (Prepared by Westat, Rockville, MD, under Contract No. HHSA 290201300003C). Rockville, MD: Agency for Healthcare Research and Quality; March 2016. AHRQ Publication No. 16-0021-EF.

The authors of this report are responsible for its content. Statements in the report should not be construed as endorsement by the Agency for Healthcare Research and Quality or the U.S. Department of Health and Human Services.

No investigators have any affiliations or financial involvement (e.g., employment, consultancies, honoraria, stock options, expert testimony, grants or patents received or pending, or royalties) that conflict with material presented in this report.

Table of Contents

Executive Summary
    Survey Content
    2016 Survey Administration Statistics
    Hospital Characteristics
    Respondent Characteristics
    Areas of Strength for Most Hospitals
    Areas With Potential for Improvement for Most Hospitals
    Results by Hospital Characteristics
    Results by Respondent Characteristics
    Trending: Comparing Results Over Time
    Additional Trending Statistics
    Trending Results by Hospital Characteristics
    Trending Results by Respondent Characteristics
    Action Planning for Improvement
Purpose and Use of This Report
Chapter 1. Introduction
    Survey Content
    Data Limitations
Chapter 2. Survey Administration Statistics
    Overall Hospital Statistics
Chapter 3. Hospital Characteristics
    Bed Size
    Teaching Status
    Ownership
    Geographic Region
    Children’s Hospitals
Chapter 4. Respondent Characteristics
    Work Area/Unit
    Staff Position
    Interaction With Patients
    Tenure With Current Hospital
    Tenure in Current Work Area/Unit
    Tenure in Current Specialty or Profession
    Hours Worked Per Week
Chapter 5. Overall Results
    Composite-Level Results
    Item-Level Results
Chapter 6. Comparing Your Results
    Description of Comparative Statistics
    Composite and Item-Level Comparative Tables
    Appendixes A and B: Overall Results by Hospital and Respondent Characteristics
Chapter 7. Trending: Comparing Results Over Time
    Description of Trending Statistics
    Composite and Item-Level Trending Results
    Bar Charts of Trending Results
    Appendixes C and D: Trending Results by Hospital and Respondent Characteristics
Chapter 8. What’s Next? Action Planning for Improvement
    Seven Steps of Action Planning
References
Notes: Description of Data Cleaning and Calculations
    Data Cleaning
    Response Rates
    Calculation of Percent Positive Scores
    Item-Level Percent Positive Response
    Composite-Level Percent Positive Response
    Item and Composite Percent Positive Scores
    Minimum Number of Responses
    Percentiles

List of Tables

Table 1-1. Patient Safety Culture Composites and Definitions
Table 2-1. Trending and Nontrending Overall Statistics—2016 Database Hospitals
Table 2-2. Response Rate Statistics—2016 Database Hospitals
Table 2-3. Survey Administration Statistics—2016 Database Hospitals
Table 2-4. Average Response Rate by Survey Administration Mode—2016 Database Hospitals
Table 3-1. Bed Size: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals
Table 3-2. Teaching Status: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals
Table 3-3. Ownership: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals
Table 3-4. Geographic Region: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals
Table 3-5. Children’s Hospitals: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals
Table 4-1. Work Area/Unit: Distribution of 2016 Database Respondents
Table 4-2. Staff Position: Distribution of 2016 Database Respondents
Table 4-3. Interaction With Patients: Distribution of 2016 Database Respondents
Table 4-4. Tenure With Current Hospital: Distribution of 2016 Database Respondents
Table 4-5. Tenure in Current Work Area/Unit: Distribution of 2016 Database Respondents
Table 4-6. Tenure in Current Specialty or Profession: Distribution of 2016 Database Respondents
Table 4-7. Hours Worked Per Week: Distribution of 2016 Database Respondents
Table 6-1. Interpretation of Percentile Scores
Table 6-2. Sample Percentile Statistics
Table 6-3. Composite-Level Comparative Results—2016 Database Hospitals
Table 6-4. Item-Level Comparative Results—2016 Database Hospitals
Table 6-5. Percentage of Respondents Giving Their Work Area/Unit Patient Safety Grade Comparative Results—2016 Database Hospitals
Table 6-6. Percentage of Respondents Reporting One or More Events in the Past 12 Months Comparative Results—2016 Database Hospitals
Table 7-1. Trending: Response Rate Statistics—2016 Database Hospitals
Table 7-2. Bed Size—Distribution of 2016 Trending and Nontrending Hospitals
Table 7-3. Teaching Status—Distribution of 2016 Trending and Nontrending Hospitals
Table 7-4. Ownership—Distribution of 2016 Trending and Nontrending Hospitals
Table 7-5. Geographic Region—Distribution of 2016 Trending and Nontrending Hospitals
Table 7-6. Example of Trending Statistics
Table 7-7. Example of Other Trending Statistics
Table 7-8. Trending: Composite-Level Results—2016 Database Hospitals
Table 7-9. Trending: Item-Level Results—2016 Database Hospitals
Table 7-10. Trending: Distribution of Work Area/Unit Patient Safety Grades—2016 Database Hospitals
Table 7-11. Trending: Distribution of Number of Events Reported in the Past 12 Months—2016 Database Hospitals
Table N1. Example of Computing Item and Composite Percent Positive Scores
Table N2. Example of Computing Patient Safety Grade and Number of Events Reported
Table N3. Data Table for Example of How To Compute Percentiles

List of Charts

Chart 5-1. Composite-Level Average Percent Positive Response – 2016 Database Hospitals
Chart 5-2. Item-Level Average Percent Positive Response – 2016 Database Hospitals
Chart 5-3. Average Percentage of 2016 Database Respondents Giving Their Work Area/Unit a Patient Safety Grade
Chart 5-4. Average Percentage of 2016 Database Respondents Reporting Events in the Past 12 Months
Chart 7-1. Trending: Percentage of 2016 Hospitals That Increased, Decreased, or Did Not Change on Each Composite
Chart 7-2. Trending: Percentage of 2016 Hospitals That Increased, Decreased, or Did Not Change on Work Area/Unit Patient Safety Grade as “Excellent” or “Very good”
Chart 7-3. Trending: Percentage of 2016 Hospitals That Increased, Decreased, or Did Not Change on Number of Events Reported as Equal to One or More Events Reported
Chart 7-4. Trending: Distribution of 2016 Hospitals by Number of Composites That Increased by 5 Percentage Points or More
Chart 7-5. Trending: Distribution of 2016 Hospitals by Number of Composites That Did Not Change by 5 Percentage Points or More
Chart 7-6. Trending: Distribution of 2016 Hospitals by Number of Composites That Decreased by 5 Percentage Points or More

List of Figures

Figure 8-1. Plan-Do-Study-Act Cycle

Appendixes cited in this report are provided electronically at www.ahrq.gov/qual/hospsurvey16/.

Executive Summary

In response to requests from hospitals interested in comparing their results with those of other hospitals on the Hospital Survey on Patient Safety Culture, the Agency for Healthcare Research and Quality (AHRQ) established the Hospital Survey on Patient Safety Culture comparative database. The first user comparative database report, released in 2007, included data from 382 U.S. hospitals.
The 2016 user comparative database report displays results from 680 hospitals and 447,584 hospital staff respondents. This report also includes a chapter on trending that presents results showing change over time for 326 hospitals that administered the survey and submitted data more than once. From 2007 to 2012, data were collected annually, and data from past databases were retained until more recent data were submitted, as long as the data were no more than 4.5 years old. Starting with the 2014 database, survey data are collected every 2 years and may be no more than 2 years old. Hospitals must submit their survey data to consecutive databases in order to trend their results over time; only hospitals that do so are included in the trending analysis.

This user comparative database report was developed as a tool for the following purposes:

• Comparison—To allow hospitals to compare their patient safety culture survey results with those of other hospitals.
• Assessment and Learning—To provide data to hospitals to facilitate internal assessment and learning in the patient safety improvement process.
• Supplemental Information—To provide supplemental information to help hospitals identify their strengths and areas with potential for improvement in patient safety culture.
• Trending—To provide data that describe changes in patient safety culture over time.

Survey Content

The hospital survey, released in November 2004, was designed to assess hospital staff opinions about patient safety issues, medical errors, and event reporting. The survey includes 42 items that measure 12 areas, or composites, of patient safety culture:

1. Communication openness.
2. Feedback and communication about error.
3. Frequency of events reported.
4. Handoffs and transitions.
5. Management support for patient safety.
6. Nonpunitive response to error.
7. Organizational learning—continuous improvement.
8. Overall perceptions of patient safety.
9. Staffing.
10. Supervisor/manager expectations and actions promoting safety.
11. Teamwork across units.
12. Teamwork within units.

The survey also includes two questions that ask respondents to provide an overall grade on patient safety for their work area/unit and to indicate the number of events they reported over the past 12 months.

2016 Survey Administration Statistics

• A total of 680 hospitals submitted data for the 2016 report.
• The average hospital response rate was 55 percent, with an average of 658 completed surveys per hospital.
• Most hospitals (78 percent) administered Web surveys, which resulted in lower response rates (54 percent) compared with response rates from paper (71 percent); hospitals that administered surveys by mixed mode had response rates similar to Web-only administration (55 percent).

Hospital Characteristics

• Most of the database hospitals (64 percent) have at least 100 beds.
• Database hospitals represented all geographic regions in the United States.
• Most database hospitals are nonteaching (62 percent) and are nongovernment not for profit (79 percent).
• Children’s hospitals represent 7 percent of database hospitals.
• Characteristics of database hospitals are fairly consistent with the distribution of hospitals registered with the American Hospital Association.

Respondent Characteristics

• There were 447,584 hospital staff respondents.
• The top three respondent work areas were:
  o Other (29 percent).
  o Medicine (12 percent).
  o Surgery (10 percent).
• The top three respondent staff positions were:
  o Registered nurse or licensed vocational nurse/licensed practical nurse (36 percent).
  o Other (21 percent).
  o Technician (e.g., EKG, Lab, Radiology) (11 percent).
• Most respondents (77 percent) indicated they had direct interaction with patients.
• More than half of the respondents (56 percent) indicated they had worked with their current hospital at least 6 years. Similarly, almost half of the respondents (47 percent) indicated they had worked in their current work area/unit at least 6 years.
• Most respondents worked either less than 40 hours a week (46 percent) or 40 to 59 hours per week (47 percent).

Note: Many respondents chose “Other” for their work area/unit, and many chose “Other” for their staff position. These response options allowed respondents to note their specific work area/unit or staff position, but that information was not collected from the hospitals.

Areas of Strength for Most Hospitals

Percent positive is the percentage of positive responses (e.g., Agree, Strongly agree) to positively worded items (e.g., “People support one another in this unit”) or negative responses (e.g., Disagree, Strongly disagree) to negatively worded items (e.g., “We have safety problems in this unit”). The three areas of strength, or composites with the highest average percent positive responses, were:

1. Teamwork Within Units (82 percent positive)—Staff support each other, treat each other with respect, and work together as a team.
2. Supervisor/Manager Expectations and Actions Promoting Patient Safety (78 percent positive)—Supervisors/managers consider staff suggestions for improving patient safety, praise staff for following patient safety procedures, and do not overlook patient safety problems.
3. Organizational Learning—Continuous Improvement (73 percent positive)—Mistakes have led to positive changes and changes are evaluated for effectiveness.

Areas With Potential for Improvement for Most Hospitals

The three areas that showed potential for improvement, or composites with the lowest average percent positive responses, were:

1. Nonpunitive Response to Error (45 percent positive)—Staff feel that their mistakes and event reports are not held against them and that mistakes are not kept in their personnel file.
2. Handoffs and Transitions (48 percent positive)—Important patient care information is transferred across hospital units and during shift changes.
3. Staffing (54 percent positive)—There are enough staff to handle the workload and work hours are appropriate to provide the best care for patients.

Results by Hospital Characteristics

Bed Size

• Smaller hospitals (6–24 beds and 25–49 beds) had the highest percent positive for the average across all composites (69 percent); larger hospitals (300–399 beds) had the lowest (61 percent).
• Hospitals with 25–49 beds had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (83 percent); hospitals with 300–399 beds and 400–499 beds had the lowest (70 percent).

Teaching Status and Ownership

• Nonteaching hospitals, on average, scored higher than teaching hospitals by 5 percentage points or more on Overall Perceptions of Patient Safety, Staffing, and Handoffs and Transitions.
• Nonteaching hospitals had a higher percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (78 percent) than teaching hospitals (73 percent).
• Overall, hospitals did not have large differences across ownership categories on the 12 composites, patient safety grade, or number of events reported.

Geographic Region

• East South Central hospitals had the highest percent positive for the average across all composites (68 percent); New England and Mid-Atlantic hospitals had the lowest (61 percent).
• West North Central hospitals had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (79 percent); Mid-Atlantic hospitals had the lowest (70 percent).
• Pacific hospitals had the highest percentage of respondents who reported one or more events in the past year (50 percent); West South Central hospitals had the lowest (39 percent).

Note: States and territories are categorized into American Hospital Association (AHA)-defined regions as follows:
• New England: CT, MA, ME, NH, RI, VT
• Mid-Atlantic: NJ, NY, PA
• South Atlantic/Associated Territories: DC, DE, FL, GA, MD, NC, SC, VA, WV, Puerto Rico, Virgin Islands
• East North Central: IL, IN, MI, OH, WI
• East South Central: AL, KY, MS, TN
• West North Central: IA, KS, MN, MO, ND, NE, SD
• West South Central: AR, LA, OK, TX
• Mountain: AZ, CO, ID, MT, NM, NV, UT, WY
• Pacific/Associated Territories: AK, CA, HI, OR, WA, American Samoa, Guam, Marshall Islands, Northern Mariana Islands

Children’s Hospitals

• Children’s hospitals and non-children’s hospitals did not have large differences on the 12 composites, patient safety grade, or number of events reported.

Results by Respondent Characteristics

Work Area/Unit

• Respondents in Rehabilitation had the highest percent positive response for the average across the composites (71 percent positive); Emergency had the lowest (59 percent positive).
• Rehabilitation had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (86 percent); Emergency had the lowest (65 percent).
• ICU (Any Type) had the highest percentage of respondents reporting one or more events in the past year (62 percent); Rehabilitation had the lowest (39 percent).

Staff Position

• Respondents in Administration/Management had the highest percent positive response for the average across the composites (76 percent positive); RN/LVN/LPN had the lowest (63 percent positive).
• Administration/Management had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (89 percent); RN/LVN/LPN had the lowest (71 percent).
• Pharmacists had the highest percentage of respondents reporting one or more events in the past year (77 percent); Unit Assistants/Clerks/Secretaries had the lowest (17 percent).

Interaction With Patients

• Respondents with direct patient interaction were more positive than those without direct interaction on Handoffs and Transitions (49 percent compared with 43 percent) but less positive on Management Support for Patient Safety (71 percent compared with 79 percent) and Feedback and Communication About Error (67 percent compared with 72 percent).
• Respondents without direct patient interaction had a higher percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (81 percent) than respondents with direct patient interaction (75 percent).
• More respondents with direct patient interaction reported one or more events in the past year (49 percent) than respondents without direct patient interaction (31 percent).
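The subgroup results above break out the same percent positive measure by respondent characteristic. As a rough illustration only, the sketch below shows how such a breakout could be computed from respondent-level records; the field names and example values are hypothetical, and this simple respondent-level tally is not necessarily the report's exact procedure, which is described in its Notes section and appendixes.

```python
from collections import defaultdict

# Hypothetical respondent-level records: each has a characteristic
# (here, direct patient interaction) and a 1/0 flag indicating whether the
# respondent answered a given survey item positively.
respondents = [
    {"direct_interaction": True,  "positive": 1},
    {"direct_interaction": True,  "positive": 0},
    {"direct_interaction": False, "positive": 1},
    {"direct_interaction": False, "positive": 1},
]

def percent_positive_by_group(records, group_field):
    """Percent positive response within each level of a respondent characteristic."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positive responses, total responses]
    for r in records:
        counts[r[group_field]][0] += r["positive"]
        counts[r[group_field]][1] += 1
    return {group: 100.0 * pos / total for group, (pos, total) in counts.items()}

print(percent_positive_by_group(respondents, "direct_interaction"))
# {True: 50.0, False: 100.0} -- illustrative numbers only
```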
Tenure in Current Work Area/Unit

• Respondents with less than 1 year in their current work area/unit had the highest percent positive response for the average across the composites (69 percent); respondents with 6 to 10 years had the lowest (63 percent).
• Respondents with less than 1 year in their current work area/unit had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (82 percent); respondents with 6 to 10 years had the lowest (74 percent).
• Respondents with 6 to 10 years, 11 to 15 years, and 21 years or more in their current work area/unit had the highest percentage of respondents reporting one or more events in the past year (48 percent each); respondents with less than 1 year had the lowest (31 percent).

Trending: Comparing Results Over Time

The results below describe changes over time on the patient safety culture composites, patient safety grade, and number of events reported for the 326 hospitals (of the 680 total database hospitals) that administered the survey and submitted data to both the 2014 and 2016 databases.

Trending Hospitals

• Across the 326 trending hospitals, the average percent positive scores across the 12 patient safety culture composites increased by 1 percentage point (ranging across the composites from a change of -2 to a change of 3 percentage points).
• Of those hospitals that increased on Patient Safety Grade, scores for “Excellent” or “Very Good” increased on average 6 percent.
• For hospitals with increases in the number of respondents who reported at least one event in the past 12 months, the average increase was 5 percent.

Additional Trending Statistics

The charts in Chapter 7 provide results for two additional ways of summarizing changes in patient safety composite scores over time. The first series of charts displays the number of hospitals that increased, decreased, or did not change by 5 percentage points or more for each composite, patient safety grade, and number of events reported. The second set of charts displays the distribution of trending hospitals by the number of composites that increased, decreased, or changed by less than 5 percentage points. (A brief sketch of this 5-percentage-point classification appears after the Trending Results by Hospital Characteristics section below.)

Trending Results by Hospital Characteristics

Bed Size

• Hospitals with 6–24 beds increased on all 12 of the patient safety culture composites; increases ranged from 4 to 11 percentage points.
• Hospitals with 6–24 beds had the greatest increase in the percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (a 5 percentage point increase, from 81 percent to 86 percent).

Teaching Status and Ownership

• Both teaching and nonteaching hospitals showed the largest increase (3 percentage points) on Supervisor/Manager Expectations and Actions Promoting Patient Safety.
• Nongovernment-owned hospitals showed the largest increase (3 percentage points) on Supervisor/Manager Expectations and Actions Promoting Patient Safety. Government-owned hospitals’ largest increase was 2 percentage points on the same composite.

Geographic Region

• West Central region hospitals increased 8 percentage points on Nonpunitive Response to Error and 5 percentage points on Overall Perceptions of Safety.
• West Central region hospitals had the greatest increase of respondents who gave their work area/unit a patient safety grade of “Excellent” (a 3 percentage point increase, from 76 percent to 79 percent).
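The trending charts described above classify each hospital's change on a composite with a 5-percentage-point threshold. A minimal sketch of that threshold rule, applied to made-up 2014 and 2016 percent positive scores, is shown below; the function and the example values are illustrative only.

```python
def classify_change(score_2014: float, score_2016: float, threshold: float = 5.0) -> str:
    """Classify a hospital's change on one composite using a 5-percentage-point threshold."""
    change = score_2016 - score_2014
    if change >= threshold:
        return "increased"
    if change <= -threshold:
        return "decreased"
    return "did not change"

# Illustrative only: percent positive scores for one composite at three hypothetical hospitals.
examples = [(62.0, 70.0), (75.0, 73.0), (58.0, 51.0)]
for before, after in examples:
    print(f"{before} -> {after}: {classify_change(before, after)}")
# 62.0 -> 70.0: increased
# 75.0 -> 73.0: did not change
# 58.0 -> 51.0: decreased
```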
Trending Results by Respondent Characteristics

Work Area/Unit

• Surgery increased 4 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.
• Anesthesiology increased 3 percentage points in the percentage of respondents reporting one or more events in the past year.

Staff Position

• Attending/staff physician, resident physician/physician in training, or physician assistant/nurse practitioner increased 6 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.
• Dietitians increased 5 percentage points on the percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good.”

Interaction With Patients

• Respondents with direct interaction and without direct interaction with patients increased 3 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.

Tenure in Work Area/Unit

• Respondents with 1 to 5 years in their work area/unit increased 3 percentage points on Nonpunitive Response to Error and Supervisor/Manager Expectations and Actions Promoting Patient Safety.
• Respondents with 11 to 15 years in their work area/unit also increased 3 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.

Action Planning for Improvement

The delivery of survey results is not the end point in the survey process; it is just the beginning. Often, the perceived failure of surveys to create lasting change is actually due to faulty or nonexistent action planning or survey followup. The Action Planning Tool for the AHRQ Surveys on Patient Safety Culture can help in the improvement process and is available at http://www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/planningtool.html.

Seven steps of action planning are provided to give hospitals guidance on next steps to take to turn their survey results into actual patient safety culture improvement:

1. Understand your survey results.
2. Communicate and discuss the survey results.
3. Develop focused action plans.
4. Communicate action plans and deliverables.
5. Implement action plans.
6. Track progress and evaluate impact.
7. Share what works.

Purpose and Use of This Report

In response to requests from hospitals interested in comparing their results with those of other hospitals on the Hospital Survey on Patient Safety Culture, the Agency for Healthcare Research and Quality (AHRQ) established the Hospital Survey on Patient Safety Culture comparative database. Since the first comparative database report, which was released in 2007 and included data from 382 U.S. hospitals, the number of hospitals and respondents contributing to the database report has grown. The Hospital Survey on Patient Safety Culture 2016 User Comparative Database Report consists of data from 680 hospitals and 447,584 hospital staff respondents.

This user comparative database report was developed as a tool for the following purposes:

• Comparison—To allow hospitals to compare their patient safety culture survey results with those of other hospitals.
• Assessment and Learning—To provide data to hospitals to facilitate internal assessment and learning in the patient safety improvement process.
• Supplemental Information—To provide supplemental information to help hospitals identify their strengths and areas with potential for improvement in patient safety culture.
• Trending—To provide data that describe changes in patient safety culture over time.
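In support of the comparison purpose above, the report's comparative tables in Chapter 6 summarize hospital-level percent positive scores with averages, standard deviations, minimum and maximum scores, and percentiles. The sketch below is only a rough illustration of computing a percentile over a set of hospital-level scores; the linear-interpolation rule and the example values are assumptions, since the report's exact percentile procedure is described in its Notes section and is not reproduced here.

```python
import math

def percentile(scores, p):
    """Return the p-th percentile of hospital-level scores.

    Uses simple linear interpolation between ranked scores; this is an
    assumption for illustration and may differ from the report's exact method.
    """
    ordered = sorted(scores)
    if not ordered:
        raise ValueError("no scores")
    rank = (p / 100) * (len(ordered) - 1)
    lower, upper = math.floor(rank), math.ceil(rank)
    weight = rank - lower
    return ordered[lower] * (1 - weight) + ordered[upper] * weight

# Hypothetical percent positive scores for one composite at ten hospitals.
hospital_scores = [45, 52, 58, 60, 63, 66, 70, 74, 80, 88]
for p in (10, 25, 50, 75, 90):
    print(f"{p}th percentile: {percentile(hospital_scores, p):.1f}")
```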
The report presents statistics (averages, standard deviations, minimum and maximum scores, and percentiles) on the patient safety culture composites and items from the survey. This report also includes a trending chapter that describes patient safety culture change over time for 326 hospitals that submitted data to both the 2014 and 2016 databases. This report also includes four appendixes: • • • • Appendix A presents overall results by hospital characteristics (bed size, teaching status, ownership, geographic region, and children’s hospitals). Appendix B presents overall results by respondent characteristics (hospital work area/unit, staff position, interaction with patients, and tenure in work area/unit). Appendix C presents results for the 326 trending hospitals by hospital characteristics (bed size, teaching status, ownership, and geographic region). Appendix D presents results for the 326 trending hospitals by respondent characteristics (hospital work area/unit, staff position, interaction with patients, and tenure in work area/unit). 1 Chapter 1. Introduction Patient safety is a critical component of health care quality. As health care organizations continually strive to improve, there is growing recognition of the importance of establishing a culture of patient safety. Achieving a culture of patient safety requires an understanding of the values, beliefs, and norms about what is important in an organization and what attitudes and behaviors related to patient safety are supported, rewarded, and expected. Survey Content The Agency for Healthcare Research and Quality (AHRQ) funded and supervised development of the Hospital Survey on Patient Safety Culture. Developers reviewed research pertaining to safety, patient safety, error and accidents, and error reporting. They also examined existing published and unpublished safety culture assessment tools. In addition, hospital employees and administrators were interviewed to identify key patient safety and error-reporting issues. The Hospital Survey on Patient Safety Culture, released in November 2004, was designed to assess hospital staff opinions about patient safety issues, medical errors, and event reporting. The survey includes 42 items that measure 12 areas, or composites, of patient safety culture. Each of the 12 patient safety culture composites is listed and defined in Table 1-1. Table 1-1. Patient Safety Culture Composites and Definitions Patient Safety Culture Composite 1. Communication openness 2. Feedback and communication about error 3. Frequency of events reported 4. Handoffs and transitions 5. Management support for patient safety 6. Nonpunitive response to error 7. Organizational learning—Continuous improvement 8. Overall perceptions of patient safety 9. Staffing Definition: The extent to which… Staff freely speak up if they see something that may negatively affect a patient and feel free to question those with more authority. Staff are informed about errors that happen, are given feedback about changes implemented, and discuss ways to prevent errors. Mistakes of the following types are reported: (1) mistakes caught and corrected before affecting the patient, (2) mistakes with no potential to harm the patient, and (3) mistakes that could harm the patient but do not. Important patient care information is transferred across hospital units and during shift changes. Hospital management provides a work climate that promotes patient safety and shows that patient safety is a top priority. 
Staff feel that their mistakes and event reports are not held against them and that mistakes are not kept in their personnel file. Mistakes have led to positive changes and changes are evaluated for effectiveness. Procedures and systems are good at preventing errors and there is a lack of patient safety problems. There are enough staff to handle the workload and work hours are appropriate to provide the best care for patients. 3 Table 1-1. Patient Safety Culture Composites and Definitions (continued) Patient Safety Culture Composite 10. Supervisor/manager expectations and actions promoting patient safety 11. Teamwork across units 12. Teamwork within units Definition: The extent to which… Supervisors/managers consider staff suggestions for improving patient safety, praise staff for following patient safety procedures, and do not overlook patient safety problems. Hospital units cooperate and coordinate with one another to provide the best care for patients. Staff support each other, treat each other with respect, and work together as a team. The survey also includes two questions that ask respondents to provide an overall grade on patient safety for their work area/unit and to indicate the number of events they reported over the past 12 months. In addition, respondents are asked to provide limited background demographic information about themselves (their work area/unit, staff position, whether they have direct interaction with patients, tenure in their work area/unit, etc.). The survey’s toolkit materials are available at the AHRQ Web site (http://www.ahrq.gov/ professionals/quality-patient-safety/patientsafetyculture/hospital/index.html) and include the survey, survey items and dimensions, user’s guide, information about the Microsoft Excel™ Data Entry and Analysis Tool, the Hospital Patient Safety Improvement Resource List, and the Action Planning Tool for the AHRQ Surveys on Patient Safety Culture. The toolkit provides hospitals with the basic knowledge and tools needed to conduct a patient safety culture assessment and ideas regarding how to use the data. The Hospital Survey on Patient Safety Culture is available in Spanish on the AHRQ Web site. The Spanish translation is designed for U.S. Spanish-speaking respondents from different countries. A number of translations in other languages have already been developed by international users who have agreed to share their translations. Information for translators and translation guidelines are available for download at the AHRQ Web site (http://www.ahrq.gov/professionals/quality-patientsafety/patientsafetyculture/pscintusers.html#guidelines). Data Limitations The survey results presented in this report represent the leading compilation of hospital survey data currently available and therefore provide a useful reference for comparison. However, several limitations to these data should be kept in mind. First, the hospitals that submitted data to the database are not a statistically selected sample of all U.S. hospitals, since only hospitals that administered the survey on their own and were willing to submit their data for inclusion in the database are represented. However, the characteristics of the database hospitals are fairly consistent with the distribution of hospitals registered with the American Hospital Association (AHA) and are described further in Chapter 3. Second, hospitals that administered the survey were not required to undergo any training and administered it in different ways. 
Some hospitals used a paper-only survey, others used Webonly surveys, and others used a combination of these two methods to collect the data. It is 4 possible that these different modes could lead to differences in survey responses; further research is needed to determine whether and how different modes affect the results. In addition, some hospitals conducted a census, surveying all hospital staff, while others administered the survey to a sample of staff. When a sample was drawn, no data were obtained to determine the methodology used to draw the sample. Survey administration statistics that were obtained about the database hospitals, such as survey administration modes and response rates, are provided in Chapter 2. Finally, the data hospitals submitted have been cleaned for blank records (where responses to all survey items were missing except for demographic items) and straight-lining (where responses to all items within or across sections A, B, C, and F of the survey were the same). Otherwise, data are presented as submitted. No additional attempts were made to verify or audit the accuracy of the data submitted. 5 Chapter 2. Survey Administration Statistics This chapter presents descriptive information regarding how the 2016 database hospitals conducted their survey administration. Highlights • • • The 2016 database consists of data from 447,584 hospital staff respondents across 680 hospitals. The average hospital response rate was 55 percent, with an average of 658 completed surveys per hospital. Most hospitals (78 percent) administered Web surveys, which resulted in lower response rates (54 percent) compared with response rates from paper (71 percent). The 2016 database consists of survey data from 680 hospitals with a total of 447,584 hospital staff respondents. Participating hospitals administered the hospital survey to their staff between July 2013 and June 2015 and voluntarily submitted their data for inclusion in the database. Overall Hospital Statistics As shown in Table 2-1, the 680 database hospitals include 209 hospitals that submitted to the database for the first time, 145 hospitals that submitted data in 2016 and previously submitted data in one or more of the databases from 2007 to 2012 but not in 2014, and 326 trending hospitals that submitted data to both the 2014 and 2016 databases. Only hospitals that submitted data for the 2016 database and previously submitted data in 2014 were included for trending analysis. Table 2-1. Trending and Nontrending Overall Statistics—2016 Database Hospitals Overall Statistic Number of hospitals Number of survey respondents v Nontrending Hospitals 2016 Hospitals st Submitting 2016 1 Time Submitters Before 2014 209 126,758 145 80,109 v Trending Hospitals Trending 2014-2016 Total Database 326 240,717 The number of trending hospitals and respondents shown as trending in Table 2-1 represent hospitals that participated consecutively in the 2014 and 2016 databases. 7 680 447,584 Table 2-2 presents data on the number of surveys completed and administered, as well as response rate information. The average response rate across participating hospitals is 55 percent. Table 2-2. Response Rate Statistics—2016 Database Hospitals Summary Statistic Average Number of completed surveys per hospital Number of surveys administered per hospital Hospital response rate Minimum Maximum 658 1,401 23 37 7,711 12,682 55% 7% 100% Table 2-3 presents data on the type of survey administration mode (paper, Web, or mixed mode). 
Most hospitals (78 percent) administered the survey by Web only. Table 2-3. Survey Administration Statistics—2016 Database Hospitals Survey Administration Mode Database Hospitals Number Percent Paper only Web only Both paper and Web TOTAL 45 528 107 680 Database Respondents Number Percent 7% 78% 16% 100% 13,855 360,851 72,878 447,584 3% 81% 16% 100% Note: Percentages may not add to 100 due to rounding. Table 2-4 shows average response rate by survey administration mode. Paper survey administration had a higher average response rate than Web or mixed mode. Table 2-4. Average Response Rate by Survey Administration Mode—2016 Database Hospitals Survey Administration Mode Average Hospital Response Rate Paper only Web only Both paper and Web 71% 54% 55% 8 Chapter 3. Hospital Characteristics This chapter presents information about the distribution of database hospitals by bed size, teaching status, ownership, and geographic region. Although the hospitals that voluntarily submitted data to the database do not constitute a statistically selected sample, the characteristics of these hospitals are fairly consistent with the distribution of hospitals registered with the American Hospital Association (AHA). The characteristics of database hospitals by bed size, teaching status, ownership, and geographic region are presented in the following tables and are compared with the distribution of AHAregistered hospitals included in the 2013 AHA Annual Survey of Hospitals. vi Highlights • • • • • Most of the database hospitals (64 percent) have at least 100 beds. Database hospitals represented all geographic regions in the United States. Most database hospitals are nonteaching (62 percent) and are nongovernment not for profit (79 percent). Children’s hospitals represent 7 percent of database hospitals. Characteristics of database hospitals are fairly consistent with the distribution of hospitals registered with the American Hospital Association. Bed Size Table 3-1 shows the distribution of database hospitals and respondents by hospital bed size. Overall, the distribution of database hospitals by bed size is similar to the distribution of AHAregistered U.S. hospitals. Most of the database hospitals (64 percent) have at least 100 beds, which is higher than the percentage of AHA-registered U.S. hospitals (45 percent). vi Data for U.S. and U.S. territory AHA-registered hospitals were obtained from 2013 AHA Annual Survey of Hospitals Database, © 2013 Health Forum, LLC, an affiliate of the American Hospital Association. Hospitals not registered with AHA were asked to provide information on their hospital’s characteristics, such as bed size, teaching status, and ownership. 9 Table 3-1. Bed Size: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals Bed Size 6-24 beds 25-49 beds 50-99 beds 100-199 beds 200-299 beds 300-399 beds 400-499 beds 500 or more beds TOTAL AHA-Registered Hospitals Number Percent 729 12% 1,467 23% 1,256 20% 1,268 20% 679 11% 379 6% 201 3% 316 5% 100% 6,295 Database Hospitals Number Percent 35 5% 90 13% 119 18% 147 22% 106 16% 69 10% 43 6% 71 10% 680 100% Database Respondents Number Percent 3,176 1% 14,034 3% 30,600 7% 71,570 16% 79,566 18% 66,995 15% 53,164 12% 128,479 29% 447,584 100% Note: Percentages may not add to 100 due to rounding. Teaching Status As shown in Table 3-2, similar to the distribution of AHA-registered hospitals, most database hospitals were non-teaching. 
However, there was a smaller percentage of non-teaching hospitals in the database (62 percent) compared with AHA-registered hospitals (74 percent). Table 3-2. Teaching Status: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals Teaching Status Teaching Non-teaching TOTAL AHA-Registered Hospitals Number Percent 1,617 4,678 6,295 26% 74% 100% Database Hospitals Number Percent 259 421 680 38% 62% 100% Database Respondents Number Percent 273,545 174,039 447,584 61% 39% 100% Note: Percentages may not add to 100 due to rounding. Ownership As shown in Table 3-3, more database hospitals were nongovernment not for profit (79 percent) compared with AHA-registered hospitals (50 percent). Table 3-3. Ownership: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals Ownership AHA-Registered Hospitals Number Percent Database Hospitals Number Percent Database Respondents Number Percent Government (Federal and non-Federal) Nongovernment not for profit Investor owned (for profit) 1,508 24% 103 15% 61,758 14% 3,146 1,641 50% 26% 537 40 79% 6% 365,588 20,238 82% 5% TOTAL 6,295 100% 680 100% 447,584 100% Note: Percentages may not add to 100 due to rounding. 10 Geographic Region Table 3-4 shows the distribution of database hospitals by AHA-defined geographic regions. vii The largest percentages of database hospitals are from the South Atlantic/Associated Territories region (25 percent) and the East North Central region (21 percent). Table 3-4. Geographic Region: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals Region AHA-Registered Hospitals Number Percent Database Hospitals Number Percent Database Respondents Number Percent New England 255 4% 33 5% 35,485 8% Mid-Atlantic 564 9% 67 10% 75,840 17% 1,002 16% 171 25% 114,462 26% East North Central 921 15% 145 21% 79,117 18% East South Central 515 8% 41 6% 19,270 4% West North Central 791 13% 40 6% 19,523 4% West South Central 1,076 17% 72 11% 38,648 9% South Atlantic/Associated Territories Mountain 520 8% 24 4% 12,089 3% Pacific/Associated Territories 651 10% 87 13% 53,150 12% 6,295 100% 680 100% 447,584 100% TOTAL Note: Percentages may not add to 100 due to rounding. vii States and territories are categorized into AHA-defined regions as follows: • • • • • • • • • New England: CT, MA, ME, NH, RI, VT Mid-Atlantic: NJ, NY, PA South Atlantic/Associated Territories: DC, DE, FL, GA, MD, NC, SC, VA, WV, Puerto Rico, Virgin Islands East North Central: IL, IN, MI, OH, WI East South Central: AL, KY, MS, TN West North Central: IA, KS, MN, MO, ND, NE, SD West South Central: AR, LA, OK, TX Mountain: AZ, CO, ID, MT, NM, NV, UT, WY Pacific/Associated Territories: AK, CA, HI, OR, WA, American Samoa, Guam, Marshall Islands, Northern Mariana Islands 11 Children’s Hospitals Table 3-5 presents the number and percentage of children’s database hospitals and respondents compared with the number and percentage of AHA-registered children’s hospitals. Database hospitals contained 7 percent of children’s hospitals compared with 1 percent of AHA-registered children’s hospitals. Table 3-5. 
Children’s Hospitals: Distribution of 2016 Database Hospitals and Respondents Compared With AHA-Registered Hospitals Children’s Hospitals Yes – Children’s Hospital AHA-Registered Hospitals Number Percent Database Hospitals Number Percent Database Respondents Number Percent 102 2% 49 7% 31,509 7% No – Not Children’s Hospital 4,692 98% 631 93% 416,075 93% TOTAL Missing Overall total 4,794 1,501 6,295 100% 680 100% 447,584 100% Note: Percentages may not add to 100 due to rounding. Missing AHA-registered hospitals do not identify whether they restrict admissions primarily to children. 12 Chapter 4. Respondent Characteristics This chapter describes the self-reported characteristics of database hospital staff respondents. Highlights • • There were 447,584 hospital staff respondents from 680 hospitals. The top three respondent work areas were: o Other (29 percent). o Medicine (12 percent). o Surgery (10 percent). • The top three respondent staff positions were: o Registered nurse or licensed vocational nurse/licensed practical nurse (36 percent). o Other (21 percent). o Technician (e.g., EKG, Lab, Radiology) (11 percent). • • • Most respondents (77 percent) indicated they had direct interaction with patients. More than half of the respondents (56 percent) indicated they had worked with their current hospital at least 6 years. Similarly, almost half of the respondents (47 percent) indicated they had worked in their current work area/unit at least 6 years. Most respondents worked either less than 40 hours a week (46 percent) or 40 to 59 hours per week (47 percent). Work Area/Unit Table 4-1 shows more than one-quarter of respondents (29 percent) selected “Other” as their work area, followed by “Medicine” (12 percent) and “Surgery” (10 percent). Most of the work area units had between 3 and 7 percent of respondents. The Hospital Survey on Patient Safety Culture uses generic categories for hospital work areas and units. Therefore, a large percentage of respondents chose the “Other” response option, which allowed them to note their specific work area or unit. Participating hospitals were not asked to submit written or “other-specify” responses for any questions, so no data are available to further describe the respondents in the “Other” work area category. 13 Table 4-1. Work Area/Unit: Distribution of 2016 Database Respondents Work Area/Unit Other Medicine (nonsurgical) Surgery Many different hospital units/no specific unit Intensive care unit (any type) Emergency department Radiology Laboratory Obstetrics Rehabilitation Pharmacy Psychiatry/mental health Pediatrics Anesthesiology TOTAL Missing: Did not answer or were not asked the question Overall total Database Respondents Number Percent 118,881 29% 50,292 12% 42,514 10% 32,469 8% 29,224 7% 24,851 6% 22,344 5% 18,873 5% 17,580 4% 16,191 4% 13,115 3% 12,211 3% 10,676 3% 3,231 1% 412,452 100% 35,132 447,584 Note: Percentages may not add to 100 due to rounding. Staff Position More than one-third of respondents (36 percent) selected “Registered Nurse (RN)” or “Licensed Vocational Nurse(LVN)/Licensed Practical Nurse (LPN)” as their staff position, followed by “Other” (21 percent) and “Technician” (11 percent), as shown in Table 4-2. Table 4-2. 
Staff Position: Distribution of 2016 Database Respondents Staff Position Registered nurse (RN) or licensed vocational nurse (LVN)/licensed practical nurse (LPN) Other Technician (EKG, Lab, Radiology) Administration/management Patient care assistant/hospital aide/care partner Attending/staff physician, resident physician/physician in training, or physician assistant (PA)/nurse practitioner (NP) Unit assistant/clerk/secretary Therapist (respiratory, physical, occupational, or speech) Pharmacist Dietitian TOTAL Missing: Did not answer or were not asked the question Overall total Note: Percentages may not add to 100 due to rounding. 14 Database Respondents Number Percent 148,832 36% 87,681 44,815 29,525 24,998 23,611 21% 11% 7% 6% 6% 22,184 20,000 7,694 2,838 412,178 35,406 447,584 5% 5% 2% 1% 100% Interaction With Patients As shown in Table 4-3, most respondents (77 percent) indicated they had direct interaction with patients. Table 4-3. Interaction With Patients: Distribution of 2016 Database Respondents Database Respondents Number Percent Interaction With Patients YES, have direct patient interaction NO, do NOT have direct patient interaction TOTAL Missing: Did not answer or were not asked the question Overall total 326,640 95,013 421,653 25,931 447,584 77% 23% 100% Note: Percentages may not add to 100 due to rounding. Tenure With Current Hospital As shown in Table 4-4, more than half of the respondents (56 percent) indicated they had worked with their current hospital at least 6 years. Table 4-4. Tenure With Current Hospital: Distribution of 2016 Database Respondents Tenure With Current Hospital Database Respondents Number Percent Less than 1 year 1 to 5 years 6 to 10 years 11 to 15 years 16 to 20 years 21 years or more 48,751 129,992 88,337 55,413 31,578 57,531 12% 32% 21% 13% 8% 14% TOTAL Missing: Did not answer or were not asked the question Overall total 411,602 35,982 447,584 100% Note: Percentages may not add to 100 due to rounding. 15 Tenure in Current Work Area/Unit As shown in Table 4-5, almost half of the respondents (47 percent) indicated they had worked in their current work area/unit at least 6 years. Table 4-5. Tenure in Current Work Area/Unit: Distribution of 2016 Database Respondents Tenure In Current Work Area/Unit Database Respondents Number Percent Less than 1 year 1 to 5 years 6 to 10 years 11 to 15 years 16 to 20 years 21 years or more TOTAL 62,824 155,372 88,765 48,880 25,152 34,010 415,003 Missing: Did not answer or were not asked the question Overall total 32,581 447,584 15% 37% 21% 12% 6% 8% 100% Note: Percentages may not add to 100 due to rounding. Tenure in Current Specialty or Profession As shown in Table 4-6, respondents varied in tenure of current specialty or profession. The highest percentage of respondents had worked in their profession 1 to 5 years (25 percent), and the second highest percentage of respondents had worked 21 years or more (24 percent). Table 4-6. Tenure in Current Specialty or Profession: Distribution of 2016 Database Respondents Tenure in Current Specialty or Profession Database Respondents Number Percent Less than 1 year 1 to 5 years 27,012 102,954 7% 25% 6 to 10 years 11 to 15 years 16 to 20 years 21 years or more TOTAL Missing: Did not answer or were not asked the question Overall total 78,513 55,962 44,014 99,669 408,124 39,460 447,584 19% 14% 11% 24% 100% Note: Percentages may not add to 100 due to rounding. 
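The respondent distributions in Tables 4-1 through 4-7 are computed on the respondents who answered each question; those who did not answer or were not asked are reported as missing and excluded from the percentage base, and rounded percentages may not sum exactly to 100. The short sketch below illustrates that convention; the category labels and counts are made up.

```python
def distribution(counts, missing_label="Missing"):
    """Percentage distribution over answered categories, excluding missing responses."""
    answered = {k: v for k, v in counts.items() if k != missing_label}
    total = sum(answered.values())
    return {k: round(100 * v / total) for k, v in answered.items()}

# Hypothetical counts for a tenure question; "Missing" respondents did not answer.
counts = {"Less than 1 year": 300, "1 to 5 years": 520, "6 years or more": 410, "Missing": 70}
print(distribution(counts))
# {'Less than 1 year': 24, '1 to 5 years': 42, '6 years or more': 33}
# The rounded percentages sum to 99 here, which is why the tables note that
# percentages may not add to 100 due to rounding.
```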
16 Hours Worked Per Week As shown in Table 4-7, most respondents (88 percent) indicated they worked between 20 and 59 hours per week. Table 4-7. Hours Worked Per Week: Distribution of 2016 Database Respondents Database Respondents Number Percent Hours Worked Per Week Less than 20 hours per week 19,442 5% 20 to 39 hours per week 40 to 59 hours per week 60 to 79 hours per week 80 to 99 hours per week 100 hours per week or more TOTAL Missing: Did not answer or were not asked the question 166,761 193,398 18,807 9,099 423 407,930 39,654 41% 47% 5% 2% 0% 100% Overall total 447,584 Note: Percentages may not add to 100 due to rounding. 17 Chapter 5. Overall Results This chapter presents the overall survey results for the database, showing the average percentage of positive responses across the database hospitals on each of the survey’s items and composites. Reporting the average across hospitals ensures that each hospital receives an equal weight that contributes to the overall average. Reporting the data at the hospital level in this way is important because culture is considered to be a group characteristic and is not considered to be a solely individual characteristic. An alternative method would be to report a straight percentage of positive responses across all respondents, but this method would give greater weight to respondents from larger hospitals (i.e., 300 beds or more). More than half of respondents (56 percent) are from hospitals with 300 beds or more. Highlights • The areas of strength or the composites with the highest average percent positive responses were: o Teamwork Within Units (82 percent positive). o Supervisor/Manager Expectations and Actions Promoting Patient Safety (78 percent positive). o Organizational Learning—Continuous Improvement (73 percent positive). • The areas with potential for improvement or the composites with the lowest average percent positive responses were: o Nonpunitive Response to Error (45 percent positive). o Handoffs and Transitions (48 percent positive). o Staffing (54 percent positive). • • On average, most respondents within hospitals (76 percent) gave their work area or unit a grade of “Excellent” (34 percent) or “Very Good” (42 percent) on patient safety. On average, less than half of respondents within hospitals (45 percent) reported at least one event in their hospital over the past 12 months. It is likely that this represents underreporting of events. This section provides the overall item and composite-level results. The method for calculating the percent positive scores at the item and composite level is described in the Notes section of this report. 19 Composite-Level Results Chart 5-1 shows the average percent positive response for each of the 12 patient safety culture composites across hospitals in the database. viii The patient safety culture composites are shown in order from the highest average percent positive response to the lowest. Chart 5-1. Composite-Level Average Percent Positive Response – 2016 Database Hospitals Patient Safety Culture Composites viii 1. Teamwork Within Units 2. Supv/Mgr Expectations & Actions Promoting Patient Safety 3. Organizational Learning - Continuous Improvement 4. Management Support for Patient Safety 5. Feedback & Communication About Error 6. Frequency of Events Reported 7. Overall Perceptions of Patient Safety 8. Communication Openness 9. Teamwork Across Units 10. Staffing 11 . Handoffs & Transitions 12. 
Nonpunitive Response to Error
% Positive Response (top to bottom, in the same order as the composites listed above): 82%, 78%, 73%, 72%, 68%, 67%, 66%, 64%, 61%, 54%, 48%, 45%

Some hospitals excluded one or more survey items and are therefore excluded from composite-level calculations when the omitted items pertain to a particular composite. For the 2016 report, four hospitals were excluded from one or more composite-level calculations for this reason.

Areas of Strength
• Teamwork Within Units (82 percent positive)—Staff support each other, treat each other with respect, and work together as a team.
• Supervisor/Manager Expectations and Actions Promoting Patient Safety (78 percent positive)—Supervisors/managers consider staff suggestions for improving patient safety, praise staff for following patient safety procedures, and do not overlook patient safety problems.
• Organizational Learning—Continuous Improvement (73 percent positive)—Mistakes have led to positive changes and changes are evaluated for effectiveness.

Areas With Potential for Improvement
• Nonpunitive Response to Error (45 percent positive)—Staff feel that their mistakes and event reports are not held against them and that mistakes are not kept in their personnel file.
• Handoffs and Transitions (48 percent positive)—Important patient care information is transferred across hospital units and during shift changes.
• Staffing (54 percent positive)—There are enough staff to handle the workload and work hours are appropriate to provide the best care for patients.

Item-Level Results
Chart 5-2 shows the average percent positive response for each of the 42 survey items. The survey items are grouped by the patient safety culture composite they are intended to measure. Within each composite, the items are presented in the order in which they appear in the survey.

Areas of Strength for the Patient Safety Culture Composite Items
• The composite items with the highest average percent positive responses (87 percent positive) were from the patient safety composite Teamwork Within Units: (A1) "People support one another in this unit" and (A3) "When a lot of work needs to be done quickly, we work together as a team to get the work done."

Chart 5-2. Item-Level Average Percent Positive Response – 2016 Database Hospitals (Page 1 of 4)
Survey Items by Patient Safety Culture Composite / Survey Item % Positive Response
1. Teamwork Within Units
1. People support one another in this unit. (A1) 87%
2. When a lot of work needs to be done quickly, we work together as a team to get the work done. (A3) 87%
3. In this unit, people treat each other with respect. (A4) 81%
4. When one area in this unit gets really busy, others help out. (A11) 71%
2. Supv/Mgr Expectations & Actions Promoting Patient Safety
1. My supv/mgr says a good word when he/she sees a job done according to established patient safety procedures. (B1) 78%
2. My supv/mgr seriously considers staff suggestions for improving patient safety. (B2) 80%
3. Whenever pressure builds up, my supv/mgr wants us to work faster, even if it means taking shortcuts. (B3R) 77%
4. My supv/mgr overlooks patient safety problems that happen over and over. (B4R) 79%
3. Organizational Learning - Continuous Improvement
1. We are actively doing things to improve patient safety. (A6) 84%
2. Mistakes have led to positive changes here. (A9) 64%
3. After we make changes to improve patient safety, we evaluate their effectiveness. (A13) 70%
Note: The item's survey location is shown to the right in parentheses. An "R" indicates a negatively worded item, where the percent positive response is based on those who responded "Strongly disagree" or "Disagree," or "Never" or "Rarely" (depending on the response category used for the item).
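The note above explains how responses to negatively worded ("R") items are counted. The report's exact scoring rules are given in the Notes section; the Python sketch below is only a rough illustration of item-level percent positive scoring, and the response labels and function name are ours rather than the report's.

```python
# Rough illustration (not the report's code) of item-level percent positive
# scoring. For a positively worded item, agreement (or frequent occurrence) is
# counted as positive; for a negatively worded "R" item, the scoring is
# reversed, so "Strongly disagree"/"Disagree" (or "Never"/"Rarely") count.
AGREE_POSITIVE = {"Strongly agree", "Agree"}
AGREE_REVERSED = {"Strongly disagree", "Disagree"}
FREQ_POSITIVE = {"Most of the time", "Always"}   # assumed labels for the D items
FREQ_REVERSED = {"Never", "Rarely"}

def percent_positive(responses, reverse_worded=False, frequency_scale=False):
    """Percent positive for one item in one hospital; missing answers excluded."""
    if frequency_scale:
        positive = FREQ_REVERSED if reverse_worded else FREQ_POSITIVE
    else:
        positive = AGREE_REVERSED if reverse_worded else AGREE_POSITIVE
    return 100 * sum(r in positive for r in responses) / len(responses)

# Item B3R above is reverse worded, so disagreement is the positive answer.
sample = ["Disagree", "Strongly disagree", "Neither agree nor disagree",
          "Agree", "Disagree"]
print(round(percent_positive(sample, reverse_worded=True)))  # 3 of 5 -> 60
```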
Chart 5-2. Item-Level Average Percent Positive Response – 2016 Database Hospitals (Page 2 of 4) (continued)
Survey Items by Patient Safety Culture Composite / Survey Item % Positive Response
4. Management Support for Patient Safety
1. Hospital management provides a work climate that promotes patient safety. (F1) 81%
2. The actions of hospital management show that patient safety is a top priority. (F8) 76%
3. Hospital management seems interested in patient safety only after an adverse event happens. (F9R) 61%
5. Feedback & Communication About Error
1. We are given feedback about changes put into place based on event reports. (C1) 60%
2. We are informed about errors that happen in this unit. (C3) 69%
3. In this unit, we discuss ways to prevent errors from happening again. (C5) 75%
6. Frequency of Events Reported
1. When a mistake is made, but is caught and corrected before affecting the patient, how often is this reported? (D1) 62%
2. When a mistake is made, but has no potential to harm the patient, how often is this reported? (D2) 63%
3. When a mistake is made that could harm the patient, but does not, how often is this reported? (D3) 75%
Note: The item's survey location is shown to the right in parentheses. An "R" indicates a negatively worded item, where the percent positive response is based on those who responded "Strongly disagree" or "Disagree," or "Never" or "Rarely" (depending on the response category used for the item).

Chart 5-2. Item-Level Average Percent Positive Response – 2016 Database Hospitals (Page 3 of 4) (continued)
Survey Items by Patient Safety Culture Composite / Survey Item % Positive Response
7. Overall Perceptions of Patient Safety
1. It is just by chance that more serious mistakes don't happen around here. (A10R) 61%
2. Patient safety is never sacrificed to get more work done. (A15) 64%
3. We have patient safety problems in this unit. (A17R) 65%
4. Our procedures and systems are good at preventing errors from happening. (A18) 73%
8. Communication Openness
1. Staff will freely speak up if they see something that may negatively affect patient care. (C2) 77%
2. Staff feel free to question the decisions or actions of those with more authority. (C4) 49%
3. Staff are afraid to ask questions when something does not seem right. (C6R) 65%
9. Teamwork Across Units
1. Hospital units do not coordinate well with each other. (F2R) 49%
2. There is good cooperation among hospital units that need to work together. (F4) 62%
3. It is often unpleasant to work with staff from other hospital units. (F6R) 63%
4. Hospital units work well together to provide the best care for patients. (F10) 71%
Note: The item's survey location is shown to the right in parentheses. An "R" indicates a negatively worded item, where the percent positive response is based on those who responded "Strongly disagree" or "Disagree," or "Never" or "Rarely" (depending on the response category used for the item).
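The composite-level percentages in Chart 5-1 are consistent with averaging the item-level percentages within each composite; the report's exact calculation rules are described in the Notes section. The quick Python check below uses only the rounded values published in Charts 5-1 and 5-2, so differences of about a percentage point are expected.

```python
# Quick consistency check (illustrative only): a composite's percent positive
# score is close to the average of its items' percent positive scores.
# Item values are from Chart 5-2 above; composite values are from Chart 5-1.
chart_5_2_items = {
    "Teamwork Within Units": [87, 87, 81, 71],
    "Organizational Learning - Continuous Improvement": [84, 64, 70],
    "Feedback & Communication About Error": [60, 69, 75],
    "Teamwork Across Units": [49, 62, 63, 71],
}
chart_5_1_composites = {
    "Teamwork Within Units": 82,
    "Organizational Learning - Continuous Improvement": 73,
    "Feedback & Communication About Error": 68,
    "Teamwork Across Units": 61,
}

for composite, item_scores in chart_5_2_items.items():
    item_average = sum(item_scores) / len(item_scores)
    print(f"{composite}: items average {item_average:.1f}%, "
          f"published composite {chart_5_1_composites[composite]}%")
```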
Chart 5-2. Item-Level Average Percent Positive Response – 2016 Database Hospitals (Page 4 of 4) (continued)
Survey Items by Patient Safety Culture Composite / Survey Item % Positive Response
10. Staffing
1. We have enough staff to handle the workload. (A2) 51%
2. Staff in this unit work longer hours than is best for patient care. (A5R) 50%
3. We use more agency/temporary staff than is best for patient care. (A7R) 65%
4. We work in "crisis mode," trying to do too much, too quickly. (A14R) 49%
11. Handoffs & Transitions
1. Things "fall between the cracks" when transferring patients from one unit to another. (F3R) 43%
2. Important patient care information is often lost during shift changes. (F5R) 53%
3. Problems often occur in the exchange of information across hospital units. (F7R) 47%
4. Shift changes are problematic for patients in this hospital. (F11R) 48%
12. Nonpunitive Response to Error
1. Staff feel like their mistakes are held against them. (A8R) 51%
2. When an event is reported, it feels like the person is being written up, not the problem. (A12R) 48%
3. Staff worry that mistakes they make are kept in their personnel file. (A16R) 37%
Note: The item's survey location is shown to the right in parentheses. An "R" indicates a negatively worded item, where the percent positive response is based on those who responded "Strongly disagree" or "Disagree," or "Never" or "Rarely" (depending on the response category used for the item).

Areas With Potential for Improvement for the Patient Safety Culture Composite Items
• The composite item with the lowest average percent positive response (37 percent positive) was from the patient safety composite Nonpunitive Response to Error: (A16) "Staff worry that mistakes they make are kept in their personnel file."

Patient Safety Grade—Chart 5-3 shows that on average across hospitals, most respondents were positive, with 76 percent giving their work area or unit a patient safety grade of "Excellent" (34 percent) or "Very Good" (42 percent).

Number of Events Reported—Chart 5-4 shows that on average across hospitals, less than half of respondents (45 percent) reported at least one event in their hospital over the past 12 months. Event reporting was identified as an area for improvement for most hospitals because underreporting of events means potential patient safety problems may not be recognized or identified and therefore may not be addressed.

Chart 5-3. Average Percentage of 2016 Database Respondents Giving Their Work Area/Unit a Patient Safety Grade
Note: Percentages may not add to 100 due to rounding.

Chart 5-4. Average Percentage of 2016 Database Respondents Reporting Events in the Past 12 Months
Note: Percentages may not add to 100 due to rounding. Response categories that make up the percent positive score may not sum to the total percent positive due to rounding.

Chapter 6. Comparing Your Results

This chapter explains how to compare your results with the results from the database. To compare your hospital's survey results with the results from the database, you will need to calculate your hospital's percent positive response on the survey's 42 items and 12 composites, patient safety grade, and number of events reported. Refer to the Notes section at the end of this report for a description of how to calculate these percent positive scores. You will then be able to compare your hospital's results with the database averages and examine the percentile scores to place your hospital's results relative to the distribution of database hospitals. When comparing your hospital's results with results from the database, keep in mind that the database provides only relative comparisons.
Even though your hospital’s survey results may be better than the database statistics, you may still believe there is room for improvement in a particular area within your hospital in an absolute sense. As you will notice from the database results, there are some patient safety composites that even the highest scoring hospitals could improve on. Therefore, the comparative data provided in this report should be used to supplement your hospital’s own efforts toward identifying areas of strength and areas on which to focus patient safety culture improvement efforts. Highlights • There was considerable variability in the range of hospital scores (lowest to highest) across the 12 patient safety culture composites and individual items: o Supervisor/Manager Expectations & Actions Promoting Patient Safety is the composite that showed the most variability in scores, with at least one hospital scoring 17 percent positive and at least one other hospital scoring 96 percent positive. o Patient safety grade. In at least one hospital, only 4 percent of the respondents provided their unit with a patient safety grade of “Excellent” or “Very Good,” but at another hospital 98 percent provided their unit with a patient safety grade of “Excellent” or “Very Good.” o Number of events reported. In at least one hospital, 15 percent of respondents reported at least one event over the past 12 months, but at another hospital 81percent of respondents reported at least one event. Description of Comparative Statistics This section provides a brief description of the results shown in the remainder of this chapter. Average Percent Positive The average percent positive scores for each of the 12 patient safety culture composites and for the survey’s 42 items (plus the two questions on patient safety grade and number of events 29 reported) are provided in the comparative results tables in this chapter. These average percent positive scores were calculated by averaging composite-level percent positive scores across all hospitals in the database, as well as averaging item-level percent positive scores across hospitals. Since the percent positive is displayed as an overall average, scores from each hospital are weighted equally in their contribution to the calculation of the average. ix Standard Deviation The standard deviation (s.d.), a measure of the spread or variability of hospital scores around the average, is also displayed. The standard deviation tells you the extent to which hospitals’ scores differ from the average: • • • If scores from all hospitals were exactly the same, then the average would represent all their scores perfectly and the standard deviation would be zero. If scores from all hospitals were very close to the average, then the standard deviation would be small and close to zero. If scores from many hospitals were very different from the average, then the standard deviation would be a large number. When the distribution of hospital scores follows a normal bell-shaped curve (where most of the scores fall in the middle of the distribution, with fewer scores at the lower and higher ends of the distribution), the average, plus or minus the standard deviation, will include about 68 percent of all hospital scores. For example, if an average percent positive score across the database hospitals were 70 percent with a standard deviation of 10 percent and scores were normally distributed, then about 68 percent of all the database hospitals would have scores between 60 and 80 percent. 
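For readers who want to reproduce these comparative statistics from their own data, the Python sketch below shows one way to compute the average percent positive, standard deviation, minimum, maximum, and median from one score per hospital. The hospital scores are hypothetical and the sketch only illustrates the definitions above; it is not the report's actual computation.

```python
# Illustrative sketch (not the report's code) of the hospital-level comparative
# statistics described above. Each hospital contributes a single percent
# positive score, so small and large hospitals are weighted equally.
import statistics

# Hypothetical percent positive scores for one composite, one value per hospital.
hospital_scores = [62, 75, 70, 81, 68, 59, 73, 77, 66, 71]

average = statistics.mean(hospital_scores)
sd = statistics.pstdev(hospital_scores)      # spread of hospital scores around the average
lowest, highest = min(hospital_scores), max(hospital_scores)
median = statistics.median(hospital_scores)  # the 50th percentile described later in this chapter

# Under a roughly normal distribution, about 68 percent of hospital scores fall
# within one standard deviation of the average (e.g., an average of 70% with an
# s.d. of 10% would put roughly 68% of hospitals between 60% and 80%).
within_one_sd = sum(average - sd <= s <= average + sd for s in hospital_scores)

print(f"Average: {average:.0f}%  s.d.: {sd:.1f}  Min: {lowest}%  Max: {highest}%  Median: {median:.0f}%")
print(f"Hospitals within one s.d. of the average: {within_one_sd} of {len(hospital_scores)}")
```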
Statistically “Significant” Differences Between Scores You may be interested in determining the statistical significance of differences between your scores and the averages in the database, or between scores in various breakout categories (hospital bed size, teaching status, etc.). Statistical significance is greatly influenced by sample size, so as the number of observations in comparison groups gets larger, small differences in scores will be statistically significant. While a 1 percent difference between percent positive scores might be “statistically” significant (that is, not due to chance), the difference is not likely to be meaningful or “practically” significant. Keep in mind that statistically significant differences are not always important, and nonsignificant differences are not always trivial. Therefore, we recommend the following guideline: • ix Use a 5 percentage point difference as a rule of thumb when comparing your hospital’s results with the database averages. Your hospital’s percent positive score should be at least 5 percentage points greater than the database average to be considered An alternative method would be to report a straight percentage of positive response across all respondents. However, this method would give greater weight to respondents from larger hospitals (i.e., greater than 300 beds) since they account for 56 percent of responses. 30 “better” and should be at least 5 percentage points less to be considered “lower” than the database average. A 5 percentage point difference is likely to be statistically significant for most hospitals given the number of responses per hospital and is also a meaningful difference to consider. Minimum and Maximum Scores The minimum (lowest) and maximum (highest) percent positive scores are presented for each composite and item. These scores provide information about the range of percent positive scores obtained by database hospitals and are actual scores from the lowest and highest scoring hospitals. When comparing with the minimum and maximum scores, keep in mind that these scores may represent hospitals that are extreme outliers (indicated by large differences between the minimum score and the 10th percentile score, or between the 90th percentile score and the maximum score). Percentiles The 10th, 25th, 50th (or median), 75th, and 90th percentile scores are displayed for the survey composites and items. Percentiles provide information about the distribution of hospital scores. To calculate percentile scores, all hospital percent positive scores were ranked in order from low to high. A specific percentile score shows the percentage of hospitals that scored at or below a particular score. For example, the 50th percentile, or median, is the percent positive score where 50 percent of the hospitals scored the same or lower and 50 percent of the hospitals scored higher. When the distribution of hospital scores follows a normal bell-shaped curve (where most of the scores fall in the middle of the distribution, with fewer scores at the lower and higher ends of the distribution), the 50th percentile, or median, will be very similar to the average score. Interpret the percentile scores as shown in Table 6-1. Table 6-1. Interpretation of Percentile Scores Percentile Score th 10 percentile Represents the lowest scoring hospitals. th 25 percentile Represents lower scoring hospitals. th 50 percentile (or median) Represents the middle of the distribution of hospitals. th 75 percentile Represents higher scoring hospitals. 
th 90 percentile Represents the highest scoring hospitals. Interpretation 10% of the hospitals scored the same or lower. 90% of the hospitals scored higher. 25% of the hospitals scored the same or lower. 75% of the hospitals scored higher. 50% of the hospitals scored the same or lower. 50% of the hospitals scored higher. 75% of the hospitals scored the same or lower. 25% of the hospitals scored higher. 90% of the hospitals scored the same or lower. 10% of the hospitals scored higher. To compare with the database percentiles, compare your hospital’s percent positive scores with the percentile scores for each composite and item. Look for the highest percentile where your hospital’s score is higher than that percentile. 31 For example: On survey item 1 in Table 6-2, the 75th percentile score is 49 percent positive, and the 90th percentile score is 62 percent positive. Table 6-2. Sample Percentile Statistics Survey Item % Positive Response Survey Item Item 1 Min 10th %ile 25th %ile Median/50th %ile 75th %ile 90th %ile Max 8% 10% 25% 35% 49% 62% 96% If your hospital's score is 55%, your score falls here: If your hospital's score is 65%, your score falls here: • • If your hospital’s score is 55 percent positive, it falls above the 75th percentile (but below the 90th), meaning that your hospital scored higher than at least 75 percent of the hospitals in the database. If your hospital’s score is 65 percent positive, it falls above the 90th percentile, meaning your hospital scored higher than at least 90 percent of the hospitals in the database. Composite and Item-Level Comparative Tables Table 6-3 presents comparative statistics (average percent positive and standard deviation, minimum and maximum scores, and percentiles) for each of the 12 patient safety culture composites. The patient safety culture composites are shown in order from the highest average percent positive response to the lowest. Supervisor/Manager Expectations & Actions Promoting Patient Safety is the composite that showed the most variability in scores, with at least one hospital scoring 17 percent positive and at least one other hospital scoring 96 percent positive. Table 6-4 presents comparative statistics for each of the 42 survey items. The survey items are grouped by the patient safety culture composite they are intended to measure. Within each composite, the items are presented in the order in which they appear in the survey. Patient safety grades of “Excellent” or “Very Good”, shown in Table 6-5, had a wide range of responses, from at least one hospital where only 4 percent of respondents provided their unit with a patient safety grade of “Excellent” or “Very Good” to a hospital where 98 percent did. Percentage of respondents who reported one or more events also had a wide range of response, as shown in Table 6-6, from at least one hospital where only 15 percent of respondents reported a single event over the past 12 months to a hospital where 81 percent of respondents reported at least one event. 32 Table 6-3. Composite-Level Comparative Results—2016 Database Hospitals Composite % Positive Response Percentiles Median/ 25th 10th 50th 75th 90th %ile %ile %ile %ile %ile Average % Positive s.d. Min 82% 5.91% 26% 75% 79% 82% 85% 88% 96% 2. Supervisor/Manager Expectations & Actions Promoting Patient Safety 3. Organizational Learning—Continuous Improvement 4. Management Support for Patient Safety 78% 6.66% 17% 71% 75% 79% 83% 86% 96% 73% 7.44% 15% 63% 68% 73% 77% 81% 93% 72% 9.14% 39% 60% 67% 73% 79% 83% 96% 5. 
Feedback & Communication About Error 68% 8.05% 17% 58% 63% 68% 74% 78% 89% 6. Frequency of Events Reported 67% 7.37% 43% 57% 61% 67% 71% 76% 94% 7. Overall Perceptions of Patient Safety 66% 8.50% 36% 55% 60% 66% 72% 77% 90% 8. Communication Openness 64% 6.70% 35% 55% 59% 64% 68% 72% 84% 9. Teamwork Across Units 61% 9.32% 34% 50% 56% 61% 67% 73% 91% 10. Staffing 54% 9.34% 20% 42% 48% 53% 60% 66% 86% 11. Handoffs & Transitions 48% 10.37% 22% 35% 41% 46% 54% 62% 80% 12. Nonpunitive Response to Error 45% 8.75% 20% 35% 39% 44% 51% 56% 75% Patient Safety Culture Composites 1. Teamwork Within Units Max 33 Table 6-4. Item-Level Comparative Results—2016 Database Hospitals (Page 1 of 4) Item 1. A1 A3 A4 A11 2. B1 34 B2 B3R B4R 3. A6 A9 A13 Survey Items by Composite Teamwork Within Units 1. People support one another in this unit. 2. When a lot of work needs to be done quickly, we work together as a team to get the work done. 3. In this unit, people treat each other with respect. 4. When one area in this unit gets really busy, others help out. Supervisor/Manager Expectations & Actions Promoting Patient Safety 1. My supv/mgr says a good word when he/she sees a job done according to established patient safety procedures. 2. My supv/mgr seriously considers staff suggestions for improving patient safety. 3. Whenever pressure builds up, my supv/mgr wants us to work faster, even if it means taking shortcuts. 4. My supv/mgr overlooks patient safety problems that happen over and over. Organizational Learning—Continuous Improvement 1. We are actively doing things to improve patient safety. 2. Mistakes have led to positive changes here. 3. After we make changes to improve patient safety, we evaluate their effectiveness. Survey Item % Positive Response Percentiles Median/ 10th 25th 50th 75th 90th Min %ile %ile %ile %ile %ile Max Average % Positive s.d. 87% 87% 6.21% 5.51% 28% 33% 81% 81% 85% 85% 88% 87% 91% 90% 93% 93% 100% 100% 81% 6.87% 28% 73% 77% 81% 85% 88% 97% 71% 7.01% 14% 64% 67% 72% 75% 79% 96% 78% 7.36% 19% 70% 74% 78% 83% 87% 97% 80% 7.38% 19% 71% 76% 80% 84% 88% 96% 77% 7.72% 13% 68% 73% 77% 82% 87% 100% 79% 7.53% 11% 71% 75% 79% 83% 87% 95% 84% 6.77% 28% 76% 80% 84% 88% 92% 98% 64% 8.20% 8% 54% 59% 63% 69% 73% 93% 70% 9.16% 8% 59% 64% 70% 76% 81% 92% Note: The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). Table 6-4. Item-Level Comparative Results—2016 Database Hospitals (Page 2 of 4) (continued) Item 4. F1 F8 F9R 5. C1 C3 35 C5 6. D1 D2 D3 Survey Items by Composite Management Support for Patient Safety 1. Hospital mgmt provides a work climate that promotes patient safety. 2. The actions of hospital mgmt show that patient safety is a top priority. 3. Hospital mgmt seems interested in patient safety only after an adverse event happens. Feedback & Communication About Error 1. We are given feedback about changes put into place based on event reports. 2. We are informed about errors that happen in this unit. 3. In this unit, we discuss ways to prevent errors from happening again. Frequency of Events Reported 1. When a mistake is made, but is caught and corrected before affecting the patient, how often is this reported? 2. When a mistake is made, but has no potential to harm the patient, how often is this reported? 3. 
When a mistake is made that could harm the patient, but does not, how often is this reported? Survey Item % Positive Response Percentiles Median/ 10th 25th 50th 75th 90th Min %ile %ile %ile %ile %ile Max Average % Positive s.d. 81% 9.16% 19% 69% 76% 82% 87% 91% 100% 76% 9.58% 17% 64% 70% 76% 82% 87% 97% 61% 10.23% 23% 48% 54% 60% 68% 74% 92% 60% 9.85% 6% 48% 54% 61% 67% 73% 85% 69% 8.19% 22% 58% 64% 69% 74% 78% 93% 75% 7.65% 22% 66% 70% 76% 80% 84% 92% 62% 8.53% 35% 51% 56% 62% 68% 73% 94% 63% 8.22% 35% 52% 57% 63% 68% 74% 87% 75% 6.69% 50% 67% 71% 75% 80% 83% 100% Note: The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). Table 6-4. Item-Level Comparative Results—2016 Database Hospitals (Page 3 of 4) (continued) Item Survey Items by Composite 7. A10R Overall Perceptions of Patient Safety 1. It is just by chance that more serious mistakes don’t happen around here. 2. Patient safety is never sacrificed to get more work done. 3. We have patient safety problems in this unit. 4. Our procedures and systems are good at preventing errors from happening. Communication Openness 1. Staff will freely speak up if they see something that may negatively affect patient care. 2. Staff feel free to question the decisions or actions of those with more authority. 3. Staff are afraid to ask questions when something does not seem right. Teamwork Across Units 1. Hospital units do not coordinate well with each other. 2. There is good cooperation among hospital units that need to work together. 3. It is often unpleasant to work with staff from other hospital units. 4. Hospital units work well together to provide the best care for patients. A15 A17R A18 8. C2 36 C4 C6R 9. F2R F4 F6R F10 Survey Item % Positive Response Percentiles Median/ 10th 25th 50th 75th 90th Min %ile %ile %ile %ile %ile Max Average % Positive s.d. 61% 10.18% 18% 50% 55% 62% 68% 75% 92% 64% 8.97% 19% 53% 58% 64% 70% 76% 91% 65% 10.31% 11% 52% 59% 65% 72% 79% 95% 73% 8.78% 14% 63% 68% 74% 79% 84% 96% 77% 6.88% 70% 74% 78% 82% 86% 97% 49% 7.83% 14% 39% 43% 48% 54% 59% 74% 65% 7.69% 14% 56% 61% 66% 70% 74% 86% 49% 11.36% 9% 35% 41% 48% 56% 63% 91% 62% 10.05% 6% 50% 56% 62% 68% 75% 93% 63% 8.72% 16% 52% 58% 63% 68% 74% 91% 71% 9.75% 9% 60% 66% 72% 77% 84% 96% 19% Note: The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). Table 6-4. Item-Level Comparative Results—2016 Database Hospitals (Page 4 of 4) (continued) Item 10. A2 A5R A7R A14R 11. F3R 37 F5R F7R F11R 12. A8R A12R A16R Survey Items by Composite Staffing 1. We have enough staff to handle the workload. 2. Staff in this unit work longer hours than is best for patient care. 3. We use more agency/temporary staff than is best for patient care. 4. We work in “crisis mode” trying to do too much, too quickly. Handoffs & Transitions 1. Things “fall between the cracks” when transferring patients from one unit to another. 2. Important patient care information is often lost during shift changes. 3. Problems often occur in the exchange of information across hospital units. 4. Shift changes are problematic for patients in this hospital. Nonpunitive Response to Error 1. 
Staff feel like their mistakes are held against them. 2. When an event is reported, it feels like the person is being written up, not the problem. 3. Staff worry that mistakes they make are kept in their personnel file. Survey Item % Positive Response Percentiles Median/ 10th 25th 50th 75th 90th Min %ile %ile %ile %ile %ile Max Average % Positive s.d. 51% 11.99% 11% 36% 44% 51% 59% 67% 88% 50% 9.74% 16% 37% 44% 50% 56% 63% 87% 65% 10.88% 15% 51% 59% 65% 72% 77% 95% 49% 10.79% 19% 36% 42% 48% 56% 64% 85% 43% 11.53% 16% 29% 35% 41% 49% 58% 82% 53% 9.95% 18% 41% 47% 53% 59% 67% 84% 47% 10.69% 22% 33% 40% 45% 53% 61% 80% 48% 11.17% 16% 35% 40% 46% 54% 64% 89% 51% 9.39% 19% 39% 45% 51% 56% 63% 80% 48% 8.79% 19% 38% 42% 48% 53% 59% 76% 37% 9.26% 7% 26% 30% 36% 42% 48% 71% Note: The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). Table 6-5. Percentage of Respondents Giving Their Work Area/Unit Patient Safety Grade Comparative Results—2016 Database Hospitals Item E1 Work Area/Unit Patient Safety Grade Excellent or Very Good Average % s.d. Min Survey Item % Response Percentiles Median/ 10th 50th 75th 90th 25th %ile %ile %ile %ile %ile 76% 10.43% 4% 63% 70% 77% 83% 88% Max 98% Note: For the full average distribution of results, see Chart 5-3. Table 6-6. Percentage of Respondents Reporting One or More Events in the Past 12 Months Comparative Results—2016 Database Hospitals Item G1 Events Reported in the Past 12 Months 1 or more events 38 Note: For the full average distribution of results, see Chart 5-4. Average % s.d. Min Survey Item % Response Percentiles Median/ 10th 25th 50th 75th 90th %ile %ile %ile %ile %ile 45% 9.62% 15% 33% 39% 44% 51% 57% Max 81% Appendixes A and B: Overall Results by Hospital and Respondent Characteristics In addition to the overall results on the database hospitals presented, Part II of the report presents data tables showing average percent positive scores on the survey composites and items across database hospitals, broken down by the following hospital and respondent characteristics: Appendix A: Results by Hospital Characteristics • • • • • Bed size Teaching status Ownership Geographic region Children’s hospitals Appendix B: Results by Respondent Characteristics • • • • Work area/unit Staff position Interaction with patients Tenure in current work area/unit The breakout tables are included as appendixes because there are a large number of them. Highlights of the findings from the breakout tables in these appendixes are provided on the following pages. The appendixes are available on the Web at: www.ahrq.gov/qual/hospsurvey16/. Highlights From Appendix A: Overall Results by Hospital Characteristics Bed Size (Tables A-1, A-3) • • Smaller hospitals (6–24 beds and 25–49 beds) had the highest percent positive for the average across all composites (69 percent); larger hospitals (300–399 beds) had the lowest (61 percent). Hospitals with 25–49 licensed beds had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (83 percent); hospitals with 300–399 beds and 400–499 beds had the lowest (70 percent). 
Teaching Status and Ownership (Tables A-5, A-7, A-8) • • • Nonteaching hospitals, on average, scored higher than teaching hospitals by 5 percentage points or more on Overall Perceptions of Patient Safety, Staffing, and Handoffs and Transitions. Nonteaching hospitals had a higher percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (78 percent) than teaching hospitals (73 percent). Overall, hospitals did not have large differences across ownership categories on the 12 composites, patient safety grade, or number of events reported. 39 Geographic Region (Tables A-9, A-11, A-12) • • • East South Central hospitals had the highest percent positive for the average across all composites (68 percent); New England and Mid-Atlantic hospitals had the lowest (61 percent). West North Central hospitals had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (79 percent); Mid-Atlantic hospitals had the lowest (70 percent). Pacific hospitals had the highest percentage of respondents who reported one or more events in the past year (50 percent); West South Central hospitals had the lowest (39 percent). Children’s Hospitals (Tables A-13, A-15, A-16) • Children’s hospitals and non-children’s hospitals did not have large differences on the 12 composites, patient safety grade, or number of events reported. Highlights From Appendix B: Overall Results by Respondent Characteristics Work Area/Unit (Tables B-1, B-3, B-4) • • • Respondents in Rehabilitation had the highest percent positive response for the average across the composites (71 percent positive); Emergency had the lowest (59 percent positive). Rehabilitation had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (86 percent); Emergency had the lowest (65 percent). ICU (Any Type) had the highest percentage of respondents reporting one or more events in the past year (62 percent); Rehabilitation had the lowest (39 percent). Staff Position (Tables B-5, B-7, B-8) • • • Respondents in Administration/Management had the highest percent positive response for the average across the composites (76 percent positive); RN/LVN/LPN had the lowest (63 percent positive). Administration/Management had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (89 percent); RN/LVN/LPN had the lowest (71 percent). Pharmacists had the highest percentage of respondents reporting one or more events in the past year (77 percent); Unit Assistants/Clerks/Secretaries had the lowest (17 percent). Interaction With Patients (Tables B-9, B-11, B-12) • Respondents with direct patient interaction were more positive than those without direct interaction on Handoffs and Transitions (49 percent compared with 43 percent) but less positive on Management Support for Patient Safety (71 percent compared with 79 percent) and Feedback and Communication About Error (67 percent compared with 72 percent). • Respondents without direct patient interaction had a higher percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (81 percent) than respondents with direct patient interaction (75 percent). 40 • More respondents with direct patient interaction reported one or more events in the past year (49 percent) than respondents without direct patient interaction (31 percent). 
Tenure in Current Work Area/Unit (Tables B-13, B-15, B-16) • • • Respondents with less than 1 year in their current work area/unit had the highest percent positive response for the average across the composites (69 percent); respondents with 6 to 10 years had the lowest (63 percent). Respondents with less than 1 year in their current work area/unit had the highest percentage of respondents who gave their work area/unit a patient safety grade of “Excellent” or “Very Good” (82 percent); respondents with 6 to 10 years had the lowest (74 percent). Respondents with 6 to 10 years, 11 to 15 years, and 21 years or more in their current work area/unit had the highest percentage of respondents reporting one or more events in the past year (48 percent each); respondents with less than 1 year had the lowest (31 percent). 41 Chapter 7. Trending: Comparing Results Over Time Many hospitals that administer the hospital survey have indicated that they intend to continue to administer the survey on a regular basis and to track changes in patient safety culture over time. While the overall results presented earlier in this report reflect only the most recent survey data from all 680 participating hospitals, we have data from two administrations of the survey for 326 hospitals that submitted to both the 2014 and 2016 databases, allowing us to examine trends over time for these hospitals. Hospitals that submitted to databases prior to 2014 were not included in the trending database. This chapter presents trending results from these 326 hospitals. Highlights • • • Across the 326 trending hospitals, the average percent positive scores across the 12 patient safety culture composites increased by 1 percentage point (ranging across the composites from a change of -2 to a change of 3 percentage points). Of those hospitals that increased on Patient Safety Grade, scores for “Excellent” or “Very Good” increased on average 6 percent. For hospitals that increased on the number of respondents who reported at least one event in the past 12 months, the average increase was 5 percent. When reviewing the results in this chapter, keep in mind that survey scores might change, or not change, over time for a number of complex reasons. Important factors to consider are whether the hospital implemented patient safety initiatives or took actions between survey administrations and the length of time between administrations. Survey methodology issues can also play a big role in score changes. Low survey response rates for the previous or most recent administration, changes in the number of staff asked to complete the survey, or changes in the types of staff asked to complete the survey will make it difficult to interpret changes in scores over time. 43 Table 7-1 displays summary statistics from the previous and most recent survey administrations for the 326 trending hospitals. Table 7-1. Trending: Response Rate Statistics—2016 Database Hospitals Summary Statistic Most Recent Submission (2016) Previous Submission (2014) 240,717 Average: 738 Range: 23 – 7,711 Average: 57% Range: 7% – 100% 216,719 Average: 665 Range: 10 – 7,162 Average: 56% Range: 3% – 100% Total number of respondents Number of completed surveys per hospital Hospital response rate Note: Trending hospitals include hospitals that submitted in both the 2014 and 2016 comparative databases. Tables 7-2 to 7-5 provide descriptive statistics on the 326 trending hospitals by bed size, teaching status, ownership, and region compared with nontrending hospitals. Table 7-2. 
Bed Size—Distribution of 2016 Trending and Nontrending Hospitals Trending Hospitals 44 Trending 2014-2016 Bed Size 6-24 beds 25-49 beds 50-99 beds 100-199 beds 200-299 beds 300-399 beds 400-499 beds 500 or more beds TOTAL Number Nontrending Hospitals 2016 Hospitals Submitting Before 2014 Percent Number Percent st 2016 Hospitals 1 Time Submitters Number Percent AHA-Registered U.S. Hospitals Number Percent 11 3% 12 8% 12 6% 729 12% 41 55 73 44 33 25 44 326 13% 17% 22% 14% 10% 8% 14% 100% 29 23 24 16 18 11 12 145 20% 16% 17% 11% 12% 8% 8% 100% 20 41 50 46 18 7 15 209 10% 20% 24% 22% 9% 3% 7% 100% 1,467 1,256 1,268 679 379 201 316 6,295 23% 20% 20% 11% 6% 3% 5% 100% Note: Percentages may not add to 100 due to rounding. Table 7-3. Teaching Status—Distribution of 2016 Trending and Nontrending Hospitals Trending Hospitals Trending 2014-2016 Teaching Status Teaching Nonteaching TOTAL Nontrending Hospitals 2016 Hospitals Submitting Before 2014 st 2016 Hospitals 1 Time Submitters AHA-Registered U.S. Hospitals Number Percent Number Percent Number Percent Number Percent 126 200 326 39% 61% 100% 63 82 145 43% 57% 100% 70 139 209 33% 67% 100% 1,617 4,678 6,295 26% 74% 100% Table 7-4. Ownership—Distribution of 2016 Trending and Nontrending Hospitals Trending Hospitals Trending 2014-2016 45 Ownership Government (Federal or non-Federal) Nongovernment (voluntary/nonprofit or proprietary/investor owned) TOTAL Nontrending Hospitals 2016 Hospitals Submitting Before 2014 st 2016 Hospitals 1 Time Submitters AHA-Registered U.S. Hospitals Number Percent Number Percent Number Percent Number Percent 50 15% 19 13% 34 16% 1,508 14% 276 85% 126 87% 175 84% 4,787 76% 326 100% 145 100% 209 100% 6,295 100% Table 7-5. Geographic Region—Distribution of 2016 Trending and Nontrending Hospitals Trending Hospitals Trending 2014-2016 Nontrending Hospitals 2016 Hospitals Submitting Before 2014 st 2016 Hospitals 1 Time Submitters AHA-Registered U.S. Hospitals Number Percent Number Percent Number Percent Number Percent New England Mid-Atlantic South Atlantic/ Associated Territories East North Central East South Central 10 33 83 3% 10% 25% 5 12 27 3% 8% 19% 18 22 61 9% 11% 29% 255 564 1,002 4% 9% 16% 81 22 25% 7% 36 9 25% 6% 28 10 13% 5% 921 515 15% 8% West Central Mountain/Pacific/Associ ated Territories TOTAL 46 51 15% 16% 35 21 25% 14% 31 39 15% 19% 1,867 1,171 30% 19% 326 100% 145 100% 209 100% 6,295 100% Region 46 Note: Percentages may not add to 100 due to rounding. States and territories are categorized into AHA-defined regions as follows: • New England: CT, MA, ME, NH, RI, VT • Mid-Atlantic: NJ, NY, PA • South Atlantic/Associated Territories: DC, DE, FL, GA, MD, NC, SC, VA, WV, Puerto Rico, Virgin Islands • East North Central: IL, IN, MI, OH, WI • East South Central: AL, KY, MS, TN • West Central: AR, IA, KS, LA, MN, MO, ND, NE, OK, SD, TX • Mountain/Pacific/Associated Territories: AK, AZ, CO, HI, ID, MT, NM, NV, OR, UT, WA, WY, American Samoa, Guam, Marshall Islands, Northern Mariana Islands Description of Trending Statistics Table 7-6 shows examples of the types of statistics provided in this chapter. The tables show the average percentage of respondents who answered positively in the most recent survey administration (left column) and the previous administration (middle column) for the trending hospitals only. The change over time (Most Recent score minus Previous score) is shown in the right column. 
The change is a negative number if the most recent administration showed a decline and a positive number if the most recent administration showed an increase. Table 7-6. Example of Trending Statistics Survey Item Most Recent Item 1 Item 2 Previous 80% 80% Change 84% 78% -4% 2% Table 7-7 shows additional types of trending statistics that are provided. The maximum increase shows the score from the hospital or hospitals with the largest percent positive score increase on a particular composite or item. Similarly, the maximum decrease shows the score from the hospital or hospitals with the largest percent positive score decrease. The average increase was calculated by including only hospitals that had any increase in their most recent score; hospitals that showed no change or decreased were not included when calculating the average increase. Similarly, the average decrease was calculated by including only hospitals that had a decrease in their most recent score; hospitals that showed no change or increased were not included when calculating the average decrease. Table 7-7. Example of Other Trending Statistics Survey Item Item 1 Item 2 Maximum Increase Maximum Decrease 18% 21% -45% -19% Average Increase Average Decrease 3% 5% -5% -6% Composite and Item-Level Trending Results Table 7-8 presents trending results on each of the 12 patient safety culture composites. Table 7-9 presents similar trending results for the 42 survey items. Table 7-10 and Table 7-11 present the trending results for patient safety grade and number of events reported over the past 12 months, respectively. 47 Table 7-8. Trending: Composite-Level Results—2016 Database Hospitals Composite % Positive Response Patient Safety Culture Composites Most Recent Previous Change Maximum Increase Maximum Decrease Average Increase Average Decrease 48 1. Teamwork Within Units 82% 82% 0% 21% -14% 4% -3% 2. Supervisor/Manager Expectations & Actions Promoting Patient Safety 80% 77% 3% 28% -30% 5% -3% 3. Organizational Learning—Continuous Improvement 75% 74% 1% 21% -18% 5% -4% 4. Management Support for Patient Safety 74% 74% 0% 27% -22% 5% -5% 5. Feedback & Communication About Error 70% 69% 1% 27% -15% 5% -4% 6. Frequency of Events Reported 68% 68% 0% 23% -24% 4% -4% 7. Overall Perceptions of Patient Safety 68% 67% 1% 30% -24% 5% -4% 8. Communication Openness 65% 63% 2% 31% -15% 5% -4% 9. Teamwork Across Units 64% 62% 2% 21% -16% 5% -4% 10. Staffing 55% 57% -2% 29% -23% 5% -5% 11. Handoffs & Transitions 50% 48% 2% 32% -26% 5% -4% 12. Nonpunitive Response to Error 47% 45% 2% 31% -24% 6% -4% Note: Based on data from 326 trending hospitals that had composite-level scores; the number of respondents was 240,717 for the most recent results and 216,719 for the previous results. Table 7-9. Trending: Item-Level Results—2016 Database Hospitals (Page 1 of 4) Item Survey Items by Composite 1. A1 A3 Teamwork Within Units 1. People support one another in this unit. 2. When a lot of work needs to be done quickly, we work together as a team to get the work done. 3. In this unit, people treat each other with respect. 4. When one area in this unit gets really busy, others help out. Supervisor/Manager Expectations & Actions Promoting Patient Safety 1. My supv/mgr says a good word when he/she sees a job done according to established patient safety procedures. 2. My supv/mgr seriously considers staff suggestions for improving patient safety. 3. Whenever pressure builds up, my supv/mgr wants us to work faster, even if it means taking shortcuts. 4. 
My supv/mgr overlooks patient safety problems that happen over and over. Organizational Learning—Continuous Improvement 1. We are actively doing things to improve patient safety. 2. Mistakes have led to positive changes here. 3. After we make changes to improve patient safety, we evaluate their effectiveness. A4 A11 2. B1 49 B2 B3R B4R 3. A6 A9 A13 Item % Positive Response Maximum Maximum Change Increase Decrease Most Recent Previous Average Increase Average Decrease 88% 87% 87% 87% 1% 0% 25% 26% -25% -18% 4% 3% -3% -3% 81% 80% 1% 27% -22% 4% -3% 73% 72% 1% 26% -18% 5% -4% 80% 76% 4% 29% -16% 6% -3% 81% 78% 3% 26% -23% 6% -3% 78% 75% 3% 40% -58% 6% -4% 80% 77% 3% 31% -59% 5% -4% 85% 85% 0% 21% -14% 4% -4% 66% 66% 0% 31% -22% 5% -5% 72% 72% 0% 33% -19% 6% -5% Note: Based on data from 326 trending hospitals. The number of respondents was 240,717 for the most recent results and 216,719 for the previous results, but the exact number of respondents will vary from item to item. The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). Table 7-9. Trending: Item-Level Results—2016 Database Hospitals (Page 2 of 4) (continued) Item Survey Items by Composite 4. F1 Management Support for Patient Safety 1. Hospital mgmt provides a work climate that promotes patient safety. 2. The actions of hospital mgmt show that patient safety is a top priority. 3. Hospital mgmt seems interested in patient safety only after an adverse event happens. Feedback and Communication About Error 1. We are given feedback about changes put into place based on event reports. 2. We are informed about errors that happen in this unit. 3. In this unit, we discuss ways to prevent errors from happening again. Frequency of Events Reported 1. When a mistake is made, but is caught and corrected before affecting the patient, how often is this reported? 2. When a mistake is made, but has no potential to harm the patient, how often is this reported? 3. When a mistake is made that could harm the patient, but does not, how often is this reported? F8 F9R 5. C1 C3 50 C5 6. D1 D2 D3 Item % Positive Response Maximum Maximum Change Increase Decrease Most Recent Previous Average Increase Average Decrease 83% 82% 1% 24% -24% 5% -5% 78% 77% 1% 23% -23% 6% -5% 62% 62% 0% 40% -31% 6% -6% 63% 62% 1% 33% -25% 6% -5% 70% 69% 1% 33% -17% 5% -4% 77% 75% 2% 27% -21% 5% -4% 63% 62% 1% 24% -25% 5% -4% 64% 64% 0% 25% -24% 5% -4% 76% 77% -1% 25% -23% 4% -4% Note: Based on data from 326 trending hospitals. The number of respondents was 240,717 for the most recent results and 216,719 for the previous results, but the exact number of respondents will vary from item to item. The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). Table 7-9. Trending: Item-Level Results—2016 Database Hospitals (Page 3 of 4) (continued) Item Survey Items by Composite 7. A10R Overall Perceptions of Patient Safety 1. It is just by chance that more serious mistakes don’t happen around here. 2. Patient safety is never sacrificed to get more work done. 3. We have patient safety problems in this unit. 4. 
Our procedures and systems are good at preventing errors from happening. Communication Openness 1. Staff will freely speak up if they see something that may negatively affect patient care. 2. Staff feel free to question the decisions or actions of those with more authority. 3. Staff are afraid to ask questions when something does not seem right. Teamwork Across Units 1. Hospital units do not coordinate well with each other. 2. There is good cooperation among hospital units that need to work together. 3. It is often unpleasant to work with staff from other hospital units. 4. Hospital units work well together to provide the best care for patients. A15 A17R A18 8. C2 C4 51 C6R 9. F2R F4 F6R F10 Item % Positive Response Maximum Maximum Change Increase Decrease Most Recent Previous Average Increase Average Decrease 63% 63% 0% 29% -44% 5% -5% 66% 64% 2% 63% -20% 8% -5% 66% 66% 0% 43% -58% 6% -5% 76% 75% 1% 29% -14% 5% -4% 78% 77% 1% 27% -22% 5% -3% 50% 49% 1% 43% -19% 5% -5% 66% 63% 3% 44% -43% 6% -5% 52% 50% 2% 27% -21% 6% -5% 64% 63% 1% 36% -19% 6% -5% 65% 63% 2% 22% -36% 5% -4% 74% 72% 2% 26% -29% 6% -4% Note: Based on data from 326 trending hospitals. The number of respondents was 240,717 for the most recent results and 216,719 for the previous results, but the exact number of respondents will vary from item to item. The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). Table 7-9. Trending: Item-Level Results—2016 Database Hospitals (Page 4 of 4) (continued) Item Survey Items by Composite 10. A2 Staffing 1. We have enough staff to handle the workload. 2. Staff in this unit work longer hours than is best for patient care. 3. We use more agency/temporary staff than is best for patient care. 4. We work in “crisis mode” trying to do too much, too quickly. Handoffs & Transitions 1. Things “fall between the cracks” when transferring patients from one unit to another. 2. Important patient care information is often lost during shift changes. 3. Problems often occur in the exchange of information across hospital units. 4. Shift changes are problematic for patients in this hospital. Nonpunitive Response to Error 1. Staff feel like their mistakes are held against them. 2. When an event is reported, it feels like the person is being written up, not the problem. 3. Staff worry that mistakes they make are kept in their personnel file. A5R A7R A14R 11. F3R F5R 52 F7R F11R 12. A8R A12R A16R Item % Positive Response Maximum Maximum Change Increase Decrease Most Recent Previous Average Increase Average Decrease 53% 55% -2% 37% -29% 8% -8% 51% 53% -2% 47% -33% 5% -6% 66% 67% -1% 53% -51% 6% -7% 51% 51% 0% 33% -27% 6% -6% 45% 44% 1% 31% -22% 6% -5% 56% 54% 2% 40% -44% 6% -5% 49% 47% 2% 29% -25% 6% -4% 50% 49% 1% 53% -40% 6% -5% 52% 51% 1% 35% -27% 6% -5% 50% 49% 1% 26% -26% 6% -5% 38% 35% 3% 53% -27% 7% -4% Note: Based on data from 326 trending hospitals. The number of respondents was 240,717 for the most recent results and 216,719 for the previous results, but the exact number of respondents will vary from item to item. The item’s survey location is shown to the left. An “R” indicates a negatively worded item, where the percent positive response is based on those who responded “Strongly disagree” or “Disagree,” or “Never” or “Rarely” (depending on the response category used for the item). 
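The trending statistics in Tables 7-8 and 7-9 follow the definitions given earlier in this chapter: the change is the most recent score minus the previous score, the maximum increase and maximum decrease are the largest single-hospital gains and drops, and the average increase and average decrease are computed only over hospitals that moved in that direction. The Python sketch below, using made-up hospital scores rather than the report's data, illustrates those definitions.

```python
# Illustration of the trending statistics defined earlier in this chapter.
# Each tuple is one hospital's percent positive score on a composite:
# (most recent administration, previous administration). Scores are made up.
scores = [(82, 79), (74, 74), (68, 73), (90, 72), (55, 60), (77, 76)]

changes = [recent - previous for recent, previous in scores]

most_recent_avg = sum(recent for recent, _ in scores) / len(scores)
previous_avg = sum(previous for _, previous in scores) / len(scores)
change = most_recent_avg - previous_avg      # negative if scores declined overall

maximum_increase = max(changes)              # largest single-hospital gain
maximum_decrease = min(changes)              # largest single-hospital drop

increases = [c for c in changes if c > 0]    # only hospitals that increased
decreases = [c for c in changes if c < 0]    # only hospitals that decreased
average_increase = sum(increases) / len(increases)
average_decrease = sum(decreases) / len(decreases)

print(f"Change: {change:+.0f}  Max increase: {maximum_increase:+}  "
      f"Max decrease: {maximum_decrease:+}  Avg increase: {average_increase:+.0f}  "
      f"Avg decrease: {average_decrease:+.0f}")
```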
Table 7-10. Trending: Distribution of Work Area/Unit Patient Safety Grades—2016 Database Hospitals Item E1 Work Area/Unit Patient Safety Grade Excellent or Very Good Most Recent 77% Percentage of Respondents Within Hospitals Maximum Maximum Average Change Increase Decrease Increase Previous 77% 0% 44% -69% 6% Average Decrease -6% Note: Based on data from 326 trending hospitals that had data for this item. The number of respondents was 240,717 for the most recent results and 216,719 for the previous results. Most recent, previous, and change columns display average percent positive scores across the trending hospitals. Table 7-11. Trending: Distribution of Number of Events Reported in the Past 12 Months—2016 Database Hospitals Item G1 Events Reported in the Past 12 Months 1 or more events Most Recent 45% Percentage of Respondents Within Hospitals Maximum Maximum Average Previous Change Increase Decrease Increase 45% 0% 30% -51% 5% Average Decrease -5% 53 Note: Based on data from 326 trending hospitals that had data for this item. The number of respondents was 240,717 for the most recent results and 216,719 for the previous results. Most recent, previous, and change columns display average percent positive scores across the trending hospitals. Bar Charts of Trending Results Chart 7-1 shows the percentages of trending hospitals that increased, decreased, or did not change for each of the 12 patient safety culture composites. The chart shows that: • • Supervisor/Manager Expectations and Actions Promoting Patient Safety had the largest percentage of hospitals that increased 5 percentage points or more; 33 percent of hospitals increased by at least 5 percentage points. Staffing had the largest percentage of hospitals that decreased 5 percentage points or more; 32 percent of hospitals decreased by at least 5 percentage points. Chart 7-2 displays results for the percentages of trending hospitals that increased, decreased, or did not change on work area/unit patient safety grades (percent providing grades of “Excellent” or “Very Good”) and shows that: • • • 23 percent of hospitals increased by 5 percentage points or more. 57 percent of hospitals changed less than 5 percentage points. 20 percent of hospitals decreased by 5 percentage points or more. Chart 7-3 displays results for the percentages of trending hospitals that increased, decreased, or did not change in the percentage of respondents reporting one or more events (percentages may not add to 100 due to rounding) and shows that: • • • 26 percent of hospitals increased by 5 percentage points or more. 53 percent of hospitals changed less than 5 percentage points. 22 percent of hospitals decreased by 5 percentage points or more. Charts 7-4, 7-5, and 7-6 display the overall number of composites for which trending hospitals increased, decreased, or did not change: • • • Most hospitals (63 percent) increased by 5 percentage points or more on at least one composite. Nearly two-thirds (62 percent) of hospitals changed less than 5 percentage points on seven or more composites. Slightly more than half of hospitals (52 percent) decreased by 5 percentage points or more on at least one composite. 54 Chart 7-1. Trending: Percentage of 2016 Hospitals That Increased, Decreased, or Did Not Change on Each Composite 55 Increased (by 5 percentage points or more) Did Not Change (less than 5 percentage point change) Note: Based on data from 326 trending hospitals. Percentages may not add to 100 due to rounding. Decreased (by 5 percentage points or more) Chart 7-2. 
Chart 7-1. Trending: Percentage of 2016 Hospitals That Increased, Decreased, or Did Not Change on Each Composite
Note: Based on data from 326 trending hospitals. Percentages may not add to 100 due to rounding.

Chart 7-2. Trending: Percentage of 2016 Hospitals That Increased, Decreased, or Did Not Change on Work Area/Unit Patient Safety Grade as "Excellent" or "Very Good"
Note: Based on data from 326 trending hospitals that responded to this item.

Chart 7-3. Trending: Percentage of 2016 Hospitals That Increased, Decreased, or Did Not Change on Number of Events Reported as Equal to One or More Events Reported
Note: Based on data from 326 trending hospitals that responded to this item.

Chart 7-4. Trending: Distribution of 2016 Hospitals by Number of Composites That Increased by 5 Percentage Points or More
Note: Based on data from 326 trending hospitals that measured all 12 survey dimensions. Percentages may not add to 100 due to rounding.

Chart 7-5. Trending: Distribution of 2016 Hospitals by Number of Composites That Did Not Change by 5 Percentage Points or More
Note: Based on data from 326 trending hospitals that measured all 12 survey dimensions. Percentages may not add to 100 due to rounding.

Chart 7-6. Trending: Distribution of 2016 Hospitals by Number of Composites That Decreased by 5 Percentage Points or More
Note: Based on data from 326 trending hospitals that measured all 12 survey dimensions. Percentages may not add to 100 due to rounding.

Appendixes C and D: Trending Results by Hospital and Respondent Characteristics

Part III of the report contains Appendixes C and D, which show trends over time for the 326 hospitals that administered the survey and submitted data more than once. Average percent positive scores from the most recent and previous administrations are shown on the survey composites and items, broken down by the following hospital and respondent characteristics:

Appendix C: Trending Results by Hospital Characteristics
• Bed size
• Teaching status
• Ownership
• Geographic region

Appendix D: Trending Results by Respondent Characteristics
• Work area/unit
• Staff position
• Interaction with patients
• Tenure in current work area/unit

Because there are many breakout tables, they are included in Appendixes C and D. Highlights of the findings from the breakout tables in these appendixes are provided on the following pages. The appendixes are available on the Web at: www.ahrq.gov/qual/hospsurvey16/.

Highlights From Appendix C: Trending Results by Hospital Characteristics

Bed Size (Tables C-1, C-3)
• Hospitals with 6–24 beds increased on all 12 of the patient safety culture composites; increases ranged from 4 to 11 percentage points.
• Hospitals with 6–24 beds had the greatest increase in the percentage of respondents who gave their work area/unit a patient safety grade of "Excellent" or "Very Good" (a 5 percentage point increase, from 81 percent to 86 percent).

Teaching Status and Ownership (Table C-5)
• Both teaching and nonteaching hospitals showed the largest increase of 3 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.
• Nongovernment-owned hospitals showed the largest increase of 3 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety. Government-owned hospitals' largest increase was 2 percentage points on the same composite.
Geographic Region (Tables C-9, C-11)
• West Central region hospitals increased 8 percentage points on Nonpunitive Response to Error and 5 percentage points on Overall Perceptions of Safety.
• West Central region hospitals had the greatest increase in the percentage of respondents who gave their work area/unit a patient safety grade of "Excellent" or "Very Good" (a 3 percentage point increase, from 76 percent to 79 percent).

Highlights From Appendix D: Trending Results by Respondent Characteristics

Work Area/Unit (Table D-1)
• Surgery increased 4 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.
• Anesthesiology increased 3 percentage points on the percentage of respondents reporting one or more events in the past year.

Staff Position (Tables D-5, D-7)
• Attending/staff physician, resident physician/physician in training, or physician assistant/nurse practitioner respondents increased 6 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.
• Dietitians increased 5 percentage points on the percentage of respondents who gave their work area/unit a patient safety grade of "Excellent" or "Very Good."

Interaction With Patients (Table D-9)
• Respondents with and without direct interaction with patients increased 3 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.

Tenure in Current Work Area/Unit (Table D-13)
• Respondents with 1 to 5 years in their work area/unit increased 3 percentage points on Nonpunitive Response to Error and Supervisor/Manager Expectations and Actions Promoting Patient Safety; respondents with 11 to 15 years in their work area/unit also increased 3 percentage points on Supervisor/Manager Expectations and Actions Promoting Patient Safety.

Chapter 8. What's Next? Action Planning for Improvement

The seven steps of action planning outlined in this chapter are primarily based on the book Designing and Using Organizational Surveys: A Seven-Step Process (Church and Waclawski, 1998).

Highlights
• The delivery of survey results is not the end point in the survey process; it is just the beginning.
• Often, the perceived failure of surveys to create lasting change is actually due to faulty or nonexistent action planning or survey followup.
• Seven steps of action planning are provided to give hospitals guidance on next steps to take to turn their survey results into actual patient safety culture improvement.

Seven Steps of Action Planning

Administering the hospital survey can be considered an "intervention," a means of educating hospital staff and building awareness about issues of concern related to patient safety. But administering the survey should not be the only goal; by itself, it is not enough. Keep in mind that the delivery of survey results is not the end point in the survey process; it is actually just the beginning. Often, the perceived failure of surveys as a means for creating lasting change is actually due to faulty or nonexistent action planning or survey followup. Seven steps of action planning are provided to help your hospital go beyond simply conducting a survey to realizing patient safety culture change. The progression is getting survey results, developing an action plan, implementing the plan, and tracking progress. The seven steps of action planning are:

1. Understand your survey results.
2. Communicate and discuss survey results.
3. Develop focused action plans.
4. Communicate action plans and deliverables.
5. Implement action plans.
6. Track progress and evaluate impact.
7. Share what works.

Step 1: Understand Your Survey Results

It is important to review the survey results and interpret them before you develop action plans. Develop an understanding of your hospital's key strengths and areas for improvement. Examine your hospital's overall percent positive scores on the patient safety culture composites and items.

• Which areas were most and least positive?
• How do your hospital's results compare with the results from the database hospitals?

Next, consider examining your survey data broken down by work area/unit or staff position.

• Are there different areas for improvement for different hospital units?
• Are there different areas for improvement for different hospital staff?
• Do any patterns emerge?
• How do your hospital's results for these breakouts compare with the results from the database hospitals?

Finally, if your hospital administered the survey more than once, compare your most recent results with your previous results to examine change over time.

• Did your hospital have an increase in its scores on any of the survey composites or items?
• Did your hospital have a decrease in its scores?
• When you consider the types of patient safety actions that your hospital implemented between each survey administration, do you notice improvements in those areas?

After reviewing the survey results carefully, identify two or three areas for improvement to avoid focusing on too many issues at one time.

Step 2: Communicate and Discuss Survey Results

Common complaints among survey respondents are that they never get any feedback about survey results and have no idea whether anything ever happens as a result of a survey. It is therefore important to thank your staff for taking the time to complete the survey and let them know that you value their input. Sharing results from the survey throughout the hospital shows your commitment to the survey and improvement process. Use survey feedback as an impetus for change.

Feedback can be provided at the hospital level and at the department or unit level. However, to ensure respondent anonymity and confidentiality, it is important to report data only if there are enough respondents in a particular category or group. Common rules of thumb recommend not reporting data if a category has fewer than 5 or 10 respondents. For example, if a department has only four respondents, that department's data should not be reported separately because there are too few respondents to provide complete assurance of anonymity and confidentiality.

Distribute summaries of the survey results throughout the hospital in a top-down manner, beginning with senior management, administrators, medical and senior leaders, and committees, followed by department or unit managers and then staff. Managers at all levels should be expected to carefully review the findings. Summarize key findings, but also encourage discussion about the results throughout the hospital. What do others see in the data, and how do they interpret the results?

In some cases, it may not be completely clear why an area of patient safety culture was particularly low. Keep in mind that surveys are only one way of examining culture, so strive for a deeper understanding when needed. Conduct followup activities, such as focus groups or interviews with staff, to find out more about an issue, why it is problematic, and how it can be improved.
Step 3: Develop Focused Action Plans

Once you identify areas for improvement in patient safety culture, you need to develop formal written action plans to ensure progress toward change. Hospitalwide, department-based, or unit-based action plans can be developed. Major goals can be established as hospitalwide action plans. Unit-specific goals can be fostered by encouraging and empowering staff to develop action plans at the unit level.

Health care organizations can use the Action Planning Tool for the AHRQ Surveys on Patient Safety Culture. The tool is intended for use after your organization administers the survey and analyzes the results. It provides step-by-step guidance to help survey users develop an action plan to improve patient safety culture. This tool includes an Action Plan Template your organization can use to document goals, initiatives, needed resources, process and outcome measures, and timelines. You can download the Action Planning Tool for the AHRQ Surveys on Patient Safety Culture at http://www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/planningtool.html.

Encourage action plans that are "SMART":
• Specific
• Measurable
• Achievable
• Relevant
• Time bound

When deciding whether a particular action plan or initiative would be a good fit in your facility, you may want to check the guide Will It Work Here? A Decisionmaker's Guide to Adopting Innovations (Brach, et al., 2008, available at http://innovations.ahrq.gov/guide/guideTOC). The guide helps users answer four overarching questions:
• Does this innovation fit?
• Should we do it here?
• Can we do it here?
• How can we do it here?

Lack of resources is often a fundamental obstacle hindering implementation of action plans. Identify funding, staffing, or other resources needed to implement action plans and take steps to obtain these resources. It is also important to identify other obstacles you may encounter when trying to implement change and to anticipate and understand the rationale behind any potential resistance toward proposed action plans.

In the planning stage, it is also important to identify quantitative and qualitative measures that can be used to evaluate progress and the impact of changes implemented. Evaluative measures will need to be assessed before, during, and after implementation of your action plan initiatives.

Step 4: Communicate Action Plans and Deliverables

After you develop your action plans, you need to communicate the plans, deliverables, and expected outcomes. Those directly involved or affected will need to know their roles and responsibilities, as well as the timeframe for implementation. Action plans and goals should also be shared widely so that their transparency encourages further accountability and demonstrates the hospitalwide commitments being made in response to the survey results.

At this step it is important for senior hospital managers and leaders to understand that they are the primary owners of the change process and that success depends on their full commitment and support. Senior-level commitment to taking action must be strong; without buy-in from the top, including medical leadership, improvement efforts are likely to fail.

Step 5: Implement Action Plans

Implementing action plans is one of the hardest steps. Taking action requires providing needed resources and support. It requires tracking quantitative and qualitative measures of progress and success that have already been identified.
It requires publicly recognizing those individuals and units who take action to drive improvement. And it requires adjustments along the way.

This step is critical to improving patient safety culture. While communicating the survey results is important, taking action makes the real difference. However, as the Institute for Healthcare Improvement (2016) suggests, actions do not have to be major permanent changes. In fact, it is worthwhile to strive to implement easier, smaller changes that are likely to have a positive impact rather than big changes with an unknown probability of success.

The "Plan-Do-Study-Act" cycle (Langley, et al., 1996), shown in Figure 8-1, is a pilot-study approach to change. It involves developing a small-scale plan to test a proposed change (Plan), carrying out the plan (Do), observing and learning from the results (Study), and determining what changes should be made to the plan (Act). Implementation of action plans can occur on a small scale within a single unit to examine impact and refine plans before rolling out the changes on a larger scale to other units or hospitals.

Figure 8-1. Plan-Do-Study-Act Cycle

Step 6: Track Progress and Evaluate Impact

Use quantitative and qualitative measures to review progress and evaluate whether a specific change actually leads to improvement. Ensure that there is timely communication of progress toward action plans on a regular basis. If you determine that a change has worked, communicate that success to staff by telling them what was changed and that it was done in response to the safety culture survey results. Be sure to make the connection to the survey so that the next time the survey is administered, staff will know that it will be worthwhile to participate again because actions were taken based on the prior survey's results.

Alternatively, your evaluation may show that a change is not working as expected or has failed to reach its goals and will need to be modified or replaced by another approach. Before you drop the effort completely, try to determine why it failed and whether it might be worth making adjustments.

Keep in mind that it is important not to reassess culture too frequently because lasting culture change will be slow and may take years. Frequent assessments of culture are likely to find temporary shifts or improvements that may come back down to baseline levels in the longer term if changes are not sustained. When planning to reassess culture, it is also very important to obtain high survey response rates. Otherwise, it will not be clear whether changes in survey results over time are due to true changes in attitudes or are caused by surveying different staff each time.

Step 7: Share What Works

In Step 6, you track measures to identify which changes result in improvement. Once your hospital has found effective ways to address a particular area, the changes can be implemented on a broader scale in other departments within the hospital and in other hospitals. Be sure to share your successes with outside hospitals and health care systems as well.

References

Agency for Healthcare Research and Quality. Hospital Survey on Patient Safety Culture. Available at: www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/hospital/index.html. Accessed March 1, 2016.

American Hospital Association (AHA) Annual Survey of Hospitals database. Chicago: Health Forum; multiple years (data used from 2013).

Brach C, Lenfestey N, Roussel A, et al. Will it work here? A decisionmaker's guide to adopting innovations.
(Prepared by RTI International under Contract No. 233-02-0090). Rockville, MD: Agency for Healthcare Research and Quality; September 2008. AHRQ Publication No. 08-0051. Available at: http://innovations.ahrq.gov/guide/guideTOC. Accessed March 1, 2016.

Church AH, Waclawski J. Designing and using organizational surveys: a seven-step process. San Francisco: Jossey-Bass; 1998.

Institute for Healthcare Improvement. How To Improve. 2016. Available at: www.ihi.org/IHI/Topics/Improvement/ImprovementMethods/HowToImprove. Accessed March 1, 2016.

Langley C, Nolan K, Nolan T, et al. The improvement guide: a practical approach to improving organizational performance. San Francisco: Jossey-Bass; 1996.

Sorra J, Gray L, Franklin M, et al. Action Planning Tool for the AHRQ Surveys on Patient Safety Culture. (Prepared by Westat, Rockville, MD, under Contract No. HHSA290201300003C). Rockville, MD: Agency for Healthcare Research and Quality; January 2016. AHRQ Publication No. 16-0008-EF. Available at: http://www.ahrq.gov/professionals/quality-patient-safety/patientsafetyculture/planningtool.html. Accessed March 1, 2016.

Notes: Description of Data Cleaning and Calculations

This notes section provides additional detail regarding how various statistics presented in this report were calculated.

Data Cleaning

Each participating hospital submitted individual-level survey data. Once the data were submitted, response frequencies were run on each hospital's data to look for out-of-range values, missing variables, or other data anomalies. When data problems were found, hospitals were contacted and asked to make corrections and resubmit their data. In addition, each participating hospital was sent a copy of its data frequencies to verify that the dataset received was correct. Records of respondents who supplied the same answers within or across sections A, B, C, and F (i.e., straight-lined) or who answered only demographic items were deleted before any analyses.

Response Rates

As part of the data submission process, hospitals were asked to provide their response rate numerator and denominator. Response rates were calculated using the formula below:

Response Rate = Number of complete, returned surveys / (Number of surveys distributed − Ineligibles)

Numerator = Number of complete, returned surveys after being cleaned for straight-lining. The numerator equals the number of individual survey records submitted to the database. It should exclude surveys that were returned blank on all nondemographic survey items but include surveys where at least one nondemographic survey item was answered.

Denominator = The total number of surveys distributed minus ineligibles. Ineligibles include deceased individuals and those who were not employed at the hospital during data collection.
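The response rate formula is straightforward to compute once the numerator and denominator are known. The following Python snippet is illustrative only (the function name and example counts are hypothetical, not from the report); the numerator is assumed to already exclude blank and straight-lined records, per the definitions above.

```python
def response_rate(completed_surveys: int, surveys_distributed: int, ineligibles: int) -> float:
    """Response rate = completed, returned surveys / (surveys distributed - ineligibles).

    `completed_surveys` should already exclude blank and straight-lined records,
    matching the numerator definition above.
    """
    eligible = surveys_distributed - ineligibles
    if eligible <= 0:
        raise ValueError("Denominator must be positive.")
    return completed_surveys / eligible


# Illustrative example (hypothetical counts, not from the report):
# 480 usable surveys returned out of 1,000 distributed, with 40 ineligible recipients.
print(f"{response_rate(480, 1000, 40):.1%}")  # 50.0%
```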
Calculation of Percent Positive Scores

Most of the survey items ask respondents to answer using 5-point response categories in terms of agreement (Strongly agree, Agree, Neither, Disagree, Strongly disagree) or frequency (Always, Most of the time, Sometimes, Rarely, Never). Three of the 12 patient safety culture composites use the frequency response option (Feedback and Communication About Error, Communication Openness, and Frequency of Events Reported), while the other 9 composites use the agreement response option.

Item-Level Percent Positive Response

Both positively worded items (such as "People support one another in this unit") and negatively worded items (such as "We have patient safety problems in this unit") are included in the survey. Calculating the percent positive response on an item is different for positively and negatively worded items:

• For positively worded items, percent positive response is the combined percentage of respondents within a hospital who answered "Strongly agree" or "Agree," or "Always" or "Most of the time," depending on the response categories used for the item. For example, for the item "People support one another in this unit," if 50 percent of respondents within a hospital Strongly agree and 25 percent Agree, the item-level percent positive response for that hospital would be 50% + 25% = 75% positive.

• For negatively worded items, percent positive response is the combined percentage of respondents within a hospital who answered "Strongly disagree" or "Disagree," or "Never" or "Rarely," because a negative answer on a negatively worded item indicates a positive response. For example, for the item "We have patient safety problems in this unit," if 60 percent of respondents within a hospital Strongly disagree and 20 percent Disagree, the item-level percent positive response would be 80 percent positive (i.e., 80 percent of respondents do not believe they have patient safety problems in their work area).

Composite-Level Percent Positive Response

The survey's 42 items measure 12 areas, or composites, of patient safety culture. Each of the 12 patient safety culture composites includes 3 or 4 survey items. Composite scores were calculated for each hospital by averaging the percent positive response on the items within a composite. For example, for a three-item composite, if the item-level percent positive responses were 50 percent, 55 percent, and 60 percent, the hospital's composite-level percent positive response would be the average of these three percentages, or 55 percent positive.

Item and Composite Percent Positive Scores

To calculate your hospital's composite score, simply average the percentage of positive responses to each item in the composite. Here is an example of computing a composite score for Overall Perceptions of Patient Safety:

1. There are four items in this composite: two are positively worded (items A15 and A18) and two are negatively worded (items A10 and A17). Keep in mind that disagreeing with a negatively worded item indicates a positive response.
2. Calculate the percentage of positive responses at the item level. (See the example in Table N1.)

Table N1. Example of Computing Item and Composite Percent Positive Scores

Item A15 (positively worded), "Patient safety is never sacrificed to get more work done": 120 "Strongly agree" or "Agree" responses out of 260 total responses; percent positive = 120/260 = 46%.
Item A18 (positively worded), "Our procedures and systems are good at preventing errors from happening": 130 "Strongly agree" or "Agree" responses out of 250 total responses; percent positive = 130/250 = 52%.
Item A10 (negatively worded), "It is just by chance that more serious mistakes don't happen around here": 110 "Strongly disagree" or "Disagree" responses out of 240 total responses; percent positive = 110/240 = 46%.
Item A17 (negatively worded), "We have patient safety problems in this unit": 140 "Strongly disagree" or "Disagree" responses out of 250 total responses; percent positive = 140/250 = 56%.
Composite Score % Positive = (46% + 52% + 46% + 56%) / 4 = 50%

In this example, there were four items with percent positive response scores of 46 percent, 52 percent, 46 percent, and 56 percent. Averaging these item-level percent positive scores results in a composite score of .50, or 50 percent, on Overall Perceptions of Patient Safety. In this example, an average of about 50 percent of the respondents responded positively to the survey items in this composite.
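For illustration, the Table N1 example can be reproduced with a short script. This is a minimal sketch that assumes the counts of positive responses per item have already been tallied (counting "Strongly agree"/"Agree" for positively worded items and "Strongly disagree"/"Disagree" for negatively worded items); the function names are hypothetical, not part of the survey toolkit.

```python
# Minimal sketch reproducing the Table N1 example.
# "Positive" means Strongly agree/Agree for positively worded items and
# Strongly disagree/Disagree for negatively worded items.

def item_percent_positive(positive_responses: int, total_responses: int) -> float:
    """Item-level percent positive response within a hospital."""
    return positive_responses / total_responses


def composite_percent_positive(item_scores: list) -> float:
    """Composite-level score = average of the item-level percent positive scores."""
    return sum(item_scores) / len(item_scores)


# Overall Perceptions of Patient Safety (items A15, A18, A10, A17) from Table N1
item_scores = [
    item_percent_positive(120, 260),  # A15, positively worded -> ~46%
    item_percent_positive(130, 250),  # A18, positively worded -> 52%
    item_percent_positive(110, 240),  # A10, negatively worded -> ~46%
    item_percent_positive(140, 250),  # A17, negatively worded -> 56%
]
print(f"Composite percent positive: {composite_percent_positive(item_scores):.0%}")  # ~50%
```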
Table N2 shows how to calculate the percent positive response for Overall Patient Safety Grade (E1) and Number of Events Reported (G1).

Table N2. Example of Computing Patient Safety Grade and Number of Events Reported

Item E1, "Please give your work area/unit in this hospital an overall grade on patient safety": 193 "Excellent" or "Very Good" responses out of 250 total responses; percent positive = 193/250 = 77%.
Item G1, "In the past 12 months, how many event reports have you filled out and submitted?": 106 responses of "1 to 2 event reports," "3 to 5 event reports," "6 to 10 event reports," "11 to 20 event reports," or "21 event reports or more" out of 240 total responses; percent positive = 106/240 = 44%.

In this example, the Overall Patient Safety Grade (E1) percent positive response is calculated by combining the number of respondents who answered "Excellent" or "Very Good" and dividing by the total number of respondents who answered E1. The Number of Events Reported (G1) percent positive response is calculated by combining the number of respondents who reported one or more events in the past 12 months and dividing by the total number of respondents who answered G1.

Once you calculate your hospital's percent positive response for each of the 12 safety culture composites, Overall Patient Safety Grade, and Number of Events Reported, you can compare your results with the composite-level results from the database hospitals.
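For illustration, the Table N2 logic can also be expressed as a short script. This is a minimal sketch with hypothetical per-category tallies chosen to reproduce the 193/250 and 106/240 totals above; the response category labels follow the survey items, but the breakdown across individual categories is invented for the example.

```python
# Minimal sketch of the Table N2 calculations, using hypothetical tallies of
# responses per answer category (category labels follow the survey form).

from collections import Counter

def percent_positive(counts: Counter, positive_categories: set) -> float:
    """Share of all responses that fall in the categories counted as positive."""
    total = sum(counts.values())
    positive = sum(counts[c] for c in positive_categories)
    return positive / total


# Item E1: Overall Patient Safety Grade -- "Excellent" or "Very Good" counts as positive.
e1_counts = Counter({"Excellent": 90, "Very Good": 103, "Acceptable": 40, "Poor": 12, "Failing": 5})
print(f"E1 percent positive: {percent_positive(e1_counts, {'Excellent', 'Very Good'}):.0%}")  # 77%

# Item G1: Number of Events Reported -- any answer other than "No event reports" counts as positive.
g1_counts = Counter({
    "No event reports": 134, "1 to 2 event reports": 70, "3 to 5 event reports": 24,
    "6 to 10 event reports": 8, "11 to 20 event reports": 3, "21 event reports or more": 1,
})
positive_g1 = set(g1_counts) - {"No event reports"}
print(f"G1 percent positive: {percent_positive(g1_counts, positive_g1):.0%}")  # ~44%
```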
Minimum Number of Responses

Beginning with the 2010 database report, we enacted several new rules regarding a minimum number of responses for calculating the percent positive scores. We calculated percent positive scores only for hospitals that had at least 10 completed surveys. Starting with the 2016 Comparative Database, if a hospital had at least 1 respondent in a breakout category (e.g., work area/unit, staff position, direct interaction with patients), statistics were calculated for that breakout category.

Percentiles

Percentiles were computed using the SAS® software default method. The first step in this procedure is to rank order the percent positive scores from all the participating hospitals, from lowest to highest. The next step is to multiply the number of hospitals (n) by the percentile of interest (p), which in our case would be the 10th, 25th, 50th, 75th, or 90th percentile. For example, to calculate the 10th percentile, one would multiply 680 (the total number of hospitals) by .10 (10th percentile). The product of n x p is equal to j + g, where j is the integer portion and g is the fractional portion (the part after the decimal point). If g equals 0, the percentile is equal to the percent positive value of the hospital in the jth position plus the percent positive value of the hospital in the jth + 1 position, divided by 2 [(X(j) + X(j+1))/2]. If g is not equal to 0, the percentile is equal to the percent positive value of the hospital in the jth + 1 position.

The following examples show how the 10th and 50th percentiles would be computed using a sample of percent positive scores from 12 hospitals (using the fake data shown in Table N3). First, the percent positive scores are sorted from low to high on Composite "A."

Table N3. Data Table for Example of How To Compute Percentiles

Composite "A" % positive scores, sorted from low to high: Hospital 1 = 33%, Hospital 2 = 48%, Hospital 3 = 52%, Hospital 4 = 60%, Hospital 5 = 63%, Hospital 6 = 64%, Hospital 7 = 66%, Hospital 8 = 70%, Hospital 9 = 72%, Hospital 10 = 75%, Hospital 11 = 75%, Hospital 12 = 78%.
10th percentile score = 48%; 50th percentile score = 65%.

10th percentile
1. For the 10th percentile, we would first multiply the number of hospitals by .10: (n x p = 12 x .10 = 1.2).
2. The product of n x p = 1.2, where j = 1 and g = .2. Since g is not equal to 0, the 10th percentile score is equal to the percent positive value of the hospital in the jth + 1 position:
   a. j equals 1.
   b. The 10th percentile equals the value for the hospital in the 2nd position = 48%.

50th percentile
1. For the 50th percentile, we would first multiply the number of hospitals by .50: (n x p = 12 x .50 = 6.0).
2. The product of n x p = 6.0, where j = 6 and g = 0. Since g = 0, the 50th percentile score is equal to the percent positive value of the hospital in the jth position plus the percent positive value of the hospital in the jth + 1 position, divided by 2:
   a. j equals 6.
   b. The 50th percentile equals the average of the values for the hospitals in the 6th and 7th positions: (64% + 66%)/2 = 65%.
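As a cross-check, the percentile rule described above can be written out directly. The sketch below is not the SAS implementation itself; it simply mirrors the n x p = j + g rule and reproduces the Table N3 example. Note that an exact g = 0 test can be sensitive to floating-point rounding in practice.

```python
import math

def percentile_sas_default(sorted_scores: list, p: float) -> float:
    """Percentile using the rule described above (SAS default definition):
    n*p = j + g; if g == 0, average the jth and (j+1)th values (1-based positions);
    otherwise take the (j+1)th value."""
    n = len(sorted_scores)
    product = n * p
    j = math.floor(product)
    g = product - j          # fractional portion; an exact 0 test may need a tolerance in practice
    if g == 0:
        return (sorted_scores[j - 1] + sorted_scores[j]) / 2  # average of positions j and j+1
    return sorted_scores[j]  # position j+1 in 1-based terms


# Table N3 example: Composite "A" percent positive scores for 12 hospitals, sorted low to high.
scores = [33, 48, 52, 60, 63, 64, 66, 70, 72, 75, 75, 78]
print(percentile_sas_default(scores, 0.10))  # 48   (n*p = 1.2 -> take the 2nd value)
print(percentile_sas_default(scores, 0.50))  # 65.0 (n*p = 6.0 -> average of the 6th and 7th values)
```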