IIR 09-095
Improving Quality of Care Through Improved Audit and Feedback
Sylvia J. Hysong, PhD MA BA
Michael E. DeBakey VA Medical Center, Houston, TX
Funding Period: April 2010 - September 2013
BACKGROUND/RATIONALE:
VA has led the industry in measuring facility performance as a critical element of improving quality of care, investing substantial resources to develop and maintain valid and cost-effective measures. VA's External Peer Review Program (EPRP) is the official data source for monitoring facility performance and is used to prioritize the quality areas needing the most attention. Facility performance measurement has significantly improved preventive care, chronic care, and overall quality; however, much variability still exists across measures and facilities, both in mean performance and in rates of improvement. Audit and feedback, an important component of effective performance measurement, can help reduce this variability and improve overall performance; previous research by the principal investigator suggests that VAMCs with high EPRP performance scores tend to use EPRP data as a feedback source. However, the manner in which EPRP data is used as a feedback source by individual providers as well as by service line, facility, and network leadership is not well understood. An in-depth understanding of the mental models, strategies, and specific feedback process characteristics adopted by high-performing facilities is thus urgently needed.

OBJECTIVE(S):
This research compares how leaders of high-, low-, and moderately performing VA Medical Centers (VAMCs) use clinical performance data, including data from VA's External Peer Review Program (EPRP), as a feedback tool to maintain and improve quality of care.

METHODS:
This project employed a qualitative, grounded-theory analysis of interviews with primary care clinicians and facility leadership at high-, moderate-, and low-performing facilities. Sites were selected based on their clinical performance on a set of 17 EPRP outpatient measures.
We conducted 48 interviews, looking for evidence of (1) cross-facility differences in perceptions of the usefulness of performance data and (2) differences in strategies for disseminating performance data, with particular attention to the timeliness, individualization, and punitiveness of feedback delivery.

FINDINGS/RESULTS:
CHANGES IN FACILITY CLINICAL PERFORMANCE, 2008-2012
Performance assignments changed over time. Sites were originally selected based on their clinical performance profiles in 2007-2008. By the end of the project, sites' performance profiles had changed considerably, making it far more difficult to distinguish the four categories by which sites were originally selected. It should be noted that the national implementation of Patient Aligned Care Teams (PACT) occurred during data collection for this study, a likely reason for the changes in clinical performance during this period.

PERCEPTIONS OF EPRP PERFORMANCE DATA USEFULNESS AS A FEEDBACK SOURCE
Mental models of EPRP as a feedback source, as well as mental models of clinical performance feedback in general, varied considerably by facility, ranging from perceptions of EPRP as a starting point for feedback to perceptions of EPRP as an inaccurate source of clinical performance feedback. The observed mental models can be categorized along three dimensions: positivity/negativity of feedback perceptions, intensity of feedback, and consistency of feedback received. We observed no meaningful relationship between these dimensions and facilities' clinical performance in either 2008 or 2012.

STRATEGIES USED FOR CLINICAL PERFORMANCE FEEDBACK
As with mental models, the strategies used to provide clinical performance feedback were diverse and varied across sites.
We observed 165 unique strategies for feeding back clinical performance information, which fall into four general classes: (1) computer interfaces, either facility-specific or national, used to deliver feedback; (2) meetings, either dedicated to feedback or general meetings into which feedback was incorporated, led by either leadership or clinic staff; (3) written reports based on EPRP data, which may or may not be generated locally by quality management personnel; and (4) informal conversations, occurring either among peers or between supervisors and subordinates. As with mental models, we observed no meaningful relationship between these strategy classes and facilities' clinical performance in either 2008 or 2012. A follow-up comparison of four sites that remained in their original performance category (one from each category) throughout the life of the study showed more similarities than differences among them in terms of strategy classes, reinforcing our finding of no meaningful relationship.

FOLLOW-UP ANALYSES
Changes in Feedback Strategies Since the PACT Transition
Our interviews indicate that the transition to PACT resulted in at best modest changes to feedback delivery for clinicians. The Primary Care Almanac, a panel-management information tool, emerged as the most prominent change to the dissemination of clinical performance feedback; facilities reported few, if any, appreciable changes to the assessment of clinical performance since transitioning to team-based care.

Comparisons of Sites Employing Similar Feedback Strategies
We additionally compared the performance of sites with similar feedback strategies and/or mental models; no meaningful relationships among similar sites were observed.
IMPACT:
VA spends millions of dollars annually to support its facility performance management system and is uniquely positioned to lead the industry in outpatient performance measurement; however, this goal depends on VA's ability to leverage the data it collects and analyzes in meaningful ways. The existence of so many feedback strategies, together with the somewhat lukewarm perceptions of current feedback systems, may suggest the need for a more evidence-based, strategic approach to feeding back clinical performance information to health care team members.

External Links for this Project:
NIH Reporter Grant Number: I01HX000172-01A1
Link: https://reporter.nih.gov/project-details/7870084
DRA: Health Systems Science
DRE: none
Keywords: Primary care, Quality assessment, Quality Measure
MeSH Terms: none