Health Services Research & Development


IIR 09-095 – HSR&D Study


IIR 09-095
Improving Quality of Care Through Improved Audit and Feedback
Sylvia J. Hysong, PhD, MA, BA
Michael E. DeBakey VA Medical Center, Houston, TX
Funding Period: April 2010 - September 2013

BACKGROUND/RATIONALE:
VA has led the industry in measuring facility performance as a critical element of improving quality of care, investing substantial resources to develop and maintain valid, cost-effective measures. VA's External Peer Review Program (EPRP) is the official data source for monitoring facility performance and is used to prioritize the quality areas needing the most attention. Facility performance measurement has significantly improved preventive care, chronic care, and overall quality; however, considerable variability remains in both mean performance and rates of improvement across measures and facilities. Audit and feedback, an important component of effective performance measurement, can help reduce this variability and improve overall performance; previous research by the principal investigator suggests that VAMCs with high EPRP performance scores tend to use EPRP data as a feedback source. However, how EPRP data are used as a feedback source by individual providers and by service line, facility, and network leadership is not well understood. An in-depth understanding of the mental models, strategies, and specific feedback process characteristics adopted by high-performing facilities is thus urgently needed.

OBJECTIVE(S):
This research compares how leaders of high, low, and moderately performing VA Medical Centers (VAMCs) use clinical performance data, including that from VA's External Peer Review Program (EPRP), as a feedback tool to maintain and improve quality of care.

METHODS:
This project employed qualitative, grounded theory analysis of interviews with primary care clinicians and facility leadership of high, moderate, and low performing facilities. Sites were selected based on their clinical performance on a set of 17 EPRP outpatient measures. We conducted 48 interviews, looking for evidence of the following: (1) cross-facility differences in perceptions of performance data usefulness and (2) differences in strategies for disseminating performance data, with particular attention to timeliness, individualization, and punitiveness of feedback delivery.

FINDINGS/RESULTS:
CHANGES IN FACILITY CLINICAL PERFORMANCE 2008-2012
Performance assignments changed over time. Sites were originally selected based on their 2007-2008 clinical performance profiles. By the end of the project, sites' performance profiles had changed considerably, making it far more difficult to distinguish sites by the four performance categories used for their original selection. Notably, the national implementation of Patient Aligned Care Teams (PACT) occurred during data collection for this study and is a likely reason for the changes in clinical performance during this period.

PERCEPTIONS OF EPRP PERFORMANCE DATA USEFULNESS AS A FEEDBACK SOURCE
Mental models of EPRP as a feedback source, as well as mental models of clinical performance feedback in general, varied considerably by facility. They ranged from perceptions of EPRP as a starting point for feedback to perceptions that EPRP was an inaccurate source of clinical performance feedback. Three dimensions (positivity/negativity of feedback perceptions, intensity of feedback, and consistency of feedback received) can be used to categorize the universe of mental models observed. We observed no meaningful relationship between these dimensions and facilities' clinical performance in either 2008 or 2012.

STRATEGIES USED FOR CLINICAL PERFORMANCE FEEDBACK
As with mental models, the universe of strategies for providing clinical performance feedback was diverse and varied across sites. We observed 165 unique strategies for feeding back clinical performance information, which fell into four general classes: (1) computer interfaces used to deliver feedback, either facility-specific or national; (2) meetings, either dedicated to feedback or general meetings into which feedback was incorporated, led by either leadership or clinic staff; (3) written reports based on EPRP data, which may or may not be generated locally by quality management personnel; and (4) informal conversations, occurring among peers or between supervisors and subordinates. As with mental models, we observed no meaningful relationship between these strategy classes and facilities' clinical performance in either 2008 or 2012. A follow-up comparison of four sites that remained in their original performance category (one from each category) throughout the life of the study showed more similarities than differences among them in terms of strategy classes, reinforcing our finding of no meaningful relationship.

FOLLOW-UP ANALYSES
Changes in Feedback Strategies Since PACT Transition
Our interviews indicate that the transition to PACT resulted in, at best, modest changes to feedback delivery for clinicians. The Primary Care Almanac, a panel-management information tool, emerged as the most prominent change to clinical-performance feedback dissemination; facilities reported few, if any, appreciable changes to the assessment of clinical performance since transitioning to team-based care.

Comparisons of Sites Employing Similar Feedback Strategies
We additionally compared the performance of sites employing similar feedback strategies and/or mental models. No meaningful relationships among similar sites were observed.

IMPACT:
VA spends millions of dollars annually to support its facility performance management system and is uniquely positioned to lead the industry in outpatient performance measurement; however, this goal depends on VA's ability to leverage the data it collects and analyzes in meaningful ways. The existence of so many feedback strategies and the somewhat lukewarm perceptions of current feedback systems may suggest the need for a more evidence-based, strategic approach to feeding back clinical performance information to health care team members.

PUBLICATIONS:

Journal Articles

  1. Hysong SJ, Smitham K, SoRelle R, Amspoker A, Hughes AM, Haidet P. Mental models of audit and feedback in primary care settings. Implementation Science. 2018 May 30;13(1):73.
  2. Payne VL, Hysong SJ. Model depicting aspects of audit and feedback that impact physicians' acceptance of clinical performance feedback. BMC Health Services Research. 2016 Jul 13;16(1):260.
  3. Hysong SJ, Knox MK, Haidet P. Examining clinical performance feedback in Patient-Aligned Care Teams. Journal of General Internal Medicine. 2014 Jul 1;29 Suppl 2:S667-74.
  4. Hysong SJ, Smitham KB, Knox M, Johnson KE, SoRelle R, Haidet P. Recruiting clinical personnel as research participants: a framework for assessing feasibility. Implementation Science. 2013 Oct 24;8(1):125.
  5. Hysong SJ, Teal CR, Khan MJ, Haidet P. Improving quality of care through improved audit and feedback. Implementation Science. 2012 May 18;7(1):45.
Conference Presentations

  1. Payne V, Hysong SJ. A grounded provider feedback framework – the relationship between providers’ feedback acceptance and performance improvement. Poster session presented at: Diagnostic Error in Medicine Annual International Conference; 2014 Sep 16; Atlanta, GA.
  2. Hysong SJ, Knox M, Haidet P. Examining Clinical-Performance Feedback in Patient-Aligned Care Teams. Poster session presented at: AcademyHealth Annual Research Meeting; 2014 Jun 9; San Diego, CA.
  3. Hysong SJ, Smitham KB, SoRelle R, Knox M, Amspoker A, Haidet P. Mental Models of Outpatient Clinical Performance Feedback at VA Medical Centers. Poster session presented at: AcademyHealth Annual Research Meeting; 2014 Jun 9; San Diego, CA.
  4. Payne V, Hysong SJ. Providers’ Acceptance of Feedback and Factors Impacting Patient Management Behavior. Paper presented at: AcademyHealth Annual Research Meeting; 2014 Jun 8; San Diego, CA.
  5. Hysong SJ, Knox MK, Haidet P. U.S. Department of Veterans Affairs Medical Center Organizational Performance Project [Proyecto de Desempeño Organizacional de Centros Médicos del Departamento de Asuntos del Veterano de EEUU]. Poster session presented at: Puerto Rico Psychology Association Annual Convention [Sexagésima Convención de la Asociación de Psicología de Puerto Rico]; 2013 Nov 15; Ponce, Puerto Rico.
  6. Hysong SJ. Factores Organizacionales Relacionados al Desempeño Clínico Ambulatorio en los Hospitales del Departamento de Asuntos al Veterano. [Organizational Correlates of Outpatient Performance in Veterans Affairs Medical Centers]. Paper presented at: University of Puerto Rico Department of Psychology Monthly Meeting; 2013 Nov 12; Rio Piedras, Puerto Rico.
  7. Hysong SJ, Knox MK, Haidet P. Is it about Individuals or Teams? Examining Clinical-Performance Feedback in Patient-Aligned Care Teams. Paper presented at: ATLAS.ti Users Annual Conference; 2013 Sep 12; Berlin, Germany.
  8. Hysong SJ, Knox MK, Smitham K, Sorelle R. The Role of Organizational Culture on a Subculture of Feedback. Poster session presented at: ATLAS.ti Users Annual Conference; 2013 Sep 12; Berlin, Germany.
  9. Hysong SJ, Kell H, Petersen LA, Trautner B. Evidence-Based Design of Audit and Feedback Programs: Lessons Learned from Two Clinical Intervention Studies. Paper presented at: AcademyHealth Annual Research Meeting; 2013 Jun 24; Baltimore, MD.
  10. Hysong SJ, Broussard K, Knox M, Johnson K, Stelljes L, Gribble G, SoRelle R, Haidet P. Recruiting clinical personnel as research participants: A framework for estimating feasibility. Poster session presented at: AcademyHealth Annual Research Meeting; 2012 Jun 25; Boston, MA.


DRA: Health Systems
DRE: none
Keywords: Primary care, Quality assessment, Quality Measure
MeSH Terms: none
