Health Services Research & Development
IIR 09-368 – HSR&D Study


Comparison of Fidelity Assessment Methods
Angela L Rollins PhD
Richard L. Roudebush VA Medical Center, Indianapolis, IN
Indianapolis, IN
Funding Period: November 2010 - October 2013

BACKGROUND/RATIONALE:
National policy has dramatically increased the emphasis on implementing evidence-based mental health services to meet the needs of people with severe mental illness, and the VHA has made great strides in providing effective, community-based services. One of the cornerstones of the VHA approach is Mental Health Intensive Case Management (MHICM), a model based on one of the most well-defined and empirically supported approaches: assertive community treatment. Most recently, VHA policy shifts have resulted in a proposed set of uniform mental health services to ensure access to a standard set of high-quality mental health services, such as MHICM, across the entire VHA. However, successful implementation of evidence-based practices on a broad scale requires psychometrically valid, yet practical, ways to assess and monitor degree of implementation (i.e., fidelity). Currently, the only rigorous "gold-standard" method to monitor implementation is an on-site fidelity visit, a very time-intensive and expensive approach for both the assessor and the program.

OBJECTIVE(S):
The primary objective of this study was to examine the effectiveness of innovative and potentially cost-effective methods to ensure the quality of mental health services for disabled veterans with mental illness. This study examined the reliability, concurrent validity, and incremental predictive validity of expert-rated self-report, phone, and on-site fidelity assessments for assertive community treatment. The study also explored the relative costs of each approach (cost identification).

METHODS:
We recruited 32 of VA's 111 MHICM teams to participate in our study. Volunteer sites participated in a phone-based and an on-site fidelity assessment with experienced fidelity assessors using the Dartmouth Assertive Community Treatment Scale (DACTS). The DACTS is a 28-item scale; each item is rated from 1 (low adherence to the model) to 5 (full adherence to the model). The order of phone and on-site assessments was counterbalanced, with blinded assessors, to reduce potential bias. Before the initial phone or on-site assessment, sites reported information about their team's functioning, which a separate pair of experienced DACTS raters subsequently used as the basis for DACTS ratings, yielding 3 assessment types: on-site, phone, and expert-rated self-report. For site recruitment, we used a stratified random sampling technique based on the type of VA facility served and the previous year's self-reported fidelity using global self-scoring.

We examined the level of agreement between fidelity approaches with intraclass correlations. To determine the incremental predictive validity of each fidelity method, we used zero-inflated binomial regression to compare reductions in hospitalization for veterans from the year prior to MHICM intake to the most recent year preceding the fidelity assessment. We compared costs across the three assessment methods using personnel and travel costs. We also included a formative evaluation to inform future dissemination of fidelity assessment methods in the VHA and elsewhere.
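The agreement analysis described above can be sketched as follows. This is an illustrative two-way random-effects intraclass correlation, ICC(2,1), computed from a subjects-by-raters score matrix; the abstract does not specify which ICC variant or statistical software the study used, so both the variant and the example data below are assumptions for illustration only.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """ICC(2,1) for an (n subjects x k raters) score matrix,
    from a two-way ANOVA without replication."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject (team) means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Mean squares for rows (subjects), columns (raters), and error.
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical example: two phone raters scoring five teams on a
# 1-5 DACTS-style item (not actual study data).
phone = np.array([[3, 3], [4, 4], [2, 3], [5, 5], [3, 4]])
print(round(icc_2_1(phone), 2))  # → 0.81
```

With paired raters, as in the study's reliability analyses, agreement near or above .70 (the cutoff used below) would count as acceptable under this sketch's assumptions.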

FINDINGS/RESULTS:
Teams showed modest fidelity to the assertive community treatment model based on the on-site fidelity assessment method. Mean DACTS scores were 3.38 (SD=.41) for the Human Resources subscale, 3.76 (SD=.38) for the Organizational Boundaries subscale, 2.66 (SD=.33) for the Services subscale, and 3.22 (SD=.28) for the total DACTS score.

Inter-rater reliability
Analyses indicated good inter-rater agreement for both the phone and expert-rated self-report assessments. For the Human Resources, Organizational Boundaries, and Services subscales and the total DACTS score, respectively, intraclass correlations for inter-rater agreement were .96, .81, .78, and .92 between phone raters, and .92, .87, .84, and .91 between expert-rated self-report raters.

Concurrent validity
Agreement among the phone, expert-rated self-report, and on-site methods was high for the total DACTS score and most subscales. The only comparison that did not reach our a priori cutoff of .70 for minimum agreement was the Organizational Boundaries subscale for expert-rated self-report (ICC = .67). Intraclass correlations for the Human Resources, Organizational Boundaries, and Services subscales and the total DACTS score, respectively, were:
- Phone vs. on-site: .92, .85, .84, and .88
- Expert-rated self-report vs. on-site: .92, .67, .79, and .84
- Phone vs. expert-rated self-report: .95, .86, .76, and .91

Predictive validity
We found no differences in incremental predictive validity among the three methods; all programs in the study showed significant reductions in hospital days for consumers.

Cost identification
Costs for the on-site and phone assessments were analyzed using only the method administered first at each site, to avoid biasing personnel-effort estimates at sites completing two successive assessments. To estimate costs that would translate to real-world use of phone or expert-rated self-report assessment with a single rater, we averaged the time devoted by the two assessors and used that effort in the assessor cost calculation. On-site assessments administered first (n=19) averaged $2,579 per site, comprising an average of $1,663 in personnel costs and $916 in travel costs. Phone assessments administered first (n=13) averaged $571, and expert-rated self-report assessments (n=32) averaged $553.

Formative evaluation results
Despite the favorable results for the remote fidelity assessment methods, most respondents (75%) in follow-up interviews preferred on-site assessment over phone assessment. They cited assessor traits such as knowledge of the clinical model; greater perceived accuracy of on-site assessments (e.g., it is easier to communicate about the program in person, and the assessor can "see" the program in action); the personal contact the visit provided; and the informal feedback received throughout the visit, particularly from an "outsider." Negative feedback on on-site visits centered on the amount of time required to complete the assessment. Positive comments about phone assessments most often cited the minimal time spent away from clinical duties.

IMPACT:
Phone and expert-rated self-report fidelity assessments compared favorably to on-site methods in terms of reliability, concurrent validity, and cost. If used appropriately, these alternative protocols hold promise for monitoring large-scale program fidelity with limited resources. This project addresses a critical need in the VA system to effectively and efficiently monitor mental health programs' adherence to model standards for quality.

PUBLICATIONS:

Journal Articles

  1. Rollins AL, Kukla M, Salyers MP, McGrew JH, Flanagan ME, Leslie DL, Hunt MG, McGuire AB. Comparing the Costs and Acceptability of Three Fidelity Assessment Methods for Assertive Community Treatment. Administration and Policy in Mental Health. 2017 Sep 1; 44(5):810-816.
  2. Rollins AL, McGrew JH, Kukla M, McGuire AB, Flanagan ME, Hunt MG, Leslie DL, Collins LA, Wright-Berryman JL, Hicks LJ, Salyers MP. Comparison of Assertive Community Treatment Fidelity Assessment Methods: Reliability and Validity. Administration and Policy in Mental Health. 2016 Mar 1; 43(2):157-67.
  3. Salyers MP, Rollins AL, Kelly YF, Lysaker PH, Williams JR. Job satisfaction and burnout among VA and community mental health workers. Administration and Policy in Mental Health. 2013 Mar 1; 40(2):69-75.
  4. Rollins AL, Bond GR, Jones AM, Kukla M, Collins LA. Workplace social networks and their relationship with job outcomes and other employment characteristics for people with severe mental illness. Journal of Vocational Rehabilitation. 2011 Jul 1; 35(3):243-252.
  5. Salyers MP, Stull LG, Rollins AL, Hopper K. The work of recovery on two assertive community treatment teams. Administration and Policy in Mental Health. 2011 May 1; 38(3):169-80.
  6. Rollins AL, Salyers MP, Tsai J, Lydick JM. Staff turnover in statewide implementation of ACT: relationship with ACT fidelity and other team characteristics. Administration and Policy in Mental Health. 2010 Sep 1; 37(5):417-26.
Conference Presentations

  1. Rollins AL. Comparison of the Costs of On-site and Remote Fidelity Assessment Methods. Poster session presented at: International Center of Mental Health Policy and Economics Annual Conference; 2015 Mar 27; Venice, Italy.
  2. Rollins AL. A Comparison of Fidelity Assessment Methods. Paper presented at: National Institute of Mental Health Mental Health Services Research Annual Conference; 2014 Apr 24; Bethesda, MD.
  3. Rollins AL. Move your recovery meter: Tools for mental health intensive case management teams. Paper presented at: VA Mental Health Intensive Case Management Annual Conference; 2013 Aug 28; Washington, DC.
  4. Rollins AL. Illness management and recovery overview. Presented at: VA Mental Health Intensive Case Management Annual Conference; 2013 Aug 28; Washington, DC.
  5. Salyers MP, Rollins AL. Reducing Staff Turnover and Burnout: Tips for Team Leaders. Paper presented at: Assertive Community Treatment Association Annual Conference; 2011 May 13; Huntington Beach, CA.


DRA: Health Systems
DRE: Research Infrastructure
Keywords: Statistical Methods
MeSH Terms: none
