SDR 11-399 – HSR&D Study
Describing Variation in IRB Efficiency, Quality and Procedures
Daniel E. Hall MD MDiv MHSc
VA Pittsburgh Healthcare System University Drive Division, Pittsburgh, PA
Funding Period: July 2012 - December 2014

BACKGROUND/RATIONALE:
Data demonstrate unjustifiable variation in the quality and efficiency of IRB review. We therefore attempted to improve the quality and efficiency of the VA IRB review process through an in-depth analysis of IRB processes at 10 VA sites, including the Central IRB.

OBJECTIVE(S):
Using a combination of systems engineering, qualitative, and quantitative methods, we: (1) developed current-state process flow maps of the IRB review processes at each site; (2) measured variation in the efficiency of IRB review in terms of IRB review times; (3) measured variation in the quality of IRB review using the IRB Researcher Assessment Tool (IRB-RAT), the Common Rule Criteria for IRB review, and the Office of Human Research Protections' algorithm for determining the appropriate type of review (e.g., expedited, exempt, full board); and (4) identified high-impact, high-feasibility initiatives for process improvement.

METHODS:
(1) To develop flow maps of the IRB review process, collaborators from the Veterans Engineering Resource Center/Office of Systems Redesign (VERC/OSR) at the Pittsburgh VA conducted a 1-day site visit at each of the 10 study sites. During the visit, the VERC conducted focus groups with IRB stakeholders (e.g., IRB Chairs, IRB reviewers and investigators) to map the review process beginning with the submission of a new IRB protocol and ending with its final determination, either approval or disapproval. Analysis focused on comparing similarities and differences between sites to identify opportunities for process improvement.

(2) To measure variation in IRB review times, we collected IRB records from each site for a random sample of up to 45 protocols (15 exempt, 15 expedited, 15 full board) that received a final determination between January 2010 and December 2011. Two trained coders extracted the time for each incremental step in the review process, allowing us to calculate overall and incremental review times.
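As an illustration, the incremental and overall review times described above can be derived from milestone dates in an IRB record. This is a minimal sketch; the milestone names and dates below are hypothetical, not drawn from the study data:

```python
from datetime import date

# Hypothetical milestones for a single protocol's review (not study data)
milestones = [
    ("submitted",           date(2010, 3, 1)),
    ("pre_review_complete", date(2010, 3, 15)),
    ("board_determination", date(2010, 4, 20)),
    ("final_approval",      date(2010, 5, 5)),
]

# Days elapsed for each incremental step in the review process
increments = {
    f"{start}->{end}": (d_end - d_start).days
    for (start, d_start), (end, d_end) in zip(milestones, milestones[1:])
}

# Overall review time: submission to final determination
total_days = (milestones[-1][1] - milestones[0][1]).days
```

Summing the incremental steps reproduces the overall review time, which is a useful consistency check when abstracting dates from paper records.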

(3) We measured the quality of IRB review in three ways:
(a) Trained coders analyzed the sampled IRB materials for evidence of the Common Rule Criteria. The Common Rule requires IRBs to assess 8 criteria, such as risk minimization and equitable participant selection. The coders examined the documents for evidence that each criterion was evaluated and rated how thoroughly it was evaluated. Analysis focused on comparing the thoroughness of evaluation between sites and across each of the 8 criteria.
(b) Coders also used OHRP algorithms to determine the appropriate level of review (exempt, expedited, or full board) based on the merits of the approved protocol. Analysis focused on comparing the actual review level with the expected review level based on the coders' assessment.
(c) We sent an anonymous, on-line survey to 252 IRB members and 428 investigators who serve on or use the 10 VA IRBs in our study. The survey included a revised and shortened version of the IRB-RAT developed by the team for use in VA settings. Analysis focused on identifying the IRB functions and activities at each site and across all sites that were in greatest need of improvement.
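At a high level, the OHRP decision charts used in (b) reduce to a cascade of category checks. The sketch below is a deliberate simplification with hypothetical predicate names, not the full regulatory algorithm:

```python
def expected_review_level(qualifies_exempt_category: bool,
                          minimal_risk: bool,
                          fits_expedited_category: bool) -> str:
    """Hypothetical simplification of the OHRP review-level decision charts.

    Real determinations depend on detailed regulatory categories; this
    sketch only captures the overall cascade from exempt, to expedited,
    to full board review.
    """
    if qualifies_exempt_category:
        return "exempt"
    if minimal_risk and fits_expedited_category:
        return "expedited"
    return "full board"
```

Comparing such an expected level against the level the IRB actually applied is the basis of the agreement analysis reported in the findings.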

(4) We conducted video teleconferences at each site to share our findings regarding review times and quality and facilitate brainstorming of initiatives for quality improvement.

FINDINGS/RESULTS:
(1) Although the flow maps demonstrate broad similarity in IRB review processes across sites, each site's process was unique, with substantial between-site differences. The greatest variability was in the roles of the information security officer (ISO) and privacy officer (PO), which often introduced inefficient re-work. We also noted that the Research and Development Committee (R&DC) typically added little or no value to the review process despite adding approximately 14 days to the review time. We therefore recommend clarifying the overlapping roles of the IRB, ISO and PO to optimize process design. We also recommend changing policy to expedite R&DC review by authorizing the chair of the R&DC to certify the approvals of the HRPP subcommittees, including the IRB.

(2) Two raters extracted IRB review times from 48 exempt, 106 expedited and 124 full board protocols. The two raters agreed on 3038 of 3090 time points (98.3%) abstracted from a sub-sample of 72 (22.4%) protocols. Total review times ranged from 24 to 310 days, with means of 93 ± 51, 107 ± 54 and 131 ± 63 days for expedited, exempt and full board protocols, respectively. Multivariable models using robust variance estimation, controlling for site and review level, demonstrate that, on average, expedited protocols were reviewed 45 days faster than full board protocols (95% CI = 31-59 days). There was no significant difference between full board and exempt reviews. We found significant between-site differences, with some sites approaching a consensus panel goal of 60 days for IRB review and other sites needing improvement to reach this goal.

(3a) We determined the quality of IRB review in terms of the Common Rule Criteria in 117 expedited and 134 full board protocols. Inter-rater agreement was good in a sub-sample of 74 protocols (kappa = 0.76). Across all 10 IRBs, 94.3% of the criteria were assessed. However, only 32.3% of the assessments were explained with substantive detail; most (62.0%) were unexplained (e.g., a checked box on a checklist). Multinomial logistic regression controlling for site, criterion and review type demonstrated that expedited reviews are less likely than full board reviews to have explained assessments. Significant interactions between site and criteria demonstrate differences that might guide criterion- and site-specific initiatives for quality improvement.

(3b) We applied the OHRP algorithm to determine the appropriate level of review for 63 exempt, 120 expedited and 138 full board protocols. Inter-rater agreement on OHRP level of review was excellent in a sub-sample of 67 protocols (kappa= 0.97). Agreement across all 10 IRBs between IRB determination and expected level of review based on OHRP criteria was 73.5% (kappa=0.59, p<0.001). Site-specific agreement ranged from 95.1% to 13.3%. Poisson regression showed that IRBs conducted full board reviews more frequently than expected (p<0.001), with significant variability across sites. Overall, almost half (48.9%) of protocols given full board review could have been expedited according to OHRP criteria, and this proportion varied from 86.7% to 0.0% across sites. System redesign at some sites might reduce unnecessary variation and improve efficiency.
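The percent agreement and Cohen's kappa figures reported above can be computed directly from paired rater labels. A minimal sketch (the rater determinations in the example are hypothetical):

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of items on which two raters gave the same label."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohen_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(r1)
    observed = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Expected agreement under independent marginal label frequencies
    expected = sum(c1[label] * c2[label] for label in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical review-level determinations by two coders for six protocols
rater1 = ["exempt", "expedited", "expedited", "full", "full", "full"]
rater2 = ["exempt", "expedited", "full",      "full", "full", "full"]
```

Kappa discounts agreement expected by chance from the raters' marginal label frequencies, which is why it runs lower than raw percent agreement when one category dominates.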

(3c) Response rates for the revised IRB-RAT were 65.5% for IRB members and 52.6% for investigators. The survey included 27 statements describing IRB activities (e.g., "An IRB that is open to reversing its earlier decisions"). Respondents indicated how well each statement described both their "actual" and "ideal" IRB. For each statement, we calculated the difference between the ratings of the ideal and actual IRBs. Using random effects modeling, we identified the statements that fell outside the 95% CI, corresponding to the functions closest to (and furthest from) the ideal IRB. Stakeholders agreed that the IRB was closest to the ideal when protecting human subjects, treating investigators with respect, and taking appropriate action for alleged scientific misconduct. The IRB was furthest from the ideal regarding the timeliness of review, the allocation of sufficient resources to the IRB, and the provision of complete rationales for required changes to or disapprovals of protocols. We also identified noteworthy site-level differences in the ratings of each IRB activity. Our method may help IRB stakeholders identify and monitor site- and activity-specific initiatives for quality improvement.
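The ideal-minus-actual gap scoring described above can be sketched as follows. The statements and ratings are hypothetical, and the random-effects step is replaced with a simple mean gap per statement:

```python
# Hypothetical (actual, ideal) ratings per respondent for three statements
ratings = {
    "Timeliness of review":            [(3, 7), (2, 6), (4, 7)],
    "Protecting human subjects":       [(6, 7), (7, 7), (6, 6)],
    "Complete rationales for changes": [(3, 6), (4, 7)],
}

# Mean ideal-minus-actual gap per statement; a larger gap means the
# activity is further from the ideal IRB
gaps = {
    stmt: sum(ideal - actual for actual, ideal in pairs) / len(pairs)
    for stmt, pairs in ratings.items()
}

# Rank statements by gap to prioritize quality-improvement targets
priorities = sorted(gaps, key=gaps.get, reverse=True)
```

Ranking statements by gap is what surfaces the highest-priority targets for quality improvement at each site.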

(4) Video teleconferences were held at 9 of the 10 sites. The VERC/OSR facilitated brainstorming sessions unique to each site that culminated in specific, high-impact, high-feasibility initiatives for process improvement.

IMPACT:
Our findings demonstrate that IRB review in the VA is better than expected in terms of both review times and review quality. However, there is significant room for improvement. We have made concrete recommendations to improve the guidance regarding the roles of the ISO, PO and R&DC in the review process. We have also developed tools that can be used for ongoing assessment of quality improvement initiatives. In particular, we think the revised IRB-RAT could be used periodically to track the perceived performance of IRBs throughout the VA.

PUBLICATIONS:

Journal Articles

  1. Varley PR, Feske U, Gao S, Stone RA, Zhang S, Monte R, Arnold RM, Hall DE. Time required to review research protocols at 10 Veterans Affairs Institutional Review Boards. The Journal of surgical research. 2016 Aug 1; 204(2):481-9.
  2. Hall DE, Feske U, Hanusa BH, Ling BS, Gao S, Switzer GE, Dobalian A, Fine MJ, Arnold RM. Prioritizing Initiatives for Institutional Review Board (IRB) quality Improvement. AJOB empirical bioethics. 2016 May 13; doi: 10.1080/23294515.2016.1186757.
  3. Hall DE, Hanusa BH, Ling BS, Stone RA, Switzer GE, Fine MJ, Arnold RM. Using the IRB Researcher Assessment Tool to Guide Quality Improvement. Journal of Empirical Research on Human Research Ethics: JERHRE. 2015 Dec 1; 10(5):460-9.
  4. Hall DE, Hanusa BH, Stone RA, Ling BS, Arnold RM. Time required for institutional review board review at one Veterans Affairs medical center. JAMA surgery. 2015 Feb 1; 150(2):103-9.
Center Products

  1. Hall DE. IRB quality and efficiency (Office of Research Oversight Live Meeting). 2015 Jan 13.
  2. Hall DE. Describing IRB efficiency, quality and procedures: an interim report [Invited Presentation to VA Office of Research and Development]. 2013 Oct 9.
Conference Presentations

  1. Hall DE, Feske U, Gao S, Stone RA, Zhang S, Arnold R. The Time Required to Review Research Protocols at 10 IRBs in the Veterans Health Administration. Paper presented at: Academic Surgical Annual Congress; 2016 Feb 3; Jacksonville, FL.
  2. Hall DE, Ling BS, Feske U, Gao S, Stone RA, Zhang S, Zickmund SL, Arnold RM, Lidz C. Describing the Quality of Review at 10 Institutional Review Boards in the Veterans Health Administration. Paper presented at: American Society for Bioethics and Humanities Annual Meeting; 2015 Oct 22; Houston, TX.
  3. Hall DE, Ling BS, Feske U, Gao S, Stone RA, Zhang S, Arnold RM. Systematic Bias Against Expedited Review Procedures Across Institutional Review Boards in the Veterans Health Administration. Paper presented at: American College of Surgeons Annual Clinical Congress; 2015 Oct 6; Chicago, IL.
  4. Klune R, Feske U, Stone RA, Hanusa BH, Gao S, Zhang S, Ling BS, Lidz C, Switzer GE, Dobalian A, Arnold RM, Hall DE. Prioritizing initiatives for IRB quality improvement. Paper presented at: VA Association of Surgeons Annual Meeting; 2015 May 3; Miami, FL.
  5. Hall DE, Hanusa BH, Fine MJ, Arnold RM. The time required for IRB review at one VA Medical Center. Paper presented at: VA Association of Surgeons Annual Meeting; 2014 Apr 6; New Haven, CT.


DRA: Health Systems
DRE: Research Infrastructure
Keywords: Best Practices, Gap Analysis, Quality Indicators, Systems Engineering
MeSH Terms: none
