As one of its core missions, the VA supports research intended to improve the health and well-being of veterans. Unethical research behavior that compromises the quality or validity of human subject research may directly harm research participants and indirectly harm others subsequently affected by the implementation of unreliable or invalid research results.
Study goals included: (1) the establishment and ongoing measurement of structures, processes, policies, and procedures within organizations; (2) evaluation of the institutional environment supporting integrity in the conduct of research; and (3) use of this knowledge for ongoing improvement.
Using our previously validated Survey of Organizational Research Climates (SOuRCe), administered via web- and mail-based survey, we collected baseline data on the research integrity climates of the research service at 42 VA facilities by canvassing all research-engaged staff. After randomly assigning the facilities in equal numbers to either a basic feedback arm or an enhanced feedback arm, we used the SOuRCe results as the basis of the intervention content. We developed PDF reports tailored to each participating facility, including summary information on seven dimensions of its research integrity climate, along with aggregated comparison data from all other facilities in the study. Where response counts were sufficient, reports also included the same scale information broken out by work role (e.g., Investigators & Faculty, Research Support Staff, Leadership or Administrative Staff, Graduate Students/Postdocs/Fellows) and by area of research (e.g., Clinical, Biomedical, Rehab R&D, Health Services R&D). We emailed local research leaders their respective reports. In the enhanced feedback arm, we also scheduled phone conversations to discuss the findings with these leaders. These discussions used the SOuRCe findings to draw research leaders' attention to areas where their local climates were particularly strong, to identify specific weaknesses in those climates, and to help leaders identify organizational actions and initiatives they might mount to bring about positive change. Several months after these interventions, we assessed via phone interviews with research leaders at the same facilities whether, and if so what types of, organizational change initiatives they had planned or implemented.
We distributed the SOuRCe summary reports to the Associate Chief of Staff for Research (ACOS-R) at 41 of the 42 sampled facilities in December 2014. One facility yielded too few respondents to the climate survey and was dropped from the intervention portion of the study. Of the ACOS-R at the remaining 41 facilities, 25 (61%) consented to participate in the phone-based intervention and follow-up qualitative interview.
Of the 25 ACOS-R who originally consented to participate, 14 were from facilities randomized to the enhanced arm. We were able to schedule and complete phone-based intervention calls with leaders at 12 of these 14 facilities (January through April 2015).
From June to September of 2015, using a semi-structured interview guide, we conducted qualitative follow-up telephone interviews with 21 of the 25 leaders (84%) who had originally consented to participate in the study, for a combined participation rate of 51% (84% of 61%).
Phone interviews were audio-recorded and professionally transcribed. From the transcriptions, one study team member (DM) coded a primary outcome measure of whether the research leader had undertaken any activity in response to the intervention (yes/no). For cases where some response to the intervention had occurred, we further coded whether that action was based on feedback from our summary report (yes/no) and rated, on a scale from 1 (low) to 10, the subjective likelihood that the particular action could lead to lasting change in the local climate (analyses pending). A second team member (MPC) re-coded a random half of the follow-up interviews, yielding 78% agreement. Discrepancies were resolved through an adjudication discussion among DM, MPC, and the PI (BCM). Given the less-than-optimal level of coder agreement, the team decided it would be prudent to duplicate-code and adjudicate 100% of the interviews; that work is pending at the time of this report's submission.
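Raw percent agreement, as reported above, can overstate reliability because two coders assigning the more common label will often match by chance; chance-corrected agreement (Cohen's kappa) is commonly reported alongside it. A minimal sketch of both statistics, using hypothetical yes/no codes rather than the study's actual coding data (which are not reproduced in this report):

```python
# Illustrative only: hypothetical yes/no codes from two raters on the
# same ten transcripts; the study's real coded data are not shown here.
coder_a = ["yes", "yes", "no", "yes", "no", "no", "yes", "no", "yes", "no"]
coder_b = ["yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes", "no"]

def percent_agreement(a, b):
    """Fraction of items on which both coders assigned the same label."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement for two raters with categorical labels."""
    n = len(a)
    p_observed = percent_agreement(a, b)
    labels = set(a) | set(b)
    # Expected agreement if the raters labeled independently,
    # each at their own base rate for every label
    p_expected = sum((a.count(lab) / n) * (b.count(lab) / n) for lab in labels)
    return (p_observed - p_expected) / (1 - p_expected)

print(percent_agreement(coder_a, coder_b))        # 0.8
print(round(cohens_kappa(coder_a, coder_b), 2))   # 0.6
```

With these hypothetical labels, 80% raw agreement corresponds to a kappa of 0.6, illustrating why a percent-agreement figure in the high 70s can still motivate full duplicate coding and adjudication.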
At the time of this report, we have completed analyses of the primary outcome of whether any activity was planned or undertaken. A higher proportion of ACOS-R had planned or taken some action in response to our summary report in the enhanced feedback arm (67%) than in the basic feedback arm (22%). Although this appears to be a sizable difference, the sample size of this pilot trial is too small to conclude that the rate of action was significantly greater in the enhanced arm: a Fisher exact test of whether these rates differ by study arm yielded p = 0.08.
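The reported percentages are consistent with 8 of 12 enhanced-arm and 2 of 9 basic-arm leaders taking action; note these cell counts are an inference from the rounded percentages and follow-up sample, not figures stated in the report. Given a 2x2 table of that form, a two-sided Fisher exact test can be computed directly from the hypergeometric distribution:

```python
from math import comb

def fisher_exact_two_sided(table):
    """Two-sided Fisher exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins whose probability does not exceed that of the observed table.
    """
    (a, b), (c, d) = table
    row1 = a + b          # e.g., enhanced-arm total
    col1 = a + c          # e.g., total who took action
    n = a + b + c + d

    def p_table(x):
        # Probability of x "action" cases in row 1, given fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, row1 - (n - col1))
    hi = min(col1, row1)
    # Small tolerance guards against floating-point ties
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical cell counts consistent with the reported 67% vs. 22%:
# enhanced arm: 8 acted, 4 did not; basic arm: 2 acted, 7 did not.
p = fisher_exact_two_sided([[8, 4], [2, 7]])
print(round(p, 2))  # 0.08
```

Under those assumed counts the computation reproduces the reported p = 0.08, illustrating why a roughly threefold difference in proportions can still fall short of conventional significance at this sample size.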
We also examined whether response to the feedback differed by our assessment of each facility leader's receptivity to QI feedback (an assessment process described in further detail in our grant application). Although we found no interpretable pattern along this dimension with respect to whether actions were taken in response to our feedback, we did find that leaders in the lowest tertile of our QI receptivity score were differentially non-participative in both the intervention and the follow-up interviews: 64% of the ACOS-R in the lowest tertile did not participate at all, compared with 36% in the middle tertile and 46% in the highest tertile.
With this project, we have (1) established baseline measures of research climates and research-related behaviors in the VA, (2) provided initial comparative feedback to VA leaders positioned to motivate positive change in settings where less-than-best practices may be occurring, and (3) shown that such feedback appears to have at least the potential to motivate and direct positive organizational change in research integrity climates in the VA. These pilot data suggest that following up written SOuRCe reports with phone-based feedback and discussion with research leaders may be more beneficial than providing written reports alone. Moreover, our results suggest that a strictly voluntary approach to this reporting and feedback process may lead ACOS-R who are not particularly receptive to QI feedback to opt out of participation, which would likely limit effectiveness.
- Martinson BC, Mohr DC, Charns MP, Nelson D, Hagel-Campbell E, Bangerter A, Bloomfield HE, Owen R, Thrush CR. Main outcomes of an RCT to pilot test reporting and feedback to foster research integrity climates in the VA. AJOB Empirical Bioethics. 2017 Jul 1; 8(3):211-219.
- Martinson BC, Nelson D, Hagel-Campbell E, Mohr D, Charns MP, Bangerter A, Thrush CR, Ghilardi JR, Bloomfield H, Owen R, Wells JA. Initial Results from the Survey of Organizational Research Climates (SOuRCe) in the U.S. Department of Veterans Affairs Healthcare System. PLoS ONE. 2016 Mar 11; 11(3):e0151571.
- Martinson BC. Results from the 2014 Survey of Organizational Research Climates in VA (SOuRCe). [Cyberseminar]. 2015 Mar 19.
- Martinson BC, Mohr D, Charns M, Bloomfield HE, Nelson DB, Thrush C, Owen R. Initial Results from the Survey of Organizational Research Climates (SOuRCe) in the VA. Poster session presented at: AcademyHealth Annual Research Meeting; 2015 Jun 15; Minneapolis, MN.
- Martinson BC. Can Research Integrity Be Incentivized? Paper presented at: National Academy of Sciences Institute for Laboratory Animal Research Roundtable Meeting; 2015 Jun 4; Washington, DC.
- Martinson BC. Reconsidering the Challenges: An Eco-Systemic Lens on Research Integrity. Presented at: Research Integrity Triennial World Conference; 2015 May 31; Rio de Janeiro, Brazil.
- Martinson BC. From bad apples to bad barrels: Complementary perspectives and narratives to understand undesirable, research-related behavior. Presented at: American Psychiatric Association Annual Meeting; 2015 May 16; Toronto, Canada.
- Martinson BC. How and Why Data Matters In Research Integrity. Paper presented at: American Association for the Advancement of Science Annual Meeting; 2015 Feb 15; San Jose, CA.
- Martinson BC. Understand Why Researchers Misbehave. Paper presented at: American Association for the Advancement of Science Annual Meeting; 2015 Feb 13; San Jose, CA.
- Martinson BC. Incentives and Disincentives to Quality Science: Systems Produce Exactly What They Are Designed To Produce. Presented at: Mayo Clinic Center for Clinical and Translational Science Grand Rounds; 2014 Sep 19; Rochester, MN.