Untreated disorders (e.g., PTSD, depression, addiction). PTSD, depression, and addiction (e.g., substance use disorder, alcohol use disorder, pathological gambling) are some of the most prominent psychiatric disorders among veterans. These disorders sometimes go untreated due to veterans' lack of awareness of the disorders, lack of access to VA care, or some combination of the two. Note that we included pathological gambling in addition to alcohol/substance use disorder because (a) it is the only other addiction that can currently be diagnosed per the DSM-IV; and (b) it is believed to be a problem for some veterans, especially among younger cohorts (e.g., OEF/OIF). Untreated SUD and mental health disorders are of particular concern because the sooner these disorders are identified and treated, the more likely a positive outcome, and at lower cost to VA. (Delayed identification and treatment result in more chronic cases and more intensive interventions later on, which ultimately burden the VA system more than early detection would.)
In recent years, interactive voice response (IVR) technology has been applied both within and outside of VA to conduct screening. IVR is an automated telephone-based method by which a caller uses his/her telephone keypad to respond to questions. Such automated interventions have been shown to be efficacious in a number of areas, including screening and intervention (such as helping alcohol abusers to moderate or quit drinking). Once created, they are inexpensive to operate. An advantage of computer-controlled systems is that the screening or intervention is exactly specified and does not vary between personnel (e.g., assessors or therapists), while still allowing tailored design for different content areas. Computer telephony offers some advantages even over screen-based programs (e.g., Internet, stand-alone computer) because it uses a technology that is well established, low cost, and accessible to anyone (the telephone). It is also more interactive and "person-like" than screen-based programs: patients and other users have been shown to anthropomorphize the voice of telephony counseling programs and to treat it like a human health professional conducting a telephone screening.
This proposal was grounded in implementation science (e.g., access to care issues, especially within the framework of Greenhalgh, et al., 2004), assessment theory (the comparison of different methodologies for the same constructs), and technology advancement (use of IVR and web). For example, the Greenhalgh model specifies a comprehensive and integrative model of factors impacting implementation success. The model considers: 1) system antecedents for innovation (general readiness for innovation), 2) system readiness for innovation (readiness specific to the current innovation), 3) characteristics of the innovation, 4) methods of communication and influence, 5) the "outer context" (e.g., incentives, mandates, environmental stability), 6) "linkage" between the change agency and the user system both during design of the intervention and implementation, 7) characteristics of the implementation process, and 8) characteristics of the adopters. In this project, we focused on points 4-8: veterans themselves are the "adopters" of this screening process.
This project had three aims.
1) To conduct automated telephone screening of 400 veterans in the community to help identify those with untreated PTSD, depression, and/or addictions (substance use disorder, alcohol use disorder, pathological gambling). Such a project could help veterans who are not currently accessing VA care to begin doing so; could help identify PTSD, depression, and/or addictions sooner rather than later; and, due to its automation, is efficient, cost-effective, accessible, and confidential.
2) To conduct a second phase for 25% of the sample in which they would come to the VA to complete the same assessment in person, to compare the reliability of the IVR screening with the in-person screening. We also planned to evaluate whether the veterans followed up on use of VA care. (The latter is obtainable from the veterans' medical records within the electronic medical record system). These data could provide rigorous validation of whether the IVR system is actually helpful in promoting greater access to VA care, and can help quantitatively assess the value of the IVR system.
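The reliability comparison in aim 2 would typically be quantified as chance-corrected agreement between the paired IVR and in-person screen results. A minimal sketch of such an analysis follows, using Cohen's kappa; the paired results below are invented for illustration only (the project ultimately collected too few calls to run this analysis).

```python
# Hypothetical concordance analysis for paired binary screen results.
# Each pair is (ivr_positive, in_person_positive); data are made up.

def cohens_kappa(pairs):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(pairs)
    observed = sum(a == b for a, b in pairs) / n
    p_ivr = sum(a for a, _ in pairs) / n          # positive rate on IVR
    p_person = sum(b for _, b in pairs) / n       # positive rate in person
    # Chance agreement: both positive or both negative by chance.
    expected = p_ivr * p_person + (1 - p_ivr) * (1 - p_person)
    return (observed - expected) / (1 - expected)

# Illustrative paired PTSD screen results for 10 veterans.
pairs = [(1, 1), (1, 1), (0, 0), (0, 0), (1, 0),
         (0, 0), (1, 1), (0, 1), (0, 0), (1, 1)]
print(round(cohens_kappa(pairs), 2))  # -> 0.6 (moderate agreement)
```

A kappa near 1 would support the hypothesis of high IVR/in-person concordance; values near 0 would indicate agreement no better than chance.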
3) To study a variety of scientific questions related to aim 1, such as: factors that influence access to VA care, rates of PTSD and addictions among those screened, contextual factors such as sociodemographic variables and proximity to VA care.
We hypothesized that a significant percentage of the 400 veterans would (a) meet criteria for one or more disorders based on our screening and (b) not currently be accessing VA care for those disorders. We also hypothesized that there would be high concordance between the IVR and in-person assessments, and that the following factors would predict a veteran being positive on both (a) and (b): geographic distance from VA; perceived obstacles to VA care; younger cohort; ethnic/racial minority status; and female gender.
Sample. Our targeted sample size was 400 veterans (200 per site). The only inclusion criteria for callers were: (a) willingness to answer telephone-based screening; and (b) veteran status. We anticipated that some non-veterans would be likely to call by mistake (our recruitment materials specified veterans as the targeted callers); data on non-veterans was not to be used.
Measures. The screening used the following measures: the AUDIT for drinking problems; the Severity of Dependence Scale for drug problems; the 4-item PTSD screen and the SPRINT-E; the 2-item PHQ (with those screening positive then being asked the 9-item version); the 4-item Lie-Bet screen for pathological gambling; the GAD-7 for anxiety; basic sociodemographic and background information (age, gender, ethnicity, branch of military service, and cohort such as OEF/OIF); and use of VA care (perceived barriers versus benefits; whether there was prior use of VA care; and likelihood of accessing VA care as a result of the screening on this project). The screening was estimated to take 7-10 minutes.
Note: we specifically did not ask about suicidality or other imminently dangerous behavior: (a) to avoid confusion with existing 800 numbers for VA suicide hotlines and other emergency procedures; and (b) because this project did not include immediate clinical intervention (callers who screened positive for any disorder were encouraged to call their local VA, but doing so was voluntary on their part).
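The PHQ gating described above (administer the 2-item screen first, then the full 9-item version only to those screening positive) is a standard adaptive branching pattern in IVR scripts. A minimal sketch, assuming the conventional PHQ-2 cutoff of a total score of 3 or greater and the standard 0-3 frequency ratings per item; item wording is omitted:

```python
# Sketch of PHQ-2 -> PHQ-9 gating logic, as an IVR script might branch.
# The `ask` callable stands in for posing one item and reading a 0-3
# keypad response; all names here are illustrative, not project code.

PHQ2_CUTOFF = 3  # widely used PHQ-2 screening threshold

def administer_phq(ask):
    """Administer PHQ-2; continue to the full PHQ-9 only if positive."""
    phq2_total = ask("item 1") + ask("item 2")
    if phq2_total < PHQ2_CUTOFF:
        return {"phq2": phq2_total, "phq9": None, "positive": False}
    # Positive PHQ-2 screen: administer the remaining 7 items.
    phq9_total = phq2_total + sum(ask(f"item {i}") for i in range(3, 10))
    return {"phq2": phq2_total, "phq9": phq9_total, "positive": True}

# Simulated caller endorsing every item at "more than half the days" (2):
result = administer_phq(lambda item: 2)
print(result)  # PHQ-2 = 4, so all nine items run (PHQ-9 total = 18)
```

Gating this way keeps calls short for the many callers who screen negative, which matters given the 7-10 minute target length for the full battery.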
Economic analysis: Dr. Hendricks, our project collaborator, is an expert on cost/benefit analyses. On this project, we planned to identify: the cost of the IVR system (in toto: both technology and labor costs to implement); and the ratio of clinically relevant cases "captured" by the system (veterans with diagnosable disorders who were not currently in VA care for them) to the total number of screenings.
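The planned cost metric reduces to simple arithmetic: total system cost divided by the number of captured cases. A sketch follows; all dollar figures and rates are made-up placeholders, not project budget data.

```python
# Illustrative cost-per-captured-case computation. "Captured" cases are
# screen-positive veterans not already in VA care for the disorder.
# All inputs below are invented for illustration.

def cost_per_captured_case(tech_cost, labor_cost, n_screened, capture_rate):
    """Total IVR cost (technology + labor) per clinically relevant case."""
    captured = n_screened * capture_rate
    return (tech_cost + labor_cost) / captured

# e.g., $5,000 technology + $10,000 labor, 400 screenings completed,
# 20% of callers captured:
print(cost_per_captured_case(5000, 10000, 400, 0.20))  # -> 187.5
```

Because the technology cost is largely fixed once the system is built, the per-case cost falls as screening volume rises, which is the usual economic argument for automated screening.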
Health disparities: Because we were collecting basic demographic data, we planned to identify the rate of clinically relevant cases "captured" by the system (veterans with diagnosable disorders who were not currently in VA care for them) based on disparity factors such as age, ethnicity/race, service branch, service cohort, geographic distance from VA, and socioeconomic status. We thus would be able to address whether the veterans we identified as needing care differed, on average, from those already in VA care.
Our project took place at two locations: Boston and Syracuse (the sites of the PI and Co-I Dr. Ouimette). After completing the screening, the caller was to be provided with feedback relevant to pursuing care: (a) callers who screened positive for PTSD, depression, or an addiction would be encouraged to seek VA care; (b) the IVR system was programmed to provide referral phone numbers to the local VA for further assessment; and (c) if desired, callers could hear normative data relevant to each screen (e.g., how someone of their age and gender typically scores on the screening measures, where such information is known). To recruit callers we had planned to use ads in local newspapers, bus and subway venues, posting of fliers, and online sources (e.g., Craigslist). All calls were fully anonymous, and no identifying data (e.g., name, location, phone number) were to be collected. We planned to conduct the screening until we achieved our targeted sample size, which we believed to be realistic within the timeframe and budget of the project, as well as sufficient for data analytic purposes.
In addition, at the end of the IVR phone call, we planned to invite the veteran to leave his/her phone number for us to call back so that we could assist in a more customized fashion with screening and referral. (As an alternative, we also provided our phone number in case the veteran preferred to call us rather than having us call them.) If the veteran was willing to be involved with this "second phase" contact, we would then call him/her and invite them into the VA to meet with a member of our study team. The study team member would be one qualified to conduct a diagnostic screening and to refer the client to further care within the VA (e.g., a psychology intern or postdoctoral trainee). The team member would (a) consent the veteran (who would no longer be anonymous during this phase); (b) conduct the same screening as had been done by the IVR system; and (c) offer specific referral ideas in collaboration with the veteran's preferences (e.g., which clinic might be most appropriate; setting up a clinical consultation or intake). We would conduct this "second phase" procedure until at least 25% of our sample (n=100) had come in. For veterans who lived too far away to come into the VA, we would complete the consent by mail and phone and then conduct the assessments by phone, thus allowing us to reach these veterans as well.
Aims 1 and 2. Our data analysis was designed to address descriptive statistics (e.g., percent who screen positive for PTSD, depression, addictions); and correlation/regression analyses (sociodemographic and background variables that may predict perceptions of VA care). In addition to these central questions, we planned to use data from the IVR system to explore issues such as how long callers stayed on the line, number of hangups prior to completing the screening, etc. These could help inform how to improve the IVR screening both for our study and future studies. We also planned to explore how veterans in treatment versus those not in treatment differed (e.g., in sociodemographics and other variables).
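The descriptive portion of the planned analysis (screen-positive percentages plus call-log metrics such as completion rate) can be sketched as follows; the records below are invented for illustration, since the project collected too few calls to analyze.

```python
# Hypothetical descriptive analysis over IVR call records. Each record
# holds binary screen results and whether the caller finished the script.
# All data here are made up for illustration.

records = [
    {"ptsd": 1, "depression": 0, "addiction": 0, "completed": True},
    {"ptsd": 1, "depression": 1, "addiction": 0, "completed": True},
    {"ptsd": 0, "depression": 0, "addiction": 1, "completed": True},
    {"ptsd": 0, "depression": 0, "addiction": 0, "completed": False},
]

def percent_positive(records, disorder):
    """Percent screening positive, among callers who completed the script."""
    done = [r for r in records if r["completed"]]
    return 100 * sum(r[disorder] for r in done) / len(done)

for d in ("ptsd", "depression", "addiction"):
    print(d, round(percent_positive(records, d), 1))

# Hang-ups before completion inform how to improve the script itself.
completion_rate = 100 * sum(r["completed"] for r in records) / len(records)
print("completion rate", completion_rate)  # -> 75.0
```

The same record structure would support the planned correlation/regression analyses once sociodemographic fields were added to each record.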
Our original timeline was as follows. Months 1-2: set up IVR system (including programming time for setting up our measures); set up initial procedures (e.g., advertising; plans for recruitment; develop study logs, etc.). Months 3-5: active recruitment. Month 6: data analysis and writing. We obtained an extension on the project due to difficulties ordering IT (information technology) equipment.
Our only findings are "lessons learned." Remarkably, at both Boston and Syracuse, we had extremely few callers over the entire project: at Boston, just one person went through the entire script; at Syracuse, 10 callers did. Boston was hindered by not being able to purchase advertisements to recruit subjects. We had originally proposed to purchase ads, but found out after the study was underway that VA Boston had a policy of not permitting this for research purposes. We made exceptionally strong attempts to find other ways to recruit, obtaining IRB amendments to recruit via other means (Craigslist, posting to Facebook, calling and emailing over a dozen local universities and colleges to link with their veteran/military coordinators, posting fliers at VA Boston, and making numerous email recruitment attempts to VA Boston mental health listservs). Similarly, Syracuse made numerous recruitment attempts in diverse ways. Our low response rate may also have been due in part to our stand-alone IVR computer "crashing" on various occasions (e.g., cleaning staff unplugged it accidentally; sometimes the system froze). It sometimes remained down for several days until our staff could reconnect it, so we may have lost some callers as a result. We also believe that our response may have been low because callers were not paid to participate in the research (unless they got to the end of the IVR script and volunteered to come in for the in-person, interview-based assessment). The numerous delays in starting the project (see Status) also decreased our recruitment time. Finally, our IVR system was developed shortly after the VA instituted a massive telephone outreach program to call every new veteran to inform them of VA services. This likely had a huge impact in that the outreach program "called out" to every new veteran, whereas our system required veterans to see a recruitment flier or ad and then voluntarily "call in," i.e., initiate a call to our system.
Thus, a large number of veterans (the OEF/OIF cohort) did not need our system and were likely better served by the VA telephone outreach. We can also note that because our system asked about sensitive information (e.g., to screen for SUD, PTSD, depression, and pathological gambling), this may have been off-putting to veterans who may not have wanted to share this information with the VA (especially when the system was automated, thus not allowing callers to ask questions unless they contacted our project staff directly).
In sum, we successfully hired staff, obtained IRB approvals, created the IVR system exactly as proposed, and were able to get it to work behind the VA firewall. The project was thus a success in terms of technology and project management. However, due to both study delays (see the section Status) and the lack of callers, we have no actual research-based quantitative findings to report.
Lessons learned on this project are several.
1. It may be helpful to "call out" to veterans using IVR technology, rather than recruiting them to "call in."
2. It may be helpful to have a staff person standing by to answer questions, and only allow calls while that person is present (e.g., during certain blocks of time during the day).
3. It may be helpful to re-think what types of mental health assessments, if any, would lend themselves to IVR. Veterans with mental health conditions may be concerned or suspicious that their information could get used by the VA in some way that would be harmful to them (especially for sensitive information such as SUD). IVR systems may be better used for physical health problems, which may be less likely to raise concerns. Note that we have no data or feedback from anyone to indicate that they were concerned about this issue, but it is worth exploring in future research.
4. Technology-based projects in VA should allow for a longer timeframe, due to the major hurdles required to purchase IT equipment.
5. For research, finding a way to pay callers while still keeping the calls anonymous would be helpful. Many research participants are used to being paid for going through assessments, and because our study did not pay them, this may have hindered recruitment.
6. Before developing a future IVR system for veterans, conduct qualitative interviews with veterans to explore their ideas on how best to design the system, what might hinder their use of it, etc. Such information could have helped us, but was not included in this proposal due to the short timeframe of the Rapid Response mechanism.
We presented our findings at one conference as a poster presentation. (Ouimette P, Carnrike J, Martino S, Rothong N, Heath J, Najavits L, Rubin A, Simpson T. Telephone Screening of Mental Health Problems Among Community Veterans. 3rd annual VA Mental Health Conference: Implementing the Uniformed Services Package for Mental Health, Baltimore, MD; July, 2009.).
Our impact is otherwise primarily indicated by our "lessons learned" (see Findings), which may be helpful to other investigators.
It is also worth noting that even a study such as this, in which we were surprised to find so few callers calling into our IVR system, can nonetheless provide useful information to help guide future implementation science efforts. That is, "negative" findings can be highly informative. In this project, the impact may be to indicate that IVR is not the best system to screen veterans for mental health problems. Prior research outside of VA did find success in conducting screenings using IVR systems, but VA may be a unique environment in that there are many other ways for veterans to access screenings and feedback. This is particularly true of the OEF/OIF cohort, who are the subject of major VA outreach initiatives. Thus, an IVR system such as the one we designed may be more useful for non-veteran populations than within VA. The fact that both of our study sites found low levels of calls indicates that it was not a "fluke" of just one location. Both Boston and Syracuse are urban areas with many thousands of veterans. We successfully programmed the IVR system and knew it to be functioning in both locations. The lack of calls thus represents important cautionary information for future researchers.