Ensuring that patients meaningfully use health information technology (HIT) is a far-reaching goal of the US healthcare system and an important element of VA's Blueprint for Excellence. Within VA's My HealtheVet (MHV) patient portal, the Blue Button (BB) function allows Veterans to download their personal health records (PHRs) to share with clinicians and others they trust, and Secure Messaging (SM) enables Veterans to communicate online with their healthcare team. VA has already initiated research to examine how Veterans use and value MHV and, specifically, Blue Button and Secure Messaging. To date, however, neither VA nor any other US health system has established validated measures that serve as an accurate, reliable gauge of Veterans' or patients' user experience with these tools.
The overall goal of this study was to develop and validate measures of Veteran meaningful use of the MHV patient portal, focusing on Blue Button and Secure Messaging. The project had three specific aims:
Aim 1. Consolidate and prioritize existing measures of Veterans' use of Blue Button and Secure Messaging.
Aim 2. Pilot the best candidate measures among a national sample of Veteran MHV users.
Aim 3. Validate the measures through principal component and confirmatory factor analysis of survey data.
Methods: In conjunction with our principal Operational Partner, the Veterans/Consumers Health Informatics Office, which houses the My HealtheVet Program Office, and with the support of the eHealth QUERI, this mixed-methods study assembled existing measures for assessing Veterans' use of Blue Button and Secure Messaging through Key Informant interviews, and subjected those potential measures to a Modified Delphi panel process (Aim 1). The Delphi process yielded candidate measures that were pilot tested in an online survey of Veterans using MHV, incorporating the new items as "custom measures" in the American Customer Satisfaction Index (ACSI) survey (Aim 2). These survey data then underwent principal component analysis and confirmatory factor analysis to characterize how the measures relate to each other and how well they represent Veterans' meaningful use of these technologies (Aim 3).
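As an illustration only (not the study's actual analysis code), the sketch below shows the general shape of the Aim 3 principal component step: extracting components from the correlation matrix of Likert-scale survey items to see how candidate measures group together. The item structure, factor labels, and data are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_items = 500, 8

# Simulate 1-5 Likert responses driven by two hypothetical latent
# factors (e.g., "ease of use" and "perceived usefulness").
latent = rng.normal(size=(n_respondents, 2))
loadings = np.array([[1, 0], [1, 0], [1, 0], [1, 0],
                     [0, 1], [0, 1], [0, 1], [0, 1]], dtype=float)
raw = latent @ loadings.T + 0.5 * rng.normal(size=(n_respondents, n_items))
items = np.clip(np.round(raw + 3), 1, 5)

# PCA via eigendecomposition of the item correlation matrix,
# eigenvalues sorted in descending order.
corr = np.corrcoef(items, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]
explained = eigvals / eigvals.sum()
print(f"Variance explained by first two components: {explained[:2].sum():.0%}")
```

With a strong two-factor structure like this, the first two components dominate; in real survey data, the number of retained components would be chosen from the eigenvalue pattern and then tested with confirmatory factor analysis.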
All 12 expert panel members completed the study's three rounds of measure rating. In Round 1, we presented panelists with 64 process and outcome measures as candidate measures of digital patient engagement using MHV; these candidate measures had been identified from prior studies of MHV and from the literature. To limit questionnaire length, we did not present panelists with antecedent measures (i.e., variables considered possibly predictive of, or correlated with, the measures themselves) in Round 1. After applying the consensus criteria to the panelist ratings in Round 1, 28 measures (44%) were accepted for inclusion in the final measures set (i.e., not subject to further panel review), and 4 measures (6%) were rejected and not considered further. The remaining 32 measures (50%) were deemed sufficiently strong to merit further refinement and were advanced to the second round of rating.
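The consensus criteria themselves are not spelled out above. As a purely hypothetical sketch, a common Modified Delphi rule classifies each measure into the same three outcomes (accept, reject, advance) from the panel's median rating; the 1-9 scale and thresholds below are assumptions, not the study's actual criteria.

```python
from statistics import median

def classify(ratings, accept_at=7, reject_at=4):
    """Classify one candidate measure from panelist ratings on a
    hypothetical 1-9 scale: accept if the panel median is high,
    reject if low, otherwise advance to the next rating round."""
    m = median(ratings)
    if m >= accept_at:
        return "accept"
    if m <= reject_at:
        return "reject"
    return "advance"

# Example: a 12-member panel rating one candidate measure.
panel = [8, 7, 9, 7, 8, 6, 7, 8, 9, 7, 6, 8]
print(classify(panel))  # median 7.5 -> "accept"
```

Applied across all 64 Round 1 measures, a rule of this form would partition them into the accepted, rejected, and advanced groups reported above.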
In Round 2, we asked panelists to rate the 32 measures carried forward from Round 1. We also introduced antecedent measures of Veterans' self-reported sociodemographics, general health, and patterns of access to VA facilities and to the My HealtheVet portal. In addition, we provided two separate measures sets, one for Secure Messaging and one for all other MHV Blue Button features, to capture the different roles and usage patterns of these features, and asked the panelists to rate the measures separately for MHV/Blue Button and for Secure Messaging. In total, panel members rated 90 measures in Round 2, which resulted in 21 measures being accepted, 32 rejected, and 37 advanced for refinement and reconsideration in Round 3.
Prior to Round 3, the Workgroup provided comments to improve the measures, specifically suggesting edits to item wording so that the measures would be more consistent with Veterans' use of MHV and could be easily adapted for use in existing VA survey mechanisms. In particular, the Workgroup suggested Rogers' Diffusion of Innovations model as a framework for the broader consideration of digital patient engagement motivations and outcomes. We therefore compared existing validated scales of patient activation and patient engagement to the five stages of Rogers' model. Our comparison suggested that our Veteran engagement measures, and indeed all of the patient activation and engagement scales studied, could be roughly mapped to Rogers' five stages. We thus developed two novel scales of patient engagement (i.e., a broad 12-measure version and a lean 4-measure version) as outcomes for testing.
In Round 3, we tested the two patient engagement scales in addition to the 37 measures rated as sufficiently strong to advance from Round 2. The Round 3 measures set comprised 13 antecedent measures, 24 process measures, and 6 outcome measures, including the 4-measure patient engagement scale.
The final candidate measures from the three rounds of Delphi panel rating comprised two separate sets representing digital patient engagement for Secure Messaging and for MHV/Blue Button: 58 measures for Secure Messaging and 71 for MHV/Blue Button.
-Antecedents represented 20 comparable measures for Secure Messaging and MHV/Blue Button, addressing: patient demographics and characteristics, health model, health and internet literacy, device access, access to and experience with the healthcare system, and portal registration and account type.
-Processes represented 32 Secure Messaging and 45 MHV/Blue Button measures. These addressed the dimensions of use, usefulness, and ease of use, plus the novel dimension of online care quality, which encompasses: trust in online care and advice, self-management, staying informed, gaining peace of mind, care coordination, and judgment and decision making.
-Outcomes represented 6 comparable measures for Secure Messaging and MHV/Blue Button, addressing: intent to use and recommend; and patient engagement.
The ACSI Survey:
A total of 4,442,484 visitors came to My HealtheVet during the Wave I piloting period of November 18, 2013 to January 14, 2014. Of these, 3,321,651 visitors met the eligibility threshold of viewing 4 or more pages during the period. A 4% random sample of these eligible visitors was invited to take the survey, yielding 132,866 prospective respondents. Of these, 34,748 accepted the survey invitation (acceptance rate of approximately 26.2%), and 20,849 completed the survey (completion rate of 60%). The final Wave I Secure Messaging sample size was N=8,900; the final Wave II My HealtheVet/Blue Button sample size was N=13,997.
Summary of key findings: Two measures gauge Veterans' intent to use and to recommend My HealtheVet features. Secure Messaging and My HealtheVet/Blue Button respondents were aligned in their agreement that they intend to continue using My HealtheVet in the future (91% SM; 90% MHV/BB) and intend to recommend My HealtheVet to others (75% SM; 75% MHV/BB).
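The Wave I sampling funnel reported above can be checked arithmetically; the short sketch below reproduces the stated rates from the stated counts.

```python
# Reconstructing the reported Wave I survey sampling funnel.
visitors  = 4_442_484  # MHV visitors, Nov 18, 2013 - Jan 14, 2014
eligible  = 3_321_651  # viewed 4+ pages during the period
invited   = round(eligible * 0.04)  # 4% random sample
accepted  = 34_748
completed = 20_849

print(f"invited: {invited:,}")                 # invited: 132,866
print(f"acceptance: {accepted / invited:.1%}")  # acceptance: 26.2%
print(f"completion: {completed / accepted:.0%}")  # completion: 60%
```

Each reported figure follows from the preceding count: the 4% sample of eligible visitors gives the 132,866 invitees, and the acceptance and completion rates match the 26.2% and 60% reported.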
Implications: Establishing a valid and reliable scale is the first step to understanding digital patient engagement and its role in health and healthcare quality, outcomes, and effective, efficient implementation by healthcare providers and healthcare systems. This study yielded a robust set of candidate measures of what Veterans value in Blue Button and Secure Messaging. These measures may be used to accelerate My HealtheVet patient portal adoption, which in turn may improve Veterans' access to high quality healthcare and engagement with their healthcare teams. The results of this study, therefore, have potential implications for patients, clinicians and future research.
-For patients: Robust measurement of how patients value their online personal health records and secure messaging could allow healthcare systems like VA to strengthen Veterans' engagement with their care. This measurement could also enable VA (and, by analogy, other healthcare systems) to improve their patient-facing eHealth systems, which could in turn accelerate adoption. With accurate measures of patients' use of these systems and their perceptions of the benefits thereof, improvements could target patient audiences who may benefit most, including underserved populations.
-For clinicians: Demonstrating prioritized, tested digital patient engagement measures may help persuade reluctant clinicians of the potential of personal health records and secure messaging to improve quality of care and communication with their patients. Any resulting positive recommendations by clinicians to patients or colleagues could, in turn, strengthen patient adoption.
-For the Veteran-physician partnership: Personal health records and patient portals are vital tools in a healthcare system's delivery of high-quality, coordinated healthcare for patients, potentially strengthening the patient-clinician relationship and continuity of care.
-For healthcare researchers: Personal health record and patient portal tools can serve as exemplars for specifying digital patient engagement for other web-based tools and services (such as refilling/renewing medications and requesting/reviewing appointments) and for other patient-facing services involving expert care (such as remote monitoring, telehealth, and health-risk assessment with feedback). If adapted, personal health record and secure messaging measures could be transferable to other electronic media (e.g., mobile device voice response applications).
None at this time.
Treatment - Implementation