VA Health Systems Research
IIR 13-296
Systems for Helping Veterans Comprehend Electronic Health Record Notes
Hong Yu, PhD MS MA
VA Bedford HealthCare System, Bedford, MA
Funding Period: May 2015 - November 2019
BACKGROUND/RATIONALE:
As part of its commitment to patient-centered care, the Veterans Health Administration (VHA) provides Veterans with comprehensive access to their personal health information. In January 2013, the VHA began making clinical notes from the electronic health record (EHR) available through the My HealtheVet portal. Access to this information can increase transparency, improve communication between clinicians and patients, and foster patient involvement in self-management of care. However, many studies have shown that patients are confused by EHR notes, especially patients in vulnerable groups (e.g., those with low literacy or low income). Significant barriers to patients' understanding of their clinical notes include the use of specialized medical terminology and the lack of explanations for complex concepts. Veterans with limited health literacy are especially likely to have difficulty understanding their clinical notes.

OBJECTIVE(S):
We will develop and evaluate NoteAid, a natural language processing (NLP) system that links medical jargon in clinical notes to consumer-oriented concepts, definitions, and educational materials, in order to help Veterans comprehend their EHR notes. Our specific aims are to (1) develop a comprehensive health knowledge resource by integrating clinical vocabularies and abbreviations and by linking medical concepts to corresponding lay concepts, definitions, and educational materials; (2) develop an NLP system for identifying medical concepts and abbreviations in EHR notes and linking them to educational materials; and (3) evaluate NoteAid, assessing improvements in patients' comprehension of EHR notes, their communication with caregivers, and their clinical knowledge and self-management.
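The core idea of linking jargon to lay definitions can be illustrated with a minimal sketch. This is not the NoteAid implementation; the `LAY_DEFINITIONS` dictionary is a toy stand-in for the far larger CoDeMed resource, and real matching would also handle abbreviations and context.

```python
import re

# Toy jargon-to-lay-definition dictionary; a stand-in for the CoDeMed resource.
LAY_DEFINITIONS = {
    "hypertension": "high blood pressure",
    "myocardial infarction": "heart attack",
    "dyspnea": "shortness of breath",
}

def link_jargon(note: str) -> list[tuple[str, str]]:
    """Return (term, lay definition) pairs for jargon found in an EHR note."""
    found = []
    lowered = note.lower()
    # Match longer terms first so multi-word jargon is not shadowed by subterms.
    for term in sorted(LAY_DEFINITIONS, key=len, reverse=True):
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            found.append((term, LAY_DEFINITIONS[term]))
    return found

note = "Patient with hypertension presents with dyspnea."
print(link_jargon(note))
# [('hypertension', 'high blood pressure'), ('dyspnea', 'shortness of breath')]
```

A reader-facing system would then render each matched term as a hyperlink to its definition and related education materials.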

METHODS:
We are continuing development of the NLP components of NoteAid by annotating and expanding the health knowledge resource of medical jargon and definitions (Aim 1) and by prioritizing jargon in EHR notes according to its importance to patients (Aim 2). We will then evaluate NoteAid in a randomized comparison trial with 250 Veterans from the Edith Nourse Rogers Memorial Veterans Hospital (ENRM) (Aim 3). We propose a between-subjects randomized experiment comparing subjects exposed to clinical notes only with those exposed to clinical notes plus NoteAid. The baseline interview assessment at ENRM will record chronic conditions, age, education, gender, and a literacy measure. Outcome measures will assess comprehension, patient activation, and communication with care providers. Two subsequent interviews, including the outcome assessments, will follow over a six-month time frame. Each Veteran will receive a $75 honorarium for participation ($40 for the initial visit and $35 for completing the follow-up interviews). Veterans who consent but are not enrolled will receive a $10 honorarium.

We will conduct quantitative analyses of the outcome measures using bivariate statistics and regression analyses. Equal emphasis will be given to negative and positive findings of equal scientific merit. Results of the study will be disseminated through the eHealth QUERI across the VA system. We will report best practices and create a repository of materials, consulting with national-level My HealtheVet implementation and education managers.

FINDINGS/RESULTS:
Physician feedback was mixed. Positive feedback on NoteAid included (1) ease of use, (2) a good visual display, (3) satisfactory system speed, and (4) adequate lay definitions. Opportunities for improvement arising from the evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary by context, and (4) standardizing the scope of definitions for medicines. On the basis of these results, we improved NoteAid's user interface and a number of definitions, and added 4,502 more definitions to CoDeMed.

IMPACT:
Work has continued on natural language processing development. Research staff have utilized population-level rankings of terms in EHRs to expand a lay language resource that supports patient EHR comprehension. Adapted distant supervision was used to rank terms pulled from EHR notes and prioritize concepts. This work also helped to develop FIT (Finding Important Terms), a system that helps tailor information to each individual patient's needs.
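The notion of ranking terms by their importance to patients can be sketched with a simple frequency-based score. This is only an illustrative stand-in for the study's adapted distant-supervision and learning-to-rank methods; `rank_terms`, its inputs, and the example counts are all hypothetical.

```python
import math
from collections import Counter

def rank_terms(note_terms, corpus_doc_freq, n_docs):
    """Rank candidate terms by a TF-IDF-style importance score.

    note_terms: list of terms extracted from one EHR note.
    corpus_doc_freq: term -> number of notes in the corpus containing it.
    Terms frequent in this note but rare corpus-wide score highest.
    """
    tf = Counter(note_terms)
    scores = {
        term: count * math.log(n_docs / (1 + corpus_doc_freq.get(term, 0)))
        for term, count in tf.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

ranked = rank_terms(
    ["metoprolol", "pain", "metoprolol", "ejection fraction"],
    corpus_doc_freq={"pain": 900, "metoprolol": 120, "ejection fraction": 40},
    n_docs=1000,
)
print(ranked)  # rare clinical terms rank above common words like "pain"
```

A system like FIT would additionally condition such rankings on the individual patient, so that the terms surfaced first are those most relevant to that patient's conditions.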
Additional work explored the relationship between readability scores and patients' perceived difficulty of text in EHR notes, and physician evaluation of the NoteAid system helped refine content and improve system operability.
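Readability scores of the kind studied here are computed from simple surface statistics. The sketch below implements the standard Flesch Reading Ease formula with a crude vowel-group syllable counter; it is a generic example, not the corpus study's actual tooling, and production tools use pronouncing dictionaries for syllable counts.

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels as syllables.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).

    Higher scores indicate easier text.
    """
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

easy = flesch_reading_ease("The heart pumps blood. It beats all day.")
hard = flesch_reading_ease("Echocardiography demonstrated preserved ventricular function.")
print(easy > hard)  # True: shorter words and sentences score as easier
```

One finding of interest in this line of work is that such formulas, which see only word and sentence length, do not always track patients' perceived difficulty of clinical text.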




External Links for this Project

NIH Reporter

Grant Number: I01HX001457-01A1
Link: https://reporter.nih.gov/project-details/8781838


PUBLICATIONS:

Journal Articles

  1. Lalor JP, Hu W, Tran M, Wu H, Mazor KM, Yu H. Evaluating the Effectiveness of NoteAid in a Community Hospital Setting: Randomized Trial of Electronic Health Record Note Comprehension Interventions With Patients. Journal of Medical Internet Research. 2021 May 13;23(5):e26354.
  2. Chen J, Zheng J, Yu H. Finding Important Terms for Patients in Their Electronic Health Records: A Learning-to-Rank Approach Using Expert Annotations. JMIR Medical Informatics. 2016 Nov 30;4(4):e40.
  3. Lalor JP, Woolf B, Yu H. Improving Electronic Health Record Note Comprehension With NoteAid: Randomized Trial of Electronic Health Record Note Comprehension Interventions With Crowdsourced Workers. Journal of Medical Internet Research. 2019 Jan 16;21(1):e10793.
  4. Zheng J, Yu H. Readability Formulas and User Perceptions of Electronic Health Records Difficulty: A Corpus Study. Journal of Medical Internet Research. 2017 Mar 2;19(3):e59.
  5. Chen J, Yu H. Unsupervised ensemble ranking of terms in electronic health record notes based on their importance to patients. Journal of Biomedical Informatics. 2017 Apr 1;68:121-131.


DRA: Health Systems
DRE: Technology Development and Assessment
Keywords: Caregiving, Family, Healthcare Algorithms, Home Care, Natural Language Processing, Patient-Provider Interaction, Personal Health Record, Self-Care, Technology Development
MeSH Terms: none
