VA Health Systems Research

IIR 12-068
Design and Evaluation of User Centered Electronic Health Records
Lucila Ohno-Machado, MD PhD
VA San Diego Healthcare System, San Diego, CA
Funding Period: October 2015 - September 2019
There is an increasing need for quantitative research on how providers use Electronic Health Record (EHR) systems in real clinical settings, and on the methodology and metrics used to assess EHR usability (e.g., effectiveness, efficiency, and user satisfaction). Such research will help identify candidate EHR components and integration features for redesign. In particular, documentation of the patient encounter and retrieval of existing patient data are complementary tasks of the clinical workflow at the point of care. Documentation is often in narrative form, stored as unstructured text documents, and providers also retrieve information from previous notes. Lack of EHR automation and poor user interfaces can contribute to the introduction of redundant information into the patient record (e.g., information copied/pasted from other portions of the EHR) as well as to inefficient workflows (e.g., duplicative work when clinicians enter orders through structured menus and subsequently manually document those orders in progress notes). Our study aims to describe patterns that can help optimize EHRs.

To conduct a retrospective clinical data analysis to identify candidate components for Electronic Health Record (EHR) redesign and use formative usability evaluation of prototypes to guide system design decisions.

Aim 1. Characterize Progress Notes Redundancy and Segmentation Longitudinally.
Aim 2. Characterize Progress Notes Workflow at the Point of Care.
Aim 3. Develop and Evaluate an ActiveNotes Prototype.

This study will quantitatively profile primary care providers' use of the EHR for documentation and information retrieval tasks. A mixed methodology (i.e., quantitative, qualitative, and formative usability evaluation) will be used to measure longitudinally the degree of redundancy introduced over time in patient documentation, and to perform a baseline content analysis of variation in how clinicians organize and segment their notes into major sections. We will extract and document variation across providers. Archived progress notes from two VA sites will be aligned and manually coded. Stimulated-recall clinician interviews will provide detailed user feedback. The data collected will help develop and evaluate a prototype for a usable documentation and order entry system at the point of care. Formative usability evaluation of the prototype will be iterative and integrated into the agile development process. Feedback from clinician end users based on test data, as well as input from other stakeholders, will guide system design decisions.
Our specific aims are to:

Aim 1. Characterize Progress Notes Redundancy and Segmentation Longitudinally.

Our primary aim will be based on archival CPRS/VistA progress notes at the VA San Diego (SD) and Salt Lake City (SLC) sites. We will (a) quantitatively characterize how information is organized and segmented in notes (e.g., SOAP format versus by condition) based on manual coding, and (b) quantitatively estimate the redundancy between notes based on sequence alignment. We will study these patterns longitudinally across time-indexed samples of a patient's progress notes (by the same clinician). We will apply alignment at the whole-document level (primary outcome measure) and, after understanding how clinicians structure their notes, extend the method to compare redundancy at the level of individual sections.
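The whole-document redundancy measure can be sketched as a pairwise similarity over time-indexed notes. In this minimal sketch, Python's difflib stands in for the Needleman-Wunsch/Smith-Waterman alignments used in the study, and the notes are invented toy examples, not VA data:

```python
from difflib import SequenceMatcher

def redundancy(note_a: str, note_b: str) -> float:
    """Similarity in [0, 1] between two notes, as a rough proxy for
    alignment-based redundancy (difflib stands in here for the
    Needleman-Wunsch/Smith-Waterman alignment used in the study)."""
    return SequenceMatcher(None, note_a, note_b).ratio()

# Toy time-indexed notes for one patient by the same clinician.
notes = [
    "S: Patient reports improved cough. O: Lungs clear. A/P: Continue inhaler.",
    "S: Patient reports improved cough. O: Lungs clear. A/P: Taper inhaler.",
]
print(round(redundancy(notes[0], notes[1]), 2))  # high value = largely redundant
```

The same function can be applied per section once the segmentation coding from the manual annotation is available.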

Aim 2. Characterize Progress Notes Workflow at the Point of Care.

Whereas Aim 1 is focused on the structure of progress notes, Aim 2 will focus on the process by which clinicians generate and use these documents. Aim 2 will leverage existing time-motion data on clinicians' EHR workflow. We will quantitatively profile how clinicians use CPRS Notes and related functions for the complementary tasks of documentation and retrieval of existing data in the EHR. We will code for and analyze the modality used to create progress notes and record sources of actual or potential redundancy. We will classify data entry into the sections identified in Aim 1.

Aim 3. Develop and Evaluate an ActiveNotes Prototype.

We will use findings from Aims 1 and 2 to guide design and development of the ActiveNotes prototype. Currently, ActiveNotes focuses on improving clinician order entry, but we will extend the functionality to include features for progress note documentation and retrieval. Formative usability evaluation of prototypes will be conducted using patient test data with clinician users at SD and SLC sites.

Aim 1:
Previously we had split a whole note into six sections: assessment plan, labs, medications, past medical history, physical exam, and vital signs. However, these six sections covered only 52% of the whole note on average. To raise this coverage, we manually annotated 950 whole notes with two additional sections, achieving 75% mean coverage. In addition to the three distance functions applied previously (Levenshtein, Needleman-Wunsch, and Smith-Waterman), we adopted a cosine similarity function based on term frequencies in a vector space model. Because cosine similarity operates on numeric term-frequency vectors rather than character alignments, its main strength is fast execution: we completed all pairwise similarity calculations over the whole notes (choose(950, 2) = 450,775 pairs) in a few hours. In contrast, the estimated total calculation time for whole notes with the Levenshtein edit function alone is about one year: at roughly one comparison per minute, choose(1000, 2) = 499,500 comparisons take 499,500 / (60 × 24) ≈ 347 days. At the section level (i.e., on subsets of a whole note), we finished all pairwise calculations with all four distance functions. We also drew heat maps to visualize section-specific differences in document similarity. We found that structured sections (e.g., physical exam, labs) had higher average and minimum similarity than narrative sections (e.g., assessment plan, reason for visit). For example, the structured sections contain a fixed list of variables, such as blood urea nitrogen (BUN), blood glucose, total cholesterol, and ALP/ALT liver tests, while only the lab test values change between notes. This causes the Levenshtein edit distance, which reflects only the changed characters, to yield a high minimum similarity, e.g., at least 40%, because the variable names are fixed. We are currently compiling the effects of distance function and section type on similarity values.
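The speed advantage comes from comparing term-frequency vectors instead of aligning characters. The pairwise cosine pass can be sketched as follows; the note texts here are toy stand-ins, not study data:

```python
import math
from collections import Counter
from itertools import combinations

def tf_vector(text: str) -> Counter:
    """Term-frequency vector (simple lowercase bag of words)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Toy stand-ins: two structured lab-style snippets and one narrative snippet.
notes = {
    "n1": "bun 18 glucose 102 total cholesterol 190",
    "n2": "bun 22 glucose 99 total cholesterol 201",
    "n3": "patient reports mild cough since last visit",
}
vectors = {k: tf_vector(v) for k, v in notes.items()}
for i, j in combinations(sorted(vectors), 2):  # all choose(n, 2) pairs
    print(i, j, round(cosine(vectors[i], vectors[j]), 2))
```

As in the study's findings, the structured snippets score high against each other (shared fixed variable names, different values) while the narrative snippet scores near zero.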

Aim 2:
Hardware limitations in the VINCI computing environment have delayed delivery of the computation results; we continue to work with the VINCI team, but this has not hindered Aim 2, which uses separate data. Our research assistant met with clinicians to understand how clinical work is actually done and how it could be done. We first need to examine and define EMR "usability." Usability, as defined by NIST, refers to the "effectiveness, efficiency and satisfaction with which intended users can achieve their tasks in the intended context of product use." Measuring and comparing EMR usability during office visits is a necessary step toward enhancing effective, efficient, and patient-centered use of EMRs. However, no single measure or method is likely to determine unambiguously whether, for example, a provider, workflow process, health IT function, or usage pattern is patient-centered or efficient. We will therefore use exploratory data visualization and quantitative analyses.
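As one illustration of the kind of quantitative profiling planned, the sketch below aggregates a time-motion event log into time shares per task category. The log format, category names, and numbers are illustrative assumptions, not the study's actual coding scheme:

```python
from collections import defaultdict

# Hypothetical time-motion event log: (task_category, seconds observed).
# Categories mirror the documentation-vs-retrieval split described above.
events = [
    ("documentation", 120), ("retrieval", 45),
    ("documentation", 90), ("order_entry", 60), ("retrieval", 30),
]

totals = defaultdict(int)
for task, seconds in events:
    totals[task] += seconds

total_time = sum(totals.values())
for task, seconds in sorted(totals.items()):
    print(f"{task}: {seconds}s ({100 * seconds / total_time:.0f}%)")
```

Summaries like this, broken down per clinician and per visit, feed the exploratory visualizations mentioned above.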

Aim 3:
The purpose of the prototype is to help clinicians find interesting or important information in old notes and to visualize copy-paste operations. The prototype also envisions how copied data could be included in a new note that a clinician is composing, while retaining links to the old notes that were used and visualizing those links on the new note. Interface additions include a "note timeline" and a "search bar". The timeline at the top of the interface displays, by color, what type of note is available (e.g., blue = PCP) and gives the physician quick access to the whole note without opening a new page. If they choose to view the note, a new page opens with the notes side by side, which allows for easier information transfer and comparison. The search bar lets the physician search for keywords and view the sections of past notes that match; the physician can then open the old note and view both notes side by side.
Two functionalities that we think could help with copy/paste visualization are the "Copy-Paste Tracker" and "Note Tagging". The Copy-Paste Tracker lets physicians see which sections of their note came from previous notes, and whether the sections brought into the new note have since been edited. Note Tagging places colored bars next to text that has been copied into the new note, allowing easy review of copy-pasted information while reading a note. All interactions in the prototype are click-based: clicking where instructed simulates search, copy, paste, and other operations.
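One way the detection behind a Copy-Paste Tracker could work is verbatim matching-block detection between an old and a new note. This is a rough sketch under assumptions of ours (the minimum-span threshold and the note texts are invented; the prototype's actual algorithm is not described here):

```python
from difflib import SequenceMatcher

MIN_SPAN = 20  # chars; treat shorter matches as incidental overlap (assumed threshold)

def copied_spans(old_note: str, new_note: str):
    """Return (start, end, text) spans of new_note that appear verbatim
    in old_note -- a rough stand-in for a Copy-Paste Tracker."""
    sm = SequenceMatcher(None, old_note, new_note, autojunk=False)
    return [
        (m.b, m.b + m.size, new_note[m.b:m.b + m.size])
        for m in sm.get_matching_blocks()
        if m.size >= MIN_SPAN
    ]

old = "PMH: hypertension, type 2 diabetes. Meds: lisinopril 10mg daily."
new = "Seen today for follow-up. PMH: hypertension, type 2 diabetes. Stable."
for start, end, text in copied_spans(old, new):
    print(f"[{start}:{end}] {text!r}")  # each span would get a colored note tag in the UI
```

Each detected span carries its character offsets, which is what an interface needs to draw colored bars next to the copied text and link back to the source note.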

The URL for the prototype:

Our development team also had a chance to meet with the clinicians to discuss how best to develop the prototype for efficiency in the clinic.

The study will provide an assessment of how providers use EHR systems in real clinical VA settings, and help quantify the effectiveness, efficiency, and satisfaction with which users can achieve intended tasks. We will identify candidate EHR components for redesign and implement them in a prototype system.

We will assess patterns of clinical note copy-paste at different levels: within a patient-doctor pair, across patients of the same doctor, between doctors, and beyond. We will report how different distance functions can be used to detect different types of similarity, along with their strengths and weaknesses. These findings will be stepping stones toward redesigning EHR systems.

External Links for this Project

NIH Reporter

Grant Number: I01HX000982-01A2

Dimensions for VA



Journal Articles

  1. Butler JM, Anderson KA, Supiano MA, Weir CR. "It Feels Like a Lot of Extra Work": Resident Attitudes About Quality Improvement and Implications for an Effective Learning Health Care System. Academic Medicine. 2017 Jul 1; 92(7):984-990.
  2. Workman TE, Weir C, Rindflesch TC. Differentiating Sense through Semantic Interaction Data. AMIA Annual Symposium Proceedings. 2017 Feb 10; 2016:1238-1247.
  3. Calvitti A, Hochheiser H, Ashfaq S, Bell K, Chen Y, El Kareh R, Gabuzda MT, Liu L, Mortensen S, Pandey B, Rick S, Street RL, Weibel N, Weir C, Agha Z. Physician activity during outpatient visits and subjective workload. Journal of Biomedical Informatics. 2017 May 1; 69:135-149.

DRA: Health Systems
DRE: Technology Development and Assessment
Keywords: Care Management Tools, Efficiency, Electronic Health Record, Management and Human Factors, Practice Patterns/Trends
MeSH Terms: none
