Health Services Research & Development

HSR&D Citation Abstract
Pilot evaluation of a method to assess prescribers' information processing of medication alerts.

Russ AL, Melton BL, Daggy JK, Saleem JJ. Pilot evaluation of a method to assess prescribers' information processing of medication alerts. Journal of Biomedical Informatics. 2017 Feb 1; 66:11-18.


Abstract:

BACKGROUND: Prescribers commonly receive alerts during medication ordering. Because prescribers work in a complex, time-pressured environment, the cognitive effort needed to process safety alerts should be minimized. However, methods to evaluate the extent to which computerized alerts support prescribers' information processing are lacking.

OBJECTIVE: To develop a methodological protocol for assessing the extent to which alerts support prescribers' information processing at a glance; specifically, the incorporation of alert information into working memory. We hypothesized that the method would be feasible and that it would detect a significant difference in prescribers' information processing between a revised alert display that incorporates warning design guidelines and the original alert display.

METHODS: A counterbalanced, within-subject study was conducted with 20 prescribers in a human-computer interaction laboratory. We tested a single alert that was displayed in two different ways. Prescribers were informed that an alert would appear for 10 seconds. After the alert was shown, a white screen was displayed, and prescribers were asked to verbally describe what they saw, state how many warnings were shown in total, and describe anything else they remembered about the alert. We measured information processing via the accuracy of prescribers' free recall and their ability to identify that three warning messages were present. Two analysts independently evaluated participants' responses against a comprehensive catalog of alert elements and then discussed discrepancies until reaching consensus.

RESULTS: This feasibility study demonstrated that the method appeared effective for evaluating prescribers' information processing of medication alert displays, and we were able to detect significant differences in prescribers' recall of alert information. The proportion of total data elements that prescribers accurately recalled was significantly greater for the revised display than for the original display (p = 0.006), and with the revised display more prescribers accurately reported that three warnings were shown (p = 0.002).

CONCLUSIONS: The methodological protocol was feasible for evaluating the alert display and yielded important findings on prescribers' information processing. These methods supplement traditional usability evaluation methods and may be useful for evaluating information processing of other healthcare technologies.
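The scoring step described in METHODS (coding free-recall responses against a catalog of alert elements, then comparing recall between the two display conditions) can be sketched in code. This is a minimal illustration, not the study's actual analysis: the catalog element names, the recall data, and the simple mean-difference summary below are all hypothetical (the paper reports p-values from its own statistical comparison, which is not reproduced here).

```python
# Hypothetical sketch of recall scoring: each prescriber's free-recall response
# is checked against a catalog of alert data elements, and the proportion
# recalled is compared between the original and revised display conditions.
# All element names and recall data are illustrative, not from the study.

CATALOG = {"drug_name", "warning_1", "warning_2", "warning_3", "severity"}

def recall_proportion(recalled_items):
    """Proportion of catalog elements accurately recalled (after consensus coding)."""
    return len(set(recalled_items) & CATALOG) / len(CATALOG)

# One (original-display recall, revised-display recall) pair per prescriber.
pairs = [
    ({"drug_name", "warning_1"}, {"drug_name", "warning_1", "warning_2", "severity"}),
    ({"warning_1"},              {"warning_1", "warning_2", "warning_3"}),
    ({"drug_name"},              {"drug_name", "warning_1", "severity"}),
]

# Within-subject difference in recall proportion (revised minus original).
diffs = [recall_proportion(rev) - recall_proportion(orig) for orig, rev in pairs]
mean_gain = sum(diffs) / len(diffs)
print(f"mean within-subject recall gain with revised display: {mean_gain:.2f}")
```

In the study itself, these per-participant proportions would feed a paired statistical test across the 20 prescribers; the sketch stops at the descriptive difference.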




