Applying Usability Heuristics to Improve Outpatient Consultation Orders
Himalaya Patel, Ph.D., April Savoy, Ph.D., & Michael Weiner, M.D., M.P.H., VA Health Services Research & Development Service (CIN 13-416), Richard L. Roudebush VA Medical Center, Indianapolis, IN
My hypothetical consultation order ended abruptly: “Please refer to Hand and Upper Extremity Clinic.”
This guidance would have been nice to know before starting. Instead, my half-finished referral to the Orthopedics clinic would have to be redone. Meanwhile, my imaginary Veteran patient with suspected carpal tunnel syndrome would have to wait a bit longer for an appointment.
But why is this the case?
Both inside and outside the U.S. Department of Veterans Affairs (VA), barriers to communication are common in outpatient medical referrals. In VA, we suspected that some barriers were linked to the usability of components of VA’s electronic health record system, Computerized Patient Record System (CPRS). To investigate, we analyzed 26 facility-level consultation order templates from CPRS at three VA medical centers. Our review was a heuristic evaluation, a process by which experts assess a system’s implementation of usability principles.
Heuristic evaluation can support a broader plan for improving health information technology. This article includes suggestions for planning a heuristic evaluation of your consultation templates and sharing your findings.
Consultations from a VA referrer’s perspective
Outpatient referrals are generated when a referrer, such as in primary care, formally requests recommendations about, evaluation of, or treatment for a patient by a consultant with expertise in a different medical specialty. At VA, referrals are documented in CPRS using consultation orders. Each such order represents an early referrer–consultant communication of a patient’s needs.
To support electronic referral communication, VA specialty care services work with VA clinical application coordinators (CACs) to create consultation templates. Each template prompts for information pertinent to the associated referral. Navigating through templates can be difficult, especially for new users. Templates vary in both structure and content across services and facilities. Few templates automatically include relevant clinical data.
These issues and others can delay consultation ordering and appointment scheduling. One in three outpatient referrals is discontinued (rejected) by specialty care services, and about half of discontinued referrals produce no appointments within 30 days of the request. Explanations vary for discontinuing a referral without an appointment; common explanations include missing prerequisite tests (14%), requesting unneeded appointments (13%), sending to the wrong consulting service (13%), duplicating previous orders (8%), and providing incomplete referral information (7%).
Inspecting your facility’s consultation templates
Before conducting your heuristic evaluation, review a published guide like the one posted by Oracle Corporation (PDF). For more specific information about evaluating consultation templates, see our recently published article in the Journal of Medical Systems. Next, consider the three factors described below: choosing templates to evaluate, finding evaluators, and choosing heuristics to assess.
Which consultation templates?
A heuristic evaluation may cover one or more components of a health information technology. Our aim was to identify a broad range of possible usability problems within a purposive sample of VA outpatient consultation templates. Related topics that an evaluation could also cover are listed below:
- Electronic consultations (e-consults), which yield a consultant’s recommendations based on review of medical records instead of a clinical encounter with the patient.
- Referrals to out-of-network consultants.
- The organization of the consult menu (the electronic list of consulting services and/or templates).
- Combining multiple similar templates.
After deciding which templates to review, consider how to present the templates to your evaluators. If all evaluators can access the medical record system at your facility, and if they can find the consultation templates, they can review the templates directly. To prevent accidental ordering for real patients, your facility may supply fictitious records or grant access to an independent testing copy of the electronic health record system.
If direct access to the electronic health record system is not feasible for all evaluators, screenshots (screen captures) of the templates can be used. Using screenshots gains consistency at the expense of interactivity. It also requires creating, organizing, and sharing the images. Because templates may include items that are hidden by default, activate any hidden items before taking screenshots.
In Microsoft Windows, to take a screenshot of an opened template, use the built-in Snipping Tool and select the template’s window when prompted. For greater flexibility, press [Alt] and [Print Screen] together to copy a screenshot of the active window to the Clipboard, then paste it into an image editor or word processor. To capture templates spanning multiple pages, third-party software may help: try screenshot applications like Snagit (commercial license, $43) and image editors like Paint.NET (open source, no cost).
Which evaluators, and how many?
Evaluators with expertise in both human-factors engineering and medicine are the most desirable—and the most difficult to find. Therefore, an interdisciplinary team is recommended. If you identify qualified evaluators at other facilities, consider the costs and benefits of getting them credentialed at your facility. Most usability issues are expected to be discovered with three to five evaluators.
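The three-to-five guideline traces to Nielsen and Landauer’s model, which estimates the share of usability problems found by n independent evaluators as 1 − (1 − p)^n, where p is the probability that a single evaluator detects a given problem (about 0.31 on average in their published data; your templates may differ). A minimal sketch:

```python
def proportion_found(n_evaluators: int, p_single: float = 0.31) -> float:
    """Expected share of usability problems found by n independent
    evaluators, per Nielsen and Landauer's model: 1 - (1 - p)^n.
    The default p_single of 0.31 is their reported average; treat it
    as an assumption, not a property of your own templates."""
    return 1.0 - (1.0 - p_single) ** n_evaluators

for n in (1, 3, 5):
    print(f"{n} evaluator(s): {proportion_found(n):.0%}")
```

With the default p, one evaluator finds about a third of the problems and five find about 84 percent, which is why adding evaluators beyond five yields diminishing returns.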
Which heuristics?
Unlike in an informal expert review, each usability issue found in a heuristic evaluation is linked systematically to an applicable usability heuristic. As a result, your list of heuristics will shape the breadth and depth of your findings. Published heuristic evaluations of electronic health record interfaces often use J. Nielsen’s list of ten heuristics. An eight-item list is maintained by B. Shneiderman, and a 14-item list was published by J. Zhang and colleagues. Consider adding your own heuristics, as we did to address unmet communication needs. Whatever you decide, share operational definitions of all heuristics with your evaluators, and plan to resolve disagreements.
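If you start from Nielsen’s ten heuristics, it helps to circulate the heuristic names alongside one-line operational definitions. The definitions below are our own paraphrased shorthand, not Nielsen’s exact wording; adapt them before sharing with your evaluators:

```python
# Nielsen's ten usability heuristics, each paired with a one-line
# paraphrased definition (our shorthand, not Nielsen's exact wording).
NIELSEN_HEURISTICS = {
    "Visibility of system status": "Keep users informed about what is happening.",
    "Match between system and the real world": "Use the users' language, not system jargon.",
    "User control and freedom": "Provide undo, redo, and clear exits.",
    "Consistency and standards": "Follow platform and domain conventions.",
    "Error prevention": "Prevent problems before they occur.",
    "Recognition rather than recall": "Show options instead of requiring memory.",
    "Flexibility and efficiency of use": "Offer shortcuts for experienced users.",
    "Aesthetic and minimalist design": "Omit irrelevant or rarely needed information.",
    "Help users recognize, diagnose, and recover from errors": "Use plain-language messages that suggest a fix.",
    "Help and documentation": "Keep help easy to search and focused on tasks.",
}

for name in NIELSEN_HEURISTICS:
    print(name)
```

Custom heuristics, such as the communication-oriented ones we added, can simply be appended to the same mapping so that every evaluator works from one shared list.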
Typically, each issue found during a heuristic evaluation is assigned a severity rating. A common range is 0 through 4: An item rated 0 is not a problem, 1 is cosmetic, 2 is minor (greater than cosmetic), 3 is major, and 4 is critical for safe medical care. As with the heuristics, offer evaluators a guide to rating severity; plan to resolve disagreements.
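Putting these pieces together, each finding can be logged with the template it concerns, the heuristic it violates, and one severity rating per evaluator; disagreements can then be reconciled by a simple rule such as taking the median rating. The field names and the median rule below are illustrative choices, not a prescribed method:

```python
from dataclasses import dataclass
from statistics import median

# Severity scale from the article: 0 (not a problem) through 4 (critical).
SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor",
            3: "major", 4: "critical"}

@dataclass
class Finding:
    template: str        # e.g., the consultation template under review
    heuristic: str       # the heuristic this issue violates
    ratings: list        # one 0-4 severity rating per evaluator

    def consensus(self) -> int:
        # One simple way to reconcile disagreement: the median rating.
        return int(median(self.ratings))

f = Finding(template="Hand and Upper Extremity Clinic consult",
            heuristic="Recognition rather than recall",
            ratings=[2, 3, 3])
print(SEVERITY[f.consensus()])  # → major
```

The median is only one option; your team might instead discuss each disagreement to consensus, or keep the highest rating when patient safety is at stake.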
What happens afterward?
Heuristic evaluations focus on identifying problems rather than creating solutions, and designing templates is beyond the scope of this article. Nevertheless, the findings of a heuristic evaluation can point to a set of design goals. For example, our results suggested increasing opportunities for free-text responses while limiting the need to recall specific lab values and dates.
Your results may provide evidence for specific changes. For each consultation template that you propose to change, address the needs of at least three groups of stakeholders: referrers using the templates, consultants receiving the referrals, and the CACs (or, if outside VA, an equivalent group of clinical programmers) maintaining the templates. Referrers will appreciate a self-contained form that decreases the overall time to write the order. To compare completion times among template designs, you can use simulation software like CogTool (open source, no cost). Consultants will appreciate receiving complete information and a clear clinical question in each referral. Review any service agreements between referring and consulting services; determine how well the new template represents the agreement’s terms. Finally, CACs will want to build and maintain the template easily. To minimize translation work, design the template using a basic text editor like Windows Notepad.
Authors’ note: Himalaya Patel manages the Human–Computer Interaction and Simulation Laboratory (the HCI Lab) at Richard L. Roudebush VA Medical Center. April Savoy directs the HCI Lab at Roudebush VAMC and holds appointments at Indiana University East and at Regenstrief Institute. Michael Weiner is the chief of Health Services Research and Development at Roudebush VAMC and holds appointments at Indiana University School of Medicine and at Regenstrief Institute.