Published on 12.09.2024 in Vol 26 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/55247.
Usability Evaluation Methods Used in Electronic Discharge Summaries: Literature Review


Review

1The University of Sydney School of Pharmacy, Sydney, Australia

2The University of Queensland School of Pharmacy, Brisbane, Australia

3The University of Sydney School of Medicine, Sydney, Australia

4Nepean Kidney Research Centre, Nepean Hospital, Sydney, Australia

5Australian Commission on Safety and Quality in Health Care, Sydney, Australia

Corresponding Author:

Parisa Aslani, PhD

The University of Sydney School of Pharmacy

A15, Science Rd, Camperdown NSW 2050

Sydney

Australia

Phone: 61 2 9036 6541

Email: parisa.aslani@sydney.edu.au


Background: With the widespread adoption of digital health records, including electronic discharge summaries (eDS), it is important to assess their usability in order to understand whether they meet the needs of the end users. While there are established approaches for evaluating the usability of electronic health records, there is a lack of knowledge regarding suitable evaluation methods specifically for eDS.

Objective: This literature review aims to identify the usability evaluation approaches used in eDS.

Methods: We conducted a comprehensive search of PubMed, CINAHL, Web of Science, ACM Digital Library, MEDLINE, and ProQuest databases from their inception until July 2023. The study information was extracted and reported in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. We included studies that assessed the usability of eDS and the systems used to display eDS.

Results: A total of 12 records, including 11 studies and 1 thesis, met the inclusion criteria. The included studies used qualitative, quantitative, or mixed methods approaches and reported the use of various usability evaluation methods. Heuristic evaluation was the most frequently used method to assess the usability of eDS systems (n=7), followed by the think-aloud approach (n=5) and laboratory testing (n=3). These methods were used either individually or in combination with usability questionnaires (n=3) and qualitative semistructured interviews (n=4) to evaluate eDS usability issues. The evaluation processes incorporated usability metrics such as user performance, satisfaction, efficiency, and impact rating.

Conclusions: There are a limited number of studies focusing on usability evaluations of eDS. The identified studies used expert-based and user-centered approaches, which can be applied either individually or in combination to identify usability issues. However, further research is needed to determine the most appropriate evaluation method for assessing whether discharge summaries are fit for purpose.

J Med Internet Res 2024;26:e55247

doi:10.2196/55247

Introduction

The adoption of digital health platforms for collecting, sharing, and analyzing health information has shown positive associations with improvements in health care quality, service delivery, and clinical benefits including patient safety [1-4]. Electronic health records (eHRs) have become essential in acute care facilities as they enable the collection, sharing, and analysis of patient-related information, facilitating communication within and across health care settings. However, despite the substantial growth in the digitalization of health information exchange platforms, the complexity of many systems used by health care providers often poses challenges in achieving interoperability across different settings [5,6].

Differences in electronic systems used across health settings can affect the exchange of relevant patient health and clinical information, especially during transitions of care or clinical handover [7,8]. Evidence indicates that suboptimal communication between hospitals and external health care providers leads to discrepancies in medication records, duplication of tests, and avoidable delays in service provision, especially affecting vulnerable populations, including those with low levels of health care literacy [9,10]. Hence, a coordinated health system with improved health information exchange, usability, and interoperability across health facilities and settings has significant potential to improve postacute care transitions and overall patient safety [11].

Hospital discharge is a high-risk event where inaccurate or delayed transfer of clinical information, including medication plans, can significantly compromise patient safety and cause medication-related issues [12,13]. Therefore, clinical handover at the point of hospital discharge is a crucial step in patient care that determines the quality of care and patient safety. The introduction of electronic discharge summaries (eDS) has greatly improved the timely transmission of information to relevant stakeholders, mainly those in the primary care setting [8,14]. eDS, defined as “an end-to-end electronic transfer from the hospital to the community, using a secure messaging system, with the information populated using both pre-populated fields and manual transcription” [15], have seen increased adoption over the past decades. However, to further improve the quality of care and reduce communication delays between health settings, it is crucial that eDS be user friendly [16]. This will help minimize avoidable patient harm caused by usability issues.

Usability is generally defined as “the effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments” [17]. In the context of electronic systems, usability refers to whether the system is useful, usable, and satisfying for the intended users to enable completion of intended tasks in certain sequences [18]. Evidence indicates that there are several usability issues identified with eHRs, such as those related to data entry and alerts, interoperability, display, automation, and workflow [16]. These usability problems, in addition to affecting the implementation of such systems, have implications for patient safety, such as medication errors and the use of inappropriate medication doses [16,19]. Evaluation of systems used to prepare eDS provides an opportunity to identify and resolve usability issues with existing systems. Usability evaluation involves assessing the performance, efficiency, and satisfaction of electronic interfaces; it can identify usability issues with eHRs and thereby inform interventions to improve interface design, learnability, and service efficiency [20]. While various international organizations have developed and provided guidelines on the content, form, and presentation of eDS [21-27], less is known about the usability of eDS and the systems used to display them, and their potential impact on quality of care.

Evidence from systematic reviews has identified a range of usability evaluation techniques applied broadly to eHRs, including heuristic evaluation, cognitive walkthrough, think-aloud, user testing, and observation, coupled with questionnaires and interviews to assess participants’ perspectives and satisfaction [20,28]. However, there is limited evidence on usability evaluations applied specifically to eDS. Therefore, the aim of this literature review was to identify the usability evaluation techniques that have been used to assess the usability of eDS.


Methods

This literature review is reported in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines [29].

Literature Search

We searched PubMed, CINAHL, Web of Science, ACM Digital Library, MEDLINE, and ProQuest databases from their inception until July 2023. The search strategy was built around two main concepts, tailored to the individual databases. Concept 1: (“usability evaluation” OR “usability testing” OR “usability test” OR “usability engineering” OR “usability inspection”) AND Concept 2: (“discharge summar*” OR “discharge communication” OR “continuity of care” OR “transfer of care” OR “clinical handover” OR “electronic discharge” OR “patient discharge”).
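
To illustrate how the two concept blocks combine, the following sketch assembles a database-ready boolean query string; it is a simplified, hypothetical rendering, not the verbatim per-database strategies given in Multimedia Appendix 1.

```python
# Simplified sketch: combining the two search concepts into one boolean
# query string (the database-specific strategies appear in Multimedia
# Appendix 1; this rendering is illustrative only).

concept_1 = [
    "usability evaluation", "usability testing", "usability test",
    "usability engineering", "usability inspection",
]
concept_2 = [
    "discharge summar*", "discharge communication", "continuity of care",
    "transfer of care", "clinical handover", "electronic discharge",
    "patient discharge",
]

def or_block(terms):
    """Quote each term and join the list with OR, wrapped in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = f"{or_block(concept_1)} AND {or_block(concept_2)}"
print(query)
```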

To capture unpublished and unindexed documents, a gray literature search was conducted using Google Scholar and via a range of governmental and health authorities’ websites and guidelines. Reference lists of included studies were also manually searched to identify further eligible studies or government reports which may have been missed during our search. The full search strategy for all databases including gray literature sources is presented in Multimedia Appendix 1.

Study Selection

Search results were screened for eligibility following predefined inclusion and exclusion criteria. The retrieved studies were exported to EndNote and subsequently transferred to Covidence [30]. After removal of duplicates, the remaining documents were screened by title and abstract and then by full text by 2 independent reviewers (WT and MJ), with disagreements resolved through discussion until consensus was reached.

Eligibility Criteria

We included studies that evaluated the usability of eDS or discharge communication, as well as studies that evaluated usability issues of eHRs used to prepare an eDS where the findings had implications or relevance for eDS. The relevance of eHRs for inclusion was determined based on whether the included studies assessed electronic system interactions without explicitly mentioning eDS (eg, cross-facility health information exchange) or used an electronic platform known to have an eDS component (eg, My Health Record—an Australian digital platform containing a secure web-based summary of key patient health information, where health care providers can access the system to view and upload information). We also considered studies that focused on electronic health information provided to patients, with the aim of assessing the usability of such information to improve care after discharge. Quantitative, qualitative, and mixed methods studies were all eligible for inclusion.

Studies that evaluated the effectiveness of transfer of care tools or interventions on quality of care or patient outcomes but did not include usability evaluation of eDS were excluded. Studies addressing the use of tools without any usability assessment were also not the focus of this review. Publications in languages other than English were excluded. Finally, we also excluded protocol studies without any preliminary findings.

Operational Definitions

Discharge Summary

A range of information about events during care by a provider or organization, with the goal of providing relevant patient, clinical, and administrative information that enables continuity of care upon a patient’s discharge from hospital [21]. While our primary focus is on discharge summaries, we expanded our scope to include studies addressing usability issues with electronic discharge instructions or information provided to patients or other health care professionals. This was mainly done to understand and address the information needs and preferences of patients during their transition across different types of care.

Electronic Discharge Summary

Refers to a computerized form of discharge summary or instructions typically generated within electronic health records used in tertiary care.

Usability (of eHRs)

Refers to whether the electronic system is useful, usable, and satisfying for the intended users to enable completions of intended tasks in certain sequences [18].

End User

The user of the electronic interfaces, who could be a health professional (eg, physician, nurse, or pharmacist) or a consumer (patient or their caregiver).

Data Extraction and Synthesis

We extracted the following information from included studies: study characteristics (authors, publication year, and country), characteristics of the end users or participants targeted, study design (eg, mixed methods or qualitative), usability evaluation methods used (eg, questionnaires, interviews, or heuristic evaluation), study outcomes reported, and conclusions and limitations. These data were extracted using a standardized data extraction format modified from the Joanna Briggs Institute’s manual for evidence synthesis [31], which can be found in Multimedia Appendix 2. Due to the heterogeneity of the included studies, their participants, and their findings, quantitative analysis or meta-synthesis was not possible; we therefore conducted a systematic narrative synthesis of the major study findings and their implications.
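
For illustration, the extraction fields listed above can be represented as a structured record; the sketch below is a hypothetical rendering of the standardized format, not the actual template in Multimedia Appendix 2.

```python
# Hypothetical sketch of one entry in the standardized data extraction
# format; field names mirror the items listed above.
from dataclasses import dataclass, field

@dataclass
class ExtractionRecord:
    authors: str
    publication_year: int
    country: str
    participants: str            # end users or participants targeted
    study_design: str            # eg, mixed methods or qualitative
    usability_methods: list = field(default_factory=list)
    outcomes_reported: str = ""
    conclusions_limitations: str = ""

# Example entry, populated from Table 1
record = ExtractionRecord(
    authors="Watbled et al",
    publication_year=2018,
    country="France",
    participants="Human factors experts, medical professionals",
    study_design="Mixed methods",
    usability_methods=["heuristic walkthrough", "user testing"],
)
```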


Results

Characteristics of Evidence Source

Our search identified a total of 775 records (see the PRISMA flowchart in Figure 1). After removal of duplicate and irrelevant records, 34 were relevant for full-text review; of these, 12 met the eligibility criteria and were included in this review [32-43].

Figure 1. Flow diagram for study selection process.

The studies were conducted in the United States (n=5) [32,36,38-40], Australia (n=3) [34,41,42], Germany (n=2) [33,35], Canada (n=1) [37], and France (n=1) [43]. They used qualitative (including document review and semistructured interviews) or mixed methods (n=9) [32,33,35,37-39,41-43] or observational (n=2) [34,36] designs. One document was a thesis containing a study that used experimental and survey methods and presented some findings on usability testing [40]. Table 1 presents the key characteristics and major findings of the included studies; detailed study findings are summarized in Multimedia Appendix 2.

Table 1. Key study characteristics and major findings.

| Study | Country | Study design | Participants | Primary aim | Target of the usability evaluation | Major findings |
| --- | --- | --- | --- | --- | --- | --- |
| Barton et al [32] | United States | Qualitative evaluation | Emergency medicine physicians, nurses, geriatrician | To assess a method for integrating diverse expertise, such as clinical, patient, care partner, and IT, in the evaluation of a patient-facing emergency department after visit summary | eDS^a | Identified usability issues related to readability, comprehensibility, and content organization, highlighting the need to integrate experts’ perspectives during design |
| Busse et al [33] | Germany | Mixed methods (qualitative evaluation and observational) | Pediatric palliative care health care professionals | To evaluate how potential users from the pediatric palliative care setting perceived an electronic cross-facility system | Both contents of cross-facility medical records and the system used for presentation | Identified critical need for data transfer automation and suggested improvements in search functions and visualizations |
| Doyle et al [34] | Australia | Exploratory mixed methods | Parents of children and physicians | To understand parent and clinician experience of discharge communication and engagement in clinical research | System used for presenting electronic discharge instructions | High success rates and satisfaction scores were observed for both mobile and desktop interfaces, with most tasks completed successfully |
| Kernebeck et al [35] | Germany | Qualitative observational | Pediatric palliative care professionals | To evaluate the acceptance of the medication module from potential users’ perspective and to involve them in the development process | Both contents of cross-facility medical records and the system used for presentation | Identified usability issues related to performance expectancy and learnability; emphasized clarity and reduced cognitive load |
| Naik et al [36] | United States | Observational (user-centered) | People with colorectal cancer | To transform physician-centered discharge warnings into a patient-friendly format using health literacy and usability heuristics standards and cognitive interviews | Both eDS contents and the system used for presentation | Identified inconsistencies in content presentation and readability; highlighted importance of a patient-centered design |
| Soto et al [37] | Canada | Mixed methods study | General practitioners, family medicine residents | To improve health information exchange and use of clinical information for decision making | System used for eDS presentation | Identified usability issues related to drug prescription and medication list visualization |
| Tremoulet et al [38] | United States | Qualitative evaluation | Human factors experts, medical professionals | To conduct heuristic evaluation to identify potential usability problems and their level of severity | Both eDS contents and the system used for presentation | Identified usability issues related to content, comprehensibility, readability, presentation, and organizational aspects of medical documents |
| Tremoulet et al [39] | United States | Literature review with mixed methods study | Primary care physicians, nurses, nursing and medical directors, social workers, transition-of-care nurses | To provide insight into how existing acute care eDS support outpatient providers in the coordination of care of older adults | eDS | Identified usability issues affecting care coordination; emphasized need for standardization of discharge summaries |
| Vaigneur [40] | United States | Experimental and survey | Novice readers (caregivers) of discharge instructions | To examine the impact of adjusting readability level of discharge instructions on user comprehension and recall | eDS | High readability discharge instructions received more attention, better comprehension, and reduced mental demand compared with low readability instructions |
| Walsh et al [41] | Australia | Qualitative evaluation | My Health Record users | To identify potential usability issues within My Health Record focusing on eHealth literacy | Both contents of health information summary and the system used for presentation | Identified usability violations and problems related to language use, website navigation, design elements, and registration processes |
| Walsh et al [42] | Australia | Qualitative evaluation | My Health Record users | To identify usability issues with My Health Record through an updated heuristic evaluation | Both contents of health information summary and the system used for presentation | Identified violations of usability heuristics and highlighted unmet needs for individuals with low eHealth literacy |
| Watbled et al [43] | France | Mixed methods | Human factors experts, medical professionals | To apply a combination of methods for longitudinal usability evaluation throughout the system development lifecycle and to identify causes of usability flaws | System used for presentation of eDS | Identified multiple usability flaws in voice recording systems and emphasized thorough analysis and context-specific evaluations |

^aeDS: electronic discharge summaries.

Usability Evaluation Methods and Targets

Over half of the included studies [32,36-39,41-43] used a heuristic evaluation method, alone or in combination with other methods (Table 2). The method by Nielsen and Molich [44] or its modified versions [45] was the most commonly used heuristic evaluation approach among the included studies [32,36,38,39,41,42]. Watbled et al [43] reported a modified version of the heuristic usability evaluation method known as heuristic walkthroughs.

Table 2. Usability evaluation techniques used.

| Usability evaluation technique | Studies |
| --- | --- |
| Heuristic evaluation | Barton et al [32], Naik et al [36], Soto et al [37], Tremoulet et al [38], Tremoulet et al [39], Walsh et al [41], Walsh et al [42], Watbled et al [43] |
| Think-aloud | Busse et al [33], Doyle et al [34], Kernebeck et al [35], Soto et al [37], Watbled et al [43] |
| Laboratory testing (in situ observation, eye-tracking) | Vaigneur [40], Watbled et al [43] |
| Questionnaire (System Usability Scale) | Doyle et al [34], Tremoulet et al [38], Tremoulet et al [39], Vaigneur [40] |
| Interview | Busse et al [33], Kernebeck et al [35], Soto et al [37], Tremoulet et al [39] |
| Remote evaluation | Busse et al [33] |

User testing methods such as think-aloud [33,35] and eye-tracking and in situ observation techniques [40,43] were also used for usability evaluation of eDS systems. A combination of evaluation methods (eg, heuristics with the think-aloud technique, or the think-aloud method along with a questionnaire) was also used in certain instances [40,43]. Questionnaires such as the System Usability Scale (SUS) [38-40] and semistructured interviews [33,35,37,39] were also used by multiple studies, together with other usability evaluation approaches, to assess the satisfaction and perceptions of system users. Remote evaluation (via a Zoom-based videoconference) was successfully applied in 1 study [33].

The usability evaluation studies focused on different participant categories. Heuristic evaluations mainly involved experts who assessed the usability of the interface design, while other studies focused on either end users or a combination of experts and end users. The targeted end users included clinicians, medical secretaries, nurses, patients, and caregivers, while the experts were human factors experts [36,38,43] and domain experts (people with knowledge of the broader health system who are experienced users of My Health Record) [41,42].

Summary of Major Findings

The included studies identified several usability problems of varying degrees of severity in both the eDS and the systems used to prepare and display them. While some studies focused on the usability of the eDS themselves, such as content, comprehensibility, structure, and readability issues [32,36,38-40], other studies evaluated the usability of eDS systems in terms of presentation, design, and ease of use [33,35,41-43]. The studies used different usability metrics, such as usefulness, system efficiency, learnability, performance, and satisfaction, when evaluating the usability of the targeted systems [33-35,37,38,40]. While heuristic evaluation identified several organizational, layout, and formatting-related usability issues with different systems used to host eDS, the combined approach of heuristic walkthroughs with user testing proposed by Watbled et al [43] tended to identify more severe problems and also highlighted their potential negative impact. These included issues related to error management, workload, and compatibility, which could lead to serious outcomes such as prolonged deadlines for task completion, mistakes in patient identification, and inadequate error detection by users [43].

Studies that used heuristic evaluation overall identified several content, comprehensibility, readability, and structural usability flaws [32,33,36,38,40]. Visualization and presentation problems (eg, visualization of the medication list or diagnosis, and clarity and readability of medication documentation) were among the domains with the highest number of usability problems and may affect patient comprehension and safety [32,33,35,37,38]. Further, design readability and layout issues were associated with longer screen gaze durations, affecting comprehension of discharge instructions [40]. One study reported that less display fragmentation and fewer data entry requirements can reduce users’ cognitive load, confusion, and usability concerns [35]. Similarly, a study that used the eye-tracking method demonstrated that improving readability and layout was associated with less mental demand [40].

Concerns with language use, interface layout, and lack of audiovisuals were identified as common usability flaws in Australian studies that assessed usability issues with My Health Record, with implications for people with low electronic health literacy [41,42]. Another Australian study that assessed user satisfaction using the SUS questionnaire highlighted high acceptability of a digital discharge communication tool, with consumers and clinicians reporting high satisfaction scores on the mobile (94%) and desktop (93%) interfaces, respectively [34].

One study, which involved an information technology expert, assessed the likelihood of addressing usability issues for patient-facing emergency department visit summaries [32]. The study reported that nearly half of the usability issues identified were difficult to address (31/76 issues). These were issues where some information originated from different service vendors or where an eHR vendor was responsible for providing parts of the discharge summaries (eg, headers, content, and order of sections).


Discussion

This review summarizes study findings on usability evaluation approaches used to assess eDS and eDS systems. The limited published evidence revealed the use of heterogeneous usability evaluation techniques, spanning expert-based inspection, laboratory and user testing, and the use of questionnaires and interviews. Broadly, our findings highlight that heuristics (expert based) and think-aloud (user centered) were the most commonly used methods for evaluation of eDS and eDS systems. Other techniques, such as eye-tracking, direct observation, and questionnaire- and interview-based evaluations, were also used in combination with either heuristic or think-aloud approaches.

The heuristic evaluation method, consistent with previous findings on eHRs [28], was used by most of the included studies to evaluate the usability of eDS and eDS systems. This technique typically involves 3-5 experts independently applying a set of best practice design principles (referred to as heuristics) to identify usability flaws in system interfaces [44]. The heuristics used in an evaluation are either defined a priori by experts or derived from standard guidelines such as the ergonomic criteria [46], which comprise 8 main domains: guidance, workload, explicit control, adaptability, error management, consistency, significance of codes, and compatibility.

Our review indicates that the heuristic evaluation method can successfully identify a range of usability issues around the readability, comprehensibility, organization, and content of eDS interfaces [38,39]. In addition to identifying usability flaws with user interfaces, heuristic evaluation also enabled assessment of the severity of usability problems. Severity is often rated on the 5-step severity scale developed by Nielsen et al [47], a tool widely applied to assess the usability of medical technologies and their impact on patient safety. This severity scale ranges from “0” for no usability problem to “4” for usability catastrophe, with the mean of judgements from multiple evaluators used during heuristic usability evaluation [47].
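
To make the aggregation step concrete, the sketch below averages Nielsen-style severity judgements from several evaluators for each usability problem; the problem descriptions and ratings are invented for illustration.

```python
# Minimal sketch: averaging Nielsen 0-4 severity judgements from multiple
# evaluators per usability problem (all data below are hypothetical).
from statistics import mean

SEVERITY_LABELS = {
    0: "no usability problem",
    1: "cosmetic problem",
    2: "minor usability problem",
    3: "major usability problem",
    4: "usability catastrophe",
}

# Each problem maps to independent ratings from 3-5 evaluators.
ratings = {
    "medication list hard to locate": [3, 4, 3],
    "inconsistent section headers": [2, 1, 2],
}

for problem, scores in ratings.items():
    avg = mean(scores)
    label = SEVERITY_LABELS[round(avg)]  # nearest scale point, for reporting
    print(f"{problem}: mean severity {avg:.1f} ({label})")
```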

While heuristic evaluation has the advantage of being intuitive, efficient, and cheap, with fewer requirements for advance planning and involvement of test users [44], it identifies only about half of the usability problems related to the design of system interfaces [48]. A modified version called heuristic walkthroughs, which also involves the observation of end users, has been associated with better detection of usability problems, mainly those characterized as moderate and severe [49]. This is confirmed by one of the included studies, which reported heuristic walkthroughs to be effective in identifying more severe usability problems [43]. Despite the advantages of heuristic evaluation, this approach has certain limitations. For example, the heterogeneous nature of the heuristics or guidelines applied in different settings indicates the lack of a gold-standard guideline applicable to every context [50]. Also, because heuristics are broadly defined, they are often interpreted and applied differently by different experts [50]. These limitations highlight the need to explore alternative approaches to usability evaluation, preferably those that also consider input from end users.

The think-aloud technique is one of the controlled user testing approaches used by multiple studies in our review to evaluate the usability of eDS [33-35,37,43]. This method requires participants to verbalize their impressions about an interface while using it, enabling data collection from both direct observation and users’ self-reported statements [51,52]. This method has the advantage of providing insights into both design and learnability problems associated with systems [50]. Kernebeck et al [35] demonstrated that both effort and performance metrics can be effectively captured using a concurrent think-aloud evaluation approach, and emphasized the critical need to involve actual users from the start of the development process to enable a more transparent evaluation that meets the needs of end users. More importantly, the findings from Watbled et al [43] highlighted that this approach can be successfully integrated with heuristics, offering an advantage of a more holistic assessment of problems from both experts and users.

Eye-tracking is another controlled user testing method used for eDS usability evaluation [40]. In this method, eye-trackers record and analyze information on eye movement, fixation, and screen gaze to assess whether the tasks involved are demanding. Questionnaires and semistructured interviews were also used to assess usability issues with eHRs. Our review identified the use of the SUS [53]—a 10-item Likert-scale tool that provides an overall assessment of system usability. The SUS is a nonproprietary self-administered questionnaire with good validity and reliability; however, it is not robust and specific enough to identify usability issues specific to eHRs.
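
For reference, the SUS produces a 0-100 score from its 10 five-point Likert items; the sketch below implements Brooke’s standard scoring procedure [53] on a hypothetical set of responses.

```python
# Standard SUS scoring (Brooke [53]): odd-numbered items contribute
# (response - 1), even-numbered items contribute (5 - response), and the
# summed contributions are multiplied by 2.5 to yield a 0-100 score.

def sus_score(responses):
    """responses: ten Likert ratings (1-5), in questionnaire order."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses rated 1-5")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical respondent
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```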

The usability evaluation techniques used in the included studies, such as heuristics, not only identified issues related to eDS content, such as unnecessary or missing information, poor organization, and inconsistencies in formatting [32,38], but also shed light on visualizations within eDS systems, including those associated with presenting medication lists and diagnoses [33,35,37,43]. The usability problems identified in the eDS systems had significant consequences, for example, the need for extended deadlines for task completion and errors in patient identification, which ultimately impacted the systems’ quality and performance [43]. These findings emphasize the importance of improving the speed and quality of systems when designing technologies for use in the context of eDS. It has been proposed that integrating usability testing methods during the development of these systems can potentially reduce adverse health events and outcomes.

To provide optimal and safe health services, eDS should display clinically relevant, accurate, adequate, and clear information. The content and quality of discharge summaries have implications for patient outcomes after discharge from hospital [54]. While technological solutions can significantly improve the content and quality of transferred information, factors such as health literacy and individual patient differences are also important to consider during system implementation. This review highlights that this can be achieved by applying rigorous usability assessment techniques that combine expert-based approaches (heuristic evaluation and walkthroughs) with user-based methods (think-aloud) [50]. However, the limited number of studies assessing the usability of eDS or discharge instructions by patients with different levels of health literacy highlights the need for additional research.

Given the diverse user base of eHRs and discharge summaries in primary care settings, which includes physicians, nurses, pharmacists, and other health professionals, it is crucial to have systems that make it easy to navigate, gather, select, and interpret information. Therefore, it is important to use a robust usability assessment approach that takes into account the wide range of users, including health professionals and patients, to develop a platform that can be used without significant challenges. Usability issues such as poor organization, display fragmentation, workflow interference, and cognitive overload can degrade the quality of information required to enable clinical decision making by health professionals and, therefore, continuity of care [55]; systems should be designed to overcome them. With emerging interest around the International Patient Summary, which aims to provide a relatively generic means of communication for “unplanned, cross border care” [56], some of the identified usability techniques, especially those applicable to medication and condition summaries, can be used in this broader context.

Although most of the included studies assessed or explored different usability evaluation methods, the usability metrics used were heterogeneous. More studies focusing on standardized usability metrics such as efficiency, effectiveness, and satisfaction, as highlighted in ISO (International Organization for Standardization) 9241-11 Ergonomics of human-system interaction [17], may shed light on the most effective approach for usability evaluation of eDS. Overall, usability evaluations applied to interfaces should aim to achieve adequate validity, thoroughness, and reliability [57]. In this context, considering the limitations of individual techniques, adopting a multimodal evaluation approach, for example, combining heuristic evaluation with user testing methods or a questionnaire, may better achieve these objectives. More importantly, there should be an increased focus on developing and implementing usability evaluation techniques that consider factors such as learnability, regular use, error protection, accessibility, and maintainability, as highlighted in ISO 9241-11 [17]. Another important consideration is the limited range of geographical locations covered by the included studies, which may limit the applicability of the findings in other settings with different electronic health systems and infrastructures. Lastly, the evidence concerning usability evaluation theories, approaches, and implementation frameworks specific to discharge summaries remains notably scarce, highlighting the need for further research in the area.
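
As an illustration of how such standardized metrics can be operationalized, the sketch below computes a task completion rate (effectiveness) and a simple time-based efficiency measure from hypothetical task sessions; ISO 9241-11 defines these concepts rather than prescribing exact formulas, so the computations shown are one common convention.

```python
# Illustrative sketch: common operationalizations of two ISO 9241-11
# concepts, computed over hypothetical usability test sessions.

# Each session records (task completed successfully?, time taken in seconds).
sessions = [(True, 42.0), (True, 55.5), (False, 90.0), (True, 38.2)]

def effectiveness(sessions):
    """Share of task attempts completed successfully (completion rate)."""
    return sum(ok for ok, _ in sessions) / len(sessions)

def time_based_efficiency(sessions):
    """Mean of success/time across sessions: goals achieved per second."""
    return sum((1 if ok else 0) / t for ok, t in sessions) / len(sessions)

print(f"Effectiveness: {effectiveness(sessions):.0%}")
print(f"Efficiency: {time_based_efficiency(sessions):.4f} goals/second")
```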

Limitations

Even though we included a range of databases and gray literature sources, it is possible that we missed studies indexed in sources not included in this review. Our search strategy was specifically restricted to discharge summaries or instructions, which may have excluded usability evaluation techniques used in the context of eHRs in general; some techniques identified in previous work focusing on eHRs could also be relevant to eDS [28]. We also acknowledge that, despite our systematic and thorough approach, the potential for bias exists due to the reliance on a single reviewer for data extraction and quality appraisal.

Conclusions

We have identified multiple usability evaluation methods that can be used to identify usability concerns applicable to eDS, eDS systems, and other discharge communication tools. While the evidence in this area is still emerging, especially in terms of standardizing the usability metrics used, published studies indicate that a variety of generic methods can effectively assess different aspects of discharge summary content, including the presence of necessary information, organization, and formatting, as well as the presentation (display and layout) of the systems used to host the eDS.

Heuristic and think-aloud evaluation techniques emerged as the most commonly used methods. They were used either independently or in conjunction with other techniques, such as validated surveys or semistructured interviews. These methods not only identified usability issues with eDS and eDS systems but also revealed severe issues with implications for the quality and performance of these systems.

Acknowledgments

We would like to thank the Australian Commission on Safety and Quality in Health Care for funding this work.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Search strategy.

DOCX File , 16 KB

Multimedia Appendix 2

Characteristics of included studies.

DOCX File , 34 KB

Multimedia Appendix 3

PRISMA checklist.

DOC File , 65 KB

  1. Manca DP. Do electronic medical records improve quality of care? Yes. Can Fam Physician. Oct 2015;61(10):846-847. [FREE Full text] [Medline]
  2. Lin YK, Lin M, Chen H. Do Electronic Health Records Affect Quality of Care? Evidence from the HITECH Act. Information Systems Research. Mar 2019;30(1):306-318. [CrossRef]
  3. Kern LM, Edwards A, Kaushal R. The patient-centered medical home, electronic health records, and quality of care. Ann Intern Med. Jun 03, 2014;160(11):741-749. [CrossRef] [Medline]
  4. King J, Patel V, Jamoom EW, Furukawa MF. Clinical benefits of electronic health record use: national findings. Health Serv Res. Feb 2014;49(1 Pt 2):392-404. [FREE Full text] [CrossRef] [Medline]
  5. Reisman M. EHRs: the challenge of making electronic data usable and interoperable. P T. 2017;42(9):572-575. [FREE Full text] [Medline]
  6. Keesara S, Jonas A, Schulman K. Covid-19 and health care's digital revolution. N Engl J Med. Jun 04, 2020;382(23):e82. [CrossRef] [Medline]
  7. Unnewehr M, Schaaf B, Marev R, Fitch J, Friederichs H. Optimizing the quality of hospital discharge summaries--a systematic review and practical tools. Postgrad Med. Aug 2015;127(6):630-639. [CrossRef] [Medline]
  8. Hesselink G, Schoonhoven L, Barach P, Spijker A, Gademan P, Kalkman C, et al. Improving patient handovers from hospital to primary care: a systematic review. Ann Intern Med. Sep 18, 2012;157(6):417-428. [CrossRef] [Medline]
  9. Jones CD, Jones J, Bowles KH, Flynn L, Masoudi FA, Coleman EA, et al. Quality of hospital communication and patient preparation for home health care: results from a statewide survey of home health care nurses and staff. J Am Med Dir Assoc. Apr 2019;20(4):487-491. [FREE Full text] [CrossRef] [Medline]
  10. Brody AA, Gibson B, Tresner-Kirsch D, Kramer H, Thraen I, Coarr ME, et al. High prevalence of medication discrepancies between home health referrals and Centers for Medicare and Medicaid Services home health certification and plan of care and their potential to affect safety of vulnerable elderly adults. J Am Geriatr Soc. Nov 2016;64(11):e166-e170. [FREE Full text] [CrossRef] [Medline]
  11. Moore AB, Krupp JE, Dufour AB, Sircar M, Travison TG, Abrams A, et al. Improving transitions to postacute care for elderly patients using a novel video-conferencing program: ECHO-Care transitions. Am J Med. Oct 2017;130(10):1199-1204. [CrossRef] [Medline]
  12. Wilson S, Ruscoe W, Chapman M, Miller R. General practitioner-hospital communications: a review of discharge summaries. J Qual Clin Pract. Dec 2001;21(4):104-108. [CrossRef] [Medline]
  13. Perren A, Previsdomini M, Cerutti B, Soldini D, Donghi D, Marone C. Omitted and unjustified medications in the discharge summary. Qual Saf Health Care. Jun 2009;18(3):205-208. [FREE Full text] [CrossRef] [Medline]
  14. Reinke CE, Kelz RR, Baillie CA, Norris A, Schmidt S, Wingate N, et al. Timeliness and quality of surgical discharge summaries after the implementation of an electronic format. Am J Surg. Jan 2014;207(1):7-16. [CrossRef] [Medline]
  15. Safety and Quality Evaluation of Electronic Discharge Summary Systems: Final Report. URL: https://www.safetyandquality.gov.au/publications-and-resources/resource-library/safety-and-quality-evaluation-electronic-discharge-summary-systems-final-report [accessed 2022-10-07]
  16. Howe JL, Adams KT, Hettinger AZ, Ratwani RM. Electronic health record usability issues and potential contribution to patient harm. JAMA. Mar 27, 2018;319(12):1276-1278. [FREE Full text] [CrossRef] [Medline]
  17. Bevan N, Carter J, Harker S. ISO 9241-11 revised: what have we learnt about usability since 1998? Springer; 2015. Presented at: International Conference on Human-Computer Interaction; 2015 Jan 01; UK. [CrossRef]
  18. Zhang J, Walji MF. TURF: toward a unified framework of EHR usability. J Biomed Inform. Dec 2011;44(6):1056-1067. [FREE Full text] [CrossRef] [Medline]
  19. Ratwani RM, Savage E, Will A, Fong A, Karavite D, Muthu N, et al. Identifying electronic health record usability and safety challenges in pediatric settings. Health Aff (Millwood). Nov 2018;37(11):1752-1759. [CrossRef] [Medline]
  20. Ellsworth MA, Dziadzko M, O'Horo JC, Farrell AM, Zhang J, Herasevich V. An appraisal of published usability evaluations of electronic health records via systematic review. J Am Med Inform Assoc. Jan 2017;24(1):218-226. [FREE Full text] [CrossRef] [Medline]
  21. National Guidelines for On-Screen Presentation of Discharge Summaries. 2017. URL: https://www.safetyandquality.gov.au/publications-and-resources/resource-library/national-guidelines-screen-presentation-discharge-summaries [accessed 2023-10-11]
  22. Snow V, Beck D, Budnitz T, Miller DC, Potter J, Wears RL, American College of Physicians, Society of General Internal Medicine, Society of Hospital Medicine, American Geriatrics Society, American College of Emergency Physicians, et al. Society of Academic Emergency Medicine. Transitions of Care Consensus Policy Statement American College of Physicians-Society of General Internal Medicine-Society of Hospital Medicine-American Geriatrics Society-American College of Emergency Physicians-Society of Academic Emergency Medicine. J Gen Intern Med. Aug 2009;24(8):971-976. [FREE Full text] [CrossRef] [Medline]
  23. The SIGN Discharge Document, SIGN. 2012. URL: http://www.sign.ac.uk/guidelines/fulltext/128/index.html [accessed 2022-08-02]
  24. National Standard for Patient Discharge Summary Information. 2013. URL: https://www.hiqa.ie/ [accessed 2022-09-29]
  25. Standards for the Structure and Content of Health and Care Records. 2018. URL: https://www.rcplondon.ac.uk/projects/outputs/standards-clinical-structure-and-content-patient-records [accessed 2024-08-02]
  26. eDischarge summary standard V2.1. 2019. URL: https://theprsb.org/standards/edischargesummary/#:~:text=The%20eDischarge%20Summary%20Standard%20enables,is%20discharged%20from%20hospital%20care [accessed 2024-08-02]
  27. UHN Discharge Summary. 2017. URL: https://www.uhnmodules.ca/DischargeSummary/home.html [accessed 2024-08-02]
  28. Wronikowska MW, Malycha J, Morgan LJ, Westgate V, Petrinic T, Young JD, et al. Systematic review of applied usability metrics within usability evaluation methods for hospital electronic healthcare record systems: Metrics and Evaluation Methods for eHealth Systems. J Eval Clin Pract. Dec 2021;27(6):1403-1416. [FREE Full text] [CrossRef] [Medline]
  29. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. Mar 29, 2021;372:n71. [FREE Full text] [CrossRef] [Medline]
  30. Covidence. 2020. URL: https://www.covidence.org/reviewers [accessed 2020-06-23]
  31. Aromataris E, Munn Z. JBI manual for evidence synthesis. South Australia. JBI; 2020.
  32. Barton HJ, Salwei ME, Rutkowski RA, Wust K, Krause S, Hoonakker PL, et al. Evaluating the usability of an emergency department after visit summary: staged heuristic evaluation. JMIR Hum Factors. Mar 09, 2023;10:e43729. [FREE Full text] [CrossRef] [Medline]
  33. Busse TS, Jux C, Kernebeck S, Dreier LA, Meyer D, Zenz D, et al. Participatory design of an electronic cross-facility health record (ECHR) system for pediatric palliative care: a think-aloud study. Children (Basel). Sep 24, 2021;8(10):839. [FREE Full text] [CrossRef] [Medline]
  34. Doyle S, Pavlos R, Carlson SJ, Barton K, Bhuiyan M, Boeing B, et al. Efficacy of digital health tools for a pediatric patient registry: semistructured interviews and interface usability testing with parents and clinicians. JMIR Form Res. Jan 17, 2022;6(1):e29889. [FREE Full text] [CrossRef] [Medline]
  35. Kernebeck S, Jux C, Busse TS, Meyer D, Dreier LA, Zenz D, et al. Participatory design of a medication module in an electronic medical record for paediatric palliative care: a think-aloud approach with nurses and physicians. Children (Basel). Jan 06, 2022;9(1):82. [FREE Full text] [CrossRef] [Medline]
  36. Naik AD, Horstman MJ, Li LT, Paasche-Orlow MK, Campbell B, Mills WL, et al. User-centered design of discharge warnings tool for colorectal surgery patients. J Am Med Inform Assoc. Sep 01, 2017;24(5):975-980. [FREE Full text] [CrossRef] [Medline]
  37. Soto M, Sicotte C, Motulsky A. Using health information exchange: usability and usefulness evaluation. Stud Health Technol Inform. Aug 21, 2019;264:1036-1040. [CrossRef] [Medline]
  38. Tremoulet P, Krishnan R, Karavite D, Muthu N, Regli SH, Will A, et al. A heuristic evaluation to assess use of after visit summaries for supporting continuity of care. Appl Clin Inform. Jul 2018;9(3):714-724. [FREE Full text] [CrossRef] [Medline]
  39. Tremoulet PD, Shah PD, Acosta AA, Grant CW, Kurtz JT, Mounas P, et al. Usability of electronic health record-generated discharge summaries: heuristic evaluation. J Med Internet Res. Apr 15, 2021;23(4):e25657. [FREE Full text] [CrossRef] [Medline]
  40. Vaigneur HM. Engineering hospital discharge instructions: An eye-tracking based study. Ann Arbor. Clemson University; 2015:70.
  41. Walsh L, Hemsley B, Allan M, Adams N, Balandin S, Georgiou A, et al. The E-health literacy demands of Australia's my health record: a heuristic evaluation of usability. Perspect Health Inf Manag. 2017;14(Fall):1f. [FREE Full text] [Medline]
  42. Walsh L, Hemsley B, Allan M, Dahm MR, Balandin S, Georgiou A, et al. Assessing the information quality and usability of My Health Record within a health literacy framework: What's changed since 2016? Health Inf Manag. 2021;50(1-2):13-25. [FREE Full text] [CrossRef] [Medline]
  43. Watbled L, Marcilly R, Guerlinger S, Bastien JMC, Beuscart-Zéphir M-C, Beuscart R. Combining usability evaluations to highlight the chain that leads from usability flaws to usage problems and then negative outcomes. J Biomed Inform. Feb 2018;78:12-23. [FREE Full text] [CrossRef] [Medline]
  44. Nielsen J, Molich R. Heuristic evaluation of user interfaces. 1990. Presented at: Proceedings of the SIGCHI conference on Human factors in computing systems; 1990 March 01:249-256; United States. [CrossRef]
  45. Zhang J, Johnson TR, Patel VL, Paige DL, Kubose T. Using usability heuristics to evaluate patient safety of medical devices. J Biomed Inform. 2003;36(1-2):23-30. [FREE Full text] [CrossRef] [Medline]
  46. Scapin DL, Bastien JMC. Ergonomic criteria for evaluating the ergonomic quality of interactive systems. Behaviour & Information Technology. Jan 1997;16(4-5):220-231. [CrossRef]
  47. Nielsen J. Reliability of severity estimates for usability problems found by heuristic evaluation. 1992. Presented at: Posters and short talks of the 1992 SIGCHI conference on Human factors in computing systems; 1992 May 03; United States. [CrossRef]
  48. Thyvalikakath TP, Monaco V, Thambuganipalle H, Schleyer T. Comparative study of heuristic evaluation and usability testing methods. Stud Health Technol Inform. 2009;143:322-327. [FREE Full text] [Medline]
  49. Sears A. Heuristic walkthroughs: finding the problems without the noise. International Journal of Human-Computer Interaction. Sep 1997;9(3):213-234. [CrossRef]
  50. Jaspers MWM. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence. Int J Med Inform. May 2009;78(5):340-353. [CrossRef] [Medline]
  51. Bolle S, Romijn G, Smets EMA, Loos EF, Kunneman M, van Weert JCM. Older cancer patients' user experiences with web-based health information tools: a think-aloud study. J Med Internet Res. Jul 25, 2016;18(7):e208. [FREE Full text] [CrossRef] [Medline]
  52. Fan M, Lin J, Chung C, Truong KN. Concurrent think-aloud verbalizations and usability problems. ACM Trans. Comput.-Hum. Interact. Jul 19, 2019;26(5):1-35. [CrossRef]
  53. Brooke J. SUS-a quick and dirty usability scale. Usability evaluation in industry. 1996;189:4-7. [CrossRef]
  54. Lakhaney D, Banker SL. An evaluation of the content of pediatric discharge summaries. Hosp Pediatr. 2020;10(11):949-954. [CrossRef] [Medline]
  55. Johnson C, Johnston D, Crowle P. EHR Usability Toolkit: A Background Report on Usability and Electronic Health Records. Rockville, MD. Agency for Healthcare Research and Quality; 2011.
  56. Cangioli G. International patient summary implementation guide. The International Patient Summary. 2018. URL: https://international-patient-summary.net/
  57. Hartson HR, Andre TS, Williges RC. Criteria For evaluating usability evaluation methods. Int J Hum-Comput Int. Dec 2001;13(4):373-410. [CrossRef]


eDS: electronic discharge summaries
eHR: electronic health record
ISO: International Organization for Standardization
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
SUS: System Usability Scale


Edited by G Tsafnat; submitted 07.12.23; peer-reviewed by S Martikainen, A Bamgboje-Ayodele, LA Gomes, L Weinert, F Lau; comments to author 12.04.24; revised version received 17.05.24; accepted 30.06.24; published 12.09.24.

Copyright

©Wubshet Tesfaye, Margaret Jordan, Timothy F Chen, Ronald Lynel Castelino, Kamal Sud, Racha Dabliz, Parisa Aslani. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 12.09.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.