Published on 18.08.2020 in Vol 22, No 8 (2020): August

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/17022.
Technological Capabilities to Assess Digital Excellence in Hospitals in High Performing Health Care Systems: International eDelphi Exercise


Original Paper

1Usher Institute, University of Edinburgh, Edinburgh, United Kingdom

2Institute for the Study of Science, Technology and Innovation, University of Edinburgh, Edinburgh, United Kingdom

3UCL School of Pharmacy, London, United Kingdom

4National Institute for Health Research Imperial Patient Safety Translational Research Centre, London, United Kingdom

5National Health Services Arden and Greater East Midlands Commissioning Support Unit, Warwick, United Kingdom

6Business School, University of Edinburgh, Edinburgh, United Kingdom

7Imperial College, London, United Kingdom

8KLAS Research’s Arch Collaborative, London, United Kingdom

9Institute of Health Informatics, University College London, London, United Kingdom

Corresponding Author:

Kathrin Cresswell, PhD

Usher Institute

University of Edinburgh

Old Medical School

Teviot Place

Edinburgh, EH8 9AG

United Kingdom

Phone: 44 131 651 7878

Email: Kathrin.Cresswell@ed.ac.uk


Background: Hospitals worldwide are developing ambitious digital transformation programs as part of broader efforts to create digitally advanced health care systems. However, there is as yet no consensus on how best to characterize and assess digital excellence in hospitals.

Objective: Our aim was to develop an international agreement on a defined set of technological capabilities to assess digital excellence in hospitals.

Methods: We conducted a two-stage international modified electronic Delphi (eDelphi) consensus-building exercise, which included a qualitative analysis of free-text responses. In total, 31 international health informatics experts participated, representing clinical, academic, public, and vendor organizations.

Results: We identified 35 technological capabilities that indicate digital excellence in hospitals. These are divided into two categories: (a) capabilities within a hospital (n=20) and (b) capabilities enabling communication with other parts of the health and social care system, and with patients and carers (n=15). The analysis of free-text responses pointed to the importance of nontechnological aspects of digitally enabled change, including social and organizational factors. Examples included an institutional culture characterized by a willingness to transform established ways of working and openness to risk-taking. The availability of a range of skills within digitization teams, including technological, project management and business expertise, and availability of resources to support hospital staff, were also highlighted.

Conclusions: We have identified a set of criteria for assessing digital excellence in hospitals. Our findings highlight the need to broaden the focus from technical functionalities to wider digital transformation capabilities.

J Med Internet Res 2020;22(8):e17022

doi:10.2196/17022




It is now widely recognized that health information technology (HIT) has significant potential to transform health care systems and support continuous quality improvement efforts [1]. With growing recognition of this potential has come a strong international drive towards creating digitally advanced health care organizations. To this end, hospitals worldwide are now implementing ambitious digital transformation programs [2,3].

There are various ways to conceptualize and measure digital excellence in health care [4,5]. These approaches vary in scope from highly specialized models focusing on a specific technological subsystem [6], to those assessing digital transformation across an entire hospital, and others encompassing the wider integrated health and care ecosystem [7]. The origin of these models is also diverse, including international health care industry organizations such as the Healthcare Information and Management Systems Society (HIMSS) Analytics [8], national health care providers [9], and academic groups [10]. Common to all existing frameworks is the concept of digital transformation progressing towards advanced levels of digital maturity through a defined set of stages associated with different technological capabilities. Perhaps the best known of these is the HIMSS Analytics Electronic Medical Record Adoption Model (HIMSS EMRAM; Textbox 1). Policymakers and health care organizations commonly use these frameworks for baseline assessments of current levels of digital maturity and as a roadmap for a desired future state of maturity. As such, these frameworks actively shape the direction of digital transformation [11].

Despite substantive worldwide efforts to promote digital excellence, there is no consensus on how to conceptualize it, what capabilities characterize a digitally excellent hospital, and how to best measure progress in a changing environment [12]. New models are beginning to emerge that acknowledge the importance of locally formed priorities and the changing nature of what constitutes digital excellence over time. In this study, we sought to identify and reach consensus on a defined set of internationally relevant technological capabilities for hospitals in order to address current gaps in approaches to conceptualizing and assessing digital excellence.


The HIMSS EMRAM classification evaluates the extent to which electronic medical records (EMRs) have been adopted within a hospital over eight progressive stages (Levels 0-7).

A hospital’s digital transformation begins at Level 0, in which no electronic laboratory, pharmacy, or radiology systems are installed. The hospital then moves through Levels 1-7 by progressive adoption of various aspects of EMRs. These include limited ancillary departmental systems (Level 1), and adoption across an increasing number of hospital departments (Levels 1-6), culminating in a virtually paperless environment with complex EMRs implemented in over 90% of the hospital’s departments (Level 7).

A hospital can be assessed against the HIMSS classification to establish its current HIMSS Level, which in turn highlights what further technological capabilities the hospital needs to reach the next level of the HIMSS classification. HIMSS Level 7 is often considered a ‘gold standard’ for the digitization of hospitals and an aspirational endpoint guiding the design of a hospital’s digital strategy.

Textbox 1. HIMSS Analytics Electronic Medical Record Adoption Model (EMRAM).
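To make the stage-based logic of such models concrete, the following minimal Python sketch (our illustration only; the level requirements shown are simplified placeholders and not the actual HIMSS EMRAM criteria) represents a hospital's installed capabilities and derives the highest level whose cumulative requirements are met.

```python
# Illustrative sketch of a stage-based maturity model. The requirements below
# are simplified placeholders, not the actual HIMSS EMRAM criteria.

LEVEL_REQUIREMENTS = {
    0: set(),                                    # no digital requirements
    1: {"lab_system", "pharmacy_system", "radiology_system"},
    2: {"clinical_data_repository"},
    3: {"nursing_documentation"},
    4: {"cpoe"},
    5: {"closed_loop_medication_administration"},
    6: {"physician_documentation", "full_clinical_decision_support"},
    7: {"complete_emr", "data_analytics"},
}

def current_level(installed: set) -> int:
    """Return the highest level whose requirements are all met, in order."""
    level = 0
    for lvl in sorted(LEVEL_REQUIREMENTS):
        if LEVEL_REQUIREMENTS[lvl] <= installed:
            level = lvl
        else:
            break  # stage-based logic: progress stops at the first unmet level
    return level

hospital = {"lab_system", "pharmacy_system", "radiology_system",
            "clinical_data_repository", "nursing_documentation"}
print(current_level(hospital))  # prints 3 for this example hospital
```

The fixed, cumulative ordering of requirements encoded in this sketch is the defining property of stage-based models that our findings later call into question.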

Study background

This work was conducted as part of a national evaluation of the National Health Service (NHS) Global Digital Exemplar (GDE) Programme in England [13]. The GDE Programme aims to create a cohort of digitally excellent hospitals (“Exemplars”), which are expected to share their experiences and learning with other health care providers and contribute toward creating a national learning health care ecosystem [3]. We followed the reporting recommendations for Delphi studies outlined by Boulkedid and colleagues [14].

Overview of the Delphi Method

The Delphi technique is a structured process that involves presenting a series of surveys to a group of experts to seek their agreement on statements relating to a particular issue [15]. An initial survey informs the development of a second survey, which is returned to the experts, who are asked to reconsider their initial judgment in light of feedback from the first round. Consecutive rounds are carried out until consensus is reached [16]. The key strength of the Delphi method is that it supports consensus development in an area of uncertainty or limited empirical evidence [17]. The method allows researchers to draw on a wide range of experts' knowledge and experience. The feedback offered to participating experts between rounds has the potential to widen participants' outlooks and stimulate new ideas that can be expressed in subsequent rounds [17]. The anonymity offered by the method (the identity of experts and their contributions are not known to other experts taking part in the Delphi exercise or the public) also has the potential to facilitate the disclosure of opinions that may be underrepresented or left unexpressed in consensus-building approaches where participants are aware of each other's identity. The potential risks associated with the Delphi approach include a lack of accountability for anonymously presented views, and risks generally associated with consensus-building approaches, such as groupthink and a lack of diversity in the views represented in the outcomes [17]. After considering its strengths and weaknesses, we decided that the Delphi method would be an appropriate approach for addressing our aim of developing international agreement on a defined set of technological capabilities to assess digital excellence in hospitals in high performing health care systems.

To ensure a reasonable geographical spread of experts and relatively prompt completion of the Delphi exercise, we used a modified Delphi approach utilizing electronic communication with experts [18]. The modified electronic Delphi (eDelphi) technique has been widely used in health care and medical informatics, for example, to establish a set of readiness criteria for HIT innovations [19], to define key performance indicators to benchmark hospital information systems [20], and to identify ways to improve the delivery of medication alerts within computerized physician order entry (CPOE) systems [21].

The study took place between July 2018 and January 2019. Ethics approval was obtained from an Institutional Review Board at the School of Social and Political Science at The University of Edinburgh, United Kingdom. The Qualtrics platform was used to develop an online survey and collect data. SPSS Version 24 was used to conduct quantitative analyses, and NVivo Version 12 was used to analyze free-text responses.

The eDelphi Process

Identification of Experts

We identified a diverse group of international experts in the field of health informatics from leading clinical, academic, public, and vendor organizations, aiming for maximum variation in terms of geographical location, background (eg, academic, clinical, vendors), and gender. Our eligibility criteria included providing senior-level leadership in the field of health informatics and affiliation with a leading clinical, academic, public, or vendor organization. Experts were identified through the research team’s international academic and professional networks.

Development and Piloting of Candidate Capabilities

Our focus was to ensure that the proposed list of candidate technological capabilities forming the basis of the eDelphi exercise drew on ongoing work relating to digital excellence in hospitals. We used NHS England’s Digital Maturity Index as a basis for constructing the initial list [9]. The index was developed in 2013 based on HIMSS EMRAM but included additional dimensions of interoperability, technological readiness, and infrastructure. We then piloted this initial list with three clinical academics, which resulted in some changes to the wording to improve clarity.

Round 1 of the eDelphi

Identified experts received an invitation email explaining the rationale for and aim of the study, the reason they were invited, what taking part would involve, and a personalized link to the Round 1 online survey. Experts were asked to follow the link if they wished to participate. We sent up to three follow-up emails at 2-3 week intervals to those who did not complete the survey following the initial invitation.

The opening page of the online survey for Round 1 contained further details of the study and a link to a participant information sheet. We obtained informed consent from each participant before the start of the survey. Participants were given the option to receive a summary of the findings once the study was completed. The main body of the online survey consisted of the list of proposed technological capabilities identified in the piloting stage. Participants were asked to rate how much they agreed that each proposed capability could be used to assess the level of digital excellence in hospitals, using a scale ranging from 1 (strongly agree) to 9 (strongly disagree). Experts were also encouraged to comment on each capability to suggest more appropriate wording, merge, split, or remove the capability, or to add other comments. Finally, we asked experts for suggestions of any additional capabilities they wished to add to the list.

Analysis of Data From Round 1

The purpose of analysis at this stage was to produce material for Round 2 of the eDelphi exercise. First, we revised the list of proposed capabilities based on participants’ comments from Round 1, changing the wording and dividing some capabilities into two or more capabilities to improve precision and clarity. We also added capabilities proposed in Round 1 to the revised list. As the majority of candidate capabilities were revised following insights from Round 1, we decided not to remove any capabilities at this stage. We further produced a feedback document that contained a summary of experts’ comments and descriptive statistics from Round 1 for each capability.
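As an illustration of the descriptive feedback compiled at this stage, the short Python sketch below (our illustration with invented ratings; the actual analyses were run in SPSS) computes the median and interquartile range of expert ratings for each capability on the 1-9 scale used in the survey.

```python
# Minimal sketch of the per-capability descriptive statistics included in the
# Round 1 feedback document. Ratings are invented for illustration.
from statistics import median, quantiles

# Expert ratings on the scale from 1 (strongly agree) to 9 (strongly disagree).
round1_ratings = {
    "Closed-loop electronic medicines management": [1, 1, 2, 1, 2, 1, 3, 2, 1, 1],
    "Structured data captured at the point of care": [2, 1, 1, 2, 2, 1, 1, 4, 2, 1],
}

for capability, ratings in round1_ratings.items():
    q1, _, q3 = quantiles(ratings, n=4)  # lower and upper quartiles
    print(f"{capability}: median={median(ratings)}, IQR={q1:.0f}-{q3:.0f}")
```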

Round 2 of the eDelphi

Experts who completed Round 1 were invited to take part in Round 2 via an invitation email as before. Again, we sent up to three reminders at 2-3 week intervals to those who did not complete Round 2 following the initial email. An online version of the feedback document from Round 1 was also provided. Experts were given their score for each revised capability and asked if they wished to reconsider given the feedback from Round 1. If they replied “Yes,” they were given an option to amend their assessment using the same scale as in Round 1. Experts were asked to rate how much they agreed that each proposed new capability could be used to assess the level of digital excellence in hospitals on the same scale as for other capabilities. Experts were also able to comment on each capability, as above.

Analysis of Data From Round 2 and Definition of Consensus

Analysis following Round 2 aimed to identify any consensus on the capabilities and determine whether an additional round was needed. We defined consensus a priori as 70% agreement among experts that a specific capability should be included [17]. In other words, to be included, at least 70% of experts needed to “agree” or “strongly agree” to the appropriateness of the capability to define digital excellence in hospitals. After calculating the percentage of experts agreeing or strongly agreeing that the capability should be included, we removed all capabilities for which fewer than 70% agreed (see Multimedia Appendix 1), to produce the final list of capabilities.
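The a priori consensus rule can be expressed compactly. The sketch below (illustrative Python with invented Round 2 ratings) retains a capability only if at least 70% of experts rated it 1 ("strongly agree") or 2 ("agree") on the 9-point scale.

```python
# Sketch of the a priori consensus rule; ratings are invented for illustration.
CONSENSUS_THRESHOLD = 0.70  # at least 70% must agree or strongly agree

def meets_consensus(ratings):
    """True if >=70% of experts rated the capability 1 or 2 on the 1-9 scale."""
    agreeing = sum(1 for rating in ratings if rating <= 2)
    return agreeing / len(ratings) >= CONSENSUS_THRESHOLD

round2_ratings = {
    "Closed-loop electronic medicines management": [1, 1, 2, 1, 2, 1, 2, 1, 1, 2],
    "A hypothetical capability falling short": [1, 2, 5, 6, 2, 7, 3, 4, 2, 8],
}

final_list = [name for name, ratings in round2_ratings.items()
              if meets_consensus(ratings)]
print(final_list)  # only the first capability meets the 70% threshold
```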

Qualitative Data Collection and Analysis

To supplement the consensus-building exercise with additional insights, we incorporated several open-ended questions into the surveys, for which experts were able to provide free-text responses. In Round 1, we gave one current definition of digital maturity proposed by the MIT Sloan Management Review and asked experts to comment on this definition in the health care context [22]. We also asked experts to comment on the role of nontechnological factors (eg, strategy, workforce, culture) in the context of digital excellence in hospitals. Some feedback from Round 1 suggested that the initial list of capabilities focused too narrowly on the internal operations of hospitals. In Round 2, we therefore asked experts to comment on (a) conceptualization of digital excellence in hospitals in the context of the broader health care ecosystem; and (b) digital excellence in the context of a patient-centered health care perspective. We analyzed data from all free-text entries using thematic analysis to identify key themes of digital excellence in health care [23]. The qualitative data were initially analyzed by one researcher (MK). The resulting coding framework and the analyses were then reviewed and expanded by the wider team (KC, RW, AS). Researchers from a variety of backgrounds (eg, social sciences, public health, informatics) were involved in the analysis of the qualitative data, and diverging findings and viewpoints were discussed in detail in order to minimize the risk of bias.


eDelphi Process and Expert Characteristics

In total, 77 experts were invited. Of these, 34 agreed to take part and completed Round 1 (44% response rate), and 31 of the 34 completed Round 2 (91% response rate). Table 1 describes the characteristics of the 31 experts who took part in both rounds; Figure 1 outlines the steps involved.

Table 1. Expert characteristics.

Characteristic | Experts approached to take part (n=77), n (%) | Experts who took part in both rounds (n=31), n (%)

Sector
  Clinical | 17 (22) | 6 (19)
  Academia | 35 (45) | 17 (55)
  Policy | 5 (6) | 1 (3)
  Vendor | 20 (26) | 7 (23)

Region
  North America | 21 (27) | 12 (39)
    United States | 21 (27) | 12 (39)
  South America | 2 (3) | 1 (3)
    Brazil | 2 (3) | 1 (3)
  Europe | 46 (60) | 15 (48)
    United Kingdom | 23 (30) | 6 (19)
    Spain | 6 (8) | 4 (13)
    Norway | 4 (5) | 2 (7)
    Denmark | 3 (4) | 1 (3)
    Sweden | 2 (3) | 1 (3)
    Slovenia | 2 (3) | 1 (3)
    Belgium | 1 (1.5) | 0 (0)
    Estonia | 1 (1.5) | 0 (0)
    Finland | 1 (1.5) | 0 (0)
    Russia | 1 (1.5) | 0 (0)
    Austria | 1 (1.5) | 0 (0)
    Germany | 1 (1.5) | 0 (0)
  Australasia | 6 (8) | 3 (10)
    Australia | 6 (8) | 3 (10)
  Middle East | 2 (3) | 0 (0)
    Saudi Arabia | 1 (1.5) | 0 (0)
    Israel | 1 (1.5) | 0 (0)

Gender
  Female | 11 (14) | 4 (13)
  Male | 66 (86) | 27 (87)
Figure 1. Flow diagram for the eDelphi exercise.

Digital Excellence in Hospitals

Experts identified 35 technological capabilities that were judged to characterize digital excellence in hospitals (Tables 2-4). The technological capabilities fell into two categories: (a) capabilities within a hospital, and (b) communication with other parts of the health care system, and with patients and carers. The need to distinguish between capabilities within hospitals and those relating to the broader context of the health care ecosystem was emphasized in free-text comments, for example:

There is an important assessment on where enterprises (eg, hospitals) are, versus where those enterprises sit in an ecosystem and how they interact with those wider ecosystems.
[Vendor]
Technological Capabilities Within a Hospital

The largest category, technological capabilities within a hospital, comprised 20 items (Table 2). These included technologies to promote the appropriate use and administration of medication, capabilities to capture structured and unstructured data, and the ability to integrate new advanced technologies (eg, natural language processing) within existing systems.

The largest proportion of capabilities within this category (five of 20) related to medicines management. The highest level of agreement (90%) was for the capability describing closed-loop electronic medicines management, closely followed by capabilities relating to the effective capture of clinical data.

Experts proposed four new capabilities in Round 1 (Capabilities 10, 12, 13, and 15; Table 2). These concerned advancements in electronic medical records and electronic prescribing and medicines administration systems to improve user experience (for example, Capability 10, ‘A single list of all medication for one patient is available’), and the integration of new technologies and analytical approaches into existing systems (for example, Capability 15, ‘Use of machine learning and adding third party programs through Application Programming Interfaces’).

Table 2. Technological capabilities within hospitals.
Agreed list of capabilities | “Strongly agreed” and “agreed”^a (%) | Number of experts who agreed (n=31) | Median | IQR^b
1. Closed-loop electronic medicines management and optimization (electronic prescribing with technology-assisted identification of both patient and medication, eg, bar codes or RFID^c tags) | 90 | 28 | 1 | 1-2
2. Effective mechanisms to collect and record complete, accurate and high-quality patient/clinical data | 87 | 27 | 2 | 1-2
3. Structured data (records, assessments, and plans) captured digitally at the point of care | 87 | 27 | 1 | 1-2
4. Orders (eg, lab tests) are ordered, and results reported in a coded form (ie, using standard compendiums and international vocabulary standards, including dm+d^d), and acknowledged electronically in the system | 84 | 26 | 1 | 1-2
5. Effective mechanisms to review and improve the quality of patient/clinical data | 84 | 26 | 2 | 1-2
6. Flexible digital systems guiding clinicians along evidence-based, person-specific, clinical pathways | 81 | 25 | 2 | 1-2
7. Unstructured data (eg, notes, free text) captured at the point of care when appropriate | 81 | 25 | 2 | 1-2
8. Person reading/acting on the results acknowledges this electronically in the system | 81 | 25 | 1 | 1-2
9. Cybersecurity strategy and continuity processes in place and implemented effectively | 81 | 25 | 1 | 1-2
10. A single list of all medication for one patient is available^e | 81 | 25 | 1 | 1-2
11. Management intelligence through digital health data | 81 | 25 | 1.5 | 1-2
12. Reducing the need for duplicate entry of patient data to near-zero^e | 81 | 25 | 2 | 1-2
13. Third-party tools can be added through Application Programming Interfaces^e | 81 | 25 | 2 | 1-2
14. Advanced clinical decision support (eg, integrated with lab data, diagnosis codes) with alerts that are both sensitive and specific and therefore less likely to result in alert fatigue | 77 | 24 | 2 | 1-2
15. Use of machine learning and automation when appropriate (eg, analysis of radiology images)^e | 77 | 24 | 2 | 1-2
16. Clinical intelligence through digital health data | 77 | 24 | 1 | 1-2
17. The ability to monitor outcome data for modifying clinical pathways based on digital tools and services | 77 | 24 | 2 | 1-2
18. Open Application Programming Interfaces allowing different software components to interact | 74 | 23 | 1 | 1-3
19. Supporting end-to-end redesign and improvement of clinical pathways based on digital tools and services | 74 | 23 | 2 | 1-3
20. Advanced analytics capability to support the move from reactive to proactive/predictive models of care | 74 | 23 | 2 | 1-3

^a Experts scored each capability using a scale ranging from “1” (strongly agree) to “9” (strongly disagree).
^b IQR: Interquartile range.
^c RFID: Radio Frequency Identification.
^d dm+d: Dictionary of Medicines and Devices.
^e New capabilities suggested by experts in Round 1 of the eDelphi.

Communication With Other Parts of the Health Care System, Patients, and Carers

This category related to enabling the exchange of information and communication beyond an individual hospital setting, including communication with other parts of health and social care systems (Table 3) and communication with patients and carers (Table 4). In total, this category comprised 15 capabilities, of which ten related to communication with other parts of health care systems and five to communication with patients and carers. Experts proposed two new capabilities: the use of a unique patient identifier (Capability 3, Table 3) and the ability to exchange information with other systems based on shared standards (Capability 6, Table 3).

Table 3. Technological capabilities related to communication with other parts of the health and social care system.
Agreed list of capabilities | “Strongly agreed” and “agreed”^a (%) | Number of experts who agreed (n=31) | Median | IQR^b
1. Exchange of prescription information in a structured way within and between organizations and sectors | 87 | 27 | 1 | 1-2
2. Local sharing of relevant data across the local health care ecosystem facilitated by interfacing or interoperability of electronic systems | 84 | 26 | 1 | 1-2
3. A unique patient identifier used across the health care system^c | 84 | 26 | 1 | 1-2
4. Data analysis at scale and use of insights to deliver targeted care for high-risk and high-use groups of patients (eg, diabetes, chronic obstructive pulmonary disease, asthma) across a population or area | 84 | 26 | 2 | 1-2
5. Using digital systems to enable the seamless (through interfaces/integration) flow and use of information/data across organizational boundaries within a local health care ecosystem | 81 | 25 | 1 | 1-2
6. Ability to interoperate with other standard-based external systems^c | 81 | 25 | 2 | 1-2
7. Referrals within and between hospitals are always managed electronically | 77 | 24 | 1 | 1-2
8. Ability to send communications to primary care and social care through a variety of media | 77 | 24 | 2 | 1-2
9. Ability to produce data for audits and other reports based on the routine collection of complete, accurate, and quality data | 74 | 23 | 2 | 1-3
10. Discharge to primary care and community is always managed electronically | 71 | 22 | 1 | 1-2

^a Experts rated how much they agree that the capability can be used to assess the level of digital excellence in hospitals on a scale from “1” (strongly agree) to “9” (strongly disagree).
^b IQR: Interquartile range.
^c New capabilities suggested by experts in Round 1 of the eDelphi.

Table 4. Technological capabilities related to communication with patients and carers.
Agreed list of capabilities | “Strongly agreed” and “agreed”^a (%) | Number of experts who agreed (n=31) | Median | IQR^b
1. Records, assessments, and plans shared digitally and easily accessible to patients and carers to enter and amend the data securely and confidentially | 90 | 28 | 1 | 1-2
2. Records, assessments, and plans shared digitally and easily accessible to patients and carers to view the data securely and confidentially | 87 | 27 | 1 | 1-2
3. Ability to receive communications from patients and carers through a variety of media | 74 | 23 | 2 | 1-3
4. Ability to send communications to patients and carers through a variety of media | 74 | 23 | 2 | 1-3
5. Using mobile technologies to support the delivery of care outside traditional settings and closer to home | 71 | 22 | 2 | 1-3

^a Experts rated how much they agree that the capability can be used to assess the level of digital excellence in hospitals on a scale from “1” (strongly agree) to “9” (strongly disagree).
^b IQR: Interquartile range.

Broader Aspects of Digitally Enabled Change: Culture, Skills, and Strategy

Free-text responses emphasized that technologies should not be viewed in isolation and that social and organizational factors were crucial for digital transformation (Table 5).

Organizational culture, characterized by a willingness to transform established ways of working and an openness to risk-taking, was frequently mentioned as key to promoting digital transformation:

It is important to have a culture where individuals are prepared to change their ways of working and take some risks with an understanding of the overall good that will be achieved.
[Policy expert]

Experts also frequently mentioned the need for a diverse set of interdisciplinary skills supporting these transformations. Here, participants called for a range of technological, project management, and business expertise:

Digital health is a diverse, interdisciplinary sector, something that is reflected in the skills required in the field, ranging from higher-level computing, such as software development and software engineering to project management and business-related skills.
[Vendor]

Experts further highlighted the need for sufficient resources for the existing staff base, and for meeting their emerging training needs, to support digital transformation:

A digital agenda cannot be delivered without sufficient staff, who are experienced and well trained within the digital team.
[Clinician]
Table 5. Social and organizational factors contributing to digital maturity.
Factor | Description
Organizational culture
  • Willingness to face the new, change the way of thinking, and to take risks
  • Culture of allowing innovations
  • Understanding of change management
  • Culture free of bullying and harassment
  • Leadership to support digital transformation
Workforce
  • Skills within the digital team: software development, software engineering, project management, business-related skills
  • Skills across the hospital’s workforce: the ability to perform one’s role using digital tools
  • Professionalization of health informatics
Strategy
  • Putting clinical benefits at the center of clinical strategy
  • Aligning digital strategy with the overall strategy of the hospital
  • Support of the digital agenda from the hospital’s board

Summary of Findings

We have established consensus on a discrete set of internationally relevant technological capabilities to indicate digital excellence in hospitals. Engaging international experts through the transparent process of the Delphi technique allowed us to develop a detailed, multi-axial mapping of digital excellence, which decision-makers may use to inform digital transformation strategy and evaluation. The outcomes of this eDelphi process mark a significant departure from existing tools such as HIMSS EMRAM and the NHS Digital Maturity Index [8,9]. First, our results point to a shift away from the description of purely technological functionalities towards digital transformation capabilities and highlight a need to be cognizant of cultural and strategic factors, such as skills and resources, to support the digitally enabled transformation of health care. Second, our findings indicate that the concept of digital excellence is moving beyond the physical boundaries of acute hospitals. Thus, once a certain level of digitization and data sharing is achieved within hospitals, strategic direction needs to shift towards data sharing and integration across local/regional/national ecosystems that encompass primary and social care providers and enable patient self-management.

Strengths and Limitations

This study is the first attempt to achieve international consensus on a defined set of technological capabilities to indicate digital excellence in hospital settings. We recruited a relatively large sample of international experts from a variety of countries and achieved a good overall response rate. There is considerable variation in the number of experts involved in Delphi studies, and no consensus exists as to what constitutes an optimal number of experts [24,25]. However, the available evidence indicates that the Delphi technique produces reliable outcomes for sample sizes of 20 or above, and that increasing the number of experts beyond that does not significantly change the outcomes [25]. We do, however, acknowledge that including a larger number of experts in this eDelphi could have provided valuable additional insights. Our participants reached consensus after two rounds, and we decided not to conduct further rounds. Evidence in the literature indicates that Delphi studies consisting of two to three rounds are preferred [17]. Additionally, with busy experts and clinicians (as in this work), response exhaustion tends to occur after two rounds, resulting in limited new insights in subsequent rounds [26].

Our identified criteria have the potential to be used internationally, although our sampling reflects a subset of predominantly English-speaking, economically developed countries and, therefore, high-performing health care systems. Our sample also exhibits a strong gender bias, with 27 of 31 Round 2 participants being men. This bias may mirror a broader gender imbalance across the digital health leadership community, but it is nonetheless likely to have affected our findings and conclusions.

A more general concern is that the eDelphi process itself has some limitations. It may, to some extent, force consensus and reinforce dominant views (although controlled, anonymized feedback should minimize normative pressure to align views) [16,27]. The addition of a qualitative component may have helped mitigate this risk by allowing dissenting voices to be heard and by allowing discussion of the complexity of the context in which attempts to measure excellence take place. While we originally intended to examine differences between groups of experts (eg, experts from different regions, and from the commercial and public sectors) with regard to their ratings of different capabilities, we were not able to conduct these analyses meaningfully given the sample size and the small overall variance in our data.

Integration of Findings With the Current Literature

Most existing models seeking to define digital excellence in health care settings are hospital-focused and stage-based [28]. Our findings question the appropriateness of such one-directional models, which assume that organizations and the people within them progress towards increasingly advanced levels of maturity through a predefined set of consecutive stages, each associated with certain characteristics. Indeed, numerous accounts of organizations leapfrogging stages undermine the tacit assumption of stage-based models that progress follows a fixed sequence [29]. Stage-based models are popular, perhaps because they promise a simple way to measure progress, but they give little scope for health systems and individual organizations to articulate their local priorities [12]. Furthermore, ‘one size fits all’ assessment criteria enforce a common standard even under circumstances where achieving it may not be appropriate or may impose disproportionate costs. Maturity models from outside the health sector can offer insights into possible ways of addressing some of the limitations of the currently dominant, linear, one-directional approaches in health care. For example, the Deloitte Maturity Model [30], developed primarily to meet the needs of the telecommunications industry, proposes to assess digital maturity using 179 digital capabilities grouped into five categories representing the core dimensions of the functioning of an organization (eg, ‘customers’, ‘operations’). An organization can choose which capabilities from which dimension to develop, and in what order, based on its local priorities. This flexibility, in turn, allows organizations to articulate local needs and to follow a digital journey appropriate to the local context. Our findings support the increasing recognition that the particular organizational and cultural environments of health systems are important factors when considering digital excellence [10,31].
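The contrast with stage-based models can be made concrete with a short sketch. The Python fragment below (our illustration; the dimensions and capabilities are invented and do not reproduce the Deloitte model's 179 items) represents capabilities grouped by dimension and reports per-dimension coverage, so an organization can prioritize whichever dimension matches its local strategy rather than climbing a single fixed ladder.

```python
# Illustrative sketch of a flexible, capability-based assessment. Dimension and
# capability names are invented for illustration.
capability_model = {
    "customers":  {"patient portal", "shared care records", "remote monitoring"},
    "operations": {"closed-loop medicines management", "e-referrals", "analytics"},
    "workforce":  {"digital skills training", "informatics roles"},
}

def maturity_profile(achieved):
    """Coverage per dimension, rather than a single stage on a fixed ladder."""
    return {dimension: len(capabilities & achieved) / len(capabilities)
            for dimension, capabilities in capability_model.items()}

# A hospital that has prioritized patient-facing capabilities first.
achieved = {"patient portal", "remote monitoring", "e-referrals"}
print(maturity_profile(achieved))  # eg, customers 0.67, operations 0.33, workforce 0.0
```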

The existing literature predominantly places large acute hospitals at the center of discussions of digital excellence [32]. Our study highlights how the entrenched focus on acute hospitals can draw attention away from integration across the health care ecosystem, even though integrated, patient-centered care has become a key component of current health policies internationally [33]. In line with this, HIMSS Analytics recently developed the Continuity of Care Maturity Model (CCMM) [34]. CCMM, like EMRAM, comprises eight stages and includes dimensions such as interoperability, exchange of information, coordination of care, patient involvement, and use of HIT to optimize clinical and financial outcomes. However, this extended HIMSS classification focuses on the individual health care provider rather than considering the entire health care system, and it remains a stage-based approach. It also remains mainly relevant to the hospital-centric United States context.

There is only limited evidence that meeting all criteria in any index of digital excellence leads to improved quality, safety, or efficiency outcomes, although some functionalities, such as clinical decision support systems, have been shown to improve practitioner performance [35,36]. At the same time, renewing digital infrastructure to meet the ever-expanding requirements of each progressive stage of digital maturity indices such as HIMSS EMRAM is costly. Many hospitals may therefore choose not to advance across stages of digital maturity because of high costs combined with insufficient evidence of the desired return on investment. Thus, although digital excellence indices are commonly viewed as a proxy measure for improvement in efficiency and safety, there is limited evidence that adoption of these models will per se deliver such improvements [37].

Implications for Research, Policy, and Practice

The identified technological capabilities have the potential to serve as a practical means to baseline and measure digital progress within acute hospital settings and their wider health care context. They can also promote international comparisons. Future work should focus on developing an assessment tool based on these identified capabilities. This needs to include establishing a scoring mechanism and weighting criteria for the capabilities comprising the tool and demonstrating the tool’s reliability and validity, including responsiveness to change and discriminatory properties. This work could be facilitated by using the set of capabilities to assess digital excellence in selected hospitals worldwide. It might also be valuable to investigate whether our set of capabilities could be further divided into more detailed categories to provide a better understanding of dimensions constituting digital excellence in hospitals. Additionally, there is a need for further efforts aiming to develop an agreement on what constitutes digital excellence for health care providers that includes views of additional stakeholders such as politicians, decision makers, and authorities.
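One possible direction for such a tool, sketched below purely as an illustration (the weights and attainment scores are invented; deriving defensible weights and validating the resulting score is precisely the future work described above), is a weighted score across the agreed capabilities.

```python
# Hypothetical sketch of a weighted scoring mechanism over the agreed
# capabilities. Weights and attainment values are invented for illustration.

# Weight per capability (eg, informed by expert agreement levels) and a
# hospital's self-assessed attainment of each capability on a 0-1 scale.
weights = {
    "Closed-loop electronic medicines management": 0.90,
    "Structured data captured at the point of care": 0.87,
    "Open APIs allowing software components to interact": 0.74,
}
attainment = {
    "Closed-loop electronic medicines management": 1.0,
    "Structured data captured at the point of care": 0.5,
    "Open APIs allowing software components to interact": 0.0,
}

def digital_excellence_score(weights, attainment):
    """Weighted average attainment across capabilities, scaled to 0-100."""
    total_weight = sum(weights.values())
    achieved = sum(weight * attainment.get(capability, 0.0)
                   for capability, weight in weights.items())
    return 100 * achieved / total_weight

print(round(digital_excellence_score(weights, attainment), 1))  # 53.2
```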

Conclusions

We have identified an internationally agreed set of technological capabilities that constitute digital excellence in hospitals. Our study also foregrounds the managerial and cultural skills necessary for successful, digitally enabled change. Finally, it highlights the need to integrate digital capabilities across the wider health and care ecosystem to deliver safe, high-quality, and patient-centered care. Digital implementation strategies and indicators need to be positioned within this wider health system landscape to enable and foster transformational change in health care internationally.

Acknowledgments

We gratefully acknowledge the input of the wider GDE evaluation team and the Steering Group of this evaluation, especially Ann Slee (Associate CCIO, Medicines, NHSX). We would also like to thank the experts who completed the study surveys and made this work possible.

This article is based on a program of independent research funded by NHS England. The views expressed are those of the author(s) and not necessarily those of the NHS, NHS England, NHSX, or NHS Digital. This work was also supported by the National Institute for Health Research (NIHR) Imperial Patient Safety Translational Research Centre and Health Data Research UK. The views expressed in this publication are those of the authors and not necessarily those of the NHS, the NIHR, or the Department of Health and Social Care.

Authors' Contributions

MK conducted the study and drafted the paper. KC, RW, and AS conceived and designed the study and commented on multiple versions of the draft. BDF, CH, WL, HM, KM, SE, SH, RD, and HWWP commented on multiple versions of the draft.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Capabilities that did not meet the inclusion criteria.

DOCX File, 16 KB

  1. Wachter RM. Making IT Work: Harnessing the Power of Health Information Technology to Improve Care in England. Report of the National Advisory Group on Health Information Technology in England. Department of Health and Social Care; 2016 Sep 07.   URL: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/550866/Wachter_Review_Accessible.pdf [accessed 2019-08-15]
  2. Blumenthal D. Launching HITECH. N Engl J Med 2010 Feb 04;362(5):382-385. [CrossRef] [Medline]
  3. Global Digital Exemplars. NHS England.   URL: https://www.england.nhs.uk/digitaltechnology/connecteddigitalsystems/exemplars/ [accessed 2019-08-15]
  4. Carvalho JV, Rocha A, Abreu A. Maturity Models of Healthcare Information Systems and Technologies: a Literature Review. J Med Syst 2016 Jun;40(6):131. [CrossRef] [Medline]
  5. Gomes J, Romão M. Information System Maturity Models in Healthcare. J Med Syst 2018 Oct 16;42(12):235. [CrossRef] [Medline]
  6. van de Wetering R, Batenburg R. A PACS maturity model: a systematic meta-analytic review on maturation and evolvability of PACS in the hospital enterprise. Int J Med Inform 2009 Feb;78(2):127-140. [CrossRef] [Medline]
  7. Grooten L, Borgermans L, Vrijhoef HJ. An Instrument to Measure Maturity of Integrated Care: A First Validation Study. Int J Integr Care 2018 Jan 25;18(1):10. [CrossRef] [Medline]
  8. HIMSS Electronic Medical Record Adoption Model (EMRAM). Healthcare Information and Management Systems Society.   URL: https://www.himssanalytics.org/emram [accessed 2019-09-12]
  9. Digital Maturity Assessment. NHS England.   URL: https://www.england.nhs.uk/digitaltechnology/connecteddigitalsystems/maturity-index/ [accessed 2019-08-16]
  10. Carvalho JV, Rocha A, van de Wetering R, Abreu A. A Maturity model for hospital information systems. Journal of Business Research 2019 Jan;94:388-399. [CrossRef]
  11. Hughes O. Will Smart: England to get first HIMSS 7 hospitals before year’s end. Digital Health. 2018 May 17.   URL: https://www.digitalhealth.net/2018/05/will-smart-first-himss-7-hospital-before-years-end/ [accessed 2019-09-10]
  12. Cresswell K, Sheikh A, Krasuska M, Heeney C, Franklin BD, Lane W, et al. Reconceptualising the digital maturity of health systems. The Lancet Digital Health 2019 Sep;1(5):e200-e201. [CrossRef]
  13. Global Digital Exemplar (GDE) Programme Evaluation. University of Edinburgh.   URL: https://www.ed.ac.uk/usher/digital-exemplars [accessed 2019-06-03]
  14. Boulkedid R, Abdoul H, Loustau M, Sibony O, Alberti C. Using and reporting the Delphi method for selecting healthcare quality indicators: a systematic review. PLoS One 2011;6(6):e20476 [FREE Full text] [CrossRef] [Medline]
  15. Dalkey N. The Delphi Method: An experimental study of group opinion. Futures 1969 Sep;1(5):408-426. [CrossRef]
  16. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs 2000 Oct;32(4):1008-1015. [Medline]
  17. Powell C. The Delphi technique: myths and realities. J Adv Nurs 2003 Feb;41(4):376-382. [Medline]
  18. Keeney S, Hasson F, McKenna H. Consulting the oracle: ten lessons from using the Delphi technique in nursing research. J Adv Nurs 2006 Jan;53(2):205-212. [CrossRef] [Medline]
  19. Snyder-Halpern R. Indicators of organizational readiness for clinical information technology/systems innovation: a Delphi study. Int J Med Inform 2001 Oct;63(3):179-204. [Medline]
  20. Hübner-Bloder G, Ammenwerth E. Key performance indicators to benchmark hospital information systems - a delphi study. Methods Inf Med 2009;48(6):508-518. [CrossRef] [Medline]
  21. Riedmann D, Jung M, Hackl WO, Ammenwerth E. How to improve the delivery of medication alerts within computerized physician order entry systems: an international Delphi study. J Am Med Inform Assoc 2011;18(6):760-766 [FREE Full text] [CrossRef] [Medline]
  22. Kane G, Palmer D, Phillips A, Kiron D, Buckley N. Achieving Digital Maturity: Adapting Your Company to a Changing World. MIT Sloan Management Review. 2017 Jul 13.   URL: https://sloanreview.mit.edu/projects/achieving-digital-maturity/ [accessed 2019-05-03]
  23. Ritchie J, Lewis J, McNaughton N, Ormston R, editors. Qualitative Research Practice: A Guide for Social Science Students and Researchers. London: SAGE Publications Ltd; 2013.
  24. Williams PL, Webb C. The Delphi technique: a methodological discussion. J Adv Nurs 1994 Jan;19(1):180-186. [Medline]
  25. Akins RB, Tolson H, Cole BR. Stability of response characteristics of a Delphi panel: application of bootstrap data expansion. BMC Med Res Methodol 2005 Dec 01;5:37 [FREE Full text] [CrossRef] [Medline]
  26. McKenna HP. The essential elements of a practitioners' nursing model: a survey of psychiatric nurse managers. J Adv Nurs 1994 May;19(5):870-877. [CrossRef] [Medline]
  27. Goodman CM. The Delphi technique: a critique. J Adv Nurs 1987 Nov;12(6):729-734. [CrossRef] [Medline]
  28. Mettler T, Pinto R. Evolutionary paths and influencing factors towards digital maturity: An analysis of the status quo in Swiss hospitals. Technological Forecasting and Social Change 2018 Aug;133:104-117. [CrossRef]
  29. Kharrazi H, Gonzalez CP, Lowe KB, Huerta TR, Ford EW. Forecasting the Maturation of Electronic Health Record Functions Among US Hospitals: Retrospective Analysis and Predictive Model. J Med Internet Res 2018 Aug 07;20(8):e10458 [FREE Full text] [CrossRef] [Medline]
  30. Digital Maturity Model: Achieving digital maturity to drive growth. Deloitte Development LLC; 2018.   URL: https://www2.deloitte.com/content/dam/Deloitte/global/Documents/Technology-Media-Telecommunications/deloitte-digital-maturity-model.pdf [accessed 2020-04-20]
  31. Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care 2010 Oct;19 Suppl 3:i68-i74 [FREE Full text] [CrossRef] [Medline]
  32. Adler-Milstein J, Holmgren AJ, Kralovec P, Worzala C, Searcy T, Patel V. Electronic health record adoption in US hospitals: the emergence of a digital "advanced use" divide. J Am Med Inform Assoc 2017 Nov 01;24(6):1142-1148. [CrossRef] [Medline]
  33. Flott K, Callahan R, Darzi A, Mayer E. A Patient-Centered Framework for Evaluating Digital Maturity of Health Services: A Systematic Review. J Med Internet Res 2016 Apr 14;18(4):e75 [FREE Full text] [CrossRef] [Medline]
  34. HIMSS Continuity of Care Maturity Model (CCMM). Healthcare Information and Management Systems Society.   URL: https://www.himssanalytics.org/ccmm [accessed 2019-09-13]
  35. Black AD, Car J, Pagliari C, Anandan C, Cresswell K, Bokun T, et al. The impact of eHealth on the quality and safety of health care: a systematic overview. PLoS Med 2011 Jan;8(1):e1000387 [FREE Full text] [CrossRef] [Medline]
  36. Garg AX, Adhikari NKJ, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005 Mar 9;293(10):1223-1238. [CrossRef] [Medline]
  37. Martin G, Clarke J, Liew F, Arora S, King D, Aylin P, et al. Evaluating the impact of organisational digital maturity on clinical outcomes in secondary care in England. NPJ Digit Med 2019;2:41 [FREE Full text] [CrossRef] [Medline]


CCMM: Continuity of Care Maturity Model
CPOE: Computerized Physician Order Entry
EMR: electronic medical record
EMRAM: Electronic Medical Record Adoption Model
HIMSS: Healthcare Information and Management Systems Society
HIT: health information technology
NHS: National Health Service


Edited by G Eysenbach; submitted 20.11.19; peer-reviewed by A Lange, M Alrige, P Pantoja Bustillos, W Gordon; comments to author 10.03.20; revised version received 11.05.20; accepted 14.05.20; published 18.08.20

Copyright

©Marta Krasuska, Robin Williams, Aziz Sheikh, Bryony Dean Franklin, Catherine Heeney, Wendy Lane, Hajar Mozaffar, Kathy Mason, Sally Eason, Susan Hinder, Rachel Dunscombe, Henry W W Potts, Kathrin Cresswell. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 18.08.2020.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.