Published on 04.01.2023 in Vol 25 (2023)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/40976.
The Assessment of Medical Device Software Supporting Health Care Services for Chronic Patients in a Tertiary Hospital: Overarching Study

Authors of this article:

Erik Baltaxe1; Hsin Wen Hsieh2; Josep Roca2; Isaac Cano2

Original Paper

1Institute of Pulmonary Medicine, Chaim Sheba Medical Center, Ramat Gan, Israel

2Hospital Clinic de Barcelona, Institut d’Investigacions Biomèdiques August Pi i Sunyer, Universitat de Barcelona, Barcelona, Spain

on behalf of the JADECARE Consortium

Corresponding Author:

Isaac Cano, PhD

Hospital Clinic de Barcelona

Institut d’Investigacions Biomèdiques August Pi i Sunyer

Universitat de Barcelona

Villarroel 170

Barcelona, 08036

Spain

Phone: 34 932275540

Email: iscano@recerca.clinic.cat


Background: Innovative digital health tools are increasingly being evaluated and, in some instances, integrated at scale into health systems. However, the applicability of assessment methodologies in real-life scenarios to demonstrate value generation, and consequently to foster sustainable adoption of digitally enabled health interventions, faces several bottlenecks.

Objective: We aimed to build on the process of premarket assessment of 4 digital health interventions piloted at the Hospital Clinic de Barcelona (HCB), as well as on the analysis of current medical device software regulations and postmarket surveillance in the European Union and United States in order to generate recommendations and lessons learnt for the sustainable adoption of digitally enabled health interventions.

Methods: Four digital health interventions involving prototypes were piloted at the HCB (studies 1-4). Cocreation and quality improvement methodologies were used to consolidate a pragmatic evaluation method to assess the perceived usability and satisfaction of end users (both patients and health care professionals) by means of the System Usability Scale and the Net Promoter Score, including general questions about satisfaction. Analyses of both medical software device regulations and postmarket surveillance in the European Union and United States (2017-2021) were performed. Finally, an overarching analysis on lessons learnt was conducted considering 4 domains (technical, clinical, usability, and cost), as well as differentiating among 3 different eHealth strategies (telehealth, integrated care, and digital therapeutics).

Results: Among the participant stakeholders, the System Usability Scale score was consistently higher in patients (studies 1, 2, 3, and 4: 78, 67, 56, and 76, respectively) than in health professionals (studies 2, 3, and 4: 52, 43, and 54, respectively). In general, use of the supporting digital health tools was recommended more by patients (studies 1, 2, 3, and 4: Net Promoter Scores of −3%, 31%, −21%, and 31%, respectively) than by professionals (studies 2, 3, and 4: Net Promoter Scores of −67%, 1%, and −80%, respectively). The overarching analysis resulted in pragmatic recommendations for the digital health evaluation domains and the eHealth strategies considered.

Conclusions: Lessons learnt on the digitalization of health resulted in practical recommendations that could contribute to future deployment experiences.

J Med Internet Res 2023;25:e40976

doi:10.2196/40976

Keywords



Introduction

Over the past years, substantial progress has been made toward the adoption of digital technology for health, or digital health [1]. Health programs using digital technology are increasingly being tested, evaluated, and, in some instances, integrated at scale into health information systems [2], considering the unique methodological challenges involved [3-6]. Investment in digital health requires evidence to support its value and expected benefits from the perspective of key stakeholders, that is, patients, professionals, health service providers, policy makers, and payers [7].

Assessment of digital health interventions has been conceptualized by a range of different evaluation frameworks [8,9]. A common factor in all of them is that digital health interventions should show benefit from all stakeholders’ perspectives during design and development [10], as well as after adoption. However, while such evaluation frameworks may serve as relevant guidelines, only a few of them have recently been applied extensively for the assessment of mature digital health tools [11-14]. Moreover, current medical device regulatory guidelines for digital health technologies [15-17] are, de facto, establishing their own evaluation constraints.

Common requirements are that digital health tools must be safe and must generate value to be successfully adopted in health care. Current medical device software (MDSW) regulations aim to ensure privacy and information security, as well as patient safety and clinical benefit [18,19]. It is of note that recent regulatory frames [14], aiming for fast-track assessment of digital applications, define a full set of requirements with respect to user friendliness, robustness, interoperability, and reimbursability.

The main objective of this research was to generate recommendations and report lessons learnt from separate assessments of MDSW, bringing together the premarket experience from the cocreation of digital health interventions at the Hospital Clinic de Barcelona (HCB) during the period 2017-2019 [20] and the postmarket experience of MDSW after regulatory compliance in the European Union and the United States, examined through MDSW recalls during the past 5 years (2017-2021). Given the proliferation of innovation projects involving digital health interventions, there is a clear need for concerted efforts to harmonize and learn from both premarket piloting and postmarket surveillance experiences in order to foster applicability and standardization of the assessment of digital health tools in real-life settings.

It is of note that the premarket co-design experiences at the HCB [20] benefited from the combined input of all stakeholders, including end users, with the aim of preventing failures when reaching the market. Moreover, the HCB has a dual role as a university center and as a driver of large community-based integrated care in the city of Barcelona (Área Integral de Salud de Barcelona Esquerra [AISBE]; 520,000 citizens) [21,22]. These activities fall within the Catalan Open Innovation Hub on Digitally Enabled Integrated Care Services, which is 1 of the 4 original EU Good Practices in the European Joint Action JADECARE [23].


Methods

Premarket Analysis of the Four Digital Health Prototypes Piloted at the HCB

Cocreation and quality improvement methodologies reported previously [20] were used to consolidate a pragmatic assessment protocol that was applied in the 4 digital health prototypes supporting the interventions (studies 1-4) described below.

Study 1 involved home-based noninvasive ventilation (NIV) of patients with hypercapnic respiratory failure. The study addressed enhanced management of chronic patients requiring specialized respiratory care for home-based NIV by means of a mobile app for patient self-management [24].

Study 2 involved prehabilitation of high-risk patients undergoing major abdominal surgery. The study assessed the potential of the supporting digital health tool, PREHAB [25], to enhance collaborative work among health professionals and patients using a mobile app for self-management at the community level.

Studies 3 and 4 involved community-based care of frail chronic patients. This cluster of digital health interventions included 2 studies addressing specific objectives: (1) study 3 assessed an adaptive case management platform [26] for community-based care of chronic patients, and (2) study 4 investigated the potential of a secure communication platform, prototyped during 2019, for enhanced management of frail chronic patients [27] (Table 1). The details of the digital health interventions are provided in Multimedia Appendix 1.

Table 1. Details of the 4 digital health interventions (studies 1-4) piloted at the Hospital Clinic de Barcelona.
Study | Design | Inclusion and exclusion criteria | Digital health intervention

Study 1

Design: Single-blinded single-center RCTa with 2 parallel arms (1:1 ratio): (1) control group (n=34) and (2) digital health intervention during a period of 3 months (n=33).

  • Inclusion criteria: Adult patients under home-based NIVb at the HCBc and having a mobile phone or tablet in the intervention group.

  • Exclusion criteria: Patients with severe psychiatric or neurological diseases, as well as those hospitalized at the time of assessment.

Digital health intervention: The MyPathway app (TRLd 5) [28] was used for bidirectional interaction with the research team. It provided positive feedback or reinforcement messages in response to the number of hours of NIV use reported by the patient daily. Moreover, general advice on specific NIV clinical problems was automatically provided by the app according to the patients’ weekly input.

Study 2

Design: Prospective cohort study with 16 candidates for the prehabilitation service at the HCB.

  • Inclusion criteria: Candidates for major elective surgery in at least 4 weeks, age >70 years, an ASAe score of III/IV, and access to a mobile phone or tablet with an internet connection.

  • Exclusion criteria: Physical or psychological problems affecting use and not having a carer.

Digital health intervention: A digital health tool (TRL 6) [25] was used by health care professionals to prescribe and monitor tasks for patient self-management supported by an app, including physical activity goals, nutritional advice, mindfulness exercises, and predefined data collection instruments for patient-reported outcomes and experience.

Study 3

Design: Prospective cohort study with 20 clinically stable chronic patients recruited in 1 primary care unit from AISBEf and followed up for a period of 1 month.

  • Inclusion criteria: Acceptance to participate and having an appropriate smartphone or tablet.

  • Exclusion criteria: Physical or psychological problems precluding the use of the app and not having a carer.

Digital health intervention: Patients were given access to the platform (TRL 5) through their smartphones, and a pedometer was provided to track adherence to a personalized daily physical activity prescription, with remote support from a case manager.

Study 4

Design: Cluster RCT by primary care teams from AISBE, with an intervention (n=31) to control ratio of 2:1 and follow-up for a period of 3 months.

  • Inclusion criteria: Acceptance to participate and having an appropriate smartphone or tablet.

  • Exclusion criteria: Physical or psychological problems precluding the use of the digital tool and not having a carer.

Digital health intervention: A case manager nurse used the Health Circuit (TRL 5) communication channel to trigger bilateral or group conversations, including health information exchange among specialized care, social care professionals, and community-based services, to agree on a goal-oriented and personalized health plan to manage both expected and unexpected events communicated by study participants [27].

aRCT: randomized clinical trial.

bNIV: noninvasive ventilation.

cHCB: Hospital Clinic de Barcelona.

dTRL: technology readiness level.

eASA: American Society of Anesthesiologists.

fAISBE: Área Integral de Salud de Barcelona Esquerra.

The premarket assessment of the 4 digital health interventions (Table 1) piloted during the period 2017-2019 primarily focused on assessment of technical robustness and usability. The former was assessed with a technical log book on the cloud that was updated daily with technical issues and suggestions for improvement from all study participants. The end users’ perceived usability and satisfaction were assessed by means of the System Usability Scale (SUS) [29] and Net Promoter Score (NPS) [30], alongside general questions about satisfaction.

Besides the technical and usability assessments mentioned above, compliance with other operational aspects, such as privacy and security, interoperability, transferability, and value generation of the accompanying integrated care services, was part of an overall evaluation framework [31], but these aspects were not systematically analyzed in the premarket analysis. It is worth mentioning that the digital support for the 4 digital health interventions (Table 1) was designed to operate on top of existing hospital information systems, via standard application programming interfaces, to minimize the need for ad hoc integration.

When writing this manuscript, we adhered to the Consolidated Criteria for Reporting Qualitative Research (COREQ) [32]. All information retrieved from the technical and usability assessments of the 4 digital health interventions (including the end users’ interviews) was processed according to the protocol-specific ethics statement mentioned in the Ethics Approval and Consent section.

MDSW Regulations and Postmarket Surveillance Analysis

In recent years, digital health technology has developed rapidly, either reaching the market as software-only novel therapies or being embedded into medical devices or clinical workflows as companion MDSW. MDSW is defined, under EU Regulation (EU) 2017/745-MDR [16], as software, used alone or in combination, that is intended by its manufacturer to be used as a medical device for human beings for a specific medical purpose: diagnosis, prevention, investigation, monitoring, prediction, prognosis, treatment, or alleviation of disease.

The analysis of the European MDSW regulatory frame was focused on the EU Regulation 2017/745-MDR [16] and the fast-track process generated by Germany’s Federal Institute for Drugs and Medical Devices, known as the “BfArM” guidelines for the evaluation of digital health applications (DiGA) [14]. Likewise, for the United States, we considered the FDA 21 CFR Part 820 [15]. Moreover, the EU General Data Protection Regulation (GDPR) [18] and its American counterpart, the US Health Insurance Portability and Accountability Act (HIPAA) [19], were also considered in the analysis.

Within the postmarket surveillance analysis at the EU level, we collected the numbers of devices that had been recalled, irrespective of their risk level and the product end user, over 5 years (January 1, 2017, to December 31, 2021) from the German BfArM website [33]. The results included malfunctioning software that may result in a severe adverse event, device deficiency, incident, or serious incident.

For the postmarket surveillance analysis in the United States, we collected data from the following 2 public databases: Manufacturer and User Facility Device Experience (MAUDE) database [34] and MEDSUN Reports [35]. Considered MDSW recalls included software failures in terms of security flaws, privacy risks, internal controls, technical controls, physical controls, and implementation.
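To make this kind of screening reproducible, exported event records can be filtered with a short script. The sketch below is a minimal illustration in Python; the CSV file name, the date_received and description columns, and the keyword list are assumptions for illustration only, since the actual MAUDE and MEDSUN exports use different field names and the selection criteria applied in this study are the ones described above.

```python
import csv
from datetime import date

# Illustrative keyword list approximating the software-failure categories named
# above (security flaws, privacy risks, controls, and implementation issues).
SOFTWARE_KEYWORDS = [
    "software", "security", "privacy", "internal control",
    "technical control", "physical control", "implementation",
]


def is_software_related(description: str) -> bool:
    """Return True if a free-text event description mentions a software-failure keyword."""
    text = description.lower()
    return any(keyword in text for keyword in SOFTWARE_KEYWORDS)


def screen_events(csv_path: str, start: date, end: date) -> list[dict]:
    """Keep events received within the study window whose description looks software related."""
    selected = []
    with open(csv_path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            received = date.fromisoformat(row["date_received"])  # assumed YYYY-MM-DD column
            if start <= received <= end and is_software_related(row["description"]):
                selected.append(row)
    return selected


if __name__ == "__main__":
    hits = screen_events("device_events_export.csv", date(2017, 1, 1), date(2021, 12, 31))
    print(f"{len(hits)} software-related events in the 2017-2021 window")
```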

Overarching Analysis

As a result of the experience-based cocreation process [20], 4 domains of digital health system validation (Table 2) and 3 eHealth contextual strategies (Table 3) were considered essential for the assessment of digital health tools. Therefore, they were used to guide a thematic analysis of the assessment results of the 4 digital health interventions (conducted by the author EB), as well as of the overview of the current MDSW regulations and the analysis of MDSW recalls of the past 5 years (2017-2021) in the European Union and the United States (conducted by the author HWH).

Then, the author EB discussed the findings with the participants of each digital health intervention, the final users of the supporting digital health tools, and all other coauthors. For each of the 4 digital health interventions, the evaluation results were judged by the author EB according to the contextual eHealth strategy, defined by the role of the digital tools in the health care service.

Finally, we evaluated the thematic analysis results to generate recommendations for assessment of the sustained adoption of future digital health interventions.

Table 2. The domains considered essential for the validation of digital health systems [20].
Domain and component | Description

Technical

  Robustness: Testing of performance when compared to a technical gold standard

  Privacy and security: Testing of privacy and security requirements

  Interoperability: Testing of interoperability requirements

  Transferability: Potential to adapt to other services and implementation scenarios

  Smartness: Testing of innovative features powered by artificial intelligence

Clinical

  Safety: Critical appraisal of technology impact on patient safety outcomes

  Medical benefit: Evidence of positive health care effects

Usability

  Ease of use: Whether digital health systems can be used as intended by users

  Feasibility: Whether digital health systems work as intended in each context

Cost

  Value generation: Anticipated cost impact on the clinical outcome of interest

  Affordability: Whether the costs of digital health systems can be made affordable
Table 3. The eHealth contexts considered essential for the validation of digital health systems.
eHealth strategy | Description

Telehealth: Digital support to well-established service workflows to enhance health care efficiencies. Typically, clinical evidence is not required. Not a medical device.

Integrated care: Digital support to enable innovative service workflows with a care continuum approach. Clinical evidence is required. Requires regulatory clearance or approval.

Digital therapeutics: Medical device software–driven therapeutic intervention for prevention, management, or treatment. Evidence and regulatory approval are required.

Ethics Approval and Consent

Letters of medical ethics approval of the Ethics Committee for Medical Research of the HCB and signed informed consent forms were obtained for the 4 studies (study 1, HCB/2019/0510; study 2, HCB/2016/0883; study 3, HCB/2018/0803; and study 4, HCB/2018/0805).


Results

Premarket Analysis of the Pilots

A total of 99 chronic patients and 9 health care professionals were assessed during the interaction and cocreation process of the 4 different digital health interventions piloted at the HCB (studies 1-4). Assessment results of the technical and usability performances are summarized in Table 4. See Multimedia Appendix 2 for further details.

Table 4. Usability performance and summary of the technical log book reported by patients and professionals with respect to the digital health tools supporting the 4 digital health interventions piloted at the Hospital Clinic de Barcelona.
Study | Patients’ experience (n, NPSb, SUSc score) | Professionals’ experience (n, NPSb, SUSc score) | Technical log booka

1 | 33, −3%, 78 | 1, N/Ad, N/A | Recurrent login with a username and a password that are easy to forget (patients).

2 | 16, 31%, 67 | 2, −67%, 52 | Technology bugs (health professionals) and system enforcement of a random password after reset (patients).

3 | 19, −21%, 56 | 1, 1%, 43 | Problems connecting the pedometer via Bluetooth with some Android smartphones (patients).

4 | 31, 31%, 76 | 5, −80%, 54 | Lack of robustness of the multimedia communication channel with some Android smartphones (health professionals).

aMain reported issues from patients or health professionals.

bThe Net Promoter Score (NPS) is a known questionnaire used to assess satisfaction with a product, which includes a key question: “How likely is it that you would recommend our system to a family member or friend?” Patients can give an answer ranging from 0 (“not at all likely”) to 10 (“extremely likely”). Individuals scoring 9 or 10 are called “promoters,” individuals scoring 7 or 8 are called “passives” (or neutrals), and individuals scoring 0 to 6 are called “detractors.” The NPS is computed as percent promoters − percent detractors, and ranges from −100% to 100%.

cThe System Usability Scale (SUS) was developed by John Brooke in 1986 and consists of a 10-item questionnaire scored on a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree). Each item contributes 0 to 4 points (the rating minus 1 for positively worded items and 5 minus the rating for negatively worded items); the sum of the contributions is multiplied by 2.5, giving an overall score from 0 to 100. A system or product that receives a score of 68 or above is considered to have good usability. A minimal scoring sketch for both questionnaires is shown after these footnotes.

dN/A: not applicable.
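For reference, the scoring rules summarized in the two footnotes above can be expressed as a minimal Python sketch; this reflects the standard SUS and NPS computations and is not code used in the study.

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring for one respondent.

    `responses` holds the 10 item ratings in order (1-5). Odd-numbered items are
    positively worded (contribution = rating - 1); even-numbered items are
    negatively worded (contribution = 5 - rating). The summed contributions are
    scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item ratings")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # indices 0, 2, ... are items 1, 3, ...
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5


def net_promoter_score(ratings: list[int]) -> float:
    """NPS from 0-10 recommendation ratings: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)


# Toy examples: one fairly satisfied SUS respondent and a small NPS sample.
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
print(net_promoter_score([9, 10, 7, 6, 8, 3, 9]))  # ~14.3
```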

Study 1: Home-Based NIV of Patients With Hypercapnic Respiratory Failure

Most (20/27, 74%) of the incidents reported by end users had to do with the need to log in with a username and a password that were easy to forget, which hindered ease of use. However, on a Likert scale from 1 (very bad) to 10 (very good), the general impression of patients was scored 7.5, user friendliness was scored 8.2, and the ability to use the app without assistance was scored 8.5. This was in line with the mean patient usability score (78 out of 100).

Study 2: Prehabilitation of High-Risk Patients Undergoing Major Abdominal Surgery

Fifty percent (5/10) of the incidents were due to comfortability and accessibility issues (the system forced the use of a random password when the password was reset) and 30% (3/10) were due to technology robustness. The remaining 20% (2/10) of the incidents were due to various factors. As in study 1, the general impression and user friendliness were scored 8 (out of 10) and the ability to use the app without assistance was scored 7.5 (out of 10). In contrast, the patient usability score had a mean value of 67, which is considered an average usability grading. With respect to the experience of professionals, a neutral experience using the web backend was reported, with an overall satisfaction score of 5 (out of 10) and a mean SUS score of 52.

Study 3: Community-Based Care of Frail Chronic Patients With the CONNECARE Platform

Most (4/7, 57%) observations during the pilot were due to lack of robustness of the Bluetooth connection with the pedometer. Reported observations regarding motivation, reliability, comfortability, and accessibility reached 14% (1/7) each. In general, patients had a slightly positive experience (6/10) using the system, but its usability was graded low (SUS score of 56). In the case of professionals, perceived usability was graded lower (SUS score of 43); accordingly, the professionals involved would not recommend the CONNECARE system.

Study 4: Community-Based Care of Frail Chronic Patients With the Health Circuit Prototype

High proportions of observations were due to usability (11/18, 61%) and comfortability and accessibility (6/18, 33%) issues, mostly due to lack of robustness of the multimedia communication channel. In general, most patients had a positive experience using the system, which was reinforced by the fact that the median overall satisfaction score was 7.8 (out of 10) and the mean patient usability score was 76. The NPS reported by professionals (n=5) was negative (−80%), but since the median overall satisfaction score was 5 and the perceived usability score was 54, we could consider that professionals had a neutral experience using the prototype.

Comparability Analysis of EU and US MDSW Regulations

Data Protection

Both the EU and US regulatory frames (GDPR [18] and HIPAA [19], respectively) provide clear guidelines for manufacturers, health professionals, patients, and users in general to assess how medical devices protect private information and security. The GDPR applies to all personal data from an individual who is in an EU country at the time the data are collected, while HIPAA has a much narrower scope and only applies to protected health information. However, both regulations are established with the public interest and the security of sensitive information in mind [36]. In addition, risk management and cybersecurity are essential considerations at the design, development, clinical investigation, and postmarket surveillance stages. Since the primary focus is on data security, privacy, and integrity, the measures necessary to comply with the two regulations are broadly similar. Thus, MDSW that are already GDPR or HIPAA compliant will have in place most of the security measures required to protect data privacy.

Medical Devices

To ensure the safety and efficiency of medical devices while supporting innovation, the European Union and the United States have each established their own transparent route to the internal market and a procedure for verification and validation (Medical Device Regulation [MDR] [16] and Food and Drug Administration [FDA] regulation [15], respectively). Medical device regulatory compliance under both the FDA and the MDR is a complex path involving processes that need constant monitoring and maintenance. For example, the recent requirements of the MDR are much closer to those of the FDA in terms of (1) prerequisites for the conformity assessment; (2) a quality management system in place that is compliant with ISO 13485; and (3) use of consensus standards relevant to the development and design of interoperable medical devices. With that said, there are some key differences. The FDA’s classification system is based upon 3 risk classes, while the EU MDR has 4 device categories and 5 risk-based classifications.

The risk classification assigned will determine the depth and amount of clinical data required under the MDR to get approval for the medical device, whereas under the FDA, Class I and some Class II devices do not require clinical testing, and only proof is required that the medical device is substantially equivalent to a legally marketed product.

Toward Digital Therapeutics

Perhaps the biggest challenge facing EU and US MDSW regulations is reimbursement. With evidence supporting the efficacy of digital therapeutics stacking up, more payers are coming round to the idea of MDSW reimbursement and the business case for offering it [37]. With it becoming ever clearer that MDSW, in general, and digital therapeutics, in particular, can play significant roles in the treatment of many conditions around the world, both the European Union and FDA are creating regulatory frameworks for the safety and efficacy of digital therapeutics. Germany launched a fast-track process for digital health applications (DiGA) [14], which is the first in the world for digital therapeutics reimbursement. DiGA is a pathway for doctors to prescribe digital therapeutics to publicly insured patients and receive reimbursement in much the same way as traditional treatment. This catapulted Germany to global leadership in digital therapeutics regulation. No other country has yet made prescription digital therapeutics so widely available to such a high percentage of the population. Beyond EU MDR standards, DiGA defined further requirements, such as interoperability, robustness, and ease of use, among others.

Postmarket Surveillance Analysis

The review of the German regulatory framework (BfArM) retrieved a total of 556 postmarket events that fitted the research requirements focused on digital health tools (representing 13% of all events that included drugs, assays, and medical devices). Likewise, the review of the MAUDE and MEDSUN databases (United States) found a total of 114 software-related issues, representing 18% of all queried events. See Multimedia Appendix 3 for details.

The vast majority of reported issues were related to software problems, incorrect results, data mismatches, error codes, unexpected system shutdowns, incorrect procedures, and cybersecurity-related aspects, which folded into the robustness (620/665, 93.2%) and privacy and security (26/665, 3.9%) components of the technical domain mentioned in Table 2. Therefore, software technical defects were reported to have the highest potential risk of harming end users.

No issues were reported with respect to transferability, smartness, safety, or medical benefit, whereas very few issues were reported in relation to interoperability (2/665, 0.3%), ease of use (8/665, 1.2%), and feasibility (9/665, 1.3%).
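As an illustration of how individual reports can be folded into the Table 2 components and expressed as proportions, the following Python sketch tallies a list of precategorized issues; the issue labels and the mapping are illustrative assumptions, not the coding scheme used in this analysis.

```python
from collections import Counter

# Hypothetical mapping from reported issue types to Table 2 components.
ISSUE_TO_COMPONENT = {
    "incorrect result": "robustness",
    "unexpected shutdown": "robustness",
    "error code": "robustness",
    "data mismatch": "robustness",
    "cybersecurity flaw": "privacy and security",
    "data exchange failure": "interoperability",
    "confusing interface": "ease of use",
    "workflow mismatch": "feasibility",
}


def component_shares(issues: list[str]) -> dict[str, float]:
    """Return the percentage of issues attributed to each Table 2 component."""
    counts = Counter(ISSUE_TO_COMPONENT.get(issue, "uncategorized") for issue in issues)
    total = sum(counts.values())
    return {component: round(100 * n / total, 1) for component, n in counts.items()}


# Toy example with made-up counts (not the 665 events analyzed above).
sample = ["incorrect result"] * 6 + ["cybersecurity flaw"] + ["confusing interface"]
print(component_shares(sample))
# {'robustness': 75.0, 'privacy and security': 12.5, 'ease of use': 12.5}
```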

Overarching Analysis

The overarching analysis of the premarket assessment of the 4 digital health interventions piloted at the HCB, as well as of the analysis of current MDSW regulations and postmarket surveillance in the European Union and the United States, provided a source of experience-based knowledge that is described below and summarized in Tables 5 and 6 in terms of recommendations for each of the 4 digital health evaluation domains (Table 2) and lessons learnt toward sustained adoption in the 3 eHealth contexts considered (Table 3).

Table 5. Recommendations for the assessment of medical device software for sustained adoption of future digital health interventions.
Domain | Recommendations
Technical
  • A high technology readiness level is key for sustained adoption of MDSWa.
  • Data privacy, security, and interoperability need to be addressed for regulatory compliance.
  • MDSW should evolve to support collaborative work.
Clinical
  • A dedicated change management team is required for integrated care eHealth strategies.
  • A unified evaluation protocol facilitates comparability among digital health interventions.
  • Key performance indicators need to be adopted for continuous assessment.
Usability
  • Cocreation facilitates design while minimizing the need for user training and enhances adoption.
  • Cognitive behavioral therapy techniques enhance user adherence to digital health applications.
Cost
  • Evidence on health care value generation of digital health interventions precedes cost containment.
  • Bundle payment approaches based on service performance are advised.

aMDSW: medical device software.

Table 6. Lessons learnt for the assessment of medical device software for sustained adoption of future digital health interventions.
eHealth strategy | Lessons learnt

Telehealth: Digital health tools that engage consumers for lifestyle, wellness, and health-related purposes, which typically do not require regulatory oversight, do not ensure value generation.

Integrated care: Evidence-based MDSWa that allow all stakeholders in the care continuum to collaborate and to access, share, aggregate, and visualize meaningful data daily are expected to contribute the most to health care efficiency generation.

Digital therapeutics: Digital therapeutics that win public reimbursement must have solid proof of their efficacy/effectiveness. A market strategy or an MDSW regulation that helps build that proof is therefore essential. Real usage data for digital therapeutics and associated evaluations should determine national health coverage.

aMDSW: medical device software.

Technical Domain

Optimization of health care value generation and sustainability of the digitally enabled integrated care services explored in the 4 pilot studies were limited by the lack of technical robustness of the prototypes tested during the period 2017-2019, with technology readiness levels [38] within the interval 5-6.

It is of note that data privacy, information security, and data standardization are essential for enabling interoperability with health information systems from different providers or health information exchange platforms across providers within a geographical area. However, many privacy and security features are known to reduce user satisfaction.

In terms of interoperability and transferability, MDSW should evolve to support collaborative work among stakeholders across community and hospital services (ie, vertical and horizontal integration), using shared care plans that incorporate patient goals, which will foster the digital transformation of health care within a care continuum scenario.

Clinical Domain

Overall, digital support should be embedded into properly defined health care service workflows, particularly relevant for integrated care and to some extent digital therapeutic eHealth contexts. Moreover, implementation of adaptive case management for the management of care pathways is highly advisable to face the challenge of unexpected events within well-defined care paths. Telehealth tools not embedded into properly defined health care service workflows, focused on engaging consumers for lifestyle, wellness, or any other health-related purposes, which typically do not require regulatory oversight, do not ensure value generation.

The implementation of digitally enabled integrated care is disruptive and requires transformational change at all levels of an organization. This requires careful and solid strategic planning considering all the obstacles that may be encountered, as well as developing incentives and ongoing change management with a dedicated change management team. Pragmatic application of the same evaluation protocol is highly recommended to facilitate comparability among deployment experiences and to identify key performance indicators for long-term follow-up quality assessment of the service, beyond the initial deployment. In this regard, the use of profiled dashboards could be an efficient strategy for the assessment of cost-effectiveness in real-life settings, especially for MDSW using eHealth strategies that require evidence of efficacy and effectiveness (ie, integrated care and digital therapeutics).

Usability Domain

Flexible adoption of patient-centered cocreation methodologies during the premarket studies was useful to identify factors that generate bottlenecks, facilitating the design and adoption of timely action plans. In general, cocreation efforts represented a success factor in terms of perceived usability by patients, but they generated high expectations among health care professionals that were not met due to the lack of technical robustness of the prototypes tested, which was a negative factor in terms of perceived usability. Efforts must be devoted to the development of digital health applications that support patient empowerment for self-management with cognitive behavioral therapy techniques to foster the long-term effectiveness of digitally enabled interventions.

Cost Domain

Digital transformation of health care must be based on cost containment. Operational costs of innovative, digitally supported, integrated care services are expected to decrease, so transitional costs should be covered by savings generated through decreases in operational costs. To this end, bundle payment approaches based on service performance are advised. Evidence-based MDSW embedded into properly defined health care service workflows with an integrated care approach are expected to contribute the most to health care efficiency generation. MDSW delivering a therapeutic intervention must have solid proof of efficacy in controlled clinical trials, but real usage data should be used to monitor cost-effectiveness in a real-world setting and ultimately determine national health coverage.

eHealth Strategies

Digital health tools have become integral to the prevention, diagnosis, treatment, and management of health and diseases. Clinicians use digital health tools to gain insights into patient outcomes, conduct telehealth visits, treat aspects of diseases otherwise unaddressed by traditional medications, and, ultimately, ensure health care efficiency generation. It is crucial to describe the landscape of available digital health tools in addition to the level of clinical evidence and regulatory oversight that correlates with each eHealth category in this quickly evolving industry. End users, clinicians, and payers should understand the difference between the purpose and function of various MDSW, since this differentiation determines the risk level assumed for clinical evidence generation alongside the technical requirements for regulatory oversight.


Discussion

Summary of the Results

The study aimed to update health professionals on the current landscape of MDSW for enhanced management of chronic patients. The research generated recommendations on target evaluation domains and eHealth categories through an overarching analysis of 3 sources of information: (1) premarket evaluation of 4 pilots carried out at the HCB, using ongoing technological developments; (2) assessment of the regulatory frames of MDSW in the United States and Europe; and (3) postmarket surveillance reporting from the same 2 areas of the world.

Evaluation results of the 4 digital health interventions piloted at the HCB showed that patients tended to score higher than professionals in terms of their experience with the supporting digital health tools, and in general, they would recommend the use of these tools. This can be partly explained by the fact that the technology readiness level of the assessed digital health interventions was rather low at the precommercial stage, which most likely had a negative impact on the perceived usability by health care professionals, who had to deal with technical issues at the same time as with the inherent complexities of case management. Moreover, the lack of integration with existing hospital information systems influenced the poor results with respect to the experience of health professionals. The 4 premarket digital health interventions were not considered mature enough for integration with existing health information systems, and in general, hospital information technology departments tend to reject integration of noncommercial digital health tools, which hinders usability, especially among health professionals who are not strongly motivated.

As mentioned above, EU and US MDSW regulations include premarket MDSW assessment with respect to clinical data, product information, performance testing, labeling, benefit-risk assessments, residual risks, etc. However, MDSW manufacturers should also plan, establish, document, implement, maintain, and update a postmarket surveillance system in a manner that is proportionate to the risk class and appropriate for the type of device. This system should be an integral part of the manufacturer’s quality management system and should be notified to the corresponding regulatory body.

Overall, the postmarket surveillance review of both German and US regulatory frameworks confirmed the crucial role of software verification and validation, the voluntary testing of cybersecurity, and the need for testing user interfaces.

Strengths and Weaknesses

The inherent heterogeneity of the 4 digital health interventions considered in this study represents both a strength, because it reinforces transferability of the assessment approach, and a limitation, because it precludes comparability of the results among the 4 study protocols. However, the cocreation process and the application of the same structured evaluation protocol over the 4 digital health interventions contributed to the evolution of the mindset of health professionals toward the use of digital health tools. Specifically, the participation of all stakeholders in the overarching analysis concluded with the generation of a set of general recommendations for adoption in routine clinical practice.

The evaluation protocol focused on technical and usability performance because the primary objective of the 4 digital health interventions piloted at the HCB was to demonstrate the feasibility of the approach. If large-scale deployment is the primary aim, other functional, technical, and ethical aspects of the supporting digital health tools will need to be assessed.

The 4 digital health interventions were piloted within the context of research and innovation projects. This represents a clear advantage for stimulating cocreation of the supporting digital health tools, but establishes the threshold of required technological maturity at the prototype level and limits the transferability of the results beyond the boundaries of pilot settings.

The MDSW regulations considered in the European Union and the United States do not account for the tools and general requirements of other countries or regions of the world. Although similar premarket approval applications and postmarket surveillance tools have been put in place in recent years [11-14], they do not apply to the full range of technology readiness levels; that is, they are intended to be used to evaluate mature technology. In this respect, the development and production phases of digital health tools (the focus of the 4 digital health interventions assessed in this study) may help to generate more mature and robust digital health tools that are ready to be assessed by the corresponding health technology assessment frameworks [11-14].

Toward the Adoption of MDSW

Technical performance and usability are arguably among the most important considerations with patient-oriented mobile and digital-based solutions [39,40].

The overarching analysis showed a clear link between premarket assessment and postmarket surveillance in terms of technical failures hindering stakeholder adoption, regardless of usability and acceptability success. Such technical failures can be overcome by generating not only robust and secure products, but also online open access databases (the likes of clinical registries for single diseases) where basic MDSW approval information, medical specialty, and algorithm details can be publicly shared, thus enhancing transparency and collaborative work. Two recent studies [41,42] explored this concept when applied to artificial intelligence MDSW.

Moreover, digital health apps must be easy to use for their intended purpose, require minimal effort to complete tasks, have a minimal data entry burden, and allow the user to control preferences when appropriate (eg, notifications). Because systems may be used by people with disabilities (eg, impaired vision, motor deficits, and cognitive dysfunction), design considerations must ensure that accessibility compliance reflects the target user audience and different potential users, including family members and caretakers. Finally, to maximize acceptability, digital health solutions require input from clinicians.

Developing digital health tools implies not only ensuring technological robustness and usability, but also guaranteeing data privacy. It requires thinking about how the newly collected data will need to be shared with health care professionals and whether the intended use of the technology makes sense ethically. During the long process of creating and validating a digital health application (starting with an idea, followed by its implementation and dissemination in different application markets), many stages must be completed, and each one has its own particularities and methodologies.

Accordingly, a redefinition of the digital health ambit is needed. While some evaluation models addressing the maturity of digital health interventions have been proposed [43], they are either very “technology specific” or “hospital oriented.” Moreover, as acknowledged previously [43], none of the identified models can be used as an overarching tool to encompass the wide range of digital tools used in a complex context such as integrated care, and they lack a holistic approach to identify influencing factors.

The maturity grading criteria for digital health explored in this study cover a wide scope of tools and policies that correspond to the abovementioned new model of the comprehensive understanding of digital medicine. Recently, a review was conducted on the use of digital technologies in health care [44], and not surprisingly, it proposed an approach similar to the one proposed in the 4 domains and 3 eHealth contexts (Tables 2 and 3) to assess the different elements of MDSW adoption.

Conclusions

Usability performance was consistently perceived as higher by patients than by health care professionals. This can be partly explained by the fact that the technology readiness level of the supporting digital health tools was within the interval 5-6, and health care professionals had to deal with technical issues at the same time as with the inherent complexities of case management.

However, the active participation of health care professionals in the co-design and application of the evaluation protocol contributed to the evolution of the mindset of the health professionals toward the use of digital health tools in routine clinical practice.

The overarching analysis resulted in lessons learnt and recommendations that could contribute to the large-scale adoption of digital health tools.

Acknowledgments

We gratefully acknowledge the contribution of all participants of the pilot studies of the 4 digital health interventions and the support of CONNECARE (H2020-PHC-2015, grant number 689802), Smart PITES (FIS- PI18/00841), NEXTCARE (COMRDI15-1-0016), PAPRIKA (EIT HEALTH 2020-19365), Generalitat de Catalunya (2014SGR661), and CERCA Programme/Generalitat de Catalunya.

Funding

The JADECARE project is a research and innovation project funded by the European Union’s Horizon 2020 Research and Innovation Program under grant agreement number 951442. The information reflects only the authors’ views, and the European Commission is not responsible for any use that may be made of the information.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Details of the digital health interventions.

DOCX File , 1461 KB

Multimedia Appendix 2

Details of the premarket analysis of the 4 pilot studies.

DOCX File , 17 KB

Multimedia Appendix 3

Details of the postmarket surveillance analysis.

DOCX File , 152 KB

  1. WHO Global Observatory for eHealth. mHealth: new horizons for health through mobile technologies: second global survey on eHealth. World Health Organization. 2011.   URL: https://apps.who.int/iris/handle/10665/44607 [accessed 2022-12-05]
  2. Baltaxe E, Czypionka T, Kraus M, Reiss M, Askildsen JE, Grenkovic R, et al. Digital health transformation of integrated care in Europe: Overarching analysis of 17 integrated care programs. J Med Internet Res 2019 Sep 26;21(9):e14956 [FREE Full text] [CrossRef] [Medline]
  3. World Health Organization. Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. World Health Organization. 2016.   URL: https://apps.who.int/iris/handle/10665/252183 [accessed 2022-12-05]
  4. Murray E, Hekler E, Andersson G, Collins L, Doherty A, Hollis C, et al. Evaluating digital health interventions: Key questions and approaches. Am J Prev Med 2016 Nov;51(5):843-851 [FREE Full text] [CrossRef] [Medline]
  5. Kostkova P. Grand challenges in digital health. Front Public Health 2015 May 05;3:134 [FREE Full text] [CrossRef] [Medline]
  6. Vayena E, Haeusermann T, Adjekum A, Blasimme A. Digital health: meeting the ethical and policy challenges. Swiss Med Wkly 2018 Jan 16;148(34):w14571 [FREE Full text] [CrossRef] [Medline]
  7. Kocher B. Cautious attitudes toward emerging digital technologies. NEJM Catalyst 2021 Apr 21;2(5):CAT.21.0141 [FREE Full text] [CrossRef]
  8. Wade V, Gray L, Carati C. Theoretical frameworks in telemedicine research. J Telemed Telecare 2016 Jul 09;23(1):181-187. [CrossRef]
  9. Ricciardi W. Assessing the impact of digital transformation of health services: Opinion by the Expert Panel on Effective Ways of Investing in Health (EXPH). Eur J Public Health 2019;29(Supplement_4):ckz185.769 [FREE Full text] [CrossRef]
  10. Raynor DK, Ismail H, Blenkinsopp A, Fylan B, Armitage G, Silcock J. Experience-based co-design-Adapting the method for a researcher-initiated study in a multi-site setting. Health Expect 2020 Jun 11;23(3):562-570 [FREE Full text] [CrossRef] [Medline]
  11. Kidholm K, Clemensen J, Caffery LJ, Smith AC. The Model for Assessment of Telemedicine (MAST): A scoping review of empirical studies. J Telemed Telecare 2017 Jul 31;23(9):803-813. [CrossRef]
  12. Unsworth H, Dillon B, Collinson L, Powell H, Salmon M, Oladapo T, et al. The NICE Evidence Standards Framework for digital health and care technologies - Developing and maintaining an innovative evidence framework with global impact. Digit Health 2021 Jun 24;7:20552076211018617 [FREE Full text] [CrossRef] [Medline]
  13. Haverinen J, Keränen N, Falkenbach P, Maijala A, Kolehmainen T, Reponen J. Digi-HTA: Health technology assessment framework for digital healthcare services. Finnish Journal of EHealth and EWelfare 2019;11(4):326-341. [CrossRef]
  14. Ludewig G, Klose C, Hunze L, Matenaar S. [Digital health applications: statutory introduction of patient-centred digital innovations into healthcare]. Bundesgesundheitsblatt Gesundheitsforschung Gesundheitsschutz 2021 Oct 16;64(10):1198-1206 [FREE Full text] [CrossRef] [Medline]
  15. Darrow JJ, Avorn J, Kesselheim AS. FDA regulation and approval of medical devices: 1976-2020. JAMA 2021 Aug 03;326(5):420-432. [CrossRef] [Medline]
  16. Holborow R. A Notified Body’s perspective on the clinical evaluation requirements under Regulation (EU) 2017/745 on medical devices. Journal of Medical Device Regulation 2021;18(1):33-47 [FREE Full text]
  17. Mathews D, Balatbat C, Dzau V. Governance of emerging technologies in health and medicine — Creating a new framework. N Engl J Med 2022 Jun 09;386(23):2239-2242 [FREE Full text] [CrossRef]
  18. REGULATION EU 2017/745 OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL - of 5 April 2017 - on medical devices, amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and repealing Council Directives 90/385/EEC and 93/42/EEC. EUR-Lex. 2017.   URL: https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32017R0745 [accessed 2022-12-05]
  19. Digital Health Technologies for Remote Data Acquisition in Clinical Investigations. FDA.   URL: https://www.fda.gov/regulatory-information/search-fda-guidance-documents/digital-health-technologies-remote-data-acquisition-clinical-investigations [accessed 2022-12-05]
  20. Baltaxe E, Cano I, Risco R, Sebio R, Dana F, Laxe S, et al. Role of co-creation for large-scale sustainable adoption of digitally supported integrated care: Prehabilitation as use case. Int J Integr Care 2022;22(4):1 [FREE Full text] [CrossRef] [Medline]
  21. Hernández C, Alonso A, Garcia-Aymerich J, Grimsmo A, Vontetsianos T, García Cuyàs F, et al. Integrated care services: lessons learned from the deployment of the NEXES project. Int J Integr Care 2015;15:e006 [FREE Full text] [CrossRef] [Medline]
  22. Font D, Escarrabill J, Gómez M, Ruiz R, Enfedaque B, Altimiras X. Integrated Health Care Barcelona Esquerra (Ais-Be): A global view of organisational development, re-engineering of processes and improvement of the information systems. The role of the tertiary university hospital in the transformation. Int J Integr Care 2016 May 23;16(2):8 [FREE Full text] [CrossRef] [Medline]
  23. JADECARE.   URL: https://www.jadecare.eu/ [accessed 2022-12-05]
  24. Baltaxe E, Embid C, Aumatell E, Martínez M, Barberan-Garcia A, Kelly J, et al. Integrated care intervention supported by a mobile health tool for patients using noninvasive ventilation at home: Randomized controlled trial. JMIR Mhealth Uhealth 2020 Apr 13;8(4):e16395 [FREE Full text] [CrossRef] [Medline]
  25. Patient empowerment for major surgery preparation at home. EIT Health.   URL: https://eithealth.eu/product-service/paprika/ [accessed 2022-12-05]
  26. Vargiu E, Fernández J, Gonzales-Gonzales M, Morales-Garzón J, Prunera-Moreda K, Miralles F. A self-management system for complex chronic patients. Int J Integr Care 2019 Aug 08;19(4):101 [FREE Full text] [CrossRef]
  27. Health Circuit.   URL: https://www.healthcircuit.es/ [accessed 2022-12-05]
  28. ADI Health.   URL: https://adi-health.co.uk/ [accessed 2022-12-05]
  29. Gao M, Kortum P, Oswald F. Multi-language toolkit for the System Usability Scale. International Journal of Human–Computer Interaction 2020 Aug 19;36(20):1883-1901 [FREE Full text] [CrossRef]
  30. Spiess J, T'Joens Y, Dragnea R, Spencer P, Philippart L. Using big data to improve customer experience and business performance. Bell Labs Tech. J 2014 Mar;18(4):3-17 [FREE Full text] [CrossRef]
  31. Baltaxe E, Cano I, Herranz C, Barberan-Garcia A, Hernandez C, Alonso A, et al. Evaluation of integrated care services in Catalonia: population-based and service-based real-life deployment protocols. BMC Health Serv Res 2019 Jun 11;19(1):370 [FREE Full text] [CrossRef] [Medline]
  32. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007 Dec 16;19(6):349-357. [CrossRef] [Medline]
  33. Medical Devices. Federal Institute for Drugs and Medical Devices.   URL: https://www.bfarm.de/EN/Medical-devices/_node.html [accessed 2022-12-05]
  34. MAUDE - Manufacturer and User Facility Device Experience. FDA.   URL: https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfMAUDE/search.CFM [accessed 2022-12-05]
  35. Medsun Reports. FDA.   URL: https://www.accessdata.fda.gov/scripts/cdrh/cfdocs/Medsun/searchreport.cfm [accessed 2022-12-05]
  36. Tovino S. The HIPAA Privacy Rule and the EU GDPR: Illustrative Comparisons. Scholarly Works. 2017.   URL: https://scholars.law.unlv.edu/facpub/1066/ [accessed 2022-12-05]
  37. Valerio NA. Application of blended care as a mechanism of action in the construction of digital therapeutics. Einstein (Sao Paulo) 2020;18:eMD5640. [CrossRef]
  38. EARTO. 2014.   URL: https://www.earto.eu/wp-content/uploads/The_TRL_Scale_as_a_R_I_Policy_Tool_-_EARTO_Recommendations_-_Final.pdf [accessed 2022-12-05]
  39. Yen PY. Health information technology usability evaluation: methods, models, and measures. Proquest. 2010.   URL: https://www.proquest.com/openview/0df85be25e4479c55d4b72b070dc034c/1?pq-origsite=gscholar&cbl=18750&diss=y [accessed 2022-12-05]
  40. Kaufman D, Roberts W, Merrill J, Lai T, Bakken S. Applying an evaluation framework for health information system design, development, and implementation. Nurs Res 2006;55(2 Suppl):S37-S42. [CrossRef] [Medline]
  41. Benjamens S, Dhunnoo P, Meskó B. The state of artificial intelligence-based FDA-approved medical devices and algorithms: an online database. NPJ Digit Med 2020;3:118 [FREE Full text] [CrossRef] [Medline]
  42. Muehlematter U, Daniore P, Vokinger K. Approval of artificial intelligence and machine learning-based medical devices in the USA and Europe (2015-20): a comparative analysis. Lancet Digit Health 2021 Mar;3(3):e195-e203 [FREE Full text] [CrossRef] [Medline]
  43. Carvalho J, Rocha, Abreu A. Maturity models of healthcare information systems and technologies: a literature review. J Med Syst 2016 Jun;40(6):131. [CrossRef] [Medline]
  44. Senbekov M, Saliev T, Bukeyeva Z, Almabayeva A, Zhanaliyeva M, Aitenova N, et al. The recent progress and applications of digital technologies in healthcare: A review. Int J Telemed Appl 2020;2020:8830200 [FREE Full text] [CrossRef] [Medline]


BfArM: Federal Institute for Drugs and Medical Devices
FDA: Food and Drug Administration
GDPR: General Data Protection Regulation
HCB: Hospital Clinic de Barcelona
HIPAA: Health Insurance Portability and Accountability Act
MAUDE: Manufacturer and User Facility Device Experience
MDR: Medical Device Regulation
MDSW: medical device software
NIV: noninvasive ventilation
NPS: Net Promoter Score
SUS: System Usability Scale


Edited by T Leung; submitted 12.07.22; peer-reviewed by J de Batlle, J Haverinen, H Elliott; comments to author 21.09.22; revised version received 10.11.22; accepted 25.11.22; published 04.01.23

Copyright

©Erik Baltaxe, Hsin Wen Hsieh, Josep Roca, Isaac Cano. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 04.01.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.