Published in Vol 26 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/48694.
Methodological Frameworks and Dimensions to Be Considered in Digital Health Technology Assessment: Scoping Review and Thematic Analysis


Review

Corresponding Author:

Joan Segur-Ferrer, BSS, PT, MSc

Agency for Health Quality and Assessment of Catalonia

Roc Boronat Street, 81-95, 2nd Fl

Barcelona, 08005

Spain

Phone: 34 935 513 900

Fax: 34 935 517 510

Email: joan.segur@gencat.cat


Background: Digital health technologies (dHTs) offer a unique opportunity to address some of the major challenges facing health care systems worldwide. However, the implementation of dHTs raises some concerns, such as the limited understanding of their real impact on health systems and people’s well-being or the potential risks derived from their use. In this context, health technology assessment (HTA) is 1 of the main tools that health systems can use to appraise evidence and determine the value of a given dHT. Nevertheless, due to the nature of dHTs, experts highlight the need to reconsider the frameworks used in traditional HTA.

Objective: This scoping review (ScR) aimed to identify the methodological frameworks used worldwide for digital health technology assessment (dHTA); determine what domains are being considered; and generate, through a thematic analysis, a proposal for a methodological framework based on the most frequently described domains in the literature.

Methods: The ScR was performed in accordance with the PRISMA-ScR guidelines. We searched 7 databases for peer-reviewed and gray literature published between January 2011 and December 2021. The retrieved studies were screened using Rayyan in a single-blind manner by 2 independent authors, and data were extracted using ATLAS.ti software. The same software was used for the thematic analysis.

Results: The systematic search retrieved 3061 studies (n=2238, 73.1%, unique), of which 26 (0.8%) studies were included. From these, we identified 102 methodological frameworks designed for dHTA. These frameworks showed great heterogeneity in their structures, approaches, and the items to be considered in dHTA, and different wording was used to refer to similar concepts. Through thematic analysis, we reduced this heterogeneity. In the first phase of the analysis, 176 provisional codes related to different assessment items emerged. In the second phase, these codes were clustered into 86 descriptive themes, which, in turn, were grouped in the third phase into 61 analytical themes organized in a vertical hierarchy of 3 levels: level 1 formed by 13 domains, level 2 formed by 38 dimensions, and level 3 formed by 11 subdimensions. From these 61 analytical themes, we developed a proposal for a methodological framework for dHTA.

Conclusions: There is a need to adapt the existing frameworks used for dHTA or create new ones to more comprehensively assess different kinds of dHTs. Through this ScR, we identified 26 studies including 102 methodological frameworks and tools for dHTA. The thematic analysis of those 26 studies led to the definition of 13 domains, 38 dimensions, and 11 subdimensions that should be considered in dHTA.

J Med Internet Res 2024;26:e48694

doi:10.2196/48694


Background

Digital health technologies (dHTs) are driving the transformation of health care systems. They are changing the way in which health services are delivered and showing great potential to address some of the major challenges that European health systems, including the Spanish National Health System (SNS), are facing, such as the progressive aging of the population [1,2]; the growing demand for health and long-term care services [2]; the rise in health care costs, which increases financial pressure on health and welfare systems [1,3]; and the unequal distribution of health services across geographical regions [4,5].
In addition, dHTs can improve the accessibility, sustainability, efficiency, and quality of health care systems [6,7], to the point of becoming a determinant of health in their own right [6,8].

However, the digital transformation of health care systems and the implementation of dHTs (eg, artificial intelligence [AI]–based solutions, data-driven health care services, or the internet of things) are slow and unequal across European regions [9,10]. Some of the reasons for this are (1) the immaturity of regulatory frameworks for the use of dHTs [9], (2) the lack of funding and investment for the implementation of dHTs [9], (3) the lack of sufficient and appropriate infrastructures and common standards for data management [6,9], (4) the absence of skills and expertise among professionals and users [10], and (5) the scarcity of strong evidence regarding the real benefits and effects of dHTs on health systems and people's well-being, as well as the cost-effectiveness of these technologies. This makes decision-making difficult, potentially leading to the development and reproduction of low-value and short-lived dHTs [6,11].

To overcome these challenges, harness the potential of dHTs, and avoid unintended consequences, the World Health Organization (WHO) [4,11] states that dHTs should be developed under the principles of transparency, accessibility, scalability, privacy, security, and confidentiality. Their implementation should be led by robust strategies that bring together leadership and financial, organizational, human, and technological resources, and decisions should be guided by the best available evidence [4,11].

Regarding this last aspect, health technology assessment (HTA), defined as a “multidisciplinary process that uses explicit methods to determine the value of a health technology at different points in its life cycle,” is a widely accepted tool to inform decision-making and promote equitable, efficient, and high-quality health systems [12,13].

Generally, HTA is conducted according to specific methodological frameworks, such as the HTA Core Model of the European Network for Health Technology Assessment (EUnetHTA) [14] and the guidelines for the development and adaptation of rapid HTA reports of the Spanish Network of Agencies for Assessing National Health System Technologies and Performance (RedETS) [15]. These frameworks establish the methodologies to follow and the elements to evaluate. Although they are helpful instruments for evaluating various health technologies, they have certain limitations when it comes to comprehensively assessing dHTs. For this reason, in the past few years, different initiatives have emerged to adapt existing methodological frameworks or develop new ones, with the objective of covering additional domains (eg, interoperability, scalability) that reflect the intrinsic characteristics of dHTs [16-18]. Examples of these initiatives are the Evidence Standards Framework (ESF) of the National Institute for Health and Care Excellence (NICE) [19] and the Digi-HTA framework of the Finnish Coordinating Center for Health Technology Assessment (FinCCHTA) [16]. Nonetheless, the majority of these frameworks have certain constraints, such as being designed for a particular socioeconomic or national setting, which restricts their transferability to other countries; being specific to, or excluding, certain dHTs, which limits their applicability; or having limited evidence regarding their actual usefulness.

In this context, we performed a scoping review (ScR) with the aim of identifying the methodological frameworks used worldwide for the evaluation of dHTs; determining what dimensions and aspects are considered for each type of dHT; and generating, through a thematic analysis, a proposal for a methodological framework based on the most frequently described dimensions in the literature. This research focused mainly on mobile health (mHealth), non–face-to-face care models, and medical devices that integrate AI, as these particular dHTs are the ones most frequently assessed by the HTA agencies and units of RedETS.

Identifying Research Questions

This ScR followed by a thematic analysis answered the following research questions:

  • What methodological frameworks currently exist for digital health technology assessment (dHTA)?
  • What domains and dimensions are considered in dHTA?
  • Do the different domains and dimensions considered depend on whether the dHT addressed is a non–face-to-face care model of health care provision, a mobile device (mHealth), or a device that incorporates AI?

Overview of Methods for Conducting the Scoping Review

We conducted an ScR of the literature and a thematic analysis of the included studies according to the published protocol [20]. The ScR aimed to answer the first research question, while the thematic analysis aimed to answer the second and third research questions. Spanish experts from various domains of HTA and dHT collaborated throughout the study design and development.

The ScR of the available scientific literature was carried out in accordance with the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analysis extension for Scoping Reviews) guidelines (Multimedia Appendix 1) [21] and following the recommendations of Peters et al [22] and Pollock et al [23].

Ethical Considerations

As this work was an ScR, no ethical board approval was required.

Search Strategy

The search strategy (Multimedia Appendix 2) was designed by an experienced information specialist (author RP-P) in accordance with the research questions, using the validated health apps filter of Ayiku et al [24] and adding terms for concepts related to mHealth, remote care models, AI, digital health, methodological frameworks, and HTA. The strategy was peer-reviewed by authors JS-F and CM-P according to the Peer Review of Electronic Search Strategies (PRESS) statement [25] and was executed in the following 7 databases, considering the characteristics of each in terms of syntax, controlled vocabulary, and proximity operators: Medline (Ovid), CINAHL Plus, Embase, Cochrane Library, Scopus, Web of Science, and Trip Database. No time, language, or other filters were applied.

The identification of relevant studies was complemented with a manual search based on the references of the included studies, as well as the websites of the HTA agencies identified through the web pages of EUnetHTA, the International Network of Agencies for Health Technology Assessment (INAHTA), and Health Technology Assessment International (HTAi). Additionally, a search was conducted in Google Scholar, limited to the first 250 results, to ensure the inclusion of all pertinent studies [26].

Inclusion and Exclusion Criteria

The inclusion criteria used in the reference-screening process were based on the previously detailed research questions and are outlined in Textbox 1 using the Population/Problem, Phenomenon of Interest, Context and Design (PICo-D) format [27,28]. The PICo-D format was used instead of the traditional Population/Problem, Intervention, Comparator, Outcomes, Design (PICO-D) format due to the qualitative nature of the research questions and the characteristics of the phenomenon of interest.

Studies were excluded if they were published before 2011 (due to the rapid evolution of dHTs in the past few years), did not describe dimensions or evaluation criteria, or were based on methodological frameworks not intended for the assessment of dHTs (eg, EUnetHTA Core Model 3.0). Likewise, we excluded comments, editorials, letters, conference abstracts, frameworks or tools focusing on the evaluation of dHTs by users (eg, the user version of the Mobile App Rating Scale [uMARS]), and documents in languages other than English, Spanish, or Catalan.

Textbox 1. Research questions in Population/Problem, Phenomenon of Interest, Context and Design (PICo-D) format.

Population/problem

Digital health technology assessment (dHTA)

Phenomenon of interest

Specific methodological frameworks for the evaluation of digital health (with a special focus on mobile health [mHealth], non–face-to-face care models, and medical devices that integrate artificial intelligence [AI], due to the types of technologies most commonly assessed in the Spanish National Health System [SNS]) that describe the domains to be evaluated in dHTA

Context

Health technology assessment (HTA)

Design

Methodological guidelines and frameworks, scoping reviews (ScRs), systematic reviews (SRs), consensus documents, and qualitative studies

Reference Screening and Data Extraction

The screening of studies was carried out by authors CM-P and JS-F in 2 phases in accordance with the selection criteria detailed earlier (Textbox 1) and in a single-blind peer review manner. The first phase consisted of screening of the titles and abstracts of the studies identified in the bibliographic search. The second phase consisted of full-text screening of the studies included in the previous phase.

Data extraction was performed by 3 authors (CM-P, RP-P, and JS-F) using the web and desktop versions of ATLAS.ti version 22.0 (Scientific Software Development GmbH) [29] and data extraction sheets designed ad hoc for this purpose, following the recommendations of the Cochrane Handbook for Systematic Reviews of Interventions [30].

When disagreements emerged in either of the 2 processes, a consensus was sought among the 3 reviewers (CM-P, RP-P, and JS-F). When a consensus was not possible, a fourth reviewer (author RMV-H) was consulted.

Collecting, Summarizing, and Reporting the Results

A descriptive analysis was carried out to evaluate and report the existing methodological frameworks and their characteristics.

Overview of Methods for Thematic Analysis

The thematic analysis was performed following the recommendations and phases described by Thomas and Harden [31] to determine the HTA dimensions for dHTs: (1) line-by-line text coding, (2) development of descriptive themes, and (3) generation of analytical themes. Both analyses were carried out by 3 authors (CM-P, RP-P, and JS-F) using the web and desktop versions of ATLAS.ti version 22.0 [29].

Dimensions identified from systematic reviews (SRs) that were derived from primary studies also retrieved in our systematic search were counted only once to avoid data duplication and risk of bias. The primary studies included in the SRs were not analyzed directly; rather, they were analyzed through the findings reported in the SRs.


Study Selection and Characteristics

A total of 3042 studies were retrieved through the systematic (n=3023, 99.4%) and manual (n=19, 0.6%) searches. Of these, 2238 (73.6%) studies were identified as unique after removing duplicates.

After title and abstract review, 81 (3.6%) studies were selected for full-text review, of which 26 (32.1%) were finally included in the analysis. The excluded studies and the reasons for exclusion are detailed in Multimedia Appendix 3; in brief, the reasons for exclusion were phenomenon of interest (n=30, 37%), type of publication (n=15, 18.5%), purpose (n=6, 7.4%), language (n=2, 2.5%), and duplicated information (n=2, 2.5%). The study selection process is outlined in Figure 1 [32].

Of the 26 studies included in this ScR, 19 (73.1%) were designed as specific methodological frameworks for dHTA [16,17,33-47], 4 (15.4%) were SRs [48-51], 1 (3.9%) was a report from the European mHealth Hub's working group on mHealth assessment guidelines [52], 1 (3.9%) was a qualitative study [53], and 1 (3.9%) was a viewpoint [54]. In terms of scope, 3 (11.5%) focused on the assessment of non–face-to-face care models [33-35], 8 (30.8%) on mHealth assessment [36-40,52,53,55], 2 (7.7%) on the assessment of AI technology [41,54], 4 (15.4%) on eHealth [42,43,48,50], and 9 (34.6%) on the overall assessment of digital health [16,17,44-47,49,51,56].

Figure 1. PRISMA 2020 flow diagram of the search and study selection process for new SRs, meta-analyses, and ScRs. PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analysis; ScR: scoping review; SR: systematic review.

Research Question 1: Description of Identified Frameworks for dHTA

The 19 methodological frameworks for dHTA [16,17,33-47] originated in various countries: the majority (n=5, 26.3%) were developed in Australia [17,34,38,41,46], followed by 3 (15.8%) from the United States [43,45,56] and 2 (10.5%) from Switzerland [47,55]; the remaining 9 (47.4%) frameworks were developed in Afghanistan [42], Denmark [33], Scotland [35], Finland [16], Ireland [36], Israel [40], the United Kingdom [37], Spain [39], and Sweden [44].

The 19 methodological frameworks focused on evaluating various types of technologies. Specifically, 3 (15.8%) of them were designed for assessing non–face-to-face care models [33-35], 6 (31.6%) for mHealth [36-40,55], and 1 (5.3%) for AI solutions [41]. The other 9 (47.4%) frameworks addressed eHealth [42,43,56] or digital health in general [16,17,44-47], which encompasses non–face-to-face care models, mHealth, and occasionally AI-based solutions [18] within its scope. It is pertinent to mention that the differentiation between the methodological frameworks designed for the evaluation of eHealth and those designed for dHTA was based on the specific terminology and descriptions used by the authors of those frameworks.

The structures and characteristics of the analyzed methodological frameworks were heterogeneous in terms of evaluation specificity (whether they focused on a global evaluation encompassing more than 1 domain or dimension or on a specific assessment addressing only 1 domain or dimension), assessment approach (whether they adopted a phased evaluation, a domain evaluation, or a hybrid of both), and number of domains included. Regarding evaluation specificity, 17 (89.5%) methodological frameworks were classified as global, as they covered various aspects or domains within their scope [16,17,33-36,38-45,47,55,56], while 2 (10.5%) were classified as specific, as they concentrated exclusively on 1 element or domain of assessment [37,46]. Regarding the assessment approach, 14 (73.7%) methodological frameworks proposed a domain-based evaluation [16,17,33-36,38-40,43,44,46,55,56], while 4 (21.1%) proposed a hybrid one (phased and domain based) [41,42,45,47]; the remaining methodological framework did not fit into any of the previous categories, as it was structured not by domains or phases but by types of risk [37]. Finally, the number of evaluation domains considered ranged from 1 to 14, with an average of 7. Table 1 outlines the primary features of the included methodological frameworks and provides a thorough breakdown of the domains and dimensions they address.
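The percentages above follow directly from the classification tallies over the 19 included frameworks. As a quick arithmetic sketch (the counts are transcribed from the text; this is an illustration, not the authors' analysis script):

```python
from collections import Counter

TOTAL = 19  # methodological frameworks included in this ScR

# Tallies as reported in the text above
specificity = Counter({"global": 17, "specific": 2})
approach = Counter({"domain-based": 14, "hybrid": 4, "risk-based": 1})

def pct(n: int, total: int = TOTAL) -> float:
    """Share of the included frameworks, rounded to 1 decimal place."""
    return round(100 * n / total, 1)

# Both classifications partition the same 19 frameworks
assert sum(specificity.values()) == TOTAL
assert sum(approach.values()) == TOTAL

print(pct(specificity["global"]), pct(specificity["specific"]))  # 89.5 10.5
print(pct(approach["domain-based"]), pct(approach["hybrid"]))    # 73.7 21.1
```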

In contrast, from 3 (75%) [49-51] of the 4 SRs [48-51] and the report from the working group on guidelines for the evaluation of mHealth solutions from the European mHealth Hub [52], we identified other methodological frameworks and tools focusing on the assessment of dHTs. Specifically, we identified 16 methodological frameworks or tools focusing on the evaluation of non–face-to-face care models [57-72], along with 37 for the evaluation of mHealth [10,52,73-95], 11 for the evaluation of eHealth [96-107], and 17 for the evaluation of dHTs in general [108-124].
Additionally, 5 (26.3%) [33,34,36,37,42] of the 19 methodological frameworks included in this ScR were also identified and analyzed in 1 or more of the 4 literature synthesis documents [49-52]. It is important to note that the difference between the frameworks we retrieved through our systematic search and those identified in the 4 SRs is the result of the narrower perspective we adopted, focusing exclusively on frameworks directly relevant to the HTA field, in line with the aims of our study. In Multimedia Appendix 4, we provide a more detailed explanation of the methodological frameworks included in the studies mentioned earlier [19,49-52,57-73,75-135].

Table 1. Methodological frameworks (N=19) included in this ScRa.
Methodological framework, year | Country | Assessment specificity | Assessment approach | Assessment domains (n)
Methodological frameworks focusing on the assessment of non–face-to-face care models (n=3, 15.8%)

Model for Assessment of Telemedicine Applications (MAST), 2012 [33] | Denmark | Overall evaluation | By domain | Health problem and application description; security; clinical effectiveness; patient perspective; economic aspects; organizational aspects; sociocultural, ethical, and legal aspects (n=7)

Scottish Centre for Telehealth & Telecare (SCTT) Toolkit, 2013 [35] | Scotland | Overall evaluation | By domain | User benefits/costs; benefits/service costs; user experience; increased use of the technology platform available to support routine local services; confidence in the use and awareness of staff; awareness of telehealth and telecare as tools (n=6)

Telehealth framework, 2014 [34] | Australia | Overall evaluation | By domain | Health domain; health services; communication technologies; environment configuration; socioeconomic evaluation (n=5)
Methodological frameworks focusing on the evaluation of mHealthb technologies (n=6, 31.6%)

Caulfield’s evaluation framework, 2019 [36] | Ireland | Overall evaluation | By domain | Context information; cost information; normative compliance; scientific evidence; human factors; data collection and interpretation (n=6)

mHealth-based technology assessment for mobile apps, 2020 [39] | Spain | Overall evaluation | By domain | General information about the clinical condition and about the mHealth solution; privacy and security; technological aspects and interoperability; evidence and clinical effectiveness; user experience, usability, acceptability, ease of use, and aesthetics; costs and economic evaluation; impact on the organization (n=7)

Henson’s app evaluation framework, 2019 [40] | Israel | Overall evaluation | By domain | Context information; privacy/security; scientific evidence; usability; data integration (n=5)

Lewis’s assessment risk framework, 2014 [37] | United Kingdom | Risk/safety assessment | N/Ac | Risk (n=1)

Mobile medical app evaluation module, 2020 [38] | Australia | Overall evaluation | By domain | Description and technical characteristics; current use of technology; effectiveness; security; cost-effectiveness; organizational aspects; ethical aspects; legal aspects; postmarket monitoring; social aspects (n=10)

Vokinger’s assessment framework, 2020 [55] | Switzerland | Overall evaluation | By domain | Purpose; usability; information accuracy; organizational reputation; transparency; privacy; self-determination or user control (n=7)
Methodological frameworks focusing on the assessment of solutions based on AId (n=1, 5.3%)

Translational Evaluation of Healthcare AI (TEHAI), 2021 [41] | Australia | Overall evaluation | Hybrid | Capability; utility; adoption (n=3)
Methodological frameworks focusing on the evaluation of eHealth technologies (n=3, 15.8%)

Health Information Technology Evaluation Framework (HITREF), 2015 [43] | United States | Overall evaluation | By domain | Structural quality; quality of information logistics; unintended consequences/benefits; effects on quality-of-care outcomes; effects on process quality (n=5)

Heuristic evaluation of eHealth interventions, 2016 [56] | United States | Overall evaluation | By domain | Usability/ease of use/functionality; aesthetics; security; content; adherence; persuasive design; research evidence; owner credibility (n=8)

Khoja-Durrani-Sajwani (KDS) framework, 2013 [42] | Afghanistan, Canada, Kenya, Pakistan | Overall evaluation | Hybrid | Health service results; technology results; economic results; sociotechnical and behavioral results; ethical results; preparation and change results; regulatory results (n=7)
Methodological frameworks focusing on the evaluation of dHTse (n=6, 31.6%)

Deontic accountability framework, 2019 [46] | Australia | Ethical evaluation | By domain | Ethical principles (n=1)

Digi-HTA, 2019 [16] | Finland | Overall evaluation | By domain | Company information; product information; technical stability; costs; effectiveness; clinical safety; data protection and security; usability and accessibility; interoperability; AI; robots (n=11)

Digital Health Scorecard, 2019 [45] | United States | Overall evaluation | Hybrid | Technical validation; clinical validation; usability; costs (n=4)

Framework for the design and evaluation of digital health interventions (DEDHI), 2019 [44] | Sweden | Overall evaluation | By domain | Easy to use; content quality; privacy and security; responsibility; adherence; aesthetics; perceived benefits; effectiveness; quality of service; personalization; perceived enjoyment; ethics; security (n=13)

Monitoring and Evaluating Digital Health Interventions Guide, 2016 [47] | Switzerland | Overall evaluation | Hybrid | Costs; feasibility; usability; effectiveness; implementation science; efficiency; quality; use (n=8)

Precision Health Applications Evaluation Framework, 2021 [17] | Australia | Overall evaluation | By domain | Novelty; adaptability; information management; performance; clinical effectiveness; quality assurance (n=6)

aScR: scoping review.

bmHealth: mobile health.

cN/A: not applicable.

dAI: artificial intelligence.

edHT: digital health technology.

Research Question 2: Domains and Dimensions Being Considered in dHTA

The 26 (32.1%) included studies encompassed a broad range of items to consider in dHTA and often used diverse expressions for analogous concepts. We reduced this heterogeneity through our thematic analysis, following the recommendations and phases described by Thomas and Harden [31].

In this sense, in the first phase of thematic analysis, we identified and coded 176 units of meaning (coded as provisional codes) that represented different items (domains or dimensions) of the assessment. These units were then grouped into 86 descriptive themes (second phase), which were further refined into 61 analytical themes that captured the key concepts and relationships between them (third phase). Lastly, the 61 analytical themes were arranged in a 3-level vertical hierarchy based on the evidence: level 1 (12 domains), level 2 (38 dimensions), and level 3 (11 subdimensions). We used the term “domain” to refer to a distinct area or topic of evaluation that is integral to the assessment of the technology in question. A domain may encompass multiple related concepts or dimensions that are relevant to the evaluation. Each dimension, in turn, represents a specific aspect of evaluation that belongs to the domain and contributes to an understanding of its overall significance. Finally, a subdimension refers to a partial element of a dimension that facilitates its analysis. By using these terms, we aimed to provide a clear, rigorous, and comprehensive framework for conducting HTA.
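The 3-level hierarchy described above (domains containing dimensions, which may in turn contain subdimensions) can be sketched as a simple nested data structure. The example entries below are hypothetical illustrations, not items from the actual coding scheme:

```python
from dataclasses import dataclass, field

@dataclass
class Dimension:
    """Level 2: a specific aspect of evaluation within a domain."""
    name: str
    subdimensions: list[str] = field(default_factory=list)  # level 3

@dataclass
class Domain:
    """Level 1: a distinct area or topic of evaluation."""
    name: str
    dimensions: list[Dimension] = field(default_factory=list)  # level 2

# Hypothetical example: one domain with two dimensions, one of which
# carries a subdimension that facilitates its analysis.
example = Domain(
    name="Safety",
    dimensions=[
        Dimension("Clinical safety"),
        Dimension("Data protection", subdimensions=["Data encryption"]),
    ],
)

assert len(example.dimensions) == 2
assert example.dimensions[1].subdimensions == ["Data encryption"]
```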

Table 2 displays the 61 analytical themes in descending order of coding frequency, aligned with the hierarchy derived from the data analysis. Additionally, the table specifies the intervention modalities or dHTs that correspond to each code and lists the studies from which each code originated. The network of relationships among the codes can be found in Multimedia Appendix 5.

Table 2. Analytical themes of the thematic analysis presented in descending order of coding frequency and aligned with the hierarchy derived from the data analysis.
Domain (level 1) and dimensions (level 2) | Subdimension (level 3) | Type of dHTa
Description of the technology (n=19, 6.2%) | — | Non–face-to-face care models [33,34]; mHealthb [36,38-40,55]; AIc [41]; eHealth [50]; digital health [16,17,49,51]

  Credibility and reputation (n=5, 1.6%)^d: mHealth [40,55], digital health [56]

  Scientific basis (n=5, 1.6%): mHealth [36,39,40], digital health [17,56]

  Technical evaluation and validation (n=3, 1.0%): mHealth [36], digital health [45]

  Adoption (n=2, 0.6%): AI [41], digital health [47]

    Usage (n=2, 0.6%): AI [41], digital health [47]

    Integration (n=1, 0.3%): AI [41]

  Information management (n=2, 0.6%): eHealth [43], digital health [17]

  Novelty (n=1, 0.3%): digital health [17]
Safety (n=19, 6.2%): Non–face-to-face care models [33], mHealth [37-40,52], AI [41], eHealth [43,50], digital health [16,44,49,51,56]

  Clinical safety (n=12, 3.9%): Non–face-to-face care models [33], mHealth [37,38,52], AI [41], eHealth [43,50], digital health [16,44,49,51]

  Technical safety (n=11, 3.6%): Non–face-to-face care models [33], mHealth [37,39,52], digital health [16,44,49,51,56]
Clinical efficacy and effectiveness (n=17, 5.5%): Non–face-to-face care models [33,35], mHealth [38,39,52,53], eHealth [43,50], digital health [16,44,45,47,49,51]
Economic aspects (n=16, 5.2%): Non–face-to-face care models [33-35], mHealth [36,38,39], AI [54], eHealth [42,48,50], digital health [16,45,47,49,51]

  Costs (n=10, 3.2%): Non–face-to-face care models [33,35], mHealth [36,39], AI [54], digital health [16,45,47,49,51]

  Economic evaluation (n=7, 2.3%): Non–face-to-face care models [33,35], mHealth [38,39], eHealth [42,48], digital health [51]

  Use of resources (n=4, 1.3%) and efficiency (n=1, 0.3%): Non–face-to-face care models [35], mHealth [36], AI [54], digital health [47,49]
Ethical aspects (n=13, 4.2%): Non–face-to-face care models [33], mHealth [38,55], AI [41,54], eHealth [42,48,50], digital health [44,46,49,51]

  Equity (n=1, 0.3%): digital health [19]

  User control and self-determination (n=1, 0.3%): mHealth [55], digital health [46]

  Responsibility (n=1, 0.3%): digital health [44,46]

  Explainability (n=1, 0.3%): digital health [46]
Human and sociocultural aspects (n=13, 4.2%): Non–face-to-face care models [33,35], mHealth [36,38,39], AI [54], eHealth [42,48], digital health [17,49,51]

  User experience (n=7, 2.3%): Non–face-to-face care models [35], mHealth [39,40,52], digital health [17,44,56]

  Accessibility (n=3, 1.0%): mHealth [52]

  Acceptability (n=2, 0.6%): mHealth [39], AI [41]

  Engagement (n=2, 0.6%): digital health [44,56]

  Perceived profit (n=1, 0.3%): digital health [44]
Organizational aspects (n=11, 3.6%): Non–face-to-face care models [33], mHealth [38,39], AI [41,54], eHealth [42,43,48,50], digital health [49,51]
Legal and regulatory aspects (n=10, 3.2%) Non–face-to-face care models [Kidholm K, Ekeland AG, Jensen LK, Rasmussen J, Pedersen CD, Bowes A, et al. A model for assessment of telemedicine applications: MAST. Int J Technol Assess Health Care. Jan 23, 2012;28(1):44-51. [CrossRef]33], mHealth [Caulfield B, Reginatto B, Slevin P. Not all sensors are created equal: a framework for evaluating human performance measurement technologies. NPJ Digit Med. Feb 14, 2019;2(1):7. [FREE Full text] [CrossRef] [Medline]36,Moshi MR, Tooher R, Merlin T. Development of a health technology assessment module for evaluating mobile medical applications. Int J Technol Assess Health Care. May 18, 2020;36(3):252-261. [CrossRef]38], AI [Alami H, Lehoux P, Auclair Y, de Guise M, Gagnon M, Shaw J, et al. Artificial intelligence and health technology assessment: anticipating a new level of complexity. J Med Internet Res. Jul 07, 2020;22(7):e17707. [FREE Full text] [CrossRef] [Medline]54], eHealth [Khoja S, Durrani H, Scott RE, Sajwani A, Piryani U. Conceptual framework for development of comprehensive e-health evaluation tool. Telemed J E Health. Jan 2013;19(1):48-53. [CrossRef] [Medline]42,Enam A, Torres-Bonilla J, Eriksson H. Evidence-based evaluation of eHealth interventions: systematic literature review. J Med Internet Res. Nov 23, 2018;20(11):e10971. [FREE Full text] [CrossRef] [Medline]48,Vis C, Bührmann L, Riper H, Ossebaard HC. Health technology assessment frameworks for eHealth: a systematic review. Int J Technol Assess Health Care. Apr 16, 2020;36(3):204-216. [CrossRef]50], digital health [Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]17,Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119.
[FREE Full text] [CrossRef] [Medline]49,von Huben A, Howell M, Howard K, Carrello J, Norris S. Health technology assessment for digital technologies that manage chronic disease: a systematic review. Int J Technol Assess Health Care. May 26, 2021;37(1):e66. [CrossRef]51]

Privacy (n=6, 1.9%) mHealth [Puigdomènech E, Poses-Ferrer E, Espallargues CM, Blasco AJ, Varela LL, Paz VL. Evaluación de tecnología basada en mSalud para aplicaciones móviles. Madrid. Ministerio de Sanidad; 2020. 39,Henson P, David G, Albright K, Torous J. Deriving a practical framework for the evaluation of health apps. Lancet Digit Health. Jun 2019;1(2):e52-e54. [CrossRef]40,Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52,Vokinger KN, Nittas V, Witt CM, Fabrikant SI, von Wyl V. Digital health and the COVID-19 epidemic: an assessment framework for apps from an epidemiological and legal perspective. Swiss Med Wkly. May 04, 2020;150(1920):w20282. [FREE Full text] [CrossRef] [Medline]55], AI [Reddy S, Rogers W, Makinen V, Coiera E, Brown P, Wenzel M, et al. Evaluation framework to guide implementation of AI systems into healthcare settings. BMJ Health Care Inform. Oct 12, 2021;28(1):e100444. [FREE Full text] [CrossRef] [Medline]41], digital health [Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]44]

Transparency (n=4, 1.3%) mHealth [Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52,Vokinger KN, Nittas V, Witt CM, Fabrikant SI, von Wyl V. Digital health and the COVID-19 epidemic: an assessment framework for apps from an epidemiological and legal perspective. Swiss Med Wkly. May 04, 2020;150(1920):w20282. [FREE Full text] [CrossRef] [Medline]55], AI [Reddy S, Rogers W, Makinen V, Coiera E, Brown P, Wenzel M, et al. Evaluation framework to guide implementation of AI systems into healthcare settings. BMJ Health Care Inform. Oct 12, 2021;28(1):e100444. [FREE Full text] [CrossRef] [Medline]41,Alami H, Lehoux P, Auclair Y, de Guise M, Gagnon M, Shaw J, et al. Artificial intelligence and health technology assessment: anticipating a new level of complexity. J Med Internet Res. Jul 07, 2020;22(7):e17707. [FREE Full text] [CrossRef] [Medline]54]

Responsibility (n=1, 0.3%) Digital health [Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]44]
Description of health problem (n=8, 2.6%) Non–face-to-face care models [Kidholm K, Ekeland AG, Jensen LK, Rasmussen J, Pedersen CD, Bowes A, et al. A model for assessment of telemedicine applications: MAST. Int J Technol Assess Health Care. Jan 23, 2012;28(1):44-51. [CrossRef]33,Nepal S, Li J, Jang-Jaccard J, Alem L. A framework for telehealth program evaluation. Telemed J E Health. Apr 2014;20(4):393-404. [CrossRef] [Medline]34], mHealth [Puigdomènech E, Poses-Ferrer E, Espallargues CM, Blasco AJ, Varela LL, Paz VL. Evaluación de tecnología basada en mSalud para aplicaciones móviles. Madrid. Ministerio de Sanidad; 2020. 39], eHealth [Vis C, Bührmann L, Riper H, Ossebaard HC. Health technology assessment frameworks for eHealth: a systematic review. Int J Technol Assess Health Care. Apr 16, 2020;36(3):204-216. [CrossRef]50], digital health [Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49,von Huben A, Howell M, Howard K, Carrello J, Norris S. Health technology assessment for digital technologies that manage chronic disease: a systematic review. Int J Technol Assess Health Care. May 26, 2021;37(1):e66. [CrossRef]51]
Content (n=5, 1.6%) mHealth [Vokinger KN, Nittas V, Witt CM, Fabrikant SI, von Wyl V. Digital health and the COVID-19 epidemic: an assessment framework for apps from an epidemiological and legal perspective. Swiss Med Wkly. May 04, 2020;150(1920):w20282. [FREE Full text] [CrossRef] [Medline]55], eHealth [Vis C, Bührmann L, Riper H, Ossebaard HC. Health technology assessment frameworks for eHealth: a systematic review. Int J Technol Assess Health Care. Apr 16, 2020;36(3):204-216. [CrossRef]50], digital health [Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]44,Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]

Information adequacy (n=2, 0.6%) mHealth [Vokinger KN, Nittas V, Witt CM, Fabrikant SI, von Wyl V. Digital health and the COVID-19 epidemic: an assessment framework for apps from an epidemiological and legal perspective. Swiss Med Wkly. May 04, 2020;150(1920):w20282. [FREE Full text] [CrossRef] [Medline]55], digital health [Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]

Intervention adequacy (n=2, 0.6%) Digital health [Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]
Technical aspects (n=4, 1.3%) AI [Alami H, Lehoux P, Auclair Y, de Guise M, Gagnon M, Shaw J, et al. Artificial intelligence and health technology assessment: anticipating a new level of complexity. J Med Internet Res. Jul 07, 2020;22(7):e17707. [FREE Full text] [CrossRef] [Medline]54], eHealth [Khoja S, Durrani H, Scott RE, Sajwani A, Piryani U. Conceptual framework for development of comprehensive e-health evaluation tool. Telemed J E Health. Jan 2013;19(1):48-53. [CrossRef] [Medline]42,Enam A, Torres-Bonilla J, Eriksson H. Evidence-based evaluation of eHealth interventions: systematic literature review. J Med Internet Res. Nov 23, 2018;20(11):e10971. [FREE Full text] [CrossRef] [Medline]48], digital health [Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]17]

Usability (n=10, 3.2%) mHealth [Puigdomènech E, Poses-Ferrer E, Espallargues CM, Blasco AJ, Varela LL, Paz VL. Evaluación de tecnología basada en mSalud para aplicaciones móviles. Madrid. Ministerio de Sanidad; 2020. 39,Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52,Vokinger KN, Nittas V, Witt CM, Fabrikant SI, von Wyl V. Digital health and the COVID-19 epidemic: an assessment framework for apps from an epidemiological and legal perspective. Swiss Med Wkly. May 04, 2020;150(1920):w20282. [FREE Full text] [CrossRef] [Medline]55], digital health [Haverinen J, Keränen N, Falkenbach P, Maijala A, Kolehmainen T, Reponen J. Digi-HTA: health technology assessment framework for digital healthcare services. FinJeHeW. Nov 02, 2019;11(4):326-341. [CrossRef]16,Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]44,Mathews SC, McShea MJ, Hanley CL, Ravitz A, Labrique AB, Cohen AB. Digital health: a path to validation. NPJ Digit Med. May 13, 2019;2(1):38. [FREE Full text] [CrossRef] [Medline]45,World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47,Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49,Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]

Adaptability (n=8, 2.6%) Digital health [Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]17]

Adaptability (n=8, 2.6%), Interoperability (n=4, 1.3%) mHealth [Puigdomènech E, Poses-Ferrer E, Espallargues CM, Blasco AJ, Varela LL, Paz VL. Evaluación de tecnología basada en mSalud para aplicaciones móviles. Madrid. Ministerio de Sanidad; 2020. 39,Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52], digital health [Haverinen J, Keränen N, Falkenbach P, Maijala A, Kolehmainen T, Reponen J. Digi-HTA: health technology assessment framework for digital healthcare services. FinJeHeW. Nov 02, 2019;11(4):326-341. [CrossRef]16,Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49]

Adaptability (n=8, 2.6%), Scalability (n=2, 0.6%) mHealth [Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52], AI [Reddy S, Rogers W, Makinen V, Coiera E, Brown P, Wenzel M, et al. Evaluation framework to guide implementation of AI systems into healthcare settings. BMJ Health Care Inform. Oct 12, 2021;28(1):e100444. [FREE Full text] [CrossRef] [Medline]41]

Adaptability (n=8, 2.6%), Integration of data (n=1, 0.3%) mHealth [Henson P, David G, Albright K, Torous J. Deriving a practical framework for the evaluation of health apps. Lancet Digit Health. Jun 2019;1(2):e52-e54. [CrossRef]40]

Adaptability (n=8, 2.6%), Transferability (n=1, 0.3%) eHealth [Enam A, Torres-Bonilla J, Eriksson H. Evidence-based evaluation of eHealth interventions: systematic literature review. J Med Internet Res. Nov 23, 2018;20(11):e10971. [FREE Full text] [CrossRef] [Medline]48]

Quality (n=5, 1.6%) eHealth [Sockolow P, Bowles K, Rogers M. Health information technology evaluation framework (HITREF) comprehensiveness as assessed in electronic point-of-care documentation systems evaluations. Amsterdam. IOS Press; 2015. 43], digital health [Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]17,Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]44,World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47]

Design (n=5, 1.6%) Digital health [Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]

Design (n=5, 1.6%), Persuasive design (n=1, 0.3%) Digital health [Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]

Technical stability (n=4, 1.3%) mHealth [Moshi MR, Tooher R, Merlin T. Development of a health technology assessment module for evaluating mobile medical applications. Int J Technol Assess Health Care. May 18, 2020;36(3):252-261. [CrossRef]38,Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52], digital health [Haverinen J, Keränen N, Falkenbach P, Maijala A, Kolehmainen T, Reponen J. Digi-HTA: health technology assessment framework for digital healthcare services. FinJeHeW. Nov 02, 2019;11(4):326-341. [CrossRef]16,World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47,Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49]

Aesthetics (n=3, 1.0%) mHealth [Puigdomènech E, Poses-Ferrer E, Espallargues CM, Blasco AJ, Varela LL, Paz VL. Evaluación de tecnología basada en mSalud para aplicaciones móviles. Madrid. Ministerio de Sanidad; 2020. 39], digital health [Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]44,Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]

Ease of use (n=3, 1.0%) mHealth [Puigdomènech E, Poses-Ferrer E, Espallargues CM, Blasco AJ, Varela LL, Paz VL. Evaluación de tecnología basada en mSalud para aplicaciones móviles. Madrid. Ministerio de Sanidad; 2020. 39,Henson P, David G, Albright K, Torous J. Deriving a practical framework for the evaluation of health apps. Lancet Digit Health. Jun 2019;1(2):e52-e54. [CrossRef]40], digital health [Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]56]

Accessibility (n=2, 0.6%) mHealth [Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52], digital health [Haverinen J, Keränen N, Falkenbach P, Maijala A, Kolehmainen T, Reponen J. Digi-HTA: health technology assessment framework for digital healthcare services. FinJeHeW. Nov 02, 2019;11(4):326-341. [CrossRef]16]

Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%) Digital health [Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]17,World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47]

Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%), Reliability (n=6, 1.9%) mHealth [Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52,Kumar S, Nilsen WJ, Abernethy A, Atienza A, Patrick K, Pavel M, et al. Mobile health technology evaluation: the mhealth evidence workshop. Am J Prev Med. Aug 2013;45(2):228-236. [FREE Full text] [CrossRef] [Medline]53], digital health [World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47]

Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%), Validity (n=5, 1.6%) mHealth [Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52,Kumar S, Nilsen WJ, Abernethy A, Atienza A, Patrick K, Pavel M, et al. Mobile health technology evaluation: the mhealth evidence workshop. Am J Prev Med. Aug 2013;45(2):228-236. [FREE Full text] [CrossRef] [Medline]53], AI [Reddy S, Rogers W, Makinen V, Coiera E, Brown P, Wenzel M, et al. Evaluation framework to guide implementation of AI systems into healthcare settings. BMJ Health Care Inform. Oct 12, 2021;28(1):e100444. [FREE Full text] [CrossRef] [Medline]41]

Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%), Accuracy (n=2, 0.6%) Digital health [Evidence standards framework (ESF) for digital health technologies. National Institute for Health and Care Excellence. URL: https://www.nice.org.uk/about/what-we-do/our-programmes/evidence-standards-framework-for-digital-health-technologies [accessed 2024-03-28] 19]

Technical effectiveness (n=1, 0.3%) or performance (n=2, 0.6%), Sensitivity (n=1, 0.3%) Digital health [Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]17]

Feasibility (n=1, 0.3%) Digital health [World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47]

Generalizability and reproducibility (n=1, 0.3%) AI [Alami H, Lehoux P, Auclair Y, de Guise M, Gagnon M, Shaw J, et al. Artificial intelligence and health technology assessment: anticipating a new level of complexity. J Med Internet Res. Jul 07, 2020;22(7):e17707. [FREE Full text] [CrossRef] [Medline]54]

Interpretability (n=1, 0.3%) AI [Alami H, Lehoux P, Auclair Y, de Guise M, Gagnon M, Shaw J, et al. Artificial intelligence and health technology assessment: anticipating a new level of complexity. J Med Internet Res. Jul 07, 2020;22(7):e17707. [FREE Full text] [CrossRef] [Medline]54]

Customization (n=1, 0.3%) Digital health [Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]44]
Postmarketing monitoring (n=3, 1.0%) mHealth [Moshi MR, Tooher R, Merlin T. Development of a health technology assessment module for evaluating mobile medical applications. Int J Technol Assess Health Care. May 18, 2020;36(3):252-261. [CrossRef]38], digital health [World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47]

a dHT: digital health technology.

b mHealth: mobile health.

c AI: artificial intelligence.

d N/A: not applicable.

Research Question 3: Variability of Domains and Dimensions Among Technologies

Our thematic analysis revealed substantial variability and heterogeneity in the number and type of domains and dimensions considered by the methodological frameworks.

In terms of numbers, the variability was pronounced when we compared frameworks addressing different types of dHTs. For instance, the thematic analysis of frameworks for assessing telemedicine identified only 9 (75%) domains and 6 (15.8%) dimensions, whereas in frameworks for assessing mHealth we identified 10 (83.3%) domains, 20 (52.6%) dimensions, and 6 (54.5%) subdimensions, and in frameworks for assessing AI we identified 8 (66.7%) domains, 7 (18.4%) dimensions, and 6 (54.5%) subdimensions.

In terms of the types of domains considered, certain dimensions and domains were identified as more distinctive for one kind of dHT than for another. For instance, clinical efficacy and effectiveness, technical safety, economic evaluation, and user experience were relevant for the evaluation of non–face-to-face care models and mHealth but not for AI. In contrast, specific dimensions and domains of mHealth were not considered in the evaluation of non–face-to-face care models or AI, such as postmarketing monitoring, scientific basis, technical evaluation and validation, user control and self-determination, accessibility, content and adequacy of information, and data interoperability and integration. Finally, specific methodological frameworks for the evaluation of AI included dimensions such as technical aspects, adoption, use, integration, generalizability, reproducibility, and interpretability, which were not considered in the evaluation of telemedicine or mHealth.

Proposal for Domains, Dimensions, and Subdimensions for dHTA

These findings led to the development of a proposed methodological framework for dHTA, which comprises domains, dimensions, and subdimensions. These evaluation items were established objectively based on thematically analyzed evidence, without incorporating the researcher’s perspective. Consequently, the proposal for domains, dimensions, and subdimensions emerged from the literature and represents the entirety of identified evaluation domains, dimensions, and subdimensions (n=61). Figure 2 presents a visual representation of the proposed framework comprising 12 domains, 38 dimensions, and their corresponding 11 subdimensions. Notably, the figure highlights certain domains, dimensions, and subdimensions that are particularly relevant to the evaluation of non–face-to-face care models, mHealth, and AI according to the evidence.

Figure 2. Proposed methodological framework for dHTA. a Dimension identified as especially relevant for non–face-to-face care models; b dimension identified as especially relevant for mHealth; c dimension identified as especially relevant for AI; dHTA: digital health technology assessment. A higher-resolution version of this image is available in Multimedia Appendix 6 (PNG file, 121 KB).

Principal Findings

In recent years, the interest in digital health has increased significantly, giving rise to a myriad of available technologies. This has brought about a profound transformation in health care systems, fundamentally changing the provision and consumption of health care services [European Union. State of health in the EU: companion report 2021. Luxembourg. European Union; 2022. 9]. However, despite these advancements, the shift toward digital health has been accompanied by challenges. One such challenge is the emergence of a plethora of short-lived implementations and an overwhelming diversity of digital tools, which has created a need for careful evaluation and analysis of the benefits and drawbacks of these technologies [World Health Organization (WHO). WHO guideline: recommendations on digital interventions for health system strengthening. Geneva. WHO; Jun 6, 2019. 4].

In this context, our ScR aimed to identify the methodological frameworks used worldwide for the assessment of dHTs; determine what domains are considered; and generate, through a thematic analysis, a proposal for a methodological framework based on the most frequently described domains in the literature.

Throughout the ScR, we identified a total of 95 methodological frameworks and tools, of which 19 [Haverinen J, Keränen N, Falkenbach P, Maijala A, Kolehmainen T, Reponen J. Digi-HTA: health technology assessment framework for digital healthcare services. FinJeHeW. Nov 02, 2019;11(4):326-341. [CrossRef]16,Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]17,Kidholm K, Ekeland AG, Jensen LK, Rasmussen J, Pedersen CD, Bowes A, et al. A model for assessment of telemedicine applications: MAST. Int J Technol Assess Health Care. Jan 23, 2012;28(1):44-51. [CrossRef]33-World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016. 47] were directly identified through a systematic search and 75 were indirectly identified through 4 SRs [Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49-Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52]. The difference in the number of methodological frameworks identified through the ScR and the 4 evidence synthesis documents [Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49-Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. 
URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52] is attributed to the inclusion of keywords related to the concept of HTA in the search syntax, the exclusion of methodological frameworks published prior to 2011 during the screening process, and the differences in perspectives used for the development of this paper compared to the 4 evidence synthesis documents mentioned earlier. In this sense, these 4 documents [Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49-Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52] analyzed methodological frameworks and tools aimed at evaluating digital health that were not developed from an HTA perspective, even though the authors treated them as such. For example, von Huben et al [von Huben A, Howell M, Howard K, Carrello J, Norris S. Health technology assessment for digital technologies that manage chronic disease: a systematic review. Int J Technol Assess Health Care. May 26, 2021;37(1):e66. [CrossRef]51] included in their analysis the Consolidated Standards of Reporting Trials (CONSORT)-EHEALTH tool [Eysenbach G, CONSORT-EHEALTH Group. CONSORT-EHEALTH: improving and standardizing evaluation reports of web-based and mobile health interventions. J Med Internet Res. Dec 31, 2011;13(4):e126. [FREE Full text] [CrossRef] [Medline]97], which aims to describe the information that should be reported in papers and reports that focus on evaluating web- and mHealth-based interventions; Kolasa et al [Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]49] included the mobile health evidence reporting and assessment (mERA) checklist [Agarwal S, LeFevre AE, Lee J, L'Engle K, Mehl G, Sinha C, et al. WHO mHealth Technical Evidence Review Group. Guidelines for reporting of health interventions using mobile phones: mobile health (mHealth) evidence reporting and assessment (mERA) checklist. BMJ. Mar 17, 2016;352:i1174. [CrossRef] [Medline]73], which aims to determine the information that should be reported in trials evaluating mHealth solutions; and the European mHealth Hub document [Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28] 52] included the Isys Score, which is used to catalogue smartphone apps.

However, as detailed in the Results section, some of the methodological frameworks identified through the ScR were characterized by the authors themselves as specific to evaluating certain types of dHTs (eg, non–face-to-face care models, mHealth), with certain differences across typologies. It is important to note that the differentiation among various types of dHTs, as described throughout this paper and commonly used in the field of digital health, cannot always be made in a precise and exclusive manner [Wienert J, Jahnel T, Maaß L. What are digital public health interventions? First steps toward a definition and an intervention classification framework. J Med Internet Res. Jun 28, 2022;24(6):e31921. [FREE Full text] [CrossRef] [Medline]136]. This is because a technology can often be classified into more than 1 category. For instance, an mHealth solution may use AI algorithms while simultaneously being integrated into a non–face-to-face care model [Deniz-Garcia A, Fabelo H, Rodriguez-Almeida AJ, Zamora-Zamorano G, Castro-Fernandez M, Alberiche Ruano MDP, et al. WARIFA Consortium. Quality, usability, and effectiveness of mhealth apps and the role of artificial intelligence: current scenario and challenges. J Med Internet Res. May 04, 2023;25:e44030. [FREE Full text] [CrossRef] [Medline]137]. In this context, future research should consider using alternative taxonomies or classification methods based on the intended purpose of the technology, such as those proposed by NICE in the updated version of the Evidence Standards Framework [National Institute for Health and Care Excellence (NICE). Evidence standards framework (ESF) for digital health technologies. London. NICE; 2022. 18] or the new digital health interventions system classification put forward by WHO [World Health Organization (WHO). Classification of digital interventions, services and applications in health: a shared language to describe the uses of digital technology for health, 2nd ed. Geneva. WHO; Oct 24, 2023. 138].

After conducting a thematic analysis of the 26 included studies, we observed that the various methodological frameworks include a set of evaluation items, referred to as domains, dimensions, or criteria. These items primarily focus on the safety; effectiveness; technical aspects; economic impact; and ethical, legal, and social consequences of dHTs. However, there is significant heterogeneity among these frameworks in terms of the way they refer to the evaluation items, the quantity and depth of their description, the degree of granularity, and the proposed evaluation methods, especially when comparing frameworks that focus on different types of dHTs. Despite this heterogeneity, most methodological frameworks consider evaluation items related to the 9 domains described by the HTA Core Model of EUnetHTA, while some frameworks propose additional evaluation elements, such as usability [16,44,45,47,49,56], privacy [39-41,44,52,55], and technical stability [16,38,47,49,52], among others. These findings are consistent with earlier research [50,51].

In addition, through the thematic analysis, the heterogeneity identified among the different methodological frameworks included in this ScR was reduced to a total of 61 analytical themes related to various evaluation elements, arranged in a 3-level vertical hierarchy based on the evidence: level 1 (12 domains), level 2 (38 dimensions), and level 3 (11 subdimensions). At this point, it is pertinent to note that although, from the researchers’ perspective, some dimensions could have been classified under different domains (eg, responsibility under ethical aspects) or seen as essential for other kinds of dHTs, an effort was made to maintain the highest degree of objectivity possible. This is why privacy issues were not described as essential for non–face-to-face care models and why the dimension of accessibility was categorized within both the human and sociocultural aspects domain and the technical aspects domain. This categorization was made because some of the methodological frameworks analyzed associated accessibility with sociocultural elements (eg, evaluating whether users with functional diversity can access the technology and have sufficient ability to use it as expected), while others linked it to technical elements (eg, the adequacy of the accessibility features and options that the system incorporates for its target audience) [16,52].

The ScR and thematic analysis conducted in this study led to a proposal for a methodological framework for dHTA. This framework was further developed using additional methodologies, such as consensus workshops, by the Agency for Health Quality and Assessment of Catalonia (AQuAS), in collaboration with all agencies of RedETS and commissioned by the Ministry of Health of Spain. The final framework is a specific methodological tool for the assessment of dHTs, aimed at describing the domains and dimensions to be considered in dHTA and defining the evidence standards that such technologies must meet based on their associated risk level. The proposed methodological framework enables the assessment of a wide range of dHTs, mainly those classified as medical devices according to Regulation (EU) 2017/745 on medical devices [139] and Regulation (EU) 2017/746 on in vitro diagnostic medical devices [140], although it can be adapted to assess dHTs not classified as medical devices. Unlike existing frameworks, it establishes a clear link between the identified domains and dimensions and the evidence standards that dHTs must meet. This approach will enhance the transparency and consistency of dHTAs and support evidence-based decision-making. The final document was published in November 2023 and is available in Spanish on the RedETS website as well as on the main web page of AQuAS [141]. Since the first week of February, the respective websites have also hosted an English version of this document [141], which is likewise accessible in the INAHTA database. In addition, the Spanish and English versions of the document will be periodically reviewed and, if necessary, adapted to align with emerging technologies and changes in legislation.

Limitations

Although this ScR was conducted in accordance with the PRISMA-ScR guidelines (Multimedia Appendix 1) and followed the recommendations of Peters et al [22] and Pollock et al [23], it had some limitations. First, given the perspective of our ScR, the search incorporated a block of keywords related to the concept of HTA (Multimedia Appendix 1), which may have limited the retrieval of some studies relevant to the study objective. However, this limitation was compensated for by the analysis of the 3 SRs and of the report of the working group on guidelines for the evaluation of mHealth solutions of the European mHealth Hub. Second, much of the literature related to HTA is gray literature published only on the websites of the authoring agencies. Despite efforts to address this limitation through expert input and a comprehensive search of the websites of the world’s leading agencies, it is possible that certain studies were not identified. Third, the quality and limitations of the analyses conducted by the authors of the methodological frameworks and tools included in the SRs may have had an impact on the indirect thematic analysis; therefore, it is possible that some data were omitted or not considered during this process. Fourth, the focus on dHTs encompassed within the 3 previously mentioned categories (mHealth, non–face-to-face care models, and medical devices that integrate AI) may have influenced the outcomes of the thematic analysis. Fifth, only methodological frameworks written in Catalan, Spanish, and English were included.

Comparison With Prior Work

To the best of our knowledge, this is the first ScR to examine the methodological frameworks for dHTA, followed by a thematic analysis, with the aim of proposing a new comprehensive framework that incorporates the existing literature in an objective manner and enables the assessment of the various technologies included under the concept of digital health. In this sense, existing SRs and other evidence synthesis documents have only analyzed the literature and reported the results in a descriptive manner [36,48,49,51,56,125,126]. Furthermore, this ScR also considered, in addition to the scientific literature, gray literature identified by searching the websites of the agencies, thus addressing some limitations of previous reviews [50]. Moreover, this review was carried out from the perspective of HTA, addressing a clear need expressed by HTA agencies [16].

Future research should aim to identify which domains and dimensions are relevant at the different stages of the technology life cycle, to establish a standardized set of outcomes for assessing and reporting each domain, and to evaluate the effectiveness and usefulness of the existing methodological frameworks for their different intended users [50,142]. Moreover, future research should determine the specific evaluation criteria that ought to be considered based on the level of risk associated with different types of technologies [51].

Conclusion

Our ScR identified a total of 102 methodological frameworks and tools designed for evaluating dHTs: 19 identified directly through the systematic search and 83 through 4 evidence synthesis documents. Only 19 of all the identified frameworks were developed from the perspective of HTA. These frameworks vary in their assessment items, structure, and specificity, and evidence of their usefulness in practice is scarce.

The thematic analysis of the 26 studies that met the inclusion criteria led to the identification and definition of 12 domains, 38 dimensions, and 11 subdimensions that should be considered when evaluating dHTs. Building on our results, a methodological framework for dHTA was proposed.

Acknowledgments

We acknowledge Benigno Rosón Calvo (Servicio Gallego de Salud [SERGAS]), Carme Carrion (Universitat Oberta de Catalunya [UOC]), Carlos A Molina Carrón (Dirección General de Salud Digital y Sistemas de Información para el SNS. Ministerio de Sanidad, Gobierno de España), Carme Pratdepadua (Fundació Tic Salut i Social [FTSS]), Celia Muñoz (Instituto Aragonés de Ciencias de la Salud [IACS]), David Pijoan (Biocat, BioRegió de Catalunya), Felip Miralles (Eurecat – Centre Tecnològic de Catalunya), Iñaki Guiterrez Ibarluzea (Osasun Teknologien Ebaluazioko Zerbitzua [Osteba]), Janet Puñal Riobóo (Unidad de Asesoramiento Científico-técnico [avalia-t], Agencia Gallega para la Gestión del Conocimiento en Salud [ACIS]), Jordi Piera-Jiménez (Àrea de Sistemes d’Informació del Servei Català de la Salut [CatSalut]), Juan Antonio Blasco (Evaluación de Tecnologías Sanitarias de Andalucía [AETSA]), Liliana Arroyo Moliner (Direcció General de Societat Digital, Departament d’Empresa i Treball de la Generalitat de Catalunya), Lilisbeth Perestelo-Perez (Servicio de Evaluación del Servicio Canario de la Salud [SESCS]), Lucía Prieto Remón (IACS), Marifé Lapeña (Dirección General de Salud Digital y Sistemas de Información para el SNS. Ministerio de Sanidad, Gobierno de España), Mario Cárdaba (Insituto de Salud Carlos III [ISCIII]), Montserrat Daban (Biocat, BioRegió de Catalunya), Montserrat Moharra Frances (Agència de Qualitat i Avaluació Sanitàries de Catalunya), and Oscar Solans (CatSalut) for reviewing the protocol of this scoping review (ScR) and the ScR.

This research was framed within the budget of the work plan of the Spanish Network of Health Technology Assessment Agencies, commissioned by the General Directorate of Common Portfolio of Services of the National Health System and Pharmacy.

Authors' Contributions

JS-F and CM-P were responsible for conceptualization, methodology, formal analysis, investigation, data curation, writing—original draft, and visualization. RP-P handled conceptualization, methodology, formal analysis, investigation, resources, and writing—original draft. RMV-H handled conceptualization, writing—review and editing, supervision, and project administration.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist [21].

DOCX File , 25 KB

Multimedia Appendix 2

Search strategies for each database.

DOCX File , 34 KB

Multimedia Appendix 3

References excluded at the full-text screening stage.

DOCX File , 20 KB

Multimedia Appendix 4

Methodological frameworks included in systematic reviews.

DOCX File , 41 KB

Multimedia Appendix 5

Network of relationships among the codes.

PNG File , 399 KB

Multimedia Appendix 6

High-resolution image of Figure 2.

PNG File , 121 KB

  1. Avanzas P, Pascual I, Moris C. The great challenge of the public health system in Spain. J Thorac Dis. May 2017;9(Suppl 6):S430-S433. [FREE Full text] [CrossRef] [Medline]
  2. Grubanov BS, Ghio D, Goujon A, Kalantaryan S, Belmonte M, Scipioni M. Health and long-term care workforce: demographic challenges and the potential contribution of migration and digital technology. Luxembourg. European Commission; 2021.
  3. World Health Organization (WHO). Health 2020: a European policy framework and strategy for the 21st century. Denmark. WHO; 2013.
  4. World Health Organization (WHO). WHO guideline: recommendations on digital interventions for health system strengthening. Geneva. WHO; Jun 6, 2019.
  5. Scholz N. Addressing health inequalities in the European Union. Luxembourg. European Parliamentary Research Service; 2020.
  6. Kickbusch I, Agrawal A, Jack A, Lee N, Horton R. Governing health futures 2030: growing up in a digital world—a joint The Lancet and Financial Times Commission. Lancet. Oct 2019;394(10206):1309. [CrossRef]
  7. Reeves JJ, Ayers JW, Longhurst CA. Telehealth in the COVID-19 era: a balancing act to avoid harm. J Med Internet Res. Feb 01, 2021;23(2):e24785. [FREE Full text] [CrossRef] [Medline]
  8. Richardson S, Lawrence K, Schoenthaler AM, Mann D. A framework for digital health equity. NPJ Digit Med. Aug 18, 2022;5(1):119. [FREE Full text] [CrossRef] [Medline]
  9. European Union. State of health in the EU: companion report 2021. Luxembourg. European Union; 2022.
  10. LATITUD: Anàlisi comparativa de models d'atenció no presencial en l'àmbit de la salut. TIC Salut Social. 2020. URL: https://ticsalutsocial.cat/wp-content/uploads/2021/07/latitud.pdf [accessed 2024-03-28]
  11. Global strategy on digital health 2020-2025. World Health Organization. Aug 18, 2021. URL: https://www.who.int/publications/i/item/9789240020924 [accessed 2024-03-28]
  12. Ming J, He Y, Yang Y, Hu M, Zhao X, Liu J, et al. Health technology assessment of medical devices: current landscape, challenges, and a way forward. Cost Eff Resour Alloc. Oct 05, 2022;20(1):54. [FREE Full text] [CrossRef] [Medline]
  13. O'Rourke B, Oortwijn W, Schuller T. The new definition of health technology assessment: a milestone in international collaboration. Int J Technol Assess Health Care. May 13, 2020;36(3):187-190. [CrossRef]
  14. EUnetHTA Joint Action 2 Work Package 8. HTA Core Model ® version 3.0 2016. EUnetHTA. 2016. URL: https://www.eunethta.eu/wp-content/uploads/2018/03/HTACoreModel3.0-1.pdf [accessed 2024-03-28]
  15. Puñal-Riobóo J, Baños, Varela LL, Castillo MM, Atienza MG, Ubago PR. Guía para la elaboración y adaptación de informes rápidos. Santiago de Compostela, Madrid. Agencia Gallega para la Gestión del Conocimiento en Salud. Unidad de Asesoramiento Científico-técnico, avalia-t. Ministerio de Sanidad, Servicios Sociales e Igualdad; 2016.
  16. Haverinen J, Keränen N, Falkenbach P, Maijala A, Kolehmainen T, Reponen J. Digi-HTA: health technology assessment framework for digital healthcare services. FinJeHeW. Nov 02, 2019;11(4):326-341. [CrossRef]
  17. Hussain MS, Silvera-Tawil D, Farr-Wharton G. Technology assessment framework for precision health applications. Int J Technol Assess Health Care. May 26, 2021;37(1):e67. [CrossRef]
  18. National Institute for Health and Care Excellence (NICE). Evidence standards framework (ESF) for digital health technologies. London. NICE; 2022.
  19. Evidence standards framework (ESF) for digital health technologies. National Institute for Health and Care Excellence. URL: https://www.nice.org.uk/about/what-we-do/our-programmes/evidence-standards-framework-for-digital-health-technologies [accessed 2024-03-28]
  20. Segur-Ferrer J, Moltó-Puigmartí C, Pastells-Peiró R, Vivanco-Hidalgo RM. Methodological frameworks and dimensions to be taken into consideration in digital health technology assessment: protocol for a scoping review. JMIR Res Protoc. Oct 11, 2022;11(10):e39905. [FREE Full text] [CrossRef] [Medline]
  21. Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. Oct 02, 2018;169(7):467-473. [FREE Full text] [CrossRef] [Medline]
  22. Peters M, Marnie C, Tricco A, Pollock D, Munn Z, Alexander L. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth. 2020;18(10):2119-2126. [CrossRef]
  23. Pollock D, Davies EL, Peters MDJ, Tricco AC, Alexander L, McInerney P, et al. Undertaking a scoping review: a practical guide for nursing and midwifery students, clinicians, researchers, and academics. J Adv Nurs. Apr 04, 2021;77(4):2102-2113. [FREE Full text] [CrossRef] [Medline]
  24. Ayiku L, Hudson T, Glover S, Walsh N, Adams R, Deane J, et al. The NICE MEDLINE and Embase (Ovid) health apps search filters: development of validated filters to retrieve evidence about health apps. Int J Technol Assess Health Care. Oct 27, 2020;37(1):e16. [CrossRef]
  25. McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. Jul 2016;75:40-46. [FREE Full text] [CrossRef] [Medline]
  26. Bramer WM, Rethlefsen ML, Kleijnen J, Franco OH. Optimal database combinations for literature searches in systematic reviews: a prospective exploratory study. Syst Rev. Dec 06, 2017;6(1):245. [FREE Full text] [CrossRef] [Medline]
  27. Lockwood C, Munn Z, Porritt K. Qualitative research synthesis: methodological guidance for systematic reviewers utilizing meta-aggregation. Int J Evid Based Healthc. 2015;13(3):179-187. [CrossRef]
  28. Systematic reviews - research guide. Defining your review question. Murdoch University. 2023. URL: https://libguides.murdoch.edu.au/systematic [accessed 2024-03-28]
  29. Scientific Software Development GmbH. Atlas.ti qualitative data analysis. 22.0 ed. Berlin. Scientific Software Development GmbH; 2021.
  30. Higgins J, Thomas J, Chandler J, Cumpston M, Li T, Page M, et al. Cochrane handbook for systematic reviews of interventions version 6.3. London. Cochrane; 2022.
  31. Thomas J, Harden A. Methods for the thematic synthesis of qualitative research in systematic reviews. BMC Med Res Methodol. Jul 10, 2008;8(1):45. [FREE Full text] [CrossRef] [Medline]
  32. Page M, McKenzie J, Bossuyt P, Boutron I, Hoffmann T, Mulrow C, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. PLoS Med. Mar 2021;18(3):e1003583. [FREE Full text] [CrossRef] [Medline]
  33. Kidholm K, Ekeland AG, Jensen LK, Rasmussen J, Pedersen CD, Bowes A, et al. A model for assessment of telemedicine applications: MAST. Int J Technol Assess Health Care. Jan 23, 2012;28(1):44-51. [CrossRef]
  34. Nepal S, Li J, Jang-Jaccard J, Alem L. A framework for telehealth program evaluation. Telemed J E Health. Apr 2014;20(4):393-404. [CrossRef] [Medline]
  35. Measuring the impact of telehealth and telecare: SCTT toolkit. Glasgow. SCTT; 2013.
  36. Caulfield B, Reginatto B, Slevin P. Not all sensors are created equal: a framework for evaluating human performance measurement technologies. NPJ Digit Med. Feb 14, 2019;2(1):7. [FREE Full text] [CrossRef] [Medline]
  37. Lewis TL, Wyatt JC. mHealth and mobile medical apps: a framework to assess risk and promote safer use. J Med Internet Res. Sep 15, 2014;16(9):e210. [FREE Full text] [CrossRef] [Medline]
  38. Moshi MR, Tooher R, Merlin T. Development of a health technology assessment module for evaluating mobile medical applications. Int J Technol Assess Health Care. May 18, 2020;36(3):252-261. [CrossRef]
  39. Puigdomènech E, Poses-Ferrer E, Espallargues CM, Blasco AJ, Varela LL, Paz VL. Evaluación de tecnología basada en mSalud para aplicaciones móviles. Madrid. Ministerio de Sanidad; 2020.
  40. Henson P, David G, Albright K, Torous J. Deriving a practical framework for the evaluation of health apps. Lancet Digit Health. Jun 2019;1(2):e52-e54. [CrossRef]
  41. Reddy S, Rogers W, Makinen V, Coiera E, Brown P, Wenzel M, et al. Evaluation framework to guide implementation of AI systems into healthcare settings. BMJ Health Care Inform. Oct 12, 2021;28(1):e100444. [FREE Full text] [CrossRef] [Medline]
  42. Khoja S, Durrani H, Scott RE, Sajwani A, Piryani U. Conceptual framework for development of comprehensive e-health evaluation tool. Telemed J E Health. Jan 2013;19(1):48-53. [CrossRef] [Medline]
  43. Sockolow P, Bowles K, Rogers M. Health information technology evaluation framework (HITREF) comprehensiveness as assessed in electronic point-of-care documentation systems evaluations. Amsterdam. IOS Press; 2015.
  44. Kowatsch T, Otto L, Harperink S, Cotti A, Schlieter H. A design and evaluation framework for digital health interventions. it - Inf Technol. Nov 2019;61(5):253-263. [CrossRef]
  45. Mathews SC, McShea MJ, Hanley CL, Ravitz A, Labrique AB, Cohen AB. Digital health: a path to validation. NPJ Digit Med. May 13, 2019;2(1):38. [FREE Full text] [CrossRef] [Medline]
  46. Milosevic Z. Ethics in digital health: a deontic accountability framework. 2019. Presented at: IEEE 23rd International Enterprise Distributed Object Computing Conference (EDOC); October 28-31, 2019; Paris, France. [CrossRef]
  47. World Health Organization (WHO). Monitoring and evaluating digital health interventions: a practical guide to conducting research and assessment. Geneva. WHO; 2016.
  48. Enam A, Torres-Bonilla J, Eriksson H. Evidence-based evaluation of eHealth interventions: systematic literature review. J Med Internet Res. Nov 23, 2018;20(11):e10971. [FREE Full text] [CrossRef] [Medline]
  49. Kolasa K, Kozinski G. How to value digital health interventions? A systematic literature review. Int J Environ Res Public Health. Mar 23, 2020;17(6):2119. [FREE Full text] [CrossRef] [Medline]
  50. Vis C, Bührmann L, Riper H, Ossebaard HC. Health technology assessment frameworks for eHealth: a systematic review. Int J Technol Assess Health Care. Apr 16, 2020;36(3):204-216. [CrossRef]
  51. von Huben A, Howell M, Howard K, Carrello J, Norris S. Health technology assessment for digital technologies that manage chronic disease: a systematic review. Int J Technol Assess Health Care. May 26, 2021;37(1):e66. [CrossRef]
  52. Report of the Working Group on mhealth assessment guidelines. European Commission. 2021. URL: https://digital-strategy.ec.europa.eu/en/library/report-working-group-mhealth-assessment-guidelines [accessed 2024-03-28]
  53. Kumar S, Nilsen WJ, Abernethy A, Atienza A, Patrick K, Pavel M, et al. Mobile health technology evaluation: the mhealth evidence workshop. Am J Prev Med. Aug 2013;45(2):228-236. [FREE Full text] [CrossRef] [Medline]
  54. Alami H, Lehoux P, Auclair Y, de Guise M, Gagnon M, Shaw J, et al. Artificial intelligence and health technology assessment: anticipating a new level of complexity. J Med Internet Res. Jul 07, 2020;22(7):e17707. [FREE Full text] [CrossRef] [Medline]
  55. Vokinger KN, Nittas V, Witt CM, Fabrikant SI, von Wyl V. Digital health and the COVID-19 epidemic: an assessment framework for apps from an epidemiological and legal perspective. Swiss Med Wkly. May 04, 2020;150(1920):w20282. [FREE Full text] [CrossRef] [Medline]
  56. Baumel A, Muench F. Heuristic evaluation of ehealth interventions: establishing standards that relate to the therapeutic process perspective. JMIR Ment Health. Jan 13, 2016;3(1):e5. [FREE Full text] [CrossRef] [Medline]
  57. Alfonzo A, Huerta M, Wong S, Passariello G, Diaz M, La CA, et al. Design of a methodology for assessing an electrocardiographic telemonitoring system. Annu Int Conf IEEE Eng Med Biol Soc. 2007;2007:3729-3732. [CrossRef]
  58. Bashshur R, Shannon G, Sapci H. Telemedicine evaluation. Telemed J E Health. Jun 2005;11(3):296-316. [FREE Full text] [CrossRef] [Medline]
  59. Beintner I, Vollert B, Zarski A, Bolinski F, Musiat P, Görlich D, et al. Adherence reporting in randomized controlled trials examining manualized multisession online interventions: systematic review of practices and proposal for reporting standards. J Med Internet Res. Aug 15, 2019;21(8):e14181. [FREE Full text] [CrossRef] [Medline]
  60. Brear M. Evaluating telemedicine: lessons and challenges. Health Inf Manag. Jul 21, 2006;35(2):23-31. [CrossRef] [Medline]
  61. DeChant HK, Tohme WG, Mun SK, Hayes WS, Schulman KA. Health systems evaluation of telemedicine: a staged approach. Telemed J. Jan 1996;2(4):303-312. [CrossRef] [Medline]
  62. Giansanti D, Morelli S, Macellari V. Telemedicine technology assessment part II: tools for a quality control system. Telemed J E Health. Apr 2007;13(2):130-140. [CrossRef] [Medline]
  63. Giansanti D, Morelli S, Macellari V. Telemedicine technology assessment part I: setup and validation of a quality control system. Telemed J E Health. Apr 2007;13(2):118-129. [CrossRef] [Medline]
  64. Grigsby J, Brega AG, Devore PA. The evaluation of telemedicine and health services research. Telemed J E Health. Jun 2005;11(3):317-328. [CrossRef] [Medline]
  65. Hailey D, Jacobs P, Simpson J, Doze S. An assessment framework for telemedicine applications. J Telemed Telecare. Jun 23, 1999;5(3):162-170. [CrossRef] [Medline]
  66. Ohinmaa A, Hailey D, Roine R. Elements for assessment of telemedicine applications. Int J Technol Assess Health Care. Jun 30, 2001;17(2):190-202. [CrossRef] [Medline]
  67. Rajan B, Tezcan T, Seidmann A. Service systems with heterogeneous customers: investigating the effect of telemedicine on chronic care. Manag Sci. Mar 2019;65(3):1236-1267. [CrossRef]
  68. Sisk JE, Sanders JH. A proposed framework for economic evaluation of telemedicine. Telemed J. Jan 1998;4(1):31-37. [CrossRef] [Medline]
  69. Zissman K, Lejbkowicz I, Miller A. Telemedicine for multiple sclerosis patients: assessment using Health Value Compass. Mult Scler. Apr 30, 2012;18(4):472-480. [CrossRef] [Medline]
  70. Grustam AS, Vrijhoef HJM, Koymans R, Hukal P, Severens JL. Assessment of a business-to-consumer (B2C) model for telemonitoring patients with chronic heart failure (CHF). BMC Med Inform Decis Mak. Oct 11, 2017;17(1):145. [FREE Full text] [CrossRef] [Medline]
  71. Hebert M. Telehealth success: evaluation framework development. Stud Health Technol Inform. 2001;84(Pt 2):1145-1149. [Medline]
  72. Rojahn K, Laplante S, Sloand J, Main C, Ibrahim A, Wild J, et al. Remote monitoring of chronic diseases: a landscape assessment of policies in four European countries. PLoS One. May 19, 2016;11(5):e0155738. [FREE Full text] [CrossRef] [Medline]
  73. Agarwal S, LeFevre AE, Lee J, L'Engle K, Mehl G, Sinha C, et al. WHO mHealth Technical Evidence Review Group. Guidelines for reporting of health interventions using mobile phones: mobile health (mHealth) evidence reporting and assessment (mERA) checklist. BMJ. Mar 17, 2016;352:i1174. [CrossRef] [Medline]
  74. Safety and quality strategy in mobile health apps. Complete list of recommendations on design, use and assessment of health apps. Agencia de Calidad Sanitaria de Andalucía. 2012. URL: http://www.calidadappsalud.com/en/listado-completo-recomendaciones-app-salud/ [accessed 2023-07-17]
  75. The app evaluation model. American Psychiatric Association Initiative. 2022. URL: https://www.psychiatry.org/psychiatrists/practice/mental-health-apps/the-app-evaluation-model [accessed 2024-03-28]
  76. GGD AppStore. Association of Regional Public Health Services (GGD) and Regional Medical Emergency Preparedness and Planning (GHOR). 2016. URL: https://tinyurl.com/58th4p4w [accessed 2023-07-18]
  77. AppQ: quality criteria core set for more quality transparency in digital health applications. Bertelsmann Stiftung. Oct 29, 2019. URL: https://www.bertelsmann-stiftung.de/de/publikationen/publikation/did/appq [accessed 2024-03-28]
  78. Bradway M, Carrion C, Vallespin B, Saadatfard O, Puigdomènech E, Espallargues M, et al. mHealth assessment: conceptualization of a global framework. JMIR Mhealth Uhealth. May 02, 2017;5(5):e60. [FREE Full text] [CrossRef] [Medline]
  79. PAS 277:2015 - health and wellness apps. Quality criteria across the life cycle. British Standards Institution. URL: https://knowledge.bsigroup.com/products/health-and-wellness-apps-quality-criteria-across-the-life-cycle-code-of-practice/standard [accessed 2023-07-18]
  80. MindApps. Centre for Telepsychiatry in the Region of Southern Denmark. 2017. URL: https://mindapps.dk/ [accessed 2023-07-18]
  81. Dick S, O'Connor Y, Thompson MJ, O'Donoghue J, Hardy V, Wu TJ, et al. Considerations for improved mobile health evaluation: retrospective qualitative investigation. JMIR Mhealth Uhealth. Jan 22, 2020;8(1):e12424. [FREE Full text] [CrossRef] [Medline]
  82. Federal Institute for Drugs and Medical Devices (BfArM). The fast-track process for digital health applications (DiGA) according to Section 139e SGB V. A guide for manufacturers, service providers and users. Bonn, Germany. Federal Institute for Drugs and Medical Devices; 2020.
  83. Gorski I, Bram JT, Sutermaster S, Eckman M, Mehta K. Value propositions of mHealth projects. J Med Eng Technol. Aug 12, 2016;40(7-8):400-421. [CrossRef] [Medline]
  84. Hogaboam L, Daim T. Technology adoption potential of medical devices: the case of wearable sensor products for pervasive care in neurosurgery and orthopedics. Health Policy Technol. Dec 2018;7(4):409-419. [CrossRef]
  85. Huckvale K, Torous J, Larsen ME. Assessment of the data sharing and privacy practices of smartphone apps for depression and smoking cessation. JAMA Netw Open. Apr 05, 2019;2(4):e192542. [FREE Full text] [CrossRef] [Medline]
  86. Maar MA, Yeates K, Perkins N, Boesch L, Hua-Stewart D, Liu P, et al. A framework for the study of complex mHealth interventions in diverse cultural settings. JMIR Mhealth Uhealth. Apr 20, 2017;5(4):e47. [FREE Full text] [CrossRef] [Medline]
  87. McMillan B, Hickey E, Patel MG, Mitchell C. Quality assessment of a sample of mobile app-based health behavior change interventions using a tool based on the National Institute of Health and Care Excellence behavior change guidance. Patient Educ Couns. Mar 2016;99(3):429-435. [FREE Full text] [CrossRef] [Medline]
  88. Health and wellness apps: new international guidelines to help sort the best from the rest. NEN. URL: https://www.nen.nl/en/health-and-welness-apps [accessed 2024-03-28]
  89. Continua design guidelines. Personal Connected Health Alliance. 2019. URL: https://www.pchalliance.org/continua-design-guidelines [accessed 2023-07-18]
  90. Philpott D, Guergachi A, Keshavjee K. Design and validation of a platform to evaluate mHealth apps. Stud Health Technol Inform. 2017;235:3-7. [Medline]
  91. Ruck A, Wagner BS, Lowe C. Second draft of guidelines. EU guidelines on assessment of the reliability of mobile health applications. European Commission, Directorate-General of Communications Networks, Content & Technology. Luxembourg; 2016.
  92. Sax M, Helberger N, Bol N. Health as a means towards profitable ends: mHealth apps, user autonomy, and unfair commercial practices. J Consum Policy. May 22, 2018;41(2):103-134. [CrossRef]
  93. Wyatt JC. How can clinicians, specialty societies and others evaluate and improve the quality of apps for patient use? BMC Med. Dec 03, 2018;16(1):225. [FREE Full text] [CrossRef] [Medline]
  94. IRBs could address ethical issues related to tracking devices: mobile devices raise new concerns. IRB Advisor. Nov 1, 2017. URL: https://www.reliasmedia.com/articles/141589-irbs-could-address-ethical-issues-related-to-tracking-devices [accessed 2024-03-28]
  95. App check. Zentrum für Telematik und Telemedizin. URL: https://ztg-nrw.de/ [accessed 2024-03-28]
  96. Bergmo TS. How to measure costs and benefits of eHealth interventions: an overview of methods and frameworks. J Med Internet Res. Nov 09, 2015;17(11):e254. [FREE Full text] [CrossRef] [Medline]
  97. Eysenbach G, CONSORT-EHEALTH Group. CONSORT-EHEALTH: improving and standardizing evaluation reports of web-based and mobile health interventions. J Med Internet Res. Dec 31, 2011;13(4):e126. [FREE Full text] [CrossRef] [Medline]
  98. Shaw NT. 'CHEATS': a generic information communication technology (ICT) evaluation framework. Comput Biol Med. May 2002;32(3):209-220. [CrossRef] [Medline]
  99. Brown M, Shaw N. Evaluation practices of a major Canadian telehealth provider: lessons and future directions for the field. Telemed J E Health. Oct 2008;14(8):769-774. [CrossRef] [Medline]
  100. Casper GR, Kenron DA. A framework for technology assessment: approaches for the selection of a home technology device. Clin Nurse Spec. 2005;19(4):170-174. [CrossRef] [Medline]
  101. Sittig D, Kahol K, Singh H. Sociotechnical evaluation of the safety and effectiveness of point-of-care mobile computing devices: a case study conducted in India. Stud Health Technol Inform. 2013;192:515-519. [CrossRef]
  102. Haute Autorité de Santé. Good practice guidelines on health apps and smart devices (mobile health or mHealth). Paris. Haute Autorité de Santé; Nov 7, 2016.
  103. Health Information and Quality Authority. International review of consent models for the collection, use and sharing of health information. Dublin. Health Information and Quality Authority; 2020.
  104. Jurkeviciute M. Planning of a holistic summative eHealth evaluation: the interplay between standards and reality. PhD thesis. Gothenburg. Chalmers University of Technology; 2018.
  105. Vimarlund V, Davoody N, Koch S. Steps to consider for effective decision making when selecting and prioritizing eHealth services. Stud Health Technol Inform. 2013;192:239-243. [Medline]
  106. Currie WL. TEMPEST: an integrative model for health technology assessment. Health Policy Technol. Mar 2012;1(1):35-49. [CrossRef]
  107. Eivazzadeh S, Anderberg P, Larsson TC, Fricker SA, Berglund J. Evaluating health information systems using ontologies. JMIR Med Inform. Jun 16, 2016;4(2):e20. [FREE Full text] [CrossRef] [Medline]
  108. Our data-driven future in healthcare: people and partnerships at the heart of health related technologies. Academy of Medical Sciences. Nov 2018. URL: https://acmedsci.ac.uk/file-download/74634438 [accessed 2024-03-28]
  109. Australian Commission on Safety and Quality in Healthcare. National safety and quality digital mental health standards - consultation draft. Australia. Australian Commission on Safety and Quality in Healthcare; 2020.
  110. A guide to good practice for digital and data-driven health technologies. London. Department of Health & Social Care; 2021.
  111. Drury P, Roth S, Jones T, Stahl M, Medeiros D. Guidance for investing in digital health. In: Sustainable Development Working Papers. Mandaluyong, the Philippines. Asian Development Bank; May 2018.
  112. European Commission. Synopsis report. Consultation: transformation health and care in the digital single market. Luxembourg. European Commission; 2018.
  113. Federal Ministry of Health. Regulation on the procedure and requirements for testing the eligibility for reimbursement of digital health applications in the statutory public health insurance (Digital Health Applications Ordinance - DiGAV). Germany. Federal Ministry of Health; 2020.
  114. Haute Autorité de Santé. Guide to the specific features of clinical evaluation of a connected medical device (CMD) in view of its application for reimbursement. Paris. Haute Autorité de Santé; 2019.
  115. How we assess health apps and digital tools. NHS Digital. 2019. URL: https://digital.nhs.uk/services/nhs-apps-library/guidance-forhealth-app-developers-commissioners-and-assessors/how-we-assess-healthapps-and-digital-tools [accessed 2023-07-17]
  116. Lennon MR, Bouamrane M, Devlin AM, O'Connor S, O'Donnell C, Chetty U, et al. Readiness for delivering digital health at scale: lessons from a longitudinal qualitative evaluation of a national digital health innovation program in the United Kingdom. J Med Internet Res. Feb 16, 2017;19(2):e42. [FREE Full text] [CrossRef] [Medline]
  117. McNamee P, Murray E, Kelly MP, Bojke L, Chilcott J, Fischer A, et al. Designing and undertaking a health economics study of digital health interventions. Am J Prev Med. Nov 2016;51(5):852-860. [FREE Full text] [CrossRef] [Medline]
  118. Haute Autorité de Santé. Medical Device and Health Technology Evaluation Committee (CNEDiMTS). Paris. Haute Autorité de Santé; 2019.
  119. Draft guidelines for preparing assessment reports for the Medical Services Advisory Committee: draft version 4.0. Australian Government Department of Health and Aged Care. Aug 2020. URL: https://consultations.health.gov.au/technology-assessment-access-division/msac-guidelines-review-consultation/user_uploads/draft-msac-guidelines---clean-version---28-august-2020-3.pdf [accessed 2024-03-28]
  120. Haute Autorité de Santé. Methodological choices for the clinical development of medical devices. Paris. Haute Autorité de Santé; 2013.
  121. Michie S, Yardley L, West R, Patrick K, Greaves F. Developing and evaluating digital interventions to promote behavior change in health and health care: recommendations resulting from an international workshop. J Med Internet Res. Jun 29, 2017;19(6):e232. [FREE Full text] [CrossRef] [Medline]
  122. Mohr DC, Schueller SM, Riley WT, Brown CH, Cuijpers P, Duan N, et al. Trials of intervention principles: evaluation methods for evolving behavioral intervention technologies. J Med Internet Res. Jul 08, 2015;17(7):e166. [FREE Full text] [CrossRef] [Medline]
  123. Murray E, Hekler EB, Andersson G, Collins LM, Doherty A, Hollis C, et al. Evaluating digital health interventions: key questions and approaches. Am J Prev Med. Nov 2016;51(5):843-851. [FREE Full text] [CrossRef] [Medline]
  124. Steventon A, Grieve R, Bardsley M. An approach to assess generalizability in comparative effectiveness research: a case study of the whole systems demonstrator cluster randomized trial comparing telehealth with usual care for patients with chronic health conditions. Med Decis Making. May 18, 2015;35(8):1023-1036. [CrossRef]
  125. D2.1 Knowledge Tool 1. Health apps assessment frameworks. European mHealth Hub. 2020. URL: https://mhealth-hub.org/download/d2-1-knowledge-tool-1-health-apps-assessment-frameworks [accessed 2024-03-28]
  126. Moshi MR, Tooher R, Merlin T. Suitability of current evaluation frameworks for use in the health technology assessment of mobile medical applications: a systematic review. Int J Technol Assess Health Care. Sep 11, 2018;34(5):464-475. [CrossRef]
  127. Stensgaard T, Sørensen T. Telemedicine in Greenland — the creation of an evaluation plan. J Telemed Telecare. Jun 22, 2016;7(1_suppl):37-38. [CrossRef]
  128. HL7 Consumer Mobile Health Application Functional Framework (cMHAFF). HL7 International. 2018. URL: http://www.hl7.org/implement/standards/product_brief.cfm?product_id=476 [accessed 2024-03-28]
  129. mHealth. eHealth Suisse, Kompetenz- und Koordinationsstelle von Bund und Kantonen. 2017. URL: https://www.e-health-suisse.ch/gemeinschaften-umsetzung/ehealth-aktivitaeten/mhealth.html [accessed 2023-07-18]
  130. mHealthBelgium. mHealthBELGIUM. URL: https://mhealthbelgium.be/ [accessed 2024-03-28]
  131. Mookherji S, Mehl G, Kaonga N, Mechael P. Unmet need: improving mHealth evaluation rigor to build the evidence base. J Health Commun. Jun 04, 2015;20(10):1224-1229. [CrossRef] [Medline]
  132. MySNS Selecção. Serviços Partilhados do Ministério da Saúde. URL: https://mysns.min-saude.pt/mysns-seleccao-processo-de-avaliacao/ [accessed 2023-07-18]
  133. Nielsen S, Rimpiläinen S. Report on international practice on digital apps. Glasgow. Digital Health & Care Institute; 2018.
  134. Accreditation of digital health solutions is a fundamental foundation for their safe adoption, equipping healthcare providers and practitioners with access to health apps assured to your standards. Organisation for the Review of Care and Health Applications (ORCHA). URL: https://orchahealth.com/services/ [accessed 2023-07-18]
  135. mHealth. TIC Salut Social. 2022. URL: https://ticsalutsocial.cat/projecte/mhealth/ [accessed 2024-03-28]
  136. Wienert J, Jahnel T, Maaß L. What are digital public health interventions? First steps toward a definition and an intervention classification framework. J Med Internet Res. Jun 28, 2022;24(6):e31921. [FREE Full text] [CrossRef] [Medline]
  137. Deniz-Garcia A, Fabelo H, Rodriguez-Almeida AJ, Zamora-Zamorano G, Castro-Fernandez M, Alberiche Ruano MDP, et al. WARIFA Consortium. Quality, usability, and effectiveness of mHealth apps and the role of artificial intelligence: current scenario and challenges. J Med Internet Res. May 04, 2023;25:e44030. [FREE Full text] [CrossRef] [Medline]
  138. World Health Organization (WHO). Classification of digital interventions, services and applications in health: a shared language to describe the uses of digital technology for health, 2nd ed. Geneva. WHO; Oct 24, 2023.
  139. European Parliament, Council of the European Union. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and repealing Council Directives 90/385/EEC and 93/42/EEC. EUR-Lex. 2017. URL: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32017R0745 [accessed 2024-03-28]
  140. European Parliament, Council of the European Union. Regulation (EU) 2017/746 of the European Parliament and of the Council of 5 April 2017 on in vitro diagnostic medical devices and repealing Directive 98/79/EC and Commission Decision 2010/227/EU. EUR-Lex. 2017. URL: https://eur-lex.europa.eu/eli/reg/2017/746/oj [accessed 2024-03-28]
  141. Segur-Ferrer J, Moltó-Puigmartí C, Pastells-Peiró R, Vivanco-Hidalgo R. Marco de evaluación de tecnologías sanitarias: adaptación para la evaluación de tecnologías de salud digital [Health technology assessment framework: adaptation for the assessment of digital health technologies]. Madrid, Barcelona. Ministerio de Sanidad, Agència de Qualitat i Avaluació Sanitàries de Catalunya; 2023.
  142. Benedetto V, Filipe L, Harris C, Spencer J, Hickson C, Clegg A. Analytical frameworks and outcome measures in economic evaluations of digital health interventions: a methodological systematic review. Med Decis Making. Oct 19, 2022;43(1):125-138. [CrossRef]


AI: artificial intelligence
dHT: digital health technology
dHTA: digital health technology assessment
EUnetHTA: European Network for Health Technology Assessment
HTA: health technology assessment
INAHTA: International Network of Agencies for Health Technology Assessment
mHealth: mobile health
NICE: National Institute for Health and Care Excellence
PICo-D: Population, Phenomenon of interest, Context and Design
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews
RedETS: Spanish Network of Agencies for Assessing National Health System Technologies and Performance
ScR: scoping review
SNS: Spanish National Health System
SR: systematic review
WHO: World Health Organization


Edited by T Leung; submitted 03.05.23; peer-reviewed by R Gorantla, KL Mauco, M Aymerich, J Haverinen, M Behzadifar; comments to author 10.11.23; revised version received 01.12.23; accepted 20.02.24; published 10.04.24.

Copyright

©Joan Segur-Ferrer, Carolina Moltó-Puigmartí, Roland Pastells-Peiró, Rosa Maria Vivanco-Hidalgo. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 10.04.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.
