Original Paper
Abstract
Background: The digital transformation in health care requires valid instruments to assess the level of digitalization. Maturity models are widely used to measure the digital maturity level of institutions. To date, however, there is no standardized and empirically grounded measurement tool for general practices in outpatient care.
Objective: This study aimed to identify and validate the key dimensions of digital maturity in order to develop a questionnaire to measure the digital maturity of general practices. The development of a questionnaire was intended to advance research into maturity models in outpatient care and provide general practitioners (GPs) and policymakers with a tool for the digital transformation process.
Methods: A web-based cross-sectional survey study was conducted among GPs in Germany. Based on exploratory factor analysis (EFA), the underlying dimensions of digital maturity were first identified. The factor structure was then examined using confirmatory factor analysis (CFA). After evaluating convergent and discriminant validity, the overall model fit was assessed using fit indices. Following model adjustments based on modification indices, the final questionnaire was established. Finally, we calculated the digital maturity level for our sample, both for each individual dimension and as an overall score, by computing the mean value for each dimension and an overall mean across all dimensions.
Results: Responses from 201 GPs were included in the data analysis. We identified and validated 6 dimensions of digital maturity, comprising 16 items. Both convergent and discriminant validity were confirmed. The model fit was excellent (robust comparative fit index [CFI]=0.993; robust Tucker-Lewis index [TLI]=0.990; robust root mean square error of approximation [RMSEA]=0.022; P value of close fit [PCLOSE]=.98; standardized root mean square residual [SRMR]=0.043). The questionnaire covered 6 dimensions: effects of digitalization, participation of practice staff, maturity of the practice management system, staff competencies and sense of responsibility, IT security and data protection, and digitally supported processes. The scale showed good internal consistency (overall Cronbach α=.809). In our sample, the overall digital maturity averaged 3.77 out of 5, with the highest maturity observed in IT security and data protection (mean 4.45, SD 0.61) and the lowest in effects of digitalization (mean 3.10, SD 1.00).
Conclusions: This is the first study in which the dimensions of digital maturity in outpatient care for GP practices have been empirically identified and validated as the basis for developing a questionnaire. The findings provide a foundation for further research on measuring digital maturity in outpatient care and for advancing the development of digital maturity models. The questionnaire allows GPs to assess their practices’ digitalization, identify areas for improvement (eg, infrastructure, staff skills), and support internal strategy or benchmarking. For policymakers, it offers a standardized tool to plan support measures and monitor digitalization progress.
doi:10.2196/81416
Keywords
Introduction
Background
Factors such as demographic change, the shortage of specialists, and the uneven distribution of general practitioner (GP) practices between rural and urban areas mean that it is often no longer possible to ensure locally available and easily accessible care for patients. However, GPs are often the first point of contact for patients and, in many health care systems, also serve as gatekeepers who coordinate further medical treatment []. The use of digital solutions offers a promising opportunity to effectively address both current and future challenges in outpatient care []. GPs can implement digital apps in situations in which practices face limited resources, such as staffing shortages, or where patients are required to travel long distances. Examples include electronic patient records; telemedicine services, such as video consultations or telemonitoring; and health apps. Administrative processes in GP practices can also be streamlined through intelligent appointment scheduling systems. Overall, digitalization is believed to have the potential to contribute to more efficient and effective health care delivery [,].
Maturity models are increasingly being used to strategically advance digital transformation and to manage various initiatives. For example, in its global strategy on digital health, the World Health Organization emphasizes the importance of assessing the digital maturity of health care systems to support national funding decisions []. Maturity models were therefore developed “to assess the maturity (i.e. competency, capability, level of sophistication) of a selected domain based on a more or less comprehensive set of criteria” []. Standardized questionnaires serve to measure digital maturity, reflecting the current state of digitalization in organizations such as GP practices. The results can be used to evaluate one’s own level of digitalization, benchmark against other practices, and even identify specific areas for investment [-].
So far, however, the understanding of digital maturity in GP practices remains fragmented. Research on digital maturity, its dimensions, and the development of maturity models is still at an early stage []. There are only initial efforts to measure the level of digital maturity in general practice care, such as the measurement model developed by the Westphalia-Lippe Association of Statutory Health Insurance Physicians in Germany [,] and the Gippsland Primary Health Network model in Australia []. Widely used measurement instruments, such as questionnaires for assessing digital maturity in general practices, do not currently exist. This stands in stark contrast to inpatient care, where the Electronic Medical Record Adoption Model (EMRAM), developed by the Healthcare Information and Management Systems Society (HIMSS), serves as a widely adopted international maturity model []. In Germany, the Hospital Future Act (KHZG) established DigitalRadar as a national instrument for measuring the digital maturity of hospitals [,]. Moreover, research on the measurement dimensions of digital maturity is significantly more advanced [].
To address the lack of maturity models in GP care, scientifically validated measurement instruments, based on psychometric testing, are needed. This requires the empirical validation of both dimensions and items, a step that is often criticized as missing in the development of maturity models []. Currently, however, such validation is absent from the few existing scientific publications on measuring digital maturity in GP practices [-]. Instead, frameworks or dimensions from other areas of health care are often applied.
Against this background, the authors took an initial step in foundational research by identifying 6 dimensions and 26 subcategories of digital maturity for GP practices as part of a qualitative research project [].
Objective
The aim of this study was to validate the dimensions of digital maturity and develop a corresponding instrument to measure the level of digitalization in GP practices. The research question guiding our work was, “Which dimensions of digital maturity in GP practices can be empirically identified and validated to develop a reliable and valid measurement instrument?”
The overarching goal of this study was to advance the research and development of maturity models in outpatient care in order to assess the digital transformation of GP practices and support them throughout the transformation process. Our study built directly on our previous research on the digital maturity of GP practices [,].
Methods
Study Design
To answer the research question, we conducted a cross-sectional online survey among GPs in Germany. The survey design followed the CHERRIES (Checklist for Reporting Results of Internet E-Surveys) []; see for the completed CHERRIES and for the questionnaire in English and German. The survey was open and administered using the free online survey platform LimeSurvey []. Participants were therefore self-selected rather than drawn from a random sample, constituting a convenience sample.
Ethical Considerations
Ethical approval was granted by the Ethics Committee of Witten/Herdecke University (application number S-47/2023). Participants were required to provide informed consent by agreeing to the data protection policy before taking part in the survey. They were also informed about all relevant aspects of the study, including the survey duration, data storage period and location, the researchers responsible, and the purpose of the study.
Structure of the Questionnaire
The section of the questionnaire relevant to this study consisted of 6 pages and included a total of 43 items. The number of items per page ranged from 3 to 15. Completion took approximately 10-15 minutes. If participants did not fully complete a page, the survey application displayed a notification. Participants also had the option to review and change their responses using the Back button. Where applicable, questions were randomized within the survey. Adaptive questioning was not used.
At the beginning of the survey, demographic characteristics were collected, including gender, age, type of practice, and population size of the practice location. In addition, participants were asked to assess the average digital health literacy of their patients on a scale from 0 to 5 and the digitalization level of their practice on a scale from 0 to 10. This was followed by 37 items covering the 5 dimensions of digital maturity, each rated on a 5-point Likert scale from 1 (strongly disagree) to 5 (strongly agree). The data collected from these items served as the basis for identifying and validating the dimensions and for developing the questionnaire.
Item Construction and Pretest
The starting point was an item repository developed based on findings from previous expert interviews on the dimensions of digital maturity in GP practices []. These dimensions included “digitally supported processes,” “skills and competencies of the practice team,” “organizational structures and rules,” “technical infrastructure,” and “effects of digitalization.” In the absence of existing measurement instruments, either new items were developed or items were adopted from comparable studies. For instance, items related to the “skills and competencies of the practice team” dimension were adapted from the German TAEG questionnaire, which measures technological affinity []. Similarly, items in the “technical infrastructure” dimension were informed by the System Usability Scale []. Item development was carried out by an expert panel in collaboration with representatives of the medical profession and medical interest groups in order to incorporate both theoretical and practice-oriented perspectives.
Subsequently, a pretest was conducted in accordance with the recommendations of Schnell et al [], involving 20 participants. Half of the participants were GPs, while the other half consisted of specialists and experts in the field of digital health with extensive professional experience. The pretest followed the “frame of reference probing” methodology []. Based on the pretest results, we drew initial conclusions regarding content validity, reliability, and item difficulty and distribution. After the pretest was completed, revisions were made to the introductory text, the individual questions, and the response options. The final items and the abbreviations used can be found in .
Recruiting
Participant recruitment was conducted between October 2024 and January 2025. Participants were recruited through multiple channels. The majority were contacted via GP associations, institutes of general practice, general practice research networks, teaching practices, and mailing lists directed at GP practices. Additionally, GPs were approached through Associations of Statutory Health Insurance Physicians, medical chambers, physician networks, and personal contacts. Third, social media platforms, such as LinkedIn (Microsoft), were also used. Finally, the snowball sampling method was used: on the final page of the survey, participants were encouraged to share the survey with other eligible colleagues from their networks. We typically contacted multipliers—such as physician networks—initially via email or over the phone. These individuals then shared the survey link through internal communication channels and announcements. The survey was promoted primarily through online means, including emails, newsletters, and websites. Participation was voluntary, and no incentives were provided by the authors.
Data Cleaning
The data were cleaned following recommendations from the literature [,] with the help of SPSS software (IBM Corp) []. The stepwise data-cleaning process is presented in .
| Exclusion criterion | Result after data cleaning | Value, n (%) |
| —a | Clicked the survey link and consented to the data policy | 375 (100.0) |
| No items completed | Participated in the survey (at least one item completed) | 321 (85.6) |
| >1 item page not completed | Participated with at least one fully completed item page | 230 (61.3) |
| Further missing responses | Fully completed the survey | 212 (56.5) |
| Careless response behavior | Careful response behavior confirmed | 202 (53.9) |
| Illogical response behavior | Logical response behavior confirmed | 201 (53.6) |
| Violation of ≥5-fold k-anonymity | K-anonymity requirements met | 201 (53.6) |
aNot applicable.
Due to our settings in the LimeSurvey app, data were collected only if participants clicked the survey link and provided informed consent by agreeing to the data protection policy (N=375). In the first step, we systematically checked the data sets for completeness. A total of 230 (61.3%) participants completed at least one item page. After excluding 18 (4.8%) participants due to isolated missing values, 212 (56.5%) participants remained with fully completed data sets. Given the minimal loss of information, we decided not to apply any imputation procedures. In the second step, we excluded data sets that indicated careless or illogical response behavior (n=11, 2.9%). Finally, we checked compliance with k-anonymity requirements: data sets whose combinations of demographic characteristics occurred fewer than 5 times in the sample were excluded. In total, of the 375 individuals who accessed the survey, 201 (53.6%) responses were included in the final data analysis.
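The exclusion cascade described above can be sketched as follows. This is an illustrative simplification, not the authors' SPSS workflow: the record fields `items` and `demo` are hypothetical, and careless responding is approximated here by straightlining (zero variance across items), which is only one of several possible indicators.

```python
from collections import Counter

def clean(records, k=5):
    """Apply the exclusion steps in order; return (kept_records, log)."""
    log = [("consented", len(records))]

    # Step 1: participated (at least one item completed)
    records = [r for r in records if any(v is not None for v in r["items"])]
    log.append(("participated", len(records)))

    # Step 2: fully completed (no missing item responses)
    records = [r for r in records if all(v is not None for v in r["items"])]
    log.append(("complete", len(records)))

    # Step 3: careless responding, approximated here by straightlining
    # (identical answers to every item)
    records = [r for r in records if len(set(r["items"])) > 1]
    log.append(("careful", len(records)))

    # Step 4: k-anonymity; keep only respondents whose demographic
    # profile is shared by at least k respondents
    counts = Counter(tuple(r["demo"]) for r in records)
    records = [r for r in records if counts[tuple(r["demo"])] >= k]
    log.append(("k_anonymous", len(records)))
    return records, log
```

The log mirrors the structure of the data-cleaning table: each step records the cumulative number of retained respondents.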
Data Analysis
First, we conducted exploratory factor analysis (EFA) using SPSS [] to identify the latent dimensions of digital maturity. In the second step, we evaluated the factor-analytical model using confirmatory factor analysis (CFA), performed with R version 4.4.3 (R Foundation for Statistical Computing) [] and the lavaan add-on package [].
For EFA, we followed the guidelines of Watkins [] and the recommendations of Samuels []. We began by testing the data for normality using the Kolmogorov-Smirnov test. Since normality could not be confirmed, we chose principal axis factoring as the estimation method, in line with Fabrigar et al []. We also applied oblique (promax) rotation, as correlations between the underlying dimensions were expected []. The number of factors was determined using parallel analysis []. Items with low communalities, low factor loadings, or high cross-loadings were removed. Once the factor model was stabilized, we assessed the total variance explained by the retained factors, the sampling adequacy using the Kaiser-Meyer-Olkin (KMO) measure and Bartlett’s test of sphericity, as well as the average extracted communality of the variables.
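For illustration, the two sampling-adequacy checks named above can be computed directly from the raw data matrix. The authors used SPSS; this numpy/scipy sketch simply follows the standard textbook formulas for Bartlett's test of sphericity and the overall KMO measure.

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(X):
    """Bartlett's test: H0 is that the correlation matrix is an identity matrix."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return chi2, df, stats.chi2.sf(chi2, df)

def kmo(X):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    S = np.linalg.inv(R)
    # anti-image (partial) correlations
    d = np.sqrt(np.outer(np.diag(S), np.diag(S)))
    P = -S / d
    off = ~np.eye(R.shape[0], dtype=bool)
    r2 = (R[off] ** 2).sum()
    p2 = (P[off] ** 2).sum()
    return r2 / (r2 + p2)
```

A KMO value above 0.8, as reported in this study (0.842), indicates that the correlations are largely explained by common factors rather than pairwise idiosyncrasies.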
For CFA, we followed the reporting framework proposed by Gaskin et al []. Due to the nonnormal distribution of the data, we used the robust maximum likelihood (MLR) estimation method. To assess validity, we examined both convergent and discriminant validity at the factor level. Convergent validity was assessed using the average variance extracted (AVE) []. Discriminant validity was evaluated using the Fornell-Larcker criterion [] and the recommendations of Rönkkö and Cho [].
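The two validity criteria can be made concrete with a small sketch (hypothetical helper functions, not the authors' R/lavaan code): AVE is the mean of the squared standardized loadings of a factor, and the Fornell-Larcker criterion requires the square root of each factor's AVE to exceed that factor's highest correlation with any other factor.

```python
import numpy as np

def ave(loadings):
    """Average variance extracted from the standardized loadings of one factor."""
    lam = np.asarray(loadings, dtype=float)
    return float(np.mean(lam ** 2))

def fornell_larcker(aves, phi):
    """For each factor i, check sqrt(AVE_i) > |correlation| with every other factor.

    aves: list of AVE values; phi: factor correlation matrix.
    """
    root = np.sqrt(np.asarray(aves, dtype=float))
    phi = np.asarray(phi, dtype=float)
    k = len(root)
    ok = []
    for i in range(k):
        others = [abs(phi[i, j]) for j in range(k) if j != i]
        ok.append(bool(root[i] > max(others)))
    return ok
```

Applied to factor 1 of this study, AVE=0.698 gives √AVE=0.835, which exceeds its highest inter-factor correlation of 0.640, so both criteria are met for that factor.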
At the model level, we assessed the model fit using the robust comparative fit index (CFI), the robust Tucker-Lewis index (TLI), the robust root mean square error of approximation (RMSEA), the P value of close fit (PCLOSE), and the standardized root mean square residual (SRMR), given the small sample size and the nonnormality of the data. Modification indices were also analyzed to identify opportunities for improving the model fit. After finalizing the questionnaire, we evaluated the reliability of the scale as a whole and for each dimension using the Cronbach α coefficient []. Next, we calculated the digital maturity level for our sample, both for each individual dimension and as an overall score. For this purpose, we computed the mean value for each dimension as well as an overall mean across all dimensions.
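The Cronbach α coefficient used for the reliability assessment can be computed from the item variances and the variance of the sum score. A minimal sketch (not the authors' implementation):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach alpha for an n x k matrix of item responses (rows = respondents)."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()   # sum of individual item variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of the sum score
    return k / (k - 1) * (1 - item_var / total_var)
```

Values near 1 indicate that the items vary together; the overall value of .809 reported below reflects good internal consistency for the 16-item scale.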
Results
Demographic Characteristics
The demographic characteristics of the study participants are summarized in .
| Characteristics | Value | |
| Sex, n (%) | ||
| Male | 124 (61.7) | |
| Female | 77 (38.3) | |
| Age (years), n (%) | ||
| 20-40 | 22 (10.9) | |
| 41-60 | 134 (66.7) | |
| >60 | 45 (22.4) | |
| Type of practice, n (%) | ||
| Solo practice | 92 (45.8) | |
| Shared practice | 24 (11.9) | |
| Group practice | 66 (32.8) | |
| Medical care center | 19 (9.5) | |
| Practice location (population size), n (%) | ||
| <5000 inhabitants | 42 (20.9) | |
| 5000-20,000 inhabitants | 57 (28.4) | |
| 20,001-100,000 inhabitants | 43 (21.4) | |
| 100,001-500,000 inhabitants | 32 (15.9) | |
| >500,000 inhabitants | 27 (13.4) | |
| Digitalization level of GP practice (scale from 0 to 10), mean (SD) | 7.11 (1.71) | |
| Average digital health literacy of patients (scale from 0 to 5), mean (SD) | 2.57 (0.95) | |
aGP: general practitioner.
Of the 201 participants included in the analysis, the majority of GPs were male, aged between 41 and 60 years, and primarily worked in solo practices. Most GP practices were located in small towns with populations between 5000 and 20,000 inhabitants. On average, participants rated the digital health literacy of their patients at 2.57 out of 5 (SD 0.95) and the digitalization level of their own practice at 7.11 out of 10 (SD 1.71).
Exploratory Factor Analysis
The results of the EFA are presented in .
| Factors and items | Factor loadingb | |||||||
| 1 | 2 | 3 | 4 | 5 | 6 | 7 | ||
| Factor 1: effects of digitalization | ||||||||
| In my medical practice, digitalization has a positive effect on the practice’s business results. (EFF04) | 0.923 | —c | — | — | — | — | — | |
| In my medical practice, digitalization has a positive impact on the quality of patient care. (EFF01) | 0.899 | — | — | — | — | — | — | |
| In my medical practice, digitalization has a positive effect on the workload of the practice staff. (EFF03) | 0.791 | — | — | — | — | — | — | |
| In my medical practice, digitalization has a positive effect on patient satisfaction. (EFF02) | 0.667 | — | — | — | — | — | — | |
| Factor 2: practice culture | ||||||||
| In my medical practice, digital apps are consistently used whenever the situation allows for it. (PC02) | — | 0.780 | — | — | — | — | — | |
| In my medical practice, the use of digital apps is standardized. (KQM03) | — | 0.628 | — | — | — | — | — | |
| In my medical practice, the use of digital apps is an integral part of daily routines. (PC03) | — | 0.612 | — | — | — | — | — | |
| In my medical practice, the use of digital apps is part of our mission statement. (PC01) | — | 0.534 | — | — | — | — | — | |
| Factor 3: participation of practice staff | ||||||||
| Important decisions regarding digitalization projects are made jointly in our medical practice. (PP03) | — | — | 0.897 | — | — | — | — | |
| In my medical practice, the practice owners make decisions about digitalization projects without involving the practice staff. (PP02)d | — | — | 0.708 | — | — | — | — | |
| When decisions about digitalization projects are pending, the practice staff has sufficient opportunity to contribute. (PP01) | — | — | 0.684 | — | — | — | — | |
| Factor 4: maturity of the practice management system | ||||||||
| In my medical practice, the practice management system is reliable and stable. (TM03) | — | — | — | 0.875 | — | — | — | |
| In my medical practice, the practice management system is easy to use. (TM02) | — | — | — | 0.786 | — | — | — | |
| In my medical practice, the practice management system is up to date. (TM01) | — | — | — | 0.548 | — | — | — | |
| Factor 5: staff competencies and sense of responsibility | ||||||||
| In my medical practice, the team finds it easy to learn how to use digital apps. (TA02) | — | — | — | — | 0.814 | — | — | |
| In my medical practice, the staff often feels overwhelmed by new demands related to digitalization. (CM03)d | — | — | — | — | 0.722 | — | — | |
| In my medical practice, there is often a sense that no one feels responsible for digitalization projects. (RS02)d | — | — | — | — | 0.534 | — | — | |
| Factor 6: IT security and data protection | ||||||||
| In my medical practice, measures are taken to meet IT security requirements. (IDS01) | — | — | — | — | — | 0.753 | — | |
| In my medical practice, measures are taken to meet data protection requirements. (IDS02) | — | — | — | — | — | 0.753 | — | |
| In my medical practice, employees receive training on how to use IT systems securely when handling patient data. (IDS03) | — | — | — | — | — | 0.461 | — | |
| Factor 7: digitally supported processes | ||||||||
| In my medical practice, administrative processes (eg, finance, human resources, procurement, internal communication) are digitally supported. (DP02) | — | — | — | — | — | — | 0.961 | |
| In my medical practice, core processes (eg, appointment management, patient intake, medical history, diagnostics, treatment, documentation) are digitally supported. (DP01) | — | — | — | — | — | — | 0.523 | |
| In my medical practice, communication with external service providers and institutions (eg, specialists, laboratories, health insurance providers) is digitally supported. (DP03) | — | — | — | — | — | — | 0.420 | |
aEFA: exploratory factor analysis.
bThe items were translated from German into English. The values of the factor loadings apply to the German items.
cNot applicable.
dThe data for the items PP02, CM03, and RS02 were inverted.
After 6 iterations of the EFA, 7 factors were identified: (1) effects of digitalization, (2) practice culture, (3) participation of practice staff, (4) maturity of the practice management system, (5) staff competencies and sense of responsibility, (6) IT security and data protection, and (7) digitally supported processes. The labels for factors 2 and 5 were newly developed, as they comprised cross-thematic items (eg, factor 5 with items TA02, CM03_Invers, RS02_Invers) and therefore differed from the original categorization derived from the expert interviews []. A total of 23 variables were assigned to the 7 factors. During the adjustment process, 2 initial items were removed due to communalities below 0.2, and 1 item was excluded due to a factor loading below 0.3. Additional items were eliminated due to high cross-loadings or low factor loadings. Both Bartlett’s test of sphericity (χ²(253)=2036.241; P<.001) and the KMO measure of sampling adequacy (KMO=0.842) indicated that the sample was suitable for factor analysis. The average extracted communality of the items was 0.557, in line with the recommendations of MacCallum et al []. The 7 factors explained 70.103% of the total variance.
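The parallel-analysis criterion used to determine the number of factors can be sketched as follows. This simplified version compares the eigenvalues of the observed correlation matrix against the mean eigenvalues obtained from random data of the same dimensions; it is a common PCA-based approximation, and implementations tailored to principal axis factoring differ in detail.

```python
import numpy as np

def parallel_analysis(X, n_sim=100, seed=0):
    """Retain factors whose observed eigenvalues exceed the mean
    eigenvalues of uncorrelated random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    sims = np.empty((n_sim, p))
    for s in range(n_sim):
        Z = rng.standard_normal((n, p))
        sims[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
    thresh = sims.mean(axis=0)
    return int(np.sum(obs > thresh)), obs, thresh
```

Because random-data eigenvalues exceed 1 for the first few components, this criterion is typically more conservative than the Kaiser rule of retaining all eigenvalues above 1.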
Confirmatory Factor Analysis
The results for convergent and discriminant validity are presented in .
| Factor | AVE | Convergent validity met? | √AVE | Highest correlation with another factor | Discriminant validity met? |
| Factor 1: effects of digitalization | 0.698 | Yes | 0.835 | 0.640 (with factor 2) | Yes |
| Factor 2: practice culture | 0.493 | No | 0.702 | 0.651 (with factor 7) | Yes (with limitations) |
| Factor 3: participation of practice staff | 0.565 | Yes | 0.752 | 0.375 (with factor 5) | Yes |
| Factor 4: maturity of the practice management system | 0.548 | Yes | 0.740 | 0.270 (with factor 5) | Yes |
| Factor 5: staff competencies and sense of responsibility | 0.515 | Yes | 0.717 | 0.603 (with factor 2) | Yes |
| Factor 6: IT security and data protection | 0.467 | No | 0.683 | 0.470 (with factor 7) | Yes |
| Factor 7: digitally supported processes | 0.485 | No | 0.696 | 0.651 (with factor 2) | Yes (with limitations) |
aAVE: average variance extracted.
Factors 2 (AVE=0.493), 6 (AVE=0.467), and 7 (AVE=0.485) did not meet the criteria for convergent validity. The discriminant validity analysis revealed a high degree of overlap between factors 2 and 7. In addition, the highest observed correlation between any two factors was 0.8074, which exceeds the recommended threshold of 0.80 proposed by Rönkkö and Cho [].
Despite these limitations in convergent and discriminant validity, the overall model quality was good. All model fit indices indicated acceptable values (CFI=0.945, TLI=0.933, RMSEA=0.049, PCLOSE=0.523, SRMR=0.063).
Due to the unmet requirements for convergent and discriminant validity, the model was subsequently adjusted.
Model Adjustment
Items KQM03 (factor 2; loading=0.619), IDS03 (factor 6; loading=0.575), and DP03 (factor 7; loading=0.531) had the lowest factor loadings, indicating potential for improving convergent validity. presents the 10 highest modification indices identified in the measurement model.
| lhsa | opb | rhsc | mid | epce | sepc.lvf | sepc.allg | sepc.noxh |
| Factor3 | =~ | DP03 | 16.63 | 0.370 | 0.269 | 0.294 | 0.294 |
| Factor7 | =~ | IDS03 | 13.05 | 0.538 | 0.338 | 0.312 | 0.312 |
| Factor5 | =~ | EFF03 | 12.70 | 0.351 | 0.251 | 0.215 | 0.215 |
| KQM03 | ~~ | CM03_Invers | 11.80 | –0.183 | –0.183 | –0.297 | –0.297 |
| Factor4 | =~ | DP03 | 11.01 | 0.557 | 0.215 | 0.236 | 0.236 |
| Factor5 | =~ | DP03 | 10.53 | 0.383 | 0.274 | 0.300 | 0.300 |
| Factor2 | =~ | IDS03 | 10.48 | 0.305 | 0.282 | 0.260 | 0.260 |
| DP01 | ~~ | DP03 | 9.34 | –0.156 | –0.156 | –0.268 | –0.268 |
| Factor2 | =~ | PP02_Invers | 9.16 | –0.273 | –0.252 | –0.209 | –0.209 |
| EFF03 | ~~ | CM03_Invers | 8.51 | 0.123 | 0.123 | 0.263 | 0.263 |
alhs: left-hand side.
bop: operator.
crhs: right-hand side.
dmi: modification index.
eepc: expected parameter change.
fsepc.lv: standardized epc on a latent variable scale.
gsepc.all: standardized epc (all).
hsepc.nox: standardized epc (exogenous observed variables not standardized).
The modification indices primarily indicated additional cross-loadings, for example, for the items DP03, IDS03, and EFF03. Based on the modification indices and the low factor loadings reported above, the items EFF03, KQM03, IDS03, and DP03 were removed. The removal of these items was consistent with our initial theoretical assumptions about the factors. Following the model adjustment, the requirements for convergent validity were met (factor 2: AVEbefore=0.493, AVEafter=0.530; factor 6: AVEbefore=0.467, AVEafter=0.535; factor 7: AVEbefore=0.485, AVEafter=0.586).
As discriminant validity remained only partially fulfilled, we conducted a new EFA, this time based on a six-factor model. As a result, the items from factor 2 (practice culture) and factor 7 (digitally supported processes) were merged into a single factor. However, this restructuring led to a deterioration in the model fit, and the newly merged factor continued to show high correlations with other factors. We therefore concluded that the items within factor 2 (practice culture) lacked sufficient conceptual distinction from the other factors. We explain this by noting that practice culture shapes all activities within a practice, provides a clear identity, and thereby guides everyday actions. If digitalization is an integral part of the enacted practice culture, it also influences factors such as process digitalization, the competencies and responsibilities of practice staff, and measures for implementing IT security and data protection. Based on the empirical findings and these theoretically plausible high correlations, we therefore decided to remove factor 2 entirely. The path diagram for the final CFA is presented in .

Following these adjustments, the requirements for discriminant validity were met. Overall, the factors showed clear distinctions from one another. The highest correlation was observed between factor 4 (staff competencies and sense of responsibility) and factor 6 (digitally supported processes), with a value of 0.469. In addition, the model fit improved substantially. An overview of the model fit for each estimated measurement model is provided in .
| Model fit index | Model values | |||
| 7-Factor model without adjustment | 7-Factor model with adjustment | 6-Factor model according to second EFAa | Final 6-factor model without “practice culture” factor | |
| Robust CFIb | 0.945 | 0.977 | 0.939 | 0.993 |
| Robust TLIc | 0.933 | 0.970 | 0.928 | 0.990 |
| Robust RMSEAd | 0.049 | 0.035 | 0.052 | 0.022 |
| PCLOSEe | .52 | .92 | .36 | .98 |
| SRMRf | 0.063 | 0.047 | 0.062 | 0.043 |
aEFA: exploratory factor analysis.
bCFI: comparative fit index.
cTLI: Tucker-Lewis index.
dRMSEA: root mean square error of approximation.
ePCLOSE: P value of close fit.
fSRMR: standardized root mean square residual.
Ultimately, the final model resulted in a questionnaire designed to measure digital maturity, consisting of 6 dimensions and 16 items (see ). The final questionnaire was translated from German into English; the original questions can be found in .
| Dimension | Items |
| Effects of digitalization |
|
| Participation of practice staff |
|
| Maturity of the practice management system |
|
| Staff competencies and sense of responsibility |
|
| IT security and data protection |
|
| Digitally supported processes |
|
aGP: general practitioner.
bResponses to these items are inverted (reverse coded).
The questionnaire showed good internal consistency (overall Cronbach α=.809). Cronbach α values for the dimensions were .871 for effects of digitalization, .789 for participation of practice staff, .762 for maturity of the practice management system, .751 for staff competencies and sense of responsibility, .717 for IT security and data protection, and .729 for digitally supported processes.
Results of the Maturity Level Measurement
The results of the maturity level measurements for the sample can be seen in .
The overall digital maturity of the GP practices averaged 3.77 out of 5 (SD 0.50). The practices achieved their highest score in the “IT security and data protection” dimension, with a mean of 4.45 (SD 0.61). The dimensions “staff competencies and sense of responsibility” (mean 4.09, SD 0.82) and “digitally supported processes” (mean 3.99, SD 0.89) also received relatively high scores. In contrast, the lowest maturity level was observed in the “effects of digitalization” dimension (mean 3.10, SD 1.00).
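The scoring rule, a mean per dimension and an unweighted mean of the dimension means, can be sketched as follows. Item and dimension names are illustrative stand-ins; reverse-coded items (eg, PP02) are assumed to have been inverted beforehand, that is, 6 minus the raw value on the 1-5 scale.

```python
def maturity_scores(responses, dimensions):
    """Compute dimension scores and the overall digital maturity score.

    responses: dict mapping item code -> rating (1-5, already reverse coded
               where applicable)
    dimensions: dict mapping dimension name -> list of item codes
    """
    dim_scores = {
        name: sum(responses[item] for item in items) / len(items)
        for name, items in dimensions.items()
    }
    # Overall score: unweighted mean of the dimension scores, so each
    # dimension contributes equally regardless of its number of items
    overall = sum(dim_scores.values()) / len(dim_scores)
    return dim_scores, overall
```

The unweighted mean over dimensions means a 2-item dimension carries the same weight in the overall score as a 4-item one, matching the calculation described in the Methods section.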
Discussion
Principal Findings
In this study, we explored the research question of which dimensions of digital maturity in GP practices can be identified and validated to support the development of a questionnaire for measuring digital maturity. Maturity assessments for digitalization in health care are gaining increasing international traction; however, their development and level of research vary significantly across different care sectors. Our aim was to advance maturity measurements in outpatient care, both in research and in practice. The study introduces the first questionnaire for assessing digital maturity in this context, scientifically validated through psychometric testing. Through exploratory and confirmatory factor analyses, we were able to identify and confirm that the digital maturity of GP practices can be measured using 6 dimensions comprising 16 items. The questionnaire we developed captures various factors that influence the level of digitalization in a medical practice, including aspects related to practice staff, the practice management system, IT security and data protection, and digitally supported processes. In addition, the questionnaire incorporates outcome-oriented indicators, defining digital maturity in terms of the positive effects of digitalization on the medical practice. Applied to our sample, we found an average overall digital maturity score of 3.77 out of 5. Among the six dimensions, “effects of digitalization” was rated the lowest (mean 3.10, SD 1.00), while “IT security and data protection” received the highest rating (mean 4.45, SD 0.61).
Comparison With Prior Work
A comparison of our questionnaire with existing maturity level measurements in outpatient care is currently possible only to a limited extent, as these use different measurement approaches and, in some cases, rely on dimensions originally developed for the inpatient sector. Nevertheless, both similarities and differences can be identified. The few existing studies share the understanding that digital maturity is a complex interplay of multiple dimensions. This counters the frequently voiced criticism of a narrow focus on technological aspects, which is particularly evident in maturity assessments within the hospital sector []. It becomes apparent that the identified dimensions can primarily be categorized into structures, processes, and outcomes, which conceptually aligns with Donabedian’s quality model []. The emphasis here is on structures, referring mainly to the technical infrastructure as well as the organizational and human resources and competencies of the medical practice. Although we used the dimension “staff competencies and sense of responsibility” in our study, other authors have referred to similar constructs using terms such as “resources and ability (individual),” “resources and ability (organizational),” or “capabilities, willingness, and readiness” [,]. An example of an outcome-oriented indicator can be found in the work of Teixeira et al [], who examined whether digital systems have positive effects on patients, organizational structures, processes, or the financial performance of medical practices.
Unlike other maturity measurements, factors such as the interoperability of technical systems and the frequency of use are not captured in our questionnaire [,-]. However, a closer analysis reveals that frequency of use was originally part of the “practice culture” factor but was excluded during the CFA. Additionally, our questionnaire does not include aspects related to management, governance, and strategy. At most, some indications of these appear in the factor “participation of practice staff,” as an expression of staff leadership, and in the eliminated factor “practice culture,” through questions on the practice’s mission statement. The absence of such aspects stands in contrast to maturity assessments in outpatient care [] and, in particular, to maturity models in inpatient care, such as DigitalRadar [,,]. On the one hand, variables from these subject areas were excluded during the EFA because no underlying common factor could be identified. On the other hand, our results build on prior expert interviews, in which these areas were considered less relevant than other dimensions. Regarding the limited consideration of management-related dimensions, we have already noted that, in our view, this reflects the fact that the organizational form and size of a medical practice are not comparable to those of larger institutions, such as hospitals, where management plays a much greater role. In Germany, the solo practice remains the most common form of establishment for general practices in outpatient care, accounting for about two-thirds of all practice types []. Furthermore, maturity measurements also differ in the overall digital maturity score and the method used to calculate it. Similar to Weik et al [], we used an overall score based on the mean values of the individual dimensions.
Teixeira et al [] used a point-based approach in which each dimension was represented by a single statement. Agreement with the statement resulted in the allocation of 1 point, and the total number of points determined the maturity level []. Azar et al [] expanded their maturity measurement compared to our approach by integrating qualitative elements, specifically by defining distinct maturity levels. GP practices were assigned to a given level once a certain score threshold was reached [].
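The two alternative scoring schemes just described can be contrasted in a short sketch. This is a hedged illustration only: the thresholds, level names, and number of statements are invented for demonstration and do not reproduce the cited instruments.

```python
# Illustrative contrast of two scoring schemes discussed in the text:
# (1) a point-based tally (1 point per statement agreed with), and
# (2) mapping a total score to discrete maturity levels via thresholds.
# Thresholds and level names below are hypothetical.

def point_score(agreements):
    """Point-based approach: 1 point per statement agreed with."""
    return sum(1 for agreed in agreements if agreed)

def maturity_level(score, thresholds=((8, "advanced"), (4, "developing"), (0, "basic"))):
    """Threshold approach: a practice is assigned a level once its score
    reaches that level's cutoff (checked from highest to lowest)."""
    for cutoff, level in thresholds:
        if score >= cutoff:
            return level

score = point_score([True, True, False, True, True, False])
print(score, maturity_level(score))  # prints: 4 developing
```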
Finally, there is also a difference in who completes the questionnaire. Whereas in our model only GPs assessed digital maturity, the maturity model of Azar et al [] surveys practice managers and regional service officers. In a German project aimed at measuring maturity levels, medical assistants are trained as “digi-managers” through a specialized training program designed to assess the digital maturity of medical practices [,].
Practical Implications
The questionnaire serves as an initial tool for GPs to assess the digitalization status of their practice, and the results provide an impetus for reflecting on the current state. It can be used to systematically record the practice’s level of digitalization and to identify targeted areas for further development, such as infrastructure or the competencies of practice staff. Additionally, it can serve as a foundation for internal team discussions or for developing digitalization strategies at the practice level. By assessing their level of digitalization, multiple practices can also engage in benchmarking with one another. The results may further stimulate communication between practices, fostering improvements in their own digital maturity. For policymakers, the questionnaire offers a standardized and validated measurement instrument that quantifies digital maturity at the practice level. It can be incorporated into needs assessments, funding programs, or evaluation initiatives to track digitalization progress and to plan targeted support measures. In Germany, there are increasing calls for government funding programs to promote the digitalization of medical practices as part of a proposed Practice Future Act []. In this context, maturity assessments using our questionnaire could be implemented in a manner similar to DigitalRadar, which was established under the Hospital Future Act. Finally, the questionnaire also opens up new research opportunities. It can be used in empirical studies, for example, to explore correlations between digital maturity and the successful implementation of new digital systems. Initial approaches already exist that examine the connections between digital maturity and constructs such as organizational commitment and job satisfaction [].
Limitations
In addition to the fact that the questionnaire was validated in German, this study is subject to several limitations related to the sample, the factor-analytical model, and the methodological approach used to validate the measurement dimensions. First, the sample size of 201 participating GPs is relatively small, which limits the validity and generalizability of the results. The literature offers various recommendations regarding sample size, including guidelines on the overall sample as well as the ratio of cases to variables involved in factor analysis []. Independent of these recommendations, however, there remains the question of how to address potential biases resulting from an insufficient sample size. To address this issue, we applied CFA fit indices specifically recommended for small sample sizes, following the guidance of Gaskin et al [], which enabled an appropriate classification of the results within the given methodological constraints. Furthermore, selection bias cannot be ruled out, as it is likely that mainly individuals open to digitalization participated in the survey. Our sample was also not representative of health care providers with respect to gender, as it included a higher proportion of male participants (61.3%) []. Additionally, the influence of social desirability cannot be completely ruled out, which may have led to slightly higher ratings for factors such as “IT security and data protection” and “participation of practice staff.” Nevertheless, we believe that the use of multiple recruitment channels allowed us to reach a diverse group of participants.
Second, there are limitations in the factor-analytical model applied. Two factors were represented by only two indicators each, which runs counter to current recommendations of at least three indicators per factor []. This can leave the corresponding factors weakly identified or produce unstable parameter estimates, and the content validity of such factors must be evaluated critically. In the context of this study, however, the reduction to two indicators per factor was driven by methodological considerations: the affected items were removed due to insufficient convergent and discriminant validity. Moreover, the theoretical foundation of the respective constructs aligned with the resulting structure, so the remaining factors were justified in terms of content despite these formal limitations.
Finally, a limitation arises from the fact that the same dataset was used for both exploratory factor identification and confirmatory validation of the measurement dimensions. As a result, the independence of exploration and confirmation was constrained. This approach can lead to an overly optimistic model fit, thereby potentially compromising the validity of the results: the model could fit this particular dataset well but perform worse on new data. The generalizability of the results is therefore limited. The decision to use the same dataset was made for practical reasons and is defensible given the early development stage of the measurement instrument. The aim of this study was not to perform a final validation but rather to initially identify and empirically test a theoretically sound construct structure of digital maturity. Future research should focus on validating the scales using independent samples.
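The split-sample strategy implied here (exploration and confirmation on independent subsets) can be sketched briefly. This is a generic illustration under stated assumptions, not the authors' procedure; the sample size mirrors this study, but the seed and split ratio are arbitrary choices.

```python
# Sketch of a split-sample design: randomly divide respondents so that
# EFA (exploration) and CFA (confirmation) use independent halves.
# Seed and 50/50 split are arbitrary illustrative choices.
import random

respondents = list(range(201))  # e.g., 201 GP responses as in this study
random.seed(42)                 # fixed seed for a reproducible split
random.shuffle(respondents)

half = len(respondents) // 2
efa_sample = respondents[:half]   # subsample for identifying factors
cfa_sample = respondents[half:]   # independent subsample for testing the structure
print(len(efa_sample), len(cfa_sample))  # prints: 100 101
```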
Further Development of the Questionnaire
A key starting point for the further development of the questionnaire is improving the item quality and the validity of the assessed dimensions. Two dimensions, in particular, are currently based on only two items each, which limits their content coverage and statistical robustness. To better capture the full scope of the constructs and enhance measurement accuracy, these dimensions should be expanded with additional, precise, and empirically validated items.
Furthermore, new dimensions could be incorporated into the questionnaire. Specifically, aspects such as the interoperability of technical systems and topics related to governance and strategy, which have not yet been addressed, could be considered. To further improve the questionnaire’s quality, additional research should be conducted on relevant digital maturity dimensions and the findings integrated into future revisions.
When expanding the questionnaire, however, care must be taken to ensure that it remains user-friendly and that the completion time does not increase disproportionately. One potential enhancement could be to supplement the maturity level results with qualitative statements, following the approach of Azar et al []. This would provide users with a more nuanced understanding of the digital maturity level, rather than a purely numerical evaluation. In addition, consideration should be given to not restricting the questionnaire completion exclusively to GPs. As demonstrated in previous studies, including other health care providers can be valuable to obtain a more comprehensive perspective and enhance the quality of the collected data [,]. Finally, it appears prudent to extend the questionnaire to other specialist areas within outpatient care. Our study has so far focused solely on GP practices, while specialist practices have not been included. Therefore, the adaptation and validation of the questionnaire for this target group should be explored.
Conclusion
This study is the first empirical investigation to identify and validate the dimensions of digital maturity in GP practices. The central outcome is a scientifically robust questionnaire that provides a foundation for future maturity assessments. The findings demonstrate that digital maturity is a multidimensional construct that can be captured through a variety of technical, human, and organizational factors. Outcome-oriented factors, such as enhanced patient satisfaction, can also serve as indicators of a high level of digital maturity in medical practices. This study offers practical implications for GPs, policymakers, and the scientific community. In future research, the developed questionnaire should be further refined, validated, and adapted to different contexts. Overall, the current results contribute to the establishment of maturity assessments in the outpatient sector and can serve as a basis for supporting the digital transformation in a targeted and sustainable way through accompanying evaluations.
Acknowledgments
This study received no funding. No generative artificial intelligence was used in any portion of the manuscript writing.
Data Availability
The data supporting this study’s findings are available from the corresponding author upon reasonable request.
Authors' Contributions
TN contributed to conceptualization, methodology, formal analysis, investigation, data curation, writing—review and editing, and visualization. AM provided a critical review. SM contributed to conceptualization, methodology, validation, and writing—review and editing.
Conflicts of Interest
None declared.
Completed Checklist for Reporting Results of Internet E-Surveys (CHERRIES). DOCX File, 46 KB
Survey questionnaire for general practitioners. DOCX File, 47 KB
Items and abbreviations used. DOCX File, 21 KB
Final questionnaire (original version). DOCX File, 16 KB
Digital maturity by dimension and as an overall value (mean, SD). PNG File, 130 KB
References
- Organisation for Economic Co-operation and Development (OECD), European Union. State of health in the EU cycle. In: Health at a Glance: Europe 2016. Paris, France. OECD; Nov 23, 2016.
- van der Kleij RM, Kasteleyn MJ, Meijer E, Bonten TN, Houwink EJ, Teichert M, et al. SERIES: eHealth in primary care. Part 1: concepts, conditions and challenges. Eur J Gen Pract. Oct 10, 2019;25(4):179-189. [FREE Full text] [CrossRef] [Medline]
- Chaudhry B, Wang J, Wu S, Maglione M, Mojica W, Roth E, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med. May 16, 2006;144(10):742-752. [FREE Full text] [CrossRef] [Medline]
- Buntin MB, Burke MF, Hoaglin MC, Blumenthal D. The benefits of health information technology: a review of the recent literature shows predominantly positive results. Health Aff (Millwood). Mar 2011;30(3):464-471. [FREE Full text] [CrossRef] [Medline]
- World Health Organization. Digital Transformation Handbook for Primary Health Care: Optimizing Person-Centred Point of Service Systems. Geneva. World Health Organization; 2024.
- de Bruin T, Freeze R, Rosemann M. Understanding the main phases of developing a maturity assessment model. 2005. Presented at: 16th Australasian Conference on Information Systems; November 29-December 2, 2005; Sydney. URL: https://www.researchgate.net/publication/27482282_Understanding_the_Main_Phases_of_Developing_a_Maturity_Assessment_Model/link/00b7d51f71c388cc54000000/download?_tp=eyJjb250ZXh0Ijp7ImZpcnN0UGFnZSI6InB1YmxpY2F0aW9uIiwicGFnZSI6InB1YmxpY2F0aW9uIn19
- Phiri P, Cavalini H, Shetty S, Delanerolle G. Digital maturity consulting and strategizing to optimize services: overview. J Med Internet Res. Jan 17, 2023;25:e37545. [FREE Full text] [CrossRef] [Medline]
- Johnston DS. Digital maturity: are we ready to use technology in the NHS? Future Healthc J. Oct 2017;4(3):189-192. [FREE Full text] [CrossRef] [Medline]
- Carvalho JV, Rocha Á, Abreu A. Maturity models of healthcare information systems and technologies: a literature review. J Med Syst. Jun 15, 2016;40(6):131. [FREE Full text] [CrossRef] [Medline]
- Neunaber T, Meister S. Digital maturity and its measurement of general practitioners: a scoping review. Int J Environ Res Public Health. Feb 28, 2023;20(5):4377. [FREE Full text] [CrossRef] [Medline]
- Reifegradmodell - das digitale Reifegradmodell der KVWL. Kassenärztliche Vereinigung Westfalen-Lippe. 2023. URL: https://www.kvwl.de/themen-a-z/reifegradmodell [accessed 2025-03-17]
- Mainz A, Neunaber T, D'Agnese PC, Eid A, Galla T, Ellers C, et al. Digital literacy for “Digi-Managers” in outpatient care: conceptualization and longitudinal evaluation of a certificate course for the training of digitalization officers in medical and psychotherapeutic practices. JMIR Med Educ. Aug 29, 2025;11:e70843. [CrossRef] [Medline]
- Azar D, Cuman A, Blake T, Proposch A. A digital health maturity assessment for general practice. Gippsland PHN. Oct 2020. URL: https://gphn.org.au/wp-content/uploads/files/pdf/DOC-20-16665-Gippsland-PHN-Digital-Health-Maturity-Assessment-paper-2.pdf [accessed 2025-01-30]
- Electronic medical record adoption model (EMRAM). Healthcare Information and Management Systems Society (HIMSS). URL: https://www.himss.org/what-we-do-solutions/digital-health-transformation/maturity-models/ [accessed 2025-09-26]
- Geissler A, Hollenbach J, Haring M, Amelung VE, Thun S, Haering A. A nationwide digital maturity assessment of hospitals – results from the German DigitalRadar. Health Policy Technol. Sep 2024;13(4):100904. [FREE Full text] [CrossRef]
- Vogel J, Hollenbach J, Haering A, Augurzky B, Geissler A. The association of hospital profitability and digital maturity - an explorative study using data from the German DigitalRadar project. Health Policy. Apr 2024;142:105012. [FREE Full text] [CrossRef] [Medline]
- Duncan R, Eden R, Woods L, Wong I, Sullivan C. Synthesizing dimensions of digital maturity in hospitals: systematic review. J Med Internet Res. Mar 30, 2022;24(3):e32994. [FREE Full text] [CrossRef] [Medline]
- Lahrmann G, Marx F, Mettler T, Winter R, Wortmann F. Inductive design of maturity models: applying the Rasch algorithm for design science research. In: Jain H, Sinha AP, Vitharana P, editors. Service-Oriented Perspectives in Design Science Research. Berlin, Heidelberg. Springer; 2011:176-191.
- Weik L, Fehring L, Mortsiefer A, Meister S. Understanding inherent influencing factors to digital health adoption in general practices through a mixed-methods analysis. NPJ Digit Med. Feb 27, 2024;7(1):47. [FREE Full text] [CrossRef] [Medline]
- Weik L, Fehring L, Mortsiefer A, Meister S. Big 5 personality traits and individual- and practice-related characteristics as influencing factors of digital maturity in general practices: quantitative web-based survey study. J Med Internet Res. Jan 22, 2024;26:e52085. [FREE Full text] [CrossRef] [Medline]
- Haverinen J, Keränen N, Tuovinen T, Ruotanen R, Reponen J. National development and regional differences in eHealth maturity in Finnish public health care: survey study. JMIR Med Inform. Aug 12, 2022;10(8):e35612. [FREE Full text] [CrossRef] [Medline]
- Teixeira F, Li E, Laranjo L, Collins C, Irving G, Fernandez MJ, et al. Digital maturity and its determinants in general practice: a cross-sectional study in 20 countries. Front Public Health. Jan 13, 2022;10:962924. [FREE Full text] [CrossRef] [Medline]
- Neunaber T, Mortsiefer A, Meister S. Dimensions and subcategories of digital maturity in general practice: qualitative study. J Med Internet Res. Dec 19, 2024;26:e57786. [FREE Full text] [CrossRef] [Medline]
- Eysenbach G. Improving the quality of web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res. Sep 29, 2004;6(3):e34. [FREE Full text] [CrossRef] [Medline]
- LimeSurvey: an open source survey tool. LimeSurvey GmbH. 2025. URL: https://www.limesurvey.org/de [accessed 2025-02-11]
- Karrer-Gauß K, Roesler E, Siebert FW. Neuauflage des TAEG Fragebogens: technikaffinität valide und multidimensional mit einer Kurz- oder Langversion erfassen. Z Arb Wiss. Jul 08, 2024;78(3):387-406. [FREE Full text] [CrossRef]
- Brooke J. SUS: a quick and dirty usability scale. Usability Eval Ind. Nov 30, 1995:189. [FREE Full text] [CrossRef]
- Schnell R, Hill PB, Esser E. Methoden der empirischen Sozialforschung. 11., überarbeitete Auflage. Berlin Boston. De Gruyter Oldenbourg; 2018.
- Arevalo M, Brownstein NC, Whiting J, Meade CD, Gwede CK, Vadaparampil ST, et al. Strategies and lessons learned during cleaning of data from research panel participants: cross-sectional web-based health behavior survey study. JMIR Form Res. Jun 23, 2022;6(6):e35797. [FREE Full text] [CrossRef] [Medline]
- Leiner DJ. Too fast, too straight, too weird: non-reactive indicators for meaningless data in internet surveys. Surv Res Methods. Dec 10, 2019;13(3):229-248. [FREE Full text] [CrossRef]
- IBM SPSS Statistics for Windows, version 29.0. IBM Corp. URL: https://www.ibm.com/support/pages/downloading-ibm-spss-statistics-29 [accessed 2025-04-05]
- R Core Team. R: A Language and Environment for Statistical Computing, version 4.4.3x. Vienna, Austria. R Foundation for Statistical Computing; 2024.
- Rosseel Y. lavaan: an R package for structural equation modeling. J Stat Software. 2012;48(2):1-36. [FREE Full text] [CrossRef]
- Watkins MW. Exploratory factor analysis: a guide to best practice. J Black Psychol. Apr 27, 2018;44(3):219-246. [FREE Full text] [CrossRef]
- Samuels P. Advice on exploratory factor analysis. Centre for Academic Success, Birmingham City University. Jun 9, 2017. URL: https://www.open-access.bcu.ac.uk/id/eprint/6076 [accessed 2025-06-01]
- Fabrigar LR, Wegener DT, MacCallum RC, Strahan EJ. Evaluating the use of exploratory factor analysis in psychological research. Psychol Methods. Sep 1999;4(3):272-299. [FREE Full text] [CrossRef]
- Costello A, Osborne J. Best practices in exploratory factor analysis: four recommendations for getting the most from your analysis. Pract Assess Res Eval. Jan 1, 2005;10(7):1-9. [FREE Full text] [CrossRef]
- Çokluk Bökeoğlu Ö, Koçak D. Using Horn’s parallel analysis method in exploratory factor analysis for determining the number of factors. Educ Sci Theory Pract. Apr 15, 2016;16(2):537-551. [FREE Full text] [CrossRef]
- Gaskin JE, Lowry PB, Rosengren W, Fife PT. Essential validation criteria for rigorous covariance‐based structural equation modelling. Inf Syst J. May 15, 2025:12598. [FREE Full text] [CrossRef]
- Fornell C, Larcker DF. Evaluating structural equation models with unobservable variables and measurement error. J Mark Res. Feb 1981;18(1):39. [FREE Full text] [CrossRef]
- Rönkkö M, Cho E. An updated guideline for assessing discriminant validity. Organ Res Methods. Nov 23, 2020;25(1):6-14. [FREE Full text] [CrossRef]
- Hair JF, Black WC, Babin BJ, Anderson RE. Multivariate Data Analysis. Eighth Edition. Andover, Hampshire. Cengage; 2019.
- Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. Sep 1951;16(3):297-334. [FREE Full text] [CrossRef]
- MacCallum RC, Widaman KF, Zhang S, Hong S. Sample size in factor analysis. Psychol Methods. Mar 1999;4(1):84-99. [FREE Full text] [CrossRef]
- Cresswell K, Sheikh A, Krasuska M, Heeney C, Franklin BD, Lane W, et al. Reconceptualising the digital maturity of health systems. Lancet Digit Health. Sep 2019;1(5):e200-e201. [FREE Full text] [CrossRef] [Medline]
- Donabedian A. Evaluating the quality of medical care. Milbank Mem Fund Q. Jul 1966;44(3):166-203. [FREE Full text] [CrossRef]
- Woods L, Eden R, Duncan R, Kodiyattu Z, Macklin S, Sullivan C. Which one? A suggested approach for evaluating digital health maturity models. Front Digit Health. 2022;4:1045685. [FREE Full text] [CrossRef] [Medline]
- Auf einen Hausarzt oder eine Hausärztin kommen im Schnitt 1 264 Einwohnerinnen und Einwohner. Statistisches Bundesamt. URL: https://www.destatis.de/DE/Presse/Pressemitteilungen/2025/08/PD25_N046_231.html [accessed 2025-09-16]
- KBV - KBV fordert Förderprogramm für Digitalisierungsmaßnahmen. Kassenärztliche Bundesvereinigung. URL: https://www.kbv.de/praxis/tools-und-services/praxisnachrichten/2025/07-17/kbv-fordert-foerderprogramm-fuer-digitalisierungsmassnahmen [accessed 2025-09-11]
- Hawrysz L, Kludacz-Alessandri M, Fiałkowska-Filipek M. Digital maturity in primary care facilities: assessing its influence on organisational commitment and job satisfaction. Decision. Aug 27, 2025:1-18. [FREE Full text] [CrossRef]
- Kline P. An Easy Guide to Factor Analysis. London, UK. Routledge; 2014.
- Statistische Informationen aus dem Bundesarztregister. Kassenärztliche Bundesvereinigung. 2025. URL: https://www.kbv.de/documents/infothek/zahlen-und-fakten/Bundesarztregister/2024-12-31-BAR-Statistik.pdf [accessed 2025-09-09]
- Kline RB. Principles and Practice of Structural Equation Modeling, 4th ed. New York, NY. Guilford Press; 2016.
Abbreviations
AVE: average variance extracted
CFA: confirmatory factor analysis
CFI: comparative fit index
CHERRIES: Checklist for Reporting Results of Internet E-Surveys
EFA: exploratory factor analysis
GP: general practitioner
KMO: Kaiser-Meyer-Olkin
PCLOSE: P value of close fit
RMSEA: root mean square error of approximation
SRMR: standardized root mean square residual
TLI: Tucker-Lewis index
Edited by S Brini, A Mavragani; submitted 29.Jul.2025; peer-reviewed by B Senst, I Schlömer; comments to author 08.Sep.2025; revised version received 20.Sep.2025; accepted 22.Sep.2025; published 14.Oct.2025.
Copyright©Timo Neunaber, Achim Mortsiefer, Sven Meister. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 14.Oct.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.

