Published in Vol 21, No 2 (2019): February

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/13269.
Online Digital Education for Postregistration Training of Medical Doctors: Systematic Review by the Digital Health Education Collaboration


Review

1. Health Services and Outcomes Research, National Healthcare Group, Singapore, Singapore

2. Joanna Briggs Institute, University of Adelaide, Adelaide, Australia

3. Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore

4. Centre for Population Health Sciences, Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore

5. Family Medicine and Primary Care, Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore

6. Laboratory of Medical Physics, Aristotle University of Thessaloniki, Thessaloníki, Greece

7. Ophthalmology Team, Novartis, Singapore, Singapore

8. Medical Education Research and Scholarship Unit, Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore, Singapore

9. Department of Learning, Informatics, Management and Ethics, Karolinska Institutet, Stockholm, Sweden

10. 10I Emerging Technologies Lab, Mohammed VI University of Health Sciences, Casablanca, Morocco

11. Department of Family Medicine, Faculty of Medicine, University of Ljubljana, Ljubljana, Slovenia

Corresponding Author:

Josip Car, MD, PhD, DIC, MSc, FFPH, FRCP (Edin)

Centre for Population Health Sciences

Lee Kong Chian School of Medicine

Nanyang Technological University

Experimental Medicine Building

59 Nanyang Drive

Singapore,

Singapore

Phone: 65 6592 3960

Email: josip.car@ntu.edu.sg


Background: Globally, online and local area network–based (LAN) digital education (ODE) has grown in popularity. ODE may be delivered entirely through digital technology or blended with traditional learning. Studies have shown the increasing potential of these technologies for training medical doctors; however, the evidence for their effectiveness and cost-effectiveness is unclear.

Objective: This systematic review evaluated the effectiveness of ODE in improving practicing medical doctors’ knowledge, skills, attitude, and satisfaction (primary outcomes) as well as practice or behavior change, patient outcomes, and cost-effectiveness (secondary outcomes).

Methods: We searched seven electronic databases for randomized controlled trials, cluster-randomized trials, and quasi-randomized trials published from January 1990 to March 2017. Two review authors independently extracted data and assessed the risk of bias. We present the findings narratively. We mainly compared ODE with self-directed/face-to-face learning and blended learning with self-directed/face-to-face learning.

Results: A total of 93 studies (N=16,895) were included, of which 76 compared ODE (including blended) and self-directed/face-to-face learning. Overall, the effect of ODE (including blended) on postintervention knowledge, skills, attitude, satisfaction, practice or behavior change, and patient outcomes was inconsistent and ranged mostly from no difference between the groups to higher postintervention score in the intervention group (small to large effect size, very low to low quality evidence). Twenty-one studies reported higher knowledge scores (small to large effect size and very low quality) for the intervention, while 20 studies reported no difference in knowledge between the groups. Seven studies reported higher skill score in the intervention (large effect size and low quality), while 13 studies reported no difference in the skill scores between the groups. One study reported a higher attitude score for the intervention (very low quality), while four studies reported no difference in the attitude score between the groups. Four studies reported higher postintervention physician satisfaction with the intervention (large effect size and low quality), while six studies reported no difference in satisfaction between the groups. Eight studies reported higher postintervention practice or behavior change for the ODE group (small to moderate effect size and low quality), while five studies reported no difference in practice or behavior change between the groups. One study reported higher improvement in patient outcome, while three others reported no difference in patient outcome between the groups. None of the included studies reported any unintended/adverse effects or cost-effectiveness of the interventions.

Conclusions: Empirical evidence showed that ODE and blended learning may be equivalent to self-directed/face-to-face learning for training practicing physicians. A few other studies demonstrated that ODE and blended learning may significantly improve learning outcomes compared with self-directed/face-to-face learning. The quality of the evidence in these studies was very low for knowledge. Further high-quality randomized controlled trials are required to confirm these findings.

J Med Internet Res 2019;21(2):e13269

doi:10.2196/13269

Introduction

Information communication technology (ICT) has transformed the way information is exchanged and shared around the world [1-4]. In medical education, ICT has facilitated a paradigm shift from traditional learning to a dynamic system: away from instructor-focused presentation sessions and toward a student-centered process in which students can learn anywhere, anytime, and at their own pace. It also provides unique opportunities for interactive communication and networking [5].

Online and local area network–based (LAN) digital education (ODE) encompasses a variety of interventions characterized by their tools, content, learning objectives, pedagogical approaches, and delivery settings. ODE also varies widely in its configuration (eg, tutorial, asynchronous discussion, and live conferencing), instructional methods (eg, practice exercises and cognitive interactivity), and presentation [6]. ODE uses a full electronic approach, which is entirely driven by technology, or a mix of traditional learning and digital technology (ie, blended learning). Blended learning may be more suitable for health care training, which commonly needs to combine hands-on skill-based training at a practical level and self-directed learning such as ODE [7-9].

ODE has been used widely in undergraduate medical and other health professionals’ education [10] and is now gaining popularity in postregistration medical education for lifelong learning (ie, continuing education), as evidenced by a growing number of studies. Continuing medical education (CME) is defined as “all educational activities which serve to maintain, develop, or increase the knowledge, skills, and professional performance and relationships that a physician uses to provide services for patients, the public, or the profession” [11], and continuing professional development (CPD) is defined as “a range of learning activities through which medical professionals maintain and develop throughout their career to ensure that they retain their capacity to practice safely, effectively and legally within their evolving scope of practice” [12]. Recently, nearly all medical schools in the United States and Canada have moved to providing some form of online learning material as part of their CME for physicians [6].

Research shows that learning is influenced more by the content and instructional strategy than by the type of technology used to deliver the content [13]; in other words, the design of a course determines its effectiveness for learning [14]. There is significant methodological, educational, and clinical heterogeneity among the existing studies [15-37], highlighting the need for a review of ODE that focuses specifically on the education of medical doctors with more homogeneous learning technologies. The a priori protocol for this review has also been published in the Cochrane Library [38].

The primary objective of this review was to evaluate the effectiveness of ODE in improving doctors’ knowledge, skills, attitude, and satisfaction. The secondary objectives were to assess changes in clinical practices or behaviors, patient outcomes, costs and cost-effectiveness of the intervention, and unintended/adverse effects on patients and physicians.


Methods

Search Strategy and Data Sources

A search strategy was developed in accordance with the Cochrane Handbook for Systematic Reviews of Interventions [39] to search the Cochrane CENTRAL, MEDLINE (Ovid), Embase (Elsevier), PsycINFO (Ovid), ERIC (Ovid), CINAHL (Ebsco), Web of Science Core Collection (Thomson Reuters), and International Clinical Trials Registry Platform (World Health Organization) databases. The detailed search strategy is presented in Multimedia Appendix 1, and a detailed description of the methodology has been published elsewhere [40]. Databases were searched from January 1, 1990, to March 9, 2017. We selected 1990 as the starting year for our search because the use of ICT for education was limited before then. We identified additional studies both by scanning relevant systematic reviews and meta-analyses and by hand searching the reference lists of all included studies.

Selection Criteria

Only randomized controlled trials (RCTs), cluster RCTs (cRCTs), and quasi-randomized controlled trials of postregistration education for medical doctors delivered through ODE (standalone or blended), with any type of control, and measuring knowledge, cognitive skills, attitudes, or satisfaction (primary outcomes) or changes in practice or behavior, patient outcomes, costs, or adverse effects (secondary outcomes) were eligible for inclusion in this review (Multimedia Appendix 2). Participants were not excluded on the basis of age, gender, or any other sociodemographic variables. We used the gold-standard systematic review methods recommended by the Cochrane Collaboration and strictly adhered to the published protocol [38].

Data Extraction

Three reviewers (PG, OZ, and BK) independently screened the titles and abstracts and full-text versions of the eligible studies and performed the data extraction. We extracted the relevant data on participants, intervention and control, outcome measures, and study designs. We contacted the study authors in cases of any missing information. A fourth review author (PP) acted as an arbiter in cases of disagreement.

Risk of Bias Assessment

Two reviewers independently assessed the risk of bias for RCTs using the Cochrane Collaboration’s “risk of bias” tool [39]. For RCTs, we did so across the domains of random sequence generation, allocation concealment, blinding of outcome assessment, incomplete outcome data, selective outcome reporting, and other bias, including the comparability of intervention and control group characteristics at baseline, the validity and reliability of outcome assessment tools, and protection against contamination. Blinding of participants and personnel was not assessed, as the nature of the intervention precludes blinding.

We assessed the risk of bias for cRCTs across the domains of recruitment bias [41], baseline imbalances, loss of clusters, incorrect analysis, and comparability with individually randomized trials [39]. For each study, two reviewers independently categorized each domain as low, high, or unclear risk of bias.

Assessment of Heterogeneity

Clinical heterogeneity was assessed to check whether the included studies were similar in terms of their population, intervention characteristics, and reported outcomes and to ascertain the possibility of pooling the measures of effect. The extracted data were analyzed using RevMan 5.3 software (The Nordic Cochrane Centre, Copenhagen, Denmark). Statistical heterogeneity was assessed using the chi-square test and the I² statistic [39]. We found significant heterogeneity (clinical and statistical) among the included studies; hence, meta-analysis was not appropriate.

Data Synthesis

The results from individual RCTs were reported as the standardized mean difference (SMD) for continuous variables and risk ratios (RR) for dichotomous variables. Where studies reported more than one measure for each outcome, the primary measure, as defined by the study authors, was used in the analysis. If studies had multiple arms, we compared the intervention arm to the least active control arm and assessed the difference in postintervention outcomes. Similarly, when multiple domains of the same outcome were measured, only the primary domains identified and agreed upon by the review authors were reported. Meta-analyses were not possible because there was significant clinical and methodological heterogeneity across the included studies.

Summary of Findings

Summary of findings tables (Tables 1-3) were prepared based on the methods described in Chapter 11 of the Cochrane Handbook for Systematic Reviews of Interventions [39]. Two review authors (PG and BK) independently used the Grading of Recommendations, Assessment, Development and Evaluations (GRADE) criteria to rank the quality of the evidence using the GRADE profiler (GRADEpro) software [42]. In the main review, we only compared ODE with self-directed/face-to-face learning and blended learning with self-directed/face-to-face learning; the rest of the comparisons are presented in Multimedia Appendix 3.

Table 1. Summary of findings for online and local area network–based digital education as compared to self-directed learning. Patient or population: postregistration medical doctors; setting: universities, hospitals, and primary care; intervention: online and local area network–based digital education; comparison: self-directed learning.

Outcomes | Number of participants (number of RCTs^a) | Quality of evidence (GRADE^b) | Direction of effects
Knowledge, assessed with multiple-choice questions; follow-up ranged from posttest to 1 year | 3067 (29) | Very low^c,d,e,f | Seventeen studies [43-60] reported that ODE^g was significantly more effective than self-directed learning (very low certainty evidence). Two studies [61,62] reported mixed results (very low certainty evidence). Ten studies [63-72] reported that ODE was as effective as self-directed learning (very low certainty evidence).
Skills, assessed with OSCE^h, diagnostic assessment, examination, questionnaires, and surveys; follow-up ranged from posttest to 4 years | 829 (8) | Low^c,d,i | Five studies [65,73-76] reported that ODE was significantly more effective than self-directed learning (low certainty evidence). Two studies [77,78] reported that ODE was as effective as self-directed learning (low certainty evidence). One study [54] reported that self-directed learning was more effective than ODE (low certainty evidence).
Attitude, assessed with questionnaires; follow-up ranged from posttest to 136 days | 392 (4) | Low^c,d | One study [47] reported that ODE was significantly more effective than self-directed learning (low certainty evidence). Another [66] reported that ODE was as effective as self-directed learning (low certainty evidence). Two studies [44,58] reported mixed results (low certainty evidence).
Satisfaction, assessed with questionnaires; follow-up ranged from posttest to 6 months | 934 (6) | Low^c,d | Two studies [67,79] reported that ODE was significantly more effective (low certainty evidence). Three studies [54,58,80] reported that ODE was as effective as self-directed learning (low certainty evidence). One study [61] reported mixed results (low certainty evidence).

^a RCT: randomized controlled trial.

^b GRADE: Grading of Recommendations, Assessment, Development and Evaluations.

^c Rated down by one level for study limitations. Most studies were considered to be at an unclear or high risk of bias. Overall, the risk of bias for most studies was unclear due to a lack of information reported.

^d Rated down by one level for inconsistency. There was variation in effect size (ie, very large and very small effects were observed).

^e Rated down by one level for publication bias. The effect estimates were asymmetrical, suggesting possible publication bias.

^f Very low quality (+ – – –): We have very little confidence in the effect estimate. The true effect is likely to be substantially different from the estimate of effect.

^g ODE: online and local area network–based digital education.

^h OSCE: objective structured clinical examination.

^i Low quality (+ + – –): Our confidence in the effect estimate is limited. The true effect may be substantially different from the estimate of the effect.

Table 2. Summary of findings for online digital education as compared to face-to-face learning. Patient or population: postregistration medical doctors; setting: universities, hospitals, and primary care; intervention: online and local area network–based digital education; comparison: face-to-face learning.

Outcomes | Number of participants (number of RCTs^a) | Quality of evidence (GRADE^b) | Direction of effects
Knowledge, assessed with multiple-choice questions; follow-up ranged from posttest to 18 months | 1202 (9) | Very low^c,d,e,f | Two studies [81,82] reported that ODE^g was significantly more effective in improving physicians’ knowledge scores than face-to-face learning (very low certainty evidence). Six studies [83-88] found that ODE was as effective as face-to-face learning in improving physicians’ knowledge scores (very low certainty evidence). One study [89] reported that face-to-face learning was significantly more effective than ODE in improving physicians’ knowledge scores.
Skills, assessed with OSCE^h, diagnostic assessment, examination, questionnaires, and surveys; follow-up ranged from posttest to 12 months | 291 (7) | Low^c,d,i | Six studies [84,87,90-93] reported that ODE was as effective as face-to-face learning in improving physicians’ skills (low certainty evidence). In one study [94], data were missing.
Attitude, assessed with questionnaires; follow-up ranged from posttest to 18 months | 220 (2) | Low^c,d | Two studies [82,95] reported that ODE was as effective as face-to-face learning in improving physicians’ attitude (low certainty evidence).
Satisfaction, assessed with questionnaires; follow-up ranged from posttest to 12 weeks | 260 (4) | Low^c,d | Two studies [83,87] reported that ODE was significantly more effective than face-to-face learning for improving physicians’ satisfaction (low certainty evidence). Two studies [81,84] reported that ODE was as effective as face-to-face learning in improving physicians’ satisfaction (low certainty evidence).

^a RCT: randomized controlled trial.

^b GRADE: Grading of Recommendations, Assessment, Development and Evaluations.

^c Rated down by one level for study limitations. Most studies were considered to be at an unclear or high risk of bias. Overall, the risk of bias for most studies was unclear due to a lack of information reported.

^d Rated down by one level for inconsistency. There was variation in effect size (ie, very large and very small effects were observed).

^e Rated down by one level for publication bias. The effect estimates were asymmetrical, suggesting possible publication bias.

^f Very low quality (+ – – –): We have very little confidence in the effect estimate. The true effect is likely to be substantially different from the estimate of effect.

^g ODE: online and local area network–based digital education.

^h OSCE: objective structured clinical examination.

^i Low quality (+ + – –): Our confidence in the effect estimate is limited. The true effect may be substantially different from the estimate of the effect.

Table 3. Summary of findings for blended learning as compared to self-directed/face-to-face learning. Patient or population: postregistration medical doctors; setting: universities, hospitals, and primary care; intervention: blended learning; comparison: self-directed/face-to-face learning.

Outcomes | Number of participants (number of studies) | Quality of evidence (GRADE^a) | Direction of effects
Knowledge, assessed with multiple-choice questions; follow-up ranged from posttest to 26 months | 4413 (7 RCTs^b) | Very low^c,d,e,f | Two studies [96,97] reported that blended learning was significantly more effective in improving physicians’ knowledge than self-directed/face-to-face learning (very low certainty evidence). Five studies [96,98-101] reported that blended learning was as effective as self-directed/face-to-face learning (very low certainty evidence).
Skills, assessed with OSCE^g, diagnostic assessment, examination, questionnaires, and surveys; follow-up ranged from posttest to 26 months | 4131 (6 RCTs) | Low^c,d,h | Two studies [96,102] reported that blended learning may significantly improve physicians’ skills, and four studies [98,99,103,104] reported that blended learning may be as effective as face-to-face learning in improving skills (low certainty evidence).
Attitude, assessed with a questionnaire; follow-up assessed posttest | 61 (1 cRCT^i) | Low^c,d | Kulier et al [105] compared a blended learning course on EBM^j to a face-to-face EBM course and reported that the intervention may be as effective as the control for improving physicians’ attitude.
Satisfaction, assessed with questionnaires on a Likert scale; follow-up ranged from posttest to 6 months | 166 (3 RCTs) | Low^c,d | Ali et al [98] compared ATLS^k delivered through blended learning to a standard ATLS course and reported no difference in satisfaction between the groups (low certainty evidence). Kronick et al [106] compared 3 hours of online training to no training (self-directed training) and found that the intervention slightly improved satisfaction (low certainty evidence). Platz et al [100] compared teaching basic ultrasound principles and extended focused assessment with sonography for trauma using blended learning versus face-to-face training and reported mixed results (low certainty evidence).

^a GRADE: Grading of Recommendations, Assessment, Development and Evaluations.

^b RCT: randomized controlled trial.

^c Rated down by one level for study limitations. Most studies were considered to be at an unclear or high risk of bias. Overall, the risk of bias for most studies was unclear due to a lack of information reported.

^d Rated down by one level for inconsistency. There was variation in effect size (ie, very large and very small effects were observed).

^e Rated down by one level for publication bias. The effect estimates were asymmetrical, suggesting possible publication bias.

^f Very low quality (+ – – –): We have very little confidence in the effect estimate. The true effect is likely to be substantially different from the estimate of effect.

^g OSCE: objective structured clinical examination.

^h Low quality (+ + – –): Our confidence in the effect estimate is limited. The true effect may be substantially different from the estimate of the effect.

^i cRCT: cluster-randomized trial.

^j EBM: evidence-based medicine.

^k ATLS: Advanced Trauma Life Support.


Results

Search Results

Our searches yielded a total of 27,488 citations, of which 93 studies met the inclusion criteria (Figure 1). Of those, 74 studies were RCTs including 12,537 participants, and 19 were cRCTs including 1262 clusters, 3727 physicians, and 7690 patients. Sixty-four studies were published between 2010 and 2017, and the remaining 29 studies were published between 1999 and 2009.

Participants, Settings, and Countries of Origin

A total of 29 (31.1%) studies were conducted among primary care practitioners (general practitioners, family medicine practitioners/residents, and occupational physicians), 12 among surgeons (12.9%), 11 among general and internal medicine practitioners (11.8%), and 8 among pediatricians (8.6%; Figure 2 and Multimedia Appendices 4-10). Only 2 (2.2%) [83,96] of the 93 studies were conducted in low- to middle-income countries; all the remaining studies were conducted in high-income countries, with the majority in the United States (53.8%), Canada (10.8%), and Germany (5.4%; Multimedia Appendix 11). Fifty studies were carried out in hospital settings, 31 in university settings, 11 in primary care settings, and 1 in a mixed hospital and university setting.

Figure 1. Modified Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flow chart of the search results and study-selection process.
Figure 2. Number of ODE studies by specialty and type of learning. ODE: online and local area network–based digital education.

Interventions and Comparisons

A total of 61 studies compared ODE and self-directed/face-to-face learning. Self-directed learning was defined as self-learning through books and journals, and face-to-face learning was defined as learning through didactic classroom lectures and courses. Fourteen studies compared ODE and other types of ODE, 3 studies compared ODE and blended learning, and 15 studies compared blended learning and self-directed/face-to-face learning. Two studies used synchronous learning technology (video-conferencing systems) for training, and 39 studies used asynchronous learning technologies such as Web-based libraries/repositories of video modules, CD-ROM, emails, and online discussion groups to deliver the intervention. In the main review, we only compared ODE with self-directed/face-to-face learning and blended learning with self-directed/face-to-face learning; the rest of the comparisons are presented in Multimedia Appendix 3.

Risk of Bias in Randomized Controlled Trials and Cluster Randomized Controlled Trials

A total of 51 studies were considered to be at high risk of bias for at least one of the risk of bias domains (Figures 3 and 4). Six studies were rated as having a high risk of selection bias, 31 studies were rated as having a high risk of attrition bias due to a high drop-out rate (>20%), and 3 studies [103,107,108] had a high risk of reporting bias. One study, by Epstein et al [109], had a high risk of detection bias. Further, 25 studies had a high risk of “other biases.” Similarly, among the cRCTs, 12 studies had a high risk of bias for baseline imbalance, 8 studies had a high risk of bias for loss of clusters, and 3 studies had a high risk of bias for incorrect analyses. Risk of bias is described in detail in Multimedia Appendices 4 and 12.

Effects of Interventions by Outcomes

The characteristics of included studies categorized by participants’ specialty, outcomes, comparisons, and intervention types are presented in Multimedia Appendices 5-13. The educational content was heterogeneous among the included studies. Studies that compared ODE/blended ODE with self-directed or face-to-face learning are presented in the manuscript; for other comparisons, see Multimedia Appendix 3.

Primary Outcomes

Knowledge

A total of 54 studies assessed knowledge: 20 studies used questionnaires (open ended), 28 studies used multiple-choice questions, and 6 studies did not specify the type of instrument used to measure knowledge.

Online and Local Area Network–Based Digital Education Versus Self-Directed Learning

A total of 29 studies compared ODE and self-directed learning; of these, only 18 reported numerical data in a usable format (Figures 5 and 6). Eleven studies [43-50,61,63,64] presented incomplete data (missing means, SDs, or CIs), which could not be included in the data analysis. Seventeen studies (n=2107) [43-60] reported that ODE was significantly more effective than self-directed learning (small to large effect size, very low certainty evidence). Ten studies (n=796) [63-72] reported that ODE was as effective as self-directed learning (very low certainty evidence). Two studies [61,62] reported mixed results (very low certainty evidence).

Figure 3. Risk-of-bias summary for each included study.
Figure 4. Risk-of-bias item results presented as percentages across all included studies.
Figure 5. Comparison of change in knowledge scores (postintervention). ODE: online and local area network–based digital education; IV: inverse variance.
Online and Local Area Network–Based Digital Education Versus Face-to-Face Learning

Nine studies compared ODE with face-to-face learning; of these, only four studies [83-86] reported numerical data in a usable format (Figure 6). ODE improved physicians’ knowledge compared with face-to-face learning in studies by Fordis et al [81] and Pelayo-Alvarez et al [82]. Six studies (n=489) [83-88] reported that ODE may be as effective as face-to-face learning for improving knowledge scores. McLeod et al [89] reported that knowledge scores were higher with face-to-face learning than with ODE. Overall, empirical evidence suggests that ODE may make little difference to knowledge as compared with face-to-face learning. However, because the evidence was of very low quality, we are uncertain about the true estimate.

Blended Learning Versus Self-Directed/Face-to-Face Learning

Seven studies assessed learners’ knowledge, of which two studies [96,97] reported that blended learning was significantly more effective in improving physicians’ knowledge than self-directed/face-to-face learning (n=232; large effect size, very low certainty evidence; Figures 5 and 6). Five studies [96,98-101] reported that blended learning was as effective as self-directed/face-to-face learning for improving physicians’ knowledge scores (very low certainty evidence).

Skills

Twenty-one studies assessed participants’ skills: five studies used an objective structured clinical examination, five used questionnaires, four used a practical skills test/examination, three used checklists, and the remaining studies used other methods to assess skills (Multimedia Appendix 6).

Online and Local Area Network–Based Digital Education Versus Self-Directed Learning

Eight studies [54,65,73-78] compared ODE with self-directed learning, which included no intervention or text-based learning. Of these, only four studies [65,74,76,78] reported numerical information in a presentable format (Figures 7-9). Five studies [65,73-76] reported that ODE was significantly more effective than self-directed learning (low certainty evidence). Two studies [77,78] reported that ODE was as effective as self-directed learning (low certainty evidence). One study [54] reported that self-directed learning was more effective than ODE (low certainty evidence). Overall, empirical evidence from five studies [65,73-76] indicated that ODE interventions can improve physicians’ skills as compared to self-directed learning, while evidence from two [77,78] of the eight studies indicated that ODE may be as effective as self-directed learning for improving physicians’ postintervention skill scores.

Figure 6. Comparison of postintervention knowledge scores. ODE: online and local area network–based digital education; IV: inverse variance.
Figure 7. Comparison of change in skills scores (postintervention). ODE: online and local area network–based digital education; IV: inverse variance.
Figure 8. Comparison of postintervention skills scores. ODE: online and local area network–based digital education; IV: inverse variance.
Figure 9. Comparison of postintervention skill scores (dichotomous). ODE: online and local area network–based digital education; M-H: Mantel-Haenszel.
Online and Local Area Network–Based Digital Education Versus Face-to-Face Learning

Seven studies [84,87,90-94] compared ODE and face-to-face learning (classroom didactic lecture–based learning), of which six studies [84,87,90-93] reported that ODE was as effective as face-to-face learning in improving physicians’ skills (low certainty evidence). In one study [94], data were missing (Figures 7-9). Overall, empirical evidence from six studies suggests that ODE may be as effective as face-to-face learning in improving physicians’ skills.

Blended Learning Versus Self-Directed/Face-to-Face Learning

Six studies assessed skills [96,98,99,102-104]: Kulier et al [96] and Szmuilowicz et al [102] reported that blended learning may improve physicians’ skills compared with self-directed/face-to-face learning (Figure 8). Overall, empirical evidence from these two of the six studies suggests that blended learning may be more effective than self-directed/face-to-face learning in improving physicians’ skills (moderate to large effect size, low certainty evidence). Evidence from the other four studies [98,99,103,104] suggests that blended learning may be as effective as self-directed/face-to-face learning in improving physicians’ skills.

Attitude

Eight studies assessed participants’ attitude: six studies used questionnaires and two studies [66,105] used Likert scales.

Online and Local Area Network–Based Digital Education Versus Self-Directed Learning

Four studies [44,47,58,66] compared ODE with self-directed learning, of which only two studies [58,66] reported numerical data. Le et al [66] reported a change in mean attitude scores (SMD 0.46, 95% CI –0.38 to 1.30 [low certainty]), and Sullivan et al [58] assessed attitude posttest (SMD –0.01, 95% CI –0.28 to 0.26 [low certainty]). Harris et al [47] reported that ODE interventions may improve physicians’ attitude compared to self-directed learning. Le et al [66] reported that ODE may be as effective as self-directed learning for improving physicians’ attitude. Connolly et al [44] and Sullivan et al [58] reported mixed results. Overall, the empirical evidence from the four studies was mixed for this outcome.

Online and Local Area Network–Based Digital Education Versus Face-to-Face Learning

Two studies [82,95] compared ODE with face-to-face learning (classroom didactic lecture–based learning). Only Putnam et al [95] reported a change in learners’ attitude as a dichotomous outcome (RR 0.94, 95% CI 0.72-1.22 [small effect size, low quality]); the other study did not provide data for inclusion in the analysis. Overall, empirical evidence from the two studies suggests that ODE may be as effective as face-to-face learning for this outcome.

Blended Learning Versus Self-Directed/Face-to-Face Learning

Kulier et al [105] compared an integrated ODE course with face-to-face training on evidence-based medicine among obstetrics and gynecology residents; another study by Kulier et al [96] compared an integrated ODE course with a self-directed course on evidence-based medicine and assessed attitude scores at baseline only. Kulier et al [105] reported that blended learning may be as effective as face-to-face training for improving physicians’ attitude (RR 3.00, 95% CI 0.76-11.88).

Satisfaction

Sixteen studies assessed participants’ satisfaction, of which 10 studies used questionnaires and 6 studies used Likert scales.

Online and Local Area Network–Based Digital Education Versus Self-Directed Learning

Six studies [54,58,61,67,79,80] compared ODE with self-directed learning, which included no intervention or text-based learning. Of these, only three studies [58,67,79] reported numerical data in a usable format (Figures 10 and 11). Two studies [58,67] assessed mean satisfaction scores posttest (Figure 10). Bell et al [67] reported higher satisfaction for the ODE group than the self-directed learning group (SMD 0.68, 95% CI 0.36-0.99 [moderate effect size, low quality]). Sullivan et al [58] reported no difference in satisfaction scores between the groups (SMD 0.18, 95% CI –0.09 to 0.45). Similarly, Matzie et al [79] assessed posttest satisfaction as a dichotomous outcome (Figure 11) and reported higher satisfaction in four of the five domains (RR 1.13, 95% CI 1.03-1.23 [small effect size, low certainty evidence]). Overall, empirical evidence from two studies [67,79] suggests that ODE may be more effective than self-directed learning for physicians’ satisfaction (moderate to large effect size, low quality), whereas evidence from three [54,58,80] of the six studies suggests that ODE may be as effective as self-directed learning in improving physicians’ satisfaction. Gold et al [61] reported mixed results.

Online and Local Area Network–Based Digital Education Versus Face-to-Face Learning

Four studies [81,83,84,87] compared ODE with face-to-face learning. Of these, two studies [83,84] reported numerical data in a usable format (Figures 10 and 11). Overall, empirical evidence from two [83,87] of the four studies suggested that ODE may be more effective than face-to-face learning in improving physicians’ satisfaction (large effect size, low certainty evidence). Similarly, evidence from the other two studies [81,84] suggested that ODE may be as effective as face-to-face learning in improving physicians’ satisfaction.

Blended Learning Versus Self-Directed/Face-to-Face Learning

Three studies [98,100,106] assessed satisfaction (Figures 10 and 11). Ali et al [98] reported no difference in satisfaction between the groups, while Kronick et al [106] reported mixed effects. Platz et al [100] found higher satisfaction with face-to-face learning than with blended learning (moderate effect size, low certainty evidence). Overall, empirical evidence from the three studies [98,100,106] suggests that the interventions have mixed effects on learners’ satisfaction.

Secondary Outcomes
Practice or Behavior Change

Fourteen studies assessed practice or behavior change: four studies used questionnaires; four studies used hospital chart audits or case note reviews; and four other studies used a Likert scale [66], an Intimate Partner Violence Survey scale [64], and patient data from an administrative database and a scenario-based decision-support system [110]. Two studies [111,112] did not state the assessment tools used to measure practice or behavior change.

Online and Local Area Network–Based Digital Education Versus Self-Directed Learning

Fourteen studies [43,45,62,64,66,110-118] compared ODE with self-directed learning, which included no intervention or text-based learning. Of these, only nine studies [62,66,110-113,115,117,118] reported numerical data in a usable format (Figures 12-14).

Figure 10. Comparison of postintervention satisfaction scores. ODE: online and local area network–based digital education; IV: inverse variance.

Overall, empirical evidence from 7 [43,45,110,112,114,116,117] of the 14 studies suggests that ODE may be more effective than self-directed learning in improving physicians’ practice or behavior change (moderate to large effect size, very low certainty evidence). Evidence from 4 [62,66,113,115] of the 14 studies suggests that ODE may be as effective as self-directed learning in improving physicians’ practice or behavior change. Three studies [64,111,113] reported mixed results (Figures 12-14).

Online and Local Area Network–Based Digital Education Versus Face-to-Face Learning

Fordis et al [81] compared ODE on cholesterol management with face-to-face learning for physicians. The study reported that ODE (online CME) may be as effective as face-to-face learning (live CME) for improving physicians’ practice or behavior change (SMD 0.58, 95% CI –0.06 to 1.21; Figures 12 and 13).

Figure 11. Comparison of postintervention satisfaction scores (dichotomous). ODE: online and local area network–based digital education; M-H: Mantel-Haenszel.
Figure 12. Comparison of practice or behavior change scores (pre-post intervention). ODE: online and local area network–based digital education; IV: inverse variance.
Figure 13. Comparison of postintervention practice or behavior-change scores. ODE: online and local area network–based digital education; IV: inverse variance.
Figure 14. Comparison of postintervention practice or behavior change (dichotomous). ODE: online and local area network–based digital education; M-H: Mantel-Haenszel.
Blended Learning Versus Self-Directed/Face-to-Face Learning

Of the three studies [104,107,109] that assessed practice or behavior change, only two studies [107,109] reported numerical data; these studies reported mixed results for several behavior change outcomes (Figures 12 and 13). Midmer [104] reported that blended learning may be as effective as face-to-face learning in improving physicians’ practice or behavior change. Overall, empirical evidence from three studies suggests that blended learning may be as effective as self-directed/face-to-face learning.

Patient Outcomes for All Comparisons

Four studies assessed patient outcomes: these studies used hospital chart audits [65,114,119] and the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire-C30 [120].

Four studies [65,114,119,120] compared ODE or blended learning with self-directed/face-to-face learning. Butler [114] reported that ODE may be as effective as self-directed learning for improving patient outcomes. Dolan et al [65] reported that ODE may be more effective than self-directed learning for improving patient outcomes (RR 1.07, 95% CI 1.01-1.13). Girgis et al [120] reported that blended learning may be as effective as face-to-face learning for improving patient outcomes (SMD 0.00, 95% CI –0.20 to 0.20). However, Legare et al [119] found that blended learning may be more effective than self-directed learning for improving patient outcomes (RR 2.80, 95% CI 1.44-5.44). Overall, empirical evidence from two studies [114,120] suggests that ODE may be as effective as self-directed/face-to-face learning in improving patient outcomes. In contrast, empirical evidence from two studies [65,119] suggests that ODE/blended learning may be more effective than self-directed learning.

Cost

Three studies reported the cost of the ODE interventions [43,99,114]. Braido et al [43] performed inter- or intragroup comparisons and a cost-minimization analysis and reported pharmaceutical cost containment of 29% in the ODE group compared to the self-directed learning group; however, the spending on diagnostic investigations increased by 13.4% in the ODE group and reduced by 24.4% in the control group. Butler et al [114] presented cost information for the Stemming the Tide of Antibiotic Resistance educational program and reported greater reduction in costs postintervention in the ODE group compared to self-directed learning (intervention: £120.76; control: £2.21 per 1000 patients). Perkins et al [99] presented the cost of the Advanced Life Support training program and reported that faculty, catering, and facility costs were 47% lower for the blended learning group than for the conventional training group.

No studies reported on the adverse or unintended effects of ODE or blended interventions.


Discussion

Principal Findings

The review identified a variety of ODE interventions used for postregistration training of medical doctors; these interventions were used among diverse medical specialties, and a range of outcomes and comparators were reported. Because of the high heterogeneity, we were unable to pool the data quantitatively.

A total of 93 RCTs were included, of which 76 studies compared ODE/blended learning with self-directed/face-to-face learning (N=12,424). The results are presented in the main review, and the rest of the comparisons are presented in Multimedia Appendix 3. Among these, 21 studies with 2611 participants reported higher postintervention knowledge scores (small to large effect size, very low quality) for the intervention group, while 20 studies with 5496 participants reported no difference in knowledge scores between the groups. Seven studies with 794 participants reported higher postintervention skill scores in the intervention group (large effect size, low quality), while 13 studies with 4447 participants reported no difference in skill scores between the groups. One study with 99 participants reported higher postintervention attitude scores with the intervention (very low quality), while 4 studies with 305 participants reported no difference in attitude scores between the groups. Four studies with 677 participants reported higher postintervention physician satisfaction with the intervention (large effect size, low quality), while 6 studies with 478 participants reported no difference in satisfaction between the groups. Eight studies with 5051 participants reported higher postintervention practice or behavior change for the ODE group (small to moderate effect size, low quality), while 5 studies with 377 participants reported no difference in practice or behavior change between the groups. One study with 449 participants reported higher improvement in patient outcome, while 3 studies with 667 participants reported no difference in patient outcome between the groups. None of the included studies reported any unintended/adverse effects of the interventions on learners.

Overall, the effect of ODE on postintervention knowledge, skills, attitude, satisfaction, practice or behavior change, and patient outcomes was inconsistent and ranged mostly from no difference between the intervention groups to higher postintervention score in the ODE/blended learning group (small to large effect size, very low to low quality evidence). Moreover, the quality of evidence according to GRADE criteria was judged to be low for most outcomes.

George et al assessed the effectiveness of ODE for undergraduate health professionals [10] and reported that ODE is equivalent, and possibly superior, to traditional learning. We are not aware of any other systematic reviews of RCTs that have evaluated ODE for medical doctors. A similar review of evidence from nonrandomized studies on the effectiveness of ODE in surgical education among medical and dental students, surgeons, and oral health specialists [121] reported knowledge gain from ODE compared with active control (face-to-face learning) or no intervention (self-directed learning). Another review [122] evaluated the evidence of effectiveness from nonrandomized studies of online CME for general practitioners alone; assessed their satisfaction, knowledge, clinical practice, and patient outcomes; and reported improvement in satisfaction, knowledge, or practices. Jwayyed et al [123] evaluated the effectiveness of ODE from nonrandomized studies among diverse health care professionals and reported inconsistent results. Our review, in congruence with other reviews [121-124], compared the effect of ODE and blended learning with self-directed/face-to-face learning and other forms of ODE on physicians’ knowledge, skills, attitudes, satisfaction, clinical practice, and patient outcomes, but only included evidence from RCTs and cRCTs.

According to the GRADE criteria, the quality of the evidence was very low for knowledge and low for the other primary and secondary outcomes due to the unclear and high risk of bias, inconsistency, and publication bias. The majority of the studies did not provide information on randomization sequence generation and allocation concealment. Similarly, a high proportion of studies (75 studies, 82%; including comparisons presented in the Multimedia Appendix 12) did not provide sufficient information on the blinding of outcome assessors and were hence judged to have an unclear risk of bias. Thirty-one studies (33.3%) reported incomplete outcome data, 25 studies (26.9%) reported baseline differences in participant characteristics and were judged to be at high risk of bias, and 3 studies (3.3%) had a high risk of reporting bias.

This review has a few limitations. First, the included studies were heterogeneous in terms of the participants, learning content, and types of ODE (CME/CPD), limiting the opportunity to pool the results and run the preplanned subgroup analyses; hence, the review could not generate conclusive findings on the effectiveness of ODE. Second, only two of the included studies were from low- to middle-income countries, which could limit the completeness of the evidence and its generalizability to all settings. Third, we were unable to assess the cost-effectiveness of ODE as compared to self-directed/face-to-face learning, because none of the identified studies formally assessed it; only three studies assessed the cost and maintenance of the ODE intervention. Fourth, none of the studies specifically addressed any adverse effects of ODE.

This review has several strengths, including a thorough and reproducible search of the available literature, independent screening and data extraction, and critical appraisal of the literature conducted in accordance with the Cochrane Handbook for Systematic Reviews of Interventions.

Implications

Given the strength of the evidence for ODE and blended learning, especially for cognitive, procedural, and diagnostic training; evidence-based medicine training; and CME, CPD, and clinical practice guideline training, further high-quality studies of ODE interventions grounded in a validated learning theory are needed to establish their effectiveness, cost-effectiveness, and financial sustainability (return on investment). Specifically, these studies should address several unanswered questions, such as ODE’s theoretical underpinning, content validation, frequency, intensity, interactivity, technical features, fidelity, safety, cost-effectiveness, adaptability, acceptability, barriers/facilitators to its adoption, financial sustainability, and learners’ readiness to switch from classroom learning to complete ODE.

Conclusions

Our review found that ODE and blended learning comprise a group of heterogeneous interventions with different learning theories, learning content, comparators, and outcomes. These interventions were used to train medical doctors across specialties and career stages, including primary care practitioners, surgeons, residents, and other physicians. Although empirical evidence from a majority of studies shows that ODE and blended learning may improve practicing physicians’ knowledge, skills, attitude, satisfaction, practice or behavior change, and patient outcomes, a few other studies showed that they may be only comparable to self-directed/face-to-face learning. The quality of the evidence in these studies was low, and very low for knowledge. Therefore, further high-quality RCTs are required before firm conclusions about the efficacy of ODE for postregistration training of medical doctors can be drawn.

Acknowledgments

This study is part of PG’s doctoral study. He gratefully acknowledges the funding received from the National Healthcare Group, Singapore, for his doctoral research at the University of Adelaide.

This review was conducted in collaboration with the Health Workforce Department of the World Health Organization. We would also like to thank Mr Carl Gornitzki, Ms GunBrit Knutssön, and Mr Klas Moberg from the University Library, Karolinska Institutet, Sweden, for developing the search strategy. We gratefully acknowledge funding from the Lee Kong Chian School of Medicine, Nanyang Technological University, Singapore (eLearning education grant for health professionals).

Authors' Contributions

JC and NZ conceived the idea for the review. PP and OZ wrote the review. LC and PP provided methodological guidance, drafted some of the methodology-related sections, and critically revised the review. BK, PA, NS, MS, NZ, and CL provided comments on and edited the review.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Medline (Ovid) search strategy.

PDF File (Adobe PDF File), 47KB

Multimedia Appendix 2

Inclusion and exclusion criteria.

PDF File (Adobe PDF File), 43KB

Multimedia Appendix 3

Effects of the intervention (other comparisons).

PDF File (Adobe PDF File), 18KB

Multimedia Appendix 4

Risk of bias for cluster randomized controlled trials.

PDF File (Adobe PDF File), 26KB

Multimedia Appendix 5

Characteristics of included studies assessing knowledge.

PDF File (Adobe PDF File), 32KB

Multimedia Appendix 6

Characteristics of included studies assessing skills.

PDF File (Adobe PDF File), 24KB

Multimedia Appendix 7

Characteristics of included studies assessing attitude.

PDF File (Adobe PDF File), 18KB

Multimedia Appendix 8

Characteristics of included studies assessing satisfaction.

PDF File (Adobe PDF File), 20KB

Multimedia Appendix 9

Characteristics of included studies assessing practice or behavior change.

PDF File (Adobe PDF File), 24KB

Multimedia Appendix 10

Characteristics of included studies assessing patient outcomes.

PDF File (Adobe PDF File), 18KB

Multimedia Appendix 11

Online and local area network–based digital education intervention type by participants’ specialty, country of publication, and intervention duration.

PDF File (Adobe PDF File), 29KB

Multimedia Appendix 12

Risk of bias.

PDF File (Adobe PDF File), 17KB

Multimedia Appendix 13

Acronyms and definitions.

PDF File (Adobe PDF File), 14KB


  1. Bleakley A, Bligh J, Browne J. Medical Education For The Future. Berlin, Germany: Springer Science & Business Media; 2019.
  2. Chen N, Wang Y. Cyber schooling framework: Improving mobility and situated learning. Intl Journal of Eng Ed 2007;23:421-433 [FREE Full text] [CrossRef]
  3. Clark D. Psychological myths in e-learning. Med Teach 2002 Nov;24(6):598-604 [FREE Full text] [CrossRef] [Medline]
  4. Cook DA. The research we still are not doing: an agenda for the study of computer-based learning. Acad Med 2005 Jun;80(6):541-548. [Medline]
  5. Sangrà A, Vlachopoulos D, Cabrera N. Building an inclusive definition of e-learning: an approach to the conceptual framework. Int Rev Res Open Distance Learn 2012;13:a.
  6. Cook DA, Garside S, Levinson AJ, Dupras DM, Montori VM. What do we mean by web-based learning? A systematic review of the variability of interventions. Med Educ 2010 Aug;44(8):765-774. [CrossRef] [Medline]
  7. Duque G, Demontiero O, Whereat S, Gunawardene P, Leung O, Webster P, et al. Evaluation of a blended learning model in geriatric medicine: a successful learning experience for medical students. Australas J Ageing 2013 Jun;32(2):103-109. [CrossRef] [Medline]
  8. Makhdoom N, Khoshhal KI, Algaidi S, Heissam K, Zolaly MA. ‘Blended learning’ as an effective teaching and learning strategy in clinical medicine: a comparative cross-sectional university-based study. Journal of Taibah University Medical Sciences 2013 Apr;8(1):12-17. [CrossRef]
  9. Rowe M, Frantz J, Bozalek V. The role of blended learning in the clinical education of healthcare students: a systematic review. Med Teach 2012 Mar;34(4):e216-e221. [CrossRef] [Medline]
  10. George PP, Papachristou N, Belisario JM, Wang W, Wark PA, Cotic Z, et al. Online eLearning for undergraduates in health professions: A systematic review of the impact on knowledge, skills, attitudes and satisfaction. J Glob Health 2014 Jun;4(1):010406 [FREE Full text] [CrossRef] [Medline]
  11. The Complexities of Physician Supply and Demand: Projections from 2013 to 2025. Washington, DC: Association of American Medical Colleges; 2015.   URL: https:/​/www.​aamc.org/​download/​426248/​data/​thecomplexitiesofphysiciansupplyanddemandprojectionsfrom2013to2.​pdf [accessed 2019-02-12] [WebCite Cache]
  12. Health and Care Professions Council. 2015. Continuing professional development (CPD)   URL: https://www.hcpc-uk.org/cpd/ [accessed 2019-02-12] [WebCite Cache]
  13. Ally M. Foundations of educational theory for online learning. In: The Theory and Practice of Online Learning. Athabasca, Alberta: Athabasca University Press; 2004:15-44.
  14. Rovai A. The International Review of Research in Open and Distributed Learning. 2002 Apr 01. Building Sense of Community at a Distance   URL: http://www.irrodl.org/index.php/irrodl/article/view/79 [accessed 2019-02-12] [WebCite Cache]
  15. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Internet-based learning in the health professions: a meta-analysis. JAMA 2008 Sep 10;300(10):1181-1196. [CrossRef] [Medline]
  16. Tudor CL, Riboli-Sasco E, Marcano BJ, Nikolaou CK, Majeed A, Zary N, et al. PROSPERO CRD42015029786. 2015. Mobile learning for delivering health professional education [Cochrane Protocol]   URL: http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42015029786 [accessed 2019-02-12] [WebCite Cache]
  17. Hervatis V, Kyaw B, Semwal M, Dunleavy G, Tudor CL, Zary N, et al. PROSPERO CRD42016045679. 2018. Offline and computer-based eLearning interventions for medical students' education [Cochrane Protocol]   URL: http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42016045679 [accessed 2019-02-12] [WebCite Cache]
  18. Kononowicz A, Woodham L, Georg C, Edelbring S, Stathakarou N, Davies D, et al. PROSPERO CRD42016045912. 2016. Virtual patient simulations for health professional education [Cochrane Protocol]   URL: http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42016045912 [accessed 2019-02-12] [WebCite Cache]
  19. Saxena N, Kyaw B, Vseteckova J, Dev P, Paul P, Lim K, et al. PROSPERO CRD42016045470. 2016. Virtual reality environments for health professional education [Cochrane Protocol]   URL: http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42016045470 [accessed 2019-02-12] [WebCite Cache]
  20. Gentry S, L'Estrade E, Gauthier A, Alvarez J, Wortley D, van Rijswijk J, et al. PROSPERO CRD42016045997. 2016. Serious Gaming and Gamification interventions for health professional education [Cochrane Protocol]   URL: http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42016045997 [accessed 2019-02-12] [WebCite Cache]
  21. Semwal M, Wahabi H, Posadzki P, Divakar U, Lim K, Audouard-Marzin Y, et al. PROSPERO CRD42017083206. 2018. Offline and computer-based eLearning interventions for medical doctors' education [Cochrane protocol]   URL: http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42017083206 [accessed 2019-02-12] [WebCite Cache]
  22. Allison JJ, Kiefe CI, Wall T, Casebeer L, Ray MN, Spettell CM, et al. Multicomponent Internet continuing medical education to promote chlamydia screening. Am J Prev Med 2005 Apr;28(3):285-290. [CrossRef] [Medline]
  23. Bernstein HH, Dhepyasuwan N, Connors K, Volkan K, Serwint JR, CORNET Investigators. Evaluation of a national Bright Futures oral health curriculum for pediatric residents. Acad Pediatr 2013 Mar;13(2):133-139. [CrossRef] [Medline]
  24. Grover S, Currier PF, Elinoff JM, Katz JT, McMahon GT. Improving residents' knowledge of arterial and central line placement with a web-based curriculum. J Grad Med Educ 2010 Dec;2(4):548-554 [FREE Full text] [CrossRef] [Medline]
  25. Kerfoot BP, Baker HE, Koch MO, Connelly D, Joseph DB, Ritchey ML. Randomized, controlled trial of spaced education to urology residents in the United States and Canada. J Urol 2007 Apr;177(4):1481-1487. [CrossRef] [Medline]
  26. Marsh-Tootle WL, McGwin G, Kohler CL, Kristofco RE, Datla RV, Wall TC. Efficacy of a web-based intervention to improve and sustain knowledge and screening for amblyopia in primary care settings. Invest Ophthalmol Vis Sci 2011 Sep 09;52(10):7160-7167 [FREE Full text] [CrossRef] [Medline]
  27. Sangvai S, Mahan JD, Lewis KO, Pudlo N, Suresh S, McKenzie LB. The impact of an interactive Web-based module on residents' knowledge and clinical practice of injury prevention. Clin Pediatr (Phila) 2012 Feb 10;51(2):165-174. [CrossRef] [Medline]
  28. Saxon D, Pearson AT, Wu P. Hyperlink-Embedded Journal Articles Improve Statistical Knowledge and Reader Satisfaction. J Grad Med Educ 2015 Dec;7(4):654-657 [FREE Full text] [CrossRef] [Medline]
  29. Schroter S, Jenkins RD, Playle RA, Walsh KM, Probert C, Kellner T, et al. Evaluation of an online interactive Diabetes Needs Assessment Tool (DNAT) versus online self-directed learning: a randomised controlled trial. BMC Med Educ 2011 Jun 16;11(1):35-39 [FREE Full text] [CrossRef] [Medline]
  30. Shaw TJ, Pernar LI, Peyre SE, Helfrick JF, Vogelgesang KR, Graydon-Baker E, et al. Impact of online education on intern behaviour around joint commission national patient safety goals: a randomised trial. BMJ Qual Saf 2012 Oct 16;21(10):819-825 [FREE Full text] [CrossRef] [Medline]
  31. Pape-Koehler C, Immenroth M, Sauerland S, Lefering R, Lindlohr C, Toaspern J, et al. Multimedia-based training on Internet platforms improves surgical performance: a randomized controlled trial. Surg Endosc 2013 May;27(5):1737-1747 [FREE Full text] [CrossRef] [Medline]
  32. Ruf D, Berner M, Kriston L, Lohmann M, Mundle G, Lorenz G, et al. Cluster-randomized controlled trial of dissemination strategies of an online quality improvement programme for alcohol-related disorders. Alcohol Alcohol 2010 Nov 04;45(1):70-78. [CrossRef] [Medline]
  33. Talib N, Onikul R, Filardi D, Simon S, Sharma V. Effective educational instruction in preventive oral health: hands-on training versus web-based training. Pediatrics 2010 Mar 01;125(3):547-553. [CrossRef] [Medline]
  34. Yardley L, Douglas E, Anthierens S, Tonkin-Crine S, O'Reilly G, Stuart B, et al. Evaluation of a web-based intervention to reduce antibiotic prescribing for LRTI in six European countries: quantitative process analysis of the GRACE/INTRO randomised controlled trial. Implement Sci 2013 Nov 15;8:134 [FREE Full text] [CrossRef] [Medline]
  35. Weston CM, Sciamanna CN, Nash DB. Evaluating online continuing medical education seminars: evidence for improving clinical practices. Am J Med Qual 2008 Nov;23(6):475-483. [CrossRef] [Medline]
  36. Franchi C, Tettamanti M, Djade CD, Pasina L, Mannucci PM, Onder G, ELICADHE Investigators. E-learning in order to improve drug prescription for hospitalized older patients: a cluster-randomized controlled study. Br J Clin Pharmacol 2016 Dec 05;82(1):53-63 [FREE Full text] [CrossRef] [Medline]
  37. Kerfoot B, Turchin A, Breydo E, Gagnon D, Conlin P. An online spaced-education game among clinicians improves their patients' time to blood pressure control: a randomized controlled trial. Circ Cardiovasc Qual Outcomes 2014 May;7(3):468-474 [FREE Full text] [CrossRef] [Medline]
  38. George PP, Toon E, Hadadgar A, Jirwe M, Saxena N, Lim K, et al. PROSPERO CRD42016045571. 2016. Online- and local area network (LAN)-based eLearning interventions for medical doctors' education [Cochrane Protocol]   URL: http://www.crd.york.ac.uk/PROSPERO/display_record.php?ID=CRD42016045571 [accessed 2019-02-12] [WebCite Cache]
  39. Higgins J, Green S. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.   URL: http://handbook-5-1.cochrane.org/ [accessed 2019-02-07] [WebCite Cache]
  40. Car J, Carlstedt-Duke J, Tudor CL, Posadzki P, Zary N, Atun R, et al. Digital education for health professions: methods for overarching evidence syntheses. J Med Internet Res 2019. [CrossRef]
  41. Puffer S, Torgerson D, Watson J. Evidence for risk of bias in cluster randomised trials: review of recent trials published in three general medical journals. BMJ 2003 Oct 04;327(7418):785-789 [FREE Full text] [CrossRef] [Medline]
  42. Schünemann H, Oxman A, Higgins J, Vist G, Glasziou P, Guyatt G. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 (updated March 2011), The Cochrane Collaboration. 2008 Sep 26. Presenting results and 'Summary of Findings' tables   URL: https://onlinelibrary.wiley.com/doi/abs/10.1002/9780470712184.ch11 [accessed 2019-02-12] [WebCite Cache]
  43. Braido F, Comaschi M, Valle I, Delgado L, Coccini A, Guerreras P, ARGA Study Group, EAACI/CME Committee. Knowledge and health care resource allocation: CME/CPD course guidelines-based efficacy. Eur Ann Allergy Clin Immunol 2012 Oct;44(5):193-199. [Medline]
  44. Connolly AM, Cunningham C, Sinclair AJ, Rao A, Lonergan A, Bye AM. 'Beyond Milestones': a randomised controlled trial evaluating an innovative digital resource teaching quality observation of normal child development. J Paediatr Child Health 2014 May 23;50(5):393-398. [CrossRef] [Medline]
  45. Farah SS, Winter M, Appu S. Helping doctors utilize the prostate-specific antigen effectively: an online randomized controlled trial (The DUPE trial). ANZ J Surg 2012 Sep 20;82(9):633-638. [CrossRef] [Medline]
  46. Gyorki DE, Shaw T, Nicholson J, Baker C, Pitcher M, Skandarajah A, et al. Improving the impact of didactic resident training with online spaced education. ANZ J Surg 2013 Jun 26;83(6):477-480. [CrossRef] [Medline]
  47. Harris J, Kutob R, Surprenant Z, Maiuro R, Delate T. Can Internet-based education improve physician confidence in dealing with domestic violence? Fam Med 2002 Apr;34(4):287-292. [Medline]
  48. Houwink EJ, van Teeffelen SR, Muijtjens AM, Henneman L, Jacobi F, van Luijk SJ, et al. Sustained effects of online genetics education: a randomized controlled trial on oncogenetics. Eur J Hum Genet 2013 Aug 14;22(3):310-316. [CrossRef]
  49. Kutob R, Senf J, Harris J. Teaching culturally effective diabetes care: results of a randomized controlled trial. Fam Med 2009 Mar;41(3):167-174 [FREE Full text] [Medline]
  50. Thompson J, Lebwohl B, Syngal S, Kastrinos F. Tu1482 Assessment of U.S. Gastroenterology Trainees' Knowledge of Quality Performance Measures for Colonoscopy and the Impact of a Web-Based Intervention. Gastrointest Endosc 2011 Apr;73(4):AB423 [FREE Full text] [CrossRef]
  51. World Health Organization. Geneva: WHO; 2013. Global health workforce shortage to reach 12.9 million in coming decades   URL: https://www.who.int/mediacentre/news/releases/2013/health-workforce-shortage/en/ [accessed 2019-02-12] [WebCite Cache]
  52. Alfieri J, Portelance L, Souhami L, Steinert Y, McLeod P, Gallant F, et al. Development and impact evaluation of an e-learning radiation oncology module. Int J Radiat Oncol Biol Phys 2012 Mar 01;82(3):e573-e580. [CrossRef] [Medline]
  53. Chang TP, Pham PK, Sobolewski B, Doughty CB, Jamal N, Kwan KY, et al. Pediatric emergency medicine asynchronous e-learning: a multicenter randomized controlled Solomon four-group study. Acad Emerg Med 2014 Aug;21(8):912-919 [FREE Full text] [CrossRef] [Medline]
  54. Claxton R, Marks S, Buranosky R, Rosielle D, Arnold RM. The educational impact of weekly e-mailed fast facts and concepts. J Palliat Med 2011 Apr;14(4):475-481. [CrossRef] [Medline]
  55. Cullinan S, O'Mahony D, Byrne S. Use of an e-Learning Educational Module to Better Equip Doctors to Prescribe for Older Patients: A Randomised Controlled Trial. Drugs Aging 2017 Dec 4;34(5):367-374. [CrossRef] [Medline]
  56. Hearty T, Maizels M, Pring M, Mazur J, Liu R, Sarwark J, et al. Orthopaedic resident preparedness for closed reduction and pinning of pediatric supracondylar fractures is improved by e-learning: a multisite randomized controlled study. J Bone Joint Surg Am 2013 Sep 04;95(17):e1261-e1267. [CrossRef] [Medline]
  57. Satterwhite T, Son J, Carey J, Zeidler K, Bari S, Gurtner G, et al. Microsurgery education in residency training: validating an online curriculum. Ann Plast Surg 2012 Apr;68(4):410-414. [CrossRef] [Medline]
  58. Sullivan MD, Gaster B, Russo J, Bowlby L, Rocco N, Sinex N, et al. Randomized trial of web-based training about opioid therapy for chronic pain. Clin J Pain 2010;26(6):512-517. [CrossRef] [Medline]
  59. Viguier M, Rist S, Aubin F, Leccia M, Richard M, Esposito-Farèse M, Club Rhumatismes et Inflammation. Online training on skin cancer diagnosis in rheumatologists: results from a nationwide randomized web-based survey. PLoS One 2015 May 21;10(5):e0127564 [FREE Full text] [CrossRef] [Medline]
  60. Westmoreland GR, Counsell SR, Tu W, Wu J, Litzelman DK. Web-based training in geriatrics for medical residents: a randomized controlled trial using standardized patients to assess outcomes. J Am Geriatr Soc 2010 Jun;58(6):1163-1169. [CrossRef] [Medline]
  61. Gold J, Begg W, Fullerton D, Mathisen D, Olinger G, Orringer M, et al. Successful implementation of a novel internet hybrid surgery curriculum: the early phase outcome of thoracic surgery prerequisite curriculum e-learning project. Ann Surg 2004 Sep;240(3):499-507; discussion 507. [Medline]
  62. Stewart M, Marshall JN, Ostbye T, Feightner JW, Brown JB, Harris S, et al. Effectiveness of case-based on-line learning of evidence-based practice guidelines. Fam Med 2005 Feb;37(2):131-138 [FREE Full text] [Medline]
  63. Enders F, Diener-West M. Methods of learning in statistics education: a randomized trial of public health graduate students. Stat Educ Res J 2006;5(1):5.
  64. Short LM, Surprenant ZJ, Harris JM. A community-based trial of an online intimate partner violence CME program. Am J Prev Med 2006 Feb;30(2):181-185 [FREE Full text] [CrossRef] [Medline]
  65. Dolan BM, Yialamas MA, McMahon GT. A Randomized Educational Intervention Trial to Determine the Effect of Online Education on the Quality of Resident-Delivered Care. J Grad Med Educ 2015 Sep;7(3):376-381 [FREE Full text] [CrossRef] [Medline]
  66. Le TT, Rait MA, Jarlsberg LG, Eid NS, Cabana MD. A randomized controlled trial to evaluate the effectiveness of a distance asthma learning program for pediatricians. J Asthma 2010 Apr 15;47(3):245-250. [CrossRef] [Medline]
  67. Bell DS, Fonarow GC, Hays RD, Mangione CM. Self-study from web-based and printed guideline materials. A randomized, controlled trial among resident physicians. Ann Intern Med 2000 Jun 20;132(12):938-946. [Medline]
  68. Butzlaff M, Vollmar HC, Floer B, Koneczny N, Isfort J, Lange S. Learning with computerized guidelines in general practice? A randomized controlled trial. Fam Pract 2004 Apr;21(2):183-188. [Medline]
  69. Cabrera-Muffly C, Bryson PC, Sykes KJ, Shnayder Y. Free online otolaryngology educational modules: a pilot study. JAMA Otolaryngol Head Neck Surg 2015 Apr 01;141(4):324-328. [CrossRef] [Medline]
  70. Chung S, Mandl KD, Shannon M, Fleisher GR. Efficacy of an educational Web site for educating physicians about bioterrorism. Acad Emerg Med 2004 Feb;11(2):143-148 [FREE Full text] [Medline]
  71. Ferguson MK, Thompson K, Huisingh-Scheetz M, Farnan J, Hemmerich J, Acevedo J, et al. The Impact of a Frailty Education Module on Surgical Resident Estimates of Lobectomy Risk. Ann Thorac Surg 2015 Jul;100(1):235-241 [FREE Full text] [CrossRef] [Medline]
  72. Wang CL, Schopp JG, Kani K, Petscavage-Thomas JM, Zaidi S, Hippe DS, et al. Prospective randomized study of contrast reaction management curricula: computer-based interactive simulation versus high-fidelity hands-on simulation. Eur J Radiol 2013 Dec;82(12):2247-2252. [CrossRef] [Medline]
  73. Hymowitz N, Schwab JV, Haddock CK, Pyle SA, Schwab LM. The pediatric residency training on tobacco project: four-year resident outcome findings. Prev Med 2007 Dec;45(6):481-490 [FREE Full text] [CrossRef] [Medline]
  74. Koppe H, van de Mortel TF, Ahern CM. How effective and acceptable is Web 2.0 Balint group participation for general practitioners and general practitioner registrars in regional Australia? A pilot study. Aust J Rural Health 2016 Feb 26;24(1):16-22 [FREE Full text] [CrossRef] [Medline]
  75. Lee A, Mader E, Morley C. Teaching cross-cultural communication skills online: a multi-method evaluation. Fam Med 2015 Apr;47(4):302-308 [FREE Full text] [Medline]
  76. Macrae HM, Regehr G, McKenzie M, Henteleff H, Taylor M, Barkun J, et al. Teaching practicing surgeons critical appraisal skills with an Internet-based journal club: A randomized, controlled trial. Surgery 2004 Sep;136(3):641-646. [CrossRef] [Medline]
  77. Conroy EJ, Kirkham JJ, Bellis JR, Peak M, Smyth RL, Williamson PR, et al. A pilot randomised controlled trial to assess the utility of an e-learning package that trains users in adverse drug reaction causality. Int J Pharm Pract 2015 Dec 29;23(6):447-455 [FREE Full text] [CrossRef] [Medline]
  78. Edrich T, Stopfkuchen-Evans M, Scheiermann P, Heim M, Chan W, Stone MB, et al. A Comparison of Web-Based with Traditional Classroom-Based Training of Lung Ultrasound for the Exclusion of Pneumothorax. Anesth Analg 2016 Dec;123(1):123-128. [CrossRef] [Medline]
  79. Matzie KA, Kerfoot BP, Hafler JP, Breen EM. Spaced education improves the feedback that surgical residents give to medical students: a randomized trial. Am J Surg 2009 Feb;197(2):252-257. [CrossRef] [Medline]
  80. Pernar LI, Beleniski F, Rosen H, Lipsitz S, Hafler J, Breen E. Spaced education faculty development may not improve faculty teaching performance ratings in a surgery department. J Surg Educ 2012 Jan;69(1):52-57. [CrossRef] [Medline]
  81. Fordis M, King JE, Ballantyne CM, Jones PH, Schneider KH, Spann SJ, et al. Comparison of the instructional efficacy of Internet-based CME with live interactive CME workshops: a randomized controlled trial. JAMA 2005 Sep 07;294(9):1043-1051. [CrossRef] [Medline]
  82. Pelayo-Alvarez M, Perez-Hoyos S, Agra-Varela Y. Clinical effectiveness of online training in palliative care of primary care physicians. J Palliat Med 2013 Oct;16(10):1188-1196 [FREE Full text] [CrossRef] [Medline]
  83. Hemmati N, Omrani S, Hemmati N. A comparison of Internet-based learning and traditional classroom lecture to learn CPR for continuing medical education. Turk Online J Distance Educ 2013;14:256-265 [FREE Full text]
  84. Chenkin J, Lee S, Huynh T, Bandiera G. Procedures can be learned on the Web: a randomized study of ultrasound-guided vascular access training. Acad Emerg Med 2008 Oct;15(10):949-954 [FREE Full text] [CrossRef] [Medline]
  85. Hadley J, Kulier R, Zamora J, Coppus SF, Weinbrenner S, Meyerrose B, et al. Effectiveness of an e-learning course in evidence-based medicine for foundation (internship) training. J R Soc Med 2010 Jul 03;103(7):288-294 [FREE Full text] [CrossRef] [Medline]
  86. Hugenholtz NIR, de Croon EM, Smits PB, van Dijk FJH, Nieuwenhuijsen K. Effectiveness of e-learning in continuing medical education for occupational physicians. Occup Med (Lond) 2008 Aug 20;58(5):370-372 [FREE Full text] [CrossRef] [Medline]
  87. Bello G, Pennisi MA, Maviglia R, Maggiore SM, Bocci MG, Montini L, et al. Online vs live methods for teaching difficult airway management to anesthesiology residents. Intensive Care Med 2005 Apr 8;31(4):547-552. [CrossRef] [Medline]
  88. Chan D, Leclair K, Kaczorowski J. Problem-based small-group learning via the Internet among community family physicians: a randomized controlled trial. MD Comput 1999;16(3):54-58. [Medline]
  89. McLeod RS, MacRae HM, McKenzie ME, Victor JC, Brasel KJ, Evidence Based Reviews in Surgery Steering Committee. A moderated journal club is more effective than an Internet journal club in teaching critical appraisal skills: results of a multicenter randomized controlled trial. J Am Coll Surg 2010 Dec;211(6):769-776. [CrossRef] [Medline]
  90. Barthelemy FX, Segard J, Fradin P, Hourdin N, Batard E, Pottier P, et al. ECG interpretation in Emergency Department residents: an update and e-learning as a resource to improve skills. Eur J Emerg Med 2017 Apr;24(2):149-156. [CrossRef] [Medline]
  91. Houwink EJF, Muijtjens AMM, van Teeffelen SR, Henneman L, Rethans JJ, Jacobi F, et al. Effect of comprehensive oncogenetics training interventions for general practitioners, evaluated at multiple performance levels. PLoS One 2015 Apr 2;10(4):e0122648 [FREE Full text] [CrossRef] [Medline]
  92. Shariff U, Kullar N, Haray PN, Dorudi S, Balasubramanian SP. Multimedia educational tools for cognitive surgical skill acquisition in open and laparoscopic colorectal surgery: a randomized controlled trial. Colorectal Dis 2015 May 21;17(5):441-450. [CrossRef] [Medline]
  93. Wilkinson JS, Barake W, Smith C, Thakrar A, Johri AM. Limitations of Condensed Teaching Strategies to Develop Hand-Held Cardiac Ultrasonography Skills in Internal Medicine Residents. Can J Cardiol 2016 Dec;32(8):1034-1037. [CrossRef] [Medline]
  94. Schmitz CC, Braman JP, Turner N, Heller S, Radosevich DM, Yan Y, et al. Learning by (video) example: a randomized study of communication skills training for end-of-life and error disclosure family care conferences. Am J Surg 2016 Nov;212(5):996-1004. [CrossRef] [Medline]
  95. Putnam L, Pham D, Etchegaray J, Ostovar-Kermani T, Ottosen M, Thomas E, et al. How Should Surgical Residents Be Educated about Patient Safety in the Operating Room: A Pilot Randomized Trial. J Am Coll Surg 2015 Oct;221(4):S127-S128 [FREE Full text] [CrossRef]
  96. Kulier R, Gülmezoglu AM, Zamora J, Plana MN, Carroli G, Cecatti JG, et al. Effectiveness of a clinically integrated e-learning course in evidence-based medicine for reproductive health training: a randomized trial. JAMA 2012 Dec 05;308(21):2218-2243. [CrossRef] [Medline]
  97. Sharma V, Chamos C, Valencia O, Meineri M, Fletcher SN. The impact of internet and simulation-based training on transoesophageal echocardiography learning in anaesthetic trainees: a prospective randomised study. Anaesthesia 2013 Jun;68(6):621-627 [FREE Full text] [CrossRef] [Medline]
  98. Ali J, Sorvari A, Camera S, Kinach M, Mohammed S, Pandya A. Telemedicine as a potential medium for teaching the advanced trauma life support (ATLS) course. J Surg Educ 2013 Mar;70(2):258-264. [CrossRef] [Medline]
  99. Perkins GD, Kimani PK, Bullock I, Clutton-Brock T, Davies RP, Gale M, Electronic Advanced Life Support Collaborators. Improving the efficiency of advanced life support training: a randomized, controlled trial. Ann Intern Med 2012 Jul 03;157(1):19-28. [CrossRef] [Medline]
  100. Platz E, Goldflam K, Mennicke M, Parisini E, Christ M, Hohenstein C. Comparison of Web-versus classroom-based basic ultrasonographic and EFAST training in 2 European hospitals. Ann Emerg Med 2010 Dec;56(6):660-667. [CrossRef] [Medline]
  101. Vollmar HC, Mayer H, Ostermann T, Butzlaff ME, Sandars JE, Wilm S, et al. Knowledge transfer for the management of dementia: a cluster randomised trial of blended learning in general practice. Implement Sci 2010 Jan 04;5(1):1-10 [FREE Full text] [CrossRef] [Medline]
  102. Szmuilowicz E, Neely KJ, Sharma RK, Cohen ER, McGaghie WC, Wayne DB. Improving residents' code status discussion skills: a randomized trial. J Palliat Med 2012 Jul;15(7):768-774 [FREE Full text] [CrossRef] [Medline]
  103. Ngamruengphong S, Horsley-Silva J, Hines S, Pungpapong S, Patel T, Keaveny A. Educational Intervention in Primary Care Residents' Knowledge and Performance of Hepatitis B Vaccination in Patients with Diabetes Mellitus. South Med J 2015 Sep;108(9):510-515. [CrossRef] [Medline]
  104. Midmer D, Kahan M, Marlow B. Effects of a distance learning program on physicians' opioid- and benzodiazepine-prescribing skills. J Contin Educ Health Prof 2006;26(4):294-301. [CrossRef] [Medline]
  105. Kulier R, Coppus SF, Zamora J, Hadley J, Malick S, Das K, et al. The effectiveness of a clinically integrated e-learning course in evidence-based medicine: a cluster randomised controlled trial. BMC Med Educ 2009 May 12;9(1):21-27 [FREE Full text] [CrossRef] [Medline]
  106. Kronick J, Blake C, Munoz E, Heilbrunn L, Dunikowski L, Milne WK. Improving on-line skills and knowledge. A randomized trial of teaching rural physicians to use on-line medical information. Can Fam Physician 2003 Mar;49:312-317 [FREE Full text] [Medline]
  107. Daetwyler CJ, Cohen DG, Gracely E, Novack DH. eLearning to enhance physician patient communication: A pilot test of “doc.com” and “WebEncounter” in teaching bad news delivery. Med Teach 2010 Aug 26;32(9):e381-e390. [CrossRef]
  108. Estrada CA, Safford MM, Salanitro AH, Houston TK, Curry W, Williams JH, et al. A web-based diabetes intervention for physician: a cluster-randomized effectiveness trial. Int J Qual Health Care 2011 Dec 10;23(6):682-689 [FREE Full text] [CrossRef] [Medline]
  109. Epstein JN, Langberg JM, Lichtenstein PK, Kolb R, Altaye M, Simon JO. Use of an Internet portal to improve community-based pediatric ADHD care: a cluster randomized trial. Pediatrics 2011 Nov 17;128(5):e1201-e1208 [FREE Full text] [CrossRef] [Medline]
  110. Dayton CS, Ferguson JS, Hornick DB, Peterson MW. Evaluation of an Internet-based decision-support system for applying the ATS/CDC guidelines for tuberculosis preventive therapy. Med Decis Making 2000 Jul;20(1):1-6. [CrossRef] [Medline]
  111. Gerbert B, Bronstone A, Maurer T, Berger T, McPhee SJ, Caspers N. The effectiveness of an Internet-based tutorial in improving primary care physicians' skin cancer triage skills. J Cancer Educ 2002;17(1):7-11. [CrossRef] [Medline]
  112. Meeker D, Linder JA, Fox CR, Friedberg MW, Persell SD, Goldstein NJ, et al. Effect of Behavioral Interventions on Inappropriate Antibiotic Prescribing Among Primary Care Practices: A Randomized Clinical Trial. JAMA 2016 Feb 09;315(6):562-570. [CrossRef] [Medline]
  113. Bell RA, McDermott H, Fancher TL, Green MJ, Day FC, Wilkes MS. Impact of a randomized controlled educational trial to improve physician practice behaviors around screening for inherited breast cancer. J Gen Intern Med 2015 Mar 2;30(3):334-341 [FREE Full text] [CrossRef] [Medline]
  114. Butler CC, Simpson SA, Dunstan F, Rollnick S, Cohen D, Gillespie D, et al. Effectiveness of multifaceted educational programme to reduce antibiotic dispensing in primary care: practice based randomised controlled trial. BMJ 2012 Feb 02;344:d8173 [FREE Full text] [CrossRef] [Medline]
  115. Curtis JR, Westfall AO, Allison J, Becker A, Melton ME, Freeman A, et al. Challenges in improving the quality of osteoporosis care for long-term glucocorticoid users: a prospective randomized trial. Arch Intern Med 2007 Mar 26;167(6):591-596. [CrossRef] [Medline]
  116. Feng B, Srinivasan M, Hoffman JR, Rainwater JA, Griffin E, Dragojevic M, et al. Physician communication regarding prostate cancer screening: analysis of unannounced standardized patient visits. Ann Fam Med 2013;11(4):315-323 [FREE Full text] [CrossRef] [Medline]
  117. Little P, Stuart B, Francis N, Douglas E, Tonkin-Crine S, Anthierens S, et al. Effects of internet-based training on antibiotic prescribing rates for acute respiratory-tract infections: a multinational, cluster, randomised, factorial, controlled trial. Lancet 2013 Oct 5;382(9899):1175-1182 [FREE Full text] [CrossRef] [Medline]
  118. Xiao Y, Seagull FJ, Bochicchio GV, Guzzo JL, Dutton RP, Sisley A, et al. Video-based training increases sterile-technique compliance during central venous catheter insertion. Crit Care Med 2007 May;35(5):1302-1306. [CrossRef] [Medline]
  119. Légaré F, Labrecque M, Cauchon M, Castel J, Turcotte S, Grimshaw J. Training family physicians in shared decision-making to reduce the overuse of antibiotics in acute respiratory infections: a cluster randomized trial. CMAJ 2012 Sep 18;184(13):E726-E734 [FREE Full text] [CrossRef] [Medline]
  120. Girgis A, Cockburn J, Butow P, Bowman D, Schofield P, Stojanovski E, et al. Improving patient emotional functioning and psychological morbidity: evaluation of a consultation skills training program for oncologists. Patient Educ Couns 2009 Dec;77(3):456-462. [CrossRef] [Medline]
  121. Jayakumar N, Brunckhorst O, Dasgupta P, Khan MS, Ahmed K. e-Learning in Surgical Education: A Systematic Review. J Surg Educ 2015 Nov;72(6):1145-1157. [CrossRef] [Medline]
  122. Thepwongsa I, Kirby CN, Schattner P, Piterman L. Online continuing medical education (CME) for GPs: does it work? A systematic review. Aust Fam Physician 2014 Oct;43(10):717-721 [FREE Full text] [Medline]
  123. Jwayyed S, Stiffler KA, Wilber ST, Southern A, Weigand J, Bare R, et al. Technology-assisted education in graduate medical education: a review of the literature. Int J Emerg Med 2011 Aug 08;4(1):51 [FREE Full text] [CrossRef] [Medline]
  124. Liu Q, Peng W, Zhang F, Hu R, Li Y, Yan W. The Effectiveness of Blended Learning in Health Professions: Systematic Review and Meta-Analysis. J Med Internet Res 2016 Jan 04;18(1):e2 [FREE Full text] [CrossRef] [Medline]


ATLS: Advanced Trauma Life Support
CME: continuing medical education
CPD: continuing professional development
cRCT: cluster-randomized controlled trial
EBM: evidence-based medicine
GRADE: Grading of Recommendations, Assessment, Development and Evaluations
ICT: information and communications technology
LAN: local area network
ODE: online and local area network–based digital education
OSCE: objective structured clinical examination
RCT: randomized controlled trial
RR: risk ratio
SMD: standardized mean difference


Edited by A Marusic; submitted 03.01.19; peer-reviewed by M Vidak, R Tokalic; comments to author 21.01.19; revised version received 29.01.19; accepted 30.01.19; published 25.02.19

Copyright

©Pradeep Paul George, Olena Zhabenko, Bhone Myint Kyaw, Panagiotis Antoniou, Pawel Posadzki, Nakul Saxena, Monika Semwal, Lorainne Tudor Car, Nabil Zary, Craig Lockwood, Josip Car. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 25.02.2019.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.