Original Paper
Abstract
Background: Lack of knowledge of systematic reviews (SRs) could prevent individual health care professionals from using SRs as a source of information in their clinical practice or discourage them from participating in such research.
Objective: In this randomized controlled trial, we evaluated the effect of a short web-based educational intervention on short-term knowledge of SRs.
Methods: Eligible participants were 871 Master’s students of university health sciences studies in Croatia; 589 (67.6%) students who agreed to participate in the trial were randomized using a computer program into 2 groups. Intervention group A (294/589, 49.9%) received a short web-based educational intervention about SR methodology, and intervention group B (295/589, 50.1%) was presented with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) checklist. The participants’ knowledge of SRs was assessed before and after the intervention. The participants could not be blinded because of the nature of the intervention. The primary outcome was the difference in the percentage of correct answers about SR methodology per participant between the groups after the intervention, expressed as relative risk and 95% CI.
Results: Results from 162 and 165 participants in the educational intervention and PRISMA checklist groups, respectively, were available for analysis. Most of them (educational intervention group: 130/162, 80.2%; PRISMA checklist group: 131/165, 79.4%) were employed as health care professionals in addition to being health sciences students. After the intervention, the educational intervention group had 23% (relative risk percentage) more correct answers in the postintervention questionnaire than the PRISMA checklist group (relative risk=1.23, 95% CI 1.17-1.29).
Conclusions: A short web-based educational intervention about SRs is an effective tool for short-term improvement of knowledge of SRs among health care studies students, most of whom were also employed as health care professionals. Further studies are needed to explore the long-term effects of the tested education.
Trial Registration: OSF Registries 10.17605/OSF.IO/RYMVC; https://osf.io/rymvc
doi:10.2196/37000
Introduction
Background
Evidence-based medicine (EBM), which is also referred to interchangeably as evidence-based practice (EBP) or evidence-based health care (EBHC) [
], is credited with a major impact on health care [ ]. Systematic reviews (SRs) are considered the gold standard of evidence for making decisions about health within the concept of EBM [ ]. However, multiple studies have shown a low level of knowledge of EBM among health care professionals. Low awareness of EBM was reported by Novak et al [
] among physicians in Croatia, and limited knowledge but a positive attitude toward EBM was reported by Ulvenes et al [ ] among Norwegian physicians. A study conducted by Munroe et al [ ] showed that only 3% of nurses evaluated their knowledge of EBP as very good. Knowledge of SRs is also considered important for health sciences and medical students because clinicians need to know how to find and appraise evidence [
]. Knowledge of SRs can help trainees not only develop useful critical appraisal skills but also address important clinical questions, and it can serve as a strong basis for designing new, original research studies that fill the gaps and answer relevant, unsolved clinical questions [ ]. The importance of medical students' exposure to EBM was shown by Vrdoljak et al [
], who reported that the knowledge and attitudes of mentors in general practice toward EBM can be influenced by using medical students as academic detailers. Better knowledge and more positive attitudes toward EBM among medical students have been shown to be associated with exposure to a vertical subject on research in biomedicine and to the activities of The Cochrane Collaboration [ ]. Glass et al [ ] reported that summarized research evidence delivered in a poster format can increase student nurses' access to the evidence base; this intervention increased the knowledge guiding their clinical practice. Thus, knowledge of EBM is a variable that can be influenced. A lack of knowledge of SRs and EBM could prevent individual health care professionals from using SRs as a source of information in their clinical practice or discourage them from participating in such research. Several studies have shown that educational programs are effective in changing health care professionals' beliefs about and attitudes toward EBM and their readiness to use evidence from EBM sources, such as the Cochrane Library or SRs, to solve clinical problems [ - ]. Web-based educational interventions are low-cost, easy to implement, easily refined and stored for later use, and easily accessible by health care professionals. Educational interventions conducted via the internet on various topics in medicine have been shown to be effective [
, ]. Several studies have also demonstrated the effectiveness of web-based educational interventions on health care professionals' knowledge of EBP [ - ]. A 2017 Campbell SR on the effectiveness of e-learning in improving knowledge of EBHC showed that, compared with no learning, pure e-learning improved knowledge of and skills regarding EBHC but not attitudes and behaviors [
]. Varnell et al [ ] showed that an accelerated 8-week training program produced a statistically significant positive change in beliefs about and attitudes toward EBP. A controlled trial examining the effect of an educational intervention on knowledge of EBM among physicians in Israel [ ] reported a significant improvement in the level of knowledge of and attitudes toward EBM but no significant impact on clinical practice [ ].
Objectives
We were not able to find studies evaluating the effectiveness of educational interventions dedicated to learning about SRs and SR methodology. In this randomized controlled trial (RCT), we evaluated the effect of a short web-based educational intervention about SRs on short-term knowledge of SRs among students of health sciences studies in Croatia.
Methods
Ethics Approval
The study protocol was approved by the Ethics Committee of the Catholic University of Croatia on March 1, 2021 (Klasa: 641-03/21-01/03; Urbroj: 498-03-02-06-02/1-21-02). Subsequently, the ethics committees of all participating institutions also approved the study protocol. The participants provided written informed consent to take part in the study.
Guidelines for Reporting
The manuscript was reported in line with the CONSORT (Consolidated Standards of Reporting Trials) checklist [
]. The CONSORT checklist for this manuscript is available in . The educational intervention was reported in line with the Guideline for Reporting Evidence-based Practice Educational Interventions and Teaching (GREET) checklist [ ].
Trial Registration
The study protocol was prospectively registered (ie, before enrolling the first participants) on the Open Science Framework website [
]. There were no differences between the protocol and the conducted trial.
Study Design
We conducted an RCT with 2 parallel groups and 1:1 participant allocation.
Participants
Inclusion Criteria
The participants were students of Master's university health sciences studies in Croatia. The study programs available at the participating universities included Nursing, Radiological Technology, Clinical Nutrition, Physiotherapy, and others. Full-time and part-time students were eligible to take part in the study. Many of these students were already employed in health care; students were eligible for participation regardless of their employment status.
Institutions
There were 8 eligible institutions in Croatia for this study, and we invited all of them. The following 7 institutions accepted the invitation to participate: Catholic University of Croatia; University Department of Health Studies Split; University Department of Health Studies Zadar; University of Dubrovnik, Nursing Studies; University North; Faculty of Dental Medicine and Health, University of Osijek; and Faculty of Health Studies, University of Rijeka. One institution declined the invitation to participate in the study (University of Zagreb School of Medicine).
Contacting the Students
Students from eligible institutions were contacted via email by coauthors (MC, MN, KI, DA, NS, SZ, and SM) employed in these institutions and invited to participate in the study on brief web-based education about SRs of the literature. Students who agreed to participate were randomized by simple randomization using the Randomizer website. After randomization, they were sent an email invitation to access the web-based platform on which materials for participants from the educational intervention and PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) checklist groups were available.
The text of the email provided information about the study and provisions related to the anonymity of the participants according to the General Data Protection Regulation, and students were invited to click on the link to further participate in the study. For this study, 2 separate interfaces for the participants were created on the SurveyMonkey platform (Momentive Inc). One interface was created for participants in the educational intervention group and the other for participants enrolled in the PRISMA checklist group. Each group accessed their interface using a separate link.
The link in the email took the participants to their respective SurveyMonkey web-based interface. The text of the email to the participants is presented in
. In the SurveyMonkey interface, the participants were first asked to confirm that they were taking part in the research voluntarily and that, by entering the next page, they were providing informed consent to participate in the study.
Intervention Group A
In the web-based interface, intervention group A received a newly developed intervention created by the authors of this study with expertise in medical and health sciences education and research methodology. The educational intervention was written in the Croatian language. It consisted of 11 short educational texts on the methodology of producing SRs. A module describing the forest and funnel plots contained figures of those 2 graphs. The content of the educational intervention was an abbreviated version of the information contained in Cochrane’s educational materials for web-based learning about SRs of the literature (Cochrane Interactive Learning). The complete content of the educational intervention, translated into English, is presented in
. The learning objectives of the educational intervention anticipated that the participants would be able to define EBM, recognize different levels of evidence, define an SR, ask a clinical question, define the steps for preparing and registering an SR protocol, describe literature search and screening, explain the risk-of-bias assessment, and describe the process of data analysis and interpretation in SRs. In addition to theory, there was a practical learning objective: the participants were expected to be able to differentiate between an abstract of an SR and an abstract of a narrative literature review.
The first version of the educational intervention was iteratively revised within the team. Before conducting this trial, the web-based interface with the educational intervention was evaluated in a qualitative study among health care workers via semistructured interviews (Krnic Martinic et al, unpublished data, November 2021). The results of the users’ feedback obtained in the qualitative study were used to revise the educational intervention.
The intervention was delivered as an asynchronous web-based education that did not include any components of live education or interaction.
The participants were able to go back and forth through the web-based interface with the educational modules and respective questions without a time limit.
Intervention Group B
Intervention group B was presented with the PRISMA checklist [
] for reporting on SRs ( ) in their web-based interface, and the participants were asked to read it. It was presented to the participants in 11 separate sections to match the number and form of the educational texts in intervention group A as closely as possible.
Pre- and Postintervention Questionnaires
Both groups completed a preintervention questionnaire containing questions about demographic characteristics and their knowledge of SRs before the presentation of the intervention (educational intervention or PRISMA checklist group;
). We were unable to find questionnaires in the literature suitable for this topic and purpose. Thus, we designed the pre- and postintervention questionnaires specifically for this study. The pre- and postintervention questionnaires were not validated. Questions evaluating knowledge of SRs were based on the questions used in our previous studies on knowledge of SRs [ ] and the definitions of SRs [ ]. At the end of the educational intervention or PRISMA checklist presentation, the participants were asked to answer the postintervention questionnaire (
). The questionnaire contained the same questions on knowledge of SRs as the preintervention questionnaire as well as questions about whether the participants agreed with the proposed characteristics of the definition of SRs. Finally, they were presented with 4 abstracts of published articles and asked to assess whether they were abstracts of SRs. As part of the postintervention questionnaire, the participants were asked to express their level of agreement on whether an SR should have 6 characteristics proposed earlier by Krnic Martinic et al [
]. They were asked to express their agreement by choosing the number on a Likert scale ranging from 1 to 5 that best suited their opinion, where 1 meant I do not agree at all and 5 meant I completely agree ( ). After those questions, the participants were presented with 4 abstracts of published scientific articles, of which 2 (50%) were abstracts of SRs [
, ] and the other 2 (50%) were abstracts of narrative reviews of the literature [ , ]. They were chosen based on a nonstructured literature search in which we tried to find SR abstracts that were simple to understand and appropriate for the target population. The abstracts did not contain any mention of the study design used. If an abstract reported that it was an abstract of an SR or mentioned a systematic search, that part of the abstract was removed. The participants were asked to assess whether the abstracts were abstracts of SRs. The 4 abstracts used for this assessment are presented in [ - ]. On the last page of the interface in both intervention groups A and B, the participants were invited to optionally leave their first and last name and email address if they wanted to receive a certificate of participation in the educational intervention. The certificate was prepared by Cochrane Croatia.
The entire questionnaire we administered to the participants was a survey and not a psychological instrument. Thus, we did not perform any psychometric calculations. For the 6 before-and-after questions about opinions regarding SRs, reliability at the first (preintervention) measurement was .89, expressed using Cronbach α.
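For readers unfamiliar with the calculation, the sketch below shows how Cronbach α can be computed for a set of Likert-type items. It is a minimal illustration only: the cronbach_alpha helper and the example responses are hypothetical and are not the trial data or the authors' analysis code (the trial's analyses were run in JASP).

```python
# Minimal sketch: Cronbach alpha for a participants x items matrix of Likert scores.
# The helper function and the example responses are illustrative, not the trial data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: rows = participants, columns = questionnaire items."""
    k = items.shape[1]                          # number of items (6 in this study)
    item_vars = items.var(axis=0, ddof=1)       # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical answers of 5 participants to the 6 opinion items
# (1 = I do not agree at all, 5 = I completely agree)
responses = np.array([
    [5, 4, 5, 4, 4, 5],
    [3, 3, 4, 3, 3, 3],
    [4, 4, 4, 5, 4, 4],
    [2, 3, 2, 3, 2, 3],
    [5, 5, 5, 5, 4, 5],
])
print(f"Cronbach alpha = {cronbach_alpha(responses):.2f}")
```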
Outcomes
The primary outcome was the difference in the percentage of correct answers per participant in the postintervention questionnaire between intervention groups A and B.
Secondary outcomes were the difference in the percentage of correct answers per participant in the pre- and postintervention questionnaires for the intervention group, the proportion of participants who correctly recognized an abstract describing an SR of the literature (percentage), and the proportion of participants who correctly recognized an abstract describing a simple narrative review of the literature (percentage).
Participant Timeline
After we obtained permission from the ethics committees, the participants were invited to take part. After collecting the names of students who agreed to participate and randomizing them, the invitation to participate in the study containing the link to the intervention A or intervention B interface was sent on June 7, 2021. The links were inactivated on June 20, 2021. The knowledge assessment was conducted immediately after the intervention.
Sample Size
The expected effect size was a difference of at least 20% in the primary outcome between intervention groups A and B. The sample size calculation for comparing proportions, with a predefined α of .05 and β of .20 and assuming this difference of at least 20%, determined that 182 participants (91 per group) would be required. To compensate for the possible loss of participants after the beginning of the survey (incomplete answers) or the possibility that participants who initially agreed to take part might eventually choose not to take part, the plan was to include at least 20% more participants than calculated as necessary (n=218).
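As a point of reference, the sketch below reproduces this sample size with the standard normal-approximation formula for comparing two independent proportions. The baseline proportions (50% vs 70% correct) are an assumption made for illustration; the paper specifies only the 20% absolute difference, α=.05, and β=.20.

```python
# Minimal sketch: sample size for comparing two independent proportions (normal approximation).
# The assumed baseline proportions (0.50 vs 0.70) are illustrative; the paper reports only
# the 20% absolute difference, alpha = .05, beta = .20, and the resulting 91 per group.
import math
from scipy.stats import norm

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    z_alpha = norm.ppf(1 - alpha / 2)            # two-sided alpha
    z_beta = norm.ppf(power)                     # power = 1 - beta
    var_sum = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * var_sum / (p1 - p2) ** 2)

print(n_per_group(0.50, 0.70))   # -> 91 participants per group (182 in total)
print(round(2 * 91 * 1.20))      # -> 218, the planned minimum after adding ~20% for attrition
```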
Encouraging the Inclusion of Participants (Recruitment)
After the initial email was sent to the participants with the link to their respective study arm, 3 more reminders were sent to the participants 4 days apart.
Randomization of Participants
The participants were randomized by simple randomization using the Randomizer website.
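For illustration only, the snippet below shows what simple (unrestricted) randomization into 2 groups looks like in code; the trial itself generated the allocation with the Randomizer website, and the participant emails here are hypothetical. Note that simple randomization does not force equal group sizes, which is consistent with the resulting 294/295 split.

```python
# Minimal sketch of simple (unrestricted) 1:1 randomization into 2 groups.
# The trial used the Randomizer website; this code and the emails are only illustrative.
import random

def simple_randomize(participants, seed=None):
    """Assign each participant independently to group A or B with probability 0.5."""
    rng = random.Random(seed)
    groups = ["A (educational intervention)", "B (PRISMA checklist)"]
    return {participant: rng.choice(groups) for participant in participants}

allocation = simple_randomize(["student01@example.org", "student02@example.org"], seed=2021)
print(allocation)
```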
Allocation Concealment
After randomization, the participants were allocated to the study arms according to the randomization sequence by a third person who was not involved in other parts of the study.
Blinding
Blinding of Participants and Personnel
The intervention was of such a nature that the participants could not be blinded.
Blinding of Outcome Assessors
Only the first author (MKM) and the principal investigator (LP) had access to the complete raw data set generated by SurveyMonkey, which included the names and email addresses of the participants who wanted the certificate. MKM removed the participants’ names and email addresses before the outcome assessor (IB) analyzed the data; thus, anonymized data were analyzed.
Data Management
One author downloaded Microsoft Excel worksheets from SurveyMonkey, which were anonymized if a participant had left a name and email address to obtain the certificate. The SurveyMonkey interface was configured not to collect any information about the participants, including IP addresses. The data were stored on a secure server until the time of analysis.
Statistical Analysis
To determine the normality of the variables' distribution, we used the Kolmogorov-Smirnov test. Categorical data were presented as frequencies and percentages, and numerical data were presented as medians with IQRs for variables not following a normal distribution and as arithmetic means with 95% CIs for variables following a normal distribution. Differences between intervention groups A and B for categorical variables were tested using the chi-square test. To express the difference between groups, numerical variables were tested with 2-tailed t tests for independent samples (for variables following a normal distribution) and Mann-Whitney tests (for variables not following a normal distribution). Pre- and postintervention differences were evaluated using the chi-square test for categorical variables and the t test for dependent samples for numerical variables. The effect size for the primary outcome (the difference in the percentage of correct answers between groups in the postintervention questionnaire) was expressed using relative risk (RR) with a 95% CI, as was the difference between the number of correct answers in the pre- and postintervention questionnaires in both groups. The effect size for the secondary outcome was expressed using an odds ratio with a 95% CI.
We assessed the participants' pre- and postintervention opinions using parametric procedures, although Likert-type scales are usually analyzed using nonparametric tests. This was done because the results of the initial nonparametric analysis were not interpretable: when we presented the results as medians with 95% CIs, the results appeared similar in both groups even though there were significant differences after the intervention. Therefore, we proceeded with parametric testing, which gave the same results but was more precise, as it enabled us to interpret the direction of the difference clearly.
All analyses were performed using the computer program JASP (version 0.14.1.0; JASP Team). Statistical significance was set at P<.05.
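To make the between-group comparisons concrete, the sketch below runs a chi-square test on one 2x2 table taken from the postintervention results (the funnel plot item: 80/162 correct in the educational intervention group vs 44/165 correct in the PRISMA checklist group). It is only an illustrative check; the trial's P values were computed in JASP, and whether a continuity correction was applied there is not stated.

```python
# Minimal sketch: chi-square test for one postintervention knowledge item
# (funnel plot question: 80/162 correct vs 44/165 correct). Illustrative check only;
# the trial's analyses were run in JASP.
from scipy.stats import chi2_contingency

table = [
    [80, 162 - 80],   # educational intervention group: correct, incorrect
    [44, 165 - 44],   # PRISMA checklist group: correct, incorrect
]
chi2, p, dof, expected = chi2_contingency(table)  # Yates continuity correction by default
print(f"chi2 = {chi2:.2f}, df = {dof}, P = {p:.5f}")  # P < .001, consistent with the Results
```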
Results
Participant Flow
In this trial, 871 potential participants met the inclusion criteria, of whom 282 (32.4%) indicated that they did not want to participate in the study. Thus, 67.6% (589/871) of students were randomized: 31.1% (183/589) from University North; 23.1% (136/589) from the Catholic University of Croatia; 22.4% (132/589) from the Faculty of Health Studies, University of Rijeka; 14.3% (84/589) from the University Department of Health Studies Split; 6.6% (39/589) from the University Department of Health Studies Zadar; 1.7% (10/589) from the University of Dubrovnik, Nursing Studies; and 0.8% (5/589) from the Faculty of Dental Medicine and Health, University of Osijek.
Recruitment and Access to the Educational Platform
The link to participate in the study was sent via email to the addresses of the 589 students on June 7, 2021. The students were sent 3 reminders 4 days apart, and access to the web-based platforms was inactivated on June 20, 2021. A detailed participant flow diagram is shown in
. The average time the participants took to complete the entire interface with questionnaires and educational materials was 21 (SD 9.00) minutes in the educational intervention group and 19 (SD 3.96) minutes in the PRISMA checklist group.
Baseline Participant Characteristics
The demographic participant data are presented in
. More than 40% of the participants (educational intervention group: 66/162, 40.7%; PRISMA checklist group: 64/165, 38.8%) were from 1 institution (University North), >80% of the participants (educational intervention group: 134/162, 82.7%; PRISMA checklist group: 138/165, 83.6%) studied nursing, and >50% of the participants (educational intervention group: 97/162, 59.9%; PRISMA checklist group: 85/165, 51.5%) attended the second year of study. More than 80% of the participants (educational intervention group: 136/162, 84%; PRISMA checklist group: 138/165, 83.6%) were employed while studying for their Master's degree. Most participants were employed as health care workers (educational intervention group: 130/162, 80.2%; PRISMA checklist group: 131/165, 79.4%). The median length of working in health care was 9.9 years among participants who received the educational intervention and 9.8 years in the PRISMA checklist group. The median age of the participants in both groups was approximately 30 years, and >85% of the participants in both groups were women (educational intervention group: 140/162, 86.4%; PRISMA checklist group: 146/165, 88.5%; ). Participants in both groups rated their knowledge of SRs with a median grade of 3 (range 1-5). All participants (327/327, 100%) stated that they had heard of SRs, and approximately three-quarters of the participants in both groups (educational intervention group: 124/162, 76.5%; PRISMA checklist group: 123/165, 74.5%) stated that they had read an SR. In our sample, 17.3% (28/162) of the participants from the group that received the educational intervention and 18.2% (30/165) of the participants from the PRISMA checklist group stated that they had participated in producing an SR (
).
Variable and level | Educational intervention (n=162) | PRISMAa checklist (n=165) |
Institution, n (%) | |||
Faculty of Dental Medicine and Health Osijek | 3 (1.9) | 2 (1.2) | |
Faculty of Health Studies, University of Rijeka | 26 (16) | 30 (18.2) | |
Croatian Catholic University | 46 (28.4) | 41 (24.8) | |
Health Department, University of Zadar | 18 (11.1) | 20 (12.1) | |
Health Studies, University of Dubrovnik | 0 (0) | 3 (1.8) | |
Health Studies, University of Split | 3 (1.9) | 5 (3) | |
University North | 66 (40.7) | 64 (38.8) | |
Study program, n (%) | |||
Physiotherapy | 17 (10.5) | 15 (9.1) | |
Clinical Nutrition | 5 (3.1) | 8 (4.8) | |
Radiological Technology | 2 (1.2) | 1 (0.6) | |
Nursing | 134 (82.7) | 138 (83.6) | |
Something else | 4 (2.5) | 3 (1.8) | |
Year of study, n (%) | |||
First | 59 (36.4) | 74 (44.8) | |
Second | 97 (59.9) | 85 (51.5) | |
Third | 6 (3.7) | 6 (3.6) | |
Currently employed (yes), n (%) | 136 (84) | 138 (83.6) | |
Currently employed as a health care worker (yes), n (%) | 130 (80.2) | 131 (79.4) | |
Length of employment (years), median (IQR) | 7 (3-15) | 6 (2-16) |
Age (years), median (IQR) | 28 (24-35) | 26 (24-34) | |
Women, n (%) | 140 (86.4) | 146 (88.5) | |
Self-assessment of knowledge of EBMb (1-5), median (IQR) | 3 (3-4) | 3 (3-4) | |
Had heard about systematic reviews, n (%) | 162 (100) | 165 (100) | |
Had read a systematic review, n (%) | 124 (76.5) | 123 (74.5) | |
Had participated in writing a systematic review, n (%) | 28 (17.3) | 30 (18.2) |
aPRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses.
bEBM: evidence-based medicine.
Numbers Analyzed
Of the 420 participants who accessed the interface, 327 (77.9% response rate) completed the questionnaires, and their results were included in further analysis (
). Owing to incomplete questionnaires, we excluded the results of 23.9% (51/213) of participants from the educational intervention group and 20.3% (42/207) of participants from the PRISMA checklist group. The results of 54.9% (162/295) of participants from the educational intervention group and 56.1% (165/294) of participants from the PRISMA checklist group were finally included in the analysis. There were no transfers of participants from one group to another.
Primary Outcome
In the postintervention questionnaire, of the 1458 potential correct answers, there were 1086 (74.49%) correct answers to knowledge questions in the educational intervention group (162/327, 49.5%). In the PRISMA checklist group (165/327, 50.5%), of the 1485 potential correct answers, there were 900 (60.61%) correct answers (
). Thus, the effect size for the difference in the number of correct answers to knowledge questions between groups was an RR of 1.23 (95% CI 1.17-1.29); that is, the educational intervention group had 23% (relative risk percentage) more correct answers in the postintervention questionnaire than the PRISMA checklist group (a worked check of this calculation is provided after the table below).
Questionnaire and items | Educational intervention (n=162) | PRISMAa checklist (n=165) | P valueb |
Preintervention questionnaire (correct answer) | ||||
It is sufficient to search one database to produce an SR (no), n (%) | 128 (79) | 139 (84.2) | .22 | |
SRs must be produced by one author only (no), n (%) | 103 (63.6) | 98 (59.4) | .44 | |
SRs must contain meta-analyses (no), n (%) | 17 (10.5) | 21 (12.7) | .52 | |
SRs must have duplicate screening and data extraction (yes), n (%) | 87 (53.7) | 83 (50.3) | .54 | |
A list of both included and excluded studies must be provided (yes), n (%) | 117 (72.2) | 116 (70.3) | .70 | |
The quality of the included studies must be assessed (yes), n (%) | 135 (83.3) | 143 (86.7) | .40 | |
In the case of meta-analyses, a heterogeneity test must be done to ensure the results of the studies can be combined (yes), n (%) | 126 (77.8) | 126 (76.4) | .76 | |
Results of meta-analyses must be presented as a funnel plot (no), n (%) | 31 (19.1) | 13 (7.9) | .003 | |
Results of publication bias analysis must be presented as a forest plot (no), n (%) | 31 (19.1) | 27 (16.4) | .51 | |
Total correct answer scores, mean (95% CI) | 4.8 (4.5-5.0) | 4.6 (4.4-4.9) | .44 | |
Postintervention questionnaire (correct answer) | ||||
It is sufficient to search one database to produce an SR (no), n (%) | 156 (96.3) | 120 (72.7)c | <.001 | |
SRs must be produced by one author only (no), n (%) | 153 (94.4) | 126 (76.4) | <.001 | |
SRs must contain meta-analyses (no), n (%) | 38 (23.5)c | 33 (20)c | .45 | |
SRs must have duplicate screening and data extraction (yes), n (%) | 144 (88.9) | 111 (67.3)c | <.001 | |
A list of both included and excluded studies must be provided (yes), n (%) | 144 (88.9) | 141 (85.5)c | .35 | |
The quality of the included studies must be assessed (yes), n (%) | 155 (95.7)c | 142 (86.1) | .003 | |
In the case of meta-analyses, a heterogeneity test must be done to ensure the results of the studies can be combined (yes), n (%) | 150 (92.6)c | 146 (88.5)c | .20 | |
Results of meta-analyses must be presented as a funnel plot (no), n (%) | 80 (49.4)c | 44 (26.7)c | <.001 | |
Results of publication bias analysis must be presented as a forest plot (no), n (%) | 66 (40.7)c | 37 (22.4)c | <.001 | |
Total correct answer scores, mean (95% CI) | 6.7 (6.5-6.9)c | 5.5 (5.3-5.7)c | <.001 |
aPRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses.
bComparison between educational intervention and PRISMA checklist groups. Chi-square test was used for categorical variables, and 2-tailed t test was used for independent samples for numeric variables.
cComparison before and after the intervention. Chi-square test was used for categorical variables, and 2-tailed t test was used for dependent samples for numeric variables.
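As a worked check of the reported effect size, the sketch below recomputes the RR and its 95% CI from the pooled answer counts using the usual log-RR normal approximation. It treats each answer as an independent observation, as the pooled counts suggest, and is not the authors' original analysis code.

```python
# Minimal sketch: relative risk with a 95% CI from pooled answer counts,
# using the log-RR normal approximation and treating answers as independent observations.
import math

def relative_risk(events_a, total_a, events_b, total_b, z=1.96):
    rr = (events_a / total_a) / (events_b / total_b)
    se_log_rr = math.sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    lower = math.exp(math.log(rr) - z * se_log_rr)
    upper = math.exp(math.log(rr) + z * se_log_rr)
    return round(rr, 2), round(lower, 2), round(upper, 2)

# Primary outcome: postintervention correct answers, educational intervention vs PRISMA checklist group
print(relative_risk(1086, 1458, 900, 1485))   # -> (1.23, 1.17, 1.29)
# The same helper reproduces the within-group comparisons reported in the Secondary Outcomes:
# educational group post vs pre (1086/1458 vs 775/1458) -> RR 1.40, and
# PRISMA group post vs pre (900/1485 vs 766/1485) -> RR 1.17.
```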
Secondary Outcomes
Difference in the Number of Correct Answers per Participant in the Pre- and Postintervention Questionnaires for the Educational Intervention Group
Both groups performed better on the postintervention questionnaire than on the preintervention questionnaire (
). In the educational intervention group, the total proportion of correct answers was 53.16% (775/1458) in the preintervention questionnaire and 74.49% (1086/1458) in the postintervention questionnaire (RR=1.40, 95% CI 1.32-1.48; ). In the PRISMA checklist group, the total proportion of correct answers was 51.58% (766/1485) in the preintervention questionnaire and 60.61% (900/1485) in the postintervention questionnaire (RR=1.17, 95% CI 1.10-1.25; ). Independent of the group, in the pre- and postintervention questionnaires, the smallest number of correct answers was to questions related to the concept of meta-analysis, whereas, in both groups, the highest number of correct answers was to the question about the necessity to assess the quality of research included in the SR (
). There was no difference in the overall results of the questionnaire assessing knowledge of SRs (the total number of correct answers to all 9 knowledge questions) between the educational intervention and PRISMA checklist groups before the intervention (
).Proportion of Participants Who Correctly Recognized SR Abstracts
The first 2 presented summaries were identified accurately as summaries of SRs by 65.4% (106/162) and 74.1% (120/162) of participants from the educational intervention group and 71.5% (118/165) and 72.7% (120/165) of participants in the PRISMA checklist group, respectively (
). There was no statistically significant difference between the groups in the ability to correctly detect an SR summary ( ). The third and fourth summaries were recognized as summaries of a simple narrative review by 22.2% (36/162) and 46.3% (75/162) of participants from the educational intervention group and 34.5% (57/165) and 47.9% (79/165) of participants from the PRISMA checklist group, respectively (
). There was no statistically significant difference between the groups in the recognition of summaries of narrative reviews ( ).
Variable and level | Educational intervention (n=162), n (%) | PRISMAa checklist (n=165), n (%) | P valueb |
If you needed to search for information to solve a clinical problem, what would be the preferred information source for you? | |||||||
Colleagues | 65 (40.1) | 65 (39.4) | .98 | ||||
Books | 59 (36.4) | 61 (37) | .98 | ||||
Scientific literature | 120 (74.1) | 132 (80) | .25 | ||||
SR of the literature | 135 (83.3) | 136 (82.4) | .93 | ||||
Internet search engine (Google) | 29 (17.9) | 30 (18.2) | .98 | ||||
Is this an SR abstract? | |||||||
Abstract 1c—correct answer “Yes” | 106 (65.4) | 118 (71.5) | .34 | ||||
Abstract 2d—correct answer “Yes” | 120 (74.1) | 120 (72.7) | .56 | ||||
Abstract 3e—correct answer “No” | 36 (22.2) | 57 (34.5) | .02 | ||||
Abstract 4f—correct answer “No” | 75 (46.3) | 79 (47.9) | .78 |
aPRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses.
bChi-square test.
cA total of 7 answers missing.
dA total of 7 answers missing.
eA total of 8 answers missing.
fA total of 10 answers missing.
Additional Analyses
There was no statistically significant difference between the educational intervention and PRISMA checklist groups in the choice of information sources in the postintervention questionnaire, which allowed multiple responses regarding where the participants would look for answers to a clinical question from their own clinical practice (
). More than 80% of the participants in both groups (educational intervention group: 135/162, 83.3%; PRISMA checklist group: 136/165, 82.4%) stated that they would look for answers in an SR. Most participants (educational intervention group: 120/162, 74.1%; PRISMA checklist group: 132/165, 80%) responded that they would look for answers in scientific literature in general ( ). More than a third of the participants in both groups would look for an answer to a clinical question in a textbook (educational intervention group: 59/162, 36.4%; PRISMA checklist group: 61/165, 37%) or ask a coworker for an answer (educational intervention group: 65/162, 40.1%; PRISMA checklist group: 65/165, 39.4%). Less than one-fifth of the participants in both groups would search for an answer on an internet search engine such as Google (educational intervention group: 29/162, 17.9%; PRISMA checklist group: 30/165, 18.2%; ). In the preintervention assessment in both groups of participants, there was no significant difference in agreement with the proposed characteristics of an SR (
). After the intervention, there was more agreement with these characteristics in the educational intervention group than in the PRISMA checklist group ( ).
Questionnaire and items | Educational intervention (n=162), mean (95% CI) | PRISMAb checklist (n=165), mean (95% CI) | P valuec |
Preintervention questionnaire | ||||
Research question is defined | 4.5 (4.4-4.7) | 4.4 (4.2-4.5) | .09 | |
Listed sources of literature searched, with repeatable search strategy (naming of databases, naming of search platforms, search date, and complete search strategy) | 4.4 (4.2-4.5) | 4.2 (4.0-4.3) | .08 | |
Listed criteria for inclusion and exclusion of research | 4.5 (4.3-4.6) | 4.3 (4.1-4.4) | .04 | |
Listed selection methods | 4.4 (4.3-4.6) | 4.3 (4.2-4.5) | .45 | |
Critically evaluates and reports on the quality or risk of bias of the included studies | 4.4 (4.2-4.5) | 4.1 (4.0-4.3) | .04 | |
Provides information on data analysis and synthesis that allows for the repeatability of the results | 4.4 (4.2-4.5) | 4.2 (4.1-4.4) | .11 | |
Postintervention questionnaire | ||||
Research question is defined | 4.8 (4.7-4.9) | 4.6 (4.5-4.7) | <.001 | |
Listed sources of literature searched, with repeatable search strategy (naming of databases, naming of search platforms, search date, and complete search strategy) | 4.7 (4.6-4.8) | 4.6 (4.5-4.7) | .05 | |
Listed criteria for inclusion and exclusion of research | 4.8 (4.7-4.9) | 4.5 (4.4-4.6) | <.001 | |
Listed selection methods | 4.8 (4.7-4.9) | 4.6 (4.5-4.7) | .02 | |
Critically evaluates and reports on the quality or risk of bias of the included studies | 4.7 (4.6-4.8) | 4.5 (4.4-4.6) | <.001 | |
Provides information on data analysis and synthesis that allows for the repeatability of the results | 4.7 (4.6-4.8) | 4.5 (4.3-4.6) | <.001 |
aAll differences before and after were statistically significant at P<.05; 2-tailed t test for paired samples.
bPRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses.
ct test (2-tailed) for independent samples.
Discussion
Principal Findings
This RCT demonstrated that a brief web-based educational intervention about SRs significantly increased knowledge of SRs in the target population. To the best of our knowledge, this is the first trial conducted for this purpose. Relatively successful learning models about EBM have been reported in the literature [
, - ], but we could not find any publications on the effectiveness of educational interventions focused exclusively on knowledge of SRs.
Comparison With Prior Work
The participants from the educational intervention group (162/327, 49.5%), who were presented with a new educational intervention designed for this study, needed an average of 21 minutes to go through the entire interface. The interface included multiple sections beyond the educational intervention: the pre- and postintervention questionnaires and the evaluation of 4 scientific abstracts. However, the web-based platform used for this study did not allow for the measurement of the time spent on specific items or pages in the interface. Thus, we cannot know how long the participants spent reading the educational texts prepared for the educational intervention and PRISMA checklist groups. However, if we consider the time needed to read and answer the questions, the participants probably needed 15 minutes or less to read the educational intervention itself. Such an intervention is very short. Therefore, the intervention should be suitable for health professionals, who usually state that lack of time is a major obstacle to practicing EBM [
- ] and implementing the EBM curriculum during education [ ]. Initially, participants in both groups rated their knowledge of SRs with a median grade of 3 out of 5. This is comparable with the self-assessed knowledge of EBP among nurses evaluated in the study conducted by Munroe et al [
]. In that study, only 3% of the nurses said that they were very familiar with EBP [ ]. Three-quarters of the participants (educational intervention group: 124/162, 76.5%; PRISMA checklist group: 123/165, 74.5%) stated that they had read SRs. We were surprised with the result that 17.3% (28/162) and 18.2% (30/165) of the participants in the educational intervention and PRISMA checklist groups, respectively, stated that they had participated in developing an SR, which is a high percentage [
]. SR methodology is very complex. Thus, it is questionable whether the students had actually participated in the development of SRs in such large numbers. Health students may have participated, for example, in translating Cochrane's plain language summaries into Croatian [ , ]. However, without the possibility of further clarifying what the participants really meant, it is not possible to discuss this topic in further detail. In the study by Olsson et al [ ], which focused on nursing PhD programs and candidatures, only 5 published SRs were found among the 135 analyzed nursing dissertations produced according to the Scandinavian model of integrated research (ie, only 4% of the included nurses, the dissertation authors, had participated in developing an SR). This number is much lower than the percentage of our participants who stated that they had participated in the production of an SR, and our students were not PhD students but Master's-level students. The primary outcome of this study was the difference in the percentage of correct answers collected from the educational intervention and PRISMA checklist groups when answering questions evaluating knowledge of SRs on the postintervention questionnaire after the participants had read the educational materials. After the training, the educational intervention group had 23% more correct answers than the PRISMA checklist group (ie, the size of the effect expressed in RR was 1.23). In addition, comparing the pre- and postintervention questionnaire results in the educational intervention group, there were significantly more correct answers on the postintervention questionnaire than on the preintervention questionnaire, with an RR of 1.40. The RR of correct answers comparing the pre- and postintervention questionnaires in the PRISMA checklist group was 1.17.
An RCT by Sánchez-Mendiola et al [
] showed a significant effect of EBM education on the final knowledge of EBM among medical students, with a 25.9% increase in correct answers in the knowledge test about EBM [ ]. This is comparable with our primary outcome results. However, it should be emphasized that their intervention was very different in terms of content and duration. Sánchez-Mendiola et al [ ] tested an EBM course with 14 two-hour weekly sessions during 1 semester. The course was a formal part of the medical school curriculum; it was delivered by 6 experienced professors and included content different from ours. Their course covered 15 topics, including clinical decision-making, uncertainty and probability in medicine, the Bayes theorem, and clinical guidelines [ ]. Rohwer et al [
] evaluated the effectiveness of e-learning in improving EBHC competencies in a Campbell SR. The study included 24 trials, of which 20 were RCTs and 4 were observational studies, with a total of 3825 participants, including physicians, nurses, physiotherapists, physician assistants, and educators at all levels of education. It demonstrated that, compared with no learning, pure e-learning improved EBHC knowledge and skills, with outcomes similar to those of face-to-face learning for all observed primary outcomes. In 2021, needs assessments and expectations regarding EBP knowledge acquisition and training activities were explored among frontline health care providers, including postgraduate medical and nursing students who were working or living in China. The results indicated that the respondents expressed a high need for education on evidence quality appraisal, interpretation of SRs or meta-analyses, and knowledge translation [
]. However, it may not be sufficient only to strive for the improvement of knowledge among the targeted individuals. Nursing education at the undergraduate level is starting to teach the process of research integration through EBP implementation with active learning strategies, which is endorsed by the students [ ]. To advance the knowledge and application of evidence in daily practice, health institutions will ultimately also need to recognize the need to foster such topics [
]. Our educational intervention, implemented via a web tool, is particularly suitable in the current time of the COVID-19 pandemic. Owing to containment measures, many parts of the world switched to web-based education during the pandemic. Bond et al [
] published a living systematic mapping review on August 30, 2021, calling the web-based teaching experience during the pandemic the “first global online semester.” Although such teaching initially arose as an emergency remote teaching response rather than planned distance learning [ ], the educational experience gained during the pandemic is very valuable for evaluating distance learning. Our study provides a further test of a remotely delivered educational intervention targeting students and health care workers. Many studies have evaluated experiences with virtual continuing medical education during the pandemic [
- ]. The SR education evaluated in our study could be incorporated into continuing medical education programs for health care professionals. Owing to the short format and the possibility of distance learning, such education could be of interest to health care professionals who want to learn more about the basics of SRs. In addition to showing the efficacy of our newly designed educational intervention, our study also indicated areas where the target group of participants significantly lacked knowledge. Independent of the group, in the pre- and postintervention questionnaires, the smallest number of correct answers was to questions related to the concept of meta-analysis and questions about graphical representations of meta-analyses (funnel plot and forest plot). Very modest improvements were observed for those questions in the postintervention questionnaire. In a study on knowledge of the basic methodological components of SRs conducted by Puljak and Sapunar [
] among the directors of postgraduate programs at European universities, only 31% of the participants answered correctly that an SR does not necessarily contain a meta-analysis. There were few correct answers to questions about graphical presentations of meta-analyses. In the educational intervention group, before the training, approximately 20% of the participants correctly answered what a funnel plot and a forest plot represented. In the PRISMA checklist group, only 8% of the participants correctly answered what a funnel plot represented, and 16% correctly answered what a forest plot depicted. In the postintervention questionnaire, in the educational intervention group, 40.7% (66/162) to 49.4% (80/162) of the participants correctly answered the questions about the use of the forest plot or funnel plot, whereas, in the PRISMA checklist group, only a fifth (37/165, 22.4%) to a quarter (44/165, 26.7%) of the participants answered these questions correctly. However, even after the educational intervention, more than half of the participants did not know the correct answer to the questions regarding the graphical representations of meta-analyses.
Poor knowledge of graphical representations in meta-analyses has been described elsewhere. A survey of psychologists in Italy found that less than a fifth of psychologists estimated that they had sufficient knowledge of the forest plot [
]. Less than 15% of psychologists stated that they had sufficient knowledge of the funnel plot [ ]. A survey of psychologists in Spain showed that only approximately 10% of the participants said that they had satisfactory knowledge of the forest plot. Only 7% of the participants said that they had satisfactory knowledge of the funnel plot [ ]. Only 10% of PhD program directors accurately recognized the purpose of funnel plots, and 11.3% recognized the purpose of the forest plot [ ]. Poor knowledge of graphical representations of meta-analyses may be the best indicator of generally poor knowledge of SR and meta-analysis methodology. In this study, we also included a practical knowledge test that involved the recognition of journal abstracts of SRs after the intervention. The accuracy of journal abstract recognition ranged from 22.2% to 74.1% in the educational intervention group and from 34.5% to 72.7% in the PRISMA checklist group, without a statistically significant difference between the groups. This was the final test of understanding and pragmatic application of the knowledge acquired in our trial. We found that the educational intervention did not significantly affect the recognition of abstracts of SRs. It is possible that a time lag or longer systematic learning is needed for the acquired knowledge of SRs to influence the practical application of the knowledge itself. It is also possible that it is necessary to further adjust the educational intervention to enable the practical application of the acquired knowledge.
In the postintervention questionnaire, the participants were asked which sources of information they would use in searching for an answer to a question from their clinical practice, and most participants from both groups opted for scientific literature (educational intervention group: 120/162, 74.1%; PRISMA checklist group: 132/165, 80%) or SRs (educational intervention group: 135/162, 83.3%; PRISMA checklist group: 136/165, 82.4%). Compared with the results of a study conducted by Sánchez-Mendiola et al [
] on medical students in Mexico, in our study, a significantly higher percentage of students chose to search for answers in SRs and the scientific literature. In the study by Sánchez-Mendiola et al [ ], most participants from the group that attended EBM classes stated that, in solving a certain health problem, they looked for answers in review articles or the Cochrane Library only occasionally, whereas they would very often look for the answer in textbooks or search engines or ask their teachers. Nevertheless, the number of students who would seek answers in the Cochrane Library or scientific articles was higher than in the groups of students who did not attend classes on EBM [ ]. Evidently, education about EBM (or, in our case, about SRs) is expressly intended to encourage more frequent use of these data sources in solving clinical problems.
Strengths and Limitations
The strengths of this study are the appropriate sample size and the high number of fully completed questionnaires, with three-quarters of the participants (327/420, 77.9%) completing the questionnaire in full and almost equal numbers of unfinished questionnaires in the educational intervention and PRISMA checklist groups, allowing for comparable results. Furthermore, we tested the practical application of the acquired knowledge of SRs by asking the students to recognize summaries of SRs or narrative reviews.
A limitation of this study is a highly homogeneous sample that does not allow for significant analyses by sociodemographic subgroups. We acknowledge that there is a potential self-selection aspect in our final sample. We do not have data about nonparticipants among the eligible students and, theoretically, there could be some differences between responders and nonresponders. However, this is a problem inherent to any trial: eligible participants are invited to take part, and they can choose whether they want to participate.
The study was conducted in only 1 country, although in multiple institutions across the country. We did not measure the time spent on the intervention; some participants may not have spent much time reading the text. Furthermore, we measured the outcomes immediately after the reading of the educational texts. Such short-term follow-up does not allow for monitoring of the long-term retention of knowledge of SRs among the participants. Longer-term research will make it possible to verify the long-term effectiveness of the intervention on the knowledge of the target group.
Finally, we would like to note that, in this manuscript, when referring to the studies of other researchers, we used the terms EBM, EBP, and EBHC as they were reported in those manuscripts.
Conclusions
A short web-based educational intervention about SRs is an effective tool for short-term improvement of knowledge of SRs among health care studies students, most of whom were employed as health care professionals. This education can be further studied, modified, and used in the continuing medical education of health care professionals.
Acknowledgments
This study was conducted as a part of the Professionalism in Health: Decision-making in Practice and Science project (IP-2019-04-4882) funded by the Croatian Science Foundation, whose principal investigator is Professor Ana Marušić, MD, PhD. This study was part of the PhD thesis of the first author, MKM. The thesis was written and defended in the Croatian language at the University of Split School of Medicine in Croatia. The authors are grateful to the thesis committee members, professors Ozren Polasek, Zarko Alfirevic, and Damir Roje, for their constructive feedback on this study.
Data Availability
The raw data from this study were published on the Open Science Framework website [
] and made publicly available.
Authors' Contributions
MKM, MC, AM, DS, TPP, IB, RT, and LP contributed to study design. MKM, IB, RT, S Malisa, MN, KI, DA, NS, SZ, S Miksic, DC, and LP contributed to data collection and analysis. MKM, IB, DC, and LP wrote the first draft of the manuscript. MKM, MC, AM, DS, TPP, IB, RT, S Malisa, MN, KI, DA, NS, SZ, S Miksic, DC, and LP critically revised the manuscript. MKM, MC, AM, DS, TPP, IB, RT, S Malisa, MN, KI, DA, NS, SZ, S Miksic, DC, and LP approved the final version of the manuscript.
Conflicts of Interest
Multiple authors of this study are members of Cochrane Croatia (AM, TPP, IB, RT, and LP), but this was not an official research project of the global Cochrane organization.
Editorial Notice
This randomized study was retrospectively registered. The authors published the protocol for the trial [
] before it commenced instead of registering it with a trial registry. As the trial does not involve a study of the cause-and-effect relationship between a health-related intervention and a health outcome, it does not meet the definition of a clinical trial. The editor granted an exception from ICMJE (International Committee of Medical Journal Editors) rules mandating prospective registration of randomized trials because the risk of bias appears low and because the publication of a protocol is equivalent to trial registration and meets all of the purposes listed by the ICMJE regarding transparency.
CONSORT-eHEALTH checklist (V 1.6.2).
PDF File (Adobe PDF File), 87 KB
Text of the first email and reminders to invite participants to a randomized controlled trial.
DOCX File, 14 KB
Educational intervention.
DOCX File, 218 KB
PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) 2009 checklist (divided into 11 sections for the purpose of the study).
DOCX File, 17 KB
The text of the pre- and postintervention questionnaires.
DOCX File, 19 KB
Four abstracts selected for assessment.
DOCX File, 19 KB
References
- Puljak L. The difference between evidence-based medicine, evidence-based (clinical) practice, and evidence-based health care. J Clin Epidemiol 2022 Feb;142:311-312. [CrossRef] [Medline]
- Seshia SS, Young GB. The evidence-based medicine paradigm: where are we 20 years later? Part 1. Can J Neurol Sci 2013 Jul 23;40(4):465-474. [CrossRef] [Medline]
- Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn't. BMJ 1996 Jan 13;312(7023):71-72. [CrossRef] [Medline]
- Novak K, Mirić D, Jurin A, Vukojević K, Aljinović J, Carić A, et al. Awareness and use of evidence-based medicine databases and Cochrane Library among physicians in Croatia. Croat Med J 2010 Apr;51(2):157-164 [FREE Full text] [CrossRef] [Medline]
- Ulvenes LV, Aasland O, Nylenna M, Kristiansen IS. Norwegian physicians' knowledge of and opinions about evidence-based medicine: cross-sectional study. PLoS One 2009 Nov 13;4(11):e7828 [FREE Full text] [CrossRef] [Medline]
- Munroe D, Duffy P, Fisher C. Nurse knowledge, skills, and attitudes related to evidence-based practice: before and after organizational supports. Medsurg Nurs 2008 Feb;17(1):55-60. [Medline]
- Usher-Smith J, Harrison H, Dennison R, Kelly S, Jones R, Kuhn I, et al. Developing training and opportunities for students in systematic reviews. Med Educ 2021 Nov;55(11):1333-1334 [FREE Full text] [CrossRef] [Medline]
- Fares M, Alahdab F, Alsaied T. Systematic reviews and meta-analyses for cardiology fellows. Congenit Heart Dis 2016 Jul 27;11(4):369-371. [CrossRef] [Medline]
- Vrdoljak D, Petric D, Diminić Lisica I, Kranjčević K, Došen Janković S, Delija I, et al. Knowledge and attitudes towards evidence-based medicine of mentors in general practice can be influenced by using medical students as academic detailers. Eur J Gen Pract 2015 Sep 04;21(3):170-175. [CrossRef] [Medline]
- Balajić K, Barac-Latas V, Drenjancević I, Ostojić M, Fabijanić D, Puljak L. Influence of a vertical subject on research in biomedicine and activities of The Cochrane Collaboration branch on medical students' knowledge and attitudes toward evidence-based medicine. Croat Med J 2012 Aug;53(4):367-373 [FREE Full text] [CrossRef] [Medline]
- Glass GF, Tan H, Chan E. Effect of an evidence-based poster on the knowledge of delirium and its prevention in student nurses: a quasi-experimental study. Contemp Nurse 2021 Dec 09;57(6):462-471. [CrossRef] [Medline]
- Varnell G, Haas B, Duke G, Hudson K. Effect of an educational intervention on attitudes toward and implementation of evidence-based practice. Worldviews Evid Based Nurs 2008;5(4):172-181. [CrossRef] [Medline]
- Stevenson K, Lewis M, Hay E. Do physiotherapists' attitudes towards evidence-based practice change as a result of an evidence-based educational programme? J Eval Clin Pract 2004 May;10(2):207-217. [CrossRef] [Medline]
- Shuval K, Berkovits E, Netzer D, Hekselman I, Linn S, Brezis M, et al. Evaluating the impact of an evidence-based medicine educational intervention on primary care doctors' attitudes, knowledge and clinical behaviour: a controlled trial and before and after study. J Eval Clin Pract 2007 Aug;13(4):581-598. [CrossRef] [Medline]
- Sánchez-Mendiola M, Kieffer-Escobar LF, Marín-Beltrán S, Downing SM, Schwartz A. Teaching of evidence-based medicine to medical students in Mexico: a randomized controlled trial. BMC Med Educ 2012 Nov 06;12:107 [FREE Full text] [CrossRef] [Medline]
- Ngim CF, Ibrahim H, Abdullah N, Lai NM, Tan RK, Ng CS, et al. A web-based educational intervention module to improve knowledge and attitudes towards thalassaemia prevention in Malaysian young adults. Med J Malaysia 2019 Jun;74(3):219-225 [FREE Full text] [Medline]
- Vinokur AD, Merion RM, Couper MP, Jones EG, Dong Y. Educational web-based intervention for high school students to increase knowledge and promote positive attitudes toward organ donation. Health Educ Behav 2006 Dec 31;33(6):773-786. [CrossRef] [Medline]
- Rohwer A, Motaze N, Rehfuess E, Young T. E-learning of evidence-based health care (EBHC) to increase EBHC competencies in healthcare professionals: a systematic review. Campbell Syst Rev 2017 Mar 02;13(1):1-147 [FREE Full text] [CrossRef]
- Cuschieri S. The CONSORT statement. Saudi J Anaesth 2019 Apr;13(Suppl 1):S27-S30 [FREE Full text] [CrossRef] [Medline]
- Phillips AC, Lewis LK, McEvoy MP, Galipeau J, Glasziou P, Moher D, et al. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Med Educ 2016 Sep 06;16(1):237 [FREE Full text] [CrossRef] [Medline]
- Martinić MK, Puljak L, Čivljak M, Marusic A, Sapunar D, Peričić TP, et al. Online educational intervention to improve knowledge about systematic reviews among health science professionals – a randomized controlled trial. OSF. 2022 Aug 02. URL: https://osf.io/x2mf5/ [accessed 2022-06-24]
- Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med 2009 Jul 21;6(7):e1000097 [FREE Full text] [CrossRef] [Medline]
- Puljak L, Sapunar D. Acceptance of a systematic review as a thesis: survey of biomedical doctoral programs in Europe. Syst Rev 2017 Dec 12;6(1):253 [FREE Full text] [CrossRef] [Medline]
- Krnic Martinic M, Pieper D, Glatt A, Puljak L. Definition of a systematic review used in overviews of systematic reviews, meta-epidemiological studies and textbooks. BMC Med Res Methodol 2019 Nov 04;19(1):203 [FREE Full text] [CrossRef] [Medline]
- Sharma R, Lakhani R, Rimmer J, Hopkins C. Surgical interventions for chronic rhinosinusitis with nasal polyps. Cochrane Database Syst Rev 2014 Nov 20;11(11):CD006990. [CrossRef] [Medline]
- Lissiman E, Bhasale A, Cohen M. Garlic for the common cold. Cochrane Database Syst Rev 2012 Mar 14;11(3):CD006206. [CrossRef] [Medline]
- Bogduk N, Dreyfuss P, Govind J. A narrative review of lumbar medial branch neurotomy for the treatment of back pain. Pain Med 2009 Sep 01;10(6):1035-1045. [CrossRef] [Medline]
- Lam SK, Kwong EW, Hung MS, Pang SM, Chiang VC. Nurses' preparedness for infectious disease outbreaks: a literature review and narrative synthesis of qualitative evidence. J Clin Nurs 2018 Apr 11;27(7-8):e1244-e1255. [CrossRef] [Medline]
- Flores-Mateo G, Argimon JM. Evidence based practice in postgraduate healthcare education: a systematic review. BMC Health Serv Res 2007 Jul 26;7:119 [FREE Full text] [CrossRef] [Medline]
- Horsley T, Hyde C, Santesso N, Parkes J, Milne R, Stewart R. Teaching critical appraisal skills in healthcare settings. Cochrane Database Syst Rev 2011 Nov 09(11):CD001270 [FREE Full text] [CrossRef] [Medline]
- Khan KS, Coomarasamy A. A hierarchy of effective teaching and learning to acquire competence in evidenced-based medicine. BMC Med Educ 2006 Dec 15;6:59 [FREE Full text] [CrossRef] [Medline]
- Baig M, Sayedalamin Z, Almouteri O, Algarni M, Allam H. Perceptions, perceived barriers, and practices of physicians' towards evidence-based medicine. Pak J Med Sci 2016 Dec 31;32(1):49-54 [FREE Full text] [CrossRef] [Medline]
- Risahmawati RR, Emura SS, Nishi TT, Koizumi SS. Japanese resident physicians' attitudes, knowledge, and perceived barriers on the practice of evidence based medicine: a survey. BMC Res Notes 2011 Sep 28;4(1):374 [FREE Full text] [CrossRef] [Medline]
- Adeodu A, Agius R, Madan I. Attitudes and barriers to evidence-based guidelines among UK occupational physicians. Occup Med (Lond) 2009 Dec 24;59(8):586-592. [CrossRef] [Medline]
- Halalau A, Holmes B, Rogers-Snyr A, Donisan T, Nielsen E, Cerqueira TL, et al. Evidence-based medicine curricula and barriers for physicians in training: a scoping review. Int J Med Educ 2021 May 28;12:101-124 [FREE Full text] [CrossRef] [Medline]
- Puljak L. Using social media for knowledge translation, promotion of evidence-based medicine and high-quality information on health. J Evid Based Med 2016 Feb;9(1):4-7. [CrossRef] [Medline]
- Jakus D, Behmen D, Buljan I, Marušić A, Puljak L. Efficacy of reminders for increasing volunteer engagement in translating Cochrane plain language summaries: a pilot randomised controlled trial. BMJ Evid Based Med 2021 Apr 07;26(2):49-50. [CrossRef] [Medline]
- Olsson C, Ringnér A, Borglin G. Including systematic reviews in PhD programmes and candidatures in nursing - 'Hobson's choice'? Nurse Educ Pract 2014 Mar;14(2):102-105. [CrossRef] [Medline]
- Chau J, Chien W, Liu X, Hu Y, Jin Y. Needs assessment and expectations regarding evidence-based practice knowledge acquisition and training activities: a cross-sectional study of healthcare personnel in China. Int J Nurs Sci 2022 Jan;9(1):100-106 [FREE Full text] [CrossRef] [Medline]
- Ruppel KJ, Bone BA. Teaching evidence-based practice: knowledge to implementation in a BSN program, part 2. Worldviews Evid Based Nurs 2022 Jan 12 (forthcoming). [CrossRef] [Medline]
- Sorice V, Neal A. Nurses perceived institutional double standards in education and application of evidence-based practice. Evid Based Nurs 2022 Apr 25:ebnurs-2022-103515 (forthcoming). [CrossRef] [Medline]
- Bond M, Bedenlier S, Marín VI, Händel M. Emergency remote teaching in higher education: mapping the first global online semester. Int J Educ Technol High Educ 2021 Aug 30;18(1):50 [FREE Full text] [CrossRef] [Medline]
- Hodges C, Moore S, Lockee B, Trust T, Bond A. The difference between emergency remote teaching and online learning. Educause Review. 2020. URL: https://er.educause.edu/articles/2020/3/the-difference-between-emergency-remote-teaching-and-online-learning [accessed 2022-06-22]
- Weber C, Ntasumbumuyange D, Ngoga E, Bazzett-Matabele L, Francis J, Paley P, et al. Continuing medical education during COVID-19: virtual training for gynecologic oncology management in Rwanda. Int J Gynecol Cancer 2021 Aug 20;31(8):1184-1185 [FREE Full text] [CrossRef] [Medline]
- Burgos L, Gil Ramirez A, Utengen A, Thamman R. Use of Twitter during COVID-19 pandemic: an opportunity for continuing medical education in cardiology. Medicina (B Aires) 2020;80 Suppl 6:122-123 [FREE Full text] [Medline]
- Kisilevsky E, Margolin E, Kohly RP. Access, an unintended consequence of virtual continuing medical education during COVID-19: a department's experience at the University of Toronto. Can J Ophthalmol 2021 Feb;56(1):e18-e19 [FREE Full text] [CrossRef] [Medline]
- Wong A, Vohra R, Kopec K, Brooke N, Stolbach A. The importance of continuing medical education during the COVID-19 pandemic: the Global Educational Toxicology Uniting Project (GETUP). J Med Toxicol 2020 Jul 04;16(3):340-341 [FREE Full text] [CrossRef] [Medline]
- Kanneganti A, Lim KM, Chan GM, Choo S, Choolani M, Ismail-Pratt I, et al. Pedagogy in a pandemic - COVID-19 and virtual continuing medical education (vCME) in obstetrics and gynecology. Acta Obstet Gynecol Scand 2020 Jun 17;99(6):692-695 [FREE Full text] [CrossRef] [Medline]
- Badenes-Ribera L, Frias-Navarro D, Iotti NO, Bonilla-Campos A, Longobardi C. Perceived statistical knowledge level and self-reported statistical practice among academic psychologists. Front Psychol 2018;9:996 [FREE Full text] [CrossRef] [Medline]
- Badenes-Ribera L, Frias-Navarro D, Pascual-Soler M, Monterde-I-Bort H. Knowledge level of effect size statistics, confidence intervals and meta-analysis in Spanish academic psychologists. Psicothema 2016 Nov;28(4):448-456. [CrossRef] [Medline]
Abbreviations
CONSORT: Consolidated Standards of Reporting Trials
EBHC: evidence-based health care
EBM: evidence-based medicine
EBP: evidence-based practice
GREET: Guideline for Reporting Evidence-based Practice Educational Interventions and Teaching
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
RCT: randomized controlled trial
RR: relative risk
SR: systematic review
Edited by A Mavragani; submitted 02.02.22; peer-reviewed by S Bhattacharjya, K Kalavani, N Khan; comments to author 20.04.22; revised version received 10.05.22; accepted 13.05.22; published 25.08.22
Copyright © Marina Krnic Martinic, Marta Čivljak, Ana Marušić, Damir Sapunar, Tina Poklepović Peričić, Ivan Buljan, Ružica Tokalić, Snježana Mališa, Marijana Neuberg, Kata Ivanišević, Diana Aranza, Nataša Skitarelić, Sanja Zoranić, Štefica Mikšić, Dalibor Čavić, Livia Puljak. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 25.08.2022.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.