Published in Vol 25 (2023)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/43883.

Comparative Effectiveness of eConsent: Systematic Review

Review

1AstraZeneca BV, The Hague, Netherlands

2Signant Health, London, United Kingdom

3Oxford PharmaGenesis, Oxford, United Kingdom

4AstraZeneca, Gothenburg, Sweden

5Nottingham Trent University, Nottingham, United Kingdom

Corresponding Author:

Edwin Cohen, MSc

AstraZeneca BV

Prinses Beatrixlaan, 582

The Hague, 2595 BM

Netherlands

Phone: 31 79 363 2546

Email: edwin.cohen@astrazeneca.com


Background: Providing informed consent means agreeing to participate in a clinical trial and having understood what is involved. Flawed informed consent processes, including missing dates and signatures, are common regulatory audit findings. Electronic consent (eConsent) uses digital technologies to enable the consenting process. It aims to improve participant comprehension and engagement with study information and to address data quality concerns.

Objective: This systematic literature review aimed to assess the effectiveness of eConsent in terms of patient comprehension, acceptability, usability, and study enrollment and retention rates, as well as the effects of eConsent on the time patients took to perform the consenting process (“cycle time”) and on-site workload in comparison with traditional paper-based consenting.

Methods: The systematic review was conducted and reported in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines. Ovid Embase and Ovid MEDLINE were systematically searched for publications reporting original, comparative data on the effectiveness of eConsent in terms of patient comprehension, acceptability, usability, enrollment and retention rates, cycle time, and site workload. The methodological validity of the studies that compared outcomes for comprehension, acceptability, and usability across paper consent and eConsent was assessed. Study methodologies were categorized as having “high” validity if comprehensive assessments were performed using established instruments.

Results: Overall, 37 publications describing 35 studies (13,281 participants) were included. All studies comparing eConsenting and paper-based consenting for comprehension (20/35, 57% of the studies; 10 with “high” validity), acceptability (8/35, 23% of the studies; 1 with “high” validity), and usability (5/35, 14% of the studies; 1 with “high” validity) reported significantly better results with eConsent, better results but without significance testing, or no significant differences in overall results. None of the studies reported better results with paper than with eConsent. Among the “high” validity studies, 6 studies on comprehension reported significantly better understanding of at least some concepts, the study on acceptability reported significantly higher satisfaction scores, and the study on usability reported significantly higher usability scores with eConsent than with paper (P<.05 for all). Cycle times were increased with eConsent, potentially reflecting greater patient engagement with the content. Data on enrollment and retention were limited. Comparative data from site staff and other study researchers indicated the potential for reduced workload and lower administrative burden with eConsent.

Conclusions: This systematic review showed that compared with patients using paper-based consenting, patients using eConsent had a better understanding of the clinical trial information, showed greater engagement with content, and rated the consenting process as more acceptable and usable. eConsent solutions thus have the potential to enhance understanding, acceptability, and usability of the consenting process while inherently being able to address data quality concerns, including those related to flawed consenting processes.

J Med Internet Res 2023;25:e43883

doi:10.2196/43883


Introduction

Background

Informed consent to participate remains a fundamental aspect of ethical clinical research. In accordance with good clinical practice quality standards, potential participants of a clinical trial must be given adequate information about the study before they decide whether to participate [1]. Providing informed consent means agreeing to take part in the trial and understanding what is involved, including the risks and benefits of participation [1]. Traditionally, the trial information is conveyed using printed documents that potential participants read before signing to indicate their consent to participate. The informed consent form (ICF), and the associated effective communication of study information, remains among the most challenging and complex processes in the clinical trial landscape. ICFs are known to have poor readability and to take too long to understand and digest effectively [2]. A review of ICFs developed for use in phase III oncology clinical trials showed that these were, on average, 21.4 pages long and that many participants had only a poor understanding of the key elements of their trial [2]. Poor understanding of the study requirements and treatment has been cited as a reason for early withdrawal from clinical trials [3]. To ensure that potential trial participants fully comprehend the study information, ICFs need to convey complicated and technical information in a way that meets the target group’s health literacy capabilities. ICFs also have to maintain readers’ engagement sufficiently to ensure that they can make a fully informed decision on whether to participate.

In addition to patient-centered challenges of the ICF process, administrative aspects of the consenting process can pose challenges to investigators conducting clinical trials. Flawed informed consent processes are listed within the top 10 cited regulatory deficiencies and audit findings and are the third highest reason for US Food and Drug Administration (FDA) warning letters to clinical investigators [4-6]. Informed consent was among the top 2 most frequently observed issues in a recent auditing case study, conducted across 37 centers, and problems were identified related to processing errors and missing operational records [7]. Findings included missing signatures, incomplete ICFs, signing of incorrect ICF versions, and unauthorized site staff obtaining consents [7]. These are serious issues that can undermine the integrity of the consent process and the study, and they can result in the inability of researchers to analyze and report the data as intended.

Electronic consent (eConsent) uses digital technologies to enable the consenting process. Components can include multimedia to complement text-based content; interactivity (eg, to handle questions, test knowledge, explain definitions, and allow patients to resume the process from where they left off); electronic signature capture; status dashboards; and version control technology. eConsent aims to improve participant comprehension and engagement with study information and to address data quality concerns that may limit study integrity [8,9]. Although eConsent has been in use for about 15 years, its adoption has been slow until recently, when its accelerated uptake has been driven primarily by the COVID-19 pandemic [9]. Much of the supporting information on the promised benefits of eConsent comes from informal commentaries and reports from eConsent solution providers. In addition to digital technology, and just as with paper-based ICFs, eConsent solutions require good content to be effective. Similar to the computing analogy of “garbage in, garbage out,” poor eConsent content will result in poor overall effectiveness in terms of patient comprehension, acceptability, and usability, irrespective of the quality of the delivery technology.

Objective

The aim of our systematic review of peer-reviewed research was to provide a summary of qualitative and quantitative evidence to draw conclusions on the relative effectiveness of eConsent in comparison with traditional paper-based consenting.


Methods

Literature Searches

The systematic literature review was conducted and reported in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines [10]. A completed PRISMA checklist is included in Multimedia Appendix 1 [10]. We systematically searched the peer-reviewed literature for full papers and conference abstracts relevant to our review using Ovid Embase and Ovid MEDLINE on November 11, 2021. Ovid MEDLINE is equivalent in content to PubMed and additionally includes advanced search options (eg, adjacency operator and within-phrase wildcard) [11,12]. The search string contained terms related to electronic and consenting as follows: ([dynamic OR electronic OR interactive OR multimedia OR online OR tablet OR computer OR digital OR virtual] ADJ4 [consent* OR econsent OR e-consent]). Terms related to “electronic” were limited to the title, abstract, and keywords of a publication. The operator “ADJ4” was used to identify “electronic”- and “consent”-related terms separated by ≤3 words to filter for literature relevant to this review. The records were screened and selected based on our review of the title, abstract, and full text. No language restrictions or publication date limits were applied. The review was not registered, and a protocol was not prepared.
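For readers who want to apply the same proximity filter to records exported from other sources, the snippet below is a minimal sketch of how the ADJ4 logic could be approximated locally with a regular expression. Ovid evaluates the operator server-side, so this is an illustration of the search logic rather than the tooling used in this review.

```python
import re

# Hypothetical local approximation of the Ovid ADJ4 proximity operator:
# an "electronic"-related term and a "consent"-related term separated by
# <=3 intervening words, in either order.
ELECTRONIC = r"(?:dynamic|electronic|interactive|multimedia|online|tablet|computer|digital|virtual)"
CONSENT = r"(?:consent\w*|econsent|e-consent)"
ADJ4 = re.compile(
    rf"\b{ELECTRONIC}\b(?:\W+\w+){{0,3}}\W+{CONSENT}\b"
    rf"|\b{CONSENT}\b(?:\W+\w+){{0,3}}\W+{ELECTRONIC}\b",
    re.IGNORECASE,
)

def matches_search(title: str, abstract: str, keywords: str) -> bool:
    """Return True if the proximity pattern occurs in title, abstract, or keywords."""
    return ADJ4.search(" ".join((title, abstract, keywords))) is not None

# Example: "tablet" and "consent" are 2 words apart, so the record is retained.
print(matches_search("A tablet-based informed consent tool", "", ""))  # True
```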

Inclusion and Exclusion Criteria

Publications reporting original, comparative data on the effectiveness of eConsent in terms of patient comprehension, acceptability, and usability were eligible for inclusion. Comparative data on the effect of eConsent on clinical study enrollment and retention rates, cycle time (ie, time taken to consent), site workload, and stakeholder views were also considered relevant. Head-to-head comparisons of paper-based methods versus eConsent were of particular relevance. Publications that did not present original data (eg, reviews, editorials, and commentaries) were excluded.

Study Selection

Following the systematic literature searches, duplicate records were removed using the deduplicate option in Ovid. All the remaining records were exported to EndNote X9 (Clarivate), and further duplicates were identified and removed manually (see the sketch below). After an initial screening of titles and abstracts, in which one reviewer (AB) excluded clearly ineligible publications (eg, reviews), 2 reviewers (EC and BB) independently assessed the remaining records and the corresponding full texts; the team of 3 reviewers resolved any disagreements through consensus-based discussion. Reasons for inclusion and exclusion were captured.
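As an illustration of the manual deduplication step, the sketch below detects residual duplicates by comparing normalized titles. It assumes records with a title field and is not the EndNote workflow itself.

```python
# Illustrative sketch (not the review's actual tooling): collapse records
# whose titles differ only in case, punctuation, or whitespace.
def normalize_title(title: str) -> str:
    """Lowercase and strip non-alphanumeric characters so near-identical titles collide."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def dedupe(records: list[dict]) -> list[dict]:
    seen: set[str] = set()
    unique: list[dict] = []
    for rec in records:
        key = normalize_title(rec.get("title", ""))
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Both spellings collapse to one record.
records = [{"title": "eConsent in trials"}, {"title": "eConsent in Trials."}]
print(len(dedupe(records)))  # 1
```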

Data Collection and Summary

Data extraction (conducted by AB and reviewed by BB) included measures and outcomes for patient comprehension, acceptability, usability, enrollment rates, retention rates, cycle time, site workload, and stakeholder views. The extracted data were summarized descriptively. Data on patient comprehension, acceptability, and usability with eConsent versus paper-based ICFs were tabulated as part of the main descriptive summary. An overview of all the studies identified for inclusion is provided in Multimedia Appendix 2 [13-49].
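A minimal sketch of what an extraction record covering these domains might look like is shown below; the field names mirror the outcome domains listed above and are illustrative, not the reviewers’ actual extraction form.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative per-study extraction schema; each field holds the extracted
# finding for one outcome domain, or None if the study did not cover it.
@dataclass
class ExtractionRecord:
    study_id: str
    comprehension: Optional[str] = None
    acceptability: Optional[str] = None
    usability: Optional[str] = None
    enrollment: Optional[str] = None
    retention: Optional[str] = None
    cycle_time: Optional[str] = None
    site_workload: Optional[str] = None
    stakeholder_views: Optional[str] = None

rec = ExtractionRecord(study_id="Abujarad 2021 [13]",
                       comprehension="QuIC: no significant difference")
```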

Study Categorization

For studies comparing patient comprehension, acceptability, and usability with eConsent versus paper-based ICFs, we estimated the quality of the evidence by categorizing their methodological validity as “high,” “moderate,” or “limited.” The study methodologies that we categorized as having high validity (score=+++) were those that used comprehensive assessments including detailed and open-ended questions (eg, “Tell me what will be done during the study visits”), possibly using established instruments as part of the formal assessments. Methodologies that involved self-rating by participants (eg, “Did you understand the following aspects of the study?”), without formal testing, were categorized as having moderate validity (score=++). When a methodology that involved limited questioning was used in the studies or when methodological details were not reported, we categorized these studies as having limited validity (score=+).
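The categorization can be summarized as a simple decision rule; the sketch below is illustrative, with boolean inputs standing in for the judgments the reviewers made when reading each study’s methods.

```python
# Illustrative decision rule for the three validity categories; the inputs
# are assumed reviewer judgments, not fields recorded in the publications.
def validity_category(comprehensive_assessment: bool, self_rating_only: bool) -> str:
    if comprehensive_assessment:
        # "high": detailed, open-ended questioning, possibly with established instruments
        return "+++"
    if self_rating_only:
        # "moderate": participant self-rating without formal testing
        return "++"
    # "limited": limited questioning or methodological details not reported
    return "+"

print(validity_category(comprehensive_assessment=True, self_rating_only=False))  # +++
```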


Results

Overview

The systematic literature search identified 1872 publications (Figure 1). Of these, 608 (32.5%) duplicates were removed before screening, and a further 1228 (65.6%) publications were excluded based on screening of the title, abstract, and full publication, with the most common reason for exclusion being that the publication did not report on eConsent research. A total of 36 publications met the eligibility criteria [13-48], and an additional outcomes publication [49] was retrieved manually based on the identification of its accompanying methodology article during screening. Thus, in total, 37 publications (32 full publications and 5 conference abstracts) were included in this review (Multimedia Appendix 2) [13-49].
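As a quick arithmetic check, the reported flow is internally consistent; the sketch below simply recomputes the counts and percentages.

```python
# Recomputing the PRISMA flow reported above (verification sketch only).
identified = 1872
duplicates = 608      # removed before screening
screened_out = 1228   # excluded by title, abstract, and full-text screening
eligible = identified - duplicates - screened_out  # 36 publications
included = eligible + 1  # plus 1 outcomes publication retrieved manually
assert included == 37
print(f"duplicates: {duplicates / identified:.1%}")      # 32.5%
print(f"screened out: {screened_out / identified:.1%}")  # 65.6%
```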

The included publications together described 35 studies (2 studies were each covered by 2 publications). Most of the studies (28/35, 80%) were from North America (United States: n=26, 93%; Canada: n=2, 7%; Multimedia Appendix 2). The remaining studies (7/35, 20%) were from Europe (Italy: n=1, 3%; Ireland and United Kingdom: n=1, 3%; United Kingdom: n=1, 3%), Australia (n=2, 6%), and the Gambia (n=1, 3%), and 1 (3%) was multinational. Taken together, these studies included a total of 13,281 participants, with the number of participants per study ranging from 9 to 3485. In total, 13 (37%) of the 35 studies were conducted as part of randomized (n=10) or nonrandomized (n=3) clinical research studies, 14 (40%) were simulated consent studies, and 8 (23%) were survey or interview studies. Most of the research and simulated consent studies (23/27, 85%) were conducted in person (Multimedia Appendix 2). Comparative data on patient comprehension, acceptability, and usability with eConsenting were provided in 26, 13, and 6 studies, respectively; of these, 20, 8, and 5 studies, respectively, included comparisons of eConsent versus paper-based ICFs. Aspects of eConsent in relation to enrollment rates, retention, cycle time, staff workload, and stakeholder views were covered in 12, 1, 13, 3, and 5 studies, respectively. Participant ages ranged from 8 to 91 years in the 14 studies that reported age ranges. Among the 23 studies that provided sufficient information on average (mean or median) age, the average age was <50 years in 12 studies and ≥50 years in 11 studies.

Figure 1. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flowchart of the systematic literature search. eConsent: electronic consent.

Patient Comprehension

Overall, 26 studies (8778 participants in total) assessed aspects of patient comprehension with eConsenting.

Patient Comprehension: eConsenting Versus Paper

Comparative information on comprehension with eConsenting versus paper-based ICFs was provided in 20 studies, including a total of 6769 participants (of whom 5809 participants contributed comparative data on comprehension; Table 1). All 20 studies reported significantly better understanding with eConsent, better understanding but without significance testing, or no significant differences in overall understanding (Table 1).

Table 1. Studies providing comparative findings on comprehension with electronic consent (eConsent) versus paper informed consent form. Each entry lists the sample size (N), participant age (years), the comprehension measure with its methodological validityb in parentheses, and the comprehension findingsa.

Abujarad et al [13], 2021 (N=50)
  • Age: eConsent: mean 47 (SD 15; range NRc); comparator: mean 38 (SD 15; range NR)
  • Measures: QuICd (+++); self-rating, not tested (++)
  • Findings: QuIC: no significant difference. Self-rating: significantly better scores with eConsent vs paper for 2 of 4 informed consent–related concepts (P=.02; P=.045); other 2 concepts not significantly different

Afolabi et al [14], 2015 (N=311)
  • Age: mean NR (SD NR; range NR); >90% aged 18-49
  • Measure: DICCQe (+++)
  • Findings: Better scores with eConsent vs paper and verbal. Differences significant on 3 of 4 days tested: day 0 (P=.04), day 14 (P=.04), and day 21 (P=.04)

Bickmore et al [15], 2009 (N=29)
  • Age: mean 60 (SD NR; range 28-91)
  • Measure: BICEPf (+++)
  • Findings: Better scores with eConsent vs paper or verbal. Significantly different scores between paper, verbal, and eConsent (P=.006)

Buckley et al [16], 2020 (N=97)
  • Age: NR
  • Measure: 10 questions (details NR) (+)
  • Findings: Better scores with eConsent vs paper for both study protocols tested. Difference significant for 1 protocol (“genomic”; P<.01)

Chalil Madathil et al [18], 2013 (N=40)
  • Age: mean NR (SD NR; range 18-77)
  • Measure: 7 questions (complicated language) (+)
  • Findings: Study researchers report that results suggested better understanding with iPad vs paper (full details NR)

Chapman et al [19,20], 2021 and 2020 (N=298)
  • Age: mean 63 (SD 8; range 45-74)
  • Measure: 5 true or false questions (+)
  • Findings: Significantly better scores with eConsent vs paper for questions on participation requirements (P<.001) and data sharing (P=.03); no significant differences for other 3 questions

Harmell et al [25], 2012 (N=35)
  • Age: outpatients: eConsent mean 57 (SD 10), comparator mean 57 (SD 10); healthy individuals: eConsent mean 49 (SD 16), comparator mean 53 (SD 12); ranges NR
  • Measures: UBACCg (+++); MacCAT-CRh (+++)
  • Findings: UBACC: outpatients: significantly better scores with eConsent vs paper (P=.03; Cohen d=0.94); healthy individuals: no significant difference. MacCAT-CR: no significant differences in outpatients or healthy individuals

Jayasinghe et al [27], 2019 (N=35)
  • Age: focus group: mean 77 (SD 8; range NR); pilot: mean 75 (SD 7; range NR)
  • Measure: UBACC (+++)
  • Findings: Better scores with eConsent vs paper at baseline and week 1, but effect not statistically significant (P=.50; Hedges g=0.30)

Jeste et al [28], 2009 (N=60)
  • Age: paper: mean 54 (SD 9); multimedia: mean 55 (SD 7)
  • Measures: UBACC (+++); MacCAT-CR (+++)
  • Findings: UBACC: outpatients: significantly better scores with eConsent vs paper (P<.001; 95% CI 0.59-0.77); healthy individuals: no significant difference. MacCAT-CR: outpatients: better scores with eConsent vs paper for all 4 concepts, significant for understanding (P=.006; 95% CI 0.54-0.74) and choice (P=.02; 95% CI 0.51-0.57); healthy individuals: significantly better scores with eConsent vs paper for understanding (P=.02; 95% CI 0.52-0.79), no significant difference for other 3 concepts

Knapp et al [49], 2021 (N=109)
  • Age: median 13 (range 11-14)
  • Measure: custom survey (DMQi; 9 questions) (++)
  • Findings: Significantly better scores with eConsent vs paper for understanding (P=.003) and confidence in decision-making (P=.04); no significant differences for other 7 questions or in total scores

McCarty et al [30], 2015 (N=56)
  • Age: mean 73 (SD NR; range 55-86)
  • Measure: custom survey after 6 months (38 questions) (++)
  • Findings: No significant differences for 36 of 38 questions. Significantly better scores with eConsent vs paper for 2 knowledge questions (both P<.05)

McGraw et al [32], 2012 (N=43)
  • Age: mean 38 (SD NR; range 18-68)
  • Measure: interviews (+)
  • Findings: No difference in the proportion of participants recalling concepts spontaneously

Rothwell et al [35], 2020 (N=669)
  • Age: mean 30 (SD 5)
  • Measure: QuIC parts A and B (+++)
  • Findings: Video vs paper: significantly better scores with video for knowledge (P<.001) and understanding (P=.003). Interactive app vs paper: significantly better scores with app for knowledge (P=.003); no significant difference for understanding

Rothwell et al [36], 2014 (N=62)
  • Age: NR
  • Measure: custom survey (14 questions) (++)
  • Findings: Significantly better scores with eConsent vs paper for 4 of 14 questions (P=.047; P=.002; P<.001; P=.02); no significant differences for other 10 questions

Rowbotham et al [37], 2013 (N=75)
  • Age: mean 50 (SD NR; range 18-80)
  • Measure: custom survey (12 questions) (++)
  • Findings: Significantly better scores with eConsent vs paper (P<.001)

Simon et al [39], 2016 (N=200)
  • Age: mean 47 (SD NR; range 18-86)
  • Measure: QuIC parts A and B (+++)
  • Findings: eConsent improved understanding vs paper (P=.04; partial η2=0.021); no difference in confidence of understanding

Simon et al [41], 2021 (N=501)
  • Age: mean 47 (SD NR; range 18-84)
  • Measure: QuIC parts A and B (+++)
  • Findings: No difference overall in understanding. Confidence in understanding was significantly lower with eConsent than paper (P=.02)

Sonne et al [42], 2013 (N=61)
  • Age: mean 43 (SD 14; range NR)
  • Measure: custom survey (20 questions) (++)
  • Findings: No significant differences

Varnhagen et al [45], 2005 (N=3045)
  • Age: mean NR (SD NR; range NR); 84% aged ≥45
  • Measure: unprompted recall (+)
  • Findings: No significant differences

Warriner et al [47], 2016 (N=33)
  • Age: eConsent: mean 69 (SD 7); paper: mean 71 (SD 9); ranges NR
  • Measures: Health-ITUESj, QuIC (+++)
  • Findings: No significant differences

aSignificant P values, effect sizes, and CIs are reported when provided in the publications.

bMethodological validity was categorized as “high” (+++), “moderate” (++), or “limited” (+).

cNR: not reported.

dQuIC [50]: Quality of Informed Consent. Part A=20 questions self-rated (agree, unsure, and disagree); part B=14 questions to self-rate the understanding of different aspects on a scale of 1 to 5.

eDICCQ [51]: Digitized Informed Consent Comprehension Questionnaire. A total of 26 questions (9 yes or no, 6 multiple-choice single answers, 4 multiple-choice multiple answers, and 7 verbal recall) and investigator-rated responses.

fBICEP [52]: Brief Informed Consent Evaluation Protocol. Contains 12 open questions, scored by the interviewer and assesses pressure to participate, understanding of care if not consented, benefits, risks, study requirements, purpose of study, when the study ends, and when participants could withdraw consent.

gUBACC [53]: University of California San Diego Brief Assessment of Capacity to Consent. It contains 10 open questions on study purpose, requirement to participate, impact of withdrawing, study requirements, risks and benefits, and costs.

hMacCAT-CR [54]: MacArthur Competence Assessment Tool for Clinical Research. Understanding (scores range from 0 to 26), Appreciation (0-6), Reasoning (0-8), and expression of a choice.

iDMQ: Decision-Making Questionnaire.

jHealth-ITUES: Health Information Technology Usability Evaluation Scale.

Different methods were used across studies to assess comprehension, some of which were more robust than others in their approaches. Overall, 10 studies included established instruments to assess comprehension, and their methodological validity was thus categorized as “high” (score=+++) [13-15,25,27,28,35,39,41,47]. The instruments used included the Brief Informed Consent Evaluation Protocol [15,52], Digitized Informed Consent Comprehension Questionnaire [14,51], MacArthur Competence Assessment Tool for Clinical Research [25,28,54], Quality of Informed Consent [13,35,39,41,47,50], and University of California San Diego Brief Assessment of Capacity to Consent [25,27,28,53].

Overall, 60% (6/10) of the “high” validity studies reported significantly better understanding with eConsent than paper-based ICFs for at least some of the concepts assessed using established instruments, with no statistical tests in favor of the paper process [14,15,25,28,35,39]. The remaining 4 (40%) of the 10 studies reported no significant difference in comprehension between eConsent and a paper-based consent process [13,27,41,47], with 1 study reporting statistically nonsignificant better comprehension using eConsent [27]. However, confidence in understanding was significantly lower with eConsent than with paper-based ICFs in 1 study that observed no difference in overall understanding [41].

Furthermore, 6 studies included custom surveys or participant self-rating without formal testing to evaluate comprehension, and their methodological validity was thus categorized as “moderate” (score=++) [13,30,36,37,42,49]; one of these studies used both “high” and “moderate” validity methodologies [13]. Of the 6 “moderate” validity studies, 67% (n=4) of studies reported significantly better comprehension with eConsent than with paper-based ICFs for at least some of the concepts assessed [13,36,37,49], with the remainder reporting no significant differences [30,42].

The remaining 5 studies (covered by 6 publications) used limited questioning or did not report methodological details, and their methodological validity was thus categorized as “limited” (score=+) [16,18-20,32,45]. Of the “limited” validity studies, 3 (covered by 4 publications) reported better comprehension with eConsent than with paper-based ICFs for at least some aspects [16,18-20], and 2 reported no differences [32,45].

Patient Comprehension: Other Evidence

In the study by Rothwell et al [36] (Table 1), participants in the eConsent group were interviewed after the consent process; several noted that the eConsent format was easy to understand and held their attention more than a paper-based approach would have done. Several further studies assessed comprehension by comparing different electronic formats, by comparing baseline and postconsent time points, or by describing results from interviews about patient preferences [22-24,33,34,43,48] (Multimedia Appendix 2). Comprehension was significantly better with a highly interactive eConsent version than with less-interactive versions in a study by Geier et al [22]. Participants in a study by Naeim et al [33] found that information was easier to understand when the video presentation was animated rather than text based. Perrault and Keating [34] found that text layouts using line spacing, bold font, and bullet points could improve comprehension compared with a bullet-pointed flowchart. Golembiewski et al [23] and Harle et al [24] observed no significant differences in understanding between a standard tablet-based version and versions that had key terms hyperlinked to additional research-related information. Tait et al [43] showed that parents’ and children’s understanding of clinical trial–related terminology was improved after eConsenting compared with baseline. Most participants (67%) in a survey of clinical trial researchers by Zeps et al [48] thought that eConsent would improve patients’ comprehension.

Patient Acceptability

Overall, 13 studies (1694 participants in total) assessed aspects of patient acceptability with eConsenting.

Patient Acceptability: eConsenting Versus Paper

Comparative information on the acceptability of eConsenting versus paper-based ICFs was provided in 8 studies, including a total of 631 participants (of whom 621 contributed comparative data on acceptability; Table 2). All 8 studies reported significantly higher satisfaction or enjoyment with eConsent, higher satisfaction but without significance testing, or no differences in acceptability (Table 2). Only one of the studies was categorized as having “high” methodological validity, having used an established instrument to assess acceptability, in this case, the Computer System Usability Questionnaire [18,55]. This study reported statistically significantly higher satisfaction scores with eConsent than with paper-based ICFs [18]. The methodology used to assess acceptability was categorized as having “limited” validity in the remaining 7 studies (covered by 8 publications) [13,15,19,20,25,37,42,47]. Of these, 6 studies reported higher acceptability with eConsent than with paper-based ICFs [13,15,25,37,42,47], and in 3 of these studies, at least some of the differences were statistically significant [13,15,37].

Table 2. Studies providing comparative findings on acceptability of electronic consent (eConsent) versus paper informed consent form. Each entry lists the sample size (N), participant age (years), the acceptability measure with its methodological validitya in parentheses, and the acceptability findings.

Abujarad et al [13], 2021 (N=50)
  • Age: eConsent: mean 47 (SD 15; range NRb); comparator: mean 38 (SD 15; range NR)
  • Measure: 3 questions as part of a 12-question survey (Likert scale) (+)
  • Findings: Significantly higher satisfaction scores with eConsent vs paper for 1 of 3 questions (P=.01); no significant differences for other 2 questions

Bickmore et al [15], 2009 (N=29)
  • Age: mean 60 (SD NR; range 28-91)
  • Measure: 1 question (Likert scale) (+)
  • Findings: Significantly higher satisfaction scores with eConsent vs paper or verbal (P=.02)

Chalil Madathil et al [18], 2013 (N=40)
  • Age: mean NR (SD NR; range 18-77)
  • Measure: CSUQc overall satisfaction score (+++)
  • Findings: Higher satisfaction scores with eConsent formats vs paper. Difference across the different formats statistically significant (P<.05)

Chapman et al [19], 2021; Chapman et al [20], 2020 (N=298)
  • Age: mean 63 (SD 8; range 45-74)
  • Measure: 3 questions (multiple choice) (+)
  • Findings: Similar levels of overall acceptability

Harmell et al [25], 2012 (N=35)
  • Age: outpatients: eConsent mean 58 (SD 9), comparator mean 57 (SD 10); healthy individuals: eConsent mean 49 (SD 16), comparator mean 53 (SD 12); ranges NR
  • Measure: 1 question (multiple choice) (+)
  • Findings: Proportion of patients preferring current vs past consenting experience higher with eConsent vs paper (P value NR)

Rowbotham et al [37], 2013 (N=75)
  • Age: mean 50 (SD NR; range 18-80)
  • Measure: 2 questions (Likert scales) (+)
  • Findings: Significantly higher scores with eConsent vs paper for enjoyment (P<.05); no significant difference for satisfaction (P=.09)

Sonne et al [42], 2013 (N=61)
  • Age: mean 43 (SD 14; range NR)
  • Measure: 1 question (+)
  • Findings: 79% of participants preferred eConsent over paper format

Warriner et al [47], 2016 (N=33)
  • Age: tablet: mean 69 (SD 7); paper: mean 71 (SD 9); ranges NR
  • Measure: 3 questions (Likert scales) (+)
  • Findings: Higher satisfaction with eConsent vs paper, but difference not statistically significant

aMethodological validity was categorized as “high” (+++), “moderate” (++), or “limited” (+).

bNR: not reported.

cCSUQ [55]: Computer System Usability Questionnaire. It contains 19 questions measuring overall satisfaction, system usefulness, information quality, and interface quality.

Patient Acceptability: Other Evidence

Several studies described viewpoints regarding consenting format preferences or compared acceptability across different electronic formats [23,24,26,29,31,46] (Multimedia Appendix 2). The survey and interview results indicated a preference for eConsent over paper-based ICFs. McGowan et al [31] reported that 52% of their study sample preferred eConsent, 46% had no preference, and only 3% would have preferred face-to-face consenting. Similarly, in a survey by Vercauteren et al [46], 41% of the respondents preferred eConsent, 41% had no preference, and only 16% preferred paper-based ICFs. The focus group participants in a study by Jimison et al [29] thought that eConsent was useful and could replace paper-based ICFs. Only 22% of the legally authorized representatives who eConsented on behalf of clinical study patients would have preferred a paper-based ICF in a study by Haussen et al [26]. The studies by Golembiewski et al [23] and Harle et al [24] compared different formats of eConsenting and observed no significant differences in acceptability between the standard version and the version with hyperlinks to additional materials.

Patient Usability

Overall, 6 studies (582 participants in total) assessed aspects of patient usability with eConsenting.

Patient Usability: eConsenting Versus Paper

Comparative information on the usability of eConsenting versus paper-based ICFs was provided in 5 studies, including a total of 542 participants (of whom 532 contributed comparative data on usability; Table 3). All 5 studies reported significantly better usability with eConsenting, better usability but without significance testing, or no differences in usability (Table 3). One study had “high” methodological validity for assessing usability, having measured this using the Computer System Usability Questionnaire, and reported statistically significantly higher usability scores with eConsent than with paper-based ICFs [18]. One study had “moderate” methodological validity and observed no overall significant difference in usability between eConsent and paper-based ICFs [27]. Three studies (4 publications) with “limited” validity reported better usability with eConsent than with paper-based ICFs [13,19,20,49], and in 2 of these studies, at least some of the differences were statistically significant [13,19,20].

Table 3. Studies providing comparative findings on the usability of electronic consent (eConsent) versus paper informed consent form. Each entry lists the sample size (N), participant age (years), the usability measure with its methodological validitya in parentheses, and the usability findings.

Abujarad et al [13], 2021 (N=50)
  • Age: eConsent: mean 47 (SD 15; range NRb); comparator: mean 38 (SD 15; range NR)
  • Measure: 1 question (Likert scale) (+)
  • Findings: eConsent participants scored the process as significantly less difficult than paper consent participants (P=.02)

Chalil Madathil et al [18], 2013 (N=40)
  • Age: mean NR (SD NR; range 18-77)
  • Measure: CSUQc system usefulness and interface quality subscales (+++)
  • Findings: Higher usefulness and interface quality scores with eConsent formats vs paper. Difference across the different formats statistically significant (P<.05)

Chapman et al [19,20], 2021 and 2020 (N=298)
  • Age: mean 63 (SD 8; range 45-74)
  • Measure: 2 questions (multiple choice) plus successful completion (+)
  • Findings: Significantly better scores with eConsent vs paper for engagement with study information (P<.001); no significant difference for improvement. All participants successfully completed the consenting process

Jayasinghe et al [27], 2019 (N=35)
  • Age: mean 75 (SD 7; range NR)
  • Measure: 10 questions (Likert scales) (++)
  • Findings: Overall, no statistically significant difference with eConsent vs paper

Knapp et al [49], 2021 (N=109)
  • Age: median 13 (range 11-14)
  • Measure: 1 question (Likert scale) (+)
  • Findings: Better scores with eConsent vs paper (P value NR)

aMethodological validity was categorized as “high” (+++), “moderate” (++), or “limited” (+).

bNR: not reported.

cCSUQ [55]: Computer System Usability Questionnaire. It contains 19 questions measuring overall satisfaction, system usefulness, information quality, and interface quality.

Patient Usability: Other Evidence

Participants who were asked about their impressions of the electronic and paper-based informed consent processes described the electronic process as well organized, easy to use, and useful in a study by Simon et al [40] (Multimedia Appendix 2).

Enrollment Rates

A total of 12 studies (6399 participants in total) assessed the effect of the ICF format on aspects of patient enrollment, with mixed results. Comparisons of consenting rates with eConsenting versus paper-based ICFs were reported in 5 studies [15,18,28,35,41]. In the study by Bickmore et al [15], a significantly higher proportion of participants in the eConsent group than in the paper group signed their ICFs (P=.01). Consenting rates were also higher with eConsent than with paper-based ICFs in a study by Chalil Madathil et al [18] (P value not reported). Consenting rates were similar between the groups in a study by Jeste et al [28] (P value not reported), and Rothwell et al [35] reported that consenting rates were similar in the paper-based ICF and video eConsent groups but lower in the app eConsent group (P value not reported). In a study by Simon et al [41], enrollment was significantly higher with a face-to-face informed consent process than with eConsenting (P=.004), although immediately after the consenting process, similar proportions of the 2 groups had reported their intention to enroll; the eConsent process was conducted at the same location as the face-to-face process.

Overall, 4 studies reported eConsenting rates using different electronic media formats [21,23,24,33,38]. No significant differences in enrollment rates were observed with animated versus text-based video consents by Naeim et al [33], with video- versus text-based consenting by Fanaroff et al [21], or with different levels of eConsent interactivity by Golembiewski et al [23] and Harle et al [24]. Siegel et al [38] observed an increase in enrollment rates after a content redesign and attributed the increased rates to the web-based consenting being directly integrated with new patient on-boarding.

Furthermore, 3 studies described results about preferences and found that the format of the ICF made little difference to participants’ decision-making regarding study participation [13,17,49]. In the study by Abujarad et al [13], participants were asked to score the importance of the consenting process in their decision to participate; scores were not significantly different between the paper ICF and eConsent groups. In the study by Knapp et al [49], similar proportions of patients in the paper-based ICF and eConsent groups found that the trial information provided helped them make their decision about whether to take part. Study researchers who were surveyed in the study by Cagnazzo et al [17] thought that the use of eConsent had little influence on whether patients declined to participate in a study.

Retention

None of the included studies reported overall study retention comparisons between the 2 consenting approaches of paper-based ICFs versus eConsenting. Fanaroff et al [21] (3485 participants) assessed different formats of eConsenting and found no statistically significant differences in the proportions of enrolled patients who subsequently completed the 2 requested study procedures, namely, a blood draw and survey questions.

Cycle Time

A total of 13 studies (2063 participants in total) assessed cycle time: 10 studies (covered in 11 publications) assessed the comparative effect of eConsent versus paper-based ICFs on consenting times, 2 studies asked about perceived consenting time, and 1 study (covered in 2 publications) assessed consenting times with different electronic formats. eConsenting took more time than paper consenting in the studies by Chapman et al [19,20] (P=.006), Jayasinghe et al [27] (P<.001), McCarty et al [30] (P<.001), Rowbotham et al [37] (P<.001), Simon et al [39] (P<.001), Sonne et al [42] (P value not reported), and Varnhagen et al [45] (P<.001; partial η2=0.36). eConsenting was faster than paper consenting in the studies by Afolabi et al [14] and Jeste et al [28] (P value not reported in either study). Chalil Madathil et al [18] found no significant effect of the consenting condition on the time taken to complete the task. Abujarad et al [13] and Warriner et al [47] asked participants about their perceived time to complete the task and found no statistically significant differences between the eConsent and paper-based ICF groups. Different electronic formats of eConsenting did not significantly affect consenting times in the study by Golembiewski et al [23] and Harle et al [24].

Site Workload

In total, 3 studies (3284 participants in total) assessed site workload. Hospital staff in the study by Chalil Madathil et al [18] reported lower subjective workload with eConsenting than with paper-based formats (P=.02), as assessed using the National Aeronautics and Space Administration Task Load Index. Site advisory group feedback in the study by Vanaken and Masand [44] included a beneficial reduction in the administrative burden and in the paper trail, although a potential for increased workload was also noted, for example, in relation to training and device management. In the study by Zeps et al [48], clinical trial researchers noted that eConsent devices could be clunky and prone to malfunction, which increased overall study time and burdened trial staff.

Stakeholder Views

Overall, 5 studies (3416 participants in total) assessed stakeholder views. Staff in the study by Chalil Madathil et al [18] preferred eConsenting formats over paper-based consenting (differences among systems, P<.005). In the study by Warriner et al [47], findings from a telephone survey of practice sites that administered both consent processes favored eConsent over paper-based ICFs, but the differences were not statistically significant. Health authority representatives were in favor of the broad implementation of eConsent in alignment with local regulations in the study by Vanaken and Masand [44]. However, approximately half (53%) of the surveyed research participants preferred having both a paper document and an eConsenting system [44]. Similarly, most centers (65%) in a survey by Cagnazzo et al [17] preferred using a paper-based ICF in parallel with eConsenting. Clinical research stakeholders surveyed by Cagnazzo et al [17] in late 2020 thought that, at a regulatory level, the use of eConsent might increase the time to study approval. In the survey of clinical trial researchers’ opinions on eConsent conducted by Zeps et al [48] in early 2019, a total of 68% of the respondents believed that ethics committees would not approve the use of eConsent or were unsure whether they would, 67% thought that the lack of standardized, consistent guidance across the sector was an important barrier to success, and 60% believed that the high initial cost might be a barrier to uptake.


Discussion

Principal Findings

Our systematic literature review aimed to assess the effectiveness of eConsent in terms of patient comprehension, acceptability, usability, study enrollment and retention rates, cycle time, and site workload, primarily in comparison with traditional paper-based consenting. We identified 37 primary publications for inclusion that together described 35 studies (13,281 participants in total). Our results showed that compared with patients who used paper-based consenting, patients who used eConsent had a better understanding of the trial information, showed greater engagement with content, and rated the consenting process as more acceptable and usable. Cycle times were increased with eConsent, potentially reflecting the greater patient engagement with the content. Data on enrollment, retention, and site workload effects were limited. Some general themes emerged in relation to the effectiveness of eConsent, its administrative aspects, and the variability in eConsenting formats used across studies. We have discussed these under the following subheadings.

Effectiveness

Comprehension, Acceptability, and Usability

Informed consent involves providing potential clinical trial participants with adequate information on what the study involves, including the risks and benefits of participation, to allow them to make a fully informed decision on whether to participate. Knowing that potential trial participants have understood the study information is thus of utmost importance. Our systematic review showed good evidence of improvements in comprehension with eConsent versus paper-based ICFs. Assessments of patients’ experiences with eConsenting need to distinguish between the content of the eConsent information and the workability of the digital platform. Our findings in terms of comprehension, acceptability, and usability were consistent, showing either overall benefits to patients of eConsenting versus paper-based ICFs or no significant overall differences. Patients reported higher satisfaction and enjoyment with the eConsent process than with paper-based consenting and found eConsenting both more useful and less difficult to use than paper versions. None of the studies reported significantly higher overall patient benefits with paper-based ICFs than with eConsent.

Studies were limited in terms of their exploration of why eConsent was more effective than paper-based consenting. Craik and Lockhart [56], in their “levels of processing” framework for memory research, suggest that learning and memory are improved when information is processed in depth. This deeper level of processing might be achieved in many ways. Research by Dellson et al [57] suggests that the use of good graphic design in consent materials, for example, using illustrations rather than text to explain treatment regimens, raises potential participants’ motivation to engage with the materials and facilitates their understanding of the clinical study. In the cognitive theory of multimedia learning, Mayer [58] proposes that people can learn more deeply with multisensory processing, when audio and visual information are presented simultaneously [59,60]. Further improvements in learning efficiency are obtained with user-focused active engagement [61]. Compared with text alone, the use of multimedia is also likely to increase attention arousal [62], which is typically associated with increased learning [63,64]. However, maximizing sustained attention needs to be balanced against the cognitive processing effort, which should not be increased beyond the cognitive capacity of the participant [58]. In their study of web-based lectures, Chen and Wu [65] found that visual information presented with a voice-over increased the sustained attention workload and negatively affected learning performance, compared with visual information presented with video and audio of the presenter. Future research might explore the role of cognitive capacity in eConsent comprehension and ways to mitigate cognitive demands, for example, via the use of self-pacing [66].

Effectiveness in Older Age Groups

Encouragingly, many of the studies that we examined included patient groups up to the age of 91 years. Although the studies did not examine age cohorts separately, the positive effectiveness findings also applied to patients in older adult age groups, indicating that age does not have a negative impact on the effectiveness of eConsent, although more data are needed for a comprehensive assessment. In a cardiovascular study that included 298 participants with a mean age of 63 (range 45-74) years, those randomized to eConsent, consisting of multimedia including video-, audio-, and computer-based finger-signed consent, had a better understanding of the study requirements than their counterparts randomized to traditional paper-based consenting [19,20]. Older adults have been found to integrate more of the audiovisual information in their environment when performing tasks and to benefit more from multisensory processing than younger adults do [67], thus supporting the use of eConsent in older age groups. The perceived lower technology literacy in some older cohorts can be mitigated with good solution design and effective training [68].

In addition to comprehension, the usability data showed that eConsenters had better engagement with study information than their paper-based ICF counterparts, and acceptability was similar for the 2 consenting formats [19,20]. Focus group discussions with individuals aged ≥65 years yielded frequently cited advantages of eConsenting, including its convenience and the usefulness of additional features such as definitions, graphics, and audio [27]. Although not evaluated here, it is likely that age per se has less of an impact than other patient characteristics, such as cognitive ability, dexterity, and technology literacy, on the usability and acceptability of digital solutions within clinical trials.

Enrollment

Overall, there was no consensus across publications as to whether a patient’s likelihood to enroll in a study is affected by whether the consenting process is electronic or on paper [15,18,28,35,41]. When questioned, the patients indicated that the format of the ICF made little difference to their decision-making regarding study participation [13,17,49]. However, eConsenting had the potential to increase patient enrollment by increasing accessibility when integrated into a web-based patient platform [38].

Retention

Potentially more relevant than enrollment effects is whether improved patient comprehension of the study and its requirements leads to enhanced trial retention. We identified a marked gap in the comparative research on the effect of eConsent on patient retention within clinical studies. The observed improvements in comprehension with eConsent could potentially be used as a surrogate for retention because we know that not fully understanding the study requirements beforehand is a key reason for early withdrawal from clinical trials [3].

Administration

Cycle Times

Most studies in this review that assessed the time it took for patients to undertake the consenting process with eConsent versus paper found that eConsenting took more time than paper consenting [19,20,27,30,37,39,42,45]. This finding is not unexpected. As eConsent is better able to hold patients’ attention than paper-based approaches [36], eConsenting patients are likely to engage more fully with the information provided, thus increasing cycle time. Explanations provided by the primary study authors for the increased time taken with eConsent included that this format enabled participants to engage more with the study information [19,20], that participants made use of the opportunities to view additional information available in the eConsent format [27], or that participants took time to listen to slide narration [39]. To mitigate the increase in cycle time, clinical researchers might consider providing remote eConsent access ahead of a study visit.

Site Workload

We identified only limited comparative information on site workload. None of the studies assessed workload across the entirety of a clinical trial. The workload advantages of the fully digitized consent process versus paper-based consenting may become visible only later in the clinical study timeline, when the administrative burden with paper-based consenting may increase owing to data quality issues.

One study assessed hospital staff’s subjective workload and found it to be reduced with eConsent versus paper [18]. Advisory group feedback included a reduction in administrative burden and paper trail, better version control, fewer issues around missing dates or signatures on forms, improved data quality, better participant oversight, and reduced number of site visits, potentially offset by an increased workload in relation to training and device management [44]. Site staff and health authority representatives tended to prefer eConsenting formats over paper-based consenting [18,44], although a preference for using both a paper document and an eConsenting system was also reported [17,44]. Technical difficulties with devices were noted as a potential burden for trial staff [48]. Sonne et al [42] described 1 in 5 participants with technical difficulties, including videos not loading or needing to be restarted, and internet connection issues, although that study was published in 2013 and is thus unlikely to reflect current setups.

Regulatory Aspects

Flawed informed consent processes are among the topmost regulatory and inspection findings for clinical trials [4-7]. Although we did not review this aspect specifically, we expect that eConsent would implicitly protect against most of the common reasons for such findings. eConsent solutions prevent forms from being lodged with incomplete information, missing signatures, or signing of incorrect versions, and they preclude retrospective signing, which cannot be detected or demonstrably proven with paper-based consenting. A fully digital consent process would allow for the evaluation of the success of the consent form content and system and their continuous improvement for patients. With paper-based ICFs, such evaluations would not be possible without including validated questionnaires in each study. Moreover, the ability to track the withdrawal of a consent at an individual or a sample level benefits patients by ensuring that their data or samples are not used for future research. The industry would benefit by being able to comply with patients’ wishes by not using data or samples outside of the study. Future work will need to evaluate and confirm these benefits.
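To make these checks concrete, the sketch below shows how submission-time validation might look in an eConsent system; the field names and roles are assumptions for illustration, not a specific vendor’s API.

```python
# Illustrative sketch of submission-time completeness checks that an
# eConsent system can enforce but paper-based consenting cannot.
def validate_consent(record: dict, current_icf_version: str,
                     authorized_roles: set[str]) -> list[str]:
    errors = []
    if not record.get("participant_signature"):
        errors.append("missing participant signature")
    if not record.get("signed_date"):
        errors.append("missing date")
    if record.get("icf_version") != current_icf_version:
        errors.append("incorrect ICF version signed")
    if record.get("obtained_by_role") not in authorized_roles:
        errors.append("consent obtained by unauthorized site staff")
    return errors  # an empty list means the record can be lodged

issues = validate_consent(
    {"participant_signature": "sig.png", "icf_version": "v2.0"},
    current_icf_version="v3.1",
    authorized_roles={"investigator", "delegated coordinator"},
)
# ['missing date', 'incorrect ICF version signed',
#  'consent obtained by unauthorized site staff']
print(issues)
```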

Variability

The types of eConsent used varied considerably across the included studies. Formats included straightforward digitization of paper documents, signature management systems, audio- and video-enhanced content, and fully interactive systems. Consistent with active multimedia engagement principles, comprehension was significantly better with highly interactive versions than with less-interactive eConsent versions [22], animated video–based information was found to be easier to understand than text-based videos [33], and even for text-based formats, the use of line spacing, bold font, and bullet points could improve comprehension [34]. This variability in format has implications for future work, which might explore the differences and the most effective formats further.

There was also variability in terms of how comprehension, acceptability, and usability were assessed. Among the studies that assessed the effectiveness of eConsenting compared with paper-based consenting, half of the 20 studies on comprehension had high methodological validity, but only 1 of the 8 studies on acceptability and 1 of the 5 studies on usability did so.

In its guidance on the use of electronic informed consent, the FDA notes the following:

[Electronic informed consent] may be used to provide information usually contained within the written informed consent document, evaluate the subject’s comprehension of the information presented, and document the consent of the subject or the subject’s legal authorized representative. Electronic processes to obtain informed consent may use an interactive interface, which may facilitate the subject’s ability to retain and comprehend the information [8].

In line with the FDA guidance, we suggest that a consenting format should be referred to as eConsent only if it can support patient engagement using multimedia components (eg, text, graphics, audio, and video) together with interactive functionalities to share information related to the study. If a digital consent solution does not have these capabilities, we suggest that it should be referred to as a digital consent form rather than a true eConsent solution.

Limitations

A limitation of our systematic literature review is that the search strategy, designed to keep the number of publications for screening manageable, will have missed some potentially relevant studies. Among eConsent studies, it is conceivable that there may be a reporting bias in favor of those finding comparative differences. Differences in the outcome measures used, including differences in their validity, made comparisons between studies challenging and pooling across studies unfeasible. Most studies included in this review did not provide a detailed description of the eConsent format, and such information should be included in future studies to allow researchers to assess and compare results across studies. These observations are a call to action to harmonize the analysis, documentation, and reporting of eConsent findings, as well as the parameters defining best practices for eConsent (including whether these are met by current eConsent vendors). The same applies to the terminology and processes used around eConsent. An ongoing initiative of the European Forum for Good Clinical Practice aims to achieve such standardization within clinical trials [69].

Conclusions

In conclusion, this systematic review showed overall patient benefits with eConsent versus paper-based consenting in terms of understanding, acceptability, and usability. No study reported significantly better overall patient benefits with paper-based ICFs than with eConsent. eConsenting can increase enrollment into clinical studies by improving access to research. Comparative data from site staff and other study researchers indicate the potential for reduced workload and lower administrative burden with eConsent. In addition to these benefits, there are various other advantages associated with the use of digital solutions, including preventing flawed consenting processes, ensuring data quality, and supporting study integrity. Importantly, there are several avenues for future research that we believe are necessary. These include, but are not limited to, research that explores the best methodologies to target specific measures of eConsent efficacy (eg, recruitment, retention, and site experience); research that explores cross-cultural and globalization elements of eConsent; and research that makes better use of the theoretical underpinnings for why eConsent methods are more efficacious.

Acknowledgments

Oxford PharmaGenesis, with financial support from AstraZeneca, provided support for the development of this manuscript for publication, including assistance with the systematic literature review and data extraction. AstraZeneca authors participated in the study design and in data collection and analysis.

Authors' Contributions

EC, BB, and MJ-K were involved in the conception and design of the study. Acquisition, analysis, and interpretation of data were carried out by EC, BB, AB, MJ-K, and AKM. Drafting and critically revising the article was carried out by EC, BB, AB, MJ-K, and AKM. All the authors provided final approval of the version to be published.

Conflicts of Interest

EC is an employee at AstraZeneca. BB is an employee at Signant Health. AB is a contractor at Oxford PharmaGenesis. MJ-K is an employee and shareholder at AstraZeneca. AKM has no conflicts of interest to declare.

Multimedia Appendix 1

PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) checklist.

DOCX File , 36 KB

Multimedia Appendix 2

Overview of the studies identified for inclusion.

DOCX File , 80 KB

  1. E6(R2) Good clinical practice: integrated addendum to ICH E6(R1). U.S. Food & Drug Administration. Mar 2018. URL: https://www.fda.gov/regulatory-information/search-fda-guidance-documents/e6r2-good-clinical-practice-integrated-addendum-ich-e6r1 [accessed 2022-06-23]
  2. Schumacher A, Sikov WM, Quesenberry MI, Safran H, Khurshid H, Mitchell KM, et al. Informed consent in oncology clinical trials: a Brown University Oncology Research Group prospective cross-sectional pilot study. PLoS One. Feb 24, 2017;12(2):e0172957. [FREE Full text] [CrossRef] [Medline]
  3. Retention in clinical trials: keeping patients on protocols. Advarra. Mar 23, 2021. URL: https://www.advarra.com/resource-library/retention-in-clinical-trials-keeping-patients-on-protocols/ [accessed 2022-06-23]
  4. Gogtay NJ, Doshi BM, Kannan S, Thatte U. A study of warning letters issued to clinical investigators and institutional review boards by the United States Food and Drug Administration. Indian J Med Ethics. Oct 1, 2011;8(4):211-214. [CrossRef]
  5. Rogers CA, Ahearn JD, Bartlett MG. Data integrity in the pharmaceutical industry: analysis of inspections and warning letters issued by the bioresearch monitoring program between fiscal years 2007-2018. Ther Innov Regul Sci. Sep 24, 2020;54(5):1123-1133. [FREE Full text] [CrossRef] [Medline]
  6. Bernabe RD, van Thiel GJ, Breekveldt NS, Gispen CC, van Delden JJ. Ethics and the marketing authorization of pharmaceuticals: what happens to ethical issues discovered post-trial and pre-marketing authorization? BMC Med Ethics. Oct 27, 2020;21(1):103. [FREE Full text] [CrossRef] [Medline]
  7. Takaoka A, Zytaruk N, Davis M, Matte A, Johnstone J, Lauzier F, et al. PROSPECT Investigators and the Canadian Critical Care Trials Group. Monitoring and auditing protocol adherence, data integrity and ethical conduct of a randomized clinical trial: a case study. J Crit Care. Oct 2022;71:154094. [FREE Full text] [CrossRef] [Medline]
  8. Use of electronic informed consent: questions and answers. U.S. Food and Drug Administration. Dec 2016. URL: https://www.fda.gov/media/116850/download [accessed 2022-06-23]
  9. Electronic informed consent implementation guide: practical considerations. European CRO Federation. Mar 2021. URL: https://www.eucrof.eu/images/Electronic_Informed_Consent_Implementation_Guide_Practical_Considerations_Version_1.0___March_2021_2.pdf [accessed 2022-06-23]
  10. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. Mar 29, 2021;372:n71. [FREE Full text] [CrossRef] [Medline]
  11. How to compare Ovid MEDLINE and PubMed. Wolters Kluwer. Nov 2019. URL: https://tools.ovid.com/ovidtools/pdf/Ovid_MEDLINE_and_PubMed_compared.pdf [accessed 2023-06-18]
  12. Ovid tools and resources portal. Wolters Kluwer. URL: https://tools.ovid.com/ovidtools/medline.html [accessed 2023-06-18]
  13. Abujarad F, Peduzzi P, Mun S, Carlson K, Edwards C, Dziura J, et al. Comparing a multimedia digital informed consent tool with traditional paper-based methods: randomized controlled trial. JMIR Form Res. Oct 19, 2021;5(10):e20458. [FREE Full text] [CrossRef] [Medline]
  14. Afolabi MO, McGrath N, D'Alessandro U, Kampmann B, Imoukhuede EB, Ravinetto RM, et al. A multimedia consent tool for research participants in the Gambia: a randomized controlled trial. Bull World Health Organ. May 01, 2015;93(5):320-38A. [FREE Full text] [CrossRef] [Medline]
  15. Bickmore TW, Pfeifer LM, Paasche-Orlow MK. Using computer agents to explain medical documents to patients with low health literacy. Patient Educ Couns. Jun 2009;75(3):315-320. [FREE Full text] [CrossRef] [Medline]
  16. Buckley MT, Lengfellner JM, Koch MJ, Search B, Hoidra C, Lin M, et al. MSK eConsent: digitalizing the informed consent process to improve participant engagement and understanding. J Clin Oncol. May 20, 2020;38(15_suppl):2066. [CrossRef]
  17. Cagnazzo C, Nanni O, Di Costanzo A, Cenna R, Marchetti F, La Verde N, et al. 1856P Electronic informed consent: the need to redesign the consent process for the digital era. Ann Oncol. Sep 2021;32(11-12):S1249. [CrossRef]
  18. Chalil Madathil K, Koikkara R, Obeid J, Greenstein JS, Sanderson IC, Fryar K, et al. An investigation of the efficacy of electronic consenting interfaces of research permissions management system in a hospital setting. Int J Med Inform. Sep 2013;82(9):854-863. [FREE Full text] [CrossRef] [Medline]
  19. Chapman N, Mcwhirter R, Armstrong M, Fonseca R, Campbell J, Nelson M, et al. Multimedia for delivering participant informed consent in cardiovascular trials. J Hypertens. Apr 2021;39:e217-e218. [CrossRef]
  20. Chapman N, McWhirter R, Armstrong MK, Fonseca R, Campbell JA, Nelson M, et al. Self-directed multimedia process for delivering participant informed consent. BMJ Open. Jul 26, 2020;10(7):e036977. [FREE Full text] [CrossRef] [Medline]
  21. Fanaroff AC, Li S, Webb LE, Miller V, Navar AM, Peterson ED, et al. An observational study of the association of video- versus text-based informed consent with multicenter trial enrollment: lessons from the PALM Study (Patient and Provider Assessment of Lipid Management). Circ Cardiovasc Qual Outcomes. Apr 2018;11(4):e004675. [FREE Full text] [CrossRef] [Medline]
  22. Geier C, Adams RB, Mitchell KM, Holtz BE. Informed consent for online research-is anybody reading?: assessing comprehension and individual differences in readings of digital consent forms. J Empir Res Hum Res Ethics. Jul 2021;16(3):154-164. [CrossRef] [Medline]
  23. Golembiewski EH, Mainous AG, Rahmanian KP, Brumback B, Rooks BJ, Krieger JL, et al. An electronic tool to support patient-centered broad consent: a multi-arm randomized clinical trial in family medicine. Ann Fam Med. 2021;19(1):16-23. [FREE Full text] [CrossRef] [Medline]
  24. Harle CA, Golembiewski EH, Rahmanian KP, Brumback B, Krieger JL, Goodman KW, et al. Does an interactive trust-enhanced electronic consent improve patient experiences when asked to share their health records for research? a randomized trial. J Am Med Inform Assoc. Jul 01, 2019;26(7):620-629. [FREE Full text] [CrossRef] [Medline]
  25. Harmell AL, Palmer BW, Jeste DV. Preliminary study of a web-based tool for enhancing the informed consent process in schizophrenia research. Schizophr Res. Nov 2012;141(2-3):247-250. [FREE Full text] [CrossRef] [Medline]
  26. Haussen DC, Craft L, Doppelheuer S, Rodrigues GM, Al-Bayati AR, Ravindran K, et al. Legal authorized representative experience with smartphone-based electronic informed consent in an acute stroke trial. J Neurointerv Surg. May 2020;12(5):483-485. [CrossRef] [Medline]
  27. Jayasinghe N, Moallem BI, Kakoullis M, Ojie MJ, Sar-Graycar L, Wyka K, et al. Establishing the feasibility of a tablet-based consent process with older adults: a mixed-methods study. Gerontologist. Jan 09, 2019;59(1):124-134. [FREE Full text] [CrossRef] [Medline]
  28. Jeste DV, Palmer BW, Golshan S, Eyler LT, Dunn LB, Meeks T, et al. Multimedia consent for research in people with schizophrenia and normal subjects: a randomized controlled trial. Schizophr Bull. Jul 2009;35(4):719-729. [FREE Full text] [CrossRef] [Medline]
  29. Jimison HB, Sher PP, Appleyard R, LeVernois Y. The use of multimedia in the informed consent process. J Am Med Inform Assoc. 1998;5(3):245-256. [FREE Full text] [CrossRef] [Medline]
  30. McCarty CA, Berg R, Waudby C, Foth W, Kitchner T, Cross D. Long-term recall of elements of informed consent: a pilot study comparing traditional and computer-based consenting. IRB. 2015;37(1):1-5. [FREE Full text] [Medline]
  31. McGowan CR, Houlihan CF, Kingori P, Glynn JR. The acceptability of online consent in a self-test serosurvey of responders to the 2014-2016 West African Ebola outbreak. Public Health Ethics. Jul 2018;11(2):201-212. [FREE Full text] [CrossRef] [Medline]
  32. McGraw SA, Wood-Nutter CA, Solomon MZ, Maschke KJ, Bensen JT, Irwin DE. Clarity and appeal of a multimedia informed consent tool for biobanking. IRB. 2012;34(1):9-19. [Medline]
  33. Naeim A, Dry S, Elashoff D, Xie Z, Petruse A, Magyar C, et al. Electronic video consent to power precision health research: a pilot cohort study. JMIR Form Res. Sep 08, 2021;5(9):e29123. [FREE Full text] [CrossRef] [Medline]
  34. Perrault EK, Keating DM. Seeking ways to inform the uninformed: improving the informed consent process in online social science research. J Empir Res Hum Res Ethics. Feb 2018;13(1):50-60. [CrossRef] [Medline]
  35. Rothwell E, Johnson E, Wong B, Goldenberg A, Tarini BA, Riches N, et al. Comparison of video, app, and standard consent processes on decision-making for Biospecimen research: a randomized controlled trial. J Empir Res Hum Res Ethics. Oct 2020;15(4):252-260. [FREE Full text] [CrossRef] [Medline]
  36. Rothwell E, Wong B, Rose NC, Anderson R, Fedor B, Stark LA, et al. A randomized controlled trial of an electronic informed consent process. J Empir Res Hum Res Ethics. Dec 2014;9(5):1-7. [FREE Full text] [CrossRef] [Medline]
  37. Rowbotham MC, Astin J, Greene K, Cummings SR. Interactive informed consent: randomized comparison with paper consents. PLoS One. 2013;8(3):e58603. [FREE Full text] [CrossRef] [Medline]
  38. Siegel EM, Hawkins KP, Hildreth L, Grose T, Stringfellow D, Bloomer A, et al. Process improvement in online consenting for the Moffitt Cancer Center Total Cancer Care biobanking protocol. In: Proceedings of the AACR Special Conference on Modernizing Population Sciences in the Digital Age. Presented at: AACR Special Conference on Modernizing Population Sciences in the Digital Age; February 19-22, 2019; San Diego, CA. [CrossRef]
  39. Simon CM, Klein DW, Schartz HA. Interactive multimedia consent for biobanking: a randomized trial. Genet Med. Jan 2016;18(1):57-64. [FREE Full text] [CrossRef] [Medline]
  40. Simon CM, Schartz HA, Rosenthal GE, Eisenstein EL, Klein DW. Perspectives on electronic informed consent from patients underrepresented in research in the United States: a focus group study. J Empir Res Hum Res Ethics. Oct 2018;13(4):338-348. [CrossRef] [Medline]
  41. Simon CM, Wang K, Shinkunas LA, Stein DT, Meissner P, Smith M, et al. Communicating with diverse patients about participating in a biobank: a randomized multisite study comparing electronic and face-to-face informed consent processes. J Empir Res Hum Res Ethics. 2022;17(1-2):144-166. [FREE Full text] [CrossRef] [Medline]
  42. Sonne SC, Andrews JO, Gentilin SM, Oppenheimer S, Obeid J, Brady K, et al. Development and pilot testing of a video-assisted informed consent process. Contemp Clin Trials. Sep 2013;36(1):25-31. [FREE Full text] [CrossRef] [Medline]
  43. Tait AR, Voepel-Lewis T, McGonegal M, Levine R. Evaluation of a prototype interactive consent program for pediatric clinical trials: a pilot study. J Am Med Inform Assoc. Jun 2012;19(e1):e43-e45. [FREE Full text] [CrossRef] [Medline]
  44. Vanaken HI, Masand SN. Awareness and collaboration across stakeholder groups important for eConsent achieving value-driven adoption. Ther Innov Regul Sci. Nov 2019;53(6):724-735. [CrossRef] [Medline]
  45. Varnhagen CK, Gushta M, Daniels J, Peters TC, Parmar N, Law D, et al. How informed is online informed consent? Ethics Behav. 2005;15(1):37-48. [CrossRef] [Medline]
  46. Vercauteren S, Virani A, Longstaff H, Robillard J, Portales‐Casamar E, Lutynski A, et al. Electronic consent for pediatric biobanking: do kids and parents understand what they consent to? Biopreserv Biobank. Jun 12, 2020;18(3):O-07. [FREE Full text] [CrossRef]
  47. Warriner AH, Foster PJ, Mudano A, Wright NC, Melton ME, Sattui SE, et al. A pragmatic randomized trial comparing tablet computer informed consent to traditional paper-based methods for an osteoporosis study. Contemp Clin Trials Commun. Aug 15, 2016;3:32-38. [FREE Full text] [CrossRef] [Medline]
  48. Zeps N, Northcott N, Weekes L. Opportunities for eConsent to enhance consumer engagement in clinical trials. Med J Aust. Sep 2020;213(6):260-2.e1. [FREE Full text] [CrossRef] [Medline]
  49. Knapp P, Mandall N, Hulse W, Roche J, Moe-Byrne T, Martin-Kerry J, et al. (for the TRECA study group). Evaluating the use of multimedia information when recruiting adolescents to orthodontics research: a randomised controlled trial. J Orthod. Dec 06, 2021;48(4):343-351. [FREE Full text] [CrossRef] [Medline]
  50. Joffe S, Cook EF, Cleary PD, Clark JW, Weeks JC. Quality of informed consent: a new measure of understanding among research subjects. J Natl Cancer Inst. Jan 17, 2001;93(2):139-147. [CrossRef] [Medline]
  51. Afolabi MO, Bojang K, D'Alessandro U, Ota MO, Imoukhuede EB, Ravinetto R, et al. Digitised audio questionnaire for assessment of informed consent comprehension in a low-literacy African research population: development and psychometric evaluation. BMJ Open. Jun 24, 2014;4(6):e004817. [FREE Full text] [CrossRef] [Medline]
  52. Sugarman J, Lavori PW, Boeger M, Cain C, Edson R, Morrison V, et al. Evaluating the quality of informed consent. Clin Trials. Sep 03, 2005;2(1):34-41. [CrossRef] [Medline]
  53. Jeste DV, Palmer BW, Appelbaum PS, Golshan S, Glorioso D, Dunn LB, et al. A new brief instrument for assessing decisional capacity for clinical research. Arch Gen Psychiatry. Aug 01, 2007;64(8):966-974. [CrossRef] [Medline]
  54. Appelbaum PS, Grisso T. MacArthur Competence Assessment Tool for Clinical Research (MacCAT-CR). Sarasota, FL: Professional Resource Press; 2001.
  55. Lewis JR. IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int J Hum Comput Interact. Jan 1995;7(1):57-78. [CrossRef]
  56. Craik FI, Lockhart RS. Levels of processing: a framework for memory research. J Verbal Learning Verbal Behav. Dec 1972;11(6):671-684. [CrossRef]
  57. Dellson P, Nilbert M, Carlsson C. Patient representatives' views on patient information in clinical cancer trials. BMC Health Serv Res. Feb 01, 2016;16(1):36. [FREE Full text] [CrossRef] [Medline]
  58. Mayer RE. Cognitive theory of multimedia learning. In: Mayer RE, editor. The Cambridge Handbook of Multimedia Learning. Cambridge: Cambridge University Press; 2005.
  59. Seitz AR, Kim R, Shams L. Sound facilitates visual learning. Curr Biol. Jul 25, 2006;16(14):1422-1427. [FREE Full text] [CrossRef] [Medline]
  60. Shams L, Seitz AR. Benefits of multisensory learning. Trends Cogn Sci. Nov 2008;12(11):411-417. [CrossRef] [Medline]
  61. Michel N, Cater III JJ, Varela O. Active versus passive teaching styles: an empirical study of student learning outcomes. Hum Resour Dev Q. Sep 2009;20(4):397-418. [CrossRef]
  62. Andres HP. Multimedia, information complexity, and cognitive processing. Inf Resour Manag J. 2004;17(1):63-78. [CrossRef]
  63. Chen CM, Wang JY. Effects of online synchronous instruction with an attention monitoring and alarm mechanism on sustained attention and learning performance. Interact Learn Environ. Jun 30, 2017;26(4):427-443. [CrossRef]
  64. Steinmayr R, Ziegler M, Träuble B. Do intelligence and sustained attention interact in predicting academic achievement? Learn Individ Differ. Feb 2010;20(1):14-18. [CrossRef]
  65. Chen CM, Wu CH. Effects of different video lecture types on sustained attention, emotion, cognitive load, and learning performance. Comput Educ. Jan 2015;80:108-121. [CrossRef]
  66. Merkt M, Weigand S, Heier A, Schwan S. Learning with videos vs. learning with print: the role of interactive features. Learn Instr. Dec 2011;21(6):687-704. [CrossRef]
  67. de Dieuleveult AL, Siemonsma PC, van Erp JB, Brouwer AM. Effects of aging in multisensory integration: a systematic review. Front Aging Neurosci. Mar 28, 2017;9:80. [FREE Full text] [CrossRef] [Medline]
  68. Garner K, Byrom B. Attitudes of older people/seniors to completion of electronic patient-reported outcome measures and use of mobile applications in clinical trials: results of a qualitative research study. J Comp Eff Res. Mar 2020;9(4):307-315. [FREE Full text] [CrossRef] [Medline]
  69. eConsent initiative. European Forum for Good Clinical Practice. URL: https://efgcp.eu/project?initiative=eConsent [accessed 2023-03-23]


eConsent: electronic consent
FDA: Food and Drug Administration
ICF: informed consent form
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses


Edited by T de Azevedo Cardoso; submitted 15.11.22; peer-reviewed by R Marshall, H Vanaken; comments to author 27.02.23; revised version received 24.04.23; accepted 27.06.23; published 01.09.23.

Copyright

©Edwin Cohen, Bill Byrom, Anja Becher, Magnus Jörntén-Karlsson, Andrew K Mackenzie. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 01.09.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.