Published in Vol 24, No 8 (2022): August

This is a member publication of University College London (Jisc)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/40015.
Testing the Effectiveness of an Animated Decision Aid to Improve Recruitment of Control Participants in a Case-Control Study: Web-Based Experiment


Original Paper

1Department of Behavioural Science and Health, University College London, London, United Kingdom

2Institute of Pharmaceutical Medicine, University of Basel, Basel, Switzerland

3Wolfson Institute of Population Health, Queen Mary University of London, London, United Kingdom

4School of Health Sciences, University of Surrey, Guildford, United Kingdom

5Department of Surgery and Cancer, Imperial College London, London, United Kingdom

Corresponding Author:

Yasemin Hirst, MSc, PhD

Department of Behavioural Science and Health

University College London

Gower Street

London, WC1E 6BT

United Kingdom

Phone: 44 2076791615

Email: y.hirst@ucl.ac.uk


Background: Participation in case-control studies is crucial in epidemiological research. Self-sampling bias, low response rates, and poor recruitment of population-representative controls are often reported as limitations of case-control studies, with limited strategies available to improve participation. With the greater use of web-based methods in health research, there is a further need to understand the effectiveness of different tools to enhance informed decision-making and willingness to take part in research.

Objective: This study tests whether the inclusion of an animated decision aid in the recruitment page of a study website can increase participants’ intentions to volunteer as controls.

Methods: A total of 1425 women were included in a web-based experiment and randomized to one of two experimental conditions: exposure to a simulated website that included the animation (animation; n=693, 48.6%) or exposure to the simulated website without the animation (control; n=732, 51.4%). The simulated website was adapted from a real website for a case-control study, which invites people to consider taking part in a study investigating differences in purchasing behaviors between women with and without ovarian cancer and to share with the researchers their loyalty card data collected through 2 high street retailers. After exposure to the experimental manipulation, participants were asked to state (1) their intention to take part in the case-control study, (2) whether they would be willing to share their loyalty card data for research, and (3) their willingness to be redirected to the real website after completing the survey. Data were assessed using ordinal and binary logistic regression and are reported as percentages (%), adjusted odds ratios (AORs), and 95% confidence intervals.

Results: Including the animation in the simulated website did not increase intentions to participate in the study (AOR 1.09; 95% CI 0.88-1.35) or willingness to visit the real study website after the survey (control 50.5% vs animation 52.6%, AOR 1.08; 95% CI 0.85-1.37). The animation, however, increased the participants’ intentions to share the data from their loyalty cards for research in general (control 17.9% vs animation 26%; AOR 1.64; 95% CI 1.23-2.18).

Conclusions: While the results of this study indicate that the animated decision aid did not lead to greater intention to take part in our web-based case-control study, they show that such aids can be effective in increasing people’s willingness to share sensitive data for health research.

J Med Internet Res 2022;24(8):e40015

doi:10.2196/40015


Introduction

One of the most effective methods to test for exposure in epidemiological research is to conduct a case-control study, in which people who have an illness are compared retrospectively with a matched population without the outcome [1]. Although it is a reliable methodology for testing associations, poor recruitment of population-representative controls often undermines such studies [2].

Previous research reports that the most common methods of recruiting control participants for case-control studies are door-to-door recruitment, postal invitation, and random digit dialing [1,2]. More recently, with greater access to the internet, many cohort studies have moved their participant management and recruitment online (using unique websites), providing new opportunities to recruit participants and potentially improving diversity and ease of data collection [3,4]. Despite several advantages, however, caution needs to be exercised with the opportunistic recruitment of participants to web-based studies. For example, a recent study reported that less than 4% of participants who visited a study recruitment website after clicking on a targeted social media advertisement went on to sign up for the research [5]. This indicates that while individuals may form some interest in taking part in research studies by clicking on a recruitment advertisement, this interest does not always translate into survey completion after they land on the research website.

Evidence on the barriers to and facilitators of web-based survey completion primarily relates to the completion of stand-alone web-based surveys, rather than the use of unique websites to recruit participants to case-control studies [6-11]. These studies suggest that individuals’ trust in the organization carrying out the research, whether they are early adopters of technology with high literacy, and whether the research is in line with their values and beliefs are positive predictors of participation. Recommendations to achieve better outcomes include clear communication of the research goals, transparency about how data will be used, and shorter survey length.

Clinical trials have attempted to address some of the above (eg, transparency about how data will be used) by using audio-visual decision aids to supplement the process of obtaining informed consent [12]. Communicating information via these media (enabled through web-based recruitment strategies) has the potential to reduce the associated cognitive load, facilitate further engagement with the research aims, generate positive attitudes toward the targeted behavior, and subsequently motivate engagement in the behavior itself [13-16]. To our knowledge, the potential impact of animated decision aids on intentions to take part in a web-based case-control study has not previously been investigated. This study aims to measure the effectiveness of an animated decision aid, as a supplementary tool on a simulated website of a case-control study, in encouraging participation.


Methods

Setting

This study comprised a randomized web-based experiment that assessed the effectiveness of adding an animated decision aid to a simulated website. The simulated website to which the animation was added was based on the recruitment website for an ongoing case-control study, the Cancer Loyalty Card Study (CLOCS) [17].

Cancer Loyalty Card Study

CLOCS is an observational case-control study that aims to investigate the self-care behaviors of patients with ovarian cancer prior to their cancer diagnosis. It seeks to do this by investigating differences in transactional data (such as medication purchasing) between women with and without ovarian cancer (the transactional data are collected through the loyalty cards of 2 UK-based high street retailers). Cases (ie, women with ovarian cancer) are recruited through participating National Health Service sites, while controls are recruited through the study website. Full details for CLOCS have previously been reported in the study protocol [17].

Animated Decision Aid

A key challenge for CLOCS recruitment has been communicating the research aims clearly, and our previous research highlights that the public often needs further explanation of how individual transactional data can be used in health research [18]. To improve public understanding of and engagement with the aims of CLOCS, an animated decision aid was jointly prepared by Science Animated Limited, 2 patient representatives, and the CLOCS research team prior to the initiation of this web-based experiment. It aimed to convey key facts about ovarian cancer, the potential contribution of CLOCS to informing earlier diagnosis, and how women in the general population can play a role in aiding its efforts (based on the participant information sheet tailored and approved by a National Health Service Research Ethics Committee [19/NW/0427-SA1], included in Multimedia Appendix 1). However, it should be noted that the animation was designed as a supplement to the main study materials, not as a key participant communication material to prompt informed decision-making. The animation features English subtitles and is 123 seconds (2 minutes and 3 seconds) long [19].

Procedure

The randomized web-based experiment was programmed in SurveyMonkey. In July 2020, women who were eligible to take part in CLOCS as controls (ie, between the ages of 18 and 70 years, living in the United Kingdom, and without an ovarian cancer diagnosis) were recruited through a survey vendor (Dynata Limited). Those who were interested were presented with information about the experiment, including a brief description of CLOCS, followed by the study consent form. If participants consented and were eligible, they were randomized (in a 1:1 ratio) to one of the following two experimental conditions: the simulated CLOCS website without the animation (control) or the simulated CLOCS website with the animation (animation) (Multimedia Appendices 2 and 3).

After viewing the simulated website, participants completed the survey, in which they indicated their intention to take part in CLOCS using a 4-point Likert scale (definitely yes, probably yes, probably not, and definitely not) adapted from previous research [20-22]. The study participants were then asked to indicate their loyalty card use by selecting from a list of high street retailers’ loyalty cards and whether they would be willing to share their loyalty card data for research purposes. The latter question was adapted from research on willingness to share electronic health data, which uses 4 commonly used models of consent for the use of data [23].

In the next step, the study participants were asked about their educational level, annual household income, and health literacy. For the latter, the eHealth Literacy Scale was used; this scale assesses individuals’ retrieval and judgement of health information on the internet [24]. It consists of 8 items rated on a 5-point scale, ranging from 1 to 5, and has demonstrated considerable reliability and validity [24]. Each participant’s scores across the 8 items were summed, and a sample mean was calculated. Individual scores below and above (or equal to) this sample mean were defined as low and high health literacy, respectively [25,26].
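The mean-split classification described above can be sketched as follows. This is a minimal illustration with hypothetical item scores and our own function name, not code or data from the study:

```python
# Classify participants as low/high health literacy by a sample-mean split,
# as described for the 8-item eHealth Literacy Scale (totals range 8-40).

def classify_health_literacy(item_scores_per_participant):
    """item_scores_per_participant: list of lists, 8 items rated 1-5 each."""
    totals = [sum(items) for items in item_scores_per_participant]
    sample_mean = sum(totals) / len(totals)
    # Scores below the sample mean -> "low"; at or above -> "high".
    return ["low" if total < sample_mean else "high" for total in totals]

# Three hypothetical participants (totals 32, 20, and 38; sample mean 30.0):
labels = classify_health_literacy([
    [4, 4, 4, 4, 4, 4, 4, 4],
    [2, 3, 2, 3, 2, 3, 2, 3],
    [5, 5, 5, 5, 5, 5, 4, 4],
])
print(labels)  # -> ['high', 'low', 'high']
```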

The final survey item was included as a behavior proxy: participants were asked whether they wanted to be redirected to the actual CLOCS website for more information on how to take part. Those who responded that they would like to visit the website were provided with a link to the CLOCS website on the final page of the survey [22]. The website opened in a new tab for participants who clicked on the link. No data were collected from the study participants on their direct participation in CLOCS associated with the experiment.

Ethics Approval

Ethics approval for this study was granted by the UCL Research Ethics Committee (Project ID: 17823/001).

Data Analysis

A pilot study was conducted beforehand for the purpose of sample size calculation. Based on the findings from the initial sample of 359 participants, and assuming a 10% difference in intention to take part (definitely yes or probably yes vs definitely not or probably not), we determined that 650 participants per trial arm were needed to achieve 95% confidence and 80% power. Data from participants in both the pilot and final samples were combined for analysis.
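For readers who want to reproduce this kind of calculation, a standard normal-approximation sample size formula for comparing two independent proportions can be sketched as below. The paper does not state the exact input proportions behind the 650-per-arm figure, so the example inputs (a 10-percentage-point difference around 50%) are illustrative only:

```python
import math
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    """Normal-approximation sample size per arm for a two-sided test
    comparing two independent proportions p1 and p2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # e.g. 1.96 for alpha=.05
    z_b = NormalDist().inv_cdf(power)          # e.g. 0.84 for 80% power
    var = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_a + z_b) ** 2 * var / (p1 - p2) ** 2)

# Illustrative 10-percentage-point difference around 50%:
print(n_per_arm(0.45, 0.55))  # -> 389 per arm
```

Larger required samples, such as the 650 per arm reported here, follow from smaller assumed differences or different baseline proportions.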

Sample characteristics were assessed using descriptive statistics (Table S1 in Multimedia Appendix 4). To aid interpretation, the participants’ income and educational levels were dichotomized in inferential analyses. For income, we used £30,000 (US $37,000) as the cut-off point, based on the average household income in the United Kingdom (reported by the Office for National Statistics, 2020) [27]. For education, we categorized participants into those with General Certificate of Secondary Education or A Levels and those with a university degree [20-22].

Differences in intentions to take part in CLOCS were assessed using univariate and multivariate ordinal logistic regression. Between-group differences in willingness to visit the actual website and willingness to share loyalty card data for research purposes were assessed using univariate and multivariate binary logistic regressions. Odds ratios (ORs), 95% confidence intervals, and P values are presented in the results, with P values below .05 regarded as statistically significant. Participants who completed the survey unusually quickly (ie, survey speeders, defined as taking less than 50% of the median completion time, based on previous research) were excluded from the analysis (Multimedia Appendices 5-8) [28]. We report the analysis for the whole sample in Table S2 (Multimedia Appendix 9) and the distribution of the time spent on the survey before and after the exclusion of the speeders in Multimedia Appendices 5-8 [29]. Additional analyses of the interaction between the intervention and health literacy yielded null results and are not reported, owing to the unbalanced proportions of participants with low and high health literacy in the study population.
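The speeder-exclusion rule and the univariate odds ratio for a binary predictor can be illustrated with the following sketch on simulated data. The variable names and simulated distributions are our assumptions, not the study's dataset; for a single binary predictor, the logistic regression odds ratio equals the 2x2 cross-product ratio used here:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
seconds = rng.gamma(4, 60, n)        # hypothetical survey completion times
condition = rng.integers(0, 2, n)    # 0 = control, 1 = animation
outcome = rng.integers(0, 2, n)      # 1 = willing to share data

# Exclude "speeders": completion time below 50% of the median.
keep = seconds >= 0.5 * np.median(seconds)
cond, out = condition[keep], outcome[keep]

# 2x2 table cells for condition vs outcome.
a = np.sum((cond == 1) & (out == 1))  # animation, willing
b = np.sum((cond == 1) & (out == 0))
c = np.sum((cond == 0) & (out == 1))
d = np.sum((cond == 0) & (out == 0))

# Odds ratio and 95% CI on the log-odds scale.
or_ = (a * d) / (b * c)
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
print(round(float(or_), 2), np.round(ci, 2))
```

In practice the adjusted models would add age, education, income, loyalty card ownership, and health literacy as covariates in a multivariate logistic regression.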


Results

Study Sample

Figure 1 shows the flow of participants through the study. In total, 6034 women were invited to participate, of whom 4609 (76.4%) were excluded because they were not eligible, dropped out, or discontinued the initial screening. The remaining 1425 (23.6%) were randomized to one of the following two experimental conditions: 732 (51.4%) were allocated to the control condition and 693 (48.6%) to the animation condition. Across conditions, 131 (9.2%) did not finish the survey after randomization, and a further 137 (9.6%) were excluded because they completed the survey in less than 50% of the median time. The analytical sample consisted of 1157 women: 610 (52.7%) in the control condition and 547 (47.3%) in the animation condition.

Most women in the analytical sample were aged between 55 and 70 years (n=417, 36.9%), did not have a university degree (n=686, 59.3%), and had an annual household income of less than £30,000 (US $37,000; n=622, 53.8%). The mean eHealth Literacy Scale score for the participants was 30.3 out of 40; thus, those who scored below this mean were classified as having low health literacy [20,21]. Post hoc comparisons revealed that sociodemographic characteristics were comparable between the two experimental conditions (Table S1 in Multimedia Appendix 4).

Figure 1. Flow through the study.

Intentions to Participate in CLOCS

Intentions to participate in CLOCS were generally very high, with 69.7% (n=807) of women stating that they would probably or definitely participate. Figure 2 shows the distribution of the intentions to participate in CLOCS after exposure to the simulated website. The ordered logistic regressions in Table 1 show that the inclusion of the animation did not affect participation intentions (OR 1.12; 95% CI 0.90-1.39 and AOR 1.09; 95% CI 0.88-1.35). The regressions further show that women aged 55-70 years stated lower intentions to participate (AOR 0.59; 95% CI 0.45-0.79), while those with an above-average income (AOR 1.26; 95% CI 1.01-1.58), one or more existing loyalty cards (AOR 2.05; 95% CI 1.16-3.62), and low health literacy (AOR 1.38; 95% CI 1.03-1.85) had higher intentions to take part in CLOCS.

Figure 2. Distribution for intention to take part in Cancer Loyalty Card Study.
Table 1. Ordered logistic regression on intention to participate in Cancer Loyalty Card Study (N=1157).

| Variables | OR (95% CI) | P value | AOR^a (95% CI) | P value |
| --- | --- | --- | --- | --- |
| Condition: control | Reference^b |  | Reference |  |
| Condition: animation | 1.120 (0.904-1.389) | .30 | 1.088 (0.876-1.352) | .45 |
| Age 18-34 years | Reference |  | Reference |  |
| Age 35-44 years | 0.998 (0.726-1.371) | .99 | 0.952 (0.691-1.311) | .76 |
| Age 45-54 years | 0.806 (0.577-1.127) | .21 | 0.799 (0.569-1.122) | .20 |
| Age 55-70 years | 0.556 (0.421-0.735) | <.001 | 0.591 (0.445-0.786) | <.001 |
| Education: below or equal to GCSE^c | Reference |  | Reference |  |
| Education: university degree | 1.320 (1.060-1.643) | .01 | 1.145 (0.909-1.441) | .25 |
| Income: below average | Reference |  | Reference |  |
| Income: above average | 1.362 (1.097-1.690) | .01 | 1.263 (1.006-1.584) | .04 |
| Loyalty card: no | Reference |  | Reference |  |
| Loyalty card: yes | 2.114 (1.202-3.719) | .009 | 2.051 (1.164-3.616) | .01 |
| Health literacy: high | Reference |  | Reference |  |
| Health literacy: low | 1.501 (1.125-2.002) | .006 | 1.381 (1.029-1.854) | .03 |

^aOR: odds ratio; AOR: adjusted odds ratio.

^bNot applicable.

^cGCSE: General Certificate of Secondary Education.

Willingness to Share Loyalty Card Data for Research

Most study participants stated that their data should be used but that they should have the option of saying no (control condition: n=246, 40.3%; animation condition: n=181, 33.1%; Figure 3). However, significantly more participants in the animation condition than in the control condition indicated that they would be willing to provide the data if they were needed, as shown in the binary logistic regressions in Table 2 (control: n=109, 17.9% vs animation: n=142, 26.0%; OR 1.61; 95% CI 1.22-2.14 and AOR 1.64; 95% CI 1.23-2.18). As with the intentions to participate in CLOCS, women aged 55-70 years were less likely than those aged 18-34 years to state that their data should be used if needed (n=71, 17.0% vs n=72, 24.8%; AOR 0.60; 95% CI 0.41-0.88).

Figure 3. Distribution for willingness to share loyalty card data.
Table 2. Binary logistic regression on agreeing to share data from loyalty cards when needed (N=1157).

| Variables | Total, n (%) | OR (95% CI) | P value | AOR^a (95% CI) | P value |
| --- | --- | --- | --- | --- | --- |
| Overall | 251 (21.7)^b |  |  |  |  |
| Condition: control | 109 (17.9) | Reference |  | Reference |  |
| Condition: animation | 142 (26.0) | 1.612 (1.216-2.136) | .001 | 1.640 (1.232-2.184) | .001 |
| Age 18-34 years | 72 (24.8) | Reference |  | Reference |  |
| Age 35-44 years | 55 (22.9) | 0.900 (0.602-1.346) | .61 | 0.864 (0.575-1.299) | .48 |
| Age 45-54 years | 53 (25.2) | 1.022 (0.678-1.540) | .92 | 1.046 (0.688-1.591) | .83 |
| Age 55-70 years | 71 (17.0) | 0.621 (0.430-0.899) | .01 | 0.601 (0.411-0.878) | .008 |
| Education: below or equal to GCSE^c | 147 (21.4) | Reference |  | Reference |  |
| Education: university degree | 104 (22.1) | 1.039 (0.782-1.380) | .79 | 0.932 (0.690-1.260) | .65 |
| Income: below average | 130 (20.9) | Reference |  | Reference |  |
| Income: above average | 121 (22.6) | 1.106 (0.836-1.464) | .48 | 1.052 (0.782-1.414) | .74 |
| Loyalty card: no | 9 (20.9) | Reference |  | Reference |  |
| Loyalty card: yes | 242 (21.7) | 1.048 (0.496-2.216) | .90 | 1.094 (0.511-2.344) | .82 |
| Health literacy: high | 214 (22.0) | Reference |  | Reference |  |
| Health literacy: low | 37 (20.0) | 0.886 (0.599-1.309) | .54 | 0.807 (0.541-1.204) | .30 |

^aOR: odds ratio; AOR: adjusted odds ratio.

^bNot applicable.

^cGCSE: General Certificate of Secondary Education.

Willingness to Visit the CLOCS Website After the Survey

A slight majority of the study participants (n=596, 51.5%) indicated that they would like to visit the CLOCS website after the end of the survey (Multimedia Appendix 10). Table 3 shows that there was no difference between the two experimental conditions (control: n=308, 50.5% vs animation: n=288, 52.6%; OR 1.09; 95% CI 0.87-1.37 and AOR 1.08; 95% CI 0.85-1.37). Women with low health literacy, in comparison to those with high health literacy (n=111, 60% vs n=485, 49.9%; AOR 1.43; 95% CI 1.03-1.98), and women aged 35-44 years, in comparison to those aged 18-34 years, were more interested in visiting the study website (n=145, 60.4% vs n=148, 51%; AOR 1.49; 95% CI 1.05-2.12).

Table 3. Binary logistic regression on willingness to visit website after the survey (N=1157).

| Variables | Total, n (%) | OR (95% CI) | P value | AOR^a (95% CI) | P value |
| --- | --- | --- | --- | --- | --- |
| Overall | 596 (51.5)^b |  |  |  |  |
| Condition: control | 308 (50.5) | Reference |  | Reference |  |
| Condition: animation | 288 (52.7) | 1.090 (0.865-1.374) | .46 | 1.079 (0.853-1.367) | .53 |
| Age 18-34 years | 148 (51.0) | Reference |  | Reference |  |
| Age 35-44 years | 145 (60.4) | 1.464 (1.036-2.071) | .03 | 1.489 (1.048-2.115) | .03 |
| Age 45-54 years | 118 (56.2) | 1.231 (0.861-1.758) | .25 | 1.303 (0.905-1.877) | .16 |
| Age 55-70 years | 185 (44.4) | 0.765 (0.566-1.033) | .08 | 0.825 (0.606-1.123) | .22 |
| Education: below or equal to GCSE^c | 337 (49.1) | Reference |  | Reference |  |
| Education: university degree | 259 (55.0) | 1.265 (1.000-1.601) | .05 | 1.214 (0.946-1.559) | .13 |
| Income: below average | 322 (51.8) | Reference |  | Reference |  |
| Income: above average | 274 (51.2) | 0.978 (0.776-1.233) | .85 | 0.865 (0.677-1.107) | .25 |
| Loyalty card: no | 18 (41.9) | Reference |  | Reference |  |
| Loyalty card: yes | 578 (51.9) | 1.498 (0.808-2.776) | .20 | 1.521 (0.810-2.854) | .19 |
| Health literacy: high | 485 (49.9) | Reference |  | Reference |  |
| Health literacy: low | 111 (60.0) | 1.506 (1.094-2.074) | .01 | 1.431 (1.031-1.986) | .03 |

^aOR: odds ratio; AOR: adjusted odds ratio.

^bNot applicable.

^cGCSE: General Certificate of Secondary Education.


Discussion

Key Findings

This randomized web-based experiment examined the effectiveness of an animated decision aid to increase the willingness to participate in a case-control study. The results show that the animation did not increase intentions to participate in a real-world case-control study (CLOCS), or willingness to visit the real study website after the survey. However, the animation increased the participants’ willingness to share data from their loyalty cards for research. Interestingly, immediately after the completion of this web-based experiment, there was a spike in activity within the case-control study, with over 100 people signing up to participate in CLOCS.

Comparison With Previous Literature

Our findings reflect the mixed evidence currently available in the literature. While there has been some support for the effectiveness of animated decision aids in the context of health behavior research, many studies have focused on how the interventions can improve participants’ knowledge of the health behavior concerned, as their main outcome. When assessing participant intention to engage in the behavior (or an objective measurement of the behavior itself), the findings have been more inconsistent [30,31].

An interesting finding in our study is that participants with lower health literacy scores were more interested in finding out about the study than those with high health literacy scores. Our sample size calculations were based solely on the primary outcome; thus, we might have been underpowered to detect interaction effects between health literacy and the behavior proxy. However, previous studies indicate that multimedia interventions are not always effective for individuals with lower educational levels [14] and low health literacy [15,16]. It has previously been shown in the cancer screening literature that gist-based supplementary materials can enhance engagement with the main materials among people with low numeracy [32], and perhaps animations and other easy-to-read materials could enhance participant recruitment in health research [33]. Furthermore, the positive association between low health literacy and willingness to visit the website may be explained by factors such as wanting to find more information, the salience of the research topic, and other factors that were not measured in this web-based experiment. As such, future studies focusing on the comprehension of the materials among people with low health literacy, for example using a think-aloud methodology, could explain this outcome.

Strengths and Limitations

This study has some important limitations, which call for follow-up research. First, we did not include a comprehension assessment to check whether participants fully understood or watched the animation. Second, we did not measure attitudes toward the simulated website and CLOCS. A recent systematic review on participant comprehension and informed consent in health research highlighted that while there are efforts to improve participation rates using various methods, there is a lack of assessment of readability, participant literacy, and standardization of recruitment methods for informed consent procedures in health research [33]. While all the participant-facing CLOCS research materials have been reviewed by patient and public representatives, further assessment of participant comprehension prior to the animation experiment would have strengthened our methodology. However, the rationale for excluding these measures was the assumption that people often form immediate decisions about whether something is relevant to them using heuristic decision-making before establishing deliberative decisions [34]. By excluding cognitive measures from our assessment, and thereby minimizing judgement and bias, we tried to capture individuals’ potential reactions to the website as closely as possible to their reactions in real life. In this context, further studies using eye-tracking experiments on the simulated website would be highly informative for building a better understanding of how people interact with the website and its contents [35].

On the other hand, exposure to the animation increased intentions to share loyalty card data for research in general, but not intentions to participate in CLOCS specifically; this suggests that there are study-specific characteristics that did not appeal to individuals (eg, actively signing up to provide information) or that the study participants were not eligible. A recent study also shows that only half of the population is willing to share shopping data for health research, highlighting differences in the sociodemographic characteristics of people who are willing to share their data for health research [36]. The characteristics of the participants willing to take part in CLOCS in this study mirror the results of that age-stratified survey conducted in England, with older women less willing to take part. While self-sampling bias will continue to be a concern for case-control studies, given the differences in characteristics of the people who are willing to take part, recruitment strategies could be stratified and tailored to engage populations that are less willing to take part in health research, based on this evidence and the validation of public acceptability. Our results extend this evidence using an experimental design with greater internal validity to examine potential barriers to recruiting participants to a case-control study.

Implications for Policy and Future Research

While such questions of generalizability are warranted, this study remains relevant to current research contexts. Owing to the COVID-19 pandemic, many research studies have been forced to consider moving online, which may require researchers to devise additional strategies to reach web-based samples with different characteristics. Underrepresentation in health research is already an issue for minority populations, people with low literacy, and those experiencing greater deprivation when traditional methods of recruitment are used, unless these groups are specifically targeted [37,38]; thus, there is a further need to ensure that web-based strategies provide means for researchers to attract representative samples.

Future studies should therefore continue exploring web-based methods to facilitate complex decision-making processes for potential participants of health research. In light of these findings, the CLOCS animation has not been actively disseminated for participant recruitment; however, different results might be obtained for research of a different nature. Other multimedia formats or mediums such as social media could be explored in future studies, along with the consideration of potentially important variables such as participants’ willingness to share data for research purposes.

Conclusion

The results of this study indicate that the animated decision aid did not influence the participants’ intention to take part in CLOCS or visit the study website. The animation, however, increased the probability of individuals stating that they would share their loyalty card data for research. Future research should continue exploring methods that can effectively engage participants with low health literacy to participate in complex health research.

Acknowledgments

The authors would like to acknowledge Deborah Tanner and Fiona Murphy who have acted as representatives to patients with ovarian cancer since the inception of the CLOCS project and helped us inform the design and contents of the animation. This study was funded by Cancer Research UK Early Detection and Diagnosis project grant (C38463/A26726).

Authors' Contributions

JHL, STS, RK, and YH developed the study concept and design. YH, HRB, and JMF designed and developed the animation aid for the Cancer Loyalty Card Study (CLOCS). JHL, STS, and YH performed the data analysis and interpretation. JHL, STS, and YH drafted the manuscript. All authors provided critical revisions and approved the final version of the manuscript for submission.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Cancer Loyalty Card Study participant information sheet - health volunteer version 2.

DOCX File , 666 KB

Multimedia Appendix 2

A screenshot for the simulated website without the animation.

PNG File , 523 KB

Multimedia Appendix 3

A screenshot of the simulated website with the animation.

PNG File , 553 KB

Multimedia Appendix 4

Sample characteristics.

DOCX File , 14 KB

Multimedia Appendix 5

Distribution for time spent on the survey without excluding speeders (control group).

PNG File , 72 KB

Multimedia Appendix 6

Distribution for time spent on the survey after excluding speeders (Control group).

PNG File , 78 KB

Multimedia Appendix 7

Distribution for time spent on the survey without excluding speeders (Animation group).

PNG File , 66 KB

Multimedia Appendix 8

Distribution for time spent on the survey excluding speeders (Animation group).

PNG File , 77 KB

Multimedia Appendix 9

Adjusted regression models on the whole sample without excluding the speeders.

DOCX File , 14 KB

Multimedia Appendix 10

Distribution of willingness to visit the website after the survey.

PNG File , 18 KB

  1. Grimes DA, Schulz KF. Compared to what? Finding controls for case-control studies. The Lancet 2005 Apr;365(9468):1429-1433. [CrossRef]
  2. Bernstein L. Control recruitment in population-based case-control studies. Epidemiology 2006 May;17(3):255-257. [CrossRef] [Medline]
  3. Brøgger-Mikkelsen M, Ali Z, Zibert JR, Andersen AD, Thomsen SF. Online Patient Recruitment in Clinical Trials: Systematic Review and Meta-Analysis. J Med Internet Res 2020 Nov 04;22(11):e22179 [FREE Full text] [CrossRef] [Medline]
  4. Lane TS, Armin J, Gordon JS. Online Recruitment Methods for Web-Based and Mobile Health Studies: A Review of the Literature. J Med Internet Res 2015 Jul 22;17(7):e183 [FREE Full text] [CrossRef] [Medline]
  5. Fenner Y, Garland SM, Moore EE, Jayasinghe Y, Fletcher A, Tabrizi SN, et al. Web-based recruiting for health research using a social networking site: an exploratory study. J Med Internet Res 2012 Feb 01;14(1):e20 [FREE Full text] [CrossRef] [Medline]
  6. Loxton D, Harris ML, Forder P, Powers J, Townsend N, Byles J, et al. Factors Influencing Web-Based Survey Response for a Longitudinal Cohort of Young Women Born Between 1989 and 1995. J Med Internet Res 2019 Mar 25;21(3):e11286 [FREE Full text] [CrossRef] [Medline]
  7. Keusch F. Why do people participate in Web surveys? Applying survey participation theory to Internet survey data collection. Manag Rev Q 2015 Jan 9;65(3):183-216. [CrossRef]
  8. Fang J, Shao P, Lan G. Effects of innovativeness and trust on web survey participation. Computers in Human Behavior 2009 Jan;25(1):144-152. [CrossRef]
  9. Fan W, Yan Z. Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior 2010 Mar;26(2):132-139. [CrossRef]
  10. Christensen T, Riis AH, Hatch EE, Wise LA, Nielsen MG, Rothman KJ, et al. Costs and Efficiency of Online and Offline Recruitment Methods: A Web-Based Cohort Study. J Med Internet Res 2017 Mar 01;19(3):e58 [FREE Full text] [CrossRef] [Medline]
  11. Topolovec-Vranic J, Natarajan K. The Use of Social Media in Recruitment for Medical Research Studies: A Scoping Review. J Med Internet Res 2016 Nov 07;18(11):e286 [FREE Full text] [CrossRef] [Medline]
  12. Synnot A, Ryan R, Prictor M, Fetherstonhaugh D, Parker B. Audio-visual presentation of information for informed consent for participation in clinical trials. Cochrane Database Syst Rev 2014 May 09(5):CD003717 [FREE Full text] [CrossRef] [Medline]
  13. Visscher BB, Steunenberg B, Heijmans M, Hofstede JM, Devillé W, van der Heide I, et al. Evidence on the effectiveness of health literacy interventions in the EU: a systematic review. BMC Public Health 2018 Dec 29;18(1):1414 [FREE Full text] [CrossRef] [Medline]
  14. Lin Y, Chen C, Lee W, Cheng Y, Lin T, Lin C, et al. Educational video-assisted versus conventional informed consent for trauma-related debridement surgery: a parallel group randomized controlled trial. BMC Med Ethics 2018 Mar 09;19(1):23 [FREE Full text] [CrossRef] [Medline]
  15. Housten AJ, Kamath GR, Bevers TB, Cantor SB, Dixon N, Hite A, et al. Does Animation Improve Comprehension of Risk Information in Patients with Low Health Literacy? A Randomized Trial. Med Decis Making 2019 Dec 03;40(1):17-28. [CrossRef]
  16. Meppelink CS, van Weert JCM, Haven CJ, Smit EG. The effectiveness of health animations in audiences with different health literacy levels: an experimental study. J Med Internet Res 2015 Jan 13;17(1):e11 [FREE Full text] [CrossRef] [Medline]
  17. Brewer HR, Hirst Y, Sundar S, Chadeau-Hyam M, Flanagan JM. Cancer Loyalty Card Study (CLOCS): protocol for an observational case-control study focusing on the patient interval in ovarian cancer diagnosis. BMJ Open 2020 Sep 08;10(9):e037459 [FREE Full text] [CrossRef] [Medline]
  18. Flanagan JM, Skrobanski H, Shi X, Hirst Y. Self-Care Behaviors of Ovarian Cancer Patients Before Their Diagnosis: Proof-of-Concept Study. JMIR Cancer 2019 Jan 17;5(1):e10447. [CrossRef] [Medline]
  19. The Cancer Loyalty Card Study (CLOCS). YouTube. 2020 Jan 15.   URL: https://youtu.be/vr5mCxl7WIw [accessed 2022-08-11]
  20. von Wagner C, Hirst Y, Waller J, Ghanouni A, McGregor LM, Kerrison RS, et al. The impact of descriptive norms on motivation to participate in cancer screening - Evidence from online experiments. Patient Educ Couns 2019 Sep;102(9):1621-1628 [FREE Full text] [CrossRef] [Medline]
  21. Stoffel S, Hirst Y, Ghanouni A, McGregor LM, Kerrison R, Verstraete W, et al. Testing active choice for screening practitioner's gender in endoscopy among disinclined women: An online experiment. J Med Screen 2019 Jun;26(2):98-103 [FREE Full text] [CrossRef] [Medline]
  22. Stoffel S, Yang J, Vlaev I, von Wagner C. Testing the decoy effect to increase interest in colorectal cancer screening. PLoS ONE 2019 Mar 26;14(3):e0213668 [FREE Full text] [CrossRef]
  23. Bartlett G, Macgibbon B, Rubinowicz A, Nease C, Dawes M, Tamblyn R. The Importance of Relevance: Willingness to Share eHealth Data for Family Medicine Research. Front Public Health 2018 Sep 4;6:255 [FREE Full text] [CrossRef] [Medline]
  24. Norman CD, Skinner HA. eHEALS: The eHealth Literacy Scale. J Med Internet Res 2006 Nov 14;8(4):e27 [FREE Full text] [CrossRef] [Medline]
  25. Park H, Lee E. Self-reported eHealth literacy among undergraduate nursing students in South Korea: a pilot study. Nurse Educ Today 2015 Feb;35(2):408-413. [CrossRef] [Medline]
  26. Park H, Moon M, Baeg JH. Association of eHealth literacy with cancer information seeking and prior experience with cancer screening. Comput Inform Nurs 2014 Sep;32(9):458-463. [CrossRef] [Medline]
  27. Average household income, UK: financial year ending 2020 (provisional). Office for National Statistics. 2020. URL: https://www.ons.gov.uk/peoplepopulationandcommunity/personalandhouseholdfinances/incomeandwealth/bulletins/householddisposableincomeandinequality/financialyearending2020provisional [accessed 2020-08-01]
  28. Greszki R, Meyer M, Schoen H. The impact of speeding on data quality in nonprobability and freshly recruited probability-based online panels. In: Online Panel Research: A Data Quality Perspective. Chichester, UK: Wiley; 2014:238-262.
  29. Miller DP, Spangler JG, Case LD, Goff DC, Singh S, Pignone MP. Effectiveness of a web-based colorectal cancer screening patient decision aid: a randomized controlled trial in a mixed-literacy population. Am J Prev Med 2011 Jun;40(6):608-615 [FREE Full text] [CrossRef] [Medline]
  30. Jerant A, Kravitz RL, Sohler N, Fiscella K, Romero RL, Parnes B, et al. Sociopsychological tailoring to address colorectal cancer screening disparities: a randomized controlled trial. Ann Fam Med 2014 May 12;12(3):204-214 [FREE Full text] [CrossRef] [Medline]
  31. Reynolds WW, Nelson RM. Risk perception and decision processes underlying informed consent to research participation. Soc Sci Med 2007 Nov;65(10):2105-2115. [CrossRef] [Medline]
  32. Smith S, Raine R, Obichere A, Wolf MS, Wardle J, von Wagner C. The effect of a supplementary ('Gist-based') information leaflet on colorectal cancer knowledge and screening intention: a randomized controlled trial. J Behav Med 2015 Apr;38(2):261-272 [FREE Full text] [CrossRef] [Medline]
  33. Flory J, Emanuel E. Interventions to improve research participants' understanding in informed consent for research: a systematic review. JAMA 2004 Oct 06;292(13):1593-1601. [CrossRef] [Medline]
  34. Shaffer VA, Owens J, Zikmund-Fisher BJ. The effect of patient narratives on information search in a web-based breast cancer decision aid: an eye-tracking study. J Med Internet Res 2013 Dec 17;15(12):e273 [FREE Full text] [CrossRef] [Medline]
  35. UK government coronavirus lockdowns. Institute for Government. 2022. URL: https://www.instituteforgovernment.org.uk/charts/uk-government-coronavirus-lockdowns [accessed 2022-05-10]
  36. Hirst Y, Stoffel S, Brewer H, Timotijevic L, Raats M, Flanagan JM. Understanding Public Attitudes and Willingness to Share Commercial Data for Health Research: A Survey Study. JMIR Preprints 2022 Jul 07:Preprint.
  37. Patel MX, Doku V, Tennakoon L. Challenges in recruitment of research participants. Adv Psychiatr Treat 2018 Jan 02;9(3):229-238. [CrossRef]
  38. Sully BGO, Julious SA, Nicholl J. A reinvestigation of recruitment to randomised, controlled, multicenter trials: a review of trials funded by two UK funding agencies. Trials 2013 Jun 09;14(1):166 [FREE Full text] [CrossRef] [Medline]


AOR: adjusted odds ratio
CLOCS: Cancer Loyalty Card Study
OR: odds ratio


Edited by A Mavragani; submitted 01.06.22; peer-reviewed by YK Lin, R Sampieri-Cabrera; comments to author 13.07.22; revised version received 21.07.22; accepted 31.07.22; published 26.08.22

Copyright

©Sandro T Stoffel, Jing Hui Law, Robert Kerrison, Hannah R Brewer, James M Flanagan, Yasemin Hirst. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 26.08.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.