Published on 14.01.2025 in Vol 27 (2025)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/60413.
Recruiting Young People for Digital Mental Health Research: Lessons From an AI-Driven Adaptive Trial

Original Paper

Corresponding Author:

Wu Yi Zheng, PhD

Black Dog Institute

University of New South Wales

Hospital Road, Randwick

Sydney, 2031

Australia

Phone: 61 0422510718

Email: wuyi.zheng@blackdog.org.au


Background: With increasing adoption of remote clinical trials in digital mental health, identifying cost-effective and time-efficient recruitment methodologies is crucial for the success of such trials. Evidence on whether web-based recruitment methods are more effective than traditional methods such as newspapers, media, or flyers is inconsistent. Here we present insights from our experience recruiting tertiary education students for a digital mental health artificial intelligence–driven adaptive trial—Vibe Up.

Objective: We evaluated the effectiveness of recruitment via Facebook and Instagram compared to traditional methods for a treatment trial and compared different recruitment methods’ retention rates. With recruitment coinciding with COVID-19 lockdowns across Australia, we also compared the cost-effectiveness of social media recruitment during and after lockdowns.

Methods: Recruitment was completed for 2 pilot trials and 6 minitrials from June 2021 to May 2022. To recruit participants, paid social media advertising on Facebook and Instagram was used, alongside mailing lists of university networks and student organizations or services, media releases, announcements during classes and events, study posters or flyers on university campuses, and health professional networks. Recruitment data, including engagement metrics collected by Meta (Facebook and Instagram), advertising costs, and Qualtrics data on recruitment methods and survey completion rates, were analyzed using RStudio with R (version 3.6.3; R Foundation for Statistical Computing).

Results: In total, 1314 eligible participants (mean age 22.79, SD 4.71 years; n=1079, 82.1% female) were recruited to 2 pilot trials and 6 minitrials. The vast majority were recruited via Facebook and Instagram advertising (n=1203, 92%). Pairwise comparisons revealed that the lead institution’s website was more effective in recruiting eligible participants than Facebook (z=3.47; P=.003) and Instagram (z=4.23; P<.001). No differences were found between recruitment methods in retaining participants at baseline, at midpoint, and at study completion. Wilcoxon tests found significant differences between lockdown (pilot 1 and pilot 2) and postlockdown (minitrials 1-6) in costs incurred per link click (lockdown: median Aus $0.35 [US $0.22], IQR Aus $0.27-$0.47 [US $0.17-$0.29]; postlockdown: median Aus $1.00 [US $0.62], IQR Aus $0.70-$1.47 [US $0.44-$0.92]; W=9087; P<.001) and in the amount spent per hour to reach the target sample size (lockdown: median Aus $4.75 [US $2.95], IQR Aus $1.94-$6.34 [US $1.22-$3.97]; postlockdown: median Aus $13.29 [US $8.26], IQR Aus $4.70-$25.31 [US $2.95-$15.87]; W=16044; P<.001).

Conclusions: Social media advertising via Facebook and Instagram was the most successful strategy for recruiting distressed tertiary students into this artificial intelligence–driven adaptive trial, providing evidence for the use of this recruitment method for this type of trial in digital mental health research. No recruitment method stood out in terms of participant retention. Social media recruitment during the COVID-19 lockdown period was more cost-effective, perhaps reflecting the added distress experienced by young people at that time.

Trial Registration: Australian New Zealand Clinical Trials Registry ACTRN12621001092886; https://tinyurl.com/39f2pdmd; Australian New Zealand Clinical Trials Registry ACTRN12621001223820; https://tinyurl.com/bdhkvucv

J Med Internet Res 2025;27:e60413

doi:10.2196/60413



Introduction

Background

Advances in digital clinical trial infrastructure have made fully remote clinical trials more commonplace, particularly in the field of digital mental health. In such trials, participants are screened, assessed, treated, and followed up entirely via remote digital methods. Pivotal to the scientific validity of these trials is the timely and adequate recruitment of target participants who meet a set of prespecified eligibility criteria. Due to the paucity of research into best practice methods for efficiently recruiting trial samples, in this paper, we present insights from our experiences recruiting young participants for a digital mental health trial that used an artificial intelligence (AI)–driven adaptive trial design. This type of trial design has the potential to minimize time, costs, and the sample size needed to achieve sufficient statistical power [1]. Additionally, this method facilitates the optimization of treatment plans, which can enhance adherence and lower attrition [2]. However, compared to traditional randomized controlled trials (RCTs), AI-adaptive trials face an increased risk of insufficient and untimely enrollment due to the need for frequent, short-spanned recruitment drives (eg, monthly recruitment drives over a year). Hence, identifying recruitment methodologies that are both cost-effective and time-efficient is crucial for the success of such trials.

What recruitment methods are currently used and how do they compare with each other? A review by Lane et al [3] on online recruitment methods for web-based and mHealth studies found that successful recruitment of participants into a trial varied widely depending on the population, budget, intervention, costs, and study design. Of the studies included in their review, less than half (42%) found Facebook advertisements to be an effective method of recruitment and a quarter found Google advertisements to be the most effective method [3]. Only 1 study reported better outcomes using traditional methods of recruitment (eg, print advertisements and flyers) [3]. Overall, there was no consistent evidence on whether web-based recruitment methods were more effective than traditional methods such as newspapers, media, or flyers [3]. Similar mixed findings were echoed by a systematic review of recruitment strategies in mental health clinical trials [4], and a scoping review that examined the role of social media in clinical trials recruitment [5].

Specific to online recruitment methods, there is also contradictory evidence regarding the most cost-effective platform. Morgan et al [6] used a range of internet-based methods to recruit for the Mood Memos depression prevention study, including Facebook, emails, search engine advertisements, and forum posts, and reported that advertising on Google was more cost-effective than Facebook advertising (Aus $12 [US $7.46] per recruited participant vs Aus $19.89 [US $12.37] per recruited participant) in signing up participants. Similarly, recruitment costs for a large RCT of web-based smoking interventions were highest for Facebook (US $40.51) per randomized participant, compared to Google (US $34.71), traditional sources such as press releases, newspaper, radio, and television interviews (US $20.30), and using an online survey panel company (US $13.95) [7]. In contrast, Facebook advertising was found to be more cost-effective for recruiting a large sample of general community participants for suicide prevention research (Aus $2.81 [US $1.75] per completed survey) [8]. Similarly, Batterham [9] reported that advertising on a Facebook page that featured links to a survey on suicidal ideation was more effective in yielding completions (Aus $1.51 [US $0.94] per completion), compared to advertisements linked to an external website (Aus $9.82 [US $6.11] per completion). Both were more cost-effective than postal surveys, which averaged Aus $19.10 [US $11.88] per completion [9]. In a review of 176 studies that used social media for mental health research recruitment, Facebook was found to be the most preferred platform, at a median cost of US $19.47 per final recruited study participant [10]. The authors also found that in 2 out of 3 studies included in the review, social media recruitment performed as well as or better than traditional methods in the number and cost of final enrolled participants [10].

Differences in effectiveness between recruitment methods can often be difficult to interpret. In their systematic review, Liu et al [4] found that it was difficult to assess the overall effectiveness of any particular recruitment strategy as some strategies that worked well for a particular population may not work as well for others. For example, the greater costs of Facebook advertising for the Mood Memos study relative to the other studies described may be due to study requirements to target adults with subthreshold depression symptoms as opposed to recruiting a nonclinical sample. In terms of recruitment of young participants, evidence suggests that online advertisements are effective at appealing to this particular demographic. For example, Ford et al [11] examined paid advertising on Instagram, Snapchat, and Facebook to recruit youths (aged 13-20 years) for 2 cross-sectional surveys. They concluded that these platforms appear to be a modern and cost-effective approach to reaching youths with surveys on sensitive health topics [11]. Similar findings were obtained from an Australian health survey study that assessed the feasibility of recruiting young females using targeted advertising on Facebook [12]. In another study, Lee et al [8] found that male participants were more difficult to recruit than female participants but were able to use targeted, gender-specific advertisements (different language and imagery compared to gender-neutral advertisements) on Facebook to recruit more males. The ability to target different subgroups and easily adjust targeting strategies is a significant benefit of Facebook recruitment. The impact of different targeting strategies can be monitored (eg, costs per click), and further tweaks can be made to determine the best way to reach the target population. In addition, marketing collateral such as images, headlines, and copy can be changed in order to find the most effective study advertisement [7].

Overall, these studies suggest that online advertising via platforms such as Facebook and Instagram is promising for the effective recruitment of young adults for mHealth survey studies. However, evidence supporting using these recruitment strategies for treatment studies is scarcer. A recent Danish study demonstrated the feasibility of using Facebook and Instagram to recruit young Danes (aged 15-25 years) for an RCT to evaluate a national online youth mental health promotion service [13]. A study that assessed the effectiveness of social media (Facebook), targeted mailing, and in-person recruitment of young adults for diabetes self-management clinical trials found that leveraging a variety of recruitment strategies produced a more representative sample of young adults [14]. To our knowledge, no single study has investigated the effectiveness of various recruitment methods to enlist distressed young adults in digital intervention studies.

The Vibe Up Trial

Vibe Up is an AI-driven adaptive trial, in which a series of “mini” trials are performed instead of 1 large trial. AI methods are used to update an underlying model of the interventions’ effectiveness and alter the proportion of participants allocated to each intervention in the next minitrial. Under this scheme, progressively fewer participants are allocated to less effective interventions in later minitrials, with the aim of identifying the most effective intervention as quickly as possible. This trial delivered brief interventions, including mindfulness (a set of five 3- to 5-minute instructor-guided audio meditations), physical activity (an evidence-based 7-minute high-intensity circuit training protocol, or self-selected daily physical activity), sleep hygiene (a program of 4 infographic-based information modules), and ecological momentary assessment (a blended protocol consisting of signal-contingent and event-contingent sampling), to ease psychological distress in Australian tertiary education students [15]. A mobile app, Vibe Up, was purpose-built for these trials, allowing eligible students to participate in a “minitrial” by downloading the app from the App Store or Google Play. To ensure that data were captured as intended and to detect fraudulent activities, data verification was carried out by the data manager after the completion of each minitrial. Recruitment for Vibe Up coincided with COVID-19 lockdowns in Australia. This provided us with the opportunity to assess the cost-effectiveness of social media recruitment during a pandemic compared to its aftermath. Research has shown significant increases in distress among young people as a result of the COVID-19 pandemic [16-18]. The transition to online learning for students and restrictions on routine daily activities affected the well-being of young people [19-21]. With access to in-person mental health support no longer possible, we examined whether the cost-effectiveness of recruitment differed between the lockdown and the postlockdown period.
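
The trial protocol [15] describes this as a bandit-based response-adaptive design. As a rough illustration of how allocation proportions can be updated between minitrials, the sketch below applies generic Thompson sampling to a binary improvement outcome; the arm names, prior counts, and outcome data are placeholders, and this is not the actual Vibe Up allocation model.

```r
# Generic Thompson-sampling sketch for updating allocation proportions between
# minitrials. Placeholder data; not the Vibe Up trial's actual model.
set.seed(42)

arms <- c("mindfulness", "physical_activity", "sleep_hygiene", "ema")

# Hypothetical outcomes so far: "successes" = participants whose distress improved.
successes <- c(30, 22, 18, 15)
failures  <- c(20, 28, 32, 35)

# Sample each arm's Beta posterior repeatedly and record which arm looks best.
n_draws <- 10000
draws <- sapply(seq_along(arms), function(i) rbeta(n_draws, 1 + successes[i], 1 + failures[i]))
best  <- apply(draws, 1, which.max)

# Allocation proportions for the next minitrial: arms that look more effective
# receive a larger share of newly recruited participants.
allocation <- prop.table(table(factor(best, levels = seq_along(arms), labels = arms)))
round(allocation, 2)
```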

Aims

This study evaluated recruitment data from an AI-driven adaptive digital mental health trial aimed at reducing psychological distress in Australian tertiary education students [15]. The aims were to (1) evaluate the effectiveness of recruitment via Facebook and Instagram compared to traditional methods for a treatment study, (2) compare the cost-effectiveness of online recruitment during and after COVID-19 lockdowns, and (3) compare the retention rates of the different recruitment methods used in this study.


Methods

Study Design

This study evaluated recruitment data collected from the Vibe Up study [15]. Data included engagement metrics collected by Meta (Facebook and Instagram), advertising costs, and Qualtrics data [22] on recruitment methods and rates of survey completion. To recruit participants, paid social media advertising on Facebook and Instagram was used, alongside mailing lists of university networks and student organizations or services, media releases, announcements during classes and events, study posters or flyers on university campuses, and our health professionals’ network. Recruitment for 2 pilot trials and 6 minitrials occurred between June 2021 and May 2022.

Participants

Participants were Australian tertiary education students. No restrictions were placed on the state or territory of origin. To participate in the study, participants had to (1) be aged 18 years or older; (2) be currently enrolled as a student at an Australian university, technical and further education institution, or other higher education institution; (3) be currently residing in Australia and planning to be a resident throughout the study period; (4) be an advanced, fluent, or native English speaker; (5) own an eligible personal smartphone (iPhone 6S or Android 5 or later) with an active mobile number and internet access; and (6) score 20 or more on the Kessler Psychological Distress Scale, 10-item version [23].

Participants were excluded if they (1) scored 21 or more on the Suicidal Ideation Attributes Scale [24]; (2) reported a current active diagnosis of psychosis or bipolar disorder; (3) had already participated in a “minitrial” in the study; (4) indicated major disruptions or events in the next 2 months that may make it difficult to take part in the study; (5) indicated plans to travel outside of Australia in the next 2 months; or (6) indicated that they would be unable to safely undertake a physical activity intervention if allocated to receive this treatment.

Procedure

Recruitment was conducted by the research team in collaboration with the lead institution’s in-house marketing and communications department. A social media targeting strategy was developed (Table 1). Clicking on the Meta advertisements (Facebook and Instagram) took potential participants to a landing page, hosted on the lead research institution’s website, which contained study information and a direct link to the screening survey administered on the Qualtrics XM survey platform [22]. The recruitment window for each Vibe Up trial remained open for up to 7 days only; however, to avoid losing potential participants between recruitment drives, the survey link was then replaced by an expression-of-interest form on the website to capture the name, email address, and education institution of interested students. These students were contacted at the beginning of the following recruitment drive. Two pilot trials were carried out before recruitment began for the 6 minitrials.

Table 1. Profile of individuals targeted on social media.
Category: Detail
Location: Australia
Age (years): 18-40
Sex: Male or female
Education: At university (undergraduate) or at university (postgraduate)
Interests: Science, artificial intelligence, mindfulness, computer science, DSM-5a, mental health foundation, Headspace, Beyond Blue or Lifeline (crisis support service), and international students

aDSM-5: Diagnostic and Statistical Manual of Mental Disorders (Fifth Edition).

Active recruitment was conducted via paid advertising on Facebook and Instagram. Traditional methods for recruiting students, such as mailing lists, study flyers or posters, class or event announcements, and posts on the Black Dog Institute website and university media, were also used. Meta’s “dynamic creative” function was used to upload a variety of images, ad copy, and headlines. Meta then delivered different combinations of these elements until it identified the top-performing combination, that is, the combination that delivered on the objective at the lowest cost.

We trialed different advertising objectives using Meta’s (Facebook and Instagram) preset choices. For example, the “link clicks” objective allowed Meta to deliver study advertisements to people most likely to click on them, with the aim of obtaining the cheapest cost per click. We then unsuccessfully trialed a “landing page view” objective, which focused on getting people not just to click on the advertisement but also to load and engage with the content on the page. To improve performance, for minitrials 5 and 6, the Meta pixel was used to set up an “initiate checkout” conversion on the “GET STARTED” button, which appeared at the bottom of the recruitment website and took participants to the Qualtrics screening survey. In this instance, Meta optimized study advertisement delivery toward people most likely to take this action.

Measures

Metrics used to measure recruitment in this study were derived from Facebook analytics data and Qualtrics survey data. For ease of comparison, metrics were largely based on previous research in this area [8,9] and are included in Table 2.

Table 2. Study measures and definitions.
Variable name: Definition
Amount spent on advertising per trial (Aus $): Facebook and Instagram advertising costs per trial, in Australian dollars.
Impressions: Number of times study advertisements were viewed.
Reach: Number of people who viewed study advertisements.
Link clicks: Number of times that links to study advertisements were clicked on.
Unique link clicks: Number of people who clicked on study advertisement links, regardless of how many times they clicked on a link.
Click-through rate: The percentage of people who viewed and then clicked on the study advertisement, calculated by dividing the total number of clicks by the total number of impressions and multiplying by 100. Click-through rates measure how successful the study advertisements were in capturing people’s attention.
Number of students screened: Number of people who completed screening.
Number of eligible students: Number of people who were eligible to take part in the study.
Number of students who downloaded the study app: Number of people who downloaded the study app from Google Play or the App Store.
Number of students who completed a trial: Number of people who completed the 3 phases of the study (baseline, midpoint, and post).
Costs per screened: Cost per person who completed the screening questionnaires, in Australian dollars.
Costs per eligible: Cost per person who was eligible to participate in the study, in Australian dollars.
Costs per downloaded: Cost per person who downloaded the study app from Google Play or the App Store, in Australian dollars.
Costs per completed: Cost per person who completed all 3 phases of the study, in Australian dollars.
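
As a worked example of the click-through rate definition above, the overall rate reported in the Results can be reproduced from the study totals (8138 link clicks across 1,119,661 impressions; Table 4). A minimal calculation in R:

```r
# Overall click-through rate from the study totals (Table 4).
clicks <- 8138          # total link clicks across all trials
impressions <- 1119661  # total impressions across all trials
ctr <- clicks / impressions * 100
round(ctr, 2)  # 0.73, ie, a click-through rate of 0.73%
```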

Statistical Analysis

Analyses were performed in RStudio with R (version 3.6.3; R Foundation for Statistical Computing). Descriptive analyses were conducted using Facebook and Instagram analytics data to assess how study advertisements performed and using Qualtrics survey data to determine the rates of screened, eligible, and completed participants. A general linear model with pairwise post hoc estimates was used to compare the effectiveness of different methods in recruiting and retaining participants. For each recruitment pathway (Facebook and Instagram advertisement or post, other, and website of lead institution), Fisher exact tests were conducted to determine whether the gender, student status (domestic or international), and student level (undergraduate or postgraduate) of participants changed significantly as a result of dropout across different stages of the trial. To test for differences in age, Kruskal-Wallis tests were performed. For the purpose of comparing cost-effectiveness between the COVID-19 lockdown and post–COVID-19 lockdown periods, pilot 1 and pilot 2 were classified in the former category and minitrials 1-6 in the latter. Inferential statistics were performed using Wilcoxon tests.
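
To make the analysis plan concrete, the snippet below sketches these comparisons on a small synthetic data set; the variable names, the synthetic data, and the use of the emmeans package for the pairwise post hoc estimates are assumptions for illustration and do not reproduce the study’s actual analysis code.

```r
# Minimal sketch of the analyses described above, using synthetic data.
library(emmeans)  # assumed here for pairwise post hoc estimates

set.seed(1)
n <- 600
dat <- data.frame(
  pathway        = sample(c("facebook", "instagram", "other", "website"), n, replace = TRUE),
  eligible       = rbinom(n, 1, 0.4),
  age            = round(rnorm(n, 23, 5)),
  student_status = sample(c("domestic", "international"), n, replace = TRUE),
  dropped_out    = rbinom(n, 1, 0.3)
)

# Effectiveness of recruitment pathways: general linear model with pairwise contrasts
fit <- glm(eligible ~ pathway, family = binomial, data = dat)
emmeans(fit, pairwise ~ pathway)$contrasts

# Did dropouts within one pathway differ on a categorical characteristic? (Fisher exact test)
fb <- subset(dat, pathway == "facebook")
fisher.test(table(fb$student_status, fb$dropped_out))

# Age differences between those retained and those who dropped out (Kruskal-Wallis test)
kruskal.test(age ~ dropped_out, data = dat)
```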

Ethical Considerations

Ethical approval was sought and obtained from the University of New South Wales Sydney Human Research Ethics Committee (HC200466). The trial sponsor is the University of New South Wales Sydney. The trial is registered with the Australian New Zealand Clinical Trials Registry (Pilot trials: ACTRN12621001092886; Mini-trials 1-6: ACTRN12621001223820). Informed consent was obtained from each trial participant, and the collected data were stored in a deidentified format. Each participant received Aus $30 [US $18.65] in gift cards.


Results

Demographics

In total, 1314 participants were recruited for 2 pilot trials and 6 minitrials. Participants had a mean age of 22.79 (SD 4.71) years and were predominantly female (n=1079, 82.1%), domestic (n=1226, 93.3%), and undergraduate (n=1045, 79.5%) students. Participants were recruited from 72 tertiary education institutions across Australia (see Table S1 in Multimedia Appendix 1 for the top 10 institutions).

Breakdown by Recruitment Methods

For the 2 pilot trials and 6 minitrials, 3638 students attempted the screening survey. The majority of students screened for the trials were recruited via social media advertising or posts on Facebook (n=1690, 47%) and Instagram (n=922, 25%). Of the 1314 students (36%) eligible to participate in the study, the vast majority were recruited via social media advertising or posts (n=1203, 92%). Of the 1058 students (81% of those eligible) who provided baseline data, similar proportions were recruited via social media advertising or posts. Of the 982 students (75% of those eligible) who provided midpoint data, recruitment via social media remained the most fruitful (n=889, 91%). Of the 862 students (66% of those eligible) who completed a trial, 90% were recruited via social media advertising or posts. See Table 3 for a detailed breakdown of participant retention by recruitment method.

Table 3. Breakdown of participants by recruitment pathway and trial retention.a
Recruitment pathway: attempted screening (N=3638), n; screened eligible (n=1314), n (%); completed baseline (n=1058), n (%); completed midpoint (n=982), n (%); completed a trial (n=862), n (%)
Facebook advertisement or post: 1690; 810 (47.9); 644 (38.1); 597 (35.3); 518 (30.7)
Instagram advertisement or post: 922; 393 (42.6); 316 (34.3); 292 (31.7); 257 (27.9)
Other: 107; 61 (57.0); 55 (51.4); 52 (48.6); 49 (45.8)
Website of lead institution: 72; 50 (69.4); 43 (59.7); 41 (56.9); 38 (52.7)
Survey not completed: 847; N/Ab; N/Ab; N/Ab; N/Ab

aOther included class or event, email, health professional, poster or flyer, and press or media. Retention percentages were calculated based on the number of participants who attempted screening.

bN/A: not applicable.

Pairwise comparisons revealed that the lead institution’s website was more effective in recruiting eligible participants compared to Facebook (z=3.47; P=.003) and Instagram (z=4.23; P<.001). No statistically significant difference was found between website recruitment and other methods (eg, email, poster or flyer) in recruiting eligible participants. No statistically significant differences were found between recruitment methods in retaining participants at baseline, at midpoint, and at study completion (Tables S2-S9 in Multimedia Appendix 1). To assess possible sample selection effects by recruitment pathway, the characteristics of participants who dropped out at each trial stage were examined. No statistically significant differences in age, sex, or student level (undergraduate vs postgraduate) were found for recruitment via Facebook, Instagram, other methods, or the website of the lead institution (Tables S10-S13 in Multimedia Appendix 1). For recruitment via Facebook, a statistically significant difference was found for student status (domestic vs international; P=.001; Table S10 in Multimedia Appendix 1). No such differences were found for Instagram, other methods, or the website of the lead institution (Tables S11-S13 in Multimedia Appendix 1).

Facebook and Instagram Recruitment

Across the 2 pilot trials and 6 minitrials, Aus $7571.72 (US $4707.72) was spent on social media recruitment via Facebook and Instagram advertising, at an average of Aus $946.47 (US $588.47; SE Aus $160.99 [US $100.10]) per trial. A total of 1,119,661 impressions were counted, at an average of 139,958 (SE 27,927) impressions per trial. The performance of study advertisements was measured by the number of clicks generated that led potential participants from the social media platform to the Vibe Up recruitment page. See Figures S1 and S2 in Multimedia Appendix 2 for the top-performing Vibe Up advertisements. During the recruitment drives, 825,635 people were reached, at an average of 103,204 people (SE 17,569) per trial. In total, 8138 clicks on the study website were registered (mean 1017, SE 71), of which 7858 were unique link clicks (mean 982, SE 68). The overall click-through rate was 0.73%. See Table 4 for the detailed breakdown of participant engagement by trial, during and post–COVID-19 lockdown.

Table 4. Breakdown of trial costs and engagement metrics.
Trial (stage): amount spent (Aus $a); impressions, n; reach, n; clicks, n; unique clicks, n; click-through rate (%)
Pilot 1 (lockdown): 445.98; 82,266; 65,654; 1227; 1155; 1.49
Pilot 2 (lockdown): 254.11; 39,709; 31,592; 699; 669; 1.76
Trial 1 (postlockdown): 850.00; 91,322; 71,994; 1142; 1112; 1.25
Trial 2 (postlockdown): 999.70; 237,201; 150,766; 1197; 1156; 0.50
Trial 3 (postlockdown): 1100.00; 247,540; 175,161; 1203; 1168; 0.49
Trial 4 (postlockdown): 1750.80; 207,799; 146,423; 967; 937; 0.47
Trial 5 (postlockdown): 1095.12; 99,285; 81,652; 850; 824; 0.86
Trial 6 (postlockdown): 1076.01; 114,539; 102,393; 853; 837; 0.74
Total: 7571.72; 1,119,661; 825,635; 8138; 7858; 0.73

aAus $1=US $0.62.

Cost-Effectiveness of Trial Recruitment

On average, it cost Aus $2.28 (US $1.42; SE Aus $0.49 [US $0.30]) per screened participant, Aus $6.59 (US $4.10; SE Aus $1.39 [US $0.86]) per eligible participant, Aus $7.78 (US $4.84; SE Aus $1.64 [US $1.02]) per participant who downloaded the study mobile app, and Aus $10.01 (US $6.22; SE Aus $2.20 [US $1.37]) per participant who completed a trial. See Table S16 in Multimedia Appendix 1 for the breakdown of costs by trial.
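
For illustration, per-participant figures of this kind can be derived by dividing each trial’s advertising spend by the number of participants reaching a given stage and then averaging across trials. In the sketch below, the per-trial spend is taken from Table 4, but the per-trial eligible counts are placeholders, so the output only approximates the reported values.

```r
# Cost per eligible participant, averaged across the 8 trials.
# Spend per trial is from Table 4; the eligible counts are placeholders.
spend    <- c(445.98, 254.11, 850.00, 999.70, 1100.00, 1750.80, 1095.12, 1076.01)
eligible <- c(180, 150, 170, 160, 165, 150, 170, 169)  # hypothetical split of the 1314 eligible

per_eligible <- spend / eligible
se <- function(x) sd(x) / sqrt(length(x))
round(c(mean = mean(per_eligible), se = se(per_eligible)), 2)
```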

Impacts of COVID-19 Lockdowns

Wilcoxon tests were carried out to compare costs incurred per link click between the COVID-19 lockdown period (pilot 1 and pilot 2) and the post–COVID-19 lockdown period (minitrials 1-6). Significant differences were found in the cost per click (lockdown: median Aus $0.35 [US $0.22], IQR Aus $0.27-$0.47 [US $0.17-$0.29]; postlockdown: median Aus $1.00 [US $0.62], IQR Aus $0.70-$1.47 [US $0.44-$0.92]; W=9087; P<.001; Table S14 in Multimedia Appendix 1). As recruitment drives occurred in short bursts (3-7 days, as opposed to weeks or months for other recruitment campaigns), we compared lockdown and postlockdown costs using hourly spending on Facebook and Instagram advertising. Differences between lockdown and postlockdown in the amount spent per hour to reach the target sample size were also statistically significant (lockdown: median Aus $4.75 [US $2.95], IQR Aus $1.94-$6.34 [US $1.22-$3.97]; postlockdown: median Aus $13.29 [US $8.26], IQR Aus $4.70-$25.31 [US $2.95-$15.87]; W=16044; P<.001; Table S15 in Multimedia Appendix 1).
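
A minimal sketch of this comparison in R, assuming per-click cost observations for each period (the values below are placeholders, not the study data):

```r
# Wilcoxon rank-sum test comparing cost per link click, lockdown vs postlockdown.
lockdown     <- c(0.27, 0.30, 0.35, 0.41, 0.47)  # placeholder Aus $ per click
postlockdown <- c(0.70, 0.88, 1.00, 1.21, 1.47)  # placeholder Aus $ per click

wilcox.test(lockdown, postlockdown)  # reports the W statistic and P value
```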


Discussion

Principal Findings

With the increasing adoption of remote clinical trials in digital mental health research, and the uptake of AI-driven adaptive trial methodology, strategies to optimize the timely recruitment of suitable participants warrant investigation. Advances in how online social media platforms can be leveraged to reach target audiences faster and more cheaply provided further impetus for this study. The effectiveness of participant recruitment via Facebook and Instagram was compared with traditional methods such as mailing lists and study flyers or posters. Results suggest that Facebook and Instagram are effective in recruiting young adults for remote digital mental health trials. Social media advertising outperformed more traditional methods in the total number of eligible participants recruited. However, we also found that, compared to Facebook and Instagram, recruitment via the lead institution’s website was more effective in yielding eligible participants from those who completed screening. It may be that participants recruited while browsing the institution’s website (a mental health research institute) were actively seeking information to manage mental health concerns and were thus more likely to be eligible for the study than someone shown the study advertisement via social media because they fit the target age group. In addition, although not statistically significant, the completion rate was higher for participants recruited via the website. This could be explained by familiarity with, trust in, and support for the mental health research carried out by the lead institution, and it suggests that this recruitment pathway can be effective for trial enrollment and, importantly, trial completion. Given that the vast majority of eligible participants were recruited via social media advertising (92%) compared to website recruitment (4%), if funds are available, it would still make sense to include social media in the strategy for recruiting young people to digital mental health trials. In terms of participant retention from study baseline to completion, there was no stand-out recruitment method. As with the number of eligible participants, the sheer volume of completers recruited via social media (90%) compared to traditional methods provides a rationale for including this method in participant recruitment strategies. We did not find evidence of a sample selection effect due to the recruitment methods used. However, similar trials in the future should identify strategies to attract more international students, for example, targeted social media advertising for this student group and liaising with student leaders to disseminate trial information to club or society members.

In terms of cost-effectiveness, this study provided further evidence to support the benefits of using social media to recruit young adults for mental health research participation. Our results (Aus $7.78 [US $4.84] per participant app download and Aus $10.01 [US $6.22] per trial completion) compared favorably with the Mood Memos study (Aus $19.89 [US $12.37] per recruited participant) [6], a health survey study targeting young females (US $20 per compliant participant) [12], a health intervention study of 334 young Australian adults (Aus $105.77 [US $65.76] per recruited participant) [25], a diabetes self-management clinical trial for young adults (US $334 per enrolled participant) [14], and a review that calculated estimated median recruitment costs for RCTs in mental health research (US $42.82) [10]. Even though costs per completion for our trial were higher than in the studies by Lee et al [8] (Aus $2.81 [US $1.75] per completion), Ash et al [26] (Aus $6.07 [US $3.77] per completion), and Batterham [9] (Aus $1.51 [US $0.94] per completion), the additional costs are explained by the nature of the research activities involved: those studies required one-off survey completions, as opposed to completion of a 29-day trial involving brief mental health interventions in Vibe Up. The US $4.20 per baseline survey completion reported for an RCT that recruited 560 Danes aged 15-25 years to evaluate an online youth mental health promotion service [13] was almost identical to the cost per eligible participant for Vibe Up (Aus $6.59 [US $4.10]).

More than 1.1 million impressions were counted during our recruitment drives, reaching 825,635 people. These numbers are difficult to achieve in a short period of time using flyers or posters, class announcements, and mailing lists. The ability to track the number of clicks on the study website (n=8138) and determine whether clicks came from different individuals (7858 unique clicks), in addition to the ability to alter the targeting setup (eg, tailoring advertisements to male students), makes social media recruitment via Facebook and Instagram an attractive proposition for researchers looking to recruit a large sample within a limited period of time. Further, click-through rates provided valuable insights into how engaging the study advertisements were to the target audience, allowing fine-tuning of recruitment materials during recruitment drives to maximize participant sign-up. According to WordStream [27], a good click-through rate for Facebook advertising averages approximately 0.90%. This study achieved an overall click-through rate of 0.73%, with the highest rates achieved during the COVID-19 lockdown (pilot 1: 1.49%; pilot 2: 1.76%). It is important to acknowledge that the success in recruiting young people for our trial also leveraged the well-established and trusted brand of the lead institution in the mental health space in Australia.

A spike in costs for minitrial 4 warranted our attention. Up until then, a “link click” objective was used, wherein Facebook and Instagram directed the study advertisements toward individuals most likely to click on them. A change to the “landing page view” objective for minitrial 4, in which individuals more likely to load the page content were targeted, resulted in higher costs. This tactic did not pay off, as Facebook and Instagram needed time and more money to relearn delivery after the change in objective. For subsequent trials, a conversion pixel on the “GET STARTED” button was implemented, targeting people most likely to participate. This was a cost-saving strategy, homing in on people more likely to take action. Although the ability to change strategies demonstrates advancements in social media recruitment for reaching specific target samples, it comes with additional financial costs that researchers should be aware of. In addition, fraudulent participants have increasingly infiltrated research recruitment via social media, especially in trials offering monetary reimbursement. The current study implemented several safeguards, including the use of time-based one-time passwords for app downloads and manual checking of participant details before inclusion in the study.

The first 2 recruitment drives coincided with COVID-19 lockdowns, providing us with a unique opportunity to assess whether social media recruitment was more cost-effective during the lockdown period compared to postlockdown. Comparisons between the 2 periods found that fewer funds were needed during lockdown to trigger interest from prospective participants and result in sign-ups. The COVID-19 lockdown may have facilitated recruitment, with students taking the opportunity to learn the new, evidence-based coping strategies for psychological distress offered by the trial. The planned recruitment period of 2 weeks was reduced to 7 days, with almost 300 participants recruited for the first pilot trial. Adjustments were made for pilot 2 (7-day recruitment period); however, intake was closed after 3 days, with sign-ups exceeding the target of 120-150 participants. The COVID-19 lockdown was the most cost-effective recruitment period, with fewer funds required per screened participant, per eligible participant, per app download, and per completed participant. The psychological impact of the lockdown may have prompted students to seek ways to improve their mental well-being. Even though recruitment periods for subsequent trials (postlockdown) remained consistent (3-5 days), more funds were expended to reach the target sample size. Costs per completed participant increased to between Aus $8 (US $4.97) and Aus $21 (US $13.06), compared to under Aus $3 (US $1.87) for the pilot trials. These opportunistic findings suggest that recruitment of young people may yield better results during their “downtime,” for example, during semester breaks or festive seasons, when they have the capacity to contribute to research. In addition, recruitment materials need to include information on the benefits to participants (for example, the opportunity to learn evidence-based coping strategies), and screening needs to exclude those unable to commit to the entire study (for example, by asking participants to indicate major disruptions or events in the next 2 months that may make it difficult to take part) to help boost retention rates.

Limitations

First, we were unable to accurately assign a monetary value to the traditional recruitment methods used for our trials (eg, posters or flyers and classroom announcements), so no direct comparisons of cost-effectiveness with social media recruitment were made. As a result, we are unable to comment on whether one recruitment method was more cost-effective than another in this study. Instead, the costs of social media recruitment were compared with previously published data to help determine whether this method should be pursued in future mental health AI-driven adaptive trials. Second, we struggled to correct the sex imbalance within our sample, which is consistent with the recruitment literature [28,29]. Even though more funds were allocated to the recruitment of male participants from minitrial 4 onward, we were unable to correct the skew. For future studies, user testing of tailored advertisements for male participants should be considered before the start of recruitment campaigns. Third, given that our participants were recruited from all over Australia, it was difficult to isolate which participants were in lockdown, as lockdown periods varied across the country. We categorized our trials based on dates published online by the Australian government [30]. Finally, the young people referred to in our study were all tertiary education students. Their characteristics and inclination to participate in research may differ from others in this age group (eg, those in full-time employment). Hence, our findings may not be generalizable to all young people in Australia but are likely to reflect those currently attending an Australian tertiary education institution.

Conclusions

Social media advertising on Facebook and Instagram is a productive method for recruiting young people into AI-driven adaptive digital mental health intervention trials. Given that repeated recruitment drives are necessary for this trial methodology (in our case, once every month for 2 pilot trials and 6 minitrials), we found social media to be more effective in reaching potential participants across Australia than traditional recruitment methods such as mailing lists and flyers or posters. In addition, advertising costs from this study compared favorably with other large-scale traditional trials, providing evidence that social media advertising can be effective for participant recruitment in future AI-driven adaptive trials. Online advertising costs were low during COVID-19 lockdowns and increased postlockdown, perhaps reflecting the added distress experienced by young people during lockdowns. Based on our experience, we encourage other researchers interested in running AI-driven adaptive trials to consider the use of social media advertising to recruit participants.

Acknowledgments

The authors would like to acknowledge contributions from Nicola Livingstone and Simon Goh to the study.

Authors' Contributions

WYZ, A Shvetcov, A Slade, ZJ, LH, AW, RL, SG, KH, ES, SV, KM, RV, JMN, and HC played key roles in the design and development of study processes. ES, KH, WYZ, ZJ, LH, RL, SV, KM, JMN, and HC developed the recruitment strategies with support from AW, SG, JR, SR, A Shvetcov, and A Slade. LH led the development of the technical trial platform, with support from SK, AA, JFK, SC, MS, and NM. All authors provided critical input throughout participant recruitment and the development of the manuscript. All authors reviewed the manuscript and provided comments. All authors approved the manuscript prior to publication.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Additional tables.

DOCX File , 43 KB

Multimedia Appendix 2

Vibe Up study advertisements.

DOCX File , 211 KB

  1. Pallmann P, Bedding AW, Choodari-Oskooei B, Dimairo M, Flight L, Hampson LV, et al. Adaptive designs in clinical trials: why use them, and how to run and report them. BMC Med. 2018;16(1):29. [FREE Full text] [CrossRef] [Medline]
  2. Harrer S, Shah P, Antony B, Hu J. Artificial intelligence for clinical trial design. Trends Pharmacol Sci. 2019;40(8):577-591. [FREE Full text] [CrossRef] [Medline]
  3. Lane TS, Armin J, Gordon JS. Online recruitment methods for web-based and mobile health studies: a review of the literature. J Med Internet Res. 2015;17(7):e183. [FREE Full text] [CrossRef] [Medline]
  4. Liu Y, Pencheon E, Hunter RM, Moncrieff J, Freemantle N. Recruitment and retention strategies in mental health trials—a systematic review. PLoS One. 2018;13(8):e0203127. [FREE Full text] [CrossRef] [Medline]
  5. Darmawan I, Bakker C, Brockman TA, Patten CA, Eder M. The role of social media in enhancing clinical trial recruitment: scoping review. J Med Internet Res. 2020;22(10):e22810. [FREE Full text] [CrossRef] [Medline]
  6. Morgan AJ, Jorm AF, Mackinnon AJ. Internet-based recruitment to a depression prevention intervention: lessons from the Mood Memos study. J Med Internet Res. 2013;15(2):e31. [FREE Full text] [CrossRef] [Medline]
  7. Watson NL, Mull KE, Heffner JL, McClure JB, Bricker JB. Participant recruitment and retention in remote eHealth intervention trials: methods and lessons learned from a large randomized controlled trial of two web-based smoking interventions. J Med Internet Res. 2018;20(8):e10351. [FREE Full text] [CrossRef] [Medline]
  8. Lee S, Torok M, Shand F, Chen N, McGillivray L, Burnett A, et al. Performance, cost-effectiveness, and representativeness of Facebook recruitment to suicide prevention research: online survey study. JMIR Ment Health. 2020;7(10):e18762. [FREE Full text] [CrossRef] [Medline]
  9. Batterham PJ. Recruitment of mental health survey participants using internet advertising: content, characteristics and cost effectiveness. Int J Methods Psychiatr Res. 2014;23(2):184-191. [FREE Full text] [CrossRef] [Medline]
  10. Sanchez C, Grzenda A, Varias A, Widge AS, Carpenter LL, McDonald WM, et al. Social media recruitment for mental health research: a systematic review. Compr Psychiatry. 2020;103:152197. [FREE Full text] [CrossRef] [Medline]
  11. Ford KL, Albritton T, Dunn TA, Crawford K, Neuwirth J, Bull S. Youth study recruitment using paid advertising on Instagram, Snapchat, and Facebook: cross-sectional survey study. JMIR Public Health Surveill. 2019;5(4):e14080. [FREE Full text] [CrossRef] [Medline]
  12. Fenner Y, Garland SM, Moore EE, Jayasinghe Y, Fletcher A, Tabrizi SN, et al. Web-based recruiting for health research using a social networking site: an exploratory study. J Med Internet Res. 2012;14(1):e20. [FREE Full text] [CrossRef] [Medline]
  13. Hoffmann SH, Paldam Folker A, Buskbjerg M, Paldam Folker M, Huber Jezek A, Lyngsø Svarta D, et al. Potential of online recruitment among 15-25-year olds: feasibility randomized controlled trial. JMIR Form Res. 2022;6(5):e35874. [FREE Full text] [CrossRef] [Medline]
  14. Salvy SJ, Carandang K, Vigen CL, Concha-Chavez A, Sequeira PA, Blanchard J, et al. Effectiveness of social media (Facebook), targeted mailing, and in-person solicitation for the recruitment of young adult in a diabetes self-management clinical trial. Clin Trials. 2020;17(6):664-674. [FREE Full text] [CrossRef] [Medline]
  15. Huckvale K, Hoon L, Stech E, Newby JM, Zheng WY, Han J, et al. Protocol for a bandit-based response adaptive trial to evaluate the effectiveness of brief self-guided digital interventions for reducing psychological distress in university students: the Vibe Up study. BMJ Open. 2023;13(4):e066249. [FREE Full text] [CrossRef] [Medline]
  16. Wilkins R, Vera-Toscano E, Botha F, Wooden M, Trinh TA. The household, income and labour dynamics in Australia survey: selected findings from waves 1 to 20. Melbourne Institute: Applied Economic & Social Research. 2022. URL: https://melbourneinstitute.unimelb.edu.au/__data/assets/pdf_file/0011/4382057/HILDA_Statistical_Report_2022.pdf [accessed 2024-12-12]
  17. Bower M, Smout S, Donohoe-Bales A, O'Dean S, Teesson L, Boyle J, et al. A hidden pandemic? An umbrella review of global evidence on mental health in the time of COVID-19. Front Psychiatry. 2023;14:1107560. [FREE Full text] [CrossRef] [Medline]
  18. Batra K, Sharma M, Batra R, Singh TP, Schvaneveldt N. Assessing the psychological impact of COVID-19 among college students: an evidence of 15 countries. Healthcare. 2021;9(2):222. [FREE Full text] [CrossRef] [Medline]
  19. Alesi M, Giordano G, Gentile A, Caci B. The switch to online learning during the COVID-19 pandemic: the interplay between personality and mental health on university students. Int J Environ Res Public Health. 2023;20(7):5255. [FREE Full text] [CrossRef] [Medline]
  20. Butnaru GI, Haller AP, Dragolea LL, Anichiti A, Tacu Hârșan GD. Students' wellbeing during transition from onsite to online education: Are there risks arising from social isolation? Int J Environ Res Public Health. 2021;18(18):9665. [FREE Full text] [CrossRef] [Medline]
  21. Liu C, McCabe M, Dawson A, Cyrzon C, Shankar S, Gerges N, et al. Identifying predictors of university students' wellbeing during the COVID-19 pandemic—a data-driven approach. Int J Environ Res Public Health. 2021;18(13):6730. [FREE Full text] [CrossRef] [Medline]
  22. Qualtrics LLC. URL: https://www.qualtrics.com/en-au/ [accessed 2024-12-14]
  23. Kessler RC, Andrews G, Colpe LJ, Hiripi E, Mroczek DK, Normand SL, et al. Short screening scales to monitor population prevalences and trends in non-specific psychological distress. Psychol Med. 2002;32(6):959-976. [CrossRef]
  24. van Spijker BA, Batterham PJ, Calear AL, Farrer L, Christensen H, Reynolds J, et al. The Suicidal Ideation Attributes Scale (SIDAS): community-based validation study of a new scale for the measurement of suicidal ideation. Suicide Life Threat Behav. 2014;44(4):408-419. [CrossRef] [Medline]
  25. Musiat P, Winsall M, Orlowski S, Antezana G, Schrader G, Battersby M, et al. Paid and unpaid online recruitment for health interventions in young adults. J Adolesc Health. 2016;59(6):662-667. [CrossRef] [Medline]
  26. Ash GI, Robledo DS, Ishii M, Pittman B, DeMartini KS, O'Malley SS, et al. Using web-based social media to recruit heavy-drinking young adults for sleep intervention: prospective observational study. J Med Internet Res. 2020;22(8):e17449. [FREE Full text] [CrossRef] [Medline]
  27. Irvine M. Facebook ad benchmarks for your industry [data]. WordStream. URL: https://www.wordstream.com/blog/ws/2017/02/28/facebook-advertising-benchmarks [accessed 2017-02-28]
  28. Ryan J, Lopian L, Le B, Edney S, Van Kessel G, Plotnikoff R, et al. It's not raining men: a mixed-methods study investigating methods of improving male recruitment to health behaviour research. BMC Public Health. 2019;19(1):814. [FREE Full text] [CrossRef] [Medline]
  29. Knox J, Morgan P, Kay-Lambkin F, Wilson J, Wallis K, Mallise C, et al. Male involvement in randomised trials testing psychotherapy or behavioural interventions for depression: a scoping review. Curr Psychol. 2022;42(34):1-16. [FREE Full text] [CrossRef] [Medline]
  30. Department of Education ECEC COVID-19 timeline: Australian Government Department of Education, Skills and Employment. URL: https://tinyurl.com/mvhaa7a3 [accessed 2021-12-19]


Abbreviations

AI: artificial intelligence
RCT: randomized controlled trial


Edited by T McCall; submitted 15.05.24; peer-reviewed by J Bourgaize, K Ishida; comments to author 27.10.24; revised version received 12.11.24; accepted 12.11.24; published 14.01.25.

Copyright

©Wu Yi Zheng, Artur Shvetcov, Aimy Slade, Zoe Jenkins, Leonard Hoon, Alexis Whitton, Rena Logothetis, Smrithi Ravindra, Stefanus Kurniawan, Sunil Gupta, Kit Huckvale, Eileen Stech, Akash Agarwal, Joost Funke Kupper, Stuart Cameron, Jodie Rosenberg, Nicholas Manoglou, Manisha Senadeera, Svetha Venkatesh, Kon Mouzakis, Rajesh Vasa, Helen Christensen, Jill M Newby. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 14.01.2025.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.