Published on 3.2.2022 in Vol 24, No 2 (2022): February

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/30880.
The Effect of Adjunct Telephone Support on Adherence and Outcomes of the Reboot Online Pain Management Program: Randomized Controlled Trial


Original Paper

1Clinical Research Unit for Anxiety and Depression, St Vincent's Hospital, Sydney, Australia

2Department of Pain Medicine, St Vincent's Hospital, Sydney, Australia

3Royal Ryde Rehabilitation Hospital, Sydney, Australia

4University of New South Wales, Sydney, NSW, Australia

Corresponding Author:

Tania Gardner, PhD

Clinical Research Unit for Anxiety and Depression, St Vincent's Hospital

406 Victoria St

Sydney, 2010

Australia

Phone: 61 0410449766

Email: taniagardner@optusnet.com.au


Background: Internet-based treatment programs present a solution for providing access to pain management for those unable to access clinic-based multidisciplinary pain programs. Attrition from internet interventions is a common issue. Clinician-supported guidance can be an important feature in web-based interventions; however, the optimal level of therapist guidance and expertise required to improve adherence remains unclear.

Objective: The aim of this study is to evaluate whether augmenting the existing Reboot Online program with telephone support by a clinician improves program adherence and effectiveness compared with the web-based program alone.

Methods: A 2-armed, CONSORT (Consolidated Standards of Reporting Trials)–compliant, registered randomized controlled trial with one-to-one group allocation was conducted. It compared a web-based multidisciplinary pain management program, Reboot Online, combined with telephone support (n=44) with Reboot Online alone (n=45) as the control group. Participants were recruited through web-based social media and the This Way Up service provider network. The primary outcome for this study was adherence to the Reboot Online program. Adherence was quantified through three metrics: completion of the program, the number of participants who enrolled into the program, and the number of participants who commenced the program. Data on adherence were collected automatically through the This Way Up platform. Secondary measures of clinical effectiveness were also collected.

Results: Reboot Online combined with telephone support had a positive effect on enrollment and commencement of the program compared with Reboot Online without telephone support. Significantly more participants from the Reboot Online plus telephone support group enrolled (41/44, 93%) into the course than those from the control group (35/45, 78%; χ²₁=4.2; P=.04). Furthermore, more participants from the intervention group commenced the course than those from the control group (40/44, 91% vs 27/45, 60%, respectively; χ²₁=11.4; P=.001). Of the participants enrolled in the intervention group, 43% (19/44) completed the course, and of those in the control group, 31% (14/45) completed the course. When considering the subgroup of those who commenced the program, there was no significant difference between the proportions of people who completed all 8 lessons in the intervention (19/40, 48%) and control groups (14/27, 52%; χ²₁=1.3; P=.24). The treatment efficacy on clinical outcome measures did not differ between the intervention and control groups.

Conclusions: Telephone support improves participants’ registration, program commencement, and engagement in the early phase of the internet intervention; however, it did not seem to have an impact on overall course completion or efficacy.

Trial Registration: Australian New Zealand Clinical Trials Registry ACTRN12619001076167; https://anzctr.org.au/Trial/Registration/TrialReview.aspx?ACTRN=12619001076167

J Med Internet Res 2022;24(2):e30880

doi:10.2196/30880


Background

The global prevalence of chronic pain is estimated to be 10% to 33% [1-3], and pain conditions continue to be a leading cause of worldwide disability [4]. The accepted best practice for managing chronic pain is a multidisciplinary pain program that involves attending a group program—typically convened face to face at a hospital or clinical site—facilitated by members of a multidisciplinary team [5].

A main challenge to the management of chronic pain is the lack of services available within existing health care systems. Access to multidisciplinary pain management groups can be compromised for people living in rural areas, those with family or work commitments, and those with mental health issues that limit their ability to engage and be involved in a social group environment [6]. Furthermore, because of the COVID-19 global pandemic, increased social isolation and reduced accessibility of face-to-face clinical interactions have further compromised access to required pain management services for many people [7]. Web-based treatment programs present a solution for providing access to pain management for those unable to access clinic-based multidisciplinary pain programs [6,7] and for delivering care at reduced personnel costs [7-9].

The evidence to date suggests that internet-based health interventions that provide a combination of information and some form of user support—such as decision management support, behavior change support, or social support—lead to improvements in knowledge as well as behavioral and clinical outcomes [8,10-13]. Generally, internet-based pain programs focus on unimodal interventions that offer a single discipline of therapy, most commonly psychological therapies such as internet-delivered cognitive behavioral therapy (CBT) [14]. These programs typically show small to moderate effects on pain-related disability and interference as well as pain catastrophizing and are a cost-effective alternative to face-to-face therapy [13,14].

Reboot Online was one of the first internet-based pain management programs to offer a multidisciplinary model. The program has been shown to significantly improve pain self-efficacy and reduce pain-related disability, kinesiophobia, and psychological distress compared with usual care [15,16]. These studies were carried out within a strict clinical trial paradigm. Although the program was highly effective for participants who completed it, adherence was moderate, with only 61% of trial participants completing all the lessons. Adherence (completion of all 8 lessons of the program) and effectiveness (significant changes in clinical outcome measures) of the program in real-world conditions were subsequently evaluated [17]. Without the regulated framework and rigor embedded within a clinical trial design, adherence to the Reboot Online program fell to 41% under routine care conditions.

Attrition, or dropout of study participants, is a common problem for internet-based treatments [18,19]. In a systematic review of internet interventions for chronic pain [19], the range in attrition levels across published trials was considerable, from 4% to 54%. Different methods have been used to prevent dropout, such as telephone support, personalized reminders and feedback, and financial incentives [19]. However, it remains unclear how helpful these methods are [19].

Clinician-supported guidance can be an important feature in web-based interventions for mental health disorders; however, the optimal level and pattern of therapist guidance required to improve adherence remain unclear [20]. Human social support, whether peer-led or led by a specialized clinician, has been suggested to help support healthy behavior change [21,22] and has been shown to have small effects on improving adherence compared with unguided interventions for individuals with chronic pain [22-25]. The addition of clinician-supported guidance has not yet been tested in routine care using the Reboot Online program.

Objectives

We aim to evaluate whether augmenting the existing Reboot Online program with telephone support by a clinician improves program adherence and effectiveness compared with the web-based program alone. We hypothesized that the Reboot Online plus telephone support group would show better adherence (defined as a higher number of enrollments, a higher number of commencements, and higher completion rates) to the program than the control group, who received the Reboot Online program only.


Study Design

The study was a 2-armed, CONSORT (Consolidated Standards of Reporting Trials)–compliant, registered randomized controlled trial with one-to-one group allocation. It compared a web-based multidisciplinary pain management program, Reboot Online, combined with telephone support with Reboot Online alone as the control group.

The study was approved by the human research ethics committee of St Vincent’s Hospital, Sydney, Australia (2019/ETH08682). The trial was prospectively registered on the Australian New Zealand Clinical Trials Registry (ACTRN12619001076167), and the protocol was followed as per the registry.

Participants

Participants were recruited between September 2019 and April 2020 through social media advertisements (eg, Facebook and Twitter) and through the This Way Up service provider network. To replicate real-world conditions, the trial was run through the established web-based platform through which the Reboot Online program is available: This Way Up [26]. Applicants completed a 2-step screening process. All applicants first completed web-based screening questionnaires about their chronic pain symptoms and demographic details, followed by a telephone interview to confirm their chronic pain diagnosis and study eligibility. The major depressive disorder (MDD) and risk assessment modules of the Mini International Neuropsychiatric Interview (version 5.0.0) [27] were also administered during the telephone interview as a screening and diagnostic tool for depression. The MDD module allowed us to compare the sample characteristics with those of our previous trials and to establish the number of individuals with comorbid depression, which also informed clinical support. Together with the risk assessment module, it also guided how the clinician prioritized and triaged calls and ensured that participants at risk were appropriately supported throughout the program.

Applicants were eligible for inclusion if they (1) had experienced pain for >3 months, (2) were aged ≥18 years, (3) were a resident of Australia, (4) had their pain assessed by a physician in the past 3 months, (5) were prepared to provide contact details of their general practitioner, (6) had access to the internet and a computer or tablet and deemed themselves to have basic computer literacy, and (7) were fluent in written and spoken English. Applicants were excluded if they had psychosis or bipolar disorder, or were actively suicidal (eg, demonstrated intent or a plan, or had a recent suicide attempt). These exclusion criteria were applied to be consistent with our other trials and to ensure that participants' needs could be adequately evaluated and that they could be assisted to access more intensive clinical support if required. Participants were also excluded if they had participated in a group-based pain management program within the last 6 months, so that their prior treatment or learning would not confound the results. Participants were also ineligible for the study if they had surgical procedures scheduled in the 6 months after their application; had commenced or made significant changes to their psychotropic medication in the 2 months before the intake assessment; or had commenced face-to-face CBT within 4 weeks of the intake assessment. These criteria were applied because of their potential to confound the results: we wanted to explore the effects of clinician support on program adherence and outcomes while minimizing these confounding influences.

Participants provided electronic informed consent before being enrolled in the study; they also provided permission for their pain physician (or general practitioner) to be notified by the study team detailing their involvement in the trial. Measures of psychological distress (Kessler-10 Psychological Distress Scale [28,29]) were monitored throughout the program for any risk of harm.

Randomization

The eligible participants were randomly assigned to either the intervention or control group in a 1:1 allocation based on a computer-generated random number sequence [30]. An individual web-based link to the baseline questionnaires in REDCap (Research Electronic Data Capture [31]) was sent to the eligible participants to complete. Once the initial baseline questionnaires were completed, an individual registration link to Reboot Online through the usual This Way Up process was sent to the participants, which gave them access to register for the course, complete pretreatment questionnaires, and commence the first lesson. Telephone screening interviewers at baseline remained blinded to the participants' group allocation. Study staff members conducting the telephone support intervention could not be blinded to group allocation. Group allocation was indicated to the participant by whether they received telephone support after registration into the program.
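
The allocation sequence itself came from a web-based random number service [30]. For illustration only, the sketch below shows one conventional way to produce a 1:1 sequence in Python using permuted blocks; the block size and seed are hypothetical and this is not the procedure used by that service.

```python
import random

def allocation_sequence(n_participants: int, block_size: int = 4,
                        seed: int = 2019) -> list[str]:
    """Illustrative permuted-block 1:1 allocation.

    The trial used a sequence from a web-based randomization service [30];
    the block size and seed here are hypothetical.
    """
    rng = random.Random(seed)
    sequence: list[str] = []
    while len(sequence) < n_participants:
        block = ["intervention", "control"] * (block_size // 2)
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[:n_participants]

# Example: allocate 90 places 1:1 between Reboot Online plus telephone
# support ("intervention") and Reboot Online alone ("control")
print(allocation_sequence(90)[:8])
```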

Intervention

Reboot Online Program (Control)

Participants received free access to the standard Reboot Online program [32]. Details of the content and design of the program have been previously outlined [16]. In brief, Reboot Online consists of 8 lessons delivered over 16 weeks. Participants follow an illustrated story of a fictional character who learns to self-manage her chronic pain using a multidisciplinary approach. The content delivers psychoeducation on the socio-psycho-biomedical nature of chronic pain within a multidisciplinary framework. The core lessons are combined with a movement station section: a graded, patient-led exercise program focusing on activity and exercise reactivation. This is combined with pacing: an activity-management technique aimed to maximize a person’s activity by developing a structured, slow, and gradual planned increase in activity levels over a period of time. The level of activity is dependent on the planned quantity rather than pain intensity experienced by the patient. The program also incorporates SMART (specific, measurable, achievable, relevant, and timely) goal-setting principles, whereby the patient leads the process of identifying issues or problems that matter most to them, sets the goal that they want to accomplish, and develops strategies and goals that are specific, measurable, achievable, relevant, and within a time frame. This is coupled with evidence-based CBT skills activities, including thought challenging, activity planning, problem solving, effective communication, and flare-up management. Participants also have access to expert educational videos as well as tai chi and relaxation audio guides. See Table 1 for details of the lesson content of the program. All participants received automatic email communication to notify them when a lesson was available and to encourage lesson completion as well as engaging in activity.

Table 1. Reboot Online lesson content.
Lesson title | Lesson content | Homework activities | Lesson resources (PDF or video)
Lesson 1: What Is Chronic Pain and What Is the Best Way to Manage It?
  • Chronic pain explained
  • The Brick Wall of Chronic Pain
  • Medication overview
  • The movement station
  • Review of acute vs chronic pain
  • Complete own Brick Wall of Chronic Pain
  • Instructions for the movement station
  • Welcome (video)
  • Medication management (video)
Lesson 2: Goal-Setting and Moving Toward Acceptance
  • Scans, test results, and pain
  • The cycle of chronic pain
  • Moving toward acceptance
  • SMARTa goals
  • Reviewing the cycle of pain
  • Identifying acceptance, change, and goals
  • Setting short- and long-term SMART goals
  • Medical imaging (video)
  • Making life changes (PDF)
Lesson 3: Movement, Pacing, and Daily Activity Scheduling
  • Exploring the relationship between pain and activity
  • Learning about the “Boom Bust” pattern
  • Fear avoidance beliefs
  • Importance of pacing
  • Introduction to daily activity scheduling
  • Identifying healthy vs unhealthy coping skills
  • Pacing activity
  • Keeping a movement diary
  • Daily activity scheduling (PDF)
Lesson 4: Monitoring Thoughts and Recognizing Unhelpful Thinking Patterns
  • Revision of chronic pain cycle and the link among thoughts, feelings, and behaviors
  • Recognizing unhelpful thinking patterns
  • Monitoring thoughts and learning to think more helpful thoughts
  • Ways to recognize own unhelpful thinking patterns
  • Complete thought record
  • Continue to monitor and track thoughts
b
Lesson 5: Mood and Pain, Thought Challenging, and Managing Arousal
  • Challenging unhelpful thinking
  • Activity planning in practice
  • Emotions and pain
  • Strategies to manage anxiety
  • Mood
  • Thought-challenging situations
  • Activity planning and monitoring
  • Managing anger: looking out for triggers and learning strategies
  • Practicing relaxation
  • My Thought Challenging Worksheet (PDF)
Lesson 6: Stress Management and Getting Better Sleep
  • What is stress?
  • How to manage stress better
  • Using problem solving
  • Barriers to good sleep with chronic pain and ways to get better sleep
  • Recognizing sources of stress and own signs and symptoms
  • Structured problem-solving task
  • Sleep diary to improve habits
  • Good sleep guide (PDF)
  • Better sleep for chronic pain (video)
Lesson 7: Communication and Relationships
  • What is good communication?
  • Communication styles
  • Relationships with others
  • Ways to improve those relationships
  • Communication skills practice
  • Task: select a relationship and explore how you would like that relationship to be different
  • Revisiting thought-challenging exercise
  • Conversation skills tips (PDF)
  • Chronic pain and information for family and friends (PDF)
Lesson 8: Managing Flare-ups and Continuing Management of Chronic Pain
  • Exploring flare-ups, relapse, and the need for a flare-up plan
  • Continuing chronic pain management
  • How to get further help
  • Ten things to help during a relapse
  • Flare-up prevention plan
  • Summary
  • What is a pain clinic? (video)
  • Congratulations (video)

aSMART: specific, measurable, achievable, relevant, and timely.

bNot available.

Reboot Online Program With Telephone Support (Intervention)

Participants received free access to the standard Reboot Online program, as outlined in the control intervention, in conjunction with telephone support. They received a phone call every fortnight for the duration of the program (maximum of 8 phone calls). There was a set maximum of 3 attempts if contact by phone was unsuccessful. The phone call was conducted by a single clinician (senior allied health clinician) experienced in the management of chronic pain. To replicate what may occur in routine clinical care, we sought a flexible approach whereby the duration of the phone call was led by the needs of the patient. During the call, the participants were advised that the call was to check in and to see how their program was going. They were asked to report on their progress and encouraged to continue engaging with the program. The participants were also given the opportunity to discuss any challenges or hurdles they were experiencing, to problem-solve possible strategies to overcome them, and to receive feedback on their progress.

For both treatment groups, senior clinicians experienced in chronic pain management (senior pain physiotherapist TG and pain psychologist JW) monitored questionnaire responses; if any responses indicated deterioration in well-being, phone contact was made with the participant and further clinical intervention was advised if indicated.

Outcome Measures

Participants were assessed using a suite of outcome measures, collected at three time points: baseline (immediately before commencing treatment), after treatment (weeks 16-17) and follow-up (3 months after completing posttreatment questionnaires).

The primary outcome for this study was adherence to the Reboot Online program. Adherence was quantified through three metrics: (1) completion of the program assessed through the number of participants who completed all 8 lessons of the intervention and who were classified as completers, whereas those who did not complete all 8 lessons were classified as noncompleters; (2) the number of participants who enrolled to undertake the program; and (3) the number of participants who commenced at least one lesson of the program. The rate of overall program completion was the primary outcome measure for adherence. These adherence measures were chosen to capture three key potential points where patients may drop out: (1) enrolling after being prescribed the course, (2) after completing baseline questionnaires and commencing the course, and (3) once engaged in the program after each lesson. Data on adherence were collected automatically through the This Way Up platform.
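
As a minimal sketch of how these three metrics can be computed from platform usage records, the snippet below assumes a simplified per-participant record (an enrollment flag and a count of completed lessons); the field names are hypothetical stand-ins for the This Way Up export, whose actual structure is not described here.

```python
from dataclasses import dataclass

@dataclass
class ParticipantRecord:
    # Hypothetical fields standing in for the This Way Up platform export
    enrolled: bool          # registered for the course after it was prescribed
    lessons_completed: int  # number of the 8 lessons finished (0-8)

def adherence_summary(records: list[ParticipantRecord]) -> dict[str, int]:
    """Count the three adherence metrics used in the trial: enrollment,
    commencement (at least 1 lesson), and completion (all 8 lessons)."""
    return {
        "enrolled": sum(r.enrolled for r in records),
        "commenced": sum(r.lessons_completed >= 1 for r in records),
        "completed": sum(r.lessons_completed == 8 for r in records),
    }

# Example shaped like the intervention-group totals reported in the Results
records = (
    [ParticipantRecord(True, 8)] * 19    # completers
    + [ParticipantRecord(True, 1)] * 21  # commenced but did not finish
    + [ParticipantRecord(True, 0)]       # enrolled, never commenced
    + [ParticipantRecord(False, 0)] * 3  # prescribed but never enrolled
)
print(adherence_summary(records))  # {'enrolled': 41, 'commenced': 40, 'completed': 19}
```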

The secondary outcomes were as follows:

  1. Pain Self-Efficacy Questionnaire [33] to assess participants’ confidence to perform activities while in pain, with higher scores indicating greater confidence in functional capacity. A minimally clinically important difference (MCID) was considered to be 5.5 [34].
  2. Tampa Scale for Kinesiophobia [35] to measure fear and avoidance of movement. An MCID was considered to be 6 [36].
  3. Brief Pain Inventory [37] to assess pain severity and its impact on function through its 2 subscales for severity and pain interference. An MCID was considered to be 2 [38].
  4. Pain Disability Index [39] to assess the degree to which chronic pain interferes with participants’ daily activities and essential life activities. An MCID was considered to be between 8.5 and 9.5 [28].
  5. International Physical Activity Questionnaire [29,40] to measure self-reported physical activity in the previous 7 days. Metabolic equivalents and categorical levels of physical activity (low, moderate, and high, per International Physical Activity Questionnaire short form scoring) were calculated using a preformed Microsoft Excel program [41] (an illustrative scoring sketch is given after this list).
  6. Kessler-10 Psychological Distress Scale [42,43] to measure psychological distress.
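
For context, the continuous International Physical Activity Questionnaire short-form score is a weighted sum of MET-minutes per week. The sketch below uses the standard short-form MET weights (3.3 for walking, 4.0 for moderate activity, 8.0 for vigorous activity) and is a simplified stand-in for the Excel scoring program cited above [41]; it omits the protocol's data-cleaning and truncation rules and the full categorical (low, moderate, high) algorithm.

```python
def ipaq_met_minutes(walk_days: int, walk_min: float,
                     mod_days: int, mod_min: float,
                     vig_days: int, vig_min: float) -> float:
    """Continuous IPAQ short-form score in MET-minutes per week.

    Uses the standard short-form MET weights (walking 3.3, moderate 4.0,
    vigorous 8.0). Simplified: the official protocol's cleaning and
    truncation rules and the categorical algorithm are not reproduced.
    """
    return (3.3 * walk_min * walk_days
            + 4.0 * mod_min * mod_days
            + 8.0 * vig_min * vig_days)

# Example: 30 minutes of walking on 5 days plus 20 minutes of moderate
# activity on 2 days
print(ipaq_met_minutes(5, 30, 2, 20, 0, 0))  # 655.0
```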

Self-reported data on clinical outcome measures were collected through the This Way Up platform as well as the REDCap tool hosted at St Vincent’s Hospital, Sydney [44,45]. REDCap is a secure, web-based software platform designed to support data capture for research studies, providing (1) an intuitive interface for validated data capture, (2) audit trails for tracking data manipulation and export procedures, (3) automated export procedures for seamless data downloads to common statistical packages, and (4) procedures for data integration and interoperability with external sources.

Telephone Support Time and Satisfaction

Descriptive data pertaining to the amount of clinician contact and time spent conducting the telephone support calls were collected for all participants in the intervention group. Descriptive data on treatment satisfaction and perceived usefulness of the intervention were collected from all participants (both groups) during the posttreatment assessments. Additional user feedback was collected from the intervention group regarding how helpful and motivating they found the telephone support calls received as part of the intervention. Data on subjective participant feedback were quantitative in nature (collected on a rating scale from 1 to 10) and were analyzed descriptively through median values and IQRs (because of nonparametric data distribution).

Statistical Methods

Before commencement of the study, a power calculation was conducted to determine the minimum sample size required to detect a difference in the proportion of participants adhering to the course between the intervention and control groups. This was based on previous investigations [16], where 60% of the participants were observed to have adhered to the complete program under control conditions, and an increase of 25% in adherence was predicted for the telephone support intervention group. Assuming a power of 80% and a significance level (α) of .05, a minimum sample size of 39 participants in each group was needed to detect a 25% difference in the proportion of completers. To account for an anticipated dropout rate of approximately 10%, a total of 44 participants were required in each study arm.
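
The sketch below illustrates the standard normal-approximation formula for comparing two proportions under the assumptions stated above (60% vs a predicted 85% completion, 80% power, α=.05). Whether the original calculation was one- or two-sided, or applied a continuity correction, is not reported; with a one-sided test and no continuity correction, the formula yields approximately 39 per group.

```python
import math
from scipy.stats import norm

def n_per_group(p1: float, p2: float, alpha: float = 0.05,
                power: float = 0.80, one_sided: bool = True) -> int:
    """Sample size per group for comparing two proportions
    (normal approximation, no continuity correction).

    Sidedness is an assumption: one_sided=True gives roughly the
    published figure of 39 per group for 0.60 vs 0.85.
    """
    z_a = norm.ppf(1 - alpha) if one_sided else norm.ppf(1 - alpha / 2)
    z_b = norm.ppf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# 60% completion under control conditions, predicted 85% with telephone support
print(n_per_group(0.60, 0.85))  # 39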

Descriptive statistics were used to characterize the demographic and clinical features of the study groups at baseline. Data on the time taken to deliver the telephone support and subjective participant feedback on the telephone support and web-based aspects of the Reboot Online program were also analyzed descriptively.

Cross-tabulations were used to examine between-group differences in the primary outcome of adherence. Chi-square analyses were performed with binary factors of group (intervention and control) and adherence (yes or no) for each of the identified adherence metrics (enrollment, course commencement, and course completion). A Mann–Whitney U test was used to compare the median number of lessons completed by the intervention and control groups.
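
As an illustration of the adherence comparisons, the 2×2 chi-square test for enrollment can be reproduced from the counts reported in the Results; the assumption that no continuity correction was applied is ours, chosen because it matches the reported statistic.

```python
from scipy.stats import chi2_contingency

# Enrollment counts reported in the Results: intervention 41/44, control 35/45
table = [[41, 44 - 41],   # intervention: enrolled, not enrolled
         [35, 45 - 35]]   # control: enrolled, not enrolled

# correction=False (no Yates continuity correction) is an assumption that
# matches the reported chi-square of 4.2 (P=.04)
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(round(chi2, 1), round(p, 2), dof)  # 4.2 0.04 1
```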

To examine treatment efficacy, intention-to-treat linear mixed models were implemented separately for each of the secondary outcome measures. This form of analysis can robustly account for the unbalanced nature of repeated measures data, including missing data because of participant dropout. Each model included group, time, and a group-by-time interaction as fixed factors. A random effect of participant was also included in each model, with an identity covariance structure used to model the covariance structure of the random intercept. Model fit indices supported the selection of an unstructured covariance matrix for each of the outcome measures. Significant treatment effects were followed with pair-wise comparisons of their estimated marginal means. For each outcome, within-group effect sizes (Hedges g, adjusted for the correlation among time points) were calculated from the estimated marginal means, corresponding to the changes from before to after treatment and from after treatment to follow-up. Between-group effect sizes for each outcome after treatment and at follow-up were also calculated. Effect sizes of <0.50, 0.50-0.79, and ≥0.80 were considered small, moderate, and large, respectively [46].
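
A simplified sketch of the mixed-model analysis for a single outcome is shown below, assuming a long-format data set with hypothetical column names (participant, group, time, pseq). The published analysis was run in SPSS with the covariance structures described above; the statsmodels random-intercept model here approximates, but does not exactly reproduce, that specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per participant per time point.
# The file name and column names (participant, group, time, pseq) are
# hypothetical stand-ins for the trial data set.
df = pd.read_csv("reboot_outcomes_long.csv")

# Random-intercept model with group, time, and group-by-time fixed effects.
# This is a simplified stand-in for the SPSS linear mixed models described
# above; it does not reproduce the unstructured covariance specification.
model = smf.mixedlm("pseq ~ C(group) * C(time)", data=df,
                    groups=df["participant"])
result = model.fit(reml=True)
print(result.summary())
```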

Statistical analyses were conducted using SPSS software (version 25; IBM Corp), and results were considered significant where P<.05.


Baseline Sample Characteristics

Baseline and sample characteristics are outlined in Table 2. Demographic data were available for all participants (N=89) who met the inclusion criteria. The participants had a mean age of 49.3 years (SD 16.1; range 21-86 years), and most of them were women (59/89, 66%). At baseline, 42% (37/89) of the participants reported taking simple analgesia for pain (nonopioid analgesics such as paracetamol and nonsteroidal anti-inflammatory drugs), 27% (24/89) took gabapentinoids, 43% (38/89) took antidepressant medication, and 44% (39/89) took opioid analgesia. Approximately 1 in 3 participants (28/89, 32%) met the criteria for MDD at the time of study enrollment.

Table 2. Baseline demographics (N=89).
Characteristic | Total sample (N=89) | Intervention (n=44) | Control (n=45) | Test statistic | P value
Age (years), mean (SD) | 49.3 (16.1) | 48.0 (15.8) | 50.6 (16.4) | t₈₄=–0.74 | .46
Gender, n (%) | | | | χ²₂=2.2 | .34
  Male | 28 (32) | 13 (30) | 15 (33) | |
  Female | 59 (66) | 29 (66) | 30 (67) | |
  Nonbinary | 2 (2) | 2 (5) | 0 (0) | |
Rural status, n (%) | | | | χ²₁=0.9 | .17
  Major city | 52 (58) | 29 (66) | 23 (51) | |
  Regional or remote | 37 (42) | 15 (34) | 22 (49) | |
Major depressive disorder, n (%) | 28 (32) | 16 (36) | 12 (27) | χ²₁=1.0 | .33
Opioid analgesia, n (%) | 39 (44) | 14 (33) | 25 (56) | χ²₁=4.7 | .03a
Antidepressants, n (%) | 38 (43) | 20 (47) | 18 (40) | χ²₁=0.4 | .54
Anticonvulsants, n (%) | 24 (27) | 13 (30) | 11 (24) | χ²₁=0.4 | .54
Benzodiazepines, n (%) | 16 (18) | 8 (19) | 8 (18) | χ²₁=0.0 | .92
Simple analgesia, n (%) | 44 (49) | 18 (41) | 26 (58) | χ²₁=2.5 | .11

aSignificant at P<.05.

Adherence

Participant flow through the trial is illustrated in Figure 1. Reboot Online combined with telephone support had a positive effect on enrollment and commencement of the program compared with Reboot Online without telephone support. Significantly more participants from the Reboot Online plus telephone support group enrolled (41/44, 93%) into the course than those from the control group (35/45, 78%; χ²₁=4.2; P=.04). Furthermore, more participants from the intervention group commenced the course than those from the control group (40/44, 91% vs 27/45, 60%, respectively; χ²₁=11.4; P=.001).

The median number of lessons completed by those in the intervention versus control groups was significantly different; the intervention group completed a median of 6 (IQR 3-8) lessons compared with a median of 2 (IQR 0-8) lessons in the control group (Mann–Whitney U=682; P=.009). More participants from the intervention group completed at least half the course (≥4 lessons; χ²₁=5.0; P=.03) than those from the control group. The overall completion rate for both groups was 37% (33/89). Of the participants enrolled in the intervention group, 43% (19/44) completed the course, and of those in the control group, 31% (14/45) completed the course. When considering the subgroup of those who commenced the program, the overall course completion was 49% (33/67) from both groups; however, there was no significant difference between the proportions of people who completed all 8 lessons in the intervention (19/40, 48%) versus control groups (14/27, 52%; χ²₁=1.4; P=.24).

Figure 1. Participant flowchart.

Effectiveness in Clinical Outcomes

Estimated marginal means at pretreatment, posttreatment, and 3-month follow-up, as well as within- and between-group effect sizes for all outcome measures, are shown in Table 3. Significant main effects of time, showing improvements from before to after treatment, were observed for measures of pain self-efficacy (F₂,₄₈=16.4; P<.001), kinesiophobia (F₂,₅₂=10.3; P<.001), pain interference (F₂,₅₃=12.4; P<.001), pain disability (F₂,₅₀=10.8; P<.001), and psychological distress (F₂,₄₆=9.6; P<.001). There was no significant main effect of time for measures of pain severity (F₂,₅₃=2.0; P=.15) or physical activity (F₂,₅₅=1.8; P=.19). Within each group, posttreatment improvements were maintained at the 3-month follow-up assessment, with no significant changes in outcomes between posttreatment and follow-up values.

There was no significant group-by-time interaction observed for any outcome measure, indicating that treatment efficacy on these outcome measures did not differ between the intervention and control groups.

Table 3. Estimated marginal means (EMMs) before and after treatment and at follow-up and within- and between-group effect sizes.
Outcome and group | Before treatment, EMM (SD) | After treatment, EMM (SD) | Follow-up, EMM (SD) | Pre- to posttreatment r | Pre- to posttreatment within-group Hedges g (95% CI) | Posttreatment to follow-up r | Posttreatment to follow-up within-group Hedges g (95% CI) | Posttreatment between-group Hedges g (95% CI) | Follow-up between-group Hedges g (95% CI)
PSEQa, intervention | 26.9 (10.9) | 33.9 (12.0) | 33.8 (12.3) | 0.73 | 0.61 (0.08 to 1.14)b | 0.92 | 0.01 (–0.57 to 0.59) | 0.43 (–0.12 to 0.99) | 0.32 (–0.26 to 0.89)
PSEQ, control | 28.7 (10.0) | 38.9 (10.5) | 37.7 (11.9) | 0.37 | 0.98 (0.28 to 1.67)b | 0.62 | 0.10 (–0.55 to 0.76) | N/Ac | N/A
TSKd, intervention | 37.4 (6.2) | 33.1 (8.0) | 33.2 (7.1) | 0.60 | 0.60 (0.07 to 1.12)b | 0.74 | 0.01 (–0.57 to 0.58) | 0.50 (–0.05 to 1.04) | 0.23 (–0.33 to 0.80)
TSK, control | 40.6 (5.9) | 36.9 (7.2) | 34.8 (7.1) | 0.38 | 0.54 (–0.11 to 1.19)e | 0.72 | 0.29 (–0.35 to 0.93) | N/A | N/A
BPIf severity, intervention | 5.4 (1.4) | 5.1 (1.5) | 5.3 (1.4) | 0.63 | 0.25 (–0.27 to 0.76) | 0.81 | 0.19 (–0.39 to 0.77) | 0.04 (–0.50 to 0.57) | 0.36 (–0.21 to 0.94)
BPI severity, control | 5.4 (1.3) | 5.0 (1.4) | 4.8 (1.4) | 0.66 | 0.28 (–0.36 to 0.91) | 0.80 | 0.13 (–0.50 to 0.77) | N/A | N/A
BPI interference, intervention | 6.4 (1.8) | 5.3 (2.1) | 4.9 (2.2) | 0.75 | 0.59 (0.06 to 1.11)b | 0.78 | 0.15 (–0.43 to 0.73) | 0.01 (–0.5 to 0.55) | 0.21 (–0.36 to 0.78)
BPI interference, control | 6.2 (1.7) | 5.2 (1.9) | 4.5 (2.2) | 0.22 | 0.53 (–0.12 to 1.18) | 0.75 | 0.36 (–0.28 to 1.00) | N/A | N/A
PDIg, intervention | 40.1 (12.2) | 30.9 (15.6) | 31.0 (16.2) | 0.69 | 0.65 (0.12 to 1.18)b | 0.63 | 0.01 (–0.57 to 0.59) | 0.07 (–0.47 to 0.60) | 0.23 (–0.34 to 0.80)
PDI, control | 38.2 (11.5) | 31.9 (13.9) | 27.3 (15.8) | 0.52 | 0.49 (–0.16 to 1.13)e | 0.82 | 0.31 (–0.33 to 0.94) | N/A | N/A
IPAQh, intervention | 1571.9 (1709.8) | 2783.4 (2790.3) | 2353.7 (2391.5) | 0.68 | 0.53 (0.01 to 1.05)e | 0.56 | 0.17 (–0.41 to 0.75) | 0.18 (–0.37 to 0.74) | 0.24 (–0.33 to 0.81)
IPAQ, control | 2176.6 (1472.6) | 2278.5 (2698.4) | 1766.6 (2525.9) | 0.49 | 0.05 (–0.54 to 0.64) | 0.70 | 0.19 (–0.40 to 0.78) | N/A | N/A
K10i, intervention | 27.3 (6.8) | 23.2 (7.5) | 23.6 (8.7) | 0.76 | 0.57 (0.04 to 1.09)b | 0.71 | 0.05 (–0.53 to 0.63) | 0.07 (–0.49 to 0.63) | 0.02 (–0.54 to 0.59)
K10, control | 26.4 (5.9) | 22.7 (6.3) | 23.4 (8.3) | 0.40 | 0.59 (–0.12 to 1.30)e | 0.71 | 0.09 (–0.58 to 0.77) | N/A | N/A

aPSEQ: Pain Self-Efficacy Questionnaire.

bSignificance at P<.001.

cN/A: not applicable.

dTSK: Tampa Scale for Kinesiophobia.

eSignificance at P<.05.

fBPI: Brief Pain Inventory.

gPDI: Pain Disability Index.

hIPAQ: International Physical Activity Questionnaire.

iK10: Kessler-10 Psychological Distress Scale.

Telephone Support Sessions

Over the course of the study, all participants in the telephone support group (n=44) had at least one telephone support session, and 30% (13/44) had all 8 support sessions, with a mean of 5.7 (SD 2.4) sessions per participant. The total cumulative telephone time for the intervention group was 3240 minutes, which comprised multiple attempts to make telephone contact as well as successful telephone support sessions; the overall active telephone time in which the clinician was engaged with the participant was 1780 minutes (54.94% of the total). This equated to a mean total of 73.6 (SD 31.8) minutes of clinician time per participant over the entire program, of which 40.5 (SD 26.8) minutes was active time spent engaging with the participant. The mean clinician time per active session was 7.9 (SD 5.5) minutes. No additional time was spent following up participants in either group because of acute deterioration in well-being.

Treatment Satisfaction

When asked to rate their satisfaction with the treatment received in this study on a scale from 0 to 10, the participants provided a median rating of 7 (IQR 5-8). Furthermore, when asked to rate how helpful they found the study intervention for managing their chronic pain, the participants responded with a median rating of 8 (IQR 5-9) out of 10. There was no significant difference between the groups for scores of either satisfaction or how helpful they found the Reboot Online program. Of the 44 participants in the intervention group, 35 (79%) reported that the telephone support made it easier for them to participate in or complete the Reboot Online program. When asked how motivating or encouraging they found the telephone support aspect of their treatment specifically (rated out of 10), the median score of the participants was 7 (IQR 5-9).


Principal Findings

This study aimed to investigate whether additional clinician guidance, in the form of telephone support, would significantly enhance adherence to the Reboot Online program for chronic pain and improve clinical outcomes. We found that Reboot Online combined with telephone coaching resulted in improved rates of enrollment and commencement (onboarding) of the program compared with the usual Reboot Online program. The overall completion rate across both groups was 37% (33/89; intervention 19/44, 43%, and control 14/45, 31%). The overall completion rate for those who commenced the program from both groups was 49% (33/67); however, there was no significant difference in the completion rate between the groups (intervention 19/40, 48%; control 14/27, 52%) once the program was commenced. These completion rates are consistent with our real-world adherence outcome of 41% [17] and are broadly consistent with published attrition rates of 4% to 54% [19]. Furthermore, there was no significant group difference in treatment efficacy on a variety of outcome measures.

The most effective way to provide clinician guidance adjunct to other interventions remains unclear. Little is known as to whether providing guidance evenly timed over the entire course of an intervention is most effective or whether initial support gradually tapered to self-management would be more effective [18]. Our results showed a significant difference between the groups in the rates of enrollment and commencement of the program—the onboarding phase. Once participants commenced the program, there was no significant difference in completion rates between the groups, regardless of phone support. This would suggest that telephone support is most effective in improving the onboarding of persons with chronic pain undertaking web-based treatments. The average amount of time involved for each telephone session (including time spent calling to contact participants together with time of consult) was 9.2 (SD 6.0) minutes, with a mean active consult time of 7.9 (SD 5.5) minutes. Thus, the telephone support intervention did not seem onerous and therefore could be easily incorporated into a clinical caseload. When considering where to best use clinical resources, clinician time may be best used in the early phase of onboarding to maximize adherence.

The telephone support provided in this study was delivered by experienced allied health clinicians with specialized training in pain management. The content of the telephone support calls was basic and consisted mainly of reaffirmation of the course material as well as feedback on progress and suggestions to address any barriers that the participants were experiencing. There was no specialized psychological treatment or intervention to address mood or cognition delivered through the calls themselves; nor was there detailed clinical education or advice on movement and physical activity. As the support did not provide specialized intervention, in future it could be provided by a trained peer or junior clinician to streamline the provision of clinical resources, which may be adequate for patients who are not clinically complex. Other studies have shown little difference between clinician and peer-led coaching.

It has been suggested that telephone support provides the possibility to make a diagnosis, tailor the intervention, and actively assist patients to access other needed services [47]. A triage system could also be used to identify persons with more complex issues who need to be referred for specialist clinician intervention. This may be particularly relevant for those with chronic pain who are afraid of movement, as high kinesiophobia has been shown to negatively affect adherence rates [17]. Research investigating levels of telephone support stratified according to kinesiophobia scores may be worthwhile, as this study was underpowered to investigate this aspect.

The act of individually tailoring a web-based intervention has been suggested to help with adherence and outcomes [47]. Standardized and nontailored internet treatments may leave little room for patient and clinician preferences [48] regarding course content, format, and learning styles. Chien et al [49] identified 5 engagement subtypes in patients enrolled in an internet-delivered CBT program. Their study suggests that by identifying subtypes of patient engagement, programs can be targeted to what is most meaningful to different patients by providing different levels of support or additional treatment modules. The inclusion of telephone support may enable the clinician to assess patient preferences early in the program and adapt the course content to better suit their needs. Others have suggested that allowing the patient to choose the level of support they require would allow efficient allocation of clinician resources to where greater support is wanted, without affecting outcomes or adherence rates [50].

An important consideration related to patient adherence is the logical assumption that adherent patients will have better treatment outcomes than nonadherent patients, although the evidence has not always supported this assumption [51,52]. This study would suggest that additional telephone support does not have an effect on clinical outcomes; however, it did improve onboarding rates, thus increasing the number of participants who commenced the program. Web-based adherence is difficult to monitor because key therapy components are delivered on the web and practiced without the physical presence of a clinician [51]. For instance, a patient may practice the skills outlined in a program without logging in to the internet module. A more relevant way to measure adherence may be to look at intended use, defined as “the extent to which individuals should experience the content (of the intervention) to derive maximum benefit from the intervention, as defined or implied by its creators” [52]. This may explain why the patients who did not complete all modules in our study still showed significant improvements in their outcome measures. Further evaluation of the minimum amount of engagement with the program required to induce improvement may provide valuable insight. However, inherent difficulties lie in conducting such an analysis because those who stop engaging with treatments are also more likely to be lost to follow-up assessment.

As internet-delivered interventions evolve, the importance of various technical features in programs and their influence on adherence and outcomes need to be better understood. The mode of guidance, whether by email, text, phone, telehealth, program content, web design, multimedia format, web-based design features, or a hybrid of these features, may play a part in successful engagement of the individual [52]. Qualitative data would support this, with real-world Reboot Online service users anecdotally reporting a desire for more multimedia capability and web-based features. Program developers will need to consider these aspects of design that have an impact on usability and user experience during the development and ongoing evaluation of internet-delivered programs. Further evidence is needed to elucidate what effect each of these design components may have on user engagement and adherence and, in turn, whether design optimization strategies could be successfully used to improve treatment adherence and outcomes [20].

Studies investigating web-based CBT courses for pain conditions [24] report mixed results on whether adjunct guidance improves measures of pain catastrophizing, self-efficacy, and pain interference more than unguided internet interventions alone. Although our study showed significant improvements in measures of pain self-efficacy, kinesiophobia, pain interference, pain disability, and psychological distress over time, there was no significant difference in outcome measures between Reboot Online combined with telephone support and the usual Reboot Online intervention. This may indicate that adding telephone support may not add to the effect of the core Reboot Online program on clinical outcome measures.

The results presented here need to be considered within the context of a number of study limitations. Blinding of participants and the telephone support coach was not possible, introducing a possible risk of bias. Responder bias is another limitation, because participants self-selected to participate in the trial and outcome data were lost from participants who dropped out of the course. The power calculation was performed with reference to our primary outcome measure, which was adherence. The study was not formally powered for effectiveness on secondary outcomes and did not correct for multiple comparisons. Loss of follow-up data after treatment and at 3 months resulted in a smaller sample than estimated, which may have led to biased estimates or overestimates of treatment effects. Intention-to-treat analyses were conducted to enable inclusion of participants with suboptimal compliance in the data analysis, and linear mixed model analyses were selected because they remain robust in the presence of considerable missing data. Most of the baseline data were collected through patient-reported surveys, and thus the reliability of the data is limited by the accuracy of participant recall and self-report. The subjective physical activity questionnaire may have been subject to notable reporting bias; an objective measure of physical activity (such as an activity-monitoring device) is needed to estimate physical activity in future evaluations. Although outside the scope of this study, collection of more detailed data on direct intervention costs and opiate use could have facilitated a more comprehensive evaluation of the benefits of the Reboot Online program. The study used 2 therapists (TG and JW) to conduct the telephone support, both senior clinicians in chronic pain and trained in the telephone coaching protocol. Although this allowed for consistent intervention during the study, it is unknown whether our findings would generalize to broader clinical settings or multiple clinicians. Finally, the study was conducted between September 2019 and November 2020. During this period, there were significant and unprecedented local (drought and bushfires in Australia) and global (COVID-19) events that may have affected participants' ability to adhere and commit to the study, as well as measures of their well-being and mental and physical health.

Conclusions

Reboot Online offers an effective web-based intervention for chronic pain. Telephone support improves participants’ registration, program commencement, and engagement in the early phase of the internet intervention; however, it did not seem to have an impact on overall course completion or efficacy. To maximize adherence to web-based interventions, clinician resources may be best used in the early phase of onboarding. Further research is warranted to gain better understanding of optimal guidance levels and models as well as program design components that will most effectively improve adherence to web-based interventions.

Acknowledgments

This project was supported by a grant from the St Vincent’s Health Australia Inclusive Health Innovation Fund, St Vincent’s Clinic Foundation, 2019. This Way Up is funded by the Australian Government Department of Health. JN is supported by an Australian Medical Research Future Fund Fellowship (MRFF1145382).

Conflicts of Interest

None declared.

Multimedia Appendix 1

CONSORT eHEALTH checklist (V 1.6.1).

PDF File (Adobe PDF File), 699 KB

  1. Blyth F, March L, Brnabic A, Jorm L, Williamson M, Cousins M. Chronic pain in Australia: a prevalence study. Pain 2001;89(2):127-134. [CrossRef]
  2. González-Chica DA, Vanlint S, Hoon E, Stocks N. Epidemiology of arthritis, chronic back pain, gout, osteoporosis, spondyloarthropathies and rheumatoid arthritis among 1.5 million patients in Australian general practice: NPS MedicineWise MedicineInsight dataset. BMC Musculoskelet Disord 2018 Jan 18;19(1):20 [FREE Full text] [CrossRef] [Medline]
  3. Breivik H, Collett B, Ventafridda V, Cohen R, Gallacher D. Survey of chronic pain in Europe: prevalence, impact on daily life, and treatment. Eur J Pain 2006 May;10(4):287-333. [CrossRef] [Medline]
  4. Vos T, Abajobir AA, Abate KH, Abbafati C, Abbas KM, Abd-Allah F, et al. Global, regional, and national incidence, prevalence, and years lived with disability for 301 acute and chronic diseases and injuries in 188 countries, 1990-2013: A systematic analysis for the global burden of disease study 2013. The Lancet 2015;386(9995):743-800. [CrossRef]
  5. Casey M, Smart K, Segurado R, Doody C. Multidisciplinary-Based Rehabilitation (MBR) compared with active physical interventions for pain and disability in adults with chronic pain: a systematic review and meta-analysis. Clin J Pain 2020 Nov;36(11):874-886. [CrossRef] [Medline]
  6. Garg S, Garg D, Turin T, Chowdhury M. Web-based interventions for chronic back pain: a systematic review. J Med Internet Res 2016 Jul 26;18(7):e139 [FREE Full text] [CrossRef] [Medline]
  7. Hruschak V, Flowers KM, Azizoddin DR, Jamison RN, Edwards RR, Schreiber KL. Cross-sectional study of psychosocial and pain-related variables among patients with chronic pain during a time of social distancing imposed by the coronavirus disease 2019 pandemic. Pain 2021 Feb 01;162(2):619-629 [FREE Full text] [CrossRef] [Medline]
  8. Eccleston C, Fisher E, Craig L, Duggan G, Rosser B, Keogh E. Psychological therapies (Internet-delivered) for the management of chronic pain in adults. Cochrane Database Syst Rev 2012:CD010152. [CrossRef]
  9. Bender J, Radhakrishnan A, Diorio C, Englesakis M, Jadad A. Can pain be managed through the internet? A systematic review of randomized controlled trials. Pain 2011 Aug;152(8):1740-1750. [CrossRef] [Medline]
  10. Cuijpers P, van Straten A, Andersson G. Internet-administered cognitive behavior therapy for health problems: a systematic review. J Behav Med 2008 Apr;31(2):169-177 [FREE Full text] [CrossRef] [Medline]
  11. Vugts M, Joosen M, van der Geer JE, Zedlitz A, Vrijhoef H. The effectiveness of various computer-based interventions for patients with chronic pain or functional somatic syndromes: a systematic review and meta-analysis. PLoS One 2018;13(5):e0196467 [FREE Full text] [CrossRef] [Medline]
  12. Murray E, Burns J, See T, Lai R, Nazareth I. Interactive Health Communication Applications for people with chronic disease. Cochrane Database Syst Rev 2005 Oct 19;4(4):CD004274. [CrossRef] [Medline]
  13. Wantland D, Portillo C, Holzemer W, Slaughter R, McGhee E. The effectiveness of web-based vs. non-web-based interventions: a meta-analysis of behavioral change outcomes. J Med Internet Res 2004 Nov 10;6(4):e40 [FREE Full text] [CrossRef] [Medline]
  14. Buhrman M, Fältenhag S, Ström L, Andersson G. Controlled trial of internet-based treatment with telephone support for chronic back pain. Pain 2004 Oct;111(3):368-377. [CrossRef] [Medline]
  15. Schultz R, Smith J, Newby JM, Gardner T, Shiner CT, Andrews G, et al. Pilot trial of the reboot online program: an internet-delivered, multidisciplinary pain management program for chronic pain. Pain Res Manag 2018 Sep 05;2018:1-11 [FREE Full text] [CrossRef] [Medline]
  16. Smith J, Faux S, Gardner T, Hobbs M, James M, Joubert A, et al. Reboot online: a randomized controlled trial comparing an online multidisciplinary pain management program with usual care for chronic pain. Pain Med 2019 Dec 01;20(12):2385-2396. [CrossRef] [Medline]
  17. Lim D, Newby J, Gardner T, Haskelberg H, Schultz R, Faux S, et al. Evaluating real-world adherence and effectiveness of the "Reboot Online" program for the management of chronic pain in routine care. Pain Med 2021 Aug 06;22(8):1784-1792. [CrossRef] [Medline]
  18. Andersson G. The Internet and CBT: A Clinical Guide. London, UK: CRC Press; 2014.
  19. Buhrman M, Gordh T, Andersson G. Internet interventions for chronic pain including headache: a systematic review. Internet Interv 2016 May;4:17-34 [FREE Full text] [CrossRef] [Medline]
  20. Baumeister H, Reichler L, Munzinger M, Lin J. The impact of guidance on internet-based mental health interventions — A systematic review. Internet Intervent 2014 Oct;1(4):205-215. [CrossRef]
  21. Evans EH, Araújo-Soares V, Adamson A, Batterham AM, Brown H, Campbell M, et al. The NULevel trial of a scalable, technology-assisted weight loss maintenance intervention for obese adults after clinically significant weight loss: study protocol for a randomised controlled trial. Trials 2015 Sep 22;16(1):421 [FREE Full text] [CrossRef] [Medline]
  22. Barnett J, Harricharan M, Fletcher D, Gilchrist B, Coughlan J. myPace: an integrative health platform for supporting weight loss and maintenance behaviors. IEEE J Biomed Health Inform 2015 Jan;19(1):109-116. [CrossRef]
  23. Schäfer AG, Zalpour C, von Piekartz H, Hall TM, Paelke V. The efficacy of electronic health-supported home exercise interventions for patients with osteoarthritis of the knee: systematic review. J Med Internet Res 2018 Apr 26;20(4):e152 [FREE Full text] [CrossRef] [Medline]
  24. Lin J, Sander L, Paganini S, Schlicker S, Ebert D, Berking M, et al. Effectiveness and cost-effectiveness of a guided internet- and mobile-based depression intervention for individuals with chronic back pain: protocol of a multi-centre randomised controlled trial. BMJ Open 2017 Dec 28;7(12):e015226 [FREE Full text] [CrossRef] [Medline]
  25. Stinson JN, McGrath PJ, Hodnett ED, Feldman BM, Duffy CM, Huber AM, et al. An internet-based self-management program with telephone support for adolescents with arthritis: a pilot randomized controlled trial. J Rheumatol 2010 Sep;37(9):1944-1952. [CrossRef] [Medline]
  26. Learn practical tools to take care of your mental health. This way up.   URL: https://thiswayup.org.au/ [accessed 2022-01-18]
  27. Lecrubier Y, Sheehan D, Weiller E, Amorim P, Bonora I, Sheehan KH, et al. The Mini International Neuropsychiatric Interview (MINI). A short diagnostic structured interview: reliability and validity according to the CIDI. Eur Psychiatr 2020 Apr 16;12(5):224-231. [CrossRef]
  28. Soer R, Reneman MF, Vroomen PC, Stegeman P, Coppes MH. Responsiveness and minimal clinically important change of the pain disability index in patients with chronic back pain. Spine 2012;37(8):711-715. [CrossRef]
  29. International Physical Activity Questionnaire. The IPAQ. 2016.   URL: https://sites.google.com/site/theipaq/ [accessed 2021-12-30]
  30. Welcome to randomization.com!!!: where it's never the same thing twice. Randomization. 2020.   URL: http://www.jerrydallal.com/random/randomize.htm [accessed 2022-01-18]
  31. How REDCap is being used in response to COVID-19. REDCap.   URL: https://www.project-redcap.org/ [accessed 2022-01-18]
  32. The Chronic Pain Course. This way up.   URL: https://thiswayup.org.au/courses/the-chronic-pain-course/ [accessed 2022-01-18]
  33. Nicholas M. The pain self-efficacy questionnaire: taking pain into account. Eur J Pain 2007 Mar;11(2):153-163. [CrossRef] [Medline]
  34. Chiarotto A, Vanti C, Cedraschi C, Ferrari S, de Lima E Sà Resende F, Ostelo RW, et al. Responsiveness and minimal important change of the pain self-efficacy questionnaire and short forms in patients with chronic low back pain. J Pain 2016 Jun;17(6):707-718. [CrossRef] [Medline]
  35. Miller R, Kori S, Todd D. The Tampa Scale: a measure of kinesiophobia. Clin J Pain 1991;7(1):51 [FREE Full text]
  36. Monticone M, Ambrosini E, Rocca B, Foti C, Ferrante S. Responsiveness and minimal clinically important changes for the Tampa Scale of Kinesiophobia after lumbar fusion during cognitive behavioral rehabilitation. Eur J Phys Rehabil Med 2017 Jun;53(3):351-358. [CrossRef]
  37. Cleeland C, Ryan K. Pain assessment: global use of the brief pain inventory. Ann Acad Med Singapore 1994;23(2):129-138. [CrossRef]
  38. Mease PJ, Spaeth M, Clauw DJ, Arnold LM, Bradley LA, Russell IJ, et al. Estimation of minimum clinically important difference for pain in fibromyalgia. Arthritis Care Res (Hoboken) 2011 Jun 31;63(6):821-826 [FREE Full text] [CrossRef] [Medline]
  39. Chibnall J, Tait R. The Pain Disability Index: factor structure and normative data. Arch Phys Med Rehabil 1994 Oct;75(10):1082-1086. [CrossRef] [Medline]
  40. Craig CL, Marshall AL, Sjostrom M, Bauman AE, Booth ML, Ainsworth BE, et al. International Physical Activity Questionnaire: 12-country reliability and validity. Med Sci Sports Exerc 2003;35(8):1381-1395. [CrossRef]
  41. Cheng HL. A simple, easy-to-use spreadsheet for automatic scoring of the International Physical Activity Questionnaire (IPAQ) Short Form. ResearchGate 2016. [CrossRef]
  42. Furukawa TA, Kessler RC, Slade T, Andrews G. The performance of the K6 and K10 screening scales for psychological distress in the Australian National Survey of Mental Health and Well-Being. Psychol Med 2003 Mar 14;33(2):357-362. [CrossRef] [Medline]
  43. Kessler RC, Andrews G, Colpe LJ, Hiripi E, Mroczek DK, Normand SL, et al. Short screening scales to monitor population prevalences and trends in non-specific psychological distress. Psychol Med 2002 Aug 26;32(6):959-976. [CrossRef] [Medline]
  44. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)--a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009 Apr;42(2):377-381 [FREE Full text] [CrossRef] [Medline]
  45. Harris PA, Taylor R, Minor BL, Elliott V, Fernandez M, O'Neal L, REDCap Consortium. The REDCap consortium: building an international community of software platform partners. J Biomed Inform 2019 Jul;95:103208 [FREE Full text] [CrossRef] [Medline]
  46. Cohen J. Statistical power analysis. Curr Dir Psychol Sci 2016 Jun 24;1(3):98-101. [CrossRef]
  47. Carlbring P, Maurin L, Törngren C, Linna E, Eriksson T, Sparthan E, et al. Individually-tailored, Internet-based treatment for anxiety disorders: a randomized controlled trial. Behav Res Ther 2011 Jan;49(1):18-24. [CrossRef] [Medline]
  48. Spek V, Cuijpers P, Nyklícek I, Riper H, Keyzer J, Pop V. Internet-based cognitive behaviour therapy for symptoms of depression and anxiety: a meta-analysis. Psychol Med 2006 Nov 20;37(03):319. [CrossRef]
  49. Chien I, Enrique A, Palacios J, Regan T, Keegan D, Carter D, et al. A machine learning approach to understanding patterns of engagement with internet-delivered mental health interventions. JAMA Netw Open 2020 Jul 01;3(7):e2010791 [FREE Full text] [CrossRef] [Medline]
  50. Hadjistavropoulos H, Schneider L, Mehta S, Karin E, Dear B, Titov N. Preference trial of internet-delivered cognitive behaviour therapy comparing standard weekly versus optional weekly therapist support. J Anxiety Disord 2019 Apr;63:51-60 [FREE Full text] [CrossRef] [Medline]
  51. Andersson G, Titov N. Advantages and limitations of internet-based interventions for common mental disorders. World Psychiatry 2014 Mar 04;13(1):4-11 [FREE Full text] [CrossRef] [Medline]
  52. Sieverink F, Kelders SM, van Gemert-Pijnen JE. Clarifying the concept of adherence to eHealth technology: systematic review on when usage becomes adherence. J Med Internet Res 2017 Dec 06;19(12):e402 [FREE Full text] [CrossRef] [Medline]


CBT: cognitive behavioral therapy
CONSORT: Consolidated Standards of Reporting Trials
MCID: minimally clinically important difference
MDD: major depressive disorder
REDCap: Research Electronic Data Capture
SMART: specific, measurable, achievable, relevant, and timely


Edited by A Mavragani; submitted 07.06.21; peer-reviewed by A Martin-Pintado Zugasti, S Bhattacharjee, A Kebede; comments to author 01.08.21; revised version received 24.08.21; accepted 07.12.21; published 03.02.22

Copyright

©Tania Gardner, Regina Schultz, Hila Haskelberg, Jill M Newby, Jane Wheatley, Michael Millard, Steven G Faux, Christine T Shiner. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 03.02.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.