
Published in Vol 28 (2026)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/84030.
Identifying Evidence-Based Strategies in a Digital Mental Health Intervention for Depression: Qualitative Content Analysis


1Department of Psychiatry, Massachusetts General Hospital, 185 Cambridge Street, 2nd Floor, Boston, MA, United States

2Department of Psychiatry, Harvard Medical School, Boston, MA, United States

3Koa Health, London, United Kingdom

Corresponding Author:

Geneva K Jonathan, PhD


Background: Depression is one of the leading causes of disability worldwide. Cognitive behavioral therapy (CBT) is an effective treatment, but it is difficult to access due to clinician shortages, waitlists, and logistical barriers. Smartphone-based CBT interventions offer a scalable alternative to traditional face-to-face care, but few provide transparency regarding how closely they adhere to evidence-based therapeutic principles. Understanding what therapeutic components are included in interventions helps clinicians and patients determine whether they follow CBT principles and how they might help reduce depressive symptoms.

Objective: This study aimed to characterize the therapeutic content of Mindset (Koa Health), a therapist-guided smartphone intervention for depression, by identifying the core CBT techniques it delivers and the specific behavioral strategies the app uses to put those techniques into practice.

Methods: A qualitative content analysis was conducted on all 393 unique intervention pages of Mindset. Using established CBT strategy definitions and the behavior change technique (BCT) Taxonomy version 1 (BCTTv1), coders independently evaluated each page using a collaborative consensus approach. Interrater agreement was 93.75% for CBT coding and 93.62% for BCT coding. Descriptive statistics (frequency, mean, and SD) and the overlap between the two frameworks were calculated.

Results: All 16 core CBT techniques were identified. CBT techniques were used a total of 528 times (mean per module 66.0, SD 56.0). The most frequently used techniques included psychoeducation (164/325, 50.5% of pages), skill building (110/325, 33.8%), cognitive restructuring (46/325, 14.2%), activity scheduling (42/325, 12.9%), and self-monitoring (39/325, 12.0%). Across modules, 37 of 93 possible BCTs were coded 878 times (mean per module 109.8, SD 92.0) across 13 of 16 BCTTv1 categories. The most frequently applied BCT categories were shaping knowledge (205/325, 63.1% of pages), repetition and substitution (138/325, 42.5%), and feedback and monitoring (113/325, 34.8%). Overlap between the 2 frameworks was common, with the most frequent CBT-BCT pairings being psychoeducation (CBT technique) × shaping knowledge (BCT category; appearing together on 119 pages), skill building × shaping knowledge (80 pages), activity scheduling × shaping knowledge (42 pages), and activity scheduling × repetition and substitution (42 pages).

Conclusions: Mindset demonstrates coverage of CBT techniques and alignment with evidence-based BCTs. This study is the first to introduce mechanism mapping, a dual-coding approach that describes the presence of therapeutic strategies and how they are behaviorally operationalized, addressing a gap in digital mental health transparency. Unlike existing content evaluations that use presence or absence checklists, our framework captures implementation depth through systematic documentation of behavioral scaffolding. This replicable methodology enables researchers to evaluate therapeutic fidelity, supports clinicians in making evidence-informed recommendations for digital mental health treatments, and provides a foundation for the development of adaptive interventions that can enhance real-world treatment outcomes for individuals with depression.

J Med Internet Res 2026;28:e84030

doi:10.2196/84030


Depression is a severe and common illness that affects more than 300 million people worldwide, making it one of the leading causes of disability [1,2]. Its prevalence has risen in recent years [3], increasing the disease’s burden on individuals and health care systems [4]. Depression impairs quality of life [5], disrupts cognitive functioning [6-8], and increases suicidal thoughts and behaviors [4,9]. Cognitive behavioral therapy (CBT) is among the most effective treatments for depression, with extensive evidence supporting its ability to reduce symptom severity [10-12] and improve functioning [13] and quality of life [14]. However, access to traditional face-to-face CBT is limited due to clinician shortages, long waitlists, geographic limitations, out-of-pocket costs, stigma, and motivational difficulties inherent to depression [15-19].

Digital interventions have emerged to address barriers to traditional face-to-face care. Multiple meta-analyses of app and web-based interventions for individuals with moderate-to-severe depression demonstrate moderate reductions in symptom severity relative to usual care or minimal control conditions, with pooled effect sizes falling in the moderate range (standardized mean difference 0.50‐0.62) [20-22]. However, these average effects provide limited insight into what users are actually receiving in digital “CBT-based” interventions. Systematic reviews of these apps consistently show that adherence to evidence-based CBT or behavioral activation protocols is as low as 15%, and that commercially available apps do not fully meet established guideline, protocol, or treatment manual standards [23-29].

The gap between interventions labeled as “CBT-based” and their actual implementation of CBT becomes apparent when examining app content in detail. Content analyses of existing depression apps show that many include only a small number of evidence-based treatment elements (median 3 per app), with some of the most “hallmark” CBT strategies, such as cognitive restructuring, problem-solving, and relapse prevention, absent or underrepresented [25]. Even when CBT components are present, they may be implemented superficially, with limited tailoring and follow-up. For example, a review of 28 unguided depression apps and web programs found that even though behavioral activation was “present” in many programs, the strategy was rarely broken into achievable steps, anticipated barriers were not addressed, and few programs checked whether planned activities were completed, despite behavioral activation being one of the most commonly included components in CBT for depression [27]. These findings suggest that existing methods for evaluating the presence and depth of therapeutic strategies fall short of capturing the concrete implementation features that determine whether a therapeutic component functions as structured skills training (likely to achieve a greater therapeutic effect) or as a brief, one-off exercise.

Despite this limitation, existing reviews and meta-analyses have not expanded their approaches for characterizing how CBT is operationalized within these tools. Most syntheses classify interventions using broad labels such as “CBT-based” or by the presence or absence of high-level components, which are often derived from intervention descriptions, stated theoretical orientations, or protocol-level summaries rather than a systematic review of the delivered in-app content and exercises themselves [21,29,30]. As a result, interventions that differ meaningfully in their therapeutic design are often treated as equivalent, obscuring variation in the active ingredients users are exposed to. Even content-focused evaluations typically summarize interventions at the program or app level, focusing on content frequency or presence and providing little information about the distribution, sequencing, or repetition of specific therapeutic techniques [24,25]. Without a more granular and methodologically rigorous characterization of therapeutic content, it is difficult to link intervention design to engagement, behavior change, or clinical outcomes, or to identify which components of digital CBT are most essential for effective depression treatment. However, detailed characterization of therapeutic content only addresses part of the challenge: knowing what strategies are present does not explain how they produce change.

The mechanisms through which digital interventions exert their effects are poorly understood [22,30]. In traditional face-to-face CBT, symptom improvement is thought to arise from the coordinated enactment of cognitive, behavioral, and self-regulatory processes supported through structured practice, homework, and therapist feedback [31]. However, it is unclear whether digital CBT interventions engage these same change processes, particularly given their asynchronous format, reduced clinician involvement, and reliance on self-directed engagement. A critical but underexplored question is how digital interventions operationalize these strategies; that is, through what behavioral mechanisms they prompt users to engage in the cognitive and behavioral work that facilitates change. Without understanding how digital interventions translate therapeutic strategies into concrete, repeated user behaviors, it is difficult to determine whether they provide the structured practice opportunities that established CBT theory suggests are necessary for skill acquisition and symptom change [32,33].

To understand how digital CBT engages change mechanisms, we must recognize that CBT strategies are behavioral in nature [31]. Cognitive restructuring involves identifying and examining thoughts, activity scheduling requires planning and completing activities, and behavioral experiments entail designing and implementing behavioral tests. Critically, the same strategy can be delivered through very different user tasks, such as a written thought record versus a brief reflection prompt, or a full weekly activity plan versus a single daily goal. Because CBT strategies can be enacted in multiple ways, we need to move beyond describing whether they are mentioned to specifying how strategies are translated into user actions and practice opportunities.

Behavior change techniques (BCTs) provide a structured way to characterize how interventions translate therapeutic strategies into user actions that support skill enactment and change. BCTs are the smallest identifiable “active ingredients” designed to change behavior, for example, prompting self-monitoring, providing instruction, or encouraging repetition [34]. The BCT Taxonomy version 1 (BCTTv1) has been widely applied across digital behavioral domains, including medication adherence [35], physical activity [36,37], alcohol [38], smoking cessation [39], and condom use [40]. However, the application of BCT in digital mental health is comparatively limited [41-43]. Pairing CBT strategies with BCT coding enables a systematic approach to understanding how therapeutic components are behaviorally operationalized, for example, how activity scheduling is supported through action planning, self-monitoring, or behavioral prompts. This aligns with growing calls to identify active ingredients and clarify the processes through which digital interventions achieve effects [44,45].

This paper evaluates Mindset (Koa Health), a therapist-guided smartphone CBT intervention for depression, to advance transparency and mechanistic understanding in digital mental health. Mindset was selected because it has demonstrated clinical outcomes in an open trial [46] and uses a therapist-guided delivery model associated with superior effectiveness compared to self-guided approaches [47,48]. Most importantly, we had complete access to all intervention content for comprehensive coding. In the 8-week trial of Mindset (n=28), participants showed significant reductions in clinician-rated depression severity on the Hamilton Depression Rating Scale [49] from baseline (mean 19.1, SD 5.0) to posttreatment (mean 10.8, SD 6.1), with a large effect size (Hedges g=1.47; P<.001; mean reduction 7.8 points, 95% CI 5.2‐10.5), with gains maintained at 3-month follow-up [46]. Large effects were also observed for self-reported depression, functioning, and quality of life [46].

To better understand how Mindset supports change and delivers its clinical effects, we aimed to identify and describe the CBT strategies and BCTs present in Mindset and explore how BCTs operationalize CBT strategies by describing their overlap. Our overarching goal is to propose a methodological framework for describing therapeutic content that can be replicated to support digital mental health transparency.


Study Design

We conducted a qualitative content analysis of all intervention pages (ie, all app content) in Mindset, a therapist-guided smartphone CBT intervention for depression.

Ethical Considerations

The original open trial of Mindset for Depression took place at Massachusetts General Hospital in Boston, Massachusetts, between May 2022 and February 2023. It was approved by the Institutional Review Board of Massachusetts General Hospital (protocol 2020P001958) and was registered on ClinicalTrials.gov (NCT05386329). All trial participants provided informed consent before the initiation of study procedures and were given the option to withdraw at any time. Participants in the original trial were compensated US $25 at each of 3 assessment time points (midtreatment, posttreatment, and 3-month follow-up). Given that it was a content analysis of the intervention materials only, this study did not include any participant data, user-generated content, or personally identifiable information. Therefore, no additional consent was collected. No images, screenshots, or other materials in this paper or supplementary files contain any participant data or identifiable information; all content reflects only the intervention’s standardized therapeutic materials. The full open trial, including a detailed description of study methods and results, is reported elsewhere [46].

Mindset Intervention Delivery

Mindset is a therapist-guided smartphone CBT intervention for depression consisting of 8 modules delivered across 8 weeks [46]. The intervention is grounded in CBT theory, which posits that improvements in depression are facilitated by changes in thoughts, behaviors, and emotional responses [50].

Participants had immediate access to the intervention after completing baseline assessments. They were instructed to use the app independently while attending weekly 15‐ to 20-minute therapist sessions conducted via videoconferencing (Health Insurance Portability and Accountability Act [HIPAA]-compliant) to review progress and receive support. Therapists were licensed clinical psychologists trained in CBT for depression, and, because of licensure requirements, participants were required to reside in Massachusetts.

The program began with a brief onboarding sequence that introduced CBT principles and app navigation (Table 1). Modules followed a typical CBT progression, beginning with psychoeducation and self-monitoring, moving into behavioral activation, and ending with cognitive restructuring and relapse prevention [50]. Module 1 could be completed at the participant’s own pace, whereas Modules 2‐8 required a minimum of 7 days per module to support practice and skill consolidation. During Module 2, participants were prompted to self-monitor by logging at least 3 activities they engaged in and the corresponding moods each day. From Module 3 onward, participants implemented behavioral activation by scheduling at least 3 activities per week aligned with 7 value domains (health and wellness, causes and community, spirituality, creative pursuits, career and education, personal relationships, and day-to-day tasks), then logging mood after each activity. The activity library included 102 options, and activity scheduling continued through Modules 4‐8. Although therapeutic content was standardized, participants’ activity choices and the extent to which they revisited the activity scheduling and logging pages allowed for optional individual tailoring or personalization.

Table 1. Mindset intervention overview, module-by-module.
| Module | Description | Unique pages | Repeated pages | Total pages |
| --- | --- | --- | --- | --- |
| Onboarding | Introduction to intervention and how to use | 10 | 0 | 10 |
| Module 1: Exploring How We Think | Cognitive restructuring to challenge unhelpful thinking | 135 | 0 | 135 |
| Module 2: Activities and Mood | Self-monitoring of activities and associated moods | 28 | 0 | 28 |
| Module 3: Goals and Scheduling | Identify personal values, set goals, and schedule activities aligned with values | 24 | 7 | 31 |
| Module 4: Establishing a Rhythm | Mindful breathing, continued scheduling of activities | 19 | 7 | 26 |
| Module 5: Keeping Momentum | Mindful grounding, continued scheduling of activities | 9 | 7 | 16 |
| Module 6: Planning Around Priorities | Skills to let go of unhelpful thoughts, continued scheduling of activities | 9 | 7 | 16 |
| Module 7: Examining Core Beliefs | Modification of core beliefs and building self-esteem, continued scheduling of activities | 26 | 7 | 33 |
| Module 8: Preparing for the Future | Relapse prevention, reflect on progress and learned skills, continued scheduling of activities | 23 | 7 | 30 |
| Total | | 283 | —a | 325 |

aNot applicable.

All data and therapeutic materials were sourced directly from the Mindset app, which included 393 unique app pages across onboarding content and modules (Table 1), totaling 943 pages when repeated content was included. No materials or pages from the app were excluded from analysis. Content was accessible only within the smartphone app and was not available as external files or in print. On average, there were 39.3 pages per module (SD 44.8). Content was delivered in the app via text, though some pages, such as the introductory module and the mindfulness and grounding exercises, also offered the option to watch the content via video.

Coding Framework

We used a dual-coding approach to systematically identify what therapeutic strategies were delivered and how. This approach, which we call mechanism mapping, pairs established CBT technique definitions [31,50] with the BCTTv1 [34] to describe the therapeutic content included and its behavioral operationalization.

CBT Techniques

We first identified and labeled CBT techniques present in Mindset to determine which therapeutic strategies were included and to evaluate alignment with established CBT protocols for depression (Table S1 in Multimedia Appendix 1). The CBT coding framework consisted of 16 techniques found in manualized CBT protocols for depression [25,31,50]. Refer to Table S1 in Multimedia Appendix 1 for coding definitions.

BCTs

We applied the BCTTv1, a validated framework comprising 93 techniques organized into 16 hierarchically clustered categories [34,51,52]. The first author (GKJ) completed training through the official online platform [51]. The 16 categories and their definitions are fully described by Michie et al [34] and in Table S2 in Multimedia Appendix 1.

We chose BCTTv1 because of its ability to characterize the “ingredients” that operationalize change in a digital CBT intervention, consistent with a mechanisms-focused approach [34]. It also provides a validated, fine-grained taxonomy for identifying discrete, observable techniques embedded in app-based content, that is, what users see and do on each screen, making it particularly appropriate for therapy apps that translate evidence-based strategies into modular, self-guided activities. Other frameworks, such as the Theoretical Domains Framework and the Capability, Opportunity, Motivation–Behavior (COM-B) theory, describe why behavior changes by organizing determinants, but do not enumerate concrete techniques [53,54]. Although BCTTv1 was originally developed for health behavior interventions, its structure aligns well with CBT’s emphasis on observable behavioral and cognitive skills [50].

Coding Procedures

The raw text of the Mindset app was organized page-by-page to preserve the intervention’s structure during coding. A collaborative coding approach was used to identify CBT techniques and BCTs [55]. Techniques were coded based on the presence of explicit content, exercises, or instructions; implied techniques were not coded.

Coder Roles, Independence, and Workflow

Coding was conducted by 2 doctoral-level clinical psychologists (GKJ and JYS) and 1 bachelor’s-level research assistant (HTA), all familiar with CBT for depression. None were involved in the initial development or testing of the Mindset intervention. The first author (GKJ) independently coded each intervention page in Dedoose (SocioCultural Research Consultants, LLC) [56]. Two additional coders (JYS and HTA) reviewed the codes and documented disagreements in memos. Discrepancies were discussed collaboratively using the memos as well as CBT and BCTTv1 definitions until consensus was reached.

Interrater Agreement

To assess coder agreement, we report percentage agreement as the reliability metric [57]. Percentage agreement is appropriate for qualitative content analysis with multiple potential codes (eg, different pages of an intervention included multiple techniques or skills), because kappa statistics can be overly conservative when the number of possible categories is large relative to the sample size [57,58]. Before consensus, agreement was 93.62% for BCT codes (822/878) and 93.75% for CBT codes (494/528).
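As an illustration, this agreement computation can be sketched in a few lines of Python. This is not the study's code (coding was conducted in Dedoose); the page IDs, code labels, and the union-based denominator in this sketch are all hypothetical assumptions.

```python
# Hedged sketch: percentage agreement between 2 coders when each page can
# carry multiple codes. All page IDs and code labels below are hypothetical.
coder_a = {
    "page_01": {"psychoeducation", "skill building"},
    "page_02": {"self-monitoring"},
    "page_03": {"cognitive restructuring", "activity scheduling"},
}
coder_b = {
    "page_01": {"psychoeducation", "skill building"},
    "page_02": {"self-monitoring", "homework assignment"},
    "page_03": {"cognitive restructuring", "activity scheduling"},
}

def percent_agreement(a: dict, b: dict) -> float:
    """Percentage of all applied codes (union across coders) assigned by both."""
    agreed = 0
    total = 0
    for page in a.keys() | b.keys():
        codes_a = a.get(page, set())
        codes_b = b.get(page, set())
        agreed += len(codes_a & codes_b)  # codes both coders applied to this page
        total += len(codes_a | codes_b)   # all codes either coder applied
    return 100 * agreed / total

print(round(percent_agreement(coder_a, coder_b), 2))  # 83.33 for this toy example
```

For the reported figures (eg, 822/878 for BCT codes), the denominator is the total number of coded instances; the exact denominator convention is an implementation choice, and the union-based version above is only one plausible operationalization.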

Analysis

We calculated the frequency and distribution of CBT techniques and BCT categories across Mindset, including mean applications, SD, range, and percentage of total app pages.
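As a sketch, the descriptive summaries for a single technique could be computed as follows. The per-module counts here are invented for illustration; only the 325-page denominator comes from the study's weighted analytic dataset.

```python
# Hedged sketch: frequency, mean, SD, range, and percentage of app pages for
# one technique. counts_per_module is hypothetical; 325 is the paper's
# weighted analytic page count.
from statistics import mean, stdev

TOTAL_PAGES = 325
counts_per_module = [20, 5, 8, 6, 4, 3, 7, 9]  # hypothetical counts, Modules 1-8

total = sum(counts_per_module)
summary = {
    "total": total,                                   # total applications
    "mean": round(mean(counts_per_module), 1),        # mean per module
    "sd": round(stdev(counts_per_module), 1),         # sample SD
    "range": (min(counts_per_module), max(counts_per_module)),
    "pct_of_pages": round(100 * total / TOTAL_PAGES, 1),
}
print(summary)
```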

For ease of interpreting results, we summarize BCT categories rather than each technique, and we prioritize CBT techniques and BCT categories that contribute ≥10% of total codes for reporting.

To prevent repeated content from inflating counts, we weighted pages that participants were guided to revisit multiple times. For example, activity scheduling pages appeared in Modules 3‐8, with instructions to schedule at least 3 activities per week. We applied a weighting factor equal to the average number of activities scheduled per participant divided by the total number of possible activity pages (n=110). This weighting assumption was based on participant instructions provided in the app. After applying this weighting factor, the 110 activity library pages in each of Modules 3-8 were reduced to 7 weighted pages per module, yielding an analytic dataset of 325 pages (Table 1) used as the denominator for all frequency and percentage calculations.
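The weighting arithmetic can be made concrete with a short sketch. The average of 7 scheduled activities per module is an assumed value chosen to be consistent with the 7 weighted pages per module stated above; the page counts come from Table 1.

```python
# Hedged sketch of the page-weighting step. AVG_ACTIVITIES_SCHEDULED is an
# assumption consistent with the 7 weighted pages per module in the paper.
ACTIVITY_LIBRARY_PAGES = 110  # repeated activity pages in each of Modules 3-8
AVG_ACTIVITIES_SCHEDULED = 7  # assumed per-participant average per module

weight = AVG_ACTIVITIES_SCHEDULED / ACTIVITY_LIBRARY_PAGES  # weighting factor

# Each module's repeated activity pages collapse to a weighted page count
weighted_pages_per_module = ACTIVITY_LIBRARY_PAGES * weight

# Analytic denominator: 283 unique pages plus weighted repeats (Modules 3-8)
analytic_total = 283 + 6 * weighted_pages_per_module

print(weighted_pages_per_module, analytic_total)
```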

Data aggregation and analysis were conducted using R version 4.4.2 (R Core Team) [59]; a heatmap was developed using the pheatmap package [60], and figures were finalized in Canva (Canva Pty Ltd) [61]. Multimedia Appendix 1 includes detailed documentation of coding distributions.

Reporting Standards

This study follows the TIDieR (Template for Intervention Description and Replication) reporting guideline to promote transparency and replicability in describing intervention content and delivery [62]. A completed TIDieR checklist is provided in Checklist 1.


CBT Techniques

All 16 CBT techniques were represented and applied 528 times (mean 66.0 per module, SD 56.0) throughout the intervention. Figure 1 shows the distribution of CBT techniques by module, and Table 2 reports the descriptive summary of identified strategies. Full frequencies and distributions are reported in Multimedia Appendix 1.

Figure 1. Distribution of cognitive behavioral therapy techniques across intervention modules in the therapist-guided Mindset smartphone program for depression. This figure summarizes when each of the 16 cognitive behavioral therapy techniques was delivered throughout the 8 modules. Horizontal bars represent the percentage of app pages (N=325) within each module that included the cognitive behavioral therapy strategy. CBT: cognitive behavioral therapy.
Table 2. Cognitive behavioral therapy techniques in the Mindset smartphone intervention for depression.a
| Rank | CBTb category | Top module | Pages, n (%) | Mean per module when present (SD) | Examples of implementation |
| --- | --- | --- | --- | --- | --- |
| 1 | Psychoeducation | 1 | 164 (50.5) | 20.5 (23.1) | Explanation of the difference between sadness and depression |
| 2 | Skill building | 1 | 110 (33.8) | 13.8 (12.9) | Prompting participants to log 3 activities a day |
| 3 | Cognitive restructuring | 1 | 46 (14.2) | 23.0 (21.2) | Identifying a negative thought or belief and typing in a more balanced thought |
| 4 | Activity scheduling | 3-8 | 42 (12.9) | 7.0 (0.0) | Scheduling 3 activities per day and rating the mood for each activity |
| 5 | Self-monitoring | 1 | 39 (12.0) | 5.6 (13.8) | Logging 3 activities per day and rating mood for each activity |
| 6 | Homework assignment | 1 | 32 (9.8) | 4.0 (0.5) | Direction to log at least 3 activities in order to move to the next module |
| 7 | Expressing self-kindness | 1 | 25 (7.7) | 4.2 (2.4) | “Write an encouraging note to self” |
| 8 | Problem-solving | 1 | 21 (6.5) | 2.6 (0.8) | Ask participants to identify and write down barriers to planned activities in the app |
| 9 | Identification of values | 1 | 12 (3.7) | 1.5 (1.9) | Selection of activities aligned with personal values |
| 10 | Stimulus control | 1 | 11 (3.4) | 2.8 (3.5) | “Turn off notifications while practicing mindfulness” |
| 11 | Labeling emotions | 1 | 10 (3.1) | 10.0 (0.0) | Encouragement to reflect on and name what emotion they feel after a challenging situation |
| 12 | Modeling | 1 | 6 (1.8) | 6.0 (0.0) | Demonstration of how to set a SMARTc goal |
| 13 | Relapse prevention | 8 | 5 (1.5) | 5.0 (0.0) | Preparing a plan for anticipated stressful events |
| 14 | Mindfulness | 4-6 | 3 (0.9) | 1.0 (0.0) | Guided breathing exercise with attention to present-moment sensations |
| 15 | Guided imagery | 6 | 1 (0.3) | 1.0 (0.0) | “Visualize placing each thought on a cloud and watching it drift away” |
| 16 | Treatment goal setting | 1 | 1 (0.3) | 0.1 (0.4) | Set goal in alignment with values |

aCognitive behavioral therapy strategies were coded 528 times throughout all 8 intervention modules. As such, percentages are reported as a percentage of the total app pages (N=325).

bCBT: cognitive behavioral therapy.

cSMART: specific, measurable, achievable, relevant, and time-bound.

High-Frequency Techniques (≥10%)

Psychoeducation was present in all modules and accounted for more than half of the intervention pages (164/325, 50.5%), followed by skill building (110/325, 33.8%) and cognitive restructuring (46/325, 14.2%), both of which were most present in Module 1. Activity scheduling (42/325, 12.9%) appeared consistently across Modules 3‐8, in alignment with repeated practice of behavioral activation skills. Self-monitoring (39/325, 12.0%) appeared intermittently throughout the intervention and was most concentrated in modules that asked users to log and observe mood changes linked to activities.

Low-Frequency Techniques (<10%)

Homework assignments, self-kindness, problem-solving, and value identification appeared on 3%‐10% of the pages. Labeling emotions, mindfulness, guided imagery, and relapse prevention were less frequent (<5%).

BCTs

The intervention included 37 BCTs (39.8% of the taxonomy), from 13 of 16 categories, applied 878 times (mean 109.8, SD 92.0) across modules. Figure 2 shows the distribution of BCTs by module, and Table 3 provides a descriptive summary of the identified techniques.

Figure 2. Distribution of behavior change technique categories across the therapist-guided Mindset smartphone cognitive behavioral therapy intervention for depression. This figure summarizes when each of the 13 behavior change technique categories identified in Mindset was delivered throughout the 8 modules. Horizontal bars represent the percentage of app pages (N=325) within each module that included the behavior change technique category. BCT: behavior change technique.
Table 3. Behavior change techniques in the Mindset smartphone intervention for depression.a
| Rank | BCTb category | Top module | Pages, n (%) | Mean per module when present (SD) | Most common BCTs used, n (%) | Examples of implementation |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | Shaping knowledge | 1 | 205 (63.1) | 25.6 (22.5) | Instruction on how to perform a behavior, 140 (43.1) | Description of how to plan activities (eg, check schedule and anticipate barriers) |
| | | | | | Information about antecedents, 65 (20.0) | Changes in season can lead to low mood |
| 2 | Repetition and substitution | 1 | 138 (42.5) | 17.3 (13.9) | Behavioral practice or rehearsal, 110 (33.8) | Logging 3 activities per day in app |
| | | | | | Habit formation, 16 (4.9) | Repeated practice of activity scheduling or planning |
| 3 | Feedback and monitoring | 1 | 113 (34.8) | 14.1 (12.4) | Self-monitoring of behavior or thought, 55 (16.9) | Rate mood after activity |
| | | | | | Feedback on behavior, 45 (13.8) | Feedback on cognitive restructuring quiz |
| 4 | Self-belief | 1 | 89 (27.4) | 11.1 (11.0) | Self-talk, 41 (12.6) | Textbox to write an encouraging note to self |
| | | | | | Verbal persuasion about capability, 38 (11.7) | “You're well-equipped to handle future challenges” |
| 5 | Goals and planning | 8 | 87 (26.8) | 10.9 (6.1) | Action planning, 21 (6.5) | Plan for anticipated stressful events |
| | | | | | Goal setting (behavior), 21 (6.5) | Set intentions for use of Mindset |
| 6 | Identity | 1 | 58 (17.8) | 7.3 (12.2) | Framing or reframing, 45 (13.8) | Identify negative beliefs, reframe with a balanced perspective |
| | | | | | Valued self-identity, 10 (3.1) | Choose positive qualities about self from the list |
| 7 | Comparison of behavior | 1 | 58 (17.8) | 7.3 (5.4) | Demonstration of behavior, 55 (16.9) | Example of negative thinking pattern |
| | | | | | Social comparison, 3 (0.9) | Describe another person and select their characteristics from a list |
| 8 | Natural consequences | 1 | 51 (15.7) | 6.4 (10.8) | Information about emotional consequences, 35 (10.8) | Negative thoughts may make you feel discouraged |
| | | | | | Information about social or environmental consequences, 11 (3.4) | Low mood may lead to social interaction avoidance |
| 9 | Associations | 1 | 51 (15.7) | 6.4 (5.7) | Prompts or cues, 51 (15.7) | Encouragement to return to Mindset to refresh skills |
| 10 | Comparison of outcomes | 1 | 13 (4.0) | 1.6 (4.6) | Credible source, 13 (4.0) | Embedded links to evidence-based resources |
| 11 | Antecedents | 4-6 | 6 (1.8) | 0.8 (1.0) | Restructuring the physical environment, 3 (0.9) | Enabling “do not disturb” during mindfulness practice |
| | | | | | Body changes, 3 (0.9) | Bring attention to breath during mindfulness practice |
| 12 | Social support | 1 | 6 (1.8) | 0.8 (1.6) | Social support (practical), 4 (1.2) | Send messages to ask therapist how to use Mindset |
| | | | | | Social support (emotional), 1 (0.3) | Encouragement to reach out to 911 or 988 in an emergency |
| 13 | Reward and threat | 8 | 3 (0.9) | 0.4 (1.1) | Self-reward, 3 (0.9) | Acknowledge how far you've come and skills gained |

aBehavior change techniques were coded 878 times throughout all 8 intervention modules; as such, percentages are reported as a percentage of the total app pages.

bBCT: behavior change technique.

High-Frequency Techniques (≥10%)

Six BCT categories were found in all 8 modules: shaping knowledge (205/325, 63.1%), repetition and substitution (138/325, 42.5%), feedback and monitoring (113/325, 34.8%), self-belief (89/325, 27.4%), comparison of behavior (58/325, 17.8%), and associations (51/325, 15.7%). Goals and planning (87/325, 26.8%) and identity (58/325, 17.8%) were present in all modules except Module 2, where participants began learning to monitor thoughts and activities. Natural consequences (51/325, 15.7%) were present in more than half of the modules.

Low-Frequency Techniques (<10%)

Comparison of outcomes, social support, antecedents, and reward and threat occurred on fewer than 5% of pages. No techniques were coded from the regulation, scheduled consequences, or covert learning categories.

CBT and BCT Co-Occurrence

We examined how CBT techniques were behaviorally implemented through the codelivery of BCT mechanisms at the page level. Each co-occurrence corresponds to an intervention page where a CBT technique and a BCT are presented simultaneously, aiming to provide insight into how skills are taught, rehearsed, and reinforced during intervention use (Figure 3).
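The page-level co-occurrence tallying can be sketched as follows; the pages and code labels below are hypothetical examples, whereas in the study the tally ran over all 325 weighted pages.

```python
# Hedged sketch: counting page-level co-occurrences of CBT techniques and BCT
# categories. Page contents below are hypothetical.
from collections import Counter

pages = [
    {"cbt": {"psychoeducation"}, "bct": {"shaping knowledge"}},
    {"cbt": {"skill building"}, "bct": {"repetition and substitution", "feedback and monitoring"}},
    {"cbt": {"psychoeducation", "skill building"}, "bct": {"shaping knowledge"}},
]

co_occurrence = Counter()
for page in pages:
    for cbt_code in page["cbt"]:
        for bct_code in page["bct"]:
            co_occurrence[(cbt_code, bct_code)] += 1  # one count per page with both codes

print(co_occurrence[("psychoeducation", "shaping knowledge")])  # 2
```

A matrix of these counts is what the heatmap in Figure 3 visualizes.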

Psychoeducation showed the largest overlap, co-occurring with all 13 observed BCT categories, most frequently with shaping knowledge (119/325, 36.6% of intervention pages had this overlap). Skill building was also widely scaffolded, most often through repetition and substitution (114/325, 35.1%), feedback and monitoring (83/325, 25.5%), and shaping knowledge (80/325, 24.6%). Activity scheduling was consistently behaviorally reinforced, most prominently through goals and planning (72/325, 22.2%), repetition and substitution (42/325, 12.9%), and shaping knowledge (42/325, 12.9%).

Several core CBT techniques were behaviorally supported at moderate densities. For example, cognitive restructuring was frequently paired with repetition and substitution (43/325, 13.2%) and feedback and monitoring (35/325, 10.8%). Self-monitoring and problem solving were supported by fewer BCT categories (generally 2%‐8% of pages per pairing).

By contrast, mindfulness, guided imagery, and relapse prevention were rarely paired with BCTs, with most co-occurrences appearing on ≤10 pages (≤3.1%). Motivational, consequence-based, and social mechanisms, including reward and threat, social support, comparison of outcomes, and antecedents, were similarly infrequent (≤20 pages, ≤6.2% each).

Figure 3. Mapping of cognitive behavioral therapy techniques to behavior change technique categories in the Mindset digital cognitive behavioral therapy intervention for depression. This heatmap displays how often each cognitive behavioral therapy technique (x-axis) was coded with each behavior change technique category (y-axis) based on page-level content coding of the Mindset smartphone program for adults with major depressive disorder. Cell shading reflects co-occurrence frequency (0‐150; darker cells indicate higher frequency).

Principal Findings

This study aimed to advance transparency and mechanistic understanding in digital mental health by conducting a comprehensive analysis of Mindset, a therapist-guided smartphone CBT intervention for depression. Three primary findings emerged. First, all 16 CBT techniques were present across the intervention’s 8 modules, with a total of 528 instances. Second, Mindset incorporated 37 unique BCTs from 13 of 16 taxonomy categories, coded 878 times throughout the intervention. Shaping knowledge, repetition and substitution, and feedback and monitoring were most prevalent. Third, co-occurrence analysis highlighted the behavioral mechanisms through which CBT strategies were operationalized. High-frequency techniques such as psychoeducation, skill building, and activity scheduling were consistently paired with multiple BCTs that structure practice (repetition and substitution), provide feedback (feedback and monitoring), and support learning (shaping knowledge). In contrast, lower-frequency techniques such as mindfulness and guided imagery showed minimal BCT scaffolding, suggesting they were introduced but not behaviorally reinforced through repeated practice. These findings show how Mindset achieves fidelity to CBT protocols through multilayered behavioral mechanisms while establishing a replicable methodology for describing therapeutic content in a digital intervention.

Mechanism Mapping: Showing Implementation Depth Through Behavioral Scaffolding

This study introduces a mechanism mapping approach that directly addresses limitations of prior evaluations of digital CBT by systematically examining how therapeutic techniques are implemented in user-facing content [25,43,44,63,64]. Specifically, our co-occurrence analysis (Figure 3) builds on previous studies by showing which BCTs scaffold each CBT strategy, making implementation depth visible and quantifiable.

Applying this mechanism mapping approach to Mindset highlighted considerable variation in implementation depth across CBT strategies. High-frequency techniques showed dense scaffolding: psychoeducation drew upon all 13 BCT categories, engaging users through informational content (shaping knowledge), emotional reflection exercises (natural consequences), and repeated practice activities (repetition and substitution). In contrast, mindfulness and guided imagery appeared with minimal BCT support, suggesting in-app introduction without behavioral reinforcement. The content distribution also reflected deliberate design choices: Module 1 was the largest (135 pages vs 9‐28 in other modules) and most densely scaffolded, front-loading introductory skills with instruction, practice, and feedback and laying the groundwork for subsequent modules to build upon.

This variation in implementation would be invisible using CBT-only classification but may help explain variability among apps claiming to deliver “CBT-based” care. Evidence from component-focused meta-analyses reinforces why implementation depth matters: combined cognitive and behavioral processes mediate digital intervention outcomes, whereas techniques delivered on their own showed no effects [45]. Additionally, skill enactment (actively practicing therapeutic strategies) predicts clinical improvement in most trials, whereas knowledge acquisition alone does not [32]. In Mindset, the BCTs supporting behavioral activation and cognitive change, such as instruction, repeated practice, and ongoing monitoring, appeared consistently across modules, creating structured opportunities for skill rehearsal. The intervention was designed so that users had to actively practice and rehearse strategies, not just learn about them. Most importantly, mechanism mapping generates testable hypotheses about content features that may facilitate clinical effects. To our knowledge, no other digital mental health study has visualized or described these relationships, providing a novel analytical framework that is replicable across interventions.

Positioning Mindset Within the Digital Mental Health Landscape

Mindset’s 16 CBT strategies and 37 BCTs exceed typical implementation in digital mental health tools. Prior assessments show that most digital tools include only a median of 3‐4 CBT-based elements [25,29] and 9‐15 BCTs, with conversational agents averaging 15 BCTs (range 4‐30) [63], eating disorder interventions averaging 14 BCTs (range 9‐18) [43], and general mental health apps containing 1‐10 techniques [65]. Our higher technique count compared to prior reviews likely reflects methodological differences in content analysis: we examined every page of the actual intervention, whereas previous studies coded app store descriptions or published summaries [41,44,63,64], which risks overlooking embedded therapeutic elements and overrelying on developer claims. However, technique quantity alone is not a reliable predictor of treatment effects [65]. Interventions grounded in established psychological theory show better outcomes than those lacking theoretical grounding [43], and full CBT packages outperform single-component strategies. Accordingly, Mindset’s extensive technique coverage may matter less than how these techniques are combined and delivered.

The most frequent CBT techniques in Mindset align with what systematic reviews consistently identify in digital interventions: psychoeducation (present in 50%‐75% of apps), behavioral activation including activity planning and scheduling (31%‐68%), and cognitive restructuring (31%‐79%) [25,27,29]. However, many techniques we identified in Mindset are inconsistently reported in large-scale content syntheses. Skill building, for example, appears in 0%‐12% of interventions depending on how broadly it is defined, with some reviews only capturing specific skill types such as social skills training [25,27]. Similarly, techniques such as homework assignment, self-compassion or expressing self-kindness, problem-solving, stimulus control, and emotion labeling were present in Mindset but are rarely systematically documented in existing app reviews. This inconsistency may reflect the absence of standardized CBT technique taxonomies in digital mental health research, making cross-study comparisons difficult and potentially underestimating the therapeutic content actually delivered to users.

The BCTs most strongly associated with engagement in mobile health interventions (feedback on behavior, self-monitoring, and instructions on how to perform a behavior) were among Mindset’s most frequently used [41]. Importantly, these behavioral strategies were not features designed exclusively to keep users engaged; they were directly tied to CBT skill enactment (eg, mood monitoring implemented in service of behavioral activation). This integration contrasts sharply with common commercial designs in which self-monitoring or reminders are detached from core therapeutic mechanisms [25,41]. Mindset’s approach of embedding engagement-related BCTs within therapeutic workflows may explain how the intervention maintains both user engagement and therapeutic fidelity.

Our comprehensive content analysis also sheds light on which CBT strategies received minimal behavioral support and which BCT categories were absent entirely. Techniques such as grounding and mindfulness appeared infrequently and were associated with few BCTs. However, this may reflect the implementation format rather than inadequate support: these strategies were delivered through video demonstrations, which our text-based coding may have undercounted compared to written exercises that explicitly name behavioral techniques. Additionally, entire BCT categories were absent from Mindset, including regulation (eg, reduce negative emotions and conserve mental resources) and scheduled consequences (eg, punishment and reward). However, transparency about what is missing must be interpreted cautiously. Identifying absent strategies or mechanisms does not indicate whether their inclusion would improve outcomes. The value of this analysis lies in enabling transparent evaluation of what tools actually deliver, allowing clinicians and researchers to generate testable hypotheses about which gaps, if any, meaningfully impact outcomes.

Limitations

This analysis has several limitations. First, the study did not statistically link participants’ exposure or adherence to these techniques with clinical outcomes, so we could not examine which techniques were associated with symptom change. Second, the review focused on text-based content, potentially underestimating exposure to interactive audio or video elements that may be more engaging than their scripted counterparts. Third, despite formal BCT training, content coding may have introduced subjective biases. Fourth, Mindset includes therapist-delivered content that was intentionally excluded to isolate digital components, meaning this analysis may underrepresent the full therapeutic exposure users received. Finally, our analysis concentrated on evidence-based CBT techniques and BCTs, without assessing non–evidence-based or potentially contraindicated content. Some mental health apps include elements that conflict with recommended approaches, which may also impact safety and effectiveness. Accordingly, mechanism mapping should be seen as an initial foundation for objective fidelity monitoring rather than a complete measure of treatment quality.

Broader Implications

This study demonstrates that transparent evaluation of digital mental health interventions is feasible and necessary. We offer the following actionable recommendations for advancing quality and accountability in the field. For intervention developers, we recommend auditing existing tools using this dual-coding framework to identify which therapeutic techniques lack behavioral scaffolding, so that techniques can be appropriately reinforced through user action. For researchers, we recommend making the reporting of therapeutic technique frequencies standard practice, enabling meta-analyses to examine implementation depth as a moderator of outcomes; our methodology is fully replicable for any text-based intervention. We encourage clinicians to evaluate apps based on documented content rather than marketing claims and to seek evidence of technique coverage, reinforcement of mechanisms, and transparent reporting of delivered content, which can inform patient-centered recommendations of these tools. Finally, for the field, we hope that this report sets a precedent for minimum reporting standards so that these tools can be more comprehensively evaluated and compared.

Conclusions

The rapid global adoption of digital mental health tools has outpaced our ability to evaluate what they actually deliver. This study introduces mechanism mapping as the first systematic approach to describe which therapeutic strategies are present and how they are behaviorally operationalized through user actions. Unlike prior content analyses that rely on presence or absence coding, our dual-framework methodology reveals implementation depth, distinguishing between techniques that are mentioned versus those supported through repeated practice, feedback, and skill consolidation. This innovation addresses a fundamental transparency gap: most digital interventions function as “black boxes,” making it impossible for clinicians, patients, or researchers to assess therapeutic fidelity.

This replicable framework makes 3 critical contributions to the field: it enables comparative evaluation of digital interventions based on therapeutic content rather than marketing claims, provides testable hypotheses about which implementation features drive clinical effects, and establishes a methodological standard for transparent reporting. In practice, this work supports clinicians in making evidence-based referrals, empowers patients to choose interventions aligned with their therapeutic needs, and guides developers in creating tools with demonstrable fidelity to evidence-based practices.

Future work should extend this approach by integrating behavioral measurement and mechanistic outcomes, applying mechanism mapping across multiple digital interventions to identify common implementation signatures, exploring whether implementation patterns could inform adaptive tailoring of behavioral supports, and incorporating therapist-delivered components in hybrid models to capture the full therapeutic ecosystem. Importantly, transparency must become the standard in digital mental health, not the exception. Understanding how apps work, for whom, and through what processes is critical for the next generation of precision mental health care.

Acknowledgments

Artificial intelligence (AI) was used to generate a table-of-contents image for the manuscript.

Funding

The primary outcome study on which this paper is based was supported by Koa Health (Wilhelm, principal investigator), a National Institute on Drug Abuse (NIDA) Career Development Program in Substance Use and Addiction Medicine (principal investigator AEE, K12 DA043490), and an anonymous donor fund (Wilhelm, principal investigator).

Data Availability

The code-frequency data generated and analyzed during this study are included in this published article (Multimedia Appendix 1). The full intervention pages, underlying code assignments, and codebooks are not publicly available as the digital intervention is commercially licensed. However, summary files of the behavioral categories and cognitive behavioral therapy frequency counts have been provided to enable transparency and independent verification of findings. Additional data may be made available from the corresponding author on reasonable request, subject to commercial licensing allowances.

Authors' Contributions

Conceptualization: GJ

Formal analysis: GJ, JS, HA

Methodology: GJ

Investigation: KB, EB, JG, SW

Project administration: EB, HW

Resources: OH, SW

Supervision: KB, EB, SW

Funding acquisition: SW

Writing – original draft: GJ, JS, HA

Writing – review & editing: PH, KB, EB, HW, JG, OH, SW

Conflicts of Interest

GKJ has received research support from the National Institute on Drug Abuse. JYS has received research support from the Robert Wood Johnson Foundation as well as the International OCD Foundation. EEB has received research support from Koa Health Digital Solutions LLC. HMW receives salary from HabitAware, Inc, is a paid consultant to APA Labs, was formerly a paid consultant to Hello Therapeutics, Inc., is a paid scientific advisory board member to Augmend Health, and has received research support from Koa Health, Inc., National Institute of Mental Health, and President and Fellows of Harvard College. She is a member of the APA’s Mental Health Technology Advisory Committee. SW is a presenter for the Massachusetts General Hospital Psychiatry Academy in educational programs supported through independent medical education grants. She has received royalties from Guilford Publications, New Harbinger Publications, Springer, and Oxford University Press. SW has also received speaking honoraria from various academic institutions and foundations, including the International Obsessive Compulsive Disorder Foundation, the Tourette Association of America, and the Centers for Disease Control and Prevention. In addition, she received honoraria for her role on the Scientific Advisory Board for One-Mind (PsyberGuide), Koa Health Digital Solutions LLC, and Noom, Inc. SW has received research support from Koa Health Digital Solutions LLC. KHB is a paid consultant to Visible Health Inc, and has received research support from the National Institute of Mental Health. 
OTH is Founder and CEO of Koa Health Limited, and receives reimbursement as the Royal Society Entrepreneur in Residence in healthcare artificial intelligence at Oxford University. All other authors have no disclosures to report.

Multimedia Appendix 1

Cognitive behavioral therapy and behavior change technique coding descriptive data.

XLSX File, 22 KB

Checklist 1

TIDieR framework checklist.

DOCX File, 22 KB

  1. Depressive disorder (depression). World Health Organization. URL: https://www.who.int/news-room/fact-sheets/detail/depression [Accessed 2024-11-22]
  2. Friedrich MJ. Depression is the leading cause of disability around the world. JAMA. Apr 18, 2017;317(15):1517. [CrossRef]
  3. Moreno-Agostino D, Wu YT, Daskalopoulou C, Hasan MT, Huisman M, Prina M. Global trends in the prevalence and incidence of depression: a systematic review and meta-analysis. J Affect Disord. Feb 15, 2021;281:235-243. [CrossRef] [Medline]
  4. Lépine JP, Briley M. The increasing burden of depression. Neuropsychiatr Dis Treat. 2011;7(Suppl 1):3-7. [CrossRef] [Medline]
  5. Hansson L. Quality of life in depression and anxiety. Int Rev Psychiatry. Jan 2002;14(3):185-189. [CrossRef]
  6. Snyder HR. Major depressive disorder is associated with broad impairments on neuropsychological measures of executive function: a meta-analysis and review. Psychol Bull. Jan 2013;139(1):81-132. [CrossRef] [Medline]
  7. Porter RJ, Bourke C, Gallagher P. Neuropsychological impairment in major depression: its nature, origin and clinical significance. Aust N Z J Psychiatry. Feb 2007;41(2):115-128. [CrossRef] [Medline]
  8. Dillon DG, Pizzagalli DA. Mechanisms of memory disruption in depression. Trends Neurosci. Mar 2018;41(3):137-149. [CrossRef] [Medline]
  9. Hawton K, Casañas I Comabella C, Haw C, Saunders K. Risk factors for suicide in individuals with depression: a systematic review. J Affect Disord. May 2013;147(1-3):17-28. [CrossRef] [Medline]
  10. Lepping P, Whittington R, Sambhi RS, et al. Clinical relevance of findings in trials of CBT for depression. Eur Psychiatry. Sep 2017;45:207-211. [CrossRef] [Medline]
  11. Twomey C, O’Reilly G, Byrne M. Effectiveness of cognitive behavioural therapy for anxiety and depression in primary care: a meta-analysis. Fam Pract. Feb 2015;32(1):3-15. [CrossRef] [Medline]
  12. Oud M, de Winter L, Vermeulen-Smit E, et al. Effectiveness of CBT for children and adolescents with depression: a systematic review and meta-regression analysis. Eur Psychiatry. Apr 2019;57:33-45. [CrossRef] [Medline]
  13. Renner F, Cuijpers P, Huibers MJH. The effect of psychotherapy for depression on improvements in social functioning: a meta-analysis. Psychol Med. Oct 2014;44(14):2913-2926. [CrossRef] [Medline]
  14. Hofmann SG, Curtiss J, Carpenter JK, Kind S. Effect of treatments for depression on quality of life: a meta-analysis. Cogn Behav Ther. Jun 2017;46(4):265-286. [CrossRef] [Medline]
  15. Arnaez JM, Krendl AC, McCormick BP, Chen Z, Chomistek AK. The association of depression stigma with barriers to seeking mental health care: a cross-sectional analysis. J Ment Health. Apr 2020;29(2):182-190. [CrossRef] [Medline]
  16. Kohn R, Saxena S, Levav I, Saraceno B. The treatment gap in mental health care. Bull World Health Organ. Nov 2004;82(11):858-866. [Medline]
  17. Chekroud AM, Foster D, Zheutlin AB, et al. Predicting barriers to treatment for depression in a U.S. national sample: a cross-sectional, proof-of-concept study. Psychiatr Serv. Aug 1, 2018;69(8):927-934. [CrossRef] [Medline]
  18. Andrade LH, Alonso J, Mneimneh Z, et al. Barriers to mental health treatment: results from the WHO World Mental Health surveys. Psychol Med. Apr 2014;44(6):1303-1317. [CrossRef] [Medline]
  19. Cruz M, Pincus HA, Harman JS, Reynolds CF 3rd, Post EP. Barriers to care-seeking for depressed African Americans. Int J Psychiatry Med. 2008;38(1):71-80. [CrossRef] [Medline]
  20. Serrano-Ripoll MJ, Zamanillo-Campos R, Fiol-DeRoque MA, Castro A, Ricci-Cabello I. Impact of smartphone app-based psychological interventions for reducing depressive symptoms in people with depression: systematic literature review and meta-analysis of randomized controlled trials. JMIR Mhealth Uhealth. Jan 27, 2022;10(1):e29621. [CrossRef] [Medline]
  21. Wang M, Chen H, Yang F, Xu X, Li J. Effects of digital psychotherapy for depression and anxiety: a systematic review and bayesian network meta-analysis. J Affect Disord. Oct 2023;338:569-580. [CrossRef]
  22. Bae H, Shin H, Ji HG, Kwon JS, Kim H, Hur JW. App-based interventions for moderate to severe depression: a systematic review and meta-analysis. JAMA Netw Open. Nov 1, 2023;6(11):e2344120. [CrossRef] [Medline]
  23. Buss JF, Steinberg JS, Banks G, et al. Availability of internet-based cognitive-behavioral therapies for depression: a systematic review. Behav Ther. Jan 2024;55(1):201-211. [CrossRef] [Medline]
  24. Huguet A, Rao S, McGrath PJ, et al. A systematic review of cognitive behavioral therapy and behavioral activation apps for depression. PLoS One. 2016;11(5):e0154248. [CrossRef] [Medline]
  25. Wasil AR, Venturo-Conerly KE, Shingleton RM, Weisz JR. A review of popular smartphone apps for depression and anxiety: assessing the inclusion of evidence-based content. Behav Res Ther. Dec 2019;123:103498. [CrossRef] [Medline]
  26. Bowie-DaBreo D, Sünram-Lea SI, Sas C, Iles-Smith H. Evaluation of treatment descriptions and alignment with clinical guidance of apps for depression on app stores: systematic search and content analysis. JMIR Form Res. Nov 13, 2020;4(11):e14988. [CrossRef] [Medline]
  27. Bubolz S, Mayer G, Gronewold N, Hilbel T, Schultz JH. Adherence to established treatment guidelines among unguided digital interventions for depression: quality evaluation of 28 web-based programs and mobile apps. J Med Internet Res. Jul 13, 2020;22(7):e16136. [CrossRef] [Medline]
  28. Stawarz K, Preist C, Tallon D, Wiles N, Coyle D. User experience of cognitive behavioral therapy apps for depression: an analysis of app functionality and user reviews. J Med Internet Res. Jun 6, 2018;20(6):e10120. [CrossRef] [Medline]
  29. Martinengo L, Stona AC, Griva K, et al. Self-guided cognitive behavioral therapy apps for depression: systematic assessment of features, functionality, and congruence with evidence. J Med Internet Res. Jul 30, 2021;23(7):e27619. [CrossRef] [Medline]
  30. Plessen CY, Panagiotopoulou OM, Tong L, Cuijpers P, Karyotaki E. Digital mental health interventions for the treatment of depression: a multiverse meta-analysis. J Affect Disord. Jan 15, 2025;369:1031-1044. [CrossRef] [Medline]
  31. López-López JA, Davies SR, Caldwell DM, et al. The process and delivery of CBT for depression in adults: a systematic review and network meta-analysis. Psychol Med. Sep 2019;49(12):1937-1947. [CrossRef] [Medline]
  32. Jackson HM, Calear AL, Batterham PJ, Ohan JL, Farmer GM, Farrer LM. Skill enactment and knowledge acquisition in digital cognitive behavioral therapy for depression and anxiety: systematic review of randomized controlled trials. J Med Internet Res. May 31, 2023;25:e44673. [CrossRef] [Medline]
  33. Hundt NE, Mignogna J, Underhill C, Cully JA. The relationship between use of CBT skills and depression treatment outcome: a theoretical and methodological review of the literature. Behav Ther. Mar 2013;44(1):12-26. [CrossRef] [Medline]
  34. Michie S, Richardson M, Johnston M, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. Aug 2013;46(1):81-95. [CrossRef] [Medline]
  35. Bobrow K, Brennan T, Springer D, et al. Efficacy of a text messaging (SMS) based intervention for adults with hypertension: protocol for the StAR (SMS Text-message Adherence suppoRt trial) randomised controlled trial. BMC Public Health. Jan 11, 2014;14(1):1-9. [CrossRef] [Medline]
  36. Devi R, Powell J, Singh S. A web-based program improves physical activity outcomes in a primary care angina population: randomized controlled trial. J Med Internet Res. Sep 12, 2014;16(9):e186. [CrossRef] [Medline]
  37. Yang CH, Maher JP, Conroy DE. Implementation of behavior change techniques in mobile applications for physical activity. Am J Prev Med. Apr 2015;48(4):452-455. [CrossRef] [Medline]
  38. Garnett C, Crane D, West R, Brown J, Michie S. Identification of behavior change techniques and engagement strategies to design a smartphone app to reduce alcohol consumption using a formal consensus method. JMIR Mhealth Uhealth. Jun 29, 2015;3(2):e73. [CrossRef] [Medline]
  39. Struik L, Rodberg D, Sharma RH. The behavior change techniques used in Canadian online smoking cessation programs: content analysis. JMIR Ment Health. Mar 1, 2022;9(3):e35234. [CrossRef] [Medline]
  40. Webster R, Michie S, Estcourt C, Gerressu M, Bailey JV, Group MT. Increasing condom use in heterosexual men: development of a theory-based interactive digital intervention. Transl Behav Med. Sep 2016;6(3):418-427. [CrossRef] [Medline]
  41. Milne-Ives M, Lam C, De Cock C, Van Velthoven MH, Meinert E. Mobile apps for health behavior change in physical activity, diet, drug and alcohol use, and mental health: systematic review. JMIR Mhealth Uhealth. Mar 18, 2020;8(3):e17046. [CrossRef] [Medline]
  42. Goulding EH, Dopke CA, Rossom RC, et al. A smartphone-based self-management intervention for individuals with bipolar disorder (LiveWell): empirical and theoretical framework, intervention design, and study protocol for a randomized controlled trial. JMIR Res Protoc. Feb 21, 2022;11(2):e30710. [CrossRef] [Medline]
  43. Thomas PC, Curtis K, Potts HWW, et al. Behavior change techniques within digital interventions for the treatment of eating disorders: systematic review and meta-analysis. JMIR Ment Health. Aug 1, 2024;11:e57577. [CrossRef] [Medline]
  44. Chiang CP, Hayes D, Panagiotopoulou E. Apps targeting anorexia nervosa in young people: a systematic review of active ingredients. Transl Behav Med. Jun 9, 2023;13(6):406-417. [CrossRef] [Medline]
  45. Watkins E, Newbold A, Tester-Jones M, Collins LM, Mostazir M. Investigation of active ingredients within internet-delivered cognitive behavioral therapy for depression: a randomized optimization trial. JAMA Psychiatry. Sep 1, 2023;80(9):942-951. [CrossRef] [Medline]
  46. Wilhelm S, Bernstein EE, Bentley KH, et al. Feasibility, acceptability, and preliminary efficacy of a smartphone app-led cognitive behavioral therapy for depression under therapist supervision: open trial. JMIR Ment Health. Apr 9, 2024;11(1):e53998. [CrossRef] [Medline]
  47. Karyotaki E, Efthimiou O, Miguel C, et al. Internet-based cognitive behavioral therapy for depression: a systematic review and individual patient data network meta-analysis. JAMA Psychiatry. Apr 1, 2021;78(4):361-371. [CrossRef] [Medline]
  48. Gratzer D, Khalid-Khan F. Internet-delivered cognitive behavioural therapy in the treatment of psychiatric illness. CMAJ. Mar 1, 2016;188(4):263-272. [CrossRef] [Medline]
  49. Williams JBW. A structured interview guide for the Hamilton Depression Rating Scale. Arch Gen Psychiatry. Aug 1, 1988;45(8):742. [CrossRef]
  50. Gautam M, Tripathi A, Deshmukh D, Gaur M. Cognitive behavioral therapy for depression. Indian J Psychiatry. Jan 2020;62(Suppl 2):S223-S229. [CrossRef] [Medline]
  51. Michie S, Johnston M, Richardson M, Francis J, Abraham C, Wood CE, et al. BCTTv1 online training. BCT Taxonomy. URL: https://www.bct-taxonomy.com [Accessed 2026-03-14]
  52. Carey RN, Connell LE, Johnston M, et al. Behavior change techniques and their mechanisms of action: a synthesis of links described in published intervention literature. Ann Behav Med. Jul 17, 2019;53(8):693-707. [CrossRef] [Medline]
  53. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. Apr 24, 2012;7(1):37. [CrossRef] [Medline]
  54. Willmott TJ, Pang B, Rundle-Thiele S. Capability, opportunity, and motivation: an across contexts empirical examination of the COM-B model. BMC Public Health. May 29, 2021;21(1):1014. [CrossRef] [Medline]
  55. Richards KAR, Hemphill MA. A practical guide to collaborative qualitative data analysis. J Teach Phys Educ. Apr 2018;37(2):225-231. [CrossRef]
  56. Salmona M, Lieber E, Kaczynski D. Qualitative and Mixed Methods Data Analysis Using Dedoose: A Practical Approach for Research across the Social Sciences. Sage Publications; 2019. ISBN: 1506397808
  57. O’Connor C, Joffe H. Intercoder reliability in qualitative research: debates and practical guidelines. Int J Qual Methods. Jan 1, 2020;19:1609406919899220. [CrossRef]
  58. Burla L, Knierim B, Barth J, Liewald K, Duetz M, Abel T. From text to codings: intercoder reliability assessment in qualitative content analysis. Nurs Res. 2008;57(2):113-117. [CrossRef] [Medline]
  59. R Core Team. R: a language and environment for statistical computing. R Foundation for Statistical Computing; 2020. URL: https://www.r-project.org/ [Accessed 2026-04-10]
  60. Kolde R. Package ‘pheatmap’. R package; 2015. URL: https://cran.r-project.org/web/packages/pheatmap/index.html [Accessed 2026-04-10]
  61. Gehred AP. Canva. JMLA. 2020;108(2):338. [CrossRef]
  62. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: Template for Intervention Description and Replication (TIDieR) checklist and guide. BMJ. Mar 7, 2014;348:g1687. [CrossRef] [Medline]
  63. Lin X, Martinengo L, Jabir AI, et al. Scope, characteristics, behavior change techniques, and quality of conversational agents for mental health and well-being: systematic assessment of apps. J Med Internet Res. Jul 18, 2023;25:e45984. [CrossRef] [Medline]
  64. Martinengo L, Jabir AI, Goh WWT, et al. Conversational agents in health care: scoping review of their behavior change techniques and underpinning theory. J Med Internet Res. Oct 3, 2022;24(10):e39243. [CrossRef] [Medline]
  65. Alqahtani F, Al Khalifah G, Oyebode O, Orji R. Apps for mental health: an evaluation of behavior change strategies and recommendations for future development. Front Artif Intell. 2019;2:30. [CrossRef] [Medline]


BCT: behavior change technique
BCTTv1: behavior change technique Taxonomy version 1
CBT: cognitive behavioral therapy
COM-B: Capability, Opportunity, Motivation–Behavior
HIPAA: Health Insurance Portability and Accountability Act
TIDieR: Template for Intervention Description and Replication


Edited by Stefano Brini; submitted 12.Sep.2025; peer-reviewed by Babak Najand, Justin Angel; final revised version received 12.Jan.2026; accepted 19.Jan.2026; published 16.Apr.2026.

Copyright

© Geneva K Jonathan, Jenna Y Sung, Heyli T Arcese, Phoebe Holz, Kathryn H Bentley, Emily E Bernstein, Hilary M Weingarden, Jennifer L Greenberg, Oliver T Harrison, Sabine Wilhelm. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 16.Apr.2026.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.