Review
Abstract
Background: Digital health interventions (DHIs) harness technological innovation to address challenges in the accessibility and scalability of health care. However, the effectiveness of DHIs is challenged by low user engagement and adherence, as users tend to drop out over time. The supportive accountability model (SAM) is a theoretical framework designed to enhance adherence to DHIs by incorporating structured human support.
Objective: Guided by SAM, this scoping review answers the following research questions: (1) What is the extent of research on human support factors and their influence on engagement with and adherence to DHIs? and (2) What is the extent of research applying SAM (ie, accountability, bond, and legitimacy) to improve engagement with and adherence to DHIs?
Methods: Our search strategy followed the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews). We conducted our literature search using 6 databases selected based on relevance to our research topic: MEDLINE, PsycINFO, Embase, CINAHL, Scopus, and ClinicalTrials.gov. Search terms included (“human support” OR “supportive accountability”) AND (engagement OR adherence) AND intervention, applied to titles, abstracts, and keywords. Hand-searching was also used to identify additional relevant articles. Two authors (SPYC and GK) screened articles in multiple rounds using predefined inclusion and exclusion criteria. The final sample consisted of 36 empirical, peer-reviewed articles published in scholarly journals. All articles examined human-supported DHIs.
Results: Implementation of human support among the interventions varied by the source, delivery method, and frequency and duration of support. Overall, there were inconsistencies in the application of SAM to intervention designs. Support was provided by 4 main groups: peers and peer specialists, health experts and practitioners, trained coaches, and members of the research study team. Modes of communication included phone or video calls, as well as text-based support, such as messaging or email. The frequency and duration of support varied across studies and were influenced by the communication method used, with more structured and frequent contact occurring in interventions that relied on synchronous support, such as phone or video calls. In addition, we found that some studies used human support as the primary mode of intervention delivery rather than as an adjunctive tool focused on improving engagement and adherence, as proposed by SAM. Beyond accountability, there was also little explicit focus on the other constructs within the model (ie, bond and legitimacy).
Conclusions: This scoping review highlights the current use of human support to promote DHI adherence and reveals gaps in the application of SAM. Future research should address all core SAM components—not just accountability—and ensure human support is used as an adjunct to enhance engagement. These steps can help maximize the impact of DHIs on health care access and outcomes.
doi:10.2196/72639
Introduction
Digital Health Interventions
Digital health interventions (DHIs) encompass a range of technologies and applications designed to enhance health outcomes, improve health care delivery, and promote patient engagement. DHIs leverage advancements in mobile health (mHealth), telemedicine, wearable devices, and health information systems to transform the health care landscape by offering scalable, accessible, and cost-effective solutions for health concerns such as cardiovascular disease and mental illness [-]. However, the application of DHIs is challenged by low user engagement and adherence, or the “law of attrition,” a phenomenon in which users tend to drop out of eHealth interventions over time []. User engagement refers to the degree to which users interact with the intervention [], while adherence reflects the extent to which users follow prescribed usage patterns []. Because both are crucial factors in determining the effectiveness of DHIs [,], it is imperative to understand the factors related to engagement and adherence.
Theoretical Foundations: Self-Determination and Social Cognitive Theories
The influence of DHIs on behavioral change, including user engagement and adherence, is grounded in several psychological and behavioral theories. Two such theories—self-determination theory (SDT) and social cognitive theory (SCT)—offer complementary insights into motivation, self-regulation, and social influence in the context of DHIs.
SDT posits that social environments and interventions are most effective when they support 3 basic psychological needs: autonomy (the sense that one’s behavior is self-directed and volitional), relatedness (feeling connected to others), and competence (feeling effective and capable of achieving goals) []. When these needs are met, individuals are more likely to internalize motivation, resulting in lasting behavioral change. SDT views motivation as a continuum ranging from amotivation (no motivation) to intrinsic motivation (engagement driven by inherent interest or enjoyment) []. In between lie controlled regulation (including external and introjected regulation) and autonomous regulation (including identified, integrated, and intrinsic regulation). Behaviors driven by more autonomous forms of motivation are generally more persistent and robust across time [].
SCT similarly focuses on the internal and social processes that guide behavior, emphasizing self-regulation as a key mechanism. According to SCT, behavior is shaped through a dynamic interplay between personal factors (such as beliefs and self-efficacy), environmental influences (including social norms and reinforcement), and behavioral capabilities (acquired through observation and experience) [,]. Central to SCT is the construct of self-efficacy—one’s belief in their ability to perform a specific behavior—which influences the goals people set, the effort they expend, and their persistence in the face of challenges. SCT also emphasizes the role of observational learning, outcome expectations, and social support in shaping health behaviors, making it especially relevant to DHIs that incorporate human support or social features [].
Human Support Factors in DHIs
Human support has been identified as an important factor for enhancing both engagement with and adherence to DHIs. Human support factors encompass various forms of support, including social support from peers, human coaching and tailored support, and the involvement of health care providers. These collectively create an environment that motivates users to engage with and adhere to interventions [,]. By providing users with accountability and feedback, human support can significantly improve the outcomes of DHIs [,]. For example, several studies have highlighted the importance of peer support for enhancing engagement with DHIs. Peer relationships uniquely create reciprocal accountability, wherein users hold each other responsible for achieving goals (eg, see []). Peer support can foster autonomy, shared experiences, and a sense of belonging, collectively influencing engagement through mechanisms such as goal setting, task agreement, and social bonding []. Peer support specialists offer a distinct form of this support; for example, they may include certified peer specialists or recovery coaches who share a similar mental health diagnosis, are in recovery, and provide peer support services []. They can leverage familiarity, perceived similarity, and trust [] to develop a bond with consumers [] that facilitates engagement with DHIs [].
The application of human coaching in DHIs has also demonstrated a positive impact on user engagement. Coaches can provide personalized guidance, address challenges, and motivate users to continue using the intervention []. Tailoring coaching to individual needs and preferences can amplify its impact. Similarly, health care providers can play a significant role in influencing user engagement with DHIs. Providers can introduce and promote DHI use to their patients, offer guidance on integrating DHIs into existing treatment plans, and address any concerns or questions that may arise []. Ongoing support from a trusted health care professional can increase user buy-in and commitment to the intervention [].
Human support can play a crucial role in fulfilling the basic psychological needs outlined in SDT, thereby enhancing engagement with DHIs and promoting lasting behavior change. By fostering autonomy, competence, and relatedness, supportive interactions can enhance individuals’ intrinsic motivation, which is linked to higher levels of adherence. For instance, personalized feedback and encouragement from health care providers can strengthen users’ sense of competence in managing their health. In a study by Höchsmann and colleagues [], an mHealth intervention designed for individuals with type 2 diabetes incorporated autonomy-supportive features—such as user-driven goal setting—and tailored recommendations to meet individual needs. Their findings demonstrated improved adherence in the intervention group compared with the control group [].
Similarly, support from peers and health care professionals can provide powerful modeling and reinforcement, key mechanisms in SCT that help sustain user engagement and adherence []. A review of SCT-based interventions found that incorporating social support features, such as peer support or coaches, was associated with improved health outcomes []. Emerging research also highlights the synergistic potential of combining SDT and SCT principles in the design of DHIs. For example, Smith and colleagues [] developed a mobile app for obesity prevention among adolescent boys in low-income communities that integrated goal-setting tools (supporting SDT principles of autonomy, competence, and relatedness) with social comparison features (reflecting SCT’s concept of observational learning). Compared with the control group, participants using the app showed significant improvements in muscular fitness, movement skills, and key weight-related behaviors [].
Supportive Accountability Model
One theoretical framework that has gained prominence in understanding the role of human support in DHIs is the supportive accountability model (SAM) []. SAM suggests that providing accountability in a supportive manner can greatly enhance user engagement and adherence []. Specifically, a human supporter who provides encouragement, monitors progress, and holds users accountable can boost users’ intrinsic motivation, resulting in more sustained engagement with and adherence to the intervention. SAM proposes 3 interpersonal or human support factors that could lead to better adherence: accountability, bond, and legitimacy.
Accountability refers to the expectation that an individual may have to justify their actions to their accountability partners []. In SAM, accountability is established through reinforcing social presence, expectations (ie, goal setting), and performance monitoring. Accountability and DHI adherence can then be enhanced by the bond between the human support and intervention participant, as well as legitimacy (ie, trustworthiness, benevolence, and expertise) []. In short, people adhere better when they are held accountable, particularly when they are held accountable by people they respect and care about []. Drawing from SDT, the concept of autonomous accountability—an internal, self-endorsed motivation to meet expectations—has been posited to yield more sustainable adherence than controlled accountability, which is driven by external pressure or obligation []. Oussedik et al [] further developed autonomous and controlled accountability in a modified version of Bandura’s [] SCT, illustrating how social norms, interpersonal interactions, and supportive relationships influence accountability and, in turn, behavioral adherence. Accountability can also be conceptualized through self-observation, a core SCT process wherein individuals monitor their behaviors with the awareness that others are observing their progress. Through this lens, accountability may foster a greater sense of personal responsibility and satisfaction as individuals self-regulate their behaviors to achieve their health goals [].
Studies applying SAM in DHIs have demonstrated its effectiveness in various contexts. For instance, mental health DHIs that incorporate supportive accountability from therapists or coaches have shown higher rates of user adherence and better clinical outcomes [,]. Similarly, DHIs for weight management that include regular check-ins with a coach have resulted in more significant weight loss and improved adherence than those without such support [,].
Taxonomy of DHIs
With the rapid advancements in the field of DHIs, there is a need to categorize DHIs based on their core components, such as the type of support provided, level of personalization, and mode of delivery, to guide the conceptualization and development of future DHIs. Pineda et al [] developed a framework to present 4 types of digital mental health interventions (DMHIs) by categorizing interventions based on the level of human involvement, from primary content delivery (ie, high involvement) to adjunctive support (ie, low involvement). The taxonomy is categorized into the following types: (1) provider-administered DMHIs (delivered asynchronously or synchronously by practitioners via a digital platform); (2) provider-administered DMHIs with blended digital adjuncts (delivered asynchronously or synchronously by practitioners via a digital platform, with digital adjuncts to enhance the interventions); (3) self-help DMHIs with human support (self-help digital tools supplemented with human support to reduce the substantial proportion of dropouts); and (4) fully automated self-help DMHIs (without human support).
The taxonomy highlights that adding human support to otherwise self-guided or automated interventions, which the authors term “Type 3: human-supported self-help interventions,” can achieve better engagement and adherence outcomes, particularly when elements of SAM are incorporated []. This distinction is important because it guides how SAM is applied in research and underscores the crucial role of human support in the effective implementation of DHIs, offering a structured approach for evaluating and designing these interventions.
Despite the recognized importance of human support and the promising potential of SAM, the extent to which these factors have been applied in DHI studies remains unclear. The goal of this scoping review was to answer the following questions: (1) What is the extent of research on human support factors and their influence on engagement with and adherence to DHIs? (2) What is the extent of research applying SAM (ie, accountability, bond, and legitimacy) to improve engagement with and adherence to DHIs? Results can guide future research and inform the design and implementation of DHIs that encourage adherence, thereby optimizing the efficacy of interventions in promoting health and well-being.
Methods
The study procedure was informed by the scoping review methodological framework described in Arksey and O'Malley []. This framework follows a 5-stage approach: (1) identifying guiding research questions, (2) searching for relevant studies, (3) selecting those to be included, (4) charting data, and (5) collating and summarizing data and reporting results. This review protocol was not registered prior to beginning the review.
Stage 1: Identifying Research Questions
Using SAM [] as a guiding framework, we developed the following research questions to investigate: (1) What is the extent of research on human support factors and their influence on engagement with and adherence to DHIs? and (2) What is the extent of research applying SAM (ie, accountability, bond, and legitimacy) to improve engagement with and adherence to DHIs?
Stage 2: Identifying Relevant Studies
Our search strategy followed the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) framework []. First, the following 6 databases were selected for our literature search: MEDLINE, PsycINFO, Embase, CINAHL, Scopus, and ClinicalTrials.gov. These databases were selected based on their relevance to the topic.
The following search terms were entered into each database’s search engine in February 2024: (“human support” OR “supportive accountability”) AND (engagement OR adherence) AND intervention. The terms “human support” and “supportive accountability” were grounded in the SAM [] and reflect core constructs relevant to human-supported DHIs. “Adherence” was informed by the MeSH term “adherence interventions,” which encompasses strategies aimed at improving compliance with health interventions. “Engagement” is a controlled vocabulary term commonly used in digital health literature to describe user interaction with interventions, including in journals such as the Journal of Medical Internet Research (eg, “user engagement”). These terms were searched within article titles, abstracts, and keywords. When the options were available, search results were limited to articles from peer-reviewed journals, empirical studies, and articles written in English. We also excluded reviews (ie, scoping, systematic, narrative) when possible since one of our inclusion criteria (listed in the next section) was that the article must be an empirical study. The results of each search were exported into database-specific Excel (Microsoft Corp) sheets that contained article metadata, including title, year, author(s), journal, and abstract.
Additionally, we conducted a hand search by reviewing the “Cited By” and references lists of included articles to identify additional relevant studies. This manual step helped address limitations in database indexing and variations in author-assigned keywords that may have excluded relevant articles from our initial search. Data extraction was conducted by the second author (SPYC). The independent review process for study selection, including how discrepancies were resolved between raters, is detailed in the next section.
Stage 3: Selection of Studies
Our search yielded 217 articles across all 6 databases. The number of results generated in each database search is shown in . Our search using ClinicalTrials.gov yielded 0 results and was therefore excluded. Among the 217 articles, 134 duplicates were removed, resulting in 83 remaining articles.
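To make this deduplication step concrete, the following minimal Python sketch shows how records exported from several database searches might be combined and screened for duplicates by normalized title. It is not part of the review's actual workflow; the file names and the assumed "title" column are hypothetical.

```python
import pandas as pd

# Hypothetical export files, one spreadsheet per database search (names are assumptions)
EXPORTS = ["medline.xlsx", "psycinfo.xlsx", "embase.xlsx",
           "cinahl.xlsx", "scopus.xlsx"]

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so near-identical titles match."""
    return " ".join(str(text).lower().split())

# Combine all database exports into one table, tagging each row with its source file
frames = [pd.read_excel(path).assign(source=path) for path in EXPORTS]
records = pd.concat(frames, ignore_index=True)

# Drop duplicate records on a normalized title key (assumes a 'title' column in every export)
records["title_key"] = records["title"].map(normalize)
deduped = records.drop_duplicates(subset="title_key", keep="first")

print(f"{len(records)} records retrieved; "
      f"{len(records) - len(deduped)} duplicates removed; "
      f"{len(deduped)} unique records remain for screening.")
```

In practice, deduplication in reviews of this kind may also compare DOIs or author-year pairs; the title-based rule above is only the simplest illustration.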

The study team (GK and SPYC) reviewed all the article abstracts to screen for relevance and eligibility. Inclusion criteria required that the article be (1) peer-reviewed, (2) published in a scholarly journal, (3) an empirical study (quantitative, qualitative, or mixed methods), (4) focused on a DHI that used human support, (5) written in English, and (6) available in full text online. Criteria (1) and (2) ensured that the included articles had undergone peer review and met a basic standard of scientific rigor, which resulted in excluding white papers, theses, and dissertations. Criterion (3) excluded nonempirical works such as reviews, commentaries, study protocols, conceptual or theoretical papers, and papers on intervention development because our guiding questions focus on the application of a theoretical framework in empirical research. Criterion (4) was required to ensure relevance to our guiding questions about research using SAM, a theoretical framework that is centered around the role of human support in DHI adherence. Finally, criteria (5) and (6) were decided upon for pragmatic reasons, as the authors needed to be able to access, download, and read the full articles.
The 83 abstracts were independently reviewed by the first two authors (SPYC and GK) based on the inclusion criteria and relevance to our research questions. Following the initial screening, they met to discuss which articles should be excluded. In cases of uncertainty or disagreement, the full text was retrieved and consulted to determine whether the study adequately addressed the guiding research questions []. Only after both authors reached a consensus on the article was it included or excluded at this stage. Among the 83 articles, 47 were excluded due to irrelevance, and 36 full articles were downloaded for further review. Among these excluded articles were measurement validation studies, study protocols, scoping and systematic reviews (not an empirical study), conceptual papers, commentaries, and a case study. These were excluded either because they were not an empirical study or because they did not study engagement with or adherence to a DHI.
The first two authors independently reviewed each full article, noting those that were unrelated to the research questions. In another peer debriefing, the team met to determine the final sample. We excluded 12 additional articles due to irrelevance, including interventions that did not use a human support element, did not examine engagement or adherence, or focused mainly on feasibility and implementation outcomes. These were considered irrelevant because they did not reflect key concepts of SAM. To supplement the sample, the team reviewed the reference lists and Google Scholar “Cited by” lists of the remaining 24 articles. Reference lists had an average of 45.8 citations (median=40), while Cited By lists, retrieved via Google Scholar in February 2024, had an average of 51.9 articles (median=34.5). We identified 12 more articles through this hand-searching strategy, resulting in a final analytic sample of 36 articles.
Stage 4: Charting the Data
The narrative review tradition typically applies an analytical framework to standardize data collection []. In addition to collecting conventional metadata (eg, title, author, year, journal, findings), information about the key components of SAM (ie, accountability, legitimacy, and bond) was also collected. The following data were collected from each article: title, author(s), year, journal, study design, stage of study (clinical trials), measurement instruments used (if any), engagement or adherence measure (if any), sampling strategy, sample size, study location (country), study population (age and health status), engagement model (if any), intervention target outcome, human support strategy, presence of a training description, findings related to engagement and adherence, source of human support, frequency of human support interactions, mode of communication, duration of interaction, and comparison group(s).
Stage 5: Collating, Summarizing, and Reporting Results
The second author (SPYC) first identified preliminary themes from the data charted in Stage 4. After preliminary themes were generated, the first author (GK) and second author (SPYC) continued to develop and refine the themes until both authors agreed that they captured the review’s findings. Strategies for rigor included team debriefings and an audit trail to keep track of decisions made throughout the analysis [].
Results
Description of Studies
The results of the search are shown in . The distribution of research designs among the articles included in our scoping review (N=36) is as follows: 25 (69%) were quantitative, 7 (19%) were mixed methods, and 4 (11%) were qualitative studies. Most studies (30/36, 83%) were conducted with adult participants, while only 6 (6/36, 17%) examined engagement in interventions with youth. Two-thirds (24/36, 67%) of the studies were conducted in the United States. Other study locations included Australia and New Zealand, the Netherlands, Spain, Norway, Malaysia, Singapore, England, and the United Kingdom. Over one-third (14/36, 39%) of the studies were randomized controlled trials, while an additional 2 studies were randomized trials.
Human Support: Sources, Mode of Communication, Frequency, and Duration
Overview
The implementation of human support among the interventions varied by the source of the support (ie, who is providing the support?), the delivery of the support (ie, how do human supports contact the participants?), and the frequency and duration of the support (ie, how often and how long is each contact?). We found that sources of human support across interventions could be categorized into 4 groups: (1) peers and peer specialists; (2) health experts and practitioners; (3) platform or product experts and trained coaches; and (4) members of the research study team. The frequency and duration of contact between human support and the participants (eg, weekly, 3 times a week) appeared to be related to the mode of communication (eg, phone calls, text messaging). For example, text messages occurred more frequently than phone calls.
Sources
Peers and Peer Specialists
Studies by Blonigen et al [], Duffecy et al [], Duffecy et al [], Duffecy et al [], Ho et al [], Lattie et al [], Lederman et al [], Possemato et al [], and Tomasino et al [] used peer support to engage participants in their respective DHI studies. Of these, 7 [-,] integrated social networking features into their intervention platforms, which are comparable to those found on popular social media sites such as Facebook. For example, these features allowed participants to view each other’s activity on the platform [,-,], create and reply to each other’s posts [,,,], and send notifications, sometimes referred to as “nudges,” to peers who had been inactive recently [,,,]. The notification feature would remind inactive users to reengage with intervention tasks, such as viewing educational content or completing exercises on the platform [,,,].
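To illustrate how such a "nudge" feature might operate, the following is a minimal Python sketch of an inactivity-based reminder rule. The data structure, the 7-day threshold, and the message text are illustrative assumptions rather than details drawn from any of the reviewed platforms.

```python
from datetime import datetime, timedelta

# Hypothetical activity log: participant ID -> last time they used the platform
last_active = {
    "peer_01": datetime(2024, 2, 1),
    "peer_02": datetime(2024, 2, 14),
    "peer_03": datetime(2024, 1, 20),
}

INACTIVITY_THRESHOLD = timedelta(days=7)  # assumed cutoff for "recently inactive"

def peers_eligible_for_nudge(log, now, threshold=INACTIVITY_THRESHOLD):
    """Return participants whose last activity is older than the threshold."""
    return [pid for pid, seen in log.items() if now - seen > threshold]

def send_nudge(sender, recipient):
    """Placeholder for the platform's notification mechanism."""
    print(f"{sender} nudged {recipient}: 'We miss you! Check out this week's lesson.'")

now = datetime(2024, 2, 15)
for peer in peers_eligible_for_nudge(last_active, now):
    send_nudge("peer_02", peer)
```

In the reviewed interventions, such reminders were triggered by peers rather than automatically, but the underlying logic of flagging recently inactive users is the same.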
In the studies wherein human support was delivered via social networking features, support came from peers who were themselves study participants. In contrast, Blonigen et al [] used peer specialists who were not study participants to increase engagement. The peer specialists were employed by the Veterans Health Administration to provide human support to participants. In the Veterans Health Administration, peer specialists are veterans who are in recovery from substance use or mental health problems and are trained to provide services to other veterans who are struggling with similar problems using their lived experience []. Their role includes providing patients with emotional support, serving as role models for self-management of health issues, and assisting patients with navigating the health care system. In this study, peer specialists would use their lived experience and support to facilitate participants’ engagement with the intervention for unhealthy alcohol use. This type of peer support emphasizes mentorship from those who have more experience overcoming a challenge, contrasting with the interventions described above, which relied on cohort members to support one another in engaging with the interventions.
Health Experts and Practitioners
Of the 36 studies, 10 (28%) investigated interventions wherein various health experts and practitioners provided human support. Health professionals included nurses and nurse interns [,]; a behavioral weight loss interventionist []; dietitians []; mental health providers, such as clinical psychologists [,]; therapists [-]; and clinic staff [,]. The interventions in Chhabria et al [] and Jesuthasan et al [] also drew human support from experts across multiple disciplines. For example, Jesuthasan et al [] used a multidisciplinary team comprising dietitians, physiotherapists, fitness coaches, pharmacists, and medical advisors to deliver various components of the intervention, providing individualized therapeutic content to participants.
Trained Coaches
Although the term “coach” may broadly encompass various disciplines (eg, peer specialists, health experts and practitioners, and platform or product experts), coaches (eg, a recovery coach) do not necessarily have professional training related to the content of the intervention. Therefore, coaches often receive additional training tailored to the intervention. Of the 36 studies, 14 (39%) explicitly used the term “coach” to describe the sources of human support in their studies [,,,-]. One additional study described its human support as “telephone counselors” [], who were trained by the study team to provide support and advice, facilitate problem-solving, and promote participants’ use of the intervention platform. Similarly, in a commercialized intervention platform (Naluri), Jesuthasan et al [] employed and trained dedicated staff to provide support for their users.
Research Study Team
Like coaches, research team members were not professionally trained for the content of the intervention. However, their roles as human support were less focused on the intervention content and more focused on research study-specific questions or tasks. In 3 (3/36, 8%) studies, members of the research team provided general support to participants [-]. In the study by Cheng et al [], participants were guided through the intervention by an animated digital therapist with interactive components; research staff, however, were available to participants via phone or email for inquiries about research-related processes, such as creating an account on the intervention platform. In the study by Glasgow et al [], study team members provided initial support to participants during the early phase, while later support came from a diabetes care manager or a nutritionist. In the study by Kelders et al [], the research team members wrote automated SMS text messages and scheduled them to be sent to participants 3 times a week.
Mode of Communication, Frequency, and Duration
The frequency and duration of contact with those providing human support varied and were often determined by the mode of communication (). Several interventions also offered participants options from which to select their preferred method of communication with human support.
Phone and Video Chat
Among the 23 (23/36, 64%) interventions in which human support contacted participants via phone call or video chat, 10 (10/23, 43%) involved weekly contact, and 2 (2/23, 9%) involved contact every other week. One intervention (1/23, 4%) involved a tapered frequency, beginning with weekly contact for 8 weeks, then gradually decreasing to monthly for 2 months []. Four interventions (4/23, 17%) scheduled contact based on the intervention curriculum and schedule. For example, human support in the study by Dennison et al [] contacted participants after weeks 1 and 4, while human support in the study by Glasgow et al [] contacted participants after weeks 2 and 8, as well as at a 4-month follow-up. Coaches in the study by Baron et al [] contacted participants after every 2 modules the latter completed. In another study, participants received 5 peer support sessions throughout the 8-week intervention []. Two interventions that used only phone-based support offered more frequent contact, ranging from twice a week [] to 5 times a week []. The duration of interactions between human support and participants was reported by 15 (15/23, 65%) studies. Across these studies, contact lasted between 5 minutes and 50 minutes. Of these 15 studies, 3 (20%) began with longer sessions to build rapport between the support and participants, as well as to orient participants to the intervention. These initial “engagement” sessions were about 20 minutes in the study by Baron et al [] and 30 minutes to 45 minutes in the studies by Mohr et al [,]. Of the 15 studies, 2 (13%) provided support through group-based videoconferencing sessions, which lasted approximately 20 minutes to 30 minutes [,].
Text-Based Support
In total, 23 (23/36, 64%) interventions used or offered text-based human support through SMS messaging, emails, private messaging, or forum posts on the intervention platform. Text-based communication, which tends to be asynchronous, allowed for a greater frequency of participant interactions with human support. In fact, 13 of the 23 (57%) text-based support interventions used an asynchronous model of continuous support, enabling participants to send messages to their human support at any time. On the other hand, 3 studies [,,] used synchronous text messaging exchanges. For example, Sayegh et al [] reported 3 to 5 text conversations a week, each lasting approximately 5 minutes. Some studies (4/23, 17%) used text-based communication as a supplementary form of support between calls, enabling participants and coaches to remain connected in the interim [,,,].
Applying SAM
Use of SAM
Although we devised our inclusion criteria for this scoping review based on SAM by Mohr et al [], not all studies indicated SAM as a guiding model for their intervention designs. Of the 36 studies, 22 (61%) justified the incorporation of a human support component using SAM (see ). Although 4 (4/36, 11%) others did not explicitly identify SAM, the seminal article by Mohr et al [] was cited [,,,]. Mohr was a co-author in 2 of these studies [,]. The following section focuses on the extent to which accountability, legitimacy, and bond were incorporated in the 22 studies that specifically identified SAM when describing intervention development and considerations for enhancing engagement.
Accountability
All but one study (21/22, 95%) described how the intervention design encouraged accountability. The most common strategies included checking in (social presence), discussing intervention usage (expectation—process accountability; eg, [,,,,,]), reviewing and setting personal goals (expectation—goal setting; eg, [,,,,]), and creating plans for continued engagement and progress toward goals (goal setting; eg, [,,]). In some cases, these strategies were coupled with monitoring of participants’ activities on the intervention platforms (performance monitoring), such as mobile apps or websites (eg, [,,,,,,]). Among the interventions that relied on mutual peer support from participants, the use of social media features such as news feeds allowed participants to be accountable to one another (social presence). For example, participants could view when their peers were completing activities on the platform [,-,,]. On some platforms, participants had the ability to send push notifications to peers who had been inactive, prompting them to engage with the intervention [-]. In 1 study [], accountability was reinforced by quizzes (expectation—outcome accountability) and email notifications (social presence). Kelders et al [] highlighted that human support could provide opportunities for interactions by asking questions and sending personalized responses.
Legitimacy
People are more likely to respond positively to accountability demands from a coach who is perceived as legitimate []. Legitimacy can be categorized into instrumental (ie, expertise and reciprocity) and relational factors (ie, trustworthiness and benevolence) [,]. Of the 4 characteristics, expertise was the one considered most often among the articles included in this review. Expertise was derived from training or other credentials, such as profession/discipline or license/certification (eg, [,,,-,,-]). On the other hand, studies that leveraged peer participants as human support [-] prioritized accountability and bonds within the peer cohort rather than expertise. An exception to this is the use of peer specialists, which uniquely applies both instrumental and relational factors. For example, Blonigen et al [] used peer specialists who were professionally trained to use their lived experiences to support adherence to DHIs, leveraging expertise and benevolence. Finally, in most (14/22, 64%) studies, it was unclear whether expertise was conveyed to participants and, perhaps more importantly, whether the participants perceived their respective human support as experts and if such perceptions influenced DHI engagement and adherence.
People look for integrity, caring, and a sense of benevolence when determining the legitimacy of their coaches []. In our review, only a few (4/22, 18%) studies [,,,] described how the human support and participants would establish trust (trustworthiness). For example, Borghouts et al [] used “nurse promotores,” who were trusted leaders in the local Spanish-speaking community, to provide support to participants. They built trust by connecting community members to health services and providing support that aligned with the intervention goals. Their existing role in the community made them an ideal source of human support in the intervention.
Other studies used different methods to establish and understand trust. Van Middelaar et al [] intentionally designed the initial baseline consultation between participants and their assigned coaches to take place in person to better establish trust []. In the studies by Baron et al [], Mohr et al [], and Mohr et al [], the first coaching session was an “engagement session,” designed to last longer than follow-up sessions to introduce the interventions, discuss participant goals, set expectations, and build rapport.
Although these studies discussed how they anticipated legitimacy, including trust, would be established, few assessed whether participants themselves perceived their human support as benevolent, trustworthy, and knowledgeable. On the other hand, Sayegh et al [] and Wilhelmsen et al [] assessed whether trust was established by interviewing participants about their experiences with the intervention, including their relationships with the human support. In both studies, participants expressed a sense of connection to their human support. For instance, participants suggested that their relationships with the coaches felt very personal and trusting [] and that they could trust their therapists due to their expertise [].
Although reciprocity was not explicitly mentioned in any of the articles, it may have been implied or inferred in some interventions. In Jesuthasan et al [], coaches introduced themselves during the first week of the intervention and explained their roles, which may demonstrate reciprocity by framing the relationship in terms of the benefits that coaching may provide participants []. This is similarly evident in the studies by Baron et al [], Mohr et al [], and Mohr et al []. In the study by Lepore et al [], an intervention study with low-income minority mothers who smoke, participants were explicitly made aware of the expectations regarding app usage and the role of the telephone counselors (ie, to uphold accountability and support participants throughout the duration of the intervention). This demonstrates a major hallmark of reciprocity, the patient-coach “contract,” in which both parties have clearly defined roles (eg, the patient is expected to log into the intervention and complete specific tasks or activities, while the coach provides time, attention, and support with problems that may arise) []. In a similar vein, although no study explicitly described benevolence, many suggested that benevolence was considered during intervention design and in defining the role of the human support. For example, human support was often described as “warm” and “friendly” (eg, []) and instructed to reflect a sense of caring through encouragement and affirmation [,]. Such direction appeared to be intended to ensure that participants perceive that their human support has their best interests in mind.
Bond
The notion of bond captures the emotional attachment between intervention participants and their human support, which may ultimately promote the effects of accountability []. The majority of studies that used SAM (14/22, 64%) did not explicitly mention methods or processes to encourage liking or bonding []. How interventions intended to cultivate bonds was briefly described by 8 studies [,,-,,,]. Of these studies, 3 used social networking features, such as user profiles [,] and posting and commenting on users’ posts [] to connect. In these studies, participants served as peer human support to one another. Duffecy et al [] also suggested that the ability to create user profiles containing personal information allowed participants to increase group bonds. Meanwhile, Ho et al [] suggested that commenting on each other’s posts created opportunities for off-topic discussions, thereby allowing participants to build emotional connections and trust with one another. In other studies, research assistants and facilitators received training on developing positive relationships and building trust to encourage user adherence to the intervention [,,].
Discussion
Principal Findings
In this review, we found that the sources of human support could be categorized into the following: (1) peers and peer specialists, (2) health experts and practitioners, (3) trained coaches, and (4) members of the research team. Most individuals providing human support are paraprofessionals with no formal credentials or professional training related to the intervention []. The advantage of using health experts and practitioners is that they are readily available since they are already embedded in the clinical setting. This can allow for a more seamless and streamlined implementation process. However, several system-level barriers must be considered. For example, providers may have concerns regarding the time and effort it takes to integrate interventions into their existing workflow [], as their time and effort invested may not be reimbursed or compensated [-]. The use of paraprofessionals (eg, trained coaches) specific to the interventions could bridge this gap, as existing evidence suggests that interventions delivered by paraprofessionals are scalable, feasible, acceptable, and effective [,]. In addition, pairing paraprofessionals with DHIs can enhance treatment fidelity by minimizing human error or therapist drift, which is common in manualized treatment protocols []. Adequate training for paraprofessionals is key to ensuring fidelity and scalability. Rosenberg et al [] offered an in-depth description of their coach training program based on learning theories and competency-based supervision.
One of the common sources of human support found in this review is peers and peer specialists. Peer-to-peer support could promote DHI engagement because their relationship is characterized by reciprocal accountability, wherein peers motivate each other to work toward their individual goals []. On the other hand, peer coaching models (eg, peer specialists) are frequently used to enhance intervention accessibility, engagement, and scalability by providing human support from coaches who share similar identities and lived experiences []. One common use of peer specialists can be found in recovery support for substance use [], in which peers apply personal experience and knowledge to support individuals who are earlier in their recovery process [,]. Aligned with our review, Blonigen et al [] used peer specialists to facilitate veteran participants’ engagement with a DHI for unhealthy alcohol use. Shared lived experiences give peer specialists credibility and offer participants a sense of belonging and hope for the future [].
Although some studies have demonstrated the positive effects of these peer models [-], empirical support remains mixed [-]. These inconsistencies may be due, in part, to the lack of structured interventions, formalized training, and quality assurance mechanisms for peer specialists and other paraprofessional human supports [,]. A critical gap in the current literature is the lack of detailed descriptions of training protocols, core competencies, and supervision processes, which makes it difficult to assess fidelity, replicate successful interventions, or scale promising approaches. Developing and reporting standardized training protocols for human support providers in DHIs are essential to ensure consistency, maintain intervention quality, and enhance participant outcomes. This is especially important for peer specialists and other paraprofessional human support, who are not typically professionally trained for these specific roles and may require additional guidance. Such protocols should include clearly defined roles and responsibilities, communication strategies grounded in behavior change theory, ethical and boundary training, cultural competence, and ongoing supervision or fidelity monitoring.
Our findings suggest that the frequency and duration of human support interactions are closely tied to the mode of communication, with important implications for engagement and adherence in DHIs. This relationship aligns with the communication bandwidth principle proposed in SAM [], which suggests that communication media differ in their ability to convey social presence, thus influencing the quality of the interaction and the strength of accountability. For example, phone and video calls—used in 64% of the interventions—typically involved less frequent but longer sessions, allowing for richer interpersonal exchanges that include verbal and, in the case of video, visual cues. These higher-bandwidth modes facilitate stronger perceptions of bond, legitimacy, and trust—core constructs of SAM—by supporting nuanced communication, emotional expression, and real-time responsiveness.
In contrast, text-based communication (also used in 64% of interventions) enabled more frequent but shorter and often asynchronous interactions. Although this lower-bandwidth mode may offer greater flexibility and accessibility, particularly for asynchronous support, it strips away nonverbal cues and could potentially diminish social presence. The findings imply that, although both modes have utility, higher-bandwidth communication may be more effective for establishing initial trust and accountability. In contrast, text-based support may be a more suitable supplementary strategy for maintaining ongoing engagement. Future DHI design should consider tailoring support frequency and duration based on the mode of communication and the complexity of the behavior change goals, striking a balance between efficiency and the need for relational depth.
Nearly all of the studies that applied SAM (21/22, 95%) used human support to promote accountability to improve engagement with DHIs, including monitoring users’ platform usage, sending reminders, checking in, setting goals, and discussing performance [,]. These strategies can be positioned along a spectrum based on the degree to which an approach is “active.” For example, using human support to monitor user performance alone is a passive approach as it does not necessarily promote engagement with DHIs. Performance monitoring is often coupled with more direct check-ins and reminders that allow participants to infer the social presence of human support. To improve engagement, human support can also set forth expectations by setting goals with users; for example, human support can help users set and review personal goals, as well as create plans for continued DHI use. When setting goals, the targets of the expectations can vary. In this review, most human support focused on process accountability, with a primary goal of increasing engagement with the DHI. On the other hand, Cheng et al [] used automated support, quizzes, and email notifications to maintain accountability. Although the notifications were effective reminders to complete the weekly sessions and diary entries, quizzes prompted participants to pay more attention to the information as they promoted outcome accountability.
Although accountability was used across most studies, we did not see any use of instruments to measure accountability. Therefore, it is unknown whether the steps taken to ensure accountability had any effects, positive or negative, on accountability. Qualitative studies by Sayegh et al [] and Wilhelmsen et al [] interviewed participants about their experiences with the intervention. In each of these studies, participants described their experience with their human support, and their encounters resulted in different levels of accountability that promoted their use of DHIs. Parent participants in the study by Whiteside et al [] also reported that the intervention gave them a sense of responsibility for their children’s symptom management. Meyerhoff et al [] saw a need for supportive accountability measures for coached digital interventions and developed the Supportive Accountability Inventory (SAI). The SAI is a 6-item instrument assessing users’ perceptions of (1) whether they believe the coach pays attention to their use of DHIs and progress and (2) whether their coach maintains expectations of user engagement with the DHIs and their effort toward treatment-related goals. Future studies should utilize validated measures such as the SAI [] to assess the mechanisms by which accountability is achieved (eg, monitoring and expectation).
Additionally, many studies did not explicitly describe how bonds and legitimacy were used to cultivate accountability. Articles frequently used keywords such as connection, warmth, trust, and friendly, which may have been used to signal or allude to bonds (eg, [,,]). However, without clear definitions or theoretical grounding, it becomes difficult to determine whether these references reflect bonds—which emphasize mutual liking, emotional connection, and respect (eg, therapeutic alliance) []—or relational legitimacy, which centers on perceptions of trustworthiness, benevolence, and expertise [] and does not necessarily involve interpersonal liking []. One possible reason for this gap is that qualities associated with bonds or relational legitimacy are often assumed to be inherent in any supportive or therapeutic relationship. As such, these relational dynamics may be implicitly embedded in human-supported DHIs but are rarely made explicit in intervention design or measured in outcome evaluations.
To address this gap, future research should more deliberately integrate and measure these constructs using validated instruments. For example, the Working Alliance Inventory [] could capture the strength of the bond, while the Credibility/Expectancy Questionnaire [] may assess relational legitimacy and perceived coach expertise. Intervention protocols could also include structured relational strategies such as rapport-building scripts, credibility-enhancing coach introductions, and personalized messaging that reinforces benevolence and competence. Last, qualitative evaluations are essential for understanding how participants perceive relational legitimacy, which, in turn, influences their behaviors. Although many interventions aim to build legitimacy, few assess how it is experienced. Interviews and open-ended feedback can capture these relational nuances, offering critical insight into how legitimacy fosters engagement and adherence in DHIs.
In this scoping review, we found that certain studies used human support as the primary means of delivering intervention content, corresponding to Types 1 and 2 in the taxonomy by Pineda et al []. As noted earlier, SAM was proposed specifically to guide the use of coaches to support user engagement with and adherence to DHIs [], which aligns with Type 3 within the taxonomy. The tension among Types 1, 2, and 3 within this scoping review indicates variations in the application of SAM to DHIs; these studies did not appear to utilize SAM in the way it was originally intended. As such, it is necessary to clearly define intervention goals and the corresponding roles of coaches when considering the use of human support in DHIs.
We found that only 6 (17%) of the 36 studies focused on a youth population, yet youth are increasingly intertwined with technology, and many seek and share health advice through social media [,]. One study [] targeted children with anxiety, but the intervention itself was designed for both parents and their children, some as young as 8 years old. In our review, we found that the use of peers (eg, [-,]) is a common strategy that aligns with the social affordances described by Wong et al []. Since youth, adolescents, and young adults are particularly interested in and influenced by their peers, future studies may consider leveraging peer influence in accordance with SAM to enhance DHI engagement and adherence among younger populations. For example, adolescents and young adults may be motivated to adopt DHIs that incorporate social components, as these may foster social support and a sense of belonging [,].
Limitations
To our knowledge, this is the first scoping review to explore the application of human support strategies in DHI design to promote engagement. Several potential limitations should be considered. First, there is a chance that we missed relevant articles, as this review only included peer-reviewed studies and excluded gray literature. Another limitation is that, although this review focused on the 3 core factors (ie, accountability, bonds, and legitimacy) of SAM, the model has additional components that inform the pathways to adherence (eg, motivation and communication bandwidth). They are also important factors that can enhance the influence of accountability on adherence. Last, our search terms were designed to narrowly capture literature that is more likely to apply SAM. It is possible, however, that other studies utilize human support but do not term it as such. Authors may use different keywords that were not captured by our search terms. We attempted to address this through additional hand-searching.
Conclusion
Strategies to improve engagement with DHIs are undoubtedly needed. This scoping review examines the current state of knowledge regarding the strategy of human support and the application of SAM in DHIs. Our review identified inconsistencies in the application of SAM across studies. For example, although SAM focuses on using human support primarily to improve DHI engagement and adherence, some studies used human support to administer the interventions themselves. Further, bond and legitimacy, major constructs that, according to SAM, promote accountability, were rarely explicitly considered and discussed by authors. Further research should demonstrate more intentional application of these constructs, documenting how interventions were designed to promote bonds and establish legitimacy and measuring each with psychometrically sound instruments.
Studies should also examine communication bandwidth and motivation as potential moderators of adherence []. For example, we found that interaction frequency and duration appeared to be influenced by the type of communication (eg, call, text, videoconferencing). Future studies could focus on the frequency and duration (ie, bandwidth) of these communications, as well as their quality, assuming that greater communication could lead to improved task completion, enhanced interpersonal relations, and greater social presence []. Similarly, it is worth exploring how human support can move users from operating solely on extrinsic motivation toward intrinsic motivation. Techniques such as motivational interviewing may facilitate this transition.
Last, future studies may consider comparing and evaluating the impact of human support strategies on engagement and adherence. We suspect that strategies that are highly personalized and tailored to the target population (ie, appropriate for age or developmental stage, symptoms or health conditions, and mode of delivery) will lead to better engagement and adherence. For example, utilizing peers as a form of human support is acceptable, appropriate, and potentially effective among youth and young adults.
In sum, engagement with and adherence to DHIs have real-world implications, and this review sheds light on how human support is currently utilized to promote engagement. As the field continues to respond to the ubiquitous nature of technology within our society, the prospect of DHIs offers accessibility and scalability for many populations challenged by a myriad of health conditions. However, the efficacy of DHIs cannot be realized in the face of the law of attrition. We thus require thoughtful, intentional, and faithful integration of human support to optimize DHI engagement and adherence.
Data Availability
The data that support the findings of this study are available from the corresponding author upon reasonable request.
Authors' Contributions
Conceptualization: GK, KD, JD, SPYC
Data curation: SPYC
Formal analysis: GK, SPYC
Methodology: GK, SPYC
Project administration: SPYC
Supervision: GK
Writing – original draft: GK, SPYC
Writing – review & editing: GK, SPYC, JD, KD
Conflicts of Interest
JD is a consultant for My Alloy, Chorus Health, and Rebound Health. This review is solely the authors' work and is not connected to these companies. All other authors declare no conflicts of interest.
Summary of research studies. [DOCX File, 31 KB]
Incorporation of supportive accountability model (SAM) components in digital health intervention (DHI) designs. [DOCX File, 24 KB]
PRISMA-ScR Checklist. [PDF File, 731 KB]
References
- Murray E, Hekler EB, Andersson G, Collins LM, Doherty A, Hollis C, et al. Evaluating digital health interventions: key questions and approaches. Am J Prev Med. Nov 2016;51(5):843-851. [FREE Full text] [CrossRef] [Medline]
- Soobiah C, Cooper M, Kishimoto V, Bhatia RS, Scott T, Maloney S, et al. Identifying optimal frameworks to implement or evaluate digital health interventions: a scoping review protocol. BMJ Open. Aug 13, 2020;10(8):e037643. [FREE Full text] [CrossRef] [Medline]
- Philippe TJ, Sikder N, Jackson A, Koblanski ME, Liow E, Pilarinos A, et al. Digital health interventions for delivery of mental health care: systematic and comprehensive meta-review. JMIR Ment Health. May 12, 2022;9(5):e35159. [FREE Full text] [CrossRef] [Medline]
- Widmer RJ, Collins NM, Collins CS, West CP, Lerman LO, Lerman A. Digital health interventions for the prevention of cardiovascular disease: a systematic review and meta-analysis. Mayo Clin Proc. Apr 2015;90(4):469-480. [FREE Full text] [CrossRef] [Medline]
- Eysenbach G. The law of attrition. J Med Internet Res. Mar 31, 2005;7(1):e11. [FREE Full text] [CrossRef] [Medline]
- O'Brien HL, Morton E, Kampen A, Barnes SJ, Michalak EE. Beyond clicks and downloads: a call for a more comprehensive approach to measuring mobile-health app engagement. BJPsych Open. Aug 11, 2020;6(5):e86. [FREE Full text] [CrossRef] [Medline]
- Jakob R, Harperink S, Rudolf AM, Fleisch E, Haug S, Mair JL, et al. Factors influencing adherence to mHealth apps for prevention or management of noncommunicable diseases: systematic review. J Med Internet Res. May 25, 2022;24(5):e35371. [FREE Full text] [CrossRef] [Medline]
- Kelders SM, Kok RN, Ossebaard HC, Van Gemert-Pijnen JEWC. Persuasive system design does matter: a systematic review of adherence to web-based interventions. J Med Internet Res. Nov 14, 2012;14(6):e152. [FREE Full text] [CrossRef] [Medline]
- Perski O, Blandford A, West R, Michie S. Conceptualising engagement with digital behaviour change interventions: a systematic review using principles from critical interpretive synthesis. Transl Behav Med. Jun 2017;7(2):254-267. [FREE Full text] [CrossRef] [Medline]
- Ryan RM, Deci EL. Self-determination theory: Basic psychological needs in motivation, development, and wellness. New York, NY. The Guilford Press; 2017.
- Sebire SJ, Standage M, Vansteenkiste M. Examining intrinsic versus extrinsic exercise goals: cognitive, affective, and behavioral outcomes. J Sport Exerc Psychol. Apr 2009;31(2):189-210. [FREE Full text] [CrossRef] [Medline]
- Bandura A. Social cognitive theory: an agentic perspective. Annu Rev Psychol. 2001;52:1-26. [CrossRef] [Medline]
- Bandura A. Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ. Prentice-Hall; 1986.
- Luszczynska A, Schwarzer R. Changing behavior using social cognitive theory. In: Hagger MS, Cameron LD, Hamilton K, Hankonen N, Lintunen T, editors. The Handbook of Behavior Change. New York, NY. Cambridge University Press; 2020:32-45.
- Renfrew ME, Morton DP, Morton JK, Hinze JS, Przybylko G, Craig BA. The influence of three modes of human support on attrition and adherence to a web- and mobile app-based Mental Health Promotion Intervention in a nonclinical cohort: randomized comparative study. J Med Internet Res. Sep 29, 2020;22(9):e19945. [FREE Full text] [CrossRef] [Medline]
- Renfrew ME, Morton DP, Morton JK, Przybylko G. The influence of human support on the effectiveness of digital mental health promotion interventions for the general population. Front Psychol. Aug 19, 2021;12:716106. [FREE Full text] [CrossRef] [Medline]
- Baker TB, Gustafson DH, Shah D. How can research keep up with eHealth? Ten strategies for increasing the timeliness and usefulness of eHealth research. J Med Internet Res. Feb 19, 2014;16(2):e36. [FREE Full text] [CrossRef] [Medline]
- Yardley L, Spring BJ, Riper H, Morrison LG, Crane DH, Curtis K, et al. Understanding and promoting effective engagement with digital behavior change interventions. Am J Prev Med. Nov 2016;51(5):833-842. [CrossRef] [Medline]
- Fortuna KL, Brooks JM, Umucu E, Walker R, Chow PI. Peer support: a human factor to enhance engagement in digital health behavior change interventions. J Technol Behav Sci. Jun 15, 2019;4(2):152-161. [FREE Full text] [CrossRef] [Medline]
- Solomon P. Peer support/peer provided services underlying processes, benefits, and critical ingredients. Psychiatr Rehabil J. 2004;27(4):392-401. [CrossRef] [Medline]
- Zhao L, Lu Y, Wang B, Chau PY, Zhang L. Cultivating the sense of belonging and motivating user participation in virtual communities: a social capital perspective. International Journal of Information Management. Dec 2012;32(6):574-588. [CrossRef]
- Davidson L, Chinman M, Sells D, Rowe M. Peer support among adults with serious mental illness: a report from the field. Schizophr Bull. Jul 2006;32(3):443-450. [FREE Full text] [CrossRef] [Medline]
- Fortuna KL, Naslund JA, Aschbrenner KA, Lohman MC, Storm M, Batsis JA, et al. Text message exchanges between older adults with serious mental illness and older certified peer specialists in a smartphone-supported self-management intervention. Psychiatr Rehabil J. Mar 2019;42(1):57-63. [FREE Full text] [CrossRef] [Medline]
- Jesuthasan J, Low M, Ong T. The impact of personalized human support on engagement with behavioral intervention technologies for employee mental health: an exploratory retrospective study. Front Digit Health. Apr 27, 2022;4:846375. [FREE Full text] [CrossRef] [Medline]
- Willis VC, Thomas Craig KJ, Jabbarpour Y, Scheufele EL, Arriaga YE, Ajinkya M, et al. Digital health interventions to enhance prevention in primary care: scoping review. JMIR Med Inform. Jan 21, 2022;10(1):e33518. [FREE Full text] [CrossRef] [Medline]
- Peterson EB, Ostroff JS, DuHamel KN, D'Agostino TA, Hernandez M, Canzona MR, et al. Impact of provider-patient communication on cancer screening adherence: a systematic review. Prev Med. Dec 2016;93:96-105. [FREE Full text] [CrossRef] [Medline]
- Höchsmann C, Infanger D, Klenk C, Königstein K, Walz SP, Schmidt-Trucksäss A. Effectiveness of a behavior change technique-based smartphone game to improve intrinsic motivation and physical activity adherence in patients with type 2 diabetes: randomized controlled trial. JMIR Serious Games. Feb 13, 2019;7(1):e11444. [FREE Full text] [CrossRef] [Medline]
- Bandura A. Health promotion by social cognitive means. Health Educ Behav. Apr 2004;31(2):143-164. [CrossRef] [Medline]
- Islam KF, Awal A, Mazumder H, Munni UR, Majumder K, Afroz K, et al. Social cognitive theory-based health promotion in primary care practice: a scoping review. Heliyon. Apr 2023;9(4):e14889. [FREE Full text] [CrossRef] [Medline]
- Smith JJ, Morgan PJ, Plotnikoff RC, Dally KA, Salmon J, Okely AD, et al. Smart-phone obesity prevention trial for adolescent boys in low-income communities: the ATLAS RCT. Pediatrics. Sep 25, 2014;134(3):e723-e731. [CrossRef] [Medline]
- Mohr DC, Cuijpers P, Lehman K. Supportive accountability: a model for providing human support to enhance adherence to eHealth interventions. J Med Internet Res. Mar 10, 2011;13(1):e30. [FREE Full text] [CrossRef] [Medline]
- Lerner JS, Tetlock PE. Accounting for the effects of accountability. Psychol Bull. Mar 1999;125(2):255-275. [CrossRef] [Medline]
- Ryan RM, Deci EL. Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am Psychol. Jan 2000;55(1):68-78. [CrossRef] [Medline]
- Oussedik E, Foy CG, Masicampo EJ, Kammrath LK, Anderson RE, Feldman SR. Accountability: a missing construct in models of adherence behavior and in clinical practice. Patient Prefer Adherence. Jul 2017;11:1285-1294. [CrossRef]
- Schueller S, Tomasino K, Mohr D. Integrating human support into behavioral intervention technologies: the efficiency model of support. Clin Psychol Sci Pract. Nov 17, 2016;24(1):27-45. [CrossRef]
- Stiles-Shields C, Montague E, Kwasny MJ, Mohr DC. Behavioral and cognitive intervention strategies delivered via coached apps for depression: pilot trial. Psychol Serv. May 2019;16(2):233-238. [FREE Full text] [CrossRef] [Medline]
- Spring B, Duncan JM, Janke EA, Kozak AT, McFadden HG, DeMott A, et al. Integrating technology into standard weight loss treatment: a randomized controlled trial. JAMA Intern Med. Jan 28, 2013;173(2):105-111. [FREE Full text] [CrossRef] [Medline]
- Dennison L, Morrison L, Lloyd S, Phillips D, Stuart B, Williams S, et al. Does brief telephone support improve engagement with a web-based weight management intervention? Randomized controlled trial. J Med Internet Res. Mar 28, 2014;16(3):e95. [FREE Full text] [CrossRef] [Medline]
- Pineda BS, Mejia R, Qin Y, Martinez J, Delgadillo LG, Muñoz RF. Updated taxonomy of digital mental health interventions: a conceptual framework. Mhealth. 2023;9:28. [FREE Full text] [CrossRef] [Medline]
- Arksey H, O'Malley L. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology. Feb 2005;8(1):19-32. [CrossRef]
- Tricco AC, Lillie E, Zarin W, O'Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. Oct 02, 2018;169(7):467-473. [FREE Full text] [CrossRef] [Medline]
- Skapinakis P, Caldwell D, Hollingworth W, Bryden P, Fineberg N, Salkovskis P, et al. A systematic review of the clinical effectiveness and cost-effectiveness of pharmacological and psychological interventions for the management of obsessive-compulsive disorder in children/adolescents and adults. Health Technol Assess. Jun 2016;20(43):1-392. [FREE Full text] [CrossRef] [Medline]
- Bueno M, Stevens B, Barwick MA, Riahi S, Li S, Lanese A, et al. A cluster randomized clinical trial to evaluate the effectiveness of the Implementation of Infant Pain Practice Change (ImPaC) resource to improve pain practices in hospitalized infants: a study protocol. Trials. Jan 06, 2020;21(1):16. [FREE Full text] [CrossRef] [Medline]
- Blonigen DM, Harris-Olenak B, Kuhn E, Timko C, Humphreys K, Smith JS, et al. Using peers to increase veterans' engagement in a smartphone application for unhealthy alcohol use: a pilot study of acceptability and utility. Psychol Addict Behav. Nov 2021;35(7):829-839. [FREE Full text] [CrossRef] [Medline]
- Duffecy J, Grekin R, Hinkel H, Gallivan N, Nelson G, O'Hara MW. A group-based online intervention to prevent postpartum depression (Sunnyside): feasibility randomized controlled trial. JMIR Ment Health. May 28, 2019;6(5):e10778. [FREE Full text] [CrossRef] [Medline]
- Duffecy J, Sanford S, Wagner L, Begale M, Nawacki E, Mohr DC. Project onward: an innovative e-health intervention for cancer survivors. Psychooncology. Apr 2013;22(4):947-951. [FREE Full text] [CrossRef] [Medline]
- Duffecy J, Grekin R, Long JD, Mills JA, O'Hara M. Randomized controlled trial of Sunnyside: individual versus group-based online interventions to prevent postpartum depression. J Affect Disord. Aug 15, 2022;311:538-547. [FREE Full text] [CrossRef] [Medline]
- Ho J, Corden ME, Caccamo L, Tomasino KN, Duffecy J, Begale M, et al. Design and evaluation of a peer network to support adherence to a web-based intervention for adolescents. Internet Interv. Nov 2016;6:50-56. [FREE Full text] [CrossRef] [Medline]
- Lattie EG, Ho J, Sargent E, Tomasino KN, Smith JD, Brown CH, et al. Teens engaged in collaborative health: the feasibility and acceptability of an online skill-building intervention for adolescents at risk for depression. Internet Interv. Jun 2017;8:15-26. [FREE Full text] [CrossRef] [Medline]
- Lederman R, Wadley G, Gleeson J, Bendall S, Álvarez-Jiménez M. Moderated online social therapy: designing and evaluating technology for mental health. ACM Transactions on Computer-Human Interaction (TOCHI). Feb 01, 2014;21(1):1-26. [CrossRef]
- Possemato K, Wu J, Greene C, MacQueen R, Blonigen D, Wade M, et al. Web-based problem-solving training with and without peer support in veterans with unmet mental health needs: pilot study of feasibility, user acceptability, and participant engagement. J Med Internet Res. Jan 13, 2022;24(1):e29559. [FREE Full text] [CrossRef] [Medline]
- Tomasino KN, Lattie EG, Ho J, Palac HL, Kaiser SM, Mohr DC. Harnessing peer support in an online intervention for older adults with depression. Am J Geriatr Psychiatry. Oct 2017;25(10):1109-1119. [FREE Full text] [CrossRef] [Medline]
- Goldberg R. Increasing the role of peer specialists in primary care settings. VA Recovery Update. 2017. URL: https://www.mentalhealth.va.gov/communityproviders/assets/docs/wellness/RECOVERY_UPDATE_5.pdf [accessed 2025-08-16]
- Borghouts J, Eikey EV, De Leon C, Schueller SM, Schneider M, Stadnick NA, et al. Understanding the role of support in digital mental health programs with older adults: users' perspective and mixed methods study. JMIR Form Res. Dec 13, 2022;6(12):e43192. [FREE Full text] [CrossRef] [Medline]
- Yardley L, Ware LJ, Smith ER, Williams S, Bradbury KJ, Arden-Close EJ, et al. Randomised controlled feasibility trial of a web-based weight management intervention with nurse support for obese patients in primary care. Int J Behav Nutr Phys Act. May 21, 2014;11(1):67. [FREE Full text] [CrossRef] [Medline]
- Carpenter CA, Eastman A, Ross KM. Consistency with and disengagement from self-monitoring of weight, dietary intake, and physical activity in a technology-based weight loss program: exploratory study. JMIR Form Res. Feb 18, 2022;6(2):e33603. [FREE Full text] [CrossRef] [Medline]
- Chhabria K, Ross KM, Sacco SJ, Leahey TM. The assessment of supportive accountability in adults seeking obesity treatment: psychometric validation study. J Med Internet Res. Jul 28, 2020;22(7):e17967. [FREE Full text] [CrossRef] [Medline]
- Fletcher TL, Amspoker AB, Wassef M, Hogan JB, Helm A, Jackson C, et al. Increasing access to care for trauma-exposed rural veterans: a mixed methods outcome evaluation of a web-based skills training program with telehealth-delivered coaching. J Rural Health. Sep 14, 2022;38(4):740-747. [CrossRef] [Medline]
- Mira A, Bretón-López J, García-Palacios A, Quero S, Baños RM, Botella C. An internet-based program for depressive symptoms using human and automated support: a randomized controlled trial. Neuropsychiatr Dis Treat. Mar 2017;13:987-1006. [CrossRef]
- Wilhelmsen M, Lillevoll K, Risør M, Høifødt R, Johansen M, Waterloo K, et al. Motivation to persist with internet-based cognitive behavioural treatment using blended care: a qualitative study. BMC Psychiatry. Nov 7, 2013;13(1):296. [CrossRef]
- Whiteside SP, Biggs BK, Tiede MS, Dammann JE, Hathaway JC, Blasi ME, et al. An online- and mobile-based application to facilitate exposure for childhood anxiety disorders. Cogn Behav Pract. Aug 2019;26(3):478-491. [FREE Full text] [CrossRef] [Medline]
- Berman MA, Guthrie NL, Edwards KL, Appelbaum KJ, Njike VY, Eisenberg DM, et al. Change in glycemic control with use of a digital therapeutic in adults with type 2 diabetes: cohort study. JMIR Diabetes. Feb 14, 2018;3(1):e4. [FREE Full text] [CrossRef] [Medline]
- Chew CSE, Davis C, Lim JKE, Lim CMM, Tan YZH, Oh JY, et al. Use of a mobile lifestyle intervention app as an early intervention for adolescents with obesity: single-cohort study. J Med Internet Res. Sep 28, 2021;23(9):e20520. [FREE Full text] [CrossRef] [Medline]
- Renfrew ME, Morton DP, Morton JK, Hinze JS, Beamish PJ, Przybylko G, et al. A web- and mobile app-based mental health promotion intervention comparing email, short message service, and videoconferencing support for a healthy cohort: randomized comparative study. J Med Internet Res. Jan 06, 2020;22(1):e15592. [FREE Full text] [CrossRef] [Medline]
- Sayegh CS, Iverson E, MacDonell KK, Wu S, Belzer M. Youth perspectives on mobile health adherence interventions: a qualitative study guided by the supportive accountability model. Patient Educ Couns. Feb 2024;119:108079. [CrossRef] [Medline]
- Smart MH, Nabulsi NA, Gerber BS, Gupta I, Di Eugenio B, Ziebart B, et al. A remote health coaching, text-based walking program in ethnic minority primary care patients with overweight and obesity: feasibility and acceptability pilot study. JMIR Form Res. Jan 19, 2022;6(1):e31989. [FREE Full text] [CrossRef] [Medline]
- van Middelaar T, Beishuizen CRL, Guillemont J, Barbera M, Richard E, Moll van Charante EP, et al. HATICE consortium. Engaging older people in an internet platform for cardiovascular risk self-management: a qualitative study among Dutch HATICE participants. BMJ Open. Jan 21, 2018;8(1):e019683. [FREE Full text] [CrossRef] [Medline]
- Cueto V, Wang CJ, Sanders LM. Impact of a mobile app-based health coaching and behavior change program on participant engagement and weight status of overweight and obese children: retrospective cohort study. JMIR Mhealth Uhealth. Nov 15, 2019;7(11):e14458. [FREE Full text] [CrossRef] [Medline]
- Kar P, Goward C, Whitman M, Davies M, Willner T, Shaw K. Engagement and effectiveness of digitally enabled behavioural change support for people living with type 2 diabetes. Practical Diabetes. Oct 07, 2020;37(5):167. [CrossRef]
- Mohr DC, Duffecy J, Ho J, Kwasny M, Cai X, Burns MN, et al. A randomized controlled trial evaluating a manualized TeleCoaching protocol for improving adherence to a web-based intervention for the treatment of depression. PLoS One. 2013;8(8):e70086. [FREE Full text] [CrossRef] [Medline]
- Mohr DC, Schueller SM, Tomasino KN, Kaiser SM, Alam N, Karr C, et al. Comparison of the effects of coaching and receipt of app recommendations on depression, anxiety, and engagement in the IntelliCare platform: factorial randomized controlled trial. J Med Internet Res. Aug 28, 2019;21(8):e13609. [FREE Full text] [CrossRef] [Medline]
- Lepore SJ, Collins BN, Killam HW, Barry B. Supportive accountability and mobile app use in a tobacco control intervention targeting low-income minority mothers who smoke: observational study. JMIR Mhealth Uhealth. Jul 02, 2021;9(7):e28175. [FREE Full text] [CrossRef] [Medline]
- Cheng P, Santarossa S, Kalmbach D, Sagong C, Hu K, Drake C. Patient perspectives on facilitators and barriers to equitable engagement with digital CBT-I. Sleep Health. Oct 2023;9(5):571-578. [CrossRef] [Medline]
- Glasgow RE, Christiansen SM, Kurz D, King DK, Woolley T, Faber AJ, et al. Engagement in a diabetes self-management website: usage patterns and generalizability of program use. J Med Internet Res. Jan 25, 2011;13(1):e9. [FREE Full text] [CrossRef] [Medline]
- Kelders SM, Bohlmeijer ET, Pots WTM, van Gemert-Pijnen JEWC. Comparing human and automated support for depression: fractional factorial randomized controlled trial. Behav Res Ther. Sep 2015;72:72-80. [CrossRef] [Medline]
- Baron KG, Duffecy J, Reid K, Begale M, Caccamo L. Technology-assisted behavioral intervention to extend sleep duration: development and design of the sleep bunny mobile app. JMIR Ment Health. Jan 10, 2018;5(1):e3. [FREE Full text] [CrossRef] [Medline]
- Tyler TR. The psychology of legitimacy: a relational perspective on voluntary deference to authorities. Pers Soc Psychol Rev. 1997;1(4):323-345. [CrossRef] [Medline]
- Keijsers GP, Schaap CP, Hoogduin CA. The impact of interpersonal patient and therapist behavior on outcome in cognitive-behavior therapy. A review of empirical studies. Behav Modif. Apr 2000;24(2):264-297. [CrossRef] [Medline]
- Rosenberg BM, Kodish T, Cohen ZD, Gong-Guy E, Craske MG. A novel peer-to-peer coaching program to support digital mental health: design and implementation. JMIR Ment Health. Jan 26, 2022;9(1):e32430. [FREE Full text] [CrossRef] [Medline]
- Zakerabasali S, Ayyoubzadeh SM, Baniasadi T, Yazdani A, Abhari S. Mobile health technology and healthcare providers: systemic barriers to adoption. Healthc Inform Res. Oct 2021;27(4):267-278. [FREE Full text] [CrossRef] [Medline]
- Mohr DC, Azocar F, Bertagnolli A, Choudhury T, Chrisp P, Frank R, et al. Banbury Forum on Digital Mental Health. Banbury Forum consensus statement on the path forward for digital mental health treatment. Psychiatr Serv. Jun 2021;72(6):677-683. [FREE Full text] [CrossRef] [Medline]
- Okorodudu DE, Bosworth HB, Corsino L. Innovative interventions to promote behavioral change in overweight or obese individuals: a review of the literature. Ann Med. May 10, 2015;47(3):179-185. [FREE Full text] [CrossRef] [Medline]
- Thies K, Anderson D, Cramer B. Lack of adoption of a mobile app to support patient self-management of diabetes and hypertension in a federally qualified health center: interview analysis of staff and patients in a failed randomized trial. JMIR Hum Factors. Oct 03, 2017;4(4):e24. [FREE Full text] [CrossRef] [Medline]
- Woods LS, Duff J, Roehrer E, Walker K, Cummings E. Patients' experiences of using a consumer mHealth app for self-management of heart failure: mixed-methods study. JMIR Hum Factors. May 02, 2019;6(2):e13009. [FREE Full text] [CrossRef] [Medline]
- Naslund JA, Aschbrenner KA, Araya R, Marsch LA, Unützer J, Patel V, et al. Digital technology for treating and preventing mental disorders in low-income and middle-income countries: a narrative review of the literature. Lancet Psychiatry. Jun 2017;4(6):486-500. [FREE Full text] [CrossRef] [Medline]
- Padmanathan P, De Silva MJ. The acceptability and feasibility of task-sharing for mental healthcare in low and middle income countries: a systematic review. Soc Sci Med. Nov 2013;97:82-86. [CrossRef] [Medline]
- Waller G, Turner H. Therapist drift redux: Why well-meaning clinicians fail to deliver evidence-based therapy, and how to get back on track. Behav Res Ther. Feb 2016;77:129-137. [FREE Full text] [CrossRef] [Medline]
- Gagne CA, Finch WL, Myrick KJ, Davis LM. Peer workers in the behavioral and integrated health workforce: opportunities and future directions. Am J Prev Med. Jun 2018;54(6 Suppl 3):S258-S266. [FREE Full text] [CrossRef] [Medline]
- Bassuk EL, Hanson J, Greene RN, Richard M, Laudet A. Peer-delivered recovery support services for addictions in the United States: a systematic review. J Subst Abuse Treat. Apr 2016;63:1-9. [CrossRef] [Medline]
- Watson E. The mechanisms underpinning peer support: a literature review. J Ment Health. Dec 2019;28(6):677-688. [CrossRef] [Medline]
- Gillard S, Foster R, Gibson S, Goldsmith L, Marks J, White S. Describing a principles-based approach to developing and evaluating peer worker roles as peer support moves into mainstream mental health services. MHSI. Jun 12, 2017;21(3):133-143. [CrossRef]
- van der Zanden R, Kramer J, Gerrits R, Cuijpers P. Effectiveness of an online group course for depression in adolescents and young adults: a randomized trial. J Med Internet Res. Jun 07, 2012;14(3):e86. [FREE Full text] [CrossRef] [Medline]
- Day V, McGrath PJ, Wojtowicz M. Internet-based guided self-help for university students with anxiety, depression and stress: a randomized controlled clinical trial. Behav Res Ther. Jul 2013;51(7):344-351. [CrossRef] [Medline]
- Klatt C, Berg CJ, Thomas JL, Ehlinger E, Ahluwalia JS, An LC. The role of peer e-mail support as part of a college smoking-cessation website. Am J Prev Med. Dec 2008;35(6 Suppl):S471-S478. [CrossRef] [Medline]
- Conley CS, Hundert CG, Charles JLK, Huguenel BM, Al-khouja M, Qin S, et al. Honest, open, proud–college: effectiveness of a peer-led small-group intervention for reducing the stigma of mental illness. Stigma and Health. May 2020;5(2):168-178. [CrossRef]
- Silver J, Nemec PB. The role of the peer specialists: unanswered questions. Psychiatr Rehabil J. Sep 2016;39(3):289-291. [CrossRef] [Medline]
- Lloyd-Evans B, Mayo-Wilson E, Harrison B, Istead H, Brown E, Pilling S, et al. A systematic review and meta-analysis of randomised controlled trials of peer support for people with severe mental illness. BMC Psychiatry. Feb 14, 2014;14:39. [FREE Full text] [CrossRef] [Medline]
- Fortuna KL, Naslund JA, LaCroix JM, Bianco CL, Brooks JM, Zisman-Ilani Y, et al. Digital peer support mental health interventions for people with a lived experience of a serious mental illness: systematic review. JMIR Ment Health. Apr 03, 2020;7(4):e16460. [FREE Full text] [CrossRef] [Medline]
- Ali K, Farrer L, Gulliver A, Griffiths KM. Online peer-to-peer support for young people with mental health problems: a systematic review. JMIR Ment Health. 2015;2(2):e19. [FREE Full text] [CrossRef] [Medline]
- Eddie D, Hoffman L, Vilsaint C, Abry A, Bergman B, Hoeppner B, et al. Lived experience in new models of care for substance use disorder: a systematic review of peer recovery support services and recovery coaching. Front Psychol. 2019;10:1052. [FREE Full text] [CrossRef] [Medline]
- Koh A, Swanepoel DW, Ling A, Ho BL, Tan SY, Lim J. Digital health promotion: promise and peril. Health Promot Int. Dec 13, 2021;36(Supplement_1):i70-i80. [FREE Full text] [CrossRef] [Medline]
- Erku D, Khatri R, Endalamaw A, Wolka E, Nigatu F, Zewdie A, et al. Digital health interventions to improve access to and quality of primary health care services: a scoping review. Int J Environ Res Public Health. Sep 28, 2023;20(19):1. [FREE Full text] [CrossRef] [Medline]
- Meyerhoff J, Haldar S, Mohr DC. The Supportive Accountability Inventory: psychometric properties of a measure of supportive accountability in coached digital interventions. Internet Interv. Sep 2021;25:100399. [FREE Full text] [CrossRef] [Medline]
- Bordin ES. The generalizability of the psychoanalytic concept of the working alliance. Psychotherapy: Theory, Research & Practice. 1979;16(3):252-260. [CrossRef]
- Horvath AO, Greenberg LS. Development and validation of the Working Alliance Inventory. Journal of Counseling Psychology. 1989;36(2):223-233. [CrossRef]
- Devilly GJ, Borkovec TD. Psychometric properties of the credibility/expectancy questionnaire. J Behav Ther Exp Psychiatry. Jun 2000;31(2):73-86. [CrossRef] [Medline]
- Gray NJ, Klein JD, Noyce PR, Sesselberg TS, Cantrill JA. Health information-seeking behaviour in adolescence: the place of the internet. Soc Sci Med. Apr 2005;60(7):1467-1478. [CrossRef] [Medline]
- Rideout V, Fox S, Well Being Trust. Digital Health Practices, Social Media Use, and Mental Well-Being Among Teens and Young Adults in the U.S. Providence. 2018. URL: https://digitalcommons.providence.org/publications/1093 [accessed 2025-08-16]
- Wong CA, Madanay F, Ozer EM, Harris SK, Moore M, Master SO, et al. Digital health technology to enhance adolescent and young adult clinical preventive services: affordances and challenges. J Adolesc Health. Aug 2020;67(2S):S24-S33. [FREE Full text] [CrossRef] [Medline]
- Sutcliffe AG, Gonzalez V, Binder J, Nevarez G. Social mediating technologies: social affordances and functionalities. International Journal of Human-Computer Interaction. Nov 2011;27(11):1037-1065. [CrossRef]
- Best P, Manktelow R, Taylor B. Online communication, social media and adolescent wellbeing: a systematic narrative review. Children and Youth Services Review. Jun 2014;41:27-36. [CrossRef]
Abbreviations
DHI: digital health intervention
DMHI: digital mental health intervention
mHealth: mobile health
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews
SAI: Supportive Accountability Inventory
SAM: supportive accountability model
SCT: social cognitive theory
SDT: self-determination theory
Edited by T de Azevedo Cardoso; submitted 13.Feb.2025; peer-reviewed by R James, E Hannah; comments to author 29.May.2025; revised version received 28.Jul.2025; accepted 31.Jul.2025; published 26.Sep.2025.
Copyright©Gary Kwok, Shannon Pui Ying Cheung, Jennifer Duffecy, Katie A Devine. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 26.Sep.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research (ISSN 1438-8871), is properly cited. The complete bibliographic information, a link to the original publication on https://www.jmir.org/, as well as this copyright and license information must be included.

