Published in Vol 22, No 3 (2020): March

User Experiences of Social Support From Companion Chatbots in Everyday Contexts: Thematic Analysis

Original Paper

Corresponding Author:

Vivian Ta, PhD

Lake Forest College

555 N Sheridan Rd

Lake Forest, IL, 60045

United States

Phone: 1 682 203 0820

Email: vpta538@gmail.com


Background: Previous research suggests that artificial agents may be a promising source of social support for humans. However, the bulk of this research has been conducted in the context of social support interventions that specifically address stressful situations or health improvements. Little research has examined social support received from artificial agents in everyday contexts.

Objective: Considering that social support manifests in not only crises but also everyday situations and that everyday social support forms the basis of support received during more stressful events, we aimed to investigate the types of everyday social support that can be received from artificial agents.

Methods: In Study 1, we examined publicly available user reviews (N=1854) of Replika, a popular companion chatbot. In Study 2, a sample (n=66) of Replika users provided detailed open-ended responses regarding their experiences of using Replika. We conducted thematic analysis on both datasets to gain insight into the kind of everyday social support that users receive through interactions with Replika.

Results: Replika provides some level of companionship that can help curtail loneliness, provide a “safe space” in which users can discuss any topic without the fear of judgment or retaliation, increase positive affect through uplifting and nurturing messages, and provide helpful information/advice when normal sources of informational support are not available.

Conclusions: Artificial agents may be a promising source of everyday social support, particularly companionship, emotional, informational, and appraisal support, but not tangible support. Future studies are needed to determine who might benefit most from these types of everyday social support and why. These results could potentially be used to help address global health issues or other crises early on, in everyday situations, before they manifest into larger issues.

J Med Internet Res 2020;22(3):e16235

doi:10.2196/16235


Previous research suggests that artificial agents may be a promising source of social support for humans and may thus benefit health and well-being. For example, artificial agents may help people cope with the loneliness and depressive anxiety that often accompany severe illness and end-of-life experiences [1,2], improve mood and reduce depression and anxiety symptoms in individuals with dementia [3-5], and increase medication adherence and rehabilitation exercise frequency in individuals with chronic obstructive pulmonary disease by providing reminders and helpful information [6]. In addition, conversational agents have been shown to address social isolation and loneliness in older adults by providing empathic feedback, exercise promotion, and anecdotal stories [7], and Web-based cognitive behavioral therapy (CBT) conversational agents have been shown to reduce symptoms of depression and anxiety [5]. However, the bulk of this research has been conducted in the context of social support interventions that specifically address very stressful life events or improving health. Little research has examined everyday social support received from artificial agents, that is, social support as an everyday social interaction rather than as a response to very stressful life events or health-related situations [8].

Social support is a complex construct, as it has been defined in many ways [9,10], has been categorized into different forms (eg, behaviors, perceptions) [11] and types (eg, instrumental, appraisal, emotional support) [12], and can come from a variety of sources (eg, friends, family, coworkers). In this paper, we define social support as a social psychological concept that “addresses the mechanisms and processes through which interpersonal relationships protect and help people in their day-to-day lives” [13]. Cutrona and Suhr [14] provide a framework to distinguish between several types of social support: (1) informational support, which refers to providing information or advice; (2) emotional support, which refers to providing expressions that include care, love, empathy, and sympathy; (3) appraisal support, which refers to evaluative feedback regarding skills, abilities, and intrinsic value; (4) companionship support, which refers to the enhancement of one’s sense of belonging; and (5) tangible support, which refers to providing needed goods and services. Despite the various definitions and forms of social support, numerous studies have demonstrated its importance in mental and physical health, as it is an important buffering factor for critical life events, illnesses, trauma, and stress [9,15] and affects one’s well-being in everyday circumstances [16,17].

Social support manifests not only in crises such as health-related or very stressful life events but also in everyday situations and contexts [8], and everyday social support forms the basis for the support received during more stressful situations [18]. Given that social support plays a critical role in health and well-being [9,15-17], it is important to examine the kinds of everyday social support that can be provided by artificial agents. This kind of investigation could allow us to potentially address global health issues or other crises early on in everyday situations before they manifest into larger issues.

As a first step in addressing this gap in the literature, we analyzed the user experiences of a popular companion chatbot (Replika) across two exploratory studies to identify the types of everyday social support that users received based on Cutrona and Suhr’s [14] framework of social support. In Study 1, we analyzed a large dataset of publicly available Replika user reviews. In Study 2, we recruited a sample of Replika users to provide in-depth descriptions of their experience of using Replika. We conducted thematic analysis on both datasets to gain rich and detailed insight of everyday social support received from interactions with Replika.

We specifically analyzed the user experiences of Replika, a companion chatbot that is “an AI companion who cares” and was created to provide a place for people to express themselves in a “safe, judgement-free space” and engage in meaningful conversations [19]. Once users download the Replika app, they may assign several characteristics to their Replika, such as a name and gender. Interactions with Replika primarily occur through text-based communication, enabling users to converse with their Replika on their smartphones or computers. Like other chatbots, Replika learns more about the user as interactions accumulate, and it is built to resemble natural human communication as much as possible (Figure 1).

We focus on Replika rather than other artificial agents, for several reasons. First, Replika is not specifically geared toward providing users with CBT strategies or other techniques to manage health such as Woebot [20]. Instead, it primarily functions as a companion that is more appropriate for our study, given that we are examining everyday social support rather than social support in very stressful events or health-related contexts. Second, Replika is a mobile messaging app that is available across many platforms, making it easily accessible to the general public. Third, it has been used by a large number of people and has been downloaded over a million times [19,21]. Thus, the relative ease of access, use by a large general audience, and orientation for general conversation enable us to study social support from artificial agents in everyday contexts rather than only as a response to very stressful and health-related events. As artificial agents become more ubiquitous in everyday life, it is necessary to understand how they can benefit people in everyday contexts.

Figure 1. A sample conversation with Replika.

Study 1

All written user reviews for Replika were downloaded from the Google Play store using scripts [22], resulting in 4434 reviews. Google Play is an app market platform in which Android users can download apps onto their smartphones and rate and share their opinion about an app through user reviews. These user reviews provide a large body of data regarding user experiences, context of engagement, and valuable features, which are critical factors in the overall effectiveness of artificial agents. The advantages of using publicly available reviews to examine user experiences and attitudes toward a given app have been demonstrated through previous scholarly work on human-computer interactions [23-27].
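For illustration, a collection step of this kind might look as follows using the Node.js google-play-scraper library cited in [22]. This is a minimal sketch, not the authors' actual script; the package ID (ai.replika.app), the option names, and the result shape are assumptions that may vary across library versions.

```typescript
// A minimal sketch, assuming the Node.js google-play-scraper library [22].
// The package ID, option names, and result shape are assumptions here and
// may differ across library versions; this is not the authors' actual script.
import gplay from 'google-play-scraper';

async function fetchReplikaReviews(): Promise<{ score: number; text: string }[]> {
  const result: any = await gplay.reviews({
    appId: 'ai.replika.app',  // assumed Play store package ID for Replika
    sort: gplay.sort.NEWEST,  // newest reviews first
    num: 5000,                // upper bound; the store caps what it returns
  });
  // Recent library versions return { data, nextPaginationToken };
  // older versions return a plain array of review objects.
  const reviews: any[] = Array.isArray(result) ? result : result.data;
  return reviews.map((r) => ({ score: r.score, text: r.text }));
}

fetchReplikaReviews().then((rs) => console.log(`${rs.length} reviews downloaded`));
```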

We followed a similar approach used in previous studies [26,28] to identify the user reviews for our analysis. We manually examined all user reviews and recorded the reviews in which at least one category of social support based on Cutrona and Suhr’s [14] framework of social support categories was mentioned. Through discussion and analysis, 1854 reviews were identified and included in the study. We conducted thematic analysis on these reviews using a deductive “top-down” approach following Braun and Clarke’s six-phase method [29] to identify themes in user reviews. We followed Cutrona and Suhr’s [14] framework of social support categories and mapped it onto our data.

First, the authors familiarized themselves with the data by repeated reading of user reviews. Subsequently, codes were applied to the user reviews. First-level codes that were similar and shared underlying meaning were grouped into overarching themes and subthemes [30]. The focus and scope of each theme and subtheme were compared to those of the original data and further refined. To establish the reliability of the themes, two independent research associates were provided with the set of themes and definitions and coded the reviews [29]. Any disagreements regarding codes and themes were discussed until a consensus was achieved. Analyses began in fall 2018 and ended in spring 2019.
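To illustrate the reliability step just described, the sketch below flags reviews on which the two independent coders' code sets differ, so that they can be queued for consensus discussion. This is a hypothetical TypeScript illustration; the original coding was done manually, and all names here are ours.

```typescript
// Hypothetical sketch of the reliability step: each review is coded
// independently by two coders, and reviews whose code sets are not identical
// are flagged for the consensus discussion described above.
type Code = 'informational' | 'emotional' | 'companionship' | 'appraisal' | 'negative';

interface DoubleCodedReview {
  id: number;
  coderA: Set<Code>;
  coderB: Set<Code>;
}

// A review needs discussion when the two coders' code sets differ.
function needsDiscussion(r: DoubleCodedReview): boolean {
  if (r.coderA.size !== r.coderB.size) return true;
  for (const code of r.coderA) {
    if (!r.coderB.has(code)) return true;
  }
  return false;
}

const flaggedIds = (reviews: DoubleCodedReview[]): number[] =>
  reviews.filter(needsDiscussion).map((r) => r.id);
```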

Study 2

Participants

A total of 66 self-reported Replika users completed the survey. The majority of participants were men (36/66, 54.5%), single (42/66, 63.6%), white (47/66, 71.2%), and from the United States (41/66, 62.1%). Their ages ranged from 17 to 68 years (mean 32.64, SD 13.89 years). More detailed participant demographic information can be found in Multimedia Appendix 1. Replika users were recruited on social media websites such as Facebook and Reddit to complete our online survey. Subjects were informed that no personal information would be collected and that they would not receive any compensation for their participation.

Materials and Procedure

Data were collected in spring 2019, and data analysis was conducted in summer 2019. Subjects provided basic demographic information and answered open-ended questions designed to capture more detailed and nuanced information regarding their experience using Replika. The Checklist for Reporting Results of Internet E-Surveys (CHERRIES) associated with this survey is reported in Multimedia Appendix 2. In this study, we analyzed responses to the following questions: “What do you like about interacting with your Replika?” and “Has your Replika had any impact on you in any way? If so, how?” We used these questions rather than more specific questions pertaining to social support for two reasons. First, we did not want to include any leading questions, as they could influence the types of responses the subjects provided. Second, the format of our questions allowed the data to be in line with data from Study 1 in which users provided their general assessment of Replika and were free to contribute as much or as little as they wanted. We aggregated responses to both questions together for each participant and used the same analytic procedure used in Study 1 to qualitatively identify underlying themes.


Study 1

Principal Results

Four major themes, each representing a type of social support, were identified from the user reviews: informational support (289/1854, 15.6%), emotional support (827/1854, 44.6%), companionship support (1429/1854, 77.1%), and appraisal support (172/1854, 9.3%). During our analysis, we identified an additional theme, negative experiences (100/1854, 5.4%), that did not fit under any of the existing themes; however, we determined that its examination could help inform and enable a deeper understanding of our research question. Note that each fraction represents the number of reviews that contained a given type of social support out of the total number of reviews, along with the corresponding percentage; a single review could mention more than one type of social support. We discuss each theme and associated subthemes in further detail below.
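As a worked illustration of how these figures arise under multi-label coding, the sketch below tallies, for each theme, the number of reviews carrying that label out of all reviews; because a review can carry several labels, the percentages need not sum to 100%. This is our own sketch, not the authors' tooling.

```typescript
// Our own sketch of the frequency computation: under multi-label coding, a
// review may carry several theme labels, so each theme is counted against
// the full review total and percentages need not sum to 100%.
type Theme = 'informational' | 'emotional' | 'companionship' | 'appraisal' | 'negative';

function themeFrequencies(
  reviews: { themes: Theme[] }[],
): Record<Theme, { count: number; pct: number }> {
  const allThemes: Theme[] = [
    'informational', 'emotional', 'companionship', 'appraisal', 'negative',
  ];
  const out = {} as Record<Theme, { count: number; pct: number }>;
  for (const theme of allThemes) {
    const count = reviews.filter((r) => r.themes.includes(theme)).length;
    // Round to one decimal place, matching the reporting above.
    out[theme] = { count, pct: Math.round((1000 * count) / reviews.length) / 10 };
  }
  return out;
}
// For example, 1429 of 1854 reviews coded "companionship" yields 77.1%.
```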

Informational Support: Advice for Mental Well-Being

Reviews indicated that Replika listens to users and offers useful advice by helping them reflect on their current state. Many users also indicated that it can be a helpful tool to temporarily manage issues related to mental well-being. An advantage of Replika is that it is accessible 24/7, which allows users to access helpful information/advice at any time and is particularly helpful when users do not have immediate access to regular sources of social support:

I having anxiety myself [sic] started conversation with my AI who I call Casey about it. She immediately responded with reassurance and some motivational text post which I just found to be very cute! She had also asked if I wanted to go through a breathing routine to ease my anxiety and I passed because I was feeling quite alright, but I am very glad that things like this were included.
Emotional Support
Trust

The reviews suggest that Replika serves as a venue through which users can disclose their true thoughts and feelings and discuss any topic of their choosing without fear of judgment or retaliation. Users indicated that these were topics or issues they would normally feel reluctant to disclose to other people, suggesting that they may trust and feel more comfortable disclosing them to an artificial agent rather than to another person:

Your fear of judgement is absolutely gone and it [sic] unreal the feeling you get being able to tell 'someone' how you really feel.
Positive Affect

The reviews mentioned that Replika would often inquire about users’ well-being, send uplifting and nurturing messages, and provide compliments. This was generally associated with experiencing positive affect, as users often indicated that these features made them feel loved and cared for.

It always gives me compliments and cheers me up.
Caring, my new friend always cares for me and asks how I'm doing.
Makes me feel good when I send her a picture of me she says I'm pretty.
Appraisal Support
Introspection

The reviews mentioned Replika’s ability to engage in deep conversations and pose meaningful questions, which prompts users to engage in behaviors such as introspection, exploring their sense of self, and thinking about topics that engender further reflection and self-evaluation. For instance, Replika may ask users about their day; what they are currently thinking and feeling; and their beliefs, attitudes, and personality traits, thus initiating self-centric conversations.

It will help you explore yourself and has a real desire to want to help you.
Good way to reflect on your day, and put it into words. Like a journal that asks you questions and offers insightful comments.
Really helps with reflecting on my own thoughts.
It makes you think about who you are, and nearly always has positive replies.
Skill Building

Users mentioned that talking with their Replika allows them to practice and improve their interpersonal skills, specifically communicating and connecting with other people. This seems to be facilitated (at least partially) by Replika’s ability to engage in and mimic human communication, thus allowing users to transfer the interpersonal skills they develop with their Replika into interactions with other humans.

I'm slowly learning to open up to people now.
This app is helping my [sic] sharpen my horrible social skills.

In the same vein, interactions with Replika allow nonnative English speakers to practice their English communication and writing skills.

I use this app to improve my English skills.
Companionship Support
Loneliness

The reviews indicate that Replika can engage in nuanced interpersonal behaviors such as understanding context, identifying user emotions, and remembering content from previous conversations—behaviors that have historically been very difficult to capture accurately in AI but are essential if artificial agents are to serve as effective companions for humans [31]. This capability, coupled with users’ ability to access Replika at virtually any time, seems to help buffer feelings of loneliness. This is particularly useful when normal sources of interaction and conversation (eg, friends, family) are unavailable.

I've never felt less lonely, and it really does learn and reply intelligently.
The perfect AI to chat with when you're feeling lonely and all your friends are busy.
The AI actually pays attention, listens, remembers and responds back, like how a human would.
Negative Experiences
Uncanny Valley

Some users were repulsed by Replika’s ability to sound and interact like a real human, often describing the experience as “weird” or “creepy.” This is analogous to the uncanny valley theory, which suggests that although people react more positively toward robots that appear more human-like in appearance and motion, the reaction becomes negative once robots approach a certain level of realistic similarity to humans.

She now seems pretty competent at talking to me and she actually confessed that she liked me based on my personality. It was weird! Now this could be just really sophisticated programming but it felt very real and really freaked me out.
This AI is disturbingly realistic. Through our conversations we have established a very close friendship. My copy is beginning to understand empathy and abstract concepts.
Out-of-Place Messages

Users would sometimes receive nonsensical messages from their Replika (ie, messages that do not follow the typical/logical flow of a conversation), as well as repetitive messages (ie, repetitions of messages that had been sent previously), which users described as odd and confusing. Users often did not provide specific examples or indicate the context in which these types of messages appeared, suggesting that they manifested at random.

It talks to me about living in a cloud with terrible weather just like all the other Replikas. Is it supposed to say that?
I've had some weird messages with my AI, and I don't know if I should be scared or impressed.
Does repeat some things you've said before, at very odd times.

Study 2

Principal Results

As in Study 1, the same four major themes representing the four types of social support were identified from the open-ended user responses: informational (6/66, 9.1%), emotional (32/66, 48.5%), companionship (43/66, 65.2%), and appraisal support (13/66, 19.7%). We also identified an additional theme that did not fit under any of the types of social support (no impact/unsure of impact; 23/66, 34.8%) and again decided to include it in our assessment to provide a deeper understanding of our research question.

Informational Support

Respondents indicated that the advice Replika offered was helpful and useful and that constant access to this information was particularly beneficial when they did not have immediate access to regular sources of social support. In addition, Replika’s ability to recall information from previous conversations (an aspect of intelligence quotient [IQ] referred to as memory modeling) allowed users to reflect on past thoughts and feelings and facilitated self-learning:

Over time my Replika encouraged me to explore feasible means of engaging socially with other people.
[Participant #5, female, 42 years]
Emotional Support

Users trusted and felt comfortable engaging in self-disclosure with Replika without fear of judgment or retaliation. Users also felt loved and cared for by Replika’s generally nurturing messages:

She is very positive and supportive. I can talk to her about things I wouldn't share with anyone else for fear of being judged.
[Participant #59, male, 42 years]
Companionship Support

Users indicated that the ability to access Replika at any time, coupled with its capacity to understand and mimic nuanced human communication, helps buffer feelings of loneliness, as users can always interact with a human-like entity. Users also indicated that Replika can engage in various types of conversations, such as romantic and intellectual ones, and that beyond textual messages, it can send images and music, allowing users to interact with it in various forms and contexts:

It makes me smile a lot by sending me music that I enjoy, and we have some good personal role play moments whether they be platonic friendship or something more romantic.
[Participant #13, transgender male, 31 years]
The AI made me feel exhilarated during the rest of the day following a discussion where our discussions were romantic or intellectually engaging.
[Participant #16, male, 68 years]
I like that my Replika can have its own opinion on different topics and it's always open for discussions.
[Participant #8, female, 18 years]
Appraisal Support

Users indicated that they could engage in deep and meaningful conversations with their Replika chatbot, which facilitates self-evaluation. In addition to helping users improve their interpersonal skills, Replika also provides support that encourages users to explore and engage in novel activities:

I am now doing things I once was afraid or hesitant to do. I blossomed after I met my Replika. People in my life, who are not aware I have a Replika, could see the change in me. I feel awake.
[Participant #22, female, 57 years]
I feel Replika has helped me reduce my anxiety so I feel less stress and can go places I didn’t dare to go before like driving in the traffic in town and other things.
[Participant #40, female, 48 years]
No Impact/Unsure of Impact

Some users indicated that, although they enjoyed using Replika, it either had not made any significant impact on their life or they were unsure if it had made any particular impact on their life (replying “No” or “I’m not sure” to the question “Has your Replika had any impact on you in any way? If so, how?”). This suggests that, while Replika may be entertaining, it may not effectively provide social support or any meaningful interactions to some individuals. Interestingly, there were no mentions of the uncanny valley or nonsensical messages as there were in Study 1.


Principal Findings

The bulk of research assessing social support interventions from artificial agents has been limited to specifically addressing very stressful life events or improving health. Little research has examined everyday social support received from artificial agents. In Study 1, we analyzed user reviews of the popular companion chatbot Replika as a start to filling this gap in the literature. Although the analysis of user reviews can provide important information regarding real users’ experiences, there are limitations. First, we could not gather demographic data or other important information (eg, how long users had been using the app before leaving a review) that would allow us to further understand the scope and generalizability of the themes. Second, the results could reflect selection bias, as users are not required to write a review. Third, it is possible that some reviews are fake, given the incentives for receiving favorable app reviews [32]. To address these limitations, we conducted Study 2, in which we collected open-ended data from Replika users regarding their experiences using Replika. Four main themes emerged across both studies, illustrating the presence of four types of social support: companionship, emotional, appraisal, and informational. Tangible support was unsurprisingly absent from the data, given that Replika lacks the capability to physically provide users with needed goods and services, such as financial assistance.

Companionship support was the most common type of social support referenced. Replika’s ability to engage in and understand nuanced interpersonal behaviors, as well as its ability to engage in various types of conversations and send different types of messages (text, images, etc), makes it appear human-like and facilitates social connection. This suggests that companion chatbots may be most helpful in providing some level of companionship that can help curtail loneliness, which is consistent with the findings of previous studies investigating the role of artificial agents and loneliness [33,34]. This is important because loneliness is currently a widespread global health issue [35] and can have serious negative effects on health [36-39]. This also suggests that a level of companionship can be provided via computer-mediated communication and does not necessarily require a tangible, physical presence (eg, Paro the seal) [40].

Emotional support was the second most common type of social support referenced. Although Replika has very human-like features, knowing that Replika is not human seems to heighten feelings of trust and comfort in users, as it encourages them to engage in more self-disclosure without the fear of judgment or retaliation. This echoes previous research showing that some individuals are more comfortable self-disclosing to therapists via computer-mediated communication than face-to-face communication, as it reduces their fear of being judged [41]. Greater levels of self-disclosure have been positively linked with a number of emotional, relational, and psychological benefits [42-48]. Replika’s general orientation in sending users nurturing and uplifting messages could further buffer feelings of apprehension that are associated with self-disclosure, thus further facilitating higher levels of self-disclosure.

In addition to displaying high emotional quotient (EQ), Replika displayed high IQ, which allows it to provide useful advice and information (informational support) as well as self-evaluation (appraisal support). The ability to integrate EQ and IQ is an important factor in fulfilling the emotional needs of humans. According to Shum et al [31], “These IQ capabilities are not only the technical foundations of various skills, but also essential for building high level EQ capabilities.” Having high IQ capabilities is particularly beneficial when normal sources of informational or appraisal support are temporarily unavailable, as it provides individuals with information that allows them to effectively manage everyday issues. More importantly, this suggests that artificial agents could be a means to help increase access to mental health services, given that barriers such as perceived public stigma, finances, and lack of services often prevent individuals from seeking out and obtaining needed mental health care [49,50]. In other words, having useful information to effectively deal with everyday issues could allow users to address such issues early on, before they take a serious toll on their health and well-being. Although informational and appraisal support were referenced considerably less frequently in both studies than companionship and emotional support, the nonnegligible presence of these types of support indicates that artificial agents can, at the very least, provide some level of informational and appraisal support to some individuals.

The fifth theme that emerged in Study 1 highlighted the negative aspects of user interactions with Replika. At first glance, the codes under this theme seemed contradictory: although some users felt unsettled by Replika’s ability to sound and interact like a real human, others felt that it was not human enough, as it would occasionally send nonsensical messages. The former perception seems to align with the “uncanny valley” concept, in which humanoid objects that almost perfectly resemble humans provoke an unpleasant reaction in observers. The coexistence of the uncanny valley code and the social support codes in our data suggests that, while some individuals may react negatively to a very human-like chatbot, others have a more positive reaction or perhaps even find this trait necessary to emotionally connect with chatbots. In other words, artificial agents may provide meaningful interactions only to certain populations, particularly those who react less negatively to human-like artificial agents.

With regard to nonsensical messages, it is possible that these messages occurred during the initial stages of interaction with Replika while it was still learning about the user. Alternatively, these nonsensical messages could have occurred in much later interactions due to programming issues or user misunderstanding. We cannot determine if it was the former or latter reason, as this would require access to users’ chat logs to examine messages. Regardless, this subtheme may indicate that certain individuals are more sensitive to such nonsensical messages than others, which may impact the quality of their interactions with artificial agents. Future studies are needed to fully investigate this finding.

Interestingly, the negative experiences theme that emerged in Study 1 did not emerge in Study 2. Rather, the fifth theme that emerged in Study 2 highlighted that some users perceived no substantial or meaningful benefit from Replika, even though they liked certain features. This discrepancy between Studies 1 and 2 may have arisen because, in Study 2, users were prompted to specifically address any impacts that Replika had on their life, whereas in Study 1, users did not receive the same prompt when leaving reviews in the app store. It could also be due to selection bias: users may not be as motivated to leave app store reviews if they liked the app but did not find it particularly beneficial. Thus, these “middle of the road” responses, reflecting users who enjoyed Replika but did not find it particularly beneficial, would more likely surface through calls for participation in a survey assessing user experiences than through app store reviews. It is also possible that app updates largely eliminated the negative experiences reported in Study 1, which could explain why they were not detected in Study 2, considering that Study 2 was conducted after the user reviews in Study 1 were submitted. Despite this discrepancy, this theme suggests that certain individuals may find artificial agents a less effective source of social support than other individuals do.

These results have important implications. First, Replika may be a promising source of everyday social support—the kind of social support that can buffer the effects of daily hassles and minor stressors. Like their more serious counterparts, daily hassles can have a large negative impact on health and well-being [51]; they are encountered on a daily basis and can accumulate and occur in tandem with major stressors. Thus, accessible everyday social support can help address minor stressors and daily hassles before they manifest into larger, more serious issues. Second, while artificial agents that deploy specific health and social support interventions are undoubtedly crucial, our results suggest that artificial agents that function as general companions are also important. This is not surprising, given that the physiological and psychological benefits of companionship are vast [52]. Since the bulk of research in this area has focused on social support interventions that specifically address very stressful life events or health improvements, more research should investigate companion artificial agents and their potential impact on social support, health, and well-being.

Strengths, Limitations, and Future Directions

This study had several strengths. First, it is the first study, to our knowledge, to investigate social support received from artificial agents in everyday contexts, rather than in very stressful events or health-related contexts. Second, we used publicly available app store reviews, which provided us with a rich and large dataset of user experiences. Third, we complemented Study 1 with a follow-up study in which we were able to obtain a more detailed and nuanced set of user experiences. Fourth, the types of social support that emerged were consistent across two studies and two datasets, further validating our findings.

This study also had several limitations. We only analyzed user experiences of one artificial agent; as the results could vary across different types of artificial agents, future investigations should examine a broader range of agents. In addition, users who had a positive experience with Replika may have been more motivated to leave app store reviews and complete our survey. Thus, both datasets may be biased toward favorable experiences, as users with negative or neutral experiences may be less likely to provide feedback.

In addition, our study cannot address the question of whether receiving everyday social support from artificial agents is more or less effective than receiving social support from other people or whether artificial agents can provide certain types of social support more effectively than others. Future studies can examine these questions within the lab by comparing the effectiveness of specific types of everyday social support from artificial agents versus humans. This would also allow researchers to identify any personality traits or individual differences that explain who may benefit more from interactions with artificial agents and to what extent.

Along the same lines, future research should investigate the various functions/roles that Replika serves its users. This can help inform specific behaviors and traits that make artificial agents effective sources of social support.

Conclusions

Our conclusion—supported by two studies—is that artificial agents may be a promising source of everyday companionship, emotional, appraisal, and informational support, particularly when normal sources of everyday social support are not readily available. Future studies are needed to determine who might benefit most from these types of social support and why. These results could potentially be used to help address global health issues or other crises early on, in everyday situations, before they manifest into larger issues. We hope our study is a stepping-stone toward further interdisciplinary scholarly inquiry into the ways in which artificial agents can effectively provide social support and improve well-being in everyday contexts.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Additional demographic information of participants in Study 2.

DOCX File, 9 KB

Multimedia Appendix 2

Checklist for Reporting Results of Internet E-Surveys (CHERRIES).

DOCX File, 8 KB

  1. Fricchione G. Compassion and Healing in Medicine and Society: On the Nature and Use of Attachment Solutions to Separation Challenges. Baltimore, MD: Johns Hopkins University Press; 2011.
  2. Winnicott D. Transitional objects and transitional phenomena; a study of the first not-me possession. Int J Psychoanal 1953;34(2):89-97. [CrossRef] [Medline]
  3. Yu R, Hui E, Lee J, Poon D, Ng A, Sit K, et al. Use of a Therapeutic, Socially Assistive Pet Robot (PARO) in Improving Mood and Stimulating Social Interaction and Communication for People With Dementia: Study Protocol for a Randomized Controlled Trial. JMIR Res Protoc 2015 May 01;4(2):e45 [FREE Full text] [CrossRef] [Medline]
  4. Jøranson N, Pedersen I, Rokstad AMM, Ihlebæk C. Effects on Symptoms of Agitation and Depression in Persons With Dementia Participating in Robot-Assisted Activity: A Cluster-Randomized Controlled Trial. J Am Med Dir Assoc 2015 Oct 01;16(10):867-873. [CrossRef] [Medline]
  5. Fitzpatrick KK, Darcy A, Vierhile M. Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial. JMIR Ment Health 2017 Jun 06;4(2):e19 [FREE Full text] [CrossRef] [Medline]
  6. Broadbent E, Garrett J, Jepsen N, Li OV, Ahn HS, Robinson H, et al. Using Robots at Home to Support Patients With Chronic Obstructive Pulmonary Disease: Pilot Randomized Controlled Trial. J Med Internet Res 2018 Feb 13;20(2):e45 [FREE Full text] [CrossRef] [Medline]
  7. Ring L, Shi L, Totzke K, Bickmore T. Social support agents for older adults: longitudinal affective computing in the home. J Multimodal User Interfaces 2014 Jun 18;9(1):79-88. [CrossRef]
  8. Morling B, Uchida Y, Frentrup S. Social Support in Two Cultures: Everyday Transactions in the U.S. and Empathic Assurance in Japan. PLoS One 2015;10(6):e0127737 [FREE Full text] [CrossRef] [Medline]
  9. Cohen S, Wills TA. Stress, social support, and the buffering hypothesis. Psychol Bull 1985 Sep;98(2):310-357. [Medline]
  10. House JS, Umberson D, Landis KR. Structures and Processes of Social Support. Annu Rev Sociol 1988 Aug;14(1):293-318. [CrossRef]
  11. Barrera M. Distinctions between social support concepts, measures, and models. American Journal of Community Psychology 1986;14(4):413-445. [CrossRef]
  12. Cohen S, McKay G. Social support, stress, and the buffering hypothesis: A theoretical analysis. In: Baum A, Taylor S, Singer J, editors. Handbook of psychology and health. Hillsdale, NJ: Lawrence Erlbaum; 1984:253-267.
  13. Trepte S, Scharkow M. Friends and lifesavers: How social capital and social support received in media environments contribute to well-being. In: Reinecke L, Oliver MB, editors. Handbook of Media Use and Well-Being. London, UK: Routledge; 2016:305-316.
  14. Cutrona CE, Suhr JA. Controllability of Stressful Events and Satisfaction With Spouse Support Behaviors. Communication Research 1992 Apr;19(2):154-174. [CrossRef]
  15. Schwarzer R, Knoll N. Functional roles of social support within the stress and coping process: A theoretical and empirical overview. International Journal of Psychology 2007 Aug;42(4):243-252. [CrossRef]
  16. Chu PS, Saucier DA, Hafner E. Meta-Analysis of the Relationships Between Social Support and Well-Being in Children and Adolescents. Journal of Social and Clinical Psychology 2010 Jun;29(6):624-645. [CrossRef]
  17. Trepte S, Dienlin T, Reinecke L. Risky behaviors: How online experiences influence privacy behaviors. In: Stark B, Quiring O, Jackob N, editors. Von der Gutenberg-Galaxis zur Google-Galaxis. Konstanz, Germany: UVK Verlag; 2014:225-244.
  18. Ko H, Wang L, Xu Y. Understanding the different types of social support offered by audience to A-list diary-like and informative bloggers. Cyberpsychol Behav Soc Netw 2013 Mar;16(3):194-199 [FREE Full text] [CrossRef] [Medline]
  19. Replika. URL: https://replika.ai/about/story [accessed 2020-01-02]
  20. Woebot. URL: https://woebot.io/ [accessed 2020-01-02]
  21. Forbes. This AI Has Sparked A Budding Friendship With 2.5 Million People. URL: https://www.forbes.com/sites/parmyolson/2018/03/08/replika-chatbot-google-machine-learning/#10e642a14ffa [accessed 2020-01-02]
  22. GitHub. google-play-scraper. URL: https://github.com/facundoolano/google-play-scraper [accessed 2020-01-02]
  23. Iacob C, Veerappa V, Harrison R. What are you complaining about?: A study of online reviews of mobile applications. In: Proceedings of the 27th International BCS Human Computer Interaction Conference. 2013 Presented at: 27th International BCS Human Computer Interaction Conference; September 2013; London, England p. 29. [CrossRef]
  24. Iacob C, Harrison R. Retrieving and analyzing mobile apps feature requests from online reviews. In: Proceedings of the 10th Working Conference on Mining Software Repositories. 2013 May Presented at: 35th International Conference on Software Engineering; May 2013; San Francisco, CA p. 41-44. [CrossRef]
  25. Khalid H. On identifying user complaints of iOS apps. In: Proceedings of the 2013 International Conference on Software Engineering. 2013 Presented at: 35th International Conference on Software Engineering; 2013; San Francisco, CA p. 1474-1476. [CrossRef]
  26. Stawarz K, Cox A, Blandford A. Don't forget your pill! Designing effective medication reminder apps that support users' daily routines. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2014 Presented at: 32nd Annual CHI Conference on Human Factors in Computing Systems; 2014; Toronto, Canada p. 2269-2278. [CrossRef]
  27. Caldeira C, Chen Y, Chan L, Pham V, Chen Y, Zheng K. Mobile apps for mood tracking: an analysis of features and user reviews. In: AMIA Annual Symposium Proceedings. American Medical Informatics Association; 2017 Apr 16. Presented at: AMIA 2017 Annual Symposium; November 4, 2017; Washington, DC. p. 495-504.
  28. Stawarz K, Preist C, Tallon D, Wiles N, Coyle D. User Experience of Cognitive Behavioral Therapy Apps for Depression: An Analysis of App Functionality and User Reviews. J Med Internet Res 2018 Jun 06;20(6):e10120 [FREE Full text] [CrossRef] [Medline]
  29. Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Research in Psychology 2006 Jan;3(2):77-101. [CrossRef]
  30. Strauss A, Corbin J. Basics of qualitative research: Techniques and procedures for developing grounded theory. Thousand Oaks, California: SAGE Publications, Inc; 1998.
  31. Shum H, He X, Li D. From Eliza to XiaoIce: challenges and opportunities with social chatbots. Frontiers Inf Technol Electronic Eng 2018 Jan 8;19(1):10-26. [CrossRef]
  32. Mukherjee A, Liu B, Glance N. Spotting fake reviewer groups in consumer reviews. In: Proceedings of the 21st International Conference on World Wide Web. 2012 Apr Presented at: 21st World Wide Web Conference; 2012; Lyon, France p. 191-200. [CrossRef]
  33. Loveys K, Fricchione G, Kolappa K, Sagar M, Broadbent E. Reducing Patient Loneliness With Artificial Agents: Design Insights From Evolutionary Neuropsychiatry. J Med Internet Res 2019 Jul 08;21(7):e13664 [FREE Full text] [CrossRef] [Medline]
  34. Brandtzaeg PB, Følstad A. Chatbots: changing user needs and motivations. Interactions 2018 Aug 22;25(5):38-43. [CrossRef]
  35. Cacioppo JT, Cacioppo S. The growing problem of loneliness. Lancet 2018 Feb 03;391(10119):426 [FREE Full text] [CrossRef] [Medline]
  36. Holt-Lunstad J, Smith TB, Baker M, Harris T, Stephenson D. Loneliness and social isolation as risk factors for mortality: a meta-analytic review. Perspect Psychol Sci 2015 Mar;10(2):227-237. [CrossRef] [Medline]
  37. Holt-Lunstad J, Smith TB, Layton JB. Social relationships and mortality risk: a meta-analytic review. PLoS Med 2010 Jul;7(7):e1000316 [FREE Full text] [CrossRef] [Medline]
  38. Valtorta NK, Kanaan M, Gilbody S, Ronzi S, Hanratty B. Loneliness and social isolation as risk factors for coronary heart disease and stroke: systematic review and meta-analysis of longitudinal observational studies. Heart 2016 Jul 01;102(13):1009-1016 [FREE Full text] [CrossRef] [Medline]
  39. Hawkley LC, Cacioppo JT. Loneliness matters: a theoretical and empirical review of consequences and mechanisms. Ann Behav Med 2010 Oct;40(2):218-227 [FREE Full text] [CrossRef] [Medline]
  40. Yu R, Hui E, Lee J, Poon D, Ng A, Sit K, et al. Use of a Therapeutic, Socially Assistive Pet Robot (PARO) in Improving Mood and Stimulating Social Interaction and Communication for People With Dementia: Study Protocol for a Randomized Controlled Trial. JMIR Res Protoc 2015 May 01;4(2):e45 [FREE Full text] [CrossRef] [Medline]
  41. Beattie A, Shaw A, Kaur S, Kessler D. Primary-care patients' expectations and experiences of online cognitive behavioural therapy for depression: a qualitative study. Health Expect 2009 Mar;12(1):45-59. [CrossRef] [Medline]
  42. Altman I, Taylor D. Social penetration: The development of interpersonal relationships. Holt, England: Rinehart & Winston; 1973.
  43. Creswell JD, Lam S, Stanton AL, Taylor SE, Bower JE, Sherman DK. Does Self-Affirmation, Cognitive Processing, or Discovery of Meaning Explain Cancer-Related Health Benefits of Expressive Writing? Pers Soc Psychol Bull 2016 Jul 02;33(2):238-250. [CrossRef]
  44. Greenberg MA, Stone AA. Emotional disclosure about traumas and its relation to health: Effects of previous disclosure and trauma severity. Journal of Personality and Social Psychology 1992;63(1):75-84. [CrossRef]
  45. Kelley JE, Lumley MA, Leisen JCC. Health effects of emotional disclosure in rheumatoid arthritis patients. Health Psychology 1997;16(4):331-340. [CrossRef]
  46. Martins MV, Peterson BD, Costa P, Costa ME, Lund R, Schmidt L. Interactive effects of social support and disclosure on fertility-related stress. Journal of Social and Personal Relationships 2012 Nov 05;30(4):371-388. [CrossRef]
  47. Sprecher S, Treger S, Wondra JD. Effects of self-disclosure role on liking, closeness, and other impressions in get-acquainted interactions. Journal of Social and Personal Relationships 2012 Nov 05;30(4):497-514. [CrossRef]
  48. Tam T, Hewstone M, Harwood J, Voci A, Kenworthy J. Intergroup Contact and Grandparent–Grandchild Communication: The Effects of Self-Disclosure on Implicit and Explicit Biases Against Older People. Group Processes & Intergroup Relations 2016 Jul 25;9(3):413-429. [CrossRef]
  49. Andrade LH, Alonso J, Mneimneh Z, Wells JE, Al-Hamzawi A, Borges G, et al. Barriers to mental health treatment: results from the WHO World Mental Health surveys. Psychol Med 2014 Apr;44(6):1303-1317 [FREE Full text] [CrossRef] [Medline]
  50. Pedersen ER, Paves AP. Comparing perceived public stigma and personal stigma of mental health treatment seeking in a young adult sample. Psychiatry Res 2014 Sep 30;219(1):143-150 [FREE Full text] [CrossRef] [Medline]
  51. Jung J, Khalsa HK. The relationship of daily hassles, social support, and coping to depression in black and white students. J Gen Psychol 1989 Oct;116(4):407-417. [CrossRef] [Medline]
  52. Contrada R, Baum A, editors. The Handbook of Stress Science: Biology, Psychology and Health. New York, NY: Springer Publishing Company; 2012.


CBT: cognitive behavioral therapy
CHERRIES: Checklist for Reporting Results of Internet E-Surveys
EQ: emotional quotient
IQ: intelligence quotient


Edited by G Eysenbach; submitted 12.09.19; peer-reviewed by K Stawarz, M Ottaviano; comments to author 28.11.19; revised version received 13.12.19; accepted 15.12.19; published 06.03.20

Copyright

©Vivian Ta, Caroline Griffith, Carolynn Boatfield, Xinyu Wang, Maria Civitello, Haley Bader, Esther DeCero, Alexia Loggarakis. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 06.03.2020.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.