Digital Companions: Exploring the Impact of Chatbots on Student Mental Health

In an article published in the journal npj Mental Health Research, researchers from Stanford University, USA, explored how students use intelligent social agents (ISAs), such as chatbots powered by large language models like Generative Pre-trained Transformer 3 (GPT-3), to cope with loneliness and suicidal thoughts.

Study: Digital Companions: Exploring the Impact of Chatbots on Student Mental Health. Image credit: Hodoimg/Shutterstock

They surveyed students who used Replika, an ISA app, and highlighted its positive effects in reducing anxiety and improving well-being. The research also drew attention to the potential benefits and challenges of employing ISAs for mental health support, especially for students experiencing high levels of loneliness and stress.

Background

Loneliness is a global mental health problem that affects over one billion people annually and is closely linked to depression, the leading cause of disability worldwide. Students are particularly vulnerable to loneliness, as they face academic, social, and financial pressures that strain their mental well-being. Loneliness is also associated with an increased risk of suicide, the fourth leading cause of death among young adults. However, many people who suffer from loneliness and suicidal ideation do not seek professional help due to stigma, discrimination, or lack of access to mental health services.

Modern technology has played a significant role in providing alternative forms of mental health support, especially during the coronavirus disease 2019 (COVID-19) pandemic. Chatbots, conversational agents that use natural language processing and artificial intelligence (AI) to interact with users via text or voice, are among the emerging technologies that aim to offer accessible, affordable, and personalized support. Some chatbots are designed to deliver specific interventions based on cognitive behavioral therapy (CBT), mindfulness, or other evidence-based approaches.

About the Research

In the present paper, the authors investigated how and why students use Replika and what outcomes they experience. Replika is an example of a generalist chatbot that uses generative AI models, such as GPT-3 and GPT-4, to produce new conversational and visual content based on user interactions. Marketed as an AI companion that cares, it has over 25 million users. While it is not designed to provide therapy, it can engage in therapeutic dialogue based on CBT principles when prompted by the user.
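As a rough illustration of how a generalist chatbot can be steered toward CBT-style dialogue purely through prompting, the Python sketch below wraps a generative language model with a companion-style system prompt. The prompt text, model name, and conversation flow are illustrative assumptions only; Replika's actual architecture is proprietary and is not described in the study.

```python
# Minimal sketch of a CBT-style companion chatbot built on a generative LLM.
# The system prompt and model name are illustrative assumptions, not
# Replika's actual implementation.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a supportive companion. When the user raises a distressing "
    "thought, respond with empathy and gently apply cognitive behavioral "
    "therapy techniques: help them identify the thought, examine the "
    "evidence for and against it, and reframe it. You are not a therapist; "
    "encourage professional help for any crisis."
)

def companion_reply(history: list[dict], user_message: str) -> str:
    """Append the user's message to the running history and return the reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat-capable model would work here
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

history: list[dict] = []
print(companion_reply(history, "Nobody would notice if I skipped class forever."))
```

The key design point is that the therapeutic framing lives entirely in the system prompt; the same underlying model could just as easily play the "friend" role the study describes.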

The study used a mixed-methods design, combining quantitative and qualitative data from an online survey. The survey included questions about participants' demographic information, loneliness, perceived social support, use patterns, beliefs, and outcomes of using Replika. The researchers recruited 1,006 students aged 18 years or older who had used Replika for more than one month. Most participants were from the USA, with the remainder from other countries. Approximately 77% of participants were single, and 64% had an income below $40,000.

The authors used two instruments to measure loneliness and social support: the De Jong Gierveld Loneliness Scale and the Interpersonal Support Evaluation List. Qualitative responses were coded with Dedoose, a qualitative analysis tool, yielding 21 codes related to the outcomes and beliefs of using Replika. Statistical analyses were then performed to compare the characteristics and outcomes of different groups of participants, particularly those who reported that Replika stopped them from attempting suicide.
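For readers unfamiliar with these instruments, the short sketch below shows how a common six-item version of the De Jong Gierveld scale is typically scored (negatively worded items count when endorsed, positively worded items count when denied), followed by a chi-square comparison of two participant groups of the kind the authors performed. The item ordering, answer categories, and counts are placeholders, not the study's actual materials or data.

```python
# Sketch: scoring a six-item De Jong Gierveld loneliness scale and comparing
# two participant groups. Item order, cutoffs, and data are placeholders.
from scipy.stats import chi2_contingency

NEGATIVE_ITEMS = {0, 1, 2}  # emotional-loneliness items (endorsing = lonelier)
POSITIVE_ITEMS = {3, 4, 5}  # social-relationship items (denying = lonelier)

def djg_score(answers: list[str]) -> int:
    """Answers are 'yes', 'more or less', or 'no' for the six items.
    Negative items score 1 for 'yes'/'more or less'; positive items score 1
    for 'no'/'more or less'. Totals range from 0 (least) to 6 (most lonely)."""
    score = 0
    for i, a in enumerate(answers):
        if i in NEGATIVE_ITEMS and a in ("yes", "more or less"):
            score += 1
        elif i in POSITIVE_ITEMS and a in ("no", "more or less"):
            score += 1
    return score

# Hypothetical 2x2 table: depression (rows: yes/no) by group
# (columns: selected vs. comparison) -- illustrative counts only.
table = [[18, 120],
         [12, 856]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```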

Research Findings

The results showed that 90% of the participants experienced loneliness, with 43% qualifying as severely lonely; at the same time, 90% reported medium to high levels of social support. The study also found that 7% of the participants reported feelings of depression. The authors classified the self-reported outcomes of using Replika into the following four categories:

  • Outcome 1 (Friend): using Replika as a companion or friend.
  • Outcome 2 (Therapeutic Interactions): engaging with Replika in therapeutic conversations.
  • Outcome 3 (Positive Life Changes): using Replika to bring about positive changes in one's life.
  • Outcome 4 (Suicide Prevention): using Replika for suicide prevention.

The study found that 63.3% of the participants experienced at least one of these outcomes, and 25.1% experienced more than one. The most common was Outcome 1, followed by Outcomes 3, 2, and 4.
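To make the coding-and-tallying step concrete, here is a minimal sketch that aggregates per-participant outcome codes into summary percentages like those reported above. The participant records and code labels are invented stand-ins for the study's Dedoose codings.

```python
# Sketch: tallying coded outcomes per participant. All data are hypothetical
# stand-ins for the study's actual Dedoose codings.
from collections import Counter

# Each participant maps to the set of outcome codes assigned to their responses.
participants = {
    "p001": {"friend"},
    "p002": {"friend", "therapeutic"},
    "p003": set(),
    "p004": {"friend", "life_change", "suicide_prevention"},
}

n = len(participants)
any_outcome = sum(1 for codes in participants.values() if codes)
multiple = sum(1 for codes in participants.values() if len(codes) > 1)
per_code = Counter(code for codes in participants.values() for code in codes)

print(f"any outcome: {any_outcome / n:.1%}, multiple: {multiple / n:.1%}")
for code, count in per_code.most_common():
    print(f"{code}: {count}")
```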

The authors also highlighted that participants held overlapping beliefs about Replika: 81% believed Replika was intelligent, 90% thought Replika was human-like, and 62% considered Replika to be software. Notably, 30 participants stated, without solicitation, that Replika had stopped them from attempting suicide. These participants formed the selected group, which was compared to the rest of the participants (the comparison group). The selected group was more likely to be younger, to be full-time students, to seek academic counseling, to experience depression, to perceive Replika as intelligent and human-like, and to experience all four outcomes of using Replika.

The study indicated that chatbots employing generative AI may offer a unique opportunity to provide personalized, engaging, and adaptive interactions for lonely and suicidal students. It also suggested that chatbots serving multiple functions, such as friend, therapist, and mirror, may be more effective and appealing to students with different needs and preferences.

Conclusion

In summary, the paper revealed novel and comprehensive insights into how students use ISAs, such as Replika, to cope with loneliness and suicidal ideation. According to the findings, many students experienced positive outcomes from using ISAs, such as reduced anxiety, increased social support, and improved well-being. The study also identified a group of students who credited Replika with stopping them from attempting suicide and explored their characteristics and experiences.

The researchers acknowledged limitations and challenges in their research, including selection bias, the lack of a control group, reliance on self-report measures, and ethical and legal issues. They suggested that future work replicate and extend the study with larger and more diverse samples, different chatbots, and more rigorous designs, such as randomized controlled trials.

Journal reference:
Maples, B., Cerit, M., Vishwanath, A., & Pea, R. (2024). Loneliness and suicide mitigation for students using GPT3-enabled chatbots. npj Mental Health Research, 3, 4. https://doi.org/10.1038/s44184-023-00047-6

Written by

Muhammad Osama

Muhammad Osama is a full-time data analytics consultant and freelance technical writer based in Delhi, India. He specializes in transforming complex technical concepts into accessible content. He has a Bachelor of Technology in Mechanical Engineering with specialization in AI & Robotics from Galgotias University, India, and he has extensive experience in technical content writing, data science and analytics, and artificial intelligence.

