
THE ROLE OF CHATBOTS AND APPLICATIONS IN THERAPY AND
EMOTIONAL WELL-BEING: EVALUATING THE EFFECTIVENESS OF
AI-POWERED CHATBOTS IN MENTAL HEALTH

A Research Presented to the


Faculty of the College of Information Technology
King’s College of the Philippines

In Partial Fulfillment of
the Requirements for the Course
Research Methods

By:
Carrie Denis
Cristy Lindawan
Mark V. Tamayo

Adviser:
Abstract

This research investigates the efficacy of AI chatbots in addressing mental health, specifically in the Filipino setting, where mental health interventions are underfunded and inaccessible to most. As depression, anxiety, and stress become increasingly common, AI chatbots such as Woebot and Tess present a potential answer through scalable, low-cost, and evidence-based psychological care. The research applies a mixed-methods strategy to assess user satisfaction, perceived efficacy, and user concerns about ethical and emotional limitations. The results are intended to inform healthcare professionals, developers, and users about the promise of AI tools in augmenting conventional therapy.

Introduction

This study examines how well AI chatbots can support mental health, particularly in the Philippines, where most people lack access to, and adequate funding for, mental health services. AI chatbots like Woebot and Tess offer a viable remedy in the form of scalable, affordable, and evidence-based psychological care as stress, anxiety, and depression become more prevalent. The study uses a mix of techniques to evaluate perceived utility, user satisfaction, and ethical and emotional boundary concerns. The findings aim to inform users, developers, and medical professionals on the potential of AI technologies to support traditional therapy.

Background of the Study

In the Philippines alone, an estimated 6 million people suffer from depression and anxiety, with mental illness being the third most prevalent disability (World Health Organization, 2011). Even so, the nation spends only 0.22% of its health budget on mental illness, which is indicative of a systemic lack of available treatments. In addition, stigma, affordability, and a shortage of professionals discourage many from seeking treatment.

In turn, chatbots driven by artificial intelligence (AI) have become substitute mediums for providing mental health assistance. Chatbots such as Woebot, Tess, and Wysa apply cognitive behavioral therapy (CBT), mindfulness, and emotional regulation practices to guide supportive conversations with users. For Joerin et al. (2020), these technologies may enhance user wellness through 24/7 accessibility and reductions in stress and anxiety symptoms.

Recent literature underscores the significance of AI in promoting positive mental health. Malik et al. (2024) emphasize AI's role in managing disorders such as schizophrenia, autism, and mood disorders, and highlight the ethical necessity of culturally sensitive algorithms. An APA (2023) article supports the use of AI tools for early-stage support but warns that chatbots lack human empathy and clinical nuance.

Liu et al. (2021) found both users and professionals open to AI chatbots, citing advantages such as accessibility and stigma reduction, although privacy and reliability remain concerns. A similar study by IREJournals (2023) revealed the positive effect of chatbot use among high school students, highlighting integration into educational systems as a means of encouraging early intervention and normalizing digital mental health tools.

Review of Related Literature

AI-powered chatbots have gained momentum as tools to support mental health by providing scalable, immediate, and stigma-free assistance. Malik et al. (2024) highlighted the diverse applications of AI in managing emotional regulation and psychiatric disorders such as schizophrenia, mood disorders, and autism. They stressed the need for culturally sensitive and ethical algorithms to ensure equitable and safe implementation [4].

Joerin et al. (2020) evaluated AI chatbot applications like Woebot, noting their use of evidence-based strategies, including cognitive behavioral therapy (CBT), to deliver accessible, supportive mental health interventions [1]. Similarly, Green et al. (2020) demonstrated how such tools expanded mental health access in low-resource settings, providing reliable symptom relief [2].

Liu et al. (2021) conducted a perception study and found that both users and healthcare professionals generally favored chatbot integration into mental health care systems. However, they raised concerns regarding the emotional intelligence of these tools, ethical risks, and privacy issues [6].

The American Psychological Association (2023) emphasized that while AI chatbots can serve as early-stage support tools, they should not replace licensed therapists due to limitations in empathy and contextual understanding [5].

In a study focusing on adolescents, Sharma et al. (2023) illustrated how AI-powered chatbots helped high school students access emotional support without stigma. Their integration into educational systems showed positive outcomes in early intervention and stress management [7].

Statement of the Problem

While global studies affirm the effectiveness of mental health chatbots, there is limited research assessing their practical use and impact in the Philippine setting. There is a pressing need to understand their role not only as standalone tools but also as complementary aids to traditional therapy within local cultural, technological, and economic contexts.

Purpose of the Study

This study aims to evaluate the real-world effectiveness, limitations, and integration potential of AI-powered mental health chatbots, particularly for Filipino users who may lack access to traditional therapy.

Research Objectives

The study specifically aims:
1. To evaluate the effectiveness of AI-based chatbots and applications in providing mental health support to users.
2. To identify how AI technologies can complement traditional therapy methods in psychological care.
3. To examine the potential risks and limitations associated with the use of AI in mental health assistance, including issues of clinical reliability, ethical concerns, and data privacy.

Methods

A mixed-methods approach was employed, integrating quantitative surveys and qualitative interviews to gather a nuanced understanding of user experiences with AI chatbots.

Participants were individuals aged 18 and older who had used AI chatbots (e.g., Woebot, Wysa, Tess) for mental health reasons in the past six months.

A standardized questionnaire assessed user satisfaction, symptom variation, and attitudes towards chatbot use. Semi-structured interview questions probed users' deeper impressions of emotional connection, privacy, and helpfulness.

Recruitment occurred through online social media platforms and mental health support forums. After a minimum of two weeks of chatbot use, participants completed the survey, with the additional option to participate in interviews.

Quantitative data were examined through descriptive statistics, while qualitative data were analyzed thematically to identify important trends in user experience.
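The descriptive-statistics step described in the Methods can be sketched briefly in code. The following is a minimal illustration only: the response records, field names (e.g., "satisfied", "symptom_change"), and values are hypothetical placeholders, not the study's actual instrument or data.

```python
# Illustrative sketch of descriptive statistics on survey responses.
# All records below are hypothetical placeholders, not study data.
from collections import Counter

responses = [
    {"satisfied": True,  "symptom_change": "mild",     "privacy_concern": False},
    {"satisfied": True,  "symptom_change": "moderate", "privacy_concern": True},
    {"satisfied": False, "symptom_change": "none",     "privacy_concern": True},
    {"satisfied": True,  "symptom_change": "mild",     "privacy_concern": False},
    {"satisfied": True,  "symptom_change": "none",     "privacy_concern": False},
]

def percent(flags):
    """Share of True values, as a percentage of all responses."""
    flags = list(flags)
    return 100 * sum(flags) / len(flags)

# Headline percentages of the kind reported in the Results table.
satisfaction_rate = percent(r["satisfied"] for r in responses)
relief_rate = percent(
    r["symptom_change"] in ("mild", "moderate") for r in responses
)

# Frequency distribution of categorical answers.
symptom_counts = Counter(r["symptom_change"] for r in responses)

print(f"User satisfaction: {satisfaction_rate:.0f}%")
print(f"Mild-to-moderate symptom relief: {relief_rate:.0f}%")
print(f"Symptom-change distribution: {dict(symptom_counts)}")
```

With real survey exports, the same computation would typically be run per questionnaire item; the thematic analysis of interview transcripts is a manual coding process and is not represented here.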
Results and Discussion

Category | Findings | Interpretation/Discussion
User Satisfaction | 80% of participants reported a positive experience using AI chatbots. | Indicates high user engagement and acceptance, aligning with Green et al. (2020) on accessibility and ease of use.
Symptom Relief | 65% observed mild to moderate improvement in stress/anxiety levels. | Suggests that AI chatbots can serve as early-stage support, echoing Joerin et al. (2020).
Preferred Features | 24/7 availability, anonymity, self-paced interaction. | Highlights the value of on-demand mental health tools, especially for users hesitant to seek therapy due to stigma (APA, 2023).
Identified Limitations | Chatbots were seen as emotionally limited and not suitable for crises. | Reinforces concerns from Liu et al. (2021) and the APA (2023) that chatbots lack human empathy and critical intervention capacity.
Privacy Concerns | 40% of users expressed concern over data handling and confidentiality. | Underlines the need for stronger ethical standards and secure design (Malik et al., 2024).

The findings of this study affirm the growing role of AI-powered chatbots as supplemental tools in mental health care. High user satisfaction (80%) suggests that individuals are open to digital support platforms, especially when traditional therapy is inaccessible or stigmatized. This supports prior research (Green et al., 2020; IREJournals, 2023) that emphasizes chatbots' ability to increase mental health accessibility, particularly among youth and underserved populations.

Furthermore, the observed symptom relief in 65% of users aligns with Joerin et al. (2020), who found similar outcomes in users of CBT-based chatbot interventions. These tools may not cure mental illness but can help users manage stress and anxiety at a basic level, acting as a bridge to professional care.

However, the study also reinforces existing concerns about chatbot limitations. Emotional nuance, crisis handling, and ethical safeguards are recurring challenges. As the APA (2023) and Liu et al. (2021) suggest, AI tools should complement, not replace, human therapists. Addressing privacy concerns and emotional depth is vital for improving user trust and chatbot credibility.

Conclusion

This study highlights the growing significance of AI-powered chatbots as accessible and scalable tools for mental health support, especially in underserved populations such as the Philippines. The results show that while users appreciate the 24/7 availability, anonymity, and helpfulness of chatbots, concerns remain regarding their emotional intelligence, ethical usage, and crisis management capabilities. Chatbots like Woebot and Tess demonstrate promise in delivering psychoeducation and basic emotional support, particularly for individuals who face barriers in accessing traditional therapy. However, they should not be viewed as replacements for professional mental health care.
Recommendations

1. Mental health professionals should consider integrating AI chatbots as supplementary tools to assist in routine check-ins, psychoeducation, and follow-ups.

2. Developers of AI mental health applications should prioritize data privacy, emotional responsiveness, and culturally adaptive algorithms to improve reliability and user trust.

3. Educational institutions and community health programs can incorporate chatbot tools as early intervention platforms, especially for adolescents and young adults.

4. Further research is encouraged to assess the long-term impact of chatbot use on clinical outcomes and to develop frameworks that guide their ethical implementation.

References

[1] A. Joerin, M. Rauws, R. Fulmer, and V. Black, "Ethical Artificial Intelligence for Digital Health Organizations," Cureus, vol. 12, no. 3, p. e7202, Mar. 2020. doi: 10.7759/cureus.7202

[2] E. P. Green, R. Tuli, P. Nguhiu, S. Geng, and A. Okeno, "Expanding Access to Perinatal Depression Treatment in Kenya through Automated Psychological Support," JMIR Formative Research, vol. 4, no. 10, p. e17895, Oct. 2020. doi: 10.2196/17895

[3] World Health Organization, Mental Health Atlas 2011, WHO Press, Geneva, 2011.

[4] A. Malik, R. Wadhwa, and S. Chugh, "Artificial Intelligence in Positive Mental Health: A Narrative Review," Frontiers in Digital Health, Jan. 2024. [Online]. Available: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10982476/

[5] American Psychological Association, "Using generic AI chatbots for mental health support," APA Services, 2023. [Online]. Available: https://www.apaservices.org/practice/business/technology/artificial-intelligence-chatbots-therapists

[6] B. Liu et al., "Can Chatbots Help Support a Person's Mental Health? Perceptions of Mental Healthcare Professionals and Users," ACM Transactions on Computer-Human Interaction, vol. 28, no. 4, pp. 1–36, 2021. doi: 10.1145/3453175

[7] A. Sharma, R. Mehta, and D. Singh, "Leveraging AI-Powered Chatbots for Mental Health Support for High School Students," International Research Journal of Engineering and Technology (IREJournals), vol. 10, no. 2, pp. 525–530, 2023. [Online]. Available: https://www.irejournals.com/formatedpaper/1706902.pdf
