Journal of Multidisciplinary Healthcare, Volume 17

Assessing the Effectiveness of ChatGPT in Delivering Mental Health Support: A Qualitative Study

Authors Alanezi F 

Received 30 October 2023

Accepted for publication 8 January 2024

Published 31 January 2024. Volume 2024:17, Pages 461–471

DOI https://doi.org/10.2147/JMDH.S447368

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 3

Editor who approved publication: Dr Scott Fraser



Fahad Alanezi

College of Business Administration, Department of Management Information Systems, Imam Abdulrahman Bin Faisal University, Dammam, 31441, Saudi Arabia

Correspondence: Fahad Alanezi, Tel +13-3330030, Email [email protected]

Background: Artificial intelligence (AI) applications are widely researched for their potential to improve healthcare operations and disease management. However, research also shows that these applications can have significant negative implications for service delivery.
Purpose: To assess the use of ChatGPT for mental health support.
Methods: Because ChatGPT is a novel technology with which few people are familiar, a quasi-experimental design was chosen for this study. Outpatients from a public hospital were included in the sample. Participants used ChatGPT for mental health support in a two-week experiment, after which semi-structured interviews were conducted with 24 individuals with mental health conditions.
Results: Eight positive factors (psychoeducation, emotional support, goal setting and motivation, referral and resource information, self-assessment and monitoring, cognitive behavioral therapy, crisis interventions, and psychotherapeutic exercises) and four negative factors (ethical and legal considerations, accuracy and reliability, limited assessment capabilities, and cultural and linguistic considerations) were associated with the use of ChatGPT for mental health support.
Conclusion: It is important to carefully consider the ethical, reliability, accuracy, and legal challenges and develop appropriate strategies to mitigate them in order to ensure safe and effective use of AI-based applications like ChatGPT in mental health support.

Keywords: ChatGPT, artificial intelligence, mentally-ill patients, support, motivation, anxiety


A Letter to the Editor has been published for this article.


Introduction

Mental health is important for several reasons. Firstly, it affects how we function and live each day. People with good mental health are better able to handle stress, maintain positive relationships, and function effectively in both their personal and professional lives.1 It also influences our ability to learn, make decisions, and adapt to changes in our environment. Secondly, there is a direct correlation between mental and physical wellness.2 Numerous physical illnesses, such as gastrointestinal problems, immune system abnormalities, and cardiovascular diseases, have been linked to mental health conditions like depression and anxiety. Additionally, behaviors tied to mental health, including sleep, exercise, and nutrition, affect physical health outcomes.3–7 Thirdly, the welfare of society and the social fabric depend on mental health. Positive social interactions, stronger social ties, and a sense of community and belonging are all enhanced by good mental health.8 On the other hand, mental health problems that go untreated can result in social exclusion, strained relationships, and detrimental societal effects such as elevated healthcare costs and lower productivity.9,10

Mental health is also a fundamental human right. According to the World Health Organization (WHO), which acknowledges mental health as a critical component of health and wellbeing, everyone has a right to the enjoyment of the highest attainable standard of mental health.11 Prioritizing mental health and ensuring that people have access to sufficient mental health services and support are therefore crucial. Recognizing its importance, the United Nations includes mental health among the Sustainable Development Goals.12 Mental illness is one of the most serious health-related problems worldwide. Major mental health conditions include anxiety, depression, substance use disorder, bipolar disorder, schizophrenia, eating disorders, obsessive-compulsive disorder, and post-traumatic stress disorder.13 Additionally, conditions such as tardive dyskinesia can arise as a side effect of medications.14 As of 2019, 13% of the global population, approximately one billion people, were living with mental disorders. Among these, anxiety has the highest prevalence (31%), followed by depression (28.9%), developmental disorders (11.1%), attention-deficit/hyperactivity disorder (8.8%), and bipolar disorder (4.1%).15 People with chronic mental disorders such as recurrent depressive disorder die 7 to 11 years earlier than the general population.16 Only 3% of low-income, 13% of lower-middle-income, 32% of upper-middle-income, and 25% of high-income countries reported a fully compliant policy or plan implementation for mental illness, indicating a lack of effective approaches at the policy level, most acutely in low-income countries.17 Despite the seriousness of the problem, governments globally allocate only 2% of their health budgets to the treatment and prevention of mental disorders.18

Moreover, only 4.6% of health research globally focuses on mental health,17 indicating that the area is under-researched. Given the seriousness of the problem, there is an immediate need to extend research on cost-effective and efficient interventions that could prevent and manage mental health disorders. By treating health literacy and support-based technology interventions as a social practice rather than as a method of changing individual behavior, it is possible to create a wide range of community-based initiatives that yield long-lasting improvements in health and equity. This approach does not minimize the importance of initiatives to advance health awareness and support at higher social and policy levels, but it does urge that such actions be taken with consideration for how they will affect people, workplaces, and communities in daily life.19 This requirement places community involvement at the forefront of all programs designed to increase mental health support, particularly those using applications based on cutting-edge artificial intelligence (AI) technology.

The commercial and healthcare sectors have undergone a dramatic transformation as a result of recent advances in AI technology, particularly with the introduction of automated virtual assistants or humanoids that provide remarkably accurate information in response to user enquiries. These AI solutions can be particularly beneficial in promoting self-management because of their simplicity, low cost, informational and instructional value, engagement, and round-the-clock availability. Recent advancements in deep learning and natural language processing have led to the creation of large language models like ChatGPT. These models have been widely used for a variety of tasks, such as text generation, language translation, and question answering.20 According to research,21–23 ChatGPT outperforms earlier models in terms of accuracy and efficiency when responding to a variety of queries. Additionally, ChatGPT has demonstrated the ability to produce coherent and well-organized text, making it useful for tasks like content creation and summarization.24

As a tool for patient care in the healthcare sector, ChatGPT has shown promise in supporting the delivery of quality care.25,26 With ChatGPT’s ability to give prompt and accurate information, teleconsultants can make more informed decisions and provide better patient care,27 and it can assist in disseminating general health-related information to the public.28 Studies29–33 indicated that ChatGPT applications were highly applicable to the healthcare sector, but they also highlighted the need for additional study to fully understand ChatGPT’s effects on patients’ health and psychology. AI-based chatbot applications have already proven effective in managing patients’ mental health.34–36 In addition, a recent scoping review found that mental health chatbots were easy to use, attractive, quick to respond, and trustworthy, and that users enjoyed interacting with them.37 However, no significant research on the use of ChatGPT for mental health was identified.14

Several theories underpin the evaluation of ChatGPT’s efficacy in delivering mental health support to patients. The Technology Acceptance Model (TAM) suggests that a user’s perception of a technology’s ease of use and usefulness influences its adoption. Applied here, it implies that patients’ acceptance and continued use of ChatGPT for mental health support could depend on how user-friendly and beneficial they find the interactions.38–40 Moreover, the Elaboration Likelihood Model (ELM) proposes that the persuasiveness of messages varies with the depth of cognitive processing; in the context of ChatGPT, this suggests that the effectiveness of its mental health support may relate to the quality of conversation and the extent to which it engages patients cognitively.41 Finally, Social Cognitive Theory (SCT) highlights the significance of observational learning and self-efficacy in behavioral change.42 In this case, patients may benefit from observing others’ positive experiences with ChatGPT and gradually build confidence in using it for mental health support. The convergence of these theories underscores the importance of assessing ChatGPT’s ease of use, the depth of its conversations, and the influence of observed experiences on patients’ confidence in utilizing it for mental health assistance. However, most studies38–42 have adopted quantitative methods in evaluating the effectiveness of ChatGPT and have overlooked qualitative methods, which can deepen understanding of users’ opinions and beliefs in different contexts.

Conducting a study in this context can have significant benefits. Firstly, access to mental health services can be limited by various barriers, including cost, stigma, and the availability of mental health professionals. Exploring innovative and accessible ways to deliver mental health support, such as through ChatGPT, could help bridge this gap and make mental health services more widely available to those in need. Secondly, as technology continues to advance, chatbots and virtual assistants are increasingly being utilized in various domains, including healthcare. ChatGPT, as a state-of-the-art language model, has the potential to revolutionize the way mental health support is delivered by providing a conversational interface that can engage with users in a human-like manner. Understanding the effectiveness of ChatGPT in this context can help inform the development and implementation of technology-based mental health interventions. Thirdly, ethical considerations are paramount in the field of mental health. The use of ChatGPT for mental health support raises ethical concerns related to privacy, data security, informed consent, and the potential impact on the therapeutic relationship. Investigating these ethical implications can help identify safeguards and guidelines to protect the well-being and rights of users. Therefore, this study intends to assess the use of ChatGPT for mental health support.

Methods

This study employed a quasi-experimental design43,44 because ChatGPT technology is novel and few individuals are familiar with it.

Study Setting & Participants

Factors for evaluating ChatGPT’s effectiveness in delivering mental health support were identified through a preliminary literature review. Participants were outpatients (18 years or older) with mental health conditions, including anxiety, depression, and behavioral disorders, from the hospital at King Fahad University, Saudi Arabia. During outpatient visits, patients were invited to participate in the research, and the purpose and objectives were explained. With their consent, patients were instructed to use ChatGPT (free version 3.5) at home for two weeks, for at least 15 minutes a day, to seek support in managing their condition. Participants were free to pose any mental health-related query, ask about self-management practices, or simply engage in friendly interaction. Following the two-week experiment, participants were asked to take part in semi-structured interviews at the university hospital. A semi-structured interview is a conversation between at least two individuals about a topic of shared interest that also encourages differing interpretations and the exchange of perspectives.45 Researchers often choose semi-structured interviews because the format allows them to pre-formulate questions and add more pertinent ones later in the interviewing process.45,46
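Participants interacted with the free ChatGPT web interface rather than with any custom software. Purely as an illustration of the kind of daily exchange described above, the following minimal Python sketch shows how a comparable supportive conversation could be scripted against the OpenAI chat API; the model name, system prompt, and sample query are hypothetical stand-ins and were not part of the study protocol.

# Illustrative only: participants used the free ChatGPT web interface, not the API.
# The system prompt and query below are hypothetical stand-ins.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

history = [
    {"role": "system",
     "content": "You are a supportive assistant. You do not diagnose; you offer "
                "general mental health information and evidence-based coping tips."},
]

def ask(user_message: str) -> str:
    """Send one turn of the support conversation and return the assistant's reply."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # stand-in for the free ChatGPT 3.5 model used in the study
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask("I have been feeling anxious lately. What daily habits might help?"))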

Selection and Sampling

Because the study required participants to use ChatGPT for a period of time, a readily accessible sample was essential. Consequently, both purposive and convenience sampling techniques were used, as is customary for comparable research.47 Purposive sampling was used to select participants based on their characteristics, knowledge, and experiences, ie, being affected by mental health problems and being aware of ChatGPT; convenience sampling was used to recruit individuals who could be easily accessed, ie, from the university hospital. Approximately 42 adult patients were initially asked whether they were interested in participating in the study, and 35 responded positively. Of these, only 24 were aware of ChatGPT and its function, and these 24 patients were selected to participate. Before the interviews, patients were thoroughly informed of the study’s objectives and given the opportunity to provide informed consent during their outpatient visit. Following approval, patients used ChatGPT at home for two weeks to seek mental health support in their own ways, interacting with it for a minimum of 15 minutes per day for 14 days. Twenty to thirty interviews are regarded as an appropriate sample size for qualitative investigations, particularly those employing interviews as a method of data collection.48 In this qualitative quasi-experimental investigation, therefore, 24 outpatients with different mental health conditions were considered for data collection.

Questionnaire Design

Semi-structured interviews can be conducted in a variety of ways. In contrast to structured interviews, they do not commence with a fully predetermined list of questions; rather, the researchers develop a small number of initial interview questions and, based on interviewees’ responses, introduce new questions or expand existing ones.49 Accordingly, the author designed an interview questionnaire containing four demographic questions on gender, age, education, and employment status, plus ten questions on participants’ perceptions of ChatGPT for delivering mental health support, based on their use of it. The questions are provided in Appendix A. The interview questions were then translated from English to Arabic by a qualified translator.50 Two professors from the eHealth department at Imam Abdulrahman Bin Faisal University certified the translated questionnaire; a few grammatical adjustments were suggested, and the Arabic version was revised to incorporate them.

Data Collection

The semi-structured interviews took place in the university hospital and were conducted in Arabic. The interviews were audio-recorded with the participants’ consent. Each interview lasted between 50 and 60 minutes, with an average of approximately 54 minutes.

Data Analysis

The recorded interviews were converted into transcripts (text documents) using NVivo software. The Arabic transcripts were then translated into English for analysis. The interview data were analyzed using thematic analysis, a common approach in qualitative studies.51 Initially, 162 distinct codes highlighting specific information were identified from the transcripts. The codes were then classified by similarity into twelve themes reflecting both negative effects (four themes: ethical and legal considerations, accuracy and reliability, limited assessment capabilities, and cultural and linguistic considerations) and positive effects (eight themes: psychoeducation, emotional support, goal setting and motivation, referral and resource information, self-assessment and monitoring, cognitive behavioral therapy techniques, crisis intervention, and psychotherapeutic exercises) on mental health support, which were then used to report the results.
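To make the code-to-theme aggregation concrete, the following is a minimal Python sketch of how coded transcripts of this kind might be tallied into themes; the code labels, theme mapping, and interviewee assignments are hypothetical stand-ins, since the study’s full 162-code codebook is not reproduced here.

# Hypothetical fragment of a codebook mapping interview codes to themes.
from collections import defaultdict

codebook = {
    "explained causes of anxiety": "Psychoeducation",
    "offered coping techniques": "Psychoeducation",
    "responded with empathy": "Emotional Support",
    "worried about data leaks": "Ethical and Legal Considerations",
    "admitted outdated training data": "Accuracy and Reliability",
}

# Hypothetical coded transcripts: interviewee -> codes found in their interview.
coded_interviews = {
    "Interviewee 2": ["worried about data leaks"],
    "Interviewee 3": ["admitted outdated training data", "offered coping techniques"],
    "Interviewee 4": ["explained causes of anxiety"],
    "Interviewee 6": ["responded with empathy"],
}

# Count interviewees mentioning each theme at least once -- the basis for
# statements such as "15 out of 24 interviewees" in the Results section.
theme_mentions = defaultdict(set)
for interviewee, codes in coded_interviews.items():
    for code in codes:
        theme_mentions[codebook[code]].add(interviewee)

for theme, people in sorted(theme_mentions.items()):
    print(f"{theme}: {len(people)} of {len(coded_interviews)} interviewees")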

Ethics-Related Factors

The research ethics committee at Imam Abdulrahman Bin Faisal University approved the study, which complies with the Declaration of Helsinki. Data collection and analysis were conducted in accordance with all applicable ethical standards. The purpose of the investigation and the participants’ legal rights were disclosed in full. Informed consent was obtained from all participants before the interviews began, participation was entirely voluntary, and each interviewee was given a fictitious name to safeguard anonymity. The participants’ informed consent included publication of anonymized responses.

Results

Participants’ Characteristics

Twenty-four patients participated in the interviews: eight were female and sixteen were male. The demographic information of the participants is presented in Table S1 and Appendix B.

ChatGPT for Delivering Mental-Health Support

Analysis of interview data resulted in twelve themes, which can be related to the use of ChatGPT for mental health support. These are presented in the following sections.

Psychoeducation

All participants recognized ChatGPT’s value for education, especially lifelong learning about mental health disorders, which psychologists and other doctors may not explain. Over 60% of participants (15 out of 24 interviewees) believed ChatGPT could educate patients about mental health conditions, their causes, symptoms, and treatment options, improving their mental health literacy. Nearly 80% of participants said ChatGPT helped them manage their mental health symptoms with relaxation, mindfulness, stress reduction, and other evidence-based coping skills they could use daily. The following statements support these findings.

I had been experiencing fear syndrome and negative thoughts for a while. Although I take a few medicines and practice yoga, I could not improve my condition completely because of a lack of awareness or complete information about my condition. I learned about my condition by asking ChatGPT, and it gave me a clear picture of myself and how I can improve my condition (Interviewee 4)

ChatGPT was very supportive, easy to use, and quick in its responses. I learned a lot about mental health conditions, the ways to improve, and the current research and updates in this area… like the use of AI-based technologies such as chatbots for providing mental support (Interviewee 21)

Emotional Support

More than 50% of the participants (13 out of 24 interviewees) referred to the emotional support provided by ChatGPT. The participants believed that ChatGPT could offer empathetic responses and validation to individuals who may be struggling with their emotions; it could also provide a non-judgmental space for individuals to express their feelings and concerns, and offer compassionate responses that help patients feel heard and understood. These findings are drawn from the following supportive statements.

I told the application that I am sad, and it replied that it was sorry to hear that I am sad and provided some suggestions to improve my mood… These things made me feel happier, seeing that a machine was caring for me and doing its best to make me feel happy, like when it told a joke… (Interviewee 6)

The application provided funny jokes, which brightened up my mood… For example, why can’t the bicycle stand on its own? Because it is two-tired… and many more jokes like this (Interviewee 17)

I felt really relaxed and peaceful, as I could openly express my feelings to a machine without revealing my identity (Interviewee 12)

Goal Setting and Motivation

A few participants (five out of 24 interviewees) observed that ChatGPT could help them set goals and develop a plan to achieve them. Furthermore, almost 50% of the participants (11 out of 24 interviewees) believed that it provided motivational support by encouraging them to take positive steps towards improving their mental health, such as setting small achievable goals and tracking progress over time. These findings can be inferred from the following supportive statements.

Taking a cue from my doctor, who suggested that I set goals and achieve them, I asked the application if it could help in this process. Although it could not store and monitor data, it suggested many AI-based applications for goal-setting and monitoring purposes. (Interviewee 19)

I requested it to help me stop smoking. It provided many motivational messages and the good effects of quitting smoking… It was a different experience seeking motivation from a stranger, but it helped me become motivated, as I became aware of many risks which I did not know about earlier (Interviewee 23)

Referral and Resource Information

A few participants (six out of 24 interviewees) found that ChatGPT could provide information about mental health resources like books, magazines, and journals, and suggest ways to find medical resources such as therapists, counselors, support groups, and hotlines. It could also help patients find local mental health services and provide contact information for relevant organizations or professionals. These findings are inferred from the following statements.

It helped me find some good orthopedic doctors in Riyadh. It suggested many applications like Labayh, which I used to find good orthopedic doctors in my area and their reviews. This helped me in deciding which doctor to select. (Interviewee 1)

It provided many book and journal names, like The Depression Cure: The 6-Step Program to Beat Depression without Drugs by Stephen Ilardi, which really helped me in managing my condition (Interviewee 15)

Self-Assessment and Monitoring

A majority of the participants (18 out of 24 interviewees) found that ChatGPT could facilitate self-assessment and monitoring of mental health symptoms over time. By seeking relevant information, the participants were able to assess and monitor their conditions. Furthermore, they observed that ChatGPT provided guidance on self-care practices that promote mental well-being, such as maintaining a healthy lifestyle, getting regular exercise, practicing good sleep hygiene, and engaging in activities that bring joy and fulfillment. These findings are inferred from the following supportive statements.

It suggested many approaches like getting enough sleep, taking a balanced diet, regular exercise, and many more… Most importantly, it explained to me how these lifestyle changes could help me come out of depression… like it said regular exercise would release endorphins, the feel-good hormones, which can reduce stress and improve mood. (Interviewee 11)

After learning about the symptoms of anxiety in detail, I was better aware of my condition and treatment… Also, I was able to track my progress and adopt self-care practices after receiving suggestions from the application (Interviewee 18)

Cognitive Behavioral Therapy (CBT) Techniques

A few participants (three out of 24 interviewees) opined that ChatGPT could help patients practice cognitive restructuring, a common CBT technique that involves identifying and challenging negative thought patterns and replacing them with more balanced and rational thoughts. In support of this finding, Interviewee 10 stated that

By seeking help from the application, I was able to identify my negative thoughts more effectively and started to replace them with positive ones… I was in a better position than before in doing this process because of the support from the application

Interviewee 8 stated that

I was sometimes confused in identifying thoughts because, although a thought was negative, I felt some positive angle to it… But by taking cues from the application, I was able to better frame my thoughts

Crisis Intervention

Some participants (eight out of 24 interviewees) observed that ChatGPT could provide crisis intervention support for patients who may be experiencing acute distress or need immediate assistance. It could also provide information on emergency hotlines and suicide prevention resources, and guide patients on the steps to take in a crisis. In this context, Interviewee 5 stated that

I tried asking for emergency helpline numbers… Like, I asked for an emergency number to seek help with suicidal thoughts… It provided the number 937 and mentioned the Saudi Red Crescent Authority (SRCA), which is responsible for providing emergency medical services and crisis intervention in Saudi Arabia, including suicide prevention.

While I received support from the application to manage my condition, I also asked about all the emergency support numbers and noted them down, as I was not aware of them… this would help me in case of emergencies (Interviewee 7)

Psychotherapeutic Exercises

About 50% of the participants (11 out of 24 interviewees) observed that ChatGPT could facilitate various psychotherapeutic exercises, such as guided imagery, journaling prompts, and thought-challenging exercises, to help patients explore their emotions, gain insights, and develop coping skills. This finding can be inferred from the following supportive statements.

The application helped me adopt effective practices to change my mood. For example, it suggested that I write down three positive things that happened in a day, asked me to think about why I felt they were positive, and asked me to appreciate the positive moments… this way I started to think more positively and avoided negative thoughts (Interviewee 22)

Its approach to motivating myself and gaining self-control influenced me… It suggested writing down the things or negative thoughts that bother me… and asked me to write why they bother me, how I can gain control over them, and to set goals to gain control… this way, I started to change my behavior. (Interviewee 13)

The findings also revealed some challenges associated with ChatGPT in providing mental health support, which included the following.

Ethical and Legal Considerations

There are ethical and legal considerations surrounding the use of AI in mental health support. Issues such as data privacy, consent, confidentiality, and the potential for biases in AI algorithms were among the challenges raised by most of the participants. These are inferred from the following statements.

I was worried that my personal details or the health-related queries I ask ChatGPT might be leaked or misused… Some of the information could be personal, and these days I hear a lot about personal data theft online (Interviewee 2)

I do not know whom to turn to if the ChatGPT suggestions prove to be a failure or pose any risk… I am not sure if it is regulated by the government or not, as it keeps saying that it is a language model and keeps on learning from new data sets (Interviewee 16)

Accuracy and Reliability

Over 80% of participants (20 out of 24 interviewees) observed various accuracy- and reliability-related issues with ChatGPT. Most of the concerns were about the reliability of the information it provides. The following statements support this finding.

When I enquired how reliable its information is, it stated that it may be trained on outdated or incomplete information; there could be biases in the training data; it is unable to verify facts; it is unable to adopt moral considerations… and a lot more… It confuses me… although the information it provided was mostly correct, it itself states that its data could be inaccurate or unreliable (Interviewee 3)

In some instances, it did not provide any data. It said it is only a language model and cannot prescribe any medicines for anxiety. But when I modified the question and asked what common medicines are used for anxiety, it gave a list of medicines like Prozac, Zoloft, and Lexapro. But are they reliable and safe? I am not sure. (Interviewee 20)

Limited Assessment Capabilities

Mental health conditions require careful assessment. However, most of the participants stated that the application lacks assessment skills. Furthermore, because its responses can be biased, safety concerns may emerge from its use for mental health support. It also lacks the human sense of touch and emotion, which is essential in treating mental health conditions. These findings are inferred from the following statements.

When I asked the application to diagnose and assess my anxiety, it stated that it does not have the ability to diagnose or assess individuals’ mental health conditions (Interviewee 24)

It was openly providing some medicine names for different conditions. This is a big concern, as a few people may use the information for unethical purposes (Interviewee 11)

One of the most important concerns I observed was the lack of personal interaction with a human. Although I had privacy and a sense of independence in freely sharing with ChatGPT, due to its emotionless state I do not know whether it understood my problems or not, and it was a little demotivating for me to interact with. (Interviewee 14)

Cultural and Linguistic Considerations

A few participants (four out of 24 interviewees) opined that ChatGPT may not be fully equipped to understand and address the cultural and linguistic diversity of patients. It may not be able to fully adapt to patients’ cultural beliefs, values, and language preferences, which can affect the effectiveness of the support provided. These findings can be inferred from the following statements.

It understood Arabic, which is a good thing; but it was not aware of a few cultural words and things that are traditionally rooted in Arabic culture, which affected my interaction with ChatGPT (Interviewee 11)

If I share problems like a fear about something that is culturally rooted, a personal doctor may understand my problem… I tried some instances with ChatGPT, but it was unable to recognize the issue (Interviewee 6)

Discussion

The qualitative analysis of the interview data resulted in twelve themes reflecting the impact of ChatGPT as a tool for mental health support. Of the twelve, eight themes were positively associated and four were negatively associated with the use of ChatGPT for mental health support.

One of the significant impacts of ChatGPT for mental health support was observed in relation to psychoeducation. Studies52–55 have observed that poor health literacy is a critical challenge facing all countries globally, resulting in increasing healthcare costs and poor management of public health. The findings in this study suggest that ChatGPT could be an effective intervention tool for psychoeducation, which can significantly improve individuals’ mental health and make them more effective in taking informed decisions.56 Furthermore, in some contexts, ChatGPT was observed to provide emotional support to the participants. This indicates the relevance of ChatGPT in providing mental health support, where emotional support plays an important role in treatment.57 In addition, ChatGPT’s roles in motivation and goal setting and as a source of referral and resource information were identified in a recent study.57 Although ChatGPT was identified as effective for self-assessment and monitoring in a few studies,57 issues such as inadequate evaluation and poor utilization of emotional information were identified in others.58 Studies of similar AI applications for mental health support34–36 have also identified issues with emotional support, diagnosis, and assessment.

However, its role in other areas, such as treatment and support strategies, could make it more supportive in providing mental health support. For instance, cognitive restructuring support from ChatGPT can help patients develop healthier thinking patterns and improve their overall mental well-being. In addition, ChatGPT can offer psycho-social support by providing a compassionate and non-judgmental space for patients to talk about their concerns, fears, and challenges.59

However, it is important to remember that while ChatGPT can provide valuable support, it is not a substitute for professional mental health care; patients should always seek qualified medical advice and treatment from licensed mental health professionals for their specific needs.57 While ChatGPT is a powerful language model, it may not always provide accurate or reliable information, so it is important to ensure that the information it provides is evidence-based and up to date to avoid misinformation and potential harm to patients. Because the application is trained on datasets that may be new or old, there is a need to integrate fact-checking functionality to increase the accuracy and reliability of the application in providing mental health support.
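As a rough illustration of what such fact-checking functionality could look like, the Python sketch below routes model output through a verification step before it reaches the patient; the allow-list checker is a hypothetical placeholder, and a production system would instead verify claims against vetted clinical sources.

# Hypothetical fact-checking layer: only claims matching a vetted knowledge
# base pass through unflagged; everything else is marked for clinician review.
VETTED_FACTS = {
    "regular exercise can reduce stress",
    "good sleep hygiene supports mental well-being",
}

def is_verified(sentence: str) -> bool:
    """Placeholder check against a vetted source list (hypothetical)."""
    return sentence.strip().lower().rstrip(".") in VETTED_FACTS

def safe_reply(model_output: str) -> str:
    """Pass verified sentences through; flag unverified ones for review."""
    checked = []
    for sentence in model_output.split(". "):
        if not sentence:
            continue
        if is_verified(sentence):
            checked.append(sentence)
        else:
            checked.append(f"[unverified - review with a clinician] {sentence}")
    return ". ".join(checked)

print(safe_reply("Regular exercise can reduce stress. Herb X cures anxiety"))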

It is interesting to note that, although many participants stated that ChatGPT provides good information, they were concerned about its accuracy and reliability. ChatGPT may have limitations in accurately assessing patients’ mental health conditions, as stated in prior work.58 Furthermore, it may not be able to fully understand complex emotions, interpret non-verbal cues, or accurately assess the severity of mental health symptoms, which can affect the quality of support provided.58 There may also be safety concerns in providing mental health support through an AI tool; for example, in crises or emergencies, ChatGPT may not be able to provide appropriate and timely interventions, and patients may require immediate human intervention. As an AI-powered tool, ChatGPT lacks the human touch and emotional connection that can be important in mental health support.59 Building rapport, establishing trust, and providing empathy, which are crucial in therapeutic relationships, may be challenging for an AI model like ChatGPT. Nevertheless, recent studies60–62 have identified potential benefits of using ChatGPT for mental health, suggesting it as a reliable intervention tool.

While the findings in this study reflect more positive aspects of ChatGPT as a tool for mental health support, especially in education and access to general health information, there are a few challenges associated with it, and addressing these may require additional improvements to the application. In addition, further studies are required to gain a clear understanding of the effects of using ChatGPT for mental health support.

The findings in this study have both practical and theoretical implications. Firstly, they contribute to filling research gaps on the application of ChatGPT in healthcare. Secondly, they can be used by policy makers, system designers, and research institutions to develop strategies for applying ChatGPT to mental health support, thereby increasing access to healthcare services and reducing healthcare costs. The study also has a few limitations. It used a small sample, drawn from a single location, and awareness of and skill in using AI applications may differ among individuals. Therefore, the results of this study must be generalized with care.

Conclusion

The purpose of this study was to assess the use of ChatGPT for mental health support, which was achieved through a qualitative approach. As the findings show, despite various challenges, AI-powered tools like ChatGPT have the potential to provide valuable mental health support to patients when used appropriately and as part of a comprehensive mental health care plan. It is therefore important to carefully consider these challenges and develop appropriate strategies to mitigate them in order to ensure the safe and effective use of AI-based applications like ChatGPT in mental health support. Furthermore, the application of ChatGPT in healthcare is under-researched and requires immediate attention from researchers to expand the scope of associated research, as it has huge potential for improving healthcare services.

Disclosure

The author reports no conflicts of interest in this work.

References

1. Walsh R. Lifestyle and mental health. Am Psychologist. 2011;66(7):579–592. doi:10.1037/a0021769

2. Orpana H, Vachon J, Dykxhoorn J, McRae L, Jayaraman G. Monitoring positive mental health and its determinants in Canada: the development of the Positive Mental Health Surveillance Indicator Framework. Health Promot Chronic Dis Prev Can. 2016;36(1):1–10. doi:10.24095/hpcdp.36.1.01

3. Centers for Disease Control and Prevention. Heart Disease and Mental Health Disorders; 2023. Available from: cdc.gov/heartdisease/mentalhealth.htm. Accessed January 26, 2024.

4. Salleh MR. Life event, stress and illness. Malays J Med Sci. 2008;15(4):9–18.

5. Clapp M, Aurora N, Herrera L, Bhatia M, Wilen E, Wakefield S. Gut microbiota’s effect on mental health: the gut-brain axis. Clin Pract. 2017;7(4):987. doi:10.4081/cp.2017.987

6. Naughton MJ, Weaver KE. Physical and mental health among cancer survivors: considerations for long-term care and quality of life. N C Med J. 2014;75(4):283–286. doi:10.18043/ncm.75.4.283

7. Romeo RD. The impact of stress on the structure of the Adolescent Brain: implications for Adolescent Mental Health. Brain Res. 2017;1654:185–191. doi:10.1016/j.brainres.2016.03.021

8. Ollinheimo A, Hakkarainen K. Critical thinking as cooperation and its relation to Mental Health and Social Welfare. New Ideas Psychol. 2023;68:100988. doi:10.1016/j.newideapsych.2022.100988

9. Boardman J. Social exclusion and mental health – how people with mental health problems are disadvantaged: an overview. Ment Health Soc Incl. 2011;15(3):112–121. doi:10.1108/20428301111165690

10. Heinz A, Zhao X, Liu S. Implications of the Association of Social Exclusion with Mental Health. JAMA Psychiatry. 2020;77(2):113–114. doi:10.1001/jamapsychiatry.2019.3009

11. World Health Organization. Connecting mental health and human rights; 2023. Available from: https://www.who.int/europe/activities/connecting-mental-health-and-human-rights. Accessed January 26, 2024.

12. World Health Organization. Mental health; 2023. Available from: https://www.who.int/health-topics/mental-health#tab=tab_1. Accessed January 26, 2024.

13. Marissa W. Mental health statistics 2023; 2023. Available from: https://www.singlecare.com/blog/news/mental-health-statistics/. Accessed January 26, 2024.

14. Uludag K, Wang DM, Goodman C, Wang L, Zhang X. Prevalence, clinical correlates and risk factors associated with Tardive Dyskinesia in Chinese patients with schizophrenia. Asian J Psychiatry. 2021;66:102877. doi:10.1016/j.ajp.2021.102877

15. World Health Organization. World Mental health Report; 2023. Available from: https://www.who.int/publications/i/item/9789240049338. Accessed January 26, 2024.

16. Chesney E, Goodwin GM, Fazel S. Risks of all-cause and suicide mortality in mental disorders: a meta-review. World Psychiatry. 2014;13(2):153–160. doi:10.1002/wps.20128

17. World Health Organization. Mental health atlas 2020; 2023. Available from: https://apps.who.int/iris/handle/10665/345946. Accessed January 26, 2024.

18. World Health Organization. The world health report 2001. Mental health: new understanding, new hope. Geneva; 2023. Available from: https://apps.who.int/iris/handle/10665/42390. Accessed January 26, 2024.

19. World Health Organization. Health literacy development for the prevention and control of noncommunicable diseases; 2023. Available from: https://www.who.int/publications-detail-redirect/9789240055391. Accessed January 26, 2024.

20. Aljanabi M, Ghazi M, Ali AH, Abed SA. ChatGPT: open possibilities. Iraqi J Computer Sci Mathematics. 2023;4(1):62–64.

21. Shen Y, Heacock L, Elias J, et al. ChatGPT and other large language models are double-edged swords. Radiology. 2023;307. doi:10.1148/radiol.230163

22. Jiao W, Wang W, Huang H, Wang X, Tu Z. Is ChatGPT a Good Translator? A Preliminary Study. Computation Language. 2023. doi:10.48550/arXiv.2301.08745

23. Gao CA, Howard FM, Markov NS, et al. Comparing scientific abstracts generated by ChatGPT to original abstracts using an artificial intelligence output detector, plagiarism detector, and blinded human reviewers. bioRxiv; 2022.

24. Aydın Ö, Karaarslan E. Is ChatGPT leading generative AI? What is beyond expectations? SSRN Electron J. 2023. doi:10.2139/ssrn.4341500

25. Primack D. Here come the robot doctors; 2023. Available from: https://www.axios.com/2023/01/18/chatgpt-ai-health-care-doctors. Accessed January 26, 2024.

26. Nov O, Singh N, Mann DM. Putting ChatGPT’s medical advice to the (Turing) test. SSRN Electron J. 2023. doi:10.2139/ssrn.4413305

27. Aljanabi M. ChatGPT: future directions and open possibilities. Mesopotamian J Cybersecurity. 2023:16–17. doi:10.58496/MJCS/2023/003

28. Cascella M, Montomoli J, Bellini V, Bignami E. Evaluating the feasibility of ChatGPT in healthcare: an analysis of multiple clinical and research scenarios. J Med Systems. 2023;47(1). doi:10.1007/s10916-023-01925-4

29. Biswas SS. Role of Chat GPT in public health. Ann Biomed Eng. 2023;51(5):868–869. doi:10.1007/s10439-023-03172-7

30. Khan RA, Jawaid M, Khan AR, Sajjad M. ChatGPT - reshaping medical education and clinical management. Pakistan J Med Sci. 2023;39(2). doi:10.12669/pjms.39.2.7653

31. Goodman RS, Patrinely JR, Osterman T, Wheless L, Johnson DB. On the cusp: considering the impact of artificial intelligence language models in healthcare. Medicine. 2023;4(3):139–140. doi:10.1016/j.medj.2023.02.008

32. Seth I, Rodwell A, Tso R, Valles J, Bulloch G, Seth N. A conversation with an open artificial intelligence platform on osteoarthritis of the Hip and treatment. J Orthopaedics Sports Med. 2023;05(01). doi:10.26502/josm.511500088

33. Sng GG, Tung JY, Lim DY, Bee YM. Potential and pitfalls of ChatGPT and natural-language artificial intelligence models for diabetes education. Diabetes Care. 2023;46(5):e103–e105. doi:10.2337/dc23-0197

34. Fingerprint for Success. The best mental health chatbots (& what they can do for you); 2023. Available from: https://www.fingerprintforsuccess.com/blog/mental-health-chatbot#:~:text=Woebot,conditions%20like%20depression%20and%20anxiety. Accessed January 26, 2024.

35. Sweeney C, Potts C, Ennis E, et al. Can Chatbots help support a person’s mental health? Perceptions and views from mental healthcare professionals and experts. ACM Transactions Computing Healthcare. 2021;2(3):1–15. doi:10.1145/3453175

36. Dosovitsky G, Pineda BS, Jacobson NC, Chang C, Escoredo M, Bunge EL. Artificial Intelligence Chatbot for Depression: descriptive Study of Usage. JMIR Form Res. 2020;4(11):e17065. doi:10.2196/17065

37. Abd-Alrazaq AA, Alajlani M, Ali N, Denecke K, Bewick BM, Househ M. Perceptions and opinions of patients about mental health chatbots: scoping review. J Med Internet Res. 2021;23(1):e17828. doi:10.2196/17828

38. Bin-Nashwan SA, Sadallah M, Bouteraa M. Use of chatgpt in academia: academic integrity hangs in the balance. Technol Soc. 2023;75:102370. doi:10.1016/j.techsoc.2023.102370

39. Shahsavar Y, Choudhury A. User Intentions to Use ChatGPT for Self-Diagnosis and Health-Related Purposes: cross-sectional Survey Study. JMIR Hum Factors. 2023;10:e47564. doi:10.2196/47564

40. Ali O, Murray PA, Momin M, Al-Anzi FS. The knowledge and innovation challenges of ChatGPT: a scoping review. Technol Soc. 2023;75:102402. doi:10.1016/j.techsoc.2023.102402

41. Paul J, Ueno A, Dennis C. ChatGPT and consumers: benefits, pitfalls and future research agenda. Int J Consum Stud. 2023;47(4):1213–1225. doi:10.1111/ijcs.12928

42. Salah M, Alhalbusi H, Ismail MM, Abdelfattah F. Chatting with ChatGPT: decoding the mind of chatbot users and unveiling the intricate connections between user perception, trust and stereotype perception on self-esteem and psychological well-being. Curr Psychol. 2023. doi:10.1007/s12144-023-04989-0

43. Zhu G, Fan X, Hou F, et al. Embrace Opportunities and Face Challenges: using ChatGPT in Undergraduate Students’ Collaborative Interdisciplinary Learning. Computers Society. 2023. doi:10.48550/arXiv.2305.18616

44. Han J-W, Park J, Lee H. Analysis of the effect of an artificial intelligence chatbot educational program on non-face-to-face classes: a quasi-experimental study. BMC Medical Educ. 2022;22(1). doi:10.1186/s12909-022-03898-3

45. Cohen L, Manion L, Morrison M. Research Methods in Education. 6th ed. New York: Routledge; 2007.

46. Brinkmann S. Unstructured and Semi-Structured Interviewing. The Oxford handbook of qualitative research; 2014:277–299.

47. Bonsu EM, Baffour-Koduah D. From the consumers’ side: determining students’ perception and intention to use ChatGPT in Ghanaian higher education. Journal of Education, Society & Multiculturalism. 2023;4(1):1–29. doi:10.2478/jesm-2023-0001

48. Dworkin SL. Sample Size Policy for Qualitative Studies Using In-Depth Interviews. Arch Sex Behav. 2012;41(6):1319–1320. doi:10.1007/s10508-012-0016-6

49. Kallio H, Pietilä A-M, Johnson M, et al. Systematic methodological review: developing a framework for a qualitative semi‐structured interview guide. Journal of Advanced Nursing. 2016;72(12):2954–2965. doi:10.1111/jan.13031

50. Cha E, Kim KH, Erlen JA. Translation of scales in cross‐cultural research: issues and techniques. Journal of Advanced Nursing. 2007;58(4):386–395. doi:10.1111/j.1365-2648.2007.04242.x

51. Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Research Psychol. 2006;3(2):77–101. doi:10.1191/1478088706qp063oa

52. Baker DW, Wolf MS, Feinglass J, Thompson JA, Gazmararian JA, Huang J. Health literacy and mortality among elderly persons. Arch Intern Med. 2007;167:1503–1509. doi:10.1001/archinte.167.14.1503

53. Berkman ND, Sheridan SL, Donahue KE, Halpern DJ, Crotty K. Low health literacy and health outcomes: an updated systematic review. Ann Intern Med. 2011;155:97–107. doi:10.7326/0003-4819-155-2-201107190-00005

54. Yin HS, Dreyer BP, Foltin G, van Schaick L, Mendelsohn AL. Association of low caregiver health literacy with reported use of nonstandardized dosing instruments and lack of knowledge of weight-based dosing. Ambul Pediatr. 2007;7:292–298. doi:10.1016/j.ambp.2007.04.004

55. DeWalt DA, Dilling MH, Rosenthal MS, Pignone MP. Low parental literacy is associated with worse asthma care measures in children. Ambul Pediatr. 2007;7:25–31. doi:10.1016/j.ambp.2006.10.001

56. Biswas SS. Role of Chat GPT in public health. Ann Biomed Eng. 2023;51(5):868–869. doi:10.1007/s10439-023-03172-7

57. Asch DA. An interview with ChatGPT about health care. NEJM Catalyst. 2023.

58. Yang K, Ji S, Zhang T, Xie Q, Ananiadou S. On the evaluations of ChatGPT and emotion-enhanced prompting for mental health analysis. arXiv; 2023.

59. Frąckiewicz M The Impact of ChatGPT on Mental Health and Wellbeing; 2023. Available from: https://ts2.space/en/the-impact-of-chatgpt-on-mental-health-and-wellbeing/#:~:text=ChatGPT%20is%20designed%20to%20provide,them%20manage%20their%20mental%20health. Accessed January 26, 2024.

60. Farhat F. ChatGPT as a complementary mental health resource: a boon or a bane. Ann Biomed Eng. 2023. doi:10.1007/s10439-023-03326-7

61. Tal A, Elyoseph Z, Haber Y, et al. The artificial third: utilizing ChatGPT in mental health. Am J Bioeth. 2023;23(10):74–77. doi:10.1080/15265161.2023.2250297

62. Spallek S, Birrell L, Kershaw S, Devine EK, Thornton L. Can we use ChatGPT for Mental Health and Substance Use Education? Examining Its Quality and Potential Harms. JMIR Med Educ. 2023;9:e51243. doi:10.2196/51243

Creative Commons License © 2024 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution - Non Commercial (unported, v3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.