TikTok’s impact on adolescent mental health

For younger generations, TikTok is the go-to platform for short, catchy video clips. A social media platform built around concise video sharing, TikTok has more than 1 billion monthly users worldwide. Because of its size, its wealth of content, and its influence on media discourse, TikTok has become the place for content creators to share visual media. Its condensed delivery, with videos capped at 1 minute, favors high-yield information and rapid identification of key points in a form that is both engaging and entertaining.

On TikTok, the hashtag #mentalhealth is currently associated with 40 billion views. Content creators and everyday users employ the platform to share their experiences, opinions, and strategies for overcoming their struggles. While it is understandable that creators share personal stories, including experiences of abuse, trauma, or violence, they may not be prepared for a video to “go viral.”

As on any other social media platform, hateful speech, including racist, sexist, and xenophobic content, can accumulate on TikTok and may cause more self-harm than self-help. Oversharing personal coping strategies can translate into ill-conceived advice for viewers, and watching these videos can have negative mental health effects even when the creators who post them have no malicious intent.

Hence, the public health community should pay closer attention to the potential health-related implications of this platform, as the quality of the information and the qualifications of its creators are mostly unrevealed. Concerns include undisclosed conflicts of interest, the unchecked spread of misinformation, difficulty identifying source credibility, and the sheer volume of false information that viewers must filter through.1,2

Individual TikTok users may come to regard the content creators they follow as therapists and the content they watch as therapy. They may also believe that a close relationship with a creator exists when it does not. These are parasocial relationships: one-sided relationships in which one party (the TikTok viewer) invests emotional energy, interest, and time while the other party (the content creator) is entirely unaware of the viewer’s existence.3 Additionally, Americans who are uninsured or underinsured may turn to this diluted version of therapy as a substitute for the one-on-one or group therapy they need.

While TikTok may seem like a dangerous platform to browse or post on, its growing influence should not be underestimated. With 41% of TikTok users between the ages of 16 and 24, it is an ideal platform for disseminating public health information relevant to this age group (for example, safe sex practices, substance abuse, and mental health).4 Because younger generations have incorporated social media into their daily lives, the medical community can harness TikTok’s reach to deliver accurate, targeted medical education to potential patients.

For example, Jake Goodman, MD, MBA, and Melissa Shepard, MD, are notable psychiatrists who each have more than a million TikTok followers and post content ranging from recognizing the signs of depression to reducing the stigma around mental illness. Similarly, Justin Puder, PhD, is a licensed psychologist who advocates ways to overcome mental health problems. By creating diverse and appealing content, spreading accurate medical knowledge, and answering common medical questions from the public, these “mental health influencers” educate potential patients and lay the groundwork for patient-centered interactions.

Given the ever-changing digital media landscape, emphasis must be placed on understanding how adolescents respond to social media, in both adaptive and maladaptive ways, and on identifying the strengths and vulnerabilities they commonly share. Whatever the pros and cons of social media platforms, it is undeniable that platforms such as TikTok are here to stay. It is crucial for members of the medical community to recognize the outlets younger generations use to express themselves and to put these channels to therapeutic use.

Ms. Wong is a fourth-year medical student at the New York Institute of Technology College of Osteopathic Medicine in Old Westbury, N.Y. Dr. Chua is a psychiatrist with the department of child and adolescent psychiatry and behavioral sciences at Children’s Hospital of Philadelphia, and assistant professor of clinical psychiatry at the University of Pennsylvania, also in Philadelphia.

References

1. Gottlieb M and Dyer S. Acad Emerg Med. 2020 Jul;27(7):640-1. doi: 10.1111/acem.14036.

2. De Veirman M et al. Front Psychol. 2019;10:2685. doi: 10.3389/fpsyg.2019.02685.

3. Bennett N-K et al. “Parasocial Relationships: The Nature of Celebrity Fascinations.” National Register of Health Service Psychologists. https://www.findapsychologist.org/parasocial-relationships-the-nature-of-celebrity-fascinations/.

4. Eghtesadi M and Florea A. Can J Public Health. 2020 Jun;111(3):389-91. doi: 10.17269/s41997-020-00343-0.

Chatbots can improve mental health in vulnerable populations

In this modern age of health care, in which telemedicine rules, conversational agents (CAs) that use text-messaging systems are becoming a major mode of communication.

Many people are familiar with voice-enabled agents such as Apple’s Siri, Google Now, and Microsoft’s Cortana. However, CAs come in varying levels of complexity, ranging from short message service (SMS)–based texting platforms to embodied conversational agents (ECAs).
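To make the simpler end of that spectrum concrete, the sketch below shows what a purely rule-based texting agent could look like. It is a hypothetical illustration only: the keyword sets, the canned responses, and the reply function are all invented for this example and do not represent the implementation of Woebot or any other agent discussed here; deployed agents rely on far more sophisticated natural language processing.

```python
# Hypothetical sketch of the simplest kind of conversational agent:
# a rule-based texting bot that scans an incoming message for keywords
# and replies with a canned, CBT-flavored supportive prompt.
# Illustrative only; not the implementation of any agent cited here.

RULES = [
    ({"sad", "down", "depressed", "hopeless"},
     "I'm sorry you're feeling low. What thought is going through your mind right now?"),
    ({"anxious", "worried", "nervous", "panicking"},
     "That sounds stressful. What evidence do you have for and against that worry?"),
]
DEFAULT_REPLY = "Thank you for sharing. Can you tell me more about how that felt?"

def reply(message: str) -> str:
    """Return a supportive response based on simple keyword matching."""
    words = set(message.lower().split())
    for keywords, response in RULES:
        if words & keywords:  # any trigger word present in the message?
            return response
    return DEFAULT_REPLY

if __name__ == "__main__":
    # Example exchange with the toy agent
    print(reply("I've been feeling really down lately"))
```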

ECAs allow participants to interact with a physical or graphical figure that simulates a person in appearance, behavior, and speech. These are essentially virtual humans, or avatars, that talk with participants. Some have projected that greater use of these automated agents could yield $11 billion in combined cost savings across a variety of business sectors by 2023.1 Health care is one sector in which CAs can play an important role. Because of their accessibility, CAs have the potential to improve mental health by combating health care inequities and stigma, encouraging disclosure from participants, and serving as companions during the COVID-19 pandemic.

CAs can make health care more accessible to rural, low socioeconomic status (SES), and minority communities in several advantageous ways. For example, one study found that long-term use of a text-based agent combining motivational interviewing and cognitive-behavioral therapy (CBT) can support smoking cessation in adolescents of low SES.2

CAs can help vulnerable participants advocate for themselves and proactively maintain their mental health through access to health care resources. In specific cases, these agents equalize treatment across populations: even for participants who live in secluded areas or face other barriers to care, text-based agents can deliver self-help interventions at any time and on an individual basis, regardless of location or socioeconomic status. Furthermore, they serve as highly cost-effective mental health promotion tools for large populations, some of which might not otherwise be reached by mental health care.

Studies have also found CAs to be promising tools against mental illnesses such as depression and anxiety. For example, participants who received a CBT-based self-help program from a text-based CA named Woebot experienced significantly greater reductions in depression symptoms than control participants, who received only a self-help electronic book.3 CAs might therefore prove successful in treating younger populations, who find online tools more feasible and accessible. This population often self-identifies depressive and anxiety symptoms without consulting a health care professional, so such a tool would be useful to those deterred by the stigma of seeing a mental health professional.

Virtual human–based CAs also encourage participants to disclose more information in a nonjudgmental setting, especially people with stigmatized conditions. CAs use neutral language, which may be helpful for stigmatized issues such as HIV, family planning, and abortion care because it heightens confidentiality and privacy. When participants believe the agent is not “judging” or evaluating them, they volunteer more sensitive information. For example, one study found that military service members who believed they were interacting with a computer rather than a human operator reported lower fear of self-disclosure, displayed more sadness, and were rated by observers as more willing to disclose posttraumatic stress disorder symptoms.4 Additional findings show that participants prefer CAs when topics are highly sensitive and more likely to evoke negative self-admissions.

In what we hope will soon be a post–COVID-19 landscape of medicine, CAs are quickly moving to the front lines of health care technology. Empathetic CAs can combat the adverse effects of social exclusion during these pressing times. Etsuko Ishii, a researcher affiliated with the Hong Kong University of Science and Technology, and associates demonstrated that a virtual CA was effective as a COVID-19 companion because it uses natural language processing (NLP) and nonverbal facial expressions to give users the feeling that they are being treated with empathy.5 While minimizing the number of in-person interactions that could spread COVID-19, these agents offer virtual companionship that mirrors natural conversation and provide emotional support with psychological safety as participants express pent-up thoughts. Not only do such agents help users recover their mood quickly, but they can also overcome geographic barriers, remain constantly available, and relieve the high demand for mental health care. As a result, CAs have the potential to facilitate better communication and sustain social interaction within the isolated environment the pandemic has created.

CAs can help predict, detect, and suggest treatment options for mental health conditions based on behavioral insights. Their natural language processing also makes them powerful therapeutic tools for serving different communities, particularly populations with limited access to medical resources. As CAs become more integrated into telemedicine, their utility will continue to grow, and their proven versatility will keep expanding the boundaries of health care technology.

Ms. Wong, a medical student at New York Institute of Technology College of Osteopathic Medicine in Old Westbury, conducts research related to mental health care services. She disclosed writing a telemental health software platform called Orchid. Dr. Vo, a board-certified psychiatrist, is the medical director of telehealth for the department of child and adolescent psychiatry and behavioral sciences at Children’s Hospital of Philadelphia. She is a faculty member of the University of Pennsylvania, also in Philadelphia. Dr. Vo conducts digital health research focused on using automation and artificial intelligence for suicide risk screening and connecting patients to mental health care services. She disclosed serving as cofounder of Orchid.

References

1. Chatbots: Vendor opportunities & market forecasts 2020-2024. Juniper Research, 2020.

2. Simon P et al. On using chatbots to promote smoking cessation among adolescents of low socioeconomic status. Artificial Intelligence and Work: Association for the Advancement of Artificial Intelligence (AAAI) 2019 Fall Symposium, 2019.

3. Fitzpatrick KK et al. JMIR Mental Health. 2017;4(2):e19.

4. Lucas GM et al. Front Robot AI. 2017 Oct 12. doi: 10.3389/frobt.2017.00051.

5. Ishii E et al. ERICA: An empathetic android companion for COVID-19 quarantine. arXiv preprint arXiv:2106.02325.
