
Researcher warns about dangers of AI chatbots for treating mental illness

JULY 8, 2020 — According to a national government survey, about half of Americans will experience a mental illness at some point in their lives. Yet the majority of those with mental illness don't receive any therapeutic treatment. For this reason, the COVID-19 pandemic has inspired a surge of companies offering smartphone-based psychotherapy powered by artificial intelligence and big data analytics.

Now, bioethicists at UTSA argue in new research that until the industry has more data on detection and treatment effectiveness, and until stronger patient-privacy regulations are in place, further funding of these therapeutic approaches should proceed with caution.

“Robots should not be used in lieu of therapists,” said assistant professor of philosophy Şerife Tekin, who works at the intersection of philosophy, biomedicine and artificial intelligence at UTSA. “My biggest concern is that there is not enough research on how effective these technologies are.”


Computerized therapy that uses artificial intelligence was first recommended in 2006 as a way of providing cognitive behavioral therapy (CBT) for the treatment of depression, panic and phobias. However, in April of this year, due to the coronavirus pandemic, the FDA changed its guidance to permit the health industry to expand the use of digital health devices for treating psychiatric disorders without further clinical trials.

Now, providers such as Woebot, a chatbot providing CBT for individuals, operate in the telepsychotherapy industry, exchanging 4.7 million messages with people per week.

Digital phenotyping refers to the practice of using a cell phone to monitor active and passive data and serve as an early warning system for a person’s mental health condition. Supporters of this computerized therapy say that chatbots, through CBT’s structured exercises, encourage a person to examine and change their habits of thought.

Tekin argues, though, that the data gathered by artificially intelligent chatbots may not be all that accurate. According to Tekin, the evidence for the efficacy of this therapeutic approach comes from only a limited number of studies, which typically rely on small, noncontrolled, nonrandomized samples. There is also little evidence that results are sustained beyond three months.

Some psychotherapy chatbots rely on the incorrect assumption that participants can report their moods accurately. Moreover, many services are self-directed, leaving the user in charge of tracking and reporting. This can limit the monitoring of mental and behavioral phenomena, as in the case of schizophrenia. There is also the risk of misdiagnosis. For example, a user may simply choose to ignore a service prompt, but the chatbot may interpret this as social withdrawal and consequently as a sign of mental distress.

In a face-to-face interaction with a therapist, a patient may trust the therapist enough to discuss their condition openly. In the digital realm, a patient may self-censor for fear of data breaches. Even if the developer issues a written privacy policy, Tekin argues, there are currently no regulations protecting the privacy and security of personal health information.




“I recommend that research funding should be allotted cautiously to develop and test the efficacy of this technology at this early stage,” said Tekin. “Furthermore, ethical questions and concerns need to be included in the development process of this technology from the very beginning; ethics cannot be an afterthought.”

Tekin’s research was published in the Journal of Philosophy and Technology. She is currently working with artificial intelligence engineers at UTSA to develop a more holistic and ethical approach to digital phenotyping.

Milady Nazir



UTSA Today is produced by University Strategic Communications, the official news source of The University of Texas at San Antonio.

Send your feedback to news@utsa.edu.




