AI in therapy? Here’s what people think about it – according to research

November 1, 2024
5 min read

The past few years have seen a huge shift, with Artificial Intelligence (AI) becoming increasingly common in healthcare settings. But what do people actually think about its use in therapy?

This article sheds light on the latest research into the perceptions of mental health providers and the general public about the use of AI.

How is AI currently being used in therapy?

A review of the mental health space reveals that AI is currently being used in two main ways.

Chatbots / virtual therapists

  • For example, Wysa and Woebot offer AI chatbots that use evidence-based strategies, including cognitive-behavioral techniques, to help users manage mental health challenges, build resilience, and improve mental well-being.

Note-writing tools

  • For example, Upheal uses AI to automatically generate progress notes, facilitate scheduling, and provide session analytics that capture additional objective data points such as speech cadence, sentiment, and more.

Overall, integrating AI into mental health care has yielded benefits, including increased access to and reach of care.

For example, chatbots and virtual therapists allow for 24/7, anonymous access to care, which may increase access among folks who face barriers to care, including location, time constraints, scheduling difficulties, or even stigma. (See this 2021 review.)

In some cases, AI has also made therapy more cost-effective and reduced clinician workload by automating parts of the therapy process (e.g., education and assessment). As a psychoeducation tool, AI has shown promise, delivering psychoeducational prompts that are accurate, clear, and relevant, and responding to emotionally challenging queries with an empathic tone.

Although the use of AI is still relatively new in the mental health field, and we have yet to review some of the concerns about its use (see below), there are many ways in which AI has already enhanced clinical care. 

Most notable is its ability to support the note-taking process. Some tools, such as Upheal, can save clinicians up to 10 hours a week.

Overall positive public perceptions of AI in therapy 

In a 2022 cross-sectional study of 872 individuals, the public generally expressed positive views of AI in psychotherapy, particularly when its use is supervised by humans.

Interestingly, participants identified three important benefits of AI-based psychotherapy: the ability to talk comfortably about embarrassing experiences, availability at any time, and the option of remote communication.

Recent research also shows that people recognize AI as a resource that can help reduce therapist workload (note-taking technology like Upheal reduces time spent on therapy notes) and possibly lead to fewer human errors in clinical care (for example, mistakes in notes or billing).

Overall, perceptions of chatbots for mental health are positive. Clients appreciate the 24/7 availability and anonymity of AI tools like chatbots and virtual therapists, which make therapy easier to access despite barriers to mental health care, including stigma or difficulty scheduling sessions during regular business hours.

There is also hope that AI’s online, 24/7 availability may increase access to mental health support for those living in more remote areas.

The public’s three main concerns 

Although the public holds positive views of AI, there are also concerns about integrating AI into psychotherapy. 

For example, individuals worry that incorporating AI into therapy may lead to the loss of the most valuable “human” components, such as the ability to express empathy and understand complex human emotions (Abd-Alrazaq et al., 2021; Benda et al., 2024).

There are also concerns about whether AI can effectively personalize recommendations for mental health care. Finally, some folks worry about whether their healthcare data will remain private with the integration of AI into psychotherapy.

And what do therapists think? Providers’ perceptions of AI in therapy

Mental health professionals also foresee possible benefits of using AI in their clinical practice. 

For example, some mental health providers recognize that AI could help with clinical decision-making (e.g., determining which type of therapeutic intervention might be most effective for a particular client, given their symptom profile).

They also generally have positive views of the use of chatbots. Others recognize AI’s capability to assist with updating patient records, note-taking, and synthesizing patient health information, which can free up time and possibly reduce therapist burnout (Nash et al., 2023).

Among mental health providers, cognitive-behaviorally oriented therapists tend to have more favorable views of integrating AI into clinical practice than therapists with psychodynamic or systems theoretical orientations (Sebri et al., 2021).

Providers have also expressed some concern about incorporating AI into therapy. Like the general public, mental health providers worry about the ability of AI to understand complex emotions and mental health issues reliably and accurately (Brown & Halpern, 2021; Fiske et al., 2019; Sweeney et al., 2021). 

They also worry that the reduced human interaction involved in working with an AI therapist rather than a human therapist may affect the quality of clinical care provided.

The future of AI in therapy: opportunities and challenges

In line with views expressed by mental health providers and the general public, there are important opportunities for the effective integration of AI into clinical work to enhance mental health care. 

By automating parts of clinical care (e.g., taking notes), AI can reduce therapist workload and possibly burnout, which would likely improve the quality of patient care. 

Incorporating AI into other newer psychotherapy tools, like virtual reality, may further enhance patients’ therapy experiences. And although AI use in psychotherapy is still very new, it will continue to improve as technology and machine learning advance over time.

However, integrating AI into psychotherapy will not be without challenges. AI needs to be used ethically, AI systems need to be unbiased and inclusive, and the use of AI needs to be balanced with human empathy and judgment in psychotherapy.

Mental health professionals can help guide the responsible adoption of AI tools into clinical care. AI also needs to be integrated in a manner that adheres to existing healthcare privacy laws, such as the Health Insurance Portability and Accountability Act (HIPAA), as Upheal does with its note-writing software.

Successful collaboration between therapists and AI developers can ultimately help address these challenges to create effective and ethical AI systems for use in psychotherapy contexts. 

Conclusion

Generally, both the public and mental health providers foresee potential opportunities and possible pitfalls of using AI in mental health practice. Clear guidelines and regulations should be established to ensure responsible AI use in therapy. AI has the potential to transform mental health care, so long as it is used with respect for the importance of human elements in therapy.

Katie Galbraith, PhD and Matt Scult, PhD
Mindflex Health, Digital Mental Health Consulting