Applied Psychology

The Impact of AI on Mental Health

Feb 22, 2024 | By Jenna Van Schoor
Reading time: 5 min

What will the impact of AI be in the field of mental health? Artificial Intelligence (AI) is used in many fields to automate tasks and make routine processes more efficient. For example, AI can help with data gathering and analysis, but what about the ethics involved?

Improving existing technology is nothing new. However, the pace of change means chatbots, apps, and other digital services are evolving rapidly. While social scientists agree that more research is needed, recent studies point to these apps’ positive potential.

Many different applications are available, notably Woebot and October Health. This post will examine how these applications work and the ethical implications of using AI.

AI and Therapy

During the pandemic, online therapy took off when people couldn’t easily see a therapist in person. Beyond video sessions with a therapist over Zoom or similar platforms, digital therapy tools have also become popular because they are more accessible and affordable than face-to-face therapy.

These tools include applications that use chatbots to answer questions and provide information, known as psychological AI chatbots. The effectiveness of these tools still needs to be determined. However, research shows that AI techniques like machine learning and data mining can assist practitioners in providing more personalised care.

In other words, the upside of these technologies is that they can gather valuable information about how people are feeling and what mental health issues they are struggling with. On the downside, we must consider privacy and confidentiality issues, i.e., how we collect, store, and share data – this includes the user’s personal data and the content they share in the interactions with the app.

Ethical issues also come into play in exchanging personal information. What happens if someone is genuinely struggling? Is an app going to be able to help them? What liability is there if someone acts on information they received from an app and things go wrong?

There are no easy answers to these questions; the truth is that we don’t yet fully understand the impact of AI in psycho-educational tools. To get more insight, we’ll look at two popular apps and some of the ethical issues they raise.

Woebot

Woebot delivers Cognitive Behavioural Therapy (CBT) principles in an accessible format. Although not marketed as a replacement for traditional in-person therapy, the app works on the premise that applying CBT principles in a self-directed format can help people manage their mental wellbeing.

In summary, Woebot is a psychological AI chatbot that helps people talk about their feelings and gives them practical tips and information. Although a review on the psychology website Verywell Mind states that the app feels superficial, it could help someone new to mental health concepts develop more awareness of their behaviour.

October Health

October Health is another app that uses AI to direct people to information, guiding them through online assessments and conversations with a chatbot called Luna.

Luna, which is still in development, guides people to different sections of the app. The app is a repository of online resources that can help people learn more about common mental wellbeing challenges. Trained psychologists also offer webinars to help users learn more.

Once again, this app isn’t marketed as a substitute for traditional in-person therapy but can help people become more aware of ways to cope and feel better. 

October Health is also used at an enterprise level to help improve productivity and workplace performance by gathering data about employees’ mental health through the app. This data can guide human resources teams to understand what challenges affect their workforce. The same ethical issues of privacy and confidentiality apply here.

What will happen in the future?

These two summaries show that apps in this format have the potential to reach many people. We’ve also seen from apps like October Health that AI can help gather valuable data. However, while AI offers value in terms of accessibility and affordability, it can’t yet provide the same benefits as a traditional face-to-face therapeutic relationship.

Researchers argue that a successful therapeutic relationship involves a robust therapeutic alliance and shared decision-making, which cannot occur in the current context of psychological AI tools. Essentially, AI cannot replicate what a trained psychologist can offer. However, this doesn’t mean we can’t use these tools and traditional therapy together.

As mentioned in the intro, we can talk all day about the perceived impact of AI, but we simply don’t know how it will evolve. Its development depends on various factors, including how many people use it, how we continue to develop the algorithms, and the biases involved.

No one has the answers, but people have sketched various scenarios. According to a research article called “Is AI the Future of Mental Health?”, researchers have identified four possible outcomes:

  1. AI becomes a phenomenal tool for providing affordable mental health care, with few downsides.
  2. AI becomes a helpful tool, but it is too expensive or its risks outweigh its benefits.
  3. AI only becomes useful through collaboration, with healthcare practitioners and AI working together.
  4. Using AI proves riskier and less cost-effective than we thought.

Considering these four scenarios, a lot depends on how we use and develop AI in the present. There are also many factors we can’t control, for example, financial limitations and pressures. 

Learn more about the ethical implications of AI

We’ve briefly covered some of the ethical issues raised by psychological AI tools and their current and potential therapeutic applications.

From what we’ve discussed, there is much to consider when using AI as a potential therapeutic tool. To ensure the best possible outcome, we need to weigh the pros and cons carefully so we can build a constructive mental healthcare industry in the future.

Register for our online workshop, Navigating the Impact of AI in the Field of Mental Health, to learn more.
