A recent study showed that 18% of adults in the United States have a mental health condition; that's over 43 million Americans. The epidemic is growing even as access to care declines. There is also an extreme shortage of providers: in Alabama, there is only one mental health professional for every 1,260 people. Another barrier for people seeking help is that many psychiatrists don't accept insurance. The issue also affects children; 7.7% of kids do not have access to mental health services through their private insurance.

As 1 in 5 adult Americans suffer from mental illness, suicide rates have climbed to an all-time high. There's no question that something has to be done to scale back this crisis. Researchers have turned to artificial intelligence (AI) as a way to help diagnose and treat mental illness. Many are excited about these new tools because they may be able to help large populations at lower cost.


This solution is much needed, as there's a high demand for psychiatrists and other mental health specialists. There simply aren't enough professionals practicing to meet the needs of the growing population in need of help. Almost 40% of Americans live in areas with a shortage of mental health professionals, and over 60% of counties in the United States don't have a single psychiatrist working within their borders. People who do have access to mental health services often can't afford to pay out of pocket, which is typically the only payment method accepted.


Many people suffering from mental illness turn instead to the ER or their primary care physician for help, where insurance is more likely to be accepted. People without depression see their physician three times per year on average, while people with depression visit more than five times per year. Mental health has surpassed heart conditions as the most expensive part of our health care system: an estimated $201 billion is spent on mental health care every year.


AI is becoming more accessible to the general population through smartphones and virtual assistants such as Amazon's Alexa and Google Home. Now major universities and companies are trying to give AI a new and more meaningful purpose. The idea is that a person would talk to an AI about what they're suffering from psychologically, and the AI would then diagnose and carefully treat the person's mental illness as a psychiatrist would. Some people consider this way of treatment impersonal, while others believe it will have significant benefits.

There has long been a stigma around mental health, and that is partly why many people are reluctant to seek help. They may be embarrassed to talk to someone about the problems they're dealing with, or afraid of being judged. The anonymity of AI could help patients let their guard down and really open up about what they're feeling. Another major advantage of using AI for mental health is lower cost.


Because these AI tools are so new, there are risks to consider. The major risk for this particular purpose is patient privacy. Unfortunately, there have been many security breaches over the past few years, and hackers seem able to steal personal information more easily than ever. As more medical records become digitized, more hackers are targeting health care. What many people worry about is their most personal and intimate conversations being leaked on the web, where those details could be linked to social media and consumer data. Programmers are trying to diminish this risk by employing techniques to further protect patients' information. Providers are considering storing minimal personally identifiable data, as well as routinely deleting transcripts once the analyses are completed. Providers will also have to ensure that the data is encrypted on the server itself.
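To make the data-minimization idea concrete, here is a small sketch (a hypothetical illustration, not any provider's actual implementation) of the two techniques mentioned above: storing only a keyed hash in place of the patient's identity, and discarding the transcript once the analysis is done.

```python
import hmac
import hashlib

# Hypothetical illustration of data minimization for a mental health app.
# The secret key would live only on the server, separate from the records.
SERVER_SECRET = b"rotate-me-regularly"

def pseudonymize(patient_id: str) -> str:
    """Return a keyed hash of the patient ID; the raw ID is never stored."""
    return hmac.new(SERVER_SECRET, patient_id.encode(), hashlib.sha256).hexdigest()

def analyze_and_discard(patient_id: str, transcript: str) -> dict:
    """Keep only the derived analysis, keyed by pseudonym; drop the transcript."""
    analysis = {"word_count": len(transcript.split())}  # stand-in for a real model
    record = {"patient": pseudonymize(patient_id), "analysis": analysis}
    del transcript  # routine deletion once the analysis is complete
    return record

record = analyze_and_discard("patient-123", "I have been feeling anxious lately.")
```

The stored record contains neither the patient's real identifier nor the conversation itself, so a breach exposes far less than a database of raw transcripts would.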

The designers would also have to program AI to distinguish between race, gender, and age. If the programmers only use their own tech team's voices and faces to train AI, it will have trouble reading non-verbal cues from women, people of color, and seniors, few of whom work in tech. It would also affect AI's ability to analyze speech patterns. Testing AI on only one demographic group could lead to false alerts and incorrect diagnoses for people outside that group. The programmers will have to train AI using large test groups that represent the whole community, as research clinicians do.
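As a toy illustration of that representativeness concern (the data, group labels, and tolerance threshold here are all hypothetical, not any lab's actual methodology), a training set can be checked against target population proportions before the model is ever trained:

```python
from collections import Counter

# Hypothetical target proportions the training data should reflect.
population_share = {"women": 0.51, "men": 0.49}

def underrepresented(samples, population_share, tolerance=0.10):
    """Flag any group whose share of the samples falls more than
    `tolerance` below its share of the population."""
    counts = Counter(samples)
    total = len(samples)
    flagged = []
    for group, target in population_share.items():
        observed = counts[group] / total
        if observed < target - tolerance:
            flagged.append(group)
    return flagged

# A skewed training set: 9 male speakers, 1 female speaker.
train = ["men"] * 9 + ["women"]
print(underrepresented(train, population_share))  # ['women']
```

In practice the check would cover many more attributes (age, race, accent, dialect) and feed back into data collection, but the principle is the same: measure the gap before the model inherits it.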

We must also put into place strict safety regulations to avoid harming patients. One simple programming error could negatively affect millions of users. The same level of cautiousness that goes into drug development should be applied to AI development. As long as these issues are addressed and corrected and safeguards are put into place, there is huge potential for AI to help with mental illnesses.


There have been four approaches to using AI that stand out from the rest. The first is using AI to help psychiatrists and other mental health professionals work more efficiently and effectively. AI can collect and process information much faster than humans can, then use that information to suggest effective treatment options. One example of this use is Ginger.io's virtual mental health services. The service offers video and text-based therapy and coaching sessions through a downloadable smartphone app, and it allows healthcare professionals to better track patients' progress by identifying times of crisis and creating personalized care plans. Ginger.io has already shown positive results: over one year, 72% of users reported clinically significant improvements in symptoms of depression.

Another way AI can help identify and treat mental illness is by anticipating problems. Quartet Health is a perfect example: it uses patients' medical histories and behavioral patterns to identify undiagnosed mental illnesses. For instance, if someone has been repeatedly tested for a non-existent cardiac problem, Quartet Health could suggest possible anxiety. It can also suggest preventative follow-up when patients experience anxiety or depression after receiving a bad diagnosis or treatment for a major physical illness. Quartet Health has reduced ER visits and hospitalizations by 15-25% for some of its users and has been adopted by many insurance companies and employer medical plans.

Chatbot counseling is another new AI tool created to simulate human conversation. The computer program can be accessed through a text- and voice-enabled AI interface. The purpose is to identify people who may be suffering from depression, anxiety, or substance abuse and then provide cost-effective care. One example of a chatbot is Woebot. Developed by clinical psychologists at Stanford University in 2017, Woebot uses cognitive behavioral therapy (CBT) to help treat depression and anxiety. CBT is a technique that has been used for forty years; it was designed to alter patients' negative thought patterns over a set number of sessions using highly structured talk psychotherapy. Woebot has already shown great results. In a study of college students suffering from depression, the students who used Woebot experienced a 20% improvement in only two weeks, as measured by PHQ-9 scores, a standard depression questionnaire. Because Woebot costs only $39 per month and is so convenient, many people talked to the bot daily, which is one reason for its success.
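Since the Woebot study measured improvement with PHQ-9 scores, it's worth seeing how that instrument works. The PHQ-9 is nine questions, each scored 0-3, for a total of 0-27; the severity bands below are the standard published cut-offs, while the code itself is just a sketch of the arithmetic:

```python
# PHQ-9: nine items, each scored 0-3, total 0-27.
# Standard severity bands: 0-4 minimal, 5-9 mild, 10-14 moderate,
# 15-19 moderately severe, 20-27 severe.

def phq9_severity(item_scores):
    """Total the nine item scores and map the total to a severity band."""
    assert len(item_scores) == 9 and all(0 <= s <= 3 for s in item_scores)
    total = sum(item_scores)
    if total <= 4:
        return total, "minimal"
    if total <= 9:
        return total, "mild"
    if total <= 14:
        return total, "moderate"
    if total <= 19:
        return total, "moderately severe"
    return total, "severe"

print(phq9_severity([2, 1, 2, 1, 1, 2, 1, 1, 1]))  # (12, 'moderate')
```

On this scale, a 20% improvement for a student scoring 12 ("moderate") would bring them down to roughly 9 or 10, right at the border of the "mild" band.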

A more advanced version of a chatbot is Ellie, a virtual therapist developed by the University of Southern California's Institute for Creative Technologies. She does what a chatbot does but can also detect nonverbal cues and respond accordingly. Ellie is an avatar rendered in 3-D on a television screen; she can nod approvingly and say "hmmm" to help patients feel more comfortable opening up. Her program can detect 66 points on the patient's face and track the patient's rate of speech. Ellie's actions, motions, and speech were designed to partially mimic a therapist's, but not so completely that patients would feel uncomfortable with the therapy.

A recent research project tested Ellie with soldiers who had returned from Afghanistan. Ellie found more evidence of PTSD in the soldiers than the Post-Deployment Health Assessment given by the military. This opens a door for soldiers suffering from post-traumatic stress disorder, which affects an estimated 20% of returning veterans. Ellie's help could have a tremendous impact on the mental health of veterans and potentially decrease the staggering suicide rate among that population.

Although some obstacles remain to be tackled, the future of AI in mental health looks bright. With more research and safeguards, we will be able to make mental health care more accessible and affordable for everyone.
