New Unregulated Technology: The Science of Mental Health Apps

Whether you are standing in an elevator or sitting at a table, there is a good chance that one of the people next to you is experiencing psychological difficulties. Maybe it's you. Last year, an estimated 47 million Americans suffered from mental illness: almost one in five.

In response, mobile apps designed to increase the psychological well-being of users have also proliferated in recent years. Some are generic wellness apps that motivate people to meditate or do yoga, while others provide targeted treatments for specific mental illnesses such as post-traumatic stress disorder or bipolar disorder. Each of these technologies has the potential to reach people who otherwise would not have access to mental health care.


At the start of the pandemic, mental health professionals struggled to meet the growing demand for their services. A survey of adults who had received such services found that 17.7 million Americans experienced delayed or canceled appointments in 2020. Although demand has declined slightly since then, access to services remains a significant issue: last year, more than 26 million Americans suffered from a mental illness that went untreated.

While traditional therapists must go through a licensing process, there is no equivalent screening process for mental health applications. “It’s the Wild West out there. The ground is fertile for all sorts of players to play in the sandbox,” says Smisha Agarwal, assistant professor of digital health at the Johns Hopkins Bloomberg School of Public Health.

In May, Agarwal and her colleagues published an assessment framework for mental health apps. It is one of the few systems proposed to distinguish the good from the bad. But for now, users will have to decide for themselves.

Dubious criteria

The most popular mental health apps, like Calm or Moodfit, target a wide audience; they are designed to help anyone who is stressed, anxious or depressed. The approach combines wellness exercises with gamification: in-app goals and rewards motivate users to deal with negative emotions through healthy outlets.

Agarwal says apps like these pose few direct risks to users. Indeed, the behaviors they promote are healthy for most people, regardless of their mental state. Keep in mind, however, that some apps may not be effective at what they set out to do. “Many are lacking in terms of user interface and general usability,” she says. “And most don’t use established behavior change modalities or evidence-based treatment protocols.”

Although apps are questionable therapeutic methods for people struggling with mental illness, studies have shown that some can have a positive impact on the general population. A 2018 study found that using the meditation app Headspace reduced stress and irritability among a random sample of healthy adults.

Unfortunately, many wellness apps have a data security issue. A May report by the software developer Mozilla studied 32 popular mental health apps and ultimately designated 28 as "privacy not included." Some of these apps simply had weak security measures, while others included clauses in their privacy policies that allowed them to sell user data to third parties.

“You are dealing with a population struggling with mental health problems. The privacy and security statements are barely comprehensible even to someone operating at full mental capacity,” says Agarwal. At best, user data could be used to create targeted advertisements on other websites. At worst, a security breach could give hackers access to personal health and financial information.

A balancing act

While apps like Calm and Headspace are aimed at low-risk populations, many apps have been developed as potential therapeutic tools for high-risk populations: people with schizophrenia, bipolar disorder, or PTSD. So far, however, few of these designs have passed clinical trials. Those that do often struggle to gain traction.

"I think there are two main types of apps," says David Bakker, clinical psychologist and founder of the app MoodMission. "One is a research-driven app that is developed quite extensively by academics. Then they have no idea how to run the business after the grant money runs out." The second type, he says, is profit-driven and collects user data like all other apps.

When Bakker founded MoodMission in 2015, he hoped to avoid some of the pitfalls of other mental health apps by running the company on a not-for-profit model. The app aims to relieve symptoms of depression and anxiety by suggesting a combination of cognitive behavioral therapy and general wellness exercises to users. In 2019, Bakker and his colleagues conducted a randomized controlled trial that showed the app successfully helped depressed subjects develop effective coping mechanisms. And unlike other research-backed apps, MoodMission has been downloaded over 100,000 times on Android and Apple devices.

While MoodMission’s combination of rigorous research and popularity is rare among mental health apps today, it’s proof that an organization with the right mission can develop something that’s both effective and accessible.

Future frameworks

Now the crux of the matter is how to educate consumers on what to look for. “You can regulate providers, but you can’t regulate patients,” says Agarwal.

Ultimately, she hopes that an established framework for evaluating mental health apps will “give consumers and clinical providers insight.” While app seekers currently have to wade through blogs and user reviews to make a decision, a seal of approval from a certification body could one day tell us which apps are safe and effective. It’s the same model that allows shoppers to select organic or fair trade products at the grocery store.

In the meantime, innovators will continue to evolve the technology that powers these apps. Bakker envisions a future app that uses artificial intelligence to help clinicians select therapeutic interventions for mental health patients. It's a vision shared by tech companies like Limbic.

"That way we can do the work of connecting with someone interpersonally, and at the end of a session I can go to my tablet and see that there's an 86 percent chance that this approach will work well for this person," says Bakker. "As a psychologist, I look forward to a future where there can be a hybrid psychological treatment model between an AI and a human."
