Apps for mental health of all kinds are a rapidly growing phenomenon: you can hardly browse an online store without seeing a list of new apps for meditation, online therapy, diet help, insomnia, mood charting, and many other purposes.
Now, researchers are taking a deeper look at the data these apps collect on users, and at how well or poorly that data is protected. Providers run the gamut from those who rigorously protect users’ privacy to those who exploit loopholes and gray areas in existing regulation to harvest and monetize the data of even the most vulnerable users.
Last year, in a landmark ruling, the Federal Trade Commission banned online counseling service BetterHelp, Inc. from giving user data to third parties, and ordered it to pay more than $7 million in damages to people who signed up for online counseling and had their personal data sold or monetized by the company or by third parties such as Facebook. All indications are that other companies are offering mental health apps without privacy safeguards, or using deceptive practices to claim that data will be kept private when they are actually selling it to third parties.
It can be very difficult to tell whether a company is adhering to privacy protection standards or not. Part of the reason for the fine the FTC ordered against BetterHelp was that the website deceptively declared that users’ data would “never be sold,” so a statement like that attached to an app you are using may not be adequate assurance of the company’s safeguards.
Mozilla Foundation, an international internet watchdog, has been researching mental health and other apps for some years now, and has some worrying findings. For one thing, its researchers reported in 2022 that mental health apps were “worse than any other product category” when it comes to privacy and security. That year, 29 of the 32 apps reviewed received a “privacy not included” warning label from Mozilla. In an encouraging trend, however, Mozilla reports that in 2023 nearly one-third of the apps improved on their 2022 performance. Still, in 2023, 19 of the 32 apps reviewed had inadequate privacy protections.
In addition to consulting media and non-profit watchdogs like Mozilla, MoodSurfing looks at randomized controlled trials and other academic studies of available apps before recommending them. See the footnotes in our articles about Smartphone Apps and Mindfulness Apps for further information.
Mental health apps and sites should be more, not less, responsible about the use and protection of users’ data, since many of their customers are sharing their private struggles and needs and may be especially vulnerable to exploitation and abuse. We call on public regulatory bodies to strengthen and tighten their oversight and standards.