How to navigate mental health apps that may share your data

If you’ve ever assumed that information shared on a mental health app is confidential, you’re in good company: many people assume that sensitive medical information is always protected. However, this is not true, and it is important to understand why.

Many of us are familiar with, or active users of, some type of digital health app. Whether it’s nutrition, fitness, sleep tracking, or mindfulness, the arena of apps that help us track aspects of our health has never been bigger. Likewise, platforms that connect us with healthcare providers for virtual care have become more accessible, and often necessary, during the pandemic. Online therapy in particular has grown over the years and became a critical resource for many people during quarantine and remote living.

Making health resources and care more accessible to people is vital, and the appeal of reaching those resources right from your phone is obvious.

However, among the many severe consequences of the overturning of Roe v. Wade are a number of digital privacy concerns. There has been significant recent focus on period- and fertility-tracking apps, as well as on location information, and rightly so. On July 8, the House Oversight Committee sent letters to data brokers and healthcare companies “requesting information and documents regarding the collection and sale of personal reproductive health data.”

What is less discussed is the large gap in legal protection for all types of medical information that is shared through digital platforms, all of which should be subject to regulation and better oversight.

The US Department of Health and Human Services (HHS) recently released updated guidance on mobile phones, health information, and HIPAA, confirming that the HIPAA privacy rule does not apply to most healthcare applications because they are not “covered entities” under the law. The Health Insurance Portability and Accountability Act (HIPAA) is a federal law that creates a privacy rule for our “medical records” and “other individually identifiable health information” during the flow of certain health care transactions. Most applications that are individually selected by the user are not covered; only platforms that are specifically used by or developed for traditional healthcare providers are (e.g., a clinic’s digital patient portal where providers send you messages or test results).

Mental health apps are a case in point. They, like other digital health applications, are generally not bound by the privacy laws that apply to traditional health care providers. This is particularly concerning because people often seek out mental health platforms specifically to discuss difficult or traumatic experiences with sensitive implications. HIPAA, and state laws for that matter, would need to be amended to specifically include digital application-based platforms as covered entities. For example, California currently has a bill pending that would bring mental health apps under its state medical privacy law.

It’s important to note that even HIPAA has exceptions for law enforcement, so bringing these apps under HIPAA’s scope still would not prevent government requests for that data. It would be more useful in regulating the information that is shared with data brokers and with companies like Facebook and Google.

An example of information that is shared is what is collected during the “intake questionnaire” that must be completed on popular services such as Talkspace and BetterHelp in order to be matched with a provider. The questions cover highly sensitive information: gender identity, age, sexual orientation, mental health history (including details such as when or whether you’ve had suicidal thoughts, panic attacks, or phobias), sleep habits, medications, current symptoms, etc. Jezebel found that BetterHelp shared all of these responses with an analytics company, along with the user’s approximate location and device.

Another type is “metadata” (i.e., data about the data) about your use of the app, and Consumer Reports found that this can include the very fact that you’re a user of a mental health app. Other shared information may include how long you spend in the app, how long your sessions with your therapist last, how long you message in the app, when you log in, when you message or talk to your therapist, your approximate location, how often you open the app, and so on. Data brokers, Facebook, and Google were found to be among the recipients of this information from Talkspace and BetterHelp. Apps regularly justify sharing information about users if that data is “anonymized,” but anonymized data can easily be linked back to you when combined with other information.
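To see why “anonymized” data offers weak protection, here is a minimal sketch of a so-called linkage attack: a record stripped of names can still be matched to a person by joining it with other data on shared quasi-identifiers, such as approximate location and device model. All of the records, names, and field names below are invented for illustration; real broker datasets are far larger, which makes unique matches more likely, not less.

```python
# Hypothetical "anonymized" usage records a mental health app might share:
# no names, just coarse location (3-digit ZIP prefix) and device model.
anonymized_usage = [
    {"zip3": "941", "device": "Pixel 7", "app": "therapy-app", "sessions": 12},
    {"zip3": "100", "device": "iPhone 14", "app": "therapy-app", "sessions": 3},
]

# Hypothetical auxiliary data an ad network or broker already holds,
# tied to named individuals, containing the same quasi-identifiers.
broker_profiles = [
    {"name": "A. Smith", "zip3": "941", "device": "Pixel 7"},
    {"name": "B. Jones", "zip3": "606", "device": "iPhone 14"},
]

def reidentify(usage, profiles):
    """Link 'anonymized' rows to named profiles via shared quasi-identifiers."""
    matches = []
    for row in usage:
        candidates = [p for p in profiles
                      if p["zip3"] == row["zip3"] and p["device"] == row["device"]]
        if len(candidates) == 1:  # a unique match defeats the anonymization
            matches.append((candidates[0]["name"], row["app"]))
    return matches

print(reidentify(anonymized_usage, broker_profiles))
# A. Smith is uniquely matched and revealed as a therapy-app user.
```

The point is that no single shared field identifies you, but the combination often does, which is why "we only share anonymized data" is a weaker promise than it sounds.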

Along with collecting and sharing this data, health apps’ data retention is incredibly opaque. Several of these apps have no clear rules about how long they keep your data, and no rule requires them to. HIPAA does not create any record retention requirements; those are governed by state laws, which are unlikely to treat healthcare applications as covered practitioners. For example, New York State requires licensed mental health practitioners to maintain records for at least six years, but the application itself is neither practicing nor licensed. Asking to delete your account or data may not remove everything either, and there’s no way to know what’s left. It is unclear how long the sensitive information these apps collect and store about you may remain available, including to law enforcement at some future point.

Accordingly, here are a few things to keep in mind when navigating health apps that may share your data:

The accessibility of care created by these types of apps is critical, and everyone should seek the care they need, including through these platforms if they are the best option (and for many people they are). The important thing is to be as informed as possible when using them and to take the steps available to you to maximize your privacy.
