Mental health app privacy language opens up holes for user data

In the world of mental health apps, privacy scandals have become almost routine. Every few months, reporting or research uncovers questionable data-sharing practices at apps like Crisis Text Line, Talkspace, BetterHelp, and others: people handed over their information in the hope of feeling better, and then it turned out their data was used in ways that help companies make money (and don't help them).

It feels like a twisted game of whack-a-mole. When they come under scrutiny, apps often change or improve their policies – and then new apps or new problems pop up. It isn't just me: Mozilla researchers said this week that mental health apps have some of the worst privacy protections of any app category.

Watching the cycle repeat over the past few years got me interested in how, exactly, this keeps happening. The terms of service and privacy policies on the apps are supposed to govern what companies are allowed to do with user data. But most people barely read them before signing (that is, hitting accept), and even when people do read them, they're often so complex that it's hard to tell their implications at a quick glance.

"That makes the consumer completely unaware of what it even means to say yes," said David Grande, an associate professor of medicine at the University of Pennsylvania School of Medicine who studies digital health privacy.

So what does it mean to say yes? I looked at the fine print of a few to get an idea of what's happening under the hood. "Mental health app" is a broad category, and it can cover everything from peer-to-peer counseling hotlines to one-on-one connections with actual therapists. The rules, protections, and regulations vary across those categories. But I found two features shared by many of the different policies that made me wonder what the point of having a policy was in the first place.

We can change this policy at any time

Even if you read a privacy policy carefully before signing up for a digital mental health program, and even if you feel genuinely comfortable with that policy – the company can go back and change that policy whenever it wants. It might tell you – and it might not.

Jessica Roberts, director of the Health Law and Policy Institute at the University of Houston, and Jim Hawkins, a law professor at the University of Houston, pointed out the problems with this type of language in a 2020 op-ed in the journal Science. Someone might sign up expecting that a mental health app will keep their data private, and then the policy could be revised to open their data up to broader uses than they're comfortable with. Unless they go back and check the policy, they wouldn't know.

One of the apps I looked at, Happify, specifically says in its policy that users will be able to choose whether they want new uses of data in any updated policy to apply to their information. They can opt out if they don't want to be pulled into the new policy. BetterHelp, on the other hand, says the only recourse if someone doesn't like the new policy is to stop using the platform entirely.

Having this kind of flexibility in privacy policies is by design. The type of data these apps collect is valuable, and companies likely want to be able to take advantage of any opportunities that come up for new ways to use that data in the future. "There are a lot of benefits to keeping these things flexible from a company perspective," Grande said. "It's hard to predict a year or two, or five years, in the future what other novel uses you might imagine for this data."

If we sell the company, we also sell your data

Being comfortable with all the ways a company is using your data at the moment you sign up for the service also doesn't guarantee that someone else won't be in charge of that company in the future. All of the privacy policies I looked at included specific language saying that if the app is bought, sold, merged with another group, or goes through some other business-y thing, the data goes with it.

The policy, then, only applies right now. It may not apply in the future, after you've already been using the service and handing over your mental health information. "So, you could argue they're completely useless," said John Torous, a digital health researcher in the department of psychiatry at Beth Israel Deaconess Medical Center.

And the data could be exactly why one company buys another in the first place. The information people give to mental health apps is highly personal and therefore highly valuable – arguably more so than other kinds of health data. Advertisers might want to target people with specific mental health needs with other types of products or treatments. Chat transcripts from a therapy session can reveal how people feel and how they respond to different situations, which could be useful to groups building artificial intelligence programs.

"I think that's why we've seen more and more cases in the behavioral health space – that's where the most valuable and easily harvested data is," Torous said.


I asked Happify, Cerebral, BetterHelp, and 7 Cups about these specific bits of language in their policies. Only Happify and Cerebral responded. Spokespeople from both companies described the language as "standard" for the industry. In both cases, users are expected to review any changes to the policy and keep logging in, Happify spokesperson Erin Bocherer said in an email.

Cerebral's policy on data sales benefits clients because it lets them continue treatment if there is a change in ownership, spokesperson Anne Elorriaga said in an emailed statement. The language allowing the company to change its privacy terms at any time "enables us to let our customers know how we process their personal information," the statement said.

Now, these are just two small sections of mental health apps' privacy policies. They jumped out at me as specific bits of language that give companies wide latitude to make sweeping decisions about user data – but the rest of the policies often do the same thing. Many of these digital health tools don't involve health professionals treating patients directly, so they aren't bound by HIPAA rules around the protection and disclosure of health information. Even if they do decide to follow HIPAA guidelines, they still have broad freedom with user data: the rules allow groups to share personal health information as long as it's anonymized and stripped of identifying details.

These broad policies aren't just a feature of mental health apps. They're common in other types of health apps (and apps in general), too, and digital health companies often have enormous power over the information people share with them. But mental health information gets extra scrutiny because most people feel differently about this data than about other types of health information. One survey of US adults published in JAMA Network Open in January, for example, found that most people were far less willing to share digital information about depression than about cancer. The data can be incredibly sensitive – it includes details about people's personal experiences and vulnerable conversations that they may want kept in confidence.

Bringing health care (or any personal activity) online usually means some amount of data about it gets sucked up by the internet, Torous said. That's the usual trade-off, and expectations of total privacy on online platforms may be unrealistic. But, he said, it should be possible to find a better middle ground. "There is nothing online that is 100 percent private," he said. "But we know we can make things much more private than they are right now."

Still, making changes that would actually improve protections for people's mental health information is hard. Demand for mental health apps is high: their use skyrocketed during the COVID-19 pandemic, when more people were looking for treatment but there still wasn't enough accessible mental health care to go around. The data is valuable, and there's little real external pressure on companies to change their ways.

So the policies, which leave people open to losing control of their data, keep following the same patterns. And unless the next big media report calls attention to a specific case at a specific app, users may not realize how vulnerable they are. Left unchecked, Torous says, that cycle could erode trust in digital mental health overall. "Health care and mental health care is based on trust," he said. "I think if we continue down this path, we will start to lose the trust of patients and clinicians."