A disturbing reality of the pandemic is that the population has suffered an increase in anxiety and other mental health issues. This has, in turn, led to an increase in both the use and availability of apps designed to support those experiencing mental health decline, or to promote the importance of mental wellbeing. A recent report has brought to the forefront privacy concerns with healthcare companies providing such apps.

Privacy concerns with healthcare companies

The report by the Mozilla Foundation found that many of the apps process sensitive personal data relating to individuals in order to provide their support or services. Its findings suggest that several big players in the mental health and prayer app sphere deserve the label *Privacy Not Included – a warning label that no technology company board desires.

What does *Privacy Not Included mean?

The Mozilla Foundation aims to “empower consumers to demand better online privacy, trustworthy artificial intelligence, and safe online experiences from Big Tech and governments”. Its privacy researchers created the *Privacy Not Included consumer guide, together with a helpful product-search function available on its website. Mozilla sets a strict test for technologies, rating how private and secure they are, and applies a *Privacy Not Included warning label where it determines there to be privacy and security concerns. The standards and methodology are explained further here.

*Privacy Not Included in mental health 

In its latest report, the researchers focussed on mental health apps and prayer apps, including some big players in the market. Of the 32 apps reviewed, 26¹ were given the label – you can find out which ones here.

What’s the worst that could happen?

The report found that one very popular mindfulness app could “get to know all about your meditation practices, your mood, your gender, your location, and more. Then use or share that information to target you with ads about things, like a wine company thinking you’re ripe for targeting with wine ads when you’re using the app a lot because that might mean you’re stressed out. But you’re a recovering alcoholic and the wine ads add to your stress. It’s just one potential scenario of how targeted ads based on your behaviour and identifiers on the app could go wrong.” 

One suicide prevention app seemingly had the ability under its privacy policy to “share your data with just about anyone if they wanted, as long as they aren’t a stranger”, and a prayer app company was found not to take steps to secure personal data such as names, church-donation information, photos, and users’ contact lists – even after it was made aware of the problem – resulting in a leak.

In summary – a lot could go wrong given the nature of the services these apps provide. Healthcare is a high-risk area, often involving some of the most sensitive data about a person.

What should healthcare companies learn from this?

There can often be a battle for resources between commercialisation and privacy. However, this report, coupled with the recent decision in Ireland against WhatsApp, highlights the importance of having clear and compliant privacy notices and practices, accompanied by a privacy-by-design approach. Ultimately, a loss of public trust may impact sales, and a loss or misuse of data may result in claims and fines – it is in your business interest to put privacy first.

Finally, healthtech businesses should keep an eye out for developments around the UK Government consultation on a voluntary code of practice for all app store operators and developers, which remains open until 29 June this year.

If you wish to speak to a member of the team about your data protection and privacy practices, or any other technology query, contact our IP, IT, and Data Protection Team on 01392 210700 or by email at dataprotection@stephens-scown.co.uk.


¹ Correct at the time of writing.