The age-old question about anything we receive is what we must give up in order to have it. Will data become the thing we give up so that technology in the health field can progress? With new ways to improve the lives of millions of people, often at their most vulnerable, personal data continues to be of major importance to health progression. At the same time, it raises the question: what will happen if business and ethics clash in the course of bettering healthcare?
Value and the Unfortunate Demand
The following examples show why data may be the valuable asset individuals have to put at risk in the gamble on healthcare's progression and convenience.
Earlier this July, the Dutch Data Protection Authority (DPA) issued the country's first-ever GDPR fine related to healthcare. The Hague's largest hospital, Haga Ziekenhuis, was fined 460,000 EUR for failing to secure the personal data of one of its patients. The Dutch DPA stated that at least two of the hospital's security measures were insufficient. Not only did the hospital fail to alert administrators that an unauthorized employee was looking into personal files, but it also failed to use two-factor authentication for access to the database itself.
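For illustration, the missing alerting could be as simple as checking every record access against an authorization list. The sketch below is purely hypothetical: the data model (employee and patient identifiers, care-team assignments) is invented for the example and is not how Haga Ziekenhuis actually stores its logs.

```python
# Hypothetical sketch of the alerting the Dutch DPA found lacking:
# flag any employee who opens a patient record without being assigned
# to that patient's care team.

def flag_unauthorized_access(access_log, care_teams):
    """Return log entries where the employee is not authorized.

    access_log: list of (employee_id, patient_id) tuples
    care_teams: dict mapping patient_id -> set of authorized employee_ids
    """
    return [
        (employee, patient)
        for employee, patient in access_log
        if employee not in care_teams.get(patient, set())
    ]

# Example: one legitimate access, one that should trigger an alert.
care_teams = {"patient-17": {"dr-a", "nurse-b"}}
log = [("dr-a", "patient-17"), ("clerk-z", "patient-17")]
print(flag_unauthorized_access(log, care_teams))
# The second entry is flagged: clerk-z is not on the care team.
```

In practice such checks run against real audit logs and feed an alerting pipeline, combined with two-factor authentication at the database layer, which is the second measure the DPA faulted.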
Fast forward to the early weeks of September, when yet another personal health data breach occurred, this time with a far more sinister intent. Earlier this week, numerous sources reported that various popular mental health support websites in Europe were selling personal user data to third parties for advertising purposes. Data such as the results of mental health check tests was reportedly passed on to third parties for ad-targeting purposes, according to the privacy rights advocacy group Privacy International. The group analysed the data habits of 136 popular mental health support web pages in France, the UK, and Germany. Sharing the results of the analysis, they write that "certain mental health websites treat the personal data of individuals as a commodity" and that they "fail to meet obligations under European data protection laws."
As far as data anonymity is concerned, our data might never be immune from third-party data miners. In fact, a few days ago we learned that several period-tracking apps, used by millions of women globally, are sharing sensitive data with the controversial tech giant Facebook. The data sharing was reported to happen via Facebook's Software Development Kit (SDK), through which developers gather user data for better-targeted Facebook advertisements.
Health Data: The Future
Health data continues to grow in demand, as tech breakthroughs promise individuals ever greater convenience. Only time can tell what technology will provide within the health field, but a few pointers indicate the near future. It is worth noting that in a survey conducted last year by Rock Health, 11% of respondents said they'd be willing to share health data with tech companies.
In 2018, Google announced that it would be "absorbing" DeepMind Health, the medical unit of the AI company it acquired four years earlier, bringing it even closer to Google. This meant that Google would be taking total control over the AI technology. DeepMind is a division of Alphabet Inc. responsible for creating general-purpose AI technology. One of the purposes it continues to serve is helping the UK's National Health Service (NHS) with AI-powered support for medical professionals. Various research projects have already been conducted within the framework of the partnership between Google and the NHS, such as using AI to spot eye disease in routine scans of patients. The privacy scare comes to light because Google has also come under scrutiny over data-gathering and ad-targeting allegations made by the privacy-focused web browser Brave. The implications of Google expanding its data collection to healthcare data might therefore have significant consequences for users.
Another game changer for the sector is Amazon. Earlier this year, Amazon announced that Alexa, the AI-powered voice assistant in its smart speakers, was adopting healthcare features. What this means is that Alexa's skills would allow users to ask the virtual assistant for help with booking a doctor's appointment or checking on the status of a prescription delivery, among many other options. Amazon later added that the program only allows selected covered business associates who comply with HIPAA (the US Health Insurance Portability and Accountability Act of 1996) to be involved with Alexa's skills. As access to health data becomes but a voice assistant command away, recent news about Alexa's data gathering raises concerns not only about how much data is being gathered, but about exactly what data it is.
What lies ahead?
Admittedly, one may argue that using a mobile health convenience, such as visiting a mental health support website, poses the same risks as any other website on the internet when it comes to handing over personal data. With that in mind, just how much are we able to trust the development of healthcare tech? Will worrying over privacy breaches become the norm when it comes to seeking help? And how will laws help ease our concerns?
The GDPR is the legislation to turn to when it comes to data protection. The Regulation requires organisations to implement real measures to protect our personal data. It is becoming increasingly apparent that attaining valid consent for processing health data involves more than simply asking permission for any and all processing. It requires organisations to make a clear distinction between consenting to a single medical check-up or examination and consenting to the gathering and processing of personal data in online medical reports. Before processing health data, organisations have to offer individuals real choice and control, in addition to complying with the other obligations the GDPR places on them.
Even though the GDPR came into effect only recently, efforts to enforce it are on the rise, and with good reason, as the examples and cases above suggest an ever-growing push from big tech to move in on the healthcare sector. The big question is: will these recent cases serve as an incentive for organisations to implement extra data protection measures, or will risking access to our health data simply be the cost we have to pay for health progression?