Your Secrets for Sale

Do existing policies sufficiently protect unsuspecting patients from being preyed upon by third parties for their health data?

by Michelle Tan Min Shuen

The first immortal cell line was discovered in 1951. Tucked away in Johns Hopkins Hospital, Maryland, cell biologist George Gey cultured cancerous cells from a tumour biopsied during Henrietta Lacks’s treatment for cervical cancer.

For years, Gey had been trying to cultivate human cells in vitro. All previous attempts at growing human cells in a laboratory had been futile, as the cell cultures would perish within a few generations. That is, until Henrietta’s tumour sample. The HeLa cells, as they were later named, were remarkably durable and prolific, dividing indefinitely in his culture plates and soon becoming the human cell line of choice for biomedical research.

Since their discovery, HeLa cells have been bought and sold for incalculable profit and have given rise to many medical breakthroughs, from the development of the polio vaccine to the study of AIDS and various cancers worldwide. Long missing from the sensationalised story of HeLa cells, however, was the woman who had unwittingly provided them. The fact that Henrietta herself neither consented to nor was aware of the procedure was obscured by the growing fame of the new discovery, and she and her family were never financially compensated for their contribution to science.

Henrietta Lacks’s story eerily prefigures a disturbing reality of the digital age: the economic exploitation of personal healthcare data by companies. While patients’ digital health data holds immense potential for scientific and clinical advances as well as for commercial gain, the patients whose data are bought and sold are usually unaware of what is being collected, and almost never receive compensation from those profiting from it.

Tech companies, among many others, are clamouring to get their hands on the goldmines of patient data residing in the electronic health records (EHRs) that healthcare organizations safeguard, and for good reason. These troves of patient data can help software developers train their systems to identify trends and achieve more efficient outcomes. For instance, using data about lung cancer risk from thousands of patients, an algorithm could help doctors decide which patients would benefit most from chest CT scans and allocate limited resources accordingly. “The value isn’t in the data itself,” said Pamela Spence, EY’s global health-science and wellness industry leader. “It’s in the power of the analytics and the clever insights that can be generated from AI algorithms.”

But the use of personal data to make health predictions is also raising concerns that patients’ privacy could be compromised if the information is misused, mishandled, or sold to untrustworthy entities. “This information, if collected for good, can be really helpful,” says Deven McGraw, chief regulatory officer for Ciitizen Corp., a start-up seeking to help consumers manage their medical records digitally. “If it is put to nefarious purposes, or breached, then [it] can have consequences.”

Nevertheless, the demand for all this data is rising fast. Valued at approximately USD 14.25 billion in 2017, the health data market is predicted to grow nearly fivefold, to USD 68.75 billion, in under seven years. As the benefits of obtaining data became more apparent, governments saw the need to implement and continually update laws to ensure that data would not be misused by the companies purchasing it. One such law is the Health Insurance Portability and Accountability Act (HIPAA), a federal law that provides baseline privacy and security standards for medical information. Passed in 1996, it is still actively enforced by the U.S. Department of Health and Human Services (HHS). As mandated by Title II of HIPAA, known as the Administrative Simplification (AS) provisions, official health records can only be sold after they have been stripped of all personal identifiers, in order to protect individually identifiable health information from uses that may unnecessarily compromise a person’s privacy.


Ease of Re-identification

Yet HIPAA’s efficacy as a policy comes into question when research shows that anonymised data can often be re-identified with minimal effort. In a 2018 study, researchers were able to re-identify 95 percent of individual adults in the National Health and Nutrition Examination Survey using machine learning techniques. Even though this evidently undermines promises of confidentiality, healthcare providers can still legally sell their data to companies as long as details such as name, address, and sixteen other identifiers are omitted. Its buyers and sharers, pharma giants and tech giants among many others, optimistically claim that these massive health data sets can fuel advances in precision medicine. For example, Google’s recent partnership with the Ascension hospital network has already allowed Google to access millions of American patient records and use that information to deliver more personalised medical treatment for the masses. Capitalising on the identifiability of the data as a tool for screening and risk identification could well improve healthcare delivery, but on the flip side, it also means that insurance companies can now identify individuals at higher risk of contracting diseases and raise their health premiums accordingly.


Presence of Regulatory Loopholes

Compounding the problem, not all health data is protected by privacy rules. With HIPAA, the problem lies not in a lack of enforcement but in the restrictive scope of the law, which applies only to covered entities and their business associates. Here, the term covered entities refers to healthcare providers, health plans, and healthcare clearinghouses, while business associates refers to entities that create, receive, maintain, or transmit protected health information (PHI) on behalf of a covered entity or of another business associate acting as a subcontractor. In essence, the confidentiality of a patient’s health information is protected by HIPAA only when it is recorded or used by these entities.

As healthcare continues along its digital trajectory, the limits of the HIPAA framework merit urgent legislative attention. That HIPAA regulates the actions of entities rather than the usage of PHI is worrying, because it effectively exempts a growing industry of health management tools that fall under neither “covered entities” nor “business associates”. Operating in this grey area of the law, these tools, which often take the form of mobile applications, are given free rein to use your data however they please.

As the famous Silicon Valley adage warns, “If the product is free, you are the product”. The fertility and pregnancy tracking mobile application Ovia uses its colourful, vibrant interface to entice young mothers to record intimate details such as their menstrual cycles, pregnancy symptoms, and even sexual activity in the application itself. Unbeknownst to many of these young mothers, their private information is not protected by HIPAA, as no HIPAA-defined covered entities are involved in collecting or managing the data. It is no wonder that many are oblivious to the fact that the company sells their data to third parties, as stated in the app’s terms of service, which grant Ovia the right to “utilize and exploit” their anonymous personal data for research, marketing purposes, or sale to third parties.

Another loophole lies in the fact that HIPAA only sets standards for safeguarding PHI. For a piece of health data to be considered PHI and regulated under HIPAA, it must be personally identifiable and be used or disclosed by a covered entity in the course of healthcare. These stringent criteria make it significantly easier for third parties to extract other forms of health data that are not technically PHI but can serve the same purposes. In fact, the rapid development and deployment of technologies capable of collecting and analysing passive data is generating a gargantuan market for such alternative data. Known as shadow health records, these are the data generated when one wears a fitness tracker, uses a smartphone health application, or does anything online that leaves a trackable digital trace. For example, the global positioning systems (GPS) and accelerometers in mobile phones can track their users’ whereabouts, while wearable IoT-connected devices such as fitness bands continually collect health data to provide detailed accounts of an individual’s physical activity and behaviour. Such data fall outside HIPAA because they are not collected or managed by “covered entities”, and can likewise be freely bought and sold.


The Need for Reform

For many of us, it seems ironic that the least invasive forms of digital healthcare technology are the ones most responsible for eroding our privacy. The march of IoT moves in only one direction, and as Internet penetration rates continue to rise, it becomes increasingly impossible to go about daily life without leaving a digital trace that can be unknowingly picked up by avaricious companies. In an era where data plays an omnipresent role in everyday life, its potential has grown beyond influencing trivial advertisement placements to shaping important decisions such as an employer’s hiring choices. Now, more than ever, data privacy is a topic of concern.

“Big data and AI are racing ahead in medicine,” says Nicholson Price, assistant professor of law at the University of Michigan. “And right now, law and policy are playing catch-up.” Indeed, it has become painfully obvious that the implementation of updated data privacy laws lags far behind the rapid pace of new technologies that continually find ways to exploit legislative loopholes. There is an urgent need to reform HIPAA’s now more than quarter-century-old, sector-specific regulatory approach into one that anchors protections to the data itself rather than to the entities handling it, similar to the European General Data Protection Regulation (GDPR). Ideally, consent procedures should also create as much transparency as possible regarding the types of data collected and the intent behind their collection.

Though a slow and arduous process, steps have already been taken towards more comprehensive legislation that will better safeguard patient data. On 1 January 2020, the California Consumer Privacy Act took effect, allowing residents to access the personal information collected about them digitally and to opt out of the commercialization of their personal data. Extensive efforts will be needed to understand how these protections will be interpreted, adopted, enforced, and replicated or expanded elsewhere in the US, or possibly even the world. Until then, we may just have to live with the reality of “being both the product and the customer that pays for the product,” as Knudsen, chair of Jefferson’s cancer center, tactfully puts it. Let this article serve as a reminder to be mindful of the information we make available online, lest our most intimate secrets be turned into a commodity for sale. [APBN]


  1. David Grande, M. (2020, July 09). Health Policy and Privacy Challenges Associated With Digital Technology. Retrieved from https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2768091
  2. Ethics of Informed Consent and the Legacy of Henrietta Lacks. (n.d.). Retrieved December 13, 2020, from https://rabinmartin.com/insight/ethics-of-informed-consent-and-the-legacy-of-henrietta-lacks/
  3. Farr, C. (2019, December 19). Hospital execs say they are getting flooded with requests for your health data. Retrieved from https://www.cnbc.com/2019/12/18/hospital-execs-say-theyre-flooded-with-requests-for-your-health-data.shtml
  4. Gavin, K. (2019, February 22). We’re constantly generating ‘shadow’ medical records. Retrieved from https://www.futurity.org/shadow-medical-records-privacy-1989902-2/
  5. Health Privacy: HIPAA Basics. (2015, February 01). Retrieved from https://privacyrights.org/consumer-guides/health-privacy-hipaa-basics
  6. O’Neill, L. (2019, December 12). Rethinking Patient Data Privacy In The Era Of Digital Health: Health Affairs Blog. Retrieved from https://www.healthaffairs.org/do/10.1377/hblog20191210.216658/full/
  7. Pasternack, A. (2019, March 26). What you don’t know about your health data will make you sick. Retrieved from https://www.fastcompany.com/90317471/what-you-dont-know-about-your-health-data-privacy-will-make-you-sick
  8. Sinhasane, S. (2019, May 23). What is PHI and What is Not PHI? Retrieved from https://mobisoftinfotech.com/resources/blog/what-is-phi-and-what-is-not-phi/
  9. Wells, R. (2019, April 12). Your Pregnancy App May Be Selling Your Data-to Your Boss. Retrieved from https://www.glamour.com/story/your-pregnancy-app-may-be-selling-your-datato-your-boss