Healthcare Privacy Invasion Promoted by the Federal & State Governments – Part VI: Preventing Patients from Controlling Access to their Personal Health Information – Addendum added: 02/27/2022

Title II of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) authorized the Department of Health and Human Services (HHS) to create regulations for “Administrative Simplification,” one of which is the Privacy Rule. The HIPAA Privacy Rule has been inadequately protecting personal health information since compliance became mandatory on April 14, 2003, and it has failed to an even greater extent since the widespread adoption of electronic health records and the storage and analysis of Big Data.

There are at least two glaring problems. The first is that almost any inspection and use of health data can be justified as “Healthcare Operations,” resulting in the sort of exposure that was publicized when Google employees raised concerns about the patient data obtained from Ascension. https://www.politico.com/newsletters/morning-ehealth/2019/11/13/googles-partnership-alarms-patients-privacy-advocates-hhs-782279 The second is that the provisions for de-identification of personal health information were known to be inadequate even before they became part of the HIPAA regulations. Dr. Latanya Sweeney is a Harvard professor who focuses on privacy and has testified before Congress multiple times. In a 2003 publication, https://dataprivacylab.org/dataprivacy/projects/kanonymity/paper4.pdf , she described what is necessary to de-identify data while keeping it useful for research and other purposes. The HIPAA requirements do not meet this standard, yet once they are met, medical data is legally de-identified and can be redistributed. The HIPAA standard is described on this government web site: https://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html . It was inadequate in 2003, and it is even more so now in this era of Big Data, when many more datasets are available that can be mined and compared to allow re-identification.
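To make Sweeney’s point concrete, her standard is k-anonymity: a dataset is k-anonymous if every combination of quasi-identifier values (ZIP prefix, birth year, sex, and the like) is shared by at least k records. Below is a minimal sketch, using made-up records and field names, of how one might check k for a file that already satisfies HIPAA’s Safe Harbor rules, which allow a 3-digit ZIP code, year of birth, and sex to remain. Nothing in Safe Harbor prevents k from being 1, that is, a uniquely identifiable record.

    from collections import Counter

    # Hypothetical records already "de-identified" under HIPAA Safe Harbor:
    # a 3-digit ZIP prefix, year of birth, and sex may all be retained.
    records = [
        {"zip3": "021", "birth_year": 1948, "sex": "F", "diagnosis": "hypertension"},
        {"zip3": "021", "birth_year": 1948, "sex": "F", "diagnosis": "asthma"},
        {"zip3": "946", "birth_year": 1970, "sex": "M", "diagnosis": "hepatitis C"},
    ]

    QUASI_IDENTIFIERS = ("zip3", "birth_year", "sex")

    def k_anonymity(rows, quasi_ids=QUASI_IDENTIFIERS):
        """Return the dataset's k: the size of its smallest equivalence class."""
        classes = Counter(tuple(row[q] for q in quasi_ids) for row in rows)
        return min(classes.values())

    print(k_anonymity(records))  # prints 1: the third record is unique

Sweeney’s paper argues for enforcing a minimum k before data is released; the Safe Harbor checklist removes specific fields but imposes no such floor.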

It is interesting that HHS makes claims data available to the public, and that the same page that describes how to download the data states that it is illegal to attempt to re-identify it. That tells me that HHS knows very well that the data can be re-identified, and yet it posts the data anyway. https://HCUPNET.ahrq.gov/#setup There are multiple warnings not to attempt to re-identify the data, such as the following; the sketch after the list shows why the third item’s small-cell rule exists.

I will make no attempts to identify individuals, including by the use of vulnerability analysis or penetration testing. In addition, methods that could be used to identify individuals directly or indirectly shall not be disclosed, released, or published.
I will make no attempts to identify establishments directly or by inference.
I will not use deliberate technical analysis to discover or release information on small numbers of observations ≤10.
I will not attempt to link this information with individually identifiable records from any other source.
I will not attempt to use this information to contact any persons or establishments in the data for any purpose.
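The third pledge above, about small numbers of observations, hints at why even aggregate tables leak: small counts make it far easier to work out exactly who is being counted. The usual safeguard, sketched below with hypothetical counts and a threshold matching the agreement’s wording, is to suppress small cells before release.

    THRESHOLD = 10  # the agreement's "small numbers of observations <= 10"

    # Hypothetical aggregate table: (county, diagnosis) -> patient count.
    counts = {
        ("Adams", "diabetes"): 412,
        ("Adams", "rare condition"): 3,   # a small cell: risky to publish
        ("Baker", "diabetes"): 958,
    }

    def suppress_small_cells(table, threshold=THRESHOLD):
        """Replace any count at or below the threshold with a suppression marker."""
        return {cell: (n if n > threshold else "suppressed")
                for cell, n in table.items()}

    print(suppress_small_cells(counts))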

Note that this link, which describes CMS Limited Data Sets, includes the following paragraph about the data CMS makes available. https://www.cms.gov/Research-Statistics-Data-and-Systems/Files-for-Order/LimitedDataSets

“The Centers for Medicare & Medicaid Services (CMS) is responsible for administering the Medicare, Medicaid and State Children’s Health Insurance Programs, as well as a number of health oversight programs. CMS gathers and formats data to support the agency’s operations. Information about Medicare beneficiaries, Medicare claims, Medicare providers, clinical data, and Medicaid eligibility and claims are included. These data are made available to the public, subject to privacy release approvals and the availability of computing resources.”

Datasets that can be used to re-identify data are readily available. This article, https://www.vice.com/en_us/article/dygy8k/researchers-find-anonymized-data-is-even-less-anonymous-than-we-thought , reports on findings by Harvard researchers. Adam Tanner wrote the following article discussing how HIPAA fails to adequately de-identify data: https://tcf.org/content/report/strengthening-protection-patient-medical-data/ . “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization,” found here, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1450006 , is a UCLA Law Review article by Paul Ohm of Georgetown Law.
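The attacks these articles describe mostly reduce to a simple join: take an “anonymized” medical file and an identified public dataset (a voter roll, property records, a breach dump) that share quasi-identifiers, and match rows. A minimal sketch with entirely invented data:

    # "De-identified" medical records: names removed, quasi-identifiers kept.
    medical = [
        {"zip3": "021", "birth_year": 1948, "sex": "F", "diagnosis": "hypertension"},
        {"zip3": "946", "birth_year": 1970, "sex": "M", "diagnosis": "hepatitis C"},
    ]

    # Identified public records (e.g., a voter roll) sharing the same fields.
    public = [
        {"name": "Jane Doe", "zip3": "021", "birth_year": 1948, "sex": "F"},
        {"name": "John Roe", "zip3": "946", "birth_year": 1970, "sex": "M"},
    ]

    QUASI_IDENTIFIERS = ("zip3", "birth_year", "sex")

    def link(medical_rows, public_rows, keys=QUASI_IDENTIFIERS):
        """Yield (name, diagnosis) for medical rows matching exactly one public row."""
        for m in medical_rows:
            matches = [p for p in public_rows
                       if all(p[k] == m[k] for k in keys)]
            if len(matches) == 1:  # a unique match ties the record to a person
                yield matches[0]["name"], m["diagnosis"]

    for name, diagnosis in link(medical, public):
        print(name, "->", diagnosis)

This is the same structure as Sweeney’s well-known demonstration of re-identifying a Massachusetts governor’s hospital record by linking it to a voter list; the only thing that has changed since then is that far more linkable datasets now exist.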

I also have concerns about the broad sharing of data placed into databases without patients’ permission, whether or not the data is slated to be shared. Any time a database is created, it is likely to be hacked, especially if it is exposed to the Internet. If the NSA can’t keep its data secure, how can we expect other datasets to be secure? Patients should be allowed to determine whether their data is exposed to the Internet or put into registries or other databases before it is done. If they want the convenience of Internet access, then they need to be informed of the risks and accept them. This helps protect providers from being sued when there is a data breach and improves patients’ chances of keeping their data private.

Patients and their physicians are largely unaware of all of the data sharing being done by government and by private entities. In addition, much of this data sharing is coerced by regulations and not the choice of medical providers, whether or not those providers know it is happening. Prior posts on this site describe how some of that is happening.

Unlike when you visit a lawyer, there does not seem to be any meaningful expectation of privacy when a patient visits a health care provider. Many seem to feel that it is too late to protect privacy now because the “cat is out of the bag.” I feel that we need to push for more protections for ourselves and for future generations. Right now, government and businesses control most decisions about our privacy.

Consumer Reports published an article about the data GoodRx shares when you search for coupons and reduced drug prices, and Deven McGraw, who was previously responsible for protecting health information privacy at HHS, confirmed that HIPAA offers no protection. https://www.consumerreports.org/health-privacy/goodrx-shares-users-health-data-with-google-facebook-others/

“If people think that HIPAA protects health data, then they probably believe that any health data in any context is going to be protected. That’s just not the case,” said Deven McGraw, chief regulatory officer at consumer health tech company Citizen and former deputy director of health information privacy at the U.S. Department of Health & Human Services’ Office of Civil Rights. 

“However, HIPAA doesn’t apply to GoodRx or many other “direct-to-consumer” websites and apps that provide health and pharmaceutical information. It doesn’t apply to heart-rate data generated by a sports watch or Fitbit, information you enter into period-tracking apps, or running data held by running and cycling apps such as Strava. As far as the law goes, such information has no more protection than your Instagram likes.”

Addendum 02/27/2022: The Department of Veterans Affairs has chosen to replace its electronic health record (EHR), VistA (aka CPRS), and its 40 years of patient data with Cerner under a different name. The DOD is using Cerner as MHS Genesis, and under the Trump administration the VA was ordered to do the same to facilitate lifelong health records for active-duty personnel and veterans. Now Oracle is set to acquire Cerner for $28 billion, and it is assumed that Oracle is most interested in the data it will purchase access to, which will fuel its AI programs. This concern has been articulated here, and similar articles referred to in other portions of this site describe how re-identification is already possible with relatively little difficulty given the lax de-identification rules under HIPAA. https://www.healthdatamanagement.com/articles/oracles-pending-purchase-of-cerner-raises-privacy-concerns Of course, patients whose data is stored in Cerner have no say about whether Oracle will be able to use their information for training AI, to be “de-identified” and sold, etc.