What Obamacare teaches us about privacy

Here’s an interesting consequence of the Affordable Care Act’s ban on discrimination based on pre-existing conditions — it has made some patients more comfortable with sharing otherwise sensitive medical information. From the New York Times:

Reluctance to share personal health information is declining, and not only for cultural reasons. A prevalent fear, said Ben Heywood, co-founder and president of PatientsLikeMe, an online discussion and research forum, has been that if a person disclosed an illness, it might be more difficult to obtain health insurance. But the Affordable Care Act bans discrimination by insurers based on so-called pre-existing conditions. “The biggest deterrent has been removed by law,” Mr. Heywood said.

There are, of course, many good reasons patients may want to keep medical information private beyond protecting against insurer discrimination. And medical information can still be used in a variety of ways to discriminate against patients, by insurers or non-insurers, without running afoul of the ACA. Nevertheless, we should not be surprised that the ACA’s non-discrimination provisions have changed patients’ privacy attitudes.

There is an interesting theoretical question in all this: what do non-discrimination rules have to do with privacy? Isn’t privacy about keeping things secret? This question, a version of which privacy scholars have been debating for decades, has both theoretical and practical significance. If privacy is mainly about secrecy, then perhaps we should take steps to limit the reach of privacy rules — and adjust our discourse on privacy — to exclude situations where the real concern is something other than privacy (in this case, discrimination).

The best answer given by privacy scholars is that privacy is in fact about much more than secrecy. This is the only way we can make sense of the types of privacy-driven value judgments we make every day. We object on privacy grounds when Facebook changes its terms of service not because Facebook revealed our secrets, but because we feel we’ve lost control over information we shared with a specific group of people in a specific setting. We object on privacy grounds when a business takes information we provided for one purpose and uses it for an entirely different purpose, because we understand that privacy can be about how information is used in addition to how it’s learned. And so on.

As scholars like Dan Solove and Helen Nissenbaum (and others) have argued, privacy is best defined from the bottom up, as a series of related but distinct problems that come into focus against the backdrop of specific contextual norms regarding information collection, sharing, and processing. That’s a mouthful, but in a nutshell it means privacy rights and privacy harms are context-specific: in some cases privacy rights might require keeping information secret; in others they might require placing restrictions on the misuse of information, or something entirely different. It also means that privacy problems are often very hard to diagnose and address, and are rarely just about privacy qua privacy.

This theoretical backdrop explains why removing insurers’ ability to discriminate based on medical information can be a valid (though partial) answer to the privacy problems posed by the disclosure of medical records, just as accountability mechanisms (like the requirement that police obtain a search warrant from a neutral magistrate) can be valid answers to the privacy problems posed by government surveillance. And it explains why addressing one set of privacy problems, pertaining to the risk of discrimination by insurers, doesn’t necessarily extinguish the other privacy risks associated with the sharing of medical data. Per the Times:

But concerns remain, especially about privacy, given the sensitive nature of medical information. One of the research projects Mr. Keating plans to share his data with is the Personal Genome Project at Harvard. The file of a person’s DNA fingerprint need not bear a name to identify its owner… in principle someone with scientific skills could use a person’s genome to infer paternity, generate statistical evidence that might hurt a person’s chances of getting a job, insurance or loans, and make synthetic DNA to plant at a crime scene.