I recently wrote about this topic in the context of Smart Grid security, and Gib Sorebo of SAIC followed up on his blog with his opinion. Gib and I have had some long and great discussions about privacy in a world of security vulnerabilities, and the one area that gets us both preaching is privacy as it relates to health care. Simply put, this is a good time to shift the security discussion to something that really matters, and I am sorry to say that privacy needs to leave the room for a while.
Okay, I am sure every privacy evangelist in the world is going to want to send me that nasty fruitcake (or worse) they have been holding onto for the last 20 years after reading that last statement, but please hear me out before you head over to the post office. Privacy IS important and DOES MATTER to me and probably to every security professional in this world. I am a strong supporter of privacy and consistently do all I can to protect my own. I refuse to give my address and phone number to stores that ask for them when I pay with cash, or that ask for that information for any other reason whatsoever. I refuse to share ANY information with ANY entity that requests it and that I deem is not on a need-to-know list, and I have held up lines in stores, banks, and other places (sorry to all of you who stood behind me) defending my right to my own information. Privacy is indeed very important in the digital world we are now completely enveloped in.
...but it has got to stop being a part of health care security discussions, or we are probably going to end up with a lot of dead people as a result.
In fact, we are already ending up with seriously harmed patients in the age of digital health care. I read an article on The Huffington Post this morning titled "Electronic Medical Record Shift: Signs Of Harm Emerge As Doctors Move From Paper" which pointed out how bad information or failures in software have led to patient trauma (heart attacks, seizures). The article did not attribute these failures to security issues, yet they illustrate what I have been talking about for years: if a system is vulnerable to penetration and compromise by an attacker, the attacker can cause far more harm to a patient than any privacy breach would.
Let me paint a specific scenario based on the Huffington Post article. The first sentence of the article describes hospital workers misreading dosage information and dispensing ten times the normal dose of a medication, leading to a patient's heart attack. Under HIPAA/HITECH, if an attacker were to break into a system and alter the dosage in a single patient record (perhaps that of a political figure) to deliberately cause a heart attack or death, the health care organization would be in violation of a privacy law, but it could not be held liable under that law for the patient's death caused by the failure in data integrity. In my opinion (and the opinion of others I have spoken to about this issue), there is something very wrong with that.
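To make the data integrity point concrete, here is a minimal sketch of what a tamper-evident check on a critical field could look like. This is purely illustrative: the names (seal_record, verify_record), the made-up drug, and the keyed-hash approach are my own assumptions for the example, not anything prescribed by HIPAA/HITECH or used by any particular EHR vendor.

import hashlib
import hmac

# Key held somewhere the EHR database itself cannot reach (an assumption for
# this sketch; a real deployment would use an HSM or a dedicated key service).
INTEGRITY_KEY = b"replace-with-a-key-held-outside-the-database"

def seal_record(patient_id: str, drug: str, dose_mg: float) -> str:
    """Compute a keyed digest over the fields a clinician will act on."""
    message = f"{patient_id}|{drug}|{dose_mg}".encode()
    return hmac.new(INTEGRITY_KEY, message, hashlib.sha256).hexdigest()

def verify_record(patient_id: str, drug: str, dose_mg: float, seal: str) -> bool:
    """Return True only if the stored fields still match the original seal."""
    expected = seal_record(patient_id, drug, dose_mg)
    return hmac.compare_digest(expected, seal)

# The seal is computed when the order is written...
seal = seal_record("patient-042", "examplazine", 5.0)

# ...and checked again before the dose is dispensed. An attacker who quietly
# turns 5.0 mg into 50.0 mg without the key fails the check.
print(verify_record("patient-042", "examplazine", 50.0, seal))  # False
print(verify_record("patient-042", "examplazine", 5.0, seal))   # True

Nothing in the privacy provisions requires a check like this; it lives entirely on the integrity side of the house, which is exactly the side the current rules leave unaddressed.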
The problem becomes even more complicated when you add medical devices to the system. Medical devices have become increasingly "smart" and are now trusted devices on health care networks. Devices perform many functions in health care, and the information some of them are trusted with gathering is often used to make life-or-death decisions. A device that automates blood typing and then sends the results to a patient record database is one example that falls into this category. If an attacker could spoof such a device, he could populate the database with incorrect information that could kill a patient. Moreover, some medical devices have firmware that can be updated (in some cases over a network connection), which opens up the possibility of rogue firmware being purposely introduced to cause havoc.
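The firmware risk can be illustrated just as simply. The sketch below is again hypothetical (the function names and the idea of a digest published by the manufacturer over a separate channel are my assumptions, and a real device would want full cryptographic code signing rather than a bare hash), but it shows how little code separates "flash whatever arrives over the network" from "refuse anything the manufacturer did not publish."

import hashlib

def firmware_digest(image_path: str) -> str:
    """SHA-256 of a firmware image, read in chunks to handle large files."""
    digest = hashlib.sha256()
    with open(image_path, "rb") as image:
        for chunk in iter(lambda: image.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def safe_to_flash(image_path: str, published_digest: str) -> bool:
    """Refuse any image whose digest does not match what the manufacturer published."""
    return firmware_digest(image_path) == published_digest

# A rogue image pushed onto the device over the network fails this check and
# is never flashed, e.g.:
#
#   safe_to_flash("update.bin", digest_from_manufacturer)  # False if tampered

The point is not that a hash solves the problem; it is that integrity controls like these are cheap and well understood, yet they remain optional while the regulatory conversation stays fixed on privacy.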
I bring this up because the cost of failure due to a privacy breach simply pales in comparison to the potential cost of failing to deliver correct information to the system. One leads to embarrassment and potential financial headaches; the other leads to death. Why is this distinction important? Aside from the obvious reasons, it is important because in a world where risks are mitigated based on the cost of failure from a LEGAL perspective (i.e., a finable offense), the actual cost of failure due to a privacy breach is infinitesimally small compared to somebody dying. A good lawyer can potentially knock a $1.5 million fine (the annual maximum under HITECH for violations of a single provision) down considerably if he or she can convince a judge or jury that the punishment does not really fit the crime.
It happens all the time in other industries, in fact. At one time I worked for a motor oil company that faced millions of dollars in EPA fines for statutory violations, but the fine was reduced to a few thousand dollars because the violation simply did not lead to anyone being harmed. The potential for harm was very high (as is true with medical record breaches), but if nobody is actually harmed, then a slap on the wrist is the common punishment (especially if you have a good lawyer). Sure, it may cost you in legal fees, but if you already have a staff of lawyers it is not so hard to stomach.
HITECH is a step in the right direction for better security, but it still fails to address the bigger issues. As we continue to build out our "internet of health care" and interconnect data sources at a national (and eventually global) level, the security risks grow at a nearly exponential rate, because larger, more interconnected systems attract more attackers for the simple reason that compromising them has a bigger impact. We should not wait for theoretical dangers to manifest themselves before we address these issues. The security vulnerabilities of large infrastructures are well enough known today that failing to proactively address them is nothing short of negligence, and the health care industry should act more responsibly.
They know better.