I’ve been blogging about health data breaches lately, but I’m not sure if there are more of them or if the reporting requirements are more strict. I suspect the latter.
One of the things I’ve noticed is that many of the breaches seem to involve multiple exposures by the same organization, which has led to recent legislative changes to the HITECH Act. As the quote below shows, not only has the penalty cap been raised, but the penalties for repeat violators are now higher.
Given the sensitive nature of health data, I’m still thinking that we need to move more towards criminal penalties for wilful neglect and repeat violations.
In addition to redefining the scope and liabilities of business associates in the healthcare industry, the final HIPAA omnibus rule includes revisions to the penalties applied to each HIPAA violation category. While the American Recovery and Reinvestment Act of 2009 (ARRA) initially established a tiered penalty structure, it hasn’t been revised until now.
Section 160.404 sets the amount of the civil monetary penalty administered under the HITECH (Health Information Technology for Economic and Clinical Health) Act. The original penalty structure was:
Do you think companies are bearing enough of the responsibility for protecting our data? Do you as a data professional get enough support from management to ensure that data is protected?
I think we need to have an industry acronym now that this seems to happen every week. My proposals:
- Yet Another USB Breach (YAUB)
- Blame A Thumbdrive (BLAT)
- Yet Another Flashdrive Fail (YAFF)
I like the YAFF one best, so I’m going with that, even though the #FAIL really isn’t in the hardware, but in the abuse of policy and hardware to cause a data breach.
This week’s YAFF announcement comes again from Utah, where a contractor with access to sensitive health data lost a USB flash drive somewhere between Salt Lake City, Denver, and Washington, DC.
What’s different about this news story is that we get more insight into why that data was on a portable device. And it’s just as I prognosticated in a previous post: the contractor was frustrated by an infrastructure issue.
The contractor, Goold Health Systems, handles Medicaid pharmacy transactions for the Health Department. Department spokesman Tom Hudachko said the GHS employee, identified only as a woman from Denver, was having trouble with an Internet connection Thursday while trying to upload the data to a server. The employee saved the personal information to an unencrypted USB memory stick and left the Health Department with the device. The employee lost the stick sometime in the following days while traveling between Salt Lake City, Denver and Washington, D.C.
The contractor lost her job over this.
People Forget Policy When They Are Frustrated or Stressed
I once found a QA contractor cursing at his computer because he was having trouble sending a large file via his Hotmail account. I offered to help. When he showed me what he was doing I just about had a heart attack. He had been trying to send our offshore contractor a copy of a production database backup. This backup contained names, addresses, phone numbers, credit card information (no, the legacy system shouldn’t have been storing this information, but it did), SSNs, Driver’s license numbers and other forms of ID. It was an identity theft treasure chest of awesome.
When I asked him why he was trying to email this information to our offshore contractor, he said he was frustrated that the corporate email system would not let him send such a large file.
He told me the only reason he did this was that he had to get the bug logged and fixed before the weekend because he had plans to be away. He also forgot that production data was never supposed to leave the building. I’m not sure he ever really felt that what he was doing was wrong, or had any idea why emailing sensitive data was wrong.
The other shock I got was that it was a production DBA who had given him the backup. When I asked the DBA why he did this without even asking what it was for, he said "I was really busy and didn’t have time."
I wonder just how many times this scenario plays out every day in offices around the world.
Love your data, even when you are stressed. Especially when you are stressed.
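If that QA contractor had been handed a scrubbed copy of the data instead of a raw production backup, there would have been no treasure chest to lose. Here’s a minimal sketch of what I mean by scrubbing — the field names are hypothetical, it uses only the Python standard library, and in real life the secret would need proper key management rather than a hard-coded value:

```python
import hashlib
import hmac

# Hypothetical secret; in practice this lives in a key vault, not in code.
PEPPER = b"replace-with-a-managed-secret"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier (SSN, card number) with a stable token."""
    return hmac.new(PEPPER, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def scrub_record(record: dict,
                 sensitive_fields=("ssn", "credit_card", "drivers_license")) -> dict:
    """Return a copy of the record that is safe to hand to QA or offshore teams."""
    return {
        k: pseudonymize(v) if k in sensitive_fields and v else v
        for k, v in record.items()
    }

row = {"name": "Jane Doe", "ssn": "123-45-6789", "credit_card": "4111111111111111"}
safe = scrub_record(row)
```

Because the same input always produces the same token, joins and bug reproduction still work in the test environment, but nobody can recover the original SSNs without the secret.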
A recent CMIO post describes the data breach of 34,000 patients’ personally identifiable information.
A former contractor’s personal laptop containing patient information was stolen, according to a statement from Larry Warren, CEO of the hospital. “This information was downloaded in violation of Howard University Hospital policy,” he wrote.
I’ll give you 30 seconds to spot 3 problems with the situation. Tick, Tock.
I can see three especially worrisome problems:
- Information was downloaded in violation: I’m guessing that there was no monitoring of downloads of sensitive data at this medical institution. This sort of monitoring may have prevented this data from leaving the building.
- Former contractor: So a person who had access to this sensitive data was allowed to leave the organization with it. I personally refuse to put data like this on my own devices, mainly because I do not want the liability of having to protect it, or to report it if something were to go wrong. I am usually the only person on the project who refuses. Yet I have never once been asked or reminded to remove company data from my storage devices when I move on to other projects.
- Personal Laptop: I sometimes use my own equipment when working at a client, normally because client systems are often less powerful than my own and they lack licenses for the tools I need to do my job. But I’d rather use systems that have enterprise-class security, encryption and monitoring. I wish more corporate systems supported such practices.
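On the monitoring point above: even a crude content scan can flag files that look like they contain PHI before they walk out the door. This is only a toy sketch with made-up patterns — a real DLP product does far more sophisticated detection — but it shows how little it takes to get started:

```python
import re
from pathlib import Path

# Toy patterns for illustration only; real tools use much more robust detection.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_text(text: str) -> list:
    """Return the names of the sensitive patterns found in a blob of text."""
    return [name for name, rx in PATTERNS.items() if rx.search(text)]

def scan_tree(root: str) -> dict:
    """Scan text files under root and report which ones look like they hold PII."""
    findings = {}
    for path in Path(root).rglob("*.txt"):
        hits = scan_text(path.read_text(errors="ignore"))
        if hits:
            findings[str(path)] = hits
    return findings
```

Point something like this at a file share or an outbound transfer directory and you at least have a fighting chance of noticing sensitive data in places it shouldn’t be.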
Since the article did not mention that the data was encrypted, I’m guessing it wasn’t. I’m also wondering how this ever got reported…I’m guessing most former consultants holding data in violation of policy would not report it themselves. Perhaps the laptop was recovered and the breach came to light that way.
I’ve previously blogged about how poorly medical data is protected.
This sort of data breach makes me mad. It’s nice that the hospital says that they are now “implementing enhanced security measures”, but why didn’t they do that before? Did their compliance officer recommend it but management said “no, too expensive”? Did their DBA say “the database is encrypted, so we are covered”? Did the former contractor take the data maliciously? Did he have to put it on his personal laptop? Why do we continue to treat data as if it is someone else’s problem to manage? Do we not understand that we have a professional obligation to protect patient data? Even with legislation it seems the message still isn’t making it through to everyone.
Does your organization have security monitoring in place to protect patient or customer data? If it doesn’t, have you recommended that it do so? Go do it, now.
So you live in a country that has legislation requiring your health data to be protected and you believe it’s all safe. If you live in the US, think again.
According to a study by Ponemon Institute sponsored by MegaPath:
- 91% of small healthcare organizations (think your local doctor, dentist, optometrist or clinic) had experienced a breach of protected health information (PHI) in the previous 12 months; of those, 29% resulted in medical identity theft
- 52% of small healthcare providers rated their security technology plans as ineffective
- 43% had experienced medical identity theft in their organizations
- 55% of respondents had to notify patients of a data breach in the previous 12 months
- On average, less than 10% of the respondents’ IT budgets are spent on security
You can register and download the entire paper at http://www.megapath.com/solutions/industry/healthcare/study/
I found this table the most interesting.
From a data governance and data protection point of view, I’d really expect to see ALL of those be 100%. My doctor recently moved to mostly electronic health records (as have most in my province), but I’m wondering what his answers to all of these questions would be. When I think about the 91% data breach numbers, I see this table as one of the key reasons that number is so high.
Even if you aren’t in a health-related organization, I’d expect your numbers to be higher. Only 63% have backup and disaster recovery plans? How can we call ourselves professionals when this is life-critical information? Ultimately, organization leadership is responsible for protecting data. But I’ve always been concerned about how far we data professionals should go to ensure the public is protected from harm when data policies and practices are not sufficient. Should we refuse to move on to other projects? Report bad practices? To whom?
This is a US-based study and I’m curious about similar numbers in other countries with and without health data privacy legislation. If you have links to other sources, please provide them in the comments.