I have the privilege of talking to a lot of healthcare IT folks. It is an interesting perspective -- having been “one of them” for so long and now being on the vendor side, the “dark side” as I used to call it. I also have the opportunity to address a lot of healthcare IT folks -- through webinars, at HIMSS events, at other meetings and sometimes in their facilities. Usually it is about some aspect of securing or protecting data.
Lately, I’ve gotten tired of talking about it. Not because it isn’t important -- it is. Not because it isn’t required -- it is. Not because I don’t have a passion for it -- I do. I’m tired of talking about it because, as I learned in West Texas, “talkin’ ain’t doin’,” and it is time to do something about security and privacy in healthcare. Now! We talk about security and privacy a lot -- we just don’t do them very well.
We’ve been talking about it since HIPAA compliance took effect in 2003 (OK, the law itself came earlier, but the compliance deadlines landed then). And yet more than 340 major health information breaches have been reported to federal authorities since the HIPAA breach notification rule took effect in September 2009 -- upwards of 10 million records! It really isn’t that hard. It does take some budget and staff, and I know that is difficult right now. More than money, though, it takes prioritization and focus. We get very focused on Meaningful Use and outcomes and analytics and cost reduction around EMRs and clinical data. But let’s start at the very beginning . . . the data. How meaningful or useful is the data if the patient leaves your organization because it has been breached? How good will the analytics be if no one opts in to share their data because they don’t trust you to keep it safe?
Privacy and security often get dumped into IT as a technology issue. That is naïve. Security and privacy of data is increasingly a patient care issue -- and in some cases could be a patient safety issue. Patients trust you with their lives, and that data is part of their life. It is a clinical snapshot of them. They expect the caregiver to take care of that data the same way they take care of the patient. And IT cannot think of itself as just the technology group in the hospital. You are part of the care team -- the patient data has to be there when a hands-on caregiver needs it. Any time. Any place. Any device. It has to be secure, available and accessible to the RIGHT person, and it has to be reliable.
It is time to stop talking about privacy and security and start doing. We don’t do it very well in this country, personally or professionally, and our window of opportunity is running out. I was recently sent a graphic -- an ad for teddy bears. It shows a horrible monster attacking a sleeping child, but standing at the head of the bed is a tiny teddy bear ready to battle the monster. The cutline reads: “Teddy Bears. Protecting innocent children from under the bed monsters since 1902.” The difference for us is that the monsters are real: breaches, hackers, outages, data loss. And we have real abilities and tools to defend ourselves. It is time to stop pretending to be teddy bears with wooden swords. Implement the tools, policies and procedures; hire the people; enforce what we say. Make privacy and security a function of the business, supported by the tools and capabilities of IT, and take care of our patients and their data.