Earlier this week, I shared information about the U.S. Health and Human Services (“HHS”) Office for Civil Rights’ (“OCR”) first settlement with a medical practice for alleged violations of the breach notification provisions of the Health Information Technology for Economic and Clinical Health (“HITECH”) Act. The $150,000 settlement was made with Adult & Pediatric Dermatology, P.C. (“the Practice”) after the entity reported a stolen jump drive that contained PHI of approximately 2,200 patients.
Both the HITECH Act and HIPAA Final Omnibus Rule (which took effect in September 2013) have strict breach notification provisions. The Omnibus Rule is unique because it presumes that an impermissible use or disclosure of PHI is a breach, unless the covered entity or business associate can demonstrate that there is a “low probability that the PHI has been compromised.” There is, however, an exception to this stringent presumption. HHS considers ePHI that is encrypted to be secure – so an impermissible use or disclosure of encrypted ePHI is not a breach. The jump drive lost by the Practice, unfortunately, was unencrypted.
If a device (such as a jump drive, laptop, or mobile phone) is lost or stolen and it contains accessible ePHI, a practice is required to notify patients, HHS, and, in some cases, the media of the breach. A self-disclosure would not only garner increased supervision from HHS, but also result in reputational harm.
As a health care law attorney, I often hear providers and administrators complain that encryption is “too technical” or “too expensive” for their practice to use. It does not have to be either of these things. And, as the settlement with the Practice demonstrates, taking a risk with unencrypted data can be more costly than implementing encryption technologies and methodologies.
Encryption can be done by computer programs or by specially designed computer hardware devices. Some operating systems already come equipped with encryption capabilities. There are even free programs available for download. These programs or devices apply a mathematical algorithm to information, which results in a “scrambling” of the original data. A legitimate user can access the data by using a “key” that unscrambles the information (i.e., puts it back into its original, readable form).
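To make the scramble-and-unscramble idea concrete, here is a toy Python sketch. The cipher, key, and patient record below are purely illustrative — a simple XOR cipher like this is not real security, and a HIPAA-compliant system would use a vetted algorithm such as AES — but it shows the core mechanic: the same key both scrambles and restores the data.

```python
# Toy illustration of symmetric ("secret key") encryption.
# XOR with a repeating key is NOT real security -- production systems
# use vetted algorithms such as AES -- but the mechanic is the same:
# without the key, the output is unreadable; with it, the original
# data comes back exactly.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Scramble (or unscramble) data by XORing it with a repeating key."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"example-secret-key"          # illustrative key, not a real secret
record = b"Patient: Jane Doe, DOB 01/02/1970"

scrambled = xor_cipher(record, key)     # unreadable without the key
restored = xor_cipher(scrambled, key)   # same key reverses the scramble

assert scrambled != record
assert restored == record
```

Note that applying the same function twice restores the original — that symmetry is exactly what makes the key so sensitive: anyone holding it can read the data.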
A key is simply a piece of data that the algorithm requires. Keys can be “secret,” where the same key is used to encrypt or decrypt data. Alternatively, keys can be “public,” so that a public key is used to encrypt data, but only a private key can decrypt it. (This is similar to a “comments box” – lots of people can put information in, but only the person with the “unlock key” can open it.)
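The comments-box idea can be demonstrated with textbook RSA using tiny numbers. The primes below are far too small to be secure and are for illustration only, but the sketch shows the asymmetry: anyone holding the public key can lock up a message, while only the holder of the private key can open it.

```python
# Toy illustration of public-key ("asymmetric") encryption using
# textbook RSA with tiny primes. These numbers are illustrative only
# and far too small to be secure -- the point is the asymmetry:
# the public key locks the "comments box"; only the private key opens it.
p, q = 61, 53                 # two small primes (illustrative only)
n = p * q                     # modulus, shared by both keys
phi = (p - 1) * (q - 1)
e = 17                        # public exponent: anyone may encrypt
d = pow(e, -1, phi)           # private exponent: only the holder decrypts

message = 65                              # a number standing in for data
ciphertext = pow(message, e, n)           # encrypt with the PUBLIC key
plaintext = pow(ciphertext, d, n)         # decrypt with the PRIVATE key

assert ciphertext != message
assert plaintext == message
```

Because the encrypting key can be handed out freely, public-key systems avoid the problem of sharing a secret key with everyone who needs to send you data.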
Keys can be stored in different places (such as smart cards or on a special server within a computer network), but note that HHS will not exempt an entity from breach notification requirements if the entity keeps the keys on the same device as encrypted data.
Initially, installing and configuring encryption products may take some time. A system administrator or EHR vendor should be relied upon to do this. After installation, encryption and decryption take only minimal time and do not interfere with normal business operations.
Providers should encrypt any electronically-stored files containing PHI. In addition, practice management systems, claims payment appeals, spreadsheets, scanned images, and even emails should be encrypted. If individuals can access ePHI via the Internet, this information should also be encrypted. It may sound complicated, but you have no doubt already been exposed to encrypted Internet data: any web site that has an Internet address beginning with “https” is using an encryption method.
If you are a medical practice physician or administrator, do not hide your head in the sand! The OCR will turn a deaf ear to ignorance and excuses! Encryption may be a scary word, but the fines you risk for using unencrypted data are even scarier.