A Privacy Nightmare

This case occurred at a health care organization that performed a variety of reproductive health procedures, including medication abortions via the administration of misoprostol and mifepristone. The organization ran its electronic health record (EHR) in an environment hosted by the application vendor (the records were held offsite on a solution built on Amazon Web Services), with templates and code maintained by a third-party consulting company. While this setup introduces some complexity, there is nothing inherently troublesome about it if the organization is well-versed in best practices and has strong change management procedures.

This organization, however, was not.

A change had to be introduced in the environment because the use of mifepristone requires that a serial number be recorded. This information is not only required by federal law; the manufacturer also has clinicians sign releases indicating that they understand this requirement, which is driven mainly by patient safety concerns. Until this change was made, the clinic had no clear process to ensure that the information was entered consistently: some employees would use the Lot Number field in the core application, others would record the serial number in the Serial Number field provided by the third-party templates, and still others would not enter the information at all.

The EHR template vendor was given system specifications to make the change, and the altered code was released into a testing environment for user and clinical director review. An email was sent to the appropriate sign-off staff, including the Head of Nursing.


The Head of Nursing decided to ‘test’ the behavior of the electronic health record and opened what they thought was a test patient on the production system.

The patient was, of course, not a test patient.

The Head of Nursing was a credentialed member of staff with elevated system access. They entered the medication on the patient record along with a fake serial number (6666666) to test the system’s behavior.

They then used the approval queue feature to sign off on this record, essentially locking the information permanently into the patient’s overall medical record. Even though the (real) patient record for the (test) encounter was devoid of information that would indicate a real patient visit had occurred (think blood pressure, interview questions, a check-in time), the record was sent to additional staff for follow-up procedures to take place.

The clinician sign-off on the abortion drug triggered a chain of events in the EHR, prompting other staff that the patient needed clinical follow-up over the next few weeks to ensure their safety, confirm the procedure had completed, and verify that they were no longer pregnant. If the procedure has not completed appropriately, other actions must be taken as soon as possible.

The clinician sign-off is also why the patient was manually billed for the medication abortion: the clinic was attempting to reduce its days in accounts receivable (A/R) by dropping charges as soon as they came up for billing review.

The next person to review the record was alerted that they needed to call the patient and confirm they were no longer pregnant. This first follow-up nurse attempted to call but could not reach the patient, as the phone number on the record was several years out of date. Unable to reach the patient, the nurse flagged the record for further follow-up at a later time.

The follow-up procedures in place required a response from the patient, so the next follow-up attempt (made by a THIRD credentialed employee) required that a letter be sent to the patient’s address. The body of the letter indicated that the patient had received a medication abortion at the clinic and needed to report back regarding pregnancy test results. The nurse folded the letter and sent it to the last known good address for the patient.

The last known good address for the patient was the patient’s parent’s house.

At this point, the patient called the clinic, and the Head of Nursing had the opportunity to talk to the patient personally.

When the clinic reported the incident, they indicated that there was ‘no patient impact.’

The Head of Nursing was the HIPAA Privacy Officer.

The HIPAA Security Officer was never informed.


Pretty gnarly, right? So what could have been done to prevent all this, and what were the correct remediation steps the organization should have taken?

  • Change management matters: any organization that regularly depends on its systems must understand how changes are tested, communicated, and implemented.
  • Testing should occur only in a test environment; in the rare cases where it is performed in a production environment, it must be tightly controlled.
  • Test patients in production should be limited in number and obviously named.
  • People should be trained not to sign off on documentation unless they have completed a comprehensive review.
  • Charges should be rendered only for services with an appropriate level of documentation.
  • When follow-up is performed, the patient’s chart should be reviewed to ensure it is complete.
  • When follow-up reveals an incorrect phone number, that finding needs to be noted and possibly escalated.
  • Old patient records should be regularly archived to prevent documentation against them, streamline processes, and safeguard privacy.
  • Any incident that involves patient contact should immediately be regarded as ‘having patient impact.’
  • Any HIPAA breach must be reported to both the HIPAA Security Officer and the HIPAA Privacy Officer.

By Jamie Toth, The Somewhat Cyclops.
