The Mayo Clinic Health System in Mankato is an 11-hospital, five-clinic health system under the broader umbrella of the national Mayo Clinic Health System. Mayo-Mankato, based in Mankato in southeastern Minnesota, has 17 staff radiologists.
The leaders at Mayo-Mankato have been moving forward in a variety of ways in the imaging informatics space, with a desire to move beyond the storage and image presentation capabilities of the PACS (picture archiving and communications system) and RIS (radiology information system) solutions that virtually all radiology organizations already use.
Ernest Beaupain, PACS administrator at Mayo Clinic Health System in Mankato, has been helping to lead a process-improvement effort around critical test results reporting for diagnostic imaging tests. In that area, he and his colleagues have been partnering with the Sarasota, Fla.-based peerVue. Beaupain spoke during the RSNA Conference, being held this week at the McCormick Place Convention Center in Chicago, with HCI Editor-in-Chief Mark Hagland regarding the current initiative unfolding at Mayo-Mankato. Below are excerpts from that interview.
Tell me about the original impetus for moving forward on your current critical results reporting initiative.
The original impetus, really, for purchasing peerVue, was a determination of non-compliance with the Joint Commission’s requirements on critical test results reporting.
When did you go live with the peerVue solution?
We became a customer of peerVue in 2010, and went live in the summer of 2011 with their critical test results management solution; we went live with their semi-urgent tests results management solution that fall, along with two other modules from them: peer review, and technologist quality improvement. The technologist quality improvement module measures the skill with which radiologic technologists perform diagnostic exams, and helps us to assess trends that are developing. During 2011, we implemented the critical test results management solution in eight of the eleven hospitals; the other three went live in 2013.
Tell me more about your implementation of the critical test results management solution. How had the critical test results management process been executed prior to automation?
Before the implementation of the system, the degree of success with the communication of critical test results was largely unknown; the radiologists had pen and paper, but usually just called the clinicians to convey those results. But in any busy physician practice, getting hold of your colleague is nearly impossible. Now, if radiologists wish to call referring physicians to inform them of critical test results, they still can, and they use peerVue to track that call; but they also have the ability to send the result off to an internally staffed call center. What had been hindering them from moving on to the next case after making that initial call was the burden of having to continue making repeated attempts at contact.
Do you have any before-and-after metrics on this?
We don’t have any solid metrics about what went on before implementation, but it was taking at least an hour, on average; following implementation, we reduced that time to under 30 minutes.
With regard to semi-urgent results communication, what has happened?
The degree of improvement was the same, and that measure was also something that had never been tracked before. They would make phone calls in some circumstances. But often, what would happen was that an incidental lung nodule they’d noticed while looking at the base of the lung would be documented in the report, but no one would be called. And sometimes, the clinicians wouldn’t even read the report; the process relied on the referring physicians seeing and reading those things. The driver there was patient safety, even though this area is not addressed by the Joint Commission, since they don’t address those goals specifically.
Tell me a bit more about the peer review process, as facilitated by automation.
In that, we are being driven by the radiologists’ workflow. What they’ve chosen to do is this: each day, for the first five cases that have prior exams, they perform peer review on those prior exams.
Are you using analytics yet?
No, right now, the radiologists have chosen purely time-based selection in terms of which cases they’re looking at. We’re not using the engine to do prospective peer review.
How has the implementation of the critical test results solution influenced workflow, process, and culture in your organization?
Radiologists have seen a dramatic improvement in their workflow because of this. They can submit the case to peerVue, know that it’s being taken care of, and move on to the next case.
And the referring physicians have gotten used to having a call center call them?
That’s where the culture change comes in. Two-and-a-half years out, there are still multiple clinicians who are saying, “I’m an orthopedic surgeon, I know that heel is fractured, why are you bothering me?” So that remains a challenge.
Aren’t the referring physicians grateful, though, with regard to results that are life-threatening?
Sometimes yes, sometimes no. Sometimes, they’re looking at the results before the call center has reached them. So we’re continually reminding them that we’re doing this for patient safety, and that these are Joint Commission safety goals. Also, when we initially went live with peerVue, we had tried to use some of the technological features of the software—I’m referring to a pop-up on their screen, and a text message. That met a lot of resistance as well. We experienced a zero-to-five percent compliance rate, which led us to switch to a call-center format instead. The pop-ups had some technical problems in terms of identifying the communicators. So we really needed to use the telephone for critical results management. In terms of semi-urgent results reporting, we send the referring physicians an e-mail every day with a link through which they can acknowledge the result in just two clicks. Right now, we have a 50-percent compliance rate among the referring physicians, and we follow up by phone if they have not acknowledged the result.
What would your advice be for IT and clinician stakeholders, based on your experience so far with these programs?
They definitely need to find a system that integrates tightly with their radiology environment’s workflow, so that radiologists can just click to comply. That’s a major thing. And I would say the multimedia features are less important than going out and educating the referring physician staff on change management. I think it depends on each individual group of physicians whether or not you can introduce certain features to them. Also, among our 17 radiologists, 12 practice one way while five practice another within the results communication realm: the five radiologists in the three hospitals choose to contact the ordering providers themselves in most cases, while the 12 radiologists in the eight hospitals choose to utilize the call center.
So from both the radiologists’ standpoint and the referring physicians’ standpoint, there is flexibility with this system. And that’s important—tailoring the solution to the practice preferences of both groups of physicians. Within one area, we have the call center calling a particular cell phone number. And we can bring that up within the solution, to customize that preference. There’s customization there.
So the physicians are coming along in terms of acceptance and adoption?
Yes. And honestly, we’re improving the patient safety environment. The fact that a few are a bit upset by procedure changes is far outweighed by the fact that we’re in a more secure place in terms of patient safety concerns.