Peer Review of Radiologic Studies for Quality Assurance: One Health System’s Experience

February 25, 2013
by Mark Hagland
At the Tacoma-based MultiCare Health System, clinicians and informaticists have implemented an automation-facilitated radiologic study peer review system

MultiCare Health System is a not-for-profit health system based in Tacoma (Pierce County), Washington. It encompasses four acute-care facilities with 868 beds, seven ambulatory surgical centers, seven urgent care centers, and 20 sites of care for imaging services. The system employs 400 physician FTEs in its MultiCare Medical Associates; what’s more, radiologists in two different radiology groups have been participating in an initiative that offers a great deal of potential for specialty medical management going forward. The initiative encompasses 15 of the 22 radiologists at Medical Imaging Northwest and 23 of the 46 radiologists from Tacoma Radiological Association, as well as eight orthopedic physicians in various locations. Those physicians are participating in a program in which radiologists are assigned radiological studies to review as part of a quality review process.

Using information technology from the Sarasota, Fla.-based PeerVue, those radiologists are ensuring that core quality assurance/peer review processes that all radiologists should be engaged in are performed, tracked, and analyzed, and that the information that comes out of that process is then plowed back into a continuous performance improvement cycle. The PeerVue solution supporting the radiologic study peer review process went live in November 2009.

Jim Sapienza, administrator for imaging services, MultiCare Health System, and Andrew Levine, M.D., chairman of the Executive Committee, Medical Imaging Northwest, and Medical Director of South King County Diagnostic Imaging Services, spoke recently with HCI Editor-in-Chief Mark Hagland regarding the initiative taking place at MultiCare, and its implications for medical management going forward. Below are excerpts from that interview.

What made you decide to move forward into this area, strategically?

Jim Sapienza: It was our Quality Committee for Diagnostic Procedure Specialties (cardiology, radiology, and some others) feeling there wasn’t a close enough review by the radiologists of the exams they were reading. And ultimately, any study review goes to a committee if there’s an escalation of any concerns or issues that come out of a study that was read.

So, essentially, the review process is triggered in case there is a problem with any particular radiologic study?

Sapienza: Yes, the desire to have case review, or peer review, of our imaging studies, was the initiator. Dr. Levine was doing research on this and found PeerVue.

Andrew Levine, M.D.: Part of what happened was that Lori Morgan, M.D., the head trauma surgeon and chair of that committee a number of years ago, had actually requested that we put together some kind of peer review process including radiologists and surgeons, and so on; so it wasn’t only radiologists, but other clinicians who had requested this. Let’s say a trauma surgeon orders a total-body set of scans; the next day, the surgeons would look at the interpretations. In the past, if problems arose in radiologic interpretations, addressing them would involve someone like myself who’s a medical director. I’d look at the case and talk to the radiologist who had made the mistake, but it would pretty much stop there; there would be no follow-up or analysis examining why the same individual, or multiple people, kept making the same mistakes.

Trying to do all of this in a paper-based system was not user-friendly; it required the radiologist to pull out paper, make notes, give those notes to a technical or clerical person, and then have that person give them to me. This way, we know that every week a certain number of cases are reviewed for, or by, each person, and we can go in and do the tech QA and also the retrospective and prospective review—someone might have had a significant miss.

And someone can handle the software in the background, and we don’t have to deal with that stuff. Like other clinicians, radiologists want to do what we want to do, not clerical things. We want to have QA [quality assurance], but we want the process to be efficient.

Among the benefits of developing an automated quality assurance program are timeliness, greater accountability, and transparency, then?

Levine: And closing the loop; that’s the big issue.

So you’re able to connect right away with the physician whose interpretations are problematic, right? Let's call him “Dr. Smith.”

Levine: Yes. It might be Tuesday, and I might review the case and put an addendum on it, and Dr. Smith is on vacation for a week, and there’s then a gap. And medico-legally, you don’t want to put this on e-mail. PeerVue is considered part of a hospital’s quality assurance process, and is protected.

What kinds of changes have been made since implementing PeerVue?

Levine: I’m on the QI [Quality Improvement] Committee; every two months, we have 15-25 cases that fall out [of the norm] for some reason and are flagged for a group review. And we look at these before our QA meeting, and decide whether it’s something that should have been picked up, or challenged, or may perhaps have been a protocol error or something. But in this process, we bring cases back to our group and say, this is the kind of thing people are missing. Sometimes, things are ‘one-offs’: someone missed a lymph node or a fractured wrist or something. But a lot of times, you see patterns; so, for example, when we talk about diagnosis of intracranial aneurysms, we’ll ask a neuroradiologist to come talk to us about that.

It’s almost a kind of continuing medical education process, then, isn’t it?

Levine: Exactly. We could almost get CME credit for that.