
Will Financial Incentives to Radiologists Propel IT-Facilitated Quality Processes Forward?

March 17, 2015
Some of the financial imperatives facing practicing radiologists may also lead to the adoption of IT-facilitated radiological peer review—but forward progress remains slow

I read with considerable interest an article in Diagnostic Imaging online last month. The report, by Aine Cryts, covered the topic of data analytics solutions for optimizing patient scheduling in radiology practices and hospital radiology departments.

Among other people, the article quoted Nadim Daher, principal analyst for medical imaging at the San Antonio-based Frost & Sullivan, as saying that there has been pressure in the past two years for radiology to align with the “new realities of healthcare, where it’s about cost efficiency, outcomes, payments, quality, and value.” “These are things that radiology has not been prepared for,” Daher said. “Instead, radiology has been rooted in the fee-for-service model, the ‘do more and earn more’ mentality.” Daher categorized radiology analytics solutions into three key areas—operational, financial, and clinical.

What’s more, DI’s Cryts quoted Tessa Cook, assistant professor of radiology at the Perelman School of Medicine at the University of Pennsylvania. “It feels like the specialty is being challenged in different ways: reimbursement cuts, the job market. Radiology is in the spotlight—and not in a good way,” Cook said. “Analytics give us the ability to really start to show what we [as radiologists] bring to the table in terms of contributing to patient care.”

Much of the rest of the DI article focused on patient scheduling and throughput issues. Reading it, though, it occurred to me that other policy and reimbursement trends are in play as well, and that radiologists’ need to optimize throughput and satisfy patients with convenient scheduling may also lead them to embrace other types of analytics more eagerly, most especially clinical peer review of outcomes quality among radiologists in radiology groups.

For example, two years ago, I interviewed radiologist leaders at MultiCare Health System, a Tacoma, Washington-based health system with four acute-care hospitals and 20 sites of care for imaging services, as well as an employed physician group and two affiliated radiology groups.

As I noted in that article, “Radiologists in [the] two different radiology groups have been participating in an initiative that offers a great deal of potential for specialty medical management going forward. The initiative encompasses 15 of the 22 radiologists at Medical Imaging Northwest, and 23 of the 46 radiologists from Tacoma Radiological Association; as well as eight orthopedic physicians in various locations. Those physicians are participating in a program in which radiologists receive assigned radiological studies and review them in a quality review process.”

As I reported back then, “Using information technology from the Sarasota, Fla.-based PeerVue, those radiologists are ensuring that some core quality assurance/peer review processes that all radiologists should be engaged in, are performed, tracked, and analyzed, and that the information that comes out of that process is then plowed back into a continuous performance improvement cycle. The PeerVue solution supporting the radiologic study peer review process went live in November 2009.”

At the time, I interviewed Jim Sapienza, administrator for imaging services, MultiCare Health System, and Andrew Levine, M.D., chairman of the Executive Committee, Medical Imaging Northwest, and Medical Director of South King County Diagnostic Imaging Services. What Sapienza and Dr. Levine told me seemed very promising.

As Dr. Levine told me in that interview, “In the past, if problems arose in radiologic interpretations, addressing them would involve someone like myself who’s a medical director. I’d look at the case and talk to the radiologist who had made the mistake, but it would pretty much stop there; and there would be no follow-up or analysis examining why the same individual was making the same mistakes, or multiple people were making the same mistakes.”

What’s more, Levine said, “Trying to do all of this in a paper-based system was not user-friendly; it required the radiologist to pull out paper and make notes and give those notes to a technical or clerical person and then have that person give it to me. This way, we know that every week, a certain number of cases are reviewed on or by each person, and we can go in and do the tech QA [quality assurance] and also the retrospective and prospective stuff—someone might have had a significant miss. And someone can handle the software in the background, and we don’t have to deal with that stuff. Like other clinicians, radiologists want to do what we want to do, not clerical things. We want to have QA [quality assurance], but we want the process to be efficient.”