Joe Marion, principal in the Waukesha, Wis.-based Healthcare Integration Strategies LLC consulting firm, is one of the most respected consultants in the imaging and imaging informatics industry. He is also among the veterans with the longest memories of the annual RSNA Conference (sponsored by the Oak Brook, Ill.-based Radiological Society of North America), having participated in his first RSNA Conference in 1976; this conference marks his forty-first.
Marion, who blogs and writes regularly for Healthcare Informatics, continues to see waves of change lashing the shores of the radiology and imaging informatics world. On Tuesday, during RSNA 2017 at the McCormick Place Convention Center in Chicago, he sat down with Healthcare Informatics Editor-in-Chief Mark Hagland to reflect on this year’s conference and the trends he’s seeing. Below are excerpts from that interview.
What have your impressions been of the RSNA Conference this year—in terms of the discussions taking place, in terms of what’s visible on the exhibit floor, and in terms of healthcare leader concerns? It seems that artificial intelligence and machine learning are absolutely everywhere this year.
Yes, absolutely. Clearly, machine learning is everywhere this year. Not much of it is for real yet; everybody wants to jump on the bandwagon. IBM Watson and Mindshare appear to be among the leaders; they seem to be the big games in town. And as that evolves, the perception of most people is that it will lead to radiologist assistance rather than replacement. One phrase being coined now is workflow orchestration--Medicalis/Siemens, Primordial/Nuance, and Clario. Clario is focused on physician practices; the other two are focused more on IDNs [integrated delivery networks]. And I think there's sort of a split there. Some vendors, like Agfa, understand the consolidation that's occurring, and how it's requiring radiologists to work across multiple entities. Others are focused on productivity.
Some of the AI-based solutions are rules-based--studies are assigned for reading based on location, shift, and other criteria. Another interesting approach is being taken by GE Healthcare, with something called AutoServe. Instead of a work list, it assigns cases in real time, serving up the next case in each subspecialty; the objective is to see how much more efficient radiologists can be in real time. Wait time is one of the other criteria people are using--how long has a study been waiting to be read? Some of these groups have service-level agreements, saying, I'll read a chest x-ray in eight hours, or something like that. The most obvious example would be a stat study. What AutoServe does is that, rather than having a radiologist pick off a list, it assigns the study to that person. So this is a different approach from just serving up the studies based on priorities.
How long do you think it will take in order for artificial intelligence and machine learning tools to be incorporated into existing solutions, or integrated into them? I was interested in Nuance’s announcement that they are going to be facilitating relatively open AI algorithm-set development.
Yes, there are numerous options vendors could take here. They could develop their own applications, adopt third-party applications, or work with academic researchers to develop solutions. So there's a mix of approaches. Where will the validation come from? And if I'm an academic center and come up with an application, would the FDA need to approve it? Say your algorithm is around tuberculosis diagnosis--this TB application might help you prioritize certain cases. There might be some false positives, but it would improve your reading efficiency. It's an interesting question whether the FDA [Food and Drug Administration] might in fact have to approve some of these applications; that could slow things down by eons.
So the artificial intelligence applications could support radiologist work lists and clinical decision support. And the CDS mandate has been pushed back to 2018 or even 2020--something to verify. The real question is, will there be an impact on the imaging companies?
How would you contextualize all this for CIOs and other healthcare IT leaders?
A couple of things. First, where does imaging fit into the context of value-based healthcare, population health management, and other phenomena, as the U.S. healthcare system continues to evolve?
The other approach is from a pixel perspective, contributing to improving diagnosis, through algorithms. On the textual side, Change Healthcare is involved in analyzing reports and extracting relevant pieces of information--intelligent mining of the EHR [electronic health record], if you will. In the past, a radiologist picked up a film, read it, and analyzed it in isolation. Now, the emphasis is on bringing up other relevant information during the reading. You can call it data mining of the EHR and other systems.
On a very practical level, what does all this mean? Should HIT leaders rush out and buy ‘stuff’—or should they think carefully about the hype cycle and development trajectory around this, first?