
Top Ten Tech Trends 2018: There’s Value in AI, but Where Is the Value Greatest?

August 31, 2018
by Rajiv Leventhal, Managing Editor
The spotlight has been shining bright on IBM Watson of late as healthcare stakeholders ponder how artificial intelligence can help solve some of the industry’s biggest problems

Editor’s Note: Throughout the next week, in our annual Top Ten Tech Trends package, we will share with you, our readers, stories on how we gauge the U.S. healthcare system’s forward evolution into the future.

Reading about the future of healthcare these days likely means there will be some reference to artificial intelligence (AI). It’s one of those “buzz terms” that is being used in a variety of ways across the sector, though applications are still quite early in most cases. But make no mistake—for healthcare stakeholders of all types, AI is a term that’s on their minds.

A big reason why AI in healthcare has become such a popular concept is the mainstream media coverage of IBM Watson, an artificial intelligence supercomputer that was thrust into the world of healthcare just a few years after it beat record-setting Jeopardy! champions in 2011. Watson Health, a unit of IBM, was launched at the 2015 HIMSS conference and employs thousands of people. However, along with Watson’s popularity has come intense scrutiny, especially in the last year.

A STAT News report from September 2017 was one of the first major stories detailing how Watson has been performing in hospitals, specifically examining Watson for Oncology—a solution that aims to help physicians quickly identify key information in a patient’s medical record, surface relevant articles and explore treatment options to reduce unwanted variation of care and give time back to their patients.


But the piece found that Watson for Oncology has struggled in several key areas, noting that while IBM sales executives say that Watson for Oncology possesses the ability to identify new approaches to cancer care, in reality, “the system doesn’t create new knowledge and is artificially intelligent only in the most rudimentary sense of the term.” A more recent report, also from STAT, included internal documents from IBM Watson Health which indicated that the Watson for Oncology product often returns “multiple examples of unsafe and incorrect treatment recommendations.”

There was also one newsworthy story last year about a partnership between IBM and MD Anderson Cancer Center, part of the University of Texas, which soured to the point where the $62 million project to deploy Watson at the cancer center was scrapped. Lynda Chin, M.D., who oversaw the Watson project at MD Anderson before it fell apart, told STAT reporters that it was quite challenging to make the technology functional in healthcare. “Teaching a machine to read a record is a lot harder than anyone thought,” she told STAT, noting how her team spent countless hours trying to get the machine to deal with the idiosyncrasies of medical records.

Meanwhile, in a recent interview with Healthcare Informatics, Francine Sandrow, M.D., chief health information officer (CHIO) at the Corporal Michael J. Crescenz Veterans Affairs Medical Center in Philadelphia, notes that her team was working on a project in which Watson was used to identify patients who were at risk for post-traumatic stress disorder (PTSD) but had not actually been diagnosed with it. The project focused on simply feeding their charts into the Watson engine, says Sandrow, who is involved in several Veterans Health Administration clinical informatics initiatives.

Unfortunately, she says, “They de-funded [the project] before we got to the results part.” She explains, “When you’re dissecting a chart, the first thing you have to do, when you’re training a computer to recognize [something], is define the terms that would be included as triggers for a particular condition.” For post-traumatic stress disorder, she continues, the high volume of terms meant that there weren’t many charts that would be eliminated. In other words, there were too many indicators for the Watson machine to effectively pull out those patients at risk. “I’m not certain that they would be able to get the specificity that they were looking for. There’s a lot of subtle indicators for PTSD, and human behavior, that I think it would have clouded up the ability of the computer to recognize it, simply from the chart,” Sandrow says.

IBM, according to STAT, has reiterated to its customers that all data included in Watson for Oncology is based on real patients and that the product has won praise around the world for its recommendations. Discussions have also emerged on just how much the company should be blamed—versus the end user—for implementation struggles. To this point, Leonard D'Avolio, Ph.D., an assistant professor at Harvard Medical School and CEO and co-founder of healthcare technology company Cyft, notes, “Who is at fault there? IBM or the provider team that bought the product for marketing and hoped it would fulfill a vision?”

Of course, Watson is just one example of an AI technology that has sparked debate. But given IBM’s immense industry standing and how the tech giant has marketed Watson, Healthcare Informatics set out, for one of its top tech trends this year, to ask industry leaders what they were seeing and hearing about the AI supercomputer, and how its performance has affected the broader artificial intelligence landscape.

Humans versus Computers

Bill Kassler, M.D., is the deputy chief health officer at IBM Watson Health, and he offers a dual perspective on AI, coming at it from both ends of the spectrum: as a healthcare practitioner and as a technology company executive. When asked about the skepticism that has surrounded Watson of late, Dr. Kassler says that in general, healthcare’s doctors, hospital administrators and other decision makers are conservative and operate in resource-constrained environments. “They are skeptical about technology, drugs, and anything else that’s new. That’s the baseline culture.”

Bill Kassler, M.D.

Kassler contends that even though IBM must work around this challenge, its AI offerings remain quite popular worldwide. Indeed, IBM Watson Health’s Oncology and Genomics business has doubled in revenue year over year since 2015, and its AI offerings are now being used in more than 230 hospitals around the world, up from just 55 hospitals at this time last year, he says.

For traditional physicians, one of the primary critiques of AI is that the computer’s treatment recommendations may differ from the doctor’s. For instance, a physician who makes decisions based on decades of experience might not take too kindly to a computer recommendation that he or she firmly believes is not the best option for the patient.

Kassler says he gets asked this question frequently, and attests that studies have been done on how often the Watson computer agrees with a panel of patient care experts. He references one particular study, published last year in the journal The Oncologist, that was led by oncologists at the University of North Carolina’s Lineberger Comprehensive Cancer Center. The oncologists tested Watson for Genomics on more than 1,000 retrospective patient cases. More than 99 percent of the time, Watson agreed with the physicians, but beyond that, in more than 300 cases, Watson found clinically actionable therapeutic options that the physicians had not identified.

To this point, Kassler acknowledges that if the technology simply always agrees with the human, there is “limited utility.” While it can improve unwanted variation and quality, “what you really want is for that system to surface new insights,” he says. In a separate study of Watson for Oncology that Kassler mentions, inclusive of nearly 2,000 high-risk breast cancer patients, 30 percent of the time, Watson identified a new tumor mutation and had actionable recommendations.

As such, Kassler says, “If there’s a conflict [between computer and human], our hope is that Watson will deliver a list of recommended treatment options, the doctor will look at that and [compare] what his or her patient has with the other factors that Watson has included, and will then choose to accept the computer’s recommendations or not. And then the doctor will tell Watson why he or she made that decision so that Watson can learn from it,” he explains.

Expanding on this point, Yan Li, Ph.D., an assistant professor of information systems and technology at California-based Claremont Graduate University, notes that most AI technologies are in the form of a black box—that is, providing an output (recommendations) from a set of inputs without an explanation as to why. “It is very difficult for an experienced clinician to trust such an output without a logical explanation, especially if the output is different from his or her experience-based judgment,” Li asserts.

Is it Worth the Battle?

More broadly speaking, the reason why so many innovators are bullish on leveraging AI in healthcare has to do with the computer’s learning and computation capabilities—specifically the speed and volume at which it consumes information, Li says. “To provide high-quality care, medical practitioners must continuously update their clinical knowledge and keep current with the research literature,” she says, referencing a study that estimated it would require a physician approximately 627.5 hours per month to evaluate newly published research in primary care. But for computers, Li says, “processing this literature would take a matter of a few hours, and even less if we horizontally scale up the computation power.”

At the same time, there are plenty of challenges beyond the aforementioned trust issue. Li believes that in their current state, most AI solutions require training. “It is not the computer; rather, it is the computational algorithm that is trained based on historical data, and then makes predictions, classifications, or inferences based on input data. AI algorithms fall short in not considering relevant clinical information that may not be captured in the training data,” she says, offering an example of a diagnostic conversation between the patient and the clinician.

There is additionally a fear conundrum: the concern that AI technologies will eventually diminish the need for certain human jobs as they have begun to do in many other sectors. But the experts interviewed for this piece believe that this apprehension is mostly unwarranted. “It’s not a valid fear. It’s just something that sells stories because talking about replacing humans is something that’s super interesting,” says Cyft’s D'Avolio. Sanket Shah, an instructor for the University of Illinois at Chicago’s Department of Biomedical and Health Information Sciences, agrees with D'Avolio, noting, “Physicians need not fear being replaced by AI. Physicians are the providers of care and AI is one of the many tools they use to administer that care and improve their craft.”

Leonard D'Avolio, Ph.D.

In the end, when all the concerns and potential benefits are weighed together, most experts are still bullish on how AI can provide key clinical decision support to improve patient outcomes and lower costs. D'Avolio believes that many health system leaders have recently broadened “what was once a narrow view of AI and machine learning within their organizations.”

What’s sorely needed, most leaders in this space agree, is better education on exactly how AI offerings will work in healthcare organizations. And in this sense, Watson’s successes and failures can be used to learn lessons moving forward. In the first STAT report, the authors wrote, “The actual capabilities of Watson for Oncology are not well-understood by the public, and even by some of the hospitals that use it.”

Of course, at what level a provider might leverage AI might also depend on several other factors. IBM’s Kassler notes, “If you are a small, one-person family practice in rural Vermont that is now just starting to use Excel spreadsheets for population health registries, yes, it’s too early [to start using AI]. But if you are a large integrated delivery network looking to invest in and be part of the development and perfection of this technology, it’s a great time,” he says.

As such, it is tough to say whether AI is at a crossroads at this moment, and the question will likely remain a meaningful health IT trend in the years to come. As Kassler acknowledges, “For those on the leading edge, it’s a great time to get involved, but it’s not for everyone.”





Have CIOs’ Top Priorities for 2018 Become a Reality?

December 12, 2018
by Rajiv Leventhal, Managing Editor

In comparing healthcare CIOs’ priorities at the end of 2017 to this current moment, new analysis has found that core clinical IT goals have shifted from focusing on EHR (electronic health record) integration to data analytics.

In December 2017, hospital CIOs said they planned to focus mostly on EHR integration, mobile adoption and physician buy-in, according to a survey conducted at the time by Springfield, Va.-based Spok, a clinical communications solutions company, of College of Healthcare Information Management Executives (CHIME) member CIOs.

The survey from one year ago found that across hospitals, 40 percent of CIO respondents said deploying an enterprise analytics platform was a top priority in 2018. Seventy-one percent of respondents cited integrating with the EHR as a top priority, and 62 percent said physician adoption of, and buy-in for, secure messaging was a top priority for the next 18 months. What’s more, 38 percent said optimizing EHR integration with other hospital systems was a key focus for 2018.

Spok researchers were curious whether their predictions became reality, so they analyzed several industry reports and asked a handful of CIOs to recap their experiences from 2018. The most up-to-date responses revealed that compared to last year when just 40 percent of CIOs said they were deploying an enterprise analytics platform in 2018, harnessing data analytics looks to be a huge priority in 2019: 100 percent of the CIOs reported this as top of mind.

Further comparisons on 2018 predictions to realities included:

  • 62 percent of CIOs predicted 2018 as the year of EHR integration; 75 percent reported they are now integrating patient monitoring data
  • 79 percent said they were selecting and deploying technology primarily for secure messaging; now, 90 percent of hospitals have adopted mobile technology and report that it’s helping improve patient safety and outcomes
  • 54 percent said the top secure messaging challenge was adoption/buy-in; now, 51 percent say they involve clinicians in mobile policy and adoption

What’s more, regarding future predictions, 87 percent of CIOs said they expect to increase spending on cybersecurity in 2019, and 60 percent of respondents expect data to be stored in a hybrid/private cloud three years from now.

CIOs also expressed concern regarding big tech companies such as Apple, Amazon and Google disrupting the healthcare market; 70 percent said they were somewhat concerned.


How One Community Hospital is Leveraging AI to Bolster Its Care Pathways Process

December 6, 2018
by Heather Landi, Associate Editor

Managing clinical variation continues to be a significant challenge facing most hospitals and health systems today as unwarranted clinical variation often results in higher costs without improvements to patient experience or outcomes.

Like many other hospitals and health systems, Flagler Hospital, a 335-bed community hospital in St. Augustine, Florida, had a board-level mandate to address its unwarranted clinical variation with the goal of improving outcomes and lowering costs, says Michael Sanders, M.D., Flagler Hospital’s chief medical information officer (CMIO).

“Every hospital has been struggling with this for decades, managing clinical variation,” he says, noting that traditional methods of addressing clinical variation management have been inefficient, as developing care pathways, which involves identifying best practices for high-cost procedures, often takes up to six months or even years to develop and implement. “By the time you finish, it’s out of date,” Sanders says. “There wasn’t a good way of doing this, other than picking your spots periodically, doing analysis and trying to make sense of the data.”

What’s more, available analytics software is incapable of correlating all the variables within the clinical, billing, analytics and electronic health record (EHR) databases, he notes.

Another limitation is that care pathways are vulnerable to the biases of the clinicians involved, Sanders says. “In medicine, what we typically do is we’ll have an idea of what we want to study, design a protocol, and then run the trial and collect the data that we think is important and then we try to disprove or prove our hypothesis,” he says.


Sanders says he was intrigued by advances in machine learning tools and artificial intelligence (AI) platforms capable of applying advanced analytics to identify hidden patterns in data.

Working with Palo Alto, Calif.-based machine intelligence software company Ayasdi, Flagler Hospital initiated a pilot project to use Ayasdi’s clinical variation management application to develop care pathways for both acute and non-acute conditions and then measure adherence to those pathways.

Michael Sanders, M.D.

Flagler targeted its treatment protocols for pneumonia as an initial care process model. “We kicked around the idea of doing sepsis first, because it’s a huge problem throughout the country. We decided to use pneumonia first to get our feet wet and figure out how to use the tool correctly,” he says.

The AI tools from Ayasdi revealed new, improved care pathways for pneumonia after analyzing thousands of patient records from the hospital and identifying the commonalities between those with the best outcomes. The application uses unsupervised machine learning and supervised prediction to optimally align the sequence and timing of care with the goal of optimizing for patient outcomes, cost, readmissions, mortality rate, provider adherence, and other variables.

The hospital quickly implemented the new pneumonia pathway by changing the order set in its Allscripts EHR system. As a result, for the pneumonia care path, Flagler Hospital saved $1,350 per patient and reduced the length of stay (LOS) for these patients by two days, on average. What’s more, the hospital cut readmissions seven-fold—the readmission rate dropped from 2.9 percent to 0.4 percent, hospital officials report. The initial work saved nearly $850,000 in unnecessary costs, trimmed by eliminating labs, X-rays and other processes that did not add value, along with the reductions in lengths of stay and readmissions.

“Those results are pretty amazing,” Sanders says. “It’s taking our data and showing us what we need to pursue. That’s powerful.”

With the success of the pneumonia care pathway, Flagler Hospital leaders also deployed a new sepsis pathway. The hospital has expanded its plans for using Ayasdi to develop new care pathways, from the original plan of tackling 12 conditions over three years, to now tackling one condition per month. Future plans are to tackle heart failure, total hip replacement, chronic obstructive pulmonary disease (COPD), coronary artery bypass grafting (CABG), hysterectomy and diabetes, among other conditions. Flagler Hospital expects to save at least $20 million from this program in the next three years, according to officials.

Finding the “Goldilocks” group

Strong collaboration between IT and physician teams has been a critical factor in deploying the AI tool and to continue to successfully implement new care pathways, Sanders notes.

The effort to create the first pathway began with the IT staff writing structured query language (SQL) code to extract the necessary data from the hospital’s Allscripts EHR, enterprise data warehouse, surgical, financial and corporate performance systems. This data was brought into the clinical variation management application using the FHIR (Fast Healthcare Interoperability Resources) standard.
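The loading step Sanders describes can be pictured with a small sketch: a flat row pulled by SQL gets wrapped in a FHIR resource before it is handed to the analytics application. This is a hypothetical illustration, not Flagler's actual code; the column names (`patient_id`, `loinc_code`, `value`, `unit`) and the example LOINC code are invented for the sketch.

```python
# Hypothetical sketch: wrapping one SQL result row in a minimal FHIR
# Observation resource before loading it into an analytics application.
# Field names are illustrative, not the hospital's actual schema.

def row_to_fhir_observation(row: dict) -> dict:
    """Return a minimal FHIR Observation built from a flat SQL row."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": row["loinc_code"],   # illustrative lab code
            }]
        },
        "subject": {"reference": f"Patient/{row['patient_id']}"},
        "valueQuantity": {
            "value": row["value"],
            "unit": row["unit"],
        },
    }

obs = row_to_fhir_observation(
    {"patient_id": "123", "loinc_code": "33959-8", "value": 1.8, "unit": "ng/mL"}
)
print(obs["resourceType"], obs["subject"]["reference"])
```

The point of the wrapper is that once every source system's rows are normalized into standard FHIR resources, the downstream application does not need to know each system's private schema.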

“That was a major effort, but some of us had been data scientists before we were physicians, and so we parameterized all these calls. The first pneumonia care path was completed in about nine weeks. We’ve since turned around and done a second care path, for sepsis, which is much harder, and we did that in two weeks. We’ve finished sepsis and have moved on to total hip and total knee replacements. We have about 18 or 19 care paths that we’re going to be doing over the next 18 months,” he says.

After being fed data of past pneumonia treatments, the software automatically created cohorts of patients who had similar outcomes accompanied by the treatments they received at particular times and in what sequence. The program also calculated the direct variable costs, average lengths of stay, readmission and mortality rates for each of those cohorts, along with the statistical significance of its conclusions. Each group had different comorbidities, such as diabetes, COPD and heart failure, which was factored into the application's calculations. At the push of a button, the application created a care path based on the treatment given to the patients in each cohort.

The findings were then reviewed with the physician IT group, or what Sanders calls the PIT crew, to select what they refer to as the “Goldilocks” cohort. “This is a group of patients that had the combination of low cost, short length of stay, low readmissions and almost zero mortality rate. We then can publish the care path and then monitor adherence to that care path across our physicians,” Sanders says.
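The "Goldilocks" selection Sanders describes—finding the cohort that is simultaneously low-cost, short-stay, rarely readmitted and low-mortality—can be sketched as a simple multi-metric ranking. This is a toy illustration under invented numbers, not Ayasdi's actual algorithm: each metric is min-max normalized across cohorts and the cohort with the lowest combined score wins.

```python
# Hypothetical sketch of "Goldilocks" cohort selection: pick the cohort
# that does best across cost, length of stay, readmissions and mortality.
# Metric values are invented for illustration.

def goldilocks(cohorts: dict) -> str:
    metrics = ["cost", "los", "readmit", "mortality"]
    # Min and max of each metric across all cohorts, for normalization.
    spans = {m: (min(c[m] for c in cohorts.values()),
                 max(c[m] for c in cohorts.values())) for m in metrics}

    def score(c):
        # Sum of min-max normalized metrics; lower is better on every axis.
        total = 0.0
        for m in metrics:
            lo, hi = spans[m]
            total += 0.0 if hi == lo else (c[m] - lo) / (hi - lo)
        return total

    return min(cohorts, key=lambda name: score(cohorts[name]))

cohorts = {
    "A": {"cost": 9500, "los": 6.1, "readmit": 0.029, "mortality": 0.012},
    "B": {"cost": 8150, "los": 4.0, "readmit": 0.004, "mortality": 0.001},
    "C": {"cost": 8900, "los": 4.8, "readmit": 0.018, "mortality": 0.006},
}
print(goldilocks(cohorts))  # "B": best on every metric, so score 0.0
```

A real system would also weight the metrics and test whether the differences between cohorts are statistically significant, as the article notes the application does.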

The AI application uncovered relationships and patterns that physicians either would not have identified or would have taken much longer to identify, Sanders says. For instance, the analysis revealed that for patients with pneumonia and COPD, beginning nebulizer treatments early in their hospital stays improved outcomes tremendously, hospital leaders report.

The optimal events, sequence, and timing of care were presented to the physician team using an intuitive interface that allowed them to understand exactly why each step, and the timing of the action, was recommended. Upon approval, the team operationalized the new care path by revising the emergency-department and inpatient order sets in the hospital EHR.

Sanders says having the data generated by the AI software is critical to getting physicians on board with the project. “When we deployed the tool for the pneumonia care pathway, our physicians were saying, ‘Oh no, not another tool’,” Sanders says. “I brought in a PIT Crew (physician IT crew) and we went through our data with them. I had physicians in the group going through the analysis and they saw that the data was real. We went into the EMR to make sure the data was in fact valid, and after they realized that, then they began to look at the outcomes, the length of stay, the drop in readmissions and how the costs dropped, and they were on board right away.”

The majority of Flagler physicians are adhering to the new care path, according to reports generated by the AI software's adherence application. The care paths effectively sourced the best practices from the hospital’s best doctors using the hospital’s own patient groups, and that is key, Sanders notes.
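One simple way to picture adherence reporting like the kind described above: a physician "adheres" to a published care path if the path's steps appear, in order, among the orders that physician actually placed. The step names and physician labels below are invented for illustration; they are not Flagler's actual order sets.

```python
# Hypothetical sketch of care-path adherence reporting. A physician
# adheres if the care path's steps occur as an ordered subsequence of
# the orders actually placed. Names are invented for illustration.

def follows_path(path: list, orders: list) -> bool:
    """True if `path` is an ordered subsequence of `orders`."""
    it = iter(orders)
    return all(step in it for step in path)  # `in` consumes the iterator

def adherence_rate(path: list, orders_by_physician: dict) -> float:
    hits = sum(follows_path(path, o) for o in orders_by_physician.values())
    return hits / len(orders_by_physician)

path = ["chest_xray", "blood_culture", "antibiotics", "nebulizer"]
orders = {
    "dr_a": ["chest_xray", "blood_culture", "antibiotics", "nebulizer"],
    "dr_b": ["chest_xray", "antibiotics", "blood_culture", "nebulizer"],  # out of order
    "dr_c": ["cbc", "chest_xray", "blood_culture", "antibiotics", "nebulizer"],
}
print(adherence_rate(path, orders))  # 2 of 3 physicians adhered
```

Note that extra orders (like dr_c's CBC) do not break adherence here; only missing or reordered path steps do. A production report would also track the timing of each step, which the article says the care paths specify.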

“When we had conversations with physicians about the data, some would say, ‘My patient is sicker than yours,’ or ‘I have a different patient population.’ However, we can drill down to the physician’s patients and show the physician where things are. It’s not based on an ivory tower analysis, it’s based on our own data. And, yes, our patients, and our community, are unique—a little older than most, and we have a lot of Europeans here visiting. We have some challenges, but this tool is taking our data and showing us what we need to pursue. That’s pretty powerful.”

He adds, “It’s been amazing to see physicians rally around this. We just never had the tool before that could do this.”

While Flagler Hospital is a small community hospital with fewer resources than academic medical centers or larger health systems—for example, the hospital doesn’t have a dedicated data scientist but rather uses its in-house informatics staff for this project—the hospital is progressive in its use of advanced analytics, according to Sanders.

“We’ve been able to do a lot of querying ourselves, and we have some sepsis predictive models that we’ve created and put into place. We do a lot of real-time monitoring for sepsis and central line-associated bloodstream infections,” he says. “Central line-associated bloodstream infections are a bane for all hospitals. In the past year and a half, since we’ve put in our predictive model, we’ve had zero bloodstream infections, and that’s just unheard of.”

Sanders and his team plan to continue to use the AI tool to analyze new data and adjust the care paths according to new discoveries. As the algorithms find more effective and efficient ways to deliver care that result in better outcomes, Flagler will continue to improve its care paths and measure the adherence of its providers.

There continues to be growing interest, and also some hype, around AI tools, but Sanders notes that AI and machine learning are simply another tool. “Historically, what we’ve done is that we had an idea of what we wanted to do, conducted a clinical trial and then proved or disproved the hypothesis, based on the data that we collected. We have a tool with AI which can basically show us relationships that we didn’t know even existed and answer questions that we didn’t know to ask. I think it’s going to open up a tremendous pathway in medicine for us to both reduce cost, improve care and really take better care of our patients,” he says, adding, “When you can say that to physicians, they are on board. They respond to the data.”

 



At RSNA 2018, An Intense Focus on Artificial Intelligence

November 29, 2018
by Mark Hagland, Editor-in-Chief
Artificial intelligence solutions—and discussions—were everywhere at RSNA 2018 this week

Artificial intelligence solutions—and certainly, the promotion of such solutions—were everywhere this year at the RSNA Conference, held this week at Chicago’s vast McCormick Place, where nearly 49,000 attendees took in clinical education sessions and nearly 700 vendor exhibits—and where AI and machine learning promotions and discussions dominated.

Scanning the exhibit floor on Monday, Glenn Galloway, CIO of the Center for Diagnostic Imaging, an ambulatory imaging center in the Minneapolis suburb of St. Louis Park, Minn., noted that “There’s a lot of focus on AI this year. We’re still trying to figure out exactly what it is; I think a lot of people are doing the same, with AI.” In terms of whether what’s being pitched is authentic solutions, vaporware, or something in between, Galloway said, “I think it’s all that. I think there will be some solutions that live and survive. There are some interesting concepts of how to deliver it. We’ve been talking to a few folks. But the successful solutions are going to be very focused; not just AI for a lung, but for a lung and some very specific diagnoses, for example.” And what will be most useful? According to Galloway, “Two things: AI for the workflow and the quality. And there’ll be some interesting things for what it will do for the quality and the workflow.”

“Certainly, this is another year where machine learning is absolutely dominating the conversation,” said James Whitfill, M.D., CMO at Innovation Care Partners in Scottsdale, Ariz., on Monday. “In radiology, we continue to be aware of how the hype of machine learning is giving way to the reality; that it’s not a wholesale replacement of physicians. There have already been tremendous advances in, for example, interpreting chest x-rays; some of the work that Stanford’s done. They’ve got algorithms that can diagnose 15 different pathological findings. So there is true material advancement taking place.”

Meanwhile, Dr. Whitfill said, “At the same time, people are realizing that coming up with the algorithm is one piece, but that there are surprising complications. So you develop an algorithm on Siemens equipment, but when you move to Fuji, the algorithm fails—it no longer reliably identifies pathology, because it turns out you have to train the algorithm not just on examples from one manufacturer, but from lots of manufacturers. We continue to find that these algorithms are not as consistent as identifying yourself on Facebook, for example. It’s turning out that radiology is way more complex. We take images on lots of different machines. So huge strides are being made,” he said. “But it’s very clear that human and machine learning together will create the breakthroughs. We talk about physician burnout, and even physicians leaving. I think that machine learning offers a good chance of removing a lot of the drudgery in healthcare. If we can automate some processes, then it will free up our time for quality judgment, and also to spend time talking to patients, not just staring at the screen.”


Looking at the hype cycle around AI

Of course, inevitably, there was talk around the hype cycle involving artificial intelligence. One of those engaging in that discussion was Paul Chang, M.D., a practicing radiologist and medical director of enterprise imaging at the University of Chicago, who gave a presentation on Tuesday about AI. According to a report by Michael Walter in Radiology Business, Dr. Chang said, “AI is not new or spooky. It’s been around for decades. So why the hype?” He described computer-aided detection (CAD) as a form of artificial intelligence, one that radiologists have been making use of for years.

Meanwhile, with regard to the new form of AI, and the inevitable hype cycle around emerging technologies, Dr. Chang said during his presentation that “When you’re going up the ride, you get excited. But then right at the top, before you are about to go down, you have that moment of clarity—‘What am I getting myself into?’—and that’s where we are now. We are upon that crest of magical hype and we are about to hit the trough of disillusionment.” Still, he told his audience, “It is worth the rollercoaster of hype. But I’m here to tell you that it’s going to take longer than you think.”

So, which artificial intelligence-based solutions will end up going the distance? On a certain level, the answer to that question is simple, said Joe Marion, a principal at the Waukesha, Wis.-based Healthcare Integration Strategies LLC and one of the imaging informatics industry’s most respected observers. “I think it’s going to be the value of the product,” said Marion, who has participated in 42 RSNA conferences, “and also the extent to which the vendors will make their products flexible in terms of being interfaced with others, so there’s this integration aspect, folding into vendor A, vendor B, vendor C, etc. So for a third party, the more they reach out and create relationships, the more successful they’ll be. A lot of it will come down to clinical value, though. Watson has had problems in that people have said, it’s great, but where’s the clinical value? So the ones that succeed will be the ones that find the most clinical value.”

Still, Marion noted, even the concept of AI, as applied to imaging informatics, remains an area with some aspects lacking in clarity. “The reality,” he said, “is that I think it means different things to different people. The difference between last year and this year is that some things are coming to fruition; it’s more real. And so some vendors are offering viable solutions. The message I’m hearing from vendors this year is, I have this platform, and if a third party wants to develop an application, or I develop an application, or even an academic institution develops a solution, I can run it on my platform. They’re trying to become as vendor-agnostic as possible.”

Marion expressed surprise at the seemingly all-encompassing focus on artificial intelligence this year, given the steady march towards value-based healthcare-driven mandates. “Outside of one vendor, I’m not really seeing a whole lot of emphasis this year on value-based care; that’s disappointing,” Marion said. “I don’t know whether people don’t get it or not about value-based care, but the vendors are clearly more focused on AI right now.”

Might next year prove to be different? Yes, absolutely, especially given the mandates coming out of the Protecting Access to Medicare Act (PAMA), which will require referring providers to consult appropriate use criteria (AUC) prior to ordering advanced diagnostic imaging services—CT, MR, nuclear medicine and PET—for Medicare patients. The federal Centers for Medicare and Medicaid Services (CMS) will proceed with a phased rollout of the CDS mandate, as the American College of Radiology (ACR) explains on its website, with voluntary reporting of the use of AUC taking place until December 2019, and mandatory reporting beginning in January 2020.

But for now, this certainly was the year of the artificial intelligence focus at the RSNA Conference. Only time will tell how that focus plays out in the imaging and imaging informatics vendor space within the coming 12 months, before RSNA 2019 kicks off one year from now, at the conference’s perennial location, McCormick Place.

 

 


