
A Look at Mayo Clinic’s Daring Enterprise Analytics Leap

September 30, 2016
by Mark Hagland
Mayo Clinic’s leaders have made tremendous progress around quality measures reporting

Leaders at the Rochester, Minn.-based Mayo Clinic have been making tremendous progress lately in an area of great interest across U.S. healthcare: they have been building an enterprise-wide data analytics program. And that was the subject of a presentation on Sep. 28 by Dwight Brown, Mayo’s director of enterprise analytics. Brown spoke on the topic at the Health IT Summit in New York, sponsored by Healthcare Informatics.

Brown offered his presentation, “The Mayo Clinic, Data Mapping and Building a Successful Advanced Data Analytics Program,” to healthcare leaders gathered at the Convene conference center in New York’s Financial District. Joining him was Sanjay Pudupakkam, principal and owner of the Wellington, Fla.-based Avatar Enterprise Business Solutions, which partnered with Brown and his colleagues at Mayo in building their enterprise analytics foundation.

Describing the origins of the work to build an advanced, enterprise-wide data analytics program, Brown told his audience, “We undertook this initiative between five and six years ago, focusing on clinical workflow. Why did we look at this? With healthcare payment reform, it is important to have a good grasp on data centers and to be able to perform data analysis. To remain competitive and viable, patient care organizations need to be able to use data to positively affect the quality of care, contain costs, and manage and administer for quality,” Brown said. “It was also important to ensure the reliability and integrity of the data we had. It’s not enough to put the data in place; you have to have good clinical workflow, or you’ll never get the data set you need.”

Looking back on the situation at the start of the initiative, Brown told his audience, “The problem we ran into is that our internal and external quality measurement was growing too fast; we had all kinds of measures the government was requiring us to report, and the majority of those forms of data were having to be abstracted manually. The data was non-discrete and required manual chart abstraction. And that,” he said, “was a problem for the Mayo Quality Management Services Department. We had had to grow more than threefold over a period of three years,” and even that growth was not keeping pace with the accelerating demand for data reports and analysis.


Dwight Brown


“So we employed Sanjay to come help us,” Brown said, referring to Pudupakkam. “We had kind of an idea of what we needed to do, but needed help. We needed to automate the manual processes to free up our Quality Management staff. It was way too difficult, and way too difficult to train people in manual chart abstraction processes—it was taking a year to train them. We also needed a centralized quality measures metadata repository. And we needed a roadmap.” Emphasizing the size and scope of the initiative they were plunging into, he added, “This was not just a one-time project.”

The core of the challenge, Brown noted, was this: a huge number of quality measures that Mayo leaders needed to respond to and support. Indeed, he said, the analysis performed by the core leadership team he gathered around himself, which included a quality administrator, an associate quality manager, and Brown’s quality measurement system team, found that the organization was having to work with 275 quality measures altogether.

The external quality measures included those related to:

>  PQRI-GPRO

>  Meaningful use

>  Stroke

>  VTE (venous thromboembolism) prophylaxis

>  Leapfrog Group

>  Hospital outpatient measures

>  Core measures

>  AHRQ (Agency for Healthcare Research and Quality) safety indicators

>  Minnesota community measures

>  Minnesota-based inpatient psychiatric measures

>  Minnesota public reporting measures

 

Internal measures included those related to:

>  Hospital standardized mortality ratio

>  Mortality and morbidity

>  Arizona surgical-based reporting

>  Infectious diseases

>  Adverse events

>  Specialty councils


“The challenge,” Brown told his audience, “was this: how do we automate or semi-automate so many measures? What data elements are needed to support these measures? What data elements are common and which are different among these measures? Are these data elements even being captured consistently across the EMR? Which data elements are discrete and which ones are non-discrete? How do we prioritize the measures for automation?”

An additional hurdle was that Brown and his colleagues had to work with several different electronic health record (EHR) systems. “At the time,” he said, “we had two instances of Cerner and one of IDX.” So, he said, “First, we had to take a measure decomposition approach” to the initiative. As a result, “Each project grouping was bundled into measure groups. That included PQRI, meaningful use, routine internal reporting, routine regulatory reporting, quality measures for specialty consults, and ETS to MIDAS Conversion & our DON Mart,” he said; and, he added, “we had to de-duplicate measures.” The end result? “We found that we had 500 unique data elements across 40 source systems, for those 275 measures—which now number over 320.”
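To make that decomposition and de-duplication step concrete, here is a minimal sketch of how a measure-to-data-element mapping can be boiled down to a unique element inventory; the measure names, element names, and source systems below are hypothetical examples, not Mayo's actual inventory.

```python
# Illustrative sketch only: a toy "measure decomposition" step, assuming each
# quality measure is described by the data elements it needs and the source
# system each element lives in. All names and values are hypothetical.

measures = {
    "PQRI-GPRO: diabetes HbA1c control": {
        ("hba1c_result", "lab_system"),
        ("diabetes_diagnosis", "problem_list"),
    },
    "Meaningful use: BMI screening and follow-up": {
        ("height", "vitals"),
        ("weight", "vitals"),
        ("followup_plan_documented", "clinical_notes"),
    },
    "Core measure: VTE prophylaxis": {
        ("vte_prophylaxis_ordered", "order_entry"),
        ("admission_datetime", "adt_system"),
    },
}

# De-duplicate: the same data element feeding several measures is counted once.
unique_elements = {elem for elems in measures.values() for elem in elems}
source_systems = {source for _, source in unique_elements}

print(f"{len(measures)} measures decompose into "
      f"{len(unique_elements)} unique data elements "
      f"across {len(source_systems)} source systems")
```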

The key challenges that Brown and his team uncovered and addressed included the following:

>  Multiple, disparate data sources (over 40 sources of data altogether)

>  Data elements were difficult to locate—there was no “single source of truth”; in fact, there was inconsistency in how the data was being entered

>  Multiple sources for a single data element: there were data elements that were discrete in the EHR but were not being used; for some data elements, there were up to five different sources

>  A mix of what turned out to be discrete, non-discrete, and mixed data elements

>  Over 75 percent of the data needed was found in text fields

>  Identifying measures and data elements by walking through the abstraction process at each site

>  The team also identified and recommended changes for data entry and chart abstraction standardization and consistency

“Meaningful use really precipitated this,” Brown told his audience. “We had to provide the quality measures and report them on an automated basis. Here’s one example, around one meaningful use measure. For patients with a certain BMI [body mass index], you’re supposed to provide weight counseling for those patients. But physicians would say, ‘I’m going to talk with the patient, but document that in my text note.’ So when patients would check in, we would document their weight and height. And if they were over a certain BMI, the system would kick out a ‘fat sheet’ for that patient, without intervention from the physician,” making it unnecessary for physicians to add that data point to their notes.
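As a rough illustration of the kind of rule Brown described, the sketch below flags a checked-in patient for an automatically generated counseling sheet once BMI crosses a threshold; the threshold, field names, and function names are assumptions for illustration, not Mayo's actual logic.

```python
# Minimal sketch, assuming height and weight are captured discretely at
# check-in; the 30 kg/m^2 threshold and names are hypothetical.

def bmi(height_m: float, weight_kg: float) -> float:
    """Body mass index in kg/m^2."""
    return weight_kg / (height_m ** 2)

def needs_weight_counseling_sheet(height_m: float, weight_kg: float,
                                  threshold: float = 30.0) -> bool:
    """Flag the patient for an automatically printed counseling sheet,
    so the physician does not have to re-document BMI in a text note."""
    return bmi(height_m, weight_kg) >= threshold

# Example: checked-in patient at 1.70 m and 95 kg -> BMI ~32.9, sheet prints.
print(needs_weight_counseling_sheet(1.70, 95.0))  # True
```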

So, what has been learned from all of this? “One of the key takeaways we’ve had so far,” Brown said, “is this: we used a big Excel spreadsheet to document all the data elements. We still use that; we call it the Sanjay Sheet. We use it to continue to look at the data elements.” So, the key learning there has been the awareness of the need to systematically and strategically approach the management of data collection, analysis, and reporting. In that regard, he noted, “We have quality informatics specialists now.” Brown and his colleagues have taken several staff members, primarily nurses, and have trained them to be able to analyze quality measure-related data at a granular level. “Sanjay helped to train these folks,” he noted, speaking of Pudupakkam. “Among the questions we had to address: how do you look for the data elements? How do you look for the metadata? How do you track the data? They really brought the knowledge and expertise to understand available data,” he said of the Avatar consultants.

And, asked by an audience member how he and his colleagues acquired quality informatics specialists, Brown said, “Did we essentially build the set of skills for the quality informatics specialists? We utilized our nurse abstracters, and we had a few with an interest in informatics, so they went back for training; one actually went back and got a master’s in informatics. We also had other informatics specialists within the organization. So we did some of that, and combined it with the workflow we had created with the metadata repository. And altogether, we built a set of two or three individuals who could then lead the initiative. We had a couple of people who had that education who could be the team leaders and work together with the nurse specialists. They had all spent so much time with the medical record as it was, that it was a pretty easy leap for us.”

Drawing conclusions

Looking at the big picture around the initiative so far, Brown said, “This has been a foundational initiative that has laid the groundwork for future IT applications and analytics projects.” In addition, he said, “Our focus on clinical workflow processes forced compliance on the data entry of vital data elements.” What’s more, he said, success has required a “clear, focused methodology.”

The results have been very gratifying, Brown said. “At the outset of this initiative, 75 percent of the data elements” required for quality measure reporting “were in text fields; by the end of 2015, only 50 percent were.” The methodology used also provided a jump start on the meaningful use eCQMs, or electronically reported clinical quality measures, he noted.

Asked by an audience member whether the initiative has enabled the improvement of self-service, Brown said, “Yes, absolutely. In the past,” he conceded, “we in analytics did the ‘dump and run, don’t ask us what the data means’ approach. Now, it’s much more collaborative. And getting data into a good format for end-users remains an issue. Ultimately, we want to help enable more extensive self-service. But it’s in evolution right now, because we don’t have every metric in some sort of self-service portal; but it’s certainly moving in that direction.”

And, asked by an audience member what role culture has played in enabling the initiative, Brown said, “We’re a large organization. And we’re a committee-driven organization. And so to a certain extent, we were actually working against the culture of the organization, in that we needed to move quickly on it. Once people saw that we had good success in these metrics, we were able to get a lot more cooperation” from end-users across the organization around analytics and reporting issues. “But really, it was us moving against our typical culture,” taking a harder-driving, faster-moving approach than usual, he conceded. “But we had to show our success in order to not have somebody come down and say, that’s it, you’re done. When you have success happening, it’s really hard for people to sunset things.” In the end, he said, the initiative has shown how successful an effort like this can be when it is strategic, comprehensive, collaborative, and forward-thinking.

 





Have CIOs’ Top Priorities for 2018 Become a Reality?

December 12, 2018
by Rajiv Leventhal, Managing Editor

In comparing healthcare CIOs’ priorities at the end of 2017 to this current moment, new analysis has found that core clinical IT goals have shifted from focusing on EHR (electronic health record) integration to data analytics.

In December 2017, hospital CIOs said they planned to focus mostly on EHR integration, mobile adoption and physician buy-in, according to a survey of College of Healthcare Information Management Executives (CHIME) member CIOs conducted at the time by Spok, a Springfield, Va.-based clinical communications solutions company.

The survey from one year ago found that, across hospitals, 40 percent of CIO respondents said deploying an enterprise analytics platform was a top priority in 2018. Seventy-one percent of respondents cited integrating with the EHR as a top priority, and 62 percent said physician adoption and buy-in for secure messaging was a top priority for the next 18 months. What’s more, 38 percent said optimizing EHR integration with other hospital systems was a key focus for 2018.

Spok researchers were curious whether their predictions became reality, so they analyzed several industry reports and asked a handful of CIOs to recap their experiences from 2018. The most up-to-date responses revealed that compared to last year when just 40 percent of CIOs said they were deploying an enterprise analytics platform in 2018, harnessing data analytics looks to be a huge priority in 2019: 100 percent of the CIOs reported this as top of mind.

Further comparisons on 2018 predictions to realities included:

  • 62 percent of CIOs predicted 2018 as the year of EHR integration; 75 percent reported they are now integrating patient monitoring data
  • 79 percent said they were selecting and deploying technology primarily for secure messaging; now, 90 percent of hospitals have adopted mobile technology and report that it’s helping improve patient safety and outcomes
  • 54 percent said the top secure messaging challenge was adoption/buy-in; now, 51 percent say they involve clinicians in mobile policy and adoption

What’s more, regarding future predictions, 87 percent of CIOs said they expect to increase spending on cybersecurity in 2019, and in three years from now, 60 percent of respondents expect data to be stored in a hybrid/private cloud.

CIOs also expressed concern regarding big tech companies such as Apple, Amazon and Google disrupting the healthcare market; 70 percent said they were somewhat concerned.



How One Community Hospital is Leveraging AI to Bolster Its Care Pathways Process

December 6, 2018
by Heather Landi, Associate Editor

Managing clinical variation continues to be a significant challenge facing most hospitals and health systems today as unwarranted clinical variation often results in higher costs without improvements to patient experience or outcomes.

Like many other hospitals and health systems, Flagler Hospital, a 335-bed community hospital in St. Augustine, Florida, had a board-level mandate to address its unwarranted clinical variation with the goal of improving outcomes and lowering costs, says Michael Sanders, M.D., Flagler Hospital’s chief medical information officer (CMIO).

“Every hospital has been struggling with this for decades, managing clinical variation,” he says, noting that traditional methods of addressing clinical variation management have been inefficient, as developing care pathways, which involves identifying best practices for high-cost procedures, often takes up to six months or even years to develop and implement. “By the time you finish, it’s out of date,” Sanders says. “There wasn’t a good way of doing this, other than picking your spots periodically, doing analysis and trying to make sense of the data.”

What’s more, available analytics software is incapable of correlating all the variables within the clinical, billing, analytics and electronic health record (EHR) databases, he notes.

Another limitation is that care pathways are vulnerable to the biases of the clinicians involved, Sanders says. “In medicine, what we typically do is we’ll have an idea of what we want to study, design a protocol, and then run the trial and collect the data that we think is important and then we try to disprove or prove our hypothesis,” he says.


Sanders says he was intrigued by advances in machine learning tools and artificial intelligence (AI) platforms capable of applying advanced analytics to identify hidden patterns in data.

Working with Palo Alto, Calif.-based machine intelligence software company Ayasdi, Flagler Hospital initiated a pilot project to use Ayasdi’s clinical variation management application to develop care pathways for both acute and non-acute conditions and then measure adherence to those pathways.

Michael Sanders, M.D.

Flagler targeted their treatment protocols for pneumonia as an initial care process model. “We kicked around the idea of doing sepsis first, because it’s a huge problem throughout the country. We decided to use pneumonia first to get our feet wet and figure out how to use the tool correctly,” he says.

The AI tools from Ayasdi revealed new, improved care pathways for pneumonia after analyzing thousands of patient records from the hospital and identifying the commonalities between those with the best outcomes. The application uses unsupervised machine learning and supervised prediction to optimally align the sequence and timing of care with the goal of optimizing for patient outcomes, cost, readmissions, mortality rate, provider adherence, and other variables.

The hospital quickly implemented the new pneumonia pathway by changing the order set in its Allscripts EHR system. As a result, for the pneumonia care path, Flagler Hospital saved $1,350 per patient and reduced the length of stay (LOS) for these patients by two days, on average. What’s more, the readmission rate dropped roughly sevenfold, from 2.9 percent to 0.4 percent, hospital officials report. The initial work saved nearly $850,000 in unnecessary costs—costs were trimmed by eliminating labs, X-rays and other processes that did not add value, and through the reductions in length of stay and readmissions.

“Those results are pretty amazing,” Sanders says. “It’s taking our data and showing us what we need to pursue. That’s powerful.”

With the success of the pneumonia care pathway, Flagler Hospital leaders also deployed a new sepsis pathway. The hospital has expanded its plans for using Ayasdi to develop new care pathways, from the original plan of tackling 12 conditions over three years, to now tackling one condition per month. Future plans are to tackle heart failure, total hip replacement, chronic obstructive pulmonary disease (COPD), coronary artery bypass grafting (CABG), hysterectomy and diabetes, among other conditions. Flagler Hospital expects to save at least $20 million from this program in the next three years, according to officials.

Finding the “Goldilocks” group

Strong collaboration between IT and physician teams has been a critical factor in deploying the AI tool and to continue to successfully implement new care pathways, Sanders notes.

The effort to create the first pathway began with the IT staff writing structured query language (SQL) code to extract the necessary data from the hospital’s Allscripts EHR, enterprise data warehouse, surgical, financial and corporate performance systems. This data was brought into the clinical variation management application using the FHIR (Fast Healthcare Interoperability Resources) standard.
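The pattern Sanders describes—SQL extraction from source systems, then hand-off as FHIR resources—might look roughly like the sketch below; the table, columns, code values and data are hypothetical stand-ins, not Flagler's or Ayasdi's actual integration code.

```python
# Hedged sketch of the extract-and-reshape pattern described above: rows pulled
# with SQL are wrapped as FHIR R4-style resources before being handed to the
# analytics application. All table, column, and patient values are invented.
import json
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for the enterprise data warehouse
conn.executescript("""
    CREATE TABLE lab_results (patient_id TEXT, loinc_code TEXT, value REAL,
                              unit TEXT, taken_at TEXT);
    INSERT INTO lab_results VALUES
        ('123', '33959-8', 0.6, 'ng/mL', '2018-10-01T08:30:00Z');
""")

rows = conn.execute(
    "SELECT patient_id, loinc_code, value, unit, taken_at FROM lab_results"
).fetchall()

def to_fhir_observation(patient_id, loinc_code, value, unit, taken_at):
    """Wrap one extracted row as a FHIR Observation resource (JSON dict)."""
    return {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org", "code": loinc_code}]},
        "subject": {"reference": f"Patient/{patient_id}"},
        "effectiveDateTime": taken_at,
        "valueQuantity": {"value": value, "unit": unit},
    }

bundle = {
    "resourceType": "Bundle",
    "type": "collection",
    "entry": [{"resource": to_fhir_observation(*row)} for row in rows],
}
print(json.dumps(bundle, indent=2))  # payload handed to the analytics application
```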

“That was a major effort, but some of us had been data scientists before we were physicians, and so we parameterized all these calls. The first pneumonia care path was completed in about nine weeks. We’ve turned around and did a second care path, for sepsis, which is much harder, and we’ve done that in two weeks. We’ve finished sepsis and have moved on to total hip and total knee replacements. We have about 18 or 19 care paths that we’re going to be doing over the next 18 months,” he says.

After being fed data on past pneumonia treatments, the software automatically created cohorts of patients who had similar outcomes, accompanied by the treatments they received at particular times and in what sequence. The program also calculated the direct variable costs, average lengths of stay, readmission and mortality rates for each of those cohorts, along with the statistical significance of its conclusions. Each group had different comorbidities, such as diabetes, COPD and heart failure, which were factored into the application's calculations. At the push of a button, the application created a care path based on the treatment given to the patients in each cohort.
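The per-cohort summary described here could be computed along these lines; the column names, cohort labels, and numbers below are invented purely for illustration and are not Flagler's data.

```python
# Toy illustration of summarizing cost, length of stay, readmission and
# mortality by patient cohort; all values are made up.
import pandas as pd

patients = pd.DataFrame({
    "cohort":      ["A", "A", "B", "B", "B", "C"],
    "direct_cost": [6200, 5800, 9400, 9100, 8800, 12300],
    "los_days":    [3, 4, 6, 5, 7, 9],
    "readmitted":  [0, 0, 1, 0, 1, 1],
    "died":        [0, 0, 0, 0, 0, 1],
})

cohort_stats = patients.groupby("cohort").agg(
    n=("direct_cost", "size"),
    avg_cost=("direct_cost", "mean"),
    avg_los=("los_days", "mean"),
    readmission_rate=("readmitted", "mean"),
    mortality_rate=("died", "mean"),
)
print(cohort_stats)
```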

The findings were then reviewed with the physician IT group, or what Sanders calls the PIT crew, to select what they refer to as the “Goldilocks” cohort. “This is a group of patients that had the combination of low cost, short length of stay, low readmissions and almost zero mortality rate. We then can publish the care path and then monitor adherence to that care path across our physicians,” Sanders says.
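Selecting the "Goldilocks" cohort then amounts to picking the group that does well on all of those metrics at once. The sketch below uses simple min-max normalization and equal weighting, which is an assumption made for illustration rather than the vendor's actual scoring method.

```python
# Hedged sketch of picking a "Goldilocks" cohort: normalize each metric so
# lower is better, then take the cohort with the best combined score.
# The cohort statistics and the equal-weight scoring are hypothetical.
import pandas as pd

cohort_stats = pd.DataFrame({
    "avg_cost":         [6000.0, 9100.0, 12300.0],
    "avg_los":          [3.5, 6.0, 9.0],
    "readmission_rate": [0.00, 0.33, 1.00],
    "mortality_rate":   [0.00, 0.00, 1.00],
}, index=["A", "B", "C"])

# Min-max normalize each metric to [0, 1] (0 = best observed, 1 = worst),
# then average across metrics; equal weights are an arbitrary assumption.
normalized = (cohort_stats - cohort_stats.min()) / (cohort_stats.max() - cohort_stats.min())
score = normalized.mean(axis=1)

goldilocks = score.idxmin()
print(f"Goldilocks cohort: {goldilocks}")  # cohort A in this toy example
```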

The AI application uncovered relationships and patterns that physicians either would not have identified or would have taken much longer to identify, Sanders says. For instance, the analysis revealed that for patients with pneumonia and COPD, beginning nebulizer treatments early in their hospital stays improved outcomes tremendously, hospital leaders report.

The optimal events, sequence, and timing of care were presented to the physician team using an intuitive interface that allowed them to understand exactly why each step, and the timing of the action, was recommended. Upon approval, the team operationalized the new care path by revising the emergency-department and inpatient order sets in the hospital EHR.

Sanders says having the data generated by the AI software is critical to getting physicians on board with the project. “When we deployed the tool for the pneumonia care pathway, our physicians were saying, ‘Oh no, not another tool’,” Sanders says. “I brought in a PIT Crew (physician IT crew) and we went through our data with them. I had physicians in the group going through the analysis and they saw that the data was real. We went into the EMR to make sure the data was in fact valid, and after they realized that, then they began to look at the outcomes, the length of stay, the drop in readmissions and how the costs dropped, and they were on board right away.”

The majority of Flagler physicians are adhering to the new care path, according to reports generated by the AI software's adherence application. The care paths effectively sourced the best practices from the hospital’s best doctors using the hospital’s own patient groups, and that is key, Sanders notes.

“When we had conversations with physicians about the data, some would say, ‘My patient is sicker than yours,’ or ‘I have a different patient population.’ However, we can drill down to the physician’s patients and show the physician where things are. It’s not based on an ivory tower analysis, it’s based on our own data. And, yes, our patients, and our community, are unique—a little older than most, and we have a lot of Europeans here visiting. We have some challenges, but this tool is taking our data and showing us what we need to pursue. That’s pretty powerful.”

He adds, “It’s been amazing to see physicians rally around this. We just never had the tool before that could do this.”

While Flagler Hospital is a small community hospital with fewer resources than academic medical centers or larger health systems—for example, the hospital doesn’t have a dedicated data scientist but rather uses its in-house informatics staff for this project—the hospital is progressive in its use of advanced analytics, according to Sanders.

“We’ve been able to do a lot of querying ourselves, and we have some sepsis predictive models that we’ve created and put into place. We do a lot of real-time monitoring for sepsis and central line-associated bloodstream infections,” he says. “Central line-associated bloodstream infections are a bane for all hospitals. In the past year and a half, since we’ve put in our predictive model, we’ve had zero bloodstream infections, and that’s just unheard of.”

Sanders and his team plan to continue to use the AI tool to analyze new data and adjust the care paths according to new discoveries. As the algorithms find more effective and efficient ways to deliver care that result in better outcomes, Flagler will continue to improve its care paths and measure the adherence of its providers.

There continues to be growing interest, and also some hype, around AI tools, but Sanders notes that AI and machine learning are simply another tool. “Historically, what we’ve done is that we had an idea of what we wanted to do, conducted a clinical trial and then proved or disproved the hypothesis, based on the data that we collected. We have a tool with AI which can basically show us relationships that we didn’t know even existed and answer questions that we didn’t know to ask. I think it’s going to open up a tremendous pathway in medicine for us to both reduce cost, improve care and really take better care of our patients,” he says, adding, “When you can say that to physicians, they are on board. They respond to the data.”

 




At RSNA 2018, An Intense Focus on Artificial Intelligence

November 29, 2018
by Mark Hagland, Editor-in-Chief
Artificial intelligence solutions—and discussions—were everywhere at RSNA 2018 this week

Artificial intelligence solutions—and certainly, the promotion of such solutions—were everywhere this year at the RSNA Conference, held this week at Chicago’s vast McCormick Place, where nearly 49,000 attendees took in clinical education sessions and viewed nearly 700 vendor exhibits. AI and machine learning promotions, and discussions of them, dominated the floor.

Scanning the exhibit floor on Monday, Glenn Galloway, CIO of the Center for Diagnostic Imaging, an ambulatory imaging center in the Minneapolis suburb of St. Louis Park, Minn., noted that “There’s a lot of focus on AI this year. We’re still trying to figure out exactly what it is; I think a lot of people are doing the same, with AI.” In terms of whether what’s being pitched is authentic solutions, vaporware, or something in between, Galloway said, “I think it’s all that. I think there will be some solutions that live and survive. There are some interesting concepts of how to deliver it. We’ve been talking to a few folks. But the successful solutions are going to be very focused; not just AI for a lung, but for a lung and some very specific diagnoses, for example.” And what will be most useful? According to Galloway, “Two things: AI for the workflow and the quality. And there’ll be some interesting things for what it will do for the quality and the workflow.”

“Certainly, this is another year where machine learning is absolutely dominating the conversation,” said James Whitfill, M.D., CMO at Innovation Care Partners in Scottsdale, Ariz., on Monday. “In radiology, we continue to be aware of how the hype of machine learning is giving way to the reality; that it’s not a wholesale replacement of physicians. There have already been tremendous advances in, for example, interpreting chest x-rays; some of the work that Stanford’s done. They’ve got algorithms that can diagnose 15 different pathological findings. So there is true material advancement taking place.”

Meanwhile, Dr. Whitfill said, “At the same time, people are realizing that coming up with the algorithm is one piece, but that there are surprising complications. So you develop an algorithm on Siemens equipment, but when you go to Fuji, the algorithm fails—it no longer reliably identifies pathology, because it turns out you have to train the algorithm not just on examples from one manufacturer, but from lots of manufacturers. We continue to find that these algorithms are not as consistent as identifying yourself on Facebook, for example. It’s turning out that radiology is way more complex. We take images on lots of different machines. So huge strides are being made,” he said. “But it’s very clear that human and machine learning together will create the breakthroughs. We talk about physician burnout, and even physicians leaving. I think that machine learning offers a good chance of removing a lot of the drudgery in healthcare. If we can automate some processes, then it will free up our time for quality judgment, and also to spend time talking to patients, not just staring at the screen.”


Looking at the hype cycle around AI

Of course, inevitably, there was talk around the talk of the hype cycle involving artificial intelligence. One of those engaging in that discussion was Paul Chang, M.D., a practicing radiologist and medical director of enterprise imaging at the University of Chicago. Dr. Chang gave a presentation on Tuesday about AI. According to a report by Michael Walter in Radiology Business, Dr. Chang said, “AI is not new or spooky. It’s been around for decades. So why the hype?” He described computer-aided detection (CAD) as a form of artificial intelligence, one that radiologists have been making use of for years.

Meanwhile, with regard to the new form of AI, and the inevitable hype cycle around emerging technologies, Dr. Chang said during his presentation that “When you’re going up the ride, you get excited. But then right at the top, before you are about to go down, you have that moment of clarity—‘What am I getting myself into?’—and that’s where we are now. We are upon that crest of magical hype and we are about to get the trench of disillusionment.” Still, he told his audience, “It is worth the rollercoaster of hype. But I’m here to tell you that it’s going to take longer than you think.”

So, which artificial intelligence-based solutions will end up going the distance? On a certain level, the answer to that question is simple, said Joe Marion, a principal in the Waukesha, Wis.-based Healthcare Integration Strategies LLC, and one of the imaging informatics industry’s most respected observers. “I think it’s going to be the value of the product,” said Marion, who has participated in 42 RSNA conferences; “and also the extent to which the vendors will make their products flexible in terms of being interfaced with others, so there’s this integration aspect, folding into vendor A, vendor B, vendor C, etc. So for a third party, the more they reach out and create relationships, the more successful they’ll be. A lot of it will come down to clinical value, though. Watson has had problems in that people have said, it’s great, but where’s the clinical value? So the ones that succeed will be the ones that find the most clinical value.”

Still, Marion noted, even the concept of AI, as applied to imaging informatics, remains an area with some areas lacking in clarity. “The reality,” he said, “is that I think it means different things to different people. The difference between last year and this year is that some things are coming to fruition; it’s more real. And so some vendors are offering viable solutions. The message I’m hearing from vendors this year is, I have this platform, and if a third party wants to develop an application or I develop an application, or even an academic institution develops a solution, I can run it on my platform. They’re trying to become as vendor-agnostic as possible.”

Marion expressed surprise at the seemingly all-encompassing focus on artificial intelligence this year, given the steady march towards value-based healthcare-driven mandates. “Outside of one vendor, I’m not really seeing a whole lot of emphasis this year on value-based care; that’s disappointing,” Marion said. “I don’t know whether people don’t get it or not about value-based care, but the vendors are clearly more focused on AI right now.”

Might next year prove to be different? Yes, absolutely, especially given the mandates coming out of the Protecting Access to Medicare Act (PAMA), which will require referring providers to consult appropriate use criteria (AUC) prior to ordering advanced diagnostic imaging services—CT, MR, nuclear medicine and PET—for Medicare patients. The federal Centers for Medicare and Medicaid Services (CMS) will proceed with a phased rollout of the CDS mandate, as the American College of Radiology (ACR) explains on its website, with voluntary reporting of the use of AUC taking place until December 2019, and mandatory reporting beginning in January 2020.

But for now, this certainly was the year of the artificial intelligence focus at the RSNA Conference. Only time will tell how that focus plays out in the imaging and imaging informatics vendor space within the coming 12 months, before RSNA 2019 kicks off one year from now, at the conference’s perennial location, McCormick Place.

 

 

