6 Keys to Building an Effective Analytics Program | Healthcare Informatics Magazine | Health IT | Information Technology

6 Keys to Building an Effective Analytics Program

January 3, 2017
by George Reynolds, M.D., Principal, Reynolds Healthcare Advisers

Healthcare providers have reached a tipping point. As little as ten years ago, they struggled to collect accurate, actionable clinical data. With much of it locked away in siloed electronic health records (EHRs) or paper patient charts, the best most organizations could do was rely on claims data that was optimized for billing purposes—not patient care.

Now that the vast majority of hospitals and large physician practice groups are using EHRs, healthcare provider organizations face an entirely new set of challenges. Access to clinical and operational data is no longer the primary challenge—many of these organizations are drowning in data. Instead, their challenge is turning this data into actionable information that their leaders and clinicians can use to make decisions. As the healthcare system (slowly) shifts from volume-based reimbursement to value-based reimbursement, organizations must be able to leverage their data to prioritize competing clinical and operational opportunities to improve efficiency, reduce unnecessary variation and waste, and identify and address gaps in quality of care.

Clearly, the need for a robust analytics program that can address these challenges has never been greater. Yet many, if not most healthcare providers struggle to develop an effective program. This article examines the analytics programs of four successful organizations that have made the transition to a data-driven culture. What organizational features do these programs have in common, and what lessons have they learned along the way?

The organizations include an academic medical center, two community-based hospital systems, and a regional system with 45 hospitals in four states. Despite these very different organizational dynamics, these four analytics programs have many things in common. Each has strong clinical leadership. While most healthcare systems operate reporting and analytics as a unit within IT, each of the programs reviewed here is a distinct entity with a charter, a clear reporting organization and a robust governance structure. Each system uses Epic (Epic Systems Corporation, Verona, Wisc.) as its primary electronic health record (EHR). Some of them refer to the analytics tools they have built as ‘dashboards’ while others refer to them as ‘apps’—all of them have chosen QlikView (QlikTech International AB) as the data visualization tool for their programs. And each program is relatively small, with staffs ranging from 5 to 10 FTEs, proving that you don’t need a large staff to make a significant impact.

Pilot Projects and Early Wins



The need for a dedicated analytics program is not always immediately obvious. Demonstrating value with a pilot project or a single, well-defined problem is an important first step in building a more comprehensive program. Dr. Cameron Berg is the director of acute care medicine for the clinical integration program at North Collaborative Care, the ACO for North Memorial Health Care in Minneapolis. His experience in starting his program is fairly typical. He had done some initial work with his colleagues in the ED around workflows. The benefits to both patients and the organization of this early work led to CEO and CMO-level support for a dedicated analytics team.

Dr. Binu Mathew, vice president of medical intelligence and analytics at Mercy Health System, headquartered in the St. Louis area, had a similar experience. “We knew there was a significant opportunity around clinical documentation… It started with that use case… But to truly crack it, we needed the whole cycle of people, process, and analytics to all work together.” In explaining the potential benefits of a dedicated analytics team, he said, “You can go the traditional route of asking IT to help build a BI solution, but the reason why they liked this was because when you have that cross-domain knowledge you can build faster. But not just faster, you can build contextually a lot deeper and have an organic team that grows over time as opposed to just having consultant services forever.”

At University of Wisconsin Health (UW Health), Dr. Grace Flood is the medical director for clinical analytics and reporting. UW Health’s pilot project was driven by the desire to make Wisconsin Collaborative for Healthcare Quality (WCHQ) data more accessible to providers and organizational leaders. WCHQ publicly reports organization and clinic level performance on ambulatory quality measures. UW Health, however, also collects underlying data down to provider and individual patient levels. Since several provider pay-for-performance measures at UW Health are based on WCHQ performance, this project garnered physician support, helping drive early successes. Moreover, the rigor of WCHQ’s standardized data formats and monthly data submissions helped the UW team build a governance structure that provided the basis for subsequent projects. Dr. Flood recommends, “Get some quick wins—show end-users some ‘wows’ that can help build momentum.”

Executive Buy-In and Sponsorship is Essential

While the staffing for each of the analytics programs described here is lean, they all needed financial support in order to build on their early wins. More importantly, the belief and backing of the senior leaders is critical in driving adoption of the analytics tools. As Dave Lehr, executive director for analytics and data strategy at Anne Arundel Medical Center in Annapolis, MD explained, “I’ve definitely seen in the past at other organizations where they are doing so many great things, but nobody knows about it except for the IT folks.” But if the CEO and other senior leaders are championing the analytics program, the front-line managers and clinicians are much more likely to take the time to learn the tools.

At North Memorial Health, “The CEO and CMO jointly said, ‘We need to do a better job of this… let’s invest in this clinical integration work... and reallocate existing resources… to facilitate this.’” Dr. Berg went on to say, “Because there was that executive level support, the other clinical stakeholders… understood that this was a priority, and we were able to really get their buy-in.”

Barbara Baldwin, vice president and chief information officer at Anne Arundel Medical Center also emphasized this point, “The whole focus as we began to evolve with total cost of care, population health really just strengthened the importance of what analytics needed to do to help our organization progress. And so I can say that this was really not a difficult concept to bring to the leadership in the organization from the CEO to the CMO to the CFO. They have been ready and embracing of the concept that we take the analytics and continue to evolve it as a product of the organization and not just a by-product of IT. … As a leadership team, they were like ‘Come on. We’re ready. Bring the concept forward.’”

“The executive buy-in has been key. I don’t think it would have worked without it,” according to Dr. Berg. Ms. Baldwin agreed, “Here’s the secret, I can say I’m keeping my executives informed, or they’re engaged. But if your executives are not hungry for this, it’s very difficult to lead them to water… The organization and the executive leadership have to be at that level if you’re really going to excel.”

Dashboard Development is a Team Sport

Each program’s leadership emphasized the importance of collaboration between the clinical and operational project sponsors and the analysts who build the dashboards and applications. At North Memorial Health, “One of these analysts is at the table even at the very beginning when we’re developing hypotheses, and then they are doing the build sort of in real-time along with that so that we can do hypothesis testing in the analytics platform.” Dr. Berg continued, “Otherwise, we develop these hypotheses that cannot be answered given the structural limitations of data. …we develop these great questions, but we haven’t framed them using the right language, and so they’ve been unanswerable and all of a sudden you’ve wasted dozens of physician hours.”

Analytics development is clearly an iterative process, and it is most efficiently done with frequent face-to-face meetings. Similar approaches were used at all of the other programs. At UW Health, Dr. Flood described the build-show-build cycle of dashboard development. Dr. Mathew emphasized the importance of getting all the stakeholders at the table, “It’s important to get the big picture. Make sure you get advice from multiple angles. That’s why it’s so important to have a team that is looking at it from multiple perspectives. …Having the ability to engage with the end users and making them feel a part of the solution I believe is absolutely key.”

While not specifically described as an agile methodology, the rapid cycles of building and validation used at North include many of the elements of agile. Both Mercy and AAMC explicitly utilize agile methods in building their apps and dashboards. At AAMC, Mr. Lehr described the benefits, “The agile process is an important part of what we do. So many people get caught up in putting out fires and not thinking about what they really want to accomplish in an intermediate term and a long term view.” Ms. Baldwin added, “Or the flip of that: perfection is the enemy of good, and you’re so into analysis paralysis that you don’t deliver. Agile helps you past that.”

It is also important to keep the big picture in mind. As Mr. Lehr put it, “We wanted to make sure that we weren’t building 100 dashboards that each had one user. We’d rather build just one dashboard that has 101 users. That’s a better use of our time and a more efficient way to get your information out there.” Ms. Baldwin offered, “…that means taking a more expansive design for those dashboards and really saying, ‘OK, what else would be utilized in this particular line of questioning?’”

The medical intelligence and analytics program at Mercy is based in the Revenue department and has produced applications in the clinical, operational and financial domains, with a primary focus on transforming complex data into insight, process efficiency and workflow automation. One such popular application helps improve charge capture for nursing procedures such as IV infusions, cutting a complex 30-to-90-minute charge capture process down to a few minutes. Training individuals to consume a complicated application would be a costly endeavor. The solution? “We’re moving more and more toward designing applications that are a lot more intuitive—it’s minimal to no training—and that’s been our focus. Make sure the app itself requires no hand-holding, because, if it does, then we’ve probably failed in our design somewhere,” said Dr. Mathew.

At AAMC, maintaining a consistent and engaging design for all of their dashboards has been a focus since the beginning of the program. Mr. Lehr: “We brought in a group called Draper & Dash …to train our developers on their UI/UX design philosophy, and then we took it from there.” By enforcing a consistent style guide in design, end users have a consistent experience across all of the dashboards.

Clinical Focus

As healthcare shifts to value-based care, the need to combine clinical and operational data has become an organizational imperative. Mr. Lehr explains, “They’re not separate, at all, anymore. …the days of being able to change the way you’re coding or do some simple changes in denials management and have that make a huge impact to your bottom-line—I think those days are gone. And really, the biggest ways that you can, as an organization, change your bottom-line is through clinical optimization and clinical innovation. …Whether it’s the VP of Revenue or the CFO or—of course—the doctors and nurses, everybody is focused on clinical innovation.”

Dr. Berg described a similar shift in focus at North Memorial Health, “Historically, before my time at the organization, it was a very siloed old-fashioned structure, like most healthcare environments are. …And so I think like most places, within a fairly short period of time, that became very heavy on finance analysis and light on meaningful clinical analysis. …In the last few years, as we’ve tried to re-orient this work around clinical stuff, we’ve developed a home-grown analytics platform.”

Likewise, Dr. Flood emphasized that UW Health’s program has largely focused on clinical quality and clinical effectiveness projects. And while the program at Mercy does focus on operational issues, many of the issues they have tackled such as clinical documentation, clinical charge capture and case management have significant clinical workflow implications.

From the selection of the data visualization front end, to the program governance, the analytics program at Anne Arundel Medical Center is clinically focused and led. “It was really the physicians and nurses that drove it. But that’s the way to do it, right? We’re a clinical enterprise,” argued Ms. Baldwin.

None of this should be surprising. When you focus on clinically-driven analytics, it’s about efficient and effective care. If you have a clinical focus to what you are trying to accomplish, by definition, you start to meet the needs of value-based care and the challenges of managing a population’s health.

Robust Analytics Governance Ties Priorities to Strategy

Dr. Berg described the challenges many organizations face in trying to prioritize competing analytics and process improvement projects: “I think historically that has been a lot of what happens in healthcare environments is that you’ve got a handful of people at the top and they garner a lot of buy-in because of the positions they are in. Something bubbles up to their attention and then they kind of sic the whole team on it and the whole team works on it for some period of time and it is unclear what the real goal was or what the payoff is.”

At North Memorial Health, they have developed a rigorous, evidence-driven methodology to prioritize projects. “We’ve tried to have a fairly diligent up-front methodology. …We have used an application in QlikView that we developed, that we call a Cohort Explorer…that aggregates claims and billing and Epic clinical information from all of our sites…and then we rank different clinical conditions.” By combining clinical data, the volume of charges associated with various DRG groupings, and a fairly robust cost accounting methodology in Cohort Explorer, North was able to quickly identify the top diagnoses in terms of both clinical and financial impact. “As we ranked those, not surprisingly, sepsis was far and away the top…Since the time of sepsis, we’ve gotten all the way down through number 5,” said Dr. Berg. Let’s be clear, North actually uses analytics to drive the prioritization of their analytics program!
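A ranking like the one Cohort Explorer produces can be approximated with a simple aggregation: group encounters by clinical condition, combine case volume and estimated cost, and sort. The field names, sample figures, and tie-breaking rule below are illustrative assumptions, not North's actual model.

```python
from collections import defaultdict

def rank_conditions(encounters):
    """Rank clinical conditions by combined cost and volume impact.

    `encounters` is an iterable of dicts carrying a DRG-style condition
    label and an estimated cost per case (an assumed, simplified schema).
    """
    totals = defaultdict(lambda: {"cases": 0, "cost": 0.0})
    for enc in encounters:
        bucket = totals[enc["condition"]]
        bucket["cases"] += 1
        bucket["cost"] += enc["cost"]
    # Sort by total cost, breaking ties by case volume.
    return sorted(totals.items(),
                  key=lambda kv: (kv[1]["cost"], kv[1]["cases"]),
                  reverse=True)

# Invented sample data; real input would come from claims, billing,
# and EHR extracts as the article describes.
encounters = [
    {"condition": "sepsis", "cost": 18000.0},
    {"condition": "sepsis", "cost": 22000.0},
    {"condition": "heart failure", "cost": 15000.0},
]
ranked = rank_conditions(encounters)
```

With a full cost accounting feed behind it, the same aggregate-and-sort pattern surfaces the highest-impact diagnoses, which is how a condition like sepsis would rise to the top of the list.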

At UW Health, a QlikView Governance Dashboard is used to monitor the use of the program’s other dashboards. This serves as an important feedback loop for the members of QlikView Leadership Team and the Integrated Analytics Team that oversee the program.

Mercy uses a priority matrix and a scoring system to prioritize projects. But, because of the specific focus of their program, they spend several weeks working with the project sponsors to define the scope and refine the possible solutions before a project is finally ranked. Dr. Mathew: “Where we focus is on high-value business problems that, historically, Mercy has struggled with or they know that it is super-high-value and we need to take it on. … We prioritize based on a composite score of quality, service and cost savings.”
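A composite score over quality, service, and cost savings could be sketched as a simple weighted sum. The weights, the 1-to-5 rating scale, and the project names below are assumptions for illustration; Mercy's actual matrix is not published in this article.

```python
def composite_score(quality, service, cost_savings,
                    weights=(0.4, 0.3, 0.3)):
    """Weighted composite of three 1-5 ratings (weights are assumed)."""
    wq, ws, wc = weights
    return wq * quality + ws * service + wc * cost_savings

# Hypothetical project ratings; sponsors would supply these after scoping.
projects = {
    "clinical documentation": composite_score(5, 4, 5),
    "charge capture": composite_score(4, 3, 5),
    "case management": composite_score(3, 4, 3),
}

# The highest composite score is tackled first.
queue = sorted(projects, key=projects.get, reverse=True)
```

The value of a scheme like this is less the arithmetic than the discipline: every project is rated on the same dimensions before it enters the queue.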

Anne Arundel Medical Center’s Data Stewardship Council is chaired by a physician and has strong nursing and physician representation that reflects the program’s solid clinical focus. Ms. Baldwin describes the governance challenge this way, “One of the things that is critical for governance to look at is balancing how we want to use our valuable-but-limited analytics resources. You can spend it clearing the decks of low hanging fruit, which can feel satisfying, but is often unproductive. Or you can take valuable resources and actually determine how do I get the bigger stuff done …which may take longer, but has a better payoff.”

Data Governance Should Not Be an Afterthought

Like most healthcare organizations across the country, most of the analytics programs profiled here characterize their data governance process as a work in progress. Dr. Flood went so far as to identify the need for strong data governance as one of her top three lessons learned, stating, “Data governance is often an afterthought. It should be a top priority. The lack of strong data governance is one of the biggest road blocks to ensuring consistency across dashboards.”

However, the AAMC team has developed a data governance program that probably represents best practice in the industry. In fact, they view programmatic analytics governance and data governance as two sides of the same coin. Mr. Lehr explains, “We use some fuzzy terminology across our organization. When we say Data Governance, we really mean program governance and data governance. Our Data Stewardship Council consists of… an enterprise-wide representative group that is able to look at all the priorities that we have. And that rolls up to our Analytics Governance Council which…consists of our CEO’s direct reports.”

Before the first meeting of the Data Stewardship Council, the AAMC team built an on-line data dictionary with a Google-like search capability. They then took the remarkable step of devoting 3 analysts full-time for about 3 months to back-populate the dictionary with all of the data elements already in use complete with definitions, sources and the data steward responsible for each element. Ms. Baldwin points out, “A single source of truth builds confidence in the data being published.”

In honor of their location on the Chesapeake Bay, they branded the on-line data dictionary the ‘Data Bay.’ They then gave access to leadership, front-line managers, and the governance team as well as the project sponsors and analytics team members. Anyone can search the ‘Data Bay’ to find where a data element is used, where it comes from and who is responsible for it.
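A minimal version of a searchable data dictionary like the ‘Data Bay’ might map each element to its definition, source, and steward, with a case-insensitive search over names and definitions. The entries and field names here are invented for illustration; AAMC's actual dictionary is far richer.

```python
# Hypothetical dictionary entries; a real one would hold every element
# in use, back-populated from existing reports.
data_dictionary = {
    "readmission_30d": {
        "definition": "Inpatient readmission within 30 days of discharge",
        "source": "EHR discharge records",
        "steward": "Quality Department",
    },
    "net_patient_revenue": {
        "definition": "Revenue after contractual allowances",
        "source": "General ledger",
        "steward": "Finance",
    },
}

def search_dictionary(term):
    """Case-insensitive substring search over element names and definitions."""
    term = term.lower()
    return {name: entry for name, entry in data_dictionary.items()
            if term in name.lower() or term in entry["definition"].lower()}

hits = search_dictionary("readmission")
```

Even a sketch this small captures the governance payoff Ms. Baldwin describes: every element has exactly one definition, one source, and one named steward.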

Mr. Lehr argues, “Doing the hard work of documentation is an absolutely essential part that almost everybody misses. I talk to these folks who are just getting started in data governance, and every single one of them asks me, ‘Yeah, but how much time is it going to take to go back and backfill all of the reports that we did? That seems like it’s too much work.’ I say to them, ‘It’s not a new project to document what you’ve already done. It’s just finishing all of those projects that you never finished.’ People are taking on lots of technical debt by having a report out there that nobody knows what it’s saying, nobody knows what it’s doing, yet every time it breaks, one of their analysts is spending time to fix it and maintain it because they don’t know if somebody out there might actually be using the thing. Going through that process of figuring out what you’re using and what you’re not using, and then documenting that so that more than one person can actually get meaning out of it. It’s not an extra project. It’s just technical debt that you need to pay off.”

The Future of Analytics

Not content to rest on their past successes, each of these programs is looking to the future. Machine Learning and Artificial Intelligence are topics that several of these organizations are exploring. Dr. Mathew maintained, “The time to insight in any analytics work will need to get shorter and shorter… As we think about embedding this in people’s work flows, the analytics really has to have a combination of not just humans but machines making decisions and taking actions so that we can make quantum leaps in improvement and progress. True work flow automation is what I’m trying to focus on more and more.”

George Reynolds, M.D., is a principal at Reynolds Healthcare Advisers and is a former CIO and CMIO of Children's Hospital & Medical Center in Omaha, Nebraska.



You Have to Learn to Walk Before You Can Run With Predictive Analytics

November 11, 2018
by David Raths, Contributing Editor
Health systems report obstacles in turning their big data into actionable insights

The title of a recent webinar says all you need to know about predictive analytics in healthcare: “Within Sight Yet Out of Reach.”

The Center for Connected Medicine, jointly operated by GE Healthcare, Nokia, and UPMC, put on the webinar and partnered with HIMSS on a survey on the state of predictive analytics in healthcare.

The survey of 100 health IT leaders found that approximately 7 out of 10 hospitals and health systems say they are taking some action to formulate or execute a strategy for predictive analytics. But despite the buzz and potential, there are obstacles for health systems that want to turn their big data into actionable insights.

Although 69 percent said they are effective at using data to describe past health events, 49 percent said they are less effective at using data to predict future outcomes. They cite a lack of interoperability and a shortage of skilled workers as barriers. “They want to put all that data to work to provide insights as we deliver care, but it is not an easy task,” said Oscar Marroquin, M.D., chief clinical analytics officer at UPMC. “They are having trouble getting access to the data in useful and standardized formats and don’t have the people in place to apply machine learning techniques.”

The top five use cases cited in the survey are:



• Fostering more cost-effective care

• Reducing readmissions

• Identifying at-risk patients

• Driving proactive preventive care

• Improving chronic conditions management

UPMC’s journey into the analytics space was jump-started by an institutional commitment to building the analytics program and a recognition that it needed to be a more data-driven organization. “We were never able to consume our data to drive how we deliver care until we had a dedicated team to do analytics,” Marroquin said. “Traditionally these functions were done as a side job by team members in IT systems. We have found having a dedicated team is absolutely necessary.”

Mona Siddiqui, M.D., M.P.H., chief data officer at the U.S. Department of Health & Human Services, says she is focused on the interoperability aspect across 29 agencies. “We are looking at how we are using data across silos to create more business value for the department,” she said. “We don’t have that infrastructure in place yet,” which leads to one-off projects rather than tackling larger priorities. She is focusing on enterprise-level data governance and interoperability structures. “I think the promise of big data is real, but I don’t think a lot of organizations have thought through the tough work required to make it happen. Practitioners start to see it as a buzzword rather than something creating real value. There is a lot of work that needs to happen before we see value coming from data.”

Noting the survey result about human resources, she added that “the talent pool is an incredible challenge. While we talk about sharing data and using it for business intelligence, we don’t resource our teams appropriately to fulfill that promise.”

She said the move to value-based care has made predictive analytics more important to health systems. “It is a data play from the ground up,” and now we are starting to see the real impact in terms of managing chronic conditions. “More organizations like UPMC are seeing this is about data and measurement and bringing in not just data they have, but resources and data they may not have had access to previously.”

Travis Frosch, senior director of analytics at GE Healthcare, said that hospitals generate petabytes of data per year, yet only 3 percent is tagged for analytical use later on. “So 97 percent goes down the drain,” he added, suggesting that organizations need to start small. “If you are an organization that does not have maturity in analytics, start with traditional business intelligence to build the trust and foundation to move toward a higher level of analytics maturity,” Frosch said. “Pick projects that don’t require tons of data sources. If you get a good return on investment you can open up the budget to further your analytics journey. But you have to have a unit in place to measure the impact.”


More From Healthcare Informatics


Survey: More Than Half of Healthcare CIOs Lack Strong Trust in Their Data

November 9, 2018
by Heather Landi, Associate Editor

For U.S. healthcare leaders, trusted data is more important than ever, as their organizations migrate from the fee-for-service model to value-based care. However, a recent survey of CIOs found that less than half of healthcare organizations show very strong levels of trust in their data.

The survey, by Burlington, Mass.-based Dimensional Insight, an analytics and data management solutions provider, is based on responses from 85 members of a professional organization of CIOs and other healthcare IT leaders about trust in data across their enterprises.

During this transition from fee-for-service to value-based care, healthcare organizations must weigh investments, risks, and trade-offs objectively with quantitative, trustworthy data. This kind of data driven decision-making will be critical in shaping the initiatives and high-stakes choices required by value-based care. The transition will require increased, high-level collaboration among different constituencies within a healthcare enterprise. It also will require decisions to be quantitatively assessed against reliable, trustworthy data, the survey report notes.

The survey sought to gauge the current state of data trust and access: How much trust do CIOs and stakeholders have in their clinical, financial, and operational data these days? How many have direct, self-service access to the information they need to make data-driven decisions? Are healthcare organizations ready to invest funds to improve trust in data and self-service capabilities?

Overall, few organizations have very strong trust in their data while levels of self-service vary across the enterprise, according to the survey. Most healthcare organizations plan to invest money toward improving both data trust and self-service, the survey found.

As part of the survey, CIOs were asked to rate the index of trust in data within their various user communities, on a 1-10 scale, with 10 being the highest. The index of trust was defined as how strongly “user populations believe that they can trust the data provided to make decisions.”

Forty-eight percent of respondents assessed financial data as an 8 or above. The percentage of “8-and-up” responses was 40 percent for clinical and 36 percent for operational.
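The “8-and-up” figures are simple proportions of 1-to-10 ratings, which is easy to reproduce from raw responses. The sample ratings below are invented, not the survey's data.

```python
def share_8_and_up(ratings):
    """Fraction of 1-10 trust ratings at 8 or above."""
    if not ratings:
        return 0.0
    return sum(r >= 8 for r in ratings) / len(ratings)

# Hypothetical responses from ten CIOs rating trust in financial data.
financial = [9, 8, 7, 10, 6, 8, 5, 8, 9, 4]
share = share_8_and_up(financial)  # 6 of 10 ratings are 8+, so 0.6
```

Applied per domain (financial, clinical, operational), the same tabulation yields the 48, 40, and 36 percent figures the survey reports.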

Clinical users have the lowest levels of self-service in making data-driven decisions. More than half of CIOs report that 30 percent or less of their clinical population is self-serviced in data-driven decision making.

Approximately three-quarters of healthcare organizations plan to increase investments to improve trust in data and self-service capabilities. At least 70 percent responded “yes” to investments in trusted data in each of the three realms. In addition, most organizations (68 to 78 percent) plan to increase their investments towards improving users’ capacity for self-service data analytics.

The survey demonstrates that healthcare organizations have a long way to go in developing rock-solid trust in their data and self-service access to it. The survey results also indicate that executives are aware of these challenges and are ready to dedicate resources to improving both trust and access.

“Trusted data is more important than ever, as healthcare organizations migrate from the fee-for-service model to value-based care,” Fred Powers, president and CEO of Dimensional Insight, said in a statement. “During this transition, healthcare organizations must weigh investments, risks, and tradeoffs against quantitative, trustworthy data. This kind of data driven decision-making will be critical in shaping the initiatives and high-stakes choices required by value-based care.”

Dimensional Insight executives also provide a number of recommendations for improving trust in data and increasing self-service capabilities:

  • Keep subject matter experts close to the data. Healthcare organizations will need programmers and data engineers to extract data from the source systems, but it is the subject matter experts who best understand the data and how it will be used.
  • Automate business logic transformations. More automation is better when it comes to the often complex logic required to transform raw data into meaningful information.
  •  Promote transparency and visibility. The best way to make sure data is right is to let people — the frontline information consumers — at it.



Related Insights For: Analytics


Study: AI Falls Short When Analyzing Data Across Multiple Health Systems

November 7, 2018
by Heather Landi, Associate Editor

Artificial intelligence (AI) tools and machine learning technologies hold the promise of transforming healthcare, and there is ongoing discussion about how much of an impact AI and machine learning will have on the practice of medicine and on the business of healthcare overall.

In a recent study, researchers from New York City-based Mount Sinai Hospital and the Icahn School of Medicine at Mount Sinai found that AI may fall short when analyzing data across multiple health systems. In their conclusions, the researchers noted that the study findings indicate healthcare organizations should carefully assess AI tools and their real-world performance. The study was published in a recent special issue of PLOS Medicine on machine learning and health care.

As interest in the use of computer system frameworks called convolutional neural networks (CNN) to analyze medical imaging and provide a computer-aided diagnosis grows, recent studies have suggested that AI image classification may not generalize to new data as well as commonly portrayed, the researchers wrote in a press release about the study.

Early results in using CNNs on X-rays to diagnose disease have been promising, but it has not yet been shown that models trained on X-rays from one hospital or one group of hospitals will work equally well at different hospitals, the researchers stated. Before these tools are used for computer-aided diagnosis in real-world clinical settings, their ability to generalize across a variety of hospital systems must be verified, according to the researchers.

The study is timely given the interest in machine learning, particularly in the area of medical imaging. A survey from Reaction Data found that 84 percent of medical imaging professionals view the technology as being either important or extremely important in medical imaging. What’s more, about 20 percent of medical imaging professionals say they have already adopted machine learning, and about one-third say they will adopt it by 2020.

Breaking it down, 7 percent of respondents said they have just adopted some machine learning and 11 percent say they plan on adopting the technology in the next 12 months. Fourteen percent of respondents said their organizations have been using machine learning for a while. About a quarter of respondents say they plan to adopt machine learning by 2020, and another 25 percent said they are three or more years away from adopting it. Only 16 percent of medical imaging professionals say they have no plans to adopt machine learning.

That survey found that there has been very little adoption by imaging centers, and all of the current adopters are hospitals.

In this particular Mount Sinai study, researchers at the Icahn School of Medicine at Mount Sinai assessed how AI models identified pneumonia in 158,000 chest X-rays across three medical institutions: the National Institutes of Health; The Mount Sinai Hospital; and Indiana University Hospital. Researchers chose to study the diagnosis of pneumonia on chest X-rays for its common occurrence, clinical significance, and prevalence in the research community.

In three out of five comparisons, the CNNs’ performance in diagnosing diseases on X-rays from hospitals outside of their own network was significantly lower than on X-rays from the original health system. However, CNNs were able to detect the hospital system where an X-ray was acquired with a high degree of accuracy, and cheated at their predictive task based on the prevalence of pneumonia at the training institution, according to the study.
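The confounding the researchers describe can be illustrated without any deep learning: a “model” that detects the source hospital and simply predicts that site's majority class will look accurate in-network and degrade badly out-of-network. The prevalence figures and simulation below are invented for illustration, not the study's data.

```python
import random

random.seed(0)

def make_cases(n, prevalence):
    """Simulated labels: 1 = pneumonia, 0 = normal (invented prevalences)."""
    return [1 if random.random() < prevalence else 0 for _ in range(n)]

def prevalence_classifier(cases, assumed_prevalence):
    """Accuracy of always predicting the majority class implied by
    the prevalence the model learned at its training site."""
    guess = 1 if assumed_prevalence >= 0.5 else 0
    correct = sum(label == guess for label in cases)
    return correct / len(cases)

train_site = make_cases(1000, prevalence=0.7)     # high-prevalence training hospital
external_site = make_cases(1000, prevalence=0.3)  # low-prevalence external hospital

in_network = prevalence_classifier(train_site, 0.7)        # roughly 0.7 accuracy
out_of_network = prevalence_classifier(external_site, 0.7) # roughly 0.3 accuracy
```

A CNN exploiting site-identifying artifacts can achieve exactly this kind of inflated in-network accuracy while learning little about pneumonia itself, which is why external validation matters.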

Researchers concluded that AI tools trained to detect pneumonia on chest X-rays suffered significant decreases in performance when tested on data from outside health systems. What’s more, the researchers noted that a difficulty of using deep learning models in medicine is that they use a massive number of parameters, making it challenging to identify the specific variables driving predictions, such as the types of CT scanners used at a hospital and the resolution quality of imaging.

“The performance of CNNs in diagnosing diseases on X-rays may reflect not only their ability to identify disease-specific imaging findings on X-rays but also their ability to exploit confounding information,” the researchers wrote in the study. “Estimates of CNN performance based on test data from hospital systems used for model training may overstate their likely real-world performance.”

These findings suggest that artificial intelligence in the medical space must be carefully tested for performance across a wide range of populations; otherwise, the deep learning models may not perform as accurately as expected, the researchers stated.

“Our findings should give pause to those considering rapid deployment of artificial intelligence platforms without rigorously assessing their performance in real-world clinical settings reflective of where they are being deployed,” senior author Eric Oermann, M.D., instructor in Neurosurgery at the Icahn School of Medicine at Mount Sinai, said in a statement. “Deep learning models trained to perform medical diagnosis can generalize well, but this cannot be taken for granted since patient populations and imaging techniques differ significantly across institutions.”

First author John Zech, a medical student at the Icahn School of Medicine at Mount Sinai, said, “If CNN systems are to be used for medical diagnosis, they must be tailored to carefully consider clinical questions, tested for a variety of real-world scenarios, and carefully assessed to determine how they impact accurate diagnosis.”


