
Healthcare Analytics’ New World

November 10, 2016
by Rajiv Leventhal
Recent federal mandates and developments around bundled payments, readmissions reduction and accountable care have again confirmed that data analytics and information technology will be crucial to healthcare’s value-based transformation

In back-to-back months this summer, announcements around new mandatory bundled payment programs from the Department of Health and Human Services (HHS) as well as the latest updates regarding Centers for Medicare & Medicaid Services (CMS) penalties on hospitals for failing to lower their rehospitalization rates, collectively signaled to healthcare leaders that payment reform is here to stay.

The July 25 announcement of the mandatory bundled payment program for heart attack care and for cardiac bypass surgery stated, “The hospital in which a Medicare patient is admitted for care for a heart attack or bypass surgery would be accountable for the cost and quality of care provided to Medicare fee-for-service beneficiaries during the inpatient stay and for 90 days after discharge. The proposed cardiac care policies would be phased in over a period of five years, but would begin July 1, 2017 for hospitals located in the 98 metro areas participating in the model (about one-quarter of all metro areas in the nation).” These new bundled payment models for cardiac care, in addition to the extension of the existing bundled payment model for hip replacements and other hip surgeries, are yet another major step in forcing reimbursement forward into value-based purchasing.

Meanwhile, on the hospital readmissions front, although the news didn’t come out of CMS directly, an August 2 Kaiser Health News report revealed that the federal government’s penalties on hospitals for failing to lower their rehospitalization rates will hit a new high as Medicare will withhold approximately $528 million—about $108 million more than last year. CMS will penalize more than half of the nation’s hospitals—a total of 2,597—for having more patients than expected return within a month, as mandated by the government’s Hospital Readmissions Reduction Program, which adjusts payments for hospitals with higher than expected 30-day readmission rates for six targeted clinical conditions.

These revelations confirm a realization that extends beyond payment reform, one that patient care leaders likely already knew: U.S. hospitals are under more pressure than ever before to produce optimal clinical and cost outcomes. Key to this transformation will be leveraging robust data analytics and information technology to help drive continuous performance improvement.

Payers and Providers Converge

A critical element for providers planning for a value-based care future is aligning their needs and goals with those of payers. While this hasn’t always been easy to accomplish, most of the sources interviewed for this story agree that real strides are being made. Tim Moore, M.D., executive vice president of health affairs and chief medical officer of technology provider AxisPoint Health, a Westminster, Colo.-based spinoff of McKesson that works primarily with payers, says there are plenty of new opportunities emerging around getting payers and providers on the same side of the table to sort out risk-based contracting challenges.

“With better integration and better relationships between payers and providers, through value-based reimbursement, there should be much better use of clinical data that is more timely and can provide interventions that are more appropriate to drive opportunities for savings,” Moore, previously chief medical officer at WebMD Health Services, says. Historically, he notes, payers would be saddled with only 60-day or 90-day-old claims-based administrative data, and by the time they did something with that, 30 more days would pass. “So there was a limit from a time perspective and also an accuracy perspective,” Moore says. He adds, “Providers have more timely claims, be it through the electronic health record [EHR] or through the hospital with admission/discharge/transfer [ADT] information. If you have that timely information and you can leverage it, you can much better leverage algorithms and analytics to help predict who needs better support and guidance, from their own real data rather than administrative claims data that’s 90 days old.”

Tim Moore, M.D.

The thing that payers can bring to the table that providers sometimes cannot, continues Moore, is a higher level view of the population that the providers are delivering service to. “Providers sometimes don’t get a good view of the whole population they are serving, as they are only serving one patient at a time. But payers see a longitudinal view of patients over the past year or two,” he says.

Moore gives an example of how some hospitals throughout the country leverage health information exchanges (HIEs) that have good ADT data that hasn’t been shared or used by industry players such as the payer market. “With this ADT data, you can pull out other information including how many ER visits someone has had in the past six months, his or her diagnosis, and when he or she was in the hospital, so you have timely information that says here is a patient that has been in the hospital and because of this condition they have a higher risk of a readmission,” he says.
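
To make Moore’s example concrete, below is a minimal sketch, in Python, of the kind of ADT-driven flagging he describes: counting recent ER visits and discharges from an HIE feed and raising a simple readmission-risk flag. The field names, the diagnosis list, and the thresholds are invented for illustration; they are not drawn from any particular HIE feed or vendor product.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of ADT-based readmission-risk flagging.
# Field names (event_type, event_date, diagnosis_code) and the
# high-risk diagnosis list are illustrative assumptions.
HIGH_RISK_DIAGNOSES = {"I50", "J44", "N18"}  # e.g., heart failure, COPD, CKD

def flag_readmission_risk(adt_events, as_of, lookback_days=180):
    """Return a simple risk flag from a patient's recent ADT history."""
    window_start = as_of - timedelta(days=lookback_days)
    recent = [e for e in adt_events if e["event_date"] >= window_start]

    er_visits = sum(1 for e in recent if e["event_type"] == "ER_VISIT")
    discharges = [e for e in recent if e["event_type"] == "DISCHARGE"]
    risky_dx = any(e.get("diagnosis_code", "")[:3] in HIGH_RISK_DIAGNOSES
                   for e in discharges)

    return {
        "er_visits_6mo": er_visits,
        "recent_discharge": bool(discharges),
        "high_risk": bool(discharges) and (er_visits >= 2 or risky_dx),
    }

events = [
    {"event_type": "ER_VISIT", "event_date": datetime(2016, 9, 1)},
    {"event_type": "DISCHARGE", "event_date": datetime(2016, 10, 2),
     "diagnosis_code": "I50.9"},
]
print(flag_readmission_risk(events, as_of=datetime(2016, 11, 1)))
```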

At the same time, payers can help by looking across different hospitals and picking out which ones are outliers in terms of high readmission rates. “Some hospitals are good at [avoidable readmissions], so you need to put resources towards the ones that are outliers,” Moore says. “Providers don’t have that full view like payers do. I think that leveraging the two sides can open up a whole new way of taking the data, and putting together and focusing the resources on where it will be most impactful,” he says.

To this end, Independence Blue Cross Blue Shield in southeastern Pennsylvania, serving two million members in five counties in and around Philadelphia, uses a predictive tool that calculates an individual’s likely future health state based on associated clinical conditions or diagnoses. The risk matrix, from San Mateo, Calif.-based healthcare analytics company Lumiata, helps the payer identify where members might be at risk for or might have certain conditions, and then helps alert their providers, explains Michael Vennera, senior vice president and CIO at Independence Blue Cross Blue Shield.

Vennera says that with the analytics tool, the payer can go to providers in its market and say that there is a chance patient X has a certain condition, even though it’s not diagnosed on his or her claims. All different types of data go into that risk engine, says Vennera—medical claims data, prescription drug claims, lab results, and also basic demographics such as age, gender, and location. “Then what we get out of it is a prediction around diseases with different confidence levels for different members. And then you can use that to follow up,” he says.
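
A minimal sketch of how such a risk engine might combine those inputs is shown below. The features, weights, and conditions are invented for illustration only; an actual engine such as Lumiata’s would learn its model from large volumes of historical claims and outcomes rather than use hand-specified weights.

```python
import math

# Illustrative only: combining member data into per-condition risk scores.
# Feature names and weights are invented; a production engine would learn them.
CONDITION_WEIGHTS = {
    "type_2_diabetes": {"age_over_50": 0.8, "statin_rx": 0.4, "a1c_above_6": 2.1, "bias": -3.0},
    "chronic_kidney_disease": {"age_over_50": 0.6, "diuretic_rx": 0.7, "egfr_below_60": 2.5, "bias": -3.5},
}

def predict_conditions(features):
    """Return {condition: probability} from a dict of binary member features."""
    predictions = {}
    for condition, weights in CONDITION_WEIGHTS.items():
        score = weights["bias"] + sum(
            w for name, w in weights.items()
            if name != "bias" and features.get(name, 0))
        predictions[condition] = 1 / (1 + math.exp(-score))  # logistic link
    return predictions

# A member assembled from claims, pharmacy, lab, and demographic data
member = {"age_over_50": 1, "statin_rx": 1, "a1c_above_6": 1}
print(predict_conditions(member))
```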

Michael Vennera

Being right in the thick of the payer-provider relationship, Vennera notes how there are now lots of opportunities to combine the depth of information that a payer has with the depth of information a provider has, and put analytics on top of it. “What you typically see in most markets is that providers will have deep information, such as services rendered in an EHR,” he says. “So if you go to a hospital, all the information about that stay and the procedures done make for a rich and deep data set. But then the challenge is when people go to multiple providers. We know interoperability is a long way off. But payers do have a broad set of data which covers most of your healthcare since most of healthcare flows back to your insurers,” he says.

Rose Higgins, president of the West Hartford, Conn.-based analytics solution company SCIO Health Analytics, additionally notes the challenge of getting payers and providers to be as transparent with the data as possible. Higgins says that while it begins with recognizing that the data has intrinsic value to both sides, there has to be a willingness to be open with respect to the information, and share it, for opportunities to be identified and acted on. “It’s challenging to mix different types of payer data sets. Providers have multiple contracts, so there are different approaches with each payer. This means that a payer may not want to share data with a provider when they know another payer’s data may be mingled with their own,” Higgins explains.

Drilling Down with Policy Implications

When CMS’ Bundled Payments for Care Improvement (BPCI) initiative first got off the ground five years ago, there were high expectations for investments in technology—to track performance on bundles, to make more predictions on performance, and to potentially price commercial bundles, notes Matthew Cinque, executive director, product management at The Advisory Board, a Washington, D.C.-based consulting and technology company. “But at that time, the market did not move as quickly as was expected on the analytics side,” Cinque says. “Folks signed up for the bundled payment programs, so there were lots of conversations around bundles in general, but when push came to shove, there was not a lot of movement.”

Cinque explains that one of the reasons for this was that the upside in the CMS program was not big enough to justify freestanding investments in new analytics. “Folks would get by with what they had in Excel. On the commercial side, we saw a lot of interest but there was hesitance on the part of payers to try to adjudicate bundled payments,” he says, adding that with a commercial population under the age of 65, the numbers showed that there was not a lot of volume of any one thing. Even with the [Comprehensive Care for Joint Replacement Model] announced last year, there has not been “a huge move around analytics investment,” Cinque says, noting that he expects that to “get more serious sooner than later.” He says, “I would say it is an immature market, but one that I expect to have more dedicated focus on bundled payment-specific analytics as CMS rolls out more mandatory programs related to this.”

Matthew Cinque

Cinque adds that “getting more serious” involves an investment in integrating different data sources. “One thing that makes bundled payments so challenging, especially if you look at the cardiac care model, is that so much of what you’re trying to manage happens outside of the four walls of the hospital. You need to be able to get data across inpatient metrics and get visibility into what happens in the physician office and skilled nursing facilities. Those are almost always entirely different data sets,” he says. Thus, data aggregation has to be a big point of investment, be it through a data warehouse or something else, he notes. “It’s about accumulating that data and then manipulating it. The data aggregation component of it is really what makes it cost prohibitive today,” he says.

Dan Golder, principal at Naperville, Ill.-based healthcare consulting firm Impact Advisors, agrees that the data integration piece could be the toughest. Golder says there are three levels when looking at value-based purchasing: claims data, clinical data, and eligibility data, and they happen to live in three siloes. Most groups, from what Golder has seen, have been working with claims data since it’s most available and “although it’s not easy to integrate it, you can,” he says. But the other two siloes are extremely problematic, he adds. “Linking claims data to other claims data from other payers, as well as clinical data and eligibility data to get it to be actionable at the point of care is an issue. Third-party tools are doing this right now, meaning providers are accessing a second application and leaving the EHR if they want to look at aggregate data and population health data,” Golder says.
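
To make the linking problem concrete, here is a hypothetical sketch that joins claims, clinical, and eligibility records on a shared member identifier. Real-world matching is far messier (different identifiers, varying formats, probabilistic patient matching), which is exactly the difficulty Golder describes.

```python
from collections import defaultdict

# Hypothetical sketch: joining the three silos on a shared member ID.
# In practice each silo uses different identifiers and formats, so real
# integration requires a master patient index or probabilistic matching.
claims = [{"member_id": "M1", "cpt": "99213", "allowed_amount": 95.0}]
clinical = [{"member_id": "M1", "a1c": 8.2, "last_visit": "2016-09-14"}]
eligibility = [{"member_id": "M1", "plan": "PPO", "effective": "2016-01-01"}]

def join_silos(*named_silos):
    """Merge (name, records) pairs into one view per member."""
    merged = defaultdict(dict)
    for silo_name, records in named_silos:
        for rec in records:
            merged[rec["member_id"]].setdefault(silo_name, []).append(
                {k: v for k, v in rec.items() if k != "member_id"})
    return dict(merged)

patient_view = join_silos(("claims", claims), ("clinical", clinical),
                          ("eligibility", eligibility))
print(patient_view["M1"])
```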

Meanwhile, on the readmissions front, SCIO Health Analytics’ Higgins says that the organizations that have tackled this challenge head on are predicting where the outliers are, identifying them earlier, and then working with those providers and patients to reduce readmissions. “The penalties are real and meaningful, so the approach for most organizations we are talking to is figuring out how to look at the providers who aren’t as strong in the primary care practice relationships that need to be in place with these patients to make sure there is a good plan of care prior to and after admission,” Higgins says.

AxisPoint Health’s Moore adds that payers are becoming increasingly frustrated since they have put programs in place that intuitively, and in an academic research setting, prove that they have value in terms of outcomes for lowering readmission rates, but when these programs are moved into a community-based setting, the community doesn’t act like an academic setting. “So many clients are frustrated that they have put programs in place that don’t work,” Moore says. “We need more community-based studies and interventions that can be leveraged across the U.S.,” he says. Moore further points to the “LACE” index—a tool that identifies patients who are at risk for readmission or death—which is used by many hospitals and academic centers. “But unless you have those data points, like the ADT data, you can’t really do anything of significance,” he notes.
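
For reference, the LACE index combines length of stay, acuity of admission, Charlson comorbidity index, and ED visits in the prior six months into a single score. The sketch below uses the commonly published point values and high-risk cutoff; as Moore notes, the score is only as useful as the ADT and comorbidity data feeding it, and organizations may tune thresholds to their own populations.

```python
def lace_score(length_of_stay_days, acute_admission, charlson_index, ed_visits_6mo):
    """LACE readmission-risk index, using commonly published point values."""
    if length_of_stay_days < 1:
        l_points = 0
    elif length_of_stay_days <= 3:
        l_points = length_of_stay_days      # 1-3 days: 1-3 points
    elif length_of_stay_days <= 6:
        l_points = 4                        # 4-6 days
    elif length_of_stay_days <= 13:
        l_points = 5                        # 7-13 days
    else:
        l_points = 7                        # 14+ days

    a_points = 3 if acute_admission else 0  # emergent/urgent admission
    c_points = charlson_index if charlson_index <= 3 else 5
    e_points = min(ed_visits_6mo, 4)        # capped at 4
    return l_points + a_points + c_points + e_points

# Example: 5-day emergent stay, Charlson index of 2, one ED visit in prior 6 months
score = lace_score(5, True, 2, 1)
print(score, "high risk" if score >= 10 else "lower risk")  # >=10 is a common cutoff
```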

One area around readmissions that Cinque has seen an increased utilization of analytics for is psychosocial factors and financial barriers to care. “These readmission rates are higher in lower income areas than elsewhere,” he says. “Organizations are beginning to incorporate either a low-fi method, so collecting information while patients are in the hospital, or through more progressive methods, such as data mining to try to flag patients for readmissions. The integration of those data types in trying to become more predictive about risk factors for readmissions is an emerging area,” Cinque says.

Nonetheless, for some patient care organizations, notably smaller provider groups, incorporating this level of analytics can prove too expensive and overwhelming. To this end, Scott Pillittere, vice president of Impact Advisors, says that these smaller groups will be willing to “take the hit” from CMS regarding penalties for these policy mandates, since they won’t be able to pay for the data analytics that are needed. “We are seeing more consolidation in the marketplace, and this will be another factor that will push standalone or small physician practices into a much larger organization so they have the financing to pay for the data analytics groups that can help them with this part of their care,” Pillittere predicts. “There are not a whole lot of doctors who want to play in this business side of healthcare, so they are looking for help,” he says.

Golder adds that when he reads the tea leaves of Medicare’s new rules on how much payment it wants tied to value in the coming years, much of it is budget neutral, meaning for someone to earn an incentive payment, someone else will have to pay a penalty. This represents a difference from the Meaningful Use program, in which everyone could earn incentives. “So the inability to pay for systems and the lack of capability to run analytics to do better, will likely shift small practices into larger groups that can be successful in the world of population health and accountable care,” Golder says.

Nevertheless, the shifting healthcare landscape isn’t stopping senior leaders at SCL Health from getting in front of the analytics game. The patient care organization, a nine-hospital health system with three safety-net clinics, one children’s mental health center, and approximately 200 ambulatory sites in three states—Colorado, Kansas, and Montana—last year selected Fort Collins, Colo.-based Total Benchmark Solution (TBS) as its vendor for benchmark data and advanced analytics. The platform enabled the health system to quickly and easily compare performance using historical trends, performance targets, and peer group data. It was then able to identify areas of undesirable variation to target for improvement, its officials say.

“The platform allows us to filter and adjust an analysis based on various criteria, such as a certain type of patient or a particular payer,” says Chris Bliersbach, senior director of clinical outcomes at SCL Health. “We can integrate data sources, such as our ADT feed, Epic, and Press Ganey to see the whole picture through volume, cost, charges, supplies, quality, patient experience, and many other metrics.”

Prior to this technology implementation, two SCL Health care sites and a commercial payer already had been particularly interested in hip and knee surgery improvement within the Comprehensive Care for Joint Replacement bundled payment model. Both care sites had desirable performance with length of stay (LOS) as measured against Medicare and all-payer benchmarks in the TBS database, its officials attest.

But they realized that customized benchmarks would provide a stretch goal appropriate to best-practice hip and knee surgery outcomes at the care sites. To develop the benchmarks, SCL Health and TBS collected and analyzed data from Healthgrades on organizations that had five-star ratings for hip and knee surgery. Importantly, to offer stretch goals for the care sites, organizations were also required to appear on the U.S. News and World Report “Best Hospitals” list and to have a patient volume similar to the care sites’. Indeed, 80 hospitals providing knee surgery and 56 providing hip surgery met the criteria for “best practice” organizations with both low LOS and low complication rates. Data from those organizations were used to establish the tailor-made benchmarks, SCL Health officials say.
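
The selection logic described above amounts to filtering a candidate list against rating, list-membership, and volume criteria, then summarizing the survivors. The sketch below is a hypothetical illustration with invented hospital records and thresholds; it is not SCL Health’s or TBS’s actual method.

```python
from statistics import median

# Hypothetical peer-group filtering for custom LOS/complication benchmarks.
def build_benchmark(hospitals, min_volume, max_volume):
    peers = [h for h in hospitals
             if h["healthgrades_stars"] == 5
             and h["us_news_best"]
             and min_volume <= h["annual_cases"] <= max_volume]
    return {
        "peer_count": len(peers),
        "benchmark_los_days": median(h["avg_los_days"] for h in peers) if peers else None,
        "benchmark_complication_rate": median(h["complication_rate"] for h in peers) if peers else None,
    }

hospitals = [  # invented records for illustration
    {"healthgrades_stars": 5, "us_news_best": True, "annual_cases": 420,
     "avg_los_days": 1.9, "complication_rate": 0.012},
    {"healthgrades_stars": 5, "us_news_best": True, "annual_cases": 390,
     "avg_los_days": 2.1, "complication_rate": 0.015},
    {"healthgrades_stars": 4, "us_news_best": True, "annual_cases": 500,
     "avg_los_days": 2.6, "complication_rate": 0.020},
]
print(build_benchmark(hospitals, min_volume=300, max_volume=600))
```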

Advice for the C-Suite

All of the healthcare leaders interviewed for this story agree with the notion that with all of the initiatives that federal healthcare officials are creating now—around readmissions reduction, value-based purchasing for both hospitals and physicians, bundled payments, and accountable care—the leveraging of data and healthcare IT will be critically important.

So what is the best plan of action for CIOs and CMIOs right now around leveraging robust data analytics to bend the healthcare cost curve? Well, there isn’t one single answer for organizations nationwide, says Impact Advisors’ Golder, who notes several options for health systems: aggregating data by building data warehouses; integrating data sources themselves; asking their existing vendor partners to help, since their doctors don’t want to leave the EHR; and finding third-party vendors for data aggregation. “So, for the provider organization, what’s your appetite for risk?” Golder asks.

Independence Blue Cross Blue Shield’s Vennera says that payers should be talking to the providers they work with in their market, if they aren’t already, about how they can share data, particularly if there is no regional data exchange program or HIE in place. And on the analytics specific side, he adds, “The big thing for CIOs is to strike the right balance between insourcing and outsourcing. With the analytics arms race and the cost of analytics resources, you can’t build everything in-house. But also you can’t bend your way to the answers. You need a combination of vendor solutions and building the in-house talent to interpret results, challenge findings, and think through and develop proprietary analytics.”

Moore agrees with this advice. He says: “Put stakeholders in a room together and have them each bring the data they believe is most important and share it with each other, so they could understand the data that exists rather than create something new.” Moore believes that there is a tendency in healthcare to try to create something new and constantly look at something differently. “Right now, we have many different data points that are not necessarily used as well as they could be and integrated as well as they could be. Start with the data you have and figure out how to best leverage it,” he says.

The Advisory Board’s Cinque further brings up the point that when an organization begins collecting information and data, and synthesizes it across different sites of care, it forces interactions with other EHR systems, or in some cases, places that don’t even have EHRs. “You need to understand the IT landscape of your partners and other providers in your community. That will influence your success on these programs,” he says. “For CIOs and CMIOs, there is a belief that if they are not on our EHR platform, we shouldn’t work with them or it shouldn’t matter. That’s not a tenable strategy with these interconnected programs.”

On the risk-based side, Moore also notes the fact that there is still such a mix of fee-for-service and value-based reimbursement, which ultimately slows things down. He calls it “a schizophrenic way for a provider to try to practice.” He says that provider organizations need to figure out how to segregate, meaning having a fee-for-service group and value-based group that is at least 75 percent reimbursed from one side or the other. “Until the majority of a doctor’s compensation is tied to one side, they won’t behave in that certain way. That’s one of the biggest challenges for us in the next five years,” he says.


AI in Imaging: Where’s the Bang for the Buck?

January 23, 2019

Over the past year much has been written about the capability of Artificial Intelligence (AI), and what it will mean for imaging services.  At last year’s RSNA, AI was the featured topic and received the lion’s share of publicity. 

The glamorous aspect of AI and Machine Learning has been how AI can assist the radiologist with diagnosis of imaging studies.  A key area of focus has been in chest imaging (https://www.auntminnieeurope.com/index.aspx?sec=sup&sub=aic&pag=dis&ItemID=616828) where there has been some success in triaging abnormal chest images.  The upside of such applications is improved diagnostic efficiency, particularly as healthcare moves toward value-based care.  The downside is that such algorithms require substantial amounts of data to validate, and they will need to go through the FDA approval process, which will take time before they can be fully implemented.

Ultimately, AI imaging applications will pay off.  But, what about the other potentially less-glamorous aspect of applying AI/Machine Learning to the diagnostic process?  By that, I am referring to its use in terms of workflow orchestration.  Aside from interpreting imaging content, AI/Machine Learning applied to workflow orchestration can provide valuable information and assistance in preparing a case for interpretation. 

Take, for example, Siemens Healthineers’ AI-Rad Companion application (https://www.healthcare.siemens.com/infrastructure-it/artificial-intelligence/ai-rad-companion). The application provides automated identification, localization, labeling and measurements for anatomies and abnormalities. Such a capability can improve the radiologist’s efficiency without necessarily employing an algorithm to assess the image.

Other workflow applications can assess the study and mine relevant information from the EHR to present to the radiologist, again with the goal of improving their efficiency and efficacy.  Still other applications match radiologist reading assignments with available studies to improve reading efficiency.  In another twist, one company has demonstrated a capability to further analyze cases, using AI to assign the next appropriate case to a radiologist without the need for a worklist. 

As healthcare providers consolidate, there is a growing need for improvement in resource utilization across facilities.  Smart worklists that can present cases to individual radiologists across facilities can improve the overall efficiency and efficacy of interpretation.  Rule sets that address radiologist availability, reading sub-specialties, location, etc. can help “level-load” reading resources. 
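
A toy version of such a rule set might look like the following. The fields (availability, subspecialties, credentialed sites, queue length) and the level-loading rule are assumptions made for illustration; commercial orchestration engines weigh many more factors and may use learned models rather than fixed rules.

```python
# Hypothetical rule-based "smart worklist" assignment across facilities.
def assign_study(study, radiologists):
    candidates = [
        r for r in radiologists
        if r["available"]
        and study["subspecialty"] in r["subspecialties"]
        and study["site"] in r["credentialed_sites"]
    ]
    if not candidates:
        return None
    # Level-load: pick the qualified reader with the shortest queue
    return min(candidates, key=lambda r: r["queue_length"])["name"]

radiologists = [
    {"name": "Reader A", "available": True, "subspecialties": {"neuro", "body"},
     "credentialed_sites": {"Site 1", "Site 2"}, "queue_length": 7},
    {"name": "Reader B", "available": True, "subspecialties": {"chest"},
     "credentialed_sites": {"Site 2"}, "queue_length": 3},
]
print(assign_study({"subspecialty": "chest", "site": "Site 2"}, radiologists))
```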

My point is that while AI applications that manipulate images may hold great promise for the future of diagnosis, areas such as workflow orchestration might offer more immediate results in an environment of changing healthcare.  Providers should take a close look at these applications to assess whether they can achieve a more immediate impact on imaging operations.


National Library of Medicine Creating Scientific Director Position

January 23, 2019
by David Raths, Contributing Editor
New position will oversee Lister Hill National Center for Biomedical Communications and National Center for Biotechnology Information

As part of a reorganization of its intramural research activities, the U.S. National Library of Medicine (NLM) has launched a search for a scientific director. The scientific director will oversee a group of 150 scientific personnel, developing new approaches to data science, biomedical informatics, and computational biology.

In a blog post on the library’s website, director Patti Brennan, R.N., Ph.D., called the move a big step in revving up its intramural research operation.

One of the 27 Institutes and Centers of the National Institutes of Health (NIH), NLM creates and hosts major digital resources, tools, and services for biomedical and health literature, data, and standards, sending 115 terabytes of data to five million users and receiving 15 terabytes of data from 3,000 users every weekday.

NLM’s strategic plan for 2017-2027 positions it to become a platform for biomedical discovery and data-powered health. NLM anticipates continued expansion of its intramural research program to keep pace with growing demand for innovative data science and informatics approaches that can be applied to biomedical research and health and growing interest in data science across the NIH.

A Blue Ribbon Panel recently reviewed NLM’s intramural research programs and recommended, among other things, unifying the programs under a single scientific director. That shift also aligns the library with NIH’s other institutes and centers, most of which are guided by one scientific director.

NLM’s  intramural research program includes activities housed in both the Lister Hill National Center for Biomedical Communications (LHC) and the National Center for Biotechnology Information (NCBI). The researchers in these two centers develop and apply computational approaches to a broad range of problems in biomedicine, molecular biology, and health, but LHC focuses on medical and clinical data, while NCBI focuses on biological and genomic data.

But the Blue Ribbon Panel noted that the boundaries between clinical and biological data are dissolving, and the analytical and computational strategies for each are increasingly shared. “As a result, the current research environment calls for a more holistic view of biomedical data, one best served by shared approaches and ongoing collaborations while preserving the two centers’ unique identities,” wrote Brennan, who came to NIH in 2016 from the University of Wisconsin-Madison, where she was the Lillian L. Moehlman Bascom Professor at the School of Nursing and College of Engineering.

She added that having a single scientific director should lead to a sharper focus on research priorities, fewer barriers to collaboration, the cross-fertilization of ideas and the optimization of scarce resources.

The new scientific director will be asked to craft a long-range plan that identifies research areas where the NLM can best leverage its unique position and resources. “We’ll also look for ways to allocate more resources to fundamental research while streamlining operational support,” Brennan wrote. “Down the road, we’ll expand our research agenda to include high-risk, high-reward endeavors, the kinds of things that raise profound questions and have the potential to yield tremendous impact.”

Besides the scientific director, the NLM is also recruiting three investigators to complement its strengths in machine learning and natural language processing.

Survey: Digital, AI Top Priorities in 2019, but EHRs Will Dominate IT Spend

January 22, 2019
by Heather Landi, Associate Editor

Digital, advanced analytics, and artificial intelligence (AI) are top spending priorities for healthcare executives in 2019, but electronic health record (EHR) systems will dominate technology spending budgets, according to a recent technology-focused healthcare survey.

Damo Consulting, a Chicago-based healthcare growth and digital transformation advisory firm, surveyed technology and service provider executives and healthcare enterprise executives about how the demand environment for healthcare IT is changing and will impact the industry in the coming year. Damo Consulting’s third annual Healthcare IT Demand Survey also analyzes the challenges for healthcare organizations and the perceived impact of macro-level changes.

The report indicates technology vendors will continue to struggle with long sales cycles as they aggressively market digital and AI. For the second year in a row, the rise of non-traditional players such as Amazon and Google will have a strong impact on the competitive environment among technology vendors while EHR vendors grow in dominance.

Among the key findings from the survey, IT budgets are expected to grow by 20 percent or more, with healthcare executives indicating they are more upbeat about IT spend growth than vendors. All the healthcare executives who participated in the survey said digital transformation initiatives are gaining momentum in their enterprises.

However, the majority (75 percent) agree that rapid change in the healthcare IT landscape makes technology decisions harder and only 58 percent believe there are plenty of viable and ready-to-deploy solutions available today in emerging technologies such as AI and digital health solutions. Seventy-one percent agree that federal government policies have provided a boost to healthcare IT spend this past year.

Top IT priorities for healthcare enterprise executives in 2019 are digital, advanced analytics and AI. Of the survey respondents, 79 percent said accelerating digital health initiatives was a top priority and 58 percent cited investing in advanced analytics and AI capabilities as top priorities. However, modernizing IT infrastructure (25 percent) and optimizing EHRs (21 percent) are also significant priorities.

Technology vendors also see AI, advanced analytics and digital transformation as top areas of focus for next year, as those areas were cited by 75 percent and 70 percent of technology and service provider executives, respectively. Thirty-three percent of those respondents cited EHR optimization and 25 percent cited cybersecurity and ransomware. Thirteen percent cited M&A integration as a top area of focus in 2019.

However, EHR systems will dominate technology spending budgets, even as the focus turns to digital analytics, the survey found. Technology and service provider executives who participated in the survey identified EHR system optimization and cybersecurity as significant drivers of technology spend in 2019. Sixty percent of respondents said enterprise digital transformation and advanced analytics and AI would drive technology spend this year, but 38 percent also cited EHR optimization and cybersecurity/ransomware. One executive survey respondent said, “For best of breed solutions, (the challenge is) attracting enough mindshare and budget vs. EHR spends.”

When asked what digital transformation means, close to half of healthcare executives cited reimagining patient and caregiver experiences, while one quarter cited analytics and AI and 17 percent cited automation. As one executive said, “The biggest challenge for healthcare in 2019 will be navigating tightening margins and limited incentives to invest in care design.”

Healthcare executives are divided on whether digital is primarily an IT-led initiative, and are also divided on whether technology-led innovation is dependent on the startup ecosystem.

The CIO remains the most important buyer for technology vendors; however, IT budgets now sit with multiple stakeholders, the survey found, as respondents also cited the CFO, the CTO, the CMIO and the chief digital officer.

“Digital and AI are emerging as critical areas for technology spend among healthcare enterprises in 2019. However, healthcare executives are realistic around their technology needs vs. their need to improve care delivery. They find the currently available digital health solutions in the market are not very mature,” Paddy Padmanabhan, CEO of Damo Consulting, said in a statement. “However, they are also more upbeat about the overall IT spend growth than their technology vendors.”

Looking at the technology market, healthcare executives perceive a lack of maturity in technology solution choices for digital initiatives, as well as a lack of internal capabilities for managing digital transformation. In the survey report, one executive said, “HIT architecture needs to substantially change from large monolithic code sets to an API-driven environment with multiple competing apps.”

A majority of healthcare enterprise executives view data silos and lack of interoperability as the biggest challenges to digital transformation. And, 63 percent believe the fee-for-service reimbursement model will remain the dominant payment model for the foreseeable future.

In addition, cybersecurity issues will continue to be a challenge for the healthcare sector in 2019, but not the biggest driver of technology spending or the top area of focus for health systems in the coming year, according to the survey.

Healthcare executives continue to be confused by the buzz around AI and digital and struggle to make sense of the changing landscape of who is playing what role and the blurred lines of capabilities and competition, according to the survey report. When asked who their primary choice is when looking for potential partners to help with digital transformation, 46 percent of healthcare executives cited their own internal IT and innovation teams, 17 percent cited their EHR vendor and 8 percent cited boutique consulting firms. A quarter of respondents cited “other.”

For technology vendors, the biggest challenge is long sales cycles, along with product/service differentiation and brand visibility.

The rise of non-traditional players, such as Amazon, Apple, and Google, will have a strong impact on the competitive healthcare technology environment, the survey responses indicated. At the same time, deeply entrenched EHR vendors such as Epic and Cerner will grow in dominance.