
White House Report Highlights Role of Big Data in Cancer Moonshot Efforts

October 18, 2016
by Heather Landi

U.S. Vice President Joe Biden on Monday delivered the Cancer Moonshot Task Force report summarizing the work of the initiative since its inception 10 months ago and outlined new commitments toward the goals of the Cancer Moonshot from both the public and private sectors, with a significant emphasis on harnessing big data.

The report laid out the Vice President’s strategic plan for transforming cancer research and care moving forward, and specifically highlighted unleashing the power of data as one of the strategic goals outlined on Monday.

“Today researchers are working with an unprecedented amount of data, in part due to the explosion of genomic information, increasing use of electronic health records, and large datasets of clinical, environmental, and public health information,” the task force report stated.

“Under Strategic Goal 2, the task force is maximizing access to and usability of these data to enhance, improve and inform the journey of every cancer patient by: enabling a seamless data environment for clinical and research data through shared policies and technologies; unlocking scientific advances through open publication and storage platforms and next-generation computer architecture; and developing a scientific workforce capable of using the open and connected data environment.”

During his 2016 State of the Union Address, President Obama called on Vice President Biden to lead a national Cancer Moonshot focused on making a decade of progress in preventing, diagnosing, and treating cancer in five years. A Presidential Memorandum established the Cancer Moonshot Task Force, which was directed to unite the federal government in achieving the Moonshot’s mission through a focused effort to leverage federal investments, targeted incentives, private sector efforts, and patient initiatives, among other mechanisms. The memorandum also directed the National Cancer Institute (NCI) at the National Institutes of Health (NIH) to form the Cancer Moonshot Blue Ribbon Panel (BRP) to bring together experts in a variety of disciplines to identify key areas of science for new investment at NCI.

In the task force’s report, Vice President Biden wrote that with regard to cancer research and treatment, the country has reached an “inflection point” due to a number of factors, such as more collaboration across research disciplines, technological advances and the wealth of research and healthcare data.

“Everywhere I traveled, I was told that data are key, and we have an unprecedented amount and diversity of data being generated daily through genomics, family history records, lifestyle measurements, and treatment outcomes. With this data we can find new patterns of causes, earlier signs of cancer and successful treatments of cancer,” Biden wrote in the report. “We now have the capability to realize the promise of all these data because of advances in supercomputing power. Researchers can analyze enormously complex and large amounts of data to find answers we couldn’t just five years ago.”

The report also laid out year 1 accomplishments and outlined plans for year 2 and beyond. As part of these plans, the report highlighted a number of announcements about new public initiatives, public-private partnerships and private efforts focused on harnessing big data, sharing research among scientists and expanding preventive measures.

The NCI, Amazon Web Services, and Microsoft announced a collaboration to build a sustainable model for maintaining cancer genomic data in the cloud. The information stored there will be available to cancer researchers through the NCI’s Genomic Data Commons and Cancer Genomics Cloud programs.

Lyft and Uber expanded their commitment to providing transportation for cancer patients. According to the White House press release, one-fourth of patients currently miss or reschedule their treatments and appointments because of transportation issues. Lyft has committed to expanding its Boston-based treatment transport partnership to all 200 cities Lyft currently serves by 2020, to provide patients, particularly those from low-income communities, with credits for free transportation to and from treatments. Uber has set a goal of connecting millions of patients with rides annually, in over 500 cities, by 2018.

The National Aeronautics and Space Administration (NASA) also is involved in Cancer Moonshot work. NASA and NCI announced a collaboration to study the biological effects of particle beam radiotherapy, a novel technology that may deliver a more targeted dose of radiation to tumor cells.

The Department of Defense (DoD) is establishing a groundbreaking new longitudinal study of the biological underpinnings of cancer. Using data housed within DoD’s cancer registry and serum repository, researchers will work to identify new linkages between pre-diagnostic biological markers and various types of cancer. “Approximately 1,000 new cases of cancer occur annually in active duty personnel, and there are approximately 250,000 samples from the last 25 years available to undergo protein signature analysis for pre-incident cancer markers. DoD and the Environmental Protection Agency (EPA) will also work in partnership to link results with the ‘Environmental Quality Index’ to further evaluate the environmental factors contributing to this disease, with appropriate considerations taken to ensure privacy and consent of current and past active duty members who will be part of the study,” according to the report.

The DoD’s Joint Pathology Center will also explore digitizing and making available its repository of over 34 million unique pathology samples. Digitizing this vast and unique pathology resource will have numerous benefits, including increased access for a diverse range of researchers and diagnosticians, and will build on recent efforts that combine image analysis and machine learning algorithms to improve cancer diagnoses.

The new actions and public-private partnerships announced on Monday are just some of the over 70 commitments made this year as a result of the Cancer Moonshot.

The report also outlined progress being made, to date, on a number of initiatives. The NCI has adopted a new dashboard that makes it easier for patients and doctors to search for clinical trials and increases patients’ ability to participate in clinical research. The dashboard was created by the Presidential Innovation Fellows in partnership with the NCI to maximize the user experience on the NCI’s cancer trials website.

NIH launched a new partnership to bring together drug companies, major cancer research centers, foundations, and philanthropies to collaborate on early-stage research (i.e., the basic biology of cancer) and to share all of the data. According to the report, rather than 20 companies each studying the same thing and not sharing the results, the participating organizations will be able to see each other’s findings and build upon the results more quickly. The report says that more drug companies are signing up to be a part of the partnership.




Definitive Healthcare Acquires HIMSS Analytics’ Data Services

January 16, 2019
by Rajiv Leventhal, Managing Editor

Definitive Healthcare, a data analytics and business intelligence company, has acquired the data services business and assets of HIMSS Analytics, the organizations announced today.

The purchase includes the Logic, Predict, Analyze and custom research products from HIMSS Analytics, which is commonly known as the data and research arm of the Healthcare Information and Management Systems Society.

According to Definitive officials, the acquisition builds on the company’s “articulated growth strategy to deliver the most reliable and consistent view of healthcare data and analytics available in the market.”

Definitive Healthcare will immediately begin integrating the datasets and platform functionality into a single source of truth, their executives attest. The new offering will aim to include improved coverage of IT purchasing intelligence with access to years of proposals and executed contracts, enabling transparency and efficiency in the development of commercial strategies.

Broadly, Definitive Healthcare is a provider of data and intelligence on hospitals, physicians, and other healthcare providers. Its product suite provides comprehensive data on 8,800 hospitals, 150,000 physician groups, 1 million physicians, 10,000 ambulatory surgery centers, 14,000 imaging centers, 86,000 long-term care facilities, and 1,400 ACOs and HIEs, according to officials.

Together, Definitive Healthcare and HIMSS Analytics have more than 20 years of experience in data collection through exclusive methodologies.

“HIMSS Analytics has developed an extraordinarily powerful dataset including technology install data and purchasing contracts among other leading intelligence that, when combined with Definitive Healthcare’s proprietary healthcare provider data, will create a truly best-in-class solution for our client base,” Jason Krantz, founder and CEO of Definitive Healthcare, said in a statement.


Machine Learning Survey: Many Organizations Several Years Away from Adoption, Citing Cost

January 10, 2019
by Heather Landi, Associate Editor

Radiologists and imaging leaders see an important role for machine learning in radiology going forward; however, most organizations are still two to three years away from adopting the technology, and a sizeable minority has no plans to adopt machine learning, according to a recent survey.

A recent study* by Reaction Data sought to examine the hype around artificial intelligence and machine learning, specifically in the area of radiology and imaging, to uncover where AI might be most useful and applicable, and in what areas medical imaging professionals are looking to utilize machine learning.

Reaction Data, a market research firm, got feedback from imaging professionals, including directors of radiology, radiologists, chiefs of radiology, imaging techs, PACS administrators and managers of radiology, from 152 healthcare organizations to gauge the industry on machine learning. About 60 percent of respondents were from academic medical centers or community hospitals, while 15 percent were from integrated delivery networks and 12 percent were from imaging centers. The remaining respondents worked at critical access hospitals, specialty clinics, cancer hospitals or children’s hospitals.

Among the survey respondents, there was significant variation in the number of annual radiology studies performed: 17 percent performed 100,000 to 250,000 studies each year; 16 percent performed 1 million to 2 million; 15 percent performed 5,000 to 25,000; 13 percent performed 250,000 to 500,000; and 10 percent performed more than 2 million studies a year.

More than three quarters of imaging and radiology leaders (77 percent) view machine learning as being important in medical imaging, up from 65 percent in a 2017 survey. Only 11 percent view the technology as not important. However, only 59 percent say they understand machine learning, although that percentage is up from 52 percent in 2017. Twenty percent say they don’t understand the technology, and 20 percent have a partial understanding.

Looking at adoption, only 22 percent of respondents say they are currently using machine learning—either just adopted it or have been using it for some time. Eleven percent say they plan to adopt the technology in the next year.

Half of respondents (51 percent) say their organizations are one to two years away (28 percent) or even more than three years away (23 percent) from adoption. Sixteen percent say their organizations will most likely never utilize machine learning.

Reaction Data also collected commentary from survey respondents, and some indicated that funding was an obstacle behind the lack of plans to adopt the technology. When asked why they don’t ever plan to utilize machine learning, one respondent, a chief of cardiology, said, “Our institution is a late adopter.” Another respondent, an imaging tech, responded: “No talk of machine learning in my facility. To be honest, I had to Google the definition a moment ago.”

Survey responses also indicated that imaging leaders want machine learning tools to be integrated into PACS (picture archiving and communication systems) software, and that cost is an issue.

“We'd like it to be integrated into PACS software so it's free, but we understand there is a cost for everything. We wouldn't want to pay more than $1 per study,” one PACS Administrator responded, according to the survey.

A radiologist who responded to the survey said, “The market has not matured yet since we are in the research phase of development and cost is unknown. I expect the initial cost to be on the high side.”

According to the survey, when asked how much they would be willing to pay for machine learning, one imaging director responded: “As little as possible...but I'm on the hospital administration side. Most radiologists are contracted and want us to buy all the toys. They take about 60 percent of the patient revenue and invest nothing into the hospital/ambulatory systems side.”

And, one director of radiology responded: “Included in PACS contract would be best... very hard to get money for this.”

The survey also indicates that, among organizations that are using machine learning in imaging, there is a shift in how organizations are applying machine learning in imaging. In the 2017 survey, the most common application for machine learning was breast imaging, cited by 36 percent of respondents, and only 12 percent cited lung imaging.

In the 2018 survey, only 22 percent of respondents said they were using machine learning for breast imaging, while other applications saw increases. Among respondents who have adopted and use machine learning, the next most-cited applications were lung imaging (22 percent), cardiovascular imaging (13 percent), chest X-rays (11 percent), bone imaging (7 percent), liver imaging (7 percent), neural imaging (5 percent) and pulmonary imaging (4 percent).

When asked what kind of scans they plan to apply machine learning to once the technology is adopted, one radiologist cited quality control for radiography, CT (computed tomography) and MR (magnetic resonance) imaging.

The survey also examines the vendors being used among respondents who have adopted machine learning, and the findings indicate some differences compared to the 2017 results. No one vendor dominates this space: 19 percent use GE Healthcare, and about 16 percent use Hologic, down from the 25 percent of respondents who cited Hologic as their vendor in last year’s survey.

Looking at other vendors being used, 14 percent use Philips, 7 percent use Arterys, 3 percent use Nvidia, and Zebra Medical Vision and iCAD were each cited by 5 percent of medical imaging professionals. The percentage of imaging leaders citing Google as their machine learning vendor dropped from 13 percent in 2017 to 3 percent in this latest survey. Interestingly, the number of respondents reporting the use of homegrown machine learning solutions increased to 14 percent, from 9 percent in 2017.

 

*Findings were compiled from Reaction Data’s Research Cloud. For additional information, please contact Erik Westerlind at ewesterlind@reactiondata.com.

 


Drexel University Moves Forward on Leveraging NLP to Improve Clinical and Research Processes

January 8, 2019
by Mark Hagland, Editor-in-Chief
At Drexel University, Walter Niemczura is helping to lead an ongoing initiative to improve research processes and clinical outcomes by leveraging NLP technology.

Increasingly, the leaders of patient care organizations are using natural language processing (NLP) technologies to leverage unstructured data, in order to improve patient outcomes and reduce costs. Healthcare IT and clinician leaders are still relatively early in the long journey towards full and robust success in this area, but they are moving forward in healthcare organizations nationwide.

One area in which learnings are accelerating is medical research—both basic and applied. Numerous medical colleges are moving forward in this area, with strong results. Drexel University in Philadelphia is among that group. There, Walter Niemczura, director of application development, has been helping to lead an initiative that supports research and patient care efforts at the Drexel University College of Medicine, one of the nation’s oldest medical colleges (it was founded in 1848), and across the university. Niemczura and his colleagues have been partnering with the Cambridge, England-based Linguamatics to engage in text mining that can support improved research and patient care delivery.

Recently, Niemczura spoke with Healthcare Informatics Editor-in-Chief Mark Hagland, regarding his team’s current efforts and activities in that area. Below are excerpts from that interview.

Is your initiative moving forward primarily on the clinical side or the research side, at your organization?

We’re making advances that are being utilized across the organization. The College of Medicine used to be a wholly owned subsidiary of Drexel University. About four years ago, we merged with the university, and two years ago we lost our CIO to the College of Medicine. And now the IT group reports to the CIO of the whole university. I had started here 12 years ago, in the College of Medicine.


And some of the applications of this technology are clinical and some are non-clinical, correct?

Yes, that’s correct. Our data repository is used for clinical and non-clinical research. Clinical: College of Medicine, College of Nursing, School of Public Health. And we’re working with the School of Biomedical Engineering. And the College of Arts and Sciences, mostly with the Psychology Department. But we’re using Linguamatics only on the clinical side, with our ambulatory care practices.

Overall, what are you doing?

If you look at our EHR [electronic health record], there are discrete fields that might have diagnosis codes, procedure codes and the like. Let’s break apart some of that. Take our HIV Clinic: they might put down HIV as a diagnosis but mention hepatitis B only in the notes, without putting it down as a co-diagnosis; it’s up to the provider how they document. So here’s a good example: HIV and hepatitis C have frequent comorbidity. Our organization asked a group of residents to go in and look at 5,700 patient charts for patients with HIV and hepatitis C. Anybody in IT could say, we have 677 patients with both. But doctors know there’s more to the story. It turns out another 443 had HIV in the code and hep C mentioned in the notes. Another 14 had hep C in the code, and HIV in the notes.

So using Linguamatics, it’s not 5,700 charts that need review, but roughly 1,150: the 677 patients who had both codes, plus the roughly 460 who had one condition coded and the other mentioned only in the notes. Before Linguamatics, residents had to look at all 5,700 charts in cases like this one.
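To picture the mechanics of that narrowing, here is a minimal sketch. It is not Drexel’s Linguamatics pipeline; the chart structure, the ICD-10 codes, and the two-pattern lexicon are all illustrative assumptions. It only shows the shape of the filter described above: a chart joins the review set when each condition appears either as a coded diagnosis or as a mention in the free-text notes.

```python
# Illustrative sketch only -- not Drexel's actual Linguamatics pipeline.
# Chart structure, ICD-10 codes, and the regex lexicon are assumptions.
import re

HIV_CODE, HCV_CODE = "B20", "B18.2"              # hypothetical ICD-10 codes
HIV_RE = re.compile(r"\bHIV\b", re.IGNORECASE)
HCV_RE = re.compile(r"\bhep(atitis)?\s*C\b", re.IGNORECASE)

def candidate_charts(charts):
    """Return charts where HIV and hep C each appear as a code or in notes."""
    flagged = []
    for chart in charts:
        codes = set(chart["diagnosis_codes"])    # structured fields
        notes = chart["notes"]                   # unstructured text
        has_hiv = HIV_CODE in codes or bool(HIV_RE.search(notes))
        has_hcv = HCV_CODE in codes or bool(HCV_RE.search(notes))
        if has_hiv and has_hcv:
            flagged.append(chart)
    return flagged

charts = [
    {"diagnosis_codes": ["B20", "B18.2"], "notes": ""},                   # both coded
    {"diagnosis_codes": ["B20"], "notes": "History of hep C, treated."},  # hep C in notes only
    {"diagnosis_codes": ["B20"], "notes": "No liver complaints."},        # HIV only
]
print(len(candidate_charts(charts)))  # 2 -- the third chart drops out of review
```

The payoff is the subtraction Niemczura describes next: the value lies less in the charts the filter returns than in the thousands it lets residents skip.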

So this was a huge time-saver?

Yes, it absolutely was a huge time-saver. When you’re looking at hundreds of thousands or millions of patient records, the value might be not the ones you have to look at, but the ones you don’t have to look at. And we’re looking at operationalizing this into day-to-day operations. While we’re billing, we can pull files from that day and say, here’s a common co-morbidity—HIV and hep C, with hep C mentioned in those notes—and is there a missed opportunity to get the discrete fields correct?

Essentially, then, you’re making things far more accurate in a far more efficient way?

Yes, this involves looking at patient trials on the research side, while on the clinical side, we can have better quality of care, and more updated billing, based on more accurate data management.

When did this initiative begin?

Well, we’ve been working with Linguamatics for six or seven years. Initially, our work was around discrete fields. The other type of work we do has to do with free text. Our rheumatology department wanted to find out which patients had had particular tests done—they’re looking for terms in notes. When a radiologist does a report on your x-ray, it’s not like a test for diabetes, where a blood sugar number comes out; x-rays are read and interpreted. The radiologists gave us key words to search for: sclerosis, erosions, bone edema. There are about 30 words. The rheumatology department was recruiting patients who had particular x-rays or MRIs done and had these kinds of findings, so instead of reviewing everyone who had these x-rays done, they only had to look at the roughly 400 whose reports contained these terms.

So the rheumatology people needed to identify certain types of patients, and you needed to help them do that?

Yes, that’s correct. Now, you might say, we could do a word search in Microsoft Word; but the word “erosion” by itself might not help. You have to structure your query to be more accurate, and exclude certain appearances of words. And Linguamatics is very good at that. I use their ontology, and it helps us understand the appearance of words within structure. I used to be in telecommunications. When voice-over-IP came along, there was confusion: you’d hear “buy this stock,” when the message was, “don’t buy this stock.”
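To make “exclude certain appearances of words” concrete, here is a minimal negation-aware matcher in the spirit of the NegEx approach. It is a toy under stated assumptions: the term list is a three-item sample of the roughly 30 words the radiologists supplied, the negation cues are hypothetical, and Linguamatics’ actual query language and ontology handle far more (sections, hedging, term variants).

```python
# Toy negation-aware matcher (NegEx-style); not Linguamatics' query engine.
# Term list is a sample of the ~30 radiologist-supplied words; cues are assumed.
import re

FINDING_TERMS = ["sclerosis", "erosions?", "bone edema"]
NEGATION_CUES = ["no", "without", "negative for", "denies"]

FINDING_RE = re.compile(r"\b(" + "|".join(FINDING_TERMS) + r")\b", re.IGNORECASE)
# A negation cue within roughly three words before the finding term.
NEG_RE = re.compile(r"\b(" + "|".join(NEGATION_CUES) + r")\s+(\w+[ ,]+){0,3}$",
                    re.IGNORECASE)

def positive_findings(report_text):
    """Return finding terms that are asserted, not negated, in a report."""
    hits = []
    for match in FINDING_RE.finditer(report_text):
        preceding = report_text[:match.start()]
        if NEG_RE.search(preceding):      # e.g. "no bone edema" -> negated
            continue
        hits.append(match.group(0))
    return hits

print(positive_findings("Marked erosions at the MCP joints; no bone edema."))
# ['erosions'] -- "bone edema" is excluded because it is negated
```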

So this makes identifying certain elements in text far more efficient, then, correct?

Yes—the big buzzword is unstructured data.

Have there been any particular challenges in doing this work?

One is that this involves an iterative process. For someone in IT, we’re used to writing queries and getting them right the first time. This is a different mindset. You start out with one query and want to get results back. You find ways to mature your query; at each pass, you get better and better at it; it’s an iterative process.
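As a concrete, and entirely hypothetical, illustration of that iterative mindset (none of this is Drexel-specific), the sketch below runs a query, lets a reviewer flag false positives between passes, and folds each finding back in as an exclusion pattern before the next pass.

```python
# Hypothetical refinement loop: run a query, review a sample, add exclusions,
# and rerun until the reviewer stops finding false positives.
import re

def run_query(notes, include_re, exclusions):
    """Return notes matching the include pattern and no exclusion pattern."""
    return [n for n in notes
            if include_re.search(n) and not any(x.search(n) for x in exclusions)]

notes = [
    "Cortical erosion at the ulnar styloid.",
    "No erosion identified.",
    "Coastal erosion mentioned in patient's occupational history.",
]

include_re = re.compile(r"\berosions?\b", re.IGNORECASE)
exclusions = []

# Pass 1: everything matches; reviewer flags "No erosion" as a false positive.
print(len(run_query(notes, include_re, exclusions)))     # 3

# Pass 2: add an exclusion for the reviewed false-positive pattern and rerun.
exclusions.append(re.compile(r"\bno erosions?\b", re.IGNORECASE))
print(len(run_query(notes, include_re, exclusions)))     # 2

# Pass 3: exclude non-clinical uses ("coastal erosion") surfaced in review.
exclusions.append(re.compile(r"\bcoastal erosion\b", re.IGNORECASE))
print(len(run_query(notes, include_re, exclusions)))     # 1
```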

What have your biggest learnings been in all this, so far?

There’s so much promise—there’s a lot of data in the notes. And I use it now for all my preparatory research. And Drexel is part of a consortium here called the Partnership In Educational Research (PIER).

What would you say to CIOs, CMIOs, CTOs, and other healthcare IT leaders, about this work?

My recommendation would be to dedicate resources to this effort. We use this not only for queries, but to interface with other systems. And we’re writing applications around this. You can get a data set out and start putting it into your work process. It shouldn’t be considered an ad hoc effort by some of your current people.

 

 


