Now that clinical settings are generating so much data, machine learning is starting to be deployed to fine-tune predictive analytics involving an increasing number of care settings and conditions. For instance, researchers are using machine learning to build and evaluate prediction models to identify which patients might be at increased risk of suicide and when that risk is reduced or elevated.
Suicide remains the 10th leading cause of death in the United States, accounting for 41,000 deaths in 2013. Suicide attempts lead to 600,000 emergency department visits and 200,000 hospitalizations annually in the United States.
In an Oct. 27 presentation to the NIH Collaboratory, Gregory Simon, M.D., M.P.H., a psychiatrist and senior investigator at the Kaiser Permanente Washington Health Research Institute, described efforts to use population-based data from large health systems to develop evidence-based suicide attempt risk calculators for mental health and primary care clinicians.
Simon began by explaining that many provider organizations, including Kaiser, have already developed standard workflows for creating personal safety plans for patients who scored high on a questionnaire item about self-harm (Item 9 of the PHQ-9 depression questionnaire). He noted that Kaiser providers were held accountable, with salary partly tied to following systematic care processes for patients with suicidal ideation.
But on closer examination, researchers found that the responses to Item 9 were not as predictive as they would like in terms of identifying people who actually ended up harming themselves or attempting to. “We started to look at the data from people completing questionnaires, and Item 9 was not as accurate as we would have hoped,” he said.
Thus began an effort to look at many more indicators to see if any of them, or combinations of them, could be more predictive. Seven health system sites — five Kaiser sites as well as the Henry Ford Health System in Michigan and the Minnesota-based HealthPartners Institute for Education and Research — are contributing data to the project on about 350 predictors, ranging from socio-demographic characteristics to psychiatric diagnoses, co-occurring substance use disorders, co-occurring medical illness, outpatient and inpatient treatment history, and history of suicidal behavior.
The study is looking at about 20 million visits by 3 million people. The work is being done by extracting data from research data warehouses created using the Healthcare Systems Research Network common data model, Simon explained.
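The core data-engineering step described above — turning coded records from a common data model into a fixed set of numeric predictors — can be illustrated with a minimal sketch. The field names, predictor choices, and scaling below are hypothetical stand-ins for illustration only, not the Healthcare Systems Research Network common data model's actual schema or the study's real predictor set.

```python
# Illustrative only: hypothetical fields, not the actual HCSRN schema.

def extract_predictors(visit):
    """Flatten one visit record into a fixed-order numeric feature vector."""
    return [
        1 if visit.get("sex") == "F" else 0,              # demographic flag
        visit.get("age", 0) / 100.0,                      # crude age scaling
        1 if "depression" in visit.get("dx", []) else 0,  # psychiatric diagnosis
        1 if "substance_use" in visit.get("dx", []) else 0,
        visit.get("prior_inpatient_stays", 0),            # treatment history
        visit.get("prior_self_harm_events", 0),           # suicidal behavior history
        visit.get("phq9_item9", 0),                       # self-harm item, scored 0-3
    ]

visit = {"sex": "F", "age": 42, "dx": ["depression"],
         "prior_inpatient_stays": 1, "prior_self_harm_events": 0,
         "phq9_item9": 2}
features = extract_predictors(visit)
```

At the study's scale (roughly 350 predictors over 20 million visits), this flattening is what makes a single model trainable across all seven sites: each site's warehouse emits the same fixed-order vector regardless of its local source systems.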
Although they haven’t yet published their results, the researchers have found that the machine learning models for suicide risk seem to offer statistically significant improvements over previous methods of identifying suicide risk and “suicidal behavior seems more predictable than some other adverse medical events,” Simon said.
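Claims like Simon's — that a multi-predictor model discriminates better than a single questionnaire item — are typically quantified with the area under the ROC curve (AUC): the probability that a randomly chosen patient who went on to self-harm is scored above a randomly chosen patient who did not. A minimal, self-contained sketch; the scores and outcomes below are made up for illustration and are not study data.

```python
def auc(scores, labels):
    """Rank-based AUC: probability a random positive case outscores a
    random negative case, counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 0, 0, 0]                  # 1 = later suicide attempt (synthetic)
item9_only = [2, 0, 2, 1, 0]              # single-item score misses one case
composite = [0.9, 0.7, 0.4, 0.3, 0.2]     # hypothetical multi-predictor score

print(auc(item9_only, labels))  # 0.5 — no better than chance on this toy data
print(auc(composite, labels))   # 1.0 — perfect ranking on this toy data
```

An AUC of 0.5 is chance-level discrimination and 1.0 is perfect ranking; a real improvement claim would compare the two AUCs on held-out data, with confidence intervals, rather than on five synthetic rows.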
If it is true that the machine-learning algorithm offers an improvement in terms of predictive value, what does that mean for clinical practice?
“We would hope to use these to develop new standard workflows,” Simon said. The scores could also trigger new standard workflows outside of or between visits. If someone fails to attend a visit or cancels one, that could present an opportunity for more aggressive outreach, he said.
In the next few months, Kaiser Permanente Washington is planning to upload the risk scores generated by the algorithm into its EHR at the point of care, Simon said.
Following Simon’s presentation, Don Mordecai, M.D., Kaiser Permanente National Leader for Mental Health and Wellness, said: “The idea that machine learning is about to be launched in our healthcare system is tremendously exciting. It could really turn the system on its head. Ever since healthcare was something humans did, the patient has had to hold up his hand and the system would respond. The idea here is that if you have rich enough data you can instead predict who may need help and do outreach and move care upstream. That is a goal for healthcare in general, whether it is dealing with cancer or a person heading toward self-harm. I am excited and thrilled to see how clinicians will use it.”