
No law prohibits the collection of such data or its use in the exam room. The opioid epidemic, which kills about 130 Americans a day, is the rationale for risk scoring. | AP Photo/Patrick Sison

Health care

The information used to assess patients' risk of opioid overdose is unregulated and used without patient consent.

By MOHANA RAVINDRANATH

02/03/2019 06:56 AM

Companies have begun selling "risk scores" to doctors, insurers and hospitals to identify patients at risk of opioid addiction or overdose, without patient consent and with little regulation of the kinds of personal information used to create the scores.

While the data collection is intended to help doctors make more informed decisions about prescribing opioids, it could also lead to some patients being blacklisted and denied the medications they need, according to patient advocates.


Over the past year, powerful companies such as LexisNexis have begun amassing insurance claims, digital health records, housing records and even data about a patient's friends, family and roommates, without telling the patient they are accessing the information, and creating risk scores for health care providers and insurers. Health insurance giant Cigna and UnitedHealth's Optum are also using risk scores.

There are no guarantees that the algorithms are accurate, and "really no protection" against their use, said Sharona Hoffman, a professor of bioethics at Case Western Reserve University. Overestimating risk could lead health systems to focus their energy on the wrong patients; a low risk score could cause a patient to fall through the cracks.

No law prohibits collecting such data or using it in the exam room. Congress has not addressed the sweeping, intrusive collection of data in health care, an area where technology is evolving faster than government and society can keep up.

"Consumers, clinicians and institutions need to understand that personalized health is a type of surveillance," said Eric Perakslis, a professor at Harvard University. "There is no way to get around it, so it must be recognized and understood."

The deadly opioid epidemic, which kills about 130 Americans a day, is fueled in part by the overprescribing of legal painkillers. The Trump administration and Congress have poured billions of dollars into fighting the epidemic and have not shied away from intrusive methods of combating it. In its national strategy, released Thursday, the White House Office of National Drug Control Policy urged that doctors check each patient against a prescription drug database.

Health care providers have a legitimate interest in knowing whether a patient in pain can safely take opioids, at what doses and for how long – and which patients are at high risk of addiction or overdose. Data companies pitch their predictive formulas, or algorithms, as tools that can help make the right decisions.

The practice scares some health care safety advocates. While the scoring is meant to help doctors decide whether to prescribe opioids to their patients, it could pigeonhole people without their knowledge and give doctors an excuse to keep them from "getting the drugs they need," said Lorraine Possanza of the ECRI Institute.

The algorithms assign each patient a number on a scale from zero to one, indicating their risk of addiction if prescribed opioids. The risk predictions sometimes go directly into patients' health records, where clinicians may use them, for example, to deny or limit a patient's request for a painkiller.
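None of the companies will say what goes into their formulas. Purely as an illustration of the mechanics of a zero-to-one score, here is a minimal sketch of the kind of model that produces one; the features, training data and model choice are invented for this example, not drawn from any vendor's product.

```python
# Hypothetical sketch only: how a zero-to-one risk score might be produced.
# The features, training data and model choice are invented for illustration;
# none of the companies in this article disclose their actual formulas.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented features per patient: [age, past substance-use diagnosis (0/1),
# opioid prescriptions in the past year, lives with a known user (0/1)]
X_train = np.array([
    [34, 0, 1, 0],
    [52, 1, 6, 1],
    [45, 0, 2, 0],
    [29, 1, 8, 1],
])
y_train = np.array([0, 1, 0, 1])  # 1 = later diagnosed with opioid use disorder

model = LogisticRegression().fit(X_train, y_train)

new_patient = np.array([[41, 0, 4, 1]])
risk_score = model.predict_proba(new_patient)[0, 1]  # a number between 0 and 1
print(f"risk score: {risk_score:.2f}")
```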

Doctors can share the scores with their patients if they choose to, the data collectors say. "We're not doing anything to advocate for a particular opinion," said Brian Studebaker of the actuarial firm Milliman, one of the risk-scoring companies.

However, according to addiction experts, predicting who is at risk is an inexact science. Past addiction is pretty much the only clear red flag when a doctor is considering prescribing opioid pain relievers.

But several companies POLITICO spoke with are already selling the predictive technology. None would name their customers. Nor would they disclose exactly what goes into the mathematical formulas they use to create their risk scores – that information is the "secret sauce" they sell.

Congress has shown some interest in data privacy. A series of hearings last year focused on data breaches and suspect data-sharing practices at big companies like Facebook, but did little to probe the myriad ways data crunching affects health care and health privacy.

Sen. Brian Schatz (D-Hawaii), who last year co-sponsored legislation that would bar companies from using individuals' data in harmful ways, says consumers "expect that the data they provide to websites and apps is not used against them." The HIPAA privacy law of the late 1990s limited how doctors share patient information; Schatz says online companies should be forced to do the same.

A bill from Sen. Ed Markey (D-Mass.), S. 1815 (115), would require data brokers to be more transparent about what they collect, but neither his bill nor Schatz's deals specifically with health care data, a field in which separating the harmful from the benign can be particularly tricky.

According to data governance expert Martin Tisné, the use of big data here implicates human rights beyond mere privacy violations. In a recent Technology Review essay calling for a data Bill of Rights, he argued for the right to be protected from "unreasonable surveillance" and from unfair discrimination on the basis of data.


Risk scores may be "the way of the future"

Research into opioid risk factors is nascent. The University of Pittsburgh received an NIH grant last year to determine whether computer programs incorporating both Medicaid claims and clinical data predict risk more accurately than programs based on claims alone.

The risk scores could be useful if they help clinicians open frank conversations about the unique circumstances that might make a patient more vulnerable to opioid use disorder, said Yngvild Olsen, a board member of the American Society of Addiction Medicine.

But the algorithms may rely on inaccurate public data, and they can disempower patients, leaving them in the dark about the Big Brother-ish systems evaluating them. Another major challenge, said Case Western's Hoffman, is making sure the predictions don't override clinicians' instincts or reinforce biases.

It's hard to envision what solid protection against misuse of predictive algorithms would look like, she said. One approach might be to revise HIPAA, the health privacy law, to bar groups from profiting off health data or the algorithms that crunch it. But that would not stop technology companies from making predictions based on whatever data they can access.

Algorithms that predict health risks are probably "the way of the future," she said. "I'm afraid we have to learn to live with them … but be better educated."

Companies using predictive analytics to address the opioid crisis include the insurer Cigna, which announced last year that it was expanding a program to detect patients at risk of overdose. The insurer has a "number of tools to better understand" patients, said Cigna's Gina Papush. Optum has also begun stratifying patients by opioid risk; the company said a spokesperson was not available to comment.


Milliman won an FDA innovation challenge for its plan to create an artificial intelligence algorithm that predicts whether patients will be diagnosed with opioid use disorder within the next six months. The company proposes giving payers a list of high-risk patients; the payers can then pass relevant information on to clinicians.

Milliman has signed initial contracts with some accountable care organizations. It assigns patients a risk score between zero and 1, and also benchmarks them against other patients.
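Milliman has not said how it benchmarks patients against one another. One plausible mechanic, sketched below with fabricated numbers, is a simple percentile rank of one patient's score within a scored population.

```python
# Hypothetical sketch only: ranking one patient's 0-to-1 score against a
# scored population with a percentile rank. All numbers are fabricated.
from bisect import bisect_left

population_scores = sorted([0.05, 0.12, 0.18, 0.22, 0.31, 0.44, 0.58, 0.71, 0.83, 0.90])
patient_score = 0.58

# Fraction of the population with a strictly lower score
percentile = bisect_left(population_scores, patient_score) / len(population_scores)
print(f"patient scores higher than {percentile:.0%} of the population")
```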

HBI Solutions, another company, uses a mathematical formula built on de-identified claims data, said Senior Vice President Laura Kanov. Payers or providers can run the formula on their own patient data. Unlike some companies, HBI surfaces the reasoning behind each risk score, she said.
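Kanov did not describe how HBI exposes that reasoning. For a simple logistic model like the sketch above, one way to show "why" is to list each feature's contribution to the score; this illustrative snippet does that, with invented coefficients and feature names.

```python
# Hypothetical sketch only: listing each feature's contribution to a
# logistic risk score. Coefficients and feature names are invented.
import math

features = {"age": 41, "past_diagnosis": 0, "rx_count_past_year": 4, "household_user": 1}
coefficients = {"age": 0.01, "past_diagnosis": 1.2, "rx_count_past_year": 0.25, "household_user": 0.9}
intercept = -3.0

# Each feature's contribution to the log-odds, largest magnitude first
contributions = {name: coefficients[name] * value for name, value in features.items()}
logit = intercept + sum(contributions.values())
score = 1 / (1 + math.exp(-logit))  # squash log-odds to a 0-to-1 score

print(f"risk score: {score:.2f}")
for name, contribution in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {contribution:+.2f}")
```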

LexisNexis sells Medicare plans a tool that identifies patients who may already have opioid use disorder. Someone might be at higher risk if their loved ones or roommates abuse opioids, or if they use a pharmacy known to be a pill mill, said LexisNexis' Shweta Vyas. LexisNexis can establish "relatively strong links" between people based on public records showing they live at the same address, she said. If both parties are enrolled in the same health plan, the software can find patterns "in the overall behavior of those two people."
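Vyas did not detail the linking method beyond the address matching. A bare-bones version of that idea, with fabricated records, might look like this:

```python
# Hypothetical sketch only: flagging people as linked because public records
# place them at the same address. All records here are fabricated.
from collections import defaultdict

records = [
    {"person": "patient_a", "address": "12 Elm St, Springfield"},
    {"person": "roommate_b", "address": "12 Elm St, Springfield"},
    {"person": "person_c", "address": "99 Oak Ave, Shelbyville"},
]

by_address = defaultdict(list)
for record in records:
    # A real system would normalize addresses ("St" vs "Street") before matching
    by_address[record["address"].lower()].append(record["person"])

links = [people for people in by_address.values() if len(people) > 1]
print(links)  # [['patient_a', 'roommate_b']]
```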

Sally Satel, a psychiatrist and fellow at the American Enterprise Institute, warned that risk scores could reinforce what she sees as a misconception that overprescribing doctors are the main drivers of the opioid crisis. A patient who has been in a serious car accident could exceed the recommended duration of opioid use because of their mental and emotional state, and not simply because a doctor prescribed too much, she said.

"I do not know how much an algorithm can look at all these much more personal dimensions," she said. "I would love this study to be studied more instead of being sold."

Arthur Allen contributed to this report.

