2244 Online

Do You Know Your Pain Score?

Wired, October 2021, "The Pain Algorithm," by Maia Szalavitz: "The latest weapon in the war on drugs is a predictive AI on your doctor's computer. It can determine who receives treatment for pain, and who doesn't."





Read Wired for all the details.


“I don’t think you are aware of how high some scores are in your chart,” a patient was told when in-hospital opioid treatment was reportedly about to be withheld. Although she had long been treated with opioids as an outpatient for a chronic, painful condition, she was discharged without any pain medication. Later, her gynecologist terminated their doctor-patient relationship because of “a report from the NarxCare database.” These databases are part of “prescription drug monitoring programs” (PDMPs) that “track scripts for certain controlled substances in real time” and have become “something like a seamless, national prescription drug registry” to which artificial intelligence (AI) has been applied.


NarxCare “purports to instantly and automatically identify a patient’s risk of misusing opioids.” Using machine learning, the system assigns a “comprehensive Overdose Risk Score” to each patient. Appriss, the company behind NarxCare, is “adamant that a NarxCare score is not meant to supplant a doctor’s diagnosis.” Yet for physicians, not consulting the tool or an equivalent carries its own risk, as more states “legally require physicians and pharmacists to consult them when prescribing controlled substances, on penalty of losing their license.” Even worse, “police and federal law enforcement...can also access this...information...to prosecute both doctors and patients.”
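To make the idea concrete, here is a purely hypothetical sketch, in Python, of how a weighted score might be computed from PDMP-style prescription-history data. NarxCare’s actual model, features, and weights are proprietary and undisclosed; the feature names, weights, and 0-999 scale below are assumptions made only for illustration, but they show how a medically complex patient seen by several specialists could end up with a high score without any misuse.

# Purely illustrative sketch: not NarxCare's actual algorithm, which is proprietary.
# The feature names, weights, and 0-999 scale below are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class PrescriptionHistory:
    prescriber_count: int          # distinct prescribers in the lookback window
    pharmacy_count: int            # distinct pharmacies used
    overlapping_opioid_days: int   # days with overlapping opioid prescriptions
    sedative_co_prescribed: bool   # e.g., a benzodiazepine also on file

def overdose_risk_score(h: PrescriptionHistory) -> int:
    """Return a 0-999 score from a hypothetical weighted sum of PDMP features."""
    score = (
        60 * h.prescriber_count
        + 50 * h.pharmacy_count
        + 4 * h.overlapping_opioid_days
        + (120 if h.sedative_co_prescribed else 0)
    )
    return min(score, 999)

# A patient with a complex condition may legitimately see several specialists and
# pharmacies; to a simple scoring rule this looks like "physician shopping."
patient = PrescriptionHistory(prescriber_count=4, pharmacy_count=2,
                              overlapping_opioid_days=10, sedative_co_prescribed=True)
print(overdose_risk_score(patient))  # prints 500 under these made-up weights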


Not surprisingly, such a system has flaws, including flagging cancer patients and others with medically complex diagnoses. These patients may appear to be physician shopping because they often require “multiple specialists” during the course of their care and treatment. The programs reportedly may also scan a patient’s medical records for diagnoses like depression and post-traumatic stress disorder as “variables that could impact risk assessment.” Researchers point out that these programs are proprietary and are not transparent about how they were developed and are maintained. Tools like NarxCare are essentially “unaccountable...quasi-medical tools” that have not “been validated as safe and effective by peer-reviewed research.” For individual patients, the impact is significant in many ways, including feeling like suspects in an “inquisition” whenever they seek medical care.


The author points out that the medical community’s approach to treating pain has ebbed and flowed between highly restrictive and more liberal. The current pullback in prescribing has only led “many Americans [to seek] substances like illegally manufactured fentanyl.”


As it turns out, “70 percent of adults have taken medical opioids, yet only 0.5 percent suffer from what is officially labeled ‘opioid use disorder,’” AKA “addiction.” Yet tools like NarxCare can be flawed in identifying those who actually struggle with opioids, and this can lead to discrimination against patients with a history of sexual abuse, as well as women and minorities.


Patients are striking back, creating advocacy groups like the “Don’t Punish Pain Rally,” but they have had little success in countering unjust classifications. This inability to counter such classifications affects patients’ ability “to return to work after injuries” and leaves them “struggling to get pain treatment.”


The root cause of these flawed AI tools lies in the complexity of developing and validating such machine-learning systems. It is suggested that NarxCare and similar programs be required to seek and obtain FDA approval before they are used in medical care.



