
Software Driving Medical Decision-Making Is Largely Unregulated by the FDA: "First, Do No Harm!"

Scientific American, January 2022, pp. 10-11 | FORUM | Commentary on Science in the News from the Experts | "Medical Algorithms Need Better Regulation": "Many do not require FDA approval, and those that do often do not undergo clinical trials." By Soleil Shah and Abdul El-Sayed


Read Scientific American for all the details.


Summary by 2244




Medical algorithms, also known as clinical decision support software (CDS), are widely used in American health care to aid in making diagnoses, offering prognoses, and monitoring patients. Since the inception of many CDS tools, problems have been noted, and these are detailed in the article. Some of these "issues might be extending the current pandemic as well; a 2021 review of dozens of machine-learning algorithms designed to detect or predict the progression of COVID found none clinically useful."


How can this happen?


As it turns out, CDS can legally evade rigorous evaluation through streamlined processes long in place and through recent legislation aimed at facilitating innovation. The oldest such pathway is the FDA's 510(k) application process, which speeds approval if a device is considered in some way equivalent to a previously approved device. This holds true even if the predicate device has since been recalled. The recent legislation, the "21st Century Cures Act of 2016…excluded certain health-related software from the definition of a medical device…[thereby allowing]...CDS…to evade FDA oversight altogether."


What do the authors recommend to remediate this issue?


"First, Congress must lower the threshold for FDA evaluation." To use the 510(k) pathway, the "definition of equivalency…[to a predicate device]...should be narrowed."


"Second, Congress should dismantle systems that foster health-care workers' overreliance on medical algorithms." A notable example has been requiring clinicians to review a CDS-generated substance-abuse risk score "prior to prescribing opioids." The authors advocate returning ultimate decision-making in this application to physicians, without fear of legal liability if their assessment of the patient differs from the CDS.


"Third, Congress must establish a system of accountability for technologies that can evolve over time." It's widely known that CDS output, for example in CT image analysis, can vary from device to device of the same make, let alone across newer models.


What’s currently being done?


Legislation aimed at strengthening requirements, such as the "Algorithmic Accountability Act of 2019," is being discussed in Congress. "The FDA…[last year]...released its first action plan tailored specifically to these technologies." The authors conclude, "If a [CDS] decision affects a patient's life, 'do no harm' must apply, even for computer algorithms."

