Louisville Medicine Volume 73, Issue 8 | Page 15

for “lack of medical necessity.”1
The error rates on these tools can be staggering. Federal class-action lawsuits filed against major insurers allege error rates as high as 90% when these automated denials are actually appealed by a human. Yet the burden of proof shifts entirely to your practice to fight the machine, one claim at a time.2
This is not “utilization management.” It is Practice by Probability, and it is dangerous.
The Liability Trap
The second threat is quieter but equally deadly. Who is responsible when the AI gets it wrong?
If a clinical decision support tool suggests a medication dosage that turns out to be lethal, or a radiology AI misses a nodule that a human might have caught, current malpractice frameworks generally point the finger squarely at you.3
We are entering a “Liability Gap.” Hospitals and insurers are pushing us to use these efficiency tools to see more patients faster. But when the tool fails, the algorithm doesn’t get sued. The physician does. We are being asked to be the “insurance policy” for a technology we didn’t build and often don’t fully understand.
Action Plan: What We Do Now
We cannot be Luddites. AI is not going away, and when used correctly, it can be a powerful ally. But we must set the rules of engagement now, before they are set for us by Silicon Valley or Wall Street.
Here is my roadmap as President of GLMS in 2026:
1. Legislate “Human in the Loop” Protections: I want GLMS and KMA to work directly with the Kentucky legislature’s Artificial Intelligence Task Force to introduce a simple but non-negotiable standard: No medical claim in the Commonwealth of Kentucky should be denied solely by an algorithm. I will be calling for a “Human in the Loop” law requiring that any adverse determination be reviewed and signed by a licensed physician of the same or similar specialty. If a robot wants to deny my sinus surgery, it needs to find a human otolaryngologist willing to put their license on the line to agree with it.
2. Demand “Augmented,” Not “Artificial”: We must align with the American Medical Association’s shift in terminology to “Augmented Intelligence.”4 This is a critical framing war. Technology should augment human decision-making, not replace it.
In your own practices, ask tough questions of your vendors. If your EMR is rolling out an AI module, ask for the “Model Card,” the transparency document that explains what data the AI was trained on. Was it trained on a diverse patient population similar to Louisville’s? Or was it trained on a homogeneous dataset that will bake bias into your care?
3. Report the “Weird” Denials: This is my direct ask of you today. We need data to win in Frankfort.
If you receive a denial that defies clinical logic, one that feels like it ignored the chart entirely, send it to us. GLMS would like to compile a dossier of “Algorithmic Absurdities” to present to the Senate Banking & Insurance Committee. One story is an anecdote; 50 stories is a pattern of practice.
Let’ s Defend the Art of Medicine from AI Overreach
AI should be a tool that lets us spend more time looking at patients and less time looking at screens. It should not be a tool that allows an insurance company to deny care at the speed of light.
The technology is artificial. The consequences for our patients are very real.
Let’ s ensure that in Kentucky, the human physician remains the ultimate authority in the room.
References
1. Ross C. “UnitedHealth faces class action lawsuit over algorithmic care denials in Medicare Advantage plans.” STAT News. November 14, 2023.
2. Estate of Gene B. Lokken et al. v. UnitedHealth Group Inc. et al., No. 0:23-cv-03514 (D. Minn. filed Nov. 14, 2023).
3. Price WN II, Gerke S, Cohen IG. “Liability for Use of Artificial Intelligence in Medicine.” In: Solaiman B, Cohen IG, eds. Research Handbook on Health, AI and the Law. Edward Elgar Publishing; 2023. Available at SSRN: https://ssrn.com/abstract=4115538 or http://dx.doi.org/10.2139/ssrn.4115538.
4. American Medical Association. “Augmented Intelligence in Health Care.” AMA Policy H-480.940. Modified 2024.
Dr. Higgins is a rhinologist in private practice at Kentuckiana ENT, a division of ENTCC, and President and Chairman of the Board of ENT Care Centers (ENTCC).
January 2026