Auto-Assisted Coding without Human Oversight a Massive Compliance Risk, Legal Liability

By Brian Murphy

There’s been increasing discussion about the safe and ethical use of AI in healthcare. This week the House Energy and Commerce Subcommittee on Health held a hearing to explore how hospitals are using the technology. See the AHA link below.

You won’t find regulations outlining the safe use of AI in healthcare, though the Biden administration just released an executive order that puts us on that path. It’s new technology, and we’re still figuring it out.

So if you’re a coding or CDI leader using it, or about to, what should you do?

My suggestion: Consider the opinions of people who are elbow-deep in developing AI-assisted coding products. Like the Executive Vice President of Research and Development at Epic.

Epic is building an AI-assisted medical coding tool. While not yet ready for commercial use, it sounds close to rollout. See the Healthcare IT News article linked below. I recommend listening to the interview; it’s only 12 minutes, with the coding discussion around the 10:38 mark.

I like that this new tool ASSISTS with medical coding. It does not auto-assign codes.

The AI lists potential diagnosis and procedure codes based on the clinical documentation, presents the supporting documentation alongside each suggested code, and lets the coder evaluate the information and then code.
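As described, the tool proposes codes with supporting evidence and leaves the final decision to a person. A minimal sketch of that suggest-then-review loop in Python (the names and data structures here are hypothetical illustrations, not Epic's actual design):

```python
from dataclasses import dataclass, field

@dataclass
class CodeSuggestion:
    """One AI-suggested code plus the documentation that supports it."""
    code: str                                      # e.g., an ICD-10-CM code
    description: str
    evidence: list = field(default_factory=list)   # supporting excerpts from the record

def review_suggestions(suggestions, coder_decides):
    """Human-in-the-loop review: nothing is finalized without an explicit
    per-code decision from the coder (represented here by `coder_decides`)."""
    accepted = []
    for s in suggestions:
        # The coder sees each suggested code alongside its supporting
        # documentation and makes the call; the AI proposes, it never assigns.
        if coder_decides(s):
            accepted.append(s.code)
    return accepted
```

In practice `coder_decides` would be a real person working a review queue; here it simply stands in for that judgment, e.g. rejecting any suggestion that lacks supporting documentation.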

This strikes me as doing it the right way.

More striking is the comment of Epic’s EVP of R&D, who said:

“I have a few rules of thumb, the first is, is the AI tool entirely automating the process? Or is there a human in the loop who makes the decision? Be wary of the former, it needs to hit a much higher bar.”

Auto-assignment of codes without human intervention is a bad idea. If a vendor is selling you this idea as a way to save your hospital money, run in the other direction.

Why not just auto-assign codes? Because the documentation is often insufficient. As in, not in the record at all. Or the documentation is conflicting, which makes interpretation subjective.

One example is principal diagnosis (PDX) assignment, which is critical for proper DRG assignment and payment. PDX assignment is complex and may require input from other clinicians who are part of the care team. Another example is determining present on admission (POA) status, which is not always clear and is limited by the available information.

Humans, not machines, can call an attending physician (or another clinician) for added context. Humans can wait for additional test results, labs, and so on.

AIs never say, “I don’t know the answer, I’ll wait.”

Humans, not machines, should be making these calls.

If Epic’s EVP of R&D is saying so, that’s enough for me.

A recent episode of Off the Record with Kory Anderson touched on this topic. Future episodes will cover it in further detail.

AI-assisted CDI and coding solutions are important to consider for many reasons. One is the future of medical coding jobs. Another is legal liability.

While I can’t prove this, it seems to me that your hospital, not the machine manufacturer or AI programmer, will be held liable for inaccurate code assignments.

I realize this might sound like a mere appeal to authority. But here’s some more logic behind my reasoning.

Is it OK to take your hands off the wheel of a self-driving Tesla? Here in the U.S. the laws are murky and vary by state. But in Washington, D.C., for example, an autonomous vehicle may operate on a public roadway, provided that the vehicle:

  1. Has a manual override feature that allows a driver to assume control of the autonomous vehicle at any time;
  2. Has a driver seated in the control seat of the vehicle while in operation who is prepared to take control of the autonomous vehicle at any moment.

The same principle (very likely) applies to coding. I’m hopeful we’ll see more regulations and official guidance in the near future.

But for now, computer assisted coding solutions, though increasingly sophisticated and AI powered, still need a conscious human being at the wheel.

References

AHA, “House subcommittee hearing explores AI use in health care”:

https://www.aha.org/news/headline/2023-11-29-house-subcommittee-hearing-explores-ai-use-health-care

Healthcare IT News, “EHR giant Epic’s success with AI, detailed by its EVP of R&D”:

https://www.healthcareitnews.com/video/ehr-giant-epics-success-ai-detailed-its-evp-rd
