Artificial Intelligence: Beneficial to CDI and Coding Professionals, but It Also Puts Clinical and Coding Knowledge at Risk of Erosion

By Brian Murphy

MedPage Today asks: Could AI put clinical knowledge at risk?

I’d like to alter the question slightly: Could AI put clinical and coding knowledge at risk for mid-revenue cycle professionals?

I’m intrigued by all things AI. Not just for the obvious reasons (how it works, intellectual property/fair use implications), but also for what AI means for how we learn.

What is important to keep in our heads? What is important for us to know, either in fine, memorized detail or conceptually? And what is just trivia, probably better outsourced to a machine (unless you play trivia nights at the local Applebee’s, in which case it’s important)?

The capacities of the human brain are still not fully understood, but it is widely known that we are terrible multitaskers. Our memory is suspect. Our wetware is limited. Which raises the question:

Does a CDI professional need to have in his or her head the clinical indicators of gram-negative pneumonia, or can the machine serve up the information?

Does a coding professional need to know the definition of principal diagnosis?

Before you answer “of course,” consider: why not outsource this knowledge to machines?

One reason: those skills might then be lost forever.

The MedPage Today article expresses the concern that if pathologists are no longer required to evaluate basic histology (microscopic tissue structure, which AI can now assess), the skill to do so will be lost.

I’ve heard many “old timers” lament “encoder dependency”: allowing an encoder to lead you down the path of code selection without applying human critical thinking.

I think this problem is about to accelerate.

By outsourcing knowledge, are we at risk for losing our grasp of the fundamentals? Of losing sight of the “why”? Are we ceding too much authority to the machines?

What should a coder or CDI professional be expected to know?

I also worry we’re giving tech companies a free pass to take decades, even centuries, of accumulated human institutional knowledge and hand it back to us with a hefty licensing fee. JAMA shares this concern:

It is clear that AI applications are being developed with the speed of lightning, and from recent publications it becomes frightfully apparent what we are heading for and not all of this is good. AI may be capable of amazing performance in terms of speed, consistency, and accuracy, but all of its operations are built on knowledge derived from experts in the field. 

Looking at this from a glass-half-full perspective:

Is there too much ambiguity for a machine ever to make the same leaps and insights as a human CDI reviewer or coder? Will AI ultimately lead to more fulfilling roles, where we don’t have to rely on suspect memories or an advertisement-compromised Google search, but can use the machine as a human extender?

Can we answer any of these questions, or just engage in speculation as I’m doing now?

I think these conversations are worth having. So, please send your thoughts to me at brian.murphy@norwood.com.
