
ChatGPT Is More Like Penicillin Than Precision Medicine

— It has useful healthcare applications but can't replace human sensibility

An artificial intelligence (AI) arms race was just kicked into high gear with the release of the free language model called ChatGPT. The chatbot has already passed 100 million users, which makes it one of the fastest consumer app adoptions in history, with a nearly $30 billion company valuation to go along with it.

Like providers and investors in healthcare, the ChatGPT chatbot is trained on extensive amounts of data to learn patterns that help it predict what comes next in a sequence. But as with its AI predecessors, questions remain about the app's capacity for empathy, ethics, and critical thinking.
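To make that "predict what comes next" mechanism concrete, here is a toy Python sketch of next-word prediction from simple bigram counts. This is my illustration, not anything resembling ChatGPT's actual architecture: large language models learn the same prediction task with neural networks trained on vastly more text.

```python
from collections import Counter, defaultdict

# Toy "predict what comes next" model built from bigram counts.
corpus = ("the patient was treated with antibiotics and "
          "the patient recovered after the treatment").split()

# For each word, count which words were observed to follow it.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return the continuation seen most often after `word` in training."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else "<unknown>"

print(predict_next("the"))      # -> "patient" ("the patient" appears twice)
print(predict_next("treated"))  # -> "with"
```

A model like this can only parrot patterns it has seen; scale the idea up by many orders of magnitude and you get fluent answers, but the same caveat about parroting applies.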

As a health policy and management professor, healthcare investor, health company board member, author, and volunteer provider in healthcare, I was curious where and how ChatGPT would be helpful to my portfolio of work. I downloaded the app, took the tutorial, and started hacking away at healthcare use cases.

My first warm-up question was: "Who is the best orthopedic surgeon in the U.S.?" ChatGPT was appropriately careful, answering, "This is a difficult question to answer as there are many excellent orthopedic surgeons, so it's best to consult with your primary care physician."

I asked for a review of the latest guidelines for treating community-acquired pneumonia (CAP); the response was much more targeted and accurate, but it provided no sources.

ChatGPT was quick to generate answers to direct questions but grappled with going deep on the real-world application of those answers. It was less accurate when probed about the stages of chronic kidney disease. There wasn't an option to give feedback or add links to peer-reviewed references to help with quality improvement. ChatGPT wasn't helpful with deal sourcing or valuation work. In fact, the app wouldn't even put a valuation on itself.
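For readers curious what such probes look like outside the chat window, below is a minimal sketch that sends a clinical question to OpenAI's public chat completions REST endpoint. The model name and prompt are illustrative assumptions, you need your own API key, and, as my testing suggests, any answer still requires human verification.

```python
import os
import requests

# Minimal sketch: ask a chat model a clinical question via OpenAI's
# REST API. Model choice and prompt are illustrative only; responses
# are unsourced text, not vetted medical guidance.
API_KEY = os.environ["OPENAI_API_KEY"]  # assumes you have set this key

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "gpt-3.5-turbo",  # assumption: any available chat model
        "messages": [
            {"role": "user",
             "content": "Summarize the latest guidelines for treating "
                        "community-acquired pneumonia, with sources."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```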

In healthcare, clinical judgment, ethics, and treating patients as individuals are critical parts of patient care. This is also why so many tech companies have failed in healthcare: they don't account for the necessary human interaction, which limits the success of their products. Many seniors living alone at home need a human to provide care, and to literally drive them to care, rather than a product that delivers only a component of care electronically.

The most concerning result (or exciting, depending on the user) came when I plugged in a prior exam from my "Business of Healthcare" class, and ChatGPT passed the exam (binary multiple-choice) in less than 5 minutes.

At the end of the day, we want students to grow intellectually and distill information so they can apply their learnings to the real world. Using AI to pass an exam is far less valuable than a graduate's ability to comprehend data and use judgment to apply it ethically and correctly. To be fair, there are subjects in school, like algebra, chemistry, and pathophysiology, that do require some level of memorization. The concern is the short-circuiting of the academic foundation required for higher scholarly learning and advancement in certain fields.

All in all, my experience with ChatGPT was analogous to penicillin: a powerful invention, widely used to treat a range of problems (infections) but not useful in all cases (some bacterial strains or viruses), and with a risk of overuse and possible negative effects for the user. If automated responses are acted upon without consideration for generalization, bias, or transparent authorship, it's a slippery slope. In fact, today I received an email from an employer stating that it does not support ChatGPT and is "looking into tools to detect its use" to ensure content fidelity.

The potential benefits of ChatGPT or similar applications in healthcare are wide-ranging. It takes an average of 17 years for research evidence to reach clinical practice; more efficient use of data could accelerate innovation and improve care delivery while saving the healthcare system billions a year.

On the safety side, improved accuracy in e-prescribing medication orders could help decrease medication errors, which harm at least 1.5 million people annually, with morbidity and mortality costs running $77 billion per year. In terms of experience, administrative friction around appointment booking and billing processes is ripe for improvement.

The field of AI and applications like ChatGPT have an opportunity to help healthcare users (researchers, providers, students, caregivers, investors) construct and query data to get answers faster, which is on the right side of healthcare's goals.

Healthcare has a dichotomous need to address mundane administrative tasks (billing) while advancing R&D analytical firepower (curing) at the bedside. AI can be a powerful partner in enhancing efficiency and reducing cost, but I predict it will never replace the critical need for judgment, ethics, and sensibility.

To date, the best results in healthcare come from a symbiotic partnership of humans and technology.

FitzGerald is an adjunct professor with the Columbia University Mailman School of Public Health, and a private equity investor.

Disclosures

FitzGerald has no investments in AI chatbots.