
What Can ChatGPT Do For Your Practice?

— Impressive AI text generator still has notable limitations
Last Updated December 29, 2022

At the end of November, a new artificial intelligence chatbot was released for public use. Just 2 weeks later, Clifford Stermer, MD, found an intriguing new use for the technology in his rheumatology practice, which he shared on TikTok.

In the video, Stermer typed a prompt into ChatGPT, developed by OpenAI, requesting that the chatbot write a letter to a medical insurance company to explain why a patient with systemic sclerosis should be approved for an echocardiogram. Seconds later, on camera, the program started writing a full letter, complete with appropriate heading and formatting.

Stermer, who owns One Rheumatology in Palm Beach Gardens, Florida, narrated the writing of the letter during the TikTok video. He highlighted the most impressive elements, such as the explanation of the treatment and use of references. He concluded the video by saying, "Amazing stuff. Use this in your daily practice, okay. It will save time. It will save effort."

That video went viral almost immediately, he said, and now has more than 130,000 views. It was widely shared on other social media sites as well, including Twitter, where it prompted conversation about other uses of this tool in healthcare.

Beyond the ChatGPT Hype

Since its release on November 30, ChatGPT has become one of the internet's most referenced and tested programs.

"I was talking to some other doctors about it online, and there was a question about how can we use this to our advantage," Stermer told ѻý. He said that insurance denial letters were one of the first things that came to his mind when he learned about the program. "This seems like one of the tasks that we need to do regularly that it may be able to help with."

According to the OpenAI website, ChatGPT was built on a language model trained to produce text, known as GPT-3.5. It was designed to create texts, like the insurance letter, using a method that draws on texts previously demonstrated by humans "to guide the model toward desired behavior."
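For clinicians comfortable with a little scripting, the same text generation can also be reached through OpenAI's Python client rather than the chat interface. The sketch below is illustrative only and not drawn from the article: the model name, prompt wording, and client setup are assumptions, and any draft it returns needs the same line-by-line review the physicians quoted here describe.

    # A minimal sketch, assuming the openai Python package (v1 or later) is
    # installed and OPENAI_API_KEY is set in the environment.
    from openai import OpenAI

    client = OpenAI()

    # Hypothetical prompt, modeled on the request described in Stermer's video
    prompt = (
        "Write a letter to a medical insurance company explaining why a patient "
        "with systemic sclerosis should be approved for an echocardiogram."
    )

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model choice, not specified in the article
        messages=[{"role": "user", "content": prompt}],
    )

    draft = response.choices[0].message.content
    print(draft)  # a first draft only; verify every clinical claim and reference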

David Canes, MD, of Lahey Hospital & Medical Center in Burlington, Massachusetts, has also shared how he's used ChatGPT. He said it has potential in small, time-saving tasks, but it doesn't quite measure up to the initial hype.

"Just like when our pets do anything remotely human-like, we -- probably inappropriately -- ascribe emotion and knowledge to the pet," Canes told ѻý. "Similarly here, ChatGPT knows statistically what the next word should be. It can come up with wildly impressive-looking results, but it is prone to error."

"It does not 'know' or 'feel' anything," Canes added. "If relied upon for medical research or asking medical questions, it might get the question right, but it will also confidently put forth completely fantastic-sounding garbage."

After the video took off, Stermer realized that the program had more limitations than he first thought. For one, he said, the references in the letter did not check out. ChatGPT made them up.

In fact, Stermer has found that much of the information produced by the program has been inaccurate or only partially correct.

"The sources were kind of half right," he said. "The authors were right, and then some of the articles were right, but it didn't piece it together exactly. So it's not something you could just send off to an insurance company, or really use at this point."

"It wrote a beautiful text, and it kind of got the points across," he said. "But it was still not where it needs to be."

OpenAI acknowledges that ChatGPT is not connected to the internet and can produce incorrect answers. The chatbot also has limited knowledge of "events after 2021 and may also occasionally produce harmful instructions or biased content," according to the organization's website.

A Future of Possibilities

Despite the current limits of ChatGPT, Canes said the technology is an improvement on tools people have already become accustomed to, such as predictive text in Gmail. Physicians should expect this tool to become just as common.

"Young healthcare professionals need to keep a close eye on this," Canes said. "The technology is likely to improve by leaps and bounds. Doctors need to have a seat at the table when it comes to any workflow improvements, so this is no different."

Anobel Odisho, MD, MPH, of the University of California San Francisco, agreed that ChatGPT holds some unique potential in easing a physician's daily tasks.

In particular, Odisho noted that repetitive writing is a big part of his job as a clinician, a practice manager, and a researcher.

"In each of those roles, we do some repetitive, low-value writing and administrative work which I think we can automate, either with ChatGPT or other tools," Odisho told ѻý.

Odisho said he has used the program to generate first drafts of letters that he then reviewed and edited. He also used it to generate more patient-friendly descriptions of procedures or post-procedure instructions. He even used it to create a draft of an on-call schedule.

"I think the possible use cases are only limited by your creativity and ability to write a good chat 'prompt,'" Odisho said.

But he also acknowledged that this technology has serious limitations as well. He emphasized that healthcare professionals should be careful about what information they share with this program, especially when it comes to patients.

"Do not trust it, it is not ready for primetime," Odisho warned.

Ultimately, Odisho said ChatGPT is just the beginning of this kind of technology in medicine. His primary takeaway is that this is the future, and he believes all healthcare professionals should start getting comfortable with it.

"I think in the future you really will have two classes, one group that can seamlessly and fluidly utilize these tools effectively in their personal and work lives, and another that cannot, and will struggle to keep pace," Odisho said.

Stermer agreed that ChatGPT is a preview of things to come, but right now he acknowledged it is not as useful for his practice as he first hoped.

"It's more a look into the future of the possibilities, or how things are going to be done," he said "But it does not replace medical expertise."


Michael DePeau-Wilson is a reporter on MedPage Today's enterprise & investigative team. He covers psychiatry, long COVID, and infectious diseases, among other relevant U.S. clinical news.