Your doctor might be using ChatGPT for a second opinion. Should you?
An artist in Germany who liked to draw outdoors showed up at the hospital with a bug bite and a host of symptoms that doctors couldn’t quite connect. After a month and several unsuccessful treatments, the patient started plugging his medical history into ChatGPT, which offered a diagnosis: tularemia, also known as rabbit fever. The chatbot was correct, and the case was later written up in a peer-reviewed medical study.
Around the same time, another study described a man who appeared at a hospital in the United States with signs of psychosis, paranoid that his neighbor had been poisoning him. It turned out the patient had asked ChatGPT for alternatives to sodium chloride, or table salt. The chatbot suggested sodium bromide, which is used to clean pools. He’d been eating the toxic substance for three months and, once he’d stopped, required three weeks in a psychiatric unit to stabilize.
You’re probably familiar with consulting Google for a mystery ailment. You search the internet for your symptoms, sometimes find helpful advice, and sometimes get sucked into a vortex of anxiety and dread, convinced that you’ve got a rare, undiagnosed form of cancer. Now, thanks to the wonder that is generative AI, you can carry out this process in more detail. Meet Dr. ChatGPT.
ChatGPT is not a doctor in the same way that Google is not a doctor. Searching for medical information on either platform is just as likely to lead you to the wrong conclusion as it is to point toward the correct diagnosis. Unlike Google search, however, which simply points users to information, ChatGPT and other large language models (LLMs) invite people to have a conversation about it. They’re designed to be approachable, engaging, and always available. This makes AI chatbots an appealing stand-in for a human physician, especially given the ongoing doctor shortage as well as the broader barriers to accessing health care in the United States.
As the rabbit fever anecdote shows, these tools can ingest all kinds of data and, having been trained on reams of medical journals, sometimes arrive at expert-level conclusions that doctors missed. Or, as the bromide case shows, they can give you really terrible medical advice.
There’s a difference…
