With the ever-growing complexity of healthcare, it seems only natural that some aspects of doctor-patient communication will be farmed out. In the 21st century, delegating tasks often means adopting new technology, such as the AI chatbot ChatGPT, to handle the jobs a person doesn’t have time for. The latest job a person may not have time for? Being kind to patients.
In a recent study published in JAMA Internal Medicine, researchers compared physician and artificial intelligence chatbot responses to patient questions. They randomly selected 195 posts from Reddit’s “AskDocs” forum, each of which had been answered by a verified healthcare professional, and posed the same 195 questions to ChatGPT. A panel of three licensed healthcare professionals then rated both sets of responses for quality and empathy.
In an incredible result, the panel chose ChatGPT’s responses 79% of the time. Dr. Christopher Longhurst, of UC San Diego Health, said in an interview with The Guardian: “These results suggest that tools like ChatGPT can efficiently draft high-quality, personalised medical advice for review by clinicians, and we are beginning that process at UCSD Health.”
Reactions to the study have been mixed. Some health organizations are encouraged by the prospect of AI drafting responses that doctors then edit, saving clinicians time. However, Professor Anthony Cohn, of the University of Leeds, told The Guardian, “Humans have been shown to overly trust machine responses, particularly when they are often right, and a human may not always be sufficiently vigilant to properly check a chatbot’s response.”
ChatGPT was designed to produce likable, personable responses, and some readers believe this skews the results: the chatbot’s empathy is engineered in rather than genuinely felt. Many doctors also worry that the general public will come away believing healthcare professionals can be replaced by technology. Ediriweera Desapriya, Ph.D., of BC Children’s Hospital at the University of British Columbia, responded to the study: “While chatbots can simulate empathy through pre-programmed responses, they cannot truly understand the emotions and needs of human users in the same way that a human healthcare professional can.”
While the study is intriguing, it’s safe to bet that people are not yet ready to trade their doctors for AI. Perhaps its findings will instead prompt doctors to revisit their bedside manner, a reminder that patients need not only quality information and advice but also a personal touch of empathy.