
epocrates
ChatGPT fails to answer drug-related questions, new study finds

Researchers posed 39 medication-related questions to the free version of ChatGPT, all of them real questions submitted to Long Island University's College of Pharmacy drug information service. The study found that ChatGPT provided correct responses to only about 10 of the questions. For the remaining prompts, the answers were incomplete, inaccurate, or did not address the question.
Some answers could even be harmful. In one question, researchers asked ChatGPT whether Paxlovid and verapamil would interact with each other. ChatGPT responded that the combination would yield no adverse effects. In reality, patients who take both medications may experience a significant drop in blood pressure (Viswanathan, 2023).
The study’s lead author said the findings demonstrate that patients and clinicians should be cautious about relying on this viral chatbot for drug information and should verify any responses with trusted sources (Constantino, 2023).
Sources:
Viswanathan, G. (2023, December 10). ChatGPT struggles to answer medical questions, new research finds. CNN. https://www.cnn.com/2023/12/10/health/chatgpt-medical-questions
Constantino, A. K. (2023, December 5). Free ChatGPT may incorrectly answer drug questions, study says. CNBC. https://www.cnbc.com/2023/12/05/free-chatgpt-may-incorrectly-answer-drug-questions-study-says.html