Should you ask an AI chatbot about your health? 5 things to know before you start sharing
AIs like ChatGPT and Claude are not substitutes for professional care, experts say, but can give useful information if used responsibly
With hundreds of millions of people turning to chatbots for advice, it was only a matter of time before tech companies began offering programs specifically designed to answer health questions.
In January, OpenAI introduced ChatGPT Health, a new version of its chatbot that the company says can analyse users’ medical records, wellness apps and wearable device data to answer health and medical questions.
Both companies say their programs are not a substitute for professional care and should not be used to diagnose medical conditions.
Instead, they say the chatbots can summarise and explain complicated test results, help prepare for a doctor’s visit, or analyse important health trends buried in medical records and app metrics.