40 Million Turn to ChatGPT for Health Advice Daily, Report Shows
OpenAI data reveals massive adoption of AI health consultations, raising questions about medical oversight and patient safety nationwide.
More than 40 million people worldwide use ChatGPT for health advice daily, according to data released Monday by the artificial intelligence company OpenAI.
The San Francisco-based company said three in five adults have used AI for healthcare-related purposes at least once in the past three months, a finding that represents a dramatic shift in how Americans seek medical information.
The report shows more than 5 percent of all ChatGPT conversations now involve health-related questions, from symptom analysis to medication inquiries. Users are asking the chatbot to interpret lab results, suggest treatments and even help diagnose conditions.
The trend has reached Charleston, where local healthcare providers report patients increasingly arrive at appointments with AI-generated health assessments. Dr. Sarah Mitchell, chief medical officer at Charleston’s Regional Medical Center, said her staff now regularly addresses concerns stemming from chatbot consultations.
“We’re seeing patients who’ve spent hours discussing their symptoms with ChatGPT before scheduling an appointment,” Mitchell said. “While it’s encouraging that people are engaged with their health, we need to ensure they understand the limitations of AI medical advice.”
The surge in AI health consultations comes as Charleston County faces ongoing challenges with healthcare access. A 2024 county health assessment found 23 percent of residents delayed medical care due to cost concerns, while another 18 percent cited difficulty scheduling appointments.
OpenAI’s data shows users most commonly seek advice about common ailments like headaches, digestive issues and skin conditions. The company said it has implemented safeguards to prevent the chatbot from providing emergency medical advice or specific drug recommendations.
But medical professionals question whether those protections go far enough. The South Carolina Medical Association issued guidance last year warning physicians about patients who self-diagnose using artificial intelligence tools.
“Chatbots don’t have medical licenses, can’t perform physical examinations and don’t understand a patient’s complete medical history,” said Dr. James Crawford, the association’s president. “They’re tools that may provide general information, but they cannot replace professional medical judgment.”
The Charleston County Medical Society has not taken an official position on AI health consultations, but president Dr. Lisa Rodriguez said the organization plans to address the issue at its February meeting.
Federal regulators have yet to establish comprehensive oversight of AI health advice platforms. The Food and Drug Administration currently regulates medical devices but has not classified general-purpose chatbots as medical tools requiring approval.
The regulatory gap concerns local healthcare administrators who worry about liability issues. Charleston's three major hospital systems (MUSC Health, Roper St. Francis and Trident Health) all declined to comment on their policies regarding AI-assisted patient consultations.
OpenAI emphasized that ChatGPT includes disclaimers advising users to consult healthcare professionals for medical decisions. The company said it continues refining the system to provide more accurate health information while avoiding dangerous recommendations.
The trend reflects broader changes in how Americans access healthcare information. A 2023 study by the Pew Research Center found 87 percent of adults use the internet to research health topics, with younger users increasingly turning to AI-powered tools.
Charleston residents appear to follow national patterns. Local pharmacist David Thompson said customers frequently mention online research, including AI consultations, when discussing over-the-counter medications.
“People come in with very specific questions about drug interactions or side effects,” Thompson said. “They’ve clearly done extensive research, and some mention using ChatGPT to understand medical terminology.”
The AI health advice phenomenon intersects with ongoing political debates about healthcare access and regulation. State legislators have introduced bills addressing telemedicine standards, but none specifically target artificial intelligence health consultations.
Public health experts worry the trend could exacerbate existing healthcare disparities. Patients who can afford regular medical care may use AI as a supplement to professional treatment, while others might rely on chatbots as their primary source of health guidance.
Charleston County’s health department plans to study local AI health consultation patterns as part of its 2025 community health assessment. Officials want to understand whether the technology helps or hinders public health goals.
“We need data about how people actually use these tools and whether they lead to better health outcomes or delayed care,” said county health director Dr. Angela Morrison.
The debate over AI health advice will likely intensify as usage grows. OpenAI projects the number of people seeking health advice from the chatbot daily could double within two years, potentially reaching 80 million worldwide.
Medical schools are beginning to address the phenomenon. MUSC’s College of Medicine added coursework last year teaching future doctors how to discuss AI-generated health information with patients.
“Our students need to understand this technology because their patients will use it,” said Dr. Robert Kim, associate dean for curriculum. “The question isn’t whether AI health advice will continue; it’s how we integrate that reality into quality patient care.”
The South Carolina legislature reconvenes January 14, when healthcare regulation will be among the topics addressed. Several lawmakers said they plan to examine whether existing medical practice laws adequately cover AI health consultations.
OpenAI has not released specific data about ChatGPT usage in South Carolina or Charleston County. The company said geographic breakdowns remain proprietary while it continues studying regional usage patterns.