The new feature appears as a dedicated “Health” tab inside ChatGPT, designed as a separate environment where conversations, memory, and associated files are stored independently from standard chats, with additional layers of encryption and security. OpenAI emphasizes that health data connected to ChatGPT Health is not used to train its core AI models and is intended solely to improve response relevance through personal context.
ChatGPT Health enables users to voluntarily connect their medical data and information from health and wellness apps such as Apple Health, MyFitnessPal, Function, Peloton, Weight Watchers, and AllTrails. This allows the AI to tailor responses to real test results, clinical history, sleep metrics, physical activity, and dietary habits.
OpenAI stresses that the tool is meant to support users in understanding their data and preparing for conversations with healthcare professionals — not to replace professional diagnosis or treatment. In its official statement, the company notes that Health can assist with tasks such as explaining lab results, creating question lists ahead of doctor visits, interpreting health patterns, or comparing health insurance options — all based on the user’s own data.
According to OpenAI, hundreds of millions of users already ask health- and well-being-related questions in ChatGPT — more than 230 million health queries per week worldwide — highlighting strong demand for such functionality. The Health feature was developed in collaboration with over 260 physicians from 60 countries, who helped define how AI responses should be structured to remain both helpful and safe.
The product is currently available via a waitlist for early users on ChatGPT Free, Go, Plus, and Pro plans, excluding users in the European Economic Area, Switzerland, and the United Kingdom. Integrations with medical records and certain apps are currently available only in the United States, and Apple Health syncing requires an iOS device. OpenAI plans to expand availability to all web and iOS users in the coming weeks.
OpenAI also highlights that ChatGPT Health operates with enhanced privacy protections, including conversation isolation and additional encryption. Health data and “Health Memories” are stored separately, do not flow back into standard chats, and are not used to train base models. Users retain full control — they can review or delete stored memories, or disconnect linked apps, at any time.
Finally, it’s worth noting that the expansion of ChatGPT into healthcare comes at a time when regulators and users alike are increasingly scrutinizing the role of AI in medicine. While features that help users understand their own health data or prepare for medical consultations are widely welcomed, concerns remain about privacy and the risk of misleading or harmful advice — especially if AI-generated responses are mistakenly treated as equivalent to a medical diagnosis. For now, one can only hope that OpenAI knows exactly what it’s doing.