Doctors use AI tools to care for patients and manage treatments. (Photo from Adobe)
02 February 2026 04:25 GMT+03:00
The use of artificial intelligence (AI) in healthcare is growing rapidly, creating both significant opportunities and risks, particularly when AI-generated information is treated as definitive medical advice.
Experts warn that while AI can provide preliminary assessments and simplify medical information, overreliance on these systems can delay professional care and lead to serious consequences.
The warnings come from Professor Recep Ozturk, vice-rector of Istanbul’s Medipol University, who spoke to Anadolu Agency (AA) about the growing role of AI in healthcare and its implications for patients and doctors.
AI as a preliminary assessment tool
Professor Ozturk emphasized that AI is no longer a futuristic concept. Worldwide, approximately 230 million people consult digital AI systems every week for advice on healthy living and well-being.
Reports indicate that more than 40 million daily health-related queries are made to ChatGPT alone, reflecting a profound shift in the way people search for and interpret health information.
He explained that AI can serve as a preliminary assessment tool by helping patients understand complex laboratory results, imaging reports and symptoms, while reducing anxiety caused by uncertainty.
However, he stressed that these systems cannot replace doctors, provide definitive diagnoses or perform critical tasks such as physical examination and clinical assessment.
Risks of overreliance on AI
Professor Ozturk warned that the greatest danger lies in treating AI-generated information as absolutely accurate, which could delay professional medical consultation.
Although AI can help in radiology, dermatology and pathology by detecting details that the human eye might miss, it cannot fully understand the clinical context or patient history.
He highlighted the risk of “hallucinations,” misleading outputs that appear convincing but are false. Studies report hallucination rates of 8% to 20% in clinical decision support systems, and some radiology tools misclassify benign nodules as malignant in up to 12% of cases.
Supporting doctors, not replacing them
Integrating AI into hospital systems can reduce administrative burdens and improve data analysis. Professor Ozturk emphasized that AI tools are designed to support doctors, not to replace them.
Even platforms like ChatGPT Health, which do not provide formal diagnoses, can function as medical tools for tasks such as blood sugar monitoring or genetic data analysis, highlighting the need for strict validation and regulatory oversight.
He also noted that doctors retain full responsibility for clinical decisions and that AI results should never be accepted without reservation.
Future healthcare professionals must be trained to critically evaluate algorithmic recommendations alongside traditional clinical judgment.