When patients and their families face critical health decisions, they usually seek a second medical opinion to make sure they choose the best option. Traditionally, that meant consulting another physician. Recently, however, with the emergence of new technologies in healthcare such as digital health and artificial intelligence, people have begun exploring new routes to adequate support and trustworthy information: consulting medical websites and using AI-driven tools like ChatGPT.
In a recent poll we conducted on LinkedIn, we found that a large majority—71% of respondents—still seek a second medical opinion from another physician. This tendency highlights how much patients value having an experienced doctor on their side. Drawing on extensive training, experience, and emotional intelligence, physicians provide personalized consultations and understand a patient’s fears and emotions. They can address nuanced questions, tailor their advice to individual circumstances and social conditions, and adapt their communication to the patient’s needs and level of understanding—qualities that make them irreplaceable, especially in critical moments.
However, the public accessibility of digital medical information has transformed how a considerable part of the population makes health decisions: around 18% of respondents in this survey turn to reputable sources like WebMD or Mayo Clinic. These platforms are built by medical professionals and offer peer-reviewed, evidence-based insights into various symptoms, diagnoses, and treatment options in language that is simple and comprehensible to the public. While they lack the personal tailoring of a direct physician consultation, they are a valuable and trustworthy source for understanding health issues.
Nonetheless, it is not surprising to see AI tools emerging as potential advisors for a second medical opinion. Approximately 11% of respondents in this poll are starting to use tools like ChatGPT to explore health information. These systems are trained on vast datasets drawn from medical literature—whatever its quality and provenance—allowing them to provide swift, data-driven counselling.
Despite their huge datasets and advanced technology, can these AI tools be trusted to support an adequate second medical opinion?
The answer is complex. Several critical factors determine how reliable AI tools are in this role:
1. The Quality of Data
The accuracy of AI tools is directly linked to the quality and trustworthiness of the medical information on which they have been trained. Ensuring the data is up to date, reliable, and credible is essential for obtaining helpful and safe responses.
2. The Clarity and Detail of the Prompts
The effectiveness of an AI’s answers is often tied to the specificity of the prompts. Straightforward, detailed questions lead to richer, more relevant information, while vague inquiries yield generic responses. When written by non-experts, unclear or insufficiently detailed prompts can produce mistaken responses and, in some cases, severe errors in medical decisions.
3. Empathy, Compassion, and Emotional Understanding
Despite recent advances, AI tools still cannot fully interpret human communication, which is often laden with emotion and with subtle gestures and expressions—especially in critical health situations or for patients with disabilities. In sensitive discussions about mental health, end-of-life care, or family dynamics, the emotional intelligence of a real doctor can be invaluable.
While AI tools can enhance our understanding and encourage more thoughtful questions, they simply cannot replace the contextual understanding and emotional depth of seasoned healthcare professionals.
Assistant Professor and Consultant of Internal Medicine
Medicine Faculty of Sousse, Tunisia.