When patients trust AI, not you

Dr Amandeep Hansra.
Jamie Thannoo

A DAY after speaking at the RACGP conference last month about the future wonders of AI, Dr Amandeep Hansra had to confront a patient who was convinced she was wrong because ChatGPT disagreed with her.
She had originally seen the patient at her Sydney practice for a bread-and-butter GP issue and recommended a treatment.
However, the patient had subsequently ‘consulted’ with ChatGPT and called back to complain.
Dr Hansra arranged a follow-up appointment to persuade them that the AI bot was recommending unnecessary treatment. She did not succeed.
“They had no need for this other treatment, but they had it in mind that they needed it, and they just would not let go,” Dr Hansra told Australian Doctor.
As the patient left, they told Dr Hansra they would search for another GP to give them what the AI chatbot said they required.

Dr Hansra said she suspected they would find one willing to comply.

“Without putting down any of my colleagues, there is always someone out there who is happy to please the patient, who does not want to have the debate,” she said.
“I am sure we will get more patients presenting who are saying, ‘All I need from you is to write this script or order this test. I do not need you to give me your advice because I have already done my own online consultation.’”
As chief clinical adviser for the Australian Digital Health Agency, Dr Hansra knows the good and bad of technology.
She said large language model chatbots, such as ChatGPT, were posing a different challenge from ‘Dr Google’.
By drawing on information from previous conversations, AI chatbots could incorporate far more personal detail into their medical advice than a GP who saw the patient every six months.
With this, they could generate highly specific advice, backed by explanations of their ‘thinking’, in pleasant, supportive and arguably sycophantic language.
“One of my patients told me they felt like ChatGPT was much nicer to them than their specialist,” Dr Hansra said.

“The specialist just said, ‘I need you to have a CT scan.’”
She said there was a strange imbalance between patients and doctors when it came to AI: regulated AI tools covering a wide range of clinical topics were unlikely to become commonplace for years, with regulators around the world still deciding how to oversee them.

So, what could GPs do?

Dr Hansra said she had spent more time explaining her clinical reasoning during consultations and trying to help patients understand the processes doctors learnt during medical training.
She recommended that GPs play around with AI chatbots to understand what patients were experiencing.
“When one patient complained that ChatGPT had told them a different thing than I did, I decided to go back to ChatGPT and tell it the symptoms,” she said.
“It suggested X and Y, and I asked it to tell me why.

“Then I realised ChatGPT did not have access to examination findings, which was why it came up with this set of possible diagnoses.

“Once I told it that, it conceded it did not have as much information as me.

“But a patient is not going to go through that process.”
She said she hoped that doctors, patients and chatbots would learn to coexist harmoniously.
“We have to help our patients know when it is appropriate to use these tools,” she said.

“I am fine with patients coming in with summaries, such as, ‘Here is what ChatGPT suggested we do; what do you think?’

“We have to work collaboratively, not fight it.”