Nursing Review Issue 6 October-November 2023 | Page 29

Technology

Smart asks

Can ChatGPT improve healthcare access?
By Elise Hartevelt

We've all heard about the dangers of ChatGPT – from cheating in exams to fabricating reports – but it's not all terrible.

Associate Professor Beena Ahmed, from UNSW's School of Electrical Engineering and Telecommunications, says that ChatGPT could play a critical role in optimising access to healthcare services.
We spoke with Professor Ahmed about how generative AI could improve practitioner-patient relationships through better communication, and what dangers to be aware of.
NR: Can ChatGPT be used in healthcare settings?
BA: It's a generative AI system, meaning that you can ask it questions and it will give you responses. So, within the healthcare system, it can be used for various purposes.
You can use ChatGPT to translate technical or medical terms into more straightforward language for a patient.
When a healthcare worker is struggling to communicate with a patient because of a language barrier, you could ask ChatGPT to translate medical information into various languages.
Or when a nurse is trying to discuss a very difficult topic with a patient, the nurse can ask ChatGPT to describe the situation in more sympathetic language.
It can produce anything from general text to pamphlets, brochures and videos for patients to read or view.
So, those functions help to improve the relationship between a healthcare worker and patient.
A healthcare worker can also use ChatGPT to create text to communicate within the healthcare system. It can write emails, prepare summary reports and create an information sheet based on a patient's records.
It can produce discharge papers a person could use for Medicare or insurance claims, and create transcripts of conversations.
ChatGPT can also help prevent miscommunication within teams that could otherwise lead to issues.
Now, the caveat here is that somebody always has to check whether ChatGPT's output is correct.
It can prefill everything and prepare anything, but a healthcare worker has to review it before it goes out.
It's a time-saver for administrative tasks, which is very useful for healthcare workers who are overworked and short-staffed.
NR: What downsides are there to using ChatGPT in a healthcare context?
BA: The single biggest issue for me is, 'How do we ensure data protection of all that medical information we might collect?' I think the government or another autonomous body needs to take control and implement guidelines that everyone must follow.
At the moment, the data belongs to whoever is collecting it, and that is very often a tech company that can then just sell that information to the highest bidder.
Another huge risk is that data can be incorrect and biased, giving someone in the medical field the wrong information.
The last thing you want to do as a healthcare professional is make a wrong diagnosis or give the wrong advice.
There's also a cultural sensitivity aspect to ChatGPT, because it's predominantly trained on English text.
The majority of internet text also comes from the USA, so it's very skewed.
Then you can have gender bias too, not just cultural bias. But OpenAI, the company that created ChatGPT, is not the only one working on models like this.
There's Google's version, Bard, and the Chinese search engine Baidu is working on one called Ernie.
So, ChatGPT is not going to be the only solution – there will be others that can cover information written in different languages.
Like with every technology, we can't stop progress from happening. But what users should be aware of is that we can't blindly trust technology like this.
It's got its deficiencies, and we should be wary of them. ■