Before seeing a doctor about a rash, sleeping issue or chronic heart condition, have you considered asking a chatbot for advice?
It is the suggestion of leading artificial intelligence company OpenAI, which has flagged plans to launch a dedicated health service with Australian users as possible candidates for its trial.
The announcement comes with the company revealing its ChatGPT tool handles more than 230 million health-related questions every day from more than 40 million individuals.
AI researchers say a fully-scaled launch is all but inevitable due to overwhelming consumer demand, but medical experts are split on its potential impact.
While some say increasing medical literacy via customised, computer-generated responses could prove beneficial for patients, others warn too little is known about the accuracy of its advice and its regulation and privacy protections.
OpenAI revealed plans for ChatGPT Health early in January, describing it as a complementary service to analyse medical data and advise on what to ask a doctor or what to eat and how to exercise.
"Health is designed to support, not replace, medical care," the company said in a statement.
"Instead, it helps you navigate everyday questions and understand patterns over time, not just moments of illness, so you can feel more informed and prepared for important medical conversations."
Users will be able to link their health records from services such as Apple Health, MyFitnessPal, Peloton and Weight Watchers to ChatGPT, in addition to uploading raw data from medical scans and tests.
Australians could be included in a small group of early testers, although the service is not compatible with the government's My Health Record platform.
Artificial intelligence's move into the health field is a natural extension of its current offerings, says UNSW AI Institute chief scientist Toby Walsh, given so many people are already turning to chatbots to answer intimate questions.
"People have been uploading their X-rays, their blood tests, everything into it," he tells AAP.
"It makes sense, therefore, to try and do it in a better, more formalised way where it knows something about your health history so it doesn't just give you generic responses but tries to give you specific, personalised, tailored advice."
Using artificial intelligence to crunch raw health data, identify health trends, develop a list of probing questions or summarise symptoms for a doctor's visit could help users, Prof Walsh says.
But generative AI technology is relatively new and untested in the health industry and could produce concerning results.
"There's huge amounts of money in health, one of the biggest businesses around, and these tools can do useful stuff," he says.
"There's a useful side to this but I fear, as ever, we're rushing into it."
For health industry consultant and general practitioner Joe Kosterich, the technology and concerns around it feel familiar.
"It reminds me of when Google first got big and everyone was concerned about 'Dr Google'," he says.
"Ultimately, AI extracts from what is already online and it does it much more efficiently than a Google search, and a Google search was much more efficient than going to an old-fashioned medical encyclopedia."
Stopping AI entering the health industry would be impossible, Dr Kosterich says, given its rapid development and spread through a wide range of industries.
But the technology could improve patients' medical literacy and has the potential to provide helpful, supplemental advice about managing ailments between doctor visits.
"For the vast majority of people, it's probably going to be quite useful: they can learn a bit more about their condition and how to manage it," he says.
"However, nobody should be treating any medical condition based on what they find on Google or through AI without chatting with their doctor about it."
It is this risk that concerns pharmacist and integrative health commentator Mick Alexander, who says some AI users may turn to the technology seeking a diagnosis rather than guidance.
"When you see a doctor, they have enough understanding to know what questions to ask," he says.
"Your doctor can also see the person in front of them and gain visual cues from your appearance or from your body language.
"These are things data points don't pick up if they're just based on results from blood tests."
Questions about the regulation of generative AI health advice, and about whether the advice it sends users will be scrutinised by medical professionals, are also yet to be answered, Mr Alexander says.
If AI health offerings fall into the category of clinical decision support systems, they may require approval from Australia's Therapeutic Goods Administration.
It's a grey space in the market, Mr Alexander says, and one that should be thoroughly tested and addressed before any medical professional recommends an AI health service.
"I wouldn't be relying on it as a decision-making tool," he says.
"If you're looking for more information about your health or your conditions, the first place to start is with your healthcare team or your doctor."
Jennifer Dudley-Nicholson
(Australian Associated Press)