Sure, we don’t have a robot to make our beds (yet), but AI is alive and well in the world. Use it to proofread your writing or boost your productivity, and now you can even consult Dr Chatbot for health advice.
ChatGPT, but make it for health
Per a PwC report, “One of AI’s biggest potential benefits is to help people stay healthy so they don’t need a doctor, or at least not as often.” Already, AI can detect diseases such as cancer. And the medical devices we wear as health-enhancing tech use AI to improve our health outlook. Here’s looking at you, Fitbit.
Plus, using tech for your health concerns is nothing new. For decades, we’ve been consulting Dr Google for anything from a niggle to a full-on flare-up. But can AI be a panacea for our health needs?
How does ChatGPT work?
ChatGPT is arguably the best-known and most powerful AI in the space. Five days after launch, the language bot became the fastest consumer application in history to reach a million users. Since then, that number has climbed past 100 million, making ChatGPT a popular alternative to Google for seeking out information and advice.
ChatGPT is a language model that has been ‘trained’ to recognise patterns in language, which allows it to make ‘predictions’ based on that learning. The tool draws on the information it was trained on to produce well-articulated answers to almost any question typed into its chat bar, no matter how broad or narrow the question or how detailed the expected response.
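For the technically curious, here is a toy sketch in Python of that core idea: a language model counts patterns in the text it has seen, then ‘predicts’ the most likely next word. The miniature ‘training corpus’ below is invented purely for illustration, and real systems like ChatGPT use neural networks trained on vastly more text, but the predict-the-next-word principle is the same.

```python
from collections import Counter, defaultdict

# A miniature 'training corpus' (purely illustrative; the real thing is
# trained on a vast slice of the internet).
corpus = ("regular exercise supports a healthy heart and "
          "a healthy heart supports a long and healthy life").split()

# 'Training': count which word tends to follow which (a simple bigram model).
next_words = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_words[current][following] += 1

def predict(word):
    """'Prediction': return the word most often seen after this one."""
    candidates = next_words.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

# Generate a short phrase by repeatedly predicting the next word.
word, phrase = "regular", ["regular"]
for _ in range(5):
    word = predict(word)
    if word is None:
        break
    phrase.append(word)

print(" ".join(phrase))  # prints: regular exercise supports a healthy heart
```

The sketch also hints at the limitation Dr Moons describes below: the model only reproduces patterns it has seen, so it can sound fluent without ever checking a fact.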
Plus, a study has found that AI like ChatGPT could lessen the burden on healthcare providers while delivering empathetic, high-quality responses.
AI for your health: the problem
But how reliable is the information? What happens with subject matter that involves conflicting opinions or biased sources? Here it becomes hard to discern the chatbot’s reliability or potential prejudices.
The answers ChatGPT spat out for medical concerns were deemed as good as or better than Google’s, per a new study conducted by a UCT professor. But some answers were vague, misleading or just plain made up.
Study author Dr Philip Moons, Honorary Professor at UCT’s Department of Paediatrics and Child Health, cautions about using ChatGPT. “It is important to know that ChatGPT is a language model. It’s built to make good texts. It is not made to look things up,” he says.
A thorny issue
On its homepage, ChatGPT includes disclaimers that it may generate incorrect or biased information. Yet we can’t tell when a response is wrong, how it may have been shaped by the way a question was phrased, or when and how it may be influenced by vested interests. When it comes to health advice, this could amplify certain falsehoods or points of view, or overlook blind spots in the information. And unlike a Google search, which leads you to specific websites, ChatGPT points to no single source of information, which makes its credibility hard to ascertain.
Already, healthcare services in the US are struggling to adapt to the glitches and failures of AI while dealing with patients in real time.
“Legislators are currently concerned about the lack of legislation,” says Dr Moons about the questions the use of ChatGPT raises. “I’m sure that governments are dealing with this issue. To the best of my knowledge, there is no uniform framework for it.”
This raises the larger question of accountability. Let’s say a healthcare institution begins using an AI-powered platform, incorporating it into the diagnostic process, and a claim arises. Who answers for any potential liability – the AI itself, the company that developed the programme, the programmers who built and maintain the algorithm, or the healthcare institution that adopted the platform and failed to manage the associated risks? Only time will tell.
So… should you use ChatGPT for your health issue?
For now, use caution when consulting ChatGPT for healthcare issues. A real-life doctor’s advice remains the gold standard over anything you find on the internet. Doctors can take a proper look at all the factors behind an issue, rather than only what you type into an AI chat. “I think that healthcare providers need to warn their patients and the lay public that ChatGPT is not made for looking up things,” says Dr Moons.
If you’re desperate, Dr Moons suggests New Bing, an AI-powered search engine made by Microsoft. “It has the functionality of a search engine (like Google), but the technology of ChatGPT is built in. New Bing is meant to look things up. Therefore, it is a safer place to go to than ChatGPT,” he explains.