Science, Art, Litt, Science based Art & Science Communication
Q: Is it bad to use Google search engine to get information on medical issues?
Krishna: Yes!
Undoubtedly!
Why? Because you use your 'symptoms' to search.
And because a wide range of symptoms are common to most health conditions.
You will get a wrong diagnosis most of the time, and with the limited knowledge of a layperson this can cause panic and stress and lead to other complications.
If you then try to self-medicate, you will compound the problem further.
Self-diagnosis is the wrong thing to do. People who turn to search engines for self-diagnosis often do not understand what they are reading or the consequences of acting on it. Search engines mostly match keywords, so irrelevant or alarming material may surface, and the results can instill unwarranted fear in average readers.
What is worse, some people ask others just like them on the net for medical advice. And I have seen several ignorant people simply googling and answering them.
Some even offer strange alternative or home remedies for serious medical conditions.
This is a very dangerous situation. If you try to tell these people not to give such advice, some even attack you. That is how much confidence they have in their 'internet research, diagnoses and treatments'!
As the saying goes, 'half knowledge is always dangerous': knowing a little about something tempts people to overestimate their abilities and think they are more expert than they really are, which leads to mistakes.
I read an interesting research paper (1) recently.
It seems even some GPs (general practitioners) are using AI to diagnose their patients!
A recent survey published in the journal BMJ Health & Care Informatics found that one in five GPs are using AI, despite a lack of guidance or any clear workplace policies being in place yet.
The research argues that both doctors and medical trainees need to be trained in the use of AI, as well as its pros and cons, before adopting it. There is a continuing risk of inaccuracies (known as hallucinations), as well as algorithmic biases and risks to patient privacy.
A random sample of GPs was sent a survey to complete. Each doctor was asked whether they had ever used ChatGPT, Bing AI, Google Bard or any other AI service in their work.
In total, 1006 GPs completed the survey. One in five respondents reported using generative AI tools in their clinical practice. Of these, more than one in four used the tools to generate patient documentation, and a similar number said they used them to suggest a diagnosis.
Of the AI tools mentioned in the study, ChatGPT was the most used by far, followed by Bing and Google Bard. However, of the 1006 replies that came in, the vast majority stated that they didn’t use any of these tools at work.
“These findings signal that GPs may derive value from these tools, particularly with administrative tasks and to support clinical reasoning. However, we caution that these tools have limitations since they can embed subtle errors and biases,” the researchers said.
But doctors at least have the knowledge to judge whether a diagnosis is correct and can ask you to go for further tests.
You don't have the knowledge to assess the situation correctly.
Therefore, never rely on online information alone to reach a conclusion about any health issue.
Footnotes:
© 2024 Created by Dr. Krishna Kumari Challa.