SCI-ART LAB

Science, Art, Litt, Science based Art & Science Communication

Q: Is it bad to use the Google search engine to get information on medical issues?

Krishna: Yes! 

Undoubtedly!

Why? Because you use your 'symptoms' to search.

And because a wide range of symptoms are common to most health conditions. 

You will get a wrong diagnosis most of the time, and with the limited knowledge of a layperson this can lead to panic, stress and other complications.

If you try to self-medicate, that will only compound the problem.

Using online search engines to diagnose health conditions can lead to serious risks, including: 
  • Misdiagnosis: Online resources can't provide the same level of individualized evaluation as a medical professional. With limited knowledge and based on your faulty understanding of what you've read, you may become convinced that you have a certain condition and turn a deaf ear to other possible explanations.
  • Unreliable sources: Just because a website looks reliable doesn't mean it is. This can lead you to draw the wrong conclusions.
  • Cyberchondria: This condition is characterized by excessive anxiety and worry that stems from online research. It's easy to latch on to the worst-case scenario whenever you have symptoms that worry you. This can lead to undue distress.
  • Delayed diagnosis and treatment: People may skip getting medical attention if they believe they've found the source of their problems. 
  • Confirmation bias: People may focus on information that confirms their existing beliefs.
  • Unnecessary tests: People may insist on tests they don't need, wasting time and money. 
  • Dangerous treatments: People may treat their condition with supplements, herbal remedies, or other alternative medications, which can lead to side effects and toxicities. 

Self-diagnosis is the wrong thing to do. People who turn to search engines for self-diagnosis often don't understand what they are reading or realize the consequences. Search engines mostly match keywords, so irrelevant or alarming material can surface, and the results can instil needless fear in average readers.

What is worse, some people are asking others like them on the net for advice. And I have seen several ignorant people simply googling and answering them.

Some give strange alternative or home remedies for serious medical conditions.

This is a very dangerous situation. If you try to tell these people not to give such advice, some even try to attack you. They have so much confidence in their 'internet research, diagnosis and treatments'!

As they say, 'half knowledge is always dangerous.'

A small amount of knowledge can mislead people into thinking that they are more expert than they really are, which can lead to mistakes being made.

Knowing a little about something tempts one to overestimate one's abilities.

The Dunning-Kruger effect is a cognitive bias that causes people to overestimate their own knowledge or abilities. It can affect anyone, but it's more likely to happen to people who lack knowledge or skills, are overconfident, or are poor performers. 

I read an interesting research paper (1) recently.

It seems even some GPs (general practitioners) are using AI to help diagnose their patients!

A recent survey published in the journal BMJ Health & Care Informatics found that one in five GPs are using AI, despite a lack of guidance or any clear workplace policies as yet.

The research argues that both doctors and medical trainees need to be trained in the use, as well as the pros and cons, of AI before using it. There is a continued risk of inaccuracies (known as hallucinations), as well as algorithmic biases and patient privacy risks.

A random sample of GPs was sent a survey to complete. Each doctor was asked if they had ever used ChatGPT, Bing AI, Google Bard or any other AI service in their work.

In total, 1006 GPs completed the survey. One in five respondents reported using generative AI tools in their clinical practice. Of these, more than one in four used the tools to generate patient documentation, and a similar number said they used them to suggest a diagnosis.

Of the AI tools mentioned in the study, ChatGPT was the most used by far, followed by Bing and Google Bard. However, of the 1006 replies that came in, the vast majority stated that they didn’t use any of these tools at work.

“These findings signal that GPs may derive value from these tools, particularly with administrative tasks and to support clinical reasoning. However, we caution that these tools have limitations since they can embed subtle errors and biases,” the researchers said.

But doctors have at least some knowledge to judge whether a diagnosis is correct or not, and they can ask you to go for further tests.

You don't have that knowledge to assess the situation correctly.

Therefore, never depend on online information alone to come to a conclusion on any health issue.

Footnotes: 

1. https://informatics.bmj.com/content/31/1/e101102
