SCI-ART LAB

Science, Art, Litt, Science based Art & Science Communication

Q: I am facing some trolls on the net. How can I deal with them?

Krishna:

I face this problem too. Because I am a woman, more people try to troll me.

About 1% of people troll even scientists, professors and highly qualified experts. Some who don't even know the ABCs of science try to teach us what science is! They learn a few things on the net, argue endlessly, and say toxic things that put any decent person in an embarrassing situation.

A user behavior toxicity analysis on Reddit (1) showed that 16.11% of users publish toxic posts and 13.28% of users publish toxic comments. Among users who publish posts, 30.68% exhibit changes in their toxicity across different communities (subreddits), and among users who publish comments, 81.67% do, indicating that users adapt their behavior to each community's norms.

The study suggests that one way to limit the spread of toxicity is to limit the number of communities in which users can participate. The researchers found a positive correlation between an increase in the number of communities and an increase in toxicity, but they cannot guarantee that this is the only reason behind the rise in toxic content.

Various types of content can be shared and published on social media platforms, enabling users to communicate with each other in many ways. The growth of these platforms has unfortunately led to an explosion of malicious content such as harassment, profanity, and cyberbullying. Various reasons may motivate users to spread harmful content. It has been shown that publishing toxic content (i.e., malicious behavior) spreads: the malicious behavior of some users can influence non-malicious users and lead them to misbehave, negatively impacting online communities.

One challenge in studying online toxicity is the multitude of forms it takes, including hate speech, harassment, and cyberbullying. Toxic content often contains insults, threats, and offensive language, which, in turn, contaminate online platforms. Several platforms have implemented prevention mechanisms, but these efforts are not scalable enough to keep up with the rapid growth of toxic content. These challenges call for effective automatic or semi-automatic solutions to detect toxicity in the large streams of content on online platforms.

According to the researchers, monitoring changes in users' toxicity can serve as an early detection method for toxicity in online communities. Their proposed methodology identifies when a user exhibits a change by calculating the percentage of toxic posts and comments that user publishes. This change, combined with the toxicity level their system detects in the user's posts, can be used efficiently to stop the spread of toxicity.
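To make this concrete, here is a minimal sketch in Python of how such monitoring could work: compute a per-user toxicity percentage in each community and flag the user when that percentage varies noticeably across communities. The naive keyword "classifier", the 0.2 shift threshold, and all function names are illustrative assumptions, not the study's actual implementation.

    def toxicity_percentage(items, is_toxic):
        # Fraction of a user's posts/comments that the classifier labels toxic.
        if not items:
            return 0.0
        return sum(1 for text in items if is_toxic(text)) / len(items)

    def detect_toxicity_shift(items_by_community, is_toxic, shift_threshold=0.2):
        # items_by_community maps a community (subreddit) name to the list of a
        # user's posts and comments there. Returns the per-community toxicity
        # percentages and whether they differ by more than shift_threshold.
        scores = {community: toxicity_percentage(items, is_toxic)
                  for community, items in items_by_community.items()}
        shifted = bool(scores) and (max(scores.values()) - min(scores.values()) > shift_threshold)
        return scores, shifted

    # Toy usage: a keyword check stands in for a real toxicity classifier.
    naive_is_toxic = lambda text: any(word in text.lower() for word in ("idiot", "stupid"))
    history = {
        "science": ["Interesting result!", "Thanks for sharing."],
        "politics": ["You idiot.", "Stupid take.", "Fair point."],
    }
    print(detect_toxicity_shift(history, naive_is_toxic))

In a real pipeline the keyword check would be replaced by a trained toxicity classifier and the threshold would be tuned; this toy only illustrates the "change across communities" signal the researchers describe.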

I run social media networks too, and I get some complaints from the people using them. I try my best to stop anybody from trolling others. Suspending and totally banning toxic people sometimes becomes inevitable on my networks.

If you say something that people don't like, some of them intentionally antagonize you online by posting inflammatory, irrelevant, or offensive comments or other disruptive content.

This trolling is more prominent amongst people who follow pseudo-science, superstitions, politicians, political parties, Godmen and those who worship film and TV heroes as Gods. 

Most scientists and professors just ignore them. During one of the discussions we had on this topic, one professor said, 'How do you deal with a madman? He doesn't understand what he is doing. He doesn't understand what we are saying. He doesn't understand many things at all. There is no use in trying to talk with him. There is no use in giving replies to such a person. Just ignore his mad talk.'

I follow this advice. If the person continues to bother me more and more, I block the person, mute him, flag him, complain about him, or unfriend him. On my networks I suspend and ban toxic people.

But science communication asks us to deal with such people too. We have to contend with the bandwagon effect, confirmation bias, the Dunning–Kruger effect, the Barnum effect, ingroup bias, the misinformation effect, and several other things that distort people's thought processes. So sometimes I try to drill some sense into these trolls, but I have realized that most of the time it is futile.

So I try to concentrate on the remaining 99% of people and ignore this 1%. When they can't have their way and can't keep talking to themselves, they just leave you alone!

If you find that what these trolls say is toxic, just ignore and block them for your peace of mind. Don't waste your time on them.

Q: How can we deal with nasty people?

Krishna: Dealing with a toxic person can be mentally draining, but employing certain techniques can help you protect your boundaries.

Signs of toxicity:

  • self-absorption or self-centeredness
  • manipulation and other emotional abuse
  • dishonesty and deceit
  • difficulty offering compassion to others
  • a tendency to create drama or conflict
  • trolling you for no fault of yours and saying things that make you uncomfortable

It’s tough to face attacks from someone who behaves in a toxic manner. They might get personal, try to twist your words, or accuse you of wanting to hurt them. At some point, you might even second guess yourself and rack your brain for something you might’ve done.

But remind yourself their behaviour has nothing to do with you. Restate your boundaries and try not to take their spite personally. Take deep breaths to calm yourself or mindfully acknowledge their words so you can let them go without being affected.

Next time you feel anxious in an interaction, try grounding yourself with these tips:

  • Breathe slowly and deeply.
  • Try relaxing your muscles instead of tensing them.
  • Let the words wash over you and silently repeat calming words.
  • Distract yourself if the situation allows.

I just ignore nasty, toxic people. Don't listen to what they say, don't answer them, and behave as if they don't exist. After some time they completely disappear from your world.

They may move on when they see their tactics don’t work on you. If you’re never available, they might eventually stop trying to engage.

 

Footnotes:
1. Hind Almerekhi et al, Investigating toxicity changes of cross-community redditors from 2 billion posts and comments, PeerJ Computer Science (2022). DOI: 10.7717/peerj-cs.1059
 
