Science, Art, Litt, Science based Art & Science Communication
Some people think science has nothing to do with emotions and morals because it can't deal with them!
But everything you feel, think and do is related to your biochemistry. More specifically, the brain's work is based on its biochemistry and electrochemical signalling, driven by the interaction of neurotransmitters, ions and energy metabolism, along with brain circuits formed through your interaction with the world.
I wrote an article disproving these myths and posted it here: Can-science-explain-or-deal-with-emotions-and-morals?
Now science has found the brain region associated with moral inconsistency.
Why don't some people practice what they preach? Researchers reveal that a brain region called the ventromedial prefrontal cortex (vmPFC) is involved. The researchers used fMRI to identify brain activity patterns associated with moral behaviour and judgment. People who behaved dishonestly despite judging the same behaviour as immoral in others had less activity in the vmPFC, and when the researchers stimulated participants' vmPFCs, they became more morally inconsistent. The work is published in the journal Cell Reports.
Moral consistency is an active biological process. Being a 'moral person' requires the brain to integrate moral knowledge into daily behaviour—a process that can fail even in people who know the moral principle perfectly well.
Previous studies have identified brain regions that are involved in moral behaviour and moral judgment.
But why doesn't knowing the right thing to do always translate into doing it?
To identify brain regions associated with moral inconsistency, the researchers used fMRI to scan people's brains during a task that required them to weigh honesty and profit. Participants could earn more money by being dishonest, but they were also asked to rate their own behaviour on a 10-point scale from "extremely immoral" to "extremely moral." The team also monitored the participants' brain activity while they judged the morality of other people undertaking the same task.
In people who were morally consistent—meaning, they judged themselves and others by the same moral standards—the vmPFC was activated similarly during both the behavioural and judgment tasks.
However, in morally inconsistent participants—those who judged other people's cheating as immoral but rated their own cheating more leniently—the vmPFC was less active during the behavioural task and less connected to other brain regions involved in decision making and morality.
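To make the contrast concrete, here is a hypothetical sketch of how such an inconsistency score could be computed from the two rating tasks. The 10-point scale comes from the task described above, but the index itself (mean self-rating minus mean other-rating) is an illustration, not the researchers' actual measure.

```python
# Hypothetical moral-inconsistency index (an illustration, not the study's metric).
# Ratings use the task's 10-point scale, where lower = "extremely immoral"
# and higher = "extremely moral". Both rating lists refer to the same kind
# of dishonest behaviour, once performed by the participant, once by others.
def inconsistency_index(self_ratings, other_ratings):
    """Positive values mean the participant rates their own cheating more
    leniently (i.e. as more moral) than the same behaviour in others."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(self_ratings) - mean(other_ratings)

# A morally consistent participant judges self and others alike:
print(inconsistency_index([2, 3, 2], [2, 2, 3]))   # near 0
# A morally inconsistent one excuses their own cheating:
print(inconsistency_index([7, 8, 7], [2, 2, 3]))   # clearly positive
```

A single number like this is only a summary; the study's point is that the brain activity behind the two rating tasks, not just the ratings themselves, differs in inconsistent individuals.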
To examine whether vmPFC activity plays a causal role in moral inconsistency, the researchers stimulated some participants' vmPFCs via a non-invasive method called transcranial temporal interference stimulation (tTIS) before they undertook the behavioural and judging tasks. They showed that vmPFC stimulation resulted in higher levels of moral inconsistency compared to participants who received sham stimulation.
These results suggest that people who are morally inconsistent don't make use of their vmPFC to integrate information when making behavioural decisions, the researchers say.
Individuals exhibiting moral inconsistency are not necessarily blind to their own moral principles; they simply fail, at a biological level, to consider and apply those principles in their own moral behaviour.
In future research, the team plans to investigate the brain activity related to the "victim perspective" to understand how these neural circuits react when people are treated unfairly.
These findings suggest that we should treat moral consistency like a skill that can be strengthened through deliberate decision making.
Moral inconsistency is based on the vmPFC's insufficient representation across tasks and connectedness, Cell Reports (2026). DOI: 10.1016/j.celrep.2026.117058. www.cell.com/cell-reports/full … 2211-1247(26)00136-1
----
Every day we encounter circumstances we consider wrong: a starving child, a corrupt politician, an unfaithful partner, a fraudulent person. These examples highlight several moral issues, including matters of care, fairness and betrayal.
A team of researchers set out to probe the nature of morality using one of moral psychology’s most prolific theories.
They discovered that a general network of brain regions was involved in judging moral violations, like cheating on a test, in contrast with mere social norm violations, such as drinking coffee with a spoon. What’s more, the network’s topography overlapped strikingly with the brain regions involved in theory of mind. However, distinct activity patterns emerged at finer resolution, suggesting that the brain processes different moral issues along different pathways, supporting a pluralist view of moral reasoning. The results, published in Nature Human Behaviour, even reveal differences between how liberals and conservatives evaluate a given moral issue.
They showed that moral judgments of a wide range of different types of morally relevant behaviours are instantiated in shared brain regions.
A machine-learning algorithm could reliably identify which moral category, or “foundation,” a person was judging based on their brain activity. This is only possible because moral foundations elicit distinct neural activations.
The group was guided by Moral Foundations Theory (MFT), a framework for explaining the origins and variation in human moral reasoning. MFT predicts that humans possess a set of innate and universal moral foundations.
These are generally organized into six categories: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, sanctity/degradation and liberty/oppression.
The framework arranges these foundations into two broad moral categories: care/harm and fairness/cheating emerge as “individualizing” foundations that primarily serve to protect the rights and freedoms of individuals. Meanwhile, loyalty/betrayal, authority/subversion and sanctity/degradation form “binding” foundations, which primarily operate at the group level.
The researchers created a model based on MFT to test whether the framework — and its nested categories — was reflected in neural activity. Sixty-four participants rated short descriptions of behaviors that violated a particular set of moral foundations, as well as behaviors that simply went against conventional social norms, which served as a control. An fMRI machine monitored activity across different regions of their brains as they reasoned through the vignettes.
Certain brain regions distinguished moral from non-moral judgment across the board, such as activity in the medial prefrontal cortex, temporoparietal junction and posterior cingulate, among other regions. Participants also took longer to rate moral transgressions than non-moral ones. The delay suggests that judging moral issues may involve a deeper evaluation of an individual’s actions and how they relate to one’s own values, the authors said.
Although moral judgments are intuitive at first, deeper judgment requires answers to the six ‘W questions’: who does what, when, to whom, with what effect, and why. This can be complex and takes time. Indeed, moral reasoning recruited regions of the brain also associated with mentalizing and theory of mind.
The researchers also found that transgressions of loyalty, authority and sanctity prompted greater activity in regions of the brain associated with processing other people’s actions, as opposed to the self. It was surprising to see how well the organization into ‘individualizing’ versus ‘binding’ moral foundations is reflected at the neurological level across multiple networks.
Next, the authors developed a decoding model that accurately predicted which specific moral foundation or social norm individuals were judging from fine-grained activity patterns across their brains. This would not have been possible if all moral categories were unified at the neurological level, they explained.
“This supports MFT’s prediction that each moral foundation is not encoded in a single ‘moral hotspot,’” the authors write, “but (is instead) instantiated via multiple brain regions distributed across the brain.” This finding suggests that the distinct moral categories proposed by Moral Foundations Theory have an underlying neurologic basis.
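The decoding idea can be sketched in miniature. The following is a hypothetical toy, not the authors' pipeline: it assumes each moral foundation adds a distinct distributed "signature" to simulated voxel activations, and it uses a simple nearest-centroid classifier, whereas the study's actual features and algorithm differ.

```python
# Toy neural decoding sketch (hypothetical; not the study's actual pipeline).
# Each trial is a vector of simulated voxel activations; each moral foundation
# contributes a fixed signature pattern plus noise. A nearest-centroid decoder
# trained on half the trials then predicts the foundation from the other half.
import random
import math

random.seed(0)

FOUNDATIONS = ["care", "fairness", "loyalty", "authority", "sanctity"]
N_VOXELS = 50
TRIALS_PER_CLASS = 40

# The assumption MFT-style decoding rests on: distinct foundations produce
# distinct distributed activation patterns.
signatures = {f: [random.gauss(0, 1) for _ in range(N_VOXELS)] for f in FOUNDATIONS}

def simulate_trial(foundation):
    """Return the foundation's signature plus measurement noise."""
    return [s + random.gauss(0, 0.8) for s in signatures[foundation]]

data = [(f, simulate_trial(f)) for f in FOUNDATIONS for _ in range(TRIALS_PER_CLASS)]
random.shuffle(data)
split = len(data) // 2
train, test = data[:split], data[split:]

# Average the training trials per foundation to get one centroid each.
centroids = {}
for f in FOUNDATIONS:
    vecs = [x for (label, x) in train if label == f]
    centroids[f] = [sum(col) / len(vecs) for col in zip(*vecs)]

def predict(x):
    """Label a trial by its nearest centroid (Euclidean distance)."""
    return min(centroids, key=lambda f: math.dist(x, centroids[f]))

accuracy = sum(predict(x) == label for label, x in test) / len(test)
print(f"decoding accuracy: {accuracy:.2f} (chance = {1/len(FOUNDATIONS):.2f})")
```

With clean synthetic signatures the toy decodes far above chance; the study's contribution is showing that comparable above-chance decoding holds for real fMRI patterns, which is what rules out a single undifferentiated "moral hotspot."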
In this way, moral reasoning is similar to other mental tasks: it elicits characteristic patterns across the brain, with nuances based on the specifics. Visual areas of the brain, for instance, respond whenever a person views an image; however, when looking at the pattern of activation in such a region, one can clearly discern whether someone is looking at a house or a face. Analogously, moral reasoning activates certain regions of the brain, “yet, the activation patterns in those same regions are highly distinct for different classes of moral behaviors, suggesting that they are not unified.”
The results provide evidence at the neurological level that liberals and conservatives have complex differential neural responses when judging moral foundations.
----
© 2026 Created by Dr. Krishna Kumari Challa.