Science Simplified!

                       JAI VIGNAN

All about Science - to remove misconceptions and encourage scientific temper

Communicating science to the common people

'To make them see the world differently through the beautiful lens of science'

  • Dr. Krishna Kumari Challa

    Rewriting the rules of genetics: Study reveals gene boundaries are dynamic, not fixed

    Molecular biologists have long thought that the beginning of a gene simply launched transcription—the process by which a segment of DNA is copied into RNA, which in turn guides production of the proteins that cells need to function.

    But a new study published in Science challenges that understanding, revealing that the beginnings and ends of genes are not fixed points but move together—reshaping how cells build proteins and adapt through evolution.

    This work rewrites a textbook idea: the beginning of a gene doesn't just launch transcription—it helps decide where it stops and what protein you ultimately make. 

    For years, we taught that a gene's 'start' only decides where transcription begins. We now show the start also helps set the finish line—gene beginnings control gene endings, say the researchers behind this new work.

    The discovery offers a promising new strategy for targeting cancer and neurological disorders, as well as developmental delays and aging. When gene transcription is disrupted or misregulated, protein production can become abnormal, potentially causing tumor growth.

    The understanding that the beginning and ends of genes are connected could allow physicians to redirect gene expression—restoring healthy protein variants and suppressing harmful ones, without altering the underlying DNA sequence.

    Misplacing a start or an end isn't a small mistake—it can flip a protein's domain structure and change its function, too. In cancer, that flip can mean turning a tumor suppressor into an oncogene. An oncogene is a mutated gene that has the potential to cause cancer by promoting uncontrolled cell growth and division.

    These new findings show that controlling where a gene begins is a powerful way to control where it ends—and, ultimately, what a cell can do. 

    Ezequiel Calvo-Roitberg et al, mRNA initiation and termination are spatially coordinated, Science (2025). DOI: 10.1126/science.ado8279

  • Dr. Krishna Kumari Challa

    Flipping the switch on sperm motility offers new hope for male infertility

    Infertility affects about one in six couples, and male factors account for roughly half of all cases—often because sperm don't swim well. Researchers have uncovered a key component of the "switch" that keeps the movement signal strong, offering a promising new avenue for both diagnosis and treatment. When this switch is absent, sperm slow down, and fertilization fails. By restoring that signal in the lab, the team rescued swimming and achieved healthy births in mice.

    The study has been published in Proceedings of the National Academy of Sciences.

    For sperm to successfully fertilize an egg, they must be able to swim, a process driven by their tail. This movement is activated by an essential signaling molecule called cyclic AMP (cAMP). While it was known that an enzyme named soluble adenylyl cyclase (sAC) produces cAMP inside sperm, the precise mechanism controlling this enzyme's stability and function remained largely a mystery.

    The researchers focused on a protein with a previously unknown function, TMEM217, which is produced specifically in the testes. They engineered mice that could not produce TMEM217 and found that the males were completely infertile, with sperm that were almost entirely immotile. Further investigation revealed that TMEM217 partners with another protein, SLC9C1, to form a stable complex.

    This complex is crucial for maintaining the presence of the sAC in mature sperm. Without TMEM217, SLC9C1 is lost and sAC is markedly reduced, causing cAMP levels to plummet and sperm motility to fail.  

    In a significant breakthrough, the team took the immotile sperm from these mice and treated them with a cAMP analog—a molecule that mimics cAMP. This treatment successfully restored the sperm's movement and enabled them to fertilize eggs in vitro, leading to the birth of healthy pups.

    The study has revealed a fundamental "switch" in sperm, providing a deeper understanding of sperm motility regulation. The discovery of the TMEM217-SLC9C1-sAC axis offers a new target for diagnosing unexplained cases of male infertility.

    Formation of a complex between TMEM217 and the sodium-proton exchanger SLC9C1 is crucial for mouse sperm motility and male fertility, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2516573122

  • Dr. Krishna Kumari Challa

    New lab-grown human embryo model produces blood cells

    Scientists have used human stem cells to create three-dimensional embryo-like structures that replicate certain aspects of very early human development—including the production of blood stem cells. The findings are published in the journal Cell Reports.

    Human blood stem cells, also known as hematopoietic stem cells, are immature cells that can develop into any type of blood cell, including red blood cells that carry oxygen and various types of white blood cells crucial to the immune system.

    The embryo-like structures, which the scientists have named "hematoids," are self-organizing and start producing blood after around two weeks of development in the lab—mimicking the development process in human embryos.

    The structures differ from real human embryos in many ways, and cannot develop into them because they lack several embryonic tissues, as well as the supporting yolk sac and placenta needed for further development.

    Hematoids hold exciting potential for a better understanding of blood formation during early human development, simulating blood disorders like leukemia, and for producing long-lasting blood stem cells for transplants.

    The human stem cells used to derive hematoids can be created from any cell in the body. This means the approach also holds great potential for personalized medicine in the future, by allowing the production of blood that is fully compatible with a patient's own body.

    A post-implantation model of human embryo development includes a definitive hematopoietic niche, Cell Reports (2025). DOI: 10.1016/j.celrep.2025.116373

  • Dr. Krishna Kumari Challa

    A neural basis for flocking

    When animals move together in flocks, herds, or schools, the neural dynamics in their brains become synchronized through shared ways of representing space, a new study suggests. The findings challenge the conventional view of how collective motion arises in nature.

    Flocking animals, such as hundreds of birds sweeping across the sky in unison, are a mesmerizing sight. But how does their collective motion—seen in many species, from swarming locusts to schooling fish and flocking birds—arise?

    Researchers have developed a novel theoretical framework that integrates neurobiological principles to upend long-held assumptions about how flocking behavior emerges in nature.

    In a recent article published in Nature Communications they demonstrate that flocking does not require individuals to rely on rigid behavioral rules, as is typically assumed. Instead, it can arise naturally from a simple and widespread neural architecture found across the animal kingdom: the ring attractor network.

    In the new model, flocking arises because neural activity in each animal becomes linked through perception: Every individual processes its surroundings using a ring attractor—a circular network of neurons that tracks the direction toward perceived objects in space. This way, the animal can maintain bearings toward others relative to stable features in the environment. The researchers found that when many such individuals interact, their neural dynamics synchronize, giving rise to spontaneous alignment and collective movement.

    This means that coordinated motion can emerge directly from navigational processes in the brain, challenging decades of theory.

    The new framework shows that collective motion emerges when individuals represent the directions of others relative to stable features in their surroundings—a world-centered, or allocentric, perspective. This mechanism underlies what the authors describe as "allocentric flocking."

    Mohammad Salahshour et al, Allocentric flocking, Nature Communications (2025). DOI: 10.1038/s41467-025-64676-5

  • Dr. Krishna Kumari Challa

    Mom's voice boosts language-center development in preemies' brains, study finds

    Hearing the sound of their mother's voice promotes development of language pathways in a premature baby's brain, according to a new  study.

    During the study, which is published in Frontiers in Human Neuroscience, hospitalized preemies regularly heard recordings of their mothers reading to them. At the end of the study, MRI brain scans showed that a key language pathway was more mature than that of preemies in a control group who did not hear the recordings. It is the first randomized controlled trial of such an intervention in early development.

    This is the first causal evidence that speech experience contributes to brain development at this very young age.

    Premature babies—born at least three weeks early—often spend weeks or months in the hospital, typically going home around their original due dates. During hospitalization, they hear less maternal speech than if they had continued to develop in utero.

    Parents can't usually stay at the hospital around the clock; they may have older children to care for or jobs they must return to, for example. Preemies are at risk for language delays, and scientists have suspected that reduced early-life exposure to the sounds of speech contributes to the problem.

    The researchers decided to boost preemies' exposure to their mom's voices during hospitalization. They did this by playing recordings of the mothers speaking, a total of two hours and 40 minutes a day, for a few weeks at the end of the babies' hospital stays.

    Babies were exposed to this intervention for a relatively short time. In spite of that, researchers saw very measurable differences in their language tracts. It's powerful that something fairly small seems to make a big difference.

    Part 1

  • Dr. Krishna Kumari Challa

    Fetal hearing begins to develop a little more than halfway through pregnancy, around 24 weeks into what is normally a 40-week gestation period. As the fetus grows, the uterus expands and the uterine wall thins.

    Late in pregnancy, more sounds, including the mother's conversations, reach the fetus. At birth, full-term newborns recognize their mother's voice and prefer the sounds of their parents' native language to other languages, prior research has shown.

    These factors suggest that listening to Mom's voice contributes to brain maturation in the latter half of a full-term pregnancy.

    The researchers realized that by supplementing the sounds premature babies hear in the hospital so that they more closely resemble what the babies would have heard in the womb, they had a unique opportunity to improve brain development at this stage of life.

     Listening to Mom in the Neonatal Intensive Care Unit: A randomized trial of increased maternal speech exposure on white matter connectivity in infants born preterm, Frontiers in Human Neuroscience (2025). DOI: 10.3389/fnhum.2025.1673471

    Part 2

  • Dr. Krishna Kumari Challa

    Men’s brains shrink more with age
    Men’s brains shrink more as they age than women’s brains do, which could scupper the theory that age-related brain changes explain why women are more frequently diagnosed with Alzheimer’s disease than men. Using more than 12,500 brain scans from 4,726 people, researchers found that men experienced a greater reduction in volume across more regions of the brain over time than women did. This suggests that sex differences in brain volume don’t play a part in the development of Alzheimer’s, but “just looking at age-related changes in brain atrophy is unlikely to explain the complexities behind [the disease]”, say neurophysiologists.

     Proceedings of the National Academy of Sciences paper

    https://www.pnas.org/doi/10.1073/pnas.2510486122

    https://www.nature.com/articles/d41586-025-03353-5

  • Dr. Krishna Kumari Challa

    A rare variety of wheat with three ovaries—gene discovery could triple production

    Researchers discovered the gene that makes a rare form of wheat grow three ovaries per flower instead of one. Since each ovary can potentially develop into a grain of wheat, the gene could help farmers grow much more wheat per acre. Their work is published in the journal Proceedings of the National Academy of Sciences.

    The special trait of growing three ovaries per flower was initially discovered in a spontaneously occurring mutant of common bread wheat. But it wasn't clear what genetic changes led to the new trait. The UMD team created a highly detailed map of the multi-ovary wheat's DNA and compared it to regular wheat.

    They discovered that the normally dormant gene WUSCHEL-D1 (WUS-D1) was "switched on" in the multi-ovary wheat.
    When WUS-D1 is active early in flower development, it enlarges the flower-building tissues, enabling them to produce extra female parts like pistils or ovaries.

    If breeders can control or mimic this genetic trick of activating WUS-D1, they could design new wheat varieties that grow more kernels per plant. Even small gains in the number of kernels per plant can translate into huge increases in food supply at the global scale.

    Pinpointing the genetic basis of this trait offers a path for breeders to incorporate it into new wheat varieties, potentially increasing the number of grains per spike and overall yield, say the researchers. By employing a gene editing toolkit, scientists can now focus on further improving this trait for enhancing wheat yield. This discovery provides an exciting route to develop cost-effective hybrid wheat.

    The discovery of WUS-D1 could also lead to the development of similar multi-ovary varieties of other grain crops.

     Adam Schoen et al, WUSCHEL-D1 upregulation enhances grain number by inducing formation of multiovary-producing florets in wheat, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2510889122

  • Dr. Krishna Kumari Challa

    Electric charge connects jumping worm to aerial prey

    A tiny worm that leaps high into the air—up to 25 times its body length—to attach to flying insects uses static electricity to perform this astounding feat, scientists have found.

    The journal PNAS published the work on the nematode Steinernema carpocapsae, a parasitic roundworm. 

    Researchers identified the electrostatic mechanism this worm uses to hit its target and showed how important the mechanism is for the worm's survival. A higher voltage, combined with a tiny breath of wind, greatly boosts the odds of a jumping worm connecting to a flying insect.

    They conducted experiments using high-speed microscopy to film the parasitic worm—whose length is about the diameter of a needle point—as it leaped onto electrically charged fruit flies.

    The researchers showed how a charge of a few hundred volts, similar to that generated by an insect's wings beating the air, initiates an opposite charge in the worm, creating an attractive force. They identified electrostatic induction as the charging mechanism driving this process.

    Using physics, scientists learned something new and interesting about an adaptive strategy in an organism.

    Ranjiangshang Ran et al, Electrostatics facilitate midair host attachment in parasitic jumping nematodes, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2503555122

  • Dr. Krishna Kumari Challa

    Differences in coexistence create new species

    A simple change in species composition can impact the course of evolution: A research team shows that the presence of just one other fish species is enough to drive the emergence of new species in sticklebacks.

    It has long been assumed that adaptation to different habitats plays an important role in the evolution of new species. Yet how important this influence truly is—particularly during the initial stages of the speciation process—and which ecological differences are most critical remain major questions in evolutionary research.

    For the current study, the research team studied populations of threespine stickleback—small fish about the size of a finger—from lakes in western Canada. These lakes formed after glaciers from the last ice age melted less than 12,000 years ago and were then colonized by sticklebacks from the sea. While many of these lakes are environmentally similar, they differ in one aspect: in some, another fish species, the prickly sculpin, lives alongside sticklebacks, while in other lakes sculpins are absent.

    This seemingly simple ecological difference—living with or without sculpins—has repeatedly pushed sticklebacks down distinct evolutionary paths: in lakes with sculpins, sticklebacks have evolved into slimmer open-water forms, while in sculpin-free lakes they have become stockier bottom-feeding specialists.

     Marius Roesti et al, A species interaction kick-starts ecological speciation in allopatry, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2506625122

  • Dr. Krishna Kumari Challa

    Human cells activate self-destruction when viruses disrupt RNA production, study shows

    Viruses are masters at taking over our cells: They disable our defenses and hijack the cellular machinery in order to multiply successfully. For example, the herpes simplex virus 1, which causes blister-like skin rashes, and influenza viruses specifically block a crucial step in gene activity in which the production of RNA molecules is completed—known as transcription termination. The blockade results in unnaturally long RNA molecules that cannot be translated into proteins. This suppresses the antiviral defense in the cells and creates optimal conditions for the viruses to multiply.

    A new study published in Nature now shows that human cells are not helpless against this viral sabotage. They recognize the disruption of transcription termination as an alarm signal, activate a "self-destruction program" and sacrifice themselves—even before the virus can multiply in them. This enables them to nip the spread of the infection in the bud.

    Researchers discovered that the unnaturally long RNA molecules adopt a special structure: They twist into left-turning double strands, known as Z-RNAs. These unusual RNA forms are recognized by the cellular protein ZBP1. And then the controlled cell death begins.

    It is particularly noteworthy that Z-RNAs form primarily in those sections of these unnaturally long RNA molecules that originate, among other things, from remnants of previous viral infections. These otherwise silent areas of our genome are only transcribed into RNA due to the virus-related disruption of transcription termination.

    Our cells therefore use these genetic remnants of ancient viral infections to detect and ward off current viral attacks.

    Evolution has thus turned the tables: what once began as a viral invasion now serves as an alarm signal for the antiviral immune defense. This discovery impressively demonstrates how closely virus and host have been intertwined over millions of years—and how our cells can transform viral sabotage into highly effective protective strategies.

     Chaoran Yin et al, Host cell Z-RNAs activate ZBP1 during virus infections, Nature (2025). DOI: 10.1038/s41586-025-09705-5

  • Dr. Krishna Kumari Challa

    Scientists turned off moths' sex signals—this could be the key to greener pest control

    A single "sexy" gene could help us combat one of the world's most destructive fruit pests. By deleting the gene that lets female moths produce their mating scent, researchers  created an "unsexy" moth—and showed one way to turn insect attraction into a powerful pest control tool.

    You've probably seen moths flittering around a bright lamppost on a balmy summer night. Those same insects, in their larval form, are the worms that burrow into your apples and peaches, making them serious pests in agriculture.
    Moths are usually controlled with chemical pesticides, but pests evolve resistance and these sprays also harm bees and other pollinators. We need new and more sustainable methods to protect important crops targeted by moth larvae, like apples, maize, tomatoes and rice.

    In a new study published in Insect Biochemistry and Molecular Biology, researchers have demonstrated a way to unravel sexual communication in insects and provide a more sustainable alternative to pesticides. Yes, now we can stop moths by using their natural instincts against them.

    Marie Inger Dam et al, Sex pheromone biosynthesis in the Oriental fruit moth Grapholita molesta involves Δ8 desaturation, Insect Biochemistry and Molecular Biology (2025). DOI: 10.1016/j.ibmb.2025.104307

  • Dr. Krishna Kumari Challa

    Why women's brains face higher risk: Scientists pinpoint X-chromosome gene behind MS and Alzheimer's

    New research has identified a sex-chromosome linked gene that drives inflammation in the female brain, offering insight into why women are disproportionately affected by conditions such as Alzheimer's disease and multiple sclerosis as well as offering a potential target for intervention.

    The study, published in the journal Science Translational Medicine, used a mouse model of multiple sclerosis to identify a gene on the X chromosome that drives inflammation in brain immune cells, known as microglia. Because females have two X chromosomes, as opposed to only one in males, they get a "double dose" of inflammation, which plays a major role in aging, Alzheimer's disease and multiple sclerosis.

    When the gene, known as Kdm6a, and its associated protein were deactivated, the multiple sclerosis-like disease and neuropathology were both ameliorated with high significance in female mice.

    Multiple sclerosis and Alzheimer's disease each affect women more often than men, about two to three times as often. Also, two-thirds of healthy women have 'brain fog' during menopause. These new findings explain why and point to a new treatment to target this.

    When researchers genetically "knocked out" the gene Kdm6a in brain immune cells, the inflammatory molecules shifted from being activated to a resting state. Additionally, they  performed a pharmacologic "knock down" of the protein made by this gene using metformin. Metformin is widely used as a treatment for diabetes but is currently being researched for potential anti-aging properties.

    While these interventions were highly significant in female mice, their effect was almost undetectable in males.

    This is consistent with there being 'more to block' in females due to having two copies of the X-linked gene.

    It's also why females are more likely to get MS and AD than males. This has implications for the clinic. Women may respond differently to metformin treatment than men.

    The findings may also have implications for explaining a connection to brain fog in healthy women during menopause.

    Sex chromosomes and sex hormones achieve a balance through evolution. There is a selection bias to do so. Females have a balance between X chromosome-driven inflammation that can be good to fight infections at child-bearing ages. This is held in check by estrogen, which is anti-inflammatory and neuroprotective. As women age, menopause causes loss of estrogen, unleashing the proinflammatory and neurodegenerative effects of this X chromosome in brain immune cells.

    Yuichiro Itoh et al, Microglia-specific deletion of the X-chromosomal gene Kdm6a reverses the disease-associated microglia translatome in female mice, Science Translational Medicine (2025). DOI: 10.1126/scitranslmed.adq3401

  • Dr. Krishna Kumari Challa

    'Jump-scare' science: Study elucidates how the brain responds to fear

    In haunted houses across the US this month, threatening figures will jump out of the shadows, prompting visitors—wide-eyed and heart racing—to instinctively freeze and flee.

    Evolutionarily speaking, this "innate threat response" is key to survival, helping a wide variety of animal species escape predators. But when stuck in overdrive it can cause problems for humans.

    A research team has identified a novel brain circuit responsible for orchestrating this threat response. Known as the interpeduncular nucleus (IPN), this dense cluster of specialized neurons not only jump-starts that freeze-and-flee reaction but also dials it down when animals learn there's no real danger.

    In people with anxiety or post-traumatic stress disorder (PTSD), this circuit may be broken, researchers say.

    The findings could help explain why some people have a greater appetite for risk than others and lead to new therapies for psychiatric disorders.

    The brain's threat system is like an alarm. It needs to sound when danger is real, but it needs to shut off when it's not. This new study shows how the brain learns to fine-tune those responses through experience, helping us adapt to the world.

     Elora W. Williams et al, Interpeduncular GABAergic neuron function controls threat processing and innate defensive adaptive learning, Molecular Psychiatry (2025). DOI: 10.1038/s41380-025-03131-9


  • Dr. Krishna Kumari Challa

    Preventing overhydration: Study uncovers a neural circuit that prompts mice to stop drinking

    Identifying the neural mechanisms that support the regulation of vital physiological processes, such as drinking, eating and sleeping, is a long-standing goal within the neuroscience research community. As the disruption of these processes can severely impact people's health and everyday functioning, uncovering their neural and biological underpinnings is of the utmost importance.

    New insights gathered by neuroscientists could ultimately inform the development of more effective interventions designed to regulate vital physiological processes. Thirst and hunger are known to be regulated by homeostatic processes, biological processes that allow the body to maintain internal stability.

    Yet drinking behaviour can also be anticipatory, which means that animals and humans often adjust their actions (i.e., stop drinking) before the concentration of substances in the blood changes in response to drinking water. The mechanisms through which the brain predicts when it is the right time to stop drinking remain poorly understood.

    Researchers  recently carried out a study involving mice aimed at shedding new light on these mechanisms. Their findings, published in Nature Neuroscience, led to the identification of a neural pathway that reduces neural activity in specific regions of the mouse brain, signaling that the body has received enough water.

    Drinking behaviour is not only homeostatically regulated but also rapidly adjusted before any changes in blood osmolality occur—a phenomenon known as anticipatory thirst satiation.

    Homeostatic and anticipatory signals converge in the subfornical organ (SFO); however, the neural pathways conveying peripheral information to the SFO before changes in blood composition had remained incompletely understood until now.

    Researchers now  reveal an inhibitory pathway from the medial septum (MS) to the SFO that is involved in the control of anticipatory drinking behaviour in mice.

    As part of their experiments, researchers observed the drinking behavior of adult mice, while recording their neural activity. This led to the discovery of a neural pathway connecting the MS, a small region in the mouse brain that contributes to the synchronization of brain circuits, and the SFO, a region implicated in the monitoring of bodily fluids.

    "MS γ-aminobutyric acid (GABA)ergic neurons encode water-satiation signals by integrating cues from the oral cavity and tracking gastrointestinal signals," wrote the authors in their research paper. "These neurons receive inputs from the parabrachial nucleus and relay to SFO CaMKII neurons, forming a bottom-up pathway with activity that prevents overhydration. Disruption of this circuit leads to excessive water intake and hyponatremia."

    Essentially, the researchers found that after a mouse starts drinking, GABAergic neurons in the MS become active and receive signals from the parabrachial nucleus, a brain region that processes signals originating from the mouth and gut. These GABAergic neurons then send inhibitory signals to neurons in the SFO, which in turn modulate the feeling of thirst.

    Part 1

  • Dr. Krishna Kumari Challa

    Interestingly, when the team disrupted this pathway's activity, they found that mice no longer stopped drinking and developed hyponatremia. This is a condition characterized by overhydration and an abnormally low concentration of sodium in the blood.

    This recent study gathered new valuable insight into how the mouse brain prevents overhydration, signaling that it is time to stop drinking.

    Lingyu Xu et al, A bottom-up septal inhibitory circuit mediates anticipatory control of drinking, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02056-4.

    Part 2

  • Dr. Krishna Kumari Challa

    The way we talk to chatbots affects their accuracy, new research reveals

    Whether we're seeking customer support, looking for recommendations, or simply asking a quick question, AI chatbots are designed to give us the answers we're looking for. But there's more going on beneath the surface. Every time we chat with them, they are learning from us to improve their understanding and responses. And the type of language we use, whether formal or informal, directly affects the quality of their answers, according to new research.

    In general, people naturally adapt their conversation style to the person they are speaking with. 

    The researchers compared thousands of messages people sent to human agents with those sent to AI chatbots, focusing on features like grammar, vocabulary and politeness. They found that people were 14.5% more polite and formal and 5.3% more grammatically fluent when chatting with humans than when talking with AI, based on analysis by the Claude 3.5 Sonnet model.

    Next, they trained an AI model called Mistral 7B on about 13,000 real chats between people, then tested how well it understood more than 1,300 messages people had sent to chatbots. To broaden the AI's exposure, they also created blunt and polite rewrites of those messages to simulate different communication styles.

    It turns out that chatbots trained on a diverse mix of message styles—real messages plus the synthetic rewrites—were 2.9% better at understanding user intent than AI trained solely on original human conversations. The researchers also tried to improve the Mistral model's understanding by rewriting informal messages at inference time to be more formal, but this reduced understanding by almost 2%.

    So the best way to make chatbots smarter is to train them on a range of communication styles, as the researchers state in their paper published on the arXiv preprint server. "Training-time exposure to diverse linguistic variation is more effective than inference-time normalization. Models must learn to interpret diverse communication styles during training, rather than rely on brittle post-hoc transformations that risk semantic distortion."
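    The training-time-versus-inference-time contrast the authors draw can be sketched in a few lines of Python. Everything below is a toy illustration: the rewrite rules, word lists, and function names are assumptions for the sake of the example, whereas the actual study used a Mistral 7B model and LLM-generated blunt/polite rewrites rather than simple string substitutions.

    ```python
    # Toy sketch of training-time style augmentation: each labeled user
    # message is kept alongside a "formal" and a "blunt" rewrite, so a
    # model trained on the result sees diverse styles of the same intent.

    CONTRACTIONS = {"can't": "cannot", "won't": "will not", "gonna": "going to"}

    def formalize(message: str) -> str:
        """Toy 'formal rewrite': expand contractions and capitalize."""
        for informal, formal in CONTRACTIONS.items():
            message = message.replace(informal, formal)
        return message[:1].upper() + message[1:]

    def bluntify(message: str) -> str:
        """Toy 'blunt rewrite': strip common politeness markers."""
        for marker in ("please ", "could you ", "thanks"):
            message = message.replace(marker, "")
        return message.strip()

    def augment_training_set(labeled_messages):
        """Strategy (a), training-time augmentation: keep the original
        message and add both style variants, all with the same intent
        label. Contrast with strategy (b), inference-time normalization,
        which would instead rewrite incoming messages at query time."""
        augmented = []
        for text, intent in labeled_messages:
            augmented.append((text, intent))
            augmented.append((formalize(text), intent))
            augmented.append((bluntify(text), intent))
        return augmented

    data = [("could you please reset my password", "reset_password")]
    # 1 original + 2 style variants = 3 training examples per message
    print(augment_training_set(data))
    ```

    The point of the sketch is only the shape of the pipeline: the paper's finding is that exposing the model to this kind of variation during training beats rewriting messages into a "canonical" style at inference time, which risks distorting what the user meant.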

    Fulei Zhang et al, Mind the Gap: Linguistic Divergence and Adaptation Strategies in Human-LLM Assistant vs. Human-Human Interactions, arXiv (2025). DOI: 10.48550/arxiv.2510.02645

  • Dr. Krishna Kumari Challa

    Men experience more brain atrophy with age despite women's higher Alzheimer's risk

    Many women have complained to me that their husbands "behaved strangely" as they got older.

    It seems the men complained more, got irritated and angry more easily, understood situations less, grumbled a lot ... and the descriptions take a strange turn as the women go on.

    Now we have an explanation for such behaviours.

    Women are far more likely than men to end up with Alzheimer's disease (AD). This may, at least partially, be due to women's longer average lifespans, but many scientists think there is probably more to the story. It would be easy to surmise that the increased risk is also related to differences in the way men's and women's brains change as they age. 

    Now, a new study, published in Proceedings of the National Academy of Sciences, indicates that it's men who experience greater decline in more regions of the brain as they age. Researchers involved in the study analyzed 12,638 brain MRIs from 4,726 cognitively healthy participants (at least two scans per person), aged 17–95, to find out how age-related changes occurred and whether they differed between men and women.

    The results showed that men experienced declines in cortical thickness and surface area in many regions of the brain and a decline in subcortical structures in older age. Meanwhile, women showed greater decline only in a few regions and more ventricular expansion in older adults. So, while differences in brain aging between the sexes are apparent, the cause of increased AD prevalence in women is still a bit mysterious.

    "These findings suggest that the higher prevalence of AD diagnoses in women likely stems from factors beyond differential rates of age-related brain atrophy," the study authors write.

    One factor that might be to blame is genetics, particularly the APOE ε4 allele, which may affect protein accumulation in the brain and work differently in men and women. Other factors might include differences in hormonal changes, diagnosis patterns, and sociocultural influences.

    Survival bias may also skew the results in AD studies, as more men may have been diagnosed with AD if their average lifespans matched women's more closely. In this particular study, participants were also more educated on average, which is a protective factor for AD—leading to a potential representativity bias.

    When the researchers corrected for life expectancy, they say some of the differences did clear up for men and additional differences cropped up in women.

    "The interpretation of these sex differences is complicated by our life expectancy analyses, which removed several cortical decline effects in men while revealing effects in women, including greater hippocampal decline. Whether this reflects the removal of proximity-to-death artifacts or elimination of biological aging differences cannot be determined, and these findings should be interpreted with caution, especially considering representativity bias in our sample with potentially healthier men," the authors explain.

    Anne Ravndal et al, Sex differences in healthy brain aging are unlikely to explain higher Alzheimer's disease prevalence in women, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2510486122

  • Dr. Krishna Kumari Challa

    Disconnected cerebral hemisphere in epilepsy patients shows sleep-like state during wakefulness

    Sleep-like slow-wave patterns persist for years in surgically disconnected neural tissue of awake epilepsy patients, according to a study published in PLOS Biology.

    The presence of slow waves in the isolated hemisphere impairs consciousness; however, whether they serve any functional or plastic role remains unclear.

    Hemispherotomy is a surgical procedure used to treat severe cases of epilepsy in children. The goal of this procedure is to achieve maximal disconnection of the diseased neural tissue, potentially encompassing an entire hemisphere, from the rest of the brain to prevent the spread of seizures.

    The disconnected cortex—the outer layer of neural tissue in the brain—is not surgically removed and has a preserved vascular supply. Because it is isolated from sensory and motor pathways, it cannot be evaluated behaviorally, leaving open the question of whether it retains internal states consistent with some form of awareness. More broadly, the activity patterns that large portions of the disconnected cortex can sustain in awake humans remain poorly understood.

    Researchers recently tried to investigate these things.

    They  used electroencephalography (EEG) to measure activity in the isolated cortex during wakefulness before and up to three years after surgery in 10 pediatric patients, focusing on non-epileptic background activity. Following surgery, prominent slow waves appeared over the disconnected cortex. This is novel evidence that this pattern can last for months and years after complete cortical disconnection. The persistence of slow waves raises the question of whether they play any functional role or merely reflect a regression to a default mode of cortical activity.

    The pronounced broad-band EEG slowing resembled patterns observed in conditions such as deep non-rapid eye movement (NREM) sleep, general anesthesia, and the vegetative state. The findings indicate absent or reduced likelihood of dream-like experiences in the isolated cortex. Overall, the EEG evidence is compatible with a state of absent or reduced awareness.

    According to the researchers, any inference about the presence or absence of consciousness, based solely on the brain's physical properties such as prominent EEG slow waves, should be approached with caution, particularly in neural structures that are not behaviorally accessible. The slowing observed at the scalp level should be further characterized with intracranial recordings in cases in which clinical outcomes require postoperative invasive monitoring.

    Michele A. Colombo et al, Hemispherotomy leads to persistent sleep-like slow waves in the isolated cortex of awake humans, PLOS Biology (2025). DOI: 10.1371/journal.pbio.3003060

  • Dr. Krishna Kumari Challa

    How to STOP A Dog Attack BEFORE It Happens!

  • Dr. Krishna Kumari Challa

    Older fathers linked to more new gene mutations in puppies, study finds

    An international study has shown how and when entirely new gene mutations, known as de novo mutations, originate in dogs. A key finding is that higher paternal age increases the number of de novo mutations in puppies. Maternal age also has an effect.

    The study analyzed 390 parent–offspring trios. Trio denotes a design where the genomes of the puppy and both parents are sequenced. This enables accurately identifying gene mutations that do not occur in either parent's genome—mutations that have taken place in the sperm, the ovum or soon after conception. While these rare mutations are the basis of evolution, they can also predispose their carriers to hereditary diseases.

    The results, published in Genome Biology, also show why dogs differ from humans in certain genomic regions and what the findings mean for canine health and breeding.

    Shao-Jie Zhang et al, Determinants of de novo mutations in extended pedigrees of 43 dog breeds, Genome Biology (2025). DOI: 10.1186/s13059-025-03804-2

  • Dr. Krishna Kumari Challa

    What happens when the cell's 'antenna' malfunctions?

    Researchers have uncovered the molecular mechanisms responsible for regulating a structure that plays a critical role in how cells communicate with their environment. Their new study has been published in Communications Biology.

    Found on the surface of almost every cell, the primary cilium is a tiny antenna-like projection that enables the cell to sense environmental signals. Through this structure, cells regulate essential processes such as growth, development, and adaptation. For healthy functioning, primary cilia must maintain the correct length, stability, and morphology.

    The research highlights the role of DYRK kinases, a family of enzymes that regulate intracellular processes. The findings  show that these kinases are essential for maintaining the length, stability, and shape of primary cilia.

    When DYRK kinases malfunction, cilia may become abnormally long, structurally deformed, or unstable. In such cases, the cell loses its ability to properly sense and process external signals.

    This discovery not only advances our understanding of fundamental cell biology but also provides new perspectives on health conditions linked to ciliary dysfunction, such as developmental disorders, kidney diseases, and vision loss. Moreover, it may open new avenues for addressing complex diseases in the future by uncovering potential targets for therapeutic intervention.

     Melis D. Arslanhan et al, Kinase activity of DYRK family members is required for regulating primary cilium length, stability and morphology, Communications Biology (2025). DOI: 10.1038/s42003-025-08373-5

  • Dr. Krishna Kumari Challa

    'Wetware': Scientists use human mini-brains to power computers

    "Wetware" is a term drawn from the computer-related ideas of hardware and software, but applied to biological life forms.

    Ten universities around the world are conducting experiments using FinalSpark's organoids -- the small company's website even has a live feed of the neurons at work.

    Inside a lab in the picturesque Swiss town of Vevey, a scientist gives tiny clumps of human brain cells the nutrient-rich fluid they need to stay alive.

    It is vital these mini-brains remain healthy, because they are serving as rudimentary computer processors—and, unlike your laptop, once they die, they cannot be rebooted.

    This new field of research, called biocomputing or "wetware," aims to harness the evolutionarily honed yet still mysterious computing power of the human brain.

    The scientists think that processors using brain cells will one day replace the chips powering the artificial intelligence boom.

    The supercomputers behind AI tools like ChatGPT currently use silicon semiconductors to simulate the neurons and networks of the human brain. Instead of trying to mimic the brain, these scientists are using the real thing.

    Among other potential advantages, biocomputing could help address the skyrocketing energy demands of AI, which have already threatened climate emissions targets and led some tech giants to resort to nuclear power.

    Biological neurons are one million times more energy efficient than artificial neurons, these scientists say. They can also be endlessly reproduced in the lab, unlike the massively in-demand AI chips made by companies like behemoth Nvidia.

    But for now, wetware's computing power is a very long way from competing with the hardware that runs the world.

    Source: News agencies

    https://www.newindianexpress.com/lifestyle/tech/2025/Oct/17/wetware...

  • Dr. Krishna Kumari Challa

    Comprehending the expansion of the universe without using 'Dark Energy'

    Why is the universe expanding at an ever-increasing rate? This is one of the most exciting yet unresolved questions in modern physics. Because it cannot be fully answered using our current physical worldview, researchers assume the existence of a mysterious "dark energy." However, its origin remains unclear to this day.

    An international research team has come to the conclusion that the expansion of the universe can be explained—at least in part—without dark energy.

    In physics, the evolution of the universe has so far been described by the general theory of relativity and the so-called Friedmann equations. However, in order to explain the observed expansion of the universe on this basis, an additional "dark energy term" must be manually added to the equations.
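
    For reference, in the standard GR-based picture the first Friedmann equation only produces accelerated expansion once a cosmological-constant term Λ, interpreted as dark energy, is added by hand (this is the textbook form, not the paper's Finsler extension):

    ```latex
    H^2 = \left(\frac{\dot{a}}{a}\right)^2
        = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}
    ```

    Here a is the cosmic scale factor, ρ the energy density, k the spatial curvature, and the Λ term is the "manually added" dark-energy contribution the article refers to.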

    This unsatisfactory solution prompted the researchers to take a different approach. Their findings, published in the Journal of Cosmology and Astroparticle Physics, are based on an extension of general relativity (GR) by the more recently developed model of Finsler gravity. Unlike the original explanatory approach of GR, the Finsler model allows for a more accurate modeling of the gravitational force of gases, as it is based on a more general spacetime geometry.

    When the research team calculated the Finsler extension of the Friedmann equations, they made an exciting discovery: The Finsler-Friedmann equations already predict an accelerated expansion of the universe in a vacuum—without the need to introduce additional assumptions or "dark energy" terms.

    We may now be able to explain the accelerated expansion of the universe without dark energy, based on a generalized spacetime geometry, say the researchers. The new geometry opens up completely new possibilities for better understanding the laws of nature in the cosmos.

    Christian Pfeifer et al, From kinetic gases to an exponentially expanding universe—the Finsler-Friedmann equation, Journal of Cosmology and Astroparticle Physics (2025). DOI: 10.1088/1475-7516/2025/10/050. On arXiv (2025). DOI: 10.48550/arxiv.2504.08062

  • Dr. Krishna Kumari Challa

    Bats' brains reveal a global neural compass that doesn't depend on the moon and stars

    Some 40 kilometers east of the Tanzanian coast in East Africa lies Latham Island, a rocky, utterly isolated and uninhabited piece of land about the size of seven soccer fields. It was on this unlikely patch of ground that  researchers recorded—for the first time ever—the neural activity of mammals in the wild.

    In their study, published in Science, the team used a tiny device to record, at the level of single neurons, the brain activity of fruit bats as they flew around the island. The scientists discovered that the bats' neuronal "compass" is global: It provides stable directional information across the entire island and does not depend on the moon or stars.

    Many species share the behavioral ability to orient themselves using an "internal compass," and it is quite possible that humans rely on the same neural mechanism that was studied in these bats.

    They found that every time the bats flew with their heads pointing in a particular direction—north, for instance—a unique group of neurons became active, creating an "internal compass." Navigation by means of directional neurons had previously been observed in the lab, but this was the first evidence that it happens in nature as well. When the researchers analyzed the recordings from different parts of the island, they discovered that the activity of the head-direction cells was consistent and reliable across the entire island, enabling the bats to orient themselves over a large geographical area.

     The compass is global and uniform: No matter where the bat is on the island and no matter what it sees, specific cells always point in the same direction—north stays north and south stays south. 

     Shaked Palgi et al, Head-direction cells as a neural compass in bats navigating outdoors on a remote oceanic island, Science (2025). DOI: 10.1126/science.adw6202

  • Dr. Krishna Kumari Challa

    Scientists Just Discovered a Whole New Type of Connection Between Neurons

    Super-resolution microscopes have revealed a whole new type of connection between neurons in mouse and human brains.

    In the lab,  researchers identified tiny tubular bridges in the branching tips of cultured neurons. In further tests on mouse models of Alzheimer's disease, it appeared the bridges were shuttling calcium and disease-related molecules directly between cells.

    "[Similar] structures can transport a vast range of materials, from small ions (10⁻¹⁰ m) to large mitochondria (10⁻⁶ m)," the team writes in their paper.

    "In cultured neurons, we observed these nanotubes forming dynamically and confirmed that they possessed a distinct internal structure, setting them apart from other neuronal extensions."

    Neurons are well known for passing rapid messages to each other using synapses to transmit both electrical and chemical information. Yet, other cell types are known to use physically connecting bridging tubes to exchange molecules. Researchers have just confirmed that a similar type of tube bridge occurs in neurons too, using advanced imaging and machine learning.

    The researchers observed the nanotubes transporting amyloid-beta molecules that they had injected into mouse brain cells. These molecules have been implicated in neurodegenerative diseases like Alzheimer's, where they tend to clump together abnormally. When researchers stopped the bridges from forming, the amyloid-beta stopped spreading between cells, too, confirming that the nanotubes acted as direct conduits.

    "The computational model supported these findings, predicting that overactivation in the nanotube network could accelerate the toxic accumulation of amyloid in specific neurons, thereby providing a mechanistic link between nanotube alterations and the progression of Alzheimer's pathology," the researchers explain.

    https://www.science.org/doi/10.1126/science.adr7403

  • Dr. Krishna Kumari Challa

    Graying hair may reflect a natural defense against cancer risk

    Throughout life, our cells are constantly exposed to environmental and internal factors that can damage DNA. While such DNA damage is known to contribute to both aging and cancer, the precise connection—particularly how damaged stem cells shape long-term tissue health—has remained elusive.

    Melanocyte stem cells (McSCs) are tissue‐resident stem cells that serve as the source of mature melanocytes, the pigment‐producing cells responsible for hair and skin coloration. In mammals, these stem cells reside in the bulge–sub‐bulge region of hair follicles as immature melanoblasts, maintaining pigmentation through cyclical regeneration.

    Published in Nature Cell Biology, a study used long-term in vivo lineage tracing and gene expression profiling in mice to investigate how McSCs respond to different types of DNA damage.

    Researchers identified a specific response to DNA double-strand breaks: senescence-coupled differentiation (seno-differentiation), a process in which McSCs irreversibly differentiate and are then lost, leading to hair graying. This process is driven by activation of the p53–p21 pathway.

    In contrast, when exposed to certain carcinogens, such as 7,12-dimethylbenz(a)anthracene or ultraviolet B, McSCs bypass this protective differentiation program—even in the presence of DNA damage. Instead, they retain self-renewal capacity and expand clonally, a process supported by KIT ligand secreted both from the local niche and within the epidermis. This niche-derived signal suppresses seno-differentiation, tipping McSCs toward a tumor-prone fate.

    These findings reveal that the same stem cell population can follow antagonistic fates—exhaustion or expansion—depending on the type of stress and microenvironmental signals. It reframes hair graying and melanoma not as unrelated events, but as divergent outcomes of stem cell stress responses.

    Importantly, this study does not suggest that graying hair prevents cancer, but rather that seno-differentiation represents a stress-induced protective pathway that removes potentially harmful cells. Conversely, when this mechanism is bypassed, the persistence of damaged McSCs may predispose to melanomagenesis.

    By identifying the molecular circuits that govern this fate bifurcation, the study provides a conceptual framework that links tissue aging and cancer, and highlights the beneficial role of eliminating potentially harmful stem cells through natural "senolysis," resulting in a phenotype that safeguards against cancer.

     Yasuaki Mohri et al, Antagonistic stem cell fates under stress govern decisions between hair greying and melanoma, Nature Cell Biology (2025). DOI: 10.1038/s41556-025-01769-9

  • Dr. Krishna Kumari Challa

    Serotonin produced by gut bacteria provides hope for a novel IBS treatment

    New research clarifies the complex interaction between gut bacteria and irritable bowel syndrome (IBS). Experiments demonstrate that gut bacteria can produce the important substance serotonin. The finding may lead to future treatments.

    IBS is a common gastrointestinal disorder, more common in women, with symptoms such as abdominal pain, constipation or diarrhea. The cause of the disease is not clear, but the intestinal environment, including the gut microbiota and serotonin, appear to be important factors.

    Serotonin is best known as a neurotransmitter in the brain, but over 90% of the body's serotonin is produced in the gut, where it controls bowel movements via the enteric nervous system, sometimes called the "gut–brain."

    Previous research has shown that the bacteria in the gut, the gut microbiota, affect how much serotonin is produced by the host, but until now it has been unclear whether gut bacteria themselves can form biologically active serotonin.

    In the current study, published in the journal Cell Reports, the researchers have identified two bacteria that together can produce serotonin: Limosilactobacillus mucosae and Ligilactobacillus ruminis.

    When the bacteria were introduced into germ-free mice with serotonin deficiency, the levels of serotonin in the gut increased, as did the density of nerve cells in the colon. The bacteria also normalized the intestinal transit time.

     Researchers were also able to see that people with IBS had lower levels of one of the bacteria (L. mucosae) in their stools compared to healthy individuals, and that this bacterium also has the enzyme required for serotonin production.

    The results indicate that certain intestinal bacteria can produce bioactive serotonin and thus play an important role in intestinal health and open new avenues for the treatment of functional gastrointestinal disorders such as IBS.

    Chiara H. Moretti et al, Identification of human gut bacteria that produce bioactive serotonin and promote colonic innervation, Cell Reports (2025). DOI: 10.1016/j.celrep.2025.116434

  • Dr. Krishna Kumari Challa

    Brainwave study sheds light on cause of 'hearing voices'

    A new study led by psychologists  has provided the strongest evidence yet that auditory verbal hallucinations—or hearing voices—in schizophrenia may stem from a disruption in the brain's ability to recognize its own inner voice.

    In a paper published today in the journal Schizophrenia Bulletin, the researchers say the finding could also be an important step toward finding biological indicators that point to the presence of schizophrenia. This is significant, as there are currently no blood tests, brain scans, or lab-based biomarkers—signs in the body that can tell us something about our health—that are uniquely characteristic of schizophrenia.

    Inner speech is the voice in your head that silently narrates your thoughts—what you're doing, planning, or noticing. Most people experience inner speech regularly, often without realizing it, though there are some who don't experience it at all.

    The research shows that when we speak—even just in our heads—the part of the brain that processes sounds from the outside world becomes less active. This is because the brain predicts the sound of our own voice. But in people who hear voices, this prediction seems to go wrong, and the brain reacts as if the voice is coming from someone else.

    This confirms what mental health researchers have long theorized: that auditory hallucinations in schizophrenia may be due to the person's own inner speech being misattributed as external speech.

    And how do you measure it? One way is by using an EEG, which records the brain's electrical activity. Even though we can't hear inner speech, the brain still reacts to it—and in healthy people, using inner speech produces the same kind of reduction in brain activity as when they speak out loud.

    But in people who hear voices, that reduction of activity doesn't happen. In fact, their brains react even more strongly to inner speech, as if it's coming from someone else. That might help explain why the voices feel so real.

    Part 1

  • Dr. Krishna Kumari Challa

    In the experiments conducted, in the healthy participants, when the sound that played in the headphones matched the syllable they imagined saying in their minds, the EEG showed reduced activity in the auditory cortex—the part of the brain that processes sound and speech. This suggests the brain was predicting the sound and dampening its response—similar to what happens when we speak out loud.

    However, in the group of participants who had recently experienced auditory verbal hallucinations (AVH), the results were the reverse. In these individuals, instead of the expected suppression of brain activity when the imagined speech matched the sound heard, the EEG showed an enhanced response.

    Their brains reacted more strongly to inner speech that matched the external sound, which was the exact opposite of what the researchers found in the healthy participants.

    This reversal of the normal suppression effect suggests that the brain's prediction mechanism may be disrupted in people currently experiencing auditory hallucinations, which may cause their own inner voice to be misinterpreted as external speech.

    Participants in the second group—people with a schizophrenia-spectrum disorder who hadn't experienced AVH recently or at all—showed a pattern that was intermediate between the healthy participants and the hallucinating participants.

    The researchers say this is the strongest confirmation to date that the brains of people living with schizophrenia are misperceiving imagined speech as speech that is produced externally.

    Thomas Whitford et al, Corollary discharge dysfunction to inner speech and its relationship to auditory verbal hallucinations in patients with schizophrenia spectrum disorders, Schizophrenia Bulletin (2025). DOI: 10.1093/schbul/sbaf167

    Part 2

  • Dr. Krishna Kumari Challa

    An edible fungus could make paper and fabric liquid-proof

    As an alternative to single-use plastic wrap and paper cup coatings, researchers in Langmuir report a way to waterproof materials using edible fungus. Along with fibers made from wood, the fungus produced a layer that blocks water, oil and grease absorption. In a proof-of-concept study, the impervious film grew on common materials such as paper, denim, polyester felt and thin wood, revealing its potential to replace plastic coatings with sustainable, natural materials.

    By providing more ways to potentially reduce our reliance on single-use plastics, we can help lessen the waste that ends up in landfills and the ocean; nature offers elegant, sustainable solutions to help us get there.

    Fungi are more than their mushroom caps; underground they form an extensive, interwoven network of feathery filaments called mycelium. Recently, researchers have been inventing water-resistant materials made from these fibrous networks, including leather-like, electrically conductive gauze and spun yarn, because the surface of mycelium naturally repels water.

    Additionally, films made from the fluffy wood fibers used in paper-making—specifically, a microscopic form called cellulose nanofibrils—can create barriers for oxygen, oil and grease.

    The edible "turkey tail" fungus (Trametes versicolor) can grow with cellulose fibrils into a protective coating on various materials.

    Part 1

  • Dr. Krishna Kumari Challa

    To create the film, the researchers first blended T. versicolor mycelia with a nutrient-rich solution of cellulose nanofibrils. They applied thin layers of the mixture to denim, polyester felt, birch wood veneer and two types of paper, letting the fungus grow in a warm environment. Placing the samples in an oven for one day inactivated the fungus and allowed the coating to dry.

    It took at least three days of fungal growth for an effective water barrier to develop. And after four days, the newly grown layer didn't add much thickness to the materials (about the same as a coat of paint), but it did change their colors, forming mottled yellow, orange or tan patterns.

    Water droplets placed on the fungus-treated textiles and paper formed bead-like spheres, whereas similar droplets on untreated materials either flattened out or soaked in completely. In addition, the fungal coating prevented other liquids from being absorbed, including n-heptane, toluene and castor oil, suggesting that it could act as a barrier to many liquids. The researchers say this work is a successful demonstration of a food-safe fungal coating and shows the technology's potential to replace single-use plastic products.

    Sandro Zier et al, Growing Sustainable Barrier Coatings from Edible Fungal Mycelia, Langmuir (2025). DOI: 10.1021/acs.langmuir.5c03185

    Part 2

  • Dr. Krishna Kumari Challa

    Female bodybuilders at risk of sudden cardiac death, research indicates

    Sudden cardiac death is responsible for an unusually high proportion of deaths in female bodybuilders worldwide, according to research published in the European Heart Journal.

    Sudden cardiac death is when someone dies suddenly and unexpectedly due to a problem with their heart. It is generally rare in young and seemingly healthy individuals.

    The study found the greatest risk among women competing professionally. It also revealed a high proportion of deaths from suicide and homicide among female bodybuilders.

    Bodybuilders, both female and male, often engage in extreme training, and use fasting and dehydration strategies to achieve extreme physiques. Some also take performance-enhancing substances. These strategies can take a serious toll on the heart and blood vessels.

    Over recent years, more and more women have taken up strength training and competitive bodybuilding. Despite this growing participation, most of the available research and media attention has focused exclusively on male athletes. This work counters that.

    The researchers gathered the names of 9,447 female bodybuilders from the official competition records and from an unofficial online database. All the women had participated in at least one International Fitness and Bodybuilding Federation event between 2005 and 2020.

    The researchers then searched for reports of deaths of any of these named competitors in five different languages across different web sources, including official media reports, social media, bodybuilding forums and blogs. Any reported deaths were then cross-referenced using multiple sources and these reports were verified and analyzed by two clinicians to establish, as far as possible, the cause of death.

    The researchers found 32 deaths among the women, with an average age at death of around 42 years. Sudden cardiac death was the most common cause of death, accounting for 31% of deaths. The risk of sudden cardiac death was more than 20 times higher among professional bodybuilders, compared to amateurs.

    These results indicate that the risk of sudden cardiac death seems much higher for women bodybuilders compared to other professional athletes, although it is lower than the risk for male bodybuilders.

    The researchers acknowledge that the study is based on a web-based search strategy, which could have influenced their findings. For example, some deaths, especially among less-known athletes, may have gone unreported. They also found that autopsy data were available for only a small proportion of cases, meaning that sudden deaths had to be classified based on clinical interpretation rather than confirmed forensic findings.

    Mortality in female bodybuilding athletes, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf789

  • Dr. Krishna Kumari Challa

    Blood test for more than 50 cancers brings 'exciting' results

    A blood test that screens for more than 50 cancers is correct in 62% of cases where it thinks people may have the disease, a study has found.

    The Galleri test, which can be given annually and is undergoing trial in the U.K.'s health system, looks for the "fingerprint" of dozens of deadly cancers, often picking up signs before symptoms even appear.

    It works by identifying DNA in the bloodstream that has been shed by cancer cells, giving the earliest signs somebody may have the disease.

    Now, a key U.S. trial on the test has shown that Galleri is highly accurate in ruling out cancer in people without the disease, while also picking up cancer cases at an early stage, when the disease is more treatable.

    Of those people found to have a "cancer signal" detected in their blood, 61.6% went on to be diagnosed with cancer, the findings of the Pathfinder 2 study showed.

    And in 92% of cases, the test could pinpoint in which organ or tissue the cancer arose, meaning time and money could be saved on other scans and other tests.

    More than half (53.5%) of the new cancers detected by Galleri in the study were the earliest stage I or II, while more than two-thirds (69.3%) were detected at stages I-III.

    Galleri, which has been dubbed the holy grail of cancer tests, also correctly ruled out cancer in 99.6% of people who did not have the disease.
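    The reported positive predictive value (61.6%) and specificity (99.6%) are connected through Bayes' rule. As an illustrative sketch (the prevalence and sensitivity below are hypothetical assumptions, not figures from the Pathfinder 2 trial), a low cancer prevalence combined with very high specificity can produce a predictive value in this range:

```python
# Illustrative only: prevalence and sensitivity are assumed, not trial data.
prevalence  = 0.014    # hypothetical: ~1.4% of screened people have cancer
sensitivity = 0.40     # hypothetical: fraction of cancers the test flags
specificity = 0.996    # reported: 99.6% of cancer-free people test negative

true_pos  = prevalence * sensitivity
false_pos = (1 - prevalence) * (1 - specificity)
ppv = true_pos / (true_pos + false_pos)   # P(cancer | positive signal)

print(f"PPV = {ppv:.1%}")   # 58.7%, in the ballpark of the reported 61.6%
```

    The point of the sketch is that even a specificity of 99.6% produces some false positives, and because most screened people do not have cancer, those false positives keep the predictive value of a positive signal well below 100%.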

    The findings are being presented at the European Society for Medical Oncology, or ESMO, Congress in Berlin.

    https://bmjopen.bmj.com/content/15/5/e086648

  • Dr. Krishna Kumari Challa

    Early life sugar restriction linked to lasting heart benefits in adulthood

    Restricted sugar intake during early life is linked to lower risks of several heart conditions in adulthood, including heart attack, heart failure, and stroke, finds a study published by The BMJ using data from the end of UK sugar rationing in 1953.

    The greatest protection against the risk of developing heart problems—and the longest delay in disease onset—was seen in people whose sugar intake was restricted from conception (in utero) to around 2 years of age.

    Evidence suggests that the first 1,000 days of life (from conception to around 2 years of age) is a period when diet can have lasting health effects and leading health organizations recommend avoiding sugary drinks and ultra-processed foods (which often contain high amounts of sugar) as babies and toddlers are introduced to solids.

    Researchers therefore wanted to examine whether restricting sugar during this time is associated with a reduced risk of cardiovascular outcomes in adulthood.

    Using the end of UK sugar rationing in September 1953 as a natural experiment, they drew on data from 63,433 UK Biobank participants (average age 55 years) born between October 1951 and March 1956 with no history of heart disease.

    In total, the study included 40,063 participants exposed to sugar rationing and 23,370 who were not.

    Linked health records were then used to track rates of cardiovascular disease (CVD), heart attack, heart failure, irregular heart rhythm (atrial fibrillation), stroke, and cardiovascular death, adjusting for a range of genetic, environmental, and lifestyle factors.

    An external control group of non-UK-born adults who did not experience sugar rationing or similar policy changes around 1953 was also assessed for more reliable comparisons.

    The results show that longer exposure to sugar rationing was associated with progressively lower cardiovascular risks in adulthood, partly due to reduced risks of diabetes and high blood pressure.

    Compared with people never exposed to rationing, those exposed in utero plus one to two years had a 20% reduced risk of CVD, as well as reduced risks of heart attack (25%), heart failure (26%), atrial fibrillation (24%), stroke (31%), and cardiovascular death (27%).

    Part 1

  • Dr. Krishna Kumari Challa

    People exposed to rationing in utero and during early life also showed progressively longer delays (up to two and a half years) in the age of onset of cardiovascular outcomes compared with those not exposed to rationing.

    Sugar rationing was also associated with small yet meaningful increases in healthy heart function compared with those never rationed.

    The authors point out that during the rationing period, sugar allowances for everyone, including pregnant women and children, were limited to under 40 g per day—and no added sugars were permitted for infants under 2 years old—restrictions consistent with modern dietary recommendations.

    Reminder: This is just an observational study, so no firm conclusions can be drawn about cause and effect.

    Exposure to sugar rationing in first 1000 days after conception and long term cardiovascular outcomes: natural experiment study, The BMJ (2025). DOI: 10.1136/bmj-2024-083890

    Part 2

  • Dr. Krishna Kumari Challa

    Why the universe exists

    In 1867, Lord Kelvin imagined atoms as knots in the aether. The idea was soon disproven. Atoms turned out to be something else entirely. But his discarded vision may yet hold the key to why the universe exists.

    Now, for the first time, physicists have shown that knots can arise in a realistic particle physics framework, one that also tackles deep puzzles such as neutrino masses, dark matter, and the strong CP problem.

    Their findings, in Physical Review Letters, suggest these "cosmic knots" could have formed and briefly dominated in the turbulent newborn universe, collapsing in ways that favored matter over antimatter and leaving behind a unique hum in spacetime that future detectors could listen for—a rarity for a physics mystery that's notoriously hard to probe.

    This study addresses one of the most fundamental mysteries in physics: why our universe is made of matter and not antimatter.

    This question is important because it touches directly on why stars, galaxies, and we ourselves exist at all.

    The universe's missing antimatter

    The Big Bang should have produced equal amounts of matter and antimatter, each particle destroying its twin until only radiation remained. Yet the universe is overwhelmingly made of matter, with almost no antimatter in sight. Calculations show that everything we see today, from atoms to galaxies, exists because just one extra particle of matter survived for every billion matter–antimatter pairs.

    The Standard Model of particle physics, despite its extraordinary success, cannot account for that discrepancy. Its predictions fall many orders of magnitude short. Explaining the origin of that tiny excess of matter, known as baryogenesis, is one of physics' greatest unsolved puzzles.

    In the present study, by combining a gauged baryon number minus lepton number (B–L) symmetry with the Peccei–Quinn (PQ) symmetry, the team showed that knots could naturally form in the early universe and generate the observed surplus.

    These two long-studied extensions of the Standard Model patch some of its most puzzling gaps. The PQ symmetry solves the strong CP problem, the conundrum of why experiments don't detect the tiny electric dipole moment that theory predicts for the neutron, and in the process, introduces the axion, a leading dark matter candidate. Meanwhile, the B–L symmetry explains why neutrinos, ghostlike particles that can slip through entire planets unnoticed, have mass.

    Minoru Eto et al, Tying Knots in Particle Physics, Physical Review Letters (2025). DOI: 10.1103/s3vd-brsn

  • Dr. Krishna Kumari Challa

    Chemists discover clean and green way to recycle Teflon

    New research demonstrates a simple, eco-friendly method to break down Teflon—one of the world's most durable plastics—into useful chemical building blocks.

    Scientists have developed a clean and energy-efficient way to recycle Teflon (PTFE), a material best known for its use in non-stick coatings and other applications that demand high chemical and thermal stability.

    The researchers discovered that Teflon waste can be broken down and repurposed using only sodium metal and mechanical energy—movement by shaking—at room temperature and without toxic solvents.

    Publishing their findings in the Journal of the American Chemical Society, researchers reveal a low-energy, waste-free alternative to conventional fluorine recycling.

    The process they discovered breaks the strong carbon–fluorine bonds in Teflon, converting it into sodium fluoride, which is used in fluoride toothpastes and added to drinking water.

    Polytetrafluoroethylene (PTFE), best known by the brand name Teflon, is prized for its resistance to heat and chemicals, making it ideal for cookware, electronics, and laboratory equipment, but those same properties make it almost impossible to recycle.

    When burned or incinerated, PTFE releases persistent pollutants known as "forever chemicals" (PFAS), which remain in the environment for decades. Traditional disposal methods therefore raise major environmental and health concerns.

    The research team tackled this challenge using mechanochemistry—a green approach that drives chemical reactions by applying mechanical energy instead of heat.

    Inside a sealed steel container known as a ball mill, sodium metal fragments are ground with Teflon, causing them to react at room temperature. The process breaks the strong carbon–fluorine bonds, converting the polymer into harmless carbon and sodium fluoride, a stable inorganic salt.

    The researchers then showed that the sodium fluoride recovered in this way can be used directly, without purification, to create other valuable fluorine-containing molecules. These include compounds used in pharmaceuticals, diagnostics, and other fine chemicals.
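    For a rough sense of scale, the overall reaction can be written as (C2F4)n + 4n Na → 4n NaF + 2n C. This stoichiometry is our illustration, not a figure from the paper; under that assumption, the mass balance per kilogram of PTFE works out as:

```python
# Rough mass balance for the assumed reaction (C2F4)n + 4n Na -> 4n NaF + 2n C.
M_C, M_F, M_Na = 12.011, 18.998, 22.990    # atomic masses, g/mol
M_unit = 2 * M_C + 4 * M_F                 # PTFE repeat unit C2F4, ~100 g/mol
M_NaF  = M_Na + M_F                        # sodium fluoride, ~42 g/mol

ptfe_kg = 1.0
mol_units = ptfe_kg * 1000 / M_unit        # moles of C2F4 repeat units
na_kg  = mol_units * 4 * M_Na / 1000       # sodium metal consumed
naf_kg = mol_units * 4 * M_NaF / 1000      # sodium fluoride produced

print(f"Na in: {na_kg:.2f} kg, NaF out: {naf_kg:.2f} kg")
```

    Because fluorine makes up three quarters of PTFE's mass, every kilogram of polymer would yield well over a kilogram of sodium fluoride under this assumed stoichiometry.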

    A Reductive Mechanochemical Approach Enabling Direct Upcycling of Fluoride from Polytetrafluoroethylene (PTFE) into Fine Chemicals, Journal of the American Chemical Society (2025). DOI: 10.1021/jacs.5c14052

  • Dr. Krishna Kumari Challa

    Dangerous E. coli strain blocks gut's defense mechanism against spreading infection

    When harmful bacteria that cause food poisoning, such as E. coli, invade through the digestive tract, gut cells usually fight back by pushing infected cells out of the body to stop the infection from spreading.

    In a new study published today in Nature, scientists discovered that a dangerous strain of E. coli—known for causing bloody diarrhea—can block this gut defense, allowing the bacteria to spread more easily.

    The bacteria inject a protein called NleL into gut cells, which breaks down the key enzymes ROCK1 and ROCK2 that are needed for infected cells to be expelled. Without this process, infected cells can't leave quickly, and the bacteria spread more easily.

    Usually, when harmful bacteria invade the gut, the body fights back quickly. The first line of defense is the intestinal lining—made up of tightly packed cells that absorb nutrients and keep bacteria out of the bloodstream. If one of these cells gets infected, it sacrifices itself by pushing itself out of the gut lining and into the intestines to be flushed. This helps prevent the bacteria from spreading.

    This study shows that pathogenic bacteria can block infected cells from being pushed out.

    It's a completely different strategy from what we've seen before. Some bacteria try to hide from being detected, but this one actually stops the cell's escape route.

    This discovery could pave the way for new treatments that target how bacteria cause disease, rather than killing the bacteria outright, like antibiotics do.

    Vishva Dixit, Enteropathogenic bacteria evade ROCK-driven epithelial cell extrusion, Nature (2025). DOI: 10.1038/s41586-025-09645-0. www.nature.com/articles/s41586-025-09645-0

  • Dr. Krishna Kumari Challa

    Snakes' biting styles revealed in fine detail for the first time

    Few actions in nature inspire more fear and fascination than snake bites. Venomous snakes have to move fast to sink their fangs into their prey before the victim flinches, a window that may be as little as 60 ms when hunting rodents.

    Until recently, video technology was not sufficiently sophisticated to capture the deathly maneuvers in high definition, but recent improvements have made this possible, so researchers decided to get to the heart of how venomous viper, elapid and colubrid snakes sink their fangs into their dinner.

    Publishing their research in the Journal of Experimental Biology, the researchers reveal how vipers sink their fangs into their victims before walking them into position to inject venom. Elapids squeeze venom into their victims by biting repeatedly. And colubrids sweep their jaws from side to side to tear a gash in their victim and deliver maximum venom.

    The researchers tempted 36 species of snake—from western diamondback rattlesnakes (Crotalus atrox) and west African carpet vipers (Echis ocellatus) to the rough-scaled death adder (Acanthophis rugosus)—to lunge at a cylinder of warm, muscle-like medical gel resembling a small animal, recording the encounters with two cameras at 1000 frames/s to recreate the lightning-fast maneuvers in 3D.

    Part 1

  • Dr. Krishna Kumari Challa

    After capturing more than 100 snake strikes in minute detail, the team saw the vipers embed their fangs in the fake prey within 100 ms of launching a smooth strike—with the blunt-nosed viper (Macrovipera lebetina) accelerating at up to 710 m/s² and landing its bite within 22 ms; the elapid snakes bit their victims as quickly as the vipers.

    In addition, the vipers moved the fastest as they struck, with Bothrops asper—sometimes known as the ultimate pit viper—reaching speeds of over 4.5 m/s after hitting accelerations of more than 370 m/s², although the fastest elapid—the rough-scaled death adder—only reached speeds of 2.5 m/s.
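    Assuming roughly constant acceleration (a simplification we make here; real strikes are not uniformly accelerated), the quoted peaks imply strike timescales of only tens of milliseconds and strike distances of a few centimeters:

```python
# Constant-acceleration sketch; real strikes are not uniformly accelerated.
def time_to_speed_ms(v, a):
    """Time in ms to reach speed v (m/s) at constant acceleration a (m/s^2)."""
    return v / a * 1000

def distance_m(v, a):
    """Distance in m covered while accelerating from rest to speed v."""
    return v ** 2 / (2 * a)

# Bothrops asper: >4.5 m/s peak speed after accelerations of >370 m/s^2
print(f"{time_to_speed_ms(4.5, 370):.0f} ms")   # ~12 ms to peak speed
print(f"{distance_m(4.5, 370) * 100:.1f} cm")   # ~2.7 cm covered in that time
```

    Since both figures are reported as lower bounds, the real timescales and distances will differ, but the sketch shows why prey reaction windows of tens of milliseconds are so hard to beat.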

    Focusing on the vipers' fangs, the team saw the needle-like teeth sink into the fake prey, but if the viper wasn't happy with the position of a fang, it pulled it out to reinsert it at a better angle, effectively walking the fang forward. Only when the fangs were comfortably in place did the vipers close their jaws and inject venom into their catch.

    Part 2

  • Dr. Krishna Kumari Challa

    In contrast, the elapid snakes, such as the Cape coral cobra (Aspidelaps lubricus) and the forest cobra (Naja melanoleuca), used a stealthier strategy, creeping closer to their victim before lunging and biting repeatedly as their jaw muscles tensed to squeeze the venom into their dinner.

    The colubrid snakes, with fangs further back in their mouths, lunge over the greatest distances before clamping their jaws around their meal, sweeping their jaws from side-to-side to tear a crescent-shaped gash in the victim to deliver the maximum dose of venom. And on one occasion, a blunt-nosed viper misjudged the distance to its prey, hitting the right fang and breaking it off. But the team suspects that this occurs more than you'd think, with fangs turning up in snake scats after being swallowed.

    Venomous snakes use dramatically different strategies to deliver their deadly bites. Vipers and elapids strike elegantly before victims are even aware of their presence and colubrid bites inflict the maximum damage. These creatures don't pull any punches when they mean business.

    Cleuren, S. G. C., et al. Kinematics of feeding strikes in venomous snakes, Journal of Experimental Biology (2025). DOI: 10.1242/jeb.250347

    Part 3

  • Dr. Krishna Kumari Challa

    Peatlands' 'huge reservoir' of carbon at risk of release, researchers warn

    Peatlands make up just 3% of Earth's land surface but store more than 30% of the world's soil carbon, preserving organic matter and sequestering its carbon for tens of thousands of years. A new study sounds the alarm that an extreme drought event could quadruple peatland carbon loss in a warming climate.

    In the study, published in Science, researchers find that, under conditions that mimic a future climate (with warmer temperatures and elevated carbon dioxide), extreme drought increases the release of carbon from peatlands nearly threefold. This means that droughts under future climate conditions could turn a valuable carbon sink into a carbon source, erasing between 90 and 250 years of carbon stores in a matter of months.

    As temperatures increase, drought events become more frequent and severe, making peatlands more vulnerable than before. These extreme drought events can wipe out hundreds of years of accumulated carbon, so this has a huge implication.

    The researchers found that the lowered water table during drought took longer to recover at higher temperatures and elevated carbon dioxide levels, which led to more carbon release.

    Quan Quan et al, Drought-induced peatland carbon loss exacerbated by elevated CO2 and warming, Science (2025). DOI: 10.1126/science.adv7104. www.science.org/doi/10.1126/science.adv7104

  • Dr. Krishna Kumari Challa

    Hitchhiking DNA picked up by a gene may save a species from extinction

    An international research team  has solved a genetic mystery and revealed a previously unknown way that DNA can control what cells do.

    Published in Science, the study reveals that in the roundworm C. elegans, vital RNA needed to keep the ends of chromosomes intact does not have its own gene. Instead, it hitchhikes inside another one. DNA hitchhiking could be a common strategy in the animal kingdom, and has implications for anti-aging therapies and regenerative medicine in humans.

    Telomeres are DNA caps that protect the ends of chromosomes, much like the plastic tips of shoelaces. As we age, the cells of our bodies—called somatic cells—divide when we need new tissue, and every time that happens the telomeres lose some of their DNA.

    Some signs of aging are related to this process. For example, skin cells with shorter telomeres make less collagen and skin becomes wrinkled. When they are too short, cells self-destruct.

    Sperm and egg precursor cells—collectively called germ cells—are an exception to this rule. When they divide, an enzyme called telomerase adds replacement DNA to the ends of shortened telomeres. Because of this, telomere length doesn't get shorter with each generation, and species do not become extinct.

    Part 1

  • Dr. Krishna Kumari Challa

    Telomerase contains an RNA template that is used to make the replacement DNA. In humans and other mammals, this RNA comes from the TERC gene. C. elegans has working telomerase, but it doesn't seem to have a TERC gene.

    This mystery has stumped scientists for more than 20 years, and some have assumed that the gene was lost during evolution. In their study, the team at RIKEN BDR discovered how C. elegans can exist without a standalone TERC gene.

    Because telomerase levels are normally very low, the researchers genetically engineered C. elegans to overproduce the telomerase protein, which made it possible to collect large amounts of the whole telomerase complex, including the RNA template.

    They then used the collected template RNA to search the genome for matching DNA. Unlike in mammals, where the telomerase RNA has its own gene, in C. elegans they found it inside another gene's intron.

    Usually, the instructions in DNA within genes are used to build proteins. But some parts of genes, called introns, are not used to build proteins and are usually removed and discarded once the gene's protein is made.

    It was surprising to find that the key RNA—named terc-1—was hidden inside an intron of the gene called nmy-2, which is expressed only in germ cells. Indeed, the discovery that the essential telomerase RNA was hidden within an intron was completely unexpected.

    Experiments showed that in C. elegans lacking terc-1, telomeres became shorter each generation, and within 15 generations the animals became extinct. Inserting terc-1 into introns of other genes that are expressed in germ cells created roundworms that had normal telomeres and did not become extinct.

    In contrast, when terc-1 was inserted into introns of genes that are active only in somatic cells, the animals did become extinct. Thus, by hitching a ride inside genes activated in germ cells, terc-1 is produced where it is needed—the germ cells. There, it helps ensure that future generations do not inherit shortened telomeres, supporting the survival of the species.

    Beyond its evolutionary significance, this discovery will help us better understand how telomerase is regulated in healthy cells and could transform approaches to aging, fertility, and regenerative medicine.

    Yutaka Takeda et al, Nematode telomerase RNA hitchhikes on introns of germline–up-regulated genes, Science (2025). DOI: 10.1126/science.ads7778. www.science.org/doi/10.1126/science.ads7778

    Part 2

  • Dr. Krishna Kumari Challa

    Tigers in trouble as Malaysian big cat numbers dwindle

    Malaysia's national animal is in trouble.

    Poaching, loss of prey and shrinking habitat have slashed the population from 3,000 in the 1950s to fewer than 150 roaming free today, according to official estimates.

    The government said last month it was ramping up efforts to combat wildlife crime, introducing AI-enabled camera traps and methods to detect smuggling at airports.

    But experts and officials admit that resources fall far short of what is needed to protect the country's famed big cat, listed as critically endangered.

    The next 10 years will decide whether we can bring back the roar of the Malayan tiger.

    Source: News agencies

  • Dr. Krishna Kumari Challa

    Not all gas is bad: Hydrogen gas found to play key role in supporting gut health

     Scientists have revealed how hydrogen is made and used in the human gut. Though infamous for making flatulence ignite, hydrogen also has a positive role supporting gut health.

    In a study published in Nature Microbiology, researchers  analyzed how microbes control hydrogen levels in the gut.

    Hydrogen gas is naturally produced in the gut when bacteria ferment undigested carbohydrates from our diets. Some of this gas is exhaled, much is recycled by other gut bacteria, and the rest exits the body as flatulence.

    The results revealed hydrogen had an even bigger role in gut function than previously thought.

    Most people release about a liter of gas per day and half of that is hydrogen. But hydrogen is more than just the gas behind flatulence—it's a hidden driver of gut health.

    Gas production in the gut is a normal process. Hydrogen is made in large amounts when gut bacteria break down food and is then used by other microbes for energy.

    The study shows hydrogen shapes the gut microbiome in surprising and varied ways. It helps some beneficial bacteria thrive in the gut and keeps digestion going.

    However, excessive hydrogen production can signal gut problems. Abnormal hydrogen levels are associated with infections, digestive disorders, and even cancer, and are often measured in breath tests to assess gut health.

    They also  saw signs that hydrogen production was disrupted in people with gut disorders, but it's unclear if this is a cause or consequence of disease.

    The researchers' work was focused on understanding the fundamental role of hydrogen in gut function, rather than improving diagnostics or developing therapies.

    The study found that a specific enzyme called Group B [FeFe]-hydrogenase was mainly responsible for making hydrogen in the gut. This enzyme is found in many gut bacteria and is very active.

    The researchers studied bacteria from stool samples and gut tissue and found that this enzyme helps bacteria grow and produce hydrogen, especially in the main health-associated bacterial groups. They also discovered that the enzyme works through a specific chemical reaction involving iron and another protein called ferredoxin.

    As an example, healthy people have a lot of these enzymes in their gut, but people with Crohn's disease have fewer of them and more of the other types of hydrogen-producing enzymes.

    The researchers hope their discovery will highlight the need to expand fundamental knowledge of how our gut works so it can be used to design new treatments for gastrointestinal issues.

     Caitlin Welsh et al, A widespread hydrogenase supports fermentative growth of gut bacteria in healthy people, Nature Microbiology (2025). DOI: 10.1038/s41564-025-02154-w

  • Dr. Krishna Kumari Challa

    Scientists find ways to boost memory in aging brains

    Memory loss may not simply be a symptom of getting older. New research shows that it's tied to specific molecular changes in the brain, and that adjusting those processes can improve memory.

    In two complementary studies, researchers used gene-editing tools to target those age-related changes to improve memory performance in older subjects. The work was conducted on rats, a standard model for studying how memory changes with age.

    Memory loss affects more than a third of people over 70, and it's a major risk factor for Alzheimer's disease. 

    This work shows that memory decline is linked to specific molecular changes that can be targeted and studied. If we can understand what's driving it at the molecular level, we can start to understand what goes wrong in dementia and eventually use that knowledge to guide new approaches to treatment.

    In the first study, published in the journal Neuroscience, one research team examined a process called K63 polyubiquitination. This process acts as a molecular tagging system that tells proteins inside the brain how to behave. When the system functions normally, it helps brain cells communicate and form memories.

    They found that aging disrupts K63 polyubiquitination in two distinct areas of the brain. In the hippocampus, which helps form and retrieve memories, levels of K63 polyubiquitination increase with age. Using the CRISPR-dCas13 RNA editing system to reduce these levels, the researchers were able to improve memory in older rats.

    In the amygdala, which is important for emotional memory, the researchers noted that K63 polyubiquitination declines with age. By reducing it even further, they were able to boost memory in older rats.

    Together, these findings reveal the important functions of K63 polyubiquitination in the brain's aging process. In both regions, adjusting this one molecular process helped improve memory.

    Part 1

  • Dr. Krishna Kumari Challa

    A second study, published in Brain Research Bulletin, focused on IGF2, a growth-factor gene that supports memory formation. As the brain ages, IGF2 activity drops as the gene becomes chemically silenced in the hippocampus.

    IGF2 is one of a small number of genes in our DNA that are imprinted, which means the gene is expressed from only one parental copy. When that single copy starts to shut down with age, you lose its benefit.

    The researchers found that this silencing happens through DNA methylation, a natural process in which chemical tags accumulate on the gene and switch it off. Using a precise gene-editing tool, CRISPR-dCas9, they removed those tags and reactivated the gene. The result was better memory in older rats.

    Together, the two studies show that memory loss is not caused by a single molecule or pathway and that multiple molecular systems likely contribute to how the brain ages.

     Yeeun Bae et al, Age-related dysregulation of proteasome-independent K63 polyubiquitination in the hippocampus and amygdala, Neuroscience (2025). DOI: 10.1016/j.neuroscience.2025.06.032

    Shannon Kincaid et al, Increased DNA methylation of Igf2 in the male hippocampus regulates age-related deficits in synaptic plasticity and memory, Brain Research Bulletin (2025). DOI: 10.1016/j.brainresbull.2025.111509

    Part 2