Science Simplified!

                       JAI VIGNAN

All about Science - to remove misconceptions and encourage scientific temper

Communicating science to the common people

'To make them see the world differently through the beautiful lens of science'

  • Dr. Krishna Kumari Challa

    In a second study, the working group also identified the changes in the brain that are associated with criminal behavior in frontotemporal dementia. This study was published in Human Brain Mapping.

    Persons exhibiting criminal behavior showed greater atrophy in the temporal lobe, indicating that criminal behavior might be caused by so-called disinhibition, i.e., the loss of normal restraints or inhibitions, leading to a reduced ability to regulate one's behavior, impulses, and emotions.

    Disinhibition can manifest as acting impulsively, without thinking through the consequences, and behaving in ways that are inappropriate for the situation.

    One has to prevent further stigmatization of persons with dementia. Of note, most offenses committed were minor (indecent behavior, traffic violations, theft, damage to property), though physical violence or aggression also occurred.

    Hence, sensitivity to this topic as a possible early sign of dementia, together with the earliest possible diagnosis and treatment, is of the utmost importance. Besides early diagnosis and treatment of persons affected, the researchers say, adaptations of the legal system should be discussed, such as increasing awareness of offenses due to these diseases and taking the diseases into account in penalties and in jails.

     Matthias L. Schroeter et al, Criminal minds in dementia: A systematic review and quantitative meta-analysis, Translational Psychiatry (2025). DOI: 10.1038/s41398-025-03523-z

    Karsten Mueller et al, Criminal Behavior in Frontotemporal Dementia: A Multimodal MRI Study, Human Brain Mapping (2025). DOI: 10.1002/hbm.70308

    Part 2

  • Dr. Krishna Kumari Challa

    Depression linked to presence of immune cells in the brain's protective layer

    Immune cells released from bone marrow in the skull in response to chronic stress and adversity could play a key role in symptoms of depression and anxiety, say researchers.

    The discovery—found in a study in mice—sheds light on the role that inflammation can play in mood disorders and could help in the search for new treatments, in particular for those individuals for whom current treatments are ineffective.

    Around 1 billion people will be diagnosed with a mood disorder such as depression or anxiety at some point in their life. While there may be many underlying causes, chronic inflammation—when the body's immune system stays active for a long time, even when there is no infection or injury to fight—has been linked to depression. This suggests that the immune system may play an important role in the development of mood disorders.

    Previous studies have highlighted how high levels of an immune cell known as a neutrophil, a type of white blood cell, are linked to the severity of depression. But how neutrophils contribute to symptoms of depression is currently unclear.

    In research published in Nature Communications, a team of scientists tested the hypothesis that chronic stress can lead to the release of neutrophils from bone marrow in the skull. These cells then collect in the meninges—membranes that cover and protect your brain and spinal cord—and contribute to symptoms of depression.

    As it is not possible to test this hypothesis in humans, the team used mice exposed to chronic social stress. In this experiment, an 'intruder' mouse is introduced into the home cage of an aggressive resident mouse. The two have brief daily physical interactions and can otherwise see, smell, and hear each other.

    The researchers found that prolonged exposure to this stressful environment led to a noticeable increase in levels of neutrophils in the meninges, and that this was linked to signs of depressive behavior in the mice. Even after the stress ended, the neutrophils lasted longer in the meninges than they did in the blood.

    Analysis confirmed the researchers' hypothesis that the meningeal neutrophils—which appeared subtly different from those found in the blood—originated in the skull.

    Further analysis suggested that long-term stress triggered a type of immune system 'alarm warning' known as type I interferon signaling in the neutrophils. Blocking this pathway—in effect, switching off the alarm—reduced the number of neutrophils in the meninges and improved behavior in the depressed mice.


    --

    Part 1

  • Dr. Krishna Kumari Challa

    The reason why there are high levels of neutrophils in the meninges is unclear. One explanation could be that they are recruited by microglia, a type of immune cell unique to the brain.

    Another possible explanation is that chronic stress may cause microhemorrhages, tiny leaks in brain blood vessels, and that neutrophils—the body's 'first responders'—arrive to fix the damage and prevent any further damage. These neutrophils then become more rigid, possibly getting stuck in brain capillaries and causing further inflammation in the brain.

    These new findings show that these 'first responder' immune cells leave the skull bone marrow and travel to the brain, where they can influence mood and behavior.

    Stacey L. Kigar et al, Chronic social defeat stress induces meningeal neutrophilia via type I interferon signaling in male mice, Nature Communications (2025). DOI: 10.1038/s41467-025-62840-5

    Part 2

  • Dr. Krishna Kumari Challa

    Hyperactive blood platelets linked to heart attacks despite standard drug therapy

    Scientists have discovered a particularly active subgroup of blood platelets that may cause heart attacks in people with coronary heart disease despite drug therapy. This discovery may open up new prospects for customized therapies. The research results are published in the European Heart Journal and were presented on August 31 at Europe's largest cardiology congress.

    Despite modern medication, many people with coronary heart disease continue to suffer heart attacks. A research team has now found a possible explanation for this and has also provided a very promising therapeutic approach.

    Their work focused on so-called "reticulated platelets": particularly young, RNA-rich and reactive thrombocytes that play a central role in the formation of blood clots in patients with coronary heart disease.

    The researchers were able to comprehensively characterize the biological mechanisms of these cells for the first time. This explains why these platelets remain overactive in many patients even under optimal therapy: these young platelets have a particularly large number of activating signaling pathways that make them more sensitive and reactive than mature platelets.

    The results come from a multidimensional analysis, using various methods, of the blood of over 90 patients with coronary heart disease. Among other things, the researchers found signaling pathways that can be used to specifically inhibit the activity of such blood cells, in particular two target structures called GPVI and PI3K. Initial laboratory experiments confirmed that inhibiting these signaling pathways can reduce platelet hyperactivity.

     Kilian Kirmes et al, Reticulated platelets in coronary artery disease: a multidimensional approach unveils prothrombotic signalling and novel therapeutic targets, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf694

  • Dr. Krishna Kumari Challa

    8,000 years of human activities have caused wild animals to shrink and domestic animals to grow

    Humans have caused wild animals to shrink and domestic animals to grow, according to a new study. 

    Researchers studied tens of thousands of animal bones from Mediterranean France covering the last 8,000 years to see how the size of both types of animals has changed over time.

    Scientists already know that human choices, such as selective breeding, influence the size of domestic animals, and that environmental factors also impact the size of both. However, little is known about how these two forces have influenced the size of wild and domestic animals over such a prolonged period. This latest research, published in the Proceedings of the National Academy of Sciences, fills a major gap in our knowledge.

    The scientists analyzed more than 225,000 bones from 311 archaeological sites in Mediterranean France. They took thousands of measurements of things like the length, width, and depth of bones and teeth from wild animals, such as foxes, rabbits and deer, as well as domestic ones, including goats, cattle, pigs, sheep and chickens.

    But the researchers didn't just focus on the bones. They also collected data on the climate, the types of plants growing in the area, the number of people living there and what they used the land for. And then, with some sophisticated statistical modeling, they were able to track key trends and drivers behind the change in animal size.

    The research team's findings reveal that for around 7,000 years, wild and domestic animals evolved along similar paths, growing and shrinking together in sync with their shared environment and human activity. However, all that changed around 1,000 years ago. Their body sizes began to diverge dramatically, especially during the Middle Ages.

    Domestic animals started to get much bigger as they were being actively bred for more meat and milk. At the same time, wild animals began to shrink in size as a direct result of human pressures, such as hunting and habitat loss. In other words, human activities replaced environmental factors as the main force shaping animal evolution.

     Mureau, Cyprien et al, 8,000 years of wild and domestic animal body size data reveal long-term synchrony and recent divergence due to intensified human impact, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2503428122

  • Dr. Krishna Kumari Challa

    World's oldest host-associated bacterial DNA found

    An international team led by researchers at the Center for Paleogenetics, has uncovered microbial DNA preserved in woolly and steppe mammoth remains dating back more than one million years. The analyses reveal some of the world's oldest microbial DNA ever recovered, as well as the identification of bacteria that possibly caused disease in mammoths. The findings are published in Cell.

    Researchers analyzed microbial DNA from 483 mammoth specimens, of which 440 were sequenced for the first time. Among them was a steppe mammoth that lived about 1.1 million years ago. Using advanced genomic and bioinformatic techniques, the team distinguished microbes that once lived alongside the mammoths from those that invaded their remains after death.

     Ancient Host-Associated Microbes obtained from Mammoth Remains, Cell (2025). DOI: 10.1016/j.cell.2025.08.003

  • Dr. Krishna Kumari Challa

    Genetic mechanism reveals how plants coordinate flowering with light and temperature conditions

    Plants may be stuck in one place, but the world around them is constantly changing. In order to grow and flower at the right time, plants must constantly collect information about their surroundings, measuring things like temperature, brightness, and length of day. Still, it's unclear how all this information gets combined to trigger specific behaviors.

    Scientists have discovered a genetic mechanism for how plants integrate light and temperature information to control their flowering.

    In a study published in Nature Communications, the researchers found an interaction between two genetic pathways that signals the presence of both blue light and low temperature. This genetic module helps plants fine-tune their flowering to the optimal environmental conditions.

    In one pathway, blue light activates the PHOT2 blue light receptor, with help from partner protein NPH3. In another pathway, low ambient temperature allows a transcription factor called CAMTA2 to boost the expression of a gene called EHB1. Importantly, EHB1 is known to interact with NPH3, placing NPH3 at the convergence point of the blue light and low temperature signals. This genetic architecture effectively works as a coincidence detector, linking the presence of blue light and low temperature to guide the switch to flowering.
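    The coincidence detector described above behaves like a logical AND: flowering is promoted only when both signals are present. Purely as a conceptual sketch of that logic (the 16 degree C cutoff and the boolean simplification are illustrative assumptions, not values from the study), it could be written as:

```python
def flowering_coincidence(blue_light: bool, temp_c: float) -> bool:
    """Toy model of the blue-light / low-temperature coincidence detector.

    PHOT2 (with partner NPH3) reports blue light; CAMTA2 boosts EHB1
    expression at low ambient temperature; EHB1 binding NPH3 means the
    switch fires only when BOTH inputs are present. The 16 degree C
    threshold is an illustrative assumption, not a value from the paper.
    """
    phot2_nph3_active = blue_light          # blue-light pathway
    camta2_ehb1_expressed = temp_c < 16.0   # low-temperature pathway
    return phot2_nph3_active and camta2_ehb1_expressed
```

    Only the combination of both signals (e.g. blue light at 10 degrees C) returns True; blue light alone at warm temperature, or cold alone in darkness, does not.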

    The study describes an important component of plant growth, reproduction, and information processing. The newly discovered genetic module allows plants to have fine control over their flowering in low temperatures. Understanding this system will now help scientists optimize crop growth under changing environmental conditions.  

    Adam Seluzicki et al, Genetic architecture of a light-temperature coincidence detector, Nature Communications (2025). DOI: 10.1038/s41467-025-62194-y

  • Dr. Krishna Kumari Challa

     Generative AI designs molecules that kill drug-resistant bacteria

    What if generative AI could design life-saving antibiotics, not just art and text? In a new Cell Biomaterials paper, researchers introduce AMP-Diffusion, a generative AI tool used to create tens of thousands of new antimicrobial peptides (AMPs)—short strings of amino acids, the building blocks of proteins—with bacteria-killing potential. In animal models, the most potent AMPs performed as well as FDA-approved drugs, without detectable adverse effects.

    While past work has shown that AI can successfully sort through mountains of data to identify promising antibiotic candidates, this study adds to a small but growing number of demonstrations that AI can invent antibiotic candidates from scratch.

    Using AMP-Diffusion, the researchers generated the amino-acid sequences for about 50,000 candidates. They then used AI to filter the results.

    After synthesizing the 46 most promising candidates, the researchers tested them in human cells and animal models. Treating skin infections in mice, two AMPs demonstrated efficacy on par with levofloxacin and polymyxin B, FDA-approved drugs used to treat antibiotic-resistant bacteria, without adverse effects.

    In the future, the researchers hope to refine AMP-Diffusion, giving it the capability to denoise with a more specific goal in mind, like treating a particular type of bacterial infection, among other features.

    Generative latent diffusion language modeling yields anti-infective synthetic peptides, Cell Biomaterials (2025). DOI: 10.1016/j.celbio.2025.100183

  • Dr. Krishna Kumari Challa

    How bacteria bounce back after antibiotics

    A study by researchers has uncovered how Escherichia coli (E. coli) persister bacteria survive antibiotics by protecting their genetic instructions.

    The work, published in Nature Microbiology, offers new hope for tackling chronic, recurring infections.

    Persister bacteria, which enter a dormant state to survive antibiotics that target active cells, are linked to over 20% of chronic infections and resist current treatments.

    Understanding their survival mechanisms could lead to new ways to combat recurring infections. This study utilized E. coli bacteria as a model and found that prolonged stress leads to the increased formation of aggresomes (membraneless droplets) and the enrichment of mRNA (molecules that carry instructions for making proteins) within them, which enhances the ability of E. coli to survive and recover from stress.

    Researchers used multiple approaches, including imaging, modeling, and transcriptomics, to show that prolonged stress leading to ATP (fuel for all living cells) depletion in Escherichia coli results in increased aggresome formation, their compaction, and enrichment of mRNA within aggresomes compared to the cytosol (the liquid inside of cells).

    Transcripts were longer in aggresomes than in the cytosol. Mass spectrometry showed exclusion of mRNA ribonuclease (an enzyme that breaks down RNA) from aggresomes, which was due to negative charge repulsion.

    Experiments with fluorescent reporters and disruption of aggresome formation showed that mRNA storage within aggresomes promoted translation and was associated with reduced lag phases during growth after stress removal. These findings suggest that mRNA storage within aggresomes confers an advantage for bacterial survival and recovery from stress.

    This breakthrough illuminates how persister cells survive and revive after antibiotic treatment. 

    By targeting aggresomes, new drugs could disrupt this protective mechanism, preventing bacteria from storing mRNA and making them more vulnerable to elimination, thus reducing the risk of infection relapse.

    Linsen Pei et al, Aggresomes protect mRNA under stress in Escherichia coli, Nature Microbiology (2025). DOI: 10.1038/s41564-025-02086-5

  • Dr. Krishna Kumari Challa

    Deforestation reduces rainfall by 74% and increases temperatures by 16% in Amazon during dry season, study finds

    Deforestation in the Brazilian Amazon is responsible for approximately 74.5% of the reduction in rainfall and 16.5% of the temperature increase in the biome during the dry season. For the first time, researchers have quantified the impact of vegetation loss and global climate change on the forest.

    The study provides fundamental results to guide effective mitigation and adaptation strategies.

    How climate change and deforestation interact in the transformation of the Amazon rainforest, Nature Communications (2025). DOI: 10.1038/s41467-025-63156-0

  • Dr. Krishna Kumari Challa

    Magnetic fields in infant universe may have been billions of times weaker than a fridge magnet

    The magnetic fields that formed in the very early stages of the universe may have been billions of times weaker than a small fridge magnet, with strengths comparable to magnetism generated by neurons in the human brain. Yet, despite such weakness, quantifiable traces of their existence still remain in the cosmic web, the visible cosmic structures connected throughout the universe.
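    As a rough plausibility check of the comparison above (the specific field strengths here are typical textbook values, not numbers from the study): a small fridge magnet is on the order of 5 mT, and neuronal magnetic fields measured by magnetoencephalography are on the order of a picotesla, so "billions of times weaker than a fridge magnet" is indeed the neuronal ballpark:

```python
FRIDGE_MAGNET_T = 5e-3    # ~5 mT, typical small fridge magnet (assumed value)
NEURONAL_FIELD_T = 1e-12  # ~1 pT, order measured by MEG (assumed value)

# A field five billion times weaker than the fridge magnet:
weakened = FRIDGE_MAGNET_T / 5e9   # 1e-12 T, i.e. 1 pT

# How does that compare to a neuronal field? (~same order of magnitude)
ratio_to_neuronal = weakened / NEURONAL_FIELD_T
print(f"{weakened:.1e} T, about {ratio_to_neuronal:.1f}x a neuronal field")
```

    Dividing by "billions" lands squarely in the picotesla range, consistent with the brain-magnetism comparison.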

    These conclusions emerge from a study using around a quarter of a million computer simulations.

    The research, recently published in Physical Review Letters, constrains both the likely and the maximum possible strengths of primordial magnetic fields. It also offers the possibility of refining our knowledge of the early universe and the formation of the first stars and galaxies.

     Mak Pavičević et al, Constraints on Primordial Magnetic Fields from the Lyman-α Forest, Physical Review Letters (2025). DOI: 10.1103/77rd-vkpz. On arXiv: DOI: 10.48550/arxiv.2501.06299

  • Dr. Krishna Kumari Challa

    Removing yellow stains from fabric with blue light

    Sweat and food stains can ruin your favorite clothes. But bleaching agents such as hydrogen peroxide or dry-cleaning solvents that remove stains aren't options for all fabrics, especially delicate ones. Now, researchers in ACS Sustainable Chemistry & Engineering report a simple way to remove yellow stains using a high-intensity blue LED light. They demonstrate the method's effectiveness at removing stains from orange juice, tomato juice and sweat-like substances on multiple fabrics, including silk.

    The method utilizes visible blue light in combination with ambient oxygen, which acts as the oxidizing agent to drive the photobleaching process. This approach avoids the use of harsh chemical oxidants typically required in conventional bleaching methods, making it inherently more sustainable.

    Yellow clothing stains are caused by squalene and oleic acid from skin oils and sweat, as well as natural pigments like beta carotene and lycopene, present in oranges, tomatoes and other foods. UV light is a potential stain-removing alternative to chemical oxidizers like bleach and hydrogen peroxide, but it can damage delicate fabrics.

    The same researchers previously determined that a high-intensity blue LED light could remove yellow color from aged resin polymers, and they wanted to see whether blue light could also break down yellow stains on fabric without causing damage.

    Initially, they exposed vials of beta carotene, lycopene and squalene to high-intensity blue LED light for three hours. All the samples lost color, and spectroscopic analyses indicated that oxygen in the air helped the photobleaching process by breaking bonds to produce colorless compounds.

    Next, the team applied squalene onto cotton fabric swatches. After heating the swatches to simulate aging, they treated the samples for 10 minutes, by soaking them in a hydrogen peroxide solution or exposing them to the blue LED or UV light. The blue light reduced the yellow stain substantially more than hydrogen peroxide or UV exposure. In fact, UV exposure generated some new yellow-colored compounds.
    Additional tests showed that the blue LED treatment lightened squalene stains on silk and polyester without damaging the fabrics. The method also reduced the color of other stain-causing substances, including aged oleic acid, orange juice and tomato juice, on cotton swatches.

    High-intensity blue LED light is a promising way to remove clothing stains, but the researchers say they want to do additional colorfastness and safety testing before commercializing a light system for home and industrial use.

    Tomohiro Sugahara et al, Environmentally Friendly Photobleaching Method Using Visible Light for Removing Natural Stains from Clothing, ACS Sustainable Chemistry & Engineering (2025). DOI: 10.1021/acssuschemeng.5c03907

  • Dr. Krishna Kumari Challa

    Scientists develop the world's first 6G chip, capable of 100 Gbps speeds

    Sixth generation, or 6G, wireless technology is one step closer to reality with news that researchers have unveiled the world's first "all-frequency" 6G chip. The chip is capable of delivering mobile internet speeds exceeding 100 gigabits per second (Gbps).

    6G technology is the successor to 5G and promises to bring about a massive leap in how we communicate. It will offer benefits such as ultra-high-speed connectivity, ultra-low latency and AI integration that can manage and optimize networks in real-time. To achieve this, 6G networks will need to operate across a range of frequencies, from standard microwaves to much higher frequency terahertz waves. Current 5G technology utilizes a limited set of radio frequencies, similar to those used in previous generations of wireless technologies.

    The new chip is no bigger than a thumbnail, measuring 11 millimeters by 1.7 millimeters. It operates across a wide frequency range, from 0.5 GHz to 115 GHz, a spectrum that traditionally requires nine separate radio systems to cover. While the development of a single all-frequency chip is a significant breakthrough, the technology is still in its early stages of development. Many experts expect that commercial 6G networks will begin to roll out around 2030.
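    To see why such a wide frequency range matters for 100 Gbps links, here is an illustrative back-of-envelope using the Shannon limit C = B log2(1 + SNR). The 20 dB link SNR is a hypothetical assumption for the sketch, not a figure from the paper:

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon channel capacity C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# How much spectrum does a 100 Gbps link need at an assumed 20 dB SNR?
target_bps = 100e9
snr_db = 20.0
bits_per_hz = math.log2(1 + 10 ** (snr_db / 10))  # ~6.66 bit/s/Hz
required_bw_hz = target_bps / bits_per_hz         # ~15 GHz of spectrum

print(f"~{required_bw_hz / 1e9:.1f} GHz of spectrum needed at the limit")
```

    Roughly 15 GHz of contiguous spectrum is needed under these assumptions, far more than the slices available at conventional 5G frequencies, which is why 6G designs reach up toward the terahertz bands.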

    Zihan Tao et al, Ultrabroadband on-chip photonics for full-spectrum wireless communications, Nature (2025). DOI: 10.1038/s41586-025-09451-8

  • Dr. Krishna Kumari Challa

    Mother's Germs May Influence Her Child's Brain Development

    Our bodies are colonized by a teeming, ever-changing mass of microbes that help power countless biological processes. Now, a new study has identified how these microorganisms get to work shaping the brain before birth.
    Researchers at Georgia State University studied newborn mice specifically bred in a germ-free environment to prevent any microbe colonization. Some of these mice were immediately placed with mothers with normal microbiota, which leads to microbes being transferred rapidly.

    That gave the study authors a way to pinpoint just how early microbes begin influencing the developing brain. Their focus was on the paraventricular nucleus (PVN), a region of the hypothalamus tied to stress and social behavior, already known to be partly influenced by microbe activity in mice later in life.
    At birth, a newborn body is colonized by microbes as it travels through the birth canal.
    Birth also coincides with important developmental events that shape the brain.
    Part 1
  • Dr. Krishna Kumari Challa

    When the germ-free mice were just a handful of days old, the researchers found fewer neurons in their PVN, even when microbes were introduced after birth. That suggests the changes caused by these microorganisms happen in the uterus during development.

    These neural modifications last, too: the researchers also found that the PVN was neuron-light even in adult mice, if they'd been raised to be germ-free. However, the cross-fostering experiment was not continued into adulthood (around eight weeks).

    The details of this relationship still need to be worked out and researched in greater detail, but the takeaway is that microbes – specifically the mix of microbes in the mother's gut – can play a notable role in the brain development of their offspring.
    Rather than shunning our microbes, we should recognize them as partners in early life development. They're helping build our brains from the very beginning, say the researchers.
    While this has only been shown in mouse models so far, there are enough biological similarities between mice and humans that there's a chance we're also shaped by our mother's microbes before we're born.

    One of the reasons this matters is because practices like Cesarean sections and the use of antibiotics around birth are known to disrupt certain types of microbe activity – which may in turn be affecting the health of newborns.

    https://www.sciencedirect.com/science/article/pii/S0018506X25000686...

    Part 2

    **

  • Dr. Krishna Kumari Challa

    Mutations driving evolution are informed by the genome, not random, study suggests

    A study published in the Proceedings of the National Academy of Sciences shows that an evolutionarily significant mutation in the human APOL1 gene arises not randomly but more frequently where it is needed to prevent disease. This fundamentally challenges the notion that evolution is driven by random mutations, and ties the results to a new theory that, for the first time, offers a concept for how mutations arise.

    The implications for biology, medicine, computer science, and perhaps even our understanding of the origin of life itself are potentially far-reaching.

    A random mutation is a genetic change whose chance of arising is unrelated to its usefulness. Only once these supposed accidents arise does natural selection vet them, sorting the beneficial from the harmful. For over a century, scientists have thought that a series of such accidents has built up over time, one by one, to create the diversity and splendor of life around us.

    However, it has never been possible to examine directly whether mutations in the DNA originate at random or not. Mutations are rare events relative to the genome's size, and technical limitations have prevented scientists from seeing the genome in enough detail to track individual mutations as they arise naturally.

    To overcome this, researchers developed a new ultra-accurate detection method and recently applied it to the famous HbS mutation, which protects from malaria but causes sickle-cell anemia in homozygotes.

    Results showed that the HbS mutation did not arise at random, but emerged more frequently exactly in the gene and population where it was needed. Now, they report the same nonrandom pattern in a second mutation of evolutionary significance.

    The new study examines the de novo origination of a mutation in the human APOL1 gene that protects against a form of trypanosomiasis, a disease that devastated central Africa in historical times and until recently has caused tens of thousands of deaths there per year, while increasing the risk of chronic kidney disease in people with two copies.

    If the APOL1 mutation arises by chance, it should arise at a similar rate in all populations, and only then spread under Trypanosoma pressure. However, if it is generated nonrandomly, it may actually arise more frequently where it is useful.

    Results supported the nonrandom pattern: the mutation arose much more frequently in sub-Saharan Africans, who have faced generations of endemic disease, compared to Europeans, who have not, and in the precise genomic location where it confers protection.
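    To make the "similar rate in all populations" null hypothesis concrete, here is a minimal sketch with made-up counts, not the study's data: assuming equal sequencing coverage, under the random-mutation null each observed de novo event is equally likely to come from either population, so the split of events can be checked with an exact binomial test built from the standard library:

```python
from math import comb

def binomial_tail_p(k_observed: int, n_total: int, p: float = 0.5) -> float:
    """P(X >= k_observed) for X ~ Binomial(n_total, p).

    Under the random-mutation null with equal coverage, each de novo
    event is a fair coin flip between the two populations (p = 0.5).
    """
    return sum(comb(n_total, k) * p**k * (1 - p)**(n_total - k)
               for k in range(k_observed, n_total + 1))

# Hypothetical counts (illustrative only): 9 de novo events observed in
# the population under pathogen pressure vs. 1 in the other.
p_value = binomial_tail_p(9, 10)   # 11/1024, about 0.011
```

    A one-sided p-value of about 0.011 would reject the equal-rate null for these toy counts; the actual study of course relies on its own measured rates and its own statistical methods.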

    Part 1

  • Dr. Krishna Kumari Challa

    The new findings challenge the notion of random mutation fundamentally.
    Historically, there have been two basic theories for how evolution happens— random mutation and natural selection, and Lamarckism—the idea that an individual directly senses its environment and somehow changes its genes to fit it. Lamarckism has been unable to explain evolution in general, so biologists have concluded that mutations must be random.
    This new theory moves away from both of these concepts, proposing instead that two inextricable forces underlie evolution. While the well-known external force of natural selection ensures fitness, a previously unrecognized internal force operates inside the organism, putting together genetic information that has accumulated over generations in useful ways.
    To illustrate, take fusion mutations, a type of mutation where two previously separate genes fuse to form a new gene. As for all mutations, it has been thought that fusions arise by accident: one day, a gene moves by error to another location and by chance fuses to another gene, once in a great while, leading to a useful adaptation. But researchers have recently shown that genes do not fuse at random.
    Instead, genes that have evolved to be used together repeatedly over generations are the ones that are more likely to get fused. Because the genome folds in 3D space, bringing genes that work together to the same place at the same time in the nucleus with their chromatin open, molecular mechanisms fuse these genes rather than others. An interaction involving complex regulatory information that has gradually evolved over generations leads to a mutation that simplifies and "hardwires" it into the genome.
    Part 2

  • Dr. Krishna Kumari Challa

    In the paper, they argue that fusions are a specific example of a more general and extensive internal force that applies across mutation types. Rather than local accidents arising at random locations in the genome disconnected from other genetic information, mutational processes put together multiple meaningful pieces of heritable information in many ways.

    Genes that evolved to interact tightly are more likely to be fused; single-letter RNA changes that evolved to occur repeatedly across generations via regulatory phenomena are more likely to be "hardwired" as point mutations into the DNA; genes that evolved to interact in incipient networks, each under its own regulation, are more likely to be invaded by the same transposable element that later becomes a master-switch of the network, streamlining regulation, and so on. Earlier mutations influence the origination of later ones, forming a vast network of influences over evolutionary time.

    Previous studies examined mutation rates as averages across genomic positions, masking the probabilities of individual mutations. But these new studies suggest that, at the scale of individual mutations, each mutation has its own probability, and the causes and consequences of mutation are related.
    At each generation, mutations arise based on the information that has accumulated in the genome up to that time point, and those that survive become a part of that internal information.
    This vast array of interconnected mutational activity gradually homes in, over the generations, on mutations relevant to the long-term pressures experienced, leading to long-term directed mutational responses to specific environmental pressures, such as the malaria-protective HbS and Trypanosoma-protective APOL1 mutations.
    New genetic information arises in the first place, they argue, as a consequence of the fact that mutations simplify genetic regulation, hardwiring evolved biological interactions into ready-made units in the genome. This internal force of natural simplification, together with the external force of natural selection, acts over evolutionary time like combined forces of parsimony and fit, generating co-optable elements that themselves have an inherent tendency to come together into new, emergent interactions.
    Co-optable elements are generated by simplification under performance pressure, and then engage in emergent interactions—the source of innovation is at the system level.
    Understood in the proper timescale, an individual mutation does not arise at random nor does it invent anything in and of itself.
    The potential depth of evolution from this new perspective can be seen by examining other networks. For example, the gene fusion mechanism—where genes repeatedly used together across evolutionary time are more likely to be fused together by mutation—echoes chunking, one of the most basic principles of cognition and learning in the brain, where pieces of information that repeatedly co-occur are eventually chunked into a single unit.

    Yet fusions are only one instance of a broader principle: diverse mutational processes respond to accumulated information in the genome, combining it over generations into streamlined instructions. This view recasts mutations not as isolated accidents, but as meaningful events in a larger, long-term process.

    Daniel Melamed et al, De novo rates of a Trypanosoma-resistant mutation in two human populations, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424538122

    Part 3

  • Dr. Krishna Kumari Challa

    New tool enables rapid, large-scale profiling of disease-linked RNA modifications

    Researchers have developed a powerful tool capable of scanning thousands of biological samples to detect transfer ribonucleic acid (tRNA) modifications—tiny chemical changes to RNA molecules that help control how cells grow, adapt to stress and respond to diseases such as cancer and antibiotic-resistant infections. The tool opens up new possibilities for science, health care and industry—from accelerating disease research and enabling more precise diagnostics to guiding the development of more effective treatments for these diseases.

    Cancer and infectious diseases are complicated health conditions in which cells are forced to function abnormally by mutations in their genetic material or by instructions from an invading microorganism. The SMART-led research team is among the world's leaders in understanding how the epitranscriptome—the over 170 different chemical modifications of all forms of RNA—controls growth of normal cells and how cells respond to stressful changes in the environment, such as loss of nutrients or exposure to toxic chemicals. The researchers are also studying how this system is corrupted in cancer or exploited by viruses, bacteria and parasites in infectious diseases.
    Current molecular methods used to study the expansive epitranscriptome and the thousands of different types of modified RNA are often slow, labor-intensive and costly, and involve hazardous chemicals, all of which limit research capacity and speed.

    To solve this problem, the SMART team developed a new tool that enables fast, automated profiling of tRNA modifications—molecular changes that regulate how cells survive, adapt to stress and respond to disease. This capability allows scientists to map cell regulatory networks, discover novel enzymes and link molecular patterns to disease mechanisms, paving the way for better drug discovery and development, and more accurate disease diagnostics.

    SMART's automated system was specially designed to profile tRNA modifications across thousands of samples rapidly and safely. Unlike traditional methods—which are costly, labor‑intensive and use toxic solvents such as phenol and chloroform—this tool integrates robotics to automate sample preparation and analysis, eliminating the need for hazardous chemical handling and reducing costs. This advancement increases safety, throughput and affordability, enabling routine large‑scale use in research and clinical labs.

    Jingjing Sun et al, tRNA modification profiling reveals epitranscriptome regulatory networks in Pseudomonas aeruginosa, Nucleic Acids Research (2025). DOI: 10.1093/nar/gkaf696

  • Dr. Krishna Kumari Challa

    Fruit fly research shows that mechanical forces drive evolutionary change

    A tissue fold known as the cephalic furrow, an evolutionary novelty that forms between the head and the trunk of fly embryos, plays a mechanical role in stabilizing embryonic tissues during the development of the fruit fly Drosophila melanogaster.

    Researchers have integrated computer simulations with their experiments and showed that the timing and position of cephalic furrow formation are crucial for its function, preventing mechanical instabilities in the embryonic tissues.

    The work appears in Nature.

    The increased mechanical instability caused by embryonic tissue movements may have contributed to the origin and evolution of the cephalic furrow genetic program. This shows that mechanical forces can shape the evolution of new developmental features.

    Mechanical forces shape tissues and organs during the development of an embryo through a process called morphogenesis. These forces cause tissues to push and pull on each other, providing essential information to cells and determining the shape of organs. Despite the importance of these forces, their role in the evolution of development is still not well understood.

    Animal embryos undergo tissue flows and folding processes, involving mechanical forces, that transform a single-layered blastula (a hollow sphere of cells) into a complex multi-layered structure known as the gastrula. During early gastrulation, some flies of the order Diptera form a tissue fold at the head-trunk boundary called the cephalic furrow. This fold is a specific feature of a subgroup of Diptera and is therefore an evolutionary novelty of flies.

    Part 1

  • Dr. Krishna Kumari Challa

    The cephalic furrow is especially interesting because it is a prominent embryonic invagination whose formation is controlled by genes, but that has no obvious function during development. The fold does not give rise to specific structures, and later in development, it simply unfolds, leaving no trace.
    With their experiments, the researchers show that the absence of the cephalic furrow leads to an increase in the mechanical instability of embryonic tissues and that the primary sources of mechanical stress are cell divisions and tissue movements typical of gastrulation. They demonstrate that the formation of the cephalic furrow absorbs these compressive stresses. Without a cephalic furrow, these stresses build up, and outward forces caused by cell divisions in the single-layered blastula cause mechanical instability and tissue buckling.

    This intriguing physical role gave the researchers the idea that the cephalic furrow may have evolved in response to the mechanical challenges of dipteran gastrulation, with mechanical instability acting as a potential selective pressure.
    Flies either feature a cephalic furrow or, if they lack one, display widespread out-of-plane division, meaning the cells divide downward to reduce the surface area. Both mechanisms act as mechanical sinks that prevent tissue collision and distortion.
    These findings uncover empirical evidence for how mechanical forces can influence the evolution of innovations in early development. The cephalic furrow may have evolved through genetic changes in response to the mechanical challenges of dipteran gastrulation. They show that mechanical forces are not just important for the development of the embryo but also for the evolution of its development.

    Patterned invagination prevents mechanical instability during gastrulation, Nature (2025). DOI: 10.1038/s41586-025-09480-3

    Bipasha Dey et al, Divergent evolutionary strategies pre-empt tissue collision in gastrulation. Nature (2025). DOI: 10.1038/s41586-025-09447-4

    Part 2

  • Dr. Krishna Kumari Challa

    Some sugar substitutes linked to faster cognitive decline

    Some sugar substitutes may come with unexpected consequences for long-term brain health, according to a study published in Neurology. The study examined seven low- and no-calorie sweeteners and found that people who consumed the highest amounts experienced faster declines in thinking and memory skills compared to those who consumed the lowest amounts.

    The link was even stronger in people with diabetes. While the study showed a link between the use of some artificial sweeteners and cognitive decline, it did not prove that they were a cause.

    The artificial sweeteners examined in the study were aspartame, saccharin, acesulfame-K, erythritol, xylitol, sorbitol and tagatose. These are mainly found in ultra-processed foods like flavored water, soda, energy drinks, yogurt and low-calorie desserts. Some are also used as standalone sweeteners.

    Low- and no-calorie sweeteners are often seen as a healthy alternative to sugar; however, the new findings suggest certain sweeteners may have negative effects on brain health over time.

    The study included 12,772 adults from across Brazil. The average age was 52, and participants were followed for an average of eight years.

    After adjusting for factors such as age, sex, high blood pressure and cardiovascular disease, researchers found that people who consumed the highest amount of sweeteners showed a decline in overall thinking and memory skills that was 62% faster than in those who consumed the lowest amount. This is the equivalent of about 1.6 years of aging. Those in the middle group had a decline that was 35% faster than the lowest group, equivalent to about 1.3 years of aging.

    When researchers broke the results down by age, they found that people under the age of 60 who consumed the highest amounts of sweeteners showed faster declines in verbal fluency and overall cognition when compared to those who consumed the lowest amounts. They did not find links in people over 60. They also found that the link to faster cognitive decline was stronger in participants with diabetes than in those without diabetes.

    When looking at individual sweeteners, consuming aspartame, saccharin, acesulfame-K, erythritol, sorbitol and xylitol was associated with a faster decline in overall cognition, particularly in memory.

    They found no link between the consumption of tagatose and cognitive decline.

    https://www.neurology.org/doi/10.1212/WNL.0000000000214023

  • Dr. Krishna Kumari Challa

    SeeMe detects hidden signs of consciousness in brain injury patients

    SeeMe, a computer vision tool tested by Stony Brook University researchers, was able to detect low-amplitude, voluntary facial movements in comatose acute brain injury patients days before clinicians could identify overt responses.

    There are ways to detect covert consciousness with EEG and fMRI, though these are not always available. Many acute brain injury patients appear unresponsive in early care, with signs that are so small, or so infrequent, that they are simply missed.

    In the study, "Computer vision detects covert voluntary facial movements in unresponsive brain injury patients," published in Communications Medicine, investigators designed SeeMe to quantify tiny facial movements in response to auditory commands with the objective of identifying early, stimulus-evoked behavior.

    A single-center prospective cohort included 37 comatose acute brain injury patients and 16 healthy volunteers, aged 18–85, enrolled at Stony Brook University Hospital. Patients had initial Glasgow Coma Scale scores ≤8 and no prior neurologically debilitating diagnoses.

    SeeMe tagged facial pores at ~0.2 mm resolution and tracked movement vectors while subjects heard three commands: open your eyes, stick out your tongue, and show me a smile.
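    The detection idea can be sketched in a few lines. This is an illustrative reconstruction, not the SeeMe code: it assumes per-frame coordinates of tracked facial points are available, and flags a trial when post-command displacement clearly exceeds the resting jitter. The threshold factor and all coordinates below are invented:

```python
import math

def mean_displacement(frame_a, frame_b):
    """Mean Euclidean displacement (in mm) between matched tracked points."""
    return sum(math.dist(p, q) for p, q in zip(frame_a, frame_b)) / len(frame_a)

def movement_evoked(baseline_frames, response_frames, factor=3.0):
    """Flag a trial as movement-positive when the largest frame-to-frame
    displacement after the command exceeds `factor` times the baseline
    jitter (the factor is an invented illustrative threshold)."""
    base = [mean_displacement(baseline_frames[i], baseline_frames[i + 1])
            for i in range(len(baseline_frames) - 1)]
    resp = [mean_displacement(response_frames[i], response_frames[i + 1])
            for i in range(len(response_frames) - 1)]
    baseline_jitter = sum(base) / len(base)
    return max(resp) > factor * baseline_jitter

# Synthetic example: 3 tracked facial points with ~0.02 mm jitter at rest,
# then a coherent 0.5 mm shift after the auditory command.
still = [[(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)],
         [(0.02, 0.0), (1.0, 0.03), (0.0, 1.01)],
         [(0.0, 0.01), (1.02, 0.0), (0.01, 1.0)]]
moved = [still[0],
         [(0.5, 0.0), (1.5, 0.0), (0.5, 1.0)]]

print(movement_evoked(still, moved))  # a coherent shift stands out from jitter
```

    The point of such a scheme is that a tiny but coherent, stimulus-locked movement can be separated from random resting jitter, which is hard to do by eye at the bedside.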

    Results indicate earlier and broader detection with SeeMe. Eye-opening was detected on average 9.1 (± 5.5) days after injury by SeeMe versus 13.2 (± 11.4) days by clinical examination, yielding a 4.1-day lead.

    SeeMe identified eye-opening in 30 of 36 patients (85.7%) compared with 25 of 36 (71.4%) by clinical exam. Among patients without an endotracheal tube obscuring the mouth, SeeMe detected mouth movements in 16 of 17 (94.1%).

    In seven patients with analyzable mouth videos and clinical command following, SeeMe identified reproducible mouth responses 8.3 days earlier on average. Amplitude and frequency of SeeMe-positive responses correlated with discharge outcomes on the Glasgow Outcome Scale-Extended, with significant Kruskal-Wallis results for both features.
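    The group comparison reported here is the standard Kruskal-Wallis test: pool and rank the response values across outcome groups, then ask whether the group rank sums differ more than chance would allow. A minimal stdlib sketch with invented amplitude values (the real data are in the paper; tie handling is omitted for brevity):

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction) for several samples."""
    pooled = sorted(x for g in groups for x in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}  # 1-based ranks, no ties
    n = len(pooled)
    return 12.0 / (n * (n + 1)) * sum(
        sum(rank[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)

# Invented response amplitudes for three discharge-outcome groups:
poor, moderate, good = [1, 2, 3], [4, 5, 6], [7, 8, 9]
h = kruskal_h(poor, moderate, good)
print(round(h, 2))  # perfectly separated groups give the maximal H here
```

    With three groups, H is compared against a chi-square distribution with two degrees of freedom, so values above about 5.99 correspond to p < 0.05.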

    A deep neural network classifier trained on SeeMe-positive trials identified command specificity with 81% accuracy for eye-opening and an overall accuracy of 65%. Accuracies for the other commands were lower, at 37% for tongue movement and 47% for smile, indicating a strong specificity for eye-opening.

    Authors conclude that acute brain injury patients can exhibit low-amplitude, stimulus-evoked facial movements before overt signs appear at bedside, suggesting that many covertly conscious patients may have motor behavior currently undetected by clinicians.

    Earlier detection could inform family discussions, guide rehabilitation timing, and serve as a quantitative signal for future monitoring or interface-based communication strategies, while complementing standard examinations.

    Xi Cheng et al, Computer vision detects covert voluntary facial movements in unresponsive brain injury patients, Communications Medicine (2025). DOI: 10.1038/s43856-025-01042-y

  • Dr. Krishna Kumari Challa

    Pancreatic insulin disruption triggers bipolar disorder-like behaviors in mice, study shows

    Bipolar disorder is a psychiatric disorder characterized by alternating episodes of depression (i.e., low mood and a loss of interest in everyday activities) and mania (i.e., a state in which arousal and energy levels are abnormally high). On average, an estimated 1–2% of people worldwide are diagnosed with bipolar disorder at some point during their lives.

    Bipolar disorder can be highly debilitating, particularly if left untreated. Understanding the neural and physiological processes that contribute to its emergence could thus be very valuable, as it could inform the development of new prevention and treatment strategies.

    In addition to experiencing periodic changes in mood, individuals diagnosed with this disorder often exhibit some metabolic symptoms, including changes in their blood sugar levels. While some previous studies reported an association between blood sugar control mechanisms and bipolar disorder, the biological link between the two has not yet been uncovered.

    Researchers recently carried out a study aimed at further exploring the link between insulin secretion and bipolar disorder-like behaviors, particularly focusing on the expression of the gene RORβ. 

    Their findings, published in Nature Neuroscience, show that overexpression of this gene in a subtype of pancreatic cells disrupts the release of insulin, which in turn prompts a feedback loop with a region of the brain known as the hippocampus, producing alternating depression-like and mania-like behaviors in mice.

    The results in mice point to a pancreas–hippocampus feedback mechanism by which metabolic and circadian factors cooperate to generate behavioral fluctuations, and which may play a role in bipolar disorder, wrote the authors.

    Yao-Nan Liu et al, A pancreas–hippocampus feedback mechanism regulates circadian changes in depression-related behaviors, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02040-y.

  • Dr. Krishna Kumari Challa

    Shampoo-like gel could help chemo patients keep their hair

    Cancer fighters know that losing their hair is often part of the battle, but researchers have developed a shampoo-like gel that has been tested in animal models and could protect hair from falling out during chemotherapy treatment.

    Baldness from chemotherapy-induced alopecia causes personal, social and professional anxiety for everyone who experiences it. Currently, there are few solutions—the only ones that are approved are cold caps worn on the patient's head, which are expensive and have their own extensive side effects.

    The gel is a hydrogel, which absorbs a lot of water and provides long-lasting delivery of drugs to the patient's scalp. The hydrogel is designed to be applied to the patient's scalp before the start of chemotherapy and left on their head as long as the chemotherapy drugs are in their system—or until they are ready to easily wash it off.

    During chemotherapy treatment, chemotherapeutic drugs circulate throughout the body. When these drugs reach the blood vessels surrounding the hair follicles on the scalp, they kill or damage the follicles, which release their hair shafts, causing the hair to fall out.
    The gel, containing the drugs lidocaine and adrenalone, prevents most of the chemotherapy drugs from reaching the hair follicles by restricting blood flow to the scalp. A dramatic reduction in the amount of drug reaching the follicles helps protect the hair and prevent it from falling out.

    To support practical use of this "shampoo," the gel is designed to be temperature responsive. For example, at body temperature, the gel is thicker and clings to the patient's hair and scalp surface. When the gel is exposed to slightly cooler temperatures, the gel becomes thinner and more like a liquid that can be easily washed away.

     Romila Manchanda et al, Hydrogel-based drug delivery system designed for chemotherapy-induced alopecia, Biomaterials Advances (2025). DOI: 10.1016/j.bioadv.2025.214452

  • Dr. Krishna Kumari Challa

    How aging drives neurodegenerative diseases

    A research team has identified a direct molecular link between aging and neurodegeneration by investigating how age-related changes in cell signaling contribute to toxic protein aggregation.

    Although aging is the biggest risk factor for neurodegenerative diseases, scientists still don't fully understand which age-associated molecular alterations drive their development.

    Using the small nematode worm Caenorhabditis elegans, a research team studied a signaling pathway that leads to pathological protein accumulation with age. Their new paper is published in Nature Aging.

    The team focused on the aging-associated protein EPS8 and the signaling pathways it regulates. This protein is known to accumulate with age and to activate harmful stress responses that lead to a shorter lifespan in worms.

    Researchers found that increased levels of EPS8, and the activation of its signaling pathways, drive pathological protein aggregation and neurodegeneration—typical features of age-associated neurodegenerative diseases such as Huntington's disease and amyotrophic lateral sclerosis (ALS). By reducing EPS8 activity, the group was then able to prevent the build-up of the toxic protein aggregates and preserve neuronal function in worm models of these two diseases.

    Importantly, EPS8 and its signaling partners are evolutionarily conserved and also present in human cells. Similar to what they achieved in the worms, the team was able to prevent the accumulation of toxic protein aggregates in human cell models of Huntington's disease and ALS by reducing EPS8 levels.

    Seda Koyuncu et al, The aging factor EPS8 induces disease-related protein aggregation through RAC signaling hyperactivation, Nature Aging (2025). DOI: 10.1038/s43587-025-00943-w

  • Dr. Krishna Kumari Challa

    Cooling pollen sunscreen can block UV rays without harming corals

    Materials scientists have invented the world's first pollen-based sunscreen derived from Camellia flowers.

    In experiments, the pollen-based sunscreen absorbed and blocked harmful ultraviolet (UV) rays as effectively as commercially available sunscreens, which commonly use minerals like titanium dioxide (TiO2) and zinc oxide (ZnO).
    In laboratory tests on corals, commercial sunscreen induced coral bleaching in just two days, leading to coral death by day six. Each year, an estimated 6,000 to 14,000 tons of commercial sunscreen make their way into the ocean, as people wash it off in the sea or it flows in from wastewater.

    In contrast, the pollen-based sunscreen did not affect the corals, which remained healthy even up to 60 days.

    In other tests, the pollen-based sunscreen also demonstrated its ability to reduce surface skin temperature, thereby helping to keep the skin cool in the presence of simulated sunlight.

    Nature's Guard: UV Filter from Pollen, Advanced Functional Materials (2025). DOI: 10.1002/adfm.202516936

  • Dr. Krishna Kumari Challa

    Why we slip on ice: Physicists challenge centuries-old assumptions

    For over a hundred years, schoolchildren around the world have learned that ice melts when pressure and friction are applied. When you step out onto an icy pavement in winter, you supposedly slip because of the pressure exerted by your body weight through the sole of your (still warm) shoe. But it turns out that this explanation misses the mark.

    New research reveals that it's not pressure or friction that causes ice to become slippery, but rather the interaction between molecular dipoles in the ice and those on the contacting surface, such as a shoe sole.

    The work is published in the journal Physical Review Letters. This insight overturns a paradigm established nearly two centuries ago by Lord Kelvin's brother, James Thomson, who proposed that pressure and friction contribute to ice melting alongside temperature.

    It turns out that neither pressure nor friction plays a particularly significant part in forming the thin liquid layer on ice.

    Instead, computer simulations by researchers reveal that molecular dipoles are the key drivers behind the formation of this slippery layer, which so often causes us to lose our footing in winter. But what exactly is a dipole? A molecular dipole arises when a molecule has regions of partial positive and partial negative charge, giving the molecule an overall polarity that points in a specific direction.
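    As a concrete worked example of a dipole moment, here is the textbook point-charge sum μ = Σ qᵢrᵢ for a rigid water model. The geometry and partial charges are those of the standard TIP3P water model, used here purely as an illustration, not as the simulation method of the paper:

```python
import math

# TIP3P water model: O carries -0.834 e, each H +0.417 e,
# O-H bond length 0.9572 Angstrom, H-O-H angle 104.52 degrees.
Q_H, R_OH, ANGLE = 0.417, 0.9572, math.radians(104.52)
E_ANGSTROM_TO_DEBYE = 4.803  # 1 e*Angstrom expressed in debye

# Place O at the origin; the two O-H bonds straddle the x axis.
half = ANGLE / 2
hydrogens = [(R_OH * math.cos(half),  R_OH * math.sin(half)),
             (R_OH * math.cos(half), -R_OH * math.sin(half))]

# Dipole vector = sum of q_i * r_i (oxygen at the origin contributes nothing).
mu_x = sum(Q_H * x for x, _ in hydrogens)
mu_y = sum(Q_H * y for _, y in hydrogens)
mu_debye = math.hypot(mu_x, mu_y) * E_ANGSTROM_TO_DEBYE

print(f"{mu_debye:.2f} D")  # prints 2.35 D, the well-known TIP3P dipole moment
```

    The y components of the two hydrogens cancel by symmetry, so the dipole points along the molecule's bisector, the "specific direction" the article refers to.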

    Achraf Atila et al, Cold Self-Lubrication of Sliding Ice, Physical Review Letters (2025). DOI: 10.1103/1plj-7p4z

  • Dr. Krishna Kumari Challa

    Fetal brain harm linked to pregnancy infection

    A specific bacterial infection during pregnancy that can cause severe harm to the unborn brain has been identified for the first time, in a finding that could have huge implications for prenatal health.

    Previous studies have disagreed on whether fetal exposure to Ureaplasma parvum has a detrimental effect on brain development, so newborn health specialists set out to determine the answer once and for all.

    Ureaplasma parvum Serovars 3 and 6 are among the most common types that are isolated in pregnancies complicated by infection/inflammation, so they tested them individually in a pre-clinical model and the results were clear.

    They showed that long-term exposure to a specific subtype of Ureaplasma (parvum 6) resulted in loss of cells that are responsible for the production of myelin (the fatty sheath that insulates nerve cells in the brain).

    This resulted in less myelin production and a disruption to the architecture of myelin in the brain. This sort of disruption to myelin production can have a devastating and lifelong impact on neurodevelopment, cognition and motor function.

    By contrast, they also showed that exposure to another subtype of Ureaplasma (parvum 3) had little effect on neurodevelopment.

    Many of the babies affected by this infection in utero are at a much higher risk of preterm birth and the chronic intensive care and inflammation associated with being born too early, the researchers say.

    Dima Abdu et al, Intra-amniotic infection with Ureaplasma parvum causes serovar-dependent white matter damage in preterm fetal sheep, Brain Communications (2025). DOI: 10.1093/braincomms/fcaf182

  • Dr. Krishna Kumari Challa

    After early-life stress, astrocytes can affect behaviour

    Astrocytes in the lateral hypothalamus region of the brain, an area involved in the regulation of sleep and wakefulness, play a key role in neuron activity in mice and affect their behavior, researchers have found.

    By broadening medical science's understanding of cerebral mechanisms, the discovery could someday help in the treatment and prevention of depression in humans, the researchers say.

    According to the scientific literature, early-life stress leads to a five-fold increase in the risk of developing a mental-health disorder as an adult, notably causing treatment-resistant disorders.

    As brain cells, astrocytes are sensitive to variations in the blood concentration of metabolites and, in response to changes in the blood, astrocytes can modulate the extent of their interaction with neurons, their neighboring cells.

    In mice, those changes are particularly responsive to the level of corticosterone, the stress hormone in the rodents' blood.

    In adult mice that experienced early-life stress, researchers saw abnormally high levels of corticosterone. The impact of stress on behavior also differed according to sex: females were less active at night, while males were hyperactive during the day.

    In people with depression who have experienced a similar type of stress, these sex differences have also been observed.

     Lewis R. Depaauw-Holt et al, A divergent astrocytic response to stress alters activity patterns via distinct mechanisms in male and female mice, Nature Communications (2025). DOI: 10.1038/s41467-025-61643-y

  • Dr. Krishna Kumari Challa

    Is this the future of food? 'Sexless' seeds that could transform farming
    Scientists are tinkering with plant genes to create crops that seed their own clones, with a host of benefits for farmers.
    Sacks of seeds without the sex
    Agriculture is on the brink of a revolution: grain crops that produce seeds asexually. The technology — trials of which could start sprouting as early as next month — exploits a quirk of nature called apomixis, in which plants create seeds that produce clones of the parent. Apomixis could slash the time needed to create new varieties of crops, and give smallholder farmers access to affordable high-yielding sorghum (Sorghum bicolor) and cowpea (Vigna unguiculata). But before self-cloning crops can be commercialized, the technology must run the regulatory gauntlet.

    https://www.nature.com/articles/d41586-025-02753-x

  • Dr. Krishna Kumari Challa

    Macaws learn by watching interactions of others, a skill never seen in animals before

    One of the most effective ways we learn is through third-party imitation, where we observe and then copy the actions and behaviors of others. Until recently, this was thought to be a unique human trait, but a new study published in Scientific Reports reveals that macaws also possess this ability.

    Second-party imitation is already known to exist in the animal kingdom. Parrots are renowned for their ability to imitate human speech and actions, and primates, such as chimpanzees, have learned to open a puzzle box by observing a human demonstrator. But third-party imitation is different because it involves learning by observing two or more individuals interact rather than by direct instruction.

    Scientists chose blue-throated macaws for this study because they live in complex social groups in the wild, where they need to learn new behaviors to fit in quickly. Parrots, like macaws, are also very smart and can do things like copy sounds and make tools.

    To find out whether macaws could learn through third-party imitation, researchers worked with two groups of them, performing more than 4,600 trials. Birds in the test group watched another macaw perform one of five different target actions in response to a human's hand signals: fluffing up the feathers, spinning the body, vocalizing, lifting a leg or flapping the wings. In the control group, macaws were given the same hand signals without ever seeing another bird perform the actions.

    The results were clear. The test group learned more actions than the control group, meaning that observing the interaction between the demonstrator macaw and the human experimenter helped them learn specific behaviors. They also learned these actions faster and performed them more accurately than the control group. The study's authors suggest that this ability to learn by passively observing two or more individuals helps macaws adapt to new social situations and may even contribute to their own cultural traditions.

    The research shows that macaws aren't just smart copycats. They may also have their own complex social lives and cultural norms, similar to humans.

    Esha Haldar et al, Third-party imitation is not restricted to humans, Scientific Reports (2025). DOI: 10.1038/s41598-025-11665-9

  • Dr. Krishna Kumari Challa

    Physicists create a new kind of time crystal that humans can actually see

    Imagine a clock that doesn't have electricity, but its hands and gears spin on their own for all eternity. In a new study, physicists have used liquid crystals, the same materials that are in your phone display, to create such a clock—or, at least, as close as humans can get to that idea. The team's advancement is a new example of a "time crystal." That's the name for a curious phase of matter in which the pieces, such as atoms or other particles, exist in constant motion.

    The researchers aren't the first to make a time crystal, but their creation is the first that humans can actually see, which could open up a host of technological applications. The new time crystals can be observed directly under a microscope and even, under special conditions, by the naked eye.

    In the study, the researchers designed glass cells filled with liquid crystals—in this case, rod-shaped molecules that behave a little like a solid and a little like a liquid. Under special circumstances, if you shine a light on them, the liquid crystals will begin to swirl and move, following patterns that repeat over time.

    Under a microscope, these liquid crystal samples resemble psychedelic tiger stripes, and they can keep moving for hours—similar to that eternally spinning clock.

    Hanqing Zhao et al, Space-time crystals from particle-like topological solitons, Nature Materials (2025). DOI: 10.1038/s41563-025-02344-1

  • Dr. Krishna Kumari Challa

    Ant Queens Produce Offspring of Two Different Species

    Some ant queens can produce offspring of more than one species – even when the other species is not known to exist in the nearby wild.

    No other animal on Earth is known to do this, and it's hard to believe.

    This is the bizarre story of a system that allows things to happen that seem almost unimaginable.

    Some queen ants are known to mate with other species to produce hybrid workers, but Iberian harvester ants (Messor ibericus) on the island of Sicily go even further, blurring the lines of species-hood. These queens can give birth to cloned males of another species entirely (Messor structor), with which they can mate to produce hybrid workers.

    Their reproductive strategy is akin to "sexual domestication", the researchers say. The queens have come to control the reproduction of another species, which they exploited from the wild long ago – similar to how humans have domesticated dogs.

    In the past, the Iberian ants seem to have stolen the sperm of another species they once depended on, creating an army of male M. structor clones to reproduce with whenever they wanted.

    According to genetic analysis, the colony's offspring are two distinct species, and yet they share the same mother.

    Some are the hairy M. ibericus, and the others the hairless M. structor – the closest wild populations of which live more than a thousand kilometers away.

    The queen can mate with males of either species in the colony to reproduce. Mating with M. ibericus males will produce the next generation of queens, while mating with M. structor males will result in more workers. As such, all workers in the colony are hybrids of M. ibericus and M. structor.

    Part 1

  • Dr. Krishna Kumari Challa

    The two species diverged from each other more than 5 million years ago, and yet today, M. structor is a sort of parasite living in the queen's colony. But she is hardly a victim.

    The Iberian harvester queen ant controls what happens to the DNA of her clones. When she reproduces, she can do so asexually, producing a clone of herself. She can also fertilize her egg with the sperm of her own species or M. structor, or she can 'delete' her own nuclear DNA and use her egg as a vessel solely for the DNA of her male M. structor clones.
    This means her offspring can either be related to her, or to another species. The only similarity is that both groups contain the queen's mitochondrial DNA. The result is greater diversity in the colony without the need for a wild neighboring species to mate with.

    It also means that Iberian harvester ants belong to a 'two-species superorganism' – which the researchers say "challenges the usual boundaries of individuality."

    The M. structor males produced by M. ibericus queens don't look exactly the same as males produced by M. structor queens, but their genomes match.

    https://www.nature.com/articles/s41586-025-09425-w

    Part 2

  • Dr. Krishna Kumari Challa

    Electrical stimulation can reprogram immune system to heal the body faster

    Scientists have discovered that electrically stimulating macrophages—one of the immune system's key players—can reprogram them to reduce inflammation and encourage faster, more effective healing in disease and injury.

    This breakthrough uncovers a potentially powerful new therapeutic option, with further work ongoing to delineate the specifics.

    Macrophages are a type of white blood cell with several high-profile roles in our immune system. They patrol around the body, surveying for bugs and viruses, as well as disposing of dead and damaged cells, and stimulating other immune cells—kicking them into gear when and where they are needed.

    However, their actions can also drive local inflammation in the body, which can sometimes get out of control and become problematic, causing more damage to the body than repair. This is present in lots of different diseases, highlighting the need to regulate macrophages for improved patient outcomes.

    In the study, published in the journal Cell Reports Physical Science, the researchers worked with human macrophages isolated from healthy donor blood samples. 

    They stimulated these cells using a custom bioreactor to apply electrical currents and measured what happened.

    The scientists discovered that this stimulation caused a shift of macrophages into an anti-inflammatory state that supports faster tissue repair; a decrease in inflammatory marker (signaling) activity; an increase in expression of genes that promote the formation of new blood vessels (associated with tissue repair as new tissues form); and an increase in stem cell recruitment into wounds (also associated with tissue repair).

     Electromodulation of human monocyte-derived macrophages drives a regenerative phenotype and impedes inflammation, Cell Reports Physical Science (2025). DOI: 10.1016/j.xcrp.2025.102795

  • Dr. Krishna Kumari Challa

    Human brains explore more to avoid losses than to seek gains

    Researchers traced a neural mechanism that explains why humans explore more aggressively when avoiding losses than when pursuing gains. Their work reveals how neuronal firing and noise in the amygdala shape exploratory decision-making.

    Human survival has its origins in a delicate balance of exploration versus exploitation. There is safety in exploiting what is known: the local hunting grounds, the favorite foraging location, the go-to deli with the familiar menu. But exploitation carries its own risk, namely over-reliance on the familiar to the point of becoming too dependent upon it, whether through depletion or a change in the stability of local resources.

    Exploring the world in the hope of discovering better options has its own set of risks and rewards. There is the chance of finding plentiful hunting grounds, alternative foraging resources, or a new deli that offers a fresh take on old favorites. And there is the risk that new hunting grounds will be scarce, the newly foraged berries poisonous, or that the meal time will be ruined by a deli that disappoints.

    Exploration-exploitation (EE) dilemma research has concentrated on gain-seeking contexts, identifying exploration-related activity in surface brain areas like cortex and in deeper regions such as the amygdala. Exploration tends to rise when people feel unsure about which option will pay off.

    Loss-avoidance strategies differ from gain-seeking strategies, with links to negative outcomes such as PTSD, anxiety and mood disorders.

    In the study, "Rate and noise in human amygdala drive increased exploration in aversive learning," published in Nature, researchers recorded single-unit activity to compare exploration during gain versus loss learning.

    Part 1

  • Dr. Krishna Kumari Challa

    Researchers examined how strongly neurons fired just before each choice and how variable that activity was across trials. Variability served as "neural noise." Computational models estimated how closely choices followed expected value and how much they reflected overall uncertainty.

    Results showed more exploration in loss trials than in gain trials. After people learned which option was better, exploration stayed higher in loss trials and accuracy fell more from its peak.

    Pre-choice firing in the amygdala and temporal cortex rose before exploratory choices in both gain and loss, indicating a shared, valence-independent rate signal. Loss trials also showed noisier amygdala activity from cue to choice.

    More noise was linked to higher uncertainty and a higher chance of exploring, and noise declined as learning progressed. Using the measured noise levels in decision models reproduced the extra exploration seen in loss trials. Loss aversion did not explain the gap.
    Researchers report two neural signals that shape exploration. A valence-independent firing-rate increase in the amygdala and temporal cortex precedes exploratory choices in both gain and loss. A loss-specific rise in amygdala noise raises the odds of exploration under potential loss, scales with uncertainty, and wanes as learning accrues.

    Behavioral modeling matches this pattern, with value-only rules fitting gain choices and value-plus-total-uncertainty rules fitting loss choices. Findings point to neural variability as a lever that tilts strategy toward more trial-and-error when loss looms, while causal tests that manipulate noise remain to be done.
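    The idea that adding a total-uncertainty term produces more exploration lends itself to a small illustration. The sketch below is not the paper's fitted model; it is a minimal softmax choice rule in which uncertainty lowers the effective inverse temperature, flattening the choice probabilities. The function name and the parameters beta and kappa are illustrative assumptions.

    ```python
    import math

    def choice_prob(values, total_uncertainty=0.0, beta=5.0, kappa=1.0):
        """Softmax choice rule over learned option values.

        Higher total_uncertainty shrinks the effective inverse temperature,
        so lower-valued options are sampled more often (more exploration).
        beta and kappa are illustrative free parameters, not fitted values.
        """
        b = beta / (1.0 + kappa * total_uncertainty)
        exps = [math.exp(b * v) for v in values]
        z = sum(exps)
        return [e / z for e in exps]

    # Value-only rule (as in gain trials): no uncertainty term
    p_gain = choice_prob([1.0, 0.2], total_uncertainty=0.0)

    # Value-plus-total-uncertainty rule (as in loss trials): same values,
    # but high uncertainty flattens the choice probabilities
    p_loss = choice_prob([1.0, 0.2], total_uncertainty=2.0)

    # The worse option is chosen (explored) more often under uncertainty:
    # p_loss[1] > p_gain[1]
    ```

    Under this toy rule, the same pair of learned values yields markedly more choices of the inferior option when the uncertainty term is high, mirroring the extra exploration the models reproduced in loss trials.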

    From an evolutionary survival perspective, the strategy fits well with the need to seek out new resources when facing the loss of safe or familiar choices. While one might consider trying a new restaurant at any time, true seeking behavior will become a priority if the favorite location is closed for remodeling.

    Tamar Reitich-Stolero et al, Rate and noise in human amygdala drive increased exploration in aversive learning, Nature (2025). DOI: 10.1038/s41586-025-09466-1

    Part 2

  • Dr. Krishna Kumari Challa

    Scientists discover why the flu is more deadly for older people

    Scientists have discovered why older people are more likely to suffer severely from the flu, and can now use their findings to address this risk.

    In a study published in PNAS, experts discovered that older people produce a glycosylated protein called apolipoprotein D (ApoD), which is involved in lipid metabolism and inflammation, at much higher levels than younger people do. This reduces the patient's ability to resist virus infection, resulting in a more serious disease outcome.

    ApoD is therefore a target for therapeutic intervention to protect against severe influenza virus infection in the elderly which would have a major impact on reducing morbidity and mortality in the aging population.

    ApoD mediates age-associated increase in vulnerability to influenza virus infection, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2423973122

  • Dr. Krishna Kumari Challa

    There is less room to store carbon dioxide, driver of climate change, than previously thought, finds a new study

    The world has far fewer places to securely store carbon dioxide deep underground than previously thought, steeply lowering its potential to help stem global warming, according to a new study that challenges long-held industry claims about the practice.

    The study, published last week in the journal Nature, found that global carbon storage capacity was roughly one-tenth of previous estimates after ruling out geological formations where the gas could leak, trigger earthquakes or contaminate groundwater, or that had other limitations. That means carbon capture and storage would only have the potential to reduce human-caused warming by 0.7 degrees Celsius (1.26 Fahrenheit)—far less than previous estimates of around 5-6 degrees Celsius (9-10.8 degrees Fahrenheit), researchers said.

    Carbon storage is often portrayed as a way out of the climate crisis. These new findings make clear that it is a limited tool and reaffirms the extreme importance of reducing emissions as fast and as soon as possible.

     Matthew J. Gidden et al, A prudent planetary limit for geologic carbon storage, Nature (2025). DOI: 10.1038/s41586-025-09423-y

  • Dr. Krishna Kumari Challa

    Smart patch runs tests using sweat instead of blood

    It is possible now to precisely assess the body's health status using only sweat instead of blood tests. 

    A research team has now developed a smart patch that can precisely observe internal changes through sweat when simply attached to the body. This is expected to greatly contribute to the advancement of chronic disease management and personalized health care technologies. 

    The researchers developed a wearable sensor that can simultaneously and in real-time analyze multiple metabolites in sweat. 

    The research team developed a thin and flexible wearable sweat patch that can be directly attached to the skin. This patch incorporates both microchannels for collecting sweat and an ultrafine nanoplasmonic structure that analyzes sweat components using light. Thanks to this, multiple sweat metabolites can be simultaneously analyzed without the need for separate staining or labels, with just one patch application.

    A nanoplasmonic structure is an optical sensor structure where nanoscale metallic patterns interact with light, designed to sensitively detect the presence or changes in concentration of molecules in sweat.

    The patch was created by combining nanophotonics technology, which manipulates light at the nanometer scale (one-hundred-thousandth the thickness of a human hair) to read molecular properties, with microfluidics technology, which precisely controls sweat in channels thinner than a hair.

    In other words, within a single sweat patch, microfluidic technology enables sweat to be collected sequentially over time, allowing for the measurement of changes in various metabolites without any labeling process. Inside the patch are six to 17 chambers (storage spaces), and sweat secreted during exercise flows along the microfluidic structures and fills each chamber in order.

    The research team applied the patch to actual human subjects and succeeded in continuously tracking the changing components of sweat over time during exercise.

    In this study, they demonstrated for the first time that three metabolites—uric acid, lactic acid, and tyrosine—can be quantitatively analyzed simultaneously, as well as how they change depending on exercise and diet.

    In particular, by using artificial intelligence analysis methods, they were able to accurately distinguish signals of desired substances even within the complex components of sweat.

    In the future,  this technology can be expanded to diverse fields such as chronic disease management, drug response tracking, environmental exposure monitoring, and the discovery of next-generation biomarkers for metabolic diseases.

    Jaehun Jeon et al, All-flexible chronoepifluidic nanoplasmonic patch for label-free metabolite profiling in sweat, Nature Communications (2025). DOI: 10.1038/s41467-025-63510-2

  • Dr. Krishna Kumari Challa

    Microbiome instability linked to poor growth in kids

    Malnutrition is a leading cause of death in children under age 5, and nearly 150 million children globally under this age have stunted growth from lack of nutrition. Although an inadequate diet is a major contributor, researchers  found over a decade ago that dysfunctional communities of gut microbes play an important role in triggering malnutrition.

    They  have discovered that toddlers in Malawi—among the places hardest hit by malnutrition—who had a fluctuating gut microbiome showed poorer growth than kids with a more stable microbiome. All of the children were at high risk for stunting and acute malnutrition.

    The findings, published in Cell, establish a pediatric microbial genome library—a public health database containing complete genetic profiles of 986 microbes from fecal samples of eight Malawian children collected over nearly a year that can be used for future studies to help predict, prevent and treat malnutrition.

    Culture-independent meta-pangenomics enabled by long-read metagenomics reveals associations with pediatric undernutrition, Cell (2025). DOI: 10.1016/j.cell.2025.08.020

  • Dr. Krishna Kumari Challa

    Discovery reveals how chromosome ends can be protected

    Researchers have uncovered a previously unknown mechanism that safeguards the chromosome ends from being mistakenly repaired by the cell. While DNA repair is vital for survival, attempts to repair the chromosome ends—called telomeres—can have catastrophic outcomes for cells.

    The research, published in Nature, increases the understanding of how cancer and certain rare diseases develop.

    Cells constantly monitor their DNA. A DNA helix that ends abruptly is a signal that the DNA has been severely damaged—at least in most cases. The most severe DNA damage a cell can suffer is when a DNA helix breaks into two pieces.

    Normally, the cells would try to promptly repair all damage to DNA. The dilemma is that our chromosomes have ends that look just like broken DNA. If cells were to "repair" them, looking for another loose end to join them with, this would lead to the fusion of two or more chromosomes, making the cell prone to transforming into a cancer cell.

    Therefore, the chromosome ends—called telomeres—must be protected from the cell's DNA repair machinery.

    The cells have to constantly repair DNA damage to avoid mutations, cell death, and cancer, while at the same time, they must not repair the chromosome ends by mistake, since that would result in the same catastrophic outcome. What's the difference between damaged DNA and the natural chromosome end? This problem has been known for almost a century, but some aspects are still not completely resolved.

    Although the telomeres are not to be repaired as if they were broken DNA, several DNA repair proteins can be found at the chromosome ends. 

    The research team has previously shown that a key repair protein, DNA Protein Kinase (called DNA-PK) helps in processing telomeres and in protecting them from degradation. But how DNA-PK at the same time is prevented from trying to repair these blunt DNA ends remained a mystery until now.

    The researchers have now shown that two other proteins, called RAP1 and TRF2, have an important role to play in regulating DNA-PK.

    They show genetically, biochemically and structurally how the protein RAP1, brought to telomeres by TRF2, ensures through direct interaction that DNA-PK doesn't 'repair' the telomeres.

    Part 1

  • Dr. Krishna Kumari Challa

    The role of telomeres in processes such as cancer development and aging, and what happens when they do not work properly, has long interested scientists. Diseases caused by disturbances in telomere maintenance are rare, but very serious, and include premature aging, blood cell deficiency called aplastic anemia and fibrosis in the lungs.

    In about half of cases, the disease is explained by a known mutation that affects telomere stability, but in many cases, there is currently no known medical explanation for why the individual is sick.
    Their research is also relevant to cancer research. On one hand, inappropriate "repair" of telomeres can trigger catastrophic events leading to the accumulation of mutations and cancer.

    On the other hand, cancer cells are often less efficient at repairing damage to DNA compared to normal cells. This weakness is exploited in cancer treatments and many therapies kill tumor cells by causing DNA damage or inhibiting repair, or both.

    In other words, knowledge of how cells regulate DNA repair and protect telomeres has a bearing on both prevention and treatment of cancer.

    Increased understanding of which proteins play key roles in these cellular processes can, in the long term, contribute to more precise and targeted treatment strategies.

    Patrik Eickhoff et al, Chromosome end protection by RAP1-mediated inhibition of DNA-PK, Nature (2025). DOI: 10.1038/s41586-025-08896-1

    Part 2

  • Dr. Krishna Kumari Challa

    Fathers' drinking plays role in fetal alcohol spectrum disorder, study shows

    It's a well-known fact that fetal alcohol spectrum disorder (FASD) in children is caused by maternal drinking during pregnancy. But it turns out that the father's drinking habits could also affect a child's growth and development.

    A team of international researchers found that a father's alcohol use may have a small but direct negative impact on a child's development by the age of seven. A father's drinking contributes to the harm caused by alcohol use during pregnancy.

    The findings of their study were published recently in the journal Alcohol: Clinical & Experimental Research.

    The researchers analyzed data from five studies on the prevalence and characteristics of FASD among Grade 1 learners in the Western Cape to explore whether a father's drinking affects children diagnosed with FASD. The children's biological mothers or legal guardians completed a questionnaire on the risk factors for FASD.

    According to the researchers, there is a growing recognition that factors beyond pregnant women's drinking habits can affect their children's development. They add that increased attention is currently paid to the role of fathers, not only as a contributing factor to women's drinking habits, but also as an independent contributing factor to the growth and development of children.

    The findings show that children whose fathers drank alcohol were more likely to be shorter, have smaller heads, and score lower on verbal IQ tests. It was also clear from the study that the highest risk to the child's development exists when both parents use alcohol during pregnancy. It also appeared that 'binge drinking' by the father, but especially by both parents, has the most detrimental effect on the child's development. 

    Data analysis showed that between 66% and 77% of fathers of children on the FAS spectrum drank during their partner's pregnancy with the child in question. These fathers drank an average of 12 drinks per drinking day. The number of drinks fathers drank per drinking day was significantly correlated with smaller head circumference in their children; head circumference is used as a measure of brain development.

    The researchers add that fathers who drank an average of five or more drinks per drinking day had shorter children with smaller head circumferences. These children also performed worse on measures in verbal intelligence tests.

    "In general, it was found that the more fathers drank, the worse their children performed. However, it should also be noted that all these effects were observed in children whose mothers consumed alcohol during pregnancy."

    They note that a father's drinking alone didn't increase the chances of a child being diagnosed with FASD. However, when the mother drank during pregnancy and the father was also a heavy drinker, the child was more likely to have the most serious symptoms of FASD.
    "When looking at both parents' drinking patterns, the father's alcohol use alone didn't show a clear link to the child's physical or brain development problems. While both parents' drinking was considered, the main effects on a child's development and physical features were linked to the mother's alcohol use."

    Part 1
  • Dr. Krishna Kumari Challa

    "Even after accounting for alcohol use by mothers during pregnancy, the father's drinking was still linked to lower child height, smaller head size and reduced verbal IQ. This suggests that paternal alcohol use may have its own, though limited, impact on a child's growth and development.

    "Data analyses showed that children whose parents both consumed alcohol during pregnancy had significantly worse outcomes for growth, head circumference, verbal intelligence and general birth defect scores than children where neither parent consumed alcohol."

    The researchers say that, while it is not yet clear whether the impact of a father's drinking on a child's growth and development stems from impaired sperm quality or other epigenetic influences (changes in how genes work that don't involve altering the DNA code itself), the father's role in the development of FASD cannot be overlooked.

    Philip A. May et al, Does paternal alcohol consumption affect the severity of traits of fetal alcohol spectrum disorders?, Alcohol, Clinical and Experimental Research (2025). DOI: 10.1111/acer.70105

    Part 2

  • Dr. Krishna Kumari Challa

    Why some influenza viruses are more dangerous than others

    Serious infections with influenza A viruses are characterized by an excessive immune response, known as cytokine storm. It was previously unclear why some virus strains trigger these storms, while others do not. Researchers investigated 11 different influenza A virus strains and their effect on different human immune cells.

    The results, published in Emerging Microbes & Infections, show that highly pathogenic avian influenza viruses infect specific kinds of immune cells and thus stimulate the production of type I interferon. This could explain why these viruses are particularly dangerous.

    The new research results show that it is not only the immune cells long considered central to type I interferon production, but also other cells of the immune system, that could be decisive in whether an influenza infection triggers an excessive immune response. This knowledge is important for better assessing the risk posed by dangerous virus variants.

    Influenza viruses are some of the most significant respiratory disease pathogens worldwide. While most infections are relatively mild, certain virus strains can cause severe pneumonia, leading to acute respiratory failure. Highly pathogenic influenza viruses are particularly dangerous. They often spread from birds to humans and are associated with significantly higher mortality rates.

    The immune system plays a crucial role in this danger: some virus strains cause the body to release an excessive amount of messenger substances known as cytokines. If a cytokine storm occurs, the immune response will end up damaging the body's tissue more than the virus itself.

    But why do some influenza viruses trigger such excessive reactions, while others only cause infections with mild progression?

    The researchers  tested how the flu viruses infect different immune cells and stimulate the release of messenger substances.

    The aim of their research is to decipher the mechanisms behind mild and severe disease progression and to develop long-term approaches for better protection and treatment options.

    Part 1

  • Dr. Krishna Kumari Challa

    The investigations showed that a certain type of immune cell, so-called plasmacytoid dendritic cells, produce large amounts of the important antiviral messenger interferon-α (IFN-α) upon infection with influenza—regardless of the virus strain. This means that these immune cells generally respond strongly to influenza viruses without the need for the virus to productively infect them.

    If this is the case, why don't all viruses cause the same disease severity? The research team found that other immune cells, namely myeloid dendritic cells and different types of macrophages, are productively infected by highly pathogenic influenza viruses and produce large amounts of IFN-α. Viral replication in these immune cells appears to be an important factor in the production of type I interferon and the development of an excessive immune response (cytokine storm).
    These results provide an explanation of why some influenza viruses could be so much more dangerous than others: it might be their ability to replicate in certain immune cells and thereby trigger an extremely strong immune reaction that leads to severe inflammation and harm to health. This knowledge can help to develop targeted therapies and better identify risk groups.

    Marc A. Niles et al, Influenza A virus induced interferon-alpha production by myeloid dendritic cells and macrophages requires productive infection, Emerging Microbes & Infections (2025). DOI: 10.1080/22221751.2025.2556718

    Part 2

  • Dr. Krishna Kumari Challa

    Giant DNA discovered in people's mouths could impact oral health, immunity and even cancer risk

    Researchers have made a surprising discovery hiding in people's mouths: Inocles, giant DNA elements that had previously escaped detection. These appear to play a central role in helping bacteria adapt to the constantly changing environment of the mouth.

    The findings, published in the journal Nature Communications, provide fresh insight into how oral bacteria colonize and persist in humans, with potential implications for health, disease and microbiome research.

    They discovered Inocles, an example of extrachromosomal DNA—chunks of DNA that exist in cells, in this case bacteria, but outside their main DNA. It's like finding a book with extra footnotes stapled to it, and  scientists are just starting to read them to find out what they do.

    Detecting Inocles was not easy, as conventional sequencing methods fragment genetic data, making it impossible to reconstruct large elements.

    To overcome this, the team applied advanced long-read sequencing techniques, which can capture much longer stretches of DNA.

    Inocle genomes turned out to be hosted by the bacterium Streptococcus salivarius.

    The average Inocle genome is 350 kilobase pairs (a kilobase pair is a unit of length for genetic sequences), making it one of the largest extrachromosomal genetic elements in the human microbiome. Plasmids, another form of extrachromosomal DNA, are at most a few tens of kilobase pairs.

    This length endows Inocles with genes for various functions, including resistance to oxidative stress, DNA damage repair and cell wall-related genes, possibly involved in responding to extracellular stress.

    What's remarkable is that, given the range of the human population the saliva samples represent, researchers think 74% of all human beings may possess Inocles. And even though the oral microbiome has long been studied, Inocles remained hidden all this time because of technological limitations.

    Now that we know they exist, we can begin to explore how they shape the relationship between humans, their resident microbes and our oral health. And there's even some hints that Inocles might serve as markers for serious diseases like cancer, say the researchers.

     Yuya Kiguchi et al, Giant extrachromosomal element "Inocle" potentially expands the adaptive capacity of the human oral microbiome, Nature Communications (2025). DOI: 10.1038/s41467-025-62406-5

  • Dr. Krishna Kumari Challa

    Scientists find quasi-moon orbiting the Earth for the last 60 years

    Everyone who has ever lived on Earth has been well aware of the moon, but it turns out Earth also has some frequent temporary companions. These "quasi-moons" are small asteroids that enter into a kind of resonance with Earth's orbit, although they aren't technically orbiting Earth. In August, this small group of asteroids, called Arjunas, offered another companion to add to the list.

    Astronomers at the Pan-STARRS observatory in Hawaii discovered the new quasi-moon, referred to as "2025 PN7," on August 2, 2025. Their research was recently published in Research Notes of the AAS. Using JPL's Horizons system and Python tools, they analyzed the orbital data and compared it to other Arjunas and quasi-satellites.

    The team found that 2025 PN7 had been in a quasi-orbit for about 60 years already and would likely be nearby for another 60 or so years before departing. Compared to other quasi-moons, this period is relatively short. The quasi-moon Kamo'oalewa has an expected near-Earth orbit of around 381 years, while the total time for 2025 PN7 is 128 years.

    Scientists have been aware of these quasi-satellites since 1991, when they first discovered 1991 VG—which some thought was an interstellar probe at the time.

    Over three decades later, it is now widely accepted that such objects are natural and constitute a secondary asteroid belt that occupies the region in which the Earth–moon system orbits around the sun, defining the Arjuna dynamical class. The Arjunas with the most Earth-like orbits can experience temporary captures as mini-moons of our planet.

    However, mini-moons are distinct from quasi-moons like 2025 PN7: mini-moons do temporarily orbit Earth, while quasi-moons only appear to do so. Currently, there are six other known quasi-moons: 164207 Cardea (2004 GU9), 469219 Kamo'oalewa (2016 HO3), 277810 (2006 FV35), 2013 LX28, 2014 OL339 and 2023 FW13.

    Carlos de la Fuente Marcos et al, Meet Arjuna 2025 PN7, the Newest Quasi-satellite of Earth, Research Notes of the AAS (2025). DOI: 10.3847/2515-5172/ae028f