Unlike the loss of female sexual traits, the loss of male sexual traits is generally thought to take an extremely long time. In many other parthenogenetic species, even rare males often retain their reproductive capabilities. However, these findings suggest that R. mikado has relied solely on parthenogenesis for so long that even neutral mutations have accumulated, leading to the complete loss of male reproductive traits. This study demonstrates that parthenogenesis in R. mikado has become irreversible. Although asexual reproduction is often considered evolutionarily short-lived due to the lack of genetic recombination, previous research estimated that this species has persisted for hundreds of thousands of years. How has R. mikado managed to survive for so long? This remains an intriguing mystery for future research.
Tomonari Nozaki et al, Lack of successful sexual reproduction suggests the irreversible parthenogenesis in a stick insect, Ecology (2025). DOI: 10.1002/ecy.4522
The secret behind sharp vision: New research reveals the benefits of tiny eye movements
Even when we think we are holding our gaze perfectly still, our eyes make tiny, involuntary movements. While these "fixational eye movements" might seem like they would blur our vision, new research reveals they actually help us see fine details more clearly.
In a study combining theoretical modeling and human experiments, researchers and their collaborators have uncovered how these microscopic eye movements enhance rather than impair our visual acuity.
Using advanced eye-tracking technology and computational models, the team demonstrated that these movements help our retinas process visual information more effectively. Their paper is published in the Proceedings of the National Academy of Sciences.
"It is a fascinating paradox," say the researchers. "These constant, tiny movements of our eyes might appear to make our vision less precise, but they actually optimize the way our retinas encode visual information. We found that humans naturally maintain these movements within a nearly perfect range for enhanced visual acuity."
The researchers found that these movements help by 'refreshing' the content of our visual receptors while maintaining an optimal balance between motion and stability. They also found that in the experiment, the movements adapt to the size of the object shown.
The study was conducted using a sophisticated adaptive optics scanning laser ophthalmoscope, allowing researchers to track these minute eye movements with unprecedented precision while participants performed visual tasks. The researchers then combined theoretical modeling with empirical data to link eye movements to retinal neural coding and human behavior.
Trang-Anh E. Nghiem et al, Fixational eye movements as active sensation for high visual acuity, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2416266122
Climate change may be delaying births, suggests study
New research has found exposure to outdoor air pollution and extreme temperatures during pregnancy may increase the risk of prolonged pregnancy, offering new insights into the impact of climate change on maternal health.
Published in Urban Climate, the study is titled "Maternal climate-related exposures and prolonged pregnancy: Findings from a statewide population-based cohort study in Western Australia."
The study analyzed data from nearly 400,000 births in Western Australia and found that higher exposure to fine particulate air pollution (PM2.5) and biothermal stress (a measure that combines air temperature, radiant temperature, relative humidity, wind speed, and human physiology) was associated with pregnancies lasting beyond 41 weeks.
While climate exposure has long been linked to preterm births, this is the first study to examine its impact on prolonged pregnancies.
These findings show that exposure to air pollution and biothermal stress during pregnancy increases the likelihood of prolonged pregnancies, particularly among mothers over 35 years old, first-time mothers, those living in urban areas, and those with complicated pregnancies.
"Environmental stressors, including climate-related exposures during pregnancy, have been associated with maternal stress response and subsequent disruptions in endocrine and inflammatory activities, which increase towards the end of pregnancy. This can either shorten gestation, leading to preterm birth, or lengthen gestation, resulting in prolonged pregnancy in some cases."
Prolonged pregnancy can have serious health implications for both mother and baby, including the need for medical interventions such as labor induction or cesarean sections, increased risk of stillbirth, birth complications, child mortality, early childhood behavioral and emotional problems, and emotional impacts on families.
This study highlights the need for targeted policies and preventative measures to reduce climate-related health risks, including better air quality regulations and public health initiatives aimed at protecting expectant mothers and children from extreme climatic conditions.
Sylvester Dodzi Nyadanu et al, Maternal climate-related exposures and prolonged pregnancy: Findings from a statewide population-based cohort study in Western Australia, Urban Climate (2025). DOI: 10.1016/j.uclim.2025.102316
Method to measure blood-brain barrier permeability accurately developed
For decades, scientists across the globe have investigated methods to accurately measure drug permeability across the blood-brain barrier, a compact layer of cells that protect the brain from potentially dangerous substances and microbes. They struggled with a number of parameters, such as blood flow and binding to plasma proteins, which were shown to impact permeability in different ways.
In research published in the December 2024 issue of Fluids and Barriers of the CNS ("Brain endothelial permeability, transport and flow assessed over 10 orders of magnitude using the in situ brain perfusion technique"), researchers sought to reconcile discrepancies in the field and provide accurate methods for measuring permeability over a very broad range spanning from poorly crossing polar compounds (compounds with a positive or negative charge) to rapidly crossing approved central nervous system (CNS) clinical drugs.
The project team evaluated 120 compounds, revealing that many current CNS drugs permeate the barrier and equilibrate in the brain in less than 10 minutes. The findings challenged previous literature and demonstrated that the equilibration rate for a significant number of CNS drugs is much greater than previously realized.
The researchers showed that many of the drugs used and approved for CNS uptake enter the brain quite well. A good number of agents among the benzodiazepines, antidepressants, antipsychotics, stimulants and antiepileptics go in as quickly as the blood flow can deliver them. "It's amazingly rapid. For such agents, we had to have extremely accurate measurement of cerebral blood flow," the researchers note.
The project also highlighted the role of plasma proteins which, for many lipophilic agents can serve an additional brain delivery role beyond that of free drug in plasma. In effect, plasma-bound drugs can dissociate to maintain the intervascular free drug concentration, which otherwise would show rapid depletion under conditions of higher extraction (50–99%).
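The role of dissociating plasma-bound drug can be illustrated with a back-of-the-envelope calculation. This sketch is my own illustration, not the authors' model, and all the numbers in it are hypothetical: it simply shows how a drug that is 99% protein-bound can still be delivered efficiently if part of the bound pool dissociates during a single capillary transit.

```python
def deliverable_fraction(free_fraction, extraction, bound_available):
    """Fraction of total plasma drug deliverable in one capillary transit.

    free_fraction   -- fraction of drug unbound in plasma (e.g. 0.01)
    extraction      -- single-pass brain extraction of available drug (0-1)
    bound_available -- fraction of the protein-bound pool that dissociates
                       fast enough to be extracted during transit (0-1)
    All values here are hypothetical, for illustration only.
    """
    free_pool = free_fraction
    bound_pool = (1.0 - free_fraction) * bound_available
    return extraction * (free_pool + bound_pool)

# Free drug alone: only 1% of the total drug is even available.
print(deliverable_fraction(0.01, 0.9, 0.0))   # 0.009
# If half the bound pool can dissociate during transit, delivery rises sharply.
print(deliverable_fraction(0.01, 0.9, 0.5))
```

The contrast between the two calls is the point: under high extraction, free drug alone would be depleted almost immediately, so rapid dissociation from plasma proteins is what sustains delivery.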
Quentin R. Smith et al, Brain endothelial permeability, transport, and flow assessed over 10 orders of magnitude using the in situ brain perfusion technique, Fluids and Barriers of the CNS (2024). DOI: 10.1186/s12987-024-00584-y
Rising pollen levels linked to increased mortality in older adults
As climate change intensifies pollen seasons across some regions, new research reveals a connection between pollen exposure and death rates among older adults with breathing problems.
The study, published in BMC Public Health, shows that high pollen days aren't just an inconvenience for allergy sufferers—they could pose serious health risks for vulnerable populations. With pollen seasons growing longer and more intense, understanding these risks has become increasingly urgent for public health officials and health care providers.
The study found that high levels of certain pollen, particularly from deciduous trees and ragweed, were linked to increased risk of death from breathing problems. The effects could last up to two weeks after exposure.
The findings suggest that exposure to certain types of pollen can increase the risk of death from breathing-related problems, particularly for people with chronic conditions. This is especially concerning given expectations that climate change will exacerbate the severity of pollen seasons in coming years.
The researchers looked at four types of pollen: deciduous tree pollen from trees that lose their leaves, evergreen tree pollen, grass pollen and ragweed pollen.
While not everyone is equally sensitive to pollen, the findings highlight the importance of tracking pollen levels and taking precautions during high pollen days, especially for older adults with breathing problems, the researchers say. And, they add, with predicted climate change, preparing for the risks will be increasingly important for public health.
Peter S. Larson et al, Chronic and infectious respiratory mortality and short-term exposures to four types of pollen taxa in older adults in Michigan, 2006-2017, BMC Public Health (2025). DOI: 10.1186/s12889-025-21386-3
Cockatoos prefer their noodles dunked in blueberry yogurt: First evidence of non-primate food flavoring behavior
Researchers report that Goffin's cockatoos (Cacatua goffiniana) engage in food flavoring behavior by dunking food into soy yogurt. Experimentally controlled tests confirmed that the birds selectively dipped food in flavored yogurt rather than neutral alternatives, ruling out alternative explanations such as soaking or cleaning.
Dunking behavior in non-human animals is considered a foraging innovation. It is typically associated with softening dry food, cleaning, flavoring, drowning prey, or transporting liquid.
Prior research has documented various species engaging in dunking, though reports on food flavoring behavior are rare.
Previously, members of the same group of cockatoos exhibited innovative dunking behavior to soak dry food.
The observations that led to the study titled "Innovative Flavoring Behavior in Goffin's Cockatoos," published in Current Biology, began when two cockatoos were seen dunking cooked potato pieces into blueberry-flavored soy yogurt.
To investigate this behavior systematically, researchers conducted fourteen 30-minute observations during breakfast sessions. Eighteen cockatoos were given access to a food bowl containing potatoes or noodles, along with three dunking media: fresh water, blueberry-flavored soy yogurt, and neutral soy yogurt.
Nine out of 18 cockatoos engaged in dunking, preferring noodles over potatoes (an average of 12 times per bird vs. 6 times per bird).
Statistical analysis showed that food was dunked in blueberry yogurt over two times more often than neutral yogurt, while no food was dunked in water. The birds also preferred directly eating blueberry yogurt over the neutral variety.
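A preference like "over two times more often" can be checked with an exact binomial test. The sketch below is illustrative only: the counts are invented (the paper reports a greater-than 2:1 preference, not these exact numbers), and this stdlib-only test is not necessarily the analysis the authors used.

```python
from math import comb

def binom_two_sided_p(k, n, p=0.5):
    """Exact two-sided binomial test p-value (small n, stdlib only).

    Sums the probability of every outcome at least as extreme
    (i.e., at least as improbable) as the observed count k.
    """
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    return sum(x for x in pmf if x <= pmf[k] + 1e-12)

# Hypothetical counts: 40 dunks in blueberry yogurt vs. 18 in neutral yogurt.
k, n = 40, 40 + 18
print(binom_two_sided_p(k, n))  # well below 0.05 for these counts
```

Under the null hypothesis that a dunk lands in either yogurt with equal probability, a 40-to-18 split is unlikely by chance, which is the logic behind calling the preference statistically significant.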
To rule out alternative explanations for the behavior, researchers tested whether dunking functioned as soaking, cleaning, food transport, or tool use. Cockatoos left food in yogurt for an average of 3.2 seconds, significantly shorter than the 22.9 seconds previously observed for water-soaking behavior.
The absence of dunking in water, combined with eating the food with yogurt rather than licking it off, supported the interpretation that dunking was intended for flavoring. A separate test for color preference between yogurts found no significant difference in selection, indicating that dunking choices were based on flavor rather than visual cues.
Food preference testing further revealed that the birds preferred the combination of noodles and blueberry yogurt over noodles or yogurt alone. The potatoes were acceptable without flavoring.
This study provides the first experimental evidence of food flavoring behavior outside the primate lineage. While the cognitive mechanisms behind the innovation remain unclear, researchers note that Goffin's cockatoos demonstrate high cognitive abilities, including problem-solving and sequential planning.
Jeroen Stephan Zewald et al, Innovative flavoring behavior in Goffin's cockatoos, Current Biology (2025). DOI: 10.1016/j.cub.2025.01.002
Engineered animals show new way to fight mercury pollution
Scientists have found an effective new way to clean up methylmercury, one of the world's most dangerous pollutants, which often builds up in our food and environment because of industrial activities such as illegal gold mining and burning coal. The discovery, published in Nature Communications on 12 February 2025, could lead to new ways of engineering animals to protect both wildlife and human health.
The researchers have successfully genetically modified fruit flies and zebrafish to transform methylmercury into a far less harmful gas that disperses in air.
"So we can now use synthetic biology to convert the most environmentally harmful form of mercury and evaporate it out of an animal!" the researchers say.
Methylmercury causes environmental harm due to its high bioavailability and poor excretion: it can easily cross the digestive tract, the blood-brain barrier, and the placenta and becomes increasingly concentrated as it moves up through food webs to levels that can cause harm to neural and reproductive health.
The research team modified the DNA of fruit flies and zebrafish by inserting variants of genes from bacteria to make two enzymes that together can convert methylmercury to elemental mercury, which evaporates from the animals as a gas.
When they tested the modified animals, they found that not only did they have less than half as much mercury in their bodies, but the majority of the mercury was in a much less bioavailable form than methylmercury.
The researchers included safety measures to ensure the modified organisms cannot spread uncontrollably in nature, and they also highlight the need for regulatory control for any real-world use.
Methylmercury demethylation and volatilization by animals expressing microbial enzymes, Nature Communications (2025). DOI: 10.1038/s41467-025-56145-w
How did non-living chemicals become biochemicals on early Earth?
A new study explores how complex chemical mixtures change under shifting environmental conditions, shedding light on the prebiotic processes that may have led to life. By exposing organic molecules to repeated wet-dry cycles, researchers observed continuous transformation, selective organization, and synchronized population dynamics.
Their findings, appearing in Nature Chemistry, suggest that environmental factors played a key role in shaping the molecular complexity needed for life to emerge.
To simulate early Earth, the team subjected chemical mixtures to repeated wet-dry cycles. Rather than reacting randomly, the molecules organized themselves, evolved over time, and followed predictable patterns.
This challenges the idea that early chemical evolution was chaotic. Instead, the study suggests that natural environmental fluctuations helped guide the formation of increasingly complex molecules, eventually leading to life's fundamental building blocks.
The new work investigates how chemical mixtures evolve over time and how such systems can undergo continuous transformation while maintaining structured evolution, illuminating potential mechanisms behind the emergence of biological complexity on Earth.
Chemical evolution refers to the gradual transformation of molecules in prebiotic conditions, a key process in understanding how life may have arisen from non-living matter. While much research has focused on individual chemical reactions that could lead to biological molecules, this study establishes an experimental model to explore how entire chemical systems evolve when exposed to environmental changes.
The researchers used mixtures containing organic molecules with diverse functional groups, including carboxylic acids, amines, thiols, and hydroxyls.
By subjecting these mixtures to repeated wet-dry cycles—conditions that mimic the environmental fluctuations of early Earth—the study identified three key findings: chemical systems can continuously evolve without reaching equilibrium, avoid uncontrolled complexity through selective chemical pathways, and exhibit synchronized population dynamics among different molecular species.
These observations suggest that prebiotic environments may have played an active role in shaping the molecular diversity that eventually led to life.
This research offers a new perspective on how molecular evolution might have unfolded on early Earth. By demonstrating that chemical systems can self-organize and evolve in structured ways, this work provides experimental evidence that may help bridge the gap between prebiotic chemistry and the emergence of biological molecules. Beyond its relevance to origins-of-life research, the study's findings may have broader applications in synthetic biology and nanotechnology. Controlled chemical evolution could be harnessed to design new molecular systems with specific properties, potentially leading to innovations in materials science, drug development, and biotechnology.
Evolution of Complex Chemical Mixtures Reveals Combinatorial Compression and Population Synchronicity, Nature Chemistry (2025). DOI: 10.1038/s41557-025-01734-x
Mimicry: Masquerading moth deploys specialized nanostructures to evade predators
Researchers found the forewings of the fruit-sucking moth (Eudocima aurantia) have the appearance of a crumpled leaf—but are in fact flat.
They published their research in Current Biology this week.
They found the moth mimics the 3D shape and coloration of a leaf using specialized nanostructures on its wings. These nanostructures create a shiny wing surface that mimics the highlights found on a smooth, curved leaf surface.
Structural and pigmentary coloration produces a leaf-like brown color, with the moth exploiting thin-film reflectors to produce directional reflections—producing the illusion of a 3D leaf shape.
It is intriguing that the nanostructures that produce shininess occur only on the parts of the wing that would be curved if the wing were a leaf.
This suggests that moths are exploiting the way predators perceive 3D shapes to improve their camouflage, which is very impressive.
What is remarkable about this moth, however, is that it is creating the appearance of a three-dimensional object despite being almost completely flat.
This mimicry likely serves as a camouflage strategy, fooling predators into misidentifying the moth as an inedible object.
Diabetes can drive the evolution of antibiotic resistance, study suggests
Antibiotics are powerful, fast-acting medications designed to eradicate bacterial infections. In recent years, however, their dependability has waned as antibiotic-resistant bacteria continue to evolve and spread.
Staphylococcus aureus is a leading cause of antibiotic-resistance-associated infections and deaths. It is also the most prevalent bacterial infection among those with diabetes mellitus, a chronic condition that affects blood sugar control and reduces the body's ability to fight infections.
Researchers have just shown that people with diabetes are more likely to develop antibiotic-resistant strains of Staph, too.
Their results, which were published in Science Advances, show how the diabetic microbial environment produces resistant mutations, while hinting at ways antibiotic resistance can be combated in this patient population.
The researchers found that antibiotic resistance emerges much more rapidly in diabetic models than in non-diabetic models of disease.
This interplay between bacteria and diabetes could be a major driver of the rapid evolution and spread of antibiotic resistance that we are seeing.
Diabetes affects the body's ability to control a type of sugar called glucose, often causing excess glucose to build up in the bloodstream. Staph feeds off these high sugar levels, allowing it to reproduce more rapidly. The bacterium can also grow without consequence, as diabetes also impairs the immune system's ability to destroy cells and control infection.
As the numbers of bacteria increase in a diabetic infection, so does the likelihood of resistance. Random mutations appear and some build up resistance to external stressors, like antibiotics. Once a resistant mutant is present in a diabetic infection, it rapidly takes over the population, using the excess glucose to drive its rapid growth.
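The takeover dynamics described above can be sketched with a toy competition model. This is my own illustration, not the authors' model, and all rates and population sizes are hypothetical: under antibiotic pressure the sensitive population grows slowly while a rare resistant mutant, fueled by excess glucose and unchecked by the immune system, grows fast, so its small initial numbers compound into dominance within days.

```python
from math import exp

def takeover(days, sensitive0=1e6, resistant0=1.0,
             sensitive_rate=0.5, resistant_rate=2.0):
    """Return the resistant fraction of the population after `days`.

    Both subpopulations grow exponentially; the resistant mutant has the
    higher per-day rate under antibiotic pressure. All parameters are
    hypothetical, chosen only to illustrate the takeover dynamic.
    """
    s = sensitive0 * exp(sensitive_rate * days)
    r = resistant0 * exp(resistant_rate * days)
    return r / (r + s)

# A single resistant cell among a million sensitive ones dominates in days.
for day in (0, 5, 10, 15):
    print(day, round(takeover(day), 4))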
Staphylococcus aureus is uniquely suited to take advantage of this diabetic environment.
Once that resistant mutation happens, you have excess glucose and you don't have the immune system to clear the mutant and it takes over the entire bacterial population in a matter of days.
Their experiments and models bore this out.
So, what can be done to prevent it? Well, the researchers showed that reducing blood sugar levels in diabetic models (through administration of insulin) deprived bacteria of their fuel, keeping their numbers at bay, and reducing the chances of antibiotic-resistant mutations from occurring.
Their findings suggest that controlling blood sugar through insulin use could be key in preventing antibiotic resistance.
The sexome's forensic potential: After intercourse, both partners leave traces of their own unique genital microbiome
Criminal investigations of heterosexual sexual assault often include a DNA analysis of the woman's genitals with the aim of identifying the presence of the perpetrator's sperm for proof of intercourse. However, in cases where no sperm is detected, including in assaults where the perpetrator uses a condom, these exams are often ineffective.
In research published in iScience on February 12, 2025, researchers show that bacterial species are transferred between both individuals during sexual intercourse, and these species can be traced to a sexual partner's unique genital microbiome.
The authors say that analyses of these genital microorganisms—which they called the "sexome"—may be useful in identifying perpetrators of sexual assault.
This research is based on the forensic concept that every contact leaves a trace.
In this study, the researchers confirmed that both men and women have unique populations of bacteria in their genital areas. They then recruited 12 monogamous, heterosexual couples to investigate whether these sexomes are transferred during sexual intercourse, including when a condom is used.
At the beginning of the study, each participant collected samples of their genital microbiome using swabs. The investigators used RNA gene sequencing to determine which bacteria strains were present—down to the sub-species level—and identified microbial signatures for each participant.
Couples were then asked to abstain from sex for varying lengths of time (from two to 14 days) and then to participate in intercourse. Afterwards, samples were collected again from each individual's genital microbiome. Analysis showed that a participant's unique bacterial signature could be identified in their sexual partner's sample following intercourse.
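The matching step can be illustrated with a toy set-overlap calculation. This is not the study's actual pipeline, and the taxa names are invented: it only shows the idea that a partner's sample acquires taxa from the other partner's signature, which a simple similarity score such as Jaccard can pick up.

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of taxa (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b)

# Hypothetical sub-species-level microbial signatures (names are made up).
partner_f_before = {"L.crispatus_A", "L.iners_B", "G.vaginalis_C"}
partner_m_before = {"S.epidermidis_D", "C.tubercul_E"}

# After intercourse, the male sample now carries taxa from the female
# partner's signature (hypothetical transfer for illustration).
partner_m_after = partner_m_before | {"L.crispatus_A", "L.iners_B"}

print(jaccard(partner_f_before, partner_m_before))  # no shared taxa before
print(jaccard(partner_f_before, partner_m_after))   # overlap appears after
```

A real forensic comparison would need sequencing, abundance data, and statistical controls, but the underlying logic is the same: shared signature taxa after contact are the trace.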
Three of the couples reported using a condom. The analysis found that although this did have some impact on the transfer of microbial content, it did not inhibit it entirely.
When a condom was used, the majority of transfer occurred from the female to the male.
This shows promise for a means of testing a perpetrator post-assault and means there may be microbial markers that detect sexual contact even when a condom was used.
The investigators also looked at whether males were circumcised and whether the participants had pubic hair, but found that neither factor seemed to affect the transfer of bacterial species between partners. However, they did find that the makeup of the vaginal microbiome changed during menstruation, which they note could affect results.
Physicists uncover evidence of two arrows of time emerging from the quantum realm
What if time is not as fixed as we thought? Imagine that instead of flowing in one direction—from past to future—time could flow forward or backwards due to processes taking place at the quantum level. This is the thought-provoking discovery made by researchers, as a new study reveals that opposing arrows of time can theoretically emerge from certain quantum systems.
For centuries, scientists have puzzled over the arrow of time—the idea that time flows irreversibly from past to future. While this seems obvious in our experienced reality, the underlying laws of physics do not inherently favor a single direction. Whether time moves forward or backwards, the equations remain the same.
One way to explain this: when you watch spilled milk spreading across a table, it is clear that time is moving forward. But if you were to play that scene in reverse, like a movie, you would immediately know something was wrong—it would be hard to believe milk could simply gather itself back into the glass.
However, there are processes, such as the motion of a pendulum, that look just as believable in reverse. The puzzle is that, at the most fundamental level, the laws of physics resemble the pendulum; they do not account for irreversible processes.
The new findings suggest that while our common experience tells us that time only moves one way, we are just unaware that the opposite direction would have been equally possible.
The study, published in Scientific Reports, explored how a quantum system—the world of the sub-atomic—interacts with its environment, known as an "open quantum system."
Researchers investigated why we perceive time as moving in one direction, and whether this perception emerges from open quantum mechanics.
To simplify the problem, the team made two key assumptions. First, they treated the vast environment surrounding the system in such a way that they could focus only on the quantum system itself. Second, they assumed that the environment—like the entire universe—is so large that energy and information dissipate into it, never returning.
This approach enabled them to examine how time emerges as a one-way phenomenon, even though, at the microscopic level, time could theoretically move in both directions. Even after applying these assumptions, the system behaved the same way whether time moved forward or backwards. This discovery provided a mathematical foundation for the idea that time-reversal symmetry still holds in open quantum systems—suggesting that time's arrow may not be as fixed as we experience it.
"The surprising part of this project was that even after making the standard simplifying assumption to our equations describing open quantum systems, the equations still behaved the same way whether the system was moving forwards or backwards in time," say the researchers.
When the physicists carefully worked through the math, they found that this behavior had to be the case because a key part of the equation, the "memory kernel," is symmetrical in time.
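The role of the memory kernel can be stated schematically. Open-system dynamics with memory are often written in a generic memory-kernel (Nakajima-Zwanzig) form; the notation below is illustrative and not taken from the paper's actual equations:

```latex
% Schematic memory-kernel master equation for the reduced state \rho(t)
% (generic Nakajima-Zwanzig form; notation is illustrative only):
\frac{\mathrm{d}\rho(t)}{\mathrm{d}t}
  = \int_{0}^{t} \mathcal{K}(t-s)\,\rho(s)\,\mathrm{d}s .
% If the kernel is symmetric in its time argument,
\mathcal{K}(\tau) = \mathcal{K}(-\tau),
% the equation is form-invariant under t \to -t, so forward and
% backward evolutions obey the same dynamics.
```

In words: because the kernel weights past states symmetrically in time, running the evolution backwards produces an equation of exactly the same form, which is the mathematical sense in which neither arrow of time is preferred.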
They also found a small but important detail that is usually overlooked: a time-discontinuous factor emerged that keeps the time-symmetry property intact. "It's unusual to see such a mathematical mechanism in a physics equation because it's not continuous, and it was very surprising to see it pop up so naturally," they add.
Thomas Guff et al, Emergence of opposing arrows of time in open quantum systems, Scientific Reports (2025). DOI: 10.1038/s41598-025-87323-x
Can organisms help others even after their own death?
Bacteria evolved to help neighboring cells after death, new research reveals
Darwin's theory of natural selection provides an explanation for why organisms develop traits that help them survive and reproduce. Because of this, death is often seen as a failure rather than a process shaped by evolution.
When organisms die, their molecules need to be broken down for reuse by other living things. Such recycling of nutrients is necessary for new life to grow.
Researchers have shown that a type of E. coli bacterium produces an enzyme that breaks the contents of its cells down into nutrients after death. The dead bacteria thereby offer a banquet of nutrients to the cells that were their neighbors in life.
The study has been published in Nature Communications.
We typically think of death as the end—that after something dies it just falls apart, rots and becomes a passive target as it is scavenged for nutrients. But what this new work has demonstrated is that death is not the end of the programmed biological processes that occur in an organism. Those processes continue after death, and they have evolved to do so.
"That is a fundamental rethink about how we view the death of an organism."
The researchers realized they had stumbled across a potentially new area of biology: processes that have evolved to function after death.
One problem remained: the researchers couldn't work out how an enzyme that functions after death could have evolved.
"Typically, we think of evolution acting on living organisms, not dead ones," the researchers say.
The solution is that neighboring cells that gain nutrients from the dead cell are likely to be clonally related to it. Consequently, the dead cell is giving nutrients to its relatives, analogous to how animals will often help feed younger members of their family group.
The finding demonstrates that processes after death, like processes during life, can be biologically programmed and subject to evolution. Biomolecules that regulate processes after death might be exploited in the future as novel targets for bacterial disease or as candidates to enhance bacterial growth in biotechnology.
Bacteria encode post-mortem protein catabolism that enables altruistic nutrient recycling, Nature Communications (2025). DOI: 10.1038/s41467-025-56761-6
Guardian gene switch PROX1 safeguards liver cell identity and suppresses tumor formation
The discovery is of great interest for cancer medicine because, in recent years, a change of identity of cells has come into focus as a fundamental principle of carcinogenesis. The researchers were able to show that the newly discovered sentinel is so powerful that it can slow down highly potent cancer drivers and cause malignant liver tumors to regress in mice.
As a rule, the identity of cells is determined during embryonic development. They differentiate into nerve cells or liver cells, for example, and their fate is sealed. Only stem cells retain the ability to develop in different directions. However, once cells have differentiated, they usually stay on course.
Cancer cells are different. They have the remarkable ability to reactivate embryonic programs and thus change their identity—their phenotype. This ability is referred to as plasticity, which in this context is unwanted or abnormal.
It enables tumor cells to break away from the cell network and migrate through the body. Once they have arrived in the target organ, the cells differentiate again, become sedentary again and form metastases at this site.
It is not so long ago that the importance of plasticity as a fundamental phenomenon in cancer was recognized.
The researchers' goal is to reduce the plasticity of cancer cells and thus prevent the development and spread of malignant tumors. To do this, they first need to understand how cell plasticity is regulated.
In principle, almost all cells in the body have an identical genome. But how is it possible, then, that such different and highly specialized cell types as nerve cells or liver cells arise?
This is only possible because cells have a sophisticated control network, which ensures that only certain genes are switched on, depending on the cell type, while others are permanently silenced. Master regulators play a central role in this process: they switch on genes that can cause specialized cells to change their identity and even acquire stem cell properties.
However, little is known about the antagonists—the control instances that prevent unwanted (re)transformation of differentiated cells by switching off certain genes.
The researchers used a computer program to search for gene switches that could potentially serve as guardians.
The research team found almost 30 different guardian candidates and decided to pursue one of them further: PROX1 (Prospero homeobox protein 1).
Studies on the liver cancer model showed that the team had hit the mark: PROX1 turned out to be a highly influential guardian in liver cells. If it is missing, liver cells change their phenotype. Conversely, experimentally increasing the guardian's activity reduces the versatility of tumor cells. PROX1 was able to override the influence of even strong cancer drivers and suppress the formation of tumors despite their presence.

The researchers also found something else: the PROX1 guardian must be constantly active, around the clock, to fulfill its function. This sets it apart from many other gene switches, which, like a toggle switch, only need to be activated briefly.
Lim B et al, Active repression of cell fate plasticity by PROX1 safeguards hepatocyte identity and prevents liver tumourigenesis. Nature Genetics (2025). DOI: 10.1038/s41588-025-02081-w
Birds have developed complex brains independently from mammals, studies reveal
Two studies published in the latest issue of Science have revealed that birds, reptiles, and mammals have developed complex brain circuits independently, despite sharing a common ancestor. These findings challenge the traditional view of brain evolution and demonstrate that, while comparable brain functions exist among these groups, embryonic formation mechanisms and cell types have followed divergent evolutionary trajectories.
The pallium is the brain region where the neocortex forms in mammals, the part responsible for cognitive and complex functions that most distinguishes humans from other species. The pallium has traditionally been considered a comparable structure among mammals, birds, and reptiles, varying only in complexity levels. It was assumed that this region housed similar neuronal types, with equivalent circuits for sensory and cognitive processing.
Previous studies had identified the presence of shared excitatory and inhibitory neurons as well as general connectivity patterns suggesting a similar evolutionary path in these vertebrate species.
However, the new studies reveal that, although the general functions of the pallium are equivalent among these groups, its developmental mechanisms and the molecular identity of its neurons have diverged substantially throughout evolution.
Active matter: Scientists create three-dimensional 'synthetic worms'
Researchers have made a breakthrough in the development of "life-like" synthetic materials which are able to move by themselves like worms.
Scientists have been investigating a new class of materials called "active matter," which could be used for various applications from drug delivery to self-healing materials.
Compared to inanimate matter—the sort of motionless materials we come across in our lives every day, such as plastic and wood—active matter can show fascinating life-like behavior.
These materials are made of elements which are driven out of equilibrium by internal energy sources, allowing them to move independently.
Researchers carried out the experiment using special micron-sized (one millionth of a meter) particles called Janus colloids, which were suspended in a liquid mixture.
The team then made the material active by applying a strong electric field and observed the effects using a special kind of microscope which takes three-dimensional images.
When the electric field was turned on, the scattered colloid particles merged together to form worm-like structures, creating a fully three-dimensional synthetic active matter system.
The researchers observed the formation of fascinating new structures: self-driven active filaments reminiscent of living worms. They were then able to develop a theoretical framework that enabled them to predict and control the motion of the synthetic worms based solely on their lengths.
Air inside your home may be more polluted than outside due to everyday chemical products
When you walk through a flower garden, the crisp, fresh scent is one of the first things you notice.
But bringing that flower scent or other aromas indoors with the help of chemical products—yes, air fresheners, wax melts, floor cleaners, deodorants and others—rapidly fills the air with nanoscale particles that are small enough to get deep into your lungs, researchers have found over a series of studies.
These nanoparticles form when fragrances interact with ozone, which enters buildings through ventilation systems, triggering chemical transformations that create new airborne pollutants.
A forest is a pristine environment, but if you're using cleaning and aromatherapy products full of chemically manufactured scents to recreate a forest in your home, you're actually creating a tremendous amount of indoor air pollution that you shouldn't be breathing in.
Nanoparticles just a few nanometers in size can penetrate deep into the respiratory system and spread to other organs.
To understand how airborne particles form indoors, you need to measure the smallest nanoparticles—down to a single nanometer. At this scale, we can observe the earliest stages of new particle formation, where fragrances react with ozone to form tiny molecular clusters. These clusters then rapidly evolve, growing and transforming in the air around us.
Even though it's yet to be determined how breathing in volatile chemicals from these products impacts your health, the researchers have repeatedly found that when fragrances are released indoors, they quickly react with ozone to form nanoparticles. These newly formed nanoparticles are particularly concerning because they can reach very high concentrations, potentially posing risks to respiratory health.
Essential oil diffusers, disinfectants, air fresheners and other scented sprays also generate a significant number of nanoscale particles.
But it's not just scented products contributing to indoor nanoparticle pollution: A study by researchers found that cooking on a gas stove also emits nanoparticles in large quantities.
Just 1 kilogram of cooking fuel emits 10 quadrillion particles smaller than 3 nanometers, which matches or exceeds what's emitted from cars with internal combustion engines. At that rate, you might be inhaling 10–100 times more of these sub-3 nanometer particles from cooking on a gas stove indoors than you would from car exhaust while standing on a busy street.
Still, scented chemical products match or surpass gas stoves and car engines in the generation of nanoparticles smaller than 3 nanometers, called nanocluster aerosol. Between 100 billion and 10 trillion of these particles could deposit in your respiratory system within just 20 minutes of exposure to scented products.
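For a rough sense of scale, the deposition figures above translate into a per-minute rate. This is simple arithmetic on the quoted range, not an additional measurement:

```python
# 100 billion to 10 trillion particles deposited over a 20-minute exposure,
# per the figures quoted above.
low_total, high_total = 100e9, 10e12
minutes = 20.0

low_rate = low_total / minutes    # 5e9 particles per minute
high_rate = high_total / minutes  # 5e11 particles per minute
print(f"{low_rate:.0e} to {high_rate:.0e} particles per minute")
```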
Satya S. Patra et al, Flame-Free Candles Are Not Pollution-Free: Scented Wax Melts as a Significant Source of Atmospheric Nanoparticles, Environmental Science & Technology Letters (2025). DOI: 10.1021/acs.estlett.4c00986
Maternal acetaminophen (paracetamol) use may alter placental gene expression, raising ADHD risk in children
Maternal acetaminophen exposure during pregnancy is associated with a higher likelihood of childhood attention deficit hyperactivity disorder (ADHD), according to a new study.
Researchers analyzed plasma biomarkers of acetaminophen (APAP) exposure in a cohort of 307 African American mother-child pairs. Detection of APAP in second-trimester maternal blood samples correlated with increased odds of ADHD diagnosis in children by age 8–10.
Acetaminophen (also called paracetamol) is widely used during pregnancy, with an estimated 41–70% of pregnant individuals in the United States, Europe, and Asia reporting its use. Despite its classification as a low-risk medication by regulatory agencies such as the U.S. Food and Drug Administration and the European Medicines Agency, accumulating evidence suggests a potential link between prenatal APAP exposure and adverse neurodevelopmental outcomes, including ADHD and autism spectrum disorder.
Researchers utilized untargeted metabolomics to identify APAP metabolites in second-trimester maternal plasma samples and examined their relationship with childhood ADHD diagnoses and placental gene expression.
APAP metabolites were detected in 20.2% of maternal plasma samples. Children whose mothers had APAP biomarkers present in their plasma had a 3.15 times higher likelihood of an ADHD diagnosis (95% confidence interval: 1.20 to 8.29) compared with those without detected exposure.
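To make the reported figure concrete: an odds ratio like the 3.15 above compares the odds of an ADHD diagnosis between exposed and unexposed groups. A minimal sketch, using invented counts (not the study's actual data) that happen to yield 3.15:

```python
# Hypothetical 2x2 table (counts invented purely for illustration):
#                              ADHD  no ADHD
exposed_adhd, exposed_no = 20, 40       # mothers with APAP detected
unexposed_adhd, unexposed_no = 30, 189  # mothers without APAP detected

# Odds ratio = (odds of ADHD given exposure) / (odds given no exposure)
odds_ratio = (exposed_adhd / exposed_no) / (unexposed_adhd / unexposed_no)
print(round(odds_ratio, 2))  # 3.15
```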
The association was stronger among females than males, with female children of APAP-exposed mothers showing a 6.16 times higher likelihood of ADHD (95% confidence interval: 1.58 to 24.05), while the association was weaker and nonsignificant in males.
Placental gene expression analysis of a subset of 174 participants indicated sex-specific transcriptional changes. In females, APAP exposure was associated with upregulation of immune-related pathways, including increased expression of immunoglobulin heavy constant gamma 1 (IGHG1).
Increased IGHG1 expression was statistically linked to ADHD diagnoses, with mediation analysis suggesting that APAP's effect on ADHD was partly mediated through this gene's placental expression.
Oxidative phosphorylation pathways were downregulated in both sexes, a pattern previously associated with neurodevelopmental impairment.
Findings align with prior epidemiological studies and experimental animal research linking prenatal APAP exposure to neurodevelopmental disruptions. By using objective biomarker measurements, the current study addressed the recall-bias concerns raised about previous studies, in which APAP use was self-reported.
Brennan H. Baker et al, Associations of maternal blood biomarkers of prenatal APAP exposure with placental gene expression and child attention deficit hyperactivity disorder, Nature Mental Health (2025). DOI: 10.1038/s44220-025-00387-6
Dangerous bacteria lurk in hospital sink drains, despite rigorous cleaning, study reveals
We hope to be cured when we stay in hospital. But too often, we acquire new infections there. Such "health-care-associated infections" (HAI) are a growing problem worldwide, taking up an estimated 6% of global hospital budgets.
Patients with lowered immune defenses, and in some hospitals, poor adherence to hygiene protocols, allow HAIs to thrive. Furthermore, antibiotics are widely used in hospitals, which tends to select for hardy, resistant strains of bacteria. When such resistance genes lie on mobile genetic elements, they can even jump between bacterial species, potentially leading to novel diseases.
Researchers now have shown that hospital sink drains host bacterial populations that change over time, despite impeccable cleaning protocols.
These results highlight that controlling bacterial growth in drains, and preventing colonization by new strains of such hard-to-disinfect niches, is likely a global problem.
Sinks and their drains are routinely cleaned with bleach, as well as disinfected with chemicals and pressurized steam every fortnight, or every month in non-patient areas. Once a year, drainpipes are hyperchlorinated at low temperature.
Despite this, the authors of this study identified a total of 67 different species from the drains. The diversity in most drains went up and down over time with no clear pattern—seasonal or otherwise. The greatest diversity occurred in general medicine and intensive care, while the fewest isolates were found in the microbiology laboratory.
Dominant across wards were six Stenotrophomonas species as well as Pseudomonas aeruginosa, a pathogen known to cause ventilator-associated pneumonia and sepsis, and characterized by the WHO as one of the greatest threats to humans in terms of antibiotic resistance. At least 16 other Pseudomonas species were also found at various times and in various wards.
Other notorious hospital-associated pathogens found repeatedly were Klebsiella pneumoniae, Acinetobacter johnsonii and Acinetobacter ursingii, Enterobacter mori and Enterobacter quasiroggenkampii, and Staphylococcus aureus.
The bacteria the researchers found may originate from many sources, from patients, medical personnel, and even the environment surrounding the hospital. Once established in sink drains, they can spread outwards, posing significant risks to immunocompromised patients above all.
Yearlong analysis of bacterial diversity in hospital sink drains: culturomics, antibiotic resistance and implications for infection control, Frontiers in Microbiology (2025). DOI: 10.3389/fmicb.2024.1501170
The cool conditions which have allowed ice caps to form on Earth are rare events in the planet's history and require many complex processes working at once, according to new research.
A team of scientists investigated why Earth has existed in what is known as a "greenhouse" state without ice caps for much of its history, and why the conditions we are living in now are so rare.
They found that Earth's current ice-covered state is not typical for the planet's history and was only achieved through a strange coincidence.
Many ideas have previously been proposed to explain the known cold intervals in Earth's history. These include decreased CO2 emissions from volcanoes, or increased carbon storage by forests, or the reaction of CO2 with certain types of rocks.
The researchers undertook the first ever combined test of all of these cooling processes in a new type of long-term 3D model of the Earth.
This type of "Earth Evolution Model" has only recently been made possible through advances in computing.
They concluded that no single process could drive these cold climates, and that the cooling in fact required the combined effects of several processes at once. The results of their study were published 14 February 2025 in Science Advances.
The reason we live on an Earth with ice caps—rather than an ice-free planet—is due to a coincidental combination of very low rates of global volcanism, and highly dispersed continents with big mountains, which allow for lots of global rainfall and therefore amplify reactions that remove carbon from the atmosphere.
The important implication here is that the Earth's natural climate regulation mechanism appears to favor a warm and high-CO2 world with no ice caps, not the partially glaciated and low-CO2 world we have today.
This general tendency towards a warm climate has helped prevent devastating 'snowball Earth' global glaciations, which have only occurred very rarely and have therefore helped life to continue to prosper.
There is an important message, which is that we should not expect the Earth to always return to a cold state as it was in the pre-industrial age.
Earth's current ice-covered state is not typical for the planet's history, but our current global society relies on it. We should do everything we can to preserve it, and we should be careful with assumptions that cold climates will return if we drive excessive warming before stopping emissions. Over its long history, the Earth likes it hot, but our human society does not, say the researchers.
Inconsistent reporting by companies leads to underestimation of methane's climate impact, study finds
Companies around the world are underestimating their total greenhouse gas footprints because of inconsistent accounting standards for methane emissions, finds a new study by researchers.
The new study, published in Nature Communications, found that methane emissions are being underreported by at least the equivalent of between 170 million and 3.3 billion tons of carbon over a decade, depending on the metric used in calculating the shortfall.
This means that each year, on average, companies around the world have potentially underestimated their combined carbon footprint by an amount comparable to the UK's total carbon emissions in 2022. This represents a significant methane emissions gap that could cost between $1.6 billion (£1.3 billion) and $40 billion (£32 billion) to fix.
The cumulative emission gap the researchers have documented in this work shows how important it is to standardize the reporting of methane emissions. Methane is a potent greenhouse gas and the first step towards properly addressing its effect on climate is to make sure that it's accounted for properly.
Adopting a global standard is in principle easy for companies as it essentially only requires the adjustment of a few conversion factors when calculating their greenhouse gas footprint. However, it requires global coordination as companies are currently often subject to fragmented regulations.
Methane is a potent greenhouse gas that contributes to global warming at levels comparable to carbon dioxide. Though methane is emitted in much smaller quantities than carbon dioxide, it's more efficient at trapping heat in the atmosphere. However, methane is also short-lived in the atmosphere, with a half-life of only about 10 years versus 120 years for carbon dioxide.
How much total heat a greenhouse gas traps is called its Global Warming Potential (GWP) and is measured in CO2-equivalent units, or the amount of carbon dioxide gas that would cause the same amount of warming. Because of methane's short lifespan, the conversion to CO2 is not straightforward and debate persists about how best to represent it in terms of carbon dioxide.
If methane's impact is calculated over 20 years (GWP-20), it's about 80 times more potent than carbon dioxide because that's the timeframe before most of it has dissipated. However, gauged over 100 years (GWP-100) more of the methane has broken down so it's only about 28 times as potent.
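As a back-of-the-envelope illustration of why the choice of metric matters, here is a sketch in Python using the rounded GWP factors quoted above; the company tonnage is invented for illustration, not taken from the study:

```python
# Rounded Global Warming Potential factors for methane, as quoted above.
GWP_20 = 80    # methane vs. CO2 over a 20-year window
GWP_100 = 28   # methane vs. CO2 over a 100-year window

def co2_equivalent(methane_tonnes: float, gwp: float) -> float:
    """Convert tonnes of methane into tonnes of CO2-equivalent."""
    return methane_tonnes * gwp

# A hypothetical company emitting 1,000 tonnes of methane per year:
methane = 1000.0
print(co2_equivalent(methane, GWP_20))   # 80000.0 tonnes CO2e
print(co2_equivalent(methane, GWP_100))  # 28000.0 tonnes CO2e
```

The same physical emissions differ almost threefold in reported CO2-equivalents depending on the metric, which is the root of the reporting gap described in the study.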
For companies estimating and reporting their greenhouse gas footprint, this lack of harmonization can cause confusion and inaccuracies, as there's no legally binding guidance or consensus for which standard to use.
The authors note that even with their suggested corrections, total methane emissions are still being underestimated, as their calculations only focused on emissions directly produced by the companies they analyzed. Other downstream emissions, such as that which come from sold products, were not included, and are likely significant contributors as well, particularly in the energy sector.
Simone Cenci et al, Lack of harmonisation of greenhouse gases reporting standards and the methane emissions gap, Nature Communications (2025). DOI: 10.1038/s41467-025-56845-3
With microplastics now permeating our food and our bodies, researchers are keen to assess the potential damage these tiny fragments could be doing. A new study shows how plastics may lead to dangerous blood flow blockages in the brain.

The study involved tracking microplastics moving through blood vessels in mouse brains in real time – the first time microplastic movement has been tracked in this way. Using high-resolution laser-based imaging techniques, the researchers found microplastic-laden immune cells becoming lodged inside blood vessels in the cortex area of the brain.

"The data reveal a mechanism by which microplastics disrupt tissue function indirectly through regulation of cell obstruction and interference with local blood circulation, rather than direct tissue penetration," write the researchers in their published paper. This revelation offers a lens through which to comprehend the toxicological implications of microplastics that invade the bloodstream.

The researchers found some similarities between these blockages and blood clots, and also examined the subsequent impact on mouse behavior. Mice with microplastics in their blood performed less well than their plastic-free peers on movement, memory, and coordination tests, pointing to impaired brain function.
Microplastics are defined as plastic fragments less than 5 millimeters (0.2 inches) in diameter. As you might expect, the smaller specks of plastic were found to be less likely to cause blockages than larger ones. While the microplastic blockages were cleared up over the course of a month, and most cognitive behaviors in the mice returned to normal, the researchers suggest there could be links here to neurological problems like depression and anxiety, as well as an increased risk of strokes and cardiovascular disease.
"These findings indicate that mice display multifaceted abnormalities in neurobehavioral regulation, resembling depressive states associated with disrupted cerebral blood flow," write the researchers. While it's not certain that the same processes are happening in human brains – there are significant differences in terms of immune systems and blood vessel sizes – mice are biologically similar enough to us as a species to make this a real concern.
Scientists Just Achieved a Major Milestone in Creating Synthetic Life
After more than a decade of work, researchers have reached a major milestone in their efforts to re-engineer life in the lab, putting together the final chromosome in a synthetic yeast (Saccharomyces cerevisiae) genome. The researchers chose yeast as a way to demonstrate the potential for producing foodstuffs that could survive the rigors of a changing climate or widespread disease.
It's the first time a synthetic eukaryotic genome has been constructed in full, following on from successes with simpler bacterial organisms. It's a proof of concept for how more complex organisms, like food crops, could be synthesized by scientists.

This doesn't mean we can start growing completely artificial yeast from scratch, but it does mean living yeast cells can potentially be entirely recoded – though lots more work is required to refine and scale up the process before that can happen. And the coding analogy is a good one, because the researchers had to spend plenty of time and effort debugging the 16th and final synthetic yeast chromosome (called SynXVI) before the genome functioned as desired.
Trees can cool cities, but only with a little help
Because trees can cool cities by providing shade and evaporating water into the atmosphere, greening city streets is an often-touted strategy for climate change adaptation. But trees provide benefits only if they're healthy, and physical variations in urban environments mean that not all trees have the same chance to thrive.
The researchers set out to identify the cityscape features that set trees up for success and those that may cause them to struggle.
The researchers used data from the ECOSTRESS sensor aboard the International Space Station to map the summer afternoon canopy temperatures. Then they applied machine learning to assess the relationship between these temperatures and various environmental factors, including proximity to water, urbanization, traffic exposure, and surrounding land cover.
They found that proximity to blue and green spaces (areas with water or vegetation) improved tree health, whereas trees in areas with a lot of built structures and impervious surfaces fared worse.
Using this analysis, the researchers created and calculated the combined urban tree index (CUTI)—a metric that considers the fraction of land covered by tree canopy along with the temperature and health of the canopy—to determine how much an area benefits from its trees. The CUTI scale ranges from 0 to 1, with 0 meaning no benefit and 1 meaning maximum benefit.
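The paper builds CUTI from canopy cover together with canopy temperature and health. The published formula is not reproduced here, but a hypothetical sketch shows how such a 0-to-1 index could combine a cover fraction with a normalized temperature score; the function name, thresholds, and weighting below are all assumptions, not the authors' definition:

```python
def cuti_sketch(canopy_fraction: float, canopy_temp_c: float,
                t_cool: float = 25.0, t_hot: float = 45.0) -> float:
    """Hypothetical stand-in for a combined urban tree index: scale the
    0-1 canopy-cover fraction by how cool the canopy runs, so abundant,
    healthy (cool) canopy scores near 1 and sparse or heat-stressed
    canopy scores near 0."""
    # Cooler canopies score higher; clamp the score to the 0-1 range.
    temp_score = (t_hot - canopy_temp_c) / (t_hot - t_cool)
    temp_score = max(0.0, min(1.0, temp_score))
    return canopy_fraction * temp_score

print(cuti_sketch(0.5, 25.0))  # 0.5: half cover, cool healthy canopy
print(cuti_sketch(0.5, 45.0))  # 0.0: half cover, severely heat-stressed
```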
In urban areas where the surroundings will likely cause trees to do poorly, city managers will need to plant more trees and attend to them more carefully than in areas where trees thrive naturally, the authors concluded.
Jean V. Wilkening et al, Canopy Temperature Reveals Disparities in Urban Tree Benefits, AGU Advances (2025). DOI: 10.1029/2024AV001438
How is gold formed? The simple answer here is that we are not certain. However, scientists have some ideas.
Gold, like all elements, formed through high-energy reactions that occurred in various cosmic environments, beginning some 13 billion years ago when the universe started to form.
However, gold deposits—or the concentration of gold in large volumes within rock formations—are believed to occur through various processes, explained by two theories.
The first theory—described by geologist Richard J. Goldfarb—argues that large amounts of gold were deposited in certain areas when continents were expanding and changing shape, around 3 billion years ago. This happened when smaller landmasses, or islands, collided and stuck to larger continents, a process called accretionary tectonics. During these collisions, mineral-rich fluids moved through the Earth's crust, depositing gold in certain areas.
A newer, complementary theory by planetary scientist Andrew Tomkins explains the formation of some much younger gold deposits during the Phanerozoic Eon (roughly the past 540 million years). It suggests that as the Earth's oceans became richer in oxygen during this time, gold became trapped as microscopic particles within another mineral known as pyrite (often called fool's gold). Later, geological processes, such as continental growth (accretion) and heat or pressure changes (metamorphism), released this gold, forming deposits that could be mined.
AI on aircraft can reduce risk of mid-air stalls and sudden drops, study shows
Artificial intelligence aboard aircraft could help prevent terrifying drops in altitude. In a new study, an international research team successfully tested a machine learning system for preventing trouble with turbulence. The findings are published in the journal Nature Communications.
Researchers conducted tests on an AI system designed to enhance the effectiveness of experimental technologies for manipulating airflow on wing surfaces. The results indicate that these innovations work better when paired with deep reinforcement learning (DRL), in which the program adapts to airflow dynamics based on previously learned experiences.
The AI control system zeroes in on one particularly dangerous aerodynamic phenomenon known as flow detachment, or turbulent separation bubbles.
Flow detachment is as serious as it sounds. To stay aloft, airplanes need slow-moving air underneath the wing and fast-moving air above it. The air moving over the wing surface needs to follow the wing shape, or "attach," to the surface. When the air moving over the wing's surface no longer follows the wing shape and instead breaks away, it creates a dangerous swirling or stalled airflow.
This usually occurs when the wing is at a high angle of attack, or when the air slows down due to increasing pressure. When this happens, lift decreases, and drag increases, which can lead to a stall and make the aircraft harder to control.
The researchers report that they can reduce these bubbles by 9%.
The team tested how effectively AI could control experimental devices that pulse air in and out of a small opening in the wing surface, known as synthetic jets. While such innovations are still in the experimental stage, aerospace engineers look at them to complement physical features such as vortex generators that planes rely on to maintain the right balance of airflow above and below the wings.
Up to this point, the prevailing wisdom has been that these bursts should occur at regular periodic intervals. However, the study shows that periodic activation only reduces turbulent separation bubbles by 6.8%.
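The study's controller couples deep reinforcement learning to a flow solver, which is far beyond a snippet. As a loose illustration of the underlying idea of learning an actuation policy from trial and error, here is a toy bandit-style learner; the amplitudes, the surrogate objective, and every number below are invented, not taken from the paper:

```python
import random

# Toy sketch (not the paper's setup): a bandit-style learner picks a
# synthetic-jet pulse amplitude and is rewarded for shrinking a
# surrogate "separation bubble size". The real work uses deep RL
# acting on high-fidelity turbulent flow simulations.
random.seed(0)

AMPLITUDES = [0.0, 0.5, 1.0, 1.5, 2.0]  # hypothetical jet strengths

def bubble_size(amplitude):
    """Surrogate objective: bubble is smallest near amplitude 1.0,
    with a little measurement noise added."""
    return (amplitude - 1.0) ** 2 + random.gauss(0, 0.01)

values = {a: 0.0 for a in AMPLITUDES}  # running mean bubble size per action
counts = {a: 0 for a in AMPLITUDES}

for step in range(2000):
    # Epsilon-greedy action selection: mostly exploit, sometimes explore.
    if random.random() < 0.1:
        a = random.choice(AMPLITUDES)
    else:
        a = min(values, key=values.get)  # smaller estimated bubble is better
    size = bubble_size(a)
    counts[a] += 1
    values[a] += (size - values[a]) / counts[a]  # incremental mean update

best = min(values, key=values.get)
print("best amplitude:", best)  # the learner settles on 1.0
```

The point of the sketch is only the feedback loop: try an actuation, measure the flow response, update the policy. Deep RL replaces the lookup table with a neural network and the toy objective with actual flow physics.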
This study highlights how important AI is for scientific innovation. It offers exciting implications for aerodynamics, energy efficiency and next-generation computational fluid dynamics.
Bernat Font et al, Deep reinforcement learning for active flow control in a turbulent separation bubble, Nature Communications (2025). DOI: 10.1038/s41467-025-56408-6
Sudden vision loss in children: Study points to a novel retinal disorder
A multicenter study led by researchers from the Key Laboratory of Ophthalmology has characterized a distinct retinal disorder in children following high fever illness. The study describes hyperacute outer retinal dysfunction (HORD), a condition marked by sudden bilateral vision loss, photoreceptor disruption, and variable recovery.
Eight pediatric patients between the ages of 3 and 7 experienced severe, sudden-onset vision loss approximately two weeks after a febrile illness. Despite initial poor visual acuity, most showed significant central vision recovery over one year. Comprehensive retinal imaging revealed characteristic ellipsoid zone (EZ) and external limiting membrane (ELM) disruptions. Electroretinography (ERG) findings demonstrated extinguished cone and rod responses, even in cases where vision improved.
In the study, "Hyperacute Outer Retinal Dysfunction," published in JAMA Ophthalmology, researchers examined eight children (16 eyes) referred to pediatric retina services in China. Patients had no prior history of visual impairment and underwent thorough ophthalmic and systemic evaluations. Exclusion criteria included inherited retinal disease, uveitis, and white dot syndromes.
Best-corrected visual acuity (BCVA) was assessed at baseline and during follow-up. Multimodal imaging included color fundus photography, ultra-widefield imaging, optical coherence tomography (OCT), fluorescence angiography, fundus autofluorescence, and electroretinography. Genetic and serological testing was conducted to rule out inherited and autoimmune retinal diseases. Patients received varying immunosuppressive treatments, including corticosteroids, intravenous immunoglobulin and methotrexate.
Initial symptoms included severe bilateral vision loss, nyctalopia, visual field constriction, and dyschromatopsia. At presentation, the patients' mean visual acuity was below the ability to count fingers. OCT imaging showed diffuse EZ and ELM loss, while early fundus findings were largely unremarkable.
By the fourth week, signs of macular recovery appeared. At one year, 88% (7 of 8 patients) achieved visual acuity of 20/40 or better, with 50% (4 of 8) reaching 20/25 or better. Macular EZ and ELM appeared intact in 75% and 88% of eyes, respectively, though extrafoveal regions remained affected. ERG continued to show extinguished rod and cone responses despite visual improvement.
Systemic evaluations were unremarkable. No infectious or autoimmune triggers were identified, although two patients tested positive for specific antiretinal antibodies (anti-PKCγ and anti-Ri). Treatment with corticosteroids and IVIG was initiated in most patients, though a definitive therapeutic effect of treatment remained unclear from the study.
A commentary by Timothy Boyce and Ian Han at the University of Iowa, "Hyperacute Outer Retinal Dysfunction—A Retina on Fire," also published in JAMA Ophthalmology, suggests that HORD may represent a novel inflammatory-mediated retinal disorder.
The authors propose similarities with autoimmune encephalitis, suggesting a possible antibody-mediated mechanism. Early findings, including vitritis, vascular sheathing, and intraretinal hyperreflective dots, point to acute inflammation as a potential driver of retinal damage.
Biologists transform gut bacteria into tiny protein pharmacies
Hundreds of different species of microbes live in your gut. In the future, one of these might serve a new function: microscopic in-house pharmacist.
A new study published Feb. 18 in Nature Biotechnology shows how gut bacteria can be directed to produce and release proteins within the lower gastrointestinal tract—eliminating a major roadblock to delivering drugs to that part of the body.
Oral medication is the most common and practical means of drug administration, but the stomach doesn't let much pass through unscathed. This is good when it comes to things like foodborne pathogens, but gut-focused therapies are regularly deactivated and flushed out.
In an unprecedented workaround, biologists engineered bacteria-eating viruses called phages to infect and reprogram bacterial cells to produce and release a sustained flow of a protein-based drug. Collaborating with immunologists, they showed that this approach could potentially be used to treat chronic diseases.
Bacteriophages (phages for short) are viruses that naturally infect bacteria. Phages are harder to classify than bacteria and are therefore less well understood, but we do know how they attack bacteria.
After attaching to a bacterial cell, phages inject their own DNA and reprogram the cell so that it manufactures more phages—agents of the cell's own destruction. When the bacterial cell eventually succumbs, it explodes into a flood of new phages in a process called lysis. Millions of these events happening simultaneously produce a constant supply of a targeted protein inside the lower intestine.
Even though phages act (and look) like spider aliens, they are regular players on the gut-microbiome home team.
So the biologists engineered special phages that inject a little extra genetic material into the bacterial cell.
In addition to making a flurry of new phages, the instructions prompt the cell to produce a tagalong protein that can lend itself to targeted therapies inside the lower intestines.
Engineered proteins reduced inflammation and obesity in mice.
Burning plastic for cooking and heating: An emerging environmental crisis
A new research paper, "The Use of Plastic as a Household Fuel among the Urban Poor in the Global South," published in Nature Cities, calls for action to reduce the burning of plastic for heating and cooking, a common yet hazardous practice emerging in millions of households in developing nations due to a lack of traditional energy sources.
Researchers investigated the energy consumption of developing countries in Africa, Asia and Latin America, finding many were unable to afford clean fuels such as gas or electricity.
The team also found urban sprawl had made traditional fuels such as wood and charcoal difficult to find, while a lack of waste management meant plastic waste was in abundance.
Burning plastic releases harmful chemicals such as dioxins, furans and heavy metals into the air, which can have a range of health and welfare impacts such as lung diseases, the researchers point out.
These risks are particularly pronounced among women and children, as they spend more time at home.
But the pollution doesn't just stay in the households that burn plastic: it spreads across neighborhoods and cities, affecting everyone.
The issue may affect millions of people who bear the burden of acute inequality in cities and could potentially have a bigger impact as plastic use increases and cities grow.
And many governments are not addressing the issue effectively because it's usually concentrated in areas such as slums, which are often neglected.
Possible ways to address the problem include subsidies for cleaner fuels to make them affordable for poorer families, better waste management to prevent plastic from piling up in slum areas, education campaigns to inform communities about the dangers of burning plastic and alternative low-cost, innovative cooking solutions tailored to lower-income areas, say the experts.
Bishal Bharadwaj et al, The use of plastic as a household fuel among the urban poor in the Global South, Nature Cities (2025). DOI: 10.1038/s44284-025-00201-5
What makes us remember our dreams? How sleep patterns and mindset shape recall
Some people wake up vividly recalling their dreams and can recount precise stories of what they experienced during the night, while others struggle to remember even a single detail. Why does this happen? A new study, published in Communications Psychology, explores the factors that influence so-called "dream recall" (the ability to remember dreams upon awakening) and uncovers which individual traits and sleep patterns shape this phenomenon.
Why people differ so much in recalling dreams remains a mystery. Some studies have found that women, younger people, and those with a tendency to daydream tend to recall their dreams better, but other studies did not confirm these findings.
Other hypotheses, such as the idea that personality traits or cognitive abilities play a role, received even less support from the data. During the recent COVID pandemic, individual differences in morning dream recall attracted renewed public and scientific attention when an abrupt surge in morning dream recall was reported worldwide.
The new research was conducted in the years from 2020 to 2024, and involved over 200 participants, aged 18 to 70, who recorded their dreams daily for 15 days while their sleep and cognitive data were tracked using wearable devices and psychometric tests.
Each study participant was given a voice recorder to report, every day right after awakening, on the experiences they had during sleep. Participants reported whether they remembered having dreamed, whether they had the impression of having dreamed without remembering anything about the experience, and, if they could remember it, the content of the dream.
For the duration of the study, participants also wore an actigraph, a sleep monitoring wristwatch that detects sleep duration, efficiency, and disturbances. At the beginning and end of the dream recording period, participants were subjected to psychological tests and questionnaires that measure various factors, from anxiety levels to interest in dreams, proneness to mind-wandering (the tendency to frequently shift attention away from the task at hand toward unrelated thoughts, or internal reflections), up to memory and selective attention tests.
Dream recall, defined as the probability of waking up in the morning with impressions and memories from a dream experience, showed considerable variability between individuals and was influenced by multiple factors. The study revealed that people with a positive attitude toward dreams and a tendency for mind-wandering were significantly more likely to recall their dreams. Sleep patterns also seemed to play a critical role: individuals who experienced longer periods of light sleep had a greater likelihood of waking with a memory of their dreams.
Younger participants showed higher rates of dream recall, while older individuals often experienced "white dreams" (a sensation of having dreamed without recalling any details). This suggests age-related changes in memory processes during sleep. Moreover, seasonal variations emerged, with participants reporting lower dream recall during winter compared to spring, hinting at the potential influence of environmental or circadian factors.
The findings suggest that dream recall is not just a matter of chance but a reflection of how personal attitudes, cognitive traits, and sleep dynamics interact.
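As a purely hypothetical illustration of how several such factors might combine into a single recall probability, consider a simple logistic model. The predictor names and coefficient values below are invented for illustration; they are not the study's fitted model.

```python
import math

# Hypothetical, illustrative coefficients for standardized predictors
# (invented values, not fitted to the study's data). Positive values
# raise the odds of morning dream recall; negative values lower them.
COEFS = {"attitude": 0.6, "mind_wandering": 0.5,
         "light_sleep": 0.4, "age": -0.3}
INTERCEPT = -0.2

def recall_probability(predictors):
    """Combine standardized predictors through a logistic function."""
    z = INTERCEPT + sum(COEFS[name] * value
                        for name, value in predictors.items())
    return 1.0 / (1.0 + math.exp(-z))
```

In a model of this kind, a young participant with a positive attitude toward dreams, frequent mind-wandering, and plenty of light sleep would land on the high end of the recall-probability scale, while the opposite profile would land on the low end.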
You can cheat your strict and disciplined parents or partner, and you can lie to your doctor or to the people who conduct surveys, but you cannot escape the scientists!
Scientists have developed a breakthrough method to track diet using stool metagenomic data.
Developed by researchers at the Institute for Systems Biology (ISB), the new method, called MEDI (Metagenomic Estimation of Dietary Intake), detects food-derived DNA in stool samples to estimate dietary intake. MEDI leverages stool metagenomics, which refers to sequencing all the DNA present in fecal samples (including microbial, human, and food-derived DNA). This non-invasive, data-driven approach offers an objective alternative to traditional food diaries and questionnaires, which are still the gold standard in dietary assessment but can suffer from misreporting and compliance issues.
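The core idea of inferring diet from food-derived DNA can be sketched in a few lines. The toy code below is a deliberately simplified illustration, not the actual MEDI pipeline: the real method matches sequencing reads against food reference genomes, whereas here each food is represented by a hypothetical set of signature k-mers, and all function names are invented.

```python
def classify_read(read, food_signatures):
    """Return the food whose signature k-mers appear in the read, or None."""
    for food, kmers in food_signatures.items():
        k = len(next(iter(kmers)))  # k-mer length for this food's signatures
        if any(read[i:i + k] in kmers for i in range(len(read) - k + 1)):
            return food
    return None

def estimate_diet(reads, food_signatures):
    """Estimate relative dietary intake as the share of food-matched reads."""
    counts = {food: 0 for food in food_signatures}
    for read in reads:
        hit = classify_read(read, food_signatures)
        if hit is not None:
            counts[hit] += 1
    total = sum(counts.values())  # reads matching any food at all
    return {food: (n / total if total else 0.0) for food, n in counts.items()}
```

For example, with toy signatures `{"tomato": {"ACGTA"}, "wheat": {"TTTTT"}}`, reads containing those k-mers would be assigned to the corresponding foods and the rest ignored, yielding relative proportions. The real system additionally has to handle microbial and human reads, sequencing errors, and genomes shared between closely related foods.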
Using CRISPR to remove extra chromosomes in Down syndrome
Gene editing techniques may eventually allow trisomy to be treated at the cellular level, according to an in vitro proof-of-concept study.
Down syndrome is caused by the presence of a third copy of the 21st chromosome. The condition occurs in approximately 1 in 700 live births and is relatively easy to diagnose at early stages of development. However, there are no treatments.
Researchers used the CRISPR-Cas9 gene editing system to cleave the third chromosome in previously generated trisomy 21 cell lines derived from both pluripotent cells and skin fibroblasts. The technique is able to identify which chromosome has been duplicated, which is necessary to ensure the cell does not end up with two identical copies after removal, but instead has one from each parent.
The researchers were able to remove duplicate chromosomes from both induced pluripotent stem cells and fibroblasts. Suppressing chromosomal DNA repair ability increased the rate of duplicate chromosome elimination.
The authors show that the chromosomal rescue reversibly restores both gene expression and cellular phenotypes. The approach is not yet ready for in vivo application, however, in part because the current technique can also change the retained chromosomes.
Similar approaches could eventually be used in neurons and glial cells and form the basis of novel medical interventions for people with Down syndrome, say the researchers.
Ryotaro Hashizume et al, Trisomic rescue via allele-specific multiple chromosome cleavage using CRISPR-Cas9 in trisomy 21 cells, PNAS Nexus (2025). DOI: 10.1093/pnasnexus/pgaf022
Swimming in some lakes can lead to infection with Legionella, warn scientists
Swimming in some lakes with still water can lead to infection with Legionella, bacteria that can cause pneumonia. People who engage in open-water swimming should be aware of this risk, say the authors of a practice article published in the Canadian Medical Association Journal.
Legionella infection represents a public health hazard owing to its ability to spread through exposure to natural water bodies and human-made water reservoirs.
Legionella infection is an atypical cause of community-acquired pneumonia. Referred to as legionnaires' disease, it presents with fever, fatigue, respiratory symptoms, and sometimes diarrhea. Legionella bacteria thrive in the warm, stagnant water in plumbing systems, air conditioners, public spas, and even lakes and rivers.
Risk factors for legionnaires' disease include age older than 50 years, smoking history, chronic cardiovascular or kidney disease, diabetes, and a compromised immune system.
Stomach cancers make electrical connections with the nervous system to fuel spread, study finds
Researchers have discovered that stomach cancers make electrical connections with nearby sensory nerves and use these malignant circuits to stimulate the cancer's growth and spread.
It is the first time that electrical contacts between nerves and a cancer outside the brain have been found, raising the possibility that many other cancers progress by making similar connections.
Many cancers exploit nearby neurons to fuel their growth, but outside of cancers in the brain, these interactions have been attributed to the secretion of growth factors broadly or through indirect effects.
Now that we know the communication between the two is more direct and electrical, it raises the possibility of repurposing drugs designed for neurological conditions to treat cancer.
The wiring of neurons to cancer cells also suggests that cancer can commandeer a particularly rapid mechanism to stimulate growth.
There are many different cells surrounding cancers, and this microenvironment can sometimes provide a rich soil for their growth.
Researchers have been focusing on the role of the microenvironment's immune cells, connective tissue, and blood vessels in cancer growth but have only started to examine the role of nerves in the last two decades. What's emerged recently is how advantageous the nervous system can be to cancer.
The nervous system works faster than any of these other cells in the tumor microenvironment, which allows tumors to more quickly communicate and remodel their surroundings to promote their growth and survival.
Earlier, the researchers discovered that cutting the vagus nerve in mice with stomach cancer significantly slowed tumor growth and increased survival.
Many different types of neurons are contained in the vagus nerve, but the researchers focused here on sensory neurons, which reacted most strongly to the presence of stomach cancer in mice. Some of these sensory neurons extended themselves deep into stomach tumors in response to a protein released by cancer cells called Nerve Growth Factor (NGF), drawing the cancer cells close to the neurons.
After establishing this connection, tumors signaled the sensory nerves to release calcitonin gene-related peptide (CGRP), inducing electrical signals in the tumor.
Though the cancer cells and neurons may not form classical synapses where they meet—the team's electron micrographs are still a bit fuzzy—"there's no doubt that the neurons create an electric circuit with the cancer cells," the researchers say. "It's a slower response than a typical nerve-muscle synapse, but it's still an electrical response." The researchers could see this electrical activity with calcium imaging, a technique that uses fluorescent tracers that light up when calcium ions surge into a cell as an electrical impulse travels through.
"There's a circuit that starts from the tumor, goes up toward the brain, and then turns back down toward the tumor again," they confirm. "It's like a feed-forward loop that keeps stimulating the cancer and promoting its growth and spread."
For stomach cancer, CGRP inhibitors that are currently used to treat migraines could potentially short-circuit the electrical connection between tumors and sensory neurons.
Synthetic diamond with hexagonal lattice outshines the natural kind with unprecedented hardness
A team of physicists, materials scientists and engineers has grown a diamond that is harder than those found in nature. In their project, reported in the journal Nature Materials, the group developed a process that involves heating and compressing graphite to create synthetic diamonds.
Diamonds are a prized gem the world over. Their sparkling appearance has made them one of the most treasured gemstones throughout human history. In more recent times, because they are so hard, diamonds have also been used in commercial applications, such as drilling holes through other hard materials. Such attributes have kept the price of diamonds high. Because of that, scientists have developed ways to synthesize them, and today, a host of synthetic diamonds are available for sale.
Researchers have previously pursued harder diamonds with hexagonal lattices rather than the cubic lattices in which most natural and synthetic diamonds form. However, past attempts to make hexagonal-lattice synthetic diamonds have yielded diamonds that were too small and of low purity.
In this new effort, the research team tried a different approach. They developed a process that involved heating graphite samples to high temperatures inside a high-pressure chamber. By adjusting the parameters of their setup, the researchers found they could get the graphite to grow into a synthetic diamond with a hexagonal lattice.
The first diamond made by the group was millimeter-sized, with a measured hardness of 155 GPa and thermal stability up to 1,100°C. Natural diamonds typically range from 70 to 100 GPa in hardness and can only withstand temperatures up to 700°C.
The researchers note that it is unlikely diamonds made using their technique would be used for jewelry; instead, they would be used for drilling and machining applications. They also note that they might be used in other ways, such as for data storage or in thermal management applications.
Desi Chen et al, General approach for synthesizing hexagonal diamond by heating post-graphite phases, Nature Materials (2025). DOI: 10.1038/s41563-025-02126-9
Artificial sweetener triggers insulin spike, leading to blood vessel inflammation in mice
From diet soda to zero-sugar ice cream, artificial sweeteners have been touted as a guilt-free way to indulge our sweet tooth. However, new research published in Cell Metabolism shows that aspartame, one of the most common sugar substitutes, may impact vascular health.
The team of cardiovascular health experts and clinicians found that aspartame triggers increased insulin levels in animals, which in turn contributes to atherosclerosis—buildup of fatty plaque in the arteries, which can lead to higher levels of inflammation and an increased risk of heart attacks and stroke over time.
Previous research has linked consumption of sugar substitutes to increased chronic disorders like cardiovascular disease and diabetes. However, the mechanisms involved were previously unexplored.
For this study, the researchers fed mice daily doses of food containing 0.15% aspartame for 12 weeks—an amount that corresponds to consuming about three cans of diet soda each day for humans.
Compared to mice without a sweetener-infused diet, aspartame-fed mice developed larger and more fatty plaques in their arteries and exhibited higher levels of inflammation, both of which are hallmarks of compromised cardiovascular health.
When the team analyzed the mice's blood, they found a surge in insulin levels after aspartame entered their system. The team noted that this wasn't a surprising result, given that our mouths, intestines, and other tissues are lined with sweetness-detecting receptors that help guide insulin release. But aspartame, 200 times sweeter than sugar, seemed to trick the receptors into releasing more insulin.
The researchers then demonstrated that the mice's elevated insulin levels fueled the growth of fatty plaques in the mice's arteries, suggesting that insulin may be the key link between aspartame and cardiovascular health.
Next, they investigated how exactly elevated insulin levels lead to arterial plaque buildup and identified an immune signal called CX3CL1 that is especially active under insulin stimulation.
Because blood flow through the artery is strong and robust, most chemicals are quickly washed away as the heart pumps. Surprisingly, CX3CL1 is not: it stays glued to the surface of the inner lining of blood vessels. There, it acts like bait, catching immune cells as they pass by.
Many of these trapped immune cells are known to stoke blood vessel inflammation. However, when researchers eliminated CX3CL1 receptors from one of the immune cells in aspartame-fed mice, the harmful plaque buildup didn't occur. These results point to CX3CL1's role in aspartame's effects on the arteries.
Artificial sweeteners have penetrated almost all kinds of food, so we have to be careful about consuming such products, say the researchers.
Cancer cells cooperate to scavenge for nutrients, scientists discover
Cancer cells work together to source nutrients from their environment—a cooperative process that was previously overlooked by scientists but may be a promising target for treating cancer.
Researchers identified cooperative interactions among cancer cells that allow them to proliferate. Thinking about the mechanisms that tumor cells exploit can inform future therapies.
Scientists have long known that cancer cells compete with one another for nutrients and other resources. Over time, a tumor becomes more and more aggressive as it becomes dominated by the strongest cancer cells.
However, ecologists also know that living organisms cooperate, particularly under harsh conditions. For instance, penguins form tight huddles to conserve heat during the extreme cold of winter, with their huddles growing in size as temperatures drop—a behavior not observed during warmer months.
Similarly, microorganisms such as yeast work together to find nutrients, but only when facing starvation.
Cancer cells also need nutrients to thrive and replicate into life-threatening tumors, but they live in environments where nutrients are scarce. Is it possible that cancer cells work together to scavenge for resources?
To determine whether cancer cells cooperate, the researchers tracked the growth of cells from different types of tumors. Using a robotic microscope and image analysis software they developed, they quickly counted millions of cells under hundreds of conditions over time.
This approach allowed them to examine tumor cultures at a range of densities—from sparsely populated dishes to those crowded with cancer cells—and with different levels of nutrients in the environment.
It is well known that cells take up amino acids in a competitive manner. But when the cancer cells studied were starved of amino acids such as glutamine, all of the cell types tested showed a strong tendency to work together to acquire the available nutrients.
Surprisingly, the researchers observed that limiting amino acids benefited larger cell populations, but not sparse ones, suggesting that this is a cooperative process that depends on population density. It became really clear that there was true cooperation among tumor cells.
Through additional experiments with skin, breast, and lung cancer cells, the researchers determined that a key source of nutrients for cancer cells comes from oligopeptides—pieces of proteins made up of small chains of amino acids—found outside the cell.
Where this process becomes cooperative is that instead of grabbing these peptides and ingesting them internally, tumor cells secrete a specialized enzyme that digests the peptides into free amino acids. Because this happens outside the cells, the result is a shared pool of amino acids that becomes a common good. Identifying the secreted enzyme, called CNDP2, was an important discovery.
Mountain ranges could be hidden treasure troves of natural hydrogen, plate tectonic modeling finds
The successful development of sustainable georesources for the energy transition is a key challenge for humankind in the 21st century. Hydrogen gas (H2) has great potential to replace current fossil fuels while simultaneously eliminating the associated emission of CO2 and other pollutants.
However, a major obstacle is that H2 must be produced first. Current synthetic hydrogen production is at best based on renewable energies but it can also be polluting if fossil energy is used.
The solution may be found in nature, since various geological processes can generate hydrogen. Yet, until now, it has remained unclear where we should be looking for potentially large-scale natural H2 accumulations.
A team of researchers presents an answer to this question: using plate tectonic modeling, they found that mountain ranges in which originally deep mantle rocks occur near the surface represent potential natural hydrogen hotspots.
Such mountain ranges may not only be ideal geological environments for large-scale natural H2 generation, but also for forming large-scale H2 accumulations that can be drilled for H2 production.
The results of this research have been published in Science Advances.
Natural hydrogen can be generated in several ways, for instance by bacterial transformation of organic material or splitting of water molecules driven by decay of radioactive elements in the Earth's continental crust.
As a result, the occurrence of natural H2 is reported in many places worldwide. The general viability of natural hydrogen as an energy source has already been proven in Mali, where limited volumes of H2 originating from iron-rich sedimentary layers are produced through boreholes in the subsurface.
However, the most promising mechanism for large-scale natural hydrogen generation is a geological process in which mantle rocks react with water. The minerals in the mantle rocks change their composition and form new minerals of the so-called serpentine group, as well as H2 gas.
This process is called serpentinization. Mantle rocks are normally situated at great depth, below the Earth's crust. In order for these rocks to come into contact with water and serpentinize, they must be tectonically exhumed, i.e. brought near the Earth's surface. There are two main plate tectonic environments in which mantle rocks are exhumed and serpentinized over the course of millions of years: 1) ocean basins that open as continents break apart during rifting, allowing the mantle to rise as the overlying continental crust is thinned and eventually split (for example in the Atlantic Ocean), and 2) subsequent basin closure and mountain building as continents move back together and collide, allowing mantle rocks to be pushed up toward the surface (for example in the Pyrenees and Alps).
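For illustration, one commonly cited idealized serpentinization-type reaction, written here for the iron-bearing end-member of olivine (fayalite), shows where the hydrogen comes from: water oxidizes the iron in the mineral and is itself reduced to H2 gas:

3 Fe2SiO4 (fayalite) + 2 H2O → 2 Fe3O4 (magnetite) + 3 SiO2 + 2 H2

In real magnesium-rich mantle rocks the reactions are more complex and also produce the serpentine-group minerals that give the process its name.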
A thorough understanding of how such tectonic environments evolve is key to properly assess their natural hydrogen potential. Using a state-of-the-art numerical plate tectonic modeling approach, calibrated with data from natural examples, the GFZ-led research team simulated the full plate tectonic evolution from initial rifting to continental break-up, followed by basin closure and mountain building.
In these simulations, the researchers were able to determine for the first time where, when, and how much mantle rocks are exhumed in mountains, and when these rocks may be in contact with water at favorable temperatures, to allow for efficient serpentinization and natural hydrogen generation.
It turns out that conditions for serpentinization and thus natural H2 generation are considerably better in mountain ranges than in rift basins. Due to the comparably colder environment in mountain ranges, larger volumes of exhumed mantle rocks are found at favorable serpentinization temperatures of 200–350°C, and at the same time, plenty of water circulation along large faults within the mountains can allow for their serpentinization potential to be realized.
As a result, the annual hydrogen generation capacity in mountain ranges can be up to 20 times greater than in rift environments. In addition, suitable reservoir rocks (for example sandstones) required for the accumulation of economically viable natural H2 volumes are readily available in mountain ranges, but are likely absent during serpentinization and hydrogen generation in the deeper parts of rift basins.
Dr. Krishna Kumari Challa
Unlike female sexual traits, the loss of male sexual traits is generally thought to take an extremely long time. In many other species, even rare males often retain their reproductive capabilities. However, these findings suggest that R. mikado has relied solely on parthenogenesis for such an extended period that even neutral mutations have accumulated, leading to the complete loss of male reproductive traits.
This study demonstrates that parthenogenesis in R. mikado has become irreversible. Although asexual reproduction is often considered evolutionarily short-lived due to the lack of genetic recombination, previous research estimated that this species has persisted for hundreds of thousands of years. How has R. mikado managed to survive for such a long time? This remains an intriguing mystery for future research.
Tomonari Nozaki et al, Lack of successful sexual reproduction suggests the irreversible parthenogenesis in a stick insect, Ecology (2025). DOI: 10.1002/ecy.4522
Part 2
Feb 12
The secret behind sharp vision: New research reveals the benefits of tiny eye movements
Even when we think we are holding our gaze perfectly still, our eyes make tiny, involuntary movements. While these "fixational eye movements" might seem like they would blur our vision, new research reveals they actually help us see fine details more clearly.
In a study combining theoretical modeling and human experiments, researchers and their collaborators have uncovered how these microscopic eye movements enhance rather than impair our visual acuity.
Using advanced eye-tracking technology and computational models, the team demonstrated that these movements help our retinas process visual information more effectively. Their paper is published in the Proceedings of the National Academy of Sciences.
It is a fascinating paradox, say the researchers. These constant, tiny movements of our eyes might appear to make our vision less precise, but they actually optimize the way our retinas encode visual information. We found that humans naturally maintain these movements within a nearly perfect range for enhanced visual acuity.
The researchers found that these movements help by 'refreshing' the content of our visual receptors while maintaining an optimal balance between motion and stability. They also found that in the experiment, the movements adapt to the size of the object shown.
The study was conducted using a sophisticated adaptive optics scanning laser ophthalmoscope, allowing researchers to track these minute eye movements with unprecedented precision while participants performed visual tasks. The researchers then combined theoretical modeling with empirical data to link eye movements to retinal neural coding and human behavior.
Trang-Anh E. Nghiem et al, Fixational eye movements as active sensation for high visual acuity, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2416266122
Climate change may be delaying births, suggests study
New research has found exposure to outdoor air pollution and extreme temperatures during pregnancy may increase the risk of prolonged pregnancy, offering new insights into the impact of climate change on maternal health.
Published in Urban Climate, the study is titled "Maternal climate-related exposures and prolonged pregnancy: Findings from a statewide population-based cohort study in Western Australia."
The study analyzed data from nearly 400,000 births in Western Australia and found that higher exposure to fine particulate air pollution (PM2.5) and biothermal stress (a measure that combines air temperature, radiant temperature, relative humidity, wind speed, and human physiology) was associated with pregnancies lasting beyond 41 weeks.
While climate exposure has long been linked to preterm births, this is the first study to examine its impact on prolonged pregnancies.
These findings show that exposure to air pollution and biothermal stress during pregnancy increases the likelihood of prolonged pregnancies, particularly among mothers over 35 years old, first-time mothers, those living in urban areas, and those with complicated pregnancies.
"Environmental stressors, including climate-related exposures during pregnancy, have been associated with maternal stress response and subsequent disruptions in endocrine and inflammatory activities, which increase towards the end of pregnancy. This can either shorten gestation, leading to preterm birth, or lengthen gestation, resulting in prolonged pregnancy in some cases."
Prolonged pregnancy can have serious health implications for both mother and baby, including the need for medical interventions such as labor induction or cesarean sections, increased risk of stillbirth, birth complications, child mortality, early childhood behavioral and emotional problems, and emotional impacts on families.
This study highlights the need for targeted policies and preventative measures to reduce climate-related health risks, including better air quality regulations and public health initiatives aimed at protecting expectant mothers and children from extreme climatic conditions.
Sylvester Dodzi Nyadanu et al, Maternal climate-related exposures and prolonged pregnancy: Findings from a statewide population-based cohort study in Western Australia, Urban Climate (2025). DOI: 10.1016/j.uclim.2025.102316
Method to measure blood-brain barrier permeability accurately developed
For decades, scientists across the globe have investigated methods to accurately measure drug permeability across the blood-brain barrier, a compact layer of cells that protect the brain from potentially dangerous substances and microbes. They struggled with a number of parameters, such as blood flow and binding to plasma proteins, which were shown to impact permeability in different ways.
In research published in the December 2024 issue of Fluids and Barriers of the CNS ("Brain endothelial permeability, transport and flow assessed over 10 orders of magnitude using the in situ brain perfusion technique"), researchers sought to reconcile discrepancies in the field and provide accurate methods for measuring permeability over a very broad range, spanning from poorly crossing polar compounds (molecules with an uneven distribution of electrical charge) to rapidly crossing approved central nervous system (CNS) clinical drugs.
The project team evaluated 120 compounds, revealing that many current CNS drugs permeate the barrier and equilibrate in the brain in less than 10 minutes. The findings challenged previous literature and demonstrated that the equilibration rate for a significant number of CNS drugs is much greater than previously realized.
The researchers showed that many of the drugs that are used and approved for CNS uptake enter the brain quite well. A good number of agents among the benzodiazepines, antidepressants, antipsychotics, stimulants and antiepileptics go in as quickly as the blood flow can deliver them. It is amazingly rapid. For such agents, the team needed extremely accurate measurements of cerebral blood flow.
The project also highlighted the role of plasma proteins, which, for many lipophilic agents, can serve an additional brain delivery role beyond that of free drug in plasma. In effect, plasma-bound drugs can dissociate to maintain the intravascular free drug concentration, which would otherwise be rapidly depleted under conditions of higher extraction (50–99%).
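The competing influences of blood flow and permeability described here are commonly summarized by the Crone-Renkin relation, in which the single-pass extraction fraction is E = 1 − exp(−PS/F). A minimal sketch with illustrative values (the parameter values are assumptions for demonstration, not the paper's measurements):

```python
import math

def extraction_fraction(ps, flow):
    """Crone-Renkin relation: fraction of drug extracted in a single
    capillary pass, given the permeability-surface area product PS
    and cerebral blood flow F (same units, e.g. mL/min/g)."""
    return 1.0 - math.exp(-ps / flow)

flow = 1.0  # illustrative cerebral blood flow, mL/min/g

# A highly permeable CNS drug (PS >> F) is flow-limited: nearly all
# of it is extracted, so accurate flow measurement is essential.
print(extraction_fraction(ps=10.0, flow=flow))

# A poorly crossing polar compound (PS << F) is permeability-limited:
# only a tiny fraction is extracted per pass.
print(extraction_fraction(ps=0.01, flow=flow))
```

In the flow-limited regime, uptake reports on blood flow rather than on the barrier itself, which is why the study needed flow measured precisely before permeability could be inferred.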
Quentin R. Smith et al, Brain endothelial permeability, transport, and flow assessed over 10 orders of magnitude using the in situ brain perfusion technique, Fluids and Barriers of the CNS (2024). DOI: 10.1186/s12987-024-00584-y
Feb 12
Dr. Krishna Kumari Challa
Rising pollen levels linked to increased mortality in older adults
As climate change intensifies pollen seasons across some regions, new research reveals a connection between pollen exposure and death rates among older adults with breathing problems.
The study, published in BMC Public Health, shows that high pollen days aren't just an inconvenience for allergy sufferers—they could pose serious health risks for vulnerable populations. With pollen seasons growing longer and more intense, understanding these risks has become increasingly urgent for public health officials and health care providers.
The study found that high levels of certain pollen, particularly from deciduous trees and ragweed, were linked to increased risk of death from breathing problems. The effects could last up to two weeks after exposure.
The findings suggest that exposure to certain types of pollen can increase the risk of death from breathing-related problems, particularly for people with chronic conditions. This is especially concerning given expectations that climate change will exacerbate the severity of pollen seasons in coming years.
The researchers looked at four types of pollen: deciduous tree pollen from trees that lose their leaves, evergreen tree pollen, grass pollen and ragweed pollen.
While not everyone is equally sensitive to pollen, the findings highlight the importance of tracking pollen levels and taking precautions during high pollen days, especially for older adults with breathing problems, the researchers say. And, they add, with predicted climate change, preparing for the risks will be increasingly important for public health.
Peter S. Larson et al, Chronic and infectious respiratory mortality and short-term exposures to four types of pollen taxa in older adults in Michigan, 2006-2017, BMC Public Health (2025). DOI: 10.1186/s12889-025-21386-3
Feb 12
Dr. Krishna Kumari Challa
Cockatoos prefer their noodles dunked in blueberry yogurt: First evidence of non-primate food flavoring behavior
Researchers are reporting that Goffin's cockatoos (Cacatua goffiniana) engage in food flavoring behavior by dunking food into soy yogurt. Experimentally controlled tests confirmed that the birds selectively dipped food in flavored yogurt rather than neutral alternatives, ruling out alternative explanations such as soaking or cleaning.
Dunking behavior in non-human animals is considered a foraging innovation. It is typically associated with softening dry food, cleaning, flavoring, drowning prey, or transporting liquid.
Prior research has documented various species engaging in dunking, though reports on food flavoring behavior are rare.
Previously, members of the same group of cockatoos exhibited innovative dunking behavior to soak dry food.
The observations that led to the study titled "Innovative Flavoring Behavior in Goffin's Cockatoos," published in Current Biology, began when two cockatoos were seen dunking cooked potato pieces into blueberry-flavored soy yogurt.
To systematically investigate this behavior, researchers conducted fourteen 30-minute observation sessions at breakfast time. Eighteen cockatoos were given access to a food bowl containing potatoes or noodles, along with three dunking mediums: fresh water, blueberry-flavored soy yogurt, and neutral soy yogurt.
Nine out of 18 cockatoos engaged in dunking, preferring noodles over potatoes (an average of 12 times per bird vs. 6 times per bird).
Statistical analysis showed that food was dunked in blueberry yogurt more than twice as often as in neutral yogurt, while no food was dunked in water. The birds also preferred directly eating blueberry yogurt over the neutral variety.
Part 1
Feb 13
Dr. Krishna Kumari Challa
To rule out alternative explanations for the behavior, researchers tested whether dunking functioned as soaking, cleaning, food transport, or tool use. Cockatoos left food in yogurt for an average of 3.2 seconds, significantly shorter than the 22.9 seconds previously observed for water-soaking behavior.
The absence of dunking in water, combined with eating the food with yogurt rather than licking it off, supported the interpretation that dunking was intended for flavoring. A separate test for color preference between yogurts found no significant difference in selection, indicating that dunking choices were based on flavor rather than visual cues.
Food preference testing further revealed that the birds preferred the combination of noodles and blueberry yogurt over noodles or yogurt alone. Potatoes, by contrast, were acceptable to the birds without flavoring.
This study provides the first experimental evidence of food flavoring behavior outside the primate lineage. While the cognitive mechanisms behind the innovation remain unclear, researchers note that Goffin's cockatoos demonstrate high cognitive abilities, including problem-solving and sequential planning.
Jeroen Stephan Zewald et al, Innovative flavoring behavior in Goffin's cockatoos, Current Biology (2025). DOI: 10.1016/j.cub.2025.01.002
Part 2
Feb 13
Dr. Krishna Kumari Challa
Engineered animals show new way to fight mercury pollution
Scientists have found an effective new way to clean up methylmercury, one of the world's most dangerous pollutants, which often builds up in our food and environment because of industrial activities such as illegal gold mining and burning coal. The discovery, published in Nature Communications on 12 February 2025, could lead to new ways of engineering animals to protect both wildlife and human health.
The researchers have successfully genetically modified fruit flies and zebrafish to transform methylmercury into a far less harmful gas that disperses in air.
So we can now use synthetic biology to convert the most environmentally harmful form of mercury and evaporate it out of an animal!
Methylmercury causes environmental harm due to its high bioavailability and poor excretion: it can easily cross the digestive tract, the blood-brain barrier, and the placenta and becomes increasingly concentrated as it moves up through food webs to levels that can cause harm to neural and reproductive health.
The research team modified the DNA of fruit flies and zebrafish by inserting variants of genes from bacteria to make two enzymes that together can convert methylmercury to elemental mercury, which evaporates from the animals as a gas.
When they tested the modified animals, they found that not only did the animals have less than half as much mercury in their bodies, but the majority of the remaining mercury was in a much less bioavailable form than methylmercury.
The researchers included safety measures to ensure the modified organisms cannot spread uncontrollably in nature, and they also highlight the need for regulatory control for any real-world use.
Methylmercury demethylation and volatilization by animals expressing microbial enzymes, Nature Communications (2025). DOI: 10.1038/s41467-025-56145-w
Feb 13
Dr. Krishna Kumari Challa
Earth's early cycles shaped the chemistry of life
How did non-living chemicals become biochemicals on early Earth? It is one of the most commonly asked questions about the origin of life.
A new study explores how complex chemical mixtures change under shifting environmental conditions, shedding light on the prebiotic processes that may have led to life. By exposing organic molecules to repeated wet-dry cycles, researchers observed continuous transformation, selective organization, and synchronized population dynamics.
Their findings, appearing in Nature Chemistry, suggest that environmental factors played a key role in shaping the molecular complexity needed for life to emerge.
To simulate early Earth, the team subjected chemical mixtures to repeated wet-dry cycles. Rather than reacting randomly, the molecules organized themselves, evolved over time, and followed predictable patterns.
This challenges the idea that early chemical evolution was chaotic. Instead, the study suggests that natural environmental fluctuations helped guide the formation of increasingly complex molecules, eventually leading to life's fundamental building blocks.
The new work investigates how chemical mixtures evolve over time, illuminating potential mechanisms that contributed to the emergence of life on Earth.
The research examines how chemical systems can undergo continuous transformation while maintaining structured evolution, offering new insights into the origins of biological complexity.
Chemical evolution refers to the gradual transformation of molecules in prebiotic conditions, a key process in understanding how life may have arisen from non-living matter. While much research has focused on individual chemical reactions that could lead to biological molecules, this study establishes an experimental model to explore how entire chemical systems evolve when exposed to environmental changes.
The researchers used mixtures containing organic molecules with diverse functional groups, including carboxylic acids, amines, thiols, and hydroxyls.
By subjecting these mixtures to repeated wet-dry cycles—conditions that mimic the environmental fluctuations of early Earth—the study identified three key findings: chemical systems can continuously evolve without reaching equilibrium, avoid uncontrolled complexity through selective chemical pathways, and exhibit synchronized population dynamics among different molecular species.
These observations suggest that prebiotic environments may have played an active role in shaping the molecular diversity that eventually led to life.
Part 1
Feb 13
Dr. Krishna Kumari Challa
This research offers a new perspective on how molecular evolution might have unfolded on early Earth.
By demonstrating that chemical systems can self-organize and evolve in structured ways, this work provides experimental evidence that may help bridge the gap between prebiotic chemistry and the emergence of biological molecules.
Beyond its relevance to origins-of-life research, the study's findings may have broader applications in synthetic biology and nanotechnology. Controlled chemical evolution could be harnessed to design new molecular systems with specific properties, potentially leading to innovations in materials science, drug development, and biotechnology.
Evolution of Complex Chemical Mixtures Reveals Combinatorial Compression and Population Synchronicity, Nature Chemistry (2025). DOI: 10.1038/s41557-025-01734-x
Part 2
Feb 13
Dr. Krishna Kumari Challa
Mimicry: Masquerading moth deploys specialized nanostructures to evade predators
Researchers found the forewings of the fruit-sucking moth (Eudocima aurantia) have the appearance of a crumpled leaf—but are in fact flat.
They published their research in Current Biology this week.
They found the moth mimics the 3D shape and coloration of a leaf using specialized nanostructures on its wings. These nanostructures create a shiny wing surface that mimics the highlights found on a smooth, curved leaf surface.
Structural and pigmentary coloration produces a leaf-like brown color, with the moth exploiting thin-film reflectors to produce directional reflections—producing the illusion of a 3D leaf shape.
It is intriguing that the nanostructures that produce shininess occur only on the parts of the wing that would be curved if the wing were a leaf.
This suggests that moths are exploiting the way predators perceive 3D shapes to improve their camouflage, which is very impressive.
What is remarkable about this moth, however, is that it is creating the appearance of a three-dimensional object despite being almost completely flat.
This mimicry likely serves as a camouflage strategy, fooling predators into misidentifying the moth as an inedible object.
Jennifer L. Kelley et al, A leaf-mimicking moth uses nanostructures to create 3D leaf shape appearance, Current Biology (2025). DOI: 10.1016/j.cub.2025.01.029. www.cell.com/current-biology/f … 0960-9822(25)00059-4
Feb 13
Dr. Krishna Kumari Challa
Diabetes can drive the evolution of antibiotic resistance, study suggests
Antibiotics are powerful, fast-acting medications designed to eradicate bacterial infections. However, in recent years, their dependability has waned as antibiotic-resistant bacteria continue to evolve and spread.
Staphylococcus aureus is a leading cause of antibiotic-resistance-associated infections and deaths. It is also the most prevalent bacterial infection among those with diabetes mellitus, a chronic condition that affects blood sugar control and reduces the body's ability to fight infections.
Researchers have just shown that people with diabetes are more likely to develop antibiotic-resistant strains of Staph, too.
Their results, which were published in Science Advances, show how the diabetic microbial environment produces resistant mutations, while hinting at ways antibiotic resistance can be combated in this patient population.
The researchers found that antibiotic resistance emerges much more rapidly in diabetic models than in non-diabetic models of disease.
This interplay between bacteria and diabetes could be a major driver of the rapid evolution and spread of antibiotic resistance that we are seeing.
Diabetes affects the body's ability to control a type of sugar called glucose, often causing excess glucose to build up in the bloodstream. Staph feeds off these high sugar levels, allowing it to reproduce more rapidly. The bacterium can also grow without consequence, as diabetes impairs the immune system's ability to destroy invading cells and control infection.
As the numbers of bacteria increase in a diabetic infection, so does the likelihood of resistance. Random mutations appear and some build up resistance to external stressors, like antibiotics. Once a resistant mutant is present in a diabetic infection, it rapidly takes over the population, using the excess glucose to drive its rapid growth.
Staphylococcus aureus is uniquely suited to take advantage of this diabetic environment.
Once that resistant mutation happens, the excess glucose is there and the immune system isn't able to clear the mutant, so it takes over the entire bacterial population in a matter of days.
This was borne out in their experiments and models.
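The takeover dynamic described here can be illustrated with a toy exponential growth model; the rates and starting numbers below are illustrative assumptions, not the paper's measured values:

```python
import math

def simulate(days, sensitive0=1e6, resistant0=1.0,
             growth=2.0, clearance=0.5, antibiotic_kill=2.5):
    """Toy per-day exponential model of a diabetic infection under
    antibiotic treatment. Excess glucose drives a high growth rate,
    immune clearance is weak, and the antibiotic removes only the
    sensitive strain (all rates are illustrative assumptions)."""
    sensitive = sensitive0 * math.exp((growth - clearance - antibiotic_kill) * days)
    resistant = resistant0 * math.exp((growth - clearance) * days)
    return sensitive, resistant

# A single resistant mutant among a million sensitive cells...
s, r = simulate(days=7)
# ...dominates the population within a week of treatment.
print(f"resistant fraction after a week: {r / (s + r):.3f}")
```

The point of the sketch is the asymmetry: the antibiotic drives the sensitive majority down while abundant glucose lets the lone resistant cell grow unchecked, so the crossover happens in days rather than weeks.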
So, what can be done to prevent it? The researchers showed that reducing blood sugar levels in diabetic models (through administration of insulin) deprived the bacteria of their fuel, keeping their numbers at bay and reducing the chances of antibiotic-resistant mutations occurring.
Their findings suggest that controlling blood sugar through insulin use could be key in preventing antibiotic resistance.
John Shook et al, Diabetes Potentiates the Emergence and Expansion of Antibiotic Resistance, Science Advances (2025). DOI: 10.1126/sciadv.ads1591. www.science.org/doi/10.1126/sciadv.ads1591
Feb 13
Dr. Krishna Kumari Challa
The sexome's forensic potential: After intercourse, both partners leave traces of their own unique genital microbiome
Criminal investigations of heterosexual sexual assault often include a DNA analysis of the woman's genitals with the aim of identifying the presence of the perpetrator's sperm for proof of intercourse. However, in cases where no sperm is detected, including in assaults where the perpetrator uses a condom, these exams are often ineffective.
In research published in iScience on February 12, 2025, researchers show that bacterial species are transferred between both individuals during sexual intercourse, and these species can be traced to a sexual partner's unique genital microbiome.
The authors say that analyses of these genital microorganisms—which they called the "sexome"—may be useful in identifying perpetrators of sexual assault.
This research is based on the forensic concept that every contact leaves a trace.
In this study, the researchers confirmed that both men and women have unique populations of bacteria in their genital areas. They then recruited 12 monogamous, heterosexual couples to investigate whether these sexomes are transferred during sexual intercourse, including when a condom is used.
At the beginning of the study, each participant collected samples of their genital microbiome using swabs. The investigators used rRNA gene sequencing to determine which bacterial strains were present—down to the sub-species level—and identified microbial signatures for each participant.
Couples were then asked to abstain from sex for varying lengths of time (from two to 14 days) and then to participate in intercourse. Afterwards, samples were collected again from each individual's genital microbiome. Analysis showed that a participant's unique bacterial signature could be identified in their sexual partner's sample following intercourse.
Three of the couples reported using a condom. The analysis found that although this did have some impact on the transfer of microbial content, it did not inhibit it entirely.
When a condom was used, the majority of transfer occurred from the female to the male.
This shows promise for a means of testing a perpetrator post-assault and means there may be microbial markers that detect sexual contact even when a condom was used.
The investigators also looked at whether males were circumcised and whether the participants had pubic hair, but found that neither factor seemed to affect the transfer of bacterial species between partners. However, they did find that the makeup of the vaginal microbiome changed during menstruation, which they note could affect results.
You can escape from the police and the law, but you cannot escape from science and scientists. Can you?
Bacterial transfer during sexual intercourse as a tool for forensic detection, iScience (2025). DOI: 10.1016/j.isci.2025.111861. www.cell.com/iscience/fulltext … 2589-0042(25)00121-X
Feb 13
Dr. Krishna Kumari Challa
Physicists uncover evidence of two arrows of time emerging from the quantum realm
What if time is not as fixed as we thought? Imagine that instead of flowing in one direction—from past to future—time could flow forward or backwards due to processes taking place at the quantum level. This is the thought-provoking discovery made by researchers, as a new study reveals that opposing arrows of time can theoretically emerge from certain quantum systems.
For centuries, scientists have puzzled over the arrow of time—the idea that time flows irreversibly from past to future. While this seems obvious in our experienced reality, the underlying laws of physics do not inherently favor a single direction. Whether time moves forward or backwards, the equations remain the same.
One way to see this: when you watch a process like spilled milk spreading across a table, it is clear that time is moving forward. But if you were to play that in reverse, like a movie, you'd immediately know something was wrong—it would be hard to believe milk could just gather itself back into a glass.
However, there are processes, such as the motion of a pendulum, that look just as believable in reverse. The puzzle is that, at the most fundamental level, the laws of physics resemble the pendulum; they do not account for irreversible processes.
The new findings suggest that while our common experience tells us that time only moves one way, we are simply unaware that the opposite direction would have been equally possible.
The study, published in Scientific Reports, explored how a quantum system—the world of the sub-atomic—interacts with its environment, known as an "open quantum system."
Researchers investigated why we perceive time as moving in one direction, and whether this perception emerges from open quantum mechanics.
Part 1
Feb 14
Dr. Krishna Kumari Challa
To simplify the problem, the team made two key assumptions. First, they treated the vast environment surrounding the system in such a way that they could focus only on the quantum system itself. Second, they assumed that the environment—like the entire universe—is so large that energy and information dissipate into it, never returning.
This approach enabled them to examine how time emerges as a one-way phenomenon, even though, at the microscopic level, time could theoretically move in both directions.
Even after applying these assumptions, the system behaved the same way whether time moved forward or backwards. This discovery provided a mathematical foundation for the idea that time-reversal symmetry still holds in open quantum systems—suggesting that time's arrow may not be as fixed as we experience it.
"The surprising part of this project was that even after making the standard simplifying assumption to our equations describing open quantum systems, the equations still behaved the same way whether the system was moving forwards or backwards in time," say the researchers.
When the physicists carefully worked through the math, they found that this behavior had to be the case because a key part of the equation, the 'memory kernel,' is symmetrical in time.
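The structure being described matches the standard memory-kernel (Nakajima-Zwanzig) form of an open-quantum-system master equation; schematically (a textbook form, not reproduced from the paper itself):

```latex
\frac{d\rho(t)}{dt} = \int_0^t K(t-s)\,\rho(s)\,ds
```

Here $\rho$ is the system's density matrix and $K$ is the memory kernel encoding the environment's influence on the system's past states. The reported result is that $K$ is symmetric under time reversal, so the evolution equation itself does not single out a direction of time.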
They also found a small but important detail that is usually overlooked—a time-discontinuous factor emerged that keeps the time-symmetry property intact. It is unusual to see such a mathematical mechanism in a physics equation because it is not continuous, and it was very surprising to see it pop up so naturally.
Thomas Guff et al, Emergence of opposing arrows of time in open quantum systems, Scientific Reports (2025). DOI: 10.1038/s41598-025-87323-x
Part 2
Feb 14
Dr. Krishna Kumari Challa
Can organisms help those around them even after death?
Bacteria evolved to help neighboring cells after death, new research reveals
Darwin's theory of natural selection provides an explanation for why organisms develop traits that help them survive and reproduce. Because of this, death is often seen as a failure rather than a process shaped by evolution.
When organisms die, their molecules need to be broken down for reuse by other living things. Such recycling of nutrients is necessary for new life to grow.
Researchers have shown that a type of E. coli bacteria produces an enzyme that breaks the contents of their cells down into nutrients after death. The dead bacteria thereby offer a banquet of nutrients to the cells that were their neighbors in life.
The study has been published in Nature Communications.
We typically think of death as the end, that after something dies it just falls apart, rots and becomes a passive target as it is scavenged for nutrients.
But what this new work has demonstrated is that death is not the end of the programmed biological processes that occur in an organism.
Those processes continue after death, and they have evolved to do so.
"That is a fundamental rethink about how we view the death of an organism."
The researchers realized they had stumbled across a potentially new area of biology; processes that have evolved to function after death.
One problem remained: the researchers couldn't work out how an enzyme that functions after death could have evolved.
"Typically, we think of evolution acting on living organisms, not dead ones."
The solution is that neighboring cells which gain nutrients from a dead cell are likely to be clonally related to it.
Consequently, the dead cell is giving nutrients to its relatives, analogous to how animals will often help feed younger members of their family group.
The finding demonstrates that processes after death, like processes during life, can be biologically programmed and subject to evolution. Biomolecules that regulate processes after death might be exploited in the future as novel targets for bacterial disease or as candidates to enhance bacterial growth in biotechnology.
Bacteria encode post-mortem protein catabolism that enables altruistic nutrient recycling, Nature Communications (2025). DOI: 10.1038/s41467-025-56761-6
Feb 14
Dr. Krishna Kumari Challa
Guardian molecule keeps cells on track
A guardian molecule ensures that liver cells do not lose their identity. This has been discovered by researchers.
The research is published in Nature Genetics.
The discovery is of great interest for cancer medicine because, in recent years, changes of cell identity have come into focus as a fundamental principle of carcinogenesis. The researchers were able to show that the newly discovered sentinel is so powerful that it can slow down highly potent cancer drivers and cause malignant liver tumors to regress in mice.
As a rule, the identity of cells is determined during embryonic development. They differentiate into nerve cells or liver cells, for example, and their fate is sealed. Only stem cells retain the ability to develop in different directions. However, once cells have differentiated, they usually stay on course.
Cancer cells are different. They have the remarkable ability to reactivate embryonic programs and thus change their identity—their phenotype. This ability is referred to as plasticity, and in this context it is unwanted and abnormal.
It enables tumor cells to break away from the cell network and migrate through the body. Once they have arrived in the target organ, the cells differentiate again, become sedentary again and form metastases at this site.
It is not so long ago that the importance of plasticity as a fundamental phenomenon in cancer was recognized.
The researchers' goal is to reduce the plasticity of cancer cells and thus prevent the development and spread of malignant tumors. To do this, they first need to understand how cell plasticity is regulated.
In principle, almost all cells in the body have an identical genome. But how is it possible then that such different and highly specialized cell types as nerve cells or liver cells arise?
This is only possible because cells have a sophisticated control network.
This network ensures that only certain genes are switched on, depending on the cell type, while others are permanently silenced. Master regulators play a central role in this process. They switch on genes that can prompt specialized cells to change their identity and even acquire stem cell properties.
However, little is known about the antagonists—the control instances that prevent unwanted (re)transformation of differentiated cells by switching off certain genes.
The researchers used a computer program to search for gene switches that could potentially serve as guardians.
Part 1
Feb 14
Dr. Krishna Kumari Challa
The research team found almost 30 different guardian candidates and decided to pursue one of them further: PROX1 (Prospero homeobox protein 1).
Studies on the liver cancer model showed that the team had hit the mark.
It turned out that PROX1 is a very influential guardian in liver cells. If it is missing, liver cells change their phenotype. Conversely, the versatility of tumor cells can be reduced by experimentally increasing the guardian's activity.
PROX1 was able to override the influence of such strong cancer drivers and suppress the formation of tumors despite their presence.
The researchers also found something else: the PROX1 guardian must be constantly active around the clock to fulfill its function. This is different from many other gene switches, which, like a toggle switch, only need to be activated briefly.
Lim B et al, Active repression of cell fate plasticity by PROX1 safeguards hepatocyte identity and prevents liver tumourigenesis. Nature Genetics (2025). DOI: 10.1038/s41588-025-02081-w
Part 2
Feb 14
Dr. Krishna Kumari Challa
Birds have developed complex brains independently from mammals, studies reveal
Two studies published in the latest issue of Science have revealed that birds, reptiles, and mammals have developed complex brain circuits independently, despite sharing a common ancestor. These findings challenge the traditional view of brain evolution and demonstrate that, while comparable brain functions exist among these groups, embryonic formation mechanisms and cell types have followed divergent evolutionary trajectories.
The pallium is the brain region where the neocortex forms in mammals, the part responsible for cognitive and complex functions that most distinguishes humans from other species. The pallium has traditionally been considered a comparable structure among mammals, birds, and reptiles, varying only in complexity levels. It was assumed that this region housed similar neuronal types, with equivalent circuits for sensory and cognitive processing.
Previous studies had identified the presence of shared excitatory and inhibitory neurons as well as general connectivity patterns suggesting a similar evolutionary path in these vertebrate species.
However, the new studies reveal that, although the general functions of the pallium are equivalent among these groups, its developmental mechanisms and the molecular identity of its neurons have diverged substantially throughout evolution.
Eneritz Rueda-Alaña et al, Evolutionary convergence of sensory circuits in the pallium of amniotes, Science (2025). DOI: 10.1126/science.adp3411. www.science.org/doi/10.1126/science.adp3411
Zaremba B et al. Developmental origins and evolution of pallial cell types and structures in birds. Science (2025). DOI: 10.1126/science.adp5182. www.science.org/doi/10.1126/science.adp5182
Feb 14
Dr. Krishna Kumari Challa
Active matter: Scientists create three-dimensional 'synthetic worms'
Researchers have made a breakthrough in the development of "life-like" synthetic materials which are able to move by themselves like worms.
Scientists have been investigating a new class of materials called "active matter," which could be used for various applications from drug delivery to self-healing materials.
Compared to inanimate matter—the sort of motionless materials we come across in our lives every day, such as plastic and wood—active matter can show fascinating life-like behavior.
These materials are made of elements which are driven out of equilibrium by internal energy sources, allowing them to move independently.
Researchers carried out the experiment using special micron-sized (one millionth of a meter) particles called Janus colloids, which were suspended in a liquid mixture.
The team then made the material active by applying a strong electric field and observed the effects using a special kind of microscope which takes three-dimensional images.
When the electric field was turned on, the scattered colloid particles would merge together to form worm-like structures—which creates a fully three-dimensional synthetic active matter system.
The researchers observed the formation of fascinating new structures—self-driven active filaments that are reminiscent of living worms. They were then able to develop a theoretical framework that enabled them to predict and control the motion of the synthetic worms based solely on their lengths.
Xichen Chao et al, Traveling Strings of Active Dipolar Colloids, Physical Review Letters (2025). DOI: 10.1103/PhysRevLett.134.018302
Feb 14
Air inside your home may be more polluted than outside due to everyday chemical products
When you walk through a flower garden, the crisp, fresh scent is one of the first things you notice.
But bringing that flower scent or other aromas indoors with the help of chemical products—yes, air fresheners, wax melts, floor cleaners, deodorants and others—rapidly fills the air with nanoscale particles that are small enough to get deep into your lungs, researchers have found over a series of studies.
These nanoparticles form when fragrances interact with ozone, which enters buildings through ventilation systems, triggering chemical transformations that create new airborne pollutants.
A forest is a pristine environment, but if you're using cleaning and aromatherapy products full of chemically manufactured scents to recreate a forest in your home, you're actually creating a tremendous amount of indoor air pollution that you shouldn't be breathing in.
Nanoparticles just a few nanometers in size can penetrate deep into the respiratory system and spread to other organs.
To understand how airborne particles form indoors, you need to measure the smallest nanoparticles—down to a single nanometer. At this scale, we can observe the earliest stages of new particle formation, where fragrances react with ozone to form tiny molecular clusters. These clusters then rapidly evolve, growing and transforming in the air around us.
Even though it's yet to be determined how breathing in volatile chemicals from these products affects health, the researchers have repeatedly found that when fragrances are released indoors, they quickly react with ozone to form nanoparticles. These newly formed nanoparticles are particularly concerning because they can reach very high concentrations, potentially posing risks to respiratory health.
Essential oil diffusers, disinfectants, air fresheners and other scented sprays also generate a significant number of nanoscale particles.
But it's not just scented products contributing to indoor nanoparticle pollution: A study by researchers found that cooking on a gas stove also emits nanoparticles in large quantities.
Just 1 kilogram of cooking fuel emits 10 quadrillion particles smaller than 3 nanometers, which matches or exceeds what's emitted from cars with internal combustion engines. At that rate, you might be inhaling 10–100 times more of these sub-3 nanometer particles from cooking on a gas stove indoors than you would from car exhaust while standing on a busy street.
Still, scented chemical products match or surpass gas stoves and car engines in the generation of nanoparticles smaller than 3 nanometers, called nanocluster aerosol. Between 100 billion and 10 trillion of these particles could deposit in your respiratory system within just 20 minutes of exposure to scented products.
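The deposition figures above follow from a standard inhaled-dose estimate: deposited particles = airborne concentration × inhaled air volume × deposition fraction. The sketch below is a back-of-envelope version under assumed values; the breathing rate, deposition fraction, and indoor concentrations are illustrative, not figures taken from the study.

```python
# Back-of-envelope inhaled-dose estimate for nanocluster aerosol.
# All input values are illustrative assumptions, not data from the study.

def deposited_particles(conc_per_cm3, minutes, breathing_m3_per_min=0.008,
                        deposition_fraction=0.9):
    """Deposited dose = concentration x inhaled volume x deposition fraction.

    Sub-3 nm particles diffuse efficiently, so a high deposition
    fraction is assumed here.
    """
    conc_per_m3 = conc_per_cm3 * 1e6          # 1 m^3 = 1e6 cm^3
    inhaled_m3 = breathing_m3_per_min * minutes
    return conc_per_m3 * inhaled_m3 * deposition_fraction

# Assumed indoor concentrations spanning a plausible range after
# using a scented product (particles per cubic centimeter):
for conc in (1e6, 1e8):
    dose = deposited_particles(conc, minutes=20)
    print(f"{conc:.0e}/cm^3 for 20 min -> ~{dose:.1e} particles deposited")
```

With these assumptions, 20 minutes of exposure at 10^6 to 10^8 particles per cubic centimeter brackets the 100 billion to 10 trillion range quoted above.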
Satya S. Patra et al, Flame-Free Candles Are Not Pollution-Free: Scented Wax Melts as a Significant Source of Atmospheric Nanoparticles, Environmental Science & Technology Letters (2025). DOI: 10.1021/acs.estlett.4c00986
Feb 14
Maternal acetaminophen (paracetamol) use may alter placental gene expression, raising ADHD risk in children
Maternal acetaminophen exposure during pregnancy is associated with a higher likelihood of childhood attention deficit hyperactivity disorder (ADHD), according to a new study.
Researchers analyzed plasma biomarkers of acetaminophen (APAP) exposure in a cohort of 307 African American mother-child pairs. Detection of APAP in second-trimester maternal blood samples correlated with increased odds of ADHD diagnosis in children by age 8–10.
Acetaminophen (also called paracetamol) is widely used during pregnancy, with an estimated 41–70% of pregnant individuals in the United States, Europe, and Asia reporting its use. Despite its classification as a low-risk medication by regulatory agencies such as the U.S. Food and Drug Administration and the European Medicines Agency, accumulating evidence suggests a potential link between prenatal APAP exposure and adverse neurodevelopmental outcomes, including ADHD and autism spectrum disorder.
Researchers utilized untargeted metabolomics to identify APAP metabolites in second-trimester maternal plasma samples and examined their relationship with childhood ADHD diagnoses and placental gene expression.
APAP metabolites were detected in 20.2% of maternal plasma samples. Children whose mothers had APAP biomarkers present in their plasma had a 3.15 times higher likelihood of an ADHD diagnosis (95% confidence interval: 1.20 to 8.29) compared with those without detected exposure.
The association was stronger among females than males, with female children of APAP-exposed mothers showing a 6.16 times higher likelihood of ADHD (95% confidence interval: 1.58 to 24.05), while the association was weaker and nonsignificant in males.
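Odds ratios with confidence intervals like those reported here are standard epidemiological quantities. As an illustration of how such figures are derived, the sketch below computes an unadjusted odds ratio with a Wald confidence interval from a 2x2 table; the counts are invented for illustration only (the study's reported OR of 3.15 came from its own models and data).

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald CI from a 2x2 table:
       a = exposed with outcome,   b = exposed without,
       c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data):
or_, lo, hi = odds_ratio_ci(a=12, b=50, c=18, d=227)
print(f"OR = {or_:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

Note how small cell counts widen the interval: the `1/a + 1/b + 1/c + 1/d` term is why subgroup analyses, such as the female-only estimate above, tend to have much broader confidence intervals.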
Placental gene expression analysis of a subset of 174 participants indicated sex-specific transcriptional changes. In females, APAP exposure was associated with upregulation of immune-related pathways, including increased expression of immunoglobulin heavy constant gamma 1 (IGHG1).
Increased IGHG1 expression was statistically linked to ADHD diagnoses, with mediation analysis suggesting that APAP's effect on ADHD was partly mediated through this gene's placental expression.
Oxidative phosphorylation pathways were downregulated in both sexes, a pattern previously associated with neurodevelopmental impairment.
Findings align with prior epidemiological studies and experimental animal research linking prenatal APAP exposure to neurodevelopmental disruptions. The current study eliminated the bias concerns raised in previous studies where APAP use was self-reported by using objective biomarker measurements.
Brennan H. Baker et al, Associations of maternal blood biomarkers of prenatal APAP exposure with placental gene expression and child attention deficit hyperactivity disorder, Nature Mental Health (2025). DOI: 10.1038/s44220-025-00387-6
Feb 15
Dangerous bacteria lurk in hospital sink drains, despite rigorous cleaning, study reveals
We hope to be cured when we stay in hospital. But too often, we acquire new infections there. Such "health-care-associated infections" (HAI) are a growing problem worldwide, taking up an estimated 6% of global hospital budgets.
Patients with lowered immune defenses, and in some hospitals, poor adherence to hygiene protocols, allow HAIs to thrive. Furthermore, antibiotics are widely used in hospitals, which tends to select for hardy, resistant strains of bacteria. When such resistance genes lie on mobile genetic elements, they can even jump between bacterial species, potentially leading to novel diseases.
Researchers have now shown that hospital sink drains host bacterial populations that change over time, despite impeccable cleaning protocols.
These results highlight that controlling bacterial growth in drains, and preventing colonization by new strains of such hard-to-disinfect niches, is likely a global problem.
Sinks and their drains are routinely cleaned with bleach, as well as disinfected with chemicals and pressurized steam every fortnight, or every month in non-patient areas. Once a year, drainpipes are hyperchlorinated at low temperature.
Despite this, the authors of this study identified a total of 67 different species from the drains. The diversity in most drains went up and down over time with no clear pattern—seasonal or otherwise. The greatest diversity occurred in general medicine and intensive care, while the fewest isolates were found in the microbiology laboratory.
Dominant across wards were six Stenotrophomonas species as well as Pseudomonas aeruginosa, a pathogen known to cause ventilator-associated pneumonia and sepsis, and characterized by the WHO as one of the greatest threats to humans in terms of antibiotic resistance. At least 16 other Pseudomonas species were also found at various times and in various wards.
Other notorious hospital-associated pathogens found repeatedly were Klebsiella pneumoniae, Acinetobacter johnsonii, Acinetobacter ursingii, Enterobacter mori, Enterobacter quasiroggenkampii, and Staphylococcus aureus.
The bacteria the researchers found may originate from many sources, from patients, medical personnel, and even the environment surrounding the hospital. Once established in sink drains, they can spread outwards, posing significant risks to immunocompromised patients above all.
Yearlong analysis of bacterial diversity in hospital sink drains: culturomics, antibiotic resistance and implications for infection control, Frontiers in Microbiology (2025). DOI: 10.3389/fmicb.2024.1501170
Feb 15
How Earth got its ice caps
The cool conditions which have allowed ice caps to form on Earth are rare events in the planet's history and require many complex processes working at once, according to new research.
A team of scientists investigated why Earth has existed in what is known as a "greenhouse" state without ice caps for much of its history, and why the conditions we are living in now are so rare.
They found that Earth's current ice-covered state is not typical for the planet's history and was only achieved through a strange coincidence.
Many ideas have previously been proposed to explain the known cold intervals in Earth's history. These include decreased CO2 emissions from volcanoes, or increased carbon storage by forests, or the reaction of CO2 with certain types of rocks.
The researchers undertook the first ever combined test of all of these cooling processes in a new type of long-term 3D model of the Earth.
This type of "Earth Evolution Model" has only recently been made possible through advances in computing.
They concluded that no single process could drive these cold climates, and that the cooling in fact required the combined effects of several processes at once. The results of their study were published 14 February 2025 in Science Advances.
The reason we live on an Earth with ice caps—rather than an ice-free planet—is due to a coincidental combination of very low rates of global volcanism, and highly dispersed continents with big mountains, which allow for lots of global rainfall and therefore amplify reactions that remove carbon from the atmosphere.
The important implication here is that the Earth's natural climate regulation mechanism appears to favor a warm and high-CO2 world with no ice caps, not the partially glaciated and low-CO2 world we have today.
This general tendency towards a warm climate has helped prevent devastating 'snowball Earth' global glaciations, which have only occurred very rarely and have therefore helped life to continue to prosper.
There is an important message, which is that we should not expect the Earth to always return to a cold state as it was in the pre-industrial age.
Earth's current ice-covered state is not typical for the planet's history, but our current global society relies on it. We should do everything we can to preserve it, and we should be careful with assumptions that cold climates will return if we drive excessive warming before stopping emissions. Over its long history, the Earth likes it hot, but our human society does not, say the researchers.
Andrew Merdith, Phanerozoic icehouse climates as the result of multiple solid-Earth cooling mechanisms, Science Advances (2025). DOI: 10.1126/sciadv.adm9798. www.science.org/doi/10.1126/sciadv.adm9798
Feb 15
Inconsistent reporting by companies leads to underestimation of methane's climate impact, study finds
Companies around the world are underestimating their total greenhouse gas footprints because of inconsistent accounting standards for methane emissions, finds a new study by researchers.
The new study, published in Nature Communications, found that methane emissions are being underreported by at least the equivalent of between 170 million and 3.3 billion tons of carbon over a decade, depending on the metric used in calculating the shortfall.
This means that each year, on average, companies around the world have potentially underestimated their carbon footprint by as much in total as the annual carbon emissions of the UK in 2022. This represents a significant methane emissions gap that could cost between $1.6 billion (£1.3 billion) and $40 billion (£32 billion) to fix.
The cumulative emission gap the researchers have documented in this work shows how important it is to standardize the reporting of methane emissions. Methane is a potent greenhouse gas and the first step towards properly addressing its effect on climate is to make sure that it's accounted for properly.
Adopting a global standard is in principle easy for companies as it essentially only requires the adjustment of a few conversion factors when calculating their greenhouse gas footprint. However, it requires global coordination as companies are currently often subject to fragmented regulations.
Methane is a potent greenhouse gas that contributes to global warming at levels comparable to carbon dioxide. Though methane is emitted in much smaller quantities than carbon dioxide, it's more efficient at trapping heat in the atmosphere. However, methane is also short-lived in the atmosphere, with a half-life of only about 10 years versus 120 years for carbon dioxide.
How much total heat a greenhouse gas traps is called its Global Warming Potential (GWP) and measured in CO2 equivalent units, or the amount of carbon dioxide gas that would cause the same amount of warming. Because of methane's short lifespan, the conversion to CO2 is not straightforward and debate persists about how best to represent it in terms of carbon dioxide.
If methane's impact is calculated over 20 years (GWP-20), it's about 80 times more potent than carbon dioxide because that's the timeframe before most of it has dissipated. However, gauged over 100 years (GWP-100) more of the methane has broken down so it's only about 28 times as potent.
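The size of the reporting gap follows directly from which conversion factor a company applies. A minimal sketch using the approximate GWP values quoted above (published IPCC values vary slightly between assessment reports):

```python
# CO2-equivalent footprint of the same methane emission under two
# commonly used metrics. The factors are the approximate values
# quoted above, not authoritative constants.

GWP_20 = 80    # ~80x CO2 over a 20-year horizon
GWP_100 = 28   # ~28x CO2 over a 100-year horizon

def methane_to_co2e(tonnes_ch4, gwp):
    """Convert a methane mass to tonnes of CO2-equivalent."""
    return tonnes_ch4 * gwp

tonnes_ch4 = 1_000
print(f"GWP-20:  {methane_to_co2e(tonnes_ch4, GWP_20):,} t CO2e")
print(f"GWP-100: {methane_to_co2e(tonnes_ch4, GWP_100):,} t CO2e")
```

The same 1,000 tonnes of methane reports as 80,000 or 28,000 tonnes of CO2-equivalent, a factor of roughly 2.9, which is exactly the kind of gap that inconsistent standards create.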
For companies estimating and reporting their greenhouse gas footprint, this lack of harmonization can cause confusion and inaccuracies, as there's no legally binding guidance or consensus for which standard to use.
The authors note that even with their suggested corrections, total methane emissions are still being underestimated, as their calculations only focused on emissions directly produced by the companies they analyzed. Other downstream emissions, such as that which come from sold products, were not included, and are likely significant contributors as well, particularly in the energy sector.
Simone Cenci et al, Lack of harmonisation of greenhouse gases reporting standards and the methane emissions gap, Nature Communications (2025). DOI: 10.1038/s41467-025-56845-3
Feb 15
Microplastics Can Block Blood Flow in The Brain
With microplastics now permeating our food and our bodies, researchers are keen to assess the potential damage these tiny fragments could be doing. A new study shows how plastics may lead to dangerous blood flow blockages in the brain. The study involved tracking microplastics in blood vessels moving through mouse brains in real time – the first time microplastic movement has been tracked in this way.
Using high-resolution laser-based imaging techniques, the researchers found microplastic-laden immune cells becoming lodged inside blood vessels in the cortex area of the brain.
"The data reveal a mechanism by which microplastics disrupt tissue function indirectly through regulation of cell obstruction and interference with local blood circulation, rather than direct tissue penetration," write the researchers in their published paper.
This revelation offers a lens through which to comprehend the toxicological implications of microplastics that invade the bloodstream.
The researchers found some similarities between the blockages here and blood clots, while also looking at the subsequent impact on mouse behavior. Mice with microplastics in their blood performed less well than their plastic-free peers on movement, memory, and coordination tests, pointing to impaired brain function.
Microplastics are defined as plastic fragments less than 5 millimeters (0.2 inches) in diameter. As you might expect, the smaller specks of plastic were found to be less likely to cause blockages than larger ones.
While the microplastic blockages were cleared up over the course of a month, and most cognitive behaviors in the mice returned to normal, the researchers suggest there could be links here to neurological problems like depression and anxiety, as well as an increased risk of strokes and cardiovascular disease.
"These findings indicate that mice display multifaceted abnormalities in neurobehavioral regulation, resembling depressive states associated with disrupted cerebral blood flow," write the researchers.
While it's not certain that the same processes are happening in human brains – there are significant differences in terms of immune systems and blood vessel sizes – mice are biologically similar enough to us as a species to make this a real concern.
https://www.science.org/doi/10.1126/sciadv.adr8243
Feb 17
Scientists Just Achieved a Major Milestone in Creating Synthetic Life
After more than a decade of work, researchers have reached a major milestone in their efforts to re-engineer life in the lab, putting together the final chromosome in a synthetic yeast (Saccharomyces cerevisiae) genome. The researchers chose yeast as a way to demonstrate the potential for producing foodstuffs that could survive the rigors of a changing climate or widespread disease.
It's the first time a synthetic eukaryotic genome has been constructed in full, following on from successes with simpler bacterial organisms. It's a proof of concept for how more complex organisms, like food crops, could be synthesized by scientists.
This doesn't mean we can start growing completely artificial yeast from scratch, but it does mean living yeast cells can potentially be entirely recoded – though lots more work is required to get this process refined and scaled up before that can happen.
And the coding analogy is a good one, because the researchers had to spend plenty of time and effort debugging the 16th and final synthetic yeast chromosome (called SynXVI) before the genome functioned as desired.
https://www.nature.com/articles/s41467-024-55318-3
Feb 17
Trees can cool cities, but only with a little help
Because trees can cool cities by providing shade and evaporating water into the atmosphere, greening city streets is an often-touted strategy for climate change adaptation. But trees provide benefits only if they're healthy, and physical variations in urban environments mean that not all trees have the same chance to thrive.
In an article published in AGU Advances, researchers describe identifying the cityscape features that set trees up for success and those that may cause them to struggle.
The researchers used data from the ECOSTRESS sensor aboard the International Space Station to map the summer afternoon canopy temperatures. Then they applied machine learning to assess the relationship between these temperatures and various environmental factors, including proximity to water, urbanization, traffic exposure, and surrounding land cover.
They found that proximity to blue and green spaces (areas with water or vegetation) improved tree health, whereas trees in areas with a lot of built structures and impervious surfaces fared worse.
Using this analysis, the researchers created and calculated the combined urban tree index (CUTI)—a metric that considers the fraction of land covered by tree canopy along with the temperature and health of the canopy—to determine how much an area benefits from its trees. The CUTI scale ranges from 0 to 1, with 0 meaning no benefit and 1 meaning maximum benefit.
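The article does not give the published CUTI formula, but the description above (canopy fraction combined with canopy temperature and health, on a 0-to-1 scale) can be sketched as a simple composite index. The weighting and temperature bounds below are invented for illustration and are not the paper's definition.

```python
# Hypothetical sketch of a CUTI-style index. The exact combination
# below is an assumption for illustration, NOT the published formula.

def cuti_sketch(canopy_fraction, canopy_temp_c, health_score,
                t_cool=25.0, t_hot=45.0):
    """Combine canopy cover (0-1), a cooling score derived from canopy
    temperature, and a health score (0-1) into a single 0-1 benefit index."""
    # Cooler canopies score closer to 1, hotter ones closer to 0.
    cooling = (t_hot - canopy_temp_c) / (t_hot - t_cool)
    cooling = max(0.0, min(1.0, cooling))
    score = canopy_fraction * cooling * health_score
    return max(0.0, min(1.0, score))

# A shaded, well-watered block vs. a sparse, heat-stressed one:
print(cuti_sketch(0.40, 28.0, 0.9))  # denser, cooler, healthier canopy
print(cuti_sketch(0.10, 42.0, 0.5))  # sparse, hot, stressed canopy
```

A multiplicative combination is one natural design choice here: an area scores well only if it has canopy cover *and* that canopy is cool and healthy, matching the paper's point that cover alone does not guarantee benefit.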
In urban areas where the surroundings will likely cause trees to do poorly, city managers will need to plant more trees and attend to them more carefully than in areas where trees thrive naturally, the authors concluded.
Jean V. Wilkening et al, Canopy Temperature Reveals Disparities in Urban Tree Benefits, AGU Advances (2025). DOI: 10.1029/2024AV001438
Feb 18
How is gold formed?
The simple answer here is that we are not certain. However, scientists have some ideas.
Gold, like all elements, formed through high-energy reactions that occurred in various cosmic environments some 13 billion years ago, when the universe started to form.
However, gold deposits—or the concentration of gold in large volumes within rock formations—are believed to occur through various processes, explained by two theories.
The first theory—described by geologist Richard J. Goldfarb—argues that large amounts of gold were deposited in certain areas when continents were expanding and changing shape, around 3 billion years ago. This happened when smaller landmasses, or islands, collided and stuck to larger continents, a process called accretionary tectonics. During these collisions, mineral-rich fluids moved through the Earth's crust, depositing gold in certain areas.
A newer, complementary theory by planetary scientist Andrew Tomkins explains the formation of some much younger gold deposits, from roughly 650 million years ago onward. It suggests that as Earth's oceans became richer in oxygen, gold was trapped as microscopic particles within another mineral known as pyrite (often called fool's gold). Later, geological processes—like continental growth (accretion) and heat or pressure changes (metamorphism)—released this gold, forming deposits that could be mined.
Feb 18
AI on aircraft can reduce risk of mid-air stalls and sudden drops, study shows
Artificial intelligence aboard aircraft could help prevent terrifying drops in altitude. In a new study, an international research team successfully tested a machine learning system for preventing trouble with turbulence. The findings are published in the journal Nature Communications.
Researchers conducted tests on an AI system designed to enhance the effectiveness of experimental technologies for manipulating airflow on wing surfaces. The results indicate that these innovations work better when paired with deep reinforcement learning (DRL), in which the program adapts to airflow dynamics based on previously learned experiences.
The AI control system zeroes in on one particularly dangerous aerodynamic phenomenon known as flow detachment, or turbulent separation bubbles.
Flow detachment is as serious as it sounds. To stay aloft, airplanes need slow-moving air underneath the wing and fast-moving air above it. The air moving over the wing surface needs to follow the wing shape, or "attach," to the surface. When the air moving over the wing's surface no longer follows the wing shape and instead breaks away, it creates a dangerous swirling or stalled airflow.
This usually occurs when the wing is at a high angle of attack, or when the air slows down due to increasing pressure. When this happens, lift decreases, and drag increases, which can lead to a stall and make the aircraft harder to control.
The researchers report that they can reduce these bubbles by 9%.
The team tested how effectively AI could control experimental devices that pulse air in and out of a small opening in the wing surface, known as synthetic jets. While such innovations are still in the experimental stage, aerospace engineers look at them to complement physical features such as vortex generators that planes rely on to maintain the right balance of airflow above and below the wings.
Up to this point, the prevailing wisdom has been that these bursts should occur at regular periodic intervals. However, the study shows that periodic activation reduces turbulent separation bubbles by only 6.8%.
This study highlights how important AI is for scientific innovation. It offers exciting implications for aerodynamics, energy efficiency and next-generation computational fluid dynamics.
Bernat Font et al, Deep reinforcement learning for active flow control in a turbulent separation bubble, Nature Communications (2025). DOI: 10.1038/s41467-025-56408-6
Feb 18
Sudden vision loss in children: Study points to a novel retinal disorder
A multicenter study led by researchers from the Key Laboratory of Ophthalmology has characterized a distinct retinal disorder in children following high fever illness. The study describes hyperacute outer retinal dysfunction (HORD), a condition marked by sudden bilateral vision loss, photoreceptor disruption, and variable recovery.
Eight pediatric patients between the ages of 3 and 7 experienced severe, sudden-onset vision loss approximately two weeks after a febrile illness. Despite initial poor visual acuity, most showed significant central vision recovery over one year. Comprehensive retinal imaging revealed characteristic ellipsoid zone (EZ) and external limiting membrane (ELM) disruptions. Electroretinography (ERG) findings demonstrated extinguished cone and rod responses, even in cases where vision improved.
In the study, "Hyperacute Outer Retinal Dysfunction," published in JAMA Ophthalmology, researchers examined eight children (16 eyes) referred to pediatric retina services in China. Patients had no prior history of visual impairment and underwent thorough ophthalmic and systemic evaluations. Exclusion criteria included inherited retinal disease, uveitis, and white dot syndromes.
Best-corrected visual acuity (BCVA) was assessed at baseline and during follow-up. Multimodal imaging included color fundus photography, ultra-widefield imaging, optical coherence tomography (OCT), fluorescence angiography, fundus autofluorescence, and electroretinography. Genetic and serological testing was conducted to rule out inherited and autoimmune retinal diseases. Patients received varying immunosuppressive treatments, including corticosteroids, intravenous immunoglobulin and methotrexate.
Initial symptoms included severe bilateral vision loss, nyctalopia, visual field constriction, and dyschromatopsia. At presentation, the patients' mean visual acuity was worse than the ability to count fingers. OCT imaging showed diffuse EZ and ELM loss, while early fundus findings were largely unremarkable.
By the fourth week, signs of macular recovery appeared. At one year, 88% (7 of 8 patients) achieved visual acuity of 20/40 or better, with 50% (4 of 8) reaching 20/25 or better. Macular EZ and ELM appeared intact in 75% and 88% of eyes, respectively, though extrafoveal regions remained affected. ERG continued to show extinguished rod and cone responses despite visual improvement.
Systemic evaluations were unremarkable. No infectious or autoimmune triggers were identified, although two patients tested positive for specific antiretinal antibodies (anti-PKCγ and anti-Ri). Treatment with corticosteroids and IVIG was initiated in most patients, though a definitive therapeutic effect of treatment remained unclear from the study.
A commentary by Timothy Boyce and Ian Han at the University of Iowa, "Hyperacute Outer Retinal Dysfunction—A Retina on Fire," also published in JAMA Ophthalmology, suggests that HORD may represent a novel inflammatory-mediated retinal disorder.
The authors propose similarities with autoimmune encephalitis, suggesting a possible antibody-mediated mechanism. Early OCT findings, including vitritis, vascular sheathing, and intraretinal hyperreflective dots, point to acute inflammation as a potential driver of retinal damage.
Yizhe Cheng et al, Hyperacute Outer Retinal Dysfunction, JAMA Ophthalmology (2025). DOI: 10.1001/jamaophthalmol.2024.6372
Timothy M. Boyce et al, Hyperacute Outer Retinal Dysfunction—A Retina on Fire, JAMA Ophthalmology (2025). DOI: 10.1001/jamaophthalmol.2024.6488
Feb 19
Biologists transform gut bacteria into tiny protein pharmacies
Hundreds of different species of microbes live in your gut. In the future, one of these might serve a new function: microscopic in-house pharmacist.
A new study published Feb. 18 in Nature Biotechnology shows how gut bacteria can be directed to produce and release proteins within the lower gastrointestinal tract—eliminating a major roadblock to delivering drugs to that part of the body.
Oral medication is the most common and practical means of drug administration, but the stomach doesn't let much pass through unscathed. This is good when it comes to things like foodborne pathogens, but gut-focused therapies are regularly deactivated and flushed out.
In an unprecedented workaround, biologists engineered bacteria-eating viruses called phages to infect and reprogram bacterial cells to produce and release a sustained flow of a protein-based drug. Collaborating with immunologists, they showed that this approach could potentially be used to treat chronic diseases.
Bacteriophages (phages for short) are viruses that naturally infect bacteria. Phages are harder to classify than bacteria and therefore less understood, but we do know how they attack bacteria.
After attaching to a bacterial cell, phages inject their own DNA and reprogram the cell so that it manufactures more phages—agents of the cell's own destruction. When the bacterial cell eventually succumbs, it explodes into a flood of new phages in a process called lysis. Millions of these events happening simultaneously produce a constant supply of a targeted protein inside the lower intestine.
Even though phages act (and look) like spider aliens, they are regular players on the gut-microbiome home team.
The biologists therefore engineered special phages that inject a little extra genetic material into the bacterial cell.
In addition to making a flurry of new phages, the instructions prompt the cell to produce a tagalong protein that can lend itself to targeted therapies inside the lower intestines.
Engineered proteins reduced inflammation and obesity in mice.
Nature Biotechnology (2025). DOI: 10.1038/s41587-025-02570-7. www.nature.com/articles/s41587-025-02570-7
Feb 19
Burning plastic for cooking and heating: An emerging environmental crisis
A new research paper "The Use of Plastic as a Household Fuel among the Urban Poor in the Global South" published in Nature Cities, has called for action to reduce the burning of plastics for heating and cooking, a common yet hazardous practice emerging in millions of households in developing nations due to a lack of traditional energy sources.
Researchers investigated the energy consumption of developing countries in Africa, Asia and Latin America, finding many were unable to afford clean fuels such as gas or electricity.
The team also found urban sprawl had made traditional fuels such as wood and charcoal difficult to find, while a lack of waste management meant plastic waste was in abundance.
Burning plastic releases harmful chemicals such as dioxins, furans and heavy metals into the air, which can have a range of health and welfare impacts such as lung diseases, the researchers point out.
These risks are particularly pronounced among women and children, as they spend more time at home.
But the pollution doesn't just stay in the households that burn it: it spreads across neighborhoods and cities, affecting everyone.
The issue may affect millions of people who bear the burden of acute inequality in cities and could potentially have a bigger impact as plastic use increases and cities grow.
And many governments are not addressing the issue effectively because it's usually concentrated in areas such as slums, which are often neglected.
Possible ways to address the problem include subsidies for cleaner fuels to make them affordable for poorer families, better waste management to prevent plastic from piling up in slum areas, education campaigns to inform communities about the dangers of burning plastic and alternative low-cost, innovative cooking solutions tailored to lower-income areas, say the experts.
Bishal Bharadwaj et al, The use of plastic as a household fuel among the urban poor in the Global South, Nature Cities (2025). DOI: 10.1038/s44284-025-00201-5
What makes us remember our dreams? How sleep patterns and mindset shape recall
Some people wake up vividly recalling their dreams and can recount precise stories of what they experienced during the night, while others struggle to remember even a single detail. Why does this happen? A new study, published in Communications Psychology, explores the factors that influence so-called "dream recall"—the ability to remember dreams upon awakening—and uncovers which individual traits and sleep patterns shape this phenomenon.
Why people differ so much in dream recall remains a mystery. Some studies have found that women, younger people, and people with a tendency to daydream recall their dreams better, but other studies did not confirm these findings.
Other hypotheses, such as that personality traits or cognitive abilities play a role, received even less support from the data. During the recent COVID pandemic, individual differences in morning dream recall attracted renewed public and scientific attention when an abrupt surge in morning dream recall was reported worldwide.
The new research was conducted from 2020 to 2024 and involved over 200 participants, aged 18 to 70, who recorded their dreams daily for 15 days while their sleep and cognitive data were tracked using wearable devices and psychometric tests.
Each participant was given a voice recorder to report, every day immediately after awakening, the experiences they had during sleep. Participants had to report whether they remembered having dreamed, whether they had the impression of having dreamed without remembering anything about the experience, and to describe the content of the dream if they were able to recall it.
For the duration of the study, participants also wore an actigraph, a sleep-monitoring wristwatch that detects sleep duration, efficiency, and disturbances. At the beginning and end of the dream-recording period, participants completed psychological tests and questionnaires measuring various factors: anxiety levels, interest in dreams, proneness to mind-wandering (the tendency to frequently shift attention away from the task at hand toward unrelated thoughts or internal reflections), and memory and selective attention.
Dream recall, defined as the probability of waking up in the morning with impressions and memories from a dream experience, showed considerable variability between individuals and was influenced by multiple factors. The study revealed that people with a positive attitude toward dreams and a tendency for mind-wandering were significantly more likely to recall their dreams. Sleep patterns also seemed to play a critical role: individuals who experienced longer periods of light sleep had a greater likelihood of waking with a memory of their dreams.
Younger participants showed higher rates of dream recall, while older individuals often experienced "white dreams" (a sensation of having dreamed without recalling any details). This suggests age-related changes in memory processes during sleep. Moreover, seasonal variations emerged, with participants reporting lower dream recall during winter compared to spring, hinting at the potential influence of environmental or circadian factors.
The findings suggest that dream recall is not just a matter of chance but a reflection of how personal attitudes, cognitive traits, and sleep dynamics interact.
The individual determinants of morning dream recall, Communications Psychology (2025). DOI: 10.1038/s44271-025-00191-z
Scientists decode diet from stool DNA
You can cheat your strict and disciplined parents or partner, and you can lie to your doctor or to the people who run surveys, but you cannot escape from scientists!
Scientists have developed a breakthrough method to track diet using stool metagenomic data.
Developed by researchers at the Institute for Systems Biology (ISB), the new method, called MEDI (Metagenomic Estimation of Dietary Intake), detects food-derived DNA in stool samples to estimate dietary intake. MEDI leverages stool metagenomics, which refers to sequencing all the DNA present in fecal samples (including microbial, human, and food-derived DNA). This non-invasive, data-driven approach offers an objective alternative to traditional food diaries and questionnaires, which are still the gold standard in dietary assessment but can suffer from misreporting and compliance issues.
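The core idea (sequence everything in a stool sample, then tally the reads that map to food genomes) can be sketched in a few lines. This is an illustrative toy, not the actual MEDI pipeline; the species labels, read counts, and the `estimate_diet` helper are all invented.

```python
# Toy sketch of diet estimation from stool metagenomics: take per-read
# taxonomic assignments and report the relative abundance of food taxa,
# ignoring microbial and human reads. Purely illustrative, not MEDI itself.

def estimate_diet(read_assignments):
    """Collapse per-read taxon labels into relative abundances of food taxa."""
    food_taxa = {"Triticum aestivum", "Solanum lycopersicum", "Gallus gallus"}
    counts = {}
    for taxon in read_assignments:
        if taxon in food_taxa:
            counts[taxon] = counts.get(taxon, 0) + 1
    total = sum(counts.values())
    return {t: c / total for t, c in counts.items()} if total else {}

# Six reads: four food-derived (wheat, chicken), two not (microbe, host).
reads = ["Escherichia coli", "Triticum aestivum", "Homo sapiens",
         "Triticum aestivum", "Gallus gallus", "Triticum aestivum"]
profile = estimate_diet(reads)   # wheat 0.75, chicken 0.25 of the food fraction
```

The real method must also correct for how differently foods shed DNA and survive digestion, which is where the published calibration work comes in.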
Metagenomic estimation of dietary intake from human stool, Nature Metabolism (2025). DOI: 10.1038/s42255-025-01220-1
Using CRISPR to remove extra chromosomes in Down syndrome
Gene editing techniques may eventually allow trisomy to be treated at the cellular level, according to an in vitro proof-of-concept study.
Down syndrome is caused by the presence of a third copy of chromosome 21. The condition occurs in approximately 1 in 700 live births and is relatively easy to diagnose at early stages of development. However, there are no treatments.
Researchers used the CRISPR-Cas9 gene-editing system to cleave the extra chromosome in previously generated trisomy 21 cell lines derived from both pluripotent cells and skin fibroblasts. The technique can identify which chromosome has been duplicated, which is necessary to ensure the cell does not end up with two identical copies after removal, but instead retains one from each parent.
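Allele-specific cleavage relies on choosing guide sequences that match variants present on only one of the three chromosome copies. A minimal sketch of that selection logic follows; the haplotype sequences are invented and the guides shortened for readability, and real guide design also requires PAM sites and off-target screening.

```python
# Toy allele-specific guide selection: keep short guide sequences that occur
# in exactly one of the three chromosome copies, so Cas9 cuts only that copy.
# Haplotypes are invented and guides shortened to 8 nt; real design uses
# 20-nt guides, PAM sites, and off-target screening.

def allele_specific_guides(haplotypes, guide_len=8):
    """Return {guide: index of the single haplotype that contains it}."""
    hits = {}
    for i, seq in enumerate(haplotypes):
        for j in range(len(seq) - guide_len + 1):
            hits.setdefault(seq[j:j + guide_len], set()).add(i)
    return {g: s.pop() for g, s in hits.items() if len(s) == 1}

# Three copies of one locus, differing by single-nucleotide variants.
copies = ["ACGTACGTAAGGCCTT",
          "ACGTACGTTAGGCCTT",   # variant at position 8
          "ACGTACGTAAGGCCAT"]   # variant at position 14
unique = allele_specific_guides(copies)
# Guides spanning a variant position match exactly one copy, so cutting with
# them removes that copy alone.
```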
The study is published in the journal PNAS Nexus.
The researchers were able to remove duplicate chromosomes from both induced pluripotent stem cells and fibroblasts. Suppressing chromosomal DNA repair ability increased the rate of duplicate chromosome elimination.
The authors show that the chromosomal rescue reversibly restores both gene expression and cellular phenotypes. The approach is not yet ready for in vivo application, however, in part because the current technique can also change the retained chromosomes.
Similar approaches could eventually be used in neurons and glial cells and form the basis of novel medical interventions for people with Down syndrome, say the researchers.
Ryotaro Hashizume et al, Trisomic rescue via allele-specific multiple chromosome cleavage using CRISPR-Cas9 in trisomy 21 cells, PNAS Nexus (2025). DOI: 10.1093/pnasnexus/pgaf022
Swimming in some lakes can lead to infection with Legionella, warn scientists
Swimming in some lakes with still water can lead to infection with Legionella, bacteria that can cause pneumonia. People who engage in open-water swimming should be aware of this risk, say the authors of a practice article published in the Canadian Medical Association Journal.
Legionella infection represents a public health hazard owing to its ability to spread through exposure to natural water bodies and human-made water reservoirs.
Legionella infection is an atypical cause of community-acquired pneumonia. Referred to as legionnaires' disease, it presents with fever, fatigue, respiratory symptoms, and sometimes diarrhea. Legionella bacteria thrive in the warm, stagnant water in plumbing systems, air conditioners, public spas, and even lakes and rivers.
Risk factors for legionnaires' disease include age older than 50 years, smoking history, chronic cardiovascular or kidney disease, diabetes, and a compromised immune system.
Legionnaires' disease following lake swimming in Iowa, Canadian Medical Association Journal (2025). DOI: 10.1503/cmaj.241086. www.cmaj.ca/lookup/doi/10.1503/cmaj.241086
Stomach cancers make electrical connections with the nervous system to fuel spread, study finds
Researchers have discovered that stomach cancers make electrical connections with nearby sensory nerves and use these malignant circuits to stimulate the cancer's growth and spread.
It is the first time that electrical contacts between nerves and a cancer outside the brain have been found, raising the possibility that many other cancers progress by making similar connections.
Many cancers exploit nearby neurons to fuel their growth, but outside of cancers in the brain, these interactions have been attributed broadly to the secretion of growth factors or to indirect effects.
Now that we know the communication between the two is more direct and electrical, it raises the possibility of repurposing drugs designed for neurological conditions to treat cancer.
The wiring of neurons to cancer cells also suggests that cancer can commandeer a particularly rapid mechanism to stimulate growth.
There are many different cells surrounding cancers, and this microenvironment can sometimes provide a rich soil for their growth.
Researchers have been focusing on the role of the microenvironment's immune cells, connective tissue, and blood vessels in cancer growth but have only started to examine the role of nerves in the last two decades. What's emerged recently is how advantageous the nervous system can be to cancer.
The nervous system works faster than any of these other cells in the tumor microenvironment, which allows tumors to more quickly communicate and remodel their surroundings to promote their growth and survival.
Earlier, the researchers had discovered that cutting the vagus nerve in mice with stomach cancer significantly slowed tumor growth and increased survival.
Feb 20
Many different types of neurons are contained in the vagus nerve, but the researchers focused here on sensory neurons, which reacted most strongly to the presence of stomach cancer in mice. Some of these sensory neurons extended themselves deep into stomach tumors in response to a protein released by cancer cells called Nerve Growth Factor (NGF), drawing the cancer cells close to the neurons.
After establishing this connection, tumors signaled the sensory nerves to release the neuropeptide calcitonin gene-related peptide (CGRP), inducing electrical signals in the tumor.
Though the cancer cells and neurons may not form classical synapses where they meet—the team's electron micrographs are still a bit fuzzy—"there's no doubt that the neurons create an electric circuit with the cancer cells," the researchers say. "It's a slower response than a typical nerve-muscle synapse, but it's still an electrical response."
The researchers could see this electrical activity with calcium imaging, a technique that uses fluorescent tracers that light up when calcium ions surge into a cell as an electrical impulse travels through.
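Calcium-imaging analysis usually boils down to computing the relative fluorescence change (dF/F) against a baseline and flagging frames that exceed a threshold. A minimal sketch with a synthetic trace; the numbers and the 50% threshold are illustrative, not values from this study.

```python
# Toy calcium-imaging analysis: compute dF/F against a baseline and flag
# frames where the signal crosses a threshold (a putative calcium transient).
# The trace, baseline window, and threshold are synthetic, for illustration.

def delta_f_over_f(trace, baseline_frames=4):
    """dF/F per frame, with F0 taken as the mean of the first frames."""
    f0 = sum(trace[:baseline_frames]) / baseline_frames
    return [(f - f0) / f0 for f in trace]

def transient_frames(dff, threshold=0.5):
    """Indices where dF/F exceeds the threshold."""
    return [i for i, v in enumerate(dff) if v > threshold]

# Flat baseline around F = 100, then a surge as calcium ions flood in.
trace = [100, 101, 99, 100, 160, 180, 155, 105, 100]
dff = delta_f_over_f(trace)
events = transient_frames(dff)   # frames 4-6, where the tracer lights up
```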
"There's a circuit that starts from the tumor, goes up toward the brain, and then turns back down toward the tumor again," they confirm. "It's like a feed-forward loop that keeps stimulating the cancer and promoting its growth and spread."
For stomach cancer, CGRP inhibitors that are currently used to treat migraines could potentially short-circuit the electrical connection between tumors and sensory neurons.
Timothy Wang, Nociceptive neurons promote gastric tumor progression via a CGRP-Ramp1 axis, Nature (2025). DOI: 10.1038/s41586-025-08591-1. www.nature.com/articles/s41586-025-08591-1
Synthetic diamond with hexagonal lattice outshines the natural kind with unprecedented hardness
A team of physicists, materials scientists and engineers has grown a diamond that is harder than those found in nature. In their project, reported in the journal Nature Materials, the group developed a process that involves heating and compressing graphite to create synthetic diamonds.
Diamonds are a prized gem the world over. Their sparkling appearance has made them one of the most treasured gemstones throughout human history. In more recent times, because they are so hard, diamonds have also been used in commercial applications, such as drilling holes through other hard materials. Such attributes have kept the price of diamonds high. Because of that, scientists have developed ways to synthesize them, and today, a host of synthetic diamonds are available for sale.
Researchers have previously pursued harder diamonds with hexagonal rather than cubic lattices, the form in which most natural and synthetic diamonds grow. However, past attempts to make hexagonal-lattice synthetic diamonds have resulted in diamonds that were too small and of low purity.
In this new effort, the research team tried a different approach. They developed a process that involved heating graphite samples to high temperatures inside a high-pressure chamber. By adjusting the parameters of their setup, the researchers found they could get the graphite to grow into a synthetic diamond with a hexagonal lattice.
The first diamond made by the group was millimeter-sized, with a measured hardness of 155 GPa and thermal stability up to 1,100°C. Natural diamonds typically have a hardness of 70 to 100 GPa and can only withstand temperatures up to 700°C.
The researchers note that it is unlikely diamonds made using their technique would be used for jewelry; instead, they would be used for drilling and machining applications. They also note that they might be used in other ways, such as for data storage or in thermal management applications.
Desi Chen et al, General approach for synthesizing hexagonal diamond by heating post-graphite phases, Nature Materials (2025). DOI: 10.1038/s41563-025-02126-9
Artificial sweetener triggers insulin spike, leading to blood vessel inflammation in mice
From diet soda to zero-sugar ice cream, artificial sweeteners have been touted as a guilt-free way to indulge our sweet tooth. However, new research published in Cell Metabolism shows that aspartame, one of the most common sugar substitutes, may impact vascular health.
The team of cardiovascular health experts and clinicians found that aspartame triggers increased insulin levels in animals, which in turn contributes to atherosclerosis—buildup of fatty plaque in the arteries, which can lead to higher levels of inflammation and an increased risk of heart attacks and stroke over time.
Previous research has linked consumption of sugar substitutes to increased chronic disorders like cardiovascular disease and diabetes. However, the mechanisms involved were previously unexplored.
For this study, the researchers fed mice daily doses of food containing 0.15% aspartame for 12 weeks—an amount that corresponds to consuming about three cans of diet soda each day for humans.
Compared with mice not fed the sweetener, aspartame-fed mice developed larger and fattier plaques in their arteries and exhibited higher levels of inflammation, both of which are hallmarks of compromised cardiovascular health.
When the team analyzed the mice's blood, they found a surge in insulin levels after aspartame entered their system. The team noted that this wasn't a surprising result, given that our mouths, intestines, and other tissues are lined with sweetness-detecting receptors that help guide insulin release. But aspartame, 200 times sweeter than sugar, seemed to trick the receptors into releasing more insulin.
The researchers then demonstrated that the mice's elevated insulin levels fueled the growth of fatty plaques in the mice's arteries, suggesting that insulin may be the key link between aspartame and cardiovascular health.
Next, they investigated how exactly elevated insulin levels lead to arterial plaque buildup and identified an immune signal called CX3CL1 that is especially active under insulin stimulation.
Because blood flow through the artery is strong and robust, most chemicals would be quickly washed away as the heart pumps. Surprisingly, not CX3CL1. It stays glued to the surface of the inner lining of blood vessels. There, it acts like bait, catching immune cells as they pass by.
Many of these trapped immune cells are known to stoke blood vessel inflammation. However, when researchers eliminated CX3CL1 receptors from one of the immune cells in aspartame-fed mice, the harmful plaque buildup didn't occur. These results point to CX3CL1's role in aspartame's effects on the arteries.
Artificial sweeteners have penetrated almost all kinds of food, so we have to be careful about consuming such foods, say the researchers.
Sweetener aspartame aggravates atherosclerosis through insulin-triggered inflammation, Cell Metabolism (2025). DOI: 10.1016/j.cmet.2025.01.006. www.cell.com/cell-metabolism/f … 1550-4131(25)00006-3
Cancer cells cooperate to scavenge for nutrients, scientists discover
Cancer cells work together to source nutrients from their environment—a cooperative process that was previously overlooked by scientists but may be a promising target for treating cancer.
Researchers identified cooperative interactions among cancer cells that allow them to proliferate. Thinking about the mechanisms that tumor cells exploit can inform future therapies.
Scientists have long known that cancer cells compete with one another for nutrients and other resources. Over time, a tumor becomes more and more aggressive as it becomes dominated by the strongest cancer cells.
However, ecologists also know that living organisms cooperate, particularly under harsh conditions. For instance, penguins form tight huddles to conserve heat during the extreme cold of winter, with their huddles growing in size as temperatures drop—a behavior not observed during warmer months.
Similarly, microorganisms such as yeast work together to find nutrients, but only when facing starvation.
Cancer cells also need nutrients to thrive and replicate into life-threatening tumors, but they live in environments where nutrients are scarce. Is it possible that cancer cells work together to scavenge for resources?
To determine whether cancer cells cooperate, the researchers tracked the growth of cells from different types of tumors. Using a robotic microscope and image analysis software they developed, they quickly counted millions of cells under hundreds of conditions over time.
This approach allowed them to examine tumor cultures at a range of densities—from sparsely populated dishes to those crowded with cancer cells—and with different levels of nutrients in the environment.
It is well known that cells take up amino acids competitively. But when the cancer cells studied were starved of amino acids such as glutamine, all of the cell types tested showed a strong need to work together to acquire the available nutrients.
Surprisingly, the researchers observed that limiting amino acids benefited larger cell populations, but not sparse ones, suggesting that this is a cooperative process that depends on population density. It became really clear that there was true cooperation among tumor cells.
Through additional experiments with skin, breast, and lung cancer cells, the researchers determined that a key source of nutrients for cancer cells comes from oligopeptides—pieces of proteins made up of small chains of amino acids—found outside the cell.
What makes this process cooperative is that, instead of grabbing these peptides and ingesting them internally, tumor cells secrete a specialized enzyme that digests the peptides into free amino acids. Because this digestion happens outside the cells, the result is a shared pool of amino acids that becomes a common good.
Identifying the enzyme secreted by the cancer cells, CNDP2, was an important discovery.
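The density dependence described above can be captured in a toy model: enzyme secretion, and hence amino-acid production, scales with cell number, but every cell benefits from the full shared pool. A hedged sketch with invented rate constants, not a model from the paper:

```python
# Toy model of cooperative scavenging: secreted enzyme digests extracellular
# peptides into a shared amino-acid pool. Production scales with cell number,
# but each cell sees the full pool concentration (a common good), so the
# per-cell benefit rises with density. All rate constants are invented.

def steady_state_pool(n_cells, peptides=1000.0, k_digest=0.002, loss=0.5):
    """Steady-state shared amino-acid concentration: production / loss."""
    production = peptides * k_digest * n_cells   # more cells, more enzyme
    return production / loss

sparse = steady_state_pool(10)    # a sparse dish builds only a small pool
dense = steady_state_pool(500)    # a crowded dish builds a much larger one
```

Because every cell draws on the same concentration, the dense culture's cells each see a richer nutrient environment, matching the observation that amino-acid limitation favored large populations over sparse ones.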
Carlos Carmona-Fontaine, Cooperative nutrient scavenging is an evolutionary advantage in cancer, Nature (2025). DOI: 10.1038/s41586-025-08588-w. www.nature.com/articles/s41586-025-08588-w
Mountain ranges could be hidden treasure troves of natural hydrogen, plate tectonic modeling finds
The successful development of sustainable georesources for the energy transition is a key challenge for humankind in the 21st century. Hydrogen gas (H2) has great potential to replace current fossil fuels while simultaneously eliminating the associated emission of CO2 and other pollutants.
However, a major obstacle is that H2 must first be produced. Current synthetic hydrogen production is at best based on renewable energy, but it can also be polluting if fossil energy is used.
The solution may be found in nature, since various geological processes can generate hydrogen. Yet, until now, it has remained unclear where we should be looking for potentially large-scale natural H2 accumulations.
A team of researchers presents an answer to this question: using plate tectonic modeling, they found that mountain ranges in which originally deep mantle rocks are found near the surface represent potential natural hydrogen hotspots.
Such mountain ranges may not only be ideal geological environments for large-scale natural H2 generation, but also for forming large-scale H2 accumulations that can be drilled for H2 production.
The results of this research have been published in Science Advances.
Natural hydrogen can be generated in several ways, for instance by bacterial transformation of organic material or splitting of water molecules driven by decay of radioactive elements in the Earth's continental crust.
As a result, the occurrence of natural H2 is reported in many places worldwide. The general viability of natural hydrogen as an energy source has already been proven in Mali, where limited volumes of H2 originating from iron-rich sedimentary layers are produced through boreholes in the subsurface.
However, the most promising mechanism for large-scale natural hydrogen generation is a geological process in which mantle rocks react with water. The minerals in the mantle rocks change their composition and form new minerals of the so-called serpentine group, as well as H2 gas.
This process is called serpentinization. Mantle rocks are normally situated at great depth, below the Earth's crust. For these rocks to come into contact with water and serpentinize, they must be tectonically exhumed, i.e., brought near the Earth's surface.
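The hydrogen yield of serpentinization can be estimated from the textbook fayalite reaction, in which ferrous iron is oxidized and water is reduced to H2. A back-of-envelope sketch; the 10 wt% fayalite fraction is an assumed, illustrative value, not a number from the study.

```python
# Back-of-envelope H2 yield from serpentinizing the iron-bearing fraction of
# mantle rock, via the textbook fayalite reaction:
#   3 Fe2SiO4 + 2 H2O -> 2 Fe3O4 + 3 SiO2 + 2 H2
# The 10 wt% fayalite fraction is an assumed, illustrative value; real
# peridotites vary, and temperature and kinetics control the actual yield.

M_FAYALITE = 0.204   # kg/mol, Fe2SiO4
M_H2 = 0.002         # kg/mol

def h2_yield_kg(rock_kg, fayalite_frac=0.10):
    """Upper-bound H2 mass from fully serpentinizing a rock's fayalite."""
    mol_fayalite = rock_kg * fayalite_frac / M_FAYALITE
    return mol_fayalite * (2.0 / 3.0) * M_H2   # 2 mol H2 per 3 mol fayalite

# One tonne of rock with 10% fayalite yields roughly 0.65 kg of H2 at most,
# which is why exhuming large rock volumes matters for economic accumulations.
yield_kg = h2_yield_kg(1000.0)
```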
There are two main plate tectonic environments in which mantle rocks are exhumed and serpentinized over the course of millions of years: 1) ocean basins that open as continents break apart during rifting, allowing the mantle to rise as the overlying continental crust is thinned and eventually split (for example in the Atlantic Ocean), and 2) subsequent basin closure and mountain building as continents move back together and collide, allowing mantle rocks to be pushed up towards the surface (for example in the Pyrenees and Alps).
A thorough understanding of how such tectonic environments evolve is key to properly assess their natural hydrogen potential. Using a state-of-the-art numerical plate tectonic modeling approach, calibrated with data from natural examples, the GFZ-led research team simulated the full plate tectonic evolution from initial rifting to continental break-up, followed by basin closure and mountain building.
In these simulations, the researchers were able to determine for the first time where, when, and how much mantle rock is exhumed in mountains, and when these rocks may be in contact with water at favorable temperatures, allowing for efficient serpentinization and natural hydrogen generation.
It turns out that conditions for serpentinization, and thus natural H2 generation, are considerably better in mountain ranges than in rift basins. Due to the comparatively colder environment in mountain ranges, larger volumes of exhumed mantle rocks are found at favorable serpentinization temperatures of 200–350°C, and at the same time, plenty of water circulation along large faults within the mountains can allow their serpentinization potential to be realized.
As a result, the annual hydrogen generation capacity in mountain ranges can be up to 20 times greater than in rift environments. In addition, suitable reservoir rocks (for example sandstones) required for the accumulation of economically viable natural H2 volumes are readily available in mountain ranges, but are likely absent during serpentinization and hydrogen generation in the deeper parts of rift basins.