Pancreatic insulin disruption triggers bipolar disorder-like behaviors in mice, study shows
Bipolar disorder is a psychiatric disorder characterized by alternating episodes of depression (i.e., low mood and a loss of interest in everyday activities) and mania (i.e., a state in which arousal and energy levels are abnormally high). On average, an estimated 1–2% of people worldwide are diagnosed with bipolar disorder at some point during their lives.
Bipolar disorder can be highly debilitating, particularly if left untreated. Understanding the neural and physiological processes that contribute to its emergence could thus be very valuable, as it could inform the development of new prevention and treatment strategies.
In addition to experiencing periodic changes in mood, individuals diagnosed with this disorder often exhibit some metabolic symptoms, including changes in their blood sugar levels. While some previous studies reported an association between blood sugar control mechanisms and bipolar disorder, the biological link between the two has not yet been uncovered.
Researchers recently carried out a study aimed at further exploring the link between insulin secretion and bipolar disorder-like behaviors, particularly focusing on the expression of the gene RORβ.
Their findings, published in Nature Neuroscience, show that an overexpression of this gene in a subtype of pancreatic cells disrupts the release of insulin, which in turn prompts a feedback loop with a region of the brain known as the hippocampus, producing alternating depression-like and mania-like behaviors in mice.
The results in mice point to a pancreas–hippocampus feedback mechanism by which metabolic and circadian factors cooperate to generate behavioral fluctuations, and which may play a role in bipolar disorder, wrote the authors.
Yao-Nan Liu et al, A pancreas–hippocampus feedback mechanism regulates circadian changes in depression-related behaviors, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02040-y.
Shampoo-like gel could help chemo patients keep their hair
Cancer fighters know that losing their hair is often part of the battle, but researchers have developed a shampoo-like gel that has been tested in animal models and could protect hair from falling out during chemotherapy treatment.
Hair loss from chemotherapy-induced alopecia causes personal, social and professional anxiety for those who experience it. Currently, there are few solutions: the only approved options are cold caps worn on the patient's head, which are expensive and have their own extensive side effects.
The gel is a hydrogel, which absorbs a lot of water and provides long-lasting delivery of drugs to the patient's scalp. The hydrogel is designed to be applied to the patient's scalp before the start of chemotherapy and left on their head as long as the chemotherapy drugs are in their system—or until they are ready to easily wash it off.
During chemotherapy treatment, chemotherapeutic drugs circulate throughout the body. When these drugs reach the blood vessels surrounding the hair follicles on the scalp, they kill or damage the follicles, which releases the hair from the shaft and causes it to fall out. The gel, containing the drugs lidocaine and adrenalone, prevents most of the chemotherapy drugs from reaching the hair follicle by restricting the blood flow to the scalp. Dramatic reduction in drugs reaching the follicle will help protect the hair and prevent it from falling out.
To support practical use of this "shampoo," the gel is designed to be temperature responsive: at body temperature, it is thicker and clings to the patient's hair and scalp, while at slightly cooler temperatures it becomes thinner and more liquid-like, so it can easily be washed away.
Romila Manchanda et al, Hydrogel-based drug delivery system designed for chemotherapy-induced alopecia, Biomaterials Advances (2025). DOI: 10.1016/j.bioadv.2025.214452
A research team has identified a direct molecular link between aging and neurodegeneration by investigating how age-related changes in cell signaling contribute to toxic protein aggregation.
Although aging is the biggest risk factor for neurodegenerative diseases, scientists still don't fully understand which age-associated molecular alterations drive their development.
Using the small nematode worm Caenorhabditis elegans, a research team studied a signaling pathway that leads to pathological protein accumulation with age. Their new paper is published in Nature Aging.
The team focused on the aging-associated protein EPS8 and the signaling pathways it regulates. This protein is known to accumulate with age and to activate harmful stress responses that lead to a shorter lifespan in worms.
Researchers found that increased levels of EPS8, and the activation of its signaling pathways, drive pathological protein aggregation and neurodegeneration—typical features of age-associated neurodegenerative diseases such as Huntington's disease and amyotrophic lateral sclerosis (ALS). By reducing EPS8 activity, the group was then able to prevent the build-up of the toxic protein aggregates and preserve neuronal function in worm models of these two diseases.
Importantly, EPS8 and its signaling partners are evolutionarily conserved and also present in human cells. Similar to what they achieved in the worms, the team was able to prevent the accumulation of toxic protein aggregates in human cell models of Huntington's disease and ALS by reducing EPS8 levels.
Seda Koyuncu et al, The aging factor EPS8 induces disease-related protein aggregation through RAC signaling hyperactivation, Nature Aging (2025). DOI: 10.1038/s43587-025-00943-w
Cooling pollen sunscreen can block UV rays without harming corals
Materials scientists have invented the world's first pollen-based sunscreen derived from Camellia flowers.
In experiments, the pollen-based sunscreen absorbed and blocked harmful ultraviolet (UV) rays as effectively as commercially available sunscreens, which commonly use minerals like titanium dioxide (TiO2) and zinc oxide (ZnO). In laboratory tests on corals, commercial sunscreen induced coral bleaching in just two days, leading to coral death by day six. Each year, an estimated 6,000 to 14,000 tons of commercial sunscreen make their way into the ocean, as people wash it off in the sea or it flows in from wastewater.
In contrast, the pollen-based sunscreen did not affect the corals, which remained healthy even up to 60 days.
In other tests, the pollen-based sunscreen also demonstrated its ability to reduce surface skin temperature, thereby helping to keep the skin cool in the presence of simulated sunlight.
Why we slip on ice: Physicists challenge centuries-old assumptions
For over a hundred years, schoolchildren around the world have learned that ice melts when pressure and friction are applied. When you step out onto an icy pavement in winter, you can slip up because of the pressure exerted by your body weight through the sole of your (still warm) shoe. But it turns out that this explanation misses the mark.
New research reveals that it's not pressure or friction that causes ice to become slippery, but rather the interaction between molecular dipoles in the ice and those on the contacting surface, such as a shoe sole.
The work is published in the journal Physical Review Letters. This insight overturns a paradigm established nearly two centuries ago by Lord Kelvin's brother, James Thomson, who proposed that pressure and friction, alongside temperature, contribute to ice melting.
It turns out that neither pressure nor friction plays a particularly significant part in forming the thin liquid layer on ice.
Instead, computer simulations by researchers reveal that molecular dipoles are the key drivers behind the formation of this slippery layer, which so often causes us to lose our footing in winter. But what exactly is a dipole? A molecular dipole arises when a molecule has regions of partial positive and partial negative charge, giving the molecule an overall polarity that points in a specific direction.
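For readers who want the quantity behind the word, the standard textbook definition of a dipole moment is given below; this is general physics background, not a formula taken from the new paper.

```latex
% Electric dipole moment of two opposite partial charges +q and -q
% separated by a displacement vector d (pointing from - to +):
\vec{p} = q\,\vec{d}
% Example (well-established value, not from this study): the water molecule
% has |\vec{p}| \approx 1.85\ \mathrm{D} \approx 6.2 \times 10^{-30}\ \mathrm{C\,m}.
```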
Achraf Atila et al, Cold Self-Lubrication of Sliding Ice, Physical Review Letters (2025). DOI: 10.1103/1plj-7p4z
A specific bacterial infection during pregnancy that can cause severe harm to the unborn brain has been identified for the first time, in a finding that could have huge implications for prenatal health.
Previous studies have disagreed on whether fetal exposure to Ureaplasma parvum has a detrimental effect on brain development, so newborn health specialists set out to determine the answer, once and for all.
Ureaplasma parvum Serovars 3 and 6 are among the most common types that are isolated in pregnancies complicated by infection/inflammation, so they tested them individually in a pre-clinical model and the results were clear.
They showed that long-term exposure to a specific subtype of Ureaplasma (parvum 6) resulted in loss of cells that are responsible for the production of myelin (the fatty sheath that insulates nerve cells in the brain).
This resulted in less myelin production and a disruption to the architecture of myelin in the brain. This sort of disruption to myelin production can have a devastating and lifelong impact on neurodevelopment, cognition and motor function.
By contrast, they also showed that exposure to another subtype of Ureaplasma (parvum 3) had little effect on neurodevelopment.
Many of the babies affected by this infection in utero are at a much higher risk of preterm birth and the chronic intensive care and inflammation associated with being born too early, the researchers say.
Dima Abdu et al, Intra-amniotic infection with Ureaplasma parvum causes serovar-dependent white matter damage in preterm fetal sheep, Brain Communications (2025). DOI: 10.1093/braincomms/fcaf182
After early-life stress, astrocytes can affect behaviour
Astrocytes in the lateral hypothalamus region of the brain, an area involved in the regulation of sleep and wakefulness, play a key role in neuron activity in mice and affect their behavior, researchers have found.
By broadening medical science's understanding of cerebral mechanisms, the discovery could someday help in the treatment and prevention of depression in humans, the researchers say.
According to the scientific literature, early-life stress leads to a five-fold increase in the risk of developing a mental-health disorder as an adult, notably causing treatment-resistant disorders.
As brain cells, astrocytes are sensitive to variations in the blood concentration of metabolites and, in response to changes in the blood, astrocytes can modulate the extent of their interaction with neurons, their neighboring cells.
In mice, those changes are particularly responsive to the level of corticosterone, the stress hormone in the rodents' blood.
In adult mice that experienced early-life stress, researchers saw abnormally high levels of corticosterone. The impact of stress on behavior also differed according to sex: females were less active at night, while males were hyperactive during the day.
In people with depression who have experienced a similar type of stress, these sex differences have also been observed.
Lewis R. Depaauw-Holt et al, A divergent astrocytic response to stress alters activity patterns via distinct mechanisms in male and female mice, Nature Communications (2025). DOI: 10.1038/s41467-025-61643-y
Is this the future of food? 'Sexless' seeds that could transform farming
Scientists are tinkering with plant genes to create crops that seed their own clones, with a host of benefits for farmers.
Sacks of seeds without the sex
Agriculture is on the brink of a revolution: grain crops that produce seeds asexually. The technology — trials of which could start sprouting as early as next month — exploits a quirk of nature called apomixis, in which plants create seeds that produce clones of the parent. Apomixis could slash the time needed to create new varieties of crops, and give smallholder farmers access to affordable high-yielding sorghum (Sorghum bicolor) and cowpea (Vigna unguiculata). But before self-cloning crops can be commercialized, the technology must run the regulatory gauntlet.
Macaws learn by watching interactions of others, a skill never seen in animals before
One of the most effective ways we learn is through third-party imitation, where we observe and then copy the actions and behaviors of others. Until recently, this was thought to be a unique human trait, but a new study published in Scientific Reports reveals that macaws also possess this ability.
Second-party imitation is already known to exist in the animal kingdom. Parrots are renowned for their ability to imitate human speech and actions, and primates, such as chimpanzees, have learned to open a puzzle box by observing a human demonstrator. But third-party imitation is different because it involves learning by observing two or more individuals interact rather than by direct instruction.
Scientists chose blue-throated macaws for this study because they live in complex social groups in the wild, where they need to learn new behaviors to fit in quickly. Parrots, like macaws, are also very smart and can do things like copy sounds and make tools.
To find out whether macaws could learn through third-party imitation, researchers worked with two groups of them, performing more than 4,600 trials. In the test group, these birds watched another macaw perform one of five different target actions in response to a human's hand signals. These were fluffing up feathers, spinning its body, vocalizing, lifting a leg or flapping its wings. In the control group, macaws were given the same hand signals without ever seeing another bird perform the actions.
The results were clear. The test group learned more actions than the control group, meaning that watching the interactions between the demonstrator macaw and the human experimenter helped them learn specific behaviors. They also learned these actions faster and performed them more accurately than the control group. The study's authors suggest that this ability to learn from passively observing two or more individuals helps macaws adapt to new social situations and may even contribute to their own cultural traditions.
The research shows that macaws aren't just smart copycats. They may also have their own complex social lives and cultural norms, similar to humans.
Esha Haldar et al, Third-party imitation is not restricted to humans, Scientific Reports (2025). DOI: 10.1038/s41598-025-11665-9
Physicists create a new kind of time crystal that humans can actually see
Imagine a clock that doesn't have electricity, but its hands and gears spin on their own for all eternity. In a new study, physicists have used liquid crystals, the same materials that are in your phone display, to create such a clock—or, at least, as close as humans can get to that idea. The team's advancement is a new example of a "time crystal." That's the name for a curious phase of matter in which the pieces, such as atoms or other particles, exist in constant motion.
The researchers aren't the first to make a time crystal, but their creation is the first that humans can actually see, which could open a host of technological applications. The crystals can be observed directly under a microscope and even, under special conditions, by the naked eye.
In the study, the researchers designed glass cells filled with liquid crystals—in this case, rod-shaped molecules that behave a little like a solid and a little like a liquid. Under special circumstances, if you shine a light on them, the liquid crystals will begin to swirl and move, following patterns that repeat over time.
Under a microscope, these liquid crystal samples resemble psychedelic tiger stripes, and they can keep moving for hours—similar to that eternally spinning clock.
Hanqing Zhao et al, Space-time crystals from particle-like topological solitons, Nature Materials (2025). DOI: 10.1038/s41563-025-02344-1
Ant Queens Produce Offspring of Two Different Species
Some ant queens can produce offspring of more than one species – even when the other species is not known to exist in the nearby wild.
No other animal on Earth is known to do this, and it's hard to believe.
This is a bizarre story of a system that allows things to happen that seem almost unimaginable.
Some queen ants are known to mate with other species to produce hybrid workers, but Iberian harvester ants (Messor ibericus) on the island of Sicily go even further, blurring the lines of species-hood. These queens can give birth to cloned males of another species entirely (Messor structor), with which they can mate to produce hybrid workers.
Their reproductive strategy is akin to "sexual domestication", the researchers say. The queens have come to control the reproduction of another species, which they exploited from the wild long ago – similar to how humans have domesticated dogs.
In the past, the Iberian ants seem to have stolen the sperm of another species they once depended on, creating an army of male M. structor clones to reproduce with whenever they wanted.
According to genetic analysis, the colony's offspring are two distinct species, and yet they share the same mother.
Some are the hairy M. ibericus, and the others the hairless M. structor – the closest wild populations of which live more than a thousand kilometers away.
The queen can mate with males of either species in the colony to reproduce. Mating with M. ibericus males will produce the next generation of queens, while mating with M. structor males will result in more workers. As such, all workers in the colony are hybrids of M. ibericus and M. structor.
The two species diverged from each other more than 5 million years ago, and yet today, M. structor is a sort of parasite living in the queen's colony. But she is hardly a victim.
The Iberian harvester queen ant controls what happens to the DNA of her clones. When she reproduces, she can do so asexually, producing a clone of herself. She can also fertilize her egg with the sperm of her own species or M. structor, or she can 'delete' her own nuclear DNA and use her egg as a vessel solely for the DNA of her male M. structor clones. This means her offspring can either be related to her, or to another species. The only similarity is that both groups contain the queen's mitochondrial DNA. The result is greater diversity in the colony without the need for a wild neighboring species to mate with.
It also means that Iberian harvester ants belong to a 'two-species superorganism' – which the researchers say "challenges the usual boundaries of individuality."
The M. structor males produced by M. ibericus queens don't look exactly the same as males produced by M. structor queens, but their genomes match.
Electrical stimulation can reprogram immune system to heal the body faster
Scientists have discovered that electrically stimulating macrophages—one of the immune system's key players—can reprogram them in such a way as to reduce inflammation and encourage faster, more effective healing in disease and injury.
This breakthrough uncovers a potentially powerful new therapeutic option, with further work ongoing to delineate the specifics.
Macrophages are a type of white blood cell with several high-profile roles in our immune system. They patrol around the body, surveying for bugs and viruses, as well as disposing of dead and damaged cells, and stimulating other immune cells—kicking them into gear when and where they are needed.
However, their actions can also drive local inflammation in the body, which can sometimes get out of control and become problematic, causing more damage to the body than repair. This is present in lots of different diseases, highlighting the need to regulate macrophages for improved patient outcomes.
In the study, published in the journal Cell Reports Physical Science, the researchers worked with human macrophages isolated from healthy donor blood samples.
They stimulated these cells using a custom bioreactor to apply electrical currents and measured what happened.
The scientists discovered that this stimulation caused a shift of macrophages into an anti-inflammatory state that supports faster tissue repair; a decrease in inflammatory marker (signaling) activity; an increase in expression of genes that promote the formation of new blood vessels (associated with tissue repair as new tissues form); and an increase in stem cell recruitment into wounds (also associated with tissue repair).
Human brains explore more to avoid losses than to seek gains
Researchers traced a neural mechanism that explains why humans explore more aggressively when avoiding losses than when pursuing gains. Their work reveals how neuronal firing and noise in the amygdala shape exploratory decision-making.
Human survival has its origins in a delicate balance of exploration versus exploitation. There is safety in exploiting what is known: the local hunting grounds, the favorite foraging location, the go-to deli with the familiar menu. But exploitation carries the risk of over-reliance on the familiar, which can fail through depletion or a change in the stability of local resources.
Exploring the world in the hope of discovering better options has its own set of risks and rewards. There is the chance of finding plentiful hunting grounds, alternative foraging resources, or a new deli that offers a fresh take on old favorites. And there is the risk that new hunting grounds will be scarce, the newly foraged berries poisonous, or that the meal time will be ruined by a deli that disappoints.
Exploration-exploitation (EE) dilemma research has concentrated on gain-seeking contexts, identifying exploration-related activity in surface brain areas like cortex and in deeper regions such as the amygdala. Exploration tends to rise when people feel unsure about which option will pay off.
Loss-avoidance strategies differ from gain-seeking strategies, with links to negative outcomes such as PTSD, anxiety and mood disorders.
In the study, "Rate and noise in human amygdala drive increased exploration in aversive learning," published in Nature, researchers recorded single-unit activity to compare exploration during gain versus loss learning.
Researchers examined how strongly neurons fired just before each choice and how variable that activity was across trials. Variability served as "neural noise." Computational models estimated how closely choices followed expected value and how much they reflected overall uncertainty.
Results showed more exploration in loss trials than in gain trials. After people learned which option was better, exploration stayed higher in loss trials and accuracy fell more from its peak.
Pre-choice firing in the amygdala and temporal cortex rose before exploratory choices in both gain and loss, indicating a shared, valence-independent rate signal. Loss trials also showed noisier amygdala activity from cue to choice.
More noise was linked to higher uncertainty and a higher chance of exploring, and noise declined as learning progressed. Using the measured noise levels in decision models reproduced the extra exploration seen in loss trials. Loss aversion did not explain the gap. Researchers report two neural signals that shape exploration. A valence-independent firing-rate increase in the amygdala and temporal cortex precedes exploratory choices in both gain and loss. A loss-specific rise in amygdala noise raises the odds of exploration under potential loss, scales with uncertainty, and wanes as learning accrues.
Behavioral modeling matches this pattern, with value-only rules fitting gain choices and value-plus-total-uncertainty rules fitting loss choices. Findings point to neural variability as a lever that tilts strategy toward more trial-and-error when loss looms, while causal tests that manipulate noise remain to be done.
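The distinction between a value-only rule and a value-plus-uncertainty rule can be made concrete with a toy softmax choice model. This is only an illustrative sketch, not the authors' fitted model: the parameter names (beta, kappa) and the specific way uncertainty flattens the choice probabilities are assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax_choice(values, beta, uncertainty=None, kappa=0.0):
    """Pick an option via a softmax over estimated values.

    values      : estimated value of each option
    beta        : inverse temperature (higher = more exploitative)
    uncertainty : optional estimate of total uncertainty; scaling it by
                  kappa flattens the softmax, raising the probability of
                  exploratory (non-greedy) choices
    """
    util = np.asarray(values, dtype=float)
    if uncertainty is not None:
        # value-plus-total-uncertainty rule: more uncertainty -> flatter choices
        beta = beta / (1.0 + kappa * uncertainty)
    p = np.exp(beta * util - np.max(beta * util))
    p /= p.sum()
    return rng.choice(len(util), p=p), p

# Toy comparison: identical value estimates, but the "loss" condition
# carries extra uncertainty, mimicking the noisier amygdala signal.
values = [0.6, 0.4]
_, p_gain = softmax_choice(values, beta=5.0)                               # value-only rule
_, p_loss = softmax_choice(values, beta=5.0, uncertainty=1.0, kappa=2.0)   # value + uncertainty
print("P(choose better option), gain-like:", round(p_gain[0], 2))
print("P(choose better option), loss-like:", round(p_loss[0], 2))
```

In this toy setup, the added uncertainty term lowers the probability of picking the currently better option, which is the qualitative pattern of extra exploration the study reports for loss trials.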
From an evolutionary survival perspective, the strategy fits well with the need to seek out new resources when facing the loss of safe or familiar choices. While one might consider trying a new restaurant at any time, true seeking behavior will become a priority if the favorite location is closed for remodeling.
Tamar Reitich-Stolero et al, Rate and noise in human amygdala drive increased exploration in aversive learning, Nature (2025). DOI: 10.1038/s41586-025-09466-1
Scientists discover why the flu is more deadly for older people
Scientists have discovered why older people are more likely to suffer severely from the flu, and can now use their findings to address this risk.
In a study published in PNAS, experts discovered that older people produce a glycosylated protein called apolipoprotein D (ApoD), which is involved in lipid metabolism and inflammation, at much higher levels than in younger people. This has the effect of reducing the patient's ability to resist virus infection, resulting in a more serious disease outcome.
ApoD is therefore a target for therapeutic intervention to protect against severe influenza virus infection in the elderly which would have a major impact on reducing morbidity and mortality in the aging population.
ApoD mediates age-associated increase in vulnerability to influenza virus infection, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2423973122
There is less room to store carbon dioxide, a driver of climate change, than previously thought, a new study finds
The world has far fewer places to securely store carbon dioxide deep underground than previously thought, steeply lowering its potential to help stem global warming, according to a new study that challenges long-held industry claims about the practice.
The study, published last week in the journal Nature, found that global carbon storage capacity was about one-tenth of previous estimates after ruling out geological formations where the gas could leak, trigger earthquakes or contaminate groundwater, or that had other limitations. That means carbon capture and storage would only have the potential to reduce human-caused warming by 0.7 degrees Celsius (1.26 Fahrenheit)—far less than previous estimates of around 5-6 degrees Celsius (9-10.8 degrees Fahrenheit), researchers said.
Carbon storage is often portrayed as a way out of the climate crisis. These new findings make clear that it is a limited tool and reaffirms the extreme importance of reducing emissions as fast and as soon as possible.
Matthew J. Gidden et al, A prudent planetary limit for geologic carbon storage, Nature (2025). DOI: 10.1038/s41586-025-09423-y
Smart patch runs tests using sweat instead of blood
It is now possible to precisely assess the body's health status using only sweat instead of blood tests.
A research team has now developed a smart patch that can precisely observe internal changes through sweat when simply attached to the body. This is expected to greatly contribute to the advancement of chronic disease management and personalized health care technologies.
The researchers developed a wearable sensor that can analyze multiple metabolites in sweat simultaneously and in real time.
The research team developed a thin and flexible wearable sweat patch that can be directly attached to the skin. This patch incorporates both microchannels for collecting sweat and an ultrafine nanoplasmonic structure that analyzes sweat components using light. Thanks to this, multiple sweat metabolites can be simultaneously analyzed without the need for separate staining or labels, with just one patch application.
A nanoplasmonic structure is an optical sensor structure where nanoscale metallic patterns interact with light, designed to sensitively detect the presence or changes in concentration of molecules in sweat.
The patch was created by combining nanophotonics technology, which manipulates light at the nanometer scale (one-hundred-thousandth the thickness of a human hair) to read molecular properties, with microfluidics technology, which precisely controls sweat in channels thinner than a hair.
In other words, within a single sweat patch, microfluidic technology enables sweat to be collected sequentially over time, allowing for the measurement of changes in various metabolites without any labeling process. Inside the patch are six to 17 chambers (storage spaces), and sweat secreted during exercise flows along the microfluidic structures and fills each chamber in order.
The research team applied the patch to actual human subjects and succeeded in continuously tracking the changing components of sweat over time during exercise.
In this study, they demonstrated for the first time that three metabolites—uric acid, lactic acid, and tyrosine—can be quantitatively analyzed simultaneously, as well as how they change depending on exercise and diet.
In particular, by using artificial intelligence analysis methods, they were able to accurately distinguish signals of desired substances even within the complex components of sweat.
In the future, this technology can be expanded to diverse fields such as chronic disease management, drug response tracking, environmental exposure monitoring, and the discovery of next-generation biomarkers for metabolic diseases.
Jaehun Jeon et al, All-flexible chronoepifluidic nanoplasmonic patch for label-free metabolite profiling in sweat, Nature Communications (2025). DOI: 10.1038/s41467-025-63510-2
Microbiome instability linked to poor growth in kids
Malnutrition is a leading cause of death in children under age 5, and nearly 150 million children globally under this age have stunted growth from lack of nutrition. Although an inadequate diet is a major contributor, researchers found over a decade ago that dysfunctional communities of gut microbes play an important role in triggering malnutrition.
They have discovered that toddlers in Malawi—among the places hardest hit by malnutrition—who had a fluctuating gut microbiome showed poorer growth than kids with a more stable microbiome. All of the children were at high risk for stunting and acute malnutrition.
The findings, published in Cell, establish a pediatric microbial genome library—a public health database containing complete genetic profiles of 986 microbes from fecal samples of eight Malawian children collected over nearly a year that can be used for future studies to help predict, prevent and treat malnutrition.
Discovery reveals how chromosome ends can be protected
Researchers have uncovered a previously unknown mechanism that safeguards the chromosome ends from being mistakenly repaired by the cell. While DNA repair is vital for survival, attempts to repair the chromosome ends—called telomeres—can have catastrophic outcomes for cells.
The research, published in Nature, increases the understanding of how cancer and certain rare diseases develop.
Cells constantly monitor their DNA. A DNA helix that ends abruptly is a signal that DNA has been severely damaged—at least in most cases. The most severe DNA damage a cell can suffer is when a DNA helix breaks into two pieces.
Normally, the cells would try to promptly repair all damage to DNA. The dilemma is that our chromosomes have ends that look just like broken DNA. If cells were to "repair" them, looking for another loose end to join them with, this would lead to the fusion between two or more chromosomes, making the cell susceptible to transform into a cancer cell.
Therefore, the chromosome ends—called telomeres—must be protected from the cell's DNA repair machinery.
The cells have to constantly repair DNA damage to avoid mutations, cell death, and cancer, while at the same time, they must not repair the chromosome ends by mistake, since that would result in the same catastrophic outcome. What's the difference between damaged DNA and the natural chromosome end? This problem has been known for almost a century, but some aspects are still not completely resolved.
Although the telomeres are not to be repaired as if they were broken DNA, several DNA repair proteins can be found at the chromosome ends.
The research team has previously shown that a key repair protein, the DNA-dependent protein kinase (DNA-PK), helps in processing telomeres and in protecting them from degradation. But how DNA-PK is, at the same time, prevented from trying to repair these blunt DNA ends remained a mystery until now.
The researchers have now shown that two other proteins, called RAP1 and TRF2, have an important role to play in regulating DNA-PK.
They show genetically, biochemically and structurally how the protein RAP1, brought to telomeres by TRF2, ensures through direct interaction that DNA-PK doesn't 'repair' the telomeres.
The role of telomeres in processes such as cancer development and aging, and what happens when they do not work properly, has long interested scientists. Diseases caused by disturbances in telomere maintenance are rare but very serious, and include premature aging, a blood cell deficiency called aplastic anemia, and fibrosis in the lungs.
In about half of cases, the disease is explained by a known mutation that affects telomere stability, but in many cases, there is currently no known medical explanation for why the individual is sick. Their research is also relevant to cancer research. On one hand, inappropriate "repair" of telomeres can trigger catastrophic events leading to the accumulation of mutations and cancer.
On the other hand, cancer cells are often less efficient at repairing damage to DNA compared to normal cells. This weakness is exploited in cancer treatments and many therapies kill tumor cells by causing DNA damage or inhibiting repair, or both.
In other words, knowledge of how cells regulate DNA repair and protect telomeres has a bearing on both prevention and treatment of cancer.
Increased understanding of which proteins play key roles in these cellular processes can, in the long term, contribute to more precise and targeted treatment strategies.
Patrik Eickhoff et al, Chromosome end protection by RAP1-mediated inhibition of DNA-PK, Nature (2025). DOI: 10.1038/s41586-025-08896-1
Fathers' drinking plays role in fetal alcohol spectrum disorder, study shows
It's a well-known fact that fetal alcohol spectrum disorder (FASD) in children is caused by mothers who drink during pregnancy. But it turns out that the father's drinking habits could also affect a child's growth and development.
A team of international researchers found that a father's alcohol use may have a small but direct negative impact on a child's development by the age of seven. A father's drinking contributes to the harm caused by alcohol use during pregnancy.
The researchers analyzed data from five studies on the prevalence and characteristics of FASD among Grade 1 learners in the Western Cape to explore whether a father's drinking affects children diagnosed with FASD. The children's biological mothers or legal guardians completed a questionnaire on the risk factors for FASD.
According to the researchers, there is a growing recognition that factors beyond pregnant women's drinking habits can affect their children's development. They add that increased attention is currently paid to the role of fathers, not only as a contributing factor to women's drinking habits, but also as an independent contributing factor to the growth and development of children.
The findings show that children whose fathers drank alcohol were more likely to be shorter, have smaller heads, and score lower on verbal IQ tests. It was also clear from the study that the highest risk to the child's development exists when both parents use alcohol during pregnancy. It also appeared that 'binge drinking' by the father, but especially by both parents, has the most detrimental effect on the child's development.
Data analysis showed that between 66% and 77% of fathers of children on the FAS spectrum drank during their partner's pregnancy with the child in question. These fathers drank an average of 12 drinks per drinking day. The number of drinks that fathers drank per drinking day was significantly correlated with smaller head circumference in their children. Head circumference is used as a measure of brain development.
The researchers add that fathers who drank an average of five or more drinks per drinking day had shorter children with smaller head circumferences. These children also performed worse on measures in verbal intelligence tests.
"In general, it was found that the more fathers drank, the worse their children performed. However, it should also be noted that all these effects were observed in children whose mothers consumed alcohol during pregnancy."
They note that a father's drinking alone didn't increase the chances of a child being diagnosed with FASD. However, when the mother drank during pregnancy and the father was also a heavy drinker, the child was more likely to have the most serious symptoms of FASD.
"When looking at both parents' drinking patterns, the father's alcohol use alone didn't show a clear link to the child's physical or brain development problems. While both parents' drinking was considered, the main effects on a child's development and physical features were linked to the mother's alcohol use.
"Even after accounting for alcohol use by mothers during pregnancy, the father's drinking was still linked to lower child height, smaller head size and reduced verbal IQ. This suggests that paternal alcohol use may have its own, though limited, impact on a child's growth and development.
"Data analyses of children where both parents consumed alcohol during pregnancy had significantly negative effects on growth, head circumference, verbal intelligence and general birth defect scores than in children where neither parent consumed alcohol."
The researchers say that, while it is not yet clear whether the impact of a father's drinking on a child's growth and development stems from impaired sperm quality or other epigenetic influences (changes in how genes work that don't involve altering the DNA code itself), the father's role in the development of FASD cannot be overlooked.
Philip A. May et al, Does paternal alcohol consumption affect the severity of traits of fetal alcohol spectrum disorders?, Alcohol, Clinical and Experimental Research (2025). DOI: 10.1111/acer.70105
Why some influenza viruses are more dangerous than others
Serious infections with influenza A viruses are characterized by an excessive immune response, known as cytokine storm. It was previously unclear why some virus strains trigger these storms, while others do not. Researchers investigated 11 different influenza A virus strains and their effect on different human immune cells.
The results, published in Emerging Microbes & Infections, show that highly pathogenic avian influenza viruses infect specific kinds of immune cells and thus stimulate the production of type I interferon. This could explain why these viruses are particularly dangerous.
The new results show that not only the immune cells that have traditionally been the focus of attention for type I interferon production, but also other cells of the immune system, could be decisive in whether an influenza infection triggers an excessive immune response. This knowledge is important in order to better assess the risk of dangerous virus variants.
Influenza viruses are some of the most significant respiratory disease pathogens worldwide. While most infections are relatively mild, certain virus strains can cause severe pneumonia, leading to acute respiratory failure. Highly pathogenic influenza viruses are particularly dangerous. They often spread from birds to humans and are associated with significantly higher mortality rates.
The immune system plays a crucial role in this danger: some virus strains cause the body to release an excessive amount of messenger substances known as cytokines. If a cytokine storm occurs, the immune response will end up damaging the body's tissue more than the virus itself.
But why do some influenza viruses trigger such excessive reactions, while others only cause infections with mild progression?
The researchers tested how the flu viruses infect different immune cells and stimulate the release of messenger substances.
The aim of their research is to decipher the mechanisms behind mild and severe disease progression and to develop long-term approaches for better protection and treatment options.
The investigations showed that a certain type of immune cell, so-called plasmacytoid dendritic cells, produce large amounts of the important antiviral messenger interferon-α (IFN-α) upon infection with influenza—regardless of the virus strain. This means that these immune cells generally respond strongly to influenza viruses without the need for the virus to productively infect them.
If that is the case, why don't all influenza viruses cause disease of the same severity? The research team found that other immune cells, namely myeloid dendritic cells and different types of macrophages, are infected by highly pathogenic influenza viruses and then produce large amounts of IFN-α. Viral replication in these immune cells appears to be an important factor in the production of type I interferon and the development of an excessive immune response (cytokine storm).
These results offer an explanation for why some influenza viruses are so much more dangerous than others: it might be their ability to replicate in certain immune cells and thereby trigger an extremely strong immune reaction that leads to severe inflammation and harm to health. This knowledge can help to develop targeted therapies and better identify risk groups.
Marc A. Niles et al, Influenza A virus induced interferon-alpha production by myeloid dendritic cells and macrophages requires productive infection, Emerging Microbes & Infections (2025). DOI: 10.1080/22221751.2025.2556718
Giant DNA discovered in people's mouths could impact oral health, immunity and even cancer risk
Researchers have made a surprising discovery hiding in people's mouths: Inocles, giant DNA elements that had previously escaped detection. These appear to play a central role in helping bacteria adapt to the constantly changing environment of the mouth.
The findings, published in the journal Nature Communications, provide fresh insight into how oral bacteria colonize and persist in humans, with potential implications for health, disease and microbiome research.
They discovered Inocles, an example of extrachromosomal DNA—chunks of DNA that exist in cells, in this case bacteria, but outside their main DNA. It's like finding a book with extra footnotes stapled to it, and scientists are just starting to read them to find out what they do.
Detecting Inocles was not easy, as conventional sequencing methods fragment genetic data, making it impossible to reconstruct large elements.
To overcome this, the team applied advanced long-read sequencing techniques, which can capture much longer stretches of DNA.
The Inocle genomes turned out to be hosted by the bacterium Streptococcus salivarius.
The average Inocle genome is 350 kilobase pairs (a measure of sequence length), making it one of the largest extrachromosomal genetic elements known in the human microbiome. Plasmids, another form of extrachromosomal DNA, are at most a few tens of kilobase pairs.
This length endows Inocles with genes for various functions, including resistance to oxidative stress, DNA damage repair and cell wall-related genes, possibly involved in responding to extracellular stress.
What's remarkable is that, given the range of the human population the saliva samples represent, researchers think 74% of all human beings may possess Inocles. And even though the oral microbiome has long been studied, Inocles remained hidden all this time because of technological limitations.
Now that we know they exist, we can begin to explore how they shape the relationship between humans, their resident microbes and our oral health. And there's even some hints that Inocles might serve as markers for serious diseases like cancer, say the researchers.
Yuya Kiguchi et al, Giant extrachromosomal element "Inocle" potentially expands the adaptive capacity of the human oral microbiome, Nature Communications (2025). DOI: 10.1038/s41467-025-62406-5
Scientists find quasi-moon orbiting the Earth for the last 60 years
Everyone who has ever lived on Earth has been well-aware of the moon, but it turns out Earth also has some frequent temporary companions. These "quasi-moons" are small asteroids that enter into a kind of resonance with Earth's orbit, although they aren't technically orbiting Earth. In August, this small group of asteroids, called Arjunas, offered another companion to add to the list.
Astronomers at the Pan-STARRS observatory in Hawaii discovered the new quasi-moon, referred to as "2025 PN7," on August 2, 2025. Their research was recently published in Research Notes of the AAS. Using JPL's Horizons system and Python tools, they analyzed the orbital data and compared it to other Arjunas and quasi-satellites.
The team found that 2025 PN7 had been in a quasi-orbit for about 60 years already and would likely be nearby for another 60 or so years before departing. Compared to other quasi-moons, this period is relatively short. The quasi-moon Kamo'oalewa has an expected near-Earth orbit of around 381 years, while the total time for 2025 PN7 is 128 years.
Scientists have been aware of these quasi-satellites since 1991, when they first discovered 1991 VG—which some thought was an interstellar probe at the time.
Over three decades later, it is now widely accepted that such objects are natural and constitute a secondary asteroid belt that occupies the region in which the Earth–moon system orbits around the sun, defining the Arjuna dynamical class. The Arjunas with the most Earth-like orbits can experience temporary captures as mini-moons of our planet.
However, mini-moons are distinct from quasi-moons, like 2025 PN7, as mini-moons do temporarily orbit Earth and quasi-moons only appear to do so. Currently, there are six other known quasi-moons: 164207 Cardea (2004 GU9), 469219 Kamo'oalewa (2016 HO3), 277810 (2006 FV35), 2013 LX28, 2014 OL339 and 2023 FW13.
Carlos de la Fuente Marcos et al, Meet Arjuna 2025 PN7, the Newest Quasi-satellite of Earth, Research Notes of the AAS (2025). DOI: 10.3847/2515-5172/ae028f
Hawking and Kerr black hole theories confirmed by gravitational wave
Scientists have confirmed two long-standing theories relating to black holes—thanks to the detection of the most clearly recorded gravitational wave signal to date.
Ten years after detecting the first gravitational wave, the LIGO-Virgo-KAGRA Collaboration announced on September 10 the detection of GW250114—a ripple in spacetime which offers unprecedented insights into the nature of black holes and the fundamental laws of physics.
The study confirms Professor Stephen Hawking's 1971 prediction that when black holes collide, the event horizon area of the resulting black hole is at least as large as the sum of the individual black holes' areas: the total area cannot shrink.
Research also confirmed the Kerr nature of black holes—a set of equations developed in 1963 by New Zealand mathematician Roy Kerr elegantly explaining what space and time look like near a spinning black hole. The Kerr metric predicts effects such as space being 'dragged' around and light looping to make multiple copies of objects.
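For reference, the relevant relations are standard general relativity results rather than formulas reproduced from the new paper: the horizon area of a Kerr black hole and the area-theorem inequality that GW250114 was used to test.

```latex
% Horizon area of a Kerr black hole of mass M and dimensionless spin \chi (0 \le \chi \le 1);
% setting \chi = 0 recovers the Schwarzschild value 16\pi G^2 M^2 / c^4.
A = \frac{8\pi G^2 M^2}{c^4}\left(1 + \sqrt{1-\chi^2}\right)
% Hawking's area theorem for a merger of two black holes into a final remnant:
A_{\mathrm{final}} \;\ge\; A_1 + A_2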
Publishing their findings in Physical Review Letters, the international group of researchers note that GW250114 was detected with a signal-to-noise ratio of 80. This clarity enabled precise tests of general relativity and black hole thermodynamics.
GW250114: testing Hawking's area law and the Kerr nature of black holes, Physical Review Letters (2025). DOI: 10.1103/kw5g-d732
'Potential biosignatures' found in ancient Mars lake
A new study suggests a habitable past and signs of ancient microbial processes on Mars. Led by NASA and featuring key analysis from Imperial College London, the work has uncovered a range of minerals and organic matter in Martian rocks that point to an ancient history of habitable conditions and potential biological processes on the Red Planet.
This is a very exciting discovery of a potential biosignature, but it does not mean we have discovered life on Mars. We now need to analyze this rock sample on Earth to truly confirm whether biological processes were involved.
While driving through a valley called Neretva Vallis, NASA's Perseverance rover came across a thick succession of fine-grained mudstones and muddy conglomerates. There, it conducted a detailed analysis of these rocks, using instruments such as the Planetary Instrument for X-ray Lithochemistry (PIXL) and the Scanning Habitable Environments with Raman & Luminescence for Organics & Chemicals (SHERLOC).
By mapping the types and distributions of different sedimentary rocks, researchers were able to reconstruct the environment in which these mudstones were deposited.
Their analysis revealed a range of sedimentary structures and textures indicative of lake margin and lake bed environments, including a composition rich in minerals like silica and clays—the opposite of a river scenario, where fast-moving water would carry these tiny particles away.
This pointed to a surprising conclusion: they had found lake deposits in the bottom of a river valley.
The finding may suggest a period in the history of Jezero Crater where the valley itself was flooded, giving rise to this potentially habitable lake.
With the lake habitat scenario pinned down, the Perseverance science team turned their attention to the mudstones themselves. It was inside these rocks that they discovered a group of tiny nodules and reaction fronts, with chemical analysis revealing that these millimeter-scale structures are highly enriched in iron-phosphate and iron-sulfide minerals (likely vivianite and greigite).
These appear to have formed through redox reactions involving organic carbon, a process that could have been driven by either abiotic or—interestingly—biological chemistry. Importantly, this sets the stage for everything that happened next: the formation of this specific type of oxidized, iron- and phosphorus-rich sediment was the essential prerequisite for creating the ingredients for subsequent reactions.
Since these ingredients mirror by-products of microbial metabolism seen on Earth, it can be considered a compelling potential biosignature, raising the possibility that there was once microbial life on Mars.
Ultimately, the only way for the true origin of these structures to be determined is by returning the samples to Earth, a possibility that rests on when future missions will manage to successfully collect the samples from Mars' surface.
Climate change is driving fish stocks from countries' waters to the high seas, study finds
Fish and other marine organisms, though deeply affected by human activities, don't respect human borders. The ranges of many commercially important species in fact straddle the borders of countries' exclusive economic zones (EEZs) and international waters, known as the high seas. This arrangement, which makes fisheries management difficult, is set to get even more complicated as climate change continues to heat up the ocean, a new study says.
The study, published July 30 in the journal Science Advances, found that more than half of the world's straddling stocks will shift across the maritime borders between EEZs and the high seas by 2050. Most of these shifts will be into the high seas, where fisheries management is much more challenging and stocks are more likely to be overexploited.
Juliano Palacios-Abrantes et al, Climate change drives shifts in straddling fish stocks in the world's ocean, Science Advances (2025). DOI: 10.1126/sciadv.adq5976
Kamal Azmi et al, Putting Regional Fisheries Management Organisations' Climate Change House in Order, Fish and Fisheries (2025). DOI: 10.1111/faf.70015
Maternal gut microbiome composition may be linked to preterm births
Researchers have found that the presence of certain bacteria in the maternal gut microbiome during early pregnancy is linked to a higher risk of preterm birth. Published in the journal Cell Host & Microbe on September 10, the study reports that one particular species, Clostridium innocuum (C. innocuum), contains a gene that can degrade estradiol—an important pregnancy hormone.
This study suggests that for pregnant women or women preparing to become pregnant, it may be important to monitor their gut microbiome to prevent potential adverse pregnancy outcomes.
The researchers used samples and data from two large pregnancy cohorts—the Tongji-Huaxi-Shuangliu Birth cohort from southwest China and the Westlake Precision Birth cohort from southeast China. For the first cohort, the team collected stool samples from 4,286 participants in early pregnancy, at an average of 10.4 gestational weeks. For the second cohort, they collected stool samples from 1,027 participants in mid-pregnancy, or around 26 gestational weeks. They also collected blood samples from all participants, which were used to measure human genetic variations and hormone metabolism.
After working with the first cohort, the researchers were able to establish a comprehensive database containing rRNA-based microbial genera data, metagenome-based species data, and phenotype data such as preterm delivery status.
They used several statistical models to screen the annotated gut microbial genera and species and their relationship to preterm birth status or gestational duration. Through that work, they identified 11 genera and one species that had a statistically significant link.
The results, which were validated with the second cohort, showed that bacterial species C. innocuum—a small, rod-shaped bacteria—had the strongest connection to preterm birth. Further study of C. innocuum revealed that this species makes an enzyme that degrades estradiol—a form of estrogen that plays a pivotal role during pregnancy.
Estradiol regulates critical pathways that sustain pregnancy and initiate the process of childbirth.
The researchers propose that dysregulated estradiol levels induced by a high prevalence of C. innocuum could be the mechanism that links the gut microbiome to preterm birth. They note that because their study was based on two China-based cohorts with a relatively low prevalence of preterm birth, the findings may not be generalizable to other populations and should be validated by further work in those populations.
In the future, they hope to further elucidate the molecular mechanisms by which C. innocuum modulates preterm birth risk and potentially identify optimal intervention strategies to mitigate the bacteria's impact on pregnancy. They also seek to characterize the interaction between C. innocuum and host estrogen metabolism more generally, beyond pregnancy.
More women than ever are carrying babies conceived with someone else's egg—but few are told that this might carry greater health risks.
Pregnancies involving an embryo that doesn't share the pregnant woman's DNA are becoming more common. For many, it's a path to parenthood that would otherwise be closed. But emerging evidence suggests that these pregnancies may come with higher rates of complications, including preeclampsia, gestational diabetes and preterm birth, and that women are often not given the full picture before treatment.
As the fertility industry expands and diversifies, it's time to ask whether patients are being adequately informed about the risks of carrying another woman's egg—and whether more caution is needed in how these options are presented.
There are three situations in which a woman may carry another woman's egg in her uterus.
The most common is when a woman cannot produce her own eggs but has a functioning uterus. In this case, donor eggs and in-vitro fertilization (IVF) offer the only route to pregnancy.
The other two situations involve fertile women carrying a donated egg on behalf of someone else. This happens in cases of gestational surrogacy, where a surrogate carries a baby genetically unrelated to her, or in reciprocal IVF, also known as ROPA or co-IVF. In the latter, one woman in a same-sex couple (or a trans man) donates her egg to her partner, so that both have a biological connection to the child.
In IVF, fertilization occurs outside the body and the resulting embryo is transferred into the uterus. But what happens when the egg in the uterus has no genetic similarity to the woman carrying it? Could this cause complications for her or the baby?
To answer that question, we need to compare outcomes in these situations to pregnancies where the egg shares approximately 50% of the mother's DNA, either through natural conception or own-egg IVF. Early evidence suggests that having someone else's egg in the uterus is associated with a higher risk of obstetric complications, including preeclampsia, gestational diabetes and preterm birth.
There are three key comparisons to make. First, donor-egg IVF v own-egg IVF. For infertile women using donor eggs, the most relevant comparison is IVF with their own eggs.
Second, gestational v traditional surrogacy. In gestational surrogacy, the surrogate carries a donor egg, while in traditional surrogacy, she uses her own. Outcomes can also be compared with the surrogate's previous natural pregnancies.
Third, reciprocal IVF v own-egg IVF. In same-sex couples, reciprocal IVF can be compared to own-egg IVF to assess risks.
Part 1
A review of 11 studies comparing donor-egg IVF to own-egg IVF found that donor-egg pregnancies had significantly higher rates of hypertensive disorders in the mother, as well as preterm birth and babies that were small for their gestational age.
A separate review focusing on preeclampsia in singleton IVF pregnancies found the condition occurred in 11.2% of donor-egg pregnancies, compared to 3.9% of own-egg pregnancies.
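As a rough, unadjusted reading of those two percentages (the review itself reports more careful estimates), the implied relative risk can be computed directly:

```python
# Back-of-envelope relative risk from the reported preeclampsia rates
# (11.2% in donor-egg vs 3.9% in own-egg singleton IVF pregnancies).
donor_egg_rate = 0.112
own_egg_rate = 0.039

risk_ratio = donor_egg_rate / own_egg_rate
print(f"Unadjusted risk ratio: about {risk_ratio:.1f}x higher with donor eggs")  # ~2.9x
```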
For women who can only become pregnant using a donor egg, these risks may be worth accepting. But it's important that women are made aware of the potential complications, especially if carrying twins, which further increases risks.
Women deserve full, unbiased information about the risks. That includes knowing that carrying someone else's egg may increase the likelihood of pregnancy complications. They can then make informed decisions about whether the potential benefits outweigh the risks.
Phoebe Barry et al, The outcomes of surrogate pregnancy using a donor egg compared to the surrogate's egg: a systematic review, Preprints (2025). DOI: 10.22541/au.174919886.60741282/v1
DNA cassette tapes could solve global data storage problems
Our increasingly digitized world has a data storage problem. Hard drives and other storage media are reaching their limits, and we are creating data faster than we can store it. Fortunately, we don't have to look too far for a solution, because nature already has a powerful storage medium with DNA (deoxyribonucleic acid). It is this genetic material that researchers are using to create DNA storage cassettes.
DNA is the ultimate data storage solution because it is compact, dense and durable. It can hold an enormous amount of information in a microscopic space and preserve that data for thousands of years without needing electricity. Theoretically, the DNA in a single human cell has a capacity of approximately 3.2 gigabytes, which equates to roughly 6,000 books, 1,000 pieces of music or two movies.
Scientists have known about DNA's potential as a storage solution for a long time, but the challenge up until now has been to create a viable system that we can use. In a new study published in the journal Science Advances, researchers describe how they made a DNA cassette similar to cassette tapes that were staples in personal and car stereos in the 1980s.
The team first created the physical tape from a polyester-nylon blend. Then they printed barcode patterns on it to make millions of tiny, separate sections, similar to folders on a computer. This lets the system find the exact spot where the data is stored. Accessing information has been one of the problems of previous DNA storage techniques.
To store a file, digital data is first translated into a DNA sequence. The four bases, or building blocks of DNA (A, G, C, and T), act as a code, similar to the zeroes and ones that computers use. The researchers also coated the tape with a protective crystalline layer to protect the DNA bonds from breaking down. Finally, they proved the system works by converting a digital image into DNA, then successfully and quickly retrieving it from the tape.
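The authors' actual encoding scheme, with its addressing barcodes and protective chemistry, is more elaborate than anything shown here, but the basic idea of mapping binary data onto the four bases can be sketched as follows; the 2-bits-per-base mapping is an illustrative assumption, not the scheme used in the paper.

```python
# Minimal sketch of mapping bytes to a DNA sequence and back (2 bits per base).
# Real DNA storage systems add error correction, addressing and synthesis constraints.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {b: k for k, b in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(seq: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"hello"
dna = encode(message)
print(dna)                     # 'CGGACGCCCGTACGTACGTT'
assert decode(dna) == message  # round trip recovers the original bytes
```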
DNA cassette tape provides a strategy for fast, compact, large-scale DNA-based cold (infrequently accessed) or warm (needed on demand) data storage, wrote the scientists in their paper.
Jiankai Li et al, A compact cassette tape for DNA-based data storage, Science Advances (2025). DOI: 10.1126/sciadv.ady3406
Medications leave lasting mark on gut microbiome, even years after use
Medications taken years ago can continue to shape the human gut microbiome, according to a large-scale study.
Analyzing stool samples and prescription records from over 2,500 Estonian Biobank participants in the Estonian Microbiome cohort, researchers found that the majority of drugs studied were linked to microbiome changes, with a substantial number of them also showing long-term effects detectable years after patients stopped taking them.
The impact was not limited to antibiotics: antidepressants, beta-blockers, proton pump inhibitors, and benzodiazepines all left microbial "fingerprints."
Most microbiome studies only consider current medications, but our results show that past drug use can be just as important; it is a surprisingly strong factor in explaining individual microbiome differences.
This highlights that it is critical to account for drug usage history when studying links between the microbiome and disease. The research is published in the journal mSystems.
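The study's statistical pipeline is not shown in this summary; the sketch below only illustrates the methodological point, using synthetic data and invented variable names, that a model which ignores past medication use can misattribute microbiome variation that a model including it would capture.

```python
# Illustration of why past medication use matters as a covariate (synthetic data only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
past_abx = rng.binomial(1, 0.3, n)                    # took antibiotics in previous years (fake)
current_abx = rng.binomial(1, 0.05 + 0.25 * past_abx) # past users are more likely to be current users
diversity = 10 - 1.0 * current_abx - 0.8 * past_abx + rng.normal(0, 1, n)  # fake diversity score

# Model 1: current use only (past use becomes an unmeasured confounder)
m1 = sm.OLS(diversity, sm.add_constant(current_abx)).fit()
# Model 2: current and past use
X2 = sm.add_constant(np.column_stack([current_abx, past_abx]))
m2 = sm.OLS(diversity, X2).fit()

print("current-only model coefficient:", m1.params[1].round(2))   # biased toward a larger effect
print("current + past model coefficients:", m2.params[1:].round(2))
```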
Interestingly, benzodiazepines—commonly prescribed for anxiety—had microbiome effects comparable to broad-spectrum antibiotics. The results also show that drugs from the same class that might be used for the same condition, e.g. diazepam and alprazolam, may differ in how much they disrupt the microbiome.
Follow-up samples from a subset of participants confirmed that starting or stopping certain drugs caused predictable microbial shifts, suggesting causal effects. Despite the small sample size of the second time-point analysis, the authors were able to verify long-term effects of proton pump inhibitors, selective serotonin reuptake inhibitors and antibiotics, such as penicillins in combination and macrolides.
Oliver Aasmets et al, A hidden confounder for microbiome studies: medications used years before sample collection, mSystems (2025). DOI: 10.1128/msystems.00541-25
Breathlessness increases long-term mortality risk, finds a study in Malawi
Research led by Liverpool School of Tropical Medicine and the Malawi-Liverpool-Wellcome Program shows that over half of hospital patients with breathlessness had died within a year of admission (51%), as opposed to just 26% of those without the symptom.
Most of these patients had more than one condition that caused breathlessness, including pneumonia, anemia, heart failure and TB.
The findings demonstrate the importance of integrated, patient-centered care, researchers say, to tackle the burden of high mortality for people with breathlessness, particularly in low-income countries. The work appears in Thorax.
Most of these patients live with more than one condition at the same time, such as TB and pneumonia, which the researchers found to be linked to higher mortality. This suggests that treating diseases in isolation is not enough, and that health care models that have traditionally focused on single presenting conditions may overlook important concurrent diseases.
Acute breathlessness as a cause of hospitalisation in Malawi: a prospective, patient-centred study to evaluate causes and outcomes, Thorax (2025). DOI: 10.1136/thorax-2025-223623
Smells that deceive the brain: Research reveals how certain aromas are interpreted as taste
Flavored drinks without sugar can be perceived as sweet—and now researchers know why. A new study from Karolinska Institutet in Sweden, published in the journal Nature Communications, reveals that the brain interprets certain aromas as taste. The paper is titled "Tastes and retronasal odours evoke a shared flavour-specific neural code in the human insula."
When we eat or drink, we don't just experience taste, but rather a "flavor." This taste experience arises from a combination of taste and smell, where aromas from food reach the nose via the oral cavity, known as retronasal odor.
Researchers have now shown that the brain integrates these signals earlier than previously thought—already in the insula, a brain region known as the taste cortex—before the signals reach the frontal cortex, which controls our emotions and behavior.
The taste cortex reacts to taste-associated aromas as if they were real tastes.
The finding provides a possible explanation for why we sometimes experience taste from smell alone, for example in flavored waters. This underscores how strongly odors and tastes work together to make food pleasurable, potentially inducing craving and encouraging overeating of certain foods.
The study results showed that aromas perceived as sweet or savory not only activate the same parts of the brain's taste cortex as the actual tastes, but also evoke similar patterns of activation. This overlap was particularly evident in the parts of the taste cortex that are linked to the integration of sensory impressions.
This shows that the brain does not process taste and smell separately, but rather creates a joint representation of the flavor experience in the taste cortex.
This mechanism may be relevant for how our taste preferences and eating habits are formed and influenced.
Putu Agus Khorisantono, et al. Tastes and retronasal odours evoke a shared flavour-specific neural code in the human insula, Nature Communications (2025). DOI: 10.1038/s41467-025-63803-6
Microbial allies: Some bacteria help fight against cancer
An international team of scientists have discovered that microbes associated with tumors produce a molecule that can control cancer progression and boost the effectiveness of chemotherapy.
Most people are familiar with the microbes on the skin or in the gut, but recent discoveries have revealed that tumors also host unique communities of bacteria. Scientists are now investigating how these tumor-associated bacteria can affect tumor growth and the response to chemotherapy.
New research, published online in Cell Systems, provides a significant breakthrough in this field, identifying a powerful anti-cancer metabolite produced by bacteria associated with colorectal cancer.
This finding opens the door to new strategies for treating cancer, including the development of novel drugs that could make existing therapies more potent.
The researchers used a sophisticated large-scale screening approach to test over 1,100 conditions in C. elegans. Through this, they found that the bacteria E. coli produced a molecule called 2-methylisocitrate (2-MiCit) that could improve the effectiveness of the chemotherapy drug 5-fluorouracil (5-FU).
Using computer modeling, the team demonstrated that the tumor-associated microbiome (bacteria found within and around tumors) of patients was also able to produce 2-MiCit. To confirm the effectiveness of 2-MiCit, the team used two further systems: human cancer cells and a fly model of colorectal cancer. In both cases, they found that 2-MiCit showed potent anti-cancer properties and, in the fly model, could extend survival.
Bacteria are associated with tumors, and now scientists are starting to understand the chemical conversation they're having with cancer cells.
They found that one of these bacterial chemicals can act as a powerful partner for chemotherapy, disrupting the metabolism of cancer cells and making them more vulnerable to the drug.
The study revealed that 2-MiCit works by inhibiting a key enzyme in the mitochondria (structures inside cells that generate energy for cellular functions) of cancer cells. This leads to DNA damage and activates pathways known to reduce the progression of cancer. This multi-pronged attack weakens the cancer cells and works in synergy with 5-FU. The combination was significantly more effective at killing cancer cells than either compound alone.
These exciting discoveries highlight how the cancer-associated microbiome can impact tumor progression, and how metabolites produced by these bacteria could be harnessed to improve cancer treatments.
These findings are also important in the context of personalized medicine, emphasizing the importance of considering not only the patient, but also their microbes.
Daniel Martinez-Martinez et al, Chemotherapy modulation by a cancer-associated microbiota metabolite, Cell Systems (2025). DOI: 10.1016/j.cels.2025.101397
Researchers may have found a way to limit the debilitating damage strokes can cause
With limited treatment options for stroke patients available, researchers are developing an experimental drug that is capable of protecting the brain and improving recovery after a cerebrovascular accident, also known as a brain attack.
They targeted a small regulatory biological molecule called microRNA, which becomes abnormally elevated after stroke and promotes inflammation, contributes to tissue loss and causes a decline in neurological function.
MicroRNAs (miRNAs) are a class of non-coding RNAs, which do not translate into proteins, that play important roles in regulating gene expression.
So the researchers developed a next-generation inhibitor of this miRNA to block its harmful effects.
Unlike traditional experimental drugs that target only a single protein or molecule, this approach simultaneously suppresses multiple damaging processes by targeting several proteins. This reduces brain injury, inflammation and tissue damage while enhancing protective factors that support repair.
A pathological partnership between Salmonella and yeast in the gut
Researchers have found that a common gut yeast, Candida albicans, can help Salmonella Typhimurium take hold in the intestine and spread through the body. When the two microbes interact, a Salmonella protein called SopB prompts the yeast to release arginine, which turns on Salmonella's invasion machinery and quiets the body's inflammation signals.
Gut microbes shape human health through colonization resistance, immune training, digestion, and signaling that reaches distant organs. Bacteria dominate both abundance and research attention, while the roles of viruses and fungi remain less defined.
Altered mycobiome composition appears in multiple gastrointestinal diseases, yet how fungi integrate into gut ecology and interact with commensal and pathogenic bacteria remains largely unknown.
Non-typhoidal Salmonella ranks among the best-studied enteric pathogens, infecting an estimated 100 million people each year. Healthy individuals typically experience localized inflammatory diarrhea, while immunocompromised patients face risks of spread to peripheral organs.
Establishing gut colonization requires competition with resident microorganisms, and commensal fungi occur across tested mammalian species, yet mycobiome contributions during enteric infection remain largely unexplored.
Candida albicans is a frequent colonizer of human mucosal surfaces, present in the gut of more than 60% of healthy humans. Usual behavior is commensal, with pathogenic potential particularly in immunocompromised hosts. A key virulence trait is morphology switching from yeast to epithelium-penetrating hyphae.
Associations with inflammatory bowel disease, specifically Crohn's disease, have been reported. While C. albicans does not induce gut inflammation on its own, it has been shown to exacerbate it. Both Salmonella and C. albicans thrive under inflammatory gut conditions, and C. albicans likely resides in the gut of many patients at the time Salmonella infection occurs.
In the study, "Commensal yeast promotes Salmonella Typhimurium virulence," published in Nature, researchers investigated cross-kingdom interactions to determine how Candida albicans influences Salmonella colonization, systemic dissemination, and host inflammatory responses.
In the experiments conducted in mice, Candida in the gut led to higher Salmonella loads in the large intestine and more bacteria reaching the spleen and liver, with co-infected mice losing more weight. Candida also boosted Salmonella entry into human colon cell lines. Gene readouts showed Salmonella's invasion machinery switched on near Candida.
Co-cultures contained millimolar arginine, and adding L-arginine alone increased invasion in a dose-dependent way, while an arginine-transporter mutant did not respond to Candida. Candida lacking arginine production also failed to boost Salmonella invasion or gut colonization, and an ARG4 revertant restored the effect.
Researchers conclude that C. albicans colonization represents a susceptibility factor for Salmonella infection, with arginine acting as a pivotal metabolite connecting fungus, bacterium, and host. Findings point to SopB-driven arginine production in Candida that boosts Salmonella's invasion program while softening host inflammatory signals.
The sound of crying babies makes our faces hotter, according to new research
Hearing a baby cry can trigger a range of responses in adults, such as sympathy, anxiety and a strong urge to help. However, new research suggests that a deeper physical reaction is also occurring. A baby's cry, particularly when the baby is in pain or distress, makes our faces physically warmer.
Since they can't speak yet, babies cry to communicate their needs, whether they're in pain or want some attention. When a baby is in distress, they forcefully contract their ribcage, which produces high-pressure air that causes their vocal cords to vibrate chaotically. This produces complex disharmonious sounds known as nonlinear phenomena (NLP).
To study how adults respond to crying babies, scientists played 23 different recordings to 41 men and women with little to no experience with young infants. At the same time, a thermal infrared imaging camera measured subtle changes to their facial temperatures. A rise in temperature in this part of the body is governed by the autonomic nervous system, a network of nerves that controls unconscious processes such as breathing and digestion. After each cry, the participants rated whether the baby was in discomfort or in pain.
The study found that adults' facial temperatures change when they hear a baby cry, a clear sign that the autonomic nervous system has been activated. This suggests that people unconsciously pick up on acoustic features in a baby's cry. The higher the level of NLP (meaning a baby is in more pain or distress), the stronger the listener's facial temperature response and the more closely it tracked the cry's amplitude. In other words, as the cry grew louder, a person's face grew warmer. This physiological reaction was the same for both men and women.
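The paper's analysis pipeline is not reproduced here; as a toy illustration of what "in sync" means in practice, one can correlate a facial-temperature trace with a cry's amplitude envelope. The signals below are synthetic and the sampling rate is an arbitrary choice.

```python
# Sketch: quantifying how closely a facial-temperature trace tracks a cry's amplitude envelope.
# Synthetic signals only; the study used thermal-imaging time series and real cry recordings.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)                      # 10 s at 100 Hz (illustrative)
envelope = np.abs(np.sin(2 * np.pi * 0.3 * t))    # fake cry amplitude envelope
temperature = 0.05 * envelope + 0.01 * rng.normal(size=t.size)  # fake facial temperature change

# Pearson correlation as a simple synchrony measure (higher = more "in sync")
r = np.corrcoef(envelope, temperature)[0, 1]
print(f"envelope-temperature correlation: r = {r:.2f}")
```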
Lény Lego et al, Nonlinear acoustic phenomena tune the adults' facial thermal response to baby cries with the cry amplitude envelope, Journal of the Royal Society Interface (2025). DOI: 10.1098/rsif.2025.0150
Scientists engineer plants to double carbon uptake ability and produce more seeds and lipids
Typically, plants rely on the Calvin-Benson-Bassham (CBB) cycle to convert carbon dioxide in the atmosphere to usable organic matter for growth. Although this cycle is the main pathway for carbon fixation in all plants on Earth, it is surprisingly inefficient—losing one third of carbon in the cycle when synthesizing the molecule acetyl–coenzyme A (CoA) to generate lipids, phytohormones, and metabolites. Plants also lose carbon during photorespiration, which limits their growth. This is largely due to the inefficiency of an enzyme called RuBisCO.
In efforts to increase carbon uptake and reduce carbon loss in plants to boost biomass and lipid production, scientists have experimented with ways to increase the efficiency of RuBisCO, overexpress CBB cycle enzymes, introduce carbon-concentrating mechanisms, and reduce photorespiration losses. But a new study, published in Science, focuses on a novel approach—creating an altogether new pathway for carbon uptake.
The researchers involved in the study introduced a synthetic CO2 uptake cycle into the plant Arabidopsis thaliana. They refer to the engineered cycle as the malyl-CoA-glycerate (McG) cycle, which works in conjunction with the CBB cycle to create a dual-cycle CO2 fixation system. The new cycle increases efficiency by using previously wasted carbon.
"In the McG cycle, one additional carbon is fixed when 3PG is the input, or no carbon is lost when glycolate is the input. In both cases, acetyl-CoA is produced more efficiently, which is expected to enhance the production of lipids and other important plant metabolites, including phytohormones," the authors write.
Kuan-Jen Lu et al, Dual-cycle CO2 fixation enhances growth and lipid synthesis in Arabidopsis thaliana, Science (2025). DOI: 10.1126/science.adp3528
Scientists discover how nanoplastics disrupt brain energy metabolism
Scientists have discovered how nanoplastics—even smaller than microplastics—disrupt energy metabolism in brain cells. Their findings may have implications for better understanding neurodegenerative diseases characterized by declining neurological or brain function, and even shed new light on issues with learning and memory.
The study has revealed the specific mechanism by which these tiny nanoplastics can interfere with energy production in the brain in an animal model. The findings, recently published in the Journal of Hazardous Materials: Plastics, provide fresh insights into the potential health risks posed by environmental plastics.
Polystyrene nanoplastics (PS-NPs) are produced when larger plastics break down in the environment. These particles have been detected in multiple organs in the body, including the brain, sparking growing concerns about their possible role in neurological disease.
The researchers focused on mitochondria, which are critical for producing the energy needed for brain function. Mitochondrial dysfunction is a well-known feature of neurodegenerative diseases such as Parkinson's and Alzheimer's, as well as normal aging.
By isolating mitochondria from brain cells, the researchers showed that exposure to PS-NPs specifically disrupted the "electron transport chain," a simplified term for the set of protein complexes that work together to help generate cellular energy in the form of ATP. While individual mitochondrial complexes I and II were not directly impaired, electron transfer between complexes I–III and II–III, as well as the activity of complex IV, was significantly inhibited.
The scientists found that electron transfer between complex I–III and complex II–III was potently inhibited at much lower concentrations, suggesting environmentally relevant exposures could also impair bioenergetic function over chronic timeframes.
Interestingly, the same broad effects were seen in synaptic mitochondria, which are essential for communication between brain cells. This suggests that nanoplastics could also interfere with synaptic plasticity, a process fundamental to learning and memory.
D.M. Seward et al, Polystyrene nanoplastics target electron transport chain complexes in brain mitochondria, Journal of Hazardous Materials: Plastics (2025). DOI: 10.1016/j.hazmp.2025.100003
Dinosaur extinction reshaped Earth's rivers, study suggests
Dinosaurs had such an immense impact on Earth that their sudden extinction led to wide-scale changes in landscapes—including the shape of rivers—and these changes are reflected in the geologic record, according to a new study.
Scientists have long recognized the stark difference in rock formations from just before dinosaurs went extinct to just after, but chalked it up to sea level rise, coincidence, or other abiotic reasons. But the new study shows that once dinosaurs were extinguished, forests were allowed to flourish, which had a strong impact on rivers.
Studying these rock layers, the researchers suggest that dinosaurs were likely enormous "ecosystem engineers," knocking down much of the available vegetation and keeping land between trees open and weedy. The result was rivers that spilled openly, without wide meanders, across landscapes. Once the dinosaurs perished, forests were allowed to flourish, helping stabilize sediment and corralling water into rivers with broad meanders.
Their results, published in the journal Communications Earth & Environment, demonstrate how rapidly the Earth can change in response to catastrophic events.
Very often when we're thinking about how life has changed through time and how environments change through time, it's usually that the climate changes and, therefore, it has a specific effect on life, or this mountain has grown and, therefore, it has a specific effect on life. It's rarely thought that life itself could actually alter the climate and the landscape. The arrow doesn't just go in one direction, the researchers say.
Dinosaur extinction can explain continental facies shifts at the Cretaceous-Paleogene boundary, Communications Earth & Environment (2025). DOI: 10.1038/s43247-025-02673-8
Ants defend plants from herbivores, but can hinder pollination by bees
Around 4,000 plant species from different parts of the world secrete nectar outside their flowers, such as on their stems or leaves, through secretory glands known as extrafloral nectaries. Unlike floral nectar, extrafloral nectar does not attract pollinators; rather, it attracts insects that defend plants, such as ants. These insects feed on the sweet liquid and, in return, protect the plant from herbivores. However, this protection comes at a cost.
A study published in the Journal of Ecology points out that the presence of ants can reduce the frequency and duration that bees visit the flowers of plants with extrafloral nectaries.
Pollination is only impaired when extrafloral nectaries are close to the flowers. Plants with these glands in other locations, such as on their leaves or branches, had increased reproductive success, likely due to the protection against herbivores provided by ants.
On the other hand, butterflies, another group of pollinators, are not affected by ants. This may be due to the way these two groups feed. Butterflies use a long, straw-like organ called a proboscis to suck nectar from a distance, keeping them safe from ants.
Bees, on the other hand, need to get very close to the flower to collect pollen and floral nectar, but ants don't allow them to stay for long. Not surprisingly, the new analysis showed that the presence of ants is detrimental to pollination when extrafloral nectaries are close to flowers, but has a positive effect on plant reproduction when they're located further away.
The conclusions are the result of an analysis of data from 27 empirical studies on the relationships between ants, pollinators, and plants with extrafloral nectaries. The articles were selected from an initial screening of 567 studies after applying inclusion and exclusion criteria. The data were compiled and analyzed with computational tools.
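The specific meta-analytic software and effect sizes are not given in this summary; the sketch below shows, with invented numbers, the kind of inverse-variance pooling (here DerSimonian-Laird random effects) that such analyses typically rely on.

```python
# Minimal random-effects meta-analysis sketch (DerSimonian-Laird), with invented effect sizes.
import numpy as np

# Hypothetical per-study effects of ant presence on bee visitation (e.g., log response ratios)
effects = np.array([-0.40, -0.10, -0.55, 0.05, -0.30])
variances = np.array([0.04, 0.02, 0.09, 0.03, 0.05])

w_fixed = 1.0 / variances
mu_fixed = np.sum(w_fixed * effects) / np.sum(w_fixed)

# Between-study heterogeneity (tau^2) via DerSimonian-Laird
q = np.sum(w_fixed * (effects - mu_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(effects) - 1)) / c)

w_random = 1.0 / (variances + tau2)
mu_random = np.sum(w_random * effects) / np.sum(w_random)
se_random = np.sqrt(1.0 / np.sum(w_random))
print(f"pooled effect = {mu_random:.2f} +/- {1.96 * se_random:.2f} (95% CI half-width)")
```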
Amanda Vieira da Silva et al, Ants on flowers: Protective ants impose a low but variable cost to pollination, moderated by location of extrafloral nectaries and type of flower visitor, Journal of Ecology (2025). DOI: 10.1111/1365-2745.70087
Culture is overtaking genetics in shaping human evolution, researchers argue
Some researchers are theorizing that human beings may be in the midst of a major evolutionary shift—driven not by genes, but by culture.
In a paper published in BioScience, they argue that culture is overtaking genetics as the main force shaping human evolution.
"When we learn useful skills, institutions or technologies from each other, we are inheriting adaptive cultural practices. On reviewing the evidence, we find that culture solves problems much more rapidly than genetic evolution. This suggests our species is in the middle of a great evolutionary transition", they say.
Cultural practices—from farming methods to legal codes—spread and adapt far faster than genes can, allowing human groups to adapt to new environments and solve novel problems in ways biology alone could never match. According to the research team, this long-term evolutionary transition extends deep into the past, it is accelerating, and may define our species for millennia to come.
Cultural evolution eats genetic evolution for breakfast, they argue.
In the modern environment cultural systems adapt so rapidly they routinely "preempt" genetic adaptation. For example, eyeglasses and surgery correct vision problems that genes once left to natural selection.
Medical technologies like cesarean sections or fertility treatments allow people to survive and reproduce in circumstances that once would have been fatal or sterile. These cultural solutions, researchers argue, reduce the role of genetic adaptation and increase our reliance on cultural systems such as hospitals, schools and governments.
Today, your well-being is determined less and less by your personal biology and more and more by the cultural systems that surround you—your community, your nation, your technologies. And the importance of culture tends to grow over the long term because culture accumulates adaptive solutions more rapidly.
Over time, this dynamic could mean that human survival and reproduction depend less on individual genetic traits and more on the health of societies and their cultural infrastructure.
But, this transition comes with a twist. Because culture is fundamentally a shared phenomenon, culture tends to generate group-based solutions.
Using evidence from anthropology, biology and history, Waring and Wood argue that group-level cultural adaptation has been shaping human societies for millennia, from the spread of agriculture to the rise of modern states. They note that today, improvements in health, longevity and survival reliably come from group-level cultural systems like scientific medicine and hospitals, sanitation infrastructure and education systems rather than individual intelligence or genetic change.
The researchers argue that if humans are evolving to rely on cultural adaptation, we are also evolving to become more group-oriented and group-dependent, signaling a change in what it means to be human.
Part 1
In the history of evolution, life sometimes undergoes transitions which change what it means to be an individual. This happened when single cells evolved to become multicellular organisms and social insects evolved into ultra-cooperative colonies. These individuality transitions transform how life is organized, adapts and reproduces. Biologists have been skeptical that such a transition is occurring in humans.
But the researchers suggest that because culture is fundamentally shared, our shift to cultural adaptation also means a fundamental reorganization of human individuality—toward the group. Cultural organization makes groups more cooperative and effective. And larger, more capable groups adapt—via cultural change—more rapidly. It's a mutually reinforcing system, and the data suggest it is accelerating. For example, genetic engineering is a form of cultural control of genetic material, but genetic engineering requires a large, complex society. So, in the far future, if the hypothesized transition ever comes to completion, our descendants may no longer be genetically evolving individuals, but societal "superorganisms" that evolve primarily via cultural change.
The researchers emphasize that their theory is testable and lay out a system for measuring how fast the transition is happening. The team is also developing mathematical and computer models of the process and plans to initiate a long-term data collection project in the near future.
They caution, however, against treating cultural evolution as progress or inevitability. They are not suggesting that some societies, like those with more wealth or better technology, are morally 'better' than others. Evolution can create both good solutions and brutal outcomes. They think this work might help our whole species avoid the most brutal parts. The goal is to use their understanding of deep patterns in human evolution to foster positive social change.
Still, the new research raises profound questions about humanity's future. "If cultural inheritance continues to dominate, our fates as individuals, and the future of our species, may increasingly hinge on the strength and adaptability of our societies. And if so, the next stage of human evolution may not be written in DNA, but in the shared stories, systems, and institutions we create together," the researchers conclude.
Scientists uncover how cellular receptors trigger inflammation and sensory changes
In two new studies, scientists have uncovered detailed blueprints of how certain molecular "gates" in human cells work—findings that could open doors to new treatments for conditions ranging from certain cancers and brain diseases to hearing loss and atherosclerosis, or plaque build-up in the arteries.
They studied a group of proteins known as P2X receptors, which sit on the surface of cells and detect ATP—a molecule best known as the body's energy source inside of cells.
When ATP leaks outside of cells, often as a sign of stress or damage, P2X receptors act like alarm bells, triggering responses related to inflammation, pain and sensory processing.
Extracellular ATP is a universal danger signal. When it builds up outside cells, P2X receptors sense it and change how the cells respond. Understanding these receptors at the atomic level is key to designing drugs that can either calm them down or fine-tune their activity.
In a study published in Nature Communications, researchers examined the molecular structure of the human P2X7 receptor, a protein linked to inflammatory diseases such as cancer, Alzheimer's and atherosclerosis. Despite years of effort, no drugs targeting P2X7 have reached the clinical market, partly because drugs that have worked well in animal models have not had the same success in humans.
Building on a previous study, where they determined how to turn the rat P2X7 receptor off, the team has now mapped how drugs turn off the human P2X7 receptor for the first time. They now know what makes the human receptor different from the receptor that is present in animal models. This is important for understanding how to better customize drugs to fit the binding pockets within the human receptor.
Using that information, the researchers, in collaboration with groups from around the world, designed a new compound referred to as UB-MBX-46. The compound complements the binding pocket in the human receptor, translating to a molecule that blocks the human receptor with high precision and strength.
This is the first time scientists have visualized the human P2X7 receptor and really understood how it is different from others. With that knowledge, they can now create a drug candidate that perfectly fits binding pockets within the human receptor, much like how a key fits in a lock. It gives us hope for developing therapies that have better chances to reach the clinic.
A second study published in Proceedings of the National Academy of Sciences examined the human P2X2 receptor, a protein in the same family as the P2X7 receptor but one that is predominantly found in the cochlea, the hearing organ of the inner ear.
The P2X2 receptor is involved in hearing processes and in the ear's adaptation to loud noise. Certain genetic mutations of this receptor have been linked to hearing loss. Currently, there are no drugs that target this receptor effectively, and until now, scientists had limited insight into how it functions.
Researchers used cryo-electron microscopy—a powerful imaging method—to capture 3D structures of the human P2X2 receptor in two states: in a resting state and in a state bound to ATP but desensitized, meaning it's not active anymore. The team discovered unique structural features and pinpointed areas where hearing-related mutations occur.
Together, the studies mark a leap forward in understanding how P2X receptors contribute to a wide range of diseases by triggering inflammation and sensory changes.
Adam C. Oken et al, A polycyclic scaffold identified by structure-based drug design effectively inhibits the human P2X7 receptor, Nature Communications (2025). DOI: 10.1038/s41467-025-62643-8
Franka G. Westermann et al, Subtype-specific structural features of the hearing loss–associated human P2X2 receptor, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2417753122
Dr. Krishna Kumari Challa
How aging drives neurodegenerative diseases
A research team has identified a direct molecular link between aging and neurodegeneration by investigating how age-related changes in cell signaling contribute to toxic protein aggregation.
Although aging is the biggest risk factor for neurodegenerative diseases, scientists still don't fully understand which age-associated molecular alterations drive their development.
Using the small nematode worm Caenorhabditis elegans, a research team studied a signaling pathway that leads to pathological protein accumulation with age. Their new paper is published in Nature Aging.
The team focused on the aging-associated protein EPS8 and the signaling pathways it regulates. This protein is known to accumulate with age and to activate harmful stress responses that lead to a shorter lifespan in worms.
Researchers found that increased levels of EPS8, and the activation of its signaling pathways, drive pathological protein aggregation and neurodegeneration—typical features of age-associated neurodegenerative diseases such as Huntington's disease and amyotrophic lateral sclerosis (ALS). By reducing EPS8 activity, the group was then able to prevent the build-up of the toxic protein aggregates and preserve neuronal function in worm models of these two diseases.
Importantly, EPS8 and its signaling partners are evolutionarily conserved and also present in human cells. Similar to what they achieved in the worms, the team was able to prevent the accumulation of toxic protein aggregates in human cell models of Huntington's disease and ALS by reducing EPS8 levels.
Seda Koyuncu et al, The aging factor EPS8 induces disease-related protein aggregation through RAC signaling hyperactivation, Nature Aging (2025). DOI: 10.1038/s43587-025-00943-w
Sep 4
Dr. Krishna Kumari Challa
Cooling pollen sunscreen can block UV rays without harming corals
Materials scientists have invented the world's first pollen-based sunscreen derived from Camellia flowers.
In experiments, the pollen-based sunscreen absorbed and blocked harmful ultraviolet (UV) rays as effectively as commercially available sunscreens, which commonly use minerals like titanium dioxide (TiO2) and zinc oxide (ZnO).
In laboratory tests on corals, commercial sunscreen induced coral bleaching in just two days, leading to coral death by day six. Each year, an estimated 6,000 to 14,000 tons of commercial sunscreen make their way into the ocean, as people wash it off in the sea or it flows in from wastewater.
In contrast, the pollen-based sunscreen did not affect the corals, which remained healthy even up to 60 days.
In other tests, the pollen-based sunscreen also demonstrated its ability to reduce surface skin temperature, thereby helping to keep the skin cool in the presence of simulated sunlight.
Nature's Guard: UV Filter from Pollen, Advanced Functional Materials (2025). DOI: 10.1002/adfm.202516936
Sep 5
Dr. Krishna Kumari Challa
Why we slip on ice: Physicists challenge centuries-old assumptions
For over a hundred years, schoolchildren around the world have learned that ice melts when pressure and friction are applied. When you step out onto an icy pavement in winter, you can slip because of the pressure exerted by your body weight through the sole of your (still warm) shoe. But it turns out that this explanation misses the mark.
New research reveals that it's not pressure or friction that causes ice to become slippery, but rather the interaction between molecular dipoles in the ice and those on the contacting surface, such as a shoe sole.
The work is published in the journal Physical Review Letters. This insight overturns a paradigm established nearly two centuries ago by James Thomson, the brother of Lord Kelvin, who proposed that pressure and friction contribute to ice melting alongside temperature.
It turns out that neither pressure nor friction plays a particularly significant part in forming the thin liquid layer on ice.
Instead, computer simulations by the researchers reveal that molecular dipoles are the key drivers behind the formation of this slippery layer, which so often causes us to lose our footing in winter. But what exactly is a dipole? A molecular dipole arises when a molecule has regions of partial positive and partial negative charge, giving the molecule an overall polarity that points in a specific direction.
Achraf Atila et al, Cold Self-Lubrication of Sliding Ice, Physical Review Letters (2025). DOI: 10.1103/1plj-7p4z
Sep 5
Dr. Krishna Kumari Challa
Fetal brain harm linked to pregnancy infection
A specific bacterial infection during pregnancy that can cause severe harm to the unborn brain has been identified for the first time, in a finding that could have huge implications for prenatal health.
Previous studies have disagreed on whether fetal exposure to Ureaplasma parvum has a detrimental effect on brain development, so newborn health specialists set out to determine the answer once and for all.
Ureaplasma parvum Serovars 3 and 6 are among the most common types that are isolated in pregnancies complicated by infection/inflammation, so they tested them individually in a pre-clinical model and the results were clear.
They showed that long-term exposure to a specific subtype of Ureaplasma (parvum 6) resulted in loss of cells that are responsible for the production of myelin (the fatty sheath that insulates nerve cells in the brain).
This resulted in less myelin production and a disruption to the architecture of myelin in the brain. This sort of disruption to myelin production can have a devastating and lifelong impact on neurodevelopment, cognition and motor function.
By contrast, they also showed that exposure to another subtype of Ureaplasma (parvum 3) had little effect on neurodevelopment.
Many of the babies affected by this infection in utero are at a much higher risk of preterm birth and the chronic intensive care and inflammation associated with being born too early, the researchers say.
Dima Abdu et al, Intra-amniotic infection with Ureaplasma parvum causes serovar-dependent white matter damage in preterm fetal sheep, Brain Communications (2025). DOI: 10.1093/braincomms/fcaf182
Sep 5
Dr. Krishna Kumari Challa
After early-life stress, astrocytes can affect behaviour
Astrocytes in the lateral hypothalamus region of the brain, an area involved in the regulation of sleep and wakefulness, play a key role in neuron activity in mice and affect their behavior, researchers have found.
By broadening medical science's understanding of cerebral mechanisms, the discovery could someday help in the treatment and prevention of depression in humans, the researchers say.
According to the scientific literature, early-life stress leads to a five-fold increase in the risk of developing a mental-health disorder as an adult, notably treatment-resistant disorders.
As brain cells, astrocytes are sensitive to variations in the blood concentration of metabolites and, in response to changes in the blood, astrocytes can modulate the extent of their interaction with neurons, their neighboring cells.
In mice, those changes are particularly responsive to the level of corticosterone, the stress hormone in the rodents' blood.
In adult mice who experienced early-life stress, researchers saw abnormally high levels of corticosterone. The impact of stress on behavior also differed according to sex. Females were less active at night, while males were hyperactive during the day.
In people with depression who have experienced a similar type of stress, these sex differences have also been observed.
Lewis R. Depaauw-Holt et al, A divergent astrocytic response to stress alters activity patterns via distinct mechanisms in male and female mice, Nature Communications (2025). DOI: 10.1038/s41467-025-61643-y
Sep 5
Dr. Krishna Kumari Challa
Is this the future of food? 'Sexless' seeds that could transform farming
Scientists are tinkering with plant genes to create crops that seed their own clones, with a host of benefits for farmers.
Sacks of seeds without the sex
Agriculture is on the brink of a revolution: grain crops that produce seeds asexually. The technology — trials of which could start sprouting as early as next month — exploits a quirk of nature called apomixis, in which plants create seeds that produce clones of the parent. Apomixis could slash the time needed to create new varieties of crops, and give smallholder farmers access to affordable high-yielding sorghum (Sorghum bicolor) and cowpea (Vigna unguiculata). But before self-cloning crops can be commercialized, the technology must run the regulatory gauntlet.
https://www.nature.com/articles/d41586-025-02753-x
Sep 5
Dr. Krishna Kumari Challa
Macaws learn by watching interactions of others, a skill never seen in animals before
One of the most effective ways we learn is through third-party imitation, where we observe an interaction between others and then copy the actions and behaviors involved. Until recently, this was thought to be a uniquely human trait, but a new study published in Scientific Reports reveals that macaws also possess this ability.
Second-party imitation is already known to exist in the animal kingdom. Parrots are renowned for their ability to imitate human speech and actions, and primates, such as chimpanzees, have learned to open a puzzle box by observing a human demonstrator. But third-party imitation is different because it involves learning by observing two or more individuals interact rather than by direct instruction.
Scientists chose blue-throated macaws for this study because they live in complex social groups in the wild, where they need to learn new behaviors to fit in quickly. Parrots, like macaws, are also very smart and can do things like copy sounds and make tools.
To find out whether macaws could learn through third-party imitation, researchers worked with two groups of them, performing more than 4,600 trials. In the test group, these birds watched another macaw perform one of five different target actions in response to a human's hand signals. These were fluffing up feathers, spinning its body, vocalizing, lifting a leg or flapping its wings. In the control group, macaws were given the same hand signals without ever seeing another bird perform the actions.
The results were clear. The test group learned more actions than the control group, meaning that watching the interactions between the demonstrator macaw and the human experimenter helped them learn specific behaviors. They also learned these actions faster and performed them more accurately than the control group. The study's authors suggest that this ability to learn from passively observing two or more individuals helps macaws adapt to new social situations and may even contribute to their own cultural traditions.
The research shows that macaws aren't just smart copycats. They may also have their own complex social lives and cultural norms, similar to humans.
Esha Haldar et al, Third-party imitation is not restricted to humans, Scientific Reports (2025). DOI: 10.1038/s41598-025-11665-9
Sep 6
Dr. Krishna Kumari Challa
Physicists create a new kind of time crystal that humans can actually see
Imagine a clock that doesn't have electricity, but its hands and gears spin on their own for all eternity. In a new study, physicists have used liquid crystals, the same materials that are in your phone display, to create such a clock—or, at least, as close as humans can get to that idea. The team's advancement is a new example of a "time crystal." That's the name for a curious phase of matter in which the pieces, such as atoms or other particles, exist in constant motion.
The researchers aren't the first to make a time crystal, but their creation is the first that humans can actually see, which could open a host of technological applications. They can be observed directly under a microscope and even, under special conditions, by the naked eye.
In the study, the researchers designed glass cells filled with liquid crystals—in this case, rod-shaped molecules that behave a little like a solid and a little like a liquid. Under special circumstances, if you shine a light on them, the liquid crystals will begin to swirl and move, following patterns that repeat over time.
Under a microscope, these liquid crystal samples resemble psychedelic tiger stripes, and they can keep moving for hours—similar to that eternally spinning clock.
Hanqing Zhao et al, Space-time crystals from particle-like topological solitons, Nature Materials (2025). DOI: 10.1038/s41563-025-02344-1
Sep 6
Dr. Krishna Kumari Challa
Ant queens produce offspring of two different species
Some ant queens can produce offspring of more than one species – even when the other species is not known to exist in the nearby wild.
No other animal on Earth is known to do this, and it's hard to believe.
This is a bizarre story of a system that allows things to happen that seem almost unimaginable.
Some queen ants are known to mate with other species to produce hybrid workers, but Iberian harvester ants (Messor ibericus) on the island of Sicily go even further, blurring the lines of species-hood. These queens can give birth to cloned males of another species entirely (Messor structor), with which they can mate to produce hybrid workers.
Their reproductive strategy is akin to "sexual domestication", the researchers say. The queens have come to control the reproduction of another species, which they exploited from the wild long ago – similar to how humans have domesticated dogs.
In the past, the Iberian ants seem to have stolen the sperm of another species they once depended on, creating an army of male M. structor clones to reproduce with whenever they wanted.
According to genetic analysis, the colony's offspring are two distinct species, and yet they share the same mother.
Some are the hairy M. ibericus, and the others the hairless M. structor – the closest wild populations of which live more than a thousand kilometers away.
The queen can mate with males of either species in the colony to reproduce. Mating with M. ibericus males will produce the next generation of queen, while mating with M. structor males will result in more workers. As such, all workers in the colony are hybrids of M. ibericus and M. structor.
Part 1
Sep 7
Dr. Krishna Kumari Challa
The two species diverged from each other more than 5 million years ago, and yet today, M. structor is a sort of parasite living in the queen's colony. But she is hardly a victim.
The Iberian harvester queen ant controls what happens to the DNA of her clones. When she reproduces, she can do so asexually, producing a clone of herself. She can also fertilize her egg with the sperm of her own species or M. structor, or she can 'delete' her own nuclear DNA and use her egg as a vessel solely for the DNA of her male M. structor clones.
This means her offspring can either be related to her, or to another species. The only similarity is that both groups contain the queen's mitochondrial DNA. The result is greater diversity in the colony without the need for a wild neighboring species to mate with.
It also means that Iberian harvester ants belong to a 'two-species superorganism' – which the researchers say "challenges the usual boundaries of individuality."
The M. structor males produced by M. ibericus queens don't look exactly the same as males produced by M. structor queens, but their genomes match.
https://www.nature.com/articles/s41586-025-09425-w
Part 2
Sep 7
Dr. Krishna Kumari Challa
Electrical stimulation can reprogram immune system to heal the body faster
Scientists have discovered that electrically stimulating macrophages—one of the immune system's key players—can reprogram them in such a way as to reduce inflammation and encourage faster, more effective healing in disease and injury.
This breakthrough uncovers a potentially powerful new therapeutic option, with further work ongoing to delineate the specifics.
Macrophages are a type of white blood cell with several high-profile roles in our immune system. They patrol around the body, surveying for bugs and viruses, as well as disposing of dead and damaged cells, and stimulating other immune cells—kicking them into gear when and where they are needed.
However, their actions can also drive local inflammation in the body, which can sometimes get out of control and become problematic, causing more damage than repair. Such runaway inflammation is present in many different diseases, highlighting the need to regulate macrophages to improve patient outcomes.
In the study, published in the journal Cell Reports Physical Science, the researchers worked with human macrophages isolated from healthy donor blood samples.
They stimulated these cells using a custom bioreactor to apply electrical currents and measured what happened.
The scientists discovered that this stimulation caused a shift of macrophages into an anti-inflammatory state that supports faster tissue repair; a decrease in inflammatory marker (signaling) activity; an increase in expression of genes that promote the formation of new blood vessels (associated with tissue repair as new tissues form); and an increase in stem cell recruitment into wounds (also associated with tissue repair).
Electromodulation of human monocyte-derived macrophages drives a regenerative phenotype and impedes inflammation, Cell Reports Physical Science (2025). DOI: 10.1016/j.xcrp.2025.102795. www.cell.com/cell-reports-phys … 2666-3864(25)00394-7
Sep 8
Dr. Krishna Kumari Challa
Human brains explore more to avoid losses than to seek gains
Researchers traced a neural mechanism that explains why humans explore more aggressively when avoiding losses than when pursuing gains. Their work reveals how neuronal firing and noise in the amygdala shape exploratory decision-making.
Human survival has its origins in a delicate balance of exploration versus exploitation. There is safety in exploiting what is known: the local hunting grounds, the favorite foraging location, the go-to deli with the familiar menu. But exploitation carries the risk of over-reliance on the familiar, which can backfire if those local resources become depleted or unstable.
Exploring the world in the hope of discovering better options has its own set of risks and rewards. There is the chance of finding plentiful hunting grounds, alternative foraging resources, or a new deli that offers a fresh take on old favorites. And there is the risk that new hunting grounds will be scarce, the newly foraged berries poisonous, or that the meal time will be ruined by a deli that disappoints.
Exploration-exploitation (EE) dilemma research has concentrated on gain-seeking contexts, identifying exploration-related activity in surface brain areas like cortex and in deeper regions such as the amygdala. Exploration tends to rise when people feel unsure about which option will pay off.
Loss-avoidance strategies differ from gain-seeking strategies, with links to negative outcomes such as PTSD, anxiety and mood disorders.
In the study, "Rate and noise in human amygdala drive increased exploration in aversive learning," published in Nature, researchers recorded single-unit activity to compare exploration during gain versus loss learning.
Part 1
Sep 9
Dr. Krishna Kumari Challa
Researchers examined how strongly neurons fired just before each choice and how variable that activity was across trials. Variability served as "neural noise." Computational models estimated how closely choices followed expected value and how much they reflected overall uncertainty.
Results showed more exploration in loss trials than in gain trials. After people learned which option was better, exploration stayed higher in loss trials and accuracy fell more from its peak.
Pre-choice firing in the amygdala and temporal cortex rose before exploratory choices in both gain and loss, indicating a shared, valence-independent rate signal. Loss trials also showed noisier amygdala activity from cue to choice.
More noise was linked to higher uncertainty and a higher chance of exploring, and noise declined as learning progressed. Using the measured noise levels in decision models reproduced the extra exploration seen in loss trials. Loss aversion did not explain the gap.
Researchers report two neural signals that shape exploration. A valence-independent firing-rate increase in the amygdala and temporal cortex precedes exploratory choices in both gain and loss. A loss-specific rise in amygdala noise raises the odds of exploration under potential loss, scales with uncertainty, and wanes as learning accrues.
Behavioral modeling matches this pattern, with value-only rules fitting gain choices and value-plus-total-uncertainty rules fitting loss choices. Findings point to neural variability as a lever that tilts strategy toward more trial-and-error when loss looms, while causal tests that manipulate noise remain to be done.
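To make the modeling idea concrete, here is a minimal, illustrative sketch (not the authors' fitted model) of a two-option task in which choices follow a softmax over learned values, and an added "total uncertainty" term flattens the choice probabilities in the way the loss condition is described above. All parameter values and reward probabilities are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax_choice(q_values, beta):
    """Pick an option with probability increasing in its learned value."""
    p = np.exp(beta * (q_values - q_values.max()))
    p /= p.sum()
    return int(rng.choice(len(q_values), p=p))

def simulate(n_trials=200, alpha=0.2, beta=5.0, uncertainty_bonus=0.0):
    """Two-armed bandit with delta-rule learning.

    uncertainty_bonus > 0 mimics a 'value plus total uncertainty' rule:
    the less certain the value estimates, the flatter (more exploratory)
    the choice probabilities. All numbers here are illustrative.
    """
    q = np.zeros(2)          # learned option values
    visits = np.ones(2)      # crude per-option uncertainty proxy (1/visits)
    explored = 0
    for _ in range(n_trials):
        greedy = int(np.argmax(q))
        total_uncertainty = float((1.0 / visits).sum())
        effective_beta = beta / (1.0 + uncertainty_bonus * total_uncertainty)
        choice = softmax_choice(q, effective_beta)
        if choice != greedy:
            explored += 1                      # non-greedy pick counts as exploration
        reward = float(rng.random() < (0.8 if choice == 0 else 0.2))  # option 0 is better
        q[choice] += alpha * (reward - q[choice])
        visits[choice] += 1
    return explored / n_trials

print("exploration rate, value-only rule:        ", simulate())
print("exploration rate, value + uncertainty rule:", simulate(uncertainty_bonus=2.0))
```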
From an evolutionary survival perspective, the strategy fits well with the need to seek out new resources when facing the loss of safe or familiar choices. While one might consider trying a new restaurant at any time, true seeking behavior will become a priority if the favorite location is closed for remodeling.
Tamar Reitich-Stolero et al, Rate and noise in human amygdala drive increased exploration in aversive learning, Nature (2025). DOI: 10.1038/s41586-025-09466-1
Part 2
Sep 9
Dr. Krishna Kumari Challa
Scientists discover why the flu is more deadly for older people
Scientists have discovered why older people are more likely to suffer severely from the flu, and can now use their findings to address this risk.
In a study published in PNAS, the researchers discovered that older people produce a glycosylated protein called apolipoprotein D (ApoD), which is involved in lipid metabolism and inflammation, at much higher levels than younger people do. This reduces the patient's ability to resist virus infection, resulting in a more serious disease outcome.
ApoD is therefore a target for therapeutic intervention to protect against severe influenza virus infection in the elderly, which could have a major impact on reducing morbidity and mortality in the aging population.
ApoD mediates age-associated increase in vulnerability to influenza virus infection, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2423973122
Sep 9
Dr. Krishna Kumari Challa
There is less room to store carbon dioxide, driver of climate change, than previously thought, finds a new study
The world has far fewer places to securely store carbon dioxide deep underground than previously thought, steeply lowering its potential to help stem global warming, according to a new study that challenges long-held industry claims about the practice.
The study, published last week in the journal Nature, found that global carbon storage capacity was 10 times less than previous estimates after ruling out geological formations where the gas could leak, trigger earthquakes or contaminate groundwater, or had other limitations. That means carbon capture and storage would only have the potential to reduce human-caused warming by 0.7 degrees Celsius (1.26 Fahrenheit)—far less than previous estimates of around 5-6 degrees Celsius (9-10.8 degrees Fahrenheit), researchers said.
Carbon storage is often portrayed as a way out of the climate crisis. These new findings make clear that it is a limited tool and reaffirm the extreme importance of reducing emissions as fast and as soon as possible.
Matthew J. Gidden et al, A prudent planetary limit for geologic carbon storage, Nature (2025). DOI: 10.1038/s41586-025-09423-y
Sep 9
Dr. Krishna Kumari Challa
Smart patch runs tests using sweat instead of blood
It may now be possible to precisely assess the body's health status using only sweat, rather than blood tests.
A research team has now developed a smart patch that can precisely observe internal changes through sweat when simply attached to the body. This is expected to greatly contribute to the advancement of chronic disease management and personalized health care technologies.
The researchers developed a wearable sensor that can simultaneously and in real-time analyze multiple metabolites in sweat.
The research team developed a thin and flexible wearable sweat patch that can be directly attached to the skin. This patch incorporates both microchannels for collecting sweat and an ultrafine nanoplasmonic structure that analyzes sweat components using light. Thanks to this, multiple sweat metabolites can be simultaneously analyzed without the need for separate staining or labels, with just one patch application.
A nanoplasmonic structure is an optical sensor structure where nanoscale metallic patterns interact with light, designed to sensitively detect the presence or changes in concentration of molecules in sweat.
The patch was created by combining nanophotonics technology, which manipulates light at the nanometer scale (one-hundred-thousandth the thickness of a human hair) to read molecular properties, with microfluidics technology, which precisely controls sweat in channels thinner than a hair.
In other words, within a single sweat patch, microfluidic technology enables sweat to be collected sequentially over time, allowing for the measurement of changes in various metabolites without any labeling process. Inside the patch are six to 17 chambers (storage spaces), and sweat secreted during exercise flows along the microfluidic structures and fills each chamber in order.
The research team applied the patch to actual human subjects and succeeded in continuously tracking the changing components of sweat over time during exercise.
In this study, they demonstrated for the first time that three metabolites—uric acid, lactic acid, and tyrosine—can be quantitatively analyzed simultaneously, as well as how they change depending on exercise and diet.
In particular, by using artificial intelligence analysis methods, they were able to accurately distinguish signals of desired substances even within the complex components of sweat.
In the future, this technology can be expanded to diverse fields such as chronic disease management, drug response tracking, environmental exposure monitoring, and the discovery of next-generation biomarkers for metabolic diseases.
Jaehun Jeon et al, All-flexible chronoepifluidic nanoplasmonic patch for label-free metabolite profiling in sweat, Nature Communications (2025). DOI: 10.1038/s41467-025-63510-2
Sep 9
Dr. Krishna Kumari Challa
Microbiome instability linked to poor growth in kids
Malnutrition is a leading cause of death in children under age 5, and nearly 150 million children globally under this age have stunted growth from lack of nutrition. Although an inadequate diet is a major contributor, researchers found over a decade ago that dysfunctional communities of gut microbes play an important role in triggering malnutrition.
They have discovered that toddlers in Malawi—among the places hardest hit by malnutrition—who had a fluctuating gut microbiome showed poorer growth than kids with a more stable microbiome. All of the children were at high risk for stunting and acute malnutrition.
The findings, published in Cell, establish a pediatric microbial genome library—a public health database containing complete genetic profiles of 986 microbes from fecal samples of eight Malawian children collected over nearly a year—that can be used in future studies to help predict, prevent and treat malnutrition.
Culture-independent meta-pangenomics enabled by long-read metagenomics reveals associations with pediatric undernutrition, Cell (2025). DOI: 10.1016/j.cell.2025.08.020. www.cell.com/cell/fulltext/S0092-8674(25)00975-4
Sep 10
Dr. Krishna Kumari Challa
Discovery reveals how chromosome ends can be protected
Researchers have uncovered a previously unknown mechanism that safeguards the chromosome ends from being mistakenly repaired by the cell. While DNA repair is vital for survival, attempts to repair the chromosome ends—called telomeres—can have catastrophic outcomes for cells.
The research, published in Nature, increases the understanding of how cancer and certain rare diseases develop.
Cells constantly monitor their DNA. A DNA helix that ends abruptly is, in most cases, a signal that DNA has been severely damaged. The most severe DNA damage a cell can suffer is when a DNA helix breaks into two pieces.
Normally, cells try to promptly repair all damage to their DNA. The dilemma is that our chromosomes have ends that look just like broken DNA. If cells were to "repair" them, looking for another loose end to join them with, this would fuse two or more chromosomes together, making the cell prone to transforming into a cancer cell.
Therefore, the chromosome ends—called telomeres—must be protected from the cell's DNA repair machinery.
The cells have to constantly repair DNA damage to avoid mutations, cell death, and cancer, while at the same time, they must not repair the chromosome ends by mistake, since that would result in the same catastrophic outcome. What's the difference between damaged DNA and the natural chromosome end? This problem has been known for almost a century, but some aspects are still not completely resolved.
Although the telomeres are not to be repaired as if they were broken DNA, several DNA repair proteins can be found at the chromosome ends.
The research team has previously shown that a key repair protein, DNA-dependent protein kinase (DNA-PK), helps process telomeres and protect them from degradation. But how DNA-PK is at the same time prevented from trying to repair these blunt DNA ends remained a mystery until now.
The researchers have now shown that two other proteins, called RAP1 and TRF2, have an important role to play in regulating DNA-PK.
They show genetically, biochemically and structurally how the protein RAP1, brought to telomeres by TRF2, ensures through direct interaction that DNA-PK doesn't 'repair' the telomeres.
Part 1
Sep 10
Dr. Krishna Kumari Challa
The role of telomeres in processes such as cancer development and aging, and what happens when they do not work properly, has long interested scientists. Diseases caused by disturbances in telomere maintenance are rare but very serious, and include premature aging, a blood cell deficiency called aplastic anemia, and fibrosis in the lungs.
In about half of cases, the disease is explained by a known mutation that affects telomere stability, but in many cases, there is currently no known medical explanation for why the individual is sick.
Their research is also relevant to cancer research. On one hand, inappropriate "repair" of telomeres can trigger catastrophic events leading to the accumulation of mutations and cancer.
On the other hand, cancer cells are often less efficient at repairing damage to DNA compared to normal cells. This weakness is exploited in cancer treatments and many therapies kill tumor cells by causing DNA damage or inhibiting repair, or both.
In other words, knowledge of how cells regulate DNA repair and protect telomeres has a bearing on both prevention and treatment of cancer.
Increased understanding of which proteins play key roles in these cellular processes can, in the long term, contribute to more precise and targeted treatment strategies.
Patrik Eickhoff et al, Chromosome end protection by RAP1-mediated inhibition of DNA-PK, Nature (2025). DOI: 10.1038/s41586-025-08896-1
Part 2
Sep 10
Dr. Krishna Kumari Challa
Fathers' drinking plays role in fetal alcohol spectrum disorder, study shows
It's well known that fetal alcohol spectrum disorder (FASD) in children is caused by alcohol exposure when mothers drink during pregnancy. But it turns out that the father's drinking habits could also affect a child's growth and development.
A team of international researchers found that a father's alcohol use may have a small but direct negative impact on a child's development by the age of seven. A father's drinking contributes to the harm caused by alcohol use during pregnancy.
The findings of their study were published recently in the journal Alcohol: Clinical & Experimental Research.
The researchers analyzed data from five studies on the prevalence and characteristics of FASD among Grade 1 learners in the Western Cape to explore whether a father's drinking affects children diagnosed with FASD. The children's biological mothers or legal guardians completed a questionnaire on the risk factors for FASD.
According to the researchers, there is a growing recognition that factors beyond pregnant women's drinking habits can affect their children's development. They add that increased attention is currently paid to the role of fathers, not only as a contributing factor to women's drinking habits, but also as an independent contributing factor to the growth and development of children.
The findings show that children whose fathers drank alcohol were more likely to be shorter, have smaller heads, and score lower on verbal IQ tests. It was also clear from the study that the highest risk to the child's development exists when both parents use alcohol during pregnancy. It also appeared that 'binge drinking' by the father, but especially by both parents, has the most detrimental effect on the child's development.
Data analysis showed that between 66% and 77% of fathers of children on the FAS spectrum drank during their partner's pregnancy with the child in question. These fathers drank an average of 12 drinks per drinking day. The number of drinks that fathers consumed per drinking day was significantly correlated with smaller head circumference in their children. Head circumference is used as a measure of brain development.
The researchers add that fathers who drank an average of five or more drinks per drinking day had shorter children with smaller head circumferences. These children also performed worse on measures in verbal intelligence tests.
"In general, it was found that the more fathers drank, the worse their children performed. However, it should also be noted that all these effects were observed in children whose mothers consumed alcohol during pregnancy."
They note that a father's drinking alone didn't increase the chances of a child being diagnosed with FASD. However, when the mother drank during pregnancy and the father was also a heavy drinker, the child was more likely to have the most serious symptoms of FASD. When looking at both parents' drinking patterns, the father's alcohol use alone didn't show a clear link to the child's physical or brain development problems. While both parents' drinking was considered, the main effects on a child's development and physical features were linked to the mother's alcohol use.
Part 1
Sep 10
Dr. Krishna Kumari Challa
"Even after accounting for alcohol use by mothers during pregnancy, the father's drinking was still linked to lower child height, smaller head size and reduced verbal IQ. This suggests that paternal alcohol use may have its own, though limited, impact on a child's growth and development.
"Data analyses of children where both parents consumed alcohol during pregnancy had significantly negative effects on growth, head circumference, verbal intelligence and general birth defect scores than in children where neither parent consumed alcohol."
The researchers say that, while it is not yet clear whether the impact of a father's drinking on a child's growth and development stems from impaired sperm quality or other epigenetic influences (changes in how genes work that don't involve altering the DNA code itself), the father's role in the development of FASD cannot be overlooked.
Philip A. May et al, Does paternal alcohol consumption affect the severity of traits of fetal alcohol spectrum disorders?, Alcohol, Clinical and Experimental Research (2025). DOI: 10.1111/acer.70105
Part 2
Sep 10
Dr. Krishna Kumari Challa
Why some influenza viruses are more dangerous than others
Serious infections with influenza A viruses are characterized by an excessive immune response, known as cytokine storm. It was previously unclear why some virus strains trigger these storms, while others do not. Researchers investigated 11 different influenza A virus strains and their effect on different human immune cells.
The results, published in Emerging Microbes & Infections, show that highly pathogenic avian influenza viruses infect specific kinds of immune cells and thus stimulate the production of type I interferon. This could explain why these viruses are particularly dangerous.
The new results show that not only the immune cells traditionally associated with type I interferon production, but also other immune cells, could be decisive in whether an influenza infection triggers an excessive immune response. This knowledge is important for better assessing the risk posed by dangerous virus variants.
Influenza viruses are some of the most significant respiratory disease pathogens worldwide. While most infections are relatively mild, certain virus strains can cause severe pneumonia, leading to acute respiratory failure. Highly pathogenic influenza viruses are particularly dangerous. They often spread from birds to humans and are associated with significantly higher mortality rates.
The immune system plays a crucial role in this danger: some virus strains cause the body to release an excessive amount of messenger substances known as cytokines. If a cytokine storm occurs, the immune response will end up damaging the body's tissue more than the virus itself.
But why do some influenza viruses trigger such excessive reactions, while others only cause infections with mild progression?
The researchers tested how the flu viruses infect different immune cells and stimulate the release of messenger substances.
The aim of their research is to decipher the mechanisms behind mild and severe disease progression and to develop long-term approaches for better protection and treatment options.
Part 1
Sep 10
Dr. Krishna Kumari Challa
The investigations showed that a certain type of immune cell, so-called plasmacytoid dendritic cells, produce large amounts of the important antiviral messenger interferon-α (IFN-α) upon infection with influenza—regardless of the virus strain. This means that these immune cells generally respond strongly to influenza viruses without the need for the virus to productively infect them.
If this is the case, why don't all viruses cause disease of the same severity? The research team found that other immune cells, namely myeloid dendritic cells and different types of macrophages, are productively infected by highly pathogenic influenza viruses and then produce large amounts of IFN-α. Viral replication in these immune cells appears to be an important factor in the production of type I interferon and the development of an excessive immune response (cytokine storm).
These results provide an explanation of why some influenza viruses could be so much more dangerous than others: it might be their ability to replicate in certain immune cells and thereby trigger an extremely strong immune reaction that leads to severe inflammation and harm to health. This knowledge can help to develop targeted therapies and better identify risk groups.
Marc A. Niles et al, Influenza A virus induced interferon-alpha production by myeloid dendritic cells and macrophages requires productive infection, Emerging Microbes & Infections (2025). DOI: 10.1080/22221751.2025.2556718
Part 2
Sep 10
Dr. Krishna Kumari Challa
Giant DNA discovered in people's mouths could impact oral health, immunity and even cancer risk
Researchers have made a surprising discovery hiding in people's mouths: Inocles, giant DNA elements that had previously escaped detection. These appear to play a central role in helping bacteria adapt to the constantly changing environment of the mouth.
The findings, published in the journal Nature Communications, provide fresh insight into how oral bacteria colonize and persist in humans, with potential implications for health, disease and microbiome research.
They discovered Inocles, an example of extrachromosomal DNA—chunks of DNA that exist in cells, in this case bacteria, but outside their main DNA. It's like finding a book with extra footnotes stapled to it, and scientists are just starting to read them to find out what they do.
Detecting Inocles was not easy, as conventional sequencing methods fragment genetic data, making it impossible to reconstruct large elements.
To overcome this, the team applied advanced long-read sequencing techniques, which can capture much longer stretches of DNA.
Inocle genomes turned out to be hosted by the bacterium Streptococcus salivarius.
The average Inocle genome is 350 kilobase pairs long (a kilobase pair is a measure of length for genetic sequences), making it one of the largest extrachromosomal genetic elements in the human microbiome. Plasmids, another form of extrachromosomal DNA, are typically at most a few tens of kilobase pairs.
This long length endows Inocles with genes for various functions, including resistance to oxidative stress, DNA damage repair and cell wall-related genes, possibly involved in adapting to extracellular stress response.
What's remarkable is that, given the range of the human population the saliva samples represent, researchers think 74% of all human beings may possess Inocles. And even though the oral microbiome has long been studied, Inocles remained hidden all this time because of technological limitations.
Now that we know they exist, we can begin to explore how they shape the relationship between humans, their resident microbes and our oral health. And there's even some hints that Inocles might serve as markers for serious diseases like cancer, say the researchers.
Yuya Kiguchi et al, Giant extrachromosomal element "Inocle" potentially expands the adaptive capacity of the human oral microbiome, Nature Communications (2025). DOI: 10.1038/s41467-025-62406-5
Sep 11
Dr. Krishna Kumari Challa
Scientists find quasi-moon orbiting the Earth for the last 60 years
Everyone who has ever lived on Earth has been well aware of the moon, but it turns out Earth also has some frequent temporary companions. These "quasi-moons" are small asteroids that enter into a kind of resonance with Earth's orbit, although they aren't technically orbiting Earth. In August, this small group of asteroids, called Arjunas, offered another companion to add to the list.
Astronomers at the Pan-STARRS observatory in Hawaii discovered the new quasi-moon, referred to as "2025 PN7," on August 2, 2025. Their research was recently published in Research Notes of the AAS. Using JPL's Horizons system and Python tools, they analyzed the orbital data and compared it to other Arjunas and quasi-satellites.
The team found that 2025 PN7 had been in a quasi-orbit for about 60 years already and would likely be nearby for another 60 or so years before departing. Compared to other quasi-moons, this period is relatively short. The quasi-moon Kamo'oalewa has an expected near-Earth orbit of around 381 years, while the total time for 2025 PN7 is 128 years.
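For readers who want to look at the orbit themselves, osculating elements can be pulled from JPL Horizons in a few lines of Python, in the spirit of the tools mentioned above. This is only a sketch under stated assumptions: the designation string, the epoch, and the rough "Earth-like orbit" cuts below are illustrative choices, not the classification criteria used in the paper.

```python
# Minimal sketch (not the authors' pipeline): fetch heliocentric orbital
# elements for 2025 PN7 from JPL Horizons via astroquery and check that the
# orbit is roughly Earth-like, as expected for an Arjuna-class object.
from astroquery.jplhorizons import Horizons

obj = Horizons(id="2025 PN7",
               location="500@10",      # heliocentric elements (Sun center)
               epochs=2460950.5)       # an arbitrary Julian date in late 2025
elements = obj.elements()

a = float(elements["a"][0])        # semi-major axis [au]
e = float(elements["e"][0])        # eccentricity
incl = float(elements["incl"][0])  # inclination [deg]

print(f"a = {a:.4f} au, e = {e:.3f}, i = {incl:.2f} deg")

# Arjunas sit near a 1:1 mean-motion resonance with Earth: semi-major axis
# very close to 1 au, with low eccentricity and inclination. These thresholds
# are rough illustrative cuts only.
if abs(a - 1.0) < 0.02 and e < 0.2 and incl < 10.0:
    print("orbit is Earth-like: consistent with the Arjuna dynamical class")
```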
Scientists have been aware of these quasi-satellites since 1991, when they first discovered 1991 VG—which some thought was an interstellar probe at the time.
Over three decades later, it is now widely accepted that such objects are natural and constitute a secondary asteroid belt that occupies the region in which the Earth–moon system orbits around the sun, defining the Arjuna dynamical class. The Arjunas with the most Earth-like orbits can experience temporary captures as mini-moons of our planet.
However, mini-moons are distinct from quasi-moons, like 2025 PN7, as mini-moons do temporarily orbit Earth and quasi-moons only appear to do so. Currently, there are six other known quasi-moons: 164207 Cardea (2004 GU9), 469219 Kamo'oalewa (2016 HO3), 277810 (2006 FV35), 2013 LX28, 2014 OL339 and 2023 FW13.
Carlos de la Fuente Marcos et al, Meet Arjuna 2025 PN7, the Newest Quasi-satellite of Earth, Research Notes of the AAS (2025). DOI: 10.3847/2515-5172/ae028f
on Thursday
Dr. Krishna Kumari Challa
Hawking and Kerr black hole theories confirmed by gravitational wave
Scientists have confirmed two long-standing theories relating to black holes—thanks to the detection of the most clearly recorded gravitational wave signal to date.
Ten years after detecting the first gravitational wave, the LIGO-Virgo-KAGRA Collaboration announced on September 10 the detection of GW250114—a ripple in spacetime that offers unprecedented insights into the nature of black holes and the fundamental laws of physics.
The study confirms Professor Stephen Hawking's 1971 prediction that when black holes collide, the event horizon area of the resulting black hole is bigger than the sum of the individual black holes' areas—the total horizon area cannot shrink.
Research also confirmed the Kerr nature of black holes—a set of equations developed in 1963 by New Zealand mathematician Roy Kerr elegantly explaining what space and time look like near a spinning black hole. The Kerr metric predicts effects such as space being 'dragged' around and light looping to make multiple copies of objects.
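To see how the area theorem plays out, here is a small illustrative calculation using the Kerr horizon-area formula. The masses and spin below are placeholder values chosen to resemble a typical binary merger, not the published GW250114 parameters: even though roughly three solar masses are radiated away as gravitational waves, the final horizon area still exceeds the sum of the initial areas.

```python
import math

G = 6.674e-11       # gravitational constant [m^3 kg^-1 s^-2]
c = 2.998e8         # speed of light [m/s]
M_sun = 1.989e30    # solar mass [kg]

def kerr_horizon_area(mass_solar, chi):
    """Event-horizon area of a Kerr black hole.

    A = 8*pi*(G*M/c^2)^2 * (1 + sqrt(1 - chi^2)), where chi is the
    dimensionless spin (0 = non-spinning Schwarzschild, 1 = extremal).
    """
    r_g = G * mass_solar * M_sun / c**2
    return 8.0 * math.pi * r_g**2 * (1.0 + math.sqrt(1.0 - chi**2))

# Illustrative numbers only (NOT the published GW250114 parameters):
# two non-spinning black holes of ~33 and ~32 solar masses merge into a
# ~62 solar-mass remnant with spin ~0.68; ~3 solar masses are radiated away.
A_before = kerr_horizon_area(33, 0.0) + kerr_horizon_area(32, 0.0)
A_after = kerr_horizon_area(62, 0.68)

print(f"total horizon area before merger: {A_before:.3e} m^2")
print(f"horizon area after merger:        {A_after:.3e} m^2")
print("area increased:", A_after > A_before)   # Hawking's theorem: always True
```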
Publishing their findings in Physical Review Letters, the international group of researchers note that GW250114 was detected with a signal-to-noise ratio of 80. This clarity enabled precise tests of general relativity and black hole thermodynamics.
GW250114: testing Hawking's area law and the Kerr nature of black holes, Physical Review Letters (2025). DOI: 10.1103/kw5g-d732
on Thursday
Dr. Krishna Kumari Challa
'Potential biosignatures' found in ancient Mars lake
A new study suggests a habitable past and signs of ancient microbial processes on Mars. Led by NASA and featuring key analysis from Imperial College London, the work has uncovered a range of minerals and organic matter in Martian rocks that point to an ancient history of habitable conditions and potential biological processes on the Red Planet.
This is a very exciting discovery of a potential biosignature, but it does not mean we have discovered life on Mars. The rock sample now needs to be analyzed on Earth to truly confirm whether biological processes were involved.
While driving through a river valley called Neretva Vallis, NASA's Perseverance rover came across a thick succession of fine-grained mudstones and muddy conglomerates. Here, it conducted a detailed analysis of these rocks, using instruments such as the Planetary Instrument for X-ray Lithochemistry (PIXL) and Scanning Habitable Environments with Raman & Luminescence for Organics & Chemicals (SHERLOC).
By mapping the types and distributions of different sedimentary rocks, researchers were able to reconstruct the environment in which these mudstones were deposited.
Their analysis revealed a range of sedimentary structures and textures indicative of lake margin and lake bed environments, including a composition rich in minerals like silica and clays—the opposite to a river scenario, where fast-moving water would carry these tiny particles away.
This pointed to a surprising conclusion: they had found lake deposits in the bottom of a river valley.
The finding may suggest a period in the history of Jezero Crater where the valley itself was flooded, giving rise to this potentially habitable lake.
With the lake habitat scenario pinned down, the Perseverance science team turned their attention to the mudstones themselves. It was inside these rocks that they discovered a group of tiny nodules and reaction fronts, with chemical analysis revealing that these millimeter-scale structures are highly enriched in iron-phosphate and iron-sulfide minerals (likely vivianite and greigite).
These appear to have formed through redox reactions involving organic carbon, a process that could have been driven by either abiotic or—interestingly—biological chemistry. Importantly, this sets the stage for everything that happened next: the formation of this specific type of oxidized, iron- and phosphorus-rich sediment was the essential prerequisite for creating the ingredients for subsequent reactions.
Since these ingredients mirror by-products of microbial metabolism seen on Earth, it can be considered a compelling potential biosignature, raising the possibility that there was once microbial life on Mars.
Ultimately, the only way for the true origin of these structures to be determined is by returning the samples to Earth, a possibility that rests on when future missions will manage to successfully collect the samples from Mars' surface.
Joel Hurowitz, Redox-driven mineral and organic associations in Jezero Crater, Mars, Nature (2025). DOI: 10.1038/s41586-025-09413-0. www.nature.com/articles/s41586-025-09413-0
on Thursday
Dr. Krishna Kumari Challa
Climate change is driving fish stocks from countries' waters to the high seas, study finds
Fish and other marine organisms, though deeply affected by human activities, don't respect human borders. The ranges of many commercially important species in fact straddle the borders of countries' exclusive economic zones (EEZs) and international waters, known as the high seas. This arrangement, which makes fisheries management difficult, is set to get even more complicated as climate change continues to heat up the ocean, a new study says.
The study, published July 30 in the journal Science Advances, found that more than half of the world's straddling stocks will shift across the maritime borders between EEZs and the high seas by 2050. Most of these shifts will be into the high seas, where fisheries management is much more challenging and stocks are more likely to be overexploited.
Juliano Palacios-Abrantes et al, Climate change drives shifts in straddling fish stocks in the world's ocean, Science Advances (2025). DOI: 10.1126/sciadv.adq5976
Kamal Azmi et al, Putting Regional Fisheries Management Organisations' Climate Change House in Order, Fish and Fisheries (2025). DOI: 10.1111/faf.70015
on Thursday
Dr. Krishna Kumari Challa
Maternal gut microbiome composition may be linked to preterm births
Researchers have found that the presence of certain bacteria in the maternal gut microbiome during early pregnancy is linked to a higher risk of preterm birth. Published in the journal Cell Host & Microbe on September 10, the study reports that one particular species, Clostridium innocuum (C. innocuum), contains a gene that can degrade estradiol—an important pregnancy hormone.
This study suggests that for pregnant women or women preparing to become pregnant, it may be important to monitor their gut microbiome to prevent potential adverse pregnancy outcomes.
The researchers used samples and data from two large pregnancy cohorts—the Tongji-Huaxi-Shuangliu Birth cohort from southwest China and the Westlake Precision Birth cohort from southeast China.
For the first cohort, the team collected stool samples from 4,286 participants in early pregnancy, at an average of 10.4 gestational weeks. For the second cohort, they collected stool samples from 1,027 participants in mid-pregnancy, or around 26 gestational weeks. They also collected blood samples from all participants, which were used to measure human genetic variations and hormone metabolism.
After working with the first cohort, the researchers were able to establish a comprehensive database containing rRNA-based microbial genera data, metagenome-based species data, and phenotype data such as preterm delivery status.
They used several statistical models to screen the annotated gut microbial genera and species for relationships to preterm birth status or gestational duration. Through that work, they identified 11 genera and one species with a statistically significant link.
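As a rough illustration of what such a screen can look like (an assumption-laden sketch, not the study's actual pipeline), one can regress preterm status on each taxon's abundance with a couple of covariates and then apply a false-discovery-rate correction. The column names, covariates, toy data and thresholds below are made up for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

def screen_taxa(df, taxa_cols, outcome="preterm", covars=("maternal_age", "bmi")):
    """Return one logistic-regression coefficient and p-value per taxon, FDR-adjusted."""
    records = []
    for taxon in taxa_cols:
        X = sm.add_constant(df[[taxon, *covars]].astype(float))
        model = sm.Logit(df[outcome].astype(float), X).fit(disp=0)
        records.append((taxon, model.params[taxon], model.pvalues[taxon]))
    out = pd.DataFrame(records, columns=["taxon", "coef", "pval"])
    out["qval"] = multipletests(out["pval"], method="fdr_bh")[1]  # Benjamini-Hochberg
    return out.sort_values("qval")

# Toy data with the same shape as the screen (abundances are random, made up).
rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "preterm": rng.integers(0, 2, n),
    "maternal_age": rng.normal(30, 4, n),
    "bmi": rng.normal(23, 3, n),
    "C_innocuum": rng.gamma(2.0, 1.0, n),
    "genus_x": rng.gamma(2.0, 1.0, n),
})
print(screen_taxa(df, ["C_innocuum", "genus_x"]))
```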
The results, which were validated with the second cohort, showed that bacterial species C. innocuum—a small, rod-shaped bacteria—had the strongest connection to preterm birth. Further study of C. innocuum revealed that this species makes an enzyme that degrades estradiol—a form of estrogen that plays a pivotal role during pregnancy.
Estradiol regulates critical pathways that sustain pregnancy and initiate the process of childbirth.
Part 1
on Thursday
Dr. Krishna Kumari Challa
The researchers propose that dysregulated estradiol levels induced by a high prevalence of C. innocuum could be the mechanism that links the gut microbiome to preterm birth.
The researchers note that because their study was based on two China-based cohorts with a relatively low prevalence of preterm birth, the findings may not be generalizable and should be validated by further work in other populations.
In the future, they hope to further elucidate the molecular mechanisms by which C. innocuum modulates preterm birth risk and potentially identify optimal intervention strategies to mitigate the bacteria's impact on pregnancy. They also seek to characterize the interaction between C. innocuum and host estrogen metabolism more generally, beyond pregnancy.
Maternal gut microbiome during early pregnancy predicts preterm birth, Cell Host & Microbe (2025). DOI: 10.1016/j.chom.2025.08.004. www.cell.com/cell-host-microbe … 1931-3128(25)00330-0
Part 2
on Thursday
Dr. Krishna Kumari Challa
Donor-egg births may carry more risks
More women than ever are carrying babies conceived with someone else's egg—but few are told that this might carry greater health risks.
Pregnancies involving an embryo that doesn't share the pregnant woman's DNA are becoming more common. For many, it's a path to parenthood that would otherwise be closed.
But emerging evidence suggests that these pregnancies may come with higher rates of complications, including preeclampsia, gestational diabetes and preterm birth, and that women are often not given the full picture before treatment.
As the fertility industry expands and diversifies, it's time to ask whether patients are being adequately informed about the risks of carrying another woman's egg—and whether more caution is needed in how these options are presented.
There are three situations in which a woman may carry another woman's egg in her uterus.
The most common is when a woman cannot produce her own eggs but has a functioning uterus. In this case, donor eggs and in-vitro fertilization (IVF) offer the only route to pregnancy.
The other two situations involve fertile women carrying a donated egg on behalf of someone else. This happens in cases of gestational surrogacy, where a surrogate carries a baby genetically unrelated to her, or in reciprocal IVF, also known as ROPA or co-IVF. In the latter, one woman in a same-sex couple (or a trans man) donates her egg to her partner, so that both have a biological connection to the child.
In IVF, fertilization occurs outside the body and the resulting embryo is transferred into the uterus. But what happens when the egg in the uterus has no genetic similarity to the woman carrying it? Could this cause complications for her or the baby?
To answer that question, we need to compare outcomes in these situations to pregnancies where the egg shares approximately 50% of the mother's DNA, either through natural conception or own-egg IVF. Early evidence suggests that having someone else's egg in the uterus is associated with a higher risk of obstetric complications, including preeclampsia, gestational diabetes and preterm birth.
There are three key comparisons to make. First, donor-egg IVF vs. own-egg IVF: for infertile women using donor eggs, the most relevant comparison is IVF with their own eggs. Second, gestational vs. traditional surrogacy: in gestational surrogacy, the surrogate carries a donor egg, while in traditional surrogacy she uses her own; outcomes can also be compared with the surrogate's previous natural pregnancies.
Third, reciprocal IVF vs. own-egg IVF: in same-sex couples, reciprocal IVF can be compared to own-egg IVF to assess risks.
Part 1
on Thursday
Dr. Krishna Kumari Challa
A review of 11 studies comparing donor-egg IVF to own-egg IVF found that donor-egg pregnancies had significantly higher rates of hypertensive disorders in the mother, as well as preterm birth and babies that were small for their gestational age.
A separate review focusing on preeclampsia in singleton IVF pregnancies found the condition occurred in 11.2% of donor-egg pregnancies, compared to 3.9% of own-egg pregnancies.
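For scale, a quick back-of-the-envelope calculation using only the two percentages quoted above puts the excess at roughly a threefold relative risk:

```python
# Relative risk of preeclampsia implied by the figures quoted above
donor_egg_rate = 0.112   # 11.2% of donor-egg singleton IVF pregnancies
own_egg_rate = 0.039     # 3.9% of own-egg singleton IVF pregnancies
print(f"relative risk ~ {donor_egg_rate / own_egg_rate:.1f}x")  # ~2.9-fold
```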
For women who can only become pregnant using a donor egg, these risks may be worth accepting. But it's important that women are made aware of the potential complications, especially if carrying twins, which further increases risks.
Women deserve full, unbiased information about the risks. That includes knowing that carrying someone else's egg may increase the likelihood of pregnancy complications. They can then make informed decisions about whether the potential benefits outweigh the risks.
Phoebe Barry et al, The outcomes of surrogate pregnancy using a donor egg compared to the surrogate's egg: a systematic review, Preprints (2025). DOI: 10.22541/au.174919886.60741282/v1
Part 2
on Thursday
Dr. Krishna Kumari Challa
DNA cassette tapes could solve global data storage problems
Our increasingly digitized world has a data storage problem. Hard drives and other storage media are reaching their limits, and we are creating data faster than we can store it. Fortunately, we don't have to look too far for a solution, because nature already has a powerful storage medium with DNA (deoxyribonucleic acid). It is this genetic material that researchers are using to create DNA storage cassettes.
DNA is the ultimate data storage solution because it is compact, dense and durable. It can hold an enormous amount of information in a microscopic space and preserve that data for thousands of years without needing electricity. Theoretically, the DNA in a single human cell has a capacity of approximately 3.2 gigabytes, which equates to roughly 6,000 books, 1,000 pieces of music or two movies.
Scientists have known about DNA's potential as a storage solution for a long time, but the challenge up until now has been to create a viable system that we can use. In a new study published in the journal Science Advances, researchers describe how they made a DNA cassette similar to cassette tapes that were staples in personal and car stereos in the 1980s.
The team first created the physical tape from a polyester-nylon blend. Then they printed barcode patterns on it to make millions of tiny, separate sections, similar to folders on a computer. This lets the system find the exact spot where the data is stored. Accessing information has been one of the problems of previous DNA storage techniques.
To store a file, digital data is first translated into a DNA sequence. The four bases, or building blocks of DNA (A, G, C, and T) act as a code, similar to the zeroes and ones that computers use. The researchers also coated the tape with a protective crystalline layer to protect the DNA bonds from breaking down.
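The bit-to-base idea is simple to sketch. The mapping below is an arbitrary illustration of packing two bits into each base; the paper's actual encoding, barcode indexing, and error-correction scheme are more involved and are not reproduced here.

```python
# Illustrative base-4 DNA encoding: two bits per base. The specific
# 2-bit-to-base assignment is an arbitrary choice for this sketch.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def bytes_to_dna(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(seq: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in seq)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

message = b"hello"
encoded = bytes_to_dna(message)          # 'CGGACGCCCGTACGTACGTT'
assert dna_to_bytes(encoded) == message  # round trip recovers the original data
print(message, "->", encoded)
```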
Finally, they proved the system works by converting a digital image into DNA, then successfully and quickly retrieving it from the tape.
DNA cassette tape provides a strategy for fast, compact, large-scale DNA-based cold (infrequently accessed) or warm (needed on demand) data storage, the scientists wrote in their paper.
Jiankai Li et al, A compact cassette tape for DNA-based data storage, Science Advances (2025). DOI: 10.1126/sciadv.ady3406
on Friday
Dr. Krishna Kumari Challa
Medications leave lasting mark on gut microbiome, even years after use
Medications taken years ago can continue to shape the human gut microbiome, according to a large-scale study.
Analyzing stool samples and prescription records from over 2,500 Estonian Biobank participants in the Estonian Microbiome cohort, researchers found that the majority of drugs studied were linked to microbiome changes, with a substantial number of them also showing long-term effects detectable years after patients stopped taking them.
The impact was not limited to antibiotics: antidepressants, beta-blockers, proton pump inhibitors, and benzodiazepines all left microbial "fingerprints."
Most microbiome studies consider only current medications, but these results show that past drug use can be just as important: it is a surprisingly strong factor in explaining individual microbiome differences.
This highlights that it is critical to account for drug usage history when studying links between the microbiome and disease. The research is published in the journal mSystems.
Interestingly, benzodiazepines—commonly prescribed for anxiety—had microbiome effects comparable to broad-spectrum antibiotics. The results also show that drugs from the same class that might be used for the same condition, e.g. diazepam and alprazolam, may differ in how much they disrupt the microbiome.
Follow-up samples from a subset of participants confirmed that starting or stopping certain drugs caused predictable microbial shifts, suggesting causal effects. Despite the small sample size of the second time-point analysis, the authors were able to verify long-term effects of proton pump inhibitors, selective serotonin reuptake inhibitors and antibiotics, such as penicillins in combination and macrolides.
Oliver Aasmets et al, A hidden confounder for microbiome studies: medications used years before sample collection, mSystems (2025). DOI: 10.1128/msystems.00541-25
on Friday
Dr. Krishna Kumari Challa
Breathlessness increases long-term mortality risk, finds a study in Malawi
Research led by Liverpool School of Tropical Medicine and the Malawi-Liverpool-Wellcome Program shows that over half of hospital patients with breathlessness had died within a year of admission (51%), as opposed to just 26% of those without the symptom.
Most of these patients had more than one condition that caused breathlessness, including pneumonia, anemia, heart failure and TB.
The findings demonstrate the importance of integrated, patient-centered care, researchers say, to tackle the burden of high mortality for people with breathlessness, particularly in low-income countries. The work appears in Thorax.
Most of these patients live with more than one condition at the same time, such as TB alongside pneumonia, which the researchers found to be linked to higher mortality. This suggests that treating diseases in isolation is not enough, and that health care models that have traditionally focused on a single presenting condition may overlook important concurrent diseases.
Acute breathlessness as a cause of hospitalisation in Malawi: a prospective, patient-centred study to evaluate causes and outcomes, Thorax (2025). DOI: 10.1136/thorax-2025-223623
on Friday
Dr. Krishna Kumari Challa
Smells that deceive the brain: Research reveals how certain aromas are interpreted as taste
on Saturday
Dr. Krishna Kumari Challa
Microbial allies: Some bacteria help fight cancer
New research, published online in Cell Systems, provides a significant breakthrough in understanding how gut bacteria influence cancer, identifying a powerful anti-cancer metabolite produced by bacteria associated with colorectal cancer.
This finding opens the door to new strategies for treating cancer, including the development of novel drugs that could make existing therapies more potent.
The researchers used a sophisticated large-scale screening approach to test over 1,100 conditions in C. elegans. Through this, they found that the bacterium E. coli produced a molecule called 2-methylisocitrate (2-MiCit) that could improve the effectiveness of the chemotherapy drug 5-fluorouracil (5-FU).
Using computer modeling, the team demonstrated that the tumor-associated microbiome (bacteria found within and around tumors) of patients was also able to produce 2-MiCit. To confirm the effectiveness of 2-MiCit, the team used two further systems: human cancer cells and a fly model of colorectal cancer. In both cases, they found that 2-MiCit showed potent anti-cancer properties, and in the flies it could extend survival.
Bacteria are associated with tumors, and now scientists are starting to understand the chemical conversation they're having with cancer cells.
They found that one of these bacterial chemicals can act as a powerful partner for chemotherapy, disrupting the metabolism of cancer cells and making them more vulnerable to the drug.
The study revealed that 2-MiCit works by inhibiting a key enzyme in the mitochondria (structures inside cells that generate energy for cellular functions) of cancer cells. This leads to DNA damage and activates pathways known to reduce the progression of cancer. This multi-pronged attack weakens the cancer cells and works in synergy with 5-FU. The combination was significantly more effective at killing cancer cells than either compound alone.
These exciting discoveries highlight how the cancer-associated microbiome can impact tumor progression, and how metabolites produced by these bacteria could be harnessed to improve cancer treatments.
These findings are also important in the context of personalized medicine, emphasizing the importance of considering not only the patient, but also their microbes.
Daniel Martinez-Martinez et al, Chemotherapy modulation by a cancer-associated microbiota metabolite, Cell Systems (2025). DOI: 10.1016/j.cels.2025.101397
on Saturday
Dr. Krishna Kumari Challa
Researchers may have found a way to limit the debilitating damage strokes can cause
on Saturday
Dr. Krishna Kumari Challa
A pathological partnership between Salmonella and yeast in the gut
Researchers have found that a common gut yeast, Candida albicans, can help Salmonella Typhimurium take hold in the intestine and spread through the body. When the two interact, a Salmonella protein called SopB prompts the yeast to release arginine, which turns on Salmonella's invasion machinery and quiets the body's inflammation signals.
Gut microbes shape human health across colonization resistance, immune training, digestion, and signaling that reaches distant organs. Bacteria dominate both abundance and research attention, while roles for viruses and fungi remain less defined.
Altered mycobiome composition appears in multiple gastrointestinal diseases, and integration of fungi into gut ecology and into interactions with commensal and pathogenic bacteria remains largely unknown.
Non-typhoidal Salmonella ranks among the best-studied enteric pathogens, infecting an estimated 100 million people each year. Healthy individuals typically experience localized inflammatory diarrhea, while immunocompromised patients face risks of spread to peripheral organs.
Establishing gut colonization requires competition with resident microorganisms, and commensal fungi occur across tested mammalian species, yet mycobiome contributions during enteric infection remain largely unexplored.
Candida albicans is a frequent colonizer of human mucosal surfaces, present in the gut of more than 60% of healthy humans. Usual behavior is commensal, with pathogenic potential particularly in immunocompromised hosts. A key virulence trait is morphology switching from yeast to epithelium-penetrating hyphae.
Associations with inflammatory bowel disease, specifically Crohn's disease, have been reported. C. albicans does not itself induce gut inflammation, but it has been shown to exacerbate it. Both Salmonella and C. albicans thrive under inflammatory gut conditions, and C. albicans likely resides in the gut of many patients at the time Salmonella infection occurs.
In the study, "Commensal yeast promotes Salmonella Typhimurium virulence," published in Nature, researchers investigated cross-kingdom interactions to determine how Candida albicans influences Salmonella colonization, systemic dissemination, and host inflammatory responses.
In the experiments conducted in mice, Candida in the gut led to higher Salmonella loads in the large intestine and more bacteria reaching the spleen and liver, with co-infected mice losing more weight. Candida also boosted Salmonella entry into human colon cell lines. Gene readouts showed Salmonella's invasion machinery switched on near Candida.
Co-cultures contained millimolar arginine, and adding L-arginine alone increased invasion in a dose-dependent way, while an arginine-transporter mutant did not respond to Candida. Candida lacking arginine production also failed to boost Salmonella invasion or gut colonization, and an ARG4 revertant restored the effect.
Researchers conclude that C. albicans colonization represents a susceptibility factor for Salmonella infection, with arginine acting as a pivotal metabolite connecting fungus, bacterium, and host. Findings point to SopB-driven arginine production in Candida that boosts Salmonella's invasion program while softening host inflammatory signals.
Kanchan Jaswal et al, Commensal yeast promotes Salmonella Typhimurium virulence, Nature (2025). DOI: 10.1038/s41586-025-09415-y
yesterday
Dr. Krishna Kumari Challa
The sound of crying babies makes our faces hotter, according to new research
Hearing a baby cry can trigger a range of responses in adults, such as sympathy, anxiety and a strong urge to help. However, new research suggests that a deeper physical reaction is also occurring. A baby's cry, particularly when the baby is in pain or distress, makes our faces physically warmer.
Since they can't speak yet, babies cry to communicate their needs, whether they're in pain or want some attention. When a baby is in distress, they forcefully contract their ribcage, which produces high-pressure air that causes their vocal cords to vibrate chaotically. This produces complex disharmonious sounds known as nonlinear phenomena (NLP).
To study how adults respond to crying babies, scientists played 23 different recordings to 41 men and women with little to no experience with young infants. At the same time, a thermal infrared imaging camera measured subtle changes to their facial temperatures. A rise in temperature in this part of the body is governed by the autonomic nervous system, a network of nerves that controls unconscious processes such as breathing and digestion. After each cry, the participants rated whether the baby was in discomfort or in pain.
The study found that adults' facial temperatures change when they hear a baby cry, a clear sign that the autonomic nervous system has been activated. This suggests that people unconsciously pick up on acoustic features in a baby's cry. The higher the level of NLP (meaning a baby is in more pain or distress), the stronger and more in sync the listener's facial temperature became. In other words, as the cry grew louder, a person's face grew warmer. This physiological reaction was the same for both men and women.
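As an illustration of the kind of analysis this implies (a sketch under assumed sampling rates and synthetic signals, not the study's code), one can extract a cry's amplitude envelope with a Hilbert transform, downsample it to a thermal camera's frame rate, and correlate it with a facial-temperature trace:

```python
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)

fs_audio = 8000                        # audio sample rate [Hz] (assumed)
fs_thermal = 10                        # thermal camera frame rate [Hz] (assumed)
t = np.arange(0, 10, 1 / fs_audio)     # 10 s of synthetic "cry"

# Synthetic cry: a 400 Hz tone whose loudness slowly waxes and wanes
envelope_true = 0.5 * (1 + np.sin(2 * np.pi * 0.3 * t))
cry = envelope_true * np.sin(2 * np.pi * 400 * t)

# Amplitude envelope via the analytic signal
envelope = np.abs(hilbert(cry))

# Downsample the envelope to thermal frames, fake a temperature trace that
# lags and tracks the envelope (plus noise), then measure their correlation
step = fs_audio // fs_thermal
env_frames = envelope[::step]
temperature = 34.0 + 0.05 * np.roll(env_frames, 5) + 0.01 * rng.normal(size=env_frames.size)

r = np.corrcoef(env_frames, temperature)[0, 1]
print(f"envelope-temperature correlation: r = {r:.2f}")
```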
Lény Lego et al, Nonlinear acoustic phenomena tune the adults' facial thermal response to baby cries with the cry amplitude envelope, Journal of the Royal Society Interface (2025). DOI: 10.1098/rsif.2025.0150
yesterday
Dr. Krishna Kumari Challa
Scientists engineer plants to double carbon uptake ability and produce more seeds and lipids
Typically, plants rely on the Calvin-Benson-Bassham (CBB) cycle to convert carbon dioxide in the atmosphere to usable organic matter for growth. Although this cycle is the main pathway for carbon fixation in all plants on Earth, it is surprisingly inefficient—losing one third of carbon in the cycle when synthesizing the molecule acetyl–coenzyme A (CoA) to generate lipids, phytohormones, and metabolites. Plants also lose carbon during photorespiration, which limits their growth. This is largely due to the inefficiency of an enzyme called RuBisCO.
In efforts to increase carbon uptake and reduce carbon loss in plants to boost biomass and lipid production, scientists have experimented with ways to increase the efficiency of RuBisCO, overexpress CBB cycle enzymes, introduce carbon-concentrating mechanisms, and reduce photorespiration losses. But, a new study published in Science, focuses on a novel approach—creating an altogether new pathway for carbon uptake.
The researchers involved in the study introduced a synthetic CO2 uptake cycle into the plant Arabidopsis thaliana. They refer to the engineered cycle as the malyl-CoA-glycerate (McG) cycle, which works in conjunction with the CBB cycle to create a dual-cycle CO2 fixation system. The new cycle increases efficiency by using previously wasted carbon.
"In the McG cycle, one additional carbon is fixed when 3PG is the input, or no carbon is lost when glycolate is the input. In both cases, acetyl-CoA is produced more efficiently, which is expected to enhance the production of lipids and other important plant metabolites, including phytohormones," the authors write.
Kuan-Jen Lu et al, Dual-cycle CO2 fixation enhances growth and lipid synthesis in Arabidopsis thaliana, Science (2025). DOI: 10.1126/science.adp3528
yesterday
Dr. Krishna Kumari Challa
Scientists discover how nanoplastics disrupt brain energy metabolism
Scientists have discovered how nanoplastics—even smaller than microplastics—disrupt energy metabolism in brain cells. Their findings may have implications for better understanding neurodegenerative diseases characterized by declining neurological or brain function, and even shed new light on issues with learning and memory.
The study has revealed the specific mechanism by which these tiny nanoplastics can interfere with energy production in the brain in an animal model. The findings, recently published in the Journal of Hazardous Materials: Plastics, provide fresh insights into the potential health risks posed by environmental plastics.
Polystyrene nanoplastics (PS-NPs) are produced when larger plastics break down in the environment. These particles have been detected in multiple organs in the body, including the brain, sparking growing concerns about their possible role in neurological disease.
The researchers focused on mitochondria, which are critical for producing the energy needed for brain function. Mitochondrial dysfunction is a well-known feature of neurodegenerative diseases such as Parkinson's and Alzheimer's, as well as normal aging.
By isolating mitochondria from brain cells, the researchers showed that exposure to PS-NPs specifically disrupted the "electron transport chain," the set of protein complexes that work together to generate cellular energy in the form of ATP. While individual mitochondrial complexes I and II were not directly impaired, electron transfer between complexes I–III and II–III, as well as the activity of complex IV, was significantly inhibited.
The scientists found that electron transfer between complexes I–III and II–III was potently inhibited even at much lower nanoplastic concentrations, suggesting that environmentally relevant exposures could also impair bioenergetic function over chronic timeframes.
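As a minimal sketch of how such inhibition is typically quantified, activity in exposed mitochondria can be expressed relative to an untreated control; the numbers below are hypothetical placeholders, not values reported in the study.

# Hypothetical assay readings in arbitrary units, purely to illustrate the
# calculation; these are NOT values from the study.
control_activity = 100.0   # mitochondrial ETC activity with vehicle only
treated_activity = 55.0    # activity after exposure to PS-NPs

# Inhibition is the fractional loss of activity relative to the control.
percent_inhibition = (1.0 - treated_activity / control_activity) * 100.0
print(f"{percent_inhibition:.0f}% inhibition relative to control")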
Interestingly, the same broad effects were seen in synaptic mitochondria, which are essential for communication between brain cells. This suggests that nanoplastics could also interfere with synaptic plasticity, a process fundamental to learning and memory.
D.M. Seward et al, Polystyrene nanoplastics target electron transport chain complexes in brain mitochondria, Journal of Hazardous Materials: Plastics (2025). DOI: 10.1016/j.hazmp.2025.100003
The death of the dinosaurs reengineered Earth
Dinosaurs had such an immense impact on Earth that their sudden extinction led to wide-scale changes in landscapes—including the shape of rivers—and these changes are reflected in the geologic record, according to a new study.
Scientists have long recognized the stark difference in rock formations from just before dinosaurs went extinct to just after, but chalked it up to sea level rise, coincidence, or other abiotic reasons. But the new study shows that once dinosaurs were extinguished, forests were allowed to flourish, which had a strong impact on rivers.
Studying these rock layers, the researchers suggest that dinosaurs were likely enormous "ecosystem engineers," knocking down much of the available vegetation and keeping land between trees open and weedy. The result was rivers that spilled openly, without wide meanders, across landscapes. Once the dinosaurs perished, forests were allowed to flourish, helping stabilize sediment and corralling water into rivers with broad meanders.
Their results, published in the journal Communications Earth & Environment, demonstrate how rapidly the Earth can change in response to catastrophic events.
Very often when we're thinking about how life has changed through time and how environments change through time, it's usually that the climate changes and, therefore, it has a specific effect on life, or this mountain has grown and, therefore, it has a specific effect on life. It's rarely thought that life itself could actually alter the climate and the landscape. The arrow doesn't just go in one direction, the researchers say.
Dinosaur extinction can explain continental facies shifts at the Cretaceous-Paleogene boundary, Communications Earth & Environment (2025). DOI: 10.1038/s43247-025-02673-8
Ants defend plants from herbivores, but can hinder pollination by bees
Around 4,000 plant species from different parts of the world secrete nectar outside their flowers, such as on their stems or leaves, through secretory glands known as extrafloral nectaries. Unlike floral nectar, extrafloral nectar does not attract pollinators; rather, it attracts insects that defend plants, such as ants. These insects feed on the sweet liquid and, in return, protect the plant from herbivores. However, this protection comes at a cost.
A study published in the Journal of Ecology points out that the presence of ants can reduce the frequency and duration that bees visit the flowers of plants with extrafloral nectaries.
Pollination is only impaired when extrafloral nectaries are close to the flowers. Plants with these glands in other locations, such as on their leaves or branches, had increased reproductive success, likely due to the protection against herbivores provided by ants.
On the other hand, butterflies, another group of pollinators, are not affected by ants. This may be due to the way these two groups feed. Butterflies use a long, straw-like organ called a proboscis to suck nectar from a distance, keeping them safe from ants.
Bees, by contrast, need to get very close to the flower to collect pollen and floral nectar, but ants don't allow them to stay for long. Not surprisingly, the new analysis showed that the presence of ants is detrimental to pollination when extrafloral nectaries are close to flowers, but has a positive effect on plant reproduction when they're located further away.
The conclusions are the result of an analysis of data from 27 empirical studies on the relationships between ants, pollinators, and plants with extrafloral nectaries. The articles were selected from an initial screening of 567 studies after applying inclusion and exclusion criteria. The data were compiled and analyzed with computational tools.
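As a minimal sketch of how results from many studies can be pooled in this kind of analysis, the snippet below uses made-up effect sizes and a generic inverse-variance weighting, not the authors' actual data or statistical models.

import numpy as np

# Hypothetical per-study effect sizes (e.g., effect of ant presence on bee
# visitation) and their variances; illustrative only, not the 27 studies' data.
effects = np.array([-0.40, -0.15, 0.05, -0.30, -0.10])
variances = np.array([0.04, 0.02, 0.05, 0.03, 0.02])

# Inverse-variance weighting: more precise studies contribute more to the pool.
weights = 1.0 / variances
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(f"pooled effect: {pooled:.2f} (SE {pooled_se:.2f})")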
Amanda Vieira da Silva et al, Ants on flowers: Protective ants impose a low but variable cost to pollination, moderated by location of extrafloral nectaries and type of flower visitor, Journal of Ecology (2025). DOI: 10.1111/1365-2745.70087
Culture is overtaking genetics in shaping human evolution, researchers argue
Some researchers are theorizing that human beings may be in the midst of a major evolutionary shift—driven not by genes, but by culture.
In a paper published in BioScience, they argue that culture is overtaking genetics as the main force shaping human evolution.
"When we learn useful skills, institutions or technologies from each other, we are inheriting adaptive cultural practices. On reviewing the evidence, we find that culture solves problems much more rapidly than genetic evolution. This suggests our species is in the middle of a great evolutionary transition", they say.
Cultural practices—from farming methods to legal codes—spread and adapt far faster than genes can, allowing human groups to adapt to new environments and solve novel problems in ways biology alone could never match. According to the research team, this long-term evolutionary transition extends deep into the past, it is accelerating, and may define our species for millennia to come.
Cultural evolution eats genetic evolution for breakfast, they argue.
In the modern environment, cultural systems adapt so rapidly that they routinely "preempt" genetic adaptation. For example, eyeglasses and surgery correct vision problems that genes once left to natural selection.
Medical technologies like cesarean sections or fertility treatments allow people to survive and reproduce in circumstances that once would have been fatal or sterile. These cultural solutions, researchers argue, reduce the role of genetic adaptation and increase our reliance on cultural systems such as hospitals, schools and governments.
Today, your well-being is determined less and less by your personal biology and more and more by the cultural systems that surround you—your community, your nation, your technologies. And the importance of culture tends to grow over the long term because culture accumulates adaptive solutions more rapidly.
Over time, this dynamic could mean that human survival and reproduction depend less on individual genetic traits and more on the health of societies and their cultural infrastructure.
But this transition comes with a twist: because culture is fundamentally a shared phenomenon, it tends to generate group-based solutions.
Using evidence from anthropology, biology and history, Waring and Wood argue that group-level cultural adaptation has been shaping human societies for millennia, from the spread of agriculture to the rise of modern states. They note that today, improvements in health, longevity and survival reliably come from group-level cultural systems like scientific medicine and hospitals, sanitation infrastructure and education systems, rather than from individual intelligence or genetic change.
The researchers argue that if humans are evolving to rely on cultural adaptation, we are also evolving to become more group-oriented and group-dependent, signaling a change in what it means to be human.
In the history of evolution, life sometimes undergoes transitions that change what it means to be an individual. This happened when single cells evolved into multicellular organisms and when social insects evolved into ultra-cooperative colonies. These individuality transitions transform how life is organized, adapts and reproduces. Biologists have been skeptical that such a transition is occurring in humans.
But the researchers suggest that because culture is fundamentally shared, our shift to cultural adaptation also means a fundamental reorganization of human individuality—toward the group.
Cultural organization makes groups more cooperative and effective. And larger, more capable groups adapt—via cultural change—more rapidly. It's a mutually reinforcing system, and the data suggest it is accelerating.
For example, genetic engineering is a form of cultural control of genetic material, but genetic engineering requires a large, complex society. So, in the far future, if the hypothesized transition ever comes to completion, our descendants may no longer be genetically evolving individuals, but societal "superorganisms" that evolve primarily via cultural change.
The researchers emphasize that their theory is testable and lay out a system for measuring how fast the transition is happening. The team is also developing mathematical and computer models of the process and plans to initiate a long-term data collection project in the near future. They caution, however, against treating cultural evolution as progress or inevitability.
They are not suggesting that some societies, like those with more wealth or better technology, are morally 'better' than others. Evolution can create both good solutions and brutal outcomes. They think this understanding might help our whole species avoid the most brutal parts.
The goal of this work is to use their understanding of deep patterns in human evolution to foster positive social change.
Still, the new research raises profound questions about humanity's future. "If cultural inheritance continues to dominate, our fates as individuals, and the future of our species, may increasingly hinge on the strength and adaptability of our societies. And if so, the next stage of human evolution may not be written in DNA, but in the shared stories, systems, and institutions we create together," the researchers conclude.
Timothy M Waring et al, Cultural inheritance is driving a transition in human evolution, BioScience (2025). DOI: 10.1093/biosci/biaf094
Scientists uncover how cellular receptors trigger inflammation and sensory changes
In two new studies, scientists have uncovered detailed blueprints of how certain molecular "gates" in human cells work—findings that could open doors to new treatments for conditions ranging from certain cancers and brain diseases to hearing loss and atherosclerosis, or plaque build-up in the arteries.
They studied a group of proteins known as P2X receptors, which sit on the surface of cells and detect ATP—a molecule best known as the body's energy source inside of cells.
When ATP leaks outside of cells, often as a sign of stress or damage, P2X receptors act like alarm bells, triggering responses related to inflammation, pain and sensory processing.
Extracellular ATP is a universal danger signal. When it builds up outside cells, P2X receptors sense it and change how the cells respond. Understanding these receptors at the atomic level is key to designing drugs that can either calm them down or fine-tune their activity.
In a study published in Nature Communications, researchers examined the molecular structure of the human P2X7 receptor, a protein linked to inflammation-related diseases such as cancer, Alzheimer's and atherosclerosis. Despite years of effort, no drugs targeting P2X7 have reached the market, partly because drugs that worked well in animal models have not had the same success in humans.
Building on a previous study, where they determined how to turn the rat P2X7 receptor off, the team has now mapped how drugs turn off the human P2X7 receptor for the first time. They now know what makes the human receptor different from the receptor that is present in animal models. This is important for understanding how to better customize drugs to fit the binding pockets within the human receptor.
Using that information, the researchers, in collaboration with groups from around the world, designed a new compound referred to as UB-MBX-46. The compound is shaped to complement the binding pocket of the human receptor, yielding a molecule that blocks the receptor with high precision and potency.
This is the first time scientists have visualized the human P2X7 receptor and understood precisely how it differs from the receptor present in animal models. With that knowledge, they can now create a drug candidate that fits the binding pockets of the human receptor much like a key fits a lock, raising hopes for therapies with better chances of reaching the clinic.
A second study, published in Proceedings of the National Academy of Sciences, examined the human P2X2 receptor, a protein in the same family as the P2X7 receptor but one that is predominantly found in the cochlea, the hearing organ of the inner ear.
The P2X2 receptor is involved in hearing processes and in the ear's adaptation to loud noise. Certain genetic mutations of this receptor have been linked to hearing loss. Currently, there are no drugs that target this receptor effectively, and until now, scientists had limited insight into how it functions.
Researchers used cryo-electron microscopy—a powerful imaging method—to capture 3D structures of the human P2X2 receptor in two states: in a resting state and in a state bound to ATP but desensitized, meaning it's not active anymore. The team discovered unique structural features and pinpointed areas where hearing-related mutations occur.
Together, the studies mark a leap forward in understanding how P2X receptors contribute to a wide range of diseases by triggering inflammation and sensory changes.
Adam C. Oken et al, A polycyclic scaffold identified by structure-based drug design effectively inhibits the human P2X7 receptor, Nature Communications (2025). DOI: 10.1038/s41467-025-62643-8
Franka G. Westermann et al, Subtype-specific structural features of the hearing loss–associated human P2X2 receptor, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2417753122