Sugary drinks may increase risk of metastasis in advanced colorectal cancer
A new study by researchers shows that the glucose-fructose mix found in sugary drinks directly fuels metastasis in preclinical models of advanced colorectal cancer. The study was published today in Nature Metabolism.
The researchers studied how sugary drinks may affect late-stage colorectal cancer. Using laboratory cancer models, they compared the effects of the glucose-fructose mix found in most sugary drinks with those of glucose or fructose alone. Only the sugar mix made cancer cells more mobile, leading to faster spread to the liver—the most common site of colorectal cancer metastasis.
The sugar mix activated an enzyme called sorbitol dehydrogenase (SORD), which boosts glucose metabolism and triggers the cholesterol pathway, ultimately driving metastasis. This is the same pathway targeted by statins, common heart drugs that inhibit cholesterol production. Blocking SORD slowed metastasis, even with the sugar mix present. These findings suggest that targeting SORD could also offer an opportunity to block metastasis.
These findings highlight that daily diet matters not only for cancer risk but also for how the disease progresses once it has developed.
Think before you ink: Uncovering the hidden risks in tattoo inks
New research suggests that what's in the ink may pose greater risks than the size and design of the tattoo.
A new study has revealed that the ingredients listed on tattoo ink labels often don't match what's actually inside the bottle. Researchers warn that this comes with risk because there are currently few regulations, laws and safety criteria for tattoo and permanent cosmetic formulations.
The findings, published this month as the cover story in the Journal of Environmental Health, raise fresh concerns about the safety and regulation of tattoo inks.
Using a combination of advanced analytical techniques, researchers found discrepancies between labeled and actual ingredients in a range of commercially available yellow tattoo inks.
The results showed not only discrepancies with label claims, but also the presence of unlisted elements such as aluminum, sodium and silicon.
Batool A. Aljubran et al, Decoding Tattoo Inks: Multiple Analysis Techniques Reveal Discrepancies in Ingredient Composition and Elemental Content When Compared Against Label Claims, Journal of Environmental Health (2025). DOI: 10.70387/001c.143999
Study finds 10% of pediatric blood cancers may stem from medical imaging radiation
A new study has concluded that radiation from medical imaging is associated with a higher risk of blood cancers in children.
Researchers examined data from nearly 4 million children and estimated that one in 10 blood cancers—some 3,000 cancers in all—may be attributable to radiation exposure from medical imaging. The risk increased proportionally based on the cumulative amount of radiation the children received.
The investigation is the first comprehensive assessment using data from children and adolescents in North America that quantifies the association between radiation exposure from medical imaging and blood and bone marrow cancers, such as leukemia and lymphoma, which are the most common forms of cancer in children and adolescents.
Medical imaging saves lives by enabling timely diagnosis and effective treatment, but it also exposes patients to ionizing radiation, a known carcinogen, particularly through computed tomography (CT).
Children are particularly vulnerable to radiation-induced cancer due to their heightened radiosensitivity and longer life expectancy.
The authors caution that doctors and parents should avoid excessive radiation doses and minimize exposure when clinically feasible.
Investigators found a significant relationship between cumulative radiation dose and the risk of a hematologic malignancy, which includes tumors affecting the blood, bone marrow, lymph, and lymphatic system.
Among all forms of medical imaging, chest radiography was the most common exam performed. The most common form of CT was of the head and brain.
For children who underwent a head CT, the researchers attributed about a quarter of the children's subsequent hematologic malignancies to radiation exposure.
The authors emphasized that while medical imaging remains an invaluable tool in pediatric care, their findings highlight the need to carefully balance its diagnostic benefits with potential long-term risks.
Rebecca Smith-Bindman, et al. Medical Imaging and Pediatric and Adolescent Hematologic Cancer Risk. The New England Journal of Medicine (2025). DOI: 10.1056/NEJMoa2502098
El Niño, which means little boy in Spanish, is a complex, cyclical weather pattern that warms the surface waters in the eastern Pacific Ocean. This weakens trade winds and alters atmospheric conditions, a process that has long been known to suppress summer rainfall in India.
Studies found a direct connection between Indian rainfall and El Niño. In the drier regions of southeastern and northwestern India, this climate phenomenon reduces total rainfall and the intensity of extreme events. However, in the wet central and southwestern parts of the country, it results in less frequent but more intense downpours. This is due to changes in convective buoyancy, an atmospheric force that powers storms.
During El Niño summers, convective buoyancy increases in the wettest regions, which makes more extreme rainfall likely. The chance of a very heavy downpour (more than 250 mm of rain) increases by 43% across all of India's monsoon-affected areas and 59% for the Central Monsoon Zone during El Niño summers.
"Extreme daily rainfall over the Indian summer monsoon's rainiest areas becomes more likely the stronger that El Niño conditions are," commented the researchers in their study.
Hormone therapy eases hot flashes while leaving heart risk unchanged in younger women
Researchers report that menopausal hormone therapy relieved vasomotor symptoms without raising cardiovascular risk in younger women, but increased risk was evident in women aged 70 years and older.
Concerns about the safety of hormone therapy have discouraged many women and clinicians from using it to treat hot flashes and night sweats. Some previous studies have suggested increased risk of coronary heart disease linked to hormone use, particularly in older women. At the same time, vasomotor symptoms remain among the most common and distressing consequences of menopause, and effective treatment options have been limited.
In the study, "Menopausal Hormone Therapy and Cardiovascular Diseases in Women With Vasomotor Symptoms: A Secondary Analysis of the Women's Health Initiative Randomized Clinical Trials,"publishedinJAMA Internal Medicine, researchers analyzed outcomes from two hormone therapy trials to assess the safety of treatment in women experiencing vasomotor symptoms.
A total of 27,347 postmenopausal women aged 50 to 79 years were enrolled from 40 clinical centers across the United States. Among them, 10,739 had undergone hysterectomy and were randomized to receive conjugated equine estrogens (CEE) or placebo, while 16,608 with an intact uterus were randomized to CEE plus medroxyprogesterone acetate (MPA) or placebo.
Women were followed for a median of 7.2 years in the CEE trial and 5.6 years in the CEE plus MPA trial. Vasomotor symptoms were assessed at baseline and year one, with nearly all women who reported moderate or severe symptoms recalling onset near menopause.
CEE alone reduced vasomotor symptoms by 41% across all age groups. CEE plus MPA also reduced symptoms but with age-related variation, showing strong benefit in women aged 50 to 59 years and diminished or absent benefit in women 70 years and older. Analysis of cardiovascular outcomes found neutral effects of both CEE alone and CEE plus MPA in women aged 50 to 59 years with moderate or severe symptoms. No clear harm was observed in those aged 60 to 69 years, although risk estimates were elevated for CEE alone.
Women aged 70 years and older had significantly higher rates of atherosclerotic cardiovascular disease, with hazard ratios of 1.95 for CEE alone and 3.22 for CEE plus MPA, corresponding to 217 and 382 excess events per 10,000 person-years, respectively.
Investigators concluded that hormone therapy can be considered for treatment of vasomotor symptoms in younger women aged 50 to 59 years, while initiation in women aged 60 to 69 years requires caution. For women 70 years and older, the findings argue against the use of hormone therapy due to substantially increased cardiovascular risk.
Jacques E. Rossouw et al, Menopausal Hormone Therapy and Cardiovascular Diseases in Women With Vasomotor Symptoms, JAMA Internal Medicine (2025). DOI: 10.1001/jamainternmed.2025.4510
Deborah Grady et al, Hormone Therapy for Menopausal Vasomotor Symptoms—Better Understanding Cardiovascular Risk, JAMA Internal Medicine (2025). DOI: 10.1001/jamainternmed.2025.4521
COVID-19 models suggest universal vaccination may avert over 100,000 hospitalizations
The US Scenario Modeling Hub, a collaborative modeling effort of 17 academic research institutions, reports that a universal COVID-19 vaccination recommendation could avert thousands more hospitalizations and deaths than a high-risk-only strategy.
Nine independent teams produced projections under six scenarios that combined two annual immune escape rates, 20% and 50%, with three vaccine recommendation strategies. An immune escape rate is the annual reduction in protection against infection that occurs as new SARS-CoV-2 variants evolve. Projections covered the United States population of 332 million, including an estimated 58 million people aged 65 years and older.
Three vaccine recommendation strategies were tested: no recommendation, recommendation for high-risk groups only, or recommendation for all eligible individuals. Annually reformulated vaccines were assumed to match variants circulating on June 15, 2024, to be available on September 1, 2024, and to be 75% effective against hospitalization at the time of release.
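For intuition, the six scenarios can be read as the cross product of the two annual immune escape rates and the three recommendation strategies. The sketch below only illustrates that design grid; it is not the teams' actual code, and the strategy labels are hypothetical placeholders.

```python
from itertools import product

# Hypothetical labels illustrating the 2 x 3 = 6 scenario design.
immune_escape_rates = [0.20, 0.50]  # assumed annual loss of protection against infection
strategies = ["no_recommendation", "high_risk_only", "all_eligible"]

scenarios = [
    {"immune_escape": rate, "strategy": strategy}
    for rate, strategy in product(immune_escape_rates, strategies)
]

for s in scenarios:
    print(s)  # prints the six scenario definitions
```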
Teams calibrated their models to weekly hospitalizations and deaths reported by the National Healthcare Safety Network and the National Center for Health Statistics.
In the worst case scenario, defined by high immune escape with no vaccine recommendation, projections reached 931,000 hospitalizations with a 95% projection interval of 0.5 to 1.3 million and 62,000 deaths with a 95% projection interval of 18,000 to 115,000.
In the best case defined by low immune escape with a universal recommendation, the ensemble projected 550,000 hospitalizations with a 95% projection interval of 296,000 to 832,000 and 42,000 deaths with a 95% projection interval of 13,000 to 72,000.
Severe outcomes clustered in older populations. Across scenarios, adults aged 65 years and older accounted for 51% to 62% of hospitalizations and 84% to 87% of deaths.
Vaccination of high-risk groups was projected to avert only 11% of hospitalizations under low immune escape and 8% under high escape, along with 13% and 10% of deaths. A universal recommendation increased the effect, with 15% fewer hospitalizations under low immune escape and 11% fewer under high escape, and 16% and 13% fewer deaths.
Under high immune escape, a high-risk-only strategy averted 76,000 hospitalizations with a 95% CI of 34,000 to 118,000 and 7,000 deaths with a 95% CI of 3,000 to 11,000. Expanding to a universal recommendation prevented 104,000 hospitalizations with a 95% CI of 55,000 to 153,000 and 9,000 deaths with a 95% CI of 4,000 to 14,000.
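As a quick arithmetic check, the averted hospitalization counts under high immune escape are consistent with the relative reductions quoted above. This sketch uses only the point estimates from the text and ignores the projection intervals.

```python
# Point estimates reported for the high immune escape scenario.
baseline_hospitalizations = 931_000      # no-recommendation baseline
averted_high_risk_only = 76_000          # high-risk-only strategy
averted_universal = 104_000              # universal recommendation

print(f"high-risk-only: {averted_high_risk_only / baseline_hospitalizations:.0%} averted")  # ~8%
print(f"universal:      {averted_universal / baseline_hospitalizations:.0%} averted")       # ~11%
```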
Additional indirect benefits accrued to older adults under a universal strategy. Compared with high-risk-only vaccination, universal recommendations prevented about 11,000 more hospitalizations and 1,000 more deaths in those aged 65 years and older.
Observed national patterns diverged in timing from projections. A marked summer 2024 wave was followed by a smaller peak in January 2025, while projections anticipated the heaviest burden from late December 2024 to mid January 2025. Ensemble coverage for weekly deaths remained strong, with 95% intervals closely matching observed values.
Authors conclude that vaccines remain a critical tool to limit COVID-19 burden in 2024–2025, with universal recommendations offering added direct and indirect protection and the potential to save thousands more lives, including among older adults.
Nanoparticles supercharge vinegar's old-fashioned wound healing power
Wounds that do not heal are often caused by bacterial infections and are particularly dangerous for the elderly and people with diabetes, cancer and other conditions. Acetic acid (more commonly known as vinegar) has been used for centuries as a disinfectant, but it is only effective against a small number of bacteria, and it does not kill the most dangerous types.
New research has resulted in the ability to boost the natural bacterial killing qualities of vinegar by adding antimicrobial nanoparticles made from carbon and cobalt. The findings have been published in the journal ACS Nano.
The acidic environment from the vinegar made bacterial cells swell and take up the nanoparticle treatment.
Once exposed, the nanoparticles appear to attack dangerous bacteria from both inside the bacterial cell and on its surface, causing them to burst. Importantly, this approach is nontoxic to human cells and was shown to clear bacterial infections from wounds in mice without affecting healing.
Adam Truskewycz et al, Cobalt-Doped Carbon Quantum Dots Work Synergistically with Weak Acetic Acid to Eliminate Antimicrobial-Resistant Bacterial Infections, ACS Nano (2025). DOI: 10.1021/acsnano.5c03108
Over 60,000 died in Europe alone from heat during 2024 summer: Study
More than 60,000 people died from heat in Europe during last year's record-breaking summer, a benchmark study said this week, in the latest warning of the massive toll climate change is having on the continent.
With Europe heating up twice as fast as the global average, the Spain-based researchers suggested an emergency alert system could help warn vulnerable people—particularly the elderly—ahead of dangerous heat waves.
Europe experienced an exceptionally deadly summer in 2024 with more than 60,000 heat-related deaths, bringing the total burden over the past three summers to more than 181,000, said the study in the journal Nature Medicine.
Our actions are dictated by 'autopilot,' not choice, finds study
Habit, not conscious choice, drives most of our actions, according to new research.
The research, published in Psychology & Health, found that two-thirds of our daily behaviors are initiated "on autopilot", out of habit.
Habits are actions that we are automatically prompted to do when we encounter everyday settings, due to associations that we have learned between those settings and our usual responses to them.
The research also found that 46% of behaviors were both triggered by habit and aligned with conscious intentions, suggesting that people form habits that support their personal goals, and often disrupt habits that conflict with them.
The study found that 65% of daily behaviors were habitually initiated, meaning people were prompted to do them out of routine rather than making a conscious decision.
For people who want to break their bad habits, simply telling them to 'try harder' isn't enough. To create lasting change, we must incorporate strategies to help people recognize and disrupt their unwanted habits, and ideally form positive new ones in their place.
The researchers recommend that initiatives designed to help people adopt new behaviors, like exercising or eating healthier, should focus on building new, positive habits.
Amanda L. Rebar et al, How habitual is everyday life? An ecological momentary assessment study, Psychology & Health (2025). DOI: 10.1080/08870446.2025.2561149
The brain splits up vision without you even noticing
The brain divides vision between its two hemispheres—what's on your left is processed by your right hemisphere and vice versa—but your experience with every bike or bird that you see zipping by is seamless. A new study by neuroscientists reveals how the brain handles the transition.
It's surprising to some people to hear that there's some independence between the hemispheres, because that doesn't really correspond to how we perceive reality. In our consciousness, everything seems to be unified.
There are advantages to separately processing vision on either side of the brain, including the ability to keep track of more things at once, researchers have found, but neuroscientists have been eager to fully understand how perception ultimately appears so unified in the end.
Researchers measured neural activity in the brains of animals as they tracked objects crossing their field of view. The results reveal that different frequencies of brain waves encoded and then transferred information from one hemisphere to the other in advance of the crossing and then held on to the object representation in both hemispheres until after the crossing was complete.
The process is analogous to how relay racers hand off a baton, how a child swings from one monkey bar to the next, and how cell phone towers hand off a call from one to the next as a train passenger travels through their area. In all cases, both towers or hands actively hold what's being transferred until the handoff is confirmed.
To conduct the study, published in the Journal of Neuroscience, the researchers measured both the electrical spiking of individual neurons and the various frequencies of brain waves that emerge from the coordinated activity of many neurons. They studied the dorsolateral and ventrolateral prefrontal cortex in both hemispheres, brain areas associated with executive functions.
The power fluctuations of the wave frequencies in each hemisphere told the researchers a clear story about how the subjects' brains transferred information from the "sending" to the "receiving" hemisphere whenever a target object crossed the middle of their field of view. In the experiments, the target was accompanied by a distractor object on the opposite side of the screen to confirm that the subjects were consciously paying attention to the target object's motion and not just indiscriminately glancing at whatever happened to pop up on the screen.
The highest frequency "gamma" waves, which encode sensory information, peaked in both hemispheres when the subjects first looked at the screen and again when the two objects appeared. When a color change signaled which object was the target to track, the gamma increase was only evident in the "sending" hemisphere (on the opposite side as the target object), as expected. Meanwhile, the power of somewhat lower frequency "beta" waves, which regulate when gamma waves are active, varied inversely with the gamma waves. These sensory encoding dynamics were stronger in the ventrolateral locations compared to the dorsolateral ones.
Meanwhile, two distinct bands of lower frequency waves showed greater power in the dorsolateral locations at key moments related to achieving the handoff. About a quarter of a second before a target object crossed the middle of the field of view, "alpha" waves ramped up in both hemispheres and then peaked just after the object crossed. Meanwhile, "theta" band waves peaked after the crossing was complete, only in the "receiving" hemisphere (opposite from the target's new position).
Accompanying the pattern of wave peaks, neuron spiking data showed how the brain's representation of the target's location traveled. Using decoder software, which interprets what information the spikes represent, the researchers could see the target representation emerge in the sending hemisphere's ventrolateral location when it was first cued by the color change. Then they could see that as the target neared the middle of the field of view, the receiving hemisphere joined the sending hemisphere in representing the object, so that they both encoded the information during the transfer.
Taken together, the results showed that after the sending hemisphere initially encoded the target with a ventrolateral interplay of beta and gamma waves, a dorsolateral ramp up of alpha waves caused the receiving hemisphere to anticipate the handoff by mirroring the sending hemisphere's encoding of the target information. Alpha peaked just after the target crossed the middle of the field of view, and when the handoff was complete, theta peaked in the receiving hemisphere as if to say, "I got it."
And in trials where the target never crossed the middle of the field of view, these handoff dynamics were not apparent in the measurements.
The study shows that the brain is not simply tracking objects in one hemisphere and then just picking them up anew when they enter the field of view of the other hemisphere.
"These results suggest there are active mechanisms that transfer information between cerebral hemispheres," the authors wrote. "The brain seems to anticipate the transfer and acknowledge its completion."
But they also note based on other studies that the system of interhemispheric coordination can sometimes appear to break down in certain neurological conditions including schizophrenia, autism, depression, dyslexia and multiple sclerosis. The new study may lend insight into the specific dynamics needed for it to succeed.
Matthew B. Broschard et al, Evidence for an active handoff between hemispheres during target tracking, The Journal of Neuroscience (2025). DOI: 10.1523/jneurosci.0841-25.2025
UK Biobank analysis finds higher dementia incidence with frailty
Researchers report evidence that physical frailty is associated with dementia and that genetic background, brain structure, and immunometabolic function may mediate this link.
Dementia causes loss of cognitive abilities and daily functioning, and reached an estimated 57 million cases worldwide in 2021 with projections indicating a rise to 153 million by 2050. Treatments remain limited in effectiveness, creating a need for early identification of risk factors.
Previous studies have noted associations between frailty and elevated risk of incident dementia. Frailty is characterized by decreased physical function and a reduced ability to overcome stressors. Genetic susceptibility might influence the frailty-dementia relationship, and immunometabolic processes and brain structure are implicated, though mechanisms remain unclear.
In the study, "Association of Frailty With Dementia and the Mediating Role of Brain Structure and Immunometabolic Signatures," published in Neurology, researchers conducted a prospective cohort investigation to elucidate the link between physical frailty and dementia, assess causality, and explore biologic mechanisms.
UK Biobank contributed data from 489,573 participants without dementia at enrollment between 2006 and 2010, with a median follow-up of 13.58 years during which 8,900 dementia cases were documented. Brain MRI was available for a subset.
Physical frailty was defined by five criteria: weight loss, exhaustion, physical inactivity, slow walking speed, and low grip strength. Components were summed to a 0–5 score and categorized as nonfrail, prefrail, or frail. Biomarker panels included peripheral blood cell counts, biochemical measures, and metabolomic markers with correction for multiple testing.
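As an illustration of how such a component count is typically turned into a category, the snippet below assumes the conventional cut-offs of 0 = nonfrail, 1–2 = prefrail and 3–5 = frail; the paper's exact thresholds are not reproduced here.

```python
def frailty_category(criteria_met: int) -> str:
    """Categorize a 0-5 physical frailty score (assumed conventional cut-offs)."""
    if not 0 <= criteria_met <= 5:
        raise ValueError("score must be between 0 and 5")
    if criteria_met == 0:
        return "nonfrail"
    if criteria_met <= 2:
        return "prefrail"
    return "frail"

# The five components: weight loss, exhaustion, physical inactivity,
# slow walking speed, and low grip strength.
print(frailty_category(0), frailty_category(2), frailty_category(4))
# -> nonfrail prefrail frail
```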
Risk climbed with frailty status, with prefrailty associated with higher hazard versus nonfrailty (HR 1.50, 95% CI 1.44–1.57) and frailty associated with a larger increase (HR 2.82, 95% CI 2.61–3.04). Joint analyses placed the greatest hazards where frailty met genetic susceptibility, including HR 3.87 (95% CI 3.30–4.55) for high polygenic risk with frailty and HR 8.45 (95% CI 7.51–9.51) for APOE-e4 gene carriers with frailty.
Neuroimaging and biomarker findings linked frailty severity with image-derived brain measures and immunometabolic markers. Mendelian randomization supported a potential causal effect of physical frailty on dementia (OR 1.79, 95% CI 1.03–3.12), with reverse analysis reporting a null association (OR 1.00, 95% CI 0.98–1.01).
Authors conclude that the findings support a causal association between physical frailty and dementia, suggesting frailty may serve as an early marker of vulnerability.
Xiangying Suo et al, Association of Frailty With Dementia and the Mediating Role of Brain Structure and Immunometabolic Signatures, Neurology (2025). DOI: 10.1212/wnl.0000000000214199
The hunted, not the hunters: AI reveals early humans were prey for leopards
A new study may be about to rewrite a part of our early human history. It has long been thought that Homo habilis, often considered the first true human species, was the one to turn the tables on the predator–prey relationship. However, a recent analysis of previous archaeological finds suggests that they were possibly more hunted than hunters and not the dominant species we once believed them to be.
To investigate this, researchers used artificial intelligence and computer vision to analyze tiny tooth marks on two H. habilis fossils. These ancient remains come from Olduvai Gorge in Tanzania and date back almost 2 million years.
The researchers trained the AI models on a library of 1,496 images of tooth marks made by modern carnivores, including leopards, lions, crocodiles, wolves and hyenas. Once the models were trained, the researchers presented them with photos of the fossil tooth marks.
As the scientists detail in their paper, published in the Annals of the New York Academy of Sciences, the AI compared these marks to what it had learned and concluded, with more than 90% probability, that leopards made the tooth marks. A key reason was that the triangular shape of the tooth pits on the bones matched those in the leopard reference samples.
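The authors' pipeline combines computer vision and deep learning; purely to illustrate the general approach (train a classifier on labeled images of modern carnivore tooth marks, then output class probabilities for the fossil marks), a minimal transfer-learning sketch might look like the following. All file names and parameters are hypothetical, and this is not the study's code.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

CLASSES = ["leopard", "lion", "crocodile", "wolf", "hyena"]  # reference carnivores

# Reuse a pretrained CNN and replace its classification head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
# ...fine-tune on the labeled reference images of tooth marks (training loop omitted)...
model.eval()

preprocess = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

def classify_mark(image_path: str) -> dict:
    """Return class probabilities for a single tooth-mark photograph."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(x), dim=1).squeeze(0)
    return dict(zip(CLASSES, probs.tolist()))

# e.g. classify_mark("fossil_mark_01.png") might assign >0.9 probability to "leopard".
```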
"The implications of this are major, since it shows that H. habilis was still more of a prey than a predator," wrote the researchers in their paper. "It also shows that the trophic position of some of the earliest representatives of the genus Homo was not different from those of other australopithecines."
Although the research was limited to just two individuals, the scientists contend that if H. habilis had become a powerful species that could compete with carnivores, their bones would more likely have been scavenged by bone-crushing animals, such as hyenas, after they died from other causes.
The fact that the bites were from a flesh-eating predator means the leopards were actively hunting them. This suggests that the transition to a dominant position in the food chain came later in human evolution, according to the research team.
Marina Vegara‐Riquelme et al, Early humans and the balance of power: Homo habilis as prey, Annals of the New York Academy of Sciences (2025). DOI: 10.1111/nyas.15321
Popular keto diet linked to glucose intolerance and fatty liver in mice
Avocado toast with fried cheese as the bread and zucchini noodles in butter-bacon sauce are among the many recipe ideas fueling social media's beloved high-fat, low-carbohydrate ketogenic, or keto diet. However, scientists have found that while keto can lead to limited weight gain and even weight loss, it does so at the cost of metabolic issues like glucose intolerance.
Scientists divided mice into four dietary groups: a ketogenic diet (KD, 90% fat), a high-fat diet (HFD, 60% fat), a low-fat diet (LFD), and a low-fat, moderate-protein diet (LFMP), with varying levels of carbohydrates and protein.
The mice were allowed to eat freely for up to 36 weeks in males and 44 weeks in females. Test results showed that while the KD supported weight control, it also raised blood cholesterol levels and led to fatty liver in males.
The diet is named "ketogenic" because it sends the body into ketosis—a metabolic state where the body burns fat as the primary fuel instead of the usual carbohydrates and, as a result, produces molecules called ketone bodies. The diet isn't a new food trend; it has been around for nearly 100 years and is well established for treating drug-resistant epilepsy in children, reducing seizures in many cases. The exact way a KD helps control seizures is still unclear, but several ideas have been proposed. Some studies have suggested that KD can stabilize blood glucose (BG), which is beneficial to brain metabolism and neurotransmitter activity, while others highlight the anticonvulsant effects of ketone bodies themselves.
For this study, the researchers included both male and female mice and carried out regular check-ins to assess the long-term effects of KD on health parameters, including body composition, organ health, and blood profile.
They found that KD protected against excessive weight gain compared with the conventional high-fat diet, but KD-fed mice still gained more weight than those on the low-fat diets. Long-term KD also caused severe hyperlipidemia, meaning there were very high levels of fat in the blood.
KD-fed mice developed severe glucose intolerance because the diet caused a severe impairment in insulin secretion from the pancreas. While female mice on KD seemed fine, the male ones showed signs of liver dysfunction and fatty liver.
The findings made it quite evident that long-term KD can trigger several metabolic disturbances, raising caution against its widespread use as a health-promoting diet.
Molly R. Gallop et al, A long-term ketogenic diet causes hyperlipidemia, liver dysfunction, and glucose intolerance from impaired insulin secretion in mice, Science Advances (2025). DOI: 10.1126/sciadv.adx2752
Scientists show how to grow more nutritious rice that uses less fertilizer
The cultivation of rice—the staple grain for more than 3.5 billion people around the world—comes with extremely high environmental, climate and economic costs.
This may be about to change, thanks to new research.
Scientists have shown that nanoscale applications of the element selenium can decrease the amount of fertilizer necessary for rice cultivation while sustaining yields, boosting nutrition, enhancing the soil's microbial diversity and cutting greenhouse gas emissions.
In a new paper published in the Proceedings of the National Academy of Sciences, they demonstrate for the first time that such nanoscale applications work in real-world conditions.
They used an aerial drone to lightly spray rice growing in a paddy with a suspension of nanoscale selenium. This direct contact makes the rice plant far more efficient at absorbing the selenium than it would be if the selenium were applied to the soil.
Selenium stimulates the plant's photosynthesis, which increased by more than 40%. Increased photosynthesis means the plant absorbs more CO2, which it then turns into carbohydrates. Those carbohydrates flow down into the plant's roots, which causes them to grow.
Bigger, healthier roots release a host of organic compounds that cultivate beneficial microbes in the soil, and it's these microbes that then work symbiotically with the rice roots to pull more nitrogen and ammonium out of the soil and into the plant, increasing its nitrogen use efficiency (NUE) from 30% to 48.3% and decreasing the amount of nitrous oxide and ammonia released to the atmosphere by 18.8–45.6%.
With more nutrients coming in, the rice itself produces a higher yield, with a more nutritious grain: levels of protein, certain critical amino acids, and selenium also jumped.
On top of all of this, they found that their nano-selenium applications allowed farmers to reduce their nitrogen applications by 30%. Since rice cultivation accounts for 15–20% of the global nitrogen use, this new technique holds real promise for helping to meet the triple threat of growing population, climate change, and the rising economic and environmental costs of agriculture.
Wang, Zhenyu et al, Nanotechnology-driven coordination of shoot–root systems enhances rice nitrogen use efficiency, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2508456122
The Ganges River is drying at an unprecedented rate, new study finds
The Ganges River is in crisis. This lifeline for around 600 million people in India and neighboring countries is experiencing its worst drying period in 1,300 years. Using a combination of historical data, paleoclimate records and hydrological models, researchers discovered that human activity is the main cause. They also found that the current drying is more severe than any recorded drought in the river's history.
In their study, published in the Proceedings of the National Academy of Sciences, researchers first reconstructed the river's flow for the last 1,300 years (700 to 2012 C.E.) by analyzing tree rings from the Monsoon Asia Drought Atlas (MADA) dataset. Then they used powerful computer programs to combine this tree-ring data with modern records to create a timeline of the river's flow. To ensure its accuracy, they double-checked it against documented historical droughts and famines.
The scientists found that the recent drying of the Ganges River from 1991 to 2020 is 76% worse than the previous worst recorded drought, which occurred during the 16th century. Not only is the river drier overall, but droughts are now more frequent and last longer. The main reason, according to the researchers, is human activity. While some natural climate patterns are at play, the primary driver is the weakening of the summer monsoon.
This weakening is linked to human-driven factors such as the warming of the Indian Ocean and air pollution from anthropogenic aerosols. These are liquid droplets and fine solid particles that come from factories, vehicles and power plants, among other sources and can suppress rainfall. The scientists also found that most climate models failed to spot the severe drying trend.
How to avoid this?
The researchers suggest two main courses of action. Given the mismatch between climate models and what they actually found, they are calling for better modeling to account for the regional impacts of human activity.
And because the Ganges is a vital source of water for drinking, agricultural production, industrial use and wildlife, the team also recommends implementing new adaptive water management strategies to mitigate potential water scarcity.
Dipesh Singh Chuphal et al, Recent drying of the Ganga River is unprecedented in the last 1,300 years, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424613122
Study finds microplastics in all beverages tested, raising exposure estimates
Microplastics have found their way deep inside our bones, brains, and even babies. A UK study found that 100% of all 155 hot and cold beverage samples tested contained synthetic plastic particles.
Microplastics (MPs) are tiny pieces of plastic ranging in size from 1 μm to 5 mm, and they are widespread across aquatic, terrestrial and even airborne environments. They have become a growing health concern across species because of their ability to accumulate and carry toxic chemicals through food webs.
Humans come into contact with MPs every day through food, water, consumer goods, and even air.
The researchers tested different products from popular UK brands, including coffee, tea, juices, energy drinks, soft drinks, and even tap and bottled water, and not a single beverage was free of MPs. Surprisingly, the more expensive tea bag brand showed a higher concentration of MPs than the cheaper ones.
Traces of plastics, including polypropylene, polystyrene, polyethylene terephthalate, and polyethylene—commonly used for food packaging and disposable containers—were found in the fluids. The daily average exposure through beverages was found to be 1.65 MPs/kg body weight per day.
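To put that figure in concrete terms, a short back-of-the-envelope calculation (body weights chosen purely for illustration):

```python
# Reported daily exposure via beverages: 1.65 microplastic particles per kg body weight.
EXPOSURE_PER_KG_PER_DAY = 1.65

for body_weight_kg in (10, 40, 70):  # illustrative body weights
    per_day = EXPOSURE_PER_KG_PER_DAY * body_weight_kg
    print(f"{body_weight_kg} kg: ~{per_day:.0f} particles/day, ~{per_day * 365:,.0f} particles/year")
# A 70 kg adult would ingest roughly 115 particles a day, about 42,000 a year, from beverages alone.
```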
The findings are published in Science of the Total Environment.
The same may well be true in other countries around the world.
Muneera Al-Mansoori et al, Synthetic microplastics in hot and cold beverages from the UK market: Comprehensive assessment of human exposure via total beverage intake, Science of The Total Environment (2025). DOI: 10.1016/j.scitotenv.2025.180188
Molecular discovery reveals how chromosomes are passed from one generation to the next
When a woman becomes pregnant, the outcome of that pregnancy depends on many things—including a crucial event that happened while she was still growing inside her own mother's womb. It depends on the quality of the egg cells that were already forming inside her fetal ovaries. The DNA-containing chromosomes in those cells must be cut, spliced and sorted perfectly. In males, the same process produces sperm in the testes but occurs only after puberty.
If that goes wrong, then you end up with the wrong number of chromosomes in the eggs or sperm. This can result in infertility, miscarriage or the birth of children with genetic diseases.
In a paper published Sept. 24 in the journal Nature, researchers report a major new discovery about a process that helps safeguard against these mistakes. They have pieced together the choreography of proteins that connect matching chromosome pairs—ensuring that they are sorted correctly as egg and sperm cells develop and divide.
These discoveries required methods to watch the molecular events of chromosome recombination unfold with unprecedented detail. This involved genetic engineering in budding yeast—a model organism that has been used for decades to discover how fundamental cellular processes work.
Humans have 46 chromosomes in each of our cells, made up of 23 pairs of matching, "homologous" chromosomes, with one of each pair inherited from each parent. Early in the process of making sperm or eggs, those chromosome pairs line up, and the parental chromosomes break and rejoin to each other. These chromosome exchanges, called "crossovers," serve two important functions.
First, they help ensure that each chromosome that is passed on to the offspring contains a unique mixture of genes from both parents. Crossovers also keep the chromosomes connected in matching pairs. These connections guide the distribution of chromosomes when cells divide to produce eggs and sperm. Maintaining crossover connections is especially crucial in females.
As chromosomes pair up in developing egg or sperm cells, matching DNA strands are exchanged and twined together over a short distance to form a structure called a "double Holliday junction." DNA strands of this structure are then cut to join the chromosomes forming a crossover.
In males, developing immature sperm cells then immediately divide and distribute chromosomes to the sperm. In contrast, egg cells developing in the fetal ovary arrest their development after crossovers have formed. The immature egg cells can remain in suspended animation for decades after birth, until they are activated to undergo ovulation.
Only then does the process lurch back into motion: The egg cell finally divides, and the chromosome pairs that were connected by crossovers are finally separated to deliver a single set of chromosomes to the mature egg. Maintaining the crossover connections over many years is a major challenge for immature egg cells. If chromosome pairs aren't connected by at least one crossover, they can lose contact with each other, like two people separated in a jostling crowd. This causes them to segregate incorrectly when the cell finally divides, producing egg cells with extra or missing chromosomes. This can cause infertility, miscarriage or genetic conditions such as Down syndrome, in which a child is born with an extra copy of chromosome 21, leading to cognitive impairment, heart defects, hearing loss and other problems.

Researchers have identified dozens of proteins that bind and process these junctions. They used a technique called "real-time genetics" to investigate the function of those proteins. With this method, they made cells degrade one or more specific proteins within the junction-associated structures. They could then analyze the DNA from these cells to see whether the junctions were resolved and whether they formed crossovers. In this way, they built up a picture in which a network of proteins functions together to ensure that crossovers are formed. They identified key proteins, such as cohesin, that prevent an enzyme called the STR complex (the Bloom complex in humans) from inappropriately dismantling the junctions before they can form crossovers.
These proteins protect the double Holliday junction, and that protection is the key discovery: failure to protect double Holliday junctions may be linked to fertility problems in humans.
Shangming Tang et al, Protecting double Holliday junctions ensures crossing over during meiosis, Nature (2025). DOI: 10.1038/s41586-025-09555-1
AI-generated voices now indistinguishable from real human voices
Many people still think of AI-generated speech as sounding "fake" or unconvincing and easily told apart from human voices. But new research shows that AI voice technology has now reached a stage where it can create "voice clones" or deepfakes which sound just as realistic as human recordings.
The study compared real human voices with two different types of synthetic voices, generated using state-of-the-art AI voice synthesis tools. Some were "cloned" from voice recordings of real humans, intended to mimic them, and others were generated from a large voice model and did not have a specific human counterpart.
Participants were asked to evaluate which voices sounded most realistic, and which sounded most dominant or trustworthy. Researchers also looked at whether AI-generated voices had become "hyperreal," given that some studies have shown that AI-generated images of faces are now judged to be human more often than images of real human faces.
While the study did not find a "hyperrealism effect" from the AI voices, it did find that voice clones can sound as real as human voices, making it difficult for listeners to distinguish between them. Both types of AI-generated voices were evaluated as more dominant than human voices, and some were also perceived as more trustworthy.
A new look at how the brain works reveals that wiring isn't everything
How a brain's anatomical structure relates to its function is one of the most important questions in neuroscience. It explores how physical components, such as neurons and their connections, give rise to complex behaviors and thoughts. A recent study of the brain of the tiny worm C. elegans provides a surprising answer: Structure alone doesn't explain how the brain works.
C. elegans is often used in neuroscience research because, unlike the incredibly complex human brain, which has billions of connections, the worm has a very simple nervous system with only 302 neurons. A complete, detailed map of every one of its connections, known as the brain wiring diagram or connectome, was completed several years ago, making the worm ideal for study.
In this research, scientists compared the worm's physical wiring in the brain to its signaling network, how the signals travel from one neuron to another. First, they used an electron microscope to get a detailed map of the physical connections between its nerve cells. Then, they activated individual neurons with light to create a signaling network and used a technique called calcium imaging to observe which other neurons responded to this stimulation.
Finally, they used computer programs to compare the physical wiring map and the signal flow map, identifying any differences and areas of overlap.
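Conceptually, the final comparison step amounts to representing both networks as adjacency matrices over the same 302 neurons and quantifying where they agree and where they diverge. The sketch below uses random toy matrices as placeholders and is not the authors' analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 302  # neurons in the C. elegans nervous system

# Toy placeholder networks: 1 = connection present, 0 = absent.
anatomical = (rng.random((n, n)) < 0.05).astype(int)  # physical wiring (connectome)
functional = (rng.random((n, n)) < 0.05).astype(int)  # signal-propagation map
np.fill_diagonal(anatomical, 0)
np.fill_diagonal(functional, 0)

shared = np.logical_and(anatomical, functional).sum()
union = np.logical_or(anatomical, functional).sum()
print(f"Jaccard overlap between wiring and signaling: {shared / union:.2f}")

# Edges present in only one network highlight where function departs from structure.
functional_only = np.logical_and(functional, anatomical == 0).sum()
anatomical_only = np.logical_and(anatomical, functional == 0).sum()
print(f"functional-only edges: {functional_only}, anatomical-only edges: {anatomical_only}")
```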
The team discovered that the brain's functional organization differs from its anatomical structure. An analogy is that the brain's structure is like a city map showing every street. However, the function is more akin to traffic flow, with jams, detours and shortcuts that are not visible on the map. In other words, brain activity does not always follow the predictable pathways of its physical wiring.
Sophie Dvali et al, Diverging Network Architecture of the C. elegans Connectome and Signaling Network, PRX Life (2025). DOI: 10.1103/6wgv-b9m6
Lab-grown kidneys yield urine

Researchers have created the most sophisticated kidney organoid to date, offering a real shot at growing transplantable kidneys from stem cells. These mini kidneys were capable of making urine when transplanted into mice. "You wouldn't mistake it for a real kidney," says experimental anatomist Jamie Davies. "But it is trying to do the right things." Plumbing — encouraging the organoid to develop blood vessels and the duct that carries urine to the bladder — is the major hold-up. The researchers estimate a transplantable kidney will be ready for animal testing in less than five years.
Scientists discover that cell nucleus is actually less dense than surrounding cytoplasm
Just as life pulsates in big vibrant cities, it also prospers in crowded environments inside cells. The interior of cells is densely packed with biomolecules like proteins and nucleic acids. How is all this material distributed within a cell and what regulates its distribution?
In a study published in Nature Communications, researchers measure subcellular densities across a wide range of organisms. Their aim is to better understand biomolecular processes ranging from yeast cells to human cells.
Conventional scientific textbooks describe the cell nucleus as a compartment packed with an impressive amount of DNA wrapped around histone proteins.
Now, an international team of researchers has discovered that—contrary to expectations—the nucleus is less dense than the surrounding cytoplasm.
Despite their rich biomolecular composition, nuclei contain less dry mass than the same volume of the surrounding cytoplasm.
How can density be measured in microscopic objects such as individual cell compartments? Scientists use light for this purpose. Not only does light allow cells to be examined, it also enables them to be manipulated. Light can exert forces, enabling laser beams to "pull" on cells and measure their mechanical properties using an "optical stretcher."
The researchers developed an optical setup which allowed them to obtain three-dimensional density distributions inside cells at high resolution by combining optical diffraction tomography and confocal fluorescence microscopy.
While nucleocytoplasmic (NC) density ratios are maintained from yeast to human cells, deviations appear in disease. During stressed cellular states such as aging, the so-called senescence, cell nuclei become denser than the cytoplasm. Thus, the study points to the fundamental importance of density as a variable that determines healthy cellular processes, the researchers note in their paper.
Abin Biswas et al, Conserved nucleocytoplasmic density homeostasis drives cellular organization across eukaryotes, Nature Communications (2025). DOI: 10.1038/s41467-025-62605-0
Camouflage or caution? How anti-predator strategies have evolved
Predators and the environment determine why some animals use camouflage to avoid being eaten, while others use bright colors to warn them off, new research reveals. Published recently in the journal Science, the findings help explain the evolution and global distribution of the most common color strategies used by insects to avoid predators.
The global study took place across six continents and involved over 50 scientific collaborators.
Using the same experiment, researchers deployed more than 15,000 artificial prey with three different colors to investigate which strategy works best to deter predators: a classic warning pattern of orange and black, a dull brown that blends in, and an unusual bright blue and black.
The researchers found that the answer to why some animals use camouflage rather than warning colors to deter predators is more complex than expected.
The findings showed there is no single best color strategy to deter predators, but that context is critical. The different characteristics of the predator and prey communities, as well as habitat in that part of the globe, heavily decide which strategy performs better in each place. This makes sense when we see animals employing so many varying camouflage and warning color strategies as defense systems all over the world.
Predators had the biggest influence on which color strategy was most successful for prey, the study revealed.
In environments where predators are competing intensely for food, they are more likely to risk attacking prey that might be dangerous or distasteful. Hence, the researchers saw that camouflage worked best in areas with lots of predation.
In contrast, in places where cryptic prey (insects that use camouflage) are abundant, hiding becomes less effective, as predators become better at searching for those types of animals.
The findings help scientists understand why some species, such as the cryptic bogong moth or the brightly colored harlequin bug, have evolved their strategies against predators.
The rise in tree mortality is troubling for local forest ecosystems. As a global phenomenon, however, it has a significant social impact that remains poorly understood.
We don't currently know whether climate change will lead to the death of 10% or 50% of all trees worldwide.
An international group of more than 100 forest researchers is reviewing almost 500,000 forest monitoring studies from 89 countries and five continents. The researchers found that the main cause of tree mortality is anthropogenic (human-induced) climate change and its consequences: heat, dry air and soil, forest fires, storms, and increased insect damage and plant diseases.
In the article published in New Phytologist, the researchers aimed to identify methods, requirements and data gaps in monitoring tree mortality trends.
Towards a global understanding of tree mortality, New Phytologist (2025). DOI: 10.1111/nph.20407
Squirrels bite when they feel threatened, cornered, or are aggressively seeking food. They can also bite inadvertently by mistaking a finger for a treat or startling when a hand is presented to them. While usually a defensive action or a form of play, squirrels lack the bite inhibition of domesticated animals, and their bites, though not typically malicious, can be deep and pose a risk of infection.
Reasons for biting:
Self-defense: Like any wild animal, squirrels will bite to protect themselves if they feel endangered.
Aggression for food: Squirrels may become aggressive if they are accustomed to being fed by humans and approach to get a meal, according to Critter Control.
Accidental bites: Squirrels don't have the same depth perception as humans and can mistakenly bite a finger when trying to take a treat.
Nesting: A mother squirrel in a nesting area, such as an attic, may bite if she feels cornered or threatened.
Play behaviour: Squirrels also "play bite" to practice skills they will use as adults, similar to how siblings interact.

Risks of a squirrel bite:
Infection: Because squirrels are wild rodents, a bite can lead to an infection.
Diseases: Though rabies is rare in squirrels, they can carry other diseases, such as the plague, which is transmitted by fleas.

What to do if a squirrel bites:
Wash the wound: Immediately wash the wound thoroughly with soap and water to reduce the risk of infection.
Seek medical attention: Consult a healthcare professional to determine if a tetanus shot is needed and to monitor for any signs of infection.
Mucus contains molecules that block Salmonella infection, study reveals
Mucus is more than just a sticky substance: It contains a wealth of powerful molecules called mucins that help to tame microbes and prevent infection. In a new study, researchers have identified mucins that defend against Salmonella and other bacteria that cause diarrhea.
The researchers now hope to mimic this defense system to create synthetic mucins that could help prevent or treat illness in soldiers or other people at risk of exposure to Salmonella. It could also help prevent "traveler's diarrhea," a gastrointestinal infection caused by consuming contaminated food or water.
Mucins are bottlebrush-shaped polymers made of complex sugar molecules known as glycans, which are tethered to a peptide backbone. In this study, the researchers discovered that a mucin called MUC2 turns off genes that Salmonella uses to enter and infect host cells.
Mucus lines much of the body, providing a physical barrier to infection, but that's not all it does.
Researchers identified mucins that can help to disarm Vibrio cholerae, as well as Pseudomonas aeruginosa, which can infect the lungs and other organs, and the yeast Candida albicans.
The researchers found in the new study that when they exposed Salmonella to a mucin called MUC2, which is found in the intestines, the bacteria stopped producing the proteins encoded by SPI-1 (Salmonella pathogenicity island 1), the gene cluster the bacterium uses to invade host cells, and they were no longer able to infect cells.
Further studies revealed that MUC2 achieves this by turning off a regulatory bacterial protein known as HilD. When this protein is blocked by mucins, it can no longer activate the genes for the type III secretion system (T3SS) that drives invasion.
Using computational simulations, the researchers showed that certain monosaccharides found in glycans, including GlcNAc and GalNAc, can attach to a specific binding site of the HilD protein. However, their studies showed that these monosaccharides can't turn off HilD on their own—the shutoff only occurs when the glycans are tethered to the peptide backbone of the mucin.
The researchers also discovered that a similar mucin called MUC5AC, which is found in the stomach, can block HilD. And, both MUC2 and MUC5AC can turn off virulence genes in other foodborne pathogens that also use HilD as a gene regulator.
The researchers now plan to explore ways to use synthetic versions of these mucins to help boost the body's natural defenses and protect the GI tract from Salmonella and other infections.
Kelsey M. Wheeler et al, Mucus-derived glycans are inhibitory signals for Salmonella Typhimurium SPI-1-mediated invasion, Cell Reports (2025). DOI: 10.1016/j.celrep.2025.116304
Million-year-old skull could change human evolution timeline
A digital reconstruction of a million-year-old skull suggests humans may have diverged from our ancient ancestors 400,000 years earlier than thought and in Asia not Africa, a study said this week.
The findings are based on a reconstruction of a crushed skull discovered in China in 1990, and have the potential to resolve the longstanding "Muddle in the Middle" of human evolution, researchers said.
But experts not involved in the work cautioned that the findings were likely to be disputed, and pointed to ongoing uncertainties in the timeline of human evolution.
The skull, labeled Yunxian 2, was previously thought to belong to a human forerunner called Homo erectus.
But modern reconstruction technologies revealed features closer to species previously thought to have existed only later in human evolution, including the recently discovered Homo longi and our own Homo sapiens.
It suggests that by one million years ago, our ancestors had already split into distinct groups, pointing to a much earlier and more complex human evolutionary split than previously thought.
If the findings are correct, it suggests there could have been much earlier members of other early hominins, including Neanderthals and Homo sapiens, the study says.
It also "muddies the waters" on longstanding assumptions that early humansdispersed from Africa.
There's a big change potentially happening here, with East Asia now playing a key role in hominin evolution.
The research, published in the journal Science, used advanced CT scanning, structured light imaging and virtual reconstruction techniques to model a complete Yunxian 2 cranium.
The scientists relied in part on another similar skull to shape their model, and then compared it to over 100 other specimens.
The resulting model "shows a distinctive combination of traits," the study said, some of them similar to Homo erectus, including a projecting lower face.
But other aspects, including its apparently larger brain capacity, are closer to Homo longi and Homo sapiens, the researchers said.
"Yunxian 2 may help us resolve what's been called the 'Muddle in the Middle,' the confusing array of human fossils from between 1 million and 300,000 years ago.
The findings are only the latest in a string of recent research that has complicated what we thought we knew about our origins.
Xiaobo Feng et al, The phylogenetic position of the Yunxian cranium elucidates the origin of Homo longi and the Denisovans, Science (2025). DOI: 10.1126/science.ado9202
Origins of the 'Ostrich Effect': Researchers pinpoint the age we start avoiding information—even when it's helpful
In a world of information overload, it can feel soothing to stick your head in the sand.
According to psychologists, avoiding information when it's uncomfortable is a common adult behavior, often referred to as the "Ostrich Effect."
But how do we become an ostrich? Children are notorious for seeking out information, often in the form of endless questions. So when do we sprout feathers and decide that, actually, the number of calories in a slice of cake is none of our business?
This behavioral origin point was exactly what researchers wanted to pin down.
In a study published in Psychological Science, a research team discovered that as children aged, the tendency to avoid information grew stronger.
Though 5- and 6-year-olds still actively sought information, 7- to 10-year-olds were much more likely to strategically avoid learning something if it elicited a negative emotion.
Why is it that children are these super curious people, but then we somehow end up as these information avoiders as adults?
In their initial experiment, the researchers looked at five reasons why we might willfully choose to remain ignorant:
To avoid negative emotions like anxiety or disappointment
To avoid negative information about our own likability or competence
To avoid challenges to our beliefs
To protect our preferences
To act in our own self-interest (perhaps while trying to appear not self-interested)
Part 1
Researchers then adapted these into five scenarios for children to see if they could elicit information avoidance. For example, each child was asked to imagine their favorite and least favorite candy. They were then asked if they wanted to watch a video about why eating that candy was bad for their teeth.
They found that, whereas younger children really wanted to seek information, older children started to exhibit these avoidance tendencies. For example, they didn't want to know why their favorite candy was bad for them, but they were perfectly happy to learn why their least favorite candy was bad for them. This finding held for all motivations except competency: children of all ages were not afraid to learn whether they'd done badly on a test, for example.
To avoid avoidance, the researchers suggest thinking through why you might be avoiding something—possibly because you are prioritizing short-term comfort over long-term benefits. They also posit that it could help to reframe uncomfortable information as useful and valuable.
Research suggests that intervening while children are still young could keep them from falling into avoidance traps and have compounding benefits. Humans have a propensity to want to resolve uncertainty, but when the resolution is threatening, people may flip to avoidance instead. If all else fails, the researchers advise, mimic what children do best: follow your curiosity.
Radhika Santhanagopalan et al, Becoming an Ostrich: The Development of Information Avoidance, Psychological Science (2025). DOI: 10.1177/09567976251344551
Bacterial endotoxins are high-potency, low-mass drivers of PM₂.₅ toxicity, sampling study reveals
Endotoxin, a toxic component of the outer membrane of Gram-negative bacteria, makes up only 0.0001% of PM2.5 fine particle mass but packs a serious punch when it comes to its bioactivity.
According to the study, endotoxin drives 0.1–17% of the inflammatory responses triggered by these airborne particles, with its toxicity contribution being three to five orders of magnitude higher than its mass contribution.
Air pollution is now the world's leading environmental health threat, linked to more than three million premature deaths every year. One of the key culprits is PM2.5, which refers to airborne particles smaller than 2.5 micrometers, small enough to slip deep into the lungs and even seep into the bloodstream.
Scientists have long been focusing on PM2.5 because evidence consistently links it to respiratory illnesses, such as asthma, chronic obstructive pulmonary disease, and airway inflammation. Studies suggest that the damage caused by PM2.5 could be due to oxidative stress and the triggering of immune responses in the lungs following exposure.
PM2.5 is a complex atmospheric cocktail of natural and anthropogenic particles containing biological, inorganic, and organic constituents. For decades, researchers have extensively studied the impact of chemicals—including transition metals, polycyclic aromatic hydrocarbons, and industrial smoke—produced by human activities. These components, however, contribute to less than half of the respiratory damage inflicted by PM2.5, leaving roughly 60% of its impact still unexplained.
The researchers conducted daily 24-hour PM2.5 sampling for a year across an urban, coastal area of Hong Kong. To assess inflammatory responses, they exposed human bronchial epithelial cells to PM2.5 and measured the release of interleukin-8 (IL-8)—a small immune-signaling protein, or cytokine—as a marker of inflammation.
Endotoxin concentrations were measured using the Limulus Amebocyte Lysate (LAL) assay, and DNA sequencing and source tracking were then used to identify the Gram-negative bacteria the endotoxins came from. Finally, the researchers applied mixture-toxicity modeling to estimate how much these endotoxins contributed to the overall harmful effects of PM2.5 exposure.
They found that, despite making up only a minuscule fraction of the total PM2.5 mixture, endotoxin drove about 0.1 to 17% of the IL-8 release triggered by PM2.5.
Among all reported PM2.5 components, endotoxin demonstrated the highest toxicity-to-mass contribution ratio, 10,000:1 to 100,000:1, establishing its extreme biological potency. These findings show that less is indeed more.
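As a rough back-of-envelope illustration (not the study's mixture-toxicity model), the disproportion can be seen by dividing endotoxin's share of the biological response by its share of particle mass, using the rounded figures quoted above:

```python
# Illustrative back-of-envelope check, not the study's mixture-toxicity model:
# compare endotoxin's share of PM2.5 mass with its share of the IL-8 response.

mass_share = 0.0001          # percent of PM2.5 mass attributed to endotoxin (figure quoted above)
toxicity_share_low = 0.1     # percent of the IL-8 release, lower bound
toxicity_share_high = 17.0   # percent of the IL-8 release, upper bound

ratio_low = toxicity_share_low / mass_share    # = 1,000
ratio_high = toxicity_share_high / mass_share  # = 170,000

print(f"toxicity-to-mass ratio: {ratio_low:,.0f}:1 to {ratio_high:,.0f}:1")
# Spans roughly three to five orders of magnitude, consistent with the
# 10,000:1 to 100,000:1 central range reported by the authors.
```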
The researchers note that this study brings to light the importance of identifying highly toxic components present in low concentrations and tracing their sources. Pinpointing these toxicity drivers can help us design cost-effective strategies in which even modest reductions in PM2.5 mass could yield substantial decreases in overall toxicity.
Jinyan Yu et al, Disproportionately Higher Contribution of Endotoxin to PM2.5 Bioactivity than Its Mass Share Highlights the Need to Identify Low-Concentration, High-Potency Components, Environmental Science & Technology (2025). DOI: 10.1021/acs.est.5c07255
Mamba (Dendroaspis species) snake bites are a significant threat in sub-Saharan Africa, where snakebite causes an estimated 30,000 deaths annually.
A breakthrough study has discovered a hidden dangerous feature of the black mamba, one of the most venomous snakes in the world.
The study revealed the venoms of three species of mamba were far more neurologically complex than previously thought, explaining why antivenoms were sometimes ineffective. This research was published in Toxins.
The black mamba, western green mamba and Jameson's mamba aren't just using one form of chemical weapon; they're launching a coordinated attack at two different points in the nervous system.
A bite from three of the four mamba species causes flaccid, or limp, paralysis through postsynaptic neurotoxicity.
Current antivenoms can treat the flaccid paralysis, but this study found that the venoms of these three species can then attack another part of the nervous system, causing spastic paralysis through presynaptic toxicity.
Researchers previously thought the fourth species of mamba, the eastern green mamba, was the only one capable of causing spastic paralysis.
This finding resolves a long-standing clinical mystery of why some patients bitten by mambas seem to initially improve with antivenom and regain muscle tone and movement only to start having painful, uncontrolled spasms.
The venom first blocks nerve signals from reaching the muscles, but after the antivenom is administered, it then overstimulates the muscles.
It's like treating one disease and suddenly revealing another.
Researchers also found the venom function of the mambas was different depending on their geographic location, particularly within populations of the black mamba from Kenya and South Africa.
This further complicates treatment strategies across regions because the antivenoms are not developed to counteract the intricacies of the different venoms.
By identifying the limitations of current antivenoms and understanding the full range of venom activity, we can now directly inform evidence-based snakebite care.
Lee Jones et al, Neurotoxic Sleight of Fang: Differential Antivenom Efficacy Against Mamba (Dendroaspis spp.) Venom Spastic-Paralysis Presynaptic/Synaptic vs. Flaccid-Paralysis Postsynaptic Effects, Toxins (2025). DOI: 10.3390/toxins17100481
Scientists read mice's 'thoughts' from their faces
It's easy to read emotions on people's faces—each one has its clear, unmistakable signature. But what about thoughts? A study published in Nature Neuroscience shows that mice's problem-solving strategies can be deciphered from subtle facial movements.
According to the authors, this is a proof of concept that the contents of the mind can be read out from video recordings, potentially offering powerful new research and diagnostic tools.
The scientists found that they could get as much information about what a mouse was 'thinking' from its facial movements as they could from recording the activity of dozens of neurons.
Having such easy access to the hidden contents of the mind could provide an important boost to brain research. However, it also highlights a need to start thinking about regulations to protect our mental privacy.
Facial expressions in mice reveal latent cognitive variables and their neural correlates, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02071-5.
Stem cells from fat tissue may help prevent kidney dialysis access failure
To undergo kidney dialysis, doctors must first surgically create an access route—an arteriovenous fistula—usually in an arm, a conduit that will accommodate hemodialysis treatments. It is a routine outpatient procedure performed for years worldwide.
But it is a procedure beset by problems.
An arteriovenous fistula must first "mature," a process in which the newly established connection between an artery and a vein becomes large enough to support the turbulent flow of blood in hemodialysis. For many patients, this artificially created channel tends to narrow, leaving it useless as a conduit.
Researchers are investigating a possible way to prevent this problematic narrowing—a condition called stenosis—with a procedure that relies on stem cells.
The study, researchers asserted, is a crucial step toward improving a necessary treatment for patients with kidney failure by tapping into a population of cells that are essentially blank slates.
The investigation is reported in the journal Science Translational Medicine.
The phase 1 randomized trial involved patients undergoing an arteriovenous fistula (AVF) placement in an arm. Some of the patients in the small trial also received autologous adipose-derived mesenchymal stem cells. The cells were delivered at the time of the AVF procedure.
The stem cells were placed along the AVF, starting at the distal artery one centimeter upstream of the anastomosis [the surgical connection between adjacent blood vessels] and extending over the first four centimeters of the vein just distal to the anastomosis, by dripping them onto the adventitia of the vessels slowly over five minutes.
The adventitia is the outermost layer of a blood vessel.
Mesenchymal stem cells are a form of somatic, or adult stem cells, which can be found in a variety of tissues throughout the body, including adipose (fat) tissue, which is an abundant source.
The stem cells are aimed at improving AVF function by preventing vascular narrowing. The cells were also a site-specific treatment for another problem tied to arteriovenous fistulas: inflammation, a hallmark of AVFs. Fortunately, anti-inflammatory activity is a function of mesenchymal stem cells.
Side-by-side images in the study show the vascular opening to be wide and capable of handling the turbulence of hemodialysis among patients who received mesenchymal stem cells. Patients who did not receive the stem cell treatment suffered vascular narrowing.
The research team sees promise in their unique approach, which is producing positive results at a critical time.
The team's phase 1 clinical trial involved 21 patients who received arteriovenous fistulas in the arm; 11 of the 21 patients also received mesenchymal stem cells derived from their own fat tissue. After 42 months, fistulas had matured faster in patients who received stem cells. Additional study and approval by the U.S. Food and Drug Administration are required before the treatment can become available.
Sreenivasulu Kilari et al, Periadventitial delivery of mesenchymal stem cells improves vascular remodeling and maturation in arteriovenous fistulas, Science Translational Medicine (2025). DOI: 10.1126/scitranslmed.adp7723
Scientists finally prove that a quantum computer can unconditionally outperform classical computers
A quantum computer has demonstrated that it can solve a problem more efficiently than a conventional computer. This achievement comes from being able to unlock a vast memory resource that classical computing cannot match.
Instead of using classical bits that can only be 0 or 1, quantum machines use qubits, which can exist in multiple states and store exponentially more information than their traditional counterparts. However, proving that a quantum computer can access this memory advantage in the real world has been a challenge for two main reasons.
First, any successful demonstration has to be feasible on realistic quantum hardware, and second, there must be unconditional mathematical proof that no future classical algorithm could achieve the same performance.
In a study published on the arXiv preprint server, a research team reports how they achieved this feat of quantum supremacy. They constructed a complicated mathematical task designed to test this memory advantage. Their experiment was like a game between two parts of the quantum system referred to as Alice and Bob. Alice's task was to create a quantum state and send it in a message to Bob, who had to measure it to figure out what it was. The goal was to build a process so accurate that Bob could predict the state before Alice finished preparing the message.
The researchers optimized this process over 10,000 independent trials, and their analysis revealed that a classical computer would need at least 62 bits of memory to complete the task with the same success rate. The quantum device performed it using only 12 qubits.
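For a sense of scale, here is a minimal sketch that contrasts the two resources mentioned above; it assumes nothing about the actual communication task beyond the quoted figures of 12 qubits and a 62-bit classical lower bound:

```python
# Illustrative comparison only; this is not the communication task from the paper.
# It contrasts the two resources quoted above: 12 qubits versus the proven lower
# bound of 62 classical bits of memory.

n_qubits = 12
classical_bits_lower_bound = 62

# A register of n qubits is described by 2**n complex amplitudes.
quantum_state_dimension = 2 ** n_qubits
# A classical memory of b bits can occupy one of 2**b distinguishable states.
classical_memory_states = 2 ** classical_bits_lower_bound

print(f"{n_qubits} qubits span a {quantum_state_dimension:,}-dimensional state space")
print(f"{classical_bits_lower_bound} classical bits give {classical_memory_states:,} distinguishable memory states")
# The paper's result is that no classical algorithm using fewer than 62 bits of
# memory can match the 12-qubit device's success rate on their task.
```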
"The result provides the most direct evidence yet that currently existing quantum processors can generate and manipulate entangled states of sufficient complexity to access the exponentiality of Hilbert space (the vast memory resource of a quantum computer)," the researchers wrote in their paper.
"This form of quantum advantage—which we call quantum information supremacy—represents a new benchmark in quantum computing, one that does not rely on unproven conjectures."
William Kretschmer et al, Demonstrating an unconditional separation between quantum and classical information resources, arXiv (2025). DOI: 10.48550/arxiv.2509.07255
The Red Sea went completely dry before being flooded by the Indian Ocean over 6 million years ago
Scientists have provided conclusive evidence that the Red Sea completely dried out about 6.2 million years ago, before being suddenly refilled by a catastrophic flood from the Indian Ocean. The findings put a definitive time on a dramatic event that changed the Red Sea.
Using seismic imaging, microfossil evidence, and geochemical dating techniques, the researchers showed that a massive change happened in about 100,000 years—a blink of an eye for a major geological event. The Red Sea went from connecting with the Mediterranean Sea to an empty, salt-filled basin. Then, a massive flood burst through volcanic barriers to open the Bab el-Mandab strait and reconnect the Red Sea with the world's oceans.
The findings show that the Red Sea basin records one of the most extreme environmental events on Earth, when it dried out completely and was then suddenly reflooded about 6.2 million years ago.
The Red Sea was initially connected from the north to the Mediterranean through a shallow sill. This connection was severed, drying the Red Sea into a barren salt desert. In the south of the Red Sea, near the Hanish Islands, a volcanic ridge separated the sea from the Indian Ocean.
But around 6.2 million years ago, seawater from the Indian Ocean surged across this barrier in a catastrophic flood. The torrent carved a 320-kilometer-long submarine canyon that is still visible today on the seafloor. The flood rapidly refilled the basin, drowning the salt flats and restoring normal marine conditions in less than 100,000 years. This event happened nearly a million years before the Mediterranean was refilled by the famous Zanclean flood, giving the Red Sea a unique story of rebirth.
Tihana Pensa et al, Desiccation of the Red Sea basin at the start of the Messinian salinity crisis was followed by major erosion and reflooding from the Indian Ocean, Communications Earth & Environment (2025). DOI: 10.1038/s43247-025-02642-1
Forensic test recovers fingerprints from fired ammunition casings despite intense heat
A pioneering new test that can recover fingerprints from ammunition casings, a feat once thought nearly impossible, has been developed by scientists.
The researchers have developed a unique electrochemical method which can visualize fingerprints on brass casings, even after they have been exposed to the high temperature conditions experienced during gunfire. The study is published in the journal Forensic Chemistry.
For decades, investigators have struggled to recover fingerprints from weapons because any biological trace is usually destroyed by the high temperatures, friction and gas released after a gun is fired. As a result, criminals often abandon their weapons or casings at crime scenes, confident that they leave no fingerprint evidence behind.
Ordinarily, the intense heat of firing destroys any biological residue. The new technique, however, has been able to reveal fingerprint ridges that would otherwise remain imperceptible.
The team found they could coat brass casings with a thin layer of specialized materials to make hidden fingerprint ridges visible. Unlike existing methods that need dangerous chemicals or high-powered equipment, the new process uses readily available non-toxic polymers and minimal amounts of energy to quickly reveal prints from seemingly blank surfaces.
It works by placing the brass casing of interest in an electrochemical cell containing specific chemical substances. When a small voltage is applied, chemicals in the solution are attracted to the surface, coating the spaces between fingerprint ridges and creating a clear, high contrast image of the print. The fingerprint appears within seconds as if by magic!
Tests showed that this technique also worked on samples aged up to 16 months, demonstrating remarkable durability.
The research has significant implications for criminal investigations, where the current assumption is that firing a gun eliminates fingerprint residues on casings.
Colm McKeever et al, Electrodeposition of redox materials with potential for enhanced visualisation of latent finger-marks on brass substrates and ammunition casings, Forensic Chemistry (2025). DOI: 10.1016/j.forc.2025.100663
Population bottlenecks cause decline of mammals' immunity, researchers find
Population bottlenecks, caused by stark population losses due to illness or habitat destruction, led to a decline in mammals' disease immunity, according to a new study led by computational biologists.
The finding comes from the first comparative study of genomic sequences—roadmaps of DNA instructions responsible for encoding how the body works—encoding immunity in 46 mammals.
The study, published in Molecular Biology and Evolution, is the first step for scientists analyzing regions of mammalian DNA that were previously inaccessible without modern biotechnological and computational tools.
Genes influence how the body works: humans and animals are genetically predisposed to certain diseases based on their DNA. Although the same basic building blocks make up DNA across the 46 mammals assessed, their genomic sequences diverged widely. So even though species may share a similar set of genes, those genes differ because of variations in DNA architecture.
In the immune system, things are complicated further by something known as adaptive immunity. As opposed to the non-discriminatory defense the immune system deploys at the first hint of an infection, adaptive immunity refers to the parts of the immune system that study the specifics of a pathogen and design antibodies precisely targeted for it, should it invade again.
Antibodies are produced from highly variable "template" genes encoded in the genome, and this variability enables versatile immune responses through the generation of antibodies against diverse targets.
The question is, how did this adaptive immunity evolve?
To answer this question, the researchers analyzed five types of gene clusters that control various aspects of immune system production—specifically the building of antibodies and of the receptors on another immune cell type known as the T-cell—across 46 mammals to better understand how genetic variation could affect immune function.
Researchers scanned, aligned and compared publicly available DNA sequences of 46 mammals of 13 taxonomic orders, such as primates, rodents, bats, carnivores and marsupials, to draw conclusions about how their immune systems evolved.
Researchers found that a decline in adaptive immunity, and possible vulnerability to certain diseases among mammals as a result, was likely caused by genetic bottlenecks: a stark decrease in population over certain periods in history due to factors such as habitat loss or disease.
Bottlenecks occurred, for example, during medieval times, when humanity was devastated by diseases such as the Black Death, or when animals suffered widespread habitat loss due to forest fires.
Species with past population bottlenecks include felines, aquatic mammals, seals, some primates and ruminants, mammals that evolved a specialized stomach for digesting tough plant material.
Genetic bottlenecks result in a limited gene pool for these animals, the researchers explained, which may have led to a decline in adaptive immunity.
Mariia Pospelova et al, Comparative Analysis of Mammalian Adaptive Immune Loci Revealed Spectacular Divergence and Common Genetic Patterns, Molecular Biology and Evolution (2025). DOI: 10.1093/molbev/msaf152
Dads influence embryo growth via molecular signatures, research reveals
Over the past few decades, growing evidence has challenged the belief that inheritance is governed solely by DNA sequences. Scientists now recognize the crucial role of epigenetic inheritance—the transmission of biological traits via chemical modifications to DNA and its associated proteins. These modifications do not alter the genetic code itself but influence how genes are switched on or off, often in response to environmental factors such as stress, diet, or drug exposure.
While the concept of maternal epigenetic inheritance is relatively intuitive—given the direct biological connection between mother and embryo during gestation—recent research shows that fathers, too, can transmit environmentally induced epigenetic changes to their offspring. However, the prevalence of epigenetic inheritance—and the mechanisms behind it—remains unclear.
In a recent study, researchers demonstrated that disrupting the gut microbiome of male mice increases disease risk in their future offspring. Other work has focused on mechanisms that regulate embryonic development in response to changes in paternal diet. A collaborative study between the two groups, now published in The EMBO Journal, examined how specific paternal environments affect early embryonic development in a systematic manner, under tightly controlled genetic and environmental conditions in mice.
To induce environmental perturbations, prospective fathers were exposed to either non-absorbable antibiotics (disrupting the gut microbiota) or to a low-protein, high-sugar diet. To minimize experimental variability, the analyses were performed on embryos resulting from in vitro fertilization (IVF). Embryos were collected approximately four days after fertilization (blastocyst stage) and individually analyzed to measure differences in gene expression compared to controls (blastocysts that resulted from fathers without any treatment). The results were striking. Both environmental perturbations led to significant changes in embryonic gene expression. Disruption of the paternal gut microbiota reduced the expression of key genes involved in extra-embryonic tissue development, while changes in the diet were linked with a modest developmental delay.
To further investigate the influence of the genetic background, scientists repeated the experiments using a different mouse strain. The outcome differed, suggesting the importance of the genetic component in shaping how environmental exposures affect offspring.
Additionally, embryos derived from older fathers showed a stronger effect on gene expression, especially on genes involved in immune-related processes, indicating that paternal age is another important factor involved in epigenetic inheritance.
Mathilde Dura et al, Embryonic signatures of intergenerational epigenetic inheritance across paternal environments and genetic backgrounds, The EMBO Journal (2025). DOI: 10.1038/s44318-025-00556-4
Microlightning causes eerie lights of lore
Spontaneous flashes of 'microlightning' between bubbles of gas could explain will-o'-the-wisps—flickering lights that can appear on marshlands. Researchers blew tiny bubbles of methane and air into water, where smaller bubbles took on a negative charge and larger ones a positive charge. As the charges equalized, they produced a small zap of electricity and a flash of light. This could explain why the ghostly-looking lights appear over methane-rich bogs.
Microplastics reduce soil fertility and boost production of a potent greenhouse gas, study shows
More than 90% of plastic waste ends up in the soil, where it breaks down into microplastics that are invisible to the naked eye. Microplastic pollution of the soil poses a severe threat to soil health as it can harm essential microbial communities and reduce crop yields. The presence of these tiny plastics may also worsen climate change by boosting the production of greenhouse gases, according to a new study published in Environmental Science & Technology.
Most previous research focused on one plastic at a time and its effect on soil function and nutrient cycling, but microplastics do not tend to occur in isolation.
In the present study, the researchers examined the combined effect of various types of plastics on soil and its key functions, such as the nitrogen cycle.
To quantify the problem, the team ran a microcosm experiment in the lab, using soil samples mixed with six different types of plastic, including polyethylene terephthalate (PET) and polyvinyl chloride (PVC). They created four distinct groups with varying levels of plastic, from zero plastics (the control group) to five different types of plastic. After 40 days of incubation, they collected the soil and ran several tests. These included measuring soil properties, such as acidity and key enzyme activities, as well as DNA sequencing to identify bacteria and their associated functional genes.
The team's analysis revealed that increasing microplastic diversity leads to significant shifts in soil health. For example, the plastic mixture considerably raised soil pH (making the soil more alkaline) and increased soil carbon content.
However, one of the most important findings was that microplastic diversity boosted the activity of bacterial genes responsible for denitrification. This is the process by which bacteria convert nitrate, a key plant nutrient, into nitrogen gas, which is then released into the atmosphere. It not only makes the soil less fertile, but also releases nitrous oxide, a greenhouse gas that is around 300 times more potent at warming the planet than carbon dioxide. The primary cause of this accelerated nitrogen loss was a family of bacteria known as Rhodocyclaceae.
Tian-Gui Cai et al, Microplastic Diversity as a Potential Driver of Soil Denitrification Shifts, Environmental Science & Technology (2025). DOI: 10.1021/acs.est.5c04981
Parkinson's 'trigger' directly observed in human brain tissue for the first time
Scientists have, for the first time, directly visualized and quantified the protein clusters believed to trigger Parkinson's, marking a major advance in the study of the world's fastest-growing neurological disease.
These tiny clusters, called alpha-synuclein oligomers, have long been considered the likely culprits for Parkinson's disease to start developing in the brain, but until now, they have evaded direct detection in human brain tissue.
Now, researchers have developed an imaging technique that allows them to see, count and compare oligomers in human brain tissue, a development one of the team says is "like being able to see stars in broad daylight."
Their results, reported in the journal Nature Biomedical Engineering, could help unravel the mechanics of how Parkinson's spreads through the brain and support the development of diagnostics and potential treatments.
The team examined post-mortem brain tissue samples from people with Parkinson's and compared them to samples from healthy individuals of similar age. They found that oligomers exist in both healthy and Parkinson's brains. The main difference was that oligomers in the disease samples were larger, brighter and more numerous, suggesting a direct link to the progression of Parkinson's.
The team also discovered a sub-class of oligomers that appeared only in Parkinson's patients, which could be the earliest visible markers of the disease—potentially years before symptoms appear.
Rebecca Andrews et al, Large-scale visualisation of α-synuclein oligomers in Parkinson's disease brain tissue, Nature Biomedical Engineering (2025). DOI: 10.1038/s41551-025-01496-4
Tracing the evolutionary roots of why women live longer than men
Around the world, women on average live longer than men. This striking pattern holds true across nearly all countries and historical time periods. Although the gap between the sexes has narrowed in some countries due to medical advances and improved living conditions, new research now provides clues as to why this difference is unlikely to disappear anytime soon. The causes are deeply rooted in evolutionary history and can be observed in many animal species.
An international team of scientists conducted the most comprehensive analysis of sex differences in lifespan across mammals and birds to date. Their findings, published in Science Advances, provide novel insight into one of biology's long-standing puzzles: why males and females age differently.
Among mammals, females usually live longer—for instance, in baboons and gorillas, females often outlive males. Yet this pattern is not universal: In many birds, insects, and reptiles, males are the longer-lived sex. One genetic explanation, the heterogametic sex hypothesis, points to differences in sex chromosomes.
In mammals, females have two X chromosomes, while males have only one X and one Y (making them the heterogametic sex). Some research suggests that having two X chromosomes may protect females from harmful mutations, offering a survival advantage. In birds, however, the system is reversed: females are the heterogametic sex.
Using records from 1,176 bird and mammal species in zoos worldwide, the researchers found a striking contrast in lifespan that supports the heterogametic sex hypothesis: in most mammal species (72 percent), females lived longer, by an average of 12 percent, while in most bird species (68 percent), males lived longer, by an average of 5 percent.
Still, there was remarkable variation with many exceptions. Some species showed the opposite of the expected pattern. For example, in many birds of prey, females are both larger and longer-lived than males. So sex chromosomes can only be part of the story.
Sexual selection and parental care shape lifespan differences
In addition to genetics, reproductive strategies also play a role. Through sexual selection, males in particular develop conspicuous characteristics such as colorful plumage, weapons, or large body size, which increase reproductive success but can shorten lifespan. The new study supports this assumption: In polygamous mammals with strong competition, males generally die earlier than females.
Many birds, on the other hand, are monogamous, which means that competitive pressure is lower and males often live longer. Overall, the differences were smallest in monogamous species, while polygamy and pronounced size differences were associated with a more pronounced advantage for females.
Parental care also plays a role. The researchers found evidence that the sex that invests more in raising offspring—in mammals, this is often the females—tends to live longer. In long-lived species such as primates, this is likely to be a selective advantage: females survive until their offspring are independent or sexually mature.
Zoo life reduces—but does not erase—lifespan gaps
A long-standing idea is that environmental pressures—such as predation, pathogens, or harsh climates—drive the observed gaps between males and females. To test this, the researchers turned to zoo populations, where such pressures are largely absent. They found that lifespan gaps persisted even under these protected conditions. Comparing zoo and wild populations showed that the gaps were often smaller in zoos but rarely disappeared—mirroring the human case, where advances in medicine and living conditions have narrowed but not eliminated the lifespan gap.
The findings suggest that sex differences in lifespan are deeply rooted in evolutionary processes—shaped by sexual selection and parental investment—and that genetic differences in the sex-determination system may also play a role. Environmental factors influence the extent of the differences but cannot eliminate them. The differences between the sexes are therefore not only a product of the environment but part of our evolutionary history, and they will most likely continue to exist in the future.
Sexual selection drives sex difference in adult life expectancy across mammals and birds, Science Advances (2025). DOI: 10.1126/sciadv.ady8433
Rare fossil reveals ancient leeches weren't bloodsuckers
A newly described fossil reveals that leeches are at least 200 million years older than scientists previously thought, and that their earliest ancestors may have feasted not on blood, but on smaller marine creatures.
Roughly 430 million years old, the fossil includes a large tail sucker—a feature still found in modern leeches—along with a segmented, teardrop-shaped body. But one important feature isn't found in this fossil: the forward sucker that many of today's leeches use to pierce skin and draw blood.
This absence, along with the fossil's marine origin, suggests a very different early lifestyle for the group known as Hirudinida. Rather than sucking blood from mammals, reptiles, and other vertebrates, the earliest leeches may have roamed the oceans, consuming soft-bodied invertebrates whole or feeding on their internal fluids.
Blood feeding takes a lot of specialized machinery. Anticoagulants, mouthparts, and digestive enzymes are complex adaptations. It makes more sense that early leeches were swallowing prey whole or maybe drinking the internal fluids of small, soft-bodied marine animals.
Previously, scientists thought leeches emerged about 150–200 million years ago. That timeline has now been pushed back by at least 200 million years, thanks to the fossil found in the Waukesha biota, a geological formation in Wisconsin known for preserving soft-bodied animals that usually decay before fossilization.
Preserving a leech fossil is no small feat. Leeches lack the bones, shells, or exoskeletons that are most easily preserved over millions of years. Fossils like this require exceptional circumstances to form, often involving near-immediate burial, a low-oxygen environment, and unusual geochemical conditions.
A rare animal and just the right environment to fossilize it—it's like hitting the lottery twice.
de Carle D et al, The first leech body fossil predates estimated hirudinidan origins by 200 million years, PeerJ (2025). DOI: 10.7717/peerj.19962
Dr. Krishna Kumari Challa
New cancer clue discovered in our saliva
Scientists have discovered a brand-new type of DNA element swimming in our saliva.
The giant loops of genetic information – named ‘inocles’ – are linked to oral health, the immune system, and cancer risk.
Those with head and neck cancers show far fewer of these DNA elements in their mouths.
Inocles are found in roughly 75 percent of the population.
Like little survival kits, their presence might help certain oral bacteria during stressful times, possibly altering the microbiome in our mouths.
How that impacts tumor growth is a mystery still to be solved. Right now, 95 percent of the genes in inocles remain uncharacterized.
https://www.nature.com/articles/s41467-025-62406-5
Sep 19
Dr. Krishna Kumari Challa
Sugary drinks may increase risk of metastasis in advanced colorectal cancer
A new study by researchers shows that the glucose-fructose mix found in sugary drinks directly fuels metastasis in preclinical models of advanced colorectal cancer. The study was published today in Nature Metabolism.
The researchers studied how sugary drinks may affect late-stage colorectal cancer. Using laboratory cancer models, they compared the effects of the glucose-fructose mix found in most sugary drinks with those of glucose or fructose alone. Only the sugar mix made cancer cells more mobile, leading to faster spread to the liver—the most common site of colorectal cancer metastasis.
The sugar mix activated an enzyme called sorbitol dehydrogenase (SORD), which boosts glucose metabolism and triggers the cholesterol pathway, ultimately driving metastasis. This is the same pathway targeted by statins, common heart drugs that inhibit cholesterol production. Blocking SORD slowed metastasis, even with the sugar mix present. These findings suggest that targeting SORD could also offer an opportunity to block metastasis.
These findings highlight that daily diet matters not only for cancer risk but also for how the disease progresses once it has developed.
Fructose and glucose from sugary drinks enhance colorectal cancer metastasis via SORD, Nature Metabolism (2025). DOI: 10.1038/s42255-025-01368-w. www.nature.com/articles/s42255-025-01368-w
Sep 20
Dr. Krishna Kumari Challa
Think before you ink: Uncovering the hidden risks in tattoo inks
New research suggests that what's in the ink may pose greater risks than the size and design of the tattoo.
A new study has revealed that the ingredients listed on tattoo ink labels often don't match what's actually inside the bottle. Researchers warn that this comes with risk because there are currently few regulations, laws and safety criteria for tattoo and permanent cosmetic formulations.
The findings, published this month as the cover story in the Journal of Environmental Health, raise fresh concerns about the safety and regulation of tattoo inks.
Using a combination of advanced analytical techniques, researchers found discrepancies between labeled and actual ingredients in a range of commercially available yellow tattoo inks.
The results showed not only discrepancies with label claims, but also the presence of unlisted elements such as aluminum, sodium and silicon.
Batool A. Aljubran et al, Decoding Tattoo Inks: Multiple Analysis Techniques Reveal Discrepancies in Ingredient Composition and Elemental Content When Compared Against Label Claims, Journal of Environmental Health (2025). DOI: 10.70387/001c.143999
Sep 20
Dr. Krishna Kumari Challa
Study finds 10% of pediatric blood cancers may stem from medical imaging radiation
A new study has concluded that radiation from medical imaging is associated with a higher risk of blood cancers in children.
Researchers examined data from nearly 4 million children and estimated that one in 10 blood cancers—some 3,000 cancers in all—may be attributable to radiation exposure from medical imaging. The risk increased proportionally based on the cumulative amount of radiation the children received.
The investigation is the first comprehensive assessment using data from children and adolescents in North America that quantifies the association between radiation exposure from medical imaging and blood and bone marrow cancers, such as leukemia and lymphoma, which are the most common forms of cancer in children and adolescents.
Medical imaging saves lives by enabling timely diagnosis and effective treatment, but it also exposes patients to ionizing radiation, a known carcinogen, particularly through computed tomography (CT).
Children are particularly vulnerable to radiation-induced cancer due to their heightened radiosensitivity and longer life expectancy.
The authors caution that doctors and parents should avoid excessive radiation doses and minimize exposure when clinically feasible.
Investigators found a significant relationship between cumulative radiation dose and the risk of a hematologic malignancy, which includes tumors affecting the blood, bone marrow, lymph, and lymphatic system.
Among all the forms of medical imaging, the study found that chest radiography was the most common imaging exam that doctors performed. The most common form of CT was of the head and brain.
For children who underwent a head CT, the researchers attributed about a quarter of the children's subsequent hematologic malignancies to radiation exposure.
The authors emphasized that while medical imaging remains an invaluable tool in pediatric care, their findings highlight the need to carefully balance its diagnostic benefits with potential long-term risks.
Rebecca Smith-Bindman, et al. Medical Imaging and Pediatric and Adolescent Hematologic Cancer Risk. The New England Journal of Medicine (2025). DOI: 10.1056/NEJMoa2502098
Sep 22
Dr. Krishna Kumari Challa
El Niño, which means little boy in Spanish, is a complex, cyclical weather pattern that warms the surface waters in the eastern Pacific Ocean. This weakens trade winds and alters atmospheric conditions, a process that has long been known to suppress summer rainfall in India.
Studies found a direct connection between Indian rainfall and El Niño. In the drier regions of southeastern and northwestern India, this climate phenomenon reduces total rainfall and the intensity of extreme events. However, in the wet central and southwestern parts of the country, it results in less frequent but more intense downpours. This is due to changes in convective buoyancy, an atmospheric force that powers storms.
During El Niño summers, convective buoyancy increases in the wettest regions, which makes more extreme rainfall likely. The chance of a very heavy downpour (more than 250 mm of rain) increases by 43% across all of India's monsoon-affected areas and 59% for the Central Monsoon Zone during El Niño summers.
"Extreme daily rainfall over the Indian summer monsoon's rainiest areas becomes more likely the stronger that El Niño conditions are," commented the researchers in their study.
Spencer A. Hill, More extreme Indian monsoon rainfall in El Niño summers, Science (2025). DOI: 10.1126/science.adg5577. www.science.org/doi/10.1126/science.adg5577
Sep 23
Dr. Krishna Kumari Challa
Hormone therapy eases hot flashes while leaving heart risk unchanged in younger women
Researchers report that menopausal hormone therapy relieved vasomotor symptoms without raising cardiovascular risk in younger women, but increased risk was evident in women aged 70 years and older.
Concerns about the safety of hormone therapy have discouraged many women and clinicians from using it to treat hot flashes and night sweats. Some previous studies have suggested increased risk of coronary heart disease linked to hormone use, particularly in older women. At the same time, vasomotor symptoms remain among the most common and distressing consequences of menopause, and effective treatment options have been limited.
In the study, "Menopausal Hormone Therapy and Cardiovascular Diseases in Women With Vasomotor Symptoms: A Secondary Analysis of the Women's Health Initiative Randomized Clinical Trials," published in JAMA Internal Medicine, researchers analyzed outcomes from two hormone therapy trials to assess the safety of treatment in women experiencing vasomotor symptoms.
A total of 27,347 postmenopausal women aged 50 to 79 years were enrolled from 40 clinical centers across the United States. Among them, 10,739 had undergone hysterectomy and were randomized to receive conjugated equine estrogens (CEE) or placebo, while 16,608 with an intact uterus were randomized to CEE plus medroxyprogesterone acetate (MPA) or placebo.
Women were followed for a median of 7.2 years in the CEE trial and 5.6 years in the CEE plus MPA trial. Vasomotor symptoms were assessed at baseline and year one, with nearly all women who reported moderate or severe symptoms recalling onset near menopause.
CEE alone reduced vasomotor symptoms by 41% across all age groups. CEE plus MPA also reduced symptoms but with age-related variation, showing strong benefit in women aged 50 to 59 years and diminished or absent benefit in women 70 years and older.
Analysis of cardiovascular outcomes found neutral effects of both CEE alone and CEE plus MPA in women aged 50 to 59 years with moderate or severe symptoms. No clear harm was observed in those aged 60 to 69 years, although risk estimates were elevated for CEE alone.
Women aged 70 years and older had significantly higher rates of atherosclerotic cardiovascular disease, with hazard ratios of 1.95 for CEE alone and 3.22 for CEE plus MPA, corresponding to 217 and 382 excess events per 10,000 person-years, respectively.
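To make the absolute numbers concrete, here is a hypothetical sketch of how a hazard ratio translates into excess events per 10,000 person-years; the baseline event rate below is assumed purely for illustration and is not taken from the trial, whose published estimates come directly from the trial data:

```python
# Hypothetical illustration of how a hazard ratio maps to excess events per
# 10,000 person-years. The baseline rate is assumed for the arithmetic only;
# the study's 217 and 382 figures are estimated from the trial data themselves.

assumed_baseline_rate = 200.0   # assumed events per 10,000 person-years without hormone therapy
hr_cee_alone = 1.95             # hazard ratio, CEE alone, women aged 70+
hr_cee_plus_mpa = 3.22          # hazard ratio, CEE plus MPA, women aged 70+

# Under a constant-rate approximation, excess events scale with (HR - 1).
excess_cee_alone = assumed_baseline_rate * (hr_cee_alone - 1)
excess_cee_plus_mpa = assumed_baseline_rate * (hr_cee_plus_mpa - 1)

print(f"CEE alone:    ~{excess_cee_alone:.0f} excess events per 10,000 person-years")
print(f"CEE plus MPA: ~{excess_cee_plus_mpa:.0f} excess events per 10,000 person-years")
# With the assumed baseline this gives roughly 190 and 444, the same order of
# magnitude as the 217 and 382 excess events reported in the study.
```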
Investigators concluded that hormone therapy can be considered for treatment of vasomotor symptoms in younger women aged 50 to 59 years, while initiation in women aged 60 to 69 years requires caution. For women 70 years and older, the findings argue against the use of hormone therapy due to substantially increased cardiovascular risk.
Part 1
Sep 23
Dr. Krishna Kumari Challa
Jacques E. Rossouw et al, Menopausal Hormone Therapy and Cardiovascular Diseases in Women With Vasomotor Symptoms, JAMA Internal Medicine (2025). DOI: 10.1001/jamainternmed.2025.4510
Deborah Grady et al, Hormone Therapy for Menopausal Vasomotor Symptoms—Better Understanding Cardiovascular Risk, JAMA Internal Medicine (2025). DOI: 10.1001/jamainternmed.2025.4521
Part 2
Sep 23
Dr. Krishna Kumari Challa
COVID-19 models suggest universal vaccination may avert over 100,000 hospitalizations
The US Scenario Modeling Hub, a collaborative modeling effort of 17 academic research institutions, reports that a universal COVID-19 vaccination recommendation could avert thousands more hospitalizations and deaths than a high-risk-only strategy.
Nine independent teams produced projections under six scenarios that combined two immune escape rates, 20% and 50% per year, with three vaccine recommendation strategies. An immune escape rate is the annual reduction in protection against infection that occurs as new SARS-CoV-2 variants evolve. Projections covered the United States population of 332 million, including an estimated 58 million people aged 65 years and older.
The three vaccine recommendation strategies were: no recommendation, recommendation for high-risk groups only, or recommendation for all eligible individuals. Annually reformulated vaccines were assumed to match variants circulating on June 15, 2024, to be available on September 1, 2024, and to be 75% effective against hospitalization at the time of release.
Teams calibrated their models to weekly hospitalizations and deaths reported by the National Healthcare Safety Network and the National Center for Health Statistics.
In the worst case scenario, defined by high immune escape with no vaccine recommendation, projections reached 931,000 hospitalizations with a 95% projection interval of 0.5 to 1.3 million and 62,000 deaths with a 95% projection interval of 18,000 to 115,000.
In the best case defined by low immune escape with a universal recommendation, the ensemble projected 550,000 hospitalizations with a 95% projection interval of 296,000 to 832,000 and 42,000 deaths with a 95% projection interval of 13,000 to 72,000.
Severe outcomes clustered in older populations. Across scenarios, adults aged 65 years and older accounted for 51% to 62% of hospitalizations and 84% to 87% of deaths.
Vaccination of high-risk groups was projected to avert only 11% of hospitalizations under low immune escape and 8% under high escape, along with 13% and 10% of deaths, respectively. A universal recommendation increased the effect, with 15% fewer hospitalizations under low immune escape and 11% fewer under high, along with 16% and 13% fewer deaths.
Under high immune escape, a high-risk-only strategy averted 76,000 hospitalizations with a 95% CI of 34,000 to 118,000 and 7,000 deaths with a 95% CI of 3,000 to 11,000. Expanding to a universal recommendation prevented 104,000 hospitalizations with a 95% CI of 55,000 to 153,000 and 9,000 deaths with a 95% CI of 4,000 to 14,000.
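The averted percentages quoted above can be recovered, to rounding, from the projected totals under high immune escape; the short consistency check below is illustrative and not part of the published ensemble modeling:

```python
# Simple consistency check, illustrative only and not part of the published
# ensemble modeling: averted fraction = averted hospitalizations divided by the
# projected hospitalizations with no vaccine recommendation.

no_recommendation_hosp = 931_000   # projected hospitalizations, high immune escape, no recommendation
averted_high_risk_only = 76_000    # averted by vaccinating high-risk groups only
averted_universal = 104_000        # averted by a universal recommendation

frac_high_risk = averted_high_risk_only / no_recommendation_hosp
frac_universal = averted_universal / no_recommendation_hosp

print(f"high-risk-only strategy averts {frac_high_risk:.0%} of hospitalizations")
print(f"universal strategy averts {frac_universal:.0%} of hospitalizations")
# Prints roughly 8% and 11%, matching the percentages quoted above for the
# high-immune-escape scenario.
```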
Additional indirect benefits accrued to older adults under a universal strategy. Compared with high-risk-only vaccination, universal recommendations prevented about 11,000 more hospitalizations and 1,000 more deaths in those aged 65 years and older.
Observed national patterns diverged in timing from projections. A marked summer 2024 wave was followed by a smaller peak in January 2025, while projections anticipated the heaviest burden from late December 2024 to mid January 2025. Ensemble coverage for weekly deaths remained strong, with 95% intervals closely matching observed values.
Authors conclude that vaccines remain a critical tool to limit COVID-19 burden in 2024–2025, with universal recommendations offering added direct and indirect protection and the potential to save thousands more lives, including among older adults.
Part 1
Sep 23
Dr. Krishna Kumari Challa
Sara L. Loo et al, Scenario Projections of COVID-19 Burden in the US, 2024-2025, JAMA Network Open (2025). DOI: 10.1001/jamanetworkopen.2025.32469
Part 2
Sep 23
Dr. Krishna Kumari Challa
Nanoparticles supercharge vinegar's old-fashioned wound healing power
Wounds that do not heal are often caused by bacterial infections and are particularly dangerous for the elderly and people with diabetes, cancer and other conditions. Acetic acid (more commonly known as vinegar) has been used for centuries as a disinfectant, but it is only effective against a small number of bacteria, and it does not kill the most dangerous types.
New research has resulted in the ability to boost the natural bacterial killing qualities of vinegar by adding antimicrobial nanoparticles made from carbon and cobalt. The findings have been published in the journal ACS Nano.
The acidic environment from the vinegar made bacterial cells swell and take up the nanoparticle treatment.
Once exposed, the nanoparticles appear to attack dangerous bacteria both inside the bacterial cell and on its surface, causing them to burst. Importantly, this approach is nontoxic to human cells and was shown to clear bacterial infections from wounds in mice without affecting healing.
Adam Truskewycz et al, Cobalt-Doped Carbon Quantum Dots Work Synergistically with Weak Acetic Acid to Eliminate Antimicrobial-Resistant Bacterial Infections, ACS Nano (2025). DOI: 10.1021/acsnano.5c03108
Sep 23
Dr. Krishna Kumari Challa
Over 60,000 died in Europe alone from heat during 2024 summer: Study
More than 60,000 people died from heat in Europe during last year's record-breaking summer, a benchmark study said this week, in the latest warning of the massive toll climate change is having on the continent.
With Europe heating up twice as fast as the global average, the Spain-based researchers suggested an emergency alert system could help warn vulnerable people—particularly the elderly—ahead of dangerous heat waves.
Europe experienced an exceptionally deadly summer in 2024 with more than 60,000 heat-related deaths, bringing the total burden over the past three summers to more than 181,000, said the study in the journal Nature Medicine.
Tomáš Janoš, Heat-related mortality in Europe during 2024 and health emergency forecasting to reduce preventable deaths, Nature Medicine (2025). DOI: 10.1038/s41591-025-03954-7. www.nature.com/articles/s41591-025-03954-7
Sep 23
Dr. Krishna Kumari Challa
Our actions are dictated by 'autopilot' not choice, finds study
Habit, not conscious choice, drives most of our actions, according to new research.
The research, published in Psychology & Health, found that two-thirds of our daily behaviors are initiated "on autopilot", out of habit.
Habits are actions that we are automatically prompted to do when we encounter everyday settings, due to associations that we have learned between those settings and our usual responses to them.
The research also found that 46% of behaviors were both triggered by habit and aligned with conscious intentions, suggesting that people form habits that support their personal goals, and often disrupt habits that conflict with them.
The study found that 65% of daily behaviors were habitually initiated, meaning people were prompted to do them out of routine rather than making a conscious decision.
For people who want to break their bad habits, simply telling them to 'try harder' isn't enough. To create lasting change, we must incorporate strategies to help people recognize and disrupt their unwanted habits, and ideally form positive new ones in their place.
The researchers recommend that initiatives designed to help people adopt new behaviors, like exercising or eating healthier, should focus on building new, positive habits.
Amanda L. Rebar et al, How habitual is everyday life? An ecological momentary assessment study, Psychology & Health (2025). DOI: 10.1080/08870446.2025.2561149
Sep 23
Dr. Krishna Kumari Challa
The brain splits up vision without you even noticing
The brain divides vision between its two hemispheres—what's on your left is processed by your right hemisphere and vice versa—but your experience with every bike or bird that you see zipping by is seamless. A new study by neuroscientists reveals how the brain handles the transition.
It's surprising to some people to hear that there's some independence between the hemispheres, because that doesn't really correspond to how we perceive reality. In our consciousness, everything seems to be unified.
There are advantages to separately processing vision on either side of the brain, including the ability to keep track of more things at once, researchers have found, but neuroscientists have been eager to fully understand how perception ultimately appears so unified in the end.
Researchers measured neural activity in the brains of animals as they tracked objects crossing their field of view. The results reveal that different frequencies of brain waves encoded and then transferred information from one hemisphere to the other in advance of the crossing and then held on to the object representation in both hemispheres until after the crossing was complete.
The process is analogous to how relay racers hand off a baton, how a child swings from one monkey bar to the next, and how cell phone towers hand off a call from one to the next as a train passenger travels through their area. In all cases, both towers or hands actively hold what's being transferred until the handoff is confirmed.
Part 1
Sep 23
Dr. Krishna Kumari Challa
To conduct the study, published in the Journal of Neuroscience, the researchers measured both the electrical spiking of individual neurons and the various frequencies of brain waves that emerge from the coordinated activity of many neurons. They studied the dorsal and ventrolateral prefrontal cortex in both hemispheres, brain areas associated with executive brain functions.
The power fluctuations of the wave frequencies in each hemisphere told the researchers a clear story about how the subjects' brains transferred information from the "sending" to the "receiving" hemisphere whenever a target object crossed the middle of their field of view. In the experiments, the target was accompanied by a distractor object on the opposite side of the screen to confirm that the subjects were consciously paying attention to the target object's motion and not just indiscriminately glancing at whatever happened to pop up onto the screen.
The highest frequency "gamma" waves, which encode sensory information, peaked in both hemispheres when the subjects first looked at the screen and again when the two objects appeared. When a color change signaled which object was the target to track, the gamma increase was only evident in the "sending" hemisphere (on the opposite side as the target object), as expected.
The power of somewhat lower frequency "beta" waves, which regulate when gamma waves are active, varied inversely with the gamma waves. These sensory encoding dynamics were stronger in the ventrolateral locations than in the dorsolateral ones.
Two distinct bands of lower frequency waves, by contrast, showed greater power in the dorsolateral locations at key moments related to achieving the handoff. About a quarter of a second before a target object crossed the middle of the field of view, "alpha" waves ramped up in both hemispheres and then peaked just after the object crossed. "Theta" band waves, meanwhile, peaked only after the crossing was complete, and only in the "receiving" hemisphere (opposite from the target's new position).
Accompanying the pattern of wave peaks, neuron spiking data showed how the brain's representation of the target's location traveled. Using decoder software, which interprets what information the spikes represent, the researchers could see the target representation emerge in the sending hemisphere's ventrolateral location when it was first cued by the color change. Then they could see that as the target neared the middle of the field of view, the receiving hemisphere joined the sending hemisphere in representing the object, so that they both encoded the information during the transfer.
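To picture what such a decoder does, here is a minimal sketch of a linear classifier trained on spike counts to predict which side of the visual field a target occupies. The simulated data, array sizes and choice of logistic regression are illustrative assumptions, not the study's actual analysis pipeline.

```python
# Minimal sketch of spike-count decoding. The shapes, simulated data, and
# choice of logistic regression are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_neurons = 200, 64          # hypothetical recording session
spike_counts = rng.poisson(5.0, size=(n_trials, n_neurons)).astype(float)
target_side = rng.integers(0, 2, size=n_trials)   # 0 = left hemifield, 1 = right

# Give a subset of neurons a side-dependent firing-rate bias so that the
# simulated "population" actually carries information about target location.
spike_counts[:, :10] += 2.0 * target_side[:, None]

decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, spike_counts, target_side, cv=5).mean()
print(f"cross-validated decoding accuracy: {accuracy:.2f}")
```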
Taken together, the results showed that after the sending hemisphere initially encoded the target with a ventrolateral interplay of beta and gamma waves, a dorsolateral ramp up of alpha waves caused the receiving hemisphere to anticipate the handoff by mirroring the sending hemisphere's encoding of the target information. Alpha peaked just after the target crossed the middle of the field of view, and when the handoff was complete, theta peaked in the receiving hemisphere as if to say, "I got it."
Part 2
Sep 23
Dr. Krishna Kumari Challa
And in trials where the target never crossed the middle of the field of view, these handoff dynamics were not apparent in the measurements.
The study shows that the brain is not simply tracking objects in one hemisphere and then picking them up anew when they enter the half of the visual field handled by the other hemisphere.
"These results suggest there are active mechanisms that transfer information between cerebral hemispheres," the authors wrote. "The brain seems to anticipate the transfer and acknowledge its completion."
They also note, based on other studies, that this system of interhemispheric coordination can appear to break down in certain neurological conditions, including schizophrenia, autism, depression, dyslexia and multiple sclerosis. The new study may lend insight into the specific dynamics needed for it to succeed.
Matthew B. Broschard et al, Evidence for an active handoff between hemispheres during target tracking, The Journal of Neuroscience (2025). DOI: 10.1523/jneurosci.0841-25.2025
Part 3
Sep 23
Dr. Krishna Kumari Challa
UK Biobank analysis finds higher dementia incidence with frailty
Researchers report evidence that physical frailty is associated with dementia and that genetic background, brain structure, and immunometabolic function may mediate this link.
Dementia causes loss of cognitive abilities and daily functioning; it reached an estimated 57 million cases worldwide in 2021, with projections indicating a rise to 153 million by 2050. Treatments remain limited in effectiveness, creating a need for early identification of risk factors.
Previous studies have noted associations between frailty and elevated risk of incident dementia. Frailty is characterized by decreased physical function and a reduced ability to overcome stressors. Genetic susceptibility might influence the frailty-dementia relationship, and immunometabolic processes and brain structure are implicated, though mechanisms remain unclear.
In the study, "Association of Frailty With Dementia and the Mediating Role of Brain Structure and Immunometabolic Signatures," published in Neurology, researchers conducted a prospective cohort investigation to elucidate the link between physical frailty and dementia, assess causality, and explore biologic mechanisms.
UK Biobank contributed data from 489,573 participants without dementia at enrollment between 2006 and 2010, with a median follow-up of 13.58 years during which 8,900 dementia cases were documented. Brain MRI was available for a subset.
Physical frailty was defined by five criteria: weight loss, exhaustion, physical inactivity, slow walking speed, and low grip strength. Components were summed to a 0–5 score and categorized as nonfrail, prefrail, or frail. Biomarker panels included peripheral blood cell counts, biochemical measures, and metabolomic markers with correction for multiple testing.
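As an illustration, the five-item score described above can be tallied and categorized with a few lines of code. The 0 / 1–2 / 3–5 cutoffs follow the common Fried frailty-phenotype convention and are an assumption here, not a detail confirmed by the paper.

```python
# Minimal sketch of the five-item physical frailty score described above.
# The 0 / 1-2 / 3-5 cutoffs follow the common Fried-phenotype convention and
# are an assumption; the study's exact operational definitions may differ.
FRAILTY_ITEMS = ("weight_loss", "exhaustion", "inactivity",
                 "slow_walking", "low_grip_strength")

def frailty_category(flags: dict) -> tuple[int, str]:
    """Sum the five binary criteria and map the total to a category."""
    score = sum(bool(flags.get(item, False)) for item in FRAILTY_ITEMS)
    if score == 0:
        return score, "nonfrail"
    if score <= 2:
        return score, "prefrail"
    return score, "frail"

print(frailty_category({"exhaustion": True, "slow_walking": True,
                        "low_grip_strength": True}))   # -> (3, 'frail')
```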
Risk climbed with frailty status, with prefrailty associated with higher hazard versus nonfrailty (HR 1.50, 95% CI 1.44–1.57) and frailty associated with a larger increase (HR 2.82, 95% CI 2.61–3.04). Joint analyses placed the greatest hazards where frailty met genetic susceptibility, including HR 3.87 (95% CI 3.30–4.55) for high polygenic risk with frailty and HR 8.45 (95% CI 7.51–9.51) for APOE-e4 gene carriers with frailty.
Neuroimaging and biomarker findings linked frailty severity with image-derived brain measures and immunometabolic markers. Mendelian randomization supported a potential causal effect of physical frailty on dementia (OR 1.79, 95% CI 1.03–3.12), with reverse analysis reporting a null association (OR 1.00, 95% CI 0.98–1.01).
The authors conclude that the findings support a causal association between physical frailty and dementia and suggest that frailty may serve as an early marker of vulnerability.
Xiangying Suo et al, Association of Frailty With Dementia and the Mediating Role of Brain Structure and Immunometabolic Signatures, Neurology (2025). DOI: 10.1212/wnl.0000000000214199
Sep 24
Dr. Krishna Kumari Challa
The hunted, not the hunters: AI reveals early humans were prey for leopards
A new study may be about to rewrite a part of our early human history. It has long been thought that Homo habilis, often considered the first true human species, was the one to turn the tables on the predator–prey relationship. However, a recent analysis of previous archaeological finds suggests that they were possibly more hunted than hunters and not the dominant species we once believed them to be.
To investigate this, researchers used artificial intelligence and computer vision to analyze tiny tooth marks on two H. habilis fossils. These ancient remains come from Olduvai Gorge in Tanzania and date back almost 2 million years.
The researchers trained the AI models on a library of 1,496 images of tooth marks made by modern carnivores, including leopards, lions, crocodiles, wolves and hyenas. Once the models were trained, the researchers presented them with photos of the fossil tooth marks.
As the scientists detail in their paper, published in the Annals of the New York Academy of Sciences, the AI compared these marks to what it had learned and concluded, with more than 90% probability, that leopards made the tooth marks. A key reason for this was that the triangular shape of the tooth pits on the bones matched those in the leopard reference samples.
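Conceptually, the classification step works like the sketch below: a model trained on labeled tooth-mark examples returns a probability for each candidate carnivore. The geometric features, simulated training data and random-forest classifier here are stand-ins for illustration only; the study itself used deep-learning computer vision on images.

```python
# Illustrative sketch of the classification step: a model trained on labeled
# tooth-mark examples outputs a probability for each candidate carnivore.
# The features, simulated data, and random-forest choice are assumptions;
# the study used deep-learning computer vision on images instead.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
carnivores = ["leopard", "lion", "crocodile", "wolf", "hyena"]

# Hypothetical training set: 1,496 marks x 3 shape features
# (pit length, pit depth, "triangularity" index), labeled by carnivore.
X_train = rng.normal(size=(1496, 3))
y_train = rng.integers(0, len(carnivores), size=1496)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

fossil_mark = rng.normal(size=(1, 3))        # features of one fossil tooth pit
probs = model.predict_proba(fossil_mark)[0]
for name, p in zip(carnivores, probs):
    print(f"{name}: {p:.2f}")
```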
"The implications of this are major, since it shows that H. habilis was still more of a prey than a predator," wrote the researchers in their paper. "It also shows that the trophic position of some of the earliest representatives of the genus Homo was not different from those of other australopithecines."
Although the research was limited to just two individuals, the scientists contend that if H. habilis had become a powerful species that could compete with carnivores, their bones would more likely have been scavenged by bone-crushing animals, such as hyenas, after they died from other causes.
The fact that the bites were from a flesh-eating predator means the leopards were actively hunting them. This suggests that the transition to a dominant position in the food chain came later in human evolution, according to the research team.
Marina Vegara‐Riquelme et al, Early humans and the balance of power: Homo habilis as prey, Annals of the New York Academy of Sciences (2025). DOI: 10.1111/nyas.15321
Sep 24
Dr. Krishna Kumari Challa
Popular keto diet linked to glucose intolerance and fatty liver in mice
Avocado toast with fried cheese as the bread and zucchini noodles in butter-bacon sauce are among the many recipe ideas fueling social media's beloved high-fat, low-carbohydrate ketogenic, or keto diet. However, scientists have found that while keto can lead to limited weight gain and even weight loss, it does so at the cost of metabolic issues like glucose intolerance.
Scientists divided mice into four dietary groups: a ketogenic diet (KD, 90% fat), a high-fat diet (HFD, 60% fat), a low-fat diet (LFD) and a low-fat, moderate-protein diet (LFMP), with varying levels of carbohydrates and protein.
The mice were allowed to eat freely for up to 36 weeks (males) or 44 weeks (females). Test results showed that while the KD supported weight control, it also raised blood cholesterol levels and led to fatty liver in males.
The diet is named "ketogenic" because it sends the body into ketosis—a metabolic state in which the body burns fat as its primary fuel instead of the usual carbohydrates and, as a result, produces molecules called ketone bodies. The diet isn't a new food trend: it has been around for nearly 100 years and is well established for treating drug-resistant epilepsy in children, reducing seizures in many cases. The exact way a KD helps control seizures is still unclear, but several ideas have been proposed. Some studies suggest that a KD can stabilize blood glucose, which benefits brain metabolism and neurotransmitter activity, while others highlight the anticonvulsant effects of the ketone bodies themselves.
For this study, the researchers included both male and female mice and carried out regular check-ins to assess the long-term effects of KD on health parameters, including body composition, organ health, and blood profile.
They found that the KD protected against excessive weight gain compared with the conventional high-fat diet, but KD-fed mice still gained more weight than those on the low-fat diets. Long-term KD also caused severe hyperlipidemia, meaning very high levels of fat in the blood.
KD-fed mice developed severe glucose intolerance because the diet markedly impaired insulin secretion from the pancreas. While female mice on the KD seemed fine, the males showed signs of liver dysfunction and fatty liver.
The findings made it quite evident that long-term KD can trigger several metabolic disturbances, raising caution against its widespread use as a health-promoting diet.
Molly R. Gallop et al, A long-term ketogenic diet causes hyperlipidemia, liver dysfunction, and glucose intolerance from impaired insulin secretion in mice, Science Advances (2025). DOI: 10.1126/sciadv.adx2752
Sep 24
Dr. Krishna Kumari Challa
Scientists show how to grow more nutritious rice that uses less fertilizer
The cultivation of rice—the staple grain for more than 3.5 billion people around the world—comes with extremely high environmental, climate and economic costs.
This may be about to change, thanks to new research.
Scientists have shown that nanoscale applications of the element selenium can decrease the amount of fertilizer necessary for rice cultivation while sustaining yields, boosting nutrition, enhancing the soil's microbial diversity and cutting greenhouse gas emissions.
In a new paper published in the Proceedings of the National Academy of Sciences, they demonstrate for the first time that such nanoscale applications work in real-world conditions.
They used an aerial drone to lightly spray rice growing in a paddy with a suspension of nanoscale selenium. That direct contact means the rice plant absorbs the selenium far more efficiently than it would if the selenium were applied to the soil.
Selenium stimulates the plant's photosynthesis, which increased by more than 40%. Increased photosynthesis means the plant absorbs more CO2, which it then turns into carbohydrates. Those carbohydrates flow down into the plant's roots, which causes them to grow.
Bigger, healthier roots release a host of organic compounds that cultivate beneficial microbes in the soil, and it's these microbes that then work symbiotically with the rice roots to pull more nitrogen and ammonium out of the soil and into the plant, increasing its nitrogen use efficiency (NUE) from 30% to 48.3% and decreasing nitrous oxide and ammonia releases to the atmosphere by 18.8–45.6%.
With more nutrients coming in, the rice itself produces a higher yield, with a more nutritious grain: levels of protein, certain critical amino acids, and selenium also jumped.
On top of all of this, they found that their nano-selenium applications allowed farmers to reduce their nitrogen applications by 30%. Since rice cultivation accounts for 15–20% of the global nitrogen use, this new technique holds real promise for helping to meet the triple threat of growing population, climate change, and the rising economic and environmental costs of agriculture.
Wang, Zhenyu et al, Nanotechnology-driven coordination of shoot–root systems enhances rice nitrogen use efficiency, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2508456122
Sep 24
Dr. Krishna Kumari Challa
The Ganges River is drying at an unprecedented rate, new study finds
The Ganges River is in crisis. This lifeline for around 600 million people in India and neighboring countries is experiencing its worst drying period in 1,300 years. Using a combination of historical data, paleoclimate records and hydrological models, researchers discovered that human activity is the main cause. They also found that the current drying is more severe than any recorded drought in the river's history.
In their study, published in the Proceedings of the National Academy of Sciences, researchers first reconstructed the river's flow for the last 1,300 years (700 to 2012 C.E.) by analyzing tree rings from the Monsoon Asia Drought Atlas (MADA) dataset. Then they used powerful computer programs to combine this tree-ring data with modern records to create a timeline of the river's flow. To ensure its accuracy, they double-checked it against documented historical droughts and famines.
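The calibration idea behind such reconstructions can be sketched in a few lines: fit the tree-ring drought index against instrumental flow over the years where the two overlap, then apply that fit to the earlier proxy values. The synthetic numbers and the simple linear regression below are assumptions for illustration, not the study's actual hydrological model.

```python
# Minimal sketch of a proxy calibration: fit tree-ring drought-index values
# against instrumental flow over the overlap years, then apply the fit to
# earlier proxy values to reconstruct past flow. Synthetic data and the
# simple linear-regression approach are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(2)

proxy_index = rng.normal(0.0, 1.0, size=1313)          # e.g. 700-2012 C.E.
true_slope, true_intercept = 400.0, 12000.0             # arbitrary units
flow = true_intercept + true_slope * proxy_index + rng.normal(0, 150, 1313)

overlap = slice(-60, None)                # last 60 years = instrumental era
slope, intercept = np.polyfit(proxy_index[overlap], flow[overlap], deg=1)

reconstructed_flow = intercept + slope * proxy_index    # full 1,300-year estimate
print(f"calibration slope = {slope:.1f}, intercept = {intercept:.1f}")
```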
The scientists found that the recent drying of the Ganges River from 1991 to 2020 is 76% worse than the previous worst recorded drought, which occurred during the 16th century. Not only is the river drier overall, but droughts are now more frequent and last longer. The main reason, according to the researchers, is human activity. While some natural climate patterns are at play, the primary driver is the weakening of the summer monsoon.
This weakening is linked to human-driven factors such as the warming of the Indian Ocean and air pollution from anthropogenic aerosols—liquid droplets and fine solid particles that come from factories, vehicles and power plants, among other sources, and can suppress rainfall. The scientists also found that most climate models failed to spot the severe drying trend.
How to avoid this?
The researchers suggest two main courses of action. Given the mismatch between climate models and what they actually found, they are calling for better modeling to account for the regional impacts of human activity.
And because the Ganges is a vital source of water for drinking, agricultural production, industrial use and wildlife, the team also recommends implementing new adaptive water management strategies to mitigate potential water scarcity.
Dipesh Singh Chuphal et al, Recent drying of the Ganga River is unprecedented in the last 1,300 years, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424613122
Sep 25
Dr. Krishna Kumari Challa
Study finds microplastics in all beverages tested, raising exposure estimates
Microplastics have found their way deep inside our bones, brains, and even babies. A UK study found that 100% of all 155 hot and cold beverage samples tested contained synthetic plastic particles.
Microplastics (MPs) are tiny pieces of plastic ranging in size from 1 μm to 5 mm that are widespread across aquatic, terrestrial and even airborne environments. They have become a growing health concern for living beings across species because of their ability to accumulate and carry toxic chemicals through food webs.
Humans come into contact with MPs every day through food, water, consumer goods, and even air.
The researchers tested different products from popular UK brands, including coffee, tea, juices, energy drinks, soft drinks, and even tap and bottled water, and not a single beverage was free of MPs. Surprisingly, the more expensive tea bag brand showed a higher concentration of MPs than the cheaper ones.
Traces of plastics, including polypropylene, polystyrene, polyethylene terephthalate, and polyethylene—commonly used for food packaging and disposable containers—were found in the fluids. The daily average exposure through beverages was found to be 1.65 MPs/kg body weight per day.
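For readers wondering how such an exposure figure is assembled, it is essentially particle concentration times daily intake divided by body weight. The sketch below uses hypothetical example numbers, not the study's measurements.

```python
# Back-of-the-envelope sketch of how a daily exposure estimate is assembled
# from particle concentrations and intake volumes. The example concentration,
# intake and body weight below are placeholders, not the study's data.
def daily_exposure(mps_per_litre: float, litres_per_day: float,
                   body_weight_kg: float) -> float:
    """Microplastic particles ingested per kg body weight per day."""
    return mps_per_litre * litres_per_day / body_weight_kg

# Hypothetical example: 60 particles/L, 1.9 L of beverages daily, 70 kg adult.
print(f"{daily_exposure(60, 1.9, 70):.2f} MPs/kg bw/day")   # ~1.63
```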
The findings are published in Science of the Total Environment.
The same could well be true of beverages in other countries around the world.
Muneera Al-Mansoori et al, Synthetic microplastics in hot and cold beverages from the UK market: Comprehensive assessment of human exposure via total beverage intake, Science of The Total Environment (2025). DOI: 10.1016/j.scitotenv.2025.180188
Sep 25
Dr. Krishna Kumari Challa
Molecular discovery reveals how chromosomes are passed from one generation to the next
When a woman becomes pregnant, the outcome of that pregnancy depends on many things—including a crucial event that happened while she was still growing inside her own mother's womb. It depends on the quality of the egg cells that were already forming inside her fetal ovaries. The DNA-containing chromosomes in those cells must be cut, spliced and sorted perfectly. In males, the same process produces sperm in the testes but occurs only after puberty.
If that goes wrong, then you end up with the wrong number of chromosomes in the eggs or sperm. This can result in infertility, miscarriage or the birth of children with genetic diseases.
In a paper published Sept. 24 in the journal Nature, researchers report a major new discovery about a process that helps safeguard against these mistakes. They have pieced together the choreography of proteins that connect matching chromosome pairs—ensuring that they are sorted correctly as egg and sperm cells develop and divide.
These discoveries required methods to watch the molecular events of chromosome recombination unfold with unprecedented detail. This involved genetic engineering in budding yeast—a model organism that has been used for decades to discover how fundamental cellular processes work.
Humans have 46 chromosomes in each of our cells, made up of 23 pairs of matching, "homologous" chromosomes, with one of each pair inherited from each parent. Early in the process of making sperm or eggs, those chromosome pairs line up, and the parental chromosomes break and rejoin to each other. These chromosome exchanges, called "crossovers," serve two important functions.
First, they help ensure that each chromosome that is passed on to the offspring contains a unique mixture of genes from both parents. Crossovers also keep the chromosomes connected in matching pairs. These connections guide the distribution of chromosomes when cells divide to produce eggs and sperm. Maintaining crossover connections is especially crucial in females.
As chromosomes pair up in developing egg or sperm cells, matching DNA strands are exchanged and twined together over a short distance to form a structure called a "double Holliday junction." DNA strands of this structure are then cut to join the chromosomes forming a crossover.
In males, developing immature sperm cells then immediately divide and distribute chromosomes to the sperm. In contrast, egg cells developing in the fetal ovary arrest their development after crossovers have formed. The immature egg cells can remain in suspended animation for decades after birth, until they are activated to undergo ovulation.
Part 1
Sep 25
Dr. Krishna Kumari Challa
Only then does the process lurch back into motion: The egg cell finally divides, and the chromosome pairs that were connected by crossovers are finally separated to deliver a single set of chromosomes to the mature egg. Maintaining the crossover connections over many years is a major challenge for immature egg cells.
If chromosome pairs aren't connected by at least one crossover, they can lose contact with each other, like two people separated in a jostling crowd. This causes them to segregate incorrectly when the cell finally divides, producing egg cells with extra or missing chromosomes. This can cause infertility, miscarriage or genetic conditions such as Down syndrome, in which a child is born with an extra copy of chromosome 21, leading to cognitive impairment, heart defects, hearing loss and other problems.
Researchers have identified dozens of proteins that bind and process these junctions. They used a technique called "real-time genetics" to investigate the function of those proteins.
With this method, they made cells degrade one or more specific proteins within the junction-associated structures. They could then analyze the DNA from these cells, to see whether the junctions were resolved and if they formed crossovers. In this way, they built up a picture in which a network of proteins function together to ensure that crossovers are formed.
They identified key proteins, such as cohesin, that prevent an enzyme complex called STR (the Bloom complex in humans) from inappropriately dismantling the junctions before they can form crossovers.
They protect the double Holliday junction. That is a key discovery.
Failure to protect double-Holliday junctions may be linked to fertility problems in humans.
Shangming Tang et al, Protecting double Holliday junctions ensures crossing over during meiosis, Nature (2025). DOI: 10.1038/s41586-025-09555-1
Part 2
Sep 25
Dr. Krishna Kumari Challa
AI-generated voices now indistinguishable from real human voices
Many people still think of AI-generated speech as sounding "fake" or unconvincing and easily told apart from human voices. But new research shows that AI voice technology has now reached a stage where it can create "voice clones" or deepfakes which sound just as realistic as human recordings.
The study compared real human voices with two different types of synthetic voices, generated using state-of-the-art AI voice synthesis tools. Some were "cloned" from voice recordings of real humans, intended to mimic them, and others were generated from a large voice model and did not have a specific human counterpart.
Participants were asked to evaluate which voices sounded most realistic, and which sounded most dominant or trustworthy. Researchers also looked at whether AI-generated voices had become "hyperreal," given that some studies have shown that AI-generated images of faces are now judged to be human more often than images of real human faces.
While the study did not find a "hyperrealism effect" from the AI voices, it did find that voice clones can sound as real as human voices, making it difficult for listeners to distinguish between them. Both types of AI-generated voices were evaluated as more dominant than human voices, and some were also perceived as more trustworthy.
Voice clones sound realistic but not (yet) hyperrealistic, PLOS One (2025). DOI: 10.1371/journal.pone.0332692
Sep 25
Dr. Krishna Kumari Challa
A new look at how the brain works reveals that wiring isn't everything
How a brain's anatomical structure relates to its function is one of the most important questions in neuroscience. It explores how physical components, such as neurons and their connections, give rise to complex behaviors and thoughts. A recent study of the brain of the tiny worm C. elegans provides a surprising answer: Structure alone doesn't explain how the brain works.
C. elegans is often used in neuroscience research because, unlike the incredibly complex human brain with its billions of connections, the worm has a very simple nervous system of only 302 neurons. A complete, detailed wiring diagram (connectome) of every single one of its connections was completed several years ago, making it ideal for study.
In this research, scientists compared the worm's physical wiring to its signaling network—how signals actually travel from one neuron to another. First, they used an electron microscope to get a detailed map of the physical connections between its nerve cells. Then they activated individual neurons with light and used a technique called calcium imaging to observe which other neurons responded to this stimulation, building a map of signal flow.
Finally, they used computer programs to compare the physical wiring map and the signal flow map, identifying any differences and areas of overlap.
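One simple way to compare the two maps, sketched below, is to treat each as an adjacency matrix and measure how many connections they share (a Jaccard overlap). The random matrices and the 3% connection density are illustrative assumptions; only the 302-neuron size mirrors C. elegans.

```python
# Minimal sketch of comparing a structural (anatomical) connectivity matrix
# with a functional (signaling) one via edge overlap (Jaccard index).
# Random matrices stand in for real data; everything except the 302-neuron
# size is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(3)
n = 302

structural = rng.random((n, n)) < 0.03   # anatomical synapses / gap junctions
functional = rng.random((n, n)) < 0.03   # pairs responding to optogenetic stimulation
np.fill_diagonal(structural, False)
np.fill_diagonal(functional, False)

both = np.logical_and(structural, functional).sum()
either = np.logical_or(structural, functional).sum()
print(f"edge Jaccard overlap: {both / either:.3f}")   # near 0 for independent random nets
```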
The team discovered that the brain's functional organization differs from its anatomical structure. An analogy is that the brain's structure is like a city map showing every street. However, the function is more akin to traffic flow, with jams, detours and shortcuts that are not visible on the map. In other words, brain activity does not always follow the predictable pathways of its physical wiring.
Sophie Dvali et al, Diverging Network Architecture of the C. elegans Connectome and Signaling Network, PRX Life (2025). DOI: 10.1103/6wgv-b9m6
Sep 25
Dr. Krishna Kumari Challa
Lab-grown kidneys yield urine
Researchers have created the most sophisticated kidney organoid to date, offering a real shot at growing transplantable kidneys from stem cells. These mini kidneys were capable of making urine when transplanted into mice. “You wouldn’t mistake it for a real kidney,” says experimental anatomist Jamie Davies. “But it is trying to do the right things.” Plumbing — encouraging the organoid to develop blood vessels and the duct that carries urine to the bladder — is the major hold-up. The researchers estimate a transplantable kidney will be ready for animal testing in less than five years.
https://www.sciencedirect.com/science/article/abs/pii/S193459092500...
https://www.science.org/content/article/scientists-make-most-authen...
Sep 25
Dr. Krishna Kumari Challa
Scientists discover that cell nucleus is actually less dense than surrounding cytoplasm
Just as life pulsates in big vibrant cities, it also prospers in crowded environments inside cells. The interior of cells is densely packed with biomolecules like proteins and nucleic acids. How is all this material distributed within a cell and what regulates its distribution?
In a study published in Nature Communications, researchers measured subcellular densities across a wide range of organisms, from yeast to human cells, with the aim of better understanding biomolecular processes.
Conventional scientific textbooks describe the cell nucleus as a compartment packed with an impressive amount of DNA wrapped around histone proteins.
Now, an international team of researchers has discovered that—contrary to expectations—the nucleus is less dense than the surrounding cytoplasm.
Despite their rich biomolecular composition, nuclei contain less dry mass than the same volume of the surrounding cytoplasm.
How can density be measured in microscopic objects such as individual cell compartments? Scientists use light for this purpose. Not only does light allow cells to be examined, it also enables them to be manipulated. Light can exert forces, enabling laser beams to "pull" on cells and measure their mechanical properties using an "optical stretcher."
The researchers developed an optical setup which allowed them to obtain three-dimensional density distributions inside cells at high resolution by combining optical diffraction tomography and confocal fluorescence microscopy.
While nucleocytoplasmic (NC) density ratios are maintained from yeast to human cells, deviations begin to appear in disease. During stressed cellular states such as senescence—a form of cellular aging—cell nuclei become denser than the cytoplasm. The study thus points to the fundamental importance of density as a variable that determines healthy cellular processes, the researchers note in their paper.
Abin Biswas et al, Conserved nucleocytoplasmic density homeostasis drives cellular organization across eukaryotes, Nature Communications (2025). DOI: 10.1038/s41467-025-62605-0
Sep 26
Dr. Krishna Kumari Challa
Camouflage or caution? How anti-predator strategies have evolved
Predators and the environment determine why some animals use camouflage to avoid being eaten, while others use bright colors to warn them off, new research reveals. Published recently in the journal Science, the findings help explain the evolution and global distribution of the most common color strategies used by insects to avoid predators.
The global study took place across six continents and involved over 50 scientific collaborators.
Using the same experimental setup at every site, researchers deployed more than 15,000 artificial prey in three different colorations to investigate which strategy works best at deterring predators: a classic warning pattern of orange and black, a dull brown that blends in, and an unusual bright blue and black.
The researchers found that the answer to why some animals use camouflage rather than warning colors to deter predators turned out to be more complex than expected.
The findings showed there is no single best color strategy for deterring predators; context is critical. The characteristics of the predator and prey communities, as well as the habitat in that part of the globe, largely determine which strategy performs better in each place. This makes sense given that animals employ so many varying camouflage and warning color strategies as defenses all over the world.
Predators had the biggest influence on which color strategy was most successful for prey, the study revealed.
In environments where predators are competing intensely for food, they are more likely to risk attacking prey that might be dangerous or distasteful. Hence, the researchers saw that camouflage worked best in areas with lots of predation.
In contrast, in places where cryptic prey (insects that use camouflage) are abundant, hiding becomes less effective, as predators get better at searching for those types of animals.
The findings help scientists understand why some species, such as the cryptic bogong moth or the brightly colored harlequin bug, have evolved their strategies against predators.
Iliana Medina et al, Global selection on insect antipredator coloration, Science (2025). DOI: 10.1126/science.adr7368
Sep 26
Dr. Krishna Kumari Challa
Trees are dying at an alarming rate
The rise in tree mortality is troubling for local forest ecosystems. As a global phenomenon, however, it has a significant social impact that remains poorly understood.
We don't currently know whether climate change will lead to the death of 10% or 50% of all trees worldwide.
An international group of more than 100 forest researchers reviewed almost 500,000 forest monitoring studies from 89 countries across five continents. They found that the main cause of tree mortality is anthropogenic (human-induced) climate change and its consequences: heat, dry air and soil, forest fires, storms, and increased insect damage and plant disease.
In the article published in New Phytologist, the researchers aimed to identify methods, requirements and data gaps in monitoring tree mortality trends.
Towards a global understanding of tree mortality, New Phytologist (2025). DOI: 10.1111/nph.20407
Sep 26
Dr. Krishna Kumari Challa
Squirrels bite when they feel threatened or cornered, or when they are aggressively seeking food. They can also bite inadvertently, mistaking a finger for a treat or startling when a hand is presented to them. While a bite is usually a defensive action or a form of play, squirrels lack the bite inhibition of domesticated animals, and their bites, though not typically malicious, can be deep and pose a risk of infection.
Reasons for biting:
Self-defense: Like any wild animal, squirrels will bite to protect themselves if they feel endangered.
Aggression for food: Squirrels may become aggressive if they are accustomed to being fed by humans and approach to get a meal, according to Critter Control.
Accidental bites: Squirrels don't have the same depth perception as humans and can mistakenly bite a finger when trying to take a treat.
Nesting: A mother squirrel in a nesting area, such as an attic, may bite if she feels cornered or threatened.
Play behaviour: Squirrels also "play bite" to practice skills they will use as adults, similar to how siblings interact.
Risks of a squirrel bite:
Infection: Because squirrels are wild rodents, a bite can lead to an infection.
Diseases: Though rabies is rare in squirrels, they can carry other diseases, such as the plague, which is transmitted by fleas.
What to do if a squirrel bites:
Wash the wound: Immediately wash the wound thoroughly with soap and water to reduce the risk of infection.
Seek medical attention: Consult a healthcare professional to determine if a tetanus shot is needed and to monitor for any signs of infection.
Sep 26
Dr. Krishna Kumari Challa
Mucus contains molecules that block Salmonella infection, study reveals
Mucus is more than just a sticky substance: It contains a wealth of powerful molecules called mucins that help to tame microbes and prevent infection. In a new study, researchers have identified mucins that defend against Salmonella and other bacteria that cause diarrhea.
The researchers now hope to mimic this defense system to create synthetic mucins that could help prevent or treat illness in soldiers or other people at risk of exposure to Salmonella. It could also help prevent "traveler's diarrhea," a gastrointestinal infection caused by consuming contaminated food or water.
Mucins are bottlebrush-shaped polymers made of complex sugar molecules known as glycans, which are tethered to a peptide backbone. In this study, the researchers discovered that a mucin called MUC2 turns off genes that Salmonella uses to enter and infect host cells.
Mucus lines much of the body, providing a physical barrier to infection, but that's not all it does.
In earlier work, researchers identified mucins that can help to disarm Vibrio cholerae, as well as Pseudomonas aeruginosa, which can infect the lungs and other organs, and the yeast Candida albicans.
In the new study, the researchers found that when they exposed Salmonella to a mucin called MUC2, which is found in the intestines, the bacteria stopped producing the proteins encoded by SPI-1 (Salmonella pathogenicity island 1, a cluster of virulence genes), and they were no longer able to infect cells.
Further studies revealed that MUC2 achieves this by turning off a regulatory bacterial protein known as HilD. When this protein is blocked by mucins, it can no longer activate the genes of the type 3 secretion system (T3SS), the machinery Salmonella uses to enter host cells.
Using computational simulations, the researchers showed that certain monosaccharides found in glycans, including GlcNAc and GalNAc, can attach to a specific binding site of the HilD protein. However, their studies showed that these monosaccharides can't turn off HilD on their own—the shutoff only occurs when the glycans are tethered to the peptide backbone of the mucin.
The researchers also discovered that a similar mucin called MUC5AC, which is found in the stomach, can block HilD. And, both MUC2 and MUC5AC can turn off virulence genes in other foodborne pathogens that also use HilD as a gene regulator.
The researchers now plan to explore ways to use synthetic versions of these mucins to help boost the body's natural defenses and protect the GI tract from Salmonella and other infections.
Kelsey M. Wheeler et al, Mucus-derived glycans are inhibitory signals for Salmonella Typhimurium SPI-1-mediated invasion, Cell Reports (2025). DOI: 10.1016/j.celrep.2025.116304
Sep 26
Dr. Krishna Kumari Challa
Million-year-old skull could change human evolution timeline
A digital reconstruction of a million-year-old skull suggests humans may have diverged from our ancient ancestors 400,000 years earlier than thought, and in Asia rather than Africa, a study said this week.
The findings are based on a reconstruction of a crushed skull discovered in China in 1990, and have the potential to resolve the longstanding "Muddle in the Middle" of human evolution, researchers said.
But experts not involved in the work cautioned that the findings were likely to be disputed, and pointed to ongoing uncertainties in the timeline of human evolution.
The skull, labeled Yunxian 2, was previously thought to belong to a human forerunner called Homo erectus.
But modern reconstruction technologies revealed features closer to species previously thought to have existed only later in human evolution, including the recently discovered Homo longi and our own Homo sapiens.
It suggests that by one million years ago our ancestors had already diverged into distinct groups, pointing to a much earlier and more complex human evolutionary split than previously thought.
If the findings are correct, it suggests there could have been much earlier members of other early hominins, including Neanderthals and Homo sapiens, the study says.
It also "muddies the waters" on longstanding assumptions that early humans dispersed from Africa.
There's a big change potentially happening here, where east Asia is now playing a very key role in hominin evolution.
The research, published in the journal Science, used advanced CT scanning, structured-light imaging and virtual reconstruction techniques to model a complete Yunxian 2.
The scientists relied in part on another similar skull to shape their model, and then compared it to over 100 other specimens.
The resulting model "shows a distinctive combination of traits," the study said, some of them similar to Homo erectus, including a projecting lower face.
But other aspects, including its apparently larger brain capacity, are closer to Homo longi and Homo sapiens, the researchers said.
"Yunxian 2 may help us resolve what's been called the 'Muddle in the Middle,' the confusing array of human fossils from between 1 million and 300,000 years ago.
The findings are only the latest in a string of recent research that has complicated what we thought we knew about our origins.
Xiaobo Feng et al, The phylogenetic position of the Yunxian cranium elucidates the origin of Homo longi and the Denisovans, Science (2025). DOI: 10.1126/science.ado9202
on Saturday
Dr. Krishna Kumari Challa
Origins of the 'Ostrich Effect': Researchers pinpoint the age we start avoiding information—even when it's helpful
In a world of information overload, it can feel soothing to stick your head in the sand.
According to psychologists, avoiding information when it's uncomfortable is a common adult behavior, often referred to as the "Ostrich Effect."
But how do we become an ostrich? Children are notorious for seeking out information, often in the form of endless questions. So when do we sprout feathers and decide that, actually, the number of calories in a slice of cake is none of our business?
This behavioral origin point was exactly what researchers wanted to pin down.
In a study published in Psychological Science, a research team discovered that as children aged, the tendency to avoid information grew stronger.
Though 5- and 6-year-olds still actively sought information, 7- to 10-year-olds were much more likely to strategically avoid learning something if it elicited a negative emotion.
Why is it that children are these super curious people, but then we somehow end up as these information avoiders as adults?
In their initial experiment, the researchers looked at five reasons why we might willfully choose to remain ignorant:
To avoid negative emotions like anxiety or disappointment
To avoid negative information about our own likability or competence
To avoid challenges to our beliefs
To protect our preferences
To act in our own self-interest (perhaps while trying to appear not self-interested)
Part 1
on Saturday
Dr. Krishna Kumari Challa
Researchers then adapted these into five scenarios for children to see if they could elicit information avoidance. For example, each child was asked to imagine their favorite and least favorite candy. They were then asked if they wanted to watch a video about why eating that candy was bad for their teeth.
They found that, whereas younger children really wanted to seek information, older children started to exhibit these avoidance tendencies. For example, they didn't want to know why their favorite candy was bad for them, but they were totally fine learning why their least favorite candy was bad for them.
This finding held for all motivations except for competency. Children of all ages were not afraid to learn if they'd done badly on a test, for example.
To avoid avoidance, the researchers suggest thinking through why you might be avoiding something—possibly prioritizing short-term comfort over long-term benefits. They posit that it could help to reframe uncomfortable information as useful and valuable.
Research suggests that intervening while children are still young could keep them from falling into avoidance traps and have compounding benefits.
Humans have this propensity to want to resolve uncertainty, but when the resolution is threatening, people might flip to avoidance instead.
If all else fails, the researchers advise, mimic what children do best: follow your curiosity.
Radhika Santhanagopalan et al, Becoming an Ostrich: The Development of Information Avoidance, Psychological Science (2025). DOI: 10.1177/09567976251344551
Part 2
on Saturday
Dr. Krishna Kumari Challa
Bacterial endotoxins are high-potency, low-mass drivers of PM₂.₅ toxicity, sampling study reveals
Endotoxin, a toxic molecule from the cell walls of Gram-negative bacteria, makes up only 0.0001% of PM2.5 fine particles but packs a serious punch when it comes to its bioactivity.
According to the study, endotoxin drives 0.1–17% of the inflammatory responses triggered by these airborne particles, with its toxicity contribution being three to five orders of magnitude higher than its mass contribution.
Air pollution is now the world's leading environmental health threat, linked to more than three million premature deaths every year. One of the key culprits is PM2.5, which refers to airborne particles smaller than 2.5 micrometers, small enough to slip deep into the lungs and even seep into the bloodstream.
Scientists have long been focusing on PM2.5 because evidence consistently links it to respiratory illnesses, such as asthma, chronic obstructive pulmonary disease, and airway inflammation. Studies suggest that the damage caused by PM2.5 could be due to oxidative stress and the triggering of immune responses in the lungs following exposure.
PM2.5 is a complex atmospheric cocktail of natural and anthropogenic particles containing biological, inorganic, and organic constituents. For decades, researchers have extensively studied the impact of chemicals—including transition metals, polycyclic aromatic hydrocarbons, and industrial smoke—produced by human activities. These components, however, contribute to less than half of the respiratory damage inflicted by PM2.5, leaving roughly 60% of its impact still unexplained.
Researchers of this study conducted daily 24-hour PM2.5 sampling for a year across an urban and coastal area of Hong Kong. To assess inflammatory responses, the researchers exposed human bronchial epithelial cells to PM2.5 and measured the release of interleukin-8 (IL-8)—a small protein, called a cytokine, that is released by the immune system—as a marker of inflammation.
Part 1
on Tuesday
Dr. Krishna Kumari Challa
Endotoxin concentrations were measured using the Limulus Amebocyte Lysate (LAL) assay, then researchers used DNA sequencing and source tracking to identify the Gram-negative bacteria they came from. Finally, they applied mixture-toxicity modeling to estimate how much these endotoxins contributed to the overall harmful effects of PM2.5 exposure.
They found that despite making up only a minuscule fraction of the total PM2.5 mixture, endotoxin drove about 0.1 to 17% of the IL-8 release triggered by PM2.5.
Among all reported PM2.5 components, endotoxin demonstrated the highest toxicity-to-mass contribution ratio, 10,000:1 to 100,000:1, establishing its extreme biological potency. These findings show that less is indeed more.
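The ratio quoted above is simply the endotoxin's share of the measured IL-8 response divided by its share of PM2.5 mass, as the short sketch below shows. The per-sample response shares used here are illustrative placeholders chosen to reproduce the ends of the reported range.

```python
# Sketch of the "toxicity-to-mass contribution ratio": the endotoxin's share
# of the measured IL-8 response divided by its share of PM2.5 mass.
# The response shares below are illustrative; per-sample values in the study
# vary, which is why the reported ratio spans 10,000:1 to 100,000:1.
def toxicity_to_mass_ratio(response_share: float, mass_share: float) -> float:
    return response_share / mass_share

mass_share = 0.0001 / 100                  # 0.0001% of PM2.5 mass, as a fraction
for response_share in (0.01, 0.10):        # e.g. 1% and 10% of the IL-8 response
    print(f"{toxicity_to_mass_ratio(response_share, mass_share):,.0f} : 1")
# -> 10,000 : 1 and 100,000 : 1
```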
The researchers note that this study brings to light the importance of identifying highly toxic components present in low concentrations and tracing their sources. Pinpointing these toxicity drivers can help us design cost-effective strategies in which even modest reductions in PM2.5 mass could yield substantial decreases in overall toxicity.
Jinyan Yu et al, Disproportionately Higher Contribution of Endotoxin to PM2.5 Bioactivity than Its Mass Share Highlights the Need to Identify Low-Concentration, High-Potency Components, Environmental Science & Technology (2025). DOI: 10.1021/acs.est.5c07255
Part 2
on Tuesday
Dr. Krishna Kumari Challa
Mamba snake bites worsen after antivenom
Mamba (Dendroaspis species) snake bites are a significant threat in sub-Saharan Africa, where snakebites account for some 30,000 deaths annually.
A breakthrough study has discovered a hidden dangerous feature of the black mamba, one of the most venomous snakes in the world.
The study revealed the venoms of three species of mamba were far more neurologically complex than previously thought, explaining why antivenoms were sometimes ineffective. This research was published in Toxins.
The black mamba, western green mamba and Jameson's mamba aren't just using one form of chemical weapon; they're launching a coordinated attack at two different points in the nervous system.
If you're bitten by three of the four mamba species, you will experience flaccid (limp) paralysis caused by postsynaptic neurotoxicity.
Current antivenoms can treat the flaccid paralysis but this study found the venoms of these three species are then able to attack another part of the nervous system causing spastic paralysis by presynaptic toxicity.
Researchers previously thought the fourth species of mamba, the eastern green mamba, was the only one capable of causing spastic paralysis.
This finding resolves a long-standing clinical mystery of why some patients bitten by mambas seem to initially improve with antivenom and regain muscle tone and movement only to start having painful, uncontrolled spasms.
The venom first blocks nerve signals from reaching the muscles, but after the antivenom is administered, it then overstimulates the muscles.
It's like treating one disease and suddenly revealing another.
Researchers also found the venom function of the mambas was different depending on their geographic location, particularly within populations of the black mamba from Kenya and South Africa.
This further complicates treatment strategies across regions because the antivenoms are not developed to counteract the intricacies of the different venoms.
By identifying the limitations of current antivenoms and understanding the full range of venom activity, we can now directly inform evidence-based snakebite care.
Lee Jones et al, Neurotoxic Sleight of Fang: Differential Antivenom Efficacy Against Mamba (Dendroaspis spp.) Venom Spastic-Paralysis Presynaptic/Synaptic vs. Flaccid-Paralysis Postsynaptic Effects, Toxins (2025). DOI: 10.3390/toxins17100481
on Tuesday
Dr. Krishna Kumari Challa
Scientists read mice's 'thoughts' from their faces
It's easy to read emotions on people's faces—each one has its clear, unmistakable signature. But what about thoughts? A study published in Nature Neuroscience shows that mice's problem-solving strategies can be deciphered from subtle facial movements.
According to the authors, this is a proof of concept that the contents of the mind can be read out from video recordings, potentially offering powerful new research and diagnostic tools.
Scientists found that they could get as much information about what a mouse was 'thinking' from its facial movements as they could from recording the activity of dozens of neurons.
Having such easy access to the hidden contents of the mind could provide an important boost to brain research. However, it also highlights a need to start thinking about regulations to protect our mental privacy.
Facial expressions in mice reveal latent cognitive variables and their neural correlates, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02071-5.
yesterday
Dr. Krishna Kumari Challa
Stem cells from fat tissue may help prevent kidney dialysis access failure
To undergo kidney dialysis, doctors must first surgically create an access route—an arteriovenous fistula—usually in an arm, a conduit that will accommodate hemodialysis treatments. It is a routine outpatient procedure performed for years worldwide.
But it is a procedure beset by problems.
An arteriovenous fistula must first "mature," a process in which the newly established connection between an artery and a vein becomes large enough to support the turbulent flow of blood in hemodialysis. For many patients, this artificially created channel tends to narrow, leaving it useless as a conduit.
Researchers are investigating a possible way to prevent this problematic narrowing—a condition called stenosis—with a procedure that relies on stem cells.
The study, researchers asserted, is a crucial step toward improving a necessary treatment for patients with kidney failure by tapping into a population of cells that are essentially blank slates.
The investigation is reported in the journal Science Translational Medicine.
The phase 1 randomized trial involved patients undergoing an arteriovenous fistula (AVF) placement in an arm. Some of the patients in the small trial also received autologous adipose-derived mesenchymal stem cells. The cells were delivered at the time of the AVF procedure.
The stem cells were placed along the AVF starting at the distal artery, one centimeter upstream of the anastomosis [the surgical connection between adjacent blood vessels], and extending to the first four centimeters of the vein just distal to the anastomosis, by dripping them onto the adventitia of the vessels slowly over five minutes.
The adventitia is the outermost layer of a blood vessel.
Mesenchymal stem cells are a form of somatic, or adult, stem cells, which can be found in a variety of tissues throughout the body, including adipose (fat) tissue, an abundant source.
The stem cells are aimed at improving AVF function by preventing vascular narrowing. The cells were also a site-specific treatment for another problem tied to arteriovenous fistulas: inflammation, a hallmark of AVFs. Fortunately, anti-inflammatory activity is a function of mesenchymal stem cells.
Side-by-side images in the study show the vascular opening to be wide and capable of handling the turbulence of hemodialysis among patients who received mesenchymal stem cells. Patients who did not receive the stem cell treatment suffered vascular narrowing.
The research team sees promise in their unique approach, which is producing positive results at a critical time.
The team's phase 1 clinical trial involved 21 patients who received arteriovenous fistulas in the arm; 11 of the 21 patients also received mesenchymal stem cells derived from their own fat tissue. After 42 months, fistulas had matured faster in patients who received stem cells. Additional study and approval by the U.S. Food and Drug Administration are required before the treatment can become available.
Sreenivasulu Kilari, et al Periadventitial delivery of mesenchymal stem cells improves vascular remodeling and maturation in arteriovenous fistulas, Science Translational Medicine (2025). DOI: 10.1126/scitranslmed.adp7723
yesterday
Dr. Krishna Kumari Challa
Scientists finally prove that a quantum computer can unconditionally outperform classical computers
A quantum computer has demonstrated that it can solve a problem more efficiently than a conventional computer. This achievement comes from being able to unlock a vast memory resource that classical computing cannot match.
Instead of using classical bits that can only be 0 or 1, quantum machines use qubits, which can exist in multiple states and store exponentially more information than their traditional counterparts. However, proving that a quantum computer can access this memory advantage in the real world has been a challenge for two main reasons.
First, any successful demonstration has to be feasible on realistic quantum hardware, and second, there must be unconditional mathematical proof that no future classical algorithm could achieve the same performance.
In a study published on the arXiv preprint server, a research team reports how they achieved this feat of quantum supremacy.
They constructed a complicated mathematical task designed to test this memory advantage. Their experiment was like a game between two parts of the quantum system referred to as Alice and Bob. Alice's task was to create a quantum state and send it in a message to Bob, who had to measure it to figure out what it was. The goal was to build a process so accurate that Bob could predict the state before Alice finished preparing the message.
The researchers optimized this process over 10,000 independent trials, and their analysis revealed that a classical computer would need at least 62 bits of memory to complete the task with the same success rate. The quantum device performed it using only 12 qubits.
"The result provides the most direct evidence yet that currently existing quantum processors can generate and manipulate entangled states of sufficient complexity to access the exponentiality of Hilbert space (the vast memory resource of a quantum computer)," wrote the researchers in their paper.
This form of quantum advantage—which we call quantum information supremacy—represents a new benchmark in quantum computing, one that does not rely on unproven conjectures.
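To make the "exponentiality of Hilbert space" concrete: describing a 12-qubit pure state takes 2^12 = 4,096 complex amplitudes, whereas the classical memory bound quoted above is just 62 bits. The sketch below only illustrates that size gap; it is not the paper's Alice-and-Bob communication task.

```python
# Size illustration only: a 12-qubit pure state lives in a 2**12-dimensional
# Hilbert space (4,096 complex amplitudes), while the classical memory bound
# quoted above is 62 bits. This is not the paper's actual communication task,
# just a way to picture the gap in representational size.
import numpy as np

n_qubits = 12
dim = 2 ** n_qubits
rng = np.random.default_rng(4)

# A random normalized 12-qubit state vector.
state = rng.normal(size=dim) + 1j * rng.normal(size=dim)
state /= np.linalg.norm(state)

print(f"Hilbert-space dimension: {dim}")        # 4096
print("classical bound reported in the study: 62 bits")
```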
William Kretschmer et al, Demonstrating an unconditional separation between quantum and classical information resources, arXiv (2025). DOI: 10.48550/arxiv.2509.07255
yesterday
Dr. Krishna Kumari Challa
The Red Sea went completely dry before being flooded by the Indian Ocean over 6 million years ago
Scientists have provided conclusive evidence that the Red Sea completely dried out about 6.2 million years ago, before being suddenly refilled by a catastrophic flood from the Indian Ocean. The findings put a definitive time on a dramatic event that changed the Red Sea.
Using seismic imaging, microfossil evidence, and geochemical dating techniques, the researchers showed that a massive change happened in about 100,000 years—a blink of an eye for a major geological event. The Red Sea went from connecting with the Mediterranean Sea to an empty, salt-filled basin. Then, a massive flood burst through volcanic barriers to open the Bab el-Mandab strait and reconnect the Red Sea with the world's oceans.
The findings show that the Red Sea basin records one of the most extreme environmental events on Earth, when it dried out completely and was then suddenly reflooded about 6.2 million years ago.
The Red Sea was initially connected from the north to the Mediterranean through a shallow sill. This connection was severed, drying the Red Sea into a barren salt desert. In the south of the Red Sea, near the Hanish Islands, a volcanic ridge separated the sea from the Indian Ocean.
But around 6.2 million years ago, seawater from the Indian Ocean surged across this barrier in a catastrophic flood. The torrent carved a 320-kilometer-long submarine canyon that is still visible today on the seafloor. The flood rapidly refilled the basin, drowning the salt flats and restoring normal marine conditions in less than 100,000 years. This event happened nearly a million years before the Mediterranean was refilled by the famous Zanclean flood, giving the Red Sea a unique story of rebirth.
Tihana Pensa et al, Desiccation of the Red Sea basin at the start of the Messinian salinity crisis was followed by major erosion and reflooding from the Indian Ocean, Communications Earth & Environment (2025). DOI: 10.1038/s43247-025-02642-1
yesterday
Dr. Krishna Kumari Challa
Forensic test recovers fingerprints from fired ammunition casings despite intense heat
A pioneering new test that can recover fingerprints from ammunition casings, once thought nearly impossible, has been developed by scientists.
The researchers have developed a unique electrochemical method which can visualize fingerprints on brass casings, even after they have been exposed to the high temperature conditions experienced during gunfire. The study is published in the journal Forensic Chemistry.
For decades, investigators have struggled to recover fingerprints from weapons because any biological trace is usually destroyed by the high temperatures, friction and gas released after a gun is fired. As a result, criminals often abandon their weapons or casings at crime scenes, confident that they leave no fingerprint evidence behind.
Ordinarily, the intense heat of firing destroys any biological residue, yet the new technique has been able to reveal fingerprint ridges that would otherwise remain imperceptible.
The team found they could coat brass casings with a thin layer of specialized materials to make hidden fingerprint ridges visible. Unlike existing methods that need dangerous chemicals or high-powered equipment, the new process uses readily available non-toxic polymers and minimal amounts of energy to quickly reveal prints from seemingly blank surfaces.
It works by placing the brass casing of interest in an electrochemical cell containing specific chemical substances. When a small voltage is applied, chemicals in the solution are attracted to the surface, coating the spaces between fingerprint ridges and creating a clear, high contrast image of the print. The fingerprint appears within seconds as if by magic!
Tests showed that this technique also worked on samples aged up to 16 months, demonstrating remarkable durability.
The research has significant implications for criminal investigations, where the current assumption is that firing a gun eliminates fingerprint residues on casings.
Colm McKeever et al, Electrodeposition of redox materials with potential for enhanced visualisation of latent finger-marks on brass substrates and ammunition casings., Forensic Chemistry (2025). DOI: 10.1016/j.forc.2025.100663
Population bottlenecks cause decline of mammals' immunity, researchers find
Population bottlenecks, sharp population losses due to illness or habitat destruction, have caused mammals' disease immunity to decline, according to a new study led by computational biologists.
The finding comes from the first comparative study of genomic sequences—roadmaps of DNA instructions responsible for encoding how the body works—encoding immunity in 46 mammals.
The study, published in Molecular Biology and Evolution, is a first step toward analyzing regions of mammalian DNA that were previously inaccessible without modern biotechnology and computational tools.
Genes influence how our bodies work: humans and animals are predisposed to certain diseases based on their DNA. Although the same basic building blocks make up DNA across the 46 mammals assessed, their genomic sequences diverged widely. So, even though species may share a similar set of genes, those genes differ because of variations in DNA architecture.
In the immune system, things are complicated further by something known as adaptive immunity. As opposed to the non-discriminatory defense the immune system deploys at the first hint of an infection, adaptive immunity refers to the parts of the immune system that study the specifics of a pathogen and design antibodies precisely targeted for it, should it invade again.
Antibodies are produced from highly variable "template" genes encoded in the genome, and this variability enables versatile immune responses through the generation of antibodies against diverse targets.
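To make the "variable template genes" idea concrete, here is a minimal Python sketch of how combinatorial assembly of V, D and J gene segments, the standard textbook mechanism behind antibody variable regions rather than a result specific to this study, yields enormous antibody diversity; the segment counts below are rough human figures used purely for illustration.

```python
# Illustrative sketch (segment counts are approximate human figures, not from the study):
# antibody heavy-chain variable regions are assembled combinatorially from V, D and J
# gene segments, one source of the "highly variable template genes" described above.
v_segments = 40   # approximate number of functional heavy-chain V segments
d_segments = 25   # approximate number of functional heavy-chain D segments
j_segments = 6    # approximate number of functional heavy-chain J segments

heavy_chain_combinations = v_segments * d_segments * j_segments   # 6,000
light_chain_combinations = 40 * 5                                 # rough kappa V x J count

print(f"Heavy-chain V(D)J combinations: {heavy_chain_combinations:,}")
print(f"Heavy/light chain pairings: {heavy_chain_combinations * light_chain_combinations:,}")
```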
The question is, how did this adaptive immunity evolve?
To answer this question, researchers analyzed five types of gene clusters that control various aspects of immune system production—specifically the building of antibodies and of the receptors on another immune cell type known as the T-cell—across 46 mammals to better understand how genetic variation could affect immune function.
Researchers scanned, aligned and compared publicly available DNA sequences of 46 mammals of 13 taxonomic orders, such as primates, rodents, bats, carnivores and marsupials, to draw conclusions about how their immune systems evolved.
Researchers found that a decline in adaptive immunity, and possible vulnerability to certain diseases among mammals as a result, was likely caused by genetic bottlenecks: a stark decrease in population over certain periods in history due to factors such as habitat loss or disease.
Bottlenecks occurred, for example, during medieval times when humans were devastated by diseases such as the Black Death, or when animals suffered widespread habitat loss due to forest fires.
Species with such past population bottlenecks include felines, aquatic mammals, seals, some primates and ruminants, mammals that evolved a specialized stomach for digesting tough plants.
Genetic bottlenecks leave these animals with limited gene pool diversity, the researchers explained, which likely led to the decline in adaptive immunity.
Mariia Pospelova et al, Comparative Analysis of Mammalian Adaptive Immune Loci Revealed Spectacular Divergence and Common Genetic Patterns, Molecular Biology and Evolution (2025). DOI: 10.1093/molbev/msaf152
Dads influence embryo growth via molecular signatures, research reveals
Over the past few decades, growing evidence has challenged the belief that inheritance is governed solely by DNA sequences. Scientists now recognize the crucial role of epigenetic inheritance—the transmission of biological traits via chemical modifications to DNA and its associated proteins. These modifications do not alter the genetic code itself but influence how genes are switched on or off, often in response to environmental factors such as stress, diet, or drug exposure.
While the concept of maternal epigenetic inheritance is relatively intuitive—given the direct biological connection between mother and embryo during gestation—recent research shows that fathers, too, can transmit environmentally induced epigenetic changes to their offspring. However, the prevalence of epigenetic inheritance—and the mechanisms behind it—remains unclear.
In earlier work, one research group demonstrated that disrupting the gut microbiome of male mice increases disease risk in their future offspring, while another focused on the mechanisms that regulate embryonic development in response to changes in paternal diet.
A collaborative study between the groups, now published in The EMBO Journal, examined how specific paternal environments affect early embryonic development in a systematic manner and under tightly controlled genetic and environmental conditions in mice.
To induce environmental perturbations, prospective fathers were exposed to either non-absorbable antibiotics (disrupting the gut microbiota) or to a low-protein, high-sugar diet. To minimize experimental variability, the analyses were performed on embryos resulting from in vitro fertilization (IVF). Embryos were collected approximately four days after fertilization (blastocyst stage) and individually analyzed to measure differences in gene expression compared to controls (blastocysts that resulted from fathers without any treatment).
The results were striking. Both environmental perturbations led to significant changes in embryonic gene expression. Disruption of the paternal gut microbiota reduced the expression of key genes involved in extra-embryonic tissue development, while changes in the diet were linked with a modest developmental delay.
To further investigate the influence of the genetic background, scientists repeated the experiments using a different mouse strain. The outcome differed, suggesting the importance of the genetic component in shaping how environmental exposures affect offspring.
Additionally, embryos derived from older fathers showed a stronger effect on gene expression, especially on genes involved in immune-related processes, indicating that paternal age is another important factor involved in epigenetic inheritance.
Mathilde Dura et al, Embryonic signatures of intergenerational epigenetic inheritance across paternal environments and genetic backgrounds, The EMBO Journal (2025). DOI: 10.1038/s44318-025-00556-4
Microlightning causes eerie lights of lore
Spontaneous flashes of ‘microlightning’ between bubbles of gas could explain will-o’-the-wisps — flickering lights that can appear on marshlands. Researchers blew tiny bubbles of methane and air into water, where smaller bubbles took on a negative charge and larger ones, a positive charge. As the charges equalized, they produced a small zap of electricity and a flash of light. This could explain why the ghostly-looking lights appear over methane-rich bogs.
https://www.pnas.org/doi/10.1073/pnas.2521255122
Microplastics reduce soil fertility and boost production of a potent greenhouse gas, study shows
More than 90% of plastic waste ends up in the soil, where it breaks down into microplastics that are invisible to the naked eye. Microplastic pollution of the soil poses a severe threat to soil health as it can harm essential microbial communities and reduce crop yields. The presence of these tiny plastics may also worsen climate change by boosting the production of greenhouse gases, according to a new study published in Environmental Science & Technology.
Most previous research focused on one plastic at a time and its effect on soil function and nutrient cycling, but microplastics do not tend to occur in isolation.
In the present study, the researchers therefore examined the combined effect of various types of plastics on soil and on key functions such as the nitrogen cycle.
To quantify the problem, the team ran a microcosm experiment in the lab, using soil samples mixed with six different types of plastic, including polyethylene terephthalate (PET) and polyvinyl chloride (PVC). They created four distinct groups with varying levels of plastic, from zero plastics (the control group) to five different types of plastic. After 40 days of incubation, they collected the soil and ran several tests. These included measuring soil properties, such as acidity and key enzyme activities, as well as DNA sequencing to identify bacteria and their associated functional genes.
The team's analysis revealed that increasing microplastic diversity leads to significant shifts in soil health. For example, the plastic mixture considerably raised soil pH (making the soil more alkaline) and increased soil carbon content.
However, one of the most important findings was that microplastic diversity boosted the activity of bacterial genes responsible for denitrification, the process by which bacteria convert nitrate, a key plant nutrient, into nitrogen gas that is then released into the atmosphere. This not only makes the soil less fertile, but also releases nitrous oxide, a greenhouse gas that is around 300 times more potent at warming the planet than carbon dioxide. The primary cause of this accelerated nitrogen loss was a family of bacteria known as Rhodocyclaceae.
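For context, and as standard textbook chemistry rather than a finding of this study, denitrification reduces nitrate stepwise, with nitrous oxide escaping when the final reduction step is incomplete:

NO3⁻ → NO2⁻ → NO → N2O → N2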
Tian-Gui Cai et al, Microplastic Diversity as a Potential Driver of Soil Denitrification Shifts, Environmental Science & Technology (2025). DOI: 10.1021/acs.est.5c04981
Parkinson's 'trigger' directly observed in human brain tissue for the first time
Scientists have, for the first time, directly visualized and quantified the protein clusters believed to trigger Parkinson's, marking a major advance in the study of the world's fastest-growing neurological disease.
These tiny clusters, called alpha-synuclein oligomers, have long been considered the likely culprits for Parkinson's disease to start developing in the brain, but until now, they have evaded direct detection in human brain tissue.
Now, researchers have developed an imaging technique that allows them to see, count and compare oligomers in human brain tissue, a development one of the team says is "like being able to see stars in broad daylight."
Their results, reported in the journal Nature Biomedical Engineering, could help unravel the mechanics of how Parkinson's spreads through the brain and support the development of diagnostics and potential treatments.
The team examined post-mortem brain tissue samples from people with Parkinson's and compared them with samples from healthy individuals of similar age. They found that oligomers exist in both healthy and Parkinson's brains; the key difference was that oligomers in diseased tissue were larger, brighter and more numerous, suggesting a direct link to the progression of Parkinson's.
The team also discovered a sub-class of oligomers that appeared only in Parkinson's patients, which could be the earliest visible markers of the disease—potentially years before symptoms appear.
Rebecca Andrews et al, Large-scale visualisation of α-synuclein oligomers in Parkinson's disease brain tissue, Nature Biomedical Engineering (2025). DOI: 10.1038/s41551-025-01496-4
Tracing the evolutionary roots of why women live longer than men
Around the world, women on average live longer than men. This striking pattern holds true across nearly all countries and historical time periods. Although the gap between the sexes has narrowed in some countries due to medical advances and improved living conditions, new research now provides clues as to why this difference is unlikely to disappear anytime soon. The causes are deeply rooted in evolutionary history and can be observed in many animal species.
An international team of scientists conducted the most comprehensive analysis of sex differences in lifespan across mammals and birds to date. Their findings, published in Science Advances, provide novel insight into one of biology's long-standing puzzles: why males and females age differently.
Among mammals, females usually live longer—for instance, in baboons and gorillas, females often outlive males. Yet this pattern is not universal: In many birds, insects, and reptiles, males are the longer-lived sex. One genetic explanation, the heterogametic sex hypothesis, points to differences in sex chromosomes.
In mammals, females have two X chromosomes, while males have only one X and one Y (making them the heterogametic sex). Some research suggests that having two X chromosomes may protect females from harmful mutations, offering a survival advantage. In birds, however, the system is reversed: females are the heterogametic sex.
Using records from 1,176 bird and mammal species in zoos worldwide, the researchers found a striking contrast in lifespan that supports the heterogametic sex hypothesis: in most mammal species (72 percent), females lived longer, by twelve percent on average, while in most bird species (68 percent), males lived longer, by an average of five percent.
Still, there was remarkable variation with many exceptions. Some species showed the opposite of the expected pattern. For example, in many birds of prey, females are both larger and longer-lived than males. So sex chromosomes can only be part of the story.
Sexual selection and parental care shape lifespan differences
In addition to genetics, reproductive strategies also play a role. Through sexual selection, males in particular develop conspicuous characteristics such as colorful plumage, weapons, or large body size, which increase reproductive success but can shorten lifespan. The new study supports this assumption: In polygamous mammals with strong competition, males generally die earlier than females.
Many birds, on the other hand, are monogamous, which means that competitive pressure is lower and males often live longer. Overall, the differences were smallest in monogamous species, while polygamy and pronounced size differences were associated with a more pronounced advantage for females.
Parental care also plays a role. The researchers found evidence that the sex that invests more in raising offspring—in mammals, this is often the females—tends to live longer. In long-lived species such as primates, this is likely to be a selective advantage: females survive until their offspring are independent or sexually mature.
Zoo life reduces—but does not erase—lifespan gaps
A long-standing idea is that environmental pressures—such as predation, pathogens, or harsh climates—drive the observed gaps between males and females. To test this, the researchers turned to zoo populations, where such pressures are largely absent.
They found that lifespan gaps persisted even under these protected conditions. Comparing zoo and wild populations showed that the gaps were often smaller in zoos but rarely disappeared—mirroring the human case, where advances in medicine and living conditions have narrowed but not eliminated the lifespan gap.
The findings suggest that sex differences in lifespan are deeply rooted in evolutionary processes, shaped by sexual selection and parental investment, and that genetic differences in the sex-determination system may also play a role. Environmental factors influence the extent of the differences but cannot eliminate them. The differences between the sexes are therefore not only a product of the environment but part of our evolutionary history, and they will most likely continue to exist in the future.
Sexual selection drives sex difference in adult life expectancy across mammals and birds, Science Advances (2025). DOI: 10.1126/sciadv.ady8433
Rare fossil reveals ancient leeches weren't bloodsuckers
A newly described fossil reveals that leeches are at least 200 million years older than scientists previously thought, and that their earliest ancestors may have feasted not on blood, but on smaller marine creatures.
Roughly 430 million years old, the fossil includes a large tail sucker—a feature still found in modern leeches—along with a segmented, teardrop-shaped body. But one important feature isn't found in this fossil: the forward sucker that many of today's leeches use to pierce skin and draw blood.
This absence, along with the fossil's marine origin, suggests a very different early lifestyle for the group known as Hirudinida. Rather than sucking blood from mammals, reptiles, and other vertebrates, the earliest leeches may have roamed the oceans, consuming soft-bodied invertebrates whole or feeding on their internal fluids.
Blood feeding takes a lot of specialized machinery. Anticoagulants, mouthparts, and digestive enzymes are complex adaptations. It makes more sense that early leeches were swallowing prey whole or maybe drinking the internal fluids of small, soft-bodied marine animals.
Previously, scientists thought leeches emerged about 150–200 million years ago. That timeline has now been pushed back by at least 200 million years, thanks to the fossil found in the Waukesha biota, a geological formation in Wisconsin known for preserving soft-bodied animals that usually decay before fossilization.
Preserving a leech fossil is no small feat. Leeches lack the bones, shells and exoskeletons that are most easily preserved over millions of years. Fossils like this require exceptional circumstances to form, often involving near-immediate burial, a low-oxygen environment and unusual geochemical conditions.
A rare animal and just the right environment to fossilize it—it's like hitting the lottery twice.
de Carle D, et al. The first leech body fossil predates estimated hirudinidan origins by 200 million years, PeerJ (2025). doi.org/10.7717/peerj.19962