Scientists have discovered how a diarrhea-causing strain of bacteria uses "molecular scissors" to cut open and destroy gut cells, leading to severe illness and sometimes death.
Published in Gut Microbes, the research reveals for the first time the three-dimensional structure of a toxin secreted by enteropathogenic E. coli (EPEC) bacteria, and shows how the bacteria use the toxin to invade and destroy the epithelial cells that line the gut.
The toxin, which is an enzyme called EspC, destroys the cells by cutting up their internal protein structure.
There are more than five types of E. coli that damage epithelial cells in different ways to cause gut infection.
These include STEC, which was responsible for the recent salad spinach recall and uses Shiga toxin to invade gut cells, and EPEC—the subject of this study—which uses the toxin EspC and is the leading cause of diarrhea in children and babies worldwide.
Currently, infections caused by the many diverse strains of E. coli are typically treated with broad-spectrum antibiotics. However, these drugs kill both harmful and beneficial gut bacteria, and E. coli's rapid adaptation ability means these pathogens are becoming resistant to many antibiotics.
Akila U. Pilapitiya et al, The crystal structure of the toxin EspC from enteropathogenic Escherichia coli reveals the mechanism that governs host cell entry and cytotoxicity, Gut Microbes (2025). DOI: 10.1080/19490976.2025.2483777
Study exposes huge levels of untargeted antibiotic prescribing
Doctors are prescribing antibiotics for tens of thousands of patients with infections, with little or no consideration of prognosis and the risk of the infection worsening, according to a new study by epidemiologists.
The study of 15.7 million patient records, published in the Journal of the Royal Society of Medicine on April 4, 2025, suggests there could be scope to prescribe far fewer antibiotics. The researchers found the probability of being prescribed antibiotics for a lower respiratory tract or urinary tract infection was unrelated to hospital admission risk.
And the probability of being prescribed an antibiotic for an upper respiratory tract infection was only weakly related to hospital admission risk.
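As a rough illustration of what a "risk-based prescribing" analysis looks like, the sketch below fits a logistic regression of antibiotic prescription on predicted hospital-admission risk using synthetic data. It is not the authors' model or data; it simply shows how a slope near zero corresponds to prescribing that is unrelated to complication risk.

```python
# Illustrative only: synthetic data standing in for primary-care records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 10_000
admission_risk = rng.beta(2, 50, size=n)       # predicted risk of hospital admission
# Prescribing probability set independently of risk (the pattern reported for LRTI/UTI).
prescribed = rng.random(n) < 0.55

X = sm.add_constant(admission_risk)
model = sm.Logit(prescribed.astype(int), X).fit(disp=False)
print(model.params)      # slope near 0 -> prescribing unrelated to admission risk
print(model.conf_int())
```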
The study also showed that patient characteristics such as age and the presence of other health problems were only weakly associated with the probability of being prescribed an antibiotic for treatment of a common infection.
The oldest patients in the sample were 31% less likely than the youngest to receive an antibiotic for upper respiratory infections.
That inevitably means, say the researchers, that many younger people are being prescribed antibiotics, even though they are often fit enough to recover without them, potentially leading to resistance.
Conversely, many older people—who may not be able to deal with infections without antibiotics—are not receiving them, with the potential of complications and hospital admissions.
Patients with combinations of diseases were 7% less likely than people without major health problems to receive an antibiotic for upper respiratory infections. Antibiotics are effective in treating bacterial infections, but they carry the risks of antimicrobial resistance (AMR) and loss of effectiveness when used inappropriately.
That is why AMR to antibiotics has been recognized as one of the biggest threats to global public health.
"Given the threat of resistance, there is a need to better target antibiotics in primary care to patients with higher risks of infection-related complications such as sepsis.
But this study finds that antibiotics for common infections are often not prescribed according to complication risk, which suggests there is plenty of scope to reduce antibiotic prescribing. The study also showed that during the pandemic the probability of being prescribed an antibiotic for lower respiratory infections became even less related to complication risk, while there were only minor changes for urinary tract infections. Rather than imposing targets for reducing inappropriate prescribing, we argue that it is far more viable for clinicians to focus on improving risk-based antibiotic prescribing for infections that are less severe and typically self-limiting."

Prognosis and harm should be explicitly considered in treatment guidelines, alongside better personalized information for clinicians and patients to support shared decision making, say the researchers.
Ali Fahmi et al, Antibiotics for Common Infections in Primary Care Before, During and after the COVID-19 Pandemic and Extent of Risk-Based Prescribing: Need for Personalised Guidelines, Journal of the Royal Society of Medicine (2025).
One-year-old infants already display compositional abilities, study finds
To understand complex objects, humans are known to mentally transform them and represent them as a combination of simpler elements. This ability, known as compositionality, was so far assumed to require fluency in language, thus emerging in childhood after humans have learned to speak and understand others.
Researchers recently explored the possibility that compositionality is based on simple processes and might therefore already be present in infants. Their paper, published in Communications Psychology, provides evidence that infants as young as one year old already possess basic compositional abilities.
One of the central properties of language is compositionality, which is a long word that simply means the capacity to put words together to understand sentences.
In contrast to older children and adults, infants cannot yet express their thought processes, thus assessing their abilities in experimental settings typically entails observing their behavior. To evaluate their compositional abilities, Dautriche and her colleague, Emmanuel Chemla, carried out three experiments using a well-established experimental paradigm for research with infants, which involves measuring the time they spend gazing at specific objects.
In their three experiments, researchers gathered evidence that infants can correctly compose simple noun-verb sentences at approximately 14 months of age, can understand compositional facial expressions (i.e., a negation expressed through a facial expression) at one year of age, and can make basic mental physical transformations when they are 10 months old. Collectively, their findings suggest that compositionality can emerge before humans learn to speak, indicating it could be based on simpler processes than previously anticipated.
The researchers propose that compositionality is a capacity that humans have before learning language, which brings us a little bit closer to better understanding how our minds work and how language evolved. It could be, for instance, that the minds of other species are also compositional and that language might just have built on this feature of the mind.
Isabelle Dautriche et al, Evidence for compositional abilities in one-year-old infants, Communications Psychology (2025). DOI: 10.1038/s44271-025-00222-9.
Clinical research reports no significant long-term benefit of cocoa flavanol supplementation in preventing age-related macular degeneration (AMD). The paper is published in the journal JAMA Ophthalmology.
AMD is a progressive retinal disease and the most common cause of severe vision loss in adults over age 50. AMD damages the macula, the central part of the retina responsible for sharp, detailed vision. While peripheral sight is typically preserved, central vision loss can impair reading, driving, facial recognition, and other quality of life tasks. Abnormalities of blood flow in the eye are associated with the occurrence of AMD.
Cocoa flavanols are a group of naturally occurring plant compounds classified as flavonoids, found primarily in the cocoa bean. These bioactive compounds have been studied for their vascular effects, including improved endothelial function and enhanced nitric oxide production, which contribute to vasodilation and circulatory health. Previous trials have shown that moderate intake of cocoa flavanols may lower blood pressure, improve lipid profiles, and reduce markers of inflammation, suggesting a role in mitigating cardiovascular and related vascular conditions.
In the study titled "Cocoa Flavanol Supplementation and Risk of Age-Related Macular Degeneration: An Ancillary Study of the COSMOS Randomized Clinical Trial," researchers conducted a double-blind, placebo-controlled randomized clinical trial to examine whether daily supplementation with cocoa extract prevents the development or progression of AMD.
A cohort of 21,442 U.S. adults (12,666 women aged 65 and older, and 8,776 men aged 60 and older) was recruited, with eligibility criteria requiring discontinuation of non-trial cocoa supplements and multivitamins for the 3.6-year duration of the trial. COSMOS included its own multivitamin supplement as one arm of the trial.
Daily supplementation consisted of 500 mg cocoa flavanols containing 80 mg (−)-epicatechin. Randomization assigned participants to either the cocoa extract or a matching placebo group. AMD outcomes were identified through self-reported diagnoses, verified through medical record confirmation. Compliance biomarkers confirmed a threefold increase in flavanol metabolite levels in the cocoa group.
A total of 344 participants experienced a confirmed AMD event, including 316 incident cases and 28 cases of progression. Incidence was 1.5% in the cocoa extract group and 1.7% in the placebo group.
Statistical modeling found a non-significant reduced risk during the first two years of treatment (HR, 0.77; 95% CI, 0.59–1.01), and a non-significant effect beyond two years (HR, 1.06; 95% CI, 0.76–1.50). Similar trends were observed in secondary outcomes.
In subgroup analyses, treatment effect varied by hypertension status, with a significant reduced risk in participants without hypertension (HR, 0.63; 95% CI, 0.44–0.92) and no significant findings among those with hypertension (HR, 1.04; 95% CI, 0.80–1.36).
Researchers concluded that cocoa extract supplementation had no significant overall effect on AMD risk over a median 3.6-year period. Although researchers did not rule out a possible early benefit trend, the findings do not support cocoa flavanol supplementation as a preventive strategy for AMD.
However, the researchers acknowledge that the study was limited by its sample.
William G. Christen et al, Cocoa Flavanol Supplementation and Risk of Age-Related Macular Degeneration, JAMA Ophthalmology (2025). DOI: 10.1001/jamaophthalmol.2025.0353
Female hormones can stimulate immune cells to make opioids that naturally suppress pain
Scientists have discovered a new mechanism that acts via an immune cell and points toward a different way of treating chronic pain. Female hormones can suppress pain by making immune cells near the spinal cord produce opioids, a new study by researchers has found. This stops pain signals before they get to the brain.
The discovery could help with developing new treatments for chronic pain. It may also explain why some painkillers work better for women than men and why postmenopausal women experience more pain.
The work reveals an entirely new role for T-regulatory immune cells (T-regs), which are known for their ability to reduce inflammation.
The researchers looked at T-regs in the protective layers that encase the brain and spinal cord in mice. Until now, scientists thought these tissues, called the meninges, only served to protect the central nervous system and eliminate waste. T-regs were only discovered there in recent years.
This new research shows that the immune system actually uses the meninges to communicate with distant neurons that detect sensation on the skin.
Further experiments revealed a relationship between T-regs and female hormones that no one had seen before: Estrogen and progesterone were prompting the cells to churn out painkilling enkephalin.
This work could be particularly helpful for women who have gone through menopause and no longer produce estrogen and progesterone, many of whom experience chronic pain.
Molecular clock analysis shows bacteria used oxygen long before widespread photosynthesis
Microbial organisms dominate life on Earth, but tracing their early history and evolution is difficult because they rarely fossilize. Determining when exactly a particular group of microbes first appeared is especially hard. However, ancient sediments and rocks hold chemical clues of available nutrients that could support the growth of bacteria.
A key turning point was when oxygen accumulated in the atmosphere around 2.3 billion years ago. Scientists have used this oxygen surge and how microbes adapted to it to map out bacterial evolution.
In a study published in Science, researchers have constructed a detailed timeline for bacterial evolution and oxygen adaptation.
Their findings suggest some bacteria could use trace oxygen long before evolving the ability to produce it through photosynthesis.
The researchers focused on how microorganisms responded to the Great Oxygenation Event (GOE) some 2.3 billion years ago. This event, triggered in large part by the development of oxygenic (oxygen-generating) photosynthesis in cyanobacteria and carbon deposition, fundamentally changed Earth's atmosphere from one mostly devoid of oxygen to one where oxygen became relatively abundant, as it is today.
Until now, establishing accurate timescales for how bacteria evolved before, during, and after this pivotal transition has been difficult due to incomplete fossil evidence and the challenge of determining the maximum possible ages for microbial groups—given that the only reliable maximum limit for the vast majority of lineages is the moon-forming impact 4.5 billion years ago, which likely sterilized the planet.
The researchers addressed these gaps by concurrently analyzing geological and genomic records. Their key innovation was to use the GOE itself as a time boundary, assuming that most aerobic (oxygen-using) branches of bacteria are unlikely to be older than this event—unless fossil or genetic signals strongly suggest an earlier origin. Using Bayesian statistics, they created a model that can override this assumption when data supports it.
This approach, however, requires making predictions about which lineages were aerobic in the deep past. The team used probabilistic methods to infer which genes ancient genomes contained, and then machine-learning to predict whether they used oxygen.
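To make that last step concrete, here is a toy version of the idea: a classifier trained on gene presence/absence profiles of genomes with known oxygen use, then applied to inferred ancestral gene contents. The data, features, and classifier below are placeholders, not the authors' pipeline.

```python
# Toy illustration: predicting aerobic vs. anaerobic lifestyle from gene presence/absence.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n_genomes, n_genes = 200, 50
X = rng.integers(0, 2, size=(n_genomes, n_genes))    # 1 = gene family present
# Pretend the first five gene families (e.g., oxygen-using enzymes) drive the label.
y = (X[:, :5].sum(axis=1) >= 3).astype(int)          # 1 = aerobic

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# "Ancestral" genomes: inferred gene content collapsed to presence/absence.
ancestral = rng.integers(0, 2, size=(3, n_genes))
print(clf.predict_proba(ancestral)[:, 1])   # probability each ancestor was aerobic
```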
To best utilize the fossil record, they leveraged fossils of eukaryotes, whose mitochondria evolved from Alphaproteobacteria, and chloroplasts evolved from cyanobacteria to better estimate how and when aerobic bacteria evolved.
Their results indicate that at least three lineages had aerobic lifestyles before the GOE—the earliest nearly 900 million years before—suggesting that a capacity for using oxygen evolved well before its widespread accumulation in the atmosphere.
Intriguingly, these findings point to the possibility that aerobic metabolism may have occurred long before the evolution of oxygenic photosynthesis. Evidence suggests that the earliest aerobic transition occurred in an ancestor of photosynthetic cyanobacteria, indicating that the ability to utilize trace amounts of oxygen may have allowed the development of genes central to oxygenic photosynthesis.
The study estimates that the last common ancestor of all modern bacteria lived sometime between 4.4 and 3.9 billion years ago, in the Hadean or earliest Archean era. The ancestors of major bacterial phyla are placed in the Archean and Proterozoic eras (2.5–1.8 billion years ago), while many families date back to 0.6–0.75 billion years ago, overlapping with the era when land plants and animal phyla originated.
Notably, once atmospheric oxygen levels rose during the GOE, aerobic lineages diversified more rapidly than their anaerobic counterparts, indicating that oxygen availability played a substantial role in shaping bacterial evolution. This combined approach of using genomic data, fossils, and Earth's geochemical history brings new clarity to evolutionary timelines, especially for microbial groups that don't have a fossil record.
A geological timescale for bacterial evolution and oxygen adaptation, Science (2025). DOI: 10.1126/science.adp1853
Antibiotic resistance among key bacterial species plateaus over time, study shows
Antibiotic resistance tends to stabilize over time, according to a study published in the open-access journal PLOS Pathogens.
In this study, researchers analyzed drug resistance in more than 3 million bacterial samples collected across 30 countries in Europe from 1998 to 2019. Samples encompassed eight bacteria species important to public health, including Streptococcus pneumoniae, Staphylococcus aureus, Escherichia coli, and Klebsiella pneumoniae.
They found that while antibiotic resistance initially rises in response to antibiotic use, it does not rise indefinitely. Instead, resistance rates reached an equilibrium over the 20-year period in most species.
Antibiotic use contributed to how quickly resistance levels stabilized as well as variability in resistance rates across different countries. But the association between changes in drug resistance and antibiotic use was weak, suggesting that additional, yet unknown, factors are at play.
The study highlights that a continued increase in antibiotic resistance is not inevitable and provides new insights to help researchers monitor drug resistance.
When the researchers examined the dynamics of antibiotic resistance in many important bacterial pathogens across Europe over recent decades, they often found that resistance frequency initially increases and then stabilizes at an intermediate level. A country's consumption of a given antibiotic explained both the speed of the initial increase and the level at which resistance stabilized.
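The qualitative pattern described here, a rise that levels off at an intermediate frequency, can be pictured with a simple saturating curve. The numbers below are hypothetical, and this is not the model fitted in the PLOS Pathogens study.

```python
# Toy curve: resistance frequency rising and then plateauing at an intermediate level.
import numpy as np

r_eq = 0.35      # hypothetical equilibrium resistance frequency
rate = 0.25      # hypothetical speed of approach (per year)
years = np.arange(0, 21)
frequency = r_eq * (1 - np.exp(-rate * years))
for t, f in zip(years[::5], frequency[::5]):
    print(f"year {t:2d}: resistance frequency {f:.2f}")
```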
Jumbo phages infect cells with a protective cloaking mechanism, researchers discover
In a growing global trend, bacteria are evolving new ways to maneuver around medical treatments for a variety of infections. The rising antibiotic resistance crisis poses a significant public health threat in hospitals and other settings, with infections resulting in millions of deaths in recent years.
Scientists are now looking to bacteriophages—viruses that infect bacteria—and their potential to treat drug-resistant infections. They have begun to look deeper into an intriguing class of large bacteriophage known as "jumbo phages" that exhibit extraordinary features as possible new agents for bacterial infection treatments.
A study by researchers has shed new light on the unusual ways that phages have evolved to infect bacteria. Over millions of years, viruses and bacteria have engaged in a back-and-forth arms race. Viruses develop new ways to infect bacteria, while bacteria counter by evolving a resistance mechanism.
In order to fully realize the potential of jumbo phages and their promise as new therapeutics, researchers must decipher the mechanisms they employ to infect bacteria and evade the host's defenses.
A new study published in the journal Cell Host & Microbe describes the first-of-its-kind discovery of a type of membrane-bound sac, or vesicle, used by jumbo phages of the Chimalliviridae family.
The researchers found that immediately after jumbo phages infect a bacterial cell, they form a structure that shields and hides valuable DNA material. Phages use this genetic material to develop a nucleus inside their bacterial hosts.
The newly discovered compartment, which they named the EPI, or early phage infection vesicle, serves as a type of cloaking device that prevents triggering the bacteria's immune system.
When phages infect a bacterial cell, the EPI vesicle protects the genome of the virus during early stages of infection when it's very vulnerable. Bacteria and viruses are often dismissed as simple organisms but they're actually capable of very sophisticated intracellular warfare and this study is a new example of that.
Because most phages simply inject their DNA directly into the host, effectively announcing their arrival within the cell, the stealth approach of the Chimalliviridae phages came as a revelation to researchers. The bacteria don't realize that there is a virus inside, producing the components that will eventually take over the cell.
Researchers identified curious DNA "dots" within infected cells under a light microscope. Professor Elizabeth Villa's laboratory then used high-end imaging technologies to discover that these dots were tiny vesicles containing viral DNA and molecular machineries outside these vesicles. They found that these vesicles were actually metabolically active, confirming the purpose of the molecular machines hanging outside the vesicles. Not only did the researchers show that these vesicles are making RNA, but also they are getting ready to establish infection by synthesizing genes important for nucleus formation.
Emily G. Armbruster et al, Sequential membrane- and protein-bound organelles compartmentalize genomes during phage infection, Cell Host & Microbe (2025). DOI: 10.1016/j.chom.2025.03.005
Scientists discover how nanoparticles of toxic metal used in MRI scans infiltrate human tissue
Researchers studying the health risks posed by gadolinium, a toxic rare earth metal used in MRI scans, have found that oxalic acid, a molecule found in many foods, can generate nanoparticles of the metal in human tissues.
In a new paper published in the journal Magnetic Resonance Imaging, a research team sought to explain the formation of the nanoparticles, which have been associated with serious health problems in the kidneys and other organs.
The worst disease caused by MRI contrast agents is nephrogenic systemic fibrosis. People have succumbed after just a single dose. The condition can cause a thickening and hardening of the skin, heart and lungs and cause painful contracture of the joints.
Gadolinium-based contrast agents are injected prior to MRI scans to help create sharper images.
The metal is usually tightly bound to other molecules and is excreted from the body, and most people experience no adverse effects. However, previous research has shown that even in those with no symptoms, gadolinium particles have been found in the kidney and the brain and can be detected in the blood and urine years after exposure.
Scientists are left with intertwined puzzles: Why do some people get sick, when most don't, and how do gadolinium particles become pried loose from the other molecules in the contrast agent?
Almost 50% of the patients had been exposed only a single time, which means that there's something that is amplifying the disease signal.
In their study, the research team focused on oxalic acid, which is found in many plant-based foods, including spinach, rhubarb, most nuts and berries and chocolate, because it binds with metal ions. The process helps lead to the formation of kidney stones, which result when oxalate binds with calcium. Meanwhile, oxalic acid also forms in the body when people eat foods or supplements containing vitamin C.
In test tube experiments the researchers found that oxalic acid caused minute amounts of gadolinium to precipitate out of the contrast agent and form nanoparticles, which then infiltrated the cells of various organs.
Some people might form these things, while others do not, and it may be their metabolic milieu. It might be if they were in a high oxalic state or a state where molecules are more prone to linking to the gadolinium, leading to the formation of the nanoparticles. That might be why some individuals have such awful symptoms and this massive disease response, whereas other people are fine.
The finding points to a possible way to mitigate some of the risks associated with MRI scans.
The scientists are getting closer to some recommendations for helping these individuals who are susceptible.
Ian M. Henderson et al, Precipitation of gadolinium from magnetic resonance imaging contrast agents may be the brass tacks of toxicity, Magnetic Resonance Imaging (2025). DOI: 10.1016/j.mri.2025.110383
Marriage linked to higher dementia risk in older adults, 18-year study finds
This one is going to surprise you.
Researchers found that older adults who were divorced or never married had a lower risk of developing dementia over an 18-year period compared to their married peers. Findings suggest that being unmarried may not increase vulnerability to cognitive decline, contrary to long-held beliefs in public health and aging research.
Marriage is often linked to better health outcomes and longer life, but evidence connecting marital status to dementia risk remains inconsistent.
Prior research has not consistently addressed how marital status relates to specific causes of dementia or how factors such as sex, depression, or genetic predisposition may influence these associations.
In the study, "Marital status and risk of dementia over 18 years: Surprising findings from the National Alzheimer's Coordinating Center," published in Alzheimer's & Dementia, researchers conducted an 18-year cohort study to understand whether marital status was associated with dementia risk in older adults.
More than 24,000 participants without dementia at baseline were enrolled from over 42 Alzheimer's Disease Research Centers across the United States through the National Alzheimer's Coordinating Center. Annual clinical evaluations were conducted by trained clinicians using standardized protocols to assess cognitive function and determine diagnoses of dementia or mild cognitive impairment.
To assess long-term risk, researchers followed participants for up to 18.44 years, yielding over 122,000 person-years of data. Marital status at baseline was categorized as married, widowed, divorced, or never married.
Dementia risk was analyzed using Cox proportional hazards regression, with married participants serving as the reference group. The models incorporated demographic characteristics, mental and physical health, behavioral history, genetic risk factors, and diagnostic as well as enrollment variables.
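For readers unfamiliar with this type of model, the sketch below runs a Cox regression on simulated survival data with married participants as the reference group. It uses the lifelines library as a stand-in for whatever software the authors used; the data and covariates are invented purely for illustration.

```python
# Minimal Cox proportional hazards example with simulated survival data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2_000
marital = rng.choice(["married", "widowed", "divorced", "never_married"], size=n)
age = rng.uniform(60, 90, size=n)

# Simulated time-to-dementia (years), with hazard increasing with age.
base_hazard = 0.03 * np.exp(0.04 * (age - 70))
time = rng.exponential(1 / base_hazard)
observed = (time < 18).astype(int)      # administratively censor at 18 years of follow-up
time = np.minimum(time, 18)

df = pd.DataFrame({"time": time, "dementia": observed, "age": age})
# Dummy-code marital status with "married" as the reference category.
dummies = pd.get_dummies(marital, prefix="ms", dtype=int).drop(columns=["ms_married"])
df = pd.concat([df, dummies], axis=1)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="dementia")
cph.print_summary()   # the exp(coef) column gives hazard ratios relative to married
```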
Compared to married participants, those who were divorced or never married showed a consistently lower risk of developing dementia over the study period. Dementia diagnoses occurred in 20.1% of the overall sample. Among married participants, 21.9% developed dementia during the study period. Incidence was identical among widowed participants at 21.9% but notably lower for divorced (12.8%) and never-married participants (12.4%).
Hazard ratios showed a reduced risk for all three unmarried groups. In initial models adjusting only for age and sex, divorced individuals had a 34% lower risk of developing dementia (HR = 0.66, 95% CI = 0.59–0.73), never-married individuals had a 40% lower risk (HR = 0.60, 95% CI = 0.52–0.71), and widowed individuals had a 27% lower risk (HR = 0.73, 95% CI = 0.67–0.79).
These associations remained significant for the divorced and never-married groups after accounting for health, behavioral, genetic, and referral-related factors. The association for widowed participants weakened and was no longer statistically significant in the fully adjusted model.
When looking at specific dementia subtypes, all unmarried participants also showed reduced risk for Alzheimer's disease and Lewy body dementia. In contrast, no consistent associations were observed for vascular dementia or frontotemporal lobar degeneration in fully adjusted models. Divorced and never-married groups were also less likely to progress from mild cognitive impairment to dementia.
Risk patterns appeared slightly stronger among men, younger individuals, and participants referred to clinics by health professionals. Yet stratified analyses showed minimal variation, suggesting that the associations held across a wide range of demographic and clinical subgroups.
Researchers concluded that unmarried individuals, particularly those who were divorced or never married, had a lower risk of developing dementia than those who remained married. These associations persisted even after adjusting for physical and mental health, lifestyle factors, genetics, and differences in clinical referral and evaluation. Risks of Alzheimer's disease and Lewy body dementia were higher in married participants, as was the risk of progression from mild cognitive impairment to dementia. The findings contrast with prior studies linking unmarried status to increased dementia risk and offer new evidence on how relationship status may relate to cognitive outcomes when diagnosis is measured under standardized conditions.
Selin Karakose et al, Marital status and risk of dementia over 18 years: Surprising findings from the National Alzheimer's Coordinating Center, Alzheimer's & Dementia (2025). DOI: 10.1002/alz.70072
Indian cormorants (Phalacrocorax fuscicollis) can spread antimicrobial-resistant bacteria in their droppings. Intrigued by the smell of cormorant droppings, veterinary microbiologist Siddhartha Narayan Joardar and his team analysed some of the bird poo and found a strain of E. coli that produced enzymes that helped it resist certain antibiotics. The birds probably pick up the bacterium by eating fish from ponds contaminated by human wastewater, the team suggests. “Growing evidence shows the presence of antimicrobial-resistant bacteria in common urban birds,” says Joardar, which poses a public health threat if it spills over into humans.
Obesity severity tied to increased risk across 16 common conditions
New research has found that obesity, particularly severe obesity, is strongly associated with the incidence of 16 common health outcomes. Associations remained consistent across sex and racial groups. Strong associations were observed for obstructive sleep apnea, type 2 diabetes, and metabolic dysfunction-associated steatotic liver disease.
Obesity is a risk factor for adverse health outcomes involving multiple organ systems. Prior studies have analyzed conditions individually, limiting understanding of obesity's total health burden. External validity has also been limited by underrepresentation of individuals with class III obesity and of diverse demographic groups.
In the study, "Associations between Class I, II, or III Obesity and Health Outcomes," published in NEJM Evidence, researchers conducted a longitudinal cohort study to understand how different levels of obesity relate to a wide array of health conditions across a diverse U.S. population.
Participants contributed electronic health records, physical measurements, and survey data. Body mass index (BMI) was calculated at enrollment and used to classify individuals as normal weight, overweight, or obese, with further stratification into obesity classes I, II, and III.
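The class I, II and III labels follow the conventional BMI cut-offs; assuming the study used these standard thresholds (the paper's class labels suggest it did), the classification can be written as a small helper:

```python
def bmi_class(weight_kg: float, height_m: float) -> str:
    """Classify BMI using the conventional cut-offs (kg/m^2)."""
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25:
        return "normal weight"
    if bmi < 30:
        return "overweight"
    if bmi < 35:
        return "class I obesity"
    if bmi < 40:
        return "class II obesity"
    return "class III obesity"

print(bmi_class(120, 1.75))   # BMI ~39.2 -> class II obesity
```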
Sixteen pre-identified health conditions were evaluated: hypertension, type 2 diabetes, hyperlipidemia or dyslipidemia, heart failure, atrial fibrillation, atherosclerotic cardiovascular disease, chronic kidney disease, pulmonary embolism, deep vein thrombosis, gout, liver disease linked to metabolic dysfunction, biliary calculus, obstructive sleep apnea, asthma, gastroesophageal reflux disease, and osteoarthritis.
Cox proportional hazards models were used to estimate the risk of each condition by obesity class, adjusting for sex, age, race or ethnicity, income, and education. Researchers also calculated population-attributable fractions for each condition by obesity class.
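A population-attributable fraction can be computed with Levin's classic formula; the snippet below shows the arithmetic with made-up numbers and is not an attempt to reproduce the study's adjusted estimates.

```python
# Levin's formula for the population-attributable fraction (PAF):
#   PAF = p_e * (RR - 1) / (1 + p_e * (RR - 1))
# where p_e is exposure prevalence and RR the relative risk (a hazard ratio is often used as a stand-in).

def paf(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1)
    return excess / (1 + excess)

# Hypothetical example: exposure prevalence 40%, relative risk 3.0.
print(f"{paf(0.40, 3.0):.1%}")   # ~44% of cases attributable to the exposure
```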
Obesity was present in 42.4% of the study population, including 21.2% with class I obesity, 11.3% with class II, and 9.8% with class III. Compared to those with normal weight, individuals with obesity were more likely to be female, Black, have lower income and education levels, and have higher blood pressure and waist-to-hip ratios.
Prevalence and incidence rates increased progressively with higher obesity classes for all 16 health outcomes. Observed associations with class III obesity were strongest for obstructive sleep apnea (hazard ratio 10.94), type 2 diabetes mellitus (7.74), and metabolic dysfunction–associated liver disease (6.72). Weaker associations were found for asthma (2.14), osteoarthritis (2.06), and atherosclerotic cardiovascular disease (1.96).
Obesity was associated with elevated risk across all subgroups, with consistent patterns by sex and race. Population-attributable fractions indicated that obesity explained 51.5% of obstructive sleep apnea cases, 36.3% of metabolic liver disease cases, and 14.0% of osteoarthritis cases in the study population.
Increased risk, particularly at higher severity levels, was associated with all 16 health outcomes studied. Risks rose in a stepwise manner across obesity classes, with the highest burden observed among individuals with class III obesity. Findings remained consistent across demographic subgroups and were supported by data from a large, diverse national cohort.
Associations between obesity and several conditions such as sleep apnea, type 2 diabetes, liver disease, and heart failure were strong and statistically robust. Population-attributable fractions indicated that a substantial proportion of these conditions may be preventable through effective obesity management.
Zhiqi Yao et al, Associations between Class I, II, or III Obesity and Health Outcomes, NEJM Evidence (2025). DOI: 10.1056/EVIDoa2400229
Mushroom study identifies most bitter substance known to date
The molecular world of bitter compounds has so far only been partially explored. Researchers have now isolated three new bitter compounds from the mushroom Amaropostia stiptica and investigated their effect on human bitter taste receptors.
In doing so, they discovered one of the potentially most bitter substances known to date. The study results, published in the Journal of Agricultural and Food Chemistry, expand our knowledge of natural bitter compounds and their receptors, thus making an important contribution to food and health research.
The BitterDB database currently contains more than 2,400 bitter molecules. For about 800 of these very chemically diverse substances, at least one bitter taste receptor is specified. However, the bitter compounds recorded are mainly from flowering plants or synthetic sources. Bitter compounds of animal, bacterial or fungal origin, on the other hand, are still rarely represented in the database.
Researchers assume that bitter taste receptors have developed to warn against the consumption of potentially harmful substances. However, not all bitter compounds are toxic or harmful, and not every toxin tastes bitter, as the example of the death cap mushroom toxin shows. But why is that the case?
Studies have also shown that the sensors for bitter substances are not only found in the mouth, but also in organs such as the stomach, intestines, heart and lungs, as well as on certain blood cells. Since we do not "taste" with these organs and cells, the question arises as to the physiological significance of the receptors there.
Comprehensive data collections on bitter compounds and their receptors could help us to find answers to these open questions.
The more well-founded data we have on the various bitter compound classes, taste receptor types and variants, the better we can develop predictive models using systems biology methods to identify new bitter compounds and predict bitter taste receptor-mediated effects. This applies to both food constituents and endogenous substances that activate extraoral bitter taste receptors.
The research team that undertook this study examined the Bitter Bracket (Amaropostia stiptica) as part of a collaborative project. The mushroom is non-toxic, but tastes extremely bitter.
Using modern analytical methods, the research group has succeeded in isolating three previously unknown compounds and elucidating their structures. Using a cellular test system, the researchers then showed that the compounds activate at least one of the approximately 25 human bitter taste receptor types.
Particularly noteworthy is the newly discovered bitter compound oligoporin D, which stimulates the bitter taste receptor type TAS2R46 even at very low concentrations (approx. 63 millionths of a gram per liter). To illustrate: this concentration corresponds to one gram of oligoporin D dissolved in about 106 bathtubs of water, where one gram is roughly the amount of baking soda that fits on the tip of a knife.
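The bathtub comparison is simple arithmetic, shown below assuming a roughly 150-liter bathtub (the article does not state the volume used):

```python
threshold_g_per_l = 63e-6                        # ~63 millionths of a gram per liter
litres_for_one_gram = 1 / threshold_g_per_l      # ~15,900 L of water per gram
bathtub_litres = 150                             # assumed bathtub volume
print(litres_for_one_gram / bathtub_litres)      # ~106 bathtubs
```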
Lea M. Schmitz et al, Taste-Guided Isolation of Bitter Compounds from the Mushroom Amaropostia stiptica Activates a Subset of Human Bitter Taste Receptors, Journal of Agricultural and Food Chemistry (2025). DOI: 10.1021/acs.jafc.4c12651
Scientists discover babies can sense their heartbeat and breathing
Body signals such as heartbeat and breathing accompany us constantly, often unnoticed as background noise of our perception. Even in the earliest years of life, these signals are important as they contribute to the development of self-awareness and identity. But can babies perceive their own body signals?
A recent study from the Wiener Kinderstudien Lab at the University of Vienna demonstrates for the first time that babies as young as 3 months can perceive their own heartbeat. In addition, the team investigated, for the first time, infants' perception of their own breathing and found that it develops during the first two years of life. The results are now published in the journal eLife.
The perception of internal body signals is closely linked to emotional awareness, mental health, and self-perception. Early in life, the ability to perceive one's own body signals may be particularly important, as it often forms the basis for interactions with caregivers—for example, babies rely on their caregivers to respond appropriately to signs of hunger or discomfort. Moreover, the development of self-awareness and identity partly depends on the perception and experience of one's own body.
The study shows that even 3-month-old babies can perceive their own heartbeat and that this ability remains relatively stable during the first two years of life. At the same time, the findings indicate that the perception of breathing improves significantly during the second year. Interestingly, the ability to perceive the heartbeat and breathing does not appear to be related—much like in adults.
Results showed that even at an early age, babies recognize the correspondence between their own heartbeat or breathing rhythm and the animated figures.
Markus R Tünte et al, Respiratory and cardiac interoceptive sensitivity in the first two years of life, eLife (2023). DOI: 10.7554/eLife.91579
Your season of conception could influence how your body stores fat
Individuals who were conceived in colder seasons are more likely to show higher brown adipose tissue activity, increased energy expenditure and a lower body mass index (BMI), and lower fat accumulation around internal organs, compared with those conceived in warmer seasons, suggests a study published in Nature Metabolism. The findings, based on an analysis involving more than 500 participants, indicate a potential role for meteorological conditions influencing human physiology.
Although eating habits and exercise are key determinants of fat loss, exposure to cold and warmth also plays a part. In colder temperatures, the body generates more heat (cold-induced thermogenesis) via brown adipose tissue activity and stores less fat in the form of white adipose tissue than it does in hotter temperatures.
Researchers analyzed brown adipose tissue density, activity and thermogenesis in 683 healthy male and female individuals between ages 3 and 78 in Japan, whose parents were exposed to cold temperatures (defined in the study as between 17 October and 15 April) or warm temperatures (between 16 April and 16 October) during the fertilization and birth periods.
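Since the cold and warm seasons are defined by fixed calendar dates, the grouping can be expressed as a small helper function; exactly how the study handled the boundary days is not stated, so the comparison operators below are an assumption.

```python
from datetime import date

def conception_season(d: date) -> str:
    """Cold season: 17 October-15 April; warm season: 16 April-16 October (per the study's definition)."""
    cold_start = (10, 17)   # 17 October
    warm_start = (4, 16)    # 16 April
    md = (d.month, d.day)
    if md >= cold_start or md < warm_start:
        return "cold"
    return "warm"

print(conception_season(date(2024, 12, 1)))   # cold
print(conception_season(date(2024, 7, 1)))    # warm
```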
Individuals who were conceived during the cold season showed higher brown adipose tissue activity, which then correlated with increased energy expenditure, increased thermogenesis, lower visceral fat accumulation and lower BMI into adulthood. More specifically, the researchers show that a key factor in determining brown adipose tissue activity in human offspring is a large daily temperature variation and lower ambient temperature during the pre-conception period.
Takeshi Yoneshiro et al, Pre-fertilization-origin preservation of brown fat-mediated energy expenditure in humans, Nature Metabolism (2025). DOI: 10.1038/s42255-025-01249-2
The gut microbiome as a predictive factor for kidney rejection
The kidney is one of the most frequently transplanted organs in the world.
For patients with advanced kidney failure, a kidney transplant remains the best treatment option.
The demand is correspondingly high: many patients around the world are on waiting lists for a kidney transplant. A serious risk for patients who have already received a transplant is rejection of the organ. This is a defensive reaction of the body against the foreign tissue, which in the worst case can lead to complete loss of organ function.
Why transplants are sometimes rejected and sometimes not depends largely on immune mechanisms. The causes are complex and often poorly understood. To help answer this question, researchers have analyzed the changes in the composition and function of the gut microbiome of kidney transplant patients.
They discovered an altered signature in the gut microbiome that preceded transplant rejection. This study, published in the American Journal of Transplantation, offers a possible starting point for recognizing the risk of rejection at an early stage.
Our gut is home to countless microorganisms, collectively known as the microbiome, that play an important role in how our immune system works. The majority of these, over 90%, are bacteria. These bacteria and the substances they produce communicate with our body—especially with the cells that protect us from disease. They therefore help to control and strengthen our immune system, which is important for both healthy and sick people.
In patients with chronic kidney disease, the composition of the gut microbiome is severely altered, resulting in lower concentrations of anti-inflammatory short-chain fatty acids (SCFA) and increased concentrations of pro-inflammatory metabolites from the microbiome.
In their study, researchers analyzed the changes in the composition and function of the gut microbiome of patients after kidney transplantation. They found changes in the gut microbiome that were already detectable before the transplant rejection reaction.
It was noticeable that in patients who showed a rejection reaction, bacteria that typically occur in patients with advanced kidney failure, such as Fusobacterium and disease-associated genera such as Streptococcus, increased again. This was not the case in the other group studied, the "non-rejection group."
Overall, the analyses showed that the production potential of short-chain fatty acids in the stool is reduced before kidney rejection. This is indicated by the reduced frequency of bacterial enzymes from which short-chain fatty acids are produced before rejection.
The previously observed dynamic regeneration of the microbiome after kidney transplantation may be significantly disturbed in the case of transplant rejection: prior to rejection, profound changes in the composition of the microbiome occur, characterized by reduced diversity and a low number of SCFA-producing bacterial populations.
The results suggest that the microbiome plays an important role in how the immune system reacts after a kidney transplant. This observation can help to identify the risk of transplant rejection at an early stage or perhaps influence it therapeutically.
Johannes Holle et al, Gut microbiome alterations precede graft rejection in kidney transplantation patients, American Journal of Transplantation (2025). DOI: 10.1016/j.ajt.2025.02.010
Scientists make water-repellent replacement for toxic 'forever chemicals'
A team of international scientists has invented a substitute for synthetic chemicals, called PFAS (perfluoroalkyl substances), which are widely used in everyday products despite being hazardous to health and the environment.
Until now, it was believed fluorine—the element in such products which forms a highly effective barrier between substances like air and water, making them water repellent—could not easily be replaced because of its unique properties.
But scientists have discovered that the unique "bulky" attribute of fluorine, which makes it especially good at filling space, can actually be replicated in a different, non-toxic form. The findings are published in the Journal of Colloid and Interface Science.
From fire-fighting foam to furniture, food packaging, cookware, make-up and toilet tissue, PFAS products are everywhere. Despite the risks to human health, and the fact that these substances don't degrade and persist in the environment, finding an alternative with comparable properties has proven elusive. But after many years of intensive research, scientists have now made a breakthrough.
The results of their discovery are published in a study which unpacks the chemical structure of PFAS and pinpoints the characteristic "bulkiness" they sought to replicate in a safer form. It also demonstrates how non-fluorinated components, containing only non-toxic carbon and hydrogen, could be equally effective replacements.
Extensive experimentation showed that these 'bulky' fragments also feature in other common chemical systems, such as fats and fuels. The scientists took those principles and created modified chemicals that retain these positive attributes while being much safer.
Using their specialized laboratories for chemical synthesis, they substituted the fluorine in PFAS with certain groups containing only carbon and hydrogen. The whole process has taken about 10 years and the implications are very significant not least because PFAS is used in so many different products and situations.
The researchers now plan on using these principles discovered in the lab to design commercially viable versions of PFAS substitutes.
Masanobu Sagisaka et al, New fluorine-free low surface energy surfactants and surfaces, Journal of Colloid and Interface Science (2025). DOI: 10.1016/j.jcis.2025.03.018
Turning pollution into fuel with record-breaking carbon dioxide to carbon monoxide conversion rates
What if we could transform harmful pollution into a helpful energy source? As we strive towards carbon neutrality, researching energy innovations that reduce pollution is crucial.
Researchers developed a streamlined process for converting carbon dioxide (CO2) into carbon monoxide (CO)—a key precursor for synthetic fuels. Their method achieved record-breaking efficiency, cutting down the required time from 24 hours to just 15 minutes. The study is published in the journal Advanced Science.
CO2-to-CO conversion is currently a hot topic to address climate change, but the conventional techniques had major pitfalls that scientists wanted to address.
The materials are expensive and unstable, have limited selectivity, and take a long time to prepare. It just wouldn't be feasible to use them in an actual industrial setting.
With industrial standards in mind, the researchers selected various phthalocyanines (Pc) expected to improve performance [metal-free (H₂Pc), iron (FePc), cobalt (CoPc), nickel (NiPc), and copper (CuPc)]. These were sprayed onto gas diffusion electrodes to directly form crystalline layers of the phthalocyanines on the electrode surface. Ultimately, CoPc—a low-cost pigment and metal complex—showed the highest efficiency in converting CO₂ to CO.
This graffiti-like method of simply spraying the catalyst on a surface reduces the typical processing time down to a mere 15 minutes. Conventional methods require a tedious process of mixing conductive carbon and binders, drying, and heat treatment over 24 hours.
Furthermore, under a current density of 150 mA/cm², the new system maintained stable performance for 144 hours. Using the DigCat Database (the largest experimental electrocatalysis database to date), the researchers confirmed that their catalyst surpassed all previously reported Pc-based catalysts.
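To relate the current density to an amount of product, Faraday's law gives the ideal CO production rate for the two-electron CO2-to-CO reduction. The calculation below assumes 100% Faradaic efficiency, an idealization rather than the paper's measured value.

```python
# Ideal CO production rate from current density via Faraday's law (100% Faradaic efficiency assumed).
F = 96_485          # C per mol of electrons
n_electrons = 2     # CO2 + 2e- + 2H+ -> CO + H2O
j = 0.150           # A/cm^2 (150 mA/cm^2)

mol_per_s_per_cm2 = j / (n_electrons * F)
print(f"{mol_per_s_per_cm2 * 3600 * 1e3:.2f} mmol CO per hour per cm^2")   # ~2.8 mmol/h/cm^2
```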
Not only is this the best Pc-based catalyst for producing CO to date, but it successfully exceeds the industrial standard thresholds for its reaction rate and stability, say the researchers.
Tengyi Liu et al, Surface Charge Transfer Enhanced Cobalt‐Phthalocyanine Crystals for Efficient CO2‐to‐CO Electroreduction with Large Current Density Exceeding 1000 mA cm−2, Advanced Science (2025). DOI: 10.1002/advs.202501459
Sperm don't just swim, they screw their way forward
Researchers have discovered that swimming sperm create swirling fluid vortices—shaped like rolling corkscrews—giving them an extra boost in the race to the egg.
The study, published in Cell Reports Physical Science, reveals that these vortices attach to the sperm cell and rotate in sync, adding extra spin that enhances propulsion and helps keep them on a direct path through the fluid.
As the sperm swims, its flagellum (tail) generates a whipping motion that creates swirling fluid currents that could optimize its propulsion in the reproductive tract. What's really fascinating is how these spiral-like 'imprints' in the surrounding fluid attach to the sperm body and rotate in sync, adding extra thrust.
The size and strength of these flow structures could impact sperm interactions with nearby surfaces, other sperm, or even the egg itself.
Eating only during the daytime could protect people from heart risks of shift work, study suggests
A study by researchers suggests that, when it comes to cardiovascular health, food timing could be a bigger risk factor than sleep timing.
Numerous studies have shown that working the night shift is associated with serious health risks, including to the heart. However, a new study suggests that eating only during the daytime could help people avoid the health risks associated with shift work. Results are published in Nature Communications.
Prior research has shown that circadian misalignment—the mistiming of our behavioral cycle relative to our internal body clock—increases cardiovascular risk factors. So what can be done to lower this risk? This new research suggests food timing could be that target.
Animal studies have shown that aligning food timing with the internal body clock could mitigate the health risks of staying awake during the typical rest time.
In the experiments conducted during this study, the cardiovascular risk factors increased after simulated night work compared to the baseline in the participants who were scheduled to eat during the day and night. However, the risk factors stayed the same in the study participants who only ate during the daytime, even though how much and what they ate was not different between the groups—only "when" they ate.
Chellappa SL et al. Daytime eating during simulated night work mitigates changes in cardiovascular risk factors: secondary analyses of a randomized controlled trial, Nature Communications (2025). DOI: 10.1038/s41467-025-57846-y
General anesthesia reduces uniqueness of brain's functional 'fingerprint,' study finds
Past psychology research suggests that different people display characteristic patterns of spontaneous thought, emotions and behaviors. These patterns make the brains of distinct individuals unique, to the point that neuroscientists can often tell them apart based on their neural activity.
Researchers recently carried out a study aimed at investigating how general anesthesia influences the unique neural activity signatures that characterize the brains of different people and animals.
Their findings, published in Nature Human Behavior, show that general anesthesia suppresses each brain's unique functional connectivity patterns (i.e., the connections and communication patterns between different regions of the brain), both in humans and other species.
Every person is unique: they think, feel and act in unique ways. This uniqueness comes from our brain. The way that areas of the brain interact with each other is unique to each individual: it can be used like a 'brain fingerprint.'
But when you lose consciousness, for example during deep sleep, your sense of being 'you' is gone. So the question is: what happens to brain fingerprints when we lose consciousness, such as during the artificial sleep induced by general anesthesia?
To explore the effects of anesthesia on the brain's functional connectivity patterns, the researchers employed an imaging technique commonly used in neuroscience research called functional magnetic resonance imaging (fMRI). This technique allows neuroscientists to monitor the activity of different brain regions over time and non-invasively, by measuring changes in blood flow.
They collected fMRI scans from healthy human volunteers before general anesthesia, then when they were unconscious because of the anesthesia, and then again after they recovered consciousness.
For each scan, they measured 'functional connectivity': a representation of how brain regions interact. They used this functional connectivity to obtain 'brain fingerprints,' telling them how easy or difficult it is to tell people apart based on their brain activity.
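The fingerprinting step can be illustrated with the widely used connectome-identification procedure: compute each person's region-by-region correlation matrix in two sessions and check whether a session-one matrix is most similar to the same person's session-two matrix. The sketch below uses random data and is a generic version of the idea, not the authors' exact analysis.

```python
# Generic connectome-fingerprinting sketch (random data for illustration).
import numpy as np

rng = np.random.default_rng(3)
n_subjects, n_regions, n_timepoints = 10, 20, 200

def connectivity_fingerprint(timeseries):
    """Vectorize the upper triangle of the region-by-region correlation matrix."""
    corr = np.corrcoef(timeseries)
    rows, cols = np.triu_indices_from(corr, k=1)
    return corr[rows, cols]

# Each subject gets an individual signal; the two "sessions" are noisy versions of it.
signatures = [rng.normal(size=(n_regions, n_timepoints)) for _ in range(n_subjects)]
session1 = [connectivity_fingerprint(s + 0.5 * rng.normal(size=s.shape)) for s in signatures]
session2 = [connectivity_fingerprint(s + 0.5 * rng.normal(size=s.shape)) for s in signatures]

correct = 0
for i, fp1 in enumerate(session1):
    similarity = [np.corrcoef(fp1, fp2)[0, 1] for fp2 in session2]
    correct += int(np.argmax(similarity) == i)

print(f"identification accuracy: {correct / n_subjects:.0%}")
```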
Interestingly, the fMRI scans collected by the researchers showed that the brain activity of people while they were under the influence of anesthesia was suppressed. In fact, anesthesia made people almost impossible to tell apart from each other solely by examining their brain activity, which was possible when they were still conscious.
In contrast, using brain fingerprints it is very easy to tell people apart when they are conscious. This effect is not uniform in the brain: it is strongest in the parts of the brain that are uniquely human and mostly distinguish us from other species. The implication is that just as your own conscious experience is unique to you, so are the brain patterns that support it. When consciousness is gone, people's brain activity is also less unique.
The researchers gathered interesting new insights about the effects of general anesthesia on the brain and its unique patterns of neural activity. In the future, the results of their study could inspire further cross-species research looking at the brain before, during and after the administration of anesthetics, which could in turn inform the development of interventions to facilitate the rehabilitation of both people and animals after medical procedures that require anesthesia. The researchers hope that by learning how the brain reboots consciousness after anesthesia, they can learn how to help recovery of consciousness in patients who suffer from coma and other forms of chronic unconsciousness after a brain injury.
Andrea I. Luppi et al, General anaesthesia decreases the uniqueness of brain functional connectivity across individuals and species, Nature Human Behaviour (2025). DOI: 10.1038/s41562-025-02121-9
In utero opioid use associated with smaller brain regions in newborns
Researchers have found that newborns exposed to opioids in utero exhibited smaller brain volumes in multiple regions compared to unexposed infants.
Cortical gray matter, white matter, deep gray matter, cerebellum, brainstem, and the amygdala all showed reduced size, indicating possible early markers of neurodevelopmental impairment.
Opioid exposure affects a growing number of pregnancies in the world. An estimated 7% of pregnant individuals report using opioids during pregnancy. These substances cross the placenta and may interfere with fetal brain development.
Antenatal opioid exposure has been linked to lower infant cognitive and language scores, increased rates of attention-deficit/hyperactivity disorder, and difficulties with executive functioning.
Confounding influences such as socioeconomic status, co-exposure to other substances, parenting environment, and genetic factors complicate any direct causal interpretation. In some studies, once these environmental risks are controlled, associations between opioid exposure and neurodevelopmental outcomes weaken or disappear.
In the study, "Antenatal Opioid Exposure and Global and Regional Brain Volumes in Newborns," published in JAMA Pediatrics, researchers conducted a multisite, prospective observational study.
A total of 173 newborns with antenatal opioid exposure and 96 unexposed controls were recruited from four sites across the United States. All infants were born at or after 37 weeks of gestation and underwent brain MRI scans before eight weeks of age.
Three-dimensional volumetric MRI scans were acquired during natural sleep using Siemens and Philips 3T scanners. Volumetric data were analyzed using covariance models that controlled for postmenstrual age at scan, sex, birth weight, maternal smoking, and maternal education.
Opioid-exposed newborns had significantly smaller total brain volume compared to unexposed controls: 387.51 cm3 vs. 407.06 cm3. Reductions were also found in cortical gray matter: 167.07 cm3 vs. 176.35 cm3, deep gray matter: 27.22 cm3 vs. 28.76 cm3, white matter: 159.90 cm3 vs. 166.65 cm3, cerebellum: 23.47 cm3 vs. 24.99 cm3, and brainstem: 6.80 cm3 vs. 7.18 cm3.
Amygdala volume was also reduced in opioid-exposed infants. Left amygdala volume measured 0.48 cm3 compared to 0.51 cm3 in controls. Right amygdala volume measured 0.51 cm3 vs. 0.55 cm3 in controls. Methadone-exposed newborns showed significantly smaller white matter volume. Buprenorphine-exposed newborns showed significantly smaller right amygdala volume.
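For perspective, a quick back-of-the-envelope calculation from the group means reported above (an illustration, not part of the published analysis) puts the exposed group roughly 4 to 6% below controls across these regions:

    # Group means reported in the study (cm3): opioid-exposed vs. unexposed controls
    volumes = {
        "total brain":          (387.51, 407.06),
        "cortical gray matter": (167.07, 176.35),
        "deep gray matter":     (27.22, 28.76),
        "white matter":         (159.90, 166.65),
        "cerebellum":           (23.47, 24.99),
        "brainstem":            (6.80, 7.18),
    }
    for region, (exposed, control) in volumes.items():
        pct_smaller = 100 * (control - exposed) / control
        print(f"{region}: {pct_smaller:.1f}% smaller in the exposed group")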
Newborns exposed to opioids only and those exposed to opioids plus other substances both exhibited significant reductions in cortical and deep gray matter, cerebellum, brainstem, right amygdala, and total brain volume. Polysubstance-exposed newborns also showed reductions in white matter and the left amygdala. Researchers concluded that antenatal opioid exposure is associated with reductions in global, regional, and tissue-specific brain volumes in newborns. Structural differences were observed across multiple brain regions and varied by type of opioid exposure.
Methadone exposure was linked with reduced white matter volume. Buprenorphine exposure was associated with smaller right amygdala volume. Newborns with polysubstance exposure showed volume reductions in additional areas, including white matter and the left amygdala.
According to the authors, these structural brain differences may represent early biomarkers of later neurodevelopmental dysfunction.
Yao Wu et al, Antenatal Opioid Exposure and Global and Regional Brain Volumes in Newborns, JAMA Pediatrics (2025). DOI: 10.1001/jamapediatrics.2025.0277
Study finds pet dogs pose significant threat to wildlife and ecosystems
New research into the overlooked environmental impact of pet dogs has found far-reaching negative effects on wildlife, ecosystems and climate.
While ecological damage caused by cats has been extensively studied, the new research found dogs, as the world's most common large carnivores, present a significant and multifaceted environmental threat.
The paper, "Bad Dog? The environmental effects of owned dogs," has been published in Pacific Conservation Biology.
The research found that owned pet dogs disturb and directly harm wildlife, particularly shorebirds, even when they are leashed.
As well as predatory behavior like chasing wildlife, dogs leave scents, urine and feces, which can disrupt animal behavior long after the dogs have left. Studies have found that animals like deer, foxes and bobcats are less active or completely avoid areas where dogs are regularly walked, even in the absence of the dogs.
Dog waste also contributes to pollution in waterways and inhibits plant growth, while wash-off from chemical treatments used to clean and guard dogs from parasites can add toxic compounds to aquatic environments. In addition, the pet food industry, driven by a vast global dog population, has a substantial carbon, land and water footprint.
Addressing these challenges requires a careful balance between reducing environmental harm and maintaining the positive role of dogs as companions and working animals, say the researchers.
The sheer number of pet dogs globally, combined with uninformed or lax behaviors by some owners, is driving environmental issues that we can no longer ignore.
Bad Dog? The environmental effects of owned dogs, Pacific Conservation Biology (2025). doi.org/10.1071/PC24071
Handheld electro-shockers can pose risk for individuals with cardiac implants, study finds
Research has found that handheld electro-shockers commonly used for self-defense can potentially interact with cardiac implantable electronic devices (CIEDs) such as pacemakers, putting individuals at risk.
The study, in Heart Rhythm, shows that an individual's risk from such an interaction depends primarily on the applied voltage, but also on the manufacturer and type of the implanted CIED.
The use of TASER pistols by security forces has been controversial because of associated health risks for subjects receiving a TASER shock. In contrast to TASER pistols, which shoot electrical darts over a distance of up to 10 meters and transmit electrical currents through large parts of a person's body, a handheld electro-shocker delivers energy superficially by directly applying the device to a target.
The handheld electro-shockers tested in this study are legal to own and carry in most countries and therefore, patients with CIEDs might have an increased risk of coming into contact with these devices. This is the first time a study has evaluated the effects of these electro-shockers on CIEDs.
Effects of handheld electro-shockers on cardiac implantable electronic devices, Heart Rhythm (2025). DOI: 10.1016/j.hrthm.2025.02.025
Research leads to designation of new type of diabetes: Type 5
Malnutrition-related diabetes—typically affecting lean, malnourished teens and young adults in low- and middle-income countries—is now officially recognized as a distinct form of the disease, known as type 5 diabetes.
Diabetes associated with obesity, known as type 2 diabetes, accounts for the majority of diabetes cases in developing countries. But increasingly, young people are being diagnosed with diabetes caused not by too much food but by too little—by malnutrition, in other words. Type 5 diabetes is estimated to affect 20 to 25 million people worldwide, mainly in Asia and Africa.
Doctors are still unsure how to treat these patients, who often don't live for more than a year after diagnosis.
While often discussed as two main types, diabetes actually encompasses several categories, including type 1, type 2, gestational diabetes, and other specific types like MODY (maturity-onset diabetes of the young). Here's a breakdown of the main types of diabetes:
Type 1 diabetes: an autoimmune condition in which the body's immune system mistakenly attacks and destroys the insulin-producing cells in the pancreas.
Type 2 diabetes: a condition in which the body becomes resistant to insulin, and the pancreas may not produce enough insulin to maintain normal blood sugar levels.
Gestational diabetes: a form of diabetes that develops during pregnancy, often due to hormonal changes that make the body less sensitive to insulin.
Other specific types of diabetes:
MODY (maturity-onset diabetes of the young): a genetic form of diabetes that typically develops in childhood or early adulthood, often before the age of 25.
LADA (latent autoimmune diabetes in adults): a type of diabetes similar to type 1 but developing more slowly, with gradual damage to the insulin-producing cells in the pancreas.
Neonatal diabetes: a rare form of diabetes that occurs in newborns or infants.
Secondary diabetes: diabetes that develops as a result of other medical conditions or medications, such as cystic fibrosis, pancreatitis, or corticosteroid use.
Researchers identify simple rules for folding the genome
An international team of researchers have identified rules that tell cells how to fold DNA into the tightly packed, iconic X-shaped chromosomes formed during mitosis that help ensure the accurate passing of genetic information between cells during cell division.
Published in the journal Science, these findings illuminate basic biological functions underlying mitosis on a micrometer scale. Understanding how the cells accomplish this critical task may provide important new insights into inheritance and DNA stability and repair, as well as genetic mutations that lead to human diseases such as cancer.
Kumiko Samejima et al, Rules of engagement for condensins and cohesins guide mitotic chromosome formation, Science (2025). DOI: 10.1126/science.adq1709
Venom characteristics of a deadly snake depend on climate and can be predicted from local conditions
Local climate can be used to predict the venom characteristics of a deadly snake that is widespread in India, helping clinicians to provide targeted therapies for snake bite victims, according to a study published in PLOS Neglected Tropical Diseases.
Russell's viper (Daboia russelii) is found across the Indian subcontinent and is responsible for over 40% of snake bite-related deaths in India each year. Its venom is extremely variable, and snake bites cause different symptoms in different regions of India.
The toxic effects of snake venom are caused by the concentrations of different enzymes, which can be influenced by many factors, including prey availability and climate. However, the factors driving variation in Russell's viper venom are unknown.
To investigate, researchers analyzed venom samples from 115 snakes collected in 34 locations across India. They tested the activity of venom toxins, including enzymes that break down proteins, phospholipids and amino acids.
Next, they used historical climate data to understand the relationship between venom composition and the local climate where the snakes were caught. They found that temperature and rainfall partly explained regional variation in snake venom composition.
Protease activity showed the closest relationship to climate variables, whereas the activity of amino acid oxidases was unaffected by climate. Snakes in drier regions of India tended to have higher protease activity.
The researchers used this data to create a map of expected venom types across Russell's viper's range in India, which could be used to predict the clinical symptoms of snake bites in different regions.
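The paper's statistical details are not given here, but the kind of relationship described (enzyme activity partly explained by local temperature and rainfall, then used to predict venom type elsewhere) can be sketched as an ordinary regression. Everything below, including the numbers and variable names, is hypothetical:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Hypothetical per-site data: mean temperature (deg C), annual rainfall (mm),
    # and protease activity of venom sampled at that site (arbitrary units).
    temperature = np.array([24.1, 27.5, 31.2, 29.8, 22.6])
    rainfall    = np.array([2100, 1400,  600,  800, 2600])
    protease    = np.array([0.42, 0.55, 0.81, 0.74, 0.38])

    X = np.column_stack([temperature, rainfall])
    model = LinearRegression().fit(X, protease)

    # Predict expected protease activity for a new locality from its climate alone,
    # analogous in spirit to the venom map described in the study.
    print(model.predict([[30.0, 700]]))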
The venom maps developed in this study could help clinicians select the most appropriate treatment for patients with snake bites, or to develop targeted therapies such as toxin-specific antibodies, the researchers say.
'Ozone-climate penalty' adds to India's air pollution
India's cities are already ranked among the world's most polluted, based on concentrations of fine particulate matter in the air. Now new research indicates they are battling rising levels of another life-threatening pollutant—surface ozone.
A study published in the journal Global Transitions says deaths from ozone in India exceeded 50,000 in 2022 and caused losses of around US$16.8 billion—about 1.5 times the government's total health spending that year.
Surface ozone is a toxic gas that not only affects public health but also impacts ecosystems and climate due to the greenhouse effect.
Ozone, a variant of oxygen, occurs at ground level as well as in the upper atmosphere. Formed naturally, the ozone in the stratosphere helps filter out harmful ultraviolet rays that are a part of the sun's radiation. However, surface ozone, also called ground-level ozone, is generated by interactions between pollutants. For example, nitrogen oxides found in vehicular exhaust can react with volatile organic compounds released by industrial activity and waste dumps to produce ozone. Surface ozone is the primary component of smog and can have negative effects on human health and the environment.
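The chemistry behind this is textbook smog photochemistry rather than anything specific to the study: sunlight splits nitrogen dioxide (NO2 + sunlight → NO + O), the freed oxygen atom combines with molecular oxygen to make ozone (O + O2 + M → O3 + M, where M is any third molecule that carries away excess energy), and peroxy radicals formed during VOC oxidation convert NO back to NO2 (RO2 + NO → RO + NO2). Without VOCs, the NO would instead destroy ozone (NO + O3 → NO2 + O2), which is why the NOx-VOC mix, heat and sunlight together control how much ozone accumulates.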
Short-term exposure to ozone increases the risk of death from heart disease, stroke, hypertension and respiratory issues, while long-term exposure may decrease lung capacity, induce oxidative stress, suppress immune response and cause lung inflammation.
Climate change, rising temperatures and altered weather patterns can raise surface ozone in a phenomenon described by experts as the "ozone-climate penalty". Factors that affect ozone generation include solar radiation, humidity, precipitation and the presence of precursors—substances that lead to the formation of a pollutant through a chemical reaction—such as methane, nitrogen oxides and volatile organic compounds.
Ozone pollution increases during the hot summer months and declines during the monsoon period from June to September as heavy rains wash out the pollutants, and reduced solar radiation limits photochemical reactions.
Critically, human exposure to fine particulate matter—known as PM2.5—may worsen the health effects of ozone.
PM2.5 refers to particles smaller than 2.5 microns, which can enter the bloodstream through the lungs.
According to the 2024 World Air Quality Report, 11 of the world's 20 cities carrying the highest burden of PM2.5 are in India.
A study published in The Lancet Planetary Health found that the entire population of India lives in areas where PM2.5 levels exceed WHO guidelines.
Besides harming health, high levels of surface ozone reduce photosynthesis by damaging photosystems, CO2 fixation and pigments. This leads to a reduction in carbon assimilation that results in the decline of crop yields.
According to the ozone study, India's rice yield loss due to ozone pollution rose from 7.39 million tons to 11.46 million tons between 2005 and 2020, costing around US$2.92 billion and impacting food security.
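Taken at face value, the figures quoted above imply two simple numbers (illustrative arithmetic only, not results from the paper):

    ozone_deaths_loss = 16.8e9                       # US$ loss attributed to ozone deaths in 2022
    implied_health_spend = ozone_deaths_loss / 1.5   # ~US$11.2 billion, since the loss is ~1.5x that spend
    print(implied_health_spend)

    rice_loss_2005, rice_loss_2020 = 7.39, 11.46     # million tonnes
    pct_rise = 100 * (rice_loss_2020 - rice_loss_2005) / rice_loss_2005
    print(pct_rise)                                  # ~55% increase between 2005 and 2020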
Even if precursor emissions remain at current levels, climate change alone could increase surface ozone in the highly polluted regions of South Asia by 2050, with the Indo-Gangetic Plain, one of the region's most fertile areas, likely to face significant crop yield losses.
Much of the surface ozone generated is taken care of by nature, such as the monsoon rains. Dealing with the reported increase in ozone levels is best done by reducing the precursors—nitrogen oxides, methane and PM2.5.
G.S. Gopikrishnan et al, Exposure to surface ozone and its associated health effects and economic burden in India, Global Transitions (2025). DOI: 10.1016/j.glt.2025.03.002
An international team of researchers may have answered one of space science's long-running questions—and it could change our understanding of how life began. Carbon-rich asteroids are abundant in space yet make up less than 5% of meteorites found on Earth.
Published in Nature Astronomy, researchers analyzed close to 8,500 meteoroids and meteorite impacts, using data from 19 fireball observation networks across 39 countries—making it the most comprehensive study of its kind. The paper is titled "Perihelion history and atmospheric survival as primary drivers of Earth's meteorite record."
The team discovered Earth's atmosphere and the sun act like giant filters, destroying fragile, carbon-rich (carbonaceous) meteoroids before they reach the ground.
Scientists have long suspected that weak, carbonaceous material doesn't survive atmospheric entry. What this research shows is many of these meteoroids don't even make it that far: they break apart from being heated repeatedly as they pass close to the sun. The ones that do survive getting cooked in space are more likely to also make it through Earth's atmosphere.
The study also found meteoroids created by tidal disruptions—when asteroids break apart from close encounters with planets—are especially fragile and almost never survive atmospheric entry.
Carbonaceous meteorites are particularly important because they contain water and organic molecules—key ingredients linked to the origin of life on Earth. Carbon-rich meteorites are some of the most chemically primitive materials we can study—they contain water, organic molecules and even amino acids.
However, scientists have so few of them in their meteorite collections that they risk having an incomplete picture of what's actually out there in space and how the building blocks of life arrived on Earth.
Understanding what gets filtered out and why is key to reconstructing our solar system's history and the conditions that made life possible.
Patrick M. Shober et al, Perihelion history and atmospheric survival as primary drivers of the Earth's meteorite record, Nature Astronomy (2025). DOI: 10.1038/s41550-025-02526-6
Half of the universe's hydrogen gas, long unaccounted for, has been found
Astronomers tallying up all the normal matter—stars, galaxies and gas—in the universe today have come up embarrassingly short of the total matter produced in the Big Bang 13.8 billion years ago. In fact, more than half of normal matter—half of the 16% of the universe's matter that is not dark matter—cannot be accounted for in the glowing stars and gas we see.
New measurements, however, seem to have found this missing matter in the form of very diffuse and invisible ionized hydrogen gas, which forms a halo around galaxies and is more puffed out and extensive than astronomers thought.
The findings not only relieve a conflict between astronomical observations and the best, proven model of the evolution of the universe since the Big Bang, they also suggest that the massive black holes at the centers of galaxies are more active than previously thought, fountaining gas much farther from the galactic center than expected—about five times farther, the team found.
The new measurements are certainly consistent with finding all of the gas.
The results of the study, co-authored by 75 scientists from institutions around the world, have been presented at recent scientific meetings, posted as a preprint on arXiv and are undergoing peer review at the journal Physical Review Letters.
While the still mysterious dark matter makes up the bulk—about 84%—of matter in the universe, the remainder is normal matter. Only about 7% of normal matter is in the form of stars, while the rest is in the form of invisible hydrogen gas—most of it ionized—in galaxies and the filaments that connect galaxies in a kind of cosmic network.
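Putting the fractions in this paragraph together gives a rough budget (simple arithmetic, not a number from the paper): if normal matter is about 16% of all matter and only about 7% of that normal matter sits in stars, then stars make up roughly 0.16 × 0.07 ≈ 1% of all matter, and the remaining ~15% of all matter is diffuse hydrogen gas, most of it ionized and spread through and between galaxies.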
The ionized gas and associated electrons strung out in this filament network are referred to as the warm-hot intergalactic medium, which is too cold and too diffuse to be seen with the usual techniques at astronomers' disposal, and therefore has remained elusive until now.
In the new paper, the researchers estimated the distribution of ionized hydrogen around galaxies by stacking images of approximately 7 million galaxies—all within about 8 billion light-years of Earth—and measuring the slight dimming or brightening of the cosmic microwave background caused by a scattering of the radiation by electrons in the ionized gas, the so-called kinematic Sunyaev-Zel'dovich effect.
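For orientation, the kinematic Sunyaev-Zel'dovich signal being stacked here has a standard textbook form (quoted for context, not taken from the paper): the fractional temperature shift along a line of sight is ΔT/T_CMB = −(σ_T/c) ∫ n_e v_r dl, where σ_T is the Thomson scattering cross-section, n_e the free-electron density and v_r the line-of-sight peculiar velocity of the gas. Gas receding from us produces a slight dimming of the microwave background and gas approaching produces a slight brightening, which is why stacking millions of galaxies can isolate the faint electron signal around them.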
B. Hadzhiyska et al, Evidence for large baryonic feedback at low and intermediate redshifts from kinematic Sunyaev-Zel'dovich observations with ACT and DESI photometric galaxies, arXiv (2024). DOI: 10.48550/arxiv.2407.07152
An alternative to artificial fertilizers: Small peptides enhance symbiosis between plants and fungi
Industrial farming practices often deplete the soil of important nutrients and minerals, leaving farmers to rely on artificial fertilizers to support plant growth. In fact, fertilizer use has more than quadrupled since the 1960s, but this comes with serious consequences. Fertilizer production consumes massive amounts of energy, and its use pollutes the water, air, and land.
Plant biologists are now proposing a new solution to help kick this unsustainable fertilizer habit.
In a new study, the researchers identified a key molecule produced by plant roots, a small peptide called CLE16, that encourages plants and beneficial soil fungi to interact with each other. They say boosting this symbiotic relationship through CLE16 supplementation, in which the fungi provide mineral nutrients to the plants, could be a more natural and sustainable way to encourage crop growth without the use of harmful artificial fertilizers.
By restoring the natural symbiosis between plant roots and fungi, we could help crops get the nutrients they need without the use of harmful fertilizers.
In this mutually beneficial relationship, soil-borne arbuscular mycorrhizal fungi supply plants with water and phosphorus, which the plants accept in exchange for carbon molecules. These exchanges occur when specialized symbiotic fungal tendrils, called arbuscules, bury themselves in plant root cells.
Around 80% of plants can trade resources with fungi in this way. However, the traits that support this symbiosis have been weakened over centuries of agricultural plant breeding that prioritized creating crops with the biggest yields.
Scientists say new crop varieties could be bred to strengthen these traits again—an opportunity they intend to explore through the Institute's Harnessing Plants Initiative.
Lena Maria Müller et al, A plant CLE peptide and its fungal mimic promote arbuscular mycorrhizal symbiosis via CRN-mediated ROS suppression, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2422215122
The drugs we take, from anxiety medications to antibiotics, don't simply vanish after leaving our bodies. Many are not fully removed by wastewater treatment systems and end up in rivers, lakes, and streams, where they can linger and affect wildlife in unexpected ways.
The new findings suggest that even tiny traces of drugs in the environment can alter animal behavior in ways that may shape their survival and success in the wild.
A recent global survey of the world's rivers found drugs were contaminating waterways on every continent—even Antarctica. These substances enter aquatic ecosystems not only through our everyday use, as active compounds pass through our bodies and into sewage systems, but also due to improper disposal and industrial effluents.
To date, almost 1,000 different active pharmaceutical substances have been detected in environments worldwide.
Particularly worrying is the fact that the biological targets of many of these drugs, such as receptors in the human brain, are also present in a wide variety of other species. That means animals in the wild can also be affected.
In fact, research over the last several decades has demonstrated that pharmaceutical pollutants can disrupt a wide range of traits in animals, including their physiology, development, and reproduction.
Radiation from CT scans could account for 5% of all cancer cases a year, study suggests
CT can save lives, but its potential harms are often overlooked. CT scans expose patients to ionizing radiation, a carcinogen, and it has long been known that the technology carries an increased risk of cancer.
Radiation from CT scans may account for 5% of all cancers annually, according to a new study that cautions against overusing and overdosing CTs.
The danger is greatest for infants, followed by children and adolescents. But adults are also at risk, since they are the most likely to get scans.
Nearly 103,000 cancers are predicted to result from the 93 million CTs that were performed in 2023 alone. This is three to four times more than previous assessments, the researchers say.
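Two quick checks follow from the numbers above (rough arithmetic; the study's own modeling is organ- and age-specific). The per-exam figure works out to about one projected cancer per 900 scans, and against roughly two million new U.S. cancer diagnoses a year (an external, approximate figure, not from the study) the 103,000 projected cases are indeed about 5%:

    projected_cancers = 103_000
    ct_exams_2023     = 93_000_000
    print(projected_cancers / ct_exams_2023)       # ~0.0011, i.e. roughly 1 in 900 exams

    us_new_cancers_per_year = 2_000_000            # approximate external figure
    print(projected_cancers / us_new_cancers_per_year)   # ~0.05, consistent with the ~5% headline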
The researchers said some CT scans are unlikely to help patients and are overused, such as those for upper respiratory infections or for headaches without concerning signs or symptoms. They said patients could lower their risk by getting fewer of these scans, or by getting lower-dose scans.
There is currently unacceptable variation in the doses used for CT, with some patients receiving excessive doses.
Modified antibody fragment blocks fertilization, paving way for nonhormonal contraceptive
Current methods of contraception rely on hormones, which can cause side effects such as mood changes, headaches or increased risk of blood clots. Blocking fertilization on the surface of the egg has been proposed as an alternative, but antibodies were deemed unsuitable due to possible immune responses triggered by their Fc region.
A new study shows how a small antibody fragment can block fertilization by targeting a key protein on the surface of the egg. This discovery brings a nonhormonal contraceptive one step closer to reality. The study has been published in the Proceedings of the National Academy of Sciences.
In the study, the researchers describe how a modified antibody fragment can block fertilization by targeting the protein ZP2 on the surface of the egg.
This small antibody fragment blocks fertilization by targeting ZP2, a key protein in the outer layer of the egg that is involved both in sperm binding and in blocking polyspermy (fertilization of an egg by more than one sperm).
The researchers used X-ray crystallography to map, at the atomic level, the interaction between ZP2 and the antibody IE-3, which is known to prevent fertilization in mice. A modified, smaller version of the antibody (a single-chain variable fragment, or scFv) was found to be equally effective, blocking fertilization in 100% of IVF tests with mouse eggs. Because it lacks the immune-triggering Fc region of the full antibody, the scFv minimizes potential side effects.
Elisa Dioguardi et al, Structural basis of ZP2-targeted female nonhormonal contraception, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2426057122
Repeated treatment with malaria medication can decrease its effect
In a recently published article in the journal Nature Communications, researchers present results indicating that repeated treatment with piperaquine, an antimalarial drug, can lead to the parasites developing decreased sensitivity to this drug. These findings may impact the use of piperaquine in the future.
Piperaquine is an important antimalarial drug characterized by a long half-life, meaning it remains in the body for several weeks and protects against new infections. This is a key asset of this drug. However, the researchers behind the study have discovered that this advantage can disappear with repeated treatment in areas with high malaria transmission.
The study shows that repeated treatment with dihydroartemisinin-piperaquine can lead to parasites developing drug tolerance by duplicating the plasmepsin 3 (pm3) gene. This allows them to reinfect patients during what should be the protective period, reducing piperaquine's effectiveness as a prophylactic medicine.
Leyre Pernaute-Lau et al, Decreased dihydroartemisinin-piperaquine protection against recurrent malaria associated with Plasmodium falciparum plasmepsin 3 copy number variation in Africa, Nature Communications (2025). DOI: 10.1038/s41467-025-57726-5
Researchers have found that plant leaves can directly absorb microplastics (MPs) from the atmosphere, leading to a widespread presence of plastic polymers in vegetation. Concentrations of polyethylene terephthalate (PET) and polystyrene (PS) were detected in leaves collected from multiple environments, including urban areas and agricultural sites. The study is published in the journal Nature.
Researchers performed field investigations and laboratory simulation experiments to quantify plastic accumulation in plant leaves. Leaf absorption was confirmed as a significant pathway for plastic accumulation in plants, with evidence of translocation into vascular tissue and retention in specialized structures like trichomes.
MPs have been detected throughout terrestrial environments, including soil, water, and air. Laboratory studies have shown that plant roots can absorb MPs, with submicrometer and nanometer-sized particles of PS and polymethylmethacrylate transported upward from the roots of Triticum aestivum, Lactuca sativa, and Arabidopsis thaliana. Root uptake through the apoplastic pathway has been observed, yet translocation to shoots occurs slowly.
Airborne MPs have been measured at concentrations between 0.4 and 2,502 items per cubic meter in urban settings such as Paris, Shanghai, Southern California, and London. Laboratory experiments demonstrated the foliar absorption of nanoparticles including Ag, CuO, TiO2, and CeO2.
Plastic particles have been shown to deposit on plant surfaces, and some studies reported internal accumulation following exposure to high levels of commercial PS models.
At the most polluted sites, concentrations of PET reached tens of thousands of nanograms per gram of dry leaf weight. PS levels followed a similar pattern, with the highest values detected in leaves from the landfill site.
PET and PS were also found in nine leafy vegetables, with open-air crops exhibiting higher levels than greenhouse-grown counterparts. Nano-sized PET and PS were visually confirmed in plant tissue.
Older leaves and outer leaves of vegetables accumulated more plastic than newly grown or inner leaves, suggesting an accumulation over time.
Laboratory exposure of maize to plastic-laden dust resulted in measurable PET absorption in leaf tissue after just one day. PET was not detected in roots or stems under similar root-exposure conditions. Fluorescent and europium-labeled particles enabled visualization of stomatal entry and subsequent migration through the apoplastic pathway.
Abscisic acid was applied to maize roots to chemically induce stomatal closure. Plants exposed to dust laden with PET MPs under these conditions showed significantly lower absorption in leaf tissue, confirming that open stomata are crucial for foliar uptake of airborne MPs.
Plastic particles absorbed through leaves accumulated in measurable quantities across multiple species and sites. Airborne PET and PS entered leaves through stomata and moved along internal pathways to vascular tissues and trichomes.
Concentrations increased with exposure time, environmental levels, and leaf age. Field measurements showed that plastic accumulation in aboveground plant parts exceeds what is typically absorbed through roots.
Widespread detection of plastic polymers and fragments in edible plant parts confirms atmospheric exposure as a significant route of entry into vegetation. As leaves function as a primary source in terrestrial food chains, the presence of accumulated MPs suggests the potential for exposure to multiple layers of the ecosystem.
With plastics around, even vegetarians are not safe!
Ye Li et al, Leaf absorption contributes to accumulation of microplastics in plants, Nature (2025). DOI: 10.1038/s41586-025-08831-4
Willie Peijnenburg, Airborne microplastics enter plant leaves and end up in our food, Nature (2025). DOI: 10.1038/d41586-025-00909-3
Heart valve abnormality is associated with malignant arrhythmias, study reveals
People with a certain heart valve abnormality are at increased risk of severe heart rhythm disorders, even after successful valve surgery. This is according to a new study, "Mitral annular disjunction and mitral valve prolapse: long-term risk of ventricular arrhythmias after surgery," published in the European Heart Journal.
The condition is more common in women and in younger patients with the valve disorder and can, in the worst case, lead to sudden cardiac arrest.
Mitral annular disjunction, MAD, is a heart abnormality in which the mitral valve attachment "slides." In recent years, the condition has been linked to an increased risk of severe cardiac arrhythmias. Until now, it has not been known whether the risk of arrhythmias disappears if MAD is surgically corrected.
MAD is often associated with a heart disease called mitral valve prolapse, which affects 2.5% of the population and causes one of the heart's valves to leak. This can lead to blood being pumped backward in the heart, causing heart failure and arrhythmias. The disease can cause symptoms such as shortness of breath and palpitations.
In the current study, researchers investigated the risk of cardiac arrhythmias in 599 patients with mitral valve prolapse who underwent heart surgery at Karolinska University Hospital between 2010 and 2022. Some 16% of the patients also had the cardiac abnormality MAD.
The researchers were able to show that people with MAD have a significantly higher risk of ventricular arrhythmias, a dangerous type of heart rhythm disorder that can, in a subset of patients, lead to cardiac arrest.
People with MAD were more likely to be female and were on average eight years younger than those without MAD. They also had more extensive mitral valve disease. Although the surgery was successful in correcting MAD, these patients had more than three times the risk of ventricular arrhythmias during five years of follow-up compared to patients without preoperative MAD.
These results show that it is important to closely monitor patients with this condition, even after a successful operation, say the researchers.
The study has led to new hypotheses that the researchers are now investigating further. One hypothesis is that MAD causes permanent changes in the heart muscle over time. Another is that MAD is a sign of an underlying heart muscle disease.
The researchers are now continuing to study scarring in the heart using MRI (magnetic resonance imaging) and analyze tissue samples from the heart muscle.
Bahira Shahim et al, Mitral annular disjunction and mitral valve prolapse: long-term risk of ventricular arrhythmias after surgery, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf195
Dr. Krishna Kumari Challa
But this study finds that antibiotics for common infections are commonly not prescribed according to complication risk and that suggests there is plenty of scope to do more to reduce antibiotic prescribing.
The study also showed that the probability of being prescribed an antibiotic for lower respiratory infections became even less related to complication risk during the pandemic; however, there were only minor changes for urinary tract infections.
Rather than imposing targets for reducing inappropriate prescribing, we argue that it is far more viable for clinicians to focus on improving risk-based antibiotic prescribing for infections that are less severe and typically self-limiting. Prognosis and harm should explicitly be considered in treatment guidelines, alongside better personalized information for clinicians and patients to support shared decision making, say the researchers.
Ali Fahmi et al, Antibiotics for Common Infections in Primary Care Before, During and after the COVID-19 Pandemic and Extent of Risk-Based Prescribing: Need for Personalised Guidelines, Journal of the Royal Society of Medicine (2025).
Part 2
Apr 5
Dr. Krishna Kumari Challa
One-year-old infants already display compositional abilities, study finds
To understand complex objects, humans are known to mentally transform them and represent them as a combination of simpler elements. This ability, known as compositionality, was until now assumed to require fluency in language, emerging in childhood only after humans have learned to speak and understand others.
Researchers recently explored the possibility that compositionality is based on simple processes and might therefore already be present in infants. Their paper, published in Communications Psychology, provides evidence that infants as young as one year old already possess basic compositional abilities.
One of the central properties of language is compositionality, which is a long word that simply means the capacity to put words together to understand sentences.
In contrast to older children and adults, infants cannot yet express their thought processes, thus assessing their abilities in experimental settings typically entails observing their behavior. To evaluate their compositional abilities, Dautriche and her colleague, Emmanuel Chemla, carried out three experiments using a well-established experimental paradigm for research with infants, which involves measuring the time they spend gazing at specific objects.
In their three experiments, the researchers gathered evidence that infants can correctly compose simple noun-verb sentences at approximately 14 months of age, can understand compositional facial expressions (i.e., a negation expressed through a facial expression) at one year of age, and can make basic mental transformations of physical objects when they are 10 months old. Collectively, their findings suggest that compositionality can emerge before humans learn to speak, indicating it could be based on simpler processes than previously anticipated.
The researchers propose that compositionality is a capacity that humans have before learning language, which brings us a little bit closer to better understanding how our minds work and how language evolved. It could be, for instance, that the minds of other species are also compositional and that language might just have built on this feature of the mind.
Isabelle Dautriche et al, Evidence for compositional abilities in one-year-old infants, Communications Psychology (2025). DOI: 10.1038/s44271-025-00222-9.
Apr 6
Dr. Krishna Kumari Challa
Cocoa extract fails to prevent age-related vision loss, clinical trial finds
Clinical research reports no significant long-term benefit of cocoa flavanol supplementation in preventing age-related macular degeneration (AMD). The paper is published in the journal JAMA Ophthalmology.
AMD is a progressive retinal disease and the most common cause of severe vision loss in adults over age 50. AMD damages the macula, the central part of the retina responsible for sharp, detailed vision. While peripheral sight is typically preserved, central vision loss can impair reading, driving, facial recognition, and other quality of life tasks. Abnormalities of blood flow in the eye are associated with the occurrence of AMD.
Cocoa flavanols are a group of naturally occurring plant compounds classified as flavonoids, found primarily in the cocoa bean. These bioactive compounds have been studied for their vascular effects, including improved endothelial function and enhanced nitric oxide production, which contribute to vasodilation and circulatory health. Previous trials have shown that moderate intake of cocoa flavanols may lower blood pressure, improve lipid profiles, and reduce markers of inflammation, suggesting a role in mitigating cardiovascular and related vascular conditions.
In the study titled "Cocoa Flavanol Supplementation and Risk of Age-Related Macular Degeneration: An Ancillary Study of the COSMOS Randomized Clinical Trial," researchers conducted a double-blind, placebo-controlled randomized clinical trial to examine whether daily supplementation with cocoa extract prevents the development or progression of AMD.
A cohort of 21,442 U.S. adults (12,666 women aged 65 and older, and 8,776 men aged 60 and older) was recruited, with eligibility criteria requiring discontinuation of non-trial cocoa supplements and multivitamins for the 3.6-year duration of the trial. COSMOS included its own multivitamin supplement as one arm of the trial.
Daily supplementation consisted of 500 mg cocoa flavanols containing 80 mg (−)-epicatechin. Randomization assigned participants to either the cocoa extract or a matching placebo group. AMD outcomes were identified through self-reported diagnoses, verified through medical record confirmation. Compliance biomarkers confirmed a threefold increase in flavanol metabolite levels in the cocoa group.
A total of 344 participants experienced a confirmed AMD event, including 316 incident cases and 28 cases of progression. Incidence was 1.5% in the cocoa extract group and 1.7% in the placebo group.
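Assuming the two randomized arms were of roughly equal size (about 10,700 participants each; the exact split is not stated in this summary), the reported incidences line up with the 344 confirmed events:

    n_per_arm = 21_442 / 2                                # assumes 1:1 randomization
    events = 0.015 * n_per_arm + 0.017 * n_per_arm        # ~161 + ~182
    print(events)                                         # about 343 events in total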
Part 1
Apr 6
Dr. Krishna Kumari Challa
Statistical modeling found a non-significant reduced risk during the first two years of treatment (HR, 0.77; 95% CI, 0.59–1.01), and a non-significant effect beyond two years (HR, 1.06; 95% CI, 0.76–1.50). Similar trends were observed in secondary outcomes.
In subgroup analyses, treatment effect varied by hypertension status, with a significant reduced risk in participants without hypertension (HR, 0.63; 95% CI, 0.44–0.92) and no significant findings among those with hypertension (HR, 1.04; 95% CI, 0.80–1.36).
Researchers concluded that cocoa extract supplementation had no significant overall effect on AMD risk over a median 3.6-year period. Although researchers did not rule out a possible early benefit trend, the findings do not support cocoa flavanol supplementation as a preventive strategy for AMD.
However, the researchers acknowledge that this was a study with a limited sample.
William G. Christen et al, Cocoa Flavanol Supplementation and Risk of Age-Related Macular Degeneration, JAMA Ophthalmology (2025). DOI: 10.1001/jamaophthalmol.2025.0353
Apr 6
Dr. Krishna Kumari Challa
Female hormones can stimulate immune cells to make opioids that naturally suppress pain
Scientists have discovered a new mechanism that acts via an immune cell and points toward a different way of treating chronic pain. Female hormones can suppress pain by making immune cells near the spinal cord produce opioids, a new study by researchers has found. This stops pain signals before they get to the brain.
The discovery could help with developing new treatments for chronic pain. It may also explain why some painkillers work better for women than men and why postmenopausal women experience more pain.
The work reveals an entirely new role for T-regulatory immune cells (T-regs), which are known for their ability to reduce inflammation.
The researchers looked at T-regs in the protective layers that encase the brain and spinal cord in mice. Until now, scientists thought these tissues, called the meninges, only served to protect the central nervous system and eliminate waste. T-regs were only discovered there in recent years.
This new research shows that the immune system actually uses the meninges to communicate with distant neurons that detect sensation on the skin.
Further experiments revealed a relationship between T-regs and female hormones that no one had seen before: Estrogen and progesterone were prompting the cells to churn out painkilling enkephalin.
This work could be particularly helpful for women who have gone through menopause and no longer produce estrogen and progesterone, many of whom experience chronic pain.
Élora Midavaine et al, Meningeal regulatory T cells inhibit nociception in female mice, Science (2025). DOI: 10.1126/science.adq6531
Apr 6
Dr. Krishna Kumari Challa
The Dangers of Going to Space
Apr 6
Dr. Krishna Kumari Challa
Molecular clock analysis shows bacteria used oxygen long before widespread photosynthesis
Microbial organisms dominate life on Earth, but tracing their early history and evolution is difficult because they rarely fossilize. Determining when exactly a particular group of microbes first appeared is especially hard. However, ancient sediments and rocks hold chemical clues of available nutrients that could support the growth of bacteria.
A key turning point was when oxygen accumulated in the atmosphere around 2.3 billion years ago. Scientists have used this oxygen surge and how microbes adapted to it to map out bacterial evolution.
In a study published in Science, researchers have constructed a detailed timeline for bacterial evolution and oxygen adaptation.
Their findings suggest some bacteria could use trace oxygen long before evolving the ability to produce it through photosynthesis.
The researchers focused on how microorganisms responded to the Great Oxygenation Event (GOE) some 2.3 billion years ago. This event, triggered in large part by the development of oxygenic (oxygen-generating) photosynthesis in cyanobacteria and carbon deposition, fundamentally changed Earth's atmosphere from one mostly devoid of oxygen to one where oxygen became relatively abundant, as it is today.
Until now, establishing accurate timescales for how bacteria evolved before, during, and after this pivotal transition has been difficult due to incomplete fossil evidence and the challenge of determining the maximum possible ages for microbial groups—given that the only reliable maximum limit for the vast majority of lineages is the moon-forming impact 4.5 billion years ago, which likely sterilized the planet.
The researchers addressed these gaps by concurrently analyzing geological and genomic records. Their key innovation was to use the GOE itself as a time boundary, assuming that most aerobic (oxygen-using) branches of bacteria are unlikely to be older than this event—unless fossil or genetic signals strongly suggest an earlier origin. Using Bayesian statistics, they created a model that can override this assumption when data supports it.
part1
Apr 6
Dr. Krishna Kumari Challa
This approach, however, requires making predictions about which lineages were aerobic in the deep past. The team used probabilistic methods to infer which genes ancient genomes contained, and then machine-learning to predict whether they used oxygen.
To make the best use of the fossil record, they leveraged fossils of eukaryotes, whose mitochondria evolved from Alphaproteobacteria and whose chloroplasts evolved from cyanobacteria, to better estimate how and when aerobic bacteria evolved.
Their results indicate that at least three lineages had aerobic lifestyles before the GOE—the earliest nearly 900 million years before—suggesting that a capacity for using oxygen evolved well before its widespread accumulation in the atmosphere.
Intriguingly, these findings point to the possibility that aerobic metabolism may have occurred long before the evolution of oxygenic photosynthesis.
Evidence suggests that the earliest aerobic transition occurred in an ancestor of photosynthetic cyanobacteria, indicating that the ability to utilize trace amounts of oxygen may have allowed the development of genes central to oxygenic photosynthesis.
The study estimates that the last common ancestor of all modern bacteria lived sometime between 4.4 and 3.9 billion years ago, in the Hadean or earliest Archean era. The ancestors of major bacterial phyla are placed in the Archean and Proterozoic eras (2.5–1.8 billion years ago), while many families date back to 0.6–0.75 billion years ago, overlapping with the era when land plants and animal phyla originated.
Notably, once atmospheric oxygen levels rose during the GOE, aerobic lineages diversified more rapidly than their anaerobic counterparts, indicating that oxygen availability played a substantial role in shaping bacterial evolution.
This combined approach of using genomic data, fossils, and Earth's geochemical history brings new clarity to evolutionary timelines, especially for microbial groups that don't have a fossil record.
A geological timescale for bacterial evolution and oxygen adaptation, Science (2025). DOI: 10.1126/science.adp1853
Part 2
Apr 6
Dr. Krishna Kumari Challa
Antibiotic resistance among key bacterial species plateaus over time, study shows
Antibiotic resistance tends to stabilize over time, according to a study published in the open-access journal PLOS Pathogens.
In this study, researchers analyzed drug resistance in more than 3 million bacterial samples collected across 30 countries in Europe from 1998 to 2019. The samples encompassed eight bacterial species important to public health, including Streptococcus pneumoniae, Staphylococcus aureus, Escherichia coli, and Klebsiella pneumoniae.
They found that while antibiotic resistance initially rises in response to antibiotic use, it does not rise indefinitely. Instead, resistance rates reached an equilibrium over the 20-year period in most species.
Antibiotic use contributed to how quickly resistance levels stabilized as well as variability in resistance rates across different countries. But the association between changes in drug resistance and antibiotic use was weak, suggesting that additional, yet unknown, factors are at play.
The study highlights that a continued increase in antibiotic resistance is not inevitable and provides new insights to help researchers monitor drug resistance.
When the researchers examined the dynamics of antibiotic resistance in many important bacterial pathogens across Europe over the past few decades, they often found that resistance frequency initially increases and then stabilizes at an intermediate level. A country's consumption of the antibiotic explained both the speed of the initial increase and the level at which resistance stabilized.
PLOS Pathogens (2025). DOI: 10.1371/journal.ppat.1012945
Apr 6
Dr. Krishna Kumari Challa
Jumbo phages infect cells with a protective cloaking mechanism, researchers discover
In a growing global trend, bacteria are evolving new ways to maneuver around medical treatments for a variety of infections. The rising antibiotic resistance crisis poses a significant public health threat in hospitals and other settings, with infections resulting in millions of deaths in recent years.
Scientists are now looking to bacteriophages—viruses that infect bacteria—and their potential to treat drug-resistant infections. They have begun to look deeper into an intriguing class of large bacteriophages known as "jumbo phages," which exhibit extraordinary features, as possible new agents for treating bacterial infections.
A study by researchers has shed new light on the unusual ways that phages have evolved to infect bacteria. Over millions of years, viruses and bacteria have engaged in a back-and-forth arms race. Viruses develop new ways to infect bacteria, while bacteria counter by evolving a resistance mechanism.
In order to fully realize the potential of jumbo phages and their promise as new therapeutics, researchers must decipher the mechanisms they employ to infect bacteria and evade the host's defenses.
A new study published in the journal Cell Host & Microbe describes the first-of-its-kind discovery of a type of membrane-bound sac, or vesicle, used by jumbo phages of the Chimalliviridae family.
The researchers found that immediately after jumbo phages infect a bacterial cell, they form a structure that shields and hides valuable DNA material. Phages use this genetic material to develop a nucleus inside their bacterial hosts.
The newly discovered compartment, which they named the EPI, or early phage infection vesicle, serves as a type of cloaking device that prevents triggering the bacteria's immune system.
When phages infect a bacterial cell, the EPI vesicle protects the genome of the virus during the early stages of infection, when it is most vulnerable. Bacteria and viruses are often dismissed as simple organisms, but they are actually capable of very sophisticated intracellular warfare, and this study is a new example of that.
Because most phages simply inject their DNA directly into the host, effectively announcing their arrival within the cell, the stealth approach of the Chimalliviridae phages came as a revelation to researchers. The bacteria don't realize that there is a virus inside, producing things that will eventually take over.
Part 1
Apr 6
Dr. Krishna Kumari Challa
Researchers identified curious DNA "dots" within infected cells under a light microscope. Professor Elizabeth Villa's laboratory then used high-end imaging technologies to discover that these dots were tiny vesicles containing viral DNA, with molecular machinery attached to their outer surface.
They found that these vesicles were actually metabolically active, confirming the purpose of the molecular machines hanging outside the vesicles.
The researchers showed not only that these vesicles are making RNA, but also that they are preparing to establish infection by expressing genes important for nucleus formation.
Emily G. Armbruster et al, Sequential membrane- and protein-bound organelles compartmentalize genomes during phage infection, Cell Host & Microbe (2025). DOI: 10.1016/j.chom.2025.03.005
Part 2
Apr 6
Dr. Krishna Kumari Challa
Scientists discover how nanoparticles of toxic metal used in MRI scans infiltrate human tissue
Researchers studying the health risks posed by gadolinium, a toxic rare earth metal used in MRI scans, have found that oxalic acid, a molecule found in many foods, can generate nanoparticles of the metal in human tissues.
In a new paper published in the journal Magnetic Resonance Imaging, a research team sought to explain the formation of the nanoparticles, which have been associated with serious health problems in the kidneys and other organs.
The worst disease caused by MRI contrast agents is nephrogenic systemic fibrosis. People have succumbed after just a single dose. The condition can cause a thickening and hardening of the skin, heart and lungs and cause painful contracture of the joints.
Gadolinium-based contrast agents are injected prior to MRI scans to help create sharper images.
The metal is usually tightly bound to other molecules and is excreted from the body, and most people experience no adverse effects. However, previous research has shown that even in those with no symptoms, gadolinium particles have been found in the kidney and the brain and can be detected in the blood and urine years after exposure.
Scientists are left with intertwined puzzles: Why do some people get sick, when most don't, and how do gadolinium particles become pried loose from the other molecules in the contrast agent?
Almost 50% of the patients had been exposed only a single time, which suggests that something is amplifying the disease signal.
In their study, the research team focused on oxalic acid, which is found in many plant-based foods, including spinach, rhubarb, most nuts and berries and chocolate, because it binds with metal ions. The process helps lead to the formation of kidney stones, which result when oxalate binds with calcium. Meanwhile, oxalic acid also forms in the body when people eat foods or supplements containing vitamin C.
In test tube experiments the researchers found that oxalic acid caused minute amounts of gadolinium to precipitate out of the contrast agent and form nanoparticles, which then infiltrated the cells of various organs.
Some people may form these nanoparticles while others do not, and the difference may lie in their metabolic milieu. They might be in a high-oxalate state, or a state in which molecules are more prone to binding gadolinium, leading to the formation of the nanoparticles. That might be why some individuals have such severe symptoms and a massive disease response, whereas other people are fine.
The finding points to a possible way to mitigate some of the risks associated with MRI scans.
The scientists say they are getting closer to recommendations for helping susceptible individuals.
Ian M. Henderson et al, Precipitation of gadolinium from magnetic resonance imaging contrast agents may be the Brass tacks of toxicity, Magnetic Resonance Imaging (2025). DOI: 10.1016/j.mri.2025.110383
Apr 7
Dr. Krishna Kumari Challa
Marriage linked to higher dementia risk in older adults, 18-year study finds
This one is going to surprise you.
Researchers found that older adults who were divorced or never married had a lower risk of developing dementia over an 18-year period compared to their married peers. Findings suggest that being unmarried may not increase vulnerability to cognitive decline, contrary to long-held beliefs in public health and aging research.
Marriage is often linked to better health outcomes and longer life, but evidence connecting marital status to dementia risk remains inconsistent.
Prior research has not consistently addressed how marital status relates to specific causes of dementia or how factors such as sex, depression, or genetic predisposition may influence these associations.
In the study, "Marital status and risk of dementia over 18 years: Surprising findings from the National Alzheimer's Coordinating Center," published in Alzheimer's & Dementia, researchers conducted an 18-year cohort study to understand whether marital status was associated with dementia risk in older adults.
More than 24,000 participants without dementia at baseline were enrolled from over 42 Alzheimer's Disease Research Centers across the United States through the National Alzheimer's Coordinating Center. Annual clinical evaluations were conducted by trained clinicians using standardized protocols to assess cognitive function and determine diagnoses of dementia or mild cognitive impairment.
To assess long-term risk, researchers followed participants for up to 18.44 years, yielding over 122,000 person-years of data. Marital status at baseline was categorized as married, widowed, divorced, or never married. Dementia risk was analyzed using Cox proportional hazards regression, with married participants serving as the reference group. The models incorporated demographic characteristics, mental and physical health, behavioral history, genetic risk factors, and diagnostic as well as enrollment variables.
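For readers unfamiliar with this kind of analysis, here is a minimal sketch of how hazard ratios by marital status can be estimated with the open-source lifelines library. The data are simulated and the variable names are illustrative assumptions, not the study's dataset or code.

```python
# Minimal sketch of a Cox proportional hazards analysis of dementia risk
# by marital status (simulated data; not the study's dataset or code).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 2000
status = rng.choice(["married", "widowed", "divorced", "never_married"], size=n)

# Simulate event times with lower hazards for unmarried groups (illustrative numbers).
base_hazard = 0.02
hr = {"married": 1.0, "widowed": 0.73, "divorced": 0.66, "never_married": 0.60}
rate = base_hazard * np.array([hr[s] for s in status])
event_time = rng.exponential(1.0 / rate)
censor_time = rng.uniform(5.0, 18.4, size=n)

df = pd.DataFrame({
    "duration": np.minimum(event_time, censor_time),
    "dementia": (event_time <= censor_time).astype(int),
    "age": rng.normal(72, 8, size=n).round(),
    "male": rng.integers(0, 2, size=n),
})
# Dummy-code marital status with "married" as the reference group.
for level in ["widowed", "divorced", "never_married"]:
    df[level] = (status == level).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="dementia")
# exp(coef) is the hazard ratio; e.g. HR = 0.66 reads as a 34% lower hazard
# relative to the married reference group.
cph.print_summary()
```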
Compared to married participants, divorced or never-married participants showed a consistently lower risk of developing dementia over the study period. Dementia diagnoses occurred in 20.1% of the overall sample. Among married participants, 21.9% developed dementia during the study period. Incidence was identical among widowed participants at 21.9% but notably lower for divorced (12.8%) and never-married participants (12.4%).
Hazard ratios showed a reduced risk for all three unmarried groups. In initial models adjusting only for age and sex, divorced individuals had a 34% lower risk of developing dementia (HR = 0.66, 95% CI = 0.59–0.73), never-married individuals had a 40% lower risk (HR = 0.60, 95% CI = 0.52–0.71), and widowed individuals had a 27% lower risk (HR = 0.73, 95% CI = 0.67–0.79).
These associations remained significant for the divorced and never-married groups after accounting for health, behavioral, genetic, and referral-related factors. The association for widowed participants weakened and was no longer statistically significant in the fully adjusted model.
Part 1
Apr 7
Dr. Krishna Kumari Challa
When looking at specific dementia subtypes, all unmarried participants also showed reduced risk for Alzheimer's disease and Lewy body dementia. In contrast, no consistent associations were observed for vascular dementia or frontotemporal lobar degeneration in fully adjusted models. Divorced and never-married groups were also less likely to progress from mild cognitive impairment to dementia.
Risk patterns appeared slightly stronger among men, younger individuals, and participants referred to clinics by health professionals. Yet stratified analyses showed minimal variation, suggesting that the associations held across a wide range of demographic and clinical subgroups.
Researchers concluded that unmarried individuals, particularly those who were divorced or never married, had a lower risk of developing dementia than those who remained married. These associations persisted even after adjusting for physical and mental health, lifestyle factors, genetics, and differences in clinical referral and evaluation.
Rates of Alzheimer's disease and Lewy body dementia were higher in married participants, as was the risk of progression from mild cognitive impairment to dementia.
The findings contrast with prior studies linking unmarried status to increased dementia risk and offer new evidence on how relationship status may relate to cognitive outcomes when diagnosis is measured under standardized conditions.
Selin Karakose et al, Marital status and risk of dementia over 18 years: Surprising findings from the National Alzheimer's Coordinating Center, Alzheimer's & Dementia (2025). DOI: 10.1002/alz.70072
Part 2
Apr 7
Dr. Krishna Kumari Challa
Urban birds host drug-resistant bacteria
Indian cormorants (Phalacrocorax fuscicollis) can spread antimicrobial-resistant bacteria in their droppings. Intrigued by the smell of cormorant droppings, veterinary microbiologist Siddhartha Narayan Joardar and his team analysed some of the bird poo and found a strain of E. coli that produced enzymes that helped it resist certain antibiotics. The birds probably pick up the bacterium by eating fish from ponds contaminated by human wastewater, the team suggests. “Growing evidence shows the presence of antimicrobial-resistant bacteria in common urban birds,” says Joardar, which poses a public health threat if it spills over into humans.
https://www.nature.com/articles/d44151-025-00042-0
https://www.ijah.in/upload/snippet/707_84.pdf
Apr 7
Dr. Krishna Kumari Challa
Obesity severity tied to increased risk across 16 common conditions
New research has found that obesity, particularly severe obesity, is strongly associated with the incidence of 16 common health outcomes. Associations remained consistent across sex and racial groups. Strong associations were observed for obstructive sleep apnea, type 2 diabetes, and metabolic dysfunction-associated steatotic liver disease.
Obesity is a risk factor for adverse health outcomes involving multiple organ systems. Prior studies have analyzed conditions individually, limiting understanding of obesity's total health burden. External validity has also been limited by underrepresentation of individuals with class III obesity and of diverse demographic groups.
In the study, "Associations between Class I, II, or III Obesity and Health Outcomes," published in NEJM Evidence, researchers conducted a longitudinal cohort study to understand how different levels of obesity relate to a wide array of health conditions across a diverse U.S. population.
Participants contributed electronic health records, physical measurements, and survey data. Body mass index (BMI) was calculated at enrollment and used to classify individuals as normal weight, overweight, or obese, with further stratification into obesity classes I, II, and III.
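To make the classification step concrete, here is a minimal sketch using the standard WHO/CDC BMI cut-points; the function name and example values are illustrative, not code from the study.

```python
# Minimal sketch: classify BMI (kg/m^2) into the weight categories used
# in obesity research (standard cut-points; not the study's own code).
def bmi_category(weight_kg: float, height_m: float) -> str:
    bmi = weight_kg / height_m ** 2
    if bmi < 18.5:
        return "underweight"
    if bmi < 25.0:
        return "normal weight"
    if bmi < 30.0:
        return "overweight"
    if bmi < 35.0:
        return "class I obesity"
    if bmi < 40.0:
        return "class II obesity"
    return "class III obesity"

print(bmi_category(95, 1.70))   # BMI ~32.9 -> class I obesity
```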
Sixteen pre-identified health conditions were evaluated: hypertension, type 2 diabetes, hyperlipidemia or dyslipidemia, heart failure, atrial fibrillation, atherosclerotic cardiovascular disease, chronic kidney disease, pulmonary embolism, deep vein thrombosis, gout, liver disease linked to metabolic dysfunction, biliary calculus, obstructive sleep apnea, asthma, gastroesophageal reflux disease, and osteoarthritis.
Part 1
Apr 8
Dr. Krishna Kumari Challa
Cox proportional hazards models were used to estimate the risk of each condition by obesity class, adjusting for sex, age, race or ethnicity, income, and education. Researchers also calculated population-attributable fractions for each condition by obesity class.
Obesity was present in 42.4% of the study population, including 21.2% with class I obesity, 11.3% with class II, and 9.8% with class III. Compared to those with normal weight, individuals with obesity were more likely to be female, Black, have lower income and education levels, and have higher blood pressure and waist-to-hip ratios.
Prevalence and incidence rates increased progressively with higher obesity classes for all 16 health outcomes. Observed associations with class III obesity were strongest for obstructive sleep apnea (hazard ratio 10.94), type 2 diabetes mellitus (7.74), and metabolic dysfunction–associated liver disease (6.72). Weaker associations were found for asthma (2.14), osteoarthritis (2.06), and atherosclerotic cardiovascular disease (1.96).
Obesity was associated with elevated risk across all subgroups, with consistent patterns by sex and race. Population-attributable fractions showed that obesity explained 51.5% of obstructive sleep apnea cases, 36.3% of metabolic liver disease cases, and 14.0% of osteoarthritis cases in the study population.
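For context, a population-attributable fraction can be approximated with Levin's formula from an exposure prevalence and a relative risk. The sketch below is a generic illustration with made-up inputs; it will not reproduce the study's adjusted, class-specific estimates.

```python
# Minimal sketch of Levin's population-attributable fraction:
# PAF = p * (RR - 1) / (1 + p * (RR - 1)),
# where p is exposure prevalence and RR the relative risk.
# Illustrative inputs only; the study's adjusted estimates are computed differently.
def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Example: ~42% obesity prevalence and a hypothetical RR of 3
# would attribute roughly 46% of cases to obesity.
print(f"{attributable_fraction(0.42, 3.0):.1%}")
```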
Increased risk, particularly at higher severity levels, was associated with all 16 health outcomes studied. Risks rose in a stepwise manner across obesity classes, with the highest burden observed among individuals with class III obesity. Findings remained consistent across demographic subgroups and were supported by data from a large, diverse national cohort.
Associations between obesity and several conditions such as sleep apnea, type 2 diabetes, liver disease, and heart failure were strong and statistically robust. Population-attributable fractions indicated that a substantial proportion of these conditions may be preventable through effective obesity management.
Zhiqi Yao et al, Associations between Class I, II, or III Obesity and Health Outcomes, NEJM Evidence (2025). DOI: 10.1056/EVIDoa2400229
Part 2
Apr 8
Dr. Krishna Kumari Challa
Mushroom study identifies most bitter substance known to date
The molecular world of bitter compounds has so far only been partially explored. Researchers have now isolated three new bitter compounds from the mushroom Amaropostia stiptica and investigated their effect on human bitter taste receptors.
In doing so, they discovered one of the potentially most bitter substances known to date. The study results, published in the Journal of Agricultural and Food Chemistry, expand our knowledge of natural bitter compounds and their receptors, thus making an important contribution to food and health research.
The BitterDB database currently contains more than 2,400 bitter molecules. For about 800 of these very chemically diverse substances, at least one bitter taste receptor is specified. However, the bitter compounds recorded are mainly from flowering plants or synthetic sources. Bitter compounds of animal, bacterial or fungal origin, on the other hand, are still rarely represented in the database.
Researchers assume that bitter taste receptors have developed to warn against the consumption of potentially harmful substances. However, not all bitter compounds are toxic or harmful, and not every toxin tastes bitter, as the example of the death cap mushroom toxin shows. But why is that the case?
Studies have also shown that the sensors for bitter substances are not only found in the mouth, but also in organs such as the stomach, intestines, heart and lungs, as well as on certain blood cells. Since we do not "taste" with these organs and cells, the question arises as to the physiological significance of the receptors there.
Comprehensive data collections on bitter compounds and their receptors could help us to find answers to these open questions.
The more well-founded data we have on the various bitter compound classes, taste receptor types and variants, the better we can develop predictive models using systems biology methods to identify new bitter compounds and predict bitter taste receptor-mediated effects. This applies to both food constituents and endogenous substances that activate extraoral bitter taste receptors.
The research team that undertook this study examined the Bitter Bracket (Amaropostia stiptica) as part of a collaborative project. The mushroom is non-toxic, but tastes extremely bitter.
Using modern analytical methods, the research group has succeeded in isolating three previously unknown compounds and elucidating their structures. Using a cellular test system, the researchers then showed that the compounds activate at least one of the approximately 25 human bitter taste receptor types.
Part 1
Apr 8
Dr. Krishna Kumari Challa
Particularly noteworthy is the newly discovered bitter compound oligoporin D, which stimulates the bitter taste receptor type TAS2R46 even at the lowest concentrations (approx. 63 millionths of a gram/liter). To illustrate: the concentration corresponds to one gram of oligoporin D dissolved in about 106 bathtubs of water, where one gram corresponds approximately to the weight of a knife tip of baking soda.
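To check the bathtub comparison, here is the back-of-the-envelope arithmetic, assuming a bathtub holds roughly 150 liters (the bathtub volume is an assumption, not a figure from the paper).

```python
# Back-of-the-envelope check of the dilution comparison
# (assumes a ~150 L bathtub; threshold concentration from the article).
threshold_g_per_L = 63e-6            # ~63 millionths of a gram per liter
litres_for_one_gram = 1.0 / threshold_g_per_L
bathtubs = litres_for_one_gram / 150.0
print(f"{litres_for_one_gram:,.0f} L  ~  {bathtubs:.0f} bathtubs")
# -> about 15,900 L, i.e. roughly 106 bathtubs of water.
```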
Lea M. Schmitz et al, Taste-Guided Isolation of Bitter Compounds from the Mushroom Amaropostia stiptica Activates a Subset of Human Bitter Taste Receptors, Journal of Agricultural and Food Chemistry (2025). DOI: 10.1021/acs.jafc.4c12651
Part 2
Apr 8
Dr. Krishna Kumari Challa
Scientists discover babies can sense their heartbeat and breathing
Body signals such as heartbeat and breathing accompany us constantly, often unnoticed as background noise of our perception. Even in the earliest years of life, these signals are important as they contribute to the development of self-awareness and identity. But can babies perceive their own body signals?
A recent study from the Wiener Kinderstudien Lab at the University of Vienna demonstrates for the first time that babies as young as 3 months can perceive their own heartbeat. The team also investigated, for the first time, infants' perception of their own breathing and found that it develops during the first two years of life. The results are published in the journal eLife.
The perception of internal body signals is closely linked to emotional awareness, mental health, and self-perception. Early in life, the ability to perceive one's own body signals may be particularly important, as it often forms the basis for interactions with caregivers—for example, babies rely on their caregivers to respond appropriately to signs of hunger or discomfort. Moreover, the development of self-awareness and identity partly depends on the perception and experience of one's own body.
The study shows that even 3-month-old babies can perceive their own heartbeat and that this ability remains relatively stable during the first two years of life. At the same time, the findings indicate that the perception of breathing improves significantly during the second year. Interestingly, the ability to perceive the heartbeat and breathing does not appear to be related—much like in adults.
Results showed that even at an early age, babies recognized the correspondence between their own heartbeat or breathing rhythm and animated figures that moved in time with those signals.
Markus R Tünte et al, Respiratory and cardiac interoceptive sensitivity in the first two years of life, eLife (2023). DOI: 10.7554/eLife.91579
Apr 8
Dr. Krishna Kumari Challa
Your season of conception could influence how your body stores fat
Individuals who were conceived in colder seasons are more likely to show higher brown adipose tissue activity, increased energy expenditure, a lower body mass index (BMI), and lower fat accumulation around internal organs, compared with those conceived in warmer seasons, suggests a study published in Nature Metabolism. The findings, based on an analysis involving more than 500 participants, indicate a potential role for meteorological conditions in influencing human physiology.
Although eating habits and exercise are key indicators of fat loss, exposure to cold and warmth also plays a part. In colder temperatures, the body generates more heat (cold-induced thermogenesis) via brown adipose tissue activity and stores less fat in the form of white adipose tissue than it does in hotter temperatures.
Researchers analyzed brown adipose tissue density, activity and thermogenesis in 683 healthy male and female individuals between ages 3 and 78 in Japan, whose parents were exposed to cold temperatures (defined in the study as between 17 October and 15 April) or warm temperatures (between 16 April and 16 October) during the fertilization and birth periods.
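A minimal sketch of the date-window classification described above; the function simply encodes the study's stated cold and warm exposure windows and is not the authors' code.

```python
# Minimal sketch: classify a (month, day) date into the study's
# "cold" (17 Oct - 15 Apr) or "warm" (16 Apr - 16 Oct) exposure window.
def exposure_season(month: int, day: int) -> str:
    after_mid_april = (month, day) >= (4, 16)
    before_mid_october = (month, day) <= (10, 16)
    return "warm" if (after_mid_april and before_mid_october) else "cold"

print(exposure_season(1, 20))   # January 20 -> cold
print(exposure_season(7, 5))    # July 5 -> warm
```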
Individuals who were conceived during the cold season showed higher brown adipose tissue activity, which then correlated with increased energy expenditure, increased thermogenesis, lower visceral fat accumulation and lower BMI into adulthood. More specifically, the researchers show that a key factor in determining brown adipose tissue activity in human offspring is a large daily temperature variation and lower ambient temperature during the pre-conception period.
Takeshi Yoneshiro et al, Pre-fertilization-origin preservation of brown fat-mediated energy expenditure in humans, Nature Metabolism (2025). DOI: 10.1038/s42255-025-01249-2
Apr 8
Dr. Krishna Kumari Challa
The gut microbiome as a predictive factor for kidney rejection
The kidney is one of the most frequently transplanted organs in the world.
For patients with advanced kidney failure, a kidney transplant remains the best treatment option.
The demand is correspondingly high: many patients around the world are on waiting lists for a kidney transplant. A serious risk for patients who have already received a transplant is rejection, a defensive reaction of the body against the foreign tissue that, in the worst case, can lead to complete loss of organ function.
Why transplants are sometimes rejected and sometimes not depends largely on immune mechanisms. The causes are complex and often poorly understood. To help answer this question, researchers have analyzed changes in the composition and function of the gut microbiome of kidney transplant patients.
They discovered an altered signature in the gut microbiome that preceded transplant rejection. This study, published in the American Journal of Transplantation, offers a possible starting point for recognizing the risk of rejection at an early stage.
Our gut is home to countless microorganisms, collectively known as the microbiome, that play an important role in how our immune system works. The majority of these, over 90%, are bacteria. These bacteria and the substances they produce communicate with our body—especially with the cells that protect us from disease. They therefore help to control and strengthen our immune system, which is important for both healthy and sick people.
In patients with chronic kidney disease, the composition of the gut microbiome is severely altered, resulting in lower concentrations of anti-inflammatory short-chain fatty acids (SCFA) and increased concentrations of pro-inflammatory metabolites from the microbiome.
In their study, researchers analyzed the changes in the composition and function of the gut microbiome of patients after kidney transplantation. They found changes in the gut microbiome that were already detectable before the transplant rejection reaction.
It was noticeable that in patients who showed a rejection reaction, bacteria that typically occur in patients with advanced kidney failure, such as Fusobacterium and disease-associated genera such as Streptococcus, increased again. This was not the case in the other group studied, the "non-rejection group."
Overall, the analyses showed that the production potential of short-chain fatty acids in the stool is reduced before kidney rejection. This is indicated by the reduced frequency of bacterial enzymes from which short-chain fatty acids are produced before rejection.
The previously observed dynamic regeneration of the microbiome after kidney transplantation may be significantly disturbed in the case of transplant rejection: prior to rejection, profound changes in the composition of the microbiome occur, characterized by reduced diversity and a low number of SCFA-producing bacterial populations.
The results suggest that the microbiome plays an important role in how the immune system reacts after a kidney transplant. This observation can help to identify the risk of transplant rejection at an early stage or perhaps influence it therapeutically.
Johannes Holle et al, Gut microbiome alterations precede graft rejection in kidney transplantation patients, American Journal of Transplantation (2025). DOI: 10.1016/j.ajt.2025.02.010
Apr 8
Dr. Krishna Kumari Challa
Scientists make water-repellent replacement for toxic 'forever chemicals'
A team of international scientists has invented a substitute for synthetic chemicals, called PFAS (perfluoroalkyl substances), which are widely used in everyday products despite being hazardous to health and the environment.
Until now, it was believed fluorine—the element in such products which forms a highly effective barrier between substances like air and water, making them water repellent—could not easily be replaced because of its unique properties.
But scientists have discovered that the unique "bulky" attribute of fluorine, which makes it especially good at filling space, can actually be replicated in a different, non-toxic form. The findings are published in the Journal of Colloid and Interface Science.
From fire-fighting foam to furniture, food packaging and cookware, to make-up and toilet tissue, PFAS products are everywhere. Despite the risks to human health and the fact that these substances don't degrade and instead persist in the environment, finding an alternative with comparable properties has proven elusive. But after many years of intensive research, researchers have now made a breakthrough.
The results of their discovery are published in a study which unpacks the chemical structure of PFAS and pinpoints the characteristic "bulkiness" they sought to replicate in a safer form. It also demonstrates how non-fluorinated components, containing only non-toxic carbon and hydrogen, could be equally effective replacements.
Through extensive experimentation, it turns out these 'bulky' fragments feature in other common chemical systems like fats and fuels. So scientists took those principles and created modified chemicals which have these positive attributes and are also much safer.
Using their specialized laboratories for chemical synthesis, they substituted the fluorine in PFAS with certain groups containing only carbon and hydrogen. The whole process has taken about 10 years and the implications are very significant not least because PFAS is used in so many different products and situations.
The researchers now plan on using these principles discovered in the lab to design commercially viable versions of PFAS substitutes.
Masanobu Sagisaka et al, New fluorine-free low surface energy surfactants and surfaces, Journal of Colloid and Interface Science (2025). DOI: 10.1016/j.jcis.2025.03.018
Apr 9
Dr. Krishna Kumari Challa
Turning pollution into fuel with record-breaking carbon dioxide to carbon monoxide conversion rates
What if we could transform harmful pollution into a helpful energy source? As we strive towards carbon neutrality, researching energy innovations that reduce pollution is crucial.
Researchers developed a streamlined process for converting carbon dioxide (CO2) into carbon monoxide (CO)—a key precursor for synthetic fuels. Their method achieved record-breaking efficiency, cutting down the required time from 24 hours to just 15 minutes. The study is published in the journal Advanced Science.
CO2-to-CO conversion is currently a hot topic in efforts to address climate change, but conventional techniques have major pitfalls that the scientists wanted to address.
The materials are expensive and unstable, have limited selectivity, and take a long time to prepare. It simply wouldn't be feasible to use them in an actual industrial setting.
With industrial standards in mind, the researchers selected various phthalocyanines (Pc) expected to improve performance [metal-free (H₂Pc), iron (FePc), cobalt (CoPc), nickel (NiPc), and copper (CuPc)]. These were sprayed onto gas diffusion electrodes to directly form crystalline layers of the phthalocyanines on the electrode surface. Ultimately, CoPc—a low-cost pigment and metal complex—showed the highest efficiency in converting CO₂ to CO.
This graffiti-like method of simply spraying the catalyst on a surface reduces the typical processing time down to a mere 15 minutes. Conventional methods require a tedious process of mixing conductive carbon and binders, drying, and heat treatment over 24 hours.
Furthermore, under a current density of 150 mA/cm², the new system maintained stable performance for 144 hours. Using the DigCat Database (the largest experimental electrocatalysis database to date), the researchers confirmed that their catalyst surpassed all previously reported Pc-based catalysts.
Not only is this the best Pc-based catalyst for producing CO to date, but it successfully exceeds the industrial standard thresholds for its reaction rate and stability, say the researchers.
Tengyi Liu et al, Surface Charge Transfer Enhanced Cobalt‐Phthalocyanine Crystals for Efficient CO2‐to‐CO Electroreduction with Large Current Density Exceeding 1000 mA cm−2, Advanced Science (2025). DOI: 10.1002/advs.202501459
Apr 9
Dr. Krishna Kumari Challa
Sperm don't just swim, they screw their way forward
Researchers have discovered that swimming sperm create swirling fluid vortices—shaped like rolling corkscrews—giving them an extra boost in the race to the egg.
The study, published in Cell Reports Physical Science, reveals that these vortices attach to the sperm cell and rotate in sync, adding extra spin that enhances propulsion and helps keep them on a direct path through the fluid.
As the sperm swims, its flagellum (tail) generates a whipping motion that creates swirling fluid currents that could optimize its propulsion in the reproductive tract. What's really fascinating is how these spiral-like 'imprints' in the surrounding fluid attach to the sperm body and rotate in sync, adding extra thrust.
The size and strength of these flow structures could impact sperm interactions with nearby surfaces, other sperm, or even the egg itself.
Farzan Akbaridoust et al, Superhelix flow structures drive sperm locomotion, Cell Reports Physical Science (2025). DOI: 10.1016/j.xcrp.2025.102524
Apr 9
Dr. Krishna Kumari Challa
Eating only during the daytime could protect people from heart risks of shift work, study suggests
A study by researchers suggests that, when it comes to cardiovascular health, food timing could be a bigger risk factor than sleep timing.
Numerous studies have shown that working the night shift is associated with serious health risks, including to the heart. However, a new study suggests that eating only during the daytime could help people avoid the health risks associated with shift work. Results are published in Nature Communications.
Prior research has shown that circadian misalignment—the mistiming of our behavioral cycle relative to our internal body clock—increases cardiovascular risk factors. So what can be done to lower this risk? This new research suggests food timing could be that target.
Animal studies have shown that aligning food timing with the internal body clock could mitigate the health risks of staying awake during the typical rest time.
In the experiments conducted during this study, the cardiovascular risk factors increased after simulated night work compared to the baseline in the participants who were scheduled to eat during the day and night. However, the risk factors stayed the same in the study participants who only ate during the daytime, even though how much and what they ate was not different between the groups—only "when" they ate.
Chellappa SL et al. Daytime eating during simulated night work mitigates changes in cardiovascular risk factors: secondary analyses of a randomized controlled trial, Nature Communications (2025). DOI: 10.1038/s41467-025-57846-y
**
Apr 9
Dr. Krishna Kumari Challa
General anesthesia reduces uniqueness of brain's functional 'fingerprint,' study finds
Past psychology research suggests that different people display characteristic patterns of spontaneous thought, emotions and behaviors. These patterns make the brains of distinct individuals unique, to the point that neuroscientists can often tell them apart based on their neural activity.
Researchers recently carried out a study aimed at investigating how general anesthesia influences the unique neural activity signatures that characterize the brains of different people and animals.
Their findings, published in Nature Human Behavior, show that general anesthesia suppresses each brain's unique functional connectivity patterns (i.e., the connections and communication patterns between different regions of the brain), both in humans and other species.
Every person is unique: they think, feel and act in unique ways. This uniqueness comes from the brain. The way that areas of the brain interact with each other is unique to each individual and can be used like a 'brain fingerprint.'
But when you lose consciousness, for example during deep sleep, your sense of being 'you' is gone. So the question is: what happens to brain fingerprints when we lose consciousness, such as during the artificial sleep induced by general anesthesia?
To explore the effects of anesthesia on the brain's functional connectivity patterns, the researchers employed an imaging technique commonly used in neuroscience research called functional magnetic resonance imaging (fMRI). This technique allows neuroscientists to monitor the activity of different brain regions over time and non-invasively, by measuring changes in blood flow.
They collected fMRI scans from healthy human volunteers before general anesthesia, then when they were unconscious because of the anesthesia, and then again after they recovered consciousness.
For each scan, they measured 'functional connectivity': a representation of how brain regions interact. They used this functional connectivity to obtain 'brain fingerprints,' telling them how easy or difficult it is to tell people apart based on their brain activity.
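To make the 'brain fingerprint' idea concrete, here is a minimal sketch in which functional connectivity is a correlation matrix of region time series and subjects are matched across two sessions by correlating those matrices. The data are synthetic and the pipeline is a deliberate simplification of what the study did.

```python
# Minimal sketch of functional-connectivity "fingerprinting":
# build a region-by-region correlation matrix per subject and session,
# then identify subjects by matching matrices across sessions.
# Synthetic data; a simplification of the study's actual pipeline.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_regions, n_timepoints = 5, 20, 200

def connectivity(ts):
    """Functional connectivity = correlations between region time series."""
    return np.corrcoef(ts.T)

# Give each subject a characteristic mixing of regions, shared across sessions.
subject_mixing = [rng.normal(size=(n_regions, n_regions)) for _ in range(n_subjects)]

def session(subj):
    ts = rng.normal(size=(n_timepoints, n_regions)) @ subject_mixing[subj]
    return connectivity(ts)

session1 = [session(s) for s in range(n_subjects)]
session2 = [session(s) for s in range(n_subjects)]

def similarity(a, b):
    """Correlate the upper triangles of two connectivity matrices."""
    iu = np.triu_indices(n_regions, k=1)
    return np.corrcoef(a[iu], b[iu])[0, 1]

# Identification: each session-2 scan should best match its own session-1 scan.
for s in range(n_subjects):
    scores = [similarity(session2[s], session1[t]) for t in range(n_subjects)]
    print(f"subject {s} identified as {int(np.argmax(scores))}")
```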
Interestingly, the fMRI scans collected by the researchers showed that the brain activity of people while they were under the influence of anesthesia was suppressed. In fact, anesthesia made people almost impossible to tell apart from each other solely by examining their brain activity, which was possible when they were still conscious.
In contrast, using brain fingerprints it is very easy to tell people apart when they are conscious. This effect is not uniform in the brain: it is strongest in the parts of the brain that are uniquely human and mostly distinguish us from other species. The implication is that just as your own conscious experience is unique to you, so are the brain patterns that support it. When consciousness is gone, people's brain activity is also less unique.
Part 1
Apr 10
Dr. Krishna Kumari Challa
The researchers gathered interesting new insights about the effects of general anesthesia on the brain and its unique patterns of neural activity. In the future, the results of their study could inspire further cross-species research looking at the brain before, during and after the administration of anesthetics, which could in turn inform the development of interventions to facilitate the rehabilitation of both people and animals after medical procedures that require anesthesia.
The researchers hope that by learning how the brain reboots consciousness after anesthesia, they can learn how to help recovery of consciousness in patients who suffer from coma and other forms of chronic unconsciousness after a brain injury.
Andrea I. Luppi et al, General anaesthesia decreases the uniqueness of brain functional connectivity across individuals and species, Nature Human Behaviour (2025). DOI: 10.1038/s41562-025-02121-9
Part 2
Apr 10
Dr. Krishna Kumari Challa
In utero opioid use associated with smaller brain regions in newborns
Researchers have found that newborns exposed to opioids in utero exhibited smaller brain volumes in multiple regions compared to unexposed infants.
Cortical gray matter, white matter, deep gray matter, cerebellum, brainstem, and the amygdala all showed reduced size, indicating possible early markers of neurodevelopmental impairment.
Opioid exposure affects a growing number of pregnancies in the world. An estimated 7% of pregnant individuals report using opioids during pregnancy. These substances cross the placenta and may interfere with fetal brain development.
Antenatal opioid exposure has been linked to lower infant cognitive and language scores, increased rates of attention-deficit/hyperactivity disorder, and difficulties with executive functioning.
Confounding influences such as socioeconomic status, co-exposure to other substances, parenting environment, and genetic factors complicate any direct causal interpretation. In some studies, once these environmental risks are controlled, associations between opioid exposure and neurodevelopmental outcomes weaken or disappear.
In the study, "Antenatal Opioid Exposure and Global and Regional Brain Volumes in Newborns," published in JAMA Pediatrics, researchers conducted a multisite, prospective observational study.
Part 1
Apr 10
Dr. Krishna Kumari Challa
A total of 173 newborns with antenatal opioid exposure and 96 unexposed controls were recruited from four sites across the United States. All infants were born at or after 37 weeks of gestation and underwent brain MRI scans before eight weeks of age.
Three-dimensional volumetric MRI scans were acquired during natural sleep using Siemens and Philips 3T scanners. Volumetric data were analyzed using covariance models that controlled for postmenstrual age at scan, sex, birth weight, maternal smoking, and maternal education.
Opioid-exposed newborns had significantly smaller total brain volume compared to unexposed controls: 387.51 cm3 vs. 407.06 cm3. Reductions were also found in cortical gray matter: 167.07 cm3 vs. 176.35 cm3, deep gray matter: 27.22 cm3 vs. 28.76 cm3, white matter: 159.90 cm3 vs. 166.65 cm3, cerebellum: 23.47 cm3 vs. 24.99 cm3, and brainstem: 6.80 cm3 vs. 7.18 cm3.
Amygdala volume was also reduced in opioid-exposed infants. Left amygdala volume measured 0.48 cm3 compared to 0.51 cm3 in controls. Right amygdala volume measured 0.51 cm3 vs. 0.55 cm3 in controls.
Methadone-exposed newborns showed significantly smaller white matter volume. Buprenorphine-exposed newborns showed significantly smaller right amygdala volume.
Newborns exposed to opioids only and those exposed to opioids plus other substances both exhibited significant reductions in cortical and deep gray matter, cerebellum, brainstem, right amygdala, and total brain volume. Polysubstance-exposed newborns also showed reductions in white matter and the left amygdala.
Researchers concluded that antenatal opioid exposure is associated with reductions in global, regional, and tissue-specific brain volumes in newborns. Structural differences were observed across multiple brain regions and varied by type of opioid exposure.
Methadone exposure was linked with reduced white matter volume. Buprenorphine exposure was associated with smaller right amygdala volume. Newborns with polysubstance exposure showed volume reductions in additional areas, including white matter and the left amygdala.
According to the authors, these structural brain differences may represent early biomarkers of later neurodevelopmental dysfunction.
Yao Wu et al, Antenatal Opioid Exposure and Global and Regional Brain Volumes in Newborns, JAMA Pediatrics (2025). DOI: 10.1001/jamapediatrics.2025.0277
Nethra K. Madurai et al, Following the Developing Brain Affected by Opioid Exposure, JAMA Pediatrics (2025). DOI: 10.1001/jamapediatrics.2025.0274
Part 2
Apr 10
Dr. Krishna Kumari Challa
Study finds pet dogs pose significant threat to wildlife and ecosystems
New research into the overlooked environmental impact of pet dogs has found far-reaching negative effects on wildlife, ecosystems and climate.
While ecological damage caused by cats has been extensively studied, the new research found dogs, as the world's most common large carnivores, present a significant and multifaceted environmental threat.
The paper, "Bad Dog? The environmental effects of owned dogs," has been published in Pacific Conservation Biology.
The research found that human-owned, pet dogs disturb and directly harm wildlife, particularly shorebirds, even when leashed.
As well as predatory behavior like chasing wildlife, dogs leave scents, urine and feces, which can disrupt animal behavior long after the dogs have left. Studies have found that animals like deer, foxes and bobcats are less active or completely avoid areas where dogs are regularly walked, even in the absence of the dogs.
Dog waste also contributes to pollution in waterways and inhibits plant growth, while wash-off from chemical treatments used to clean and guard dogs from parasites can add toxic compounds to aquatic environments. In addition, the pet food industry, driven by a vast global dog population, has a substantial carbon, land and water footprint.
Addressing these challenges requires a careful balance between reducing environmental harm and maintaining the positive role of dogs as companions and working animals, say the researchers.
The sheer number of pet dogs globally, combined with uninformed or lax behaviors by some owners, is driving environmental issues that we can no longer ignore.
Bad Dog? The environmental effects of owned dogs, Pacific Conservation Biology (2025). doi.org/10.1071/PC24071
Apr 10
Dr. Krishna Kumari Challa
Handheld electro-shockers can pose risk for individuals with cardiac implants, study finds
Research has found that handheld electro-shockers commonly used for self-defense can potentially interact with cardiac implantable electronic devices (CIEDs) such as pacemakers, putting individuals at risk.
The study in Heart Rhythm shows that the individual interactive risk is primarily based on the applied voltage, but also on the manufacturer and type of implanted CIED.
The use of TASER pistols by security forces has been controversial because of associated health risks for subjects receiving a TASER shock. In contrast to TASER pistols, which shoot electrical darts over a distance of up to 10 meters and transmit electrical currents through large parts of a person's body, a handheld electro-shocker delivers energy superficially by directly applying the device to a target.
The handheld electro-shockers tested in this study are legal to own and carry in most countries and therefore, patients with CIEDs might have an increased risk of coming into contact with these devices. This is the first time a study has evaluated the effects of these electro-shockers on CIEDs.
Effects of handheld electro-shockers on cardiac implantable electronic devices, Heart Rhythm (2025). DOI: 10.1016/j.hrthm.2025.02.025
Apr 10
Dr. Krishna Kumari Challa
Research leads to designation of new type of diabetes: Type 5
Malnutrition-related diabetes—typically affecting lean, malnourished teens and young adults in low- and middle-income countries—is now officially recognized as a distinct form of the disease, known as type 5 diabetes.
Diabetes linked to obesity, known as type 2 diabetes, accounts for the majority of diabetes cases in developing countries. But increasingly, young people are being diagnosed with diabetes caused not by too much food but by too little—by malnutrition, in other words. Type 5 diabetes is estimated to affect 20 to 25 million people worldwide, mainly in Asia and Africa.
Doctors are still unsure how to treat these patients, who often don't live for more than a year after diagnosis.
**
Apr 10
Dr. Krishna Kumari Challa
While often discussed as two main types, diabetes actually encompasses several categories, including type 1, type 2, gestational diabetes, and other specific types like MODY (Maturity-Onset Diabetes of the Young).
Here's a breakdown of the main types of diabetes:
Type 1 Diabetes:
An autoimmune condition where the body's immune system mistakenly attacks and destroys the insulin-producing cells in the pancreas.
Type 2 Diabetes:
A condition where the body becomes resistant to insulin, and the pancreas may not produce enough insulin to maintain normal blood sugar levels.
Gestational Diabetes:
A form of diabetes that develops during pregnancy, often due to hormonal changes that make the body less sensitive to insulin.
Other Specific Types of Diabetes:
MODY (Maturity-Onset Diabetes of the Young): A genetic form of diabetes that typically develops in childhood or early adulthood, often before the age of 25.
LADA (Latent Autoimmune Diabetes in Adults): A type of diabetes that is similar to type 1 but develops more slowly, with the damage to the insulin-producing cells in the pancreas happening gradually.
Neonatal Diabetes: A rare form of diabetes that occurs in newborns or infants.
Secondary Diabetes: Diabetes that develops as a result of other medical conditions or medications, such as cystic fibrosis, pancreatitis, or corticosteroid use.
**
Apr 10
Dr. Krishna Kumari Challa
Researchers identify simple rules for folding the genome
An international team of researchers have identified rules that tell cells how to fold DNA into the tightly packed, iconic X-shaped chromosomes formed during mitosis that help ensure the accurate passing of genetic information between cells during cell division.
Published in the journal Science, these findings illuminate basic biological functions underlying mitosis on a micrometer scale. Understanding how the cells accomplish this critical task may provide important new insights into inheritance and DNA stability and repair, as well as genetic mutations that lead to human diseases such as cancer.
Kumiko Samejima et al, Rules of engagement for condensins and cohesins guide mitotic chromosome formation, Science (2025). DOI: 10.1126/science.adq1709
**
Apr 12
Dr. Krishna Kumari Challa
Venom characteristics of a deadly snake depend on climate ... and can be predicted from local climate
Local climate can be used to predict the venom characteristics of a deadly snake that is widespread in India, helping clinicians to provide targeted therapies for snake bite victims, according to a study published in PLOS Neglected Tropical Diseases.
Russell's viper (Daboia russelii) is found across the Indian subcontinent and is responsible for over 40% of snake bite-related deaths in India each year. Its venom is extremely variable, and snake bites cause different symptoms in different regions of India.
The toxic effects of snake venom are caused by the concentrations of different enzymes, which can be influenced by many factors, including prey availability and climate. However, the factors driving variation in Russell's viper venom are unknown.
To investigate, researchers analyzed venom samples from 115 snakes collected in 34 locations across India. They tested the activity of venom toxins, including enzymes that break down proteins, phospholipids and amino acids.
Next, they used historical climate data to understand the relationship between venom composition and the local climate where the snakes were caught. They found that temperature and rainfall partly explained regional variation in snake venom composition.
Protease activity showed the closest relationship to climate variables, whereas the activity of amino acid oxidases was unaffected by climate. Snakes in drier regions of India tended to have higher protease activity.
The researchers used this data to create a map of expected venom types across Russell's viper's range in India, which could be used to predict the clinical symptoms of snake bites in different regions.
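A minimal sketch of the kind of climate-to-venom regression such a map rests on, assuming a table of per-site protease activity plus temperature and rainfall; the numbers and variable names are illustrative, not the study's data or code.

```python
# Minimal sketch: relate venom protease activity to local climate and
# predict activity at unsampled sites (illustrative data, not the study's).
import numpy as np
from sklearn.linear_model import LinearRegression

# Per-site values: mean temperature (deg C), annual rainfall (mm);
# drier, hotter sites given higher protease activity, as in the article.
climate = np.array([
    [28.0,  400.0],
    [26.5,  650.0],
    [24.0, 1200.0],
    [22.5, 1800.0],
    [27.0,  500.0],
    [23.0, 1500.0],
])
protease_activity = np.array([9.1, 8.2, 6.0, 4.8, 8.8, 5.3])  # arbitrary units

model = LinearRegression().fit(climate, protease_activity)

# Predict expected protease activity for an unsampled hot, dry district.
print(model.predict([[30.0, 350.0]]))
```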
The venom maps developed in this study could help clinicians select the most appropriate treatment for patients with snake bites, or to develop targeted therapies such as toxin-specific antibodies, the researchers say.
PLOS Neglected Tropical Diseases (2025). DOI: 10.1371/journal.pntd.0012949
Apr 13
Dr. Krishna Kumari Challa
'Ozone-climate penalty' adds to India's air pollution
India's cities are already ranked among the world's most polluted, based on concentrations of fine particulate matter in the air. Now new research indicates they are battling rising levels of another life-threatening pollutant—surface ozone.
A study published in the journal Global Transitions says deaths from ozone in India exceeded 50,000 in 2022 and caused losses of around US$16.8 billion—about 1.5 times the government's total health spending that year.
Surface ozone is a toxic gas that not only affects public health but also impacts ecosystems and climate due to the greenhouse effect.
Ozone, a variant of oxygen, occurs at ground level as well as in the upper atmosphere. Formed naturally, the ozone in the stratosphere helps filter out harmful ultraviolet rays that are a part of the sun's radiation. However, surface ozone, also called ground-level ozone, is generated by interactions between pollutants. For example, nitrogen oxides found in vehicular exhaust can react with volatile organic compounds released by industrial activity and waste dumps to produce ozone. Surface ozone is the primary component of smog and can have negative effects on human health and the environment.
Short-term exposure to ozone increases the risk of death from heart disease, stroke, hypertension and respiratory issues, while long-term exposure may decrease lung capacity, induce oxidative stress, suppress immune response and cause lung inflammation.
Climate change, rising temperatures and altered weather patterns can raise surface ozone in a phenomenon described by experts as the "ozone-climate penalty". Factors that affect ozone generation include solar radiation, humidity, precipitation and the presence of precursors—substances that lead to the formation of a pollutant through a chemical reaction—such as methane, nitrogen oxides and volatile organic compounds.
Ozone pollution increases during the hot summer months and declines during the monsoon period from June to September as heavy rains wash out the pollutants, and reduced solar radiation limits photochemical reactions.
Critically, human exposure to fine particulate matter—known as PM2.5—may worsen the health effects of ozone.
PM2.5 refers to particles smaller than 2.5 microns, which can enter the bloodstream through the lungs.
According to the 2024 World Air Quality Report, 11 of the world's 20 cities carrying the highest burden of PM2.5 are in India.
A study (1) published in The Lancet Planetary Health found that the entire population of India lives in areas where PM2.5 levels exceed WHO guidelines.
Part 1
Apr 15
Dr. Krishna Kumari Challa
Besides harming health, high levels of surface ozone reduce photosynthesis by damaging photosystems, CO2 fixation and pigments. This leads to a reduction in carbon assimilation that results in the decline of crop yields.
According to the ozone study, India's rice yield loss due to ozone pollution rose from 7.39 million tons to 11.46 million tons between 2005 and 2020, costing around US$2.92 billion and impacting food security.
Even if precursor emissions remain at current levels, climate change alone could increase surface ozone in the highly polluted regions of South Asia by 2050, with the Indo-Gangetic Plains, one of the most fertile areas in the region, likely to face significant crop yield losses.
Much of the surface ozone generated is removed naturally, for example by monsoon rains. Dealing with the reported increase in ozone levels is best done by reducing the precursors—nitrogen oxides, methane and PM2.5.
G.S. Gopikrishnan et al, Exposure to surface ozone and its associated health effects and economic burden in India, Global Transitions (2025). DOI: 10.1016/j.glt.2025.03.002
Footnotes:
1. Estimating the effect of annual PM2.5 exposure on mortality in India: a difference-in-differences approach, The Lancet Planetary Health (2024). DOI: 10.1016/S2542-5196(24)00248-1
Part 2
Apr 15
Dr. Krishna Kumari Challa
Scientists uncover why carbon-rich meteorites rarely reach Earth
An international team of researchers may have answered one of space science's long-running questions—and it could change our understanding of how life began. Carbon-rich asteroids are abundant in space yet make up less than 5% of meteorites found on Earth.
Published in Nature Astronomy, researchers analyzed close to 8,500 meteoroids and meteorite impacts, using data from 19 fireball observation networks across 39 countries—making it the most comprehensive study of its kind. The paper is titled "Perihelion history and atmospheric survival as primary drivers of Earth's meteorite record."
The team discovered Earth's atmosphere and the sun act like giant filters, destroying fragile, carbon-rich (carbonaceous) meteoroids before they reach the ground.
Scientists have long suspected that weak, carbonaceous material doesn't survive atmospheric entry. What this research shows is many of these meteoroids don't even make it that far: they break apart from being heated repeatedly as they pass close to the sun. The ones that do survive getting cooked in space are more likely to also make it through Earth's atmosphere.
The study also found meteoroids created by tidal disruptions—when asteroids break apart from close encounters with planets—are especially fragile and almost never survive atmospheric entry.
Carbonaceous meteorites are particularly important because they contain water and organic molecules—key ingredients linked to the origin of life on Earth. Carbon-rich meteorites are some of the most chemically primitive materials we can study—they contain water, organic molecules and even amino acids.
However, scientists have so few of them in their meteorite collections that they risk having an incomplete picture of what's actually out there in space and how the building blocks of life arrived on Earth.
Understanding what gets filtered out and why is key to reconstructing our solar system's history and the conditions that made life possible.
Patrick M. Shober et al, Perihelion history and atmospheric survival as primary drivers of the Earth's meteorite record, Nature Astronomy (2025). DOI: 10.1038/s41550-025-02526-6
Apr 15
Dr. Krishna Kumari Challa
Half of the universe's hydrogen gas, long unaccounted for, has been found
Astronomers tallying up all the normal matter—stars, galaxies and gas—in the universe today have come up embarrassingly short of the total matter produced in the Big Bang 13.8 billion years ago. In fact, more than half of normal matter—half of the 15% of the universe's matter that is not dark matter—cannot be accounted for in the glowing stars and gas we see.
New measurements, however, seem to have found this missing matter in the form of very diffuse and invisible ionized hydrogen gas, which forms a halo around galaxies and is more puffed out and extensive than astronomers thought.
The findings not only relieve a conflict between astronomical observations and the best, proven model of the evolution of the universe since the Big Bang, they also suggest that the massive black holes at the centers of galaxies are more active than previously thought, fountaining gas much farther from the galactic center than expected—about five times farther, the team found.
The new measurements are certainly consistent with finding all of the gas.
The results of the study, co-authored by 75 scientists from institutions around the world, have been presented at recent scientific meetings, posted as a preprint on arXiv and are undergoing peer review at the journal Physical Review Letters.
While the still mysterious dark matter makes up the bulk—about 84%—of matter in the universe, the remainder is normal matter. Only about 7% of normal matter is in the form of stars, while the rest is in the form of invisible hydrogen gas—most of it ionized—in galaxies and the filaments that connect galaxies in a kind of cosmic network.
The ionized gas and associated electrons strung out in this filament network are referred to as the warm-hot intergalactic medium, which is too cold and too diffuse to be seen with the usual techniques at astronomers' disposal, and therefore has remained elusive until now.
In the new paper, the researchers estimated the distribution of ionized hydrogen around galaxies by stacking images of approximately 7 million galaxies—all within about 8 billion light-years of Earth—and measuring the slight dimming or brightening of the cosmic microwave background caused by a scattering of the radiation by electrons in the ionized gas, the so-called kinematic Sunyaev-Zel'dovich effect.
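The stacking step can be pictured with a minimal sketch: cut out small patches of a CMB temperature map around each galaxy position and average them, so a tiny per-galaxy signal adds up while noise averages down. The map and catalog below are synthetic placeholders, not the survey pipeline.

```python
# Minimal sketch of "stacking": average cutouts of a CMB temperature map
# around many galaxy positions so a faint per-galaxy signal emerges from
# the noise. Synthetic map and catalog; not the actual ACT/DESI pipeline.
import numpy as np

rng = np.random.default_rng(2)
map_size, cutout, n_gal = 2048, 16, 20000
cmb = rng.normal(0.0, 20.0, size=(map_size, map_size))   # noise in microkelvin

# Galaxy positions, each imprinting a tiny temperature decrement.
ys = rng.integers(cutout, map_size - cutout, size=n_gal)
xs = rng.integers(cutout, map_size - cutout, size=n_gal)
for y, x in zip(ys, xs):
    cmb[y - 2:y + 3, x - 2:x + 3] -= 1.0   # ~1 uK signal, invisible per galaxy

# Stack: average the cutouts centred on every galaxy position.
stack = np.zeros((2 * cutout, 2 * cutout))
for y, x in zip(ys, xs):
    stack += cmb[y - cutout:y + cutout, x - cutout:x + cutout]
stack /= n_gal

print(f"central stacked signal: {stack[cutout, cutout]:.2f} uK")  # ~ -1 uK
```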
B. Hadzhiyska et al, Evidence for large baryonic feedback at low and intermediate redshifts from kinematic Sunyaev-Zel'dovich observations with ACT and DESI photometric galaxies, arXiv (2024). DOI: 10.48550/arxiv.2407.07152
Apr 15
Dr. Krishna Kumari Challa
An alternative to artificial fertilizers: Small peptides enhance symbiosis between plants and fungi
Industrial farming practices often deplete the soil of important nutrients and minerals, leaving farmers to rely on artificial fertilizers to support plant growth. In fact, fertilizer use has more than quadrupled since the 1960s, but this comes with serious consequences. Fertilizer production consumes massive amounts of energy, and its use pollutes the water, air, and land.
Plant biologists are now proposing a new solution to help kick this unsustainable fertilizer habit.
In a new study, the researchers identified a key molecule produced by plant roots, a small peptide called CLE16, that encourages plants and beneficial soil fungi to interact with each other. They say boosting this symbiotic relationship, in which the fungi provide mineral nutrients to the plants through CLE16 supplementation, could be a more natural and sustainable way to encourage crop growth without the use of harmful artificial fertilizers.
The findings are published in the Proceedings of the National Academy of Sciences.
By restoring the natural symbiosis between plant roots and fungi, we could help crops get the nutrients they need without the use of harmful fertilizers.
In this mutually beneficial relationship, soil-borne arbuscular mycorrhizal fungi supply plants with water and phosphorus, which the plants accept in exchange for carbon molecules. These exchanges occur through specialized symbiotic fungal tendrils, called arbuscules, which bury themselves into plant root cells.
Around 80% of plants can trade resources with fungi in this way. However, the traits that support this symbiosis have been weakened over centuries of agricultural plant breeding that prioritized creating crops with the biggest yields.
Scientists say new crop varieties could be bred to strengthen these traits again—an opportunity they intend to explore through the Institute's Harnessing Plants Initiative.
Müller, Lena Maria, A plant CLE peptide and its fungal mimic promote arbuscular mycorrhizal symbiosis via CRN-mediated ROS suppression, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2422215122
Apr 15
Dr. Krishna Kumari Challa
Drug pollution in water
The drugs we take, from anxiety medications to antibiotics, don't simply vanish after leaving our bodies. Many are not fully removed by wastewater treatment systems and end up in rivers, lakes, and streams, where they can linger and affect wildlife in unexpected ways.
The new findings suggest that even tiny traces of drugs in the environment can alter animal behavior in ways that may shape their survival and success in the wild.
A recent global survey of the world's rivers found drugs were contaminating waterways on every continent—even Antarctica. These substances enter aquatic ecosystems not only through our everyday use, as active compounds pass through our bodies and into sewage systems, but also due to improper disposal and industrial effluents.
To date, almost 1,000 different active pharmaceutical substances have been detected in environments worldwide.
Particularly worrying is the fact that the biological targets of many of these drugs, such as receptors in the human brain, are also present in a wide variety of other species. That means animals in the wild can also be affected.
In fact, research over the last several decades has demonstrated that pharmaceutical pollutants can disrupt a wide range of traits in animals, including their physiology, development, and reproduction.
https://www.science.org/doi/10.1126/science.adp7174
Radiation from CT scans could account for 5% of all cancer cases a year, study suggests
CT can save lives, but its potential harms are often overlooked. CT scans expose patients to ionizing radiation, a known carcinogen, and it has long been recognized that the technology carries an increased risk of cancer.
Radiation from CT scans may account for 5% of all cancers diagnosed annually, according to a new study that cautions against overuse of CT and excessive radiation doses.
The danger is greatest for infants, followed by children and adolescents. But adults are also at risk, since they receive the most scans.
Nearly 103,000 cancers are predicted to result from the 93 million CTs that were performed in 2023 alone. This is three to four times more than previous assessments, the researchers say.
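As a rough sanity check, both headline figures follow from simple division; in the sketch below, the figure of roughly 2 million new US cancer diagnoses per year is an outside assumption used for illustration, not a number reported in the study.

```python
# Back-of-the-envelope check of the headline figures (illustrative only; the
# ~2 million new US cancer diagnoses per year is an assumed outside figure).
ct_scans_2023 = 93_000_000            # CT examinations performed in 2023
projected_cancers = 103_000           # lifetime cancers projected from those scans
annual_new_cancer_cases = 2_000_000   # assumed annual new US cancer diagnoses

risk_per_scan = projected_cancers / ct_scans_2023                     # ~0.11%
share_of_annual_cases = projected_cancers / annual_new_cancer_cases   # ~5%

print(f"average projected risk per scan: {risk_per_scan:.2%}")
print(f"share of annual cancer cases:    {share_of_annual_cases:.0%}")
```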
The researchers said some CT scans are unlikely to help patients and are overused, such as those for upper respiratory infections or for headaches without concerning signs or symptoms. They said patients could lower their risk by getting fewer of these scans, or by getting lower-dose scans.
There is currently unacceptable variation in the doses used for CT, with some patients receiving excessive doses.
JAMA Internal Medicine (2025). DOI: 10.1001/jamainternmed.2025.0505
Modified antibody fragment blocks fertilization, paving way for nonhormonal contraceptive
Current methods of contraception rely on hormones, which can cause side effects such as mood changes, headaches or increased risk of blood clots. Blocking fertilization on the surface of the egg has been proposed as an alternative, but antibodies were deemed unsuitable due to possible immune responses triggered by their Fc region.
A new study shows how a small antibody fragment can block fertilization by targeting a key protein on the surface of the egg. This discovery brings a nonhormonal contraceptive one step closer to reality. The study has been published in the Proceedings of the National Academy of Sciences.
In the study, the researchers describe how a modified antibody fragment blocks fertilization by targeting ZP2, a key protein in the egg's outer layer that is involved in both sperm binding and the prevention of polyspermy.
The researchers used X-ray crystallography to map, at the atomic level, the interaction between ZP2 and the antibody IE-3, which is known to prevent fertilization in mice. A modified, smaller version of the antibody (an scFv, or single-chain variable fragment) proved equally effective, blocking fertilization in 100% of IVF tests with mouse eggs. Because it lacks the immune-triggering Fc region of the full antibody, the scFv minimizes potential side effects.
Elisa Dioguardi et al, Structural basis of ZP2-targeted female nonhormonal contraception, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2426057122
Repeated treatment with malaria medication can decrease its effect
In a recently published article in the journal Nature Communications, researchers present results indicating that repeated treatment with piperaquine, an antimalarial drug, can lead to the parasites developing decreased sensitivity to this drug. These findings may impact the use of piperaquine in the future.
Piperaquine is an important antimalarial drug characterized by a long half-life, meaning it remains in the body for several weeks and protects against new infections. This is a key asset of this drug. However, the researchers behind the study have discovered that this advantage can disappear with repeated treatment in areas with high malaria transmission.
The study shows that repeated treatment with dihydroartemisinin-piperaquine can lead to parasites developing drug tolerance by duplicating the plasmepsin 3 (pm3) gene. This allows them to reinfect patients earlier than anticipated, within the period during which the drug was expected to remain protective, reducing piperaquine's effectiveness as a prophylactic medicine.
Leyre Pernaute-Lau et al, Decreased dihydroartemisinin-piperaquine protection against recurrent malaria associated with Plasmodium falciparum plasmepsin 3 copy number variation in Africa, Nature Communications (2025). DOI: 10.1038/s41467-025-57726-5
Airborne microplastics infiltrate plant leaves, raising environmental concerns
Researchers have found that plant leaves can directly absorb microplastics (MPs) from the atmosphere, leading to a widespread presence of plastic polymers in vegetation. Concentrations of polyethylene terephthalate (PET) and polystyrene (PS) were detected in leaves collected from multiple environments, including urban areas and agricultural sites. The study is published in the journal Nature.
Researchers performed field investigations and laboratory simulation experiments to quantify plastic accumulation in plant leaves. Leaf absorption was confirmed as a significant pathway for plastic accumulation in plants, with evidence of translocation into vascular tissue and retention in specialized structures like trichomes.
MPs have been detected throughout terrestrial environments, including soil, water, and air. Laboratory studies have shown that plant roots can absorb MPs, with submicrometer and nanometer-sized particles of PS and polymethylmethacrylate transported upward from the roots of Triticum aestivum, Lactuca sativa, and Arabidopsis thaliana. Root uptake through the apoplastic pathway has been observed, yet translocation to shoots occurs slowly.
Airborne MPs have been measured at concentrations between 0.4 and 2,502 items per cubic meter in urban settings such as Paris, Shanghai, Southern California, and London. Laboratory experiments demonstrated the foliar absorption of nanoparticles including Ag, CuO, TiO2, and CeO2.
Plastic particles have been shown to deposit on plant surfaces, and some studies reported internal accumulation following exposure to high levels of commercial PS models.
At the most polluted sites, concentrations of PET reached tens of thousands of nanograms per gram of dry leaf weight. PS levels followed a similar pattern, with the highest values detected in leaves from the landfill site.
PET and PS were also found in nine leafy vegetables, with open-air crops exhibiting higher levels than greenhouse-grown counterparts. Nano-sized PET and PS were visually confirmed in plant tissue.
Older leaves and outer leaves of vegetables accumulated more plastic than newly grown or inner leaves, suggesting an accumulation over time.
Laboratory exposure of maize to plastic-laden dust resulted in measurable PET absorption in leaf tissue after just one day. PET was not detected in roots or stems under similar root-exposure conditions. Fluorescent and europium-labeled particles enabled visualization of stomatal entry and subsequent migration through the apoplastic pathway.
Abscisic acid was applied to maize roots to chemically induce stomatal closure. Plants exposed to dust laden with PET MPs under these conditions showed significantly lower absorption in leaf tissue, confirming that open stomata are crucial for foliar uptake of airborne MPs.
Plastic particles absorbed through leaves accumulated in measurable quantities across multiple species and sites. Airborne PET and PS entered leaves through stomata and moved along internal pathways to vascular tissues and trichomes.
Concentrations increased with exposure time, environmental levels, and leaf age. Field measurements showed that plastic accumulation in aboveground plant parts exceeds what is typically absorbed through roots.
Widespread detection of plastic polymers and fragments in edible plant parts confirms atmospheric exposure as a significant route of entry into vegetation. Because leaves sit at the base of terrestrial food chains, the accumulation of MPs in them points to potential exposure across multiple levels of the ecosystem.
With plastics around, even vegetarians are not safe!
Ye Li et al, Leaf absorption contributes to accumulation of microplastics in plants, Nature (2025). DOI: 10.1038/s41586-025-08831-4
Willie Peijnenburg, Airborne microplastics enter plant leaves and end up in our food, Nature (2025). DOI: 10.1038/d41586-025-00909-3
Heart valve abnormality is associated with malignant arrhythmias, study reveals
People with a certain heart valve abnormality are at increased risk of severe heart rhythm disorders, even after successful valve surgery. This is according to a new study, "Mitral annular disjunction and mitral valve prolapse: long-term risk of ventricular arrhythmias after surgery", published in the European Heart Journal.
The condition is more common in women and in younger patients with the valve disorder and can, in the worst case, lead to sudden cardiac arrest.
Mitral annular disjunction, MAD, is a heart abnormality in which the mitral valve attachment "slides." In recent years, the condition has been linked to an increased risk of severe cardiac arrhythmias. Until now, it has not been known whether the risk of arrhythmias disappears if MAD is surgically corrected.
MAD is often associated with a heart disease called mitral valve prolapse, which affects 2.5% of the population and causes one of the heart's valves to leak. This can lead to blood being pumped backward in the heart, causing heart failure and arrhythmias. The disease can cause symptoms such as shortness of breath and palpitations.
In the current study, researchers investigated the risk of cardiac arrhythmias in 599 patients with mitral valve prolapse who underwent heart surgery at Karolinska University Hospital between 2010 and 2022. Some 16% of the patients also had the cardiac abnormality MAD.
The researchers were able to show that people with MAD have a significantly higher risk of ventricular arrhythmias, a dangerous type of heart rhythm disorder that, in the worst case, can lead to cardiac arrest.
People with MAD were more likely to be female and were on average eight years younger than those without MAD. They also had more extensive mitral valve disease. Although the surgery was successful in correcting MAD, these patients had more than three times the risk of ventricular arrhythmias during five years of follow-up compared to patients without preoperative MAD.
These results show that it is important to closely monitor patients with this condition, even after a successful operation, say the researchers.
The study has led to new hypotheses that the researchers are now investigating further. One hypothesis is that MAD causes permanent changes in the heart muscle over time. Another is that MAD is a sign of an underlying heart muscle disease.
The researchers are now continuing to study scarring in the heart using MRI (magnetic resonance imaging) and analyze tissue samples from the heart muscle.
Bahira Shahim et al, Mitral annular disjunction and mitral valve prolapse: long-term risk of ventricular arrhythmias after surgery, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf195