AI helps stethoscope detect three heart conditions in 15 seconds
An AI-enabled stethoscope can help doctors pick up three heart conditions in just 15 seconds, according to the results of a real-world trial presented at the European Society of Cardiology's annual congress.
The stethoscope, invented in 1816, is a vital part of a doctor's toolkit, used to listen to sounds within the body. But an AI stethoscope can do much more, including analyzing tiny differences in heartbeat and blood flow, which are undetectable to the human ear, and taking a rapid ECG at the same time.
Researchers now have evidence that an AI stethoscope can increase detection of heart failure at the early stage when someone goes to their GP with symptoms.
A study involving more than 200 GP surgeries, with more than 1.5 million patients, looked at people with symptoms such as breathlessness or fatigue.
Those examined using an AI stethoscope were twice as likely to be diagnosed with heart failure, compared to similar patients who were not examined using the technology.
Patients examined with the AI stethoscope were about 3.5 times more likely to be diagnosed with atrial fibrillation—an abnormal heart rhythm which can increase the risk of having a stroke. They were almost twice as likely to receive a diagnosis of heart valve disease, in which one or more heart valves do not work properly.
Early diagnosis is vital for all three conditions, allowing patients who may need potentially lifesaving medicines to be identified sooner, before they become dangerously unwell.
Mihir A Kelshiker et al, Triple cardiovascular disease detection with an artificial intelligence-enabled stethoscope (TRICORDER): design and rationale for a decentralised, real-world cluster-randomised controlled trial and implementation study, BMJ Open (2025). DOI: 10.1136/bmjopen-2024-098030
A firewall for science: AI tool identifies 1,000 'questionable' journals
A team of computer scientists has developed a new artificial intelligence platform that automatically seeks out "questionable" scientific journals.
The study, published Aug. 27 in the journal Science Advances, tackles an alarming trend in the world of research.
Spam messages arrive from people purporting to be editors at scientific journals, usually ones the researchers have never heard of, offering to publish their papers for a hefty fee.
Such publications are sometimes referred to as "predatory" journals. They target scientists, convincing them to pay hundreds or even thousands of dollars to publish their research without proper vetting. There has been a growing effort among scientists and organizations to vet these journals, but it's like whack-a-mole: catch one, and another appears, usually from the same company, which simply creates a new website under a new name.
The new AI tool automatically screens scientific journals, evaluating their websites and other online data against certain criteria: Do the journals have an editorial board featuring established researchers? Do their websites contain a lot of grammatical errors?
In an era when prominent figures are questioning the legitimacy of science, stopping the spread of questionable publications has become more important than ever before.
In science, you don't start from scratch. You build on top of the research of others. So if the foundation of that tower crumbles, then the entire thing collapses. When scientists submit a new study to a reputable publication, that study usually undergoes a practice called peer review. Outside experts read the study and evaluate it for quality—or, at least, that's the goal.
A growing number of companies have sought to circumvent that process to turn a profit. In 2009, Jeffrey Beall, a librarian at CU Denver, coined the phrase "predatory" journals to describe these publications.
Often, they target researchers outside of the United States and Europe, such as in China, India and Iran—countries where scientific institutions may be young, and the pressure and incentives for researchers to publish are high.
These publishers of predatory journals will say, "If you pay $500 or $1,000, we will review your paper." In reality, they provide no service; they simply take the PDF and post it on their website. One effort to counter them comes from a nonprofit organization called the Directory of Open Access Journals (DOAJ). Since 2003, volunteers at the DOAJ have flagged thousands of journals as suspicious based on six criteria. (Reputable publications, for example, tend to include a detailed description of their peer review policies on their websites.)
But keeping pace with the spread of those publications has been daunting for humans.
To speed up the process, the researchers turned to AI. They trained their system using the DOAJ's data, then asked the AI to sift through a list of nearly 15,200 open-access journals on the internet.
Among those journals, the AI initially flagged more than 1,400 as potentially problematic. When scientists checked them, 350 turned out to be legitimate, but more than 1,000 appeared to be genuinely predatory. The team discovered that questionable journals published an unusually high number of articles. Their authors also listed a larger number of affiliations than authors at more legitimate journals, and cited their own research, rather than the research of other scientists, at an unusually high rate.
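The figures above allow a rough estimate of the screening tool's precision; a minimal sketch, using only the numbers reported here as an illustration:

```python
# Rough precision estimate for the journal-screening AI, using the
# figures reported above; illustrative arithmetic only.
flagged = 1400          # journals the AI initially flagged as problematic
false_positives = 350   # flagged journals that human checks found legitimate

true_positives = flagged - false_positives
precision = true_positives / flagged

print(f"Estimated precision: {precision:.0%}")  # prints "Estimated precision: 75%"
```

In other words, roughly three of every four journals the tool flags survive human scrutiny as genuinely questionable, which is why the researchers still envision it as a screening aid rather than a final arbiter.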
The new AI system isn't publicly accessible, but the researchers hope to make it available to universities and publishing companies soon.
Han Zhuang et al, Estimating the predictability of questionable open-access journals, Science Advances (2025). DOI: 10.1126/sciadv.adt2792
Genetic tools identify lost human relatives, the Denisovans, from the fossil record
New genetic techniques are shedding light on a mysterious part of our family tree—ancient human relatives called the Denisovans that emerged during the Pleistocene epoch, approximately 370,000 years ago.
Little is known about them because so few fossils have been found. The group was discovered by accident in 2010, based solely on DNA analysis of what was thought to be a Neanderthal finger bone found in the Denisova Cave in Siberia. However, it turned out to belong to a previously unknown lineage closely related to Neanderthals.
Since then, only a few additional fragments have been found, so researchers developed a new genetic tool to comb the fossil record for potential Denisovans. Their starting point was the Denisovan genome.
The research is published in the journal Proceedings of the National Academy of Sciences.
The research team's innovative technique detects subtle changes in how Denisovan genes are regulated. These changes were likely responsible for altering the Denisovan skeleton, providing clues to what they looked like. Using this technique, the scientists identified 32 physical traits, most closely related to skull morphology, and used this predicted profile to scan the Middle Pleistocene fossil record, focusing on skulls from that period.

Before applying the method to unknown fossils, the researchers tested its accuracy, using the same technique to predict known physical traits of Neanderthals and chimpanzees with more than 85% accuracy.

Next, they identified 18 skull features they could measure, such as head width and forehead height, and measured these traits on ten ancient skulls. They compared these to skulls from reference groups, including Neanderthals, modern humans and Homo erectus. Using sophisticated statistical tests, they gave each ancient skull a score for how well it matched the Denisovan blueprint.

Two skulls from China, known as the Harbin and Dali specimens, were a close match to the Denisovan profile: the Harbin skull matched 16 traits, while the Dali skull matched 15. A third skull, Kabwe 1, showed a strong link to the Denisovan-Neanderthal branch, which may indicate that it sits at the root of the Denisovan lineage that split from Neanderthals.
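The scoring step can be pictured as comparing each fossil's measured traits against the predicted Denisovan profile and counting agreements. A minimal sketch of that idea; the trait names and values here are hypothetical illustrations, not the study's actual data or statistics:

```python
# Illustrative trait-matching sketch: score a fossil skull by how many
# of its measured traits fall on the side predicted by the profile.
# Trait names and directions below are hypothetical examples.

def match_score(predicted: dict, measured: dict) -> int:
    """Count traits where the measurement agrees with the prediction."""
    return sum(
        1 for trait, direction in predicted.items()
        if measured.get(trait) == direction
    )

# Hypothetical profile: each trait is 'larger' or 'smaller' relative
# to a modern-human baseline.
denisovan_profile = {"head_width": "larger",
                     "forehead_height": "smaller",
                     "palate_length": "larger"}

fossil = {"head_width": "larger",
          "forehead_height": "smaller",
          "palate_length": "larger"}

print(match_score(denisovan_profile, fossil))  # prints 3
```

The study's actual analysis used formal statistical tests over 18 measured features, but the intuition is the same: the more traits that agree with the predicted profile, the stronger the match.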
In their study, the team states that their work may help with the identification of other extinct human relatives.
Nadav Mishol et al, Candidate Denisovan fossils identified through gene regulatory phenotyping, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2513968122
Rare seasonal brain shrinkage in shrews is driven by water loss, not cell death, MRI study reveals
Common shrews are one of only a handful of mammals known to flexibly shrink and regrow their brains. This rare seasonal cycle, known as Dehnel's phenomenon, has puzzled scientists for decades. How can a brain lose volume and regrow months later without sustaining permanent damage?
A study using noninvasive MRI has scanned the brains of shrews undergoing shrinkage, identifying a key molecule involved in the phenomenon: water. The work is published in the journal Current Biology.
In the experiments conducted by researchers, shrews lost 9% of their brain volume during shrinkage, but the cells did not die; they simply lost water.
Normally, brain cells that lose water become damaged and ultimately die, but in shrews, the opposite happened. The cells remained alive and even increased in number.
This finding solves a mystery—and opens up potential pathways for the treatment of human brain disease. Brain shrinkage in shrews matches closely what happens in patients suffering from Alzheimer's, Parkinson's, and other brain diseases.
The study also shows that a specific protein known for regulating water—aquaporin 4—was likely involved in moving water out of the brain cells of shrews. This same protein is present in higher quantities in the diseased brains of humans, too.
That the shrunken brains of shrews share characteristics with diseased human brains makes the case that these miniature mammals, with their ability to reverse brain loss, could also offer clues for medical treatments.
Super corals could help buy time for reefs in a warming world
Coral reefs are in crisis. Rising ocean temperatures driven by climate change are pushing these ecosystems to the brink, with mass bleaching events becoming more frequent and severe.
But what if nature already holds part of the solution?
New research led by UTS demonstrates that corals that naturally thrive in extreme environments, often referred to as "super corals," could be used in restoration efforts to protect vulnerable reef systems.
The research was published in the journal Science Advances and offers compelling evidence that these resilient corals can retain their heat tolerance even after being moved to more stable reef habitats.
Christine D. Roper et al, Coral thermotolerance retained following year-long exposure to a novel environment, Science Advances (2025). DOI: 10.1126/sciadv.adu3858
Longevity gains are slowing, and life expectancy of 100 is unlikely, study finds
A new study finds that life expectancy gains made by high-income countries in the first half of the 20th century have slowed significantly, and that none of the generations born after 1939 will reach 100 years of age on average.
Published in the journal Proceedings of the National Academy of Sciences, the study analyzed life expectancy for 23 high-income and low-mortality countries using data from the Human Mortality Database and six different mortality forecasting methods.
The unprecedented increase in life expectancy we achieved in the first half of the 20th century appears to be a phenomenon we are unlikely to achieve again in the foreseeable future. In the absence of any major breakthroughs that significantly extend human life, life expectancy would still not match the rapid increases seen in the early 20th century even if adult survival improved twice as fast as we predict.
From 1900 to 1938, life expectancy rose by about five and a half months with each new generation. The life expectancy for an individual born in a high-income country in 1900 was an average of 62 years. For someone born just 38 years later in similar conditions, life expectancy had jumped to 80 years on average.
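The per-generation figure is consistent with the endpoints quoted: an 18-year gain spread across 38 one-year birth cohorts works out to roughly five and a half months each. A quick arithmetic check:

```python
# Sanity-check the reported per-generation gain for the 1900-1938 cohorts.
gain_years = 80 - 62     # life expectancy at the two endpoint cohorts
cohorts = 1938 - 1900    # number of one-year birth cohorts spanned

months_per_cohort = gain_years * 12 / cohorts
print(f"{months_per_cohort:.1f} months per cohort")  # prints "5.7 months per cohort"
```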
For those born between 1939 and 2000, the increase slowed to roughly two and a half to three and a half months per generation, depending on the forecasting method. Mortality forecasting methods are statistical techniques that make informed predictions about future lifespans based on past and current mortality information. These models enabled the research team to estimate how life expectancy will develop under a variety of plausible future scenarios.
Researchers forecast that those born in 1980 will not live to be 100 on average, and none of the cohorts in their study will reach this milestone. This decline is largely due to the fact that past surges in longevity were driven by remarkable improvements in survival at very young ages.
At the beginning of the 20th century, infant mortality fell rapidly due to medical advances and other improvements in quality of life for high-income countries. This contributed significantly to the rapid increase in life expectancy. However, infant and child mortality is now so low that the forecasted improvements in mortality in older age groups will not be enough to sustain the previous pace of longevity gains.
While mortality forecasts can never be certain as the future may unfold in unexpected ways—by way of pandemics, new medical treatments or other unforeseen societal changes—this study provides critical insight for governments looking to anticipate the needs of their health care systems, pension planning and social policies.
Although a population-level analysis, this research also has implications for individuals, as life expectancy influences personal decisions about saving, retirement and long-term planning. If life expectancy increases more slowly, as this study shows is likely, both governments and individuals may need to recalibrate their expectations for the future.
José Andrade et al, Cohort mortality forecasts indicate signs of deceleration in life expectancy gains, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2519179122
Researchers have uncovered the genetic triggers that cause male and female bovine embryos to develop differently, as early as seven to eight days after fertilization. The breakthrough in basic science has implications for human health—such as drug development and in vitro fertilization—and for bovine health and dairy industry sustainability.
Scientists have known since the 1990s that male embryos of multiple mammalian species, including humans, grow faster than female embryos, but until now, the underlying reasons were unclear.
In a paper, published in Cell & Bioscience, scientists grew bovine embryos in petri dishes, then analyzed their genetic sex and RNA sequencing, which shows how genes are being expressed.
They discovered significant sex differences in gene regulation: male embryos prioritized genes associated with energy metabolism, causing them to grow faster than their female counterparts. Female embryos emphasized genes associated with sex differentiation, gonad development and inflammatory pathways that are important for future development.
Understanding these fundamental sex differences at the genomic and molecular levels is critically important to improving in vitro fertilization (IVF) success in humans and cows, and in developing treatments that will work for both men and women.
Meihong Shi et al, Sex-biased transcriptome in in vitro produced bovine early embryos, Cell & Bioscience (2025). DOI: 10.1186/s13578-025-01459-x
Scientists find that ice generates electricity when bent
A new study reveals that ice is a flexoelectric material, meaning it can produce electricity when unevenly deformed. Published in Nature Physics, this discovery could have major technological implications while also shedding light on natural phenomena such as lightning.
Frozen water is one of the most abundant substances on Earth. It is found in glaciers, on mountain peaks and in polar ice caps. Although it is a well-known material, studying its properties continues to yield fascinating results.
An international study has shown for the first time that ordinary ice is a flexoelectric material.
In other words, it can generate electricity when subjected to mechanical deformation. This discovery could have significant implications for the development of future technological devices and help to explain natural phenomena such as the formation of lightning in thunderstorms.
The study represents a significant step forward in our understanding of the electromechanical properties of ice.
The scientists discovered that ice generates electric charge in response to mechanical stress at all temperatures. In addition, they identified a thin 'ferroelectric' layer at the surface at temperatures below −113 °C (160 K).
This means that the ice surface can develop a natural electric polarization, which can be reversed when an external electric field is applied—similar to how the poles of a magnet can be flipped. The surface ferroelectricity is a cool discovery in its own right, as it means that ice may have not just one way to generate electricity, but two: ferroelectricity at very low temperatures, and flexoelectricity at higher temperatures all the way to 0 °C.
This property places ice on a par with electroceramic materials such as titanium dioxide, which are currently used in advanced technologies like sensors and capacitors.
One of the most surprising aspects of this discovery is its connection to nature. The results of the study suggest that the flexoelectricity of ice could play a role in the electrification of clouds during thunderstorms, and therefore in the origin of lightning.
It is known that lightning forms when an electric potential builds up in clouds due to collisions between ice particles, which become electrically charged. This potential is then released as a lightning strike. However, the mechanism by which ice particles become electrically charged has remained unclear, since ice is not piezoelectric—it cannot generate charge simply by being compressed during a collision.
However, the study shows that ice can become electrically charged when it is subjected to inhomogeneous deformations, i.e. when it bends or deforms irregularly.
X. Wen et al, Flexoelectricity and surface ferroelectricity of water ice, Nature Physics (2025). DOI: 10.1038/s41567-025-02995-6
Neuroscientists show for first time that precise timing of nerve signals determines how brain processes information
It has long been known that the brain preferentially processes information that we focus our attention on—a classic example is the so-called cocktail party effect.
In an environment full of voices, music, and background noise, the brain manages to concentrate on a single voice. The other noises are not objectively quieter, but are perceived less strongly at that moment.
The brain focuses its processing on the information that is currently relevant—in this case, the voice of the conversation partner—while other signals are received but not forwarded and processed to the same extent.
Neuroscientists provided the first causal evidence of how the brain transmits and processes relevant information. The work is published in the journal Nature Communications.
Whether a signal is processed further in the brain depends crucially on whether it arrives at the right moment—during a short phase of increased receptivity of the nerve cells.
Nerve cells do not work continuously, but in rapid cycles. They are particularly active and receptive for a few milliseconds, followed by a window of lower activity and excitability. This cycle repeats itself approximately every 10 to 20 milliseconds. Only when a signal arrived shortly before the peak of this active phase did it change the behavior of the neurons.
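A 10-to-20-millisecond cycle corresponds to oscillations of roughly 50 to 100 Hz, i.e., the gamma band. A minimal conversion:

```python
# Convert the reported excitability cycle (10-20 ms) into an
# oscillation frequency: 1 / period gives cycles per second (Hz).
for period_ms in (10, 20):
    freq_hz = 1000 / period_ms
    print(f"{period_ms} ms cycle -> {freq_hz:.0f} Hz")
# prints "10 ms cycle -> 100 Hz" then "20 ms cycle -> 50 Hz":
# the gamma band of cortical oscillations
```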
This temporal coordination is the fundamental mechanism of information processing. Attention makes targeted use of this phenomenon by aligning the timing of the nerve cells so that relevant signals arrive precisely in this time window, while others are excluded.
Eric Drebitz et al, Gamma-band synchronization between neurons in the visual cortex is causal for effective information processing and behavior, Nature Communications (2025). DOI: 10.1038/s41467-025-62732-8
A suspected perpetrator who can barely remember his own name; several traffic violations committed by a woman in her mid-fifties who is entirely unreasonable and cannot account for her behavior. Should such cases be brought before a court? And how does the state deal with people who commit acts of violence without meaning to?
Those questions come to mind when one hears such examples from everyday clinical practice with persons suffering from dementia. Neurodegenerative diseases can affect several functions of the brain, ranging from memory in Alzheimer's disease, to behavior, as in behavioral variant frontotemporal dementia, to sensorimotor function in Parkinson's disease.
One of the most striking consequences of these alterations is that persons affected by these diseases may develop criminal risk behavior, such as harassment, traffic violations, theft, or even behavior causing harm to other people or animals, sometimes as the very first sign of disease.
If persons violate social or legal norms due to changes in behavior, personality and cognition, these incidents may have a substantial impact on this person's family and social surroundings and may lead to prosecution.
Researchers investigated this problem in a broad meta-analysis that included 14 studies with 236,360 persons from different countries (U.S.A., Sweden and Finland, Germany and Japan). The work is published in the journal Translational Psychiatry.
Their systematic literature review revealed that criminal risk behavior is more prevalent early in the disease course than in the general population, but declines thereafter to below population levels. Criminal behavior committed for the first time in mid-life could therefore be an indicator of incident dementia, warranting the earliest possible diagnosis and therapy.
The team shows that the prevalence of criminal risk behaviors was highest in behavioral variant frontotemporal dementia (>50%), followed by semantic variant primary progressive aphasia (40%), but rather low in vascular dementia and Huntington's disease (15%), Alzheimer's disease (10%), and lowest in Parkinsonian syndromes (<10%).
Criminal risk behavior in frontotemporal dementia is most likely caused by the neurodegenerative disease itself. Most of the patients showed criminal risk behavior for the first time in their life and had no previous records of criminal activity.
Prevalence appears higher in frontotemporal dementia and Alzheimer's disease early in the disease course, before diagnosis, than in the general population—presumably in pre-stages such as mild behavioral or cognitive impairment. It declines thereafter, ultimately falling below the general population's level after diagnosis.
The researchers also found that criminal risk behavior in dementia is more frequent in men than in women. After diagnosis, men showed four times more criminal risk behavior than women in frontotemporal dementia and seven times more in Alzheimer's disease.
In a second study, the working group also identified the changes in the brain that are associated with criminal behavior in frontotemporal dementia. This study was published in Human Brain Mapping.
Persons exhibiting criminal behavior showed larger atrophy in the temporal lobe, indicating that criminal behavior might be caused by so-called disinhibition, i.e., the loss of normal restraints or inhibitions, leading to a reduced ability to regulate one's behavior, impulses, and emotions.
Disinhibition can manifest as acting impulsively, without thinking through the consequences, and behaving in ways that are inappropriate for the situation.
One has to prevent further stigmatization of persons with dementia. Of note, most offenses committed were minor, such as indecent behavior, traffic violations, theft and damage to property, although physical violence or aggression also occurred.
Hence, sensitivity to this topic as a possible early sign of dementia, along with the earliest possible diagnosis and treatment, is of the utmost importance. Beyond early diagnosis and treatment, the researchers say, adaptations of the legal system should be discussed, such as raising awareness of offenses caused by these diseases and taking them into account in sentencing and in prisons.
Matthias L. Schroeter et al, Criminal minds in dementia: A systematic review and quantitative meta-analysis, Translational Psychiatry (2025). DOI: 10.1038/s41398-025-03523-z
Karsten Mueller et al, Criminal Behavior in Frontotemporal Dementia: A Multimodal MRI Study, Human Brain Mapping (2025). DOI: 10.1002/hbm.70308
Depression linked to presence of immune cells in the brain's protective layer
Immune cells released from bone marrow in the skull in response to chronic stress and adversity could play a key role in symptoms of depression and anxiety, say researchers.
The discovery—found in a study in mice—sheds light on the role that inflammation can play in mood disorders and could help in the search for new treatments, in particular for those individuals for whom current treatments are ineffective.
Around 1 billion people will be diagnosed with a mood disorder such as depression or anxiety at some point in their life. While there may be many underlying causes, chronic inflammation—when the body's immune system stays active for a long time, even when there is no infection or injury to fight—has been linked to depression. This suggests that the immune system may play an important role in the development of mood disorders.
Previous studies have highlighted how high levels of an immune cell known as a neutrophil, a type of white blood cell, are linked to the severity of depression. But how neutrophils contribute to symptoms of depression is currently unclear.
In research published in Nature Communications, a team of scientists tested the hypothesis that chronic stress can lead to the release of neutrophils from bone marrow in the skull. These cells then collect in the meninges—membranes that cover and protect your brain and spinal cord—and contribute to symptoms of depression.
As it is not possible to test this hypothesis in humans, the team used mice exposed to chronic social stress. In this experiment, an 'intruder' mouse is introduced into the home cage of an aggressive resident mouse. The two have brief daily physical interactions and can otherwise see, smell, and hear each other.
The researchers found that prolonged exposure to this stressful environment led to a noticeable increase in levels of neutrophils in the meninges, and that this was linked to signs of depressive behavior in the mice. Even after the stress ended, the neutrophils persisted longer in the meninges than they did in the blood.
Analysis confirmed the researchers' hypothesis that the meningeal neutrophils—which appeared subtly different from those found in the blood—originated in the skull.
Further analysis suggested that long-term stress triggered a type of immune system 'alarm warning' known as type I interferon signaling in the neutrophils. Blocking this pathway—in effect, switching off the alarm—reduced the number of neutrophils in the meninges and improved behavior in the depressed mice.
The reason why there are high levels of neutrophils in the meninges is unclear. One explanation could be that they are recruited by microglia, a type of immune cell unique to the brain.
Another possible explanation is that chronic stress may cause microhemorrhages, tiny leaks in brain blood vessels, and that neutrophils—the body's 'first responders'—arrive to fix the damage and prevent any further damage. These neutrophils then become more rigid, possibly getting stuck in brain capillaries and causing further inflammation in the brain.
These new findings show that these 'first responder' immune cells leave the skull bone marrow and travel to the brain, where they can influence mood and behavior.
Stacey L. Kigar et al, Chronic social defeat stress induces meningeal neutrophilia via type I interferon signaling in male mice, Nature Communications (2025). DOI: 10.1038/s41467-025-62840-5
Hyperactive blood platelets linked to heart attacks despite standard drug therapy
Scientists have discovered a particularly active subgroup of blood platelets that may cause heart attacks in people with coronary heart disease despite drug therapy. This discovery may open up new prospects for customized therapies. The research results are published in the European Heart Journal and were presented on August 31 at Europe's largest cardiology congress.
Despite modern medication, many people with coronary heart disease continue to suffer heart attacks. A research team has now found a possible explanation for this and has also provided a very promising therapeutic approach.
Their work focused on so-called "reticulated platelets": particularly young, RNA-rich and reactive thrombocytes that play a central role in the formation of blood clots in patients with coronary heart disease.
The researchers were able to comprehensively characterize the biological mechanisms of these cells for the first time, which helps explain why these platelets remain overactive in many patients even under optimal therapy: these young platelets carry a particularly large number of activating signaling pathways, making them more sensitive and reactive than mature platelets.
The results come from a multidimensional analysis, using various methods, of the blood of over 90 patients with coronary heart disease. Among other things, the researchers found signaling pathways that can be used to specifically inhibit the activity of such blood cells, in particular two target structures called GPVI and PI3K. Initial laboratory experiments confirmed that inhibiting these signaling pathways can reduce platelet hyperactivity.
Kilian Kirmes et al, Reticulated platelets in coronary artery disease: a multidimensional approach unveils prothrombotic signalling and novel therapeutic targets, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf694
8,000 years of human activities have caused wild animals to shrink and domestic animals to grow
Humans have caused wild animals to shrink and domestic animals to grow, according to a new study.
Researchers studied tens of thousands of animal bones from Mediterranean France covering the last 8,000 years to see how the size of both types of animals has changed over time.
Scientists already know that human choices, such as selective breeding, influence the size of domestic animals, and that environmental factors also impact the size of both. However, little is known about how these two forces have influenced the size of wild and domestic animals over such a prolonged period. This latest research, published in the Proceedings of the National Academy of Sciences, fills a major gap in our knowledge.
The scientists analyzed more than 225,000 bones from 311 archaeological sites in Mediterranean France. They took thousands of measurements of things like the length, width, and depth of bones and teeth from wild animals, such as foxes, rabbits and deer, as well as domestic ones, including goats, cattle, pigs, sheep and chickens.
But the researchers didn't just focus on the bones. They also collected data on the climate, the types of plants growing in the area, the number of people living there and what they used the land for. And then, with some sophisticated statistical modeling, they were able to track key trends and drivers behind the change in animal size.
The research team's findings reveal that for around 7,000 years, wild and domestic animals evolved along similar paths, growing and shrinking together in sync with their shared environment and human activity. However, all that changed around 1,000 years ago. Their body sizes began to diverge dramatically, especially during the Middle Ages.
Domestic animals started to get much bigger as they were being actively bred for more meat and milk. At the same time, wild animals began to shrink in size as a direct result of human pressures, such as hunting and habitat loss. In other words, human activities replaced environmental factors as the main force shaping animal evolution.
Cyprien Mureau et al, 8,000 years of wild and domestic animal body size data reveal long-term synchrony and recent divergence due to intensified human impact, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2503428122
World's oldest host-associated bacterial DNA found
An international team led by researchers at the Center for Paleogenetics has uncovered microbial DNA preserved in woolly and steppe mammoth remains dating back more than one million years. The analyses reveal some of the world's oldest microbial DNA ever recovered, as well as the identification of bacteria that possibly caused disease in mammoths. The findings are published in Cell.
Researchers analyzed microbial DNA from 483 mammoth specimens, of which 440 were sequenced for the first time. Among them was a steppe mammoth that lived about 1.1 million years ago. Using advanced genomic and bioinformatic techniques, the team distinguished microbes that once lived alongside the mammoths from those that invaded their remains after death.
Genetic mechanism reveals how plants coordinate flowering with light and temperature conditions
Plants may be stuck in one place, but the world around them is constantly changing. In order to grow and flower at the right time, plants must constantly collect information about their surroundings, measuring things like temperature, brightness, and length of day. Still, it's unclear how all this information gets combined to trigger specific behaviors.
Scientists have discovered a genetic mechanism for how plants integrate light and temperature information to control their flowering.
In a study published in Nature Communications, the researchers found an interaction between two genetic pathways that signals the presence of both blue light and low temperature. This genetic module helps plants fine-tune their flowering to the optimal environmental conditions.
In one pathway, blue light activates the PHOT2 blue light receptor, with help from partner protein NPH3. In another pathway, low ambient temperature allows a transcription factor called CAMTA2 to boost the expression of a gene called EHB1. Importantly, EHB1 is known to interact with NPH3, placing NPH3 at the convergence point of the blue light and low temperature signals. This genetic architecture effectively works as a coincidence detector, linking the presence of blue light and low temperature to guide the switch to flowering.
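The coincidence-detector logic described above can be sketched as a toy boolean model. The gene names come from the study, but the threshold temperature and the wiring of the functions are illustrative assumptions, not measured parameters:

```python
def ehb1_boosted(temp_c: float, low_temp_cutoff: float = 16.0) -> bool:
    """CAMTA2 boosts EHB1 expression only at low ambient temperature
    (the 16 degC cutoff is an assumed, illustrative value)."""
    return temp_c <= low_temp_cutoff

def nph3_active(blue_light: bool) -> bool:
    """NPH3 acts as the partner of the PHOT2 blue light receptor."""
    return blue_light

def flowering_signal(blue_light: bool, temp_c: float) -> bool:
    """NPH3 sits at the convergence point: both inputs must coincide
    for the module to signal toward flowering."""
    return nph3_active(blue_light) and ehb1_boosted(temp_c)
```

Only the coincidence of both inputs (blue light present and temperature low) produces the flowering signal; either input alone does not.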
The study describes an important component of plant growth, reproduction, and information processing. The newly discovered genetic module allows plants to have fine control over their flowering in low temperatures. Understanding this system will now help scientists optimize crop growth under changing environmental conditions.
Adam Seluzicki et al, Genetic architecture of a light-temperature coincidence detector, Nature Communications (2025). DOI: 10.1038/s41467-025-62194-y
Generative AI designs molecules that kill drug-resistant bacteria
What if generative AI could design life-saving antibiotics, not just art and text? In a new Cell Biomaterials paper, researchers introduce AMP-Diffusion, a generative AI tool used to create tens of thousands of new antimicrobial peptides (AMPs)—short strings of amino acids, the building blocks of proteins—with bacteria-killing potential. In animal models, the most potent AMPs performed as well as FDA-approved drugs, without detectable adverse effects.
While past work has shown that AI can successfully sort through mountains of data to identify promising antibiotic candidates, this study adds to a small but growing number of demonstrations that AI can invent antibiotic candidates from scratch.
Using AMP-Diffusion, the researchers generated the amino-acid sequences for about 50,000 candidates. They used AI to filter the results.
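The generate-then-filter workflow can be sketched in miniature. Here random sequences stand in for AMP-Diffusion's samples, and a crude charge/hydrophobicity heuristic stands in for the AI filter; both are hypothetical stand-ins, not the study's actual models:

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def generate_candidates(n: int, length: int = 20, seed: int = 0) -> list[str]:
    """Stand-in for the generative model: random peptide sequences."""
    rng = random.Random(seed)
    return ["".join(rng.choice(AMINO_ACIDS) for _ in range(length))
            for _ in range(n)]

def toy_amp_score(seq: str) -> float:
    """Hypothetical filter score: many known AMPs are cationic and
    hydrophobic, so count K/R and hydrophobic residues per position."""
    cationic = sum(seq.count(a) for a in "KR")
    hydrophobic = sum(seq.count(a) for a in "AILMFWV")
    return (cationic + hydrophobic) / len(seq)

def top_candidates(n_generate: int = 1000, n_keep: int = 46) -> list[str]:
    """Generate a large pool, then keep the best-scoring few for synthesis."""
    pool = generate_candidates(n_generate)
    return sorted(pool, key=toy_amp_score, reverse=True)[:n_keep]
```

The real pipeline generated ~50,000 sequences and applied learned filters before narrowing to the 46 peptides that were synthesized and tested.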
After synthesizing the 46 most promising candidates, the researchers tested them in human cells and animal models. Treating skin infections in mice, two AMPs demonstrated efficacy on par with levofloxacin and polymyxin B, FDA-approved drugs used to treat antibiotic-resistant bacteria, without adverse effects.
In the future, the researchers hope to refine AMP-Diffusion, giving it the capability to denoise with a more specific goal in mind, like treating a particular type of bacterial infection, among other features.
How persister bacteria survive antibiotics: mRNA storage in aggresomes
Persister bacteria, which enter a dormant state to survive antibiotics that target active cells, are linked to over 20% of chronic infections and resist current treatments.
Understanding their survival mechanisms could lead to new ways to combat recurring infections. This study utilized E. coli bacteria as a model and found that prolonged stress leads to the increased formation of aggresomes (membraneless droplets) and the enrichment of mRNA (molecules that carry instructions for making proteins) within them, which enhances the ability of E. coli to survive and recover from stress.
Researchers used multiple approaches, including imaging, modeling, and transcriptomics, to show that prolonged stress leading to ATP (fuel for all living cells) depletion in Escherichia coli results in increased aggresome formation, their compaction, and enrichment of mRNA within aggresomes compared to the cytosol (the liquid inside of cells).
Transcript length was longer in aggresomes compared to the cytosol. Mass spectrometry showed exclusion of mRNA ribonuclease (an enzyme that breaks down RNA) from aggresomes, which was due to negative charge repulsion.
Experiments with fluorescent reporters and disruption of aggresome formation showed that mRNA storage within aggresomes promoted translation and was associated with reduced lag phases during growth after stress removal. These findings suggest that mRNA storage within aggresomes confers an advantage for bacterial survival and recovery from stress.
This breakthrough illuminates how persister cells survive and revive after antibiotic treatment.
By targeting aggresomes, new drugs could disrupt this protective mechanism, preventing bacteria from storing mRNA and making them more vulnerable to elimination, thus reducing the risk of infection relapse.
Linsen Pei et al, Aggresomes protect mRNA under stress in Escherichia coli, Nature Microbiology (2025). DOI: 10.1038/s41564-025-02086-5
Deforestation reduces rainfall by 74% and increases temperatures by 16% in Amazon during dry season, study finds
Deforestation in the Brazilian Amazon is responsible for approximately 74.5% of the reduction in rainfall and 16.5% of the temperature increase in the biome during the dry season. For the first time, researchers have quantified the respective impacts of vegetation loss and global climate change on the forest.
The study provides fundamental results to guide effective mitigation and adaptation strategies.
How climate change and deforestation interact in the transformation of the Amazon rainforest, Nature Communications (2025). DOI: 10.1038/s41467-025-63156-0
Magnetic fields in infant universe may have been billions of times weaker than a fridge magnet
The magnetic fields that formed in the very early stages of the universe may have been billions of times weaker than a small fridge magnet, with strengths comparable to magnetism generated by neurons in the human brain. Yet, despite such weakness, quantifiable traces of their existence still remain in the cosmic web, the visible cosmic structures connected throughout the universe.
These conclusions emerge from a study using around a quarter of a million computer simulations.
The research, recently published in Physical Review Letters, specifies both possible and maximum values for the strengths of primordial magnetic fields. It also offers the possibility of refining our knowledge of the early universe and the formation of the first stars and galaxies.
Removing yellow stains from fabric with blue light
Sweat and food stains can ruin your favorite clothes. But bleaching agents such as hydrogen peroxide or dry-cleaning solvents that remove stains aren't options for all fabrics, especially delicate ones. Now, researchers in ACS Sustainable Chemistry & Engineering report a simple way to remove yellow stains using a high-intensity blue LED light. They demonstrate the method's effectiveness at removing stains from orange juice, tomato juice and sweat-like substances on multiple fabrics, including silk.
The method utilizes visible blue light in combination with ambient oxygen, which acts as the oxidizing agent to drive the photobleaching process. This approach avoids the use of harsh chemical oxidants typically required in conventional bleaching methods, making it inherently more sustainable.
Yellow clothing stains are caused by squalene and oleic acid from skin oils and sweat, as well as natural pigments like beta carotene and lycopene, present in oranges, tomatoes and other foods. UV light is a potential stain-removing alternative to chemical oxidizers like bleach and hydrogen peroxide, but it can damage delicate fabrics.
The same researchers previously determined that a high-intensity blue LED light could remove yellow color from aged resin polymers, and they wanted to see whether blue light could also break down yellow stains on fabric without causing damage.
Initially, they exposed vials of beta carotene, lycopene and squalene to high-intensity blue LED light for three hours. All the samples lost color, and spectroscopic analyses indicated that oxygen in the air helped the photobleaching process by breaking bonds to produce colorless compounds.
Next, the team applied squalene onto cotton fabric swatches. After heating the swatches to simulate aging, they treated the samples for 10 minutes, by soaking them in a hydrogen peroxide solution or exposing them to the blue LED or UV light. The blue light reduced the yellow stain substantially more than hydrogen peroxide or UV exposure. In fact, UV exposure generated some new yellow-colored compounds. Additional tests showed that the blue LED treatment lightened squalene stains on silk and polyester without damaging the fabrics. The method also reduced the color of other stain-causing substances, including aged oleic acid, orange juice and tomato juice, on cotton swatches.
High-intensity blue LED light is a promising way to remove clothing stains, but the researchers say they want to do additional colorfastness and safety testing before commercializing a light system for home and industrial use.
Tomohiro Sugahara et al, Environmentally Friendly Photobleaching Method Using Visible Light for Removing Natural Stains from Clothing, ACS Sustainable Chemistry & Engineering (2025). DOI: 10.1021/acssuschemeng.5c03907
Scientists develop the world's first 6G chip, capable of 100 Gbps speeds
Sixth generation, or 6G, wireless technology is one step closer to reality with news that researchers have unveiled the world's first "all-frequency" 6G chip. The chip is capable of delivering mobile internet speeds exceeding 100 gigabits per second (Gbps).
6G technology is the successor to 5G and promises to bring about a massive leap in how we communicate. It will offer benefits such as ultra-high-speed connectivity, ultra-low latency and AI integration that can manage and optimize networks in real-time. To achieve this, 6G networks will need to operate across a range of frequencies, from standard microwaves to much higher frequency terahertz waves. Current 5G technology utilizes a limited set of radio frequencies, similar to those used in previous generations of wireless technologies.
The new chip is no bigger than a thumbnail, measuring 11 millimeters by 1.7 millimeters. It operates across a wide frequency range, from 0.5 GHz to 115 GHz, a spectrum that traditionally takes nine separate radio systems to cover. While the development of a single all-frequency chip is a significant breakthrough, the technology is still in its early stages of development. Many experts expect that commercial 6G networks will begin to roll out around 2030.
Zihan Tao et al, Ultrabroadband on-chip photonics for full-spectrum wireless communications, Nature (2025). DOI: 10.1038/s41586-025-09451-8
Mother's germs may influence her child's brain development
Our bodies are colonized by a teeming, ever-changing mass of microbes that help power countless biological processes. Now, a new study has identified how these microorganisms get to work shaping the brain before birth.
Researchers at Georgia State University studied newborn mice specifically bred in a germ-free environment to prevent any microbe colonization. Some of these mice were immediately placed with mothers with normal microbiota, allowing microbes to be transferred rapidly.
That gave the study authors a way to pinpoint just how early microbes begin influencing the developing brain. Their focus was on the paraventricular nucleus (PVN), a region of the hypothalamus tied to stress and social behavior, already known to be partly influenced by microbe activity in mice later in life.
At birth, a newborn body is colonized by microbes as it travels through the birth canal.
Birth also coincides with important developmental events that shape the brain.
When the germ-free mice were just a handful of days old, the researchers found fewer neurons in their PVN, even when microbes were introduced after birth. That suggests the changes caused by these microorganisms happen in the uterus during development.
These neural modifications last, too: the researchers also found that the PVN was neuron-light even in adult mice, if they'd been raised to be germ-free. However, the cross-fostering experiment was not continued into adulthood (around eight weeks).
The details of this relationship still need to be worked out and researched in greater detail, but the takeaway is that microbes – specifically the mix of microbes in the mother's gut – can play a notable role in the brain development of their offspring. Rather than shunning our microbes, we should recognize them as partners in early life development. They're helping build our brains from the very beginning, say the researchers. While this has only been shown in mouse models so far, there are enough biological similarities between mice and humans that there's a chance we're also shaped by our mother's microbes before we're born.
One of the reasons this matters is because practices like Cesarean sections and the use of antibiotics around birth are known to disrupt certain types of microbe activity – which may in turn be affecting the health of newborns.
Mutations driving evolution are informed by the genome, not random, study suggests
A study published in the Proceedings of the National Academy of Sciences shows that an evolutionarily significant mutation in the human APOL1 gene arises not randomly but more frequently where it is needed to prevent disease. The finding fundamentally challenges the notion that evolution is driven by random mutations, and the authors tie their results to a new theory that, for the first time, offers a concept for how mutations arise.
The potential implications for biology, medicine, computer science, and perhaps even our understanding of the origin of life itself, are far-reaching.
A random mutation is a genetic change whose chance of arising is unrelated to its usefulness. Only once these supposed accidents arise does natural selection vet them, sorting the beneficial from the harmful. For over a century, scientists have thought that a series of such accidents has built up over time, one by one, to create the diversity and splendor of life around us.
However, it has never been possible to examine directly whether mutations in the DNA originate at random or not. Mutations are rare events relative to the genome's size, and technical limitations have prevented scientists from seeing the genome in enough detail to track individual mutations as they arise naturally.
To overcome this, researchers developed a new ultra-accurate detection method and recently applied it to the famous HbS mutation, which protects from malaria but causes sickle-cell anemia in homozygotes.
Results showed that the HbS mutation did not arise at random, but emerged more frequently exactly in the gene and population where it was needed. Now, they report the same nonrandom pattern in a second mutation of evolutionary significance.
The new study examines the de novo origination of a mutation in the human APOL1 gene that protects against a form of trypanosomiasis, a disease that devastated central Africa in historical times and until recently caused tens of thousands of deaths there per year. The same mutation increases the risk of chronic kidney disease in people who carry two copies.
If the APOL1 mutation arises by chance, it should arise at a similar rate in all populations, and only then spread under Trypanosoma pressure. However, if it is generated nonrandomly, it may actually arise more frequently where it is useful.
Results supported the nonrandom pattern: the mutation arose much more frequently in sub-Saharan Africans, who have faced generations of endemic disease, compared to Europeans, who have not, and in the precise genomic location where it confers protection.
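The core comparison here is between de novo mutation rates in two populations. A minimal sketch of such a test, using a standard conditional binomial comparison of two event rates with entirely hypothetical counts (not the study's data or its actual statistical method):

```python
from math import comb

def binom_sf(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

def rate_comparison_p(x1: int, n1: int, x2: int, n2: int) -> float:
    """One-sided conditional test that population 1's per-genome rate
    exceeds population 2's. Under the null of equal rates, given
    x1 + x2 total events, x1 ~ Binomial(x1 + x2, n1 / (n1 + n2))."""
    return binom_sf(x1, x1 + x2, n1 / (n1 + n2))

# Hypothetical counts: 20 de novo events in 1M genomes vs. 2 in 1M.
p_value = rate_comparison_p(20, 1_000_000, 2, 1_000_000)
```

With equal sample sizes and a 10:1 skew in event counts, the null of equal rates is strongly rejected; a reversed skew gives a large p-value, as expected for a one-sided test.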
The new findings challenge the notion of random mutation fundamentally. Historically, there have been two basic theories for how evolution happens— random mutation and natural selection, and Lamarckism—the idea that an individual directly senses its environment and somehow changes its genes to fit it. Lamarckism has been unable to explain evolution in general, so biologists have concluded that mutations must be random.
This new theory moves away from both of these concepts, proposing instead that two inextricable forces underlie evolution. While the well-known external force of natural selection ensures fitness, a previously unrecognized internal force operates inside the organism, putting together genetic information that has accumulated over generations in useful ways.
To illustrate, take fusion mutations, a type of mutation where two previously separate genes fuse to form a new gene. As for all mutations, it has been thought that fusions arise by accident: one day, a gene moves by error to another location and by chance fuses to another gene, once in a great while, leading to a useful adaptation. But researchers have recently shown that genes do not fuse at random.
Instead, genes that have evolved to be used together repeatedly over generations are the ones that are more likely to get fused. Because the genome folds in 3D space, bringing genes that work together to the same place at the same time in the nucleus with their chromatin open, molecular mechanisms fuse these genes rather than others. An interaction involving complex regulatory information that has gradually evolved over generations leads to a mutation that simplifies and "hardwires" it into the genome.
In the paper, they argue that fusions are a specific example of a more general and extensive internal force that applies across mutation types. Rather than local accidents arising at random locations in the genome disconnected from other genetic information, mutational processes put together multiple meaningful pieces of heritable information in many ways.
Genes that evolved to interact tightly are more likely to be fused; single-letter RNA changes that evolved to occur repeatedly across generations via regulatory phenomena are more likely to be "hardwired" as point mutations into the DNA; genes that evolved to interact in incipient networks, each under its own regulation, are more likely to be invaded by the same transposable element that later becomes a master-switch of the network, streamlining regulation, and so on. Earlier mutations influence the origination of later ones, forming a vast network of influences over evolutionary time.
Previous studies examined mutation rates as averages across genomic positions, masking the probabilities of individual mutations. But these new studies suggest that, at the scale of individual mutations, each mutation has its own probability, and the causes and consequences of mutation are related. At each generation, mutations arise based on the information that has accumulated in the genome up to that time point, and those that survive become a part of that internal information.

This vast array of interconnected mutational activity gradually hones in over the generations on mutations relevant to the long-term pressures experienced, leading to long-term directed mutational responses to specific environmental pressures, such as the malaria and Trypanosoma-protective HbS and APOL1 mutations. New genetic information arises in the first place, they argue, as a consequence of the fact that mutations simplify genetic regulation, hardwiring evolved biological interactions into ready-made units in the genome.

This internal force of natural simplification, together with the external force of natural selection, act over evolutionary time like combined forces of parsimony and fit, generating co-optable elements that themselves have an inherent tendency to come together into new, emergent interactions. Co-optable elements are generated by simplification under performance pressure, and then engage in emergent interactions; the source of innovation is at the system level. Understood in the proper timescale, an individual mutation does not arise at random, nor does it invent anything in and of itself. The potential depth of evolution from this new perspective can be seen by examining other networks.
For example, the gene fusion mechanism—where genes repeatedly used together across evolutionary time are more likely to be fused together by mutation—echoes chunking, one of the most basic principles of cognition and learning in the brain, where pieces of information that repeatedly co-occur are eventually chunked into a single unit.
Yet fusions are only one instance of a broader principle: diverse mutational processes respond to accumulated information in the genome, combining it over generations into streamlined instructions. This view recasts mutations not as isolated accidents, but as meaningful events in a larger, long-term process.
Daniel Melamed et al, De novo rates of a Trypanosoma-resistant mutation in two human populations, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424538122
New tool enables rapid, large-scale profiling of disease-linked RNA modifications
Researchers have developed a powerful tool capable of scanning thousands of biological samples to detect transfer ribonucleic acid (tRNA) modifications—tiny chemical changes to RNA molecules that help control how cells grow, adapt to stress and respond to diseases such as cancer and antibiotic‑resistant infections. This tool opens up new possibilities for science, health care and industry—from accelerating disease research and enabling more precise diagnostics, to guiding the development of more effective medical treatments for diseases such as cancer and antibiotic‑resistant infections.
Cancer and infectious diseases are complicated health conditions in which cells are forced to function abnormally by mutations in their genetic material or by instructions from an invading microorganism. The SMART-led research team is among the world's leaders in understanding how the epitranscriptome—the over 170 different chemical modifications of all forms of RNA—controls growth of normal cells and how cells respond to stressful changes in the environment, such as loss of nutrients or exposure to toxic chemicals. The researchers are also studying how this system is corrupted in cancer or exploited by viruses, bacteria and parasites in infectious diseases. Current molecular methods used to study the expansive epitranscriptome and all of the thousands of different types of modified RNA are often slow, labor‑intensive, costly and involve hazardous chemicals which limit research capacity and speed.
To solve this problem, the SMART team developed a new tool that enables fast, automated profiling of tRNA modifications—molecular changes that regulate how cells survive, adapt to stress and respond to disease. This capability allows scientists to map cell regulatory networks, discover novel enzymes and link molecular patterns to disease mechanisms, paving the way for better drug discovery and development, and more accurate disease diagnostics.
SMART's automated system was specially designed to profile tRNA modifications across thousands of samples rapidly and safely. Unlike traditional methods—which are costly, labor‑intensive and use toxic solvents such as phenol and chloroform—this tool integrates robotics to automate sample preparation and analysis, eliminating the need for hazardous chemical handling and reducing costs. This advancement increases safety, throughput and affordability, enabling routine large‑scale use in research and clinical labs.
Jingjing Sun et al, tRNA modification profiling reveals epitranscriptome regulatory networks in Pseudomonas aeruginosa, Nucleic Acids Research (2025). DOI: 10.1093/nar/gkaf696
Fruit fly research shows that mechanical forces drive evolutionary change
A tissue fold known as the cephalic furrow, an evolutionary novelty that forms between the head and the trunk of fly embryos, plays a mechanical role in stabilizing embryonic tissues during the development of the fruit fly Drosophila melanogaster.
Researchers have integrated computer simulations with their experiments and showed that the timing and position of cephalic furrow formation are crucial for its function, preventing mechanical instabilities in the embryonic tissues.
The work appears in Nature.
The increased mechanical instability caused by embryonic tissue movements may have contributed to the origin and evolution of the cephalic furrow genetic program. This shows that mechanical forces can shape the evolution of new developmental features.
Mechanical forces shape tissues and organs during the development of an embryo through a process called morphogenesis. These forces cause tissues to push and pull on each other, providing essential information to cells and determining the shape of organs. Despite the importance of these forces, their role in the evolution of development is still not well understood.
Animal embryos undergo tissue flows and folding processes, involving mechanical forces, that transform a single-layered blastula (a hollow sphere of cells) into a complex multi-layered structure known as the gastrula. During early gastrulation, some flies of the order Diptera form a tissue fold at the head-trunk boundary called the cephalic furrow. This fold is a specific feature of a subgroup of Diptera and is therefore an evolutionary novelty of flies.
The cephalic furrow is especially interesting because it is a prominent embryonic invagination whose formation is controlled by genes, but that has no obvious function during development. The fold does not give rise to specific structures, and later in development, it simply unfolds, leaving no trace. With their experiments, the researchers show that the absence of the cephalic furrow leads to an increase in the mechanical instability of embryonic tissues and that the primary sources of mechanical stress are cell divisions and tissue movements typical of gastrulation. They demonstrate that the formation of the cephalic furrow absorbs these compressive stresses. Without a cephalic furrow, these stresses build up, and outward forces caused by cell divisions in the single-layered blastula cause mechanical instability and tissue buckling.
This intriguing physical role gave the researchers the idea that the cephalic furrow may have evolved in response to the mechanical challenges of dipteran gastrulation, with mechanical instability acting as a potential selective pressure. Flies either feature a cephalic furrow, or if they lack one, display widespread out-of-plane division, meaning the cells divide downward to reduce the surface area. Both mechanisms act as mechanical sinks to prevent tissue collision and distortion. These findings uncover empirical evidence for how mechanical forces can influence the evolution of innovations in early development. The cephalic furrow may have evolved through genetic changes in response to the mechanical challenges of dipteran gastrulation. They show that mechanical forces are not just important for the development of the embryo but also for the evolution of its development.
Some sugar substitutes linked to faster cognitive decline
Some sugar substitutes may come with unexpected consequences for long-term brain health, according to a study published in Neurology. The study examined seven low- and no-calorie sweeteners and found that people who consumed the highest amounts experienced faster declines in thinking and memory skills compared to those who consumed the lowest amounts.
The link was even stronger in people with diabetes. While the study showed a link between the use of some artificial sweeteners and cognitive decline, it did not prove that they were a cause.
The artificial sweeteners examined in the study were aspartame, saccharin, acesulfame-K, erythritol, xylitol, sorbitol and tagatose. These are mainly found in ultra-processed foods like flavored water, soda, energy drinks, yogurt and low-calorie desserts. Some are also used as a standalone sweetener.
Low- and no-calorie sweeteners are often seen as a healthy alternative to sugar, however the new findings suggest certain sweeteners may have negative effects on brain health over time.
The study included 12,772 adults from across Brazil. The average age was 52, and participants were followed for an average of eight years.
After adjusting for factors such as age, sex, high blood pressure and cardiovascular disease, researchers found that people who consumed the highest amount of sweeteners showed a 62% faster decline in overall thinking and memory skills than those who consumed the lowest amount, the equivalent of about 1.6 years of aging. Those in the middle group declined 35% faster than the lowest group, equivalent to about 1.3 years of aging.
When researchers broke the results down by age, they found that people under the age of 60 who consumed the highest amounts of sweeteners showed faster declines in verbal fluency and overall cognition when compared to those who consumed the lowest amounts. They did not find links in people over 60. They also found that the link to faster cognitive decline was stronger in participants with diabetes than in those without diabetes.
When looking at individual sweeteners, consuming aspartame, saccharin, acesulfame-K, erythritol, sorbitol and xylitol was associated with a faster decline in overall cognition, particularly in memory.
They found no link between the consumption of tagatose and cognitive decline.
SeeMe detects hidden signs of consciousness in brain injury patients
SeeMe, a computer vision tool tested by Stony Brook University researchers, was able to detect low-amplitude, voluntary facial movements in comatose acute brain injury patients days before clinicians could identify overt responses.
There are ways to detect covert consciousness with EEG and fMRI, though these are not always available. Many acute brain injury patients appear unresponsive in early care, with signs that are so small, or so infrequent, that they are simply missed.
In the study, "Computer vision detects covert voluntary facial movements in unresponsive brain injury patients," published in Communications Medicine, investigators designed SeeMe to quantify tiny facial movements in response to auditory commands, with the objective of identifying early, stimulus-evoked behavior.
A single-center prospective cohort included 37 comatose acute brain injury patients and 16 healthy volunteers, aged 18–85, enrolled at Stony Brook University Hospital. Patients had initial Glasgow Coma Scale scores ≤8 and no prior neurologically debilitating diagnoses.
SeeMe tagged facial pores at ~0.2 mm resolution and tracked movement vectors while subjects heard three commands: open your eyes, stick out your tongue, and show me a smile.
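As a toy illustration of the kind of threshold rule such a pipeline might apply to its tracked movement vectors (the actual SeeMe implementation is not described here), the sketch below flags a stimulus-evoked response when tracked-point displacements rise above baseline noise. The function name, threshold, and all numbers are invented for the example.

```python
import numpy as np

def detect_movement(baseline, stimulus, k=3.0):
    """Flag a stimulus-evoked response when the mean displacement of
    tracked facial points exceeds baseline noise by k standard deviations.
    Inputs are arrays of per-trial mean displacements (in mm)."""
    mu, sigma = baseline.mean(), baseline.std()
    return bool(stimulus.mean() > mu + k * sigma)

rng = np.random.default_rng(0)
# Resting micro-movements (~0.05 mm) vs. a small but consistent evoked response
rest = rng.normal(0.05, 0.01, 100)
evoked = rng.normal(0.25, 0.05, 20)
print(detect_movement(rest, evoked))  # prints True
```

The point of a statistical threshold like this is that movements far too small for a bedside observer can still be separated from noise once they are measured consistently across trials.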
Results indicate earlier and broader detection with SeeMe. Eye-opening was detected on average 9.1 (± 5.5) days after injury by SeeMe versus 13.2 (± 11.4) days by clinical examination, yielding a 4.1-day lead.
SeeMe identified eye-opening in 30 of 36 patients (85.7%) compared with 25 of 36 (71.4%) by clinical exam. Among patients without an endotracheal tube obscuring the mouth, SeeMe detected mouth movements in 16 of 17 (94.1%).
In seven patients with analyzable mouth videos and clinical command following, SeeMe identified reproducible mouth responses 8.3 days earlier on average. Amplitude and frequency of SeeMe-positive responses correlated with discharge outcomes on the Glasgow Outcome Scale-Extended, with significant Kruskal-Wallis results for both features.
A deep neural network classifier trained on SeeMe-positive trials identified command specificity with 81% accuracy for eye-opening and an overall accuracy of 65%. Accuracy for the other commands was lower, at 37% for tongue movement and 47% for smile, indicating strong specificity for eye-opening.
Authors conclude that acute brain injury patients can exhibit low-amplitude, stimulus-evoked facial movements before overt signs appear at bedside, suggesting that many covertly conscious patients may have motor behavior currently undetected by clinicians.
Earlier detection could inform family discussions, guide rehabilitation timing, and serve as a quantitative signal for future monitoring or interface-based communication strategies, while complementing standard examinations.
Xi Cheng et al, Computer vision detects covert voluntary facial movements in unresponsive brain injury patients, Communications Medicine (2025). DOI: 10.1038/s43856-025-01042-y
Pancreatic insulin disruption triggers bipolar disorder-like behaviors in mice, study shows
Bipolar disorder is a psychiatric disorder characterized by alternating episodes of depression (i.e., low mood and a loss of interest in everyday activities) and mania (i.e., a state in which arousal and energy levels are abnormally high). On average, an estimated 1–2% of people worldwide are diagnosed with bipolar disorder at some point during their lives.
Bipolar disorder can be highly debilitating, particularly if left untreated. Understanding the neural and physiological processes that contribute to its emergence could thus be very valuable, as it could inform the development of new prevention and treatment strategies.
In addition to experiencing periodic changes in mood, individuals diagnosed with this disorder often exhibit some metabolic symptoms, including changes in their blood sugar levels. While some previous studies reported an association between blood sugar control mechanisms and bipolar disorder, the biological link between the two has not yet been uncovered.
Researchers recently carried out a study aimed at further exploring the link between insulin secretion and bipolar disorder-like behaviors, particularly focusing on the expression of the gene RORβ.
Their findings, published in Nature Neuroscience, show that overexpression of this gene in a subtype of pancreatic cells disrupts the release of insulin, which in turn prompts a feedback loop with a region of the brain known as the hippocampus, producing alternating depression-like and mania-like behaviors in mice.
"The results in mice point to a pancreas–hippocampus feedback mechanism by which metabolic and circadian factors cooperate to generate behavioral fluctuations, and which may play a role in bipolar disorder," the authors wrote.
Yao-Nan Liu et al, A pancreas–hippocampus feedback mechanism regulates circadian changes in depression-related behaviors, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02040-y.
Shampoo-like gel could help chemo patients keep their hair
Cancer fighters know that losing their hair is often part of the battle, but researchers have developed a shampoo-like gel that has been tested in animal models and could protect hair from falling out during chemotherapy treatment.
Baldness from chemotherapy-induced alopecia causes personal, social and professional anxiety for everyone who experiences it. Currently, there are few solutions—the only ones that are approved are cold caps worn on the patient's head, which are expensive and have their own extensive side effects.
The gel is a hydrogel, which absorbs a lot of water and provides long-lasting delivery of drugs to the patient's scalp. The hydrogel is designed to be applied to the patient's scalp before the start of chemotherapy and left on their head as long as the chemotherapy drugs are in their system—or until they are ready to easily wash it off.
During chemotherapy treatment, chemotherapeutic drugs circulate throughout the body. When these drugs reach the blood vessels surrounding the hair follicles on the scalp, they kill or damage the follicles, releasing the hair shaft and causing the hair to fall out. The gel, containing the drugs lidocaine and adrenalone, restricts blood flow to the scalp, preventing most of the chemotherapy drugs from reaching the hair follicles. This dramatic reduction in drug reaching the follicles helps protect the hair and prevent it from falling out.
To support practical use of this "shampoo," the gel is temperature responsive. At body temperature, it is thicker and clings to the patient's hair and scalp. When exposed to slightly cooler temperatures, it becomes thinner and more liquid, so it can be easily washed away.
Romila Manchanda et al, Hydrogel-based drug delivery system designed for chemotherapy-induced alopecia, Biomaterials Advances (2025). DOI: 10.1016/j.bioadv.2025.214452
A research team has identified a direct molecular link between aging and neurodegeneration by investigating how age-related changes in cell signaling contribute to toxic protein aggregation.
Although aging is the biggest risk factor for neurodegenerative diseases, scientists still don't fully understand which age-associated molecular alterations drive their development.
Using the small nematode worm Caenorhabditis elegans, a research team studied a signaling pathway that leads to pathological protein accumulation with age. Their new paper is published in Nature Aging.
The team focused on the aging-associated protein EPS8 and the signaling pathways it regulates. This protein is known to accumulate with age and to activate harmful stress responses that lead to a shorter lifespan in worms.
Researchers found that increased levels of EPS8, and the activation of its signaling pathways, drive pathological protein aggregation and neurodegeneration—typical features of age-associated neurodegenerative diseases such as Huntington's disease and amyotrophic lateral sclerosis (ALS). By reducing EPS8 activity, the group was then able to prevent the build-up of the toxic protein aggregates and preserve neuronal function in worm models of these two diseases.
Importantly, EPS8 and its signaling partners are evolutionarily conserved and also present in human cells. Similar to what they achieved in the worms, the team was able to prevent the accumulation of toxic protein aggregates in human cell models of Huntington's disease and ALS by reducing EPS8 levels.
Seda Koyuncu et al, The aging factor EPS8 induces disease-related protein aggregation through RAC signaling hyperactivation, Nature Aging (2025). DOI: 10.1038/s43587-025-00943-w
Cooling pollen sunscreen can block UV rays without harming corals
Materials scientists have invented the world's first pollen-based sunscreen derived from Camellia flowers.
In experiments, the pollen-based sunscreen absorbed and blocked harmful ultraviolet (UV) rays as effectively as commercially available sunscreens, which commonly use minerals like titanium dioxide (TiO2) and zinc oxide (ZnO). In laboratory tests on corals, commercial sunscreen induced coral bleaching in just two days, leading to coral death by day six. Each year, an estimated 6,000 to 14,000 tons of commercial sunscreen make their way into the ocean, as people wash it off in the sea or it flows in from wastewater.
In contrast, the pollen-based sunscreen did not affect the corals, which remained healthy even up to 60 days.
In other tests, the pollen-based sunscreen also demonstrated its ability to reduce surface skin temperature, thereby helping to keep the skin cool in the presence of simulated sunlight.
Why we slip on ice: Physicists challenge centuries-old assumptions
For over a hundred years, schoolchildren around the world have learned that ice melts when pressure and friction are applied. When you step out onto an icy pavement in winter, you can slip because of the pressure exerted by your body weight through the sole of your (still warm) shoe. But it turns out that this explanation misses the mark.
New research reveals that it's not pressure or friction that causes ice to become slippery, but rather the interaction between molecular dipoles in the ice and those on the contacting surface, such as a shoe sole.
The work is published in the journal Physical Review Letters. This insight overturns a paradigm established nearly two centuries ago by James Thomson, brother of Lord Kelvin, who proposed that pressure and friction contribute to ice melting alongside temperature.
It turns out that neither pressure nor friction plays a particularly significant part in forming the thin liquid layer on ice.
Instead, computer simulations by researchers reveal that molecular dipoles are the key drivers behind the formation of this slippery layer, which so often causes us to lose our footing in winter. But what exactly is a dipole? A molecular dipole arises when a molecule has regions of partial positive and partial negative charge, giving the molecule an overall polarity that points in a specific direction.
Achraf Atila et al, Cold Self-Lubrication of Sliding Ice, Physical Review Letters (2025). DOI: 10.1103/1plj-7p4z
A specific bacterial infection during pregnancy that can cause severe harm to the unborn brain has been identified for the first time, in a finding that could have huge implications for prenatal health.
Previous studies have disagreed on whether fetal exposure to Ureaplasma parvum has a detrimental effect on brain development, so newborn health specialists set out to determine the answer, once and for all.
Ureaplasma parvum Serovars 3 and 6 are among the most common types that are isolated in pregnancies complicated by infection/inflammation, so they tested them individually in a pre-clinical model and the results were clear.
They showed that long-term exposure to a specific subtype of Ureaplasma (parvum 6) resulted in loss of cells that are responsible for the production of myelin (the fatty sheath that insulates nerve cells in the brain).
This resulted in less myelin production and a disruption to the architecture of myelin in the brain. This sort of disruption to myelin production can have a devastating and lifelong impact on neurodevelopment, cognition and motor function.
By contrast, they also showed that exposure to another subtype of Ureaplasma (parvum 3) had little effect on neurodevelopment.
Many of the babies affected by this infection in utero are at a much higher risk of preterm birth and the chronic intensive care and inflammation associated with being born too early, the researchers say.
Dima Abdu et al, Intra-amniotic infection with Ureaplasma parvum causes serovar-dependent white matter damage in preterm fetal sheep, Brain Communications (2025). DOI: 10.1093/braincomms/fcaf182
After early-life stress, astrocytes can affect behaviour
Astrocytes in the lateral hypothalamus region of the brain, an area involved in the regulation of sleep and wakefulness, play a key role in neuron activity in mice and affect their behavior, researchers have found.
By broadening medical science's understanding of cerebral mechanisms, the discovery could someday help in the treatment and prevention of depression in humans, the researchers say.
According to the scientific literature, early-life stress leads to a five-fold increase in the risk of developing a mental-health disorder as an adult, notably causing treatment-resistant disorders.
As brain cells, astrocytes are sensitive to variations in the blood concentration of metabolites and, in response to changes in the blood, astrocytes can modulate the extent of their interaction with neurons, their neighboring cells.
In mice, those changes are particularly responsive to the level of corticosterone, the stress hormone in the rodents' blood.
In adult mice that experienced early-life stress, researchers saw abnormally high levels of corticosterone. The impact of stress on behavior also differed by sex: females were less active at night, while males were hyperactive during the day.
In people with depression who have experienced a similar type of stress, these sex differences have also been observed.
Lewis R. Depaauw-Holt et al, A divergent astrocytic response to stress alters activity patterns via distinct mechanisms in male and female mice, Nature Communications (2025). DOI: 10.1038/s41467-025-61643-y
Is this the future of food? 'Sexless' seeds that could transform farming
Scientists are tinkering with plant genes to create crops that seed their own clones, with a host of benefits for farmers.
Sacks of seeds without the sex
Agriculture is on the brink of a revolution: grain crops that produce seeds asexually. The technology — trials of which could start sprouting as early as next month — exploits a quirk of nature called apomixis, in which plants create seeds that produce clones of the parent. Apomixis could slash the time needed to create new varieties of crops, and give smallholder farmers access to affordable high-yielding sorghum (Sorghum bicolor) and cowpea (Vigna unguiculata). But before self-cloning crops can be commercialized, the technology must run the regulatory gauntlet.
Macaws learn by watching interactions of others, a skill never seen in animals before
One of the most effective ways we learn is through third-party imitation, where we observe and then copy the actions and behaviors of others. Until recently, this was thought to be a unique human trait, but a new study published in Scientific Reports reveals that macaws also possess this ability.
Second-party imitation is already known to exist in the animal kingdom. Parrots are renowned for their ability to imitate human speech and actions, and primates, such as chimpanzees, have learned to open a puzzle box by observing a human demonstrator. But third-party imitation is different because it involves learning by observing two or more individuals interact rather than by direct instruction.
Scientists chose blue-throated macaws for this study because they live in complex social groups in the wild, where they need to learn new behaviors to fit in quickly. Parrots, like macaws, are also very smart and can do things like copy sounds and make tools.
To find out whether macaws could learn through third-party imitation, researchers worked with two groups of them, performing more than 4,600 trials. In the test group, these birds watched another macaw perform one of five different target actions in response to a human's hand signals. These were fluffing up feathers, spinning its body, vocalizing, lifting a leg or flapping its wings. In the control group, macaws were given the same hand signals without ever seeing another bird perform the actions.
The results were clear. The test group learned more actions than the control group, meaning that watching the interaction between the demonstrator macaw and the human experimenter helped them learn specific behaviors. They also learned these actions faster and performed them more accurately than the control group. The study's authors suggest that this ability to learn by passively observing two or more individuals helps macaws adapt to new social situations and may even contribute to their own cultural traditions.
The research shows that macaws aren't just smart copycats. They may also have their own complex social lives and cultural norms, similar to humans.
Esha Haldar et al, Third-party imitation is not restricted to humans, Scientific Reports (2025). DOI: 10.1038/s41598-025-11665-9
Physicists create a new kind of time crystal that humans can actually see
Imagine a clock that doesn't have electricity, but its hands and gears spin on their own for all eternity. In a new study, physicists have used liquid crystals, the same materials that are in your phone display, to create such a clock—or, at least, as close as humans can get to that idea. The team's advancement is a new example of a "time crystal." That's the name for a curious phase of matter in which the pieces, such as atoms or other particles, exist in constant motion.
The researchers aren't the first to make a time crystal, but their creation is the first that humans can actually see, which could open a host of technological applications. They can be observed directly under a microscope and even, under special conditions, by the naked eye.
In the study, the researchers designed glass cells filled with liquid crystals—in this case, rod-shaped molecules that behave a little like a solid and a little like a liquid. Under special circumstances, if you shine a light on them, the liquid crystals will begin to swirl and move, following patterns that repeat over time.
Under a microscope, these liquid crystal samples resemble psychedelic tiger stripes, and they can keep moving for hours—similar to that eternally spinning clock.
Hanqing Zhao et al, Space-time crystals from particle-like topological solitons, Nature Materials (2025). DOI: 10.1038/s41563-025-02344-1
Ant Queens Produce Offspring of Two Different Species
Some ant queens can produce offspring of more than one species – even when the other species is not known to exist in the nearby wild.
No other animal on Earth is known to do this, and it's hard to believe.
It is a bizarre story of a system that allows things to happen that seem almost unimaginable.
Some queen ants are known to mate with other species to produce hybrid workers, but Iberian harvester ants (Messor ibericus) on the island of Sicily go even further, blurring the lines of species-hood. These queens can give birth to cloned males of another species entirely (Messor structor), with which they can mate to produce hybrid workers.
Their reproductive strategy is akin to "sexual domestication", the researchers say. The queens have come to control the reproduction of another species, which they exploited from the wild long ago – similar to how humans have domesticated dogs.
In the past, the Iberian ants seem to have stolen the sperm of another species they once depended on, creating an army of male M. structor clones to reproduce with whenever they wanted.
According to genetic analysis, the colony's offspring are two distinct species, and yet they share the same mother.
Some are the hairy M. ibericus, and the others the hairless M. structor, whose closest wild populations live more than a thousand kilometers away.
The queen can mate with males of either species in the colony to reproduce. Mating with M. ibericus males produces the next generation of queens, while mating with M. structor males results in more workers. As such, all workers in the colony are hybrids of M. ibericus and M. structor.
The two species diverged from each other more than 5 million years ago, and yet today, M. structor is a sort of parasite living in the queen's colony. But she is hardly a victim.
The Iberian harvester queen ant controls what happens to the DNA of her clones. When she reproduces, she can do so asexually, producing a clone of herself. She can also fertilize her egg with the sperm of her own species or M. structor, or she can 'delete' her own nuclear DNA and use her egg as a vessel solely for the DNA of her male M. structor clones. This means her offspring can either be related to her, or to another species. The only similarity is that both groups contain the queen's mitochondrial DNA. The result is greater diversity in the colony without the need for a wild neighboring species to mate with.
It also means that Iberian harvester ants belong to a 'two-species superorganism' – which the researchers say "challenges the usual boundaries of individuality."
The M. structor males produced by M. ibericus queens don't look exactly the same as males produced by M. structor queens, but their genomes match.
Electrical stimulation can reprogram immune system to heal the body faster
Scientists have discovered that electrically stimulating macrophages—one of the immune system's key players—can reprogram them to reduce inflammation and encourage faster, more effective healing in disease and injury.
This breakthrough uncovers a potentially powerful new therapeutic option, with further work ongoing to delineate the specifics.
Macrophages are a type of white blood cell with several high-profile roles in our immune system. They patrol around the body, surveying for bugs and viruses, as well as disposing of dead and damaged cells, and stimulating other immune cells—kicking them into gear when and where they are needed.
However, their actions can also drive local inflammation in the body, which can sometimes get out of control and become problematic, causing more damage to the body than repair. This is present in lots of different diseases, highlighting the need to regulate macrophages for improved patient outcomes.
In the study, published in the journal Cell Reports Physical Science, the researchers worked with human macrophages isolated from healthy donor blood samples.
They stimulated these cells using a custom bioreactor to apply electrical currents and measured what happened.
The scientists discovered that this stimulation caused a shift of macrophages into an anti-inflammatory state that supports faster tissue repair; a decrease in inflammatory marker (signaling) activity; an increase in expression of genes that promote the formation of new blood vessels (associated with tissue repair as new tissues form); and an increase in stem cell recruitment into wounds (also associated with tissue repair).
Human brains explore more to avoid losses than to seek gains
Researchers traced a neural mechanism that explains why humans explore more aggressively when avoiding losses than when pursuing gains. Their work reveals how neuronal firing and noise in the amygdala shape exploratory decision-making.
Human survival has its origins in a delicate balance of exploration versus exploitation. There is safety in exploiting what is known: the local hunting grounds, the favorite foraging location, the go-to deli with the familiar menu. But exploitation carries the risk of over-reliance on the familiar, which can fail through depletion or a change in the stability of local resources.
Exploring the world in the hope of discovering better options has its own set of risks and rewards. There is the chance of finding plentiful hunting grounds, alternative foraging resources, or a new deli that offers a fresh take on old favorites. And there is the risk that new hunting grounds will be scarce, the newly foraged berries poisonous, or that the meal time will be ruined by a deli that disappoints.
Exploration-exploitation (EE) dilemma research has concentrated on gain-seeking contexts, identifying exploration-related activity in surface brain areas like cortex and in deeper regions such as the amygdala. Exploration tends to rise when people feel unsure about which option will pay off.
Loss-avoidance strategies differ from gain-seeking strategies, with links to negative outcomes such as PTSD, anxiety and mood disorders.
In the study, "Rate and noise in human amygdala drive increased exploration in aversive learning," published in Nature, researchers recorded single-unit activity to compare exploration during gain versus loss learning.
Researchers examined how strongly neurons fired just before each choice and how variable that activity was across trials. Variability served as "neural noise." Computational models estimated how closely choices followed expected value and how much they reflected overall uncertainty.
Results showed more exploration in loss trials than in gain trials. After people learned which option was better, exploration stayed higher in loss trials and accuracy fell more from its peak.
Pre-choice firing in the amygdala and temporal cortex rose before exploratory choices in both gain and loss, indicating a shared, valence-independent rate signal. Loss trials also showed noisier amygdala activity from cue to choice.
More noise was linked to higher uncertainty and a higher chance of exploring, and noise declined as learning progressed. Using the measured noise levels in decision models reproduced the extra exploration seen in loss trials. Loss aversion did not explain the gap. Researchers report two neural signals that shape exploration. A valence-independent firing-rate increase in the amygdala and temporal cortex precedes exploratory choices in both gain and loss. A loss-specific rise in amygdala noise raises the odds of exploration under potential loss, scales with uncertainty, and wanes as learning accrues.
Behavioral modeling matches this pattern, with value-only rules fitting gain choices and value-plus-total-uncertainty rules fitting loss choices. Findings point to neural variability as a lever that tilts strategy toward more trial-and-error when loss looms, while causal tests that manipulate noise remain to be done.
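One way to see how an uncertainty-scaled noise signal tilts a value-based chooser toward exploration is a softmax whose effective inverse temperature falls as uncertainty rises. This is a toy sketch of that general idea, not the paper's fitted model; the function, parameter names, and values are all assumptions made for illustration.

```python
import numpy as np

def choice_probs(values, uncertainty, beta=5.0, noise_gain=1.0):
    """Softmax over expected values whose effective inverse temperature
    shrinks as total uncertainty grows, so higher uncertainty yields a
    flatter distribution, i.e. more exploration."""
    eff_beta = beta / (1.0 + noise_gain * uncertainty)
    z = eff_beta * np.asarray(values, dtype=float)
    e = np.exp(z - z.max())          # subtract max for numerical stability
    return e / e.sum()

low = choice_probs([1.0, 0.5], uncertainty=0.1)   # confident: mostly exploit
high = choice_probs([1.0, 0.5], uncertainty=2.0)  # uncertain: explore more
print(low[1] < high[1])  # prints True: the worse option is picked more often
```

Under this kind of rule, the same expected values produce more trial-and-error when uncertainty (or neural noise) is high, and choices sharpen toward the best option as learning reduces that uncertainty.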
From an evolutionary survival perspective, the strategy fits well with the need to seek out new resources when facing the loss of safe or familiar choices. While one might consider trying a new restaurant at any time, true seeking behavior will become a priority if the favorite location is closed for remodeling.
Tamar Reitich-Stolero et al, Rate and noise in human amygdala drive increased exploration in aversive learning, Nature (2025). DOI: 10.1038/s41586-025-09466-1
Scientists discover why the flu is more deadly for older people
Scientists have discovered why older people are more likely to suffer severely from the flu, and can now use their findings to address this risk.
In a study published in PNAS, experts discovered that older people produce a glycosylated protein called apolipoprotein D (ApoD), which is involved in lipid metabolism and inflammation, at much higher levels than younger people do. This reduces the patient's ability to resist virus infection, resulting in a more serious disease outcome.
ApoD is therefore a target for therapeutic intervention to protect against severe influenza virus infection in the elderly, which would have a major impact on reducing morbidity and mortality in the aging population.
ApoD mediates age-associated increase in vulnerability to influenza virus infection, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2423973122
Dr. Krishna Kumari Challa
The paper is also published in the journal BMJ Open.
Sep 1
The study, published Aug. 27 in the journal Science Advances, tackles an alarming trend in the world of research.
Spam messages come from people who purport to be editors at scientific journals—usually journals the researcher has never heard of—and offer to publish their papers for a hefty fee.
Such publications are sometimes referred to as "predatory" journals. They target scientists, convincing them to pay hundreds or even thousands of dollars to publish their research without proper vetting. There has been a growing effort among scientists and organizations to vet these journals. But it's like whack-a-mole. You catch one, and then another appears, usually from the same company. They just create a new website and come up with a new name.
The new AI tool automatically screens scientific journals, evaluating their websites and other online data against certain criteria: Does the journal have an editorial board featuring established researchers? Does its website contain a lot of grammatical errors?
In an era when prominent figures are questioning the legitimacy of science, stopping the spread of questionable publications has become more important than ever before.
In science, you don't start from scratch. You build on top of the research of others. So if the foundation of that tower crumbles, then the entire thing collapses.
When scientists submit a new study to a reputable publication, that study usually undergoes a practice called peer review. Outside experts read the study and evaluate it for quality—or, at least, that's the goal.
A growing number of companies have sought to circumvent that process to turn a profit. In 2009, Jeffrey Beall, a librarian at CU Denver, coined the phrase "predatory" journals to describe these publications.
Often, they target researchers outside of the United States and Europe, such as in China, India and Iran—countries where scientific institutions may be young, and the pressure and incentives for researchers to publish are high.
These publishers of predatory journals will say, 'If you pay $500 or $1,000, we will review your paper.' In reality, they don't provide any service. They just take the PDF and post it on their website.
The researchers drew on data from a nonprofit organization called the Directory of Open Access Journals (DOAJ). Since 2003, volunteers at the DOAJ have flagged thousands of journals as suspicious based on six criteria. (Reputable publications, for example, tend to include a detailed description of their peer review policies on their websites.)
But keeping pace with the spread of those publications has been daunting for humans.
To speed up the process, researchers turned to AI.
They trained their system using the DOAJ's data, then asked the AI to sift through a list of nearly 15,200 open-access journals on the internet.
Among those journals, the AI initially flagged more than 1,400 as potentially problematic. When scientists checked them, about 350 were found to be legitimate, but more than 1,000 appeared to be genuinely predatory.
The team discovered that questionable journals published an unusually high number of articles. They also included authors with a larger number of affiliations than more legitimate journals, and authors who cited their own research, rather than the research of other scientists, to an unusually high level.
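The kinds of signals described above can be illustrated with a toy red-flag counter. This is a hypothetical sketch, not the authors' trained model; every feature name and threshold below is invented for illustration.

```python
# Illustrative red-flag counter for journal screening (hypothetical sketch,
# not the published AI system; feature names and thresholds are assumptions).

def questionable_score(journal: dict) -> int:
    """Count red flags; a higher score means the journal looks more suspect."""
    flags = 0
    if not journal.get("editorial_board_verified", False):  # established editors?
        flags += 1
    if journal.get("grammar_errors_per_page", 0) > 2:       # sloppy website copy
        flags += 1
    if journal.get("articles_per_year", 0) > 2000:          # unusually high volume
        flags += 1
    if journal.get("self_citation_rate", 0.0) > 0.4:        # authors citing themselves
        flags += 1
    if journal.get("mean_author_affiliations", 0.0) > 4:    # inflated affiliation counts
        flags += 1
    return flags

suspect = {"editorial_board_verified": False, "grammar_errors_per_page": 5,
           "articles_per_year": 3000, "self_citation_rate": 0.6,
           "mean_author_affiliations": 6}
legit = {"editorial_board_verified": True, "grammar_errors_per_page": 0,
         "articles_per_year": 300, "self_citation_rate": 0.1,
         "mean_author_affiliations": 1.5}

print(questionable_score(suspect))  # 5
print(questionable_score(legit))    # 0
```

A real system would learn such weights from the DOAJ's labeled data rather than hard-code them, but the principle of scoring journals against multiple criteria is the same.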
The new AI system isn't publicly accessible, but the researchers hope to make it available to universities and publishing companies soon.
Han Zhuang et al, Estimating the predictability of questionable open-access journals, Science Advances (2025). DOI: 10.1126/sciadv.adt2792
Part 2
Sep 1
Dr. Krishna Kumari Challa
Genetic tools identify lost human relatives, the Denisovans, in the fossil record
New genetic techniques are shedding light on a mysterious part of our family tree—ancient human relatives called the Denisovans that emerged during the Pleistocene epoch, approximately 370,000 years ago.
Little is known about them because so few fossils have been found. The group was discovered by accident in 2010, based solely on DNA analysis of what was thought to be a Neanderthal finger bone found in the Denisova Cave in Siberia. However, it turned out to belong to a previously unknown lineage closely related to Neanderthals.
Since then, only a few additional fragments have been found, so researchers developed a new genetic tool to comb the fossil record for potential Denisovans. Their starting point was the Denisovan genome.
The research is published in the journal Proceedings of the National Academy of Sciences.
The research team's innovative technique detects subtle changes in how Denisovan genes are regulated. These changes were likely responsible for altering the Denisovan skeleton, providing clues to what they looked like.
Using this technique, the scientists identified 32 physical traits most closely related to skull morphology. Then they used this predicted profile to scan the Middle Pleistocene fossil record, specifically focusing on skulls from that period.
Before applying this method to unknown fossils, the researchers tested its accuracy. They used the same technique to successfully predict known physical traits of Neanderthals and chimpanzees with more than 85% accuracy.
Next, they identified 18 skull features they could measure, such as head width and forehead height, and then measured these traits on ten ancient skulls. They compared these to skulls from reference groups, including Neanderthals, modern humans and Homo erectus. Using sophisticated statistical tests, they gave each ancient skull a score to see how well it matched the Denisovan blueprint.
Of these, two skulls from China, known as the Harbin and Dali specimens, were a close match to the Denisovan profile. The Harbin skull matched 16 traits, while the Dali skull matched 15.
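The trait-matching idea can be sketched as follows: count how many measured skull traits fall inside the interval predicted for Denisovans. This is a hypothetical illustration of the scoring concept, not the study's statistical method; the trait names and values below are invented.

```python
# Hypothetical sketch of trait matching: how many measured skull traits fall
# inside the predicted Denisovan range? (Trait names and numbers are invented.)

def match_score(measurements: dict, predicted_ranges: dict) -> int:
    """Number of traits whose measurement lies within the predicted interval."""
    hits = 0
    for trait, (lo, hi) in predicted_ranges.items():
        value = measurements.get(trait)
        if value is not None and lo <= value <= hi:
            hits += 1
    return hits

predicted = {"head_width_mm": (145, 165), "forehead_height_mm": (55, 70),
             "cranial_length_mm": (190, 215)}
skull = {"head_width_mm": 150, "forehead_height_mm": 52, "cranial_length_mm": 200}

print(match_score(skull, predicted))  # 2 of the 3 traits fall in range
```

In the study itself, the comparison used 18 measurable features and sophisticated statistical tests rather than a simple hit count, but the intuition of scoring a fossil against a predicted profile is the same.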
The study also found that a third skull, Kabwe 1, had a strong link to the Denisovan-Neanderthal tree, which may indicate that it was the root of the Denisovan lineage that split from Neanderthals.
In their study, the team states that their work may help with the identification of other extinct human relatives.
Nadav Mishol et al, Candidate Denisovan fossils identified through gene regulatory phenotyping, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2513968122
Sep 2
Dr. Krishna Kumari Challa
Rare seasonal brain shrinkage in shrews is driven by water loss, not cell death, MRI study reveals
Common shrews are one of only a handful of mammals known to flexibly shrink and regrow their brains. This rare seasonal cycle, known as Dehnel's phenomenon, has puzzled scientists for decades. How can a brain lose volume and regrow months later without sustaining permanent damage?
A study using noninvasive MRI has scanned the brains of shrews undergoing shrinkage, identifying a key molecule involved in the phenomenon: water. The work is published in the journal Current Biology.
In the experiments conducted by researchers, shrews lost 9% of their brain volume during shrinkage, but the cells did not die. The cells just lost water.
Normally, brain cells that lose water become damaged and ultimately die, but in shrews, the opposite happened. The cells remained alive and even increased in number.
This finding solves a mystery—and opens up potential pathways for the treatment of human brain disease. Brain shrinkage in shrews matches closely what happens in patients suffering from Alzheimer's, Parkinson's, and other brain diseases.
The study also shows that a specific protein known for regulating water—aquaporin 4—was likely involved in moving water out of the brain cells of shrews. This same protein is present in higher quantities in the diseased brains of humans, too.
That the shrunken brains of shrews share characteristics with diseased human brains makes the case that these miniature mammals, with their ability to reverse brain loss, could also offer clues for medical treatments.
Programmed seasonal brain shrinkage in the common shrew via water loss without cell death, Current Biology (2025). DOI: 10.1016/j.cub.2025.08.015. www.cell.com/current-biology/f … 0960-9822(25)01081-4
Sep 2
Dr. Krishna Kumari Challa
Super corals could help buy time for reefs in a warming world
Coral reefs are in crisis. Rising ocean temperatures driven by climate change are pushing these ecosystems to the brink, with mass bleaching events becoming more frequent and severe.
But what if nature already holds part of the solution?
New research led by UTS demonstrates that corals that naturally thrive in extreme environments, often referred to as "super corals," could be used in restoration efforts to protect vulnerable reef systems.
The research was published in the journal Science Advances and offers compelling evidence that these resilient corals can retain their heat tolerance even after being moved to more stable reef habitats.
Christine D. Roper et al, Coral thermotolerance retained following year-long exposure to a novel environment, Science Advances (2025). DOI: 10.1126/sciadv.adu3858
Sep 2
Dr. Krishna Kumari Challa
Longevity gains slowing with life expectancy of 100 unlikely, study finds
A new study finds that life expectancy gains made by high-income countries in the first half of the 20th century have slowed significantly, and that none of the generations born after 1939 will reach 100 years of age on average.
Published in the journal Proceedings of the National Academy of Sciences, the study analyzed life expectancy for 23 high-income and low-mortality countries using data from the Human Mortality Database and six different mortality forecasting methods.
The unprecedented increase in life expectancy we achieved in the first half of the 20th century appears to be a phenomenon we are unlikely to achieve again in the foreseeable future. In the absence of any major breakthroughs that significantly extend human life, life expectancy would still not match the rapid increases seen in the early 20th century even if adult survival improved twice as fast as we predict.
From 1900 to 1938, life expectancy rose by about five and a half months with each new generation. The life expectancy for an individual born in a high-income country in 1900 was an average of 62 years. For someone born just 38 years later in similar conditions, life expectancy had jumped to 80 years on average.
For those born between 1939 and 2000, the increase slowed to roughly two and a half to three and a half months per generation, depending on the forecasting method. Mortality forecasting methods are statistical techniques that make informed predictions about future lifespans based on past and current mortality information. These models enabled the research team to estimate how life expectancy will develop under a variety of plausible future scenarios.
Researchers forecast that those born in 1980 will not live to be 100 on average, and that none of the cohorts in their study will reach this milestone. This slowdown is largely due to the fact that past surges in longevity were driven by remarkable improvements in survival at very young ages.
At the beginning of the 20th century, infant mortality fell rapidly due to medical advances and other improvements in quality of life for high-income countries. This contributed significantly to the rapid increase in life expectancy. However, infant and child mortality is now so low that the forecasted improvements in mortality in older age groups will not be enough to sustain the previous pace of longevity gains.
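The generational arithmetic quoted above can be checked directly. The short sketch below simply re-derives the article's figures from the stated per-generation gains; it is not the study's forecasting model.

```python
# Back-of-the-envelope check of the generational gains quoted in the article:
# about 5.5 months per birth year for 1900-1938, and roughly 2.5-3.5 months
# per birth year for cohorts born 1939-2000.

def projected_gain(years: int, months_per_year: float) -> float:
    """Total life-expectancy gain in years across a span of birth cohorts."""
    return years * months_per_year / 12

early = projected_gain(1938 - 1900, 5.5)  # about 17.4 years, consistent with 62 -> 80
late = projected_gain(2000 - 1939, 3.0)   # just over 15 years at the slower pace

print(early, late)
```

The early-period estimate (about 17.4 years) lines up with the jump from 62 to 80 years quoted above, while the same span of birth years at the slower pace yields markedly less.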
While mortality forecasts can never be certain as the future may unfold in unexpected ways—by way of pandemics, new medical treatments or other unforeseen societal changes—this study provides critical insight for governments looking to anticipate the needs of their health care systems, pension planning and social policies.
Although a population-level analysis, this research also has implications for individuals, as life expectancy influences personal decisions about saving, retirement and long-term planning. If life expectancy increases more slowly, as this study shows is likely, both governments and individuals may need to recalibrate their expectations for the future.
José Andrade et al, Cohort mortality forecasts indicate signs of deceleration in life expectancy gains, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2519179122
Sep 2
Dr. Krishna Kumari Challa
Why male embryos grow faster
Researchers have uncovered the genetic triggers that cause male and female bovine embryos to develop differently, as early as seven to eight days after fertilization. The breakthrough in basic science has implications for human health—such as drug development and in vitro fertilization—and for bovine health and dairy industry sustainability.
Scientists have known since the 1990s that male embryos of multiple mammalian species, including humans, grow faster than female embryos, but until now, the underlying reasons were unclear.
In a paper, published in Cell & Bioscience, scientists grew bovine embryos in petri dishes, then analyzed their genetic sex and RNA sequencing, which shows how genes are being expressed.
They discovered significant sex differences in gene regulation: male embryos prioritized genes associated with energy metabolism, causing them to grow faster than their female counterparts. Female embryos emphasized genes associated with sex differentiation, gonad development and inflammatory pathways that are important for future development.
Understanding these fundamental sex differences at the genomic and molecular levels is critically important to improving in vitro fertilization (IVF) success in humans and cows, and in developing treatments that will work for both men and women.
Meihong Shi et al, Sex-biased transcriptome in in vitro produced bovine early embryos, Cell & Bioscience (2025). DOI: 10.1186/s13578-025-01459-x
Sep 2
Dr. Krishna Kumari Challa
Scientists find that ice generates electricity when bent
A new study reveals that ice is a flexoelectric material, meaning it can produce electricity when unevenly deformed. Published in Nature Physics, this discovery could have major technological implications while also shedding light on natural phenomena such as lightning.
Frozen water is one of the most abundant substances on Earth. It is found in glaciers, on mountain peaks and in polar ice caps. Although it is a well-known material, studying its properties continues to yield fascinating results.
An international study has shown for the first time that ordinary ice is a flexoelectric material.
In other words, it can generate electricity when subjected to mechanical deformation. This discovery could have significant implications for the development of future technological devices and help to explain natural phenomena such as the formation of lightning in thunderstorms.
The study represents a significant step forward in our understanding of the electromechanical properties of ice.
The scientists discovered that ice generates electric charge in response to mechanical stress at all temperatures. In addition, they identified a thin 'ferroelectric' layer at the surface at temperatures below −113 °C (160 K).
This means that the ice surface can develop a natural electric polarization, which can be reversed when an external electric field is applied—similar to how the poles of a magnet can be flipped. The surface ferroelectricity is a cool discovery in its own right, as it means that ice may have not just one way to generate electricity, but two: ferroelectricity at very low temperatures, and flexoelectricity at higher temperatures all the way to 0 °C.
This property places ice on a par with electroceramic materials such as titanium dioxide, which are currently used in advanced technologies like sensors and capacitors.
One of the most surprising aspects of this discovery is its connection to nature. The results of the study suggest that the flexoelectricity of ice could play a role in the electrification of clouds during thunderstorms, and therefore in the origin of lightning.
It is known that lightning forms when an electric potential builds up in clouds due to collisions between ice particles, which become electrically charged. This potential is then released as a lightning strike. However, the mechanism by which ice particles become electrically charged has remained unclear, since ice is not piezoelectric—it cannot generate charge simply by being compressed during a collision.
However, the study shows that ice can become electrically charged when it is subjected to inhomogeneous deformations, i.e. when it bends or deforms irregularly.
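Flexoelectricity couples polarization to a strain *gradient* rather than to strain itself, which is why bending (an inhomogeneous deformation) generates charge while uniform compression does not. A minimal numeric sketch of that relation follows; the coefficient used is a placeholder, not the measured value for ice.

```python
# Minimal sketch of flexoelectric polarization: P = mu * (strain gradient).
# MU below is a placeholder coefficient, NOT the measured value for ice.

MU = 1e-9  # hypothetical flexoelectric coefficient, C/m

def polarization(strain_gradient_per_m: float) -> float:
    """Flexoelectric polarization (C/m^2) induced by a strain gradient (1/m)."""
    return MU * strain_gradient_per_m

# Bending a thin slab produces a strain gradient of roughly 2*eps_max / thickness.
eps_max, thickness = 1e-4, 1e-3   # 0.01% surface strain across a 1 mm slab
grad = 2 * eps_max / thickness    # 0.2 per metre
print(polarization(grad))         # small but nonzero polarization from bending alone
```

Uniform compression corresponds to a zero strain gradient, so in this picture it produces no polarization, matching the article's point that ice is not piezoelectric.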
X. Wen et al, Flexoelectricity and surface ferroelectricity of water ice, Nature Physics (2025). DOI: 10.1038/s41567-025-02995-6
Sep 2
Dr. Krishna Kumari Challa
Neuroscientists show for first time that precise timing of nerve signals determines how brain processes information
It has long been known that the brain preferentially processes information that we focus our attention on—a classic example is the so-called cocktail party effect.
In an environment full of voices, music, and background noise, the brain manages to concentrate on a single voice. The other noises are not objectively quieter, but are perceived less strongly at that moment.
The brain focuses its processing on the information that is currently relevant—in this case, the voice of the conversation partner—while other signals are received but not forwarded and processed to the same extent.
Neuroscientists provided the first causal evidence of how the brain transmits and processes relevant information. The work is published in the journal Nature Communications.
Whether a signal is processed further in the brain depends crucially on whether it arrives at the right moment—during a short phase of increased receptivity of the nerve cells.
Nerve cells do not work continuously, but in rapid cycles. They are particularly active and receptive for a few milliseconds, followed by a window of lower activity and excitability. This cycle repeats itself approximately every 10 to 20 milliseconds. Only when a signal arrived shortly before the peak of this active phase did it change the behavior of the neurons.
This temporal coordination is the fundamental mechanism of information processing. Attention makes targeted use of this phenomenon by aligning the timing of the nerve cells so that relevant signals arrive precisely in this time window, while others are excluded.
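The phase-gating idea can be illustrated with a toy model: an 80 Hz gamma cycle (12.5 ms, within the 10 to 20 ms range stated above) with a short excitable window just before its peak. The window boundaries here are illustrative assumptions, not measured values from the study.

```python
# Toy illustration of gamma phase gating (hypothetical numbers): a signal is
# processed only if it arrives inside the excitable phase of the cycle.

CYCLE_MS = 12.5          # one gamma cycle at 80 Hz
WINDOW = (8.0, 12.5)     # assumed receptive phase, in ms into the cycle

def is_processed(arrival_ms: float) -> bool:
    """True if the input arrives inside the excitable phase of the cycle."""
    phase = arrival_ms % CYCLE_MS
    return WINDOW[0] <= phase <= WINDOW[1]

print(is_processed(10.0))  # arrives just before the peak: gated through
print(is_processed(3.0))   # arrives in the low-excitability phase: not processed
```

In this picture, attention shifts the timing of inputs so that relevant signals land inside the window while others fall outside it.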
Eric Drebitz et al, Gamma-band synchronization between neurons in the visual cortex is causal for effective information processing and behavior, Nature Communications (2025). DOI: 10.1038/s41467-025-62732-8
Sep 2
Dr. Krishna Kumari Challa
Criminal behavior in patients with dementia
Don't blame the criminals for everything they do.
A suspected perpetrator who can barely remember his own name; a woman in her mid-fifties who commits several traffic violations and neither grasps nor understands her own behavior. Should such cases be brought before a court? And how does the state deal with people who commit acts of violence without meaning to?
Those questions come to mind when one hears such examples from everyday clinical practice with persons suffering from dementia. Neurodegenerative diseases can affect several functions of the brain, ranging from memory in Alzheimer's disease to behavior, as in behavioral variant frontotemporal dementia, and to sensorimotor function in Parkinson's disease.
One of the most striking consequences of these alterations is that affected persons may develop criminal risk behavior, such as harassment, traffic violations or theft, or even behavior that harms other people or animals, sometimes as the very first sign of disease.
If persons violate social or legal norms due to changes in behavior, personality and cognition, these incidents may have a substantial impact on this person's family and social surroundings and may lead to prosecution.
Researchers investigated this problem in a broad meta-analysis that included 14 studies with 236,360 persons from different countries (the U.S., Sweden, Finland, Germany and Japan). The work is published in the journal Translational Psychiatry.
Their systematic literature review revealed that criminal risk behavior is more prevalent early in the disease course than in the general population, but declines thereafter to below population levels. Criminal behavior committed for the first time in mid-life could therefore be an indicator of incident dementia, warranting early diagnosis and therapy.
The team shows that the prevalence of criminal risk behaviors was highest in behavioral variant frontotemporal dementia (>50%), followed by semantic variant primary progressive aphasia (40%), but rather low in vascular dementia and Huntington's disease (15%), Alzheimer's disease (10%), and lowest in Parkinsonian syndromes (<10%).
Criminal risk behavior in frontotemporal dementia is most likely caused by the neurodegenerative disease itself. Most of the patients showed criminal risk behavior for the first time in their life and had no previous records of criminal activity.
The prevalence appears higher than in the general population early in the course of frontotemporal dementia and Alzheimer's disease, before diagnosis and presumably in pre-stages such as mild behavioral or cognitive impairment. It declines thereafter, ultimately falling below the general-population level after diagnosis.
The researchers also found that criminal risk behavior is more frequent in men than in women with dementia. After diagnosis, men showed four times more criminal risk behavior than women in frontotemporal dementia and seven times more in Alzheimer's disease.
Part 1
Sep 2
Dr. Krishna Kumari Challa
In a second study, the working group also identified the changes in the brain that are associated with criminal behavior in frontotemporal dementia. This study was published in Human Brain Mapping.
Persons exhibiting criminal behavior showed larger atrophy in the temporal lobe, indicating that criminal behavior might be caused by so-called disinhibition, i.e., the loss of normal restraints or inhibitions, leading to a reduced ability to regulate one's behavior, impulses, and emotions.
Disinhibition can manifest as acting impulsively, without thinking through the consequences, and behaving in ways that are inappropriate for the situation.
One has to prevent further stigmatization of persons with dementia. Of note, most offenses committed were minor, such as indecent behavior, traffic violations, theft, and damage to property, although physical violence or aggression also occurred.
Hence, sensitivity to this topic as a possible early sign of dementia, together with the earliest possible diagnosis and treatment, is of the utmost importance. Besides early diagnosis and treatment of affected persons, the researchers say, adaptations of the legal system should be discussed, such as raising awareness of offenses caused by these diseases and taking them into account in sentencing and in prisons.
Matthias L. Schroeter et al, Criminal minds in dementia: A systematic review and quantitative meta-analysis, Translational Psychiatry (2025). DOI: 10.1038/s41398-025-03523-z
Karsten Mueller et al, Criminal Behavior in Frontotemporal Dementia: A Multimodal MRI Study, Human Brain Mapping (2025). DOI: 10.1002/hbm.70308
Part 2
Sep 2
Dr. Krishna Kumari Challa
Depression linked to presence of immune cells in the brain's protective layer
Immune cells released from bone marrow in the skull in response to chronic stress and adversity could play a key role in symptoms of depression and anxiety, say researchers.
The discovery—found in a study in mice—sheds light on the role that inflammation can play in mood disorders and could help in the search for new treatments, in particular for those individuals for whom current treatments are ineffective.
Around 1 billion people will be diagnosed with a mood disorder such as depression or anxiety at some point in their life. While there may be many underlying causes, chronic inflammation—when the body's immune system stays active for a long time, even when there is no infection or injury to fight—has been linked to depression. This suggests that the immune system may play an important role in the development of mood disorders.
Previous studies have highlighted how high levels of an immune cell known as a neutrophil, a type of white blood cell, are linked to the severity of depression. But how neutrophils contribute to symptoms of depression is currently unclear.
In research published in Nature Communications, a team of scientists tested the hypothesis that chronic stress can lead to the release of neutrophils from bone marrow in the skull. These cells then collect in the meninges—membranes that cover and protect your brain and spinal cord—and contribute to symptoms of depression.
As it is not possible to test this hypothesis in humans, the team used mice exposed to chronic social stress. In this experiment, an 'intruder' mouse is introduced into the home cage of an aggressive resident mouse. The two have brief daily physical interactions and can otherwise see, smell, and hear each other.
The researchers found that prolonged exposure to this stressful environment led to a noticeable increase in levels of neutrophils in the meninges, and that this was linked to signs of depressive behavior in the mice. Even after the stress ended, the neutrophils lasted longer in the meninges than they did in the blood.
Analysis confirmed the researchers' hypothesis that the meningeal neutrophils—which appeared subtly different from those found in the blood—originated in the skull.
Further analysis suggested that long-term stress triggered a type of immune system 'alarm warning' known as type I interferon signaling in the neutrophils. Blocking this pathway—in effect, switching off the alarm—reduced the number of neutrophils in the meninges and improved behavior in the depressed mice.
Part 1
Sep 2
Dr. Krishna Kumari Challa
The reason why there are high levels of neutrophils in the meninges is unclear. One explanation could be that they are recruited by microglia, a type of immune cell unique to the brain.
Another possible explanation is that chronic stress may cause microhemorrhages, tiny leaks in brain blood vessels, and that neutrophils—the body's 'first responders'—arrive to fix the damage and prevent any further damage. These neutrophils then become more rigid, possibly getting stuck in brain capillaries and causing further inflammation in the brain.
These new findings show that these 'first responder' immune cells leave the skull bone marrow and travel to the brain, where they can influence mood and behavior.
Stacey L. Kigar et al, Chronic social defeat stress induces meningeal neutrophilia via type I interferon signaling in male mice, Nature Communications (2025). DOI: 10.1038/s41467-025-62840-5
Part 2
Sep 2
Dr. Krishna Kumari Challa
Hyperactive blood platelets linked to heart attacks despite standard drug therapy
Scientists have discovered a particularly active subgroup of blood platelets that may cause heart attacks in people with coronary heart disease despite drug therapy. This discovery may open up new prospects for customized therapies. The research results are published in the European Heart Journal and were presented on August 31 at Europe's largest cardiology congress.
Despite modern medication, many people with coronary heart disease continue to suffer heart attacks. A research team has now found a possible explanation for this and has also provided a very promising therapeutic approach.
Their work focused on so-called "reticulated platelets": particularly young, RNA-rich and reactive thrombocytes that play a central role in the formation of blood clots in patients with coronary heart disease.
The researchers were able to comprehensively characterize the biological mechanisms of these cells for the first time. This allows us to explain why these platelets remain overactive in many patients even under optimal therapy. The reason is that these young platelets have a particularly large number of activating signaling pathways that make them more sensitive and reactive than mature platelets.
The results come from a multidimensional analysis, using various methods, of the blood of over 90 patients with coronary heart disease. Among other things, the researchers found signaling pathways that can be used to specifically inhibit the activity of such blood cells, in particular two target structures called GPVI and PI3K. Initial laboratory experiments confirmed that inhibiting these signaling pathways can reduce platelet hyperactivity.
Kilian Kirmes et al, Reticulated platelets in coronary artery disease: a multidimensional approach unveils prothrombotic signalling and novel therapeutic targets, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf694
Sep 2
Dr. Krishna Kumari Challa
8,000 years of human activities have caused wild animals to shrink and domestic animals to grow
Humans have caused wild animals to shrink and domestic animals to grow, according to a new study.
Researchers studied tens of thousands of animal bones from Mediterranean France covering the last 8,000 years to see how the size of both types of animals has changed over time.
Scientists already know that human choices, such as selective breeding, influence the size of domestic animals, and that environmental factors also impact the size of both. However, little is known about how these two forces have influenced the size of wild and domestic animals over such a prolonged period. This latest research, published in the Proceedings of the National Academy of Sciences, fills a major gap in our knowledge.
The scientists analyzed more than 225,000 bones from 311 archaeological sites in Mediterranean France. They took thousands of measurements of things like the length, width, and depth of bones and teeth from wild animals, such as foxes, rabbits and deer, as well as domestic ones, including goats, cattle, pigs, sheep and chickens.
But the researchers didn't just focus on the bones. They also collected data on the climate, the types of plants growing in the area, the number of people living there and what they used the land for. And then, with some sophisticated statistical modeling, they were able to track key trends and drivers behind the change in animal size.
The research team's findings reveal that for around 7,000 years, wild and domestic animals evolved along similar paths, growing and shrinking together in sync with their shared environment and human activity. However, all that changed around 1,000 years ago. Their body sizes began to diverge dramatically, especially during the Middle Ages.
Domestic animals started to get much bigger as they were being actively bred for more meat and milk. At the same time, wild animals began to shrink in size as a direct result of human pressures, such as hunting and habitat loss. In other words, human activities replaced environmental factors as the main force shaping animal evolution.
Cyprien Mureau et al, 8,000 years of wild and domestic animal body size data reveal long-term synchrony and recent divergence due to intensified human impact, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2503428122. www.pnas.org/cgi/doi/10.1073/pnas.2503428122
Sep 3
Dr. Krishna Kumari Challa
World's oldest host-associated bacterial DNA found
An international team led by researchers at the Center for Paleogenetics, has uncovered microbial DNA preserved in woolly and steppe mammoth remains dating back more than one million years. The analyses reveal some of the world's oldest microbial DNA ever recovered, as well as the identification of bacteria that possibly caused disease in mammoths. The findings are published in Cell.
Researchers analyzed microbial DNA from 483 mammoth specimens, of which 440 were sequenced for the first time. Among them was a steppe mammoth that lived about 1.1 million years ago. Using advanced genomic and bioinformatic techniques, the team distinguished microbes that once lived alongside the mammoths from those that invaded their remains after death.
Ancient Host-Associated Microbes obtained from Mammoth Remains, Cell (2025). DOI: 10.1016/j.cell.2025.08.003. www.cell.com/cell/fulltext/S0092-8674(25)00917-1
Sep 3
Dr. Krishna Kumari Challa
Genetic mechanism reveals how plants coordinate flowering with light and temperature conditions
Plants may be stuck in one place, but the world around them is constantly changing. In order to grow and flower at the right time, plants must constantly collect information about their surroundings, measuring things like temperature, brightness, and length of day. Still, it's unclear how all this information gets combined to trigger specific behaviors.
Scientists have discovered a genetic mechanism for how plants integrate light and temperature information to control their flowering.
In a study published in Nature Communications, the researchers found an interaction between two genetic pathways that signals the presence of both blue light and low temperature. This genetic module helps plants fine-tune their flowering to the optimal environmental conditions.
In one pathway, blue light activates the PHOT2 blue light receptor, with help from the partner protein NPH3. In another pathway, low ambient temperature allows a transcription factor called CAMTA2 to boost the expression of a gene called EHB1. Importantly, EHB1 is known to interact with NPH3, placing NPH3 at the convergence point of the blue light and low temperature signals. This genetic architecture effectively works as a coincidence detector, linking the presence of blue light and low temperature to guide the switch to flowering.
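The coincidence-detector logic can be summarized in a few lines: the module fires only when both the blue-light pathway and the low-temperature pathway are active at once. The temperature cutoff below is an assumption for illustration, not a value from the study.

```python
# Toy model of the light-temperature coincidence detector (illustrative only;
# the 16 degree cutoff is an assumption, not a measured threshold).

def flowering_signal(blue_light: bool, temperature_c: float) -> bool:
    phot2_active = blue_light               # PHOT2/NPH3 blue-light pathway
    ehb1_expressed = temperature_c < 16.0   # CAMTA2 -> EHB1 low-temperature pathway
    return phot2_active and ehb1_expressed  # both must coincide at NPH3

print(flowering_signal(True, 12.0))   # blue light AND cold: signal fires
print(flowering_signal(True, 22.0))   # blue light but warm: no signal
print(flowering_signal(False, 12.0))  # cold but dark: no signal
```

The AND gate is the essential point: either input alone leaves the switch off, which is what makes the module a coincidence detector rather than a simple sensor.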
The study describes an important component of plant growth, reproduction, and information processing. The newly discovered genetic module allows plants to have fine control over their flowering in low temperatures. Understanding this system will now help scientists optimize crop growth under changing environmental conditions.
Adam Seluzicki et al, Genetic architecture of a light-temperature coincidence detector, Nature Communications (2025). DOI: 10.1038/s41467-025-62194-y
Sep 3
Dr. Krishna Kumari Challa
Generative AI designs molecules that kill drug-resistant bacteria
What if generative AI could design life-saving antibiotics, not just art and text? In a new Cell Biomaterials paper, researchers introduce AMP-Diffusion, a generative AI tool used to create tens of thousands of new antimicrobial peptides (AMPs)—short strings of amino acids, the building blocks of proteins—with bacteria-killing potential. In animal models, the most potent AMPs performed as well as FDA-approved drugs, without detectable adverse effects.
While past work has shown that AI can successfully sort through mountains of data to identify promising antibiotic candidates, this study adds to a small but growing number of demonstrations that AI can invent antibiotic candidates from scratch.
Using AMP-Diffusion, the researchers generated the amino-acid sequences for about 50,000 candidates. They used AI to filter the results.
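The generate-then-filter funnel can be sketched as follows. The random-sequence generator and the charge/hydrophobicity score below are stand-ins for AMP-Diffusion and the filtering models, which the summary does not specify:

```python
# Illustrative generate-then-filter funnel: produce many candidate
# peptide sequences, score them, and keep a short list for synthesis.
# The generator and the scoring function are invented stand-ins, not
# the actual AMP-Diffusion model or the study's filters.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def generate_candidates(n, length=20, seed=0):
    """Stand-in generator: emit n random peptide sequences."""
    rng = random.Random(seed)
    return ["".join(rng.choice(AMINO_ACIDS) for _ in range(length))
            for _ in range(n)]

def predicted_activity(peptide):
    """Stand-in filter: many AMPs are cationic and hydrophobic, so score
    by the fraction of charged (K/R) plus hydrophobic (L/I/V/F/W) residues."""
    charged = sum(peptide.count(a) for a in "KR")
    hydrophobic = sum(peptide.count(a) for a in "LIVFW")
    return (charged + hydrophobic) / len(peptide)

candidates = generate_candidates(50_000)   # ~50,000 candidates, as in the study
shortlist = sorted(candidates, key=predicted_activity, reverse=True)[:46]
print(len(shortlist))  # 46 top-ranked candidates kept for synthesis
```

The design choice worth noting is the funnel shape: generation is cheap, scoring is cheap, and only the handful of survivors incur the real cost of wet-lab synthesis and testing.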
After synthesizing the 46 most promising candidates, the researchers tested them in human cells and animal models. Treating skin infections in mice, two AMPs demonstrated efficacy on par with levofloxacin and polymyxin B, FDA-approved drugs used to treat antibiotic-resistant bacteria, without adverse effects.
In the future, the researchers hope to refine AMP-Diffusion, giving it the capability to denoise with a more specific goal in mind, like treating a particular type of bacterial infection, among other features.
Generative latent diffusion language modeling yields anti-infective synthetic peptides, Cell Biomaterials (2025). DOI: 10.1016/j.celbio.2025.100183. www.cell.com/cell-biomaterials … 3050-5623(25)00174-6
Sep 3
Dr. Krishna Kumari Challa
How bacteria bounce back after antibiotics
A study by researchers has uncovered how Escherichia coli (E. coli) persister bacteria survive antibiotics by protecting their genetic instructions.
The work, published in Nature Microbiology, offers new hope for tackling chronic, recurring infections.
Persister bacteria, which enter a dormant state to survive antibiotics that target active cells, are linked to over 20% of chronic infections and resist current treatments.
Understanding their survival mechanisms could lead to new ways to combat recurring infections. This study utilized E. coli bacteria as a model and found that prolonged stress leads to the increased formation of aggresomes (membraneless droplets) and the enrichment of mRNA (molecules that carry instructions for making proteins) within them, which enhances the ability of E. coli to survive and recover from stress.
Researchers used multiple approaches, including imaging, modeling, and transcriptomics, to show that prolonged stress leading to ATP (fuel for all living cells) depletion in Escherichia coli results in increased aggresome formation, their compaction, and enrichment of mRNA within aggresomes compared to the cytosol (the liquid inside of cells).
Transcript length was longer in aggresomes compared to the cytosol. Mass spectrometry showed exclusion of mRNA ribonuclease (an enzyme that breaks down RNA) from aggresomes, which was due to negative charge repulsion.
Experiments with fluorescent reporters and disruption of aggresome formation showed that mRNA storage within aggresomes promoted translation and was associated with reduced lag phases during growth after stress removal. These findings suggest that mRNA storage within aggresomes confers an advantage for bacterial survival and recovery from stress.
This breakthrough illuminates how persister cells survive and revive after antibiotic treatment.
By targeting aggresomes, new drugs could disrupt this protective mechanism, preventing bacteria from storing mRNA and making them more vulnerable to elimination, thus reducing the risk of infection relapse.
Linsen Pei et al, Aggresomes protect mRNA under stress in Escherichia coli, Nature Microbiology (2025). DOI: 10.1038/s41564-025-02086-5
Sep 3
Dr. Krishna Kumari Challa
Deforestation drives 74% of rainfall loss and 16% of temperature rise in Amazon dry season, study finds
Deforestation in the Brazilian Amazon is responsible for approximately 74.5% of the reduction in rainfall and 16.5% of the temperature increase in the biome during the dry season. For the first time, researchers have quantified the impact of vegetation loss and global climate change on the forest.
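The attribution figures can be read as a partition of an observed dry-season change into a deforestation share and a climate-change share. A back-of-envelope sketch with invented observed totals (only the 74.5% and 16.5% shares come from the text):

```python
# Partition an observed change into deforestation and global-warming
# contributions using the attribution fractions reported above.
# The observed totals below are illustrative, not values from the study.

def partition(observed_change, deforestation_share):
    """Split an observed change between deforestation and climate change."""
    from_deforestation = observed_change * deforestation_share
    from_climate = observed_change - from_deforestation
    return from_deforestation, from_climate

# e.g. a hypothetical 100 mm drop in dry-season rainfall
rain_def, rain_clim = partition(100.0, 0.745)
print(f"rainfall: deforestation {rain_def:.1f} mm, climate {rain_clim:.1f} mm")

# e.g. a hypothetical 1.0 degC dry-season warming
t_def, t_clim = partition(1.0, 0.165)
print(f"warming: deforestation {t_def:.3f} degC, climate {t_clim:.3f} degC")
```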
The study provides fundamental results to guide effective mitigation and adaptation strategies.
How climate change and deforestation interact in the transformation of the Amazon rainforest, Nature Communications (2025). DOI: 10.1038/s41467-025-63156-0
Sep 3
Dr. Krishna Kumari Challa
Magnetic fields in infant universe may have been billions of times weaker than a fridge magnet
The magnetic fields that formed in the very early stages of the universe may have been billions of times weaker than a small fridge magnet, with strengths comparable to magnetism generated by neurons in the human brain. Yet, despite such weakness, quantifiable traces of their existence still remain in the cosmic web, the visible cosmic structures connected throughout the universe.
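As an order-of-magnitude check on the comparison, one can line up rough textbook field strengths; the specific values here are illustrative assumptions, not numbers from the study:

```python
# Order-of-magnitude check of the comparison above. These are rough
# textbook figures chosen for illustration: primordial-field upper
# limits are commonly quoted in nanogauss, a fridge magnet sits at a
# few millitesla, and MEG-measured brain fields at ~100 femtotesla.

FRIDGE_MAGNET_T = 5e-3   # ~5 mT, a typical fridge magnet
PRIMORDIAL_T = 1e-13     # 1 nanogauss = 1e-13 tesla
BRAIN_MEG_T = 1e-13      # ~100 fT, fields from neural currents

ratio = FRIDGE_MAGNET_T / PRIMORDIAL_T
print(f"fridge magnet / primordial field ~ {ratio:.0e}")  # tens of billions
print(f"primordial field comparable to brain fields: {PRIMORDIAL_T == BRAIN_MEG_T}")
```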
These conclusions emerge from a study using around a quarter of a million computer simulations.
The research, recently published in Physical Review Letters, specifies both possible and maximum values for the strengths of primordial magnetic fields. It also offers the possibility of refining our knowledge of the early universe and the formation of the first stars and galaxies.
Mak Pavičević et al, Constraints on Primordial Magnetic Fields from the Lyman-α Forest, Physical Review Letters (2025). DOI: 10.1103/77rd-vkpz. On arXiv: DOI: 10.48550/arxiv.2501.06299
Sep 3
Dr. Krishna Kumari Challa
Removing yellow stains from fabric with blue light
Sweat and food stains can ruin your favorite clothes. But bleaching agents such as hydrogen peroxide or dry-cleaning solvents that remove stains aren't options for all fabrics, especially delicate ones. Now, researchers in ACS Sustainable Chemistry & Engineering report a simple way to remove yellow stains using a high-intensity blue LED light. They demonstrate the method's effectiveness at removing stains from orange juice, tomato juice and sweat-like substances on multiple fabrics, including silk.
The method utilizes visible blue light in combination with ambient oxygen, which acts as the oxidizing agent to drive the photobleaching process. This approach avoids the use of harsh chemical oxidants typically required in conventional bleaching methods, making it inherently more sustainable.
Yellow clothing stains are caused by squalene and oleic acid from skin oils and sweat, as well as natural pigments like beta carotene and lycopene, present in oranges, tomatoes and other foods. UV light is a potential stain-removing alternative to chemical oxidizers like bleach and hydrogen peroxide, but it can damage delicate fabrics.
The same researchers previously determined that a high-intensity blue LED light could remove yellow color from aged resin polymers, and they wanted to see whether blue light could also break down yellow stains on fabric without causing damage.
Initially, they exposed vials of beta carotene, lycopene and squalene to high-intensity blue LED light for three hours. All the samples lost color, and spectroscopic analyses indicated that oxygen in the air helped the photobleaching process by breaking bonds to produce colorless compounds.
Next, the team applied squalene onto cotton fabric swatches. After heating the swatches to simulate aging, they treated the samples for 10 minutes, by soaking them in a hydrogen peroxide solution or exposing them to the blue LED or UV light. The blue light reduced the yellow stain substantially more than hydrogen peroxide or UV exposure. In fact, UV exposure generated some new yellow-colored compounds.
Additional tests showed that the blue LED treatment lightened squalene stains on silk and polyester without damaging the fabrics. The method also reduced the color of other stain-causing substances, including aged oleic acid, orange juice and tomato juice, on cotton swatches.
High-intensity blue LED light is a promising way to remove clothing stains, but the researchers say they want to do additional colorfastness and safety testing before commercializing a light system for home and industrial use.
Tomohiro Sugahara et al, Environmentally Friendly Photobleaching Method Using Visible Light for Removing Natural Stains from Clothing, ACS Sustainable Chemistry & Engineering (2025). DOI: 10.1021/acssuschemeng.5c03907
Sep 3
Dr. Krishna Kumari Challa
Scientists develop the world's first 6G chip, capable of 100 Gbps speeds
Sixth generation, or 6G, wireless technology is one step closer to reality with news that researchers have unveiled the world's first "all-frequency" 6G chip. The chip is capable of delivering mobile internet speeds exceeding 100 gigabits per second (Gbps).
6G technology is the successor to 5G and promises to bring about a massive leap in how we communicate. It will offer benefits such as ultra-high-speed connectivity, ultra-low latency and AI integration that can manage and optimize networks in real-time. To achieve this, 6G networks will need to operate across a range of frequencies, from standard microwaves to much higher frequency terahertz waves. Current 5G technology utilizes a limited set of radio frequencies, similar to those used in previous generations of wireless technologies.
The new chip is no bigger than a thumbnail, measuring 11 millimeters by 1.7 millimeters. It operates across a wide frequency range, from 0.5 GHz to 115 GHz, a spectrum that traditionally requires nine separate radio systems to cover. While the development of a single all-frequency chip is a significant breakthrough, the technology is still in its early stages of development. Many experts expect commercial 6G networks to begin rolling out around 2030.
Zihan Tao et al, Ultrabroadband on-chip photonics for full-spectrum wireless communications, Nature (2025). DOI: 10.1038/s41586-025-09451-8
Sep 3
Dr. Krishna Kumari Challa
Mother's Germs May Influence Her Child's Brain Development
Our bodies are colonized by a teeming, ever-changing mass of microbes that help power countless biological processes. Now, a new study has identified how these microorganisms get to work shaping the brain before birth.

Researchers at Georgia State University studied newborn mice bred in a germ-free environment to prevent any microbial colonization. Some of these mice were immediately placed with mothers with normal microbiota, which rapidly transfers microbes to the pups.
That gave the study authors a way to pinpoint just how early microbes begin influencing the developing brain. Their focus was on the paraventricular nucleus (PVN), a region of the hypothalamus tied to stress and social behavior, already known to be partly influenced by microbe activity in mice later in life.
At birth, a newborn body is colonized by microbes as it travels through the birth canal.
Birth also coincides with important developmental events that shape the brain.
Part 1
Sep 3
Dr. Krishna Kumari Challa
When the germ-free mice were just a handful of days old, the researchers found fewer neurons in their PVN, even when microbes were introduced after birth. That suggests the changes caused by these microorganisms happen in the uterus during development.
These neural modifications last, too: the PVN remained neuron-light even in adult mice (around eight weeks old) that had been raised germ-free, although the cross-fostering experiment was not continued into adulthood.
The details of this relationship still need to be worked out and researched in greater detail, but the takeaway is that microbes – specifically the mix of microbes in the mother's gut – can play a notable role in the brain development of their offspring.
Rather than shunning our microbes, we should recognize them as partners in early life development. They're helping build our brains from the very beginning, say the researchers.
While this has only been shown in mouse models so far, there are enough biological similarities between mice and humans that there's a chance we're also shaped by our mother's microbes before we're born.
One of the reasons this matters is because practices like Cesarean sections and the use of antibiotics around birth are known to disrupt certain types of microbe activity – which may in turn be affecting the health of newborns.
https://www.sciencedirect.com/science/article/pii/S0018506X25000686...
Part 2
Sep 3
Dr. Krishna Kumari Challa
Mutations driving evolution are informed by the genome, not random, study suggests
A study by scientists published in the Proceedings of the National Academy of Sciences shows that an evolutionarily significant mutation in the human APOL1 gene arises not randomly but more frequently where it is needed to prevent disease. The finding fundamentally challenges the notion that evolution is driven by random mutations, and the authors tie the results to a new theory that offers, for the first time, a concept for how mutations arise.
The implications for biology, medicine, computer science, and perhaps even our understanding of the origin of life itself are potentially far-reaching.
A random mutation is a genetic change whose chance of arising is unrelated to its usefulness. Only once these supposed accidents arise does natural selection vet them, sorting the beneficial from the harmful. For over a century, scientists have thought that a series of such accidents has built up over time, one by one, to create the diversity and splendor of life around us.
However, it has never been possible to examine directly whether mutations in the DNA originate at random or not. Mutations are rare events relative to the genome's size, and technical limitations have prevented scientists from seeing the genome in enough detail to track individual mutations as they arise naturally.
To overcome this, researchers developed a new ultra-accurate detection method and recently applied it to the famous HbS mutation, which protects from malaria but causes sickle-cell anemia in homozygotes.
Results showed that the HbS mutation did not arise at random, but emerged more frequently exactly in the gene and population where it was needed. Now, they report the same nonrandom pattern in a second mutation of evolutionary significance.
The new study examines the de novo origination of a mutation in the human APOL1 gene that protects against a form of trypanosomiasis, a disease that devastated central Africa in historical times and until recently has caused tens of thousands of deaths there per year, while increasing the risk of chronic kidney disease in people with two copies.
If the APOL1 mutation arises by chance, it should arise at a similar rate in all populations, and only then spread under Trypanosoma pressure. However, if it is generated nonrandomly, it may actually arise more frequently where it is useful.
Results supported the nonrandom pattern: the mutation arose much more frequently in sub-Saharan Africans, who have faced generations of endemic disease, compared to Europeans, who have not, and in the precise genomic location where it confers protection.
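A comparison of this kind typically asks whether the de novo count in one population exceeds what equal underlying rates would predict. A minimal sketch using a conditional binomial test with invented counts (given the total number of events, under equal rates each event is equally likely to come from either equally sized sample):

```python
# Sketch of a two-sample rate comparison for de novo mutation counts.
# The counts below are invented for illustration, not data from the
# study, and assume equal numbers of genomes surveyed per population.
from math import comb

def binomial_tail(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

african_count, european_count = 14, 3   # hypothetical de novo events
total = african_count + european_count
# Small p-value: such an excess in one population is unlikely if the
# underlying mutation rates were actually equal.
p_value = binomial_tail(african_count, total)
print(f"rate ratio ~ {african_count / european_count:.1f}, p = {p_value:.4f}")
```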
Part 1
Sep 4
Dr. Krishna Kumari Challa
The new findings challenge the notion of random mutation fundamentally.
Historically, there have been two basic theories for how evolution happens— random mutation and natural selection, and Lamarckism—the idea that an individual directly senses its environment and somehow changes its genes to fit it. Lamarckism has been unable to explain evolution in general, so biologists have concluded that mutations must be random.
This new theory moves away from both of these concepts, proposing instead that two inextricable forces underlie evolution. While the well-known external force of natural selection ensures fitness, a previously unrecognized internal force operates inside the organism, putting together genetic information that has accumulated over generations in useful ways.
To illustrate, take fusion mutations, a type of mutation in which two previously separate genes fuse to form a new gene. As with all mutations, fusions have been thought to arise by accident: a gene moves by error to another location and, once in a great while, fuses with another gene in a way that yields a useful adaptation. But researchers have recently shown that genes do not fuse at random.
Instead, genes that have evolved to be used together repeatedly over generations are the ones that are more likely to get fused. Because the genome folds in 3D space, bringing genes that work together to the same place at the same time in the nucleus with their chromatin open, molecular mechanisms fuse these genes rather than others. An interaction involving complex regulatory information that has gradually evolved over generations leads to a mutation that simplifies and "hardwires" it into the genome.
Part 2
Sep 4
Dr. Krishna Kumari Challa
In the paper, they argue that fusions are a specific example of a more general and extensive internal force that applies across mutation types. Rather than local accidents arising at random locations in the genome disconnected from other genetic information, mutational processes put together multiple meaningful pieces of heritable information in many ways.
Genes that evolved to interact tightly are more likely to be fused; single-letter RNA changes that evolved to occur repeatedly across generations via regulatory phenomena are more likely to be "hardwired" as point mutations into the DNA; genes that evolved to interact in incipient networks, each under its own regulation, are more likely to be invaded by the same transposable element that later becomes a master-switch of the network, streamlining regulation, and so on. Earlier mutations influence the origination of later ones, forming a vast network of influences over evolutionary time.
Previous studies examined mutation rates as averages across genomic positions, masking the probabilities of individual mutations. But these new studies suggest that, at the scale of individual mutations, each mutation has its own probability, and the causes and consequences of mutation are related.
At each generation, mutations arise based on the information that has accumulated in the genome up to that time point, and those that survive become a part of that internal information.
This vast array of interconnected mutational activity gradually homes in over the generations on mutations relevant to the long-term pressures experienced, leading to long-term directed mutational responses to specific environmental pressures, such as the malaria- and Trypanosoma-protective HbS and APOL1 mutations.
New genetic information arises in the first place, they argue, as a consequence of the fact that mutations simplify genetic regulation, hardwiring evolved biological interactions into ready-made units in the genome. This internal force of natural simplification, together with the external force of natural selection, act over evolutionary time like combined forces of parsimony and fit, generating co-optable elements that themselves have an inherent tendency to come together into new, emergent interactions.
Co-optable elements are generated by simplification under performance pressure, and then engage in emergent interactions—the source of innovation is at the system level.
Understood in the proper timescale, an individual mutation does not arise at random nor does it invent anything in and of itself.
The potential depth of evolution from this new perspective can be seen by examining other networks. For example, the gene fusion mechanism—where genes repeatedly used together across evolutionary time are more likely to be fused together by mutation—echoes chunking, one of the most basic principles of cognition and learning in the brain, where pieces of information that repeatedly co-occur are eventually chunked into a single unit.
Yet fusions are only one instance of a broader principle: diverse mutational processes respond to accumulated information in the genome, combining it over generations into streamlined instructions. This view recasts mutations not as isolated accidents, but as meaningful events in a larger, long-term process.
Daniel Melamed et al, De novo rates of a Trypanosoma-resistant mutation in two human populations, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424538122
Part 3
Sep 4
Dr. Krishna Kumari Challa
New tool enables rapid, large-scale profiling of disease-linked RNA modifications
Researchers have developed a powerful tool capable of scanning thousands of biological samples to detect transfer ribonucleic acid (tRNA) modifications—tiny chemical changes to RNA molecules that help control how cells grow, adapt to stress and respond to diseases such as cancer and antibiotic-resistant infections. The tool opens up new possibilities for science, health care and industry, from accelerating disease research and enabling more precise diagnostics to guiding the development of more effective treatments.
Cancer and infectious diseases are complicated health conditions in which cells are forced to function abnormally by mutations in their genetic material or by instructions from an invading microorganism. The SMART-led research team is among the world's leaders in understanding how the epitranscriptome—the over 170 different chemical modifications of all forms of RNA—controls growth of normal cells and how cells respond to stressful changes in the environment, such as loss of nutrients or exposure to toxic chemicals. The researchers are also studying how this system is corrupted in cancer or exploited by viruses, bacteria and parasites in infectious diseases.
Current molecular methods used to study the expansive epitranscriptome and the thousands of different types of modified RNA are often slow, labor-intensive and costly, and they involve hazardous chemicals, all of which limits research capacity and speed.
To solve this problem, the SMART team developed a new tool that enables fast, automated profiling of tRNA modifications—molecular changes that regulate how cells survive, adapt to stress and respond to disease. This capability allows scientists to map cell regulatory networks, discover novel enzymes and link molecular patterns to disease mechanisms, paving the way for better drug discovery and development, and more accurate disease diagnostics.
SMART's automated system was specially designed to profile tRNA modifications across thousands of samples rapidly and safely. Unlike traditional methods—which are costly, labor‑intensive and use toxic solvents such as phenol and chloroform—this tool integrates robotics to automate sample preparation and analysis, eliminating the need for hazardous chemical handling and reducing costs. This advancement increases safety, throughput and affordability, enabling routine large‑scale use in research and clinical labs.
Jingjing Sun et al, tRNA modification profiling reveals epitranscriptome regulatory networks in Pseudomonas aeruginosa, Nucleic Acids Research (2025). DOI: 10.1093/nar/gkaf696
Sep 4
Dr. Krishna Kumari Challa
Fruit fly research shows that mechanical forces drive evolutionary change
A tissue fold known as the cephalic furrow, an evolutionary novelty that forms between the head and the trunk of fly embryos, plays a mechanical role in stabilizing embryonic tissues during the development of the fruit fly Drosophila melanogaster.
Researchers have integrated computer simulations with their experiments and showed that the timing and position of cephalic furrow formation are crucial for its function, preventing mechanical instabilities in the embryonic tissues.
The work appears in Nature.
The increased mechanical instability caused by embryonic tissue movements may have contributed to the origin and evolution of the cephalic furrow genetic program. This shows that mechanical forces can shape the evolution of new developmental features.
Mechanical forces shape tissues and organs during the development of an embryo through a process called morphogenesis. These forces cause tissues to push and pull on each other, providing essential information to cells and determining the shape of organs. Despite the importance of these forces, their role in the evolution of development is still not well understood.
Animal embryos undergo tissue flows and folding processes, involving mechanical forces, that transform a single-layered blastula (a hollow sphere of cells) into a complex multi-layered structure known as the gastrula. During early gastrulation, some flies of the order Diptera form a tissue fold at the head-trunk boundary called the cephalic furrow. This fold is a specific feature of a subgroup of Diptera and is therefore an evolutionary novelty of flies.
Part 1
Sep 4
Dr. Krishna Kumari Challa
The cephalic furrow is especially interesting because it is a prominent embryonic invagination whose formation is controlled by genes, but that has no obvious function during development. The fold does not give rise to specific structures, and later in development, it simply unfolds, leaving no trace.
With their experiments, the researchers show that the absence of the cephalic furrow leads to an increase in the mechanical instability of embryonic tissues and that the primary sources of mechanical stress are cell divisions and tissue movements typical of gastrulation. They demonstrate that the formation of the cephalic furrow absorbs these compressive stresses. Without a cephalic furrow, these stresses build up, and outward forces caused by cell divisions in the single-layered blastula cause mechanical instability and tissue buckling.
This intriguing physical role gave the researchers the idea that the cephalic furrow may have evolved in response to the mechanical challenges of dipteran gastrulation, with mechanical instability acting as a potential selective pressure.
Flies either form a cephalic furrow or, if they lack one, display widespread out-of-plane cell division, in which cells divide downward to reduce the surface area. Both mechanisms act as mechanical sinks that prevent tissue collision and distortion.
These findings uncover empirical evidence for how mechanical forces can influence the evolution of innovations in early development. The cephalic furrow may have evolved through genetic changes in response to the mechanical challenges of dipteran gastrulation. They show that mechanical forces are not just important for the development of the embryo but also for the evolution of its development.
Patterned invagination prevents mechanical instability during gastrulation, Nature (2025). DOI: 10.1038/s41586-025-09480-3
Bipasha Dey et al, Divergent evolutionary strategies pre-empt tissue collision in gastrulation. Nature (2025). DOI: 10.1038/s41586-025-09447-4
Part 2
Sep 4
Dr. Krishna Kumari Challa
Some sugar substitutes linked to faster cognitive decline
Some sugar substitutes may come with unexpected consequences for long-term brain health, according to a study published in Neurology. The study examined seven low- and no-calorie sweeteners and found that people who consumed the highest amounts experienced faster declines in thinking and memory skills compared to those who consumed the lowest amounts.
The link was even stronger in people with diabetes. While the study showed a link between the use of some artificial sweeteners and cognitive decline, it did not prove that they were a cause.
The artificial sweeteners examined in the study were aspartame, saccharin, acesulfame-K, erythritol, xylitol, sorbitol and tagatose. These are mainly found in ultra-processed foods like flavored water, soda, energy drinks, yogurt and low-calorie desserts. Some are also used as a standalone sweetener.
Low- and no-calorie sweeteners are often seen as a healthy alternative to sugar; however, the new findings suggest certain sweeteners may have negative effects on brain health over time.
The study included 12,772 adults from across Brazil. The average age was 52, and participants were followed for an average of eight years.
After adjusting for factors such as age, sex, high blood pressure and cardiovascular disease, researchers found people who consumed the highest amount of sweeteners showed faster declines in overall thinking and memory skills than those who consumed the lowest amount, with a decline that was 62% faster. This is the equivalent of about 1.6 years of aging. Those in the middle group had a decline that was 35% faster than the lowest group, equivalent to about 1.3 years of aging.
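The "years of aging" equivalence is typically obtained by dividing the exposure-associated extra decline by the decline associated with one year of age in the same regression model. A sketch with invented coefficients chosen to reproduce the 1.6-year figure:

```python
# How a "62% faster decline = ~1.6 years of aging" statement is usually
# derived: compare the exposure coefficient to the age coefficient from
# the same model. Both numbers below are invented for illustration and
# are not coefficients from the study.

decline_per_year_of_age = -0.010  # cognitive z-score change per year of age
extra_decline_highest = -0.016    # additional annual change linked to exposure

equivalent_years = extra_decline_highest / decline_per_year_of_age
print(f"equivalent to ~{equivalent_years:.1f} years of aging")
```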
When researchers broke the results down by age, they found that people under the age of 60 who consumed the highest amounts of sweeteners showed faster declines in verbal fluency and overall cognition when compared to those who consumed the lowest amounts. They did not find links in people over 60. They also found that the link to faster cognitive decline was stronger in participants with diabetes than in those without diabetes.
When looking at individual sweeteners, consuming aspartame, saccharin, acesulfame-K, erythritol, sorbitol and xylitol was associated with a faster decline in overall cognition, particularly in memory.
They found no link between the consumption of tagatose and cognitive decline.
https://www.neurology.org/doi/10.1212/WNL.0000000000214023
Sep 4
Dr. Krishna Kumari Challa
SeeMe detects hidden signs of consciousness in brain injury patients
SeeMe, a computer vision tool tested by Stony Brook University researchers, was able to detect low-amplitude, voluntary facial movements in comatose acute brain injury patients days before clinicians could identify overt responses.
There are ways to detect covert consciousness with EEG and fMRI, though these are not always available. Many acute brain injury patients appear unresponsive in early care, with signs that are so small, or so infrequent, that they are simply missed.
In the study, "Computer vision detects covert voluntary facial movements in unresponsive brain injury patients," published in Communications Medicine, investigators designed SeeMe to quantify tiny facial movements in response to auditory commands with the objective of identifying early, stimulus-evoked behavior.
A single-center prospective cohort included 37 comatose acute brain injury patients and 16 healthy volunteers, aged 18–85, enrolled at Stony Brook University Hospital. Patients had initial Glasgow Coma Scale scores ≤8 and no prior neurologically debilitating diagnoses.
SeeMe tagged facial pores at ~0.2 mm resolution and tracked movement vectors while subjects heard three commands: open your eyes, stick out your tongue, and show me a smile.
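The detection idea can be sketched as tracking many small landmarks frame-to-frame and flagging a response when post-command movement exceeds the pre-command baseline. This is a simplified stand-in, not the SeeMe implementation, and all numbers are synthetic:

```python
# Sketch of stimulus-evoked movement detection: measure per-frame
# displacement of tracked facial points, then flag a response when
# post-command amplitude exceeds the pre-command baseline. Simplified
# stand-in for illustration; not the SeeMe pipeline.
from statistics import mean, stdev

def movement_amplitude(frame_a, frame_b):
    """Mean Euclidean displacement of tracked points between two frames."""
    return mean(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5
                for (ax, ay), (bx, by) in zip(frame_a, frame_b))

def detect_response(amplitudes, command_frame, z_threshold=3.0):
    """Flag a response if any post-command amplitude exceeds the
    pre-command baseline by more than z_threshold standard deviations."""
    baseline = amplitudes[:command_frame]
    post = amplitudes[command_frame:]
    mu, sd = mean(baseline), stdev(baseline)
    return any((a - mu) / sd > z_threshold for a in post)

# Two synthetic frames of tracked facial points (arbitrary coordinates)
print(movement_amplitude([(0.0, 0.0)], [(3.0, 4.0)]))  # 5.0

# Synthetic per-frame amplitudes: quiet baseline, then a brief small movement
baseline = [0.01, 0.012, 0.009, 0.011, 0.010, 0.012, 0.011, 0.009]
post_command = [0.011, 0.05, 0.04, 0.012]
print(detect_response(baseline + post_command, command_frame=len(baseline)))  # True
```

Framing detection as a statistical threshold over a per-patient baseline is what lets very low-amplitude movements register: the comparison is against that patient's own quiet periods, not an absolute cutoff.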
Results indicate earlier and broader detection with SeeMe. Eye-opening was detected on average 9.1 (± 5.5) days after injury by SeeMe versus 13.2 (± 11.4) days by clinical examination, yielding a 4.1-day lead.
SeeMe identified eye-opening in 30 of 35 patients (85.7%) compared with 25 of 35 (71.4%) by clinical exam. Among patients without an endotracheal tube obscuring the mouth, SeeMe detected mouth movements in 16 of 17 (94.1%).
In seven patients with analyzable mouth videos and clinical command following, SeeMe identified reproducible mouth responses 8.3 days earlier on average. Amplitude and frequency of SeeMe-positive responses correlated with discharge outcomes on the Glasgow Outcome Scale-Extended, with significant Kruskal-Wallis results for both features.
A deep neural network classifier trained on SeeMe-positive trials identified which command had been given with 81% accuracy for eye-opening and an overall accuracy of 65%. Accuracy for the other commands was lower, at 37% for tongue movement and 47% for smile, indicating stronger specificity for eye-opening.
Authors conclude that acute brain injury patients can exhibit low-amplitude, stimulus-evoked facial movements before overt signs appear at bedside, suggesting that many covertly conscious patients may have motor behavior currently undetected by clinicians.
Earlier detection could inform family discussions, guide rehabilitation timing, and serve as a quantitative signal for future monitoring or interface-based communication strategies, while complementing standard examinations.
Xi Cheng et al, Computer vision detects covert voluntary facial movements in unresponsive brain injury patients, Communications Medicine (2025). DOI: 10.1038/s43856-025-01042-y
Sep 4
Dr. Krishna Kumari Challa
Pancreatic insulin disruption triggers bipolar disorder-like behaviors in mice, study shows
Bipolar disorder is a psychiatric disorder characterized by alternating episodes of depression (i.e., low mood and a loss of interest in everyday activities) and mania (i.e., a state in which arousal and energy levels are abnormally high). On average, an estimated 1–2% of people worldwide are diagnosed with bipolar disorder at some point during their lives.
Bipolar disorder can be highly debilitating, particularly if left untreated. Understanding the neural and physiological processes that contribute to its emergence could thus be very valuable, as it could inform the development of new prevention and treatment strategies.
In addition to experiencing periodic changes in mood, individuals diagnosed with this disorder often exhibit some metabolic symptoms, including changes in their blood sugar levels. While some previous studies reported an association between blood sugar control mechanisms and bipolar disorder, the biological link between the two has not yet been uncovered.
Researchers recently carried out a study aimed at further exploring the link between insulin secretion and bipolar disorder-like behaviors, particularly focusing on the expression of the gene RORβ.
Their findings, published in Nature Neuroscience, show that overexpression of this gene in a subtype of pancreatic cells disrupts the release of insulin, which in turn prompts a feedback loop with a region of the brain known as the hippocampus, producing alternating depression-like and mania-like behaviors in mice.
The results in mice point to a pancreas–hippocampus feedback mechanism by which metabolic and circadian factors cooperate to generate behavioral fluctuations, and which may play a role in bipolar disorder, wrote the authors.
Yao-Nan Liu et al, A pancreas–hippocampus feedback mechanism regulates circadian changes in depression-related behaviors, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02040-y.
Sep 4
Dr. Krishna Kumari Challa
Shampoo-like gel could help chemo patients keep their hair
Cancer fighters know that losing their hair is often part of the battle, but researchers have developed a shampoo-like gel, tested so far in animal models, that could keep hair from falling out during chemotherapy treatment.
Baldness from chemotherapy-induced alopecia causes personal, social and professional anxiety for everyone who experiences it. Currently, there are few solutions—the only ones that are approved are cold caps worn on the patient's head, which are expensive and have their own extensive side effects.
The gel is a hydrogel, which absorbs a lot of water and provides long-lasting delivery of drugs to the patient's scalp. The hydrogel is designed to be applied to the patient's scalp before the start of chemotherapy and left on their head as long as the chemotherapy drugs are in their system—or until they are ready to easily wash it off.
During chemotherapy treatment, chemotherapeutic drugs circulate throughout the body. When these drugs reach the blood vessels surrounding the hair follicles on the scalp, they kill or damage the follicles, which releases the hair from the shaft and causes it to fall out.
The gel, containing the drugs lidocaine and adrenalone, prevents most of the chemotherapy drugs from reaching the hair follicles by restricting blood flow to the scalp. This dramatic reduction in the amount of drug reaching the follicles helps protect the hair and prevent it from falling out.
To support practical use of this "shampoo," the gel is designed to be temperature responsive. For example, at body temperature, the gel is thicker and clings to the patient's hair and scalp surface. When the gel is exposed to slightly cooler temperatures, the gel becomes thinner and more like a liquid that can be easily washed away.
Romila Manchanda et al, Hydrogel-based drug delivery system designed for chemotherapy-induced alopecia, Biomaterials Advances (2025). DOI: 10.1016/j.bioadv.2025.214452
Sep 4
Dr. Krishna Kumari Challa
How aging drives neurodegenerative diseases
A research team has identified a direct molecular link between aging and neurodegeneration by investigating how age-related changes in cell signaling contribute to toxic protein aggregation.
Although aging is the biggest risk factor for neurodegenerative diseases, scientists still don't fully understand which age-associated molecular alterations drive their development.
Using the small nematode worm Caenorhabditis elegans, a research team studied a signaling pathway that leads to pathological protein accumulation with age. Their new paper is published in Nature Aging.
The team focused on the aging-associated protein EPS8 and the signaling pathways it regulates. This protein is known to accumulate with age and to activate harmful stress responses that lead to a shorter lifespan in worms.
Researchers found that increased levels of EPS8, and the activation of its signaling pathways, drive pathological protein aggregation and neurodegeneration—typical features of age-associated neurodegenerative diseases such as Huntington's disease and amyotrophic lateral sclerosis (ALS). By reducing EPS8 activity, the group was then able to prevent the build-up of the toxic protein aggregates and preserve neuronal function in worm models of these two diseases.
Importantly, EPS8 and its signaling partners are evolutionarily conserved and also present in human cells. Similar to what they achieved in the worms, the team was able to prevent the accumulation of toxic protein aggregates in human cell models of Huntington's disease and ALS by reducing EPS8 levels.
Seda Koyuncu et al, The aging factor EPS8 induces disease-related protein aggregation through RAC signaling hyperactivation, Nature Aging (2025). DOI: 10.1038/s43587-025-00943-w
Sep 4
Dr. Krishna Kumari Challa
Cooling pollen sunscreen can block UV rays without harming corals
Materials scientists have invented the world's first pollen-based sunscreen derived from Camellia flowers.
In experiments, the pollen-based sunscreen absorbed and blocked harmful ultraviolet (UV) rays as effectively as commercially available sunscreens, which commonly use minerals like titanium dioxide (TiO2) and zinc oxide (ZnO).
In laboratory tests on corals, commercial sunscreen induced coral bleaching in just two days, leading to coral death by day six. Each year, an estimated 6,000 to 14,000 tons of commercial sunscreen make their way into the ocean, as people wash it off in the sea or it flows in from wastewater.
In contrast, the pollen-based sunscreen did not affect the corals, which remained healthy even up to 60 days.
In other tests, the pollen-based sunscreen also demonstrated its ability to reduce surface skin temperature, thereby helping to keep the skin cool in the presence of simulated sunlight.
Nature's Guard: UV Filter from Pollen, Advanced Functional Materials (2025). DOI: 10.1002/adfm.202516936
Sep 5
Dr. Krishna Kumari Challa
Why we slip on ice: Physicists challenge centuries-old assumptions
For over a hundred years, schoolchildren around the world have learned that ice melts when pressure and friction are applied. When you step onto an icy pavement in winter, you can slip because of the pressure exerted by your body weight through the sole of your (still warm) shoe. But it turns out that this explanation misses the mark.
New research reveals that it's not pressure or friction that causes ice to become slippery, but rather the interaction between molecular dipoles in the ice and those on the contacting surface, such as a shoe sole.
The work is published in the journal Physical Review Letters. This insight overturns a paradigm established nearly two centuries ago by Lord Kelvin's brother, James Thomson, who proposed that pressure and friction contribute to ice melting alongside temperature.
It turns out that neither pressure nor friction plays a particularly significant part in forming the thin liquid layer on ice.
Instead, computer simulations by researchers reveal that molecular dipoles are the key drivers behind the formation of this slippery layer, which so often causes us to lose our footing in winter. But what exactly is a dipole? A molecular dipole arises when a molecule has regions of partial positive and partial negative charge, giving the molecule an overall polarity that points in a specific direction.
Achraf Atila et al, Cold Self-Lubrication of Sliding Ice, Physical Review Letters (2025). DOI: 10.1103/1plj-7p4z
Sep 5
Dr. Krishna Kumari Challa
Fetal brain harm linked to pregnancy infection
A specific bacterial infection during pregnancy that can cause severe harm to the unborn brain has been identified for the first time, in a finding that could have huge implications for prenatal health.
Previous studies have disagreed on whether fetal exposure to Ureaplasma parvum has a detrimental effect on brain development, so newborn health specialists set out to determine the answer once and for all.
Ureaplasma parvum Serovars 3 and 6 are among the most common types that are isolated in pregnancies complicated by infection/inflammation, so they tested them individually in a pre-clinical model and the results were clear.
They showed that long-term exposure to a specific subtype of Ureaplasma (parvum 6) resulted in loss of cells that are responsible for the production of myelin (the fatty sheath that insulates nerve cells in the brain).
This resulted in less myelin production and a disruption to the architecture of myelin in the brain. This sort of disruption to myelin production can have a devastating and lifelong impact on neurodevelopment, cognition and motor function.
By contrast, they also showed that exposure to another subtype of Ureaplasma (parvum 3) had little effect on neurodevelopment.
Many of the babies affected by this infection in utero are at a much higher risk of preterm birth and the chronic intensive care and inflammation associated with being born too early, the researchers say.
Dima Abdu et al, Intra-amniotic infection with Ureaplasma parvum causes serovar-dependent white matter damage in preterm fetal sheep, Brain Communications (2025). DOI: 10.1093/braincomms/fcaf182
Sep 5
Dr. Krishna Kumari Challa
After early-life stress, astrocytes can affect behaviour
Astrocytes in the lateral hypothalamus region of the brain, an area involved in the regulation of sleep and wakefulness, play a key role in neuron activity in mice and affect their behavior, researchers have found.
By broadening medical science's understanding of cerebral mechanisms, the discovery could someday help in the treatment and prevention of depression in humans, the researchers say.
According to the scientific literature, early-life stress leads to a five-fold increase in the risk of developing a mental-health disorder as an adult, notably causing treatment-resistant disorders.
As brain cells, astrocytes are sensitive to variations in the blood concentration of metabolites and, in response to changes in the blood, astrocytes can modulate the extent of their interaction with neurons, their neighboring cells.
In mice, those changes are particularly responsive to the level of corticosterone, the stress hormone in the rodents' blood.
In adult mice that experienced early-life stress, researchers observed abnormally high levels of corticosterone. The impact of stress on behavior also differed by sex: females were less active at night, while males were hyperactive during the day.
In people with depression who have experienced a similar type of stress, these sex differences have also been observed.
Lewis R. Depaauw-Holt et al, A divergent astrocytic response to stress alters activity patterns via distinct mechanisms in male and female mice, Nature Communications (2025). DOI: 10.1038/s41467-025-61643-y
Sep 5
Dr. Krishna Kumari Challa
Is this the future of food? 'Sexless' seeds that could transform farming
Scientists are tinkering with plant genes to create crops that seed their own clones, with a host of benefits for farmers.
Sacks of seeds without the sex
Agriculture is on the brink of a revolution: grain crops that produce seeds asexually. The technology — trials of which could start sprouting as early as next month — exploits a quirk of nature called apomixis, in which plants create seeds that produce clones of the parent. Apomixis could slash the time needed to create new varieties of crops, and give smallholder farmers access to affordable high-yielding sorghum (Sorghum bicolor) and cowpea (Vigna unguiculata). But before self-cloning crops can be commercialized, the technology must run the regulatory gauntlet.
https://www.nature.com/articles/d41586-025-02753-x?utm_source=Live+...
Sep 5
Dr. Krishna Kumari Challa
Macaws learn by watching interactions of others, a skill never seen in animals before
One of the most effective ways we learn is through third-party imitation, where we observe and then copy the actions and behaviors of others. Until recently, this was thought to be a unique human trait, but a new study published in Scientific Reports reveals that macaws also possess this ability.
Second-party imitation is already known to exist in the animal kingdom. Parrots are renowned for their ability to imitate human speech and actions, and primates, such as chimpanzees, have learned to open a puzzle box by observing a human demonstrator. But third-party imitation is different because it involves learning by observing two or more individuals interact rather than by direct instruction.
Scientists chose blue-throated macaws for this study because they live in complex social groups in the wild, where they need to learn new behaviors to fit in quickly. Parrots, like macaws, are also very smart and can do things like copy sounds and make tools.
To find out whether macaws could learn through third-party imitation, researchers worked with two groups of them, performing more than 4,600 trials. In the test group, these birds watched another macaw perform one of five different target actions in response to a human's hand signals. These were fluffing up feathers, spinning its body, vocalizing, lifting a leg or flapping its wings. In the control group, macaws were given the same hand signals without ever seeing another bird perform the actions.
The results were clear. The test group learned more actions than the control group, meaning that watching the interactions between the demonstrator macaw and the human experimenter helped them learn specific behaviors. They also learned these actions faster and performed them more accurately than the control group. The study's authors suggest that this ability to learn by passively observing two or more interacting individuals helps macaws adapt to new social situations and may even contribute to their own cultural traditions.
The research shows that macaws aren't just smart copycats. They may also have their own complex social lives and cultural norms, similar to humans.
Esha Haldar et al, Third-party imitation is not restricted to humans, Scientific Reports (2025). DOI: 10.1038/s41598-025-11665-9
Sep 6
Dr. Krishna Kumari Challa
Physicists create a new kind of time crystal that humans can actually see
Imagine a clock that doesn't have electricity, but its hands and gears spin on their own for all eternity. In a new study, physicists have used liquid crystals, the same materials that are in your phone display, to create such a clock—or, at least, as close as humans can get to that idea. The team's advancement is a new example of a "time crystal." That's the name for a curious phase of matter in which the pieces, such as atoms or other particles, exist in constant motion.
The researchers aren't the first to make a time crystal, but their creation is the first that humans can actually see, which could open a host of technological applications. They can be observed directly under a microscope and even, under special conditions, by the naked eye.
In the study, the researchers designed glass cells filled with liquid crystals—in this case, rod-shaped molecules that behave a little like a solid and a little like a liquid. Under special circumstances, if you shine a light on them, the liquid crystals will begin to swirl and move, following patterns that repeat over time.
Time Crystal in motion
Under a microscope, these liquid crystal samples resemble psychedelic tiger stripes, and they can keep moving for hours—similar to that eternally spinning clock.
Hanqing Zhao et al, Space-time crystals from particle-like topological solitons, Nature Materials (2025). DOI: 10.1038/s41563-025-02344-1
Sep 6
Dr. Krishna Kumari Challa
Ant Queens Produce Offspring of Two Different Species
Some ant queens can produce offspring of more than one species – even when the other species is not known to exist in the nearby wild.
No other animal on Earth is known to do this, and it's hard to believe.
This is a bizarre story of a system that allows things to happen that seem almost unimaginable.
Some queen ants are known to mate with other species to produce hybrid workers, but Iberian harvester ants (Messor ibericus) on the island of Sicily go even further, blurring the lines of species-hood. These queens can give birth to cloned males of another species entirely (Messor structor), with which they can mate to produce hybrid workers.
Their reproductive strategy is akin to "sexual domestication", the researchers say. The queens have come to control the reproduction of another species, which they exploited from the wild long ago – similar to how humans have domesticated dogs.
In the past, the Iberian ants seem to have stolen the sperm of another species they once depended on, creating an army of male M. structor clones to reproduce with whenever they wanted.
According to genetic analysis, the colony's offspring are two distinct species, and yet they share the same mother.
Some are the hairy M. ibericus, and the others the hairless M. structor – the closest wild populations of which live more than a thousand kilometers away.
The queen can mate with males of either species in the colony to reproduce. Mating with M. ibericus males produces the next generation of queens, while mating with M. structor males results in more workers. As such, all workers in the colony are hybrids of M. ibericus and M. structor.
Part 1
Sep 7
Dr. Krishna Kumari Challa
The two species diverged from each other more than 5 million years ago, and yet today, M. structor is a sort of parasite living in the queen's colony. But she is hardly a victim.
The Iberian harvester queen ant controls what happens to the DNA of her clones. When she reproduces, she can do so asexually, producing a clone of herself. She can also fertilize her egg with the sperm of her own species or M. structor, or she can 'delete' her own nuclear DNA and use her egg as a vessel solely for the DNA of her male M. structor clones.
This means her offspring can either be related to her, or to another species. The only similarity is that both groups contain the queen's mitochondrial DNA. The result is greater diversity in the colony without the need for a wild neighboring species to mate with.
It also means that Iberian harvester ants belong to a 'two-species superorganism' – which the researchers say "challenges the usual boundaries of individuality."
The M. structor males produced by M. ibericus queens don't look exactly the same as males produced by M. structor queens, but their genomes match.
https://www.nature.com/articles/s41586-025-09425-w
Part 2
Sep 7
Dr. Krishna Kumari Challa
Electrical stimulation can reprogram immune system to heal the body faster
Scientists have discovered that electrically stimulating macrophages—one of the immune system's key players—can reprogram them in such a way as to reduce inflammation and encourage faster, more effective healing in disease and injury.
This breakthrough uncovers a potentially powerful new therapeutic option, with further work ongoing to delineate the specifics.
Macrophages are a type of white blood cell with several high-profile roles in our immune system. They patrol around the body, surveying for bugs and viruses, as well as disposing of dead and damaged cells, and stimulating other immune cells—kicking them into gear when and where they are needed.
However, their actions can also drive local inflammation, which can sometimes spiral out of control and cause more damage to the body than repair. This occurs in many different diseases, highlighting the need to regulate macrophages to improve patient outcomes.
In the study, published in the journal Cell Reports Physical Science, the researchers worked with human macrophages isolated from healthy donor blood samples.
They stimulated these cells using a custom bioreactor to apply electrical currents and measured what happened.
The scientists discovered that this stimulation caused a shift of macrophages into an anti-inflammatory state that supports faster tissue repair; a decrease in inflammatory marker (signaling) activity; an increase in expression of genes that promote the formation of new blood vessels (associated with tissue repair as new tissues form); and an increase in stem cell recruitment into wounds (also associated with tissue repair).
Electromodulation of human monocyte-derived macrophages drives a regenerative phenotype and impedes inflammation, Cell Reports Physical Science (2025). DOI: 10.1016/j.xcrp.2025.102795
Sep 8
Dr. Krishna Kumari Challa
Human brains explore more to avoid losses than to seek gains
Researchers traced a neural mechanism that explains why humans explore more aggressively when avoiding losses than when pursuing gains. Their work reveals how neuronal firing and noise in the amygdala shape exploratory decision-making.
Human survival has its origins in a delicate balance of exploration versus exploitation. There is safety in exploiting what is known: the local hunting grounds, the favorite foraging location, the go-to deli with the familiar menu. But exploitation carries the risk of over-reliance on the familiar, whether through depletion of local resources or a change in their stability.
Exploring the world in the hope of discovering better options has its own set of risks and rewards. There is the chance of finding plentiful hunting grounds, alternative foraging resources, or a new deli that offers a fresh take on old favorites. And there is the risk that new hunting grounds will be scarce, the newly foraged berries poisonous, or that the meal time will be ruined by a deli that disappoints.
Exploration-exploitation (EE) dilemma research has concentrated on gain-seeking contexts, identifying exploration-related activity in surface brain areas like cortex and in deeper regions such as the amygdala. Exploration tends to rise when people feel unsure about which option will pay off.
Loss-avoidance strategies differ from gain-seeking strategies, with links to negative outcomes such as PTSD, anxiety and mood disorders.
In the study, "Rate and noise in human amygdala drive increased exploration in aversive learning," published in Nature, researchers recorded single-unit activity to compare exploration during gain versus loss learning.
Part 1
Sep 9
Dr. Krishna Kumari Challa
Researchers examined how strongly neurons fired just before each choice and how variable that activity was across trials. Variability served as "neural noise." Computational models estimated how closely choices followed expected value and how much they reflected overall uncertainty.
Results showed more exploration in loss trials than in gain trials. After people learned which option was better, exploration stayed higher in loss trials and accuracy fell more from its peak.
Pre-choice firing in the amygdala and temporal cortex rose before exploratory choices in both gain and loss, indicating a shared, valence-independent rate signal. Loss trials also showed noisier amygdala activity from cue to choice.
More noise was linked to higher uncertainty and a higher chance of exploring, and noise declined as learning progressed. Using the measured noise levels in decision models reproduced the extra exploration seen in loss trials. Loss aversion did not explain the gap.
Researchers report two neural signals that shape exploration. A valence-independent firing-rate increase in the amygdala and temporal cortex precedes exploratory choices in both gain and loss. A loss-specific rise in amygdala noise raises the odds of exploration under potential loss, scales with uncertainty, and wanes as learning accrues.
Behavioral modeling matches this pattern, with value-only rules fitting gain choices and value-plus-total-uncertainty rules fitting loss choices. Findings point to neural variability as a lever that tilts strategy toward more trial-and-error when loss looms, while causal tests that manipulate noise remain to be done.
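The value-only versus value-plus-uncertainty distinction can be sketched as a softmax choice rule in which total uncertainty flattens the policy (an illustrative toy model with made-up parameter names, not the paper's fitted one):

```python
import numpy as np

def choice_probs(values, beta=3.0, total_uncertainty=0.0, gamma=0.0):
    """Softmax over option values; gamma * total_uncertainty lowers the
    effective inverse temperature, i.e. more noise -> more exploration."""
    eff_beta = beta / (1.0 + gamma * total_uncertainty)
    v = np.asarray(values, dtype=float)
    z = eff_beta * (v - v.max())          # subtract max for numerical stability
    p = np.exp(z)
    return p / p.sum()

# Gain context: value-only rule (no uncertainty term).
p_gain = choice_probs([1.0, 0.0])
# Loss context: same values, but total uncertainty inflates choice noise.
p_loss = choice_probs([1.0, 0.0], total_uncertainty=2.0, gamma=1.0)
```

With these illustrative numbers the worse option is chosen far more often under loss (about 27% versus 5%), mirroring the extra exploration seen in the loss trials.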
From an evolutionary survival perspective, the strategy fits well with the need to seek out new resources when facing the loss of safe or familiar choices. While one might consider trying a new restaurant at any time, true seeking behavior will become a priority if the favorite location is closed for remodeling.
Tamar Reitich-Stolero et al, Rate and noise in human amygdala drive increased exploration in aversive learning, Nature (2025). DOI: 10.1038/s41586-025-09466-1
Part 2
Sep 9
Dr. Krishna Kumari Challa
Scientists discover why the flu is more deadly for older people
Scientists have discovered why older people are more likely to suffer severely from the flu, and can now use their findings to address this risk.
In a study published in PNAS, experts discovered that older people produce a glycosylated protein called apolipoprotein D (ApoD), which is involved in lipid metabolism and inflammation, at much higher levels than younger people do. This reduces the patient's ability to resist virus infection, resulting in a more serious disease outcome.
ApoD is therefore a target for therapeutic intervention to protect against severe influenza virus infection in the elderly, which would have a major impact on reducing morbidity and mortality in the aging population.
ApoD mediates age-associated increase in vulnerability to influenza virus infection, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2423973122
Sep 9