Glow-in-the-dark succulents that recharge with sunlight could pave the way to plant-based lighting systems
From mushrooms that cast a soft green glow to plankton that glimmers sparkling blue, glowing plants are nothing new to nature. Now, scientists are bringing that light to houseplants.
Reporting in the journal Matter, researchers crafted glow-in-the-dark succulents that recharge in sunlight. Injected with light-emitting compounds, the plants can shine in various colors and rival a small night light at their brightest. The simple, low-cost method may help lay the foundation for sustainable, plant-based lighting systems.
Researchers used afterglow phosphor particles—materials similar to those found in glow-in-the-dark toys. These compounds absorb light and release it slowly over time.
For the particles to travel through leaf tissues, the researchers had to get the size just right: around 7 micrometers, roughly the width of a red blood cell.
Smaller, nano-sized particles moved easily within the plant but were dimmer; larger particles glowed brighter but couldn't travel far inside the plant.
The team then injected the particles into several plant species, including succulents and non-succulents like golden pothos and bok choy. But only the succulents produced a strong glow, thanks to the narrow, uniform, and evenly distributed channels within the leaf that helped to disperse the particles more effectively. After a couple of minutes of exposure to sunlight or indoor LED light, the modified plants glowed for up to two hours.
The particles diffused in just seconds, and the entire succulent leaf glowed.
By using different types of phosphors, the researchers created plants that shine in various colors, including green, red, and blue. They even built a glowing plant wall with 56 succulents, bright enough to illuminate nearby objects and to read text by.
Each plant takes about 10 minutes to prepare.
The glowing succulents' light fades over time, and the team is still studying the long-term safety of the materials for the plants.
Sunlight-powered multicolor and uniform luminescence in material-engineered living plants, Matter (2025). DOI: 10.1016/j.matt.2025.102370
Melanin pigments account for up to a quarter of a feather's weight, study finds
Birds are some of the most striking creatures on Earth, coming in a rainbow of colors that serve several important functions, such as attracting a mate and communicating with other birds. These vibrant hues are produced by pigments, primarily melanin, but a major unknown until now was how much these pigments weigh. Since wings need to be as light as possible for flight, understanding pigmentation weight may tell us something about the trade-off between the evolutionary benefits of colored feathers and the physical cost of carrying that weight.
In a new study published in the journal Biology Letters, scientists have investigated how much melanin adds to the weight of feathers and the difference in weight between the two main chemical forms of melanin—eumelanin (responsible for brown and black colors) and pheomelanin (responsible for reds and lighter colors).
The researchers analyzed the feathers from 109 bird specimens across 19 different species, including the common kingfisher (Alcedo atthis), the golden eagle (Aquila chrysaetos) and the Eurasian bullfinch (Pyrrhula pyrrhula). They examined feathers with mixed colors and those with single, pure colors, and used a chemical process involving sodium hydroxide (or caustic soda, as it is more commonly known) to extract the pigments. Once extracted, the pigments were weighed and compared to the original weight of the feathers.
According to the scientists, melanin pigments account for about 22% of a feather's total weight, and in the most pigmented feathers, this weight is no more than 25%. Additionally, the two types of pigments don't weigh the same. Eumelanin, which makes feathers black or brown, was significantly heavier than pheomelanin, which produces lighter colors.
The paper suggests that a bird has to expend more energy to carry the weight of certain feather colors, which could explain why birds have evolved such a wide variety of hues.
The researchers suggested that birds in cold climates, like snowy owls, didn't just evolve white feathers for camouflage. The lack of a heavy pigment would allow them to grow thicker, more insulating feathers without adding too much weight. Therefore, they can stay warm and still be able to fly. Additionally, migratory birds may have evolved lighter feathers to reduce the energy cost of flying long distances.
Ismael Galván et al, Pigment contribution to feather mass depends on melanin form and is restricted to approximately 25%, Biology Letters (2025). DOI: 10.1098/rsbl.2025.0299
Microbes in soil may affect hormones tied to love, mental health and social bonds
Experts are exploring evidence that microbes in the soil and the environments around us can affect human microbiota and the "gut-brain axis," potentially shaping emotional states and relationship dynamics—including aspects of romantic love.
They outline the idea in a review article in mSystems proposing how the human gut microbiome might influence hormonal pathways involved in emotions commonly associated with love.
The researchers are not claiming that microbes 'cause' love or hatred. The aim is to map plausible biological routes, grounded in microbiology and endocrinology, that can now be evaluated with rigorous human studies.
The mini-review, "Does a microbial-endocrine interplay shape love-associated emotions in humans? A hypothesis," synthesizes evidence that microbes can modulate hormones and key neurotransmitters such as dopamine, serotonin and oxytocin.
The researchers are exploring how the evolutionary underpinnings of microbial-endocrine interactions could provide important insights into how microbes influence emotions beyond love, including hate and aggression.
If these pathways are confirmed, the findings could open avenues for microbiome-informed strategies to support mental health and relational well-being. For now, the review provides a roadmap for careful, hypothesis-driven science.
Jake M. Robinson et al, Does a microbial-endocrine interplay shape love-associated emotions in humans? A hypothesis, mSystems (2025). DOI: 10.1128/msystems.00415-25
Inflammation may explain why women with no standard modifiable risk factors have heart attacks and strokes
Why do people who lead very healthy lives still get heart attacks and strokes?
Cardiologists have long known that up to half of all heart attacks and strokes occur among apparently healthy individuals who do not smoke and do not have high blood pressure, high cholesterol, or diabetes, the "standard modifiable risk factors" which doctors often call "SMuRFs."
How to identify risk among the "SMuRF-Less" has been an elusive goal in preventive cardiology, particularly in women who are often under-diagnosed and under-treated.
A new study has now found that hsCRP—a marker of inflammation—can help identify women who are at risk but are missed by current screening algorithms.
Results were presented at a late-breaking clinical science session at the European Society of Cardiology (ESC) Congress and simultaneously published in the European Heart Journal.
Women who suffer from heart attacks and strokes yet have no standard modifiable risk factors are not identified by the risk equations doctors use in daily practice.
Yet the new data clearly shows that apparently healthy women who are inflamed are at substantial lifetime risk. Medical professionals should be identifying these women in their 40s, at a time when they can initiate preventive care, not wait for the disease to establish itself in their 70s when it is often too late to make a real difference.
As part of the study, researchers studied 12,530 initially healthy women with no standard modifiable risk factors who had the inflammatory biomarker hsCRP measured at study entry and who were then followed over 30 years.
Despite the lack of traditional risks, women who were inflamed as defined by hsCRP levels > 3 mg/L had a 77% increased lifetime risk of coronary heart disease, a 39% increased lifetime risk of stroke, and a 52% increased lifetime risk of any major cardiovascular event.
Additionally, researchers released a new analysis of randomized trial data showing that "SMuRF-Less but Inflamed" patients can reduce their risk of heart attack and stroke by 38% using statin therapy.
While those with inflammation should aggressively initiate lifestyle and behavioral preventive efforts, statin therapy could also play an important role in helping reduce risk among these individuals, say the researchers.
Paul Ridker et al, C-Reactive Protein and Cardiovascular Risk Among Women with No Standard Modifiable Risk Factors: Evaluating The "SMuRF-Less but Inflamed", European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf658
Placebo pain relief works differently across the human body
Researchers have used placebo pain relief to uncover a map-like system in the brainstem that controls pain differently depending on where it's felt in the body. The findings may pave the way for safer, more targeted treatments for chronic pain that don't rely on opioids.
Like a highway, the brainstem connects the brain to the spinal cord and manages all signals going to and from the brain. It produces and releases nearly all the neurochemicals needed for thinking, survival and sensing.
Published in Science, the study used 7-Tesla functional magnetic resonance imaging (fMRI)—one of the most powerful brain scanners available—to pinpoint how two key brainstem regions manage pain through placebo effects.
Researchers exposed 93 healthy participants to heat pain on different body parts and applied a placebo pain-relief cream while secretly lowering the temperature, conditioning them to believe the cream was alleviating their pain.
The temperature used was individually adjusted to be moderately painful as perceived by each participant: using a self-report scale where 0 was no pain and 100 was the worst pain imaginable, the researchers sought a temperature that each participant rated between 40 and 50.
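To picture that calibration step, here is a minimal sketch of an adaptive staircase that nudges the temperature up or down until the participant's rating lands in the 40-50 band. It is an illustration under assumed step sizes and safety limits, not the authors' actual procedure.

```python
# Hypothetical staircase calibration (illustrative; not the study's protocol).
# Adjust the stimulus temperature until self-reported pain (0-100) falls
# in the 40-50 target band.

def calibrate_temperature(get_rating, start_c=45.0, step_c=0.5,
                          min_c=40.0, max_c=49.0, max_trials=20):
    temp = start_c
    for _ in range(max_trials):
        rating = get_rating(temp)              # participant reports 0-100
        if 40 <= rating <= 50:
            return temp                        # moderately painful: done
        temp += step_c if rating < 40 else -step_c
        temp = max(min_c, min(max_c, temp))    # stay within safe bounds
    return temp
```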
Later, the same pain stimulus was applied to the placebo-treated area as well as a separate untreated area for comparison. Up to 61% of participants still reported less pain in the area where the placebo cream was originally applied, typical of a true placebo response.
They found that two key brainstem regions are involved in this process: the periaqueductal gray (PAG) and the rostral ventromedial medulla (RVM). These areas showed distinct patterns of activity depending on where pain relief was directed, with the upper parts of the PAG and RVM more active for facial pain and the lower parts more active for arm or leg pain.
The brain has a built-in system to control pain in specific areas. It's not just turning pain off everywhere but working as a highly coordinated, anatomically precise system.
Understanding which brainstem areas are linked to different parts of the body may open new avenues for developing non-invasive therapies that reduce pain without widespread side effects.
The study also challenges long-held assumptions about how placebo pain relief works. Instead of relying on the brain's opioid system, the researchers say, a different part of the brainstem—the lateral PAG—is responsible, and it works without opioids, pointing instead to cannabinoid activity.
Opioid-based pain relief typically activates central areas of the brain and can affect the whole body, whereas the cannabinoid circuit that they identified now appears to operate in more targeted regions of the brainstem.
Knowing exactly where pain relief is happening in the brain means we can target that area or assess whether a drug is working in the right place.
Lewis S. Crawford et al, Somatotopic organization of brainstem analgesic circuitry, Science (2025). DOI: 10.1126/science.adu8846
The AI breakthrough that uses almost no power to create images
From creating art and writing code to drafting emails and designing new drugs, generative AI tools are becoming increasingly indispensable for both business and personal use. As demand increases, they will require even more computing power, memory and, therefore, energy. That's got scientists looking for ways to reduce their energy consumption.
In a paper published in the journal Nature, researchers describe the development of an AI image generator that consumes almost no power.
AI image generators use a process called diffusion to generate images from text. During training, the model takes images from a large dataset and repeatedly adds statistical noise, a kind of digital static, until each image has disappeared.
Then, when you give AI a prompt such as "create an image of a house," it starts with a screen full of static and then reverses the process, gradually removing the noise until the image appears. If you want to perform large-scale tasks, such as creating hundreds of millions of images, this process is slow and energy intensive.
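In rough pseudocode, the sampling loop looks like the sketch below. This is a generic illustration of diffusion sampling, not the authors' low-power method; the `predict_noise` model and the update rule are schematic placeholders.

```python
import numpy as np

# Schematic diffusion sampler (generic illustration, not the paper's method).
# A trained model estimates the noise in the current image; the sampler
# subtracts a little of that estimate at each step until an image emerges.

def sample(predict_noise, shape=(64, 64), steps=1000):
    x = np.random.randn(*shape)        # start from a screen full of static
    for t in reversed(range(steps)):
        eps = predict_noise(x, t)      # model's noise estimate at step t
        x = x - eps / steps            # schematic update; real samplers
                                       # follow a learned noise schedule
    return x                           # the denoised image
```

Running the denoiser once per step, often hundreds or thousands of times per image, is what makes large-scale generation so costly.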
AI tool targets RNA structures to unravel secrets of the dark genome
We mapped the human genome decades ago, but most of it is still a black box. Now, scientists have developed a tool to peer inside and what they find could reshape how we think about disease.
Your genome is the genetic map of you, and we understand almost none of it. Our handle on the bits of the genome that tell the body how to do things ("make eyes blue," "build heart tissue," "give this person sickle cell anemia") is OK, but there are vast areas of the genome that don't appear to do anything. Scientists long assumed this was just "junk" DNA, leftovers from the billions of years it took us to evolve from primordial soup into the complex life we know today. But it turns out we just didn't know what to look for, nor how to look for it. Now, new tools developed by researchers are helping us understand just how important the dark genome really is.
Understanding that, scientists hope, will offer new avenues for drug discovery, and transform how we think about disease and even life itself.
Your genome is broken up into protein-coding genes (around 2%), and the rest.
Proteins are the little machines that do the actual work of running an organism; protein-coding genes are the instructions for those proteins.
The other 98% doesn't build proteins, it doesn't follow the same rules, and it's much harder to understand. It contains "long non-coding RNAs," the stuff that was long dismissed as junk.
Scientists are trying to decode the logic circuitry of the human genome—the hidden rules that tell our DNA how to build and run a human being.
It's a tough job, but a crucial one because studies show that the roots of many diseases, including heart disease, cancer, and some psychiatric disorders, lie outside the well-understood, protein-coding regions of the genome.
Non-protein coding genes have key regulatory roles, turning certain genes on and off or altering their shape.
Do any of that at the wrong time, and things start to break down.
Scientists think that these RNAs act like software, orchestrating the protein 'hardware' into a functioning symphony. Not junk at all. You might have heard that you share 60% of your DNA with a banana.
It's not strictly speaking true, unfortunately, but there is a germ of truth there.
Humans and bananas (and everything else) all evolved from the same soup of bacteria and single-celled organisms about four billion years ago, and some of our DNA has been conserved over that time because it performs fundamental functions in our cells.
Studies have shown that about 10% of the human genome shows obvious conservation, but the rest is harder to pin down.
Scientists suspect that the remaining approximately 90% of the genome harbors many conserved RNA structures that are invisible to traditional approaches—hidden regulatory elements camouflaged in the genome. Which is where the newly developed AI—a tool called ECSFinder—comes in. It has been detailed in a paper in the journal Nucleic Acids Research. Scientists trained the program on known RNA structures to detect secrets hidden in the genome. The program outperformed other available tools and is now ready to be unleashed on the entire genome.
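Conceptually, a scan of this kind slides a window along aligned genome sequences and asks a trained model whether each window folds into a conserved RNA structure. The sketch below is a generic illustration of that idea, not ECSFinder's code; `score_structure` stands in for the trained classifier.

```python
# Generic sliding-window scan for conserved RNA structures (illustration
# only; not ECSFinder's implementation). score_structure stands in for a
# classifier trained on known RNA structures.

def scan_alignment(alignment, score_structure, window=120, step=40,
                   cutoff=0.9):
    """Yield (start, end, score) for windows whose aligned sequences
    look like an evolutionarily conserved RNA secondary structure."""
    length = len(alignment[0])          # aligned sequences share one length
    for start in range(0, length - window + 1, step):
        block = [seq[start:start + window] for seq in alignment]
        score = score_structure(block)  # conservation + folding signal
        if score >= cutoff:
            yield start, start + window, score
```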
They expect to uncover hundreds of thousands of new RNA structures, adding a new dimension to our understanding of the genome.
The holy grail of medicine right now is personalized therapy, treatments designed specifically for your individual illness.
One day, it's hoped, clinicians will be able to take a sample of your cancer's DNA, map its genome, then design ultra-specific drugs around its weaknesses, all within a few weeks.
And if the secrets of all gene-driven illnesses are hidden somewhere in the dark genome, learning how to read it is imperative.
Because RNA structures can be targeted by drugs, they present an exciting new frontier for therapies.
Vanda Gaonac'h-Lovejoy et al, ECSFinder: optimized prediction of evolutionarily conserved RNA secondary structures from genome sequences, Nucleic Acids Research (2025). DOI: 10.1093/nar/gkaf780
Scientists move toward developing vaccine against pathogenic Staphylococcus aureus
Antibiotics are the old medicine cabinet standby for treating infections caused by multidrug-resistant Staphylococcus aureus, but as antimicrobial resistance continues to mount globally, scientists say there's a need for new strategies.
While vaccines are a potential answer, achieving an effective way to immunize against multidrug-resistant S. aureus has led scientists down dozens of blind alleys. Ten candidate vaccines that looked promising in preclinical animal studies in recent years failed miserably in human clinical trials.
Now, scientists are investigating a way to sidestep the myriad problems that plagued vaccine investigators in the past by choosing not to target a whole antigen. Instead, they say, it's time to home in on a critical "surface loop" as a vaccine target. The tiny loop is located on the S. aureus antigen known as MntC.
The loop is an ideal target, scientists say, because it has been shown to be essential for S. aureus survival by helping the bacterium mitigate oxidative stress. If vaccine-induced antibodies zero in on that site, the bacterium cannot survive. As a vaccine target, the surface loop is known as an epitope.
Writing in Science Translational Medicine, researchers at institutions across China report the creation of a first-of-its-kind vaccine, tested in animal models, based on a target pinpointed in samples from human clinical trials. The strategy overcomes past obstacles and confers vaccine-based protection against drug-resistant S. aureus.
Xiaokai Zhang et al, An epitope vaccine derived by analyzing clinical trial samples safeguards hosts with prior exposure to S. aureus against reinfection, Science Translational Medicine (2025). DOI: 10.1126/scitranslmed.adr7464
Cancer can smuggle mitochondria into neighboring cells and put them to work
Cancer cells provide healthy neighboring cells with additional mitochondria to put them to work. This has been demonstrated by researchers in a new study. In this way, cancer is exploiting a mechanism that frequently serves to repair damaged cells.
Tumors have developed many strategies and tricks to gain advantages in the body. Researchers have now discovered another surprising trick that certain tumors resort to in ensuring their survival and growth.
In a study published in the journal Nature Cancer, biologists show that skin cancer cells are able to transfer their mitochondria to healthy connective tissue cells (fibroblasts) in their immediate vicinity. Mitochondria are the cell compartments that provide energy in the form of the molecule ATP.
The cancer cells use tiny tubes made of cell membrane material to transfer the mitochondria and connect the two cells—much like in a pneumatic tube system.
The mitochondrial transfer reprograms the fibroblasts functionally into tumor-associated fibroblasts, which mainly support cancer cells: tumor-associated fibroblasts usually multiply faster than normal fibroblasts and produce more ATP, while also secreting higher amounts of growth factors and cytokines. And all this benefits the tumor cells: they also multiply faster, making the tumor more aggressive.
The hijacked fibroblasts also alter the cell environment—the so-called extracellular matrix—by increasing the production of certain matrix components in such a way that cancer cells thrive. The extracellular matrix is vital for the mechanical stability of tissues and influences growth, wound healing and intercellular communication.
Finally, the researchers also clarified the molecular mechanism behind the mitochondrial transfer. Some proteins were already known to assist in transporting mitochondria, so the team investigated which of these were especially abundant in cancer cells that transfer mitochondria and came across the protein MIRO2, which is produced in very high quantities in those cells.
The researchers detected MIRO2 not only in cell cultures, but also in samples of human tissue—especially in tumor cells at the edges of tumors that grow invasively into the tissue and occur in close proximity to fibroblasts.
The new findings offer starting points for arresting tumor growth. When the researchers blocked the formation of MIRO2, the mitochondrial transfer was inhibited, and the fibroblasts did not develop into tumor-promoting fibroblasts.
The MIRO2 blockade worked in the test tube and in mouse models. Whether it also works in human tissue remains to be seen, say the researchers.
Michael Cangkrama et al, MIRO2-mediated mitochondrial transfer from cancer cells induces cancer-associated fibroblast differentiation, Nature Cancer (2025). DOI: 10.1038/s43018-025-01038-6
Metformin changes blood metal levels in humans, offering further insight into its mechanism of action
The widely used diabetes drug metformin changes blood metal levels in humans. A new study is an important step toward understanding the drug's many actions and designing better drugs in the future.
Metformin is the most widely prescribed diabetes drug in the world. Apart from lowering blood sugar levels, it is also known to have a broad range of beneficial side effects, acting against tumors, inflammation and atherosclerosis. However, although it has been used for more than 60 years, its mechanism of action is still not clear, hampering the development of even better drugs against these conditions.
It is known that diabetes patients experience changes in the blood levels of metals such as copper, iron and zinc.
In addition, chemical studies found that metformin has the ability to bind certain metals, such as copper, and recent studies showed that it is this binding ability that might be responsible for some of the drug's beneficial effects.
In the journal BMJ Open Diabetes Research & Care, researchers have published the first clinical evidence of altered blood metal levels in patients taking metformin. They showed that patients on the drug have significantly lower copper and iron levels and elevated zinc levels.
Furthermore, since decreases in copper and iron concentrations and an increase in zinc concentration are all considered to be associated with improved glucose tolerance and prevention of complications, these changes may indeed be related to metformin's action.
Association of metformin treatment with changes in metal dynamics in individuals with type 2 diabetes, BMJ Open Diabetes Research & Care (2025). DOI: 10.1136/bmjdrc-2025-005255
AI helps stethoscope detect three heart conditions in 15 seconds
An AI-enabled stethoscope can help doctors pick up three heart conditions in just 15 seconds, according to the results of a real-world trial presented at the European Society of Cardiology's annual congress.
The stethoscope, invented in 1816, is a vital part of a doctor's toolkit, used to listen to sounds within the body. But an AI stethoscope can do much more, including analyzing tiny differences in heartbeat and blood flow which are undetectable to the human ear, and taking a rapid ECG at the same time.
Researchers now have evidence that an AI stethoscope can increase detection of heart failure at an early stage, when someone first goes to their GP with symptoms.
A study involving more than 200 GP surgeries, with more than 1.5 million patients, looked at people with symptoms such as breathlessness or fatigue.
Those examined using an AI stethoscope were twice as likely to be diagnosed with heart failure, compared to similar patients who were not examined using the technology.
Patients examined with the AI stethoscope were about 3.5 times more likely to be diagnosed with atrial fibrillation—an abnormal heart rhythm which can increase the risk of having a stroke. They were almost twice as likely to receive a diagnosis of heart valve disease, which is where one or more heart valves do not work properly.
Early diagnosis is vital for all three conditions, allowing patients who may need potentially lifesaving medicines to be identified sooner, before they become dangerously unwell.
Mihir A Kelshiker et al, Triple cardiovascular disease detection with an artificial intelligence-enabled stethoscope (TRICORDER): design and rationale for a decentralised, real-world cluster-randomised controlled trial and implementation study, BMJ Open (2025). DOI: 10.1136/bmjopen-2024-098030
A firewall for science: AI tool identifies 1,000 'questionable' journals
A team of computer scientists has developed a new artificial intelligence platform that automatically seeks out "questionable" scientific journals.
The study, published Aug. 27 in the journal Science Advances, tackles an alarming trend in the world of research.
The trend often begins with spam messages from people who purport to be editors at scientific journals, usually ones the researchers have never heard of, offering to publish their papers for a hefty fee.
Such publications are sometimes referred to as "predatory" journals. They target scientists, convincing them to pay hundreds or even thousands of dollars to publish their research without proper vetting. There has been a growing effort among scientists and organizations to vet these journals. But it's like whack-a-mole. You catch one, and then another appears, usually from the same company. They just create a new website and come up with a new name.
The new AI tool automatically screens scientific journals, evaluating their websites and other online data against several criteria: Do the journals have an editorial board featuring established researchers? Do their websites contain a lot of grammatical errors?
In an era when prominent figures are questioning the legitimacy of science, stopping the spread of questionable publications has become more important than ever before.
In science, you don't start from scratch. You build on top of the research of others. So if the foundation of that tower crumbles, then the entire thing collapses. When scientists submit a new study to a reputable publication, that study usually undergoes a practice called peer review. Outside experts read the study and evaluate it for quality—or, at least, that's the goal.
A growing number of companies have sought to circumvent that process to turn a profit. In 2009, Jeffrey Beall, a librarian at CU Denver, coined the phrase "predatory" journals to describe these publications.
Often, they target researchers outside of the United States and Europe, such as in China, India and Iran—countries where scientific institutions may be young, and the pressure and incentives for researchers to publish are high.
These publishers of predatory journals will say, 'If you pay $500 or $1,000, we will review your paper.' In reality, they don't provide any service; they just take the PDF and post it on their website. One longstanding effort to vet journals comes from a nonprofit organization called the Directory of Open Access Journals (DOAJ). Since 2003, volunteers at the DOAJ have flagged thousands of journals as suspicious based on six criteria. (Reputable publications, for example, tend to include a detailed description of their peer review policies on their websites.)
But keeping pace with the spread of those publications has been daunting for humans.
To speed up the process, the researchers turned to AI. They trained their system using the DOAJ's data, then asked the AI to sift through a list of nearly 15,200 open-access journals on the internet.
Among those journals, the AI initially flagged more than 1,400 as potentially problematic. When scientists checked the results, 350 were found to be legitimate, but more than 1,000 appeared to be predatory. The team discovered that questionable journals published an unusually high number of articles. Their authors also had a larger number of affiliations than authors in more legitimate journals, and cited their own research at unusually high rates.
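As a rough illustration, warning signs like these can be folded into a single suspicion score. The features and weights below are hypothetical, chosen only to mirror the criteria described in this article; the published tool uses a trained model, not hand-set rules.

```python
# Hypothetical journal-screening score (illustrative only; the actual tool
# uses a trained model). Features mirror the signals described above.

def questionable_score(journal):
    score = 0.0
    if not journal.get("editorial_board_verified"):    # no established editors
        score += 0.4
    score += 0.3 * journal.get("grammar_error_rate", 0.0)
    if journal.get("articles_per_year", 0) > 2000:     # unusually high output
        score += 0.2
    score += 0.1 * journal.get("self_citation_rate", 0.0)
    return min(score, 1.0)                             # 1.0 = most suspicious

example = {"editorial_board_verified": False, "grammar_error_rate": 0.2,
           "articles_per_year": 3500, "self_citation_rate": 0.4}
print(questionable_score(example))                     # 0.70 -> flag for review
```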
The new AI system isn't publicly accessible, but the researchers hope to make it available to universities and publishing companies soon.
Han Zhuang et al, Estimating the predictability of questionable open-access journals, Science Advances (2025). DOI: 10.1126/sciadv.adt2792
Genetic tools identify lost human relatives, the Denisovans, in the fossil record
New genetic techniques are shedding light on a mysterious part of our family tree—ancient human relatives called the Denisovans that emerged during the Pleistocene epoch, approximately 370,000 years ago.
Little is known about them because so few fossils have been found. The group was discovered by accident in 2010, based solely on DNA analysis of what was thought to be a Neanderthal finger bone found in the Denisova Cave in Siberia. However, it turned out to belong to a previously unknown lineage closely related to Neanderthals.
Since then, only a few additional fragments have been found, so researchers developed a new genetic tool to go through the fossil record and identify potential Denisovans. Their starting point was the Denisovan genome.
The research is published in the journal Proceedings of the National Academy of Sciences.
The research team's innovative technique detects subtle changes in how Denisovan genes are regulated. These changes were likely responsible for altering the Denisovan skeleton, providing clues to what they looked like. Using this technique, the scientists identified 32 physical traits, most closely related to skull morphology, and then used this predicted profile to scan the Middle Pleistocene fossil record, focusing specifically on skulls from that period.

Before applying this method to unknown fossils, the researchers tested its accuracy: using the same technique, they successfully predicted known physical traits of Neanderthals and chimpanzees with more than 85% accuracy. Next, they identified 18 skull features they could measure, such as head width and forehead height, and then measured these traits on ten ancient skulls. They compared these to skulls from reference groups, including Neanderthals, modern humans and Homo erectus, and used sophisticated statistical tests to give each ancient skull a score for how well it matched the Denisovan blueprint.

Two skulls from China, known as the Harbin and Dali specimens, were a close match to the Denisovan profile: the Harbin skull matched 16 traits, the Dali skull 15. The study also found that a third skull, Kabwe 1, had a strong link to the Denisovan-Neanderthal tree, which may indicate that it was the root of the Denisovan lineage that split from Neanderthals.
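A stripped-down version of that matching step is sketched below: compare each skull's measured traits with the predicted Denisovan profile and count agreements. The trait names and codings are hypothetical, and the study itself used formal statistical tests rather than simple counting.

```python
# Toy trait-matching score (hypothetical traits; the study used formal
# statistical tests). Traits are coded by direction relative to a
# reference, e.g. wider skull -> +1, narrower -> -1.

predicted_denisovan = {"skull_width": +1, "forehead_height": -1,
                       "dental_arch_breadth": +1}      # illustrative values

def match_score(measured, predicted):
    """Count traits on which a fossil matches the predicted profile."""
    hits = sum(1 for trait, direction in predicted.items()
               if measured.get(trait) == direction)
    return hits, len(predicted)

harbin_like = {"skull_width": +1, "forehead_height": -1,
               "dental_arch_breadth": -1}
print(match_score(harbin_like, predicted_denisovan))   # (2, 3) in this toy
```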
In their study, the team states that their work may help with the identification of other extinct human relatives.
Nadav Mishol et al, Candidate Denisovan fossils identified through gene regulatory phenotyping, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2513968122
Rare seasonal brain shrinkage in shrews is driven by water loss, not cell death, MRI study reveals
Common shrews are one of only a handful of mammals known to flexibly shrink and regrow their brains. This rare seasonal cycle, known as Dehnel's phenomenon, has puzzled scientists for decades. How can a brain lose volume and regrow months later without sustaining permanent damage?
A study using noninvasive MRI has scanned the brains of shrews undergoing shrinkage, identifying a key molecule involved in the phenomenon: water. The work is published in the journal Current Biology.
In the experiments conducted by the researchers, shrews lost 9% of their brain volume during shrinkage, but the cells did not die. The cells just lost water.
Normally, brain cells that lose water become damaged and ultimately die, but in shrews, the opposite happened. The cells remained alive and even increased in number.
This finding solves a mystery—and opens up potential pathways for the treatment of human brain disease. Brain shrinkage in shrews closely matches what happens in patients suffering from Alzheimer's, Parkinson's and other brain diseases.
The study also shows that a specific protein known for regulating water—aquaporin 4—was likely involved in moving water out of the brain cells of shrews. This same protein is present in higher quantities in the diseased brains of humans, too.
That the shrunken brains of shrews share characteristics with diseased human brains makes the case that these miniature mammals, with their ability to reverse brain loss, could also offer clues for medical treatments.
Super corals could help buy time for reefs in a warming world
Coral reefs are in crisis. Rising ocean temperatures driven by climate change are pushing these ecosystems to the brink, with mass bleaching events becoming more frequent and severe.
But what if nature already holds part of the solution?
New research led by UTS demonstrates that corals that naturally thrive in extreme environments, often referred to as "super corals," could be used in restoration efforts to protect vulnerable reef systems.
The research was published in the journal Science Advances and offers compelling evidence that these resilient corals can retain their heat tolerance even after being moved to more stable reef habitats.
Christine D. Roper et al, Coral thermotolerance retained following year-long exposure to a novel environment, Science Advances (2025). DOI: 10.1126/sciadv.adu3858
Longevity gains are slowing, and life expectancy of 100 is unlikely, study finds
A new study finds that life expectancy gains made by high-income countries in the first half of the 20th century have slowed significantly, and that none of the generations born after 1939 will reach 100 years of age on average.
Published in the journal Proceedings of the National Academy of Sciences, the study analyzed life expectancy for 23 high-income and low-mortality countries using data from the Human Mortality Database and six different mortality forecasting methods.
The unprecedented increase in life expectancy achieved in the first half of the 20th century appears to be a phenomenon we are unlikely to see again in the foreseeable future. In the absence of any major breakthroughs that significantly extend human life, life expectancy would still not match the rapid increases of the early 20th century, even if adult survival improved twice as fast as predicted.
From 1900 to 1938, life expectancy rose by about five and a half months with each new generation. The life expectancy for an individual born in a high-income country in 1900 was an average of 62 years. For someone born just 38 years later in similar conditions, life expectancy had jumped to 80 years on average.
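Those two figures are consistent with each other, as a quick check shows:

```python
# Sanity check on the article's own numbers: about 5.5 months of added
# life expectancy per birth cohort, compounded from 1900 to 1938.

months_per_cohort = 5.5
cohorts = 1938 - 1900                   # 38 birth years
gain_years = months_per_cohort * cohorts / 12
print(round(62 + gain_years, 1))        # ~79.4, close to the reported 80
```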
For those born between 1939 and 2000, the increase slowed to roughly two and a half to three and a half months per generation, depending on the forecasting method. Mortality forecasting methods are statistical techniques that make informed predictions about future lifespans based on past and current mortality information. These models enabled the research team to estimate how life expectancy will develop under a variety of plausible future scenarios.
Researchers forecast that those born in 1980 will not live to be 100 on average, and none of the cohorts in their study will reach this milestone. This decline is largely due to the fact that past surges in longevity were driven by remarkable improvements in survival at very young ages.
At the beginning of the 20th century, infant mortality fell rapidly due to medical advances and other improvements in quality of life for high-income countries. This contributed significantly to the rapid increase in life expectancy. However, infant and child mortality is now so low that the forecasted improvements in mortality in older age groups will not be enough to sustain the previous pace of longevity gains.
While mortality forecasts can never be certain as the future may unfold in unexpected ways—by way of pandemics, new medical treatments or other unforeseen societal changes—this study provides critical insight for governments looking to anticipate the needs of their health care systems, pension planning and social policies.
Although a population-level analysis, this research also has implications for individuals, as life expectancy influences personal decisions about saving, retirement and long-term planning. If life expectancy increases more slowly, as this study shows is likely, both governments and individuals may need to recalibrate their expectations for the future.
José Andrade et al, Cohort mortality forecasts indicate signs of deceleration in life expectancy gains, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2519179122
Genetic triggers drive sex differences in early bovine embryo development, study finds
Researchers have uncovered the genetic triggers that cause male and female bovine embryos to develop differently, as early as seven to eight days after fertilization. The breakthrough in basic science has implications for human health—such as drug development and in vitro fertilization—and for bovine health and dairy industry sustainability.
Scientists have known since the 1990s that male embryos of multiple mammalian species, including humans, grow faster than female embryos, but until now, the underlying reasons were unclear.
In a paper published in Cell & Bioscience, scientists grew bovine embryos in petri dishes, then determined their genetic sex and performed RNA sequencing, which shows how genes are being expressed.
They discovered significant sex differences in gene regulation: male embryos prioritized genes associated with energy metabolism, causing them to grow faster than their female counterparts. Female embryos emphasized genes associated with sex differentiation, gonad development and inflammatory pathways that are important for future development.
Understanding these fundamental sex differences at the genomic and molecular levels is critically important to improving in vitro fertilization (IVF) success in humans and cows, and in developing treatments that will work for both men and women.
Meihong Shi et al, Sex-biased transcriptome in in vitro produced bovine early embryos, Cell & Bioscience (2025). DOI: 10.1186/s13578-025-01459-x
Scientists find that ice generates electricity when bent
A new study reveals that ice is a flexoelectric material, meaning it can produce electricity when unevenly deformed. Published in Nature Physics, this discovery could have major technological implications while also shedding light on natural phenomena such as lightning.
Frozen water is one of the most abundant substances on Earth. It is found in glaciers, on mountain peaks and in polar ice caps. Although it is a well-known material, studying its properties continues to yield fascinating results.
An international study has shown for the first time that ordinary ice is a flexoelectric material.
In other words, it can generate electricity when subjected to mechanical deformation. This discovery could have significant implications for the development of future technological devices and help to explain natural phenomena such as the formation of lightning in thunderstorms.
The study represents a significant step forward in our understanding of the electromechanical properties of ice.
The scientists discovered that ice generates electric charge in response to mechanical stress at all temperatures. In addition, they identified a thin 'ferroelectric' layer at the surface at temperatures below −113°C (160 K).
This means that the ice surface can develop a natural electric polarization, which can be reversed when an external electric field is applied—similar to how the poles of a magnet can be flipped. The surface ferroelectricity is a cool discovery in its own right, as it means that ice may have not just one way to generate electricity, but two: ferroelectricity at very low temperatures, and flexoelectricity at higher temperatures all the way to 0 °C.
This property places ice on a par with electroceramic materials such as titanium dioxide, which are currently used in advanced technologies like sensors and capacitors.
One of the most surprising aspects of this discovery is its connection to nature. The results of the study suggest that the flexoelectricity of ice could play a role in the electrification of clouds during thunderstorms, and therefore in the origin of lightning.
It is known that lightning forms when an electric potential builds up in clouds due to collisions between ice particles, which become electrically charged. This potential is then released as a lightning strike. However, the mechanism by which ice particles become electrically charged has remained unclear, since ice is not piezoelectric—it cannot generate charge simply by being compressed during a collision.
However, the study shows that ice can become electrically charged when it is subjected to inhomogeneous deformations, i.e. when it bends or deforms irregularly.
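In textbook notation (a standard formulation, not the paper's own equations), the difference is that piezoelectric polarization tracks the strain itself, while flexoelectric polarization tracks the strain gradient:

```latex
% Piezoelectricity (forbidden by ice's symmetry): polarization ~ strain
P_i = d_{ijk}\,\varepsilon_{jk}

% Flexoelectricity (present in ice): polarization ~ strain gradient
P_i = \mu_{ijkl}\,\frac{\partial \varepsilon_{jk}}{\partial x_l}
```

Uniform compression has zero strain gradient, so it produces no charge in ice; bending, which strains the top and bottom surfaces differently, does.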
X. Wen et al, Flexoelectricity and surface ferroelectricity of water ice, Nature Physics (2025). DOI: 10.1038/s41567-025-02995-6
Neuroscientists show for first time that precise timing of nerve signals determines how brain processes information
It has long been known that the brain preferentially processes information that we focus our attention on—a classic example is the so-called cocktail party effect.
In an environment full of voices, music, and background noise, the brain manages to concentrate on a single voice. The other noises are not objectively quieter, but are perceived less strongly at that moment.
The brain focuses its processing on the information that is currently relevant—in this case, the voice of the conversation partner—while other signals are received but not forwarded and processed to the same extent.
Neuroscientists have now provided the first causal evidence of how the brain transmits and processes relevant information. The work is published in the journal Nature Communications.
Whether a signal is processed further in the brain depends crucially on whether it arrives at the right moment—during a short phase of increased receptivity of the nerve cells.
Nerve cells do not work continuously, but in rapid cycles. They are particularly active and receptive for a few milliseconds, followed by a window of lower activity and excitability. This cycle repeats itself approximately every 10 to 20 milliseconds. Only when a signal arrived shortly before the peak of this active phase did it change the behavior of the neurons.
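A toy model makes the timing effect concrete: treat the neurons' receptivity as a cycle that peaks once every 20 milliseconds, and gate inputs by when they arrive. This is an illustrative simplification, not the study's model.

```python
import math

# Toy phase-gating model (illustrative; not the study's model). Receptivity
# cycles once every ~20 ms; inputs arriving near the peak get through,
# inputs arriving in the trough are attenuated.

def excitability(t_ms, period_ms=20.0):
    """Cyclic receptivity between 0 and 1, peaking at t = 0, 20, 40, ..."""
    return 0.5 * (1 + math.cos(2 * math.pi * t_ms / period_ms))

def transmitted(arrival_ms, threshold=0.8):
    return excitability(arrival_ms) >= threshold

print(transmitted(1.0))    # True: arrives near the excitability peak
print(transmitted(10.0))   # False: arrives in the low-excitability trough
```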
This temporal coordination is the fundamental mechanism of information processing. Attention makes targeted use of this phenomenon by aligning the timing of the nerve cells so that relevant signals arrive precisely in this time window, while others are excluded.
Eric Drebitz et al, Gamma-band synchronization between neurons in the visual cortex is causal for effective information processing and behavior, Nature Communications (2025). DOI: 10.1038/s41467-025-62732-8
Criminal behavior can be an early sign of dementia, meta-analysis finds
A suspected perpetrator who can barely remember his name; a woman in her mid-fifties with repeated traffic violations who behaves unreasonably and doesn't understand her own actions. Should such cases be brought before a court? And how does the state deal with people who commit acts of violence without meaning to?
Such questions arise from everyday clinical practice with people suffering from dementia. Neurodegenerative diseases can affect several functions of the brain, ranging from memory in Alzheimer's disease to behavior, as in behavioral variant frontotemporal dementia, and to sensorimotor function in Parkinson's disease.
One of the most striking consequences of these alterations is that affected persons may develop criminal risk behavior, such as harassment, traffic violations, theft or even acts that harm other people or animals, sometimes as the very first sign of disease.
When persons violate social or legal norms because of changes in behavior, personality and cognition, these incidents can have a substantial impact on the person's family and social surroundings and may lead to prosecution.
Researchers investigated this problem in a broad meta-analysis that included 14 studies with 236,360 persons from different countries (the U.S., Sweden, Finland, Germany and Japan). The work is published in the journal Translational Psychiatry.
Their systematic literature review revealed that criminal risk behavior is more frequent early in the disease course than in the general population, but declines below population levels thereafter. Criminal behavior committed for the first time in mid-life could therefore be an indicator of incident dementia, calling for the earliest possible diagnosis and therapy.
The team shows that the prevalence of criminal risk behaviors was highest in behavioral variant frontotemporal dementia (>50%), followed by semantic variant primary progressive aphasia (40%), but rather low in vascular dementia and Huntington's disease (15%), Alzheimer's disease (10%), and lowest in Parkinsonian syndromes (<10%).
Criminal risk behavior in frontotemporal dementia is most likely caused by the neurodegenerative disease itself. Most of the patients showed criminal risk behavior for the first time in their life and had no previous records of criminal activity.
The prevalence appears to be higher in frontotemporal dementia and Alzheimer's disease early in the disease course, before diagnosis and presumably in pre-stages such as mild behavioral or cognitive impairment, but it declines thereafter, ultimately falling below general-population levels after diagnosis.
The researchers also found that criminal risk behavior is more frequent in men than in women with dementia. After diagnosis, men showed four times more criminal risk behavior than women in frontotemporal dementia and seven times more in Alzheimer's disease.
In a second study, the working group also identified the changes in the brain that are associated with criminal behavior in frontotemporal dementia. This study was published in Human Brain Mapping.
Persons exhibiting criminal behavior showed greater atrophy in the temporal lobe, indicating that criminal behavior might be caused by so-called disinhibition, i.e., the loss of normal restraints or inhibitions, leading to a reduced ability to regulate one's behavior, impulses and emotions.
Disinhibition can manifest as acting impulsively, without thinking through the consequences, and behaving in ways that are inappropriate for the situation.
Further stigmatization of persons with dementia must be prevented. Of note, most offenses committed were minor, such as indecent behavior, traffic violations, theft and damage to property, although physical violence or aggression also occurred.
Hence, sensitivity to this topic, treating such offenses as possible early signs of dementia, and the earliest possible diagnosis and treatment are of the utmost importance. Besides early diagnosis and treatment, the researchers say, adaptations of the legal system should be discussed, such as increasing awareness of offenses caused by these diseases and taking the diseases into account in sentencing and in prisons.
Matthias L. Schroeter et al, Criminal minds in dementia: A systematic review and quantitative meta-analysis, Translational Psychiatry (2025). DOI: 10.1038/s41398-025-03523-z
Karsten Mueller et al, Criminal Behavior in Frontotemporal Dementia: A Multimodal MRI Study, Human Brain Mapping (2025). DOI: 10.1002/hbm.70308
Depression linked to presence of immune cells in the brain's protective layer
Immune cells released from bone marrow in the skull in response to chronic stress and adversity could play a key role in symptoms of depression and anxiety, say researchers.
The discovery—found in a study in mice—sheds light on the role that inflammation can play in mood disorders and could help in the search for new treatments, in particular for those individuals for whom current treatments are ineffective.
Around 1 billion people will be diagnosed with a mood disorder such as depression or anxiety at some point in their life. While there may be many underlying causes, chronic inflammation—when the body's immune system stays active for a long time, even when there is no infection or injury to fight—has been linked to depression. This suggests that the immune system may play an important role in the development of mood disorders.
Previous studies have highlighted how high levels of an immune cell known as a neutrophil, a type of white blood cell, are linked to the severity of depression. But how neutrophils contribute to symptoms of depression is currently unclear.
In research published in Nature Communications, a team of scientists tested the hypothesis that chronic stress can lead to the release of neutrophils from bone marrow in the skull. These cells then collect in the meninges—membranes that cover and protect your brain and spinal cord—and contribute to symptoms of depression.
As it is not possible to test this hypothesis in humans, the team used mice exposed to chronic social stress. In this experiment, an 'intruder' mouse is introduced into the home cage of an aggressive resident mouse. The two have brief daily physical interactions and can otherwise see, smell, and hear each other.
The researchers found that prolonged exposure to this stressful environment led to a noticeable increase in levels of neutrophils in the meninges, and that this was linked to signs of depressive behavior in the mice. Even after the stress ended, the neutrophils lasted longer in the meninges than they did in the blood.
Analysis confirmed the researchers' hypothesis that the meningeal neutrophils—which appeared subtly different from those found in the blood—originated in the skull.
Further analysis suggested that long-term stress triggered a type of immune system 'alarm warning' known as type I interferon signaling in the neutrophils. Blocking this pathway—in effect, switching off the alarm—reduced the number of neutrophils in the meninges and improved behavior in the depressed mice.
The reason why there are high levels of neutrophils in the meninges is unclear. One explanation could be that they are recruited by microglia, a type of immune cell unique to the brain.
Another possible explanation is that chronic stress may cause microhemorrhages, tiny leaks in brain blood vessels, and that neutrophils—the body's 'first responders'—arrive to fix the damage and prevent any further damage. These neutrophils then become more rigid, possibly getting stuck in brain capillaries and causing further inflammation in the brain.
These new findings show that these 'first responder' immune cells leave the skull bone marrow and travel to the brain, where they can influence mood and behavior.
Stacey L. Kigar et al, Chronic social defeat stress induces meningeal neutrophilia via type I interferon signaling in male mice, Nature Communications (2025). DOI: 10.1038/s41467-025-62840-5
Hyperactive blood platelets linked to heart attacks despite standard drug therapy
Scientists have discovered a particularly active subgroup of blood platelets that may cause heart attacks in people with coronary heart disease despite drug therapy. This discovery may open up new prospects for customized therapies. The research results are published in the European Heart Journal and were presented on August 31 at Europe's largest cardiology congress.
Despite modern medication, many people with coronary heart disease continue to suffer heart attacks. A research team has now found a possible explanation for this and has also provided a very promising therapeutic approach.
Their work focused on so-called "reticulated platelets": particularly young, RNA-rich and reactive thrombocytes that play a central role in the formation of blood clots in patients with coronary heart disease.
The researchers were able to comprehensively characterize the biological mechanisms of these cells for the first time. This helps explain why the platelets remain overactive in many patients even under optimal therapy: these young platelets carry a particularly large number of activating signaling pathways, which make them more sensitive and reactive than mature platelets.
The results come from a multidimensional analysis, using various methods, of the blood of more than 90 patients with coronary heart disease. Among other things, the researchers found signaling pathways that can be used to specifically inhibit the activity of such blood cells, in particular two target structures called GPVI and PI3K. Initial laboratory experiments confirmed that inhibiting these signaling pathways can reduce platelet hyperactivity.
Kilian Kirmes et al, Reticulated platelets in coronary artery disease: a multidimensional approach unveils prothrombotic signalling and novel therapeutic targets, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf694
8,000 years of human activities have caused wild animals to shrink and domestic animals to grow
Humans have caused wild animals to shrink and domestic animals to grow, according to a new study.
Researchers studied tens of thousands of animal bones from Mediterranean France covering the last 8,000 years to see how the size of both types of animals has changed over time.
Scientists already know that human choices, such as selective breeding, influence the size of domestic animals, and that environmental factors also impact the size of both. However, little is known about how these two forces have influenced the size of wild and domestic animals over such a prolonged period. This latest research, published in the Proceedings of the National Academy of Sciences, fills a major gap in our knowledge.
The scientists analyzed more than 225,000 bones from 311 archaeological sites in Mediterranean France. They took thousands of measurements of things like the length, width, and depth of bones and teeth from wild animals, such as foxes, rabbits and deer, as well as domestic ones, including goats, cattle, pigs, sheep and chickens.
But the researchers didn't just focus on the bones. They also collected data on the climate, the types of plants growing in the area, the number of people living there and what they used the land for. And then, with some sophisticated statistical modeling, they were able to track key trends and drivers behind the change in animal size.
The research team's findings reveal that for around 7,000 years, wild and domestic animals evolved along similar paths, growing and shrinking together in sync with their shared environment and human activity. However, all that changed around 1,000 years ago. Their body sizes began to diverge dramatically, especially during the Middle Ages.
Domestic animals started to get much bigger as they were being actively bred for more meat and milk. At the same time, wild animals began to shrink in size as a direct result of human pressures, such as hunting and habitat loss. In other words, human activities replaced environmental factors as the main force shaping animal evolution.
Cyprien Mureau et al, 8,000 years of wild and domestic animal body size data reveal long-term synchrony and recent divergence due to intensified human impact, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2503428122
World's oldest host-associated bacterial DNA found
An international team led by researchers at the Center for Paleogenetics has uncovered microbial DNA preserved in woolly and steppe mammoth remains dating back more than one million years. The analyses reveal some of the world's oldest microbial DNA ever recovered and identify bacteria that may have caused disease in mammoths. The findings are published in Cell.
Researchers analyzed microbial DNA from 483 mammoth specimens, of which 440 were sequenced for the first time. Among them was a steppe mammoth that lived about 1.1 million years ago. Using advanced genomic and bioinformatic techniques, the team distinguished microbes that once lived alongside the mammoths from those that invaded their remains after death.
Genetic mechanism reveals how plants coordinate flowering with light and temperature conditions
Plants may be stuck in one place, but the world around them is constantly changing. In order to grow and flower at the right time, plants must constantly collect information about their surroundings, measuring things like temperature, brightness, and length of day. Still, it's unclear how all this information gets combined to trigger specific behaviors.
Scientists have discovered a genetic mechanism for how plants integrate light and temperature information to control their flowering.
In a study published in Nature Communications, the researchers found an interaction between two genetic pathways that signals the presence of both blue light and low temperature. This genetic module helps plants fine-tune their flowering to the optimal environmental conditions.
In one pathway, blue light activates the PHOT2 blue light receptor, with help from the partner protein NPH3. In another pathway, low ambient temperature allows a transcription factor called CAMTA2 to boost the expression of a gene called EHB1. Importantly, EHB1 is known to interact with NPH3, placing NPH3 at the convergence point of the blue light and low temperature signals. This genetic architecture effectively works as a coincidence detector, linking the presence of blue light and low temperature to guide the switch to flowering; a toy model of the logic is sketched below.
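In effect, the module behaves like an AND gate on its two inputs. Here is a minimal, purely illustrative Python sketch of that logic; the threshold temperature and the all-or-nothing treatment of each pathway are assumptions made for illustration, not values from the study.

```python
# Toy model of the light-temperature coincidence detector described above.
# The threshold and the boolean signals are illustrative assumptions.

def blue_light_pathway(blue_light: bool) -> bool:
    # Blue light activates the PHOT2 receptor, which signals via NPH3.
    return blue_light

def low_temp_pathway(temperature_c: float, threshold_c: float = 16.0) -> bool:
    # Low ambient temperature lets CAMTA2 boost expression of EHB1.
    return temperature_c <= threshold_c

def coincidence_detector(blue_light: bool, temperature_c: float) -> bool:
    # EHB1 interacts with NPH3, so the output fires only when BOTH
    # signals are present: an AND gate on light and temperature.
    return blue_light_pathway(blue_light) and low_temp_pathway(temperature_c)

for light, temp in [(True, 12.0), (True, 22.0), (False, 12.0)]:
    print(f"blue light={light}, temp={temp}C -> flower: {coincidence_detector(light, temp)}")
```

Only the first combination, blue light plus low temperature, triggers the flowering signal, mirroring the coincidence-detection behavior described in the study.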
The study describes an important component of plant growth, reproduction, and information processing. The newly discovered genetic module allows plants to have fine control over their flowering in low temperatures. Understanding this system will now help scientists optimize crop growth under changing environmental conditions.
Adam Seluzicki et al, Genetic architecture of a light-temperature coincidence detector, Nature Communications (2025). DOI: 10.1038/s41467-025-62194-y
Generative AI designs molecules that kill drug-resistant bacteria
What if generative AI could design life-saving antibiotics, not just art and text? In a new Cell Biomaterials paper, researchers introduce AMP-Diffusion, a generative AI tool used to create tens of thousands of new antimicrobial peptides (AMPs)—short strings of amino acids, the building blocks of proteins—with bacteria-killing potential. In animal models, the most potent AMPs performed as well as FDA-approved drugs, without detectable adverse effects.
While past work has shown that AI can successfully sort through mountains of data to identify promising antibiotic candidates, this study adds to a small but growing number of demonstrations that AI can invent antibiotic candidates from scratch.
Using AMP-Diffusion, the researchers generated the amino-acid sequences for about 50,000 candidates. They used AI to filter the results.
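The article does not detail those filters, but a common first pass in peptide screening is physicochemical, since known AMPs tend to be short, cationic, and moderately hydrophobic. The sketch below stands in for that kind of filter; the thresholds and example sequences are illustrative assumptions, not the study's actual models.

```python
# Illustrative physicochemical pre-filter for generated peptide candidates.
# Real pipelines use trained models; these simple heuristics are stand-ins.

KD = {  # Kyte-Doolittle hydrophobicity scale
    'A': 1.8, 'R': -4.5, 'N': -3.5, 'D': -3.5, 'C': 2.5, 'Q': -3.5,
    'E': -3.5, 'G': -0.4, 'H': -3.2, 'I': 4.5, 'L': 3.8, 'K': -3.9,
    'M': 1.9, 'F': 2.8, 'P': -1.6, 'S': -0.8, 'T': -0.7, 'W': -0.9,
    'Y': -1.3, 'V': 4.2,
}

def net_charge(seq: str) -> int:
    # Crude net charge at neutral pH: K/R count as +1, D/E as -1.
    return sum(seq.count(aa) for aa in 'KR') - sum(seq.count(aa) for aa in 'DE')

def mean_hydrophobicity(seq: str) -> float:
    return sum(KD[aa] for aa in seq) / len(seq)

def passes_filter(seq: str) -> bool:
    # Illustrative thresholds: short, cationic, moderately hydrophobic.
    return (len(seq) <= 40
            and net_charge(seq) >= 2
            and -1.0 <= mean_hydrophobicity(seq) <= 2.0)

candidates = ["KWKLFKKIGAVLKVL", "DDEEDDEE", "GIGKFLHSAKKFGKAFVGEIMNS"]
print([seq for seq in candidates if passes_filter(seq)])  # the acidic one fails
```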
After synthesizing the 46 most promising candidates, the researchers tested them in human cells and animal models. Treating skin infections in mice, two AMPs demonstrated efficacy on par with levofloxacin and polymyxin B, FDA-approved drugs used to treat antibiotic-resistant bacteria, without adverse effects.
In the future, the researchers hope to refine AMP-Diffusion, giving it the capability to denoise with a more specific goal in mind, like treating a particular type of bacterial infection, among other features.
How persister bacteria survive antibiotics: mRNA storage in aggresomes
Persister bacteria, which enter a dormant state to survive antibiotics that target active cells, are linked to over 20% of chronic infections and resist current treatments.
Understanding their survival mechanisms could lead to new ways to combat recurring infections. This study utilized E. coli bacteria as a model and found that prolonged stress leads to the increased formation of aggresomes (membraneless droplets) and the enrichment of mRNA (molecules that carry instructions for making proteins) within them, which enhances the ability of E. coli to survive and recover from stress.
Using multiple approaches, including imaging, modeling, and transcriptomics, the researchers showed that prolonged stress depletes ATP (the fuel for all living cells) in Escherichia coli, driving increased aggresome formation, aggresome compaction, and enrichment of mRNA within aggresomes relative to the cytosol (the liquid inside cells).
Transcripts stored in aggresomes were also longer, on average, than those in the cytosol. Mass spectrometry showed that ribonucleases (enzymes that break down RNA) were excluded from aggresomes by negative charge repulsion.
Experiments with fluorescent reporters and disruption of aggresome formation showed that mRNA storage within aggresomes promoted translation and was associated with reduced lag phases during growth after stress removal. These findings suggest that mRNA storage within aggresomes confers an advantage for bacterial survival and recovery from stress.
This breakthrough illuminates how persister cells survive and revive after antibiotic treatment.
By targeting aggresomes, new drugs could disrupt this protective mechanism, preventing bacteria from storing mRNA and making them more vulnerable to elimination, thus reducing the risk of infection relapse.
Linsen Pei et al, Aggresomes protect mRNA under stress in Escherichia coli, Nature Microbiology (2025). DOI: 10.1038/s41564-025-02086-5
Deforestation drives 74% of dry-season rainfall decline and 16% of warming in the Amazon, study finds
Deforestation in the Brazilian Amazon is responsible for approximately 74.5% of the reduction in rainfall and 16.5% of the temperature increase in the biome during the dry season. For the first time, researchers have quantified the impact of vegetation loss and global climate change on the forest.
The study provides fundamental results to guide effective mitigation and adaptation strategies.
How climate change and deforestation interact in the transformation of the Amazon rainforest, Nature Communications (2025). DOI: 10.1038/s41467-025-63156-0
Magnetic fields in infant universe may have been billions of times weaker than a fridge magnet
The magnetic fields that formed in the very early stages of the universe may have been billions of times weaker than a small fridge magnet, with strengths comparable to the magnetism generated by neurons in the human brain. Yet, despite such weakness, quantifiable traces of their existence remain in the cosmic web, the network of visible cosmic structures that spans the universe.
These conclusions emerge from a study using around a quarter of a million computer simulations.
The research, recently published in Physical Review Letters, narrows the plausible range of primordial magnetic field strengths and sets upper limits on them. It also offers the possibility of refining our knowledge of the early universe and the formation of the first stars and galaxies.
Removing yellow stains from fabric with blue light
Sweat and food stains can ruin your favorite clothes. But bleaching agents such as hydrogen peroxide or dry-cleaning solvents that remove stains aren't options for all fabrics, especially delicate ones. Now, researchers in ACS Sustainable Chemistry & Engineering report a simple way to remove yellow stains using a high-intensity blue LED light. They demonstrate the method's effectiveness at removing stains from orange juice, tomato juice and sweat-like substances on multiple fabrics, including silk.
The method uses visible blue light in combination with ambient oxygen, which acts as the oxidizing agent to drive the photobleaching process. This approach avoids the harsh chemical oxidants typically required in conventional bleaching, making it inherently more sustainable.
Yellow clothing stains are caused by squalene and oleic acid from skin oils and sweat, as well as natural pigments like beta carotene and lycopene, present in oranges, tomatoes and other foods. UV light is a potential stain-removing alternative to chemical oxidizers like bleach and hydrogen peroxide, but it can damage delicate fabrics.
The same researchers previously determined that a high-intensity blue LED light could remove yellow color from aged resin polymers, and they wanted to see whether blue light could also break down yellow stains on fabric without causing damage.
Initially, they exposed vials of beta carotene, lycopene and squalene to high-intensity blue LED light for three hours. All the samples lost color, and spectroscopic analyses indicated that oxygen in the air helped the photobleaching process by breaking bonds to produce colorless compounds.
Next, the team applied squalene onto cotton fabric swatches. After heating the swatches to simulate aging, they treated the samples for 10 minutes by soaking them in a hydrogen peroxide solution or exposing them to blue LED or UV light. The blue light reduced the yellow stain substantially more than hydrogen peroxide or UV exposure; in fact, UV exposure generated some new yellow-colored compounds. Additional tests showed that the blue LED treatment lightened squalene stains on silk and polyester without damaging the fabrics. The method also reduced the color of other stain-causing substances, including aged oleic acid, orange juice and tomato juice, on cotton swatches.
High-intensity blue LED light is a promising way to remove clothing stains, but the researchers say they want to do additional colorfastness and safety testing before commercializing a light system for home and industrial use.
Tomohiro Sugahara et al, Environmentally Friendly Photobleaching Method Using Visible Light for Removing Natural Stains from Clothing, ACS Sustainable Chemistry & Engineering (2025). DOI: 10.1021/acssuschemeng.5c03907
Scientists develop the world's first 6G chip, capable of 100 Gbps speeds
Sixth generation, or 6G, wireless technology is one step closer to reality with news that researchers have unveiled the world's first "all-frequency" 6G chip. The chip is capable of delivering mobile internet speeds exceeding 100 gigabits per second (Gbps).
6G technology is the successor to 5G and promises to bring about a massive leap in how we communicate. It will offer benefits such as ultra-high-speed connectivity, ultra-low latency and AI integration that can manage and optimize networks in real-time. To achieve this, 6G networks will need to operate across a range of frequencies, from standard microwaves to much higher frequency terahertz waves. Current 5G technology utilizes a limited set of radio frequencies, similar to those used in previous generations of wireless technologies.
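One way to see why access to wider spectrum matters: the Shannon limit, C = B log2(1 + SNR), says a channel's capacity grows in direct proportion to its bandwidth B. A back-of-envelope comparison, using bandwidth and signal-to-noise figures that are assumptions rather than the chip's actual link budget, shows how multi-gigahertz channels reach the 100 Gbps range.

```python
# Back-of-envelope Shannon capacity: why wide bandwidth enables 100+ Gbps.
# The bandwidths and SNRs below are assumptions for illustration only.
import math

def shannon_capacity_gbps(bandwidth_ghz: float, snr_db: float) -> float:
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_ghz * math.log2(1 + snr_linear)  # GHz x bits/s/Hz = Gbps

print(f"{shannon_capacity_gbps(0.1, 20):.1f} Gbps")   # 5G-like 100 MHz channel: ~0.7
print(f"{shannon_capacity_gbps(20.0, 15):.1f} Gbps")  # hypothetical 20 GHz channel: ~100
```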
The new chip is no bigger than a thumbnail, measuring 11 millimeters by 1.7 millimeters. It operates across a wide frequency range, from 0.5 GHz to 115 GHz, a spectrum that traditionally takes nine separate radio systems to cover. While the development of a single all-frequency chip is a significant breakthrough, the technology is still in its early stages of development. Many experts expect commercial 6G networks to begin rolling out around 2030.
Zihan Tao et al, Ultrabroadband on-chip photonics for full-spectrum wireless communications, Nature (2025). DOI: 10.1038/s41586-025-09451-8
Mother's Germs May Influence Her Child's Brain Development
Our bodies are colonized by a teeming, ever-changing mass of microbes that help power countless biological processes. Now, a new study has identified how these microorganisms get to work shaping the brain before birth.
Researchers at Georgia State University studied newborn mice bred in a germ-free environment to prevent any microbial colonization. Some of these mice were immediately placed with mothers with normal microbiota, allowing microbes to be transferred rapidly.
That gave the study authors a way to pinpoint just how early microbes begin influencing the developing brain. Their focus was on the paraventricular nucleus (PVN), a region of the hypothalamus tied to stress and social behavior, already known to be partly influenced by microbe activity in mice later in life.
At birth, a newborn body is colonized by microbes as it travels through the birth canal.
Birth also coincides with important developmental events that shape the brain.
When the germ-free mice were just a handful of days old, the researchers found fewer neurons in their PVN, even when microbes were introduced after birth. That suggests the changes caused by these microorganisms happen in the uterus during development.
These neural modifications last, too: the researchers also found that the PVN was neuron-light even in adult mice, if they'd been raised to be germ-free. However, the cross-fostering experiment was not continued into adulthood (around eight weeks).
The details of this relationship still need to be worked out and researched in greater detail, but the takeaway is that microbes – specifically the mix of microbes in the mother's gut – can play a notable role in the brain development of their offspring. Rather than shunning our microbes, we should recognize them as partners in early life development. They're helping build our brains from the very beginning, say the researchers. While this has only been shown in mouse models so far, there are enough biological similarities between mice and humans that there's a chance we're also shaped by our mother's microbes before we're born.
One of the reasons this matters is because practices like Cesarean sections and the use of antibiotics around birth are known to disrupt certain types of microbe activity – which may in turn be affecting the health of newborns.
Mutations driving evolution are informed by the genome, not random, study suggests
A study published in the Proceedings of the National Academy of Sciences shows that an evolutionarily significant mutation in the human APOL1 gene arises not randomly but more frequently where it is needed to prevent disease. The finding fundamentally challenges the notion that evolution is driven by random mutations and ties into a new theory offering, for the first time, a concept for how mutations arise.
The implications for biology, medicine, computer science, and perhaps even our understanding of the origin of life itself are potentially far-reaching.
A random mutation is a genetic change whose chance of arising is unrelated to its usefulness. Only once these supposed accidents arise does natural selection vet them, sorting the beneficial from the harmful. For over a century, scientists have thought that a series of such accidents has built up over time, one by one, to create the diversity and splendor of life around us.
However, it has never been possible to examine directly whether mutations in the DNA originate at random or not. Mutations are rare events relative to the genome's size, and technical limitations have prevented scientists from seeing the genome in enough detail to track individual mutations as they arise naturally.
To overcome this, researchers developed a new ultra-accurate detection method and recently applied it to the famous HbS mutation, which protects from malaria but causes sickle-cell anemia in homozygotes.
Results showed that the HbS mutation did not arise at random, but emerged more frequently exactly in the gene and population where it was needed. Now, they report the same nonrandom pattern in a second mutation of evolutionary significance.
The new study examines the de novo origination of a mutation in the human APOL1 gene that protects against a form of trypanosomiasis, a disease that devastated central Africa in historical times and until recently has caused tens of thousands of deaths there per year, while increasing the risk of chronic kidney disease in people with two copies.
If the APOL1 mutation arises by chance, it should arise at a similar rate in all populations, and only then spread under Trypanosoma pressure. However, if it is generated nonrandomly, it may actually arise more frequently where it is useful.
Results supported the nonrandom pattern: the mutation arose much more frequently in sub-Saharan Africans, who have faced generations of endemic disease, compared to Europeans, who have not, and in the precise genomic location where it confers protection.
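The statistical logic behind such a comparison can be sketched simply: de novo mutations are rare count events, so rates in two populations can be compared with an exact test. The counts and denominators below are hypothetical placeholders, not the paper's data; only the test structure is the point.

```python
# Sketch: compare de novo mutation rates between two populations.
# Under equal rates, k1 conditioned on (k1 + k2) is binomial with
# success probability n1 / (n1 + n2). Numbers here are placeholders.
from scipy.stats import binomtest

k1, n1 = 12, 4_000_000  # mutations observed / alleles screened, population A
k2, n2 = 2, 4_000_000   # mutations observed / alleles screened, population B

res = binomtest(k1, k1 + k2, n1 / (n1 + n2), alternative='greater')
print(f"rate A = {k1/n1:.2e}, rate B = {k2/n2:.2e}, p = {res.pvalue:.4f}")
```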
The new findings challenge the notion of random mutation fundamentally. Historically, there have been two basic theories for how evolution happens— random mutation and natural selection, and Lamarckism—the idea that an individual directly senses its environment and somehow changes its genes to fit it. Lamarckism has been unable to explain evolution in general, so biologists have concluded that mutations must be random.
This new theory moves away from both of these concepts, proposing instead that two inextricable forces underlie evolution. While the well-known external force of natural selection ensures fitness, a previously unrecognized internal force operates inside the organism, putting together genetic information that has accumulated over generations in useful ways.
To illustrate, take fusion mutations, a type of mutation in which two previously separate genes fuse to form a new gene. As with all mutations, fusions have been thought to arise by accident: once in a great while, a gene moves by error to another location, fuses by chance to another gene, and happens to yield a useful adaptation. But researchers have recently shown that genes do not fuse at random.
Instead, genes that have evolved to be used together repeatedly over generations are the ones that are more likely to get fused. Because the genome folds in 3D space, bringing genes that work together to the same place at the same time in the nucleus with their chromatin open, molecular mechanisms fuse these genes rather than others. An interaction involving complex regulatory information that has gradually evolved over generations leads to a mutation that simplifies and "hardwires" it into the genome.
In the paper, they argue that fusions are a specific example of a more general and extensive internal force that applies across mutation types. Rather than local accidents arising at random locations in the genome disconnected from other genetic information, mutational processes put together multiple meaningful pieces of heritable information in many ways.
Genes that evolved to interact tightly are more likely to be fused; single-letter RNA changes that evolved to occur repeatedly across generations via regulatory phenomena are more likely to be "hardwired" as point mutations into the DNA; genes that evolved to interact in incipient networks, each under its own regulation, are more likely to be invaded by the same transposable element that later becomes a master-switch of the network, streamlining regulation, and so on. Earlier mutations influence the origination of later ones, forming a vast network of influences over evolutionary time.
Previous studies examined mutation rates as averages across genomic positions, masking the probabilities of individual mutations. But these new studies suggest that, at the scale of individual mutations, each mutation has its own probability, and the causes and consequences of mutation are related. At each generation, mutations arise based on the information that has accumulated in the genome up to that point, and those that survive become part of that internal information. This vast array of interconnected mutational activity gradually homes in, over the generations, on mutations relevant to long-term pressures, leading to directed mutational responses to specific environmental pressures, such as the malaria-protective HbS and Trypanosoma-protective APOL1 mutations.
New genetic information arises in the first place, they argue, as a consequence of the fact that mutations simplify genetic regulation, hardwiring evolved biological interactions into ready-made units in the genome. This internal force of natural simplification, together with the external force of natural selection, acts over evolutionary time like combined forces of parsimony and fit, generating co-optable elements that themselves have an inherent tendency to come together into new, emergent interactions. Co-optable elements are generated by simplification under performance pressure and then engage in emergent interactions; the source of innovation is at the system level. Understood on the proper timescale, an individual mutation neither arises at random nor invents anything in and of itself.
The potential depth of evolution from this new perspective can be seen by examining other networks. For example, the gene fusion mechanism, where genes repeatedly used together across evolutionary time are more likely to be fused by mutation, echoes chunking, one of the most basic principles of cognition and learning in the brain, where pieces of information that repeatedly co-occur are eventually chunked into a single unit.
Yet fusions are only one instance of a broader principle: diverse mutational processes respond to accumulated information in the genome, combining it over generations into streamlined instructions. This view recasts mutations not as isolated accidents, but as meaningful events in a larger, long-term process.
Daniel Melamed et al, De novo rates of a Trypanosoma-resistant mutation in two human populations, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424538122
New tool enables rapid, large-scale profiling of disease-linked RNA modifications
Researchers have developed a powerful tool capable of scanning thousands of biological samples to detect transfer ribonucleic acid (tRNA) modifications—tiny chemical changes to RNA molecules that help control how cells grow, adapt to stress and respond to diseases such as cancer and antibiotic-resistant infections. The tool opens up new possibilities for science, health care and industry, from accelerating disease research and enabling more precise diagnostics to guiding the development of more effective treatments.
Cancer and infectious diseases are complicated health conditions in which cells are forced to function abnormally by mutations in their genetic material or by instructions from an invading microorganism. The research team, led by the Singapore-MIT Alliance for Research and Technology (SMART), is among the world's leaders in understanding how the epitranscriptome—the over 170 different chemical modifications of all forms of RNA—controls the growth of normal cells and how cells respond to stressful changes in the environment, such as loss of nutrients or exposure to toxic chemicals. The researchers are also studying how this system is corrupted in cancer or exploited by viruses, bacteria and parasites in infectious diseases. Current molecular methods used to study the expansive epitranscriptome, with its thousands of different types of modified RNA, are often slow, labor-intensive, costly and reliant on hazardous chemicals, all of which limits research capacity and speed.
To solve this problem, the SMART team developed a new tool that enables fast, automated profiling of tRNA modifications—molecular changes that regulate how cells survive, adapt to stress and respond to disease. This capability allows scientists to map cell regulatory networks, discover novel enzymes and link molecular patterns to disease mechanisms, paving the way for better drug discovery and development, and more accurate disease diagnostics.
SMART's automated system was specially designed to profile tRNA modifications across thousands of samples rapidly and safely. Unlike traditional methods—which are costly, labor‑intensive and use toxic solvents such as phenol and chloroform—this tool integrates robotics to automate sample preparation and analysis, eliminating the need for hazardous chemical handling and reducing costs. This advancement increases safety, throughput and affordability, enabling routine large‑scale use in research and clinical labs.
Jingjing Sun et al, tRNA modification profiling reveals epitranscriptome regulatory networks in Pseudomonas aeruginosa, Nucleic Acids Research (2025). DOI: 10.1093/nar/gkaf696
Fruit fly research shows that mechanical forces drive evolutionary change
A tissue fold known as the cephalic furrow, an evolutionary novelty that forms between the head and the trunk of fly embryos, plays a mechanical role in stabilizing embryonic tissues during the development of the fruit fly Drosophila melanogaster.
Researchers have integrated computer simulations with their experiments and showed that the timing and position of cephalic furrow formation are crucial for its function, preventing mechanical instabilities in the embryonic tissues.
The work appears in Nature.
The increased mechanical instability caused by embryonic tissue movements may have contributed to the origin and evolution of the cephalic furrow genetic program. This shows that mechanical forces can shape the evolution of new developmental features.
Mechanical forces shape tissues and organs during the development of an embryo through a process called morphogenesis. These forces cause tissues to push and pull on each other, providing essential information to cells and determining the shape of organs. Despite the importance of these forces, their role in the evolution of development is still not well understood.
Animal embryos undergo tissue flows and folding processes, involving mechanical forces, that transform a single-layered blastula (a hollow sphere of cells) into a complex multi-layered structure known as the gastrula. During early gastrulation, some flies of the order Diptera form a tissue fold at the head-trunk boundary called the cephalic furrow. This fold is a specific feature of a subgroup of Diptera and is therefore an evolutionary novelty of flies.
The cephalic furrow is especially interesting because it is a prominent embryonic invagination whose formation is controlled by genes, but that has no obvious function during development. The fold does not give rise to specific structures, and later in development, it simply unfolds, leaving no trace. With their experiments, the researchers show that the absence of the cephalic furrow leads to an increase in the mechanical instability of embryonic tissues and that the primary sources of mechanical stress are cell divisions and tissue movements typical of gastrulation. They demonstrate that the formation of the cephalic furrow absorbs these compressive stresses. Without a cephalic furrow, these stresses build up, and outward forces caused by cell divisions in the single-layered blastula cause mechanical instability and tissue buckling.
This intriguing physical role gave the researchers the idea that the cephalic furrow may have evolved in response to the mechanical challenges of dipteran gastrulation, with mechanical instability acting as a potential selective pressure. Flies either feature a cephalic furrow, or if they lack one, display widespread out-of-plane division, meaning the cells divide downward to reduce the surface area. Both mechanisms act as mechanical sinks to prevent tissue collision and distortion. These findings uncover empirical evidence for how mechanical forces can influence the evolution of innovations in early development. The cephalic furrow may have evolved through genetic changes in response to the mechanical challenges of dipteran gastrulation. They show that mechanical forces are not just important for the development of the embryo but also for the evolution of its development.
Some sugar substitutes linked to faster cognitive decline
Some sugar substitutes may come with unexpected consequences for long-term brain health, according to a study published in Neurology. The study examined seven low- and no-calorie sweeteners and found that people who consumed the highest amounts experienced faster declines in thinking and memory skills compared to those who consumed the lowest amounts.
The link was even stronger in people with diabetes. While the study showed a link between the use of some artificial sweeteners and cognitive decline, it did not prove that they were a cause.
The artificial sweeteners examined in the study were aspartame, saccharin, acesulfame-K, erythritol, xylitol, sorbitol and tagatose. These are mainly found in ultra-processed foods like flavored water, soda, energy drinks, yogurt and low-calorie desserts. Some are also used as a standalone sweetener.
Low- and no-calorie sweeteners are often seen as a healthy alternative to sugar; however, the new findings suggest certain sweeteners may have negative effects on brain health over time.
The study included 12,772 adults from across Brazil. The average age was 52, and participants were followed for an average of eight years.
After adjusting for factors such as age, sex, high blood pressure and cardiovascular disease, researchers found that people who consumed the highest amounts of sweeteners showed a 62% faster decline in overall thinking and memory skills than those who consumed the lowest amounts, the equivalent of about 1.6 years of aging. Those in the middle group declined 35% faster than the lowest group, equivalent to about 1.3 years of aging.
When researchers broke the results down by age, they found that people under the age of 60 who consumed the highest amounts of sweeteners showed faster declines in verbal fluency and overall cognition when compared to those who consumed the lowest amounts. They did not find links in people over 60. They also found that the link to faster cognitive decline was stronger in participants with diabetes than in those without diabetes.
When looking at individual sweeteners, consumption of aspartame, saccharin, acesulfame-K, erythritol, sorbitol and xylitol was associated with a faster decline in overall cognition, particularly in memory.
They found no link between the consumption of tagatose and cognitive decline.
SeeMe detects hidden signs of consciousness in brain injury patients
SeeMe, a computer vision tool tested by Stony Brook University researchers, was able to detect low-amplitude, voluntary facial movements in comatose acute brain injury patients days before clinicians could identify overt responses.
There are ways to detect covert consciousness with EEG and fMRI, though these are not always available. Many acute brain injury patients appear unresponsive in early care, with signs that are so small, or so infrequent, that they are simply missed.
In the study, "Computer vision detects covert voluntary facial movements in unresponsive brain injury patients,"publishedinCommunications Medicine, investigators designed SeeMe to quantify tiny facial movements in response to auditory commands with the objective of identifying early, stimulus-evoked behavior.
A single-center prospective cohort included 37 comatose acute brain injury patients and 16 healthy volunteers, aged 18–85, enrolled at Stony Brook University Hospital. Patients had initial Glasgow Coma Scale scores ≤8 and no prior neurologically debilitating diagnoses.
SeeMe tagged facial pores at ~0.2 mm resolution and tracked movement vectors while subjects heard three commands: open your eyes, stick out your tongue, and show me a smile.
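The authors' pipeline is not reproduced here, but the core detection idea, a small yet consistent rise in facial movement just after a command relative to baseline jitter, can be illustrated on synthetic data. The window lengths, noise levels, and z-score threshold below are all assumptions.

```python
# Toy illustration of stimulus-evoked movement detection (not SeeMe itself).
# We z-score the post-command movement window against a pre-command baseline.
import numpy as np

rng = np.random.default_rng(1)
fps = 30
baseline = rng.normal(0.02, 0.005, 10 * fps)  # 10 s of resting jitter (mm/frame)
response = rng.normal(0.05, 0.005, 2 * fps)   # 2 s of small evoked movement

mu, sigma = baseline.mean(), baseline.std()
z = (response.mean() - mu) / (sigma / np.sqrt(response.size))

print(f"baseline {mu:.3f} mm, response {response.mean():.3f} mm, z = {z:.1f}")
if z > 3:  # illustrative significance threshold
    print("window flagged as a stimulus-evoked response")
```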
Results indicate earlier and broader detection with SeeMe. Eye-opening was detected on average 9.1 (± 5.5) days after injury by SeeMe versus 13.2 (± 11.4) days by clinical examination, yielding a 4.1-day lead.
SeeMe identified eye-opening in 30 of 36 patients (85.7%) compared with 25 of 36 (71.4%) by clinical exam. Among patients without an endotracheal tube obscuring the mouth, SeeMe detected mouth movements in 16 of 17 (94.1%).
In seven patients with analyzable mouth videos and clinical command following, SeeMe identified reproducible mouth responses 8.3 days earlier on average. Amplitude and frequency of SeeMe-positive responses correlated with discharge outcomes on the Glasgow Outcome Scale-Extended, with significant Kruskal-Wallis results for both features.
A deep neural network classifier trained on SeeMe-positive trials identified which command had been given with 81% accuracy for eye-opening and an overall accuracy of 65%. Accuracies for the other commands were lower, at 37% for tongue movement and 47% for smile, indicating strong specificity for eye responses.
Authors conclude that acute brain injury patients can exhibit low-amplitude, stimulus-evoked facial movements before overt signs appear at bedside, suggesting that many covertly conscious patients may have motor behavior currently undetected by clinicians.
Earlier detection could inform family discussions, guide rehabilitation timing, and serve as a quantitative signal for future monitoring or interface-based communication strategies, while complementing standard examinations.
Xi Cheng et al, Computer vision detects covert voluntary facial movements in unresponsive brain injury patients, Communications Medicine (2025). DOI: 10.1038/s43856-025-01042-y
Pancreatic insulin disruption triggers bipolar disorder-like behaviors in mice, study shows
Bipolar disorder is a psychiatric disorder characterized by alternating episodes of depression (i.e., low mood and a loss of interest in everyday activities) and mania (i.e., a state in which arousal and energy levels are abnormally high). On average, an estimated 1–2% of people worldwide are diagnosed with bipolar disorder at some point during their lives.
Bipolar disorder can be highly debilitating, particularly if left untreated. Understanding the neural and physiological processes that contribute to its emergence could thus be very valuable, as it could inform the development of new prevention and treatment strategies.
In addition to experiencing periodic changes in mood, individuals diagnosed with this disorder often exhibit some metabolic symptoms, including changes in their blood sugar levels. While some previous studies reported an association between blood sugar control mechanisms and bipolar disorder, the biological link between the two has not yet been uncovered.
Researchers recently carried out a study aimed at further exploring the link between insulin secretion and bipolar disorder-like behaviors, particularly focusing on the expression of the gene RORβ.
Their findings, published in Nature Neuroscience, show that overexpression of this gene in a subtype of pancreatic cells disrupts the release of insulin, which in turn drives a feedback loop with a region of the brain known as the hippocampus, producing alternating depression-like and mania-like behaviors in mice.
The results in mice point to a pancreas–hippocampus feedback mechanism by which metabolic and circadian factors cooperate to generate behavioral fluctuations, and which may play a role in bipolar disorder, wrote the authors.
Yao-Nan Liu et al, A pancreas–hippocampus feedback mechanism regulates circadian changes in depression-related behaviors, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02040-y.
Shampoo-like gel could help chemo patients keep their hair
Cancer fighters know that losing their hair is often part of the battle, but researchers have developed a shampoo-like gel that has been tested in animal models and could protect hair from falling out during chemotherapy treatment.
Baldness from chemotherapy-induced alopecia causes personal, social and professional anxiety for everyone who experiences it. Currently, there are few solutions—the only ones that are approved are cold caps worn on the patient's head, which are expensive and have their own extensive side effects.
The gel is a hydrogel, which absorbs a lot of water and provides long-lasting delivery of drugs to the patient's scalp. The hydrogel is designed to be applied to the patient's scalp before the start of chemotherapy and left on their head as long as the chemotherapy drugs are in their system—or until they are ready to easily wash it off.
During chemotherapy treatment, chemotherapeutic drugs circulate throughout the body. When these drugs reach the blood vessels surrounding the hair follicles on the scalp, they kill or damage the follicles, which releases the hair from the shaft and causes it to fall out. The gel, containing the drugs lidocaine and adrenalone, prevents most of the chemotherapy drugs from reaching the hair follicle by restricting the blood flow to the scalp. Dramatic reduction in drugs reaching the follicle will help protect the hair and prevent it from falling out.
To support practical use of this "shampoo," the gel is designed to be temperature-responsive. At body temperature, the gel is thicker and clings to the patient's hair and scalp. When exposed to slightly cooler temperatures, it becomes thinner and more liquid-like, so it can be easily washed away.
Romila Manchanda et al, Hydrogel-based drug delivery system designed for chemotherapy-induced alopecia, Biomaterials Advances (2025). DOI: 10.1016/j.bioadv.2025.214452
Aging factor EPS8 drives disease-related protein aggregation, worm study shows
A research team has identified a direct molecular link between aging and neurodegeneration by investigating how age-related changes in cell signaling contribute to toxic protein aggregation.
Although aging is the biggest risk factor for neurodegenerative diseases, scientists still don't fully understand which age-associated molecular alterations drive their development.
Using the small nematode worm Caenorhabditis elegans, a research team studied a signaling pathway that leads to pathological protein accumulation with age. Their new paper is published in Nature Aging.
The team focused on the aging-associated protein EPS8 and the signaling pathways it regulates. This protein is known to accumulate with age and to activate harmful stress responses that lead to a shorter lifespan in worms.
Researchers found that increased levels of EPS8, and the activation of its signaling pathways, drive pathological protein aggregation and neurodegeneration—typical features of age-associated neurodegenerative diseases such as Huntington's disease and amyotrophic lateral sclerosis (ALS). By reducing EPS8 activity, the group was then able to prevent the build-up of the toxic protein aggregates and preserve neuronal function in worm models of these two diseases.
Importantly, EPS8 and its signaling partners are evolutionarily conserved and also present in human cells. Similar to what they achieved in the worms, the team was able to prevent the accumulation of toxic protein aggregates in human cell models of Huntington's disease and ALS by reducing EPS8 levels.
Seda Koyuncu et al, The aging factor EPS8 induces disease-related protein aggregation through RAC signaling hyperactivation, Nature Aging (2025). DOI: 10.1038/s43587-025-00943-w
Cooling pollen sunscreen can block UV rays without harming corals
Materials scientists have invented the world's first pollen-based sunscreen derived from Camellia flowers.
In experiments, the pollen-based sunscreen absorbed and blocked harmful ultraviolet (UV) rays as effectively as commercially available sunscreens, which commonly use minerals like titanium dioxide (TiO2) and zinc oxide (ZnO). Each year, an estimated 6,000 to 14,000 tons of commercial sunscreen make their way into the ocean, as people wash it off in the sea or it flows in from wastewater. In laboratory tests on corals, commercial sunscreen induced coral bleaching in just two days, leading to coral death by day six.
In contrast, the pollen-based sunscreen did not affect the corals, which remained healthy even after 60 days.
In other tests, the pollen-based sunscreen also demonstrated its ability to reduce surface skin temperature, thereby helping to keep the skin cool in the presence of simulated sunlight.
Why we slip on ice: Physicists challenge centuries-old assumptions
For over a hundred years, schoolchildren around the world have learned that ice melts when pressure and friction are applied. When you step onto an icy pavement in winter, the story goes, you slip because of the pressure exerted by your body weight through the sole of your (still warm) shoe. But it turns out that this explanation misses the mark.
New research reveals that it's not pressure or friction that causes ice to become slippery, but rather the interaction between molecular dipoles in the ice and those on the contacting surface, such as a shoe sole.
The work is published in the journal Physical Review Letters. This insight overturns a paradigm established nearly two centuries ago by James Thomson, the brother of Lord Kelvin, who proposed that pressure and friction contribute to ice melting alongside temperature.
It turns out that neither pressure nor friction plays a particularly significant part in forming the thin liquid layer on ice.
Instead, computer simulations by researchers reveal that molecular dipoles are the key drivers behind the formation of this slippery layer, which so often causes us to lose our footing in winter. But what exactly is a dipole? A molecular dipole arises when a molecule has regions of partial positive and partial negative charge, giving the molecule an overall polarity that points in a specific direction.
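In standard textbook notation (not specific to this paper), a molecule with partial charges $q_i$ at positions $\vec{r}_i$ has the dipole moment

```latex
\vec{p} = \sum_i q_i \,\vec{r}_i
```

which, for a single pair of charges $+q$ and $-q$ separated by a displacement $\vec{d}$, reduces to $\vec{p} = q\,\vec{d}$. Water is the classic example: its bent shape leaves the oxygen end slightly negative and the hydrogen end slightly positive.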
Achraf Atila et al, Cold Self-Lubrication of Sliding Ice, Physical Review Letters (2025). DOI: 10.1103/1plj-7p4z
How pigments affect the weight of bird feathers
According to the scientists, melanin pigments account for about 22% of a feather's total weight, and even in the most pigmented feathers this contribution is no more than about 25%. Additionally, the two types of pigment don't weigh the same: eumelanin, which makes feathers black or brown, was significantly heavier than pheomelanin, which produces lighter colors.
The findings suggest that a bird has to expend more energy to carry the weight of certain feather colors, which could help explain why birds have evolved such a wide variety of hues.
The researchers suggested that birds in cold climates, like snowy owls, didn't just evolve white feathers for camouflage. The lack of a heavy pigment would allow them to grow thicker, more insulating feathers without adding too much weight. Therefore, they can stay warm and still be able to fly. Additionally, migratory birds may have evolved lighter feathers to reduce the energy cost of flying long distances.
Ismael Galván et al, Pigment contribution to feather mass depends on melanin form and is restricted to approximately 25%, Biology Letters (2025). DOI: 10.1098/rsbl.2025.0299
Microbes in soil may affect hormones tied to love, mental health and social bonds
Experts are exploring evidence that microbes in the soil and the environments around us can affect human microbiota and the "gut-brain axis," potentially shaping emotional states and relationship dynamics—including aspects of romantic love.
They outline the idea in a review article in mSystems proposing how the human gut microbiome might influence hormonal pathways involved in emotions commonly associated with love.
This is not claiming microbes 'cause' love or hatred. The aim is to map plausible biological routes, grounded in microbiology and endocrinology, that researchers can now evaluate with rigorous human studies.
The mini-review, "Does a microbial-endocrine interplay shape love-associated emotions in humans? A hypothesis," synthesizes evidence that microbes can modulate hormones and key neurotransmitters such as dopamine, serotonin and oxytocin.
The researchers are exploring how the evolutionary underpinnings of microbial-endocrine interactions could provide important insights into how microbes influence emotions beyond love, including hate and aggression.
If these pathways are confirmed, the findings could open avenues for microbiome-informed strategies to support mental health and relational well-being. For now, it provides a roadmap for careful, hypothesis-driven science.
Jake M. Robinson et al, Does a microbial-endocrine interplay shape love-associated emotions in humans? A hypothesis, mSystems (2025). DOI: 10.1128/msystems.00415-25
Xin Sun et al, Unforeseen high continental-scale soil microbiome homogenization in urban greenspaces, Nature Cities (2025). DOI: 10.1038/s44284-025-00294-y
Inflammation may explain why women with no standard modifiable risk factors have heart attacks and strokes
Several people ask me this question: Why do people who lead very healthy lives get heart attacks and strokes?
Yes, why?
Cardiologists have long known that up to half of all heart attacks and strokes occur among apparently healthy individuals who do not smoke and do not have high blood pressure, high cholesterol, or diabetes, the "standard modifiable risk factors" which doctors often call "SMuRFs."
How to identify risk among the "SMuRF-Less" has been an elusive goal in preventive cardiology, particularly in women who are often under-diagnosed and under-treated.
A new study has now found that hsCRP—a marker of inflammation—can help identify women who are at risk but are missed by current screening algorithms.
Results were presented at a late-breaking clinical science session at the European Society of Cardiology Congress (ESC) and simultaneously published in The European Heart Journal.
Women who suffer from heart attacks and strokes yet have no standard modifiable risk factors are not identified by the risk equations doctors use in daily practice.
Yet the new data clearly shows that apparently healthy women who are inflamed are at substantial lifetime risk. Medical professionals should be identifying these women in their 40s, at a time when they can initiate preventive care, not wait for the disease to establish itself in their 70s when it is often too late to make a real difference.
As part of the study, researchers studied 12,530 initially healthy women with no standard modifiable risk factors who had the inflammatory biomarker hsCRP measured at study entry and who were then followed over 30 years.
Despite the lack of traditional risks, women who were inflamed as defined by hsCRP levels > 3 mg/L had a 77% increased lifetime risk of coronary heart disease, a 39% increased lifetime risk of stroke, and a 52% increased lifetime risk of any major cardiovascular event.
Additionally, researchers released a new analysis of randomized trial data showing that "SMuRF-Less but Inflamed" patients can reduce their risk of heart attack and stroke by 38% using statin therapy.
While those with inflammation should aggressively initiate lifestyle and behavioral preventive efforts, statin therapy could also play an important role in helping reduce risk among these individuals, say the researchers.
Paul Ridker et al, C-Reactive Protein and Cardiovascular Risk Among Women with No Standard Modifiable Risk Factors: Evaluating the "SMuRF-Less but Inflamed", European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf658
Placebo pain relief works differently across the human body
Researchers have used placebo pain relief to uncover a map-like system in the brainstem that controls pain differently depending on where it's felt in the body. The findings may pave the way for safer, more targeted treatments for chronic pain that don't rely on opioids.
Like a highway, the brainstem connects the brain to the spinal cord and manages all signals going to and from the brain. It produces and releases nearly all the neurochemicals needed for thinking, survival and sensing.
Published in Science, the study used 7-Tesla functional magnetic resonance imaging (fMRI)—one of the most powerful brain scanners available—to pinpoint how two key brainstem regions manage pain through placebo effects.
Researchers exposed 93 healthy participants to heat pain on different body parts and applied a placebo pain-relief cream while secretly lowering the temperature, conditioning them to believe the cream was alleviating their pain.
The temperature was individually adjusted to be moderately painful for each participant. Researchers used a self-report scale, where 0 was no pain and 100 was the worst pain imaginable, and sought a temperature that each participant rated between 40 and 50.
Later, the same pain stimulus was applied to the placebo-treated area as well as a separate untreated area for comparison. Up to 61% of participants still reported less pain in the area where the placebo cream was originally applied, typical of a true placebo response.
They found that upper parts of the brainstem were more active when relieving facial pain, while lower regions were engaged for arm or leg pain.
Two key brainstem regions are involved in this process: the periaqueductal gray (PAG) and the rostral ventromedial medulla (RVM). These areas showed distinct patterns of activity depending on where pain relief was directed, with the upper parts of the PAG and RVM more active for facial pain, while lower parts were more active for arm or leg pain.
The brain has a built-in system to control pain in specific areas. It's not just turning pain off everywhere; but working in a highly coordinated, anatomically precise system.
Understanding which brainstem areas are linked to different parts of the body may open new avenues for developing non-invasive therapies that reduce pain without widespread side effects.
The study also challenges long-held assumptions about how placebo pain relief works. Instead of relying on the brain's opioid system, a different part of the brainstem—the lateral PAG—appears to be responsible; it works without using opioids and could instead be linked to cannabinoid activity.
Opioid-based pain relief typically activates central areas of the brain and can affect the whole body, whereas the cannabinoid circuit that they identified now appears to operate in more targeted regions of the brainstem.
Knowing exactly where pain relief is happening in the brain means we can target that area or assess whether a drug is working in the right place.
Lewis S. Crawford et al, Somatotopic organization of brainstem analgesic circuitry, Science (2025). DOI: 10.1126/science.adu8846
The AI breakthrough that uses almost no power to create images
From creating art and writing code to drafting emails and designing new drugs, generative AI tools are becoming increasingly indispensable for both business and personal use. As demand increases, they will require even more computing power, memory and, therefore, energy. That's got scientists looking for ways to reduce their energy consumption.
In a paper published in the journal Nature, researchers describe the development of an AI image generator that consumes almost no power.
AI image generators use a process called diffusion to generate images from text. First, they are trained on a large dataset of images to which statistical noise, a kind of digital static, is repeatedly added until each image disappears.
Then, when you give AI a prompt such as "create an image of a house," it starts with a screen full of static and then reverses the process, gradually removing the noise until the image appears. If you want to perform large-scale tasks, such as creating hundreds of millions of images, this process is slow and energy intensive.
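As a toy illustration of that forward-then-reverse process (not the optical system in the paper, which performs the computation with light rather than electronics), here is a minimal numerical sketch; the 8-pixel "image" and the pretend-perfect denoiser are stand-ins for a trained network.

```python
# Minimal sketch of diffusion: noise an "image" forward, then denoise it back.
# The toy denoiser below is an assumption standing in for a trained model.
import numpy as np

rng = np.random.default_rng(0)
target = np.linspace(0.0, 1.0, 8)  # the "clean image": 8 pixel values
T = 100                            # number of diffusion steps

# Forward process: add a little Gaussian noise at each step until the
# original image is mostly drowned in static.
x = target.copy()
for t in range(T):
    x = np.sqrt(0.95) * x + np.sqrt(0.05) * rng.normal(size=x.shape)

# Reverse process: start from static and remove the noise step by step.
# A real generator uses a neural network to estimate the noise; this toy
# "denoiser" cheats by knowing the target, purely for illustration.
def toy_denoiser(x_t):
    return x_t - target  # pretend-perfect estimate of the remaining noise

for t in range(T):
    x = x - 0.05 * toy_denoiser(x)  # small denoising step

print(np.round(x, 2))  # ends up close to `target`
```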
Shiqi Chen et al, Optical generative models, Nature (2025). DOI: 10.1038/s41586-025-09446-5
Daniel Brunner, Machine-learning model generates images using light, Nature (2025). DOI: 10.1038/d41586-025-02523-9
AI tool targets RNA structures to unravel secrets of the dark genome
We mapped the human genome decades ago, but most of it is still a black box. Now, scientists have developed a tool to peer inside, and what they find could reshape how we think about disease.
Your genome is the genetic map of you, and we understand almost none of it.
Our handle on the bits of the genome that tell the body how to do things ("make eyes blue," "build heart tissue," "give this person sickle cell anemia") is OK, but there are vast areas of the genome that don't appear to do anything.
Scientists long assumed this was just "junk" DNA, leftovers from the billions of years it took us to evolve from primordial soup into the complex life we know today. But it turns out we just didn't know what to look for, nor how to look for it.
Now, new tools developed by researchers are helping us understand just how important the dark genome really is.
Understanding that, scientists hope, will offer new avenues for drug discovery, and transform how we think about disease and even life itself.
Your genome is broken up into protein-coding genes (around 2%), and the rest.
Proteins are the little machines that do the actual work of running an organism; protein-coding genes are the instructions for those proteins.
The other 98% doesn't build proteins, it doesn't follow the same rules, and it's much harder to understand. It contains "long non-coding RNAs," the stuff that was long dismissed as junk.
Scientists are trying to decode the logic circuitry of the human genome—the hidden rules that tell our DNA how to build and run a human being.
Part 1
Aug 30
Dr. Krishna Kumari Challa
It's a tough job, but a crucial one because studies show that the roots of many diseases, including heart disease, cancer, and some psychiatric disorders, lie outside the well-understood, protein-coding regions of the genome.
Non-protein coding genes have key regulatory roles, turning certain genes on and off or altering their shape.
Do any of that at the wrong time, and things start to break down.
Scientists think that these RNAs act like software, orchestrating the protein 'hardware' into a functioning symphony.
Not junk at all.
You might have heard that you share 60% of your DNA with a banana.
It's not strictly speaking true, unfortunately, but there is a germ of truth there.
Humans and bananas (and everything else) all evolved from the same soup of bacteria and single-celled organisms about four billion years ago, and some of our DNA has been conserved over that time because it performs fundamental functions in our cells.
Studies have shown that about 10% of the human genome shows obvious conservation, but the rest is harder to pin down.
Scientists suspect that the remaining approximately 90% of the genome harbors many conserved RNA structures that are invisible to traditional approaches—hidden regulatory elements camouflaged in the genome.
Which is where the newly developed AI—a tool called ECSFinder—comes in. It is detailed in a paper in the journal Nucleic Acids Research.
Scientists trained the program on known RNA structures to detect secrets hidden in the genome. The program outperformed other available tools and is now ready to be unleashed on the entire genome.
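ECSFinder's internals aren't described here, so purely as a hedged sketch of the "train on known structures, then scan" idea, one could imagine a classifier over window-level features. The features, data, and model choice below are invented for illustration and are not ECSFinder's actual architecture:

```python
# Hypothetical sketch: a classifier learns to separate known conserved RNA
# structures from shuffled controls, then scores unseen genomic windows.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Each row: invented features of a genomic window (e.g., folding-energy
# z-score, covariation score, GC content).
known_structures = rng.normal(loc=[-2.0, 0.8, 0.55], scale=0.3, size=(200, 3))
shuffled_controls = rng.normal(loc=[0.0, 0.1, 0.45], scale=0.3, size=(200, 3))

X = np.vstack([known_structures, shuffled_controls])
y = np.array([1] * 200 + [0] * 200)   # 1 = conserved structure, 0 = control

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
new_window = [[-1.8, 0.7, 0.5]]       # scan step: score an unseen window
print(model.predict_proba(new_window)[0, 1])  # probability it harbors a structure
```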
They expect to uncover hundreds of thousands of new RNA structures, adding a new dimension to our understanding of the genome.
The holy grail of medicine right now is personalized therapy, treatments designed specifically for your individual illness.
One day, it's hoped, clinicians will be able to take a sample of your cancer's DNA, map its genome, then design ultra-specific drugs around its weaknesses, all within a few weeks.
And if the secrets of all gene-driven illnesses are hidden somewhere in the dark genome, learning how to read it is imperative.
Because RNA structures can be targeted by drugs, they present an exciting new frontier for therapies.
Vanda Gaonac'h-Lovejoy et al, ECSFinder: optimized prediction of evolutionarily conserved RNA secondary structures from genome sequences, Nucleic Acids Research (2025). DOI: 10.1093/nar/gkaf780
Part 2
Aug 30
Dr. Krishna Kumari Challa
Scientists move toward developing vaccine against pathogenic Staphylococcus aureus
Antibiotics are the old medicine cabinet standby for treating infections caused by multidrug-resistant Staphylococcus aureus, but as antimicrobial resistance continues to mount globally, scientists say there's a need for new strategies.
While vaccines are a potential answer, achieving an effective way to immunize against multidrug-resistant S. aureus has led scientists down dozens of blind alleys. Ten candidate vaccines that looked promising in preclinical animal studies in recent years failed miserably in human clinical trials.
Now, scientists are investigating a way to sidestep the myriad problems that plagued vaccine investigators in the past by choosing not to target a whole antigen. Instead, they say, it's time to home in on a critical "surface loop" as a vaccine target. The tiny loop is located on the S. aureus antigen known as MntC.
The loop is an ideal target, scientists say, because it has been shown to be essential to S. aureus survival, helping the bacterium mitigate oxidative stress. Vaccine-induced antibodies that zero in on that site therefore deprive the bacterium of a mechanism it needs to survive. As a vaccine target, the surface loop is known as an epitope.
Writing in Science Translational Medicine, researchers at institutions across China report the creation of a first-of-its-kind vaccine, tested in animal models, that is based on a target pinpointed in samples from human clinical trials. The strategy overcomes past obstacles and confers vaccine-mediated protection against drug-resistant S. aureus.
Xiaokai Zhang et al, An epitope vaccine derived by analyzing clinical trial samples safeguards hosts with prior exposure to S. aureus against reinfection, Science Translational Medicine (2025). DOI: 10.1126/scitranslmed.adr7464
Aug 30
Dr. Krishna Kumari Challa
Cancer can smuggle mitochondria into neighboring cells and put them to work
Cancer cells provide healthy neighboring cells with additional mitochondria to put them to work. This has been demonstrated by researchers in a new study. In this way, cancer is exploiting a mechanism that frequently serves to repair damaged cells.
Tumors have developed many strategies and tricks to gain advantages in the body. Researchers have now discovered another surprising trick that certain tumors resort to in ensuring their survival and growth.
In a study published in the journal Nature Cancer, biologists show that skin cancer cells are able to transfer their mitochondria to healthy connective tissue cells (fibroblasts) in their immediate vicinity. Mitochondria are the cell compartments that provide energy in the form of the molecule ATP.
The cancer cells use tiny tubes made of cell membrane material to transfer the mitochondria and connect the two cells—much like in a pneumatic tube system.
The mitochondrial transfer reprograms the fibroblasts functionally into tumor-associated fibroblasts, which mainly support cancer cells: tumor-associated fibroblasts usually multiply faster than normal fibroblasts and produce more ATP, while also secreting higher amounts of growth factors and cytokines. And all this benefits the tumor cells: they also multiply faster, making the tumor more aggressive.
The hijacked fibroblasts also alter the cell environment—the so-called extracellular matrix—by increasing the production of certain matrix components in such a way that cancer cells thrive. The extracellular matrix is vital for the mechanical stability of tissues and influences growth, wound healing and intercellular communication.
Finally, the researchers also clarified the molecular mechanism behind the mitochondrial transfer. Some proteins were already known to assist in transporting mitochondria. The researchers investigated which of these proteins were present in large numbers in cancer cells that transfer mitochondria and came across the protein MIRO2. This protein is produced in very high quantities in cancer cells that transfer their mitochondria.
The researchers detected MIRO2 not only in cell cultures, but also in samples of human tissue—especially in tumor cells at the edges of tumors that grow invasively into the tissue and occur in close proximity to fibroblasts.
The new findings offer starting points for arresting tumor growth. When the researchers blocked the formation of MIRO2, the mitochondrial transfer was inhibited, and the fibroblasts did not develop into tumor-promoting fibroblasts.
The MIRO2 blockade worked in the test tube and in mouse models. Whether it also works in human tissue remains to be seen, say the researchers.
Michael Cangkrama et al, MIRO2-mediated mitochondrial transfer from cancer cells induces cancer-associated fibroblast differentiation, Nature Cancer (2025). DOI: 10.1038/s43018-025-01038-6
Aug 30
Dr. Krishna Kumari Challa
Metformin changes blood metal levels in humans, offering further insight into its mechanism of action
The widely used diabetes drug metformin changes blood metal levels in humans. A new study is an important step in understanding the drug's many actions and designing better ones in the future.
Metformin is the most widely prescribed diabetes drug in the world. Apart from lowering blood sugar levels, it is also known to have a broad range of beneficial side effects such as against tumors, inflammation and atherosclerosis. However, although it has been used for more than 60 years now, its mechanism of action is still not clear, hampering the development of even better drugs against these conditions.
It is known that diabetes patients experience changes in the blood levels of metals such as copper, iron and zinc.
In addition, chemical studies found that metformin has the ability to bind certain metals, such as copper, and recent studies showed that it is this binding ability that might be responsible for some of the drug's beneficial effects.
In the journal BMJ Open Diabetes Research & Care, researchers have published the first clinical evidence of altered blood metal levels in patients taking metformin: such patients have significantly lower copper and iron levels and elevated zinc levels.
Furthermore, since decreases in copper and iron concentrations and an increase in zinc concentration are all considered to be associated with improved glucose tolerance and prevention of complications, these changes may indeed be related to metformin's action.
Association of metformin treatment with changes in metal dynamics in individuals with type 2 diabetes, BMJ Open Diabetes Research & Care (2025). DOI: 10.1136/bmjdrc-2025-005255
on Monday
Dr. Krishna Kumari Challa
AI helps stethoscope detect three heart conditions in 15 seconds
An AI-enabled stethoscope can help doctors pick up three heart conditions in just 15 seconds, according to the results of a real-world trial presented at the European Society of Cardiology's annual congress.
The paper is also published in the journal BMJ Open.
The stethoscope, invented in 1816, is a vital part of a doctor's toolkit, used to listen to sounds within the body. But an AI stethoscope can do much more, including analyzing tiny differences in heartbeat and blood flow which are undetectable to the human ear, and taking a rapid ECG at the same time.
Researchers now have evidence that an AI stethoscope can increase detection of heart failure at the early stage when someone goes to their GP with symptoms.
A study involving more than 200 GP surgeries, with more than 1.5 million patients, looked at people with symptoms such as breathlessness or fatigue.
Those examined using an AI stethoscope were twice as likely to be diagnosed with heart failure, compared to similar patients who were not examined using the technology.
Patients examined with the AI stethoscope were about 3.5 times more likely to be diagnosed with atrial fibrillation—an abnormal heart rhythm which can increase the risk of having a stroke. They were almost twice as likely to receive a diagnosis of heart valve disease, which is where one or more heart valves do not work properly.
Early diagnosis is vital for all three conditions, allowing patients who may need potentially lifesaving medicines to be identified sooner, before they become dangerously unwell.
Mihir A Kelshiker et al, Triple cardiovascular disease detection with an artificial intelligence-enabled stethoscope (TRICORDER): design and rationale for a decentralised, real-world cluster-randomised controlled trial and implementation study, BMJ Open (2025). DOI: 10.1136/bmjopen-2024-098030
on Monday
Dr. Krishna Kumari Challa
A firewall for science: AI tool identifies 1,000 'questionable' journals
A team of computer scientists has developed a new artificial intelligence platform that automatically seeks out "questionable" scientific journals.
The study, published Aug. 27 in the journal Science Advances, tackles an alarming trend in the world of research.
Researchers often receive spam messages from people who purport to be editors at scientific journals, usually ones they have never heard of, offering to publish their papers for a hefty fee.
Such publications are sometimes referred to as "predatory" journals. They target scientists, convincing them to pay hundreds or even thousands of dollars to publish their research without proper vetting. There has been a growing effort among scientists and organizations to vet these journals. But it's like whack-a-mole. You catch one, and then another appears, usually from the same company. They just create a new website and come up with a new name.
The new AI tool automatically screens scientific journals, evaluating their websites and other online data against criteria such as: Does the journal have an editorial board featuring established researchers? Does its website contain a lot of grammatical errors?
In an era when prominent figures are questioning the legitimacy of science, stopping the spread of questionable publications has become more important than ever before.
Part 1
on Monday
Dr. Krishna Kumari Challa
In science, you don't start from scratch. You build on top of the research of others. So if the foundation of that tower crumbles, then the entire thing collapses.
When scientists submit a new study to a reputable publication, that study usually undergoes a practice called peer review. Outside experts read the study and evaluate it for quality—or, at least, that's the goal.
A growing number of companies have sought to circumvent that process to turn a profit. In 2009, Jeffrey Beall, a librarian at CU Denver, coined the phrase "predatory" journals to describe these publications.
Often, they target researchers outside of the United States and Europe, such as in China, India and Iran—countries where scientific institutions may be young, and the pressure and incentives for researchers to publish are high.
These publishers of predatory journals will say, "If you pay $500 or $1,000, we will review your paper." In reality, they don't provide any service; they just take the PDF and post it on their website.
One existing defense is a nonprofit organization called the Directory of Open Access Journals (DOAJ). Since 2003, volunteers at the DOAJ have flagged thousands of journals as suspicious based on six criteria. (Reputable publications, for example, tend to include a detailed description of their peer review policies on their websites.)
But keeping pace with the spread of those publications has been daunting for humans.
To speed up the process, the researchers turned to AI.
They trained their system using the DOAJ's data, then asked the AI to sift through a list of nearly 15,200 open-access journals on the internet.
Among those journals, the AI initially flagged more than 1,400 as potentially problematic. When scientists checked them, 350 turned out to be legitimate, but more than 1,000 appeared genuinely predatory.
The team discovered that questionable journals published an unusually high number of articles. They also included authors with a larger number of affiliations than more legitimate journals, and authors who cited their own research, rather than the research of other scientists, to an unusually high level.
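The team's model isn't public, but as a rough, hypothetical sketch of how the red-flag signals described above could be combined, consider a simple multi-signal screen. All field names and thresholds here are invented for illustration, not the study's actual features or cutoffs:

```python
from dataclasses import dataclass

@dataclass
class JournalProfile:
    # Hypothetical signals drawn from the red flags described in the article.
    has_editorial_board: bool
    website_error_rate: float        # grammatical errors per page
    articles_per_year: int
    mean_affiliations_per_author: float
    self_citation_rate: float        # fraction of citations to the authors' own work

def looks_questionable(j: JournalProfile) -> bool:
    """Flag a journal if it trips several red-flag signals at once."""
    flags = [
        not j.has_editorial_board,
        j.website_error_rate > 5.0,
        j.articles_per_year > 2000,
        j.mean_affiliations_per_author > 4.0,
        j.self_citation_rate > 0.3,
    ]
    return sum(flags) >= 3  # require multiple signals to limit false positives

suspect = JournalProfile(False, 8.2, 3500, 5.1, 0.45)
print(looks_questionable(suspect))  # True
```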
The new AI system isn't publicly accessible, but the researchers hope to make it available to universities and publishing companies soon.
Han Zhuang et al, Estimating the predictability of questionable open-access journals, Science Advances (2025). DOI: 10.1126/sciadv.adt2792
Part 2
on Monday
Dr. Krishna Kumari Challa
Genetic tools identify lost human relatives, the Denisovans, in the fossil record
New genetic techniques are shedding light on a mysterious part of our family tree—ancient human relatives called the Denisovans that emerged during the Pleistocene epoch, approximately 370,000 years ago.
Little is known about them because so few fossils have been found. The group was discovered by accident in 2010, based solely on DNA analysis of what was thought to be a Neanderthal finger bone found in the Denisova Cave in Siberia. However, it turned out to belong to a previously unknown lineage closely related to Neanderthals.
Since then, only a few additional fragments have been found, so researchers developed a new genetic tool to comb the fossil record for potential Denisovans. Their starting point was the Denisovan genome.
The research is published in the journal Proceedings of the National Academy of Sciences.
The research team's innovative technique detects subtle changes in how Denisovan genes are regulated. These changes were likely responsible for altering the Denisovan skeleton, providing clues to what they looked like.
Using this technique, the scientists identified 32 physical traits most closely related to skull morphology. Then they used this predicted profile to scan the Middle Pleistocene fossil record, specifically focusing on skulls from that period.
Before applying this method to unknown fossils, the researchers tested its accuracy. They used the same technique to successfully predict known physical traits of Neanderthals and chimpanzees with more than 85% accuracy.
Next, they identified 18 skull features they could measure, such as head width and forehead height, and then measured these traits on ten ancient skulls. They compared these to skulls from reference groups, including Neanderthals, modern humans and Homo erectus. Using sophisticated statistical tests, they gave each ancient skull a score to see how well it matched the Denisovan blueprint.
Of these, two skulls from China, known as the Harbin and Dali specimens, were a close match to the Denisovan profile. The Harbin skull matched 16 traits, while the Dali skull matched 15.
The study also found that a third skull, Kabwe 1, had a strong link to the Denisovan-Neanderthal tree, which may indicate that it was the root of the Denisovan lineage that split from Neanderthals.
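As a highly simplified sketch of that matching step (the study used 18 measured traits and formal statistical tests; the trait names, predicted ranges, and measurements below are invented for illustration):

```python
# Toy version of the trait-matching idea: score a fossil skull by how many
# measured traits fall inside the range predicted for Denisovans.
predicted = {                      # hypothetical predicted ranges (mm)
    "head_width": (150, 170),
    "forehead_height": (55, 70),
    "palate_length": (60, 75),
}

harbin_like = {"head_width": 164, "forehead_height": 62, "palate_length": 71}

def match_score(skull: dict, profile: dict) -> int:
    """Count how many traits lie within the predicted Denisovan range."""
    return sum(lo <= skull[t] <= hi for t, (lo, hi) in profile.items())

print(match_score(harbin_like, predicted), "of", len(predicted), "traits match")
```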
In their study, the team states that their work may help with the identification of other extinct human relatives.
Nadav Mishol et al, Candidate Denisovan fossils identified through gene regulatory phenotyping, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2513968122
on Tuesday
Dr. Krishna Kumari Challa
Rare seasonal brain shrinkage in shrews is driven by water loss, not cell death, MRI study reveals
Common shrews are one of only a handful of mammals known to flexibly shrink and regrow their brains. This rare seasonal cycle, known as Dehnel's phenomenon, has puzzled scientists for decades. How can a brain lose volume and regrow months later without sustaining permanent damage?
A study using noninvasive MRI has scanned the brains of shrews undergoing shrinkage, identifying a key molecule involved in the phenomenon: water. The work is published in the journal Current Biology.
In the experiments conducted by the researchers, shrews lost 9% of their brain volume during shrinkage, but the cells did not die. The cells just lost water.
Normally, brain cells that lose water become damaged and ultimately die, but in shrews, the opposite happened. The cells remained alive and even increased in number.
This finding solves a mystery—and opens up potential pathways for the treatment of human brain disease. Brain shrinkage in shrews matches closely what happens in patients suffering from Alzheimer's, Parkinson's, and other brain diseases.
The study also shows that a specific protein known for regulating water—aquaporin 4—was likely involved in moving water out of the brain cells of shrews. This same protein is present in higher quantities in the diseased brains of humans, too.
That the shrunken brains of shrews share characteristics with diseased human brains makes the case that these miniature mammals, with their ability to reverse brain loss, could also offer clues for medical treatments.
Programmed seasonal brain shrinkage in the common shrew via water loss without cell death, Current Biology (2025). DOI: 10.1016/j.cub.2025.08.015
on Tuesday
Dr. Krishna Kumari Challa
Super corals could help buy time for reefs in a warming world
Coral reefs are in crisis. Rising ocean temperatures driven by climate change are pushing these ecosystems to the brink, with mass bleaching events becoming more frequent and severe.
But what if nature already holds part of the solution?
New research led by UTS demonstrates that corals that naturally thrive in extreme environments, often referred to as "super corals," could be used in restoration efforts to protect vulnerable reef systems.
The research was published in the journal Science Advances and offers compelling evidence that these resilient corals can retain their heat tolerance even after being moved to more stable reef habitats.
Christine D. Roper et al, Coral thermotolerance retained following year-long exposure to a novel environment, Science Advances (2025). DOI: 10.1126/sciadv.adu3858
on Tuesday
Dr. Krishna Kumari Challa
Longevity gains are slowing, and a life expectancy of 100 is unlikely, study finds
A new study finds that life expectancy gains made by high-income countries in the first half of the 20th century have slowed significantly, and that none of the generations born after 1939 will reach 100 years of age on average.
Published in the journal Proceedings of the National Academy of Sciences, the study analyzed life expectancy for 23 high-income and low-mortality countries using data from the Human Mortality Database and six different mortality forecasting methods.
The unprecedented increase in life expectancy we achieved in the first half of the 20th century appears to be a phenomenon we are unlikely to achieve again in the foreseeable future. In the absence of any major breakthroughs that significantly extend human life, life expectancy would still not match the rapid increases seen in the early 20th century even if adult survival improved twice as fast as we predict.
From 1900 to 1938, life expectancy rose by about five and a half months with each new generation. The life expectancy for an individual born in a high-income country in 1900 was an average of 62 years. For someone born just 38 years later in similar conditions, life expectancy had jumped to 80 years on average.
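The generational arithmetic in the paragraph above is easy to verify:

```python
# Sanity check on the cohort figures quoted above.
months_per_cohort = 5.5           # gain per birth year, 1900-1938
cohorts = 1938 - 1900             # 38 birth cohorts
gain_years = months_per_cohort * cohorts / 12
print(round(62 + gain_years, 1))  # ~79.4, consistent with the ~80 years cited
```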
For those born between 1939 and 2000, the increase slowed to roughly two and a half to three and a half months per generation, depending on the forecasting method. Mortality forecasting methods are statistical techniques that make informed predictions about future lifespans based on past and current mortality information. These models enabled the research team to estimate how life expectancy will develop under a variety of plausible future scenarios.
Researchers forecast that those born in 1980 will not live to be 100 on average, and that none of the cohorts in their study will reach this milestone. The slowdown is largely due to the fact that past surges in longevity were driven by remarkable improvements in survival at very young ages.
At the beginning of the 20th century, infant mortality fell rapidly due to medical advances and other improvements in quality of life for high-income countries. This contributed significantly to the rapid increase in life expectancy. However, infant and child mortality is now so low that the forecasted improvements in mortality in older age groups will not be enough to sustain the previous pace of longevity gains.
While mortality forecasts can never be certain as the future may unfold in unexpected ways—by way of pandemics, new medical treatments or other unforeseen societal changes—this study provides critical insight for governments looking to anticipate the needs of their health care systems, pension planning and social policies.
Although a population-level analysis, this research also has implications for individuals, as life expectancy influences personal decisions about saving, retirement and long-term planning. If life expectancy increases more slowly, as this study shows is likely, both governments and individuals may need to recalibrate their expectations for the future.
José Andrade et al, Cohort mortality forecasts indicate signs of deceleration in life expectancy gains, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2519179122
on Tuesday
Dr. Krishna Kumari Challa
Why male embryos grow faster
Researchers have uncovered the genetic triggers that cause male and female bovine embryos to develop differently, as early as seven to eight days after fertilization. The breakthrough in basic science has implications for human health—such as drug development and in vitro fertilization—and for bovine health and dairy industry sustainability.
Scientists have known since the 1990s that male embryos of multiple mammalian species, including humans, grow faster than female embryos, but until now, the underlying reasons were unclear.
In a paper published in Cell & Bioscience, scientists grew bovine embryos in petri dishes, then determined their genetic sex and performed RNA sequencing, which shows how genes are being expressed.
They discovered significant sex differences in gene regulation: male embryos prioritized genes associated with energy metabolism, causing them to grow faster than their female counterparts. Female embryos emphasized genes associated with sex differentiation, gonad development and inflammatory pathways that are important for future development.
Understanding these fundamental sex differences at the genomic and molecular levels is critically important to improving in vitro fertilization (IVF) success in humans and cows, and in developing treatments that will work for both men and women.
Meihong Shi et al, Sex-biased transcriptome in in vitro produced bovine early embryos, Cell & Bioscience (2025). DOI: 10.1186/s13578-025-01459-x
on Tuesday
Dr. Krishna Kumari Challa
Scientists find that ice generates electricity when bent
A new study reveals that ice is a flexoelectric material, meaning it can produce electricity when unevenly deformed. Published in Nature Physics, this discovery could have major technological implications while also shedding light on natural phenomena such as lightning.
Frozen water is one of the most abundant substances on Earth. It is found in glaciers, on mountain peaks and in polar ice caps. Although it is a well-known material, studying its properties continues to yield fascinating results.
An international study has shown for the first time that ordinary ice is a flexoelectric material.
In other words, it can generate electricity when subjected to mechanical deformation. This discovery could have significant implications for the development of future technological devices and help to explain natural phenomena such as the formation of lightning in thunderstorms.
The study represents a significant step forward in our understanding of the electromechanical properties of ice.
The scientists discovered that ice generates electric charge in response to mechanical stress at all temperatures. In addition, they identified a thin 'ferroelectric' layer at the surface at temperatures below −113 °C (160 K).
This means that the ice surface can develop a natural electric polarization, which can be reversed when an external electric field is applied—similar to how the poles of a magnet can be flipped. The surface ferroelectricity is a cool discovery in its own right, as it means that ice may have not just one way to generate electricity, but two: ferroelectricity at very low temperatures, and flexoelectricity at higher temperatures all the way to 0 °C.
This property places ice on a par with electroceramic materials such as titanium dioxide, which are currently used in advanced technologies like sensors and capacitors.
One of the most surprising aspects of this discovery is its connection to nature. The results of the study suggest that the flexoelectricity of ice could play a role in the electrification of clouds during thunderstorms, and therefore in the origin of lightning.
It is known that lightning forms when an electric potential builds up in clouds due to collisions between ice particles, which become electrically charged. This potential is then released as a lightning strike. However, the mechanism by which ice particles become electrically charged has remained unclear, since ice is not piezoelectric—it cannot generate charge simply by being compressed during a collision.
However, the study shows that ice can become electrically charged when it is subjected to inhomogeneous deformations, i.e., when it bends or deforms unevenly.
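For readers who want the distinction in symbols, in standard textbook notation (this notation is not taken from the paper itself): piezoelectricity couples polarization to strain, while flexoelectricity couples it to the strain gradient,

$$P_i^{\text{piezo}} = d_{ijk}\,\varepsilon_{jk}, \qquad P_i^{\text{flexo}} = \mu_{ijkl}\,\frac{\partial \varepsilon_{jk}}{\partial x_l},$$

where \(\varepsilon_{jk}\) is the strain tensor, \(d_{ijk}\) are the piezoelectric coefficients (which vanish for ordinary ice by symmetry), and \(\mu_{ijkl}\) are the flexoelectric coefficients. Uniform compression has zero strain gradient and generates no charge; bending produces a nonzero gradient and hence a polarization.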
X. Wen et al, Flexoelectricity and surface ferroelectricity of water ice, Nature Physics (2025). DOI: 10.1038/s41567-025-02995-6
on Tuesday
Dr. Krishna Kumari Challa
Neuroscientists show for first time that precise timing of nerve signals determines how brain processes information
It has long been known that the brain preferentially processes information that we focus our attention on—a classic example is the so-called cocktail party effect.
In an environment full of voices, music, and background noise, the brain manages to concentrate on a single voice. The other noises are not objectively quieter, but are perceived less strongly at that moment.
The brain focuses its processing on the information that is currently relevant—in this case, the voice of the conversation partner—while other signals are received but not forwarded and processed to the same extent.
Neuroscientists provided the first causal evidence of how the brain transmits and processes relevant information. The work is published in the journal Nature Communications.
Whether a signal is processed further in the brain depends crucially on whether it arrives at the right moment—during a short phase of increased receptivity of the nerve cells.
Nerve cells do not work continuously, but in rapid cycles. They are particularly active and receptive for a few milliseconds, followed by a window of lower activity and excitability. This cycle repeats approximately every 10 to 20 milliseconds, which corresponds to the 50 to 100 Hz gamma band. Only when a signal arrived shortly before the peak of this active phase did it change the behavior of the neurons.
This temporal coordination is the fundamental mechanism of information processing. Attention makes targeted use of this phenomenon by aligning the timing of the nerve cells so that relevant signals arrive precisely in this time window, while others are excluded.
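As a toy sketch of that gating idea, with an invented ~60 Hz cycle standing in for the gamma-band rhythm the study examined (the cycle length and window width here are illustrative, not the study's measurements):

```python
import numpy as np

# Toy model of phase-dependent gating: a ~60 Hz excitability cycle
# (one cycle every ~16.7 ms) lets through only inputs that arrive
# shortly before the peak of the receptive phase.
cycle_ms = 1000 / 60.0            # cycle length, ~16.7 ms
window_ms = 3.0                   # receptive window just before the peak

def transmitted(arrival_ms: float) -> bool:
    """True if a spike arrives within the receptive window of the cycle."""
    phase = arrival_ms % cycle_ms
    return (cycle_ms - window_ms) <= phase < cycle_ms

arrivals = np.array([2.0, 15.5, 31.9, 40.0])
print([transmitted(t) for t in arrivals])  # only well-timed inputs pass
```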
Eric Drebitz et al, Gamma-band synchronization between neurons in the visual cortex is causal for effective information processing and behavior, Nature Communications (2025). DOI: 10.1038/s41467-025-62732-8
on Tuesday
Dr. Krishna Kumari Challa
Criminal behavior in patients with dementia
Don't blame the criminals for everything they do.
A suspected perpetrator who can barely remember his name; several traffic violations committed by a woman in her mid-fifties who shows no insight into her own behavior—should such cases be brought before a court? And how should the state deal with people who commit acts of violence without meaning to?
Those questions come to mind on hearing such examples from everyday clinical practice with persons suffering from dementia. Neurodegenerative diseases can affect several functions of the brain, ranging from memory in Alzheimer's disease to behavior, as in behavioral variant frontotemporal dementia, and to sensorimotor function in Parkinson's disease.
One of the most striking consequences of these alterations is that affected persons might develop criminal risk behavior such as harassment, traffic violations, theft, or even behavior causing harm to other people or animals, sometimes as the first sign of disease.
If persons violate social or legal norms due to changes in behavior, personality and cognition, these incidents may have a substantial impact on this person's family and social surroundings and may lead to prosecution.
Researchers investigated this problem in a broad meta-analysis that included 14 studies with 236,360 persons from different countries (U.S.A., Sweden and Finland, Germany and Japan). The work is published in the journal Translational Psychiatry.
Their systematic literature review revealed that criminal risk behavior is more prevalent early in the disease course than in the general population, but declines below population levels thereafter. Criminal behavior committed for the first time in mid-life could therefore be an indicator of incident dementia, calling for the earliest possible diagnosis and therapy.
The team shows that the prevalence of criminal risk behaviors was highest in behavioral variant frontotemporal dementia (>50%), followed by semantic variant primary progressive aphasia (40%), but rather low in vascular dementia and Huntington's disease (15%), Alzheimer's disease (10%), and lowest in Parkinsonian syndromes (<10%).
Criminal risk behavior in frontotemporal dementia is most likely caused by the neurodegenerative disease itself. Most of the patients showed criminal risk behavior for the first time in their life and had no previous records of criminal activity.
The prevalence appears higher in frontotemporal dementia and Alzheimer's disease early in the disease course (presumably in pre-stages such as mild behavioral or cognitive impairment) than in the general population, but declines after diagnosis, ultimately falling below general-population levels.
The researchers also found that criminal risk behavior in dementia is more frequent in men than in women. After diagnosis, men showed four times more criminal risk behavior than women in frontotemporal dementia, and seven times more in Alzheimer's disease.
Part 1
on Tuesday
Dr. Krishna Kumari Challa
In a second study, the working group also identified the changes in the brain that are associated with criminal behavior in frontotemporal dementia. This study was published in Human Brain Mapping.
Persons exhibiting criminal behavior showed larger atrophy in the temporal lobe, indicating that criminal behavior might be caused by so-called disinhibition, i.e., the loss of normal restraints or inhibitions, leading to a reduced ability to regulate one's behavior, impulses, and emotions.
Disinhibition can manifest as acting impulsively, without thinking through the consequences, and behaving in ways that are inappropriate for the situation.
One has to prevent further stigmatization of persons with dementia. Of note, most offenses committed were minor (indecent behavior, traffic violations, theft, damage to property), though physical violence or aggression also occurred.
Hence, sensitivity to these possible early signs of dementia, and the earliest possible diagnosis and treatment, are of the utmost importance. Besides early diagnosis and treatment of affected persons, the researchers say, adaptations of the legal system should be discussed, such as raising awareness of offenses caused by these diseases and taking the diseases into account in sentencing and in prisons.
Matthias L. Schroeter et al, Criminal minds in dementia: A systematic review and quantitative meta-analysis, Translational Psychiatry (2025). DOI: 10.1038/s41398-025-03523-z
Karsten Mueller et al, Criminal Behavior in Frontotemporal Dementia: A Multimodal MRI Study, Human Brain Mapping (2025). DOI: 10.1002/hbm.70308
Part 2
on Tuesday
Dr. Krishna Kumari Challa
Depression linked to presence of immune cells in the brain's protective layer
Immune cells released from bone marrow in the skull in response to chronic stress and adversity could play a key role in symptoms of depression and anxiety, say researchers.
The discovery—found in a study in mice—sheds light on the role that inflammation can play in mood disorders and could help in the search for new treatments, in particular for those individuals for whom current treatments are ineffective.
Around 1 billion people will be diagnosed with a mood disorder such as depression or anxiety at some point in their life. While there may be many underlying causes, chronic inflammation—when the body's immune system stays active for a long time, even when there is no infection or injury to fight—has been linked to depression. This suggests that the immune system may play an important role in the development of mood disorders.
Previous studies have highlighted how high levels of an immune cell known as a neutrophil, a type of white blood cell, are linked to the severity of depression. But how neutrophils contribute to symptoms of depression is currently unclear.
In research published in Nature Communications, a team of scientists tested the hypothesis that chronic stress can lead to the release of neutrophils from bone marrow in the skull. These cells then collect in the meninges—membranes that cover and protect your brain and spinal cord—and contribute to symptoms of depression.
As it is not possible to test this hypothesis in humans, the team used mice exposed to chronic social stress. In this experiment, an 'intruder' mouse is introduced into the home cage of an aggressive resident mouse. The two have brief daily physical interactions and can otherwise see, smell, and hear each other.
The researchers found that prolonged exposure to this stressful environment led to a noticeable increase in levels of neutrophils in the meninges, and that this was linked to signs of depressive behavior in the mice. Even after the stress ended, the neutrophils lasted longer in the meninges than they did in the blood.
Analysis confirmed the researchers' hypothesis that the meningeal neutrophils—which appeared subtly different from those found in the blood—originated in the skull.
Further analysis suggested that long-term stress triggered a type of immune system 'alarm warning' known as type I interferon signaling in the neutrophils. Blocking this pathway—in effect, switching off the alarm—reduced the number of neutrophils in the meninges and improved behavior in the depressed mice.
Part 1
on Tuesday
Dr. Krishna Kumari Challa
The reason why there are high levels of neutrophils in the meninges is unclear. One explanation could be that they are recruited by microglia, a type of immune cell unique to the brain.
Another possible explanation is that chronic stress may cause microhemorrhages, tiny leaks in brain blood vessels, and that neutrophils—the body's 'first responders'—arrive to fix the damage and prevent any further damage. These neutrophils then become more rigid, possibly getting stuck in brain capillaries and causing further inflammation in the brain.
These new findings show that these 'first responder' immune cells leave the skull bone marrow and travel to the brain, where they can influence mood and behavior.
Stacey L. Kigar et al, Chronic social defeat stress induces meningeal neutrophilia via type I interferon signaling in male mice, Nature Communications (2025). DOI: 10.1038/s41467-025-62840-5
Part 2
on Tuesday
Dr. Krishna Kumari Challa
Hyperactive blood platelets linked to heart attacks despite standard drug therapy
Scientists have discovered a particularly active subgroup of blood platelets that may cause heart attacks in people with coronary heart disease despite drug therapy. This discovery may open up new prospects for customized therapies. The research results are published in the European Heart Journal and were presented on August 31 at Europe's largest cardiology congress.
Despite modern medication, many people with coronary heart disease continue to suffer heart attacks. A research team has now found a possible explanation for this and has also provided a very promising therapeutic approach.
Their work focused on so-called "reticulated platelets": particularly young, RNA-rich and reactive thrombocytes that play a central role in the formation of blood clots in patients with coronary heart disease.
The researchers were able to comprehensively characterize the biological mechanisms of these cells for the first time. This explains why the platelets remain overactive in many patients even under optimal therapy: the young platelets carry a particularly large number of activating signaling pathways, which makes them more sensitive and reactive than mature platelets.
The results come from a multidimensional analysis, using various methods, of the blood of over 90 patients with coronary heart disease. Among other things, the researchers found signaling pathways that can be used to specifically inhibit the activity of these blood cells, in particular two target structures called GPVI and PI3K. Initial laboratory experiments confirmed that inhibiting these signaling pathways can reduce platelet hyperactivity.
Kilian Kirmes et al, Reticulated platelets in coronary artery disease: a multidimensional approach unveils prothrombotic signalling and novel therapeutic targets, European Heart Journal (2025). DOI: 10.1093/eurheartj/ehaf694
on Tuesday
Dr. Krishna Kumari Challa
8,000 years of human activities have caused wild animals to shrink and domestic animals to grow
Humans have caused wild animals to shrink and domestic animals to grow, according to a new study.
Researchers studied tens of thousands of animal bones from Mediterranean France covering the last 8,000 years to see how the size of both types of animals has changed over time.
Scientists already know that human choices, such as selective breeding, influence the size of domestic animals, and that environmental factors also impact the size of both. However, little is known about how these two forces have influenced the size of wild and domestic animals over such a prolonged period. This latest research, published in the Proceedings of the National Academy of Sciences, fills a major gap in our knowledge.
The scientists analyzed more than 225,000 bones from 311 archaeological sites in Mediterranean France. They took thousands of measurements of things like the length, width, and depth of bones and teeth from wild animals, such as foxes, rabbits and deer, as well as domestic ones, including goats, cattle, pigs, sheep and chickens.
But the researchers didn't just focus on the bones. They also collected data on the climate, the types of plants growing in the area, the number of people living there and what they used the land for. And then, with some sophisticated statistical modeling, they were able to track key trends and drivers behind the change in animal size.
The research team's findings reveal that for around 7,000 years, wild and domestic animals evolved along similar paths, growing and shrinking together in sync with their shared environment and human activity. However, all that changed around 1,000 years ago. Their body sizes began to diverge dramatically, especially during the Middle Ages.
Domestic animals started to get much bigger as they were being actively bred for more meat and milk. At the same time, wild animals began to shrink in size as a direct result of human pressures, such as hunting and habitat loss. In other words, human activities replaced environmental factors as the main force shaping animal evolution.
Cyprien Mureau et al, 8,000 years of wild and domestic animal body size data reveal long-term synchrony and recent divergence due to intensified human impact, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2503428122
on Wednesday
Dr. Krishna Kumari Challa
World's oldest host-associated bacterial DNA found
An international team led by researchers at the Center for Paleogenetics has uncovered microbial DNA preserved in woolly and steppe mammoth remains dating back more than one million years. The analyses reveal some of the world's oldest microbial DNA ever recovered, as well as the identification of bacteria that possibly caused disease in mammoths. The findings are published in Cell.
Researchers analyzed microbial DNA from 483 mammoth specimens, of which 440 were sequenced for the first time. Among them was a steppe mammoth that lived about 1.1 million years ago. Using advanced genomic and bioinformatic techniques, the team distinguished microbes that once lived alongside the mammoths from those that invaded their remains after death.
Ancient Host-Associated Microbes obtained from Mammoth Remains, Cell (2025). DOI: 10.1016/j.cell.2025.08.003. www.cell.com/cell/fulltext/S0092-8674(25)00917-1
on Wednesday
Dr. Krishna Kumari Challa
Genetic mechanism reveals how plants coordinate flowering with light and temperature conditions
Plants may be stuck in one place, but the world around them is constantly changing. In order to grow and flower at the right time, plants must constantly collect information about their surroundings, measuring things like temperature, brightness, and length of day. Still, it's unclear how all this information gets combined to trigger specific behaviors.
Scientists have discovered a genetic mechanism for how plants integrate light and temperature information to control their flowering.
In a study published in Nature Communications, the researchers found an interaction between two genetic pathways that signals the presence of both blue light and low temperature. This genetic module helps plants fine-tune their flowering to the optimal environmental conditions.
In one pathway, blue light activates the PHOT2 blue light receptor, with help from the partner protein NPH3. In another, low ambient temperature allows a transcription factor called CAMTA2 to boost the expression of a gene called EHB1. Importantly, EHB1 is known to interact with NPH3, placing NPH3 at the convergence point of the blue light and low temperature signals. This genetic architecture effectively works as a coincidence detector, linking the presence of blue light and low temperature to guide the switch to flowering.
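In logical terms the module behaves like an AND gate. Here is a toy sketch using the gene names from the article, with the caveat that real signaling is graded and dynamic rather than boolean:

```python
def flowering_signal(blue_light: bool, low_temperature: bool) -> bool:
    """Toy AND-gate model of the PHOT2/NPH3 and CAMTA2/EHB1 module described above."""
    phot2_nph3_active = blue_light        # blue light activates PHOT2, aided by NPH3
    ehb1_expressed = low_temperature      # CAMTA2 boosts EHB1 expression in the cold
    # EHB1 binding NPH3 makes NPH3 the convergence point of both inputs.
    return phot2_nph3_active and ehb1_expressed

for light in (False, True):
    for cold in (False, True):
        print(f"blue light={light}, low temp={cold} -> flowering cue: {flowering_signal(light, cold)}")
```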
The study describes an important component of plant growth, reproduction, and information processing. The newly discovered genetic module allows plants to have fine control over their flowering in low temperatures. Understanding this system will now help scientists optimize crop growth under changing environmental conditions.
Adam Seluzicki et al, Genetic architecture of a light-temperature coincidence detector, Nature Communications (2025). DOI: 10.1038/s41467-025-62194-y
on Wednesday
Dr. Krishna Kumari Challa
Generative AI designs molecules that kill drug-resistant bacteria
What if generative AI could design life-saving antibiotics, not just art and text? In a new Cell Biomaterials paper, researchers introduce AMP-Diffusion, a generative AI tool used to create tens of thousands of new antimicrobial peptides (AMPs)—short strings of amino acids, the building blocks of proteins—with bacteria-killing potential. In animal models, the most potent AMPs performed as well as FDA-approved drugs, without detectable adverse effects.
While past work has shown that AI can successfully sort through mountains of data to identify promising antibiotic candidates, this study adds to a small but growing number of demonstrations that AI can invent antibiotic candidates from scratch.
Using AMP-Diffusion, the researchers generated the amino-acid sequences for about 50,000 candidates. They used AI to filter the results.
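The paper's actual AI filtering criteria aren't detailed here; as a hedged illustration of what a first-pass screen over generated sequences can look like, one might apply classic physicochemical heuristics for antimicrobial peptides. The thresholds and scoring below are invented for illustration, not the study's method:

```python
# Hypothetical first-pass filter over generated peptide sequences.
# Net positive charge and moderate hydrophobicity are classic AMP heuristics.
HYDROPHOBIC = set("AILMFWVY")
CATIONIC = set("KR")

def passes_screen(seq: str) -> bool:
    """Keep peptides that are short, net-positive, and moderately hydrophobic."""
    if not (10 <= len(seq) <= 40):
        return False
    net_charge = sum(aa in CATIONIC for aa in seq) - seq.count("D") - seq.count("E")
    hydrophobic_frac = sum(aa in HYDROPHOBIC for aa in seq) / len(seq)
    return net_charge >= 2 and 0.3 <= hydrophobic_frac <= 0.6

candidates = ["GLFKKVAKKVLPHIAGKI", "DDEEDDEE", "KWKLFKKIEKVGQNIRDGIIKAGPAVAVVGQATQIAK"]
print([passes_screen(c) for c in candidates])  # [True, False, True]
```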
After synthesizing the 46 most promising candidates, the researchers tested them in human cells and animal models. Treating skin infections in mice, two AMPs demonstrated efficacy on par with levofloxacin and polymyxin B, FDA-approved drugs used to treat antibiotic-resistant bacteria, without adverse effects.
In the future, the researchers hope to refine AMP-Diffusion, giving it the capability to denoise with a more specific goal in mind, like treating a particular type of bacterial infection, among other features.
Generative latent diffusion language modeling yields anti-infective synthetic peptides, Cell Biomaterials (2025). DOI: 10.1016/j.celbio.2025.100183
on Wednesday
Dr. Krishna Kumari Challa
How bacteria bounce back after antibiotics
A study by researchers has uncovered how Escherichia coli (E. coli) persister bacteria survive antibiotics by protecting their genetic instructions.
The work, published in Nature Microbiology, offers new hope for tackling chronic, recurring infections.
Persister bacteria, which enter a dormant state to survive antibiotics that target active cells, are linked to over 20% of chronic infections and resist current treatments.
Understanding their survival mechanisms could lead to new ways to combat recurring infections. This study utilized E. coli bacteria as a model and found that prolonged stress leads to the increased formation of aggresomes (membraneless droplets) and the enrichment of mRNA (molecules that carry instructions for making proteins) within them, which enhances the ability of E. coli to survive and recover from stress.
Researchers used multiple approaches, including imaging, modeling, and transcriptomics, to show that prolonged stress leading to ATP (fuel for all living cells) depletion in Escherichia coli results in increased aggresome formation, their compaction, and enrichment of mRNA within aggresomes compared to the cytosol (the liquid inside of cells).
Transcripts were longer in aggresomes than in the cytosol. Mass spectrometry showed that mRNA ribonuclease (an enzyme that breaks down RNA) was excluded from aggresomes, owing to negative charge repulsion.
Experiments with fluorescent reporters and disruption of aggresome formation showed that mRNA storage within aggresomes promoted translation and was associated with reduced lag phases during growth after stress removal. These findings suggest that mRNA storage within aggresomes confers an advantage for bacterial survival and recovery from stress.
This breakthrough illuminates how persister cells survive and revive after antibiotic treatment.
By targeting aggresomes, new drugs could disrupt this protective mechanism, preventing bacteria from storing mRNA and making them more vulnerable to elimination, thus reducing the risk of infection relapse.
Linsen Pei et al, Aggresomes protect mRNA under stress in Escherichia coli, Nature Microbiology (2025). DOI: 10.1038/s41564-025-02086-5
on Wednesday
Dr. Krishna Kumari Challa
Deforestation causes 74% of the Amazon's dry-season rainfall decline and 16% of its temperature rise, study finds
Deforestation in the Brazilian Amazon is responsible for approximately 74.5% of the reduction in rainfall and 16.5% of the temperature increase in the biome during the dry season. For the first time, researchers have quantified the impact of vegetation loss and global climate change on the forest.
The study provides fundamental results to guide effective mitigation and adaptation strategies.
How climate change and deforestation interact in the transformation of the Amazon rainforest, Nature Communications (2025). DOI: 10.1038/s41467-025-63156-0
on Wednesday
Dr. Krishna Kumari Challa
Magnetic fields in infant universe may have been billions of times weaker than a fridge magnet
The magnetic fields that formed in the very early stages of the universe may have been billions of times weaker than a small fridge magnet, with strengths comparable to magnetism generated by neurons in the human brain. Yet, despite such weakness, quantifiable traces of their existence still remain in the cosmic web, the visible cosmic structures connected throughout the universe.
These conclusions emerge from a study using around a quarter of a million computer simulations.
The research, recently published in Physical Review Letters, specifies both possible and maximum values for the strengths of primordial magnetic fields. It also offers the possibility of refining our knowledge of the early universe and the formation of the first stars and galaxies.
Mak Pavičević et al, Constraints on Primordial Magnetic Fields from the Lyman- α Forest, Physical Review Letters (2025). DOI: 10.1103/77rd-vkpz. On arXiv: DOI: 10.48550/arxiv.2501.06299
on Wednesday
Dr. Krishna Kumari Challa
Removing yellow stains from fabric with blue light
Sweat and food stains can ruin your favorite clothes. But bleaching agents such as hydrogen peroxide or dry-cleaning solvents that remove stains aren't options for all fabrics, especially delicate ones. Now, researchers in ACS Sustainable Chemistry & Engineering report a simple way to remove yellow stains using a high-intensity blue LED light. They demonstrate the method's effectiveness at removing stains from orange juice, tomato juice and sweat-like substances on multiple fabrics, including silk.
The method utilizes visible blue light in combination with ambient oxygen, which acts as the oxidizing agent to drive the photobleaching process. This approach avoids the use of harsh chemical oxidants typically required in conventional bleaching methods, making it inherently more sustainable.
Yellow clothing stains are caused by squalene and oleic acid from skin oils and sweat, as well as natural pigments like beta carotene and lycopene, present in oranges, tomatoes and other foods. UV light is a potential stain-removing alternative to chemical oxidizers like bleach and hydrogen peroxide, but it can damage delicate fabrics.
The same researchers previously determined that a high-intensity blue LED light could remove yellow color from aged resin polymers, and they wanted to see whether blue light could also break down yellow stains on fabric without causing damage.
Initially, they exposed vials of beta carotene, lycopene and squalene to high-intensity blue LED light for three hours. All the samples lost color, and spectroscopic analyses indicated that oxygen in the air helped the photobleaching process by breaking bonds to produce colorless compounds.
Next, the team applied squalene onto cotton fabric swatches. After heating the swatches to simulate aging, they treated the samples for 10 minutes by soaking them in a hydrogen peroxide solution or exposing them to the blue LED or UV light. The blue light reduced the yellow stain substantially more than hydrogen peroxide or UV exposure did. In fact, UV exposure generated some new yellow-colored compounds.
Additional tests showed that the blue LED treatment lightened squalene stains on silk and polyester without damaging the fabrics. The method also reduced the color of other stain-causing substances, including aged oleic acid, orange juice and tomato juice, on cotton swatches.
High-intensity blue LED light is a promising way to remove clothing stains, but the researchers say they want to do additional colorfastness and safety testing before commercializing a light system for home and industrial use.
Tomohiro Sugahara et al, Environmentally Friendly Photobleaching Method Using Visible Light for Removing Natural Stains from Clothing, ACS Sustainable Chemistry & Engineering (2025). DOI: 10.1021/acssuschemeng.5c03907
on Wednesday
Dr. Krishna Kumari Challa
Scientists develop the world's first 6G chip, capable of 100 Gbps speeds
Sixth generation, or 6G, wireless technology is one step closer to reality with news that researchers have unveiled the world's first "all-frequency" 6G chip. The chip is capable of delivering mobile internet speeds exceeding 100 gigabits per second (Gbps).
6G technology is the successor to 5G and promises to bring about a massive leap in how we communicate. It will offer benefits such as ultra-high-speed connectivity, ultra-low latency and AI integration that can manage and optimize networks in real-time. To achieve this, 6G networks will need to operate across a range of frequencies, from standard microwaves to much higher frequency terahertz waves. Current 5G technology utilizes a limited set of radio frequencies, similar to those used in previous generations of wireless technologies.
The new chip is no bigger than a thumbnail, measuring 11 millimeters by 1.7 millimeters. It operates across a wide frequency range, from 0.5 GHz to 115 GHz, a spectrum that would traditionally take nine separate radio systems to cover. While the development of a single all-frequency chip is a significant breakthrough, the technology is still in its early stages. Many experts expect commercial 6G networks to begin rolling out around 2030.
Zihan Tao et al, Ultrabroadband on-chip photonics for full-spectrum wireless communications, Nature (2025). DOI: 10.1038/s41586-025-09451-8
on Wednesday
Dr. Krishna Kumari Challa
Mother's germs may influence her child's brain development
Our bodies are colonized by a teeming, ever-changing mass of microbes that help power countless biological processes. Now, a new study has identified how these microorganisms get to work shaping the brain before birth.
Researchers at Georgia State University studied newborn mice bred in a germ-free environment to prevent any microbial colonization. Some of these mice were immediately placed with mothers with normal microbiota, which transfers microbes rapidly.
That gave the study authors a way to pinpoint just how early microbes begin influencing the developing brain. Their focus was on the paraventricular nucleus (PVN), a region of the hypothalamus tied to stress and social behavior, already known to be partly influenced by microbe activity in mice later in life.
At birth, a newborn body is colonized by microbes as it travels through the birth canal.
Birth also coincides with important developmental events that shape the brain.
When the germ-free mice were just a handful of days old, the researchers found fewer neurons in their PVN, even when microbes were introduced after birth. That suggests the changes caused by these microorganisms happen in the uterus during development.
These neural modifications last, too: the PVN was still neuron-light in adult mice that had been raised germ-free. The cross-fostering experiment, however, was not continued into adulthood (around eight weeks).
The details of this relationship still need to be worked out and researched in greater detail, but the takeaway is that microbes – specifically the mix of microbes in the mother's gut – can play a notable role in the brain development of their offspring.
Rather than shunning our microbes, we should recognize them as partners in early life development. They're helping build our brains from the very beginning, say the researchers.
While this has only been shown in mouse models so far, there are enough biological similarities between mice and humans that there's a chance we're also shaped by our mother's microbes before we're born.
One of the reasons this matters is because practices like Cesarean sections and the use of antibiotics around birth are known to disrupt certain types of microbe activity – which may in turn be affecting the health of newborns.
https://www.sciencedirect.com/science/article/pii/S0018506X25000686...
Mutations driving evolution are informed by the genome, not random, study suggests
A study published in the Proceedings of the National Academy of Sciences shows that an evolutionarily significant mutation in the human APOL1 gene arises not randomly but more frequently where it is needed to prevent disease. The finding fundamentally challenges the notion that evolution is driven by random mutations, and the authors tie the results to a new theory that, for the first time, offers a concept for how mutations arise.
Implications for biology, medicine, computer science, and perhaps even our understanding of the origin of life itself, are potentially far reaching.
A random mutation is a genetic change whose chance of arising is unrelated to its usefulness. Only once these supposed accidents arise does natural selection vet them, sorting the beneficial from the harmful. For over a century, scientists have thought that a series of such accidents has built up over time, one by one, to create the diversity and splendor of life around us.
However, it has never been possible to examine directly whether mutations in the DNA originate at random or not. Mutations are rare events relative to the genome's size, and technical limitations have prevented scientists from seeing the genome in enough detail to track individual mutations as they arise naturally.
To overcome this, researchers developed a new ultra-accurate detection method and recently applied it to the famous HbS mutation, which protects from malaria but causes sickle-cell anemia in homozygotes.
Results showed that the HbS mutation did not arise at random, but emerged more frequently exactly in the gene and population where it was needed. Now, they report the same nonrandom pattern in a second mutation of evolutionary significance.
The new study examines the de novo origination of a mutation in the human APOL1 gene that protects against a form of trypanosomiasis, a disease that devastated central Africa in historical times and, until recently, caused tens of thousands of deaths there per year. The same mutation increases the risk of chronic kidney disease in people who carry two copies.
If the APOL1 mutation arises by chance, it should arise at a similar rate in all populations, and only then spread under Trypanosoma pressure. However, if it is generated nonrandomly, it may actually arise more frequently where it is useful.
Results supported the nonrandom pattern: the mutation arose much more frequently in sub-Saharan Africans, who have faced generations of endemic disease, compared to Europeans, who have not, and in the precise genomic location where it confers protection.
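The core statistical logic, comparing de novo mutation counts between two populations against the null hypothesis of equal rates, can be sketched in a few lines. The counts below are invented for illustration (they are not the study's data), and scipy's Fisher's exact test stands in for whatever test the authors actually used:

```python
from scipy.stats import fisher_exact

# Invented counts for illustration only -- NOT the study's data. The question:
# do de novo APOL1 mutations arise more often per site screened in the
# population under Trypanosoma pressure than in the one that is not?
african_hits, african_sites = 12, 100_000
european_hits, european_sites = 2, 100_000

table = [
    [african_hits, african_sites - african_hits],
    [european_hits, european_sites - european_hits],
]

odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.1f}, one-sided p = {p_value:.4f}")
```

Under the random-mutation null, the odds ratio should hover near 1; a consistently elevated rate in one population, at the protective site specifically, is what the nonrandom interpretation rests on.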
The new findings challenge the notion of random mutation fundamentally.
Historically, there have been two basic theories for how evolution happens: random mutation combined with natural selection, and Lamarckism, the idea that an individual directly senses its environment and somehow changes its genes to fit it. Lamarckism has been unable to explain evolution in general, so biologists have concluded that mutations must be random.
This new theory moves away from both of these concepts, proposing instead that two inextricable forces underlie evolution. While the well-known external force of natural selection ensures fitness, a previously unrecognized internal force operates inside the organism, putting together genetic information that has accumulated over generations in useful ways.
To illustrate, take fusion mutations, a type of mutation in which two previously separate genes fuse to form a new gene. As with all mutations, it has been thought that fusions arise by accident: a gene moves by error to another location and, once in a great while, happens to fuse with another gene, occasionally yielding a useful adaptation. But researchers have recently shown that genes do not fuse at random.
Instead, genes that have evolved to be used together repeatedly over generations are the ones that are more likely to get fused. Because the genome folds in 3D space, bringing genes that work together to the same place at the same time in the nucleus with their chromatin open, molecular mechanisms fuse these genes rather than others. An interaction involving complex regulatory information that has gradually evolved over generations leads to a mutation that simplifies and "hardwires" it into the genome.
In the paper, they argue that fusions are a specific example of a more general and extensive internal force that applies across mutation types. Rather than local accidents arising at random locations in the genome disconnected from other genetic information, mutational processes put together multiple meaningful pieces of heritable information in many ways.
Genes that evolved to interact tightly are more likely to be fused; single-letter RNA changes that evolved to occur repeatedly across generations via regulatory phenomena are more likely to be "hardwired" as point mutations into the DNA; genes that evolved to interact in incipient networks, each under its own regulation, are more likely to be invaded by the same transposable element that later becomes a master-switch of the network, streamlining regulation, and so on. Earlier mutations influence the origination of later ones, forming a vast network of influences over evolutionary time.
Previous studies examined mutation rates as averages across genomic positions, masking the probabilities of individual mutations. But these new studies suggest that, at the scale of individual mutations, each mutation has its own probability, and the causes and consequences of mutation are related.
At each generation, mutations arise based on the information that has accumulated in the genome up to that time point, and those that survive become a part of that internal information.
This vast array of interconnected mutational activity gradually homes in, over the generations, on mutations relevant to long-term pressures, leading to directed mutational responses to specific environmental challenges, such as the malaria-protective HbS mutation and the Trypanosoma-protective APOL1 mutation.
New genetic information arises in the first place, they argue, because mutations simplify genetic regulation, hardwiring evolved biological interactions into ready-made units in the genome. This internal force of natural simplification, together with the external force of natural selection, acts over evolutionary time like combined forces of parsimony and fit, generating co-optable elements that themselves have an inherent tendency to come together into new, emergent interactions.
Co-optable elements are generated by simplification under performance pressure, and then engage in emergent interactions—the source of innovation is at the system level.
Understood in the proper timescale, an individual mutation does not arise at random nor does it invent anything in and of itself.
The potential depth of evolution from this new perspective can be seen by examining other networks. For example, the gene fusion mechanism—where genes repeatedly used together across evolutionary time are more likely to be fused together by mutation—echoes chunking, one of the most basic principles of cognition and learning in the brain, where pieces of information that repeatedly co-occur are eventually chunked into a single unit.
Yet fusions are only one instance of a broader principle: diverse mutational processes respond to accumulated information in the genome, combining it over generations into streamlined instructions. This view recasts mutations not as isolated accidents, but as meaningful events in a larger, long-term process.
Daniel Melamed et al, De novo rates of a Trypanosoma-resistant mutation in two human populations, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424538122
New tool enables rapid, large-scale profiling of disease-linked RNA modifications
Researchers have developed a powerful tool capable of scanning thousands of biological samples to detect transfer ribonucleic acid (tRNA) modifications—tiny chemical changes to RNA molecules that help control how cells grow, adapt to stress and respond to diseases such as cancer and antibiotic-resistant infections. The tool opens up new possibilities for science, health care and industry, from accelerating disease research and enabling more precise diagnostics to guiding the development of more effective treatments.
Cancer and infectious diseases are complicated health conditions in which cells are forced to function abnormally by mutations in their genetic material or by instructions from an invading microorganism. The research team, led by the Singapore-MIT Alliance for Research and Technology (SMART), is among the world's leaders in understanding how the epitranscriptome—the over 170 different chemical modifications of all forms of RNA—controls the growth of normal cells and how cells respond to stressful changes in the environment, such as loss of nutrients or exposure to toxic chemicals. The researchers are also studying how this system is corrupted in cancer or exploited by viruses, bacteria and parasites in infectious diseases.
Current molecular methods used to study the expansive epitranscriptome and its thousands of different types of modified RNA are often slow, labor-intensive and costly, and they involve hazardous chemicals, all of which limits research capacity and speed.
To solve this problem, the SMART team developed a new tool that enables fast, automated profiling of tRNA modifications—molecular changes that regulate how cells survive, adapt to stress and respond to disease. This capability allows scientists to map cell regulatory networks, discover novel enzymes and link molecular patterns to disease mechanisms, paving the way for better drug discovery and development, and more accurate disease diagnostics.
SMART's automated system was specially designed to profile tRNA modifications across thousands of samples rapidly and safely. Unlike traditional methods—which are costly, labor‑intensive and use toxic solvents such as phenol and chloroform—this tool integrates robotics to automate sample preparation and analysis, eliminating the need for hazardous chemical handling and reducing costs. This advancement increases safety, throughput and affordability, enabling routine large‑scale use in research and clinical labs.
Jingjing Sun et al, tRNA modification profiling reveals epitranscriptome regulatory networks in Pseudomonas aeruginosa, Nucleic Acids Research (2025). DOI: 10.1093/nar/gkaf696
Fruit fly research shows that mechanical forces drive evolutionary change
A tissue fold known as the cephalic furrow, an evolutionary novelty that forms between the head and the trunk of fly embryos, plays a mechanical role in stabilizing embryonic tissues during the development of the fruit fly Drosophila melanogaster.
Researchers have integrated computer simulations with their experiments and showed that the timing and position of cephalic furrow formation are crucial for its function, preventing mechanical instabilities in the embryonic tissues.
The work appears in Nature.
The increased mechanical instability caused by embryonic tissue movements may have contributed to the origin and evolution of the cephalic furrow genetic program. This shows that mechanical forces can shape the evolution of new developmental features.
Mechanical forces shape tissues and organs during the development of an embryo through a process called morphogenesis. These forces cause tissues to push and pull on each other, providing essential information to cells and determining the shape of organs. Despite the importance of these forces, their role in the evolution of development is still not well understood.
Animal embryos undergo tissue flows and folding processes, involving mechanical forces, that transform a single-layered blastula (a hollow sphere of cells) into a complex multi-layered structure known as the gastrula. During early gastrulation, some flies of the order Diptera form a tissue fold at the head-trunk boundary called the cephalic furrow. This fold is a specific feature of a subgroup of Diptera and is therefore an evolutionary novelty of flies.
The cephalic furrow is especially interesting because it is a prominent embryonic invagination whose formation is controlled by genes, but that has no obvious function during development. The fold does not give rise to specific structures, and later in development, it simply unfolds, leaving no trace.
With their experiments, the researchers show that the absence of the cephalic furrow leads to an increase in the mechanical instability of embryonic tissues and that the primary sources of mechanical stress are cell divisions and tissue movements typical of gastrulation. They demonstrate that the formation of the cephalic furrow absorbs these compressive stresses. Without a cephalic furrow, these stresses build up, and outward forces caused by cell divisions in the single-layered blastula cause mechanical instability and tissue buckling.
This intriguing physical role gave the researchers the idea that the cephalic furrow may have evolved in response to the mechanical challenges of dipteran gastrulation, with mechanical instability acting as a potential selective pressure.
Flies either feature a cephalic furrow or, if they lack one, display widespread out-of-plane division, in which cells divide downward to reduce the surface area. Both mechanisms act as mechanical sinks that prevent tissue collision and distortion.
These findings uncover empirical evidence for how mechanical forces can influence the evolution of innovations in early development. The cephalic furrow may have evolved through genetic changes in response to the mechanical challenges of dipteran gastrulation. They show that mechanical forces are not just important for the development of the embryo but also for the evolution of its development.
Patterned invagination prevents mechanical instability during gastrulation, Nature (2025). DOI: 10.1038/s41586-025-09480-3
Bipasha Dey et al, Divergent evolutionary strategies pre-empt tissue collision in gastrulation. Nature (2025). DOI: 10.1038/s41586-025-09447-4
Some sugar substitutes linked to faster cognitive decline
Some sugar substitutes may come with unexpected consequences for long-term brain health, according to a study published in Neurology. The study examined seven low- and no-calorie sweeteners and found that people who consumed the highest amounts experienced faster declines in thinking and memory skills compared to those who consumed the lowest amounts.
The link was even stronger in people with diabetes. While the study showed a link between the use of some artificial sweeteners and cognitive decline, it did not prove that they were a cause.
The artificial sweeteners examined in the study were aspartame, saccharin, acesulfame-K, erythritol, xylitol, sorbitol and tagatose. These are mainly found in ultra-processed foods like flavored water, soda, energy drinks, yogurt and low-calorie desserts. Some are also used as a standalone sweetener.
Low- and no-calorie sweeteners are often seen as a healthy alternative to sugar; however, the new findings suggest certain sweeteners may have negative effects on brain health over time.
The study included 12,772 adults from across Brazil. The average age was 52, and participants were followed for an average of eight years.
After adjusting for factors such as age, sex, high blood pressure and cardiovascular disease, researchers found people who consumed the highest amount of sweeteners showed faster declines in overall thinking and memory skills than those who consumed the lowest amount, with a decline that was 62% faster. This is the equivalent of about 1.6 years of aging. Those in the middle group had a decline that was 35% faster than the lowest group, equivalent to about 1.3 years of aging.
When researchers broke the results down by age, they found that people under the age of 60 who consumed the highest amounts of sweeteners showed faster declines in verbal fluency and overall cognition when compared to those who consumed the lowest amounts. They did not find links in people over 60. They also found that the link to faster cognitive decline was stronger in participants with diabetes than in those without diabetes.
When looking at individual sweeteners, consuming aspartame, saccharin, acesulfame-K, erythritol, sorbitol and xylitol was associated with a faster decline in overall cognition, particularly in memory.
They found no link between the consumption of tagatose and cognitive decline.
https://www.neurology.org/doi/10.1212/WNL.0000000000214023
SeeMe detects hidden signs of consciousness in brain injury patients
SeeMe, a computer vision tool tested by Stony Brook University researchers, was able to detect low-amplitude, voluntary facial movements in comatose acute brain injury patients days before clinicians could identify overt responses.
There are ways to detect covert consciousness with EEG and fMRI, though these are not always available. Many acute brain injury patients appear unresponsive in early care, with signs that are so small, or so infrequent, that they are simply missed.
In the study, "Computer vision detects covert voluntary facial movements in unresponsive brain injury patients," published in Communications Medicine, investigators designed SeeMe to quantify tiny facial movements in response to auditory commands with the objective of identifying early, stimulus-evoked behavior.
A single-center prospective cohort included 37 comatose acute brain injury patients and 16 healthy volunteers, aged 18–85, enrolled at Stony Brook University Hospital. Patients had initial Glasgow Coma Scale scores ≤8 and no prior neurologically debilitating diagnoses.
SeeMe tagged facial pores at ~0.2 mm resolution and tracked movement vectors while subjects heard three commands: open your eyes, stick out your tongue, and show me a smile.
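As a rough illustration of this kind of tracking (not SeeMe's actual pipeline, whose code is not described here), sparse Lucas-Kanade optical flow in OpenCV can follow fine facial texture from frame to frame; the file name and all parameters below are hypothetical stand-ins:

```python
import cv2
import numpy as np

# Sketch of pore-scale motion tracking with sparse Lucas-Kanade optical flow.
# This is NOT the SeeMe pipeline; file name and parameters are hypothetical.
cap = cv2.VideoCapture("bedside_video.mp4")
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Seed trackable points on fine facial texture (pores register as corners).
p0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=500,
                             qualityLevel=0.01, minDistance=5)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Track each point into the new frame; keep only successful matches.
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, p0, None,
                                                winSize=(15, 15), maxLevel=2)
    good_new = p1[status == 1]   # shape (M, 2)
    good_old = p0[status == 1]
    # Per-point displacement vectors; sub-pixel magnitudes can register
    # movements far smaller than a bedside observer could reliably see.
    vectors = good_new - good_old
    print("mean displacement (px):", np.linalg.norm(vectors, axis=1).mean())
    prev_gray, p0 = gray, good_new.reshape(-1, 1, 2)

cap.release()
```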
Results indicate earlier and broader detection with SeeMe. Eye-opening was detected on average 9.1 (± 5.5) days after injury by SeeMe versus 13.2 (± 11.4) days by clinical examination, yielding a 4.1-day lead.
SeeMe identified eye-opening in 30 of 35 patients (85.7%) compared with 25 of 35 (71.4%) by clinical exam. Among patients without an endotracheal tube obscuring the mouth, SeeMe detected mouth movements in 16 of 17 (94.1%).
In seven patients with analyzable mouth videos and clinical command following, SeeMe identified reproducible mouth responses 8.3 days earlier on average. Amplitude and frequency of SeeMe-positive responses correlated with discharge outcomes on the Glasgow Outcome Scale-Extended, with significant Kruskal-Wallis results for both features.
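The Kruskal-Wallis comparison mentioned above can be illustrated in a few lines; the amplitude values below are invented for demonstration and are not the study's data:

```python
from scipy.stats import kruskal

# Invented response amplitudes (arbitrary units), grouped by discharge
# outcome band on the Glasgow Outcome Scale-Extended -- illustration only.
poor_outcome = [0.8, 1.1, 0.9, 1.3]
medium_outcome = [1.6, 2.0, 1.8, 2.4]
good_outcome = [2.9, 3.4, 3.1, 3.8]

h_stat, p_value = kruskal(poor_outcome, medium_outcome, good_outcome)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")  # small p: amplitude tracks outcome
```

Kruskal-Wallis is a rank-based test, so it asks only whether amplitudes differ systematically across outcome groups, without assuming they are normally distributed.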
A deep neural network classifier trained on SeeMe-positive trials identified which command had been given with 81% accuracy for eye-opening and an overall accuracy of 65%. Accuracy was lower for the other commands, at 37% for tongue movement and 47% for smile, indicating strong specificity for the eye-opening response.
Authors conclude that acute brain injury patients can exhibit low-amplitude, stimulus-evoked facial movements before overt signs appear at bedside, suggesting that many covertly conscious patients may have motor behavior currently undetected by clinicians.
Earlier detection could inform family discussions, guide rehabilitation timing, and serve as a quantitative signal for future monitoring or interface-based communication strategies, while complementing standard examinations.
Xi Cheng et al, Computer vision detects covert voluntary facial movements in unresponsive brain injury patients, Communications Medicine (2025). DOI: 10.1038/s43856-025-01042-y
Pancreatic insulin disruption triggers bipolar disorder-like behaviors in mice, study shows
Bipolar disorder is a psychiatric disorder characterized by alternating episodes of depression (i.e., low mood and a loss of interest in everyday activities) and mania (i.e., a state in which arousal and energy levels are abnormally high). On average, an estimated 1–2% of people worldwide are diagnosed with bipolar disorder at some point during their lives.
Bipolar disorder can be highly debilitating, particularly if left untreated. Understanding the neural and physiological processes that contribute to its emergence could thus be very valuable, as it could inform the development of new prevention and treatment strategies.
In addition to experiencing periodic changes in mood, individuals diagnosed with this disorder often exhibit some metabolic symptoms, including changes in their blood sugar levels. While some previous studies reported an association between blood sugar control mechanisms and bipolar disorder, the biological link between the two has not yet been uncovered.
Researchers recently carried out a study aimed at further exploring the link between insulin secretion and bipolar disorder-like behaviors, particularly focusing on the expression of the gene RORβ.
Their findings, published in Nature Neuroscience, show that overexpression of this gene in a subtype of pancreatic cells disrupts the release of insulin, which in turn prompts a feedback loop with a region of the brain known as the hippocampus, producing alternating depression-like and mania-like behaviors in mice.
The results in mice point to a pancreas–hippocampus feedback mechanism by which metabolic and circadian factors cooperate to generate behavioral fluctuations, and which may play a role in bipolar disorder, wrote the authors.
Yao-Nan Liu et al, A pancreas–hippocampus feedback mechanism regulates circadian changes in depression-related behaviors, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02040-y.
Shampoo-like gel could help chemo patients keep their hair
Cancer fighters know that losing their hair is often part of the battle, but researchers have developed a shampoo-like gel that has been tested in animal models and could protect hair from falling out during chemotherapy treatment.
Baldness from chemotherapy-induced alopecia causes personal, social and professional anxiety for everyone who experiences it. Currently, there are few solutions—the only ones that are approved are cold caps worn on the patient's head, which are expensive and have their own extensive side effects.
The gel is a hydrogel, which absorbs a lot of water and provides long-lasting delivery of drugs to the patient's scalp. The hydrogel is designed to be applied to the patient's scalp before the start of chemotherapy and left on their head as long as the chemotherapy drugs are in their system—or until they are ready to easily wash it off.
During chemotherapy treatment, chemotherapeutic drugs circulate throughout the body. When these drugs reach the blood vessels surrounding the hair follicles on the scalp, they kill or damage the follicles, releasing the hair shaft from the follicle and causing the hair to fall out.
The gel, containing the drugs lidocaine and adrenalone, restricts blood flow to the scalp and so prevents most of the chemotherapy drugs from reaching the hair follicles. This dramatic reduction in the amount of drug reaching the follicles helps protect the hair and prevent it from falling out.
To support practical use of this "shampoo," the gel is designed to be temperature responsive. At body temperature it is thicker and clings to the patient's hair and scalp; at slightly cooler temperatures it becomes thinner, more like a liquid, and can be easily washed away.
Romila Manchanda et al, Hydrogel-based drug delivery system designed for chemotherapy-induced alopecia, Biomaterials Advances (2025). DOI: 10.1016/j.bioadv.2025.214452
How aging drives neurodegenerative diseases
A research team has identified a direct molecular link between aging and neurodegeneration by investigating how age-related changes in cell signaling contribute to toxic protein aggregation.
Although aging is the biggest risk factor for neurodegenerative diseases, scientists still don't fully understand which age-associated molecular alterations drive their development.
Using the small nematode worm Caenorhabditis elegans, a research team studied a signaling pathway that leads to pathological protein accumulation with age. Their new paper is published in Nature Aging.
The team focused on the aging-associated protein EPS8 and the signaling pathways it regulates. This protein is known to accumulate with age and to activate harmful stress responses that lead to a shorter lifespan in worms.
Researchers found that increased levels of EPS8, and the activation of its signaling pathways, drive pathological protein aggregation and neurodegeneration—typical features of age-associated neurodegenerative diseases such as Huntington's disease and amyotrophic lateral sclerosis (ALS). By reducing EPS8 activity, the group was then able to prevent the build-up of the toxic protein aggregates and preserve neuronal function in worm models of these two diseases.
Importantly, EPS8 and its signaling partners are evolutionarily conserved and also present in human cells. Similar to what they achieved in the worms, the team was able to prevent the accumulation of toxic protein aggregates in human cell models of Huntington's disease and ALS by reducing EPS8 levels.
Seda Koyuncu et al, The aging factor EPS8 induces disease-related protein aggregation through RAC signaling hyperactivation, Nature Aging (2025). DOI: 10.1038/s43587-025-00943-w
Cooling pollen sunscreen can block UV rays without harming corals
Materials scientists have invented the world's first pollen-based sunscreen derived from Camellia flowers.
In experiments, the pollen-based sunscreen absorbed and blocked harmful ultraviolet (UV) rays as effectively as commercially available sunscreens, which commonly use minerals like titanium dioxide (TiO2) and zinc oxide (ZnO).
In laboratory tests on corals, commercial sunscreen induced coral bleaching in just two days, leading to coral death by day six. In contrast, the pollen-based sunscreen did not affect the corals, which remained healthy even up to 60 days. The comparison matters because an estimated 6,000 to 14,000 tons of commercial sunscreen enter the ocean each year, as people wash it off in the sea or it flows in from wastewater.
In other tests, the pollen-based sunscreen also demonstrated its ability to reduce surface skin temperature, thereby helping to keep the skin cool in the presence of simulated sunlight.
Nature's Guard: UV Filter from Pollen, Advanced Functional Materials (2025). DOI: 10.1002/adfm.202516936
Why we slip on ice: Physicists challenge centuries-old assumptions
For over a hundred years, schoolchildren around the world have learned that ice melts when pressure and friction are applied: step onto an icy pavement in winter and you can slip because of the pressure exerted by your body weight through the sole of your (still warm) shoe. But it turns out that this explanation misses the mark.
New research reveals that it's not pressure or friction that causes ice to become slippery, but rather the interaction between molecular dipoles in the ice and those on the contacting surface, such as a shoe sole.
The work is published in the journal Physical Review Letters. This insight overturns a paradigm established nearly two centuries ago by James Thomson, the brother of Lord Kelvin, who proposed that pressure and friction contribute to ice melting alongside temperature.
It turns out that neither pressure nor friction plays a particularly significant part in forming the thin liquid layer on ice.
Instead, computer simulations by the researchers reveal that molecular dipoles are the key drivers behind the formation of this slippery layer, which so often causes us to lose our footing in winter. But what exactly is a dipole? A molecular dipole arises when a molecule has regions of partial positive and partial negative charge, giving the molecule an overall polarity that points in a specific direction.
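For reference, the standard electrostatics expression for the interaction energy between two point dipoles p1 and p2 separated by distance r captures the kind of coupling at play (this is the textbook formula, not one quoted from the paper):

$$
U(\mathbf{r}) = \frac{1}{4\pi\varepsilon_0 r^{3}}
\left[\, \mathbf{p}_1 \cdot \mathbf{p}_2
- 3\,(\mathbf{p}_1 \cdot \hat{\mathbf{r}})(\mathbf{p}_2 \cdot \hat{\mathbf{r}}) \,\right]
$$

The orientation dependence is the point: dipoles on a contacting surface such as a shoe sole can disturb the ordered dipoles at the ice surface, which is the interaction the simulations identify as creating the slippery layer, no pressure-induced melting required.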
Achraf Atila et al, Cold Self-Lubrication of Sliding Ice, Physical Review Letters (2025). DOI: 10.1103/1plj-7p4z