How inflammation may prime the gut for cancer
Chronic gut inflammation can leave lasting epigenetic changes, or "molecular scars," in intestinal cells, making tissues more susceptible to cancer if a cancer-promoting mutation occurs later. These epigenetic alterations persist after inflammation subsides, are inherited by daughter cells, and may accelerate tumor growth, highlighting potential biomarkers and intervention targets for colorectal cancer risk.
Chronic inflammation can raise a person's risk of cancer, and a new study reveals key details about how that might happen in the gut, pointing to better ways to identify and reduce risk. The study showed in mice that after colitis (chronic intestinal inflammation), seemingly healed gut tissues may retain the memory of earlier inflammation through molecular "scars" that make it easier for cancer to take hold later on. These memories are encoded as changes in the epigenome that are handed down from cell to cell through many generations of cell division, with long-lasting effects on gene activity that can later drive tumor growth. The work, appearing in Nature, suggests a two-hit process over time in which two successive alterations, an epigenetic change and a cancer-promoting mutation, can together accelerate tumor growth. It also points to ways to identify these cancer-promoting factors with new biomarkers and, potentially, to intervene with new therapeutics.
Dengue fever is a growing problem: Why it's so hard to beat with vaccines
Dengue fever, caused by four related viral serotypes, is expanding globally due to climate and urbanization. Vaccine development is challenging because immunity to one serotype can worsen infection with another via antibody-dependent enhancement. Effective vaccines must induce balanced, strongly neutralizing responses to all serotypes. Vaccine performance varies by prior infection, age, and transmission intensity, requiring tailored strategies and ongoing safety monitoring.
Premature placental separation may increase child's risk of heart disease by age 28
Children born after placental abruption face a 4.6-fold higher risk of early cardiovascular disease or death from cardiovascular disease by age 28 compared to those born without this complication, according to new research published in the Journal of the American Heart Association. Risks of heart-related hospitalization and stroke are also significantly elevated, and the association persists even when accounting for genetic and environmental factors within families.
Placental abruption occurs when the placenta separates from the uterus before birth rather than after delivery, and this can lead to severe hemorrhaging or other serious complications for the mother and baby. According to the American Heart Association's 2026 Heart Disease and Stroke Statistics, most studies have reported an incidence of 0.5% to 1% for placental abruption in the general population.
The study suggests that placental abruption should be treated as a serious complication not only for the mother but also for the baby, whose cardiovascular health may be affected later in life.
Care after a placental abruption typically focuses on monitoring the mother. This study shows it is important that the children are also monitored, to identify potential complications due to their increased risk of cardiovascular disease.
Out of nearly 3 million pregnancies, approximately 1% (n = 28,641) were affected by placental abruption. During a 28-year follow-up period, children born to mothers who had a placental abruption during the pregnancy were 4.6 times more likely to die from cardiovascular disease than children born to mothers whose placenta separated normally from the uterus after delivery. Children born after a placental abruption also faced a nearly threefold higher risk of being hospitalized for heart-related complications during the next 28 years, including heart failure, ischemic heart disease, heart attack, blocked arteries, and general cardiovascular disease. Their risk of stroke hospitalization was 2.4 times higher than for children whose mothers did not have a placental abruption. These heart disease and stroke risks associated with abruption were even higher among children younger than 1 year old. The association between placental abruption and increased cardiovascular risk remained similar in an additional analysis contrasting cardiovascular disease risks between biological siblings (each mother served as her own control), suggesting that genetic and environmental factors did not explain the relationship.
Placental abruption is a sudden and often catastrophic event that typically comes without warning and cannot reliably be prevented. Older mothers and those expecting more than one baby, such as twins or triplets, have an increased risk of developing this condition. Avoiding smoking, alcohol, and illegal drugs (particularly cocaine) and maintaining good blood pressure control are also important, as these factors are linked to placental abruption.
Cardiovascular Disease in Singleton Offspring Born of Pregnancies Complicated by Placental Abruption: A Population-Based Retrospective Cohort Study, DOI: 10.1161/JAHA.125.045199
Walking pace may outperform blood pressure and cholesterol in predicting mortality risk, study suggests
Analysis of over 400,000 UK adults indicates that walking pace is a stronger predictor of mortality risk than blood pressure or cholesterol, particularly in individuals with chronic health conditions. Incorporating simple physical measures such as walking pace, handgrip strength, and resting heart rate enhances mortality risk prediction beyond traditional clinical factors.
The Utility of Measures of Physical Behavior, Function, and Fitness as Predictors of Mortality, Mayo Clinic Proceedings (2026). DOI: 10.1016/j.mayocpiqio.2026.100710
After 20 years and more than 30,000 cloning attempts, researchers have found the limit on the number of times that a single mouse can be serially re-cloned — their attempts failed after 58 generations. The cloned mice looked normal and lived as long as normal mice, but accumulated mutations at an unusually high rate, which could be why attempts to clone them were eventually unsuccessful, the team says. The findings suggest that asexual reproduction is ultimately unsustainable for mice, and potentially for other mammals, too.
Why cells respond 'incorrectly' in old age
Aging cells exhibit altered responses to external signals due to changes in chromatin structure within the nucleus. With age, chromatin becomes more open, leading to less accurate gene expression and increased activation of inappropriate genes. This impairs essential cellular processes and may contribute to age-related diseases, suggesting that targeting chromatin structure could support healthier aging. Some of the signs of aging in human cells originate in the cell nucleus, because the packaged form of DNA changes with age. This has now been demonstrated by researchers. It means that older cells can no longer react appropriately to external stimuli, and this can even lead to diseases. This insight could help scientists to curb such alterations and support better health in old age.
As we age, our cells age with us. Although they remain active, they become less flexible, stop dividing and sometimes respond incorrectly to signals. The reason for this lies in the cell nuclei, specifically in the chromatin—the packaged form of our DNA.
This is the finding of a study by researchers who analyzed, in the laboratory, samples of skin cells taken from people of different ages. They have published their findings in the journal PNAS.
Using a microscope and molecular biological methods, the scientists examined how various skin cells reacted to a specific chemical messenger when mechanical tension was applied. They compared skin cells from ten-year-old children with those from 75-year-olds.
The response of older people's cells to the same stimuli was different and significantly weaker. The scientists were able to attribute this to a specific cause: the chromatin in the cell nucleus changes with age. As a result, certain genes—sections of DNA that belong together—can no longer be read as accurately.
This process, known as gene expression, is important in order to produce the proteins required by the organism, because the genes store the instructions for building those proteins. However, if the shape of the chromatin changes with age, other processes may be triggered instead, which can harm the organism.
Tiny rotating hairs inside a microscopic cavity decide where your organs will grow
Organ asymmetry in humans originates during embryonic development, where rotating cilia in the embryonic node generate a directional fluid flow that influences left-right organ placement. Using an artificial node with magnetically controlled cilia and advanced simulations, researchers demonstrated that cilia-induced flow and morphogen transport together break left-right symmetry.
Heart to the left. Liver to the right. That's where you'll find these organs in a healthy human body, but surprisingly, in some people the heart is on the right and the liver on the left. Whether the arrangement is typical or reversed can be traced back to the embryonic stage. In the early days of your development, a small fluid-filled cavity known as an embryonic node forms in your embryo. Inside, tiny micro-hairs known as cilia create a flow pattern that steers where organs grow in your body.
Researchers have revealed key details behind the process by building a world-first artificial embryonic node—using synthetic, magnetically controlled cilia to generate a flow pattern—and then exploring what happens in the node using advanced simulations. They published their findings on March 25 in the journal Science Advances.
This left-right patterning can be traced all the way back to your earliest days as an embryo, and to what happens in something called an embryonic node.
An embryonic node is a small cavity that contains a fluid (made up of water, proteins, hormones, and other substances). The top is closed off by a membrane, while the bottom layer is lined with a few hundred tiny micro-hairs called cilia. The whole node is just a few hundred micrometers across. "The cilia in the embryonic node rotate in the same direction, making a tilted conical motion. This generates an anticlockwise fluid flow inside the node, and it's this flow that is known to play a key role in left-right symmetry breaking," the researchers explain.
Tanveer ul Islam et al, Artificial embryonic node elucidates the role of flow in left-right symmetry breaking in vertebrates, Science Advances (2026). DOI: 10.1126/sciadv.aec2328
Influencers promoting prescription drugs on social media pose public health risks Influencer promotion of prescription drugs on social media increases the risk of misinformation, with exaggerated benefits and omitted side effects commonly observed. Current FDA and FTC regulations are insufficient for this context, and inconsistent disclosure blurs the line between personal stories and advertisements. Stronger guidelines, standardized disclosures, and improved public media literacy are recommended to address these risks.
Severe strokes may 'rejuvenate' undamaged brain regions
Analysis of MRI data from over 500 stroke survivors indicates that while severe strokes accelerate aging in the damaged brain hemisphere, undamaged regions—especially the contralesional frontoparietal network—exhibit a younger structural profile. This suggests adaptive neuroplasticity, where unaffected areas reorganize to compensate for lost motor function. These findings may inform personalized rehabilitation strategies.
Gilsoon Park et al, Associations between contralesional neuroplasticity and motor impairment through deep learning-derived MRI regional brain age in chronic stroke (ENIGMA): a multicohort, retrospective, observational study, The Lancet Digital Health (2026). DOI: 10.1016/j.landig.2025.100942
Eating about 4,200 mg sodium a day may raise heart failure risk 15%
Daily sodium intake averaging 4,200 mg, nearly double the recommended limit, is associated with a 15% higher risk of new-onset heart failure, independent of other health or sociodemographic factors. Modest sodium reduction could lower heart failure incidence and related healthcare costs, particularly in high-risk, low-income populations.
Excessive consumption of dietary sodium (salt) is a significant, independent risk factor for new-onset heart failure, according to a report from Vanderbilt Health, published in the Journal of the American College of Cardiology: Advances. Consuming a population average of about 4,200 milligrams of dietary sodium a day (the recommended maximum is 2,300 milligrams) was associated with a 15% increase in the risk of incident (new) cases of heart failure.
"Even modest reductions in sodium consumption may significantly reduce the burden of heart failure in this high-risk population," the researchers report. The increased risk of heart failure linked to sodium was independent of sociodemographic factors, diet quality, and caloric intake, as well as health conditions such as high blood pressure and high blood lipid levels. Even a modest reduction in dietary salt, to 4,000 milligrams a day or less, could reduce heart failure cases by 6.6% over 10 years, the researchers predicted.
Leonie Dupuis et al, Dietary Sodium Intake and Risk of Incident Heart Failure in the Southern Community Cohort Study, JACC: Advances (2026). DOI: 10.1016/j.jacadv.2026.102651
Implantable 'living pharmacy' produces multiple drugs inside the body
A multi-institutional team of scientists has taken a crucial step toward implantable "living pharmacies"—tiny devices containing engineered cells that continuously produce medicines inside the body. In a new study published in Device, the team engineered cells to simultaneously produce three different biologics—an anti-HIV antibody, a GLP-1-like peptide used to treat type 2 diabetes, and leptin, a hormone that regulates appetite and metabolism. When implanted under the skin of a small animal model, the device kept drug-producing cells alive and stably delivered all three therapies at once.
Called HOBIT (hybrid oxygenation bioelectronics system for implanted therapy), the new system integrates the engineered cells with oxygen-producing bioelectronics. Roughly the size of a folded stick of gum, the design shields cells from the body's immune system while also providing cells with oxygen and nutrients to keep them alive and producing biologic drugs for several weeks.
With more work, living pharmacies hold the potential to treat chronic conditions with a single, long-lasting therapy—bypassing the need for patients to carry, inject or remember to take medications.
Scientists testing new scanning technology discover mysterious structure beneath an ancient Egyptian city
Archaeologists working in Egypt's Nile Delta may have discovered a tomb or temple dating back around 2,600 years while testing a new technology designed to locate structures buried deep beneath the surface. The team was studying the Buto (Tell el-Fara'in) site, the ruins of an ancient city that was occupied from the Predynastic period (around 3800 BCE) to the Early Islamic era (7th century CE).
During its long history, Buto was built, destroyed, and rebuilt, and consequently has several layers, each potentially a rich source of archaeological remains. However, the oldest parts of the city are now buried beneath later ruins and thick mud deposits, which makes traditional digging difficult, time-consuming, and expensive: archaeologists have to move tons of debris or contend with groundwater to reach the lower levels. The new technology helps guide them where to dig.
The researchers were trialing a multipronged approach that uses satellite radar (SAR) and electrical resistivity tomography (ERT).
The team started with the Sentinel-1 radar satellite to identify large-scale anomalies from space, as they describe in a paper published in the journal Acta Geophysica. When the radar detected something of interest, they followed up with ERT. This technique sends electrical currents between a series of electrodes placed in the ground to create a 3D model of subsurface structures based on how well different materials conduct or resist electricity, and has been likened to an underground CT scan.
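The measurement behind ERT can be illustrated with the standard Wenner-array apparent-resistivity formula from general geophysics (not taken from the paper itself); the readings and material contrast below are purely hypothetical:

```python
# Apparent resistivity for a Wenner electrode array (textbook ERT relation):
# rho_a = 2 * pi * a * (V / I), where a is the electrode spacing (m),
# V the measured potential difference (volts), I the injected current (amps).
import math

def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
    """Apparent resistivity (ohm-m) from one Wenner-array measurement."""
    return 2 * math.pi * spacing_m * voltage_v / current_a

# Hypothetical readings: a dry mudbrick wall tends to be more resistive
# than water-saturated Nile silt, producing a detectable contrast.
silt = wenner_apparent_resistivity(spacing_m=5.0, voltage_v=0.02, current_a=0.5)
wall = wenner_apparent_resistivity(spacing_m=5.0, voltage_v=0.20, current_a=0.5)
print(f"silt: {silt:.1f} ohm-m, possible wall: {wall:.1f} ohm-m")
```

Inverting thousands of such measurements across many electrode pairs is what yields the 3D "CT scan" of the subsurface.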
In the top three meters, the scans showed a layer of broken pottery and debris from the later Roman and Ptolemaic periods. However, at a depth of between three and six meters, the technology revealed a large, well-defined structure from the Saite period (around 2,600 years ago).
At this stage, the researchers thought it could be a large tomb or shrine. Next, they carried out a small 10 x 10 meter excavation, which uncovered mudbrick walls and religious artifacts where the sensors had predicted.
"The results of this study demonstrate the effectiveness of combining geophysical measurements and remote sensing data, which gave a very accurate vision in detecting buried settlements in a complex region," wrote the team in their paper.
Mohamed A. R. Abouarab et al, Multi-scale detection of buried archaeological elements across different occupation phases: an integrated approach using radar satellite imagery and electric resistivity tomography at Buto, northwestern Nile Delta of Egypt, Acta Geophysica (2026). DOI: 10.1007/s11600-026-01809-4
Major volcanic eruptions might be driven by gas dissolving back into magma
Understanding what triggers large volcanic eruptions is crucial for hazard assessment, but the exact mechanism driving these eruptions is still poorly understood. The prevailing theory is that volatile exsolution—gas coming out of magma—is a main driver of eruptions, particularly in volcanoes rich in silica. However, a new study, published in Nature Communications, posits that it is actually gas being dissolved back into the magma that leads to the pressurization needed for large eruptions. Previous research has emphasized volatile exsolution as a driver for eruption caused by increasing pressure in magma chambers. In volatile exsolution, dissolved gases, like water vapor, carbon dioxide and sulfur separate from a silicate melt to form bubbles as magma rises or cools. This decreases solubility, creating significant magmatic overpressure, which drives volcanic eruptions. Some previous studies have found that exsolved gases in large volcanic systems can actually buffer pressure, making eruptions less frequent, but larger when they do occur.
"However, for volatile exsolution to act as primary eruption trigger, exsolution must outpace both volatile loss by passive degassing and viscous relaxation of the crust. This, however, requires rapid crystallization rates that are difficult to maintain in larger, thermally buffered reservoirs. In large silicic systems, exsolved volatiles may therefore exert a primary control on magma compressibility and chamber growth, rather than directly triggering eruptions," the study authors explain. The new study explores the opposite process, called volatile resorption, in which gases dissolve back into magma. The team says that resorption reduces magma compressibility, modulating the system's response to recharge and its overall stability; because the stiffer magma is harder to compress, pressure builds more quickly, expediting eruptions. Volatile resorption can rapidly increase pressure in large silicic magma chambers, potentially triggering eruptions faster than volatile exsolution. The study authors say their simulations show resorption leads to eruption faster than comparable cases involving exsolution.
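The compressibility argument can be sketched with the standard closed-chamber overpressure relation dP ≈ dV / (β V); the chamber volume, recharge volume, and compressibility values below are illustrative assumptions, not figures from the study:

```python
# Pressure rise from magma recharge into a closed chamber:
# dP ≈ dV / (beta * V), with beta the bulk compressibility of the stored
# magma. Resorbing gas removes bubbles and lowers beta, so the same
# recharge volume produces a larger pressure increase.
def pressure_rise_mpa(recharge_km3, chamber_km3, beta_per_pa):
    """Overpressure (MPa) from injecting recharge_km3 into chamber_km3."""
    d_p_pa = (recharge_km3 / chamber_km3) / beta_per_pa
    return d_p_pa / 1e6

V, dV = 100.0, 0.1                                       # km^3, illustrative
bubbly   = pressure_rise_mpa(dV, V, beta_per_pa=1e-9)    # gas-rich, soft
resorbed = pressure_rise_mpa(dV, V, beta_per_pa=1e-10)   # gas resorbed, stiff
print(f"bubbly: {bubbly:.1f} MPa, resorbed: {resorbed:.1f} MPa")
```

Under these assumed values, the same recharge pressurizes the bubble-poor chamber ten times faster, which is the qualitative effect the authors argue expedites eruption onset.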
Franziska Keller et al, Volatile resorption expedites eruption onset in large silicic systems, Nature Communications (2026). DOI: 10.1038/s41467-026-70206-8
JWST solves decades-long mystery about why Saturn appears to change its spin
Saturn has puzzled scientists for many years. Measurements taken by NASA's Cassini spacecraft in 2004 suggested the planet's rotation rate was slowly changing over time—yet this should not have been possible, as a planet cannot simply speed up or slow down its spin. Researchers have now used the most powerful space telescope ever built to answer one of the longest-standing puzzles in planetary science: why does Saturn appear to spin at a different speed depending on how you measure it? The findings, published in the Journal of Geophysical Research: Space Physics, reveal for the first time the complex patterns of heat and electrically charged particles in Saturn's aurora, and show that the entire system is driven by a self-sustaining feedback loop powered by the planet's own northern lights.
Using the James Webb Space Telescope (JWST), the team observed Saturn's northern auroral region—the equivalent of Earth's northern lights—continuously for a full Saturnian day, capturing detailed measurements that were simply not possible with any previous instrument.
By analyzing the infrared glow from a molecule called trihydrogen cation, which forms in Saturn's upper atmosphere and acts as a natural thermometer, the researchers were able to produce the first high-resolution maps of both temperature and particle density across Saturn's auroral region. The level of detail was extraordinary. Previous measurements had errors of around 50 degrees Celsius, roughly on a par with the differences the scientists were trying to detect, and were produced by combining broad regions of the hot polar aurora. The new JWST data was ten times more accurate than previous measurements, allowing the team to map fine details of heating and cooling across Saturn's auroral region for the very first time.
What the team found was that these temperature and density patterns match remarkably well with predictions made by computer models more than a decade ago, but only if the source of heat is placed exactly where the main auroral emissions enter the atmosphere.
This means Saturn's aurora is not just a visual display—it is actively heating the atmosphere in a specific location. That localized heating drives winds, which in turn generate the electrical currents responsible for the aurora. The aurora then heats the atmosphere again, sustaining the whole cycle.
What we are seeing is essentially a planetary heat pump. Saturn's aurora heats its atmosphere, the atmosphere drives winds, the winds produce currents that power the aurora, and so it goes on. The system feeds itself.
"For decades, we knew something strange was happening with Saturn's apparent rotation rate, but we could not explain it. We then showed it was being driven by atmospheric winds, but we still did not know why those winds existed. These new observations, made possible by JWST, finally give us the evidence we needed to close that loop."
The findings also have broader implications. The research suggests that what happens in Saturn's atmosphere directly influences conditions in its surrounding magnetosphere—the vast region of space shaped by the planet's magnetic field—which in turn feeds energy back into the system. This two-way relationship between the atmosphere and magnetosphere may help explain why the effect is so stable and long-lasting.
Tom S. Stallard et al, JWST/NIRSpec Reveals the Atmospheric Driver of Saturn's Variable Magnetospheric Rotation Rate, Journal of Geophysical Research: Space Physics (2026). DOI: 10.1029/2025ja034578
New study uncovers mechanism explaining how obesity increases cancer risk
Obesity increases cancer risk by causing organs such as the liver, kidneys, and pancreas to enlarge primarily through an increase in cell number (hyperplasia), not just cell size. This expansion raises the number of cells susceptible to mutations, thereby elevating cancer risk. Organ size may predict cancer risk more accurately than BMI, highlighting the importance of maintaining a healthy weight early in life.
The study, published in Cancer Research, reveals a major driving force behind how obesity increases cancer risk across multiple organs.
The findings emphasize the importance of maintaining a healthy weight from early childhood and propose a potentially more accurate way than body mass index (BMI) to predict the increase in cancer risk associated with obesity. The study reveals that excess weight doesn't just affect metabolism or hormones—it can physically enlarge organs, creating more opportunities for cancer to take hold. Understanding that process matters because it helps explain how everyday health choices can shape cancer risk years or even decades down the line.
In other words, as a person gains weight, their organs also grow in size by accumulating more cells to meet the higher energy needs of a bigger body. Having more cells boosts the odds of more DNA errors as cells divide, increasing the likelihood of cancer. To test this hypothesis, researchers conducted a two-pronged study. First, the team evaluated 747 adults whose weight relative to their height spanned the complete BMI spectrum, from underweight (BMI below 18.5) to severely obese (BMI of 40-plus). Using CT scans, the researchers measured the size of each adult's liver, kidneys, and pancreas.
This study is the first to analyze the size of multiple organs in a large cohort of individuals across the full BMI spectrum.
The scientists discovered that the organs grew larger as body weight increased. For every 5-point increase in BMI, the liver grew by 12%, kidneys by 9%, and the pancreas by 7%.
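Those reported growth rates can be sketched as follows (compounding the per-5-point growth across multiple steps is our assumption for illustration, not something the study states):

```python
# Reported scaling: per 5-point BMI increase, organ volume grows by
# ~12% (liver), ~9% (kidneys), ~7% (pancreas).
GROWTH_PER_5_BMI = {"liver": 0.12, "kidneys": 0.09, "pancreas": 0.07}

def organ_size_factor(organ, bmi_increase):
    """Relative organ size after a given BMI increase (baseline = 1.0)."""
    steps = bmi_increase / 5.0
    return (1 + GROWTH_PER_5_BMI[organ]) ** steps

# A 10-point BMI increase (two 5-point steps) for the liver:
factor = organ_size_factor("liver", 10)   # 1.12 ** 2, about 1.25x baseline
print(f"liver ≈ {factor:.2f}x baseline size")
```

If cancer risk scales roughly with cell number, as the study argues, this size factor also approximates the relative increase in the organ's risk.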
Next, the research team counted the cells in samples of kidney tissue taken from autopsies and reanalyzed biopsy data from living patients. The analysis showed that more than 60% of the kidneys' growth resulted from an increase in the number of cells in the organ, a process called hyperplasia. The rest was due to individual cells growing bigger, or hypertrophy. The finding corrects earlier theories that larger organ size in obese individuals resulted primarily from fatter cells. Rather, obesity mainly increases the number of cells at risk for copying errors, uncontrolled growth, and potential malignancy.
This increase in organ size has harmful consequences. The more cells in an organ, the more mutations and the greater the risk of one cell going awry during division and becoming cancerous. Overall, the study showed a strong link between organ size enlargement and cancer risk across all three organs, confirming the mathematical predictions. The finding provides evidence for this as a major mechanism of tumorigenesis induced by obesity, in addition to factors like inflammation and hormonal imbalances.
This newly discovered effect of obesity can be large—with organs even doubling in size.
"When an organ doubles in size, it is expected to roughly double its risk of developing cancer," the researchers say. Noting the relationship between diet and cancer, the authors emphasized the importance of maintaining a healthy weight from a young age.
Sophie Pénisson et al, Hyperplasia Functions as a Link Between Obesity and Cancer, Cancer Research (2026). DOI: 10.1158/0008-5472.can-25-2487
Our eyes and brains sync to the music we're listening to
Walk down the street while listening to your favourite song and you may notice that you're stepping in time to the beat. People instinctively blink to the beat, too. In a recent study, researchers saw that dozens of non-musicians blinked in sync with the beat structure of pieces by Bach, without being asked.
Our bodies respond to music in a lot of weird ways. For example, one Japanese study determined what types of songs will spontaneously make people bop up and down, and which make them sway side-to-side. In the case of blinking to the beat, the scientists found that when study participants were distracted by a screen, they were less likely to sync their blinks. It might mean that active listening is required to incite these involuntary effects.
The finding makes sense because music activates the motor areas of the brain. Even if we’re just sitting still—and not bopping along to the beat—there can often be this sense of motion.
Why cultivating drought-resistant plants disappoints: Soil physics may be the real bottleneck
Water uptake in plants is primarily limited by soil properties, specifically capillary and viscous forces in soil pores, rather than by plant physiology. As soil dries and water potential drops below -1.5 MPa, plants cannot extract water efficiently, regardless of their internal adaptations. This explains why breeding for drought resistance by altering plant traits has had limited success.
Most water in the soil exists in pores of varying sizes. These pores exert a capillary force that holds the water in place. Soil physicists discovered that when the soil water potential falls below -1.5 megapascals, plants are unable to extract water fast enough to meet their needs.
In other words, when soil dries, capillary and viscous forces in the pores increase—and plants find it harder to draw water from the soil.
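The capillary side of this can be illustrated with the textbook Young–Laplace relation (standard soil physics, used here only to put the -1.5 MPa threshold in context; the study's own analysis is more elaborate):

```python
# Young–Laplace relation for water held in a cylindrical pore:
# P_cap = 2 * gamma * cos(theta) / r. As soil dries, only ever-smaller
# pores still hold water, so the suction a plant must overcome rises.
import math

GAMMA = 0.072   # N/m, surface tension of water near room temperature
THETA = 0.0     # rad, contact angle (perfect wetting assumed)

def capillary_pressure_mpa(pore_radius_m):
    """Capillary suction (MPa) of a water-filled pore of given radius."""
    return 2 * GAMMA * math.cos(THETA) / pore_radius_m / 1e6

def pore_radius_at(pressure_mpa):
    """Pore radius (m) whose capillary suction equals pressure_mpa."""
    return 2 * GAMMA * math.cos(THETA) / (pressure_mpa * 1e6)

r = pore_radius_at(1.5)   # the -1.5 MPa threshold from the study
print(f"pores of ~{r * 1e6:.2f} micrometres still hold water at -1.5 MPa")
```

Under these assumptions, only pores around a tenth of a micrometre or smaller still retain water at the threshold, which is why the remaining water becomes so hard to extract.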
Stomata are super sensitive. Plants have special structures on the underside of their leaves known as stomata that function as an interface for gas exchange. These are small valves that the plant opens and closes in response to fluctuating environments.
When they are open, carbon dioxide from the air can flow into the leaf while water can escape into the atmosphere as vapor.
When the plant closes its stomata, it conserves water. This prevents it from dying of thirst. However, when the stomata are closed, the plant faces starvation because less carbon dioxide enters its leaves, meaning it produces fewer new sugar molecules. As a result, it grows more slowly.
Ultimately, the behaviour of these tiny valves determines how much carbon from the atmosphere enters the land plant biomass.
A plant requires considerable energy to draw water from soil pores. For example, the cell walls of the tubes through which water rises in shoot stems or tree trunks are thickened.
This enables them to withstand the tension in the vascular system and not collapse.
Further up in the leaves, dissolved substances in plant cells generate osmotic pressure, which keeps cells turgid despite the high tension in neighbouring vascular tissues.
The agricultural industry has long attempted to breed plants that store more solutes in their cells, hoping this would help them absorb water more efficiently from the soil and thus better withstand drought. Although a substantial amount of money has been invested in such breeding programmes, these hopes have never been realized.
The new results explain this failure: the limiting factor lies not in the plants but in the soil.
The physics of capillarity not only predicts the extent to which soil pores empty but also what occurs high up in the leaves.
Andrea Carminati et al, Soils drive convergence in the regulation of vascular tension in land plants, Science (2026). DOI: 10.1126/science.adx8114
Bird-like robots promise greater flexibility and control than drones
Engineers have developed a bird-like ornithopter with flexible wings powered by piezoelectric materials, eliminating the need for motors, gears, or mechanical linkages. This solid-state design enables flapping and twisting motions, offering greater maneuverability and potential for applications such as search and rescue. Advanced modelling integrates flight physics, aiding future design optimization.
Xin Shan et al, Multi-physics finite state fully coupled modeling of mechanism-free induced-strain actuated ornithopters, Aerospace Science and Technology (2025). DOI: 10.1016/j.ast.2025.110573
Jamming bacterial communications, instead of killing the microbes, might provide long-lasting treatment
Targeting bacterial quorum sensing, rather than killing bacteria directly, offers a promising strategy against multidrug-resistant Pseudomonas aeruginosa. Screening FDA-approved drugs identified molecules, including Vorinostat, that inhibit the QS protein PqsE and reduce its enzymatic activity. Structural modifications may further disrupt bacterial communication and impair infection capability. Traditional antibiotics kill bacteria or target their growth and replication. While this strategy has worked well for decades, it can lead to resistance—when mutated bacteria survive and continue to replicate, unbothered by the treatment.
In recent years, scientists have begun investigating another idea: What if we don't kill the bacteria but instead interfere with their communications and prevent them from launching a coordinated attack on the host? This approach might make it harder for bacteria to develop resistance. It's the foundation of a new approach, which targets quorum sensing (QS). Bacteria communicate with a chemical language, emitting molecules into their environment. This is like radio communications among soldiers on a covert mission. Once they realize enough of their comrades have landed, they can start to connect and behave as a unit. Bacteria do something very similar.
Once a quorum has been reached, they begin to change their behaviour, moving from acting as individuals to acting as a group. They form colonies and secrete toxins. It's this coordinated attack that ultimately makes us sick.
By understanding and interfering with how bacteria talk to each other, researchers hope to develop a treatment for deadly bacterial diseases.
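The quorum threshold described above can be reduced to a toy model: each cell secretes signal molecules into a shared volume, and group behaviour switches on only once the concentration crosses a threshold. All numbers here (secretion rate, volume, threshold) are invented for illustration; this is not the study's model.

```python
# Toy quorum-sensing switch: autoinducer concentration rises with cell
# count; crossing the threshold flips individual cells to group behaviour.
def group_behaviour(n_cells, rate_per_cell=1.0, volume=100.0, threshold=0.5):
    """True once the shared signal concentration exceeds the quorum threshold."""
    concentration = n_cells * rate_per_cell / volume
    return concentration >= threshold

assert not group_behaviour(10)   # 10 cells: signal too dilute, act alone
assert group_behaviour(80)       # 80 cells: quorum reached, act as a unit
```

A quorum-sensing inhibitor like the PqsE blockers described above effectively lowers the signal each cell contributes, so the threshold is never reached.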
Hannah A. Jones et al, Screening of FDA-Approved Small Molecules to Discover Inhibitors of the Pseudomonas aeruginosa Quorum-Sensing Enzyme, PqsE, Biochemistry (2026). DOI: 10.1021/acs.biochem.5c00475
Altered colony chemistry reveals a process that destroys termite societies
Termite colony collapse is linked to the accumulation of uric acid in worker termites, particularly following changes in reproductive hierarchy. Elevated uric acid reduces reactive oxygen species (ROS) levels, weakening immune responses and increasing susceptibility to infections, which ultimately leads to colony decline. This process highlights the importance of internal chemical balance for colony stability.
Takao Konishi et al, What kills a society: accumulation of uric acid increases infectious disease risk in termites, Proceedings of the Royal Society B: Biological Sciences (2026). DOI: 10.1098/rspb.2025.2438.
Why drawing eyes on food packaging could stop seagulls stealing your chips
Displaying eye-like images on food packaging can deter some herring gulls from approaching and stealing food, as these birds are slower to approach and less likely to peck at boxes with eyes compared to plain ones. This response likely stems from an instinctive wariness of being watched, a behavior observed in many animals. However, the deterrent effect is not universal, with only about half of gulls consistently avoiding the eye-marked packaging.
Deuterium-labeled guinea pig helps scientists study metabolism
A guinea pig was raised exclusively on heavy water, resulting in deuterium incorporation into its biomolecules. Using mass spectrometry, the dynamics of deuterium labeling in various compounds were tracked, revealing synthesis rates and dietary contributions. This approach enables precise metabolic studies and supports the development of personalized nutrition strategies. Scientists have raised the world's only isotope-labeled guinea pig. For 156 days, the animal, named Khryun, was given only heavy water to drink. Such water is non-radioactive and has long been used in biomedical research as a way to "label" molecules: the natural isotope deuterium accumulates in the chemical bonds of organic compounds and serves as a tracer for tracking their formation and breakdown. This approach can be useful for studying human metabolism, including the development of personalized medicine methods. The research results are published in the International Journal of Molecular Sciences.
Chemical elements exist in nature as isotopes—atoms with the same number of protons but different numbers of neutrons. In heavy water, ordinary hydrogen atoms are replaced by hydrogen's heavier isotope, deuterium. Isotopes have virtually identical chemical properties, allowing them to be used as invisible tracers. This very approach—replacing ordinary compounds with labeled ones and tracking their transformations—forms the basis for deciphering most biochemical processes. To track how the isotopic labels became incorporated into various biological molecules, the researchers used high-performance liquid chromatography combined with high-resolution mass spectrometry. Mass spectrometry enables precise determination of the mass of all molecules present in a sample and distinguishes isotopes by their weight, making this method indispensable for such studies. The rate at which the label appears in a given compound reflects the intensity of its synthesis in the body.
The study determined the timeframes over which deuterium content in various substances reached steady-state levels. The final isotope content in each compound indicates to what extent it is synthesized by the body itself versus derived from food.
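The approach to steady state described above typically follows first-order kinetics, f(t) = f_ss * (1 - exp(-k*t)), where the rate constant k reflects how fast the compound turns over and the plateau f_ss reflects how much of it is made de novo. The plateau and rate constant below are assumed values for illustration, not the study's measurements.

```python
# First-order label-incorporation kinetics toward a steady-state plateau.
# f_ss and k are illustrative assumptions.
import math

def enrichment(t_days, f_ss=0.6, k=0.05):
    """Fraction of the compound carrying the deuterium label after t days."""
    return f_ss * (1.0 - math.exp(-k * t_days))

# Time to reach half the plateau is ln(2)/k, independent of the plateau:
half_time = math.log(2) / 0.05
print(f"{half_time:.1f} days")   # ~13.9 days at this assumed turnover rate
```

Fitting k for each compound is what turns a long labeling experiment like the 156-day one into per-molecule turnover rates.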
Oat shoots grown on heavy water were also produced. After Khryun ate them, the researchers determined how quickly the isotopically labeled substances from the oats became incorporated into the animal's own biological molecules.
The study established a methodological framework for using isotopically labelled food to investigate individual metabolic characteristics, opening up enormous possibilities for metabolic control.
Yury Kostyukevich et al, Turnover Rate of Lipids, Metabolites and Proteins Revealed by 156-Day-Long D2O Administration in a Guinea Pig, International Journal of Molecular Sciences (2026). DOI: 10.3390/ijms27041944
Mild hypoxia after premature birth may disrupt hippocampal communication, mouse study suggests
Mild hypoxia after premature birth impairs learning and memory into adulthood by disrupting neuron-to-neuron communication in the hippocampus. This effect involves altered function of a specific protein channel and a second regulatory protein. Restoring the second protein's function in adult mice reversed the channel impairment, indicating potential therapeutic targets for hypoxia-induced cognitive deficits.
During intensive care after preterm births, babies can experience low oxygen in their tissue and cells—or hypoxia. Hypoxia is linked to poor brain health outcomes and life-long memory issues, but the mechanisms are unclear. Researchers discovered a contributing mechanism by creating a mouse model for mild hypoxia following premature birth. This new study explores how mild hypoxia may alter brain development without direct brain injury in this neonatal period. As presented in their JNeurosci paper, mild hypoxia shortly after birth hindered learning and memory into adulthood, and the researchers discovered, at least in part, the mechanism for this effect: altered neuron-to-neuron communication in the hippocampus.
Probing a molecular mechanism, the researchers found that hypoxia following premature birth affected a protein channel involved in neuron-to-neuron communication and memory that develops in the hippocampus during adolescence. They also identified a second protein that was involved in hypoxia's effects on the channel's functioning.
When the researchers targeted this second protein in adult mice, they restored the channel's function. According to the authors, this work sheds light on how hypoxia in preterm babies influences neuron communication in memory-related brain regions to hinder learning and memory into adulthood.
Mild Neonatal Hypoxia Targets Synaptic Maturation, Disrupts Adult Hippocampal Learning and Memory, and is Associated with CK2-Mediated Loss of Synaptic Calcium-Activated Potassium Channel KCNN2 Activity, JNeurosci (2026). DOI: 10.1523/JNEUROSCI.1643-25.2026
Boosting good gut bacteria population through targeted interventions may slow cognitive decline
The gut is essentially an anaerobic ecosystem teeming with a vast mix of microorganisms, including bacteria, fungi, and protozoa, collectively called the microbiota. These organisms settle across different regions of the gut lining and together form the bulk of the human microbiome, spanning over 1,000 microbial species.
Several studies have established that gut microbiota play a key role in brain development and cognition. As people age or maintain poor diets, the community of bacteria in the gut can shift from helpful to harmful, a state known as dysbiosis. Such imbalances in gut bacteria may quietly fuel brain decline by triggering inflammation, weakening the brain's protective barrier, and allowing harmful proteins linked to Alzheimer's to build up.
The origin of neurodegenerative diseases like Alzheimer's or dementia isn't limited to the brain. The state of your gut can quietly set off a cycle of chronic, system-wide inflammation that nudges the brain toward cognitive decline. But how does the pathogenesis of a disease that seems purely brain-based begin in the gut—an organ that is mostly busy producing chemicals for digesting food?
It turns out these two entities are linked by the gut-brain axis, a two-way communication superhighway that constantly sends signals between the digestive tract and the central nervous system. It runs on chemical messengers like neurotransmitters and fatty acids, sharing information that shapes our memory, mood, and inflammation triggers.
An analysis of 15 studies involving more than 4,200 participants found that the gut-brain highway can be put to work as a drug-free route to support cognitive health. Tuning the gut microbiota through diet, supplements, or medical interventions such as fecal microbiota transplantation (FMT) can help improve memory, executive function, and overall cognitive performance, particularly in early or mild cases of cognitive impairment.
The credit for this protective effect goes to an increase in beneficial gut bacteria that produce compounds capable of slowing cognitive decline and reducing inflammation. The findings are published in Nutrition Research.
Sofia Libriani et al, The association between gut microbiota and cognitive decline: A systematic review of the literature, Nutrition Research (2026). DOI: 10.1016/j.nutres.2026.01.003
Human brain operates near, but not at, the critical point
A recent study published in Physical Review Letters reveals that many widely used signatures of criticality in brain data may be statistical artifacts. The authors propose a more robust framework that, when applied to whole-brain fMRI data, confirms the brain operates near, but not exactly at, a critical point.
Neuroscientists have long found the idea fascinating—that the brain operates near a "critical point," a phase transition between stable and chaotic dynamics. Theory suggests this sweet spot enhances computational flexibility, dynamic range, and sensitivity to inputs. Evidence has mounted over the years from neural recordings showing approximate scale invariance and power-law behavior across spatiotemporal scales.
Operating near a critical point can retain many of the proposed computational benefits, such as rich multiscale collective modes and strong but controllable amplification, while avoiding the drawbacks of sitting exactly at criticality, where small perturbations can lead to instability, runaway activity, or reduced robustness, the researchers say.
Rubén Calvo et al, Robust Scaling in Human Brain Dynamics Despite Correlated Inputs and Limited Sampling Distortions, Physical Review Letters (2026). DOI: 10.1103/36v9-wtm8.
Graphene oxide kills bacteria while sparing human cells
Hygiene in everyday items that touch the body—such as clothing, masks, and toothbrushes—is critically important. The underlying principle of how graphene selectively eliminates only bacteria has now been revealed. In Advanced Functional Materials, a research team presents the potential for a next-generation antibacterial material that is safe for the human body and capable of replacing antibiotics.
The research team has identified the mechanism by which graphene oxide (GO) exhibits powerful antibacterial effects against bacteria while remaining harmless to human cells.
Graphene oxide is a nanomaterial consisting of an atomic-level carbon layer (graphene) with oxygen attached; it is characterized by its ability to mix well with water and implement various functions.
This study is highly significant as it provides molecular-level proof of graphene's antibacterial action, which had not been clearly understood until now.
The research team confirmed that graphene oxide performs "selective antibacterial action" by attaching to and destroying only the membranes of bacteria, much like a magnet attaches only to specific metals, while leaving human cells untouched. This occurs because the oxygen functional groups on the surface of graphene oxide selectively bind with a specific component (POPG) found only in bacterial cell membranes. Simply put, it recognizes a "target" present only in bacterial membranes to attach and destroy the structure. In this context, phospholipids are fatty components that make up the membrane surrounding a cell, and POPG is a component primarily present in bacteria.
Furthermore, fibres using this material maintained their antibacterial functions even after multiple washes, showing potential for use in various industrial fields such as apparel and medical textiles.
This technology is already being applied to consumer products.
Sujin Cha et al, Biocompatible but Antibacterial Mechanism of Graphene Oxide for Sustainable Antibiotics, Advanced Functional Materials (2026). DOI: 10.1002/adfm.74695
Liquids can fracture like solids—researchers discover the breaking point
Simple liquids can fracture like solids when subjected to sufficient tensile stress, exhibiting brittle fracture at a critical stress of about 2 MPa. This behaviour is governed by viscosity rather than elasticity and appears consistent across different simple liquids, regardless of chemical composition. The finding challenges traditional views of fluid mechanics and suggests new avenues for manipulating liquids in various applications.
Thamires A. Lima et al, Unexpected Solidlike Fracture in Simple Liquids, Physical Review Letters (2026). DOI: 10.1103/t2vy-32wr
Earth formed from material exclusively from the inner solar system, planetary scientists show
Analysis of isotopic data from meteorites indicates that Earth's material originated almost entirely from the inner solar system, with less than 2% contribution from beyond Jupiter. This suggests minimal exchange between inner and outer solar system reservoirs, likely due to Jupiter acting as a barrier. Most volatile elements, including water, must have been present in the inner solar system during Earth's formation.
Paolo A. Sossi et al, Homogeneous accretion of the Earth in the inner Solar System, Nature Astronomy (2026). DOI: 10.1038/s41550-026-02824-7
What are rare earth elements? Where do they come from? What's the big deal?
Rare earth elements, comprising 17 metals including the lanthanides, are essential for modern technologies due to their unique magnetic, conductive, and optical properties. Despite their name, they are relatively abundant but challenging to mine safely. Global supply is not limited to one country, though China dominates processing. Recovering rare earths from mine waste offers a sustainable supply option.
1. Everyday devices are possible because of rare earths
The phrase "rare earth elements" generally refers to 17 chemical elements, including scandium, yttrium and a 15-member family from atomic number 57 (lanthanum) to 71 (lutetium) called the lanthanides.
Many of them share magnetic, conductive and optical properties that make them useful as coatings and additives in alloys and glass and other materials used in a wide range of modern technology. These include jet engines, LED bulbs, fiber-optic cables, lasers and a lot of military technology.
"In some of those applications, it's safe to say rare earths are irreplaceable. For example, neodymium and praseodymium make super powerful magnets that have enabled the miniaturization of technologies in phones and computers. These really powerful magnets make the magic happen in high-speed trains and MRI machines, too."
Not every application feels particularly high-tech. Seat belts in cars also use rare earth magnets.
"It's not due to a particular engineering need either. It turns out that when folks were developing the seat belt retracting mechanism, that was the type of magnet they had on the shelf."
2. 'Rare' is a misnomer
Rare earths are not, in fact, particularly rare. The rare earths name is a holdover from the 18th century, when yttrium was discovered in a mine in Sweden. These elements were "rare" then, because nobody had seen them before. But now we know they can be found around the globe.
"Seventeen elements is actually a sizable chunk of the periodic table. We're talking about a fair amount of the stuff that makes up Earth's crust, from an elemental and mineralogical standpoint. The rare earths that we use most commonly are as abundant as copper or lead."
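The abundance comparison in the quote above can be sanity-checked against rounded crustal-abundance figures from the literature (values in ppm; sources vary slightly, so treat these as approximate illustrations rather than authoritative data).

```python
# Approximate crustal abundances in parts per million (rounded literature
# figures, for illustration only).
crustal_ppm = {
    "cerium":    66,     # the most abundant rare earth
    "neodymium": 41,     # the magnet workhorse
    "copper":    60,
    "lead":      14,
    "gold":      0.004,
}

# The common rare earths sit in the same range as familiar base metals
# and are orders of magnitude more abundant than precious metals.
assert crustal_ppm["cerium"] > crustal_ppm["lead"]
assert crustal_ppm["neodymium"] > 1000 * crustal_ppm["gold"]
```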
They're just not particularly fun to dig up.
"The geological conditions that cause rare earths to come together in higher concentrations can also concentrate radioactive materials. That makes them hard to mine safely, and can really increase costs."
That doesn't mean rare earths are expensive. They're actually relatively cheap, trading at prices far lower than precious metals like gold or platinum. In China, which has 30% of the world's proven rare earth reserves, mines typically discard as much as half of the rare earths they dig up, because prices aren't high enough to put the effort into recovering more.
3. Rare earths may seem so scarce because 'Avatar' was so popular
In December 2009, the sci-fi film "Avatar" was released, and it remained the most popular film in U.S. theaters for months. The plot was built around humans displacing a native race on another planet to make way for mining a fabulously valuable material called "unobtanium."
In 2010, in the real world, a diplomatic dispute led China to cut off Japan's access to rare earth elements—a very temporary blow (the embargo didn't even last as long as "Avatar" did as the No. 1 film) to Japanese tech manufacturers.
"There were headlines that said something like 'China cuts off access to unobtanium.' Our popular imagination was kind of primed by the movie, and then this short-term crisis happened. The narrative—which has continued to support a lot of other politics over the years—stuck, and it's been hard to get unstuck."
4. It's unlikely one country would just turn off the rare earths tap While China does have ample rare earths reserves, we know the elements are distributed all around the world. Aside from China's willingness to take on the environmental price of rare earths mining, the real source of the country's market dominance is the expertise and infrastructure it has developed to process what it mines.
"Where China does have an outsized share of the rare earth economy is in the crucial intermediate steps involved in transforming a rock in the ground into useful technological components. Other countries and industries have supported the establishment of China's rare earths strength by continuing to trade for the materials, and maintaining those trading relationships is important for everyone.
"Price squeezes and supply chain concerns tend to be episodic rather than sustained. Buyers and sellers like to be connected. If you're a seller located in China with buyers located outside China, you don't want to be cut off. There's pressure in China to avoid longer-term trade wars that might hurt domestic businesses."
5. Abandoned mines could be a rare earths gold mine—and sustainable solution—for the U.S.
A recent study showed that much of the domestic demand for rare earths (and other important minerals) can be satisfied by recovering the rare earth elements from the waste piled up around old and active mines in the United States.
"A lot of these materials are already present in what was cast off by other mines. Maybe we could actually get what we need by cleaning up these long-standing, problematic, abandoned mine waste sites. It could literally be trash to treasure."
Binding to RNA is not enough—changing its shape is what makes a drug work, study reveals
Small molecules that merely bind to RNA rarely alter its function, whereas those that induce changes in RNA structure have a greater functional impact. Modulating RNA folding, rather than just binding, is crucial for effective RNA-targeting drugs. A new framework is proposed to identify small molecules capable of altering RNA structure, aiming to improve RNA-targeted drug development.
Chundan Zhang et al, RNA functional modulation by Mitoxantrone via RNA structural ensemble repartitioning, Nature Communications (2026). DOI: 10.1038/s41467-026-70801-9
Soil bacteria break down toxic chemicals in the environment
Many aromatic compounds, such as phenols, cresols and styrenes, are toxic to organisms and harmful to the environment. They can accumulate as a result of industrial processes and harm ecosystems. Soil bacteria can help to break them down.
Soil bacteria such as Rhodococcus opacus 1CP possess large, redundant genomes encoding multiple enzymes that enable the breakdown of toxic aromatic compounds like phenols, cresols, and styrenes. These redundancies allow bacteria to adapt to varying environmental conditions and maintain pollutant degradation, even when specific enzymes are inactive, by activating alternative metabolic pathways.
Selvapravin Kumaran et al, Whole-genomic and transcriptomic analyses elucidate p-cresol and styrene degradation metabolism in Rhodococcus opacus 1CP, Applied and Environmental Microbiology (2026). DOI: 10.1128/aem.00045-26
Why a man's health before pregnancy matters for the next generation
Men's health and life experiences before conception significantly influence pregnancy outcomes and child development. Factors such as age, nutrition, substance use, mental health, and environmental exposures can affect sperm and gene expression, impacting offspring health. Supportive partner relationships and early-life experiences also shape family well-being across generations.
The four types of dementia most people don't know exist
Dementia encompasses over 100 types, with Alzheimer's disease accounting for about 60% of cases. Less common forms include posterior cortical atrophy, Creutzfeldt-Jakob disease, FTD-MND, and progressive supranuclear palsy, each presenting distinct symptoms beyond memory loss, such as visual, motor, or behavioral changes. Early recognition of these subtypes is crucial for appropriate care.
Evacuating in 90 seconds? New simulations show the safest cabin layout
In case of an emergency, the U.S. Federal Aviation Administration (FAA) requires aircraft to be able to evacuate within 90 seconds. However, as the median age of the global population increases, the growing number of elderly airline passengers poses new challenges during emergency situations.
In AIP Advances, an international collaboration of researchers simulated 27 different evacuation scenarios in the case of a dual-engine fire in an Airbus A320, one of the most common narrow-body aircraft in the world. They compared three different cabin layouts with three different ratios of passengers over the age of 60 and three different distributions of those passengers.
While a dual-engine fire scenario is statistically rare, it falls under the broader category of dual-engine failures and critical emergencies in aviation.
In seeking the most efficient combination of factors, the researchers created full-scale computer-aided design models of the A320 cabin and used Pathfinder—the industry-standard software for evacuation modelling—to simulate passengers' behaviour. They found the proportion and location of elderly passengers have the largest effect on evacuation time.
The fastest option—a layout that accommodates a total of 152 passengers with two rows of first-class seats at the front, and 30 elderly passengers evenly distributed throughout the cabin—still required 141 seconds for all the passengers to reach the ground, much longer than the FAA mandates.
Previous studies have shown that cognitive decline in elderly populations can affect situational awareness and delay decision making, and that reduced dexterity can be exacerbated during high-stress situations. The researchers hope that incorporating this information into their findings—for example, by offering additional safety briefings to elderly passengers—will help further accelerate the deboarding process.
Children, infants, and pregnant women also introduce unique physical capabilities and behaviours that add another vital layer to evacuation modelling, which the group plans to investigate in their future work.
Effect of elderly passenger distribution on A320 aircraft evacuation under dual-engine fire scenarios, AIP Advances (2026). DOI: 10.1063/5.0310405
Researchers have captured the exact atomic movements that write data to next-generation memory devices, which could pave the way for smaller, faster and more energy-efficient electronics.
Using advanced electron microscopy, the research team captured atomic-scale movements inside promising memory materials, known as fluorite-type ferroelectrics, that could overcome current limits to how small and efficient memory devices can become. Everyday technologies, such as smartphones, medical devices, wearable electronics and contactless IC cards used in public transport, store data as billions of digital 1s and 0s. In these materials, the physical position of an atom acts like a "switch"—and moving an atom just a fraction of a nanometer is what flips a data bit from a 0 to a 1.
This research shows exactly how that physical movement happens in real time. Until now, scientists couldn't directly observe this switching, which takes place in fractions of a second.
They discovered that switching doesn't happen in a single step, but through previously unseen intermediate atomic structures, and that the process can be controlled by changing the material's composition.
Kousuke Ooe et al, Direct observation of cation-dependent polarisation switching dynamics in fluorite ferroelectrics, Nature Communications (2026). DOI: 10.1038/s41467-026-70593-y
Animals are powerful landscape engineers shaping the Earth's surface, global study finds
Wild animals significantly modify Earth's surface by altering soil and sediment through activities such as burrowing and feeding. A global meta-analysis analyzed data from 64 studies covering 61 species of wild animals across freshwater and terrestrial environments. Animal activity increases soil porosity and reduces fine material, influencing erosion and landscape development. These effects are more pronounced in freshwater ecosystems (136% change) than terrestrial ones (66%), highlighting animals as key geomorphic agents.
Wild animals are not just inhabitants of the natural world. Many also act as natural landscape engineers, reshaping Earth's surface as they burrow, feed, and build shelters that move soil and sediment across ecosystems. From animals disturbing riverbeds to burrowing species redistributing soil, wildlife constantly modifies the physical structure of landscapes through everyday activities. The research found that animals consistently increased the porosity of soils and sediments and reduced the amount of fine material present. These changes influence how water and sediment move through ecosystems and can affect processes such as erosion, river behaviour and landscape development. The new study provides quantitative evidence of how strongly animal activity can modify geomorphic processes across ecosystems.
Z. Khan et al, Signatures of Wild Animal Life in Earth's Landscapes, Journal of Geophysical Research: Earth Surface (2026). DOI: 10.1029/2025jf008351
Antibacterial soaps and wipes can fuel antimicrobial resistance, scientists warn
An international team of scientists is warning that everyday antibacterial soaps, wipes, sprays, and other "germ-killing" products are quietly contributing to the global rise of antimicrobial resistance (AMR) while providing no added health benefit for most consumer uses.
Widespread use of antibacterial soaps, wipes, and other consumer products containing biocides such as quaternary ammonium compounds is contributing to antimicrobial resistance (AMR) without providing added health benefits for most uses. These chemicals promote bacterial resistance, including cross-resistance to antibiotics, and persist in the environment. Health authorities recommend plain soap and water for routine handwashing. The authors summarize numerous laboratory and real-world studies showing that environmental levels of these chemicals cause resistant bacteria to survive and spread, promote cross-resistance to important antibiotics, and cause lasting genetic changes to microbes, including the exchange of resistance genes.
Over time, these shifts can allow resistant strains to dominate. This translates to the spread of antibiotic-resistance genes that threaten the effectiveness of antibiotics when we really need them and can contribute to rising deaths. Evidence shows biocides in many consumer products provide no added health benefit, but the biocides do raise concerns about AMR and toxicity.
Targeting Biocide Overuse in Consumer Products Will Strengthen Global AMR Action, Environmental Science & Technology (2026). DOI: 10.1021/acs.est.5c17673
Vibrations in your skull: New tool for biometric authentication
A team of researchers has developed a security system that could change how people log in to virtual and augmented reality platforms by eliminating passwords, personal identification numbers and eye scans and replacing them with something far more seamless.
A new authentication system uses unique vibration patterns from breathing and heartbeats transmitted through the skull, detected by motion sensors in XR headsets, to identify users. This software-only approach achieved over 95% accuracy in authenticating users and over 98% in rejecting unauthorized access, offering secure, continuous, and hardware-free biometric verification.
The system, a software program called VitalID, is based on the team's discovery of a new biometric: tiny vibrations generated by breathing and heartbeats that resonate through the skull in patterns unique to each person's bone structure and facial tissues. The human body is always moving in tiny ways, even when a person is sitting still. Each breath and each heartbeat create very small vibrations inside the body. Those vibrations travel up through the neck and into the head.
When they reach the skull, they cause it to vibrate slightly. Because every skull has a different shape, thickness and bone structure, the vibrations change in unique ways as they move through each person's head. Soft tissues in the face, such as muscle and fat, also influence how the vibrations travel.
As a result, each person produces a distinct vibration pattern. Motion sensors built inside virtual reality headsets can detect these tiny patterns and assess them like a fingerprint to determine who is wearing the device. No additional device or hardware is needed; the approach is software-only. In testing across 52 users over a 10-month period using two popular XR headsets, the system correctly authenticated legitimate users more than 95% of the time and rejected unauthorized users more than 98% of the time.
The researchers built a filtering system that removes interference from extraneous head and body movement, allowing the headset to focus only on the tiny vibrations in the skull caused by breathing and heartbeat. They then used advanced computer models to analyze those vibration patterns.
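The core filtering idea, separating slow body-motion drift from the respiration and cardiac frequency bands in an IMU trace, can be sketched with a simple frequency-bin projection. The sample rate, band frequencies, and single-bin correlation below are illustrative simplifications, not the VitalID pipeline.

```python
# Project a synthetic IMU trace onto individual frequency bins to pick out
# the cardiac band despite a much larger head-movement drift component.
import math

FS = 50.0   # Hz, assumed headset IMU sample rate
N = 500     # 10 seconds of samples

def band_power(signal, freq_hz):
    """Energy of the signal at one frequency via correlation with a sinusoid."""
    re = sum(s * math.cos(2 * math.pi * freq_hz * i / FS)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq_hz * i / FS)
             for i, s in enumerate(signal))
    return (re * re + im * im) / len(signal)

# Synthetic trace: large slow head drift + breathing + small heartbeat.
trace = [2.0 * math.sin(2 * math.pi * 0.02 * i / FS)    # head movement
         + 0.5 * math.sin(2 * math.pi * 0.3 * i / FS)   # respiration
         + 0.2 * math.sin(2 * math.pi * 1.2 * i / FS)   # heartbeat
         for i in range(N)]

# The cardiac bin stands out against a neighbouring bin even though the
# heartbeat is the smallest component of the raw trace.
print(band_power(trace, 1.2) > band_power(trace, 0.9))   # True
```

A real pipeline would use proper bandpass filters and learned models on top of this separation, but the principle, isolating narrow vital-sign bands from broadband movement, is the same.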
Because the vibrations travel internally through bone and tissue, they may also be more difficult to spoof. Someone might imitate another person's breathing rhythm but cannot easily replicate the biomechanical properties of another person's skull.
The headset would continuously confirm identity in the background simply by sensing the subtle vibrations that come with being alive.
Tianfang Zhang et al, Harnessing Vital Sign Vibration Harmonics for Effortless and Inbuilt XR User Authentication, Proceedings of the 2025 ACM SIGSAC Conference on Computer and Communications Security (2025). DOI: 10.1145/3719027.3765060
Large language models can enhance scientific peer review by identifying objective errors and inconsistencies, improving draft quality, and alleviating reviewer workload. However, AI is less effective at subjective judgments, such as assessing novelty or significance. Human oversight remains essential, with transparency about AI involvement and accountability for final decisions. AI's role in science is expected to expand.
There is tremendous interest in using AI, especially language models, to support research and peer review and speed up the scientific process. A key advantage is that AI can act like a rapid, always-available critic, a sort of pre-submission review process—before scientists officially send in a paper for publication. AI can be quite good at assessing drafts for gaps and limitations, so researchers can preemptively address them. This can improve the quality of first drafts submitted for publication and reduce the back-and-forth later. And on the reviewer side, the pressure is real: As submissions grow, human reviewers are very much overburdened, which can lead to lower-quality reviews and frustration for authors.
In their tests, the researchers found that besides spotting gaps and limitations, AI can be quite good at the more objective, verifiable aspects of review.
AI is strongest on objective, checkable inconsistencies and technical issues and weaker on subjective judgments about the novelty or significance of the research.
The researchers say that AI should support and inform—not fully replace—human decision-making.
A human, or team of humans, must make the final editorial decisions and scientists must stand behind the work. AI can offer comments on early drafts, point out omissions, and suggest improvements in the writing and the research—but the scientists must remain accountable for incorporating and synthesizing feedback from the AI and human reviewers.
Scientists have to be up-front about how and where AI has assisted in the research itself and in the writing and review of the papers. They should acknowledge exactly how AI was involved and what tools were used in the paper. It comes down to accountability, a clear chain of responsibility, and ensuring that final decisions are still made by humans.
In one recent large-scale randomized experiment, the results of which were published in Nature Machine Intelligence, researchers provided AI assistance to human reviewers on roughly 20,000 reviews to assess AI's impact on review quality. Following on this work, many conferences and journals are now exploring using LLMs to assist the review process.
Nitya Thakkar et al, A large-scale randomized study of large language model feedback in peer review, Nature Machine Intelligence (2026). DOI: 10.1038/s42256-026-01188-x
Premature and small births are linked to lifelong learning problems
Preterm birth and low birth weight are consistently associated with lower IQ and persistent educational disadvantages, particularly in mathematics, from early childhood into adulthood. The severity of these challenges increases with earlier gestational age and lower birth weight, underscoring the importance of early identification and ongoing support to improve long-term outcomes.
Being born early or at a lower weight is linked to lower IQ scores and poorer educational outcomes in school and beyond, according to a new study published in the journal JAMA Pediatrics. In this research, known as an umbrella review, the team examined what previous studies had discovered about preterm birth, low birth weight, and long-term development. This involved going back to the original numbers and recalculating the results using a single, consistent method to ensure accuracy. They looked at five different life stages, from babies under two years old to adults over 18.
This meta-analysis confirmed that both preterm birth and low birth weight are linked to disadvantages that persist over time. In particular, babies born before 28 weeks or weighing less than 1 kg at birth showed larger academic disadvantages on average than babies born at term with normal birth weight.
The most affected subject was math, with significant gaps in calculation and problem-solving skills. Stark differences were also seen in reading, comprehension, spelling and identifying words.
These gaps were often most visible during primary school and narrowed slightly during the teenage years. However, some of these learning difficulties reappeared once a person reached adulthood, as the study authors note in their paper: "These disadvantages generally increased with earlier gestational age and lower birth weight. Although some associations appeared to attenuate during adolescence, evidence of persistent disadvantages into adulthood was observed for several outcomes."
The research team believes their findings show that the impact of being born early or much smaller than average can have lifelong consequences. For some, this may mean fewer job opportunities or earning lower salaries than their peers.
Mingzheng Hu et al, Cognitive and Educational Outcomes After Preterm Birth or Low Birth Weight, JAMA Pediatrics (2026). DOI: 10.1001/jamapediatrics.2026.0533
How time and space become one inside your brain—and what it means for Alzheimer's
If you develop Alzheimer's disease, you not only lose your sense of time, but you also lose your sense of place.
Neural circuits in the retrosplenial cortex process time and space using similar activity patterns, indicating these dimensions are integrated in the brain. This shared mechanism helps explain why both temporal and spatial orientation deteriorate together in Alzheimer's disease, highlighting the need to understand healthy episodic memory networks to address dementia.
All memories are made up of different components. You don't just remember what you had for dinner yesterday, but also the time and place. We often think of time and space as separate categories, a distinction created by philosophers and physicists that is incredibly practical for organizing our lives. But our brain cells don't see it that way.
These cells don't distinguish between a step forward in space or a second passing in time. Instead, they simply record a continuously changing stream of information from our senses, tracking events as they unfold. To the brain's internal network, time and place are effectively two sides of the same coin. In Alzheimer's disease, it is therefore not surprising that both are affected; when the neural network is damaged, our sense of 'where' and 'when' begins to unravel together.
Remembering where, when and how something happened is called episodic memory. In your brain, billions of nerve cells form large networks, passing signals like a relay race to process information from your senses, the sounds, smells, and sights of your life.
We already know that cells which link memories to time and space are found in the hippocampus. But this group of researchers had a theory that another area of the brain is also involved, namely the retrosplenial cortex. Located at the back of the cerebral cortex near the hippocampus, this area was previously only known for linking memories to place.
To test if this area also tracks time, the team designed a memory challenge for mice. The task required them to hold a specific odor in their "working memory" during a brief period. Their study is published in Cell Reports.
The most striking discovery was that the retrosplenial cortex uses the same "neural script" for both space and time. The researchers found that the sequence of neuronal activity in the retrosplenial cortex looks almost identical whether a mouse is physically running through a room or simply holding a memory in its mind for five seconds.
This discovery brings us back to the tragic reality of Alzheimer's disease, where those affected struggle to anchor themselves in both time and place. By showing that the brain uses the same "neural script" for both, this research explains why these two senses often fail together.
This work also challenges how we perceive the world around us. While we use the concepts of time and space to organize our lives, this distinction is largely a human construct. In fact, some modern theories in physics are moving away from using time and space as the fundamental building blocks of the universe. It appears the brain's internal wiring mirrors this deeper reality.
Anna Christina Garvert et al, Area-specific encoding of temporal information in the neocortex, Cell Reports (2025). DOI: 10.1016/j.celrep.2025.115363
Unexplained sky flashes from the 1950s: Independent analysis supports their existence
Historical observations from an observatory in Germany have now provided independent evidence for brief, mysterious flashes of light in the night sky, first picked up by an American astronomical survey in the 1950s. Through fresh analysis of a German survey from the same period, independent researcher Ivo Busko, a now-retired NASA developer, has uncovered striking new support for these puzzling signals. The results have been published as a preprint on arXiv.
In 2019, an international team of astronomers launched the VASCO Project, aiming to identify unusual phenomena hidden within vast archives of historical data. In particular, their work focused on astronomical transients: objects that suddenly appear in the sky in some images, but vanish in subsequent observations.
An especially exciting result emerged in 2025, when researchers analyzed photographic plates captured as part of the Palomar Observatory Sky Survey. Carried out in California throughout the 1950s, this ambitious program produced nearly 2,000 images of the night sky using long-exposure plates. Within these images, the team found clear evidence of transients with strange appearance and behavior, captured at a time that predates the launch of any human-made satellites.
Crucially, the spatial spread of light from these sources appeared too sharp to be explained by normal stars or distant astronomical objects. Combined with the way the plates recorded their brightness, the signals suggested that the flashes lasted for less than a second, despite being embedded within exposures lasting tens of minutes.
Unless they arise from some as-yet unknown astrophysical phenomenon, one especially captivating possibility remained: that the flashes were produced by artificial objects, either briefly orbiting Earth or passing nearby.
Until now, however, the Palomar observations had not been independently confirmed. To address this gap, Busko turned to a completely separate dataset: archival photographic plates taken at the Hamburg Observatory in Germany during the same period in the 1950s. These plates captured many of the same regions of sky and were later digitized by the APPLAUSE Archive, making them accessible for modern analysis.
By comparing pairs of plates taken in close succession—each exposed for around 30 minutes before being replaced—Busko was able to search for fleeting changes between images.
His results revealed clear evidence of transients that are remarkably similar to those reported by the VASCO team, providing the first independent confirmation of the phenomenon using a different method and dataset.
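The plate-pair comparison described above amounts to a difference-imaging search: a persistent star registers on both consecutive exposures and cancels in the difference, while a sub-second flash registers on only one plate and stands out. Below is a toy sketch of that idea; the array sizes, the 5-sigma cut, and the synthetic "plates" are invented for illustration, and real photographic-plate work also needs plate alignment, PSF matching, and defect rejection.

```python
import numpy as np

def find_transient_candidates(plate_a, plate_b, n_sigma=5.0):
    """Return pixel coordinates that differ between two consecutive
    exposures by more than n_sigma times the difference-image scatter.
    Sources present on both plates cancel in the difference; a flash
    recorded on only one plate survives the cut."""
    diff = plate_a.astype(float) - plate_b.astype(float)
    threshold = n_sigma * diff.std()
    return np.argwhere(np.abs(diff) > threshold)

# Toy example: the same star field on both plates, plus one flash on A.
rng = np.random.default_rng(42)
sky_a = rng.normal(100.0, 1.0, size=(64, 64))  # plate A: sky background
sky_b = rng.normal(100.0, 1.0, size=(64, 64))  # plate B: sky background
sky_a[20, 30] += 500.0   # persistent star, exposed onto both plates
sky_b[20, 30] += 500.0
sky_a[45, 10] += 50.0    # transient flash: present on plate A only

candidates = find_transient_candidates(sky_a, sky_b)
```

Running this flags only the transient at (45, 10) and ignores the bright but persistent star, which is the behavior that makes paired 30-minute exposures such a useful filter for fleeting events.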
For now, only a small fraction of the Hamburg plates have been examined. But with further improvements to the analysis techniques, Busko is hopeful that more subtle examples of these flashes could be uncovered across the archive, strengthening the statistical significance of the findings.
Artificial objects?
While astronomers may never know exactly what caused these events, both the VASCO results and Busko's independent analysis point toward a consistent interpretation: that the flashes could have originated from flat, rotating objects orbiting close to Earth, briefly reflecting sunlight toward the ground. For some, this leaves open a more speculative possibility: that these mysterious signals may even hint at artificial objects which were sent to Earth deliberately.
Ivo Busko, Searching for Fast Astronomical Transients in Archival Photographic Plates, arXiv (2026). DOI: 10.48550/arxiv.2603.20407
Pesticides and cancer: Study reveals the biological mechanisms behind an environmental health risk
A new study, published in Nature Health, reveals a strong link between exposure to agricultural pesticides in the environment and the risk of developing cancer. By combining environmental data, a nationwide cancer registry, and biological analyses, researchers have shed new light on the role of pesticide exposure in the development of certain cancers. Pesticides are widely present in food, water, and the environment, often in the form of complex mixtures. Until now, it has been difficult to accurately assess their effects on human health, as most studies focus on isolated substances and experimental models that are far removed from real-world exposure conditions.
This new study adopts an innovative, integrative approach that accounts for the complexity of real-world exposures experienced by populations. This is the first time researchers have been able to link pesticide exposure, on a national scale, to biological changes suggesting an increased risk of cancer. The study shows that certain tumors, although they affect different organs, share common biological vulnerabilities linked to their cellular origin that can be weakened by pesticide exposure. Notably, the liver is a key organ in the metabolism of chemicals and is considered a sentinel site for environmental exposure.
Molecular analyses conducted show that pesticides disrupt processes that help maintain cell function and cellular identity. These biological changes appear before cancer develops, suggesting early, cumulative, and silent effects. They could make tissues more vulnerable to other risk factors, such as infections, inflammation, or environmental stressors. The results challenge conventional toxicological approaches, which are based on the evaluation of isolated substances and the establishment of thresholds considered safe. They highlight the importance of considering pesticide mixtures, environmental exposure, and real-world socio-ecological contexts.
Dr. Krishna Kumari Challa
How inflammation may prime the gut for cancer
Chronic gut inflammation can leave lasting epigenetic changes, or "molecular scars," in intestinal cells, making tissues more susceptible to cancer if a cancer-promoting mutation occurs later. These epigenetic alterations persist after inflammation subsides, are inherited by daughter cells, and may accelerate tumor growth, highlighting potential biomarkers and intervention targets for colorectal cancer risk.
Chronic inflammation can raise a person's risk of cancer, and a new study reveals key details about how that might happen in the gut and points to better ways to identify and reduce risk.
The study, conducted in mice, revealed that after colitis (chronic intestinal inflammation), seemingly healed gut tissues may retain the memory of earlier inflammation through molecular "scars" that make it easier for cancer to take hold later on. These memories are encoded as changes in the epigenome that are handed down from cell to cell through many generations of cell division, with long-lasting effects on gene activity that can later drive tumor growth.
The work, appearing in Nature, suggests a two-hit process over time, in which two successive alterations—an epigenetic change and a cancer-promoting mutation—can accelerate tumor growth. It also points to ways to identify, and possibly intervene on, these cancer-promoting factors with new biomarkers and therapeutics.
Surya Nagaraja et al, Epigenetic memory of colitis in stem cells promotes tumour growth, Nature (2026). DOI: 10.1038/s41586-026-10258-4
Mar 26
Dr. Krishna Kumari Challa
Dengue fever is a growing problem: Why it's so hard to beat with vaccines
Dengue fever, caused by four related viral serotypes, is expanding globally due to climate and urbanization. Vaccine development is challenging because immunity to one serotype can worsen infection with another via antibody-dependent enhancement. Effective vaccines must induce balanced, strongly neutralizing responses to all serotypes. Vaccine performance varies by prior infection, age, and transmission intensity, requiring tailored strategies and ongoing safety monitoring.
https://theconversation.com/dengue-fever-is-a-growing-problem-why-i...
Mar 26
Dr. Krishna Kumari Challa
Premature placental separation may increase child's risk of heart disease by age 28
Children born after placental abruption face a 4.6-fold higher risk of early cardiovascular disease or death from cardiovascular disease by age 28 compared to those without this complication. Risks of heart-related hospitalization and stroke are also significantly elevated. The association persists even when accounting for genetic and environmental factors within families.
The risk of developing early cardiovascular disease, or dying from it, by the age of 28 was about 4.6 times higher among people born to mothers who had a placental abruption during pregnancy than among people whose births did not involve this complication, according to new research published in the Journal of the American Heart Association.
Placental abruption occurs when the placenta separates from the uterus before birth rather than after delivery, and this can lead to severe hemorrhaging or other serious complications for the mother and baby. According to the American Heart Association's 2026 Heart Disease and Stroke Statistics, most studies have reported an incidence of 0.5% to 1% for placental abruption in the general population.
The study suggests that placental abruption needs to be treated as a very serious complication, both for the mother and for its potential effects on the baby's cardiovascular health later in life.
Most follow-up care after a placental abruption focuses on the mother. This study shows it is important that the children are also monitored, to identify potential complications tied to their increased risk of cardiovascular disease.
Mar 26
Dr. Krishna Kumari Challa
The study reported that:
Out of nearly 3 million pregnancies, approximately 1% (n = 28,641) were affected by placental abruption.
During a 28-year follow-up period, children born to mothers who had a placental abruption during the pregnancy were 4.6 times more likely to die from cardiovascular disease than children born to mothers who experienced a normal placental separation from the uterus after delivery.
Children born to mothers who had a placental abruption faced nearly three times higher risk of being hospitalized for heart-related complications during the next 28 years. These conditions included heart failure, ischemic heart disease, heart attack, blocked arteries, and general cardiovascular disease.
The children's risk of stroke hospitalization was 2.4 times higher than for children whose mothers did not have a placental abruption.
These heart disease and stroke risks associated with abruption were even higher among children younger than 1 year old.
The association between placental abruption and increased cardiovascular risk remained similar after conducting an additional analysis contrasting cardiovascular disease risks between biological siblings (each mother served as her own control), suggesting that genetic and environmental factors did not explain this relationship.
Placental abruption is a sudden and often catastrophic event that cannot be prevented and comes with no warning. Older women or those expecting more than one baby, such as twins or triplets, have an increased risk of developing this condition.
Avoiding smoking, drinking alcohol, and using illegal drugs (particularly cocaine), as well as maintaining good blood pressure control, are also important, as these factors are linked to placental abruption.
Cardiovascular Disease in Singleton Offspring Born of Pregnancies Complicated by Placental Abruption: A Population-Based Retrospective Cohort Study, DOI: 10.1161/JAHA.125.045199
Part 2
Mar 26
Dr. Krishna Kumari Challa
Walking pace may outperform blood pressure and cholesterol in predicting mortality risk, study suggests
Analysis of over 400,000 UK adults indicates that walking pace is a stronger predictor of mortality risk than blood pressure or cholesterol, particularly in individuals with chronic health conditions. Incorporating simple physical measures such as walking pace, handgrip strength, and resting heart rate enhances mortality risk prediction beyond traditional clinical factors.
The Utility of Measures of Physical Behavior, Function, and Fitness as Predictors of Mortality, Mayo Clinic Proceedings (2026). DOI: 10.1016/j.mayocpiqio.2026.100710
Mar 26
Dr. Krishna Kumari Challa
The cloning limit does exist
After 20 years and more than 30,000 cloning attempts, researchers have found the limit on the number of times that a single mouse can be serially re-cloned — their attempts failed after 58 generations. The cloned mice looked normal and lived as long as normal mice, but accumulated mutations at an unusually high rate, which could be why attempts to clone them were eventually unsuccessful, the team says. The findings suggest that asexual reproduction is ultimately unsustainable for mice, and potentially for other mammals, too.
https://www.nature.com/articles/s41467-026-69765-7?utm_source=Live+...
Mar 26
Dr. Krishna Kumari Challa
Why cells respond 'incorrectly' in old age
Aging cells exhibit altered responses to external signals due to changes in chromatin structure within the nucleus. With age, chromatin becomes more open, leading to less accurate gene expression and increased activation of inappropriate genes. This impairs essential cellular processes and may contribute to age-related diseases, suggesting that targeting chromatin structure could support healthier aging.
Some of the signs of aging in human cells originate in the cell nucleus, because the packaged form of DNA changes with age. This has now been demonstrated by researchers. It means that older cells can no longer react appropriately to external stimuli, and this can even lead to diseases. This insight could help scientists to curb such alterations and support better health in old age.
As we age, our cells age with us. Although they remain active, they become less flexible, stop dividing and sometimes respond incorrectly to signals. The reason for this lies in the cell nuclei, specifically in the chromatin—the packaged form of our DNA.
This is the finding of a study by researchers who analyzed samples of skin cells taken from people of different ages in the laboratory. They have published their findings in the journal PNAS.
Using a microscope and molecular biological methods, the scientists examined how various skin cells reacted to a specific chemical messenger when mechanical tension was applied. They compared skin cells from ten-year-old children with those from 75-year-olds.
The response of older people's cells to the same stimuli was different and significantly weaker. The scientists were able to attribute this to a specific cause: the chromatin in the cell nucleus changes with age. As a result, certain genes—sections of DNA that belong together—can no longer be read as accurately.
This process, known as gene expression, is important in order to produce the proteins required by the organism, because the genes store the instructions for building those proteins. However, if the shape of the chromatin changes with age, other processes may be triggered instead, which can harm the organism.
G. V. Shivashankar, Chromatin accessibility regulates age-dependent nuclear mechanotransduction, Proceedings of the National Academy of Sciences (2026). DOI: 10.1073/pnas.2522217123
Mar 27
Dr. Krishna Kumari Challa
Tiny rotating hairs inside a microscopic cavity decide where your organs will grow
Organ asymmetry in humans originates during embryonic development, where rotating cilia in the embryonic node generate a directional fluid flow that influences left-right organ placement. Using an artificial node with magnetically controlled cilia and advanced simulations, it was demonstrated that both cilia-induced flow and morphogen transport together break left-right symmetry.
Heart to the left. Liver to the right. That's where you'll find these organs in a healthy human body, but surprisingly, in some people, the heart is on the right and the liver on the left. Whether typical or mirrored, this asymmetry can be traced back to your embryonic stage. In the early days of your development, a small fluid-filled cavity known as an embryonic node forms in your embryo. Inside, tiny micro-hairs known as cilia create a flow pattern that steers where organs grow in your body.
Researchers have revealed key details behind the process by building a world-first artificial embryonic node—using synthetic, magnetically controlled cilia to generate a flow pattern—and then exploring what happens in the node using advanced simulations. They published their findings on March 25 in the journal Science Advances.
The asymmetry can be traced all the way back to your earliest days as an embryo, and it has to do with what happens in something called an embryonic node.
An embryonic node is a small cavity that contains a fluid (made up of water, proteins, hormones, and other substances). The top is closed off by a membrane, while the bottom layer is lined with a few hundred tiny micro-hairs called cilia. The whole node is just a few hundred micrometers across. The cilia in the embryonic node rotate in the same direction, making a tilted conical motion. This generates an anticlockwise fluid flow inside the node, and it is this flow that is known to play a key role in breaking left-right symmetry.
Tanveer ul Islam et al, Artificial embryonic node elucidates the role of flow in left-right symmetry breaking in vertebrates, Science Advances (2026). DOI: 10.1126/sciadv.aec2328
Mar 27
Dr. Krishna Kumari Challa
Influencers promoting prescription drugs on social media pose public health risks
Influencer promotion of prescription drugs on social media increases the risk of misinformation, with exaggerated benefits and omitted side effects commonly observed. Current FDA and FTC regulations are insufficient for this context, and inconsistent disclosure blurs the line between personal stories and advertisements. Stronger guidelines, standardized disclosures, and improved public media literacy are recommended to address these risks.
Sascha Gell et al, Prescription Drug Promotion by Social Media Influencers, JAMA Network Open (2026). DOI: 10.1001/jamanetworkopen.2026.2738
Mar 27
Dr. Krishna Kumari Challa
Severe strokes may 'rejuvenate' undamaged brain regions
Analysis of MRI data from over 500 stroke survivors indicates that while severe strokes accelerate aging in the damaged brain hemisphere, undamaged regions—especially the contralesional frontoparietal network—exhibit a younger structural profile. This suggests adaptive neuroplasticity, where unaffected areas reorganize to compensate for lost motor function. These findings may inform personalized rehabilitation strategies.
Gilsoon Park et al, Associations between contralesional neuroplasticity and motor impairment through deep learning-derived MRI regional brain age in chronic stroke (ENIGMA): a multicohort, retrospective, observational study, The Lancet Digital Health (2026). DOI: 10.1016/j.landig.2025.100942
Mar 27
Dr. Krishna Kumari Challa
Eating about 4,200 mg sodium a day may raise heart failure risk 15%
Daily sodium intake averaging 4,200 mg, nearly double the recommended limit, is associated with a 15% higher risk of new-onset heart failure, independent of other health or sociodemographic factors. Modest sodium reduction could lower heart failure incidence and related healthcare costs, particularly in high-risk, low-income populations.
Excessive consumption of dietary sodium (salt) is a significant, independent risk factor for new-onset heart failure, according to a report from Vanderbilt Health, published in the Journal of the American College of Cardiology: Advances.
Consuming a population average of about 4,200 milligrams of dietary sodium a day (the recommended maximum is 2,300 milligrams) was associated with a 15% increase in the risk of incident (new) cases of heart failure.
"Even modest reductions in sodium consumption may significantly reduce the burden of heart failure in this high-risk population," the researchers report.
The increased risk of heart failure linked to sodium was independent of sociodemographic factors, including diet quality and caloric intake, as well as health conditions such as high blood pressure and high lipid blood levels.
Even a modest reduction in dietary salt, to 4,000 milligrams a day or less, could reduce heart failure cases by 6.6% over 10 years, the researchers predicted.
Leonie Dupuis et al, Dietary Sodium Intake and Risk of Incident Heart Failure in the Southern Community Cohort Study, JACC: Advances (2026). DOI: 10.1016/j.jacadv.2026.102651
Mar 27
Dr. Krishna Kumari Challa
Implantable 'living pharmacy' produces multiple drugs inside the body
A multi-institutional team of scientists has taken a crucial step toward implantable "living pharmacies"—tiny devices containing engineered cells that continuously produce medicines inside the body. In a new study published in Device, the team engineered cells to simultaneously produce three different biologics—an anti-HIV antibody, a GLP-1-like peptide used to treat type 2 diabetes, and leptin, a hormone that regulates appetite and metabolism. When implanted under the skin of a small animal model, the device kept drug-producing cells alive and stably delivered all three therapies at once.
Called HOBIT (hybrid oxygenation bioelectronics system for implanted therapy), the new system integrates the engineered cells with oxygen-producing bioelectronics. Roughly the size of a folded stick of gum, the design shields cells from the body's immune system while also providing cells with oxygen and nutrients to keep them alive and producing biologic drugs for several weeks.
With more work, living pharmacies hold the potential to treat chronic conditions with a single, long-lasting therapy—bypassing the need for patients to carry, inject or remember to take medications.
Design of a wireless, fully implantable platform for in-situ oxygenation of encapsulated cell therapies, Device (2026). DOI: 10.1016/j.device.2026.101106
on Saturday
Dr. Krishna Kumari Challa
Scientists testing new scanning technology discover mysterious structure beneath an ancient Egyptian city
Archaeologists working in Egypt's Nile Delta may have discovered a tomb or temple dating back around 2,600 years while testing a new technology designed to locate structures buried deep beneath the surface. The team was studying the Buto (Tell el-Fara'in) site, the ruins of an ancient city that was occupied from the Predynastic period (around 3800 BCE) to the Early Islamic era (7th century CE).
During its long history, Buto was built, destroyed, and rebuilt, and consequently has several layers, each potentially a rich source of archaeological remains. However, the oldest parts of the city are now buried beneath later ruins and thick mud deposits. This makes traditional digging difficult, time-consuming, and expensive, as archaeologists have to move tons of debris or struggle with groundwater to reach lower levels. The new technology helps guide them where to dig.
The researchers were trialing a multipronged approach that uses satellite radar (SAR) and electrical resistivity tomography (ERT).
The team started with the Sentinel-1 radar satellite to identify large-scale anomalies from space, as they describe in a paper published in the journal Acta Geophysica. When the satellite data flagged something of interest, they followed up with ERT. This technique sends electrical currents between a series of electrodes placed in the ground to build a 3D model of subsurface structures based on how well different materials conduct or resist electricity. It has been likened to an underground CT scan.
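To illustrate the principle behind ERT (the article does not describe the team's actual inversion software, so this is only a sketch), the apparent resistivity measured by one common electrode layout, the Wenner array, follows directly from the electrode spacing, the injected current, and the measured voltage:

```python
import math

def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
    """Apparent resistivity (ohm·m) for a Wenner array.

    For four equally spaced electrodes the geometric factor is
    k = 2 * pi * a, so rho_a = k * (delta V / I).
    """
    k = 2 * math.pi * spacing_m
    return k * voltage_v / current_a

# Hypothetical reading: 5 m spacing, 0.25 V measured for 0.1 A injected
rho = wenner_apparent_resistivity(5.0, 0.25, 0.1)  # ~78.5 ohm·m
```

A field survey combines thousands of such readings at different spacings and positions; jointly inverting them is what produces the 3D "CT scan" of the subsurface.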
In the top three meters, the scans showed a layer of broken pottery and debris from the later Roman and Ptolemaic periods. However, at a depth of between three and six meters, the technology revealed a large, well-defined structure from the Saite period (around 2,600 years ago).
At this stage, the researchers thought it could be a large tomb or shrine. Next, they carried out a small 10 x 10 meter excavation, which uncovered mudbrick walls and religious artifacts where the sensors had predicted.
"The results of this study demonstrate the effectiveness of combining geophysical measurements and remote sensing data, which gave a very accurate vision in detecting buried settlements in a complex region," wrote the team in their paper.
Mohamed A. R. Abouarab et al, Multi-scale detection of buried archaeological elements across different occupation phases: an integrated approach using radar satellite imagery and electric resistivity tomography at Buto, northwestern Nile Delta of Egypt, Acta Geophysica (2026). DOI: 10.1007/s11600-026-01809-4
on Saturday
Dr. Krishna Kumari Challa
Major volcanic eruptions might be driven by gas dissolving back into magma
Understanding what triggers large volcanic eruptions is crucial for hazard assessment, but the exact mechanism driving these eruptions is still poorly understood. The prevailing theory is that volatile exsolution—gas coming out of magma—is a main driver of eruptions, particularly in volcanoes rich in silica. However, a new study, published in Nature Communications, posits that it is actually gas being dissolved back into the magma that leads to the pressurization needed for large eruptions.
Previous research has emphasized volatile exsolution as an eruption driver via rising pressure in magma chambers. As magma ascends or cools, gas solubility drops, and dissolved gases such as water vapor, carbon dioxide and sulfur separate from the silicate melt to form bubbles. This can create significant magmatic overpressure, which drives volcanic eruptions. Some previous studies have found that exsolved gases in large volcanic systems can actually buffer pressure, making eruptions less frequent but larger when they do occur.
"However, for volatile exsolution to act as primary eruption trigger, exsolution must outpace both volatile loss by passive degassing and viscous relaxation of the crust. This, however, requires rapid crystallization rates that are difficult to maintain in larger, thermally buffered reservoirs. In large silicic systems, exsolved volatiles may therefore exert a primary control on magma compressibility and chamber growth, rather than directly triggering eruptions," the study authors explain.
The new study explores the opposite process, called volatile resorption, in which gases dissolve back into the magma. Resorption reduces magma compressibility: a bubble-poor magma is stiffer, so each new injection of recharge raises chamber pressure more sharply. The team argues that volatile resorption can therefore rapidly pressurize large silicic magma chambers, potentially triggering eruptions faster than volatile exsolution.
The study authors say the simulations they conducted show resorption results in eruption faster than cases involving exsolution.
Franziska Keller et al, Volatile resorption expedites eruption onset in large silicic systems, Nature Communications (2026). DOI: 10.1038/s41467-026-70206-8
on Saturday
Dr. Krishna Kumari Challa
JWST solves decades-long mystery about why Saturn appears to change its spin
Saturn has puzzled scientists for many years. Measurements taken by NASA's Cassini spacecraft in 2004 suggested the planet's rotation rate was slowly changing over time—yet this should not have been possible, as a planet cannot simply speed up or slow down its spin.
Researchers now have used the most powerful space telescope ever built to answer one of the longest-standing puzzles in planetary science—why does Saturn appear to spin at a different speed depending on how you measure it? The findings, published in the Journal of Geophysical Research: Space Physics, reveal for the first time the complex patterns of heat and electrically charged particles in Saturn's aurora, and show that the entire system is driven by a self-sustaining feedback loop powered by the planet's own northern lights.
Using the James Webb Space Telescope (JWST), the team observed Saturn's northern auroral region—the equivalent of Earth's northern lights—continuously for a full Saturnian day, capturing detailed measurements that were simply not possible with any previous instrument.
By analyzing the infrared glow from a molecule called the trihydrogen cation (H3+), which forms in Saturn's upper atmosphere and acts as a natural thermometer, the researchers were able to produce the first high-resolution maps of both temperature and particle density across Saturn's auroral region.
The level of detail was extraordinary. Previous measurements had errors of around 50 degrees Celsius, roughly on a par with the differences the scientists were trying to detect, and were produced by combining broad regions of the hot polar aurora. The new JWST data was ten times more accurate than previous measurements, allowing the team to map fine details of heating and cooling across Saturn's auroral region for the very first time.
What the team found was that these temperature and density patterns match remarkably well with predictions made by computer models more than a decade ago, but only if the source of heat is placed exactly where the main auroral emissions enter the atmosphere.
This means Saturn's aurora is not just a visual display—it is actively heating the atmosphere in a specific location. That localized heating drives winds, which in turn generate the electrical currents responsible for the aurora. The aurora then heats the atmosphere again, sustaining the whole cycle.
Part 1
on Saturday
Dr. Krishna Kumari Challa
"What we are seeing is essentially a planetary heat pump. Saturn's aurora heats its atmosphere, the atmosphere drives winds, the winds produce currents that power the aurora, and so it goes on. The system feeds itself."
"For decades, we knew something strange was happening with Saturn's apparent rotation rate, but we could not explain it. We then showed it was being driven by atmospheric winds, but we still did not know why those winds existed. These new observations, made possible by JWST, finally give us the evidence we needed to close that loop."
The findings also have broader implications. The research suggests that what happens in Saturn's atmosphere directly influences conditions in its surrounding magnetosphere—the vast region of space shaped by the planet's magnetic field—which in turn feeds energy back into the system. This two-way relationship between the atmosphere and magnetosphere may help explain why the effect is so stable and long-lasting.
Tom S. Stallard et al, JWST/NIRSpec Reveals the Atmospheric Driver of Saturn's Variable Magnetospheric Rotation Rate, Journal of Geophysical Research: Space Physics (2026). DOI: 10.1029/2025ja034578
Part 2
on Saturday
Dr. Krishna Kumari Challa
New study uncovers mechanism explaining how obesity increases cancer risk
Obesity increases cancer risk by causing organs such as the liver, kidneys, and pancreas to enlarge primarily through an increase in cell number (hyperplasia), not just cell size. This expansion raises the number of cells susceptible to mutations, thereby elevating cancer risk. Organ size may predict cancer risk more accurately than BMI, highlighting the importance of maintaining a healthy weight early in life.
The study, published in Cancer Research, reveals a major driving force behind how obesity increases cancer risk across multiple organs.
The findings emphasize the importance of maintaining a healthy weight from early childhood and propose a potentially more accurate way than body mass index (BMI) to predict the increase in cancer risk associated with obesity.
The study reveals that excess weight doesn't just affect metabolism or hormones—it can physically enlarge organs, creating more opportunities for cancer to take hold. Understanding that process matters because it helps explain how everyday health choices can shape cancer risk years or even decades down the line.
In other words, as a person gains weight, their organs also grow in size by accumulating more cells to meet the higher energy needs of a bigger body. Having more cells boosts the odds of more DNA errors as cells divide, increasing the likelihood of cancer.
To test this hypothesis, researchers conducted a two-pronged study.
First, the team evaluated 747 adults whose weight in relation to their height spanned the complete BMI spectrum, from underweight (BMI below 18.5) all the way to severely obese (BMI of 40-plus). Using CT scans, the researchers measured the size of each adult's liver, kidneys, and pancreas.
This study is the first to analyze the size of multiple organs in a large cohort of individuals across the full BMI spectrum.
The scientists discovered that the organs grew larger as body weight increased. For every 5-point increase in BMI, the liver grew by 12%, kidneys by 9%, and the pancreas by 7%.
Next, the research team counted the cells in samples of kidney tissue taken from autopsies and reanalyzed biopsy data from living patients. The lab showed that more than 60% of the kidneys' growth resulted from an increase in the number of cells in the organ, a process called hyperplasia. The rest was due to individual cells growing bigger, or hypertrophy.
The new finding corrects earlier theories that larger organ size in obese individuals resulted primarily from fatter cells. Rather, obesity mainly increases the number of cells at risk for copying errors, uncontrollable growth, and potential malignancy.
This increase in organ size has harmful consequences.
The more cells in an organ, the more mutations and the greater the risk of one cell going awry during division and becoming cancerous.
Overall, the study showed a strong link between organ size enlargement and cancer risk across all three organs, confirming the mathematical predictions. The finding provides evidence for this as a major mechanism of tumorigenesis induced by obesity, in addition to factors like inflammation and hormonal imbalances.
This newly discovered effect of obesity can be large—with organs even doubling in size.
"When an organ doubles in size, it is expected to roughly double its risk of developing cancer," say the researchers.
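The reported growth rates lend themselves to a back-of-envelope estimate. The sketch below assumes the per-5-point growth compounds multiplicatively and that risk scales roughly linearly with cell number, as the researchers' remark suggests; neither detail is spelled out in the article:

```python
def organ_size_multiplier(bmi_start, bmi_end, pct_per_5_bmi):
    """Compound the reported percent growth per 5-point BMI increase."""
    steps = (bmi_end - bmi_start) / 5.0
    return (1 + pct_per_5_bmi / 100.0) ** steps

# The liver grew ~12% per 5 BMI points; going from BMI 22 to 37 is 3 steps
liver = organ_size_multiplier(22, 37, 12)   # ~1.40x larger organ

# If >60% of that growth is extra cells (hyperplasia), and cancer risk
# tracks cell number, the at-risk cell count rises by roughly:
extra_cells = 1 + 0.6 * (liver - 1)         # ~1.24x more cells
```

Under these assumptions, a 15-point BMI increase would put roughly a quarter more liver cells at risk of acquiring a cancerous mutation.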
Noting the relationship between diet and cancer, the authors emphasized the importance of maintaining a healthy weight from a young age.
Sophie Pénisson et al, Hyperplasia Functions as a Link Between Obesity and Cancer, Cancer Research (2026). DOI: 10.1158/0008-5472.can-25-2487
on Saturday
Dr. Krishna Kumari Challa
Syncing Your Blinks
Our eyes and brains sync to the music we are listening to!
Walk down the street while listening to your favourite song and you may notice you are stepping in time to the beat. People instinctively blink to the beat, too. In a recent study, researchers observed that dozens of non-musicians blinked in sync with the beat structure of Bach pieces, without being asked to.
Our bodies respond to music in a lot of weird ways. For example, one Japanese study determined what types of songs will spontaneously make people bop up and down, and which make them sway side-to-side. In the case of blinking to the beat, the scientists found that when study participants were distracted by a screen, they were less likely to sync their blinks. It might mean that active listening is required to incite these involuntary effects.
The finding makes sense because music activates the motor areas of the brain. Even if we’re just sitting still—and not bopping along to the beat—there can often be this sense of motion.
https://pmc.ncbi.nlm.nih.gov/articles/PMC12626317/#:~:text=This%20d....
https://journals.plos.org/plosbiology/article?id=10.1371/journal.pb...
on Sunday
Dr. Krishna Kumari Challa
How metformin treats diabetes
on Sunday
Dr. Krishna Kumari Challa
Your Blood Type May Increase Risk of an Early Stroke
on Sunday
Dr. Krishna Kumari Challa
Why cultivating drought-resistant plants disappoints: Soil physics may be the real bottleneck
Water uptake in plants is primarily limited by soil properties, specifically capillary and viscous forces in soil pores, rather than by plant physiology. As soil dries and water potential drops below -1.5 MPa, plants cannot extract water efficiently, regardless of their internal adaptations. This explains why breeding for drought resistance by altering plant traits has had limited success.
Most water in the soil exists in pores of varying sizes. These pores exert a capillary force that holds water.
Soil physicists discovered that when the soil water potential falls below -1.5 megapascals, plants are unable to extract water fast enough to meet their needs.
In other words, when soil dries, capillary and viscous forces in the pores increase—and plants find it harder to draw water from the soil.
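The −1.5 MPa threshold can be connected to pore size with the standard Young–Laplace relation for capillarity. This is a minimal sketch assuming water-filled, fully wetting cylindrical pores (an idealization for illustration, not a calculation from the paper):

```python
import math

SURFACE_TENSION = 0.072  # N/m, water at roughly room temperature

def capillary_pressure_pa(radius_m, contact_angle_deg=0.0):
    """Young-Laplace capillary pressure (Pa) across a meniscus in a
    cylindrical pore of the given radius, for a wetting liquid."""
    return 2 * SURFACE_TENSION * math.cos(math.radians(contact_angle_deg)) / radius_m

def radius_for_pressure(pressure_pa):
    """Largest pore radius that still retains water at this tension."""
    return 2 * SURFACE_TENSION / pressure_pa

# Pores that still hold water against a -1.5 MPa water potential
r = radius_for_pressure(1.5e6)  # ~9.6e-8 m, i.e. about 0.1 micrometers
```

So at the −1.5 MPa threshold, only pores narrower than roughly 0.1 µm still hold water by capillarity, which is why extraction becomes so slow in dry soil regardless of the plant's internal adaptations.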
Stomata are super sensitive. Plants have special structures on the underside of their leaves known as stomata that function as an interface for gas exchange. These are small valves that the plant opens and closes in response to fluctuating environments.
When they are open, carbon dioxide from the air can flow into the leaf while water can escape into the atmosphere as vapor.
When the plant closes its stomata, it conserves water. This prevents it from dying of thirst. However, when the stomata are closed, the plant faces starvation because less carbon dioxide enters its leaves, meaning it produces fewer new sugar molecules. As a result, it grows more slowly.
Ultimately, the behaviour of these tiny valves determines how much carbon from the atmosphere enters the land plant biomass.
A plant requires considerable energy to draw water from soil pores. For example, the cell walls of the tubes through which water rises in shoot stems or tree trunks are thickened.
This enables them to withstand the tension in the vascular system and not collapse.
Further up in the leaves, dissolved substances in plant cells generate osmotic pressure, which keeps cells turgid despite the high tension in neighbouring vascular tissues.
The agricultural industry has long attempted to breed plants that store more solutes in their cells, hoping this would help them absorb water more efficiently from the soil and thus better withstand drought. Although a substantial amount of money has been invested in such breeding programmes, these hopes have never been realized.
The new results explain this failure: the limiting factor lies not in the plants but in the soil.
The physics of capillarity not only predicts the extent to which soil pores empty but also what occurs high up in the leaves.
Andrea Carminati et al, Soils drive convergence in the regulation of vascular tension in land plants, Science (2026). DOI: 10.1126/science.adx8114
on Sunday
Dr. Krishna Kumari Challa
Bird‑like robots promise greater flexibility and control than drones
Engineers have developed a bird-like ornithopter with flexible wings powered by piezoelectric materials, eliminating the need for motors, gears, or mechanical linkages. This solid-state design enables flapping and twisting motions, offering greater maneuverability and potential for applications such as search and rescue. Advanced modelling integrates flight physics, aiding future design optimization.
Xin Shan et al, Multi-physics finite state fully coupled modeling of mechanism-free induced-strain actuated ornithopters, Aerospace Science and Technology (2025). DOI: 10.1016/j.ast.2025.110573
on Sunday
Dr. Krishna Kumari Challa
Jamming bacterial communications, instead of killing the microbes, might provide long-lasting treatment
Targeting bacterial quorum sensing, rather than killing bacteria directly, offers a promising strategy against multidrug-resistant Pseudomonas aeruginosa. Screening FDA-approved drugs identified molecules, including Vorinostat, that inhibit the QS protein PqsE and reduce its enzymatic activity. Structural modifications may further disrupt bacterial communication and impair infection capability.
Traditional antibiotics kill bacteria or target their growth and replication. While this strategy has worked well for decades, it can lead to resistance—when mutated bacteria survive and continue to replicate, unbothered by the treatment.
In recent years, scientists have begun investigating another idea: What if we don't kill the bacteria but instead interfere with their communications and prevent them from launching a coordinated attack on the host? This approach might make it harder for bacteria to develop resistance. It's the foundation of a new approach, which targets quorum sensing (QS).
Bacteria communicate with a chemical language, emitting molecules into their environment.
This is like radio communications among soldiers on a covert mission. Once they realize enough of their comrades have landed, they can start to connect and behave as a unit. Bacteria do something very similar.
Once a quorum has been reached, they begin to change their behaviour, moving from acting as individuals to acting as a group. They form colonies and secrete toxins. It's this coordinated attack that ultimately makes us sick.
By understanding and interfering with how bacteria talk to each other, researchers hope to develop a treatment for deadly bacterial diseases.
Hannah A. Jones et al, Screening of FDA-Approved Small Molecules to Discover Inhibitors of the Pseudomonas aeruginosa Quorum-Sensing Enzyme, PqsE, Biochemistry (2026). DOI: 10.1021/acs.biochem.5c00475
on Monday
Dr. Krishna Kumari Challa
Altered colony chemistry reveals a process that destroys termite societies
Termite colony collapse is linked to the accumulation of uric acid in worker termites, particularly following changes in reproductive hierarchy. Elevated uric acid reduces reactive oxygen species (ROS) levels, weakening immune responses and increasing susceptibility to infections, which ultimately leads to colony decline. This process highlights the importance of internal chemical balance for colony stability.
Takao Konishi et al, What kills a society: accumulation of uric acid increases infectious disease risk in termites, Proceedings of the Royal Society B: Biological Sciences (2026). DOI: 10.1098/rspb.2025.2438.
on Monday
Dr. Krishna Kumari Challa
Why drawing eyes on food packaging could stop seagulls stealing your chips
Displaying eye-like images on food packaging can deter some herring gulls from approaching and stealing food, as these birds are slower to approach and less likely to peck at boxes with eyes compared to plain ones. This response likely stems from an instinctive wariness of being watched, a behavior observed in many animals. However, the deterrent effect is not universal, with only about half of gulls consistently avoiding the eye-marked packaging.
https://theconversation.com/why-drawing-eyes-on-food-packaging-coul...
on Monday
Dr. Krishna Kumari Challa
Deuterium-labeled guinea pig helps scientists study metabolism
A guinea pig was raised exclusively on heavy water, resulting in deuterium incorporation into its biomolecules. Using mass spectrometry, the dynamics of deuterium labeling in various compounds were tracked, revealing synthesis rates and dietary contributions. This approach enables precise metabolic studies and supports the development of personalized nutrition strategies.
Scientists have raised the world's only isotope-labeled guinea pig. For 156 days, the animal, named Khryun, was given only heavy water to drink. Such water is non-radioactive and has long been used in biomedical research as a way to "label" molecules: the natural isotope deuterium accumulates in the chemical bonds of organic compounds and serves as a tracer for tracking their formation and breakdown. This approach can be useful for studying human metabolism, including the development of personalized medicine methods. The research results are published in the International Journal of Molecular Sciences.
Chemical elements exist in nature as isotopes—atoms with the same number of protons but different numbers of neutrons. In heavy water, ordinary hydrogen atoms are replaced by its heavier isotope, deuterium. Isotopes have virtually identical chemical properties, allowing them to be used as invisible tracers. This very approach—replacing ordinary compounds with labeled ones and tracking their transformations—forms the basis for deciphering most biochemical processes.
To track how the isotopic labels became incorporated into various biological molecules, the researcher used high-performance liquid chromatography combined with high-resolution mass spectrometry.
Mass spectrometry enables precise determination of the masses of the molecules present in a sample and distinguishes isotopes by their weight, making the method indispensable for such studies. The rate at which the label appears in a given compound reflects the intensity of its synthesis in the body.
The study determined the timeframes over which deuterium content in various substances reached steady-state levels. The final isotope content in each compound indicates to what extent it is synthesized by the body itself versus derived from food.
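Turnover rates of this kind are commonly extracted with a first-order labeling model, in which enrichment approaches its plateau exponentially. A minimal sketch under that assumption (the paper's exact kinetic model may differ, and the numbers below are hypothetical):

```python
import math

def label_fraction(t_days, f_steady, k_per_day):
    """First-order approach of deuterium enrichment to its plateau:
    f(t) = f_ss * (1 - exp(-k * t))."""
    return f_steady * (1 - math.exp(-k_per_day * t_days))

def turnover_rate(t_days, measured, f_steady):
    """Invert the model: estimate k from a single enrichment measurement."""
    return -math.log(1 - measured / f_steady) / t_days

# Hypothetical lipid at half its 4% plateau enrichment after 7 days
k = turnover_rate(7.0, 0.02, 0.04)   # = ln(2)/7 per day
half_life = math.log(2) / k          # 7 days
```

Compounds that reach their plateau quickly are being synthesized rapidly by the body; those whose plateau falls short of the drinking water's enrichment are partly supplied preformed from the diet.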
Oat shoots grown on heavy water were also produced. After Khryun ate them, the researcher determined how quickly the isotopically labeled substances from the oats became incorporated into the animal's own biological molecules.
The study established a methodological framework for using isotopically labelled food to investigate individual metabolic characteristics, opening up enormous possibilities for metabolic control.
Yury Kostyukevich et al, Turnover Rate of Lipids, Metabolites and Proteins Revealed by 156-Day-Long D2O Administration in a Guinea Pig, International Journal of Molecular Sciences (2026). DOI: 10.3390/ijms27041944
on Monday
Dr. Krishna Kumari Challa
Mild hypoxia after premature birth may disrupt hippocampal communication, mouse study suggests
Mild hypoxia after premature birth impairs learning and memory into adulthood by disrupting neuron-to-neuron communication in the hippocampus. This effect involves altered function of a specific protein channel and a second regulatory protein. Restoring the second protein's function in adult mice reversed the channel impairment, indicating potential therapeutic targets for hypoxia-induced cognitive deficits.
During intensive care after preterm births, babies can experience low oxygen in their tissue and cells—or hypoxia. Hypoxia is linked to poor brain health outcomes and life-long memory issues, but the mechanisms are unclear.
Researchers discovered a contributing mechanism by creating a mouse model for mild hypoxia following premature birth.
This new study explores how mild hypoxia may alter brain development without direct brain injury in this neonatal period.
As presented in their JNeurosci paper, mild hypoxia shortly after birth hindered learning and memory into adulthood, and the researchers discovered, at least in part, the mechanism for this effect: altered neuron-to-neuron communication in the hippocampus.
Probing a molecular mechanism, the researchers found that hypoxia following premature birth affected a protein channel involved in neuron-to-neuron communication and memory that develops in the hippocampus during adolescence. They also identified a second protein that was involved in hypoxia's effects on the channel's functioning.
When the researchers targeted this second protein in adult mice, they restored the channel's function.
According to the authors, this work sheds light on how hypoxia in preterm babies influences neuron communication in memory-related brain regions to hinder learning and memory into adulthood.
Mild Neonatal Hypoxia Targets Synaptic Maturation, Disrupts Adult Hippocampal Learning and Memory, and is Associated with CK2-Mediated Loss of Synaptic Calcium-Activated Potassium Channel KCNN2 Activity, JNeurosci (2026). DOI: 10.1523/JNEUROSCI.1643-25.2026
on Monday
Dr. Krishna Kumari Challa
Boosting good gut bacteria population through targeted interventions may slow cognitive decline
The gut is essentially an anaerobic ecosystem teeming with a vast mix of microorganisms, including bacteria, fungi, and protozoa, collectively called the microbiota. These organisms settle across different regions of the gut lining and together form the bulk of the human microbiome, spanning over 1,000 microbial species.
Several studies have established that gut microbiota play a key role in brain development and cognition. As people age or maintain poor diets, the community of bacteria in the gut can shift from helpful to harmful, a state known as dysbiosis. Such imbalances in gut bacteria may quietly fuel brain decline by triggering inflammation, weakening the brain's protective barrier, and allowing harmful proteins linked to Alzheimer's to build up.
The origin of neurodegenerative diseases like Alzheimer's or dementia isn't limited to the brain. The state of your gut can quietly set off a cycle of chronic, system-wide inflammation that nudges the brain toward cognitive decline. But how does the pathogenesis of a disease that seems purely brain-based begin in the gut—an organ that is mostly busy producing chemicals for digesting food?
It turns out these two entities are linked by the gut-brain axis, a two-way communication superhighway that constantly sends signals between the digestive tract and the central nervous system. It runs on chemical messengers like neurotransmitters and fatty acids, sharing information that shapes our memory, mood, and inflammation triggers.
An analysis of 15 studies involving more than 4,200 participants found that the gut-brain highway can be put to work as a drug-free route to support cognitive health. Tuning the gut microbiota through diet, supplements, or medical interventions such as fecal microbiota transplantation (FMT) can help improve memory, executive function, and overall cognitive performance, particularly in early or mild cases of cognitive impairment.
The credit for this protective effect goes to an increase in beneficial gut bacteria that produce compounds capable of slowing cognitive decline and reducing inflammation. The findings are published in Nutrition Research.
Sofia Libriani et al, The association between gut microbiota and cognitive decline: A systematic review of the literature, Nutrition Research (2026). DOI: 10.1016/j.nutres.2026.01.003
on Tuesday
Dr. Krishna Kumari Challa
Human brain operates near, but not at, the critical point
A recent study published in Physical Review Letters reveals that many widely used signatures of criticality in brain data may be statistical artifacts. The researchers propose a more robust framework that, when applied to whole-brain fMRI data, confirms the brain operates near, but not exactly at, a critical point.
Neuroscientists have long found the idea fascinating—that the brain operates near a "critical point," a phase transition between stable and chaotic dynamics. Theory suggests this sweet spot enhances computational flexibility, dynamic range, and sensitivity to inputs. Evidence has mounted over the years from neural recordings showing approximate scale invariance and power-law behavior across spatiotemporal scales.
Operating near a critical point can retain many of the proposed computational benefits, such as rich multiscale collective modes and strong but controllable amplification, while avoiding the drawbacks of sitting exactly at criticality, where small perturbations can lead to instability, runaway activity, or reduced robustness, the researchers say.
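The difference between "near" and "exactly at" criticality can be illustrated with a toy branching-process model, a standard pedagogical stand-in for neural avalanche dynamics (not the statistical framework used in the paper):

```python
def expected_activity(n0, branching_ratio, generations):
    """Mean activity of a branching process after t generations:
    E[N_t] = N0 * m**t, where m is the branching ratio (average
    number of units activated by each active unit)."""
    return n0 * branching_ratio ** generations

# Slightly subcritical (near-critical) vs exactly critical dynamics
near_critical = expected_activity(100, 0.98, 50)   # decays slowly, ~36
critical      = expected_activity(100, 1.00, 50)   # persists at 100
```

With m just below 1, activity still reverberates across many timescales (the computational benefit) but always dies out, whereas exactly at m = 1 small perturbations can persist indefinitely, consistent with the stability argument the researchers make.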
Rubén Calvo et al, Robust Scaling in Human Brain Dynamics Despite Correlated Inputs and Limited Sampling Distortions, Physical Review Letters (2026). DOI: 10.1103/36v9-wtm8.
on Tuesday
Dr. Krishna Kumari Challa
Graphene oxide kills bacteria while sparing human cells
Hygiene in everyday items that touch the body—such as clothing, masks, and toothbrushes—is critically important. The underlying principle of how graphene selectively eliminates only bacteria has now been revealed. In Advanced Functional Materials, a research team presents a candidate next-generation antibacterial material that is safe for the human body and could replace antibiotics.
The research team has identified the mechanism by which graphene oxide (GO) exhibits powerful antibacterial effects against bacteria while remaining harmless to human cells.
Graphene oxide is a nanomaterial consisting of an atom-thick carbon layer (graphene) decorated with oxygen groups; it mixes well with water and can be tailored to a variety of functions.
This study is highly significant as it provides molecular level proof of graphene's antibacterial action, which had not been clearly understood until now.
The research team confirmed that graphene oxide performs "selective antibacterial action" by attaching to and destroying only the membranes of bacteria, much like a magnet attaches only to specific metals, while leaving human cells untouched. This occurs because the oxygen functional groups on the surface of graphene oxide selectively bind with a specific component (POPG) found only in bacterial cell membranes.
Simply put, it recognizes a "target" present only in bacterial membranes to attach and destroy the structure. In this context, phospholipids are fatty components that make up the membrane surrounding a cell, and POPG is a component primarily present in bacteria.
Furthermore, fibres using this material maintained their antibacterial functions even after multiple washes, showing potential for use in various industrial fields such as apparel and medical textiles.
This technology is already being applied to consumer products.
Sujin Cha et al, Biocompatible but Antibacterial Mechanism of Graphene Oxide for Sustainable Antibiotics, Advanced Functional Materials (2026). DOI: 10.1002/adfm.74695
on Tuesday
Dr. Krishna Kumari Challa
Liquids can fracture like solids—researchers discover the breaking point
Simple liquids can fracture like solids when subjected to sufficient tensile stress, exhibiting brittle fracture at a critical stress of about 2 MPa. This behaviour is governed by viscosity rather than elasticity and appears consistent across different simple liquids, regardless of chemical composition. The finding challenges traditional views of fluid mechanics and suggests new avenues for manipulating liquids in various applications.
Thamires A. Lima et al, Unexpected Solidlike Fracture in Simple Liquids, Physical Review Letters (2026). DOI: 10.1103/t2vy-32wr
on Tuesday
Dr. Krishna Kumari Challa
Earth formed from material exclusively from the inner solar system, planetary scientists show
Analysis of isotopic data from meteorites indicates that Earth's material originated almost entirely from the inner solar system, with less than 2% contribution from beyond Jupiter. This suggests minimal exchange between inner and outer solar system reservoirs, likely due to Jupiter acting as a barrier. Most volatile elements, including water, must have been present in the inner solar system during Earth's formation.
Paolo A. Sossi et al, Homogeneous accretion of the Earth in the inner Solar System, Nature Astronomy (2026). DOI: 10.1038/s41550-026-02824-7
on Tuesday
Dr. Krishna Kumari Challa
Things to know about rare earth elements
Rare earth elements, comprising 17 metals including the lanthanides, are essential for modern technologies due to their unique magnetic, conductive, and optical properties. Despite their name, they are relatively abundant but challenging to mine safely. Global supply is not limited to one country, though China dominates processing. Recovering rare earths from mine waste offers a sustainable supply option.
What are rare earth elements? Where do they come from? What's the big deal?
1. Everyday devices are possible because of rare earths
The phrase "rare earth elements" generally refers to 17 chemical elements: scandium, yttrium and a 15-member family from atomic number 57 (lanthanum) to 71 (lutetium) called the lanthanides.
Many of them share magnetic, conductive and optical properties that make them useful as coatings and additives in alloys and glass and other materials used in a wide range of modern technology. These include jet engines, LED bulbs, fiber-optic cables, lasers and a lot of military technology.
"In some of those applications, it's safe to say rare earths are irreplaceable.
For example, neodymium and praseodymium make super powerful magnets that have enabled the miniaturization of technologies in phones and computers. These really powerful magnets make the magic happen in high-speed trains and MRI machines, too."
Not every application feels particularly high-tech. Seat belts in cars also use rare earth magnets.
"It's not due to a particular engineering need either. It turns out that when folks were developing the seat belt retracting mechanism, that was the type of magnet they had on the shelf."
2. 'Rare' is a misnomer
Rare earths are not, in fact, particularly rare. The "rare earth" name is a holdover from the 18th century, when yttrium was discovered in a mineral from a Swedish mine. These elements were "rare" then, because nobody had seen them before. But now we know they can be found around the globe.
"Seventeen elements is actually a sizable chunk of the periodic table.
We're talking about a fair amount of the stuff that makes up Earth's crust, from an elemental and mineralogical standpoint. The rare earths that we use most commonly are as abundant as copper or lead."
They're just not particularly fun to dig up.
"The geological conditions that cause rare earths to come together in higher concentrations can also concentrate radioactive materials. That makes them hard to mine safely, and can really increase costs."
That doesn't mean rare earths are expensive. They're actually relatively cheap, trading at prices far lower than precious metals like gold or platinum. In China, which has 30% of the world's proven rare earth reserves, mines typically discard as much as half of the rare earths they dig up, because prices aren't high enough to put the effort into recovering more.
Part 1
on Tuesday
Dr. Krishna Kumari Challa
3. Rare earths may seem so scarce because 'Avatar' was so popular
In December 2009, the sci-fi film "Avatar" was released, and it remained the most popular film in U.S. theaters for months. The plot was built around humans displacing a native race on another planet to make way for mining a fabulously valuable material called "unobtanium."
In 2010, in the real world, a diplomatic dispute led China to cut off Japan's access to rare earth elements—a very temporary blow (the embargo didn't even last as long as "Avatar" did as the No. 1 film) to Japanese tech manufacturers.
"There were headlines that said something like 'China cuts off access to unobtanium.'"
"Our popular imagination was kind of primed by the movie, and then this short-term crisis happened. The narrative—which has continued to support a lot of other politics over the years—stuck, and it's been hard to get unstuck."
4. It's unlikely one country would just turn off the rare earths tap
While China does have ample rare earths reserves, we know the elements are distributed all around the world. Aside from China's willingness to take on the environmental price of rare earths mining, the real source of the country's market dominance is the expertise and infrastructure it has developed to process what it mines.
"Where China does have an outsized share of the rare earth economy is in the crucial intermediate steps involved in transforming a rock in the ground into useful technological components."
Other countries and industries have supported the establishment of China's rare earths strength by continuing to trade for the materials, and maintaining those trading relationships is important for everyone.
"Price squeezes and supply chain concerns tend to be episodic rather than sustained. Buyers and sellers like to be connected. If you're a seller located in China with buyers located outside China, you don't want to be cut off. There's pressure in China to avoid longer-term trade wars that might hurt domestic businesses."
5. Abandoned mines could be a rare earths gold mine—and sustainable solution—for the U.S.
A recent study showed that much of the domestic demand for rare earths (and other important minerals) can be satisfied by recovering the rare earth elements from the waste piled up around old and active mines in the United States.
"A lot of these materials are already present in what was cast off by other mines. Maybe we could actually get what we need by cleaning up these long-standing, problematic, abandoned mine waste sites. It could literally be trash to treasure."
Source: https://news.wisc.edu/five-things-to-know-about-rare-earth-elements/
Part 2
on Tuesday
Dr. Krishna Kumari Challa
Binding to RNA is not enough—changing its shape is what makes a drug work, study reveals
Small molecules that merely bind to RNA rarely alter its function, whereas those that induce changes in RNA structure have a greater functional impact. Modulating RNA folding, rather than just binding, is crucial for effective RNA-targeting drugs. A new framework is proposed to identify small molecules capable of altering RNA structure, aiming to improve RNA-targeted drug development.
Chundan Zhang et al, RNA functional modulation by Mitoxantrone via RNA structural ensemble repartitioning, Nature Communications (2026). DOI: 10.1038/s41467-026-70801-9
on Tuesday
Dr. Krishna Kumari Challa
Soil bacteria break down toxic chemicals in the environment
Many aromatic compounds, such as phenols, cresols and styrenes, are toxic to organisms and harmful to the environment. They can accumulate as a result of industrial processes and harm ecosystems. Soil bacteria can help to break them down.
Soil bacteria such as Rhodococcus opacus 1CP possess large, redundant genomes encoding multiple enzymes that enable the breakdown of toxic aromatic compounds like phenols, cresols, and styrenes. These redundancies allow bacteria to adapt to varying environmental conditions and maintain pollutant degradation, even when specific enzymes are inactive, by activating alternative metabolic pathways.
Selvapravin Kumaran et al, Whole-genomic and transcriptomic analyses elucidate p-cresol and styrene degradation metabolism in Rhodococcus opacus 1CP, Applied and Environmental Microbiology (2026). DOI: 10.1128/aem.00045-26
on Tuesday
Dr. Krishna Kumari Challa
Why a man's health before pregnancy matters for the next generation
Men's health and life experiences before conception significantly influence pregnancy outcomes and child development. Factors such as age, nutrition, substance use, mental health, and environmental exposures can affect sperm and gene expression, impacting offspring health. Supportive partner relationships and early-life experiences also shape family well-being across generations.
on Tuesday
Dr. Krishna Kumari Challa
The four types of dementia most people don't know exist
Dementia encompasses over 100 types, with Alzheimer's disease accounting for about 60% of cases. Less common forms include posterior cortical atrophy, Creutzfeldt-Jakob disease, FTD-MND, and progressive supranuclear palsy, each presenting distinct symptoms beyond memory loss, such as visual, motor, or behavioral changes. Early recognition of these subtypes is crucial for appropriate care.
on Tuesday
Dr. Krishna Kumari Challa
Evacuating in 90 seconds? New simulations show the safest cabin layout
In case of an emergency, the U.S. Federal Aviation Administration (FAA) requires aircraft to be able to evacuate within 90 seconds. However, as the median age of the global population increases, the growing number of elderly airline passengers poses new challenges during emergency situations.
In AIP Advances, an international collaboration of researchers simulated 27 different evacuation scenarios in the case of a dual-engine fire in an Airbus A320, one of the most common narrow-body aircraft in the world. They compared three different cabin layouts with three different ratios of passengers over the age of 60 and three different distributions of those passengers.
While a dual-engine fire scenario is statistically rare, it falls under the broader category of dual-engine failures and critical emergencies in aviation.
In seeking the most efficient combination of factors, the researchers created full-scale computer-aided design models of the A320 cabin and used Pathfinder—the industry-standard software for evacuation modelling—to simulate passengers' behaviour. They found the proportion and location of elderly passengers have the largest effect on evacuation time.
The fastest option—a layout that accommodates a total of 152 passengers with two rows of first-class seats at the front, and 30 elderly passengers evenly distributed throughout the cabin—still required 141 seconds for all the passengers to reach the ground, much longer than the FAA's 90-second mandate.
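The study's 27 scenarios come from crossing three factors with three levels each. A minimal sketch of that factorial design is below; the layout names, ratio values and distribution labels are illustrative assumptions on my part, and the actual evacuation times come from Pathfinder simulations, not from any simple formula:

```python
from itertools import product

# Hypothetical 3 x 3 x 3 factorial grid matching the study's design:
# 3 cabin layouts x 3 elderly-passenger ratios x 3 seating distributions.
layouts = ["two-row first class", "three-row first class", "all economy"]
elderly_ratios = [0.10, 0.20, 0.30]          # assumed shares of passengers over 60
distributions = ["front", "rear", "even"]

# Every combination is one evacuation scenario to simulate.
scenarios = list(product(layouts, elderly_ratios, distributions))
print(len(scenarios))  # 27 scenarios in total
```

Enumerating the grid this way makes it easy to hand each combination to an evacuation simulator and compare the resulting times factor by factor.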
Previous studies have shown that cognitive decline in elderly populations can affect situational awareness and delay decision making, and that reduced dexterity can be exacerbated during high-stress situations.
The researchers hope that incorporating this information into their findings—for example, by offering additional safety briefings to elderly passengers—will help further accelerate the deboarding process.
Children, infants, and pregnant women also introduce unique physical capabilities and behaviours that add another vital layer to evacuation modelling, which the group plans to investigate in their future work.
Effect of elderly passenger distribution on A320 aircraft evacuation under dual-engine fire scenarios, AIP Advances (2026). DOI: 10.1063/5.0310405
yesterday
Dr. Krishna Kumari Challa
Why Does the Middle East Have the Largest Oil Reserves?
yesterday
Dr. Krishna Kumari Challa
Scientists capture atoms in motion
Researchers have captured the exact atomic movements that write data to next-generation memory devices, which could pave the way for smaller, faster and more energy-efficient electronics.
Using advanced electron microscopy, the research team captured atomic-scale movements inside promising memory materials, known as fluorite-type ferroelectrics, that could overcome current limits to how small and efficient memory devices can become.
Everyday technologies, such as smartphones, medical devices, wearable electronics and contactless IC cards used in public transport, store data as billions of digital 1s and 0s. In these materials, the physical position of an atom acts like a "switch"—and moving an atom just a fraction of a nanometer is what flips a data bit from a 0 to a 1.
This research shows exactly how that physical movement happens in real time. Until now, scientists couldn't directly see how this switching actually happened, in fractions of a second.
They discovered that switching doesn't happen in a single step, but through previously unseen intermediate atomic structures, and that the process can be controlled by changing the material's composition.
Kousuke Ooe et al, Direct observation of cation-dependent polarisation switching dynamics in fluorite ferroelectrics, Nature Communications (2026). DOI: 10.1038/s41467-026-70593-y
yesterday
Dr. Krishna Kumari Challa
Animals are powerful landscape engineers shaping the Earth's surface, global study finds
Wild animals significantly modify Earth's surface by altering soil and sediment through activities such as burrowing and feeding. A global meta-analysis analyzed data from 64 studies covering 61 species of wild animals across freshwater and terrestrial environments.
Animal activity increases soil porosity and reduces fine material, influencing erosion and landscape development. These effects are more pronounced in freshwater ecosystems (136% change) than terrestrial ones (66%), highlighting animals as key geomorphic agents.
Wild animals are not just inhabitants of the natural world. Many also act as natural landscape engineers, reshaping Earth's surface as they burrow, feed, and build shelters that move soil and sediment across ecosystems. From animals disturbing riverbeds to burrowing species redistributing soil, wildlife constantly modifies the physical structure of landscapes through everyday activities.
The research found that animals consistently increased the porosity of soils and sediments and reduced the amount of fine material present. These changes influence how water and sediment move through ecosystems and can affect processes such as erosion, river behaviour and landscape development.
The new study provides quantitative evidence of how strongly animal activity can modify geomorphic processes across ecosystems.
Z. Khan et al, Signatures of Wild Animal Life in Earth's Landscapes, Journal of Geophysical Research: Earth Surface (2026). DOI: 10.1029/2025jf008351
yesterday
Dr. Krishna Kumari Challa
Antibacterial soaps and wipes can fuel antimicrobial resistance, scientists warn
An international team of scientists is warning that everyday antibacterial soaps, wipes, sprays, and other "germ-killing" products are quietly contributing to the global rise of antimicrobial resistance (AMR) while providing no added health benefit for most consumer uses.
Widespread use of antibacterial soaps, wipes, and other consumer products containing biocides such as quaternary ammonium compounds is contributing to antimicrobial resistance (AMR) without providing added health benefits for most uses. These chemicals promote bacterial resistance, including cross-resistance to antibiotics, and persist in the environment. Health authorities recommend plain soap and water for routine handwashing.
The authors summarize numerous laboratory and real-world studies showing that environmental levels of these chemicals cause resistant bacteria to survive and spread, promote cross-resistance to important antibiotics, and cause lasting genetic changes to microbes, including the exchange of resistance genes.
Over time, these shifts can allow resistant strains to dominate. This translates to the spread of antibiotic resistant genes that threaten the effectiveness of antibiotics when we really need them and can contribute to rising deaths.
Evidence shows biocides in many consumer products provide no added health benefit, but the biocides do raise concerns about AMR and toxicity.
Targeting Biocide Overuse in Consumer Products Will Strengthen Global AMR Action, Environmental Science & Technology (2026). DOI: 10.1021/acs.est.5c17673
yesterday
Dr. Krishna Kumari Challa
Vibrations in your skull: A new tool for biometric authentication
A team of researchers has developed a security system that could change how people log in to virtual and augmented reality platforms by eliminating passwords, personal identification numbers and eye scans and replacing them with something far more seamless.
A new authentication system uses unique vibration patterns from breathing and heartbeats transmitted through the skull, detected by motion sensors in XR headsets, to identify users. This software-only approach achieved over 95% accuracy in authenticating users and over 98% in rejecting unauthorized access, offering secure, continuous, and hardware-free biometric verification.
The system, a software program called VitalID, is based on the team's discovery of a new biometric: tiny vibrations generated by breathing and heartbeats that resonate through the skull in patterns unique to each person's bone structure and facial tissues.
The human body is always moving in tiny ways, even when a person is sitting still. Each breath and each heartbeat create very small vibrations inside the body. Those vibrations travel up through the neck and into the head.
When they reach the skull, they cause it to vibrate slightly. Because every skull has a different shape, thickness and bone structure, the vibrations change in unique ways as they move through each person's head. Soft tissues in the face, such as muscle and fat, also influence how the vibrations travel.
As a result, each person produces a distinct vibration pattern. Motion sensors built inside virtual reality headsets can detect these tiny patterns and assess them like a fingerprint to determine who is wearing the device.
We do not need to add any device or additional hardware. It requires only software.
In testing across 52 users over a 10-month period using two popular XR headsets, the system correctly authenticated legitimate users more than 95% of the time and rejected unauthorized users more than 98% of the time.
The researchers built a filtering system that removes interference from extraneous head and body movement, allowing the headset to focus only on the tiny vibrations in the skull caused by breathing and heartbeat. They then used advanced computer models to analyze those vibration patterns.
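The band-isolation idea behind that filtering step can be sketched as follows. This is a minimal illustration, not the paper's VitalID pipeline: the band edges, the FFT-masking approach and the toy signal are all my assumptions, chosen only to show how slow drift and fast head motion can be separated from the low-frequency vital-sign band:

```python
import numpy as np

def isolate_vital_band(accel, fs, lo=0.1, hi=10.0):
    """Keep only the band where breathing (~0.1-0.5 Hz), heartbeat
    (~1-3 Hz) and their low harmonics live, zeroing everything else.
    Band edges are illustrative assumptions, not the paper's values."""
    spec = np.fft.rfft(accel)
    freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)
    spec[(freqs < lo) | (freqs > hi)] = 0.0   # mask out-of-band bins
    return np.fft.irfft(spec, n=accel.size)

# Toy headset signal: a faint 1.2 Hz "heartbeat" tone buried under
# much stronger 20 Hz head-motion interference.
fs = 100.0                                    # assumed sensor sample rate
t = np.arange(0, 10, 1.0 / fs)
heartbeat = 0.02 * np.sin(2 * np.pi * 1.2 * t)
interference = 0.5 * np.sin(2 * np.pi * 20.0 * t)
filtered = isolate_vital_band(heartbeat + interference, fs)
```

After filtering, the recovered trace closely matches the heartbeat component alone, which is the kind of cleaned signal the computer models would then compare against a user's enrolled vibration pattern.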
Because the vibrations travel internally through bone and tissue, they may also be more difficult to spoof. Someone might imitate another person's breathing rhythm but cannot easily replicate the biomechanical properties of another person's skull.
The headset would continuously confirm identity in the background simply by sensing the subtle vibrations that come with being alive.
Tianfang Zhang et al, Harnessing Vital Sign Vibration Harmonics for Effortless and Inbuilt XR User Authentication, Proceedings of the 2025 ACM SIGSAC Conference on Computer and Communications Security (2025). DOI: 10.1145/3719027.3765060
yesterday
Dr. Krishna Kumari Challa
AI's growing role in scientific peer review
Large language models can enhance scientific peer review by identifying objective errors and inconsistencies, improving draft quality, and alleviating reviewer workload. However, AI is less effective at subjective judgments, such as assessing novelty or significance. Human oversight remains essential, with transparency about AI involvement and accountability for final decisions. AI's role in science is expected to expand.
There is tremendous interest in using AI, especially language models, to support research and peer review and speed up the scientific process. A key advantage is that AI can act like a rapid, always-available critic, a sort of pre-submission review process—before scientists officially send in a paper for publication. AI can be quite good at assessing drafts for gaps and limitations, so researchers can preemptively address them. This can improve the quality of first drafts submitted for publication and reduce the back-and-forth later. And on the reviewer side, the pressure is real: As submissions grow, human reviewers are very much overburdened, which can lead to lower-quality reviews and frustration for authors.
In their tests, researchers found that besides spotting gaps and limitations, AI can be quite good at the more objective, verifiable aspects of review.
AI is strongest on objective, checkable inconsistencies and technical issues and weaker on subjective judgments about the novelty or significance of the research.
The researchers say that AI should support and inform—not fully replace—human decision-making.
A human, or team of humans, must make the final editorial decisions and scientists must stand behind the work. AI can offer comments on early drafts, point out omissions, and suggest improvements in the writing and the research—but the scientists must remain accountable for incorporating and synthesizing feedback from the AI and human reviewers.
Scientists have to be up-front about how and where AI has assisted in the research itself and in the writing and review of the papers. They should acknowledge exactly how AI was involved and what tools were used in the paper. It comes down to accountability and a clear chain of responsibility and that final decisions are still made by humans.
In one recent large-scale randomized experiment, the results of which were published in Nature Machine Intelligence, researchers provided AI assistance to human reviewers on roughly 20,000 reviews to assess AI's impact on review quality. Following on this work, many conferences and journals are now exploring using LLMs to assist the review process.
Nitya Thakkar et al, A large-scale randomized study of large language model feedback in peer review, Nature Machine Intelligence (2026). DOI: 10.1038/s42256-026-01188-x
yesterday
Dr. Krishna Kumari Challa
Premature and small births are linked to lifelong learning problems
Preterm birth and low birth weight are consistently associated with lower IQ and persistent educational disadvantages, particularly in mathematics, from early childhood into adulthood. The severity of these challenges increases with earlier gestational age and lower birth weight, underscoring the importance of early identification and ongoing support to improve long-term outcomes.
Being born early or at a lower weight is linked to lower IQ scores and poorer educational outcomes in school and beyond, according to a new study published in the journal JAMA Pediatrics.
In this research, known as an umbrella review, the team examined what previous studies had discovered about preterm birth and low birth weight and long-term development. This involved going back to the original numbers and recalculating the results using a single, consistent method to ensure accuracy. They looked at five different life stages, from babies under two years old to adults over 18.
This meta-analysis confirmed that both preterm birth and low birth weight are linked to disadvantages that persist over time. In particular, babies born before 28 weeks or weighing less than 1 kg at birth showed larger academic disadvantages on average than babies born at term with normal birth weight.
The most affected subject was math, with significant gaps in calculation and problem-solving skills. Stark differences were also seen in reading, comprehension, spelling and identifying words.
These challenges were often most visible during primary school and narrowed slightly during the teenage years. However, some of these learning difficulties reappear once a person reaches adulthood, as the study authors note in their paper. "These disadvantages generally increased with earlier gestational age and lower birth weight. Although some associations appeared to attenuate during adolescence, evidence of persistent disadvantages into adulthood was observed for several outcomes."
The research team believes their findings show that the impact of being born early or much smaller than average can have lifelong consequences. For some, this may mean fewer job opportunities or earning lower salaries than their peers.
Mingzheng Hu et al, Cognitive and Educational Outcomes After Preterm Birth or Low Birth Weight, JAMA Pediatrics (2026). DOI: 10.1001/jamapediatrics.2026.0533
yesterday
Dr. Krishna Kumari Challa
How time and space become one inside your brain—and what it means for Alzheimer's
If you develop Alzheimer's disease, you not only lose your sense of time, but you also lose your sense of place.
Neural circuits in the retrosplenial cortex process time and space using similar activity patterns, indicating these dimensions are integrated in the brain. This shared mechanism helps explain why both temporal and spatial orientation deteriorate together in Alzheimer's disease, highlighting the need to understand healthy episodic memory networks to address dementia.
All memories are made up of different components. You don't just remember what you had for dinner yesterday, but also the time and place. We often think of time and space as separate categories, a distinction created by philosophers and physicists that is incredibly practical for organizing our lives. But our brain cells don't see it that way.
These cells don't distinguish between a step forward in space or a second passing in time. Instead, they simply record a continuously changing stream of information from our senses, tracking events as they unfold. To the brain's internal network, time and place are effectively two sides of the same coin.
In Alzheimer's disease, it is therefore not surprising that both are affected; when the neural network is damaged, our sense of 'where' and 'when' begins to unravel together.
Remembering where, when and how something happened is called episodic memory. In your brain, billions of nerve cells form large networks, passing signals like a relay race to process information from your senses, the sounds, smells, and sights of your life.
We already know that cells which link memories to time and space are found in the hippocampus.
But this group of researchers had a theory that another area of the brain is also involved, namely the retrosplenial cortex. Located at the back of the cerebral cortex near the hippocampus, this area was previously only known for linking memories to place.
To test if this area also tracks time, the team designed a memory challenge for mice. The task required them to hold a specific odor in their "working memory" during a brief period. Their study is published in Cell Reports.
The most striking discovery was that the retrosplenial cortex uses the same "neural script" for both space and time. The researchers found that the sequence of neuronal activity in the retrosplenial cortex looks almost identical whether a mouse is physically running through a room or simply holding a memory in its mind for five seconds.
This discovery brings us back to the tragic reality of Alzheimer's disease, where those affected struggle to anchor themselves in both time and place. By showing that the brain uses the same "neural script" for both, this research explains why these two senses often fail together.
This work also challenges how we perceive the world around us. While we use the concepts of time and space to organize our lives, this distinction is largely a human construct. In fact, some modern theories in physics are moving away from using time and space as the fundamental building blocks of the universe. It appears the brain's internal wiring mirrors this deeper reality.
Anna Christina Garvert et al, Area-specific encoding of temporal information in the neocortex, Cell Reports (2025). DOI: 10.1016/j.celrep.2025.115363
yesterday
Dr. Krishna Kumari Challa
Unexplained sky flashes from the 1950s: Independent analysis supports their existence
Historical observations from an observatory in Germany have now independently verified evidence for brief, mysterious flashes of light in the night sky, first picked up by an American astronomical survey in the 1950s. Through fresh analysis of a German survey from the same period, independent researcher Ivo Busko, a now-retired developer at NASA, has uncovered striking new support for these puzzling signals. The results have been published as a preprint on arXiv.
In 2019, an international team of astronomers launched the VASCO Project, aiming to identify unusual phenomena hidden within vast archives of historical data. In particular, their work focused on astronomical transients: objects that suddenly appear in the sky in some images, but vanish in subsequent observations.
An especially exciting result emerged in 2025, when researchers analyzed photographic plates captured as part of the Palomar Observatory Sky Survey. Carried out in California throughout the 1950s, this ambitious program produced nearly 2,000 images of the night sky using long-exposure plates. Within these images, the team found clear evidence of transients with strange appearance and behavior, captured at a time that predates the launch of any human-made satellites.
Crucially, the spatial spread of light from these sources appeared too sharp to be explained by normal stars or distant astronomical objects. Combined with the way the plates recorded their brightness, the signals suggested that the flashes lasted for less than a second, despite being embedded within exposures lasting tens of minutes.
Unless they arise from some as-yet unknown astrophysical phenomenon, one especially captivating possibility remained: that the flashes were produced by artificial objects, either briefly orbiting Earth or passing nearby.
Part 1
5 hours ago
Dr. Krishna Kumari Challa
Until now, however, the Palomar observations had not been independently confirmed. To address this gap, Busko turned to a completely separate dataset: archival photographic plates taken at the Hamburg Observatory in Germany during the same period in the 1950s. These plates captured many of the same regions of sky and were later digitized by the APPLAUSE Archive, making them accessible for modern analysis.
By comparing pairs of plates taken in close succession—each exposed for around 30 minutes before being replaced—Busko was able to search for fleeting changes between images.
His results revealed clear evidence of transients that are remarkably similar to those reported by the VASCO team, providing the first independent confirmation of the phenomenon using a different method and dataset.
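The plate-pair comparison idea can be sketched in a few lines. This is a deliberately simplified toy, not Busko's actual pipeline: it assumes the two digitized plates are already perfectly aligned and photometrically matched, and the detector function and threshold are my own illustrative choices:

```python
import numpy as np

def find_transient_candidates(plate_a, plate_b, n_sigma=5.0):
    """A source present in one ~30-minute exposure but absent in the
    next shows up as a strong residual when the (aligned) images are
    subtracted. Flag pixels whose residual exceeds n_sigma times the
    standard deviation of the difference image."""
    diff = plate_a.astype(float) - plate_b.astype(float)
    threshold = n_sigma * diff.std()
    ys, xs = np.where(np.abs(diff) > threshold)
    return list(zip(ys.tolist(), xs.tolist()))

# Toy plates: identical sky background, plus one injected "flash"
# that appears only in the first exposure.
rng = np.random.default_rng(0)
background = rng.normal(100.0, 1.0, size=(64, 64))
plate_a = background.copy()
plate_b = background.copy()
plate_a[20, 33] += 50.0          # transient present only in plate A
candidates = find_transient_candidates(plate_a, plate_b)
```

Real archival searches must additionally register the plates, correct for emulsion defects and plate-to-plate sensitivity differences, and rule out ordinary explanations (asteroids, plate flaws) before a residual counts as a genuine transient.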
For now, only a small fraction of the Hamburg plates have been examined. But with further improvements to the analysis techniques, Busko is hopeful that more subtle examples of these flashes could be uncovered across the archive, strengthening the statistical significance of the findings.
Artificial objects?
While astronomers may never know exactly what caused these events, both the VASCO results and Busko's independent analysis point toward a consistent interpretation: that the flashes could have originated from flat, rotating objects orbiting close to Earth, briefly reflecting sunlight toward the ground. For some, this leaves open a more speculative possibility: that these mysterious signals may even hint at artificial objects which were sent to Earth deliberately.
Ivo Busko, Searching for Fast Astronomical Transients in Archival Photographic Plates, arXiv (2026). DOI: 10.48550/arxiv.2603.20407
Part 2
5 hours ago
Dr. Krishna Kumari Challa
Pesticides and cancer: Study reveals the biological mechanisms behind an environmental health risk
A new study, published in Nature Health, reveals a strong link between exposure to agricultural pesticides in the environment and the risk of developing cancer. By combining environmental data, a nationwide cancer registry, and biological analyses, researchers have shed new light on the role of pesticide exposure in the development of certain cancers.
Pesticides are widely present in food, water, and the environment, often in the form of complex mixtures. Until now, it has been difficult to accurately assess their effects on human health, as most studies focus on isolated substances and experimental models that are far removed from real-world exposure conditions.
This new study adopts an innovative, integrative approach that accounts for the complexity of real-world exposures experienced by populations.
This is the first time researchers have been able to link pesticide exposure, on a national scale, to biological changes suggesting an increased risk of cancer.
The study shows that certain tumors, although they affect different organs, share common biological vulnerabilities linked to their cellular origin that can be weakened by pesticide exposure. Notably, the liver is a key organ in the metabolism of chemicals and is considered a sentinel site for environmental exposure.
Molecular analyses conducted show that pesticides disrupt processes that help maintain cell function and cellular identity. These biological changes appear before cancer develops, suggesting early, cumulative, and silent effects. They could make tissues more vulnerable to other risk factors, such as infections, inflammation, or environmental stressors.
The results challenge conventional toxicological approaches, which are based on the evaluation of isolated substances and the establishment of thresholds considered safe. They highlight the importance of considering pesticide mixtures, environmental exposure, and real-world socio-ecological contexts.
Mapping pesticide mixtures to cancer risk at country scale with spatial exposomics, Nature Health (2026). www.nature.com/articles/s44360-026-00087-0
5 hours ago