Science Simplified!

                       JAI VIGNAN

All about Science - to remove misconceptions and encourage scientific temper

Communicating science to the common people

'To make them see the world differently through the beautiful lens of science'

  • Dr. Krishna Kumari Challa

    Hitler's DNA reveals possible genetic disorder tied to sexual and social behaviour

    Adolf Hitler most likely suffered from the genetic condition Kallmann Syndrome that can manifest itself in undescended testicles and a micropenis, researchers said this week, following DNA testing of the Nazi dictator's blood.

    The new research also quashes the suggestion that Hitler had Jewish ancestry.

    Popular World War II songs often mocked Hitler's anatomy but lacked any scientific basis.

    The findings by an international team of scientists and historians now appear to confirm longstanding suspicions around his sexual development.

    No one has ever really been able to explain why Hitler was so uncomfortable around women throughout his life, or why he probably never entered into an intimate relationship.

    But now that we know he most likely had Kallmann Syndrome, this could be the answer we've been looking for, the researchers say.

    The research findings are featured in a new documentary, "Hitler's DNA: Blueprint of a Dictator." 

    The testing found a "high likelihood" that Hitler had Kallmann Syndrome and "very high" scores—in the top one percent—for a predisposition to autism, schizophrenia and bipolar disorder.

    The research team stressed that such conditions, however, could not explain or excuse Hitler's warmongering or racist policies.

    The testing was made possible after researchers obtained a sample of Hitler's blood from a piece of material taken from the sofa on which he shot himself.

    The DNA results additionally rule out the long-standing rumour that Hitler had a Jewish paternal grandfather through his father's mother.

    Source: AFP

    **

  • Dr. Krishna Kumari Challa

    Randomized trials show no evidence of non-specific vaccine effects

    Researchers have conducted randomized trials involving thousands of children in Guinea-Bissau and Denmark to demonstrate so-called non-specific vaccine effects—that is, whether vaccines also protect against diseases other than the one they are designed to prevent.

    A new comprehensive Danish review now shows that the trials have been unable to demonstrate non-specific effects for the widely used vaccinations against measles, tuberculosis, diphtheria, tetanus, and whooping cough. The study has just been published in the journal Vaccine.

    It is concerning that such a prominent research group has conducted so many randomized trials over such a long period without finding real results. Randomized trials are normally considered the gold standard in medical research, so if they do not show anything, one should be very cautious about presenting it as convincing evidence.

    The new study is the first to systematically analyze all of the randomized trials by Christine Stabell Benn and Peter Aaby, the researchers behind the Bandim Health Project. While others have previously criticized individual studies, the team behind the new review examined the full body of work.

    The reviewers found indications that Benn and Aaby systematically selected and highlighted results that supported their theories, while downplaying the fact that the trials did not confirm the primary hypotheses they were actually designed to test. When you look at the overall picture, there are almost no real findings left.

    According to Benn and Aaby, the results meant it was time to change the global approach to vaccination: all new vaccines should routinely be assessed for non-specific effects, and vaccination programs should be revised worldwide.

    Part 1

  • Dr. Krishna Kumari Challa

    The new review is based on 13 randomized trials presented in 26 articles containing more than 1,400 separate statistical analyses. Only one of the 13 randomized trials demonstrated the effect it was designed to detect—and that trial was stopped early and was considered unsuccessful by the researchers themselves.

    The review showed that only about 7% of the many hypotheses tested by the researchers could be expected to be correct. Notably, this did not apply to the researchers' own primary hypotheses—these were not supported by the corrected results.

    In 23 out of 25 articles, the researchers highlighted secondary findings as support for their theories, but in 22 of these cases the evidence disappeared after proper statistical handling. Overall, the researchers' interpretation did not take into account how many analyses they had conducted, and they did not focus on the main outcomes of the trials.

    The researchers stress that the purpose was not to determine whether non-specific vaccine effects exist, but to examine Benn and Aaby's research practices.
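    The "proper statistical handling" at issue here can be illustrated with a toy multiple-comparisons correction (a minimal sketch, not the reviewers' exact method): when hundreds of analyses are run, a p-value that looks significant in isolation often fails the corrected threshold.

```python
# Toy illustration of multiple-testing correction (not the review's
# exact method): a nominally "significant" p-value can fail a
# Bonferroni adjustment for the number of analyses performed.

def bonferroni_significant(p_value, n_tests, alpha=0.05):
    """True if p_value survives a Bonferroni correction for n_tests."""
    return p_value < alpha / n_tests

p = 0.03                               # nominally significant on its own
print(bonferroni_significant(p, 1))    # True: a single pre-specified test
print(bonferroni_significant(p, 100))  # False: one of 100 analyses
```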

    "We hope that others in the field will now re-evaluate the evidence—what do we actually know about non-specific vaccine effects? Although Benn and Aaby have contributed about one-third of all research in the area, others have also studied the question, and this should be included to form a complete picture."

     Henrik Støvring et al, What is actually the emerging evidence about non-specific vaccine effects in randomized trials from the Bandim Health Project?, Vaccine (2025). DOI: 10.1016/j.vaccine.2025.127937

    Part 2

    **

  • Dr. Krishna Kumari Challa

    What puts the ‘cable’ in cable bacteria

    The ‘wires’ that cable bacteria use to conduct electricity seem to be made of repeating units of a compound that contains nickel and sulfur. Researchers found that these units make up ‘nanoribbons’, which are woven together like a plait to form the larger wires. The work has yet to be peer reviewed, but if confirmed, the finding could be the first known example of a biologically produced metal–organic framework. “If it holds true, this is a major step in our understanding of what cable bacteria can accomplish,” says electromicrobiologist Lars Peter Nielsen, who co-discovered the microorganisms in 2009.

    https://www.biorxiv.org/content/10.1101/2025.10.10.681601v1

    https://www.science.org/content/article/metal-scaffolds-turn-bacter...

  • Dr. Krishna Kumari Challa

    AI at the speed of light just became a possibility

    Researchers have demonstrated single-shot tensor computing at the speed of light, a remarkable step towards next-generation artificial general intelligence hardware powered by optical computation rather than electronics.

    Tensor operations are the kind of arithmetic that forms the backbone of nearly all modern technologies, especially artificial intelligence, yet they extend beyond the simple math we're familiar with. Imagine the mathematics behind rotating, slicing, or rearranging a Rubik's cube along multiple dimensions. While humans and classical computers must perform these operations step by step, light can do them all at once.

    Today, every task in AI, from image recognition to natural language processing, relies on tensor operations. However, the explosion of data has pushed conventional digital computing platforms, such as GPUs, to their limits in terms of speed, scalability and energy consumption.

    Motivated by this pressing problem, an international research collaboration has unlocked a new approach that performs complex tensor computations using a single propagation of light. The result is single-shot tensor computing, achieved at the speed of light itself.

    The new method performs the same kinds of operations that today's GPUs handle, like convolutions and attention layers, but does them all at the speed of light.

    To achieve this, the researchers encoded digital data into the amplitude and phase of light waves, effectively turning numbers into physical properties of the optical field. When these light fields interact and combine, they naturally carry out mathematical operations such as matrix and tensor multiplications, which form the core of deep learning algorithms.

    By introducing multiple wavelengths of light, the team extended this approach to handle even higher-order tensor operations.

    Another key advantage of this method is its simplicity. The optical operations occur passively as the light propagates, so no active control or electronic switching is needed during computation.
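    The core operation, encoding numbers in the amplitude and phase of light so that superposing fields computes a matrix product, has a simple numerical analogue (a toy sketch with made-up values, not the paper's optical setup):

```python
import numpy as np

# Toy numerical analogue (not the paper's hardware): data become complex
# numbers (amplitude + phase), a "transmission matrix" W mixes the
# fields, and the physical superposition of weighted fields is exactly a
# matrix-vector product, obtained in a single pass of light.
rng = np.random.default_rng(42)
x = rng.normal(size=4) + 1j * rng.normal(size=4)            # input field
W = rng.normal(size=(3, 4)) + 1j * rng.normal(size=(3, 4))  # optical medium

y = W @ x  # all weighted field components interfere at once

# The same result, computed the step-by-step "electronic" way:
y_loop = np.array([sum(W[i, j] * x[j] for j in range(4)) for i in range(3)])
assert np.allclose(y, y_loop)
```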

    Direct tensor processing with coherent light, Nature Photonics (2025). DOI: 10.1038/s41566-025-01799-7.

  • Dr. Krishna Kumari Challa

    Direct link between peak air pollution and cardiac risk revealed

    The risk of suffering cardiac arrest may increase on days recording high levels of air pollution in some parts of the world. This emerges from a study conducted by researchers and published in the journal Global Challenges.

    Researchers analyzed 37,613 cases of out-of-hospital cardiac arrest in Lombardy between 2016 and 2019 by assessing, for each episode, the daily concentrations of various pollutants (PM₂.₅, PM₁₀, NO₂, O₃ and CO) obtained from satellite data of the European Copernicus program (ESA). The study used advanced spatio-temporal statistical models to identify the relationship between pollution peaks and increased risk of cardiac events.

    They observed a strong association with nitrogen dioxide (NO₂). Indeed, for every 10 micrograms per cubic meter increase, the risk of cardiac arrest rises by 7% over the next 96 hours.

    Particulate matter also carries risk: PM₂.₅ and PM₁₀ were associated with 3% and 2.5% increases in risk, respectively, on the same day of exposure.
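    If the reported 7% increase per 10 µg/m³ compounds multiplicatively, as in typical log-linear exposure-response models (an assumption made here only for illustration), larger increments scale like this:

```python
# Illustrative arithmetic only: assumes the 7% risk increase per
# 10 µg/m³ of NO₂ compounds multiplicatively (a log-linear model),
# which is a common modeling choice, not a claim from the study itself.

def relative_risk(delta_ug_m3, rr_per_10=1.07):
    """Relative risk of cardiac arrest for an NO₂ rise of delta_ug_m3."""
    return rr_per_10 ** (delta_ug_m3 / 10)

print(round(relative_risk(10), 2))  # 1.07 -> +7%
print(round(relative_risk(30), 2))  # 1.23 -> +23%, slightly above 3 x 7%
```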

    The effect is more pronounced in urban areas but significant associations are also observed in rural towns. The risk particularly marks an upswing in the warm months, suggesting a possible interaction between heat and pollutants. The association was also observed at levels below legal limits, suggesting that there is no safe exposure threshold.

    The link between air quality and out-of-hospital cardiac arrest is a wake-up call for local health systems.

     Amruta Umakant Mahakalkar et al, Short‐Term Effect of Air Pollution on Out of Hospital Cardiac Arrest (OHCA) in Lombardy—A Case‐Crossover Spatiotemporal Study, Global Challenges (2025). DOI: 10.1002/gch2.202500241

  • Dr. Krishna Kumari Challa

    Neanderthals May Never Have Truly Gone Extinct

    Neanderthals may have never truly gone extinct, according to new research – at least not in the genetic sense.

    A new mathematical model has explored a fascinating scenario in which Neanderthals gradually disappeared not through "true extinction" but through genetic absorption into a more prolific species: Us.
    According to the analysis, the long and drawn-out 'love affair' between Homo sapiens and Neanderthals could have led to almost complete genetic absorption within 10,000–30,000 years.
    The model is simple and not regional, but it provides a "robust explanation for the observed Neanderthal demise," say the researchers.
    Once, the idea that Neanderthals and Homo sapiens interbred was a radical notion, and yet now, modern genome studies and archaeological evidence have provided strong evidence that our two lineages were hooking up and reproducing right across Eurasia for tens of thousands of years.

    Today, people of non-African ancestry have inherited about 1 to 4 percent of their DNA from Neanderthals.
    No one knows why Neanderthals disappeared from the face of our planet roughly 40,000 years ago, but experts tend to agree that there are probably numerous factors that played a role, like environmental changes, loss of genetic diversity, or competition with Homo sapiens.
    Now researchers present a model that does not rule out these other explanations.
    It does suggest that genetic drift played a strong role – even with the researchers assuming the Neanderthal genes our species 'absorbed' had no survival benefit.

    If the model were to include the potential advantages of some Neanderthal genes to the larger Homo sapiens population, then mathematical support for genetic dilution may become even stronger.

    Like all models, this new one is based on imperfect assumptions. It uses the birth rates of modern hunter-gatherer tribes to predict how quickly small Neanderthal tribes would be engulfed by a much larger human population, given how frequently it seems we were interbreeding.
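    The dilution dynamic that the model captures can be caricatured in a few lines (a toy simulation with invented parameters, not the paper's actual model): a small group that keeps absorbing newcomers from a much larger population loses its distinctive ancestry geometrically.

```python
# Toy caricature of genetic dilution (parameters invented; not the
# paper's model): an isolated group of size `pop` receives `migrants`
# Homo sapiens newcomers each generation, with random mating, so its
# "Neanderthal" ancestry fraction decays geometrically.

def neanderthal_fraction(generations, pop=100, migrants=10):
    frac = 1.0  # start as a fully Neanderthal group
    for _ in range(generations):
        frac = frac * pop / (pop + migrants)  # newcomers carry 0% ancestry
    return frac

print(round(neanderthal_fraction(50), 3))  # 0.009: under 1% in 50 generations
```

    At roughly 25 years per generation, 50 generations is only about 1,250 years; even modest, steady gene flow erases most of a small group's distinct ancestry quickly.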
    Part 1

  • Dr. Krishna Kumari Challa

    The results are consistent with recent archaeological discoveries and align with a body of evidence that suggests the decline of Neanderthals in Europe was gradual rather than sudden.

    Homo sapiens seem to have started migrating out of Africa much earlier than scientists previously thought, and they arrived in Europe in several influxes, possibly starting more than 200,000 years ago.
    As each wave of migration came crashing into the region, it engulfed local Neanderthal communities, diluting their genes, like sand pulled into the sea.
    Today, some scientists argue that there is more to unite Homo sapiens and Neanderthals than there is to differentiate us. Our lineages, they say, should not be regarded as two separate species, but rather as distinct populations belonging to a "common human species".

    Neanderthals were surprisingly adaptable and intelligent. They made intricate tools, created cave art, and used fire – and when it came to communication, they were probably capable of far more than simple grunting.
    Neanderthal populations and cultures may no longer exist, but their genetic legacy lives on inside of us.
    These are not just our cousins; they are also our ancestors.

    https://www.nature.com/articles/s41598-025-22376-6

    Part 2

    **

  • Dr. Krishna Kumari Challa

    Parasitic ant tricks workers into killing their queen, then takes the throne

    Scientists document a new form of host manipulation where an invading, parasitic ant queen "tricks" ant workers into killing their queen mother. The invading ant integrates herself into the nest by pretending to be a member of the colony, then sprays the host queen with fluid that causes her daughters to turn against her. The parasitic queen then usurps the throne, having the workers serve her instead as the new queen regent. This work was published in Current Biology on November 17.

    Matricide, a behavior where offspring kill or eat their mother, is a rarely seen phenomenon in nature. Despite appearing maladaptive at first glance, it does offer advantages by either nourishing the young and giving the mother indirect benefits through increased offspring survival or allowing the young to invest in offspring of their own.

    Up until now, only two types of matricide have been recorded, in which either the mother or the offspring benefit. In the novel matricide the researchers report, neither profits; only the parasitic third party does.

    The ants Lasius orientalis and L. umbratus, commonly referred to as the "bad-smell ants" in Japanese, are so-called "social parasites" that execute a covert operation to infiltrate and eventually take over the colonies of their host species, L. flavus and L. japonicus, respectively. The parasitic queen takes advantage of ants' reliance on smell to identify both friends and foes, duping unsuspecting worker ants into believing she is part of the family.

    Ants live in the world of odors. 

    Before infiltrating the nest, the parasitic queen stealthily acquires the colony's odor on her body from workers walking outside so that she is not recognized as the enemy.

    Once these "bad-smell" ants have been accepted by the colony's workers and locate the queen, the parasitic ant douses her with a foul-smelling chemical researchers presume to be formic acid—a chemical unique to some ants and stored in a specialized organ.

    The parasitic ants exploit that ability to recognize odors by spraying formic acid to disguise the queen's normal scent with a repugnant one. This causes the daughters, who normally protected their queen mother, to attack her as an enemy.

    Then, like fleeing the scene of the crime, the parasitic queen immediately (but temporarily) retreats. "She knows the odor of formic acid is very dangerous, because if host workers perceive the odor they would immediately attack her as well."

    She will periodically return and spray the queen multiple times until the workers have killed and disposed of their mother queen. Then, once the dust has settled, the parasitic ant queen returns and begins laying eggs of her own. With this newly accepted parasitic queen in the colony and no other queen to compete with, the matricidal workers begin taking care of her and her offspring instead.

    Host daughter ants manipulated into unwitting matricide by a social parasitic queen, Current Biology (2025). DOI: 10.1016/j.cub.2025.09.037. www.cell.com/current-biology/f … 0960-9822(25)01207-2

  • Dr. Krishna Kumari Challa

    Lead-free alternative discovered for essential electronics component

    Ferroelectric materials are used in infrared cameras, medical ultrasounds, computer memory and actuators that turn electric properties into mechanical properties and vice-versa. Most of these essential materials, however, contain lead and can therefore be toxic.

    Therefore, for the last 10 years, there has been a huge initiative all over the world to find ferroelectric materials that do not contain lead.

    The atoms in a ferroelectric material can have more than one crystalline structure. Where two crystalline structures meet is called a phase boundary, and the properties that make ferroelectric materials useful are strongest at these boundaries.

    Using chemical processes, scientists have manipulated the phase boundaries of lead-based ferroelectric materials to create higher performance and smaller devices. Chemically tuning the phase boundaries of lead-free ferroelectric material, however, has been challenging.

    New research found a way to enhance lead-free ferroelectrics using strain, or mechanical force, rather than a chemical process. The discovery could produce lead-free ferroelectric components, opening new possibilities for devices and sensors that could be implanted in humans.

    Part 1

  • Dr. Krishna Kumari Challa

    Ferroelectric materials, first discovered in 1920, have a natural electrical polarization that can be reversed by an electric field. That polarization remains reversed even once the electric field has been removed.

    The materials are dielectric, meaning they can be polarized by the application of an electric field. That makes them highly effective in capacitors.

    Ferroelectrics are also piezoelectric, which means they can generate electric properties in response to mechanical energy, and vice versa. This quality can be used in sonars, fire sensors, tiny speakers in a cell phone or actuators that precisely form letters in an inkjet printer.

    All these properties can be enhanced by manipulating the phase boundary of ferroelectric materials.
    In a lead-based ferroelectric, such as lead zirconate titanate, one can chemically tune the composition to land right at the phase boundary.

    Lead-free ferroelectrics, however, contain highly volatile alkali metals, which can become a gas and evaporate when chemically tuned.

    The researchers instead created a thin film of the lead-free ferroelectric material sodium niobate (NaNbO₃). The material is known to have a complex crystalline ground-state structure at room temperature. It is also flexible. Scientists have long known that changing the temperature of sodium niobate can produce multiple phases, or different arrangements of atoms.

    Instead of a chemical process or manipulating the temperature, the researchers changed the structure of the atoms in sodium niobate by applying strain.

    They grew a thin film of sodium niobate on a substrate. The structure of the atoms in the sodium niobate contract and expand as they try to match the structure of the atoms in the substrate. The process creates strain on the sodium niobate.
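    The strain mechanism described above amounts to a lattice mismatch between film and substrate; the standard misfit-strain formula makes it concrete (the lattice parameters below are hypothetical, chosen only for illustration, not the paper's measured values):

```python
# Toy epitaxial misfit-strain calculation. The standard definition is
# strain = (a_substrate - a_film) / a_film; the lattice parameters here
# (in angstroms) are hypothetical, not the paper's measured values.

def misfit_strain(a_substrate, a_film):
    """Fractional in-plane strain the substrate imposes on the film."""
    return (a_substrate - a_film) / a_film

eps = misfit_strain(3.905, 3.946)  # film slightly larger: compressive strain
print(round(eps * 100, 2))  # -1.04 (percent)
```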

    "What is quite remarkable with sodium niobate is if you change the length a little bit, the phases change a lot."

    To the researchers' surprise, the strain caused the sodium niobate to have three different phases at once, which optimizes the useful ferroelectric properties of the material by creating more boundaries.

    The experiments were conducted at room temperature. The next step will be seeing whether sodium niobate responds to strain in the same way at extreme temperatures, ranging from minus 270°C to 1,000°C.

    Reza Ghanbari et al, Strain-induced lead-free morphotropic phase boundary, Nature Communications (2025). DOI: 10.1038/s41467-025-63041-w

    Part 2

  • Dr. Krishna Kumari Challa

    Reducing arsenic in drinking water cuts risk of death, even after years of chronic exposure: 20-year study

    Arsenic is among the most common chemical pollutants of groundwater.

    A naturally occurring element, arsenic accumulates in groundwater, and because it has no taste or odor, people can unknowingly drink contaminated water for years.

    A 20-year study of nearly 11,000 adults in Bangladesh found that lowering arsenic levels in drinking water was associated with up to a 50% lower risk of death from heart disease, cancer and other chronic illnesses, compared with continued exposure.

    The study provides the first long-term, individual-level evidence that reducing arsenic exposure may lower mortality, even among people exposed to the toxic contaminant for years.

    The landmark analysis by researchers is important for public health because groundwater contamination from naturally occurring arsenic remains a serious issue worldwide.

    The study shows what happens when people who are chronically exposed to arsenic are no longer exposed. You're not just preventing deaths from future exposure, but also from past exposure.

    The results provide the clearest evidence to date of the link between arsenic reduction and lower mortality.

    For two decades, the research team followed each participant's health and repeatedly collected urine samples to track exposure, which they say strengthened the accuracy of their findings.

    People whose urinary arsenic levels dropped from high to low had mortality rates identical to those who had consistently low exposure throughout the duration of the study. The larger the drop in arsenic levels, the greater the decrease in mortality risk. By contrast, individuals who continued drinking high-arsenic water saw no reduction in their risk of death from chronic disease.

     Arsenic Exposure Reduction and Chronic Disease Mortality, JAMA (2025). jamanetwork.com/journals/jama/ … 1001/jama.2025.19161

  • Dr. Krishna Kumari Challa

    Lethal dose of plastics for ocean wildlife: Surprisingly small amounts can kill seabirds, sea turtles and marine mammals

    By studying more than 10,000 necropsies, researchers now know how much plastic it takes to kill seabirds, sea turtles, and marine mammals, and the lethal dose is much smaller than you might think. Their new study titled "A quantitative risk assessment framework for mortality due to macroplastic ingestion in seabirds, marine mammals, and sea turtles" is published in the Proceedings of the National Academy of Sciences.

    Led by Ocean Conservancy researchers, the paper is the most comprehensive study yet to quantify the extent to which a range of plastic types—from soft, flexible plastics like bags and food wrappers; to balloon pieces; to hard plastics ranging from fragments to whole items like beverage bottles—result in the death of seabirds, sea turtles, and marine mammals that consume them.

    The study reveals that, on average, consuming less than three sugar cubes' worth of plastics for seabirds like Atlantic puffins (which measure approximately 28 centimeters, or 11 inches, in length); just over two baseballs' worth of plastics for sea turtles like Loggerheads (90 centimeters or 35 inches); and about a soccer ball's worth of plastics for marine mammals like harbor porpoises (1.5 meters, or 60 inches), has a 90% likelihood of death.

    At the 50% mortality threshold, the volumes are even more startling: consuming less than one sugar cube's worth of plastics kills one in two Atlantic puffins; less than half a baseball's worth of plastics kills one in two Loggerhead turtles; and less than a sixth of a soccer ball kills one in two harbor porpoises.

    The lethal dose varies based on the species, the animal's size, the type of plastic it's consuming, and other factors, but overall it's much smaller than you might think, which is troubling when you consider that more than a garbage truck's worth of plastics enters the ocean every minute.
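    The 50% and 90% thresholds quoted above are the kind of numbers a dose-response curve produces; a toy logistic curve (invented parameters, not the study's fitted model) shows how the two doses relate:

```python
import math

# Toy logistic dose-response (parameters invented for illustration;
# not the study's fitted model). `d50` is the dose with 50% mortality,
# `slope` sets how fast risk rises with ingested plastic volume.

def mortality_probability(dose, d50=1.0, slope=2.0):
    """Probability of death after ingesting `dose` (arbitrary units)."""
    return 1 / (1 + math.exp(-slope * (dose - d50)))

print(round(mortality_probability(1.0), 2))  # 0.5 at the 50% dose
d90 = 1.0 + math.log(0.9 / 0.1) / 2.0        # dose giving 90% mortality
print(round(mortality_probability(d90), 2))  # 0.9
```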

    Part 1

  • Dr. Krishna Kumari Challa

    Nearly half (47%) of all sea turtles; a third (35%) of seabirds; and 12% of marine mammals in the dataset had plastics in their digestive tracts at their time of death. Overall, one in five (21.5%) of the animals recorded had ingested plastics, often of varying types. Additional findings included:

    Seabirds

    Of seabirds that ate plastic, 92% ate hard plastics, 9% ate soft plastics, 8% ate fishing debris, 6% ate rubber, and 5% ate foams, with many individuals eating multiple plastic types.
    Seabirds are especially vulnerable to synthetic rubber: just six pieces, each smaller than a pea, are 90% likely to cause death.

    Sea turtles

    Of sea turtles that ate plastic, 69% ate soft plastics, 58% ate fishing debris, 42% ate hard plastics, 7% ate foam, 4% ate synthetic rubbers, and 1% ate synthetic cloth.
    Sea turtles, which on average weigh several hundred pounds, are especially vulnerable to soft plastics, like plastic bags: just 342 pieces, each about the size of a pea, would be lethal with 90% certainty.

    Mammals

    Of marine mammals that ate plastic, 72% ate fishing debris, 10% ate soft plastics, 5% ate rubber, 3% ate hard plastics, 2% ate foam, and 0.7% ate synthetic cloth.
    Marine mammals are especially vulnerable to fishing debris: 28 pieces, each smaller than a tennis ball, are enough to kill a sperm whale in 90% of cases.

    Threatened species and broader impacts

    The study also found that nearly half of the individual animals who had ingested plastics are red-listed as threatened—that is, near-threatened, vulnerable, endangered or critically endangered—by the IUCN. Notably, the study only analyzed the impacts of ingesting large plastics (greater than 5 millimeters) on these species, and did not account for all plastic impacts and interactions. For example, they excluded entanglement, sublethal impacts of ingestion that can affect overall animal health, and microplastics consumed.

    This research really drives home how ocean plastics are an existential threat to the diversity of life on our planet.

    Murphy, Erin L., A quantitative risk assessment framework for mortality due to macroplastic ingestion in seabirds, marine mammals, and sea turtles, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2415492122

    Part 2

  • Dr. Krishna Kumari Challa

    Dogs 10,000 years ago roamed with bands of humans and came in all shapes and sizes

    Analysis of ancient dog skulls and genomes indicates that significant physical and genetic diversity in dogs emerged over 10,000 years ago, predating modern selective breeding. Distinctive dog-like skulls appeared around 11,000 years ago, and ancient DNA reveals that dogs often migrated with human groups, reflecting complex, intertwined histories and early biocultural exchanges.

    https://theconversation.com/dogs-10-000-years-ago-roamed-with-bands...

  • Dr. Krishna Kumari Challa

    New study identifies part of brain animals use to make inferences

    Animals survive in changing and unpredictable environments by not merely responding to new circumstances, but also, like humans, by forming inferences about their surroundings—for instance, squirrels understand that certain bird noises don't signal the presence of a predator, so they won't seek shelter when they later hear these same sounds. But less clear is how the brain works to create these inferences.

    In a study published in the journal Neuron, a team of researchers identified a particular part of the brain that serves as an "inference engine." The region, the orbitofrontal cortex (OFC), allows animals to update their understanding of their surroundings based on changing circumstances.

    To survive, animals cannot simply react to their surroundings. They must generalize and make inferences—a cognitive process that is among the most vital and complicated operations that nervous systems perform. These findings advance our knowledge of how the brain works in applying what we've learned.

    The scientists add that the results offer promise for better understanding the nature of neuropsychiatric disorders, such as bipolar disorder and schizophrenia, in which our ability to make inferences is diminished.

    In the experiments conducted when the brain's OFC was disrupted, the trained rats could no longer update their understanding of what the other available rewards might be—specifically, they couldn't make distinctions among hidden states.

    These results, based on recordings of more than 10,000 neurons, suggest that the OFC is directly involved in helping the brain make inferences in changing situations.
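    Updating beliefs about hidden states is, at its core, Bayesian inference; a minimal sketch (illustrative only, not the study's model) shows the kind of computation the OFC is proposed to support:

```python
# Minimal Bayesian hidden-state update (illustrative; not the study's
# model): an agent infers which of two reward states it is in from one
# observation, combining its prior belief with the observation's
# likelihood under each state via Bayes' rule.

def update_belief(prior_a, lik_obs_given_a, lik_obs_given_b):
    """Posterior probability of state A after one observation."""
    joint_a = prior_a * lik_obs_given_a
    joint_b = (1 - prior_a) * lik_obs_given_b
    return joint_a / (joint_a + joint_b)

# Start uncertain (50/50); observe an outcome 4x likelier under state A.
print(round(update_belief(0.5, 0.8, 0.2), 2))  # 0.8
```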

     The orbitofrontal cortex updates beliefs for state inference, Neuron (2025). DOI: 10.1016/j.neuron.2025.10.024. www.cell.com/neuron/fulltext/S0896-6273(25)00805-0

  • Dr. Krishna Kumari Challa

    Learning to see after being born blind: Brain imaging study highlights infant adaptability

    Some babies are born with early blindness due to dense bilateral congenital cataracts, requiring surgery to restore their sight. This period of several months without vision can leave a lasting mark on how the brain processes visual details, but surprisingly little on the recognition of faces, objects, or words. This is the main finding of an international study conducted by neuroscientists.

    Using brain imaging, the researchers compared adults who had undergone surgery for congenital cataracts as babies with people born with normal vision. The results are striking: in people born with cataracts, the area of the brain that analyzes small visual details (contours, contrasts, etc.) retains a lasting alteration from this early blindness.

    On the other hand, the more advanced regions of the visual brain, responsible for recognizing faces, objects, and words, function almost normally. These "biological" results have been validated by computer models involving artificial neural networks. This distinction between altered and preserved areas of the brain paves the way for new treatments. In the future, clinicians may be able to offer visual therapies that are better tailored to each patient.

    Babies' brains are highly adaptable. Even if vision is lacking at the very beginning of life, the brain can adapt and learn to recognize the world around it even on the basis of degraded information.

    These findings also challenge the idea of a single "critical period" for visual development. Some areas of the brain are more vulnerable to early vision loss, while others retain a surprising capacity for recovery. The brain is both fragile and resilient. Early experiences matter, but they don't determine everything.

    Impact of a transient neonatal visual deprivation on the development of the ventral occipito-temporal cortex in humans, Nature Communications (2025).

    https://www.nature.com/articles/s41467-025-65468-7#:~:text=We%20sho...

  • Dr. Krishna Kumari Challa

    Frozen RNA survived 50,000 years

    Researchers have found the oldest RNA molecules to date in mummified woolly mammoth tissue. RNA is a fragile molecule, which makes intact ancient samples few and far between. But such samples are sought after because analysing ancient RNA could shed light on the gene activity of extinct animals. Scientists used enzymes to convert RNA in the mammoth tissue to DNA, and then reverse-engineered the original RNA sequences. This technique recovered fragments of RNA from three samples, dated to between 39,000 and 52,000 years old.

    https://www.cell.com/cell/fulltext/S0092-8674(25)01231-0

    https://www.science.org/content/article/forty-thousand-year-old-mam...

  • Dr. Krishna Kumari Challa

    Psilocybin could reverse effects of brain injuries resulting from intimate partner violence, rat study finds

    The term intimate partner violence (IPV) refers to physical, sexual or psychological abuse perpetrated by an individual on their romantic partner or spouse. Victims of IPV who are violently attacked and physically abused on a regular basis can sometimes present injuries that have lasting consequences on their mood, mental processes and behaviour.

    Chronic neurobehavioral sequelae from IPV-related brain injury (IPV-BI) are associated with neuroinflammation and impaired neuroplasticity, and effective treatment options are scarce, particularly in the context of IPV.

    Common types of injuries observed in IPV victims who are periodically attacked physically include mild traumatic brain injuries (mTBI) and disruptions in the flow of blood or oxygen to the brain emerging from non-fatal strangulation (NFS). Both these have been linked to inflammation in the brain and a hindered ability to form new connections between neurons or change older connections (i.e., neuroplasticity).

    Researchers recently carried out a study involving rats aimed at assessing the potential of the psychedelic compound psilocybin  for reversing the chronic effects of IPV-related brain injuries. Their findings, published in Molecular Psychiatry, suggest that psilocybin could in fact reduce inflammation and anxiety, improve memory and facilitate learning following brain injuries caused by repeated physical trauma.

    Part 1

  • Dr. Krishna Kumari Challa

    Psilocybin, a 5-HT2A receptor agonist with therapeutic potential in psychiatric disorders that share overlapping pathophysiology with brain injury, is a promising candidate.

    As the majority of IPV victims are female, Allen, Sun and their colleagues performed their experiments on female rats. To model IPV-related brain injuries, they subjected the rats to mild injuries that mirrored those observed in many victims of IPV.

    "Female rats underwent daily mTBI (lateral impact) followed by NFS (90 s) for five days, followed by 16 weeks of recovery," the authors explained.

    Four months after the rats had been subjected to the injuries, they were given either a dose of psilocybin or a placebo (i.e., saline) injection. Twenty-four hours later, they completed behavioral tasks designed to assess their memory, learning, motivation and anxiety levels.

    Psilocybin activates 5-HT2A receptors, a subtype of serotonin receptor that plays a role in regulating mood, mental processes and the brain's adaptability (i.e., plasticity). Using a drug that blocks the activity of 5-HT2A receptors, the researchers also tried to determine whether these receptors played a key role in any effects they observed.

    "To investigate whether psilocybin's effects were 5-HT2A receptor dependent, additional rats received pre-treatment with selective 5-HT2A receptor antagonist M100907 (1.5 mg/kg) one hour before psilocybin administration," wrote the authors. in their paper.
    Overall, the results of this research team's experiments suggest that psilocybin could help to reverse some of the behavioral, cognitive and brain-related damage caused by repeated physical attacks. The female rats they tested were found to exhibit less anxiety-like and depression-like behaviors after they received psilocybin, while also performing better in memory and learning tests.

    As this study was carried out in rats, its findings cannot be confidently applied to humans yet. In the future, human clinical trials could help to determine whether psilocybin is in fact a safe and effective therapeutic strategy to aid recovery from IPV-related brain injuries.

     Josh Allen et al, Psilocybin mitigates chronic behavioral and neurobiological alterations in a rat model of recurrent intimate partner violence-related brain injury, Molecular Psychiatry (2025). DOI: 10.1038/s41380-025-03329-x.

    Part 2

    **

  • Dr. Krishna Kumari Challa

    How most of the universe's visible mass is generated: Experiments explore emergence of hadron mass

    Deep in the heart of the matter, some numbers don't add up. For example, while protons and neutrons are made of quarks, nature's fundamental building blocks bound together by gluons, their masses are much larger than the individual quarks from which they are formed.

    This leads to a central puzzle: why? In the theory of the strong interaction, known as quantum chromodynamics or QCD, quarks acquire their bare mass through the Higgs mechanism. This long-hypothesized process was confirmed by experiments at the CERN Large Hadron Collider in Switzerland and led to the Nobel Prize for Peter Higgs in 2013.

    Yet the inescapable issue remains that this mechanism contributes to the measured proton and neutron masses at the level of less than 2%. This clearly demonstrates that the dominant part of the mass of real-world matter is generated through another mechanism, not through the Higgs. The rest arises from emergent phenomena.

    The unaccounted mass and how it arises have long been open problems in nuclear physics, but scientists  are now getting a more detailed understanding of this mass-generating process than ever before.

    So, what gives protons and other strongly interacting particles (together called hadrons) their added "heft?" The answer lies in the dynamics of QCD. Through QCD, the strong interaction generates mass from the energy stored in the fields of the strongly interacting quarks and gluons. This is termed the emergence of hadron mass, or EHM.

    Part 1

  • Dr. Krishna Kumari Challa

    Over the past decade, significant progress has been made in understanding the dominant portion of the universe's visible mass. This is driven by studies of the distance (or momentum) dependence of the strong interaction within a QCD-based approach known as the continuum Schwinger method (CSM).

    Bridging CSM and experiments via phenomenology, physicists analyzed nearly 30 years of collected data. The comprehensive effort has given scientists the most detailed look yet at the mechanisms responsible for EHM.


    QCD describes the dynamics of the most elementary constituents of matter known so far: quarks and gluons. Through QCD processes, all hadronic matter is generated. This includes protons, neutrons, other bound quark-gluon systems and, ultimately, all atomic nuclei. A distinctive feature of the strong force is gluon self-interaction.

    Without gluon self-interaction, the universe would be completely different.

    It gives rise to distinctive particle properties and produces real-world hadron phenomena through emergent physics.

    Because of this property, the strong interaction evolves rapidly with distance. This evolution of strong-interaction dynamics is described within the CSM approach. At distances comparable to the size of a hadron, ~10⁻¹³ cm, its relevant constituents are no longer the bare quarks and gluons of QCD.

    Instead, dressed quarks and dressed gluons emerge when bare quarks and gluons are surrounded by clouds of strongly coupled quarks and gluons undergoing continual creation and annihilation.

    In this regime, dressed quarks acquire dynamically generated masses that evolve with distance. This provides a natural explanation for EHM: a transition from the nearly massless bare quarks (with masses of only a few MeV) to fully dressed quarks of approximately 400 MeV mass. Strong interactions among the three dressed quarks of the proton generate its mass of about 1 GeV, as well as the masses of its excited states in the range of 1.0–3.0 GeV.
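
    The mass budget above can be checked with back-of-the-envelope arithmetic. The bare quark masses below are approximate textbook values supplied here for illustration; the ~400 MeV dressed-quark mass and ~1 GeV proton mass come from the text:

    ```python
    # Illustrative mass-budget arithmetic for the proton. Bare quark masses are
    # approximate values (our assumption); the dressed-quark and proton masses
    # follow the figures quoted in the text.
    M_PROTON_MEV = 938.3  # measured proton mass in MeV

    def higgs_fraction(bare_masses_mev):
        """Share of the proton mass carried by bare (Higgs-generated) quark masses."""
        return sum(bare_masses_mev) / M_PROTON_MEV

    def dressed_quark_total(dressed_mass_mev=400.0, n_quarks=3):
        """Naive sum of the three dressed-quark masses, ignoring binding energy."""
        return n_quarks * dressed_mass_mev

    bare_uud = [2.2, 2.2, 4.7]  # up, up, down bare masses in MeV (approximate)
    print(f"Higgs mechanism: {higgs_fraction(bare_uud):.1%} of the proton mass")
    print(f"Three dressed quarks: {dressed_quark_total():.0f} MeV, i.e. the ~1 GeV scale")
    ```

    The first number lands well under the 2% ceiling quoted above, while the naive dressed-quark sum sits at the GeV scale of the proton and its excited states.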

    This raises the question: Can EHM be elucidated by mapping the momentum dependence of the dressed-quark mass from experimental studies of the proton and its excited states?


    Part 2

  • Dr. Krishna Kumari Challa

    Investigations, comparing experiment with theory, demonstrate conclusively that dressed quarks with dynamically generated masses are the active degrees of freedom underlying the structure of the proton and its excited states. They also solidify the case that experimental results can be used to assess the mechanisms responsible for EHM.

    Patrick Achenbach et al, Electroexcitation of Nucleon Resonances and Emergence of Hadron Mass, Symmetry (2025). DOI: 10.3390/sym17071106

    Part 3

  • Dr. Krishna Kumari Challa

    Could atoms be reordered to enhance electronic devices?
    Reordering atoms in SiGeSn barriers surrounding GeSn quantum wells unexpectedly increases charge carrier mobility, contrary to prior assumptions. This enhancement is likely due to atomic short-range ordering, suggesting that manipulating atomic arrangements could significantly improve electronic device performance and miniaturization, with implications for neuromorphic and quantum computing.

     Christopher R. Allemang et al, High Mobility and Electrostatics in GeSn Quantum Wells With SiGeSn Barriers, Advanced Electronic Materials (2025). DOI: 10.1002/aelm.202500460

  • Dr. Krishna Kumari Challa

    Humans are evolved for nature, not cities, say anthropologists

    A new paper by evolutionary anthropologists argues that modern life has outpaced human evolution. The study suggests that chronic stress and many modern health issues are the result of an evolutionary mismatch between our primarily nature-adapted biology and the industrialized environments we now inhabit.

    Over hundreds of thousands of years, humans adapted to the demands of hunter-gatherer life—high mobility, intermittent stress and close interaction with natural surroundings.

    Industrialization, by contrast, has transformed the human environment in only a few centuries, by introducing noise, air and light pollution, microplastics, pesticides, constant sensory stimulation, artificial light, processed foods and sedentary lifestyles.

    In our ancestral environments, we were well adapted to deal with acute stress to evade or confront predators. 

    The lion would come around occasionally, and you had to be ready to defend yourself—or run. The key is that the lion goes away again.

    Today's stressors—traffic, work demands, social media and noise, to name just a few—trigger the same biological systems, but without resolution or recovery. Our body reacts as though each of these stressors were a lion, say the researchers.

    Whether it's a difficult discussion with your boss or traffic noise, your stress response system is still the same as if you were facing lion after lion. As a result, you have a very powerful response from your nervous system, but no recovery.

    This stress is becoming constant.

    In their review, the researchers synthesize evidence suggesting that industrialization and urbanization are undermining human evolutionary fitness. From an evolutionary standpoint, the success of a species depends on survival and reproduction. According to the authors, both have been adversely affected since the Industrial Revolution.

    They point to declining global fertility rates and rising levels of chronic inflammatory conditions such as autoimmune diseases as signs that industrial environments are taking a biological toll.

    There's a paradox where, on the one hand, we've created tremendous wealth, comfort and health care for a lot of people on the planet, but on the other hand, some of these industrial achievements are having detrimental effects on our immune, cognitive, physical and reproductive functions.

    One well-documented example is the global decline in sperm count and motility observed since the 1950s, which the researchers  link to environmental factors. This is thought to be tied to pesticides and herbicides in food, but also to microplastics.

    Part 1

  • Dr. Krishna Kumari Challa

    Given the pace of technological and environmental change, biological evolution cannot keep up. Biological adaptation is very slow. Longer-term genetic adaptations are multigenerational—tens to hundreds of thousands of years.

    That means the mismatch between our evolved physiology and modern conditions is unlikely to resolve itself naturally. Instead, the researchers argue, societies need to mitigate these effects by rethinking their relationship with nature and designing healthier, more sustainable environments.

    Addressing the mismatch requires both cultural and environmental solutions.

    One approach is to fundamentally rethink our relationship with nature—treating it as a key health factor and protecting or regenerating spaces that resemble those from our hunter-gatherer past.
    Another is to design healthier, more resilient cities that take human physiology into account.
    This new, research-based thinking can identify which stimuli most affect blood pressure, heart rate or immune function, for example, and pass that knowledge on to decision-makers.
    We need to get our cities right—and at the same time regenerate, value and spend more time in natural spaces, say the researchers.

     Daniel P. Longman et al, Homo sapiens, industrialisation and the environmental mismatch hypothesis, Biological Reviews (2025). DOI: 10.1111/brv.70094

    Part 2

  • Dr. Krishna Kumari Challa

    Why do we fear snakes?

    Fear of snakes is common. In polls conducted in several parts of the world, respondents have listed snakes as their top fear, outranking public speaking and heights.

    Why are so many of us afraid of snakes? And more curiously, why does our unconscious mind recognize them as a threat before our conscious mind?

    This is what the experts say:

    Our relationship with snakes is an ancient one that reaches back to the evolutionary origins of primates. Primates are differentiated from other mammals by their heavy reliance on vision as the primary sensory modality for engaging with the environment. If you want to understand why primates evolved the way they did, you have to address why they have such good vision.

    The predator-prey relationship between snakes and primates across tens of millions of years enhanced our visual acuity. 

    Avoiding predators and getting food are the two main selective pressures operating on organisms. The idea is that the unique conditions under which primates were living, that is, being active at night like other mammals but resting during the day in trees where sunlight penetrates, instead of in caves or burrows, allowed them to expand their visual sense as a way to avoid being eaten by snakes.

    Molecular evidence bolsters the idea of ancient snake predation on primates. Researchers have found that primates in Africa and Asia, where cobras live, have evolved some immunity toward cobra venom. But primates in Madagascar and South America, where there are no cobras, have no immunity.

    Even today, there seems to be a biological tendency for primates to perceive snakes as a threat.

    If a young, naïve monkey watches a video of older monkeys reacting carefully to a snake, it will learn to do the same. But if you splice in a flower where the snake was, it doesn't learn to react carefully toward the flower.

    Even captive-born-and-raised rhesus macaques, who likely haven't been harmed by snakes, respond with "fearful fascination" to them.

    Primate vision is highly snake-sensitive.

    This insight extends beyond primatology and is reinforced by fields such as neuroscience, evolutionary theory, genetics and molecular biology, to name a few.

    While constricting snakes were instrumental in the origin of primates and the initial changes in their vision, venomous snakes, appearing later, were very important in facilitating the changes in vision that led to anthropoid primates.

    Two mammalian visual systems are key to this idea: the superior colliculus-pulvinar visual system and the lateral geniculate nucleus visual system. The first system enables our unconscious detection of an object in our environment. The second, slightly slower, system allows for conscious recognition of the object and the ability to assign intent to it. Both visual systems are more developed in primates than in other mammals, and even more so in anthropoids.

    If you've ever had the experience of walking on a trail and coming across something that could be a snake, you might suddenly freeze or jump away before you actually recognize that there is a snake in front of you. The recognition taps into the conscious visual system, but the unconscious visual system gives you the ability to get out of danger more quickly.

    Evolution at its best!

    https://www.ucdavis.edu/magazine/why-do-we-fear-snakes

    **

  • Dr. Krishna Kumari Challa

    Oregano oil shows promise as natural fire ant repellent
    Oregano essential oil, particularly its compound carvacrol, effectively repels invasive fire ants and disrupts their nest-building behavior. Carvacrol and related plant-derived compounds are biodegradable, less toxic to humans and beneficial insects, and may offer a sustainable alternative to synthetic pesticides for fire ant management.

    Ginson George et al, Repellent effect of oregano essential oil and carvacrol analogs against imported fire ants, Pest Management Science (2025). DOI: 10.1002/ps.70297

  • Dr. Krishna Kumari Challa

    Blink to the beat: Scientists discover that when we listen to music, we unconsciously blink our eyes
    Spontaneous eye blinks synchronize with the beat of music, reflecting involuntary auditory-motor synchronization even in non-musicians. This effect disappears when attention is diverted, indicating that focus on music is required. The findings suggest a link between auditory processing and oculomotor control, offering potential for non-invasive rhythm assessment and therapeutic applications.

    Wu Y, et al. Eye blinks synchronize with musical beats during music listening, PLOS Biology (2025). DOI: 10.1371/journal.pbio.3003456

  • Dr. Krishna Kumari Challa

    Scientist captures tiny particles for clues on what sparks lightning

    Using lasers as tweezers to understand cloud electrification might sound like science fiction, but at the Institute of Science and Technology Austria (ISTA) it is a reality. By trapping and charging micron-sized particles with lasers, researchers can now observe their charging and discharging dynamics over time.

    This method, published in Physical Review Letters, could provide key insights into what sparks lightning.

    Aerosols are liquid or solid particles that float in the air. They are all around us. Some are large and visible, such as pollen in spring, while others, such as viruses that spread during flu season, cannot be detected by the naked eye. Some we can even taste, like the airborne salt crystals we breathe in at the seaside.

    In the new work researchers focused on ice crystals within clouds. The  scientists used model aerosols—tiny, transparent silica particles—to explore how these ice crystals accumulate and interact with electrical charge.

    They developed a way to catch, hold, and electrically charge a single silica particle using two laser beams. This approach holds potential for application in different areas, including demystifying how clouds become electrified and what sparks lightning.

    The scientists discovered that lasers charge the particle through a "two-photon process."

    Typically, aerosol particles are close to neutrally charged, with electrons (negatively charged entities) swirling around in every atom of the particle.

    The laser beams consist of photons (particles of light), and when two of these photons are absorbed simultaneously, they can "kick out" one electron from the particle. In this way, the particle gains one elementary positive charge. Step by step, it becomes increasingly positively charged.

    The researchers can now precisely observe the evolution of one aerosol particle as it charges up from neutral to highly charged and adjust the laser power to control the rate.
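
    The step-by-step charging can be pictured with a toy Monte Carlo sketch. An event rate scaling with intensity squared is the usual behavior of two-photon processes (our modeling assumption, not stated in the article), and every parameter below is invented for illustration:

    ```python
    import random

    # Toy stochastic model of laser-driven particle charging. At each time step a
    # two-photon photoemission event occurs with probability ~ k2 * intensity**2,
    # adding one elementary positive charge. k2, the time base and all values
    # are made-up illustration parameters, not from the experiment.
    def charge_trajectory(intensity, k2=1e-3, steps=1000, seed=0):
        """Return the particle's charge (in elementary charges) after each step."""
        rng = random.Random(seed)
        charge, history = 0, []
        for _ in range(steps):
            if rng.random() < k2 * intensity ** 2:  # two-photon event fires
                charge += 1                          # one electron kicked out -> +1e
            history.append(charge)
        return history

    low = charge_trajectory(intensity=1.0)[-1]
    high = charge_trajectory(intensity=3.0)[-1]
    print(low, high)  # the higher-intensity run accumulates charge at least as fast
    ```

    Turning up `intensity` mimics the researchers' knob: raising the laser power raises the event probability per step, so the charge climbs faster.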

    This observation also reveals that, as the particle becomes positively charged, it begins to discharge, meaning that it occasionally releases charge in spontaneous bursts.

    Way above our heads, something similar might also be happening in clouds.

    Part 1

  • Dr. Krishna Kumari Challa

    Thunderstorm clouds contain ice crystals and larger ice pellets. When these collide, they exchange electrical charges. Eventually, the cloud becomes so charged that lightning forms.
    One theory suggests that the first little spark of a lightning bolt could be initiated at the charged ice crystals themselves.
    Alternative theories suggest that cosmic rays initiate the process, as the charged particles they create are accelerated by pre-existing electric fields.
    While ice crystals in clouds are much larger than the model ones, the ISTA scientists are now aiming to decode these microscale interactions to better understand the big picture.

    Using optical tweezers to simultaneously trap, charge and measure the charge of a microparticle in air, Physical Review Letters (2025). DOI: 10.1103/5xd9-4tjj

    Part 2

  • Dr. Krishna Kumari Challa

    Taking prenatal supplements associated with 30% lower risk of autism

    Researchers  report that prenatal folic acid and multivitamin supplementation is associated with a roughly 30% lower risk of autism spectrum disorder (ASD) in children, based on an umbrella review of existing systematic reviews and meta-analyses.

    Global estimates in the reviewed material place ASD prevalence at up to 1% of children. ASD affects reciprocal social interaction, nonverbal communication, and understanding of social relationships. Co-occurring conditions frequently include epilepsy, depression, anxiety, attention deficit hyperactivity disorder, sleep disturbance, and self-injury.

    Previous studies found that both genetic mutations and environmental influences contribute to ASD risk, with prenatal maternal nutrition identified as one modifiable environmental factor. Within that broader category of prenatal maternal nutrition, folic acid and multivitamin supplements are among the most accessible interventions offered to women before and during pregnancy.

    Folic acid supports DNA methylation and epigenetic regulation that shape neurodevelopment and supports neural tube formation, processes linked to structural brain development. Multivitamin preparations typically provide vitamin B12, vitamin D, iodine, and other micronutrients that help maintain immune balance, modulate inflammation, and support neurotransmitter synthesis and amino acid metabolism, creating a nutritional context that may favor optimal fetal brain development and potentially lower ASD risk.

    Combined analysis in the umbrella review indicated that maternal prenatal folic acid and/or multivitamin supplementation was associated with a 30% reduced risk of ASD in offspring (pooled relative risk of 0.70 with 95% CI 0.62–0.78 vs. no supplementation).

    Subgroup analysis by supplement type found that prenatal multivitamin supplementation was associated with a 34% reduction in ASD risk. Folic acid supplementation alone was associated with a 30% reduction in ASD risk.
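
    The percentages quoted are simply the complement of the reported relative risks; a minimal check (the function name is ours, not from the paper):

    ```python
    # A relative risk (RR) below 1 means a protective association; the headline
    # "X% lower risk" figure is just (1 - RR) expressed as a percentage.
    def risk_reduction_pct(relative_risk: float) -> float:
        """Percent risk reduction implied by a relative risk below 1."""
        return (1.0 - relative_risk) * 100.0

    # Pooled estimate and 95% CI bounds quoted above
    print(f"{risk_reduction_pct(0.70):.0f}%")  # 30% -> the headline figure
    print(f"{risk_reduction_pct(0.78):.0f}% to {risk_reduction_pct(0.62):.0f}%")  # CI: 22% to 38%
    ```

    The same arithmetic gives the subgroup figures: an RR of 0.66 for multivitamins corresponds to the 34% reduction, and 0.70 for folic acid alone to the 30%.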

    Authors conclude that maternal prenatal folic acid or multivitamin supplementation is associated with reduced ASD risk in children and that current evidence provides highly suggestive support for a protective effect.

    Researchers described this pattern as strong enough to support incorporating folic acid and multivitamin supplementation into routines beginning before conception and continuing through early pregnancy.

    Biruk Beletew Abate et al, The association between maternal prenatal folic acid and multivitamin supplementation and autism spectrum disorders in offspring: An umbrella review, PLOS One (2025). DOI: 10.1371/journal.pone.0334852

  • Dr. Krishna Kumari Challa

    'Chocolate-flavored' honey created using cocoa bean shells

    A group of researchers  developed a product made from native bee honey and cocoa bean shells that can be consumed directly or used as an ingredient in food and cosmetics. The results were published in the journal ACS Sustainable Chemistry & Engineering.

    The researchers used native bee honey as an edible solvent to extract stimulants such as theobromine and caffeine, which are associated with heart health, from cocoa bean shells. These shells are usually discarded during the production of chocolate and other cocoa derivatives. The ultrasound-assisted extraction process also enriched the honey with phenolic compounds, which have antioxidant and anti-inflammatory properties.

    The researchers who tasted it say that, depending on the ratio of honey to shells, it has a strong chocolate flavor, although they are still planning tests on the product's taste and other sensory properties.

     Felipe Sanchez Bragagnolo et al, Stingless Bee Honeys As Natural and Edible Extraction Solvents: An Intensified Approach to Cocoa Bean Shell Valorization, ACS Sustainable Chemistry & Engineering (2025). DOI: 10.1021/acssuschemeng.5c04842

  • Dr. Krishna Kumari Challa

    Bacteria 'pills' could detect gut diseases—without the endoscope

    Move over, colonoscopies—researchers report in ACS Sensors that they've developed a sensor made of tiny microspheres packed with blood-sensing bacteria that detect markers of gastrointestinal disease. Taken orally, the miniature "pills" also contain magnetic particles that make them easy to collect from stool.

    Once excreted from mouse models with colitis, the bacterial sensor detected gastrointestinal bleeding within minutes. The researchers say the bacteria in the sensor could be adapted to detect other gut diseases.

    This technology provides a new paradigm for rapid and noninvasive detection of gastrointestinal diseases.

    How it works ....

    Previously, the researchers developed heme-sensing bacteria that light up in the presence of blood, but the bacterial sensors break down in the digestive system and are hard to collect. In this current study, they encapsulated their heme-detecting bacteria and magnetic particles inside globs of sodium alginate, a thickening agent used in foods.

    The process creates tiny hydrogel microsphere sensors that can easily be removed from feces with a magnet after they travel through the body. Initial tests showed that the hydrogel protected the bacteria from simulated digestive fluids but also allowed heme to interact with the bacterial sensor, causing it to glow.

    The team administered the microspheres orally to mouse models of colitis, representing disease levels from no activity to severe stages. After the microspheres traveled through the animals' gastrointestinal systems, the researchers retrieved the sensors from feces with a magnet and found:
    Microsphere cleanup and signal analysis took about 25 minutes.
    As the disease stage progressed, the intensity of the light produced by the sensor increased, which indicated more heme from mouse models with more advanced colitis.
    Assessments of healthy mice given the sensor indicated the microspheres were biocompatible and safe.
    Although the sensor still needs to be tested in humans, the researchers say that this method of encapsulating bacterial sensors could diagnose gastrointestinal diseases and monitor treatments and disease progression.

     Magnetic Hydrogel: Enhanced Bacterial Biosensor for Speedy Gut Diseases Detection, ACS Sensors (2025). DOI: 10.1021/acssensors.5c01813

  • Dr. Krishna Kumari Challa

    Microplastics detected in 100% of donkey feces: Study links plastic pollution to animal deaths and food risks
    Microplastics were found in 100% of donkey and cattle fecal samples on Lamu Island, indicating widespread ingestion of plastic waste by livestock. This contamination is linked to animal health issues, including fatal colic, and raises concerns about microplastic transfer to humans through the food chain. The findings highlight urgent risks to animal welfare, food safety, and ecosystem health.

    Emily Haddy et al, Ingestion of terrestrial plastic pollution by free-roaming livestock, including working donkeys: an interdisciplinary assessment, Cambridge Prisms: Plastics (2025). DOI: 10.1017/plc.2025.10036

  • Dr. Krishna Kumari Challa

    'Trained' bacteriophages expand treatment options for antibiotic-resistant infections

    Antibiotic resistance is one of the most pressing challenges to global public health as harmful microbes evolve to evade these medications.

    Now researchers  have developed a new method to combat antibiotic-resistant bacteria using bacteriophages, or phages, for short—viruses that infect and kill bacteria—as an alternative to traditional antibiotics.

    The researchers targeted Klebsiella pneumoniae, a species of bacteria notorious for its ability to resist multiple antibiotics. The dangerous pathogen can cause severe infections in hospital settings, including pneumonia and sepsis.

    While phages have been used as a treatment for bacterial infections for over a century, they are extremely specific about which strains of a bacterial species they will attack. This has limited their effectiveness against the most antibiotic-resistant strains.

    To overcome this problem, the research team "trained" the phages by allowing them to evolve together with the bacteria in a controlled laboratory setting for 30 days.

    This technique, called "experimental evolution," permitted the phages to adapt to bacterial defenses. This resulted in significant improvements to their ability to kill a wide variety of bacterial strains, including multidrug-resistant and extensively drug-resistant K. pneumoniae—strains that pose a significant challenge to modern medicine.

    What's more, the evolved phages also demonstrated an enhanced ability to suppress bacterial growth over extended periods of time.

    Genetic analysis revealed that the evolved phages acquired mutations to specific genes responsible for recognizing and binding to bacterial cells to initiate the infection process. These changes likely contributed to their improved effectiveness.

    The research highlights the potential of phage therapy as a powerful tool to address the global antibiotic resistance crisis.

    Nature Communications (2025). DOI: 10.1038/s41467-025-66062-7

  • Dr. Krishna Kumari Challa

    New type of DNA damage discovered in our cells' mitochondria

    A previously unknown type of DNA damage in the mitochondria, the tiny power plants inside our cells, could shed light on how our bodies sense and respond to stress. The findings of the UC Riverside-led study are published today in the Proceedings of the National Academy of Sciences and have potential implications for a range of mitochondrial dysfunction-associated diseases, including cancer and diabetes.

    Mitochondria have their own genetic material, known as mitochondrial DNA (mtDNA), which is essential for producing the energy that powers our bodies and sending signals within and outside cells. While it has long been known that mtDNA is prone to damage, scientists didn't fully understand the biological processes. The new research identifies a culprit: glutathionylated DNA (GSH-DNA) adducts.

    An adduct is a bulky chemical tag formed when a chemical, such as a carcinogen, attaches directly to DNA. If the damage isn't repaired, it can lead to DNA mutations and increase the risk of disease.

    The researchers found in their experiments in cultured human cells that these adducts accumulate at levels up to 80 times higher in mtDNA than in the DNA of the cell's nucleus, suggesting that mtDNA is particularly vulnerable to this type of damage.

    mtDNA makes up only a small fraction—about 1-5%—of all the DNA in a cell. It is circular in shape, has just 37 genes, and is passed down only from the mother. In contrast, nuclear DNA (nDNA) is linear in shape and inherited from both parents.

    mtDNA is more prone to damage than nDNA. Each mitochondrion has many copies of mtDNA, which provides some backup protection. The repair systems for mtDNA are not as strong or efficient as those for nuclear DNA.

    Part 1

  • Dr. Krishna Kumari Challa

    When the engine's manual—the mtDNA—gets damaged, it's not always by a spelling mistake, a mutation.
    Sometimes, it's more like a sticky note that gets stuck to the pages, making it hard to read and use. That's what these GSH-DNA adducts are doing.
    The researchers linked the accumulation of the sticky lesions to significant changes in mitochondrial function. They observed a decrease in proteins needed for energy production and a simultaneous increase in proteins that help with stress response and mitochondrial repair, suggesting the cell fights back against the damage.

    The researchers also used advanced computer simulations to model the effect of the adducts.

    They found that the sticky tags make the mtDNA more rigid. This might be a way the cell 'marks' damaged DNA for disposal, preventing it from being copied and passed on.

    When mtDNA is damaged, it can escape from the mitochondria and trigger immune and inflammatory responses.

    The findings hold promise for understanding disease: problems with mitochondria and inflammation linked to damaged mtDNA have been connected to conditions such as neurodegeneration and diabetes.

    Yu Hsuan Chen et al, Glutathionylated DNA adducts accumulate in mitochondrial DNA and are regulated by AP endonuclease 1 and tyrosyl-DNA phosphodiesterase 1, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2509312122

    Part 2

  • Dr. Krishna Kumari Challa

    Smart tech maps moisture levels and adjusts watering automatically

    A wheat field was equipped with 86 solar-powered Bluetooth soil moisture sensors, enabling precise, real-time mapping of water levels. The system, integrated with a center pivot sprinkler and enhanced by a parabolic antenna, doubled the data transmission range to 600 m. This technology allows targeted irrigation, optimizing water use and supporting crop health during drought conditions.
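    As a rough sketch of how sensor readings could drive targeted watering, the toy function below averages each zone's moisture readings and waters only zones below a set point. The zones, readings, and the 25% threshold are invented for illustration; the study's actual control logic is not described here.

```python
# Hypothetical zone-based targeted irrigation: each sensor reports
# volumetric water content (VWC, %), and the pivot waters only zones
# whose average reading falls below a set point. Thresholds and
# readings are illustrative, not taken from the study.

def irrigation_plan(zone_readings, target_vwc=25.0):
    """Return a water/skip decision per zone from moisture readings."""
    plan = {}
    for zone, readings in zone_readings.items():
        avg = sum(readings) / len(readings)
        plan[zone] = "water" if avg < target_vwc else "skip"
    return plan

field = {
    "NE": [18.2, 20.1, 19.5],   # dry corner
    "NW": [27.8, 30.4, 29.0],   # recently watered
    "S":  [24.9, 22.3, 23.7],
}
print(irrigation_plan(field))
```

Averaging several sensors per zone smooths out single-sensor noise before a watering decision is made, which is the main point of mapping moisture at many points rather than one.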

     Samuel Craven et al, Smart Bluetooth Stakes: Deployment of Soil Moisture Sensors with Rotating High-Gain Antenna Receiver on Center Pivot Irrigation Boom in a Commercial Wheat Field, Sensors (2025). DOI: 10.3390/s25175537

  • Dr. Krishna Kumari Challa

    39,000-Year-Old Mammoth RNA Is the Oldest Ever Found and Sequenced

    RNA from one of the best-preserved woolly mammoths broke the record for oldest RNA ever found, pushing the limit of how long these famously labile molecules can last.
    In a recent Cell study, researchers recovered and sequenced RNA from an extraordinarily preserved 39,000-year-old woolly mammoth called Yuka, which was discovered in the Siberian permafrost more than a decade ago (1). This sample was nearly three times older than the previous record holder, an approximately 14,000-year-old wolf puppy (2).

    1. Mármol-Sánchez E, et al. Ancient RNA expression profiles from the extinct woolly mammoth. Cell. 2025;189:1-18.
    2. Smith O, et al. Ancient RNA from Late Pleistocene permafrost and historical canids …. PLoS Biol. 2019;17(7):e3000166.

  • Dr. Krishna Kumari Challa

    Moss spores survive 9 months outside International Space Station

    Mosses thrive in the most extreme environments on Earth, from the peaks of the Himalayas to the sands of Death Valley, the Antarctic tundra to the lava fields of active volcanoes. Inspired by moss's resilience, researchers sent moss sporophytes—reproductive structures that encase spores—to the most extreme environment yet: space.

    Their results, published in the journal iScience on November 20, show that more than 80% of the spores survived nine months outside of the International Space Station (ISS) and made it back to Earth still capable of reproducing, demonstrating for the first time that an early land plant can survive long-term exposure to the elements of space.

    Most living organisms, including humans, cannot survive even briefly in the vacuum of space. 

    However, the moss spores retained their vitality after nine months of direct exposure. This provides striking evidence that the life that has evolved on Earth possesses, at the cellular level, intrinsic mechanisms to endure the conditions of space.

    The researchers found that UV radiation was the toughest element to survive, and the sporophytes were by far the most resilient of the three moss parts. None of the juvenile moss survived high UV levels or extreme temperatures. The brood cells had a higher rate of survival, but the encased spores exhibited ~1,000x more tolerance to UV radiation. The spores were also able to survive and germinate after being exposed to −196°C for over a week, as well as after living in 55°C heat for a month.

    The team suggested that the structure surrounding the spore serves as a protective barrier, absorbing UV radiation and blanketing the inner spore both physically and chemically to prevent damage. The researchers note that this is likely an evolutionary adaptation that allowed bryophytes—the group of plants to which mosses belong—to transition from aquatic to terrestrial plants 500 million years ago and survive several mass extinction events since then.
    Part 1

  • Dr. Krishna Kumari Challa

    Over 80% of the spores survived their journey in space, and all but 11% of the remaining spores were able to germinate back in the lab. The team also tested the chlorophyll levels of the spores and found normal levels for all types, with the exception of a 20% reduction in chlorophyll a, a compound that is particularly sensitive to changes in visible light; this change didn't seem to affect the health of the spores.
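    The percent-of-percent figures above can be made concrete with an illustrative batch of 1,000 spores (the study's actual counts are not given here):

```python
# Worked example of the survival arithmetic, using an invented batch
# size: over 80% of spores survived, and all but 11% of those
# survivors (i.e., 89% of them) germinated back in the lab.
spores = 1000
survived = round(spores * 0.80)      # lower-bound survivors
germinated = round(survived * 0.89)  # 89% of the survivors
print(survived, germinated)
```

So roughly 7 in 10 of the original spores came back from space still able to grow, since the 11% loss applies only to the survivors, not to the whole batch.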

    This study demonstrates the astonishing resilience of life that originated on Earth.

     Extreme Environmental Tolerance and Space Survivability of the Moss, Physcomitrium patens, iScience (2025). DOI: 10.1016/j.isci.2025.113827

    Part 2

  • Dr. Krishna Kumari Challa

    Take a lesson from these birds on how to make new friends

    Making new friends has its challenges, even for birds. Researchers found that monk parakeets introduced to new birds will "test the waters" with potential friends to avoid dangerous close encounters that could lead to injury. They gradually approach a stranger, taking time to get familiar before escalating to riskier interactions.

    There can be a lot of benefits to being social, but these friendships have to start somewhere. Many parrots, for example, form strong bonds with one or two other birds. 

    But making that first contact carries risk, especially when animals are unfamiliar to one another.

    Birds that don't welcome a newcomer's attention can react aggressively, which can lead to injuries. 

    Researchers found that strangers were more likely to approach each other with caution compared to birds they knew. Stranger birds took time to share space before eventually perching shoulder to shoulder, touching beaks or preening others. Some strangers escalated further to sharing food or mating.

    This study had results comparable to a 2020 study of vampire bats that found that newcomers likewise test the waters, gradually escalating from social grooming relationships to food-sharing relationships with trustworthy partners.

    Claire L. O'Connell et al, Monk parakeets 'test the waters' when forming new relationships, Biology Letters (2025). DOI: 10.1098/rsbl.2025.0399

  • Dr. Krishna Kumari Challa

    Why a foreign language sounds like a blur to non-native ears

    Why is it so easy to hear individual words in your native language, but in a foreign language they run together in one long stream of sound?

    Researchers have begun to answer that question with two complementary studies that show how the brain learns the sound patterns of a language until it recognizes where one word ends and the next begins.

    When we speak naturally, we don't put pauses or "spaces" between words, yet fluent speakers effortlessly perceive them. For years, researchers assumed it was the brain areas that give meaning to speech that were figuring out the boundaries between words.

    The new studies focus on a different brain region, called the superior temporal gyrus, or STG. Until now, it was thought only to handle simple sound processing, like identifying consonants and vowels.

    The new studies show the STG contains neurons that learn to track where words begin and end over years of experience hearing a language.

    This shows that the STG isn't just hearing sounds, it's using experience to identify words as they're being spoken. This new work  gives us a neural blueprint for how the brain transforms continuous sound into meaningful units.

    In the Nature study, researchers recorded brain activity from 34 volunteers who were being monitored for epilepsy. Most spoke either Spanish, Mandarin, or English as their native language. Eight were bilingual, but no one spoke all three languages.

    Participants listened to sentences in English, Spanish, and Mandarin—languages that were both familiar and unfamiliar to them.

    The researchers used machine learning models to analyze patterns and found that when participants heard their native tongue or a language they knew, the specialized neurons in the STG lit up. But when participants heard a language they didn't know, the neurons failed to light up.
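    The decoding step can be illustrated with a toy nearest-centroid classifier: strong responses from the word-tracking units point to a language the listener knows, weak responses to one they don't. The feature values are synthetic and nearest-centroid is a stand-in; the study's actual machine-learning models are not specified here.

```python
# Toy sketch of the decoding idea: classify whether a listener knew
# the language from neural response features (here, synthetic firing
# rates of word-boundary-tracking STG units). All numbers invented.

def centroid(rows):
    """Column-wise mean of a list of feature vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(sample, centroids):
    """Assign the label whose centroid is nearest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Synthetic training data: units respond strongly ("light up") to a
# known language, weakly to an unknown one.
known   = [[0.90, 0.80, 0.85], [0.95, 0.70, 0.90]]
unknown = [[0.10, 0.20, 0.15], [0.05, 0.25, 0.10]]
centroids = {"known": centroid(known), "unknown": centroid(unknown)}

print(classify([0.88, 0.75, 0.80], centroids))
```

The real analyses work the same way in spirit: learn what response patterns accompany each condition, then ask which condition a new pattern most resembles.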

    Part 1

  • Dr. Krishna Kumari Challa

    The Neuron study showed how these specialized neurons detect the beginnings and endings of words.

    Given that fluent speakers utter several words per second, these neurons must rapidly reset to take note of the next word.

    It's like a kind of reboot, where the brain has processed a word it recognizes, and then resets so it can start in on the next word.
    The studies clarify why injury to certain regions of the brain can impair the ability to comprehend speech even when a person's hearing is intact.

    Yizhen Zhang et al, Human cortical dynamics of auditory word form encoding, Neuron (2025). DOI: 10.1016/j.neuron.2025.10.011

    Ilina Bhaya-Grossman et al, Shared and language-specific phonological processing in the human temporal lobe, Nature (2025). DOI: 10.1038/s41586-025-09748-8

    Part 2

  • Dr. Krishna Kumari Challa

    Depression tied to immune system imbalance, not just brain chemistry

    Major depressive disorder (MDD) is characterized by a lowered mood and loss of interest; it not only causes difficulties in academic and professional life but is also a major cause of suicide. However, there are currently no objective biological markers that can be used for diagnosis or treatment.

    Amidst this, a research team has revealed that depression is not merely a problem of the mind or brain, but is deeply connected to abnormalities in the body's overall immune response.

    They found that this immune abnormality affects brain function, and the "Immune Neural Axis" imbalance is the core mechanism of depression, opening up the possibility for the discovery of new biomarkers and the development of new drugs for depression treatment.

    The research team simultaneously examined genetic changes in immune cells in the blood and changes in nervous-system-related proteins. The results confirmed a breakdown in the balance of immune-neural interaction in patients with depression.

     Insook Ahn et al, Exploration of Novel Biomarkers Through a Precision Medicine Approach Using Multi‐Omics and Brain Organoids in Patients With Atypical Depression and Psychotic Symptoms, Advanced Science (2025). DOI: 10.1002/advs.202508383

  • Dr. Krishna Kumari Challa

    Aging alters the protein landscape in the brain—diet can counteract it, say researchers

    As we age, the composition and function of proteins in the brain change, affecting how well our brain performs later in life—influencing memory, responsiveness, and the risk of neurodegenerative diseases.

    An international research team has now discovered that a specific chemical modification, known as ubiquitylation, plays a crucial role in these processes. This modification determines which proteins remain active and which are targeted for degradation.

    Proteins perform vital tasks in the brain—they control metabolism, signal transmission, and energy balance in cells. To function properly, they must be constantly broken down, renewed, or chemically modified. One of these modifications, the so-called ubiquitylation, serves as a kind of molecular tag: it marks proteins for degradation and regulates their activity.

    The new analyses have shown that aging leads to fundamental changes in how the proteins in the brain are chemically labeled.

    The ubiquitylation process acts like a molecular switch—it determines whether a protein remains active, changes its function, or is degraded. In the aging brains of mice, researchers observed that this finely tuned system becomes increasingly unbalanced: many labels accumulate and some are even lost, regardless of how much of a particular protein is present.

    As we age, the cell's internal "recycling system" also begins to falter. The proteasome—a molecular machine responsible for breaking down damaged or unneeded proteins—gradually loses efficiency. As a result, proteins tagged for disposal with ubiquitin start to build up in the brain, a clear sign that its cellular cleanup machinery is no longer working properly.

    The researchers found that roughly one-third of the age-related changes in protein ubiquitylation in the brain can be directly linked to this decline in proteasome activity.

    The  data shows that the reduced ability of cells to completely eliminate damaged proteins is a central mechanism of the aging brain.

    The sensitive balance between protein synthesis and degradation shifts—a typical feature of cellular aging. In the long term, this can also impair the function of nerve cells in the brain.

    Part 1

  • Dr. Krishna Kumari Challa

    In a further step, the researchers investigated whether the ubiquitylation patterns found could be influenced by changes in diet. To this end, older mice were fed a moderate diet (calorie restriction) for four weeks before being returned to a normal diet. The surprising result was that the short-term change in diet significantly altered the ubiquitylation pattern in the mice—in some proteins, it even reverted to the previous, youthful state.

    The  results show that even in old age, diet can still have an important influence on molecular processes in the brain.

    However, diet does not affect all aging processes in the brain equally: some are slowed down, while others hardly change or even increase.

    The study thus provides new insights into the molecular mechanisms of brain aging. It suggests that ubiquitylation is a sensitive biomarker of the aging processes—and potentially a starting point for slowing down age-related damage to nerve cells.

    Antonio Marino et al, Aging and diet alter the protein ubiquitylation landscape in the mouse brain, Nature Communications (2025). DOI: 10.1038/s41467-025-60542-6

    Part 2

  • Dr. Krishna Kumari Challa

    Underlying cause of Gulf War illness confirmed
    Gulf War illness symptoms are linked to mitochondrial dysfunction rather than neuronal damage, as shown by elevated total creatine (tCr) levels in affected veterans' brains. This energy imbalance, rather than irreversible nerve injury, suggests that targeting mitochondrial function could offer effective treatments for the chronic symptoms experienced by Gulf War veterans.

     Sergey Cheshkov et al, Decadelong low basal ganglia NAA/tCr from elevated tCr supports ATP depletion from mitochondrial dysfunction and neuroinflammation in Gulf War illness, Scientific Reports (2025). DOI: 10.1038/s41598-025-24099-0

  • Dr. Krishna Kumari Challa

    Researchers diagnose disease with a drop of blood, a microscope and AI

    Not long ago, the idea of diagnosing a disease with a droplet of blood was considered a pipe dream. Today, that technology is close to becoming a reality.

    A group of scientists  has developed an automated, high-throughput system that relies on imaging droplets of biofluids (such as blood, saliva and urine) for disease diagnosis in an attempt to reduce the number of consumables and equipment needed for biomedical testing.

    In the workflow, biofluid droplet images are analyzed by machine-learning algorithms to diagnose disease. Remarkably, the technology relies on the drying process of biofluid droplets to distinguish between normal and abnormal samples.

    Current medical diagnostic tests typically require 5 milliliters to 10 milliliters of blood, necessitating a trip to the clinic or other phlebotomy service to draw blood with needles and tubes. Besides being painful, inconvenient and inefficient, blood draws are often a luxury of developed nations with modern health care infrastructure.

    By eliminating the need for phlebotomy services and other consumables, diagnostic tests could be implemented worldwide to improve disease diagnosis and cost efficiency.

    Traditionally, researchers have focused only on the final pattern left after drying. In this study, researchers looked beyond that, observing the entire drying process in real time. By tracking how the droplet's shape and internal structures evolve over time, they were able to uncover rich information about the fluid's composition.

    By using machine learning, the team could "decode" the evolving patterns in drying blood droplets, allowing them to clearly distinguish between healthy blood and samples with abnormalities based solely on their drying behaviour.

    Importantly, this technique doesn't require specialized equipment to make an accurate diagnosis.
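    The idea of classifying a sample from its drying dynamics can be sketched as follows: reduce a time series of droplet measurements to a few summary features, then apply a decision rule. The radius values, the features, and the 0.35 residue cutoff are all invented for illustration; the authors' actual measurements and models are not described here.

```python
# Illustrative sketch only: summarize a synthetic time series of
# droplet radius into simple features (mean shrink rate, final
# residue fraction) and classify with an invented threshold rule,
# standing in for the study's machine-learning models.

def drying_features(radii):
    """Reduce a radius-over-time series to two scalar features."""
    shrink_rates = [radii[i] - radii[i + 1] for i in range(len(radii) - 1)]
    mean_rate = sum(shrink_rates) / len(shrink_rates)
    residue_fraction = radii[-1] / radii[0]
    return mean_rate, residue_fraction

def classify_droplet(radii, residue_cutoff=0.35):
    """Hypothetical rule: abnormal samples leave a larger residue."""
    _, residue = drying_features(radii)
    return "abnormal" if residue > residue_cutoff else "normal"

healthy  = [1.00, 0.85, 0.65, 0.45, 0.25]
abnormal = [1.00, 0.92, 0.80, 0.65, 0.50]
print(classify_droplet(healthy), classify_droplet(abnormal))
```

Using the whole drying trajectory, rather than only the final stain, is what gives the method its extra information: two samples can end in similar patterns while getting there very differently.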

    Part 1