Science Simplified!

                       JAI VIGNAN

All about Science - to remove misconceptions and encourage scientific temper

Communicating science to the common people

'To make them see the world differently through the beautiful lens of science'

  • Dr. Krishna Kumari Challa

    Scientists shoot lasers into brain cells to uncover how illusions work

    An illusion occurs when we perceive an object that doesn't match the sensory input that reaches our eyes.

    In a new study published in Nature Neuroscience, researchers identified the key neural circuit and cell type that play a pivotal role in detecting these illusions—more specifically, their outer edges or "contours"—and how this circuit works.

    They discovered a special group of cells called IC-encoder neurons that tell the brain to see things that aren't really there, as part of a process called recurrent pattern completion.

    Because IC-encoder neurons have this unique capacity to drive pattern completion, the researchers think that they might have specialized synaptic output connectivity that allows them to recreate this pattern in a very effective manner.

    The researchers also know that these neurons receive top-down inputs from higher visual areas. The representation of the illusion arises in higher visual areas first and then gets fed back to the primary visual cortex; when that information is fed back, it's received by these IC-encoders in the primary visual cortex.

    In the context of the brain and vision, higher levels of the brain interpret a Kanizsa-style figure as a square and then tell the lower-level visual cortex to "see a square" even though the visual stimulus consists of four incomplete black circles.

    They made the discovery by observing the electrical brain activity patterns of mice when they were shown illusory images like the Kanizsa triangle. They then shot beams of light at the IC-encoder neurons, in a process called two-photon holographic optogenetics, when there was no illusory image present.

    When this happened, they noticed that even in the absence of an illusion, IC-encoder neurons triggered the same brain activity patterns that arise when an illusory image is present. They successfully emulated that brain activity by stimulating these specialized neurons.

    The findings shed light on how the visual system and perception work in the brain and have implications for diseases in which this system malfunctions. In certain diseases, abnormal patterns of activity emerge in the brain; in schizophrenia, these are related to object representations that pop up randomly.

    If we don't understand how those representations are formed, and how a collective set of cells works together to make them emerge, we won't be able to treat such conditions; so knowing which cells are involved, and in which layer this activity occurs, is helpful.

    Recurrent pattern completion drives the neocortical representation of sensory inference, Nature Neuroscience (2025). DOI: 10.1038/s41593-025-02055-5.

  • Dr. Krishna Kumari Challa

    Geologists discover where energy goes during an earthquake

    The ground-shaking that an earthquake generates is only a fraction of the total energy that a quake releases. A quake can also generate a flash of heat, along with a domino-like fracturing of underground rocks. But exactly how much energy goes into each of these three processes is exceedingly difficult, if not impossible, to measure in the field.

    Earthquakes are driven by energy that is stored up in rocks over millions of years. As tectonic plates slowly grind against each other, stress accumulates through the crust. When rocks are pushed past their material strength, they can suddenly slip along a narrow zone, creating a geologic fault. As rocks slip on either side of the fault, they produce seismic waves that ripple outward and upward.

    We perceive an earthquake's energy mainly in the form of ground shaking, which can be measured using seismometers and other ground-based instruments. But the other two major forms of a quake's energy—heat and underground fracturing—are largely inaccessible with current technologies.

    Now geologists have traced the energy that is released by "lab quakes"—miniature analogs of natural earthquakes that are carefully triggered in a controlled laboratory setting. For the first time, they have quantified the complete energy budget of such quakes, in terms of the fraction of energy that goes into heat, shaking, and fracturing.

    They found that only about 10% of a lab quake's energy causes physical shaking. An even smaller fraction—less than 1%—goes into breaking up rock and creating new surfaces. The overwhelming portion of a quake's energy—on average 80%—goes into heating up the immediate region around a quake's epicenter. In fact, the researchers observed that a lab quake can produce a temperature spike hot enough to melt surrounding material and turn it briefly into liquid melt.
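
    The averages above make for a simple back-of-the-envelope tally. Here is a minimal sketch in Python (the fractions are the rounded averages quoted above, and the 100 J total is hypothetical, purely for illustration):

```python
# Toy partition of a lab quake's energy budget, using the average
# fractions reported above: ~80% heat, ~10% shaking, <1% fracturing.
def energy_budget(total_joules, f_heat=0.80, f_shaking=0.10, f_fracture=0.01):
    """Split a quake's total energy into heat, shaking, and fracturing.
    Whatever remains is grouped as 'other' (unresolved dissipation)."""
    heat = f_heat * total_joules
    shaking = f_shaking * total_joules
    fracture = f_fracture * total_joules
    other = total_joules - heat - shaking - fracture
    return {"heat": heat, "shaking": shaking, "fracture": fracture, "other": other}

budget = energy_budget(100.0)  # a hypothetical 100 J lab quake
print(budget)
```

    These rounded fractions leave roughly 9% unallocated in this toy tally; in the study the split varies from quake to quake depending on the rocks' history.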

    The geologists also found that a quake's energy budget depends on a region's deformation history—the degree to which rocks have been shifted and disturbed by previous tectonic motions. The fractions of quake energy that produce heat, shaking, and rock fracturing can shift depending on what the region has experienced in the past.

    The team's lab quakes are a simplified analog of what occurs during a natural earthquake. Down the road, their results could help seismologists predict the likelihood of earthquakes in regions that are prone to seismic events.

     Daniel Ortega-Arroyo et al, "Lab-Quakes": Quantifying the Complete Energy Budget of High-Pressure Laboratory Failure, AGU Advances (2025). DOI: 10.1029/2025av001683

  • Dr. Krishna Kumari Challa

    Scientists use AI to decode protein structures behind bitter taste detection

    Receptor proteins, expressed on the cell surface or within the cell, bind to different signaling molecules, known as ligands, initiating cellular responses. Taste receptors, expressed in oral tissues, interact with tastants, the molecules responsible for the sensation of taste.

    Bitter taste receptors (T2Rs) are responsible for the sensation of bitter taste. However, apart from oral tissue, these receptors are also expressed in the neuropod cells of the gastrointestinal tract, which are responsible for transmitting signals from the gut to the brain. Thus, T2Rs might play a crucial role in maintaining the gut-brain axis.

    To date, 25 types of human T2Rs have been identified. However, due to certain complexities, the structure of most of these receptors has not yet been elucidated. In recent years, AI-based prediction models have been used to determine protein structures accurately. Previously, a Nobel Prize-winning artificial intelligence (AI)-based model, AlphaFold2 (AF2), was utilized to decipher the structures of T2Rs. With advances in the technology, the model has since been updated to its latest version, AlphaFold3 (AF3), which allows more detailed structural prediction than its predecessor.

    In this study, a group of researchers decided to analyze the structure of T2Rs using the AF3 model and compare the accuracy with the results from the AF2-based prediction study and the available three-dimensional structures of the two T2Rs, T2R14 and T2R46.

    The expression of bitter taste receptors in the gastrointestinal tract indicates that they are involved in maintaining the gut-brain axis, glucose tolerance, and appetite regulation. Hence, understanding their structure can provide better insight into their function.

    The researchers obtained the amino acid sequences of all human T2Rs from the UniProt database and used the AF3 model to predict their three-dimensional structures. For comparison, previously generated AF2 prediction data were retrieved from the AlphaFold database. The experimentally determined structures of T2R14 and T2R46 were sourced from the Protein Data Bank (PDB). Various software tools were employed for structure visualization, alignment, and accuracy assessment.

    The analysis revealed that AF3 provided consistently more accurate structural predictions than AF2. For T2R14, predictions were benchmarked against 115 cryo-EM structures, and AF3 showed a higher agreement with experimental data. Similarly, for T2R46, comparisons with three experimentally resolved structures confirmed that AF3 achieved the closest match in all cases.
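
    Benchmarking a predicted structure against an experimental one typically comes down to superposing the two and computing the root-mean-square deviation (RMSD) of matched atoms, where lower means closer agreement. A minimal sketch with hypothetical coordinates (not the study's data; real pipelines first align the structures, e.g. with the Kabsch algorithm):

```python
import numpy as np

def rmsd(pred, expt):
    """Root-mean-square deviation between two matched (N, 3) coordinate
    arrays, assuming the structures are already superposed."""
    pred, expt = np.asarray(pred, float), np.asarray(expt, float)
    return float(np.sqrt(np.mean(np.sum((pred - expt) ** 2, axis=1))))

# Hypothetical C-alpha coordinates for a 3-residue fragment (in angstroms)
experimental = [[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [7.6, 0.0, 0.0]]
prediction = [[0.1, 0.0, 0.0], [3.8, 0.2, 0.0], [7.5, 0.0, 0.1]]
print(round(rmsd(prediction, experimental), 3))  # sub-angstrom deviation
```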

    Part 1

  • Dr. Krishna Kumari Challa

    Similarities in the structure of the T2Rs were also analyzed for this study. For these receptors, part of the protein remains inside the cell (the intracellular region), while another part stays outside the cell (the extracellular region). The interaction with signaling molecules happens in the extracellular region. The study demonstrated that there are more structural similarities and consistencies among the intracellular regions of the T2Rs, while the extracellular regions show significant structural variation.

    Clustering of proteins is based on their structural similarity and dissimilarity. Based on their findings, the researchers divided the T2Rs into three different clusters.

    The structure of T2Rs probably allows them to recognize thousands of different bitter substances via interaction with a taste receptor-specific G protein, α-gustducin.

    With the receptors' involvement in detecting bitter tastants and maintaining the gut-brain axis, this work can play an important role in health and pharmaceutical research, specifically targeting lifestyle diseases like diabetes.

     Takafumi Shimizu et al, The three-dimensional structure prediction of human bitter taste receptor using the method of AlphaFold3, Current Research in Food Science (2025). DOI: 10.1016/j.crfs.2025.101146

    Part 2


  • Dr. Krishna Kumari Challa

    Kidney transplant rejection associated with changes in lymphatic vessels, new research shows

    Scientists have uncovered how lymphatic vessels—the kidney's "plumbing system"—undergo dramatic changes during chronic transplant rejection, becoming structurally disorganized and spreading to unusual parts of the kidney.

    Researchers used single-cell sequencing combined with powerful 3D imaging to look at small lymphatic vessels in kidney tissue, comparing healthy kidneys with transplanted kidneys that had been rejected.

    Published in the Journal of Clinical Investigation, the research sheds new light on a major unsolved challenge in kidney transplantation and could open the door to new treatments that help transplants last longer.

    Kidney transplantation is the most common form of solid organ transplant worldwide. Although the short-term outcomes of kidney transplantation—within a year after surgery—are very good, the long-term outcomes are poorer. Within 10 years, and depending on what country patients are treated in, roughly 50% of kidney grafts will fail.

    Researchers know that a big component of why kidney transplant failure occurs is that the patient's immune system attacks parts of the new kidney—such as the blood vessels within it. However, the role of the lymphatic vessels is far less understood. In healthy kidneys, lymphatic vessels act as the organ's plumbing system—playing a vital role in draining excess fluid and helping to regulate immune activity. Therefore, the researchers sought to gain a deeper understanding of the lymphatic system during transplant rejection.

    Part 1

  • Dr. Krishna Kumari Challa

    Researchers used two different and powerful methods—single cell RNA sequencing and advanced 3D imaging. They studied samples from both healthy and transplant rejection patients.

    Single-cell sequencing allows scientists to study the activity of genes in individual cells, one at a time. The researchers did this on a very large scale to generate a huge amount of data. The team then stained large chunks of intact kidney tissue and used a clearing procedure to make them transparent. This 3D imaging helped validate the predictions from the single-cell genetic analysis.

    The researchers found that during kidney transplant rejection, the lymphatic vessels within the transplant change their shape and organization. The vessels spread into deeper parts of the kidney known as the medulla, which normally has no lymphatic vessels within it. At the same time, the cell junctions, which are protein anchors that connect cells, go from looking like loose buttons to tightening up like zippers. This is a change that in other contexts is associated with immune cells getting trapped and unable to escape.

    Additionally, the researchers found that the balance of T cells inside and around the vessels was disrupted. These T cells released signals that made the vessels switch on molecules acting like "brakes" for the immune system, in an attempt to calm inflammation. However, this protective response was not enough, as other immune cells and antibodies were seen to be directly attacking the kidney. Strikingly, the vessels themselves were also carrying signs that they too were being targeted by the same harmful antibodies.

    These findings challenge the view that lymphatic vessels are simply good or bad in transplant rejection. This study suggests that the lymphatic system, normally protective, becomes impaired in transplant rejection: the vessels change in ways that could encourage rejection, altering their structure and fueling immune responses. The results pave the way for research focused on regenerating or protecting the lymphatic system in chronic kidney rejection.

    Daniyal J. Jafree et al, Organ-specific features of human kidney lymphatics are disrupted in chronic transplant rejection, Journal of Clinical Investigation (2025). DOI: 10.1172/jci168962

    Part 2

  • Dr. Krishna Kumari Challa

    Man's COVID Infection Lasted 2 Years, Setting a New Record

    An immunocompromised man endured ongoing acute COVID-19 for more than 750 days. During this time, he experienced persistent respiratory symptoms and was hospitalized five times.

    Despite its duration, the man's condition differs from long COVID: it wasn't a case of symptoms lingering after the virus had cleared out, but a viral phase of SARS-CoV-2 that continued for over two years.

    While this record may be easy to dismiss as something that occurs only to vulnerable people, persistent infections have implications for us all, researchers warn in their new study.

    https://www.thelancet.com/journals/lanmic/article/PIIS2666-5247(25)00050-3/fulltext

  • Dr. Krishna Kumari Challa

    A new explanation for Siberia's giant exploding craters

    Scientists may be a step closer to solving the mystery of Siberia's giant exploding craters. First spotted in the Yamal and Gydan peninsulas of Western Siberia in 2012, these massive holes, known as giant gas emission craters (GECs), can be up to 164 feet deep. They seem to appear randomly in the permafrost and are formed when powerful explosions blast soil and ice hundreds of feet into the air.

    For more than a decade, researchers have been coming up with theories about the origin of these craters, ranging from meteor impacts to gas explosions. However, none of these have been able to explain why the craters are only found in this specific area and not in the permafrost elsewhere in the Arctic.

    Now, research published in the journal Science of the Total Environment proposes a new and more complete explanation that links the craters to specific factors unique to the two peninsulas, the vast gas reserves in this region and the effects of climate change.

    "We propose that the formation of GECs is linked to the specific conditions in the area, including abundant natural gas generation and seepage and the overall limited thickness of the continuous permafrost," wrote the researchers in their paper.

    According to their model, GECs form when gas and heat rise from deep underground. The heat melts the permafrost seal (a layer of permanently frozen ground that acts as a lid), making it thinner. Meanwhile, the gas builds up underneath it, and with nowhere to go, the pressure rises. As the climate warms, the permafrost thaws even more, making the lid thinner. Eventually, pressure becomes too great and causes an explosive collapse that creates a large crater.
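
    This is a threshold process, and a deliberately crude toy model makes the logic concrete (all numbers and units here are arbitrary and hypothetical, not from the paper): pressure climbs as gas seeps in, the lid weakens as it thaws, and the crater forms the moment pressure exceeds what the lid can hold.

```python
def years_until_collapse(pressure=1.0, lid_strength=10.0,
                         gas_influx=0.5, thaw_rate=0.2, max_years=200):
    """Toy threshold model of a gas emission crater: pressure grows from
    gas seepage while warming weakens the permafrost lid; an explosive
    collapse occurs when pressure exceeds the lid's strength.
    Arbitrary units; an illustration, not the paper's model."""
    for year in range(1, max_years + 1):
        pressure += gas_influx      # gas accumulates under the frozen lid
        lid_strength -= thaw_rate   # warming thins and weakens the lid
        if pressure > lid_strength:
            return year
    return None  # the lid holds over the horizon considered

print(years_until_collapse())               # baseline scenario
print(years_until_collapse(thaw_rate=0.5))  # faster warming: earlier collapse
```

    Raising the thaw rate brings the collapse forward, which mirrors the paper's point that a warming climate accelerates the process.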

     Helge Hellevang et al, Formation of giant Siberian gas emission craters (GECs), Science of The Total Environment (2025). DOI: 10.1016/j.scitotenv.2025.180042

    Exploding Siberian Craters

  • Dr. Krishna Kumari Challa

    Study provides first evidence that plastic nanoparticles can accumulate in the edible parts of vegetables

    Plastic pollution represents a global environmental challenge, and once in the environment, plastic can fragment into smaller and smaller pieces.

    A new study shows for the first time that some of the tiniest particles found in the environment can be absorbed into the edible sections of crops during the growing process.

    The research used radishes to demonstrate, for the first time, that nanoplastics—some measuring as little as one millionth of a centimeter in diameter—can enter the roots, before spreading and accumulating into the edible parts of the plant.

    The researchers say the findings reveal another potential pathway for humans and animals to unintentionally consume nanoplastics and other particles and fibers that are increasingly present in the environment.

    It also underscores the need for further research to investigate what is an emerging food safety issue, and the precise impacts it could have on environmental and human health.

    This study provides clear evidence that particles in the environment can accumulate not only in seafood but also in vegetables. This work forms part of our growing understanding on accumulation, and the potentially harmful effects of micro- and nanoparticles on human health.

    Nathaniel J. Clark et al, Determining the accumulation potential of nanoplastics in crops: An investigation of 14C-labelled polystyrene nanoplastic into radishes, Environmental Research (2025). DOI: 10.1016/j.envres.2025.122687

  • Dr. Krishna Kumari Challa

    Estimated 16,500 climate change deaths during Europe summer: Study

    Scientists estimated this week that rising temperatures from human-caused climate change were responsible for roughly 16,500 deaths in European cities this summer, using modeling to project the toll before official data is released.

    The rapidly produced study is the latest effort by climate and health researchers to quickly link the death toll during heat waves to global warming—without waiting months or years to be published in a peer-reviewed journal.

    The estimated deaths were not actually recorded in the European cities, but instead were a projection based on methods such as modeling used in previously peer-reviewed studies.

    Death tolls during heat waves are thought to be vastly underestimated because the causes of death recorded in hospitals are normally heart, breathing or other health problems that particularly affect the elderly when the mercury soars.

    Researchers used climate modeling to estimate that global warming made temperatures an average of 2.2 degrees Celsius hotter in 854 European cities between June and August.

    Using historical data indicating how such soaring temperatures drive up mortality rates, the team estimated there were around 24,400 excess deaths in those cities during that time.

    They then compared this number to how many people would have died in a world that was not 1.3C warmer due to climate change caused by humans burning fossil fuels.

    Nearly 70%—16,500—of the estimated excess deaths were due to global warming, according to the rapid attribution study.

    This means climate change could have tripled the number of heat deaths this summer, said the study from scientists at Imperial College London and epidemiologists at the London School of Hygiene & Tropical Medicine.
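
    The headline figures follow from simple arithmetic on the two scenarios, which can be sanity-checked directly (using only the numbers quoted above):

```python
# Sanity-check the attribution arithmetic quoted above.
excess_with_warming = 24_400   # estimated excess deaths, actual summer
attributable = 16_500          # portion attributed to human-caused warming

share = attributable / excess_with_warming      # fraction due to warming
baseline = excess_with_warming - attributable   # toll expected without warming
multiplier = excess_with_warming / baseline     # how much warming scaled the toll

print(f"{share:.0%} attributable; {multiplier:.1f}x the no-warming toll")
```

    The share works out to about 68%, hence "nearly 70%", and the multiplier to about 3.1, hence "could have tripled".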

    The estimates did reflect previous peer-reviewed research, such as a Nature Medicine study which determined there were more than 47,000 heat-related deaths during the European summer of 2023.

    Numerous prominent climate and health researchers also backed the study.

    What makes this finding even more alarming is that the methods used in these attribution studies are scientifically robust, yet conservative.

    The actual death toll could be even higher, the researchers warn. And what if figures for the entire world were taken into account? Extremely alarming.

    Source: Nature Medicine

  • Dr. Krishna Kumari Challa

    Some small asteroids can abruptly explode

    Some asteroids are more dangerous than others, according to a report published in Nature Astronomy by an international team of researchers.

    The team presented findings from an investigation into the impact of the small asteroid 2023 CX1 over France in February 2023. The new paper reveals that small asteroids can explode on atmospheric entry.

    The researchers confirmed the existence of a new population of asteroids linked to L-type chondrites, capable of fragmenting abruptly in the atmosphere and releasing almost all their energy at once. 

    Such asteroids must be accounted for in planetary defense strategies, as they pose an increased risk to populated areas, they say.

    Auriane Egal et al, Catastrophic disruption of asteroid 2023 CX1 and implications for planetary defence, Nature Astronomy (2025). DOI: 10.1038/s41550-025-02659-8.

  • Dr. Krishna Kumari Challa

    Coral reefs set to stop growing as climate warms

    Most coral reefs in the western Atlantic will soon stop growing and may begin to erode—and almost all will do so if global warming hits 2°C, according to a new study.

    The study, published in the journal Nature, projects that more than 70% of the region's reefs will stop growing by 2040—and over 99% will do so by 2100 if warming reaches 2°C or more above pre-industrial levels. The paper is titled "Reduced Atlantic reef growth past 2°C warming amplifies sea-level impacts."

    Climate change—along with other issues such as coral disease and deteriorating water quality—reduces overall reef growth by killing corals and impacting colony growth rates, the study concludes.

     Chris Perry, Reduced Atlantic reef growth past 2°C warming amplifies sea-level impacts, Nature (2025). DOI: 10.1038/s41586-025-09439-4

  • Dr. Krishna Kumari Challa

    Ostrich and emu ancestor could fly, scientists discover

    How did the ostrich cross the ocean?

    We have long been puzzled by how the family of birds that includes African ostriches, Australian emus and cassowaries, New Zealand kiwis and South American rheas spread across the world—given that none of them can fly.

    However, a study published this week may have found the answer to this mystery: the family's oldest-known ancestors were able to take wing.

    The only currently living members of this bird family—known as paleognaths—that are capable of flight are the tinamous of Central and South America. But even then, these shy birds can only fly short distances, when they need to escape danger or clear obstacles.

    Researchers analyzed a specimen of a lithornithid, a member of the oldest paleognath group for which fossils have been discovered. Lithornithids lived during the Paleogene period, 66–23 million years ago.

    The fossil of the bird Lithornis promiscuus was first found in the US state of Wyoming, but had been sitting in the Smithsonian museum's collection.

    Because bird bones tend to be delicate, they are often crushed during the process of fossilization, but this one was not. 

    Crucially for this study, it retained its original shape. This allowed the researchers to scan the animal's breastbone, which is where the muscles that enable flight would have been attached.

    They determined that Lithornis promiscuus was able to fly—either by continuously beating its wings or alternating between flapping and gliding.

    But  why did these birds give up the power of flight?

    Birds tend to evolve flightlessness when two important conditions are met: they have to be able to obtain all their food on the ground, and there cannot be any predators to threaten them.

    Part 1

  • Dr. Krishna Kumari Challa

    Research has also recently revealed that lithornithids may have had a bony organ on the tip of their beaks which made them excel at foraging for insects.

    But what about the second condition—a lack of predators?

    Researchers suspect that paleognath ancestors likely started evolving towards flightlessness after dinosaurs went extinct around 65 million years ago.

    With all the major predators gone, ground-feeding birds would have been free to become flightless, which would have saved them a lot of energy.

    The small mammals that survived the event that wiped out the dinosaurs would have taken some time to evolve into predators.

    This would have given flightless birds "time to adapt by becoming swift runners" like the emu, ostrich and rhea—or even "becoming themselves dangerous and intimidating," like the cassowary.

    Quantitative analysis of stem-palaeognath flight capabilities sheds light on ratite dispersal and flight loss, Biology Letters (2025). DOI: 10.1098/rsbl.2025.0320

    Part 2

  • Dr. Krishna Kumari Challa

    How an essential vitamin B-derived nutrient concentrates in mitochondria

    There's a molecule that our body makes from vitamin B5 that is critical for all of the metabolic processes essential for human life. And when something goes wrong in that molecule's production, it affects nearly every organ system in our body and causes a number of diseases.

    Researchers have discovered that up to 95% of this molecule—the essential cofactor coenzyme A (CoA)—is located inside mitochondria, the organelles that supply cellular energy and regulate cellular metabolism. But it has not been clear how CoA gets there.

    Reporting in Nature Metabolism, researchers have now uncovered that CoA is trafficked into mitochondria and have identified the mechanisms responsible.

    This information, the researchers say, is important for future considerations about when and where to target treatments for diseases in which CoA is implicated.

    The researchers were able to identify 33 different CoA conjugates in whole cells as well as 23 CoA conjugates in mitochondria.

    The question then was whether the CoA conjugates in the mitochondria were made there or brought in from elsewhere.

    In additional experiments, the researchers discovered that the enzyme required to make CoA largely exists outside of mitochondria. Further, when they made cells that lacked the molecular transporters that can move CoA around, mitochondria had far less CoA.

    These findings strongly support the idea that CoA is being imported into mitochondria, and these transporters are required for that to happen.

    This study advances the fundamental understanding of CoA and how it gets to where it needs to be in order to perform its essential functions. That, in turn, sheds light on how disruptions of this process might contribute to illness.

    For instance, mutations in the genes that produce CoA transporters are associated with diseases such as encephalomyopathy, a disorder that can include neurodevelopmental delay, epilepsy, and decreased muscle tone. Mutations in the enzymes that produce CoA have been implicated in neurodegeneration.

    In the context of brain disorders, such as neurodegeneration and psychiatric disorders, there's an emerging idea that dysregulated mitochondrial metabolism is a contributor.

    Ran Liu et al, Cellular pan-chain acyl-CoA profiling reveals SLC25A42/SLC25A16 in mitochondrial CoA import and metabolism, Nature Metabolism (2025). DOI: 10.1038/s42255-025-01358-y

  • Dr. Krishna Kumari Challa

    COVID-19 vaccine responses show four patterns, with 'rapid-decliners' at higher infection risk

    Two health care workers get COVID-19 vaccinations on the same day. Both show strong antibody responses initially, but six months later one stays healthy while the other contracts the virus. A new study published in Science Translational Medicine could help explain this difference.

    Researchers tracked individuals' antibody levels after vaccinations and identified four distinct patterns of immune response after the first booster vaccination. Notably, people in the group that started with the highest antibody levels but experienced a faster decline were infected earlier. People with lower blood levels of IgA(S) antibodies, which protect the nose and throat, were also at higher risk. The findings suggest that monitoring how antibody levels change over time could assist in identifying individuals at greater risk of infection.

    The research team measured antibody levels in 2,526 people over 18 months to see how vaccine responses changed between the first vaccination and later booster shots. They developed a mathematical classification system for COVID-19 vaccine responses using long-term tracking and AI-based computer analysis, becoming the first to systematically identify and characterize the "rapid-decliner" group.

    The researchers found that immune responses fell into four clear patterns: Some people maintained high antibody levels over time (durable responders), others started with strong levels but lost them quickly (rapid-decliners), a third group produced few antibodies that also declined rapidly (vulnerable responders), and the rest fell in between (intermediate responders).

    A breakthrough or subsequent infection refers to infections that occur after vaccination because the virus overcomes the immune protection that vaccines provide. The researchers found that people whose antibodies declined faster, either because they started low or dropped quickly (vulnerable responders and rapid-decliners), were slightly more likely to get breakthrough infections earlier.

    After booster vaccinations, 29% of participants fell into the durable responder category, 28% were vulnerable responders, and 19% were rapid-decliners. The remaining participants showed intermediate patterns. The differences in breakthrough infection rates between groups were modest—5.2% for durable responders and 6% for vulnerable and rapid-decliners.
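
    From the percentages above, the size of the intermediate group and the gap between infection rates follow directly (a quick check using only the figures quoted here):

```python
# Group shares after booster vaccination, as percentages of participants
durable, vulnerable, rapid = 29, 28, 19
intermediate = 100 - durable - vulnerable - rapid   # the remainder

# Breakthrough infection rates (%), durable vs. faster-declining groups
rate_durable, rate_decliner = 5.2, 6.0
relative_risk = rate_decliner / rate_durable        # modest elevation

print(intermediate, round(relative_risk, 2))
```

    Roughly a quarter of participants fall in the intermediate group, and the faster-declining groups carry only about a 15% higher breakthrough rate, consistent with the "modest" differences described above.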

    Part 1

  • Dr. Krishna Kumari Challa

    The study also revealed that participants who experienced breakthrough infections had lower levels of IgA(S) antibodies in their blood several weeks after vaccination. These antibodies protect the nose and throat and are our first line of defense against respiratory viruses.

    Importantly, the researchers found a strong correlation between blood IgA(S) levels and nasal IgA(S) levels, suggesting that blood tests can reliably indicate the strength of immune protection in airways. As a result, measuring blood IgA(S) levels after vaccination may help identify individuals at higher risk for breakthrough infection, especially among vulnerable groups.

    The researchers emphasize the importance of identifying the underlying biological mechanisms responsible for the rapid decline in antibody levels in order to develop more effective vaccination strategies.

    Previous research points to factors such as age, genetic variation, vaccine-specific characteristics, and environmental influences, including sleep habits, stress levels, and medications being taken at the same time.

    Identifying the rapid-decliner pattern is especially important—it helps explain why some people may need boosters sooner than others.

    This could potentially contribute to better, more personalized vaccination strategies.

     Longitudinal antibody titers measured after COVID-19 mRNA vaccination can identify individuals at risk for subsequent infection, Science Translational Medicine (2025). DOI: 10.1126/scitranslmed.adv4214

    Part 2

  • Dr. Krishna Kumari Challa

    Scientists discover proteins that initiate cellular immunity in bone marrow

    Researchers have shown how a critical pathway is fundamental to the immune system.

    Establishing cellular immunity depends on the thymus, a lymphoid gland located in front of the heart. Using stem cells from the bone marrow as building blocks, this gland produces T cells, a workhorse white blood cell, and exports them to the rest of the body. But how T cell fate is initiated has remained a riddle.

    The new paper shows that two protein "transcription factors" called Tcf1 and Lef1 are critical modulators that direct bone marrow stem cells to the T cell path in the thymus.

    By carefully removing these proteins in in vivo and ex vivo models, the team of scientists revealed a foundational event in the immune system, one that essentially represents the very origin of functional cellular immune competence.

    This discovery offers a new understanding of T cell formation, and could lead to a wide range of novel approaches for treating immune deficiencies and autoimmune diseases and for optimizing immunotherapies in the ongoing fight against cancer.

    "These findings reveal that Tcf1 and Lef1 act much earlier than previously recognized, extending beyond their roles in promoting T-cell lineage specification and commitment at later stages in the thymus," the authors write, adding that the "downstream" Notch signaling pathway is corrupted without these two assisting proteins.

     Xin Zhao et al, Single-cell multiomics identifies Tcf1 and Lef1 as key initiators of early thymic progenitor fate, Science Immunology (2025). DOI: 10.1126/sciimmunol.adq8970

  • Dr. Krishna Kumari Challa

    Who is vulnerable to health disinformation?

    Individuals who like to think critically are better at identifying false information online, while those with conservative political affiliations struggle more with detecting fake medical information on social media, according to a PLOS One study.

    Researchers analyzed responses from 508 participants who reviewed 10 different social media posts covering diverse health topics; the posts contained both honest and disinforming claims, with 60% falling into the second category.

    After viewing the videos and making their honesty judgments, participants were asked to record their responses through online surveys, selecting up to eight reasons from a predefined list. These reasons included the reliability of the source, whether claims were supported by evidence or based on opinion, and the presence or absence of credentials.

    Overall, people detected health disinformation with 66% accuracy. This performance is troubling, especially with the world living through an infodemic, a flood of online information where fact and fiction mix freely. Finding trustworthy health guidance online has become increasingly difficult, owing to black box–like social media algorithms that amplify accurate and misleading posts alike.

    The COVID-19 pandemic made quite evident the dangers of health disinformation spreading online. Misleading health advice in digital spaces is no longer just a matter of debate; it is a pressing public health concern with serious real-world consequences.

    Previous studies have shown that political leanings can be an indicator of how a person fares at spotting false information online. More recent research, however, highlights a personality trait known as need for cognition—the tendency to seek out and enjoy analytical thinking—as an even stronger predictor.

    The results indicated that those with a high need for cognition were significantly better at identifying false information. At the same time, political leanings influenced how certain posts were perceived, particularly on polarizing topics such as COVID-19 vaccines and FDA warnings about drugs like ivermectin and hydroxychloroquine.

    The researchers note that the nature of the current infodemic calls for deeper investigation into how disinformation on health issues affects people's everyday lives.

    Joey F. George, Political affiliation or need for cognition? It depends on the post: Comparing key factors related to detecting health disinformation in the U.S., PLOS One (2025). DOI: 10.1371/journal.pone.0315259


  • Dr. Krishna Kumari Challa

    Jaguar swims over a kilometer, showing dams are not absolute barriers to large carnivores

    Scientists in Brazil recently recorded evidence that a jaguar visited an isolated island in the reservoir area of the Serra da Mesa Hydroelectric Power Dam in northern Goiás State. The same jaguar had been identified on the mainland, 2.48 km away from the island, back in 2020. Both instances were recorded by camera-trap stations, three on the mainland and one on the island, which were set up for an exploratory jaguar survey. The specific jaguar's identity was confirmed by spot-pattern analysis.

    After analyzing possible aquatic trajectories, the researchers found that the jaguar had two possible paths: either it swam the direct 2.48 km to the island, or it used a stepping-stone-like islet, taking a 1.06 km swim followed by a 1.27 km swim to reach the island.

    Previous records indicated a maximum swimming distance of around 200 meters for jaguars. Given the possible paths here, this jaguar had to swim a minimum of 1.27 km, possibly more, shattering previous records. This impressive feat is documented in a new bioRxiv preprint, in which the researchers involved also discuss a newly proposed aquatic-cost scale for assessing the ecological connectivity between landmasses.

    It was previously thought that reservoirs, like Serra da Mesa, acted as absolute barriers for large carnivores due to prior instances of predator collapse on islands further than a kilometer away from the mainland. Yet, genetic studies across the Amazon River indicated only partial segregation, suggesting occasional crossings.

    The study authors explain, "These rare events suggest that, under favorable conditions (e.g., warm water, low currents, presence of stepping-stone islands), large felids may occasionally exploit aquatic corridors that appear to be initially insurmountable."

    The research team proposes a new ordinal aquatic-cost scale for modeling connectivity between landmasses, described by low/medium/high cost ranking for water crossings. Low cost was defined as less than 300 m; medium as 300–1,000 m with stepping-stones; and high as greater than 1,000 m of open water. This scale is meant to help inform future hydropower impact assessments and corridor planning for jaguar conservation.
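
    As a minimal sketch, the proposed scale maps onto a simple classifier. The function name is invented, and how the scale treats 300–1,000 m crossings without stepping stones is not spelled out above, so ranking those "high" here is an assumption.

```python
def aquatic_cost(distance_m: float, has_stepping_stones: bool = False) -> str:
    """Rank a water crossing on the proposed ordinal aquatic-cost scale:
    low    - under 300 m
    medium - 300-1,000 m with stepping-stone islands
    high   - more than 1,000 m of open water
    (Mid-range crossings without stepping stones are ranked high here
    as an assumption; the text does not define that case.)"""
    if distance_m < 300:
        return "low"
    if distance_m <= 1000 and has_stepping_stones:
        return "medium"
    return "high"

# The jaguar's two candidate routes from the study:
print(aquatic_cost(2480))                            # direct swim: "high"
print(aquatic_cost(1270, has_stepping_stones=True))  # longest leg via islet: "high"
```

    Either route rates as a high-cost crossing, which is why this single confirmed swim pushes modelers to treat reservoirs as permeable rather than absolute barriers.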

    Leandro Silveira et al, Kilometre-scale jaguar swimming reveals permeable hydropower barriers: implications for conservation in the Cerrado hotspot, bioRxiv (2025). DOI: 10.1101/2025.09.05.674446

  • Dr. Krishna Kumari Challa

    Carbon credits have little to no effect on making companies greener, study reveals

    Many companies across the world use carbon credits as part of their climate strategies to offset emissions. A carbon credit is a certificate that represents the reduction, avoidance or removal of one ton of carbon dioxide from the atmosphere. While organizations claim these credits help them reduce their environmental impact, there is debate about whether companies that buy credits decarbonize faster. However, an in-depth study of 89 multinationals, published in Nature Communications, reveals that companies that purchase credits do not decarbonize any quicker than those that do not.

    "Voluntary emission offsetting is not associated with positive corporate environmental performance. Therefore, it is not a reliable alternative to regulatory measures, such as compliance carbon pricing," wrote the researchers in their paper.

    Researchers examined over 400 sustainability reports and self-reported environmental data from multinational companies working in the oil and gas, automotive and airline industries. These were firms that bought and used roughly a quarter of all carbon credits available in 2022.

    They then compared how much these companies reduced their emissions between 2018 and 2023, and how ambitious their climate goals were, with the amount of carbon credits they purchased. To make sure the company data was accurate, the researchers checked it against major carbon credit agencies.

    The study found that, on average, companies devote about 1% of their capital expenditure to carbon credits, meaning the credits account for a small share of the overall budget. The research also highlights a problem: carbon offsets can compete with internal decarbonization.

    For some large-scale offsetters, buying large amounts of carbon credits can divert funds from internal projects that would directly cut their emissions. Other companies use carbon credits to meet their goals because doing so is cheaper and easier than making internal structural changes.

    The researchers suggest a way forward, namely a shift away from voluntary carbon offsetting to regulatory measures, such as carbon compliance. This is a government-mandated system where companies must pay for the carbon they emit. The main goal here is to create a financial incentive for companies to reduce their carbon emissions.

    Niklas Stolz et al, The negligible role of carbon offsetting in corporate climate strategies, Nature Communications (2025). DOI: 10.1038/s41467-025-62970-w

  • Dr. Krishna Kumari Challa

    Our body organs cannot simply be classified as male or female

    Biological sex is usually described in simple binary terms: male or female. This works well for germ cells (sperm versus eggs), but for other body organs it is of little help.

    A new study shows that our organs form a mosaic of sex-specific characteristics—far removed from the strict division into male and female.

    The research shows that in many organs, sex-specific patterns overlap strongly. Only testes and ovaries are clearly distinguishable. All other organs show mosaic-like combinations of male and female characteristics.

    Sex-specific genes stand out most strongly in the sexual organs. But in other organs the picture is more complex. In mice, the kidney and liver show large differences, while in humans it is adipose tissue. By contrast, the brain shows only minimal differences in both species—consistent with previous studies of human brain structure.

    To capture this diversity, the researchers developed a Sex-Bias Index (SBI). This index summarizes the activity of all male- and female-specific genes in an organ into a single value. While the index shows a clear separation in the sexual organs, in other organs the values are often so close that men and women cannot be distinguished reliably.
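
    A rough sketch of the idea behind such an index: collapse the expression of all male- and female-biased genes in an organ into one value. The normalized difference below is purely illustrative; the paper's actual SBI formula is not reproduced here.

```python
def sex_bias_index(female_expr, male_expr):
    """Illustrative stand-in for a Sex-Bias Index: summarize the
    activity of female- and male-biased genes in one organ as a
    single value in [-1, 1], where -1 is fully male-like and +1 is
    fully female-like. The published formula may differ; this is a
    sketch of the concept, not the study's method."""
    f = sum(female_expr)
    m = sum(male_expr)
    return (f - m) / (f + m)

# Hypothetical per-gene expression values for one individual's heart:
heart = sex_bias_index(female_expr=[5.0, 3.2, 4.1], male_expr=[4.8, 3.9, 3.5])
print(round(heart, 3))  # near 0: male and female activity overlap in this organ
```

    For gonads the two sums would differ sharply, pushing the value toward an extreme; for most somatic organs they nearly cancel, which is exactly the overlap the study reports.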

    For example, a man's heart may be more "female-like" than that of some women. Even within an individual, organs can differ—the heart more female, the liver more male. This results in a mosaic of sex characteristics that contradicts the idea of a clear-cut binary.

    The study, published in eLife, also shows that sex-specific gene activity in organs evolves very rapidly—much faster than genes active in both sexes equally. Even between mouse species that diverged less than 2 million years ago, the majority of genes have lost or even switched their sex-specific role.

    As a result, when comparing humans and mice, only very few genes retain conserved sex-specific activity. This also means that mouse models are of very limited use when applied to sex-specific medicine in humans.

    Part 1

  • Dr. Krishna Kumari Challa

    The researchers further found that sex-specific genes often occur in "modules" that are regulated together. Evolution therefore alters sex differences not by changing single genes, but by rearranging whole networks. The driving force here is sexual selection—the ongoing evolutionary conflict between the interests of males and females. This conflict can never be fully resolved, as every adaptation creates new contrasts.

    When applied to human tissues, the method reveals a clear pattern: markedly fewer sex-specific genes than in mice, and even stronger overlaps between men and women. In our species, differences are therefore weaker, further undermining the idea of a strict binary classification.

    The study concludes that while the sexual organs show a clear binary pattern, most other tissues display a continuum of sex-specific gene activity—a dynamic spectrum that varies both between species and between individuals.

    Sex is therefore not rigid and clear-cut, but shaped by evolution, overlaps and individual differences. Instead of classifying the body strictly as male or female based on molecular features, it should be understood as a complex mosaic.

    Chen Xie et al, Fast evolutionary turnover and overlapping variances of sex-biased gene expression patterns defy a simple binary sex classification of somatic tissues, eLife (2025). DOI: 10.7554/elife.99602.4

    Part 2

  • Dr. Krishna Kumari Challa

    Grue jay? Rare hybrid bird identified 

    Biologists who reported discovering a bird that is the natural result of a green jay mating with a blue jay say it may be among the first examples of a hybrid animal that exists because of recent shifts in climate patterns. The two parent species are separated by 7 million years of evolution, and their ranges did not overlap as recently as a few decades ago.

    They think it is the first observed vertebrate to have hybridized as a result of two species both expanding their ranges due, at least in part, to climate change.

    Past vertebrate hybrids have resulted from human activity, like the introduction of invasive species, or the recent expansion of one species' range into another's—think polar bears and grizzlies—but this case appears to have occurred when shifts in weather patterns spurred the expansion of both parent species.

    Hybridization is probably far more common in the natural world than researchers realize, because such events so often go unobserved and unreported.

    The researchers did not opt to name the hybrid bird, but other naturally occurring hybrids have received nicknames like "grolar bear" for the polar bear-grizzly hybrid, "coywolf" for a creature that's part coyote and part wolf and "narluga" for an animal with both narwhal and beluga whale parents.

    Brian R. Stokes et al, An Intergeneric Hybrid Between Historically Isolated Temperate and Tropical Jays Following Recent Range Expansion, Ecology and Evolution (2025). DOI: 10.1002/ece3.72148

  • Dr. Krishna Kumari Challa

    Age 70 identified as cutoff for chemotherapy benefit in colorectal cancer

    Colorectal cancer remains a leading cause of cancer death, with incidence rising among older adults. One of the most pressing clinical questions has been whether elderly patients should receive oxaliplatin, a standard component of adjuvant chemotherapy that is known to cause serious side effects.

    Researchers examined health records from more than 8,500 patients with stage II or III colorectal cancer who underwent surgery followed by chemotherapy between 2014 and 2016. Patients were divided into two groups: those treated with oxaliplatin-based combinations, and those given standard chemotherapy alone. Using advanced statistical methods, the researchers systematically tested whether an age threshold existed at which oxaliplatin stopped providing survival benefits.

    The results were decisive. In stage III patients aged 70 or younger, oxaliplatin reduced the risk of death by 41%, boosting five-year survival from 78% to nearly 85%. But in those older than 70, oxaliplatin did not improve survival and was linked to higher rates of treatment discontinuation. In fact, almost 40% of older patients receiving oxaliplatin stopped chemotherapy early, often due to toxicity. For stage II patients of any age, oxaliplatin showed no added survival benefit.

    The most important point is that oxaliplatin improves survival only in patients with stage III colorectal cancer who are aged 70 years or younger. Beyond 70, the benefit disappears, and oxaliplatin is associated with higher discontinuation rates due to toxicity.

    These findings have immediate real-world applications.

    Oncologists can use this age threshold to make more precise, evidence-based choices about whether to add oxaliplatin, avoiding unnecessary toxicity in patients unlikely to benefit.

    The broader significance extends to health care policy. Avoiding ineffective chemotherapy in older patients may help reduce costs, complications, and hospitalizations. Health systems could redirect resources to therapies and supportive care that make a greater difference in survival and quality of life. The research also sets the stage for longer-term changes in global cancer care.

    Jun Woo Bong et al, Older Age Threshold for Oxaliplatin Benefit in Stage II to III Colorectal Cancer, JAMA Network Open (2025). DOI: 10.1001/jamanetworkopen.2025.25660

  • Dr. Krishna Kumari Challa

    India health alert after 'brain-eating' amoeba rise

    India has issued a health alert after infections and deaths caused by a rare water-borne "brain-eating" amoeba doubled compared to last year in the southern state of Kerala.

    Officials said they were "conducting tests on a large scale across the state to detect and treat cases."

    Officials reported 19 deaths and 72 infections of the Naegleria fowleri amoeba this year, including nine deaths and 24 cases in September alone.

    Last year, the amoeba killed nine people out of 36 reported cases.

    The US Centers for Disease Control and Prevention (CDC) says it is often called a "brain-eating amoeba" because it can "infect the brain and destroy brain tissue."

    If the amoeba reaches the brain, it can cause an infection that kills over 95% of those affected.

    Infections are "very rare but nearly always fatal," the CDC notes.

    The amoeba lives in warm lakes and rivers and is contracted by contaminated water entering the nose. It does not spread from person to person.

    The World Health Organization says that symptoms include headache, fever and vomiting, which rapidly progresses to "seizures, altered mental status, hallucinations, and coma."

    Source: News Agencies

  • Dr. Krishna Kumari Challa

    New cancer clue discovered in our saliva

    Scientists have discovered a brand-new type of DNA element swimming in our saliva.

    The giant loops of genetic information – named ‘inocles’ – are linked to oral health, the immune system, and cancer risk.

    Those with head and neck cancers show far fewer of these DNA elements in their mouths.

    • Inocles are found in roughly 75 percent of the population.

    • Like little survival kits, their presence might help certain oral bacteria during stressful times, possibly altering the microbiome in our mouths.

    How that impacts tumor growth is a mystery to solve. Right now, 95 percent of the genes in inocles remain elusive.

    https://www.nature.com/articles/s41467-025-62406-5

  • Dr. Krishna Kumari Challa

    Sugary drinks may increase risk of metastasis in advanced colorectal cancer

    A new study by researchers shows that the glucose-fructose mix found in sugary drinks directly fuels metastasis in preclinical models of advanced colorectal cancer. The study was published today in Nature Metabolism.

    The researchers studied how sugary drinks may affect late-stage colorectal cancer. Using laboratory cancer models, they compared the effects of the glucose-fructose mix found in most sugary drinks with those of glucose or fructose alone. Only the sugar mix made cancer cells more mobile, leading to faster spread to the liver—the most common site of colorectal cancer metastasis.

    The sugar mix activated an enzyme called sorbitol dehydrogenase (SORD), which boosts glucose metabolism and triggers the cholesterol pathway, ultimately driving metastasis. This is the same pathway targeted by statins, common heart drugs that inhibit cholesterol production. Blocking SORD slowed metastasis, even with the sugar mix present. These findings suggest that targeting SORD could also offer an opportunity to block metastasis.

    These  findings highlight that daily diet matters not only for cancer risk but also for how the disease progresses once it has developed.

    Fructose and glucose from sugary drinks enhance colorectal cancer metastasis via SORD, Nature Metabolism (2025). DOI: 10.1038/s42255-025-01368-w

  • Dr. Krishna Kumari Challa

    Think before you ink: Uncovering the hidden risks in tattoo inks

    New research suggests that what's in the ink may pose greater risks than the size and design of the tattoo.

    A new study has revealed that the ingredients listed on tattoo ink labels often don't match what's actually inside the bottle. Researchers warn that this comes with risk because there are currently few regulations, laws and safety criteria for tattoo and permanent cosmetic formulations.

    The findings, published this month as the cover story in the Journal of Environmental Health, raise fresh concerns about the safety and regulation of tattoo inks.

    Using a combination of advanced analytical techniques, researchers found discrepancies between labeled and actual ingredients in a range of commercially available yellow tattoo inks.

    The results showed not only discrepancies with label claims, but also the presence of unlisted elements such as aluminum, sodium and silicon.

    Batool A. Aljubran et al, Decoding Tattoo Inks: Multiple Analysis Techniques Reveal Discrepancies in Ingredient Composition and Elemental Content When Compared Against Label Claims, Journal of Environmental Health (2025). DOI: 10.70387/001c.143999

  • Dr. Krishna Kumari Challa

    Study finds 10% of pediatric blood cancers may stem from medical imaging radiation

    A new study has concluded that radiation from medical imaging is associated with a higher risk of blood cancers in children.

    Researchers examined data from nearly 4 million children and estimated that one in 10 blood cancers—some 3,000 cancers in all—may be attributable to radiation exposure from medical imaging. The risk increased proportionally based on the cumulative amount of radiation the children received.

    The investigation is the first comprehensive assessment using data from children and adolescents in North America that quantifies the association between radiation exposure from medical imaging and blood and bone marrow cancers, such as leukemia and lymphoma, which are the most common forms of cancer in children and adolescents.

    Medical imaging saves lives by enabling timely diagnosis and effective treatment, but it also exposes patients to ionizing radiation, a known carcinogen, particularly through computed tomography (CT).

    Children are particularly vulnerable to radiation-induced cancer due to their heightened radiosensitivity and longer life expectancy.

    The authors caution that doctors and parents should avoid excessive radiation doses and minimize exposure when clinically feasible.

    Investigators found a significant relationship between cumulative radiation dose and the risk of a hematologic malignancy, which includes tumors affecting the blood, bone marrow, lymph, and lymphatic system.

    Among all the forms of medical imaging, the study found that chest radiography was the most common imaging exam that doctors performed. The most common form of CT was of the head and brain.

    For children who underwent a head CT, the researchers attributed about a quarter of the children's subsequent hematologic malignancies to radiation exposure.

    The authors emphasized that while medical imaging remains an invaluable tool in pediatric care, their findings highlight the need to carefully balance its diagnostic benefits with potential long-term risks.

    Rebecca Smith-Bindman, et al. Medical Imaging and Pediatric and Adolescent Hematologic Cancer Risk. The New England Journal of Medicine (2025). DOI: 10.1056/NEJMoa2502098

  • Dr. Krishna Kumari Challa

    El Niño, which means "little boy" in Spanish, is a complex, cyclical climate pattern that warms the surface waters of the eastern Pacific Ocean. This weakens trade winds and alters atmospheric conditions, a process that has long been known to suppress summer rainfall in India.

    Studies found a direct connection between Indian rainfall and El Niño. In the drier regions of southeastern and northwestern India, this climate phenomenon reduces total rainfall and the intensity of extreme events. However, in the wet central and southwestern parts of the country, it results in less frequent but more intense downpours. This is due to changes in convective buoyancy, an atmospheric force that powers storms.

    During El Niño summers, convective buoyancy increases in the wettest regions, which makes more extreme rainfall likely. The chance of a very heavy downpour (more than 250 mm of rain) increases by 43% across all of India's monsoon-affected areas and 59% for the Central Monsoon Zone during El Niño summers.

    "Extreme daily rainfall over the Indian summer  monsoon's rainiest areas becomes more likely the stronger that El Niño conditions are," commented the researchers in their study.

    Spencer A. Hill, More extreme Indian monsoon rainfall in El Niño summers, Science (2025). DOI: 10.1126/science.adg5577


  • Dr. Krishna Kumari Challa

    Hormone therapy eases hot flashes while leaving heart risk unchanged in younger women

    Researchers report that menopausal hormone therapy relieved vasomotor symptoms without raising cardiovascular risk in younger women, but increased risk was evident in women aged 70 years and older.

    Concerns about the safety of hormone therapy have discouraged many women and clinicians from using it to treat hot flashes and night sweats. Some previous studies have suggested increased risk of coronary heart disease linked to hormone use, particularly in older women. At the same time, vasomotor symptoms remain among the most common and distressing consequences of menopause, and effective treatment options have been limited.

    In the study, "Menopausal Hormone Therapy and Cardiovascular Diseases in Women With Vasomotor Symptoms: A Secondary Analysis of the Women's Health Initiative Randomized Clinical Trials," published in JAMA Internal Medicine, researchers analyzed outcomes from two hormone therapy trials to assess the safety of treatment in women experiencing vasomotor symptoms.

    A total of 27,347 postmenopausal women aged 50 to 79 years were enrolled from 40 clinical centers across the United States. Among them, 10,739 had undergone hysterectomy and were randomized to receive conjugated equine estrogens (CEE) or placebo, while 16,608 with an intact uterus were randomized to CEE plus medroxyprogesterone acetate (MPA) or placebo.

    Women were followed for a median of 7.2 years in the CEE trial and 5.6 years in the CEE plus MPA trial. Vasomotor symptoms were assessed at baseline and year one, with nearly all women who reported moderate or severe symptoms recalling onset near menopause.

    CEE alone reduced vasomotor symptoms by 41% across all age groups. CEE plus MPA also reduced symptoms but with age-related variation, showing strong benefit in women aged 50 to 59 years and diminished or absent benefit in women 70 years and older.

    Analysis of cardiovascular outcomes found neutral effects of both CEE alone and CEE plus MPA in women aged 50 to 59 years with moderate or severe symptoms. No clear harm was observed in those aged 60 to 69 years, although risk estimates were elevated for CEE alone.

    Women aged 70 years and older had significantly higher rates of atherosclerotic cardiovascular disease, with hazard ratios of 1.95 for CEE alone and 3.22 for CEE plus MPA, corresponding to 217 and 382 excess events per 10,000 person-years, respectively.

    Investigators concluded that hormone therapy can be considered for treatment of vasomotor symptoms in younger women aged 50 to 59 years, while initiation in women aged 60 to 69 years requires caution. For women 70 years and older, the findings argue against the use of hormone therapy due to substantially increased cardiovascular risk.

    Part 1

  • Dr. Krishna Kumari Challa

     Jacques E. Rossouw et al, Menopausal Hormone Therapy and Cardiovascular Diseases in Women With Vasomotor Symptoms, JAMA Internal Medicine (2025). DOI: 10.1001/jamainternmed.2025.4510

    Deborah Grady et al, Hormone Therapy for Menopausal Vasomotor Symptoms—Better Understanding Cardiovascular Risk, JAMA Internal Medicine (2025). DOI: 10.1001/jamainternmed.2025.4521

    Part 2

  • Dr. Krishna Kumari Challa

    COVID-19 models suggest universal vaccination may avert over 100,000 hospitalizations

    The US Scenario Modeling Hub, a collaborative modeling effort of 17 academic research institutions, reports that a universal COVID-19 vaccination recommendation could avert thousands more hospitalizations and deaths than a high-risk-only strategy.

    Nine independent teams produced projections under six scenarios that combined two annual immune escape rates, 20% and 50%, with three vaccine recommendation strategies. An immune escape rate is the annual reduction in protection against infection that occurs as new SARS-CoV-2 variants evolve. Projections covered the United States population of 332 million, including an estimated 58 million people aged 65 years and older.

    Three vaccine recommendation strategies were tested: no recommendation; recommendation for high-risk groups only; and recommendation for all eligible individuals. Annually reformulated vaccines were assumed to match variants circulating on June 15, 2024, to be available on September 1, 2024, and to be 75% effective against hospitalization at the time of release.

    Teams calibrated their models to weekly hospitalizations and deaths reported by the National Healthcare Safety Network and the National Center for Health Statistics.

    In the worst-case scenario, defined by high immune escape with no vaccine recommendation, projections reached 931,000 hospitalizations (95% projection interval: 0.5 to 1.3 million) and 62,000 deaths (95% projection interval: 18,000 to 115,000).

    In the best-case scenario, defined by low immune escape with a universal recommendation, the ensemble projected 550,000 hospitalizations (95% projection interval: 296,000 to 832,000) and 42,000 deaths (95% projection interval: 13,000 to 72,000).

    Severe outcomes clustered in older populations. Across scenarios, adults aged 65 years and older accounted for 51% to 62% of hospitalizations and 84% to 87% of deaths.

    Vaccination of high-risk groups was projected to avert only 11% of hospitalizations under low immune escape and 8% under high escape, along with 13% and 10% of deaths, respectively. A universal recommendation increased the effect, with 15% fewer hospitalizations under low immune escape and 11% fewer under high escape, along with 16% and 13% fewer deaths.

    Under high immune escape, a high-risk-only strategy averted 76,000 hospitalizations with a 95% CI of 34,000 to 118,000 and 7,000 deaths with a 95% CI of 3,000 to 11,000. Expanding to a universal recommendation prevented 104,000 hospitalizations with a 95% CI of 55,000 to 153,000 and 9,000 deaths with a 95% CI of 4,000 to 14,000.
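
    As a quick back-of-envelope check, the percentage and absolute figures quoted for the high-immune-escape scenario are mutually consistent (central estimates only, ignoring the projection intervals):

```python
# Hospitalizations in the high-immune-escape scenario, from the text above.
no_recommendation_hosp = 931_000    # projected with no vaccine recommendation
high_risk_only_averted = 76_000     # averted by the high-risk-only strategy
universal_averted = 104_000         # averted by the universal recommendation

# Share of the no-recommendation burden each strategy averts:
print(f"high-risk-only: {high_risk_only_averted / no_recommendation_hosp:.0%} averted")
print(f"universal:      {universal_averted / no_recommendation_hosp:.0%} averted")
# Prints 8% and 11%, matching the proportions quoted for high immune escape.
```

    The same arithmetic applied to the death counts (7,000 and 9,000 against 62,000) recovers the quoted 10% and 13% figures within rounding.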

    Additional indirect benefits accrued to older adults under a universal strategy. Compared with high-risk-only vaccination, universal recommendations prevented about 11,000 more hospitalizations and 1,000 more deaths in those aged 65 years and older.

    Observed national patterns diverged in timing from projections. A marked summer 2024 wave was followed by a smaller peak in January 2025, while projections anticipated the heaviest burden from late December 2024 to mid-January 2025. Ensemble coverage for weekly deaths remained strong, with 95% intervals closely matching observed values.

    Authors conclude that vaccines remain a critical tool to limit COVID-19 burden in 2024–2025, with universal recommendations offering added direct and indirect protection and the potential to save thousands more lives, including among older adults.

    Part 1

  • Dr. Krishna Kumari Challa

    Sara L. Loo et al, Scenario Projections of COVID-19 Burden in the US, 2024-2025, JAMA Network Open (2025). DOI: 10.1001/jamanetworkopen.2025.32469

    Part 2

  • Dr. Krishna Kumari Challa

    Nanoparticles supercharge vinegar's old-fashioned wound healing power

    Wounds that do not heal are often caused by bacterial infections and are particularly dangerous for the elderly and people with diabetes, cancer and other conditions. Acetic acid (more commonly known as vinegar) has been used for centuries as a disinfectant, but it is only effective against a small number of bacteria, and it does not kill the most dangerous types.

    New research has resulted in the ability to boost the natural bacterial killing qualities of vinegar by adding antimicrobial nanoparticles made from carbon and cobalt. The findings have been published in the journal ACS Nano.

    The acidic environment from the vinegar made bacterial cells swell and take up the nanoparticle treatment.

    Once exposed, the nanoparticles appear to attack dangerous bacteria both from inside the bacterial cell and on its surface, causing them to burst. Importantly, this approach is nontoxic to human cells and was shown to clear bacterial infections from mouse wounds without affecting healing.

    Adam Truskewycz et al, Cobalt-Doped Carbon Quantum Dots Work Synergistically with Weak Acetic Acid to Eliminate Antimicrobial-Resistant Bacterial Infections, ACS Nano (2025). DOI: 10.1021/acsnano.5c03108

  • Dr. Krishna Kumari Challa

    Over 60,000 died in Europe alone from heat during 2024 summer: Study

    More than 60,000 people died from heat in Europe during last year's record-breaking summer, a benchmark study said this week, in the latest warning of the massive toll climate change is having on the continent.

    With Europe heating up twice as fast as the global average, the Spain-based researchers suggested an emergency alert system could help warn vulnerable people—particularly the elderly—ahead of dangerous heat waves.

    Europe experienced an exceptionally deadly summer in 2024 with more than 60,000 heat-related deaths, bringing the total burden over the past three summers to more than 181,000, said the study in the journal Nature Medicine.

    Tomáš Janoš et al, Heat-related mortality in Europe during 2024 and health emergency forecasting to reduce preventable deaths, Nature Medicine (2025). DOI: 10.1038/s41591-025-03954-7. www.nature.com/articles/s41591-025-03954-7

  • Dr. Krishna Kumari Challa

    Our actions are dictated by 'autopilot' not choice, finds study

    Habit, not conscious choice, drives most of our actions, according to new research.

    The research, published in Psychology & Health, found that two-thirds of our daily behaviors are initiated "on autopilot", out of habit.

    Habits are actions that we are automatically prompted to do when we encounter everyday settings, due to associations that we have learned between those settings and our usual responses to them.

    The research also found that 46% of behaviors were both triggered by habit and aligned with conscious intentions, suggesting that people form habits that support their personal goals, and often disrupt habits that conflict with them.

    The study found that 65% of daily behaviors were habitually initiated, meaning people were prompted to do them out of routine rather than making a conscious decision.

    For people who want to break their bad habits, simply telling them to 'try harder' isn't enough. To create lasting change, we must incorporate strategies to help people recognize and disrupt their unwanted habits, and ideally form positive new ones in their place.

    The researchers recommend that initiatives designed to help people adopt new behaviors, like exercising or eating healthier, should focus on building new, positive habits.

    Amanda L. Rebar et al, How habitual is everyday life? An ecological momentary assessment study, Psychology & Health (2025). DOI: 10.1080/08870446.2025.2561149

  • Dr. Krishna Kumari Challa

    The brain splits up vision without you even noticing

    The brain divides vision between its two hemispheres—what's on your left is processed by your right hemisphere and vice versa—but your experience with every bike or bird that you see zipping by is seamless. A new study by neuroscientists reveals how the brain handles the transition.

    It's surprising to some people to hear that there's some independence between the hemispheres, because that doesn't really correspond to how we perceive reality. In our consciousness, everything seems to be unified.

    There are advantages to separately processing vision on either side of the brain, including the ability to keep track of more things at once, researchers have found, but neuroscientists have been eager to fully understand how perception ultimately appears so unified in the end.

    Researchers measured neural activity in the brains of animals as they tracked objects crossing their field of view. The results reveal that different frequencies of brain waves encoded and then transferred information from one hemisphere to the other in advance of the crossing and then held on to the object representation in both hemispheres until after the crossing was complete.

    The process is analogous to how relay racers hand off a baton, how a child swings from one monkey bar to the next, and how cell phone towers hand off a call from one to the next as a train passenger travels through their area. In all cases, both towers or hands actively hold what's being transferred until the handoff is confirmed.

    Part 1

  • Dr. Krishna Kumari Challa

    To conduct the study, published in the Journal of Neuroscience, the researchers measured both the electrical spiking of individual neurons and the various frequencies of brain waves that emerge from the coordinated activity of many neurons. They studied the dorsal and ventrolateral prefrontal cortex in both hemispheres, brain areas associated with executive brain functions.

    The power fluctuations of the wave frequencies in each hemisphere told the researchers a clear story about how the subjects' brains transferred information from the "sending" to the "receiving" hemisphere whenever a target object crossed the middle of their field of view. In the experiments, the target was accompanied by a distractor object on the opposite side of the screen to confirm that the subjects were consciously tracking the target object's motion and not just indiscriminately glancing at whatever happened to pop up on the screen.

    The highest frequency "gamma" waves, which encode sensory information, peaked in both hemispheres when the subjects first looked at the screen and again when the two objects appeared. When a color change signaled which object was the target to track, the gamma increase was only evident in the "sending" hemisphere (on the opposite side from the target object), as expected.
    Meanwhile, the power of somewhat lower frequency "beta" waves, which regulate when gamma waves are active, varied inversely with the gamma waves. These sensory encoding dynamics were stronger in the ventrolateral locations compared to the dorsolateral ones.

    Meanwhile, two distinct bands of lower frequency waves showed greater power in the dorsolateral locations at key moments related to achieving the handoff. About a quarter of a second before a target object crossed the middle of the field of view, "alpha" waves ramped up in both hemispheres and then peaked just after the object crossed. Meanwhile, "theta" band waves peaked after the crossing was complete, only in the "receiving" hemisphere (opposite from the target's new position).

    Accompanying the pattern of wave peaks, neuron spiking data showed how the brain's representation of the target's location traveled. Using decoder software, which interprets what information the spikes represent, the researchers could see the target representation emerge in the sending hemisphere's ventrolateral location when it was first cued by the color change. Then they could see that as the target neared the middle of the field of view, the receiving hemisphere joined the sending hemisphere in representing the object, so that they both encoded the information during the transfer.

    Taken together, the results showed that after the sending hemisphere initially encoded the target with a ventrolateral interplay of beta and gamma waves, a dorsolateral ramp up of alpha waves caused the receiving hemisphere to anticipate the handoff by mirroring the sending hemisphere's encoding of the target information. Alpha peaked just after the target crossed the middle of the field of view, and when the handoff was complete, theta peaked in the receiving hemisphere as if to say, "I got it."
    Part 2

  • Dr. Krishna Kumari Challa

    And in trials where the target never crossed the middle of the field of view, these handoff dynamics were not apparent in the measurements.

    The study shows that the brain is not simply tracking objects in one hemisphere and then just picking them up anew when they enter the field of view of the other hemisphere.

    "These results suggest there are active mechanisms that transfer information between cerebral hemispheres," the authors wrote. "The brain seems to anticipate the transfer and acknowledge its completion."

    But they also note based on other studies that the system of interhemispheric coordination can sometimes appear to break down in certain neurological conditions including schizophrenia, autism, depression, dyslexia and multiple sclerosis. The new study may lend insight into the specific dynamics needed for it to succeed.

    Matthew B. Broschard et al, Evidence for an active handoff between hemispheres during target tracking, The Journal of Neuroscience (2025). DOI: 10.1523/jneurosci.0841-25.2025

    Part 3

  • Dr. Krishna Kumari Challa

    UK Biobank analysis finds higher dementia incidence with frailty

    Researchers report evidence that physical frailty is associated with dementia and that genetic background, brain structure, and immunometabolic function may mediate this link.

    Dementia causes loss of cognitive abilities and daily functioning, and reached an estimated 57 million cases worldwide in 2021 with projections indicating a rise to 153 million by 2050. Treatments remain limited in effectiveness, creating a need for early identification of risk factors.

    Previous studies have noted associations between frailty and elevated risk of incident dementia. Frailty is characterized by decreased physical function and a reduced ability to overcome stressors. Genetic susceptibility might influence the frailty-dementia relationship, and immunometabolic processes and brain structure are implicated, though mechanisms remain unclear.

    In the study, "Association of Frailty With Dementia and the Mediating Role of Brain Structure and Immunometabolic Signatures," published in Neurology, researchers conducted a prospective cohort investigation to elucidate the link between physical frailty and dementia, assess causality, and explore biologic mechanisms.

    UK Biobank contributed data from 489,573 participants without dementia at enrollment between 2006 and 2010, with a median follow-up of 13.58 years during which 8,900 dementia cases were documented. Brain MRI was available for a subset.

    Physical frailty was defined by five criteria: weight loss, exhaustion, physical inactivity, slow walking speed, and low grip strength. Components were summed to a 0–5 score and categorized as nonfrail, prefrail, or frail. Biomarker panels included peripheral blood cell counts, biochemical measures, and metabolomic markers with correction for multiple testing.
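    The five-item score described above can be sketched in a few lines. Note the cut-points used below (0 = nonfrail, 1–2 = prefrail, 3 or more = frail) follow the widely used Fried frailty phenotype; the paper's exact thresholds are an assumption here.

    ```python
    # Illustrative scoring of the five-item physical frailty phenotype.
    # Cut-points are the conventional Fried criteria, assumed for this sketch.

    CRITERIA = ["weight_loss", "exhaustion", "inactivity", "slow_walking", "low_grip"]

    def frailty_category(flags: dict) -> str:
        """Sum present criteria (0-5) and map the score to a category."""
        score = sum(bool(flags.get(c, False)) for c in CRITERIA)
        if score == 0:
            return "nonfrail"
        return "prefrail" if score <= 2 else "frail"

    print(frailty_category({"exhaustion": True, "slow_walking": True}))
    ```

    A participant meeting two criteria, as in the usage line, would fall into the prefrail group.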

    Risk climbed with frailty status, with prefrailty associated with higher hazard versus nonfrailty (HR 1.50, 95% CI 1.44–1.57) and frailty associated with a larger increase (HR 2.82, 95% CI 2.61–3.04). Joint analyses placed the greatest hazards where frailty met genetic susceptibility, including HR 3.87 (95% CI 3.30–4.55) for high polygenic risk with frailty and HR 8.45 (95% CI 7.51–9.51) for APOE-e4 gene carriers with frailty.

    Neuroimaging and biomarker findings linked frailty severity with image-derived brain measures and immunometabolic markers. Mendelian randomization supported a potential causal effect of physical frailty on dementia (OR 1.79, 95% CI 1.03–3.12), with reverse analysis reporting a null association (OR 1.00, 95% CI 0.98–1.01).

    Authors conclude that the findings support a causal association between physical frailty and dementia, suggesting frailty may serve as an early marker of vulnerability.

    Xiangying Suo et al, Association of Frailty With Dementia and the Mediating Role of Brain Structure and Immunometabolic Signatures, Neurology (2025). DOI: 10.1212/wnl.0000000000214199

  • Dr. Krishna Kumari Challa

    The hunted, not the hunters: AI reveals early humans were prey for leopards

    A new study may be about to rewrite a part of our early human history. It has long been thought that Homo habilis, often considered the first true human species, was the one to turn the tables on the predator–prey relationship. However, a recent analysis of previous archaeological finds suggests that they were possibly more hunted than hunters and not the dominant species we once believed them to be.

    To investigate this, researchers used artificial intelligence and computer vision to analyze tiny tooth marks on two H. habilis fossils. These ancient remains come from Olduvai Gorge in Tanzania and date back almost 2 million years.  

    The researchers trained the AI models on a library of 1,496 images of tooth marks made by modern carnivores, including leopards, lions, crocodiles, wolves and hyenas. Once the models were trained, the researchers presented them with photos of the fossil tooth marks.

    As the scientists detail in their paper, published in the Annals of the New York Academy of Sciences, the AI compared these marks to what it had learned and concluded, with more than 90% probability, that leopards made the tooth marks. A key reason for this was that the triangular shape of the tooth pits on the bones matched those in the leopard reference samples.
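    The classification step can be illustrated with a toy sketch. The study used computer-vision models trained on images; the version below instead uses two invented shape descriptors ("angularity" and "depth ratio") and hypothetical class centroids, purely to show how distance to reference samples can be turned into per-carnivore probabilities.

    ```python
    # Toy illustration only: classify a tooth pit by distance to hypothetical
    # per-carnivore feature centroids. Features and numbers are invented; the
    # actual study used deep computer-vision models on 1,496 reference images.
    import math

    CENTROIDS = {
        "leopard":   (0.90, 0.40),  # sharply triangular, shallow pits
        "lion":      (0.55, 0.60),
        "hyena":     (0.30, 0.85),  # rounded, deep crushing pits
        "crocodile": (0.70, 0.75),
    }

    def classify(features):
        """Softmax over negative distances -> pseudo-probabilities per class."""
        dists = {k: math.dist(features, v) for k, v in CENTROIDS.items()}
        weights = {k: math.exp(-8 * d) for k, d in dists.items()}
        total = sum(weights.values())
        return {k: w / total for k, w in weights.items()}

    probs = classify((0.88, 0.42))  # a sharply triangular, shallow pit
    best = max(probs, key=probs.get)
    print(best, round(probs[best], 2))
    ```

    With these made-up numbers, a triangular, shallow pit lands closest to the leopard centroid and receives most of the probability mass, mirroring how the real model flagged the triangular pit shape as the key leopard signature.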

    "The implications of this are major, since it shows that H. habilis was still more of a prey than a predator," wrote the researchers in their paper. "It also shows that the trophic position of some of the earliest representatives of the genus Homo was not different from those of other australopithecines."

    Although the research was limited to just two individuals, the scientists contend that if H. habilis had become a powerful species that could compete with carnivores, their bones would more likely have been scavenged by bone-crushing animals, such as hyenas, after they died from other causes.

    The fact that the bites were from a flesh-eating predator means the leopards were actively hunting them. This suggests that the transition to a dominant position in the food chain came later in human evolution, according to the research team.

    Marina Vegara‐Riquelme et al, Early humans and the balance of power: Homo habilis as prey, Annals of the New York Academy of Sciences (2025). DOI: 10.1111/nyas.15321

  • Dr. Krishna Kumari Challa

    Popular keto diet linked to glucose intolerance and fatty liver in mice

    Avocado toast with fried cheese as the bread and zucchini noodles in butter-bacon sauce are among the many recipe ideas fueling social media's beloved high-fat, low-carbohydrate ketogenic, or keto diet. However, scientists have found that while keto can lead to limited weight gain and even weight loss, it does so at the cost of metabolic issues like glucose intolerance.

    Scientists divided mice into four dietary groups: a ketogenic diet (KD, 90% fat), a high-fat diet (HFD, 60% fat), a low-fat diet (LFD), and a low-fat moderate-protein diet (LFMP), with varying levels of carbohydrates and protein.

    The mice were allowed to eat freely for up to 36 weeks in males and 44 weeks in females. Test results showed that while the KD supported weight control, it also raised blood cholesterol levels and led to fatty liver in males.

    The diet is named "ketogenic" because it sends the body into ketosis, a metabolic state in which the body burns fat as its primary fuel instead of the usual carbohydrates and, as a result, produces molecules called ketone bodies. The diet isn't a new food trend; it has been around for nearly 100 years and is well established for treating drug-resistant epilepsy in children, reducing seizures in many cases. The exact way a KD helps control seizures is still unclear, but several ideas have been proposed. Some studies have suggested that KD can stabilize blood glucose (BG), which is beneficial to brain metabolism and neurotransmitter activity, while others highlight the anticonvulsant effects of ketone bodies themselves.

    For this study, the researchers included both male and female mice and carried out regular check-ins to assess the long-term effects of KD on health parameters, including body composition, organ health, and blood profile.

    They found that KD protected against excessive weight gain compared with the conventional high-fat diet, although KD-fed mice still gained more weight than those on the low-fat diets. Long-term KD also caused severe hyperlipidemia, meaning there were very high levels of fat in the blood.

    KD-fed mice developed severe glucose intolerance because the diet caused a severe impairment in insulin secretion from the pancreas. While female mice on KD seemed fine, the male ones showed signs of liver dysfunction and fatty liver.

    The findings made it quite evident that long-term KD can trigger several metabolic disturbances, raising caution against its widespread use as a health-promoting diet.

     Molly R. Gallop et al, A long-term ketogenic diet causes hyperlipidemia, liver dysfunction, and glucose intolerance from impaired insulin secretion in mice, Science Advances (2025). DOI: 10.1126/sciadv.adx2752

  • Dr. Krishna Kumari Challa

    Scientists show how to grow more nutritious rice that uses less fertilizer

    The cultivation of rice—the staple grain for more than 3.5 billion people around the world—comes with extremely high environmental, climate and economic costs.

    This may be about to change, thanks to new research. 

    Scientists have shown that nanoscale applications of the element selenium can decrease the amount of fertilizer necessary for rice cultivation while sustaining yields, boosting nutrition, enhancing the soil's microbial diversity and cutting greenhouse gas emissions. 

    In a new paper published in the Proceedings of the National Academy of Sciences, they demonstrate for the first time that such nanoscale applications work in real-world conditions.

    They used an aerial drone to lightly spray rice growing in a paddy with a suspension of nanoscale selenium. That direct contact means the rice plant absorbs the selenium far more efficiently than if it were applied to the soil.

    Selenium stimulates the plant's photosynthesis, which increased by more than 40%. Increased photosynthesis means the plant absorbs more CO2, which it then turns into carbohydrates. Those carbohydrates flow down into the plant's roots, causing them to grow.

    Bigger, healthier roots release a host of organic compounds that cultivate beneficial microbes in the soil, and it's these microbes that then work symbiotically with the rice roots to pull more nitrogen and ammonium out of the soil and into the plant. This increased the plant's nitrogen use efficiency (NUE) from 30% to 48.3% and decreased the amount of nitrous oxide and ammonia released to the atmosphere by 18.8–45.6%.

    With more nutrients coming in, the rice itself produces a higher yield, with a more nutritious grain: levels of protein, certain critical amino acids, and selenium also jumped.

    On top of all of this, they found that their nano-selenium applications allowed farmers to reduce their nitrogen applications by 30%. Since rice cultivation accounts for 15–20% of global nitrogen use, this new technique holds real promise for helping to meet the triple threat of a growing population, climate change, and the rising economic and environmental costs of agriculture.
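    The global implication of those two figures is a simple product: a 30% cut in nitrogen applied to a crop that uses 15–20% of the world's nitrogen would save roughly 4.5–6% of global nitrogen use. A minimal sketch of that arithmetic:

    ```python
    # Back-of-envelope arithmetic implied by the paper's figures: rice's share
    # of global nitrogen use times the achievable reduction on rice.

    RICE_SHARE = (0.15, 0.20)  # fraction of global nitrogen use (study figure)
    N_CUT = 0.30               # reduction in nitrogen applied to rice

    savings = [share * N_CUT for share in RICE_SHARE]
    print(f"~{savings[0]:.1%} to {savings[1]:.1%} of global nitrogen use")
    ```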

    Wang, Zhenyu et al, Nanotechnology-driven coordination of shoot–root systems enhances rice nitrogen use efficiency, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2508456122

  • Dr. Krishna Kumari Challa

    The Ganges River is drying at an unprecedented rate, new study finds

    The Ganges River is in crisis. This lifeline for around 600 million people in India and neighboring countries is experiencing its worst drying period in 1,300 years. Using a combination of historical data, paleoclimate records and hydrological models, researchers discovered that human activity is the main cause. They also found that the current drying is more severe than any recorded drought in the river's history.

    In their study, published in the Proceedings of the National Academy of Sciences, researchers first reconstructed the river's flow for the last 1,300 years (700 to 2012 C.E.) by analyzing tree rings from the Monsoon Asia Drought Atlas (MADA) dataset. Then they used powerful computer programs to combine this tree-ring data with modern records to create a timeline of the river's flow. To ensure its accuracy, they double-checked it against documented historical droughts and famines.

    The scientists found that the recent drying of the Ganges River from 1991 to 2020 is 76% worse than the previous worst recorded drought, which occurred during the 16th century. Not only is the river drier overall, but droughts are now more frequent and last longer. The main reason, according to the researchers, is human activity. While some natural climate patterns are at play, the primary driver is the weakening of the summer monsoon.

    This weakening is linked to human-driven factors such as the warming of the Indian Ocean and air pollution from anthropogenic aerosols. These are liquid droplets and fine solid particles that come from factories, vehicles and power plants, among other sources and can suppress rainfall. The scientists also found that most climate models failed to spot the severe drying trend.

    How to avoid this?

    The researchers suggest two main courses of action. Given the mismatch between climate models and what they actually found, they are calling for better modeling to account for the regional impacts of human activity.

    And because the Ganges is a vital source of water for drinking, agricultural production, industrial use and wildlife, the team also recommends implementing new adaptive water management strategies to mitigate potential water scarcity.

    Dipesh Singh Chuphal et al, Recent drying of the Ganga River is unprecedented in the last 1,300 years, Proceedings of the National Academy of Sciences (2025). DOI: 10.1073/pnas.2424613122

  • Dr. Krishna Kumari Challa

    Study finds microplastics in all beverages tested, raising exposure estimates

    Microplastics have found their way deep inside our bones, brains, and even babies. A UK study found that 100% of all 155 hot and cold beverage samples tested contained synthetic plastic particles.

    Microplastics (MPs) are tiny pieces of plastic ranging in size from 1 μm to 5 mm, and they are widespread across aquatic, terrestrial, and even airborne environments. They have become a growing health concern for living beings across species due to their ability to accumulate and carry toxic chemicals through food webs.

    Humans come into contact with MPs every day through food, water, consumer goods, and even air.

    The researchers tested different products from popular UK brands, including coffee, tea, juices, energy drinks, soft drinks, and even tap and bottled water, and not a single beverage was free of MPs. Surprisingly, the more expensive tea bag brand showed a higher concentration of MPs than the cheaper ones.

    Traces of plastics, including polypropylene, polystyrene, polyethylene terephthalate, and polyethylene—commonly used for food packaging and disposable containers—were found in the fluids. The daily average exposure through beverages was found to be 1.65 MPs/kg body weight per day.
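    Because the exposure figure is reported per kilogram of body weight, it scales with the drinker. A quick sketch converts it to particles per person per day; the body weights chosen below are illustrative assumptions, not from the study.

    ```python
    # Convert the study's beverage exposure estimate (1.65 MPs per kg body
    # weight per day) into particles per person per day. Body weights are
    # illustrative assumptions.

    EXPOSURE_PER_KG = 1.65  # MPs / kg body weight / day (study estimate)

    def daily_particles(body_weight_kg: float) -> float:
        return EXPOSURE_PER_KG * body_weight_kg

    for label, kg in [("child, 20 kg", 20), ("adult, 70 kg", 70)]:
        print(f"{label}: ~{daily_particles(kg):.0f} particles/day from beverages")
    ```

    For a 70 kg adult this works out to on the order of a hundred particles per day from beverages alone.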

    The findings are published in Science of the Total Environment.

    This can be true for all countries in the world.

     Muneera Al-Mansoori et al, Synthetic microplastics in hot and cold beverages from the UK market: Comprehensive assessment of human exposure via total beverage intake, Science of The Total Environment (2025). DOI: 10.1016/j.scitotenv.2025.180188

  • Dr. Krishna Kumari Challa

    Molecular discovery reveals how chromosomes are passed from one generation to the next

    When a woman becomes pregnant, the outcome of that pregnancy depends on many things—including a crucial event that happened while she was still growing inside her own mother's womb. It depends on the quality of the egg cells that were already forming inside her fetal ovaries. The DNA-containing chromosomes in those cells must be cut, spliced and sorted perfectly. In males, the same process produces sperm in the testes but occurs only after puberty.

    If that goes wrong, then you end up with the wrong number of chromosomes in the eggs or sperm. This can result in infertility, miscarriage or the birth of children with genetic diseases.

    In a paper published Sept. 24 in the journal Nature, researchers report a major new discovery about a process that helps safeguard against these mistakes. They have pieced together the choreography of proteins that connect matching chromosome pairs, ensuring that they are sorted correctly as egg and sperm cells develop and divide.

    These discoveries required methods to watch the molecular events of chromosome recombination unfold with unprecedented detail. This involved genetic engineering in budding yeast—a model organism that has been used for decades to discover how fundamental cellular processes work.

    Humans have 46 chromosomes in each of our cells, made up of 23 pairs of matching, "homologous" chromosomes, with one of each pair inherited from each parent. Early in the process of making sperm or eggs, those chromosome pairs line up, and the parental chromosomes break and rejoin to each other. These chromosome exchanges, called "crossovers," serve two important functions.

    First, they help ensure that each chromosome that is passed on to the offspring contains a unique mixture of genes from both parents. Crossovers also keep the chromosomes connected in matching pairs. These connections guide the distribution of chromosomes when cells divide to produce eggs and sperm. Maintaining crossover connections is especially crucial in females.

    As chromosomes pair up in developing egg or sperm cells, matching DNA strands are exchanged and twined together over a short distance to form a structure called a "double Holliday junction." DNA strands of this structure are then cut to join the chromosomes forming a crossover.

    In males, developing immature sperm cells then immediately divide and distribute chromosomes to the sperm. In contrast, egg cells developing in the fetal ovary arrest their development after crossovers have formed. The immature egg cells can remain in suspended animation for decades after birth, until they are activated to undergo ovulation.

    Part 1

  • Dr. Krishna Kumari Challa

    Only then does the process lurch back into motion: The egg cell finally divides, and the chromosome pairs that were connected by crossovers are finally separated to deliver a single set of chromosomes to the mature egg. Maintaining the crossover connections over many years is a major challenge for immature egg cells.

    If chromosome pairs aren't connected by at least one crossover, they can lose contact with each other, like two people separated in a jostling crowd. This causes them to segregate incorrectly when the cell finally divides, producing egg cells with extra or missing chromosomes. This can cause infertility, miscarriage or genetic conditions such as Down syndrome, in which a child is born with an extra copy of chromosome 21, leading to cognitive impairment, heart defects, hearing loss and other problems.

    Researchers have identified dozens of proteins that bind and process these junctions. They used a technique called "real-time genetics" to investigate the function of those proteins.

    With this method, they made cells degrade one or more specific proteins within the junction-associated structures. They could then analyze the DNA from these cells, to see whether the junctions were resolved and if they formed crossovers. In this way, they built up a picture in which a network of proteins function together to ensure that crossovers are formed.

    They identified key proteins such as cohesin that prevent an enzyme called the STR complex (or Bloom complex in humans) from inappropriately dismantling the junctions before they can form crossovers.

    They protect the double Holliday junction. That is a key discovery.
    Failure to protect double-Holliday junctions may be linked to fertility problems in humans.

    Shangming Tang et al, Protecting double Holliday junctions ensures crossing over during meiosis, Nature (2025). DOI: 10.1038/s41586-025-09555-1

    Part 2