Science Simplified!

                       JAI VIGNAN

All about Science - to remove misconceptions and encourage scientific temper

Communicating science to the common people

'To make them see the world differently through the beautiful lens of science'

  • Dr. Krishna Kumari Challa

    This Super-Black Material Made of Wood Can Absorb 99.3% of Light

    A new super-black wood-based material has been created that absorbs more than 99 percent of the light that hits it. 

    Researchers were working on water-repelling technologies for wood, using high-energy plasma gas – and noticed that the application of the gas turned the ends of wood cells completely black. Further examination revealed the incredible light absorption properties of the new super-black material.

    The researchers  named their new material Nxylon.

    Nxylon's composition combines the benefits of natural materials with unique structural features, making it lightweight, stiff and easy to cut into intricate shapes.

    The material has a velvety appearance, its inventors report. The plasma treatment actually changes the tiny structures on the surface of the wood, introducing indentations that help to capture light and minimize any reflections.

    Indeed, when a gold alloy was applied to the material, it remained black. This shows that the structure of the wood has changed – this isn't just an extra coating, but a reconfiguration of the fundamentals of the material.

    Super-black materials are valuable in industries like astronomy, solar energy, and optics, where they help devices function more accurately or efficiently by reducing unwanted light reflection. For example, super-black materials can reduce glare and improve clarity on telescopes.

    https://onlinelibrary.wiley.com/doi/10.1002/adsu.202400184

  • Dr. Krishna Kumari Challa

    'Cool paint' for cars to keep drivers cooler

    Car companies are testing "cool paint" to keep people inside vehicles cooler, although the coating is about six times thicker than conventional automotive paint, which makes commercialization a challenge.

    The vehicles with the special paint looked like ordinary cars, but felt much cooler to the touch.

    The cool paint lowered the cars' roof-panel temperature by 12 degrees Celsius (22 degrees Fahrenheit) and the interiors by 5 C (9 F).
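
    As a quick arithmetic check on the reported figures (a minimal Python sketch, not from the study): temperature *differences* convert between Celsius and Fahrenheit with a factor of 9/5 only, with no +32 offset, because the offset applies to absolute temperatures, not to differences.

```python
# Temperature differences convert with a factor of 9/5 only
# (the +32 offset applies to absolute temperatures, not differences).
def delta_c_to_f(delta_c):
    return delta_c * 9 / 5

roof_drop_f = delta_c_to_f(12)   # roof-panel drop, reported as 22 F
cabin_drop_f = delta_c_to_f(5)   # interior drop, reported as 9 F
print(round(roof_drop_f), round(cabin_drop_f))  # 22 9
```

    The result (21.6 ≈ 22 °F and 9 °F) matches the figures quoted above.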

    Cooling materials are already widely used in buildings and other items. Cooler cars can reduce use of air-conditioning and relieve the toll from heat on engines and electric vehicle batteries.

    Carmakers have also been experimenting with paint that delivers lower cabin temperatures, mostly focusing on colours that reflect the sun's rays.

    The cool paint reflects sunlight better and also emits electromagnetic waves that counteract the sun's rays, redirecting their energy away from the vehicle.

    Source: Various news agencies

  • Dr. Krishna Kumari Challa

     How our brain decides what to do

     What exactly happens in our brain when we make a decision has long been a mystery. Researchers have now deciphered which brain chemical and which nerve cells mediate such decisions: the messenger substance orexin and the neurons that produce it.

    These neuroscientific fundamentals matter because many people fail to make healthy choices, such as getting enough exercise, simply because deciding is not easy. Most of us have probably decided once or even several times to skip exercising in favor of one of the numerous alternative temptations of daily life. According to the World Health Organization, 80% of adolescents and 27% of adults don't get enough exercise. And obesity is increasing at an alarming rate not only among adults but also among children and adolescents.

    Despite these statistics, many people manage to resist the constantly present temptations and get enough exercise.

    In their experiments with mice, the researchers were able to show that orexin plays a key role in taking decisions. It's one of over a hundred messenger substances that are active in the brain. Other chemical messengers, such as serotonin and dopamine, were discovered a long time ago and their role has largely been decoded. The situation for orexin is different: Researchers discovered it relatively late, around 25 years ago, and they are now clarifying its functions step by step.

    In neuroscience, dopamine has long been a popular explanation for why we choose to do some things but avoid others. This brain messenger is critical for our general motivation. However, our current knowledge about dopamine does not easily explain why we decide to exercise instead of eating. Our brain releases dopamine both when we eat and when we exercise, so dopamine alone cannot explain why we choose one over the other.

    To find out what does explain this, the researchers devised a sophisticated behavioral experiment for mice, which were able to choose freely from among eight different options in ten-minute trials. These included a wheel they could run on and a "milkshake bar" where they could enjoy a standard strawberry-flavored milkshake.

    "Mice like a milkshake for the same reason people do: It contains lots of sugar and fat and tastes good."

    In their experiment, the scientists compared different groups of mice: one made up of normal mice and one in which the mice's orexin systems were blocked, either with a drug or through genetic modification of their cells.

    The mice with an intact orexin system spent twice as much time on the running wheel and half as much time at the milkshake bar as the mice whose orexin system had been blocked. Interestingly, however, the behavior of the two groups didn't differ in experiments in which the scientists only offered the mice either the running wheel or the milkshake.

    This means that the primary role of the orexin system is not to control how much the mice move or how much they eat. Rather, it seems central to making the decision between one and the other, when both options are available. Without orexin, the decision was strongly in favor of the milkshake, and the mice gave up exercising in favour of eating.

    Researchers expect that orexin may also be responsible for this decision in humans; the brain functions involved here are known to be practically the same in both species.

    The researchers are now trying to verify this  in humans too.

    Part 1

  • Dr. Krishna Kumari Challa

    This could involve examining patients who have a restricted orexin system for genetic reasons—this is the case in around one in two thousand people. These people suffer from narcolepsy (a sleeping disorder). Another possibility would be to observe people who receive a drug that blocks orexin. Such drugs are authorized for patients with insomnia.

    If we understand how the brain arbitrates between food consumption and physical activity, we can develop more effective strategies for addressing the global obesity epidemic and related metabolic disorders.
    Interventions could be developed to help overcome exercise barriers in healthy individuals and those whose physical activity is limited.

    Orexin neurons mediate temptation-resistant voluntary exercise, Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01696-2

  • Dr. Krishna Kumari Challa

    Looking inside a microchip with 4 nanometer precision: New X-ray world record

    Researchers  have used X-rays to look inside a microchip with higher precision than ever before. The image resolution of 4 nanometers marks a new world record. The high-resolution three-dimensional images of the type they produced will enable advances in both information technology and the life sciences.

    The researchers  reported their findings in the current issue of the journal Nature.

    Since 2010, the scientists have been developing microscopy methods with the goal of producing three-dimensional images in the nanometer range. In their current research, they have succeeded for the first time in imaging state-of-the-art microchips with a resolution of 4 nanometers—a world record.

    Instead of using lenses, with which images in this range are not currently possible, the scientists resort to a technique known as ptychography, in which a computer combines many individual images to create a single, high-resolution picture. Shorter exposure times and an optimized algorithm were key to significantly improving upon the world record they themselves set in 2017. For their experiments, the researchers used X-rays from the Swiss Light Source SLS at PSI.
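
    The combine-many-overlapping-exposures idea at the heart of ptychography can be illustrated with a toy sketch. This is a deliberate simplification: real ptychography records diffraction intensities and iteratively recovers both amplitude and phase, whereas the snippet below only shows how overlapping scan windows are merged into one larger picture. All names and numbers are invented for illustration.

```python
import numpy as np

# Toy illustration (NOT real ptychography, which also recovers phase):
# rebuild a 1-D "object" from many overlapping windowed exposures by
# accumulating each patch into a shared canvas and dividing by how
# many patches covered each point.
rng = np.random.default_rng(0)
obj = rng.random(100)            # the unknown object being scanned
step, width = 4, 16              # scan step < window width -> overlap

canvas = np.zeros_like(obj)
counts = np.zeros_like(obj)
for start in range(0, len(obj) - width + 1, step):
    patch = obj[start:start + width]        # one "exposure"
    canvas[start:start + width] += patch    # accumulate overlapping views
    counts[start:start + width] += 1        # coverage bookkeeping

recon = canvas / counts          # normalize by coverage
print(np.allclose(recon, obj))   # True
```

    The redundancy from overlap is what makes the real technique robust: each point of the object is constrained by several exposures, which is also what lets the actual algorithm converge on the phase information a single exposure cannot provide.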

    Tomas Aidukas et al, High-performance 4-nm-resolution X-ray tomography using burst ptychography, Nature (2024). DOI: 10.1038/s41586-024-07615-6

  • Dr. Krishna Kumari Challa

    What Your Nails Say About Your Health

    https://youtu.be/PRftXdvENRw?si=VgUs4ne4n_eT2KL_

  • Dr. Krishna Kumari Challa

    Does domestication cause smaller brain size in dogs than in the wolf? Study challenges notion

    A recent study, published in Biology Letters, challenges the long-held notion that domestication is the primary driver of reduced brain size in domesticated animals, specifically dogs.

    Employing a phylogenetic comparative approach, researchers  show that the domesticated dog does not exhibit an exceptionally small brain relative to its body size compared to other canid species, suggesting that  domestication is not as unique an evolutionary force as previously thought.

    The prevailing belief has been that domestication leads to a significant reduction in brain size due to relaxed selection pressures, such as reduced need for foraging, mating competition, and predator avoidance.

    This phenomenon is thought to be a result of the decreased necessity for metabolically costly brain tissue in a domesticated environment. While domesticated dogs show a substantial decrease in brain size compared to their wild ancestor, the gray wolf (Canis lupus), this study aimed to determine if this reduction is exceptional when viewed in a broader phylogenetic context.

    Researchers analyzed brain and body size data for 25 canid species, including ancient dog breeds that are genetically closer to the ancestral domesticated dog.

    Their phylogenetic predictions and allometric regressions showed that the reduction in brain size in domesticated dogs is not an unambiguous evolutionary singularity. The observed brain size in dogs fell within the expected range for most ancient breeds used in the study, suggesting that domestication is not uniquely influential in reducing brain size among canids.

    Interestingly, the study found that the common raccoon dog (Nyctereutes procyonoides), which hibernates, is a more pronounced outlier in terms of brain size reduction. Hibernation, associated with prolonged periods of low metabolic activity and food scarcity, is hypothesized to constrain brain size evolution due to the high energy demands of large brains.

    Part 1

  • Dr. Krishna Kumari Challa

    The raccoon dog's significantly smaller brain size supports this hypothesis, highlighting that factors other than domestication, such as ecological adaptations like hibernation, can also drive reductions in brain size.

    The study concludes that while domestication does contribute to brain size reduction in dogs, it should not be overemphasized as a uniquely powerful evolutionary force.

    The findings suggest that other ecological and evolutionary pressures can similarly affect brain size and can mediate extreme variations in non-domesticated species as well. A more balanced and less human-focused perspective could refine our understanding of the complex interplay between domestication and brain size evolution in mammals.

     László Zsolt Garamszegi et al, The reduction in relative brain size in the domesticated dog is not an evolutionary singularity among the canids, Biology Letters (2024). DOI: 10.1098/rsbl.2024.0336 , royalsocietypublishing.org/doi … .1098/rsbl.2024.0336

    Part 2

  • Dr. Krishna Kumari Challa

    Modern aircraft emit less carbon than older aircraft, but their contrails may do more environmental harm

    Modern commercial aircraft flying at high altitudes create longer-lived planet-warming contrails than older aircraft, a new study has found.

    The result means that although modern planes emit less carbon than older aircraft, they may be contributing more to climate change through contrails.

    The study highlights the immense challenges the aviation industry faces to reduce its impact on the climate. The new study also found that private jets produce more contrails than previously thought, potentially leading to outsized impacts on climate warming.

    Contrails, or condensation trails, are thin streaks of cloud created by aircraft exhaust fumes that contribute to global warming by trapping heat in the atmosphere.

    While the exact warming effect of contrails is uncertain, scientists think it is greater than warming caused by carbon emissions from jet fuel.
    Published in Environmental Research Letters, the study used machine learning to analyze satellite data on more than 64,000 contrails from a range of aircraft flying over the North Atlantic Ocean.

    Modern aircraft that fly above 38,000 feet (about 12 km), such as the Airbus A350 and Boeing 787 airliners, create more contrails than older passenger-carrying commercial aircraft, the study found.

    To reduce jet fuel consumption, modern aircraft are designed to fly at higher altitudes where the air is thinner with less aerodynamic drag, compared to older commercial aircraft, which usually fly at slightly lower altitudes (around 35,000ft/11km).

    This means these higher-flying aircraft create less carbon emissions per passenger. However, it also means they create contrails that take longer to dissipate—creating a warming effect for longer and a complicated trade-off for the aviation industry.
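
    The altitude figures quoted above can be checked with a one-line unit conversion (a sketch for illustration; 1 ft = 0.3048 m by exact definition):

```python
# Check the quoted altitude conversions (1 ft = 0.3048 m, exact by definition).
def feet_to_km(feet):
    return feet * 0.3048 / 1000

print(round(feet_to_km(38000), 1))  # 11.6 -> "about 12 km"
print(round(feet_to_km(35000), 1))  # 10.7 -> "around 11 km"
print(round(feet_to_km(40000), 1))  # 12.2 -> private-jet altitudes
```

    Both rounded values agree with the approximations in the text.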

    Part 1

  • Dr. Krishna Kumari Challa

    It's common knowledge that flying is not good for the climate. However, most people do not appreciate that contrails and jet fuel carbon emissions cause a double-whammy warming of the climate.
    This study throws a spanner in the works for the aviation industry. Newer aircraft are flying higher and higher in the atmosphere to increase fuel efficiency and reduce carbon emissions. The unintended consequence of this is that these aircraft flying over the North Atlantic are now creating more, longer-lived, contrails, trapping additional heat in the atmosphere and increasing the climate impact of aviation.
    This finding reflects the challenges the aviation industry faces when reducing its climate impact.
    The study did confirm a simple step that can be taken to shorten the lifetime of contrails: Reduce the amount of soot emitted from aircraft engines, produced when fuel burns inefficiently.

    Modern aircraft engines are designed to be cleaner and typically emit fewer soot particles, which cuts down the lifetime of contrails.

    While other studies using models have predicted this phenomenon, the study published today is the first to confirm it using real-world observations.
    Even higher in the sky, the researchers found that private jets create contrails more often than previously thought—adding to concerns about the excessive use of these aircraft by the super-rich.

    Despite being smaller and using less fuel, private jets create similar contrails to much larger commercial aircraft, the analysis found.
    Private jets fly higher than other planes, at more than 40,000 feet above Earth, where there is less air traffic. However, just as modern commercial aircraft create more contrails than lower-flying older commercial aircraft, the high altitudes flown by private jets mean they create outsized contrails.

     Operational differences lead to longer lifetimes of satellite detectable contrails from more fuel efficient aircraft, Environmental Research Letters (2024). DOI: 10.1088/1748-9326/ad5b78

    Part 2

  • Dr. Krishna Kumari Challa

    Why your best friends' genes matter

    "Choose your friends wisely. No, choose them scientifically!" Because ....

    A study by researchers shows that your best friend's traits can rub off on you—especially ones that are in their genes.

    The genetic makeup of adolescent peers may have long-term consequences for individual risk of drug and alcohol use disorders, depression and anxiety, the groundbreaking study has found.

    Peers' genetic predispositions for psychiatric and substance-use disorders are associated with an individual's own risk of developing the same disorders in young adulthood.

    The data of this study exemplifies the long reach of social genetic effects.

    Socio-genomics—the influence of one person's genotype on the observable traits of another—is an emerging field of genomics. Research suggests that peers' genetic makeup may influence health outcomes of their friends.

    In the studies conducted, even when controlling for factors such as the target individuals' own genetic predispositions and family socioeconomic factors, the researchers found a clear association between peers' genetic predispositions and target individuals' likelihood of developing a substance use or psychiatric disorder. The effects were stronger among school-based peers than geographically defined peers.

    Within school groups, the strongest effects were among upper secondary school classmates, particularly those in the same vocational or college-preparatory track between ages 16 and 19. Social genetic effects for school-based peers were greater for drug and alcohol use disorders than major depression and anxiety disorder.

    More research is needed to understand why these connections exist. 

    The most obvious explanation for why peers' genetic predispositions might be associated with our own well-being is the idea that our peers' genetic predispositions influence their phenotype, or the likelihood that peers are also affected by the disorder.

    This research also underscores the importance of disrupting processes and risks that extend for at least a decade after attendance in school. Peer genetic influences have a very long reach.

    Peer Social Genetic Effects and the Etiology of Substance Use Disorders, Major Depression, and Anxiety Disorder in a Swedish National Sample, American Journal of Psychiatry (2024). DOI: 10.1176/appi.ajp.20230358

  • Dr. Krishna Kumari Challa

    Researchers show that pesticide contamination is more than apple-skin deep

    Pesticides and herbicides are critical to ensuring food security worldwide, but these substances can present a safety risk to people who unwittingly ingest them. Protecting human health, therefore, demands sensitive analytical methods to identify even trace levels of potentially harmful substances. Now, researchers reporting in Nano Letters have developed a high-tech imaging method to detect pesticide contamination at low levels, and its application on fruits reveals that current food safety practices may be insufficient.

    The analytical method called surface-enhanced Raman spectroscopy (SERS) is gaining popularity as a nondestructive method for detecting chemicals from modern farming on produce. With SERS, metal nanoparticles or nanosheets are used to amplify the signals created by molecules when they are exposed to a Raman laser beam. The patterns created by the metal-enhanced scattered light serve as molecular signatures and can be used to identify small amounts of specific compounds.

    In tests of the silver-embedded membrane for food safety applications, the researchers sprayed the pesticides thiram and carbendazim, alone or together, onto apples, air-dried the fruits and then washed them to mimic everyday practices. When they laid their membrane over the apples, SERS detected pesticides on the apples, even though the chemicals were present at low concentrations. The team was also able to clearly resolve scattered-light signatures for each pesticide on apples sprayed with both thiram and carbendazim, as well as detect pesticide contamination through the fruit's peel and into the outermost layer of pulp.

    But the pigmented skin of the apple is very important. A raw apple with skin contains up to 332% more vitamin K, 142% more vitamin A, 115% more vitamin C, 20% more calcium, and up to 19% more potassium than a peeled apple does.

    A whole apple with its peel contains about 8.4 mg of vitamin C and 98 IU of vitamin A; peel it, and those amounts drop to about 6.4 mg of vitamin C and 61 IU of vitamin A.

    You can't remove the peel without significantly affecting the nutrient quality of the fruit.
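
    Turning the quoted vitamin figures into loss percentages makes the cost of peeling concrete (a small arithmetic sketch using only the numbers given above):

```python
# Fraction of each vitamin lost by peeling, from the figures quoted above.
with_peel = {"vitamin C (mg)": 8.4, "vitamin A (IU)": 98}
peeled    = {"vitamin C (mg)": 6.4, "vitamin A (IU)": 61}

for nutrient in with_peel:
    lost = 1 - peeled[nutrient] / with_peel[nutrient]
    print(f"{nutrient}: ~{lost:.0%} lost")  # ~24% and ~38%
```

    So peeling discards roughly a quarter of the vitamin C and over a third of the vitamin A in the fruit.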

    However, these results suggest that washing alone could be insufficient to prevent pesticide ingestion and that peeling would be required to remove potential contamination in the skin and outer pulp, the researchers say. Beyond apples, they also used the SERS membrane system to detect pesticides on cucumbers, shrimp, chili powder and rice.

    Cellulose Surface Nanoengineering for Visualizing Food Safety, Nano Letters (2024). DOI: 10.1021/acs.nanolett.4c01513 , pubs.acs.org/doi/abs/10.1021/acs.nanolett.4c01513

  • Dr. Krishna Kumari Challa

    Bacterial gut diversity improves the athletic performance of racehorses

    The composition of gut bacteria of thoroughbred racehorses at one month old can predict their future athletic performance, according to a new study. 

    In the study, foals with lower bacterial diversity at 28 days old also had a significantly increased risk of respiratory disease later in life.

    This study is published in the journal Scientific Reports.

    Researchers from Surrey's School of Veterinary Medicine and School of Bioscience investigated the composition of gut bacteria in thoroughbred foals bred for flat racing and its impact on their long-term health and athletic performance.

    To investigate this, 438 fecal samples from 52 foals were analyzed, and respiratory, gastrointestinal, orthopedic and soft-tissue health issues were tracked from birth to age three. In addition, the team analyzed information regarding finishing position, official rating, and total prize money earnings as measures of athletic performance.

    Minimizing the risk of disease and injury is important for the welfare of racehorses, and maximizing their athletic potential is important for their owners. Researchers have now found that gut health, in particular the health of gut bacterial communities very early in life, exerts a profound and enduring impact on racehorse health and performance.

    Researchers found that the athletic performance of the foals was positively associated with higher fecal bacterial diversity at one month old. They identified that a higher abundance of the bacteria Anaeroplasmataceae was associated with a higher official rating (an evaluation of a horse based on its past performances), and increased levels of Bacillaceae at 28 days old were linked to higher race placings.

    The team also investigated the long-term impact of foals receiving antibiotics during the first month of life. It was found that these foals had significantly lower fecal bacterial diversity at 28 days old than other foals who did not receive such treatments. Further analysis revealed that these foals won significantly lower prize money (an indicator of athletic performance) in their subsequent racing careers. In addition, foals who received antibiotics during their first 28 days of life had a significantly increased rate of developing a respiratory disease compared to their counterparts.

    Interestingly, researchers also identified that low gut bacterial diversity in early life is associated with an increased risk of soft-tissue and orthopedic issues developing later in life. Researchers believe that the health impacts of low gut bacterial diversity in early life are likely to be related to immunological priming.

    Work is currently underway to develop novel probiotics that will enhance the gut health of foals in early life and to investigate how antibiotics can be used while preserving gut health.

     Early-life gut bacterial community structure predicts disease risk and athletic performance in horses bred for racing, Scientific Reports (2024). DOI: 10.1038/s41598-024-64657-6

  • Dr. Krishna Kumari Challa

    DNA fragments help detect kidney organ rejection

    Findings from a study published in Nature Medicine show that donor-derived cell-free DNA (dd-cfDNA), also called liquid biopsy, has the potential for early detection of kidney transplant rejection.

    The international study enrolled a diverse population of nearly 3,000 kidney transplant recipients—both adult and pediatric—from 14 transplantation centers in Europe and the U.S. The Department of Pediatrics at Washington University School of Medicine in St. Louis contributed one of the two pediatric datasets involved in the study.
    When cells undergo apoptosis or necrosis, they release small fragments of DNA, known as cell-free DNA (cf-DNA), into the bloodstream. In inflammation associated with transplant rejection, dying cells release donor-derived cell-free DNA (dd-cfDNA). Researchers found those dd-cfDNA levels were strongly correlated with different types of transplant rejection, including antibody-mediated rejection, T cell-mediated rejection and mixed rejection. The study found similar accuracy in children and adults.

    Graft biopsies, which are invasive and often unnecessary, are currently considered the "gold standard" for diagnosing transplant rejection, but dd-cfDNA could provide a non-invasive, accurate biomarker that reduces the need for biopsy.

    While biopsies will continue as the method of rejection diagnosis, dd-cfDNA may improve early rejection diagnosis and enhance the care of kidney transplant recipients.

    Olivier Aubert et al, Cell-free DNA for the detection of kidney allograft rejection, Nature Medicine (2024). DOI: 10.1038/s41591-024-03087-3

  • Dr. Krishna Kumari Challa

    Processing traumatic memories during sleep leads to changes in the brain associated with improvement in PTSD symptoms

    Currently, the first-choice treatment for PTSD is exposure-based psychotherapy, where therapists help rewire the emotions associated with the traumatic memory in the patient's brain, shifting from fear and arousal to a more neutral response.

    PTSD is a mental health disorder that can occur after experiencing or witnessing a traumatic event. People with PTSD may experience flashbacks, nightmares, heightened vigilance, hyper-arousal, and mood and sleep problems. Currently available treatments for PTSD include eye movement desensitization and reprocessing (EMDR), where therapists guide patients through their traumatic memories while using a moving light or clicking sounds to distract them.

    However, up to 50% of patients fail to respond well to this treatment.

    EMDR has shown positive results, but its success rate is limited, and dropping out of the treatment program is common because revisiting traumatic memories is emotionally demanding.

    In a study published on August 7 in Current Biology, scientists show for the first time that reactivating therapeutically-altered memories during sleep leads to more brain activity related to memory processing, which is associated with a reduction in PTSD symptoms.

    Sleep provides a unique opportunity to enhance the memory of newly formed emotional reactions to traumatic events. During sleep, the brain focuses on consolidating memories and storing information for the long term.

    Previous research has shown that if someone forms a new memory in the presence of an experimentally administered sound or scent, exposing them to the sound or scent while they sleep can improve their ability to recall that memory after waking up. This memory-enhancement technique is called targeted memory reactivation (TMR).

    TMR has no negative effects on these patients. None of the patients reported more nightmares or worsened sleep after TMR.

    Many psychiatric disorders, such as phobias, anxiety disorders, and addiction, are also related to maladaptive memories. This new work can inspire future research to explore the beneficial effects of TMR in treating other conditions.

    Targeted memory reactivation to augment treatment in post-traumatic stress disorder, Current Biology (2024). DOI: 10.1016/j.cub.2024.07.019 , www.cell.com/current-biology/f … 0960-9822(24)00922-9

  • Dr. Krishna Kumari Challa

    Nanomaterials may enhance plant tolerance to high soil salt levels

    Soil salt concentrations above the optimal threshold for plant growth can threaten global food security by compromising agricultural productivity and crop quality. An analysis published in Physiologia Plantarum has examined the potential of nanomaterials—which have emerged over the past decade as a promising tool to mitigate such "salinity stress"—to address this challenge.

    Nanomaterials, which are tiny natural or synthetic materials, can modulate a plant's response to salinity stress through various mechanisms, for example by affecting the expression of genes related to salt tolerance or by enhancing physiological processes such as antioxidant activities.
    When investigators assessed 495 experiments from 70 publications related to how different nanomaterials interact with plants under salinity stress, they found that nanomaterials enhance plant performance and mitigate salinity stress when applied at lower dosages. At higher doses, however, nanomaterials are toxic to plants and may even worsen salinity stress.

    Also, plant responses to nanomaterials vary across plant species, plant families, and nanomaterial types.

    Meta-analysis of nanomaterials and plants interaction under salinity stress, Physiologia Plantarum (2024). DOI: 10.1111/ppl.14445

  • Dr. Krishna Kumari Challa

    Scientists uncover hidden forces causing continents to rise

    Scientists have answered one of the most puzzling questions in plate tectonics: how and why do "stable" parts of continents gradually rise to form some of the planet's greatest topographic features?

    They have found that when tectonic plates break apart, powerful waves are triggered deep within the Earth that can cause continental surfaces to rise by over a kilometer.
    Their findings help resolve a long-standing mystery about the dynamic forces that shape and connect some of the Earth's most dramatic landforms—expansive topographic features called 'escarpments' and 'plateaus' that profoundly influence climate and biology.

    The new research examined the effects of global tectonic forces on landscape evolution over hundreds of millions of years. The findings are published Aug 8 in the journal Nature.

    The research  results help explain why parts of the continents previously thought of as "stable" experience substantial uplift and erosion, and how such processes can migrate hundreds or even thousands of kilometers inland, forming sweeping elevated regions known as plateaus, like the Central Plateau of South Africa.

    The researchers  discovered that when continents split apart, the stretching of the continental crust causes stirring movements in Earth's mantle (the voluminous layer between the crust and the core).

    This process can be compared to a sweeping motion that moves towards the continents and disturbs their deep foundations.

    The team noticed an interesting pattern: the speed of the mantle "waves" moving under the continents in their simulations closely matched the speed of major erosion events that swept across the landscape in Southern Africa following the breakup of the ancient supercontinent Gondwana.

    The scientists pieced together evidence to propose that the Great Escarpments originate at the edges of ancient rift valleys, much like the steep walls seen at the margins of the East African Rift today. Meanwhile, the rifting event also sets off a "deep mantle wave" that travels along the continent's base at about 15–20 kilometers per million years.

    This wave convectively removes layers of rock from the continental roots.

    Much like how a hot-air balloon sheds weight to rise higher, this loss of continental material causes the continents to rise—a process called isostasy.
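    The hot-air balloon analogy can be made quantitative with textbook Airy isostasy (a standard approximation, assumed here for illustration rather than taken from the study's full mantle model): when a rock column of thickness h is removed, the lighter crust floats up on the denser mantle by roughly h times the ratio of the densities, so the surface drops by far less than the thickness eroded.

    ```python
    # Textbook Airy isostatic rebound (an assumption for illustration, not the
    # study's full model). Typical density values:
    RHO_CRUST = 2700.0   # kg/m^3, continental crust
    RHO_MANTLE = 3300.0  # kg/m^3, upper mantle

    def isostatic_rebound_m(eroded_m):
        # removing a column of thickness h lets the crust rebound by h * rho_c/rho_m
        return eroded_m * RHO_CRUST / RHO_MANTLE

    eroded = 3000.0  # several thousand meters of eroded rock, as in the study
    rebound = isostatic_rebound_m(eroded)
    print(f"erode {eroded:.0f} m -> crust rebounds {rebound:.0f} m; "
          f"net surface drop only {eroded - rebound:.0f} m")
    ```

    This is why a plateau can stay high even after kilometers of rock have been stripped away: roughly 80 percent of the eroded thickness is recovered as uplift.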

    Part 1

  • Dr. Krishna Kumari Challa

    Building on this, the team modeled how landscapes respond to this mantle-driven uplift. They found that migrating mantle instabilities give rise to a wave of surface erosion that lasts tens of millions of years and moves across the continent at a similar speed. This intense erosion removes a huge weight of rock that causes the land surface to rise further, forming elevated plateaus.
    Their landscape evolution models show how a sequence of events linked to rifting can produce an escarpment as well as a stable, flat plateau, even after several thousand meters of rock have been eroded away.
    The team's study provides a new explanation for the puzzling vertical movements of cratons, which lie far from the continental edges where uplift is more common.
    The team has concluded that the same chain of mantle disturbances that triggers diamonds to rise quickly from Earth's deep interior also fundamentally shapes continental landscapes, influencing a host of factors from regional climates and biodiversity to human settlement patterns.

    Thomas Gernon, Co-evolution of craton margins and interiors during continental break-up, Nature (2024). DOI: 10.1038/s41586-024-07717-1. www.nature.com/articles/s41586-024-07717-1

    Part 2

    **

  • Dr. Krishna Kumari Challa

    Microbes conquer the next extreme environment: Your microwave

    Since the industrial revolution, microbes have successfully colonized one novel type of habitat after another: for example, marine oil spills, plastic floating in the oceans, industrial brownfields, and even the interior of the International Space Station.

    However, it turns out that one extreme environment harboring a specialized community of highly adapted microbes is much closer to home: inside microwaves. This finding has now been reported for the first time by researchers  in a study in Frontiers in Microbiology. It's not only important from the perspective of hygiene, but could also inspire biotechnological applications—if the strains found inside microwaves can be put to good use in industrial processes that require especially hardy bacteria.

    The research results reveal that domestic microwaves have a more 'anthropized' microbiome, similar to kitchen surfaces, while laboratory microwaves harbour bacteria that are more resistant to radiation.

    Researchers sampled microbes from inside 30 microwaves: 10 from single-household kitchens, 10 from shared domestic spaces (for example, corporate centers, scientific institutes, and cafeterias), and 10 from molecular biology and microbiology laboratories. The aim behind this sampling scheme was to see whether these microbial communities are influenced by food interactions and user habits.

    The team used two complementary methods to inventory the microbial diversity: next-generation sequencing and cultivation of 101 strains in five different media.

    In total, the researchers found 747 different genera within 25 bacterial phyla. The most frequently encountered phyla were Firmicutes, Actinobacteria, and especially Proteobacteria.

    They found that the composition of the typical microbial community partly overlapped between shared domestic and single-household domestic microwaves, while laboratory microwaves were quite different. The diversity was lowest in single-household microwaves, and highest in laboratory ones.

    Part 1

  • Dr. Krishna Kumari Challa

    The laboratory microwave microbiome, however, was also similar to that of an industrial habitat: solar panels. The authors proposed that the constant thermal shock, electromagnetic radiation, and desiccation in such highly irradiated environments have repeatedly selected for highly resistant microbes, just as in microwaves.

    For both the general public and laboratory personnel, the researchers recommend regularly disinfecting microwaves with a diluted bleach solution or a commercially available disinfectant spray. In addition, it is important to wipe down the interior surfaces with a damp cloth after each use to remove any residue and to clean up spills immediately to prevent the growth of bacteria.

    The microwave bacteriome: biodiversity of domestic and laboratory microwave ovens, Frontiers in Microbiology (2024). DOI: 10.3389/fmicb.2024.1395751

    Part 2

  • Dr. Krishna Kumari Challa

    Lens-free fluorescence instrument detects deadly microorganisms in drinking water

    Researchers have shown that a fluorescence detection system that doesn't contain any lenses can provide highly sensitive detection of deadly microorganisms in drinking water. With further development, the new approach could provide a low-cost and easy-to-use way to monitor water quality in resource-limited settings such as developing countries or areas affected by disasters.

    It could also be useful when water safety results are needed quickly, such as for swimming events, a concern highlighted during the Paris Olympics.

    Current methods used to assess microbial contamination in water require culturing the water samples and then quantifying harmful bacteria. This can take over 18 hours, making it impractical when immediate confirmation of water safety is needed. This is also a key reason why water surveillance is ineffective in developing countries, where the required skilled human resources, infrastructure and reagents are not readily available.

    The  new water monitoring fluorometer can detect fluorescent proteins from bacteria in water down to levels of less than one part per billion, without using any lenses.

    This sensitivity meets the World Health Organization's criteria for detecting fecal contamination in drinking water.

    During development, the researchers closely examined the fundamentals of optical signal generation in applications like water quality monitoring.

    They discovered that while optical lenses are commonly used in devices such as cameras, microscopes and telescopes, these optical components often reduce performance for practical situations that don't require images.

    This was an important finding because lenses account for a significant share of the costs of optical systems and their bulk and weight make it difficult to create practical portable devices.

    The new analysis revealed that using a light source, detectors and sample sizes that are all as large and as close to each other as possible produces a stronger signal, leading to better performance for water quality monitoring.
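    The "large and close" finding has a simple geometric reading: a fluorescing sample emits light in all directions, and the detector only collects the fraction falling within the solid angle it subtends. The formula below is standard on-axis solid-angle geometry, and the example sizes and distances are made-up numbers for illustration, not the authors' actual design.

    ```python
    import math

    # Fraction of isotropically emitted fluorescence collected by a circular
    # detector of radius r at on-axis distance d: (1 - d/sqrt(d^2 + r^2)) / 2.
    # (Standard solid-angle geometry; example numbers are illustrative only.)
    def collected_fraction(r_mm, d_mm):
        return 0.5 * (1.0 - d_mm / math.hypot(d_mm, r_mm))

    # A small detector far from the sample vs a large one right next to it:
    small_far = collected_fraction(r_mm=0.5, d_mm=20.0)
    large_near = collected_fraction(r_mm=1.5, d_mm=2.0)
    print(f"collected fraction: {small_far:.2e} vs {large_near:.2e}")
    ```

    Moving from the first configuration to the second boosts the collected signal by two to three orders of magnitude, which is the intuition behind dropping the lenses entirely.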

    Part 1

  • Dr. Krishna Kumari Challa

    Based on their findings, the researchers designed a lensless fluorescence system using large (1–2 mm²) LEDs and detectors, which have recently become available in UV wavelengths. It works by using UV light to excite proteins from harmful microbes and then detecting the resulting fluorescence.

    In addition to demonstrating the lensless system's sensitivity, they also showed that it produced a fluorescence signal about double the strength of a lensed system's. The lensed system's performance was limited by its numerical aperture, by the use of larger sources and detectors, and by the finite imaging distance required between the components and the sample.

    The researchers are now developing a pocket-sized version of the lensless fluorometer for field testing.

     Asim Maharjan et al, Lensless fluorometer outperforms lensed system, Optica (2024). DOI: 10.1364/OPTICA.527289

    Part 2

  • Dr. Krishna Kumari Challa

    Study observes that similarities between physical and biological systems might be greater than we think

    A crowd or a flock of birds has different characteristics from those of atoms in a material, but when it comes to collective movement, the differences matter less than we might think. We can try to predict the behavior of humans, birds, or cells based on the same principles we use for particles.

    This is the finding of a study published in the Journal of Statistical Mechanics: Theory and Experiment, JSTAT, conducted by an international team of researchers.  The study, based on the physics of materials, simulated the conditions that cause a sudden shift from a disordered state to a coordinated one in "self-propelled agents" (like biological ones).

    "In a way, birds are flying atoms," say the researchers. "It may sound strange, but indeed one of our main findings was that the way a walking crowd moves, or a flock of birds in flight, shares many similarities with physical systems of particles."

    In the field of collective movement studies, it has been assumed that there is a qualitative difference between particles (atoms and molecules) and biological elements (cells, but also entire organisms in groups). It was especially believed that the transition from one type of movement to another (for example, from chaos to an orderly flow, known as a phase transition) was completely different.

    The crucial difference for physicists in this case has to do with the concept of distance. Particles moving in a space with many other particles influence each other primarily based on their mutual distance. For biological elements, however, the absolute distance is less important.

    Take a pigeon flying in a flock: what matters to it are not so much the closest pigeons, but those it can see. In fact, according to the literature, among those it can see, it can keep track of only a finite number, due to its cognitive limits.

    The pigeon, in the physicists' jargon, is in a "topological relationship" with other pigeons: two birds could be at quite a large physical distance, but if they are in the same visible space, they are in mutual contact and influence each other.

    It was long thought that this type of difference led to a completely different scenario for the emergence of collective motion. This new study, however, suggests that it is not a crucial difference.

    These statistical models, based on the physics of particles, can also help us understand biological collective movement.
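    The standard minimal model in this field is the Vicsek model: each agent moves at a fixed speed and repeatedly aligns its heading with its neighbours, plus some noise. The sketch below is an illustration of that general model class, not the paper's exact model; it uses the "topological" neighbour rule the article describes (each agent tracks its k nearest flockmates), and swapping in a fixed-radius cutoff would give the "metric" rule used for particles.

    ```python
    import numpy as np

    # Minimal Vicsek-style flocking sketch (illustrative, not the paper's model).
    rng = np.random.default_rng(0)

    def step(pos, theta, box=10.0, speed=0.1, noise=0.3, k=5):
        # pairwise distances between all agents
        d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
        # topological rule: each agent's k nearest agents (first column is itself)
        nbrs = np.argsort(d, axis=1)[:, :k + 1]
        # average neighbour heading via sin/cos to handle angle wrap-around
        mean_heading = np.arctan2(np.sin(theta)[nbrs].mean(axis=1),
                                  np.cos(theta)[nbrs].mean(axis=1))
        theta = mean_heading + noise * rng.uniform(-np.pi, np.pi, len(theta))
        pos = (pos + speed * np.column_stack([np.cos(theta), np.sin(theta)])) % box
        return pos, theta

    def polar_order(theta):
        # order parameter: 1 for a perfectly aligned flock, near 0 for disorder
        return float(np.hypot(np.cos(theta).mean(), np.sin(theta).mean()))

    n = 100
    pos = rng.uniform(0.0, 10.0, (n, 2))
    theta = rng.uniform(-np.pi, np.pi, n)
    for _ in range(200):
        pos, theta = step(pos, theta)
    print(f"order parameter after 200 steps: {polar_order(theta):.2f}")
    ```

    Sweeping the noise level in such a model reveals the transition from disorder to coordinated motion that the study analyzes.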

    Fluctuation-Induced First Order Transition to Collective Motion, Journal of Statistical Mechanics Theory and Experiment (2024). DOI: 10.1088/1742-5468/ad6428

    **

  • Dr. Krishna Kumari Challa

    How intermittent fasting regulates aging through autophagy

    Research published recently in Nature Cell Biology sheds light on the mechanism through which spermidine regulates autophagy, a process that ensures the recycling of components within the cell, to promote the anti-aging effects of intermittent fasting.

    The work demonstrated that intermittent fasting increases the levels of spermidine, a natural polyamine that enhances the resilience and survival of cells and organisms through the activation of autophagy.

    Autophagy is a process of cellular recycling: the destruction of non-functional or unnecessary components and organelles of the cell. Autophagy defects have been linked to aging, as well as to the emergence of age-related disorders such as diabetes, cardiovascular diseases, cancer and neurodegenerative diseases.

    Dietary habits, such as a low- or high-fat diet, over-nutrition, or fasting, can influence the development of these chronic diseases, the prevalence of which is expected to increase considerably in the coming years. Dietary interventions, such as caloric restriction and intermittent fasting, can slow down aging and promote longevity.

    A key element of these interventions is the maintenance of cellular homeostasis through the induction of autophagy. Direct administration of spermidine is an alternative strategy for inducing autophagy and extending lifespan. However, the role of spermidine in the regulation of autophagy and aging upon intermittent fasting remains unclear.

    Using a range of experimental models, including the nematode (Caenorhabditis elegans), yeast (Saccharomyces cerevisiae), the fruit fly (Drosophila melanogaster), the mouse (Mus musculus), and human cell lines, the research teams showed that intermittent fasting increases the cellular levels of spermidine, which in turn induces autophagy, resulting in the prolongation of lifespan in these organisms.

    Conversely, inhibiting spermidine synthesis with appropriate inhibitors counteracts the lifespan benefits that intermittent fasting confers through autophagy.

    The results highlight the critical role of spermidine in regulating autophagy under intermittent fasting, thereby extending lifespan across all model organisms studied. The fact that the regulation of autophagy through spermidine and intermittent fasting is an evolutionarily conserved process underscores its central role in monitoring and maintaining cellular homeostasis across different organisms.

    Sebastian J. Hofer et al, Spermidine is essential for fasting-mediated autophagy and longevity, Nature Cell Biology (2024). DOI: 10.1038/s41556-024-01468-x

  • Dr. Krishna Kumari Challa

    Scientists unravel how the BCG vaccine leads to the destruction of bladder cancer cells

    Using zebrafish "avatars," an animal model developed by the Cancer Development and Innate Immune Evasion lab, researchers studied the initial steps of the Bacillus Calmette-Guérin (BCG) vaccine's action on bladder cancer cells.

    Their results, which are published in the journal Disease Models and Mechanisms, show that macrophages—the first line of immune cells activated after an infection—literally induce the cancer cells to commit suicide and then rapidly eat away the dead cancer cells.

    The BCG vaccine was first used against TB in the 1920s and began to be used as the first cancer immunotherapy around 1976. But decades before that, in the 1890s, William Coley, a surgeon working at the New York Hospital (now Weill Cornell Medical Center), had already tested a mix of different bacteria, coined "Coley's toxins," as a cancer immunotherapy.

    Coley had noticed that several virtually hopeless cancer patients at the hospital went into seemingly "miraculous" remissions from their cancer when they caught a bacterial infection following the surgery performed to remove their tumors (sterile conditions for surgical procedures were then less than optimal). His idea was that such recoveries, far from being miraculous, were in fact caused by an immune response of the patients to the infection.

    Coley started trying to induce bacterial infections in a number of sarcoma patients and was able to reproduce a few cancer remissions. At the time, though, his method was far from being proven and safe—and meanwhile, other treatment methods were developed, such as radiotherapy—so his research was not pursued. But in recent years, the field of immunotherapy has gained enormous momentum, bringing new and more scientifically robust ways of boosting the immune system to fight cancer. 

    BCG immunotherapy is still used rather empirically. However, since it works for many people, it has become a gold-standard treatment. Surprisingly, it is a very effective immunotherapy, even compared with the many sophisticated immunotherapies now being developed.

    The treatment consists of instilling the BCG vaccine directly into the bladder. When the treatment works, the 15-year survival rate for patients with so-called "non-muscle-invasive" (early-stage) bladder cancer is 60% to 70%. However, in 30% to 50% of the cases, bladder tumors are unresponsive to BCG treatment. In these cases, the whole bladder has to be removed.

    Macrophages directly kill bladder cancer cells through TNF signaling in an early response to BCG therapy, Disease Models & Mechanisms (2024). DOI: 10.1242/dmm.050693

  • Dr. Krishna Kumari Challa

    Research demonstrates genetically diverse crowds are wiser

    A new study by researchers reveals that genetically diverse groups make more accurate collective judgments compared to genetically homogeneous groups.

    The research, published in Personality and Individual Differences, provides new insights into the origins of the 'wisdom of crowds' phenomenon, emphasizing the role of genetic diversity in enhancing collective intelligence.

    Past studies have suggested that combining individual judgments can improve accuracy, especially when individuals differ in background, education, and demography.

    The study involved 602 identical and fraternal twins, who participated by making numerical judgments in pairs. These pairs consisted either of co-twins (related pairs) or non-related individuals (unrelated pairs).

    The results of this study revealed that judgments made by unrelated (i.e., heterogeneous) pairs were more accurate than those made by related (i.e., homogeneous) pairs. Theoretically, however, this finding could emerge from either environmental or genetic factors.

    In order to distinguish between environmental and genetic factors, the study compared the performance of related and unrelated pairs, separately among identical and fraternal twins.
    This comparison is relevant as genetic influences make identical twins more similar to one another compared to fraternal twins, because the former share virtually 100% of their genetic variance, whereas fraternal twins share, on average, 50% of the genetic variance.


    The findings revealed that the superior performance of unrelated versus related pairs was evident for the identical twins. This underscores the impact of genetic relatedness on collective judgment.

  • Dr. Krishna Kumari Challa

    This research is the first empirical demonstration of the benefits of genetic diversity for collective judgments.
    The findings suggest that genetic diversity enhances the collective cognitive abilities of groups, providing a deeper understanding of how diverse crowds can achieve wiser outcomes. By uniquely highlighting the genetic aspect, this research adds a new dimension to the 'wisdom of crowds' phenomenon.
    These findings highlight the significant impact genetic diversity can have on collective decision-making, underscoring the importance of embracing diversity in all its forms to enhance our cognitive abilities and tackle complex challenges more effectively.

    Meir Barneron et al, Genetically-diverse crowds are wiser, Personality and Individual Differences (2024). DOI: 10.1016/j.paid.2024.112823

    Part 2

  • Dr. Krishna Kumari Challa

    The fat-gut-cancer link
    A study in mice and people suggests why there is a link between obesity and some cancers: a high-fat diet increases the number of Desulfovibrio bacteria in the gut. These release leucine, an amino acid, which encourages the proliferation of a kind of cell that suppresses the immune system. With a suppressed immune system, breast cancer tumour growth increases. 

    https://www.pnas.org/doi/10.1073/pnas.2306776121

  • Dr. Krishna Kumari Challa

    A crustacean positively too beautiful to eat

    Rare "cotton candy" lobster found in New Hampshire

    This lobster's shell was a swirling mass of blue, purple, and pink pigments, which looked like glazed, opalescent pottery when wet.

  • Dr. Krishna Kumari Challa

    Serotonin changes how people learn and respond to negative information

    It is easy to advise, 'be positive, be happy, don’t respond to bad criticism, be in control of your emotions'. But can people really do this in real-life situations without the help of their biochemistry?

    And without tough training to improve their behaviour?

    NO!

    Increasing serotonin can change how people react and learn from negative information, as well as improving how they respond to it, according to a new study published in the journal Nature Communications.

    The study by scientists found people with increased serotonin levels had reduced sensitivity to punishing outcomes (for example, losing money in a game) without significantly affecting sensitivity to rewarding ones (winning money).

    The researchers found that increasing serotonin made individuals better able to control their behaviour, particularly when exposed to negative information. The study also showed that elevated serotonin levels benefited different types of memory.

    These findings shed new light on how serotonin shapes human behaviour, particularly in negative environments.

    This provides us with some exciting new information about the role of serotonin in humans. It shows that serotonin, which has been implicated in depression and in the effects of antidepressants, plays more of a role in processing negative information than in boosting positive responses.

    These findings underscore the central role that serotonin plays in effortful cognitive processes, such as our ability to put the brakes on unwanted behaviors. This study helps to further understand why drugs that change serotonin levels are effective treatments for many mental illnesses, including depression, anxiety and obsessive-compulsive disorder. 

    Michael J. Colwell et al, Direct serotonin release in humans shapes aversive learning and inhibition, Nature Communications (2024). DOI: 10.1038/s41467-024-50394-x

  • Dr. Krishna Kumari Challa

    Unknown mechanism essential for bacterial cell division

    How does matter, lifeless by definition, self-organize and make us alive? One of the hallmarks of life, self-organization, is the spontaneous formation and breakdown of biological active matter. However, while molecules constantly fall in and out of life, one may ask how they "know" where, when, and how to assemble, and when to stop and fall apart.

    Researchers tried to address  these questions in the context of bacterial cell division. They developed a computational model for the assembly of a protein called FtsZ, an example of active matter. 

    A previously unknown mechanism of active matter self-organization essential for bacterial cell division follows the motto "dying to align": Misaligned filaments "die" spontaneously to form a ring structure at the center of the dividing cell. The study, led by the Šarić group at the Institute of Science and Technology Austria (ISTA), was published in Nature Physics. The work could find applications in developing synthetic self-healing materials.

    During cell division, FtsZ self-assembles into a ring structure at the center of the dividing bacterial cell. This FtsZ ring, called the bacterial division ring, was shown to help form a new "wall" that separates the daughter cells.

     The researchers' computational work demonstrates how misaligned FtsZ filaments react when they hit an obstacle.

    By "dying" and re-assembling, they favour the formation of the bacterial division ring, a well-aligned filamentous structure. These findings could have applications in the development of synthetic self-healing materials.

    FtsZ forms protein filaments that self-assemble by growing and shrinking in a continuous turnover. This process, called "treadmilling," is the constant addition and removal of subunits at opposite filament ends. Several proteins have been shown to treadmill in multiple life forms—such as bacteria, animals, or plants.

    Scientists have previously thought of treadmilling as a form of self-propulsion and modeled it as filaments that move forward. However, such models fail to capture the constant turnover of subunits and overestimate the forces generated by the filaments' assembly. 
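    The distinction between treadmilling and self-propulsion can be made concrete with a toy one-dimensional sketch (an illustration only, not the study's simulation): subunits are added at the "plus" end and removed at the "minus" end at the same rate, so the filament's span advances while no individual subunit ever moves.

    ```python
    # Toy 1D treadmilling sketch (illustrative only). The filament occupies
    # lattice sites [minus_end, plus_end); growth at one end and shrinkage at
    # the other advance the span without moving any subunit -- the key
    # difference from a self-propelled rod.
    def treadmill(n_steps, length=20):
        minus_end, plus_end = 0, length   # sites spanned by the filament
        for _ in range(n_steps):
            plus_end += 1                 # one subunit polymerizes at the plus end
            minus_end += 1                # one subunit falls off the minus end
        return minus_end, plus_end

    minus_end, plus_end = treadmill(50)
    print(f"span advanced by {minus_end} sites; length still {plus_end - minus_end}")
    ```

    Because the material itself is stationary, such a filament exerts far weaker pushing forces than a genuinely moving rod would, which is the modeling gap the study addresses.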

    Everything in our cells is in constant turnover. Thus, we need to start thinking of biological active matter from the prism of molecular turnover and in a way that adapts to the outside environment.

    What they found was striking. In contrast to self-propelled assemblies that push the surrounding molecules and create a "bump" felt at long molecular distances, they saw that misaligned FtsZ filaments started "dying" when they hit an obstacle.

    "Active matter made up of mortal filaments does not take misalignment lightly. When a filament grows and collides with obstacles, it dissolves and dies." Treadmilling assemblies thus lead to local healing of the active material: when misaligned filaments die, they contribute to a better overall assembly.

    By incorporating the cell geometry and filament curvature into their model, the researchers showed how the death of misaligned FtsZ filaments helped form the bacterial division ring.

    Part 1

  • Dr. Krishna Kumari Challa

    Driven by the physical theories of molecular interactions, the researchers soon connected with two experimental groups whose work helped confirm their results.

    These groups presented exciting experimental data showing that the death and birth of FtsZ filaments were essential for the formation of the division ring, suggesting that treadmilling plays a crucial role in this process.

    The in vitro results also closely matched the simulations, further confirming the team's computational results.

    Energy-driven self-organization of matter is a fundamental process in physics. Researchers are asking how to create, from non-living material, matter that behaves as if it were alive. Their present work could thus facilitate the creation of synthetic self-healing materials or synthetic cells.

    Self-organisation of mortal filaments and its role in bacterial division ring formation, Nature Physics (2024). DOI: 10.1038/s41567-024-02597-8

    Part 2

  • Dr. Krishna Kumari Challa

    Scientists find oceans of water on Mars.  But it is too deep to tap!

    Using seismic activity to probe the interior of Mars, geophysicists have found evidence for a large underground reservoir of liquid water—enough to fill oceans on the planet's surface.

    The data from NASA's Insight lander allowed the scientists to estimate that the amount of groundwater could cover the entire planet to a depth of between 1 and 2 kilometers, or about a mile.

    But this  reservoir won't be of much use to anyone trying to tap into it to supply a future Mars colony.

    It's located in tiny cracks and pores in rock in the middle of the Martian crust, between 11.5 and 20 kilometers below the surface. Even on Earth, drilling a hole a kilometer deep is a challenge.

    The finding does pinpoint another promising place to look for life on Mars, however, if the reservoir can be accessed. For the moment, it helps answer questions about the geological history of the planet.
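    The "enough to fill oceans" claim can be sanity-checked with a back-of-envelope calculation: a global layer of depth d holds a volume of about 4πR² × d. Mars' mean radius (~3,390 km) is a standard figure assumed here for illustration, not a number from the paper.

    ```python
    import math

    # Back-of-envelope volume of "a global layer 1-2 km deep" on Mars.
    # Mars' mean radius ~3,390 km is a standard figure (an assumption here).
    R_MARS_KM = 3390.0
    surface_area_km2 = 4.0 * math.pi * R_MARS_KM ** 2   # ~1.4e8 km^2

    for depth_km in (1.0, 2.0):
        volume_km3 = surface_area_km2 * depth_km
        print(f"a {depth_km:.0f} km global layer holds ~{volume_km3:.1e} km^3 of water")
    ```

    For comparison's sake, that is on the order of a tenth of the volume of Earth's oceans, locked in pore space far below the Martian surface.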

    Wright, Vashan, Liquid water in the Martian mid-crust, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2409983121

  • Dr. Krishna Kumari Challa

    Flood of 'junk': How AI is changing scientific publishing

    An infographic of a rat with a preposterously large penis. Another showing human legs with way too many bones. An introduction that starts: "Certainly, here is a possible introduction for your topic".

    These are a few of the most egregious examples of artificial intelligence that have recently made their way into scientific journals, shining a light on the wave of AI-generated text and images washing over the academic publishing industry.

    Several experts who track down problems in studies told AFP that the rise of AI has turbocharged the existing problems in the multi-billion-dollar sector.

    All the experts emphasized that AI programs such as ChatGPT can be a helpful tool for writing or translating papers—if thoroughly checked and disclosed.

    But that was not true of several recent papers that somehow snuck past peer review.

    Earlier this year, a clearly AI-generated graphic of a rat with impossibly huge genitals was shared widely on social media.

    It was published in a journal of academic giant Frontiers, which later retracted the study.

    Another study was retracted last month for an AI graphic showing legs with odd multi-jointed bones that resembled hands.

    While these examples were images, it is thought to be ChatGPT, a chatbot launched in November 2022, that has most changed how the world's researchers present their findings.

    A study published by Elsevier went viral in March for its introduction, which was clearly a ChatGPT prompt that read: "Certainly, here is a possible introduction for your topic".

    Such embarrassing examples are rare and would be unlikely to make it through the peer review process at the most prestigious journals, several experts  say.

    Part 1

  • Dr. Krishna Kumari Challa

    It is not always so easy to spot the use of AI. But one clue is that ChatGPT tends to favor certain words.

    At least 60,000 papers involved the use of AI in 2023—over one percent of the annual total. For 2024 we are going to see very significantly increased numbers.

    Meanwhile, more than 13,000 papers were retracted last year, by far the most in history, according to the US-based group Retraction Watch.

    AI has allowed bad actors in scientific publishing and academia to "industrialize the overflow" of "junk" papers, according to Retraction Watch.

     Such bad actors include what are known as  paper mills.

    These "scammers" sell authorship to researchers, pumping out vast amounts of very poor quality, plagiarized or fake papers. Two percent of all studies are thought to be published by paper mills, but the rate is "exploding" as AI opens the floodgates. 

    The problem is not just paper mills but a broader academic culture that pushes researchers to "publish or perish". Publishers have built 30 to 40 percent profit margins and billions of dollars in profit by creating systems that demand volume.

    The insatiable demand for ever-more papers piles pressure on academics who are ranked by their output, creating a "vicious cycle". 

    Many have turned to ChatGPT to save time—which is not necessarily a bad thing.

    Because nearly all papers are published in English,  AI translation tools can be invaluable to researchers for whom English is not their first language.

    But there are also fears that the errors, inventions and unwitting plagiarism by AI could increasingly erode society's trust in science.

    Another example of AI misuse came last week, when a researcher discovered that what appeared to be a ChatGPT-rewritten version of one of his own studies had been published in an academic journal.

    A good example: 

    Samuel Payne, a bioinformatics professor at Brigham Young University in the United States,  revealed that he had been asked to peer review a study in March.

    After realizing it was "100 percent plagiarism" of his own study—but with the text seemingly rephrased by an AI program—he rejected the paper.

    Payne said he was "shocked" to find the plagiarized work had simply been published elsewhere, in a new Wiley journal called Proteomics.

    It has still not been retracted.

    Source: AFP and other news agencies

    **

    Part 2

  • Dr. Krishna Kumari Challa

    A common parasite could deliver drugs to the brain—how scientists are turning Toxoplasma gondii from foe into friend

    Parasites take an enormous toll on human and veterinary health. But researchers may have found a way for patients with brain disorders and a common brain parasite to become frenemies.

    A new study published in Nature Microbiology has pioneered the use of a single-celled parasite, Toxoplasma gondii, to inject therapeutic proteins into brain cells. The brain is very picky about what it lets in, including many drugs, which limits treatment options for neurological conditions.

    Preventing and treating disease by co-opting the very microbes that threaten us has a history that long predates germ theory. Vaccines are a good example. 

    The concept of inoculation has yielded a plethora of vaccines that have saved countless lives.

    Viruses, bacteria and parasites have also evolved many tricks to penetrate organs such as the brain and could be retooled to deliver drugs into the body. Such uses could include viruses for gene therapy and intestinal bacteria to treat a gut infection known as C. diff.

    Part 1

  • Dr. Krishna Kumari Challa

    Toxoplasma parasites infect all animals, including humans. Infection can occur in multiple ways, including ingesting spores released in the stool of infected cats or consuming contaminated meat or water. Toxoplasmosis in otherwise healthy people produces only mild symptoms but can be serious in immunocompromised people and in gestating fetuses.

    Unlike most pathogens, Toxoplasma can cross the blood-brain barrier and invade brain cells. Once inside neurons, the parasite releases a suite of proteins that alter gene expression in its host, which may be a factor in the behavioral changes it causes in infected animals and people.
    In a new study, a global team of researchers hijacked the system Toxoplasma uses to secrete proteins into its host cell. The team genetically engineered Toxoplasma to make a hybrid protein, fusing one of its secreted proteins to a protein called MeCP2, which regulates gene activity in the brain—in effect, giving the MeCP2 a piggyback ride into neurons. Researchers found that the parasites secreted the MeCP2 protein hybrid into neurons grown in a petri dish as well as in the brains of infected mice.

    A genetic deficiency in MECP2 causes a rare brain development disorder called Rett syndrome. Gene therapy trials using viruses to deliver the MeCP2 protein to treat Rett syndrome are underway. If Toxoplasma can deliver a form of MeCP2 protein into brain cells, it may provide another option to treat this currently incurable condition. It also may offer another treatment option for other neurological problems that arise from errant proteins, such as Alzheimer's and Parkinson's disease.
    The road from laboratory bench to bedside is long and filled with obstacles, so don't expect to see engineered Toxoplasma in the clinic anytime soon.

    The obvious complication in using Toxoplasma for medical purposes is that it can produce a serious, lifelong infection that is currently incurable. Infecting someone with Toxoplasma can damage critical organ systems, including the brain, eyes and heart.

    However, up to one-third of people worldwide currently carry Toxoplasma in their brain, apparently without incident. Emerging studies have correlated infection with increased risk of schizophrenia, rage disorder and recklessness, hinting that this quiet infection may be predisposing some people to serious neurological problems.

    The widespread prevalence of Toxoplasma infections may also be another complication, as it disqualifies many people from using it for treatment. Since the billions of people who already carry the parasite have developed immunity against future infection, therapeutic forms of Toxoplasma would be rapidly destroyed by their immune systems once injected.

    In some cases, the benefits of using Toxoplasma as a drug delivery system may outweigh the risks. Engineering benign forms of this parasite could produce the proteins patients need without harming the organ—the brain—that defines who we are.

    https://www.nature.com/articles/s41564-024-01750-6

    Part 2

    **

  • Dr. Krishna Kumari Challa

    Rat poison is moving up through food chains, threatening carnivores around the world

    Rats thrive around humans, for good reason: They feed off crops and garbage and readily adapt to many settings, from farms to the world's largest cities. To control them, people often resort to poisons. But chemicals that kill rats can also harm other animals.

    The most commonly used poisons are called anticoagulant rodenticides. They work by interfering with blood clotting in animals that consume them. These enticingly flavored bait blocks are placed outside of buildings, in small black boxes that only rats and mice can enter. But the poison remains in the rodents' bodies, threatening larger animals that prey on them.

    Researchers detected rodenticides in about one-third of the animals examined in the studies they reviewed, including bobcats, foxes and weasels. They directly linked the poisons to the deaths of one-third of the deceased animals—typically, by finding the chemicals in the animals' liver tissues.

    When wild animals consume rat poison—typically, by eating a poisoned rat—the effects may include internal bleeding and lesions, lethargy and a reduced immune response, which can make them more susceptible to other diseases. In many cases the animal will die. Sometimes these deaths occur at scales large enough to reduce local predator populations.

     M. P. Keating et al, Global review of anticoagulant rodenticide exposure in wild mammalian carnivores, Animal Conservation (2024). DOI: 10.1111/acv.12947

  • Dr. Krishna Kumari Challa

    How experience shapes neural connectivity in the brain

    Our brain interprets visual information by combining what we see with what we already know. A study by researchers, published in the journal Neuron, reveals a mechanism for learning and storing this existing knowledge about the world.

    They found that neurons are wired to connect seemingly unrelated concepts. This wiring may be crucial for enhancing the brain's ability to predict what we see based on past experiences, and brings us a step closer to understanding how this process goes awry in mental health disorders.

    How do we learn to make sense of our environment? Over time, our brain builds a hierarchy of knowledge, with higher-order concepts linked to the lower-order features that comprise them.

    This interconnected framework shapes our expectations and perception of the world, allowing us to identify what we see based on context and experience.

    Part 1

  • Dr. Krishna Kumari Challa

    The brain's visual system consists of a network of areas that work together, with lower areas handling simple details (e.g. small regions of space, colors, edges) and higher areas representing more complex concepts (e.g. larger regions of space, animals, faces).

    Cells in higher areas send "feedback" connections to lower areas, putting them in a position to learn and embed real-world relationships shaped by experience.

    For instance, cells encoding an "elephant" might send feedback to cells processing features like "gray," "big" and "heavy." The researchers therefore set about investigating how visual experience influences the organization of these feedback projections, whose functional role remains largely unknown.
    Researchers examined the effects of visual experience on feedback projections to a lower visual area called V1 in mice. They raised two groups of mice differently: one in a normal environment with regular light exposure, and the other in darkness. They then observed how the feedback connections, and the cells they target in V1, responded to different regions of the visual field.

    In mice raised in darkness, the feedback connections and V1 cells directly below them both represented the same areas of visual space. It's amazing to see how well the spatial representations of higher and lower areas matched up in the dark-reared mice. This suggests that the brain has an inherent, genetic blueprint for organizing these spatially aligned connections, independent of visual input. However, in normally-reared mice, these connections were less precisely matched, and more feedback inputs conveyed information from surrounding areas of the visual field.
    Part 2
  • Dr. Krishna Kumari Challa

    Researchers found that with visual experience, feedback provides more contextual and novel information, enhancing the ability of V1 cells to sample information from a broader area of the visual scene.

    This effect depended on the origin within the higher visual area: feedback projections from deeper layers were more likely to convey surrounding information compared to those from superficial layers.

    Moreover, the team discovered that in normally-reared mice, deep-layer feedback inputs to V1 become organized according to the patterns they "prefer" to see, such as vertical or horizontal lines. For instance, inputs that prefer vertical lines avoid sending surrounding information to areas located along the vertical direction. In contrast, they found no such bias in connectivity in dark-reared mice.

    This suggests that visual experience plays a crucial role in fine-tuning feedback connections and shaping the spatial information transmitted from higher to lower visual areas.
    Researchers developed a computational model that shows how experience leads to a selection process, reducing connections between feedback and V1 cells whose representations overlap too much. This minimizes redundancy, allowing V1 cells to integrate a more diverse range of feedback.
    Perhaps counter-intuitively, the brain might encode learned knowledge by connecting cells that represent unrelated concepts, and that are less likely to be activated together based on real-world patterns. This could be an energy-efficient way to store information, so that when encountering a novel stimulus, like a pink elephant, the brain's preconfigured wiring maximizes activation, enhancing detection and updating predictions about the world.

    Identifying this brain interface where prior knowledge combines with new sensory information could be valuable for developing interventions in cases where this integration process malfunctions.

    Visual experience reduces the spatial redundancy between cortical feedback inputs and primary visual cortex neurons, Neuron (2024). DOI: 10.1016/j.neuron.2024.07.009 www.cell.com/neuron/fulltext/S0896-6273(24)00531-2

    Part 3

    **

  • Dr. Krishna Kumari Challa

    Pre-surgical antibody treatment might prevent heart transplant rejection

    A new study from scientists  suggests there may be a way to further protect transplanted hearts from rejection by preparing the donor organ and the recipient with an anti-inflammatory antibody treatment before surgery occurs.

    The findings, published online in the Proceedings of the National Academy of Sciences, focus on blocking an innate immune response that normally occurs in response to microbial infections. The same response has been shown to drive dangerous inflammation in transplanted hearts.

    In the new study—in mice—transplanted hearts functioned for longer periods when the organ recipients also received the novel antibody treatment. Now the first of a complex series of steps has begun to determine whether a similar approach can be safely performed for human heart transplants.

    The anti-rejection regimens currently in use are broad immunosuppressive agents that make the patients susceptible to infections. By using specific antibodies, scientists think they can just block the inflammation that leads to rejection but leave anti-microbial immunity intact.

     Pasare, Chandrashekhar, Alloreactive memory CD4 T cells promote transplant rejection by engaging DCs to induce innate inflammation and CD8 T cell priming, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2401658121

  • Dr. Krishna Kumari Challa

    Brain biomarker in blood sample predicts stroke, researchers demonstrate

    Researchers  have demonstrated that a simple blood test that reflects brain health can predict which people are most at risk of suffering a stroke. The discovery could contribute to more individualized treatment of patients with atrial fibrillation. The study has been published in the journal Circulation.

    Atrial fibrillation is the most frequent cardiac arrhythmia, affecting around a third of all people at some point in their life. Atrial fibrillation is a common cause of stroke, since the cardiac arrhythmia increases the risk of blood clots forming in the heart's atria. Many people with atrial fibrillation therefore receive anticoagulation treatment with a view to preventing stroke.

    However, since anticoagulation treatment leads to an increased risk of serious hemorrhages, only people with a moderate or high risk of experiencing a stroke receive this treatment, instead of all people with atrial fibrillation. This makes it important to be able to identify, with as high a degree of precision as possible, the individuals who will benefit from anticoagulation treatment.

    The researchers have now analyzed the substance neurofilament, a protein that is released from the brain in cases of injurious strain and hypoxia, in blood samples from more than 3,000 people with atrial fibrillation. The researchers then followed these people for an average of one and a half years. The individuals with the highest neurofilament levels in their blood had the highest risk of suffering a stroke. The risk of stroke among the quarter with the highest neurofilament levels was more than three times as high as for those with the lowest levels.

    As the risk of suffering a stroke determines which type of treatment is appropriate, this can help to increase the precision in the selection of treatment.

    When the researchers then combined neurofilament with ordinary cardiac blood samples from the same individuals, this further increased the ability to predict stroke.

    Julia Aulin et al, Neurofilament Light Chain and Risk of Stroke in Patients With Atrial Fibrillation, Circulation (2024). DOI: 10.1161/CIRCULATIONAHA.124.069440

  • Dr. Krishna Kumari Challa

    Excessive heat can affect your mental health

    While extreme heat can cause physical harm, it can also wreak havoc with your mental health.

    Sizzling temperatures can make anyone irritable, but it can be far worse for some, especially those with mental health conditions. Excessive heat can trigger feelings of anger, irritability, aggression, discomfort, stress and fatigue because of its impact on serotonin, the neurotransmitter that regulates your sleep, mood and behaviours.

    The most vulnerable groups include people with preexisting mental health conditions and people who abuse alcohol or other drugs.

    All mental illnesses increase with heat because it results in more fatigue, irritability and anxiety, and it can exacerbate depressive episodes.

    What are the signs of impending trouble? They tend to start with irritability, decreased motivation, aggressive behavior and sometimes mental fogging. In worse cases, confusion and disorientation occur.

    If you take medications, consult with your provider because some medications for mental health, such as lithium for bipolar patients, don't pair well with heat. Lithium goes through the kidney, so sweating can have an impact on the levels of the medication in your body.

    Droughts and extreme changes in temperature can also increase levels of pollutants and allergens as air quality worsens. That can exacerbate mental health issues like depression, anxiety or PTSD. Some studies show that exposure to any natural climate disaster can raise the risk of depression by more than 30%, anxiety by 70% and both by over 87%.

    If you feel affected by severe heat, call your doctor or mental health specialist.

    https://www.apa.org/monitor/2024/06/heat-affects-mental-health

  • Dr. Krishna Kumari Challa

    Significant link found between heme iron, found in red meat and other animal products, and type 2 diabetes risk

    Higher intake of heme iron, the type found in red meat and other animal products—as opposed to non-heme iron, found mostly in plant-based foods—was associated with a higher risk of developing type 2 diabetes (T2D) in a new study  by researchers. While the link between heme iron and T2D has been reported previously, the study's findings more clearly establish and explain the link.

    The researchers assessed the link between iron and T2D using 36 years of dietary reports from 206,615 adults enrolled in the Nurses' Health Studies I and II and the Health Professionals Follow-up Study. They examined participants' intake of various forms of iron—total, heme, non-heme, dietary (from foods), and supplemental (from supplements)—and their T2D status, controlling for other health and lifestyle factors.

    The researchers also analyzed the biological mechanisms underpinning heme iron's relationship to T2D among smaller subsets of the participants. They looked at 37,544 participants' plasma metabolic biomarkers, including those related to insulin levels, blood sugar, blood lipids, inflammation, and two biomarkers of iron metabolism. They then looked at 9,024 participants' metabolomic profiles—plasma levels of small-molecule metabolites, which are substances derived from bodily processes such as breaking down food or chemicals.

    The study found a significant association between higher heme iron intake and T2D risk. Participants in the highest intake group had a 26% higher risk of developing T2D than those in the lowest intake group. In addition, the researchers found that heme iron accounted for more than half of the T2D risk associated with unprocessed red meat and a moderate proportion of the risk for several T2D-related dietary patterns. In line with previous studies, the researchers found no significant associations between intakes of non-heme iron from diet or supplements and risk of T2D.

    The study also found that higher heme iron intake was associated with blood metabolic biomarkers associated with T2D. A higher heme iron intake was associated with higher levels of biomarkers such as C-peptide, triglycerides, C-reactive protein, leptin, and markers of iron overload, as well as lower levels of beneficial biomarkers like HDL cholesterol and adiponectin.

    The researchers also identified a dozen blood metabolites—including L-valine, L-lysine, uric acid, and several lipid metabolites—that may play a role in the link between heme iron intake and T2D risk. These metabolites have been previously associated with risk of T2D.

    But the findings—based on a study population that was mostly white—must be replicated in other racial and ethnic groups before they can be considered established.

    Integration of epidemiological and blood biomarker analysis links heme iron intake to increased type 2 diabetes risk, Nature Metabolism (2024). DOI: 10.1038/s42255-024-01109-5

    On a population level, the study findings carry important implications for dietary guidelines and public health strategies to reduce rates of diabetes, according to the researchers. In particular, the findings raise concerns about the addition of heme to plant-based meat alternatives to enhance their meaty flavor and appearance. These products are gaining in popularity, but health effects warrant further investigation.

  • Dr. Krishna Kumari Challa

    Algorithm achieves 98% accuracy in disease prediction via tongue colour

    A computer algorithm has achieved 98% accuracy in predicting different diseases by analyzing the colour of the human tongue.

    The proposed imaging system developed by Iraqi and Australian researchers can diagnose diabetes, stroke, anemia, asthma, liver and gallbladder conditions, COVID-19, and a range of vascular and gastrointestinal issues.

    The artificial intelligence (AI) model was able to match the tongue color with the disease in almost all cases.

    A paper published in Technologies outlines how the proposed system analyzes tongue color to provide on-the-spot diagnosis, confirming that AI holds the key to many advances in medicine.

    The color, shape and thickness of the tongue can reveal a litany of health conditions.

    Typically, people with diabetes have a yellow tongue; cancer patients a purple tongue with a thick greasy coating; and acute stroke patients present with an unusually shaped red tongue. A white tongue can indicate anemia; people with severe cases of COVID-19 are likely to have a deep red tongue; and an indigo or violet colored tongue indicates vascular and gastrointestinal issues or asthma.

    In the study, cameras placed 20 centimeters from a patient captured their tongue color and the imaging system predicted their health condition in real time.
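
    The pipeline described above (a camera captures the tongue, its dominant colour is extracted, and a model maps colour to a likely condition) can be illustrated with a toy sketch. To be clear, this is not the authors' algorithm: the reference colours and labels below are invented placeholders, and the real system uses machine-learning models trained on labelled tongue images, not a nearest-colour lookup.

    ```python
    # Toy nearest-centroid "diagnosis" from a mean tongue colour (RGB).
    # The reference colours are illustrative placeholders, not study values.

    REFERENCE_COLOURS = {
        "diabetes (yellowish)": (200, 180, 80),
        "anemia (whitish)": (230, 225, 215),
        "severe COVID-19 (deep red)": (150, 30, 40),
        "vascular/asthma (indigo-violet)": (90, 60, 140),
    }

    def classify_tongue(rgb):
        """Return the label whose reference colour is nearest in RGB space."""
        def dist2(a, b):
            # Squared Euclidean distance between two colours
            return sum((x - y) ** 2 for x, y in zip(a, b))
        return min(REFERENCE_COLOURS, key=lambda label: dist2(rgb, REFERENCE_COLOURS[label]))

    print(classify_tongue((210, 185, 90)))  # nearest to the yellowish reference
    ```

    The study's reported 98% accuracy comes from trained classifiers over many image features; the sketch only conveys the shape of the idea.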

     Ali Raad Hassoon et al, Tongue Disease Prediction Based on Machine Learning Algorithms, Technologies (2024). DOI: 10.3390/technologies12070097

  • Dr. Krishna Kumari Challa

    Are Andromeda and the Milky Way doomed to collide? Maybe not

    Scientists discovered the Andromeda galaxy, known as M31, hundreds of years ago, and around a century ago, we realized that it has a negative radial velocity, meaning it is moving toward the Milky Way. In other words, eventually, the two galaxies would merge spectacularly. That has been common knowledge among astronomers ever since, but is it really true?

    A new paper from researchers at the University of Helsinki, posted to the arXiv preprint server, looks at several confounding factors, including the gravitational influence of other galaxies in our Local Group, and finds only a 50% chance that the Milky Way will merge with the Andromeda galaxy in the next 10 billion years.

    That seems like a pretty big thing to get the physics wrong on. So, how did the authors come to that conclusion? They accounted for a problem that has been popularized in the media of late—the three-body—or, in this case, four-body—problem. And with that problem comes a lot of uncertainty, which is why there's a 50% chance that this huge event might still happen.

    Thinking of Andromeda and the Milky Way in isolation doesn't account for the other galaxies in what we know as the "Local Group." This comprises approximately 100 smaller galaxies at various orientations, distances, and speeds.

    The largest of the remaining galaxies is the Triangulum galaxy, M33, which is about 2.7 million light-years away and contains roughly 40 billion stars. That's about 40% of the approximately 100 billion stars in the Milky Way but a mere 4% of the nearly 1 trillion stars estimated to exist in Andromeda. Still, these galaxies exert their own gravitational pull, complicating the simplistic two-body picture of Andromeda and the Milky Way.
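
    The star counts quoted above are easy to check with a line of arithmetic (the counts themselves are the article's round estimates, not precise measurements):

    ```python
    m33 = 40e9          # stars in the Triangulum galaxy (M33), approximate
    milky_way = 100e9   # stars in the Milky Way, approximate
    andromeda = 1e12    # stars in Andromeda (M31), approximate

    print(f"M33 vs Milky Way: {m33 / milky_way:.0%}")   # 40%
    print(f"M33 vs Andromeda: {m33 / andromeda:.0%}")   # 4%
    ```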

    Part 1

  • Dr. Krishna Kumari Challa

    Scientists never like uncertainty. In fact, much of their research tries to place bounds on certain parameters, like the rotational speed of galaxies or the distances between them. Unfortunately, despite their proximity, there are many uncertainties surrounding the four galaxies used in the study, and those uncertainties make precise calculations of the effects of their gravitational and rotational pull difficult.

    Developing estimates rather than concrete numbers is one way scientists often deal with uncertainty, and in this case, that estimate fell right at the 50% mark in terms of whether or not the two galaxies would collide. However, there is still a lot of uncertainty in that estimate, and plenty more confounding factors, including the other galaxies in the local group, will influence the final outcome.
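
    One standard way to turn measurement uncertainties into a probability like this is Monte Carlo sampling: draw many plausible values for the poorly known quantities, decide the outcome for each draw, and report the fraction of draws that end in a merger. The sketch below is only a cartoon of that idea; the speeds, spread, and merger criterion are invented for illustration and are far simpler than the four-body dynamics the paper actually models.

    ```python
    import random

    random.seed(42)

    # Hypothetical numbers: suppose the sideways (transverse) speed of
    # Andromeda is uncertain, and a merger happens only if it is below
    # some critical value. None of these figures come from the paper.
    TRANSVERSE_MEAN, TRANSVERSE_SIGMA = 50.0, 30.0   # km/s, invented
    CRITICAL_TRANSVERSE = 50.0                       # km/s, invented

    def merger_in_one_draw():
        """Sample one plausible transverse speed and test the merger criterion."""
        v_t = abs(random.gauss(TRANSVERSE_MEAN, TRANSVERSE_SIGMA))
        return v_t < CRITICAL_TRANSVERSE

    n = 100_000
    mergers = sum(merger_in_one_draw() for _ in range(n))
    print(f"Estimated merger probability: {mergers / n:.1%}")
    ```

    With these invented inputs the estimate lands near 50%, echoing the coin-flip answer the researchers arrived at once all the observational uncertainties were folded in.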

    Ultimately, time will help solve the mystery, but that is a very long time on the scale of galaxy mergers. If it happens at all, a merger between the Milky Way and Andromeda will happen long after our own sun has burned out, and humans will either die out with it or find a way to expand to new stars. And if, at that point, we get easy access to an additional galaxy's worth of resources, it would be all the better for us.

    Till Sawala et al, Apocalypse When? No Certainty of a Milky Way -- Andromeda Collision, arXiv (2024). DOI: 10.48550/arxiv.2408.00064

    Part 2

    **

  • Dr. Krishna Kumari Challa

    Scientists achieve more than 98% efficiency in removing nanoplastics from water

     Scientists are battling against an emerging enemy of human health: nanoplastics. Much smaller in size than the diameter of an average human hair, nanoplastics are invisible to the naked eye.

    Linked to cardiovascular and respiratory diseases in people, nanoplastics continue to build up, largely unnoticed, in the world's bodies of water. The challenge remains to develop a cost-effective solution to get rid of nanoplastics while leaving clean water behind.

     Recently, researchers created a new liquid-based solution that eliminates more than 98% of these microscopic plastic particles from water.

    The innovative method—using water-repelling solvents made from natural ingredients—not only offers a practical solution to the pressing issue of nanoplastic pollution but also paves the way for further research and development in advanced water purification technologies.

    The strategy uses a small amount of designer solvent to absorb plastic particles from a large volume of water.

    Initially, the solvent sits on the water's surface the way oil floats on water. Once mixed with water and allowed to separate again, the solvent floats back to the surface, carrying the nanoplastics within its molecular structure.

    In the lab, the researchers simply use a pipette to remove the nanoplastic-laden solvent, leaving behind clean, plastic-free water. The researchers  say future studies will work to scale up the entire process so that it can be applied to larger bodies of water like lakes and, eventually, oceans.

    The solvents used are made from safe, non-toxic components, and their ability to repel water prevents additional contamination of water sources, making them a highly sustainable solution.

    Piyuni Ishtaweera et al, Nanoplastics Extraction from Water by Hydrophobic Deep Eutectic Solvents, ACS Applied Engineering Materials (2024). DOI: 10.1021/acsaenm.4c00159

  • Dr. Krishna Kumari Challa

    Protein in mosquito saliva shown to inhibit host immune response

    Mosquito saliva is known to play a significant role in the transmission of viruses such as yellow fever, Zika, dengue, and chikungunya, yet many of its functions have yet to be understood. In a new study, researchers revealed that a mosquito salivary protein binds to an immune molecule in humans, facilitating infection in human skin caused by the transmitted virus.

    The findings are published in Science Immunology.

    Ticks and mosquitoes don't just inject pathogens. Their saliva serves many purposes when it interacts with the human host. 

    For the study, the team probed a curated yeast display library of human proteins with Nest1, a protein in the saliva of the Aedes aegypti mosquito that they had identified as important in previous research.

    The researchers demonstrated that Nest1 interacts with human CD47, an immune receptor found on the surface of many cells in the body. CD47 controls several immune processes, including those that protect certain cells and destroy others.

    This interaction shows that the mosquito is trying to change the biological functions governed by CD47. Nest1 inhibits some of these functions, such as phagocytosis, the migration of immune cells, and the inflammatory response. These alterations help to enhance virus replication in the skin.

    This discovery increases our knowledge of how disease vectors—like mosquitoes—and hosts—like humans—interact.

    Alejandro Marin-Lopez et al, The human CD47 checkpoint is targeted by an immunosuppressive Aedes aegypti salivary factor to enhance arboviral skin infectivity, Science Immunology (2024). DOI: 10.1126/sciimmunol.adk9872