Scientists create cells incapable of malignant growth
Cell therapies could help in the treatment of hereditary diseases, myocardial infarction and hundreds of other diseases. For many blood diseases, new cells can already be transplanted into human patients, and diabetes has also been treated by transplanting cells obtained through organ donation or, more recently, β-cells derived from the patient's own stem cells.
A risk associated with gene-edited cells is unintentional DNA mutations, including those that predispose patients to cancer. Moreover, differences in tissue type make it impossible to simply transfer cells from one person to another.
Cells that suit anyone, so-called immunologically invisible cells, have been created, but they too are associated with an increased risk of cancer. Over a decade ago, a research group set out to develop cells that avoid these problems. Now, the group has succeeded in producing cells that cannot proliferate unaided and therefore cannot turn into malignant cells.
Almost all of our diseases are fundamentally caused by cellular dysfunction. One medical dream is to fight tissue damage, diseases or even aging with new healthy cells. This new study takes us a step closer to safe and novel cell therapies.
The researchers modified stem cells to divide only when they are supplemented with thymidine, one of the building blocks of DNA. Cells subjected to this safety treatment cannot replicate their genome without this supplementary component vital for DNA synthesis, which precludes their proliferation. When the cells are differentiated for their various tasks, they cease to divide and no longer require the supplement. The innovation has been protected by the university's Helsinki Innovation Services (HIS).
Initially, the researchers investigated whether cell growth can be regulated with an externally administered substance. Once successful, they examined whether the cells functioned normally.
They used stem cells to create insulin-producing β-cells, which they then transplanted into laboratory animals. The cells regulated the animals' blood glucose levels throughout the almost six-month experiment. The cells were also able to differentiate into other tissue types as usual, and the researchers observed no differences in them other than their inability to proliferate without the supplement.

Stem cells are very primitive cells: they have to be able to divide in abundance and develop in many different directions. They have potential for a range of purposes, but their primitive nature also poses a problem: What if some cells do not differentiate, but continue to grow in a primitive form?

According to the scientists, the group's solution enables the efficient proliferation of cells during production, which can then be halted at the desired time, such as following transplantation. The solution also makes it possible to edit cells without fear of adverse effects from the editing itself. For example, cells can be made into something that the recipient's immune system does not recognize. Previously, such cells would have been highly risky, as the immune system also monitors the onset of cancer. Now, that risk is very small or non-existent. Ideally, these cells could be turned into products suited to everyone and, when necessary, quickly deployed.
Rocio Sartori-Maldonado et al, Thymidylate synthase disruption to limit cell proliferation in cell therapies, Molecular Therapy (2024). DOI: 10.1016/j.ymthe.2024.06.014
Scientists successfully create a time crystal made of giant atoms
A crystal is an arrangement of atoms that repeats itself in space at regular intervals: at every point, the crystal looks exactly the same. In 2012, Nobel Prize winner Frank Wilczek raised the question: Could there also be a time crystal—an object that repeats itself not in space but in time? And could a periodic rhythm emerge even though no specific rhythm is imposed on the system and the interaction between the particles is completely independent of time?
For years, Frank Wilczek's idea has caused much controversy. Some considered time crystals to be impossible in principle, while others tried to find loopholes and realize time crystals under certain special conditions.
Now, a particularly spectacular kind of time crystal has been created at Tsinghua University in China, with support from TU Wien in Austria. The team used laser light and special atoms, namely Rydberg atoms, whose diameter is several hundred times larger than normal. The results have been published in the journal Nature Physics.
The ticking of a clock is also an example of a temporally periodic movement. However, it does not happen by itself: Someone must have wound the clock and started it at a certain time. This starting time then determined the timing of the ticks. It is different with a time crystal:
According to Wilczek's idea, a periodicity should arise spontaneously, although there is actually no physical difference between different points in time.
The tick frequency is predetermined by the physical properties of the system, but the times at which the ticks occur are completely random; this is known as spontaneous symmetry breaking.
How this new work was done:
Laser light was shone into a glass container filled with a gas of rubidium atoms. The strength of the light signal that arrived at the other end of the container was measured.
This is actually a static experiment in which no specific rhythm is imposed on the system.
The interactions between light and atoms are always the same, and the laser beam has a constant intensity. But surprisingly, it turned out that the intensity arriving at the other end of the glass cell begins to oscillate in highly regular patterns.
The key to the experiment was to prepare the atoms in a special way: The electrons of an atom can orbit the nucleus on different paths, depending on how much energy they have. If energy is added to the outermost electron of an atom, its distance from the atomic nucleus can become very large.
In extreme cases, it can be several hundred times further away from the nucleus than usual. In this way, atoms with a giant electron shell are created—so-called Rydberg atoms.
If the atoms in their glass container are prepared in such Rydberg states and their diameter becomes huge, then the forces between these atoms also become very large.
And that in turn changes the way they interact with the laser. If you choose laser light in such a way that it can excite two different Rydberg states in each atom at the same time, then a feedback loop is generated that causes spontaneous oscillations between the two atomic states. This in turn also leads to oscillating light absorption. All by themselves, the giant atoms stumble into a regular beat, and this beat is translated into the rhythm of the light intensity that arrives at the end of the glass container.
The researchers have thus created a new system that provides a powerful platform for deepening our understanding of the time-crystal phenomenon, in a way that comes very close to Frank Wilczek's original idea. Precise, self-sustained oscillations could also be used for sensors.
New study sheds light on brain responses to emotionally charged scenes
The ability to recognize and respond to emotionally charged situations is essential to a species' evolutionary success. A new study published in Nature Communications advances our understanding of how the brain responds to emotionally charged objects and scenes.
This new research reveals that the occipital temporal cortex is tuned not only to different categories of stimuli but also breaks down these categories based on their emotional characteristics, in a way that is well suited to guide selection between alternative behaviours.
The researchers analyzed the brain activity of a small group of volunteers viewing over 1,500 images depicting natural emotional scenes such as a couple hugging, an injured person in a hospital bed, a luxurious home, and an aggressive dog. Participants were asked to categorize the images as positive, negative or neutral and to also rate the emotional intensity of the images.
A second group of participants picked the behavioural responses that best matched each scene.
Using cutting-edge modeling of brain activity divided into tiny cubes (under 3 mm³), the study discovered that the occipital temporal cortex (OTC), a region at the back of the brain, is tuned to represent both the type of stimulus (single human, couple, crowd, reptile, mammal, food, object, building, landscape, etc.) and the emotional characteristics of the stimulus—whether it's negative, positive or neutral and also whether it's high or low in emotional intensity.
Machine learning showed that these stable tuning patterns were more efficient in predicting the behaviors matched to the images by the second group of participants than could be achieved by applying machine learning directly to image features—suggesting that the OTC efficiently extracts and represents the information needed to guide behaviour.
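As a rough illustration of this kind of comparison, the sketch below decodes hypothetical behaviour labels from synthetic "voxel responses" and from synthetic "image features" using off-the-shelf machine learning. All data, dimensions and the choice of classifier are assumptions for demonstration only; this is not the study's actual pipeline.

```python
# Illustrative sketch only: synthetic stand-ins for OTC voxel responses,
# raw image features, and behaviour labels; not the published analysis.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_images, n_voxels, n_img_feats, n_behaviours = 1500, 500, 300, 5

behaviour = rng.integers(0, n_behaviours, n_images)      # label chosen by the 2nd group
voxels = rng.normal(size=(n_images, n_voxels))           # "OTC voxel responses"
voxels[np.arange(n_images), behaviour] += 1.0            # embed a weak behaviour signal
image_feats = rng.normal(size=(n_images, n_img_feats))   # "raw image features" (no signal here)

clf = LogisticRegression(max_iter=1000)
acc_brain = cross_val_score(clf, voxels, behaviour, cv=5).mean()
acc_image = cross_val_score(clf, image_feats, behaviour, cv=5).mean()
print(f"decode behaviour from brain responses: {acc_brain:.2f}")
print(f"decode behaviour from image features:  {acc_image:.2f}")
```

In this toy setup the brain-based decoder outperforms the image-based one only because a behaviour-related signal was deliberately embedded in the synthetic voxel data; the published finding is the empirical analogue of that comparison.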
These findings expand our knowledge of how the human brain represents emotional natural stimuli.
Occipital-temporal cortical tuning to semantic and affective features of natural images predicts associated behavioral responses, Nature Communications (2024). DOI: 10.1038/s41467-024-49073-8
First local extinction due to sea level rise identified in the US
The United States has lost its only stand of the massive Key Largo tree cactus in what researchers think is the first local extinction of a species caused by sea level rise in the country.
The Key Largo tree cactus (Pilosocereus millspaughii) still grows on a few scattered islands in the Caribbean, including northern Cuba and parts of the Bahamas. In the United States, it was restricted to a single population in the Florida Keys, first discovered in 1992 and monitored intermittently since.
Salt water intrusion from rising seas, soil depletion from hurricanes and high tides, and herbivory by mammals had put significant pressure on the population. By 2021, what had been a thriving stand of about 150 stems was reduced to six ailing fragments, which researchers salvaged for off-site cultivation to ensure their survival.
"Unfortunately, the Key Largo tree cactus may be a bellwether for how other low-lying coastal plants will respond to climate change," say scientists.
But don't worry: the researchers are studying and trying to rescue the salvaged remnants of this dwindling stock of cactus.
First U.S. vascular plant extirpation linked to sea level rise? Pilosocereus millspaughii (Cactaceae) in the Florida Keys, U.S.A., Journal of the Botanical Research Institute of Texas (2024). DOI: 10.17348/jbrit.v18.i1.1350
'Unhealthy' gut microbiome patterns linked to heightened risk of death after organ transplant
'Unhealthy' gut microbiome patterns are linked to a heightened risk of death after a solid organ transplant, finds research published online in the journal Gut.
While these particular microbial patterns are associated with deaths from any cause, they are specifically associated with deaths from cancer and infection, regardless of the organ—kidney, liver, heart, or lung—transplanted, the findings show.
The make-up of the gut microbiome is associated with various diseases, including inflammatory bowel disease and diabetes. But few studies have had the data to analyze the association between the gut microbiome and long term survival, explain the researchers.
And while a shift away from a normal pattern of microbes to an 'unhealthy' pattern, known as gut dysbiosis, has been linked to a heightened risk of death generally, it's not clear whether this might also be associated with overall survival in specific diseases, they add.
To find out, they looked at the relationship between gut dysbiosis and death from all and specific causes in solid organ transplant recipients, among whom the prevalence of gut dysbiosis is much higher than in the general population. This makes them an ideal group for studying the associations between gut dysbiosis and long-term survival, say the researchers.
They analyzed the microbiome profiles from 1,337 fecal samples provided by 766 kidney, 334 liver, 170 lung, and 67 heart transplant recipients and compared those with the gut microbiome profiles of 8,208 people living in the same geographical area of northern Netherlands.
The average age of the transplant recipients was 57, and over half were men (784; 59%). On average, they had received their transplant 7.5 years previously.
During a follow-up period of up to 6.5 years, 162 recipients died: 88 kidney, 33 liver, 35 lung and six heart recipients. Forty-eight (28%) died from an infection, 38 (23%) from cardiovascular disease, 38 (23%) from cancer, and 40 (25%) from other causes.
The researchers looked at several indicators of gut dysbiosis in these samples: microbial diversity; how much their gut microbiomes differed from the average microbiome of the general population; the prevalence of antibiotic resistance genes; and virulence factors which help bacteria to invade cells and evade immune defenses. The analysis revealed that the more the gut microbiome patterns of the transplant recipients diverged from those of the general population, the more likely they were to die sooner after their procedure, irrespective of the organ transplanted.
Similar associations emerged for the abundance of antibiotic resistance genes and virulence factors.
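As an aside, the sketch below shows one generic way such an analysis can be set up: score each recipient's divergence from a population-average microbiome (here with a Bray–Curtis dissimilarity) and relate that score to survival with a Cox model. The data are synthetic, the metric and the lifelines/scipy tooling are assumptions, and the study's actual dysbiosis indicators and models are not reproduced.

```python
# Illustrative sketch with synthetic data; assumes the scipy and lifelines packages.
import numpy as np
import pandas as pd
from scipy.spatial.distance import braycurtis
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n_recipients, n_taxa = 200, 50

population_mean = rng.dirichlet(np.ones(n_taxa))               # average "healthy" profile
recipient_profiles = rng.dirichlet(np.ones(n_taxa), size=n_recipients)

# Per-recipient divergence from the population-average microbiome
divergence = np.array([braycurtis(p, population_mean) for p in recipient_profiles])

# Synthetic follow-up: higher divergence -> shorter time to event
time = rng.exponential(scale=10 / (1 + 3 * divergence))
event = (rng.random(n_recipients) < 0.4).astype(int)           # deaths vs. censoring

df = pd.DataFrame({"divergence": divergence, "time": time, "event": event})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])                 # hazard ratio per unit divergence
```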
The researchers identified 23 bacterial species among all the transplant recipients that were associated with either a heightened or lower risk of death from all causes.
The researchers further analyzed all bacterial species simultaneously using AI. This revealed a second pattern of 19 different species that were also associated with an increased risk of death.
This is an observational study, and as such, no definitive conclusions can be drawn about the causal roles of particular bacteria.
But, conclude the researchers, "Our results support emerging evidence showing that gut dysbiosis is associated with long-term survival, indicating that gut microbiome targeting therapies might improve patient outcomes, although causal links should be identified first."
Casper Swarte et al, Multiple indicators of gut dysbiosis predict all-cause and cause-specific mortality in solid organ transplant recipients, Gut (2024). DOI: 10.1136/gutjnl-2023-331441
Hepatitis C leaves 'scars' in immune cells even after successful treatment
Chronic hepatitis C, caused by the hepatitis C virus, can lead to severe complications such as liver cirrhosis and liver cancer. The advent of highly effective direct-acting antivirals (DAAs) has resulted in high cure rates for this chronic viral infection. However, it has been reported that the immune system of patients does not fully recover even after being cured.
This work provided new insights into the lasting effects of chronic hepatitis C virus (HCV) infection on the immune system, even after the disease has been successfully treated.
The research team has discovered that traces of "epigenetic scars" remain in regulatory T cells, which exhibit sustained inflammatory properties long after the virus is cleared from the body. The paper is published in the Journal of Hepatology.
So-Young Kim et al, Epigenetic scars in regulatory T cells are retained after successful treatment of chronic hepatitis C with direct-acting antivirals, Journal of Hepatology (2024). DOI: 10.1016/j.jhep.2024.06.011
How lasers and 2D materials could solve the world's plastic problem
A global research team has developed a way to blast the molecules in plastics and other materials with a laser to break them down into their smallest parts for future reuse.
The discovery, which involves laying these materials on top of two-dimensional materials called transition metal dichalcogenides and then lighting them up, has the potential to improve how we dispose of plastics that are nearly impossible to break down with today's technologies.
By harnessing these unique reactions, we can explore new pathways for transforming environmental pollutants into valuable, reusable chemicals, contributing to the development of a more sustainable and circular economy.
This discovery has significant implications for addressing environmental challenges and advancing the field of green chemistry.
Jingang Li et al, Light-driven C–H activation mediated by 2D transition metal dichalcogenides, Nature Communications (2024). DOI: 10.1038/s41467-024-49783-z
No GPS, no problem: Researchers are making quantum sensing tools more compact and accurate to replace GPS
Fundamental physics—let alone quantum physics—might sound complicated to many, but it can actually be applied to solve everyday problems.
Imagine navigating to an unfamiliar place. Most people would suggest using GPS, but what if you were stuck in an underground tunnel where radio signals from satellites were not able to penetrate? That's where quantum sensing tools come in.
Researchers are working on making sensing instruments like atomic accelerometers smaller and more accurate so they can be used to navigate when GPS is down.
Atoms are excellent at making accurate measurements because they are all the same. Atomic measurements made in one laboratory would be indistinguishable from those made in another laboratory, as the atoms behave in precisely the same way.
One example of how this physics concept can be applied is making a highly accurate navigation system with these atoms.
As atoms have mass, they can be used to measure accelerations, helping us build atom-based sensors like atomic accelerometers.
The accelerometers let you know how fast and far you're moving in a given direction. They can be coupled with gyroscopes, which tell you whether you've changed directions and how far you've turned, to make a complete measurement. These navigation instruments are useful when you don't have access to GPS.
One of the challenges the researchers face is how to engineer these systems thoughtfully.
For example, they have to think very carefully about how to miniaturize atomic accelerometers. These accelerometers have historically operated in big laboratory-scale systems, where equipment is heavy and consumes a lot of power. To make the accelerometers suitable for public use, researchers are investigating how to retain their high precision in a much more compact, power-efficient and attractive package.
Not only do quantum sensing devices work in areas that don't have access to GPS, they can also be part of an exciting new avenue: national security applications.
Modern conflicts are becoming increasingly electronic and less kinetic as nations vie for information superiority. The radio signal from GPS satellites is easy to disrupt and jam because it comes from far away, so in any modern conflict both sides will attempt to deny each other access to these signals. More traditional navigation instruments, such as inertial systems, are un-jammable: they work by adding up accelerations and rotations to measure the change in position, so they can replace GPS in times of conflict. However, all the measurement errors get added up too, which is why researchers want to use an atom-based measurement that is more accurate.

Atomic accelerometers are one example of these inertial systems. Such systems are already present in sensors on aircraft and ships, guiding their movement through airspace and water. However, existing mechanical sensors can wear out due to friction, which means they are swapped out every year at considerable cost, and they are hard to build because they are small and delicate. The quantum approach based on atoms pursued by the present researchers could provide acceleration measurements with no moving parts.
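The toy calculation below illustrates why accelerometer noise matters so much for inertial navigation: integrating noisy acceleration twice makes position errors grow over time, and a lower-noise sensor (as an atom-based one aims to be) drifts far less. The noise levels, duration and the assumption of a stationary vehicle are all hypothetical, not any real sensor's specification.

```python
# Toy dead-reckoning sketch: double-integrating noisy accelerometer readings
# shows how inertial-navigation position error accumulates with time.
import numpy as np

def dead_reckon(accel_noise_std, duration_s=600.0, dt=0.01, seed=0):
    rng = np.random.default_rng(seed)
    n = int(duration_s / dt)
    true_accel = np.zeros(n)                      # vehicle actually at rest
    measured = true_accel + rng.normal(0.0, accel_noise_std, n)
    velocity = np.cumsum(measured) * dt           # first integration
    position = np.cumsum(velocity) * dt           # second integration
    return abs(position[-1])                      # position error at the end

for noise in (1e-2, 1e-4):                        # m/s^2, hypothetical noise levels
    print(f"noise {noise:g} m/s^2 -> drift ~ {dead_reckon(noise):.2f} m after 10 min")
```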
If a submarine wants to be stealthy and quiet in a defense scenario, for example, keeping track of what it is doing and how it has moved through inertial systems is pretty much the only game in town. Achieving this fine balance between simplicity and accuracy is the researchers' main goal, and they hope that their efforts will translate into real-world prototypes someday. Source: USC
Familial endocrine diseases linked to increased risk of pregnancy loss, new research shows
Women who have close family members with endocrine diseases--including type 2 diabetes, thyroid diseases and polycystic ovary syndrome (PCOS)--are at higher risk of pregnancy loss, a new study has found.
The research, presented recently at the ESHRE 40th Annual Meeting in Amsterdam, examined the association between various endocrine diseases and the incidence of pregnancy loss.
The study investigated 366,539 women in Denmark between 1973 and 2022. The study found that women with parents diagnosed with endocrine diseases faced a 6% higher risk of pregnancy loss compared to those without a family history of endocrine diseases.
Similarly, if a woman's sister had an endocrine disease, her risk of experiencing pregnancy loss increased by 7%. These patterns persisted even when individual cases of the diseases were considered.
The results highlight having a family history of endocrine disease as an important yet previously underexplored factor in assessing the risk of pregnancy loss.
Egerup, P., et al, Familial endocrine disease increases the risk of pregnancy loss and recurrent pregnancy loss: a nationwide register-based study of 366,548 Danish women, Human Reproduction (2024). academic.oup.com/humrep/issue/39/Supplement_1
Auroras caused by head-on blows to Earth's magnetic field could damage critical infrastructure, scientists say
Auroras are beautiful colourful lights that catch our imagination. Poets write poems, artists, including me, create art works based on them, writers weave stories around them.
They have inspired myths and portents for millennia—but only now, with modern technology dependent on electricity, are we appreciating their true power.
The same forces which cause auroras also cause currents that can damage infrastructure which conducts electricity, like pipelines.
Now scientists writing in Frontiers in Astronomy and Space Sciences have demonstrated that the impact angle of interplanetary shocks is key to the currents' strength, offering an opportunity to forecast dangerous shocks and shield critical infrastructure.
Auroras and geomagnetically induced currents are caused by similar space weather drivers.
The aurora is a visual warning that indicates that electric currents in space can generate these geomagnetically-induced currents on the ground.
The auroral region can greatly expand during severe geomagnetic storms. Usually, its southernmost boundary is around latitudes of 70 degrees, but during extreme events it can go down to 40 degrees or even further, which certainly occurred during the May 2024 storm—the most severe storm in the past two decades.
Auroras are caused by two processes: either particles ejected from the sun reach Earth's magnetic field and cause a geomagnetic storm, or interplanetary shocks compress Earth's magnetic field.
These shocks also generate geomagnetically induced currents, which can damage infrastructure that conducts electricity. More powerful interplanetary shocks mean more powerful currents and auroras—but frequent, less powerful shocks could also do damage.
The most intense deleterious effects on power infrastructure occurred in March 1989 following a severe geomagnetic storm—the Hydro-Quebec system in Canada was shut down for nearly nine hours, leaving millions of people with no electricity.
But weaker, more frequent events such as interplanetary shocks can pose threats to ground conductors over time. Recent research work shows that considerable geoelectric currents occur quite frequently after shocks, and they deserve attention.
Shocks which hit the Earth head-on, rather than at an angle, are thought to induce stronger geomagnetically induced currents, because they compress the magnetic field more. The scientists investigated how geomagnetically induced currents are affected by shocks at different angles and times of day.
The scientists found that more frontal shocks cause higher peaks in geomagnetically induced currents, both immediately after the shock and during the following substorm. Particularly intense peaks took place around magnetic midnight, when the north pole would have been between the sun and Mäntsälä, the Finnish natural gas pipeline site where the currents were recorded. Localized substorms at this time also cause striking auroral brightening.
Moderate currents occur shortly after the perturbation impact when Mäntsälä is around dusk local time, whereas more intense currents occur around midnight local time.
Because the angles of these shocks can be predicted up to two hours before impact, this information could allow us to set in place protections for electricity grids and other vulnerable infrastructure before the strongest and most head-on shocks strike.
One thing power infrastructure operators could do to safeguard their equipment is to manage a few specific electric circuits when a shock alert is issued. This would prevent geomagnetically induced currents from reducing the lifetime of the equipment.
However, the scientists didn't find strong correlations between the angle of a shock and the time it takes for it to hit and then induce a current. This may be because more recordings of currents at different latitudes are needed to investigate this aspect. Data on the currents were collected at only one location, namely the Mäntsälä natural gas pipeline system.
Although Mäntsälä is at a critical location, it does not provide a worldwide picture.
First direct observations of interplanetary shock impact angle effects on actual geomagnetically induced currents: The case of the Finnish natural gas pipeline system, Frontiers in Astronomy and Space Sciences (2024). DOI: 10.3389/fspas.2024.1392697
Researchers discover a new defense mechanism in bacteria
When confronted with an antibiotic, toxic substance, or other source of considerable stress, bacteria are able to activate a defense mechanism using cell-to-cell communication to "warn" unaffected bacteria, which can then anticipate, shield themselves and spread the warning signal.
This mechanism has just been described for the first time by a team of scientists.
It paves the way for the development of new, more effective antibiotic treatments that can target this bacterial communication system. The work appears in Nature Communications.
When they perceive a source of stress, bacteria spring into action, inducing changes in the expression of certain genes and their physiological properties to make them less vulnerable to the detected lethal substance. They also produce small alarmone proteins on their surface in order to contact and activate random neighboring bacteria.
Unstressed bacteria can only change state in the presence of a sufficient amount of alarmones. Thus, only a source of stress perceived by sufficient bacteria can trigger propagation of this activation.
The mechanism offers several advantages: It limits the unnecessary use of energy and enables a rapid and coordinated response in the population. Because activation is gradual, it creates diversity in the population over time, thus increasing the bacteria's chances of survival.
These findings were established using a dozen different families of antibiotics on populations of Streptococcus pneumoniae, the bacterium that causes pneumococcal infections.
Pneumococcal competence is a populational health sensor driving multilevel heterogeneity in response to antibiotics, Nature Communications (2024). DOI: 10.1038/s41467-024-49853-2
A new paper on the many ways wildfires affect people and the planet makes clear that as fires become more intense and frequent, the urgency for effective and proactive fire science grows. By addressing these challenges, the fire research community aims to better protect our planet and its inhabitants.
The paper appears in the Zenodo research repository.
Fire is a natural part of life on Earth, sustaining healthy and balanced ecosystems worldwide. But human activity and a changing climate are rapidly shifting both the frequency and severity of wildfire events, creating new risks to human and environmental health.
Recently, a group of scientists from 14 countries and across several disciplines—physical and social sciences, mathematics, statistics, remote sensing, fire communication and art, operational fire science, and fire management—gathered to discuss rapid changes in fire regimes and identify pathways to address these challenges.
The experts identified three grand challenges for fire science in the coming decades: understanding the role of fire in the carbon cycle, fire and extreme events, and the role of humans in fire.
If we want to improve the assessment of future fire impacts on people and the planet, we need to start with a better understanding of how climate, land cover changes, and human land management practices drive fire distribution and severity in the coming decades, the scientists say.
To address the grand challenges, the scientists identified three pressing research priorities: understanding the net carbon balance of fire, developing rapid response tools for wildfire events, and understanding fire's impact on society, especially marginalized and underrepresented populations.
A main goal of the white paper is to improve fire modeling, predictability, and mitigation on both regional and global scales.
As fire events become more intense and frequent, the urgency for effective and proactive fire science grows. Scientists are taking steps to address these challenges collectively, as a unified fire research community, to better protect our planet and its inhabitants.
Douglas S Hamilton et al, Igniting Progress: Outcomes from the FLARE workshop and three challenges for the future of transdisciplinary fire science, Zenodo (2024). DOI: 10.5281/zenodo.12634067
Subsurface of fingernails found to have precise tactile localization
Researchers found that humans have a surprisingly precise degree of tactile localization beneath their fingernails. The study, published in Proceedings of the Royal Society B, tested how well volunteers could pinpoint the part of their fingernail being stimulated and proposes possible explanations.
Humans, like other primates, have nails on the ends of their fingers rather than claws—an evolutionary development that has not been explained. In this new effort, researchers investigated the sensitivity of the skin below the fingernails to learn more about how they were used by our ancestors.
The human fingernail does not have any nerves; thus, it cannot sense touch, pressure, heat, cold or other environmental characteristics. But there is skin beneath the fingernail that is capable of sensations, as evidenced by people who accidentally hit their thumb with a hammer or lose a nail.
To learn more about the sensitivity of the subsurface of the fingernail, the researchers recruited 38 adult volunteers. Each agreed to have their fingernails poked while they indicated on a photograph of a fingernail where they thought their fingernail was being touched. In the experiments, half of the volunteers had their nails touched by a stick, the other half by a filament. Only the thumb and middle finger were tested.
The study found that humans have highly precise localization in their nails: they can tell clearly which part of the nail is being touched. The author suggests that this is due to mechanoreceptors called Pacinian corpuscles buried in the skin beneath the nails, and notes that the same mechanism allows blind people to localize touch using a cane. Pacinian corpuscles can detect small amounts of vibration, which occur when a foreign object makes even slight contact with a fingernail.
Why did humans develop fingernails instead of claws, and why is the skin beneath the nails so sensitive? The author theorizes that nails likely served a sensorimotor function, giving humans more information about whatever their hands encounter.
Matthew R. Longo, Precise tactile localization on the human fingernail, Proceedings of the Royal Society B: Biological Sciences (2024). DOI: 10.1098/rspb.2024.1200
The geometry of life: Physicists determine what controls biofilm growth
This is physics helping biology.
From plaque sticking to teeth to scum on a pond, biofilms can be found nearly everywhere. These colonies of bacteria grow on implanted medical devices, our skin, contact lenses, and in our guts and lungs. They can be found in sewers and drainage systems, on the surface of plants, and even in the ocean.
Some research says that 80% of infections in human bodies can be attributed to the bacteria growing in biofilms.
The paper, "The biophysical basis of bacterial colony growth," was published in Nature Physics this week, and it shows that the fitness of a biofilm- its ability to grow, expand, and absorb nutrients from the medium or the substrate—is largely impacted by the contact angle that the biofilm's edge makes with the substrate. The study also found that this geometry has a bigger influence on fitness than anything else, including the rate at which the cells can reproduce.
Understanding how biofilms grow—and what factors contribute to their growth rate—could lead to critical insights on controlling them, with applications for human health, like slowing the spread of infection or creating cleaner surfaces.
Aawaz R. Pokhrel et al, The biophysical basis of bacterial colony growth, Nature Physics (2024). DOI: 10.1038/s41567-024-02572-3
New research published in Arthritis & Rheumatology indicates that chronic exposure to air pollutants may increase the risk of developing lupus, an autoimmune disease that affects multiple organs.
For the study, investigators analyzed data on 459,815 participants from the UK Biobank. A total of 399 lupus cases were identified during a median follow-up of 11.77 years. Air pollutant exposure was linked with a greater likelihood of developing lupus. Individuals with a high genetic risk and high air pollution exposure had the highest risk of developing lupus compared with those with low genetic risk and low air pollution exposure.
This study provides crucial insights into how air pollution contributes to autoimmune diseases. The findings can inform the development of stricter air quality regulations to mitigate exposure to harmful pollutants, thereby reducing the risk of lupus.
Air pollution, genetic susceptibility and risk of incident Systemic lupus erythematosus: A prospective cohort study, Arthritis & Rheumatology (2024). DOI: 10.1002/art.42929
Some environmental toxicants linked to depressive symptoms
Certain categories of environmental toxicants are associated with depressive symptoms, according to a study published online July 3 in JAMA Network Open.
Researchers screened and assessed the associations between potential environmental toxicants and depressive symptoms among 3,427 participants from the 2013 to 2014 and 2015 to 2016 waves of the National Health and Nutrition Examination Survey. Exposures were assessed for 62 toxicants in 10 categories, and the association with depression scores, measured by the 9-item Patient Health Questionnaire (PHQ-9), was examined.
The researchers identified associations between 27 chemical compounds or metals in six of the 10 categories of environmental toxicants and the prevalence of depressive symptoms, including the volatile organic compound metabolites N-acetyl-S-(2-hydroxy-3-butenyl)-L-cysteine and total nicotine equivalent-2 (odds ratios, 1.74 and 1.42, respectively).
Compared with women and older individuals, men and younger individuals seemed more vulnerable to environmental toxicants. Overall, 5–19 percent of the associations were mediated by peripheral white blood cell count.
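For readers unfamiliar with how odds ratios like these are obtained, the sketch below fits a logistic regression of a binary depression indicator on a standardized exposure, using entirely synthetic data and an assumed "true" odds ratio of 1.7. It is an illustration of the general approach, not the NHANES analysis itself, and assumes the statsmodels package.

```python
# Illustrative sketch with synthetic data: estimating an odds ratio per standard
# deviation of exposure with logistic regression (not the published model).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 3427
exposure = rng.normal(size=n)                       # standardized toxicant level (synthetic)
logit_p = -2.0 + np.log(1.7) * exposure             # assume a true OR of 1.7 per SD
depressed = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(exposure)
fit = sm.Logit(depressed, X).fit(disp=0)
print("estimated OR per SD of exposure:", np.exp(fit.params[1]).round(2))
```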
"This research highlights the significance of preventing and regulating important environmental toxicants to gain fresh insights into preventing and potentially treating depression," the authors write in their paper.
Respiratory bacteria 'turn off' immune system to survive, study finds
Researchers have identified how a common bacterium is able to manipulate the human immune system during respiratory infections and cause persistent illness. The research was published in PLOS Pathogens.
The study uncovered virulence mechanisms of Haemophilus influenzae, a bacterium that plays a significant role in worsening respiratory tract infections.
These bacteria are especially damaging to vulnerable groups, such as people with cystic fibrosis or asthma, the elderly, and Indigenous communities.
In some conditions, such as asthma and chronic obstructive pulmonary disease, they can drastically worsen symptoms.
This research shows the bacterium persists by essentially turning off the body's immune responses, inducing a state of tolerance in human respiratory tissues.
The researchers prepared human nasal tissue in the lab, growing it to resemble the surfaces of the human respiratory tract, then monitored gene expression changes over a 14-day 'infection.' They found very limited production of inflammation molecules over time, which normally would be produced within hours of bacteria infecting human cells.
The researchers then applied both live and dead Haemophilus influenzae, showing that the dead bacteria caused a fast production of the inflammation molecules, while live bacteria prevented this. This proved that the bacteria can actively reduce the human immune response. If local immunity drops, for example during a viral infection, the bacteria may be able to 'take over' and cause a more severe infection.
Densely packed E. coli form an immobile material similar to colloidal glass, research shows
Dense E. coli bacteria have several similar qualities to colloidal glass, according to new research at the University of Tokyo. Colloids are substances made up of small particles suspended within a fluid, like ink for example. When these particles become higher in density and more packed together, they form a "glassy state."
When researchers multiplied E. coli bacteria within a confined area, they found that they exhibited similar characteristics. More surprisingly, they also showed some other unique properties not typically found in glass-state materials.
This study, which is published in PNAS Nexus, contributes to the understanding of glassy "active matter," a relatively new field of materials research which crosses physics and life science.
In the long term, the researchers hope that these results will contribute to developing materials with new functional capabilities, as well as aiding our understanding of biofilms (where microorganisms stick together to form layers on surfaces) and natural bacterial colonies.
The researchers have now found that the bacterium E. coli can behave in a similar way to such glassy colloids.
Since bacteria are very different from what we know of as glass, it was surprising that many of the statistical properties of glassy materials were the same for bacteria.
In this experiment, as the number of E. coli increased, they became caged in by their neighbors, restricting their ability to swim freely. Over time, they transitioned to a glassy state. This transition is similar to glass formation, as the researchers noted a rapid slowdown of movement, the caged-in effect and dynamic heterogeneity (whereby molecules travel longer distances in some areas but hardly move in others).
What made this bacterial glass different to other glasslike substances was the spontaneous formation of "microdomains" and the collective motion of the bacteria within these areas. These occurred where groups of the rod-shaped E. coli became aligned the same way.
The researchers were also surprised that the way the bacteria vitrify (turn into a glasslike state) apparently violates a physical law of typical thermal systems. What we characteristically know as glass, including colloidal glass, is classed as thermal glass. However, recently researchers have started to explore glassy states, like the one reported in this paper, which aren't considered thermal glass but share many of the same properties.
"Collections of 'self-propelled particles' like we see here have recently been regarded as a new kind of material called active matter, which is currently a hot topic and shows great potential.
How water controls the speed of muscle contraction
The flow of water within a muscle fiber may dictate how quickly muscle can contract, according to a new study.
Nearly all animals use muscle to move, and it has long been known that muscle, like all other cells, is composed of about 70% water. But researchers do not know what sets the range and upper limits of muscle performance. Previous research into how muscle works focused on the molecular level rather than on how muscle fibers are shaped: they are three-dimensional and full of fluid.
Researchers have now created a theoretical model of water's role in muscle contraction and found that how fluid moves through a muscle fiber determines how quickly the fiber can contract.
They also found that muscle exhibits a new kind of elasticity, called odd elasticity, that allows it to generate power using three-dimensional deformations; this is reflected in the common observation that when a muscle fiber contracts lengthwise, it also bulges perpendicularly. The results are published in the journal Nature Physics.
These results suggest that even such basic questions as how quickly muscle can contract or how many ways muscle can generate power have new and unexpected answers when one takes a more integrated and holistic view of muscle as a complex and hierarchically organized material rather than just a bag of molecules.
Muscle fibers are composed of many components, such as various proteins, cell nuclei, organelles such as mitochondria, and molecular motors such as myosin that convert chemical fuel into motion and drive muscle contraction. All of these components form a porous network that is bathed in water, so an appropriate, coarse-grained description of muscle is that of an active sponge, say the researchers. But squeezing water through this sponge takes time, so the researchers suspected that the movement of water through the muscle fiber sets an upper limit on how rapidly a muscle fiber can twitch.
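To make the "sponge" intuition concrete, the back-of-envelope sketch below uses the generic poroelastic timescale for fluid flowing through a soft porous medium, tau ~ mu L^2 / (k E). Every number in it (viscosity aside) is a hypothetical, order-of-magnitude placeholder rather than a value from the study; the only point is that the timescale grows with the flow length scale, which is what sets a speed limit on twitching.

```python
# Back-of-envelope poroelastic timescale: tau ~ mu * L^2 / (k * E).
# All parameter values below are illustrative assumptions, not measured data.
mu = 1e-3      # water viscosity, Pa*s
E = 1e4        # effective elastic modulus of the fiber network, Pa (assumed)
k = 1e-15      # permeability of the protein meshwork, m^2 (assumed)

for L in (10e-6, 25e-6, 100e-6):          # fluid-flow length scale, m (assumed)
    tau = mu * L**2 / (k * E)             # poroelastic relaxation time, s
    print(f"L = {L*1e6:5.0f} um -> tau ~ {tau*1e3:7.2f} ms, "
          f"max twitch rate ~ {1/tau:6.1f} Hz")
```

Larger fibers give longer relaxation times and hence lower maximum twitch rates, which is the qualitative behavior the active-hydraulics argument relies on.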
To test their theory, they modeled muscle movements in multiple organisms across mammals, insects, birds, fish and reptiles, focusing on animals that use muscles for very fast motions. They found that muscles that produce sound, such as the rattle in a rattlesnake's tail, which can contract ten to hundreds of times per second, typically don't rely on fluid flows. Instead, these contractions are controlled by the nervous system and are more strongly dictated by molecular properties, such as the time it takes for molecular motors within cells to bind and generate forces.

But in smaller organisms, such as flying insects that beat their wings a few hundred to a thousand times per second, the contractions are too fast for neurons to control directly. In these cases, the researchers found that fluid flows within the muscle fiber are important and that their mechanism of active hydraulics is likely to limit the fastest rates of contraction.

The researchers also found that when muscle fibers act as an active sponge, the process also causes the muscles to act as an active elastic engine. When something is elastic, such as a rubber band, it stores energy as it resists deformation. Imagine holding a rubber band between two fingers and pulling it back.
When you release the rubber band, it releases the energy stored while it was being stretched. In this case, energy is conserved—a basic principle of physics dictating that the amount of energy within a closed system stays the same over time. But muscle converts chemical fuel into mechanical work, so, unlike a passive elastic material, it can produce energy like an engine; because it is driven by chemical fuel, it is not a closed system and its elastic energy need not be conserved. In this case, muscle shows a new property called "odd elasticity," in which its response when squashed in one direction versus another is not reciprocal.
Unlike the rubber band, when muscle contracts and relaxes along its length, it also bulges out perpendicularly, and its energy does not stay the same. This allows muscle fibers to generate power from repetitive deformations, behaving as a soft engine.
These results contrast with prevailing thought, which focuses on molecular details and neglects the fact that muscles are long and filamentous, hydrated, and organized across multiple scales. Altogether, the results suggest that a revised view of how muscle functions is essential to understanding its physiology, and to understanding the origins, extent and limits that underlie the diverse forms of animal movement.
Suraj Shankar et al, Active hydraulics and odd elasticity of muscle fibres, Nature Physics (2024). DOI: 10.1038/s41567-024-02540-x
Malaysia was on the brink of eliminating malaria – then a new parasite swung out of the jungle
A new malaria parasite comes from monkeys. With thousands already infected, experts fear it could one day spread between humans.
Human malaria has been all but eliminated in Malaysia, but another variety is spilling out from the rainforest and infecting people: monkey malaria. Around 25,000 people have been infected with the Plasmodium knowlesi parasite since 2011, which causes nausea, fever and sometimes death. Deforestation has driven a spike in cases, pushing monkeys, mosquitoes and people into closer proximity. The disease isn't limited to Southeast Asia, and as it spreads, so too does the chance that monkey malaria will adapt to spread between humans.
Galaxies avoid an early death because they have a "heart and lungs" which effectively regulate their "breathing" and prevent them from growing out of control, a new study suggests.
If they didn't, the universe would have aged much faster than it has and all we would see today is huge "zombie" galaxies teeming with dead and dying stars.
That's according to a new study published in the Monthly Notices of the Royal Astronomical Society, which investigates one of the great mysteries of the universe—why galaxies are not as large as astronomers would expect.
Something appears to be stifling their enormous potential by limiting the amount of gas they absorb to convert into stars, meaning that instead of endlessly growing, something inside resists what was thought to be the inevitable pull of gravity.
Now, astrophysicists think they may have uncovered the secret. They suggest that galaxies could in fact control the rate at which they grow through how they "breathe."
In their analogy, the researchers compared the supermassive black hole at the center of a galaxy to its heart, and the two bi-polar supersonic jets of gas and radiation they emit to airways feeding a pair of lungs.
Pulses from the black hole—or "heart"—can lead to jet shock fronts oscillating back and forth along both jet axes, much like the thoracic diaphragm in the human body moves up and down inside a chest cavity to inflate and deflate both lungs.
This can result in jet energy being transmitted widely into the surrounding medium, just as we breathe out warm air, resulting in slowing galaxy gas-accretion and growth.
The phenomenon is similar to the terrestrial equivalent of sound and shock waves being produced when opening a bottle of champagne, the screech of a car, rocket exhausts and the puncture of pressurized enclosures.
These supersonic jets might help in inhibiting galaxy growth.
The researchers concluded that a galaxy's lifespan can be extended with the help of its "heart and lungs," where the supermassive black hole engine at its core helps inhibit growth by limiting the amount of gas collapsing into stars from an early stage. This, the researchers say, has helped create the galaxies we see today.
Without such a mechanism, galaxies would have exhausted their fuel by now and fizzled out, as some do in the form of "red and dead" or "zombie" galaxies.
Carl Richards et al, Simulations of Pulsed Over-Pressure Jets: Formation of Bellows and Ripples in Galactic Environments, Monthly Notices of the Royal Astronomical Society (2024). DOI: 10.1093/mnras/stae1498
Scientists identify possible way to block muscle fatigue in long COVID, other diseases
Infections and neurodegenerative diseases cause inflammation in the brain. But for unknown reasons, patients with brain inflammation often develop muscle problems that seem to be independent of the central nervous system. Now, researchers have revealed how brain inflammation releases a specific protein that travels from the brain to the muscles and causes a loss of muscle function.
The study, in fruit flies and mice, also identified ways to block this process, which could have implications for treating or preventing the muscle wasting sometimes associated with inflammatory diseases including bacterial infections, Alzheimer's disease and long COVID.
The study is published July 12 in the journal Science Immunology.
The study suggests that when we get sick, messenger proteins from the brain travel through the bloodstream and reduce energy levels in skeletal muscle. This is more than a lack of motivation to move because we don't feel well. These processes reduce energy levels in skeletal muscle, decreasing the capacity to move and function normally.
To investigate the effects of brain inflammation on muscle function, the researchers modeled three different types of diseases—an E. coli bacterial infection, a SARS-CoV-2 viral infection and Alzheimer's.
When the brain is exposed to inflammatory proteins characteristic of these diseases, damaging chemicals called reactive oxygen species build up. The reactive oxygen species cause brain cells to produce an immune-related molecule called interleukin-6 (IL-6), which travels throughout the body via the bloodstream.
The researchers found that IL-6 in mice—and the corresponding protein in fruit flies—reduced energy production in muscles' mitochondria, the energy factories of cells.
Flies and mice that had COVID-associated proteins in the brain showed reduced motor function—the flies didn't climb as well as they should have, and the mice didn't run as well or as much as control mice.
The researchers saw similar effects on muscle function when the brain was exposed to bacteria-associated proteins and the Alzheimer's protein amyloid beta. They also saw evidence that this effect can become chronic: even if an infection is cleared quickly, the reduced muscle performance persisted for many days in their experiments.
The bacterial brain infection meningitis is known to increase IL-6 levels and can be associated with muscle issues in some patients, for instance. Among COVID-19 patients, inflammatory SARS-CoV-2 proteins have been found in the brain during autopsy, and many long COVID patients report extreme fatigue and muscle weakness even long after the initial infection has cleared.
Patients with Alzheimer's disease also show increased levels of IL-6 in the blood as well as muscle weakness.
The study pinpoints potential targets for preventing or treating muscle weakness related to brain inflammation. The researchers found that IL-6 activates what is called the JAK-STAT pathway in muscle, and this is what causes the reduced energy production of mitochondria. Several therapeutics already approved by the FDA for other diseases can block this pathway. JAK inhibitors as well as several monoclonal antibodies against IL-6 are approved to treat various types of arthritis and manage other inflammatory conditions.
The researchers are not sure why the brain produces a protein signal that is so damaging to muscle function across so many different disease categories. They speculate that, despite the damage it does, this process may have stayed with us over the course of human evolution because it allows the brain to reallocate resources to itself as it fights off disease. More research is needed to better understand this process and its consequences throughout the body.
First fossil chromosomes discovered
Scientists have discovered intact chromosomes preserved in the skin of a woolly mammoth (Mammuthus primigenius) that met its end some 50,000 years ago — a feat previously thought to be impossible. The team also revealed the spatial organization of the mammoth's DNA molecules and the active genes in its skin, including one responsible for giving the animal its fuzzy appearance. The study is the first to report the 3D structure of an ancient genome.
Brain size riddle solved as humans exceed evolutionary trend
The largest animals do not have proportionally bigger brains—with humans bucking this trend—a study published in Nature Ecology & Evolution has revealed.
Researchers collected an enormous dataset of brain and body sizes from around 1,500 species to clarify centuries of controversy surrounding brain size evolution.
Bigger brains relative to body size are linked to intelligence, sociality, and behavioral complexity—with humans having evolved exceptionally large brains. The new research reveals the largest animals do not have proportionally bigger brains, challenging long-held beliefs about brain evolution.
For more than a century, scientists have assumed that this relationship was linear—meaning that brain size gets proportionally bigger, the larger an animal is. We now know this is not true. The relationship between brain and body size is a curve, essentially meaning very large animals have smaller brains than expected.
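One simple way to see what "a curve rather than a line" means is to compare a straight-line fit and a quadratic fit on log-transformed brain and body sizes. The sketch below does this on synthetic numbers with a built-in negative curvature term; the sample size, coefficients and model form are assumptions for illustration, not the study's actual data or method.

```python
# Illustrative sketch (synthetic data): on log-log axes, a straight line means a
# fixed power-law scaling; a negative quadratic term captures "largest animals
# have smaller brains than the line predicts".
import numpy as np

rng = np.random.default_rng(3)
log_body = rng.uniform(0, 8, 1500)                         # ~1,500 species (synthetic)
log_brain = 0.75 * log_body - 0.03 * log_body**2 + rng.normal(0, 0.3, 1500)

linear = np.polyfit(log_body, log_brain, 1)
quadratic = np.polyfit(log_body, log_brain, 2)

resid_lin = log_brain - np.polyval(linear, log_body)
resid_quad = log_brain - np.polyval(quadratic, log_body)
print("linear fit residual variance:   ", resid_lin.var().round(3))
print("quadratic fit residual variance:", resid_quad.var().round(3))
print("fitted quadratic coefficient:   ", quadratic[0].round(3))   # negative => downward curvature
```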
The research reveals a simple association between brain and body size across all mammals which allowed the researchers to identify the rule-breakers—species which challenge the norm.
Among these outliers is our own species, Homo sapiens, whose relative brain size has evolved more than 20 times faster than that of other mammal species, resulting in the massive brains that characterize humanity today. But humans are not the only species to buck this trend.
All groups of mammals demonstrated rapid bursts of change—both towards smaller and larger brain sizes. For example, bats very rapidly reduced their brain size when they first arose, but then showed very slow rates of change in relative brain size, suggesting there may be evolutionary constraints related to the demands of flight.
There are three groups of animals that showed the most pronounced rapid change in brain size: primates, rodents, and carnivores. In these three groups, there is a tendency for relative brain size to increase over time (the "Marsh-Lartet rule"). As the study notes, however, this trend is not universal across all mammals, as was previously thought.
These new results reveal a mystery. In the largest animals, there is something preventing brains from getting too big. Whether this is because big brains beyond a certain size are simply too costly to maintain remains to be seen. But as we also observe similar curvature in birds, the pattern seems to be a general phenomenon—what causes this 'curious ceiling' applies to animals with very different biology.
Chris Venditti et al, Co-evolutionary dynamics of mammalian brain and body size, Nature Ecology & Evolution (2024). DOI: 10.1038/s41559-024-02451-3
New study finds 40% of cancer cases and almost half of all deaths linked to modifiable risk factors
A study led by researchers at the American Cancer Society (ACS) finds four in 10 cancer cases and about one-half of all cancer deaths in adults 30 years old and older could be attributed to modifiable risk factors, including cigarette smoking, excess body weight, alcohol consumption, physical inactivity, diet, and infections.
Cigarette smoking was by far the leading risk factor, contributing to nearly 20% of all cancer cases and 30% of all cancer deaths. The findings are published in the journal CA: A Cancer Journal for Clinicians.
The risk factors included cigarette smoking (current and former smoking); secondhand smoke; excess body weight; alcohol consumption; consumption of red and processed meat; low consumption of fruits and vegetables, dietary fiber, and dietary calcium; physical inactivity; ultraviolet (UV) radiation; and infection with Epstein-Barr virus (EBV), Helicobacter pylori, hepatitis B virus (HBV), hepatitis C virus (HCV), human herpes virus-8 (HHV-8; also called Kaposi sarcoma herpesvirus), human immunodeficiency virus (HIV), and human papillomavirus (HPV).
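For context, statements of the form "X% of cases are attributable to risk factor Y" rest on the idea of a population attributable fraction. The sketch below applies Levin's standard formula to made-up prevalence and relative-risk numbers; the ACS study's actual methods and inputs are more involved and are not reproduced here.

```python
# Illustrative sketch: population attributable fraction (PAF) via Levin's formula,
# using hypothetical numbers only.
def paf(prevalence, relative_risk):
    """Fraction of cases that would not occur if the exposure were removed."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Hypothetical example: 12% of adults exposed, and the exposure triples risk.
print(f"PAF = {paf(0.12, 3.0):.1%}")   # ~19% of cases attributable to this exposure
```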
Introducing co-cultures: When co-habiting animal species share culture
Cooperative hunting, resource sharing, and using the same signals to communicate the same information—these are all examples of cultural sharing that have been observed between distinct animal species. In an opinion piece published June 19 in the journal Trends in Ecology & Evolution, researchers introduce the term "co-culture" to describe cultural sharing between animal species. These relationships are mutual and go beyond one species watching and mimicking another species' behavior—in co-cultures, both species influence each other in substantial ways.
"Co-culture challenges the notion of species-specific culture, underscoring the complexity and interconnectedness of human and animal societies, and between animal societies," write the authors.
These cross-species interactions result in behavioral adaptations and preferences that are not just incidental but represent a form of convergent evolution.
Co-cultures have been observed between humans and nonhuman animals—for example, between humans and honeyguides in Tanzania and Mozambique, where the birds lead humans to honeybee nests. They are also evident between different species of nonhuman animals—for example, cooperative scavenging between ravens and wolves, cooperative hunting between false killer whales and bottlenose dolphins, and signal sharing between distinct species of tamarin. Ultimately, this inter-species sharing of culture could drive evolution, the researchers say.
"Cultural behaviors that enhance survival or reproductive success in a particular setting can lead to changes in population habits that, over time, could drive genetic selection," they write.
To extend our understanding of co-cultures, the researchers say that future studies could start by investigating wild animals in urban environments.
"Urban animals modify their behaviors, learning, and problem-solving skills to cope with urban challenges, reflecting a dynamic response to urban landscapes," they write. "Similarly, humans alter their urban spaces, influencing wildlife behavior and evolution. This reciprocal adaptation between humans and wildlife is fundamental to understanding co-culture."
Future research is also needed to examine the possibility of cultural and genetic co-evolution—the idea that species' cultures and genomes are evolving in concert. A key question, the researchers say, is "In the context of co-culture, how do cultural adaptations influence genetic evolution, and vice versa, across different species and environments?"
Cédric Sueur et al, Co-cultures: exploring interspecies culture among humans and other animals, Trends in Ecology & Evolution (2024). DOI: 10.1016/j.tree.2024.05.011
Part 2
Animals interact both within and between species, sharing common spaces.
Animals exhibit cultural behaviours.
Animals can influence and learn from each other, including interactions with other species.
The co-culture concept suggests that two populations of different species can influence each other's cultures through synchronised activities.
Co-culture emphasises cultural adaptations within ecological niches.
Study Finds Life on Earth Emerged 4.2 Billion Years Ago
Once upon a time, Earth was barren. Everything changed when, somehow, out of the chemistry available early in our planet's history, something started squirming – processing available matter to survive, to breed, to thrive.
What that something was, and when it first squirmed, have been burning questions that have puzzled humanity probably for as long as we've been able to ask "what am I?"
Now, a new study has found some answers – and life emerged surprisingly early.
By studying the genomes of organisms that are alive today, scientists have determined that the last universal common ancestor (LUCA), the first organism that spawned all the life that exists today on Earth, emerged as early as 4.2 billion years ago.
Earth, for context, is around 4.5 billion years old. That means life first emerged when the planet was still practically a newborn.
Back when it was new, Earth was a very different place, with an atmosphere that we would find extremely toxic today. Oxygen, in the amounts that current life seems to need, didn't accumulate until relatively late in the planet's history, only around 3 billion years ago.
But life emerged prior to that; we have fossils of microbes from 3.48 billion years ago. And scientists think that conditions on Earth may have been stable enough to support life from around 4.3 billion years ago.
But our planet is subject to erosional, geological, and organic processes that make evidence of that life, from that time, almost impossible to find.
So a team of scientists went looking somewhere else: in the genomes of living organisms, and in the fossil record.
Their study is based on something called a molecular clock. Basically, we can estimate the rate at which mutations occur, and count the number to determine how much time has passed since the organisms in question diverged from common ancestors.
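As a rough feel for the arithmetic behind a molecular clock, here is a minimal sketch; the substitution rate and the observed divergence are invented numbers for illustration only, and the study itself relies on far more sophisticated, fossil-calibrated models.

```python
# Minimal molecular-clock illustration with made-up numbers.
def divergence_time(substitutions_per_site: float, rate_per_site_per_year: float) -> float:
    """Time since two lineages split, assuming a constant substitution rate.

    Divergence accumulates along both branches, hence the factor of 2.
    """
    return substitutions_per_site / (2.0 * rate_per_site_per_year)

# Hypothetical example: 0.8 substitutions per site separating two lineages,
# at an assumed rate of 1e-10 substitutions per site per year.
t = divergence_time(0.8, 1e-10)
print(f"Estimated divergence: {t / 1e9:.1f} billion years ago")
```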
All organisms, from the humblest microbe to the mightiest fungus, have some things in common. There's a universal genetic code. The way we make proteins is the same. There's an almost universal set of 20 amino acids that are all oriented the same way. And all living organisms use adenosine triphosphate (ATP) as a source of energy in their cells.
Researchers worked out, based on these similarities and differences, how long it has been since LUCA's successors started to diverge. And, using complex evolutionary modeling, they were able to learn more about LUCA itself – what it was, and how it survived on an Earth so very inhospitable to its descendants.
LUCA, they found, was probably very similar to a prokaryote, a single-celled organism that doesn't have a nucleus. It was obviously not reliant on oxygen, since there would have been little oxygen available; that's not unexpected for a microbe. As such, its metabolic processes probably produced acetate.
But there was something else interesting: LUCA appears not to have been alone. The study showed that LUCA was a complex organism, not too different from modern prokaryotes, and that it possessed an early immune system, which means that even by 4.2 billion years ago our ancestor was engaged in an arms race with viruses. Because its metabolic processes would have produced waste products that other lifeforms could use, those other organisms could have emerged not long after LUCA did.
This implies that it takes relatively little time for a full ecosystem to emerge in the evolutionary history of a planet – a finding that has implications far beyond our own little pale blue dot.
The Exact Part of The Brain Behind Your Curiosity
Being curious is a quintessential part of being human, driving us to learn and adapt to new environments. For the first time, scientists have pinpointed the spot in the brain where curiosity emerges. The discovery was made by researchers who used functional magnetic resonance imaging (fMRI) scans to measure oxygen levels in different parts of the brain, indicating how busy each region is at any one time.
Knowing where curiosity originates could help us understand more about how human beings tick, and potentially lead to therapies for conditions where curiosity is lacking, such as chronic depression. This is really the first time we can link the subjective feeling of curiosity about information to the way your brain represents that information.
In the experiments combining curiosity ratings with fMRI scans, notable activity was spotted in three regions: the occipitotemporal cortex (linked to vision and object recognition), the ventromedial prefrontal cortex or vmPFC (which manages perceptions of value and confidence), and the anterior cingulate cortex (used for information gathering).
The vmPFC appears to act as a sort of neurological bridge between levels of certainty recorded by the occipitotemporal cortex and subjective feelings of curiosity – almost like a trigger telling us when to be curious. The less confident the volunteers were about the image subject, the more curious they were about it.
"These results illuminate how perceptual input is transformed by successive neural representations to ultimately evoke a feeling of curiosity," write the researchers in their published paper.
Study identifies epigenetic 'switches' that regulate the developmental trajectories of single cells
Individual cells in the human body develop progressively over time, ultimately becoming specialized in specific functions. This process, known as cell differentiation or specialization, is central to the formation of distinct cell populations that serve different purposes.
Past studies suggest that the fate of cells is also modulated by epigenetic mechanisms (i.e., interactions between genes and environmental factors). These epigenetic mechanisms, however, have so far proved difficult to pinpoint.
Researchers recently carried out a study aimed at exploring the epigenetic processes influencing the developmental trajectories of individual cells, using human brain and retina organoids derived from pluripotent stem cells.
Their paper, published in Nature Neuroscience, outlines a single-cell epigenome-wide map that could aid the study of human cell fate determination.
This paper was inspired by the need to better understand the epigenetic mechanisms that regulate cell fate decisions during human brain and retina development.
The primary objective was to create a comprehensive single-cell epigenomic map that could capture the transitions from pluripotent stem cells to differentiated neural cells.
To study the epigenetic mechanisms underpinning the diversification of human cells, the researchers carried out experiments on human brain and retina organoids, three-dimensional (3D), miniaturized versions of human organs created in laboratory settings, using pluripotent stem cells.
Using epigenetic techniques, the researchers were able to track how key histone modifications changed as cells transitioned through different stages of development. The observations gathered in their experiments allowed them to identify dynamic epigenetic "switches" that regulate the fate of individual cells. They discovered that the switching of repressive and activating histone modifications happens before cell fate decisions. Additionally, they demonstrated that removing H3K27me3 at the neuroectoderm stage disrupts fate restriction, leading to aberrant cell identities.
Fides Zenk et al, Single-cell epigenomic reconstruction of developmental trajectories from pluripotency in human neural organoid systems, Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01652-0
How climate change is altering the Earth's rotation
For the first time, researchers have been able to fully explain the various causes of long-term polar motion in the most comprehensive modeling to date, using AI methods. Their model and their observations show that climate change and global warming will have a greater influence on the Earth's rotational speed than the effect of the moon, which has determined the increase in the length of the day for billions of years.
Climate change is causing the ice masses in Greenland and Antarctica to melt. Water from the polar regions is flowing into the world's oceans—and especially into the equatorial region.
This means that a shift in mass is taking place, and this is affecting the Earth's rotation.
It's like when a figure skater does a pirouette, first holding her arms close to her body and then stretching them out. The initially fast rotation becomes slower because the masses move away from the axis of rotation, increasing the moment of inertia.
In physics, we speak of the law of conservation of angular momentum, and this same law also governs the Earth's rotation. If the Earth turns more slowly, the days get longer. Climate change is therefore also altering the length of the day on Earth, albeit only minimally for now.
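To make the conservation argument concrete, here is a back-of-the-envelope sketch assuming a purely hypothetical fractional increase in the Earth's moment of inertia; the number is not from the study, it only illustrates how L = Iω links a mass shift to a longer day.

```python
import math

# Conservation of angular momentum, L = I * omega: if mass shifts toward the
# equator, the moment of inertia I grows slightly, so omega must drop and the
# day gets longer. The fractional change below is purely illustrative.
DAY_SECONDS = 86_400.0
omega0 = 2.0 * math.pi / DAY_SECONDS           # current rotation rate (rad/s)

delta_I_fraction = 1e-9                        # hypothetical relative increase in I
omega1 = omega0 / (1.0 + delta_I_fraction)     # I0 * omega0 = I1 * omega1

new_day = 2.0 * math.pi / omega1
print(f"Day lengthens by about {(new_day - DAY_SECONDS) * 1e3:.3f} ms")
```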
Another cause of this slowdown is tidal friction, which is triggered by the moon. However, the new study comes to a surprising conclusion: if humans continue to emit more greenhouse gases and the Earth warms up accordingly, this would ultimately have a greater influence on the Earth's rotational speed than the effect of the moon, which has determined the increase in the length of the day for billions of years.
We humans have a greater impact on our planet than we realize and this naturally places great responsibility on us for the future of our planet.
However, shifts in mass on the Earth's surface and in its interior caused by the melting ice not only change the Earth's rotational speed and the length of day: as the researchers show in Nature Geoscience, they also alter the axis of rotation. This means that the points where the axis of rotation actually meets the Earth's surface move.
Researchers can observe this polar motion, which, over a longer timeframe, comes to some ten meters per hundred years. It's not only the melting of the ice sheets that plays a role here, but also movements taking place in the Earth's interior.
Deep in the Earth's mantle, where the rock becomes viscous due to high pressure, displacements occur over long periods of time. And there are also heat flows in the liquid metal of Earth's outer core, which are responsible for both generating the Earth's magnetic field and leading to shifts in mass.
In the most comprehensive modeling to date, researchers have now shown how polar motion results from individual processes in the core, in the mantle and from the climate at the surface. One finding that stands out in their study is that the processes on and in the Earth are interconnected and influence each other: climate change is causing the Earth's axis of rotation to move, and it appears that the feedback from the conservation of angular momentum is also changing the dynamics of the Earth's core. Ongoing climate change could therefore even be affecting processes deep inside the Earth and have a greater reach than previously assumed. However, there is little cause for concern, as these effects are minor and unlikely to pose a risk.
Implications for space travel
Even if the Earth's rotation is changing only slowly, this effect has to be taken into account when navigating in space—for example, when sending a space probe to land on another planet. Even a slight deviation of just one centimeter on Earth can grow to a deviation of hundreds of meters over the huge distances involved.
"Otherwise, it won't be possible to land in a specific crater on Mars".
Study finds abortion restrictions harm mental health, with low-income women hardest hit
People living in states that enacted tighter abortion restrictions in the wake of the Dobbs v. Jackson Women's Health decision, which returned regulation of abortion access to state legislatures, are more likely to report elevated levels of mental distress. This is particularly true for people of lower socioeconomic means.
These are the key takeaways of a July 2024 paper published in Science Advances.
Loss of oxygen in bodies of water identified as new tipping point
Oxygen concentrations in our planet's waters are decreasing rapidly and dramatically—from ponds to the ocean. The progressive loss of oxygen threatens not only ecosystems, but also the livelihoods of large sectors of society and the entire planet, according to the authors of an international study involving GEOMAR published recently in Nature Ecology & Evolution.
Oxygen is a fundamental requirement of life on planet Earth. The loss of oxygen in water, also referred to as aquatic deoxygenation, is a threat to life at all levels. The international team of researchers describes how ongoing deoxygenation presents a major threat to the livelihoods of large parts of society and for the stability of life on our planet.
Previous research has identified a suite of global scale processes, referred to as planetary boundaries, that regulate the overall habitability and stability of the planet. If critical thresholds in these processes are passed, the risk of large-scale, abrupt or irreversible environmental changes ("tipping points") increases and the resilience of our planet, its stability, is jeopardized.
Among the nine planetary boundaries are climate change, land use change, and biodiversity loss. The authors of the new study argue that aquatic deoxygenation both responds to, and regulates, other planetary boundary processes.
Across all aquatic ecosystems, from streams and rivers, lakes, reservoirs, and ponds to estuaries, coasts, and the open ocean, dissolved oxygen concentrations have rapidly and substantially declined in recent decades.
Lakes and reservoirs have experienced oxygen losses of 5.5% and 18.6% respectively since 1980. The ocean has experienced oxygen losses of around 2% since 1960. Although this number sounds small, due to the large ocean volume it represents an extensive mass of oxygen lost.
Marine ecosystems have also experienced substantial variability in oxygen depletion.
The volumes of aquatic ecosystems affected by oxygen depletion have increased dramatically across all types.
The causes of aquatic oxygen loss are global warming due to greenhouse gas emissions and the input of nutrients as a result of land use.
If water temperatures rise, the solubility of oxygen in the water decreases. In addition, global warming enhances stratification of the water column, because warmer, low-salinity water with a lower density lies on top of the colder, saltier deep water below.
This hinders the exchange of the oxygen-poor deep layers with the oxygen-rich surface water. In addition, nutrient inputs from land support algal blooms, which lead to more oxygen being consumed as more organic material sinks and is decomposed by microbes at depth.
Areas in the sea where there is so little oxygen that fish, mussels or crustaceans can no longer survive threaten not only the organisms themselves, but also ecosystem services such as fisheries, aquaculture, tourism and cultural practices.
Microbial processes in oxygen-depleted regions also increasingly produce potent greenhouse gases such as nitrous oxide and methane, which can further amplify global warming, itself a major cause of oxygen depletion. The authors warn: "We are approaching critical thresholds of aquatic deoxygenation that will ultimately affect several other planetary boundaries. Failure to address aquatic deoxygenation will, ultimately, not only affect ecosystems but also economic activity and society at a global level."
Kevin C. Rose et al, Aquatic deoxygenation as a planetary boundary and key regulator of Earth system stability, Nature Ecology & Evolution (2024). DOI: 10.1038/s41559-024-02448-y
Dr. Krishna Kumari Challa
Scientists successfully create a time crystal made of giant atoms
A crystal is an arrangement of atoms that repeats itself in space, in regular intervals: At every point, the crystal looks exactly the same. In 2012, Nobel Prize winner Frank Wilczek raised the question: Could there also be a time crystal—an object that repeats itself not in space but in time? And could it be possible that a periodic rhythm emerges, even though no specific rhythm is imposed on the system and the interaction between the particles is completely independent of time?
For years, Frank Wilczek's idea has caused much controversy. Some considered time crystals to be impossible in principle, while others tried to find loopholes and realize time crystals under certain special conditions.
Now, a particularly spectacular kind of time crystal has successfully been created at Tsinghua University in China, with support from TU Wien in Austria. The team used laser light and special types of atoms, namely Rydberg atoms, with a diameter that is several hundred times larger than normal. The results have been published in the journal Nature Physics.
The ticking of a clock is also an example of a temporally periodic movement. However, it does not happen by itself: Someone must have wound the clock and started it at a certain time. This starting time then determined the timing of the ticks. It is different with a time crystal:
According to Wilczek's idea, a periodicity should arise spontaneously, although there is actually no physical difference between different points in time.
The tick frequency is predetermined by the physical properties of the system, but the times at which the tick occurs are completely random; this is known as spontaneous symmetry breaking.
How this new work was done:
Laser light was shone into a glass container filled with a gas of rubidium atoms. The strength of the light signal that arrived at the other end of the container was measured.
This is actually a static experiment in which no specific rhythm is imposed on the system.
The interactions between light and atoms are always the same, the laser beam has a constant intensity. But surprisingly, it turned out that the intensity that arrives at the other end of the glass cell begins to oscillate in highly regular patterns.
The key to the experiment was to prepare the atoms in a special way: The electrons of an atom can orbit the nucleus on different paths, depending on how much energy they have. If energy is added to the outermost electron of an atom, its distance from the atomic nucleus can become very large.
Part1
In extreme cases, it can be several hundred times further away from the nucleus than usual. In this way, atoms with a giant electron shell are created—so-called Rydberg atoms.
If the atoms in their glass container are prepared in such Rydberg states and their diameter becomes huge, then the forces between these atoms also become very large.
And that in turn changes the way they interact with the laser. If you choose laser light in such a way that it can excite two different Rydberg states in each atom at the same time, then a feedback loop is generated that causes spontaneous oscillations between the two atomic states. This in turn also leads to oscillating light absorption. All by themselves, the giant atoms stumble into a regular beat, and this beat is translated into the rhythm of the light intensity that arrives at the end of the glass container.
Jul 10, 2024
Dr. Krishna Kumari Challa
So the researchers have created a new system here that provides a powerful platform for deepening their understanding of the time crystal phenomenon in a way that comes very close to Frank Wilczek's original idea.
Precise, self-sustained oscillations could be used for sensors.
Xiaoling Wu et al, Dissipative time crystal in a strongly interacting Rydberg gas, Nature Physics (2024). DOI: 10.1038/s41567-024-02542-9. On arXiv: arxiv.org/html/2305.20070v3
Part 2
Jul 10, 2024
Dr. Krishna Kumari Challa
New study sheds light on brain responses to emotionally-charged scenes
The ability to recognize and respond to emotionally-charged situations is essential to a species' evolutionary success. A new study published in Nature Communications advances our understanding of how the brain responds to emotionally charged objects and scenes.
This new research reveals that the occipital temporal cortex is tuned not only to different categories of stimuli but also breaks these categories down based on their emotional characteristics, in a way that is well suited to guide selection between alternative behaviours.
The researchers analyzed the brain activity of a small group of volunteers viewing over 1,500 images depicting natural emotional scenes such as a couple hugging, an injured person in a hospital bed, a luxurious home, and an aggressive dog. Participants were asked to categorize the images as positive, negative or neutral and to also rate the emotional intensity of the images.
A second group of participants picked the behavioural responses that best matched each scene.
Using cutting-edge modeling of brain activity divided into tiny cubes (under 3 mm³), the study discovered that the occipital temporal cortex (OTC), a region at the back of the brain, is tuned to represent both the type of stimulus (single human, couple, crowd, reptile, mammal, food, object, building, landscape, etc.) and the emotional characteristics of the stimulus—whether it's negative, positive or neutral, and whether it's high or low in emotional intensity.
Machine learning showed that these stable tuning patterns were more efficient in predicting the behaviors matched to the images by the second group of participants than could be achieved by applying machine learning directly to image features—suggesting that the OTC efficiently extracts and represents the information needed to guide behaviour.
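To give a feel for the kind of comparison described here, the sketch below trains the same off-the-shelf classifier on two placeholder feature sets, one standing in for the OTC tuning patterns and one for raw image features, and compares cross-validated accuracy. The data, features and model are assumptions for illustration, not the study's actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_images = 200
behaviour_labels = rng.integers(0, 5, size=n_images)    # hypothetical behaviour categories

otc_tuning_features = rng.normal(size=(n_images, 50))   # stand-in for voxel tuning patterns
raw_image_features = rng.normal(size=(n_images, 512))   # stand-in for raw image features

for name, X in [("OTC tuning", otc_tuning_features), ("image features", raw_image_features)]:
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, behaviour_labels, cv=5).mean()
    print(f"{name}: mean cross-validated accuracy = {acc:.2f}")
```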
These findings expand our knowledge of how the human brain represents emotional natural stimuli.
Occipital-temporal cortical tuning to semantic and affective features of natural images predicts associated behavioral responses, Nature Communications (2024). DOI: 10.1038/s41467-024-49073-8
Jul 10, 2024
Dr. Krishna Kumari Challa
First local extinction due to sea level rise identified in the US
The United States has lost its only stand of the massive Key Largo tree cactus in what researchers think is the first local extinction of a species caused by sea level rise in the country.
The Key Largo tree cactus (Pilosocereus millspaughii) still grows on a few scattered islands in the Caribbean, including northern Cuba and parts of the Bahamas. In the United States, it was restricted to a single population in the Florida Keys, first discovered in 1992 and monitored intermittently since.
Salt water intrusion from rising seas, soil depletion from hurricanes and high tides, and herbivory by mammals had put significant pressure on the population. By 2021, what had been a thriving stand of about 150 stems was reduced to six ailing fragments, which researchers salvaged for off-site cultivation to ensure their survival.
"Unfortunately, the Key Largo tree cactus may be a bellwether for how other low-lying coastal plants will respond to climate change," say scientists.
But don't worry, the researchers are studying and trying to rescue the remnants of a dwindling stock of this cactus.
First U.S. vascular plant extirpation linked to sea level rise? Pilosocereus millspaughii (Cactaceae) in the Florida Keys, U.S.A., Journal of the Botanical Research Institute of Texas (2024). DOI: 10.17348/jbrit.v18.i1.1350
Jul 10, 2024
Dr. Krishna Kumari Challa
'Unhealthy' gut microbiome patterns linked to heightened risk of death after organ transplant
'Unhealthy' gut microbiome patterns are linked to a heightened risk of death after a solid organ transplant, finds research published online in the journal Gut.
While these particular microbial patterns are associated with deaths from any cause, they are specifically associated with deaths from cancer and infection, regardless of the organ—kidney, liver, heart, or lung—transplanted, the findings show.
The make-up of the gut microbiome is associated with various diseases, including inflammatory bowel disease and diabetes. But few studies have had the data to analyze the association between the gut microbiome and long term survival, explain the researchers.
And while a shift away from a normal pattern of microbes to an 'unhealthy' pattern, known as gut dysbiosis, has been linked to a heightened risk of death generally, it's not clear whether this might also be associated with overall survival in specific diseases, they add.
To find out, they looked at the relationship between gut dysbiosis and death from all and specific causes in solid organ transplant recipients among whom the prevalence of gut dysbiosis is much higher than that of the general population. This makes them an ideal group to study the associations between gut dysbiosis and long term survival, say the researchers.
Part1
Jul 10, 2024
Dr. Krishna Kumari Challa
They analyzed the microbiome profiles from 1,337 fecal samples provided by 766 kidney, 334 liver, 170 lung, and 67 heart transplant recipients and compared those with the gut microbiome profiles of 8,208 people living in the same geographical area of northern Netherlands.
The average age of the transplant recipients was 57, and over half were men (784; 59%). On average, they had received their transplant 7.5 years previously.
Part2
Jul 10, 2024
Dr. Krishna Kumari Challa
During a follow-up period of up to 6.5 years, 162 recipients died: 88 kidney, 33 liver, 35 lung and six heart recipients. Forty eight (28%) died from an infection, 38 (23%) from cardiovascular disease, 38 (23%) from cancer, and 40 (25%) from other causes.
The researchers looked at several indicators of gut dysbiosis in these samples: microbial diversity; how much their gut microbiomes differed from the average microbiome of the general population; the prevalence of antibiotic resistance genes; and virulence factors which help bacteria to invade cells and evade immune defenses.
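As a rough sketch of how two of these indicators can be quantified, the snippet below computes within-sample (Shannon) diversity and the divergence of one sample from an average population profile using Bray-Curtis dissimilarity. These are common microbiome metrics chosen here for illustration; whether they match the study's exact definitions is an assumption, and the abundance vectors are invented.

```python
import numpy as np

def shannon_diversity(abundances):
    """Within-sample diversity: higher values mean a more even community."""
    p = np.asarray(abundances, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def bray_curtis(a, b):
    """Dissimilarity between two relative-abundance profiles (0 = identical)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.abs(a - b).sum() / (a + b).sum()

population_mean = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # hypothetical average profile
patient_sample = np.array([0.70, 0.10, 0.05, 0.05, 0.10])    # hypothetical dysbiotic sample

print(f"Shannon diversity: {shannon_diversity(patient_sample):.2f}")
print(f"Divergence from population average: {bray_curtis(patient_sample, population_mean):.2f}")
```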
The analysis revealed that the more the gut microbiome patterns of the transplant recipients diverged from those of the general population, the more likely they were to die sooner after their procedure, irrespective of the organ transplanted.
Similar associations emerged for the abundance of antibiotic resistance genes and virulence factors.
The researchers identified 23 bacterial species among all the transplant recipients that were associated with either a heightened or lower risk of death from all causes.
Part3
Jul 10, 2024
Dr. Krishna Kumari Challa
The researchers further analyzed all bacterial species simultaneously using AI. This revealed a second pattern of 19 different species that were also associated with an increased risk of death.
This is an observational study, and as such, no definitive conclusions can be drawn about the causal roles of particular bacteria.
But, conclude the researchers, "Our results support emerging evidence showing that gut dysbiosis is associated with long-term survival, indicating that gut microbiome targeting therapies might improve patient outcomes, although causal links should be identified first."
Casper Swarte et al, Multiple indicators of gut dysbiosis predict all-cause and cause-specific mortality in solid organ transplant recipients, Gut (2024). DOI: 10.1136/gutjnl-2023-331441
Part4
Jul 10, 2024
Dr. Krishna Kumari Challa
Hepatitis C leaves 'scars' in immune cells even after successful treatment
Chronic hepatitis C, caused by the hepatitis C virus, can lead to severe complications such as liver cirrhosis and liver cancer. The advent of highly effective direct-acting antivirals (DAAs) has resulted in high cure rates for this chronic viral infection. However, it has been reported that the immune system of patients does not fully recover even after being cured.
This work provided new insights into the lasting effects of chronic hepatitis C virus (HCV) infection on the immune system, even after the disease has been successfully treated.
The research team has discovered that traces of "epigenetic scars" remain in regulatory T cells and exhibit sustained inflammatory properties long after the virus is cleared from the body. The paper is published in the Journal of Hepatology.
So-Young Kim et al, Epigenetic scars in regulatory T cells are retained after successful treatment of chronic hepatitis C with direct-acting antivirals, Journal of Hepatology (2024). DOI: 10.1016/j.jhep.2024.06.011
Jul 10, 2024
Dr. Krishna Kumari Challa
How lasers and 2D materials could solve the world's plastic problem
A global research team has developed a way to blast the molecules in plastics and other materials with a laser to break them down into their smallest parts for future reuse.
The discovery, which involves laying these materials on top of two-dimensional materials called transition metal dichalcogenides and then lighting them up, has the potential to improve how we dispose of plastics that are nearly impossible to break down with today's technologies.
By harnessing these unique reactions, we can explore new pathways for transforming environmental pollutants into valuable, reusable chemicals, contributing to the development of a more sustainable and circular economy.
This discovery has significant implications for addressing environmental challenges and advancing the field of green chemistry.
Jingang Li et al, Light-driven C–H activation mediated by 2D transition metal dichalcogenides, Nature Communications (2024). DOI: 10.1038/s41467-024-49783-z
Jul 10, 2024
Dr. Krishna Kumari Challa
No GPS, no problem: Researchers are making quantum sensing tools more compact and accurate to replace GPS
Fundamental physics—let alone quantum physics—might sound complicated to many, but it can actually be applied to solve everyday problems.
Imagine navigating to an unfamiliar place. Most people would suggest using GPS, but what if you were stuck in an underground tunnel where radio signals from satellites were not able to penetrate? That's where quantum sensing tools come in.
Researchers are working at making sensing instruments like atomic accelerometers smaller and more accurate so they can be used to navigate when GPS is down.
Atoms are excellent at making accurate measurements because they are all the same. Atomic measurements made in one laboratory would be indistinguishable from those made in another laboratory, as the atoms behave in precisely the same way.
One example of how this physics concept can be applied is making a highly accurate navigation system with these atoms.
As atoms have mass, they can be used to measure accelerations, helping us build atom-based sensors like atomic accelerometers.
The accelerometers let you know how fast and far you're moving in a given direction. They can be coupled with gyroscopes, which tell you whether you've changed directions and how far you've turned, to make a complete measurement. These navigation instruments are useful when you don't have access to GPS.
One of the challenges they're facing is how they can engineer this in a thoughtful way.
For example, they have to think very carefully about how they can miniaturize atomic accelerometers. These accelerometers have historically operated in big laboratory-scale systems, where equipment is heavy and consumes a lot of power. To make the accelerometers suitable for public use, researchers are investigating how to retain their high precision in a much more compact, power-efficient and practical package.
Not only do quantum sensing devices work in areas that don't have access to GPS, they can also be part of an exciting new avenue: national security applications.
Part 1
Jul 10, 2024
Dr. Krishna Kumari Challa
Modern conflicts are becoming increasingly electronic and less kinetic, as nations vie for information superiority. The radio signal from GPS satellites is easy to disrupt and jam because it is far away. Thus, in any modern conflict, both sides will attempt to deny each other access to these radio signals.
More traditional navigation instruments like inertial systems are un-jammable, as they work by adding up accelerations and rotations to measure the change in position. So they can replace GPS in times of conflict. However, all the measurement errors also get added up, so researchers are interested in using atom-based measurements to make these systems more accurate.
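The 'errors add up' point can be made concrete with a toy dead-reckoning calculation: a tiny, constant accelerometer bias (a hypothetical value chosen only to show the scaling) is integrated twice and grows into a position error of hundreds of meters within an hour.

```python
# Double integration of a constant accelerometer bias: the velocity error grows
# linearly, and the position error grows roughly quadratically with time.
bias = 1e-4            # assumed constant acceleration bias, m/s^2 (illustrative)
dt = 1.0               # integration step, s

velocity_error = 0.0
position_error = 0.0
for step in range(1, 3601):                  # simulate one hour
    velocity_error += bias * dt              # first integration
    position_error += velocity_error * dt    # second integration
    if step % 1200 == 0:
        print(f"after {step / 60:.0f} min: position error of roughly {position_error:.0f} m")
```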
Atomic accelerometers are one example of these inertial systems. These systems are present in sensors on aircraft and ships, guiding their movement through airspaces and waters. However, existing mechanical-based sensors can wear out easily due to friction, leading to them being swapped out every year and costing a lot of money. They are also hard to build because they're small and delicate.
The quantum approach based on atoms pursued by the present researchers could provide acceleration measurements with no moving parts.
For example, if a submarine wants to be stealthy and quiet in defense scenarios, keeping track of where it is and how it has moved using inertial systems is pretty much the only game in town.
Achieving this fine balance between simplicity and accuracy is the researchers' main goal, and they hope that their efforts will translate to real-world prototypes someday.
Source: USC
Part 2
Jul 10, 2024
Dr. Krishna Kumari Challa
Familial endocrine diseases linked to increased risk of pregnancy loss, new research shows
Women who have close family members with endocrine diseases--including type 2 diabetes, thyroid diseases and polycystic ovary syndrome (PCOS)--are at higher risk of pregnancy loss, a new study has found.
The research, presented recently at the ESHRE 40th Annual Meeting in Amsterdam, examined the association between various endocrine diseases and the incidence of pregnancy loss.
The study investigated 366,539 women in Denmark between 1973 and 2022. The study found that women with parents diagnosed with endocrine diseases faced a 6% higher risk of pregnancy loss compared to those without a family history of endocrine diseases.
Similarly, if a woman's sister had an endocrine disease, her risk of experiencing pregnancy loss increased by 7%. These patterns persisted even when individual cases of the diseases were considered.
The results highlight having a family history of endocrine disease as an important yet previously underexplored factor in assessing the risk of pregnancy loss.
Egerup, P., et al, Familial endocrine disease increases the risk of pregnancy loss and recurrent pregnancy loss– a nationwide register-based study of 366,548 Danish women. Human Reproduction (2024). academic.oup.com/humrep/issue/39/Supplement_1
Jul 10, 2024
Dr. Krishna Kumari Challa
Auroras caused by head-on blows to Earth's magnetic field could damage critical infrastructure, scientists say
Auroras are beautiful colourful lights that catch our imagination. Poets write poems, artists, including me, create art works based on them, writers weave stories around them.
They have inspired myths and portents for millennia—but only now, with modern technology dependent on electricity, are we appreciating their true power.
The same forces which cause auroras also cause currents that can damage infrastructure which conducts electricity, like pipelines.
Now scientists writing in Frontiers in Astronomy and Space Sciences have demonstrated that the impact angle of interplanetary shocks is key to the currents' strength, offering an opportunity to forecast dangerous shocks and shield critical infrastructure.
Auroras and geomagnetically induced currents are caused by similar space weather drivers.
The aurora is a visual warning that indicates that electric currents in space can generate these geomagnetically-induced currents on the ground.
The auroral region can greatly expand during severe geomagnetic storms. Usually, its southernmost boundary is around latitudes of 70 degrees, but during extreme events it can go down to 40 degrees or even further, which certainly occurred during the May 2024 storm—the most severe storm in the past two decades.
Auroras are caused by two processes: either particles ejected from the sun reach Earth's magnetic field and cause a geomagnetic storm, or interplanetary shocks compress Earth's magnetic field.
These shocks also generate geomagnetically induced currents, which can damage infrastructure that conducts electricity. More powerful interplanetary shocks mean more powerful currents and auroras—but frequent, less powerful shocks could also do damage.
The most intense deleterious effects on power infrastructure occurred in March 1989 following a severe geomagnetic storm—the Hydro-Quebec system in Canada was shut down for nearly nine hours, leaving millions of people with no electricity.
But weaker, more frequent events such as interplanetary shocks can pose threats to ground conductors over time. Recent research work shows that considerable geoelectric currents occur quite frequently after shocks, and they deserve attention.
Part 1
Jul 11, 2024
Dr. Krishna Kumari Challa
Shocks which hit the Earth head-on, rather than at an angle, are thought to induce stronger geomagnetically induced currents, because they compress the magnetic field more. The scientists investigated how geomagnetically induced currents are affected by shocks at different angles and times of day.
Scientists found that more frontal shocks cause higher peaks in geomagnetically induced currents, both immediately after the shock and during the following substorm. Particularly intense peaks took place around magnetic midnight, when the north pole would have been between the sun and Mäntsälä, the Finnish natural gas pipeline site where the currents were recorded. Localized substorms at this time also cause striking auroral brightening.
Moderate currents occur shortly after the perturbation impact when Mäntsälä is around dusk local time, whereas more intense currents occur around midnight local time.
Because the angles of these shocks can be predicted up to two hours before impact, this information could allow us to set in place protections for electricity grids and other vulnerable infrastructure before the strongest and most head-on shocks strike.
One thing power infrastructure operators could do to safeguard their equipment is to manage a few specific electric circuits when a shock alert is issued. This would prevent geomagnetically induced currents from reducing the lifetime of the equipment.
However, the scientists didn't find strong correlations between the angle of a shock and the time it takes for it to hit and then induce a current. This may be because more recordings of currents at different latitudes are needed to investigate this aspect. Current data was collected only at a particular location, namely the Mäntsälä natural gas pipeline system.
Although Mäntsälä is at a critical location, it does not provide a worldwide picture.
First direct observations of interplanetary shock impact angle effects on actual geomagnetically induced currents: The case of the Finnish natural gas pipeline system, Frontiers in Astronomy and Space Sciences (2024). DOI: 10.3389/fspas.2024.1392697
Part 2
Jul 11, 2024
Dr. Krishna Kumari Challa
Researchers discover a new defense mechanism in bacteria
When confronted with an antibiotic, toxic substance, or other source of considerable stress, bacteria are able to activate a defense mechanism using cell-to-cell communication to "warn" unaffected bacteria, which can then anticipate, shield themselves and spread the warning signal.
This mechanism has just been described for the first time by a team of scientists.
It paves the way for the development of new, more effective antibiotic treatments that can target this bacterial communication system. The work appears in Nature Communications.
When they perceive a source of stress, bacteria spring into action, inducing changes in the expression of certain genes and their physiological properties to make them less vulnerable to the detected lethal substance. They also produce small alarmone proteins on their surface in order to contact and activate random neighboring bacteria.
Unstressed bacteria can only change state in the presence of a sufficient amount of alarmones. Thus, only a source of stress perceived by sufficient bacteria can trigger propagation of this activation.
The mechanism offers several advantages: It limits the unnecessary use of energy and enables a rapid and coordinated response in the population. Because activation is gradual, it creates diversity in the population over time, thus increasing the bacteria's chances of survival.
These findings were established using a dozen different families of antibiotics on populations of Streptococcus pneumoniae, the bacteria that causes pneumococcal infections.
Pneumococcal competence is a populational health sensor driving multilevel heterogeneity in response to antibiotics, Nature Communications (2024). DOI: 10.1038/s41467-024-49853-2
Jul 11, 2024
Dr. Krishna Kumari Challa
The changing fire science
A new paper on the many ways wildfires affect people and the planet makes clear that as fires become more intense and frequent, the urgency for effective and proactive fire science grows. By addressing these challenges, the fire research community aims to better protect our planet and its inhabitants.
The paper appears in the Zenodo research repository.
Fire is a natural part of life on Earth, sustaining healthy and balanced ecosystems worldwide. But human activity and a changing climate are rapidly shifting both the frequency and severity of wildfire events, creating new risks to human and environmental health.
Recently, a group of scientists from 14 countries and across several disciplines—physical and social sciences, mathematics, statistics, remote sensing, fire communication and art, operational fire science, and fire management—gathered to discuss rapid changes in fire regimes and identify pathways to address these challenges.
The experts identified three grand challenges for fire science in the coming decades: understanding the role of fire in the carbon cycle, fire and extreme events, and the role of humans in fire.
If we want to improve the assessment of future fire impacts on people and the planet, we need to start with a better understanding of how climate, land cover changes, and human land management practices drive fire distribution and severity in the coming decades, the scientists say.
To address the grand challenges, the scientists identified three pressing research priorities: understanding the net carbon balance of fire, developing rapid response tools for wildfire events, and understanding fire's impact on society, especially marginalized and underrepresented populations.
A main goal of the white paper is to improve fire modeling, predictability, and mitigation on both regional and global scales.
As fire events become more intense and frequent, the urgency for effective and proactive fire science grows. Scientists are taking steps to address these challenges collectively, as a unified fire research community, to better protect our planet and its inhabitants.
Douglas S Hamilton et al, Igniting Progress: Outcomes from the FLARE workshop and three challenges for the future of transdisciplinary fire science, Zenodo (2024). DOI: 10.5281/zenodo.12634067
Jul 11, 2024
Dr. Krishna Kumari Challa
Subsurface of fingernails found to have precise tactile localization
Research has found that humans have a surprisingly precise degree of tactile localization beneath their fingernails. The study, published in Proceedings of the Royal Society B, tested how well volunteers could pinpoint the part of their fingernail being stimulated and outlines possible reasons.
Humans, like other primates, have nails on the ends of their fingers rather than claws—an evolutionary development that has not been explained. In this new effort, researchers investigated the sensitivity of the skin below the fingernails to learn more about how they were used by our ancestors.
The human fingernail does not have any nerves; thus, it cannot sense touch, pressure, heat, cold or other environmental characteristics. But there is skin beneath the fingernail that is capable of sensations, as evidenced by people who accidentally hit their thumb with a hammer or lose a nail.
To learn more about the sensitivity of the subsurface of the fingernail, the researchers recruited 38 adult volunteers. Each agreed to have their fingernails poked while they indicated on a photograph of a fingernail where they thought their fingernail was being touched. In the experiments, half of the volunteers had their nails touched by a stick, the other half by a filament. Only the thumb and middle finger were tested.
The study found that humans have highly precise localization in their nails—they can tell clearly which part of their nail is being touched. He suggests that this is due to mechanoreceptors called Pacinian corpuscles, buried in the skin beneath the nails. He notes that it is the same mechanism that allows blind people to localize touch using a cane. Pacinian corpuscles are able to detect small amounts of vibration, which happens when a slight impact occurs between a foreign object and a fingernail.
Why did humans develop fingernails instead of claws? And why is the skin beneath the nails so sensitive? Researchers theorize that nails likely served a sensorimotor function, giving humans more information about whatever their hands encounter.
Matthew R. Longo, Precise tactile localization on the human fingernail, Proceedings of the Royal Society B: Biological Sciences (2024). DOI: 10.1098/rspb.2024.1200
Jul 11, 2024
Dr. Krishna Kumari Challa
The geometry of life: Physicists determine what controls biofilm growth
This is physics helping biology.
From plaque sticking to teeth to scum on a pond, biofilms can be found nearly everywhere. These colonies of bacteria grow on implanted medical devices, our skin, contact lenses, and in our guts and lungs. They can be found in sewers and drainage systems, on the surface of plants, and even in the ocean.
Some research says that 80% of infections in human bodies can be attributed to the bacteria growing in biofilms.
The paper, "The biophysical basis of bacterial colony growth," was published in Nature Physics this week, and it shows that the fitness of a biofilm- its ability to grow, expand, and absorb nutrients from the medium or the substrate—is largely impacted by the contact angle that the biofilm's edge makes with the substrate. The study also found that this geometry has a bigger influence on fitness than anything else, including the rate at which the cells can reproduce.
Understanding how biofilms grow—and what factors contribute to their growth rate—could lead to critical insights on controlling them, with applications for human health, like slowing the spread of infection or creating cleaner surfaces.
Aawaz R. Pokhrel et al, The biophysical basis of bacterial colony growth, Nature Physics (2024). DOI: 10.1038/s41567-024-02572-3
Jul 11, 2024
Dr. Krishna Kumari Challa
Air pollution may affect lupus risk
New research published in Arthritis & Rheumatology indicates that chronic exposure to air pollutants may increase the risk of developing lupus, an autoimmune disease that affects multiple organs.
For the study, investigators analyzed data on 459,815 participants from the UK Biobank. A total of 399 lupus cases were identified during a median follow-up of 11.77 years. Air pollutant exposure was linked with a greater likelihood of developing lupus. Individuals with a high genetic risk and high air pollution exposure had the highest risk of developing lupus compared with those with low genetic risk and low air pollution exposure.
This study provides crucial insights into how air pollution contributes to autoimmune diseases. The findings can inform the development of stricter air quality regulations to mitigate exposure to harmful pollutants, thereby reducing the risk of lupus.
Air pollution, genetic susceptibility and risk of incident Systemic lupus erythematosus: A prospective cohort study, Arthritis & Rheumatology (2024). DOI: 10.1002/art.42929
Jul 11, 2024
Dr. Krishna Kumari Challa
Some environmental toxicants linked to depressive symptoms
Certain categories of environmental toxicants are associated with depressive symptoms, according to a study published online July 3 in JAMA Network Open.
Researchers screened and assessed the associations between potential environmental toxicants and depressive symptoms among 3,427 participants from the 2013 to 2014 and 2015 to 2016 waves of the National Health and Nutrition Examination Survey. Exposures were assessed for 62 toxicants in 10 categories; the association with depression scores, measured by the 9-item Patient Health Questionnaire (PHQ-9), was examined.
The researchers identified associations between 27 chemical compounds or metals in six of 10 categories of environmental toxicants and the prevalence of depressive symptoms, including the volatile organic compound metabolites N-acetyl-S-(2 hydroxy-3-butenyl)-L-cysteine and total nicotine equivalent-2 (odds ratios, 1.74 and 1.42, respectively).
Compared with women and older individuals, men and younger individuals seemed more vulnerable to environmental toxicants. Overall, 5–19 percent of the associations were mediated by peripheral white blood cell count.
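For illustration, the sketch below mimics the general shape of such an association analysis: a logistic regression of a binary depressive-symptom indicator (for example, PHQ-9 of 10 or more) on a toxicant biomarker, summarized as an odds ratio. The data are simulated placeholders, and the survey weighting, covariate adjustment and multi-exposure screening used in the actual study are omitted.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 3427
log_toxicant = rng.normal(size=n)                            # hypothetical exposure biomarker
logit_p = -2.0 + 0.4 * log_toxicant                          # assumed effect, for simulation only
depressed = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))  # stand-in for PHQ-9 >= 10

X = sm.add_constant(log_toxicant)
fit = sm.Logit(depressed, X).fit(disp=0)
print(f"Odds ratio per 1-SD increase in exposure: {np.exp(fit.params[1]):.2f}")
```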
"This research highlights the significance of preventing and regulating important environmental toxicants to gain fresh insights into preventing and potentially treating depression," the authors write in their paper.
Jianhui Guo et al, Environmental Toxicant Exposure and Depressive Symptoms, JAMA Network Open (2024). DOI: 10.1001/jamanetworkopen.2024.20259
Jul 11, 2024
Dr. Krishna Kumari Challa
Respiratory bacteria 'turn off' immune system to survive, study finds
Researchers have identified how a common bacterium is able to manipulate the human immune system during respiratory infections and cause persistent illness. The research was published in PLOS Pathogens.
This study identified the virulence mechanisms of Haemophilus influenzae, a bacterium that plays a significant role in worsening respiratory tract infections.
These bacteria are especially damaging to vulnerable groups, such as those with cystic fibrosis, asthma, the elderly, and Indigenous communities.
In some conditions, such as asthma and chronic obstructive pulmonary disease, they can drastically worsen symptoms.
This research shows the bacterium persists by essentially turning off the body's immune responses, inducing a state of tolerance in human respiratory tissues.
Part 1
Jul 12, 2024
Dr. Krishna Kumari Challa
The researchers prepared human nasal tissue in the lab, growing it to resemble the surfaces of the human respiratory tract, then monitored gene expression changes over a 14-day 'infection.'
They found very limited production of inflammation molecules over time; normally, these molecules would be produced within hours of bacteria infecting human cells.
The researchers then applied both live and dead Haemophilus influenzae, showing that the dead bacteria triggered rapid production of the inflammation markers, while live bacteria prevented this.
This proved that the bacteria can actively reduce the human immune response.
If local immunity drops, for example during a viral infection, the bacteria may be able to 'take over' and cause a more severe infection.
PLOS Pathogens (2024). journals.plos.org/plospathogen … journal.ppat.1012282
Part 2
Jul 12, 2024
Dr. Krishna Kumari Challa
Densely packed E. coli form an immobile material similar to colloidal glass, research shows
Dense E. coli bacteria have several qualities similar to colloidal glass, according to new research at the University of Tokyo. Colloids are substances made up of small particles suspended within a fluid, such as ink. When these particles become more densely packed, they form a "glassy state."
When the researchers let E. coli bacteria multiply within a confined area, they found that the dense populations exhibited similar characteristics. More surprisingly, the bacteria also showed some unique properties not typically found in glass-state materials.
This study, which is published in PNAS Nexus, contributes to the understanding of glassy "active matter," a relatively new field of materials research which crosses physics and life science.
In the long term, the researchers hope that these results will contribute to developing materials with new functional capabilities, as well as aiding our understanding of biofilms (where microorganisms stick together to form layers on surfaces) and natural bacterial colonies.
Since bacteria are very different from what we usually think of as glass, it was surprising that many of the statistical properties of glassy materials were the same for bacteria.
In the experiment, as the number of E. coli increased, they became caged in by their neighbors, restricting their ability to swim freely. Over time, they transitioned to a glassy state. This transition resembles glass formation: the researchers noted a rapid slowdown of movement, the caging effect, and dynamic heterogeneity (whereby particles travel longer distances in some areas but hardly move in others).
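To give a concrete sense of how "slowdown" and "caging" are usually quantified in such experiments, the mean-squared displacement (MSD) of tracked cells is a standard diagnostic: it develops a plateau when particles become trapped by their neighbors. The sketch below is a generic illustration using synthetic trajectories, not the authors' actual analysis pipeline.

```python
import numpy as np

def mean_squared_displacement(positions):
    """positions: array of shape (n_frames, n_particles, 2) with tracked x, y
    coordinates. Returns the MSD as a function of lag time. In a glassy state
    the MSD flattens into a plateau at intermediate lags, the signature of
    particles being caged by their neighbors."""
    n_frames = positions.shape[0]
    msd = np.zeros(n_frames - 1)
    for lag in range(1, n_frames):
        disp = positions[lag:] - positions[:-lag]           # displacements over this lag
        msd[lag - 1] = np.mean(np.sum(disp**2, axis=-1))    # average over particles and time origins
    return msd

# Illustrative usage with synthetic, freely diffusing "cells" for comparison:
rng = np.random.default_rng(0)
trajectories = np.cumsum(rng.normal(size=(200, 50, 2)), axis=0)
print(mean_squared_displacement(trajectories)[:5])
```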
Part 1
Jul 12, 2024
Dr. Krishna Kumari Challa
What made this bacterial glass different from other glasslike substances was the spontaneous formation of "microdomains" and the collective motion of the bacteria within these areas. These occurred where groups of the rod-shaped E. coli became aligned in the same direction.
The researchers were also surprised that the way the bacteria vitrify (turn into a glasslike state) apparently violates a physical law of typical thermal systems. What we characteristically know as glass, including colloidal glass, is classed as thermal glass. However, recently researchers have started to explore glassy states, like the one reported in this paper, which aren't considered thermal glass but share many of the same properties.
"Collections of 'self-propelled particles' like we see here have recently been regarded as a new kind of material called active matter, which is currently a hot topic and shows great potential.
Hisay Lama et al, Emergence of bacterial glass, PNAS Nexus (2024). DOI: 10.1093/pnasnexus/pgae238
Part 2
Jul 12, 2024
Dr. Krishna Kumari Challa
How water controls the speed of muscle contraction
The flow of water within a muscle fiber may dictate how quickly muscle can contract, according to a new study.
Nearly all animals use muscle to move, and it has long been known that muscle, like all other cells, is composed of about 70% water. But researchers don't know what sets the range and upper limits of muscle performance. Previous research into how muscle works has focused mainly on the molecular level, rather than on how muscle fibers are shaped: three-dimensional structures that are full of fluid.
Researchers have now created a theoretical model of water's role in muscle contraction and found that the way fluid moves through a muscle fiber determines how quickly that fiber can contract.
They also found that muscle exhibits a new kind of elasticity, called odd elasticity, that allows it to generate power through three-dimensional deformations; this is reflected in the common observation that when a muscle fiber contracts lengthwise, it also bulges outward perpendicular to its length. Their results are published in the journal Nature Physics.
These results suggest that even such basic questions as how quickly muscle can contract or how many ways muscle can generate power have new and unexpected answers when one takes a more integrated and holistic view of muscle as a complex and hierarchically organized material rather than just a bag of molecules.
Part 1
Jul 12, 2024
Dr. Krishna Kumari Challa
Muscle fibers are composed of many components, such as various proteins, cell nuclei, organelles such as mitochondria, and molecular motors such as myosin that convert chemical fuel into motion and drive muscle contraction.
All of these components form a porous network that is bathed in water. So an appropriate, coarse-grained description for muscle is that of an active sponge, say the researchers.
But the squeezing process takes time to move water around, so the researchers suspected that this movement of water through the muscle fiber set an upper limit on how rapidly a muscle fiber can twitch.
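A standard way to put a number on that intuition is the poroelastic relaxation time for fluid being squeezed through an elastic, porous network. The relation below is a generic order-of-magnitude estimate from poroelasticity, offered as a sketch rather than the specific constitutive model used in the paper:

$$\tau_{\mathrm{poro}} \sim \frac{\mu L^{2}}{k\,E}$$

where $\mu$ is the fluid viscosity, $L$ the distance the water must travel, $k$ the permeability of the protein network, and $E$ its elastic stiffness. Contractions attempted on timescales shorter than $\tau_{\mathrm{poro}}$ are resisted, because the water simply cannot move out of the way in time.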
To test their theory, they modeled muscle movements in multiple organisms across mammals, insects, birds, fish and reptiles, focusing on animals that use muscles for very fast motions. They found that sound-producing muscles, such as the one driving the rattle in a rattlesnake's tail, which can contract ten to hundreds of times per second, typically don't rely on fluid flows. Instead, these contractions are controlled by the nervous system and are more strongly dictated by molecular properties, such as the time it takes for molecular motors within cells to bind and generate forces.
But in smaller organisms, such as flying insects that beat their wings a few hundred to a thousand times per second, these contractions are too fast for neurons to control directly. Here, fluid flows are more important.
In these cases, the researchers found that fluid flows within the muscle fiber are important and their mechanism of active hydraulics is likely to limit the fastest rates of contraction.
The researchers also found that when muscle fibers act as an active sponge, the process also causes the muscles to act as an active elastic engine. When something is elastic, such as a rubber band, it stores energy as it tries to resist deformation. Imagine holding a rubber band between two fingers and pulling it back.
When you release the rubber band, the band also releases the energy stored when it was being stretched. In this case, energy is conserved—a basic law of physics that dictates that the amount of energy within a closed system should remain the same over time.
But muscle converts chemical fuel into mechanical work, so it can produce net mechanical energy like an engine; because it is continuously supplied with fuel rather than being a closed system, the simple energy-storage picture of a passive elastic band no longer applies. In this case, muscle shows a new property called "odd elasticity," where its response when squashed in one direction versus another is not reciprocal.
Unlike the rubber band, when muscle contracts and relaxes along its length, it also bulges out perpendicularly, and its energy does not stay the same. This allows muscle fibers to generate power from repetitive deformations, behaving as a soft engine.
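To spell out what "not reciprocal" means formally: in linear elasticity, stresses respond to strains through a stiffness matrix, and an "odd" material is one whose stiffness matrix has an antisymmetric part. The two-mode relation below is a generic sketch of the idea, not the constitutive model fitted in the paper:

$$\sigma_i = K_{ij}\,\varepsilon_j,\qquad K_{12}\neq K_{21},\qquad W_{\mathrm{cycle}}=\oint \sigma_i\,\mathrm{d}\varepsilon_i=(K_{21}-K_{12})\,A,$$

where $A$ is the signed area enclosed by the deformation cycle in the $(\varepsilon_1,\varepsilon_2)$ plane. For an ordinary elastic material the matrix is symmetric and this integral vanishes; a nonzero antisymmetric part is what lets a repeating contract-and-bulge cycle output net work, with the chemical fuel consumed by the molecular motors paying the energy bill.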
These results are in contrast to prevailing thought, which focuses on molecular details and neglects the fact that muscles are long and filamentous, are hydrated, and have processes on multiple scales.
Altogether, the results suggest that a revised view of how muscle functions is essential to understanding its physiology. This is also crucial for understanding the origins, extent and limits that underlie the diverse forms of animal movement.
Suraj Shankar et al, Active hydraulics and odd elasticity of muscle fibres, Nature Physics (2024). DOI: 10.1038/s41567-024-02540-x
Part 2
Jul 12, 2024
Dr. Krishna Kumari Challa
Monkey malaria is infecting people
Malaysia was on the brink of eliminating malaria – then a new parasite swung out of the jungle
A new malaria parasite comes from monkeys. With thousands already infected, experts fear it could one day spread between humans.
Malaysia has all but eliminated human malaria, but another variety is spilling out of the rainforest and infecting people: monkey malaria. Around 25,000 people have been infected with the Plasmodium knowlesi parasite since 2011, which causes nausea, fever and sometimes death. Deforestation has driven a spike in cases, pushing monkeys, mosquitoes and people into closer proximity. The disease isn't limited to southeast Asia, and as it spreads, so too does the chance that monkey malaria will adapt to spread between humans.
https://www.telegraph.co.uk/global-health/science-and-disease/monke...
Jul 12, 2024
Dr. Krishna Kumari Challa
How galaxies avoid early death
Galaxies avoid an early death because they have a "heart and lungs" which effectively regulate their "breathing" and prevent them from growing out of control, a new study suggests.
If they didn't, the universe would have aged much faster than it has and all we would see today is huge "zombie" galaxies teeming with dead and dying stars.
That's according to a new study published in the Monthly Notices of the Royal Astronomical Society, which investigates one of the great mysteries of the universe—why galaxies are not as large as astronomers would expect.
Something appears to be stifling their enormous potential by limiting the amount of gas they absorb to convert into stars, meaning that instead of endlessly growing, something inside resists what was thought to be the inevitable pull of gravity.
Now, astrophysicists think they may have uncovered the secret. They suggest that galaxies could in fact control the rate at which they grow through how they "breathe."
In their analogy, the researchers compared the supermassive black hole at the center of a galaxy to its heart, and the two bi-polar supersonic jets of gas and radiation they emit to airways feeding a pair of lungs.
Pulses from the black hole—or "heart"—can lead to jet shock fronts oscillating back and forth along both jet axes, much like the thoracic diaphragm in the human body moves up and down inside a chest cavity to inflate and deflate both lungs.
This can result in jet energy being transmitted widely into the surrounding medium, just as we breathe out warm air, resulting in slowing galaxy gas-accretion and growth.
The phenomenon has terrestrial analogues: the sound and shock waves produced when opening a bottle of champagne, the screech of a car, rocket exhausts, and the puncture of pressurized enclosures.
These supersonic jets might help in inhibiting galaxy growth.
The researchers concluded that a galaxy's lifespan can be extended with the help of its "heart and lungs," where the supermassive black hole engine at its core helps inhibit growth by limiting the amount of gas collapsing into stars from an early stage. This, the researchers say, has helped create the galaxies we see today.
Without such a mechanism, galaxies would have exhausted their fuel by now and fizzled out, as some do in the form of "red and dead" or "zombie" galaxies.
Carl Richards et al, Simulations of Pulsed Over-Pressure Jets: Formation of Bellows and Ripples in Galactic Environments, Monthly Notices of the Royal Astronomical Society (2024). DOI: 10.1093/mnras/stae1498
Jul 13, 2024
Dr. Krishna Kumari Challa
Scientists identify possible way to block muscle fatigue in long COVID, other diseases
Infections and neurodegenerative diseases cause inflammation in the brain. But for unknown reasons, patients with brain inflammation often develop muscle problems that seem to be independent of the central nervous system. Now, researchers have revealed how brain inflammation releases a specific protein that travels from the brain to the muscles and causes a loss of muscle function.
The study, in fruit flies and mice, also identified ways to block this process, which could have implications for treating or preventing the muscle wasting sometimes associated with inflammatory diseases including bacterial infections, Alzheimer's disease and long COVID.
The study is published July 12 in the journal Science Immunology.
The study suggests that when we get sick, messenger proteins from the brain travel through the bloodstream and reduce energy levels in skeletal muscle. This is more than a lack of motivation to move because we don't feel well. These processes reduce energy levels in skeletal muscle, decreasing the capacity to move and function normally.
To investigate the effects of brain inflammation on muscle function, the researchers modeled three different types of diseases—an E. coli bacterial infection, a SARS-CoV-2 viral infection and Alzheimer's.
When the brain is exposed to inflammatory proteins characteristic of these diseases, damaging chemicals called reactive oxygen species build up. The reactive oxygen species cause brain cells to produce an immune-related molecule called interleukin-6 (IL-6), which travels throughout the body via the bloodstream.
The researchers found that IL-6 in mice—and the corresponding protein in fruit flies—reduced energy production in muscles' mitochondria, the energy factories of cells.
Flies and mice that had COVID-associated proteins in the brain showed reduced motor function—the flies didn't climb as well as they should have, and the mice didn't run as well or as much as control mice.
The researchers saw similar effects on muscle function when the brain was exposed to bacterial-associated proteins and the Alzheimer's protein amyloid beta. They also saw evidence that this effect can become chronic: even when an infection was cleared quickly, the reduced muscle performance persisted for many days in their experiments.
The bacterial brain infection meningitis is known to increase IL-6 levels and can be associated with muscle issues in some patients, for instance. Among COVID-19 patients, inflammatory SARS-CoV-2 proteins have been found in the brain during autopsy, and many long COVID patients report extreme fatigue and muscle weakness even long after the initial infection has cleared.
Patients with Alzheimer's disease also show increased levels of IL-6 in the blood as well as muscle weakness.
Part 1
Jul 13, 2024
Dr. Krishna Kumari Challa
The study pinpoints potential targets for preventing or treating muscle weakness related to brain inflammation. The researchers found that IL-6 activates what is called the JAK-STAT pathway in muscle, and this is what causes the reduced energy production of mitochondria.
Several therapeutics already approved by the FDA for other diseases can block this pathway. JAK inhibitors as well as several monoclonal antibodies against IL-6 are approved to treat various types of arthritis and manage other inflammatory conditions.
The researchers are not sure why the brain produces a protein signal that is so damaging to muscle function across so many different disease categories.
They speculate that, despite the damage it does, this process may have persisted over the course of human evolution because it allows the brain to reallocate resources to itself as it fights off disease. More research is needed to better understand this process and its consequences throughout the body.
Shuo Yang et al, Infection and chronic disease activate a systemic brain-muscle signaling axis, Science Immunology (2024). DOI: 10.1126/sciimmunol.adm7908. www.science.org/doi/10.1126/sciimmunol.adm7908
Part 2
Jul 13, 2024
Dr. Krishna Kumari Challa
First fossil chromosomes discovered
Scientists have discovered intact chromosomes preserved in the skin of a woolly mammoth (Mammuthus primigenius) that met its end some 50,000 years ago — a feat previously thought to be impossible. The team also revealed the spatial organization of the mammoth’s DNA molecules and the active genes in its skin, including one responsible for giving the animal its fuzzy appearance. The study is the first to report the 3D structure of an ancient genome.
https://www.cell.com/cell/fulltext/S0092-8674(24)00642-1?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS0092867424006421%3Fshowall%3Dtrue
Jul 13, 2024
Dr. Krishna Kumari Challa
Brain size riddle solved as humans exceed evolutionary trend
The largest animals do not have proportionally bigger brains—with humans bucking this trend—a study published in Nature Ecology & Evolution has revealed.
Researchers collected an enormous dataset of brain and body sizes from around 1,500 species to clarify centuries of controversy surrounding brain size evolution.
Bigger brains relative to body size are linked to intelligence, sociality, and behavioral complexity—with humans having evolved exceptionally large brains. The new research reveals the largest animals do not have proportionally bigger brains, challenging long-held beliefs about brain evolution.
For more than a century, scientists have assumed that this relationship was a simple straight line on a logarithmic scale, meaning that brain size grows in constant proportion as animals get larger. We now know this is not true. The relationship between brain and body size is a curve, essentially meaning that very large animals have smaller brains than expected.
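One simple way to express such curvature is to add a quadratic term to the usual log-log allometry; the study itself fits a more sophisticated evolutionary model, so the form below is only an illustrative sketch:

$$\log(\text{brain mass}) = a + b\,\log(\text{body mass}) + c\,[\log(\text{body mass})]^{2},\qquad c<0,$$

so that the effective slope $b + 2c\,\log(\text{body mass})$ shrinks for the largest-bodied species, which is exactly the "smaller brains than expected" pattern described above.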
The research reveals a simple association between brain and body size across all mammals, which allowed the researchers to identify the rule-breakers: species that challenge the norm.
Among these outliers is our own species, Homo sapiens, whose relative brain size has evolved more than 20 times faster than that of all other mammal species, resulting in the massive brains that characterize humanity today. But humans are not the only species to buck this trend.
All groups of mammals demonstrated rapid bursts of change—both towards smaller and larger brain sizes. For example, bats very rapidly reduced their brain size when they first arose, but then showed very slow rates of change in relative brain size, suggesting there may be evolutionary constraints related to the demands of flight.
There are three groups of animals that showed the most pronounced rapid change in brain size: primates, rodents, and carnivores. In these three groups, there is a tendency for relative brain size to increase over time (the so-called "Marsh-Lartet rule"). As the study notes, this trend is not universal across all mammals, as was previously thought.
Part 1
Jul 15, 2024
Dr. Krishna Kumari Challa
These new results reveal a mystery. In the largest animals, there is something preventing brains from getting too big. Whether this is because big brains beyond a certain size are simply too costly to maintain remains to be seen. But as we also observe similar curvature in birds, the pattern seems to be a general phenomenon—what causes this 'curious ceiling' applies to animals with very different biology.
Chris Venditti et al, Co-evolutionary dynamics of mammalian brain and body size, Nature Ecology & Evolution (2024). DOI: 10.1038/s41559-024-02451-3
Part 2
Jul 15, 2024
Dr. Krishna Kumari Challa
New study finds 40% of cancer cases and almost half of all deaths linked to modifiable risk factors
A study led by researchers at the American Cancer Society (ACS) finds four in 10 cancer cases and about one-half of all cancer deaths in adults 30 years old and older could be attributed to modifiable risk factors, including cigarette smoking, excess body weight, alcohol consumption, physical inactivity, diet, and infections.
Cigarette smoking was by far the leading risk factor, contributing to nearly 20% of all cancer cases and 30% of all cancer deaths. The findings are published in the journal CA: A Cancer Journal for Clinicians.
The risk factors included cigarette smoking (current and former smoking); secondhand smoke; excess body weight; alcohol consumption; consumption of red and processed meat; low consumption of fruits and vegetables, dietary fiber, and dietary calcium; physical inactivity; ultraviolet (UV) radiation; and infection with Epstein-Barr virus (EBV), Helicobacter pylori, hepatitis B virus (HBV), hepatitis C virus (HCV), human herpes virus-8 (HHV-8; also called Kaposi sarcoma herpesvirus), human immunodeficiency virus (HIV), and human papillomavirus (HPV).
CA: A Cancer Journal for Clinicians (2024)
Jul 15, 2024
Dr. Krishna Kumari Challa
Introducing co-cultures: When co-habiting animal species share culture
Cooperative hunting, resource sharing, and using the same signals to communicate the same information—these are all examples of cultural sharing that have been observed between distinct animal species. In an opinion piece published June 19 in the journal Trends in Ecology & Evolution, researchers introduce the term "co-culture" to describe cultural sharing between animal species. These relationships are mutual and go beyond one species watching and mimicking another species' behavior—in co-cultures, both species influence each other in substantial ways.
"Co-culture challenges the notion of species-specific culture, underscoring the complexity and interconnectedness of human and animal societies, and between animal societies," write the authors.
These cross-species interactions result in behavioral adaptations and preferences that are not just incidental but represent a form of convergent evolution.
Co-cultures have been observed between humans and nonhuman animals—for example, between humans and honeyguides in Tanzania and Mozambique, where the birds lead humans to honeybee nests. They are also evident between different species of nonhuman animals—for example, cooperative scavenging between ravens and wolves, cooperative hunting between false killer whales and bottlenose dolphins, and signal sharing between distinct species of tamarin. Ultimately, this inter-species sharing of culture could drive evolution, the researchers say.
"Cultural behaviors that enhance survival or reproductive success in a particular setting can lead to changes in population habits that, over time, could drive genetic selection," they write.
To extend our understanding of co-cultures, the researchers say that future studies could start by investigating wild animals in urban environments.
Part 1
Jul 15, 2024
Dr. Krishna Kumari Challa
"Urban animals modify their behaviors, learning, and problem-solving skills to cope with urban challenges, reflecting a dynamic response to urban landscapes," they write. "Similarly, humans alter their urban spaces, influencing wildlife behavior and evolution. This reciprocal adaptation between humans and wildlife is fundamental to understanding co-culture."
----
Future research is also needed to examine the possibility of cultural and genetic co-evolution—the idea that species' cultures and genomes are evolving in concert. A key question, the researchers say, is "In the context of co-culture, how do cultural adaptations influence genetic evolution, and vice versa, across different species and environments?"
----
Cédric Sueur et al, Co-cultures: exploring interspecies culture among humans and other animals, Trends in Ecology & Evolution (2024). DOI: 10.1016/j.tree.2024.05.011
Part 2
Jul 15, 2024
Dr. Krishna Kumari Challa
Study Finds Life on Earth Emerged 4.2 Billion Years Ago
Once upon a time, Earth was barren. Everything changed when, somehow, out of the chemistry available early in our planet's history, something started squirming – processing available matter to survive, to breed, to thrive.
What that something was, and when it first squirmed, have been burning questions that have puzzled humanity probably for as long as we've been able to ask "what am I?"
Now, a new study has found some answers – and life emerged surprisingly early.
Earth, for context, is around 4.5 billion years old. That means life first emerged when the planet was still practically a newborn.
Back when it was new, Earth was a very different place, with an atmosphere that we would find extremely toxic today. Oxygen, in the amounts current life seems to need, didn't begin to accumulate until relatively late in the planet's history, starting around 3 billion years ago at the earliest.
But life emerged prior to that; we have fossils of microbes from 3.48 billion years ago. And scientists think that conditions on Earth may have been stable enough to support life from around 4.3 billion years ago.
But our planet is subject to erosional, geological, and organic processes that make evidence of that life, from that time, almost impossible to find.
So a team of scientists went looking somewhere else: in the genomes of living organisms, and in the fossil record.
Their study is based on something called a molecular clock. Basically, we can estimate the rate at which mutations accumulate and count the number of genetic differences between lineages to determine how much time has passed since the organisms in question diverged from common ancestors.
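At its simplest, the molecular-clock arithmetic is just distance divided by rate, doubled because both lineages accumulate changes after they split. The numbers below are purely illustrative; the study itself uses calibrated Bayesian models anchored to fossils, not this naive formula:

```python
# Naive molecular-clock estimate: after two lineages split, each accumulates
# substitutions independently, so divergence_time ~ distance / (2 * rate).
# Illustrative numbers only; these are not values from the study.
substitutions_per_site = 0.8     # observed genetic distance between two lineages
rate_per_site_per_byr = 0.1      # assumed substitution rate, per site per billion years

divergence_time_byr = substitutions_per_site / (2 * rate_per_site_per_byr)
print(f"Estimated divergence: ~{divergence_time_byr:.1f} billion years ago")  # ~4.0
```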
All organisms, from the humblest microbe to the mightiest fungus, have some things in common. There's a universal genetic code. The way we make proteins is the same. There's an almost universal set of 20 amino acids that are all oriented the same way. And all living organisms use adenosine triphosphate (ATP) as a source of energy in their cells.
Researchers worked out, based on these similarities and differences, how long it has been since the descendants of LUCA, the last universal common ancestor of all life, started to diverge. And, using complex evolutionary modeling, they were able to learn more about LUCA itself: what it was, and how it survived on an Earth so very inhospitable to its descendants.
Part 1
Jul 15, 2024
Dr. Krishna Kumari Challa
LUCA, they found, was probably very similar to a prokaryote, a single-celled organism that doesn't have a nucleus. It was obviously not reliant on oxygen, since there would have been little oxygen available; that's not unexpected for a microbe. As such, its metabolic processes probably produced acetate.
But there was something else interesting. LUCA appears not to have been alone.
This study showed that LUCA was a complex organism, not too different from modern prokaryotes.
But what is really interesting is that it's clear it possessed an early immune system, showing that even by 4.2 billion years ago, our ancestor was engaging in an arms race with viruses.
Because LUCA's metabolic processes would have produced waste products that other lifeforms could use, those lifeforms could have emerged not long after LUCA did.
This implies that it takes relatively little time for a full ecosystem to emerge in the evolutionary history of a planet – a finding that has implications far beyond our own little pale blue dot.
https://www.nature.com/articles/s41559-024-02461-1
Part 2
Jul 15, 2024
Dr. Krishna Kumari Challa
The Exact Part of The Brain Behind Your Curiosity
Being curious is a quintessential part of being human, driving us to learn and adapt to new environments. For the first time, scientists have pinpointed the spot in the brain where curiosity emerges.
The discovery was made by researchers who used functional magnetic resonance imaging (fMRI) scans to measure oxygen levels in different parts of the brain, indicating how busy each region is at any one time.
Knowing where curiosity originates could help us understand more about how human beings tick, and potentially lead to therapies for conditions where curiosity is lacking, such as chronic depression.
This is really the first time we can link the subjective feeling of curiosity about information to the way your brain represents that information.
In the experiments, which combined curiosity tasks with fMRI scans, notable activity was spotted in three regions: the occipitotemporal cortex (linked to vision and object recognition), the ventromedial prefrontal cortex or vmPFC (which manages perceptions of value and confidence), and the anterior cingulate cortex (used for information gathering).
The vmPFC appears to act as a sort of neurological bridge between levels of certainty recorded by the occipitotemporal cortex, and subjective feelings of curiosity – almost like a trigger telling us when to be curious. The less confident the volunteers were about the image subject, the more curious they were about it.
"These results illuminate how perceptual input is transformed by successive neural representations to ultimately evoke a feeling of curiosity," write the researchers in their published paper.
https://www.jneurosci.org/content/early/2024/07/04/JNEUROSCI.0974-2...
Jul 15, 2024
Dr. Krishna Kumari Challa
Study identifies epigenetic 'switches' that regulate the developmental trajectories of single cells
Individual cells in the human body develop progressively over time, ultimately becoming specialized in specific functions. This process, known as cell differentiation or specialization, is central to the formation of distinct cell populations that serve different purposes.
Past studies suggest that the fate of cells is also modulated by epigenetic mechanisms (i.e., interactions between genes and environmental factors). These epigenetic mechanisms, however, have so far proved difficult to pinpoint.
Researchers recently carried out a study aimed at exploring the epigenetic processes influencing the developmental trajectories of individual cells, using human brain and retina organoids derived from pluripotent stem cells.
Their paper, published in Nature Neuroscience, outlines a single-cell epigenome-wide map that could aid the study of human cell fate determination.
This paper was inspired by the need to better understand the epigenetic mechanisms that regulate cell fate decisions during human brain and retina development.
The primary objective was to create a comprehensive single-cell epigenomic map that could capture the transitions from pluripotent stem cells to differentiated neural cells.
To study the epigenetic mechanisms underpinning the diversification of human cells, the researchers carried out experiments on human brain and retina organoids: three-dimensional (3D), miniaturized versions of human organs grown in the laboratory from pluripotent stem cells.
Part 1
Jul 16, 2024
Dr. Krishna Kumari Challa
Using epigenetic techniques, the researchers were able to track how three key histone modifications changed as cells transitioned through different stages of development. The observations gathered in their experiments allowed them to identify dynamic epigenetic "switches" that regulate the fate of individual cells.
They discovered that the switching of repressive and activating histone modifications happens before cell fate decisions.
Additionally, they demonstrated that removing H3K27me3 at the neuroectoderm stage disrupts fate restriction, leading to aberrant cell identities.
Fides Zenk et al, Single-cell epigenomic reconstruction of developmental trajectories from pluripotency in human neural organoid systems, Nature Neuroscience (2024). DOI: 10.1038/s41593-024-01652-0
Part 2
Jul 16, 2024
Dr. Krishna Kumari Challa
How climate change is altering the Earth's rotation
For the first time, researchers have been able to fully explain the various causes of long-term polar motion in the most comprehensive modeling to date, using AI methods. Their model and their observations show that climate change and global warming will have a greater influence on the Earth's rotational speed than the effect of the moon, which has determined the increase in the length of the day for billions of years.
Climate change is causing the ice masses in Greenland and Antarctica to melt. Water from the polar regions is flowing into the world's oceans—and especially into the equatorial region.
This means that a shift in mass is taking place, and this is affecting the Earth's rotation.
It's like when a figure skater does a pirouette, first holding her arms close to her body and then stretching them out. The initially fast rotation becomes slower because the masses move away from the axis of rotation, increasing her rotational inertia.
In physics, we speak of the law of conservation of angular momentum, and this same law also governs the Earth's rotation. If the Earth turns more slowly, the days get longer. Climate change is therefore also altering the length of the day on Earth, albeit only minimally for now.
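The figure-skater argument can be written in one line. With no external torque, the angular momentum $L = I\omega$ is conserved, so a fractional increase in the moment of inertia $I$ (mass moving from the poles toward the equator) produces an equal fractional decrease in the spin rate $\omega$, and hence an equal fractional lengthening of the day:

$$L = I\omega = \text{const}\;\Rightarrow\;\frac{\Delta\omega}{\omega} = -\frac{\Delta I}{I}\;\Rightarrow\;\frac{\Delta T_{\text{day}}}{T_{\text{day}}} = +\frac{\Delta I}{I}.$$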
Another cause of this slowdown is tidal friction, which is triggered by the moon. However, the new study comes to a surprising conclusion: if humans continue to emit more greenhouse gases and the Earth warms up accordingly, this would ultimately have a greater influence on the Earth's rotational speed than the effect of the moon, which has determined the increase in the length of the day for billions of years.
We humans have a greater impact on our planet than we realize and this naturally places great responsibility on us for the future of our planet.
However, shifts in mass on the Earth's surface and in its interior caused by the melting ice not only change the Earth's rotational speed and the length of day: as the researchers show in Nature Geoscience, they also alter the axis of rotation. This means that the points where the axis of rotation actually meets the Earth's surface move.
Researchers can observe this polar motion, which, over a longer timeframe, comes to some ten meters per hundred years. It's not only the melting of the ice sheets that plays a role here, but also movements taking place in the Earth's interior.
Part 1
Jul 16, 2024
Dr. Krishna Kumari Challa
Deep in the Earth's mantle, where the rock becomes viscous due to high pressure, displacements occur over long periods of time. And there are also heat flows in the liquid metal of Earth's outer core, which are responsible for both generating the Earth's magnetic field and leading to shifts in mass.
In the most comprehensive modeling to date, researchers have now shown how polar motion results from individual processes in the core, in the mantle and from the climate at the surface.
One finding in particular that stands out in their study is that the processes on and in the Earth are interconnected and influence each other. Climate change is causing the Earth's axis of rotation to move, and it appears that the feedback from the conservation of angular momentum is also changing the dynamics of the Earth's core.
Ongoing climate change could therefore even be affecting processes deep inside the Earth and have a greater reach than previously assumed. However, there is little cause for concern, as these effects are minor and it's unlikely that they pose a risk.
Implications for space travel
Even if the Earth's rotation is changing only slowly, this effect has to be taken into account when navigating in space—for example, when sending a space probe to land on another planet. Even a slight deviation of just one centimeter on Earth can grow to a deviation of hundreds of meters over the huge distances involved.
"Otherwise, it won't be possible to land in a specific crater on Mars".
Kiani Shahvandi, Mostafa, The increasingly dominant role of climate change on length of day variations, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2406930121. doi.org/10.1073/pnas.2406930121
Mostafa Kiani Shahvandi et al, Contributions of core, mantle and climatological processes to Earth's polar motion, Nature Geoscience (2024). DOI: 10.1038/s41561-024-01478-2. www.nature.com/articles/s41561-024-01478-2
Part 2
Jul 16, 2024
Dr. Krishna Kumari Challa
Study finds abortion restrictions harm mental health, with low-income women hardest hit
People living in states that enacted tighter abortion restrictions in the wake of the Dobbs v. Jackson Women's Health decision, which returned regulation of abortion access to state legislatures, are more likely to report elevated levels of mental distress. This is particularly true for people of lower socioeconomic means.
These are the key takeaways of a July 2024 paper published in Science Advances.
Jul 16, 2024
Dr. Krishna Kumari Challa
Loss of oxygen in bodies of water identified as new tipping point
Oxygen concentrations in our planet's waters are decreasing rapidly and dramatically—from ponds to the ocean. The progressive loss of oxygen threatens not only ecosystems, but also the livelihoods of large sectors of society and the entire planet, according to the authors of an international study involving GEOMAR published recently in Nature Ecology & Evolution.
Oxygen is a fundamental requirement of life on planet Earth. The loss of oxygen in water, also referred to as aquatic deoxygenation, is a threat to life at all levels. The international team of researchers describes how ongoing deoxygenation presents a major threat to the livelihoods of large parts of society and for the stability of life on our planet.
Previous research has identified a suite of global scale processes, referred to as planetary boundaries, that regulate the overall habitability and stability of the planet. If critical thresholds in these processes are passed, the risk of large-scale, abrupt or irreversible environmental changes ("tipping points") increases and the resilience of our planet, its stability, is jeopardized.
Among the nine planetary boundaries are climate change, land use change, and biodiversity loss. The authors of the new study argue that aquatic deoxygenation both responds to, and regulates, other planetary boundary processes.
Across all aquatic ecosystems, from streams and rivers, lakes, reservoirs, and ponds to estuaries, coasts, and the open ocean, dissolved oxygen concentrations have rapidly and substantially declined in recent decades.
Lakes and reservoirs have experienced oxygen losses of 5.5% and 18.6% respectively since 1980. The ocean has experienced oxygen losses of around 2% since 1960. Although this number sounds small, due to the large ocean volume it represents an extensive mass of oxygen lost.
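To put the "small-sounding" ocean figure in perspective, here is a rough order-of-magnitude conversion; the total dissolved-oxygen inventory used below (about 225 petamoles) is a commonly cited estimate and an assumption of this sketch, not a number from the paper:

```python
# Order-of-magnitude mass of oxygen corresponding to a ~2% ocean-wide loss.
# ASSUMPTION: total dissolved-O2 inventory of ~225 Pmol (a commonly cited
# estimate); this value is not taken from the study summarized above.
total_o2_mol = 225e15          # ~225 petamoles of dissolved O2
fraction_lost = 0.02
molar_mass_o2_kg = 0.032       # kg per mole of O2

mass_lost_kg = total_o2_mol * fraction_lost * molar_mass_o2_kg
print(f"~{mass_lost_kg:.1e} kg of O2 (~{mass_lost_kg / 1e12:.0f} billion tonnes) lost")
```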
Marine ecosystems have also experienced substantial variability in oxygen depletion.
The volumes of aquatic ecosystems affected by oxygen depletion have increased dramatically across all types.
The causes of aquatic oxygen loss are global warming due to greenhouse gas emissions and the input of nutrients as a result of land use.
Part 1
Jul 16, 2024
Dr. Krishna Kumari Challa
If water temperatures rise, the solubility of oxygen in the water decreases. In addition, global warming enhances stratification of the water column, because warmer, low-salinity water with a lower density lies on top of the colder, saltier deep water below.
This hinders the exchange of the oxygen-poor deep layers with the oxygen-rich surface water. In addition, nutrient inputs from land support algal blooms, which lead to more oxygen being consumed as more organic material sinks and is decomposed by microbes at depth.
Areas in the sea where there is so little oxygen that fish, mussels or crustaceans can no longer survive threaten not only the organisms themselves, but also ecosystem services such as fisheries, aquaculture, tourism and cultural practices.
Microbial processes in oxygen-depleted regions also increasingly produce potent greenhouse gases such as nitrous oxide and methane, which can further amplify global warming, itself a major driver of oxygen depletion.
The authors warn: We are approaching critical thresholds of aquatic deoxygenation that will ultimately affect several other planetary boundaries.
Failure to address aquatic deoxygenation will, ultimately, not only affect ecosystems but also economic activity, and society at a global level.
Kevin C. Rose et al, Aquatic deoxygenation as a planetary boundary and key regulator of Earth system stability, Nature Ecology & Evolution (2024). DOI: 10.1038/s41559-024-02448-y
Part 2
Jul 16, 2024