New models of the Big Bang show that the visible universe and invisible dark matter co-evolved
Physicists have long theorized that our universe may not be limited to what we can see. By observing gravitational forces on other galaxies, they've hypothesized the existence of "dark matter," which would be invisible to conventional forms of observation.
Roughly 95% of the universe is dark: invisible to the eye.
However, we know that the dark universe is there by its gravitational pull on stars. Other than its gravity, dark matter has never seemed to have much effect on the visible universe.
Yet the relationship between these visible and invisible domains, especially as the universe first formed, has remained an open question.
Now, physicists say that there is mounting evidence that these two supposedly distinct realms actually co-evolved.
Through a series of computer models, physicists have discovered that the visible and the hidden sectors, as they call them, likely co-evolved in the moments after the Big Bang, with profound repercussions for how the universe developed thereafter.
They say there was a time when some physicists effectively wrote off this hidden sector, since they could explain most of what happens within the visible one. That is, if our models can accurately portray what we can see happening around us, why bother trying to measure something that has no discernible effect?
"The question is, what's the influence of the hidden sector on the visible sector?" the thinking went. "But what do we care? We can explain everything."
But we can't explain everything, the authors of this study contend. There are anomalies that don't seem to fit the so-called "Standard Model" of the universe.
That the visible and hidden sectors are mutually isolated is a misconception, they say, based on an assumption "that the visible and the hidden sectors evolved independently of each other." The new work wants to turn that assumption on its head.
In a paper published in Physical Review D, "Big Bang Initial Conditions and Self-Interacting Hidden Dark Matter," they ask what they call "the more important question: How do we know that they evolved independently?"
To test this assumption, Nath and his team "introduced some feeble interactions" between the two sectors into their models of the Big Bang. These meager interactions wouldn't be enough to affect the outcome of, say, particle accelerator experiments, "but we wanted to see what the effects would be on the visible sector as a whole," Nath says, "from the time of the Big Bang to the current time."
Even with minimal interactions between the two sectors, Nath and his team discovered that dark matter's influence on the visible matter we're made of could have a major impact on observable phenomena.
The rate of the Hubble expansion, which describes, in the simplest terms, how galaxies are moving away from one another as the universe expands, shows a "quite serious" discrepancy between what the Standard Model predicts and what has been observed. Nath's models partially account for this difference.
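For context, the discrepancy referred to here is usually framed as the "Hubble tension." The figures below are commonly cited literature values, given only as rough illustration; they are not taken from this paper:

```latex
% Commonly cited Hubble-constant values (illustrative context only, not from this study):
% early-universe inference (CMB plus standard cosmology) vs. local distance-ladder measurements.
\[
H_0^{\mathrm{early}} \approx 67.4~\mathrm{km\,s^{-1}\,Mpc^{-1}},
\qquad
H_0^{\mathrm{local}} \approx 73~\mathrm{km\,s^{-1}\,Mpc^{-1}},
\]
\[
\Delta H_0 \approx 5\text{--}6~\mathrm{km\,s^{-1}\,Mpc^{-1}} \;(\text{roughly } 8\%),
\]
% a gap several times larger than the quoted uncertainty of either measurement.
```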
One major variable is the temperature of the hidden sector during the Big Bang.
The visible sector, we can be fairly certain, started out very hot at the moment of the Big Bang. As the universe cools, Nath says, "what we see is the remnant of that period of the universe."
But by studying the evolution of the two sectors, Nath and his team could model both scenarios: a hidden sector that started out hot, and one that started out cold.
What they observed was surprising: Despite significant differences between the models, with major implications for what the universe looked like at early times, both the hot and cold models were consistent with the visible sector we can observe today.
Our current measurements of the visible universe, in other words, are insufficient to confirm which side the hidden sector fell on at the beginning—hot or cold.
Nath is quick to point out that, rather than a failing of the experiment, this is an example of the mathematical models outrunning our current experimental capabilities.
It's not that the difference between a hot or cold hidden sector has no bearing on the visible universe, but that we haven't performed experiments—yet—with high enough precision. Nath mentions the Webb Telescope as one example of the next generation of tools that will be able to make such precise observations.
The ultimate goal of all this modeling work is to make better predictions about the state of the universe, how it all functions and what we'll find when we look deeper and deeper into the night sky.
As our experiments gain more accuracy, the questions baked into Nath's models—was the hidden sector hot or cold?—will find their answers, and those clarified models will help predict the solutions to ever-deeper questions. "What's the significance of this?" Nath asks. Human beings, he says, "want to find their place in the universe." And more than that, "they want to answer the question, why is there a universe?
"And we are exploring those issues. It is the ultimate quest of human beings."
Jinzheng Li et al, Big bang initial conditions and self-interacting hidden dark matter, Physical Review D (2023). DOI: 10.1103/PhysRevD.108.115008
Study suggests that living near green spaces reduces the risk of depression and anxiety
Over the past decades, a growing number of people have migrated to urban areas, while the size and population of rural areas have drastically declined. While parks and other green spaces are often viewed as beneficial for the well-being of those living in cities and urban regions, so far very few studies have explored the impact of these spaces on mental health.
Researchers recently carried out a study investigating the potential link between long-term exposure to green spaces in proximity of one's home and two of the most common mental health disorders: depression and anxiety. Their findings, published in Nature Mental Health, suggest that living close to parks and green areas can reduce the risk of becoming depressed and experiencing anxiety.
As part of their study, the researchers analyzed data gathered from 409,556 people and stored in the UK Biobank database. They specifically looked at the distance between participants' homes and green areas, in conjunction with their self-reported well-being scores, as well as hospitalizations and deaths in their residential area.
The results of the analyses point to a link between prolonged proximity to residential green areas and a lower incidence of both depression and anxiety: living closer to parks and other green spaces appears to reduce the risk of developing either condition.
Researchers draw the important conclusion that long-term exposure to residential greenness is associated with a decreased risk of incident depression and anxiety, and reduced air pollution in the greenest areas probably plays an important role in this trend. This study thus implies that expanding urban green spaces could promote good mental health.
Jianing Wang et al, Long-term exposure to residential greenness and decreased risk of depression and anxiety, Nature Mental Health (2024). DOI: 10.1038/s44220-024-00227-z.
How light can vaporize water without the need for heat
It's the most fundamental of processes—the evaporation of water from the surfaces of oceans and lakes, the burning off of fog in the morning sun, and the drying of briny ponds that leaves solid salt behind. Evaporation is all around us, and humans have been observing it and making use of it for as long as we have existed.
And yet, it turns out, we've been missing a major part of the picture all along.
In a series of painstakingly precise experiments, a team of researchers has demonstrated that heat isn't alone in causing water to evaporate. Light, striking the water's surface where air and water meet, can break water molecules away and float them into the air, causing evaporation in the absence of any source of heat.
The astonishing new discovery could have a wide range of significant implications. It could help explain mysterious measurements over the years of how sunlight affects clouds, and therefore affect calculations of the effects of climate change on cloud cover and precipitation. It could also lead to new ways of designing industrial processes such as solar-powered desalination or drying of materials.
The findings, and the many different lines of evidence that demonstrate the reality of the phenomenon and the details of how it works, were recently described in the Proceedings of the National Academy of Sciences.
The authors say their study suggests that the effect should happen widely in nature—everywhere from clouds to fogs to the surfaces of oceans, soils, and plants—and that it could also lead to new practical applications, including in energy and clean water production.
The new work builds on research reported* last year, which described this new "photomolecular effect" but only under very specialized conditions: on the surface of specially prepared hydrogels soaked with water. In the new study, the researchers demonstrate that the hydrogel is not necessary for the process; it occurs at any water surface exposed to light, whether it's a flat surface like a body of water or a curved surface like a droplet of cloud vapour.
Because the effect was so unexpected, the team worked to prove its existence with as many different lines of evidence as possible. In this study, they report 14 different kinds of tests and measurements they carried out to establish that water was indeed evaporating—that is, molecules of water were being knocked loose from the water's surface and wafted into the air—due to the light alone, not by heat, which was long assumed to be the only mechanism involved.
One key indicator, which showed up consistently in four different kinds of experiments under different conditions, was that as the water began to evaporate from a test container under visible light, the air temperature measured above the water's surface cooled down and then leveled off, showing that thermal energy was not the driving force behind the effect.
Other key indicators that showed up included the way the evaporation effect varied depending on the angle of the light, the exact color of the light, and its polarization. None of these varying characteristics should happen because at these wavelengths, water hardly absorbs light at all—and yet the researchers observed them.
The effect is strongest when light hits the water surface at an angle of 45 degrees. It is also strongest with a certain type of polarization, called transverse magnetic polarization. And it peaks in green light—which, oddly, is the color for which water is most transparent and thus interacts the least.
* Yaodong Tu et al, Plausible photomolecular effect leading to water evaporation exceeding the thermal limit, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2312751120
Researchers have proposed a physical mechanism that can explain the angle and polarization dependence of the effect, showing that the photons of light can impart a net force on water molecules at the water surface that is sufficient to knock them loose from the body of water. But they cannot yet account for the color dependence, which they say will require further study.
They have named this the photomolecular effect, by analogy with the photoelectric effect that was discovered by Heinrich Hertz in 1887 and finally explained by Albert Einstein in 1905. That effect was one of the first demonstrations that light also has particle characteristics, which had major implications in physics and led to a wide variety of applications, including LEDs. Just as the photoelectric effect liberates electrons from atoms in a material in response to being hit by a photon of light, the photomolecular effect shows that photons can liberate entire molecules from a liquid surface, the researchers say.
The finding of evaporation caused by light instead of heat provides new, disruptive knowledge of light-water interaction. It could help us gain a new understanding of how sunlight interacts with clouds, fog, oceans, and other natural bodies of water to affect weather and climate. It also has significant potential practical applications, such as high-performance water desalination driven by solar energy.
The finding may solve an 80-year-old mystery in climate science. Measurements of how clouds absorb sunlight have often shown that they are absorbing more sunlight than conventional physics dictates possible. The additional evaporation caused by this effect could account for the longstanding discrepancy, which has been a subject of dispute since such measurements are difficult to make.
Those measurements are based on satellite and flight data: researchers fly airplanes above and below the clouds, and there are also data based on ocean temperature and the radiation balance. They all conclude that clouds absorb more sunlight than theory can account for. However, because of the complexity of clouds and the difficulty of making such measurements, researchers have debated whether the discrepancies are real. The new work suggests that there is another mechanism for cloud absorption, one that was not accounted for and that might explain the discrepancies.
There are many lines of evidence. The flat region in the air-side temperature distribution above hot water will be the easiest for people to reproduce. That temperature profile "is a signature" that demonstrates the effect clearly: it is quite hard to explain how this kind of flat temperature profile comes about without invoking some other mechanism beyond the accepted theories of thermal evaporation. The observations in the manuscript point to a new physical mechanism that fundamentally alters our thinking on the kinetics of evaporation.
Guangxin Lv et al, Photomolecular effect: Visible light interaction with air–water interface, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2320844121
New research has revealed the presence of a previously unknown molecule in space. The open-access paper describing it, "Rotational Spectrum and First Interstellar Detection of 2-Methoxyethanol Using ALMA Observations of NGC 6334I," was published in the April 12 issue of The Astrophysical Journal Letters.
Researchers worked to assemble a puzzle comprised of pieces collected from across the globe, extending beyond MIT to France, Florida, Virginia, and Copenhagen, to achieve this exciting discovery.
To detect new molecules in space, researchers first must have an idea of what molecule they want to look for, then they can record its spectrum in the lab here on Earth, and then finally they look for that spectrum in space using telescopes.
To detect this molecule using radio telescope observations, the group first needed to measure and analyze its rotational spectrum on Earth. The researchers combined experiments from the University of Lille (Lille, France), the New College of Florida (Sarasota, Florida), and the McGuire lab at MIT to measure this spectrum over a broadband region of frequencies ranging from the microwave to sub-millimeter wave regimes (approximately 8 to 500 gigahertz).
The data gleaned from these measurements permitted a search for the molecule using Atacama Large Millimeter/submillimeter Array (ALMA) observations toward two separate star-forming regions: NGC 6334I and IRAS 16293-2422B. Members of the group analyzed these telescope observations alongside researchers at the National Radio Astronomy Observatory (Charlottesville, Virginia) and the University of Copenhagen, Denmark.
Ultimately, they observed 25 rotational lines of 2-methoxyethanol that lined up with the molecular signal observed toward NGC 6334I (the barcode matched), resulting in a secure detection of 2-methoxyethanol in this source. This allowed them to derive physical parameters of the molecule toward NGC 6334I, such as its abundance and excitation temperature, and to investigate possible chemical formation pathways from known interstellar precursors.

Molecular discoveries like this one help the researchers to better understand the development of molecular complexity in space during the star formation process. 2-methoxyethanol, which contains 13 atoms, is quite large by interstellar standards: as of 2021, only six species larger than 13 atoms had been detected outside the solar system, many by this research group, and all of them existing as ringed structures. Continued observations of large molecules, and subsequent derivations of their abundances, allow scientists to advance our knowledge of how efficiently large molecules can form and by which specific reactions they may be produced.
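The "barcode matching" step can be pictured with a toy calculation. The sketch below is purely illustrative: the rest frequencies, source velocity, peak list, and tolerance are invented values, not the actual 2-methoxyethanol line catalog or the NGC 6334I data.

```python
import numpy as np

# Toy sketch of matching laboratory rest frequencies of a candidate molecule's
# rotational lines against peaks found in an astronomical spectrum.
# All numbers below are invented for illustration.

C_KM_S = 299_792.458            # speed of light, km/s
v_lsr = -7.0                    # assumed source velocity (km/s), shifts the lines

lab_lines_ghz = np.array([91.23, 103.47, 115.86, 128.02, 140.55])   # "lab" rest lines
obs_peaks_ghz = np.array([91.2320, 103.4725, 120.100, 128.0229,     # "telescope" peaks
                          140.5534, 150.300])

# Doppler-shift the lab lines into the observed frame: f_obs = f_rest * (1 - v/c)
expected_ghz = lab_lines_ghz * (1 - v_lsr / C_KM_S)

tolerance_mhz = 2.0
matches = []
for f_exp in expected_ghz:
    sep_mhz = np.abs(obs_peaks_ghz - f_exp) * 1e3   # separation in MHz
    if sep_mhz.min() < tolerance_mhz:
        matches.append((f_exp, obs_peaks_ghz[sep_mhz.argmin()]))

print(f"{len(matches)} of {len(lab_lines_ghz)} predicted lines matched")  # 4 of 5 here
```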
Zachary T. P. Fried et al, Rotational Spectrum and First Interstellar Detection of 2-methoxyethanol Using ALMA Observations of NGC 6334I, The Astrophysical Journal Letters (2024). DOI: 10.3847/2041-8213/ad37ff
Laser-treated cork absorbs oil for carbon-neutral ocean cleanup
Oil spills are deadly disasters for ocean ecosystems. They can have lasting impacts on fish and marine mammals for decades and wreak havoc on coastal forests, coral reefs, and the surrounding land. Chemical dispersants are often used to break down oil, but they often increase toxicity in the process.
In Applied Physics Letters, researchers published their work using laser treatments to transform ordinary cork into a powerful tool for treating oil spills.
They wanted to create a nontoxic, effective oil cleanup solution using materials with a low carbon footprint, but their decision to try cork resulted from a surprising discovery.
In a different laser experiment, they accidentally found that the wettability of the cork processed using a laser changed significantly, gaining superhydrophobic (water-repelling) and superoleophilic (oil-attracting) properties. After appropriately adjusting the processing parameters, the surface of the cork became very dark, which made them realize that it might be an excellent material for photothermal conversion.
Combining these results with the eco-friendly, recyclable advantages of cork, they thought of using it for marine oil spill cleanup.
Cork comes from the bark of cork oak trees, which can live for hundreds of years. These trees can be harvested about every seven years, making cork a renewable material. When the bark is removed, the trees amplify their biological activity to replace it and increase their carbon storage, so harvesting cork helps mitigate carbon emissions.
The authors tested variations of a fast-pulsing laser treatment to find the optimal balance of characteristics that the cork can achieve at low cost.
They closely examined nanoscopic structural changes and measured the ratio of oxygen to carbon in the material, the changes in the angles at which water and oil contact the surface, and the material's absorption, reflection, and emission of light across the spectrum, as well as its durability after multiple cycles of warming and cooling.
The photothermal properties imparted to the cork through this laser processing allow it to warm quickly in the sun. The deep grooves also increase the surface area exposed to sunlight, so the cork can be warmed by just a little sunlight in 10–15 seconds. This energy is used to heat up spilled oil, lowering its viscosity and making it easier to collect. In experiments, the laser-treated cork collected oil out of water within two minutes.
The laser treatments not only help to better absorb oil, but also work to keep water out. When the cork undergoes a fast-pulsing laser treatment, its surface microstructure becomes rougher. This micro- to nano-level roughness enhances hydrophobicity.
As a result, the cork collects the oil without absorbing water, so the oil can be extracted from the cork and possibly even reused.
Femtosecond laser structured black superhydrophobic cork for efficient solar-driven cleanup of crude oil, Applied Physics Letters (2024). DOI: 10.1063/5.0199291
AMOLF researchers, in collaboration with Delft University of Technology, have succeeded in bringing light waves to a halt by deforming the two-dimensional photonic crystal that contains them. The researchers show that even a subtle deformation can have a substantial effect on photons in the crystal. This resembles the effect that a magnetic field has on electrons.
This principle offers a new approach to slow down light fields and thereby enhance their strength. Realizing this on a chip is particularly important for many applications, say the researchers.
The researchers have published their findings in the journal Nature Photonics. Simultaneously, a research team from Pennsylvania State University has published an article in the same journal about how they demonstrated—independently from the Dutch team—an identical effect.
Manipulating the flow of light in a material at small scales is beneficial for the development of nanophotonic chips. For electrons, such manipulation can be realized using magnetic fields; the Lorentz force steers the motion of electrons. However, this is impossible for photons because they do not have charge.
Researchers in the Photonic Forces group at AMOLF are looking for techniques and materials that would enable them to apply forces to photons that resemble the effects of magnetic fields.
The researchers looked for inspiration in the way electrons behave in materials. In a conductor, electrons can in principle move freely, but an external magnetic field can stop this. The circular movement caused by the magnetic field stops conduction, and as a result electrons can only exist in the material if they have very specific energies. These energy levels are called Landau levels, and they are characteristic of electrons in a magnetic field.
But in the two-dimensional material graphene, which consists of a single layer of carbon atoms arranged in a crystal, these Landau levels can also be caused by a different mechanism than a magnetic field. In general, graphene is a good electronic conductor, but this changes when the crystal array is deformed, for instance by stretching it like an elastic band.
"Such mechanical deformation stops conduction; the material turns into an insulator and consequently the electrons are bound to Landau levels. Hence, the deformation of graphene has a similar effect on electrons in a material as a magnetic field, even without a magnet. Researchers asked themselves if a similar approach would also work for photons."
In a collaboration with Kobus Kuipers of Delft University of Technology, the group indeed demonstrated a similar effect for light in a photonic crystal.
A photonic crystal normally consists of a regular—two dimensional—pattern of holes in a silicon layer. Light can move freely in this material, just like electrons in graphene. Breaking this regularity in exactly the right manner will deform the array and consequently lock the photons. This is how they create Landau levels for photons. In Landau levels light waves no longer move; they do not flow through the crystal but stand still. The researchers succeeded in demonstrating this, showing that the deformation of the crystal array has a similar effect on photons as a magnetic field on electrons.
By playing with the deformation pattern, we even managed to establish various types of effective magnetic fields in one material. As a result, photons can move through certain parts of the material but not in others. Hence, these insights also provide new ways to steer light on a chip. This brings on-chip applications closer. If we can confine light at the nanoscale and bring it to a halt like this, its strength will be enhanced tremendously. And not only at one location, but over the entire crystal surface. Such light concentration is very important in nanophotonic devices, for example for the development of efficient lasers or quantum light sources.
René Barczyk et al, Observation of Landau levels and chiral edge states in photonic crystals through pseudomagnetic fields induced by synthetic strain, Nature Photonics (2024). DOI: 10.1038/s41566-024-01412-3
Bioluminescence first evolved in animals at least 540 million years ago, pushing back previous oldest dated example
Bioluminescence first evolved in animals at least 540 million years ago in a group of marine invertebrates called octocorals, according to the results of a new study from scientists with the Smithsonian's National Museum of Natural History.
The results, published April 23 in the Proceedings of the Royal Society B: Biological Sciences, push back the previous record for the luminous trait's oldest dated emergence in animals by nearly 300 million years, and could one day help scientists decode why the ability to produce light evolved in the first place.
Bioluminescence—the ability of living things to produce light via chemical reactions—has independently evolved at least 94 times in nature and is involved in a huge range of behaviors including camouflage, courtship, communication and hunting. Until now, the earliest dated origin of bioluminescence in animals was thought to be around 267 million years ago in small marine crustaceans called ostracods.
But for a trait that is literally illuminating, bioluminescence's origins have remained shadowy.
Nobody quite knows why it first evolved in animals.
In search of the trait's earliest origins, the researchers decided to peer back into the evolutionary history of the octocorals, an evolutionarily ancient and frequently bioluminescent group of animals that includes soft corals, sea fans and sea pens.
Like hard corals, octocorals are tiny colonial polyps that secrete a framework that becomes their refuge, but unlike their stony relatives, that structure is usually soft. Octocorals that glow typically only do so when bumped or otherwise disturbed, leaving the precise function of their ability to produce light a bit mysterious.
Octocorals are one of the oldest groups of animals on the planet known to bioluminesce. So, the question was: when did they develop this ability? Researchers had completed an extremely detailed, well-supported evolutionary tree of the octocorals in 2022. They created this map of evolutionary relationships, or phylogeny, using genetic data from 185 species of octocorals.
With this evolutionary tree grounded in genetic evidence, DeLeo and Quattrini then situated two octocoral fossils of known ages within the tree according to their physical features. The scientists were able to use the fossils' ages and their respective positions in the octocoral evolutionary tree to date the tree and figure out roughly when octocoral lineages split apart into two or more branches.
Next, the team mapped out the branches of the phylogeny that featured living bioluminescent species.
With the evolutionary tree dated and the branches that contained luminous species labeled, the team then used a series of statistical techniques to perform an analysis called ancestral state reconstruction. If we know these species of octocorals living today are bioluminescent, we can use statistics to infer whether their ancestors were highly likely to be bioluminescent or not. The more living species that share the trait, the higher the probability that, as you move back in time, their ancestors had that trait as well.

The researchers used numerous different statistical methods for their ancestral state reconstruction, but all arrived at the same result: some 540 million years ago, the common ancestor of all octocorals was very likely bioluminescent. That is 273 million years earlier than the glowing ostracod crustaceans that previously held the title of earliest evolution of bioluminescence in animals.

The octocorals' thousands of living representatives and relatively high incidence of bioluminescence suggest the trait has played a role in the group's evolutionary success. While this raises the question of what exactly octocorals are using bioluminescence for, the researchers said the fact that it has been retained for so long highlights how important this form of communication has become for their fitness and survival.
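For readers curious about what "ancestral state reconstruction" looks like in practice, here is a deliberately tiny sketch: a symmetric two-state Mk model evaluated with Felsenstein's pruning algorithm on a made-up three-species tree. The rate, branch lengths, and tip states are illustrative assumptions only; the study itself applied several different methods to a fossil-dated phylogeny of 185 octocoral species.

```python
import numpy as np

# Tiny ancestral-state-reconstruction sketch for a binary trait (1 = bioluminescent,
# 0 = non-luminous) under a symmetric two-state Mk model, evaluated with Felsenstein's
# pruning algorithm. The three-species tree, branch lengths (millions of years),
# transition rate, and tip states are invented for illustration only.

RATE = 0.002  # assumed state-change rate per million years (illustrative)

def transition_matrix(t, q=RATE):
    """P[i, j]: probability of being in state j after time t, starting in state i."""
    same = 0.5 + 0.5 * np.exp(-2.0 * q * t)
    diff = 0.5 - 0.5 * np.exp(-2.0 * q * t)
    return np.array([[same, diff], [diff, same]])

def leaf(state):
    """Conditional likelihood vector for an observed tip state."""
    like = np.zeros(2)
    like[state] = 1.0
    return like

def internal(*children):
    """Combine (likelihood_vector, branch_length) pairs into a parent's likelihoods."""
    like = np.ones(2)
    for child_like, branch in children:
        like *= transition_matrix(branch) @ child_like
    return like

# Toy topology: ((sea pen, soft coral), sea fan); the first two tips are luminous.
clade = internal((leaf(1), 120.0), (leaf(1), 120.0))   # sea pen + soft coral
root = internal((clade, 60.0), (leaf(0), 180.0))       # join with non-luminous sea fan

posterior = root / root.sum()                          # flat prior on the root state
print(f"P(root ancestor non-luminous) = {posterior[0]:.3f}")
print(f"P(root ancestor luminous)     = {posterior[1]:.3f}")
```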
Scientists study lipids cell by cell, making new cancer research possible
Imagine being able to look inside a single cancer cell and see how it communicates with its neighbors. Scientists are celebrating a new technique that lets them study the fatty contents of cancer cells, one by one.
A study has sampled single live cancer cells and measured the fatty lipid compounds inside them. Working with partners at GSK and UCL, and developing new equipment with Yokogawa, the team saw how those cells transformed in response to changes in their environment. The work appears in Analytical Chemistry.

The trouble with cancer cells is that no two are alike. That makes it harder to design good treatment, because some cells will always resist treatment more than others. Yet it has always proven tricky to study live cells after they have been removed from their natural environment, in enough detail to truly understand their makeup. That is why it is so exciting to be able to sample live cells under a microscope and study their fatty contents one by one.

Individual pancreatic cancer cells were lifted from a glass culture dish using Yokogawa's Single Cellome System SS2000. This extracts single live cells using tiny tubes 10 µm across—about half the diameter of the thinnest human hair.
By staining the cells with fluorescent dye, the researchers could monitor lipid droplets (stores of fatty molecules inside cells, thought to play an important role in cancer) throughout the experiment.
Then, working with partners at Sciex, researchers developed a new method using a mass spectrometer to fragment the lipids in the cells. This told them about their composition.
The researchers demonstrated that different cells had very different lipid profiles. They also saw how lipids in the cells changed in response to what was going on around them.
Untargeted single-cell lipidomics using liquid chromatography and data-dependent acquisition after live cell selection, Analytical Chemistry (2024). DOI: 10.1021/acs.analchem.3c05677
Lethal mpox strain appears to spread via sex

A virulent strain of the monkeypox virus might have gained the ability to spread through sexual contact. The strain, called clade Ib, has caused a cluster of infections in a conflict-ridden region of the Democratic Republic of the Congo (DRC). This isn’t the first time scientists have warned that the monkeypox virus could become sexually transmissible: similar warnings during a 2017 outbreak in Nigeria were largely ignored. The strain responsible, clade II, is less lethal than clade Ib, but ultimately caused an ongoing global outbreak that has infected more than 94,000 people and killed more than 180.
The World Health Organization (WHO) has changed how it classifies airborne pathogens. It has removed the distinction between transmission by smaller virus-containing ‘aerosol’ particles and spread through larger ‘droplets’. The division, which some researchers argue was unscientific, justified WHO’s March 2020 assertion that SARS-CoV-2, the virus behind the COVID-19 pandemic, was not airborne. Under the new definition, SARS-CoV-2 would be recognized as spreading ‘through the air’ — although some scientists feel this term is less clear than ‘airborne’. “I'm not saying everybody is happy, and not everybody agrees on every word in the document, but at least people have agreed this is a baseline terminology,” says WHO chief scientist Jeremy Farrar.
Disease Ecology Butterfly Effect

When their preferred trees to chew on were cut down for the tobacco trade, chimpanzees in Uganda began consuming bat guano instead. Researchers recorded videos in the Budongo Forest Reserve between 2017 and 2019 and observed 839 instances of guano consumption, not only by chimpanzees but also by black-and-white colobus monkeys and red duikers, a type of forest antelope. The guano provides the chimps with essential minerals like sodium, potassium, magnesium and phosphorus that they would normally have gotten from the felled trees.
Why this matters: In addition to essential nutrients, the bat guano contained 27 unique viruses, including a novel coronavirus, the researchers found. Illnesses transmitted from animals to humans, called zoonotic diseases, account for about three quarters of new infectious diseases around the world. Those pathogens have a higher chance of jumping from an animal to a human when people encroach on ecosystems and disrupt relationships among species.
“This is the butterfly effect of infectious disease ecology,” says senior study author Tony Goldberg, a wildlife epidemiologist at the University of Wisconsin–Madison. “Far-flung events like demand for tobacco can have crazy, unintended consequences for disease emergence that follow pathways that we rarely see and can’t predict.”
Study explores why human-inspired machines can be perceived as eerie
Artificial intelligence (AI) algorithms and robots are becoming increasingly advanced, exhibiting capabilities that vaguely resemble those of humans. The growing similarities between AIs and humans could ultimately bring users to attribute human feelings, experiences, thoughts, and sensations to these systems, which some people perceive as eerie and uncanny.
A recent paper, published in Computers in Human Behavior: Artificial Humans, reviews past studies and reports the findings of an experiment testing a recent theory known as "mind perception," which proposes that people feel eeriness when exposed to robots that closely resemble humans because they ascribe minds to these robots.
For many people, the idea of a machine with conscious experience is unsettling. This discomfort extends to inanimate objects as well.
Overall, the results of the meta-analysis and experiment run by the researcher suggest that past studies backing mind perception theory could be flawed. In fact, the author gathered opposite results, suggesting that individuals who attribute sentience to robots do not necessarily find them eerier because of their human resemblance.
Although attributions of mind are not the main cause of the uncanny valley, they are part of the story. They can be relevant in some contexts and situations, but it is not simply that attributing a mind to a machine that looks human is creepy. Instead, perceiving a mind in a machine that already looks creepy makes it creepier, while perceiving a mind in a machine that has risen out of the uncanny valley and looks nearly human makes it less creepy.
Exploring whether there is strong support for this speculation is an area for future research, which would involve using more varied and numerous stimuli.
Karl F. MacDorman, Does mind perception explain the uncanny valley? A meta-regression analysis and (de)humanization experiment, Computers in Human Behavior: Artificial Humans (2024). DOI: 10.1016/j.chbah.2024.100065
Link between depression and cardiovascular disease explained: They partly develop from same gene module
Depression and cardiovascular disease (CVD) are serious concerns for public health. Approximately 280 million people worldwide have depression, while 620 million people have CVD.
It has been known since the 1990s that the two diseases are somehow related. For example, people with depression run a greater risk of CVD, while effective early treatment for depression cuts the risk of subsequently developing CVD by half. Conversely, people with CVD tend to have depression as well. For these reasons, the American Heart Association (AHA) advises monitoring teenagers with depression for CVD.
What wasn't yet known is what causes this apparent relatedness between the two diseases. Part of the answer probably lies in lifestyle factors common in patients with depression and which increase the risk of CVD, such as smoking, alcohol abuse, lack of exercise, and a poor diet. But it's also possible that both diseases might be related at a deeper level, through shared developmental pathways.
Now, scientists have shown that depression and CVD do indeed share part of their developmental programs, having at least one functional gene module in common. This result, published in Frontiers in Psychiatry, provides new markers for depression and CVD, and could ultimately help researchers to find drugs to target both diseases.
Researchers used advanced statistics to identify 22 distinct gene modules, of which just one was associated with both a high score for depressive symptoms and a low score for cardiovascular health.
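The summary above does not spell out the statistical pipeline, but the general idea of a gene co-expression module can be sketched as follows: cluster genes whose expression rises and falls together across subjects, then ask whether a module's summary expression tracks the phenotype. Everything in the snippet (data, gene counts, thresholds) is synthetic and for illustration only; it is not the study's actual method.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Generic sketch of gene co-expression "module" detection: cluster genes whose
# expression rises and falls together across subjects, then relate each module's
# summary expression to a phenotype score. All data here are synthetic.

rng = np.random.default_rng(1)
n_subjects, n_genes = 200, 60
expr = rng.normal(size=(n_subjects, n_genes))
expr[:, :10] += rng.normal(size=(n_subjects, 1))      # plant one co-expressed block

# 1) Correlation-based distance between genes, then hierarchical clustering.
corr = np.corrcoef(expr.T)
dist = squareform(1 - np.abs(corr), checks=False)
modules = fcluster(linkage(dist, method="average"), t=22, criterion="maxclust")

# 2) Summarize each module and test its association with a (synthetic) symptom score.
depression_score = expr[:, :10].mean(axis=1) + rng.normal(0, 0.5, n_subjects)
for m in np.unique(modules):
    summary = expr[:, modules == m].mean(axis=1)
    r = np.corrcoef(summary, depression_score)[0, 1]
    if abs(r) > 0.3:
        print(f"module {m}: {np.sum(modules == m)} genes, r = {r:.2f}")
```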
The top three genes from this gene module are known to be associated with neurodegenerative diseases, bipolar disorder, and depression. Now they have shown that they are associated with poor cardiovascular health as well.
These genes participate in biological processes, such as inflammation, that are involved in the pathogenesis of both depression and cardiovascular disease. This helps to explain why the two diseases often occur together.
Other genes in the shared module have been shown to be involved in brain diseases such as Alzheimer's, Parkinson's, and Huntington's disease.
Researchers and doctors can use the genes in this module as biomarkers for depression and cardiovascular disease. Ultimately, these biomarkers may facilitate the development of dual-purpose preventative strategies for both diseases.
Binisha Hamal Mishra et al, Identification of gene networks jointly associated with depressive symptoms and cardiovascular health metrics using whole blood transcriptome in the Young Finns Study, Frontiers in Psychiatry (2024). DOI: 10.3389/fpsyt.2024.1345159
Genetic variations may predispose people to Parkinson's disease following long-term pesticide exposure, study finds
A new UCLA Health study has found that certain genetic variants could help explain how long-term pesticide exposure could increase the risk of Parkinson's disease. While decades of research have linked pesticide exposure and Parkinson's disease risk, researchers have sought to explain why some individuals with high exposure develop the disease while others do not.
One longstanding hypothesis has been that susceptibility to the disease is a combination of both environmental and genetic factors. The new study, published in the journal npj Parkinson's Disease, used genetic data from nearly 800 Central Valley (California) residents with Parkinson's disease, many of whom had long-term exposure to 10 pesticides used on cotton crops for at least a decade prior to developing the disease, with some patients having been exposed as far back as 1974.
The researchers examined the study participants' genetic makeup for rare variants in genes associated with the function of lysosomes—cellular compartments that break down waste and debris, thought to be associated with the development of Parkinson's disease—and looked for enrichment of variants in patients with high exposure to pesticide use compared to a representative sample of the general population.
Researchers found that variants in these genes were enriched in patients with more severe Parkinson's disease who also had higher exposure to pesticides. These genetic variants also appeared to be deleterious to protein function, suggesting that disruption of lysosomal activity may underlie the development of Parkinson's disease in combination with pesticide exposure.
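As a concrete illustration of an enrichment comparison of this kind (not necessarily the exact test the study used, and with made-up counts), one common approach is a simple 2x2 contingency test on variant carriers versus non-carriers:

```python
from scipy.stats import fisher_exact

# Hypothetical counts, for illustration only: carriers of rare lysosomal-gene variants
# among highly exposed Parkinson's patients vs. a general-population reference sample.
#             carriers  non-carriers
patients  = [45,       755]
reference = [180,      9820]

odds_ratio, p_value = fisher_exact([patients, reference])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2g}")
```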
Yeast study offers possible answer to why some species are generalists and others specialists
In a landmark study based on one of the most comprehensive genomic datasets ever assembled, a team of scientists offer a possible answer to one of the oldest questions about evolution: why some species are generalists and others specialists.
Researchers mapped the genetic blueprints, appetites, and environments of more than 1,000 species of yeasts, building a family tree that illuminates how these single-celled fungi evolved over the past 400 million years.
The results, published in the journal Science, suggest that internal—not external—factors are the primary drivers of variation in the types of carbon yeasts can eat, and the researchers found no evidence that metabolic versatility, or the ability to eat different foods, comes with any trade-offs. In other words, some yeasts are jacks-of-all-trades and masters of each.
Like other organisms, some yeasts have evolved to be specialists—think koalas, which eat nothing but eucalyptus leaves—while others are generalists like raccoons, which eat just about anything.
Scientists have been trying to explain why both generalists and specialists exist almost since Charles Darwin proposed his theory of evolution in 1859.
Scientists have offered two broad models to explain the phenomenon.
One suggests generalists are jacks-of-all-trades but masters of none, meaning they can tolerate a wider range of conditions or food sources but aren't as dominant as a specialist in any specific niche.
The other theory is that a combination of internal and external factors drive niche variation.
For example, organisms can acquire genes that allow them to make enzymes capable of breaking down more than one substance, expanding the range of foods they can eat. Conversely, random loss of genes over time can result in a narrower palate.
Likewise, environments can exert selective pressure on traits. So a habitat with only one or two food sources or constant temperatures would favor specialists, while generalists might do better in an environment with a wider array of food or conditions.
When it comes to yeast metabolism, the research team found no evidence of trade-offs. The generalists are better across all the carbon sources they can use. Generalists are also able to use more nitrogen sources than carbon specialists. The data also showed that environmental factors play only a limited role.

Hittinger cautions there are limitations to what can be inferred from the data. It's possible that trade-offs are present in species that weren't studied. And the lab experiments used to measure metabolic growth can't replicate the conditions in soils, tree bark, or insect guts where yeasts live in nature.
Forget Billions of Years: Scientists Have Grown Diamonds in Just 150 Minutes
Natural diamonds take billions of years to form in the extreme pressures and temperatures deep underground. Synthetic forms can be produced far quicker, but they typically still require some intense squishing for up to several weeks.

A new method based on a mix of liquid metals can pop out an artificial diamond in a matter of minutes, without the need for a giant squeeze. While high temperatures were still required, in the region of 1,025°C or 1,877°F, a continuous diamond film was formed in 150 minutes at 1 atm (one standard atmosphere). That's the equivalent of the pressure we feel at sea level, and tens of thousands of times less than the pressure normally required.

The team behind the innovative approach, led by researchers from the Institute for Basic Science in South Korea, is confident that the process can be scaled up to make a significant difference in the production of synthetic diamonds.

Dissolving carbon into liquid metal to manufacture diamond isn't entirely new. General Electric developed a process half a century ago using molten iron sulfide. But these processes still required pressures of 5–6 gigapascals and a diamond 'seed' for the carbon to cling to.

"We discovered a method to grow diamonds at 1 atm pressure and under a moderate temperature by using a liquid metal alloy," write the researchers in their published paper. The reduction in pressure was achieved using a carefully mixed blend of liquid metals: gallium, iron, nickel, and silicon.

A custom-made vacuum system was built inside a graphite casing to very rapidly heat and then cool the metal while it was exposed to a combination of methane and hydrogen. These conditions cause carbon atoms from the methane to spread into the melted metal, acting as seeds for the diamonds.

After just 15 minutes, small fragments of diamond crystals extruded from the liquid metal just beneath the surface, while two and a half hours of exposure produced a continuous diamond film. Though the concentration of carbon forming the crystals decreased at a depth of just a few hundred nanometers, the researchers expect the process can be improved with a few tweaks.
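As a rough check on the "tens of thousands of times less" comparison, using only standard unit conversions and the pressures quoted above:

```latex
% Conventional liquid-metal synthesis pressure vs. the new ambient-pressure process.
\[
1~\mathrm{atm} = 101{,}325~\mathrm{Pa}, \qquad
5\text{--}6~\mathrm{GPa} = 5\text{--}6 \times 10^{9}~\mathrm{Pa}
\]
\[
\frac{5 \times 10^{9}~\mathrm{Pa}}{1.01325 \times 10^{5}~\mathrm{Pa}} \approx 4.9 \times 10^{4},
\qquad
\frac{6 \times 10^{9}~\mathrm{Pa}}{1.01325 \times 10^{5}~\mathrm{Pa}} \approx 5.9 \times 10^{4}
\]
% i.e. roughly 50,000-60,000 atmospheres for the older processes, consistent with
% "tens of thousands of times" more pressure than the new 1 atm method.
```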
The Sex of Your Doctor Could Have a Concerning Effect on Your Prognosis
Patients treated by a female physician are less likely to die or to be readmitted to hospital than those treated by a physician who is male, according to a new study by a team of researchers from the US and Japan. And if the patient happens to also be female, the difference is even more pronounced, especially so when they're severely ill.
While this study doesn't dive deeply into the reasons for the disparity, it supports previous research that comes to similar conclusions. What these findings indicate is that female and male physicians practice medicine differently, and these differences have a meaningful impact on patients' health outcomes.

The team analyzed data from US Medicare sources describing 458,108 female and 318,819 male patients hospitalized between 2016 and 2019. All patients were over the age of 65, and just under a third of both male and female patients were seen by female physicians.
This info was then referenced against 30-day mortality rates (from the date of admission) and 30-day readmission rates (from the date of discharge). In both cases, female doctors led to better outcomes.
While the differences don't show direct cause and effect, and weren't huge – adjusted mortality rates of 8.15 percent (female doctor) vs 8.38 percent (male doctor) for female patients, for example – they represent a statistically significant gap that shouldn't be there at all. To put that difference into perspective, it amounts to 1 death for every 417 hospitalizations.

It is important to note that female physicians provide high-quality care, and therefore, having more female physicians benefits patients from a societal point of view. The study authors suggest several reasons could be behind the discrepancies, which have been spotted before in different medical scenarios. It's possible that female doctors communicate better with female patients, the researchers say, or that male doctors are more likely to underestimate the severity of conditions experienced by female patients.
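As a back-of-the-envelope check on that 1-in-417 figure, using the rounded rates quoted above (the study's exact number presumably comes from unrounded values):

```latex
% Risk difference for female patients, and the implied hospitalizations per extra death.
\[
\Delta = 8.38\% - 8.15\% = 0.23~\text{percentage points}
\]
\[
\frac{1}{0.0023} \approx 435~\text{hospitalizations per additional death},
\]
% of the same order as the reported 1 in 417.
```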
There might also be less embarrassment and discomfort between female doctors and female patients, the research team suggests, meaning more honesty about certain conditions and improved diagnosis and treatment.
The researchers want to see more done to improve sex diversity in hospital settings, and to make sure the quality of care is the same no matter whether patients or physicians are male or female – and for that to happen, more studies will be needed looking at why the differences exist. Further research on the underlying mechanisms linking physician gender with patient outcomes, and why the benefit of receiving the treatment from female physicians is larger for female patients, has the potential to improve patient outcomes across the board.
The rule of four: anomalous distributions in the stoichiometries of inorganic compounds
An analysis of a vast database of compounds has revealed a curious repeating pattern in the way matter composes itself.
Of more than 80,000 electronic structures of experimental and predicted materials studied, a whopping 60 percent have a basic structural unit based on a multiple of four.
What's so strange about this is that the research team that discovered this pattern couldn't figure out why it happens. All we know at the moment is that it's real and observable. It just evades explanation for now.
In this work, through an extensive investigation, the researchers highlight and analyze the anomalous abundance of inorganic compounds whose primitive unit cell contains a number of atoms that is a multiple of four, a property they name the rule of four.
Study suggests host response needs to be studied along with other bacteriophage research
A team of micro- and immunobiologists has found evidence suggesting that future research teams planning to use bacteriophages to treat patients with multidrug-resistant bacterial infections need to also consider how cells in the host's body respond to such treatment.
In their paper published in the open-access journal PLOS Biology, the group describes experiments they conducted that involved studying the way epithelial cells in the lungs respond to bacteriophages.
Over the past decade, medical scientists have found that many bacteria are becoming resistant to the antibiotics used to treat the infections they cause, making those drugs increasingly useless. Because of this, other scientists have been looking for new ways to treat such infections. One possible approach involves the use of bacteriophages, which are viruses that parasitize bacteria by infecting and reproducing inside of them, leaving them unable to reproduce.
To date, most of the research involving use of bacteriophages to treat infections has taken place in Eastern Europe, where some are currently undergoing clinical trials. But such trials, the researchers involved in this new study note, do not take into consideration how cells in the body respond to such treatment. Instead, they are focused on determining which phages can be used to fight which types of bacteria, and how well they perform once employed.
The reason so little attention is paid to host cell interaction, they note, is that prior research has shown that phages can only replicate inside of the bacterial cells they invade; thus, there is little opportunity for them to elicit a response in human cells.
In this new study, the research team suggests such thinking is misguided because it fails to take into consideration the immune response in the host. To demonstrate their point, the team conducted a series of experiments involving exposing human epithelial cells from the lungs (which are the ones that become infected as part of lung diseases) to bacteriophages meant to eradicate the bacteria causing an infection.
They found that in many cases, the immune system responded by producing proinflammatory cytokines in the epithelial cells. They noted further that different phages elicited different responses, and there exists the possibility that the unique properties of some phages could be used to improve the results obtained from such therapies. They conclude by suggesting that future bacteriophage research involve inclusion of host cell response.
Paula F. Zamora et al, Lytic bacteriophages induce the secretion of antiviral and proinflammatory cytokines from human respiratory epithelial cells, PLOS Biology (2024). DOI: 10.1371/journal.pbio.3002566
High-precision blood glucose level prediction achieved by few-molecule reservoir computing
A collaborative research team from NIMS and Tokyo University of Science has successfully developed an artificial intelligence (AI) device that executes brain-like information processing through few-molecule reservoir computing. This innovation utilizes the molecular vibrations of a select number of organic molecules.
When applied to blood glucose level prediction in patients with diabetes, the device significantly outperformed existing AI devices in prediction accuracy.
The work is published in the journal Science Advances.
With the expansion of machine learning applications across industries, there is an escalating demand for AI devices that are not only computationally powerful but also compact and low in power consumption.
Research has shifted towards physical reservoir computing, leveraging physical phenomena presented by materials and devices for neural information processing. One challenge that remains is the relatively large size of the existing materials and devices.
The team's research has pioneered the world's first implementation of physical reservoir computing that operates on the principle of surface-enhanced Raman scattering, harnessing the molecular vibrations of merely a few organic molecules. The information is inputted through ion gating, which modulates the adsorption of hydrogen ions onto organic molecules (p-mercaptobenzoic acid, pMBA) by applying voltage.
The changes in molecular vibrations of the pMBA molecules, which vary with hydrogen ion adsorption, serve the function of memory and nonlinear waveform transformation for calculation.
This process, using a sparse assembly of pMBA molecules, has learned approximately 20 hours of a diabetic patient's blood glucose level changes and managed to predict subsequent fluctuations over the next five minutes with an error reduction of about 50% compared to the highest accuracy achieved by similar devices to date.
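As a purely software point of comparison, the reservoir-computing recipe (a fixed nonlinear system driven by the input, with only a simple linear readout trained) can be sketched with an ordinary echo state network. This is not the molecular device described above: a random recurrent network stands in for the pMBA vibrational dynamics, and the glucose trace is synthetic.

```python
import numpy as np

# Minimal echo-state-network sketch of the reservoir-computing idea: drive a fixed,
# random nonlinear system with the input signal and train only a linear readout.
# This is a software stand-in, not the molecular (pMBA / ion-gating) reservoir above;
# the "glucose" series is synthetic and all parameters are illustrative.

rng = np.random.default_rng(0)

# Toy blood-glucose-like series, one sample every 5 minutes (288 samples per day).
t = np.arange(2000)
glucose = 100 + 20 * np.sin(2 * np.pi * t / 288) + 0.01 * t + rng.normal(0, 1, t.size)

n_res = 200                                       # reservoir size
W_in = rng.uniform(-0.5, 0.5, n_res)              # fixed input weights
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1 for stability

# Drive the reservoir and collect its states.
x = np.zeros(n_res)
states = []
for u in glucose[:-1]:
    x = np.tanh(W @ x + W_in * (u / 200.0))       # crude input scaling
    states.append(x.copy())
states = np.array(states)
targets = glucose[1:]                             # predict one step (5 minutes) ahead

# Ridge-regression readout: the only trained part, as in physical reservoir computing.
n_train, ridge = 1500, 1e-4
S, y = states[:n_train], targets[:n_train]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ y)

pred = states[n_train:] @ W_out
rmse = np.sqrt(np.mean((pred - targets[n_train:]) ** 2))
print(f"test RMSE on the toy series: {rmse:.2f} mg/dL")
```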
This study indicates that a minimal quantity of organic molecules can effectively perform computations comparable to a computer. This technological breakthrough of conducting sophisticated information processing with minimal materials and in tiny spaces presents substantial practical benefits. It paves the way for the creation of low-power AI terminal devices that can be integrated with a variety of sensors, opening avenues for broad industrial use.
Daiki Nishioka et al, Few- and single-molecule reservoir computing experimentally demonstrated with surface-enhanced Raman scattering and ion gating, Science Advances (2024). DOI: 10.1126/sciadv.adk6438
'Zombie Deer' Disease: Zoonotic Transfer Suspected After Two Human Deaths
A medical case report suggests that a deadly prion disease may have made its way from deer into humans.
Two hunters have died after consuming venison from a population of deer known to be infected with chronic wasting disease – an incurable, fatal prion disease sometimes known as "zombie deer" disease, not dissimilar to bovine spongiform encephalopathy, or mad cow disease.
A team of doctors at the University of Texas report a 72-year-old man died after presenting with rapid-onset confusion and aggression.
The man's friend, who was a member of the same hunting lodge, died at a later, unspecified date after presenting with similar symptoms, the doctors note. A post-mortem determined that this second patient had died of Creutzfeldt-Jakob disease, AKA prion disease.
Since prion disease is relatively rare in humans, the two cases could mean that chronic wasting disease – described by the Centers for Disease Control and Prevention as never having been reported in humans – has made the zoonotic leap from animals.
Prion diseases, known as Creutzfeldt-Jakob disease or CJD in humans, are kind of terrifying. Prions are proteins that haven't folded properly, and therefore don't really function the way they should. The problem is that these misfolded proteins teach the proteins around them how to fold badly, too, resulting in a spread of dysfunctional tissue that cannot be halted or cured.
The spread of prions through brain tissue produces symptoms very similar to a kind of fast-tracked dementia, to which the patient eventually succumbs. Since CJD doesn't produce any kind of immune response, it's practically impossible to diagnose in a living patient.
Significant concerns have already been raised about chronic wasting disease. It infects animals such as deer, elk, and moose, and seems to be transmitted fairly easily between them; scientists think it is transmitted via bodily fluids such as blood or saliva, either through direct contact, or contamination in the environment.
It's not known for certain whether the two men described in the case report succumbed to chronic wasting disease, or whether their illness had another source. Prion disease may emerge spontaneously, for example, although that is, as far as we can tell, extremely rare.
The case report also doesn't mention from whence the two men hailed, but the disease can be found across the North American continent in wild populations, including at least 32 states in the US and across Canada. It can also be found among farmed deer.
Given that zoonotic prion disease is absolutely possible, and that transmission to humans has been predicted for some time, the situation, the doctors say, warrants caution and attention.
"Although causation remains unproven, this cluster emphasizes the need for further investigation into the potential risks of consuming CWD-infected deer and its implications for public health," they write.
"Clusters of sporadic CJD cases may occur in regions with CWD-confirmed deer populations, hinting at potential cross-species prion transmission. Surveillance and further research are essential to better understand this possible association."
Why do we move slower the older we get? New study delivers answers
It's one of the inescapable realities of aging: The older we get, the slower we tend to move.
A new study led by University of Colorado Boulder engineers helps explain why.
The research is one of the first studies to experimentally tease apart the competing reasons why people over age 65 might not be as quick on their feet as they used to be. The group reported that older adults may move slower, at least in part, because it costs them more energy than younger people—perhaps not too shocking for anyone who's woken up tired the morning after an active day.
Why we move the way we do, from eye movements to reaching, walking, and talking, is a window into aging and Parkinson's. Scientists are trying to understand the neural basis of that.
For the study, the group asked subjects aged 18 to 35 and 66 to 87 to complete a deceptively simple task: to reach for a target on a screen, a bit like playing a video game on a Nintendo Wii. By analyzing patterns of these reaches, the researchers discovered that older adults seemed to modify their motions under certain circumstances to conserve their limited supplies of energy.
All of us, whether young or old, are inherently driven to get the most reward out of our environment while minimizing the amount of effort to do so.
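That cost-benefit idea can be caricatured with a simple utility model: a reward that is devalued the longer a movement takes, minus an effort cost. The Python sketch below uses made-up functional forms and numbers, not the study's actual model, but it shows the qualitative point: when effort becomes more expensive, the utility-maximizing movement gets slower.

import numpy as np

T = np.linspace(0.2, 2.0, 500)              # candidate movement durations (s)

def best_duration(effort_scale):
    reward = 1.0 / (1.0 + 1.5 * T)          # reward devalued the longer you take (hypothetical)
    effort = effort_scale * (0.05 * T + 0.2 / T)   # energetic cost; very fast reaches cost more
    utility = reward - effort
    return T[np.argmax(utility)]

print("optimal duration, low effort cost :", round(best_duration(0.3), 2), "s")
print("optimal duration, high effort cost:", round(best_duration(0.9), 2), "s")

Raising the effort scale, as aging or an added weight might, shifts the best duration toward slower movements, which is the pattern the study reports.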
Researchers have long known that older adults tend to be slower because their movements are less stable and accurate. But other factors could also play a role in this fundamental part of growing older.
According to one hypothesis, the muscles in older adults may work less efficiently, meaning that they burn more calories while completing the same tasks as younger adults—like running a marathon or getting up to grab a soda from the refrigerator.
Alternatively, aging might also alter the reward circuitry in the human brain. As people age, their bodies produce less dopamine, a brain chemical responsible for giving you a sense of satisfaction after a job well done. If you don't feel that reward as strongly, the thinking goes, you may be less likely to move to get it. People with Parkinson's disease experience an even sharper decline in dopamine production.
In the study, the researchers asked more than 80 people to sit down and grab the handle of a robotic arm, which, in turn, operated the cursor on a computer screen. The subjects reached forward, moving the cursor toward a target. If they succeeded, they received a reward—not a big one, but still enough to make their brains happy.
Sometimes, the targets "exploded" on screen, earning the subjects extra points and playing a "bing bing" sound.
That's when a contrast between the two groups of people began to emerge.
Both the 18 to 35-year-olds and 66 to 87-year-olds arrived at their targets sooner when they knew they would hear that bing bing—roughly 4% to 5% sooner over trials without the reward. But they also achieved that goal in different ways.
The younger adults, by and large, moved their arms faster toward the reward. The older adults, in contrast, mainly improved their reaction times, beginning their reaches about 17 milliseconds sooner on average.
When the team added an 8-pound weight to the robotic arm for the younger subjects, those differences vanished.
The brain seems to be able to detect very small changes in how much energy the body is using and adjusts our movements accordingly. Even when moving with just a few extra pounds, reacting quicker became the energetically cheaper option to get to the reward, so the young adults imitated the older adults and did just that.
The research seems to paint a clear picture. Neither the younger nor the older adults seemed to have trouble perceiving rewards, even small ones. But their brains slowed their movements down under more tiring circumstances.
The experiment can't completely rule out the brain's reward centers as a culprit behind why we slow down when we age. But if scientists can tease out where and how these changes emerge from the body, they may be able to develop treatments to reduce the toll of aging and disease.
Putting it all together, these results suggest that the effort costs of reaching seem to be determining what's slowing the movement of older adults.
Erik M. Summerside et al, Slowing of Movements in Healthy Aging as a Rational Economic Response to an Elevated Effort Landscape, The Journal of Neuroscience (2024). DOI: 10.1523/JNEUROSCI.1596-23.2024
Artificial sweeteners are chemical compounds that are up to 600 times sweeter than sugar with very few (if any) calories, and they are cheap and easy for manufacturers to use.
Traditional artificial sweeteners, such as aspartame, sucralose and acesulfame potassium (acesulfame K), have been found in a wide range of foods and drinks for many years as a way to increase the sweet taste without adding significant calories or costs. However, in the last few years, there has been controversy in the field. Several studies have suggested potential health harms associated with consuming these sweeteners, ranging from gastrointestinal disease to dementia. Although none of these harms has been proven, the concern has paved the way for new sweeteners to be developed in an effort to avoid any possible health issues.

These next-generation sweeteners are up to 13,000 times sweeter than sugar, have no calories and no aftertaste (a common complaint with traditional sweeteners). An example of this new type of sweetener is neotame. Neotame was developed as an alternative to aspartame with the aim of being a more stable and sweeter version of the traditional sweetener. It is very stable at high temperatures, which makes it a good additive for baked goods. It is also used in soft drinks and chewing gum.
An artificial sweetener called neotame can cause significant harm to the gut, scientists found. It does this harm in two ways. One, by breaking down the layer of cells that line the intestine. And, two, by causing previously healthy gut bacteria to become diseased, resulting in them invading the gut wall.
The study, published in the journal Frontiers in Nutrition, is the first to show this double-hit negative effect of neotame on the gut, resulting in damage similar to that seen in inflammatory bowel disease and sepsis.
Study suggests that stevia is the most brain-compatible sugar substitute
Given the known risks of consuming high amounts of sugar, today many people are looking for alternative sweeteners that produce a similar taste without prompting significant weight gain and causing other health issues. While research suggests that the brain can tell the difference between different sweet substances, the neural processes underlying this ability to tell sweeteners apart remain poorly understood.
Researchers recently carried out a study aimed at better understanding what happens in the brain of mice when they are fed different types of sweeteners. Their findings, published in Neuroscience Research, suggest that the response of neurons to sucrose and stevia is similar, suggesting that stevia could be an equally pleasant but healthier sugar substitute.
Stevia is a sweet sugar substitute that is about 50 to 300 times sweeter than sugar. It is extracted from the leaves of Stevia rebaudiana, a plant native to areas of Paraguay and Brazil in the southern Amazon rainforest. The active compounds in stevia are steviol glycosides (mainly stevioside and rebaudioside). Stevia is heat-stable, pH-stable, and not fermentable. Humans cannot metabolize the glycosides in stevia, and therefore it has zero calories. Its taste has a slower onset and longer duration than that of sugar, and at high concentrations some of its extracts may have an aftertaste described as licorice-like or bitter. Stevia is used in sugar- and calorie-reduced food and beverage products as an alternative for variants with sugar.
Interestingly, the team's recordings revealed that, compared with the other sugar substitutes considered in this study, stevia induced activity in the PVT (the paraventricular nucleus of the thalamus) that more closely resembled the activity elicited by sugar intake. This suggests that stevia is the most "brain compatible" of the widely used sugar alternatives, most closely mirroring the perceived taste of sugar.
Shaolei Jiang et al, Neuronal activity in the anterior paraventricular nucleus of thalamus positively correlated with sweetener consumption in mice, Neuroscience Research (2024). DOI: 10.1016/j.neures.2024.02.002
Research shows 'profound' link between dietary choices and brain health
A recent study published in Nature Mental Health shows that a healthy, balanced diet is linked to superior brain health, cognitive function and mental well-being. The study, involving researchers at the University of Warwick, sheds light on how our food preferences not only influence physical health but also significantly impact brain health.
The dietary choices of a large sample of 181,990 participants from the UK Biobank were analyzed against a range of evaluations, including cognitive function, blood metabolic biomarkers, brain imaging, and genetics—unveiling new insights into the relationship between nutrition and overall well-being.
The food preferences of each participant were collected via an online questionnaire, which the team categorized into 10 groups (such as alcohol, fruits and meats). A type of AI called machine learning helped the researchers analyze the large dataset.
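As an illustration of the kind of analysis involved (not the authors' actual pipeline), the sketch below clusters synthetic food-preference scores into dietary patterns with scikit-learn and compares a stand-in outcome across the clusters; all variable names and data here are hypothetical.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n = 1000
prefs = rng.uniform(0, 10, size=(n, 10))     # liking scores for 10 food groups (synthetic)
# synthetic outcome: higher when preferences are spread evenly ("balanced diet")
balance = -prefs.std(axis=1)
cognition = balance + rng.normal(0, 1, n)

clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(prefs)
for c in range(4):
    mask = clusters == c
    print(f"cluster {c}: n={mask.sum():4d}  mean cognition score={cognition[mask].mean():+.2f}")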
A balanced diet was associated with better mental health, superior cognitive function and even higher amounts of gray matter in the brain—linked to intelligence—compared with a less varied diet.
The study also highlighted the need for gradual dietary modifications, particularly for individuals accustomed to highly palatable but nutritionally deficient foods. By slowly reducing sugar and fat intake over time, individuals may find themselves naturally gravitating towards healthier food choices.
Genetic factors may also contribute to the association between diet and brain health, the scientists think, showing how a combination of genetic predispositions and lifestyle choices shape well-being.
Ruohan Zhang et al, Associations of dietary patterns with brain health from behavioral, neuroimaging, biochemical and genetic analyses, Nature Mental Health (2024). DOI: 10.1038/s44220-024-00226-0
Human activities have an intense impact on Earth's deep subsurface fluid flow
The impact of human activities—such as greenhouse gas emissions and deforestation—on Earth's surface have been well-studied. Now, hydrology researchers have investigated how humans impact Earth's deep subsurface, a zone that lies hundreds of meters to several kilometers beneath the planet's surface.
They looked at how the rates of fluid production associated with oil and gas extraction compare with the natural background circulation of water, and showed that humans have had a major impact on the circulation of fluids in the subsurface.
In the future, these human-induced fluid fluxes are projected to increase with strategies that are proposed as solutions for climate change, according to the study. Such strategies include: geologic carbon sequestration, which is capturing and storing atmospheric carbon dioxide in underground porous rocks; geothermal energy production, which involves circulating water through hot rocks for generating electricity; and lithium extraction from underground mineral-rich brine for powering electric vehicles.
Responsible management of the subsurface is central to any hope for a green transition, sustainable future and keeping warming below a few degrees.
With oil and natural gas production, there is always some amount of water, typically saline, that comes from the deep subsurface. The underground water is often millions of years old and acquires its salinity either from evaporation of ancient seawater or from reaction with rocks and minerals. For more efficient oil recovery, more water from near-surface sources is added to the salt water to make up for the amount of oil removed and to maintain reservoir pressures. The blended saline water then gets reinjected into the subsurface. This becomes a cycle of producing fluid and reinjecting it to the deep subsurface.
The same process happens in lithium extraction, geothermal energy production and geologic carbon sequestration, all of which produce leftover saline water that is reinjected underground.
Researchers showed that the fluid injection, or recharge, rates from those oil and gas activities are greater than what occurs naturally.
Using existing data from various sources, including measurements of fluid movements related to oil and gas extraction and water injections for geothermal energy, the team found that the current fluid movement rates induced by human activities are higher compared to how fluids moved before human intervention.
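The comparison the study formalizes can be pictured with simple arithmetic: convert an annual injected-water volume over a basin into an average flux and set it against a natural deep recharge rate. Every number in the sketch below is a made-up placeholder chosen only to show the calculation, not a value from the paper.

# Back-of-the-envelope flux comparison with entirely hypothetical numbers.
basin_area_km2 = 100_000
injected_km3_per_yr = 2.0                    # hypothetical produced/reinjected water
natural_flux_mm_per_yr = 0.1                 # hypothetical natural deep recharge

human_flux_mm_per_yr = injected_km3_per_yr * 1e12 / (basin_area_km2 * 1e6)  # mm/yr
print(f"human-induced flux : {human_flux_mm_per_yr:.1f} mm/yr")
print(f"natural deep flux  : {natural_flux_mm_per_yr:.1f} mm/yr")
print(f"ratio              : {human_flux_mm_per_yr / natural_flux_mm_per_yr:.0f}x")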
As human activities like carbon capture and sequestration and lithium extraction ramp up, the researchers also predicted how these activities might be recorded in the geological record, which is the history of Earth as recorded in the rocks that make up its crust.
Human activities have the potential to alter not just the deep subsurface fluids but also the microbes that live down there. As fluids move around, microbial environments may be altered by changes in water chemistry or by bringing new microbial communities from Earth's surface to the underground.
For example, with hydraulic fracturing, a technique that is used to break underground rocks with pressurized liquids for extracting oil and gas, a deep rock formation that previously didn't have any detectable number of microbes might have a sudden bloom of microbial activity.
There remain a lot of unknowns about Earth's deep subsurface and how it is impacted by human activities, and it's important to continue working on those questions, say the scientists.
Grant Ferguson et al, Acceleration of Deep Subsurface Fluid Fluxes in the Anthropocene, Earth's Future (2024). DOI: 10.1029/2024EF004496
Brown fat, also known as brown adipose tissue (BAT), is a type of fat in our bodies that's different from the white fat around our belly and thighs that we are more familiar with. Brown fat has a special job—it helps to burn calories from the foods that we eat into heat, which can be helpful, especially when we're exposed to cold temperatures like during winter swimming or cryotherapy.
For a long time, scientists thought that only small animals like mice and newborns had brown fat. But new research shows that a certain number of adults maintain their brown fat throughout life. Because brown fat is so good at burning calories, scientists are trying to find ways to activate it safely using drugs that boost its heat-producing abilities.
A new study has found that brown fat has a previously unknown built-in mechanism that switches it off shortly after being activated. This limits its effectiveness as treatment against obesity. The team has now discovered a protein responsible for this switching-off process. It is called "AC3-AT."
Hande Topel et al, Cold-induced expression of a truncated Adenylyl Cyclase 3 acts as rheostat to brown fat function, Nature Metabolism (2024). DOI: 10.1038/s42255-024-01033-8
Physicists overcome two key operating hurdles in fusion reactions
A team of physicists has devised a way to overcome two key hurdles standing in the way of using fusion as a general power source.
In their paper published in the journal Nature, the group describes how they devised a way to raise the density of the plasma in their reactor while also keeping it stable.
Scientists at various sites around the world have been working for several years to figure out how to use fusion reactions to create electricity for general use—thereby freeing the world from using coal and gas fired power plants that spew greenhouse gases into the atmosphere. But it has been a long and difficult road.
It was just in the past couple of years that researchers were able to show that a fusion reaction could be made to sustain itself, and that more power could be produced than was input into such a system.
The next two hurdles to overcome are increasing the density of the plasma in the reactor and then containing it for extended periods of time—long enough for it to be useful for producing electricity. In this new study, the research team has devised a way to do both in a tokamak chamber.
To contain the plasma as its density was increased, the team used additional magnets and bursts of deuterium where needed. They also allowed for higher densities at the core than near the edges, helping to ensure the plasma could not escape. They held it in that state for 2.2 seconds, long enough to prove that it could be done.
They also found that during that short time span, the average density in the reactor was 20% over the Greenwald limit—an empirical barrier that had been predicted to mark the point at which the plasma would escape the magnetic field holding it in place.
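For context, the Greenwald limit is an empirical scaling, n_G = I_p / (π a²), with the density in units of 10²⁰ particles per cubic metre, the plasma current I_p in megaamperes and the minor radius a in metres. The short Python sketch below plugs in hypothetical values, not the actual machine parameters reported here, just to show how the limit and a density 20% above it are computed.

import math

I_p = 1.0       # plasma current in MA (hypothetical)
a   = 0.6       # plasma minor radius in metres (hypothetical)

n_G = I_p / (math.pi * a ** 2)          # Greenwald limit, in units of 1e20 m^-3
n_achieved = 1.2 * n_G                  # "20% over the Greenwald limit"
print(f"Greenwald limit : {n_G:.2f} x 10^20 m^-3")
print(f"Reported regime : {n_achieved:.2f} x 10^20 m^-3 (20% above the limit)")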
They also found that the plasma's energy-confinement quality factor, H98(y,2), was above 1—better than the standard empirical expectation—which means that the experiment was successful.
The research team acknowledges that their experiment was done in a very small reactor—one with a diameter of just 1.6 meters. For such an achievement to be considered fully successful, it will have to be done in a much larger reactor, such as the one currently under construction in France, which will have a diameter of 6.2 meters.
S. Ding et al, A high-density and high-confinement tokamak plasma regime for fusion energy, Nature (2024). DOI: 10.1038/s41586-024-07313-3
In our modern society, we rarely consider books to be dangerous items. However, certain books contain elements so hazardous that they require scrutiny before being placed on the shelves of public libraries, bookstores or even private homes.
The Poisonous Book Project, a collaborative research project between Winterthur Museum, Garden & Library and the University of Delaware, is dedicated to cataloging such books. Their concern is not with the content written on the pages, but with the physical components of the books themselves—specifically, the colors of the covers.
The project recently influenced the decision to remove two books from the French national library. The reason? Their vibrant green cloth covers raised suspicions of containing arsenic.
This concern is rooted in historical practices in bookbinding. During the 19th century, as books began to be mass produced, bookbinders transitioned from using expensive leather covers to more affordable cloth items. To attract readers, these cloth covers were often dyed in bright, eye-catching colours.
One popular pigment was Scheele's green, named after Carl Wilhelm Scheele, a German-Swedish chemist who in 1775 discovered that a vivid green pigment could be produced from copper and arsenic. This dye was not only cheap to make, it was also more vibrant than the copper carbonate greens that had been used for over a century.
A closely related pigment, Paris green, was much more durable. It was quickly adopted for use in various items, including book covers, clothing, candles and wallpaper.
These pigments, however, had a significant drawback: they degraded easily, releasing poisonous and carcinogenic arsenic. The frequent reports of green candles poisoning children at Christmas parties, factory workers tasked with applying paint to ornaments convulsing and vomiting green water and warnings of poisonous ball dresses raised serious concerns about the safety of these green dyes.
Part 1
The harmful effects of these pigments have even been implicated in Napoleon's death from stomach cancer. Napoleon was particularly keen on the new green colors, so much so that he ordered his dwelling on St Helena, where he was exiled, be painted in his favorite color.
The theory that the arsenic in the walls contributed to his death is supported by the high levels of arsenic detected in samples of his hair. Despite the clear link between the green pigments and health issues, toxic wallpapers continued to be produced until the late 19th century.

Green isn't the only color to worry about, however. Red is also of concern. The brilliant red pigment vermilion was formed from the mineral cinnabar, also known as mercury sulfide. This was a popular source of red paint dating back thousands of years. There is even evidence that neolithic artists suffered from mercury poisoning. Vermilion red sometimes appears on the marbled patterns on the inside of book covers.
Yellow has also caught the eye of the poisonous book project. In this case, the culprit is lead chromate. The bright yellow of lead chromate was a favorite with painters, not least Vincent van Gogh, who used it extensively in his most famous series of paintings: Sunflowers. For the Victorian-era bookbinders, lead chromate allowed them to create a range of colors from greens (achieved by mixing chrome yellow with Prussian blue) to yellows, oranges and browns.
Both lead and chromium are toxic. But yellow books are less of a concern than green and red. Lead chromate is not particularly soluble, making it difficult to absorb. It is, in fact, still a widely used pigment. Part 2
So what should you do if you come across a green cloth book from the 19th century? First, don't be overly concerned. You would probably have to eat the entire book before you'd suffer from severe arsenic poisoning. However, casual exposure to copper acetoarsenite, the compound in the green pigment, can irritate the eyes, nose and throat.
It is more of a concern for people who regularly handle these books, for whom frequent contact could result in more serious symptoms. Therefore, anyone who suspects they might be handling a Victorian-era book with an emerald green binding is advised to wear gloves, avoid touching their face, and clean all surfaces afterwards.
To aid with the identification of these potentially hazardous books, the Poisonous Book Project has incorporated crowd-sourced data into their research. The researchers now distribute bookmarks that feature safety warnings and showcase various shades of emerald green to aid their identification. As a result, they have now identified over 238 arsenic editions from across the globe.
First fetus-to-fetus transplant demonstrated in rats
The tissue developed into functioning kidneys and produced urine.
Surgeons in Japan have transplanted kidney tissue from one rat fetus to another, while the recipient was still in its mother's womb. Study lead Takashi Yokoo, a nephrologist at Jikei University School of Medicine in Tokyo, says the surgery is the first step to one day transplanting fetal pig kidneys into human fetuses that develop without functioning kidneys.
Transplanting an organ before birth could allow it to grow and develop with the fetus, so that the organ is functioning at birth and has less risk of rejection.
The recipient pups were born as normal after 22 days, according to a preprint (not peer reviewed). Blood vessels from the host fetus grew inside the donated tissue, lowering the risk of rejection.
Scientists show that there is indeed an 'entropy' of quantum entanglement
Researchers have shown, through probabilistic calculations, that there is indeed, as had been hypothesized, a rule of entropy for the phenomenon of quantum entanglement.
This finding could help drive a better understanding of quantum entanglement, which is a key resource that underlies much of the power of future quantum computers. Little is currently understood about the optimal ways to make effective use of it, despite it being the focus of research in quantum information science for decades.
The second law of thermodynamics, which says that a system can never spontaneously move to a state of lower entropy—that is, of greater order—is one of the most fundamental laws of nature, and lies at the very heart of physics. It is what creates the "arrow of time," and it tells us the remarkable fact that the dynamics of general physical systems, even extremely complex ones such as gases or black holes, are encapsulated by a single function: their entropy.
There is a complication, however. The principle of entropy is known to apply to all classical systems. But what about the quantum world?
We are now going through a quantum revolution, and it becomes crucially important to understand how we can extract and transform the expensive and fragile quantum resources. In particular, quantum entanglement, which allows for significant advantages in communication, computation, and cryptography, is crucial, but due to its extremely complex structure, efficiently manipulating it and even understanding its basic properties is typically much more challenging than in the case of thermodynamics. Part 1
The difficulty lies in the fact that such a "second law" for quantum entanglement would require us to show that entanglement transformations can be made reversible, just like work and heat can be interconverted in thermodynamics.
It is known that reversibility of entanglement is much more difficult to ensure than the reversibility of thermodynamic transformations, and all previous attempts at establishing any form of a reversible theory of entanglement have failed. It was even suspected that entanglement might actually be irreversible, making the quest an impossible one. In their new work, published in Nature Communications, the authors solve this long-standing conjecture by using probabilistic entanglement transformations, which are only guaranteed to be successful some of the time, but which, in return, provide an increased power in converting quantum systems.
Under such processes, the authors show that it is indeed possible to establish a reversible framework for entanglement manipulation, thus identifying a setting in which a unique entropy of entanglement emerges and all entanglement transformations are governed by a single quantity. The methods they used could be applied more broadly, showing similar reversibility properties also for more general quantum resources.

The findings mark significant progress in understanding the basic properties of entanglement, revealing fundamental connections between entanglement and thermodynamics, and, crucially, providing a major simplification in the understanding of entanglement conversion processes. This not only has immediate and direct applications in the foundations of quantum theory, but it will also help with understanding the ultimate limitations on our ability to efficiently manipulate entanglement in practice.
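For readers who want the textbook definition behind the phrase "entropy of entanglement" (standard background, not a result of this paper): for a pure state shared between parties A and B, it is the von Neumann entropy of either reduced state, written in LaTeX as

E(|\psi\rangle_{AB}) = S(\rho_A) = -\operatorname{Tr}\left(\rho_A \log \rho_A\right), \qquad \rho_A = \operatorname{Tr}_B\, |\psi\rangle\langle\psi|_{AB}.

The new result says that, once probabilistic transformations are allowed, a single quantity of this kind governs all entanglement conversions, much as thermodynamic entropy governs the interconversion of heat and work.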
Bartosz Regula et al, Reversibility of quantum resources through probabilistic protocols, Nature Communications (2024). DOI: 10.1038/s41467-024-47243-2
Patients with rheumatoid arthritis have unique and complex autoantibody patterns, study reveals
Patients with rheumatoid arthritis (RA) all have a unique and diverse set of antibodies that are involved in the development of the disease. Researchers unveiled the complexity of these antibodies using powerful lab tools capable of analyzing our immune system at molecular levels. Their discovery suggests that current assumptions about the origin of RA are too simple. Their findings may point towards improved diagnostics.
Rheumatoid arthritis is a chronic autoimmune disease that primarily affects the joints, causing pain, stiffness, and swelling. It arises when the immune systemmistakenly attacks the body's own tissues, leading to inflammation in the joints and potentially other organs.
The exact cause of RA remains unknown, but a crucial role is played by antibodies, special proteins made by the immune system to help fight off infections. They recognize and attack specific targets, like viruses or bacteria. Some antibodies are wrongly produced, causing them to attack our own body. Normally, our body's immune system is equipped with a 'filter' that cleans up these so-called autoantibodies. Researchers think that this mechanism is malfunctioning in RA patients.
The extent to which this filter is malfunctioning now appears to be much greater than expected. Research published in Nature Communications reveals that it's not just a handful of different RA-associated autoantibodies that evade the filter. On the contrary, the researchers found an extremely broad variety of these antibodies.
The team used novel mass spectrometry tools that profile specific antibodies typically seen in the blood of RA patients, which are called anti-citrullinated protein antibodies (ACPAs). They discovered that each RA patient possesses a unique and diverse set of ACPAs.
Dr. Krishna Kumari Challa
New models of Big Bang show that visible universe and invisible dark matter co-evolved
Physicists have long theorized that our universe may not be limited to what we can see. By observing gravitational forces on other galaxies, they've hypothesized the existence of "dark matter," which would be invisible to conventional forms of observation.
95% of the universe is dark, is invisible to the eye.
However, we know that the dark universe is there by its gravitational pull on stars. Other than its gravity, dark matter has never seemed to have much effect on the visible universe.
Yet the relationship between these visible and invisible domains, especially as the universe first formed, has remained an open question.
Now, physicists say that there is mounting evidence that these two supposedly distinct realms actually co-evolved.
Through a series of computer models, physicists have discovered that the visible and the hidden sectors, as they call them, likely co-evolved in the moments after the Big Bang, with profound repercussions for how the universe developed thereafter.
They say there was a time when some physicists effectively wrote off this hidden sector, as they can explain most of what happens within the visible—that is, if our models can accurately portray what we can see happening around us, why bother trying to measure something that has no discernible effect?
The question is, what's the influence of the hidden sector on the visible sector?" "But what do we care? We can explain everything."
Part 1
Apr 23
Dr. Krishna Kumari Challa
But we can't explain everything, the authors of this study contends. There are anomalies that don't seem to fit the so-called "Standard Model" of the universe.
That the visible and hidden sectors are mutually isolated is a misconception, they say, based on an assumption "that the visible and the hidden sectors evolved independently of each other." The new work wants to turn that assumption on its head.
In a paper published in Physical Review D, "Big Bang Initial Conditions and Self-Interacting Hidden Dark Matter, they want to ask what they call "the more important question: How do we know that they evolved independently?"
To test this assumption, Nath and his team "introduced some feeble interactions" between the two sectors into their models of the Big Bang. These meager interactions wouldn't be enough to affect the outcome of, say, particle accelerator experiments, "but we wanted to see what the effects would be on the visible sector as a whole," Nath says, "from the time of the Big Bang to the current time."Even with minimal interactions between the two sectors, Nath and his team discovered that dark matter's influence on the visible matter we're made of could have a major impact on observable phenomena.
The Hubble expansion—which says, in the simplest terms, that galaxies are moving away from one another, and thus that the universe is expanding—for instance, contains a "quite serious" differential between what the Standard Model predicts and what has been observed. Nath's models partially account for this difference.
Apr 23
Dr. Krishna Kumari Challa
One major variable is the temperature of the hidden sector during the Big Bang.
The visible sector, we can be fairly certain, started out very hot at the moment of the Big Bang. As the universe cools, Nath says, "what we see is the remnant of that period of the universe."
But by studying the evolution of the two sectors, Nath and his team could model both conditions—a hidden sector that started out hot, and another hidden sector that started out cold.
What they observed was surprising: Despite significant differences between the models, with major implications on what the universe looked like in early times, both the hot and cold models were consistent with the visible sector we can observe today.
part 3
Apr 23
Dr. Krishna Kumari Challa
Our current measurements of the visible universe, in other words, are insufficient to confirm which side the hidden sector fell on at the beginning—hot or cold.
Nath is quick to point out that, rather than a failing of the experiment, this is an example of the mathematical models outrunning our current experimental capabilities.
It's not that the difference between a hot or cold hidden sector has no bearing on the visible universe, but that we haven't performed experiments—yet—with high enough precision. Nath mentions the Webb Telescope as one example of the next generation of tools that will be able to make such precise observations.
The ultimate goal of all this modeling work is to make better predictions about the state of the universe, how it all functions and what we'll find when we look deeper and deeper into the night sky.
As our experiments gain more accuracy, the questions baked into Nath's models—was the hidden sector hot or cold?—will find their answers, and those clarified models will help predict the solutions to ever-deeper questions.
"What's the significance of this?" Nath asks. Human beings, he says, "want to find their place in the universe." And more than that, "they want to answer the question, why is there a universe?
"And we are exploring those issues. It is the ultimate quest of human beings."
Jinzheng Li et al, Big bang initial conditions and self-interacting hidden dark matter, Physical Review D (2023). DOI: 10.1103/PhysRevD.108.115008
Part 4
Apr 23
Dr. Krishna Kumari Challa
Study suggests that living near green spaces reduces the risk of depression and anxiety
Over the past decades, a growing number of people have migrated to urban areas, while the size and population of rural areas have drastically declined. While parks and other green spaces are often viewed as beneficial for the well-being of those living in cities and urban regions, so far very few studies have explored the impact of these spaces on mental health.
Researchers recently carried out a study investigating the potential link between long-term exposure to green spaces in proximity of one's home and two of the most common mental health disorders: depression and anxiety. Their findings, published in Nature Mental Health, suggest that living close to parks and green areas can reduce the risk of becoming depressed and experiencing anxiety.
As part of their study, the researchers analyzed data gathered from 409,556 people and stored in the UK Biobank database. They specifically looked at the distance between participants' homes and green areas, in conjunction with their self-reported well-being scores, as well as hospital admissions and deaths in their residential area.
The results of the analyses suggest that there is a link between prolonged proximity to residential green areas and a lower incidence of both depression and anxiety. In other words, living closer to parks and other green areas appears to reduce the risk of experiencing both conditions.
Researchers draw the important conclusion that long-term exposure to residential greenness is associated with a decreased risk of incident depression and anxiety, and reduced air pollution in the greenest areas probably plays an important role in this trend. This study thus implies that expanding urban green spaces could promote good mental health.
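To make the statistical approach concrete, here is a hedged sketch in Python (using the third-party lifelines library) of the kind of proportional-hazards survival model such cohort studies typically fit. The data are synthetic and the column names are placeholders, not UK Biobank variables.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 5000
greenness = rng.uniform(0, 1, n)                    # stand-in for residential greenness
air_pollution = 1 - 0.6 * greenness + rng.normal(0, 0.2, n)
# synthetic follow-up: higher greenness -> lower hazard of incident depression
hazard = 0.02 * np.exp(-0.5 * greenness + 0.8 * air_pollution)
time_to_event = rng.exponential(1 / hazard)
follow_up = np.minimum(time_to_event, 12.0)         # administrative censoring at 12 years
event = (time_to_event <= 12.0).astype(int)

df = pd.DataFrame({"greenness": greenness, "air_pollution": air_pollution,
                   "duration": follow_up, "event": event})
cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()   # a hazard ratio below 1 for greenness indicates lower risk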
Jianing Wang et al, Long-term exposure to residential greenness and decreased risk of depression and anxiety, Nature Mental Health (2024). DOI: 10.1038/s44220-024-00227-z.
Apr 24
Dr. Krishna Kumari Challa
How light can vaporize water without the need for heat
It's the most fundamental of processes—the evaporation of water from the surfaces of oceans and lakes, the burning off of fog in the morning sun, and the drying of briny ponds that leaves solid salt behind. Evaporation is all around us, and humans have been observing it and making use of it for as long as we have existed.
And yet, it turns out, we've been missing a major part of the picture all along.
In a series of painstakingly precise experiments, a team of researchers has demonstrated that heat isn't alone in causing water to evaporate. Light, striking the water's surface where air and water meet, can break water molecules away and float them into the air, causing evaporation in the absence of any source of heat.
The astonishing new discovery could have a wide range of significant implications. It could help explain mysterious measurements over the years of how sunlight affects clouds, and therefore affect calculations of the effects of climate change on cloud cover and precipitation. It could also lead to new ways of designing industrial processes such as solar-powered desalination or drying of materials.
The findings, and the many different lines of evidence that demonstrate the reality of the phenomenon and the details of how it works, are described recently in the Proceedings of the National Academy of Sciences.
The authors say their study suggests that the effect should happen widely in nature—everywhere from clouds to fogs to the surfaces of oceans, soils, and plants—and that it could also lead to new practical applications, including in energy and clean water production.
Part 1
Apr 24
Dr. Krishna Kumari Challa
The new work builds on research reported* last year, which described this new "photomolecular effect" but only under very specialized conditions: on the surface of specially prepared hydrogels soaked with water. In the new study, the researchers demonstrate that the hydrogel is not necessary for the process; it occurs at any water surface exposed to light, whether it's a flat surface like a body of water or a curved surface like a cloud droplet.
Because the effect was so unexpected, the team worked to prove its existence with as many different lines of evidence as possible. In this study, they report 14 different kinds of tests and measurements they carried out to establish that water was indeed evaporating—that is, molecules of water were being knocked loose from the water's surface and wafted into the air—due to the light alone, not by heat, which was long assumed to be the only mechanism involved.

One key indicator, which showed up consistently in four different kinds of experiments under different conditions, was that as the water began to evaporate from a test container under visible light, the air temperature measured above the water's surface cooled down and then leveled off, showing that thermal energy was not the driving force behind the effect.
Other key indicators that showed up included the way the evaporation effect varied depending on the angle of the light, the exact color of the light, and its polarization. None of these varying characteristics should happen because at these wavelengths, water hardly absorbs light at all—and yet the researchers observed them.
The effect is strongest when light hits the water surface at an angle of 45 degrees. It is also strongest with a certain type of polarization, called transverse magnetic polarization. And it peaks in green light—which, oddly, is the color for which water is most transparent and thus interacts the least.
* Yaodong Tu et al, Plausible photomolecular effect leading to water evaporation exceeding the thermal limit, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2312751120
Part 2
Apr 24
Dr. Krishna Kumari Challa
Researchers have proposed a physical mechanism that can explain the angle and polarization dependence of the effect, showing that the photons of light can impart a net force on water molecules at the water surface that is sufficient to knock them loose from the body of water. But they cannot yet account for the color dependence, which they say will require further study.
They have named this the photomolecular effect, by analogy with the photoelectric effect that was discovered by Heinrich Hertz in 1887 and finally explained by Albert Einstein in 1905. That effect was one of the first demonstrations that light also has particle characteristics, which had major implications in physics and led to a wide variety of applications, including LEDs. Just as the photoelectric effect liberates electrons from atoms in a material in response to being hit by a photon of light, the photomolecular effect shows that photons can liberate entire molecules from a liquid surface, the researchers say.
The finding of evaporation caused by light instead of heat provides new disruptive knowledge of light-water interaction.
It could help us gain new understanding of how sunlight interacts with cloud, fog, oceans, and other natural water bodies to affect weather and climate. It has significant potential practical applications such as high-performance water desalination driven by solar energy.
The finding may solve an 80-year-old mystery in climate science. Measurements of how clouds absorb sunlight have often shown that they are absorbing more sunlight than conventional physics dictates possible. The additional evaporation caused by this effect could account for the longstanding discrepancy, which has been a subject of dispute since such measurements are difficult to make.
Those experiments are based on satellite data and flight data—researchers fly an airplane above and below the clouds—as well as measurements of ocean temperature and the radiation balance. They all conclude that clouds absorb more sunlight than theory can account for. However, due to the complexity of clouds and the difficulty of making such measurements, researchers have debated whether the discrepancies are real. The new discovery suggests that there is another mechanism for cloud absorption, one that was not accounted for and that might explain the discrepancies.
Part 3
Apr 24
Dr. Krishna Kumari Challa
There are many lines of evidence. The flat region in the air-side temperature distribution above hot water will be the easiest for people to reproduce. That temperature profile "is a signature" that demonstrates the effect clearly.
It is quite hard to explain how this kind of flat temperature profile comes about without invoking some other mechanism beyond the accepted theories of thermal evaporation.
The observations in the manuscript point to a new physical mechanism that foundationally alters our thinking on the kinetics of evaporation.
Guangxin Lv et al, Photomolecular effect: Visible light interaction with air–water interface, Proceedings of the National Academy of Sciences (2024). DOI: 10.1073/pnas.2320844121
Part 4
Apr 24
Dr. Krishna Kumari Challa
Researchers detect a new molecule in space
New research has revealed the presence of a previously unknown molecule in space. The open-access paper describing it, "Rotational Spectrum and First Interstellar Detection of 2-Methoxyethanol Using ALMA Observations of NGC 6334I," was published in the April 12 issue of The Astrophysical Journal Letters.
Researchers worked to assemble a puzzle comprised of pieces collected from across the globe, extending beyond MIT to France, Florida, Virginia, and Copenhagen, to achieve this exciting discovery.
To detect new molecules in space, researchers first must have an idea of what molecule they want to look for, then they can record its spectrum in the lab here on Earth, and then finally they look for that spectrum in space using telescopes.
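The "spectral barcode" comes from a molecule's rotational transitions. As a simplified illustration (a rigid linear rotor, which is far simpler than an asymmetric top like 2-methoxyethanol), the energy levels are E_J = B·J(J+1), so allowed lines fall at frequencies 2B(J+1), evenly spaced by 2B. The Python sketch below uses a made-up rotational constant just to show the pattern; the real molecule's spectrum is much richer, which is why lab measurements from 8 to 500 GHz were needed.

# Rigid linear rotor: J -> J+1 transition frequencies at 2B(J+1).
B_GHz = 4.0   # hypothetical rotational constant

lines = [2 * B_GHz * (J + 1) for J in range(10)]
print("predicted line positions (GHz):", [round(f, 1) for f in lines])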
To detect this molecule using radio telescope observations, the group first needed to measure and analyze its rotational spectrum on Earth. The researchers combined experiments from the University of Lille (Lille, France), the New College of Florida (Sarasota, Florida), and the McGuire lab at MIT to measure this spectrum over a broadband region of frequencies ranging from the microwave to sub-millimeter wave regimes (approximately 8 to 500 gigahertz).
The data gleaned from these measurements permitted a search for the molecule using Atacama Large Millimeter/submillimeter Array (ALMA) observations toward two separate star-forming regions: NGC 6334I and IRAS 16293-2422B. Members of the group analyzed these telescope observations alongside researchers at the National Radio Astronomy Observatory (Charlottesville, Virginia) and the University of Copenhagen, Denmark.
Ultimately, they observed 25 rotational lines of 2-methoxyethanol that lined up with the molecular signal observed toward NGC 6334I (the barcode matched), thus resulting in a secure detection of 2-methoxyethanol in this source. This allowed them to then derive physical parameters of the molecule toward NGC 6334I, such as its abundance and excitation temperature. It also enabled an investigation of the possible chemical formation pathways from known interstellar precursors.
Molecular discoveries like this one help the researchers to better understand the development of molecular complexity in space during the star formation process. 2-methoxyethanol, which contains 13 atoms, is quite large by interstellar standards—as of 2021, only six species larger than 13 atoms had been detected outside the solar system, many by this research group, and all of them existing as ringed structures.
Continued observations of large molecules and subsequent derivations of their abundances allow scientists to advance our knowledge of how efficiently large molecules can form and by which specific reactions they may be produced.
Zachary T. P. Fried et al, Rotational Spectrum and First Interstellar Detection of 2-methoxyethanol Using ALMA Observations of NGC 6334I, The Astrophysical Journal Letters (2024). DOI: 10.3847/2041-8213/ad37ff
Apr 24
Dr. Krishna Kumari Challa
Laser-treated cork absorbs oil for carbon-neutral ocean cleanup
Oil spills are deadly disasters for ocean ecosystems. They can have lasting impacts on fish and marine mammals for decades and wreak havoc on coastal forests, coral reefs, and the surrounding land. Chemical dispersants are often used to break down oil, but they often increase toxicity in the process.
In Applied Physics Letters, researchers published their work using laser treatments to transform ordinary cork into a powerful tool for treating oil spills.
They wanted to create a nontoxic, effective oil cleanup solution using materials with a low carbon footprint, but their decision to try cork resulted from a surprising discovery.
In a different laser experiment, they accidentally found that the wettability of the cork processed using a laser changed significantly, gaining superhydrophobic (water-repelling) and superoleophilic (oil-attracting) properties. After appropriately adjusting the processing parameters, the surface of the cork became very dark, which made them realize that it might be an excellent material for photothermal conversion.
Combining these results with the eco-friendly, recyclable advantages of cork, they thought of using it for marine oil spill cleanup.
Cork comes from the bark of cork oak trees, which can live for hundreds of years. These trees can be harvested about every seven years, making cork a renewable material. When the bark is removed, the trees amplify their biological activity to replace it and increase their carbon storage, so harvesting cork helps mitigate carbon emissions.
Part 1
Apr 24
Dr. Krishna Kumari Challa
The authors tested variations of a fast-pulsing laser treatment to find the optimal balance of surface characteristics in the cork that could be achieved at low cost.
They closely examined nanoscopic structural changes and measured the ratio of oxygen and carbon in the material, changes in the angles with which water and oil contact the surface, and the material's light wave absorption, reflection, and emission across the spectrum to determine its durability after multiple cycles of warming and cooling.
The photothermal properties endowed in cork through this laser processing allow the cork to warm quickly in the sun. The deep grooves also increase the surface area exposed to sunlight, so the cork can be warmed by just a little sunlight in 10–15 seconds. This energy is used to heat up spilled oil, lowering its viscosity and making it easier to collect. In experiments, the laser-treated cork collected oil out of water within two minutes.
The laser treatments not only help to better absorb oil, but also work to keep water out.
When the cork undergoes a fast-pulsing laser treatment, its surface microstructure becomes rougher. This micro- to nano-level roughness enhances hydrophobicity.
As a result, the cork collects the oil without absorbing water, so the oil can be extracted from the cork and possibly even reused.
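Standard wetting theory (general background, not specific to this paper) captures why roughness amplifies a surface's intrinsic tendencies. In LaTeX notation, Wenzel's relation and the Cassie–Baxter relation read

\cos\theta^{*} = r\,\cos\theta \quad \text{(Wenzel, roughness factor } r \ge 1\text{)}, \qquad
\cos\theta^{*} = f\,(\cos\theta + 1) - 1 \quad \text{(Cassie--Baxter, solid fraction } f\text{)},

where θ is the contact angle on a smooth surface and θ* the apparent angle on the rough one. Roughness pushes an already water-repelling surface toward superhydrophobicity while pulling an oil-wetting surface toward even stronger oil uptake, consistent with the behavior described above.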
Femtosecond laser structured black superhydrophobic cork for efficient solar-driven cleanup of crude oil, Applied Physics Letters (2024). DOI: 10.1063/5.0199291
Part 2
Apr 24
Dr. Krishna Kumari Challa
Light stands still in a deformed crystal
AMOLF researchers, in collaboration with Delft University of Technology, have succeeded in bringing light waves to a halt by deforming the two-dimensional photonic crystal that contains them. The researchers show that even a subtle deformation can have a substantial effect on photons in the crystal. This resembles the effect that a magnetic field has on electrons.
This principle offers a new approach to slow down light fields and thereby enhance their strength. Realizing this on a chip is particularly important for many applications, say the researchers.
The researchers have published their findings in the journal Nature Photonics. Simultaneously, a research team from Pennsylvania State University has published an article in the same journal about how they demonstrated—independently from the Dutch team—an identical effect.
Manipulating the flow of light in a material at small scales is beneficial for the development of nanophotonic chips. For electrons, such manipulation can be realized using magnetic fields; the Lorentz force steers the motion of electrons. However, this is impossible for photons because they do not have charge.
Researchers in the Photonic Forces group at AMOLF are looking for techniques and materials that would enable them to apply forces to photons that resemble the effects of magnetic fields.
The researchers looked for inspiration in the way electrons behave in materials. In a conductor, electrons can in principle move freely, but an external magnetic field can stop this. The circular motion caused by the magnetic field stops conduction, and as such electrons can only exist in the material if they have very specific energies. These energy levels are called Landau levels, and they are characteristic of electrons in a magnetic field.
But in the two-dimensional material graphene—which consists of a single layer of carbon atoms arranged in a crystal—these Landau levels can also be caused by a different mechanism than a magnetic field. In general, graphene is a good electronic conductor, but this changes when the crystal lattice is deformed, for instance by stretching it like an elastic band. Such mechanical deformation stops conduction; the material turns into an insulator and consequently the electrons are bound to Landau levels. Hence, the deformation of graphene has a similar effect on electrons in the material as a magnetic field, even without a magnet. The researchers asked themselves whether a similar approach would also work for photons.
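As standard background (not taken from the paper): for free electrons in a magnetic field B, the Landau levels are, in LaTeX notation,

E_n = \hbar\,\omega_c\left(n + \tfrac{1}{2}\right), \qquad \omega_c = \frac{eB}{m}, \qquad n = 0, 1, 2, \ldots

while for the massless Dirac carriers of graphene the levels instead scale as E_n \propto \pm\sqrt{nB}. Strain-induced pseudomagnetic fields reproduce this same discrete, flat-band spectrum without any real magnet, which is the analogy the photonic experiments exploit.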
Part 1
Apr 25
Dr. Krishna Kumari Challa
In a collaboration with Kobus Kuipers of Delft University of Technology, the group indeed demonstrated a similar effect for light in a photonic crystal.
A photonic crystal normally consists of a regular—two dimensional—pattern of holes in a silicon layer. Light can move freely in this material, just like electrons in graphene.
Breaking this regularity in exactly the right manner will deform the array and consequently lock the photons. This is how they create Landau levels for photons.
In Landau levels light waves no longer move; they do not flow through the crystal but stand still. The researchers succeeded in demonstrating this, showing that the deformation of the crystal array has a similar effect on photons as a magnetic field on electrons.
By playing with the deformation pattern, we even managed to establish various types of effective magnetic fields in one material. As a result, photons can move through certain parts of the material but not in others. Hence, these insights also provide new ways to steer light on a chip.
This brings on-chip applications closer. If we can confine light at the nanoscale and bring it to a halt like this, its strength will be enhanced tremendously. And not only at one location, but over the entire crystal surface. Such light concentration is very important in nanophotonic devices, for example for the development of efficient lasers or quantum light sources.
René Barczyk et al, Observation of Landau levels and chiral edge states in photonic crystals through pseudomagnetic fields induced by synthetic strain, Nature Photonics (2024). DOI: 10.1038/s41566-024-01412-3
Part 2
Apr 25
Dr. Krishna Kumari Challa
Bioluminescence first evolved in animals at least 540 million years ago, pushing back previous oldest dated example
Bioluminescence first evolved in animals at least 540 million years ago in a group of marine invertebrates called octocorals, according to the results of a new study from scientists with the Smithsonian's National Museum of Natural History.
The results, published April 23, in the Proceedings of the Royal Society B: Biological Sciences, push back the previous record for the luminous trait's oldest dated emergence in animals by nearly 300 million years, and could one day help scientists decode why the ability to produce light evolved in the first place.
Bioluminescence—the ability of living things to produce light via chemical reactions—has independently evolved at least 94 times in nature and is involved in a huge range of behaviors including camouflage, courtship, communication and hunting. Until now, the earliest dated origin of bioluminescence in animals was thought to be around 267 million years ago in small marine crustaceans called ostracods. But for a trait that is literally illuminating, bioluminescence's origins have remained shadowy.
Nobody quite knows why it first evolved in animals.
In search of the trait's earliest origins, the researchers decided to peer back into the evolutionary history of the octocorals, an evolutionarily ancient and frequently bioluminescent group of animals that includes soft corals, sea fans and sea pens.
Like hard corals, octocorals are tiny colonial polyps that secrete a framework that becomes their refuge, but unlike their stony relatives, that structure is usually soft. Octocorals that glow typically only do so when bumped or otherwise disturbed, leaving the precise function of their ability to produce light a bit mysterious.
Part 1
Apr 25
Dr. Krishna Kumari Challa
Octocorals are one of the oldest groups of animals on the planet known to bioluminesce. So the question is: when did they develop this ability?
Researchers had completed an extremely detailed, well-supported evolutionary tree of the octocorals in 2022. They created this map of evolutionary relationships, or phylogeny, using genetic data from 185 species of octocorals.
With this evolutionary tree grounded in genetic evidence, DeLeo and Quattrini then situated two octocoral fossils of known ages within the tree according to their physical features. The scientists were able to use the fossils' ages and their respective positions in the octocoral evolutionary tree to figure out roughly when octocoral lineages split apart to become two or more branches.
Next, the team mapped out the branches of the phylogeny that featured living bioluminescent species.
With the evolutionary tree dated and the branches that contained luminous species labeled, the team then used a series of statistical techniques to perform an analysis called ancestral state reconstruction.
If we know these species of octocorals living today are bioluminescent, we can use statistics to infer whether their ancestors were highly probable to be bioluminescent or not. The more living species that share the trait, the higher the probability that, as you move back in time, their ancestors had that trait as well.
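A toy version of such an ancestral state reconstruction can be written in a few lines. The Python sketch below applies Felsenstein's pruning algorithm to a made-up four-species tree under a simple symmetric gain/loss model; the tip states, branch lengths and rate are all hypothetical, and the real study used far larger trees and several different methods.

import math

RATE = 0.5          # hypothetical gain/loss rate
tip_state = {"sp1": 1, "sp2": 1, "sp3": 1, "sp4": 0}   # 1 = bioluminescent

def p_change(t):
    # Probability the trait flips along a branch of length t (symmetric 2-state model).
    return 0.5 - 0.5 * math.exp(-2 * RATE * t)

def cond_lik(node):
    # Return [L(state=0), L(state=1)] for the data below this node.
    if isinstance(node, str):                       # a tip
        s = tip_state[node]
        return [1.0 if s == 0 else 0.0, 1.0 if s == 1 else 0.0]
    (left, t_left), (right, t_right) = node         # internal node: two (child, branch) pairs
    L_left, L_right = cond_lik(left), cond_lik(right)
    out = []
    for state in (0, 1):
        def along(t, L):
            pc = p_change(t)
            return (1 - pc) * L[state] + pc * L[1 - state]
        out.append(along(t_left, L_left) * along(t_right, L_right))
    return out

# ((sp1, sp2), (sp3, sp4)) with hypothetical branch lengths
tree = (((("sp1", 0.1), ("sp2", 0.1)), 0.3),
        ((("sp3", 0.2), ("sp4", 0.2)), 0.3))
L0, L1 = cond_lik(tree)
posterior_lit = L1 / (L0 + L1)                      # uniform prior on the root state
print(f"posterior probability the common ancestor was bioluminescent: {posterior_lit:.2f}")

With three of the four hypothetical tips luminous, the toy model assigns the root a high probability of being luminous, which is the same logic, scaled down enormously, as in the study.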
The researchers used numerous different statistical methods for their ancestral state reconstruction, but all arrived at the same result: some 540 million years ago, the common ancestor of all octocorals was very likely bioluminescent. That is 273 million years earlier than the glowing ostracod crustaceans that previously held the title of earliest known evolution of bioluminescence in animals.
The octocorals' thousands of living representatives and relatively high incidence of bioluminescence suggest the trait has played a role in the group's evolutionary success. While this raises the question of what exactly octocorals are using bioluminescence for, the researchers said the fact that it has been retained for so long highlights how important this form of communication has become for their fitness and survival.
Evolution of bioluminescence in Anthozoa with emphasis on Octocorallia, Proceedings of the Royal Society B: Biological Sciences (2024). DOI: 10.1098/rspb.2023.2626. royalsocietypublishing.org/doi … .1098/rspb.2023.2626
Part 2
Apr 25
Dr. Krishna Kumari Challa
Scientists study lipids cell by cell, making new cancer research possible
Imagine being able to look inside a single cancer cell and see how it communicates with its neighbors. Scientists are celebrating a new technique that lets them study the fatty contents of cancer cells, one by one.
A study has sampled single live cancer cells and measured the fatty lipid compounds inside them. Working with partners at GSK and UCL, and developing new equipment with Yokogawa, the team saw how those cells transformed in response to changes in their environment.
The work appears in Analytical Chemistry.
The trouble with cancer cells is that no two are alike. That makes it harder to design good treatment, because some cells will always resist treatment more than others. Yet it has always proven tricky to study live cells after they have been removed from their natural environment, in enough detail to truly understand their makeup. That is why it is so exciting to be able to sample live cells under a microscope and study their fatty contents one by one.
Individual pancreatic cancer cells were lifted from a glass culture dish using Yokogawa's Single Cellome System SS2000. This extracts single live cells using tiny tubes 10 µm across—about half the diameter of the thinnest human hair.
By staining the cells with fluorescent dye, the researchers could monitor lipid droplets (stores of fatty molecules inside cells, thought to play an important role in cancer) throughout the experiment.
Then, working with partners at Sciex, researchers developed a new method using a mass spectrometer to fragment the lipids in the cells. This told them about their composition.
The researchers demonstrated that different cells had very different lipid profiles. They also saw how lipids in the cells changed in response to what was going on around them.
Untargeted single-cell lipidomics using liquid chromatography and data-dependent acquisition after live cell selection, Analytical Chemistry (2024). DOI: 10.1021/acs.analchem.3c05677
Apr 25
Dr. Krishna Kumari Challa
Lethal mpox strain appears to spread via sex
A virulent strain of the monkeypox virus might have gained the ability to spread through sexual contact. The strain, called clade Ib, has caused a cluster of infections in a conflict-ridden region of the Democratic Republic of the Congo (DRC). This isn’t the first time scientists have warned that the monkeypox virus could become sexually transmissible: similar warnings during a 2017 outbreak in Nigeria were largely ignored. The strain responsible, clade II, is less lethal than clade Ib, but ultimately caused an ongoing global outbreak that has infected more than 94,000 people and killed more than 180.
https://www.medrxiv.org/content/10.1101/2024.04.12.24305195v2.full....
https://www.nature.com/articles/d41586-024-01167-5?utm_source=Live+...
Apr 25
Dr. Krishna Kumari Challa
WHO redefines airborne transmission
The World Health Organization (WHO) has changed how it classifies airborne pathogens. It has removed the distinction between transmission by smaller virus-containing ‘aerosol’ particles and spread through larger ‘droplets’. The division, which some researchers argue was unscientific, justified WHO’s March 2020 assertion that SARS-CoV-2, the virus behind the COVID-19 pandemic, was not airborne. Under the new definition, SARS-CoV-2 would be recognized as spreading ‘through the air’ — although some scientists feel this term is less clear than ‘airborne’. “I'm not saying everybody is happy, and not everybody agrees on every word in the document, but at least people have agreed this is a baseline terminology,” says WHO chief scientist Jeremy Farrar.
https://www.who.int/publications/m/item/global-technical-consultati...
Apr 25
Dr. Krishna Kumari Challa
Disease Ecology Butterfly Effect
When their preferred trees to chew on were cut down for the tobacco trade, chimpanzees in Uganda began consuming bat guano instead. Researchers recorded videos in the Budongo Forest Reserve between 2017 and 2019 and observed 839 instances of guano consumption, not only by chimpanzees but also by black-and-white colobus monkeys and red duikers, a type of forest antelope. The guano provides the chimps with essential minerals like sodium, potassium, magnesium and phosphorus that they would normally have gotten from the felled trees.
Why this matters: In addition to essential nutrients, the bat guano contained 27 unique viruses, including a novel coronavirus, the researchers found. Illnesses transmitted from animals to humans, called zoonotic diseases, account for about three quarters of new infectious diseases around the world. Those pathogens have a higher chance of jumping from an animal to a human when people encroach on ecosystems and disrupt relationships among species.
“This is the butterfly effect of infectious disease ecology,” says senior study author Tony Goldberg, a wildlife epidemiologist at the University of Wisconsin–Madison. “Far-flung events like demand for tobacco can have crazy, unintended consequences for disease emergence that follow pathways that we rarely see and can’t predict.”
Apr 25
Dr. Krishna Kumari Challa
Study explores why human-inspired machines can be perceived as eerie
Artificial intelligence (AI) algorithms and robots are becoming increasingly advanced, exhibiting capabilities that vaguely resemble those of humans. The growing similarities between AIs and humans could ultimately bring users to attribute human feelings, experiences, thoughts, and sensations to these systems, which some people perceive as eerie and uncanny.
A recent paper, published in Computers in Human Behavior: Artificial Humans, reviews past studies and reports the findings of an experiment testing a recent theory known as "mind perception," which proposes that people feel eeriness when exposed to robots that closely resemble humans because they ascribe minds to these robots.
For many people, the idea of a machine with conscious experience is unsettling. This discomfort extends to inanimate objects as well.
Overall, the results of the meta-analysis and experiment run by these researchers suggest that past studies backing mind perception theory could be flawed. In fact, the researchers gathered opposite results, suggesting that individuals who attribute sentience to robots do not necessarily find them eerier because of their human resemblance.
Although attributions of mind are not the main cause of the uncanny valley, they are part of the story; they can be relevant in some contexts and situations. It is not that attributing a mind to a machine that looks human is creepy. Instead, perceiving a mind in a machine that already looks creepy makes it creepier, whereas perceiving a mind in a machine that has risen out of the uncanny valley and looks nearly human makes it less creepy.
Exploring whether there is strong support for this speculation is an area for future research, which would involve using more varied and numerous stimuli.
Karl F. MacDorman, Does mind perception explain the uncanny valley? A meta-regression analysis and (de)humanization experiment, Computers in Human Behavior: Artificial Humans (2024). DOI: 10.1016/j.chbah.2024.100065
Apr 26
Dr. Krishna Kumari Challa
Link between depression and cardiovascular disease explained: They partly develop from same gene module
Depression and cardiovascular disease (CVD) are serious concerns for public health. Approximately 280 million people worldwide have depression, while 620 million people have CVD.
It has been known since the 1990s that the two diseases are somehow related. For example, people with depression run a greater risk of CVD, while effective early treatment for depression cuts the risk of subsequently developing CVD by half. Conversely, people with CVD tend to have depression as well. For these reasons, the American Heart Association (AHA) advises monitoring teenagers with depression for CVD.
What wasn't yet known is what causes this apparent relatedness between the two diseases. Part of the answer probably lies in lifestyle factors common in patients with depression and which increase the risk of CVD, such as smoking, alcohol abuse, lack of exercise, and a poor diet. But it's also possible that both diseases might be related at a deeper level, through shared developmental pathways.
Now, scientists have shown that depression and CVD do indeed share part of their developmental programs, having at least one functional gene module in common. This result, published in Frontiers in Psychiatry, provides new markers for depression and CVD, and could ultimately help researchers to find drugs to target both diseases.
Part 1
Apr 26
Dr. Krishna Kumari Challa
Researchers used advanced statistics to identify 22 distinct gene modules, of which just one was associated with both a high score for depressive symptoms and a low score for cardiovascular health.
The top three genes from this gene module are known to be associated with neurodegenerative diseases, bipolar disorder, and depression. Now they have shown that they are associated with poor cardiovascular health as well.
These genes are involved in biological processes, such as inflammation, that are involved in pathogenesis of both depression and cardiovascular disease. This helps to explain why both diseases often occur together.
Other genes in the shared module have been shown to be involved in brain diseases such as Alzheimer's, Parkinson's, and Huntington's disease.
Researchers and doctors can use the genes in this module as biomarkers for depression and cardiovascular disease. Ultimately, these biomarkers may facilitate the development of dual-purpose preventative strategies for both the diseases.
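As a rough sketch of how one gene module can be tested for association with two traits at once (simulated data, not the Young Finns cohort; the study's actual statistics are more involved), a module is often summarized by its first principal component, an "eigengene," which is then correlated with each trait:

# Sketch of module-trait association on simulated data: summarize a co-expression
# module by its first principal component and correlate it with a depressive-symptom
# score and a cardiovascular-health score. All values below are simulated.
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_genes = 200, 30

# Simulate a module whose genes share a common driver that also shifts both traits.
driver = rng.normal(size=n_subjects)
expression = driver[:, None] + rng.normal(scale=0.8, size=(n_subjects, n_genes))
depressive_symptoms = 0.5 * driver + rng.normal(scale=1.0, size=n_subjects)
cardio_health = -0.4 * driver + rng.normal(scale=1.0, size=n_subjects)

# Module "eigengene": first principal component of the standardized expression matrix
# (its overall sign is arbitrary; only the pattern of association matters).
X = (expression - expression.mean(0)) / expression.std(0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
eigengene = X @ Vt[0]

for name, trait in [("depressive symptoms", depressive_symptoms),
                    ("cardiovascular health", cardio_health)]:
    r = np.corrcoef(eigengene, trait)[0, 1]
    print(f"module eigengene vs {name}: r = {r:+.2f}")

A module that tracks a high symptom score and a low health score at the same time, as simulated here, is the kind of signal the study reports for its shared module.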
Binisha Hamal Mishra et al, Identification of gene networks jointly associated with depressive symptoms and cardiovascular health metrics using whole blood transcriptome in the Young Finns Study, Frontiers in Psychiatry (2024). DOI: 10.3389/fpsyt.2024.1345159
Part 2
Apr 26
Dr. Krishna Kumari Challa
Genetic variations may predispose people to Parkinson's disease following long-term pesticide exposure, study finds
A new UCLA Health study has found that certain genetic variants could help explain how long-term pesticide exposure could increase the risk of Parkinson's disease.
While decades of research have linked pesticide exposure and Parkinson's disease risk, researchers have sought to explain why some individuals with high exposure develop the disease while others do not.
One longstanding hypothesis has been that susceptibility to the disease is a combination of both environmental and genetic factors.
The new study, published in the journal npj Parkinson's Disease, used genetic data from nearly 800 Central Valley (California) residents with Parkinson's disease, many of whom had long-term exposure to 10 pesticides used on cotton crops for at least a decade prior to developing the disease, with some patients having been exposed as far back as 1974.
The researchers examined the study participants' genetic makeup for rare variants in genes associated with the function of lysosomes—cellular compartments that break down waste and debris, thought to be associated with the development of Parkinson's disease—and looked for enrichment of variants in patients with high exposure to pesticide use compared to a representative sample of the general population.
Researchers found that variants in these genes were enriched in patients with more severe Parkinson's disease who also had higher exposure to pesticides. These genetic variants also appeared to be deleterious to protein function, suggesting that disruption of lysosomal activity may underlie the development of Parkinson's disease combined with pesticide exposure.
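A minimal sketch of the enrichment comparison described here (invented counts, not the UCLA data; real rare-variant burden testing is more sophisticated) is a 2x2 test of carrier frequency in highly exposed patients versus a reference population:

# Sketch of a variant-enrichment check with invented counts: are carriers of rare
# lysosomal-gene variants over-represented among highly pesticide-exposed
# Parkinson's patients relative to a reference sample?
from scipy.stats import fisher_exact

carriers_cases, noncarriers_cases = 42, 358      # hypothetical exposed patients
carriers_ref, noncarriers_ref = 180, 4820        # hypothetical reference sample

odds_ratio, p_value = fisher_exact(
    [[carriers_cases, noncarriers_cases],
     [carriers_ref, noncarriers_ref]],
    alternative="greater",                       # one-sided test for enrichment in cases
)
print(f"odds ratio = {odds_ratio:.2f}, one-sided p = {p_value:.2g}")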
Lysosomal Genes Contribute to Parkinson's Disease near Agriculture with High Intensity Pesticide Use, npj Parkinson's Disease (2024). DOI: 10.1038/s41531-024-00703-4. www.nature.com/articles/s41531-024-00703-4
Apr 26
Dr. Krishna Kumari Challa
Yeast study offers possible answer to why some species are generalists and others specialists
In a landmark study based on one of the most comprehensive genomic datasets ever assembled, a team of scientists offer a possible answer to one of the oldest questions about evolution: why some species are generalists and others specialists.
The researchers mapped the genetic blueprints, appetites, and environments of more than 1,000 species of yeasts, building a family tree that illuminates how these single-celled fungi evolved over the past 400 million years.
The results, published in the journal Science, suggest that internal—not external—factors are the primary drivers of variation in the types of carbon yeasts can eat, and the researchers found no evidence that metabolic versatility, or the ability to eat different foods, comes with any trade-offs. In other words, some yeasts are jacks-of-all-trades and masters of each.
Like other organisms, some yeasts have evolved to be specialists—think koalas, which eat nothing but eucalyptus leaves—while others are generalists like raccoons, which eat just about anything.
Scientists have been trying to explain why both generalists and specialists exist almost since Charles Darwin proposed his theory of evolution in 1859.
Part 1
Apr 26
Dr. Krishna Kumari Challa
Scientists have offered two broad models to explain the phenomenon.
One suggests generalists are jacks-of-all-trades but masters of none, meaning they can tolerate a wider range of conditions or food sources but aren't as dominant as a specialist in any specific niche.
The other theory is that a combination of internal and external factors drive niche variation.
For example, organisms can acquire genes that allow them to make enzymes capable of breaking down more than one substance, expanding the range of foods they can eat. Conversely, random loss of genes over time can result in a narrower palate.
Likewise, environments can exert selective pressure on traits. So a habitat with only one or two food sources or constant temperatures would favor specialists, while generalists might do better in an environment with a wider array of food or conditions.
When it comes to yeast metabolism, the research team found no evidence of trade-offs.
The generalists are better across all the carbon sources they can use. Generalists are also able to use more nitrogen sources than carbon specialists.
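One simple way to look for the kind of trade-off discussed above (a sketch on made-up numbers, not the published yeast dataset) is to ask whether niche breadth, the number of carbon sources a species can grow on, is negatively correlated with how well it grows on them; the absence of a negative correlation means no evidence of a jack-of-all-trades penalty:

# Trade-off check on made-up data: a "master of none" pattern would appear as a
# negative correlation between niche breadth and average growth rate.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
n_species = 100

niche_breadth = rng.integers(1, 18, size=n_species)   # number of usable carbon sources
# Simulate the no-trade-off scenario the study reports: growth is unrelated
# (or even slightly positively related) to breadth.
mean_growth = 0.3 + 0.005 * niche_breadth + rng.normal(scale=0.05, size=n_species)

rho, p = spearmanr(niche_breadth, mean_growth)
print(f"Spearman rho = {rho:+.2f} (p = {p:.2g}); "
      "a clearly negative rho would have suggested a trade-off")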
The data also showed that environmental factors play only a limited role.
Hittinger cautions there are limitations to what can be inferred from the data. It's possible that tradeoffs are present in species that weren't studied. And the lab experiments used to measure metabolic growth can't replicate the conditions in soils, tree bark, or insect guts where yeasts live in nature.
Dana A. Opulente et al, Genomic factors shape carbon and nitrogen metabolic niche breadth across Saccharomycotina yeasts, Science (2024). DOI: 10.1126/science.adj4503. www.science.org/doi/10.1126/science.adj4503
**
Part 2
Apr 26
Dr. Krishna Kumari Challa
Forget Billions of Years: Scientists Have Grown Diamonds in Just 150 Minutes
Natural diamonds take billions of years to form under the extreme pressures and temperatures deep underground. Synthetic diamonds can be produced far quicker, but they typically still require intense squeezing for up to several weeks. A new method based on a mix of liquid metals can pop out an artificial diamond in a matter of minutes, without the need for a giant squeeze. While high temperatures were still required, in the region of 1,025°C (1,877°F), a continuous diamond film formed in 150 minutes at 1 atm (one standard atmosphere). That's the pressure we feel at sea level, and tens of thousands of times less than the pressure normally required.

The team behind the innovative approach, led by researchers from the Institute for Basic Science in South Korea, is confident that the process can be scaled up to make a significant difference in the production of synthetic diamonds.

Dissolving carbon into liquid metal to manufacture diamond isn't entirely new: General Electric developed a process half a century ago using molten iron sulfide. But such processes still required pressures of 5–6 gigapascals and a diamond 'seed' for the carbon to cling to.

"We discovered a method to grow diamonds at 1 atm pressure and under a moderate temperature by using a liquid metal alloy," write the researchers in their published paper. The reduction in pressure was achieved using a carefully mixed blend of liquid metals: gallium, iron, nickel, and silicon. A custom-made vacuum system was built inside a graphite casing to very rapidly heat and then cool the metal while it was exposed to a combination of methane and hydrogen. These conditions cause carbon atoms from the methane to spread into the melted metal, acting as seeds for the diamonds.

After just 15 minutes, small fragments of diamond crystal emerged from the liquid metal just beneath the surface, while two-and-a-half hours of exposure produced a continuous diamond film. Though the concentration of carbon forming the crystals decreased at a depth of just a few hundred nanometers, the researchers expect the process can be improved with a few tweaks.
https://www.nature.com/articles/s41586-024-07339-7
**
Apr 26
Dr. Krishna Kumari Challa
The Sex of Your Doctor Could Have a Concerning Effect on Your Prognosis
Patients treated by a female physician are less likely to die or to be readmitted to hospital than those treated by a physician who is male, according to a new study by a team of researchers from the US and Japan.
And if the patient happens to also be female, the difference is even more pronounced, especially so when they're severely ill.
While this study doesn't dive deeply into the reasons for the disparity, it supports previous research that comes to similar conclusions.
What these findings indicate is that female and male physicians practice medicine differently, and these differences have a meaningful impact on patients' health outcomes.
The team analyzed data from US Medicare sources describing 458,108 female and 318,819 male patients hospitalized between 2016 and 2019. All patients were over the age of 65, and just under a third of both male and female patients were seen by female physicians.
This info was then referenced against 30-day mortality rates (from the date of admission) and 30-day readmission rates (from the date of discharge). In both cases, female doctors led to better outcomes.
While the differences don't show direct cause and effect, and weren't huge – adjusted mortality rates of 8.15 percent (female doctor) vs 8.38 percent (male doctor) for female patients, for example – they represent a statistically significant gap that shouldn't be there at all. To put that difference into perspective, it amounts to 1 death for every 417 hospitalizations.
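For context, the "1 death per N hospitalizations" framing comes from inverting an absolute risk difference. Applying that arithmetic to the two adjusted rates quoted above gives a figure in the same ballpark (the paper's 1-in-417 number comes from its full adjusted analysis, so the back-of-envelope value differs slightly):

# Back-of-envelope check, not the paper's exact calculation: invert the absolute
# risk difference between the two adjusted 30-day mortality rates quoted above.
mortality_female_doctor = 0.0815
mortality_male_doctor = 0.0838

risk_difference = mortality_male_doctor - mortality_female_doctor
print(f"absolute risk difference = {risk_difference:.4f} "
      f"(~1 extra death per {1 / risk_difference:.0f} hospitalizations)")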
It is important to note that female physicians provide high-quality care, and therefore, having more female physicians benefits patients from a societal point-of-view.
The study authors suggest several reasons could be behind the discrepancies, which have been spotted before in different medical scenarios. It's possible that female doctors communicate better with female patients, the researchers say, or that male doctors are more likely to underestimate the severity of conditions experienced by female patients.
There might also be less embarrassment and discomfort between female doctors and female patients, the research team suggests, meaning more honesty about certain conditions and improved diagnosis and treatment.
The researchers want to see more done to improve sex diversity in hospital settings, and to make sure the quality of care is the same no matter whether patients or physicians are male or female – and for that to happen, more studies will be needed looking at why the differences exist.
Further research on the underlying mechanisms linking physician gender with patient outcomes, and why the benefit of receiving the treatment from female physicians is larger for female patients, has the potential to improve patient outcomes across the board.
https://www.acpjournals.org/doi/10.7326/M23-3163
Apr 26
Dr. Krishna Kumari Challa
Most Materials Seem to Obey a 'Rule of Four'
The rule of four: anomalous distributions in the stoichiometries of inorganic compounds
An analysis of a vast database of compounds has revealed a curious repeating pattern in the way matter composes itself.
Of more than 80,000 electronic structures of experimental and predicted materials studied, a whopping 60 percent have a basic structural unit based on a multiple of four.
What's so strange about this is that the research team that discovered this pattern couldn't figure out why it happens. All we know at the moment is that it's real and observable. It just evades explanation for now.
Through an extensive investigation, the researchers highlight and analyze the anomalous abundance of inorganic compounds whose primitive unit cell contains a number of atoms that is a multiple of four, a property they name the rule of four.
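The statistic behind the rule of four is easy to reproduce in spirit (the atom counts below are placeholders, not the actual database): tally how many primitive unit cells contain a number of atoms divisible by four.

# Sketch of the rule-of-four tally on placeholder data: what fraction of structures
# have a primitive unit cell whose atom count is a multiple of four?
atoms_per_primitive_cell = [4, 8, 12, 6, 16, 20, 5, 8, 10, 4, 24, 7, 8, 12, 9, 16]

hits = sum(1 for n in atoms_per_primitive_cell if n % 4 == 0)
fraction = hits / len(atoms_per_primitive_cell)
print(f"{hits}/{len(atoms_per_primitive_cell)} structures ({fraction:.0%}) follow the rule of four")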
https://www.nature.com/articles/s41524-024-01248-z
Apr 26
Dr. Krishna Kumari Challa
Study suggests host response needs to be studied along with other bacteriophage research
A team of micro- and immunobiologists has found evidence suggesting that future research teams planning to use bacteriophages to treat patients with multidrug-resistant bacterial infections need to also consider how cells in the host's body respond to such treatment.
In their paper published in the open-access journal PLOS Biology, the group describes experiments they conducted that involved studying the way epithelial cells in the lungs respond to bacteriophages.
Over the past decade, medical scientists have found that many bacteria are becoming resistant to the antibiotics used to treat them, rendering those drugs increasingly useless. Because of this, other scientists have been looking for new ways to treat such infections. One possible approach involves bacteriophages—viruses that parasitize bacteria by infecting them and reproducing inside them, leaving the bacteria unable to reproduce.
To date, most of the research involving use of bacteriophages to treat infections has taken place in Eastern Europe, where some are currently undergoing clinical trials. But such trials, the researchers involved in this new study note, do not take into consideration how cells in the body respond to such treatment. Instead, they are focused on determining which phages can be used to fight which types of bacteria, and how well they perform once employed.
The reason so little attention is paid to host cell interaction, they note, is that prior research has shown that phages can only replicate inside of the bacterial cells they invade; thus, there is little opportunity for them to elicit a response in human cells.
In this new study, the research team suggests such thinking is misguided because it fails to take into consideration the immune response in the host. To demonstrate their point, the team conducted a series of experiments involving exposing human epithelial cells from the lungs (which are the ones that become infected as part of lung diseases) to bacteriophages meant to eradicate the bacteria causing an infection.
They found that in many cases, the immune system responded by producing proinflammatory cytokines in the epithelial cells. They noted further that different phages elicited different responses, and there exists the possibility that the unique properties of some phages could be used to improve the results obtained from such therapies. They conclude by suggesting that future bacteriophage research involve inclusion of host cell response.
Paula F. Zamora et al, Lytic bacteriophages induce the secretion of antiviral and proinflammatory cytokines from human respiratory epithelial cells, PLOS Biology (2024). DOI: 10.1371/journal.pbio.3002566
Apr 27
Dr. Krishna Kumari Challa
High-precision blood glucose level prediction achieved by few-molecule reservoir computing
A collaborative research team from NIMS and Tokyo University of Science has successfully developed an artificial intelligence (AI) device that executes brain-like information processing through few-molecule reservoir computing. This innovation utilizes the molecular vibrations of a select number of organic molecules.
When applied to blood glucose level prediction in patients with diabetes, the device significantly outperformed existing AI devices in terms of prediction accuracy.
The work is published in the journal Science Advances.
Part 1
Apr 27
Dr. Krishna Kumari Challa
With the expansion of machine learning applications in various industries, there's an escalating demand for AI devices that are not only highly computational but also feature low power consumption and miniaturization.
Research has shifted towards physical reservoir computing, leveraging physical phenomena presented by materials and devices for neural information processing. One challenge that remains is the relatively large size of the existing materials and devices.
The team's research has pioneered the world's first implementation of physical reservoir computing that operates on the principle of surface-enhanced Raman scattering, harnessing the molecular vibrations of merely a few organic molecules. The information is inputted through ion gating, which modulates the adsorption of hydrogen ions onto organic molecules (p-mercaptobenzoic acid, pMBA) by applying voltage.
The changes in molecular vibrations of the pMBA molecules, which vary with hydrogen ion adsorption, serve the function of memory and nonlinear waveform transformation for calculation.
This process, using a sparse assembly of pMBA molecules, has learned approximately 20 hours of a diabetic patient's blood glucose level changes and managed to predict subsequent fluctuations over the next five minutes with an error reduction of about 50% compared to the highest accuracy achieved by similar devices to date.
This study indicates that a minimal quantity of organic molecules can effectively perform computations comparable to a computer. This technological breakthrough of conducting sophisticated information processing with minimal materials and in tiny spaces presents substantial practical benefits. It paves the way for the creation of low-power AI terminal devices that can be integrated with a variety of sensors, opening avenues for broad industrial use.
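The molecular device follows the general reservoir-computing recipe: a fixed, complex physical system transforms the input history nonlinearly, and only a simple readout is trained. As a purely software analogy (an echo state network on a synthetic signal, not the paper's device or its clinical glucose data), the principle looks like this:

# Echo state network on a synthetic "glucose-like" signal -- a software analogy of
# reservoir computing, not the molecular device described above.
import numpy as np

rng = np.random.default_rng(0)

# Made-up signal: a slow daily oscillation plus noise.
t = np.arange(2000)
signal = 100 + 20 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 1, t.size)

n_res = 200
W_in = rng.uniform(-0.5, 0.5, n_res)               # input weights (fixed, random)
W = rng.normal(0, 1, (n_res, n_res))               # reservoir weights (fixed, random)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # keep the reservoir dynamics stable

def run_reservoir(u):
    """Drive the reservoir with input sequence u and record its states."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for i, ui in enumerate(u):
        x = np.tanh(W @ x + W_in * ui)             # nonlinear state update does the "computing"
        states[i] = x
    return states

horizon = 5                                         # predict a few steps ahead
u = (signal - signal.mean()) / signal.std()
states = run_reservoir(u)

# Only the linear readout is trained (ridge regression) -- the hallmark of reservoir computing.
X, y = states[:-horizon], signal[horizon:]
W_out = np.linalg.solve(X.T @ X + 1e-2 * np.eye(n_res), X.T @ y)

print(f"prediction {horizon} steps ahead: {states[-1] @ W_out:.1f}")

In the device described above, the role of the fixed nonlinear system is played by the voltage-controlled molecular vibrations of the pMBA molecules, with only the readout fitted to the glucose data.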
Daiki Nishioka et al, Few- and single-molecule reservoir computing experimentally demonstrated with surface-enhanced Raman scattering and ion gating, Science Advances (2024). DOI: 10.1126/sciadv.adk6438
Part 2
**
Apr 27
Dr. Krishna Kumari Challa
'Zombie Deer' Disease: Zoonotic Transfer Suspected After Two Human Deaths
A medical case report suggests that a deadly prion disease may have made its way from deer into humans. Two hunters have died after consuming venison from a population of deer known to be infected with chronic wasting disease—an incurable, fatal prion disease sometimes known as "zombie deer" disease, not dissimilar to bovine spongiform encephalopathy, or mad cow disease.
A team of doctors at the University of Texas report a 72-year-old man died after presenting with rapid-onset confusion and aggression.
The man's friend, who was a member of the same hunting lodge, died at a later, unspecified date after presenting with similar symptoms, the doctors note. A post-mortem determined that this second patient had died of Creutzfeldt-Jakob disease, AKA prion disease.
Since prion disease is relatively rare in humans, the two cases could mean that chronic wasting disease—described by the Centers for Disease Control and Prevention as never having been reported in humans—has made the zoonotic leap from animals.
Prion diseases, known as Creutzfeldt-Jakob disease or CJD in humans, are kind of terrifying. Prions are proteins that haven't folded properly, and therefore don't really function the way they should. The problem is that these misfolded proteins teach the proteins around them how to fold badly, too, resulting in a spread of dysfunctional tissue that cannot be halted or cured.
The spread of prions through brain tissue produces symptoms very similar to a kind of fast-tracked dementia, to which the patient eventually succumbs. Since CJD doesn't produce any kind of immune response, it's practically impossible to diagnose in a living patient.
Significant concerns have already been raised about chronic wasting disease. It infects animals such as deer, elk, and moose, and seems to be transmitted fairly easily between them; scientists think it is transmitted via bodily fluids such as blood or saliva, either through direct contact, or contamination in the environment.
It's not known for certain whether the two men described in the case report succumbed to chronic wasting disease, or whether their illness had another source. Prion disease may emerge spontaneously, for example, although that is, as far as we can tell, extremely rare.
The case report also doesn't mention where the two men were from, but the disease can be found across the North American continent in wild populations, including in at least 32 US states and across Canada. It can also be found among farmed deer.
Part 1
Apr 27
Dr. Krishna Kumari Challa
Given that zoonotic prion disease is absolutely possible, and that transmission to humans has been predicted for some time, the situation, the doctors say, warrants caution and attention.
"Although causation remains unproven, this cluster emphasizes the need for further investigation into the potential risks of consuming CWD-infected deer and its implications for public health," they write.
"Clusters of sporadic CJD cases may occur in regions with CWD-confirmed deer populations, hinting at potential cross-species prion transmission. Surveillance and further research are essential to better understand this possible association."
https://www.neurology.org/doi/10.1212/WNL.0000000000204407
Part 2
**
Apr 27
Dr. Krishna Kumari Challa
Why do we move slower the older we get? New study delivers answers
It's one of the inescapable realities of aging: The older we get, the slower we tend to move.
A new study led by University of Colorado Boulder engineers helps explain why.
The research is one of the first studies to experimentally tease apart the competing reasons why people over age 65 might not be as quick on their feet as they used to be. The group reported that older adults may move slower, at least in part, because it costs them more energy than younger people—perhaps not too shocking for anyone who's woken up tired the morning after an active day.
Why we move the way we do, from eye movements to reaching, walking, and talking, is a window into aging and Parkinson's. Scientists are trying to understand the neural basis of that.
For the study, the group asked subjects aged 18 to 35 and 66 to 87 to complete a deceptively simple task: to reach for a target on a screen, a bit like playing a video game on a Nintendo Wii. By analyzing patterns of these reaches, the researchers discovered that older adults seemed to modify their motions under certain circumstances to conserve their limited supplies of energy.
All of us, whether young or old, are inherently driven to get the most reward out of our environment while minimizing the amount of effort to do so.
Researchers have long known that older adults tend to be slower because their movements are less stable and accurate. But other factors could also play a role in this fundamental part of growing older.
According to one hypothesis, the muscles in older adults may work less efficiently, meaning that they burn more calories while completing the same tasks as younger adults—like running a marathon or getting up to grab a soda from the refrigerator.
Alternatively, aging might also alter the reward circuitry in the human brain. As people age, their bodies produce less dopamine, a brain chemical responsible for giving you a sense of satisfaction after a job well done. If you don't feel that reward as strongly, the thinking goes, you may be less likely to move to get it. People with Parkinson's disease experience an even sharper decline in dopamine production.
Part 1
Apr 29
Dr. Krishna Kumari Challa
In the study, the researchers asked more than 80 people to sit down and grab the handle of a robotic arm, which, in turn, operated the cursor on a computer screen. The subjects reached forward, moving the cursor toward a target. If they succeeded, they received a reward—not a big one, but still enough to make their brains happy.
Sometimes, the targets exploded, and they would get point rewards. It would also make a 'bing bing' sound.
That's when a contrast between the two groups of people began to emerge.
Both the 18 to 35-year-olds and 66 to 87-year-olds arrived at their targets sooner when they knew they would hear that bing bing—roughly 4% to 5% sooner over trials without the reward. But they also achieved that goal in different ways.
The younger adults, by and large, moved their arms faster toward the reward. The older adults, in contrast, mainly improved their reaction times, beginning their reaches about 17 milliseconds sooner on average.
When the team added an 8-pound weight to the robotic arm for the younger subjects, those differences vanished.
The brain seems to be able to detect very small changes in how much energy the body is using and adjusts our movements accordingly. Even when moving with just a few extra pounds, reacting quicker became the energetically cheaper option to get to the reward, so the young adults imitated the older adults and did just that.
The research seems to paint a clear picture. Both the younger and older adults didn't seem to have trouble perceiving rewards, even small ones. But their brains slowed down their movements under tiring circumstances.
The experiment can't completely rule out the brain's reward centers as a culprit behind why we slow down when we age. But if scientists can tease out where and how these changes emerge from the body, they may be able to develop treatments to reduce the toll of aging and disease.
Putting it all together, these results suggest that the effort costs of reaching seem to be determining what's slowing the movement of older adults.
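As a toy illustration of this effort-cost account (a made-up utility model, not the one fitted in the paper), making effort more expensive, for example by adding mass, shifts the utility-maximizing movement toward longer durations, that is, slower reaches:

# Toy utility model: discounted reward minus a mass-dependent effort cost.
# Raising the effective mass makes fast reaches costly, so the best duration grows.
import numpy as np

def preferred_duration(mass_kg, reward=1.0, discount=1.0, effort_coeff=0.5, distance_m=0.25):
    """Movement duration T (in seconds) that maximizes utility = discounted reward - effort."""
    T = np.linspace(0.2, 2.0, 2000)                # candidate durations
    utility = reward / (1 + discount * T) - effort_coeff * mass_kg * distance_m**2 / T
    return T[np.argmax(utility)]

for m in (1.0, 4.6):   # roughly "arm alone" vs "arm plus ~3.6 kg (8 lb)" -- illustrative numbers
    print(f"effective mass {m:.1f} kg -> preferred movement duration {preferred_duration(m):.2f} s")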
Erik M. Summerside et al, Slowing of Movements in Healthy Aging as a Rational Economic Response to an Elevated Effort Landscape, The Journal of Neuroscience (2024). DOI: 10.1523/JNEUROSCI.1596-23.2024
Part 2
**
Apr 29
Dr. Krishna Kumari Challa
The harm artificial sweeteners can cause
Artificial sweeteners are chemical compounds up to 600 times sweeter than sugar, with very few (if any) calories, and they are cheap and easy for manufacturers to use.
Traditional artificial sweeteners, such as aspartame, sucralose and acesulfame potassium (acesulfame K), have been found in a wide range of foods and drinks for many years as a way to increase the sweet taste without adding significant calories or costs. However, in the last few years, there has been controversy in the field. Several studies have suggested potential health harms associated with consuming these sweeteners, ranging from gastrointestinal disease to dementia. Although none of these harms have been proved, this has paved the way for new sweeteners to be developed to try to avoid any possible health issues.

These next-generation sweeteners are up to 13,000 times sweeter than sugar, have no calories and no aftertaste (a common complaint with traditional sweeteners). An example of this new type of sweetener is neotame. Neotame was developed as an alternative to aspartame with the aim of being a more stable and sweeter version of the traditional sweetener. It is very stable at high temperatures, which means it is a good additive to use in baked goods. It is also used in soft drinks and chewing gum.
An artificial sweetener called neotame can cause significant harm to the gut, scientists found. It does this in two ways: first, by breaking down the layer of cells that line the intestine; and second, by causing previously healthy gut bacteria to become diseased, leading them to invade the gut wall.
The study, published in the journal Frontiers in Nutrition, is the first to show this double-hit negative effect of neotame on the gut, resulting in damage similar to that seen in inflammatory bowel disease and sepsis.
https://www.frontiersin.org/articles/10.3389/fnut.2024.1366409/full
Apr 29
Dr. Krishna Kumari Challa
Study suggests that stevia is the most brain-compatible sugar substitute
Given the known risks of consuming high amounts of sugar, today many people are looking for alternative sweeteners that produce a similar taste without prompting significant weight gain and causing other health issues. While research suggests that the brain can tell the difference between different sweet substances, the neural processes underlying this ability to tell sweeteners apart remain poorly understood.
Researchers recently carried out a study aimed at better understanding what happens in the brain of mice when they are fed different types of sweeteners. Their findings, published in Neuroscience Research, suggest that the response of neurons to sucrose and stevia is similar, suggesting that stevia could be an equally pleasant but healthier sugar substitute.
Stevia is a sweet sugar substitute that is about 50 to 300 times sweeter than sugar. It is extracted from the leaves of Stevia rebaudiana, a plant native to areas of Paraguay and Brazil in the southern Amazon rainforest. The active compounds in stevia are steviol glycosides (mainly stevioside and rebaudioside). Stevia is heat-stable, pH-stable, and not fermentable. Humans cannot metabolize the glycosides in stevia, and therefore it has zero calories. Its taste has a slower onset and longer duration than that of sugar, and at high concentrations some of its extracts may have an aftertaste described as licorice-like or bitter. Stevia is used in sugar- and calorie-reduced food and beverage products as an alternative for variants with sugar.
Interestingly, the team's recordings revealed that, compared to the other sugar substitutes considered in this study, stevia induced activity in the paraventricular nucleus of the thalamus (PVT) that more closely resembled the activity elicited by sugar intake. This suggests that stevia is the most "brain compatible" of the most widely used sugar alternatives, most closely mirroring the perceived taste of sugar.
Shaolei Jiang et al, Neuronal activity in the anterior paraventricular nucleus of thalamus positively correlated with sweetener consumption in mice, Neuroscience Research (2024). DOI: 10.1016/j.neures.2024.02.002
Apr 30
Dr. Krishna Kumari Challa
Research shows 'profound' link between dietary choices and brain health
A recent study published in Nature Mental Health shows that a healthy, balanced diet is linked to superior brain health, cognitive function and mental well-being. The study, involving researchers at the University of Warwick, sheds light on how our food preferences not only influence physical health but also significantly impact brain health.
The dietary choices of a large sample of 181,990 participants from the UK Biobank were analyzed against a range of physical evaluations, including cognitive function, blood metabolic biomarkers, brain imaging, and genetics—unveiling new insights into the relationship between nutrition and overall well-being.
The food preferences of each participant were collected via an online questionnaire, which the team categorized into 10 groups (such as alcohol, fruits and meats). A type of AI called machine learning helped the researchers analyze the large dataset.
A balanced diet was associated with better mental health, superior cognitive functions and even higher amounts of gray matter in the brain—linked to intelligence—compared with those with a less varied diet.
The study also highlighted the need for gradual dietary modifications, particularly for individuals accustomed to highly palatable but nutritionally deficient foods. By slowly reducing sugar and fat intake over time, individuals may find themselves naturally gravitating towards healthier food choices.
Genetic factors may also contribute to the association between diet and brain health, the scientists think, showing how a combination of genetic predispositions and lifestyle choices shape well-being.
Ruohan Zhang et al, Associations of dietary patterns with brain health from behavioral, neuroimaging, biochemical and genetic analyses, Nature Mental Health (2024). DOI: 10.1038/s44220-024-00226-0
Apr 30
Dr. Krishna Kumari Challa
Human activities have an intense impact on Earth's deep subsurface fluid flow
The impact of human activities—such as greenhouse gas emissions and deforestation—on Earth's surface have been well-studied. Now, hydrology researchers have investigated how humans impact Earth's deep subsurface, a zone that lies hundreds of meters to several kilometers beneath the planet's surface.
They looked at how the rates of fluid production with oil and gas compare to natural background circulation of water and showed how humans have made a big impact on the circulation of fluids in the subsurface.
In the future, these human-induced fluid fluxes are projected to increase with strategies that are proposed as solutions for climate change, according to the study. Such strategies include: geologic carbon sequestration, which is capturing and storing atmospheric carbon dioxide in underground porous rocks; geothermal energy production, which involves circulating water through hot rocks for generating electricity; and lithium extraction from underground mineral-rich brine for powering electric vehicles.
Responsible management of the subsurface is central to any hope for a green transition, sustainable future and keeping warming below a few degrees.
With oil and natural gas production, there is always some amount of water, typically saline, that comes from the deep subsurface. The underground water is often millions of years old and acquires its salinity either from evaporation of ancient seawater or from reaction with rocks and minerals. For more efficient oil recovery, more water from near-surface sources is added to the salt water to make up for the amount of oil removed and to maintain reservoir pressures. The blended saline water then gets reinjected into the subsurface. This becomes a cycle of producing fluid and reinjecting it to the deep subsurface.
The same process happens in lithium extraction, geothermal energy production and geologic carbon sequestration, the operations of which involve leftover saline water from the underground that is reinjected.
Researchers showed that the fluid injection or recharge rates from those oil and gas activities are greater than what occurs naturally.
Part 1
Apr 30
Dr. Krishna Kumari Challa
Using existing data from various sources, including measurements of fluid movements related to oil and gas extraction and water injections for geothermal energy, the team found that the current fluid movement rates induced by human activities are higher compared to how fluids moved before human intervention.
As human activities like carbon capture and sequestration and lithium extraction ramp up, the researchers also predicted how these activities might be recorded in the geological record, which is the history of Earth as recorded in the rocks that make up its crust.
Human activities have the potential to alter not just the deep subsurface fluids but also the microbes that live down there.
As fluids move around, microbial environments may be altered by changes in water chemistry or by bringing new microbial communities from Earth's surface to the underground.
For example, with hydraulic fracturing, a technique that is used to break underground rocks with pressurized liquids for extracting oil and gas, a deep rock formation that previously didn't have any detectable number of microbes might have a sudden bloom of microbial activity.
There remain a lot of unknowns about Earth's deep subsurface and how it is impacted by human activities, and it's important to continue working on those questions, say the scientists.
Grant Ferguson et al, Acceleration of Deep Subsurface Fluid Fluxes in the Anthropocene, Earth's Future (2024). DOI: 10.1029/2024EF004496
Part 2
Apr 30
Dr. Krishna Kumari Challa
Researchers find brown fat's 'off-switch'
Brown fat, also known as brown adipose tissue (BAT), is a type of fat in our bodies that's different from the more familiar white fat around our belly and thighs. Brown fat has a special job—it burns calories from the food we eat to produce heat, which can be helpful, especially when we're exposed to cold temperatures, as during winter swimming or cryotherapy.
For a long time, scientists thought that only small animals like mice and newborns had brown fat. But new research shows that a certain number of adults maintain their brown fat throughout life. Because brown fat is so good at burning calories, scientists are trying to find ways to activate it safely using drugs that boost its heat-producing abilities.
A new study has found that brown fat has a previously unknown built-in mechanism that switches it off shortly after being activated. This limits its effectiveness as treatment against obesity. The team has now discovered a protein responsible for this switching-off process. It is called "AC3-AT."
Hande Topel et al, Cold-induced expression of a truncated Adenylyl Cyclase 3 acts as rheostat to brown fat function, Nature Metabolism (2024). DOI: 10.1038/s42255-024-01033-8
**
Apr 30
Dr. Krishna Kumari Challa
Physicists overcome two key operating hurdles in fusion reactions
A team of physicists has devised a way to overcome two key hurdles standing in the way of using fusion as a general power source.
In their paper published in the journal Nature, the group describes how they devised a way to raise the density of the plasma in their reactor while also keeping it stable.
Scientists at various sites around the world have been working for several years to figure out how to use fusion reactions to create electricity for general use—thereby freeing the world from using coal and gas fired power plants that spew greenhouse gases into the atmosphere. But it has been a long and difficult road.
It was just in the past couple of years that researchers were able to show that a fusion reaction could be made to sustain itself, and that more power could be produced than was input into such a system.
The next two hurdles to overcome are increasing the density of the plasma in the reactor and then containing it for extended periods of time—long enough for it to be useful for producing electricity. In this new study, the research team has devised a way to do both in a tokamak chamber.
To contain the plasma as its density was increased, the team used additional magnets and bursts of deuterium where needed. They also allowed for higher densities at the core than near the edges, helping to ensure the plasma could not escape. They held it in that state for 2.2 seconds, long enough to prove that it could be done.
They also found that during that short time span, the average density in the reactor was 20% over the Greenwald limit—an empirically derived limit that had been predicted to mark the density beyond which the plasma would escape the magnetic field holding it in place.
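For reference, the Greenwald limit is an empirical scaling, n_G ≈ I_p / (π a²), with n_G in units of 10²⁰ particles per cubic metre, plasma current I_p in megaamperes and minor radius a in metres. The values in the sketch below are illustrative, not the experiment's actual parameters:

# Greenwald density limit from the standard empirical scaling; input values here
# are illustrative only, not the parameters of the experiment described above.
import math

def greenwald_limit(plasma_current_MA: float, minor_radius_m: float) -> float:
    """Greenwald density limit in units of 10^20 particles per cubic metre."""
    return plasma_current_MA / (math.pi * minor_radius_m ** 2)

n_G = greenwald_limit(plasma_current_MA=1.0, minor_radius_m=0.6)   # hypothetical tokamak
print(f"Greenwald limit ~ {n_G:.2f} x 10^20 m^-3; "
      f"20% above it would be ~ {1.2 * n_G:.2f} x 10^20 m^-3")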
They also found that the plasma's confinement quality factor, H98(y,2), remained above 1, meaning its energy confinement was at least as good as the standard high-confinement (H-mode) scaling—another indicator that the experiment was successful.
The research team acknowledges that their experiment was done in a very small reactor—one with a diameter of just 1.6 meters. For such an achievement to be considered fully successful, it will have to be done in a much larger reactor, such as the one currently under construction in France, which will have a diameter of 6.2 meters.
S. Ding et al, A high-density and high-confinement tokamak plasma regime for fusion energy, Nature (2024). DOI: 10.1038/s41586-024-07313-3
Apr 30
Dr. Krishna Kumari Challa
Many old books contain toxic chemicals
In our modern society, we rarely consider books to be dangerous items. However, certain books contain elements so hazardous that they require scrutiny before being placed on the shelves of public libraries, bookstores or even private homes.
The Poisonous Book Project, a collaborative research project between Winterthur Museum, Garden & Library and the University of Delaware, is dedicated to cataloging such books. Their concern is not with the content written on the pages, but with the physical components of the books themselves—specifically, the colors of the covers. The project recently influenced the decision to remove two books from the French national library. The reason? Their vibrant green cloth covers raised suspicions of containing arsenic.
This concern is rooted in historical practices in bookbinding. During the 19th century, as books began to be mass produced, bookbinders transitioned from using expensive leather covers to more affordable cloth items. To attract readers, these cloth covers were often dyed in bright, eye-catching colours.
One popular pigment was Scheele's green, named after Carl Wilhelm Scheele, a German-Swedish chemist who in 1775 discovered that a vivid green pigment could be produced from copper and arsenic. This dye was not only cheap to make, it was also more vibrant than the copper carbonate greens that had been used for over a century.
Paris green, a related copper-arsenic pigment, was much more durable. It was quickly adopted for use in various items, including book covers, clothing, candles and wallpaper.
These pigments, however, had a significant drawback: they degraded easily, releasing poisonous and carcinogenic arsenic. The frequent reports of green candles poisoning children at Christmas parties, factory workers tasked with applying paint to ornaments convulsing and vomiting green water and warnings of poisonous ball dresses raised serious concerns about the safety of these green dyes.
Part 1
May 1
Dr. Krishna Kumari Challa
The harmful effects of these pigments have even been implicated in Napoleon's death from stomach cancer. Napoleon was particularly keen on the new green colors, so much so that he ordered his dwelling on St Helena, where he was exiled, be painted in his favorite color.
The theory that the arsenic in the walls contributed to his death is supported by the high levels of arsenic detected in samples of his hair. Despite the clear link between the green pigments and health issues, toxic wallpapers continued to be produced until the late 19th century.
Green isn't the only color to worry about, however. Red is also of concern. The brilliant red pigment vermilion was formed from the mineral cinnabar, also known as mercury sulfide. This was a popular source of red paint dating back thousands of years. There is even evidence that neolithic artists suffered from mercury poisoning. Vermilion red sometimes appears on the marbled patterns on the inside of book covers.
Yellow has also caught the eye of the poisonous book project. In this case, the culprit is lead chromate. The bright yellow of lead chromate was a favorite with painters, not least Vincent van Gogh, who used it extensively in his most famous series of paintings: Sunflowers. For the Victorian-era bookbinders, lead chromate allowed them to create a range of colors from greens (achieved by mixing chrome yellow with Prussian blue) to yellows, oranges and browns.
Both lead and chromium are toxic. But yellow books are less of a concern than green and red. Lead chromate is not particularly soluble, making it difficult to absorb. It is, in fact, still a widely used pigment.
Part 2
May 1
Dr. Krishna Kumari Challa
So what should you do if you come across a green cloth book from the 19th century? First, don't be overly concerned. You would probably have to eat the entire book before you'd suffer from severe arsenic poisoning. However, casual exposure to copper acetoarsenite, the compound in the green pigment, can irritate the eyes, nose and throat.
It is more of a concern for folks who may regularly handle these books where frequent contact could result in more serious symptoms. Therefore, anyone who suspects they might be handling a Victorian-era book with an emerald green binding is advised to wear gloves and avoid touching their face. Then clean all surfaces afterwards.
To aid with the identification of these potentially hazardous books, the Poisonous Book Project has incorporated crowd-sourced data into their research. The researchers now distribute bookmarks that feature safety warnings and showcase various shades of emerald green to aid their identification. As a result, they have now identified over 238 arsenic editions from across the globe.
https://theconversation.com/many-old-books-contain-toxic-chemicals-...
Part 3
**
May 1
Dr. Krishna Kumari Challa
First fetus-to-fetus transplant demonstrated in rats
May 1
Dr. Krishna Kumari Challa
Scientists show that there is indeed an 'entropy' of quantum entanglement
Researchers have shown, through probabilistic calculations, that there is indeed, as had been hypothesized, a rule of entropy for the phenomenon of quantum entanglement.
This finding could help drive a better understanding of quantum entanglement, which is a key resource that underlies much of the power of future quantum computers. Little is currently understood about the optimal ways to make effective use of it, despite it being the focus of research in quantum information science for decades.
The second law of thermodynamics, which says that the entropy of an isolated system can never decrease, is one of the most fundamental laws of nature, and lies at the very heart of physics. It is what creates the "arrow of time," and tells us the remarkable fact that the dynamics of general physical systems, even extremely complex ones such as gases or black holes, are encapsulated by a single function, their entropy.
There is a complication, however. The principle of entropy is known to apply to all classical systems. But what about the quantum world?
We are now going through a quantum revolution, and it becomes crucially important to understand how we can extract and transform the expensive and fragile quantum resources. In particular, quantum entanglement, which allows for significant advantages in communication, computation, and cryptography, is crucial, but due to its extremely complex structure, efficiently manipulating it and even understanding its basic properties is typically much more challenging than in the case of thermodynamics.
Part 1
May 2
Dr. Krishna Kumari Challa
The difficulty lies in the fact that such a "second law" for quantum entanglement would require us to show that entanglement transformations can be made reversible, just like work and heat can be interconverted in thermodynamics.
It is known that reversibility of entanglement is much more difficult to ensure than the reversibility of thermodynamic transformations, and all previous attempts at establishing any form of a reversible theory of entanglement have failed. It was even suspected that entanglement might actually be irreversible, making the quest an impossible one.
In their new work, published in Nature Communications, the authors solve this long-standing conjecture by using probabilistic entanglement transformations, which are only guaranteed to be successful some of the time, but which, in return, provide an increased power in converting quantum systems.
Under such processes, the authors show that it is indeed possible to establish a reversible framework for entanglement manipulation, thus identifying a setting in which a unique entropy of entanglement emerges and all entanglement transformations are governed by a single quantity. The methods they used could be applied more broadly, showing similar reversibility properties also for more general quantum resources.
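The "single quantity" in question is the entropy of entanglement. For reference (a textbook definition, stated independently of the paper's probabilistic framework), for a bipartite pure state $|\psi\rangle_{AB}$ it is the von Neumann entropy of either reduced state:

$$ E(|\psi\rangle_{AB}) = S(\rho_A) = -\mathrm{Tr}\left(\rho_A \log \rho_A\right), \qquad \rho_A = \mathrm{Tr}_B\,|\psi\rangle\langle\psi|_{AB}. $$

The work described here shows that, once probabilistic protocols are allowed, asymptotic transformations of entangled states are again governed by a single entropic quantity of this kind.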
The findings mark significant progress in understanding the basic properties of entanglement, revealing fundamental connections between entanglement and thermodynamics and, crucially, providing a major simplification in the understanding of entanglement conversion processes.
This not only has immediate and direct applications in the foundations of quantum theory, but it will also help with understanding the ultimate limitations on our ability to efficiently manipulate entanglement in practice.
Bartosz Regula et al, Reversibility of quantum resources through probabilistic protocols, Nature Communications (2024). DOI: 10.1038/s41467-024-47243-2
Part 2
May 2
Dr. Krishna Kumari Challa
Patients with rheumatoid arthritis have unique and complex autoantibody patterns, study reveals
Patients with rheumatoid arthritis (RA) all have a unique and diverse set of antibodies that are involved in the development of the disease. Researchers unveiled the complexity of these antibodies using powerful lab tools capable of analyzing our immune system at molecular levels. Their discovery suggests that current assumptions about the origin of RA are too simple. Their findings may point towards improved diagnostics.
Rheumatoid arthritis is a chronic autoimmune disease that primarily affects the joints, causing pain, stiffness, and swelling. It arises when the immune system mistakenly attacks the body's own tissues, leading to inflammation in the joints and potentially other organs.
The exact cause of RA remains unknown, but a crucial role is played by antibodies, special proteins made by the immune system to help fight off infections. They recognize and attack specific targets, like viruses or bacteria. Some antibodies are wrongly produced, causing them to attack our own body. Normally, our body's immune system is equipped with a 'filter' that cleans up these so-called autoantibodies. Researchers think that this mechanism is malfunctioning in RA patients.
The extent to which this filter is malfunctioning now appears to be much greater than expected. Research published in Nature Communications reveals that it's not just a handful of different RA-associated autoantibodies that evade the filter. On the contrary, the researchers found an extremely broad variety of these antibodies.
The team used novel mass spectrometry tools that profile specific antibodies typically seen in the blood of RA patients, which are called anti-citrullinated protein antibodies (ACPAs). They discovered that each RA patient possesses a unique and diverse set of ACPAs.
Part 1
May 2