Brain area necessary for fluid intelligence identified
A team of researchers has mapped the parts of the brain that support our ability to solve problems without prior experience—otherwise known as fluid intelligence.
Fluid intelligence is arguably the defining feature of human cognition. It predicts educational and professional success, social mobility, health, and longevity. It also correlates with many cognitive abilities, such as memory.
Fluid intelligence is thought to be a key feature involved in "active thinking"—a set of complex mental processes such as those involved in abstraction, judgment, attention, strategy generation and inhibition. These skills can all be used in everyday activities.
Despite its central role in human behavior, fluid intelligence remains contentious, both as to whether it is a single ability or a cluster of cognitive abilities and as to the nature of its relationship with the brain. To establish which parts of the brain are necessary for a given ability, researchers must study patients in whom that part is missing or damaged. Such "lesion-deficit mapping" studies are difficult to conduct owing to the challenge of identifying and testing patients with focal brain injury. Consequently, previous studies have mainly relied on functional imaging (fMRI) techniques, which can be misleading.
The new study investigated 227 patients who had suffered either a brain tumour or a stroke affecting specific parts of the brain, using the Raven Advanced Progressive Matrices (APM), the best-established test of fluid intelligence. The test contains multiple-choice visual pattern problems of increasing difficulty. Each problem presents an incomplete pattern of geometric figures and requires selection of the missing piece from a set of possible choices. The researchers then introduced a novel "lesion-deficit mapping" approach to disentangle the intricate anatomical patterns of common forms of brain injury, such as stroke.
Their approach treated the relations between brain regions as a mathematical network whose connections describe the tendency of regions to be affected together, either because of the disease process or as a reflection of common cognitive ability. This enabled the researchers to disentangle the brain map of cognitive abilities from the patterns of damage, and so to determine which patients performed worse on the fluid intelligence task according to where their injuries lay.
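To make the idea of a lesion network concrete, here is a minimal, purely illustrative sketch (not the authors' actual graph lesion-deficit method; all data and sizes are simulated): each patient's lesion is coded as a binary vector over brain regions, the tendency of regions to be damaged together becomes a co-occurrence matrix, and a naive region-by-region deficit map is computed for contrast.

```python
import numpy as np

# Illustrative sketch only; the study's actual graph lesion-deficit mapping is more involved.
rng = np.random.default_rng(0)
n_patients, n_regions = 227, 50                       # 227 patients as in the study; 50 regions is arbitrary
lesions = rng.random((n_patients, n_regions)) < 0.1   # True = region damaged in that patient (simulated)
scores = rng.normal(size=n_patients)                  # simulated fluid-intelligence (APM) scores

# "Network" part: edge weights describing how often two regions are damaged together.
co_damage = lesions.astype(float).T @ lesions.astype(float) / n_patients

# Naive mass-univariate deficit map, for contrast: mean score difference between
# patients with and without damage to each region (ignores the co-damage structure).
deficit = np.zeros(n_regions)
for r in range(n_regions):
    hit = lesions[:, r]
    if hit.any() and (~hit).any():
        deficit[r] = scores[hit].mean() - scores[~hit].mean()

print(co_damage.shape, deficit.round(2)[:5])
```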
The researchers found that impaired performance on the fluid intelligence task was largely confined to patients with right frontal lesions, rather than being linked to a wide set of regions distributed across the brain. Alongside brain tumours and stroke, such damage is often found in patients with a range of other neurological conditions, including traumatic brain injury and dementia.
These findings indicate for the first time that the right frontal regions of the brain are critical to the high-level functions involved in fluid intelligence, such as problem solving and reasoning.
Lisa Cipolotti et al, Graph lesion-deficit mapping of fluid intelligence, Brain (2022). DOI: 10.1093/brain/awac304
Holding information in mind may mean storing it among synapses
Between the time you read the Wi-Fi password off the café's menu board and the time you get back to your laptop to enter it, you have to hold it in mind. If you've ever wondered how your brain does that, you are asking a question about working memory that researchers have strived for decades to explain. Now neuroscientists have published a key new insight into how it works.
In a study in PLOS Computational Biology, scientists compared measurements of brain cell activity in an animal performing a working memory task with the output of various computer models representing two theories of the underlying mechanism for holding information in mind. The results strongly favoured the newer notion that a network of neurons stores the information by making short-lived changes in the pattern of their connections, or synapses, and contradicted the traditional alternative that memory is maintained by neurons remaining persistently active (like an idling engine).
While both models allowed for information to be held in mind, only the versions that allowed for synapses to transiently change connections ("short-term synaptic plasticity") produced neural activity patterns that mimicked what was actually observed in real brains at work. The idea that brain cells maintain memories by being always "on" may be simpler, but it doesn't represent what nature is doing and can't produce the sophisticated flexibility of thought that can arise from intermittent neural activity backed up by short-term synaptic plasticity.
You need these kinds of mechanisms to give working memory activity the freedom it needs to be flexible. If working memory were just sustained activity alone, it would be as simple as a light switch. But working memory is as complex and dynamic as our thoughts.
Using artificial neural networks with short-term synaptic plasticity, the scientists show that synaptic activity (rather than neural activity) can be a substrate for working memory. The important takeaway is that these 'plastic' neural network models are more brain-like, in a quantitative sense, and also have additional functional benefits in terms of robustness.
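As a rough illustration of the "activity-silent" idea, the sketch below uses Tsodyks-Markram-style short-term plasticity at a single synapse; the parameters and the model itself are assumptions for demonstration, not the network from the paper. It shows how a brief burst of spiking leaves a lingering trace in synaptic efficacy even after firing stops.

```python
import numpy as np

# Minimal sketch of an activity-silent memory trace via short-term synaptic
# plasticity (Tsodyks-Markram-style dynamics); parameters are illustrative.
dt, T = 1e-3, 3.0                  # 1 ms steps, 3 s of simulated time
tau_f, tau_d, U = 1.5, 0.2, 0.2    # facilitation / depression time constants (s), baseline release
u, x = U, 1.0                      # facilitation and available-resource variables of one synapse
trace = []
for step in range(int(T / dt)):
    t = step * dt
    r = 50.0 if 0.5 < t < 0.7 else 0.0          # item is "presented" as a brief 50 Hz burst
    du = (U - u) / tau_f + U * (1 - u) * r      # facilitation builds with presynaptic rate r
    dx = (1 - x) / tau_d - u * x * r            # resources deplete with use, recover afterwards
    u, x = u + du * dt, x + dx * dt
    trace.append((t, r, u * x))                 # u * x ~ effective synaptic strength

# One second after the burst ends, firing is back to zero but the synapse is still
# facilitated, so the item can be "read out" from the synaptic state.
t_probe = int(1.7 / dt)
print("rate at probe:", trace[t_probe][1], " efficacy at probe:", round(trace[t_probe][2], 2))
```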
In addition to matching nature better, the synaptic plasticity models also conferred other benefits that likely matter to real brains. One was that the plasticity models retained information in their synaptic weightings even after as many as half of the artificial neurons were "ablated." The persistent activity models broke down after losing just 10–20 percent of their synapses. And just spiking occasionally requires less energy than spiking persistently.
Furthermore, quick bursts of spiking rather than persistent spiking leave room in time for storing more than one item in memory. Research has shown that people can hold up to four different things in working memory.
Leo Kozachkov et al, Robust and brain-like working memory through short-term synaptic plasticity, PLOS Computational Biology (2022). DOI: 10.1371/journal.pcbi.1010776
South Asian black carbon causes glacier loss on Tibetan Plateau
Black carbon aerosol is the product of incomplete combustion of fossil fuels and biomass, and it strongly absorbs light. Black carbon deposited on snow and ice reduces the surface albedo, accelerating the melting of glaciers and snow cover and thereby changing regional hydrological processes and water resources.
The South Asian region adjacent to the Tibetan Plateau has some of the highest black carbon emissions in the world. Black carbon aerosol from South Asia can be transported across the Himalayas into the interior of the Tibetan Plateau.
Recently, a research team analyzed the influence of black carbon aerosols on regional precipitation and glaciers over the Qinghai-Tibet Plateau.
Their findings were published in Nature Communications on Nov. 30.
The researchers found that since the beginning of the 21st century, South Asian black carbon aerosols have indirectly affected the material supply of Tibetan Plateau glaciers by changing water vapour transport in the South Asian monsoon.
Black carbon aerosols in South Asia heat up the middle and upper atmosphere, thus increasing the north-south temperature gradient.
Accordingly, convective activity in South Asia is enhanced, causing water vapour to converge over South Asia. Meanwhile, black carbon also increases the number of cloud condensation nuclei in the atmosphere.
These changes in meteorological conditions caused by black carbon aerosols mean that more water vapour falls as precipitation over South Asia and less is transported to the Tibetan Plateau. As a result, monsoon-season precipitation decreases over the central and southern Tibetan Plateau, especially in the southern part of the plateau.
This decrease in precipitation further reduces the material supply of the glaciers. From 2007 to 2016, the reduced supply accounted for 11.0% of the average glacier material loss on the Tibetan Plateau and 22.1% in the southern part of the plateau.
The transboundary transport and deposition of South Asian black carbon aerosols thus accelerate glacier ablation on the Tibetan Plateau. Meanwhile, the reduction of summer precipitation on the plateau will further reduce the material supply of plateau glaciers, increasing the glacier material deficit.
Junhua Yang et al, South Asian black carbon is threatening the water sustainability of the Asian Water Tower, Nature Communications (2022). DOI: 10.1038/s41467-022-35128-1
Gastroesophageal reflux disease raises risk for periodontitis: Study
Patients with gastroesophageal reflux disease (GERD) have an increased risk for periodontitis development, according to a study published online Nov. 19 in Biomedicines.
Researchers conducted a retrospective cohort study to examine the association between GERD and subsequent periodontitis risk using epidemiological data from the Taiwan National Health Insurance Research Database from 2008 to 2018. A total of 20,125 participants with a minimum age of 40 years were included in the GERD group and propensity-matched in a 1:1 ratio with non-GERD participants.
The researchers found that the incidence rate of periodontitis was significantly higher in patients with versus those without GERD (30.0 versus 21.7 per 1,000 person-years; adjusted hazard ratio, 1.36). Patients with GERD had a higher risk for periodontitis in analyses stratified for age (adjusted hazard ratios, 1.31 and 1.42 for age 40 to 54 and 55 to 69 years, respectively), sex (adjusted hazard ratios, 1.40 and 1.33 for men and women, respectively), and presence and absence of comorbidity (adjusted hazard ratios, 1.36 and 1.40, respectively) compared with those without GERD. The risk for periodontitis was increased with an increasing number of emergency room visits among the GERD cohort (one or more versus less than one; adjusted hazard ratio, 5.19).
"Clinicians should pay more attention to the development of periodontitis while caring for patients with GERD," the authors write. "On the other hand, dentists may consider GERD as an etiology of unexplained periodontitis."
More information: Xin Li et al, Risk of Periodontitis in Patients with Gastroesophageal Reflux Disease: A Nationwide Retrospective Cohort Study, Biomedicines (2022). DOI: 10.3390/biomedicines10112980
Parental astigmatism may increase risk for child astigmatism
Parental astigmatism may confer an independent and dose-dependent association with child astigmatism, according to a study published online Dec. 21 in JAMA Network Open.
Researchers examined the association between parental astigmatism and child astigmatism. (In an optical system with astigmatism, rays that propagate in two perpendicular planes have different foci; if such a system is used to form an image of a cross, the vertical and horizontal lines will be in sharp focus at two different distances.) The analysis included 5,708 familial trios, each comprising a child aged 6 to 8 years and both parents, participating in the Hong Kong Children Eye Study.
The researchers found that astigmatism of ≥1.0 D in both parents was associated with greater odds of refractive astigmatism (RA; odds ratio, 1.62) and corneal astigmatism (CA; odds ratio, 1.94) in the child. When both parents had astigmatism ≥2.0 D, the risk increased further (odds ratios, 3.10 and 4.31, respectively), with higher parental astigmatism conferring higher risks for both RA and CA in children. There was a significant association between each parental astigmatism and corresponding child astigmatism (odds ratios, 0.76, 0.82, 1.70, and 1.33 for maternal RA, paternal RA, maternal CA, and paternal CA, respectively).
"The findings of this cross-sectional study suggest that parental astigmatism may confer an independent and dose-dependent association with child astigmatism," the authors write. "Children with parentswith astigmatism should have early eye examinations for timely detection of astigmatism to facilitate age-appropriate vision correction and visual development."
More information:Ka Wai Kam et al, Association of Maternal and Paternal Astigmatism With Child Astigmatism in the Hong Kong Children Eye Study,JAMA Network Open(2022).DOI: 10.1001/jamanetworkopen.2022.47795
Researchers discover that soap film on bubbles is cooler than the air around it
A team of researchers has discovered that the film that makes up ordinary soap bubbles is cooler than the surrounding air.
Bubbles exist in a wide variety of environments, from beer glasses to clothes washers and dishwashers to the crests of waves. They even exist in tiny spaces, such as the gaps between human teeth. A lot of research has been done on bubbles, much of it focused on controlling them during industrial processes.
In their work, the researchers created bubbles using ordinary dish soap, water and glycerol. After discovering a temperature difference, the team refocused their efforts to learn more. They tried changing the temperature of the air, the humidity level and also the proportions of the ingredients used to make the bubbles. They found that they were able to make bubbles that were up to 8 degrees Celsius cooler than the air around them. They also found that changing the amount of glycerol impacted the temperature of the resulting bubbles—more of it yielded higher temperatures.
The researchers suggest the cooler films could be the result of evaporation as the bubbles form. They noted also that as the bubbles persisted, their films slowly grew warmer, eventually matching the ambient air temperature. They suggest the large temperature differences they found with some bubbles might have an impact on bubble stability, and conclude that more work is required to find out why the films are cooler and if it might be a useful attribute.
An international team of researchers found that ketamine, an NMDA receptor inhibitor, increases the brain's background noise, raising the entropy of incoming sensory signals and disrupting their transmission between the thalamus and the cortex. This finding may contribute to a better understanding of the causes of psychosis in schizophrenia. An article with the study's findings has been published in the European Journal of Neuroscience.
Schizophrenia spectrum disorders affect approximately one in 300 people worldwide. The most common manifestations of these disorders are perceptual disturbances such as hallucinations, delusions and psychosis. A drug called ketamine can induce a mental state similar to psychosis in healthy individuals. Ketamine inhibits NMDA receptors involved in the transmission of excitatory signals in the brain. An imbalance of excitation and inhibition in the central nervous system can affect the accuracy of sensory perception. Similar changes in the functioning of NMDA receptors are currently thought to be one of the causes of perceptual disorders in schizophrenia. However, it is still unclear how exactly this process occurs in the brain regions involved.
To find out, neuroscientists studied how the brains of laboratory rats on ketamine process sensory signals. The researchers examined beta and gamma oscillations occurring in response to sensory stimuli in the rodent brain's thalamo-cortical system, a neural network connecting the cerebral cortex with the thalamus, which is responsible for relaying sensory information from the organs of perception to the brain.
Beta oscillations are brainwaves in the range of 15 to 30 Hz, and gamma waves are those in the range of 30 to 80 Hz. These frequencies are believed to be critical for encoding and integrating sensory information.
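For readers unfamiliar with how such band-limited activity is typically quantified, here is a small, generic sketch (not the authors' analysis pipeline; the sampling rate and signal are made up) that band-pass filters a simulated recording into the beta and gamma ranges and computes the power in each band:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative only: estimating beta (15-30 Hz) and gamma (30-80 Hz) band power
# from a simulated local field potential trace.
fs = 1000.0                                   # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)
lfp = (np.sin(2 * np.pi * 22 * t)             # a 22 Hz (beta) component
       + 0.5 * np.sin(2 * np.pi * 60 * t)     # a 60 Hz (gamma) component
       + 0.3 * np.random.randn(t.size))       # background noise

def band_power(signal, low, high, fs):
    """Zero-phase band-pass filter, then mean squared amplitude in the band."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return np.mean(filtered ** 2)

print("beta power :", band_power(lfp, 15, 30, fs))
print("gamma power:", band_power(lfp, 30, 80, fs))
```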
In the experiment, rats were implanted with microelectrodes to record the electrical activity in the thalamus and the somatosensory cortex, a region of the brain which is responsible for processing sensory information coming from the thalamus. The researchers stimulated the rats' whiskers (vibrissae) and recorded the brain's responses before and after ketamine administration.
A comparison of the two datasets revealed that ketamine increased the power of beta and gamma oscillations in the cortex and the thalamus even in the resting state before a stimulus was presented, while the amplitude of the beta/gamma oscillations in the 200–700 ms post-stimulus period was significantly lower at all recorded cortical and thalamic sites following ketamine administration.
The post-stimulation time lapse of 200–700 ms is long enough to encode, integrate and perceive the incoming sensory signal. The observed decrease in the power of sensory stimulus-induced oscillations can be associated with impaired perception.
Analysis also revealed that by inhibiting NMDA receptors, ketamine added noise to the gamma frequencies in the 200–700 ms post-stimulation period in one thalamic nucleus and in one layer of the somatosensory cortex. It can be assumed that this increase in noise, i.e., a reduction in the signal-to-noise ratio, also indicates an impaired ability of the neurons to process incoming sensory signals.
These findings suggest that psychosis may be triggered by an increase in background noise impairing the function of thalamo-cortical neurons. This, in turn, could be caused by a malfunction of NMDA receptors affecting the balance of inhibition and excitation in the brain. The noise makes sensory signals less defined or pronounced. In addition, this may cause spontaneous outbursts of activity associated with a distorted perception of reality.
"The discovered alterations in thalamic and cortical electrical activity associated with ketamine-induced sensory information processing disorders could serve as biomarkers for testing antipsychotic drugsor predicting the course of disease in patients with psychotic spectrum disorders.
Yi Qin et al, The psychotomimetic ketamine disrupts the transfer of late sensory information in the corticothalamic network, European Journal of Neuroscience (2022). DOI: 10.1111/ejn.15845
Thomas J. Reilly, Ketamine: Linking NMDA receptor hypofunction, gamma oscillations and psychosis (commentary on Qin et al., 2022), European Journal of Neuroscience (2022). DOI: 10.1111/ejn.15872
How chronic blood cancer transitions to aggressive disease
A type of chronic leukemia can simmer for many years. Some patients may need treatment to manage this type of blood cancer—called myeloproliferative neoplasms (MPN)—while others may go through long periods of watchful waiting. But for a small percentage of patients, the slower paced disease can transform into an aggressive cancer, called secondary acute myeloid leukemia, that has few effective treatment options. Little has been known about how this transformation takes place.
But now, researchers have identified an important transition point in the shift from chronic to aggressive leukemia. They have shown that blocking a key molecule in the transition pathway prevents this dangerous disease progression both in mouse models of the disease and in mice carrying tumors sampled from human patients.
Tim Kong et al, DUSP6 mediates resistance to JAK2 inhibition and drives leukemic progression, Nature Cancer (2022). DOI: 10.1038/s43018-022-00486-8
Quasicrystal formed during accidental electrical discharge
A team of researchers has found an instance of a quasicrystal formed during an accidental electrical discharge.
Quasicrystals, as their name suggests, are crystal-like substances. They possess characteristics not found in ordinary crystals, such as a non-repeating arrangement of atoms. To date, quasicrystals have been found embedded in meteorites and in the debris from nuclear blasts. In this new effort, the researchers found one embedded in a sand dune in Sand Hills, Nebraska.
Study of the quasicrystal showed it had 12-fold, or dodecagonal, symmetry—something rarely seen in quasicrystals. Curious as to how it might have formed and how it ended up in the sand dune, the researchers did some investigating. They discovered that a power line had fallen on the dune, likely the result of a lightning strike. They suggest the electrical surge from either the power line or the lightning could have produced the quasicrystal.
The researchers note that the quasicrystal was found inside of a tubular piece of fulgurite, which they suggest was also formed during the electrical surge due to fusing of melted sand and metal from the power line.
In looking at the quasicrystal using an electron microscope, the researchers were able to determine its composition. In so doing, they found bits of silicon dioxide glass, which told them that temperatures inside the sand dune during the electrical discharge had to have reached at least 1,710 degrees Celsius. They also found that the quasicrystal had been retrieved from an area of transition between melted aluminum alloy and silicate glass. Their work confirmed that the object they were studying was, indeed, a quasicrystal, and that it had a previously unseen composition.
The researchers conclude that finding a quasicrystal in such a place suggests that others are likely out there as well, having formed due to lightning strikes or downed power lines. They also suggest their work could lead to techniques for creating quasicrystals in the lab.
Luca Bindi et al, Electrical discharge triggers quasicrystal formation in an eolian dune, Proceedings of the National Academy of Sciences (2022). DOI: 10.1073/pnas.2215484119
Human brain organoids implanted into mouse cortex respond to visual stimuli for first time
A team of engineers and neuroscientists has demonstrated for the first time that human brain organoids implanted in mice have established functional connectivity to the animals' cortex and responded to external sensory stimuli. The implanted organoids reacted to visual stimuli in the same way as surrounding tissues, an observation that researchers were able to make in real time over several months thanks to an innovative experimental setup that combines transparent graphene microelectrode arrays and two-photon imaging.
Madison N. Wilson, Martin Thunemann, Xin Liu, Yichen Lu, Francesca Puppo, Jason W. Adams, Jeong-Hoon Kim, Mehrdad Ramezani, Donald P. Pizzo, Srdjan Djurovic, Ole A. Andreassen, Abed AlFatah Mansour, Fred H. Gage, Alysson R. Muotri, Anna Devor, Duygu Kuzum. Multimodal monitoring of human cortical organoids implanted in mice reveal functional connection with visual cortex. Nature Communications, 2022; 13 (1) DOI: 10.1038/s41467-022-35536-3
A groundbreaking study has discovered about 100,000 previously unknown types of viruses, a ninefold increase in the number of RNA viruses known to science until now. The viruses were discovered in global environmental data from soil samples, oceans, lakes, and a variety of other ecosystems. The researchers believe the discovery may help in the development of anti-microbial drugs and in protecting against agriculturally harmful fungi and parasites.
Self-assembling proteins can store cellular 'memories'
As cells perform their everyday functions, they turn on a variety of genes and cellular pathways. Researchers have now coaxed cells to inscribe the history of these events in a long protein chain that can be imaged using a light microscope.
Cells programmed to produce these chains continuously add building blocks that encode particular cellular events. Later, the ordered protein chains can be labeled with fluorescent molecules and read under a microscope, allowing researchers to reconstruct the timing of the events.
This technique could help shed light on the steps that underlie processes such as memory formation, response to drug treatment, and gene expression.
If the technique could be extended to work over longer time periods, it could also be used to study processes such as aging and disease progression, the researchers say.
Recording of cellular physiological histories along optically readable self-assembling protein chains, Nature Biotechnology (2022). DOI: 10.1038/s41587-022-01586-7
Dry eye disease affects how the eye's cornea heals itself after injury
People with a condition known as dry eye disease are more likely than those with healthy eyes to suffer injuries to their corneas. Studying mice, researchers at Washington University School of Medicine in St. Louis have found that proteins made by stem cells that regenerate the cornea may be new targets for treating and preventing such injuries.
Dry eye disease occurs when the eye can't provide adequate lubrication with natural tears. People with the common disorder use various types of drops to replace missing natural tears and keep the eyes lubricated, but when eyes are dry, the cornea is more susceptible to injury.
Existing drugs for the condition work well in only about 10% to 15% of patients. In this new study involving genes that are key to eye health, researchers identified potential treatment targets that appear different in dry eyes than in healthy eyes.
The researchers analyzed genes expressed by the cornea in several mouse models—not only of dry eye disease, but also of diabetes and other conditions. They found that in mice with dry eye disease, the cornea activated expression of the gene SPARC. They also found that higher levels of SPARC protein were associated with better healing.
Lin, Joseph B. et al, Dry eye disease in mice activates adaptive corneal epithelial regeneration distinct from constitutive renewal in homeostasis, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2204134120
COVID-19 vaccines, prior infection reduce transmission of omicron, finds study
Vaccination and boosting, especially when recent, helped to limit the spread of COVID-19 in California prisons during the first omicron wave, according to an analysis by researchers.
The study demonstrates the benefits of vaccination and boosting, even in settings where many people are still getting infected, in reducing transmission. And it shows the cumulative effects from boosting and the additional protection that vaccination gives to those who were previously infected. The likelihood of transmission fell by 11% for each additional dose.
A lot of the benefits of vaccines to reduce infectiousness were from people who had received boosters and people who had been recently vaccinated.
Vaccinated residents with breakthrough infections were significantly less likely to transmit the virus: 28% versus 36% for those who were unvaccinated. But the likelihood of transmission grew by 6% for every five weeks that passed since someone's last vaccine shot.
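Taken at face value, the two reported effects can be combined in a back-of-the-envelope way. The sketch below assumes the per-dose and per-five-week effects multiply, which is a simplification for illustration rather than the authors' statistical model:

```python
# Toy calculation, assuming the reported effects combine multiplicatively:
# transmission risk falls ~11% per vaccine dose and rises ~6% for every
# five weeks since the last dose (relative to a hypothetical baseline of 1.0).
def relative_transmission_risk(doses: int, weeks_since_last_dose: float) -> float:
    return (0.89 ** doses) * (1.06 ** (weeks_since_last_dose / 5))

print(round(relative_transmission_risk(3, 10), 2))   # boosted, 10 weeks ago
print(round(relative_transmission_risk(2, 25), 2))   # two doses, about six months ago
```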
Natural immunity from a prior infection also had a protective effect, and the risk of transmitting the virus was 23% for someone with a reinfection compared to 33% for someone who had never been infected.
Those with hybrid immunity, from both infection and vaccination, were 40% less likely to transmit the virus. Half of that protection came from the immunity that one acquires from fighting an infection and the other half came from being vaccinated.
Adults who stay well-hydrated appear to be healthier, develop fewer chronic conditions, such as heart and lung disease, and live longer than those who may not get sufficient fluids, according to a National Institutes of Health study published in eBioMedicine. Using health data gathered from 11,255 adults over a 30-year period, researchers analyzed links between serum sodium levels—which go up when fluid intake goes down—and various indicators of health. They found that adults with serum sodium levels at the higher end of a normal range were more likely to develop chronic conditions and show signs of advanced biological aging than those with serum sodium levels in the medium ranges. Adults with higher levels were also more likely to die at a younger age.
The results suggest that proper hydration may slow down aging and prolong a disease-free life.
The study expands on research the scientists published in March 2022, which found links between higher ranges of normal serum sodium levels and increased risk for heart failure. Both findings came from the Atherosclerosis Risk in Communities (ARIC) study, which includes sub-studies involving thousands of adults.
The researchers found that adults with serum sodium at the higher end of the normal range—which falls between 135 and 146 milliequivalents per liter (mEq/L)—were more likely to show signs of faster biological aging. This was based on indicators such as metabolic and cardiovascular health, lung function, and inflammation.
The conclusion, therefore, was that people whose serum sodium is 142 mEq/L or higher would benefit from an evaluation of their fluid intake. Doctors may also need to defer to a patient's current treatment plan, such as limiting fluid intake for heart failure. The authors also cited research finding that about half of people worldwide don't meet recommendations for daily total water intake, which often start at 6 cups (1.5 liters).
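As a purely illustrative way to read those numbers (not clinical guidance), the thresholds mentioned above can be expressed as a simple check:

```python
# Toy illustration of the thresholds described above (not clinical advice):
# normal serum sodium spans roughly 135-146 mEq/L, and the study flags values
# of 142 mEq/L or higher as the range where evaluating fluid intake may help.
def sodium_flag(mEq_per_L: float) -> str:
    if mEq_per_L < 135 or mEq_per_L > 146:
        return "outside normal range"
    if mEq_per_L >= 142:
        return "high-normal: consider evaluating fluid intake"
    return "mid-normal"

for value in (137, 140, 143, 148):
    print(value, "->", sodium_flag(value))
```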
Decreased body water content is the most common factor that increases serum sodium, which is why the results suggest that staying well hydrated may slow down the aging process and prevent or delay chronic disease.
Natalia I. Dmitrieva et al, Middle-age high normal serum sodium as a risk factor for accelerated biological aging, chronic diseases, and premature mortality, eBioMedicine (2023). DOI: 10.1016/j.ebiom.2022.104404
Duping antibodies with a decoy, researchers aim to prevent rejection of transplanted cells
Researchers have developed a novel, potentially life-saving approach that may prevent antibodies from triggering immune rejection of engineered therapeutic and transplant cells.
Rejection mediated by antibodies—as opposed to the chemical assault initiated by immune cells—has proven particularly tricky to resolve, a factor that has held back the development of some of these treatments. The new strategy, described in the Monday, Jan. 2, 2023 issue of Nature Biotechnology, involved using a "decoy" receptor to capture the antibodies and take them out of circulation before they could kill the therapeutic cells, which the immune system treats as invading foreigners. The tactic may also be useful for organ transplants.
The most celebrated cellular therapies are chimeric antigen receptor (CAR) T-cell therapies. These CAR-T therapies are often used to successfully treat specific forms of lymphomas, a type of often-deadly cancer. But deploying them against solid tumors has proved much more difficult. Until recently, most CAR-T therapies have been made using the patient's own cells, but the long-term commercial viability of cellular therapies of all types will rely on "allogeneic" cells—mass-produced therapeutic cells grown from a source outside the patient. As with transplanted organs, the recipient's immune system is likely to treat any outsider cells, or tissues developed from them, as foreign and to reject them. This issue is likely to be a severe obstacle in any type of allogeneic cell transplantation.
Normally when an antibody binds to a cell, it acts as a sort of tag, calling out for an immune cell to bind to the antibody and set off an efficient process of destroying the tagged cell. To stop this chain reaction, researchers devised a method to catch the antibodies before they bind to cells, preventing activation of the immune response.
The researchers genetically engineered three types of cells—insulin-producing pancreatic islet cells, thyroid cells, and CAR-T cells—so that each type made and displayed large numbers of a protein called CD64 on their surfaces.
On these engineered cells, CD64, which tightly binds the antibodies responsible for this type of immune rejection, acted as a kind of decoy, capturing the antibodies and binding them to the engineered cell, so they wouldn't activate immune cells.
Indeed, this proved to be the case when the approach was tested, providing clear proof of concept.
New type of entanglement lets scientists 'see' inside nuclei
Nuclear physicists have found a new way to use the Relativistic Heavy Ion Collider (RHIC) to see the shape and details inside atomic nuclei. The method relies on particles of light that surround gold ions as they speed around the collider and a new type of quantum entanglement that's never been seen before.
Through a series of quantum fluctuations, the particles of light (a.k.a. photons) interact with gluons—gluelike particles that hold quarks together within the protons and neutrons of nuclei. Those interactions produce an intermediate particle that quickly decays into two differently charged "pions" (π). By measuring the velocity and angles at which these π+ and π- particles strike RHIC's STAR detector, the scientists can backtrack to get crucial information about the photon—and use that to map out the arrangement of gluons within the nucleus with higher precision than ever before.
This technique is similar to the way doctors use positron emission tomography (PET scans) to see what's happening inside the brain and other body parts. But in this case, physicists are talking about mapping out features on the scale of femtometers—quadrillionths of a meter—the size of an individual proton.
Even more amazing, the STAR physicists say, is the observation of an entirely new kind of quantum interference that makes their measurements possible. Physicists measure two outgoing particles and clearly their charges are different—they are different particles—but they see interference patterns that indicate these particles are entangled, or in sync with one another, even though they are distinguishable particles! This is the first-ever experimental observation of entanglement between dissimilar particles. That discovery may have applications well beyond the lofty goal of mapping out the building blocks of matter.
James Brandenburg, Tomography of ultra-relativistic nuclei with polarized photon-gluon collisions, Science Advances (2023). DOI: 10.1126/sciadv.abq3903
Scientists describe structure and function of newly discovered CRISPR immune system
In new research papers, biochemists describe the structure and function of a newly discovered CRISPR immune system that—unlike better-known CRISPR systems, which deactivate foreign genes to protect cells—shuts down infected cells to thwart infection.
CRISPR, a manageable acronym for the mouthful "Clustered Regularly Interspaced Short Palindromic Repeats," has captured the imaginations of scientists and lay people alike, with its gene-editing potential. Study of CRISPR DNA sequences and CRISPR-associated (Cas) proteins, which are actually bacterial immune systems, is still a young field, although it's receiving widespread attention for its gene-editing applications.
Identified as a distinct immune system within the last five years, the Class 2, type V Cas12a2 is somewhat similar to the better-known CRISPR-Cas9, which binds to target DNA and cuts it—like molecular scissors—effectively shutting off a targeted gene. But CRISPR-Cas12a2 binds a different target than Cas9, and that binding has a very different effect.
The Cas12a2 protein undergoes major conformational changes upon binding to RNA, and these changes open an indiscriminate active site for DNA destruction. "Cas12a2 destroys the DNA and RNA in target cells, causing them to go senescent."
Using cryo-electron microscopy or "cryo-EM," the researchers demonstrated this unique aspect of CRISPR-Cas12a2, including its RNA-triggered degradation of single-stranded RNA, single-stranded DNA and double-stranded DNA, resulting in a naturally occurring defensive strategy called abortive infection.
Abortive infection is a natural phage resistance strategy used by bacteria and archaea to limit the spread of viruses and other pathogens. For example, abortive infection prevents viral components that have infected a cell from replicating.
The team captured the structure of Cas12a2 in the act of cutting double-stranded DNA.
Incredibly, Cas12a2 nucleases bend the usually straight piece of double-helical DNA 90 degrees, to force the backbone of the helix into the enzymatic active site, where it is cut. It's a change in structure that's extraordinary to observe.
Insulin is an essential hormone for humans and many other living creatures. Its best-known task is to regulate sugar metabolism. How it does this job is well understood. Much less is known about how the activity of insulin-producing cells and consequently the secretion of insulin is controlled.
Researchers have now presented new information on this question in the scientific journal Current Biology. They used the fruit fly Drosophila melanogaster as their study object. Interestingly, this fly also secretes insulin after a meal. However, in the fly, the hormone does not come from the pancreas as in humans, but is instead released by nerve cells in the brain.
They found that the fly's physical activity has a strong effect on its insulin-producing cells. For the first time, the researchers measured the activity of these cells electrophysiologically in walking and flying Drosophila.
The result: when Drosophila starts to walk or fly, its insulin-producing cells are immediately inhibited. When the fly stops moving, the activity of the cells rapidly increases again and shoots up above normal levels.
The scientists hypothesize that the low activity of insulin-producing cells during walking and flight contributes to the provision of sugars to meet the increased energy demand. They suspect that the increased activity after exercise helps to replenish the fly's energy stores, for example, in the muscles.
The team was also able to demonstrate that the fast, behaviour-dependent inhibition of insulin-producing cells is actively controlled by neural pathways. It is largely independent of changes in the sugar concentration in the fly's blood.
It makes a lot of sense for the organism to anticipate an increased energy demand in this way to prevent extreme fluctuations in blood sugar levels.
Do the results allow conclusions to be drawn about humans? Probably.
Although the release of insulin in fruit flies is mediated by different cells than in humans, the insulin molecule and its function have hardly changed in the course of evolution.
In the past 20 years, using Drosophila as a model organism, many fundamental questions have already been answered that could also contribute to a better understanding of metabolic defects in humans and associated diseases, such as diabetes or obesity.
Less insulin means longevity
One exciting point is that reduced insulin activity contributes to healthy aging and longevity. This has already been shown in flies, mice, humans and other species. The same applies to an active lifestyle. This work reveals a possible link, explaining how physical activity could positively affect insulin regulation via neuronal signaling pathways.
Sander Liessem et al, Behavioral state-dependent modulation of insulin-producing cells in Drosophila, Current Biology (2022). DOI: 10.1016/j.cub.2022.12.005
An Organism That Can Dine Exclusively on Viruses Has Been Found in a World First
A type of freshwater plankton has become the first organism seen thriving on a diet of viruses, according to a new study by researchers.
Viruses are often consumed incidentally by a wide range of organisms, and may even season the diets of certain marine protists. But to qualify as a true step in the food chain – described as virovory – viruses ought to contribute a significant amount of energy or nutrients to their consumer.
The microbe Halteria is a common genus of protist known to flit about as its hair-like cilia propel it through the water. Not only did laboratory samples of the ciliate consume chloroviruses added to its environment, the giant virus fueled Halteria's growth and increased its population size.
The knock-on effects of widespread consumption of chloroviruses in the wild could have a profound impact on the carbon cycle. Known to infect microscopic green algae, chloroviruses cause their hosts to burst apart, releasing carbon and other nutrients into the environment – a process that serious amounts of virus-eating could be limiting.
There's some good stuff inside viruses if you're an organism looking to feed, including amino acids, nucleic acids, lipids, nitrogen, and phosphorus. Surely something would want to make a meal out of that.
While another ciliate, Paramecium, snacked on the viruses, its size and numbers barely budged. Halteria, on the other hand, dined on them, using the chlorovirus as a source of nutrients: the ciliate's population grew about 15 times larger in two days, while the virus population dropped a hundredfold.
Fluorescent green dye was used to tag chlorovirus DNA before it was introduced to the two types of plankton. This confirmed that the viruses were being eaten: the vacuoles – microbial equivalent of stomachs – were glowing green from the feeding.
Further analysis revealed that the growth of Halteria relative to the decline of the chlorovirus matched the ratios seen in other microscopic predator-prey relationships in aquatic environments, giving the team more evidence of what was happening.
The brain's ability to perceive space expands like the universe
Young children sometimes believe that the moon is following them, or that they can reach out and touch it. It appears to be much closer than is proportional to its true distance. As we move about our daily lives, we tend to think that we navigate space in a linear way. But scientists have discovered that time spent exploring an environment causes neural representations to grow in surprising ways.
The findings, published in Nature Neuroscience on December 29, 2022, show that neurons in the hippocampus essential for spatial navigation, memory, and planning represent space in a manner that conforms to a nonlinear hyperbolic geometry—a three-dimensional expanse that grows outward exponentially. (In other words, it's shaped like the interior of an expanding hourglass.) The researchers also found that the size of that space grows with time spent in a place. And the size is increasing in a logarithmic fashion that matches the maximal possible increase in information being processed by the brain.
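To give a feel for what hyperbolic, logarithmically expanding representations mean, here is a small illustrative sketch (not the paper's analysis): the Poincare-ball distance formula commonly used to describe hyperbolic spaces, and an effective radius that grows like log(t), so each doubling of exploration time adds a constant increment.

```python
import numpy as np

# Illustrative sketch only. In the Poincare-ball model of hyperbolic geometry,
# available volume grows roughly exponentially with radius, so a representation
# whose radius grows only logarithmically with time still gains capacity steadily.
def poincare_distance(u, v):
    """Hyperbolic distance between two points strictly inside the unit ball."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    num = 2.0 * np.sum((u - v) ** 2)
    den = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + num / den))

print(poincare_distance([0.1, 0.0], [0.0, 0.5]))   # distance between two example points

# Effective radius R(t) ~ log(t): doubling the exploration time adds a constant amount.
for t in (1, 2, 4, 8, 16):
    print(f"time {t:2d} -> radius ~ {np.log(t):.2f}")
```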
This discovery provides valuable methods for analyzing data on neurocognitive disorders involving learning and memory, such as Alzheimer's disease.
This new study demonstrates that the brain does not always act in a linear manner. Instead, neural networks function along an expanding curve, which can be analyzed and understood using hyperbolic geometry and information theory. It is exciting to see that neural responses in this area of the brain formed a map that expanded with experience, based on the amount of time spent in a given place. The effect even held for minuscule deviations in time, when the animal ran more slowly or faster through the environment.
In the current study, the scientists found that hyperbolic geometry guides neural responses as well. Hyperbolic maps of sensory molecules and events are perceived with hyperbolic neural maps. The spatial representations expanded dynamically in correlation with the amount of time the rat spent exploring each environment, and when a rat moved more slowly through an environment, it gained more information about the space, which caused the neural representations to grow even more.
The findings provide a novel perspective on how neural representations can be altered with experience. The geometric principles identified in the study can also guide future endeavors in understanding neural activity in various brain systems.
You would think that hyperbolic geometry only applies on a cosmic scale, but that is not true. Our brains work much slower than the speed of light, which could be a reason that hyperbolic effects are observed on graspable spaces instead of astronomical ones.
Huanqiu Zhang et al, Hippocampal spatial representations exhibit a hyperbolic geometry that expands with experience, Nature Neuroscience (2022). DOI: 10.1038/s41593-022-01212-4
New approach successfully traces genomic variants back to genetic disorders
Researchers have published an assessment of 13 studies that took a genotype-first approach to patient care. This approach contrasts with the typical phenotype-first approach to clinical research, which starts with clinical findings. A genotype-first approach to patient care involves selecting patients with specific genomic variants and then studying their traits and symptoms. This approach uncovered new relationships between genes and clinical conditions, broadened the traits and symptoms associated with known disorders, and offered insights into newly described disorders. The study was published in the American Journal of Human Genetics.
Genotype-first research can work, especially for identifying people with rare disorders who otherwise might not have been brought to clinical attention.
Typically, to treat genetic conditions, researchers first identify patients who are experiencing symptoms, then they look for variants in the patients' genomes that might explain those findings. However, this can lead to bias because the researchers are studying clinical findings based on their understanding of the disorder. The phenotype-first approach limits researchers from understanding the full spectrum of symptoms of the disorders and the associated genomic variants.
Genomics has the potential to change reactive medicine into preventative medicine. Studying how a genotype-first approach to research works can help us learn how to model predictive and precision medicine in the future.
The study documents three types of discoveries from a genotype-first approach.
First, the researchers found that this approach helped discover new relationships between genomic variants and specific clinical traits. For example, one NIH study found that having more than two copies of the TPSAB1 gene was associated with symptoms related to the gastrointestinal tract, connective tissues, and the nervous system.
Second, this approach helped researchers find novel symptoms related to a disorder that clinicians previously missed because the patient did not have the typical symptoms. NHGRI researchers identified a person with a genomic variant associated with a known metabolic disorder. Further testing found that the individual had high levels of certain chemicals in their body associated with the disorder, despite having only minor symptoms.
Third, this approach allowed researchers to determine the function of specific genomic variants, which has the potential to help clinicians understand newly described disorders. For example, in one study, NHGRI researchers and their collaborators found that a genomic variant was associated with immune dysfunction at the molecular level in blood cells.
Study reveals average age at conception for men versus women over past 250,000 years
The length of a specific generation can tell us a lot about the biology and social organization of humans. Now, researchers can determine the average age that women and men had children throughout human evolutionary history with a new method they developed using DNA mutations.
The researchers said this work can help us understand the environmental challenges experienced by our ancestors and may also help us in predicting the effects of future environmental change on human societies.
Through this research on modern humans, researchers noticed that they could predict the age at which people had children from the types of DNA mutations they passed on to their children. They then applied this model to our human ancestors to determine the age at which our ancestors had children.
According to the study, published recently in Science Advances, the average age at which humans had children throughout the past 250,000 years is 26.9. Furthermore, fathers were consistently older, at 30.7 years on average, than mothers, at 23.2 years on average, but the age gap has shrunk in the past 5,000 years, with the study's most recent estimates of maternal age averaging 26.4 years. The shrinking gap seems to be largely due to mothers having children at older ages. Other than the recent uptick in maternal age at childbirth, the researchers found that parental age has not increased steadily from the past and may have dipped around 10,000 years ago because of population growth coinciding with the rise of civilization.
Mutations from the past accumulate with every generation and exist in humans today. Researchers can now identify these mutations, see how they differ between male and female parents, and how they change as a function of parental age. Children's DNA inherited from their parents contains roughly 25 to 75 new mutations, which allows scientists to compare the parents and offspring, and then to classify the kind of mutation that occurred. When looking at mutations in thousands of children, the researchers noticed a pattern: the kinds of mutations that children get depend on the ages of the mother and the father.
Previous genetic approaches to determining historical generation times relied on the compounding effects of either recombination or mutation in the divergence of modern human DNA sequences from ancient samples. But those results were averaged across males and females and across the past 40,000 to 45,000 years.
The researchers built a model that uses de novo mutations—a genetic alteration that is present for the first time in one family member as a result of a variant or mutation in a germ cell of one of the parents or that arises in the fertilized egg during early embryogenesis—to separately estimate the male and female generation times at many different points throughout the past 250,000 years.
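The logic of the approach can be caricatured in a few lines. The sketch below is a hypothetical, heavily simplified stand-in for the study's actual model (a single made-up "spectrum feature" and a linear fit), just to show how a trained age-to-spectrum relationship can be inverted to estimate parental age from mutations alone.

```python
import numpy as np

# Hypothetical sketch of the general idea (not the study's actual model): if the
# fraction of certain mutation classes shifts with parental age, fit that
# relationship in families with known ages, then invert it to estimate ages
# for generations where only the mutation spectrum is observed.
rng = np.random.default_rng(1)
true_ages = rng.uniform(18, 45, size=500)                  # known parental ages (training set)
# Simulate a mutation-spectrum feature that drifts linearly with age, plus noise.
spectrum_feature = 0.01 * true_ages + rng.normal(0, 0.02, size=true_ages.size)

# Fit the age -> spectrum relationship with least squares, then invert it.
slope, intercept = np.polyfit(true_ages, spectrum_feature, 1)

def estimate_age(feature: float) -> float:
    return (feature - intercept) / slope

observed = 0.01 * 30.7                                     # spectrum from an "ancient" generation
print("estimated parental age:", round(estimate_age(observed), 1))
```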
The story of human history is pieced together from a diverse set of sources: written records, archaeological findings, fossils, etc. Our genomes, the DNA found in every one of our cells, offer a kind of manuscript of human evolutionary history. The findings from our genetic analysis confirm some things we knew from other sources (such as the recent rise in parental age), but also offer a richer understanding of the demography of ancient humans. These findings contribute to a better understanding of our shared history.
Scars mended using transplanted hair follicles in new study
In a new study involving three volunteers, skin scars began to behave more like uninjured skin after they were treated with hair follicle transplants. The scarred skin harbored new cells and blood vessels, remodeled collagen to restore healthy patterns, and even expressed genes found in healthy unscarred skin.
The findings could lead to better treatments for scarring, both on the skin and inside the body, offering hope for patients with extensive scarring, which can impair organ function and cause disability.
After scarring, the skin never truly regains its pre-wound functions, and until now all efforts to remodel scars have yielded poor results. The new findings lay the foundation for exciting new therapies that can rejuvenate even mature scars and restore the function of healthy skin.
Scar tissue in the skin lacks hair, sweat glands, blood vessels and nerves, which are vital for regulating body temperature and detecting pain and other sensations. Scarring can also impair movement as well as potentially cause discomfort and emotional distress.
Compared to scar tissue, healthy skin undergoes constant remodeling by the hair follicle. Hairy skin heals faster and scars less than non-hairy skin—and hair transplants had previously been shown to aid wound healing. Inspired by this, the researchers hypothesized that transplanting growing hair follicles into scar tissue might cause scars to remodel themselves.
To test their hypothesis, the researchers transplanted hair follicles into the mature scars on the scalps of three participants in 2017. They selected the most common type of scar, called normotrophic scars, which usually form after surgery. They took 3 mm-thick biopsies of the scars for microscope imaging just before transplantation, and then again at two, four, and six months afterwards. The researchers found that the follicles inspired profound architectural and genetic shifts in the scars towards a profile of healthy, uninjured skin.
After transplantation, the follicles continued to produce hair and induced restoration across skin layers. Scarring causes the outermost layer of skin—the epidermis—to thin out, leaving it vulnerable to tears. At six months post-transplant, the epidermis had doubled in thickness alongside increased cell growth, bringing it to around the same thickness as uninjured skin.
The next skin layer down, the dermis, is populated with connective tissue, blood vessels, sweat glands, nerves, and hair follicles. Scar maturation leaves the dermis with fewer cells and blood vessels, but after transplantation the number of cells had doubled at six months, and the number of vessels had reached nearly healthy-skin levels by four months. This demonstrated that the follicles inspired the growth of new cells and blood vessels in the scars, which are unable to do this unaided.
Scarring also increases the density of collagen fibers—a major structural protein in skin—which causes them to align such that scar tissue is stiffer than healthy tissue. The hair transplants reduced the density of the fibers, which allowed them to form a healthier "basket weave" pattern, reducing stiffness—a key factor in tears and discomfort.
The authors also found that after transplantation, the scars expressed 719 genes differently than before. Genes that promoted cell and blood vessel growth were expressed more, while genes that promoted scar-forming processes were expressed less.
How liver cancer hijacks circadian clock machinery inside cells
The most common type of liver cancer, hepatocellular carcinoma (HCC), is already the third leading cause of cancer-related deaths globally—and cases are on the rise worldwide. While chemotherapy, surgery and liver transplants can help some patients, targeted treatments for HCC could save millions more lives.
Recent studies have offered clues about one potential target: the circadian clock proteins inside cells, which help coordinate changes in the body's functioning over the course of a day. But most of this research only hints at an indirect link between circadian clock function and HCC, for instance the observation that cells collected from patients with liver cancer have disrupted circadian rhythms.
Now, a study by researchers not only directly links circadian clock proteins to liver cancer, but also shows precisely how cancer cells hijack circadian clock machinery to divide and spread. The research, just published in the journal Proceedings of the National Academy of Sciences, also found that inhibiting key clock proteins can prevent cancer cells from multiplying.
By targeting the circadian clock, we are not only targeting tumor cells but also the area around the tumor, which can help increase the efficacy of other targeted treatments.
The researchers showed that two key clock proteins, known as CLOCK and BMAL1, are critical for the replication of liver cancer cells in cell culture. When CLOCK and BMAL1 were suppressed, the cancer cells' replication process was interrupted—ultimately causing cell death, or apoptosis. Triggering apoptosis, during which a cell stops dividing and then self-destructs, is the goal of many modern cancer treatments.
Among other findings, they showed that eliminating the clock proteins reduced levels of the enzyme Wee1 and increased levels of the enzyme inhibitor P21. That's exactly what you want, because when it comes to cancer cell proliferation, P21 is a brake and Wee1 is a gas pedal.
Finally, the researchers tested their findings in vivo. Mice injected with unmodified human liver cancer cells grew large tumors, but those injected with cells modified to suppress CLOCK and BMAL1 showed little to no tumor growth.
Meng Qu et al, Circadian regulator BMAL1::CLOCK promotes cell proliferation in hepatocellular carcinoma by controlling apoptosis and cell cycle, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2214829120
The oven won't talk to the fridge: 'smart' homes struggle
Yes, you read it right.
Tech firms have spent years hawking the idea of a connected home filled with "smart" devices that help smooth daily domestic lives. However, competition between different tech companies is making things very difficult.
Big players from Amazon and Apple to Google and Samsung have built entire ecosystems for their devices, often around a voice assistant like Alexa or Siri.
The biggest firms have spent years trying to tackle the "interoperability" problem, finally agreeing to a protocol last year called "Matter" that sets a standard for connected home products.
Just as USB allowed almost any device to plug into almost any machine, the Matter protocol is meant to let connected devices work with one another, so that users no longer need to download a different app for each device.
Making the devices work with each other was the easier part.
The hard part is the app model, the data model and the sharing of that data, because companies are by nature very protective of it.
Each brand is now trying to convince the public to adopt its app to centralize control of household appliances.
And if things don't match up, so your fridge doesn't talk to your stove or your vacuum cleaner to your recliner, you will have a very messy home.
New research: When cells exchange metabolic products with other cells, they live longer. The fact that these exchanges directly impact cell lifespans could play a significant role in future research into human aging processes and age-related diseases. The study appears in the latest issue of Cell.
Metabolism is inextricably linked to aging. While it helps maintain vital processes, makes us grow, and triggers cellular repairs, it also produces substances that damage our cells and cause us to age. The metabolic processes that occur within cells are highly complex. The exchange of substances between cells in a community is one important factor, because it has a substantial impact on the metabolism occurring inside a cell.
Cells are in constant contact with neighboring cells—within tissues, for instance. They release some substances and consume others from their surrounding environment. In a recent study researchers investigated whether the exchange of metabolic products (known as metabolites) affects the lifespan of cells.
The researchers used yeast cells and performed experiments to establish their lifespan. Yeast cells are a key model in basic research, a dominant microorganism in biotechnology, and important in medicine because they can cause fungal infections. They found that the cells lived around 25% longer when they exchanged more metabolites with each other. The researchers therefore set out to identify the substances and exchange processes behind this life-prolonging effect.
To do so, the researchers employed a special analytical system supported by mass spectrometry that allowed them to precisely track the exchange of metabolites between cells. They found that young cells, which were still able to divide well and often, released amino acids that were consumed by older cells.
Amino acids are the building blocks that make up proteins. The research team discovered that the exchange of the amino acid methionine extended the lives of the cells involved. Methionine occurs in all organisms and plays a key role in protein synthesis, as well as many other cellular processes. Interestingly, it was the young cells' metabolism that prolonged the lives of the old cells.
The cells in the community that consumed methionine released glycerol in turn. The presence of glycerol affected the methionine-producing cells, causing them to live longer. Glycerol is needed for building cell membranes and plays a part in protecting cells. It's a win-win situation: as cells engage in this collaborative exchange, they prolong the lifespan of their community as a whole.
This study of yeast cell communities is the first to show that metabolite exchange directly impacts the lifespan and aging process of the cells. The researchers suspect this also applies to other types of cells, such as those in the human body, and are aiming to investigate this in further studies.
Clara Correia-Melo et al, Cell-cell metabolite exchange creates a pro-survival metabolic environment that extends lifespan, Cell (2023). DOI: 10.1016/j.cell.2022.12.007
Rate of scientific breakthroughs slowing over time: Study
The rate of ground-breaking scientific discoveries and technological innovation is slowing down despite an ever-growing amount of knowledge, according to an analysis released recently of millions of research papers and patents.
While previous research has shown downturns in individual disciplines, the study is the first that 'emphatically, convincingly documents this decline of disruptiveness across all major fields of science and technology'.
The researchers gave a "disruptiveness score" to 45 million scientific papers dating from 1945 to 2010, and to 3.9 million US-based patents from 1976 to 2010.
From the start of those time ranges, research papers and patents have been increasingly likely to consolidate or build upon previous knowledge, according to results published in the journal Nature.
The ranking was based on how the papers were cited in other studies five years after publication, assuming that the more disruptive the research was, the less its predecessors would be cited.
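To make the idea concrete, here is a minimal Python sketch of a CD-style disruption score of the kind this ranking describes; the set-based formulation and the toy numbers are illustrative assumptions, not the exact metric or data used in the study.

```python
# Simplified sketch of a CD-style "disruptiveness score". For a focal paper,
# look at later papers (e.g., within 5 years) that cite it and/or its references:
# works citing only the focal paper suggest disruption, while works citing both
# the focal paper and its references suggest consolidation. This is an
# illustrative reconstruction, not the exact metric used in the study.

def cd_index(citers_of_focal, citers_of_references):
    """
    citers_of_focal: set of paper IDs citing the focal paper (within the window)
    citers_of_references: set of paper IDs citing the focal paper's references
    Returns a score in [-1, 1]: +1 = fully disruptive, -1 = fully consolidating.
    """
    only_focal = citers_of_focal - citers_of_references
    both = citers_of_focal & citers_of_references
    only_refs = citers_of_references - citers_of_focal
    total = len(only_focal) + len(both) + len(only_refs)
    if total == 0:
        return 0.0
    return (len(only_focal) - len(both)) / total

# Toy example: 8 later papers cite only the focal work, 2 cite it alongside its
# references, and 3 cite only the references -> a mostly disruptive score (~0.46).
print(cd_index({f"p{i}" for i in range(8)} | {"q1", "q2"},
               {"q1", "q2", "r1", "r2", "r3"}))
```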
The biggest decrease in disruptive research came in the physical sciences, such as physics and chemistry.
"The nature of research is shifting" as incremental innovations become more common.
One theory for the decline is that all the "low-hanging fruit" of science has already been plucked.
If that were the case, disruptiveness in various scientific fields would have fallen at different speeds.
But the declines are fairly consistent in their speed and timing across all major fields, indicating that the low-hanging-fruit theory is unlikely to be the explanation.
Instead, the researchers pointed to what has been dubbed "the burden of research," which suggests there is now so much that scientists must learn to master a particular field that they have little time left to push boundaries.
This leads scientists and inventors to focus on a narrow slice of the existing knowledge, producing work that consolidates what is known rather than disrupts it.
Another reason could be that "there's increasing pressure in academia to publish, publish, publish, because that's the metric that academics are assessed on."
The researchers called on universities and funding agencies to focus more on quality, rather than quantity, and consider full subsidies for year-long sabbaticals to allow academics to read and think more deeply.
This work showed that "ultra-specialization" and the pressure to publish had increased over the years.
Researchers blamed a global trend of academics being "forced to slice up their papers" to increase their number of publications, saying it had led to "a dulling of research."
Michael Park et al, Papers and patents are becoming less disruptive over time, Nature (2023). DOI: 10.1038/s41586-022-05543-x
Human-approved medication brings back 'lost' memories in mice
Students sometimes pull an all-nighter to prepare for an exam. However, research has shown that sleep deprivation is bad for your memory. Now, neuroscientists have discovered that what you learn while being sleep deprived is not necessarily lost, it is just difficult to recall.
These neuroscientists have found a way to make this "hidden knowledge" accessible again, days after studying while sleep-deprived, using optogenetic approaches and the human-approved asthma drug roflumilast. These findings were published in the journal Current Biology.
Using genetic techniques, the scientists caused a light-sensitive protein (channelrhodopsin) to be produced selectively in neurons that are activated during a learning experience. This made it possible to recall a specific experience by shining light on these cells.
In the experiment, the genetically engineered mice were given a spatial learning task in which they had to learn the location of individual objects, a process that heavily relies on neurons in the hippocampus. The mice then had to perform this same task days later, but this time with one object moved to a novel location. The mice that were deprived of sleep for a few hours before the first session failed to detect this spatial change, which suggests that they could not recall the original object locations.
However, when the researchers reintroduced them to the task after reactivating the hippocampal neurons that initially stored this information with light, they did successfully remember the original locations. This shows that the information was stored in the hippocampus during sleep deprivation, but couldn't be retrieved without the stimulation.
The molecular pathway set off during the reactivation is also targeted by the drug roflumilast, which is used by patients with asthma or COPD. When the researchers gave roflumilast to mice that had been trained while sleep-deprived, just before the second test, the mice remembered the original locations, just as they did with direct stimulation of the neurons. As roflumilast is already clinically approved for use in humans and is known to enter the brain, these findings open up avenues to test whether it can be applied to restore access to 'lost' memories in humans.
The discovery that more information is present in the brain than we previously anticipated, and that these "hidden" memories can be made accessible again—at least in mice—opens up all kinds of exciting possibilities.
Youri G. Bolsius et al, Recovering object-location memories after sleep deprivation-induced amnesia, Current Biology (2022). DOI: 10.1016/j.cub.2022.12.006
“I’ve lost count of the number of times that a board member has remarked that the way a study has been designed means it won’t yield any informative data,” says experimental psychologist and ethical-review-board chair Daniël Lakens. To counter this trend, his university has introduced a methodological review board that highlights flaws—such as sample sizes that are too small to test a hypothesis—before data collection begins.
An interesting thought indeed "Researchers blamed a global trend of academics being "forced to slice up their papers" to increase their number of publications, saying it had led to "a dulling of research."
So, Dr Krishna - where does this lead scientists to? Does it indicate a progressive movement towards applied Physics?
Researchers uncover new cell types involved in osteoarthritis
A new study has identified a potential target for treating osteoarthritis—a debilitating joint disease that affects millions and is a leading cause of disability worldwide.
A team of researchers has uncovered previously unknown cell types in the joint that emerge after an injury and drive the onset of osteoarthritis.
Clinically, osteoarthritis presents as a very complex disease, with patients suffering from joint stiffness, reduced mobility and function, and most notably, persistent pain.
Osteoarthritis patients commonly live with this condition for multiple decades, and no treatments have been developed that can stop or reverse the disease. The condition can occur with age or be sparked by a joint injury and is typically managed with pain relief and end-stage joint replacement.
The study, titled "Synovial fibroblasts assume distinct functional identities and secrete R-spondin 2 in osteoarthritis" and published in the Annals of the Rheumatic Diseases, examined the cellular and molecular events during the onset of post-traumatic osteoarthritis in joints.
Researchers identified cell types that emerge in the joint after trauma, such as an ACL injury, and can now associate these cells with the disease process. This allows them to be viewed as a treatment target for this devastating disease.
By employing a cutting-edge gene sequencing technology called single-cell RNA-sequencing, the researchers were able to uncover these previously uncharacterized cells that emerge in the joint after injury.
The study also described the biological processes that may activate these cells, which offers compelling new targets for an effective treatment.
Interestingly, these cells are not found in healthy joints, and the researchers still need to understand exactly what causes them to appear and how they may cause osteoarthritis.
Alexander J Knights et al, Synovial fibroblasts assume distinct functional identities and secrete R-spondin 2 in osteoarthritis, Annals of the Rheumatic Diseases (2022). DOI: 10.1136/ard-2022-222773
Electrons take new shape inside unconventional metal
One of the biggest achievements of quantum physics was recasting our vision of the atom. Out was the early 1900s model of a solar system in miniature, in which electrons looped around a solid nucleus. Instead, quantum physics showed that electrons live a far more interesting life, meandering around the nucleus in clouds that look like tiny balloons. These balloons are known as atomic orbitals, and they come in all sorts of different shapes—perfectly round, two-lobed, clover-leaf-shaped. The number of lobes in the balloon signifies how much angular momentum the electron carries as it moves about the nucleus.
That's all well and good for individual atoms, but when atoms come together to form something solid—like a chunk of metal, say—the outermost electrons in the atoms can link arms and lose sight of the nucleus from where they came, forming many oversized balloons that span the whole chunk of metal. They stop spinning about their nuclei and flow through the metal to carry electrical currents, shedding the diversity of multi-lobed balloons.
Now researchers have produced the first experimental evidence that one metal—the topological semimetal YPtBi—and likely others in its class have electrons that manage to preserve a more interesting, multi-lobed structure as they move around in a solid. The team experimentally studied the shape of these balloons and found not a uniform surface, but a complex structure. This unusual metal is not only fundamentally interesting, but it could also prove useful for building quantum computers that are resistant to noise.
Peering inside a metal, one sees atoms arranged in neat repeating grids, called a crystal lattice. In a crystal, the atomic orbitals of the outermost electrons morph into one another. This allows the electrons to travel far from their original nucleus and carry current through the metal. In this solid setting, a version of orbital balloons still exists, but it's more common to visualize them not in space—where there are many huge and unwieldy orbitals—but as a function of the speed and direction of the traveling electrons. The fastest moving electrons in the crystal form their own balloon, a collective analog of atomic orbitals known as a Fermi surface.
The shape of the Fermi surface reflects the structure of the underlying crystal, which usually bears no resemblance to the orbital structure of single atoms. But for materials like YPtBi with very few mobile electrons, the Fermi surface is not very big. Because of this, it retains some of the properties of electrons that hardly move at all, which sit at the center of the Fermi surface.
The fact that nature figures out counterintuitive atomic arrangements that allow the Fermi surface to retain signatures of the atomic orbitals is rather cool and intricate.
To uncover this cool, counterintuitive Fermi surface, the researchers stuck a YPtBi crystal inside a magnetic field and measured the current flowing through the crystal as they tuned the field. By rotating the direction of the magnetic field, they were able to map out the speed of the fastest electrons in every direction. They found that akin to a higher angular momentum atomic orbital, the Fermi surface has a complex shape to it, with peaks and troughs along certain directions. The high symmetry of the crystal itself would normally lead to a more uniform, ball-like Fermi surface, so it was a surprise to find a more complicated structure. This pointed to the possibility that the collective electrons were exhibiting some of the higher angular momentum nature of atomic orbitals.
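For readers who want to see how such a measurement translates into geometry, here is a small illustrative sketch: the standard Onsager relation links each quantum-oscillation frequency to an extremal Fermi-surface cross-section, so frequencies measured at different field angles trace out the surface's shape. The frequencies below are invented placeholders, not the paper's data.

```python
# Minimal sketch: converting quantum-oscillation frequencies measured at several
# magnetic-field angles into extremal Fermi-surface cross-sections via the
# Onsager relation A_k = 2*pi*e*F/hbar. The frequencies below are hypothetical
# placeholders, not data from the YPtBi experiment.
import math

E_CHARGE = 1.602176634e-19   # elementary charge (C)
HBAR = 1.054571817e-34       # reduced Planck constant (J*s)

def extremal_area(freq_tesla):
    """Fermi-surface extremal cross-section (1/m^2) for oscillation frequency F (tesla)."""
    return 2 * math.pi * E_CHARGE * freq_tesla / HBAR

# Hypothetical frequencies vs. field angle; peaks and troughs in A(theta)
# would reveal a non-spherical (anisotropic) Fermi surface.
measured = {0: 45.0, 30: 48.5, 60: 44.0, 90: 47.0}   # angle (deg) -> F (T)
for angle, f in measured.items():
    print(f"theta={angle:3d} deg  F={f:5.1f} T  A={extremal_area(f):.3e} m^-2")
```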
Indeed, theoretical calculations by the team showed that the experimental results matched up with a high angular momentum model, leading the team to claim the first experimental observation of a high-angular momentum metal. The team cautions that even this experimental evidence could still be incomplete. What they measured depends not only on the Fermi surface but also on other properties of the electrons, such as their effective mass and the distribution of their velocities. In their work, the team systematically studied the angular dependence of these other quantities and demonstrated that it would be extremely unlikely for them to cause the observed peaks and troughs.
In addition to being fundamentally novel, this higher angular momentum metal has potential applications for quantum computing.
Hyunsoo Kim et al, Quantum oscillations of the j=3/2 Fermi surface in the topological semimetal YPtBi, Physical Review Research (2022). DOI: 10.1103/PhysRevResearch.4.033169
Are we breathing airborne microplastics? Study finds higher concentrations indoors
People are likely exposed to thousands of airborne microplastics a year indoors, a new study has found.
Published in Environmental Science & Technology, the study investigated the abundance, distribution, form, and possible sources of microplastics (MPs) in indoor and outdoor sites in Sri Lanka, finding concentrations between 1 and 28 times higher indoors.
With people spending approximately 90% of their time indoors and based on the indoor and outdoor MP levels identified in this study, the researchers calculated the average human exposure as 2,675 airborne microplastic particles per person every year.
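As a rough illustration of how such an exposure figure is assembled (concentration, times time spent in each environment, times breathing rate), here is a small sketch; every input value is a placeholder assumption, not a number taken from the study.

```python
# Illustrative sketch of an annual inhalation-exposure estimate of the kind
# described above. All input values are placeholders for illustration only,
# not the concentrations reported in the Sri Lanka study.

HOURS_PER_YEAR = 24 * 365
BREATHING_RATE_M3_PER_H = 0.66          # assumed adult light-activity rate (m^3/h)

def annual_particles(indoor_conc, outdoor_conc, indoor_fraction=0.9):
    """Particles inhaled per year given airborne MP concentrations (particles/m^3)."""
    indoor_hours = HOURS_PER_YEAR * indoor_fraction
    outdoor_hours = HOURS_PER_YEAR * (1 - indoor_fraction)
    return (indoor_conc * indoor_hours + outdoor_conc * outdoor_hours) * BREATHING_RATE_M3_PER_H

# Hypothetical concentrations: indoors several times higher than outdoors.
print(round(annual_particles(indoor_conc=0.5, outdoor_conc=0.1)))
```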
Researchers collected air samples in different urban, rural, coastal, inland, industrial, and natural habitats with varying population densities.
The indoor MP levels, made up of fibers and occasional fragments mostly from textiles and clothing, were significantly higher than outdoor levels, by a factor of 1 to 28, regardless of the type of outdoor environment. Transparent, blue, and black fibers in the size range of 0.10 to 0.50 millimeters were the dominant airborne microplastics (AMPs) across all sites.
These initial results show that the amount of indoor airborne microplastics is more related to indoor sources and the occupants' lifestyle than the outdoor environment.
In the outdoor samples, the amount of AMPs was always greater in high-density sites compared to the low-density areas, suggesting the abundance and distribution of AMPs was related to population density, level of industrialization, and human activity.
The dominant type of microplastics in both indoor and outdoor sites was PET fibers (polyethylene terephthalate), primarily originating from clothing and textiles.
This study is an important first step that shows the abundance of AMPs in a lower-middle income country in South Asia.
Kushani Perera et al, Airborne Microplastics in Indoor and Outdoor Environments of a Developing Country in South Asia: Abundance, Distribution, Morphology, and Possible Sources, Environmental Science & Technology (2022). DOI: 10.1021/acs.est.2c05885
Crocodiles are known for their ability to stay under water for a long time. When the unseen apex predator lunges from the water to seize its prey, an impala, its infamous teeth latch onto a hindquarter, jaws clenching with 5,000 pounds of force. Yet it's the water itself that does the killing, with the deep-breathed reptile dragging its prey into deeper water to drown.
The success of the croc's ambush lies in the nanoscopic scuba tanks—hemoglobins—that course through its bloodstream, unloading oxygen from lungs to tissues at a slow but steady clip that allows it to go hours without air. The hyper-efficiency of that specialized hemoglobin has led some biologists to wonder why, of all the jawed vertebrates in all the world, crocodilians were the lone group to hit on such an optimal solution to making the most of a breath.
By statistically reconstructing and experimentally resurrecting the hemoglobin of an archosaur, the 240-million-year-old ancestor of all crocodilians and birds, researchers have gleaned new insights into that why. Rather than requiring just a few key mutations, as earlier research suggested, the unique properties of crocodilian hemoglobin stemmed from 21 interconnected mutations that litter the intricate component of red blood cells.
That complexity, and the multiple knock-on effects that any one mutation can induce in hemoglobin, may have forged an evolutionary path so labyrinthine that nature failed to retrace it even over tens of millions of years, the researchers say.
All hemoglobin binds with oxygen in the lungs before swimming the bloodstream and eventually releasing that oxygen to the tissues that depend on it. In most vertebrates, hemoglobin's affinity for capturing and holding oxygen is dictated largely by molecules known as organic phosphates, which, by attaching themselves to the hemoglobin, can coax it into releasing its precious cargo.
But in crocodilians—crocodiles, alligators and their kin—the role of organic phosphates was supplanted by a molecule, bicarbonate, that is produced from the breakdown of carbon dioxide. Because hardworking tissues produce lots of carbon dioxide, they also indirectly generate lots of bicarbonate, which in turn encourages hemoglobin to dispense its oxygen to the tissues most in need of it.
It's a super-efficient system that provides a kind of slow-release mechanism that allows crocodilians to efficiently exploit their onboard oxygen stores. It's part of the reason they're able to stay underwater for so long.
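A small sketch using the standard Hill equation illustrates the general principle: an allosteric effector that lowers hemoglobin's oxygen affinity (raises its P50), as bicarbonate does in crocodilians, increases the fraction of oxygen unloaded at tissue oxygen tensions. All parameter values below are illustrative assumptions, not measurements from the study.

```python
# Minimal sketch of the idea above using the standard Hill equation for
# hemoglobin-oxygen binding: saturation = pO2^n / (P50^n + pO2^n).
# An allosteric effector (bicarbonate in crocodilians, organic phosphates in
# most other vertebrates) raises P50, lowering affinity and releasing more O2
# at tissue oxygen tensions. All parameter values are illustrative only.

def o2_saturation(p_o2, p50, n=2.8):
    """Fractional hemoglobin saturation at oxygen partial pressure p_o2 (same units as p50)."""
    return p_o2**n / (p50**n + p_o2**n)

LUNG_PO2, TISSUE_PO2 = 100.0, 30.0      # mmHg, typical illustrative values

for label, p50 in [("low effector (high affinity)", 20.0),
                   ("high effector (low affinity)", 35.0)]:
    delivered = o2_saturation(LUNG_PO2, p50) - o2_saturation(TISSUE_PO2, p50)
    print(f"{label}: fraction of O2 unloaded to tissue = {delivered:.2f}")
```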
Chandrasekhar Natarajan et al, Evolution and molecular basis of a novel allosteric property of crocodilian hemoglobin, Current Biology (2022). DOI: 10.1016/j.cub.2022.11.049
Washing fabrics by hand reduces microplastic release compared with machine washing
From tiny plankton to massive whales, microplastics have been found throughout the ocean food chain. One major source of this pollution is fibers shed while laundering synthetic fabrics. Although many studies show microfibers are released during machine washing, it has been less clear how hand washing contributes. Now, researchers reporting in ACS ES&T Water have found that hand washing can drastically cut the amount of fibers shed compared with using a machine.
When clothing made from plastic fibers, such as polyester and nylon, is laundered, the fabric sheds microscopic fibers that eventually end up in wastewater and the environment. Though researchers have investigated the amount and types of microplastic fibers shed while laundering clothing, most studies have focused on washing machines. In many countries, however, it is still common to launder clothing by hand. So the researchers systematically compared both types of washing and their effect on the environment.
The team cleaned two types of fabric swatches made from 100% polyester and a 95% polyester-5% spandex blend with hand washing methods and a washing machine. The researchers found that:
Manual methods released far fewer fibers. For example, the 100% polyester fabric shed an average of 1,853 microplastic pieces during hand washing compared with an average of 23,723 pieces from the same fabric that was machine laundered.
By weight, machine laundering released over five times more microplastics than the traditional method.
The fibers released from hand washing tended to be longer.
Adding detergent, pre-soaking the fabrics and using a washboard increased the number of released fibers with manual methods, but still not to the same extent as using a machine.
In contrast, they found that temperature, detergent type, wash time and the amount of water used had no meaningful effects on the amount of microplastics shed while hand washing.
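A quick back-of-envelope comparison of the reported polyester counts puts the difference in perspective (the by-weight factor of roughly five is reported separately above):

```python
# Quick arithmetic on the reported per-wash fiber counts for 100% polyester:
# hand washing vs. machine washing (counts taken from the findings above).
hand_wash_fibers = 1_853
machine_wash_fibers = 23_723

print(f"machine/hand ratio by count: {machine_wash_fibers / hand_wash_fibers:.1f}x")
# By mass, the study reports machine washing released over five times more
# microplastic; the larger gap by count is consistent with the observation
# above that fibers released during hand washing tended to be longer.
```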
The researchers say that these results will help clarify the sources of microplastic pollution in the environment and can provide guidance for "greener" laundering methods.
Chunhui Wang et al, Microplastic Fiber Release by Laundry: A Comparative Study of Hand-Washing and Machine-Washing, ACS ES&T Water (2023). DOI: 10.1021/acsestwater.2c00462
Protein that counteracts key rattlesnake venom toxins identified
Venomous snakes cause an estimated 120,000 deaths and 400,000 disabling injuries worldwide each year.
To reduce and mitigate the severity of venomous snake bites, a team of biologists launched an investigation into the genome of the western diamondback rattlesnake (Crotalus atrox), a species with more venom toxins encoded in its genome than any other known rattlesnake. The team pinpointed a single protein—called FETUA-3—that inhibits a broad spectrum of rattlesnake venom toxins.
FETUA-3 inhibited a huge number of toxins—more than 20 of those detected—and even bound to and inhibited toxins in the venoms of several other rattlesnakes the researchers tested. There is still a need to learn how broadly FETUA-3 can be applied, or whether it will need some additional tinkering, but knowing that this one protein can neutralize an entire class of toxins brings researchers even closer to creating a better anti-venom.
Published in the Proceedings of the National Academy of Sciences, the team's findings have notable implications for the development of improved snake bite treatments.
Fiona P. Ukken et al, A novel broad spectrum venom metalloproteinase autoinhibitor in the rattlesnake Crotalus atrox evolved via a shift in paralog function, Proceedings of the National Academy of Sciences (2022). DOI: 10.1073/pnas.2214880119
Lubrication science gives mechanistic insights into how food actually feels in the mouth. You can use that knowledge to design food with better taste, texture or health benefits.
Scientists have decoded the physical process that takes place in the mouth when a piece of chocolate is eaten, as it changes from a solid into a smooth emulsion that many people find totally irresistible.
During the moments it is in the mouth, the chocolate sensation arises from the way the chocolate is lubricated, either from ingredients in the chocolate itself or from saliva or a combination of the two.
Fat plays a key role almost immediately when a piece of chocolate is in contact with the tongue. After that, solid cocoa particles are released and become important for the tactile sensation, so fat deeper inside the chocolate plays a rather limited role and could be reduced without having an impact on the feel or sensation of chocolate.
The researchers used analytical techniques from a field of engineering called tribology to conduct the study, which included in situ imaging.
Tribology is about how surfaces and fluids interact, the levels of friction between them and the role of lubrication: in this case, saliva or liquids from the chocolate. Those mechanisms are all happening in the mouth when chocolate is eaten.
When chocolate is in contact with the tongue, it releases a fatty film that coats the tongue and other surfaces in the mouth. It is this fatty film that makes the chocolate feel smooth throughout the entire time it is in the mouth.
Siavash Soltanahmadi et al, Insights into the multiscale lubrication mechanism of edible phase change materials, ACS Applied Materials & Interfaces (2023). pubs.acs.org/doi/10.1021/acsami.2c13017
Dr. Krishna Kumari Challa
Holding information in mind may mean storing it among synapses
Between the time you read the Wi-Fi password off the café's menu board and the time you can get back to your laptop to enter it, you have to hold it in mind. If you've ever wondered how your brain does that, you are asking a question about working memory that researchers have strived for decades to explain. Now neuroscientists have published a key new insight to explain how it works.
In a study in PLOS Computational Biology, scientists compared measurements of brain cell activity in an animal performing a working memory task with the output of various computer models representing two theories of the underlying mechanism for holding information in mind. The results strongly favoured the newer notion that a network of neurons stores the information by making short-lived changes in the pattern of their connections, or synapses, and contradicted the traditional alternative that memory is maintained by neurons remaining persistently active (like an idling engine).
While both models allowed for information to be held in mind, only the versions that allowed for synapses to transiently change connections ("short-term synaptic plasticity") produced neural activity patterns that mimicked what was actually observed in real brains at work. The idea that brain cells maintain memories by being always "on" may be simpler, but it doesn't represent what nature is doing and can't produce the sophisticated flexibility of thought that can arise from intermittent neural activity backed up by short-term synaptic plasticity.
You need these kinds of mechanisms to give working memory activity the freedom it needs to be flexible. If working memory was just sustained activity alone, it would be as simple as a light switch. But working memory is as complex and dynamic as our thoughts.
Using artificial neural networks with short-term synaptic plasticity, scientists show that synaptic activity (instead of neural activity) can be a substrate for working memory. The important takeaway from this paper is: these 'plastic' neural network models are more brain-like, in a quantitative sense, and also have additional functional benefits in terms of robustness.
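For readers unfamiliar with the mechanism, here is a minimal toy sketch of short-term synaptic plasticity in the Tsodyks-Markram style; it is not the authors' network model, and all parameter values are assumptions. It shows how, after a brief burst of spikes, the synaptic facilitation variable stays elevated for around a second with no further activity, the kind of "hidden" trace such models exploit.

```python
# Minimal toy model of short-term synaptic plasticity (Tsodyks-Markram-style
# facilitation u and depression x). Illustrative sketch only, not the network
# model from the paper. After a brief presynaptic burst, u stays elevated for
# about a second even with no further spikes: a "hidden" trace held in the
# synapse rather than in ongoing neural activity.

U = 0.2            # baseline release probability (assumed)
TAU_F = 1.5        # facilitation time constant (s)
TAU_D = 0.2        # depression (recovery) time constant (s)
DT = 0.001         # integration step (s)

u, x = U, 1.0
spike_steps = {100, 120, 140, 160, 180}        # a brief burst at 100-180 ms

for step in range(int(2.0 / DT)):              # simulate 2 seconds
    u += DT * (U - u) / TAU_F                  # facilitation relaxes to baseline
    x += DT * (1.0 - x) / TAU_D                # resources recover toward 1
    if step in spike_steps:                    # a presynaptic spike arrives
        u += U * (1.0 - u)                     # facilitation jumps up
        x -= u * x                             # resources are consumed
    if step % 500 == 0:
        t = step * DT
        print(f"t={t:.1f}s  u={u:.2f}  x={x:.2f}  efficacy(u*x)={u*x:.2f}")
```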
Part 1
Dec 30, 2022
Dr. Krishna Kumari Challa
In addition to matching nature better, the synaptic plasticity models also conferred other benefits that likely matter to real brains. One was that the plasticity models retained information in their synaptic weightings even after as many as half of the artificial neurons were "ablated." The persistent activity models broke down after losing just 10–20 percent of their synapses. And just spiking occasionally requires less energy than spiking persistently.
Furthermore, quick bursts of spiking rather than persistent spiking leaves room in time for storing more than one item in memory. Research has shown that people can hold up to four different things in working memory.
Leo Kozachkov et al, Robust and brain-like working memory through short-term synaptic plasticity, PLOS Computational Biology (2022). DOI: 10.1371/journal.pcbi.1010776
Part 2
Dec 30, 2022
Dr. Krishna Kumari Challa
South Asian black carbon causes glacier loss on Tibetan Plateau
Black carbon aerosol is the product of incomplete combustion of fossil fuels and biomass, and it absorbs light strongly. Black carbon deposited on snow and ice reduces the albedo of the surface, accelerating the melting of glaciers and snow cover and thus changing the region's hydrological processes and water resources.
The South Asia region adjacent to the Tibetan Plateau is one of the regions with high black carbon emission in the world. Black carbon aerosol from South Asia can transport across the Himalayan Mountains to the inland region of the Tibetan Plateau.
Recently, a research team analyzed the influence of black carbon aerosols on regional precipitation and glaciers over the Qinghai-Tibet Plateau.
Their findings were published in Nature Communications on Nov. 30.
The researchers found that since the 21st century, the South Asian black carbon aerosols have indirectly affected the material supply of the Tibetan Plateau glaciers by changing water vapour transport in the South Asian monsoon.
Black carbon aerosols in South Asia heat up the middle and upper atmosphere, thus increasing the north-south temperature gradient.
Accordingly, the convective activity in South Asia is enhanced, which causes convergence of water vapor in South Asia. Meanwhile, black carbon also increases the number of cloud condensation nuclei in the atmosphere.
These changes in meteorological conditions caused by black carbon aerosols make more water vapor form precipitation in South Asia, and less water vapor transmit to the Tibetan Plateau. As a result, precipitation in the central and southern Tibetan Plateau decreases during monsoon, especially in the southern part of the Tibetan Plateau.
The decrease of precipitation further leads to the decrease of material supply of glaciers. From 2007 to 2016, the reduced material supply accounted for 11.0% of the average glacier material loss on the Tibetan Plateau and 22.1% in the southern part of the plateau.
"The transboundary transport and deposition of black carbon aerosols in South Asia accelerate glacier ablation on the Tibetan Plateau. Meanwhile, the reduction of plateau summer precipitation will reduce the material supply of plateau glacier, which will increase the amount of glacier material deficit.
Junhua Yang et al, South Asian black carbon is threatening the water sustainability of the Asian Water Tower, Nature Communications (2022). DOI: 10.1038/s41467-022-35128-1
Dec 30, 2022
Dr. Krishna Kumari Challa
Gastroesophageal reflux disease raises risk for periodontitis: Study
Patients with gastroesophageal reflux disease (GERD) have an increased risk for periodontitis development, according to a study published online Nov. 19 in Biomedicines.
Researchers conducted a retrospective cohort study to examine the association between GERD and subsequent periodontitis risk using epidemiological data from the Taiwan National Health Insurance Research Database from 2008 to 2018. A total of 20,125 participants with a minimum age of 40 years were included in the GERD group and propensity-matched in a 1:1 ratio with non-GERD participants.
The researchers found that the incidence rate of periodontitis was significantly higher in patients with versus those without GERD (30.0 versus 21.7 per 1,000 person-years; adjusted hazard ratio, 1.36). Patients with GERD had a higher risk for periodontitis in analyses stratified for age (adjusted hazard ratios, 1.31 and 1.42 for age 40 to 54 and 55 to 69 years, respectively), sex (adjusted hazard ratios, 1.40 and 1.33 for men and women, respectively), and presence and absence of comorbidity (adjusted hazard ratios, 1.36 and 1.40, respectively) compared with those without GERD. The risk for periodontitis was increased with an increasing number of emergency room visits among the GERD cohort (one or more versus less than one; adjusted hazard ratio, 5.19).
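For clarity, incidence rates like those quoted are simply events divided by accumulated person-years of follow-up. The sketch below uses invented event counts chosen only to reproduce the quoted rates; a crude rate ratio is also not the same as the study's covariate-adjusted hazard ratio.

```python
# Sketch of how incidence rates per 1,000 person-years are computed: events
# divided by accumulated person-years of follow-up. The event and follow-up
# numbers below are invented for illustration; only the form of the
# calculation matches the study.

def incidence_per_1000_py(events, person_years):
    return 1000.0 * events / person_years

gerd_rate = incidence_per_1000_py(events=3_000, person_years=100_000)       # 30.0
non_gerd_rate = incidence_per_1000_py(events=2_170, person_years=100_000)   # 21.7

print(f"GERD cohort:     {gerd_rate:.1f} per 1,000 person-years")
print(f"non-GERD cohort: {non_gerd_rate:.1f} per 1,000 person-years")
print(f"crude rate ratio: {gerd_rate / non_gerd_rate:.2f}")   # differs from the adjusted HR of 1.36
```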
"Clinicians should pay more attention to the development of periodontitis while caring for patients with GERD," the authors write. "On the other hand, dentists may consider GERD as an etiology of unexplained periodontitis."
Dec 30, 2022
Dr. Krishna Kumari Challa
Parental astigmatism may increase risk for child astigmatism
Parental astigmatism may confer an independent and dose-dependent association with child astigmatism, according to a study published online Dec. 21 in JAMA Network Open.
Researchers examined the association between parental astigmatism (an optical system with astigmatism is one where rays that propagate in two perpendicular planes have different foci. If an optical system with astigmatism is used to form an image of a cross, the vertical and horizontal lines will be in sharp focus at two different distances) and child astigmatism. The analysis included 5,708 familial trios, each comprising a child aged 6 to 8 years and both parents, participating in the Hong Kong Children Eye Study.
The researchers found that astigmatism of ≥1.0 D in both parents was associated with greater odds of refractive astigmatism (RA; odds ratio, 1.62) and corneal astigmatism (CA; odds ratio, 1.94) in the child. When both parents had astigmatism ≥2.0 D, the risk increased further (odds ratios, 3.10 and 4.31, respectively), with higher parental astigmatism conferring higher risks for both RA and CA in children. There was a significant association between each parental astigmatism and corresponding child astigmatism (odds ratios, 0.76, 0.82, 1.70, and 1.33 for maternal RA, paternal RA, maternal CA, and paternal CA, respectively).
"The findings of this cross-sectional study suggest that parental astigmatism may confer an independent and dose-dependent association with child astigmatism," the authors write. "Children with parents with astigmatism should have early eye examinations for timely detection of astigmatism to facilitate age-appropriate vision correction and visual development."
Dec 30, 2022
Dr. Krishna Kumari Challa
Researchers discover that soap film on bubbles is cooler than the air around it
A team of researchers has discovered that the film that makes up ordinary soap bubbles is cooler than the surrounding air.
Bubbles exist in a wide variety of environments, from beer glasses to clothes and dish washers to crests on waves. They even exist in tiny environments, like in the space between human teeth. A lot of research has been done with bubbles, much of it focused on controlling them during industrial processes.
In their work, the researchers created bubbles using ordinary dish soap, water and glycerol. After discovering a temperature difference, the team refocused their efforts to learn more. They tried changing the temperature of the air, the humidity level and also the proportions of the ingredients used to make the bubbles. They found that they were able to make bubbles that were up to 8 degrees Celsius cooler than the air around them. They also found that changing the amount of glycerol impacted the temperature of the resulting bubbles—more of it yielded higher temperatures.
The researchers suggest the cooler films could be the result of evaporation as the bubbles form. They noted also that as the bubbles persisted, their films slowly grew warmer, eventually matching the ambient air temperature. They suggest the large temperature differences they found with some bubbles might have an impact on bubble stability, and conclude that more work is required to find out why the films are cooler and if it might be a useful attribute.
François Boulogne et al, Measurement of the Temperature Decrease in Evaporating Soap Films, Physical Review Letters (2022). DOI: 10.1103/PhysRevLett.129.268001. On Arxiv: arxiv.org/abs/2212.07104
Dec 31, 2022
Dr. Krishna Kumari Challa
Ketamine found to increase brain noise
An international team of researchers found that ketamine, being an NMDA receptor inhibitor, increases the brain's background noise, causing higher entropy of incoming sensory signals and disrupting their transmission between the thalamus and the cortex. This finding may contribute to a better understanding of the causes of psychosis in schizophrenia. An article with the study's findings has been published in the European Journal of Neuroscience.
Schizophrenic spectrum disorders affect approximately one in 300 people worldwide. The most common manifestations of these disorders are perceptual disturbances such as hallucinations, delusions and psychoses. A drug called ketamine can induce a mental state similar to psychosis in healthy individuals. Ketamine inhibits NMDA receptors involved in the transmission of excitatory signals in the brain. An imbalance of excitation and inhibition in the central nervous system can affect the accuracy of sensory perception. Similar changes in the functioning of NMDA receptors are currently thought to be one of the causes of perception disorders in schizophrenia. However, it is still unclear how exactly this process occurs in the brain regions involved.
To find out, neuroscientists studied how the brains of laboratory rats on ketamine process sensory signals. The researchers examined beta and gamma oscillations occurring in response to sensory stimuli in the rodent brain's thalamo-cortical system, a neural network connecting the cerebral cortex with the thalamus, which is responsible for the transmission of sensory information from the organs of perception to the brain.
Beta oscillations are brainwaves in the range of 15 to 30 Hz, and gamma waves are those in the range of 30 to 80 Hz. These frequencies are believed to be critical for encoding and integrating sensory information.
In the experiment, rats were implanted with microelectrodes to record the electrical activity in the thalamus and the somatosensory cortex, a region of the brain which is responsible for processing sensory information coming from the thalamus. The researchers stimulated the rats' whiskers (vibrissae) and recorded the brain's responses before and after ketamine administration.
A comparison of the two datasets revealed that ketamine increased the power of beta and gamma oscillations in the cortex and the thalamus even in the resting state before a stimulus was presented, while the amplitude of the beta/gamma oscillations in the 200–700 ms post-stimulus period was significantly lower at all recorded cortical and thalamic sites following ketamine administration.
The post-stimulus window of 200–700 ms is long enough to encode, integrate and perceive the incoming sensory signal. The observed decrease in the power of stimulus-induced oscillations can therefore be associated with impaired perception.
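As an illustration of the kind of analysis described, the following sketch band-pass filters a recorded signal and compares beta (15–30 Hz) and gamma (30–80 Hz) power in the 200–700 ms post-stimulus window; the signal here is synthetic noise, and the filter settings are assumptions rather than the study's actual pipeline.

```python
# Sketch of the kind of analysis described above: comparing beta (15-30 Hz) and
# gamma (30-80 Hz) power in the 200-700 ms post-stimulus window of a local
# field potential. The signal is synthetic noise, not the study's data.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000                                   # sampling rate (Hz)
rng = np.random.default_rng(0)
lfp = rng.standard_normal(2 * FS)           # 2 s of fake recording, stimulus at t=0

def band_power(signal, low, high, fs=FS):
    """Mean squared amplitude of the signal band-passed to [low, high] Hz."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, signal)
    return np.mean(filtered ** 2)

window = lfp[int(0.2 * FS):int(0.7 * FS)]   # 200-700 ms after the stimulus
print("beta power (15-30 Hz):", band_power(window, 15, 30))
print("gamma power (30-80 Hz):", band_power(window, 30, 80))
```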
Part 1
Dec 31, 2022
Dr. Krishna Kumari Challa
Analysis also revealed that by inhibiting NMDA receptors, ketamine administration added noise to gamma frequencies in the post-stimulation 200–700 ms period in one thalamic nucleus and in one layer of the somatosensory cortex. It can be assumed that this observed increase in noise, i.e., a reduction in the signal-to-noise ratio, also indicates the neurons' impaired ability to process incoming sensory signals.
These findings suggest that psychosis may be triggered by an increase in background noise impairing the function of thalamo-cortical neurons. This, in turn, could be caused by a malfunction of NMDA receptors affecting the balance of inhibition and excitation in the brain. The noise makes sensory signals less defined or pronounced. In addition, this may cause spontaneous outbursts of activity associated with a distorted perception of reality.
"The discovered alterations in thalamic and cortical electrical activity associated with ketamine-induced sensory information processing disorders could serve as biomarkers for testing antipsychotic drugs or predicting the course of disease in patients with psychotic spectrum disorders.
Yi Qin et al, The psychotomimetic ketamine disrupts the transfer of late sensory information in the corticothalamic network, European Journal of Neuroscience (2022). DOI: 10.1111/ejn.15845
Thomas J. Reilly, Ketamine: Linking NMDA receptor hypofunction, gamma oscillations and psychosis (commentary on Qin et al., 2022), European Journal of Neuroscience (2022). DOI: 10.1111/ejn.15872
Part 2
Dec 31, 2022
Dr. Krishna Kumari Challa
How chronic blood cancer transitions to aggressive disease
A type of chronic leukemia can simmer for many years. Some patients may need treatment to manage this type of blood cancer—called myeloproliferative neoplasms (MPN)—while others may go through long periods of watchful waiting. But for a small percentage of patients, the slower paced disease can transform into an aggressive cancer, called secondary acute myeloid leukemia, that has few effective treatment options. Little has been known about how this transformation takes place.
But now, researchers have identified an important transition point in the shift from chronic to aggressive leukemia. They have shown that blocking a key molecule in the transition pathway prevents this dangerous disease progression, both in mouse models of the disease and in mice bearing tumors sampled from human patients.
Tim Kong et al, DUSP6 mediates resistance to JAK2 inhibition and drives leukemic progression, Nature Cancer (2022). DOI: 10.1038/s43018-022-00486-8
Dec 31, 2022
Dr. Krishna Kumari Challa
Quasicrystal formed during accidental electrical discharge
A team of researchers has found an instance of a quasicrystal formed during an accidental electrical discharge.
Quasicrystals, as their name suggests, are crystal-like substances. They possess characteristics not found in ordinary crystals, such as a non-repeating arrangement of atoms. To date, quasicrystals have been found embedded in meteorites and in the debris from nuclear blasts. In this new effort, the researchers found one embedded in a sand dune in Sand Hills, Nebraska.
Study of the quasicrystal showed it had 12-fold, or dodecagonal, symmetry—something rarely seen in quasicrystals. Curious as to how it might have formed and how it ended up in the sand dune, the researchers did some investigating. They discovered that a power line had fallen on the dune, likely the result of a lightning strike. They suggest the electrical surge from either the power line or the lightning could have produced the quasicrystal.
The researchers note that the quasicrystal was found inside of a tubular piece of fulgurite, which they suggest was also formed during the electrical surge due to fusing of melted sand and metal from the power line.
In looking at the quasicrystal using an electron microscope, the researchers were able to make out its composition. In so doing, they found bits of silicon dioxide glass, which told them that temperatures inside the sand dune during the electrical discharge had to have reached at least 1,710 degrees Celsius. They also found that the quasicrystal had been retrieved from an area of transition between melted aluminum alloy and silicate glass. Their work confirmed that the object they were studying was, indeed, a quasicrystal, and that it had a previously unseen composition.
The researchers conclude that finding a quasicrystal in such a place suggests that others are likely out there, as well, having formed due to lightning strikes or downed power lines. They also suggest their work could lead to techniques to create quasicrystals in the lab.
Luca Bindi et al, Electrical discharge triggers quasicrystal formation in an eolian dune, Proceedings of the National Academy of Sciences (2022). DOI: 10.1073/pnas.2215484119
Dec 31, 2022
Dr. Krishna Kumari Challa
Human brain organoids implanted into mouse cortex respond to visual stimuli for first time
A team of engineers and neuroscientists has demonstrated for the first time that human brain organoids implanted in mice have established functional connectivity to the animals' cortex and responded to external sensory stimuli. The implanted organoids reacted to visual stimuli in the same way as surrounding tissues, an observation that researchers were able to make in real time over several months thanks to an innovative experimental setup that combines transparent graphene microelectrode arrays and two-photon imaging.
Madison N. Wilson et al, Multimodal monitoring of human cortical organoids implanted in mice reveal functional connection with visual cortex, Nature Communications (2022). DOI: 10.1038/s41467-022-35536-3
Dec 31, 2022
Dr. Krishna Kumari Challa
Researchers identify 100,000 new types of viruses
A groundbreaking study has discovered about 100,000 previously unknown types of viruses, a ninefold increase in the number of RNA viruses known to science until now. The viruses were discovered in global environmental data from soil samples, oceans, lakes, and a variety of other ecosystems. The researchers believe that the discovery may help in the development of anti-microbial drugs and in protecting against agriculturally harmful fungi and parasites.
Dec 31, 2022
Dr. Krishna Kumari Challa
Self-assembling proteins can store cellular 'memories'
As cells perform their everyday functions, they turn on a variety of genes and cellular pathways. Researchers have now coaxed cells to inscribe the history of these events in a long protein chain that can be imaged using a light microscope.
Cells programmed to produce these chains continuously add building blocks that encode particular cellular events. Later, the ordered protein chains can be labeled with fluorescent molecules and read under a microscope, allowing researchers to reconstruct the timing of the events.
This technique could help shed light on the steps that underlie processes such as memory formation, response to drug treatment, and gene expression.
If the technique could be extended to work over longer time periods, it could also be used to study processes such as aging and disease progression, the researchers say.
Recording of cellular physiological histories along optically readable self-assembling protein chains, Nature Biotechnology (2022). DOI: 10.1038/s41587-022-01586-7
Jan 4, 2023
Dr. Krishna Kumari Challa
Dry eye disease affects how the eye's cornea heals itself after injury
People with a condition known as dry eye disease are more likely than those with healthy eyes to suffer injuries to their corneas. Studying mice, researchers at Washington University School of Medicine in St. Louis have found that proteins made by stem cells that regenerate the cornea may be new targets for treating and preventing such injuries.
Dry eye disease occurs when the eye can't provide adequate lubrication with natural tears. People with the common disorder use various types of drops to replace missing natural tears and keep the eyes lubricated, but when eyes are dry, the cornea is more susceptible to injury.
We have drugs to treat this condition, but they only work well in about 10% to 15% of patients. In this new study involving genes that are key to eye health, researchers identified potential targets for treatment that appear different in dry eyes than in healthy eyes.
The researchers analyzed genes expressed by the cornea in several mouse models—not only of dry eye disease, but also of diabetes and other conditions. They found that in mice with dry eye disease, the cornea activated expression of the gene SPARC. They also found that higher levels of SPARC protein were associated with better healing.
Lin, Joseph B. et al, Dry eye disease in mice activates adaptive corneal epithelial regeneration distinct from constitutive renewal in homeostasis, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2204134120. doi.org/10.1073/pnas.2204134120
Jan 4, 2023
Dr. Krishna Kumari Challa
COVID-19 vaccines, prior infection reduce transmission of omicron, finds study
Vaccination and boosting, especially when recent, helped to limit the spread of COVID-19 in California prisons during the first omicron wave, according to an analysis by researchers.
The study demonstrates the benefits of vaccination and boosting, even in settings where many people are still getting infected, in reducing transmission. And it shows the cumulative effects from boosting and the additional protection that vaccination gives to those who were previously infected. The likelihood of transmission fell by 11% for each additional dose.
A lot of the benefits of vaccines to reduce infectiousness were from people who had received boosters and people who had been recently vaccinated.
Vaccinated residents with breakthrough infections were significantly less likely to transmit the virus: 28% versus 36% for those who were unvaccinated. But the likelihood of transmission grew by 6% for every five weeks that passed since someone's last vaccine shot.
Natural immunity from a prior infection also had a protective effect, and the risk of transmitting the virus was 23% for someone with a reinfection compared to 33% for someone who had never been infected.
Those with hybrid immunity, from both infection and vaccination, were 40% less likely to transmit the virus. Half of that protection came from the immunity that one acquires from fighting an infection and the other half came from being vaccinated.
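Treating the quoted per-dose and waning effects as simple multiplicative changes gives a rough feel for how they compound; this is only an intuition aid, since the study estimates these effects within an adjusted statistical model rather than by naive multiplication.

```python
# Illustrative compounding of the effects quoted above, treated as simple
# multiplicative changes in relative infectiousness. Rough approximation for
# intuition only; not the study's model.

def relative_infectiousness(doses, weeks_since_last_dose):
    per_dose = (1 - 0.11) ** doses                       # ~11% lower per additional dose
    waning = (1 + 0.06) ** (weeks_since_last_dose / 5)   # ~6% higher per 5 weeks elapsed
    return per_dose * waning

print(f"3 doses, 5 weeks out:  {relative_infectiousness(3, 5):.2f}")
print(f"3 doses, 25 weeks out: {relative_infectiousness(3, 25):.2f}")
print(f"unvaccinated baseline: {relative_infectiousness(0, 0):.2f}")
```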
Infectiousness of SARS-CoV-2 breakthrough infections and reinfections during the Omicron wave, Nature Medicine (2022). DOI: 10.1038/s41591-022-02138-x , www.nature.com/articles/s41591-022-02138-x
Jan 4, 2023
Dr. Krishna Kumari Challa
Good hydration linked to healthy aging
Adults who stay well-hydrated appear to be healthier, develop fewer chronic conditions, such as heart and lung disease, and live longer than those who may not get sufficient fluids, according to a National Institutes of Health study published in eBioMedicine. Using health data gathered from 11,255 adults over a 30-year period, researchers analyzed links between serum sodium levels—which go up when fluid intake goes down—and various indicators of health. They found that adults with serum sodium levels at the higher end of a normal range were more likely to develop chronic conditions and show signs of advanced biological aging than those with serum sodium levels in the medium ranges. Adults with higher levels were also more likely to die at a younger age.
The results suggest that proper hydration may slow down aging and prolong a disease-free life.
The study expands on research the scientists published in March 2022, which found links between higher ranges of normal serum sodium levels and increased risk for heart failure. Both findings came from the Atherosclerosis Risk in Communities (ARIC) study, which includes sub-studies involving thousands of adults.
The researchers found that adults with higher levels of normal serum sodium—with normal ranges falling between 135-146 milliequivalents per liter (mEq/L)—were more likely to show signs of faster biological aging. This was based on indicators like metabolic and cardiovascular health, lung function, and inflammation.
The conclusion, therefore, was that people whose serum sodium is 142 mEq/L or higher would benefit from an evaluation of their fluid intake. Doctors may also need to defer to a patient's current treatment plan, such as limiting fluid intake for heart failure. The authors also cited research that finds about half of people worldwide don't meet recommendations for daily total water intake, which often starts at 6 cups (1.5 liters).
Decreased body water content is the most common factor that increases serum sodium, which is why the results suggest that staying well hydrated may slow down the aging process and prevent or delay chronic disease.
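As a quick illustration of the thresholds mentioned above, here is a minimal sketch; the category labels and the function name are illustrative assumptions, and the cut-offs simply restate the 135–146 mEq/L normal range and the 142 mEq/L evaluation threshold from the study.

    # Sketch of the serum-sodium ranges discussed above (mEq/L).
    def sodium_flag(serum_sodium_meq_l):
        if serum_sodium_meq_l < 135:
            return "below the normal range - clinical evaluation"
        if serum_sodium_meq_l > 146:
            return "above the normal range - clinical evaluation"
        if serum_sodium_meq_l >= 142:
            return "high-normal - consider reviewing fluid intake"
        return "mid-normal range"

    for value in (133, 138, 143, 147):
        print(value, "->", sodium_flag(value))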
Natalia I. Dmitrieva et al, Middle-age high normal serum sodium as a risk factor for accelerated biological aging, chronic diseases, and premature mortality, eBioMedicine (2023). DOI: 10.1016/j.ebiom.2022.104404
Jan 4, 2023
Dr. Krishna Kumari Challa
Duping antibodies with a decoy, researchers aim to prevent rejection of transplanted cells
Researchers have developed a novel, potentially life-saving approach that may prevent antibodies from triggering immune rejection of engineered therapeutic and transplant cells.
Rejection mediated by antibodies—as opposed to the chemical assault initiated by immune cells—has proven particularly tricky to resolve, a factor that has held back the development of some of these treatments. The new strategy, described in the Monday, Jan. 2, 2023 issue of Nature Biotechnology, involved using a "decoy" receptor to capture the antibodies and take them out of circulation before they could kill the therapeutic cells, which the immune system treats as invading foreigners. The tactic may also be useful for organ transplants.
The most celebrated cellular therapies are chimeric antigen receptor (CAR) T-cell therapies. These CAR-T therapies are often used to successfully treat specific forms of lymphomas, a type of often-deadly cancer. But deploying them against solid tumors has proved much more difficult. Until recently, most CAR-T therapies have been made using the patient's own cells, but the long-term commercial viability of cellular therapies of all types will rely on "allogeneic" cells—mass-produced therapeutic cells grown from a source outside the patient. As with transplanted organs, the recipient's immune system is likely to treat any outsider cells, or tissues developed from them, as foreign and to reject them. This issue is likely to be a severe obstacle in any type of allogeneic cell transplantation.
Part1
Jan 4, 2023
Dr. Krishna Kumari Challa
Normally when an antibody binds to a cell, it acts as a sort of tag, calling out for an immune cell to bind to the antibody and set off an efficient process of destroying the tagged cell. To stop this chain reaction, researchers devised a method to catch the antibodies before they bind to cells, preventing activation of the immune response.
The researchers genetically engineered three types of cells—insulin-producing pancreatic islet cells, thyroid cells, and CAR-T cells—so that each type made and displayed large numbers of a protein called CD64 on their surfaces.
On these engineered cells, CD64, which tightly binds the antibodies responsible for this type of immune rejection, acted as a kind of decoy, capturing the antibodies and binding them to the engineered cell, so they wouldn't activate immune cells.
Indeed, this was the case when the approach was tested, providing clear proof-of-concept.
Tobias Deuse, Protection of cell therapeutics from antibody-mediated killing by CD64 overexpression, Nature Biotechnology (2023). DOI: 10.1038/s41587-022-01540-7. www.nature.com/articles/s41587-022-01540-7
Part2
Jan 4, 2023
Dr. Krishna Kumari Challa
Black Hole Tidal Disruption Event (Animation)
Jan 4, 2023
Dr. Krishna Kumari Challa
New type of entanglement lets scientists 'see' inside nuclei
Nuclear physicists have found a new way to use the Relativistic Heavy Ion Collider (RHIC) to see the shape and details inside atomic nuclei. The method relies on particles of light that surround gold ions as they speed around the collider and a new type of quantum entanglement that's never been seen before.
Through a series of quantum fluctuations, the particles of light (a.k.a. photons) interact with gluons—gluelike particles that hold quarks together within the protons and neutrons of nuclei. Those interactions produce an intermediate particle that quickly decays into two differently charged "pions" (π). By measuring the velocity and angles at which these π+ and π- particles strike RHIC's STAR detector, the scientists can backtrack to get crucial information about the photon—and use that to map out the arrangement of gluons within the nucleus with higher precision than ever before. This technique is similar to the way doctors use positron emission tomography (PET scans) to see what's happening inside the brain and other body parts. But in this case, physicists are talking about mapping out features on the scale of femtometers—quadrillionths of a meter—the size of an individual proton.
Even more amazing, the STAR physicists say, is the observation of an entirely new kind of quantum interference that makes their measurements possible.
Physicists measure two outgoing particles and clearly their charges are different—they are different particles—but they see interference patterns that indicate these particles are entangled, or in sync with one another, even though they are distinguishable particles! This is the first-ever experimental observation of entanglement between dissimilar particles.
That discovery may have applications well beyond the lofty goal of mapping out the building blocks of matter.
James Brandenburg, Tomography of ultra-relativistic nuclei with polarized photon-gluon collisions, Science Advances (2023). DOI: 10.1126/sciadv.abq3903. www.science.org/doi/10.1126/sciadv.abq3903
Jan 5, 2023
Dr. Krishna Kumari Challa
Scientists reveal the structure and function of a newly discovered CRISPR immune system
In new research papers, biochemists have described the structure and function of a newly discovered CRISPR immune system that—unlike better-known CRISPR systems, which deactivate foreign genes to protect cells—shuts down infected cells to thwart infection.
CRISPR, a manageable acronym for the mouthful "Clustered Regularly Interspaced Short Palindromic Repeats," has captured the imaginations of scientists and lay people alike, with its gene-editing potential. Study of CRISPR DNA sequences and CRISPR-associated (Cas) proteins, which are actually bacterial immune systems, is still a young field, although it's receiving widespread attention for its gene-editing applications.
Identified as a distinct immune system within the last five years, the Class 2, type V Cas12a2 is somewhat similar to the better-known CRISPR-Cas9, which binds to target DNA and cuts it—like molecular scissors—effectively shutting off a targeted gene. But CRISPR-Cas12a2 binds a different target than Cas9, and that binding has a very different effect.
The Cas12a2 protein undergoes major conformational changes upon binding to RNA, opening an indiscriminate active site for DNA destruction. "Cas12a2 destroys the DNA and RNA in target cells, causing them to go senescent."
Using cryo-electron microscopy or "cryo-EM," the researchers demonstrated this unique aspect of CRISPR-Cas12a2, including its RNA-triggered degradation of single-stranded RNA, single-stranded DNA and double-stranded DNA, resulting in a naturally occurring defensive strategy called abortive infection.
Abortive infection is a natural phage-resistance strategy used by bacteria and archaea to limit the spread of viruses and other pathogens. By shutting down the infected cell, it prevents viral components that have entered the cell from replicating and spreading.
The team captured the structure of Cas12a2 in the act of cutting double-stranded DNA.
Incredibly, Cas12a2 nucleases bend the usually straight piece of double-helical DNA 90 degrees, to force the backbone of the helix into the enzymatic active site, where it is cut. It's a change in structure that's extraordinary to observe.
Dmytrenko, Oleg, et al. "Cas12a2 Elicits Abortive Infection via RNA-triggered Destruction of dsDNA," Nature, 04 January 2023. DOI: 10.1038/s41586-022-05559-3 , www.nature.com/articles/s41586-022-05559-3
Bravo, Jack and Hallmark, Thomson, et al. "RNA Targeting Unleashes Indiscriminate Nuclease Activity of CRISPR-Cas12a2," Nature, 04 January 2023. DOI: 10.1038/s41586-022-05560-w , www.nature.com/articles/s41586-022-05560-w
Jan 5, 2023
Dr. Krishna Kumari Challa
Exercise curbs insulin production in fruit flies
Insulin is an essential hormone for humans and many other living creatures. Its best-known task is to regulate sugar metabolism. How it does this job is well understood. Much less is known about how the activity of insulin-producing cells and consequently the secretion of insulin is controlled.
Researchers have now presented new information on this question in the scientific journal Current Biology. They used the fruit fly Drosophila melanogaster as their study object. Interestingly, this fly also secretes insulin after a meal. However, in the fly, the hormone does not come from the pancreas as in humans, but is instead released by nerve cells in the brain.
They figured out that physical activity of the fly has a strong effect on its insulin-producing cells. For the first time, the researchers measured the activity of these cells electrophysiologically in walking and flying Drosophila.
The result: when Drosophila starts to walk or fly, its insulin-producing cells are immediately inhibited. When the fly stops moving, the activity of the cells rapidly increases again and shoots up above normal levels.
The scientists hypothesize that the low activity of insulin-producing cells during walking and flight contributes to the provision of sugars to meet the increased energy demand. They suspect that the increased activity after exercise helps to replenish the fly's energy stores, for example, in the muscles.
The team was also able to demonstrate that the fast, behaviour-dependent inhibition of insulin-producing cells is actively controlled by neural pathways. It is largely independent of changes in the sugar concentration in the fly's blood.
It makes a lot of sense for the organism to anticipate an increased energy demand in this way to prevent extreme fluctuations in blood sugar levels.
Part 1
Jan 5, 2023
Dr. Krishna Kumari Challa
Insulin has changed little throughout evolution
Do the results allow conclusions to be drawn about humans? Probably.
Although the release of insulin in fruit flies is mediated by different cells than in humans, the insulin molecule and its function have hardly changed in the course of evolution.
In the past 20 years, using Drosophila as a model organism, many fundamental questions have already been answered that could also contribute to a better understanding of metabolic defects in humans and associated diseases, such as diabetes or obesity.
Less insulin means longevity
One exciting point is that reduced insulin activity contributes to healthy aging and longevity. This has already been shown in flies, mice, humans and other species. The same applies to an active lifestyle. Our work shows a possible link explaining how physical activity could positively affect insulin regulation via neuronal signaling pathways.
Sander Liessem et al, Behavioral state-dependent modulation of insulin-producing cells in Drosophila, Current Biology (2022). DOI: 10.1016/j.cub.2022.12.005
Part 2
Jan 5, 2023
Dr. Krishna Kumari Challa
An Organism That Can Dine Exclusively on Viruses Has Been Found in a World First
A type of freshwater plankton has become the first organism seen thriving on a diet of viruses, according to a new study by researchers.
Viruses are often consumed incidentally by a wide range of organisms, and may even season the diets of certain marine protists. But to qualify as a true step in the food chain – described as virovory – viruses ought to contribute a significant amount of energy or nutrients to their consumer.
The microbe Halteria is a common genus of protist known to flit about as its hair-like cilia propel it through the water. Not only did laboratory samples of the ciliate consume chloroviruses added to its environment, the giant virus fueled Halteria's growth and increased its population size.
The knock-on effects of widespread consumption of chloroviruses in the wild could have a profound impact on the carbon cycle. Known to infect microscopic green algae, chloroviruses cause their hosts to burst apart, releasing carbon and other nutrients into the environment – a process that serious amounts of virus-eating could be limiting.
There's some good stuff inside viruses if you're an organism looking to feed, including amino acids, nucleic acids, lipids, nitrogen, and phosphorus. Surely something would want to make a meal out of that.
While another ciliate tested, Paramecium, snacked on the viruses, its size and numbers barely budged. Halteria, on the other hand, dined on them, using the chlorovirus as a source of nutrients. The ciliate's population grew about 15 times larger in two days, while the virus population dropped a hundredfold.
Fluorescent green dye was used to tag chlorovirus DNA before it was introduced to the two types of plankton. This confirmed that the viruses were being eaten: the vacuoles – microbial equivalent of stomachs – were glowing green from the feeding.
Further analysis revealed that the growth of Halteria in comparison to the decline of the chlorovirus matched the ratios seen in other microscopic predator vs prey relationships in aquatic environments, giving the team more evidence of what was happening.
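Treating the quoted numbers as endpoints of simple exponential growth and decay gives a feel for the rates involved; this is illustrative arithmetic only, not the growth-curve analysis used in the paper.

    import math

    # Halteria grew ~15-fold in two days while chlorovirus counts fell ~100-fold.
    days = 2
    halteria_rate = math.log(15) / days     # per-day growth rate
    virus_rate = math.log(1 / 100) / days   # per-day decline rate (negative)

    print(f"Halteria growth: {halteria_rate:.2f}/day "
          f"(doubling every {math.log(2) / halteria_rate:.2f} days)")
    print(f"Chlorovirus decline: {virus_rate:.2f}/day "
          f"(halving every {math.log(2) / abs(virus_rate):.2f} days)")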
https://www.pnas.org/doi/10.1073/pnas.2215000120
Jan 5, 2023
Dr. Krishna Kumari Challa
The brain's ability to perceive space expands like the universe
Young children sometimes believe that the moon is following them, or that they can reach out and touch it. It appears to be much closer than is proportional to its true distance. As we move about our daily lives, we tend to think that we navigate space in a linear way. But scientists have discovered that time spent exploring an environment causes neural representations to grow in surprising ways.
The findings, published in Nature Neuroscience on December 29, 2022, show that neurons in the hippocampus essential for spatial navigation, memory, and planning represent space in a manner that conforms to a nonlinear hyperbolic geometry—a three-dimensional expanse that grows outward exponentially. (In other words, it's shaped like the interior of an expanding hourglass.) The researchers also found that the size of that space grows with time spent in a place. And the size is increasing in a logarithmic fashion that matches the maximal possible increase in information being processed by the brain.
This discovery provides valuable methods for analyzing data on neurocognitive disorders involving learning and memory, such as Alzheimer's disease.
This new study demonstrates that the brain does not always act in a linear manner. Instead, neural networks function along an expanding curve, which can be analyzed and understood using hyperbolic geometry and information theory. It is exciting to see that neural responses in this area of the brain formed a map that expanded with experience based on the amount of time spent in a given place. The effect even held for minuscule deviations in time when the animal ran more slowly or faster through the environment.
In the current study, the scientists found that hyperbolic geometry guides neural responses as well. Hyperbolic maps of sensory molecules and events are perceived with hyperbolic neural maps. The space representations dynamically expanded in correlation with the amount of time the rat spent exploring each environment. And, when a rat moved more slowly through an environment, it gained more information about the space, which caused the neural representations to grow even more.
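A toy curve can illustrate the qualitative claim that the represented space grows roughly logarithmically with exploration time; the functional form, scale factor, and offset below are placeholders, not values fitted in the study.

    import math

    # Toy model: effective size of the neural spatial map grows ~ log(time explored).
    # Constants are arbitrary placeholders for illustration only.
    def map_radius(minutes_explored, scale=1.0):
        return scale * math.log(1 + minutes_explored)

    for t in (1, 5, 15, 30, 60):
        print(f"{t:3d} min explored -> relative map size {map_radius(t):.2f}")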
The findings provide a novel perspective on how neural representations can be altered with experience. The geometric principles identified in our study can also guide future endeavors in understanding neural activity in various brain systems.
You would think that hyperbolic geometry only applies on a cosmic scale, but that is not true. Our brains work much slower than the speed of light, which could be a reason that hyperbolic effects are observed on graspable spaces instead of astronomical ones.
Huanqiu Zhang et al, Hippocampal spatial representations exhibit a hyperbolic geometry that expands with experience, Nature Neuroscience (2022). DOI: 10.1038/s41593-022-01212-4
Jan 6, 2023
Dr. Krishna Kumari Challa
New approach successfully traces genomic variants back to genetic disorders
Researchers have published an assessment of 13 studies that took a genotype-first approach to patient care. This approach contrasts with the typical phenotype-first approach to clinical research, which starts with clinical findings. A genotype-first approach involves selecting patients with specific genomic variants and then studying their traits and symptoms; this approach uncovered new relationships between genes and clinical conditions, broadened the traits and symptoms associated with known disorders, and offered insights into newly described disorders. The study was published in the American Journal of Human Genetics.
Genotype-first research can work, especially for identifying people with rare disorders who otherwise might not have been brought to clinical attention.
Typically, to treat genetic conditions, researchers first identify patients who are experiencing symptoms, then they look for variants in the patients' genomes that might explain those findings. However, this can lead to bias because the researchers are studying clinical findings based on their understanding of the disorder. The phenotype-first approach limits researchers from understanding the full spectrum of symptoms of the disorders and the associated genomic variants.
Genomics has the potential to change reactive medicine into preventative medicine. Studying how a genotype-first approach to research performs can help us learn how to model predictive and precision medicine in the future.
Part 1
Jan 6, 2023
Dr. Krishna Kumari Challa
The study documents three types of discoveries from a genotype-first approach.
First, the researchers found that this approach helped discover new relationships between genomic variants and specific clinical traits. For example, one NIH study found that having more than two copies of the TPSAB1 gene was associated with symptoms related to the gastrointestinal tract, connective tissues, and the nervous system.
Second, this approach helped researchers find novel symptoms related to a disorder that clinicians previously missed because the patient did not have the typical symptoms. NHGRI researchers identified a person with a genomic variant associated with a known metabolic disorder. Further testing found that the individual had high levels of certain chemicals in their body associated with the disorder, despite having only minor symptoms.
Jan 6, 2023
Dr. Krishna Kumari Challa
Study reveals average age at conception for men versus women over past 250,000 years
The length of a specific generation can tell us a lot about the biology and social organization of humans. Now, researchers can determine the average age that women and men had children throughout human evolutionary history with a new method they developed using DNA mutations.
The researchers said this work can help us understand the environmental challenges experienced by our ancestors and may also help us in predicting the effects of future environmental change on human societies.
Through this research on modern humans, researchers noticed that they could predict the age at which people had children from the types of DNA mutations they left to their children. They then applied this model to our human ancestors to determine at what age our ancestors procreated.
According to the study, published recently in Science Advances, the average age that humans had children throughout the past 250,000 years is 26.9. Furthermore, fathers were consistently older, at 30.7 years on average, than mothers, at 23.2 years on average, but the age gap has shrunk in the past 5,000 years, with the study's most recent estimates of maternal age averaging 26.4 years. The shrinking gap seems to largely be due to mothers having children at older ages. Other than the recent uptick in maternal age at childbirth, the researchers found that parental age has not increased steadily from the past and may have dipped around 10,000 years ago because of population growth coinciding with the rise of civilization.

These mutations from the past accumulate with every generation and exist in humans today. Researchers can now identify these mutations, see how they differ between male and female parents, and how they change as a function of parental age. Children's DNA inherited from their parents contains roughly 25 to 75 new mutations, which allows scientists to compare the parents and offspring, and then to classify the kind of mutation that occurred. When looking at mutations in thousands of children, the researchers noticed a pattern: the kinds of mutations that children get depend on the ages of the mother and the father.

Previous genetic approaches to determining historical generation times relied on the compounding effects of either recombination or mutation of modern human DNA sequence divergence from ancient samples. But the results were averaged across both males and females and across the past 40,000 to 45,000 years.
Part1
Jan 7, 2023
Dr. Krishna Kumari Challa
The researchers built a model that uses de novo mutations—genetic alterations that are present for the first time in one family member as a result of a variant or mutation in a germ cell of one of the parents, or that arise in the fertilized egg during early embryogenesis—to separately estimate the male and female generation times at many different points throughout the past 250,000 years.
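The inversion logic can be caricatured with a toy linear model: counts of paternally and maternally inherited de novo mutations rise roughly linearly with parental age, so observed counts can be turned back into age estimates. The slopes and intercepts below are rough ballpark values for illustration only, not the mutation-spectrum model actually fitted in the study.

    # Toy inversion: estimate parental ages from counts of phased de novo mutations,
    # assuming counts rise roughly linearly with age. Coefficients are illustrative
    # placeholders, NOT the study's calibrated model.
    def estimate_parental_ages(paternal_dnm, maternal_dnm):
        father_age = (paternal_dnm - 5) / 1.5   # assumed ~1.5 extra mutations per paternal year
        mother_age = (maternal_dnm - 2) / 0.4   # assumed ~0.4 extra mutations per maternal year
        return father_age, mother_age

    dad, mom = estimate_parental_ages(paternal_dnm=51, maternal_dnm=12)
    print(f"Estimated father's age ~{dad:.0f}, mother's age ~{mom:.0f}")  # ~31 and ~25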
The story of human history is pieced together from a diverse set of sources: written records, archaeological findings, fossils, etc. Our genomes, the DNA found in every one of our cells, offer a kind of manuscript of human evolutionary history. The findings from our genetic analysis confirm some things we knew from other sources (such as the recent rise in parental age), but also offer a richer understanding of the demography of ancient humans. These findings contribute to a better understanding of our shared history.
Richard Wang, Human generation times across the past 250,000 years, Science Advances (2023). DOI: 10.1126/sciadv.abm7047. www.science.org/doi/10.1126/sciadv.abm7047
Part 2
Jan 7, 2023
Dr. Krishna Kumari Challa
Scars mended using transplanted hair follicles in new study
In a new study involving three volunteers, skin scars began to behave more like uninjured skin after they were treated with hair follicle transplants. The scarred skin harbored new cells and blood vessels, remodeled collagen to restore healthy patterns, and even expressed genes found in healthy unscarred skin.
The findings could lead to better treatments for scarring both on the skin and inside the body, leading to hope for patients with extensive scarring, which can impair organ function and cause disability.
After scarring, the skin never truly regains its pre-wound functions, and until now all efforts to remodel scars have yielded poor results. The new findings lay the foundation for exciting new therapies that can rejuvenate even mature scars and restore the function of healthy skin.
Scar tissue in the skin lacks hair, sweat glands, blood vessels and nerves, which are vital for regulating body temperature and detecting pain and other sensations. Scarring can also impair movement as well as potentially cause discomfort and emotional distress.
Compared to scar tissue, healthy skin undergoes constant remodeling by the hair follicle. Hairy skin heals faster and scars less than non-hairy skin—and hair transplants had previously been shown to aid wound healing. Inspired by this, the researchers hypothesized that transplanting growing hair follicles into scar tissue might cause scars to remodel themselves.

To test their hypothesis, researchers transplanted hair follicles into the mature scars on the scalps of three participants in 2017. The researchers selected the most common type of scar, called normotrophic scars, which usually form after surgery. They took and microscope-imaged 3 mm-thick biopsies of the scars just before transplantation, and then again at two, four, and six months afterwards. The researchers found that the follicles inspired profound architectural and genetic shifts in the scars towards a profile of healthy, uninjured skin.
Part1
Jan 7, 2023
Dr. Krishna Kumari Challa
After transplantation, the follicles continued to produce hair and induced restoration across skin layers. Scarring causes the outermost layer of skin—the epidermis—to thin out, leaving it vulnerable to tears. At six months post-transplant, the epidermis had doubled in thickness alongside increased cell growth, bringing it to around the same thickness as uninjured skin.

The next skin layer down, the dermis, is populated with connective tissue, blood vessels, sweat glands, nerves, and hair follicles. Scar maturation leaves the dermis with fewer cells and blood vessels, but after transplantation the number of cells had doubled at six months, and the number of vessels had reached nearly healthy-skin levels by four months. This demonstrated that the follicles inspired the growth of new cells and blood vessels in the scars, which are unable to do this unaided.

Scarring also increases the density of collagen fibers—a major structural protein in skin—which causes them to align such that scar tissue is stiffer than healthy tissue. The hair transplants reduced the density of the fibers, which allowed them to form a healthier "basket weave" pattern, reducing stiffness—a key factor in tears and discomfort. The authors also found that after transplantation, the scars expressed 719 genes differently than before. Genes that promoted cell and blood vessel growth were expressed more, while genes that promoted scar-forming processes were expressed less.
Anagen hair follicles transplanted into mature human scars remodel fibrotic tissue, npj Regenerative Medicine (2023). DOI: 10.1038/s41536-022-00270-3 , www.nature.com/articles/s41536-022-00270-3
Part 2
Jan 7, 2023
Dr. Krishna Kumari Challa
How liver cancer hijacks circadian clock machinery inside cells
The most common type of liver cancer, hepatocellular carcinoma (HCC), is already the third leading cause of cancer-related deaths globally—and cases are on the rise worldwide. While chemotherapy, surgery and liver transplants can help some patients, targeted treatments for HCC could save millions more lives.
Recent studies have offered clues about one potential target: the circadian clock proteins inside cells, which help coordinate changes in the body's functioning over the course of a day. But most of this research only hints at an indirect link between circadian clock function and HCC, for instance the observation that cells collected from patients with liver cancer have disrupted circadian rhythms. Now, a study by researchers not only directly links circadian clock proteins to liver cancer, but also shows precisely how cancer cells hijack circadian clock machinery to divide and spread. The research, just published in the journal Proceedings of the National Academy of Sciences, also found that inhibiting key clock proteins can prevent cancer cells from multiplying.
By targeting the circadian clock, we are not only targeting tumour cells but also the area around the tumor, which can help increase the efficacy of other targeted treatments.
The researchers showed that two key clock proteins, known as CLOCK and BMAL1, are critical for the replication of liver cancer cells in cell culture. When CLOCK and BMAL1 are suppressed, cancer cells' replication process was interrupted—ultimately causing cell death, or apoptosis. Triggering apoptosis, during which a cell stops dividing, then self-destructs, is the goal of many modern cancer treatments.
Among other findings, they showed that eliminating the clock proteins reduced levels of the enzyme Wee1 and increased levels of the enzyme inhibitor P21. That's exactly what you want, because when it comes to cancer cell proliferation, P21 is a brake and Wee1 is a gas pedal.
Finally, the researchers tested their findings in vivo. Mice injected with unmodified human liver cancer cells grew large tumors, but those injected with cells modified to suppress CLOCK and BMAL1 showed little to no tumour growth.
Meng Qu et al, Circadian regulator BMAL1::CLOCK promotes cell proliferation in hepatocellular carcinoma by controlling apoptosis and cell cycle, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2214829120
Jan 8, 2023
Dr. Krishna Kumari Challa
The oven won't talk to the fridge: 'smart' homes struggle
Making the devices work with each other was the easier part.
The hard part is the app model, the data model, the sharing of this, because the human nature of companies is to be very selfish about this.
Each brand is now trying to convince the public to adopt its app to centralize control of household appliances.
And if things don't match up, your fridge doesn't talk with your stove, your vacuum cleaner with your recliner and .... you will have a very messy home.
That is 'smartness' for you!
Source: AFP
Jan 8, 2023
Dr. Krishna Kumari Challa
Cells that cooperate live longer
New research: When cells exchange metabolic products with other cells, they live longer. The fact that these exchanges directly impact cell lifespans could play a significant role in future research into human aging processes and age-related diseases. The study appears in the latest issue of Cell.
Metabolism is inextricably linked to aging. While it helps maintain vital processes, makes us grow, and triggers cellular repairs, it also produces substances that damage our cells and cause us to age. The metabolic processes that occur within cells are highly complex. The exchange of substances between cells in a community is one important factor, because it has a substantial impact on the metabolism occurring inside a cell.
Cells are in constant contact with neighboring cells—within tissues, for instance. They release some substances and consume others from their surrounding environment. In a recent study researchers investigated whether the exchange of metabolic products (known as metabolites) affects the lifespan of cells.
The researchers used yeast cells and performed experiments to establish their lifespan. Yeast cells are a key model in basic research, a dominant microorganism in biotechnology, and important in medicine because they can cause fungal infections. They found that the cells lived around 25% longer when they exchanged more metabolites with each other. So the researchers naturally wanted to identify the substances and exchange processes behind this life-prolonging effect.
Part1
Jan 8, 2023
Dr. Krishna Kumari Challa
To do so, the researchers employed a special analytical system supported by mass spectrometry that allowed them to precisely track the exchange of metabolites between cells. They found that young cells, which were still able to divide well and often, released amino acids that were consumed by older cells.
Amino acids are the building blocks that make up proteins. The research team discovered that the exchange of the amino acid methionine extended the lives of the cells involved. Methionine occurs in all organisms and plays a key role in protein synthesis, as well as many other cellular processes. Interestingly, it was the young cells' metabolism that prolonged the lives of the old cells.
The cells within the community that consumed methionine released glycerol. In turn, the presence of glycerol affected the methionine-producing cells, causing them to live longer. Glycerol is needed for building cell membranes and plays a part in protecting cells. It's a win-win situation. As cells engage in this collaborative exchange, they prolong the lifespan of their community as a whole.
This study of yeast cell communities is the first to show that metabolite exchange directly impacts the lifespan and aging process of the cells. The researchers suspect this also applies to other types of cells, such as those in the human body, and are aiming to investigate this in further studies.
Clara Correia-Melo et al, Cell-cell metabolite exchange creates a pro-survival metabolic environment that extends lifespan, Cell (2023). DOI: 10.1016/j.cell.2022.12.007
Part 2
Jan 8, 2023
Dr. Krishna Kumari Challa
Rate of scientific breakthroughs slowing over time: Study
The rate of ground-breaking scientific discoveries and technological innovation is slowing down despite an ever-growing amount of knowledge, according to an analysis released recently of millions of research papers and patents.
While previous research has shown downturns in individual disciplines, the study is the first that 'emphatically, convincingly documents this decline of disruptiveness across all major fields of science and technology'.
The researchers gave a "disruptiveness score" to 45 million scientific papers dating from 1945 to 2010, and to 3.9 million US-based patents from 1976 to 2010.
From the start of those time ranges, research papers and patents have been increasingly likely to consolidate or build upon previous knowledge, according to results published in the journal Nature.
The ranking was based on how the papers were cited in other studies five years after publication, assuming that the more disruptive the research was, the less its predecessors would be cited.
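In spirit, the score works like the published consolidation-disruption ("CD") index: later papers that cite the focal work without citing its references signal disruption, while papers that cite the focal work alongside its references signal consolidation. A simplified sketch follows; the windowing and weighting details of the actual study may differ.

    # Simplified CD-index-style disruptiveness score for a focal paper.
    #   citing_focal_only: later papers citing the focal paper but none of its references
    #   citing_both:       later papers citing the focal paper and at least one reference
    #   citing_refs_only:  later papers citing only the references
    def cd_score(citing_focal_only, citing_both, citing_refs_only):
        total = citing_focal_only + citing_both + citing_refs_only
        if total == 0:
            return 0.0
        return (citing_focal_only - citing_both) / total

    print(cd_score(40, 5, 5))   #  0.7 -> disruptive
    print(cd_score(5, 40, 5))   # -0.7 -> consolidating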
The biggest decrease in disruptive research came in physical sciences such as physics and chemistry.
"The nature of research is shifting" as incremental innovations become more common.
One theory for the decline is that all the "low-hanging fruit" of science has already been plucked.
If that were the case, disruptiveness in various scientific fields would have fallen at different speeds. Instead, the declines are fairly consistent in their speed and timing across all major fields, indicating that the low-hanging-fruit theory is unlikely to be the culprit.
Instead, the researchers pointed to what has been dubbed "the burden of research," which suggests there is now so much that scientists must learn to master a particular field that they have little time left to push boundaries.
This causes scientists and inventors to focus on a narrow slice of the existing knowledge, leading them to just come up with something more consolidating rather than disruptive.
Another reason could be that "there's increasing pressure in academia to publish, publish, publish, because that's the metric that academics are assessed on."
The researchers called on universities and funding agencies to focus more on quality, rather than quantity, and consider full subsidies for year-long sabbaticals to allow academics to read and think more deeply.
This work showed that "ultra-specialization" and the pressure to publish had increased over the years.
Researchers blamed a global trend of academics being "forced to slice up their papers" to increase their number of publications, saying it had led to "a dulling of research."
Michael Park et al, Papers and patents are becoming less disruptive over time, Nature (2023). DOI: 10.1038/s41586-022-05543-x
Jan 9, 2023
Dr. Krishna Kumari Challa
Human-approved medication brings back 'lost' memories in mice
Students sometimes pull an all-nighter to prepare for an exam. However, research has shown that sleep deprivation is bad for your memory. Now, neuroscientists have discovered that what you learn while being sleep deprived is not necessarily lost, it is just difficult to recall.
These neuro-scientists have found a way to make this "hidden knowledge" accessible again days after studying while sleep-deprived using optogenetic approaches, and the human-approved asthma drug roflumilast. These findings were published in the journal Current Biology.
Using genetic techniques, the scientists caused a light-sensitive protein (channelrhodopsin) to be produced selectively in neurons that are activated during a learning experience. This made it possible to recall a specific experience by shining light on these cells.
In the experiment, the genetically engineered mice were given a spatial learning task in which they had to learn the location of individual objects, a process that heavily relies on neurons in the hippocampus. The mice then had to perform this same task days later, but this time with one object moved to a novel location. The mice that were deprived of sleep for a few hours before the first session failed to detect this spatial change, which suggests that they cannot recall the original object locations.
However, when the researchers reintroduced them to the task after reactivating the hippocampal neurons that initially stored this information with light, they did successfully remember the original locations. This shows that the information was stored in the hippocampus during sleep deprivation, but couldn't be retrieved without the stimulation.
The molecular pathway set off during the reactivation is also targeted by the drug roflumilast, which is used by patients with asthma or COPD. When the scientists gave roflumilast to mice that had been trained while sleep-deprived, just before the second test, the mice remembered the original locations, exactly as happened with direct stimulation of the neurons. As roflumilast is already clinically approved for use in humans, and is known to enter the brain, these findings open up avenues to test whether it can be applied to restore access to 'lost' memories in humans.
The discovery that more information is present in the brain than we previously anticipated, and that these "hidden" memories can be made accessible again—at least in mice—opens up all kinds of exciting possibilities.
Youri G. Bolsius et al, Recovering object-location memories after sleep deprivation-induced amnesia, Current Biology (2022). DOI: 10.1016/j.cub.2022.12.006
Jan 10, 2023
Dr. Krishna Kumari Challa
Method reviews could stop useless science
“I’ve lost count of the number of times that a board member has remarked that the way a study has been designed means it won’t yield any informative data,” says experimental psychologist and ethical-review-board chair Daniël Lakens. To counter this trend, his university has introduced a methodological review board that highlights flaws before data collection begins—such as sample sizes that are too small to test a hypothesis.
Jan 10, 2023
Deepak Menon
So, Dr Krishna - where does this lead scientists to? Does it indicate a progressive movement towards applied Physics?
Jan 10, 2023
Dr. Krishna Kumari Challa
Researchers uncover new cell types involved in osteoarthritis
A new medical study has identified a potential target for treating osteoarthritis—a debilitating joint disease that affects millions and is a leading cause of disability worldwide.
A team of researchers has uncovered previously unknown cell types in the joint that emerge after an injury and drive the onset of osteoarthritis.
Clinically, osteoarthritis presents as a very complex disease, with patients suffering from joint stiffness, reduced mobility and function, and most notably, persistent pain.
Osteoarthritis patients commonly live with this condition for multiple decades, and no treatments have been developed that can stop or reverse the disease. The condition can occur with age or be sparked by a joint injury and is typically managed with pain relief and end-stage joint replacement.
The study, titled "Synovial fibroblasts assume distinct functional identities and secrete R-spondin 2 in osteoarthritis" and published in the Annals of the Rheumatic Diseases, examined the cellular and molecular events during the onset of post-traumatic osteoarthritis in joints.
Researchers identified cell types that emerge in the joint after trauma, such as an ACL injury, and they can now associate these cells with the disease process. This allows researchers to view them as a treatment target for this devastating disease.
By employing a cutting-edge gene sequencing technology called single-cell RNA-sequencing, the researchers were able to uncover these previously uncharacterized cells that emerge in the joint after injury.
The study also described the biological processes that may activate these cells, which offers compelling new targets for an effective treatment.
Interestingly, these cells are not found in healthy joints, and the researchers now need to understand exactly what causes them to appear and how they may cause osteoarthritis.
Alexander J Knights et al, Synovial fibroblasts assume distinct functional identities and secrete R-spondin 2 in osteoarthritis, Annals of the Rheumatic Diseases (2022). DOI: 10.1136/ard-2022-222773
Jan 11, 2023
Dr. Krishna Kumari Challa
Electrons take new shape inside unconventional metal
One of the biggest achievements of quantum physics was recasting our vision of the atom. Out was the early 1900s model of a solar system in miniature, in which electrons looped around a solid nucleus. Instead, quantum physics showed that electrons live a far more interesting life, meandering around the nucleus in clouds that look like tiny balloons. These balloons are known as atomic orbitals, and they come in all sorts of different shapes—perfectly round, two-lobed, clover-leaf-shaped. The number of lobes in the balloon signifies how much the electron spins about the nucleus.
That's all well and good for individual atoms, but when atoms come together to form something solid—like a chunk of metal, say—the outermost electrons in the atoms can link arms and lose sight of the nucleus from where they came, forming many oversized balloons that span the whole chunk of metal. They stop spinning about their nuclei and flow through the metal to carry electrical currents, shedding the diversity of multi-lobed balloons.
Now researchers have produced the first experimental evidence that one metal—and likely others in its class—has electrons that manage to preserve a more interesting, multi-lobed structure as they move around in a solid. The team experimentally studied the shape of these balloons and found not a uniform surface, but a complex structure. This unusual metal is not only fundamentally interesting, but it could also prove useful for building quantum computers that are resistant to noise.
Part 1
Jan 11, 2023
Dr. Krishna Kumari Challa
Peering inside a metal, one sees atoms arranged in neat repeating grids, called a crystal lattice. In a crystal, the atomic orbitals of the outermost electrons morph into one another. This allows the electrons to travel far from their original nucleus and carry current through the metal. In this solid setting, a version of orbital balloons still exists, but it's more common to visualize them not in space—where there are many huge and unwieldy orbitals—but as a function of the speed and direction of the traveling electrons. The fastest moving electrons in the crystal form their own balloon, a collective analog of atomic orbitals known as a Fermi surface.
The shape of the Fermi surface reflects the structure of the underlying crystal, which usually bears no resemblance to the orbital structure of single atoms. But for materials with very few mobile electrons, like the YPtBi (yttrium-platinum-bismuth) crystal studied here, the Fermi surface is not very big. Because of this, it retains some of the properties of electrons that hardly move at all, which sit at the center of the Fermi surface.
The fact that nature figures out counterintuitive atomic arrangements that allow the Fermi surface to retain signatures of the atomic orbitals is rather cool and intricate.
To uncover this cool, counterintuitive Fermi surface, the researchers stuck a YPtBi crystal inside a magnetic field and measured the current flowing through the crystal as they tuned the field. By rotating the direction of the magnetic field, they were able to map out the speed of the fastest electrons in every direction. They found that akin to a higher angular momentum atomic orbital, the Fermi surface has a complex shape to it, with peaks and troughs along certain directions. The high symmetry of the crystal itself would normally lead to a more uniform, ball-like Fermi surface, so it was a surprise to find a more complicated structure. This pointed to the possibility that the collective electrons were exhibiting some of the higher angular momentum nature of atomic orbitals.
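In quantum-oscillation experiments like this, each measured oscillation frequency corresponds to an extremal cross-sectional area of the Fermi surface through the Onsager relation, A = (2*pi*e/hbar)*F, so sweeping the field angle traces out the surface's shape. A minimal sketch of that conversion follows; the angle-frequency values are placeholders, not the YPtBi data.

    import math

    E_CHARGE = 1.602176634e-19   # elementary charge, C
    HBAR = 1.054571817e-34       # reduced Planck constant, J*s

    def extremal_area(freq_tesla):
        """Onsager relation: extremal Fermi-surface cross-section in m^-2."""
        return 2 * math.pi * E_CHARGE * freq_tesla / HBAR

    # Hypothetical angle sweep: field angle (deg) -> oscillation frequency (T).
    sweep = {0: 45.0, 30: 52.0, 60: 61.0, 90: 48.0}
    for angle, freq in sweep.items():
        area = extremal_area(freq) * 1e-20   # convert m^-2 to angstrom^-2
        print(f"{angle:3d} deg: F = {freq:4.1f} T -> area ~ {area:.4f} / angstrom^2")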
Indeed, theoretical calculations by the team showed that the experimental results matched up with a high angular momentum model, leading the team to claim the first experimental observation of a high-angular momentum metal. The team cautions that even this experimental evidence could still be incomplete. What they measured depends not only on the Fermi surface but also on other properties of the electrons, such as their effective mass and the distribution of their velocities. In their work, the team systematically studied the angular dependence of these other quantities and demonstrated that it would be extremely unlikely for them to cause the observed peaks and troughs.
In addition to being fundamentally novel, this higher angular momentum metal has potential applications for quantum computing.
Hyunsoo Kim et al, Quantum oscillations of the j=3/2 Fermi surface in the topological semimetal YPtBi, Physical Review Research (2022). DOI: 10.1103/PhysRevResearch.4.033169
Part2
Jan 11, 2023
Dr. Krishna Kumari Challa
Are we breathing airborne microplastics? Study finds higher concentrations indoors
People are likely exposed to thousands of airborne microplastics a year indoors, a new study has found.
Published in Environmental Science & Technology, the study investigated the abundance, distribution, form, and possible sources of microplastics (MPs) in indoor and outdoor sites in Sri Lanka, finding concentrations between 1 and 28 times higher indoors.
With people spending approximately 90% of their time indoors and based on the indoor and outdoor MPs levels identified in this study, the researchers calculated the average human exposure as 2,675 airborne microplastic particles per person every year.
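That exposure figure follows from simple bookkeeping: airborne concentration times breathing volume times time spent in each setting. Here is a sketch of that arithmetic; the concentration values and breathing rate are placeholder assumptions, not numbers reported in the paper.

    # Back-of-the-envelope annual inhalation exposure:
    # particles/m^3 * m^3 breathed per day * fraction of the day in that setting * 365 days.
    BREATHING_RATE_M3_PER_DAY = 16   # rough typical adult ventilation (assumed)

    def annual_exposure(indoor_conc, outdoor_conc, indoor_fraction=0.9):
        indoor = indoor_conc * BREATHING_RATE_M3_PER_DAY * indoor_fraction * 365
        outdoor = outdoor_conc * BREATHING_RATE_M3_PER_DAY * (1 - indoor_fraction) * 365
        return indoor + outdoor

    # Hypothetical 0.5 particles/m^3 indoors vs 0.05 outdoors:
    print(f"~{annual_exposure(0.5, 0.05):,.0f} airborne microplastic particles per year")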
Researchers collected air samples in different urban, rural, coastal, inland, industrial, and natural habitats with varying population densities.
The indoor MP levels, made up of fibers and occasional fragments mostly from textiles and clothing, were significantly higher than outdoor levels by a factor of 1 to 28, regardless of the type of outdoor environment. Transparent, blue, and black fibers in the size range of 0.10 to 0.50 millimeters were the dominant airborne microplastics (AMPs) across all sites.
These initial results show that the amount of indoor airborne microplastics is more related to indoor sources and the occupants' lifestyle than the outdoor environment.
In the outdoor samples, the amount of AMPs was always greater in high-density sites compared to the low-density areas, suggesting the abundance and distribution of AMPs was related to population density, level of industrialization, and human activity.
The dominant type of microplastics in both indoor and outdoor sites was PET fibers (polyethylene terephthalate), primarily originating from clothing and textiles.
This study is an important first step that shows the abundance of AMPs in a lower-middle income country in South Asia.
Kushani Perera et al, Airborne Microplastics in Indoor and Outdoor Environments of a Developing Country in South Asia: Abundance, Distribution, Morphology, and Possible Sources, Environmental Science & Technology (2022). DOI: 10.1021/acs.est.2c05885
Jan 11, 2023
Dr. Krishna Kumari Challa
Introduction to Marine Energy
Want to learn about different types of marine energy resources and technologies, their associated challenges and considerations, and how researchers are addressing these challenges? Watch this video ....
Jan 11, 2023
Dr. Krishna Kumari Challa
Study clarifies mystery of crocodilian hemoglobin
A crocodile's ability to stay underwater for a long time is well known. When the unseen apex predator lashes out from the water to seize its prey, an impala, its infamous teeth latch onto a hindquarter, jaws clenching with 5,000 pounds of force. Yet it's the water itself that does the killing, with the deep-breathed reptile dragging its prey to the deep end to drown.
The success of the croc's ambush lies in the nanoscopic scuba tanks—hemoglobins—that course through its bloodstream, unloading oxygen from lungs to tissues at a slow but steady clip that allows it to go hours without air. The hyper-efficiency of that specialized hemoglobin has led some biologists to wonder why, of all the jawed vertebrates in all the world, crocodilians were the lone group to hit on such an optimal solution to making the most of a breath.
By statistically reconstructing and experimentally resurrecting the hemoglobin of an archosaur, the 240-million-year-old ancestor of all crocodilians and birds, researchers have gleaned new insights into that question. Rather than requiring just a few key mutations, as earlier research suggested, the unique properties of crocodilian hemoglobin stemmed from 21 interconnected mutations that litter the intricate component of red blood cells.
That complexity, and the multiple knock-on effects that any one mutation can induce in hemoglobin, may have forged an evolutionary path so labyrinthine that nature failed to retrace it even over tens of millions of years, the researchers say.
All hemoglobin binds with oxygen in the lungs before swimming the bloodstream and eventually releasing that oxygen to the tissues that depend on it. In most vertebrates, hemoglobin's affinity for capturing and holding oxygen is dictated largely by molecules known as organic phosphates, which, by attaching themselves to the hemoglobin, can coax it into releasing its precious cargo.
But in crocodilians—crocodiles, alligators and their kin—the role of organic phosphates was supplanted by a molecule, bicarbonate, that is produced from the breakdown of carbon dioxide. Because hardworking tissues produce lots of carbon dioxide, they also indirectly generate lots of bicarbonate, which in turn encourages hemoglobin to dispense its oxygen to the tissues most in need of it.
It's a super-efficient system that provides a kind of slow-release mechanism that allows crocodilians to efficiently exploit their onboard oxygen stores. It's part of the reason they're able to stay underwater for so long.
Chandrasekhar Natarajan et al, Evolution and molecular basis of a novel allosteric property of crocodilian hemoglobin, Current Biology (2022). DOI: 10.1016/j.cub.2022.11.049
Jan 13, 2023
Dr. Krishna Kumari Challa
Washing fabrics by hand reduces microplastic release compared with machine washing
From tiny plankton to massive whales, microplastics have been found throughout the ocean food chain. One major source of this pollution is fibers shed while laundering synthetic fabrics. Although many studies show microfibers are released during machine washing, it's been less clear how hand washing contributes. Now, researchers reporting in ACS Environmental Science & Technology Water report that hand washing can drastically cut the amount of fibers shed compared with using a machine.
When clothing made from plastic fibres, such as polyester and nylon, is laundered, the fabric sheds microscopic fibers that eventually end up in wastewater and the environment. Though researchers have investigated the amount and types of microplastic fibers shed while laundering clothing, most studies have focused on washing machines. In many countries, however, it is still common to manually launder clothing. So researchers set out to systematically compare both types of washing and their effect on the environment.
The team cleaned two types of fabric swatches made from 100% polyester and a 95% polyester–5% spandex blend, using both hand-washing methods and a washing machine. The researchers found that hand washing released far fewer microplastic fibers than machine washing.
The researchers say that these results will help clarify the sources of microplastic pollution in the environment and can provide guidance for "greener" laundering methods.
Chunhui Wang et al, Microplastic Fiber Release by Laundry: A Comparative Study of Hand-Washing and Machine-Washing, ACS ES&T Water (2023). DOI: 10.1021/acsestwater.2c00462
Jan 13, 2023
Dr. Krishna Kumari Challa
Protein that counteracts key rattlesnake venom toxins identified
Venomous snakes cause an estimated 120,000 deaths and 400,000 disabling injuries worldwide each year.
To reduce and mitigate the severity of venomous snake bites, a team of biologists launched an investigation into the genome of the western diamondback rattlesnake (Crotalus atrox), a species with more venom toxins encoded in its genome than any other known rattlesnake. The team pinpointed a single protein—called FETUA-3—that inhibits a broad spectrum of rattlesnake venom toxins.
FETUA-3 inhibited a huge number of toxins—more than 20 of those detected—and even bound to and inhibited toxins in the venoms of several other rattlesnakes the researchers tested. There is a need to learn more about how broadly FETUA-3 can be applied, or whether it will need some additional tinkering, but knowing that this one protein can neutralize an entire class of toxins brings researchers even closer to creating a better anti-venom.
Published in the Proceedings of the National Academy of Sciences, the team's findings have notable implications for the development of improved snake bite treatments.
Fiona P. Ukken et al, A novel broad spectrum venom metalloproteinase autoinhibitor in the rattlesnake Crotalus atrox evolved via a shift in paralog function, Proceedings of the National Academy of Sciences (2022). DOI: 10.1073/pnas.2214880119
Jan 14, 2023
Dr. Krishna Kumari Challa
Why chocolate feels so good
Lubrication science gives mechanistic insights into how food actually feels in the mouth. You can use that knowledge to design food with better taste, texture or health benefits.
Scientists have decoded the physical process that takes place in the mouth when a piece of chocolate is eaten, as it changes from a solid into a smooth emulsion that many people find totally irresistible.
During the moments it is in the mouth, the chocolate sensation arises from the way the chocolate is lubricated, either from ingredients in the chocolate itself or from saliva or a combination of the two.
Fat plays a key function almost immediately when a piece of chocolate is in contact with the tongue. After that, solid cocoa particles are released and they become important in terms of the tactile sensation, so fat deeper inside the chocolate plays a rather limited role and could be reduced without having an impact on the feel or sensation of chocolate.
The researchers used analytical techniques from a field of engineering called tribology to conduct the study, which included in situ imaging.
Tribology is about how surfaces and fluids interact, the levels of friction between them and the role of lubrication: in this case, saliva or liquids from the chocolate. Those mechanisms are all happening in the mouth when chocolate is eaten.
When chocolate is in contact with the tongue, it releases a fatty film that coats the tongue and other surfaces in the mouth. It is this fatty film that makes the chocolate feel smooth throughout the entire time it is in the mouth.
Siavash Soltanahmadi et al, Insights into the multiscale lubrication mechanism of edible phase change materials, ACS Applied Materials & Interfaces (2023). pubs.acs.org/doi/10.1021/acsami.2c13017
Jan 14, 2023