Science communication series - part 15
Scientists take real risks when they come out in public about their work. They often have to contend with incomplete data and unproven conclusions while connecting things and building models. We hear complaints about climate science, GM crops, Ebola virus clinical trials, weather forecasts and several other things. Climate scientists are undoubtedly facing such legal challenges (3). Earthquake prediction is another area where scientists have to stick their necks out, because the field is still in a developing phase. And if seismologists get the communication of their predictions wrong, and people die because of it ... then they can be prosecuted and sent to jail. Prosecuted?
Yes, that is what happened after 2009, when an earthquake devastated the Italian city of L'Aquila and killed more than 300 people. Seven Italian earthquake experts faced criminal charges for failing to predict the earthquake — even though pinpointing the time, location and strength of a future earthquake in the short term remains, by scientific consensus, technically impossible!
The scientists had ended up in court as a consequence of botched communication in a highly stressed environment. In the months before the major earthquake struck, the region around L’Aquila had been subject to frequent, mostly low-magnitude tremors, known as seismic swarms. Residents were confused and increasingly alarmed by public statements made by a local amateur earthquake predictor, who claimed he had evidence of an impending quake — although geologists dismissed his methodology as unsound. A commission of experts met on March 31, 2009 to assess the scientific evidence and advise the government.
According to the prosecution, a press conference after that meeting — attended by the acting president of the commission, volcanologist Franco Barberi from the University of Rome ‘Roma Tre’, and government official Bernardo De Bernardinis, then deputy head of the Italian Civil Protection Department — conveyed a reassuring message that a major earthquake was not on the cards.
Moreover, in a television interview recorded a few hours before the meeting but aired after it, De Bernardinis, who is now president of the Institute for Environmental Protection and Research in Rome, said that “the scientific community tells me there is no danger because there is an ongoing discharge of energy” during the seismic swarms.
According to the prosecution, when the earthquake struck on April 6, 2009, some people chose to remain at home instead of stepping outside as they otherwise would have done, and died in the collapse of their homes. All seven members of the expert commission were found guilty of manslaughter in October 2012, after a 13-month trial that transfixed the international scientific community.
Prosecutors and the families of the victims alike say that the trial has nothing to do with the ability to predict earthquakes, and everything to do with the failure of government-appointed scientists serving on an advisory panel to adequately evaluate, and then communicate, the potential risk to the local population. According to them, the scientists were obligated to evaluate the degree of risk given all the factors at the L'Aquila site. A persistent message from the authorities of "Be calm, don't worry", and a lack of specific advice, deprived residents of an opportunity to make an informed decision about what to do on the night of the earthquake. That is why we feel betrayed by science, they say: either the scientists didn't know certain things, which is a problem, or they didn't know how to communicate what they did know, which is also a problem.
The trial has had a chilling effect on scientists' willingness to share their expertise with the public. Scientists working in fields where prediction is very difficult are now keeping mum. When the Earth roars, scientists are falling silent! Yes, they are worried! The charges serve as a "dangerous" warning to researchers, who may find themselves in legal trouble because of the way that non-scientists, such as public officials or journalists, translate their risk analyses for public consumption. This will largely discourage scientists from giving honest opinions.
Scientists involved in disaster assessment everywhere anxiously watched the 13-month-long trial that led to the geologists' original conviction. Many felt it was unfair to hold the geologists accountable for not predicting the earthquake. So when the verdict finally arrived, for these scientists, it felt like an attack on science as a whole.
Scientists often struggle to quantify their uncertainty for laypeople. In the time following the trial, many wondered whether it was possible to work with the government to accurately relay their findings — and include their uncertainty. Some wondered whether it was worth trying to communicate with the public at all in uncertain conditions. This verdict seemed to indicate that scientists could be punished for information that they had gathered and their interpretations. There is no doubt that this has had a very negative effect on the willingness of scientists to communicate their insights into natural phenomena to the press and politicians.
The seismologists appealed against the earlier ruling, in which they had been convicted of manslaughter for failing to predict the earthquake that ravaged the Italian town of L'Aquila. The month-long appeal trial started on October 10, 2014, in the L'Aquila courtroom, before a three-judge court. Over the course of six hearings, the scientists' attorneys argued that no clear causal link had been proven between the meeting and the behaviour of the people of L'Aquila. They also argued that the scientists could not be held accountable for De Bernardinis's reassuring statements, and that the scientific opinions given by the seismologists during the meeting were ultimately correct.
Now, although the six scientists accused of misleading the public about the risk of an earthquake in Italy were cleared of manslaughter on November 10, 2014 (the appeals court overturned their six-year prison sentences and reduced to two years the sentence of the government official who had been convicted with them), the fear remains in the hearts of scientists, because the ruling of the appeals court can still be overturned. Lawyers for the families of the victims have announced that they will challenge it in the Supreme Court of Cassation in Rome, the country's court of last resort. That court could invalidate the findings and order new appeal proceedings.
And the scientific community is waiting with bated breath. The case will continue to affect scientists for years to come. And the world will have to suffer as a result.
The truth is that there is no accepted scientific method for earthquake prediction that can be used reliably to warn citizens of an impending disaster. To expect more of science at this time is unreasonable. Experts went on to warn of the dangers of a conviction. They worry that subjecting scientists to criminal charges for adhering to accepted scientific practices may have a chilling effect on researchers, discouraging them from participating in matters of great public importance. One of the accused, Enzo Boschi, actually started warning other researchers to stay away from risk-assessment inquiries altogether!
The conversation surrounding scientists' willingness to share their expertise continued well beyond the geologists' conviction. Arresting and convicting scientists because they cannot forecast earthquakes, and blaming them for the earthquakes that do happen, sets a dangerous precedent, and it might discourage people from actually trying to address the prediction problem in the future.
Science is still in its growing phase, as I mentioned earlier (1). I always tell people that science is still in its infancy. It has to grow a lot before it can answer all your questions and give you foolproof replies. That doesn't mean you cannot predict things. You can, based on the knowledge available, but those predictions need not be foolproof. Still, the motto of science is that whatever little can be done to help people should be done.
Science is also uncertain. It is based on statistics, probabilities and error margins. The numbers produced in studies look precise, but there is always room for error. The most rigorous results tend to be correct in very controlled environments, which look nothing like the world in which we live (2). Discussion over the interpretation of results is often heated. To most researchers, this uncertainty is normal, and maybe even reassuring: the world still has the power to surprise us, because we don't know much about it. Almost all scientists agree on this. But in 2009, the uncertainty of scientific research backfired for six geologists and one hydrologist in Italy. While many saw the defense of science as the best way to appeal their manslaughter convictions, their lawyers decided against arguing that case. Instead, the lawyers argued that the defendants had been right all along.
The Italian case has emphasized how careful seismologists must be in their communications with the press and the public. They also now have a responsibility to quickly correct any misstatements they hear on the news or through other influential media.
Scientists have to be a lot more careful about what they say now because people sometimes alter their message. Some are particularly concerned that young people might refrain from specializing in seismology and other difficult subjects because of the trial.
The verdicts have introduced a level of uncertainty about what scientists can and cannot say, and that will take some time to resolve. But maybe this new development, the clearing of the scientists of manslaughter in Italy, will give them back the courage to speak up about the risks and hazards they sense in natural phenomena.
References:
1. http://kkartlab.in/group/some-science/forum/topics/2816864:Topic:11...
2. http://kkartlab.in/group/some-science/forum/topics/why-we-get-contr...
3. http://thebridge.agu.org/2014/12/03/defending-science-fall-meeting/
"Volcanology is progressing similarly to extreme weather forecasting, but is a few decades behind," says Kolzenburg, a volcanologist at the University of Buffalo in the United States. "First, we already have a long record of weather data to draw on. Second, hurricanes are more frequent and often seasonal, whereas major volcanic eruptions are infrequent. Last, volcanoes are technically and logistically difficult to monitor."
Weather forecasting, based on an understanding of atmospheric science coupled with regular observations, is around 200 years old. Satellites build on this data by drilling down to local scales, contributing precise measurements of variables such as humidity or wind speed.
But while weather is everywhere, volcanoes are scattered around the planet, complicating data gathering. Expensive seismometers for detecting geophysical signals are not evenly distributed globally, and rely on specialist skillsets. Additionally, different types of magma can make eruptions too fast to reach in time, or conversely too infrequent to justify the expense of constant observation—not to mention the potential dangers involved!
But perhaps the biggest impediment is that, as Kolzenburg puts it, "it is orders of magnitude more difficult to 'see' into Earth than to image weather patterns."
To accurately predict volcanic behavior, scientists would have to measure magma temperature and chemical composition, to understand how viscosity and volatility might drive pressure. They would also need to know a lot about what Kolzenburg calls "the geometry of the plumbing system."
"Even with robust sensors, it's virtually impossible to get all the input data that would be needed to predict such a dynamic system," adds Kolzenburg, who was principal investigator of the EU-funded DYNAVOLC project on volcano modeling.
Citizen science for volcano monitoring
Modern seismology tools, coupled with better understanding of the underlying processes through analysis of previous eruptions, experimental research and numerical modeling, are revealing more about the volumes, movements and characteristics of magma. We now know for example that magma chambers are not large cauldrons of magma, but small pockets dispersed throughout the crust, much like a sponge.
Additionally, satellites and airborne sensors streaming data in near real time have proven a game changer for helping predict how active eruptions might develop once under way.
While expensive cutting-edge technology like muon tomography could create 3D images of volcanic structures, what really excites Kolzenburg is people power:
"We've recently seen, with the La Palma, Nyiragongo and Kilauea eruptions, an international grassroots community pool of resources. I'd put my faith in this interface of shared fieldwork, analytics and modeling, combined with seismology, to track the evolution of future eruptions."
Lilian C. Lucas et al, The Impact of Ice Caps on the Mechanical Stability of Magmatic Systems: Implications for Forecasting on Human Timescales, Frontiers in Earth Science (2022). DOI: 10.3389/feart.2022.868569
https://phys.org/news/2022-05-ice-capped-volcanoes-slower-erupt.htm...
----

Cornell researchers have unearthed precise, microscopic clues to where magma is stored, offering scientists—and government officials in populated areas—a way to better assess the risk of volcanic eruptions.
----
A pair of seismologists has found what might turn out to be an accurate way to predict earthquakes. In their study, reported in the journal Science, they looked at high-rate GPS time series data that was gathered in the time leading up to the moment earthquakes of magnitude 7 or above occurred.
Seismologists have long sought to predict earthquakes so that people could react. In many cases, several minutes' warning would be helpful—it would allow people to exit buildings that might collapse. Finding a precursor is difficult due to the lack of information about what was happening in the vicinity of an epicenter before a quake. In this new effort, the researchers found a way to go back in time to learn more about how the land shifted before big quakes.
In looking for an earthquake precursor, the researchers obtained and studied precise GPS data for the geographical areas surrounding the epicenters of 90 quakes of magnitude 7 or above from the past several years. They found a pattern—a slip between tectonic plates that caused the land above them to move in a measurable, horizontal direction.
They also found that such slips could be observed and measured using GPS, that they occurred up to two hours before the earthquake struck and were too small to show up on standard seismographs. Most important, they saw the same slip in all the earthquakes they studied.
The work suggests that a reliable earthquake warning system could be designed based on a precise GPS listening system. On the downside, the researchers note that more work is required to prove that such a precursor exists for all, or at least most, large earthquakes. Also, they add, some upgrades to GPS technology are required to allow for measuring individual events around the clock.
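To see why stacking many records matters here, the sketch below uses purely synthetic data (not the study's actual GPS records or processing) to show how a weak precursory signal that is invisible in any single noisy record can emerge when records from many earthquakes are averaged together. All the numbers are invented for illustration.

```python
# A minimal, illustrative sketch (synthetic data, not the study's pipeline):
# a precursory slip signal too small to see in one noisy GPS record can
# emerge when records from many earthquakes are stacked (averaged).
import numpy as np

rng = np.random.default_rng(0)

n_events = 90        # number of magnitude-7+ earthquakes, as in the study
n_minutes = 120      # the final two hours before each mainshock, 1-minute samples
t = np.arange(n_minutes)

# Hypothetical weak precursor: a slow ramp of fault-parallel motion (mm)
# buried under GPS noise roughly four times larger.
precursor = 0.5 * (t / n_minutes)
noise = rng.normal(0.0, 2.0, size=(n_events, n_minutes))
records = noise + precursor            # one noisy displacement record per event

stack = records.mean(axis=0)           # stacking suppresses noise by ~sqrt(N)

snr_single = precursor[-1] / noise[0].std()
snr_stack = precursor[-1] / (2.0 / np.sqrt(n_events))
print(f"signal-to-noise in a single record: {snr_single:.2f}")
print(f"signal-to-noise after stacking    : {snr_stack:.2f}")
print(f"mean stacked displacement, last 10 minutes: {stack[-10:].mean():.2f} mm")
```

The real study worked with actual high-rate GPS time series projected onto the expected slip direction; this toy version only shows why a signal of that size could sit below the noise of any single station yet show up clearly in a stack.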
Quentin Bletery et al, The precursory phase of large earthquakes, Science (2023). DOI: 10.1126/science.adg2565
Roland Bürgmann, Reliable earthquake precursors?, Science (2023). DOI: 10.1126/science.adi8032
----

by Christian Yates
At around 10:44 Pacific Time on December 5, a huge earthquake struck around 60 miles off the coast of California. The magnitude 7 quake triggered a tsunami alert for some cities in northern California.
Fortunately the potentially catastrophic wave never appeared and the warning was later rescinded. Although many people reported experiencing an alarming shaking, thus far there have been no stories of serious casualties from the quake, with California residents typically reporting only minor damage.
A narrow escape like this is a reminder of the devastation that earthquakes in the area have the potential to cause. Residents might rightly be asking, why are we not able to better predict these quakes so that we have more advanced warning? Why has there been so little progress in predicting these catastrophic natural disasters over the decades?
The truth is that earthquake prediction is extremely hard. The tectonic plates that tessellate the globe and the fault lines where they meet are extremely complex.
Trying to pick out what is a clear signal of a precursor to a potentially catastrophic shift versus the normal background noise of the Earth's movement is difficult. Add human activities like building work, traffic or even music concerts to the mix and the task becomes near impossible.
It's also the case that earthquakes don't always have consistent warning signs or precursors. You can measure seismic activity as accurately as you like, but if there genuinely is no warning sign to indicate an imminent quake, then it isn't going to help.
Despite decades of research the scientific consensus is that individual earthquakes cannot be reliably predicted. But that doesn't mean we can't say anything about the likelihood of a large earthquake occurring at a particular place over a particular period of time.
I can make a prediction right now that, based on the earthquake frequencies in the two areas, my hometown of Manchester will experience fewer earthquakes of magnitude 4 or above than San Francisco over the next 12 months. It's almost certain that I will be correct. This sort of future-facing projection is what seismologists would call a forecast, rather than a prediction.
When the amount of energy an earthquake releases is plotted against the frequency with which earthquakes of that size occur, a distinctive relationship emerges. This is the celebrated Gutenberg-Richter law. The data for earthquakes over the 50-year period from 1970 to 2020 ranges from over 40,000 magnitude 4.5 earthquakes releasing around 350,000 million joules of energy each, to just two magnitude 9.1 earthquakes releasing nearly three million million million joules of energy each.
Because the two quantities (energy and frequency) vary so widely, the relationship is easier to see when plotted using logarithmic scales. When we do this, the data falls neatly onto the straight line predicted by the Gutenberg-Richter law.
The Gutenberg-Richter relationship seems to indicate that earthquakes follow a very predictable pattern. Knowing how often smaller earthquakes occur in a particular region can allow us, therefore, to predict how often the larger and less frequent, but more deadly, quakes will occur.
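A short sketch of that extrapolation is below. The catalogue numbers in it are invented for illustration, not taken from the 1970-2020 data mentioned above; the point is only how a straight-line fit of log counts against magnitude (the Gutenberg-Richter relation, log10 N = a - b*M) turns the rates of common small quakes into an expected rate for rare large ones.

```python
# Illustrative only: made-up 50-year counts for a hypothetical region,
# fitted to the Gutenberg-Richter law  log10(N) = a - b*M,
# then extrapolated to estimate how often M>=7 quakes occur there.
import numpy as np

magnitudes = np.array([4.0, 4.5, 5.0, 5.5, 6.0])   # magnitude thresholds
counts = np.array([2000, 630, 200, 63, 20])         # events of at least that magnitude

slope, intercept = np.polyfit(magnitudes, np.log10(counts), 1)
b_value = -slope          # b is the negative of the fitted slope
a_value = intercept

n_m7_per_50yr = 10 ** (a_value - b_value * 7.0)     # extrapolate to magnitude 7
print(f"b-value: {b_value:.2f}")
print(f"expected M>=7 earthquakes: {n_m7_per_50yr:.1f} per 50 years "
      f"(about one every {50 / n_m7_per_50yr:.0f} years)")
```

With real catalogues the fit is restricted to magnitudes the monitoring network records completely, and the b-value typically comes out close to 1.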
Although this doesn't allow us to predict the time, place and size of earthquakes—what scientists would refer to as a prediction—it does provide us with vital information that tells us whether the expected frequency of earthquakes in an area makes it worthwhile expending time and money preparing for them.
San Francisco, for instance, has a 51% forecasted probability of experiencing a magnitude 7 earthquake or higher over the next 30 years. For a city like that, in a relatively wealthy country like the US, it makes sense to invest significantly in earthquake preparedness. Even if the quake could be predicted precisely and all loss of life minimized, the economic cost of rebuilding the city's infrastructure would be catastrophic in itself.
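For a sense of what a figure like that 51% means in terms of average rates, here is a quick back-of-envelope conversion. It assumes earthquakes arrive as a simple Poisson process, which is not the model behind the official forecast, so treat it only as a rough consistency check.

```python
# Rough Poisson-process arithmetic (not the official forecast model):
# converting between a 30-year probability and an implied long-term rate.
import math

p_30yr = 0.51                                  # quoted 30-year probability of M>=7
rate = -math.log(1.0 - p_30yr) / 30.0          # implied average events per year
print(f"implied recurrence: one M>=7 quake every ~{1.0 / rate:.0f} years on average")

# And the reverse direction: a long-term rate implies a window probability.
p_back = 1.0 - math.exp(-rate * 30.0)
print(f"30-year probability recovered from that rate: {p_back:.2f}")
```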
Contrastingly, in a less wealthy country where similarly powerful earthquakes are forecast to happen less often, the expenditure to make the country quake-proof might not be justifiable.
The historian Edward Gibbon wrote in his memoirs that the laws of probability are "so true in general, so fallacious in particular". Despite the fact that the Gutenberg-Richter law appears to demonstrate that seemingly unpredictable earthquake occurrences can be spectacularly well-behaved, it is a long way from being a crystal ball.
It cannot foretell the precise date and time of the next big quake. Instead, it is limited to providing only the probability that a quake above a given size will occur in given time period.
This doesn't mean these forecasts are useless. Far from it. They allow us to prepare for a range of scenarios, allocating resources appropriate to the risk and likelihood of each.
Exactly how we should trade off preparing for events with low probability but high potential for disaster against events with higher probabilities but lower danger is a question that those in power, who will ultimately be responsible for the consequences of these choices, will have to grapple with. They should not pretend, however, that because we cannot predict the specific timing of any particular disaster that there is nothing we can do to prepare for them.
This article is republished from THE CONVERSATION under a Creative Commons license. Read the original article.
----

A recent study published in the journal Geology demonstrates that climate change can affect the frequency of earthquakes, adding to a small but growing body of evidence showing that climate can alter the seismic cycle.
Geoscientists analyzed the Sangre de Cristo Mountains in southern Colorado, a range with an active fault along its western edge. Their results indicate that the fault had been held in place under the weight of glaciers during the last ice age, and as the ice melted, slip along the fault increased. This suggests that earthquake activity along a fault could increase as glaciers recede.
Climate change is happening at a rate that is orders of magnitude faster than we see in the geologic record.
We see this in the rapid mountain glacial retreats in Alaska, the Himalayas and the Alps. In many of these regions, there are also active tectonics, and this work demonstrates that as climate change alters ice and water loads, tectonically active areas might see more frequent fault movements and earthquakes due to rapidly changing stress conditions.
It is well known that climate adjusts to changes in the solid Earth: the tectonic uplift of mountain ranges alters atmospheric circulation and rainfall, for example.
The Sangre de Cristo Mountains were covered with glaciers during the last ice age. Using remote-sensing and field data, the researchers reconstructed where the ice was, calculated the load that would have been pushing on the fault, and then measured displacement of the fault, or how much it had shifted.
The study found that fault slip rates have been five times faster since the last ice age than during the time the range was covered in glaciers. This research may preview how other glacier-adjacent faults will respond to a warming climate.
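To make the two quantities in that method concrete, here is a small illustrative calculation. The ice thickness, displacements and time spans below are invented, not taken from the Geology paper; they simply show how an ice cap's weight translates into stress on a fault, and how measured displacement over a time interval gives a slip rate.

```python
# Illustrative numbers only (not from the Geology study): how an ice cap's
# weight loads a fault, and how displacement over time gives a slip rate.
RHO_ICE = 917.0   # density of ice, kg/m^3
G = 9.81          # gravitational acceleration, m/s^2

def ice_load_mpa(thickness_m: float) -> float:
    """Vertical stress exerted by an ice column of the given thickness, in MPa."""
    return RHO_ICE * G * thickness_m / 1e6

def slip_rate_mm_per_kyr(displacement_m: float, duration_kyr: float) -> float:
    """Average fault slip rate from total displacement over a time interval."""
    return displacement_m * 1000.0 / duration_kyr

print(f"Load from a 500 m ice cap: {ice_load_mpa(500.0):.1f} MPa")

# Hypothetical displacements chosen to mimic the reported ~5x contrast.
glacial = slip_rate_mm_per_kyr(displacement_m=1.0, duration_kyr=20.0)
postglacial = slip_rate_mm_per_kyr(displacement_m=3.5, duration_kyr=14.0)
print(f"Glacial slip rate:      {glacial:.0f} mm per thousand years")
print(f"Post-glacial slip rate: {postglacial:.0f} mm per thousand years "
      f"({postglacial / glacial:.1f}x faster)")
```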
And this is compelling evidence.
The research adds to our understanding of what drives earthquakes, which is important for hazard assessment. Faults in areas with rapidly retreating glaciers or evaporating large bodies of water may need to be monitored for increasing earthquake activity.
Cecilia Hurtado et al, Exploring the impact of deglaciation on fault slip in the Sangre de Cristo Mountains, Colorado, USA, Geology (2024). DOI: 10.1130/G52661.1