As natural disasters go, earthquakes are among the most destructive, as well as the most mysterious. Originating beyond the range of direct observation, miles below the surface, they usually strike with little or no warning.
The magnitude 9.0 Tohoku earthquake and tsunami that devastated Japan on March 11, 2011, was a humbling reminder that even the world’s most earthquake-prepared nation could be overwhelmed by sudden catastrophe.
But such megaquakes are rare, striking a given region perhaps once in a thousand years. The majority of earthquake damage, according to UC Davis geophysicist John Rundle, is caused by earthquakes in the range of magnitude 6 to 7. The magnitude scale is logarithmic: each whole-number step multiplies the shaking amplitude by 10 and the energy released by roughly 32, meaning that a 9.0 earthquake releases about 1,000 times the energy of a 7.0.
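The energy scaling follows from the standard Gutenberg-Richter energy-magnitude relation, log10(E) = 1.5M + 4.8 (E in joules). A quick sketch makes the arithmetic concrete:

```python
def quake_energy_joules(magnitude):
    """Gutenberg-Richter energy-magnitude relation: log10(E) = 1.5*M + 4.8,
    with E in joules."""
    return 10 ** (1.5 * magnitude + 4.8)

# Two magnitude units apart => a factor of 10**(1.5 * 2) = 1,000 in energy
ratio = quake_energy_joules(9.0) / quake_energy_joules(7.0)
print(round(ratio))  # 1000
```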
These relatively smaller, but still powerful, quakes might occur in seismically active regions every decade and can wreck a city, as in the case of the 7.0 Haiti quake of 2010 or the 6.3 Christchurch quake of 2011.
“These earthquakes are much more significant, they occur much more frequently and there’s a reasonable chance of forecasting them,” Rundle said, speaking from Japan. He traveled there last week to help establish an institute for multi-hazard studies as a collaborative venture of the Association of Pacific Rim Universities, which includes UC Davis.
Rundle is part of a team of physicists, geologists and computer scientists at UC Davis working to improve statistical models that forecast the likelihood of medium-to-large earthquakes in specific areas over intervals of months or years. The basic principle is to mine sequences of smaller earthquakes for patterns that anticipate larger ones. One key foundation of such work is the well-established relationship between the number of small earthquakes and the number of larger earthquakes on a given fault system, known as the Gutenberg-Richter law.
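The Gutenberg-Richter law says that the number N of earthquakes at or above magnitude M on a fault system obeys log10(N) = a − bM, with b typically close to 1. A minimal sketch (the a and b values below are purely illustrative) shows why counts of small quakes constrain the rate of rare large ones:

```python
def expected_count(m, a=5.0, b=1.0):
    """Gutenberg-Richter law: log10(N) = a - b*M, where N is the number of
    earthquakes of magnitude >= m. The a and b values here are illustrative;
    b is observed to be close to 1 for most fault systems."""
    return 10 ** (a - b * m)

# With b = 1, each one-unit drop in magnitude means roughly 10x more events,
# so abundant small quakes pin down the expected rate of large ones.
print(expected_count(4.0) / expected_count(6.0))  # 100x more M>=4 than M>=6
```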
A major challenge facing earthquake researchers is the lack of direct measurements of the stresses along a fault system that lead to the rupture we feel as an earthquake. Instruments placed in boreholes can sample only isolated points along the system, and even then, only measure changes, rather than absolute values, of the mechanical stress of the constantly shifting crust.
By sticking to observables — the sequences of earthquakes recorded by seismometers around the world, as well as sedimentary records of past events — researchers can detect some regularity in earthquake behavior over long periods of time. The challenge has been refining these statistical generalizations into reliable forecasts on timescales relevant to disaster planning and risk management.
One outcome of this effort is Open Hazards, a company Rundle and his colleagues founded to make the practical results of their work on earthquake forecasting available to the public. Among other things, its website allows users to produce personal earthquake forecasts based on location. (Davis locals can rest easy, with a 0.09% chance of an earthquake greater than magnitude 5 hitting within 50 miles in the next month.)
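Open Hazards does not publish its formula here, but a standard way to turn a long-run earthquake rate into a short-window probability like that 0.09% figure is a Poisson occurrence model; the rate below is hypothetical, chosen only to reproduce a number of that size:

```python
import math

def window_probability(annual_rate, years):
    """Poisson occurrence model: P(at least one event in the window)
    = 1 - exp(-rate * time). The rate passed in below is hypothetical."""
    return 1.0 - math.exp(-annual_rate * years)

# A hypothetical long-run rate of ~0.011 events/year, over a one-month window:
p = window_probability(0.011, 1 / 12)
print(f"{100 * p:.2f}%")  # 0.09%
```

For rare events the probability is nearly rate times time, which is why small monthly percentages scale almost linearly with the window length.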
Another approach to forecasting attempts to model the physical interactions of earthquake faults with one another, in hopes of reproducing naturally occurring patterns of earthquakes. Virtual California is one such program: it models the crust as 3-by-3-kilometer sections overlaid with known faults that transfer stress to one another over long stretches of simulated time.
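Virtual California’s physics is far more detailed, but the core idea of fault segments trading stress can be caricatured with a toy model in the spirit of slider-block simulations; everything below is illustrative, not the program’s actual code:

```python
def load_and_relax(stress, threshold=1.0, load=0.05, transfer=0.2):
    """One step of a toy stress-transfer model. Tectonic loading raises
    stress on every segment; any segment over threshold 'ruptures',
    dropping to zero and shedding a fraction of its stress onto its
    neighbors, which can trigger cascading ruptures (an 'earthquake')."""
    stress = [s + load for s in stress]
    ruptures = 0
    while any(s >= threshold for s in stress):
        for i in range(len(stress)):
            if stress[i] >= threshold:
                shed, stress[i] = stress[i], 0.0
                ruptures += 1
                for j in (i - 1, i + 1):
                    if 0 <= j < len(stress):
                        stress[j] += transfer * shed
    return stress, ruptures

# Run the model forward and record bursts of activity per loading step.
segments = [0.04 * i for i in range(20)]   # deterministic initial stresses
history = []
for _ in range(200):
    segments, n = load_and_relax(segments)
    history.append(n)
print(max(history), sum(history))
```

As with the real simulations, the interesting output is not any one rupture but the statistical pattern of quiet stretches punctuated by cascades.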
“We don’t know what the state of the earth is now, but we let it run forward and hopefully there will be regular patterns,” said Eric Heien, lead developer for the UC Davis-based Computational Infrastructure for Geodynamics, a group that develops computer simulations of deep-earth processes.
While forecasts speak in the language of probabilities, short-term prediction has held out the promise of certainty — the type that would be needed, for example, to evacuate an area prior to an earthquake. But such certainty has been elusive, in large part because definitive precursor signals have been notoriously inconsistent.
UC Davis geology professor Donald Turcotte said that while laboratory studies of fracturing rock and other simulated earthquake experiments suggest that seismic precursors should occur as stresses build toward an earthquake, foreshocks are by no means consistent precursors to major earthquakes. This suggests that the geophysical mechanisms of earthquake formation are still far from fully understood.
Turcotte said he is also skeptical of efforts to detect non-seismic precursor signals, such as electromagnetic emissions that some researchers have sought to connect to earthquake formation. The basic problem is separating any perceived precursor from background seismicity.
“There are periods of time where people think they see glimmers of hope, and then they seem to recede,” Turcotte said. “The hope of doing accurate short-range prediction is not all that good.”
Despite the many uncertainties inherent in earthquake science, one certainty stands out: the growing danger of natural hazards to global populations.
“One of the reasons that disasters are so much larger these days is because global populations are moving into risky areas,” Rundle said. “It’s certainly not true that the earthquakes are on average bigger, but people’s exposure is growing exponentially and therefore so is the cost and the death toll.”
OYANG TENG can be reached at email@example.com.