Disaster Preparedness & Recovery

How Big Data Is Changing Earthquake Science
By: Robert Perkins on January 16, 2014

Damage in San Francisco from the Loma Prieta earthquake in 1989. Photo courtesy of the U.S. Geological Survey


Twenty years ago, a fault that scientists didn’t even know existed slipped, triggering the massive magnitude 6.7 Northridge earthquake centered beneath the San Fernando Valley, with shockwaves rippling throughout the greater Los Angeles area.

When the strongest shaking ceased, the region had suffered 57 deaths and more than $20 billion in damage. The newly formed Southern California Earthquake Center (SCEC), founded in 1991 and headquartered at USC, stepped in to find out exactly what happened and what could be done about it.

Earthquakes can be neither prevented nor predicted. However, by expanding and modernizing the region’s seismographic network and then crunching the resulting mass of data, scientists from SCEC have been able to piece together a clearer, more granular picture of the varying earthquake risk that regions throughout Southern California face.

That picture can be used to help create appropriate building codes and, in the wake of an earthquake, help direct first responders.

“What you have to do in order to get the earthquake rupture forecast — that is, a forecast of how often, how big and where the earthquakes might occur — is to integrate all of this information together,” said Thomas Jordan, university professor at the USC Dornsife College of Letters, Arts and Sciences and director of SCEC. “You have to know the faults, the rates of motion and something about the ruptures that occur on them in order to estimate the future probabilities of having earthquakes on these different faults.”

To create seismic hazard maps, Jordan and his team use supercomputers to simulate a half million earthquakes at nearly 300 sites throughout the Los Angeles region.
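The logic behind turning those simulations into a hazard map can be sketched simply: for each site, run a large ensemble of simulated earthquakes and count how often the shaking exceeds a level of engineering concern. The sketch below is illustrative only, assuming made-up numbers and a toy lognormal distribution of peak ground acceleration (PGA); it is not SCEC’s actual simulation code.

```python
import random

def exceedance_probability(simulated_pga, threshold):
    """Fraction of simulated ground motions (in g) that exceed `threshold`."""
    return sum(1 for pga in simulated_pga if pga > threshold) / len(simulated_pga)

random.seed(42)
# Stand-in for the ensemble of simulated events at one site: the lognormal
# parameters here are arbitrary, chosen only to produce plausible PGA values.
ensemble = [random.lognormvariate(-2.5, 0.8) for _ in range(500_000)]

p = exceedance_probability(ensemble, 0.2)  # chance any one event tops 0.2 g
print(f"P(PGA > 0.2 g) = {p:.3f}")
```

Repeating this calculation at every site, for a range of thresholds, yields the per-site hazard curves that a seismic hazard map summarizes.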

“All of the science is being put into products that society can use,” Jordan said.

It’s not a simple process. As shown in the Northridge quake, an area’s risk for heavy shaking doesn’t just depend on its proximity to a fault. The type of sediments or rock that an area sits on, as well as the geometry of the region, can help seismic energy focus and propagate. In 1994, the region’s geometry caused an unusual amount of shaking in Santa Monica, resulting in greater damage to the city than surrounding neighborhoods.

In the wake of Northridge, the Southern California Seismographic Network was expanded and modernized, with digital stations replacing the old analog ones. At the time of the Northridge quake, there were six digital stations in use throughout the region. Today, there are more than 400.

The digital stations allow the rapid creation of “ShakeMaps” depicting the pattern of where shaking has occurred. First responders can use these maps to immediately prioritize their response.
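The core idea of such a map is spatial interpolation: estimate shaking between stations from the readings around them. The sketch below uses simple inverse-distance weighting with invented station locations and values; the real USGS ShakeMap system is considerably more sophisticated, blending station data with ground-motion models and site conditions.

```python
import math

# Hypothetical station readings: (longitude, latitude, peak ground accel. in g)
stations = [
    (-118.54, 34.21, 0.60),  # near a hypothetical epicenter
    (-118.49, 34.02, 0.25),
    (-118.24, 34.05, 0.15),
]

def estimate_shaking(lon, lat, readings, power=2.0):
    """Inverse-distance-weighted estimate of shaking at (lon, lat)."""
    num = den = 0.0
    for slon, slat, pga in readings:
        d = math.hypot(lon - slon, lat - slat)
        if d == 0:
            return pga  # exactly at a station: use its reading directly
        w = 1.0 / d ** power
        num += w * pga
        den += w
    return num / den

print(f"Estimated PGA at a grid point: "
      f"{estimate_shaking(-118.25, 34.05, stations):.2f} g")
```

Evaluating this estimate on a grid of points and coloring by intensity produces a shaking map within minutes of station data arriving, which is what makes the digital network so much faster than the film-based workflow of 1994.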

“To create the equivalent of this map in ’94, it took us two months that involved going out into the field and picking up film,” said Lucy Jones, science adviser for risk reduction for the U.S. Geological Survey (USGS). “We had to get film recordings, develop them and digitize them, to be able to turn them into this map. This map now happens in a matter of about two to three minutes.”

After Northridge, a special focus was also given to so-called “blind-thrust faults” — buried faults that do not appear on the Earth’s surface — like the one that caused the quake in 1994. As of today, dozens more have been found.

“One of the major points of emphasis in scientific research after Northridge was to better understand these blind thrust faults,” Jordan said.

Beyond creating hazard maps, SCEC’s research has been used to create more realistic earthquake simulations. An example is the simulation used as part of the USGS-developed ShakeOut Scenario, the basis for the original Great Southern California Shakeout in 2008. The massive drill, which has grown statewide and involved 9.6 million participants in 2013, allows emergency management agencies to estimate and plan for the cascading impacts on the area’s infrastructure, according to Kate Long, earthquake program manager for the California Governor’s Office of Emergency Services.

And it all begins with the hard work of scientists at SCEC.

“Without the science, without the scenarios, when we exercise we’re just guessing,” Long said.

Though no one wants to experience another earthquake like Northridge any time soon, Jordan and his colleagues at SCEC are unlikely to run out of temblors to measure and analyze in the Southern California area.

As Jones put it, “Those of us who live in Southern California don’t always like to hear what an exciting natural laboratory Southern California is for the scientists, but that’s the reality.”

This article was republished with permission from the University of Southern California.
You may use or reference this story with attribution and a link to http://www.emergencymgmt.com/disaster/Big-Data-Earthquake-Science.html

