Sangeeta Apr 27, 2015 No Comments
The weekend was taken over by the worst earthquake in the sub-continent in the last eight decades. Thousands of lives have been lost, scores injured and property worth millions destroyed. News of further losses pours in as the aftershocks continue to shake us all up.
Geologists and scientists have long monitored the occurrence and frequency of earthquakes to build simulation models. Since the potential for seismic activity along tectonic fault lines is well established, surely it should be possible to correlate structured and unstructured data from various sources to predict earthquakes? That's what we die-hard believers in GIS and analytics would like to think.
Well, you may be surprised to know it is indeed possible.
Earthquake prediction began with the Haicheng earthquake of China (1975). A blend of empirical analysis, intuitive judgment, extensive scientific studies and a series of foreshocks was used to make the prediction. Evacuation of the city of one million people was ordered just days before a 7.3-magnitude quake devastated Haicheng. The successful prediction was based on earthquake precursors – unusually high temperatures, sulphurous gas emissions, strange animal behaviour, abnormal earthquake-cloud formations – appearing along fault lines, together with geologic data.
Nearly four decades on, advances in analytical rigour have moved beyond the descriptive to the predictive. A company called Terra Seismic now successfully forecasts earthquakes, such as the Tarapaca megaquake (8.1) in Chile, the Mexico quake (7.2) and the 6.4 quake in Indonesia, forecast nine days before it hit on March 3. Using its flagship Quake Hunters, the company offers seismic analytics-as-a-service to insurers and government agencies.
The Terra Seismic message tells you ‘Forecasting Earthquakes’ is possible!
Companies like Terra Seismic use satellite big data, earth observations and the internet "to predict major earthquakes anywhere in the world with 90% accuracy". Data from satellite services, as well as from ground-based sensors, is used to measure abnormalities in the atmosphere that occur before a quake. The Apache Server system processes voluminous satellite data, correlating it with sensor and other data from earlier occurrences, for real-time estimation, analysis and simulation.
In an earlier blog, we explored the power of big data analytics in climate science. In seismic prediction too, big data can identify potentially seismic zones and connect them with huge volumes of structured and unstructured data to construct fairly accurate models using statistical analysis.
Analysing unstructured data
Take Twitter data, for instance. You can mine information related to earthquakes or their occurrence by targeting hashtags based on scientifically established earthquake precursors. So hashtags would factor in #unusual #clouds #behavior #animals #weather #gasemissions #cloudformations #hightemp #EarthquakePrediction and so on.
Using various filters, streaming endpoints, images, feeds, links, and the like, huge amounts of unstructured data can be correlated with potential seismic areas, hazard maps and fault lines.
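To make the idea concrete, here is a minimal sketch of the filtering step. It assumes tweets have already been collected (via whatever streaming endpoint is available) into plain dictionaries; the `hashtags` and `region` fields, the tag list and the sample data are all illustrative assumptions, not a real pipeline.

```python
# Hypothetical sketch: filter a batch of collected tweets for
# earthquake-precursor hashtags, then bucket them by region so the counts
# can later be correlated with hazard maps and known fault lines.
from collections import Counter

# Precursor tags drawn from the hashtag list above (lower-cased for matching).
PRECURSOR_TAGS = {"unusual", "clouds", "behavior", "animals", "weather",
                  "gasemissions", "cloudformations", "hightemp",
                  "earthquakeprediction"}

def precursor_tweets(tweets):
    """Yield tweets carrying at least one precursor hashtag (case-insensitive)."""
    for tweet in tweets:
        tags = {t.lower() for t in tweet.get("hashtags", [])}
        if tags & PRECURSOR_TAGS:
            yield tweet

def counts_by_region(tweets):
    """Count precursor tweets per region, ready to overlay on a hazard map."""
    return Counter(t.get("region", "unknown") for t in precursor_tweets(tweets))

# Illustrative sample, not real data.
sample = [
    {"text": "Strange #clouds over the valley", "hashtags": ["clouds"], "region": "Kathmandu"},
    {"text": "Dogs acting odd #animals #unusual", "hashtags": ["animals", "unusual"], "region": "Kathmandu"},
    {"text": "Lunch pics", "hashtags": ["food"], "region": "Delhi"},
]
print(counts_by_region(sample))  # Counter({'Kathmandu': 2})
```

A real system would of course ingest millions of tweets and join the regional counts against fault-line and hazard-map layers; the sketch only shows the hashtag-targeting idea.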
Analysing structured data
Structured data includes satellite data from satellite services across the world, global meteorological data, data from field and laboratory observations, geophysical, geological, geochemical, mathematical and computational modelling of fault zones and seismic activity, atmospheric data, and geologic information such as crustal deformation.
Correlated information includes long-term probabilistic hazard assessments (shaking hazard maps, 30-year earthquake probability reports), foreshock probabilities and historical reports.
Earthquake predictions can be used not only to prepare and minimise the loss of lives, but also for risk and vulnerability assessments.
Now, are you wondering, as I am, why the severe 7.9-magnitude earthquake in Nepal was not forecast, given that the area had long been expecting a magnitude-8 earthquake?
NASA has just stated that earthquake prediction is not possible. Yet, as Terra Seismic has shown in a couple of cases, it might just be possible! So does the analytical mind give up on the possibilities of big data analytics for earthquake prediction?
UPDATE (1st May)
The USGS has put up an Aftershock Forecast and table on its website (27th April):
“In the coming week, the USGS expects 3-14 M≥5 aftershocks of the magnitude 7.8 Nepal earthquake. Additionally, the USGS estimates that there is a 54% chance of a M≥6 aftershock, and a 7% chance of a M≥7 aftershock during this one-week period. … Based on general earthquake statistics, the expected number of M≥3 or 4 aftershocks can be estimated by multiplying the expected number of M≥5 aftershocks by 100 or 10, respectively.”
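The USGS rule of thumb quoted above amounts to a factor of ten per unit of magnitude (the Gutenberg–Richter relation with b = 1). A quick sketch of the arithmetic, using the forecast's own 3-14 range for M≥5 aftershocks:

```python
# Sketch of the USGS scaling rule: the expected number of aftershocks of
# magnitude >= m grows by a factor of 10 for each unit drop in magnitude,
# relative to the expected M>=5 count (Gutenberg-Richter with b = 1).
def expected_aftershocks(min_magnitude, expected_m5_count):
    """Scale an expected M>=5 aftershock count to another magnitude floor."""
    return expected_m5_count * 10 ** (5 - min_magnitude)

m5_low, m5_high = 3, 14  # the USGS forecast range for M>=5 aftershocks

print(expected_aftershocks(4, m5_low), expected_aftershocks(4, m5_high))  # 30 140
print(expected_aftershocks(3, m5_low), expected_aftershocks(3, m5_high))  # 300 1400
```

So the same forecast implies on the order of 30-140 aftershocks of M≥4 and 300-1,400 of M≥3 in that week, which is exactly the "multiply by 10 or 100" rule in the quote.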
So, the analytical rigour DOES extend to earthquake prediction. Even where Terra Seismic failed, that hasn't stopped the USGS from making earthquake forecasts or developing apps and systems that make statistical inferences!
The ShakeAlert App endorsed by USGS