
GPS Could Issue Tsunami Alert in Minutes

An image from an animation, based on satellite observations of the March 11 Japan tsunami, showing how the waves were influenced by seafloor features. Wave peaks appear in red-brown, depressions in blue-green, and ocean-floor topography is outlined in gray.
CREDIT: NASA/Jesse Allen, using data provided by Tony Song (NASA/JPL)

The global positioning system (GPS) — the same system that helps people navigate unfamiliar places — could also serve as an early-warning system for tsunamis, according to new research.

When a magnitude-9.0 earthquake struck Japan on March 11, 2011, coastal residents received an inaccurate estimate of the earthquake’s magnitude before the waves hit and leveled thousands of buildings.

The area under alert was warned based on an estimated earthquake magnitude of 7.9, a quake releasing roughly 45 times less energy than the actual magnitude-9.0 event, meaning fewer neighborhoods were evacuated in response to the perceived threat.
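The size of that underestimate follows from the standard energy scaling of the magnitude scale: released energy grows by a factor of about 32 per whole magnitude step, i.e. as 10^(1.5·M). A minimal sketch of the calculation:

```python
# Energy released by an earthquake scales roughly as 10^(1.5 * magnitude)
# (the Gutenberg-Richter energy relation).

def energy_ratio(m_actual, m_estimated):
    """Factor by which the actual quake released more energy than estimated."""
    return 10 ** (1.5 * (m_actual - m_estimated))

ratio = energy_ratio(9.0, 7.9)
print(f"A magnitude-9.0 quake releases ~{ratio:.0f}x the energy of a 7.9")
# ~45x
```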


Researchers behind a new study have said that GPS systems along the coast could have given the residents a better warning. Sifting through the GPS data from stations along the coast and issuing a more accurate tsunami alert based on that data would have taken only three minutes, the study found.

Subduction zones and GPS

Most tsunamis occur when one tectonic plate slides underneath another and causes an earthquake. In the process, the top plate is forced upward, and this uplift of the seafloor pushes on the water above it, setting off the tsunami. How high the seafloor rises influences the wave heights at the surface.

The coast also slightly rises or falls along with the ocean floor, making it possible to see these changes through coastal GPS stations. Therefore, areas near these so-called subduction zones can be mapped and measured using GPS to see how much the ground has shifted and in what way it has deformed.
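A hypothetical sketch of that idea, with all station readings invented for illustration: the coseismic offset at a coastal GPS station is simply the jump in its averaged vertical position from before the quake to after.

```python
# Hypothetical sketch: detect a coseismic ground offset from GPS
# position samples. All height values are invented for illustration.

def coseismic_offset(pre_quake_heights, post_quake_heights):
    """Mean vertical position after the quake minus before, in meters."""
    pre = sum(pre_quake_heights) / len(pre_quake_heights)
    post = sum(post_quake_heights) / len(post_quake_heights)
    return post - pre

# Fabricated height samples (m) for one coastal station:
before = [12.402, 12.401, 12.403, 12.402]
after = [11.871, 11.869, 11.872]  # the coast subsided about half a meter

offset = coseismic_offset(before, after)
print(f"vertical offset: {offset:+.2f} m")  # -0.53 m
```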

Whereas traditional seismological stations are located some distance away from the source, GPS receivers can be placed much closer, on the coastline near where a tsunami originates, buying valuable time for those looking to escape.

“To really get absolute values of slip, you would need to have stations at the seafloor,” said Andreas Hoechner, a postdoctoral researcher at the GFZ German Research Centre for Geosciences in Potsdam.

“However, [the coastal GPS readings are] good enough to get good tsunami wave estimates.”

A subduction quake generates both kinds of ocean waves: crests above seafloor that rises, and troughs above seafloor that drops. Additionally, independent research has recently shown that a shoreline’s features also influence the severity of a tsunami’s impact on land.

Reconstructing an alert

To reconstruct what a GPS alert would have looked like during the 2011 temblor, the scientists took information from the Japanese GPS Earth Observation Network (GEONET) from the day before, the day of, and the day after the 2011 earthquake. The network is typically used to track long-term changes in the ground, such as “relaxation processes” between earthquakes, but had not yet been applied to tsunami warnings, Hoechner said.

While Japan has about 1,200 of these stations, the researchers used only 50 of them to keep the time needed to issue an alert short. The exact number of stations does not matter in this scenario, Hoechner noted, as long as there are enough to detect a rapidly changing height difference between the ground on the coast and the ground farther inland.
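That trigger condition can be sketched as a simple threshold check. The offsets and the 0.2-meter threshold below are illustrative assumptions, not values from the study:

```python
# Illustrative alert logic: flag a possible tsunami source when the
# vertical offset at coastal stations diverges sharply from inland ones.
# The threshold and all offsets are invented for illustration.

ALERT_THRESHOLD_M = 0.2  # assumed differential-offset threshold, meters

def tsunami_alert(coastal_offsets_m, inland_offsets_m):
    """True if coastal ground moved much more than inland ground."""
    coastal = sum(coastal_offsets_m) / len(coastal_offsets_m)
    inland = sum(inland_offsets_m) / len(inland_offsets_m)
    return abs(coastal - inland) > ALERT_THRESHOLD_M

# Coast subsides ~0.5 m while inland barely moves -> alert fires
print(tsunami_alert([-0.53, -0.48, -0.51], [-0.02, 0.01]))  # True
```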

GPS stations provide more accurate information about ground shifts than seismological stations do, as seismological stations are better suited for looking at the amount of ground shaking — rather than shifting — associated with an earthquake. Both systems are useful in their own ways and should be used together, Hoechner said.

In the case of Japan’s Tōhoku earthquake, a tsunami warning issued just three minutes after the earthquake struck would have provided several minutes for people to scramble to safety. Tsunamis typically hit land about 20 to 30 minutes after they are generated, Hoechner said, depending on the distance between land and the earthquake’s epicenter.
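Those arrival times follow from the shallow-water wave speed of a tsunami, v = sqrt(g·d). A rough sketch, with the distance and average ocean depth assumed purely for illustration:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m):
    """Shallow-water wave speed in m/s: v = sqrt(g * d)."""
    return math.sqrt(G * depth_m)

def travel_time_minutes(distance_km, depth_m):
    """Minutes for a tsunami to cover distance_km at a uniform depth."""
    return distance_km * 1000 / tsunami_speed(depth_m) / 60

# Assumed example: source 200 km offshore, average depth 2,000 m
print(f"{travel_time_minutes(200, 2000):.0f} minutes")  # 24 minutes
```

With those assumed numbers the wave arrives in roughly 24 minutes, consistent with the 20-to-30-minute window Hoechner describes; a three-minute alert would leave most of that window for evacuation.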

The challenge will be to use the GPS sensors for real events, not just for simulating past tsunamis. The technique could be applied not only in Japan but also in Indonesia: some GPS stations were installed there after the devastating 2004 earthquake, but the researchers say more are needed for accurate tsunami warnings.

The results appear in the latest edition of Natural Hazards and Earth System Sciences, an open-access journal of the European Geosciences Union.

Hoechner’s team plans to extend its research to Chile, which was the site of a devastating tsunami in 2010.


Obama administration supports fracking and natural gas exports

With two initiatives last week, the Obama administration signaled support for hydraulic fracturing and natural gas exports, despite environmental opposition.

Last Thursday, the US Department of the Interior released a draft proposal that would “establish common-sense safety standards for hydraulic fracturing on public and Indian lands.” Last Friday, the US Department of Energy (DOE) approved a liquefied natural gas (LNG) export terminal in Freeport, Texas.

Despite opposition from environmental groups, the Obama administration apparently supports the expansion of the natural gas industry and the controversial technology of hydraulic fracturing. These events are welcome common sense from an administration that is typically deep in green ideology.

Good old Yankee ingenuity has produced a new hydrocarbon revolution. Vast quantities of oil and natural gas can now be recovered from shale rock formations, thanks to enabling technologies of hydraulic fracturing (or fracking) and horizontal drilling.

US crude oil production in 2012 was up 30 percent since reaching a low in 2008. Natural gas production is up 33 percent since 2005. Bob Dudley, CEO of BP, forecasts that the United States will be “nearly self-sufficient in energy” by the year 2030.

Fracking is not new, but has been perfected over the last 20 years to allow cost-effective recovery of hydrocarbon fuels from shale. Water and sand, along with a small amount of chemicals, are injected under pressure to fracture the shale and create millions of tiny fissures, releasing the trapped gas or oil. To develop a large producing field, horizontal drilling is used to bore mile-long horizontal shafts into the shale. Fracking is typically used at depths greater than 5,000 feet.

Hydraulic fracturing is under assault from environmental organizations. According to the Sierra Club, “Fracking, a violent process that dislodges gas deposits from shale rock formations, is known to contaminate drinking water, pollute the air, and cause earthquakes.” A 2011 letter from Friends of the Earth, Greenpeace USA, Climate Protection Campaign, and other groups urged President Obama to “halt hydraulic fracturing…until and unless the environmental and health impacts of this process are well understood and the public is adequately protected.”

The draft rule released Thursday from the Department of the Interior acknowledges that hydraulic fracturing can be conducted in an environmentally safe manner. It calls for disclosure of chemicals used in fracking, assurances of well-bore integrity to prevent leakage of gas and fluid into ground water supplies, and confirmation of a water management plan for disposal of water and fluids used in the fracking process. Indeed, fracking has been used more than 500,000 times over the last 50 years without incidents of water contamination when proper safeguards were employed.

The hydrofracturing revolution has created a glut of natural gas in the US market. Prior to wide-scale use of fracking, natural gas prices reached $15 per million British thermal units (Btu), and port facilities were being constructed to import LNG. By 2011, prices had fallen to $4 per million Btu and import terminals sat idle.

Unlike crude oil, which is priced and sold in a global market, natural gas is priced and sold regionally. To date, the fracking revolution has been a US phenomenon, with other nations slow to join. While US gas prices have dropped to under $4 per million Btu, Europe’s prices remain above $10, and the price of imported LNG in Japan is above $15.

US producers now see an opportunity to liquefy the gas and ship it to Europe and Japan. Twenty export applications have been filed with the DOE. The approval last week of the Freeport terminal in Texas is the first since 2011. The $10 billion terminal plans to export up to 1.4 billion cubic feet of natural gas per day, equivalent to about 2 percent of annual US consumption.
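The two-percent figure can be sanity-checked against total US consumption. The 25.5 trillion cubic feet annual consumption figure below is an assumed approximation (roughly the 2012 level), not a number from the article:

```python
# Rough check of the "about 2 percent" claim. The 25.5 Tcf annual
# US consumption figure is an assumed approximation for 2012.

export_bcf_per_day = 1.4           # Freeport terminal capacity
us_consumption_tcf_per_year = 25.5  # assumed annual US consumption

export_tcf_per_year = export_bcf_per_day * 365 / 1000
share = export_tcf_per_year / us_consumption_tcf_per_year
print(f"{share:.1%} of annual US consumption")  # 2.0% of annual US consumption
```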

Environmental groups have criticized the approval. “Exporting LNG will lead to more drilling — and more drilling means more fracking, more air and water pollution, and more climate-fueled weather disasters like last year’s record fires, droughts, and superstorms,” according to Deb Nardone of the Sierra Club. Nevertheless, it appears that the Obama administration will support hydraulic fracturing and the growth of the natural gas industry.

Shale gas booms in Texas, Louisiana, and Pennsylvania have created tens of thousands of jobs. Low natural gas prices are attracting global chemical firms to build plants in the US. Thousands of additional jobs and tax revenues can come from LNG exports. Sound energy policy demands that fracking and export of natural gas be allowed, if environmental safeguards are met.
