The Banana Doughnut theory is a model in seismic tomography that describes the shape of the Fresnel zone along the entire ray path. The theory suggests that what influences a ray's travel time is the surrounding material, not the infinitesimally thin geometric ray path. This surrounding material forms a tube enclosing the ray but does not include the ray path itself.
The theory gets the name "banana" because the tube of influence along the entire ray path from source to receiver is an arc resembling the fruit. The "doughnut" part of the name comes from the ring shape of the cross-section. The zone of sensitivity is a hollow banana, or a banana-shaped doughnut. This theory is sometimes known as "Born-Fréchet kernel theory".
In optics and radio communications (indeed, in any situation involving the radiation of waves, which includes electrodynamics, acoustics, and gravitational radiation), a Fresnel zone, named for physicist Augustin-Jean Fresnel, is one of a (theoretically infinite) number of concentric ellipsoids which define volumes in the radiation pattern of a (usually) circular aperture. Fresnel zones result from diffraction by the circular aperture.
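That geometric definition can be made concrete with the standard formula for the radius of the n-th Fresnel zone at a point along the path, r_n = sqrt(n * lambda * d1 * d2 / (d1 + d2)). A minimal sketch in Python (the 2.4 GHz link and the 1 km path are illustrative numbers I chose, not from the text):

```python
import math

def fresnel_zone_radius(n, wavelength, d1, d2):
    """Radius of the n-th Fresnel zone at a point between
    transmitter (distance d1) and receiver (distance d2),
    all in consistent units (e.g. metres)."""
    return math.sqrt(n * wavelength * d1 * d2 / (d1 + d2))

# Example: 2.4 GHz radio link, evaluated at the midpoint of a 1 km path
wavelength = 3e8 / 2.4e9          # c / f, exactly 0.125 m
r1 = fresnel_zone_radius(1, wavelength, 500, 500)
print(round(r1, 2))               # first-zone radius, about 5.59 m
```

Anything blocking a large fraction of that first zone measurably attenuates the link, which is why the volume around the path, not the geometric ray, is what matters.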
I'm going to use this story from the LA Times with some of my maps interjected to illustrate some things.
The Mogi doughnut hypothesis, developed by a Japanese seismologist, holds that earthquakes occur in a circular pattern over decades, building up to one very large temblor in the doughnut hole. As UC Davis physicist and geologist John Rundle ponders the map of recent California earthquakes, he sees visions of a doughnut even Homer J. Simpson wouldn't like.
The doughnut is formed by pinpointing the recent quakes near Eureka, Mexicali and Palm Springs.
Here are all the tectonic movements from April 2010-July 2010, in the Gulf area.
...here is the center of the residual gravity in the GoM.
....here is another that outlines both the plate boundaries and the residual gravity center.
Here are the movements before the big banana.
...this map shows the resulting direction of the pressure wave in the Gulf...this is where the banana finally made it through the doughnut...you have a dirty mind.
Here's what we are back to, in terms of movements. Keep in mind they are always moving, but depth is an indicator of movement, as is intensity. The link to the IRIS Earthquake browser is at the bottom of this page.
This is from right before they finally lost control, from April 1st up to today only, in terms of recent seismic events.
Take note that there is a difference in the New Madrid movements between the timeframes.
This is a map of some zones of concern in the Gulf.
Here's where they should be looking for seeps.
Ok...now on to earthquakes...
- The internal field of the Earth (its "main field") appears to be generated in the Earth's core by a dynamo process, associated with the circulation of liquid metal in the core, driven by internal heat sources. Its major part resembles the field of a bar magnet ("dipole field") inclined by about 10° to the rotation axis of Earth, but more complex parts ("higher harmonics") also exist, as first shown by Carl Friedrich Gauss. The dipole field has an intensity of about 30,000-60,000 nanoteslas (nT) at the Earth's surface, and its intensity diminishes like the inverse of the cube of the distance, i.e. at a distance of R Earth radii it only amounts to 1/R³ of the surface field in the same direction. Higher harmonics diminish faster, like higher powers of 1/R, making the dipole field the only important internal source in most of the magnetosphere.
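The inverse-cube falloff is simple enough to sketch; the 30,000 nT surface value comes from the paragraph above, and the rest is arithmetic:

```python
def dipole_field(surface_field_nt, r_earth_radii):
    """Dipole field magnitude (nT) at a distance of r Earth radii,
    in the same direction, falling off as the inverse cube of distance."""
    return surface_field_nt / r_earth_radii ** 3

# With a 30,000 nT surface field, at 2 Earth radii the field
# is 1/8 of the surface value:
print(dipole_field(30_000, 2))  # 3750.0 nT
```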
- The solar wind is a fast outflow of hot plasma from the sun in all directions. Above the sun's equator it typically attains 400 km/s; above the sun's poles, up to twice as much. The flow is powered by the million-degree temperature of the sun's corona, for which no generally accepted explanation exists yet. Its composition resembles that of the Sun—about 95% of the ions are protons, about 4% helium nuclei, with 1% of heavier matter (C, N, O, Ne, Si, Mg...up to Fe) and enough electrons to keep charge neutrality. At Earth's orbit its typical density is 6 ions/cm3 (variable, as is the velocity), and it contains a variable interplanetary magnetic field (IMF) of (typically) 2–5 nT. The IMF is produced by stretched-out magnetic field lines originating on the Sun, a process described in the article Geomagnetic storm.
Here's the link to the solar warnings and activities page; you can see for yourself what's been going on lately on a bigger scale.
Current Conditions :
Space Weather Now
Today's Space Wx
Data and Products
Alerts & Forecasts
Space Wx Models
Top News of the Day: The Space Weather Prediction Center experienced a major system failure on July 13 at 1910 UT. Some systems and products resumed by 2100 UT and GOES data at 2250 UT. The SWPC web site was not accessible until July 14 at 0110 UT.
Current Space Weather Conditions
But anyway...back to planet Earth.
Seismologists call the possible pattern a Mogi doughnut. It's the outgrowth of a concept, developed in Japan, which holds that earthquakes sometimes occur in a circular pattern over decades, building up to one very large quake in the doughnut hole. Rundle and his colleagues believe that the recent quakes, combined with larger seismic events including the 1989 Loma Prieta and 1994 Northridge temblors, could be precursors to a far larger rupture.
They just don't know exactly when.
The idea of predicting earthquakes remains controversial and much debated among California's many seismologists. But as technology improves and the understanding of how earthquakes distribute energy grows, experts are gingerly offering improved "forecasts," some of which have been surprisingly prescient.
For example, Southern California was hit earlier this month by a 5.4 quake that struck in the mountains about 30 miles south of Palm Springs — several weeks after seismologists at the Jet Propulsion Laboratory and elsewhere warned that pressure was building in the San Jacinto fault zone, which is where the temblor occurred.
That forecast underscores new thinking by seismologists about how earthquakes occur.
In the past, experts paid less attention to how one fault was connected to another and how one earthquake could increase the chances of a quake on another fault. But now they believe that these connections are extremely important and that this year's temblors along the Mexican border and near Palm Springs seem to support the concept.
"Previously we would identify a fault, map it and name it," said Lisa Grant Ludwig, a UC Irvine earthquake expert. "What we've really got here is a network of faults. Maybe that's what we need to be thinking: more big-picture."
Seismologists made the forecast about the quake risk south of the Palm Springs area after seeing signs that the 7.2 Mexicali temblor in April had placed more pressure on the San Jacinto fault system, which extends from the border northwest 100 miles toward Riverside and San Bernardino. They were particularly concerned because the San Jacinto fault system connects to the massive 800-mile-long San Andreas fault, which last triggered the "Big One" in Southern California in 1857, leaving a trail of destruction from Central California to the Cajon Pass in the Inland Empire.
David Bowman, a geology professor at Cal State Fullerton, said his research indicates that the Mexicali quake — the largest to strike the region in nearly two decades — was actually triggered by a much smaller quake on an unnamed fault line. The small quake's energy "jumped on another fault and kept on going," causing the much larger Mexicali temblor that was felt all the way to Fresno.
"That fault the earthquake started on is so small, we don't even really know where it is. Yet that small earthquake — that would not have made the news at all — was able to jump onto another fault and become a magnitude 7.2 event," he said.
The big question is whether the Mexicali quake has made a destructive temblor in the L.A. area more likely. Experts see strong evidence that there is more pressure now on the San Jacinto and nearby Elsinore fault networks to the east of Los Angeles. The Elsinore fault zone is connected to the Whittier fault, which runs through densely populated sections of the L.A. area, including the San Gabriel Valley. As a result, there's a concern that a quake on the Whittier fault might be more likely.
The Mexicali quake has also turned into a treasure trove of data for earthquake experts. It comes at a time when quake technology has advanced in major ways. Sophisticated satellite images are being used to study creeping ground movement caused by tectonic pressure in advance of an earthquake.
New GPS ground monitoring equipment is tracking how far the ground has moved after a quake, allowing scientists to calculate locations of greater seismic stress. And research in the mountains west of Bakersfield, examining the tracks of earthquakes hundreds of years ago, is showing that catastrophic earthquakes — those as large as magnitude 8 — have occurred in Southern California more frequently than previously believed.
That brings experts back to the Mogi doughnut.
The idea behind the doughnut is relatively straightforward: Earthquakes in California are basically caused by tectonic movements in which the Pacific plate slides northwest relative to the North American plate. As the plates move, stress builds up along both sides of cracks in the Earth's crust, as if a giant sheet of peanut brittle were being shoved in two directions.
Tectonic stress will first cause ruptures on the smaller faults, because they need less pressure before they break and thus produce small earthquakes. When they do rupture, the tectonic pressure gets transferred somewhere else, moving along like a crack in a windshield.
Ultimately, the stress moves closer to bigger faults that need more pressure to erupt, thus creating larger and larger earthquakes until the "Big One" happens.
"It's a matter of looking at the major earthquakes in California over the last 20, 30, 40 years," said UC Davis' Rundle. "They seem to be occurring everywhere except the major faults — the San Andreas, the Elsinore and the San Jacinto."
Those three faults would be enclosed in Southern California's doughnut hole. Northern California's doughnut hole includes the San Andreas and Hayward faults.
The Mogi doughnut hypothesis was developed in 1969 by Japanese seismologist Kiyoo Mogi, who observed a pattern in which smaller earthquakes seemed to precede larger ones.
Experts stress that the hypothesis is still unproven and not universally accepted. Skeptics say the concept could be applied to seemingly random earthquakes.
Whether or not the doughnut concept proves true, there is a consensus that California is shaking more than in recent years.
That greater activity could presage a larger quake. But the history of earthquake forecasting is littered with bold predictions that prompted more fear than actual earth movement, said Susan Hough, a seismologist with the U.S. Geological Survey in Pasadena. A case in point was the prediction in the 1980s of a devastating quake in Peru.
"The prediction was based on a number of ideas, some more wild-eyed than others," Hough said. "The prediction caused an international incident and a whole lot of real anxiety in Peru."
Perhaps the boldest recent prediction occurred in 2004, when an international research team led by then-83-year-old UCLA professor Vladimir Keilis-Borok said a moderate quake would rattle the California desert during a certain time frame.
The prediction made international headlines and brought Keilis-Borok rock-star attention, especially because it was preceded by apparently accurate predictions, but the deadline passed without a quake.
Now 88, Keilis-Borok continues his work to predict quakes. He shies away from more cautious terms like "forecast," which he said do not accurately reflect what scientists strive to do.
"The word 'forecast' is weak," he said in a Russian accent. "If you call it 'forecasting,' you are just not wanting to take responsibility." Scientists should not wait until forecasts can be made down to the precise moment, he added.
"Suppose you are the minister of the defense, and you are told the enemy is mobilizing its forces and will attack us within a year," he said. "And you tell them, 'No, I don't want to know. Tell me exactly within seconds, and then I will pay attention.' That would be suicide."
....that's the end of the article
So there are two types of "traps" that oil ends up in underground. One is formed in a rock-layer fold, the other in a shear pocket where two layers oppose one another.
...here is what a fold trap looks like in the Gulf of Mexico.
Here are the different types of oil, in terms of their relative ages, and the locations where they reside.
Here is a map of all the minor fractures, and the areas and channels oil can flow to, on a larger scale
Oil companies many times tap into the edge of a field and not directly into a deposit, for fear of unstoppable pressure. BP drilled into the edge of the field in the hopes the pressure would be less. They were wrong.
I suspect they used Halliburton's seismic imaging equipment. Smooth move guys.
...This is a composite of the above two maps, so you can see what can flow where, and also get an idea of where BP drilled in.
This is an energy map showing who does what and where.
Here's a map showing only BP's operations and joint collaborations.
This one is a map of difficult areas for petroleum mining companies, and why. Note that the area where the Horizon is/was specifically says H2S gas and higher temperatures and pressures.
Now, let's take a look at what kind of oil this is and some of the things in it....and Halliburton. I love you guys. I want to also talk about your special frac'ing fluids....that you normally use in the deep-water areas.
Hi-E oil, sour.
High sand/particulate content (60%), high H2S gas (0.435 wt%), lots of water, and nitrogen at 1,900 ppm.
Sand control plays a key role in production from the Gulf of Mexico (GoM). With some Lower Tertiary reservoir depths in excess of 30,000 ft (9,144 m), and reservoir interval lengths up to 1,500 ft (457 m), innovative completion methodologies must be devised to meet the challenges of the ultra deepwater sandface.
The Lower Tertiary formation can be described as having a permeability range of 10 md - 50 md and bottomhole reservoir pressures from 20,000 psi to 25,000 psi. Compared with deepwater Miocene reservoirs, a longer fracture length is required in this lower-permeability formation to optimize production rates. As previously mentioned, these wells can have up to five pay intervals ranging in thickness from 100 ft (30 m) to over 300 ft (91 m). Fracturing such long intervals requires high pump rates. Because of the bottomhole pressures and potential high fracture gradients of these wells, the surface treating pressure can easily exceed the 15,000 psi working limits of conventional offshore pumping equipment and treating lines when using standard fracturing fluids.
While technically feasible, surface treating pressures above 15,000 psi may pose safety risks for service personnel on the vessel and rig. Such high pressure also could be a burst problem for the marine riser or casing below the subsea wellhead in the event of a shallow leak in the workstring without the benefit of sufficient hydrostatic support pressure. Halliburton's solution to the challenges of treating formations with high bottomhole pressure is DeepQuest, a proprietary fluid that uses gravity to reduce the amount of surface treating pressure required to achieve adequate fracturing pressure to below 15,000 psi. The specific gravity of a conventional fracturing fluid is 1.00 to 1.04, while DeepQuest can have a specific gravity of 1.14 to 1.49 based on well conditions.
This provides a 3,000 – 3,500 psi surface treating pressure reduction. In addition, the fluid does not require a specialized 20,000 psi stimulation vessel or treating lines. All of Halliburton’s stimulation vessels are fully capable of pumping the fluid which reduces the possibility of waiting on equipment. Twenty DeepQuest jobs have been pumped since 2004, with two of the treatments performed for a major operator in the Lower Tertiary formation.
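As a rough sanity check on those numbers: the surface-pressure reduction is just the extra hydrostatic head of the denser fluid column, roughly ΔSG × 0.433 psi/ft × depth. A sketch, where the 25,000 ft vertical depth and the mid-range specific gravities are my illustrative assumptions (the 1.00-1.04 and 1.14-1.49 SG ranges are from the text):

```python
PSI_PER_FT_PER_SG = 0.433   # freshwater hydrostatic gradient, psi/ft

def extra_hydrostatic_psi(sg_fluid, sg_conventional, tvd_ft):
    """Extra bottomhole pressure contributed by a heavier fluid
    column versus a conventional fracturing fluid, in psi."""
    return (sg_fluid - sg_conventional) * PSI_PER_FT_PER_SG * tvd_ft

# Assumed 25,000 ft true vertical depth, SG 1.32 vs 1.02:
print(round(extra_hydrostatic_psi(1.32, 1.02, 25_000)))  # ~3248 psi
```

A 0.3 difference in specific gravity over a column of that length lands squarely in the 3,000-3,500 psi range the article quotes.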
Subsalt brings more drilling in HP/HT environments
Published: May 1, 2010, by Brian Skeels
Improved logging and drilling techniques contribute to the significant and growing success rate of finding deep and subsalt reservoirs around the world. This trend brings to the forefront the need for high-pressure/high-temperature (HP/HT) hardware sooner to access and develop this latest realm of geologic formations.
While the development of HP/HT equipment (rated over 10,000 psi [69 MPa] and/or 250 °F [121 °C]) has been significant, the requirements are piecemeal, project driven and proprietary. Wellhead equipment and trees are approaching standards of 15,000 psi (103.4 MPa) and/or 350 °F (177 °C) service while downhole, logging and isolation equipment are working in environments approaching 30,000 psi (207 MPa) and 500 °F (260 °C).
One reason for the disparity is the growing distance between reservoir and wellhead, and the associated guesswork in predicting flowing wellhead conditions from limited fluid and fluid flow properties at the sand face. Several questions arise over how to define appropriate pressure and temperature ratings along with surface environmental (in air or subsea) pressure and temperature ratings—the latter introducing fatigue-cyclic loading conditions not seen downhole.
This great divide may provide a window on material selection, operating, and maintenance service life (as opposed to overall design life), and corrosion rates at elevated temperatures. Heavy oil production also may hold clues on material performance. Some heavy-oil equipment works at temperatures approaching 650 °F (343 °C). Current downhole and "hot" heavy-oil production reveal clues for design guidelines to help the HP/HT wellhead community keep pace with reservoir advances in extreme and ultra HP/HT environments.
[Figure: Overpressure geology and its effect on pore pressure (from ADS HPHT Well Control Course [top] and DOT 2008 [bottom]).]
Operator requirements for drilling and completions systems push the boundary of known, proven, and delivered technology and equipment. As these boundaries get pushed further, the need for a new higher pressure, higher temperature, and higher casing capacity subsea wellhead system grows. This need grows as modifications to existing technology are useful in fewer and fewer applications. Additionally, as drilling in the Gulf of Mexico continues to push into deeper, higher pressure formations, the need for increased suspension capacity under the duress of hostile, corrosive environments is equally important. Downhole equipment development and fluid rheology are the tip of the spear for HP/HT technology because of its proximity to the source – the HP/HT reservoir. Unfortunately, reservoirs do not reveal their secrets willingly.
Establishing system requirements
Many HP/HT reservoirs are being found in subsalt (or presalt) locations under abnormal pressure gradients and folds. Depth of the salt play above the reservoir also plays tricks with the estimated reservoir temperature and overburden pressure. Interestingly, thicker salt regions coincide with more remote, deeper water locations, which skew requirement needs. Wellhead equipment manufacturers of subsea and surface hardware are seeing a roughly 5,000 psi and 50 °F disparity in requirements for generally the same reservoir depth, making it more difficult to plan for technology needs and timing.
Another vexing problem for HP/HT requirements is properly defining the equipment's environment. Most oilfield equipment codes, both wellhead and downhole, use "wellbore temperature" to define the equipment's rated temperature class. Later, when equipment migrated to the arctic, operators added a much lower temperature class to equipment requirements as a way to ensure the pressure containing equipment had the appropriate heat treatment and ductility/toughness against cold brittle fracture when the well was not flowing. This practice continues today where "all" equipment is assumed to need to function at both upper and lower defined limits. However, this does not properly address proximity. Some components obviously are closer to the wellbore and so exposed to higher temperatures. Peripheral components, those away from the wellbore, operate at more consistent ambient conditions regardless of wellbore temperature.
This sets up a huge rating conundrum. Does one rate peripheral equipment at one temperature and in-close components at wellbore temperatures? How about when they are adjacent or touching? How do you re-rate peripheral equipment? Should it be insulated to eliminate heat sink outlets? Finally, what do you rate the temperature class for the entire assembly of components? Looking at the system requirements based on the environment and wellbore might establish a different design paradigm. Work with the environment, not against it.
Downhole equipment usually works in a narrower temperature range since it experiences roughly the same rated pressure and temperature both internally and externally. Stability and longevity are keys to downhole success. Typically, the high temperature condition sets the requirements for design and qualifying procedures. These conditions usually remain "high" for extremely long periods of service and usually are the mindset behind an operator's temperature class requirements. Therefore, downhole equipment may not hold too many insights for other production equipment designs. In fact, the narrow focus might restrict how we view the performance of ancillary equipment. A "one-temperature-fits-all" approach may overcomplicate or defeat the possibility of finding a workable mechanism.
Obey or ignore the environment?
Early forays into HP/HT reservoirs peaked in 1982 when roughly 1,200 wells were drilled below 15,000 ft (4,570 m) in the United States. Although the number of HP/HT wells drilled decreased, the target depth steadily increased over the next 10 years reaching 22,500 ft (6,860 m) and recently 33,000 ft (10,060 m). This trend was encouraged by the promise that deeper wells offered greater production capability. Deep reservoirs make up less than 1% of the total number of wells, but account for nearly 7% of domestic production.
Conditions were considered well beyond the historical limits of "sour" service set by NACE standards, and boiler-code conditions had to address the extreme bottomhole temperatures. Conservatism ruled in planning and drilling these wells, calling for the highest grade, strongest materials available. Most notably, wells were designed with high-nickel-content CRA tubular goods to mitigate well control catastrophes caused by cracked casing succumbing to the environment. Proximity to population centers and local regulations also mandated their use. CRAs were considered the best choice because corrosion rates for carbon steels would be an order of magnitude greater than practical corrosion allowances.
Recent activity in deep gas plays offshore in the GoM looks at that same well construction philosophy from a completely different perspective – using carbon steel alloys instead of CRAs. How does this happen, given that many HP/HT wells take a year or so to drill and complete? Surface and upper intermediate casing strings feature 55-85 ksi grade material, while deeper casing strings are being set as multiple liners using 125 ksi grade materials (the higher grades being down-rated 10-15% for wellbore temperature to achieve the necessary load requirements).
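The parenthetical de-rating is worth making explicit; a sketch of the arithmetic (the 125 ksi grade and the 10-15% figures come from the paragraph above):

```python
def derated_yield(nominal_ksi, derate_fraction):
    """Yield strength (ksi) after temperature de-rating by the
    given fraction."""
    return nominal_ksi * (1 - derate_fraction)

# 125 ksi casing de-rated 10-15% for wellbore temperature:
print(derated_yield(125, 0.10))  # 112.5 ksi
print(derated_yield(125, 0.15))  # 106.25 ksi
```

So the design has to carry its loads with roughly 106-112 ksi of usable yield strength, not the nominal 125.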
...The problem there is that binary fluids like crude oil can hit localized pockets of 200k psia.
...very unpredictable, very rapid changes in density.
The question is: why is carbon steel surviving? The answer could be the environment. Carbon steel pipelines and production tubing in hot, high-sulfur wells in the Kashagan field are surviving 20% hydrogen sulfide conditions because of the high sulfur, low oxygen environment. Pipeline maintenance shows this condition creates a ferrous sulfide coating in the bore, which (if left undisturbed) reduces the corrosive metal loss rate from 40 mpy to 5-10 mpy.
Canadian heavy oil wells exhibit similar equipment survivability using mild service (material class AA), low cost carbon steels for wellhead and completion hardware. Asphaultine-like wellbore fluids produced from these wells constantly coat the bore of the tubing and well control equipment. The well's relatively high-producing temperature "cooks" this coating onto the pipe to give it a protective lining. As with Kashagan, it is a brittle covering, but if left undisturbed, greatly reduces the corrosion rate.
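Those corrosion rates translate directly into service life against a wall-thickness allowance. A sketch, assuming a hypothetical 1/8-inch corrosion allowance (the allowance is my assumption; the 40 mpy bare-steel and 5-10 mpy filmed rates are from the Kashagan discussion above):

```python
def years_to_consume_allowance(allowance_in, rate_mpy):
    """Years to consume a corrosion allowance (inches) at a given
    metal-loss rate in mils (0.001 in) per year."""
    return allowance_in * 1000 / rate_mpy

# Assumed 0.125 in (1/8") corrosion allowance:
print(years_to_consume_allowance(0.125, 40))  # ~3 years, bare carbon steel
print(years_to_consume_allowance(0.125, 5))   # 25 years with sulfide film
```

The order-of-magnitude jump in life is what makes the undisturbed ferrous sulfide coating the whole story.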
High temperature also may be an ally from another perspective. It is common practice to post-weld heat treat welds for stress relieving and "baking out hydrogen" at temperatures between 400-800 °F (205-425 °C). Work string tubular goods used in workovers routinely are baked at these temperatures after a sour well job to get rid of residual hydrogen. Extreme and ultra HP/HT reservoir temperatures approach this same temperature range, suggesting that high-strength tubular goods may avoid embrittlement problems because the higher ambient temperature allows hydrogen to pass through the steel's molecular lattice structure.
Because of severe requirements and the lack of sufficient data under HP/HT conditions, the industry is limited in its ability to produce oil and gas from these wells. In spite of this, the industry is drilling and logging HP/HT wells with renewed vigor. Drilling these wells has been accomplished using clever techniques which challenge established norms. If current standards and practices are to be upheld, the industry must begin immediately to put HP/HT developments on a scientific footing instead of relying on old rules.
Because it is difficult to accurately define conditions, some companies err on the side of costly and conservative approaches to accessing deep HP/HT reservoirs, or else abandon the attempt altogether. Others choose to ignore the possibly hazardous conditions and forge ahead because there is no concrete answer.
The oil industry's continued can-do spirit shows that the HP/HT environment might be used to our advantage rather than treated as a problem. Limiting exposure rather than assuming "last forever – no maintenance" extends conventional designs and makes it possible to use more economical materials and seal technology. The old "20-year design life" mantra has to be replaced with a practical "operating life" philosophy. Lowering the life expectancy of materials to a tolerable obsolescence lessens the design challenge and contradictions (strength versus ductility, years/corrosion allowance versus cycles, near wellbore versus near environment location, high temperature operation versus low temperature shut-in, etc.). Yet, in light of the convention-stretching attitude of the best wildcatters there still is a ways to go. Design methodology appears to be maturing in time to meet HP/HT market needs. Insufficient materials data under HP/HT conditions relegates the industry to educated supposition or technical risk mitigation based only on experience.
So that's that. That's why Halliburton went with seawater, instead of the normal fluid they use in deepwater wells. Too much pressure, too much heat....too little effort, too late to stop.
Ok...moving on. 1,900 ppm of nitrogen gas, and a methanol content of <20 ppm.
The gulf is an ancient meteor strike caldera. Read the following short paper on nitrogen hydrates.
" HEAVY NITROGEN IN HYDRATED CLASTS IN A CH CHONDRITE."
Here's the location of the meteor strike...one of the largest ever to hit on this planet.
Here's the 3-D image of the crater of the ocean floor.
From the paper quoted above the pictures, I see that nitrogen hydrates are found in the presence of carbonate, iron sulfide, and magnetite. In the analysis of the oil, there is:
Magnesium = 195 ppm (mg/kg)
Iron = 64 ppm (mg/kg)
Calcium = 307 ppm (mg/kg)
there's also:
Boron = 3.4 ppm (mg/kg)
Potassium = 5.9 ppm (mg/kg)
Manganese = 1.1 ppm (mg/kg)
Nickel = 1.1 ppm (mg/kg)
....and Titanium, Phosphorus, Zinc, Molybdenum, Vanadium, etc., etc...all things found in meteor-strike calderas.
Moving on to H2S gas. From the paper below, from studies sponsored by BP, I see that hydrates are easily formed in the presence of H2S gas...but also rapidly released in the presence of methanol (<20 ppm in this crude). How about that? The H2S is 0.435 wt%.
Extensive studies on hydrate models have shown that the vast majority of naturally occurring fluids will form hydrate structure II. Some fluids, those with a particularly high methane or H2S content, may form hydrate structure II at low pressures, but structure I may become more stable as the pressure increases.
Natural gas hydrates are solid ice-like compounds of water and the light components of natural gas. Also, some heavier hydrocarbons found in gas condensates and oils are known to form hydrates if smaller molecules such as methane or nitrogen are present to stabilise the structure. Hydrates may form at temperatures above the ice point and are therefore a serious concern in oil and gas processing operations.
The phase behaviour of systems involving hydrates can be very complex because up to seven phases must normally be considered, even without considering the possibility of scale formation. The behaviour is particularly complex if there is significant mutual solubility between phases, e.g. when inhibitors or CO2 are present.
While salt will always remain in the aqueous phase, the thermodynamic inhibitors will also be soluble in the hydrocarbon phases. This is particularly the case for methanol, which is the most volatile. As the inhibition effect of methanol is determined by the amount of methanol in the aqueous phase, determining how much is lost to the hydrocarbon phases is an important calculation.
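A classic first-pass estimate of that inhibition effect is the Hammerschmidt correlation, ΔT = K·W / (M·(100 - W)), where W is the inhibitor weight percent in the aqueous phase, which is exactly why the methanol lost to the hydrocarbon phases matters. A sketch (K = 1297 is the commonly quoted degrees-Celsius-basis constant; this is a rough engineering correlation, not something from the text above):

```python
K_HAMMERSCHMIDT = 1297.0   # empirical constant, deg C basis
M_METHANOL = 32.04         # molar mass of methanol, g/mol

def hydrate_depression_c(wt_pct_in_water, molar_mass=M_METHANOL,
                         k=K_HAMMERSCHMIDT):
    """Hammerschmidt estimate of hydrate-formation temperature
    depression (deg C) for an inhibitor at the given weight
    percent in the aqueous phase."""
    return k * wt_pct_in_water / (molar_mass * (100 - wt_pct_in_water))

# 20 wt% methanol in the water phase:
print(round(hydrate_depression_c(20), 1))  # ~10.1 deg C
```

If half the methanol partitions into the hydrocarbon phases, the aqueous concentration, and with it the protection, drops sharply, which is the point the paragraph is making.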
From the D.O.E.
Characterization of Natural Hydrate Bearing Sediments and Hydrate Dissociation Kinetics
Before large-scale commercial recovery of natural gas from hydrates can be attempted, important issues regarding reservoir stimulation techniques, safety, and cost must be addressed. Reservoir modeling is an important tool that can be utilized to help address these issues. However, application of these modeling tools requires access to reliable thermodynamic, kinetic, and physical property data for gas hydrates and physicochemical properties of the hydrate-bearing sediments themselves. Infrared data collected on hydrate core samples during the Ocean Drilling Program (ODP) Leg 204 expedition (Hydrate Ridge) will be analyzed to quantitatively assess hydrate abundance and compare hydrate occurrence as determined from well logs and other hydrate proxies. Laboratory studies will also be conducted on natural hydrate-bearing core samples from Mallik, Hydrate Ridge, the Gulf of Mexico, and India. Measurements will include x-ray microtomography, gas permeability, resonant ultrasound, Raman and NMR spectroscopy, and environmental scanning electron microscope (ESEM) analyses coupled with a mass spectrometer for determining methane abundance and location on a grain-to-grain scale.
A key issue in analyzing hydrate samples retrieved from the field is pressure loss during handling and the unavoidable decomposition of hydrate. The effects of partial hydrate decomposition on the measured properties of core samples are presently very poorly understood. A much better fundamental understanding of the effects of hydrate saturation and thermal and pressure cycling on the properties of hydrate-bearing sediments is needed. These studies of hydrate dissociation kinetics will make unique contributions toward a better understanding of gas hydrates in porous media. Results from these pore-scale focused studies will be compared and contrasted with complementary core-scale studies. The results will give researchers the tools and information to begin understanding the microscopic details of the molecular assembly of gas hydrates in porous media, and to explain why laboratory-grown samples do not behave like those produced in nature. It is essential to understand the limitations of laboratory-grown samples and the petrophysical data derived from them in order to make reliable predictions of reservoir behavior during production. In addition, the model resulting from this effort can be used to more accurately predict the transient response of a gas hydrate reservoir to pressure and temperature perturbations and the subsequent effect on hydrate stability in marine and terrestrial accumulations."
Below is the well-control-loss timeline, and some recent seismic imaging done in the area of the wellhead, from Alexander Higgins' blog.
Details of the timeline of the Macondo well blowout are as follows:
- February 17-23: BP reported cracks in the well casing and leaking of hydrocarbons into the surrounding rock formation. It took BP three attempts to fill the cracks with cement before the well control event was brought under control.
- March 2-5: BP reported another well control event that took 3 days to bring under control.
- March 8-14: BP reported a well control event that took 7 days to get under control. A series of BP internal emails released by Congress showed that the surrounding rock formation actually collapsed on the drill.
- BP was then given special permission to cement the well at a shallower depth than normally required, because the hole had caved in on the drilling equipment.
- Another BP email released by Congress showed that an unnamed MMS official, at 11 p.m. on March 11, gave the special permission to insert the cement plug about 750 feet above the bottom of the hole.
- April 4-7: Well control event took 3 days to bring under control.
- April 20: Well control event leading to explosion and collapse of the Deepwater Horizon oil rig.
Here are the results of the seismic imaging done by the Thomas Jefferson, also from Alex's blog.
Oil and/or natural gas leaks (red and yellow columns) mapped by the Thomas Jefferson, and by the Gordon Gunter (purple cylinders), along with CTD stations showing high fluorescence or possible oil and gas anomalies (brown, green and white spheres). The Deepwater Horizon well site is in the background (red cylinder), and the distribution of Bottom Following Reflectors is represented by orange lines.
NOAA has said that the leaks on the sea floor graphed in the 3D model above “appear to be pre-existing seeps that occur naturally and are unrelated to the spill” and has labeled the leaks as such in the Thomas Jefferson report.
But how NOAA came to the determination that these leaks on the sea floor are natural seeps needs to be questioned.
Salt, unfortunately, is very difficult to image through, and the Gulf is practically solid salt.
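To put a rough number on why salt is so hard to image through, here is a minimal Snell's-law sketch. The velocities below are assumed typical values for water-saturated sediment and salt, not measurements from the Gulf: because salt transmits seismic waves so much faster than the sediment above it, any ray striking the top of the salt beyond a fairly small critical angle is totally reflected, so only a narrow cone of energy ever penetrates to illuminate what lies below.

```python
import math

# Assumed typical P-wave velocities (m/s); illustrative only,
# not measured Gulf of Mexico values.
v_sediment = 2500.0
v_salt = 4500.0

# Snell's law: sin(theta_c) = v_slow / v_fast gives the critical
# incidence angle (from vertical) beyond which energy is totally
# reflected at the top of the salt.
theta_c = math.degrees(math.asin(v_sediment / v_salt))
print(f"critical angle at the sediment-salt interface: {theta_c:.1f} deg")
# -> critical angle at the sediment-salt interface: 33.7 deg
```

With these assumed velocities, anything arriving more than about 34 degrees off vertical bounces off the salt top instead of passing through it, which is one reason subsalt reflectors come back so poorly in the data.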
Bad data is what makes the risks even more severe. I can't stand incompetence, myself.
Here's what good data looks like.
...thought I'd add the following article from 2006.
Gulf of Mexico saturated with oil?
Another find preceded this week's major discovery
Posted: September 08, 2006
1:00 am Eastern
© 2010 WorldNetDaily.com
Chevron's announcement this week that the Jack Field, located in the Gulf of Mexico 270 miles southwest of New Orleans, may have as much as 15 billion barrels of oil was not the only recent find of oil in the Gulf. In March, Mexico announced the discovery of a huge new oil find, the Noxal Field, some 60 miles from the port of Coatzacoalcos on the coast of Veracruz state. Estimated to contain as much as 10 billion barrels of oil, the find could well be larger than Cantarell, Mexico's biggest oil field, near Yucatan.

Like the Jack Field discovery, the Noxal Field is a deep-water find, relying on new drilling technology. Chevron is drilling the Jack Field under some 7,000 feet of water in a 28,175-foot well, in total nearly seven miles under the surface of the Gulf. The Noxal find was also deep-water, though somewhat less so than the Jack Field, beneath a little more than half a mile of water and a further two and a half miles underground.

"The new deep-water finds in the Gulf of Mexico are more validation for what we wrote in 'Black Gold Stranglehold: The Myth of Scarcity and the Politics of Oil,'" co-author Jerome R. Corsi explained to WND. "The deep-earth, abiotic theory – that the origin of oil has nothing to do with biological material – argues that oil is abundant at levels deep within the earth."

Co-author Craig R. Smith pointed out that "all of these Gulf of Mexico oil finds are at deeper levels than traditional-thinking 'fossil fuel' geologists typically looked."

"Moreover, these finds call into question the 'peak oil' theories that we are running out of oil," he said. "When huge new finds are being made in the Gulf, why does President Bush continue to believe we must prepare for a world running out of oil?"

Even before the new Mexican discovery, the Energy Information Agency's own figures estimated proven world oil reserves at 1.28 trillion barrels, more than ever in human history, despite world consumption nearly doubling since the 1970s.

Currently, oil is plentiful on world markets and the price has fallen under $70, despite the continuing uncertainty over Iran's enrichment of uranium in defiance of the U.N. Security Council.

The Yucatan seabed is believed to have been deeply cracked by the impact of the huge Chicxulub meteor that killed the dinosaurs at the end of the Mesozoic Era. The Chicxulub impact crater is massive, estimated to be 100 to 150 miles wide. The seismic shock of the meteor fractured the bedrock below the Gulf and set off a series of tsunamis that caused a huge section of land to break off and fall back into the crater underwater.

"It is possible that the Chicxulub meteor fractured much of the bedrock in the Gulf of Mexico," Corsi told WND. "Who knows how much oil will be found in the Gulf of Mexico? 100 billion barrels? 200 billion barrels? Nobody really knows. We have just begun to explore the Gulf at deeper levels for oil."

The deep-earth, abiotic theory of oil's origin argues that oil forms within the mantle of the earth and enters the sedimentary rock layers through fractures in the bedrock.

"The Gulf of Mexico is approximately 550 miles measured north to south," Smith said. "The Gulf has a long shelf all around the perimeter. Deep-sea drilling is the fastest growing segment of the oil industry. We should expect new finds of huge oil deposits beneath the Gulf floor for many years to come."

Earlier this year, Cuba announced plans to hire the communist Chinese to drill for oil off Key West, Fla. The move was made possible by the 1977 agreement under President Jimmy Carter that created for Cuba an "Exclusive Economic Zone" extending from the western tip of Cuba to the north, virtually to Key West.

"If Cuba and communist China believe they too can find oil in the Gulf, we should pull out all the stops," argues Smith. "We may be able to bring the price of gasoline down under $2 a gallon if oil can be found in these huge quantities within our territorial waters.
It's crazy to think we should be dependent on foreign oil when we've made Mexico our number two supplier of oil with the reserves Mexico has found in the Gulf."

The Banana-Doughnut theory makes perfect sense. Matter follows the same laws regardless of scale; that's my belief, maybe slightly ipse dixit. It's no different from what happens in a bubble that undergoes cavitation: the exact same action, an unwillingness to compress, except here the confining force is gravity rather than a gas in a denser fluid. Get it? More force externally equals collapse internally. Why do people have such a hard time understanding this?

Links:
http://www.infochemuk.com/publicat/leaflets/hydrates.pdf
http://www.fugro-jason.com/readingroom/techpapers/reprint_NTG_from_P_S_2002_TLE.pdf
http://www.offshore-mag.com/index/article-display/8247014411/articles/offshore/volume-70/issue-50/drilling-__completion/subsalt-brings_more.html
http://www.latimes.com/news/custom/scimedemail/la-me-earthquake-forecast-20100718,0,1789070.story?page=2&track=rss
http://www.offshore-mag.com/index/article-display/9070614400/articles/offshore/volume-70/issue-2/drilling-__completion/sandface-completions.html
http://en.wikipedia.org/wiki/Magnetosphere
http://en.wikipedia.org/wiki/Fresnel_zone
http://en.wikipedia.org/wiki/Banana_Doughnut_theory
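As for the Banana-Doughnut geometry itself: the "banana" shape falls straight out of the standard Fresnel-zone radius formula (the same one behind the Fresnel zone article linked above). A quick sketch, with purely illustrative numbers (a ~1 Hz P wave at roughly 10 km/s gives a ~10 km wavelength; the 1000 km path length is an arbitrary choice):

```python
import math

def fresnel_radius(wavelength, d1, d2, n=1):
    """Radius of the n-th Fresnel zone at a point d1 from the source
    and d2 from the receiver (all lengths in the same units)."""
    return math.sqrt(n * wavelength * d1 * d2 / (d1 + d2))

# Illustrative numbers only: wavelength ~10 km, 1000 km total path.
lam = 10.0
for d1 in (100.0, 500.0, 900.0):
    r = fresnel_radius(lam, d1, 1000.0 - d1)
    print(f"{d1:5.0f} km from source -> zone radius {r:4.1f} km")
# ->   100 km from source -> zone radius 30.0 km
#      500 km from source -> zone radius 50.0 km
#      900 km from source -> zone radius 30.0 km
```

The zone is widest at the midpoint of the path and pinches toward the source and receiver, which is exactly the fat-in-the-middle arc that gives the theory its "banana"; the "doughnut" cross-section, as described at the top of this post, comes from the sensitivity dropping to near zero on the geometric ray itself.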