Oil and gas production comes from oil and gas exploration. By the time a drop of oil is produced, decades will have gone into discovering, delineating and developing a subsurface reservoir. Exploration and field development require a long-term analysis of seismic images, rock and fluid properties from well logs, core data and any other remote sensing data, surface analogues, past analyses or general knowledge. The result of interpretation is a conceptual, visible and dynamic model of a hydrocarbon reservoir in its overall geologic context, along with a prognosis of rock type and structure, fluid in the rock pores and drilling recommendations. Easy, right?
Wrong. Our data and processes get larger and more varied, and the way we work currently has not been cutting it for a while, even with great advances in scientific analyses, computing and communications technologies.
“We need a technology, a process and a skill set that allow data to be accessed from many disciplines, merged into integrated databases, and quantified in terms of their sensitivities to [material] properties, with appropriate measures of uncertainty attached to all calculation steps.” – Bob Hardage, 2012 Society of Exploration Geophysicists president
I confess I’ve been writing this post since returning from the Society of Exploration Geophysicists conference and its day-long IQ Earth: 21st Century Interpretation workshop (thank you and bravo, Ron Masters!) at the beginning of November. A month later, I just had to stop and ask, “What do I really want for the future of seismic interpretation?” To find oil smartly, I want the following (tagged with key concepts Bob used in his statement above and some more of my own):
– competing, inexpensive, Open Source, light and unbundled software, and the attendant hardware and networking, that quickly gets and puts any imaginable data type from and to large stores and that centers on creating and updating an Earth Model (see the sketch after this list). The user should be able to conduct multi-dimensional seismic QC, processing, volume interpretation, well log analysis, rock property estimation, seismic inversion, velocity updating, well planning, reservoir model building and sizing, dynamic simulation, history matching, reserves calculations, reservoir surveillance, statistical analysis and team/management communication at each step along the way [technology],
– well-rounded scientists and engineers, and employers who reward them for their multi-disciplinary outlook and work [skill set and mindset],
– subsurface IT groups that consist of IT nerds well-versed in the real needs of oil and gas scientists and engineers [skill set],
– physical workplaces in which ideas and results are readily exchanged [environment and process], and
– universities in which students learn from mathematics, science, engineering and art professors who come together to seamlessly meld quantitative and computing skills with scientific knowledge and knowledge-sharing techniques [skill set].
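To make the "light and unbundled" point concrete, here is a minimal sketch, assuming the open-source segyio library and a hypothetical post-stack SEG-Y file, of pulling a seismic volume straight into a plain NumPy array where any tool you like can work on it:

```python
# Minimal sketch: read a post-stack SEG-Y volume into a NumPy cube
# with the open-source segyio library. The filename is hypothetical.
import segyio

with segyio.open("survey_poststack.sgy", "r") as f:
    print(f.ilines[0], f.ilines[-1])    # inline range of the survey
    print(f.xlines[0], f.xlines[-1])    # crossline range
    print(f.samples[0], f.samples[-1])  # time/depth axis
    cube = segyio.tools.cube(f)         # 3D array: (inline, crossline, sample)

# From here, numpy, scipy, matplotlib or your own code can slice,
# filter and display the volume: no monolithic suite required.
print(cube.shape, cube.dtype)
```

No license server, no bundled workstation suite: one small library, one array, and every downstream step in the wish list above is a function call away.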
The sad, unerring truth here, and what has kept me from writing this post on the future of seismic interpretation, is that none of it is futuristic. IT IS ALL POSSIBLE RIGHT NOW, and we will still be talking about this in 2112 as long as we don't just do it. Can we, though? While human inefficiency arises from turf and an unwillingness or inability to change, the crux of the problem is that the business ends of academia, companies and software providers are at odds with their more important missions of teaching, finding oil and creating solutions. The resulting irony is that in the quest for competitiveness (whatever that means any more), confidentiality and maximum profit, much oil is left in the ground or not found at all.
This is not a revolution of thought on my part, and many open-minded entities with elegant, nimble and evolving tools wait in the wings, but it will be hard to get moving again as long as "solutions" come from those with the largest market share (coincidentally those who have the most to gain from keeping the problem alive) and efforts are centralized. So, this is what I think the future of seismic interpretation holds, regardless of what you and I want. That:
– subsurface analysis will be more automated as well as crowdsourced as data sets grow in size and fewer people are available to work them. Are companies going to open up their data stores for anyone to see and work on? Sure, look at Google's reCAPTCHA and how they have circumvented ownership and intellectual property roadblocks by distributing the problem in pieces. Granted, the search for oil is not SETI@Home, GalaxyZoo or any other distributed computing project, in that it is a for-profit venture with a complicated data-ownership and data-licensing structure and a whole world of project-management implications, but this is where it will have to turn in order to get work done (a toy sketch of the idea follows this list).
– following from the previous, data will (have to) become more open, acquired and hosted by governments (aided by professional societies like *cough*SEG*cough*) that begin to understand the value and potential of their subsurface. For instance, I don't think Data.Gov understands that Big Geodata is more than TIFFs of seismic lines and geospatial layers: it is massive multi-dimensional volumes and their sub-products.
– those with multi-disciplinary, quantitative (hard mathematics) and problem-solving skills will not come from traditional universities, as accessible learning moves away from the physical campus and classroom to Massive Open Online Courses (MOOCs). With a primarily geoscience background, I myself have signed up for and participated in the Artificial Intelligence (now Udacity) and Codecademy courses to supplement my work and also simply to grow my brain. Note: Classroom learning, like paper books, is swell for those who have access to it. The main reason I joined a traditional four-year university system after high school at all was for access to the field and laboratories. Community labs and university-independent field geology trips are not far off.
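Here is the promised toy sketch of reCAPTCHA-style distribution applied to seismic, with hypothetical names and tile sizes: carve a large volume into small, independently workable chunks so that no single worker ever holds (or needs to license) the whole survey.

```python
# Toy sketch: split a 3D seismic volume into independent work units
# that can be farmed out, reCAPTCHA-style. The function name, tile
# sizes and the zero-filled stand-in volume are all hypothetical.
import numpy as np

def tile_volume(cube, tile=(64, 64, 256), overlap=8):
    """Yield (origin, sub-volume) work units from a 3D cube."""
    step = tuple(t - overlap for t in tile)
    ni, nx, ns = cube.shape
    for i in range(0, ni, step[0]):
        for x in range(0, nx, step[1]):
            for s in range(0, ns, step[2]):
                chunk = cube[i:i + tile[0], x:x + tile[1], s:s + tile[2]]
                yield (i, x, s), chunk  # the origin lets results be stitched back

cube = np.zeros((256, 256, 1024), dtype=np.float32)  # stands in for a survey
units = list(tile_volume(cube))
print(f"{len(units)} independent work units from one volume")
```

The hard parts, of course, are not the slicing but the anonymization, licensing and stitching that have to wrap around it.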
Think on this and let’s discuss other possibilities and sticking points.
***
Here is a list of the talks in the workshop that I attended, annotated with items of interest from each. One of these days, all workshops will renounce what I call the PowerPoint Pulpit and adopt an inclusive and active approach of people gathered around a truly interactive presentation and discussion.
1. Geological Heterogeneities Characterization through 4D Seismic Interpretation and their Integration into the Reservoir Model – Henri Houllevigue, Total
- Volumetric geobody interpretation of repeat seismic surveys to understand vertical and lateral dynamic communication of reservoirs and to improve well injection/production efficiency.
- Integrated Reservoir Surveillance helped early planning and reduced cycle time down to 6 months.
- Uncertainty: Updating and re-updating the model is not trivial.
2. Real-time Updating of Integrated Earth Models to Mitigate Drilling Risk – Huyen Bui for Andy Hawthorn, Schlumberger
- Wellbore instability is 40% related to geomechanics – adopt Earth Model early and update with uncertainties
- Dogma ate my well: Never fall in love with your model. (All models are wrong, some are useful.)
- Corporate behavioral change is required.
- I really liked the multiple realizations of models corresponding to different velocity and structural uncertainties. Show them all equally (a toy illustration follows).
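On that note, a toy illustration, with entirely hypothetical numbers, of what "show them all equally" can mean in practice: depth-convert one horizon time pick under many velocity realizations and report the spread rather than a single best estimate.

```python
# Toy illustration: one horizon pick, many velocity realizations.
# All numbers here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

t_horizon = 2.400              # two-way time of the pick, in seconds
v_mean, v_std = 2900.0, 150.0  # average velocity to horizon, m/s, +/- uncertainty

v_real = rng.normal(v_mean, v_std, size=500)  # 500 velocity realizations
z_real = v_real * t_horizon / 2.0             # depth = velocity * TWT / 2

print(f"mean depth {z_real.mean():.0f} m; "
      f"P10-P90 spread {np.percentile(z_real, 10):.0f}-"
      f"{np.percentile(z_real, 90):.0f} m")
```

Five hundred depths shown equally say far more about drilling risk than one best case ever can.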
3. Integration of Seismic and Production Data at the Shenzi Field, Deepwater Gulf of Mexico – David Tett, BHP Billiton Petroleum
- Processing – TTI and RTM all the way through updates
4. Panel and Open Microphone: Cross-disciplinary Collaboration in our Competitive Industry
- Found this panel a bit uninspiring and one guy kept interrupting with questions about Huyen’s PSDM processing techniques. I offered that we adopt the “each one find one” approach – take the initiative to invite non-geophysicists to our future meetings and workshops.
5. Reducing Risks and Understanding Uncertainty in Shale Plays: A South Texas Eagle Ford Case Study – Robin Pearson and David Work, Anadarko Petroleum Corp.
6. Marcellus Insights Delivered Using an Integrated Analysis of Multiple Geo-datasets – Craig Beasley, NEOS GeoSolutions
7. Panel and Open Microphone: Adapting to New Plays
8. Why IQ Earth Matters – Bob Hardage
9. Beyond the Horizon: A Vision for Integrated Quantitative Earth Model Analysis – Randal Kissling, ExxonMobil Development Company
- For true Geophysicists, IQ Earth means the next best algorithm for modeling / processing. For “us,” it’s production and field development. (I always say if you’re a physicist not that into geology, then just call yourself a physicist.)
- Quoted John Eastwood from the May 2011 issue of The Leading Edge: “There has not been a new tool or workflow during the past 15 years that has revolutionized interpretation. Interpreting on 2D lines in a 3D survey still represents the lion’s share of interpretation done by industry and, quite frankly, is stuck in the 1990s. I want (we should all want) a new, revolutionary interface – a volume-based interpretation in which the interpreter evolves the 3D interpretation or Earth model that actually looks like the Earth model. Seismic interpretation is not evolving fast enough to keep up with the phenomenal advances in acquisition, processing and computing technology. We are stuck in a paradigm (a rut).”
- “How do we go beyond shovel and dig and lots of mouse clicks?” (Volume geobody interpretation on stable software and hardware. There are companies who cannot install and maintain Virtual Machines and GPUs to save their lives.)
- We are past Visualization (1990s) and Volume Interpretation (2000s) to the IQ Earth Model, and yet folks use the workstation like colored pencils on paper seismic.
- Cross-functional integration is difficult when it is all about the business decision. (The business decision can also be a good reason for efficiency.)
- Velocity model -> synthetic -> compare to real data -> iterate -> new velocity model. (I call this Closing The Loop and think it's a great place for inter-disciplinary integration when going from seismic to inversion to reservoir model and back; a bare-bones skeleton follows this list.)
- I4Q Earth: The four Is of IQ Earth are Imaging, Interpretation, Integration and Insight. Insight is where the money is, but that's not the direction in which we currently work.
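Here is the promised bare-bones skeleton of Closing The Loop. The 1D convolutional forward model, the nudge-style update and every number in it are deliberately trivial stand-ins for real imaging and tomography:

```python
# Skeleton of Closing The Loop: model -> synthetic -> compare ->
# update -> repeat. The forward model and update rule are toy
# stand-ins; all numbers are hypothetical.
import numpy as np

def forward(velocity, wavelet):
    """Toy forward model: reflectivity from velocity contrasts, convolved."""
    refl = np.diff(velocity) / (velocity[1:] + velocity[:-1])
    return np.convolve(refl, wavelet, mode="same")

wavelet = np.array([-0.1, -0.3, 1.0, -0.3, -0.1])  # crude zero-phase wavelet

v_true = np.linspace(1800.0, 3500.0, 500)
v_true[200:] += 300.0                # a velocity step the loop must recover
observed = forward(v_true, wavelet)  # stands in for the real seismic data

v_model = np.linspace(1800.0, 3500.0, 500)  # starting model
for it in range(100):
    residual = observed - forward(v_model, wavelet)
    misfit = np.linalg.norm(residual)
    if misfit < 1e-3:
        break
    # Crude update: nudge velocity where synthetic and data disagree.
    # A real loop would derive this step from tomography or inversion.
    v_model[1:] += 50.0 * residual

print(f"stopped after {it + 1} iterations, misfit {misfit:.4f}")
```

The point is not the toy mathematics but the shape of the loop: every pass is a natural hand-off between processing, interpretation and reservoir modeling.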
10. Panel and Open Microphone: Interpretation – The Road Ahead and the New Interpretation Journal
Interpretation journal:
– Proposed a few years back and announced in October 2012. Very few interpretation and case-study submissions so far, and even fewer from industry.
– I asked Yonghe Sun and Bob Hardage about the overlap, if any, between The Leading Edge and Interpretation. Some overlap between journals that share a common theme is to be expected. AAPG's reaction is more interesting: they feel Interpretation is going to erode their journal ("Yes, but that's life." – Hardage) and are very concerned that quantification will turn away their readers (an infusion of mathematics in robust interpretation is both inevitable and required going forward). A suggestion that Interpretation and the AAPG Bulletin share editorship is under consideration.
– New SEG award for best paper in Interpretation. Accepting submissions starting February 2013.
– The need for such a journal is there: this SEG conference had 11 technical sessions, i.e. 88 talks, with "case history" or "case study" in the title. On reviewership: someone asked how, given that interpretation is subjective, a reviewer decides whether a paper is of value to the interpretation community. The future is hybrid geophysics-geology workers (as we'd been talking about all day, hello!), and judgment will be made on whether the interpretation approach is robust.
Curation of data and algorithms:
– There are raw data, processed data, basic algorithms. Who will curate these going forward?
– ExxonMobil has the resources to address the IQ Earth problem, but academic departments do not. Vendors are going to work on focused problems. Ron Masters was the first person to bring up Open and the Commons in this gathering: new eyes on data sets, and the implications for open data and open source work. The solution has to be Creative Commons so that different workers can integrate different analyses of the same data set.
– Display the Analysis, not just the Data: Visualization hype is "I will show you something previously unseen." But what we breed for is a geoscientist who has a model/interpretation in his or her head, and we need a solution that helps them display it effectively. The data look pretty, but they are not effective. Let's hope Interpretation does not publish screen dumps. (From here on out, the conversation was taken over by Barney Issen of Chevron, whom I could sit and listen to for days.)
Standards and reliability testing of standard products (we need a whole other workshop for this):
– Multiple EBCDICs! (A small illustration of this particular pain follows below.)
– Standards must be extensible. Recognize that models have different target scales.
– @Toastar: We should have standards for each one of those files in the dataset. It benefits everyone to set standards and stick to them.
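Since "Multiple EBCDICs!" deserves more than an exclamation point, here is the pain in miniature, assuming a hypothetical SEG-Y file: the first 3200 bytes are a textual header that may be EBCDIC (in more than one code page) or plain ASCII, and vendors differ.

```python
# The "multiple EBCDICs" problem in miniature: sniff and decode the
# 3200-byte SEG-Y textual header. The filename is hypothetical, and
# cp500 is only one of several EBCDIC code pages seen in the wild.
with open("survey_poststack.sgy", "rb") as f:
    raw = f.read(3200)

# EBCDIC headers conventionally begin each 80-byte "card" with 'C',
# which is byte 0xC3 in EBCDIC but 0x43 in ASCII.
if raw[:1] == b"\xc3":
    text = raw.decode("cp500")
else:
    text = raw.decode("ascii", errors="replace")

for card in (text[i:i + 80] for i in range(0, 3200, 80)):
    print(card.rstrip())
```

Every shop writing this sniffing logic slightly differently is exactly the argument for extensible, stick-to-them standards made above.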
***
I will close this post with the following exchange between Barney Issen and Randal Kissling. It reminds us of the raison d'être for IQ Earth: we are scientists who want to do good work in the face of growing data, challenging plays (and thus challenging scientific questions), new scientific and computing tools, and deadline pressures.
Barney: “We have a lot of experts but we have a lot of staff. We want certification that a high-quality process has been followed. In a rapidly-changing world of required expertise, are we there yet? We are going into areas we didn’t grow up with.”
Randal: “When do we have enough of the right answer in science? Make a judgment and let the future revisit the problem.”
This is science. This is what our community needs to work towards to survive and thrive.
==
The picture at the top of this post is the Orbitor 1 Pinball Machine on display at the Las Vegas Pinball Museum. With a faux 3D playing surface and only two bumpers, it is described by pinballheads as “one of the strangest – and most minimal – games around.”