
Amazing. Two weeks to the day after losing my voice, it is back completely. On my way to the doctor, a friend told me not to bother as “it’s a 14-day thing that’s going around in the south” and will disappear on its own with rest and hydration. Who ever heard of a two-week virus? “The number shall be fourteen, no more, no less. Twelve is right out.” This makes me wonder if viruses operate on a lunar cycle and watch Monty Python.

Sick days are lost on me, because I sleep in for a few hours longer and then I’m AWAKE with no place to go. I organized and posted photographs from our trip to Las Vegas and its PINBALL museum! Behold.

[Photos: IMG_4229, IMG_4190, IMG_4230]

I had to be dragged out of there by more adept gamers than me! I think a field trip back to the museum with a leader, field guides and frequent stops is in order. Let me know if you’re interested.

We went to the Mob Museum, too. It was alright. Everything comes back to New Orleans (and India), people.

With little to no appetite for anything more than hot toddies (lots and lots of honey and lemon), I also managed to document How To Make Adai at MaitriLAB. Now I’m hungry.

As is glaringly obvious, the logo for MaitriLAB needs work, which means photography and Photoshopping, which in turn requires a certain level of competence on my part with respect to brain and eye focus and coordination. The Parament WordPress theme works for now in that it is relatively quick and easy to tweak and stays out of the way. I’d rather use Minimatica as a one-shot, browsable portfolio that the user sees upon landing on the site instead of yet another workout on the index-finger treadmill. All was fine until the fourth post, when the very first post began to show up twice, and only in Gallery view. I’ve examined the theme’s files, and an email to One Designs has gone unheeded. Themes aren’t usually critical, but this is a how-to blog with pictures, where form, function and content are all priorities.

It’s 80 degrees and sticky here. It’s 30 and dropping up north. Driving in a winter wonderzoo!


Mass wasting is more like it.

NOLADishu points me to a presentation given by Shell’s Donal Rajasingam at this year’s Tulane Engineering Fair. Within a pretty good collection of statistics on increasing energy demands and aging infrastructure, the above set of graphs on energy talent supply stands out. We don’t have enough people now, so who’s going to work the problem later?

The graph on the bottom left is the one I consider even more troubling. It shows that, as global demand for PE graduates grows, the gap between the number of all graduates and the number of high-quality ones widens. I believe the same goes for the geosciences, considering the geology and geophysics MS factories I observe in the south. We cannot afford to take just anyone with a geo or engineering degree and, in most cases, we don’t. But what will we do in the upcoming pinch?

What’s the old saying? “We offer three kinds of service: Good, Cheap, Fast. You get to pick two.” I think we’ll be lucky to get one.


Oil and gas production comes from oil and gas exploration. By the time a drop of oil is produced, decades will have gone into discovering, delineating and developing a subsurface reservoir. Exploration and field development require a long-term analysis of seismic images, rock and fluid properties from well logs, core data and any other remote sensing data, surface analogues, past analyses or general knowledge. The result of interpretation is a conceptual, visible and dynamic model of a hydrocarbon reservoir in its overall geologic context, along with a prognosis of rock type and structure, fluid in the rock pores and drilling recommendations. Easy, right?

Wrong. Our data and processes get larger and more varied, and the way we work currently has not been cutting it for a while, even with great advances in scientific analyses, computing and communications technologies.

“We need a technology, a process and a skill set that allow data to be accessed from many disciplines, merged into integrated databases, and quantified in terms of their sensitivities to [material] properties, with appropriate measures of uncertainty attached to all calculation steps.” – Bob Hardage, 2012 Society of Exploration Geophysicists president

I confess I’ve been writing this post since returning from the Society of Exploration Geophysicists conference and its day-long IQ Earth: 21st Century Interpretation workshop (thank you and bravo, Ron Masters!) at the beginning of November. A month later, I just had to stop and ask, “What do I really want for the future of seismic interpretation?” To find oil smartly, I want the following (tagged with key concepts Bob used in his statement above and some more of my own):

– competing, inexpensive, Open Source, light and unbundled software and the attendant hardware and networking that gets and puts any imaginable data type quickly from and to large stores and centers on creating and updating an Earth Model. The user should be able to conduct multi-dimensional seismic QC, processing, volume interpretation, well log analysis, rock property estimation, seismic inversion, velocity updating, well planning, reservoir model building and sizing, dynamic simulation, history matching, reserves calculations, reservoir surveillance, statistical analysis and team/management communication at each step along the way [technology],

– well-rounded scientists and engineers, and employers who reward them for their multi-disciplinary outlook and work [skill set and mindset],

– subsurface IT groups that consist of IT nerds well-versed in the real needs of oil and gas scientists and engineers [skill set],

– physical workplaces in which ideas and results are readily exchanged [environment and process], and

– universities in which mathematics, science, engineering and art professors come together to seamlessly meld quantitative and computing skills with scientific knowledge and knowledge-sharing techniques [skill set].
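Bob’s call for “appropriate measures of uncertainty attached to all calculation steps” is very implementable today. Here is a minimal sketch in Python, assuming first-order (linear) error propagation; the `UValue` class and the area/thickness numbers are hypothetical illustrations of mine, not anyone’s production code:

```python
import math

class UValue:
    """A number with a 1-sigma uncertainty, propagated to first order."""
    def __init__(self, value, sigma):
        self.value = value
        self.sigma = sigma

    def __add__(self, other):
        # Independent absolute errors add in quadrature for sums.
        return UValue(self.value + other.value,
                      math.hypot(self.sigma, other.sigma))

    def __mul__(self, other):
        # Independent relative errors add in quadrature for products.
        v = self.value * other.value
        rel = math.hypot(self.sigma / self.value, other.sigma / other.value)
        return UValue(v, abs(v) * rel)

    def __repr__(self):
        return f"{self.value:.4g} ± {self.sigma:.2g}"

# Toy gross-rock-volume step: uncertain area times uncertain thickness.
area = UValue(2.0e6, 0.2e6)    # m^2, 10% uncertainty
thickness = UValue(15.0, 3.0)  # m, 20% uncertainty
volume = area * thickness
print(volume)                  # the answer carries its own error bar
```

Wrap every input in something like this, and the final reserves number arrives with its error bar for free, at every step along the way.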

The sad truth here, and what has kept me from writing this post on the future of seismic interpretation, is that none of it is futuristic. IT IS ALL POSSIBLE RIGHT NOW, and we will still be talking about this in 2112 as long as we don’t just do it. Can we, though? While human inefficiency arises from turf and an unwillingness or inability to change, the crux of the problem is that the business ends of academia, companies and software providers are at odds with their more important missions of teaching, finding oil and creating solutions. The resulting irony is that in the quest for competitiveness (whatever that means any more), confidentiality and maximum profit, much oil is left in the ground or not found at all.

This is not a revolution of thought on my part, and many open-minded entities with elegant, nimble and evolving tools wait in the wings. But it will be hard to move again as long as “solutions” come from those with the largest market share (coincidentally those with the most to gain from keeping the problem alive) and efforts remain centralized. So, this is what I think the future of seismic interpretation holds, regardless of what you and I want. That:

– subsurface analysis will become more automated as well as crowdsourced as data sets grow in size and fewer people are available to work them. Are companies going to open up their data stores for anyone to see and work on? Sure: look at Google’s reCAPTCHA and how it has circumvented ownership and intellectual-property roadblocks by distributing the problem in pieces. Again, the search for oil is not SETI@Home, GalaxyZoo or any other distributed computing project, in that it is a for-profit venture with a complicated data-ownership and data-licensing structure and a whole world of project-management implications, but this is where it will have to turn in order to get work done.

– following from the previous, data will (have to) become more open, acquired and hosted by governments that begin to understand the value and potential of their subsurface, aided by professional societies like *cough*SEG*cough*. For instance, I don’t think Data.Gov understands that Big Geodata is more than TIFFs of seismic lines and geospatial layers; it is massive multi-dimensional volumes and their sub-products.

– those with multi-disciplinary, quantitative (hard mathematics) and problem-solving skills will not come from traditional universities, as accessible learning moves away from the physical campus and classroom to Massive Open Online Courses (MOOCs). With a primarily geoscience background, I myself have signed up for and participated in the Artificial Intelligence (now Udacity) and Code Academy courses to supplement my work and also simply to grow my brain. Note: Classroom learning, like paper books, is swell for those who have access to it. The main reason I joined a traditional four-year university system after high school at all was for access to the field and laboratories. Community labs and university-independent field geology trips are not far off.
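The distribute-the-problem-in-pieces future above can be sketched with nothing but the standard library. In this toy example, assuming a hypothetical `rms_amplitude` attribute and an arbitrary chunk size (neither is a real workflow of anyone’s), a “volume” is split into independent pieces and farmed out to workers. The shape is the same whether the workers are threads, cluster nodes or strangers on the internet:

```python
from concurrent.futures import ThreadPoolExecutor

def rms_amplitude(chunk):
    """The unit of work a contributor (or worker node) would receive."""
    return (sum(s * s for s in chunk) / len(chunk)) ** 0.5

# A toy "volume": a flat list of samples, split into independent chunks.
samples = [float(i % 7) for i in range(10_000)]
chunks = [samples[i:i + 1000] for i in range(0, len(samples), 1000)]

# Farm the pieces out; no single worker needs to see the whole data set.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(rms_amplitude, chunks))

print(len(results), "chunk results")
```

The reCAPTCHA trick is in the chunking: each worker sees only a piece, so ownership and licensing questions attach to the whole, not to the work unit.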

Think on this and let’s discuss other possibilities and sticking points.

***

Here is a list of talks in the workshop that I attended, annotated with items of interest from each talk. One of these days, all workshops will renounce what I call the Powerpoint Pulpit and adopt an inclusive and active approach of people gathered around a truly interactive presentation and discussion.

1. Geological Heterogeneities Characterization through 4D Seismic Interpretation and their Integration into the Reservoir Model – Henri Houllevigue, Total

  • Volumetric geobody interpretation of repeat seismic surveys to understand vertical and lateral dynamic communication of reservoirs and to improve well injection/production efficiency.
  • Integrated Reservoir Surveillance helped early planning and reduced cycle time down to 6 months
  • Uncertainty: Updating and re-updating the model is not trivial.

2. Real-time Updating of Integrated Earth Models to Mitigate Drilling Risk – Huyen Bui for Andy Hawthorn, Schlumberger

  • Wellbore instability is 40% related to geomechanics – adopt Earth Model early and update with uncertainties
  • Dogma ate my well: Never fall in love with your model. (All models are wrong, some are useful.)
  • Corporate behavioral change is required.
  • I really liked the multiple realizations of models corresponding to different velocities and structural uncertainties. Show them all equally.

3. Integration of Seismic and Production Data at the Shenzi Field, Deepwater Gulf of Mexico – David Tett, BHP Billiton Petroleum

  • Processing – TTI and RTM all the way through updates

4. Panel and Open Microphone: Cross-disciplinary Collaboration in our Competitive Industry

  • Found this panel a bit uninspiring and one guy kept interrupting with questions about Huyen’s PSDM processing techniques. I offered that we adopt the “each one find one” approach – take the initiative to invite non-geophysicists to our future meetings and workshops.

5. Reducing Risks and Understanding Uncertainty in Shale Plays: A South Texas Eagle Ford Case Study – Robin Pearson and David Work, Anadarko Petroleum Corp.

6. Marcellus Insights Delivered Using an Integrated Analysis of Multiple Geo-datasets – Craig Beasley, NEOS GeoSolutions

7. Panel and Open Microphone: Adapting to New Plays

8. Why IQ Earth Matters – Bob Hardage

9. Beyond the Horizon: A Vision for Integrated Quantitative Earth Model Analysis – Randal Kissling, ExxonMobil Development Company

  • For true Geophysicists, IQ Earth means the next best algorithm for modeling / processing. For “us,” it’s production and field development. (I always say if you’re a physicist not that into geology, then just call yourself a physicist.)
  • Quoted John Eastwood from the May 2011 issue of The Leading Edge: “There has not been a new tool or workflow during the past 15 years that has revolutionized interpretation. Interpreting on 2D lines in a 3D survey still represents the lion’s share of interpretation done by industry and, quite frankly, is stuck in the 1990s. I want (we should all want) a new, revolutionary interface – a volume-based interpretation in which the interpreter evolves the 3D interpretation or Earth model that actually looks like the Earth model. Seismic interpretation is not evolving fast enough to keep up with the phenomenal advances in acquisition, processing and computing technology. We are stuck in a paradigm (a rut).”
  • “How do we go beyond shovel and dig and lots of mouse clicks?” (Volume geobody interpretation on stable software and hardware. There are companies that cannot install and maintain Virtual Machines and GPUs to save their lives.)
  • We are past Visualization (1990s) and Volume Interpretation (2000s) to the IQ Earth Model, and yet folks use the workstation like colored pencils on paper seismic.
  • Cross-functional integration is difficult when it is all about the business decision. (The business decision can also be a good reason for efficiency.)
  • Velocity model -> synthetic -> compare to real data -> iterate -> new velocity model. (I call this Closing The Loop and think it’s a great place for inter-disciplinary integration when going from seismic to inversion to reservoir model and back.)
  • I4Q Earth: The four Is of IQ Earth are Imaging, Interpretation, Integration and Insight. Insight is where the money is, but that’s not the direction in which we currently work.

10. Panel and Open Microphone: Interpretation – The Road Ahead and the New Interpretation Journal

Interpretation journal:

– Proposed a few years back and announced in October 2012. Very few interpretation and case-study submissions so far, and even fewer from industry.

– I asked Yonghe Sun and Bob Hardage about the overlap, if any, between The Leading Edge and Interpretation. There is overlap between journals that share a common theme. AAPG’s reaction is more interesting: they feel Interpretation is going to erode their journal (“Yes, but that’s life” – Hardage) and are very concerned that quantification will turn away their readers (math infusion is inevitable, and required, in robust interpretation going forward). There is a suggestion that Interpretation and AAPG Bulletin share editorship.

– New SEG award for best paper in Interpretation. Accepting submissions starting February 2013.

– The need for such a journal is there: this SEG conference had 11 technical sessions, i.e. 88 talks, with “case history” or “case study” in the title. On reviewership: given that interpretation is subjective, how do we judge whether a paper is of value to the interpretation community? The future is hybrid geophysics-geology workers (as we’d been talking about all day, hello!), and judgment will be made on whether the interpretation approach is robust.

Curation of data and algorithms:

– There are raw data, processed data, basic algorithms. Who will curate these going forward?

– ExxonMobil has the resources to address the IQ Earth problem, but academic departments do not. Vendors are going to work on focused problems. Ron Masters was the first person to bring up Open and the Commons in this gathering: new eyes on data sets, and implications for open-data and open-source work. The solution has to be Creative Commons, so that different workers can integrate different analyses of the same data set.

– Display the Analysis, not just the Data: Visualization hype is “I will show you something previously unseen.” But what we breed for is a geoscientist who has a model/interpretation in his or her head. We need a solution that helps them display these effectively. The data look pretty, but they are not effective. Let’s hope Interpretation does not publish screen dumps. (From here on out, the conversation was taken over by Barney Issen of Chevron, whom I could sit and listen to for days.)

Standards and reliability testing of standard products (we need a whole other workshop for this):

Multiple EBCDICs!
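To make the “multiple EBCDICs” headache concrete: EBCDIC is a family of code pages, and Python happens to ship several of them. In this toy example (the header text is made up), the same bytes read back differently under the cp500 and cp037 code pages, which is exactly the ambiguity a SEG-Y textual header presents when its code page goes unrecorded:

```python
# A SEG-Y textual header "card image" is 80 bytes of EBCDIC, but EBCDIC
# is a family: code pages such as cp037 and cp500 disagree on a handful
# of punctuation characters. Same bytes, different text.
card = "C 1 CLIENT ACME [TOY EXAMPLE]".ljust(80)

raw = card.encode("cp500")        # write the header as cp500 EBCDIC
as_cp500 = raw.decode("cp500")    # read it back with the right code page
as_cp037 = raw.decode("cp037")    # read it back with a sibling code page

print(as_cp500.strip())
print(as_cp037.strip())           # the brackets come out as other symbols
assert as_cp500 != as_cp037       # one file, two legal "EBCDIC" readings
```

An extensible standard would at minimum record which code page was used, so the reader never has to guess.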

Standards must be extensible. Recognize that models have different target scales.

@Toastar: We should have standards for each one of those files in the dataset. Benefits everyone to set standards and stick to them.
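Kissling’s loop earlier in these notes (velocity model -> synthetic -> compare to real data -> iterate) is worth sketching, because the shape of the loop matters more than any one tool. This is a deliberately stripped-down, hypothetical illustration with one unknown interval velocity and a damped fixed-point update of my own choosing, not any vendor’s workflow:

```python
# One unknown: a layer's interval velocity. Forward-model a synthetic
# two-way time, compare with the observed pick, nudge the model, repeat.
depth = 3000.0    # m, layer thickness (assumed known)
t_obs = 1.5       # s, two-way time picked from the real data

v = 3500.0        # m/s, starting velocity model
for _ in range(20):
    t_syn = 2.0 * depth / v        # forward model: synthetic two-way time
    misfit = t_syn - t_obs         # compare to real data
    if abs(misfit) < 1e-9:
        break
    v *= (t_syn / t_obs) ** 0.5    # damped fixed-point velocity update

print(f"v = {v:.1f} m/s")
```

Closing The Loop in practice means the same cycle with thousands of unknowns and a full seismic-to-inversion-to-reservoir-model round trip, but every discipline plugs into one of these four steps.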

***

I will close this post with the following exchange between Barney Issen and Randal Kissling. It reminds us of the raison d’être for IQ Earth: we are scientists who want to do good work in the face of growing data, challenging plays and, thus, challenging scientific questions, new scientific and computing tools and deadline pressures.

Barney: “We have a lot of experts but we have a lot of staff. We want certification that a high-quality process has been followed. In a rapidly-changing world of required expertise, are we there yet? We are going into areas we didn’t grow up with.”

Randal: “When do we have enough of the right answer in science? Make a judgment and let the future revisit the problem.”

This is science. This is what our community needs to work towards to survive and thrive.

==
The picture at the top of this post is the Orbitor 1 Pinball Machine on display at the Las Vegas Pinball Museum. With a faux 3D playing surface and only two bumpers, it is described by pinballheads as “one of the strangest – and most minimal – games around.”


Love, love, love work but it has me pinned under a landslide of maps, volumes and other deliverables. Yes, I used the word “deliverable” on this blog, which should tell you something about my current state of mind. Before you think I’m turning droid, other upcoming products include chana masala (pictured above), pedas and apple pie. Christmas time + people finding out you can cook relatively well = sore shoulders, fragrant home and happy, well-fed friends.

Things that I’ve checked out lately that may interest you:

Best Practices for Scientific Computing: Goodness love arXiv. “Scientists spend an increasing amount of time building and using software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. We describe a set of best practices for scientific software development that have solid foundations in research and experience, and that improve scientists’ productivity and the reliability of their software.”

Los Angeles Review of Books | Literature is not Data: Against Digital Humanities: The author’s main arguments against digital humanists (?) are that while eBooks are great, books are not data objects to be placed in databases, because “literature is terminally incomplete” [bogus argument: physical books are terminally complete; that’s what future editions are for, and there are ways to archive and access these electronically] and that “the process of turning literature into data removes distinction itself.” Here, I can see automated librarians falling apart, but look: it will always remain the job of the human to think and gain insight, or not. Just because I love Project Gutenberg and the preservation of literature through eBooks doesn’t mean I automatically want to create or support an iTunes Genius functionality for digitally-archived books. Let’s just say that Pandora and Spotify pleasantly surprise me sometimes. I still have to do the hard work of discerning my way through the music.

MUSIC 1323 – Audio Engineering I’ve been an acoustics junkie for a while and want to take this sound creation course so badly. Of course, the only time I can attend entails 6 hours starting at 9am every Saturday for four months. The question then becomes: How “badly” do I want to take this course?


Luck Be A Lady Tonight

The best representative I’ve ever had is now a United States senator. On Wisconsin.

Congratulations, Deb Fischer, Heidi Heitkamp, Mazie Hirono, Elizabeth Warren and Tammy Baldwin.
