Department Seminar: Dr. Laurie Bellman

Friday, November 18, 2016 — 2:00 PM EST

The Canadian Society of Exploration Geophysicists presents

Data Interpretation and Integration from a Seismic Perspective – The Excitement of Innovation

Dr. Laurie Weston Bellman
Quantitative Interpretation Director, Canadian Discovery Ltd.

Abstract

Most companies and research groups are working, to some degree, to incorporate multiple data types (such as seismic data, well logs, production information and microseismic) into their interpretations of, and predictions from, the data they acquire; the more pieces of the puzzle, the clearer the picture. Integration methods range from simple visual comparisons of maps made from two different data sources to numerical modelling and statistical procedures at a basic data level. Quantitative Interpretation (QI) is a broad approach that encompasses many linked techniques in its aim to extract geological properties from seismic data. These geological properties can then be included in analytical methods to determine the key factors in predicting the future performance of a hypothetical well or field.

QI involves a series of analysis steps, each of which requires input datasets, mathematical functions and parameter selections. Choices made at each point affect the outcome to some extent, so the more experienced the practitioner (the more times they have seen a particular situation and learned from the results), the better the choices and the higher the chance of a satisfactory outcome. By satisfactory, I mean a realistic prediction of geology from seismic data that not only matches the existing wells, but also predicts the geological conditions in an undrilled location that turn out to be correct. This is an accepted and effective process that has been adopted around the world to reduce exploration and development risk.

Much of the credit for the degree of success of the QI outcome can be attributed to factors that are beyond our control, such as the inherent elastic contrasts and intrinsic properties of the rock, the conditions in the near surface (on land or water), or the weather on the day the seismic data were acquired. However, if we go beyond face value, the seismic data, regardless of quality, always contain more information than we think. QI encompasses the best methods currently available to dig deeper and reveal the hidden information. So, how do we make it better?

Aside from improvements in the theory, which are ongoing, we make QI better by increasing the quality of the inputs, ensuring the appropriateness of the assumptions underlying the mathematical functions, testing the correctness of the parameter choices, and doing everything faster than ever. This plan sounds straightforward, but it's almost never obvious how to make these workflow improvements. This kind of challenge is where big-data analytical techniques, together with corresponding increases in computer processing speed and capability (eventually even quantum computing), can be introduced. Seismic data have always been big, but seismic analysis is mostly linear: the output from one process is the input to the next. Analytical techniques allow lateral analysis that geoscientists are only just starting to touch on. Statistics and machine learning are more mathematical than most of us are comfortable with, and the approach doesn't necessarily come easily to geoscientists who are used to seeing a direct cause and effect in their analysis. However, as long as we maintain a good balance of objective mathematical process and subjective geological sense, this new direction should reveal new insights and enhanced efficiencies and, perhaps most importantly, be a catalyst for integration.

But how do we get there? It's hard to argue with the potential benefits of a more complete and thorough analysis of the range of available data, but there is plenty of debate about appropriate and effective procedures, near-term objectives, and, in a business environment, the best use of limited money. Shortcuts are tempting. Instead of saving money, however, shortcuts usually expose large gaps or inaccuracies in our knowledge, which is not always a bad thing.
Collaboration, integration, models of all kinds (scientific and business) and a little bit of faith are therefore necessary to understand, effectively communicate and eventually achieve the ultimate benefits of a significant paradigm shift. 

My presentation will not necessarily answer all the questions posed in the abstract, but there will be explanations, examples and opinions. 

Cost 
Everyone welcome - refreshments!
Location 
DC - William G. Davis Computer Research Centre
Room 1304
200 University Avenue West

Waterloo, ON N2L 3G1
Canada
