
Application of Artificial Neural Networks in Geoscience and Petroleum Industry

Abstract

Artificial neural networks (ANNs), as a method of artificial intelligence, have been shown to increase problem-solving capability for geoscience and petroleum industry problems, particularly where input data are limited or unavailable. ANN applications have become widespread in engineering, including geoscience and petroleum engineering, because ANNs have been shown to produce reasonable outputs for inputs they have not encountered during training. In this chapter, the following subjects are covered: artificial neural network basics (neurons, activation functions, ANN structure), feed-forward ANNs, backpropagation and learning (perceptrons and backpropagation, multilayer ANNs and the backpropagation algorithm), data processing by ANNs (training, over-fitting, testing, validation), ANNs and statistical parameters, an applied example of an ANN, and applications of ANNs in geoscience and petroleum engineering.


Notes

  1. A neural network without hidden layer(s).

Abbreviations

\( \alpha \): Learning rate

ANN: Artificial neural network

AAPE: Average absolute percent error

APE: Average percent relative error

ARMSE: Average root-mean-square error

BP: Backpropagation

f: Activation or transfer function

\( \text{Input}_{i} \): The input value corresponding to neuron i

Logsig: Logistic sigmoid activation/transfer function

m: Number of output neurons or nodes

MSE: Mean square error

\( O_{\text{ANN}} \): Output value predicted by the artificial neural network

R: Pearson correlation coefficient

\( R^{2} \): Squared Pearson correlation coefficient

SD: Standard deviation

T: Number of training samples (known data points) used to train the network

V: Variance

\( V_{\text{expected}} \): Expected real value (known or measured value of the output)

\( W_{i} \): The weight corresponding to link or connection i
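
To show how the symbols above fit together, the following is a minimal illustrative sketch (not code from the chapter) of a single neuron's feed-forward computation with a Logsig activation, followed by one gradient-descent weight update with learning rate \( \alpha \); all numerical values and variable names are assumptions made purely for illustration.

    import numpy as np

    def logsig(x):
        # Logistic sigmoid activation/transfer function (Logsig)
        return 1.0 / (1.0 + np.exp(-x))

    # Assumed example values: three inputs (Input_i) and their connection weights (W_i)
    inputs = np.array([0.2, 0.5, 0.9])
    weights = np.array([0.4, -0.1, 0.7])
    bias = 0.1

    # Feed-forward pass of a single neuron: O_ANN = f(sum_i W_i * Input_i + bias)
    o_ann = logsig(np.dot(weights, inputs) + bias)

    # Squared error between the prediction and the known (measured) value V_expected;
    # the MSE averages this quantity over all training samples
    v_expected = 0.8
    error = (v_expected - o_ann) ** 2

    # One gradient-descent update of the weights with learning rate alpha, using
    # d(error)/dW_i = -2 * (V_expected - O_ANN) * O_ANN * (1 - O_ANN) * Input_i
    alpha = 0.05
    grad = -2.0 * (v_expected - o_ann) * o_ann * (1.0 - o_ann) * inputs
    weights = weights - alpha * grad

    print(o_ann, error, weights)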


Author information

Correspondence to Rahman Ashena.

Appendix: Important Statistical Parameters

The relations for a few important statistical parameters used to compare the performance and accuracy of different neural network models are given below; a compact implementation sketch of these measures follows the list.

  1. Average percent relative error (APE):

    This error is defined as the average relative deviation of the estimated data from the measured data.

    $$ {\text{APE}} = \frac{1}{n}\mathop \sum \limits_{i = 1}^{n} E_{i} $$
    (23)
    $$ E_{i} = \left[ {\frac{{P_{\text{m}} - P_{\text{e}} }}{{P_{\text{m}} }}} \right]_{i} \quad i = 1, 2, 3, \ldots , n $$
    (24)
  2. Average absolute percent relative error (AAPE):

    This error gives an idea of the absolute relative deviation of the estimated outputs from the measured (expected) output data; \( e_{i} \) in Eq. (26) is the corresponding non-normalized error, used in the measures that follow.

    $$ {\text{AAPE}} = \frac{1}{n}\mathop \sum \limits_{i = 1}^{n} \left| {E_{i} } \right| $$
    (25)
    $$ e_{i} = \left[ {P_{\text{m}} - P_{\text{e}} } \right]_{i} $$
    (26)
  3. Mean squared error (MSE):

    This error corresponds to the expected value of the squared error loss.

    $$ {\text{MSE}} = \frac{1}{n}\mathop \sum \limits_{i = 1}^{n} e_{i}^{2} $$
    (27)
  4. Average root-mean-square error (ARMSE):

    This error is a measure of the scatter, or lack of accuracy, of the estimated data.

    $$ {\text{ARMSE}} = \sqrt {\frac{1}{n}\mathop \sum \limits_{i = 1}^{n} e_{i}^{2} } $$
    (28)
  5. Standard deviation (SD):

    This parameter shows the dispersion of the values about the average value (mean).

    $$ {\text{SD}} = \left[ {\frac{{n\mathop \sum \nolimits_{i = 1}^{n} E_{i}^{2} - (\mathop \sum \nolimits_{i = 1}^{n} E_{i} )^{2} }}{{n^{2} }}} \right]^{\frac{1}{2}} $$
    (29)
  6. Variance (V or \( \sigma^{2} \)):

    This parameter is the square of the standard deviation, where X denotes each value, M the mean of the values, and N the number of values.

    $$ \sigma^{2} = \frac{{\sum {\left( {X - M} \right)^{2} } }}{N} $$
    (30)
  7. Correlation coefficient or Pearson coefficient (R):

    It represents the degree of success in reducing the standard deviation (SD) and is normally used as a measure of the extent of linear dependence between two variables; the nearer R is to 1, the better the convergence and performance of the ANN. The subscript av in Eq. (31) denotes the average value defined in Eq. (32).
    $$ R = \frac{{\mathop \sum \nolimits_{i = 1}^{n} \left[ {\left( {P_{{{\text{m}},i}} - P_{{m,{\text{av}}}} } \right) \times \left( {P_{{{\text{e}},i}} - P_{{e,{\text{av}}}} } \right)} \right]}}{{\sqrt {\mathop \sum \nolimits_{i = 1}^{n} \left[ {\left( {P_{{{\text{m}},i}} - P_{{{\text{m}},{\text{av}}}} } \right)^{2} } \right] \times \mathop \sum \nolimits_{i = 1}^{n} \left( {P_{{{\text{e}},i}} - P_{\text{e,av}} } \right)^{2} } }} $$
    (31)
    $$ P_{\text{av}} = \frac{1}{n}\mathop \sum \limits_{i = 1}^{n} P_{i} $$
    (32)
  8. Squared Pearson coefficient (\( R^{2} \)): the square of the correlation coefficient R given in Eq. (31).
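
As a complement to Eqs. (23)-(32), here is a minimal illustrative sketch (not code from the chapter) of how these measures could be computed for arrays of measured values \( P_{\text{m}} \) and ANN-estimated values \( P_{\text{e}} \); the function and variable names, as well as the example numbers, are assumptions made purely for illustration.

    import numpy as np

    def ann_performance_stats(p_measured, p_estimated):
        # Compute the statistical measures of Eqs. (23)-(32) for two arrays
        p_m = np.asarray(p_measured, dtype=float)
        p_e = np.asarray(p_estimated, dtype=float)
        n = p_m.size

        E = (p_m - p_e) / p_m          # relative errors E_i, Eq. (24)
        e = p_m - p_e                  # errors e_i, Eq. (26)

        ape = E.mean()                 # APE, Eq. (23)
        aape = np.abs(E).mean()        # AAPE, Eq. (25)
        mse = (e ** 2).mean()          # MSE, Eq. (27)
        armse = np.sqrt(mse)           # ARMSE, Eq. (28)

        # Standard deviation of the relative errors, Eq. (29), and its square,
        # the variance of Eq. (30), here taken over the relative errors E_i
        sd = np.sqrt((n * (E ** 2).sum() - E.sum() ** 2) / n ** 2)
        variance = sd ** 2

        # Pearson correlation coefficient, Eqs. (31)-(32), and its square
        dm = p_m - p_m.mean()
        de = p_e - p_e.mean()
        r = (dm * de).sum() / np.sqrt((dm ** 2).sum() * (de ** 2).sum())

        return {"APE": ape, "AAPE": aape, "MSE": mse, "ARMSE": armse,
                "SD": sd, "V": variance, "R": r, "R2": r ** 2}

    # Example usage with small made-up arrays of measured and estimated values
    stats = ann_performance_stats([10.0, 12.0, 15.0], [9.5, 12.4, 14.2])
    print(stats)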

Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Ashena, R., Thonhauser, G. (2015). Application of Artificial Neural Networks in Geoscience and Petroleum Industry. In: Cranganu, C., Luchian, H., Breaban, M. (eds) Artificial Intelligent Approaches in Petroleum Geosciences. Springer, Cham. https://doi.org/10.1007/978-3-319-16531-8_4

  • DOI: https://doi.org/10.1007/978-3-319-16531-8_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-16530-1

  • Online ISBN: 978-3-319-16531-8

