
Use of Genetic and Neural Technologies in Oil Equipment Computer-Aided Design

  • R. M. Vahidov
  • M. A. Vahidov
  • Z. E. Eyvazova
Conference paper

Abstract

Designers of oil pumping equipment have to solve various types of optimization problems. Because of the complexity of these problems, the use of rigorous mathematical methods is often very difficult or even impossible. This paper suggests using genetic algorithms as an alternative means of finding the optimal parameters of a pumping unit under given particular conditions. Neural networks are employed to approximate the best solution, using statistics on solutions already found for a set of conditions. Experimental results are discussed.
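The abstract outlines a two-stage approach: a genetic algorithm searches for suitable pumping-unit parameters under given operating conditions, and a neural network, trained on the solutions accumulated for many sets of conditions, approximates the best parameters directly. The sketch below is a minimal illustration of the genetic-algorithm stage only; it is not the authors' implementation, and the fitness function, parameter bounds, and GA settings are hypothetical placeholders rather than values from the paper.

```python
# Minimal sketch of a genetic algorithm selecting pumping-unit parameters.
# All numbers and the objective are hypothetical; only the overall scheme
# (selection, crossover, mutation over a bounded parameter box) is illustrated.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter bounds: e.g. stroke length [m] and strokes per minute.
LOW  = np.array([1.0,  4.0])
HIGH = np.array([3.0, 12.0])

def fitness(params, conditions):
    """Hypothetical objective: penalize deviation from a target flow rate
    implied by the operating conditions (higher is better)."""
    stroke, spm = params
    flow = stroke * spm * conditions["pump_constant"]
    return -abs(flow - conditions["target_flow"])

def genetic_search(conditions, pop_size=40, generations=60, mut_sigma=0.1):
    # Random initial population inside the allowed box.
    pop = rng.uniform(LOW, HIGH, size=(pop_size, LOW.size))
    for _ in range(generations):
        scores = np.array([fitness(ind, conditions) for ind in pop])
        # Tournament selection: keep the better of two randomly chosen individuals.
        idx = rng.integers(pop_size, size=(pop_size, 2))
        winners = np.where((scores[idx[:, 0]] >= scores[idx[:, 1]])[:, None],
                           pop[idx[:, 0]], pop[idx[:, 1]])
        # Uniform crossover between consecutive winners.
        mask = rng.random(winners.shape) < 0.5
        children = np.where(mask, winners, np.roll(winners, 1, axis=0))
        # Gaussian mutation, clipped back into the feasible box.
        children += rng.normal(0.0, mut_sigma, children.shape)
        pop = np.clip(children, LOW, HIGH)
    scores = np.array([fitness(ind, conditions) for ind in pop])
    return pop[scores.argmax()]

if __name__ == "__main__":
    conditions = {"target_flow": 20.0, "pump_constant": 1.5}  # hypothetical
    print("GA-selected parameters:", genetic_search(conditions))
```

Pairs of (operating conditions, GA-selected parameters) produced in this way could then serve as training data for a feedforward network acting as a universal approximator of the mapping from conditions to near-optimal parameters; that second stage is not shown here.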

Keywords

Genetic Algorithm · Weight Coefficient · Predetermined Interval · Pumping Unit · Universal Approximators

Copyright information

© Springer-Verlag/Wien 1995

Authors and Affiliations

  • R. M. Vahidov (1)
  • M. A. Vahidov (1)
  • Z. E. Eyvazova (1)
  1. Azerbaijan State Oil Academy, Baku, Azerbaijan
