A General Framework for Constructing Cooperative Global Optimization Algorithms

  • Conference paper

Part of the book series: Nonconvex Optimization and Its Applications (NOIA, volume 74)

Abstract

Many well-structured problems in many areas can be defined as the optimization of a multivariate energy function E(x_1, x_2, …, x_n). In most cases, a problem of this type is computationally hard. The only previously known way to find the global optimum with certainty is an algorithm that runs in exponential time, which is too expensive in practice. Other traditional methods, such as local search, simulated annealing, tabu search, and evolutionary algorithms, have no general conditions for identifying the global optimum and stopping the search. This paper presents a new cooperative optimization algorithm that is capable of finding the global optimum within polynomial time, though this is not guaranteed. It has both sufficient conditions and necessary conditions for identifying global optima and for trimming the search space. It is guaranteed to converge linearly to a solution, which must be the global optimum if it is a consensus solution. The convergence is also insensitive to disturbances of its initial or intermediate solutions. The algorithm also provides a lower bound on the energy function, which is guaranteed to improve after each iteration. Its power is demonstrated on nonlinear optimization problems arising from an early vision problem, shape from shading of a polyhedron. Most importantly, to make the algorithm complete, i.e., always capable of finding the global optimum, we propose a general framework, based on the lattice concept from abstract algebra, for constructing cooperative global optimization algorithms more powerful than the primary one.
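The abstract describes the algorithm only at a high level. The following is a minimal, illustrative Python sketch of the cooperative-decomposition idea it outlines, not the paper's exact formulation: it assumes a chain-structured energy E(x) = e_1(x_1, x_2) + … + e_{n-1}(x_{n-1}, x_n) over binary variables, a cooperation strength lam, and per-variable "assignment functions" psi, all of which are names and simplifications chosen here for illustration. Each sub-objective is minimized locally while a coupled term propagates the neighbour's current assignment function, and a simple consensus check mimics the role of the paper's stopping condition; the lower-bound and global-optimality guarantees stated in the abstract apply to the paper's algorithm under its conditions, not to this toy sketch.

```python
# Toy sketch of a cooperative decomposition (illustrative only, not the
# paper's exact algorithm): a chain-structured energy
#   E(x) = e_1(x_1, x_2) + ... + e_{n-1}(x_{n-1}, x_n)
# over binary variables, where each sub-energy e_i is relaxed against the
# assignment function psi_{i+1} of its right-hand neighbour.
import random

n = 6                                   # number of binary variables x_1..x_n
random.seed(0)
# pairwise sub-energies: e[i][(a, b)] = cost of setting (x_{i+1}=a, x_{i+2}=b)
e = [{(a, b): random.uniform(0.0, 1.0) for a in (0, 1) for b in (0, 1)}
     for _ in range(n - 1)]

def total_energy(x):
    """Original (undecomposed) energy of a full assignment x."""
    return sum(e[i][(x[i], x[i + 1])] for i in range(n - 1))

lam = 0.5                               # cooperation strength, assumed in [0, 1)
psi = [{0: 0.0, 1: 0.0} for _ in range(n)]   # assignment functions psi_i(x_i)

for _ in range(50):                     # fixed number of cooperative sweeps
    new_psi = [dict(p) for p in psi]
    for i in range(n - 1):
        for a in (0, 1):
            # local minimisation of the modified sub-objective: own sub-energy
            # blended with the neighbour's current assignment function
            new_psi[i][a] = min((1 - lam) * e[i][(a, b)] + lam * psi[i + 1][b]
                                for b in (0, 1))
    psi = new_psi

# each agent proposes the value minimising its own assignment function
proposal = [min((0, 1), key=lambda a: psi[i][a]) for i in range(n - 1)]
# the last variable owns no sub-energy in this toy split; take the value its
# left-hand neighbour prefers for it
proposal.append(min((0, 1),
                    key=lambda b: (1 - lam) * e[n - 2][(proposal[-1], b)]
                                  + lam * psi[n - 1][b]))

# illustrative consensus check: does agent i's preferred value for x_{i+2}
# agree with that variable's own proposal?
consensus = all(
    min((0, 1), key=lambda b: (1 - lam) * e[i][(proposal[i], b)]
                              + lam * psi[i + 1][b]) == proposal[i + 1]
    for i in range(n - 1)
)

print("proposal:", proposal,
      "energy:", round(total_energy(proposal), 3),
      "consensus reached:", consensus)
```

In this toy version the agreement of all local preferences plays the role the abstract assigns to a consensus solution; the paper's framework generalizes the construction of such algorithms well beyond this simple chain decomposition.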





Copyright information

© 2004 Kluwer Academic Publishers

About this paper

Cite this paper

Huang, X. (2004). A General Framework for Constructing Cooperative Global Optimization Algorithms. In: Floudas, C.A., Pardalos, P. (eds) Frontiers in Global Optimization. Nonconvex Optimization and Its Applications, vol 74. Springer, Boston, MA. https://doi.org/10.1007/978-1-4613-0251-3_11

  • DOI: https://doi.org/10.1007/978-1-4613-0251-3_11

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4613-7961-4

  • Online ISBN: 978-1-4613-0251-3

  • eBook Packages: Springer Book Archive
