
Optimizing Feature Calculation in Adaptive Machine Vision Systems

Chapter in Learning in Non-Stationary Environments

Abstract

A classifier’s accuracy depends substantially on the features that are used to characterize an input sample. Selecting a representative and, ideally, small set of features with high discriminative power is an important step in setting up a classification system. The features are a set of functions that transform the raw input data (an image, in the case of machine vision systems) into a vector of real numbers. This transformation may be a quite complex algorithm with many parameters to tune and, consequently, considerable room for optimization. To exploit this additional room, we propose an integrated optimization step that adapts the feature parameters so that the separation of the classes in feature space improves, thus reducing the number of misclassifications. Furthermore, these optimization techniques may be used to “shape” the decision boundary so that it can be easily modeled by a classifier. After covering the relevant elements of the theory behind this automatic feature optimization process, we demonstrate and assess its performance on two typical machine vision applications. The first is a quality control task in which different types of defects must be distinguished; the second is a texture classification problem as it appears in image segmentation tasks. We show how the optimization process can be successfully applied to morphological and textural features, both of which offer a number of parameters to tune and select.
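The chapter body is not part of this preview, so the following is only a minimal sketch, written here, of the idea summarized in the abstract: a feature with a tunable parameter (here a Gabor-like texture filter), a class-separability score (here a two-class Fisher criterion), and a search over the parameter that maximizes that score. The function names, the filter parameterization, and the grid search are illustrative assumptions, not the method described in the chapter.

```python
import numpy as np

# Illustrative sketch only: tune a single Gabor-filter parameter so that a
# derived texture feature separates two classes as well as possible. This is
# NOT the authors' algorithm; it only demonstrates the direction of the idea.

def gabor_kernel(size, frequency, theta, sigma):
    """Real part of a 2-D Gabor kernel (hypothetical parameterization)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * frequency * xr)
    return envelope * carrier

def texture_feature(patch, frequency, theta, sigma=4.0, size=15):
    """One scalar feature per patch: mean absolute filter response."""
    kernel = gabor_kernel(size, frequency, theta, sigma)
    h, w = kernel.shape
    H, W = patch.shape
    responses = [
        np.sum(patch[i:i + h, j:j + w] * kernel)   # valid-mode correlation
        for i in range(H - h + 1)
        for j in range(W - w + 1)
    ]
    return float(np.mean(np.abs(responses)))

def fisher_score(values, labels):
    """Two-class Fisher criterion: between-class over within-class scatter."""
    a, b = values[labels == 0], values[labels == 1]
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var() + 1e-12)

def optimize_frequency(patches, labels, theta=0.0,
                       frequencies=np.linspace(0.05, 0.45, 9)):
    """Grid search over the filter frequency to maximize class separation."""
    best_f, best_score = None, -np.inf
    for f in frequencies:
        feats = np.array([texture_feature(p, f, theta) for p in patches])
        score = fisher_score(feats, labels)
        if score > best_score:
            best_f, best_score = f, score
    return best_f, best_score

# Usage on synthetic data (purely hypothetical, for illustration):
# rng = np.random.default_rng(0)
# patches = [rng.normal(size=(32, 32)) for _ in range(20)]
# labels = np.array([0] * 10 + [1] * 10)
# best_f, score = optimize_frequency(patches, labels)
```

In the chapter, this separability-driven adaptation is described as an integrated step of setting up the classification system rather than a stand-alone grid search; the sketch only conveys the optimization target, namely better class separation in feature space.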



Author information

Correspondence to Christian Eitzinger.


Copyright information

© 2012 Springer Science+Business Media New York

About this chapter

Cite this chapter

Eitzinger, C., Thumfart, S. (2012). Optimizing Feature Calculation in Adaptive Machine Vision Systems. In: Sayed-Mouchaweh, M., Lughofer, E. (eds) Learning in Non-Stationary Environments. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-8020-5_13

  • DOI: https://doi.org/10.1007/978-1-4419-8020-5_13

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4419-8019-9

  • Online ISBN: 978-1-4419-8020-5

  • eBook Packages: Engineering, Engineering (R0)
