
Optimization with Clifford Support Vector Machines and applications

  • Chapter
  • 1651 Accesses

Part of the book series: Adaptation, Learning, and Optimization ((ALO,volume 7))

Abstract

This chapter introduces a generalization of real- and complex-valued SVMs using Clifford algebra. In this framework we design kernels involving the geometric product for linear and nonlinear classification and regression. The major advantage of our approach is that we redefine the optimization variables as multivectors. This allows the output itself to be a multivector, so multiple classes can be represented according to the dimension of the geometric algebra in which we work. By using the CSVM with a single Clifford kernel we greatly reduce the computational complexity. This is possible thanks to the Clifford product, which performs the direct product between the spaces of different grade involved in the optimization problem. We compare the CSVM against the most widely used approaches to multi-class classification and show that ours is more suitable for practical use on certain types of problems. The chapter includes several experiments applying the CSVM to classification and regression problems, as well as to 3D object recognition for visually guided robotics. In addition, we present the design of a recurrent system consisting of an LSTM network connected to a CSVM, and we study its performance on time-series experiments and on robot navigation using reinforcement learning.
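To make the abstract's central idea concrete, the following is a minimal sketch (not the authors' implementation) of the geometric product in the small algebra G(2,0), together with a toy multivector-valued kernel. The 4-component array layout, the function names, and the Gaussian weighting are illustrative assumptions; the chapter's actual kernel design and algebra dimension may differ.

```python
import numpy as np

# Multivectors of G(2,0) in the basis {1, e1, e2, e12}, stored as
# arrays [scalar, e1, e2, e12]. With e1^2 = e2^2 = +1 the geometric
# product follows the usual multiplication table (e1 e2 = e12, e12^2 = -1).

def geometric_product(u, v):
    """Geometric (Clifford) product of two G(2,0) multivectors."""
    s   = u[0]*v[0] + u[1]*v[1] + u[2]*v[2] - u[3]*v[3]
    a1  = u[0]*v[1] + u[1]*v[0] - u[2]*v[3] + u[3]*v[2]
    a2  = u[0]*v[2] + u[2]*v[0] + u[1]*v[3] - u[3]*v[1]
    a12 = u[0]*v[3] + u[3]*v[0] + u[1]*v[2] - u[2]*v[1]
    return np.array([s, a1, a2, a12])

def reverse(u):
    """Reversion ~u: flips the sign of the bivector part in G(2,0)."""
    return np.array([u[0], u[1], u[2], -u[3]])

def clifford_kernel(x, y, gamma=1.0):
    """Toy multivector-valued kernel (illustrative, not the chapter's):
    embed 2D samples as vectors x1*e1 + x2*e2, form the geometric
    product x * ~y, and scale it by a Gaussian factor."""
    mx = np.array([0.0, x[0], x[1], 0.0])
    my = np.array([0.0, y[0], y[1], 0.0])
    g = np.exp(-gamma * np.sum((np.asarray(x) - np.asarray(y)) ** 2))
    return g * geometric_product(mx, reverse(my))
```

The point of such a kernel is that one evaluation carries several grades at once: the scalar part of x·~y recovers the ordinary inner product, while the bivector part carries the oriented area x∧y, which is what lets a single multivector output encode multiple class components.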





Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Arana-Daniel, N., López-Franco, C., Bayro-Corrochano, E. (2010). Optimization with Clifford Support Vector Machines and applications. In: Tenne, Y., Goh, CK. (eds) Computational Intelligence in Optimization. Adaptation, Learning, and Optimization, vol 7. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12775-5_10


  • DOI: https://doi.org/10.1007/978-3-642-12775-5_10

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-12774-8

  • Online ISBN: 978-3-642-12775-5

  • eBook Packages: Engineering (R0)
