Abstract
This chapter introduces a generalization of real- and complex-valued SVMs based on Clifford algebra. In this framework we design kernels built on the geometric product for linear and nonlinear classification and regression. The major advantage of our approach is that the optimization variables are redefined as multivectors. This allows the output itself to be a multivector, so the number of representable classes scales with the dimension of the geometric algebra in which we work. By using the Clifford SVM (CSVM) with a single Clifford kernel we greatly reduce the computational complexity: the Clifford product directly combines the subspaces of different grade involved in the optimization problem. We compare the CSVM against the most widely used approaches to multi-class classification and show that ours is better suited for practical use on certain types of problems. The chapter includes several experiments applying the CSVM to classification and regression, as well as to 3D object recognition for visually guided robotics. In addition, we present the design of a recurrent system in which an LSTM network is connected to a CSVM, and we study its performance on time-series experiments and on robot navigation using reinforcement learning.
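To make the central idea concrete, the following is a minimal sketch (not the chapter's exact construction) of the geometric product in the small Clifford algebra Cl(2,0), whose four basis blades {1, e1, e2, e12} could encode up to four class states in a multivector output. The function names, the coefficient ordering, and the Gaussian-weighted "Clifford kernel" are illustrative assumptions introduced here for exposition only.

```python
import numpy as np

# Basis order for Cl(2,0): [scalar, e1, e2, e12], with e1^2 = e2^2 = +1
# and e12 = e1 e2, so e12^2 = -1.
def geometric_product(a, b):
    """Geometric (Clifford) product of two multivectors in Cl(2,0),
    each given as a coefficient vector [scalar, e1, e2, e12]."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a1*b3 + a2*b0 - a3*b1,   # e2 part
        a0*b3 + a1*b2 - a2*b1 + a3*b0,   # e12 (bivector) part
    ])

def clifford_kernel(x, y, gamma=1.0):
    """Illustrative multivector-valued kernel (an assumption, not the
    chapter's kernel): a Gaussian similarity between the coefficient
    vectors weights the geometric product of the two multivectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    k = np.exp(-gamma * np.sum((x - y) ** 2))
    return k * geometric_product(x, y)

# Sanity check of the algebra: e1 * e2 = e12, and e2 * e1 = -e12.
e1 = np.array([0., 1., 0., 0.])
e2 = np.array([0., 0., 1., 0.])
print(geometric_product(e1, e2))   # the e12 basis blade
```

The anticommutativity of distinct basis vectors (e1 e2 = -e2 e1) is what lets a single product mix information across grades, which is the mechanism the chapter exploits to handle multiple classes with one kernel.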
© 2010 Springer-Verlag Berlin Heidelberg
Cite this chapter
Arana-Daniel, N., López-Franco, C., Bayro-Corrochano, E. (2010). Optimization with Clifford Support Vector Machines and applications. In: Tenne, Y., Goh, CK. (eds) Computational Intelligence in Optimization. Adaptation, Learning, and Optimization, vol 7. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-12775-5_10
DOI: https://doi.org/10.1007/978-3-642-12775-5_10
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-12774-8
Online ISBN: 978-3-642-12775-5
eBook Packages: Engineering (R0)