Neurocomputing with High Dimensional Parameters

  • Bipin Kumar Tripathi
Part of the Studies in Computational Intelligence book series (SCI, volume 571)


Neurocomputing has established its identity through robustness toward ill-defined and noisy problems in science and engineering, owing to the good learning, generalization, and association abilities of artificial neural networks. In the recent past, different kinds of neural networks have been proposed and successfully applied to various applications involving single-dimension parameters. Some important variants are the radial basis function network, the multilayer perceptron, support vector machines, functional link networks, and higher-order neural networks. These variants with single-dimension parameters have been employed for machine learning problems in both single and high dimensions. A single conventional neuron can accept only a real value as its input, so a network must conventionally be configured with as many input neurons as there are dimensions (parameters) in the high-dimensional data. This type of configuration is sometimes unnatural and may not achieve satisfactory performance on high-dimensional problems. Extensive research in the recent past has revealed that neural networks with high-dimensional parameters have several advantages over conventional ones, including better learning capability for high-dimensional problems. Moreover, they have a surprising ability to learn and generalize phase information among the different components simultaneously with magnitude, which is not possible with a conventional neural network. There are two approaches to naturally extending the dimensionality of data elements treated as single entities in high-dimensional neural networks. The first extends the number field from real numbers (one dimension) to complex numbers (two dimensions), quaternions (four dimensions), and octonions (eight dimensions). The second extends the dimensionality of a data element using a high-dimensional vector with scalar components, i.e., three-dimensional and N-dimensional real-valued vectors.
Applications of these numbers and vectors to neural networks have been extensively investigated in this chapter.
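The first approach above can be illustrated with a small sketch. A conventional real-valued neuron would need two inputs and two independent weights to process a 2-D point, whereas a complex-valued neuron accepts the same point as one complex input, so a single complex weight rotates (phase) and scales (magnitude) it jointly. The sketch below is a minimal, hypothetical illustration of this idea, assuming the common "split-type" activation (a nonlinearity applied to the real and imaginary parts separately); it is not code from the chapter, and the function names are the author's own.

```python
import numpy as np

def split_tanh(z):
    """'Split-type' activation: apply tanh to the real and imaginary
    parts separately, a common choice in complex-valued networks."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def complex_neuron(inputs, weights, bias):
    """Weighted sum in the complex domain followed by a split activation."""
    return split_tanh(np.dot(weights, inputs) + bias)

# The 2-D point (1, 1) is encoded as the single complex number 1 + 1j.
x = np.array([1.0 + 1.0j])
# A weight of magnitude 1 and phase 90 degrees (the number 1j):
# multiplying by it rotates the input, so the neuron acts on phase,
# not just magnitude.
w = np.array([0.0 + 1.0j])
b = 0.0 + 0.0j

# 1j * (1 + 1j) = -1 + 1j, a pure 90-degree rotation, then split tanh.
y = complex_neuron(x, w, b)
```

A conventional real-valued neuron sees only the two scalar coordinates and has no built-in notion of rotation; here the phase of the weight itself encodes it, which is the geometric structure the chapter attributes to high-dimensional neurons.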


Keywords: Neural Network, Complex Number, Quaternionic Space, Hypercomplex Number, Conventional Neural Network



Copyright information

© Springer India 2015

Authors and Affiliations

  1. Computer Science and Engineering, Harcourt Butler Technological Institute, Kanpur, India
