Neurocomputing with High Dimensional Parameters

Chapter

Part of the book series: Studies in Computational Intelligence (SCI, volume 571)

Abstract

Neurocomputing has established its identity through robustness toward ill-defined and noisy problems in science and engineering, owing to the strong learning, generalization, and association abilities of artificial neural networks. In the recent past, many kinds of neural networks have been proposed and successfully applied to applications involving single-dimension (real-valued) parameters. Important variants include the radial basis function network, the multilayer perceptron, support vector machines, functional link networks, and higher-order neural networks. These real-valued variants have been employed for machine learning problems in both single and high dimensions. A conventional neuron, however, accepts only a real value as its input, so a network must be configured with as many input neurons as there are dimensions (parameters) in the high dimensional data. Such a configuration is often unnatural and may not achieve satisfactory performance on high dimensional problems. Extensive research in the recent past has revealed that neural networks with high dimensional parameters have several advantages over conventional ones and better learning capability for high dimensional problems. Moreover, they have the surprising ability to learn and generalize phase information among the different components simultaneously with magnitude, which is not possible with conventional neural networks. There are two approaches to naturally extending the dimensionality of data elements treated as single entities in high dimensional neural networks. The first extends the number field from real numbers (one dimension) to complex numbers (two dimensions), quaternions (four dimensions), and octonions (eight dimensions). The second extends the dimensionality of a data element using a high dimensional vector with scalar components, i.e., three-dimensional and N-dimensional real-valued vectors. Applications of these numbers and vectors to neural networks are extensively investigated in this chapter.
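As a concrete illustration of the two extension approaches, the following sketch (illustrative only, not taken from the chapter; the tanh-based split activation and all numeric values are assumptions) builds a single complex-valued neuron that accepts each two-dimensional data element as one complex number, together with the Hamilton product underlying quaternion-valued neurons:

```python
import numpy as np

def complex_neuron(z, w, b):
    """One complex-valued neuron: each two-dimensional data element enters as
    a single complex number; a split activation processes the real and
    imaginary parts of the aggregation separately (see Note 3 below)."""
    s = np.dot(w, z) + b                       # complex weighted aggregation
    return np.tanh(s.real) + 1j * np.tanh(s.imag)

def hamilton_product(p, q):
    """Quaternion (four-dimensional) multiplication; p, q are [w, x, y, z]."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,   # real part
        w1*x2 + x1*w2 + y1*z2 - z1*y2,   # i component
        w1*y2 - x1*z2 + y1*w2 + z1*x2,   # j component
        w1*z2 + x1*y2 - y1*x2 + z1*w2,   # k component
    ])

# Three 2-D inputs handled as three single entities by one neuron.
z = np.array([1 + 2j, 0.5 - 1j, -0.3 + 0.7j])
w = np.array([0.2 - 0.1j, 0.4 + 0.3j, -0.5 + 0.2j])
print(complex_neuron(z, w, b=0.1 + 0.1j))
print(hamilton_product([1.0, 0.0, 1.0, 0.0], [0.5, 0.5, 0.75, 0.25]))
```

An octonion-valued neuron would extend hamilton_product to eight components, at the cost of losing associativity of multiplication.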


Notes

  1. In abstract algebra, a field is a set F together with two binary operations, referred to as addition and multiplication, that are associative and commutative, possess identity elements and inverses (every nonzero element having a multiplicative inverse), and satisfy distributivity of multiplication over addition.

  2. Interested readers may consult a modern abstract algebra text to understand the difficulty of building a three-dimensional field extension over R and Hamilton's breakthrough concerning the necessity of three distinct imaginary parts along with one real part; a minimal numerical illustration follows these notes.

  3. A split complex function is a function \(f{:}\; \mathbb{C} \longrightarrow \mathbb{C}\) for which the real and imaginary parts of the complex argument are processed separately by a real function of a real argument, as sketched in the second example below.
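To illustrate the difficulty mentioned in Note 2, here is a minimal sketch (an illustrative assumption, not from the chapter) of one natural but failed candidate for a three-dimensional number system, componentwise multiplication on R^3:

```python
import numpy as np

# Componentwise multiplication on R^3 creates zero divisors: the product of
# two nonzero elements can be zero, so nonzero elements lack multiplicative
# inverses and no field is obtained.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
print(a * b)  # [0. 0. 0.] even though neither factor is zero
```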
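A short sketch of the distinction in Note 3 (assuming tanh as the real function; the note permits any real function of a real argument):

```python
import numpy as np

def split_tanh(z):
    """Split complex function: tanh acts on the real and imaginary
    parts of the complex argument independently."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def full_tanh(z):
    """Fully complex function: tanh acts on the complex argument as a whole."""
    return np.tanh(z)

z = 0.5 + 2.0j
print(split_tanh(z))  # processes Re(z) and Im(z) separately
print(full_tanh(z))   # generally differs from the split version
```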


Author information

Correspondence to Bipin Kumar Tripathi.


Copyright information

© 2015 Springer India

About this chapter

Cite this chapter

Tripathi, B.K. (2015). Neurocomputing with High Dimensional Parameters. In: High Dimensional Neurocomputing. Studies in Computational Intelligence, vol 571. Springer, New Delhi. https://doi.org/10.1007/978-81-322-2074-9_2


  • DOI: https://doi.org/10.1007/978-81-322-2074-9_2

  • Publisher Name: Springer, New Delhi

  • Print ISBN: 978-81-322-2073-2

  • Online ISBN: 978-81-322-2074-9

  • eBook Packages: Engineering, Engineering (R0)
