
Yes, Trees May Have Neurons

  • Alois P. Heinz
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2598)

Abstract

Neural trees are introduced. These descendants of decision trees represent (approximations to) arbitrary continuous functions. They support efficient evaluation as well as the application of arithmetic operations, differentiation, and definite integration.
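
The abstract describes neural trees only at a high level, so the following Python sketch is purely illustrative and is not the construction from the chapter. It assumes a binary tree whose internal nodes blend their two subtrees with a smooth sigmoid gate on one input variable and whose leaves hold constants; that yields a continuous function with a straightforward recursive evaluation and partial derivative. The names Leaf, Split, evaluate, and partial are hypothetical.

    import math

    def _sigmoid(z):
        # Smooth gate in (0, 1); a steeper slope approaches a hard decision-tree split.
        return 1.0 / (1.0 + math.exp(-z))

    class Leaf:
        """Constant leaf: f(x) = value."""
        def __init__(self, value):
            self.value = value

        def evaluate(self, x):
            return self.value

        def partial(self, x, i):
            # A constant has zero derivative with respect to every input.
            return 0.0

    class Split:
        """Internal node: f(x) = (1 - g) * left(x) + g * right(x),
        where g = sigmoid(steepness * (x[var] - threshold))."""
        def __init__(self, var, threshold, steepness, left, right):
            self.var, self.threshold, self.steepness = var, threshold, steepness
            self.left, self.right = left, right

        def _gate(self, x):
            return _sigmoid(self.steepness * (x[self.var] - self.threshold))

        def evaluate(self, x):
            g = self._gate(x)
            return (1.0 - g) * self.left.evaluate(x) + g * self.right.evaluate(x)

        def partial(self, x, i):
            # Product and chain rule applied to the blended form above.
            g = self._gate(x)
            dg = self.steepness * g * (1.0 - g) if i == self.var else 0.0
            return (dg * (self.right.evaluate(x) - self.left.evaluate(x))
                    + (1.0 - g) * self.left.partial(x, i)
                    + g * self.right.partial(x, i))

    if __name__ == "__main__":
        # One-dimensional example: a smoothed step between the values 0.25 and 0.75.
        tree = Split(var=0, threshold=0.5, steepness=10.0,
                     left=Leaf(0.25), right=Leaf(0.75))
        print(tree.evaluate([0.2]), tree.evaluate([0.8]))  # close to 0.25 and 0.75
        print(tree.partial([0.5], 0))                      # slope of the blend at the split

Differentiation here simply recurses over the tree with the chain rule, in the spirit of the automatic-differentiation keywords below; under the same assumptions a definite integral could be accumulated node by node. The chapter's actual algorithms are not reproduced on this page.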

Keywords

Marked Interval, Evaluation Algorithm, Automatic Differentiation, Algorithm Diff, Arbitrary Continuous Function
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Alois P. Heinz
  1. University of Applied Sciences Heilbronn, Heilbronn
