Recognition and Generation of Sentences through Self-organizing Linguistic Hierarchy Using MTRNN

  • Conference paper
  • In: Trends in Applied Intelligent Systems (IEA/AIE 2010)

Abstract

We show that a Multiple Timescale Recurrent Neural Network (MTRNN) can acquire the capabilities of recognizing and generating sentences by self-organizing a hierarchical linguistic structure. There have been many studies aimed at finding out whether a neural system such as the brain can acquire language without innate linguistic faculties. These studies have found that some kinds of recurrent neural networks can learn grammar. However, these models could not acquire the capability of deterministically generating a variety of sentences, which is an essential part of language function. In addition, the existing models require a word set in advance in order to learn the grammar. Learning language without prior knowledge about words requires the capability of hierarchical composition, such as composing characters into words and words into sentences, which is the essence of the rich expressiveness of language. In our experiment, we trained our model to learn language using only a sentence set, without any prior knowledge about words or grammar. Our experimental results demonstrated that the model could acquire the capabilities of recognizing and deterministically generating grammatical sentences, even ones it had not been trained on. An analysis of the neural activations in our model revealed that the MTRNN self-organized the linguistic structure hierarchically by taking advantage of the differences in time scale among its neurons: the fastest-changing neurons represented “characters,” more slowly changing neurons represented “words,” and the slowest-changing neurons represented “sentences.”
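
The hierarchy described above comes from assigning different time constants to groups of neurons: units with small time constants react quickly to each incoming character, while units with large time constants change slowly and can hold word- and sentence-level context. Below is a minimal NumPy sketch of such a multiple-timescale leaky-integrator update; the group sizes, time constants, and names are assumptions made for illustration, and the sketch omits the training procedure the authors use to learn the weights.

```python
import numpy as np

# Minimal sketch of a multiple-timescale ("leaky integrator") recurrent update.
# All sizes, time constants, and names are illustrative assumptions, not the
# paper's actual configuration. Small tau -> fast dynamics, large tau -> slow.
rng = np.random.default_rng(0)

n_io, n_fast, n_slow = 30, 40, 10        # input/output, fast-context, slow-context units
n = n_io + n_fast + n_slow
tau = np.concatenate([
    np.full(n_io, 2.0),                  # fastest units (character-level features)
    np.full(n_fast, 5.0),                # intermediate units (word-level features)
    np.full(n_slow, 70.0),               # slowest units (sentence-level context)
])

W = rng.normal(scale=0.1, size=(n, n))   # recurrent weights (learned during training in practice)
b = np.zeros(n)                          # biases

def step(u, ext_in):
    """One update of the internal state u; units with larger tau change more slowly."""
    x = np.tanh(u)                       # unit activations
    net = W @ x + b                      # recurrent input
    net[:n_io] += ext_in                 # external input, e.g. a one-hot character code
    return (1.0 - 1.0 / tau) * u + (1.0 / tau) * net

# Roll the network forward over a dummy 5-character sequence of one-hot vectors.
u = np.zeros(n)
for char_vec in np.eye(n_io)[:5]:
    u = step(u, char_vec)
```

Because each group updates at its own rate, the fast units can carry character-level information while the slow units preserve a compact code for the whole sentence, which is the kind of division of labor the analysis of neural activations points to.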

Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hinoshita, W., Arie, H., Tani, J., Ogata, T., Okuno, H.G. (2010). Recognition and Generation of Sentences through Self-organizing Linguistic Hierarchy Using MTRNN. In: García-Pedrajas, N., Herrera, F., Fyfe, C., Benítez, J.M., Ali, M. (eds.) Trends in Applied Intelligent Systems. IEA/AIE 2010. Lecture Notes in Computer Science, vol. 6098. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-13033-5_5

  • DOI: https://doi.org/10.1007/978-3-642-13033-5_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-13032-8

  • Online ISBN: 978-3-642-13033-5

  • eBook Packages: Computer Science, Computer Science (R0)
