A Sentence Generation Network That Learns Surface and Abstract Syntactic Structures

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2011 (ICANN 2011)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6792)

Abstract

In this paper we present a connectionist model of sentence generation based on the novel idea that sentence meanings are represented in the brain as sequences of sensorimotor signals which are replayed during sentence generation. Our model can learn surface patterns in language as well as abstract word-ordering conventions. The former is achieved by a recurrent network module; the latter by a feed-forward network that learns to inhibit overt pronunciation of predicted words in certain phases of sensorimotor sequence rehearsal. Another novel element of the model is adaptive switching of control based on uncertainty (entropy) of predicted word distributions. Experiments with the model show that it can learn the syntax, morphology and semantics of a target language and generalize well to unseen meanings/sentences.
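The adaptive switching mechanism described in the abstract — handing control between modules based on the entropy of the predicted word distribution — can be sketched as follows. This is a minimal illustration, not the paper's implementation; the `threshold` value and the module labels are assumptions introduced here for clarity.

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a predicted word distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def select_controller(word_dist, threshold=1.0):
    """Hypothetical control switch: when the network's next-word
    prediction is low-entropy (confident), let the recurrent surface
    module drive generation; when uncertainty is high, defer to the
    semantic (sensorimotor) module. Threshold is illustrative."""
    if entropy(word_dist) < threshold:
        return "surface"
    return "semantic"
```

For example, a sharply peaked distribution such as `[0.97, 0.01, 0.01, 0.01]` has low entropy and would leave the surface module in control, while a uniform distribution over four words has entropy 2.0 bits and would trigger a switch to the semantic module.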




Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Takac, M., Benuskova, L., Knott, A. (2011). A Sentence Generation Network That Learns Surface and Abstract Syntactic Structures. In: Honkela, T., Duch, W., Girolami, M., Kaski, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2011. ICANN 2011. Lecture Notes in Computer Science, vol 6792. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21738-8_44

  • DOI: https://doi.org/10.1007/978-3-642-21738-8_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21737-1

  • Online ISBN: 978-3-642-21738-8

  • eBook Packages: Computer Science (R0)
