Abstract
In this paper we present a connectionist model of sentence generation based on the novel idea that sentence meanings are represented in the brain as sequences of sensorimotor signals that are replayed during sentence generation. The model can learn surface patterns in language as well as abstract word-ordering conventions: the former are handled by a recurrent network module, the latter by a feed-forward network that learns to inhibit overt pronunciation of predicted words during certain phases of sensorimotor sequence rehearsal. A further novel element of the model is adaptive switching of control based on the uncertainty (entropy) of predicted word distributions. Experiments show that the model can learn the syntax, morphology and semantics of a target language and generalizes well to unseen meanings and sentences.
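The entropy-based switching mentioned above can be illustrated with a minimal sketch. The abstract does not specify the exact decision rule, so the function names, the threshold value, and the two controller labels below are illustrative assumptions: control goes to the surface (recurrent) module when its predicted word distribution is low-entropy (confident), and otherwise falls back to the abstract pathway.

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a predicted-word probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def choose_controller(word_dist, threshold=1.0):
    """Illustrative decision rule (not the paper's exact formulation):
    a confident (low-entropy) prediction keeps the surface module in
    control; an uncertain one hands control to the abstract pathway.
    The threshold of 1.0 bit is an arbitrary choice for this sketch."""
    return "surface" if entropy(word_dist) < threshold else "abstract"

# A sharply peaked distribution yields low entropy -> surface control;
# a uniform distribution yields maximal entropy -> abstract control.
peaked = [0.97, 0.01, 0.01, 0.01]
uniform = [0.25, 0.25, 0.25, 0.25]
```

For instance, `choose_controller(peaked)` returns `"surface"` while `choose_controller(uniform)` returns `"abstract"`, since the uniform distribution over four words has the maximal entropy of 2 bits.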
© 2011 Springer-Verlag Berlin Heidelberg
Takac, M., Benuskova, L., Knott, A. (2011). A Sentence Generation Network That Learns Surface and Abstract Syntactic Structures. In: Honkela, T., Duch, W., Girolami, M., Kaski, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2011. ICANN 2011. Lecture Notes in Computer Science, vol 6792. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21738-8_44
Print ISBN: 978-3-642-21737-1
Online ISBN: 978-3-642-21738-8