Idioms are an ideal testbed for studying the interplay of lexical (content preparation) and syntactic (structure building) mechanisms in language production. I will present a neural network model of sentence generation that can produce both continuous and discontinuous idioms within regular compositional sentences. Detailed analysis of the representational space of the network's hidden layer shows that (1) an implicit structure-content division can arise from internal reorganization of the representational space of a single Simple Recurrent Network during learning, (2) idioms can be produced by the same general sequencing mechanism that handles regular sentences, and (3) idiom production is modulated by content-specific mechanisms.
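The architecture named above, a Simple Recurrent (Elman) Network, combines the current word with a copy of the previous hidden state at every production step, so the hidden layer can carry both structural and lexical information. The following is a minimal sketch of one such forward pass; the toy vocabulary, layer sizes, and weight initialization are illustrative assumptions, not the model described in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["the", "cat", "kicked", "bucket", "<eos>"]  # toy vocabulary (assumed)
V, H = len(VOCAB), 8  # vocabulary size, hidden-layer size (assumed)

W_xh = rng.normal(0, 0.1, (H, V))   # input word -> hidden
W_hh = rng.normal(0, 0.1, (H, H))   # context (previous hidden) -> hidden
W_hy = rng.normal(0, 0.1, (V, H))   # hidden -> output (next-word scores)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def step(word_idx, h_prev):
    """One production step: merge the current word with the context layer."""
    x = np.zeros(V)
    x[word_idx] = 1.0                       # one-hot encoding of the input word
    h = np.tanh(W_xh @ x + W_hh @ h_prev)   # hidden state; the recurrence lets
                                            # it track sequential structure
    y = softmax(W_hy @ h)                   # distribution over the next word
    return h, y

# Run the (untrained) network over a sentence prefix.
h = np.zeros(H)
for w in ["the", "cat"]:
    h, y = step(VOCAB.index(w), h)
```

After training on sentences containing idioms, it is the geometry of the hidden states `h` that the abstract's analysis examines, not the output distribution alone.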