Reassessing combinatorial productivity exhibited by simple recurrent networks in language acquisition

Francis C K Wong, James W. Minett, William Shi Yuan Wang

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research (peer-reviewed)

Abstract

It has long been argued that connectionist models are inappropriate models of language acquisition, since one important property, generalization beyond the training space, cannot be exhibited by such networks. Recently, van der Velde et al. discussed the issue of combinatorial productivity, arguing that simple recurrent networks (SRNs) fail in this regard; they attempted to show that the generalization performance of SRNs is limited to word-word association. In this paper, we report a follow-up study with two simulations demonstrating that (i) bi-grams do not play the dominant role claimed, and (ii) SRNs are indeed able to exhibit combinatorial productivity when appropriately trained.
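The architecture at issue is Elman's simple recurrent network, in which the hidden layer at each time step receives the current input together with a copy of its own previous state, and is trained to predict the next word. The sketch below is a minimal illustrative implementation under that standard design, not the networks or training regime of the paper itself; the toy vocabulary, hidden size, and learning rate are arbitrary assumptions.

```python
import numpy as np

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

class SRN:
    """Elman-style simple recurrent network: the hidden layer at time t
    sees the input at t plus a copy of the hidden layer from t-1."""

    def __init__(self, vocab, hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W_xh = rng.normal(0, 0.1, (hidden, vocab))   # input  -> hidden
        self.W_hh = rng.normal(0, 0.1, (hidden, hidden))  # context -> hidden
        self.W_hy = rng.normal(0, 0.1, (vocab, hidden))   # hidden -> output
        self.h = np.zeros(hidden)

    def reset(self):
        self.h = np.zeros_like(self.h)

    def step(self, x):
        # combine current input with the recurrent context
        self.h = np.tanh(self.W_xh @ x + self.W_hh @ self.h)
        z = self.W_hy @ self.h
        e = np.exp(z - z.max())
        return e / e.sum()  # softmax distribution over the next word

    def train_step(self, x, target, lr=0.1):
        # Elman-style training: gradients flow one step back only
        # (the context is treated as a fixed input, no full BPTT)
        h_prev = self.h.copy()
        p = self.step(x)
        dz = p - target                           # softmax cross-entropy grad
        dh = (self.W_hy.T @ dz) * (1.0 - self.h ** 2)  # tanh derivative
        self.W_hy -= lr * np.outer(dz, self.h)
        self.W_xh -= lr * np.outer(dh, x)
        self.W_hh -= lr * np.outer(dh, h_prev)
        return -np.log(p[target.argmax()] + 1e-12)     # cross-entropy loss
```

A usage sketch on a toy noun-verb-noun grammar (hypothetical data, purely for illustration):

```python
words = ["boy", "girl", "sees", "hears", "<eos>"]
idx = {w: i for i, w in enumerate(words)}
sents = [["boy", "sees", "girl", "<eos>"],
         ["girl", "hears", "boy", "<eos>"]]

net = SRN(vocab=len(words), hidden=8)
for epoch in range(200):
    for s in sents:
        net.reset()
        for cur, nxt in zip(s, s[1:]):
            net.train_step(one_hot(idx[cur], len(words)),
                           one_hot(idx[nxt], len(words)))
```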
Original language: English
Title of host publication: International Joint Conference on Neural Networks 2006, IJCNN '06
Pages: 1596-1603
Number of pages: 8
Publication status: Published - 1 Dec 2006
Externally published: Yes
Event: International Joint Conference on Neural Networks 2006, IJCNN '06 - Vancouver, BC, Canada
Duration: 16 Jul 2006 - 21 Jul 2006

Conference

Conference: International Joint Conference on Neural Networks 2006, IJCNN '06
Country: Canada
City: Vancouver, BC
Period: 16/07/06 - 21/07/06

ASJC Scopus subject areas

  • Software
