A Comparison of Simple Recurrent and Sequential Cascaded Networks for Formal Language Recognition

by Jacobsen, Henrik

Abstract (Summary)
Two classes of recurrent neural network models are compared in this report: simple recurrent networks (SRNs) and sequential cascaded networks (SCNs), which are first- and second-order networks, respectively. The comparison aims to describe and analyse the behaviour of the networks so that the differences between them become clear. A theoretical analysis, using techniques from dynamical systems theory (DST), shows that the second-order network has a wider range of possible dynamical behaviours than the first-order network. It also reveals that the second-order network can interpret its context with an input-dependent function in the output nodes. The experiments were based on training with backpropagation (BP) and an evolutionary algorithm (EA) on the a^n b^n grammar, which requires the ability to count. This analysis revealed differences both between the two training regimes tested and between the performance of the two types of network. The EA was found to be far more reliable than BP in this domain. Another important finding from the experiments was that although the SCN had more ways than the SRN to solve the problem, these were not exploited in the domain tested in this project.
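The first- vs second-order distinction above can be made concrete with a minimal sketch of the two state updates. This is an illustration only, not the thesis's implementation: it assumes sigmoid activations and one-hot symbol encoding, and all names and dimensions are hypothetical. In a first-order SRN, state and input contribute additively; in a second-order SCN, each input symbol effectively selects its own weight matrix acting on the previous state.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def srn_step(state, x, W_s, W_x, b):
    # First-order (SRN-style) update: state and input enter additively.
    return sigmoid(W_s @ state + W_x @ x + b)

def scn_step(state, x, W, b):
    # Second-order (SCN-style) update: W has shape
    # (n_state, n_state, n_input), so the multiplicative interaction
    # lets each input symbol pick out its own state-transition matrix.
    return sigmoid(np.einsum('ijk,j,k->i', W, state, x) + b)

# Illustrative use with a one-hot encoding of the symbols 'a' and 'b':
rng = np.random.default_rng(0)
s = rng.random(3)                      # previous hidden state
x = np.array([1.0, 0.0])               # one-hot input symbol 'a'
s_srn = srn_step(s, x, rng.standard_normal((3, 3)),
                 rng.standard_normal((3, 2)), np.zeros(3))
s_scn = scn_step(s, x, rng.standard_normal((3, 3, 2)), np.zeros(3))
```

The extra input index in the SCN's weight tensor is what gives it the richer repertoire of input-dependent dynamics discussed in the abstract.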
Bibliographical Information:


School: Högskolan i Skövde

School Location: Sweden

Source Type: Master's Thesis

Keywords: recurrent neural networks, formal language

Date of Publication: 12/12/2007

© 2009 All Rights Reserved.