Training auto-associative recurrent neural network with preprocessed training data

Document Type

Conference Proceeding

Publication Date

8-19-1993

Abstract

The Auto-Associative Recurrent Network (AARN), a modified version of the Simple Recurrent Network (SRN), can be trained to behave as a recognizer of a language generated by a regular grammar. The network is trained successfully on an unbounded number of sequences of the language, generated randomly from the Finite State Automaton (FSA) of the language. However, the training algorithm fails when training is restricted to a fixed finite set of examples. Here, we present a new algorithm for training the AARN from a finite set of language examples. A tree is constructed by preprocessing the training data, and the AARN is trained with sequences generated randomly from the tree. The results of the simulation experiments are discussed.
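The abstract describes compiling the finite training set into a tree and then sampling random training sequences from it. A minimal sketch of that idea, assuming a standard prefix tree (trie) with an explicit end-of-string marker and uniform random walks; the function names and the toy example set are illustrative assumptions, not the authors' implementation:

```python
import random

END = "$"  # hypothetical marker indicating a training string may end here

def build_tree(examples):
    """Build a prefix tree (trie) from a finite set of example strings."""
    root = {}
    for s in examples:
        node = root
        for sym in s:
            node = node.setdefault(sym, {})
        node[END] = {}  # record a valid termination point
    return root

def sample_sequence(root, rng=random):
    """Generate one training sequence by a uniform random walk from the root."""
    node, out = root, []
    while node:
        sym = rng.choice(sorted(node))
        if sym == END:
            break
        out.append(sym)
        node = node[sym]
    return "".join(out)

# Toy stand-in for a finite sample of a regular language
tree = build_tree(["ab", "abb", "ba"])
seq = sample_sequence(tree)  # always yields one of the training strings
```

Because every walk terminates only at an end marker, each sampled sequence is guaranteed to be a member of the original finite training set, while the random choice of branches varies the order in which examples are presented to the network.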

Identifier

84993748320 (Scopus)

Publication Title

Proceedings of SPIE - The International Society for Optical Engineering

External Full Text Location

https://doi.org/10.1117/12.152645

e-ISSN

1996-756X

ISSN

0277-786X

First Page

420

Last Page

428

Volume

1966
