Enhancement of memory capacity of neural networks

Document Type

Conference Proceeding

Publication Date

1-1-1992

Abstract

Two approaches to enhancing the memory capacity of neural networks are presented. The first is to partition a large neural network into a number of independent subnetworks, which drastically increases memory capacity and reduces the convergence time to stable states. The second is to use limit cycles of states, which can simulate a sequence of temporal memory states; this also greatly increases memory capacity without requiring higher-order terms. Finding the limit cycles of a given synaptic matrix is a very difficult problem. Analysis and synthesis of limit cycles are presented.
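The first approach can be illustrated with a minimal sketch, not taken from the paper itself: a Hopfield-style associative memory whose synaptic matrix is made block-diagonal, so each subnetwork stores and retrieves its own patterns with no cross-talk between blocks. The training rule, pattern choices, and block sizes below are illustrative assumptions, not the authors' construction.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian outer-product rule with zeroed self-connections."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, state, steps=10):
    """Synchronous sign updates until the state stops changing."""
    s = state.astype(float).copy()
    for _ in range(steps):
        nxt = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

# Two independent 8-unit subnetworks, each storing the same two
# orthogonal patterns. The full 16-unit synaptic matrix is
# block-diagonal, so an error in one block never spills into the
# other -- the confinement of cross-talk behind the capacity gain.
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1], dtype=float)
W_block = train_hopfield(np.stack([p1, p2]))

W_full = np.zeros((16, 16))
W_full[:8, :8] = W_block   # subnetwork 1
W_full[8:, 8:] = W_block   # subnetwork 2

# Corrupt one bit in each half; each subnetwork independently
# restores its own stored pattern.
probe = np.concatenate([p1, p2])
probe[0] *= -1
probe[8] *= -1
restored = recall(W_full, probe)
```

The second approach of the abstract (limit cycles for temporal sequences) would instead use an asymmetric synaptic matrix, which this symmetric Hebbian sketch does not cover.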

Identifier

84944988718 (Scopus)

ISBN

0780307372

Publication Title

IEEE International Conference on Intelligent Robots and Systems

External Full Text Location

https://doi.org/10.1109/IROS.1992.587384

e-ISSN

2153-0866

ISSN

2153-0858

First Page

519

Last Page

526

Volume

1
