An Adaptive Deep Belief Network with Sparse Restricted Boltzmann Machines

Document Type

Article

Publication Date

10-1-2020

Abstract

The deep belief network (DBN) is an efficient learning model for representing unknown data, especially in nonlinear systems. However, it is extremely hard to design a DBN with a satisfactory, robust structure because of its traditional dense representation. In addition, fine-tuning based on the backpropagation algorithm tends to yield poor performance because it is easily trapped in local optima. In this article, we propose a novel DBN model based on adaptive sparse restricted Boltzmann machines (AS-RBM) and partial least squares (PLS) regression fine-tuning, abbreviated as ARP-DBN, to obtain a more robust and accurate model than existing ones. First, an adaptive learning step size is designed to accelerate RBM training, and two regularization terms are introduced into the training process to realize sparse representation. Second, the initial weights derived from AS-RBM are further optimized via layer-by-layer PLS modeling, proceeding from the output layer to the input layer. Third, we present a convergence and stability analysis of the proposed method. Finally, our approach is tested on Mackey-Glass time-series prediction, 2-D function approximation, and unknown system identification. Simulation results demonstrate that ARP-DBN achieves higher learning accuracy, faster learning speed, and a more robust model than existing approaches.
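
The abstract describes the method only at a high level; as a rough illustration, the sketch below shows one contrastive-divergence (CD-1) weight update for an RBM with a sparsity penalty and a step size that adapts according to gradient agreement. The function names, the form of the sparsity term, and the adaptation rule are assumptions for illustration only, not the paper's AS-RBM equations; bias terms and the PLS fine-tuning stage are omitted.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, v0, lr, prev_dW, momentum=0.5,
             sparsity_target=0.05, sparsity_cost=0.01):
    """One CD-1 weight update with a sparsity penalty (biases omitted).

    W: weights (n_visible x n_hidden); v0: mini-batch (n_samples x n_visible).
    Returns the updated W, the raw gradient, and the applied increment dW.
    """
    # Positive phase: hidden activations driven by the data.
    h0 = sigmoid(v0 @ W)
    # Negative phase: one Gibbs step gives the model's reconstruction.
    v1 = sigmoid(h0 @ W.T)
    h1 = sigmoid(v1 @ W)
    # CD-1 estimate of the log-likelihood gradient.
    grad = (v0.T @ h0 - v1.T @ h1) / v0.shape[0]
    # Sparsity term: push mean hidden activity toward the target level
    # (illustrative penalty, not the paper's regularizers).
    grad -= sparsity_cost * np.outer(v0.mean(axis=0),
                                     h0.mean(axis=0) - sparsity_target)
    dW = momentum * prev_dW + lr * grad
    return W + dW, grad, dW

def adapt_lr(lr, grad, prev_grad, grow=1.2, shrink=0.5):
    """Grow the step size while successive gradients agree; shrink otherwise."""
    return lr * (grow if np.sum(grad * prev_grad) > 0 else shrink)

A training loop would call cd1_step once per mini-batch and adapt_lr between batches, enlarging the step while consecutive gradients point in a consistent direction and shrinking it when they conflict.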

Identifier

85077258713 (Scopus)

Publication Title

IEEE Transactions on Neural Networks and Learning Systems

External Full Text Location

https://doi.org/10.1109/TNNLS.2019.2952864

e-ISSN

2162-2388

ISSN

2162-237X

PubMed ID

31880561

First Page

4217

Last Page

4228

Issue

10

Volume

31

Grant

2018AAA0101600

Fund Ref

National Natural Science Foundation of China
