"Integrated deep learning method for workload and resource prediction in cloud systems" by Jing Bi, Shuang Li et al.
 

Integrated deep learning method for workload and resource prediction in cloud systems

Document Type

Article

Publication Date

2-1-2021

Abstract

Cloud computing providers face several challenges in precisely forecasting large-scale workload and resource time series. Accurate prediction helps them achieve intelligent resource allocation, ensuring that users' performance needs are strictly met without wasting computing, network, and storage resources. This work applies a logarithmic operation to reduce the standard deviation before smoothing workload and resource sequences. Noise interference and extreme points are then removed with a powerful filter, and a Min–Max scaler is adopted to standardize the data. An integrated deep learning method for time-series prediction is designed, incorporating both bi-directional and grid long short-term memory (LSTM) networks to achieve high-quality prediction of workload and resource time series. Experimental comparisons on Google cluster trace datasets demonstrate that the prediction accuracy of the proposed method is better than that of several widely adopted approaches.
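The preprocessing steps named in the abstract (a logarithmic operation followed by Min–Max scaling) can be sketched as below. This is a minimal illustration, not the paper's implementation: the function names and sample data are invented for the example, and the paper's smoothing filter is omitted.

```python
import math

def log_transform(series):
    """Apply log(1 + x) to compress the spread of a non-negative series,
    reducing its standard deviation before further processing."""
    return [math.log1p(x) for x in series]

def min_max_scale(series):
    """Linearly rescale a series into the range [0, 1]."""
    lo, hi = min(series), max(series)
    if hi == lo:
        return [0.0 for _ in series]  # constant series: map everything to 0
    return [(x - lo) / (hi - lo) for x in series]

# Hypothetical workload samples (e.g. requests per interval).
workload = [120.0, 80.0, 950.0, 300.0, 60.0]
scaled = min_max_scale(log_transform(workload))
```

The scaled sequence would then be fed to the prediction model; the log step keeps the single large spike (950) from dominating the scaled range as strongly as it would under raw Min–Max scaling.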

Identifier

85098779545 (Scopus)

Publication Title

Neurocomputing

External Full Text Location

https://doi.org/10.1016/j.neucom.2020.11.011

e-ISSN

1872-8286

ISSN

0925-2312

First Page

35

Last Page

48

Volume

424

Grant

61802015

Fund Ref

Alexander von Humboldt-Stiftung
