Scalable statistical inference for averaged implicit stochastic gradient descent
Document Type
Article
Publication Date
12-1-2019
Abstract
Stochastic gradient descent (SGD) provides a scalable way to compute parameter estimates in applications involving large-scale or streaming data. As an alternative, averaged implicit SGD (AI-SGD) has been shown to be more stable and more efficient than standard SGD. Although the asymptotic properties of AI-SGD are well established, statistical inference based on it, such as interval estimation, remains unexplored. The bootstrap method is not computationally feasible because it requires repeatedly resampling from the entire data set, and the plug-in method is not applicable when no explicit covariance matrix formula is available. In this paper, we propose a scalable statistical inference procedure for conducting inference based on the AI-SGD estimator. The proposed procedure updates the AI-SGD estimate, together with many randomly perturbed AI-SGD estimates, upon the arrival of each observation. We derive large-sample theoretical properties of the proposed procedure and examine its performance via simulation studies.
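To illustrate the kind of online procedure the abstract describes, the sketch below shows one possible implementation of AI-SGD together with randomly perturbed AI-SGD replicates, updated as each observation arrives. It is not the paper's exact algorithm: it assumes a squared-error loss (so the implicit update has a closed form), a polynomial learning rate gamma_n = gamma0 * n^(-alpha), and independent Exp(1) weights for the perturbed replicates; the function name ai_sgd_with_perturbations and the parameters gamma0, alpha, and n_perturb are illustrative choices.

```python
import numpy as np

def ai_sgd_with_perturbations(X, y, n_perturb=50, gamma0=1.0, alpha=0.6, seed=0):
    """Sketch: online AI-SGD plus randomly perturbed replicates for
    interval estimation, illustrated on least-squares regression.

    Assumptions (not taken from the paper): squared-error loss,
    learning rate gamma_n = gamma0 * n**(-alpha), and each replicate
    reweighting its per-observation loss by an independent Exp(1) draw.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    theta = np.zeros(p)                      # implicit SGD iterate
    theta_bar = np.zeros(p)                  # averaged (AI-SGD) estimate
    thetas_b = np.zeros((n_perturb, p))      # perturbed implicit iterates
    thetas_b_bar = np.zeros((n_perturb, p))  # perturbed AI-SGD estimates

    for i in range(n):
        x_i, y_i = X[i], y[i]
        gamma = gamma0 * (i + 1) ** (-alpha)
        sq = x_i @ x_i

        # For squared-error loss the implicit update solves
        #   theta_new = theta + gamma * x * (y - x' theta_new)
        # in closed form:
        theta = theta + gamma / (1.0 + gamma * sq) * x_i * (y_i - x_i @ theta)
        theta_bar += (theta - theta_bar) / (i + 1)

        # Update every perturbed replicate with its own random weight.
        w = rng.exponential(1.0, size=n_perturb)
        resid = y_i - thetas_b @ x_i
        step = (w * gamma) / (1.0 + w * gamma * sq)
        thetas_b += step[:, None] * resid[:, None] * x_i
        thetas_b_bar += (thetas_b - thetas_b_bar) / (i + 1)

    # The spread of the perturbed AI-SGD estimates around theta_bar can be
    # used to form confidence intervals (e.g., componentwise percentiles).
    return theta_bar, thetas_b_bar
```

Because each observation is processed once and only the current iterates are stored, the memory and per-step cost do not grow with the sample size, which is the scalability property emphasized in the abstract.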
Identifier
85061056211 (Scopus)
Publication Title
Scandinavian Journal of Statistics
External Full Text Location
https://doi.org/10.1111/sjos.12378
e-ISSN
1467-9469
ISSN
0303-6898
First Page
987
Last Page
1002
Issue
4
Volume
46
Recommended Citation
Fang, Yixin, "Scalable statistical inference for averaged implicit stochastic gradient descent" (2019). Faculty Publications. 7122.
https://digitalcommons.njit.edu/fac_pubs/7122
