Adaptive Alternating Stochastic Gradient Descent Algorithms for Large-Scale Latent Factor Analysis

Document Type

Conference Proceeding

Publication Date

1-1-2021

Abstract

Latent factor analysis (LFA) is highly efficient for knowledge discovery from the high-dimensional and sparse (HiDS) matrices frequently encountered in big data and web-service applications. A stochastic gradient descent (SGD) algorithm is commonly adopted to train an LFA model owing to its high efficiency. However, its sequential nature makes it less scalable on large-scale data. An alternating SGD algorithm decouples the LFA process to achieve parallelization, but its performance depends on hyper-parameter selection, which is expensive to tune. To address this issue, this paper presents three adaptive alternating SGD algorithms, leading to three Parallel Adaptive LFA (PAL) models for large-scale HiDS matrices. Experimental studies on HiDS matrices from industrial service applications show that the proposed PAL models significantly outperform existing ones in both convergence rate and computational efficiency, while achieving competitive accuracy in predicting missing data.
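To make the alternating, adaptive scheme concrete, below is a minimal Python sketch of latent factor analysis trained with alternating SGD and an AdaGrad-style adaptive learning rate. It is an illustration only, not the paper's PAL algorithms: the function name `alternating_sgd_lfa`, the rank `k`, the regularization weight `lam`, and the AdaGrad update rule are all assumptions chosen for the sketch.

```python
import numpy as np

# Hypothetical sketch of alternating SGD for latent factor analysis on a
# sparse matrix. NOT the paper's PAL algorithm; AdaGrad is used here as one
# example of an adaptive learning rate.

def alternating_sgd_lfa(entries, m, n, k=8, lam=0.05, eta=0.05, epochs=50, seed=0):
    """entries: list of (row, col, value) for the known cells of an m x n matrix."""
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((m, k))   # row latent factors
    Q = 0.1 * rng.standard_normal((n, k))   # column latent factors
    gP = np.full((m, k), 1e-8)              # accumulated squared gradients (AdaGrad)
    gQ = np.full((n, k), 1e-8)
    for _ in range(epochs):
        # Phase 1: hold Q fixed, update P (row updates are mutually independent)
        for u, i, r in entries:
            e = r - P[u] @ Q[i]                 # prediction error on a known cell
            grad = -e * Q[i] + lam * P[u]       # regularized squared-error gradient
            gP[u] += grad ** 2
            P[u] -= eta / np.sqrt(gP[u]) * grad # per-coordinate adaptive step
        # Phase 2: hold P fixed, update Q
        for u, i, r in entries:
            e = r - P[u] @ Q[i]
            grad = -e * P[u] + lam * Q[i]
            gQ[i] += grad ** 2
            Q[i] -= eta / np.sqrt(gQ[i]) * grad
    return P, Q

# Toy usage: five known cells of a 3 x 3 matrix
obs = [(0, 0, 5.0), (0, 2, 3.0), (1, 1, 4.0), (2, 0, 1.0), (2, 2, 2.0)]
P, Q = alternating_sgd_lfa(obs, m=3, n=3)
print("predicted (0,1):", P[0] @ Q[1])
```

Because Q is frozen in phase 1, the per-row updates to P are independent and can be distributed across workers; the same holds for Q in phase 2. This decoupling is the property the alternating scheme exploits for parallelism, and the per-coordinate adaptive step removes the need to hand-tune a global learning rate.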

Identifier

85123289223 (Scopus)

ISBN

9781665416832

Publication Title

Proceedings - 2021 IEEE International Conference on Services Computing, SCC 2021

External Full Text Location

https://doi.org/10.1109/SCC53864.2021.00041

First Page

285

Last Page

290

