"Latent Factor-Based Recommenders Relying on Extended Stochastic Gradient Descent Algorithms" by Xin Luo, Dexian Wang et al.
 

Latent Factor-Based Recommenders Relying on Extended Stochastic Gradient Descent Algorithms

Document Type

Article

Publication Date

2-1-2021

Abstract

High-dimensional and sparse (HiDS) matrices generated by recommender systems contain rich knowledge regarding desired patterns such as users' potential preferences and community tendencies. Latent factor (LF) analysis proves highly efficient at extracting such knowledge from an HiDS matrix. Stochastic gradient descent (SGD) is an efficient algorithm for building an LF model; however, current LF models mostly adopt the standard SGD algorithm. Can SGD be extended in various ways to improve the resultant models' convergence rate and prediction accuracy for missing data? Are such SGD extensions compatible with an LF model? To answer these questions, this paper carefully investigates eight extended SGD algorithms to propose eight novel LF models. Experimental results on two HiDS matrices generated by real recommender systems show that, compared with an LF model using the standard SGD algorithm, an LF model using an extended one achieves: 1) higher prediction accuracy for missing data; 2) a faster convergence rate; and 3) model diversity.
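The baseline the paper extends is an LF model trained with standard SGD: user and item latent factor vectors are updated on each observed entry of the HiDS matrix to minimize a regularized squared error. The sketch below illustrates that baseline only (not any of the paper's eight extensions); all names (`train_lf`, `predict`, the triple-list rating format) and hyperparameter values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of a latent factor model trained with standard SGD on the
# observed entries of a sparse rating matrix. Hyperparameters are illustrative.
import random

def train_lf(ratings, n_users, n_items, k=4, lr=0.01, reg=0.05,
             epochs=500, seed=0):
    """ratings: list of (user, item, value) triples -- observed entries only."""
    rng = random.Random(seed)
    # Initialize user (P) and item (Q) latent factor vectors with small
    # positive random values.
    P = [[rng.uniform(0.0, 0.1) for _ in range(k)] for _ in range(n_users)]
    Q = [[rng.uniform(0.0, 0.1) for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                # Standard SGD step on the L2-regularized squared error.
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

def predict(P, Q, u, i):
    """Predicted rating: inner product of user and item factor vectors."""
    return sum(pf * qf for pf, qf in zip(P[u], Q[i]))
```

Only observed entries drive the updates, which is what makes the approach practical on HiDS matrices: the cost per epoch scales with the number of known ratings, not with the full user-item grid.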

Identifier

85099759549 (Scopus)

Publication Title

IEEE Transactions on Systems, Man, and Cybernetics: Systems

External Full Text Location

https://doi.org/10.1109/TSMC.2018.2884191

e-ISSN

2168-2232

ISSN

2168-2216

First Page

916

Last Page

926

Issue

2

Volume

51

Grant

cstc2017kjrc-cxcytd0149

Fund Ref

National Natural Science Foundation of China

