Core Matrix Regression and Prediction with Regularization

Document Type

Conference Proceeding

Publication Date

November 2, 2022

Abstract

Many financial time-series analyses track a matrix of variables at each time step and study their co-evolution over long horizons. Such matrix time series are highly sparse, involve complex interactions among latent matrix factors, and demand advanced models to extract dynamic temporal patterns from these interactions. This paper proposes the Core Matrix Regression with Regularization (CMRR) algorithm to capture spatiotemporal relations in sparse matrix-variate time series. The model decomposes each matrix into three factor matrices: one for row entities, one for column entities, and a core matrix encoding the interactions between them. It then applies recurrent neural networks to the sequence of interaction matrices to extract temporal patterns. To overcome the sparsity and large volume of the data, we design an element-wise orthogonal matrix factorization trained with stochastic gradient descent (SGD) on a deep learning platform. Experiments confirm that combining orthogonal matrix factorization with recurrent neural networks is highly effective, outperforming existing graph neural network and tensor-based time-series prediction methods. We apply CMRR to three real-world financial applications, firm earnings forecasting, firm fundamentals prediction, and firm characteristics prediction, and demonstrate consistent superiority, reducing error by 23%-53% relative to state-of-the-art high-dimensional time-series prediction algorithms.
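
As an illustration of the pipeline the abstract describes, the following minimal PyTorch sketch assumes a Tucker2-style decomposition X_t ≈ U S_t V^T, with shared row and column factor matrices U and V and a time-varying core (interaction) matrix S_t. It fits the factors by SGD on a masked element-wise loss with an orthogonality penalty and runs a recurrent network over the core-matrix sequence. The class name, dimensions, loss weights, and the choice of a GRU are assumptions for illustration, not the paper's exact architecture.

    import torch
    import torch.nn as nn

    class CMRRSketch(nn.Module):
        """Hypothetical sketch: shared factors U, V and per-step core matrices S_t."""
        def __init__(self, n_rows, n_cols, rank, T):
            super().__init__()
            self.U = nn.Parameter(0.01 * torch.randn(n_rows, rank))   # row-entity factors
            self.V = nn.Parameter(0.01 * torch.randn(n_cols, rank))   # column-entity factors
            self.S = nn.Parameter(0.01 * torch.randn(T, rank, rank))  # core (interaction) matrices
            self.rnn = nn.GRU(rank * rank, rank * rank, batch_first=True)
            self.rank = rank

        def loss(self, X, mask, lam=1e-2, mu=1.0):
            # Element-wise reconstruction error over observed entries only,
            # which is how the factorization copes with a sparse matrix series.
            recon = torch.einsum('ir,trs,js->tij', self.U, self.S, self.V)
            fit = ((recon - X) ** 2 * mask).sum() / mask.sum().clamp(min=1.0)
            # Soft orthogonality penalty on both factor matrices (the regularization).
            I = torch.eye(self.rank)
            ortho = ((self.U.T @ self.U - I) ** 2).sum() + ((self.V.T @ self.V - I) ** 2).sum()
            # One-step-ahead consistency of the GRU on the core-matrix sequence.
            seq = self.S.reshape(1, -1, self.rank * self.rank)
            pred, _ = self.rnn(seq[:, :-1])
            dyn = ((pred - seq[:, 1:]) ** 2).mean()
            return fit + lam * ortho + mu * dyn

        def predict_next(self):
            # Forecast the next core matrix, then map back: X_{T+1} = U S_{T+1} V^T.
            seq = self.S.reshape(1, -1, self.rank * self.rank)
            out, _ = self.rnn(seq)
            S_next = out[0, -1].reshape(self.rank, self.rank)
            return self.U @ S_next @ self.V.T

    # Toy usage: ~10% observed entries, plain SGD as the abstract suggests.
    T, n_rows, n_cols, rank = 24, 50, 30, 8
    X = torch.randn(T, n_rows, n_cols)
    mask = (torch.rand(T, n_rows, n_cols) < 0.1).float()
    model = CMRRSketch(n_rows, n_cols, rank, T)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for step in range(200):
        opt.zero_grad()
        model.loss(X, mask).backward()
        opt.step()
    X_next = model.predict_next()  # forecast for time step T + 1

Restricting the squared error to observed entries is what makes the factorization element-wise and sparsity-aware, while the orthogonality term supplies the regularization named in the method's title.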

Identifier

85142479188 (Scopus)

ISBN

978-1-4503-9376-8

Publication Title

Proceedings of the 3rd ACM International Conference on AI in Finance (ICAIF 2022)

External Full Text Location

https://doi.org/10.1145/3533271.3561709

First Page

291

Last Page

299
