Convergence Analysis of Single Latent Factor-Dependent, Nonnegative, and Multiplicative Update-Based Nonnegative Latent Factor Models
Liu, Zhigang1,2,3; Luo, Xin2,3,4; Wang, Zidong5
2021-04-01
Abstract | A single latent factor (LF)-dependent, nonnegative, and multiplicative update (SLF-NMU) learning algorithm is highly efficient in building a nonnegative LF (NLF) model defined on a high-dimensional and sparse (HiDS) matrix. However, the convergence characteristics of such NLF models have never been justified in theory. To address this issue, this study conducts a rigorous convergence analysis for an SLF-NMU-based NLF model. The main idea is twofold: 1) proving that its learning objective is nonincreasing under its SLF-NMU-based learning rules by constructing specific auxiliary functions; and 2) proving that it converges to a stable equilibrium point under its SLF-NMU-based learning rules by analyzing the Karush-Kuhn-Tucker (KKT) conditions of its learning objective. Experimental results on ten HiDS matrices from real applications provide numerical evidence for the correctness of the achieved proofs.
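The abstract describes multiplicative updates that keep the latent factors nonnegative and touch only the observed entries of the HiDS matrix. As an illustration only, the sketch below implements a generic nonnegative multiplicative update of this kind in Python; the function name slf_nmu_sketch, the factor matrices P and Q, and all parameter names are assumptions made for this example, not the authors' reference implementation, and the rule shown is a standard Lee-Seung-style multiplicative update restricted to observed entries rather than the exact SLF-NMU rule from the paper.

```python
# Illustrative sketch (not the authors' code) of a nonnegative multiplicative
# update for a latent factor model trained only on observed entries.
import numpy as np

def slf_nmu_sketch(rows, cols, vals, shape, num_factors=5, iters=50, eps=1e-12, seed=0):
    """Fit R ~= P @ Q.T on observed entries (rows, cols, vals) with
    nonnegative multiplicative updates; returns P, Q and the loss trace."""
    rng = np.random.default_rng(seed)
    m, n = shape
    P = rng.random((m, num_factors))          # nonnegative initialization
    Q = rng.random((n, num_factors))
    losses = []
    for _ in range(iters):
        pred = np.sum(P[rows] * Q[cols], axis=1)          # predictions on observed entries
        # Multiplicative update for P: accumulate numerator/denominator per row.
        num_P = np.zeros_like(P); den_P = np.zeros_like(P)
        np.add.at(num_P, rows, vals[:, None] * Q[cols])
        np.add.at(den_P, rows, pred[:, None] * Q[cols])
        P *= num_P / (den_P + eps)
        # Recompute predictions, then update Q symmetrically.
        pred = np.sum(P[rows] * Q[cols], axis=1)
        num_Q = np.zeros_like(Q); den_Q = np.zeros_like(Q)
        np.add.at(num_Q, cols, vals[:, None] * P[rows])
        np.add.at(den_Q, cols, pred[:, None] * P[rows])
        Q *= num_Q / (den_Q + eps)
        pred = np.sum(P[rows] * Q[cols], axis=1)
        losses.append(0.5 * np.sum((vals - pred) ** 2))   # loss on observed entries only
    return P, Q, losses

if __name__ == "__main__":
    # Tiny synthetic HiDS-style example: a few observed entries of a 6 x 5 matrix.
    rows = np.array([0, 0, 1, 2, 3, 4, 5, 5])
    cols = np.array([0, 3, 1, 2, 4, 0, 1, 3])
    vals = np.array([5.0, 3.0, 4.0, 2.0, 1.0, 4.0, 5.0, 2.0])
    _, _, losses = slf_nmu_sketch(rows, cols, vals, shape=(6, 5), num_factors=3)
    print([round(l, 4) for l in losses[:5]], "...", round(losses[-1], 4))
```

Running this on the small synthetic example prints a loss trace that should be nonincreasing across iterations, which is the qualitative behavior the paper proves for the actual SLF-NMU learning rules.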
Keywords | Manganese; Convergence; Computational modeling; Learning systems; Analytical models; Sparse matrices; Big Data; big data; convergence; high-dimensional and sparse (HiDS) matrix; latent factor (LF) analysis; learning system; neural networks; nonnegative LF (NLF) analysis; single LF-dependent, nonnegative, and multiplicative update (SLF-NMU)
DOI | 10.1109/TNNLS.2020.2990990 |
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN | 2162-237X |
Volume | 32
Issue | 4
Pages | 1737-1749
Corresponding Author | Luo, Xin(luoxin21@cigit.ac.cn)
Indexed By | SCI
WOS ID | WOS:000637534200027
Language | English