KMS Chongqing Institute of Green and Intelligent Technology, CAS
Alternating nonnegative least squares-incorporated regularized symmetric latent factor analysis for undirected weighted networks
Zhong, Yurong1; Liu, Kechen2; Chen, Jiqiu3,4; Zhe, Xie1; Li, Weiling1
2024-11-28
Abstract | An Undirected Weighted Network (UWN) can be precisely quantified as an adjacency matrix whose inherent characteristics are fully considered in a Symmetric Nonnegative Latent Factor (SNLF) model, giving it good representation accuracy. However, an SNLF model uses a sole latent factor matrix to describe the topological characteristic of a UWN, i.e., symmetry, thereby impairing its representation learning ability. To address this issue, this paper proposes an Alternating nonnegative least squares-incorporated Regularized Symmetric Latent factor analysis (ARSL) model. First, equation constraints composed of multiple matrices are built into its learning objective to well describe the symmetry of a UWN. Note that it adopts an L₂²-norm-based regularization scheme to relax such constraints, making this symmetry-aware learning objective solvable. Then, it designs an alternating nonnegative least squares-incorporated algorithm to optimize its parameters efficiently. Empirical studies on four UWNs demonstrate that an ARSL model outperforms state-of-the-art models in terms of representation accuracy, while achieving promising computational efficiency.
Keywords | Undirected Weighted Network; Symmetric Nonnegative Latent Factor Model; Alternating Nonnegative Least Squares; Missing Data Estimation
DOI | 10.1016/j.neucom.2024.128440 |
Journal | NEUROCOMPUTING
ISSN | 0925-2312 |
Volume | 607
Pages | 12
Corresponding Author | Chen, Jiqiu (cigityrc@163.com)
Indexed By | SCI
WOS Accession Number | WOS:001304251700001
Language | English
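The abstract above outlines the core computation: approximate the symmetric adjacency matrix with two latent factor matrices, relax the symmetry (equality) constraint through an L₂²-norm penalty, and update the factors alternately with nonnegative least squares. The following is a minimal NumPy sketch of that idea only; the function name, parameters, random initialization, and the projected ridge-style updates standing in for a full nonnegative least-squares solver are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def regularized_symmetric_lfa(A, rank, lam=0.1, n_iter=100, seed=0):
    """Illustrative sketch: approximate a symmetric nonnegative adjacency
    matrix A (n x n) as X @ Y.T, penalizing ||X - Y||_F^2 with weight lam
    to relax the symmetry constraint X == Y, and alternating between
    nonnegative updates of X and Y (hypothetical parameters and updates)."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X = rng.random((n, rank))
    Y = rng.random((n, rank))
    I = np.eye(rank)
    for _ in range(n_iter):
        # With Y fixed, the unconstrained minimizer of
        # ||A - X Y^T||_F^2 + lam * ||X - Y||_F^2 is closed-form;
        # projecting onto the nonnegative orthant is a simple stand-in
        # for a true nonnegative least-squares subproblem solver.
        X = (A @ Y + lam * Y) @ np.linalg.inv(Y.T @ Y + lam * I)
        X = np.maximum(X, 0.0)
        # Symmetric update for Y with X fixed.
        Y = (A.T @ X + lam * X) @ np.linalg.inv(X.T @ X + lam * I)
        Y = np.maximum(Y, 0.0)
    return X, Y
```

As a usage example under the same assumptions, calling `regularized_symmetric_lfa(A, rank=5)` on a symmetric nonnegative matrix `A` returns two nonnegative factor matrices that are driven toward each other by the penalty term, mimicking the symmetry-aware objective described in the abstract.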