Residual matrix product state for machine learning
Ye-Ming Meng, Jing Zhang, Peng Zhang, Chao Gao, Shi-Ju Ran
SciPost Phys. 14, 142 (2023) · published 2 June 2023
- doi: 10.21468/SciPostPhys.14.6.142
Abstract
Tensor networks, which originate from quantum physics, are emerging as an efficient tool for classical and quantum machine learning. Nevertheless, a considerable accuracy gap remains between tensor networks and sophisticated neural network models in classical machine learning. In this work, we combine the ideas of the matrix product state (MPS), the simplest tensor network structure, and the residual neural network, and propose the residual matrix product state (ResMPS). The ResMPS can be treated as a network whose layers map the "hidden" features to the outputs (e.g., classifications), and whose variational parameters are functions of the features of the samples (e.g., the pixels of images). This differs from neural networks, whose layers map the features feed-forwardly to the output. The ResMPS can be equipped with non-linear activations and dropout layers, and it outperforms state-of-the-art tensor network models in terms of efficiency, stability, and expressive power. Moreover, the ResMPS is interpretable from the perspective of polynomial expansion, where factorization and exponential machines naturally emerge. Our work contributes to connecting and hybridizing neural and tensor networks, which is crucial for further enhancing our understanding of the working mechanisms and improving the performance of both models.
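To make the abstract's architecture concrete, here is a minimal PyTorch sketch of the idea it describes: each layer updates a hidden feature vector through a residual connection, with the update gated by one input feature (e.g., a pixel), followed by a class readout. The hidden dimension, tanh activation, dropout rate, initialization, and all names below are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class ResMPSLayer(nn.Module):
    """One hypothetical ResMPS layer: the hidden feature vector h is
    updated by a term whose contribution is switched on by the input
    feature x_t, plus a residual (skip) connection."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Variational parameters of this layer (init scale is a guess).
        self.weight = nn.Parameter(0.01 * torch.randn(hidden_dim, hidden_dim))
        self.dropout = nn.Dropout(p=0.1)  # dropout rate is an assumption

    def forward(self, h: torch.Tensor, x_t: torch.Tensor) -> torch.Tensor:
        # h: (batch, hidden_dim); x_t: (batch,) scalar feature.
        update = x_t.unsqueeze(-1) * (h @ self.weight.T)
        # Non-linear activation, dropout, and residual connection.
        return h + self.dropout(torch.tanh(update))


class ResMPS(nn.Module):
    """Sweeps one layer per input feature, then reads out class scores."""

    def __init__(self, n_features: int, hidden_dim: int, n_classes: int):
        super().__init__()
        self.h0 = nn.Parameter(torch.ones(hidden_dim))  # initial hidden feature
        self.layers = nn.ModuleList(
            ResMPSLayer(hidden_dim) for _ in range(n_features)
        )
        self.head = nn.Linear(hidden_dim, n_classes)  # readout to outputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_features), e.g. flattened image pixels.
        h = self.h0.expand(x.shape[0], -1)
        for t, layer in enumerate(self.layers):
            h = layer(h, x[:, t])
        return self.head(h)


# Usage sketch: classify 28x28 images into 10 classes.
model = ResMPS(n_features=28 * 28, hidden_dim=64, n_classes=10)
logits = model(torch.rand(32, 28 * 28))  # -> shape (32, 10)
```

Note how this inverts the usual neural-network picture: the sample's features do not flow through the layers as activations; instead they parameterize each layer's map on the hidden vector, while the residual term keeps the sweep stable, in the spirit of the abstract.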
Authors / Affiliations
- 1,2 Ye-Ming Meng,
- 3 Jing Zhang,
- 3 Peng Zhang,
- 1 Chao Gao,
- 4 Shi-Ju Ran
- 1 浙江师范大学 / Zhejiang Normal University [ZJNU]
- 2 华东师范大学 / East China Normal University [ECNU]
- 3 天津大学 / Tianjin University [TJU]
- 4 首都师范大学 / Capital Normal University [CNU]