A Sequential Recommendation Model Based on Convolutional Neural Networks and State Space Models

Authors

  • Chenfei Feng
  • Haitao Wang
  • Yuguo Wang

DOI:

https://doi.org/10.54097/108jrv85

Keywords:

State Space Model, Convolutional Neural Network, Sequential Recommendation, Long- and Short-Term Interest Modeling

Abstract

Sequential recommendation aims to predict users' future preferences by analyzing their historical behavior sequences. In recent years, deep learning techniques have been widely applied to sequential recommendation tasks, achieving remarkable improvements in recommendation accuracy. However, existing methods often suffer from structural imbalance in modeling the rapidly changing short-term interests and the relatively stable long-term preferences of users. Moreover, these methods face limitations such as high computational costs and low inference efficiency when dealing with long behavior sequences. To address these challenges, this paper proposes a novel sequential recommendation model named CNN-Mamba, which integrates Convolutional Neural Networks (CNN) with State Space Models (SSM). Specifically, the CNN component leverages local receptive fields to efficiently extract short-term interest features from recent user interactions, thereby enhancing the model's ability to capture local behavior patterns. Meanwhile, the SSM component is introduced as a long-term interest modeling module to capture global dependencies within long sequences. Furthermore, an adaptive fusion layer is designed to dynamically integrate the short- and long-term modeling outputs, which improves the model's generalization ability. In addition, the implicit recurrence mechanism in the state space model effectively reduces computational complexity and enhances the efficiency of long-sequence modeling. Experimental results on three real-world datasets demonstrate that the proposed CNN-Mamba model outperforms state-of-the-art baselines in both recommendation accuracy and inference efficiency, validating its effectiveness and practicality.
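To make the architecture described above concrete, the following is a minimal PyTorch sketch of the CNN-Mamba idea: a convolutional branch for short-term interests, a state-space branch for long-term dependencies, and a gated fusion of the two. Every class name, dimension, and the gating formula here are illustrative assumptions rather than the authors' released implementation, and the selective Mamba block is replaced by a simplified diagonal linear SSM stand-in.

import torch
import torch.nn as nn

class SimpleSSM(nn.Module):
    # Simplified stand-in for a Mamba-style block: a diagonal linear state space
    # recurrence h_t = a * h_{t-1} + B x_t, y_t = C h_t, scanned over the sequence.
    def __init__(self, d_model, d_state=16):
        super().__init__()
        self.decay = nn.Parameter(torch.randn(d_state))      # per-state forgetting rate
        self.in_proj = nn.Linear(d_model, d_state)           # B: input -> state
        self.out_proj = nn.Linear(d_state, d_model)          # C: state -> output

    def forward(self, x):                                    # x: (batch, seq_len, d_model)
        a = torch.sigmoid(self.decay)                        # keep the recurrence stable in (0, 1)
        u = self.in_proj(x)
        h = torch.zeros(x.size(0), u.size(-1), device=x.device)
        outs = []
        for t in range(x.size(1)):                           # implicit recurrence, linear in seq_len
            h = a * h + u[:, t]
            outs.append(h)
        return self.out_proj(torch.stack(outs, dim=1))

class CNNMambaSketch(nn.Module):
    def __init__(self, num_items, d_model=64, kernel_size=3):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)
        # Short-term branch: causal 1-D convolution with a small receptive field.
        self.conv = nn.Conv1d(d_model, d_model, kernel_size, padding=kernel_size - 1)
        self.ssm = SimpleSSM(d_model)                        # long-term branch
        self.gate = nn.Linear(2 * d_model, d_model)          # adaptive fusion layer

    def forward(self, item_seq):                             # item_seq: (batch, seq_len) item ids
        x = self.item_emb(item_seq)                          # (batch, seq_len, d_model)
        # Trim the right-hand overhang so each position sees only past items.
        short = self.conv(x.transpose(1, 2))[..., : item_seq.size(1)].transpose(1, 2)
        long_term = self.ssm(x)
        g = torch.sigmoid(self.gate(torch.cat([short, long_term], dim=-1)))
        fused = g * short + (1.0 - g) * long_term            # position-wise gated fusion
        user = fused[:, -1]                                  # user state at the last interaction
        return user @ self.item_emb.weight.T                 # scores over the item vocabulary

# Example: score next-item candidates for 8 users with 50-interaction histories.
model = CNNMambaSketch(num_items=10000)
scores = model(torch.randint(1, 10001, (8, 50)))             # (8, 10001)

Because the SimpleSSM branch updates a fixed-size hidden state step by step, its cost grows linearly with sequence length, which is the efficiency property the abstract attributes to the state space component; a real Mamba block additionally uses input-dependent (selective) state transitions and a hardware-aware parallel scan on top of such a recurrence.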




Published

29-09-2025

Issue

Vol. 18 No. 2 (2025)

Section

Articles

How to Cite

Feng, C., Wang, H., & Wang, Y. (2025). A Sequential Recommendation Model Based on Convolutional Neural Networks and State Space Models. Journal of Computing and Electronic Information Management, 18(2), 24-31. https://doi.org/10.54097/108jrv85