Attentive Hybrid Recurrent Neural Networks for sequential recommendation

Lixiang Zhang, Peisen Wang, Jingchen Li, Zhiwei Xiao, Haobin Shi

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

Sequential recommendation has recently become an active research topic. Previous works combine a user's long-term and short-term behavior to recommend the next item, but they typically process the long-term behavior sequence in left-to-right order only, which may overlook useful information. Moreover, these methods ignore that each user pays different attention to different items. In this paper, we propose a novel hybrid model, Attentive Hybrid Recurrent Neural Networks, to address these problems. The first module is a bidirectional long short-term memory network (Bi-LSTM), and the second is a gated recurrent unit (GRU) module; both are equipped with a user-based attention mechanism. The hybrid model aims to capture the user's general preference as well as the user's latest intent. Experimental results on two public datasets show that our hybrid model outperforms previously reported baseline algorithms on the next-item recommendation task.
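The abstract describes recurrent modules pooled by a user-based attention mechanism. A minimal NumPy sketch of one plausible form of such attention is below: each timestep's hidden state is scored against a user embedding, and the softmax-weighted sum gives a user-aware summary vector. The function name, shapes, and scoring rule are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def user_attention(hidden_states, user_embedding):
    """Hypothetical user-based attention pooling (a sketch, not the
    paper's exact method): score each timestep's hidden state against
    the user embedding, then return the softmax-weighted sum."""
    # hidden_states: (T, d) recurrent outputs; user_embedding: (d,)
    scores = hidden_states @ user_embedding      # (T,) per-timestep scores
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()
    return weights @ hidden_states               # (d,) user-aware summary

rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))   # 5 timesteps, hidden size 8
u = rng.normal(size=8)        # user embedding
context = user_attention(H, u)
```

In a hybrid model like the one described, such a pooled vector from the Bi-LSTM (long-term preference) could be combined with the GRU's final state (latest intent) before scoring candidate items.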

Original language: English
Pages (from-to): 11091-11105
Number of pages: 15
Journal: Neural Computing and Applications
Volume: 33
Issue number: 17
DOIs
State: Published - Sep 2021

Keywords

  • Attention mechanism
  • Recurrent neural network
  • Sequential recommendation
