TY - JOUR
T1 - Attentive Hybrid Recurrent Neural Networks for sequential recommendation
AU - Zhang, Lixiang
AU - Wang, Peisen
AU - Li, Jingchen
AU - Xiao, Zhiwei
AU - Shi, Haobin
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer-Verlag London Ltd. part of Springer Nature.
PY - 2021/9
Y1 - 2021/9
N2 - Sequential recommendation has recently become a research hotspot. Previous works combined users' long-term and short-term behavior to achieve next-item recommendation, but they typically processed the long-term sequential behavior in left-to-right order only, so useful information may be overlooked. Moreover, these methods ignored that each user pays different amounts of attention to different items. In this paper, we propose a novel hybrid model called Attentive Hybrid Recurrent Neural Networks to address these problems. The first module is a bidirectional long short-term memory network (Bi-LSTM), and the second is a gated recurrent unit (GRU) module, both of which are equipped with a user-based attention mechanism. The hybrid model aims to capture both the user's general preference and the user's latest intent. Experimental results on two public datasets show that our hybrid model outperforms previously reported baseline algorithms on the next-item recommendation task.
AB - Sequential recommendation has recently become a research hotspot. Previous works combined users' long-term and short-term behavior to achieve next-item recommendation, but they typically processed the long-term sequential behavior in left-to-right order only, so useful information may be overlooked. Moreover, these methods ignored that each user pays different amounts of attention to different items. In this paper, we propose a novel hybrid model called Attentive Hybrid Recurrent Neural Networks to address these problems. The first module is a bidirectional long short-term memory network (Bi-LSTM), and the second is a gated recurrent unit (GRU) module, both of which are equipped with a user-based attention mechanism. The hybrid model aims to capture both the user's general preference and the user's latest intent. Experimental results on two public datasets show that our hybrid model outperforms previously reported baseline algorithms on the next-item recommendation task.
KW - Attention mechanism
KW - Recurrent neural network
KW - Sequential recommendation
UR - http://www.scopus.com/inward/record.url?scp=85098778837&partnerID=8YFLogxK
U2 - 10.1007/s00521-020-05643-7
DO - 10.1007/s00521-020-05643-7
M3 - Article
AN - SCOPUS:85098778837
SN - 0941-0643
VL - 33
SP - 11091
EP - 11105
JO - Neural Computing and Applications
JF - Neural Computing and Applications
IS - 17
ER -