Sequential Self-Attention
Answer: Sequential self-attention is a modeling approach for sequential recommender systems. Traditional sequential-recommendation models such as Markov chains, recurrent neural networks, and plain self-attention treat the interaction history as an ordered sequence, but they ignore the time intervals between interactions. Time-interval-aware sequential self-attention instead models these intervals within the sequence framework, in order to capture how different time gaps influence the prediction of the next item. The resulting model, TiSASRec (Time Interval Aware Self-Attention for Sequential Recommendation), takes into account not only the absolute positions of items but also the time intervals between items in the sequence. Experiments show that TiSASRec performs well under different settings, on both sparse and dense datasets, and outperforms other state-of-the-art sequential models. [1][2][3]
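A minimal PyTorch sketch of the idea is shown below. It follows the structure described above: learned embeddings of clipped pairwise time intervals, together with absolute-position embeddings, are added to the key and value sides of causal self-attention. The class name, the clipping threshold, and all hyperparameters are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TimeIntervalSelfAttention(nn.Module):
    """Sketch of time-interval-aware self-attention in the spirit of TiSASRec."""

    def __init__(self, d_model: int, max_interval: int = 256, max_len: int = 50):
        super().__init__()
        self.d_model = d_model
        self.max_interval = max_interval
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        # Embeddings of clipped relative time intervals, for keys and values.
        self.interval_k = nn.Embedding(max_interval + 1, d_model)
        self.interval_v = nn.Embedding(max_interval + 1, d_model)
        # Absolute-position embeddings, also applied on the key/value side.
        self.pos_k = nn.Embedding(max_len, d_model)
        self.pos_v = nn.Embedding(max_len, d_model)

    def forward(self, x, timestamps):
        # x: (batch, seq_len, d_model); timestamps: (batch, seq_len), integer.
        B, L, D = x.shape
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)

        # Pairwise intervals |t_i - t_j|, clipped so the embedding table
        # stays finite (TiSASRec likewise clips intervals to a maximum).
        intervals = (timestamps[:, :, None] - timestamps[:, None, :]).abs()
        intervals = intervals.clamp(max=self.max_interval)        # (B, L, L)
        r_k = self.interval_k(intervals)                          # (B, L, L, D)
        r_v = self.interval_v(intervals)

        pos = torch.arange(L, device=x.device)
        p_k = self.pos_k(pos)                                     # (L, D)
        p_v = self.pos_v(pos)

        # Attention logits combine content, position, and interval terms.
        logits = torch.einsum('bid,bjd->bij', q, k + p_k)
        logits = logits + torch.einsum('bid,bijd->bij', q, r_k)
        logits = logits / (D ** 0.5)

        # Causal mask: each step attends only to earlier interactions.
        mask = torch.triu(torch.ones(L, L, dtype=torch.bool, device=x.device),
                          diagonal=1)
        attn = F.softmax(logits.masked_fill(mask, float('-inf')), dim=-1)

        # Values also receive position and interval information.
        out = torch.einsum('bij,bjd->bid', attn, v + p_v)
        out = out + torch.einsum('bij,bijd->bid', attn, r_v)
        return out
```

Because the interval embeddings enter both the attention logits and the aggregated values, two sequences with identical item orderings but different time gaps produce different next-item representations, which is the effect the paper is after.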
#### References
- [1][3] [Paper notes: Time Interval Aware Self-Attention for Sequential Recommendation](https://blog.csdn.net/weixin_39129550/article/details/105375259)
- [2] [Time Interval Aware Self-Attention for Sequential Recommendation](https://blog.csdn.net/buduxiyi2/article/details/116146096)