Cross-attention
Date: 2023-08-24 10:04:23
Cross-attention is an attention mechanism for exchanging information between two different sequences. Standard implementations typically expect the query and key/value inputs to share an embedding dimension, but with separate projection layers it can also handle inputs of different dimensions. In cross-attention there are two input sequences: one supplies the queries (Q), and the other supplies the keys (K) and values (V). The mechanism computes similarity scores between each query and the keys, normalizes those scores into attention weights, and uses the weights to take a weighted sum over the values, producing the final output. In some variants, the queries and values come from one sequence and the keys from the other. In short, cross-attention combines two embedding sequences and lets information flow across them. [1][2][3]
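The mechanism described above can be sketched in a few lines of PyTorch. This is a minimal, hypothetical single-head implementation (the class and parameter names `CrossAttention`, `d_q`, `d_kv`, `d_model` are illustrative, not from the cited posts); separate linear projections map the query sequence and the key/value sequence into a shared attention dimension, which is what allows the two inputs to have different feature dimensions:

```python
import torch
import torch.nn as nn

class CrossAttention(nn.Module):
    """Minimal single-head cross-attention sketch.

    Queries come from one sequence (feature dim d_q); keys and values
    come from another sequence (feature dim d_kv). Linear projections
    map both into a shared dimension d_model, so the two input
    sequences may differ in both length and feature dimension.
    """
    def __init__(self, d_q: int, d_kv: int, d_model: int):
        super().__init__()
        self.q_proj = nn.Linear(d_q, d_model)
        self.k_proj = nn.Linear(d_kv, d_model)
        self.v_proj = nn.Linear(d_kv, d_model)
        self.scale = d_model ** -0.5

    def forward(self, query_seq: torch.Tensor, context_seq: torch.Tensor) -> torch.Tensor:
        # query_seq:   (batch, len_q,  d_q)
        # context_seq: (batch, len_kv, d_kv)
        q = self.q_proj(query_seq)
        k = self.k_proj(context_seq)
        v = self.v_proj(context_seq)
        # Similarity between each query and every key, softmax-normalized.
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        # Weighted sum of values: one output vector per query position.
        return attn @ v  # (batch, len_q, d_model)

# Two sequences with different lengths and feature dimensions.
x = torch.randn(2, 5, 32)   # query sequence
y = torch.randn(2, 7, 48)   # key/value (context) sequence
out = CrossAttention(32, 48, 64)(x, y)
print(out.shape)  # torch.Size([2, 5, 64])
```

Note that the output length follows the query sequence (5 positions here), while the context sequence only influences the content of each output vector.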
#### References
- *1* [cross attention输入不同维度的矩阵](https://blog.csdn.net/m0_46482145/article/details/129637158)
- *2* [CrossAttention KBQA](https://blog.csdn.net/qq_42791848/article/details/122630149)
- *3* [【科研】浅学Cross-attention?](https://blog.csdn.net/MengYa_Dream/article/details/126688503)