Multi-Head Attention: Q, K, V
Date: 2023-09-21 10:11:39
Multi-head attention is an extension of the attention mechanism that extracts information from sequence data more effectively. The input is first mapped by learned linear transformations into queries (Q), keys (K), and values (V); these projections are performed in parallel across several "heads", each operating on its own subspace. Every head computes scaled dot-product attention, producing its own set of attention weights, which are used to take a weighted sum of that head's values. The per-head outputs are then concatenated and passed through a final linear projection to form the multi-head attention output. [1][2][3]
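The process above can be sketched in NumPy. This is a minimal illustration, not a production implementation: the weight matrices `Wq`, `Wk`, `Wv`, `Wo` and the function name `multi_head_attention` are hypothetical names chosen for this example, and biases, masking, and dropout are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads):
    """Multi-head self-attention over x of shape (seq_len, d_model)."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    # Project the input into Q, K, V, then split each into heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head).
    Q = (x @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    K = (x @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    V = (x @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention, computed independently per head.
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)        # (num_heads, seq_len, seq_len)
    heads = weights @ V                       # (num_heads, seq_len, d_head)
    # Concatenate the heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

rng = np.random.default_rng(0)
d_model, num_heads, seq_len = 8, 2, 4
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1
                  for _ in range(4))
x = rng.standard_normal((seq_len, d_model))
out = multi_head_attention(x, Wq, Wk, Wv, Wo, num_heads)
print(out.shape)  # (4, 8): same sequence length and model dimension as the input
```

Note that each head attends in a `d_head = d_model / num_heads` subspace, so the total computation is comparable to single-head attention at full width while letting different heads focus on different relations.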
#### References
- [1] Attention Mechanisms in Deep Learning.pdf (https://download.csdn.net/download/weixin_43909715/85832684)
- [2][3] Attention Mechanisms (4): Multi-Head Attention (https://blog.csdn.net/qq_51957239/article/details/129732592)