Transformer Position Embedding
Date: 2023-10-13
In the Transformer model, the Positional Encoding adds a vector that encodes position information to each position of the input sequence. This vector can be fixed (e.g. sinusoidal) or learnable. When the Positional Encoding output is added to the Embedding output, the handling differs by case: if the positional encoding does not need to be trained, it is added directly; if it should be trained, it is first wrapped with `Variable()` (a legacy PyTorch idiom; since PyTorch 0.4, `Variable` is merged into `Tensor`, and a trainable encoding is expressed as an `nn.Parameter`) and then added. This completes the handling of the Embedding and Positional Encoding layers. [1][2][3]
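As a minimal sketch of the two cases described above (not the exact code from the cited posts; the module name and the `learnable` flag are illustrative), the fixed variant registers the sinusoidal table as a non-trainable buffer, which is the modern replacement for the old `Variable(..., requires_grad=False)` wrapping, while the learnable variant makes it an `nn.Parameter`:

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Adds a positional-encoding vector to each position of the embedding output."""
    def __init__(self, d_model: int, max_len: int = 5000, learnable: bool = False):
        super().__init__()
        # Precompute the (max_len, d_model) sinusoidal table.
        position = torch.arange(max_len).unsqueeze(1).float()            # (max_len, 1)
        div_term = torch.exp(torch.arange(0, d_model, 2).float()
                             * (-math.log(10000.0) / d_model))           # (d_model/2,)
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        pe = pe.unsqueeze(0)                                             # (1, max_len, d_model)
        if learnable:
            # Trainable case: the table receives gradients and is updated.
            self.pe = nn.Parameter(pe)
        else:
            # Fixed case: a buffer moves with the module (to GPU, into state_dict)
            # but gets no gradient.
            self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model); add the matching positional slice.
        return x + self.pe[:, : x.size(1)]

# Usage: embed token ids, then add the positions.
emb = nn.Embedding(1000, 64)
pos = PositionalEncoding(d_model=64)
tokens = torch.randint(0, 1000, (2, 10))   # (batch=2, seq_len=10)
out = pos(emb(tokens))                     # (2, 10, 64)
```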
#### References
1. [Transformer详解之Embedding、Positional Encoding层(面向初学者)](https://blog.csdn.net/qq_41018669/article/details/120341783)
2. [Transformer中的position encoding(位置编码二)](https://blog.csdn.net/weixin_42715977/article/details/122139883)
3. [科技行业前言:Transformer模型改变AI生态](https://download.csdn.net/download/m0_37685981/88220950)