The emergence of the Transformer
The Transformer is a network architecture first proposed in 2017 by Google's machine translation team to improve machine translation. It dispenses entirely with the RNN and CNN structures used previously, building the model instead by stacking Transformer blocks. The Transformer achieved remarkable results in natural language processing (NLP) and was later brought into computer vision (CV), where it has also been very successful in object detection and image classification. It plays a central role in BERT, and later models such as XLNet and RoBERTa likewise adopted the Transformer as their core structure. A Transformer consists of an encoder and a decoder, each built by stacking multiple encoder or decoder blocks, whose key component is Multi-Head Attention. The Transformer's strength lies in handling long-range dependencies and capturing global context. [1][2][3]
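
To make the attention mechanism behind Multi-Head Attention concrete, here is a minimal sketch of scaled dot-product attention in NumPy. The function name, shapes, and toy input are illustrative assumptions, not code from the cited posts; a production model would use a framework implementation (e.g. a multi-head attention layer) instead.

```python
# Minimal sketch of scaled dot-product attention (assumed shapes: batch, seq, dim).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, seq_q, seq_k)
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                  # (batch, seq_q, d_v)

# Toy self-attention: every position attends to every other position,
# which is how the Transformer captures global context in one step.
rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4, 8))                          # batch=1, seq=4, dim=8
out = scaled_dot_product_attention(x, x, x)
print(out.shape)                                        # (1, 4, 8)
```

Multi-Head Attention runs several such attention operations in parallel on learned linear projections of Q, K, and V, then concatenates the results; stacking these blocks gives the encoder and decoder described above.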
#### References
- *1* *3* [transformer](https://blog.csdn.net/chaipp0607/article/details/120520833)
- *2* [Transformer背景介绍](https://blog.csdn.net/weixin_46556352/article/details/124453724)