Shuffle Transformer
Shuffle Transformer rethinks the vision transformer around a spatial shuffle mechanism, aiming to improve how window-based attention exchanges information across windows. [1] The architecture (instantiated as models such as Shuffle-T) builds its blocks from two complementary attention layers: shuffle window self-attention, which first applies a spatial shuffle to the input features so that tokens from different windows are grouped together and can interact and mix, and window-based self-attention, which computes self-attention among the tokens inside each (shuffled or regular) window.
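The spatial shuffle is analogous to ShuffleNet's channel shuffle, applied to token positions: instead of grouping ws × ws neighbouring tokens into a window, the shuffled partition gathers ws × ws tokens sampled with stride H/ws (and W/ws) across the whole feature map, so that window attention mixes information from distant windows. Below is a minimal PyTorch sketch of the two partitions and their inverses, assuming H and W are divisible by the window size ws; the function names are illustrative, not taken from the official implementation.

```python
import torch

def window_partition(x, ws):
    """Regular partition: each window holds ws*ws neighbouring tokens. x: (B, H, W, C)."""
    B, H, W, C = x.shape
    x = x.view(B, H // ws, ws, W // ws, ws, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)

def shuffled_window_partition(x, ws):
    """Shuffled partition: the window factor becomes the outer factor of H and W,
    so each window holds ws*ws tokens strided across the whole feature map."""
    B, H, W, C = x.shape
    x = x.view(B, ws, H // ws, ws, W // ws, C)
    return x.permute(0, 2, 4, 1, 3, 5).reshape(-1, ws * ws, C)

def window_reverse(win, ws, B, H, W):
    """Inverse of window_partition."""
    C = win.shape[-1]
    x = win.view(B, H // ws, W // ws, ws, ws, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C)

def shuffled_window_reverse(win, ws, B, H, W):
    """Inverse of shuffled_window_partition."""
    C = win.shape[-1]
    x = win.view(B, H // ws, W // ws, ws, ws, C)
    return x.permute(0, 3, 1, 4, 2, 5).reshape(B, H, W, C)
```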
Concretely, Shuffle Transformer is composed of stacked Shuffle Transformer blocks, each containing two successive layers: WMSA (window-based multi-head self-attention) and Shuffle WMSA (window-based multi-head self-attention with spatial shuffle). WMSA performs self-attention within local windows, while Shuffle WMSA applies the spatial shuffle before the window attention, so tokens from different windows interact and mix, adding long-range connections to the purely local ones.
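To make the pairing concrete, here is a simplified sketch of one WMSA / Shuffle WMSA pair built on the partition helpers above, using `torch.nn.MultiheadAttention` as a stand-in for the paper's windowed attention with relative position bias; the paper's neighbor-window connection (a depthwise convolution) and drop-path are omitted for brevity, so this is an illustrative approximation rather than the official block.

```python
import torch
import torch.nn as nn

class WindowUnit(nn.Module):
    """One (attention + MLP) unit; shuffle=False gives WMSA, shuffle=True gives Shuffle WMSA."""
    def __init__(self, dim, heads, ws, shuffle):
        super().__init__()
        self.ws, self.shuffle = ws, shuffle
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x):                                    # x: (B, H, W, C)
        B, H, W, C = x.shape
        part = shuffled_window_partition if self.shuffle else window_partition
        rev = shuffled_window_reverse if self.shuffle else window_reverse
        win = part(self.norm1(x), self.ws)                   # (B*num_windows, ws*ws, C)
        attn, _ = self.attn(win, win, win)                   # self-attention inside each window
        x = x + rev(attn, self.ws, B, H, W)                  # residual over the attention
        return x + self.mlp(self.norm2(x))                   # residual over the MLP

class ShuffleTransformerBlock(nn.Module):
    """WMSA followed by Shuffle WMSA, as in one Shuffle Transformer block."""
    def __init__(self, dim, heads, ws):
        super().__init__()
        self.wmsa = WindowUnit(dim, heads, ws, shuffle=False)
        self.shuffle_wmsa = WindowUnit(dim, heads, ws, shuffle=True)

    def forward(self, x):
        return self.shuffle_wmsa(self.wmsa(x))

# usage: y = ShuffleTransformerBlock(96, 3, 7)(torch.randn(2, 56, 56, 96))
```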
By introducing Shuffle Transformer, the authors aim to improve on conventional vision transformers for image tasks. The key innovation is the spatial shuffle mechanism, which strengthens interaction and mixing between features across windows and thereby improves the model's representational power and performance. [1][2][3]
#### References
- [1] [Shuffle Transformer重新思考视觉转换器的空间洗牌_Shuffle Transformer Rethinking](https://download.csdn.net/download/dwf1354046363/76464505)
- [2][3] [[2106] [NIPS 2021] Shuffle Transformer: Rethinking Spatial Shuffle for Vision Transformer](https://blog.csdn.net/weixin_43355838/article/details/123230579)