S2-Attention
S2-Attention is an attention mechanism used in neural networks, built on the Spatial-shift and Split Attention components of the S2-MLPv2 architecture. S2-MLPv2 replaces the usual N×N convolution with a spatial-shift operation combined with split attention, and it drops the long-range dependency modeling found in the MLP-Mixer line of architectures; S2-Attention likewise does not model long-range dependencies. Its performance improves on earlier models, but since no official implementation has been released, its practical contribution remains limited and its theoretical grounding is thin. [1][2][3]
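As a concrete illustration, below is a minimal PyTorch sketch of the two building blocks described above: a four-direction spatial shift and a split-attention module that fuses K parallel branches. The shapes, layer widths, and the way the branches are formed are illustrative assumptions for this sketch, not the authors' released implementation (which, as noted, is not public).

```python
import torch
import torch.nn as nn


def spatial_shift(x: torch.Tensor) -> torch.Tensor:
    """Shift four channel groups of a (B, H, W, C) map one pixel in four directions."""
    B, H, W, C = x.shape
    g = C // 4
    out = torch.zeros_like(x)
    out[:, 1:, :, :g] = x[:, :-1, :, :g]                     # group 1: shift down
    out[:, :-1, :, g:2 * g] = x[:, 1:, :, g:2 * g]           # group 2: shift up
    out[:, :, 1:, 2 * g:3 * g] = x[:, :, :-1, 2 * g:3 * g]   # group 3: shift right
    out[:, :, :-1, 3 * g:] = x[:, :, 1:, 3 * g:]             # group 4: shift left
    return out


class SplitAttention(nn.Module):
    """Fuse K branches with per-channel attention weights (layer sizes are assumptions)."""

    def __init__(self, channels: int, k: int = 3):
        super().__init__()
        self.k = k
        self.mlp1 = nn.Linear(channels, channels, bias=False)
        self.act = nn.GELU()
        self.mlp2 = nn.Linear(channels, channels * k, bias=False)

    def forward(self, x_all: torch.Tensor) -> torch.Tensor:
        # x_all: (B, K, N, C) -- K branches of N tokens with C channels each
        b, k, n, c = x_all.shape
        a = x_all.sum(dim=1).mean(dim=1)              # (B, C) global descriptor
        hat_a = self.mlp2(self.act(self.mlp1(a)))     # (B, K*C)
        attn = hat_a.reshape(b, k, c).softmax(dim=1)  # weights over the K branches
        return (attn.unsqueeze(2) * x_all).sum(dim=1)  # (B, N, C) fused output


# Example: fuse the identity branch with two shifted branches.
x = torch.randn(2, 14, 14, 384)  # (B, H, W, C)
branches = torch.stack(
    [x, spatial_shift(x), spatial_shift(spatial_shift(x))], dim=1
)  # (B, 3, H, W, C)
b, k, h, w, c = branches.shape
fused = SplitAttention(c, k=3)(branches.reshape(b, k, h * w, c))
print(fused.shape)  # torch.Size([2, 196, 384])
```

The split-attention step is what distinguishes this from a plain sum of shifted branches: the MLP produces one weight vector per branch per channel, so the network learns which shift direction to emphasize for each channel.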
#### References
- *1* [Deep Learning Image Classification (25): S2MLPv2 Network Explained](https://blog.csdn.net/baidu_36913330/article/details/120636036)
- *2* [S2-FPN: Scale-ware Strip Attention Guided Feature Pyramid Network for Real-time SS (arXiv 2022)](https://blog.csdn.net/m0_61899108/article/details/128255358)
- *3* [An Analysis of the Attention Mechanism](https://blog.csdn.net/liaominghui123/article/details/118875368)