Split Attention Module
The Split Attention Module is a building block for stronger feature extraction that can replace the 3×3 convolution in a ResNet block; networks built with it report state-of-the-art results on image classification and object detection. Its structure can be summarized in three steps: Split, Attention, and Pooling (aggregation) [2]. Concretely, the R split groups are first summed element-wise to obtain a fused representation U; U is global-average-pooled and passed through two fully connected layers to produce attention weights. After a softmax across the R groups, each attention weight is multiplied with its corresponding split group, and the weighted groups are summed to complete the split-attention operation [3]. Introducing this module improves feature extraction and yields better results on image classification and object detection tasks.
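A minimal PyTorch sketch of the operation described above. The class name `SplitAttention` and the `radix`/`reduction` parameters are illustrative, the two fully connected layers are written as 1×1 convolutions, and the sketch omits the cardinality (group) dimension and other details of the official ResNeSt block; it is meant only to make the sum → pool → FC → softmax → re-weight flow concrete.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SplitAttention(nn.Module):
    """Split attention over R groups (radix) of a C-channel feature map."""
    def __init__(self, channels: int, radix: int = 2, reduction: int = 4):
        super().__init__()
        self.radix = radix
        inter_channels = max(channels // reduction, 32)
        # Two "fully connected" layers, implemented as 1x1 convolutions,
        # mapping the pooled descriptor to per-group attention logits.
        self.fc1 = nn.Conv2d(channels, inter_channels, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(inter_channels)
        self.fc2 = nn.Conv2d(inter_channels, channels * radix, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, radix * C, H, W) -- the R split groups stacked along channels.
        b, rc, h, w = x.shape
        c = rc // self.radix
        splits = x.view(b, self.radix, c, h, w)

        # Step 1 (Split + sum): add the R groups to obtain U, then pool globally.
        u = splits.sum(dim=1)                                  # (B, C, H, W)
        gap = F.adaptive_avg_pool2d(u, 1)                      # (B, C, 1, 1)

        # Step 2 (Attention): two FC layers produce logits for each group,
        # normalized with a softmax across the R groups.
        att = self.fc2(F.relu(self.bn1(self.fc1(gap))))        # (B, radix*C, 1, 1)
        att = F.softmax(att.view(b, self.radix, c), dim=1)
        att = att.view(b, self.radix, c, 1, 1)

        # Step 3 (Aggregation): re-weight each group and sum them.
        return (splits * att).sum(dim=1)                       # (B, C, H, W)

# Usage sketch: a 2-way split of a 64-channel feature map.
x = torch.randn(2, 2 * 64, 56, 56)
print(SplitAttention(channels=64, radix=2)(x).shape)  # torch.Size([2, 64, 56, 56])
```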
#### References
- [1] [EPSANet: An Efficient Pyramid Split Attention Block on Convolutional Neural Network — paper walkthrough](https://blog.csdn.net/sc1434404661/article/details/118355530)
- [2][3] [Object Detection Backbone Series (1): ResNeSt — channel-wise Split Attention and its block implementation](https://blog.csdn.net/lzzzzzzm/article/details/123515132)