Attention Gate (AG) Skip Connections
Date: 2023-10-26 07:06:26
The attention gate (AG) uses skip connections to propagate semantic information more efficiently. An attention gate takes two inputs: the upsampled feature from the expanding path and the corresponding feature from the encoder. The upsampled feature acts as the gating signal, strengthening the learning of target regions relevant to the segmentation task while suppressing task-irrelevant regions. By training the gate's parameters with a suitable activation function (typically the sigmoid), the attention coefficients α are obtained. Finally, multiplying the encoder feature by α pixel by pixel yields the gated output. [1][2][3]
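The mechanism above can be sketched as a small PyTorch module. This is an illustrative additive-attention variant in the spirit of Attention U-Net, not code from the cited posts; the channel sizes (`f_g`, `f_x`, `f_int`) and the use of 1×1 convolutions for the projections are assumptions:

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Sketch of an attention gate on a U-Net skip connection.

    g: gating signal (upsampled decoder feature)
    x: encoder feature carried by the skip connection
    """
    def __init__(self, f_g, f_x, f_int):
        super().__init__()
        # 1x1 convolutions project both inputs into a shared intermediate space
        self.w_g = nn.Conv2d(f_g, f_int, kernel_size=1)
        self.w_x = nn.Conv2d(f_x, f_int, kernel_size=1)
        # psi collapses the combined feature to a single-channel map
        self.psi = nn.Conv2d(f_int, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()  # yields coefficients alpha in (0, 1)

    def forward(self, g, x):
        # additive attention: combine gate and encoder features, then score them
        q = self.relu(self.w_g(g) + self.w_x(x))
        alpha = self.sigmoid(self.psi(q))  # per-pixel attention coefficients
        return x * alpha                   # re-weight the encoder feature

# Example: gate a 64-channel encoder map with a 64-channel decoder map
gate = AttentionGate(f_g=64, f_x=64, f_int=32)
g = torch.randn(1, 64, 32, 32)
x = torch.randn(1, 64, 32, 32)
out = gate(g, x)
print(out.shape)  # torch.Size([1, 64, 32, 32])
```

Because α is squashed by the sigmoid into (0, 1), the gate can only attenuate the encoder feature, never amplify it, which is what suppresses task-irrelevant regions.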
#### References
- *1* *3* [Paper reading: UNet++: Nested attention-aware U-Net segmentation of liver CT images](https://blog.csdn.net/weixin_45241355/article/details/118300513)
- *2* [Attention U-Net: Learning Where to Look for the Pancreas](https://blog.csdn.net/fujikoo/article/details/111462676)