self-attention
Date: 2023-09-22 10:13:51
Self-attention is a mechanism in deep learning models that lets each position in an input sequence (or each region of an image) attend to every other position and weigh how much each one contributes to the output. Self-attention is widely used in natural language processing tasks such as machine translation, where the model must attend to different words in the source sentence to generate a correct translation. It has also been applied to computer vision tasks such as image captioning, where the model attends to different regions of the image while generating a description. Self-attention has been shown to improve the performance of deep learning models across a wide range of tasks.
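The mechanism described above can be illustrated with a minimal NumPy sketch of scaled dot-product self-attention, the variant used in Transformers. The dimensions and the random projection matrices `Wq`, `Wk`, `Wv` here are illustrative assumptions, not taken from any specific model:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model). Project to queries, keys, values,
    then let every position attend to every other position."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V, weights                     # weighted sum of values

# Toy example: 4 positions, 8-dimensional embeddings (arbitrary sizes)
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # one output vector per input position
```

Each row of `weights` sums to 1, so the output at each position is a convex combination of all the value vectors; this is the "weighing the importance of each part" step in concrete form.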