MATLAB code for the SE attention mechanism, and MATLAB code for the SAE attention mechanism?
Posted: 2023-11-13 16:58:59
MATLAB code for the SE (Squeeze-and-Excitation) attention mechanism:
```matlab
function [attention, output] = se_attention(x, w1, w2)
% Squeeze-and-Excitation (SE) channel attention.
% x:  input tensor, size = [height, width, channel]
% w1: weights of the first FC layer,  size = [channel, channel/r]
% w2: weights of the second FC layer, size = [channel/r, channel]
% attention: per-channel attention weights, size = [1, 1, channel]
% output:    recalibrated tensor, size = [height, width, channel]

% Squeeze: global average pooling over the spatial dimensions
z = mean(mean(x, 1), 2);            % size = [1, 1, channel]
z = reshape(z, 1, []);              % size = [1, channel]

% Excitation: bottleneck FC -> ReLU -> FC -> sigmoid
s = max(z * w1, 0);                 % size = [1, channel/r]
s = 1 ./ (1 + exp(-(s * w2)));      % size = [1, channel]

% Scale: reweight each channel of the input
attention = reshape(s, 1, 1, []);   % size = [1, 1, channel]
output = x .* attention;            % implicit expansion (R2016b+)
end
```
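A minimal call sketch, assuming the standard SE bottleneck shapes (`w1` is `[C, C/r]`, `w2` is `[C/r, C]`); the random weights here are placeholders, not trained parameters:

```matlab
% 8x8 feature map with 16 channels, reduction ratio r = 4
x  = rand(8, 8, 16);
w1 = randn(16, 4);   % 16 -> 4 bottleneck
w2 = randn(4, 16);   % 4  -> 16 expansion
[att, y] = se_attention(x, w1, w2);
size(att)   % 1 1 16  (one weight per channel)
size(y)     % 8 8 16  (same shape as the input)
```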
MATLAB code for the SAE (self-attention) mechanism:
```matlab
function [attention, output] = sae_attention(x, w1, w2)
% Self-attention over spatial positions, with a shared query/key projection.
% x:  input tensor, size = [height, width, channel]
% w1: query/key projection, size = [channel, d]
% w2: value projection,     size = [channel, d]
% attention: attention matrix, size = [height*width, height*width]
% output:    attended tensor,  size = [height, width, d]
[h, w, ~] = size(x);
xf = reshape(x, h*w, []);           % flatten spatial dims, size = [h*w, channel]

q = xf * w1;                        % query, size = [h*w, d]
k = q;                              % key shares the query projection in SAE
v = xf * w2;                        % value, size = [h*w, d]

% Scaled dot-product attention with a row-wise softmax
scores = (q * k') / sqrt(size(q, 2));        % size = [h*w, h*w]
scores = scores - max(scores, [], 2);        % subtract row max for stability
attention = exp(scores) ./ sum(exp(scores), 2);

output = reshape(attention * v, h, w, []);   % size = [h, w, d]
end
```
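A minimal call sketch, assuming the spatial self-attention variant above (projections `w1`, `w2` map channels to a dimension `d`; random weights are placeholders):

```matlab
% 4x4 feature map with 8 channels, projection dimension d = 8
x  = rand(4, 4, 8);
w1 = randn(8, 8);
w2 = randn(8, 8);
[att, y] = sae_attention(x, w1, w2);
size(att)                % 16 16  (one row of weights per spatial position)
sum(att, 2)              % each row of the softmax sums to 1
size(y)                  % 4 4 8
```

Note that `att` grows as the square of the number of spatial positions, so this sketch is only practical for small feature maps.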