Add fully connected layers, BatchNorm, and ReLU
Sure. I've added two more fully connected layers to the code, each followed by BatchNorm and a ReLU activation. See the code below:
```
import torch
from torch.nn import Module, Linear, BatchNorm1d, ReLU

class Residual(Module):
    def __init__(self, input, out):
        super(Residual, self).__init__()
        # First fully connected layer maps `input` features to `out` features
        self.fc1 = Linear(input, out)
        self.bn1 = BatchNorm1d(out)
        self.relu1 = ReLU()
        # Two additional fully connected layers, each keeping `out` features
        self.fc2 = Linear(out, out)
        self.bn2 = BatchNorm1d(out)
        self.relu2 = ReLU()
        self.fc3 = Linear(out, out)
        self.bn3 = BatchNorm1d(out)
        self.relu3 = ReLU()

    def forward(self, input):
        out = self.fc1(input)
        out = self.bn1(out)
        out = self.relu1(out)
        out = self.fc2(out)
        out = self.bn2(out)
        out = self.relu2(out)
        out = self.fc3(out)
        out = self.bn3(out)
        out = self.relu3(out)
        # Concatenate the block output with the original input along the
        # feature dimension, so the result has `input + out` features
        return torch.cat([out, input], dim=1)
```
Here, two more fully connected layers were added: the first layer maps the `input` features to `out` features, and the second and third layers each map `out` features to `out` features. Every layer is followed by BatchNorm and a ReLU activation. Note that `forward` concatenates the block output with the original input along the feature dimension, so the final output has `input + out` features rather than `out`.
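As a quick sanity check, here is a minimal usage sketch. The feature sizes (16 in, 32 out) and the batch size are arbitrary illustrative choices, not part of the original answer:

```
import torch

# Hypothetical sizes chosen for illustration only
block = Residual(16, 32)
block.eval()  # use BatchNorm running stats so a single small batch works

x = torch.randn(8, 16)  # a batch of 8 samples with 16 features each
y = block(x)
print(y.shape)          # torch.Size([8, 48]) -- 32 block features + 16 input features
```

The printed shape confirms the concatenation: the output width is `out + input` (48 = 32 + 16), which is the dense/skip-connection behavior of `torch.cat`, not the element-wise addition of a classic residual block.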