PReLU activation function PyTorch code
Below is a PyTorch code example of a network using the PReLU activation function:
```
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, 5)
        self.bn1 = nn.BatchNorm2d(6)
        self.prelu1 = nn.PReLU()  # PReLU with a single learnable slope parameter
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.bn2 = nn.BatchNorm2d(16)
        self.prelu2 = nn.PReLU()  # separate instance so each layer learns its own slope
        self.fc1 = nn.Linear(16 * 5 * 5, 120)  # 16 * 5 * 5 assumes a 3x32x32 input
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        # conv -> batchnorm -> PReLU -> max pool, twice
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.prelu1(x)
        x = F.max_pool2d(x, (2, 2))
        x = self.conv2(x)
        x = self.bn2(x)
        x = self.prelu2(x)
        x = F.max_pool2d(x, 2)
        # flatten all dimensions except the batch dimension
        x = x.view(-1, self.num_flat_features(x))
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x

    def num_flat_features(self, x):
        size = x.size()[1:]  # all dimensions except the batch dimension
        num_features = 1
        for s in size:
            num_features *= s
        return num_features

net = Net()
```
The code above defines a neural network that uses the PReLU activation. `nn.PReLU()` is PyTorch's built-in PReLU module; it computes PReLU(x) = max(0, x) + a · min(0, x), where the negative slope a is a learnable parameter (initialized to 0.25). Because each instance carries its own parameter, a separate `nn.PReLU()` is created in the `Net` constructor for each layer. In `forward()`, the input first passes through a convolution layer and a BatchNorm layer, then through PReLU for the nonlinear transformation, before being passed on to the next layer.
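For reference, here is a minimal standalone sketch of the PReLU API itself (the input shapes are illustrative): by default `nn.PReLU()` learns a single slope shared across all channels, `nn.PReLU(num_parameters=C)` learns one slope per channel, and `F.prelu` is the functional form that takes the slope tensor explicitly.
```
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 6, 4, 4)  # dummy input: batch of 1, 6 channels, 4x4 feature map

# Module form: one learnable slope shared across all channels (default init 0.25)
prelu = nn.PReLU()
print(prelu.weight)    # Parameter containing: tensor([0.2500], requires_grad=True)
print(prelu(x).shape)  # torch.Size([1, 6, 4, 4])

# One learnable slope per channel
prelu_per_channel = nn.PReLU(num_parameters=6)
print(prelu_per_channel.weight.shape)  # torch.Size([6])

# Functional form: the slope is passed in explicitly as a tensor
weight = torch.tensor([0.25])
print(F.prelu(x, weight).shape)  # torch.Size([1, 6, 4, 4])
```
As a quick sanity check of the network above, `net(torch.randn(1, 3, 32, 32))` should return a tensor of shape `[1, 10]`, matching the `16 * 5 * 5` flattened size assumed by `fc1`.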