Building a dual-channel convolutional neural network with nn.Sequential() in Python
A dual-channel convolutional neural network can be built by grouping its layers into nn.Sequential() containers: a shared convolutional feature extractor is applied to each input branch, and the flattened features of the two branches are concatenated before the fully connected classifier.
Here is an example:
```python
import torch
import torch.nn as nn


class DoubleChannelCNN(nn.Module):
    def __init__(self):
        super(DoubleChannelCNN, self).__init__()
        # Shared feature extractor; both input branches pass through the same layers (shared weights).
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2, stride=2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.Conv2d(128, 256, kernel_size=3, padding=1),
            nn.BatchNorm2d(256),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=2, stride=2),
        )
        # Classifier: the first Linear takes 2 * 256 * 7 * 7 features because the two
        # branches are concatenated (assumes 28x28 inputs, halved twice by the pooling layers).
        self.classifier = nn.Sequential(
            nn.Linear(2 * 256 * 7 * 7, 1024),
            nn.Dropout(p=0.5),
            nn.Linear(1024, 10),
        )

    def forward(self, x):
        x1 = x[:, :3]   # first branch: channels 0-2
        x2 = x[:, 3:]   # second branch: channels 3-5
        x1 = self.features(x1)
        x2 = self.features(x2)
        x1 = x1.view(x1.size(0), -1)   # flatten each branch
        x2 = x2.view(x2.size(0), -1)
        x = torch.cat((x1, x2), dim=1)  # concatenate the two branches
        x = self.classifier(x)
        return x
```
In this example, we first define a DoubleChannelCNN class that inherits from nn.Module. In the constructor, the convolutional, batch-normalization, ReLU, and pooling layers are grouped into one nn.Sequential feature extractor, and the fully connected and dropout layers into another nn.Sequential classifier. In the forward pass, the input is split into two 3-channel branches, each branch is processed by the same (shared-weight) feature extractor, and the flattened outputs are concatenated and passed through the classifier to produce the final output.
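A minimal usage sketch (not part of the original answer): it assumes a 6-channel, 28x28 input, which is what makes the 256 * 7 * 7 flattened size per branch line up after the two pooling layers.
```python
import torch

model = DoubleChannelCNN()
x = torch.randn(8, 6, 28, 28)   # batch of 8; 6 channels = two 3-channel branches
out = model(x)
print(out.shape)                # torch.Size([8, 10])
```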
This is a simple example; you can modify and extend it to fit your needs, for instance by giving each branch its own feature extractor instead of sharing weights, as sketched below.
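As one hedged illustration of such an extension (the class name TwoBranchCNN and the helper make_branch are assumptions for this sketch, not part of the original answer), each branch gets its own nn.Sequential feature extractor so the two inputs are processed with independent weights:
```python
import torch
import torch.nn as nn


def make_branch():
    # One convolutional branch; same layer layout as DoubleChannelCNN.features.
    return nn.Sequential(
        nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.BatchNorm2d(32), nn.ReLU(inplace=True),
        nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.BatchNorm2d(64), nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=2, stride=2),
        nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.BatchNorm2d(128), nn.ReLU(inplace=True),
        nn.Conv2d(128, 256, kernel_size=3, padding=1), nn.BatchNorm2d(256), nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=2, stride=2),
    )


class TwoBranchCNN(nn.Module):
    def __init__(self):
        super(TwoBranchCNN, self).__init__()
        self.branch1 = make_branch()  # independent weights for branch 1
        self.branch2 = make_branch()  # independent weights for branch 2
        self.classifier = nn.Sequential(
            nn.Linear(2 * 256 * 7 * 7, 1024),
            nn.Dropout(p=0.5),
            nn.Linear(1024, 10),
        )

    def forward(self, x):
        f1 = self.branch1(x[:, :3]).flatten(1)  # first 3 channels
        f2 = self.branch2(x[:, 3:]).flatten(1)  # remaining 3 channels
        return self.classifier(torch.cat((f1, f2), dim=1))
```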