ResNeXt-101 Architecture Diagram
Date: 2024-04-26 10:18:30
ResNeXt-101 is a deep neural network that improves and extends ResNet: each bottleneck block splits its transformation into a set of parallel paths (the "cardinality", 32 here), implemented as a grouped convolution. Its structure is shown below:
```
Input
|
|--- Convolutional Layer (7x7, stride 2)
|--- Batch Normalization
|--- ReLU
|--- Max Pooling (3x3, stride 2)
|
|--- Residual Block x 3 (with cardinality=32)
| |--- Convolutional Layer (1x1, stride 1)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (3x3, stride 1)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (1x1, stride 1)
| |--- Batch Normalization
| |--- Skip Connection
| |--- ReLU
|
|--- Residual Block x 4 (with cardinality=32)
| |--- Convolutional Layer (1x1, stride 2)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (3x3, stride 1)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (1x1, stride 1)
| |--- Batch Normalization
| |--- Skip Connection
| |--- ReLU
|
|--- Residual Block x 23 (with cardinality=32)
| |--- Convolutional Layer (1x1, stride 2)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (3x3, stride 1)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (1x1, stride 1)
| |--- Batch Normalization
| |--- Skip Connection
| |--- ReLU
|
|--- Residual Block x 3 (with cardinality=32)
| |--- Convolutional Layer (1x1, stride 2)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (3x3, stride 1)
| |--- Batch Normalization
| |--- ReLU
| |--- Convolutional Layer (1x1, stride 1)
| |--- Batch Normalization
| |--- Skip Connection
| |--- ReLU
|
|--- Average Pooling (7x7, stride 1)
|--- Fully Connected Layer
|--- Softmax Output
```
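As a sanity check on the strides in the diagram, the feature-map sizes for a standard 224×224 input can be traced with basic convolution arithmetic, and the parameter savings from cardinality can be computed directly. The sketch below is standalone Python, not framework code; the stage width of 128 channels is an assumption taken from the common 32×4d configuration, and the stride-2 placement follows the diagram above.

```python
def conv_out(size, kernel, stride, padding):
    """Spatial output size of a conv/pool layer: floor((size + 2p - k) / s) + 1."""
    return (size + 2 * padding - kernel) // stride + 1

# Stem: 7x7 conv stride 2 (pad 3), then 3x3 max pool stride 2 (pad 1)
size = 224
size = conv_out(size, 7, 2, 3)   # 7x7/2 conv: 224 -> 112
size = conv_out(size, 3, 2, 1)   # 3x3/2 max pool: 112 -> 56

# Each of the three later stages opens with a stride-2 layer that halves the map
stage_sizes = [size]
for _ in range(3):
    size = conv_out(size, 1, 2, 0)   # stride-2 1x1, per the diagram above
    stage_sizes.append(size)
print(stage_sizes)   # [56, 28, 14, 7] -- 7x7 matches the final average pool

def conv_params(c_in, c_out, k, groups=1):
    """Weight count of a conv layer (bias omitted): c_out * (c_in / groups) * k^2."""
    return c_out * (c_in // groups) * k * k

# Why cardinality helps: a grouped 3x3 conv uses 1/groups of the weights of a
# dense one at the same width (128 channels assumed, from the 32x4d variant).
dense = conv_params(128, 128, 3, groups=1)     # ordinary 3x3 convolution
grouped = conv_params(128, 128, 3, groups=32)  # cardinality=32 grouped conv
print(dense, grouped)   # 147456 4608
```

Note that reference implementations sometimes place the stride-2 on the 3x3 convolution rather than the first 1x1; either choice halves the spatial size once per stage, so the traced sizes are the same.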