VGG16 Network Code Implementation
Below is a code implementation of the VGG16 network:
```python
import tensorflow as tf
def VGG16(input_shape=(224, 224, 3), num_classes=1000):
    model = tf.keras.Sequential([
        # Block 1
        tf.keras.layers.Conv2D(64, (3, 3), activation='relu', padding='same', input_shape=input_shape),
        tf.keras.layers.Conv2D(64, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.MaxPooling2D((2, 2), strides=(2, 2)),
        # Block 2
        tf.keras.layers.Conv2D(128, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(128, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.MaxPooling2D((2, 2), strides=(2, 2)),
        # Block 3
        tf.keras.layers.Conv2D(256, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(256, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(256, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.MaxPooling2D((2, 2), strides=(2, 2)),
        # Block 4
        tf.keras.layers.Conv2D(512, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(512, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(512, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.MaxPooling2D((2, 2), strides=(2, 2)),
        # Block 5
        tf.keras.layers.Conv2D(512, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(512, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.Conv2D(512, (3, 3), activation='relu', padding='same'),
        tf.keras.layers.MaxPooling2D((2, 2), strides=(2, 2)),
        # Classifier
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(4096, activation='relu'),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(4096, activation='relu'),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(num_classes, activation='softmax')
    ])
    return model
```
This implementation uses the Keras API of TensorFlow 2.x. Every convolutional layer in the model uses the same settings: a 3x3 kernel, ReLU activation, and 'same' padding. The classifier at the end consists of fully connected layers: the first and second Dense layers each have 4096 units (each followed by 50% dropout), and the final output layer uses a softmax activation over num_classes classes.
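As a quick sanity check, the model can be instantiated and inspected as follows. This is a minimal usage sketch, not part of the original code: the 10-class setting, the Adam optimizer, and the sparse categorical cross-entropy loss are illustrative choices.

```python
import numpy as np

# Build a VGG16 model for a hypothetical 10-class problem (illustrative)
model = VGG16(input_shape=(224, 224, 3), num_classes=10)

# Compile with typical settings (adjust optimizer/loss to your task)
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# Print a layer-by-layer summary to verify the architecture
model.summary()

# Run a forward pass on a dummy batch to confirm the output shape
dummy = np.random.rand(1, 224, 224, 3).astype('float32')
print(model(dummy).shape)  # expected: (1, 10)
```

Note that the summary also makes the parameter count visible; most of VGG16's parameters sit in the first 4096-unit Dense layer after Flatten.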