Using TensorFlow 2.7 with CUDA 11.6, add batch normalization (BN) layers to LeNet-5 for handwritten digit recognition, compare the training-set accuracy of several approaches (fully connected network, LeNet-5, LeNet-5 + BN), plot the accuracy curves against training epoch, and provide the implementation code.
Sure, here is the implementation I put together.
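Since the question pins TensorFlow 2.7 and CUDA 11.6, one quick optional check before training is whether this TensorFlow build can actually see the GPU; a minimal sketch using standard TensorFlow APIs:
```python
import tensorflow as tf

# Expect something like '2.7.x' and a non-empty GPU list
# if the CUDA toolchain is visible to TensorFlow.
print(tf.__version__)
print(tf.config.list_physical_devices('GPU'))
```
If the GPU list is empty, the code still runs on the CPU, just more slowly. The full implementation follows.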
```python
import tensorflow as tf
from tensorflow.keras import layers, models, optimizers, datasets
import numpy as np
import matplotlib.pyplot as plt

# Load MNIST, scale pixels to [0, 1], and add a channel dimension
(train_images, train_labels), (test_images, test_labels) = datasets.mnist.load_data()
train_images, test_images = train_images / 255.0, test_images / 255.0
train_images, test_images = train_images[..., np.newaxis], test_images[..., np.newaxis]

# Fully connected baseline
def build_fc():
    model = models.Sequential([
        layers.Flatten(input_shape=(28, 28, 1)),
        layers.Dense(units=128, activation='relu'),
        layers.Dense(units=10, activation='softmax')
    ])
    return model

# LeNet-5
def build_lenet5():
    model = models.Sequential([
        layers.Conv2D(filters=6, kernel_size=(5, 5), activation='sigmoid', input_shape=(28, 28, 1)),
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Conv2D(filters=16, kernel_size=(5, 5), activation='sigmoid'),
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Flatten(),
        layers.Dense(units=120, activation='sigmoid'),
        layers.Dense(units=84, activation='sigmoid'),
        layers.Dense(units=10, activation='softmax')
    ])
    return model

# LeNet-5 + BN: BatchNormalization is inserted between each conv/dense layer and its activation
def build_lenet5_bn():
    model = models.Sequential([
        layers.Conv2D(filters=6, kernel_size=(5, 5), input_shape=(28, 28, 1)),
        layers.BatchNormalization(),
        layers.Activation('sigmoid'),
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Conv2D(filters=16, kernel_size=(5, 5)),
        layers.BatchNormalization(),
        layers.Activation('sigmoid'),
        layers.MaxPooling2D(pool_size=(2, 2)),
        layers.Flatten(),
        layers.Dense(units=120),
        layers.BatchNormalization(),
        layers.Activation('sigmoid'),
        layers.Dense(units=84),
        layers.BatchNormalization(),
        layers.Activation('sigmoid'),
        layers.Dense(units=10, activation='softmax')
    ])
    return model

# Train a model, plot its accuracy curves, and return the History object
# so the three models can be compared afterwards
def train_and_plot(model, train_images, train_labels, test_images, test_labels):
    model.compile(optimizer=optimizers.Adam(learning_rate=1e-3),
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    history = model.fit(train_images, train_labels, epochs=10, batch_size=64,
                        validation_data=(test_images, test_labels))
    plt.plot(history.history['accuracy'])
    plt.plot(history.history['val_accuracy'])
    plt.legend(['training', 'validation'], loc='best')
    plt.title('Accuracy')
    plt.xlabel('epoch')
    plt.show()
    return history

# Build and train the fully connected baseline
fc_model = build_fc()
fc_history = train_and_plot(fc_model, train_images, train_labels, test_images, test_labels)
# Build and train LeNet-5
lenet5_model = build_lenet5()
lenet5_history = train_and_plot(lenet5_model, train_images, train_labels, test_images, test_labels)
# Build and train LeNet-5 + BN
lenet5_bn_model = build_lenet5_bn()
lenet5_bn_history = train_and_plot(lenet5_bn_model, train_images, train_labels, test_images, test_labels)
```
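To answer the comparison part of the question directly, the three `History` objects captured above (`fc_history`, `lenet5_history`, `lenet5_bn_history`) can be overlaid on a single training-accuracy-versus-epoch figure; a minimal plotting sketch:
```python
# Overlay the training-set accuracy of the three models on one figure
plt.figure()
plt.plot(fc_history.history['accuracy'], label='Fully connected')
plt.plot(lenet5_history.history['accuracy'], label='LeNet-5')
plt.plot(lenet5_bn_history.history['accuracy'], label='LeNet-5 + BN')
plt.xlabel('epoch')
plt.ylabel('training accuracy')
plt.legend(loc='best')
plt.title('Training accuracy vs. epoch')
plt.show()
```
Typically the BN variant converges noticeably faster than plain LeNet-5 with sigmoid activations, since BN mitigates saturation of the sigmoid; the exact curves will vary from run to run.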
Note that the fully connected network above is only a simple baseline; if you want to compare additional approaches, you can extend the code along the same lines. For brevity, only the accuracy curves are plotted here. To plot loss curves instead, replace `plt.plot(history.history['accuracy'])` with `plt.plot(history.history['loss'])` (and likewise `val_accuracy` with `val_loss`), as sketched below.
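For example, the plotting lines inside `train_and_plot` could be swapped for the following to show loss rather than accuracy (same `History` object, only the dictionary keys change):
```python
# Plot training and validation loss instead of accuracy
plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.legend(['training', 'validation'], loc='best')
plt.title('Loss')
plt.xlabel('epoch')
plt.show()
```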