Iris Classification and Prediction with a Deep Residual Network (ResNet)
Deep residual networks (ResNet) are widely used in deep learning, particularly for image recognition. Below is a simple Python example that applies the core ResNet idea, residual blocks with skip connections, to Iris classification using Keras. Note that the Iris dataset is not included in `tf.keras.datasets`; it is loaded here from scikit-learn, so this example assumes both TensorFlow and scikit-learn are installed. Because Iris is tabular data with only four features per sample rather than image data, the residual blocks are built from Dense layers instead of Conv2D layers.
```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# 加载鸢尾花数据集（Keras不自带Iris，这里使用scikit-learn）
data = load_iris()
x_train, x_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.2, random_state=42, stratify=data.target)

# 标准化四个表格型特征（Iris不是图像数据，不需要除以255）
scaler = StandardScaler()
x_train = scaler.fit_transform(x_train)
x_test = scaler.transform(x_test)

# 定义ResNet基础块：两层Dense + BatchNorm，并通过跳跃连接与输入相加
def residual_block(input_tensor, num_units):
    shortcut = input_tensor
    x = layers.Dense(num_units)(input_tensor)
    x = layers.BatchNormalization()(x)
    x = layers.Activation('relu')(x)
    x = layers.Dense(num_units)(x)
    x = layers.BatchNormalization()(x)
    x = layers.Add()([x, shortcut])  # 跳跃连接（残差相加）
    return layers.Activation('relu')(x)

# 使用函数式API构建模型，残差块需要分支与合并，Sequential无法表达
num_classes = 3
inputs = layers.Input(shape=(x_train.shape[1],))
x = layers.Dense(32, activation='relu')(inputs)
for _ in range(2):  # 残差块数量，可以增加以加深网络
    x = residual_block(x, 32)
x = layers.Dropout(0.5)(x)  # 防止过拟合
outputs = layers.Dense(num_classes, activation='softmax')(x)  # 输出层
model = models.Model(inputs, outputs)

# 编译模型
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])

# 训练模型
history = model.fit(x_train, y_train, epochs=10, validation_data=(x_test, y_test))

# 使用训练好的模型进行预测
predictions = model.predict(x_test)

# 打印一些预测结果（每行为各类别的softmax概率）
print(predictions[:5])
```
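The softmax output above is a vector of class probabilities for each test sample rather than a class label. A minimal follow-up sketch, assuming the `predictions` and `y_test` arrays from the code above and the standard class names returned by scikit-learn's `load_iris()`, converts the probabilities into predicted classes and checks accuracy on the held-out set:
```python
import numpy as np

# 取每行概率最大的索引作为预测类别
predicted_classes = np.argmax(predictions, axis=1)

# Iris的三个类别名称（与load_iris().target_names一致）
class_names = ['setosa', 'versicolor', 'virginica']
print([class_names[i] for i in predicted_classes[:5]])

# 与测试集标签比较，计算简单的准确率
accuracy = np.mean(predicted_classes == y_test)
print(f"Test accuracy: {accuracy:.2%}")
```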