How to tune a fully connected neural network in Python (with code)
Hyperparameter tuning is an important step in optimizing a neural network model. Below is a simple tuning example for a fully connected neural network:
```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam
# Load the data
data = pd.read_csv('data.csv')
# Split into training and test sets
X_train, X_test, y_train, y_test = train_test_split(data.iloc[:, :-1], data.iloc[:, -1], test_size=0.2)
# Standardize the features (fit the scaler on the training set only)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)
# Build the fully connected network
model = Sequential()
model.add(Dense(units=16, activation='relu', input_dim=X_train.shape[1]))
model.add(Dense(units=8, activation='relu'))
model.add(Dense(units=1, activation='sigmoid'))
# Compile the model
model.compile(loss='binary_crossentropy', optimizer=Adam(learning_rate=0.001), metrics=['accuracy'])
# Train the model
model.fit(X_train, y_train, batch_size=32, epochs=50, validation_data=(X_test, y_test))
# Hyperparameter tuning
# Note: in newer Keras/TensorFlow releases this wrapper has been removed;
# scikeras.wrappers.KerasClassifier is the replacement there.
from keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV
# Model-building function used by the scikit-learn wrapper
def build_model(optimizer='adam'):
    model = Sequential()
    model.add(Dense(units=16, activation='relu', input_dim=X_train.shape[1]))
    model.add(Dense(units=8, activation='relu'))
    model.add(Dense(units=1, activation='sigmoid'))
    model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=['accuracy'])
    return model
# Hyperparameter search space
param_grid = {'optimizer': ['adam', 'sgd', 'rmsprop'], 'batch_size': [16, 32, 64], 'epochs': [50, 100, 150]}
# Wrap the Keras model so scikit-learn can use it
model = KerasClassifier(build_fn=build_model, verbose=0)
# Exhaustive grid search with 3-fold cross-validation
grid_search = GridSearchCV(estimator=model, param_grid=param_grid, cv=3)
grid_search.fit(X_train, y_train)
# Print the best parameter combination
print(grid_search.best_params_)
```
Here the hyperparameter search uses GridSearchCV, which exhaustively evaluates every combination in the specified parameter grid with cross-validation and returns the best one. In this example, the search covers three hyperparameters: the optimizer, the batch size, and the number of training epochs.
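Once the search finishes, the best combination can be plugged back into the model-building function to train a final model and evaluate it on the held-out test set. The sketch below is a minimal illustration that reuses the `build_model` function and the `X_train`/`X_test` splits defined above; the names `best` and `final_model` are just illustrative.
```python
# A minimal sketch (assumes the objects defined above): retrain with the
# best hyperparameters found by GridSearchCV and evaluate on the test set.
best = grid_search.best_params_

final_model = build_model(optimizer=best['optimizer'])
final_model.fit(X_train, y_train,
                batch_size=best['batch_size'],
                epochs=best['epochs'],
                verbose=0)

loss, accuracy = final_model.evaluate(X_test, y_test, verbose=0)
print(f'Test loss: {loss:.4f}, test accuracy: {accuracy:.4f}')
```
If the full grid (27 combinations times 3 folds here) is too slow, sklearn's RandomizedSearchCV accepts the same parameter dictionary and samples only a fixed number of combinations instead.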