Cross-Validation and Grid-Search Hyperparameter Tuning Code for Logistic Regression and Neural Networks
Both logistic regression and neural networks commonly use cross-validation for model evaluation and hyperparameter tuning. Grid search is a common tuning strategy: it enumerates every combination of the values in a given parameter grid and uses cross-validation to pick the best one.
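As a quick illustration of the "all possible combinations" idea, scikit-learn's `ParameterGrid` can enumerate the candidate settings that `GridSearchCV` would evaluate. The grid below is a small hypothetical example, not taken from the code that follows:
```python
from sklearn.model_selection import ParameterGrid

# Hypothetical grid: 2 values of C x 2 penalties = 4 candidate settings
param_grid = {'C': [0.1, 1], 'penalty': ['l1', 'l2']}

for params in ParameterGrid(param_grid):
    print(params)  # prints each of the 4 combinations in turn
```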
For logistic regression:
```python
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.linear_model import LogisticRegression
# Assume we already have a training set X_train, y_train and a grid of candidate parameters
param_grid = {'C': [0.001, 0.01, 0.1, 1, 10], 'penalty': ['l1', 'l2']}
# liblinear supports both the l1 and l2 penalties used in the grid
logreg = LogisticRegression(solver='liblinear')
# Run the grid search with 5-fold cross-validation
grid_search = GridSearchCV(logreg, param_grid, cv=5)
grid_search.fit(X_train, y_train)
# Best hyperparameters and the corresponding mean CV score
best_params = grid_search.best_params_
best_score = grid_search.best_score_
# best_estimator_ is the model refitted on the full training set with the best parameters
optimized_logreg = grid_search.best_estimator_
# Cross-validation scores of the tuned model
scores = cross_val_score(optimized_logreg, X_train, y_train, cv=5)
```
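To run the snippet above end to end you need actual data. The minimal sketch below assumes synthetic data from `make_classification` and a train/test split (both are illustrative stand-ins, not part of the original code), then evaluates the tuned model on the held-out test set:
```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Hypothetical synthetic dataset standing in for the real X_train / y_train
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

param_grid = {'C': [0.001, 0.01, 0.1, 1, 10], 'penalty': ['l1', 'l2']}
grid_search = GridSearchCV(LogisticRegression(solver='liblinear'), param_grid, cv=5)
grid_search.fit(X_train, y_train)

# Evaluate the refitted best model on data the search never saw
y_pred = grid_search.best_estimator_.predict(X_test)
print('Best params:', grid_search.best_params_)
print('Test accuracy:', accuracy_score(y_test, y_pred))
```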
For a neural network (for example, with Keras):
```python
# Note: keras.wrappers.scikit_learn has been removed from recent TensorFlow/Keras releases;
# on newer versions use the scikeras package (from scikeras.wrappers import KerasClassifier) instead
from tensorflow.keras.wrappers.scikit_learn import KerasClassifier
from sklearn.model_selection import GridSearchCV
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
def create_neural_network(optimizer='adam'):
    model = Sequential()
    model.add(Dense(units=64, activation='relu', input_dim=X_train.shape[1]))
    model.add(Dense(units=32, activation='relu'))
    model.add(Dense(units=1, activation='sigmoid'))  # single sigmoid output for binary classification
    model.compile(loss='binary_crossentropy', optimizer=optimizer, metrics=['accuracy'])
    return model
# Wrap the model-building function in a KerasClassifier so it behaves like a scikit-learn estimator
nn = KerasClassifier(build_fn=create_neural_network, epochs=50, batch_size=32)
# Parameter grid: candidate optimizers to try
param_grid = {'optimizer': ['SGD', 'RMSprop', 'Adam']}
# Run the grid search
grid_search = GridSearchCV(nn, param_grid, cv=5)
grid_search.fit(X_train, y_train)
# Get the best parameters and the best model
best_optimizer = grid_search.best_params_['optimizer']
best_nn = grid_search.best_estimator_
# Cross-validation scores of the best model
scores = cross_val_score(best_nn, X_train, y_train, cv=5)
```
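The grid does not have to stop at the optimizer: the wrapper also exposes training settings such as `epochs` and `batch_size` as estimator parameters, so they can be tuned in the same search. The sketch below uses illustrative values only, and reads the per-combination scores from `cv_results_`:
```python
# Illustrative grid that tunes the optimizer together with training settings;
# the values below are placeholders, not recommendations
param_grid = {
    'optimizer': ['SGD', 'Adam'],
    'epochs': [20, 50],
    'batch_size': [16, 32],
}
grid_search = GridSearchCV(nn, param_grid, cv=5)
grid_search.fit(X_train, y_train)

# cv_results_ holds the mean cross-validated score for every combination
for params, mean_score in zip(grid_search.cv_results_['params'],
                              grid_search.cv_results_['mean_test_score']):
    print(params, mean_score)
```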