Finding Optimal Parameters with XGBoost Cross-Validation
Posted: 2023-10-15 14:05:53
XGBoost-parameter-tuning-code.md
The following steps use cross-validation to find optimal parameters for an XGBoost model:
1. Import the required libraries
```python
import xgboost as xgb
from sklearn.model_selection import GridSearchCV, train_test_split
```
2. Prepare the dataset
```python
# X and y are the feature matrix and labels of your dataset.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
# DMatrix is XGBoost's native data structure; it is needed by the native API
# (e.g. xgb.cv), while GridSearchCV below works on the arrays directly.
dtrain = xgb.DMatrix(X_train, label=y_train)
dtest = xgb.DMatrix(X_test, label=y_test)
```
3. Define the parameter grid
```python
params = {
    'max_depth': [3, 4, 5],
    'learning_rate': [0.05, 0.1, 0.2],
    'n_estimators': [50, 100, 200],
    'min_child_weight': [1, 3, 5],
    'gamma': [0, 0.1, 0.2],
    'subsample': [0.6, 0.8, 1],
    'colsample_bytree': [0.6, 0.8, 1],
    'reg_alpha': [0, 0.1, 0.5],
    'reg_lambda': [0.1, 1, 5]
}
```
4. Define the model and the cross-validated search
```python
xgb_model = xgb.XGBClassifier()
grid = GridSearchCV(estimator=xgb_model, param_grid=params, cv=5, n_jobs=-1)
```
5. Train the model
```python
grid.fit(X_train, y_train)
```
6. Print the best parameters
```python
print(grid.best_params_)
```
7. Print the model score (`best_score_` is the mean cross-validated score of the best parameter combination)
```python
print(grid.best_score_)
```
With the steps above, you can use cross-validation to find optimal parameters for an XGBoost model.