Code for 10-fold cross-validation with XGBoost
XGBoost is a powerful gradient-boosting library widely used in machine-learning competitions and production projects. 10-fold cross-validation (k-fold cross-validation with k = 10) evaluates a model's ability to generalize and helps detect overfitting. Here is a simple Python example that combines scikit-learn with xgboost to run 10-fold cross-validation:
```python
import xgboost as xgb
from sklearn.model_selection import KFold

# Assumes X and y are NumPy arrays (the positional indexing below requires it)
X = ...  # input feature matrix
y = ...  # target labels

# 10-fold split, shuffled with a fixed seed for reproducibility
kf = KFold(n_splits=10, shuffle=True, random_state=42)

cv_scores = []
for train_index, val_index in kf.split(X):
    X_train, y_train = X[train_index], y[train_index]
    X_val, y_val = X[val_index], y[val_index]

    # Re-initialize the model each fold so folds do not share state.
    # In xgboost >= 1.6, early_stopping_rounds belongs in the constructor;
    # older versions accepted it as a fit() keyword instead.
    model = xgb.XGBClassifier(early_stopping_rounds=10)
    model.fit(X_train, y_train, eval_set=[(X_val, y_val)], verbose=False)
    cv_scores.append(model.score(X_val, y_val))

# Average accuracy across the 10 folds
mean_cv_score = sum(cv_scores) / len(cv_scores)
print("Mean cross-validation score:", mean_cv_score)
```