coef1 = pd.DataFrame()
for alpha in np.linspace(0.0001, 0.2, 20):
    lasso_clf = Lasso(alpha=alpha)
    lasso_clf.fit(X_train[features_without_ones], y_train)
    df = pd.DataFrame([lasso_clf.coef_], columns=X_train[features_without_ones].columns)
    df['alpha'] = alpha
    coef1 = pd.concat([coef1, df], ignore_index=True)
coef1.head()
plt.figure(figure=(9,6), dpi=600)
for feature in X_train.columns[:-1]:
    plt.plot('alpha', feature, data=coef1)
plt.legend(loc='upper right')
plt.xlabel(r'$\alpha$', fontsize=15)
plt.ylabel("coefficient", fontsize=15)
plt.show()
This code uses Lasso regression for feature selection and plots how each feature's coefficient changes across different values of the hyperparameter alpha. Specifically, it loops over a range of alpha values, fits a Lasso regression model for each one, and records the resulting coefficients. These coefficients are collected into a single DataFrame, and each feature's coefficient is then plotted as a curve against alpha.
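As a side note, scikit-learn also provides lasso_path, which computes the coefficients for an entire grid of alpha values in a single call instead of refitting in a Python loop. A minimal sketch, assuming the same X_train, y_train, and features_without_ones as in the snippet above:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import lasso_path

# Compute the whole coefficient path over the same alpha grid in one call.
alphas = np.linspace(0.0001, 0.2, 20)
alphas_used, coefs, _ = lasso_path(X_train[features_without_ones], y_train, alphas=alphas)

# coefs has shape (n_features, n_alphas): one coefficient curve per feature.
plt.figure(figsize=(9, 6))
for name, path in zip(X_train[features_without_ones].columns, coefs):
    plt.plot(alphas_used, path, label=name)
plt.xlabel(r'$\alpha$', fontsize=15)
plt.ylabel('coefficient', fontsize=15)
plt.legend(loc='upper right')
plt.show()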
Related questions
# Regularization path of Lasso
coef1 = pd.DataFrame()
for alpha in np.linspace(0.0001, 0.2, 20):
    lasso_clf = Lasso(alpha=alpha)
    lasso_clf.fit(X_train[features_without_ones], y_train)
    df = pd.DataFrame([lasso_clf.coef_], columns=X_train[features_without_ones].columns)
    df['alpha'] = alpha
    coef1 = pd.concat([coef1, df], ignore_index=True)
coef1.head()
plt.figure(figure=(9,6), dpi=600)
for feature in X_train.columns[:-1]:
    plt.plot('alpha', feature, data=coef1)
plt.legend(loc='upper right')
plt.xlabel(r'$\alpha$', fontsize=15)
plt.ylabel("coefficient", fontsize=15)
plt.show()

Please modify this code.
# Regularization path of Lasso coefficients
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import Lasso

coef1 = pd.DataFrame()
for alpha in np.linspace(0.0001, 0.2, 20):
    # Fit a Lasso model at each alpha and record its coefficients
    lasso_clf = Lasso(alpha=alpha)
    lasso_clf.fit(X_train[features_without_ones], y_train)
    df = pd.DataFrame([lasso_clf.coef_], columns=X_train[features_without_ones].columns)
    df['alpha'] = alpha
    coef1 = pd.concat([coef1, df], ignore_index=True)
plt.figure(figsize=(9,6), dpi=600)
for feature in X_train.columns[:-1]:
    plt.plot('alpha', feature, data=coef1)
plt.legend(loc='upper right')
plt.xlabel(r'$\alpha$',fontsize=15)
plt.ylabel("coefficient",fontsize=15)
plt.show()
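With the loop above, coef1 holds one row of coefficients per alpha value. A quick follow-up (a sketch, assuming the code above has already been run) to list the features that Lasso has driven to exactly zero at the largest alpha on the grid:

# Take the row for the largest alpha; zero coefficients mean the feature was eliminated.
last_row = coef1.iloc[-1].drop('alpha')
zeroed = last_row[last_row == 0].index.tolist()
print('features eliminated at alpha = %.4f:' % coef1['alpha'].iloc[-1], zeroed)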
from sklearn.linear_model import Lasso

lasso = Lasso(alpha=0.01)
lasso.fit(X_train[features_without_ones], y_train)
print(lasso.coef_)
print(lasso.intercept_)

coef1 = pd.DataFrame()
for alpha in np.linspace(0.0001, 0.2, 20):
    lasso_clf = Lasso(alpha=alpha)
    lasso_clf.fit(X_train[features_without_ones], y_train)
    df = pd.DataFrame([lasso_clf.coef_], columns=X_train[features_without_ones].columns)
    df['alpha'] = alpha
    coef1 = pd.concat([coef1, df], ignore_index=True)

plt.figure(figsize=(9,6), dpi=600)
for feature in X_train.columns[:-1]:
    plt.plot('alpha', feature, data=coef1)
plt.legend(loc='upper right')
plt.xlabel(r'$\alpha$', fontsize=15)
plt.ylabel("coefficient", fontsize=15)
plt.show()
This code uses a Lasso regression model for feature selection. Lasso regression is a linear regression model that adds an L1 regularization term to the loss function, which can shrink the coefficients of unimportant features to exactly zero and thereby performs feature selection. Specifically, the code first fits a Lasso model to the training data and prints its coefficients and intercept. It then uses np.linspace to generate 20 evenly spaced alpha values; for each alpha, it fits a Lasso model to the training data and stores the coefficients together with the corresponding alpha in coef1. Finally, it uses plt.plot to draw each feature's coefficient curve across the alpha values, which helps in choosing a suitable alpha.
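Since the paragraph above mentions using the curves to choose a good alpha, a common complement is to let cross-validation pick it. A minimal sketch with scikit-learn's LassoCV, assuming the same X_train, y_train, and features_without_ones as in the code above:

import numpy as np
from sklearn.linear_model import LassoCV

# Cross-validate over the same alpha grid used for the coefficient-path plot.
lasso_cv = LassoCV(alphas=np.linspace(0.0001, 0.2, 20), cv=5)
lasso_cv.fit(X_train[features_without_ones], y_train)
print('best alpha:', lasso_cv.alpha_)
print('coefficients:', lasso_cv.coef_)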