Write code that implements linear regression with the least squares method and evaluates the regression error (mean squared error on the test set)
The least squares method is a standard way to fit a linear regression model: it finds the best-fit line by minimizing the sum of squared residuals, i.e. the gap between predicted and observed values. In Python we can implement it with the NumPy library and then compute the mean squared error (MSE) on a test set. Here is a simple example:
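For a single feature, the closed-form solution that the code computes is

$$\text{slope} = \frac{n\sum_i x_i y_i - \sum_i x_i \sum_i y_i}{n\sum_i x_i^2 - \left(\sum_i x_i\right)^2}, \qquad \text{intercept} = \bar{y} - \text{slope}\cdot\bar{x},$$

and the test-set error is reported as $\text{MSE} = \frac{1}{m}\sum_{j=1}^{m}(\hat{y}_j - y_j)^2$ over the $m$ test samples.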
```python
import numpy as np
def least_squares(x, y):
    # Compute the slope and intercept of the least-squares line:
    # slope = (n*Σxy - Σx*Σy) / (n*Σx² - (Σx)²), intercept = ȳ - slope*x̄
    x_mean = np.mean(x)
    y_mean = np.mean(y)
    n = len(x)
    slope = (n * np.dot(x, y) - x.sum() * y.sum()) / (n * np.dot(x, x) - x.sum() ** 2)
    intercept = y_mean - slope * x_mean
    return slope, intercept
def predict(slope, intercept, x_test):
    # Predict with the fitted line y = slope * x + intercept
    return slope * x_test + intercept
# Assume we have training data
x_train = ...  # input feature array
y_train = ...  # target variable array
# Fit the model and get its parameters
slope, intercept = least_squares(x_train, y_train)
# For the test set, assume x_test and y_test are independent held-out data
x_test = ...  # test input feature array
y_test = ...  # test target variable array

# Make predictions on the test set
y_pred = predict(slope, intercept, x_test)
# Compute the mean squared error on the test set
mse = np.mean((y_pred - y_test) ** 2)
print(f"Test-set mean squared error (MSE): {mse}")
```