Gradient Descent in Python: A Hands-On Example
Gradient descent is a widely used optimization algorithm for minimizing a loss function. At each iteration it moves the parameters a small step in the direction opposite to the gradient, θ ← θ − α·∇J(θ), where α is the learning rate. Below is an example implementation of batch gradient descent for linear regression in Python:
```python
import numpy as np

def gradient_descent(X, y, learning_rate, num_epochs):
    num_samples, num_features = X.shape
    theta = np.zeros(num_features)  # initialize parameters to zero
    for epoch in range(num_epochs):
        gradients = np.zeros(num_features)  # reset gradient accumulator each epoch
        for i in range(num_samples):
            x = X[i]
            prediction = np.dot(theta, x)   # linear model prediction
            error = prediction - y[i]
            gradients += error * x          # accumulate gradient of the squared error
        gradients /= num_samples            # average gradient over all samples
        theta -= learning_rate * gradients  # gradient descent update step
    return theta

# Example data
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5]])
y = np.array([2, 3, 4, 5])

# Run gradient descent
learning_rate = 0.01
num_epochs = 1000
theta = gradient_descent(X, y, learning_rate, num_epochs)
print("Optimal parameters:", theta)
```
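For comparison, the same batch update can be written without the inner Python loop by computing all predictions and the average gradient with NumPy matrix operations. The sketch below is one possible vectorized variant; the function name `gradient_descent_vectorized` is illustrative and not part of the original code:
```python
import numpy as np

def gradient_descent_vectorized(X, y, learning_rate=0.01, num_epochs=1000):
    """Batch gradient descent for linear regression, vectorized with NumPy."""
    num_samples, num_features = X.shape
    theta = np.zeros(num_features)
    for _ in range(num_epochs):
        errors = X @ theta - y                  # prediction errors for all samples at once
        gradients = X.T @ errors / num_samples  # average gradient via one matrix product
        theta -= learning_rate * gradients      # same update rule as the loop version
    return theta

X = np.array([[1, 2], [2, 3], [3, 4], [4, 5]], dtype=float)
y = np.array([2, 3, 4, 5], dtype=float)
print(gradient_descent_vectorized(X, y))  # should closely match the loop implementation
```
The vectorized form computes exactly the same average gradient per epoch, but usually runs much faster for larger datasets because the per-sample loop is pushed into NumPy's compiled routines.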