Implementing logistic regression with NumPy and manual gradient descent
In Python, the NumPy library is enough to implement logistic regression from scratch, using manually coded gradient descent as the optimizer. Here are the basic steps:
1. Import the required library:
```python
import numpy as np
```
2. Define the sigmoid function, which maps a linear score to a probability:
```python
def sigmoid(z):
    return 1 / (1 + np.exp(-z))
```
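As a quick sanity check (my own example, not from the original article), sigmoid maps 0 to 0.5 and saturates toward 0 and 1 at the extremes:
```python
print(sigmoid(0))                       # 0.5
print(sigmoid(np.array([-10, 0, 10])))  # approx. [4.5e-05, 0.5, 0.99995]
```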
3. Initialize the weights, bias, and learning rate:
```python
def initialize_weights(X, learning_rate=0.01):
    m, n = X.shape  # m samples, n features
    # Small random weights; scaling by 1/sqrt(m) keeps the initial scores modest
    weights = np.random.randn(n) / np.sqrt(m)
    bias = 0.0
    return weights, bias, learning_rate
```
4. Define the manual gradient-descent parameter update:
```python
def update_weights(weights, bias, gradients, learning_rate):
    dw, db = gradients
    # Step against the gradient direction
    weights -= learning_rate * dw
    bias -= learning_rate * db
    return weights, bias
```
5. Train the model (iterate until convergence or until a maximum number of iterations is reached):
```python
def logistic_regression(X, y, max_iter=1000, learning_rate=0.01):
    weights, bias, _ = initialize_weights(X, learning_rate)
    m = X.shape[0]
    for i in range(max_iter):
        # Forward pass: X has shape (m, n), weights has shape (n,)
        z = np.dot(X, weights) + bias
        predictions = sigmoid(z)
        # Mean cross-entropy loss (clipped to avoid log(0)); useful for monitoring
        p = np.clip(predictions, 1e-12, 1 - 1e-12)
        loss = np.mean(-y * np.log(p) - (1 - y) * np.log(1 - p))
        # Gradients of the mean loss with respect to weights and bias
        dw = np.dot(X.T, (predictions - y)) / m
        db = np.sum(predictions - y) / m
        # Gradient-descent update
        weights, bias = update_weights(weights, bias, [dw, db], learning_rate)
    return weights, bias
```
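To see the pieces working together, here is a minimal smoke test on synthetic, linearly separable data; the dataset is my own construction, not part of the original article:
```python
np.random.seed(0)
# Two Gaussian blobs: class 0 centered at (-2, -2), class 1 at (2, 2)
X = np.vstack([np.random.randn(50, 2) - 2, np.random.randn(50, 2) + 2])
y = np.concatenate([np.zeros(50), np.ones(50)])
weights, bias = logistic_regression(X, y, max_iter=1000, learning_rate=0.1)
```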
6. Finally, you can use the trained weights and bias to predict new data points, as shown in the sketch below.
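A minimal prediction helper might look like the following; the name `predict` and the 0.5 decision threshold are illustrative assumptions, not from the original article:
```python
def predict(X_new, weights, bias, threshold=0.5):
    # Probability that each sample belongs to class 1
    probabilities = sigmoid(np.dot(X_new, weights) + bias)
    # Threshold the probabilities into hard 0/1 labels
    return (probabilities >= threshold).astype(int)

# Continuing the smoke test above: accuracy on the training data
print(np.mean(predict(X, weights, bias) == y))  # expected close to 1.0
```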
Note that this implementation is deliberately basic; real-world use typically adds cross-validation, regularization, and other refinements. For large datasets, more efficient optimizers such as Stochastic Gradient Descent (SGD) or Mini-Batch Gradient Descent are usually preferred, since they update the parameters from a small subset of the data on each step rather than the full set. A mini-batch variant is sketched below.
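As a rough sketch of the mini-batch idea (my own illustration, reusing the helpers defined above): each epoch shuffles the data and performs one update per batch, which is much cheaper per step than full-batch gradient descent:
```python
def logistic_regression_minibatch(X, y, epochs=100, batch_size=32, learning_rate=0.1):
    weights, bias, _ = initialize_weights(X, learning_rate)
    m = X.shape[0]
    for epoch in range(epochs):
        # Reshuffle each epoch so batches differ between passes
        order = np.random.permutation(m)
        for start in range(0, m, batch_size):
            batch = order[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Same gradient formulas as before, restricted to the batch
            predictions = sigmoid(np.dot(Xb, weights) + bias)
            dw = np.dot(Xb.T, (predictions - yb)) / len(batch)
            db = np.sum(predictions - yb) / len(batch)
            weights, bias = update_weights(weights, bias, [dw, db], learning_rate)
    return weights, bias
```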