Implementing logistic regression with a sigmoid activation and cross-entropy loss
Date: 2024-02-21 15:59:53
Logistic regression is a binary classification model: the sigmoid function squashes the linear output into the range (0, 1), so it can be read as a probability and thresholded into a class label. Cross-entropy is a standard loss function that measures the gap between the predicted probabilities and the true labels. Below is example code implementing logistic regression with sigmoid and cross-entropy:
```python
import numpy as np

class LogisticRegression:
    def __init__(self, learning_rate=0.1, max_iter=1000):
        self.learning_rate = learning_rate
        self.max_iter = max_iter
        self.weights = None

    def sigmoid(self, z):
        return 1 / (1 + np.exp(-z))

    def fit(self, X, y):
        n_samples, n_features = X.shape
        self.weights = np.zeros(n_features + 1)
        X = np.insert(X, 0, 1, axis=1)  # prepend a column of ones as the bias term
        for i in range(self.max_iter):
            z = np.dot(X, self.weights)
            y_pred = self.sigmoid(z)
            # clip predictions away from 0 and 1 to avoid log(0) in the loss
            y_pred = np.clip(y_pred, 1e-15, 1 - 1e-15)
            # cross-entropy loss, computed here so it can be monitored
            loss = (-y * np.log(y_pred) - (1 - y) * np.log(1 - y_pred)).mean()
            # gradient of the mean cross-entropy w.r.t. the weights
            grad = np.dot(X.T, (y_pred - y)) / n_samples
            self.weights -= self.learning_rate * grad

    def predict(self, X):
        X = np.insert(X, 0, 1, axis=1)  # same bias column as in fit
        z = np.dot(X, self.weights)
        y_pred = self.sigmoid(z)
        return (y_pred > 0.5).astype(int)
```
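The compact gradient expression in `fit` comes from differentiating the mean cross-entropy loss. Writing $\sigma$ for the sigmoid and using $\sigma'(z) = \sigma(z)(1 - \sigma(z))$, the intermediate factors cancel and what remains is:

$$
\frac{\partial L}{\partial w} = \frac{1}{n} X^\top \bigl(\sigma(Xw) - y\bigr)
$$

which is exactly the `np.dot(X.T, (y_pred - y)) / n_samples` line in the code.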
Here, the `fit` method trains the model and `predict` makes predictions. During training we first prepend a column of ones to the sample matrix X as the bias term, then update the weights with batch gradient descent for `max_iter` iterations (the per-iteration loss is computed so it can be monitored, but no explicit convergence check is performed). At prediction time we add the same bias column, then label outputs greater than 0.5 as the positive class and outputs of 0.5 or below as the negative class.
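As a quick sanity check, the same batch gradient-descent update can be run inline on a tiny, hypothetical 1-D dataset whose two classes are separable around x = 2 (the data and learning rate below are made up for illustration):

```python
import numpy as np

# Hypothetical toy dataset: 1-D points, classes separable around x = 2
X = np.array([[0.0], [0.5], [1.0], [3.0], [3.5], [4.0]])
y = np.array([0, 0, 0, 1, 1, 1])

Xb = np.insert(X, 0, 1, axis=1)        # bias column, as in fit()
w = np.zeros(Xb.shape[1])
for _ in range(5000):                  # same batch gradient update as in fit()
    p = 1 / (1 + np.exp(-Xb @ w))
    w -= 0.5 * Xb.T @ (p - y) / len(y)

preds = (1 / (1 + np.exp(-Xb @ w)) > 0.5).astype(int)
print(preds)
```

On separable data like this, the loop drives the decision boundary between the two clusters, so the training labels are recovered exactly.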