Logistic Regression Plot in Python
Date: 2023-10-12 15:15:23
Below is a simple Python example that fits a logistic regression model and plots its decision boundary:
```python
import numpy as np
import matplotlib.pyplot as plt

# Define the sigmoid function
def sigmoid(z):
    return 1.0 / (1 + np.exp(-z))

# Generate random data
np.random.seed(0)
X = np.random.randn(100, 2)
y = np.random.randint(2, size=100)

# Plot the data as a scatter plot
plt.scatter(X[:, 0], X[:, 1], c=y)
plt.xlabel('x1')
plt.ylabel('x2')

# Add a bias column to build the training design matrix
X = np.insert(X, 0, 1, axis=1)
theta = np.zeros(X.shape[1])

# Define the cost function (cross-entropy loss)
def cost_function(theta, X, y):
    h = sigmoid(np.dot(X, theta))
    return -(1.0 / len(y)) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

# Define the gradient of the cost
def gradient(theta, X, y):
    h = sigmoid(np.dot(X, theta))
    return (1.0 / len(y)) * np.dot(X.T, (h - y))

# Batch gradient descent
def gradient_descent(theta, X, y, alpha=0.1, num_iters=100):
    J_history = []
    for i in range(num_iters):
        theta = theta - alpha * gradient(theta, X, y)
        J_history.append(cost_function(theta, X, y))
    return theta, J_history

# Train the model
theta, J_history = gradient_descent(theta, X, y)

# Plot the decision boundary (column 0 is the bias; columns 1 and 2 are the features)
x1_min, x1_max = X[:, 1].min(), X[:, 1].max()
x2_min, x2_max = X[:, 2].min(), X[:, 2].max()
xx1, xx2 = np.meshgrid(np.linspace(x1_min, x1_max), np.linspace(x2_min, x2_max))
h = sigmoid(np.dot(np.c_[np.ones((xx1.ravel().shape[0], 1)), xx1.ravel(), xx2.ravel()], theta))
h = h.reshape(xx1.shape)
plt.contour(xx1, xx2, h, [0.5], linewidths=1, colors='red')
plt.show()
```
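Because logistic regression is linear in its inputs, the meshgrid-and-contour step at the end is not strictly necessary: the p = 0.5 boundary is exactly the line theta0 + theta1*x1 + theta2*x2 = 0, which can be drawn directly. A minimal sketch of this alternative (the `theta` values here are hypothetical placeholders standing in for the fitted parameters):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical fitted parameters, in the same layout as above:
# theta = [bias, weight for x1, weight for x2]
theta = np.array([0.1, -0.4, 0.7])

# On the p = 0.5 boundary, sigmoid(theta . x) = 0.5, i.e. theta . x = 0,
# so x2 = -(theta0 + theta1 * x1) / theta2 (assuming theta2 != 0).
x1 = np.linspace(-3, 3, 50)
x2 = -(theta[0] + theta[1] * x1) / theta[2]
plt.plot(x1, x2, 'r-', linewidth=1)
```

This avoids evaluating the model on a full grid, at the cost of only working for a linear (non-polynomial) feature set.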
This code uses the numpy and matplotlib libraries to generate a two-dimensional scatter plot, fits a logistic regression model to the data by gradient descent, and draws the resulting decision boundary on the plot.
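As a quick, standalone sanity check of the training loop, note that the cross-entropy cost equals ln 2 when theta is all zeros (every prediction is 0.5), and because the loss is convex and the step size is small, each iteration should not increase it. A minimal sketch reproducing the same data and hyperparameters as above:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1 + np.exp(-z))

def cost_function(theta, X, y):
    h = sigmoid(X @ theta)
    return -(1.0 / len(y)) * np.sum(y * np.log(h) + (1 - y) * np.log(1 - h))

def gradient(theta, X, y):
    h = sigmoid(X @ theta)
    return (1.0 / len(y)) * (X.T @ (h - y))

# Same seed and generation order as the example above
np.random.seed(0)
X = np.insert(np.random.randn(100, 2), 0, 1, axis=1)
y = np.random.randint(2, size=100)

theta = np.zeros(X.shape[1])
costs = [cost_function(theta, X, y)]   # starts at ln(2) for all-zero theta
for _ in range(100):
    theta = theta - 0.1 * gradient(theta, X, y)
    costs.append(cost_function(theta, X, y))
```

Tracking `J_history` (here `costs`) this way is a cheap diagnostic: if the values ever rise, the learning rate `alpha` is too large.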