Given a dataset (binary-classification data stored in data1.txt), implement Logistic regression with gradient descent and plot the decision boundary
Posted: 2023-05-17 13:04:31
Sure, let me answer your question. Here is an implementation of Logistic regression with gradient descent:
```
import numpy as np
import matplotlib.pyplot as plt

# Load the dataset: each row is x1, x2, label
data = np.loadtxt('data1.txt', delimiter=',')
X = data[:, :-1]
y = data[:, -1]

# Sigmoid function
def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Cross-entropy cost and its gradient
def cost_function(theta, X, y):
    m = len(y)
    h = sigmoid(X @ theta)
    J = -1/m * (y @ np.log(h) + (1 - y) @ np.log(1 - h))
    grad = 1/m * X.T @ (h - y)
    return J, grad

# Add the intercept column and initialize the parameters
m, n = X.shape
X = np.hstack((np.ones((m, 1)), X))
initial_theta = np.zeros(n + 1)

# Cost and gradient at the initial theta
cost, grad = cost_function(initial_theta, X, y)

# Batch gradient descent
def gradient_descent(X, y, theta, alpha, num_iters):
    J_history = []
    for i in range(num_iters):
        cost, grad = cost_function(theta, X, y)
        theta = theta - alpha * grad
        J_history.append(cost)
    return theta, J_history

# Run gradient descent
theta, J_history = gradient_descent(X, y, initial_theta, 0.01, 1000)

# Plot the data and the decision boundary (the 0.5 level set of the model)
plt.scatter(X[:, 1], X[:, 2], c=y)
x1_min, x1_max = X[:, 1].min(), X[:, 1].max()
x2_min, x2_max = X[:, 2].min(), X[:, 2].max()
xx1, xx2 = np.meshgrid(np.linspace(x1_min, x1_max), np.linspace(x2_min, x2_max))
h = sigmoid(np.c_[np.ones((xx1.ravel().shape[0], 1)), xx1.ravel(), xx2.ravel()] @ theta)
h = h.reshape(xx1.shape)
plt.contour(xx1, xx2, h, levels=[0.5], linewidths=1, colors='b')
plt.show()
```
This code reads the binary-classification dataset data1.txt, fits Logistic regression with gradient descent, and plots the decision boundary. Feel free to ask if you have any questions.
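If you want to sanity-check the learned parameters, a minimal sketch below trains with the same update rule and reports training accuracy. It uses synthetic two-cluster data in place of data1.txt (the file's contents are not shown here, so the data values are assumptions):

```python
import numpy as np

# Synthetic two-class data standing in for data1.txt (hypothetical values)
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[-2.0, -2.0], scale=1.0, size=(50, 2))
X1 = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(50, 2))
X = np.vstack([X0, X1])
y = np.concatenate([np.zeros(50), np.ones(50)])

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Add the intercept column, as in the code above
Xb = np.hstack([np.ones((X.shape[0], 1)), X])

# Plain batch gradient descent on the logistic loss
theta = np.zeros(Xb.shape[1])
alpha, num_iters = 0.1, 2000
for _ in range(num_iters):
    grad = Xb.T @ (sigmoid(Xb @ theta) - y) / len(y)
    theta = theta - alpha * grad

# Training accuracy: predict 1 wherever sigmoid(x @ theta) >= 0.5
pred = (sigmoid(Xb @ theta) >= 0.5).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On well-separated clusters like these, the accuracy should be close to 1.0; a much lower number usually means the learning rate or the iteration count needs adjusting.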