Logistic Regression with Regularization
Date: 2023-11-21 11:52:43
Below is an implementation of logistic regression with L2 regularization:
```python
import numpy as np


class LogisticRegression:
    def __init__(self, alpha=0.01, lambda_val=0.01, num_iterations=1000):
        self.alpha = alpha                    # learning rate
        self.lambda_val = lambda_val          # L2 regularization strength
        self.num_iterations = num_iterations  # gradient-descent iterations
        self.theta = None

    def sigmoid(self, z):
        return 1 / (1 + np.exp(-z))

    def cost_function(self, X, y, theta):
        m = len(y)
        h = self.sigmoid(X.dot(theta))
        # Cross-entropy loss
        J = (-1 / m) * (y.T.dot(np.log(h)) + (1 - y).T.dot(np.log(1 - h)))
        # L2 penalty; theta[1:] excludes the intercept from regularization
        reg = (self.lambda_val / (2 * m)) * np.sum(np.square(theta[1:]))
        J += reg
        return J

    def gradient_descent(self, X, y, theta):
        m = len(y)
        for i in range(self.num_iterations):
            h = self.sigmoid(X.dot(theta))
            # Regularization gradient, with a zero prepended so the
            # intercept theta[0] is not penalized
            reg = (self.lambda_val / m) * theta[1:]
            reg = np.insert(reg, 0, 0, axis=0)
            gradient = (1 / m) * X.T.dot(h - y) + reg
            theta -= self.alpha * gradient
        return theta

    def fit(self, X, y):
        m, n = X.shape
        X = np.insert(X, 0, 1, axis=1)  # prepend a column of ones (intercept)
        self.theta = np.zeros((n + 1, 1))
        self.theta = self.gradient_descent(X, y, self.theta)

    def predict(self, X):
        X = np.insert(X, 0, 1, axis=1)
        return np.round(self.sigmoid(X.dot(self.theta)))
```
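For reference, the quantity computed by `cost_function` is the standard cross-entropy loss plus an L2 penalty that skips the intercept $\theta_0$:

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right)\log\left(1 - h_\theta(x^{(i)})\right)\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$$

where $h_\theta(x) = \sigma(\theta^T x)$ is the sigmoid of the linear score, $m$ is the number of samples, and $\lambda$ corresponds to `lambda_val` in the code.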
Here, `alpha` is the learning rate, `lambda_val` is the regularization parameter, and `num_iterations` is the number of gradient-descent iterations. In `cost_function`, a regularization term is added to penalize model complexity; in `gradient_descent`, the corresponding term is also included in the update of `theta`. Note that the intercept term `theta[0]` is excluded from regularization in both places.
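As a quick sanity check, the class can be exercised on a small synthetic dataset. The snippet below is an illustrative sketch: the two-cluster data, the random seed, and the hyperparameter values are all arbitrary choices, and the class from above is repeated (in condensed form) so the snippet runs on its own.

```python
import numpy as np

class LogisticRegression:
    def __init__(self, alpha=0.01, lambda_val=0.01, num_iterations=1000):
        self.alpha = alpha
        self.lambda_val = lambda_val
        self.num_iterations = num_iterations
        self.theta = None

    def sigmoid(self, z):
        return 1 / (1 + np.exp(-z))

    def gradient_descent(self, X, y, theta):
        m = len(y)
        for _ in range(self.num_iterations):
            h = self.sigmoid(X.dot(theta))
            # L2 gradient with a zero prepended for the intercept
            reg = np.insert((self.lambda_val / m) * theta[1:], 0, 0, axis=0)
            theta -= self.alpha * ((1 / m) * X.T.dot(h - y) + reg)
        return theta

    def fit(self, X, y):
        m, n = X.shape
        X = np.insert(X, 0, 1, axis=1)
        self.theta = self.gradient_descent(X, y, np.zeros((n + 1, 1)))

    def predict(self, X):
        X = np.insert(X, 0, 1, axis=1)
        return np.round(self.sigmoid(X.dot(self.theta)))

# Two well-separated Gaussian clusters (illustrative data only);
# labels must be a column vector to match theta's (n+1, 1) shape.
np.random.seed(0)
X = np.vstack([np.random.randn(50, 2) + 2,
               np.random.randn(50, 2) - 2])
y = np.vstack([np.ones((50, 1)), np.zeros((50, 1))])

model = LogisticRegression(alpha=0.1, lambda_val=0.01, num_iterations=2000)
model.fit(X, y)
accuracy = np.mean(model.predict(X) == y)
```

On data this cleanly separable, training accuracy should be close to 1.0; increasing `lambda_val` shrinks the non-intercept weights and trades some fit for simpler decision boundaries.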