Can you give me a Python example of logistic regression?
Posted: 2024-05-05 19:17:56
Sure! Here is a simple Python example of logistic regression:
```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Read the data
data = pd.read_csv('data.csv')

# Split the data into features and the target variable
X = data.iloc[:, :-1]
y = data.iloc[:, -1]

# Create the logistic regression object
lr = LogisticRegression()

# Fit the model
lr.fit(X, y)

# Predict on new data (must have the same number of columns as X)
new_data = [[1, 2, 3, 4], [5, 6, 7, 8]]
predictions = lr.predict(new_data)
print(predictions)
```
That is a simple Python example of logistic regression; I hope it helps!
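Since the example above depends on a local `data.csv`, here is a self-contained variant you can run immediately. It uses scikit-learn's built-in breast cancer dataset (chosen here just as an assumption; any binary-labeled dataset works), and standardizes the features in a pipeline so the solver converges cleanly:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Built-in binary classification dataset (569 samples, 30 features)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

# Standardizing the features first helps the solver converge
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)

# Accuracy on held-out data
print(model.score(X_test, y_test))
```

The pipeline matters: without scaling, `LogisticRegression` often hits its default iteration limit on raw feature values of very different magnitudes.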
Related questions
Python code for an ordinal logistic regression function
Here is an example of an ordinal logistic regression function written in Python:
```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def ordinal_logistic_regression(X, y):
    n_samples, n_features = X.shape
    n_classes = len(np.unique(y))
    # Initialize parameters: one weight column per class threshold
    theta = np.zeros((n_features, n_classes - 1))

    # Cost function: binary cross-entropy for each threshold.
    # The optimizer passes theta as a flat vector, so reshape it first.
    def cost_function(theta_flat, X, y):
        theta = theta_flat.reshape(n_features, n_classes - 1)
        m = len(y)
        # Clip probabilities away from 0 and 1 to keep log() finite
        probabilities = np.clip(sigmoid(X @ theta), 1e-10, 1 - 1e-10)
        cost = 0.0
        for i in range(n_classes - 1):
            t = (y == i + 1).astype(float)
            cost += -np.sum(t * np.log(probabilities[:, i])
                            + (1 - t) * np.log(1 - probabilities[:, i])) / m
        return cost

    # Analytic gradient of the cost with respect to theta
    def gradient(theta_flat, X, y):
        theta = theta_flat.reshape(n_features, n_classes - 1)
        m = len(y)
        probabilities = sigmoid(X @ theta)
        gradients = np.zeros_like(theta)
        for i in range(n_classes - 1):
            gradients[:, i] = np.sum(
                (probabilities[:, i] - (y == i + 1))[:, None] * X, axis=0) / m
        return gradients.flatten()

    # Minimize the cost function with BFGS
    result = minimize(cost_function, theta.flatten(), args=(X, y),
                      method='BFGS', jac=gradient)
    return result.x.reshape(n_features, n_classes - 1)

# Example usage:
X = np.array([[1, 2], [2, 3], [3, 4], [4, 5]])
y = np.array([1, 2, 3, 3])
theta = ordinal_logistic_regression(X, y)
print("Optimal Theta:")
print(theta)
```
This code uses the NumPy library for numerical computation and the `minimize` function from SciPy for optimization. The `sigmoid` function computes the predicted probabilities, and `ordinal_logistic_regression` fits one binary logistic classifier per class threshold and returns the optimal parameter matrix. Note that this is a simplified per-threshold formulation, not a full cumulative-link (proportional odds) ordinal model.
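When hand-rolling a cost and its analytic gradient like this, a mismatch between the two is the most common bug. One way to sanity-check the pair is `scipy.optimize.check_grad`, which compares the analytic gradient against a finite-difference approximation. A minimal sketch on plain binary logistic regression (the same check applies per threshold in the ordinal case):

```python
import numpy as np
from scipy.optimize import check_grad

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Binary cross-entropy cost for logistic regression
def cost(theta, X, y):
    p = sigmoid(X @ theta)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Its analytic gradient: X^T (p - y) / m
def grad(theta, X, y):
    p = sigmoid(X @ theta)
    return X.T @ (p - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = (rng.random(20) > 0.5).astype(float)
theta0 = rng.normal(size=3)

# check_grad returns the norm of the difference between the analytic
# gradient and a finite-difference estimate; it should be near zero
err = check_grad(cost, grad, theta0, X, y)
print(err)
```

If this value is not tiny (roughly 1e-6 or smaller for a smooth cost), the gradient does not match the cost and the optimizer's results cannot be trusted.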
Plotting a logistic regression ROC curve in Python
Sure, here is a Python code example that plots the ROC curve of a logistic regression model:
```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Generate a binary classification dataset with 2 features
X, y = make_classification(n_samples=1000, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

# Split the dataset into training and test sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Create a logistic regression model and fit the training data
lr = LogisticRegression()
lr.fit(X_train, y_train)

# Predict class probabilities for the test set (positive class only)
y_score = lr.predict_proba(X_test)[:, 1]

# Compute the ROC curve and the area under it
fpr, tpr, thresholds = roc_curve(y_test, y_score)
roc_auc = auc(fpr, tpr)

# Plot the ROC curve
plt.plot(fpr, tpr, color='darkorange', lw=2, label='ROC curve (area = %0.2f)' % roc_auc)
plt.plot([0, 1], [0, 1], color='navy', lw=2, linestyle='--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic example')
plt.legend(loc="lower right")
plt.show()
```
This code generates a binary classification dataset with 2 features and fits a logistic regression model to the training data. It then computes the ROC curve metrics from the predicted class probabilities on the test set and plots the curve with Matplotlib; the ROC curve appears in a new figure window.
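If you only need the AUC number and not the plot, `sklearn.metrics.roc_auc_score` computes it directly from the labels and scores. A short sketch showing it agrees with the `roc_curve` + `auc` route used above:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import auc, roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

# Same setup as the plotting example
X, y = make_classification(n_samples=1000, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
y_score = LogisticRegression().fit(X_train, y_train).predict_proba(X_test)[:, 1]

# Route 1: integrate the ROC curve; Route 2: roc_auc_score directly
fpr, tpr, _ = roc_curve(y_test, y_score)
print(np.isclose(auc(fpr, tpr), roc_auc_score(y_test, y_score)))  # True
```

The one-call form is convenient inside cross-validation loops, where constructing and plotting the full curve for every fold would be wasted work.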