Given the positive example points x1 = (1,2)^T, x2 = (2,3)^T, x3 = (3,3)^T and the negative example points x4 = (2,1)^T, x5 = (3,2)^T, use sklearn's SVC to find the maximum-margin separating hyperplane and the classification decision function, and plot the data points, the separating hyperplane, the margin boundaries, and the support vectors.
First, import the required libraries and set up the data:
```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm
# Positive example points
X_pos = np.array([[1, 2], [2, 3], [3, 3]])
y_pos = np.array([1, 1, 1])
# Negative example points
X_neg = np.array([[2, 1], [3, 2]])
y_neg = np.array([-1, -1])
# Stack the positive and negative points into a single training set
X = np.concatenate((X_pos, X_neg))
y = np.concatenate((y_pos, y_neg))
```
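As a quick sanity check (not part of the original answer), you can confirm the assembled arrays have the expected shape and labels before fitting:

```python
# Hypothetical check: 5 samples with 2 features, labels in {-1, +1}
print(X.shape)  # (5, 2)
print(y)        # [ 1  1  1 -1 -1]
```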
Then fit a linear-kernel `SVC` model and read off the hyperplane parameters, margin, and support vectors:
```python
clf = svm.SVC(kernel='linear', C=1e5)  # a large C approximates the hard-margin (maximum-margin) SVM
clf.fit(X, y)
# Parameters of the separating hyperplane w·x + b = 0
w = clf.coef_[0]
b = clf.intercept_[0]
print("w: ", w)
print("b: ", b)
# Geometric margin: distance from the hyperplane to the closest points, 1/||w||
margin = 1 / np.sqrt(np.sum(w ** 2))
print("Margin: ", margin)
# Support vectors
support_vectors = clf.support_vectors_
print("Support Vectors: ", support_vectors)
# Plot the data, the separating hyperplane, the margin boundaries, and the support vectors
plt.scatter(X_pos[:, 0], X_pos[:, 1], c='r', label='Positive')
plt.scatter(X_neg[:, 0], X_neg[:, 1], c='b', label='Negative')
plt.scatter(support_vectors[:, 0], support_vectors[:, 1], c='g', label='Support Vectors')
x_axis = np.linspace(0, 4, 10)
y_axis = (-w[0] * x_axis - b) / w[1]              # separating hyperplane w·x + b = 0
y_margin_upper = (1 - w[0] * x_axis - b) / w[1]   # margin boundary w·x + b = +1
y_margin_lower = (-1 - w[0] * x_axis - b) / w[1]  # margin boundary w·x + b = -1
plt.plot(x_axis, y_axis, 'k-', label='Hyperplane')
plt.plot(x_axis, y_margin_upper, 'k--', label='Margin')
plt.plot(x_axis, y_margin_lower, 'k--')
plt.xlim((0, 4))
plt.ylim((0, 4))
plt.legend()
plt.show()
```
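As an alternative to solving for the lines by hand, a common approach (a sketch, not part of the original answer) is to draw the hyperplane and margin boundaries as level sets of `clf.decision_function` over a grid, which also works when `w[1]` is zero:

```python
# Evaluate the decision function w·x + b on a grid covering the plotting area
xx, yy = np.meshgrid(np.linspace(0, 4, 200), np.linspace(0, 4, 200))
Z = clf.decision_function(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.scatter(X[:, 0], X[:, 1], c=y, cmap='bwr')
# Level 0 is the separating hyperplane; levels -1 and +1 are the margin boundaries
plt.contour(xx, yy, Z, levels=[-1, 0, 1], colors='k',
            linestyles=['dashed', 'solid', 'dashed'])
plt.show()
```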
Running the code gives approximately:
```
w:  [-1.  2.]
b:  -2.0
Margin:  0.4472135954999579
Support Vectors:  [[3. 2.]
 [1. 2.]
 [3. 3.]]
```
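These values correspond to the separating hyperplane -x^(1) + 2x^(2) - 2 = 0, so the classification decision function asked for in the problem is f(x) = sign(w·x + b) = sign(-x^(1) + 2x^(2) - 2), where x^(1) and x^(2) are the two coordinates of x. A small check (a sketch, not part of the original answer) that the fitted model reproduces this on the training points:

```python
# Decision values w·x + b; the support vectors should sit at roughly +1 or -1
print(clf.decision_function(X))
# Predicted labels; these should match y exactly: [ 1  1  1 -1 -1]
print(clf.predict(X))
```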
The resulting plot:
![svm.png](https://img-blog.csdnimg.cn/20211007154745659.png)