Implementing softmax classification of three groups of 2D points in the plane with Python, with visualization
Below is a simple example that implements softmax classification in Python, classifies three groups of 2D points in the plane, and visualizes the result.
First, import the required libraries, NumPy and Matplotlib:
```python
import numpy as np
import matplotlib.pyplot as plt
```
Next, we generate three groups of 2D points and give each group its own label. In this example all three groups are sampled from the unit circle centered at (0, 0), perturbed by Gaussian noise with standard deviation 0.1, 0.2, and 0.3 respectively, and labeled 1, 2, and 3. Note that because the groups share the same circle and differ only in noise level, they overlap substantially, so a linear softmax classifier can separate them only roughly:
```python
num_points = 100
theta = np.linspace(0, 2*np.pi, num_points)
# Three groups sampled on the unit circle, with Gaussian noise of
# standard deviation 0.1, 0.2, and 0.3 respectively.
x1 = np.cos(theta) + np.random.randn(num_points)*0.1
y1 = np.sin(theta) + np.random.randn(num_points)*0.1
x2 = np.cos(theta) + 2*np.random.randn(num_points)*0.1
y2 = np.sin(theta) + 2*np.random.randn(num_points)*0.1
x3 = np.cos(theta) + 3*np.random.randn(num_points)*0.1
y3 = np.sin(theta) + 3*np.random.randn(num_points)*0.1
# Stack into a (300, 2) point array and a (300,) label array with values 1, 2, 3.
data = np.vstack((np.hstack((x1, x2, x3)), np.hstack((y1, y2, y3)))).T
labels = np.hstack((np.ones(num_points), 2*np.ones(num_points), 3*np.ones(num_points)))
```
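A quick shape check (assuming the code above has been run) confirms the layout: 300 points with 2 features each, and one label per point:
```python
print(data.shape)         # (300, 2)
print(labels.shape)       # (300,)
print(np.unique(labels))  # [1. 2. 3.]
```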
Next, define the softmax function, which turns a vector of real numbers into a probability vector: exponentiate each element to make it positive, then normalize so the elements sum to 1. Subtracting each row's maximum first prevents the exponentials from overflowing:
```python
def softmax(x):
    # Subtract the row-wise max for numerical stability before exponentiating.
    x = x - np.max(x, axis=1, keepdims=True)
    return np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)
```
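A quick check (with made-up logits, assuming the `softmax` above has been defined) confirms that each output row is a valid probability distribution:
```python
logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 0.5, 0.5]])
probs = softmax(logits)
print(probs)              # rows of positive numbers
print(probs.sum(axis=1))  # [1. 1.] -- each row sums to 1
```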
We can then train the softmax classifier with gradient descent, using the cross-entropy loss. Note that the labels 1, 2, 3 must be cast to integers before they can index the identity matrix for one-hot encoding:
```python
def cross_entropy_loss(y, t):
    # Small epsilon guards against log(0).
    return -np.sum(t * np.log(y + 1e-12))

def gradient_descent(data, labels, lr=0.1, epochs=1000):
    num_classes = len(np.unique(labels))
    num_features = data.shape[1]
    weights = np.random.randn(num_features, num_classes)
    # One-hot encode the labels; cast to int so they can index np.eye.
    t = np.eye(num_classes)[labels.astype(int) - 1]
    for epoch in range(epochs):
        y = softmax(np.dot(data, weights))
        loss = cross_entropy_loss(y, t)
        # Average the gradient over the samples to keep lr=0.1 stable.
        gradient = np.dot(data.T, y - t) / len(data)
        weights -= lr * gradient
        if epoch % 100 == 0:
            print(f"Epoch {epoch}: Loss = {loss}")
    return weights
```
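The update relies on the fact that the gradient of the softmax cross-entropy loss with respect to the weights is `data.T @ (y - t)`. As a sanity check (a sketch with made-up data, not part of the original post), you can compare this analytic gradient against a finite-difference estimate:
```python
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))            # 4 toy points, 2 features
t = np.eye(3)[rng.integers(0, 3, 4)]   # random one-hot targets, 3 classes
W = rng.normal(size=(2, 3))

def loss_fn(W):
    return -np.sum(t * np.log(softmax(X @ W)))

# Unscaled gradient; gradient_descent additionally divides this by len(data).
analytic = X.T @ (softmax(X @ W) - t)

# Central finite differences, one weight at a time.
eps = 1e-6
numeric = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp, Wm = W.copy(), W.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        numeric[i, j] = (loss_fn(Wp) - loss_fn(Wm)) / (2 * eps)

print(np.max(np.abs(analytic - numeric)))  # should be tiny, e.g. < 1e-7
```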
Finally, we use the trained classifier to label every point on a grid covering the data and visualize the resulting decision regions:
```python
weights = gradient_descent(data, labels)

# Build a grid covering the data, classify each grid point, and draw the
# decision regions behind the training points.
x_min, x_max = data[:, 0].min() - 0.5, data[:, 0].max() + 0.5
y_min, y_max = data[:, 1].min() - 0.5, data[:, 1].max() + 0.5
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.1), np.arange(y_min, y_max, 0.1))
Z = softmax(np.dot(np.c_[xx.ravel(), yy.ravel()], weights))
Z = np.argmax(Z, axis=1).reshape(xx.shape)
plt.contourf(xx, yy, Z, alpha=0.4)
plt.scatter(data[:, 0], data[:, 1], c=labels, edgecolors='k')
plt.show()
```
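To quantify how well the trained weights fit the training data (a quick check, not in the original post), compute the training accuracy; since the three groups differ only in noise level and overlap heavily, expect a modest number:
```python
pred = np.argmax(softmax(np.dot(data, weights)), axis=1) + 1  # back to labels 1/2/3
accuracy = np.mean(pred == labels)
print(f"Training accuracy: {accuracy:.2%}")
```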
The complete code is as follows:
```python
import numpy as np
import matplotlib.pyplot as plt

def softmax(x):
    x = x - np.max(x, axis=1, keepdims=True)  # numerical stability
    return np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)

def cross_entropy_loss(y, t):
    return -np.sum(t * np.log(y + 1e-12))

def gradient_descent(data, labels, lr=0.1, epochs=1000):
    num_classes = len(np.unique(labels))
    num_features = data.shape[1]
    weights = np.random.randn(num_features, num_classes)
    t = np.eye(num_classes)[labels.astype(int) - 1]
    for epoch in range(epochs):
        y = softmax(np.dot(data, weights))
        loss = cross_entropy_loss(y, t)
        gradient = np.dot(data.T, y - t) / len(data)
        weights -= lr * gradient
        if epoch % 100 == 0:
            print(f"Epoch {epoch}: Loss = {loss}")
    return weights

num_points = 100
theta = np.linspace(0, 2*np.pi, num_points)
x1 = np.cos(theta) + np.random.randn(num_points)*0.1
y1 = np.sin(theta) + np.random.randn(num_points)*0.1
x2 = np.cos(theta) + 2*np.random.randn(num_points)*0.1
y2 = np.sin(theta) + 2*np.random.randn(num_points)*0.1
x3 = np.cos(theta) + 3*np.random.randn(num_points)*0.1
y3 = np.sin(theta) + 3*np.random.randn(num_points)*0.1
data = np.vstack((np.hstack((x1, x2, x3)), np.hstack((y1, y2, y3)))).T
labels = np.hstack((np.ones(num_points), 2*np.ones(num_points), 3*np.ones(num_points)))
weights = gradient_descent(data, labels)
x_min, x_max = data[:, 0].min() - 0.5, data[:, 0].max() + 0.5
y_min, y_max = data[:, 1].min() - 0.5, data[:, 1].max() + 0.5
xx, yy = np.meshgrid(np.arange(x_min, x_max, 0.1), np.arange(y_min, y_max, 0.1))
Z = softmax(np.dot(np.c_[xx.ravel(), yy.ravel()], weights))
Z = np.argmax(Z, axis=1).reshape(xx.shape)
plt.contourf(xx, yy, Z, alpha=0.4)
plt.scatter(data[:, 0], data[:, 1], c=labels, edgecolors='k')
plt.show()
```
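Two small extensions may be worth trying. The model above has no bias term; a common remedy (a sketch, reusing the `gradient_descent` defined above) is to append a constant-1 column to the inputs so one weight row acts as the bias. Fixing the random seed also makes runs reproducible:
```python
np.random.seed(0)  # place before the data generation above for fully reproducible runs

# Append a constant-1 feature so the classifier can learn a bias term.
data_b = np.hstack([data, np.ones((data.shape[0], 1))])
weights_b = gradient_descent(data_b, labels)

# Remember to append the same constant column when classifying the grid.
grid = np.c_[xx.ravel(), yy.ravel(), np.ones(xx.size)]
Z_b = np.argmax(softmax(grid @ weights_b), axis=1).reshape(xx.shape)
```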
Running this code produces a classification figure like the one below:
![softmax classification result](https://img-blog.csdnimg.cn/20210930222255346.png)