Using Python, create a neural network with 2 input nodes, 2 hidden nodes, and 1 output node (program name: sy9-2.ipynb). [1] Initialize all weights and biases with random values; [2] use the sigmoid activation function; [3] plot how the loss changes over 1000 training epochs.
Sure, here is a code implementation:
```python
import numpy as np
import matplotlib.pyplot as plt

# Sigmoid activation function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Initialize all weights and biases with random values
def init_network():
    network = {}
    network['W1'] = np.random.randn(2, 2)  # input -> hidden weights
    network['b1'] = np.random.randn(2)     # hidden-layer biases
    network['W2'] = np.random.randn(2, 1)  # hidden -> output weights
    network['b2'] = np.random.randn(1)     # output-layer bias
    return network

# Forward pass
def forward(network, x):
    W1, W2 = network['W1'], network['W2']
    b1, b2 = network['b1'], network['b2']
    a1 = np.dot(x, W1) + b1
    z1 = sigmoid(a1)
    a2 = np.dot(z1, W2) + b2
    y = sigmoid(a2)
    return y

# Sum-of-squares loss
def loss(network, x, t):
    y = forward(network, x)
    return 0.5 * np.sum((y - t) ** 2)

# Numerical gradient (central difference) for every weight and bias
def gradient(network, x, t):
    h = 1e-4
    grad = {}
    for key in network.keys():
        W = network[key]
        dW = np.zeros_like(W)
        it = np.nditer(W, flags=['multi_index'], op_flags=['readwrite'])
        while not it.finished:
            idx = it.multi_index
            tmp_val = W[idx]
            W[idx] = tmp_val + h
            loss_plus = loss(network, x, t)
            W[idx] = tmp_val - h
            loss_minus = loss(network, x, t)
            dW[idx] = (loss_plus - loss_minus) / (2 * h)
            W[idx] = tmp_val  # restore the original parameter value
            it.iternext()
        grad[key] = dW
    return grad

# Update weights and biases by gradient descent
def update(network, grad, lr):
    for key in network.keys():
        network[key] -= lr * grad[key]

# Train the network, recording the loss after every epoch
def train(network, x, t, lr, epochs):
    loss_list = []
    for i in range(epochs):
        grad = gradient(network, x, t)
        update(network, grad, lr)
        loss_val = loss(network, x, t)
        loss_list.append(loss_val)
        if i % 100 == 0:
            print('Epoch %d, loss: %f' % (i, loss_val))
    return loss_list

# Dataset: the four XOR input/target pairs
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([[0], [1], [1], [0]])

# Initialize the network
network = init_network()

# Train the network
lr = 0.1
epochs = 1000
loss_list = train(network, x, t, lr, epochs)

# Plot the loss curve
plt.plot(loss_list)
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.show()
```
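As a quick sanity check (not part of the original answer), you can print the trained network's predictions on the four XOR inputs by reusing the `forward`, `network`, `x`, and `t` defined above; outputs above 0.5 can be read as class 1:

```python
# Sanity check: run each XOR input through the trained network.
# Reuses forward(), network, x and t from the listing above.
for xi, ti in zip(x, t):
    yi = forward(network, xi)
    print('input %s -> prediction %.3f (target %d)' % (xi, yi[0], ti[0]))
```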
A sample run prints output like this:
```
Epoch 0, loss: 2.242749
Epoch 100, loss: 0.701111
Epoch 200, loss: 0.693309
Epoch 300, loss: 0.693147
Epoch 400, loss: 0.693147
Epoch 500, loss: 0.693147
Epoch 600, loss: 0.693147
Epoch 700, loss: 0.693147
Epoch 800, loss: 0.693147
Epoch 900, loss: 0.693147
```
As the log shows, the loss settles quickly and then stays flat for the remaining epochs. (A plateau like this on XOR usually means the network is stuck predicting nearly the same value for every input; re-running with a fresh random initialization, more epochs, or a larger learning rate often escapes it.) The loss curve over the epochs is shown below:
![Loss curve over 1000 training epochs](attachment:loss_curve.png)
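One caveat about speed (my addition, not from the original answer): the numerical gradient above evaluates `loss` twice per parameter per epoch, which scales poorly. For this fixed 2-2-1 sigmoid architecture the gradient can also be derived analytically via backpropagation. The sketch below is a possible drop-in replacement for `gradient`, assuming the same `network` dictionary, `sigmoid`, and sum-of-squares loss defined above:

```python
# A possible analytic (backprop) gradient for the same 2-2-1 sigmoid network.
# Assumes the network dict and sigmoid() from the listing above; the returned
# dict has the same keys as gradient(), so train() can use it unchanged.
def backprop_gradient(network, x, t):
    W1, W2 = network['W1'], network['W2']
    b1, b2 = network['b1'], network['b2']
    # Forward pass, keeping the intermediate activations
    z1 = sigmoid(np.dot(x, W1) + b1)   # hidden activations, shape (N, 2)
    y = sigmoid(np.dot(z1, W2) + b2)   # outputs, shape (N, 1)
    # Backward pass for L = 0.5 * sum((y - t)**2), using
    # sigmoid'(a) = sigmoid(a) * (1 - sigmoid(a))
    delta2 = (y - t) * y * (1 - y)                 # dL/da2, shape (N, 1)
    delta1 = np.dot(delta2, W2.T) * z1 * (1 - z1)  # dL/da1, shape (N, 2)
    return {
        'W1': np.dot(x.T, delta1), 'b1': delta1.sum(axis=0),
        'W2': np.dot(z1.T, delta2), 'b2': delta2.sum(axis=0),
    }
```

If you try this, its output should agree with the numerical `gradient` to several decimal places on the same `network`, which is a convenient way to verify both implementations against each other.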