Writing a BP Neural Network in Python
Posted: 2023-11-19 20:08:09
Sure — here is a simple Python implementation of a BP (backpropagation) neural network:
```python
import numpy as np

class BPNeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size
        # Initialize weights randomly and biases to zero
        self.weights1 = np.random.randn(self.hidden_size, self.input_size)
        self.bias1 = np.zeros((self.hidden_size, 1))
        self.weights2 = np.random.randn(self.output_size, self.hidden_size)
        self.bias2 = np.zeros((self.output_size, 1))

    def sigmoid(self, x):
        return 1 / (1 + np.exp(-x))

    def forward(self, x):
        # x has shape (input_size, n_samples): each sample is a column
        hidden = self.sigmoid(self.weights1 @ x + self.bias1)
        output = self.sigmoid(self.weights2 @ hidden + self.bias2)
        return output, hidden

    def train(self, x, y, learning_rate=0.1, epochs=1000):
        for epoch in range(epochs):
            output, hidden = self.forward(x)
            # Output error and the chain-rule deltas
            # (sigmoid'(z) = a * (1 - a) for activation a)
            error = y - output
            delta_output = error * output * (1 - output)
            delta_hidden = (self.weights2.T @ delta_output) * hidden * (1 - hidden)
            # Gradient-descent updates for weights and biases
            self.weights2 += learning_rate * delta_output @ hidden.T
            self.bias2 += learning_rate * delta_output.sum(axis=1, keepdims=True)
            self.weights1 += learning_rate * delta_hidden @ x.T
            self.bias1 += learning_rate * delta_hidden.sum(axis=1, keepdims=True)
            # Mean-squared-error loss
            loss = np.mean(np.square(error))
            if epoch % 100 == 0:
                print(f"Epoch: {epoch}, Loss: {loss:.4f}")
```
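To see the same update rule in action, here is a self-contained sketch that inlines the training loop above and applies it to the XOR problem. The seed, learning rate, hidden-layer width, and epoch count are illustrative choices, not part of the original; inputs are laid out as columns, matching the `(features, samples)` convention the class assumes.

```python
import numpy as np

np.random.seed(42)
sigmoid = lambda z: 1 / (1 + np.exp(-z))

# XOR: 4 samples as columns, shape (2, 4); targets shape (1, 4)
x = np.array([[0, 0, 1, 1],
              [0, 1, 0, 1]], dtype=float)
y = np.array([[0, 1, 1, 0]], dtype=float)

# 2 inputs -> 4 hidden units -> 1 output
w1, b1 = np.random.randn(4, 2), np.zeros((4, 1))
w2, b2 = np.random.randn(1, 4), np.zeros((1, 1))

losses = []
for epoch in range(5000):
    hidden = sigmoid(w1 @ x + b1)
    output = sigmoid(w2 @ hidden + b2)
    error = y - output
    d_out = error * output * (1 - output)
    d_hid = (w2.T @ d_out) * hidden * (1 - hidden)
    w2 += 0.5 * d_out @ hidden.T
    b2 += 0.5 * d_out.sum(axis=1, keepdims=True)
    w1 += 0.5 * d_hid @ x.T
    b1 += 0.5 * d_hid.sum(axis=1, keepdims=True)
    losses.append(np.mean(error ** 2))

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

With these (arbitrary) hyperparameters the loss should drop substantially from its initial value as the network learns the XOR mapping.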
This network has one input layer, one hidden layer, and one output layer. At initialization, the weights are drawn randomly and the biases start at zero. In the forward pass, the input is propagated through the weights and biases to the hidden layer and then to the output layer. In backpropagation, the output-layer and hidden-layer errors are computed and used to update the weights and biases. The `sigmoid` function serves as the activation for both layers. The `train` method runs the training loop and prints the loss every 100 epochs.
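The delta formulas above are the negative gradient of the per-sample squared error 0.5 * (y - a)^2 through a sigmoid unit (the 0.5 and the mean over samples are constants absorbed into the learning rate). A quick finite-difference check on a single sigmoid unit — the shapes and values here are made up purely for illustration — confirms this:

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

np.random.seed(0)
w = np.random.randn(1, 3)   # one sigmoid unit with 3 inputs (illustrative shapes)
x = np.random.randn(3, 1)   # a single sample as a column vector
y = np.array([[1.0]])

a = sigmoid(w @ x)
delta = (y - a) * a * (1 - a)      # same form as delta_output above
grad_analytic = -(delta @ x.T)     # dL/dw for L = 0.5 * (y - a)**2

# Central finite differences, one weight at a time
eps = 1e-6
loss = lambda wv: 0.5 * ((y - sigmoid(wv @ x)) ** 2).item()
grad_numeric = np.zeros_like(w)
for i in range(w.shape[1]):
    wp, wm = w.copy(), w.copy()
    wp[0, i] += eps
    wm[0, i] -= eps
    grad_numeric[0, i] = (loss(wp) - loss(wm)) / (2 * eps)

print(np.max(np.abs(grad_analytic - grad_numeric)))  # should be tiny (~1e-9 or less)
```

Because `delta` is the negative gradient, the `+=` in the weight updates really is gradient descent, not ascent.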