Machine Learning: BP Algorithm Code
The backpropagation (BP) algorithm is a standard method for training multi-layer neural networks on tasks such as classification and regression. It propagates the output error backward through the network to compute the gradient of each weight and bias, then adjusts those parameters to minimize the error. In pseudocode, the BP algorithm works as follows:
1. Initialize the network parameters (weights and biases) to random values
2. Feed in a set of training data
3. Forward-propagate to compute the output
4. Compute the output error
5. Backpropagate the error and compute the gradient of each parameter
6. Update the parameters with gradient descent
7. Repeat steps 2-6 until a convergence criterion is met
Below is a simple Python implementation of the BP algorithm:
```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_dim, hidden_dim, output_dim):
        self.input_dim = input_dim
        self.hidden_dim = hidden_dim
        self.output_dim = output_dim
        # Initialize weights to random values and biases to zero
        self.weights1 = np.random.randn(input_dim, hidden_dim)
        self.bias1 = np.zeros((1, hidden_dim))
        self.weights2 = np.random.randn(hidden_dim, output_dim)
        self.bias2 = np.zeros((1, output_dim))

    def sigmoid(self, x):
        return 1 / (1 + np.exp(-x))

    def sigmoid_derivative(self, x):
        # Expects x to already be a sigmoid activation: s'(z) = s(z) * (1 - s(z))
        return x * (1 - x)

    def forward(self, X):
        self.z1 = np.dot(X, self.weights1) + self.bias1
        self.a1 = self.sigmoid(self.z1)   # hidden-layer activation
        self.z2 = np.dot(self.a1, self.weights2) + self.bias2
        self.a2 = self.sigmoid(self.z2)   # output-layer activation
        return self.a2

    def backward(self, X, y, output, lr=0.1):
        # Output-layer delta; since error = y - output, adding the
        # updates below moves the parameters down the loss gradient
        error = y - output
        delta_output = error * self.sigmoid_derivative(output)
        # Propagate the error back to the hidden layer
        error_hidden = delta_output.dot(self.weights2.T)
        delta_hidden = error_hidden * self.sigmoid_derivative(self.a1)
        # Gradient-descent updates for weights and biases
        self.weights2 += lr * self.a1.T.dot(delta_output)
        self.bias2 += lr * np.sum(delta_output, axis=0, keepdims=True)
        self.weights1 += lr * X.T.dot(delta_hidden)
        self.bias1 += lr * np.sum(delta_hidden, axis=0, keepdims=True)

    def train(self, X, y, epochs, lr=0.1):
        for i in range(epochs):
            output = self.forward(X)
            self.backward(X, y, output, lr)
```
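For reference, here is a minimal usage sketch. The XOR dataset, the 2-4-1 layer sizes, the epoch count, and the learning rate below are illustrative assumptions, not part of the original article:

```python
# Hypothetical demo: fit the XOR function with a 2-4-1 network
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

nn = NeuralNetwork(input_dim=2, hidden_dim=4, output_dim=1)
nn.train(X, y, epochs=10000, lr=0.5)

print(nn.forward(X))  # outputs should approach [[0], [1], [1], [0]]
```

Because the weights are initialized randomly, results vary from run to run; full-batch gradient descent on XOR can occasionally stall, in which case re-running or adjusting the learning rate usually helps.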