BP neural network regression in Python
You want to know how to implement a BP (back-propagation) neural network for a regression task in Python. Below is a simple example:
```python
import numpy as np

# Sigmoid activation function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# BP (back-propagation) neural network for regression
class BPNeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        # Initialize weights and biases
        self.weights1 = np.random.randn(input_size, hidden_size)
        self.bias1 = np.zeros((1, hidden_size))
        self.weights2 = np.random.randn(hidden_size, output_size)
        self.bias2 = np.zeros((1, output_size))

    def forward(self, X):
        # Forward pass: sigmoid hidden layer, linear output layer for regression
        self.z1 = np.dot(X, self.weights1) + self.bias1
        self.a1 = sigmoid(self.z1)
        self.z2 = np.dot(self.a1, self.weights2) + self.bias2
        self.y_hat = self.z2
        return self.y_hat

    def backward(self, X, y, y_hat, learning_rate):
        # Back-propagate the MSE gradient
        delta3 = y_hat - y
        dweights2 = np.dot(self.a1.T, delta3)
        dbias2 = np.sum(delta3, axis=0, keepdims=True)
        delta2 = np.dot(delta3, self.weights2.T) * self.a1 * (1 - self.a1)
        dweights1 = np.dot(X.T, delta2)
        dbias1 = np.sum(delta2, axis=0, keepdims=True)
        # Gradient-descent update of weights and biases
        self.weights2 -= learning_rate * dweights2
        self.bias2 -= learning_rate * dbias2
        self.weights1 -= learning_rate * dweights1
        self.bias1 -= learning_rate * dbias1

    def train(self, X, y, learning_rate, epochs):
        # Train with full-batch gradient descent
        for i in range(epochs):
            y_hat = self.forward(X)
            self.backward(X, y, y_hat, learning_rate)
            loss = np.mean(np.square(y_hat - y))
            if i % 100 == 0:
                print(f"Epoch {i}, Loss: {loss:.4f}")

# Toy training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

# Create a BP neural network instance (2 inputs, 4 hidden units, 1 output)
nn = BPNeuralNetwork(2, 4, 1)

# Train the model
nn.train(X, y, 0.1, 1000)

# Predict on the training inputs
y_pred = nn.forward(X)
print(y_pred)
```
In the code above, we first define the sigmoid activation function, then a BPNeuralNetwork class with methods for initializing weights and biases, the forward pass, back-propagation, and training. The training method uses mean squared error as the loss function and updates the weights and biases with gradient descent. Finally, the trained model is used to make predictions and print the results.
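Since the demo above trains on XOR-style 0/1 targets, here is a minimal sketch of using the same BPNeuralNetwork class on a continuous regression target. The quadratic toy data and the hyperparameters (hidden_size=8, learning_rate=0.01, epochs=2000) are illustrative assumptions, not tuned values; because the gradients are summed over the whole batch, the learning rate should shrink as the number of samples grows.
```python
import numpy as np

np.random.seed(0)

# Toy regression data: noisy samples of y = x^2 on [0, 1] (illustrative only)
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = X ** 2 + 0.01 * np.random.randn(20, 1)

# Reuse the BPNeuralNetwork class defined above
model = BPNeuralNetwork(input_size=1, hidden_size=8, output_size=1)
model.train(X, y, learning_rate=0.01, epochs=2000)

# Predict at unseen inputs
X_test = np.array([[0.25], [0.5], [0.75]])
print(model.forward(X_test))
```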