BP neural network multivariate regression with ReLU activation: code
The following is a Python example of a multilayer neural network trained with backpropagation (a BP neural network) for a multivariate regression task, using the ReLU activation function in the hidden layer:
```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size
        # He initialization keeps activations well-scaled for ReLU layers
        self.weights1 = np.random.randn(input_size, hidden_size) * np.sqrt(2.0 / input_size)
        self.weights2 = np.random.randn(hidden_size, output_size) * np.sqrt(2.0 / hidden_size)
        self.bias1 = np.zeros((1, hidden_size))
        self.bias2 = np.zeros((1, output_size))

    def relu(self, x):
        return np.maximum(0, x)

    def relu_derivative(self, x):
        # Return a new array; modifying x in place (as the original version did)
        # would corrupt the cached self.z1
        return (x > 0).astype(x.dtype)

    def forward(self, X):
        self.z1 = np.dot(X, self.weights1) + self.bias1   # hidden pre-activation
        self.a1 = self.relu(self.z1)                      # hidden activation
        self.z2 = np.dot(self.a1, self.weights2) + self.bias2
        self.y_hat = self.z2                              # linear output for regression
        return self.y_hat

    def backward(self, X, y, lr):
        m = X.shape[0]
        # Gradient of the (1/2)*MSE loss w.r.t. the linear output,
        # averaged over the batch so lr does not depend on the sample count
        delta3 = (self.y_hat - y) / m
        delta2 = np.dot(delta3, self.weights2.T) * self.relu_derivative(self.z1)
        d_weights2 = np.dot(self.a1.T, delta3)
        d_weights1 = np.dot(X.T, delta2)
        d_bias2 = np.sum(delta3, axis=0, keepdims=True)
        d_bias1 = np.sum(delta2, axis=0, keepdims=True)
        # Gradient descent update
        self.weights2 -= lr * d_weights2
        self.weights1 -= lr * d_weights1
        self.bias2 -= lr * d_bias2
        self.bias1 -= lr * d_bias1

    def train(self, X, y, epochs=1000, lr=0.1):
        for i in range(epochs):
            y_hat = self.forward(X)
            self.backward(X, y, lr)
            if i % 100 == 0:
                loss = np.mean(np.square(y_hat - y))
                print(f'Epoch {i}, loss: {loss:.4f}')
```
In this example, we use NumPy for the matrix and vector operations and implement the ReLU activation function along with its derivative. During initialization, the weights are randomly initialized and the biases are set to zero. In the forward pass, we compute the hidden layer's pre-activation, apply ReLU as the nonlinearity, and then compute the output layer's prediction, which is left linear as is standard for regression. In the backward pass, we compute the error terms and update the weights and biases by gradient descent. Finally, the train function trains the whole network and prints the mean squared error every 100 epochs.
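As a quick sanity check, here is a minimal usage sketch. The synthetic data below is purely illustrative (the linear target true_W plus noise is an assumption for the demo, not part of the original code):

```python
import numpy as np

np.random.seed(0)

# Synthetic multivariate regression data (illustrative only):
# 200 samples, 3 input features, 2 output targets
X = np.random.randn(200, 3)
true_W = np.array([[ 1.0, -2.0],
                   [ 0.5,  0.0],
                   [-1.5,  2.0]])
y = X.dot(true_W) + 0.1 * np.random.randn(200, 2)

net = NeuralNetwork(input_size=3, hidden_size=16, output_size=2)
net.train(X, y, epochs=1000, lr=0.1)

# Predict on new inputs with the trained network
X_new = np.random.randn(5, 3)
print(net.forward(X_new))
```

Because this target is nearly linear, the printed loss should drop quickly; on real data, standardizing the inputs and tuning lr usually matters more than the network size.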