Python implementation of a classification experiment based on a BP neural network
Time: 2023-10-08 22:06:52
Sure, here is an example Python implementation of a classification experiment based on a BP (backpropagation) neural network:
```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        # initialize weights with random values
        self.weights1 = np.random.randn(input_size, hidden_size)
        self.weights2 = np.random.randn(hidden_size, output_size)

    def sigmoid(self, x):
        return 1 / (1 + np.exp(-x))

    def sigmoid_derivative(self, x):
        # x is assumed to already be a sigmoid output, so the
        # derivative simplifies to x * (1 - x)
        return x * (1 - x)

    def forward(self, X):
        # input layer -> hidden layer: dot product with weights1, then sigmoid
        self.layer1 = self.sigmoid(np.dot(X, self.weights1))
        # hidden layer -> output layer: dot product with weights2, then sigmoid
        self.output = self.sigmoid(np.dot(self.layer1, self.weights2))
        return self.output

    def backward(self, X, y, o, lr):
        # error between actual and predicted output
        self.output_error = y - o
        # delta for the output layer
        self.output_delta = self.output_error * self.sigmoid_derivative(o)
        # propagate the error back to the hidden layer
        self.layer1_error = np.dot(self.output_delta, self.weights2.T)
        self.layer1_delta = self.layer1_error * self.sigmoid_derivative(self.layer1)
        # update weights using the deltas and the learning rate
        self.weights1 += lr * np.dot(X.T, self.layer1_delta)
        self.weights2 += lr * np.dot(self.layer1.T, self.output_delta)

    def train(self, X, y, lr=0.1, epochs=1000):
        for i in range(epochs):
            # forward propagation
            o = self.forward(X)
            # backward propagation
            self.backward(X, y, o, lr)

    def predict(self, X):
        return self.forward(X)
```
This code demonstrates a very simple neural network with one input layer, one hidden layer, and one output layer. During forward propagation, the input is multiplied by the first weight matrix and passed through a sigmoid activation to produce the hidden layer, whose output is then fed through the second weight matrix and another sigmoid to produce the final output. During backpropagation, the prediction error is computed and propagated back through the layers to update both weight matrices. To run a classification experiment, simply pass the input features and labels to the train() method.
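As a quick sanity check, the class can be trained on the classic XOR problem. The dataset, seed, and hyperparameters below are illustrative choices, not part of the original code, and the class is repeated so the snippet runs standalone:

```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        self.weights1 = np.random.randn(input_size, hidden_size)
        self.weights2 = np.random.randn(hidden_size, output_size)

    def sigmoid(self, x):
        return 1 / (1 + np.exp(-x))

    def sigmoid_derivative(self, x):
        # x is already a sigmoid output
        return x * (1 - x)

    def forward(self, X):
        self.layer1 = self.sigmoid(np.dot(X, self.weights1))
        self.output = self.sigmoid(np.dot(self.layer1, self.weights2))
        return self.output

    def backward(self, X, y, o, lr):
        self.output_delta = (y - o) * self.sigmoid_derivative(o)
        self.layer1_delta = np.dot(self.output_delta, self.weights2.T) \
            * self.sigmoid_derivative(self.layer1)
        self.weights1 += lr * np.dot(X.T, self.layer1_delta)
        self.weights2 += lr * np.dot(self.layer1.T, self.output_delta)

    def train(self, X, y, lr=0.1, epochs=1000):
        for i in range(epochs):
            o = self.forward(X)
            self.backward(X, y, o, lr)

    def predict(self, X):
        return self.forward(X)

np.random.seed(0)  # for reproducibility

# XOR dataset: 2 input features, 1 binary label per sample
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([[0], [1], [1], [0]])

nn = NeuralNetwork(input_size=2, hidden_size=4, output_size=1)

mse_before = np.mean((y - nn.predict(X)) ** 2)
nn.train(X, y, lr=0.5, epochs=10000)
mse_after = np.mean((y - nn.predict(X)) ** 2)

print(f"MSE before training: {mse_before:.4f}")
print(f"MSE after training:  {mse_after:.4f}")
print("predictions:", nn.predict(X).ravel())
```

With a fixed seed and enough epochs, the mean squared error should drop substantially and the four predictions should round to 0, 1, 1, 0. Because the output layer is a single sigmoid unit, a prediction above 0.5 can be read as class 1.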