Writing a BP Neural Network in Python
Date: 2023-11-22 08:55:01
Sure, here's a simple implementation of a backpropagation (BP) neural network in Python:
```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        self.input_size = input_size
        self.hidden_size = hidden_size
        self.output_size = output_size
        # Randomly initialize weights and biases in [0, 1)
        self.weights1 = np.random.rand(self.input_size, self.hidden_size)
        self.bias1 = np.random.rand(self.hidden_size)
        self.weights2 = np.random.rand(self.hidden_size, self.output_size)
        self.bias2 = np.random.rand(self.output_size)

    def sigmoid(self, x):
        return 1 / (1 + np.exp(-x))

    def feedforward(self, inputs):
        # Forward pass: input -> hidden -> output, with a sigmoid at each layer
        hidden_layer = self.sigmoid(np.dot(inputs, self.weights1) + self.bias1)
        output_layer = self.sigmoid(np.dot(hidden_layer, self.weights2) + self.bias2)
        return output_layer

    def train(self, inputs, targets, learning_rate):
        # Forward pass
        hidden_layer = self.sigmoid(np.dot(inputs, self.weights1) + self.bias1)
        output_layer = self.sigmoid(np.dot(hidden_layer, self.weights2) + self.bias2)
        # Backward pass: each delta uses the sigmoid derivative s * (1 - s)
        output_error = targets - output_layer
        output_delta = output_error * output_layer * (1 - output_layer)
        hidden_error = np.dot(output_delta, self.weights2.T)
        hidden_delta = hidden_error * hidden_layer * (1 - hidden_layer)
        # Gradient-descent updates on the squared error
        self.weights2 += learning_rate * np.dot(hidden_layer.T, output_delta)
        self.bias2 += learning_rate * np.sum(output_delta, axis=0)
        self.weights1 += learning_rate * np.dot(inputs.T, hidden_delta)
        self.bias1 += learning_rate * np.sum(hidden_delta, axis=0)
```
This implementation uses the sigmoid activation function and the backpropagation algorithm to train the network. You initialize it with the number of input, hidden, and output neurons, then train it with the `train` method on inputs and target outputs; both should be 2-D arrays with one row per sample. The `feedforward` method can then be used to predict the output for new inputs.
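As a usage sketch, the class above could be trained on the XOR problem. The hyperparameters here (hidden size 4, learning rate 0.5, 10,000 epochs) are illustrative choices, not from the original, and the all-positive random initialization means convergence on XOR is not guaranteed:

```python
import numpy as np

# NeuralNetwork class repeated from above so this example runs standalone
class NeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        self.weights1 = np.random.rand(input_size, hidden_size)
        self.bias1 = np.random.rand(hidden_size)
        self.weights2 = np.random.rand(hidden_size, output_size)
        self.bias2 = np.random.rand(output_size)

    def sigmoid(self, x):
        return 1 / (1 + np.exp(-x))

    def feedforward(self, inputs):
        hidden = self.sigmoid(np.dot(inputs, self.weights1) + self.bias1)
        return self.sigmoid(np.dot(hidden, self.weights2) + self.bias2)

    def train(self, inputs, targets, learning_rate):
        hidden = self.sigmoid(np.dot(inputs, self.weights1) + self.bias1)
        output = self.sigmoid(np.dot(hidden, self.weights2) + self.bias2)
        output_delta = (targets - output) * output * (1 - output)
        hidden_delta = np.dot(output_delta, self.weights2.T) * hidden * (1 - hidden)
        self.weights2 += learning_rate * np.dot(hidden.T, output_delta)
        self.bias2 += learning_rate * np.sum(output_delta, axis=0)
        self.weights1 += learning_rate * np.dot(inputs.T, hidden_delta)
        self.bias1 += learning_rate * np.sum(hidden_delta, axis=0)

np.random.seed(0)  # fixed seed for reproducibility
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

nn = NeuralNetwork(input_size=2, hidden_size=4, output_size=1)
for _ in range(10000):  # epoch count and learning rate are illustrative
    nn.train(X, y, learning_rate=0.5)

predictions = nn.feedforward(X)
print(predictions.shape)  # (4, 1) -- one sigmoid output in (0, 1) per sample
```

Note that `train` performs full-batch gradient descent: all four XOR samples are passed at once, and the bias updates sum the deltas over the batch via `axis=0`.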