Feedforward Neural Network in Python
Below is a simple Python implementation of a feedforward neural network:
```python
import numpy as np

class NeuralNetwork:
    def __init__(self, input_size, hidden_size, output_size):
        # Randomly initialize weights; biases start at zero
        self.weights1 = np.random.randn(input_size, hidden_size)
        self.bias1 = np.zeros((1, hidden_size))
        self.weights2 = np.random.randn(hidden_size, output_size)
        self.bias2 = np.zeros((1, output_size))

    def forward(self, X):
        # Hidden layer: affine transform followed by tanh activation
        self.z1 = np.dot(X, self.weights1) + self.bias1
        self.a1 = np.tanh(self.z1)
        # Output layer: affine transform followed by softmax
        self.z2 = np.dot(self.a1, self.weights2) + self.bias2
        # Subtract the row-wise max before exponentiating for numerical stability
        exp_scores = np.exp(self.z2 - np.max(self.z2, axis=1, keepdims=True))
        self.probs = exp_scores / np.sum(exp_scores, axis=1, keepdims=True)
        return self.probs

    def compute_loss(self, X, y):
        # Average cross-entropy loss over the batch
        num_examples = len(X)
        log_probs = -np.log(self.probs[range(num_examples), y])
        return np.sum(log_probs) / num_examples

    def backpropagation(self, X, y, learning_rate=0.01):
        num_examples = len(X)
        # Gradient of softmax + cross-entropy w.r.t. z2 is (probs - one_hot(y));
        # copy so the stored probabilities are not mutated in place
        delta3 = self.probs.copy()
        delta3[range(num_examples), y] -= 1
        dW2 = np.dot(self.a1.T, delta3)
        db2 = np.sum(delta3, axis=0, keepdims=True)
        # Backpropagate through tanh: d/dz tanh(z) = 1 - tanh(z)^2
        delta2 = np.dot(delta3, self.weights2.T) * (1 - np.power(self.a1, 2))
        dW1 = np.dot(X.T, delta2)
        db1 = np.sum(delta2, axis=0, keepdims=True)
        # Gradient descent update
        self.weights1 -= learning_rate * dW1
        self.bias1 -= learning_rate * db1
        self.weights2 -= learning_rate * dW2
        self.bias2 -= learning_rate * db2

    def train(self, X, y, num_passes=20000, learning_rate=0.01):
        for i in range(num_passes):
            self.forward(X)
            self.backpropagation(X, y, learning_rate)
            if i % 1000 == 0:
                print("Iteration:", i, " Loss:", self.compute_loss(X, y))
```
This feedforward network consists of an input layer, one hidden layer, and an output layer: the hidden layer uses tanh as its activation function and the output layer uses softmax. Training minimizes the cross-entropy loss and updates the parameters with backpropagation. Because softmax and cross-entropy are combined, the gradient with respect to the output pre-activations simplifies to `probs - one_hot(y)`, which is exactly what the `delta3[range(num_examples), y] -= 1` line computes.
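As a quick sanity check, here is a minimal usage sketch on a toy two-class dataset. The blob data, random seed, and hyperparameters below are illustrative assumptions, not part of the original code:
```python
import numpy as np

# Toy dataset: two Gaussian blobs with labels 0 and 1 (illustrative only)
rng = np.random.default_rng(0)
X0 = rng.normal(loc=-1.0, scale=0.5, size=(100, 2))
X1 = rng.normal(loc=+1.0, scale=0.5, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Hidden size and learning rate are arbitrary small values for this demo
net = NeuralNetwork(input_size=2, hidden_size=8, output_size=2)
net.train(X, y, num_passes=2000, learning_rate=0.001)

# Predicted class = argmax over the softmax probabilities
preds = np.argmax(net.forward(X), axis=1)
print("Training accuracy:", np.mean(preds == y))
```
On an easily separable toy problem like this, the loss should fall steadily and training accuracy should approach 1.0; note that the gradients are summed rather than averaged over the batch, so the learning rate needs to shrink as the dataset grows.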