A Neural Network Code Snippet
Posted: 2023-07-08 19:24:45
Below is a simple neural-network implementation for a binary classification task:
```python
import numpy as np

# Sigmoid activation function
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Initialize weights and bias to zero
def initialize_parameters(dim):
    W = np.zeros((dim, 1))
    b = 0.0
    return W, b

# Forward propagation: linear step followed by the sigmoid activation
def forward_propagation(X, W, b):
    Z = np.dot(X, W) + b
    A = sigmoid(Z)
    return A

# Cross-entropy cost; clip A away from 0 and 1 to avoid log(0)
def compute_cost(A, Y):
    m = Y.shape[0]
    A = np.clip(A, 1e-12, 1 - 1e-12)
    cost = (-1 / m) * np.sum(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    return cost

# Backward propagation: gradients of the cost w.r.t. W and b
def backward_propagation(X, A, Y):
    m = Y.shape[0]
    dZ = A - Y
    dW = (1 / m) * np.dot(X.T, dZ)
    db = (1 / m) * np.sum(dZ)
    return dW, db

# Gradient-descent parameter update
def update_parameters(W, b, dW, db, learning_rate):
    W = W - learning_rate * dW
    b = b - learning_rate * db
    return W, b

# Train the model with batch gradient descent
def train_model(X, Y, num_iterations, learning_rate):
    dim = X.shape[1]
    W, b = initialize_parameters(dim)
    for i in range(num_iterations):
        A = forward_propagation(X, W, b)
        cost = compute_cost(A, Y)
        dW, db = backward_propagation(X, A, Y)
        W, b = update_parameters(W, b, dW, db, learning_rate)
        if i % 100 == 0:
            print("Cost after iteration {}: {}".format(i, cost))
    return W, b

# Predict labels by thresholding the sigmoid output at 0.5
def predict(X, W, b):
    A = forward_propagation(X, W, b)
    Y_pred = np.zeros_like(A)
    Y_pred[A > 0.5] = 1
    return Y_pred
```
This code implements a single-layer neural network (logistic regression): the parameters are trained with gradient descent on the cross-entropy cost, and predictions are made by thresholding the sigmoid output at 0.5.
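To illustrate the training loop end to end, here is a minimal sketch on a toy linearly separable dataset. The data, learning rate, and iteration count are assumptions chosen for the demo; for self-containment the same gradient-descent steps as `train_model` above are inlined rather than calling the helpers:

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Toy linearly separable data: the label depends on the sign of the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
Y = (X[:, 0] > 0).astype(float).reshape(-1, 1)

# Same steps as train_model above: forward pass, gradients, parameter update.
W = np.zeros((2, 1))
b = 0.0
learning_rate = 0.1
for _ in range(1000):
    A = sigmoid(X @ W + b)                   # forward propagation
    dZ = A - Y                               # gradient of the cost w.r.t. Z
    W -= learning_rate * (X.T @ dZ) / len(Y)
    b -= learning_rate * dZ.mean()

# Predict by thresholding the sigmoid output at 0.5.
Y_pred = (sigmoid(X @ W + b) > 0.5).astype(float)
accuracy = (Y_pred == Y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On separable data like this, gradient descent drives the weight on the informative feature positive and the training accuracy close to 1.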