Code Implementation of the Cross-Entropy Loss and Mean Squared Error Loss Functions
Code implementation of the cross-entropy loss function:
```python
import numpy as np

def cross_entropy_loss(y_pred, y_true):
    """
    Computes the cross-entropy loss between predicted and true labels.
    Args:
    - y_pred: numpy array of predicted probabilities (shape: batch_size x num_classes)
    - y_true: numpy array of one-hot true labels (shape: batch_size x num_classes)
    Returns:
    - cross-entropy loss: scalar value
    """
    epsilon = 1e-12  # small value to keep predictions away from 0 and 1
    num_examples = y_pred.shape[0]  # number of examples in the batch
    # clip predicted values to avoid log(0) errors
    y_pred = np.clip(y_pred, epsilon, 1. - epsilon)
    # compute the cross-entropy loss averaged over the batch
    ce_loss = -np.sum(y_true * np.log(y_pred)) / num_examples
    return ce_loss
```
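As a quick sanity check, the function can be called on a small hand-written batch. The example below is only an illustrative sketch (the values are made up); it assumes one-hot true labels and row-normalized predicted probabilities:

```python
import numpy as np

# hypothetical 2-example, 3-class batch for illustration
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1]])
y_true = np.array([[1, 0, 0],
                   [0, 1, 0]])

# average negative log-probability of the correct classes:
# -(log(0.7) + log(0.8)) / 2 ≈ 0.2899
print(cross_entropy_loss(y_pred, y_true))
```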
Code implementation of the mean squared error loss function:
```python
import numpy as np

def mse_loss(y_pred, y_true):
    """
    Computes the mean squared error loss between predicted and true values.
    Args:
    - y_pred: numpy array of predicted values (shape: batch_size x num_classes)
    - y_true: numpy array of true values (shape: batch_size x num_classes)
    Returns:
    - mean squared error loss: scalar value
    """
    num_examples = y_pred.shape[0]  # number of examples in the batch
    # sum of squared errors averaged over the batch, with an extra 1/2 factor
    # that simplifies the gradient to (y_pred - y_true)
    loss = np.sum(np.square(y_pred - y_true)) / (2 * num_examples)
    return loss
```
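A short usage sketch (again with made-up values) shows the scaling: because of the 1/2 convention, the result is half the per-example sum of squared errors rather than the plain element-wise mean:

```python
import numpy as np

# hypothetical 2-example, 2-output batch for illustration
y_pred = np.array([[1.0, 2.0],
                   [3.0, 4.0]])
y_true = np.array([[1.5, 2.0],
                   [2.0, 4.0]])

# squared errors: 0.25 + 0 + 1.0 + 0 = 1.25; divided by (2 * 2 examples) = 0.3125
print(mse_loss(y_pred, y_true))
```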