Unknown activation function: leaky_relu
Leaky ReLU is an activation function commonly used in neural networks. It is similar to the ReLU activation function, but instead of setting all negative values to zero, it scales them by a small constant factor (commonly 0.01) to prevent the "dying ReLU" problem.
Here is an implementation of Leaky ReLU in Python using the NumPy library:
```python
import numpy as np

def leaky_relu(x):
    # For x >= 0 returns x; for x < 0 returns 0.01 * x
    return np.maximum(0.01 * x, x)
```
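As a quick sanity check, the function can be applied to a small sample array (the input values below are just an illustration):
```python
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # expected: [-0.02  -0.005  0.     1.     3.   ]
```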
You can use this function as the activation of a layer in a neural network by passing the function object itself (rather than the string 'leaky_relu') to the appropriate layer, such as a Dense layer in Keras:
```python
from keras.layers import Dense
from keras.models import Sequential

input_dim = 10  # number of input features; replace with your own value

model = Sequential()
model.add(Dense(32, activation=leaky_relu, input_shape=(input_dim,)))
```
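Note that when a model saved with a custom activation like this is later reloaded, Keras needs to be told about the function again, otherwise it may raise the "Unknown activation function: leaky_relu" error from the title. A minimal sketch, assuming the model was saved to a hypothetical file model.h5:
```python
from keras.models import load_model

# Map the activation name stored in the saved model back to the Python function
model = load_model('model.h5', custom_objects={'leaky_relu': leaky_relu})
```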