LeakyReLU
```python
keras.layers.LeakyReLU(negative_slope=0.3, **kwargs)
```
Leaky version of a Rectified Linear Unit activation layer.
This layer allows a small gradient when the unit is not active.
Formula:

```
f(x) = negative_slope * x if x < 0
f(x) = x                  if x >= 0
```
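For reference, this piecewise formula can be written as a standalone NumPy function. The `leaky_relu` helper below is a minimal sketch for illustration only, not part of the Keras API:

```python
import numpy as np

def leaky_relu(x, negative_slope=0.3):
    # Keep non-negative values unchanged; scale negative values by
    # the negative_slope coefficient. This is a hypothetical reference
    # implementation, not the Keras layer itself.
    return np.where(x >= 0, x, negative_slope * x)

print(leaky_relu(np.array([-10.0, -5.0, 0.0, 5.0, 10.0]), negative_slope=0.5))
# [-5.  -2.5  0.   5.  10. ]
```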
Example

```python
import numpy as np
from keras import layers

leaky_relu_layer = layers.LeakyReLU(negative_slope=0.5)
input = np.array([-10, -5, 0.0, 5, 10])
result = leaky_relu_layer(input)
# result = [-5. , -2.5, 0. , 5. , 10.]
```
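Beyond calling the layer directly on an array, it is typically placed after a linear layer inside a model. A minimal sketch (the input and layer sizes here are arbitrary assumptions):

```python
import keras
from keras import layers

# A minimal sketch: LeakyReLU used as a standalone activation layer
# after a Dense layer. Sizes are arbitrary, chosen for illustration.
model = keras.Sequential([
    keras.Input(shape=(32,)),
    layers.Dense(64),
    layers.LeakyReLU(negative_slope=0.3),
    layers.Dense(1),
])
model.summary()
```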
Arguments

- **negative_slope**: Float >= 0.0. Negative slope coefficient. Defaults to `0.3`.
- **\*\*kwargs**: Base layer keyword arguments, such as `name` and `dtype`.
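As the `**kwargs` entry indicates, base `Layer` arguments pass through to the underlying constructor. A small sketch (the `name` value is an arbitrary example):

```python
from keras import layers

# Base layer keyword arguments such as `name` and `dtype` are
# forwarded to the underlying Layer constructor.
layer = layers.LeakyReLU(negative_slope=0.2, name="leaky_act", dtype="float32")
print(layer.name)  # leaky_act
```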