ReLU
keras.layers.ReLU(max_value=None, negative_slope=0.0, threshold=0.0, **kwargs)
Rectified Linear Unit activation function layer.
Formula:
With default values, this returns the standard ReLU activation:
f(x) = max(x, 0)
With max_value, negative_slope, and threshold set, it becomes:
f(x) = max_value if x >= max_value
f(x) = x if threshold <= x < max_value
f(x) = negative_slope * (x - threshold) otherwise
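As a reference, the piecewise formula above can be written as a minimal NumPy sketch (the helper name relu_piecewise is hypothetical and not part of the Keras API):

import numpy as np

def relu_piecewise(x, max_value=None, negative_slope=0.0, threshold=0.0):
    x = np.asarray(x, dtype="float32")
    # Below the threshold, scale the shifted input by the negative slope.
    out = np.where(x < threshold, negative_slope * (x - threshold), x)
    # Cap the activation at max_value, if one was given.
    if max_value is not None:
        out = np.minimum(out, max_value)
    return out

print(relu_piecewise([-10, -5, 0.0, 5, 10], max_value=10, negative_slope=0.5))
# [-5.  -2.5  0.   5.  10. ]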
Example
import numpy as np
import keras

relu_layer = keras.layers.ReLU(
    max_value=10,
    negative_slope=0.5,
    threshold=0,
)
input = np.array([-10, -5, 0.0, 5, 10])
result = relu_layer(input)
# result = [-5. , -2.5,  0. ,  5. , 10.]
Arguments
- max_value: Float >= 0. Maximum activation value. None means unlimited. Defaults to None.
- negative_slope: Float >= 0. Negative slope coefficient. Defaults to 0.0.
- threshold: Float >= 0. Threshold value for thresholded activation. Defaults to 0.0.
- **kwargs: Base layer keyword arguments, such as name and dtype.
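Beyond the standalone call in the example above, the layer is typically used inside a model. A brief sketch, assuming arbitrary layer sizes; the layer name "relu_capped" is an illustrative value passed through the base-layer name kwarg:

import keras

model = keras.Sequential([
    keras.Input(shape=(16,)),
    keras.layers.Dense(32),
    keras.layers.ReLU(max_value=6.0, name="relu_capped"),  # ReLU6-style clipping
    keras.layers.Dense(1),
])
model.summary()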