tf.keras.backend.relu
Rectified linear unit.
```python
tf.keras.backend.relu(
    x, alpha=0.0, max_value=None, threshold=0
)
```
With default values, it returns element-wise max(x, 0).
Otherwise, it follows: `f(x) = max_value` for `x >= max_value`, `f(x) = x` for `threshold <= x < max_value`, and `f(x) = alpha * (x - threshold)` otherwise.
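A minimal usage sketch (assuming TensorFlow 2.x with eager execution, so `.numpy()` is available) showing the default behavior and the thresholded, saturated variant:

```python
import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])

# Default arguments: element-wise max(x, 0).
print(tf.keras.backend.relu(x).numpy())
# [0. 0. 0. 1. 2.]

# alpha sets the slope below threshold; max_value saturates the output.
# Per the formula above, f(-2.0) = 0.1 * (-2.0 - 0.5) = -0.25,
# and f(2.0) saturates at max_value = 1.5.
print(tf.keras.backend.relu(x, alpha=0.1, max_value=1.5, threshold=0.5).numpy())
# [-0.25 -0.15 -0.05 1. 1.5]
```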
| Arguments | |
|---|---|
| `x` | A tensor or variable. |
| `alpha` | A scalar, slope of negative section (default `0.`). |
| `max_value` | float. Saturation threshold. |
| `threshold` | float. Threshold value for thresholded activation. |
| Returns |
|---|
| A tensor. |
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/keras/backend/relu