tf.keras.backend.elu
Exponential linear unit.
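For reference, the exponential linear unit is commonly defined piecewise: it passes non-negative inputs through unchanged and maps negative inputs to `alpha * (exp(x) - 1)`:

elu(x) = x                      if x > 0
elu(x) = alpha * (exp(x) - 1)   if x <= 0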
tf.keras.backend.elu(
    x, alpha=1.0
)
| Arguments | |
|---|---|
| `x` | A tensor or variable to compute the activation function for. |
| `alpha` | A scalar, slope of the negative section. |
| Returns | |
|---|---|
| A tensor. | |
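A minimal usage sketch, assuming TensorFlow 2.x is installed; the input values are illustrative:

import tensorflow as tf

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
# Negative entries map to alpha * (exp(x) - 1); non-negative entries pass through unchanged.
y = tf.keras.backend.elu(x, alpha=1.0)
print(y.numpy())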
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/keras/backend/elu