tf.keras.activations.hard_sigmoid
Hard sigmoid activation function.
tf.keras.activations.hard_sigmoid(
    x
)
A faster, piecewise-linear approximation of the sigmoid activation.
For example:
a = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0], dtype=tf.float32)
b = tf.keras.activations.hard_sigmoid(a)
b.numpy()
array([0. , 0.3, 0.5, 0.7, 1. ], dtype=float32)
| Arguments | |
|---|---|
| x | Input tensor. |

| Returns | |
|---|---|
| The hard sigmoid activation, defined as: 0 if x < -2.5, 1 if x > 2.5, and 0.2 * x + 0.5 if -2.5 <= x <= 2.5. | |
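As a rough sketch, the piecewise definition above can be reproduced with a single clip of the linear segment; the helper name `hard_sigmoid_reference` below is hypothetical and not part of the Keras API:

```python
import tensorflow as tf

def hard_sigmoid_reference(x):
    # 0.2 * x + 0.5, clipped to [0, 1]:
    # values below -2.5 map to 0, values above 2.5 map to 1.
    return tf.clip_by_value(0.2 * x + 0.5, 0.0, 1.0)

x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0], dtype=tf.float32)
print(hard_sigmoid_reference(x).numpy())              # [0.  0.3 0.5 0.7 1. ]
print(tf.keras.activations.hard_sigmoid(x).numpy())   # [0.  0.3 0.5 0.7 1. ]
```

Because the clip avoids the exponential in the true sigmoid, this form is cheaper to evaluate while producing values in the same [0, 1] range.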
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
 https://www.tensorflow.org/versions/r2.4/api_docs/python/tf/keras/activations/hard_sigmoid