tf.compat.v1.nn.crelu
Computes Concatenated ReLU.
```python
tf.compat.v1.nn.crelu(
    features, name=None, axis=-1
)
```
Concatenates a ReLU which selects only the positive part of the activation with a ReLU which selects only the negative part of the activation. Note that as a result this non-linearity doubles the depth of the activations.

Source: Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units (Shang et al., 2016).
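For example, a minimal sketch of the depth doubling on a small, hypothetical input tensor (the printed values follow directly from the definition above):

```python
import tensorflow as tf

# A 1x3 input containing both positive and negative activations.
x = tf.constant([[-1.0, 2.0, -3.0]])

# CReLU concatenates relu(x) with relu(-x) along `axis` (default -1),
# so the output depth is twice the input depth: shape (1, 6).
y = tf.compat.v1.nn.crelu(x)
print(y)  # [[0. 2. 0. 1. 0. 3.]]
```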
| Args | |
|---|---|
| `features` | A `Tensor` with type `float`, `double`, `int32`, `int64`, `uint8`, `int16`, or `int8`. |
| `name` | A name for the operation (optional). |
| `axis` | The axis along which the output values are concatenated. Default is -1. |
| Returns | |
|---|---|
| A `Tensor` with the same type as `features`. |
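As described above, the op is equivalent to concatenating the positive and negative rectifications. A sketch of that equivalence, assuming eager execution (exact equality holds because ReLU and negation introduce no rounding):

```python
import tensorflow as tf

x = tf.random.normal([4, 8])

# Reference built from documented primitives: concatenate
# relu(x) and relu(-x) along the last axis.
reference = tf.concat([tf.nn.relu(x), tf.nn.relu(-x)], axis=-1)

assert bool(tf.reduce_all(tf.equal(tf.compat.v1.nn.crelu(x), reference)))
```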
References:
Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units: Shang et al., 2016.
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
 https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/compat/v1/nn/crelu