tf.keras.activations.softmax
tf.keras.activations.softmax(
    x, axis=-1
)

The softmax activation function transforms the outputs so that all values are in the range (0, 1) and sum to 1. It is often used as the activation for the last layer of a classification network because the result can be interpreted as a probability distribution. The softmax of each vector x is computed as exp(x) / tf.reduce_sum(exp(x)).
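For example, a minimal usage sketch (not part of the original page, written against eager TensorFlow 2.x semantics; under TF 1.15 the result would need to be evaluated in a session):

```python
import tensorflow as tf

# A batch of two logit vectors.
logits = tf.constant([[1.0, 2.0, 3.0],
                      [1.0, 1.0, 1.0]])

# Softmax along the last axis (the default).
probs = tf.keras.activations.softmax(logits, axis=-1)

# Each row is non-negative and sums to 1, approximately:
# [[0.09, 0.24, 0.67],
#  [0.33, 0.33, 0.33]]
```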
| Arguments | |
|---|---|
| `x` | Input tensor. |
| `axis` | Integer, axis along which the softmax normalization is applied. |

| Returns | |
|---|---|
| Tensor, output of the softmax transformation (all values are non-negative and sum to 1). |

| Raises | |
|---|---|
| `ValueError` | In case `dim(x) == 1`. |
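The formula above can be checked directly. The following sketch (not part of the original page) compares the activation's output against exp(x) / tf.reduce_sum(exp(x)) applied along the same axis:

```python
import tensorflow as tf

x = tf.constant([[0.5, 1.5, -1.0]])

by_activation = tf.keras.activations.softmax(x, axis=-1)
# keepdims=True keeps the sum broadcastable against exp(x).
by_formula = tf.exp(x) / tf.reduce_sum(tf.exp(x), axis=-1, keepdims=True)

# The two results agree up to floating-point tolerance.
tf.debugging.assert_near(by_activation, by_formula)
```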
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/activations/softmax