tf.raw_ops.SparseSoftmaxCrossEntropyWithLogits
Computes softmax cross entropy cost and gradients to backpropagate.
```python
tf.raw_ops.SparseSoftmaxCrossEntropyWithLogits(
    features, labels, name=None
)
```
Unlike `SoftmaxCrossEntropyWithLogits`, this operation does not accept a matrix of label probabilities; instead it takes a single label per row of `features`. That label is considered to have probability 1.0 for the given row.

Inputs are the logits, not probabilities.
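For instance, the minimal sketch below calls the raw op directly on a batch of two examples over three classes (the tensor values are made up for illustration):

```python
import tensorflow as tf

# Logits for a batch of 2 examples over 3 classes (batch_size x num_classes).
features = tf.constant([[2.0, 0.5, -1.0],
                        [0.1, 1.2, 3.3]], dtype=tf.float32)
# One integer class label per row, with values in [0, num_classes).
labels = tf.constant([0, 2], dtype=tf.int64)

loss, backprop = tf.raw_ops.SparseSoftmaxCrossEntropyWithLogits(
    features=features, labels=labels)

print(loss)      # shape (2,): per-example cross-entropy loss
print(backprop)  # shape (2, 3): gradient of the loss w.r.t. features
```

In typical code the public wrapper `tf.nn.sparse_softmax_cross_entropy_with_logits` is used instead; the raw op differs in that it also returns the `backprop` tensor explicitly.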
| Args | |
|---|---|
| `features` | A `Tensor`. Must be one of the following types: `half`, `bfloat16`, `float32`, `float64`. A `batch_size x num_classes` matrix. |
| `labels` | A `Tensor`. Must be one of the following types: `int32`, `int64`. A `batch_size` vector with values in `[0, num_classes)`. This is the label for the given minibatch entry. |
| `name` | A name for the operation (optional). |
| Returns | |
|---|---|
| A tuple of `Tensor` objects `(loss, backprop)`. | |
| `loss` | A `Tensor`. Has the same type as `features`. |
| `backprop` | A `Tensor`. Has the same type as `features`. |
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/raw_ops/SparseSoftmaxCrossEntropyWithLogits