tf.keras.losses.categorical_crossentropy
Computes the categorical crossentropy loss.
tf.keras.losses.categorical_crossentropy(
y_true, y_pred, from_logits=False, label_smoothing=0
)
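For one-hot targets and probability predictions (from_logits=False), the value computed for each example is -sum(y_true * log(y_pred)) over the class axis. The NumPy sketch below restates that formula for reference only; the epsilon clipping is an assumption that mirrors, but is not copied from, the library's internal numerical handling.
import numpy as np

def categorical_crossentropy_ref(y_true, y_pred, eps=1e-7):
    # Clip predictions away from 0 so log() stays finite; Keras applies a
    # similar (but not necessarily identical) epsilon internally.
    y_pred = np.clip(np.asarray(y_pred, dtype=np.float32), eps, 1.0)
    # Per-example loss: negative log-probability assigned to the true class.
    return -np.sum(np.asarray(y_true, dtype=np.float32) * np.log(y_pred), axis=-1)
Applied to the values in the standalone usage below, this reproduces approximately [0.0513, 2.303].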
Standalone usage:
y_true = [[0, 1, 0], [0, 0, 1]]
y_pred = [[0.05, 0.95, 0], [0.1, 0.8, 0.1]]
loss = tf.keras.losses.categorical_crossentropy(y_true, y_pred)
assert loss.shape == (2,)
loss.numpy()
array([0.0513, 2.303], dtype=float32)
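Usage with logits (a sketch; the raw scores below are illustrative values, not taken from the reference documentation). Passing from_logits=True tells the function that y_pred contains unnormalized scores, so the softmax is applied internally in a numerically stable way:
# Illustrative raw (pre-softmax) scores for the same y_true as above.
logits = [[-3.0, 0.0, -3.0], [0.0, 2.0, 0.0]]
loss_from_logits = tf.keras.losses.categorical_crossentropy(
    y_true, logits, from_logits=True)
loss_from_logits.numpy()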
Args | |
---|---|
y_true | Tensor of one-hot true targets. |
y_pred | Tensor of predicted targets. |
from_logits | Whether y_pred is expected to be a logits tensor. By default, we assume that y_pred encodes a probability distribution. |
label_smoothing | Float in [0, 1]. If > 0, the labels are smoothed by mixing them with a uniform distribution over the classes. |
Returns | |
---|---|
Categorical crossentropy loss value. |
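A sketch of label_smoothing, reusing y_true and y_pred from the standalone usage above. With label_smoothing=0.1 and 3 classes, the one-hot targets are softened before the loss is computed, roughly equivalent to replacing them with y_true * (1 - 0.1) + 0.1 / 3; treat that exact rule as an assumption about the implementation rather than a documented guarantee:
# Smoothed targets spread a small amount of probability mass over all classes.
smoothed_loss = tf.keras.losses.categorical_crossentropy(
    y_true, y_pred, label_smoothing=0.1)
smoothed_loss.numpy()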
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/keras/losses/categorical_crossentropy