tf.keras.losses.Reduction
Types of loss reduction.
Contains the following values:
AUTO: Indicates that the reduction option will be determined by the usage context. For almost all cases this defaults to SUM_OVER_BATCH_SIZE. When used with tf.distribute.Strategy, outside of built-in training loops such as tf.keras compile and fit, we expect the reduction value to be SUM or NONE. Using AUTO in that case will raise an error.
NONE: Weighted losses with one dimension reduced (axis=-1, or the axis specified by the loss function). When this reduction type is used with built-in Keras training loops like fit/evaluate, the unreduced vector loss is passed to the optimizer, but the reported loss will be a scalar value.
SUM: Scalar sum of weighted losses.
SUM_OVER_BATCH_SIZE: Scalar SUM divided by the number of elements in the losses. This reduction type is not supported when used with tf.distribute.Strategy outside of built-in training loops like tf.keras compile/fit. You can implement 'SUM_OVER_BATCH_SIZE' using the global batch size like:
with strategy.scope():
  loss_obj = tf.keras.losses.CategoricalCrossentropy(
      reduction=tf.keras.losses.Reduction.NONE)
  ....
  loss = tf.reduce_sum(loss_obj(labels, predictions)) * (
      1. / global_batch_size)
Please see the custom training guide for more details on this.
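
As a rough illustration of the value descriptions above, the following sketch (assuming TensorFlow 2.x with eager execution, and using tf.keras.losses.MeanSquaredError purely as an arbitrary example loss, not something prescribed by this page) builds the same loss with NONE, SUM, and SUM_OVER_BATCH_SIZE and shows how the returned loss differs:

import tensorflow as tf

y_true = tf.constant([[0., 1.], [1., 0.]])
y_pred = tf.constant([[0.1, 0.9], [0.6, 0.4]])

# NONE: one weighted loss value per example (shape [2] here),
# i.e. only the last dimension has been reduced.
mse_none = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.NONE)
print(mse_none(y_true, y_pred))  # roughly [0.01, 0.16]

# SUM: scalar sum of the per-example losses.
mse_sum = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.SUM)
print(mse_sum(y_true, y_pred))   # roughly 0.17

# SUM_OVER_BATCH_SIZE: the sum divided by the number of loss elements.
mse_mean = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.SUM_OVER_BATCH_SIZE)
print(mse_mean(y_true, y_pred))  # roughly 0.085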
Methods

all

@classmethod
all()

validate

@classmethod
validate(
    key
)
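
This page does not describe the behavior of these methods; as a hedged sketch, all() is expected to return the supported reduction keys and validate(key) to raise a ValueError for an unrecognized key:

import tensorflow as tf

Reduction = tf.keras.losses.Reduction

# Expected to list every supported reduction key,
# e.g. ('auto', 'none', 'sum', 'sum_over_batch_size').
print(Reduction.all())

# A known key should pass validation silently ...
Reduction.validate(Reduction.SUM)

# ... while an unknown key is expected to raise ValueError.
try:
    Reduction.validate('not_a_reduction')
except ValueError as err:
    print(err)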
Class Variables

AUTO = 'auto'
NONE = 'none'
SUM = 'sum'
SUM_OVER_BATCH_SIZE = 'sum_over_batch_size'
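
Since each class variable is just its lowercase string value, passing the string literal to a loss constructor should behave the same as passing the constant. A minimal sketch, again using tf.keras.losses.MeanSquaredError only as an example:

import tensorflow as tf

# Reduction.SUM is simply the string 'sum', so these two loss objects
# should be configured identically.
loss_a = tf.keras.losses.MeanSquaredError(
    reduction=tf.keras.losses.Reduction.SUM)
loss_b = tf.keras.losses.MeanSquaredError(reduction='sum')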
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
 https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/keras/losses/Reduction