tf.keras.losses.squared_hinge
Computes the squared hinge loss between y_true and y_pred.
tf.keras.losses.squared_hinge(
    y_true, y_pred
)
loss = mean(square(maximum(1 - y_true * y_pred, 0)), axis=-1)
Standalone usage:
import numpy as np
import tensorflow as tf

y_true = np.random.choice([-1, 1], size=(2, 3))
y_pred = np.random.random(size=(2, 3))
loss = tf.keras.losses.squared_hinge(y_true, y_pred)
assert loss.shape == (2,)
assert np.array_equal(
    loss.numpy(),
    np.mean(np.square(np.maximum(1. - y_true * y_pred, 0.)), axis=-1))
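Binary 0/1 labels are also accepted and are converted to -1/1 internally (see the y_true argument below). A minimal sketch, reusing y_pred from above:
y_true_binary = np.random.choice([0, 1], size=(2, 3))  # 0/1 labels, converted to -1/1 internally
loss = tf.keras.losses.squared_hinge(y_true_binary, y_pred)
assert loss.shape == (2,)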
| Args | |
|---|---|
| y_true | The ground truth values. y_true values are expected to be -1 or 1. If binary (0 or 1) labels are provided, they will be converted to -1 or 1. shape = [batch_size, d0, .. dN]. |
| y_pred | The predicted values. shape = [batch_size, d0, .. dN]. | 
| Returns | |
|---|---|
| Squared hinge loss values. shape = [batch_size, d0, .. dN-1]. | 
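The function can also be passed directly to Model.compile as the loss. A minimal sketch, assuming a toy single-output model:
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer='sgd', loss=tf.keras.losses.squared_hinge)
# Equivalent class-based form: loss=tf.keras.losses.SquaredHinge()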
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
 https://www.tensorflow.org/versions/r2.4/api_docs/python/tf/keras/losses/squared_hinge