tf.raw_ops.TPUEmbeddingActivations
An op enabling differentiation of TPU Embeddings.
tf.raw_ops.TPUEmbeddingActivations(
    embedding_variable, sliced_activations, table_id, lookup_id, name=None
)
This op simply returns its sliced_activations input, which is assumed to have been sliced from the Tensors returned by TPUEmbeddingDequeueActivations. The presence of this op, with its first argument being a trainable Variable, enables automatic differentiation of graphs containing embeddings via the TPU Embedding Python libraries.
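As a rough illustration, the sketch below constructs the op directly. This is only a hedged sketch: in practice the op is inserted by the TPU Embedding Python libraries rather than written by hand, executing it requires a TPU with an embedding layer configuration, and the shapes, table_id, and lookup_id values here are assumptions made for illustration.

    import tensorflow as tf

    # Assumed shapes: an 8-row, 4-dimensional embedding table and two rows of
    # activations sliced from the dequeued TPU embedding activations.
    embedding_variable = tf.Variable(tf.zeros([8, 4], dtype=tf.float32))
    sliced_activations = tf.zeros([2, 4], dtype=tf.float32)

    # The op returns the activations; passing the trainable variable as the
    # first argument is what lets the TPU Embedding libraries route gradients
    # back to the embedding table during differentiation.
    activations = tf.raw_ops.TPUEmbeddingActivations(
        embedding_variable=embedding_variable,
        sliced_activations=sliced_activations,
        table_id=0,   # table index in the embedding configuration (assumed)
        lookup_id=0,  # index of the lookup that produced these activations (assumed)
    )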
| Args | |
|---|---|
| embedding_variable | A Tensor of type float32. A trainable variable, enabling optimizers to find this op. |
| sliced_activations | A Tensor of type float32. The embedding activations Tensor to return. |
| table_id | An int that is >= 0. The id of the table in the embedding layer configuration from which these activations were computed. |
| lookup_id | An int that is >= 0. Identifier of the set of embedding indices which produced these activations. |
| name | A name for the operation (optional). |
| Returns |
|---|
| A Tensor of type float32. |
© 2020 The TensorFlow Authors. All rights reserved.
Licensed under the Creative Commons Attribution License 3.0.
Code samples licensed under the Apache 2.0 License.
https://www.tensorflow.org/versions/r2.3/api_docs/python/tf/raw_ops/TPUEmbeddingActivations