tf.contrib.layers.bow_encoder
Maps a sequence of symbols to a vector per example by averaging embeddings.
tf.contrib.layers.bow_encoder(
    ids,
    vocab_size,
    embed_dim,
    sparse_lookup=True,
    initializer=None,
    regularizer=None,
    trainable=True,
    scope=None,
    reuse=None
)
Args
ids: [batch_size, doc_length] Tensor or SparseTensor of type int32 or int64 with symbol ids.
vocab_size: Integer number of symbols in the vocabulary.
embed_dim: Integer number of dimensions for the embedding matrix.
sparse_lookup: bool; if True, converts ids to a SparseTensor and performs a sparse embedding lookup. This is usually faster, but not desirable if padding tokens should have an embedding. Empty rows are assigned a special embedding.
initializer: An initializer for the embeddings; if None, the default initializer for the current scope is used.
regularizer: Optional regularizer for the embeddings.
trainable: If True, also adds the variables to the graph collection GraphKeys.TRAINABLE_VARIABLES (see tf.Variable).
scope: Optional string specifying the variable scope for the op; required if reuse=True.
reuse: If True, variables inside the op will be reused.
Returns
A [batch_size, embed_dim] encoding Tensor produced by averaging the embeddings.
Raises
ValueError: If embed_dim or vocab_size is not specified.
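The core computation can be illustrated without TensorFlow: look up one embedding vector per symbol id and average over the document axis. Below is a minimal NumPy sketch of that averaging step (the function name `bow_encode` and the random embedding matrix are illustrative, not part of the API), assuming dense ids and ignoring the sparse-lookup and padding behavior described above.

```python
import numpy as np

def bow_encode(ids, embeddings):
    # ids: [batch_size, doc_length] array of int symbol ids.
    # embeddings: [vocab_size, embed_dim] embedding matrix.
    # Fancy indexing gathers a [batch_size, doc_length, embed_dim]
    # tensor of embeddings; averaging over axis 1 collapses each
    # document to a single [embed_dim] vector, as bow_encoder does.
    return embeddings[ids].mean(axis=1)

vocab_size, embed_dim = 10, 4
rng = np.random.default_rng(0)
emb = rng.normal(size=(vocab_size, embed_dim))

ids = np.array([[1, 2, 3],
                [4, 4, 0]])
encoding = bow_encode(ids, emb)
print(encoding.shape)  # (2, 4): one embed_dim-sized vector per example
```

Note that in this dense sketch every position contributes to the average, including any padding id; the real op's sparse_lookup=True path instead drops padding from the average, which is why the docstring warns that sparse lookup is undesirable when padding tokens should have an embedding.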