
Module: tfra.dynamic_embedding.shadow_ops

Dynamic Embedding is designed for large-scale sparse weight training.

See Sparse Domain Isolation

This file is introduced as shadow_ops, a submodule of dynamic_embedding.

In TensorFlow 2.x, tf.function was introduced to speed up computation, and modular programming based on tf.Module is recommended for its Pythonic-style APIs. However, APIs such as embedding_lookup, embedding_lookup_unique, embedding_lookup_sparse, and safe_embedding_lookup_sparse in dynamic_embedding are wrappers of embedding_lookup, and they create a TrainableWrapper object inside the function, which does not meet the requirements of tf.function.

The shadow_ops submodule is designed to support usage with tf.function and modular-style development, such as Keras.
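A minimal sketch of the intended usage pattern (assuming TFRA is installed as tensorflow_recommenders_addons; the variable name, dimension, and initializer below are illustrative): the ShadowVariable is created once, eagerly, outside the traced function, so that shadow_ops.embedding_lookup can be called inside tf.function without constructing a TrainableWrapper during tracing.

```python
import tensorflow as tf
import tensorflow_recommenders_addons as tfra

de = tfra.dynamic_embedding

# Dynamic embedding variable holding the large-scale sparse weights.
params = de.get_variable(name='sparse_weights', dim=8,
                         initializer=tf.keras.initializers.Zeros())

# Eager, persistent twin of TrainableWrapper, created once outside tf.function.
shadow = de.shadow_ops.ShadowVariable(params)

@tf.function
def lookup(ids):
  # Reuses the existing shadow variable; no TrainableWrapper is created
  # inside the traced function.
  return de.shadow_ops.embedding_lookup(shadow, ids)

embeddings = lookup(tf.constant([1, 2, 3], dtype=tf.int64))
```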

Classes

class ShadowVariable: ShadowVariable is an eager persistent twin of TrainableWrapper.

Functions

embedding_lookup(...): Shadow version of dynamic_embedding.embedding_lookup. It uses an existing shadow variable to look up embeddings from the dynamic embedding variable.
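As a hedged illustration of how this lookup composes with training (not taken from the original page; details may differ between TFRA and TensorFlow versions): gradients are taken with respect to the shadow variable, and a regular Keras optimizer is wrapped with tfra.dynamic_embedding.DynamicEmbeddingOptimizer so that sparse updates reach the underlying dynamic embedding variable. The model head and loss below are placeholders.

```python
import tensorflow as tf
import tensorflow_recommenders_addons as tfra

de = tfra.dynamic_embedding

params = de.get_variable(name='sparse_weights', dim=8)
shadow = de.shadow_ops.ShadowVariable(params)

# Wrap a regular optimizer so it can route sparse updates back to `params`.
optimizer = de.DynamicEmbeddingOptimizer(tf.keras.optimizers.Adam(1e-3))

@tf.function
def train_step(ids, labels):
  with tf.GradientTape() as tape:
    emb = de.shadow_ops.embedding_lookup(shadow, ids)  # shape: (batch, dim)
    logits = tf.reduce_sum(emb, axis=1)                # toy model head
    loss = tf.reduce_mean(tf.square(logits - labels))
  # Differentiate with respect to the shadow variable, which stands in
  # for the dynamically looked-up embedding weights.
  grads = tape.gradient(loss, [shadow])
  optimizer.apply_gradients(zip(grads, [shadow]))
  return loss

loss = train_step(tf.constant([1, 2, 3], dtype=tf.int64),
                  tf.constant([0.0, 1.0, 0.0]))
```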