Replies: 3 comments · 1 reply
-
By assuming `user_embeddings` is already a list (see the model sketch below):

```python
warmup_checkpoint.restore(restore_path, var_list=model.user_embeddings)
```
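For that assumption to hold, the model would need to track its embedding variables in a list attribute. A minimal sketch, assuming TFRA dynamic embeddings; the `user_id` name and `dim` value are placeholders, not from the original thread:

```python
import tensorflow as tf
import tensorflow_recommenders_addons as tfra


class Model(tf.keras.Model):
    def __init__(self):
        super().__init__()
        # Tracked list attribute, so it can be passed as var_list above.
        self.user_embeddings = [
            tfra.dynamic_embedding.get_variable(name='user_id', dim=32),
        ]
```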
  - Hi @dakabang, thank you for your work!
-
I have talked with @Lifann; we found that Keras has no warm-start mechanism, so we propose something like:

```python
model = Model()
checkpoint = tf.train.Checkpoint(model)
restore_emb = tfra.dynamic_embedding.warm_start(
    ckpt_to_initialize_from='/path/to/ckpt/',
    vars_to_warm_start='*user_id*')
```
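For reference, the proposed signature appears to mirror the estimator-era warm-start utility that still exists under `tf.compat.v1`; note that the existing API matches variable names by regex rather than glob. A sketch of that existing utility (graph-mode/TF1-style), not of the TFRA proposal itself:

```python
import tensorflow as tf

# Existing TF1-style warm start: before training, copy every variable
# whose name matches the regex from the checkpoint into the current
# graph; everything else keeps its fresh initialization.
tf.compat.v1.train.warm_start(
    ckpt_to_initialize_from='/path/to/ckpt/',
    vars_to_warm_start='.*user_id.*')
```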
-
de.Variable warmup restore (TF2)
Proposed API: `tfra.WarmupCheckpoint`
In TF2.x we use `tf.train.Checkpoint` for saving and restoring models, but I found it a little hard to use `tf.train.Checkpoint` to restore only part of a model for transfer learning. In recommendation systems, a common trick to speed up model convergence is embedding warmup: the embedding tables are restored from a previous checkpoint while the dense part of the model is trained from scratch, hence the need for a partial restore. The API design for partial restore is demonstrated below.

API design
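The code block for the design did not survive extraction; what follows is a minimal sketch reconstructed from the `tfra.WarmupCheckpoint` name above and the `restore(..., var_list=...)` call discussed in the replies. The class name and arguments are part of the proposal, not an existing TFRA API:

```python
import tensorflow as tf
import tensorflow_recommenders_addons as tfra

model = Model()  # Keras model whose embeddings are de.Variable objects

# Regular full checkpoint for normal save/restore of the whole model.
checkpoint = tf.train.Checkpoint(model=model)

# Proposed: restore only the embedding variables from a previous run,
# leaving the dense layers at their fresh initialization.
warmup_checkpoint = tfra.WarmupCheckpoint(model)
warmup_checkpoint.restore('/path/to/ckpt/', var_list=model.user_embeddings)
```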
Possible implementation (pseudo code)
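The pseudo code itself is also missing from the extracted page. One way to get the same effect with today's object-based checkpointing is to mirror the saved object path for just the embeddings; a minimal sketch, assuming the checkpoint was written as `tf.train.Checkpoint(model=model)` and the embeddings live at `model.user_embeddings`:

```python
import tensorflow as tf


def warmup_restore(ckpt_path, user_embeddings):
    """Partially restore only the embedding variables.

    Builds a Checkpoint that tracks the embeddings under the same
    object path ('model/user_embeddings') they were saved with, so
    only those entries are read from the checkpoint.
    """
    partial = tf.train.Checkpoint(
        model=tf.train.Checkpoint(user_embeddings=user_embeddings))
    # expect_partial() silences warnings about the dense variables
    # that are deliberately left unrestored.
    partial.restore(ckpt_path).expect_partial()
```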