I'd like to add the AdaMod optimizer to the keras-contrib optimizers.
Paper reference: https://arxiv.org/abs/1910.12249
I modified the Adam optimizer code from the main Keras repo, adding an exponential moving average of past per-parameter learning rates (controlled by a beta_3 coefficient) and the clamping of each step's learning rates against that average, as described in the paper. A sketch of the update rule follows below.
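For reference, here is a minimal NumPy sketch of one AdaMod step as given in the paper's Algorithm 1; this is illustrative pseudocode rather than the actual Keras implementation in my branch, and the names are my own:

```python
import numpy as np

def adamod_update(theta, grad, m, v, s, t, lr=1e-3,
                  beta_1=0.9, beta_2=0.999, beta_3=0.999, eps=1e-8):
    """One AdaMod step for parameters theta at iteration t (t >= 1)."""
    m = beta_1 * m + (1 - beta_1) * grad        # first moment, as in Adam
    v = beta_2 * v + (1 - beta_2) * grad ** 2   # second moment, as in Adam
    m_hat = m / (1 - beta_1 ** t)               # bias-corrected moments
    v_hat = v / (1 - beta_2 ** t)
    eta = lr / (np.sqrt(v_hat) + eps)           # per-parameter step size
    s = beta_3 * s + (1 - beta_3) * eta         # running average of past step sizes
    eta_hat = np.minimum(eta, s)                # clamp step size by that average
    theta = theta - eta_hat * m_hat
    return theta, m, v, s
```

The clamp is what bounds the unstable, overly large learning rates early in training, and beta_3 controls the memory of the average (beta_3 = 0 recovers plain Adam).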
Here is my current branch: https://github.com/mffigueroa/keras-contrib/commits/user/mffigueroa/adamod