
Square LR-Schedule #70

Open
ClashLuke opened this issue Aug 13, 2022 · 0 comments
Labels
core: Improves core model while keeping core idea intact
engineering: Software-engineering problems that don't require ML-Expertise

Comments

@ClashLuke (Member)

Our learning rate scheduler currently uses a linear ramp-up followed by an exponential drop-off, so our learning rate curve looks like the following:
[figure: learning rate over training steps, linear warmup followed by exponential decay]
where the durations of the initial ramp-up and of the decay are tunable hyperparameters.
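For reference, the current behaviour can be sketched roughly as follows. This is a minimal illustration, not the repo's actual implementation; the function name, the constants, and the exact decay parameterization are all assumptions:

```python
import math

def current_schedule(step, peak_lr=1e-3, warmup_steps=1000, decay_rate=1e-4):
    """Sketch of a linear-warmup / exponential-decay LR schedule.

    All names and constants here are illustrative, not the project's API.
    """
    if step < warmup_steps:
        # Linear ramp-up from 0 to peak_lr over warmup_steps.
        return peak_lr * step / warmup_steps
    # Exponential drop-off after the warmup phase.
    return peak_lr * math.exp(-decay_rate * (step - warmup_steps))
```

The two phases meet at `step == warmup_steps`, where both branches evaluate to `peak_lr`.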

However, others have pointed out that a square ramp-up and square decay can perform significantly better, so we might want to use them instead. The modified curve (orange) would look like the following:
[figure: proposed square ramp-up and square decay curve (orange) compared to the current schedule]
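The proposed schedule could be sketched as below. Since the issue does not pin down the exact formula, this assumes "square ramp-up" means a quadratic polynomial warmup and "square decay" means an inverse-square polynomial decay; the function name and constants are hypothetical:

```python
def square_schedule(step, peak_lr=1e-3, warmup_steps=1000, decay_steps=10000):
    """Sketch of a square-warmup / square-decay LR schedule (assumed form).

    Warmup: lr grows as (step / warmup_steps) ** 2 up to peak_lr.
    Decay:  lr falls as an inverse-square polynomial of the steps past warmup.
    """
    if step < warmup_steps:
        # Quadratic ramp-up: slow start, accelerating toward peak_lr.
        return peak_lr * (step / warmup_steps) ** 2
    # Inverse-square drop-off: heavier tail than exponential decay.
    return peak_lr * (decay_steps / (decay_steps + step - warmup_steps)) ** 2
```

Compared to the current schedule, the quadratic warmup starts more gently and the inverse-square decay keeps a larger learning rate late in training.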

@ClashLuke ClashLuke added engineering Software-engineering problems that don't require ML-Expertise core Improves core model while keeping core idea intact labels Aug 13, 2022
@ClashLuke ClashLuke changed the title Inverse Square Root LR-Schedule Square LR-Schedule Aug 13, 2022