
Added missing parameter for llama function call #1663

Merged: 3 commits into v1.15-release from fix_param on Dec 24, 2024

Conversation

yeonsily (Collaborator)

What does this PR do?

Since PR #1148, the 'training' parameter needs to be set when calling apply_customized_rope(); otherwise it defaults to True.
However, when apply_customized_rope() is called in modeling_llama, this parameter is not set.
I think it was missed by mistake, as the other model files were updated properly.
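As a rough illustration of the change (a hypothetical sketch, not the actual diff; the surrounding function and tensor names are illustrative and the real signature of apply_customized_rope() in optimum-habana may differ), the call site forwards the module's training flag instead of relying on the default:

```python
# Hypothetical sketch of the call-site fix in modeling_llama; only
# apply_customized_rope() and its `training` parameter come from the PR,
# everything else here is illustrative.
def _apply_rope_in_attention(self, query_states, key_states, cos, sin, position_ids):
    # Before this PR: `training` was omitted, so it silently defaulted to True
    # even during inference.
    query_states, key_states = apply_customized_rope(
        query_states,
        key_states,
        cos,
        sin,
        position_ids,
        training=self.training,  # forward the module's actual training mode
    )
    return query_states, key_states
```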

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?

@regisss regisss merged commit ee8f408 into v1.15-release Dec 24, 2024
1 check passed
@regisss regisss deleted the fix_param branch December 24, 2024 15:34