a confusing issue #8
Comments
This error is usually caused by calling
I followed the inference instructions precisely, but the issue remained...
Could you please check your transformers version? The RoPE API for Llama changed again after 4.38. (Actually, it keeps changing... from 4.35 to 4.36, to 4.37, to 4.38... almost every recent transformers release has a new RoPE implementation for Llama. 😓)
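As a quick sanity check before running, you can gate on the installed transformers version. The 4.36 threshold below is an assumption drawn only from this thread's report that the Llama RoPE interface shifted across the 4.35-4.38 releases; it is not an authoritative compatibility table, so verify against your own install.

```python
# Hedged sketch: flag transformers versions whose Llama RoPE API may differ
# from what this repo's code expects. The (4, 36) threshold is an assumption
# based on this thread, not an official compatibility statement.

def parse_major_minor(version: str):
    """Return (major, minor) from a version string like '4.38.2'."""
    parts = version.split(".")
    return int(parts[0]), int(parts[1])

def rope_api_may_differ(version: str) -> bool:
    """True if the installed version postdates the reported RoPE changes."""
    return parse_major_minor(version) >= (4, 36)

if __name__ == "__main__":
    # In practice you would pass importlib.metadata.version("transformers").
    for v in ("4.35.0", "4.38.2"):
        print(v, rope_api_may_differ(v))
```

In a real script, feed `importlib.metadata.version("transformers")` into `rope_api_may_differ` and warn (or pin the dependency) when it returns True.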
Hey guys, the code works in my environment. My transformers version is
Please use Flash Attention for processing longer input:
Thank you all! 😄 Successful environment:
If there are no further questions or follow-up discussions, I will close this issue shortly. Thank you all for your contributions and participation.
Inference is correct, but when fine-tuning, the error comes up again:
cos, sin = self.rotary_emb(value_states, seq_len=kv_seq_len)
ValueError: too many values to unpack (expected 2)
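One way to sidestep this unpack error, whatever extra values a given transformers release returns from `rotary_emb`, is to index the first two entries instead of tuple-unpacking. The functions below are hypothetical stand-ins that only mimic the reported return-shape change; they are not the real transformers implementations, whose rotary embedding lives inside the Llama attention module.

```python
# Hypothetical stand-ins mimicking the reported API shape change; the real
# rotary_emb is part of transformers' Llama attention code.

def rotary_emb_old(value_states, seq_len):
    # pre-change shape: exactly (cos, sin)
    return ("cos", "sin")

def rotary_emb_new(value_states, seq_len):
    # assumed post-change shape: an extra value appended, so
    # `cos, sin = rotary_emb_new(...)` raises
    # ValueError: too many values to unpack (expected 2)
    return ("cos", "sin", "extra")

def get_cos_sin(rope_fn, value_states, seq_len):
    out = rope_fn(value_states, seq_len)
    # Index rather than unpack so both return shapes are accepted.
    return out[0], out[1]
```

That said, the more reliable fix reported in this thread is pinning transformers to the version the repo was tested against rather than patching the call site.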
I followed the instructions in the Full inference code, but then I encountered this issue.
How can I fix this?