[FIXED] Flash-Attn breaks with flash_attn_gpu
#1437
Comments
Try uninstalling and reinstalling Unsloth. If the error still persists, let us know.
The error message is still happening.
I have the same problem, but with the latest flash-attn version; with 2.7.0.post2 it works fine (WSL).
Maybe edit this file: I think there is no more flash_attn_cuda; instead we have flash_attn_2_cuda.
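A minimal sketch of that suggestion (my own illustration, not the library's actual code): try the compiled-extension name used by newer flash-attn builds first, then fall back to the older name, so the same import works on both.

```python
# Sketch only: prefer the newer extension module name, fall back to the old one.
try:
    import flash_attn_2_cuda as flash_attn_gpu  # newer flash-attn builds
except ImportError:
    import flash_attn_cuda as flash_attn_gpu    # older builds
```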
I'm getting the same thing. In Dao-AILab/flash-attention#1203 (specific location). For me, I did a different workaround of re-adding `flash_attn_cuda`; I just edited:

```diff
 USE_TRITON_ROCM = os.getenv("FLASH_ATTENTION_TRITON_AMD_ENABLE", "FALSE") == "TRUE"
 if USE_TRITON_ROCM:
     from .flash_attn_triton_amd import interface_fa as flash_attn_gpu
 else:
     import flash_attn_2_cuda as flash_attn_gpu
+flash_attn_cuda = flash_attn_gpu
```
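An alternative to editing the installed file is a runtime shim. This is only a sketch based on the rename described above, and it assumes the failing import just needs the old flash_attn_cuda attribute back:

```python
# Sketch: re-create the old alias at runtime instead of patching
# flash_attn_interface.py on disk, then import Unsloth as usual.
import flash_attn.flash_attn_interface as fai

if not hasattr(fai, "flash_attn_cuda") and hasattr(fai, "flash_attn_gpu"):
    fai.flash_attn_cuda = fai.flash_attn_gpu  # restore the removed name

import unsloth  # assumption: something downstream looks up flash_attn_cuda
```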
Same here. Got a warning that flash-attn was not installed. I used `pip install flash-attn --no-build-isolation` to install the package. Solved with the workaround mentioned above.
Apologies a lot on this - and sorry I missed this entirely - I added a fix to the nightly branch - so sorry on the issue!
+1
I just fixed it! Apologies on the delay! For local machines, please update Unsloth via:

```
pip install --upgrade --no-cache-dir --force-reinstall --no-deps unsloth unsloth_zoo
```
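To confirm whether the rename affects a given environment, a quick check like the following (my own sketch, not part of the thread) prints the installed flash-attn version and which binding name it exposes:

```python
# Sketch: report which attribute the installed flash-attn exposes.
import flash_attn
import flash_attn.flash_attn_interface as fai

print("flash-attn version:", flash_attn.__version__)
print("has flash_attn_gpu: ", hasattr(fai, "flash_attn_gpu"))
print("has flash_attn_cuda:", hasattr(fai, "flash_attn_cuda"))
```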
I have installed Unsloth with the wget command and I still get this warning message.