
use another flash_attn
CuriousPanCake committed Nov 13, 2024
1 parent ecb5f47 commit aabe792
Showing 1 changed file with 1 addition and 1 deletion.
.github/workflows/job_pytorch_layer_tests.yml (1 addition, 1 deletion)
@@ -123,7 +123,7 @@ jobs:
       run: |
         # due to flash_attn issues, it needs to be installed separately from other packages
         # pip install flash_attn --no-build-isolation
-        pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu123torch2.4cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
+        pip install https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.0.post1/flash_attn-2.7.0.post1+cu12torch2.5cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
       - name: PyTorch Layer Tests
         if: ${{ fromJSON(inputs.affected-components).PyTorch_FE.test && runner.arch != 'ARM64' }} # Ticket: 126287, 142196
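The wheel filename encodes the build it was compiled against: CUDA toolkit, torch minor version, C++11 ABI flag, and CPython tag. A wheel that does not match the runner's environment fails at import time, which is why the pinned URL changes here alongside the torch upgrade (note the release naming also shifted, from cu123torch2.4 to cu12torch2.5). Below is a minimal sketch, not part of this commit, for printing the tags a runner's environment would need to match; the tag-derivation logic is an illustration of the filename convention, assuming a CUDA-enabled torch build:

# Sketch (illustrative, not from this commit): derive the environment tags
# that must match a flash_attn wheel filename such as
# flash_attn-2.7.0.post1+cu12torch2.5cxx11abiTRUE-cp312-cp312-linux_x86_64.whl
import sys
import torch

# flash_attn prebuilt wheels require a CUDA build of torch
assert torch.version.cuda is not None, "CPU-only torch build; no matching wheel"

python_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"  # e.g. cp312
torch_tag = "torch" + ".".join(torch.__version__.split(".")[:2])    # e.g. torch2.5
cuda_tag = "cu" + torch.version.cuda.split(".")[0]                  # e.g. cu12 (v2.7.x wheels key on the CUDA major version)
abi_tag = "cxx11abiTRUE" if torch.compiled_with_cxx11_abi() else "cxx11abiFALSE"

print(python_tag, torch_tag, cuda_tag, abi_tag)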
