Commit f0ca163

update for v4 so we don't crash
ahmeda14960 committed Oct 23, 2024
1 parent 1706803, commit f0ca163
Showing 1 changed file with 1 addition and 1 deletion.
config/llama_7b_tulu.yaml: 2 changes (1 addition, 1 deletion)
@@ -14,7 +14,7 @@ model: # 7B class model
   num_heads: 32
   num_kv_heads: 32
   use_flash_attention: True
-  flash_attention_block_size: 1024
+  flash_attention_block_size: 512
   use_bias: false
   use_layer_norm_weight: false
 trainer:
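For reference, here is a sketch of how the model block of config/llama_7b_tulu.yaml reads after this change, reconstructed from the hunk above. Keys outside the hunk (including everything under trainer:) are not shown, and the inline comment about why the smaller block size helps is an assumption inferred from the commit message, not something stated in the diff itself.

model: # 7B class model
  num_heads: 32
  num_kv_heads: 32
  use_flash_attention: True
  flash_attention_block_size: 512  # was 1024; a smaller block presumably reduces the per-block working set enough that the "v4" target named in the commit message no longer crashes (assumption)
  use_bias: false
  use_layer_norm_weight: false
trainer:
  # remainder of the trainer section is truncated in the diff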
