Fix loading Mistral configuration with attention window disabled
jonatanklosko committed Feb 23, 2024
1 parent d3ad71e commit 715ad2f
Showing 1 changed file with 1 addition and 1 deletion.
lib/bumblebee/text/mistral.ex
@@ -393,7 +393,7 @@ defmodule Bumblebee.Text.Mistral do
       num_blocks: {"num_hidden_layers", number()},
       num_attention_heads: {"num_attention_heads", number()},
       num_key_value_heads: {"num_key_value_heads", number()},
-      attention_window_size: {"sliding_window", number()},
+      attention_window_size: {"sliding_window", optional(number())},
       intermediate_size: {"intermediate_size", number()},
       activation: {"hidden_act", activation()},
       rotary_embedding_base: {"rope_theta", number()},
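
For context, a checkpoint with the attention window disabled ships a Hugging Face config.json in which "sliding_window" is null, which the plain number() converter rejected when loading the spec; optional(number()) also accepts that null value. A minimal sketch of the expected behavior after this change, assuming a checkpoint that disables the window (the repository name and output below are illustrative, not taken from this commit):

    # Load the Mistral spec from a checkpoint whose config.json contains
    # "sliding_window": null (the repository name is an assumption for illustration)
    {:ok, spec} = Bumblebee.load_spec({:hf, "mistralai/Mistral-7B-Instruct-v0.2"})

    # With the window disabled, the option loads as nil instead of failing
    spec.attention_window_size
    #=> nil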
