Commit

[Hotfix][VLM] Fixing max position embeddings for Pixtral (vllm-project#8399)

Signed-off-by: Amit Garg <[email protected]>
ywang96 authored and garg-amit committed Oct 28, 2024
1 parent ec9b07f commit 7932801
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions vllm/transformers_utils/config.py
@@ -217,6 +217,8 @@ def recurse_elems(elem: Any):
     config_dict["tie_word_embeddings"] = config_dict.get(
         "tie_embeddings", False)
     config_dict["max_seq_len"] = config_dict.get("max_seq_len", 128_000)
+    config_dict["max_position_embeddings"] = config_dict.get(
+        "max_position_embeddings", 128_000)
 
     if config_dict.get("moe") is not None:
         config_dict["architectures"] = ["MixtralForCausalLM"]
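
For context, the two added lines mirror the get-with-default pattern already used for max_seq_len: when a Mistral-format params.json omits max_position_embeddings, the loaded config now defaults it to 128_000, so downstream code that reads that field for Pixtral does not hit a missing key. Below is a minimal, self-contained sketch of that fallback behavior; the helper name apply_positional_defaults and the sample_params dict are illustrative only and not part of the repository.

from typing import Any, Dict

def apply_positional_defaults(config_dict: Dict[str, Any]) -> Dict[str, Any]:
    # Mirror the fallback added in this commit: Mistral-format params.json
    # files may omit max_position_embeddings, so default it (like
    # max_seq_len) to 128_000 for downstream consumers.
    config_dict["max_seq_len"] = config_dict.get("max_seq_len", 128_000)
    config_dict["max_position_embeddings"] = config_dict.get(
        "max_position_embeddings", 128_000)
    return config_dict

# Example: a Pixtral-style params dict with no positional limits set
# (values here are placeholders, not taken from any real checkpoint).
sample_params = {"dim": 5120, "n_layers": 40, "tie_embeddings": False}
print(apply_positional_defaults(sample_params)["max_position_embeddings"])
# prints 128000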
