fix: tool_call was not encoded (#284)
Summary:

Test Plan:

Previously failing test: LLAMA_STACK_CONFIG=fireworks pytest -s -v tests/client-sdk/agents/test_agents.py::test_create_turn_response --safety-shield meta-llama/Llama-Guard-3-8B
ehhuang authored Feb 21, 2025
1 parent b6dad01 commit 2411407
Showing 2 changed files with 4 additions and 1 deletion.
3 changes: 2 additions & 1 deletion models/llama3/api/chat_format.py
@@ -149,8 +149,9 @@ def _process_content(c):
     def encode_dialog_prompt(
         self,
         messages: List[RawMessage],
-        tool_prompt_format: ToolPromptFormat = ToolPromptFormat.json,
+        tool_prompt_format: Optional[ToolPromptFormat] = None,
     ) -> LLMInput:
+        tool_prompt_format = tool_prompt_format or ToolPromptFormat.json
         tokens = []
         images = []
         tokens.append(self.tokenizer.special_tokens["<|begin_of_text|>"])
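The hunk above moves the default from the signature into the function body. A minimal sketch of why that matters (the enum and the encoder body below are simplified stand-ins, not the library's actual definitions): a default in the signature only applies when the argument is omitted, so a caller that explicitly passes tool_prompt_format=None used to push None straight into the encoder; with Optional[...] = None plus an in-body fallback, both call shapes resolve to json.

# Minimal sketch of the Optional-with-fallback pattern; the enum and the
# encoder body are simplified stand-ins for the real llama3 code.
from enum import Enum
from typing import List, Optional

class ToolPromptFormat(Enum):
    json = "json"
    function_tag = "function_tag"
    python_list = "python_list"

def encode_dialog_prompt(
    messages: List[str],
    tool_prompt_format: Optional[ToolPromptFormat] = None,
) -> List[str]:
    # An explicit None from the caller resolves to json here; a signature
    # default would only have covered the omitted-argument case.
    tool_prompt_format = tool_prompt_format or ToolPromptFormat.json
    return [f"<{tool_prompt_format.value}>{m}" for m in messages]

# Both calls now encode with the json format; before the fix, the second
# one carried None through to the tool-call encoder.
print(encode_dialog_prompt(["hi"]))
print(encode_dialog_prompt(["hi"], tool_prompt_format=None))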
2 changes: 2 additions & 0 deletions models/llama3/api/tool_utils.py
@@ -190,3 +190,5 @@ def format_value(value: RecursiveType) -> str:

         args_str = ", ".join(f"{k}={format_value(v)}" for k, v in t.arguments.items())
         return f"[{fname}({args_str})]"
+    else:
+        raise ValueError(f"Unsupported tool prompt format: {tool_prompt_format}")
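The two added lines turn a silent fall-through into a loud failure. A sketch of the guard in isolation (the function name, signature, and branches here are illustrative assumptions; the real code in tool_utils.py handles ToolCall objects and the full set of formats): without the else branch, Python falls off the end of the function and returns None for any format the if-chain does not handle, which is exactly how an unencoded tool_call can slip through.

# Illustrative sketch of the else-raise guard; names are assumptions, not
# the library's real signature.
from enum import Enum

class ToolPromptFormat(Enum):
    json = "json"
    python_list = "python_list"

def encode_tool_call(fname: str, arguments: dict, tool_prompt_format: ToolPromptFormat) -> str:
    if tool_prompt_format == ToolPromptFormat.python_list:
        args_str = ", ".join(f"{k}={v!r}" for k, v in arguments.items())
        return f"[{fname}({args_str})]"
    else:
        # Failing loudly beats silently returning None for an
        # unsupported format.
        raise ValueError(f"Unsupported tool prompt format: {tool_prompt_format}")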
