ChatAnthropic doesn't stream tool_call tokens as tool_call_chunks but normal chunk content #7484
Replies: 2 comments 5 replies
-
I found some relevant discussions and an issue related to your question:
To address your specific issue, you can use the following approach:

```typescript
import { AIMessageChunk } from "@langchain/core/messages";
import { extractToolUseContent } from "path/to/anthropic";

// Assume `chunk` is an AIMessageChunk received in the stream_events loop
let concatenatedChunks: AIMessageChunk | undefined;

const { toolUseContent, concatenatedChunks: updatedConcatenatedChunks } =
  extractToolUseContent(chunk, concatenatedChunks);

if (toolUseContent) {
  console.log("Tool call was made:", toolUseContent);
  // Process the tool call as needed
}

concatenatedChunks = updatedConcatenatedChunks;
```

This code checks each chunk for tool-use content, accumulating the streamed chunks as they arrive.
-
@dosu is there a way to limit my LLM to either return a tool_call OR respond normally, but never do both?
-
Checked other resources
Commit to Help
Example Code
Description
I'm using LangGraph for agent orchestration.
As a node, I have defined an agent that has some tools but can also respond without using them.
When this agent decides to use a tool, it still responds with normal chunk.content, which I also don't want, but I haven't found a clean way to disable that. As far as I know, the tool call is always streamed after the normal content, so in my stream_events loop I don't know beforehand whether the LLM will use a tool or not. But now comes the super confusing part:
The function-call chunks are also streamed via the normal chunk.content event. This means the generated JSON is streamed to the frontend, which is definitely not intended on our side. Shouldn't the function-call JSON be streamed via tool_call_chunks? This happens when I'm using the ChatAnthropic LLM. If someone can help me check in stream_events whether a tool_call was made or not, I would also highly appreciate it!
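One way to keep tool-call JSON out of the frontend, sketched below, is to gate what you forward on the shape of each streamed chunk. This is a minimal sketch, not a confirmed fix for the underlying issue: it assumes chunks carry tool-call deltas in a `tool_call_chunks` array and text either as a plain string or as an array of content blocks (as `AIMessageChunk` does), and `forwardableText` is a hypothetical helper name.

```typescript
// Minimal shapes for the parts of a streamed chunk we inspect.
// The real AIMessageChunk from @langchain/core carries more fields.
interface ContentBlock {
  type: string;
  text?: string;
  [key: string]: unknown;
}

interface StreamedChunk {
  content: string | ContentBlock[];
  tool_call_chunks?: Array<{ name?: string; args?: string }>;
}

// Return only the text that is safe to forward to the frontend:
// suppress anything that is (or accompanies) a tool-call delta.
function forwardableText(chunk: StreamedChunk): string {
  // If the chunk carries tool-call deltas, drop it entirely so the
  // partial function-call JSON never reaches the client.
  if (chunk.tool_call_chunks && chunk.tool_call_chunks.length > 0) {
    return "";
  }
  if (typeof chunk.content === "string") {
    return chunk.content;
  }
  // Anthropic may stream content as an array of blocks; keep only
  // text blocks and skip tool_use / input_json_delta blocks.
  return chunk.content
    .filter((block) => block.type === "text" || block.type === "text_delta")
    .map((block) => block.text ?? "")
    .join("");
}
```

In a stream_events loop you would call `forwardableText` on each chunk and only emit non-empty results to the client; everything else (tool-call JSON, tool_use blocks) stays server-side for your own tool-dispatch logic.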
System Info
MacBook Pro M2
MacOS 15
"@langchain/anthropic": "^0.3.11",
"@langchain/community": "^0.3.22",
"@langchain/core": "^0.3.27",
"@langchain/langgraph": "^0.2.39",
"@langchain/openai": "^0.3.16",