Replies: 3 comments
-
Hi
-
Hi! At the moment, streaming tool calls with Chat Completions isn't fully supported for "custom" tool types in the official API or SDK. The spec and current SDK implementations only explicitly list support for `type="function"` tool calls.

In short: if you need streaming of tool calls, or fine-grained streaming of custom tool invocations, one option is to use the Responses API instead. It has richer streaming events and better tooling support for advanced tool interactions, including custom tools, and it is the newer recommended interface for complex agent-style streaming.

Here's a minimal streaming snippet showing what is supported with function tools:

```python
from openai import OpenAI

client = OpenAI()

stream = client.chat.completions.create(
    model="gpt-5",
    messages=[{"role": "user", "content": "Hello"}],
    tools=[
        {"type": "function", "function": {"name": "my_tool"}}
    ],
    stream=True,
)

for chunk in stream:
    # text arrives chunk by chunk
    print(chunk.choices[0].delta)
```
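When the model does stream a function tool call, its arguments arrive as string fragments spread across chunks. Here is a minimal sketch of reassembling them; the helper name and the mock fragment dicts are illustrative stand-ins for real `choices[0].delta.tool_calls` entries, not part of the SDK:

```python
def merge_tool_call_deltas(deltas):
    """Merge streamed tool-call delta fragments into complete calls.

    Each fragment mimics a Chat Completions `delta.tool_calls[i]` entry:
    an `index`, an `id` and `function.name` on the first fragment, then
    `function.arguments` pieces to be concatenated in arrival order.
    """
    calls = {}
    for d in deltas:
        call = calls.setdefault(d["index"], {"id": None, "name": None, "arguments": ""})
        if d.get("id"):
            call["id"] = d["id"]
        fn = d.get("function", {})
        if fn.get("name"):
            call["name"] = fn["name"]
        call["arguments"] += fn.get("arguments", "")
    return [calls[i] for i in sorted(calls)]

# Mock fragments standing in for a real stream
fragments = [
    {"index": 0, "id": "call_1", "function": {"name": "my_tool", "arguments": ""}},
    {"index": 0, "function": {"arguments": '{"city": '}},
    {"index": 0, "function": {"arguments": '"Paris"}'}},
]
print(merge_tool_call_deltas(fragments))
```

Once the stream ends, each merged entry's `arguments` string can be parsed as JSON and dispatched to the matching tool.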
-
To answer your question directly: no, streaming of custom tool calls is not supported in Chat Completions. The streaming chunk schema (`choices[].delta.tool_calls`) only defines `type="function"` entries.

Why this matters for agent builders: the Chat Completions API was designed for a request-response model with function calling bolted on. The streaming path never gained a delta type for custom tool calls, so there is no schema under which the API could emit them incrementally.

What to use instead: the Responses API is the correct path for anything beyond basic function calling. It has typed streaming events (separate deltas for output text, tool-call arguments, and so on) and support for built-in tools alongside custom functions:

```python
from openai import OpenAI

client = OpenAI()

# Responses API with streaming
stream = client.responses.create(
    model="gpt-4o",
    input="Search for the latest AutoGen release",
    tools=[
        {"type": "web_search_preview"},  # built-in tool
        {"type": "function", "name": "my_tool", ...},  # custom function (Responses API tools use a flat shape)
    ],
    stream=True,
)

for event in stream:
    # Each event has a distinct type — no multiplexing
    if event.type == "response.output_text.delta":
        print(event.delta, end="")
    elif event.type == "response.function_call_arguments.delta":
        print(f"[tool arg chunk: {event.delta}]")
```

The practical tradeoff: if you're building an agent system and need streaming tool calls, migrate to the Responses API. Chat Completions with `"custom"` tools and `stream=True` is not a combination the current spec covers.

The SDK typing issues you noticed (`delta.tool_calls` being typed only for `type="function"`) stem from the same gap in the API definition, not from a bug in the SDK itself.
-
Context:

In the documentation here, an example is given for passing `"custom"` tools to models using the Chat Completions API. However, in the chat completion chunk object described here, `choices.delta.tool_calls` only seems to support `type="function"`, with no mention of `"custom"`. The same is reflected in the `openai` Python SDK, currently leading to, among other things, typing issues in downstream custom code.

Question:

So does this mean that streaming from Chat Completions while passing `"custom"` tools isn't supported? Or perhaps is the API definition incomplete?

Thanks in advance.