### Description

#### Summary
The `MCPServerStreamableHttp` class from `openai-agents-python` does not expose the MCP session ID that is created and managed internally by the underlying `streamablehttp_client` from the `mcp` SDK.

As a result, users have no way to persist or reuse the session ID between turns, which makes it impossible to maintain stateful interactions across multiple requests unless the exact same `MCPServerStreamableHttp` instance is reused for the whole conversation. That is often impractical, e.g. in stateless worker setups or async architectures.
#### Reproduction

```python
import asyncio

from agents.mcp.server import MCPServerStreamableHttp


async def main() -> None:
    srv = MCPServerStreamableHttp(name="demo", params={"url": "http://localhost:8080/mcp"})
    async with srv:
        print(hasattr(srv, "session_id"))       # ➜ False
        print(hasattr(srv, "_get_session_id"))  # ➜ False


asyncio.run(main())
```
#### Where the callback is lost

- **mcp SDK**: `streamable_http.py`'s `streamablehttp_client` yields `(read, write, get_session_id)` (see the sketch below).
- **openai-agents**: `server.py` unpacks the tuple but keeps only `read` and `write`; `get_session_id` is discarded.
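For reference, this is roughly how the callback surfaces when the MCP SDK client is used directly; a minimal sketch with a placeholder URL and no error handling:

```python
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def raw_sdk_example() -> None:
    # The third element yielded by streamablehttp_client is a zero-argument
    # callable that returns the session ID negotiated with the server (or None).
    async with streamablehttp_client("http://localhost:8080/mcp") as (read, write, get_session_id):
        async with ClientSession(read, write) as session:
            await session.initialize()
            print(get_session_id())  # e.g. "3c72e57209aa43348ae41bed0b1d3d9b"
```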
#### Impact

- The only supported way to keep a stateful MCP session is to reuse the same `MCPServerStreamableHttp` instance for the entire chat (see the sketch below).
- If a new instance is opened per user turn (to avoid `ClosedResourceError`, or to run in stateless workers), the server issues a fresh session ID every time and the conversation context is lost.
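To make the current workaround concrete, here is a sketch of a chat loop that keeps one long-lived instance alive for every turn. The agent/runner wiring uses the public Agents SDK API, but the names (`chat_loop`, `user_turns`) and the overall shape are illustrative assumptions, not code from the library:

```python
from agents import Agent, Runner
from agents.mcp.server import MCPServerStreamableHttp


async def chat_loop(user_turns: list[str]) -> None:
    # The single `async with` block is what keeps the MCP session alive;
    # every turn must run inside it, which rules out stateless per-turn workers.
    async with MCPServerStreamableHttp(
        name="demo", params={"url": "http://localhost:8080/mcp"}
    ) as srv:
        agent = Agent(name="assistant", mcp_servers=[srv])
        for turn in user_turns:
            result = await Runner.run(agent, turn)
            print(result.final_output)
```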
#### Proposed fix

Store the callback and expose it:

```python
class MCPServerStreamableHttp(...):
    async def _create_streams(...):
        read, write, get_sid = await streamablehttp_client(...)
        self._get_session_id = get_sid  # preserve it
        return read, write

    @property
    def session_id(self) -> str | None:
        if hasattr(self, "_get_session_id"):
            return self._get_session_id()
        return None
```
This keeps backward compatibility while allowing users to:
```python
async with MCPServerStreamableHttp(...) as srv:
    # after first response
    sid = srv.session_id  # "3c72e57209aa43348ae41bed0b1d3d9b"
```
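A stateless worker could then persist that ID and hand it back on the next turn. The sketch below assumes the streamable HTTP transport's `Mcp-Session-Id` header, that the `params` dict forwards a `headers` mapping to `streamablehttp_client`, and that the server accepts a replayed session ID from a new connection; `SESSION_STORE` and `handle_turn` are hypothetical names used only for illustration:

```python
from agents.mcp.server import MCPServerStreamableHttp

SESSION_STORE: dict[str, str] = {}  # hypothetical persistence layer, keyed by chat ID


async def handle_turn(chat_id: str, prompt: str) -> None:
    headers: dict[str, str] = {}
    if chat_id in SESSION_STORE:
        # Replay the previously negotiated session so the server keeps its state.
        headers["Mcp-Session-Id"] = SESSION_STORE[chat_id]

    async with MCPServerStreamableHttp(
        name="demo",
        params={"url": "http://localhost:8080/mcp", "headers": headers},
    ) as srv:
        ...  # run the agent for this turn
        if srv.session_id is not None:  # relies on the property proposed above
            SESSION_STORE[chat_id] = srv.session_id
```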
#### Environment

- openai-agents-python: v0.0.19
- modelcontextprotocol sdk: v1.9.4
- Python 3.11
#### Ask

Please expose the session ID via a public interface so developers can persist and reuse sessions for continuity across multiple requests.
Thanks for your time!