fix(langchain): prevent llmToolSelectorMiddleware from leaking into message stream#10160
Open
Youngho Kim (JadenKim-dev) wants to merge 6 commits into
Conversation
🦋 Changeset detected. Latest commit: ff99621. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
…essage stream
Pass `{ callbacks: [] }` to the internal structuredModel.invoke() call so
LangGraph's streaming callbacks are not inherited, preventing the tool
selection response from appearing as an assistant message when using
agent.stream() with streamMode "messages".
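The mechanism behind this commit can be shown with a self-contained sketch (toy types, not the actual LangChain API): an `undefined` `callbacks` falls back to the ambient handler inherited from the parent run, while an explicit empty array blocks that fallback.

```typescript
// Toy stand-ins for illustration; real LangChain config types differ.
type Callback = (chunk: string) => void;
interface Config {
  callbacks?: Callback[];
}

// Mimics ensureConfig(): an undefined `callbacks` falls back to the
// ambient (AsyncLocalStorage-style) config from the parent run.
function resolveCallbacks(explicit: Config, ambient: Config): Callback[] {
  return explicit.callbacks ?? ambient.callbacks ?? [];
}

const streamed: string[] = [];
const ambient: Config = { callbacks: [(chunk) => { streamed.push(chunk); }] };

// No override: the internal call inherits the stream handler and leaks.
for (const cb of resolveCallbacks({}, ambient)) cb('{"tools":["getWeather"]}');

// With `callbacks: []` nothing is inherited, so nothing reaches the stream.
for (const cb of resolveCallbacks({ callbacks: [] }, ambient)) cb("leaked?");

console.log(streamed); // only the first call's chunk was streamed
```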
Force-pushed from 0953c10 to 6c305b4
…erit runtime config
Merge the parent runnable config from runtime so LangSmith tracing and other callback-based consumers can properly track the internal tool selection call, while still overriding callbacks with an empty array to prevent streaming events from leaking to the UI.
Previously `callbacks: []` was spread after `config` in the invoke call, making the override intent implicit. Moving it into `mergeConfigs` as the second argument makes it clear that the empty array intentionally overrides any inherited callbacks from AsyncLocalStorage context.
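The reordering described in that commit can be illustrated with plain objects (a simplified stand-in for the real `mergeConfigs` from `@langchain/core`): both shapes produce the same result for these keys, but the two-argument form makes the override explicit.

```typescript
interface RunnableConfigLike {
  tags?: string[];
  metadata?: Record<string, unknown>;
  callbacks?: unknown[];
}

// Simplified merge: later configs override earlier ones key by key,
// roughly how a two-argument mergeConfigs call behaves for these keys.
function merge(
  base: RunnableConfigLike,
  override: RunnableConfigLike
): RunnableConfigLike {
  return { ...base, ...override };
}

const runtimeConfig: RunnableConfigLike = {
  tags: ["agent"],
  metadata: { run: 1 },
  callbacks: ["StreamMessagesHandler"], // inherited from the streaming parent
};

// Implicit form: the override is buried in a spread after `config`.
const implicit = { ...runtimeConfig, callbacks: [] };
// Explicit form: the second argument is clearly the override.
const explicit = merge(runtimeConfig, { callbacks: [] });

console.log(implicit.callbacks.length, explicit.callbacks?.length); // 0 0
console.log(explicit.tags); // parent tags survive the merge
```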
Christian Bromann (christian-bromann)
requested changes
May 6, 2026
Reviewed lines (from the diff):

```ts
});

const agent = createAgent({
  model: new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 }),
```
We shouldn't need a real model for this and instead just use a mock as documented in https://docs.langchain.com/oss/javascript/langchain/test/unit-testing
Force-pushed from b9fdd0f to abc6088
…istChatModel
Replace the llmToolSelector integration test that required a real OpenAI API call with a unit test using FakeListChatModel, so the streaming isolation check runs without network access or API keys.
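A rough, self-contained sketch of the idea behind that test (a hand-rolled synchronous stand-in rather than the actual `FakeListChatModel`, whose real counterpart is async and lives in `@langchain/core`):

```typescript
// Hand-rolled stand-in for FakeListChatModel: returns queued responses
// in order (the real class is async; this is simplified for illustration).
class FakeChatModel {
  private i = 0;
  constructor(private responses: string[]) {}
  invoke(_input: string): string {
    const response = this.responses[this.i % this.responses.length];
    this.i += 1;
    return response;
  }
}

// First queued response plays the internal tool-selection call,
// the second plays the user-visible assistant answer.
const model = new FakeChatModel(['{"tools":["getWeather"]}', "It is sunny."]);

const selection = model.invoke("select tools");
const answer = model.invoke("What's the weather?");

// The streaming isolation check: only the answer should reach the UI
// stream, never the raw tool-selection JSON.
const uiStream: string[] = [answer];
console.log(uiStream.includes(selection)); // false
console.log(uiStream[0]); // It is sunny.
```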
Contributor
Author
Christian Bromann (@christian-bromann)
Summary

`llmToolSelectorMiddleware` internally calls `structuredModel.invoke()` to select relevant tools. When `agent.stream()` is used with `streamMode: "messages"`, LangGraph injects a `StreamMessagesHandler` into `config.callbacks` and stores it in `AsyncLocalStorage`. Without an explicit config override, this handler is inherited by the internal invoke call, causing the tool selection response (`{"tools":["..."]}`) to appear as an assistant message in the UI stream.

Root cause

`ensureConfig()` in `@langchain/core` merges the explicit config with the implicit config from `AsyncLocalStorage`. If `config.callbacks` is `undefined`, the `StreamMessagesHandler` from the parent context is inherited. Passing `callbacks: []` (an empty array) breaks this inheritance because a non-`undefined` value always overrides the implicit one.

LangSmith tracing is unaffected: it is injected via `LangChainTracer.getTraceableRunTree()` inside `_configureSync`, not through `config.callbacks`.

Fix

Build an explicit config using `mergeConfigs`:

- `callbacks: []` prevents `StreamMessagesHandler` from being inherited via `AsyncLocalStorage`
- `pickRunnableConfigKeys` inherits the parent config (tags, metadata, configurable, etc.) from `runtime`
- `lc_source: "llmToolSelector"` tags the call for observability, consistent with `summarizationMiddleware`

Fixes #10042

Test plan

- `yarn test src/agents/middleware/tests/llmToolSelector.test.ts`
- `OPENAI_API_KEY=... yarn vitest run --mode int src/agents/middleware/tests/llmToolSelector.int.test.ts`