Description
When using AzureChatOpenAI from @langchain/openai, the Azure deployment name is not exposed in the LangChain callback/run data. run.extra.metadata.ls_model_name and run.extra.invocation_params.model both show "gpt-3.5-turbo" (the hardcoded default) instead of the actual deployment name.
This breaks all downstream observability/tracing tools (OpenTelemetry instrumentors, LangSmith, etc.) that rely on run metadata to identify the model.
Reproduction
import { AzureChatOpenAI } from "@langchain/openai";

// No explicit `model` is passed; only the Azure deployment name is set:
const model = new AzureChatOpenAI({
  azureOpenAIApiDeploymentName: "gpt-4o-mini",
  azureOpenAIApiInstanceName: "my-instance",
  azureOpenAIApiKey: "...",
  azureOpenAIApiVersion: "2024-08-01-preview",
});
In a LangChain BaseTracer callback, run.extra for an LLM run shows:
- run.extra.metadata.ls_model_name → "gpt-3.5-turbo" (wrong)
- run.extra.invocation_params.model → "gpt-3.5-turbo" (wrong)
- run.extra.invocation_params.azureOpenAIApiDeploymentName → undefined
The actual deployment name "gpt-4o-mini" is stored on this.azureOpenAIApiDeploymentName but never flows into the callback run data.
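For illustration, here is a minimal tracer sketch that surfaces the wrong values. It assumes the BaseTracer/Run API from @langchain/core/tracers/base; the class name is hypothetical, and `model` is the instance from the snippet above:

import { BaseTracer, type Run } from "@langchain/core/tracers/base";

class DeploymentNameTracer extends BaseTracer {
  name = "deployment_name_tracer";

  // BaseTracer requires persistRun; we only inspect the finished run here.
  protected async persistRun(run: Run): Promise<void> {
    if (run.run_type === "llm") {
      // Both report the hardcoded default instead of "gpt-4o-mini":
      console.log(run.extra?.metadata?.ls_model_name);   // "gpt-3.5-turbo"
      console.log(run.extra?.invocation_params?.model);  // "gpt-3.5-turbo"
    }
  }
}

await model.invoke("ping", { callbacks: [new DeploymentNameTracer()] });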
Root Cause
BaseChatOpenAI sets model = "gpt-3.5-turbo" as the default (line 35 of base.ts). When using AzureChatOpenAI with only azureOpenAIApiDeploymentName (and no explicit model), this.model stays as the default.
- getLsParams() sets ls_model_name: this.model → "gpt-3.5-turbo"
- invocationParams() returns { model: this.model, ... } → "gpt-3.5-turbo"
- Neither method is overridden by AzureChatOpenAI.
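To make the flow concrete, here is a simplified paraphrase of the base-class behavior described above (a sketch, not the actual base.ts source):

class BaseChatOpenAI {
  model = "gpt-3.5-turbo"; // hardcoded default (base.ts line 35)

  getLsParams(options: unknown) {
    // ls_model_name comes straight from this.model; no Azure-aware fallback
    return { ls_provider: "openai", ls_model_name: this.model };
  }

  invocationParams(options?: unknown) {
    // likewise for the invocation params surfaced into callback run data
    return { model: this.model };
  }
}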
Python parity
The Python AzureChatOpenAI already fixes this in two ways:
- _identifying_params includes azure_deployment: self.deployment_name
- _get_ls_params() falls back to self.deployment_name when self.model_name is None
# Python langchain_openai/chat_models/azure.py
@property
def _identifying_params(self):
    return {"azure_deployment": self.deployment_name, **super()._identifying_params}
See also: langchain-ai/langchain#24838 (Python equivalent)
Suggested Fix
Override getLsParams() in AzureChatOpenAI to fall back to the deployment name:
getLsParams(options) {
  const params = super.getLsParams(options);
  // Use the deployment name whenever no explicit model was set
  // (i.e. the hardcoded default leaked through from the base class)
  if (
    this.azureOpenAIApiDeploymentName &&
    (!params.ls_model_name || params.ls_model_name === "gpt-3.5-turbo")
  ) {
    params.ls_model_name = this.azureOpenAIApiDeploymentName;
  }
  return params;
}
And/or override invocationParams() to include the deployment name:
invocationParams(options) {
  return {
    ...super.invocationParams(options),
    // Surface the deployment name alongside the (possibly default) model
    azureOpenAIApiDeploymentName: this.azureOpenAIApiDeploymentName,
  };
}
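With both overrides in place, the tracer from the Reproduction section should observe the following (expected values; not yet verified against a patched build):
- run.extra.metadata.ls_model_name → "gpt-4o-mini"
- run.extra.invocation_params.azureOpenAIApiDeploymentName → "gpt-4o-mini"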
Impact
All callback-based instrumentation that reads model identity from LangChain run data is affected, including:
- OpenTelemetry distros (microsoft/opentelemetry-distro-javascript)
- LangSmith tracing
- Any custom BaseTracer implementations
Versions
- @langchain/openai: ^1.4.4
- langchain: ^1.2.31
- Node.js: 24.x