AzureChatOpenAI: deployment name missing from callback/tracing run data (ls_model_name defaults to gpt-3.5-turbo) #10874

@fpfp100

Description

When using AzureChatOpenAI from @langchain/openai, the Azure deployment name is not exposed in the LangChain callback/run data. run.extra.metadata.ls_model_name and run.extra.invocation_params.model both show "gpt-3.5-turbo" (the hardcoded default) instead of the actual deployment name.

This breaks all downstream observability/tracing tools (OpenTelemetry instrumentors, LangSmith, etc.) that rely on run metadata to identify the model.

Reproduction

import { AzureChatOpenAI } from "@langchain/openai";

const model = new AzureChatOpenAI({
  azureOpenAIApiDeploymentName: "gpt-4o-mini",
  azureOpenAIApiInstanceName: "my-instance",
  azureOpenAIApiKey: "...",
  azureOpenAIApiVersion: "2024-08-01-preview",
});

In a LangChain BaseTracer callback, run.extra for an LLM run shows:

run.extra.metadata.ls_model_name → "gpt-3.5-turbo"  (wrong)
run.extra.invocation_params.model → "gpt-3.5-turbo"  (wrong)
run.extra.invocation_params.azureOpenAIApiDeploymentName → undefined

The actual deployment name "gpt-4o-mini" is stored on this.azureOpenAIApiDeploymentName but never flows into the callback run data.
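
For reference, these values can be dumped with a minimal tracer. A sketch, assuming the current @langchain/core tracer API (BaseTracer, the Run type, and the onLLMStart hook); DebugTracer and the "ping" prompt are placeholders:

import { BaseTracer, type Run } from "@langchain/core/tracers/base";

class DebugTracer extends BaseTracer {
  name = "debug_tracer";

  // BaseTracer requires persistRun; we only inspect runs, so it's a no-op
  protected async persistRun(_run: Run): Promise<void> {}

  async onLLMStart(run: Run): Promise<void> {
    console.log(run.extra?.metadata?.ls_model_name);   // "gpt-3.5-turbo" (wrong)
    console.log(run.extra?.invocation_params?.model);  // "gpt-3.5-turbo" (wrong)
  }
}

await model.invoke("ping", { callbacks: [new DebugTracer()] });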

Root Cause

BaseChatOpenAI sets model = "gpt-3.5-turbo" as the default (line 35 of base.ts). When using AzureChatOpenAI with only azureOpenAIApiDeploymentName (and no explicit model), this.model stays as the default.

  • getLsParams() sets ls_model_name: this.model → "gpt-3.5-turbo"
  • invocationParams() returns { model: this.model, ... } → "gpt-3.5-turbo"
  • Neither method is overridden by AzureChatOpenAI (see the sketch after this list)
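
A quick way to confirm the root cause (sketch; constructor arguments copied from the reproduction above):

import { AzureChatOpenAI } from "@langchain/openai";

const model = new AzureChatOpenAI({
  azureOpenAIApiDeploymentName: "gpt-4o-mini",
  azureOpenAIApiInstanceName: "my-instance",
  azureOpenAIApiKey: "...",
  azureOpenAIApiVersion: "2024-08-01-preview",
});

// `model` was never passed, so the base-class default survives, and it is
// exactly what getLsParams() and invocationParams() read from:
console.log(model.model); // "gpt-3.5-turbo"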

Python parity

The Python AzureChatOpenAI already fixes this in two ways:

  1. _identifying_params includes azure_deployment: self.deployment_name
  2. _get_ls_params() falls back to self.deployment_name when self.model_name is None

# Python langchain_openai/chat_models/azure.py
@property
def _identifying_params(self):
    return {"azure_deployment": self.deployment_name, **super()._identifying_params}

See also: langchain-ai/langchain#24838 (Python equivalent)

Suggested Fix

Override getLsParams() in AzureChatOpenAI to fall back to the deployment name:

getLsParams(options: this["ParsedCallOptions"]): LangSmithParams {
  const params = super.getLsParams(options);
  if (
    this.azureOpenAIApiDeploymentName &&
    (!params.ls_model_name || params.ls_model_name === "gpt-3.5-turbo")
  ) {
    params.ls_model_name = this.azureOpenAIApiDeploymentName;
  }
  return params;
}
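
This mirrors the Python _get_ls_params() fallback shown above. Guarding on the literal default string is slightly fragile if the default ever changes upstream, but it avoids clobbering an explicitly configured model.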

And/or override invocationParams() to include the deployment name:

invocationParams(options?: this["ParsedCallOptions"]) {
  return {
    ...super.invocationParams(options),
    azureOpenAIApiDeploymentName: this.azureOpenAIApiDeploymentName,
  };
}
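
Until a fix lands, the same override can be applied as a userland subclass. A sketch, assuming getLsParams is public/overridable and LangSmithParams is exported from @langchain/core; PatchedAzureChatOpenAI is a hypothetical name:

import { AzureChatOpenAI } from "@langchain/openai";
import type { LangSmithParams } from "@langchain/core/language_models/chat_models";

class PatchedAzureChatOpenAI extends AzureChatOpenAI {
  getLsParams(options: this["ParsedCallOptions"]): LangSmithParams {
    const params = super.getLsParams(options);
    // same fallback as the suggested fix above
    if (
      this.azureOpenAIApiDeploymentName &&
      (!params.ls_model_name || params.ls_model_name === "gpt-3.5-turbo")
    ) {
      params.ls_model_name = this.azureOpenAIApiDeploymentName;
    }
    return params;
  }
}

This is a stopgap only; the proper fix belongs in AzureChatOpenAI itself.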

Impact

All callback-based instrumentation that reads model identity from LangChain run data is affected, including:

  • OpenTelemetry distros (microsoft/opentelemetry-distro-javascript)
  • LangSmith tracing
  • Any custom BaseTracer implementations

Versions

  • @langchain/openai: ^1.4.4
  • langchain: ^1.2.31
  • Node.js: 24.x
