matthicksj
Summary

Fixes tool input parameters being lost when using Claude models via Vertex AI with streaming enabled. The Vertex AI beta messages implementation was returning plain Stream objects instead of BetaMessageStreamManager, bypassing the event accumulation logic entirely.

Closes #1020

Problem

When using the beta messages API with Vertex AI and streaming enabled (stream=True), tool calls fail with validation errors because tool inputs remain empty {}.

The root cause:

  • Vertex beta messages delegated directly to FirstPartyMessagesAPI.create
  • This returned Stream instead of BetaMessageStreamManager
  • The accumulation logic in BetaMessageStream was never reached
  • Vertex sends proper input_json_delta events, but they weren't being accumulated
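The accumulation step that was being skipped can be sketched as follows. This is a minimal illustration using hand-rolled event dicts that mirror the shape of the API's streaming events, not the SDK's actual event classes:

```python
import json

# Hypothetical simplified events for a single tool_use content block.
# The real SDK emits typed event objects with the same structure.
events = [
    {"type": "content_block_start", "index": 0,
     "content_block": {"type": "tool_use", "name": "get_weather", "input": {}}},
    {"type": "content_block_delta", "index": 0,
     "delta": {"type": "input_json_delta", "partial_json": '{"location":'}},
    {"type": "content_block_delta", "index": 0,
     "delta": {"type": "input_json_delta", "partial_json": ' "Paris"}'}},
    {"type": "content_block_stop", "index": 0},
]

partial: dict[int, str] = {}
tool_inputs: dict[int, dict] = {}
for event in events:
    if event["type"] == "content_block_delta" and event["delta"]["type"] == "input_json_delta":
        # Each delta carries a fragment of the tool input's JSON.
        partial[event["index"]] = partial.get(event["index"], "") + event["delta"]["partial_json"]
    elif event["type"] == "content_block_stop" and event["index"] in partial:
        # Only after all fragments are concatenated is the input valid JSON.
        tool_inputs[event["index"]] = json.loads(partial[event["index"]])

print(tool_inputs[0])  # {'location': 'Paris'}
```

If the raw Stream is handed to the caller directly, nothing performs this concatenation, and the tool_use block's input stays at its initial empty value.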

Solution

This PR overrides the create() and stream() methods in src/anthropic/lib/vertex/_beta_messages.py to ensure:

  1. The create() method wraps Stream in BetaMessageStream when stream=True
  2. The stream() method returns BetaMessageStreamManager for proper event accumulation
  3. Tool inputs are accumulated from input_json_delta events as intended
  4. Both sync and async implementations are fixed
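The shape of the fix is a wrapping pattern, sketched generically below with hypothetical RawStream/AccumulatingStream stand-ins rather than the SDK's real classes and signatures:

```python
from typing import Iterator

class RawStream:
    """Stand-in for the plain Stream the Vertex class used to return."""
    def __init__(self, events: list[str]) -> None:
        self._events = events
    def __iter__(self) -> Iterator[str]:
        return iter(self._events)

class AccumulatingStream:
    """Stand-in for BetaMessageStream: wraps the raw stream and accumulates."""
    def __init__(self, raw: RawStream) -> None:
        self._raw = raw
        self.accumulated = ""
    def __iter__(self) -> Iterator[str]:
        for event in self._raw:
            # Accumulation happens as events pass through the wrapper.
            self.accumulated += event
            yield event

def create(events: list[str], stream: bool = False):
    raw = RawStream(events)
    # The bug: returning `raw` directly bypasses accumulation entirely.
    # The fix: wrap it in the accumulating type whenever stream=True.
    return AccumulatingStream(raw) if stream else events

s = create(['{"location":', ' "Paris"}'], stream=True)
for _ in s:
    pass
print(s.accumulated)  # {"location": "Paris"}
```

Because iteration is delegated to the inner stream, callers that simply loop over events see no behavioral difference; only the previously missing accumulation is added.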

Changes Made

  • src/anthropic/lib/vertex/_beta_messages.py:
    • Override Messages.create() to wrap Stream in BetaMessageStream when streaming
    • Override Messages.stream() to return BetaMessageStreamManager
    • Override AsyncMessages.create() to wrap AsyncStream in BetaAsyncMessageStream when streaming
    • Override AsyncMessages.stream() to return BetaAsyncMessageStreamManager
    • Add imports for BetaMessageStream and BetaAsyncMessageStream
    • Maintain delegation for non-streaming methods

Testing

Manual Testing

# Before fix: tool inputs are empty
response = client.beta.messages.create(..., stream=True)
for event in response:
    pass  # Tool input: {}  ❌

# After fix, method 1: create(stream=True)
response = client.beta.messages.create(..., stream=True)
for event in response:
    pass  # Tool input: {"location": "Paris"}  ✅

# After fix, method 2: stream()
with client.beta.messages.stream(...) as stream:
    for event in stream:
        pass  # process events
    final_msg = stream.get_final_message()
    # Tool input: {"location": "Paris"}  ✅

Automated Tests

  • ✅ All 18 existing Vertex tests pass
  • ✅ Event accumulation validated with mock data
  • ✅ Both sync and async implementations tested
  • ✅ No regression in non-streaming functionality

Test Output

============================================================
VERTEX BETA STREAMING FIX VALIDATION
============================================================
✅ create(stream=True) correctly returns BetaMessageStream
✅ BetaMessageStream correctly wraps the raw stream
✅ create(stream=False) correctly returns unwrapped message
✅ stream() correctly returns BetaMessageStreamManager
✅ Tool inputs accumulated correctly from delta events
   Final input: {'location': 'Paris'}
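A regression test for this class of bug can be as simple as asserting on the return type. Sketched here with stand-in classes, since the repository's actual test would patch the Vertex client:

```python
class Stream:
    """Stand-in for the SDK's raw Stream type."""

class BetaMessageStream:
    """Stand-in for the accumulating wrapper type."""
    def __init__(self, raw: Stream) -> None:
        self.raw = raw

def create(stream: bool = False):
    """Stand-in for the fixed Vertex create(): wraps only when streaming."""
    raw = Stream()
    return BetaMessageStream(raw) if stream else "message"

def test_streaming_returns_wrapper():
    # stream=True must yield the accumulating wrapper, not the raw stream.
    assert isinstance(create(stream=True), BetaMessageStream)
    # stream=False must keep returning the plain message, unchanged.
    assert create(stream=False) == "message"

test_streaming_returns_wrapper()
print("ok")
```

A type assertion like this would have failed before the fix, when create(stream=True) returned the raw Stream directly.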

Impact

  • Severity: High - Fixes critical bug affecting all Vertex AI beta users with streaming
  • Scope: Only affects Vertex AI streaming implementation
  • Breaking Changes: None - maintains backward compatibility
  • Performance: No impact - same number of API calls

Verification Checklist

  • Code follows project style guidelines
  • All tests pass (pytest tests/lib/test_vertex.py)
  • Linting passes (ruff check)
  • No breaking changes to existing APIs
  • Documentation comments added where needed
  • Commit message follows conventional format

Related Context

Example Usage After Fix

from anthropic import AnthropicVertex

client = AnthropicVertex(project_id="my-project", region="us-east5")

tools = [{
    "name": "get_weather",
    "description": "Get weather for a location",
    "input_schema": {
        "type": "object",
        "properties": {"location": {"type": "string"}},
        "required": ["location"]
    }
}]

# Now works correctly with streaming
with client.beta.messages.stream(
    model="claude-3-5-sonnet@20240620",
    max_tokens=1024,
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools
) as stream:
    for event in stream:
        if event.type == "content_block_start":
            block = getattr(event, "content_block", None)
            if block is not None and block.type == "tool_use":
                # Tool inputs now properly accumulated!
                print(f"Tool: {block.name}")

    message = stream.get_final_message()
    # Complete tool inputs are available on the final message's tool_use blocks

@matthicksj matthicksj requested a review from a team as a code owner August 23, 2025 20:08

@dustinkremer dustinkremer left a comment


Test review comment

justinlietz93 pushed a commit to justinlietz93/anthropic-sdk-python that referenced this pull request Sep 1, 2025

Successfully merging this pull request may close these issues.

Tool inputs lost during streaming with Vertex AI integration (beta messages)