fix(deepseek): preserve reasoning_content in multi-turn conversations #34516
## Summary
Fixes #34166
When using `deepseek-reasoner` with tool calling, multi-turn conversations fail.

**Root Cause:** The parent's `_get_request_payload` method doesn't preserve the `reasoning_content` field from `additional_kwargs` when converting `AIMessage` to the payload format.

## Changes

- Override `_get_request_payload` to extract `reasoning_content` from `AIMessage.additional_kwargs` before calling the parent method
- Include `reasoning_content` in the assistant message payload for multi-turn conversations
- For `deepseek-reasoner` models, automatically add an empty `reasoning_content` if it is not present (the DeepSeek API requires this field)

A minimal sketch of the patching logic is included after this list.
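The sketch below is illustrative only, not the exact PR diff. The helper name `_preserve_reasoning_content` and the payload shape (an OpenAI-style dict with a `"messages"` list) are assumptions; in the actual change this logic sits inside the overridden `_get_request_payload`.

```python
# Hedged sketch: restore reasoning_content after the parent has built the payload.
# `_preserve_reasoning_content` is a hypothetical helper, not an actual method in
# langchain-deepseek; the payload layout is an assumption.
from typing import Any, Sequence

from langchain_core.messages import AIMessage, BaseMessage


def _preserve_reasoning_content(
    payload: dict[str, Any],
    messages: Sequence[BaseMessage],
    model_name: str,
) -> dict[str, Any]:
    """Copy reasoning_content from AIMessage.additional_kwargs into the payload."""
    for source, target in zip(messages, payload.get("messages", [])):
        if not (isinstance(source, AIMessage) and target.get("role") == "assistant"):
            continue
        reasoning = source.additional_kwargs.get("reasoning_content")
        if reasoning is not None:
            # The parent conversion drops additional_kwargs; put the field back.
            target["reasoning_content"] = reasoning
        elif "reasoner" in model_name:
            # deepseek-reasoner expects the field on every assistant turn,
            # so fall back to an empty string when it is missing.
            target["reasoning_content"] = ""
    return payload
```

In the override, the parent payload would be built first and then passed through this step before being returned.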
## Test plan

Added 3 new unit tests:
- `test_get_request_payload_preserves_reasoning_content` - verifies `reasoning_content` is preserved from `additional_kwargs` (a sketch of this test follows the list)
- `test_get_request_payload_adds_empty_reasoning_for_reasoner` - verifies an empty `reasoning_content` is added for reasoner models
- `test_get_request_payload_no_reasoning_for_non_reasoner` - verifies non-reasoner models don't get `reasoning_content` added
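As a hedged illustration of the first test (not copied from the PR), the constructor arguments, message contents, and assertion style below are assumptions:

```python
# Hedged sketch of test_get_request_payload_preserves_reasoning_content;
# the ChatDeepSeek arguments and sample messages are illustrative assumptions.
from langchain_core.messages import AIMessage, HumanMessage
from langchain_deepseek import ChatDeepSeek


def test_get_request_payload_preserves_reasoning_content() -> None:
    llm = ChatDeepSeek(model="deepseek-reasoner", api_key="test-key")
    messages = [
        HumanMessage("What is 2 + 2?"),
        AIMessage(
            "4",
            additional_kwargs={"reasoning_content": "Adding 2 and 2 gives 4."},
        ),
        HumanMessage("Now double it."),
    ]
    payload = llm._get_request_payload(messages)
    # The assistant entry in the payload should carry the original reasoning.
    assistant = next(m for m in payload["messages"] if m["role"] == "assistant")
    assert assistant["reasoning_content"] == "Adding 2 and 2 gives 4."
```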
## Disclaimer

This PR was generated with assistance from an AI agent (Claude Code).
🤖 Generated with Claude Code