
llms: preserve reasoning_content in MessageContent for tool calling round-trip#1479

Open
krus210 wants to merge 1 commit into tmc:main from krus210:fix/deepseek-reasoning-content-tool-calling

Conversation

@krus210 krus210 commented Feb 21, 2026

fix: preserve reasoning_content in tool calling round-trip for deepseek-reasoner

Closes #1478

Summary

  • Fix reasoning_content being lost during conversation round-trip when using deepseek-reasoner with tool calling
  • The DeepSeek API requires every assistant message to include reasoning_content -- without it, the API returns HTTP 400: Missing reasoning_content field in the assistant message
  • This is the same class of bug that was found in the Python LangChain library
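For context, when the conversation is sent back to the API, the assistant turn needs to carry the field along with its tool calls, roughly in this shape (an illustrative payload following the OpenAI-compatible message format; the field values and tool name are made up, not copied from the DeepSeek docs):

```json
{
  "role": "assistant",
  "content": "",
  "reasoning_content": "I should call the weather tool first...",
  "tool_calls": [
    {
      "id": "call_0",
      "type": "function",
      "function": { "name": "get_weather", "arguments": "{\"city\": \"Paris\"}" }
    }
  ]
}
```

If `reasoning_content` is missing from such a message, the API rejects the request with the HTTP 400 described above.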

Problem

The reasoning_content field was correctly parsed from API responses (PR #1121) but could not be sent back in subsequent requests because:

  1. MessageContent struct had no ReasoningContent field -- users couldn't attach reasoning content to assistant messages in conversation history
  2. The MessageContent -> ChatMessage conversion in the OpenAI provider didn't transfer reasoning content
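The two gaps above can be sketched with simplified stand-ins for the real structs (hypothetical mirrors for illustration only; the actual definitions live in llms/generatecontent.go and the OpenAI client):

```go
package main

import "fmt"

// MessageContent is a simplified stand-in for the struct in
// llms/generatecontent.go. ReasoningContent is the field this PR adds.
type MessageContent struct {
	Role             string
	ReasoningContent string
}

// ChatMessage is a simplified stand-in for the provider-side message type.
type ChatMessage struct {
	Role             string
	ReasoningContent string
}

// toChatMessage sketches the conversion step in llms/openai/openaillm.go.
// Without the explicit copy below, reasoning content is silently dropped
// on the way back to the API.
func toChatMessage(mc MessageContent) ChatMessage {
	msg := ChatMessage{Role: mc.Role}
	msg.ReasoningContent = mc.ReasoningContent // the added line
	return msg
}

func main() {
	mc := MessageContent{Role: "ai", ReasoningContent: "chain of thought"}
	fmt.Println(toChatMessage(mc).ReasoningContent)
}
```

Fixing only one of the two gaps would not be enough: the field must exist on `MessageContent` and be copied during conversion.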

Changes

  • llms/generatecontent.go -- add ReasoningContent string field to MessageContent struct
  • llms/openai/openaillm.go -- copy mc.ReasoningContent to msg.ReasoningContent when converting AI messages
  • llms/marshaling.go -- update MarshalJSON/UnmarshalJSON to include reasoning_content (with omitempty)
  • llms/marshaling_test.go -- add test cases for reasoning_content marshal/unmarshal/roundtrip
  • llms/openai/internal/openaiclient/chat_test.go -- add test for ChatMessage with reasoning + tool_calls

Usage

After this fix, users can preserve reasoning_content in the tool calling round-trip:

// Step 1: Get response with reasoning + tool calls
resp, _ := llm.GenerateContent(ctx, messages, llms.WithTools(tools))
choice := resp.Choices[0]

// Step 2: Build assistant message preserving reasoning_content
assistantMsg := llms.MessageContent{
    Role:             llms.ChatMessageTypeAI,
    Parts:            []llms.ContentPart{choice.ToolCalls[0]},
    ReasoningContent: choice.ReasoningContent,  // <-- now possible!
}

// Step 3: Send back with tool result -- no more HTTP 400
resp2, _ := llm.GenerateContent(ctx, []llms.MessageContent{
    userMsg,
    assistantMsg,
    toolResultMsg,
}, llms.WithTools(tools))

Test plan

  • Added unit test: ChatMessage marshal/unmarshal with ReasoningContent + ToolCalls
  • Added unit test: MessageContent marshal/unmarshal with reasoning_content
  • Added unit test: round-trip MessageContent with reasoning_content + tool calls (JSON and YAML)
  • All existing tests pass (go test ./llms/...)
  • Verified with live DeepSeek API: two-step tool calling flow completes successfully

Notes

  • The fix is minimal and backward-compatible -- ReasoningContent defaults to empty string and is omitted from JSON when empty
  • The OpenAIFunctionsAgent also doesn't propagate reasoning_content through its scratchpad (agents/openai_functions_agent.go). This is a separate issue that can be addressed in a follow-up PR
  • DeepSeek API docs on thinking with tools: https://api-docs.deepseek.com/guides/thinking_with_tools

@krus210 krus210 force-pushed the fix/deepseek-reasoning-content-tool-calling branch from fe77f3a to df72e5b on February 22, 2026 at 15:50


Development

Successfully merging this pull request may close these issues.

llms/openai: reasoning_content lost during tool calling round-trip with deepseek-reasoner
