
Anthropic tool loops: thinking blocks missing on turn 2+ #340

@imsherrill

Description

TanStack AI version

@tanstack/ai@0.6.1, @tanstack/ai-anthropic@0.6.0

Framework/Library version

@tanstack/react-start@1.159.0, react@19.2.3

Describe the bug and the steps to reproduce it

In PR #336, I tried to fix Anthropic thinking/tool-loop context handling, but I’m not confident the approach is idiomatic for this codebase.

The core issue is that we are not reliably getting thinking blocks after the first turn in tool loops (turn 2+).

At first I thought this might just be the model deciding not to think, but I tested the same kind of flow with the Vercel AI SDK and it reliably produced thinking between tool iterations. That makes me think our loop/context assembly is likely the issue.
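One plausible failure mode (a guess, not confirmed against the actual TanStack AI source): the loop strips thinking blocks when replaying the prior assistant turn back to the API. Anthropic's extended-thinking docs require thinking blocks, including their `signature`, to be passed back unmodified when tool use is involved. The sketch below uses hypothetical helper names (`replayStripped`, `replayVerbatim`) and simplified block shapes to contrast the suspected buggy assembly with the expected one:

```typescript
// Simplified content-block shapes, loosely following Anthropic's Messages API.
type ContentBlock =
  | { type: "thinking"; thinking: string; signature: string }
  | { type: "tool_use"; id: string; name: string; input: unknown }
  | { type: "text"; text: string };

// Suspected bug: only tool_use/text blocks are replayed when the assistant
// turn is echoed back on turn 2, silently dropping the thinking block.
function replayStripped(blocks: ContentBlock[]): ContentBlock[] {
  return blocks.filter((b) => b.type !== "thinking");
}

// Expected: the assistant turn is replayed verbatim, thinking block
// (with its signature) included, so the model can continue reasoning.
function replayVerbatim(blocks: ContentBlock[]): ContentBlock[] {
  return blocks.slice();
}

// Illustrative turn-1 assistant output (ids and contents are made up).
const turn1: ContentBlock[] = [
  { type: "thinking", thinking: "I should call the tool…", signature: "sig-abc" },
  { type: "tool_use", id: "toolu_01", name: "get_weather", input: { city: "SF" } },
];
```

If the codebase does something like `replayStripped`, that alone could explain the behavior; it would be worth logging the exact `messages` array sent on turn 2 to confirm.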

Current vs Expected

CURRENT (observed)

Turn 1: user -> model
          model: [thinking] + [tool_use]
          app runs tool -> [tool_result]

Turn 2: app -> model (with tool_result)
          model: [tool_use or final text]
                 (missing [thinking])

Turn 3+: same pattern, often still no [thinking]

EXPECTED

Turn 1: user -> model
          model: [thinking] + [tool_use]
          app runs tool -> [tool_result]

Turn 2: app -> model (with tool_result)
          model: [thinking] + [tool_use or final text]

Turn 3+: if more tool loops, model should keep producing
         [thinking] when reasoning is needed

Thinking should not be first-turn-only in tool loops; we expect thinking blocks on turn 2+ whenever reasoning continues.
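For reference, here is a sketch of what I believe the turn-2 request body should look like, based on Anthropic's documentation for combining extended thinking with tool use: the turn-1 thinking block is passed back unmodified (signature included) ahead of the `tool_use` block it preceded, and the `tool_result` references the matching `tool_use_id`. All ids and contents are illustrative. Note that thinking *between* tool calls on Claude 4 models may additionally require the `interleaved-thinking-2025-05-14` beta header; that could also be worth checking here.

```typescript
// Illustrative turn-1 assistant output, replayed verbatim on turn 2.
const assistantTurn1 = {
  role: "assistant" as const,
  content: [
    // Passed back exactly as received, signature and all.
    { type: "thinking", thinking: "I should call the tool…", signature: "sig-abc" },
    { type: "tool_use", id: "toolu_01", name: "get_weather", input: { city: "SF" } },
  ],
};

// The tool result goes in a user message, keyed by tool_use_id.
const toolResultTurn = {
  role: "user" as const,
  content: [
    { type: "tool_result", tool_use_id: "toolu_01", content: "68°F, sunny" },
  ],
};

// Full messages array for the turn-2 request.
const turn2Messages = [
  { role: "user" as const, content: "What's the weather in SF?" },
  assistantTurn1,
  toolResultTurn,
];
```

A first debugging step could be to diff this shape against what the adapter actually sends on turn 2.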

Goal

Define and implement the correct Anthropic loop context behavior so thinking blocks appear consistently across multi-turn tool flows when thinking is enabled.

Expected outcome

  • Thinking blocks appear on turn 2+ in tool loops (when reasoning is needed).
  • Message/context assembly is provider-valid and deterministic.
  • Tests cover multi-turn tool loops with interleaved thinking, tool_use, and tool_result.
  • PR #336 can be closed in favor of this issue.

Your Minimal, Reproducible Example - (Sandbox Highly Recommended)

https://codesandbox.io/p/sandbox/rdrwcj

Screenshots or Videos (Optional)

No response

Do you intend to try to help solve this bug with your own PR?

Maybe, I'll investigate and start debugging

Terms & Code of Conduct

  • I agree to follow this project's Code of Conduct
  • I understand that if my bug cannot be reliably reproduced in a debuggable environment, it will probably not be fixed and this issue may even be closed.

Metadata

Assignees

No one assigned

    Labels

    No labels

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
