
instrumentLangGraph and createLangChainCallbackHandler silently drop spans when used together #19627

@priscilawebdev

Description


Is there an existing issue for this?

How do you use Sentry?

Sentry Saas (sentry.io)

Which SDK are you using?

@sentry/browser

SDK Version

^10.40.0

Framework Version

No response

Link to Sentry event

https://pri-dogfooding-ai.sentry.io/insights/ai-agents/?project=4510951776059472&statsPeriod=30d&trace=043dd6239e7645d9b70f641fe3ec2f83&span=undefined&trace-timestamp=1772622191

Reproduction Example/SDK Setup

No response

Steps to Reproduce

const graph = new StateGraph(IdeaState)
  .addNode("expand", expandNode)     // calls llm.invoke()
  .addNode("validate", validateNode) // calls llm.invoke()
  .addNode("differentiate", diffNode) // calls llm.invoke()
  .addNode("roadmap", roadmapNode)   // calls llm.invoke()
  .addNode("pitch", pitchNode)       // calls llm.invoke()
  .addEdge(START, "expand")
  .addEdge("expand", "validate")
  .addEdge("validate", "differentiate")
  .addEdge("differentiate", "roadmap")
  .addEdge("roadmap", "pitch")
  .addEdge("pitch", END);

Sentry.instrumentLangGraph(graph, { recordInputs: true, recordOutputs: true });
const compiled = graph.compile({ name: "idea-forge" });

const sentryHandler = Sentry.createLangChainCallbackHandler({
  recordInputs: true,
  recordOutputs: true,
});
const result = await compiled.invoke({ idea: "test" }, { callbacks: [sentryHandler] });

Expected Result

All 5 LLM calls should produce chat spans in the trace.

Actual Result

Setup                                                  Chat spans captured
instrumentLangGraph + createLangChainCallbackHandler   1 out of 5
createLangChainCallbackHandler only                    5 out of 5
instrumentLangGraph only                               0 (expected: no callback handler)

4 out of 5 chat spans are silently dropped with no error or warning.

Additionally, the trace contains multiple spurious nested invoke_agent sub-spans with near-zero durations (0.20ms) that should not exist:

invoke_agent idea-forge        7.43s    ← top-level (expected)
  invoke_agent idea-forge      2.70ms   ← spurious
  invoke_agent idea-forge      1.60ms   ← spurious
  invoke_agent idea-forge      0.20ms   ← spurious
  invoke_agent idea-forge      4.41s
    chat claude-haiku-4-5-20251001  4.41s  ← only 1 of 5 chat spans
  invoke_agent idea-forge      0.20ms   ← spurious
  invoke_agent idea-forge      0.20ms   ← spurious

Additional Context

Possible Root Cause:

instrumentLangGraph wraps invoke() in Sentry.startSpan(). As async execution flows through the sequential nodes, the span context established by startSpan() is not consistently maintained across node invocations. Some handleChatModelStart callbacks from the handler therefore fire outside an active span context, and the chat spans they would create are dropped.

Priority

React with 👍 to help prioritize this issue. Please use comments to provide useful context, avoiding "+1" or "me too" comments, to help us triage it.
