
LangGraph Integration Overview

Comprehensive guide to integrating Noveum Trace with LangGraph applications for complex agent workflows

Noveum Trace provides powerful integration with LangGraph applications, enabling you to trace complex agent workflows, multi-step reasoning, and state management. This integration gives you complete visibility into your LangGraph applications' execution flow and performance.

What You Get

  • Workflow Tracing: Complete visibility into LangGraph execution flows
  • State Management: Track state changes and transitions
  • Node-level Tracing: Monitor individual nodes and their performance
  • Conditional Routing: Trace decision-making and routing logic
  • Iterative Processes: Monitor self-loops and iterative refinement
  • Performance Analytics: Detailed metrics on workflow execution

Installation

pip install noveum-trace

Note: There's no special noveum-trace[langgraph] package. The base noveum-trace package includes full LangGraph support.

Quick Start

Integrate Noveum Trace with LangGraph using the NoveumTraceCallbackHandler. There are two approaches:

Approach 1: Graph-Level Callbacks

Pass callbacks during graph compilation:

import os
import noveum_trace
from noveum_trace import NoveumTraceCallbackHandler
from langgraph.graph import StateGraph
 
# Initialize Noveum Trace
noveum_trace.init(
    api_key=os.getenv("NOVEUM_API_KEY"),
    project="customer-support-bot",
    environment="development"
)
 
# Create callback handler
handler = NoveumTraceCallbackHandler()
 
# Create your graph and attach callbacks at compile time
workflow = StateGraph(YourStateType)
# ... add nodes and edges
app = workflow.compile().with_config({"callbacks": [handler]})
 
# Run -- callbacks are already attached
result = app.invoke(initial_state)

Approach 2: Config-Based Callbacks (Recommended)

Pass callbacks via the config parameter for more control:

import os
import noveum_trace
from noveum_trace import NoveumTraceCallbackHandler
from langgraph.graph import StateGraph
 
# Initialize Noveum Trace
noveum_trace.init(
    api_key=os.getenv("NOVEUM_API_KEY"),
    project="customer-support-bot",
    environment="development"
)
 
# Create callback handler
handler = NoveumTraceCallbackHandler()
 
# Create and compile your graph
workflow = StateGraph(YourStateType)
# ... add nodes and edges
app = workflow.compile()
 
# Pass callbacks via config with additional metadata
result = app.invoke(
    initial_state,
    config={
        "callbacks": [handler],
        "tags": ["langgraph"],
        "metadata": {"user_id": "123"}
    }
)

Why config-based is recommended for LangGraph:

  • Callbacks propagate through all nodes automatically
  • LLM calls within nodes are traced
  • Tool usage is captured
  • State transitions are monitored
  • Works with complex workflows and loops

Integration Patterns

1. Basic Agents

Trace simple agent workflows with single decision points.

2. Iterative Research

Monitor agents that loop back to refine their work.

3. Conditional Routing

Track complex routing decisions and state transitions.

4. Mixed Tracing

Combine automatic and manual tracing for maximum control.

5. State Management

Monitor state changes and data flow through your graph.

Important: LLM Component Callbacks

❌ Avoid adding callbacks to individual LLM instances:

# ❌ Not recommended: Adding to LLM instances within nodes
llm = ChatOpenAI(callbacks=[handler])  # Doesn't propagate to graph level

This approach won't capture graph-level events like node transitions, state changes, and conditional routing.

✅ Always use graph-level or config-based callbacks as shown in Quick Start to ensure complete tracing of your entire workflow.

Advanced Configuration

Automatic Parent Relationship Resolution

For optimal tracing in LangGraph, enable automatic parent relationship resolution. This ensures that the callback handler properly resolves parent-child span relationships based on LangChain's internal parent_run_id mechanism:

# Enable automatic parent relationship resolution for LangGraph
handler = NoveumTraceCallbackHandler(use_langchain_assigned_parent=True)
 
# Create and compile your graph
workflow = StateGraph(YourStateType)
# ... add nodes and edges
app = workflow.compile()
 
# Use the handler with your graph
result = app.invoke(
    initial_state,
    config={"callbacks": [handler]}
)

Why this matters for LangGraph:

  • LangGraph automatically passes parent_run_id in callback events
  • This parameter tells the handler to use LangChain's parent tracking
  • Results in accurate parent-child relationships in your traces
  • Provides better visualization of your workflow structure

Note: This is the recommended configuration for all LangGraph applications to ensure proper trace hierarchy.

Key Features

  • Automatic Node Tracing: Every node execution is automatically traced
  • State Visibility: Track state changes and data flow
  • Performance Metrics: Monitor execution time and resource usage
  • Error Tracking: Comprehensive error handling and debugging
  • Workflow Analytics: Understand execution patterns and bottlenecks

LangGraph-Specific Benefits

  • Graph Structure: Visualize your entire workflow structure
  • Node Dependencies: Understand how nodes connect and depend on each other
  • State Transitions: Track how state evolves through your graph
  • Loop Detection: Monitor iterative processes and self-loops
  • Conditional Logic: Trace routing decisions and branching


Need Help?

  • Documentation: Browse our comprehensive guides
  • Examples: Check out LangChain and LiveKit examples
  • Community: Join our Discord for support and discussions
  • Support: Contact our team for enterprise support