LangChain Integration Overview
Comprehensive guide to integrating Noveum Trace with LangChain applications for automatic AI tracing and observability
Noveum Trace provides seamless integration with LangChain applications, automatically capturing detailed traces of your AI workflows without requiring code changes to your core logic. This integration helps you monitor, debug, and optimize your LangChain applications with comprehensive observability.
What You Get
- Automatic Tracing: Zero-code integration with LangChain components
- Complete Visibility: Track LLM calls, chains, agents, tools, and retrieval operations
- Performance Metrics: Monitor latency, token usage, and costs
- Error Tracking: Identify and debug issues in your AI workflows
- Cost Optimization: Analyze spending patterns and find cost-effective alternatives
Installation
Note: There's no special noveum-trace[langchain] package. The base noveum-trace package includes full LangChain support.
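Install the base package with pip (or your preferred package manager):

```bash
pip install noveum-trace
```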
Quick Start
Integrate Noveum Trace with LangChain using the NoveumTraceCallbackHandler. There are two approaches:
Approach 1: LLM-Level Callbacks
Pass callbacks directly to the LLM during construction:
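A minimal sketch of the LLM-level approach. The import path for NoveumTraceCallbackHandler and the noveum_trace.init() call are assumptions here; check the noveum-trace reference for the exact names in your version.

```python
import noveum_trace  # top-level module name assumed
from noveum_trace.integrations.langchain import NoveumTraceCallbackHandler  # import path assumed
from langchain_openai import ChatOpenAI

# Initialize the SDK once at startup (call name and arguments assumed).
noveum_trace.init(project="langchain-demo")

handler = NoveumTraceCallbackHandler()

# Attach the handler at construction time so every call to this LLM is traced.
llm = ChatOpenAI(model="gpt-4o-mini", callbacks=[handler])

response = llm.invoke("What is the capital of France?")
print(response.content)
```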
Approach 2: Config-Based Callbacks (Recommended)
Pass callbacks via the config parameter at runtime:
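A minimal sketch of the config-based approach with an LCEL chain; the handler import path is assumed, as above.

```python
from noveum_trace.integrations.langchain import NoveumTraceCallbackHandler  # import path assumed
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

handler = NoveumTraceCallbackHandler()

prompt = ChatPromptTemplate.from_template("Summarize in one sentence: {text}")
chain = prompt | ChatOpenAI(model="gpt-4o-mini")

# Pass callbacks at invocation time, so only this run is traced.
result = chain.invoke(
    {"text": "LangChain is a framework for building LLM applications."},
    config={"callbacks": [handler]},
)
print(result.content)
```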
Both approaches work equally well. Choose based on your use case:
- Use LLM-level callbacks for consistent tracing across all executions
- Use config-based callbacks for selective tracing or dynamic callback configuration
Important: Concurrent Execution
The NoveumTraceCallbackHandler maintains internal state and should not be shared across concurrent operations such as asyncio.gather() or ThreadPoolExecutor calls, as this will mix traces together. For concurrent execution, create a new handler instance for each call, as shown in the sketch below. Sequential operations can safely reuse the same handler instance.
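A sketch of the recommended pattern: each concurrent task gets its own handler (handler import path assumed, as above).

```python
import asyncio

from noveum_trace.integrations.langchain import NoveumTraceCallbackHandler  # import path assumed
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")

async def traced_call(question: str) -> str:
    # A fresh handler per task keeps each trace isolated.
    handler = NoveumTraceCallbackHandler()
    response = await llm.ainvoke(question, config={"callbacks": [handler]})
    return response.content

async def main() -> None:
    answers = await asyncio.gather(
        traced_call("What is LangChain?"),
        traced_call("What is LangGraph?"),
    )
    print(answers)

asyncio.run(main())
```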
Integration Patterns
Noveum Trace supports comprehensive tracing for all LangChain components:
1. Basic LLM Calls
Trace individual LLM interactions with automatic context capture. → Learn more
2. Chains
Monitor multi-step workflows and chain compositions. → Learn more
3. Agents & Advanced Patterns
Track agent decision-making, tool usage, error handling, and complex workflows. → Learn more
Key Features
- Zero Configuration: Works out of the box with existing LangChain code
- Rich Context: Automatically captures inputs, outputs, and metadata
- Performance Insights: Detailed metrics on latency and resource usage
- Error Handling: Comprehensive error tracking and debugging information
- Cost Analysis: Track spending across different models and operations
- Manual Trace Control: Advanced control over trace lifecycle for complex workflows
- Custom Parent Relationships: Explicit parent-child span relationships with metadata
- LangGraph Integration: Full support for LangGraph routing decisions and node transitions
- Routing Decision Tracking: Capture and analyze conditional routing logic
Advanced Features
Manual Trace Control
For complex workflows, you can manually control the trace lifecycle with the start_trace() and end_trace() methods, as sketched below.
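A rough sketch only: it assumes start_trace() and end_trace() are methods on the handler instance and that start_trace() accepts a trace name. Verify both against the noveum-trace reference.

```python
from noveum_trace.integrations.langchain import NoveumTraceCallbackHandler  # import path assumed
from langchain_openai import ChatOpenAI

handler = NoveumTraceCallbackHandler()
llm = ChatOpenAI(model="gpt-4o-mini")

handler.start_trace("blog-post-workflow")  # method name and argument assumed
try:
    # Both calls land in the same manually opened trace.
    outline = llm.invoke("Outline a blog post about tracing.", config={"callbacks": [handler]})
    draft = llm.invoke(f"Expand this outline:\n{outline.content}", config={"callbacks": [handler]})
finally:
    handler.end_trace()  # method name assumed
```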
Custom Parent Span Relationships
Set explicit parent-child relationships between spans using metadata configuration:
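An illustrative sketch only: the metadata key used to declare the parent span (noveum_parent_span_id below) is a placeholder, not necessarily the key the SDK expects.

```python
from noveum_trace.integrations.langchain import NoveumTraceCallbackHandler  # import path assumed
from langchain_openai import ChatOpenAI

handler = NoveumTraceCallbackHandler()
llm = ChatOpenAI(model="gpt-4o-mini")

result = llm.invoke(
    "Classify this ticket: 'My invoice is wrong.'",
    config={
        "callbacks": [handler],
        # Run metadata is visible to the callback handler; the handler can use it
        # to attach this span under an existing parent span.
        "metadata": {"noveum_parent_span_id": "span-123"},  # placeholder key and ID
    },
)
print(result.content)
```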
LangGraph Integration
Full support for LangGraph workflows, including the following (see the sketch after this list):
- Node execution tracing
- Routing decision tracking
- State transition monitoring
- Custom event emission
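A minimal LangGraph sketch traced through config-based callbacks. The handler import path is assumed, as above; the graph itself is a toy routing example.

```python
from typing import TypedDict

from langgraph.graph import END, StateGraph
from noveum_trace.integrations.langchain import NoveumTraceCallbackHandler  # import path assumed


class State(TypedDict):
    question: str
    answer: str


def classify(state: State) -> State:
    return state


def route(state: State) -> str:
    # Conditional routing decision that the integration can capture.
    return "math" if any(ch.isdigit() for ch in state["question"]) else "general"


def math_node(state: State) -> dict:
    return {"answer": "math path"}


def general_node(state: State) -> dict:
    return {"answer": "general path"}


graph = StateGraph(State)
graph.add_node("classify", classify)
graph.add_node("math", math_node)
graph.add_node("general", general_node)
graph.set_entry_point("classify")
graph.add_conditional_edges("classify", route, {"math": "math", "general": "general"})
graph.add_edge("math", END)
graph.add_edge("general", END)

app = graph.compile()
handler = NoveumTraceCallbackHandler()

# Node executions and the routing decision flow through the callback handler.
result = app.invoke({"question": "What is 2 + 2?"}, config={"callbacks": [handler]})
print(result["answer"])
```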
Routing Decision Attributes
When tracking routing decisions, the following attributes are captured (an illustrative example follows this list):
- Source and target nodes
- Decision reasoning and confidence
- State snapshots
- Alternative options
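Purely illustrative, to show the shape of such a record; the attribute keys below are placeholders, not the SDK's actual attribute names.

```python
# Placeholder keys for illustration only; not the SDK's real attribute names.
routing_decision_attributes = {
    "routing.source_node": "classify",                      # node where the decision was made
    "routing.target_node": "math",                           # node selected next
    "routing.reasoning": "question contains digits",         # decision reasoning
    "routing.confidence": 0.92,                              # decision confidence
    "routing.state_snapshot": {"question": "What is 2 + 2?"},
    "routing.alternatives": ["general"],                     # options not taken
}
```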
Next Steps
Ready to dive deeper? Explore these detailed guides:
- Basic LLM Tracing - Start with simple LLM calls
- Chain Tracing - Monitor multi-step workflows
- Advanced Patterns - Agents, tools, error handling, and more
Need Help?
Get Early Access to Noveum.ai Platform
Be the first to get notified when we open the Noveum Platform to more users. All users get access to the Observability suite for free; early users get free eval jobs and premium support for the first year.