CrewAI Integration Overview
Integrate Noveum Trace with CrewAI crews and flows for multi-agent observability
Noveum Trace integrates with CrewAI so crew kickoffs, tasks, LLM calls, tools, memory, flows, and related events export to your Noveum project.
What You Get
- Crew and task visibility: Spans for crew kickoff, tasks, agents, LLM calls, and tools
- Rich context: Optional capture of inputs, outputs, tool schemas, and agent snapshots (configurable on the listener)
- Flows and extras: Flow execution, memory, knowledge, A2A, MCP, and streaming can be traced when you use those features
Prerequisites
- Python 3.10+ (the `noveum-trace[crewai]` extra aligns with CrewAI's supported Python range)
- A CrewAI app (agents, tasks, a `Crew`, and optionally Flows)
- A Noveum API key and project
Installation
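Install the SDK with the CrewAI extra named in the prerequisites above:

```shell
# Quote the package spec so the brackets are not interpreted by your shell
pip install "noveum-trace[crewai]"
```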
Quick Start
Set credentials (or use your own config):
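A minimal sketch of exporting credentials as environment variables. `NOVEUM_API_KEY` is the variable referenced in the Verification section below; how the project is supplied (environment variable vs. an `init()` argument) depends on your SDK configuration:

```shell
# API key read by the Noveum Trace SDK
export NOVEUM_API_KEY="your-api-key"
```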
Initialize the SDK once, then start CrewAI tracing. Attach the returned listener to your crew before kickoff(), and shut it down when the process is done (especially in tests or short scripts).
Configure your CrewAI LLM the same way you do today (for example, the provider API keys your agents already use). Note that setup_crewai_tracing() must run after noveum_trace.init().
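The steps above can be sketched as follows. The ordering (init() first, then setup_crewai_tracing(), listener attached before kickoff(), shutdown at the end) follows the text; the import path, init() argument names, and the listener's attach/shutdown method names are assumptions — check the SDK reference for the exact API:

```python
import os

import noveum_trace
from crewai import Agent, Crew, Task

# Assumed import path for the CrewAI integration helper
from noveum_trace.integrations.crewai import setup_crewai_tracing

# 1. Initialize the SDK once per process (argument names are assumptions).
noveum_trace.init(
    api_key=os.environ["NOVEUM_API_KEY"],
    project="my-crewai-project",
)

# 2. Start CrewAI tracing AFTER init(); it returns a listener.
listener = setup_crewai_tracing()

# 3. Build agents, tasks, and the crew as you normally would.
researcher = Agent(
    role="Researcher",
    goal="Find relevant facts",
    backstory="An analyst who digs into sources.",
)
summary_task = Task(
    description="Summarize recent findings on the topic.",
    expected_output="A short summary.",
    agent=researcher,
)
crew = Crew(agents=[researcher], tasks=[summary_task])

# 4. Attach the listener to the crew before kickoff() (mechanism is an
#    assumption; some listeners auto-register on creation).
result = crew.kickoff()

# 5. Shut the listener down when the process is done, especially in
#    tests or short scripts (method name is an assumption).
listener.shutdown()
```

The key constraint to preserve from the text is the lifecycle: initialize once, start tracing after init, and shut the listener down before the process exits so no spans are lost.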
Verification
Run the crew once with a valid NOVEUM_API_KEY, open the Noveum dashboard, pick your project, and confirm new traces. In short-lived processes you can call noveum_trace.flush() before exit so batched spans are sent.
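For short-lived processes, the flush mentioned above can be sketched as:

```python
import noveum_trace

# ... after crew.kickoff() completes ...
# Force-send any batched spans before the process exits.
noveum_trace.flush()
```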
Next Steps
- SDK integration - Init options, transports, and other frameworks
- Simple LLM example - Manual spans alongside automatic integrations
Need Help?
Get Early Access to Noveum.ai Platform
Be the first to get notified when we open the Noveum Platform to more users. All users get free access to the Observability suite; early users also get free eval jobs and premium support for the first year.