
CrewAI Integration

Trace CrewAI crews, agents, tasks, and tools with Noveum Trace for full observability of multi-agent workflows

Noveum Trace provides deep integration with CrewAI, automatically capturing crew executions, agent decisions, task completions, tool calls, LLM interactions, and multi-agent coordination patterns.

What You Get

  • Crew and task visibility: Spans for crew kickoff, tasks, agents, LLM calls, and tools
  • Rich context: Optional capture of inputs, outputs, tool schemas, and agent snapshots (configurable on the listener)
  • Flows and extras: Flow execution, memory, knowledge, A2A, MCP, and streaming can be traced when you use those features

Requirements

  • Python 3.10+ (required for the noveum-trace[crewai] extra)
  • CrewAI >= 0.177.0
  • A Noveum API key and project

Installation

pip install "noveum-trace[crewai]"

Quick Start

Set credentials using environment variables:

export NOVEUM_API_KEY="your-api-key"
export NOVEUM_PROJECT="my-crewai-app"

Initialize the SDK once, then attach the returned listener to your crew before kickoff(). Call listener.shutdown() when the process is done (especially important in tests or short-lived scripts).

import os
import noveum_trace
from noveum_trace.integrations.crewai import setup_crewai_tracing
from crewai import Agent, Task, Crew, Process
 
noveum_trace.init(
    api_key=os.getenv("NOVEUM_API_KEY"),
    project=os.getenv("NOVEUM_PROJECT", "my-crewai-app"),
    environment=os.getenv("NOVEUM_ENVIRONMENT", "development"),
)
 
# Define agents
researcher = Agent(
    role="Research Analyst",
    goal="Research and summarize AI industry trends",
    backstory="You are an expert at finding and synthesizing AI research",
    verbose=True,
)
 
writer = Agent(
    role="Content Writer",
    goal="Write clear, engaging content based on research",
    backstory="You excel at turning research into compelling narratives",
    verbose=True,
)
 
# Define tasks
research_task = Task(
    description="Research the latest developments in large language models",
    expected_output="A summary of 5 key developments with sources",
    agent=researcher,
)
 
writing_task = Task(
    description="Write a blog post based on the research findings",
    expected_output="A 500-word blog post on LLM developments",
    agent=writer,
    context=[research_task],
)
 
# Create and run crew
crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    process=Process.sequential,
    verbose=True,
)
 
listener = setup_crewai_tracing()
crew.callback_function = listener
 
try:
    result = crew.kickoff()
    print(result)
finally:
    listener.shutdown()

setup_crewai_tracing() must run after noveum_trace.init(). Configure your CrewAI LLM providers the same way you do today — Noveum only observes the interactions.
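The try/finally pattern in the Quick Start exists to guarantee that listener.shutdown() runs even when kickoff() raises. A minimal stand-in sketch of that guarantee, where DemoListener is a hypothetical placeholder rather than the real Noveum listener:

```python
class DemoListener:
    """Hypothetical placeholder for the real Noveum listener."""
    def __init__(self):
        self.shut_down = False

    def shutdown(self):
        self.shut_down = True

def run_with_tracing(listener, kickoff):
    # Mirrors the Quick Start pattern: run the crew, always shut down.
    try:
        return kickoff()
    finally:
        listener.shutdown()

def failing_kickoff():
    raise RuntimeError("simulated LLM failure")

listener = DemoListener()
try:
    run_with_tracing(listener, failing_kickoff)
except RuntimeError:
    pass

print(listener.shut_down)  # True: shutdown ran despite the failure
```

Without the finally block, an exception during kickoff would skip shutdown and batched spans could be silently dropped.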

Verification

Run the crew once with a valid NOVEUM_API_KEY, then open the Noveum dashboard, select your project, and confirm that new traces appear. In short-lived processes, call noveum_trace.flush() before exit to ensure batched spans are sent.

What Gets Traced

| Event | What's captured |
| --- | --- |
| Crew kickoff / completion | Inputs, final output, execution time |
| Agent execution | Agent role, goal, task assignment, iteration count |
| Task execution | Task description, assigned agent, output, status |
| LLM calls | Model, tokens (input/output), cost, latency, messages |
| Tool calls | Tool name, inputs, outputs, execution time |
| Agent-to-agent delegation | Source agent, target agent, task details |
| Memory operations | Memory query/save events |
| MCP server calls | Server name, tool called, parameters, result |
| Flow events | CrewAI Flow state transitions |
| Reasoning steps | Chain-of-thought reasoning tokens |
| Guardrail evaluations | Validation results, pass/fail status |

Manual Instantiation

For more control over what gets captured, instantiate the listener directly:

import noveum_trace
from noveum_trace import get_client
from noveum_trace.integrations.crewai import NoveumCrewAIListener
 
noveum_trace.init(api_key="your-api-key", project="crewai-app")
 
client = get_client()
 
listener = NoveumCrewAIListener(
    client,
    capture_inputs=True,
    capture_outputs=True,
    capture_llm_messages=True,
    capture_tool_schemas=True,
    capture_agent_snapshot=True,
    capture_crew_snapshot=True,
    capture_memory=True,
    capture_a2a=True,
    capture_mcp=True,
    trace_name_prefix="crewai",
    verbose=False,
)
 
crew.callback_function = listener
try:
    crew.kickoff()
finally:
    listener.shutdown()

Configuration Options

| Option | Default | Description |
| --- | --- | --- |
| capture_inputs | True | Capture task inputs, agent goals, tool arguments |
| capture_outputs | True | Capture task results, agent outputs, tool results |
| capture_llm_messages | True | Capture full message history (system + user) |
| capture_tool_schemas | True | Capture tool definitions |
| capture_agent_snapshot | True | Capture agent goal/backstory at start |
| capture_crew_snapshot | True | Capture crew agents/tasks at kickoff |
| capture_memory | True | Capture memory operations |
| capture_a2a | True | Capture agent-to-agent delegation |
| capture_mcp | True | Capture MCP server calls |
| capture_flow | True | Capture CrewAI Flow events |
| capture_reasoning | True | Capture reasoning steps |
| trace_name_prefix | "crewai" | Prefix for trace names |
| verbose | False | Enable debug logging |
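Because these options are plain constructor keyword arguments, a deployment can toggle them from environment variables. A sketch of that pattern; the TRACE_* variable names are illustrative assumptions, not an official Noveum convention:

```python
import os

def capture_flags_from_env():
    """Build listener kwargs from environment variables.

    The TRACE_* names below are hypothetical examples, not official
    Noveum settings.
    """
    def flag(name, default="true"):
        return os.getenv(name, default).lower() in ("1", "true", "yes")

    return {
        "capture_inputs": flag("TRACE_CAPTURE_INPUTS"),
        "capture_outputs": flag("TRACE_CAPTURE_OUTPUTS"),
        "capture_llm_messages": flag("TRACE_CAPTURE_LLM_MESSAGES"),
        "verbose": flag("TRACE_VERBOSE", default="false"),
    }

# Example: disable message capture for a privacy-sensitive deployment
os.environ["TRACE_CAPTURE_LLM_MESSAGES"] = "false"
flags = capture_flags_from_env()
print(flags)
```

The resulting dict can then be splatted into the constructor shown above, e.g. NoveumCrewAIListener(client, **flags).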

Async Kickoff

The integration supports both synchronous and asynchronous crew execution:

import asyncio
import noveum_trace
from noveum_trace.integrations.crewai import setup_crewai_tracing
 
noveum_trace.init(api_key="your-api-key", project="crewai-async")
 
async def main():
    # `crew` is a Crew instance defined as in the Quick Start example
    listener = setup_crewai_tracing()
    crew.callback_function = listener
    try:
        return await crew.kickoff_async()
    finally:
        listener.shutdown()
 
result = asyncio.run(main())

Complete Example

For a complete end-to-end example with tests, see:


