API Reference
Complete API documentation and method details
GitHub Repository
Source code, examples, and issue tracking
Hello World Example
Working example with basic conversation flow
Installation
Pipecat Flows
To use Pipecat Flows, install the required dependency: `uv add pipecat-ai-flows`
Pipecat Dependencies
For fresh installations, you’ll need to install Pipecat with dependencies for your transport, STT, LLM, and TTS providers. For example, to use Daily, OpenAI, Deepgram, Cartesia, and Silero: `uv add "pipecat-ai[daily,openai,deepgram,cartesia,silero]"`
Reference Pages
FlowManager
Core orchestration class: constructor, properties, and methods
Types
NodeConfig, FlowsFunctionSchema, ActionConfig, context strategies, and type aliases
Exceptions
Error handling hierarchy for flow management
Function Types
Node Functions
Execute operations within a single conversation state without switching nodes. Return (FlowResult, None).
Edge Functions
Create transitions between conversation states, optionally processing data first. Return (FlowResult, NodeConfig).
Direct Functions
Functions passed directly to NodeConfig, with metadata automatically extracted from their signatures and docstrings. See flows_direct_function and FlowsDirectFunction.
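A minimal sketch of the two return shapes described above. The function names, argument dicts, and NodeConfig fields here are illustrative assumptions; FlowResult and NodeConfig are replaced with plain dict stand-ins so the sketch runs without the library installed.

```python
from typing import Optional, Tuple

# Stand-ins for pipecat_flows' FlowResult and NodeConfig types,
# so this sketch is self-contained.
FlowResult = dict
NodeConfig = dict


async def record_preference(args: dict) -> Tuple[FlowResult, None]:
    """Node function: performs work but stays in the current node,
    so the second element of the tuple is None."""
    return {"status": "ok", "preference": args["preference"]}, None


async def move_to_checkout(args: dict) -> Tuple[FlowResult, NodeConfig]:
    """Edge function: returns a result plus the next node to enter."""
    next_node: NodeConfig = {
        "name": "checkout",
        "task_messages": [
            {"role": "system", "content": "Collect payment details."}
        ],
    }
    return {"status": "ok"}, next_node
```

The framework inspects the second tuple element: None means "stay in this node," while a NodeConfig triggers a transition.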
LLM Provider Support
Pipecat Flows works with any LLM service that supports function calling. Pipecat handles provider-specific format conversion internally.
| Provider | Installation |
|---|---|
| OpenAI | uv add "pipecat-ai[openai]" |
| OpenAI-compatible | Provider-specific (see below) |
| Anthropic | uv add "pipecat-ai[anthropic]" |
| Google Gemini | uv add "pipecat-ai[google]" |
| AWS Bedrock | uv add "pipecat-ai[aws]" |
Any LLM service built on Pipecat's LLMService base class is supported. This includes OpenAI-compatible services such as Groq, Together, Cerebras, DeepSeek, and others.
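To make "provider-specific format conversion" concrete, here is an illustrative sketch (not Pipecat's actual internals) of how one function definition maps onto two providers' public tool formats. The helper names are hypothetical; the output field names follow the documented OpenAI and Anthropic tool schemas.

```python
def to_openai_tool(name: str, description: str, parameters: dict) -> dict:
    """Wrap a function definition in OpenAI's tool format."""
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,  # JSON Schema object
        },
    }


def to_anthropic_tool(name: str, description: str, parameters: dict) -> dict:
    """Wrap the same definition in Anthropic's tool format,
    which uses `input_schema` instead of `parameters`."""
    return {
        "name": name,
        "description": description,
        "input_schema": parameters,
    }
```

Because Pipecat performs this kind of translation for you, the same flow configuration can be reused across providers unchanged.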
Additional Notes
- State Management: Use the `flow_manager.state` dictionary for persistent conversation data
- Automatic Function Call Registration and Validation: All functions are automatically registered and validated at runtime
- Provider Compatibility: Pipecat handles provider-specific format conversion internally
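A short sketch of the state-management note above: data written to `flow_manager.state` in one node can be read in later nodes. The `_StubFlowManager` class and `collect_age` function are stand-ins for illustration, not the real FlowManager API.

```python
class _StubFlowManager:
    """Stand-in exposing only the `state` dict that persists
    for the lifetime of the conversation."""

    def __init__(self) -> None:
        self.state: dict = {}


flow_manager = _StubFlowManager()


async def collect_age(args: dict):
    # Store data gathered in this node so later nodes can read it
    flow_manager.state["age"] = args["age"]
    return {"status": "ok"}, None


async def confirm_details(args: dict):
    # A later node reads what an earlier node stored
    age = flow_manager.state.get("age")
    return {"status": "ok", "age": age}, None
```

In real flows the FlowManager instance is passed to your handlers, so state survives node transitions without any external storage.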