Omni supports two deployment strategies: Docker Compose and AWS Terraform.

## Documentation Index

Fetch the complete documentation index at: https://docs.getomni.co/llms.txt
Use this file to discover all available pages before exploring further.
## Deployment Options

### Docker Compose

Best for development, small teams, and single-server production. Get running in 10-15 minutes.

### AWS Terraform

Best for high availability, auto-scaling, and multi-region deployments. Production-ready in 30-45 minutes.
After deploying, follow the Initial Setup guide to configure LLM providers, embeddings, and connectors.
## Omni Components

All deployments include the same core components:

| Component | Purpose |
|---|---|
| omni-web | SvelteKit frontend and API |
| omni-searcher | Search query processing |
| omni-indexer | Document processing |
| omni-ai | LLM and embedding orchestration |
| omni-connector-manager | Connector service orchestration |
| omni-sandbox | Isolated execution environment for agent tools (bash, Python, files) |
| omni-docling | Optional structured text extraction for PDFs, Office docs, and images |
| PostgreSQL | Database with pg_search and pgvector |
| Redis | Caching and sessions |
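To illustrate how these components fit together, a Docker Compose deployment might wire them up along these lines. This is a minimal sketch, not the official compose file: the service names come from the table above, but the image names, ports, and `depends_on` relationships shown here are assumptions — consult the actual deployment guide for real values.

```yaml
# Hypothetical sketch only — image names, ports, and dependency edges
# are placeholders, not taken from the official Omni compose file.
services:
  omni-web:                  # SvelteKit frontend and API
    image: omni/web:latest   # assumed image name
    ports:
      - "3000:3000"          # assumed port
    depends_on: [postgres, redis, omni-searcher, omni-ai]
  omni-searcher:             # search query processing
    image: omni/searcher:latest
    depends_on: [postgres]
  omni-indexer:              # document processing
    image: omni/indexer:latest
    depends_on: [postgres]
  omni-ai:                   # LLM and embedding orchestration
    image: omni/ai:latest
  omni-connector-manager:    # orchestrates connector services
    image: omni/connector-manager:latest
  omni-sandbox:              # isolated execution for agent tools
    image: omni/sandbox:latest
  postgres:
    image: postgres:16       # would also need pg_search and pgvector
  redis:
    image: redis:7           # caching and sessions
```

The optional omni-docling service would be added the same way when structured extraction of PDFs, Office documents, and images is needed.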