After deploying Omni via Docker Compose or AWS Terraform, follow these steps to get your instance fully operational.

Step 1: Access the Web UI and Create Your Admin Account

Navigate to your Omni URL (e.g., https://<your_domain_name>) and sign up. The first user to register automatically becomes the admin.
Admin users have access to Settings in the sidebar, where you can configure LLM providers, manage connectors, and invite users.
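Before signing up, you can confirm the instance is reachable from your machine. A minimal sketch (the URL is a placeholder for your actual domain):

```shell
# Print the HTTP status code for the Omni web UI.
# A 200 (or a 3xx redirect to the login page) means the instance is up.
curl -sS -o /dev/null -w "%{http_code}\n" "https://your-omni-domain.example.com/"
```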

Step 2: Configure an LLM Provider

LLM providers are configured through the admin panel at Settings > LLM Providers. You can add multiple providers and users can select which model to use per chat.

Add a Provider

  1. Go to Settings > LLM Providers
  2. Click Connect next to the provider you want to add
  3. Enter the API key/credentials
  4. Click Connect to save; the provider's predefined models become available automatically
You can add multiple providers simultaneously. For example, use Anthropic for complex reasoning and a local vLLM model for sensitive queries.
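Before entering an API key in the admin panel, you can sanity-check it directly against the provider's API. A sketch for Anthropic (assumes `ANTHROPIC_API_KEY` is exported; most other providers expose a similar list-models endpoint):

```shell
# Returns a JSON list of available models if the key is valid;
# an invalid key returns an authentication error instead.
curl -sS https://api.anthropic.com/v1/models \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01"
```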

AWS Bedrock Notes

When running on AWS (ECS/EC2) with an appropriate IAM role, no access keys are needed — the SDK uses instance credentials automatically. Otherwise, set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as environment variables.
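To confirm that the instance role (or exported access keys) actually grants Bedrock access, a quick check with the AWS CLI (assumes the CLI is installed and the region matches your deployment):

```shell
# Lists the Bedrock foundation model IDs visible to the current credentials.
# A successful response confirms the IAM role or access keys are working.
aws bedrock list-foundation-models --region us-east-1 \
  --query 'modelSummaries[].modelId' --output text
```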

vLLM Notes

If you’re running vLLM as part of the Docker Compose stack, enable GPU support first:
# Download GPU override
curl -fsSL -o docker/docker-compose.gpu.yml \
  https://raw.githubusercontent.com/getomnico/omni/master/docker/docker-compose.gpu.yml

# Start with vLLM profile
docker compose -f docker/docker-compose.gpu.yml --profile vllm up -d
Then add vLLM as a provider in the admin panel, pointing to http://vllm:8000/v1. See LLM Provider Configuration for all environment variable details.
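To verify the vLLM container is serving its OpenAI-compatible API before adding it in the admin panel, query it from the Docker host (inside the Compose network the URL is http://vllm:8000/v1):

```shell
# vLLM exposes an OpenAI-compatible /v1/models endpoint.
# Port 8000 must be published to the host for this to work from outside Compose.
curl -sS http://localhost:8000/v1/models
```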

Step 3: Configure an Embedding Provider

Embeddings power Omni’s semantic search. Unlike LLM providers, embedding configuration is done via environment variables (not the admin panel).

Add a Provider

  1. Set the embedding provider environment variables in your deployment configuration (for example, your Docker Compose .env file); consult the provider configuration reference for the variable names
  2. Restart the services so the change takes effect
Changing the embedding provider after documents have been indexed will require a full re-index, since different providers produce incompatible vector representations.
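If your embedding provider exposes an OpenAI-compatible API, you can sanity-check the credentials before restarting services. A sketch assuming OpenAI (the model name and key variable are assumptions; substitute your provider's values):

```shell
# Request a single embedding; a JSON response containing an "embedding"
# array confirms the key and model name are valid.
curl -sS https://api.openai.com/v1/embeddings \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "text-embedding-3-small", "input": "hello"}'
```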

Step 4: Enable and Configure Connectors

Connectors sync data from your external tools into Omni.

Enable Connectors

Set the ENABLED_CONNECTORS environment variable to a comma-separated list of the connectors you want to run:
# Example: enable Google Workspace, Slack, and Atlassian
ENABLED_CONNECTORS=google,slack,atlassian
Available connector names: google, slack, atlassian, github, hubspot, notion, filesystem, fireflies, web.

Restart services after changing this variable (same process as Step 3).
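Since a typo in ENABLED_CONNECTORS only surfaces after a restart, a small sketch like this can validate the list first (the valid-name list mirrors the one above):

```shell
#!/bin/sh
# Validate ENABLED_CONNECTORS against the known connector names
# before restarting services.
ENABLED_CONNECTORS="google,slack,atlassian"
VALID="google slack atlassian github hubspot notion filesystem fireflies web"

for c in $(printf '%s' "$ENABLED_CONNECTORS" | tr ',' ' '); do
  case " $VALID " in
    *" $c "*) echo "ok: $c" ;;
    *) echo "unknown connector: $c" >&2; exit 1 ;;
  esac
done
```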

Configure Each Connector

Once enabled, connectors are configured in the admin panel:
  1. Go to Settings > Integrations
  2. Select the connector you want to configure
  3. Follow the setup instructions (typically involves providing service credentials or API keys)
  4. Start the initial sync

Supported connectors include:

  - Google Workspace: Drive, Docs, Sheets, Gmail
  - Slack: messages, threads, files
  - Atlassian: Jira issues, Confluence pages

See All Connectors for the full list and setup guides.

Step 5: Verify Your Setup

Once you’ve configured at least one LLM provider, an embedding provider, and a connector, verify everything is working.

Check Connector Sync Status

Go to Settings > Integrations and confirm your connectors show a successful sync status. The initial sync may take a few minutes depending on the volume of data.

Test Search

  1. Navigate to the Search page
  2. Enter a query related to content from your connected sources
  3. Verify that results appear and are relevant

Test the AI Assistant

  1. Navigate to the Chat page
  2. Ask a question about information in your connected sources
  3. Verify the assistant responds with an answer that cites your documents
If search returns results but the AI assistant doesn’t work, double-check your LLM provider configuration in Settings > LLM Providers.
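If the assistant still fails, the backend logs usually show the underlying provider error. A sketch (the service name omni-backend is an assumption; list your actual services with docker compose ps):

```shell
# Tail recent backend logs and surface LLM/provider-related errors.
docker compose logs --tail=100 omni-backend 2>&1 | grep -iE "llm|provider|error"
```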

Next Steps