Standardized proxy server for accessing 100+ LLM providers
LiteLLM provides a unified, OpenAI-compatible API for interacting with multiple LLM providers, including OpenAI, Anthropic, Cohere, and many more.
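Once the stack described below is running behind Caddy, any OpenAI-compatible client can talk to the proxy. A minimal sketch with curl (the model alias gpt-4o is a placeholder; use whichever models you configure later):

```bash
# Placeholder model alias and key -- substitute your own configuration
curl -sS https://your-domain.com/v1/chat/completions \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": "Hello from LiteLLM"}]
      }'
```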
Copy the example environment file and configure it:
```bash
cp .env.example .env
```

Edit the .env file to set your secure credentials.
Generate Secure Keys:
```bash
# For database passwords
openssl rand -base64 32

# For LiteLLM API keys
openssl rand -hex 32
```

Start PostgreSQL and LiteLLM:
```bash
docker compose up -d
```

This service uses Caddy for HTTPS/SSL termination. Create the external network:

```bash
docker network create caddy-network
```

Configure your Caddy reverse proxy to point to http://litellm:4000.
Example Caddyfile:
```
your-domain.com {
    reverse_proxy litellm:4000
}
```
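For the litellm hostname to resolve from Caddy, the Caddy container must be attached to the same external network. A minimal sketch of how that might look in Caddy's own docker-compose.yml (service name, image tag, and volume paths are illustrative, not taken from this repository):

```yaml
# Illustrative sketch: attach the Caddy container to the shared external network
services:
  caddy:
    image: caddy:2
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
    networks:
      - caddy-network

networks:
  caddy-network:
    external: true
```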
Once everything is running and Caddy is configured:
- LiteLLM API: https://your-domain.com
- Admin Dashboard: https://your-domain.com/ui
Log in to the admin dashboard using the credentials set in your .env file (LITELLM_UI_USERNAME and LITELLM_UI_PASSWORD).
Note: LiteLLM is not directly exposed on localhost. Access is only available through your Caddy reverse proxy for security.
View the logs:

```bash
# All services
docker compose logs -f

# LiteLLM only
docker compose logs -f litellm

# PostgreSQL only
docker compose logs -f postgres
```

Stop the services:

```bash
docker compose down
```

Restart the services:

```bash
docker compose restart
```

Update to the latest images:

```bash
# Pull latest images
docker compose pull

# Restart with new images
docker compose down
docker compose up -d
```

Edit the .env file to configure:
- POSTGRES_USER: PostgreSQL username
- POSTGRES_PASSWORD: PostgreSQL password
- POSTGRES_DB: Database name for LiteLLM
- LITELLM_PORT: LiteLLM service port (default: 4000)
- LITELLM_MASTER_KEY: Master API key for LiteLLM
- LITELLM_SALT_KEY: Salt key for encrypting credentials
- LITELLM_UI_USERNAME: Admin dashboard username
- LITELLM_UI_PASSWORD: Admin dashboard password
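For illustration, a filled-in .env might look like the following; every value is a placeholder, so generate your own secrets with the openssl commands above:

```bash
# Placeholder values only -- never reuse these
POSTGRES_USER=litellm
POSTGRES_PASSWORD=<output of: openssl rand -base64 32>
POSTGRES_DB=litellm
LITELLM_PORT=4000
LITELLM_MASTER_KEY=sk-<output of: openssl rand -hex 32>   # the sk- prefix is a LiteLLM convention, not a requirement
LITELLM_SALT_KEY=<output of: openssl rand -hex 32>
LITELLM_UI_USERNAME=admin
LITELLM_UI_PASSWORD=<strong password>
```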
Modify litellm-config.yaml to customize the following (a sketch follows this list):
- Logging levels
- Retry behavior
- Caching settings
- User tracking
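A rough sketch of what those settings can look like, assuming the standard LiteLLM proxy config format (key names may vary between LiteLLM versions, so verify against the LiteLLM documentation before copying):

```yaml
# Hypothetical excerpt of litellm-config.yaml -- example keys, not a complete config
litellm_settings:
  set_verbose: false               # logging verbosity
  num_retries: 3                   # retry behavior for failed provider calls
  cache: true                      # enable response caching

general_settings:
  master_key: os.environ/LITELLM_MASTER_KEY

model_list:
  - model_name: gpt-4o             # alias that clients request
    litellm_params:
      model: openai/gpt-4o         # provider/model the alias routes to
      api_key: os.environ/OPENAI_API_KEY
```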
- Access the admin dashboard at https://your-domain.com/ui
- Navigate to the Models or Providers section
- Add your API keys for various providers (OpenAI, Anthropic, etc.)
- Configure model routing as needed
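After providers are configured, clients can authenticate with the master key directly, or you can issue scoped virtual keys through the proxy's key-management endpoint. A hedged example (the /key/generate route and payload fields follow the LiteLLM docs; verify against the current API reference, and note that gpt-4o is a placeholder alias):

```bash
# Issue a virtual key restricted to a single model alias, valid for 30 days
curl -sS https://your-domain.com/key/generate \
  -H "Authorization: Bearer $LITELLM_MASTER_KEY" \
  -H "Content-Type: application/json" \
  -d '{"models": ["gpt-4o"], "duration": "30d"}'
```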
This setup includes:
- PostgreSQL 17: Persistent database for storing model configurations, API keys, and usage data
- LiteLLM Proxy: Main service providing standardized API access
- Networks:
  - litellm-network: Secure, isolated network for database communication
  - caddy-network: External network for Caddy reverse proxy integration
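As a sketch, the network wiring in docker-compose.yml typically looks something like this (illustrative only; the actual file in this repository may differ):

```yaml
# Illustrative sketch of the networks section in docker-compose.yml
networks:
  litellm-network: {}        # carries litellm <-> postgres traffic only
  caddy-network:
    external: true           # pre-created with: docker network create caddy-network
```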
- Keep your .env file secure and never commit it to version control
- Use strong, randomly generated passwords and keys
- The PostgreSQL database is only accessible within the Docker network
- LiteLLM is NOT directly exposed to the host - access only through the Caddy reverse proxy
- Always use HTTPS in production via Caddy with valid SSL certificates
- The caddy-network must be created before starting services
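To confirm that LiteLLM really is unreachable from the host, you can try connecting to the default port locally; the request is expected to fail (adjust 4000 if you changed LITELLM_PORT):

```bash
# Expected to fail, since the port is not published on the host
curl --max-time 5 http://localhost:4000 || echo "LiteLLM is not reachable on localhost (expected)"
```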
- All data is persisted in Docker volumes
- Database data survives container restarts
- LiteLLM configuration is stored in the database
- Health checks ensure services are running correctly
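To see the health status Docker reports for the containers (this assumes healthcheck blocks are defined in docker-compose.yml, as the point above implies):

```bash
# Overall service status, including health
docker compose ps

# Health of the litellm container specifically
docker inspect --format '{{.State.Health.Status}}' "$(docker compose ps -q litellm)"
```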
- LiteLLM Documentation: https://docs.litellm.ai/
- Supported Providers: https://docs.litellm.ai/docs/providers
- API Reference: https://docs.litellm.ai/docs/api
Network not found error:
```bash
# Create the caddy-network if it doesn't exist
docker network create caddy-network
```

Check if internal port 5432 is in use (PostgreSQL):

```bash
lsof -i :5432
```

Ensure PostgreSQL is healthy:

```bash
docker compose ps
docker compose logs postgres
```

Reset everything (removes volumes and deletes all stored data):

```bash
docker compose down -v
docker compose up -d
```

LiteLLM Proxy Server - One API for all LLMs