Project IRIS is a robust, containerized ecosystem for running Large Language Models (LLMs) locally and privately.
It orchestrates a high-performance AI stack using Docker Compose, bridging the Ollama inference engine with modern, user-friendly web interfaces, fully optimized for Nvidia GPU acceleration.
Project IRIS connects three powerful components in an isolated network (`iris-network`):

- Ollama (Backend): The engine that runs the models (Llama 3, Mistral, Gemma, etc.). Configured for full Nvidia GPU passthrough.
- Lobe Chat (UI 1): A modern, high-performance chatbot interface with plugin support and a clean design. Accessible at port 3210.
- Open WebUI (UI 2): A feature-rich interface (formerly Ollama WebUI) offering advanced chat history, RAG (Retrieval-Augmented Generation), and model management. Accessible at port 3211.
- Docker & Docker Compose installed.
- Nvidia GPU (Optional but recommended).
- Nvidia Container Toolkit installed (required for the GPU configuration in `docker-compose.yml`).
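For reference, the GPU passthrough that the Nvidia Container Toolkit enables is normally expressed with a `deploy` block in Compose. The excerpt below is an illustrative sketch of that standard syntax; the actual service definition in this repo's `docker-compose.yml` may differ:

```yaml
# Hypothetical excerpt -- standard Compose syntax for Nvidia GPU passthrough.
# Check the repo's docker-compose.yml for the authoritative configuration.
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all            # expose all GPUs to the container
              capabilities: [gpu]
```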
```bash
git clone https://github.com/Cluyverth/Project-IRIS.git
cd Project-IRIS
```

Run the stack in detached mode:

```bash
docker compose up -d
```

| Service | URL | Description |
|---|---|---|
| Lobe Chat | http://localhost:3210 | Modern, aesthetic chat interface. |
| Open WebUI | http://localhost:3211 | Advanced UI with RAG and management features. |
| Ollama API | http://localhost:11434 | Direct API access. |
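As a quick sanity check of the direct API, Ollama's `/api/tags` endpoint returns the models currently pulled. A minimal Python sketch (the helper `parse_tag_names` is our own name, not part of any library):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default port mapped by the stack

def parse_tag_names(body: str) -> list[str]:
    """Extract model names from an /api/tags JSON response."""
    data = json.loads(body)
    return [model["name"] for model in data.get("models", [])]

if __name__ == "__main__":
    # Requires the stack to be running (docker compose up -d).
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        print(parse_tag_names(resp.read().decode("utf-8")))
```

An empty list here simply means no model has been pulled yet, which is the expected state right after the first start.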
Since the Ollama container starts empty, you need to pull a model first. You can do this via the Open WebUI interface or from the command line:

```bash
# Example: pulling and running Llama 3 inside the container
docker compose exec ollama ollama run llama3
```
```
├── docker-compose.yml    # Orchestration logic
├── .gitignore            # Ignores Python caches & personal docs
├── .dockerignore         # Optimizes build context
└── PersonalDocsFolder/   # (Ignored) Place for local RAG documents/scripts
```

This project is set up for Python development/scripting (e.g., for custom RAG pipelines or automation). The `.gitignore` is configured to keep your environment clean by ignoring `__pycache__`, `.venv`, and `.env` files.