A clean, minimal CLI application to interact with local Ollama models — featuring session management, chat persistence, and resume functionality.
- 🔁 Start New Chats with any supported local Ollama model.
- 💾 Save Chat Sessions automatically with timestamp and model.
- ♻️ Resume Previous Chats seamlessly — continue from where you left off.
- 🧹 Delete Chats with interactive selection.
- 📜 View Past Chat Messages before resuming sessions.
- ✅ Fully persistent and local – no cloud or external API needed for chatting.
- 🧪 Built with cobra for a user-friendly CLI experience.
- 🌐 Internet-augmented responses using SerpAPI for real-time search within chats.
- 🧠 Supports streamed response generation via the Ollama API.
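For reference, streamed generation uses Ollama's HTTP API. Below is a minimal sketch of how streaming from a local Ollama instance can work, assuming the default `http://localhost:11434/api/chat` endpoint and the `llama3` model; it is an illustration, not this CLI's exact implementation.

```go
// Minimal sketch: stream a chat response from a local Ollama instance.
// Assumes Ollama is listening on the default port 11434.
package main

import (
	"bufio"
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	reqBody, _ := json.Marshal(map[string]any{
		"model": "llama3",
		"messages": []map[string]string{
			{"role": "user", "content": "Hello!"},
		},
		"stream": true, // Ollama streams newline-delimited JSON chunks
	})

	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		var chunk struct {
			Message struct {
				Content string `json:"content"`
			} `json:"message"`
			Done bool `json:"done"`
		}
		if err := json.Unmarshal(scanner.Bytes(), &chunk); err != nil {
			continue
		}
		fmt.Print(chunk.Message.Content) // print tokens as they arrive
		if chunk.Done {
			break
		}
	}
	fmt.Println()
}
```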
🔨 Option 1: Download Prebuilt Binaries (Recommended)
Available for: Windows, macOS (Intel & ARM), Linux
Go to the Releases page and download the binary for your OS:
| OS | File |
| --- | --- |
| Windows | ollama-cli_windows_amd64.exe |
| macOS | ollama-cli_darwin_amd64 (Intel) or ollama-cli_darwin_arm64 (Apple Silicon: M1, M2, M3) |
| Linux | ollama-cli_linux_amd64 |
After downloading:
Before running the CLI, make sure Ollama is running.
# (macOS/Linux)
chmod +x ollama-cli_darwin_amd64    # or ollama-cli_darwin_arm64 / ollama-cli_linux_amd64
./ollama-cli_darwin_amd64           # or ./ollama-cli_darwin_arm64 / ./ollama-cli_linux_amd64
# (Windows)
Open cmd, navigate to the directory containing the .exe, and run:
.\ollama-cli_windows_amd64.exe
⚙️ Option 2: Build from Source
Requirements: Go 1.22+, Ollama installed and running locally.
Ollama must be running on its default port (http://localhost:11434) for this CLI to function correctly.
Start Ollama in a separate terminal before using the CLI:
ollama serve
Then build and run the CLI:
git clone https://github.com/Ashank007/ollama-cli.git
cd ollama-cli
go build -o ollama-cli
./ollama-cli
Or run directly:
go run .
Start the CLI:
./ollama-cli chat
You'll see an interactive menu:
🧠 What would you like to do?
[1] Start a new chat
[2] Continue a previous chat
[3] Delete a chat
[4] Exit
List saved chats:
./ollama-cli list
- Chats are saved locally in the .chats directory.
- Filenames include session name and timestamp.
- All history is preserved in JSON format.
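As a rough illustration of how sessions could be modeled and written to disk as JSON, here is a hedged sketch. The actual struct definitions live in cmd/types.go and cmd/storage.go; the field names and file name below are assumptions, not the project's real schema.

```go
// Illustrative sketch of persisting a chat session as JSON in the .chats
// directory. Field names are assumptions; see cmd/types.go for the real ones.
package main

import (
	"encoding/json"
	"os"
	"path/filepath"
	"time"
)

type Message struct {
	Role    string `json:"role"` // "user" or "assistant"
	Content string `json:"content"`
}

type Session struct {
	Name      string    `json:"name"`
	Model     string    `json:"model"`
	CreatedAt time.Time `json:"created_at"`
	Messages  []Message `json:"messages"`
}

func saveSession(s Session, path string) error {
	if err := os.MkdirAll(filepath.Dir(path), 0o755); err != nil {
		return err
	}
	data, err := json.MarshalIndent(s, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(path, data, 0o644)
}

func main() {
	s := Session{
		Name:      "demo",
		Model:     "llama3",
		CreatedAt: time.Now(),
		Messages:  []Message{{Role: "user", Content: "Hello!"}},
	}
	_ = saveSession(s, filepath.Join(".chats", "demo.json"))
}
```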
To search within a chat, type:
search: What is the latest update on NVIDIA GPUs?
The CLI will:
- Fetch top search results from SerpAPI
- Display the result in your chat
- Automatically append it to the chat context
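Under the hood, such a search typically boils down to one call to SerpAPI's public search endpoint. Below is a minimal sketch assuming the standard https://serpapi.com/search.json endpoint; the project's cmd/websearch.go may parse more fields or format results differently.

```go
// Minimal sketch of fetching a top web result via SerpAPI's search endpoint.
// Illustrative only; not the CLI's exact implementation.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"net/url"
)

type serpResponse struct {
	OrganicResults []struct {
		Title   string `json:"title"`
		Link    string `json:"link"`
		Snippet string `json:"snippet"`
	} `json:"organic_results"`
}

func webSearch(query, apiKey string) (string, error) {
	u := "https://serpapi.com/search.json?q=" + url.QueryEscape(query) + "&api_key=" + url.QueryEscape(apiKey)
	resp, err := http.Get(u)
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()

	var parsed serpResponse
	if err := json.NewDecoder(resp.Body).Decode(&parsed); err != nil {
		return "", err
	}
	if len(parsed.OrganicResults) == 0 {
		return "", fmt.Errorf("no results for %q", query)
	}
	top := parsed.OrganicResults[0]
	return fmt.Sprintf("%s\n%s\n%s", top.Title, top.Snippet, top.Link), nil
}

func main() {
	result, err := webSearch("latest update on NVIDIA GPUs", "YOUR_SERPAPI_KEY")
	if err != nil {
		panic(err)
	}
	fmt.Println(result)
}
```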
Set your SerpAPI key like this:
./ollama-cli set-key <your-api-key>
To get a key:
- Go to https://serpapi.com and register or sign in.
- Open the API Key section.
- Copy your private API key and you're done.
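The key is saved by cmd/config.go. As a rough sketch of what persisting a key to a local config file can look like (the file location and format below are assumptions, not necessarily what the CLI uses):

```go
// Rough sketch of saving an API key to a local JSON config file.
// The path ~/.ollama-cli/config.json is a hypothetical choice.
package main

import (
	"encoding/json"
	"os"
	"path/filepath"
)

type config struct {
	SerpAPIKey string `json:"serpapi_key"`
}

func saveKey(key string) error {
	home, err := os.UserHomeDir()
	if err != nil {
		return err
	}
	dir := filepath.Join(home, ".ollama-cli") // hypothetical config directory
	if err := os.MkdirAll(dir, 0o700); err != nil {
		return err
	}
	data, err := json.MarshalIndent(config{SerpAPIKey: key}, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(filepath.Join(dir, "config.json"), data, 0o600)
}

func main() {
	if err := saveKey("YOUR_SERPAPI_KEY"); err != nil {
		panic(err)
	}
}
```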
| Command | Description |
| --- | --- |
| chat | Start, resume, or delete chat sessions |
| list | View all saved chat sessions |
| set-key | Set your SerpAPI key |

You can choose the model interactively when starting a chat, or predefine it using:
./ollama-cli chat --model llama3
cmd/
├── api.go            # API logic
├── chat.go           # Chat command & interaction
├── chat_handler.go   # Chat loop logic
├── config.go         # Saves the SerpAPI key
├── list_chats.go     # list command
├── listmodels.go     # Lists all local models
├── model_picker.go   # Model selection logic
├── root.go           # CLI root setup (see the sketch below)
├── setkey.go         # set-key command logic
├── storage.go        # Chat persistence (save/load/delete)
├── types.go          # Message and Session struct definitions
└── websearch.go      # Web search logic
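For readers new to cobra, here is a stripped-down sketch of how a root command like the one in cmd/root.go can wire subcommands together. It is illustrative only; the command names match this project, but the wiring is an assumption, not the actual source.

```go
// Stripped-down sketch of a cobra CLI layout similar to this project's.
// Illustrative only; see cmd/root.go for the real setup.
package main

import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

func main() {
	rootCmd := &cobra.Command{
		Use:   "ollama-cli",
		Short: "A minimal CLI for chatting with local Ollama models",
	}

	chatCmd := &cobra.Command{
		Use:   "chat",
		Short: "Start, resume, or delete chat sessions",
		Run: func(cmd *cobra.Command, args []string) {
			model, _ := cmd.Flags().GetString("model")
			fmt.Println("starting chat with model:", model)
		},
	}
	chatCmd.Flags().String("model", "", "predefine the model to use")

	listCmd := &cobra.Command{
		Use:   "list",
		Short: "View all saved chat sessions",
		Run: func(cmd *cobra.Command, args []string) {
			fmt.Println("listing saved chats...")
		},
	}

	rootCmd.AddCommand(chatCmd, listCmd)
	if err := rootCmd.Execute(); err != nil {
		os.Exit(1)
	}
}
```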
All chats run locally as long as your model is downloaded.
The CLI is built for Ollama only.
⚙️ Go 1.22
🐍 spf13/cobra for CLI structure
🧠 Ollama API for model interaction
🗂️ JSON-based local chat persistence
Licensed under the MIT License.
- Powered by Ollama
- CLI framework: spf13/cobra