EdgePilot is an on-premises AI copilot that combines a lightweight FastAPI backend with an Electron desktop UI. It features full MCP (Model Context Protocol) integration, enabling Gemini to autonomously monitor your system, launch applications with scheduling, and manage processes through natural language.
- 🤖 MCP Integration - Gemini can autonomously call tools for system monitoring, app launching, and process management
- 📊 Real-time Metrics - CPU, memory, disk, network monitoring with process-level details and executable paths
- 🚀 Smart App Launcher - Launch applications by name with delay support using Windows Start Menu search
- 🎯 Smart Tool Calling - LLM automatically decides when to gather metrics, launch apps, or end processes
- 🖥️ Desktop UI - Electron-based chat interface with dark theme
- 🔌 Provider Abstraction - Pluggable system supporting Gemini (with tools), Claude, and GPT
- 💾 Local Persistence - JSON-based chat history and usage analytics (privacy-first)
- ⚡ Lightweight - Clean codebase focused on core functionality
```bash
# Install dependencies
pip install -r requirements.txt

# Configure your API key
# Edit env/.env and add: GEMINI_API_KEY=your_key_here

# Install the Electron UI (one-time, requires Node.js 18+)
cd ui && npm install && cd ..

# Launch the full application (API + Electron UI)
python main.py

# Or run the API only:
python main.py serve --host 127.0.0.1 --port 8000
```

```bash
# Test all MCP tools integration
python test_tools.py

# Test the launcher directly
python tools/launcher.py
```

Open the UI and try these prompts with Gemini:
System Monitoring:
- "What's my current CPU and memory usage?"
- "Show me the top 5 processes using the most CPU"
Application Discovery:
- "What apps do I have installed?"
- "Do I have Discord installed?"
- "List all my games"
Application Launching:
- "Launch notepad"
- "Open Chrome in 30 seconds"
- "Start Minecraft in 1 minute"
Process Control:
- "Close all Chrome instances"
- "End the notepad process"
Edit env/.env:

```bash
GEMINI_API_KEY=your_gemini_key     # Required for MCP tools
ANTHROPIC_API_KEY=your_claude_key  # Optional
OPENAI_API_KEY=your_openai_key     # Optional
DEFAULT_PROVIDER=gemini            # Use gemini for tool calling
```

```
EdgePilot/
├── README.md
├── requirements.txt
├── test_tools.py           # MCP tools integration test
├── main.py                 # FastAPI backend + CLI entry point
├── ui/                     # Electron desktop application
│   ├── index.html          # UI markup
│   ├── renderer.js         # Frontend logic
│   ├── styles.css          # Dark theme styling
│   ├── main.js             # Electron main process
│   └── package.json        # Node.js dependencies
├── providers/              # LLM provider adapters
│   ├── base.py             # BaseLLM protocol + ToolCall classes
│   ├── gemini.py           # Gemini with function calling
│   ├── claude.py           # Claude adapter
│   └── gpt.py              # GPT placeholder
├── tools/                  # System utilities exposed as tools
│   ├── __init__.py         # Exports gather_metrics, launch, search, list_apps, end_task
│   ├── metrics.py          # System monitoring (CPU, memory, processes)
│   ├── launcher.py         # Application launcher with Windows Start Menu search
│   └── end_task.py         # Process termination
├── MCP/                    # Model Context Protocol integration
│   ├── tool_schemas.py     # Function-calling schemas for all 5 tools
│   ├── tool_executor.py    # Tool execution engine
│   └── README.md           # Full MCP documentation
├── env/.env                # API keys and configuration
└── data/                   # JSON persistence
    ├── chat_history.json   # Chat sessions
    ├── usage_metrics.json  # API usage tracking
    └── tool_call_history.json  # Tool execution logs
```
- `GET /api/providers` – enumerate providers and configuration status
- `GET /api/chats` – list chat sessions with summary metadata
- `POST /api/chats` – create a new chat session
- `GET /api/chats/{chat_id}` – fetch the full conversation history
- `POST /api/chats/{chat_id}/messages` – send a prompt and get the LLM response (with tool calling)
- `GET /api/metrics` – retrieve a current system metrics snapshot
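As a quick sketch, the metrics and chat endpoints can be exercised with nothing but Python's standard library. Only the paths and the default host/port come from this README; the request body key (`"message"`) and the response shapes are assumptions:

```python
import json
import urllib.request

BASE = "http://127.0.0.1:8000"  # default bind address from `python main.py serve`

def api_url(path: str) -> str:
    """Join an endpoint path onto the local API base URL."""
    return f"{BASE}{path}"

def get_metrics() -> dict:
    """GET /api/metrics - fetch the current system metrics snapshot."""
    with urllib.request.urlopen(api_url("/api/metrics")) as resp:
        return json.load(resp)

def send_message(chat_id: str, prompt: str) -> dict:
    """POST /api/chats/{chat_id}/messages - send a prompt to a chat session.

    The JSON body key "message" is an assumption; check main.py for the
    actual request model.
    """
    req = urllib.request.Request(
        api_url(f"/api/chats/{chat_id}/messages"),
        data=json.dumps({"message": prompt}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the backend running, `get_metrics()` returns the same snapshot the UI displays.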
EdgePilot includes full MCP integration with 5 tools, with launcher.py providing intelligent app launching:
### gather_metrics

Collects comprehensive system metrics including CPU, memory, disk, network, battery, and all running processes with executable paths.

```python
# The LLM can call this automatically when the user asks about system status
gather_metrics(top_n=10, all_processes=False)
```

### launch

Launches applications by name with an optional delay. Uses Windows Start Menu search and Microsoft Store app discovery.
```python
# The LLM calls this when the user wants to launch an app
launch(app_name="chrome", delay_seconds=0)
launch(app_name="minecraft", delay_seconds=30)  # Launch in 30 seconds
```

Features:
- Searches Windows Start Menu shortcuts
- Finds Microsoft Store/UWP apps
- Supports delayed execution with threading
- Accepts simple app names (no paths needed)
### search

Searches for installed applications by name. Returns the list of matching apps found in the Start Menu and Microsoft Store.

```python
# The LLM calls this to check whether an app is installed
search(app_name="discord")  # Returns: ["Discord"]
search(app_name="game")     # Returns: ["Game Bar", "Steam", ...]
```

### list_apps

Lists all installed applications with optional filtering. Ideal for "what apps do I have?" queries.
```python
# The LLM calls this to browse available apps
list_apps(filter_term="")      # Returns all apps
list_apps(filter_term="game")  # Returns only apps with "game" in the name
```

### end_task

Terminates processes by name, path, or command-line identifier.
```python
# The LLM calls this when the user wants to close an app
end_task(identifier="chrome", force=False)
end_task(identifier="notepad", force=True)
```

How tool calling works:

- The user sends a message in natural language
- Gemini analyzes the request and decides which tools to call
- Tools are executed automatically (e.g., launching apps, gathering metrics)
- Results are fed back to Gemini
- Gemini formulates a human-readable response
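The execute-and-feed-back step of this loop can be sketched as follows. `ToolCall` and the `TOOLS` registry below are illustrative stand-ins, not EdgePilot's actual classes (those live in providers/base.py and MCP/tool_executor.py):

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class ToolCall:
    """Stand-in for a tool call parsed out of the model's response."""
    name: str
    args: dict[str, Any]

# Registry mapping tool names to callables; the real dispatch lives in
# MCP/tool_executor.py. The stub below fakes a gather_metrics result.
TOOLS: dict[str, Callable[..., Any]] = {
    "gather_metrics": lambda top_n=10, **_: {
        "processes": [{"name": "chrome.exe", "cpu": 15.2}][:top_n]
    },
}

def execute_tool_calls(calls: list[ToolCall]) -> list[dict[str, Any]]:
    """Run each requested tool and collect results to feed back to the model."""
    results = []
    for call in calls:
        fn = TOOLS[call.name]
        results.append({"tool": call.name, "result": fn(**call.args)})
    return results

# Simulated turn: the model asked for the top 3 CPU processes
out = execute_tool_calls([ToolCall("gather_metrics", {"top_n": 3})])
```

In the real loop, `out` is serialized back into the conversation so Gemini can phrase the final answer.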
Example 1: System Monitoring

```
User: "Show me what's using the most CPU"
→ Gemini calls gather_metrics(top_n=3)
→ Receives: {processes: [{name: "chrome.exe", cpu: 15.2%, ...}]}
→ Responds: "Chrome is using the most CPU at 15.2%..."
```

Example 2: Scheduled App Launch

```
User: "Launch Minecraft in 30 seconds"
→ Gemini calls launch(app_name="minecraft", delay_seconds=30)
→ Receives: {success: true, message: "Scheduled 'minecraft' to launch in 30 seconds"}
→ Responds: "I've scheduled Minecraft to launch in 30 seconds!"
```

Example 3: App Discovery

```
User: "What games do I have?"
→ Gemini calls list_apps(filter_term="game")
→ Receives: {count: 3, apps: ["Game Bar", "Steam", "Minecraft"]}
→ Responds: "You have 3 games installed: Game Bar, Steam, and Minecraft"
```
See MCP/README.md for the complete guide. Adding a tool is a simple 5-step process:

1. Create the tool function in `tools/`
2. Export it in `tools/__init__.py`
3. Add its schema to `MCP/tool_schemas.py`
4. Add an executor entry in `MCP/tool_executor.py`
5. Restart and test!
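Condensed into one file, the pieces might look like the sketch below. The tool itself (`say_hello`) is hypothetical, and the schema/dispatch shapes mirror the common Gemini function-calling layout; the exact format EdgePilot expects is defined in MCP/tool_schemas.py and MCP/tool_executor.py:

```python
# Step 1: the tool function (would live in tools/say_hello.py)
def say_hello(name: str = "world") -> dict:
    """Toy tool: greet the given name."""
    return {"message": f"Hello, {name}!"}

# Step 3: a schema entry (would be added to MCP/tool_schemas.py)
SAY_HELLO_SCHEMA = {
    "name": "say_hello",
    "description": "Greet a user by name.",
    "parameters": {
        "type": "object",
        "properties": {"name": {"type": "string"}},
    },
}

# Step 4: an executor entry (would be added to MCP/tool_executor.py)
EXECUTORS = {"say_hello": lambda args: say_hello(**args)}

# What the executor does when the model requests the tool
result = EXECUTORS["say_hello"]({"name": "EdgePilot"})
```

Step 2 is just re-exporting `say_hello` from `tools/__init__.py`, and step 5 is restarting the backend so the schema is registered with Gemini.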
```bash
# Test all MCP tools integration
python test_tools.py

# Test the launcher directly (launches notepad, chrome, minecraft)
python tools/launcher.py

# Run modules directly
python -c "from tools import gather_metrics; print(gather_metrics(top_n=5))"
python -c "from tools import search; print(search('chrome'))"
python -c "from tools import list_apps; print(list_apps('game'))"
```

To add a new provider:

- Add a module under `providers/` implementing the `BaseLLM` protocol
- Register it in `providers/__init__.py`
- Add environment variables for API keys/models
- For tool support, implement `enable_tools()` and parse `tool_calls` in responses
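A conforming provider can be sketched as follows. The `BaseLLM` method names below are assumptions for illustration; the real protocol is defined in providers/base.py:

```python
from typing import Protocol

class BaseLLM(Protocol):
    """Assumed shape of the protocol in providers/base.py."""
    def chat(self, messages: list[dict]) -> str: ...
    def enable_tools(self, schemas: list[dict]) -> None: ...

class EchoProvider:
    """Toy provider satisfying the protocol: echoes the last user message."""

    def __init__(self) -> None:
        self.tool_schemas: list[dict] = []

    def chat(self, messages: list[dict]) -> str:
        # A real adapter would call its vendor SDK here
        return messages[-1]["content"]

    def enable_tools(self, schemas: list[dict]) -> None:
        # A real adapter would translate schemas to the vendor's tool format
        self.tool_schemas = schemas

provider: BaseLLM = EchoProvider()  # structural typing: no inheritance needed
reply = provider.chat([{"role": "user", "content": "ping"}])
```

Because `BaseLLM` is a `Protocol`, any class with matching method signatures satisfies it without subclassing, which is what keeps the adapters pluggable.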
EdgePilot's application launching is powered by launcher.py, which provides:
- Windows Start Menu Search - searches .lnk shortcuts in user and system Start Menu locations
- Microsoft Store Apps - discovers and launches UWP/Store apps via PowerShell
- Delayed Execution - background threading for scheduled launches
- Intelligent Fallback - falls back to the Windows `start` command for built-in apps
- Simple API - just 3 core functions: `launch()`, `search()`, `list_apps()`

The LLM can use simple app names like "chrome", "minecraft", or "notepad" without needing full paths!
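The Start Menu search idea can be sketched roughly like this. This is not EdgePilot's actual launcher.py, just an outline of the approach: scan the standard user and system Start Menu folders for .lnk shortcuts and match them by name, case-insensitively:

```python
import os
from pathlib import Path

# Standard Start Menu shortcut locations on Windows (empty paths on other OSes)
START_MENU_DIRS = [
    Path(os.environ.get("APPDATA", "")) / "Microsoft/Windows/Start Menu/Programs",
    Path(os.environ.get("PROGRAMDATA", "")) / "Microsoft/Windows/Start Menu/Programs",
]

def match_shortcuts(names: list[str], query: str) -> list[str]:
    """Return shortcut names containing the query, case-insensitively."""
    return [n for n in names if query.lower() in n.lower()]

def search_start_menu(query: str) -> list[str]:
    """Scan both Start Menu trees for .lnk files matching the query."""
    found: list[str] = []
    for root in START_MENU_DIRS:
        if root.is_dir():
            found.extend(p.stem for p in root.rglob("*.lnk"))
    return match_shortcuts(found, query)
```

Launching a matched shortcut is then a matter of handing its path to the shell (e.g. `os.startfile` on Windows), with the `start` command as the fallback for built-ins.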
- `README.md` (this file) - Quick start and overview
- `MCP/README.md` - Complete MCP integration guide
- `tools/launcher.py` - Application launcher implementation with detailed documentation
MIT License - See LICENSE file for details