A Swift Vapor-based proxy server for the OpenRouter API that provides OpenAI-compatible endpoints for Xcode Intelligence.
- Proxy requests to OpenRouter API
- OpenAI-compatible API format
- Model filtering support
- Streaming responses
- SSL verification control
- CLI interface
- Request logging
- Ensure you have Swift 5.9+ installed
- Clone the repository
- Build the project:
```bash
swift build
```

Run the proxy server with default settings:

```bash
swift run OpenRouterXcodeProxy
```

Options:

- `--port, -p`: Port to run the server on (default: 8080)
- `--model-filter-file`: Path to model filter file
- `--disable-ssl-verify`: Disable SSL verification
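These flags map naturally onto a Swift ArgumentParser command. The sketch below shows one way they could be declared; the type name `ProxyCommand`, the property names, and the use of ArgumentParser itself are assumptions for illustration, not the project's actual source:

```swift
import ArgumentParser

// Illustrative sketch only; names and defaults are assumptions, not the project's source.
struct ProxyCommand: ParsableCommand {
    @Option(name: [.short, .long], help: "Port to run the server on")
    var port: Int = 8080

    @Option(name: .customLong("model-filter-file"), help: "Path to model filter file")
    var modelFilterFile: String?

    @Flag(name: .customLong("disable-ssl-verify"), help: "Disable SSL verification")
    var disableSSLVerify: Bool = false

    func run() throws {
        // Hypothetical entry point: start the Vapor app with the parsed options.
        print("Would start proxy on port \(port)")
    }
}

ProxyCommand.main()
```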
```bash
# Run on custom port
swift run OpenRouterXcodeProxy --port 3000

# Use model filter file
swift run OpenRouterXcodeProxy --model-filter-file ./allowed-models.txt

# Disable SSL verification
swift run OpenRouterXcodeProxy --disable-ssl-verify
```

Create a text file with one model ID per line to filter available models:
```
openai/gpt-4
anthropic/claude-3-sonnet
meta-llama/llama-2-70b-chat
```
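As a rough illustration of how such a file could be consumed, the sketch below reads one model ID per line (skipping blank lines) and uses the resulting set to filter a list of model IDs. The helper name `loadModelFilter` and the surrounding code are hypothetical, not the project's actual implementation:

```swift
import Foundation

// Hypothetical helper: load allowed model IDs, one per line, skipping blank lines.
func loadModelFilter(from path: String) throws -> Set<String> {
    let contents = try String(contentsOfFile: path, encoding: .utf8)
    let ids = contents
        .split(separator: "\n")
        .map { $0.trimmingCharacters(in: .whitespaces) }
        .filter { !$0.isEmpty }
    return Set(ids)
}

// Example: keep only models whose IDs appear in the filter file.
let allowed = try loadModelFilter(from: "./allowed-models.txt")
let upstreamModels = ["openai/gpt-4", "openai/gpt-3.5-turbo", "anthropic/claude-3-sonnet"]
let visibleModels = upstreamModels.filter { allowed.contains($0) }
print(visibleModels) // ["openai/gpt-4", "anthropic/claude-3-sonnet"]
```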
`GET /v1/models`
Returns available models in OpenAI format.
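For example, assuming the proxy is running locally on its default port, the model list can be fetched like this (the async URLSession API used here requires macOS 12+):

```swift
import Foundation

// List the models exposed by the proxy (assumes it is running on localhost:8080).
let url = URL(string: "http://localhost:8080/v1/models")!
let (data, _) = try await URLSession.shared.data(from: url)
print(String(data: data, encoding: .utf8) ?? "")
```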
`POST /v1/chat/completions`
Proxies chat completion requests to OpenRouter with full streaming support.
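A minimal streaming request in the OpenAI wire format might look like the following sketch; the model name and prompt are placeholders, and the snippet assumes the proxy is listening on localhost:8080:

```swift
import Foundation

// Build a streaming chat-completion request in the OpenAI wire format.
var request = URLRequest(url: URL(string: "http://localhost:8080/v1/chat/completions")!)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "Content-Type")
request.httpBody = try JSONSerialization.data(withJSONObject: [
    "model": "openai/gpt-4",
    "stream": true,
    "messages": [["role": "user", "content": "Hello!"]]
] as [String: Any])

// Read the server-sent events line by line as they arrive (requires macOS 12+).
let (bytes, _) = try await URLSession.shared.bytes(for: request)
for try await line in bytes.lines where line.hasPrefix("data:") {
    print(line)
}
```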
- `MODEL_FILTER_FILE`: Path to model filter file
- `DISABLE_SSL_VERIFY`: Set to "true" to disable SSL verification
- `PORT`: Server port (can also be set via command line)
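As a sketch of how these variables might be read with Vapor's `Environment` API (the configuration code around it is illustrative, not the project's actual bootstrap):

```swift
import Vapor

// Illustrative sketch: read the proxy's environment variables via Vapor's Environment API.
let port = Environment.get("PORT").flatMap(Int.init) ?? 8080
let modelFilterFile = Environment.get("MODEL_FILTER_FILE")            // optional path
let disableSSLVerify = Environment.get("DISABLE_SSL_VERIFY") == "true"

var env = try Environment.detect()
let app = Application(env)
defer { app.shutdown() }

app.http.server.configuration.port = port
// ... register routes, apply the model filter, configure the upstream client, etc.
try app.run()
```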
- Swift 5.9+
- macOS 10.15+
- Internet connection for OpenRouter API access