Lightweight streaming proxy for Anthropic and OpenAI, focused on computer-use (CUA) loops via the Messages/Responses APIs.
- Configure server env

  ```bash
  export ANTHROPIC_API_KEY=your-key
  export OPENAI_API_KEY=your-key
  # Redis (required)
  export REDIS_URL=redis://localhost:6379
  ```

- Install & run

  ```bash
  npm install
  npm run build
  npm start
  ```

- Provision keys (CLI)

  ```bash
  npm run key:new -- alice
  npm run key:list
  ```

  Keys are stored in Redis under `keys:rec:<pk>` (and indexed in `keys:all`).
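The record shape is not documented here, but a minimal Node sketch for inspecting provisioned keys might look like the following, assuming `keys:all` is a Redis set of key ids and each `keys:rec:<pk>` record is a hash (adjust the read calls if records are stored as JSON strings):

```ts
import { createClient } from 'redis'

// Connect using the same REDIS_URL the server uses.
const redis = createClient({ url: process.env.REDIS_URL ?? 'redis://localhost:6379' })
await redis.connect()

// Assumption: keys:all is a set of issued key ids.
const ids = await redis.sMembers('keys:all')
for (const pk of ids) {
  // Assumption: each record is a hash; use GET + JSON.parse if it is a JSON string.
  const rec = await redis.hGetAll(`keys:rec:${pk}`)
  console.log(pk, rec)
}

await redis.quit()
```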
```bash
# Test Anthropic (Messages API)
curl -X POST http://localhost:3000/anthropic/v1/messages \
  -H "Authorization: Bearer pk-<user-key>" \
  -H "Content-Type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{"model": "claude-3-haiku-20240307", "max_tokens": 10, "messages": [{"role": "user", "content": "Hi"}]}'

# Test OpenAI (Responses/Chat)
curl -X POST http://localhost:3000/openai/v1/chat/completions \
  -H "Authorization: Bearer pk-<user-key>" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "max_tokens": 10, "messages": [{"role": "user", "content": "Hi"}]}'
```
Note: Streaming (SSE / chunked JSON) is forwarded transparently.
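For example, a client can stream through the proxy with the official OpenAI SDK. This is a minimal sketch assuming a local proxy and an issued `pk-...` key:

```ts
import OpenAI from 'openai'

// Point the SDK at the proxy; the pk-... key is one issued via `npm run key:new`.
const client = new OpenAI({ baseURL: 'http://localhost:3000/openai', apiKey: 'pk-...' })

const stream = await client.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: 'Say hi' }],
  stream: true, // SSE chunks are relayed by the proxy as they arrive upstream
})

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '')
}
```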
## Status
- Health: `GET /health` → `{ status: 'ok' }`
- Version: `GET /version` → `{ version, commit }`
- Status: `GET /status` → probes Redis and the upstream providers (OpenAI/Anthropic) using the configured credentials and returns per-target `ok`, `status`, and `latencyMs`, plus an overall status.
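A small sketch of a health check against these endpoints; the `/status` field names below (`targets`, `ok`, `latencyMs`) mirror the description above, but the exact response shape may differ:

```ts
const base = 'http://localhost:3000'

// Liveness: expected to return { status: 'ok' }.
const health = await fetch(`${base}/health`).then((r) => r.json())
console.log('health:', health.status)

// Deeper probe of Redis and upstream providers.
// Assumption: per-target probes are nested under `targets`.
const status = await fetch(`${base}/status`).then((r) => r.json())
for (const [target, probe] of Object.entries<any>(status.targets ?? {})) {
  console.log(target, probe.ok ? 'ok' : 'down', `${probe.latencyMs}ms`)
}
```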
## CUA Loop Test (Simple)
Minimal commands; ensure `.env` contains `OPENAI_API_KEY` when running against the official OpenAI API.
- Ticketek (find Ticket Delivery FAQs): `npm run cua:ticketek`
- AccuWeather (find the monthly forecast for Manchester, GB): `npm run cua:accuweather`
Notes:
- Uses Playwright Chromium (downloaded automatically).
- Switches to Node 20 with `nvm` if available.
- Always acknowledges tool safety checks in this harness to keep the loop simple (see the sketch below).
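As an illustration of that last point, here is a hedged sketch of how one turn of an OpenAI Responses computer-use loop can echo pending safety checks back as acknowledged. The item shapes follow OpenAI's published computer-use interface; the actual harness scripts may differ.

```ts
import OpenAI from 'openai'

const client = new OpenAI() // uses OPENAI_API_KEY from .env

// One loop turn: after executing the requested action and capturing a screenshot,
// send the result back and acknowledge every pending safety check verbatim.
async function answerComputerCall(prevResponseId: string, call: any, screenshotB64: string) {
  return client.responses.create({
    model: 'computer-use-preview',
    previous_response_id: prevResponseId,
    truncation: 'auto',
    tools: [{ type: 'computer_use_preview', display_width: 1024, display_height: 768, environment: 'browser' }],
    input: [{
      type: 'computer_call_output',
      call_id: call.call_id,
      acknowledged_safety_checks: call.pending_safety_checks ?? [],
      output: { type: 'computer_screenshot', image_url: `data:image/png;base64,${screenshotB64}` },
    }],
  } as any)
}
```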
## ACUA Loop (Anthropic)
Run the Anthropic computer-use loop locally using the Messages API tool-calls.
```bash
# Ensure ANTHROPIC_API_KEY is set (or use the proxy base URL and a pk-...)
export ANTHROPIC_API_KEY=sk-ant-...
# Optional: route via this server proxy
# export ANTHROPIC_BASE_URL=http://localhost:3000/anthropic
# Ticketek and AccuWeather tasks
npm run acua:ticketek
npm run acua:accuweather
# Generic
ACUA_TASK=tasks/accuweather.json npm run acua:run
```

Environment variables:

- `ANTHROPIC_API_KEY`: API key for Anthropic (required unless routing via the proxy with a `pk-...` key).
- `ANTHROPIC_BASE_URL` (optional): set to `http://localhost:3000/anthropic` to route via this server.
- `ANTHROPIC_MODEL` (optional): defaults to `claude-sonnet-4-20250514`.
- `ACUA_THINKING_BUDGET` (optional): enable thinking for Claude 4/3.7 (e.g. `1024`).
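A minimal sketch of how these variables would map onto an Anthropic Messages call; the actual `acua` scripts may wire this differently, and the task prompt here is only illustrative:

```ts
import Anthropic from '@anthropic-ai/sdk'

const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  baseURL: process.env.ANTHROPIC_BASE_URL, // e.g. http://localhost:3000/anthropic
})

const budget = Number(process.env.ACUA_THINKING_BUDGET ?? 0)

const msg = await client.messages.create({
  model: process.env.ANTHROPIC_MODEL ?? 'claude-sonnet-4-20250514',
  max_tokens: 4096, // must exceed the thinking budget when thinking is enabled
  // Extended thinking is only sent when a budget is configured.
  ...(budget > 0 ? { thinking: { type: 'enabled' as const, budget_tokens: budget } } : {}),
  messages: [{ role: 'user', content: 'Find the monthly forecast for Manchester, GB' }],
})
console.log(msg.content)
```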
Run against OpenAI (uses `.env` `OPENAI_API_KEY`):

- Ticketek: `npm run cua:ticketek`
- AccuWeather: `npm run cua:accuweather`
Notes:
- All scripts auto-install Chromium and use Node 20 via `nvm` if available.
- OpenAI SDKs: set `baseURL` to `http://<your-proxy>/openai` and use your issued `pk-...` as the API key. Example (Node):

  ```js
  import OpenAI from 'openai'
  const client = new OpenAI({ baseURL: 'http://localhost:3000/openai', apiKey: 'pk-...' })
  const res = await client.chat.completions.create({ model: 'gpt-4o-mini', messages: [{ role: 'user', content: 'hi' }] })
  ```
- Anthropic SDKs: set `baseURL` to `http://<your-proxy>/anthropic` and use your issued `pk-...` as the API key. You may omit `anthropic-version`; a default is set if missing. A parallel Node example is sketched after this list.
- Per-key counters (total/success/error/lastUsedAt) are updated after each proxied request in Redis (`keys:rec:<pk>`).
- Additional Redis usage metrics (hashed by key id) are in `usage:key:<sha16>`.
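As referenced in the Anthropic SDKs note above, here is a minimal Node sketch mirroring the OpenAI example, with the model name chosen only as an illustration:

```ts
import Anthropic from '@anthropic-ai/sdk'

// Issued pk-... key, proxy base URL; anthropic-version is defaulted by the proxy if omitted.
const client = new Anthropic({ baseURL: 'http://localhost:3000/anthropic', apiKey: 'pk-...' })

const res = await client.messages.create({
  model: 'claude-3-haiku-20240307',
  max_tokens: 10,
  messages: [{ role: 'user', content: 'hi' }],
})
console.log(res.content)
```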