# CA Server

Lightweight streaming proxy for Anthropic and OpenAI, focused on computer-use (CUA) loops via the Messages/Responses APIs.

## Setup

1. Configure server env:

   ```bash
   export ANTHROPIC_API_KEY=your-key
   export OPENAI_API_KEY=your-key
   # Redis (required)
   export REDIS_URL=redis://localhost:6379
   ```

2. Install & run:

   ```bash
   npm install
   npm run build
   npm start
   ```

3. Provision keys (CLI):

   ```bash
   npm run key:new -- alice
   npm run key:list
   ```

Keys are stored in Redis under `keys:rec:<pk>` (and indexed in `keys:all`).

## Test Manually

```bash
# Test Anthropic (Messages API)
curl -X POST http://localhost:3000/anthropic/v1/messages \
  -H "Authorization: Bearer pk-<user-key>" \
  -H "Content-Type: application/json" \
  -H "anthropic-version: 2023-06-01" \
  -d '{"model": "claude-3-haiku-20240307", "max_tokens": 10, "messages": [{"role": "user", "content": "Hi"}]}'

# Test OpenAI (Responses/Chat)
curl -X POST http://localhost:3000/openai/v1/chat/completions \
  -H "Authorization: Bearer pk-<user-key>" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-3.5-turbo", "max_tokens": 10, "messages": [{"role": "user", "content": "Hi"}]}'
```

Note: Streaming (SSE / chunked JSON) is forwarded transparently.
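
To check streaming end to end, here is a minimal sketch (assuming a Node 18+ runtime with built-in `fetch`; the base URL, `pk-<user-key>`, and model name are placeholders) that sends a streaming Messages request through the proxy and prints the forwarded SSE bytes as they arrive:

```ts
// Send a streaming request through the proxy and dump the raw SSE events.
const res = await fetch('http://localhost:3000/anthropic/v1/messages', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer pk-<user-key>',
    'Content-Type': 'application/json',
    'anthropic-version': '2023-06-01',
  },
  body: JSON.stringify({
    model: 'claude-3-haiku-20240307',
    max_tokens: 64,
    stream: true,
    messages: [{ role: 'user', content: 'Hi' }],
  }),
})

// The proxy forwards the stream unmodified, so we simply read the response body.
const reader = res.body!.getReader()
const decoder = new TextDecoder()
for (;;) {
  const { value, done } = await reader.read()
  if (done) break
  process.stdout.write(decoder.decode(value, { stream: true }))
}
```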

## Status

- Health: `GET /health` → `{ status: 'ok' }`
- Version: `GET /version` → `{ version, commit }`
- Status: `GET /status` → probes Redis and the upstream providers (OpenAI/Anthropic) using the configured credentials and returns per-target `ok`, `status`/`latencyMs`, plus an overall status.
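
A quick readiness check before running any loops, as a sketch (the base URL is whatever host/port you deployed on):

```ts
// Hit the three status endpoints and print their JSON payloads.
const base = 'http://localhost:3000' // adjust for your deployment
for (const path of ['/health', '/version', '/status']) {
  const res = await fetch(`${base}${path}`)
  console.log(path, res.status, await res.json())
}
```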

## CUA Loop Test (Simple)

Minimal commands; ensure `.env` contains `OPENAI_API_KEY` when running against the official OpenAI API.

- Ticketek (find Ticket Delivery FAQs):
  - `npm run cua:ticketek`
- AccuWeather (find Monthly forecast for Manchester, GB):
  - `npm run cua:accuweather`

Notes:
- Uses Playwright Chromium (downloaded automatically).
- Switches to Node 20 with `nvm` if available.
- The harness always acknowledges tool safety checks to keep the loop simple.

## ACUA Loop (Anthropic)

Run the Anthropic computer-use loop locally using tool calls on the Messages API.

```bash
# Ensure ANTHROPIC_API_KEY is set (or use the proxy base URL and a pk-...)
export ANTHROPIC_API_KEY=sk-ant-...

# Optional: route via this server proxy
# export ANTHROPIC_BASE_URL=http://localhost:3000/anthropic

# Ticketek and AccuWeather tasks
npm run acua:ticketek
npm run acua:accuweather

# Generic
ACUA_TASK=tasks/accuweather.json npm run acua:run
```

Environment variables:

- `ANTHROPIC_API_KEY`: API key for Anthropic (required unless routing through the proxy with a `pk-...` key).
- `ANTHROPIC_BASE_URL` (optional): set to `http://localhost:3000/anthropic` to route via this server.
- `ANTHROPIC_MODEL` (optional): defaults to `claude-sonnet-4-20250514`.
- `ACUA_THINKING_BUDGET` (optional): enables thinking for Claude 4/3.7 (e.g. `1024`).

## Proxy One-Liners

Run against OpenAI (uses `OPENAI_API_KEY` from `.env`):

- Ticketek: `npm run cua:ticketek`
- AccuWeather: `npm run cua:accuweather`

Notes:

- All scripts auto-install Chromium and use Node 20 via `nvm` if available.

## 1:1 Drop-in (Swap Base URL)

- OpenAI SDKs: set `baseURL` to `http://<your-proxy>/openai` and use your issued `pk-...` as the API key. Example (Node):

  ```ts
  import OpenAI from 'openai'

  const client = new OpenAI({ baseURL: 'http://localhost:3000/openai', apiKey: 'pk-...' })
  const res = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'hi' }],
  })
  ```

- Anthropic SDKs: set `baseURL` to `http://<your-proxy>/anthropic` and use your issued `pk-...` as the API key. You may omit `anthropic-version`; a default is set if missing. See the sketch after this list.
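
A matching Anthropic sketch (assuming the official `@anthropic-ai/sdk` package; the model name and `pk-...` key are placeholders):

```ts
import Anthropic from '@anthropic-ai/sdk'

// Point the official SDK at the proxy and authenticate with an issued pk-... key.
const client = new Anthropic({ baseURL: 'http://localhost:3000/anthropic', apiKey: 'pk-...' })
const msg = await client.messages.create({
  model: 'claude-3-haiku-20240307',
  max_tokens: 10,
  messages: [{ role: 'user', content: 'Hi' }],
})
console.log(msg.content)
```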

## Usage Accounting

- Per-key counters (`total`/`success`/`error`/`lastUsedAt`) are updated in Redis after each proxied request under `keys:rec:<pk>`.
- Additional Redis usage metrics (hashed by key id) live under `usage:key:<sha16>`.
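
For a quick look at what the proxy has recorded, a sketch using `node-redis` (the key patterns come from the layout above; the value encoding is the server's own, so this only enumerates matching keys and their Redis types rather than assuming a format):

```ts
import { createClient } from 'redis'

// Connect with the same REDIS_URL the server uses.
const redis = createClient({ url: process.env.REDIS_URL ?? 'redis://localhost:6379' })
await redis.connect()

// List key records and per-key usage metrics without assuming their value types.
for (const pattern of ['keys:rec:*', 'usage:key:*']) {
  for (const key of await redis.keys(pattern)) {
    console.log(await redis.type(key), key)
  }
}

await redis.quit()
```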
