14 changes: 12 additions & 2 deletions README.md
@@ -51,7 +51,7 @@ This guide will walk you through setting up your API key, downloading a dataset,

### Step 1: Configure Your API Key

First, tell CARIBOU about your OpenAI or DeepSeek API key. This is a one-time setup.
First, tell CARIBOU about your OpenAI, Anthropic (Claude), or DeepSeek API key. This is a one-time setup.

```bash
caribou config set-openai-key "sk-YourSecretKeyGoesHere"
@@ -63,6 +63,12 @@ or
caribou config set-deepseek-key "sk-YourSecretKeyGoesHere"
```

or

```bash
caribou config set-anthropic-key "sk-ant-YourSecretKeyGoesHere"
```


Your key will be stored securely in a local `.env` file within the CARIBOU configuration directory.
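For illustration, this is how a stored key is picked up at runtime; the `load_dotenv` step is an assumption (CARIBOU's own loading mechanism may differ), while the `ANTHROPIC_API_KEY` variable name matches what `caribou config set-anthropic-key` writes:

```python
# Minimal sketch: how a configured key is consumed at runtime. The load_dotenv()
# call assumes python-dotenv reads the .env file; CARIBOU's own loader may differ.
import os
from dotenv import load_dotenv

load_dotenv()  # pulls KEY="value" pairs from the local .env into os.environ
if not os.getenv("ANTHROPIC_API_KEY"):
    raise SystemExit("ANTHROPIC_API_KEY not set - run 'caribou config set-anthropic-key' first")
```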

@@ -91,7 +97,7 @@ This will trigger a series of prompts:
2. **Select a driver agent:** Choose which agent in the system will receive the first instruction.
3. **Select Dataset:** Pick the dataset you downloaded in Step 2.
4. **Choose a sandbox backend:** Select `docker` or `singularity`.
5. **Choose an LLM backend:** Select `chatgpt` or `ollama`.
5. **Choose an LLM backend:** Select `chatgpt`, `claude`, `deepseek`, or `ollama`.

After configuration, the session will begin, and you can start giving instructions to your agent team!

@@ -161,6 +167,10 @@ Manage your CARIBOU configuration.
```bash
caribou config set-deepseek-key "sk-..."
```
* **Set your Anthropic API key:**
```bash
caribou config set-anthropic-key "sk-ant-..."
```

-----

14 changes: 12 additions & 2 deletions caribou/README.md
@@ -51,7 +51,7 @@ This guide will walk you through setting up your API key, downloading a dataset,

### Step 1: Configure Your API Key

First, tell CARIBOU about your OpenAI or DeepSeek API key. This is a one-time setup.
First, tell CARIBOU about your OpenAI, Anthropic (Claude), or DeepSeek API key. This is a one-time setup.

```bash
caribou config set-openai-key "sk-YourSecretKeyGoesHere"
@@ -63,6 +63,12 @@ or
caribou config set-deepseek-key "sk-YourSecretKeyGoesHere"
```

or

```bash
caribou config set-anthropic-key "sk-ant-YourSecretKeyGoesHere"
```


Your key will be stored securely in a local `.env` file within the CARIBOU configuration directory.

@@ -91,7 +97,7 @@ This will trigger a series of prompts:
2. **Select a driver agent:** Choose which agent in the system will receive the first instruction.
3. **Select Dataset:** Pick the dataset you downloaded in Step 2.
4. **Choose a sandbox backend:** Select `docker` or `singularity`.
5. **Choose an LLM backend:** Select `chatgpt` or `ollama`.
5. **Choose an LLM backend:** Select `chatgpt`, `claude`, `deepseek`, or `ollama`.

After configuration, the session will begin, and you can start giving instructions to your agent team!

@@ -161,6 +167,10 @@ Manage your CARIBOU configuration.
```bash
caribou config set-deepseek-key "sk-..."
```
* **Set your Anthropic API key:**
```bash
caribou config set-anthropic-key "sk-ant-..."
```

-----

34 changes: 33 additions & 1 deletion caribou/src/caribou/cli/config_cli.py
@@ -71,4 +71,36 @@ def set_deepseek_key(
        new_content = content.strip() + f"\n{key_to_set}\n"

    ENV_FILE.write_text(new_content)
    console.print(f"[bold green]✅ DeepSeek API key has been set successfully in:[/bold green] {ENV_FILE}")
    console.print(f"[bold green]✅ DeepSeek API key has been set successfully in:[/bold green] {ENV_FILE}")

@config_app.command("set-anthropic-key")
def set_anthropic_key(
    ctx: typer.Context,
    api_key: Optional[str] = typer.Argument(None, help="Your Anthropic API key (e.g., 'sk-ant-...')"),
):
    """
    Saves your Anthropic API key to the Caribou environment file.
    """
    if api_key is None:
        console.print("[bold red]Error:[/bold red] You must provide an API key.\n")
        typer.echo(ctx.parent.get_help())
        raise typer.Exit()

if not api_key.startswith("sk-"):
console.print(
"[yellow]Warning: Key does not look like a standard Anthropic API key (should start with 'sk-').[/yellow]"
)

    if not ENV_FILE.exists():
        ENV_FILE.touch()

    content = ENV_FILE.read_text()
    key_to_set = f'ANTHROPIC_API_KEY="{api_key}"'

    if re.search(r"^ANTHROPIC_API_KEY=.*$", content, flags=re.MULTILINE):
        new_content = re.sub(r"^ANTHROPIC_API_KEY=.*$", key_to_set, content, flags=re.MULTILINE)
    else:
        new_content = content.strip() + f"\n{key_to_set}\n"

    ENV_FILE.write_text(new_content)
    console.print(f"[bold green]✅ Anthropic API key has been set successfully in:[/bold green] {ENV_FILE}")
17 changes: 12 additions & 5 deletions caribou/src/caribou/cli/run_cli.py
@@ -370,7 +370,7 @@ def initialize_context(

    # ---- LLM Backend ----
    if llm_backend is None:
        llm_backend = Prompt.ask("Choose an LLM backend", choices=["chatgpt", "ollama", "deepseek"], default="chatgpt")
        llm_backend = Prompt.ask("Choose an LLM backend", choices=["chatgpt", "claude", "ollama", "deepseek"], default="chatgpt")

    console.print(f"[cyan]Initializing LLM backend: {llm_backend}[/cyan]")

@@ -380,7 +380,14 @@
            raise typer.Exit(1)
        context.llm_client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
        context.model_name = "gpt-5.2"

    elif llm_backend == "claude":
        anthropic_key = os.getenv("ANTHROPIC_API_KEY")
        if not anthropic_key:
            console.print("[bold red]Error: ANTHROPIC_API_KEY not set. Use 'caribou config set-anthropic-key'.[/bold red]")
            raise typer.Exit(1)
        from caribou.core.anthropic_wrapper import AnthropicClient
        context.llm_client = AnthropicClient(api_key=anthropic_key)
        context.model_name = "claude-sonnet-4-5-20250929"
    elif llm_backend == "deepseek":
        if not os.getenv("DEEPSEEK_API_KEY"):
            console.print("[bold red]Error: DEEPSEEK_API_KEY not set. Use 'caribou config set-deepseek-key'.[/bold red]")
@@ -459,7 +466,7 @@ def main_run_callback(
    dataset: Optional[Path] = typer.Option(None, "--dataset", "-ds", help="Path to the primary dataset file (.h5ad).", readable=True),
    reference_dataset: Optional[Path] = typer.Option(None, "--reference-dataset", "-ref", help="Path to an optional reference dataset file (.h5ad).", readable=True),
    resources_dir: Optional[Path] = typer.Option(None, "--resources", help="Path to a directory of resource files to mount.", exists=True, file_okay=False),
    llm_backend: Optional[str] = typer.Option(None, "--llm", help="LLM backend: 'chatgpt', 'ollama', or 'deepseek'."),
    llm_backend: Optional[str] = typer.Option(None, "--llm", help="LLM backend: 'chatgpt', 'claude', 'ollama', or 'deepseek'."),
    ollama_host: str = typer.Option("http://localhost:11434", "--ollama-host", help="Base URL for Ollama backend."),
    sandbox: Optional[str] = typer.Option(None, "--sandbox", help="Sandbox backend: 'docker' or 'singularity'."),
    force_refresh: bool = typer.Option(False, "--force-refresh", help="Force refresh/rebuild of the sandbox environment."),
@@ -515,7 +522,7 @@ def run_interactive(
    dataset: Path = typer.Option(None, "--dataset", "-ds", help="Path to the primary dataset file (.h5ad).", readable=True),
    reference_dataset: Path = typer.Option(None, "--reference-dataset", "-ref", help="Path to an optional reference dataset file (.h5ad).", readable=True),
    resources_dir: Path = typer.Option(None, "--resources", help="Path to a directory of resource files to mount.", exists=True, file_okay=False),
    llm_backend: str = typer.Option(None, "--llm", help="LLM backend to use: 'chatgpt', 'ollama', or 'deepseek'."),
    llm_backend: str = typer.Option(None, "--llm", help="LLM backend to use: 'chatgpt', 'claude', 'ollama', or 'deepseek'."),
    ollama_host: str = typer.Option("http://localhost:11434", "--ollama-host", help="Base URL for Ollama backend."),
    sandbox: str = typer.Option(None, "--sandbox", help="Sandbox backend to use: 'docker' or 'singularity'."),
    force_refresh: bool = typer.Option(False, "--force-refresh", help="Force refresh/rebuild of the sandbox environment."),
@@ -560,7 +567,7 @@ def run_auto(
    dataset: Path = typer.Option(None, "--dataset", "-ds", help="Path to the primary dataset file (.h5ad).", readable=True),
    reference_dataset: Path = typer.Option(None, "--reference-dataset", "-ref", help="Path to an optional reference dataset file (.h5ad).", readable=True),
    resources_dir: Path = typer.Option(None, "--resources", help="Path to a directory of resource files to mount.", exists=True, file_okay=False),
    llm_backend: str = typer.Option(None, "--llm", help="LLM backend to use: 'chatgpt', 'ollama', or 'deepseek'."),
    llm_backend: str = typer.Option(None, "--llm", help="LLM backend to use: 'chatgpt', 'claude', 'ollama', or 'deepseek'."),
    ollama_host: str = typer.Option("http://localhost:11434", "--ollama-host", help="Base URL for Ollama backend."),
    sandbox: str = typer.Option(None, "--sandbox", help="Sandbox backend to use: 'docker' or 'singularity'."),
    force_refresh: bool = typer.Option(False, "--force-refresh", help="Force refresh/rebuild of the sandbox environment."),
75 changes: 75 additions & 0 deletions caribou/src/caribou/core/anthropic_wrapper.py
@@ -0,0 +1,75 @@
"""
Lightweight Anthropic client wrapper that mimics the subset of the OpenAI
chat API used by CARIBOU. Exposes a `.chat.completions.create(...)` method
that returns an object shaped like the OpenAI response:

resp.choices[0].message.content
"""

from __future__ import annotations

from types import SimpleNamespace
from typing import Any, Dict, List, Optional

import anthropic


class AnthropicClient:
    """
    Example:
        client = AnthropicClient(api_key="sk-ant-...", model="claude-sonnet-4-5-20250929")
        resp = client.chat.completions.create(model="claude-sonnet-4-5-20250929", messages=[...])
        print(resp.choices[0].message.content)
    """

    def __init__(
        self,
        *,
        api_key: str,
        model: str = "claude-sonnet-4-5-20250929",
        max_output_tokens: int = 1024,
        base_url: Optional[str] = None,
    ):
        client_kwargs: Dict[str, Any] = {"api_key": api_key}
        if base_url:
            client_kwargs["base_url"] = base_url
        self._client = anthropic.Anthropic(**client_kwargs)
        self._default_model = model
        self._max_output_tokens = max_output_tokens
        self.chat = SimpleNamespace(completions=SimpleNamespace(create=self._chat_create))

    def _chat_create(
        self,
        *,
        messages: List[Dict[str, str]],
        model: Optional[str] = None,
        temperature: Optional[float] = None,
        max_output_tokens: Optional[int] = None,
        **_: Any,
    ):
        system_parts: List[str] = []
        converted: List[Dict[str, str]] = []
        for msg in messages:
            role = msg.get("role", "user")
            content = msg.get("content", "")
            if role == "system":
                system_parts.append(str(content))
                continue
            converted.append({"role": role if role in ("assistant", "user") else "user", "content": content})

        system_prompt = "\n\n".join(system_parts) if system_parts else None

        # Build the request so optional fields are omitted entirely (rather than sent
        # as None) when they are not provided.
        request_kwargs: Dict[str, Any] = {
            "model": model or self._default_model,
            "messages": converted,
            "max_tokens": max_output_tokens or self._max_output_tokens,
        }
        if system_prompt is not None:
            request_kwargs["system"] = system_prompt
        if temperature is not None:
            request_kwargs["temperature"] = temperature

        response = self._client.messages.create(**request_kwargs)

        # Concatenate text blocks and wrap them in an OpenAI-shaped response object.
        text_chunks = [block.text for block in response.content if getattr(block, "type", None) == "text"]
        content = "".join(text_chunks)

        message = SimpleNamespace(content=content, role="assistant")
        choice = SimpleNamespace(message=message, index=0, finish_reason=getattr(response, "stop_reason", "stop"))
        return SimpleNamespace(choices=[choice])
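
As a usage illustration beyond the class docstring, the sketch below (not part of this PR) sends a system plus user message through the wrapper and reads the reply via the OpenAI-shaped accessor; it assumes a valid `ANTHROPIC_API_KEY` and network access:

```python
# Illustrative usage of AnthropicClient (not part of this PR). System messages are
# folded into Anthropic's `system` field; the reply is read like an OpenAI response.
import os

from caribou.core.anthropic_wrapper import AnthropicClient

client = AnthropicClient(api_key=os.environ["ANTHROPIC_API_KEY"])
resp = client.chat.completions.create(
    model="claude-sonnet-4-5-20250929",
    messages=[
        {"role": "system", "content": "You are a concise bioinformatics assistant."},
        {"role": "user", "content": "Name one common quality-control metric for scRNA-seq."},
    ],
)
print(resp.choices[0].message.content)
```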
127 changes: 127 additions & 0 deletions caribou/tests/QUICKSTART.md
@@ -0,0 +1,127 @@
# CARIBOU Test Suite - Quick Start Guide

## Installation

1. **Activate your conda environment (if using conda):**

```bash
conda activate olaf # or your environment name
```

2. **Install test dependencies:**

```bash
cd /data1/peerd/riffled/riffled/Olaf_project/CARIBOU
python -m pip install pytest pytest-cov
```

Or install all requirements:

```bash
python -m pip install -r requirements.txt
```

**Important:** Make sure pytest is installed in the same Python environment where `anthropic`, `openai`, and other CARIBOU dependencies are installed.
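A quick way to confirm this from the target environment (an illustrative check; the package list simply mirrors the dependencies named above):

```python
# Sanity check that pytest and CARIBOU's LLM client dependencies share one environment.
import importlib

for pkg in ("pytest", "anthropic", "openai"):
    try:
        importlib.import_module(pkg)
        print(f"{pkg}: OK")
    except ImportError:
        print(f"{pkg}: MISSING - install it in this environment")
```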

## Running Tests

### Option 1: Using the Test Runner Script (Recommended)

```bash
cd caribou/tests
./run_tests.sh
```

Available options:
- `./run_tests.sh --unit` - Run only unit tests
- `./run_tests.sh --integration` - Run only integration tests
- `./run_tests.sh --verbose` - Verbose output
- `./run_tests.sh --coverage` - Generate coverage report

### Option 2: Using pytest Directly

From the project root directory:

```bash
# Run all tests
pytest caribou/tests/

# Run specific test categories
pytest caribou/tests/unit/
pytest caribou/tests/integration/

# Run specific test file
pytest caribou/tests/unit/test_message_utils.py

# Run with verbose output
pytest caribou/tests/ -v

# Run with coverage
pytest caribou/tests/ --cov=caribou --cov-report=html
```

## What Gets Tested

✅ **LLM API Wrappers**
- AnthropicClient (OpenAI compatibility)
- OllamaClient (local models)

✅ **Message Routing**
- Delegation detection (`delegate_to_agent`)
- RAG query detection (`query_rag_<topic>`)
- Artifact extraction (notes, TODOs)

✅ **History Management**
- MemoryManager with episodic summarization
- Context assembly and compression

✅ **Agent System**
- Multi-agent configuration
- Prompt generation
- Agent switching

✅ **End-to-End Integration**
- Complete message flows
- Multi-agent conversations
- Error handling

## Verifying the Setup

Run a quick smoke test:

```bash
pytest caribou/tests/unit/test_message_utils.py::TestDelegationDetection::test_detect_simple_delegation -v
```

Expected output:
```
test_detect_simple_delegation PASSED
```

## Troubleshooting

**Import errors?**
The tests automatically add `caribou/src` to the Python path via `conftest.py`. If you still get import errors, verify the directory structure:

```
CARIBOU/
└── caribou/
    ├── src/
    │   └── caribou/
    │       ├── core/
    │       ├── execution/
    │       └── agents/
    └── tests/
        ├── conftest.py   # ← Should add src/ to path
        ├── unit/
        └── integration/
```
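
For reference, the path shim described above can be as small as the following sketch (hypothetical; the actual `conftest.py` may differ):

```python
# Hypothetical sketch of the tests/conftest.py path shim; the real file may differ.
import sys
from pathlib import Path

SRC_DIR = Path(__file__).resolve().parent.parent / "src"  # CARIBOU/caribou/src
if str(SRC_DIR) not in sys.path:
    sys.path.insert(0, str(SRC_DIR))
```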

**Tests hanging?**
All external API calls are mocked - tests should run quickly (< 10 seconds total).
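
The mocking relies on standard `unittest.mock` patching; a sketch of what such a test can look like (the test name and setup here are illustrative, not the suite's actual code):

```python
# Illustrative pytest sketch of testing AnthropicClient without hitting the network.
# The names here are hypothetical and do not correspond to files in the suite.
from types import SimpleNamespace
from unittest.mock import patch

from caribou.core.anthropic_wrapper import AnthropicClient


def test_chat_create_returns_openai_shaped_response():
    fake_response = SimpleNamespace(
        content=[SimpleNamespace(type="text", text="hello")],
        stop_reason="end_turn",
    )
    with patch("anthropic.Anthropic") as mock_anthropic:
        mock_anthropic.return_value.messages.create.return_value = fake_response
        client = AnthropicClient(api_key="sk-ant-test")
        resp = client.chat.completions.create(
            model="claude-sonnet-4-5-20250929",
            messages=[{"role": "user", "content": "hi"}],
        )
    assert resp.choices[0].message.content == "hello"
```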

## Next Steps

- See [README.md](README.md) for detailed documentation
- Run with coverage to see what's tested: `./run_tests.sh --coverage`
- Open `htmlcov/index.html` to view the coverage report