From f77a8113cf36ed974381505f6aa1b10cd0dde561 Mon Sep 17 00:00:00 2001
From: openhands
Date: Wed, 17 Dec 2025 14:09:19 +0000
Subject: [PATCH 1/2] docs(sdk): add LLM Profiles guide synced with agent-sdk example

Co-authored-by: openhands
---
 sdk/guides/llm-profiles.mdx | 73 +++++++++++++++++++++++++++++++++++++
 1 file changed, 73 insertions(+)
 create mode 100644 sdk/guides/llm-profiles.mdx

diff --git a/sdk/guides/llm-profiles.mdx b/sdk/guides/llm-profiles.mdx
new file mode 100644
index 00000000..8fefcf80
--- /dev/null
+++ b/sdk/guides/llm-profiles.mdx
@@ -0,0 +1,73 @@
+---
+title: LLM Profiles
+description: Save, load, and reuse LLM configurations via named profiles with LLMRegistry.
+---
+
+This example is available on GitHub: [examples/01_standalone_sdk/31_llm_profiles.py](https://github.com/OpenHands/software-agent-sdk/blob/main/examples/01_standalone_sdk/31_llm_profiles.py)
+
+LLM profiles are named JSON configuration files for `openhands.sdk.llm.LLM`. They let you reuse the same model configuration across scripts and runs without copy/pasting large inline payloads.
+
+## Storage format and location
+
+- A profile file is simply the JSON representation of an `LLM` instance.
+- Default location: `~/.openhands/llm-profiles/<profile_id>.json`
+- The `profile_id` is the filename stem (no `.json` suffix).
+
+## Managing profiles with `LLMRegistry`
+
+Use `LLMRegistry` as the entry point for both in-memory registration (`usage_id` -> `LLM`) and on-disk profile management (`profile_id` -> JSON file).
+
+APIs (see the quick usage sketch below):
+
+- `LLMRegistry.list_profiles()` returns available `profile_id`s
+- `LLMRegistry.load_profile(profile_id)` loads the profile from disk
+- `LLMRegistry.save_profile(profile_id, llm, include_secrets=True)` writes a profile
+
+### Secrets
+
+By default, profiles are saved with secrets included (e.g., `api_key`, `aws_access_key_id`, `aws_secret_access_key`).
+
+- To omit secrets on disk, call `save_profile(..., include_secrets=False)`
+- New files are created with restrictive permissions (0600) when possible
+
+If you prefer not to store secrets locally, supply them at runtime via environment variables or a secrets manager.
+
+## Conversation persistence and profile references
+
+Conversation snapshots (`base_state.json`) can either:
+
+- store full inline LLM payloads (default, reproducible), or
+- store compact profile references (`{"profile_id": "..."}`) when inline mode is disabled
+
+This is controlled by `OPENHANDS_INLINE_CONVERSATIONS`:
+
+- default: `true` (inline LLM payloads)
+- set `OPENHANDS_INLINE_CONVERSATIONS=false` to persist profile references for any `LLM` that has `profile_id` set
+
+If you switch back to inline mode and try to resume a conversation that contains profile references, the SDK raises an error so you don’t accidentally resume without the full config.
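+
+## Quick usage sketch
+
+The snippet below is a minimal, illustrative sketch of the save/list/load workflow described above, not the full example script. The `openhands.sdk` import path, the `usage_id` field, and the `my-default` profile name are assumptions made for illustration; adjust them to your SDK version and setup.
+
+```python Quick usage sketch (illustrative)
+import os
+
+# Illustrative sketch only. The import path and constructor fields are
+# assumptions; see the example script below for the canonical flow.
+from openhands.sdk import LLM, LLMRegistry
+
+registry = LLMRegistry()
+
+# Build an LLM from the environment (model/api_key are fields this guide
+# mentions; your SDK version may accept or require others).
+llm = LLM(
+    model=os.getenv("LLM_MODEL", "anthropic/claude-sonnet-4-5-20250929"),
+    api_key=os.environ["LLM_API_KEY"],
+    usage_id="profiles-demo",  # in-memory registry key (assumed field)
+)
+
+# Write ~/.openhands/llm-profiles/my-default.json; pass
+# include_secrets=False to keep the API key out of the file.
+registry.save_profile("my-default", llm, include_secrets=True)
+
+# Later, or from another script: discover and reload saved profiles.
+print(registry.list_profiles())      # e.g. ['my-default', 'gpt-5-mini']
+restored = registry.load_profile("my-default")
+print(restored.model)
+```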
+
+## Example script
+
+```python icon="python" expandable examples/01_standalone_sdk/31_llm_profiles.py
+```
+
+```bash Running the Example
+export LLM_API_KEY="your-api-key"
+# Optional overrides
+# export LLM_MODEL="anthropic/claude-sonnet-4-5-20250929"
+# export LLM_BASE_URL="https://api.your-llm-provider.com"
+# export LLM_PROFILE_NAME="gpt-5-mini"
+
+cd agent-sdk
+uv run python examples/01_standalone_sdk/31_llm_profiles.py
+```
+
+## Example profile file
+
+A minimal profile JSON used by the example above is here:
+
+- [examples/llm-profiles/gpt-5-mini.json](https://github.com/OpenHands/software-agent-sdk/blob/main/examples/llm-profiles/gpt-5-mini.json)
+
+You can create your own profiles using `LLMRegistry.save_profile(...)` or by editing JSON files under `~/.openhands/llm-profiles/`.

From fd2d59685b62661d770962f80fd212817f7869c7 Mon Sep 17 00:00:00 2001
From: Engel Nyst
Date: Tue, 30 Dec 2025 00:15:36 +0100
Subject: [PATCH 2/2] docs(sdk): document LLM profile and switching examples

---
 sdk/guides/agent-server/apptainer-sandbox.mdx | 20 +++++++++++++++++++
 sdk/guides/agent-server/custom-tools.mdx      | 10 ++++++++++
 sdk/guides/agent-server/llm-switching.mdx     | 20 +++++++++++++++++++
 sdk/guides/llm-profiles.mdx                   | 20 +++++++++++++++++++
 sdk/guides/llm-runtime-switching.mdx          | 20 +++++++++++++++++++
 5 files changed, 90 insertions(+)
 create mode 100644 sdk/guides/agent-server/apptainer-sandbox.mdx
 create mode 100644 sdk/guides/agent-server/llm-switching.mdx
 create mode 100644 sdk/guides/llm-profiles.mdx
 create mode 100644 sdk/guides/llm-runtime-switching.mdx

diff --git a/sdk/guides/agent-server/apptainer-sandbox.mdx b/sdk/guides/agent-server/apptainer-sandbox.mdx
new file mode 100644
index 00000000..fd2e5bc2
--- /dev/null
+++ b/sdk/guides/agent-server/apptainer-sandbox.mdx
@@ -0,0 +1,20 @@
+---
+title: Apptainer Sandbox
+description: Run agent server in an Apptainer sandbox for isolation on HPC systems.
+---
+
+This example shows how to run a sandboxed remote agent server using Apptainer.
+
+This example is available on GitHub: [examples/02_remote_agent_server/07_convo_with_apptainer_sandboxed_server.py](https://github.com/OpenHands/software-agent-sdk/blob/main/examples/02_remote_agent_server/07_convo_with_apptainer_sandboxed_server.py)
+
+```python icon="python" expandable examples/02_remote_agent_server/07_convo_with_apptainer_sandboxed_server.py
+```
+
+```bash Running the Example
+export LLM_API_KEY="your-api-key"
+cd agent-sdk
+uv run python examples/02_remote_agent_server/07_convo_with_apptainer_sandboxed_server.py
+```
+
diff --git a/sdk/guides/agent-server/custom-tools.mdx b/sdk/guides/agent-server/custom-tools.mdx
index 39cb72bc..48443334 100644
--- a/sdk/guides/agent-server/custom-tools.mdx
+++ b/sdk/guides/agent-server/custom-tools.mdx
@@ -15,6 +15,16 @@ For standalone custom tools (without remote agent server), see the [Custom Tools
 This example is available on GitHub: [examples/02_remote_agent_server/05_custom_tool/](https://github.com/OpenHands/software-agent-sdk/tree/main/examples/02_remote_agent_server/05_custom_tool)
 
+### Tool Modules (Built Into the Server Image)
+
+```python icon="python" expandable examples/02_remote_agent_server/05_custom_tool/custom_tools/__init__.py
+```
+
+```python icon="python" expandable examples/02_remote_agent_server/05_custom_tool/custom_tools/log_data.py
+```
+
+### Client Script
+
 ```python icon="python" expandable examples/02_remote_agent_server/05_custom_tool/custom_tool_example.py
 """Example: Using custom tools with remote agent server.
diff --git a/sdk/guides/agent-server/llm-switching.mdx b/sdk/guides/agent-server/llm-switching.mdx
new file mode 100644
index 00000000..a258b477
--- /dev/null
+++ b/sdk/guides/agent-server/llm-switching.mdx
@@ -0,0 +1,20 @@
+---
+title: Agent Server LLM Switching
+description: Switch the active LLM for a remote conversation via the agent server API and restore with the new profile.
+---
+
+This guide demonstrates switching the active LLM profile for a remote conversation and then restoring the conversation with the new selection.
+
+This example is available on GitHub: [examples/02_remote_agent_server/07_llm_switch_and_restore.py](https://github.com/OpenHands/software-agent-sdk/blob/main/examples/02_remote_agent_server/07_llm_switch_and_restore.py)
+
+```python icon="python" expandable examples/02_remote_agent_server/07_llm_switch_and_restore.py
+```
+
+```bash Running the Example
+export LLM_API_KEY="your-api-key"
+cd agent-sdk
+uv run python examples/02_remote_agent_server/07_llm_switch_and_restore.py
+```
+
diff --git a/sdk/guides/llm-profiles.mdx b/sdk/guides/llm-profiles.mdx
new file mode 100644
index 00000000..bf5896b1
--- /dev/null
+++ b/sdk/guides/llm-profiles.mdx
@@ -0,0 +1,20 @@
+---
+title: LLM Profiles
+description: Save and load named LLM configurations for reuse across runs.
+---
+
+LLM Profiles let you persist LLM configurations (model, base_url, secrets, etc.) under a stable `profile_id`, and then load them when constructing agents.
+
+This example is available on GitHub: [examples/01_standalone_sdk/31_llm_profiles.py](https://github.com/OpenHands/software-agent-sdk/blob/main/examples/01_standalone_sdk/31_llm_profiles.py)
+
+```python icon="python" expandable examples/01_standalone_sdk/31_llm_profiles.py
+```
+
+```bash Running the Example
+export LLM_API_KEY="your-api-key"
+cd agent-sdk
+uv run python examples/01_standalone_sdk/31_llm_profiles.py
+```
+
diff --git a/sdk/guides/llm-runtime-switching.mdx b/sdk/guides/llm-runtime-switching.mdx
new file mode 100644
index 00000000..a18daa81
--- /dev/null
+++ b/sdk/guides/llm-runtime-switching.mdx
@@ -0,0 +1,20 @@
+---
+title: Runtime LLM Switching
+description: Switch the active LLM profile during a conversation and persist the choice.
+---
+
+You can swap the active LLM profile at runtime (between runs) and persist the new selection so it survives restarts.
+
+This example is available on GitHub: [examples/01_standalone_sdk/26_runtime_llm_switch.py](https://github.com/OpenHands/software-agent-sdk/blob/main/examples/01_standalone_sdk/26_runtime_llm_switch.py)
+
+```python icon="python" expandable examples/01_standalone_sdk/26_runtime_llm_switch.py
+```
+
+```bash Running the Example
+export LLM_API_KEY="your-api-key"
+cd agent-sdk
+uv run python examples/01_standalone_sdk/26_runtime_llm_switch.py
+```
+
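+One way to reason about the switch is sketched below, using only the `LLMRegistry` profile APIs from the LLM Profiles guide: pick a `profile_id` per run, load it if it already exists, and save it otherwise so the selection persists for later runs. The `LLM_PROFILE_NAME` variable, import path, and constructor fields here are assumptions made for illustration; the example script above shows the actual mechanism used by the SDK.
+
+```python Profile switching sketch (illustrative)
+import os
+
+# Illustrative sketch only; import path and constructor fields are
+# assumptions. See the example script above for the canonical flow.
+from openhands.sdk import LLM, LLMRegistry
+
+registry = LLMRegistry()
+
+# Re-running with a different LLM_PROFILE_NAME is the "switch"; the
+# profile files under ~/.openhands/llm-profiles/ persist each choice.
+profile_id = os.getenv("LLM_PROFILE_NAME", "gpt-5-mini")
+
+if profile_id in registry.list_profiles():
+    llm = registry.load_profile(profile_id)
+else:
+    # First run with a new name: build the LLM from the environment and
+    # save it so later runs can switch back to it by name.
+    llm = LLM(
+        model=os.getenv("LLM_MODEL", "anthropic/claude-sonnet-4-5-20250929"),
+        api_key=os.environ["LLM_API_KEY"],
+    )
+    registry.save_profile(profile_id, llm)
+
+print(f"Run configured with profile '{profile_id}' ({llm.model})")
+```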