Summary
The Ollama Python client raises a Pydantic ValidationError when /api/show responses omit model_info. This happens with some cloud models and breaks downstream tools that call ollama.show() (e.g., llm-ollama).
The llm-ollama maintainer recommended filing this here: taketwo/llm-ollama#65 (comment)
Steps to Reproduce
- Ensure the Ollama server is running.
- Run: curl -s http://127.0.0.1:11434/api/show -d '{"name":"glm-4.7:cloud"}'
- Alternatively, call ollama.show("glm-4.7:cloud") from the Python client (minimal sketch below).
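A minimal Python reproduction of the same failure, as a sketch only: it assumes the ollama package (0.6.0) is installed, the server is reachable at its default address, and the model name mirrors the curl example above.

import ollama
from pydantic import ValidationError

# Assumes a locally running Ollama server on the default port and access to
# the glm-4.7:cloud model; substitute any of the affected models listed below.
try:
    response = ollama.show("glm-4.7:cloud")
    print(response)
except ValidationError as exc:
    # With the affected cloud models, parsing the /api/show response fails
    # here because the client's ShowResponse model requires model_info.
    print(exc)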
Observed Behavior
Response is valid JSON but missing model_info, and the Python client fails with:
pydantic_core._pydantic_core.ValidationError: 1 validation error for ShowResponse
model_info Field required [type=missing, input_value={...}, input_type=dict]
Example /api/show response (missing model_info)
{"modelfile":"# Modelfile generated by "ollama show"...","template":"{{ .Prompt }}","details":{"parent_model":"","format":"","family":"","families":null,"parameter_size":"","quantization_level":""},"remote_model":"glm-
4.7","remote_host":"https://ollama.com:443","capabilities":["completion","tools","thinking"],"modified_at":"2025-12-
24T10:08:53.438277171-05:00"}
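The error follows directly from model_info being a required field on the client's response model. The reduced sketch below reproduces the same "Field required" error with a stand-in model; it is not the actual ShowResponse definition from the ollama package, only an illustration.

from typing import Any, Mapping, Optional
from pydantic import BaseModel, ConfigDict

class ShowResponseSketch(BaseModel):
    # Allow field names starting with "model_" without Pydantic's namespace warning.
    model_config = ConfigDict(protected_namespaces=())

    modelfile: Optional[str] = None
    template: Optional[str] = None
    model_info: Mapping[str, Any]  # required, so a missing key fails validation

# Payload shaped like the cloud-model response above, with no model_info key.
payload = {"modelfile": "# Modelfile ...", "template": "{{ .Prompt }}"}

# Raises: ValidationError ... model_info Field required [type=missing, ...]
ShowResponseSketch.model_validate(payload)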
Models observed missing model_info
- glm-4.7:cloud
- qwen3-next:80b-cloud
- deepseek-v3.2:cloud
Other cloud models (e.g., gpt-oss:120b-cloud) return model_info as expected.
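Until the client tolerates the missing field, one possible workaround is to call /api/show directly and treat model_info as optional. This is only a sketch using the standard library, assuming the default local server address; the helper name show_raw is hypothetical.

import json
import urllib.request

def show_raw(model: str, host: str = "http://127.0.0.1:11434") -> dict:
    # POST the same body the curl reproduction uses and return the raw JSON,
    # bypassing the client's Pydantic validation entirely.
    req = urllib.request.Request(
        f"{host}/api/show",
        data=json.dumps({"name": model}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

info = show_raw("glm-4.7:cloud")
print(info.get("model_info", {}))   # empty dict when the field is absent
print(info.get("capabilities"))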
Versions
- Ollama Python client: 0.6.0
- Ollama server: 0.13.2
- Python: 3.13