ShowResponse ValidationError when /api/show omits model_info (cloud models) #607

@hcal

Description

Summary

The Ollama Python client raises a Pydantic ValidationError when /api/show responses omit model_info. This happens with some cloud models and breaks downstream tools that call ollama.show() (e.g., llm-ollama).

The llm-ollama maintainer recommended filing this here: taketwo/llm-ollama#65 (comment)

Steps to Reproduce

  1. Ensure Ollama server is running.
  2. Run: curl -s http://127.0.0.1:11434/api/show -d '{"name":"glm-4.7:cloud"}'
  3. From Python, call ollama.show("glm-4.7:cloud") (see the sketch below).
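
Equivalent Python repro (a sketch; assumes the glm-4.7:cloud model is reachable through your local server):

import ollama

# Raises pydantic ValidationError because the response omits model_info
ollama.show("glm-4.7:cloud")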

Observed Behavior

The response is valid JSON but is missing model_info, and the Python client fails with:

pydantic_core._pydantic_core.ValidationError: 1 validation error for ShowResponse
model_info Field required [type=missing, input_value={...}, input_type=dict]

Example /api/show response (missing model_info)

{"modelfile":"# Modelfile generated by "ollama show"...","template":"{{ .Prompt }}","details":{"parent_model":"","format":"","family":"","families":null,"parameter_size":"","quantization_level":""},"remote_model":"glm-
4.7","remote_host":"https://ollama.com:443","capabilities":["completion","tools","thinking"],"modified_at":"2025-12-
24T10:08:53.438277171-05:00"}
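
A client-side fix could be to make the field optional so such responses still validate. A rough sketch of the shape (abbreviated field list; this is not the library's actual ShowResponse definition):

from typing import Any, Optional

from pydantic import BaseModel, ConfigDict

class ShowResponseSketch(BaseModel):
    # Allow field names starting with "model_" without Pydantic v2 warnings
    model_config = ConfigDict(protected_namespaces=())

    modelfile: Optional[str] = None
    template: Optional[str] = None
    capabilities: Optional[list[str]] = None
    # An optional model_info lets responses that omit it validate cleanly
    model_info: Optional[dict[str, Any]] = None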

Models observed missing model_info

  • glm-4.7:cloud
  • qwen3-next:80b-cloud
  • deepseek-v3.2:cloud

Other cloud models (e.g., gpt-oss:120b-cloud) return model_info as expected.
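
Until this is fixed, downstream tools can sidestep the validation by calling the endpoint directly and treating model_info as optional (a sketch, assuming the default local server address):

import json
import urllib.request

req = urllib.request.Request(
    "http://127.0.0.1:11434/api/show",
    data=json.dumps({"name": "glm-4.7:cloud"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    info = json.load(resp)

# model_info may be absent for some cloud models, so use .get()
print(info.get("model_info"))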

Versions

  • Ollama Python client: 0.6.0
  • Ollama server: 0.13.2
  • Python: 3.13
