
Conversation

@mandelbro (Owner)
This pull request introduces several improvements to configuration management and environment variable handling, with a focus on making the system more flexible and robust. The main changes include supporting YAML-based configuration for embedders, updating model defaults, and improving the way environment variables are handled for both Docker and runtime configuration. Additionally, there are enhancements to how optional parameters are managed for LLM clients and improvements to dependency versions.

Configuration and Environment Management Improvements:

  • The embedder configuration now supports loading from both YAML provider files and environment variables, aligning its selection logic with the LLM configuration. Methods and references to environment-only configuration have been updated accordingly. (src/config/embedder_config.py [1] [2] [3]; src/config/server_config.py [4] [5])
  • The Docker setup and docker-compose.yml now use environment variable expansion for several keys, allowing easier overrides and more consistent configuration across environments. (Dockerfile [1]; docker-compose.yml [2])
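The compose-side pattern described above can be sketched as follows; the service name and default values are hypothetical, only the USE_OLLAMA and OPENAI_API_KEY keys come from the PR. Compose's `${VAR:-default}` interpolation substitutes the host environment value, falling back to the default when the variable is unset or empty:

```yaml
# Hypothetical docker-compose.yml excerpt illustrating variable expansion.
services:
  graphiti:
    environment:
      USE_OLLAMA: "${USE_OLLAMA:-true}"
      OPENAI_API_KEY: "${OPENAI_API_KEY:-}"
```

This lets `USE_OLLAMA=false docker compose up` override the setting without editing the file.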

Model and Dependency Updates:

  • The default LLM models and token limits have been updated to newer versions: gpt-5-mini and gpt-5-nano, with a higher max_tokens value. (config/providers/openai.yml, L7-R11)
  • The graphiti-core dependency has been updated to version 0.21.0. (pyproject.toml, L27-R27)

LLM Client Parameter Handling:

  • The LLM client creation logic now includes the temperature parameter for OpenAI and Azure OpenAI clients only if the OPENAI_ENABLE_TEMPERATURE environment variable is set to "true", preventing accidental overriding of API defaults. (src/config/llm_config.py [1] [2])
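The gating above can be sketched as follows; the function name and the OPENAI_TEMPERATURE fallback variable are hypothetical, not the PR's actual src/config/llm_config.py code:

```python
import os

def build_openai_llm_kwargs(model: str, max_tokens: int) -> dict:
    """Sketch: only include temperature when explicitly enabled."""
    kwargs = {"model": model, "max_tokens": max_tokens}
    # Add temperature only when OPENAI_ENABLE_TEMPERATURE is "true", so the
    # OpenAI / Azure OpenAI API defaults apply in every other case.
    if os.environ.get("OPENAI_ENABLE_TEMPERATURE", "").lower() == "true":
        kwargs["temperature"] = float(os.environ.get("OPENAI_TEMPERATURE", "0.0"))
    return kwargs
```

Leaving the key out entirely, rather than sending a default value, is what prevents the client from overriding the API-side default.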

Ollama Client Compatibility Enhancement:

  • The Ollama client now gracefully ignores unsupported keyword arguments (such as reasoning) in structured completion calls, improving compatibility with callers that may pass extra parameters. (src/ollama_client.py [1] [2])
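One common way to implement this tolerance is to filter kwargs against a known-supported set; the class, method, and supported-key names below are hypothetical, not the PR's actual src/ollama_client.py API:

```python
class OllamaClientSketch:
    """Sketch of a client that drops keyword arguments it cannot handle."""

    SUPPORTED_KWARGS = {"temperature", "max_tokens"}

    def structured_completion(self, prompt: str, **kwargs) -> dict:
        # Silently discard keys Ollama does not understand (e.g. "reasoning")
        # so callers written against other providers keep working.
        supported = {k: v for k, v in kwargs.items() if k in self.SUPPORTED_KWARGS}
        return {"prompt": prompt, "options": supported}
```

A caller passing `reasoning="high"` then succeeds instead of raising a TypeError for an unexpected keyword argument.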

… and configuration organization

- Modified docker-compose.yml to allow environment variables for USE_OLLAMA and OPENAI_API_KEY, enhancing flexibility in configuration.
- Updated Dockerfile to include copying of the config directory, improving application structure and maintainability.
…nt variables

- Introduced `from_yaml_and_env` method in `GraphitiEmbedderConfig` to load configuration from both YAML files and environment variables, enhancing flexibility and maintainability.
- Updated the logic for Ollama and OpenAI configurations to utilize the new loading method, improving error handling and configuration management.
- Modified `from_cli_and_env` to start with the new YAML+env based configuration, ensuring CLI arguments can override as needed.
Temperature is now only set in LLMConfig if the OPENAI_ENABLE_TEMPERATURE environment variable is explicitly set to true. This allows for more control over when the temperature parameter is included in requests to OpenAI and Azure OpenAI clients.
Updated GraphitiConfig to load the embedder configuration using from_yaml_and_env() instead of from_env(), aligning it with the LLM configuration method. This change ensures consistent configuration loading from both YAML files and environment variables.
Replaces ci-cd.yml and manual-test.yml with a simplified ci.yml workflow focused on code quality checks. Adds a Justfile to standardize development commands and automation tasks. This streamlines CI processes and improves local developer experience.
Allows the EMBEDDER_MODEL_NAME environment variable to override the model selection for OpenAI and Azure OpenAI providers, improving backward compatibility. Ollama provider is not affected by this change.
@mandelbro mandelbro requested a review from Copilot October 4, 2025 01:01
Copilot AI left a comment
Pull Request Overview

This pull request introduces significant improvements to configuration management and environment variable handling, with a primary focus on making the system more flexible and robust. The changes modernize how configuration is loaded and managed across different components.

  • Updated embedder configuration to support both YAML and environment variables, aligning with LLM configuration patterns
  • Enhanced Docker and dependency management with environment variable expansion and version updates
  • Improved OpenAI client parameter handling with conditional temperature configuration

Reviewed Changes

Copilot reviewed 12 out of 13 changed files in this pull request and generated no comments.

Summary per file:

  • src/config/embedder_config.py: Refactored to support YAML-based configuration with environment fallbacks
  • src/config/server_config.py: Updated to use unified YAML+env configuration approach
  • src/config/llm_config.py: Added conditional temperature parameter handling for OpenAI clients
  • src/ollama_client.py: Enhanced compatibility by accepting and ignoring unsupported parameters
  • pyproject.toml: Updated graphiti-core dependency to version 0.21.0
  • docker-compose.yml: Added environment variable expansion for USE_OLLAMA and OPENAI_API_KEY
  • config/providers/openai.yml: Updated default models to gpt-5-mini/gpt-5-nano with increased token limits
  • Dockerfile: Added config directory copy to Docker image
  • Justfile: Added new build automation configuration file
  • .github/workflows/: Simplified CI workflow and removed complex CI/CD pipeline


@mandelbro mandelbro merged commit 125ba98 into main Oct 4, 2025
1 check passed
@mandelbro mandelbro deleted the openai-fixes branch October 4, 2025 01:03