Add tool call & some other params support for llms api #117
Conversation
Pull Request Overview
This PR adds tool call support to the LLMs API, enabling function calling capabilities for both Ollama and Hugging Face Hub inference routers.
- Extended the `complete_chat` method to accept additional parameters via `**kwargs`, including tool definitions
- Modified return types to support both string responses and full response objects when tools are involved
- Updated both Ollama and Hugging Face Hub implementations to handle tool calls and return appropriate response formats
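The interface change described above can be sketched as follows. This is a hedged illustration, not the PR's actual code: the `ChatResponse` class, the stubbed reply, and the exact `complete_chat` signature are assumptions for demonstration; only the `complete_chat` name, the `**kwargs` extension, and the string-or-object return behavior come from the overview.

```python
from typing import Any, Union


class ChatResponse:
    """Hypothetical full-response wrapper, returned when tool calls are in play."""

    def __init__(self, content: str, tool_calls: list):
        self.content = content
        self.tool_calls = tool_calls


def complete_chat(messages: list, **kwargs: Any) -> Union[str, ChatResponse]:
    """Sketch of the extended method: extra parameters (e.g. tool
    definitions) arrive via **kwargs. A plain string is returned unless
    tools were supplied, in which case the full response object comes
    back so callers can inspect any tool calls the model made."""
    tools = kwargs.get("tools")
    # A real router would forward messages and kwargs to the Ollama or
    # Hugging Face Hub client here; this stub fakes the model reply.
    reply = "stub reply"
    if tools:
        return ChatResponse(content=reply, tool_calls=[])
    return reply
```

The dual return type keeps existing callers (which expect a bare string) working while letting tool-aware callers opt in to the richer object simply by passing `tools`.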
bradhe left a comment
Looks good, let's ship it.