
Support multiple tool call functions in remote vLLM inference provider #1120

@terrytangyuan

Description

🚀 Describe the new functionality needed

Currently, the remote vLLM inference provider only supports a single tool call function. For example, when running https://github.com/meta-llama/llama-stack-apps/blob/main/examples/agents/e2e_loop_with_client_tools.py, only the first function passed to the `client_tools` argument of `AgentConfig` is used; any additional tools are silently dropped. A condensed sketch of the setup is shown below.
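For concreteness, here is a minimal sketch condensed from the linked example. The tool instances (`get_ticker_data`, `calculator`), the `get_tool_definition()` helper, and the `Agent` wiring are assumptions drawn from that example and may lag the current `llama-stack-client` API:

```python
# Sketch of the issue, loosely following e2e_loop_with_client_tools.py.
# Tool construction is omitted; see the linked example for full definitions.
from llama_stack_client import LlamaStackClient
from llama_stack_client.lib.agents.agent import Agent
from llama_stack_client.types.agent_create_params import AgentConfig

client = LlamaStackClient(base_url="http://localhost:5000")

# Two client-side tools, as in the linked example (assumed instances).
client_tools = [get_ticker_data, calculator]

agent_config = AgentConfig(
    model="meta-llama/Llama-3.1-70B-Instruct",
    instructions="You are a helpful assistant.",
    # Both tool definitions are passed here, but with the remote vLLM
    # inference provider only the first one is surfaced to the model.
    client_tools=[t.get_tool_definition() for t in client_tools],
    enable_session_persistence=False,
)

agent = Agent(client, agent_config, custom_tools=client_tools)
# Expected: the model can call either tool during the agent loop.
# Observed with the remote vLLM provider: only the first tool
# (get_ticker_data) is ever available.
```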

💡 Why is this needed? What if we don't build it?

Users won't be able to use multiple tool call functions with an agent when inference is served by the remote vLLM provider.

Other thoughts

No response
