
Conversation

danalsan
Contributor

Fixes: #1955

Since 0.2.0, vLLM receives an empty list (vs `None` in 0.1.9 and earlier) when no tools are configured, which causes the issue described in #1955. This patch avoids sending the 'tools' param to vLLM altogether, rather than sending an empty list.
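
A minimal sketch of the idea, assuming a hypothetical `build_request_params` helper (the actual provider code in this PR differs): the 'tools' key is only attached when tools are actually configured.

```python
from typing import Any


def build_request_params(request) -> dict[str, Any]:
    # Hypothetical helper, not the provider's real code: only attach the
    # 'tools' key when tools are configured, so the remote vLLM never
    # receives tools=[] (which can trigger a BadRequest on some versions).
    params: dict[str, Any] = {}
    if getattr(request, "tools", None):  # skips both None and []
        params["tools"] = request.tools
    return params
```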

It also adds a small unit test to avoid regressions.
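
A minimal sketch of what such a regression test could look like, reusing the hypothetical helper above (the PR's actual test exercises the real vLLM provider code):

```python
def test_tools_param_omitted_when_no_tools():
    # Hypothetical regression test: with no tools configured, the request
    # params must not contain a 'tools' key at all, not even an empty list.
    class FakeRequest:
        tools = None

    assert "tools" not in build_request_params(FakeRequest())
```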

The OpenAI specification does not explicitly state that the list cannot be empty, but I found this out through experimentation, and it might depend on the actual remote vLLM. In any case, since this parameter is Optional, it is best to skip it altogether when no tools are configured.

@facebook-github-bot

Hi @danalsan!

Thank you for your pull request and welcome to our community.

Action Required

In order to merge any pull request (code, docs, etc.), we require contributors to sign our Contributor License Agreement, and we don't seem to have one on file for you.

Process

In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

Once the CLA is signed, our tooling will perform checks and validations. Afterwards, the pull request will be tagged with CLA signed. The tagging process may take up to 1 hour after signing. Please give it that time before contacting us about it.

If you have received this in error or have any questions, please contact us at [email protected]. Thanks!

@facebook-github-bot added the CLA Signed label on Apr 15, 2025
Collaborator

@leseb left a comment


See pre-commit failure


```python
input_dict: dict[str, Any] = {}
if isinstance(request, ChatCompletionRequest) and request.tools is not None:
    # Only include the 'tools' param if there is any. It can break things if an empty list is sent to the vlllm.
```
Collaborator


Suggested change:

```diff
-    # Only include the 'tools' param if there is any. It can break things if an empty list is sent to the vlllm.
+    # Only include the 'tools' param if there is any. It can break things if an empty list is sent to the vLLM.
```

Collaborator


Just this one and we should be good to go!

Fixes: llamastack#1955

Since 0.2.0, the vLLM gets an empty list (vs ``None`` in 0.1.9 and
before) when there are no tools configured which causes the issue described
in llamastack#1955. This patch avoids sending the 'tools' param to the vLLM altogether
instead of an empty list.

It also adds a small unit test to avoid regressions.

Signed-off-by: Daniel Alvarez <[email protected]>
Contributor

@booxter left a comment


Thank you.


```python
input_dict: dict[str, Any] = {}
if isinstance(request, ChatCompletionRequest) and request.tools is not None:
    # Only include the 'tools' param if there is any. It can break things if an empty list is sent to the vLLM.
```
Contributor


Apparently it depends on the vLLM version, but it's good that we are going to handle this gracefully again here.

@booxter
Contributor

booxter commented Apr 15, 2025

FYI, the CI failure is not related to this PR (the job doesn't even use vLLM). I reported #1959 since I had seen the failure a number of times before.

Collaborator

@terrytangyuan left a comment


Thanks!

@terrytangyuan merged commit b5a9ef4 into llamastack:main on Apr 16, 2025
21 of 22 checks passed
Labels: CLA Signed

Successfully merging this pull request may close these issues.

BadRequest error using remote-vllm on inference