fix(models): Add non-streaming support to OpenAIModel #942
+164
−36
Description
This pull request adds support for non-streaming responses to OpenAIModel. Users can now set streaming=False during model initialization to receive a single, complete response object instead of an event stream. To ensure this fix integrates cleanly, a type conflict in the inheriting LiteLLMModel was also resolved.
Key Changes
src/strands/models/openai.py:
- Added an optional streaming: Optional[bool] config key.
- format_request now respects streaming (defaults to True to preserve existing behavior).
- The stream method supports both streaming and non-streaming flows (non-streaming provider responses are converted into streaming-style events).
- Added a small helper, _convert_non_streaming_to_streaming, to normalize non-streaming responses into the same event format.
src/strands/models/litellm.py:
- Resolved the type conflict with the inherited streaming config key.
tests/strands/models/test_openai.py:
- Updated tests to cover the new config key and the non-streaming flow.
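The non-streaming flow described above can be sketched roughly as follows. This is an illustrative sketch only, not the PR's actual code: the event shapes, key names, and function signatures are assumptions for demonstration, with a plain dict standing in for a real provider response.

```python
from typing import Any, Dict, Iterator, List


def format_request(messages: List[Dict[str, Any]], config: Dict[str, Any]) -> Dict[str, Any]:
    """Build a provider request; 'streaming' defaults to True to preserve existing behavior."""
    return {"messages": messages, "stream": config.get("streaming", True)}


def _convert_non_streaming_to_streaming(response: Dict[str, Any]) -> Iterator[Dict[str, Any]]:
    """Yield streaming-style events reconstructed from one complete response."""
    choice = response["choices"][0]
    # Emit the same event sequence a streaming consumer would see.
    yield {"messageStart": {"role": choice["message"]["role"]}}
    content = choice["message"].get("content")
    if content:
        yield {"contentBlockDelta": {"delta": {"text": content}}}
    yield {"messageStop": {"stopReason": choice["finish_reason"]}}
    if "usage" in response:
        yield {"metadata": {"usage": response["usage"]}}


# Minimal fake payload standing in for a real chat-completion response:
fake_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello"}, "finish_reason": "stop"}
    ],
    "usage": {"prompt_tokens": 3, "completion_tokens": 1},
}
events = list(_convert_non_streaming_to_streaming(fake_response))
```

Because the helper yields the same event shapes the streaming path produces, downstream consumers need no changes when streaming=False.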
Related Issues
Closes #778
Documentation PR
N/A
Type of Change
Bug fix
Testing
How have you tested the change?
Verified that the changes do not break functionality or introduce warnings in the consuming repositories agents-docs, agents-tools, and agents-cli, and ran:
hatch run prepare
Checklist
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.