
Conversation

@Ratish1 (Contributor) commented Sep 29, 2025

Description

This pull request adds support for non-streaming responses to the OpenAIModel. Users can now set streaming=False during model initialization to receive a single, complete response object instead of an event stream. To ensure this fix integrates cleanly, a type conflict in the inheriting LiteLLMModel was also resolved.
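
A minimal usage sketch of the new option, assuming the strands Agent entry point and that the new key is accepted directly as a constructor keyword; exact argument names may differ from the merged change:

```python
# Hedged sketch: constructor arguments and Agent usage are assumptions based on
# the PR description, not a verbatim excerpt from this change.
from strands import Agent
from strands.models.openai import OpenAIModel

model = OpenAIModel(
    client_args={"api_key": "sk-..."},  # assumption: OpenAI credentials passed via client_args
    model_id="gpt-4o",
    streaming=False,  # new config key from this PR: request one complete response, no event stream
)

agent = Agent(model=model)
agent("Hello!")  # the agent behaves as before; only the provider call is non-streaming
```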

Key Changes

src/strands/models/openai.py:

  • Added optional streaming: Optional[bool] config key.

  • format_request now respects streaming (defaults to True to preserve existing behavior).

  • The stream method now supports both streaming and non-streaming flows (a non-streaming provider response is converted into streaming-style events).

  • Added a small helper, _convert_non_streaming_to_streaming, that normalizes non-streaming responses into the same event format (see the sketch after this list).
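
A rough sketch of the idea behind _convert_non_streaming_to_streaming; the chunk names and fields below are illustrative assumptions, not the exact events emitted by the merged code:

```python
from typing import Any, Iterable


def _convert_non_streaming_to_streaming(response: Any) -> Iterable[dict]:
    """Yield streaming-style chunks built from one complete chat completion.

    Illustrative only: the real helper in openai.py may use different chunk
    types and also carry tool-call data.
    """
    choice = response.choices[0]
    yield {"chunk_type": "message_start"}
    yield {"chunk_type": "content_start", "data_type": "text"}
    if choice.message.content:
        # The whole message arrives at once, so emit it as a single delta.
        yield {"chunk_type": "content_delta", "data_type": "text", "data": choice.message.content}
    yield {"chunk_type": "content_stop", "data_type": "text"}
    yield {"chunk_type": "message_stop", "data": choice.finish_reason}
    yield {"chunk_type": "metadata", "data": response.usage}
```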

src/strands/models/litellm.py:

  • Updated to include a streaming option in its config and to preserve compatibility with OpenAIModel.
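
Roughly, the compatibility fix amounts to the child config declaring the same optional key as the parent; the field list below is an assumption for illustration, not the actual class body:

```python
from typing import Optional
from typing_extensions import TypedDict


class LiteLLMConfig(TypedDict, total=False):
    """Illustrative sketch of the aligned config; the real class lives in litellm.py."""

    model_id: str
    params: Optional[dict]
    streaming: Optional[bool]  # mirrors the parent OpenAIConfig so type checks pass
```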

tests/strands/models/test_openai.py:

  • Updated/added tests to cover non-streaming behavior.

Related Issues

Closes #778 ([BUG] Stream can't be disabled in OpenAIModel)

Documentation PR

N/A

Type of Change

Bug fix

Testing

How have you tested the change? (Verify that the changes do not break functionality or introduce warnings in consuming repositories: agents-docs, agents-tools, agents-cli.)

  • I ran hatch run prepare.
  • I ran hatch fmt --formatter locally.
  • I ran hatch fmt --linter and pre-commit run --all-files.
  • I ran unit tests locally:
    • tests/strands/models/test_openai.py::test_stream → passed
    • tests/strands/models/test_openai.py::test_stream_respects_streaming_flag → passed
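
For reference, a hypothetical shape of such a test, assuming the streaming config key maps to a stream field in the request that format_request builds; the real tests in test_openai.py use the project's own fixtures and mocks:

```python
from strands.models.openai import OpenAIModel


def test_format_request_respects_streaming_flag():
    # Assumption: the streaming config key maps to the request's "stream" field.
    model = OpenAIModel(client_args={"api_key": "test"}, model_id="gpt-4o", streaming=False)
    request = model.format_request(messages=[{"role": "user", "content": [{"text": "hi"}]}])
    assert request["stream"] is False
```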

Checklist

  • I have read the CONTRIBUTING document
  • I have added any necessary tests that prove my fix is effective or my feature works
  • I have updated the documentation accordingly
  • I have added an appropriate example to the documentation to outline the feature, or no new docs are needed
  • My changes generate no new warnings
  • Any dependent changes have been merged and published

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

This commit introduces a streaming parameter to OpenAIModel to allow for non-streaming responses.

The initial implementation revealed a type incompatibility in the inheriting LiteLLMModel. This has been resolved by updating LiteLLMConfig to be consistent with the parent OpenAIConfig, ensuring all pre-commit checks pass.

The associated unit tests for OpenAIModel have also been improved to verify the non-streaming behavior.
@Ratish1 changed the title from "fix(models): Add non-streaming suppor to OpenAIModel" to "fix(models): Add non-streaming support to OpenAIModel" on Sep 29, 2025