
[Bug]: fix streaming cached response logging #2074

@krrishdholakia


What happened?

When a cached response is returned for a streaming request, the cache hands back an async generator. Logging then fails because the response is not one of the expected types (ModelResponse, EmbeddingResponse, ImageResponse), so the success handler's attribute access on it raises.
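A minimal sketch of the failure mode, using a hypothetical stand-in for litellm's ModelResponse and success handler (names and shapes here are illustrative, not the library's actual internals): the cached streaming path yields raw chunks via an async generator, which has no `choices` attribute.

```python
# Hypothetical reproduction: ModelResponse, cached_stream, and
# success_handler are stand-ins for the litellm internals involved.

class ModelResponse:
    def __init__(self, choices):
        self.choices = choices


async def cached_stream():
    # A streamed cache hit yields chunks, not a final ModelResponse.
    yield {"content": "hello"}
    yield {"content": " world"}


def success_handler(result):
    # Mirrors the kind of check in litellm/utils.py that raises here.
    return result.choices[0].finish_reason is not None


try:
    # Passing the async generator straight through, as the cache does:
    success_handler(cached_stream())
except AttributeError as e:
    print(e)  # 'async_generator' object has no attribute 'choices'
```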

Relevant log output

2024-02-19 16:57:18 LiteLLM.LoggingError: [Non-Blocking] Exception occurred while success logging Traceback (most recent call last):
2024-02-19 16:57:18   File "/usr/local/lib/python3.9/site-packages/litellm/utils.py", line 1145, in success_handler
2024-02-19 16:57:18     result.choices[0].finish_reason is not None
2024-02-19 16:57:18 AttributeError: 'async_generator' object has no attribute 'choices'
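One way the handler could guard against this (a sketch only, not the fix actually shipped): detect the async-generator case with the standard library's `inspect.isasyncgen` and defer logging until the stream has been consumed into a final response object.

```python
import inspect

# Hypothetical guard: skip logging while the cached response is still
# an async generator rather than a finished response object.
def safe_success_handler(result):
    if inspect.isasyncgen(result):
        return None  # logging must wait until the stream is consumed
    return result.choices[0].finish_reason is not None


# Demo with stand-in objects:
class FakeChoice:
    finish_reason = "stop"


class FakeResponse:
    choices = [FakeChoice()]


async def fake_stream():
    yield {"content": "chunk"}


print(safe_success_handler(fake_stream()))    # None
print(safe_success_handler(FakeResponse()))   # True
```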


Labels: bug