[BUG] Strands agent can't support LiteLLM model with Groq and Cerebras #729

@vien2024

Description

Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched ./issues and there are no duplicates of my issue

Strands Version

1.5.0

Python Version

3.12.11

Operating System

Colab notebook

Installation Method

pip

Steps to Reproduce

Problem Statement

LiteLLM itself supports the Cerebras and Groq providers, but I cannot use them through a Strands agent.

  • It appears the strands_agents wrapper around litellm does not pass the message input in the correct format, so the provider fails to apply its chat template (see the sketch after this list).
  • This affects both Cerebras and Groq.
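
For illustration, a minimal sketch of the suspected mismatch (my assumption based on the error message, not verified against the Strands source): the OpenAI-compatible Cerebras/Groq endpoints appear to expect "content" to be a plain string, while a content-block shape would send a list, which matches the "'list object' has no attribute 'startswith'" failure shown further down.

# Shape the OpenAI-compatible endpoints appear to expect: "content" is a
# plain string, so server-side checks like content.startswith(...) work.
openai_style_messages = [
    {"role": "user", "content": "What is 2+2"},
]

# Assumed shape being sent instead: "content" is a list of blocks, and
# calling .startswith on a list raises exactly the reported error.
block_style_messages = [
    {"role": "user", "content": [{"text": "What is 2+2"}]},
]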

Expected Behavior

I expect the strands_agents LiteLLM wrapper to pass messages in the format these providers accept, so that the agent call below succeeds.

Actual Behavior

Below is my code

from strands import Agent
from strands.models.litellm import LiteLLMModel

model = LiteLLMModel(
    client_args={
        "api_key": api_key,  # api_key is defined earlier in the notebook
    },
    model_id="cerebras/qwen-3-235b-a22b-instruct-2507",
    params={
        "max_tokens": 3000,
        "temperature": 0.8,
        "stream": False,
    },
)

agent = Agent(model=model)
response = agent("What is 2+2")
print(response)
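
If the bug is in the wrapper, calling the same model through litellm directly with plain-string message content should succeed; here is a sketch of that isolation check (it assumes the same api_key as above):

import litellm

# Direct litellm call that bypasses the Strands wrapper; "content" is a
# plain string, so the provider's chat template should apply cleanly.
response = litellm.completion(
    model="cerebras/qwen-3-235b-a22b-instruct-2507",
    api_key=api_key,  # same key as in the snippet above
    messages=[{"role": "user", "content": "What is 2+2"}],
    max_tokens=3000,
    temperature=0.8,
)
print(response.choices[0].message.content)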

Error

The code produces this error:

---------------------------------------------------------------------------
BadRequestError                           Traceback (most recent call last)
/usr/local/lib/python3.12/dist-packages/litellm/llms/openai/openai.py in acompletion(self, messages, optional_params, litellm_params, provider_config, model, model_response, logging_obj, timeout, api_key, api_base, api_version, organization, client, max_retries, headers, drop_params, stream_options, fake_stream)
    811 
--> 812                 headers, response = await self.make_openai_chat_completion_request(
    813                     openai_aclient=openai_aclient,

... (32 frames elided) ...
BadRequestError: Error code: 400 - {'message': "Failed to apply chat template to messages due to error: 'list object' has no attribute 'startswith'", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'wrong_api_format'}

During handling of the above exception, another exception occurred:

OpenAIError                               Traceback (most recent call last)
OpenAIError: Error code: 400 - {'message': "Failed to apply chat template to messages due to error: 'list object' has no attribute 'startswith'", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'wrong_api_format'}

During handling of the above exception, another exception occurred:

BadRequestError                           Traceback (most recent call last)
/usr/local/lib/python3.12/dist-packages/litellm/litellm_core_utils/exception_mapping_utils.py in exception_type(model, original_exception, custom_llm_provider, completion_kwargs, extra_kwargs)
    389                 ):
    390                     exception_mapping_worked = True
--> 391                     raise BadRequestError(
    392                         message=f"{exception_provider} - {message}",
    393                         llm_provider=custom_llm_provider,

BadRequestError: litellm.BadRequestError: CerebrasException - Failed to apply chat template to messages due to error: 'list object' has no attribute 'startswith'

I get the same error when using the strands_agents LiteLLM wrapper with a Groq model.
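
As a stopgap until the wrapper is fixed, something like the helper below could flatten list-of-blocks content into plain strings before the request reaches the provider. This is only a sketch; flatten_content is my own name, not a Strands or litellm API.

def flatten_content(messages):
    """Flatten list-of-blocks message content into plain strings."""
    flat = []
    for message in messages:
        content = message.get("content")
        if isinstance(content, list):
            # Join the text of each block into one string; non-text
            # blocks are ignored in this sketch.
            content = "".join(block.get("text", "") for block in content)
        flat.append({**message, "content": content})
    return flat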

Additional Context

No response

Possible Solution

No response

Related Issues

No response

Metadata

Labels

area-provider (Related to model providers), bug (Something isn't working)
